
The thrill of working at a start-up that is starting to scale massively is something else. Simpl (FinTech Startup of the Year, 2020) was founded in 2015 by Nitya Sharma, an investment banker from Wall Street, and Chaitra Chidanand, a tech executive from the Valley, who teamed up with a very clear mission: to make money simple so that people can live well and do amazing things. Simpl is the payment platform for the mobile-first world. We're backed by some of the best names in fintech globally (folks who have invested in Visa, Square, and TransferWise), and Joe Saunders, former Chairman and CEO of Visa, sits on our board.
Everyone at Simpl is an internal entrepreneur, given ample bandwidth and resources to create the next breakthrough towards the long-term vision of "making money Simpl". Our first product is a payment platform that lets people buy instantly, anywhere online, and pay later. In the background, Simpl uses big data for credit underwriting and risk and fraud modelling, all without any paperwork, enabling banks and non-bank financial companies to access a whole new consumer market.
In place of traditional forms of identification and authentication, Simpl integrates deeply into merchant apps via SDKs and APIs. This allows for more sophisticated forms of authentication that take full advantage of smartphone data and processing power.
Skillset:
- Workflow managers/schedulers such as Airflow, Luigi, or Oozie
- Strong command of Python
- ETL experience
- Batch-processing frameworks such as Spark or MapReduce/Pig
- File formats: Parquet, JSON, XML, Thrift, Avro, Protobuf
- Rule engines (e.g., Drools, a business rule management system)
- Distributed storage such as HDFS, NFS, Amazon S3, or equivalent
- Experience building/configuring dashboards
Nice to have:
- Data platform experience, e.g., building data lakes or working with near-real-time applications/frameworks such as Storm, Flink, or Spark Streaming
- AWS
- File encoding types: Thrift, Avro, Protobuf, Parquet, JSON, XML
- Hive, HBase
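Workflow managers like Airflow, Luigi, and Oozie all model a pipeline as a DAG of tasks and run each task only after its dependencies finish. As a rough illustration of that scheduling model (a toy sketch in plain Python, not any scheduler's actual API, with hypothetical task names), a minimal dependency-ordered runner might look like:

```python
from collections import deque

def run_order(deps):
    """Return a valid execution order for a task DAG.

    deps maps each task to the set of tasks it depends on,
    mirroring how schedulers like Airflow topologically sort a DAG.
    """
    # Copy dependency sets so the caller's graph is untouched
    pending = {t: set(d) for t, d in deps.items()}
    ready = deque(t for t, d in pending.items() if not d)
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        # Unblock any task that was waiting on the one just "run"
        for t, d in pending.items():
            if task in d:
                d.remove(task)
                if not d and t not in order and t not in ready:
                    ready.append(t)
    if len(order) != len(deps):
        raise ValueError("cycle detected in task graph")
    return order

# A toy ETL pipeline: extract -> transform -> load -> report
pipeline = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}
print(run_order(pipeline))  # ['extract', 'transform', 'load', 'report']
```

Real schedulers add retries, backfills, and parallel execution on top of this ordering, but the dependency-resolution core is the same idea.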

About Simpl
Simpl has revolutionized online checkout in India with its Pay-Later platform, which empowers e-commerce merchants to offer their consumers 1-click checkout, a line of credit at the point of sale, and full buyer protection. It aims to let merchants own their customers' checkout experience. With Simpl, merchants can give consumers an easy, safe, and intuitive user experience that builds a trusted relationship between the two.
Job Title: Visualizer – Pitches & Proposals
Experience: 7 to 12 Years
Location: Gurgaon, India
Shift Timing: 11:00 AM – 8:00 PM IST
Work Setup: Hybrid – 3 Days from Office, 2 Days from Home
Work Days: Monday to Friday | Full-Time
Job Overview:
We are looking for a highly experienced Visualizer – Pitches & Proposals to join our Global Visual Solutions team.
The role requires exceptional communication skills, as you’ll work closely with European stakeholders to design impactful pitch decks, proposals, infographics, and co-branded materials.
A keen eye for visual storytelling, brand alignment, and high-end design execution is essential.
Mandatory Skills:
PowerPoint, Adobe InDesign, Photoshop, Illustrator, XD, Pitch Decks, Infographics, Proposals, Branding, Visual Storytelling.
Key Responsibilities:
- Design and visually enhance pitch decks, bid proposals, co-branded templates, and presentations.
- Translate complex data and narratives into infographics, data visualizations, and visually appealing layouts.
- Collaborate with global teams to ensure timely delivery of creative assets for business development and marketing needs.
- Maintain brand consistency across all internal and external materials.
- Work closely with stakeholders to interpret briefs, incorporate feedback, and deliver high-quality outputs.
- Contribute to creative brainstorming, content structuring, and layout design for digital and print formats.
- Translate UX research into wireframes, user flows, mockups, and interactive templates where applicable.
Required Skills:
- Expert proficiency in PowerPoint and Adobe Creative Suite (InDesign, Photoshop, Illustrator, XD).
- Hands-on experience with pitch decks, proposals, infographics, co-branded templates, and marketing collateral.
- Strong understanding of visual hierarchy, layout design, typography, and branding.
- Familiarity with Keynote, Google Slides, and advanced MS Office tools.
- Excellent communication and stakeholder management skills, especially with international teams.
- Highly organized, deadline-driven, and detail-oriented.
Candidate Notes:
- Resumes must be well-aligned and professionally formatted.
- Portfolio links are mandatory with every profile submission.
- Strict Communication Requirement: Excellent spoken and written English for European stakeholder alignment.
Role & Responsibilities
About the Role:
We are seeking a highly skilled Senior Data Engineer with 5-7 years of experience to join our dynamic team. The ideal candidate will have a strong background in data engineering, with expertise in data warehouse architecture, data modeling, ETL processes, and building both batch and streaming pipelines. The candidate should also possess advanced proficiency in Spark, Databricks, Kafka, Python, SQL, and Change Data Capture (CDC) methodologies.
Key responsibilities:
Design, develop, and maintain robust data warehouse solutions to support the organization's analytical and reporting needs.
Implement efficient data modeling techniques to optimize performance and scalability of data systems.
Build and manage data lakehouse infrastructure, ensuring reliability, availability, and security of data assets.
Develop and maintain ETL pipelines to ingest, transform, and load data from various sources into the data warehouse and data lakehouse.
Utilize Spark and Databricks to process large-scale datasets efficiently and in real time.
Implement Kafka for building real-time streaming pipelines and ensure data consistency and reliability.
Design and develop batch pipelines for scheduled data processing tasks.
Collaborate with cross-functional teams to gather requirements, understand data needs, and deliver effective data solutions.
Perform data analysis and troubleshooting to identify and resolve data quality issues and performance bottlenecks.
Stay updated with the latest technologies and industry trends in data engineering and contribute to continuous improvement initiatives.
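Change Data Capture, listed among the responsibilities above, amounts to replaying an ordered stream of row-level events (inserts, updates, deletes) against a target table. As a toy sketch in plain Python, standing in for what a Spark/Databricks MERGE or a Kafka consumer would do at scale, with invented sample data:

```python
def apply_cdc(table, events):
    """Apply insert/update/delete CDC events to a keyed table.

    table:  dict mapping primary key -> row dict
    events: iterable of (op, key, row) tuples, in commit order
    """
    for op, key, row in events:
        if op in ("insert", "update"):
            # Upsert: merge new column values over any existing row
            table[key] = {**table.get(key, {}), **row}
        elif op == "delete":
            table.pop(key, None)
        else:
            raise ValueError(f"unknown CDC op: {op}")
    return table

users = {1: {"name": "Asha", "city": "Pune"}}
events = [
    ("insert", 2, {"name": "Ravi", "city": "Delhi"}),
    ("update", 1, {"city": "Mumbai"}),
    ("delete", 2, {}),
]
apply_cdc(users, events)
print(users)  # {1: {'name': 'Asha', 'city': 'Mumbai'}}
```

Production pipelines must also handle ordering guarantees, late or duplicate events, and schema evolution, which is where the Kafka and CDC tooling mentioned above earns its keep.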
Director - Data engineering
What are we looking for
real solver?
Solver? Absolutely. But not the usual kind. We're searching for the architects of the audacious & the pioneers of the possible. If you're the type to dismantle assumptions, re-engineer ‘best practices,’ and build solutions that make the future possible NOW, then you're speaking our language.
Your Responsibilities
what you will wake up to solve.
1. Delivery & Tactical Rigor
- Methodology Implementation: Implement and manage a unified, 'DataOps-First' methodology for data engineering delivery (ETL/ELT pipelines, Data Modeling, MLOps, Data Governance) within assigned business units. This ensures predictable outcomes and trusted data integrity by reducing architecture variability at the project level.
- Operational Stewardship: Drive initiatives to optimize team utilization and enhance operational efficiency within the practice. You manage the commercial success of your squads, ensuring data delivery models (from migration to modern data stack implementation) are executed profitably, scalably, and cost-effectively.
- Execution & Technical Resolution
- Technical Escalation: Serve as the primary escalation point for delivery issues, personally leading the resolution of complex data integration bottlenecks and pipeline failures to protect client timelines and data reliability standards.
- Quality Enforcement
- Quality Oversight: Execute and monitor technical data quality standards, ensuring engineering teams adhere to strict policies regarding data lineage, automated quality checks (observability), security/privacy compliance (GDPR/CCPA/PII), and active catalog management.
2. Strategic Growth & Practice Scaling
- Talent & Scaling Execution: Execute the strategy for data engineering talent acquisition and development within your business units. Implement objective metrics to assess and grow the 'Data-Native' DNA of your teams, ensuring squads are consistently equipped to handle petabyte-scale environments and high-impact delivery.
- Offerings Alignment: Drive the adoption of standardized regional offerings (e.g., Modern Data Platform, Data Mesh, Lakehouse Implementation). Ensure your teams leverage the profitable frameworks defined by the practice to accelerate time-to-insight and eliminate architectural fragmentation in client environments.
- Innovation & IP Development: Lead the practical integration of Vector Databases and LLM-ready architectures into project delivery. Champion the hands-on development of IP and reusable accelerators (e.g., automated ingestion engines) that improve delivery speed and enhance data availability across your portfolio.
3. Leadership & Unit Management
- Unit Leadership: Directly lead, mentor, and manage the Engineering Managers and Lead Architects within your business unit. Hold your teams accountable for project-level operational consistency, technical talent development, and strict adherence to the practice's data governance standards.
- Stakeholder Communication: Clearly articulate the business unit’s operational performance, technical quality metrics, and delivery progress to the C-suite Stakeholders and regional client leadership, bridging the gap between technical execution and business value.
- Ecosystem Alignment: Maintain strong technical relationships with key partner contacts (Snowflake, Databricks, AWS/GCP). Align team delivery capabilities with current product roadmaps and ensure squad-level participation in training, certifications, and partner-led enablement opportunities.
Welcome to Searce
The ‘process-first’, AI-native modern tech consultancy that's rewriting the rules.
We don’t do traditional.
As an engineering-led consultancy, we are dedicated to relentlessly improving real business outcomes. Our solvers co-innovate with clients to futurify operations and make processes smarter, faster & better.
Functional Skills
1. Delivery Management & Operational Excellence
- Methodology Execution: Expert capability in implementing and enforcing a unified delivery methodology (DataOps, Agile, Mesh Principles) within specific business units. Proven track record of auditing squad-level adherence to ensure consistency across the project lifecycle.
- Operational Performance: High proficiency in managing day-to-day operational metrics, including squad utilization, resource forecasting, and productivity tracking. Skilled at optimizing team performance to meet profitability and efficiency targets.
- SOW & Risk Mitigation: Proven experience in operationalizing Statement of Work (SOW) requirements and identifying technical delivery risks early. Expert at mitigating scope creep and data-specific bottlenecks (e.g., latency, ingestion gaps) before they impact client outcomes.
- Technical Escalation Leadership: Demonstrated ability to lead "war room" efforts to resolve complex pipeline failures or data integrity issues. Skilled at providing clear, rapid remediation plans and communicating technical status directly to regional stakeholders.
2. Architectural Implementation & Technical Oversight
- Modern Stack Proficiency: Deep, hands-on expertise in implementing Cloud-Native architectures (Lakehouse, Data Mesh, MPP) on Snowflake, Databricks, or hyperscalers. Ability to conduct deep-dive architectural reviews and course-correct design decisions at the squad level to ensure scalability.
- Operationalizing Governance: Proven experience in embedding data quality and observability (completeness, freshness, accuracy) directly into the CI/CD pipeline. Responsible for technical enforcement of regulatory compliance (GDPR/PII) and maintaining the integrity of data catalogs across active projects.
- Applied Domain Expertise: Practical experience leading the delivery of high-growth solutions, specifically Generative AI infrastructure (RAG, Vector DBs), Real-Time Streaming, and large-scale platform migrations with a focus on zero-downtime execution.
- DataOps & Engineering Standards: Expert-level mastery of DataOps, including the setup and management of orchestration frameworks (Airflow, Dagster) and Infrastructure as Code (IaC). You ensure that automation is a baseline requirement, not an afterthought, for all delivery teams.
3. Unit Management & Commercial Execution
- Unit & Team Management: Proven success in leading and mentoring Engineering Managers and Lead Architects. Responsible for the operational metrics, technical output, and career development of the business unit's talent pool.
- Offerings Implementation & Scoping: Expertise in translating service offerings (e.g., Data Maturity Assessments, Lakehouse Builds) into accurate project scopes, technical estimates, and resource plans to ensure delivery is both profitable and competitive.
- Talent Growth & Mentorship: Functional ability to implement growth frameworks for data engineering roles. Focus on hands-on coaching and scaling high-performance technical talent to meet the demands of complex, petabyte-scale environments.
- Partner Enablement: Functional competence in managing regional technical relationships with major partners (Snowflake, Databricks, GCP/AWS). Drives squad-level certifications, joint technical enablement, and alignment with partner product roadmaps.
Tech Superpowers
- Modern Data Architect – Reimagines business with the Modern Data Stack (MDS) to deliver data mesh implementations, insights, & real value to clients.
- End-to-End Ecosystem Thinker – Builds modular, reusable data products across ingestion, transformation (ETL/ELT), governance, and consumption layers.
- Distributed Compute Savant – Crafts resilient, high-throughput architectures that survive petabyte-scale volume and data skew without breaking the bank.
- Governance & Integrity Guardian – Embeds data quality, complete lineage, and privacy-by-design (GDPR/PII) into every table, view, and pipeline.
- AI-Ready Orchestrator – Engineers pipelines that bridge structured data with Unstructured/Vector stores, powering RAG models and Generative AI workflows.
- Product-Minded Strategist – Balances architectural purity with time-to-insight; treats every dataset as a measurable "Data Product" with clear ROI.
- Pragmatic Stack Curator – Chooses the simplest tools that compound reliability; fluent in SQL, Python, Spark, dbt, and Cloud Warehouses.
- Builder @ Heart – Writes, reviews, and optimizes queries daily; proves architectures with cost-performance benchmarks, not slideware. Business-first, data-second, outcome focused technology leader.
Experience & Relevance
- Executive Experience: 10+ years of progressive experience in data engineering and analytics, with at least 3 years in a Senior Manager or Director-level role managing multiple technical teams and owning significant operational and efficiency metrics for a large data service line.
- Delivery Standardization: Demonstrated success in defining and implementing globally consistent, repeatable delivery methodologies (DataOps/Agile Data Warehousing) across diverse teams.
- Architectural Depth: Must retain deep, current expertise in Modern Data Stack architectures (Lakehouse, MPP, Mesh) and maintain the ability to personally validate high-level architectural and data pipeline design decisions.
- Operational Leadership: Proven expertise in managing and scaling large professional services organizations, demonstrated ability to optimize utilization, resource allocation, and operational expense.
- Domain Expertise: Strong background in Enterprise Data Platforms, Applied AI/ML, Generative AI integration, or large-scale Cloud Data Migration.
- Communication: Exceptional executive-level presentation and negotiation skills, particularly in communicating complex operational, data quality, and governance metrics to C-level stakeholders.
Join the ‘real solvers’
ready to futurify?
If you are excited by the possibilities of what an AI-native engineering-led, modern tech consultancy can do to futurify businesses, apply here and experience the ‘Art of the possible’. Don’t Just Send a Resume. Send a Statement.
- Java 8/12
- Frameworks: Spring MVC, REST, Spring Boot, Hibernate (optional: Play)
- Oracle DB
- Elasticsearch (optional but good to have)
- JUnit, Mockito
- Messaging: should know how RabbitMQ works
- Agile / TDD
- Good experience with core Java concepts
- Experience with continuous integration/continuous delivery tools such as Jenkins
- Apache HTTP Server, Tomcat
- Good to have: basic knowledge of AWS
- Good communication abilities and team-working skills
At Anarock Tech, we are building a modern technology platform with automated analytics and reporting tools that delivers timely, financially favorable, and efficient solutions to our real estate clients.
If driving innovation, creating industry-first solutions, building new capabilities from the ground up, and working with new technologies excites you, Anarock is the place for you.
Key job responsibilities
- You will own multiple global products, and help create and drive the overall strategy, vision, and roadmap for each.
- You will have a high degree of ownership over critical features and the overall customer experience.
- You will be customer-obsessed and technology-savvy, with a deep passion for improving existing solutions and finding new opportunities.
- You will take up and deliver on innovative/emerging technology options, and work closely with multiple teams.
- You will contribute to engineering discussions around technology decisions and strategy related to your products.
- You will embrace ambiguity with critical and analytical thinking, obtain buy-in from a diverse set of stakeholders and senior executives, and drive end-to-end product management.
Basic Qualifications
- 5+ years of product management experience.
- Strong written and verbal communication skills, and experience presenting to senior executives and external stakeholders.
- Proven proficiency in Product Design.
- Demonstrated ability to conceptualize, launch and scale new products in a fast-paced entrepreneurial environment.
- Demonstrated success working with a wide set of stakeholders: engineering, design, field operations, marketing, sales, etc.
- Strong technical acumen and understanding of emerging technologies
Skills that will help you build a success story with us
- An ability to quickly understand and solve new problems
- Negotiation and conflict management
- Strong interpersonal skills
- Excellent data interpretation
- Stress tolerance
- Context-switching
- Intrinsically motivated
- A track record of delivering research-driven results, both tactical and strategic
- Ability to motivate and lead a team and manage multiple requirements
- Innovation and people-management skills, with an understanding of competitive analysis
- Ready to travel, as required
Quick Glances:
- What to look for at Anarock: https://www.anarock.com/aboutus
- Who are we? A glimpse of Anarock Tech, know us better: https://www.linkedin.com/company/anarock-technology/
- Anarock in the media: https://www.anarock.com/news-and-media
Anarock Ethos - Values Over Value:
Our assurance of consistent ethical dealing with clients and partners reflects our motto - Values Over Value.
We value diversity within ANAROCK Group and are committed to offering equal opportunities in employment. We do not discriminate against any team member or applicant for employment based on nationality, race, colour, religion, caste, gender identity/expression, sexual orientation, disability, social origin and status, indigenous status, political opinion, age, marital status or any other personal characteristics or status. ANAROCK Group values all talent and will do its utmost to hire, nurture and grow them.
At Prolifics, we are currently implementing multiple solutions on Tosca automation, and we are looking to hire a talented Tosca Engineer for our development center in India. This is a permanent position based out of our Hyderabad center.
If you are looking for a high growth company with rock-solid stability, if you thrive in the energetic atmosphere of high profile projects, we want to talk to you today! Let’s connect and explore possibilities of having you onboard the Prolifics team!
Role & Responsibilities-
Job Title: Tosca Engineer
Primary skills: TOSCA
Location: HYD
Educational Qualification: B.Tech/BE/M.Tech/MCA/M.Sc
- Triaging VB scripts used by Tosca
- Hands-on experience connecting Tosca with other applications
- Hands-on experience automating desktop applications using Tosca
- AS1 and AS2 certifications in Tosca
Description
We are a dynamic UK-based technology company that is fundamentally changing the way international logistics operates. We're searching for a full-stack developer who is excited by the prospect of working at the bleeding edge of high tech in a rapidly growing scale-up. As we establish a global presence, we're expanding our team in India at pace and looking for fantastic engineers to join us. We've recently raised our Series A round from leading US investor Bessemer Venture Partners (https://bvp.com/; LinkedIn, Twilio, Shopify) alongside Episode 1 (https://episode1.com/; Zoopla, Betfair, Shazam) and supply-chain-focused fund Dynamo Ventures (https://www.dynamo.vc/; Sennder, Stord).
You will build and maintain our public-facing application. The role will mainly involve developing real-time frontend and backend services, while frequently interacting with a multi-cloud environment. You will build a customer-facing product deeply connected to the ML components of the system, solving new problems regarding how users interact with the results and how they re-train the models through their input.
Specifically, we want someone who can:
- Solve problems from understanding the available data to providing the functional UI requirements
- Architect solid frontend components, including snapshot testing and integration testing
- Develop and maintain APIs and data structures for ML-powered features
- Use cloud-native libraries, in a multi-cloud system
- Work with the DevOps to build CI/CD pipelines and help manage the infrastructure
Requirements
- Ability to develop user interfaces following HTML usability best practices
- Extensive knowledge of modern front-end development using React with Hooks and CSS-in-JS
- Ability to build scalable APIs using recent Python features
- Extensive experience with SQL and NoSQL databases
- Commercial experience with containerization tools such as Docker, Docker Compose, and Kubernetes
Good to have:
- 2–3 years of commercial experience
- Ability to sensibly design features from a UI and UX perspective
- Experience with real-time interfaces and backends
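The "scalable APIs using recent Python features" requirement is about comfort with modern typing and data modelling. As an illustrative sketch only (a hypothetical shipment-lookup endpoint with invented names, not this company's actual API), a dataclass-validated handler might look like:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ShipmentQuery:
    """Validated input for a hypothetical shipment-lookup endpoint."""
    reference: str
    include_events: bool = False

    def __post_init__(self):
        # Reject obviously bad input before it reaches business logic
        if not self.reference.strip():
            raise ValueError("reference must be non-empty")

def get_shipment(query: ShipmentQuery, store: dict) -> dict:
    """Look up a shipment; stands in for a real framework handler."""
    shipment = store.get(query.reference)
    if shipment is None:
        return {"status": 404, "body": {"error": "not found"}}
    body = {"reference": query.reference, "state": shipment["state"]}
    if query.include_events:
        body["events"] = shipment.get("events", [])
    return {"status": 200, "body": body}

# Toy in-memory store standing in for a real database
store = {"SHP-1": {"state": "in_transit", "events": ["picked_up"]}}
resp = get_shipment(ShipmentQuery("SHP-1", include_events=True), store)
print(resp["status"], resp["body"]["state"])  # 200 in_transit
```

In a real service the same shape would typically sit behind a web framework, with the dataclass (or an equivalent typed model) parsing and validating the request payload.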
- Support the existing PHP web server code
- Proficient in PHP, JavaScript, jQuery, HTML, and MySQL
- Hands-on experience with a PHP-based framework/CMS such as Laravel, CakePHP, Zend, CodeIgniter, or Yii
- Must write efficient, well-documented code
- Any database: MySQL, MongoDB, other NoSQL DBs, PostgreSQL
- Experience debugging code
- Analyse the current DB structure and change table structures as and when required
- Implement visual representations of DB data in the form of charts and graphs
- Eventually implement a new UI theme with customizations such as drop-down menu data, and link the UI theme front end with the MySQL back end
- Implement new PHP web services for functions such as analytics, and new modules for existing services
- Ensure cross-platform compatibility of information retrieved from web services on Android and iOS (push notifications, platform-specific issues, etc.)
- Implement SSL security for all web services, logins, and API calls
- Familiarity with XMPP would be a plus
