50+ SQL Jobs in Bangalore (Bengaluru) | SQL Job openings in Bangalore (Bengaluru)
Apply to 50+ SQL Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.
AuxoAI is seeking a skilled and experienced Data Engineer to join our dynamic team. The ideal candidate will have 3-7 years of prior experience in data engineering, with a strong background in working on modern data platforms. This role offers an exciting opportunity to work on diverse projects, collaborating with cross-functional teams to design, build, and optimize data pipelines and infrastructure.
Location: Bangalore, Hyderabad, Mumbai, and Gurgaon
Responsibilities:
· Design, build, and operate scalable on-premises or cloud data architecture
· Analyze business requirements and translate them into technical specifications
· Design, develop, and implement data engineering solutions using DBT on cloud platforms (Snowflake, Databricks)
· Design, develop, and maintain scalable data pipelines and ETL processes
· Collaborate with data scientists and analysts to understand data requirements and implement solutions that support analytics and machine learning initiatives.
· Optimize data storage and retrieval mechanisms to ensure performance, reliability, and cost-effectiveness
· Implement data governance and security best practices to ensure compliance and data integrity
· Troubleshoot and debug data pipeline issues, providing timely resolution and proactive monitoring
· Stay abreast of emerging technologies and industry trends, recommending innovative solutions to enhance data engineering capabilities.
Requirements
· Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
· Overall 3+ years of prior experience in data engineering, with a focus on designing and building data pipelines
· Experience working with DBT to implement end-to-end data engineering processes on Snowflake and Databricks
· Comprehensive understanding of the Snowflake and Databricks ecosystem
· Strong programming skills in languages like SQL and Python or PySpark.
· Experience with data modeling, ETL processes, and data warehousing concepts.
· Familiarity with implementing CI/CD processes or other orchestration tools is a plus.
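The DBT-on-warehouse work described above amounts to encoding transformations as SQL models that turn raw data into reporting tables. As a minimal illustration only (not actual DBT, and with hypothetical table names), the same staging-to-mart pattern can be sketched against an in-memory SQLite database:

```python
import sqlite3

# Illustrative sketch: a staging-to-mart transformation of the kind a DBT
# model encodes, expressed as plain SQL. Table and column names
# (raw_orders, mart_daily_revenue) are invented for this example.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, order_date TEXT, amount REAL);
    INSERT INTO raw_orders VALUES
        (1, '2024-01-01', 100.0),
        (2, '2024-01-01', 50.0),
        (3, '2024-01-02', 75.0);
    -- The "model": aggregate raw events into a reporting mart
    CREATE TABLE mart_daily_revenue AS
        SELECT order_date, SUM(amount) AS revenue, COUNT(*) AS orders
        FROM raw_orders
        GROUP BY order_date;
""")
rows = conn.execute(
    "SELECT order_date, revenue, orders FROM mart_daily_revenue ORDER BY order_date"
).fetchall()
print(rows)  # [('2024-01-01', 150.0, 2), ('2024-01-02', 75.0, 1)]
```

In DBT the `CREATE TABLE ... AS SELECT` body would live in a model file, with DBT managing materialization, dependencies between models, and deployment to Snowflake or Databricks.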
Review Criteria
- Strong Senior Backend Engineer profiles
- Must have 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems
- Must have strong backend development experience using one or more frameworks: FastAPI or Django (Python), Spring (Java), or Express (Node.js)
- Must have deep understanding of relevant libraries, tools, and best practices within the chosen backend framework
- Must have strong experience with databases, including SQL and NoSQL, along with efficient data modeling and performance optimization
- Must have experience designing, building, and maintaining APIs, services, and backend systems, including system design and clean code practices
- Experience with financial systems, billing platforms, or fintech applications is highly preferred (fintech background is a strong plus)
- (Company) – Must have worked in product companies / startups, preferably Series A to Series D
- (Education) – Candidates from top engineering institutes (IITs, BITS, or equivalent Tier-1 colleges) are preferred
Role & Responsibilities
As a Founding Engineer at company, you'll join our engineering team during an exciting growth phase, contributing to a platform that handles complex financial operations for B2B companies. You'll work on building scalable systems that automate billing, usage metering, revenue recognition, and financial reporting—directly impacting how businesses manage their revenue operations.
This role is perfect for someone who thrives in a dynamic startup environment where requirements evolve quickly and problems need creative solutions. You'll work on diverse technical challenges, from API development to external integrations, while collaborating with senior engineers, product managers, and customer success teams.
Key Responsibilities-
- Build core platform features: Develop robust APIs, services, and integrations that power company’s billing automation and revenue recognition capabilities
- Work across the full stack: Contribute to both backend services and frontend interfaces, ensuring seamless user experiences
- Implement critical integrations: Connect company with external systems including CRMs, data warehouses, ERPs, and payment processors
- Optimize for scale: Build systems that handle complex pricing models, high-volume usage data, and real-time financial calculations
- Drive quality and best practices: Write clean, maintainable code while participating in code reviews and architectural discussions
- Solve complex problems: Debug issues across the stack and work closely with teams to address evolving client needs
The Impact You'll Make-
- Power business growth: Your code will directly enable billing and revenue operations for fast-growing B2B companies, helping them scale without operational bottlenecks
- Build critical financial infrastructure: Contribute to systems handling millions in transactions while ensuring accurate, compliant revenue recognition
- Shape product direction: Join during our scaling phase where your contributions immediately impact product evolution and customer success
- Accelerate your expertise: Gain deep knowledge in financial systems, B2B SaaS operations, and enterprise software while working with industry veterans
- Drive the future of B2B commerce: Help create infrastructure powering next-generation pricing models from usage-based to value-based billing.
Job Description:
Experience Range: 5 to 10 years
Required Skills:
- Must Have – Direct hands-on experience working in Python for scripting, automation, analysis, and orchestration
- Must Have – Experience working with ML Libraries such as Scikit-learn, TensorFlow, PyTorch, Pandas, NumPy etc.
- Must Have – Experience working with models such as Random Forest, k-means clustering, BERT, etc.
- Should Have – Exposure to querying warehouses and APIs
- Should Have – Experience with writing moderate to complex SQL queries
- Should Have – Experience analyzing and presenting data with BI tools or Excel
- Must Have – Very strong communication skills to work with technical and non-technical stakeholders in a global environment
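To make the modeling skills above concrete (Random Forest, k-means, BERT), here is a deliberately tiny, pure-Python sketch of one k-means iteration; real project work would use scikit-learn's `KMeans` instead, and all data here is made up:

```python
# Toy k-means step for illustration only: assign points to the nearest
# centroid, then recompute centroids as cluster means.
def assign_clusters(points, centroids):
    """Assign each point the index of its nearest centroid (squared Euclidean)."""
    def dist2(p, c):
        return sum((a - b) ** 2 for a, b in zip(p, c))
    return [min(range(len(centroids)), key=lambda i: dist2(p, centroids[i]))
            for p in points]

def update_centroids(points, labels, k):
    """Recompute each centroid as the mean of its assigned points."""
    out = []
    for i in range(k):
        members = [p for p, lab in zip(points, labels) if lab == i]
        out.append(tuple(sum(coord) / len(members) for coord in zip(*members)))
    return out

points = [(1.0, 1.0), (1.5, 2.0), (8.0, 8.0), (9.0, 9.5)]
labels = assign_clusters(points, [(0.0, 0.0), (10.0, 10.0)])
print(labels)  # [0, 0, 1, 1]
print(update_centroids(points, labels, 2))  # [(1.25, 1.5), (8.5, 8.75)]
```

In practice the assign/update loop repeats until the labels stop changing; scikit-learn wraps this (plus initialization and convergence checks) behind `KMeans.fit`.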
Roles and Responsibilities:
- Work with Business stakeholders, Business Analysts, Data Analysts to understand various data flows and usage.
- Analyze and present insights about the data and processes to Business Stakeholders
- Validate and test appropriate AI/ML models based on the prioritization and insights developed while working with the Business Stakeholders
- Develop and deploy customized models on Production data sets to generate analytical insights and predictions
- Participate in cross functional team meetings and provide estimates of work as well as progress in assigned tasks.
- Highlight risks and challenges to the relevant stakeholders so that work is delivered in a timely manner.
- Share knowledge and best practices with broader teams to make everyone aware and more productive.
Qualifications:
- Minimum of a bachelor's degree in Engineering, Computer Applications, or AI/Data Science
- Experience in product companies/startups developing, validating, and productionizing AI models in recent projects within the last 3 years
- Prior experience with Python, NumPy, scikit-learn, Pandas, ETL/SQL, and BI tools in previous roles preferred
AccioJob is conducting a Walk-In Hiring Drive with Global IT Consulting for the position of Software Engineer.
To apply, register and select your slot here: https://go.acciojob.com/6ED2rL
Required Skills: DSA, SQL, OOPS
Eligibility:
Degree: BTech./BE
Branch: Computer Science/CSE/Other CS related branch, IT
Graduation Year: 2024, 2025
Work Details:
Work Location: Bangalore (Onsite)
CTC: ₹11.1 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Bangalore Centre
Further Rounds (for shortlisted candidates only):
Coding Assignment, Technical Interview 1, Technical Interview 2, Technical Interview 3
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/eapv4u
About the company:
At Estuate, more than 400 uniquely talented people work together to provide the world with next-generation product engineering and IT enterprise services. We help companies reimagine their business for the digital age.
Incorporated in 2005 in Milpitas (CA), we have grown to become a global organization with a truly global vision. At Estuate, we bring together talent, experience, and technology to meet our customer’s needs. Our ‘Extreme Service’ culture helps us deliver extraordinary results.
Our key to success:
We are an ISO-certified organization present across four distinct global geographies. We cater to industry verticals such as BFSI, Healthcare & Pharma, Retail & E-Commerce, and ISVs/Startups, and have over 2,000 projects in our portfolio.
Our solution-oriented mindset fuels our offerings, including Platform Engineering, Business Apps, and Enterprise Security & GRC.
Our culture of oneness
At Estuate, we are committed to fostering an inclusive workplace that welcomes people from diverse social circumstances. Our diverse culture shapes our success stories. Our values unite us. And, our curiosity inspires our creativity. Now, if that sounds like the place you’d like to be, we look forward to hearing more from you.
Requirements:
Technical skills
- 8+ years of experience as a Business, System, or Functional Analyst;
- Proficient in writing User Stories, Use Cases, Functional and Non-Functional requirements, system diagrams, wireframes;
- Experience working with RESTful APIs (writing requirements, API usage);
- Experience in Microservices architecture;
- Experience working with Agile methodologies (Scrum, Kanban);
- Knowledge of SQL;
- Knowledge of UML, BPMN;
- Understanding of key UX/UI practices and processes;
- Understanding of software development lifecycle;
- Understanding of the architecture of web-based applications;
- English Upper-Intermediate or higher.
Soft Skills
- Excellent communication and presentation skills;
- Proactiveness;
- Organized, detail-oriented with ability to keep overall solution in mind;
- Comfort working in a fast-paced environment, running concurrent projects, and managing BA work with multiple stakeholders;
- Good time-management skills and the ability to handle multitasking.
Good to haves
- Experience in enterprise software development or finance domain;
- Experience in delivery of desktop and web-applications;
- Experience with successful system integration projects.
Responsibilities:
- Participation in discovery phases and workshops with Customer, covering key business and product requirements;
- Managing project scope and requirements, including their impact on existing requirements, and defining dependencies on other teams;
- Creating business requirements, user stories, mockups, functional specifications and technical requirements (incl. flow diagrams, data mappings, examples);
- Close collaboration with development team (requirements presentation, backlog grooming, requirements change management, technical solution design together with Tech Lead, etc.);
- Regular communication with internal (Product, Account management, Business teams) and external stakeholders (Partners, Customers);
- Preparing UAT scenarios, validation cases;
- User Acceptance Testing;
- Demo for internal stakeholders;
- Creating documentation (user guides, technical guides, presentations).
Project Description:
Wireless Standard POS (Point-Of-Sales) is our retail management solution for the Telecom Market.
It provides thousands of retailers with the features and functionality they need to run their businesses effectively, with full visibility and control over every aspect of sales and operations. It is simple to learn and easy to use, and as operations grow, more features can be added.
Our system can optimize and simplify all processes related to retail in this business area.
Few things that our product can do:
- Robust Online Reporting
- Repair Management Software
- 3rd Party Integrations
- Customer Connect Marketing
- Time and Attendance
- Carrier Commission Reconciliation
As a Business Analyst/System Analyst, you will be the liaison between the lines of business and the development team, have the opportunity to work on a very complex product with a microservice architecture (50+ services for now), and communicate with the Product, QA, Development, Architecture, and Customer Support teams to help improve product quality.

A real-time Customer Data Platform and cross-channel marketing automation deliver superior experiences that result in increased revenue for some of the largest enterprises in the world.
Key Responsibilities:
- Design and develop backend components and sub-systems for large-scale platforms under guidance from senior engineers.
- Contribute to building and evolving the next-generation customer data platform.
- Write clean, efficient, and well-tested code with a focus on scalability and performance.
- Explore and experiment with modern technologies, especially open-source frameworks, and build small prototypes or proofs of concept.
- Use AI-assisted development tools to accelerate coding, testing, debugging, and learning while adhering to engineering best practices.
- Participate in code reviews, design discussions, and continuous improvement of the platform.
Qualifications:
- 0–2 years of experience (or strong academic/project background) in backend development with Java.
- Good fundamentals in algorithms, data structures, and basic performance optimizations.
- Bachelor’s or Master’s degree in Computer Science or IT (B.E / B.Tech / M.Tech / M.S) from premier institutes.
Technical Skill Set:
- Strong aptitude and analytical skills with emphasis on problem solving and clean coding.
- Working knowledge of SQL and NoSQL databases.
- Familiarity with unit testing frameworks and writing testable code is a plus.
- Basic understanding of distributed systems, messaging, or streaming platforms is a bonus.
AI-Assisted Engineering (LLM-Era Skills):
- Familiarity with modern AI coding tools such as Cursor, Claude Code, Codex, Windsurf, Opencode, or similar.
- Ability to use AI tools for code generation, refactoring, test creation, and learning new systems responsibly.
- Willingness to learn how to combine human judgment with AI assistance for high-quality engineering outcomes.
Soft Skills & Nice to Have
- Appreciation for technology and its ability to create real business value, especially in data and marketing platforms.
- Clear written and verbal communication skills.
- Strong ownership mindset and ability to execute in fast-paced environments.
- Prior internship or startup experience is a plus.

A real-time Customer Data Platform and cross-channel marketing automation deliver superior experiences that result in increased revenue for some of the largest enterprises in the world.
Key Responsibilities:
- Design, build, and own large-scale, distributed backend and platform systems.
- Drive architectural decisions for high-throughput, low-latency services with strong scalability and reliability guarantees.
- Build and evolve core components of a real-time Customer Data Platform, especially around data ingestion, streaming, and processing.
- Evaluate and adopt open-source and emerging platform technologies; build prototypes where needed.
- Own critical subsystems end-to-end, ensuring performance, maintainability, and operational excellence.
- Mentor junior engineers and uphold high standards through code and design reviews.
- Effectively use modern AI-assisted coding tools to accelerate development while maintaining engineering rigor.
Qualifications:
- 4–6 years of strong backend/platform engineering experience with solid fundamentals in algorithms, data structures, and optimizations.
- Proven experience designing and operating production-grade distributed systems.
- B.E / B.Tech / M.Tech / M.S / MCA in Computer Science or equivalent from premier institutes.
Technical Skills:
- Strong system and object-oriented design skills.
- Hands-on experience with SQL and NoSQL databases.
- Strong working knowledge of Kafka and streaming systems.
- Proficiency in Java, concurrency, and unit/integration testing.
- Familiarity with cloud-native environments (Kubernetes, CI/CD, observability).
AI-Assisted Engineering:
- Hands-on experience using modern AI coding platforms such as Opencode, Claude Code, Codex, Cursor, Windsurf, or similar.
- Ability to use AI tools for code generation, refactoring, testing, debugging, and design exploration responsibly.
Soft Skills & Nice to Have:
- Strong ownership mindset and ability to deliver in fast-paced environments.
- Clear written and verbal communication skills.
- Startup experience is a plus.
Job Description:
Summary
The Data Engineer will be responsible for designing, developing, and maintaining the data infrastructure, and must have experience with SQL and Python.
Roles & Responsibilities:
● Collaborate with product, business, and engineering stakeholders to understand key metrics, data needs, and reporting pain points.
● Design, build, and maintain clean, scalable, and reliable data models using DBT.
● Write performant SQL and Python code to transform raw data into structured marts and reporting layers.
● Create dashboards using Tableau or similar tools.
● Work closely with data platform engineers, architects, and analysts to ensure data pipelines are resilient, well-governed, and high quality.
● Define and maintain source-of-truth metrics and documentation in the analytics layer.
● Partner with product engineering teams to understand new features and ensure appropriate instrumentation and event collection.
● Drive reporting outcomes by building dashboards or working with BI teams to ensure timely delivery of insights.
● Help scale our analytics engineering practice by contributing to internal tooling, frameworks, and best practices.
Who You Are:
Experience: 3 to 4 years in analytics/data engineering, with strong hands-on expertise in DBT, SQL, Python, and dashboarding tools.
● Experience working with modern data stacks (e.g., Snowflake, BigQuery, Redshift, Airflow).
● Strong data modeling skills (dimensional, star/snowflake schema, data vault, etc.).
● Excellent communication and stakeholder management skills.
● Ability to work independently and drive business outcomes through data.
● Exposure to product instrumentation and working with event-driven data is a plus.
● Prior experience in a fast-paced, product-led company is preferred.
We are seeking a Data Engineer with 3–4 years of relevant experience to join our team. The ideal candidate should have strong expertise in Python and SQL and be available to join immediately.
Location: Bangalore
Experience: 3–4 Years
Joining: Immediate Joiner preferred
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and data models
- Extract, transform, and load (ETL) data from multiple sources
- Write efficient and optimized SQL queries for data analysis and reporting
- Develop data processing scripts and automation using Python
- Ensure data quality, integrity, and performance across systems
- Collaborate with cross-functional teams to support business and analytics needs
- Troubleshoot data-related issues and optimize existing processes
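As a hedged sketch of the ETL responsibilities listed above (extract, validate, aggregate), the following uses only the Python standard library; the column names and the quality rule are hypothetical, and a production pipeline would typically use Pandas or a warehouse engine instead:

```python
import csv
import io

# Invented sample input; the empty amount on the second row simulates a bad record.
RAW = "user_id,amount\n1,10.5\n2,\n1,4.5\n"

def etl(raw_csv):
    """Parse raw CSV, drop rows that fail a simple quality check, aggregate per user."""
    totals = {}
    for row in csv.DictReader(io.StringIO(raw_csv)):
        if not row["amount"]:  # data-quality gate: skip rows with a missing amount
            continue
        uid = int(row["user_id"])
        totals[uid] = totals.get(uid, 0.0) + float(row["amount"])
    return totals

print(etl(RAW))  # {1: 15.0}
```

The same shape scales up: extraction becomes a database or API read, the quality gate becomes a set of validation rules, and the aggregation becomes an optimized SQL query.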
Required Skills & Qualifications:
- 3–4 years of hands-on experience as a Data Engineer or similar role
- Strong proficiency in Python and SQL
- Experience working with relational databases and large datasets
- Good understanding of data warehousing and ETL concepts
- Strong analytical and problem-solving skills
- Ability to work independently and in a team-oriented environment
Preferred:
- Experience with cloud platforms or data tools (added advantage)
- Exposure to performance tuning and data optimization
Must have strong SQL skills (queries, optimization, stored procedures, triggers), including hands-on experience automating work through SQL.
Looking for candidates with 2+ years of experience who have worked on large datasets (1 crore records or more).
Should be comfortable handling challenges and breaking down complex problems.
Must have advanced Excel skills
Should have 3+ years of relevant experience
Should have reporting and dashboard creation experience
Should have database development and maintenance experience
Must have strong communication skills for client interactions
Should have the ability to work independently
Willingness to work from client location
Job Details
- Job Title: Java Full Stack Developer
- Industry: Global digital transformation solutions provider
- Domain: Information technology (IT)
- Experience Required: 5-7 years
- Working Mode: 3 days in office, Hybrid model.
- Job Location: Bangalore
- CTC Range: Best in Industry
Job Description:
SDET (Software Development Engineer in Test)
Job Responsibilities:
• Test Automation:
- Develop, maintain, and execute automated test scripts using test automation frameworks.
- Design and implement testing tools and frameworks to support automated testing.
• Software Development:
- Participate in the design and development of software components to improve testability.
- Write code actively, contribute to the development of tools, and work closely with developers to debug complex issues.
• Quality Assurance:
- Collaborate with the development team to understand software features and technical implementations.
- Develop quality assurance standards and ensure adherence to best testing practices.
• Integration Testing:
- Conduct integration and functional testing to ensure that components work as expected individually and when combined.
• Performance and Scalability Testing:
- Perform performance and scalability testing to identify bottlenecks and optimize application performance.
• Test Planning and Execution:
- Create detailed, comprehensive, and well-structured test plans and test cases.
- Execute manual and/or automated tests and analyze results to ensure product quality.
• Bug Tracking and Resolution:
- Identify, document, and track software defects using bug tracking tools.
- Verify fixes and work closely with developers to resolve issues.
• Continuous Improvement:
- Stay updated on emerging tools and technologies relevant to the SDET role.
- Constantly look for ways to improve testing processes and frameworks.
Skills and Qualifications:
- Strong programming skills, particularly in languages such as COBOL, JCL, Java, C#, Python, or JavaScript.
- Strong experience in Mainframe environments.
- Experience with test automation tools and frameworks like Selenium, JUnit, TestNG, or Cucumber.
- Excellent problem-solving skills and attention to detail.
- Familiarity with CI/CD tools and practices, such as Jenkins, Git, Docker, etc.
- A good understanding of web technologies and databases is often beneficial.
- Strong communication skills for interfacing with cross-functional teams.
Qualifications:
- 5+ years of experience as a software developer, QA Engineer, or SDET.
- 5+ years of hands-on experience with Java or Selenium.
- 5+ years of hands-on experience with Mainframe environments.
- 4+ years designing, implementing, and running test cases.
- 4+ years working with test processes, methodologies, tools, and technology.
- 4+ years performing functional and UI testing and quality reporting.
- 3+ years of technical QA management experience leading onshore and offshore resources.
- Passion for driving best practices in the testing space.
- Thorough understanding of functional, stress, performance, various forms of regression testing, and mobile testing.
- Knowledge of software engineering practices and agile approaches.
- Experience building or improving test automation frameworks.
- Proficiency in CI/CD integration and pipeline development in Jenkins, Spinnaker, or other similar tools.
- Proficiency in UI automation (Serenity/Selenium, Robot, Watir).
- Experience in Gherkin (BDD/TDD).
- Ability to quickly tackle and diagnose issues within the quality assurance environment and communicate that knowledge to a varied audience of technical and non-technical partners.
- Strong desire for establishing and improving product quality.
- Willingness to take challenges head on while being part of a team.
- Ability to work under tight deadlines and within a team environment.
- Experience in test automation using UFT and Selenium.
- UFT/Selenium experience building object repositories, standard & custom checkpoints, parameterization, reusable functions, recovery scenarios, descriptive programming, and API testing.
- Knowledge of VBScript, C#, Java, HTML, and SQL.
- Experience using Git or other version control systems.
- Experience developing, supporting, and/or testing web applications.
- Understanding of the need for testing of security requirements.
- Ability to understand APIs (JSON and XML formats), with experience using API testing tools like Postman, Swagger, or SoapUI.
- Excellent communication, collaboration, reporting, analytical, and problem-solving skills.
- Solid understanding of release cycles and QA/testing methodologies.
- ISTQB certification is a plus.
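To make the test-planning and automation duties above concrete, here is a minimal automated test case using Python's built-in unittest; the function under test and its discount rule are invented for illustration (the role itself centers on Selenium/UFT and Mainframe tooling):

```python
import unittest

def apply_discount(price, pct):
    """Hypothetical function under test: apply a percentage discount to a price."""
    if not 0 <= pct <= 100:
        raise ValueError("pct out of range")
    return round(price * (100 - pct) / 100, 2)

class ApplyDiscountTest(unittest.TestCase):
    """A small, well-structured test case: one happy path, one error path."""

    def test_typical_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_invalid_percentage_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 120)

# Run the suite programmatically; a CI pipeline would invoke a test runner instead.
suite = unittest.TestLoader().loadTestsFromTestCase(ApplyDiscountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

Framework-based suites (JUnit, TestNG, Selenium page tests) follow the same pattern: isolated cases, explicit expected outcomes, and a runner whose results feed the CI/CD pipeline.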
Skills: Python, Mainframe, C#
Notice period: 0 to 15 days only
About the Role
We're seeking a Python Backend Developer to join our insurtech analytics team. This role focuses on developing backend APIs, automating insurance reporting processes, and supporting data analysis tools. You'll work with insurance data, build REST APIs, and help streamline operational workflows through automation.
Key Responsibilities
- Automate insurance reporting processes including bordereaux, reconciliations, and data extraction from various file formats
- Support and maintain interactive dashboards and reporting tools for business stakeholders
- Develop Python scripts and applications for data processing, validation, and transformation
- Develop and maintain backend APIs using FastAPI or Flask
- Perform data analysis and generate insights from insurance datasets
- Automate recurring analytical and reporting tasks
- Work with SQL databases to query, analyze, and extract data
- Collaborate with business users to understand requirements and deliver solutions
- Document code, processes, and create user guides for dashboards and tools
- Support data quality initiatives and implement validation checks
Requirements
Essential
- 2+ years of Python development experience
- Strong knowledge of Python libraries: Pandas, NumPy for data manipulation
- Experience building web applications or dashboards with Python frameworks
- Knowledge of FastAPI or Flask for building backend APIs and applications
- Proficiency in SQL and working with relational databases
- Experience with data visualization libraries (Matplotlib, Plotly, Seaborn)
- Ability to work with Excel, CSV, and other data file formats
- Strong problem-solving and analytical thinking skills
- Good communication skills to work with non-technical stakeholders
Desirable
- Experience in insurance or financial services industry
- Familiarity with insurance reporting processes (bordereaux, reconciliations, claims data)
- Experience with Azure cloud services (Azure Functions, Blob Storage, SQL Database)
- Experience with version control systems (Git, GitHub, Azure DevOps)
- Experience with API development and RESTful services
Tech Stack
Python 3.x, FastAPI, Flask, Pandas, NumPy, Plotly, Matplotlib, SQL Server, MS Azure, Git, Azure DevOps, REST APIs, Excel/CSV processing libraries
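A reconciliation, mentioned above among the reporting tasks, is at heart a comparison of two data sources. Here is a minimal sketch using toy dictionaries keyed by policy id; the field names are invented, and a production version would read bordereaux from Excel/CSV with Pandas and write exceptions to a report:

```python
# Hedged illustration of a bordereau reconciliation: compare premiums recorded
# in an internal ledger against a partner's bordereau extract.
def reconcile(ledger, bordereau):
    """Return policy ids whose premium differs (or is missing) between sources."""
    mismatches = {}
    for policy_id in set(ledger) | set(bordereau):
        a, b = ledger.get(policy_id), bordereau.get(policy_id)
        if a != b:
            mismatches[policy_id] = (a, b)  # (ledger value, bordereau value)
    return mismatches

ledger    = {"P001": 120.0, "P002": 250.0, "P003": 90.0}
bordereau = {"P001": 120.0, "P002": 245.0}
print(reconcile(ledger, bordereau))
# {'P002': (250.0, 245.0), 'P003': (90.0, None)}
```

The `None` entries surface records present in one source but not the other, which is typically the first class of exception a reconciliation report has to flag.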
Employment Type: Full-time, Permanent
Location: Near Bommasandra Metro Station, Bangalore (Work from Office – 5 days/week)
Notice Period: 15 days or less preferred
About the Company:
SimStar Asia Ltd is a joint venture of the SimGems and StarGems Group — a Hong Kong–based multinational organization engaged in the global business of conflict-free, high-value diamonds.
SimStar maintains the highest standards of integrity. Any candidate found engaging in unfair practices at any stage of the interview process will be disqualified and blacklisted.
Experience Required
- 4+ years of relevant professional experience.
Key Responsibilities
- Hands-on backend development using Python (mandatory).
- Write optimized and complex SQL queries; perform query tuning and performance optimization.
- Work extensively with the Odoo framework, including development and deployment.
- Manage deployments using Docker and/or Kubernetes.
- Develop frontend components using OWL.js or any modern JavaScript framework.
- Design scalable systems with a strong foundation in Data Structures, Algorithms, and System Design.
- Handle API integrations and data exchange between systems.
- Participate in technical discussions and architecture decisions.
Interview Expectations
- Candidates must be comfortable writing live code during interviews.
- SQL queries and optimization scenarios will be part of the technical assessment.
Must-Have Skills
- Python backend development
- Advanced SQL
- Odoo Framework & Deployment
- Docker / Kubernetes
- JavaScript frontend (OWL.js preferred)
- System Design fundamentals
- API integration experience
About Snabbit: Snabbit is India’s first Quick-Service App, delivering home services in just 15 minutes through a hyperlocal network of trained and verified professionals. Backed by Nexus Venture Partners (investors in Zepto, Unacademy, and Ultrahuman), Snabbit is redefining convenience in home services with quality and speed at its core. Founded by Aayush Agarwal, former Chief of Staff at Zepto, Snabbit is pioneering the Quick-Commerce revolution in services. In a short period, we’ve completed thousands of jobs with unmatched customer satisfaction and are scaling rapidly.
At Snabbit, we don’t just build products—we craft solutions that transform everyday lives. This is a playground for engineers who love solving complex problems, building systems from the ground up, and working in a fast-paced, ownership-driven environment. You’ll work alongside some of the brightest minds, pushing boundaries and creating meaningful impact at scale.
Responsibilities:
● Design, implement, and maintain backend services and APIs
● Develop and architect complex UI features for iOS and Android apps using Flutter
● Write high-quality, efficient, and maintainable code, adhering to industry best practices.
● Participate in design discussions to develop scalable solutions and implement them.
● Take ownership of feature delivery timelines and coordinate with cross-functional teams
● Troubleshoot and debug issues to ensure smooth system operations.
● Design, develop, and own end-to-end features for in-house software and tools
● Optimize application performance and implement best practices for mobile development
● Deploy and maintain services infrastructure on AWS.
Requirements:
● Education: Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
● Experience:
○ 3-5 years of hands-on experience as a full-stack developer.
○ Expertise in developing backend services and mobile applications.
○ Experience in leading small technical projects or features
○ Proven track record of delivering complex mobile applications to production
● Technical Skills:
○ Strong knowledge of data structures, algorithms, and design patterns.
○ Proficiency in Python, and advanced proficiency in Flutter with a deep understanding of widget lifecycle and state management
○ Proficiency in RESTful APIs and microservices architecture
○ Knowledge of mobile app deployment processes and app store guidelines
○ Familiarity with version control systems (Git) and agile development methodologies
○ Experience with AWS or other relevant cloud technologies
○ Experience with databases (SQL, NoSQL) and data modeling
● Soft Skills:
○ Strong problem-solving and debugging abilities with ability to handle complex technical challenges and drive best practices within the team
○ Leadership qualities with ability to mentor and guide junior developers
○ Strong stakeholder management and client communication skills
○ A passion for learning and staying updated with technology trends.
We are looking for a Python Backend Developer to design, build, and maintain scalable backend services and APIs. The role involves working with modern Python frameworks, databases (SQL and NoSQL), and building well-tested, production-grade systems.
You will collaborate closely with frontend developers, AI/ML engineers, and system architects to deliver reliable and high-performance backend solutions.
Key Responsibilities
- Design, develop, and maintain backend services using Python
- Build and maintain RESTful APIs using FastAPI
- Design efficient data models and queries using MongoDB and SQL databases (PostgreSQL/MySQL)
- Ensure high performance, security, and scalability of backend systems
- Write unit tests, integration tests, and API tests to ensure code reliability
- Debug, troubleshoot, and resolve production issues
- Follow clean code practices, documentation, and version control workflows
- Participate in code reviews and contribute to technical discussions
- Work closely with cross-functional teams to translate requirements into technical solutions
Required Skills & Qualifications
Technical Skills
- Strong proficiency in Python
- Hands-on experience with FastAPI
- Experience with MongoDB (schema design, indexing, aggregation)
- Solid understanding of SQL databases and relational data modelling
- Experience writing and maintaining automated tests
- Unit testing (e.g., pytest)
- API testing
- Understanding of REST API design principles
- Familiarity with Git and collaborative development workflows
Good to Have
- Experience with async programming in Python (async/await)
- Knowledge of ORMs/ODMs (SQLAlchemy, Tortoise, Motor, etc.)
- Basic understanding of authentication & authorisation (JWT, OAuth)
- Exposure to Docker / containerised environments
- Experience working in Agile/Scrum teams
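Since async programming appears in the good-to-have list, here is a minimal, self-contained sketch of the `async`/`await` pattern for I/O-bound work, using only the standard library (the endpoint names are illustrative, not from any real API):

```python
import asyncio

# Three simulated network calls run concurrently on the event loop,
# so total wall time is roughly the slowest call, not the sum.
async def fetch(name: str, delay: float) -> str:
    await asyncio.sleep(delay)  # stands in for an awaited network call
    return f"{name}: done"

async def main() -> list:
    # gather() schedules all coroutines concurrently and returns
    # their results in argument order.
    return await asyncio.gather(
        fetch("users", 0.02),
        fetch("orders", 0.01),
        fetch("inventory", 0.03),
    )

results = asyncio.run(main())
print(results)  # ['users: done', 'orders: done', 'inventory: done']
```

FastAPI builds directly on this model: declaring a route handler with `async def` lets the framework serve other requests while one handler awaits a database or HTTP call.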
What We Value
- Strong problem-solving and debugging skills
- Attention to detail and commitment to quality
- Ability to write testable, maintainable, and well-documented code
- Ownership mindset and willingness to learn
- Teamwork
What We Offer
- Opportunity to work on real-world, production systems
- Technically challenging problems and ownership of components
- Collaborative engineering culture
Review Criteria:
- Strong full-stack Software Engineer profile using Node.js/Python and React
- 6+ years of experience in software development using Python or Node.js (backend) and React (frontend)
- Must have strong experience working with TypeScript
- Must have experience with message-based systems such as Kafka, RabbitMQ, and Redis
- Databases - PostgreSQL & NoSQL databases like MongoDB
- Product Companies Only
- Tier 1 Engineering Institutes preferred (IIT, NIT, BITS, IIIT, DTU or equivalent)
Preferred:
- Experience in Fin-Tech, Payment, POS and Retail products is highly preferred
- Experience in mentoring, coaching the team.
Role & Responsibilities:
We are currently seeking a Senior Engineer to join our Financial Services team, contributing to the design and development of scalable systems.
The Ideal Candidate Will Be Able To-
- Take ownership of delivering performant, scalable, and high-quality cloud-based software on both the frontend and backend.
- Mentor team members to develop in line with product requirements.
- Collaborate with Senior Architect for design and technology choices for product development roadmap.
- Do code reviews.
Ideal Candidate:
- Thorough knowledge of developing cloud-based software, including backend APIs and React-based frontends.
- Thorough knowledge of scalable design patterns and message-based systems such as Kafka, RabbitMQ, Redis, MongoDB, ORMs, SQL, etc.
- Experience with AWS services such as S3, IAM, Lambda, etc.
- Expert-level coding skills in Python (FastAPI/Django), Node.js, TypeScript, and React.
- Eye for user responsive designs on the frontend.
Role Overview:
We are looking for a detail-oriented Quality Assurance (QA) Tester who is passionate about delivering high-quality consumer-facing applications. This role involves manual testing with exposure to automation, API testing, databases, and mobile/web platforms, while working closely with engineering and product teams across the SDLC.
Products:
• Openly – A conversation-first social app focused on meaningful interactions.
• Playroom – Voicechat – A real-time voice chat platform for live community engagement.
• FriendChat – A chatroom-based social app for discovering and connecting with new people.
Key Responsibilities:
• Perform manual testing for Android, web, and native applications.
• Create and execute detailed test scenarios, test cases, and test plans.
• Conduct REST API testing using Postman.
• Validate data using SQL and MongoDB.
• Identify, report, and track defects with clear reproduction steps.
• Support basic automation testing using Selenium (Java) and Appium.
• Perform regression, smoke, sanity, and exploratory testing.
• Conduct risk analysis and highlight quality risks early in the SDLC.
• Collaborate closely with developers and product teams for defect resolution.
• Participate in CI/CD pipelines and support automated test executions.
• Use ADB tools for Android testing across devices and environments.
Required Skills & Technical Expertise:
• Strong knowledge of Manual Testing fundamentals.
• Hands-on experience with Postman and REST APIs.
• Working knowledge of SQL and MongoDB.
• Ability to design effective test scenarios.
• Basic understanding of Automation Testing concepts.
• Familiarity with SDLC and QA methodologies.
• Exposure to Selenium with Java and Appium.
• Understanding of Android, web, and native application testing.
• Experience using proxy tools for debugging and network inspection.
Good to Have:
• Exposure to CI/CD tools and pipelines.
• Hands-on experience with Appium, K6, Kafka, and proxy tools.
• Basic understanding of performance and load testing.
• Awareness of risk-based testing strategies.
Key Traits:
• High attention to detail and quality.
• Strong analytical and problem-solving skills.
• Clear communication and collaboration abilities.
• Eagerness to learn and grow in automation and advanced testing tools.
We are looking for a skilled Data Engineer / Data Warehouse Engineer to design, develop, and maintain scalable data pipelines and enterprise data warehouse solutions. The role involves close collaboration with business stakeholders and BI teams to deliver high-quality data for analytics and reporting.
Key Responsibilities
- Collaborate with business users and stakeholders to understand business processes and data requirements
- Design and implement dimensional data models, including fact and dimension tables
- Identify, design, and implement data transformation and cleansing logic
- Build and maintain scalable, reliable, and high-performance ETL/ELT pipelines
- Extract, transform, and load data from multiple source systems into the Enterprise Data Warehouse
- Develop conceptual, logical, and physical data models, including metadata, data lineage, and technical definitions
- Design, develop, and maintain ETL workflows and mappings using appropriate data load techniques
- Provide high-level design, research, and effort estimates for data integration initiatives
- Provide production support for ETL processes to ensure data availability and SLA adherence
- Analyze and resolve data pipeline and performance issues
- Partner with BI teams to design and develop reports and dashboards while ensuring data integrity and quality
- Translate business requirements into well-defined technical data specifications
- Work with data from ERP, CRM, HRIS, and other transactional systems for analytics and reporting
- Define and document BI usage through use cases, prototypes, testing, and deployment
- Support and enhance data governance and data quality processes
- Identify trends, patterns, anomalies, and data quality issues, and recommend improvements
- Train and support business users, IT analysts, and developers
- Lead and collaborate with teams spread across multiple locations
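The dimensional-modeling responsibility above can be made concrete with a tiny star schema: one fact table keyed to two dimension tables, queried for a typical rollup. This is an illustrative sketch using Python's built-in `sqlite3`; table and column names are invented:

```python
import sqlite3

# Star schema: fact_sales references dim_date and dim_product.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity INTEGER,
    amount REAL
);
INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gizmo', 'Hardware');
INSERT INTO fact_sales VALUES
    (20240101, 1, 10, 100.0),
    (20240101, 2, 5, 250.0),
    (20240201, 1, 4, 40.0);
""")

# Typical analytical query: revenue by month across the joined schema.
rows = conn.execute("""
    SELECT d.year, d.month, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.year, d.month
    ORDER BY d.year, d.month
""").fetchall()
print(rows)  # [(2024, 1, 350.0), (2024, 2, 40.0)]
```

In a warehouse like BigQuery the same shape applies; surrogate keys on the dimensions keep the fact table narrow and the joins cheap.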
Required Skills & Qualifications
- Bachelor’s degree in Computer Science or a related field, or equivalent work experience
- 3+ years of experience in Data Warehousing, Data Engineering, or Data Integration
- Strong expertise in data warehousing concepts, tools, and best practices
- Excellent SQL skills
- Strong knowledge of relational databases such as SQL Server, PostgreSQL, and MySQL
- Hands-on experience with Google Cloud Platform (GCP) services, including:
- BigQuery
- Cloud SQL
- Cloud Composer (Airflow)
- Dataflow
- Dataproc
- Cloud Functions
- Google Cloud Storage (GCS)
- Experience with Informatica PowerExchange for Mainframe, Salesforce, and modern data sources
- Strong experience integrating data using APIs, XML, JSON, and similar formats
- In-depth understanding of OLAP, ETL frameworks, Data Warehousing, and Data Lakes
- Solid understanding of SDLC, Agile, and Scrum methodologies
- Strong problem-solving, multitasking, and organizational skills
- Experience handling large-scale datasets and database design
- Strong verbal and written communication skills
- Experience leading teams across multiple locations
Good to Have
- Experience with SSRS and SSIS
- Exposure to AWS and/or Azure cloud platforms
- Experience working with enterprise BI and analytics tools
Why Join Us
- Opportunity to work on large-scale, enterprise data platforms
- Exposure to modern cloud-native data engineering technologies
- Collaborative environment with strong stakeholder interaction
- Career growth and leadership opportunities
About Upsurge Labs
We're building the infrastructure and products that will shape how human civilization operates in the coming decades. The specifics evolve—the ambition doesn't.
The Role
The way software gets built is undergoing a fundamental shift. AI can now write, test, debug, and ship production-grade systems across web, mobile, embedded, robotics, and infrastructure. The bottleneck is no longer typing code—it's knowing what to build, why, and how the pieces fit together.
We're hiring Systems Engineers: people who can navigate an entire development cycle—from problem definition to production deployment—by directing AI tools and reasoning from first principles. You won't specialize in one stack. You'll operate across all of them.
This role replaces traditional dev teams. You'll work largely autonomously, shipping complete systems that previously required 3-5 specialists.
What You'll Do
- Own entire products and systems end-to-end: architecture, implementation, deployment, iteration
- Work across domains as needed—backend services, frontend interfaces, mobile apps, data pipelines, DevOps, embedded software, robotic systems
- Use AI tools to write, review, test, and debug code at high velocity
- Identify when AI output is wrong, incomplete, or subtly broken—and know how to fix it or when to escalate
- Make architectural decisions: database selection, protocol choices, system boundaries, performance tradeoffs
- Collaborate directly with designers, domain experts, and leadership
- Ship. Constantly.
What You Bring
First-principles thinking
You understand how systems work at a foundational level. When something breaks, you reason backward from the error to potential causes. You know the difference between a network timeout, a malformed query, a race condition, and a misconfigured environment—even if you haven't memorized the fix.
Broad technical fluency
You don't need to be an expert in everything. But you need working knowledge across:
- How web systems work: HTTP, DNS, TLS, REST, WebSockets, authentication flows
- How databases work: relational vs document vs key-value, indexing, query structure, transactions
- How infrastructure works: containers, orchestration, CI/CD, cloud primitives, networking basics
- How frontend works: rendering, state management, browser APIs, responsive design
- How mobile works: native vs cross-platform tradeoffs, app lifecycle, permissions
- How embedded/robotics software works: real-time constraints, sensor integration, communication protocols
You should be able to read code in any mainstream language and understand what it's doing.
AI-native workflow
You've already built real things using AI tools. You know how to prompt effectively, how to structure problems so AI can help, how to validate AI output, and when to step in manually.
High agency
You don't wait for permission or detailed specs. You figure out what needs to happen and make it happen. Ambiguity doesn't paralyze you.
Proof of work
Show us what you've built. Live products, GitHub repos, side projects, internal tools—anything that demonstrates you can ship complete systems.
What We Don't Care About
- Degrees or formal credentials
- Years of experience in a specific language or framework
- Whether you came from a "traditional" engineering path
What You'll Get
- Direct line to the CEO
- Autonomy to own large problem spaces
- A front-row seat to how engineering work is evolving
- Colleagues who ship fast and think clearly

Full‑Stack Engineer (Python/Django & Next.js)
Location: Bangalore
Experience: 2–8 years of hands‑on full‑stack development
We’re looking for a passionate Full‑Stack Engineer to join our team and help build secure, scalable systems that power exceptional customer experiences.
Key Skills -
• Architect and develop secure, scalable applications
• Collaborate closely with product & design teams
• Manage CI/CD pipelines and deployments
• Mentor engineers and enforce coding best practices
What we’re looking for:
• Strong expertise in Python/Django & Next.js/React
• Hands‑on with PostgreSQL, Docker, AWS/GCP
• Experience leading engineering teams
• Excellent problem‑solving & communication skills
If you’re excited about building impactful products and driving engineering excellence, apply now!
The Opportunity
Planview is looking for a passionate Sr Data Scientist to join our team tasked with developing innovative tools for connected work. You are an experienced expert in supporting enterprise applications using Data Analytics, Machine Learning, and Generative AI.
You will use this experience to lead other data scientists and data engineers. You will also effectively engage with product teams to specify, validate, prototype, scale, and deploy features with a consistent customer experience across the Planview product suite.
Responsibilities (What you'll do)
- Enable Data Science features within Planview applications by working in a fast-paced start-up mindset.
- Collaborate closely with product management to enable Data Science features that deliver significant value to customers, ensuring that these features are optimized for operational efficiency.
- Manage every stage of the AI/ML development lifecycle, from initial concept through deployment in a production environment.
- Provide leadership to other Data Scientists by exemplifying exceptional quality in work, nurturing a culture of continuous learning, and offering daily guidance in their research endeavors.
- Effectively communicate ideas drawn from complex data with clarity and insight.
Qualifications (What you'll bring)
- Master’s in Operations Research, Statistics, Computer Science, Data Science, or a related field.
- 8+ years of experience as a data scientist, data engineer, or ML engineer.
- Demonstrable history of bringing Data Science features to enterprise applications.
- Exceptional Python and SQL coding skills.
- Experience with Optimization, Machine Learning, Generative AI, NLP, Statistics, and Simulation.
- Experience with AWS Data and ML Technologies (SageMaker, Glue, Athena, Redshift)
Preferred qualifications:
- Experience working with datasets in the domains of project management, software development, and resource planning.
- Experience with common libraries and frameworks in data science (Scikit Learn, TensorFlow, PyTorch).
- Experience with ML platform tools (AWS SageMaker).
- Skilled at working as part of a global, diverse workforce of high-performing individuals.
- AWS Certification is a plus
We are seeking a highly skilled and experienced Python Developer with a strong background in fintech to join our dynamic team. The ideal candidate will have at least 7 years of professional experience in Python development, with a proven track record of delivering high-quality software solutions in the fintech industry.
Responsibilities:
Design, build, and maintain RESTful APIs using Django and Django Rest Framework.
Integrate AI/ML models into existing applications to enhance functionality and provide data-driven insights.
Collaborate with cross-functional teams, including product managers, designers, and other developers, to define and implement new features and functionalities.
Manage deployment processes, ensuring smooth and efficient delivery of applications.
Implement and maintain payment gateway solutions to facilitate secure transactions.
Conduct code reviews, provide constructive feedback, and mentor junior members of the development team.
Stay up-to-date with emerging technologies and industry trends, and evaluate their potential impact on our products and services.
Maintain clear and comprehensive documentation for all development processes and integrations.
Requirements:
Proficiency in Python and Django/Django Rest Framework.
Experience with REST API development and integration.
Knowledge of AI/ML concepts and practical experience integrating AI/ML models.
Hands-on experience with deployment tools and processes.
Familiarity with payment gateway integration and management.
Strong understanding of database systems (SQL, PostgreSQL, MySQL).
Experience with version control systems (Git).
Strong problem-solving skills and attention to detail.
Excellent communication and teamwork skills.
Job Types: Full-time, Permanent
Work Location: In person
About Us
MIC Global is a full-stack micro-insurance provider, purpose-built to design and deliver embedded parametric micro-insurance solutions to platform companies. Our mission is to make insurance more accessible for new, emerging, and underserved risks using our MiIncome loss-of-income products, MiConnect, MiIdentity, Coverpoint technology, and more — backed by innovative underwriting capabilities as a Lloyd’s Coverholder and through our in-house reinsurer, MicRe.
We operate across 12+ countries, with our Global Operations Center in Bangalore supporting clients worldwide, including a leading global ride-hailing platform and a top international property rental marketplace. Our distributed teams across the UK, USA, and Asia collaborate to ensure that no one is beyond the reach of financial security.
About the Team
As a Lead Data Specialist at MIC Global, you will play a key role in transforming data into actionable insights that inform strategic and operational decisions. You will work closely with Product, Engineering, and Business teams to analyze trends, build dashboards, and ensure that data pipelines and reporting structures are accurate, automated, and scalable.
This is a hands-on, analytical, and technically focused role ideal for someone experienced in data analytics and engineering practices. You will use SQL, Python, and modern BI tools to interpret large datasets, support pricing models, and help shape the data-driven culture across MIC Global.
Key Roles and Responsibilities
Data Analytics & Insights
- Analyze complex datasets to identify trends, patterns, and insights that support business and product decisions.
- Partner with Product, Operations, and Finance teams to generate actionable intelligence on customer behavior, product performance, and risk modeling.
- Contribute to the development of pricing models, ensuring accuracy and commercial relevance.
- Deliver clear, concise data stories and visualizations that drive executive and operational understanding.
- Develop analytical toolkits for underwriting, pricing and claims
Data Engineering & Pipeline Management
- Design, implement, and maintain reliable data pipelines and ETL workflows.
- Write clean, efficient scripts in Python for data cleaning, transformation, and automation.
- Ensure data quality, integrity, and accessibility across multiple systems and environments.
- Work with Azure data services to store, process, and manage large datasets efficiently.
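A hedged sketch of the Python cleaning-and-transformation step described above: normalize field formats and reject invalid rows before they enter a pipeline. Field names and values are invented for illustration:

```python
from datetime import datetime

# Raw rows as they might arrive from an upstream extract.
raw_rows = [
    {"policy_id": " P-001 ", "premium": "1,200.50", "start": "2024-01-05"},
    {"policy_id": "P-002", "premium": "abc", "start": "2024-02-10"},   # bad premium
    {"policy_id": "P-003", "premium": "980.00", "start": "2024-03-01"},
]

def clean(row):
    """Normalize one record; return None if it fails validation."""
    try:
        return {
            "policy_id": row["policy_id"].strip(),
            "premium": float(row["premium"].replace(",", "")),
            "start": datetime.strptime(row["start"], "%Y-%m-%d").date(),
        }
    except (ValueError, KeyError):
        return None  # reject rather than propagate bad data downstream

cleaned = [r for r in (clean(row) for row in raw_rows) if r is not None]
print([r["policy_id"] for r in cleaned])  # ['P-001', 'P-003']
```

In practice the same validate-or-reject pattern scales up through pandas or an orchestration layer, with the rejected rows routed to a quarantine table for review.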
Business Intelligence & Reporting
- Develop, maintain, and optimize dashboards and reports using Power BI (or similar tools).
- Automate data refreshes and streamline reporting processes for cross-functional teams.
- Track and communicate key business metrics, providing proactive recommendations.
Collaboration & Innovation
- Collaborate with engineers, product managers, and business leads to align analytical outputs with company goals.
- Support the adoption of modern data tools and agentic AI frameworks to improve insight generation and automation.
- Continuously identify opportunities to enhance data-driven decision-making across the organization.
Ideal Candidate Profile
- 10+ years of relevant experience in data analysis or business intelligence, ideally within product-based SaaS, fintech, or insurance environments.
- Proven expertise in SQL for data querying, manipulation, and optimization.
- Hands-on experience with Python for data analytics, automation, and scripting.
- Strong proficiency in Power BI, Tableau, or equivalent BI tools.
- Experience working in Azure or other cloud-based data ecosystems.
- Solid understanding of data modeling, ETL processes, and data governance.
- Ability to translate business questions into technical analysis and communicate findings effectively.
Preferred Attributes
- Experience in insurance or fintech environments, especially operations and claims analytics.
- Exposure to agentic AI and modern data stack tools (e.g., dbt, Snowflake, Databricks).
- Strong attention to detail, analytical curiosity, and business acumen.
- Collaborative mindset with a passion for driving measurable impact through data.
Benefits
- 33 days of paid holiday
- Competitive compensation well above market average
- Work in a high-growth, high-impact environment with passionate, talented peers
- Clear path for personal growth and leadership development
About the Team
We're seeking a mid-level Data Engineer with strong DBA experience to join our insurtech data analytics team. This role focuses on supporting various teams including infrastructure, reporting, and analytics. You'll be responsible for SQL performance optimization, building data pipelines, implementing data quality checks, and helping teams with database-related challenges. You'll work closely with the infrastructure team on production support, assist the reporting team with complex queries, and support the analytics team in building visualizations and dashboards.
Key Roles and Responsibilities
Database Administration & Optimization
- Support infrastructure team with production database issues and troubleshooting
- Debug and resolve SQL performance issues, identify bottlenecks, and optimize queries
- Optimize stored procedures, functions, and views for better performance
- Perform query tuning, index optimization, and execution plan analysis
- Design and develop complex stored procedures, functions, and views
- Support the reporting team with complex SQL queries and database design
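The query-tuning workflow above (inspect the plan, add an index, confirm the plan changes) can be sketched end to end. This uses SQLite's `EXPLAIN QUERY PLAN` purely for illustration; the role targets MS SQL Server, where the equivalent tools are execution plans, XEvents, and the missing-index DMVs:

```python
import sqlite3

# A table with a frequently filtered column but no supporting index.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (id INTEGER PRIMARY KEY, status TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims (status, amount) VALUES (?, ?)",
    [("open" if i % 10 else "closed", i * 1.0) for i in range(1000)],
)

query = "SELECT COUNT(*) FROM claims WHERE status = 'closed'"

# Before: the plan shows a full table scan.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(before[0][-1])

# Add an index on the filtered column and re-check the plan.
conn.execute("CREATE INDEX idx_claims_status ON claims(status)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(after[0][-1])  # now searches via idx_claims_status
```

The diagnosis loop is the same on any engine: read the plan, identify the scan or expensive operator, and fix it with an index, a rewrite, or updated statistics.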
Data Engineering & Pipelines
- Design and build ETL/ELT pipelines using Azure Data Factory and Python
- Implement data quality checks and validation rules before data enters pipelines
- Develop data integration solutions to connect various data sources and systems
- Create automated data validation, quality monitoring, and alerting mechanisms
- Develop Python scripts for data processing, transformation, and automation
- Build and maintain data models to support reporting and analytics requirements
Support & Collaboration
- Help data analytics team build visualizations and dashboards by providing data models and queries
- Support reporting team with data extraction, transformation, and complex reporting queries
- Collaborate with development teams to support application database requirements
- Provide technical guidance and best practices for database design and query optimization
Azure & Cloud
- Work with Azure services including Azure SQL Database, Azure Data Factory, Azure Storage, Azure Functions, and Azure ML
- Implement cloud-based data solutions following Azure best practices
- Support cloud database migrations and optimizations
- Work with Agentic AI concepts and tools to build intelligent data solutions
Ideal Candidate Profile
Essential
- 5-8 years of experience in data engineering and database administration
- Strong expertise in MS SQL Server (2016+) administration and development
- Proficient in writing complex SQL queries, stored procedures, functions, and views
- Hands-on experience with Microsoft Azure services (Azure SQL Database, Azure Data Factory, Azure Storage)
- Strong Python scripting skills for data processing and automation
- Experience with ETL/ELT design and implementation
- Knowledge of database performance tuning, query optimization, and indexing strategies
- Experience with SQL performance debugging tools (XEvents, Profiler, or similar)
- Understanding of data modeling and dimensional design concepts
- Knowledge of Agile methodology and experience working in Agile teams
- Strong problem-solving and analytical skills
- Understanding of Agentic AI concepts and tools
- Excellent communication skills and ability to work with cross-functional teams
Desirable
- Knowledge of insurance or financial services domain
- Experience with Azure ML and machine learning pipelines
- Experience with Azure DevOps and CI/CD pipelines
- Familiarity with data visualization tools (Power BI, Tableau)
- Experience with NoSQL databases (Cosmos DB, MongoDB)
- Knowledge of Spark, Databricks, or other big data technologies
- Azure certifications (Azure Data Engineer Associate, Azure Database Administrator Associate)
- Experience with version control systems (Git, Azure Repos)
Tech Stack
- MS SQL Server 2016+, Azure SQL Database, Azure Data Factory, Azure ML, Azure Storage, Azure Functions, Python, T-SQL, Stored Procedures, ETL/ELT, SQL Performance Tools (XEvents, Profiler), Agentic AI Tools, Azure DevOps, Power BI, Agile, Git
Benefits
- 33 days of paid holiday
- Competitive compensation well above market average
- Work in a high-growth, high-impact environment with passionate, talented peers
- Clear path for personal growth and leadership development
Location: Hybrid (Bangalore)
Travel: Quarterly travel to Seattle (US)
Education: B.Tech from premium institutes only
Note: Only immediate joiners (0 to 15 days notice period) will be considered; no other applications accepted.
Role Summary
We are seeking top-tier Lead Engineers who can design, build, and deliver large-scale distributed systems with high performance, reliability, and operational excellence. The ideal candidate will be a hands-on engineer with expert system design ability, deep understanding of distributed architectures, and strong communication and leadership skills.
The Lead Engineer must be able to convert complex and ambiguous requirements into a fully engineered architecture and implementation plan covering components, data flows, infrastructure, observability, and operations.
Key Responsibilities
1. End-to-End System Architecture
- Architect scalable, reliable, and secure systems from initial concept through production rollout.
- Define system boundaries, components, service responsibilities, and integration points.
- Produce high-level (HLD) and low-level design (LLD) documents.
- Ensure designs meet performance, reliability, security, and cost objectives.
- Make informed design trade-offs with solid technical reasoning.
2. Component & Communication Design
- Break complex systems into independently deployable services.
- Define APIs, communication contracts, data models, and event schemas.
- Apply modern architecture patterns such as microservices, event-driven design, DDD, CQRS, and hexagonal architecture.
- Ensure component clarity, maintainability, and extensibility.
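One way to pin down the "communication contracts and event schemas" above is an explicitly versioned event type. A minimal Python sketch (event and field names are invented; real systems often use Avro, Protobuf, or JSON Schema with a registry):

```python
import json
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class OrderCreated:
    """Contract shared between producer and consumers."""
    schema_version: int
    order_id: str
    amount_cents: int
    currency: str

def to_message(event: OrderCreated) -> str:
    # Serialize to the agreed wire format.
    return json.dumps(asdict(event), sort_keys=True)

def from_message(payload: str) -> OrderCreated:
    data = json.loads(payload)
    # Consumers check the version before trusting the shape.
    if data["schema_version"] != 1:
        raise ValueError(f"unsupported schema version {data['schema_version']}")
    return OrderCreated(**data)

msg = to_message(OrderCreated(1, "ord-42", 1999, "INR"))
event = from_message(msg)
print(event.order_id, event.amount_cents)  # ord-42 1999
```

Carrying the version inside the payload is what lets producers evolve the schema without breaking consumers that only understand older shapes.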
3. Communication Protocol & Middleware
- Design both sync and async communication layers: REST, RPC, gRPC, message queues, event streams (Kafka/Kinesis/Pulsar).
- Define retry/timeout strategies, circuit breakers, rate limiting, and versioning strategies.
- Handle backpressure, partitioning, delivery semantics (at-least/at-most/exactly once).
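As an illustration of the retry strategies listed above, here is a hedged sketch of retry with exponential backoff; a production version would add jitter and sit behind a circuit breaker so repeated failures stop hammering a sick dependency:

```python
import time

def call_with_retries(fn, max_attempts=4, base_delay=0.01):
    """Retry fn on transient errors, doubling the delay each attempt."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except ConnectionError:
            if attempt == max_attempts:
                raise  # retry budget exhausted; surface the failure
            time.sleep(base_delay * 2 ** (attempt - 1))  # 0.01s, 0.02s, 0.04s

# Simulated flaky dependency: fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

result = call_with_retries(flaky)
print(result)  # ok (succeeds on the third attempt)
```

Note that retries interact with delivery semantics: retrying a non-idempotent call can turn at-least-once delivery into duplicate side effects, which is why idempotency keys usually accompany this pattern.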
4. Data Architecture & Storage Strategy
- Architect data models and storage strategies for SQL and NoSQL databases, distributed caches, blob stores, and search indexes.
- Define sharding/partitioning, replication, consistency, indexing, backup/restore, and schema evolution strategies.
- Design real-time and batch data processing pipelines.
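The sharding/partitioning strategy above reduces, at its simplest, to routing each key to one of N shards with a stable hash. A minimal sketch (shard count and keys are illustrative; real systems often use consistent hashing so that changing N moves only a fraction of keys):

```python
import hashlib

NUM_SHARDS = 4

def shard_for(key: str, num_shards: int = NUM_SHARDS) -> int:
    # A stable cryptographic hash, not Python's salted hash(),
    # so routing is identical across processes and restarts.
    digest = hashlib.sha256(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % num_shards

keys = ["user:1001", "user:1002", "order:77", "order:78"]
placement = {k: shard_for(k) for k in keys}
print(placement)

# The same key always lands on the same shard.
assert shard_for("user:1001") == shard_for("user:1001")
```

The design choice hidden in `% num_shards` is exactly the trade-off a lead engineer must call out: modulo routing is simple, but resharding rehashes nearly every key, which is why consistent hashing or range partitioning is preferred when shard counts change.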
5. Operational Readiness
- Define observability (metrics, logs, traces) requirements.
- Collaborate with DevOps to ensure deployment, monitoring, alerts, and incident management readiness.
- Provide production support as a senior technical owner.
6. Leadership & Influence
- Lead technical discussions, design reviews, and cross-team collaboration.
- Mentor engineers and help elevate team practices.
- Influence technology direction and architectural standards.
Required Qualifications
- 10+ years of professional software engineering experience with strong backend and distributed systems background.
- Proven track record of leading large-scale architecture and delivery of production systems.
- Expert in system design with the ability to simplify ambiguity and craft robust solutions.
- Strong programming experience in one or more languages (Java, Go, Python, C++).
- Deep understanding of distributed systems, message streaming, queues, RPC/REST, and event-driven architecture.
- Experience with cloud platforms (AWS/Azure/GCP) and container technologies (Kubernetes/Docker).
- Strong communication, documentation, and leadership skills.
Preferred Skills
- Experience with large-scale messaging/streaming (Kafka/Pulsar), caching, and NoSQL.
- Experience designing for high availability, fault tolerance, and performance at scale.
- Experience mentoring and leading global engineering teams.
- Familiarity with observability tooling (Grafana, Prometheus, Jaeger).
Role Summary:
We are seeking experienced Application Support Engineers to join our client-facing support team. The ideal candidate will be the first point of contact for client issues, ensuring timely resolution, clear communication, and high customer satisfaction in a fast-paced trading environment.
Key Responsibilities:
• Act as the primary contact for clients reporting issues related to trading applications and platforms.
• Log, track, and monitor issues using internal tools and ensure resolution within defined TAT (Turnaround Time).
• Liaise with development, QA, infrastructure, and other internal teams to drive issue resolution.
• Provide clear and timely updates to clients and stakeholders regarding issue status and resolution.
• Maintain comprehensive logs of incidents, escalations, and fixes for future reference and audits.
• Offer appropriate and effective resolutions for client queries on functionality, performance, and usage.
• Communicate proactively with clients about upcoming product features, enhancements, or changes.
• Build and maintain strong relationships with clients through regular, value-added interactions.
• Collaborate in conducting UAT, release validations, and production deployment verifications.
• Assist in root cause analysis and post-incident reviews to prevent recurrences.
Required Skills & Qualifications:
• Bachelor's degree in computer science, IT, or related field.
• 2+ years in Application/Technical Support, preferably in the broking/trading domain.
• Sound understanding of capital markets – Equity, F&O, Currency, Commodities.
• Strong technical troubleshooting skills – Linux/Unix, SQL, log analysis.
• Familiarity with trading systems, RMS, OMS, APIs (REST/FIX), and order lifecycle.
• Excellent communication and interpersonal skills for effective client interaction.
• Ability to work under pressure during trading hours and manage multiple priorities.
• Customer-centric mindset with a focus on relationship building and problem-solving.
Nice to Have:
• Exposure to broking platforms like NOW, NEST, ODIN, or custom-built trading tools.
• Experience interacting with exchanges (NSE, BSE, MCX) or clearing corporations.
• Knowledge of scripting (Shell/Python) and basic networking is a plus.
• Familiarity with cloud environments (AWS/Azure) and monitoring tools.
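The log-analysis and scripting skills this role asks for often come down to small parsers like the following sketch. The timestamp layout and level names are assumptions for illustration; real trading-platform logs will differ:

```python
import re
from collections import Counter

# Assumes lines like: "2024-01-01 10:00:01 ERROR margin check failed"
LOG_LINE = re.compile(r"^(?P<ts>\S+ \S+) (?P<level>[A-Z]+) (?P<msg>.*)$")

def summarize_errors(lines):
    """Count log lines per level and collect ERROR messages for triage."""
    levels = Counter()
    errors = []
    for line in lines:
        m = LOG_LINE.match(line)
        if not m:
            continue  # skip lines that don't match the expected format
        levels[m.group("level")] += 1
        if m.group("level") == "ERROR":
            errors.append(m.group("msg"))
    return levels, errors
```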
Why Join Us?
• Be part of a team supporting mission-critical systems in real-time.
• Work in a high-energy, tech-driven environment.
• Opportunities to grow into domain/tech leadership roles.
• Competitive salary and benefits, health coverage, and employee wellness programs.
JOB DETAILS:
- Job Title: Senior Business Analyst
- Industry: Ride-hailing
- Experience: 4-7 years
- Working Days: 5 days/week
- Work Mode: ONSITE
- Job Location: Bangalore
- CTC Range: Best in Industry
Required Skills: Data Visualization, Data Analysis, Strong in Python and SQL, Cross-Functional Communication & Stakeholder Management
Criteria:
1. Candidate must have 4–7 years of experience in analytics / business analytics roles.
2. Candidate must be currently based in Bangalore only (no relocation allowed).
3. Candidate must have hands-on experience with Python and SQL.
4. Candidate must have experience working with databases/APIs (Mongo, Presto, REST or similar).
5. Candidate must have experience building dashboards/visualizations (Tableau, Metabase or similar).
6. Candidate must be available for face-to-face interviews in Bangalore.
7. Candidate must have experience working closely with business, product, and operations teams.
Description
Job Responsibilities:
● Acquire data from primary/secondary data sources such as MongoDB, Presto, and REST APIs.
● Candidate must have strong hands-on experience in Python and SQL.
● Build visualizations to communicate data to key decision-makers; familiarity with building interactive dashboards in Tableau/Metabase is preferred.
● Establish the relationship between an output metric and its drivers, identify the critical ones, and control them to achieve the desired value of the output metric.
● Partner with operations/business teams to consult on, develop, and implement KPIs, automated reporting/process solutions, and process improvements to meet business needs.
● Collaborate with business owners and product teams, perform data analysis of experiments, and recommend the next best action for the business; this involves being embedded in business decision teams to drive faster decision making.
● Collaborate with functional teams across the organization, using raw data and metrics to back up assumptions, develop hypotheses/business cases, and complete root cause analyses, thereby delivering output to business users.
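The driver-analysis responsibility above (relating an output metric to its drivers and ranking the critical ones) can be sketched with plain Pearson correlation. This is a simplified, stdlib-only illustration; the metric and driver names are invented for the example:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between a candidate driver and the output metric."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def rank_drivers(metric, drivers):
    """Rank candidate drivers by absolute correlation with the metric."""
    scored = {name: pearson(vals, metric) for name, vals in drivers.items()}
    return sorted(scored.items(), key=lambda kv: abs(kv[1]), reverse=True)
```

In practice an analyst would follow a ranking like this with controlled experiments, since correlation alone does not establish that a driver is causal.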
Job Requirements:
● Undergraduate and/or graduate degree in Math, Economics, Statistics, Engineering, Computer Science, or other quantitative field.
● 4-6 years of experience embedded in analytics and adjacent business teams, working as an analyst who aids decision making.
● Proficiency in Excel and ability to structure and present data in creative ways to drive insights.
● A basic understanding of (or experience in) evaluating financial parameters such as return on investment (ROI), cost allocation, and optimization is good to have.
What’s there for you?
● Opportunity to understand the overall business & collaborate across all functional departments
● Prospect to disrupt the existing mobility industry business models (ideate, pilot, monitor & scale)
● Deal with the ambiguity of decision making while balancing long-term/strategic business needs and short-term/tactical moves
● Full business ownership working style which translates to freedom to pick problem statements/workflow and self-driven culture
Roles & Responsibilities
- Data Engineering Excellence: Design and implement data pipelines using formats like JSON, Parquet, CSV, and ORC, utilizing batch and streaming ingestion.
- Cloud Data Migration Leadership: Lead cloud migration projects, developing scalable Spark pipelines.
- Medallion Architecture: Implement Bronze, Silver, and Gold tables for scalable data systems.
- Spark Code Optimization: Optimize Spark code to ensure efficient cloud migration.
- Data Modeling: Develop and maintain data models with strong governance practices.
- Data Cataloging & Quality: Implement cataloging strategies with Unity Catalog to maintain high-quality data.
- Delta Live Table Leadership: Lead the design and implementation of Delta Live Tables (DLT) pipelines for reliable, quality-enforced data management.
- Customer Collaboration: Collaborate with clients to optimize cloud migrations and ensure best practices in design and governance.
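The Bronze/Silver/Gold Medallion flow above can be sketched with a pure-Python stand-in. In practice these would be Spark/Databricks Delta tables; the record shape, field names (`order_id`, `amount`, `country`), and source tag are invented for illustration:

```python
def bronze(raw_records):
    """Bronze: land raw records as-is, tagging ingest metadata."""
    return [dict(rec, _source="orders_api") for rec in raw_records]

def silver(bronze_records):
    """Silver: clean and conform - drop malformed rows, normalize types."""
    out = []
    for rec in bronze_records:
        if rec.get("order_id") is None or rec.get("amount") is None:
            continue  # a real pipeline would quarantine these rows
        out.append({"order_id": str(rec["order_id"]),
                    "amount": float(rec["amount"]),
                    "country": str(rec.get("country", "unknown")).upper()})
    return out

def gold(silver_records):
    """Gold: business-level aggregate, e.g. revenue per country."""
    totals = {}
    for rec in silver_records:
        totals[rec["country"]] = totals.get(rec["country"], 0.0) + rec["amount"]
    return totals
```

Each layer only reads from the one before it, which is the property that makes Medallion pipelines easy to re-run and audit layer by layer.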
Qualifications
- Experience: Minimum 5 years of hands-on experience in data engineering, with a proven track record in complex pipeline development and cloud-based data migration projects.
- Education: Bachelor’s or higher degree in Computer Science, Data Engineering, or a related field.
- Skills
- Must-have: Proficiency in Spark, SQL, Python, and other relevant data processing technologies. Strong knowledge of Databricks and its components, including Delta Live Table (DLT) pipeline implementations. Expertise in on-premises to cloud Spark code optimization and Medallion Architecture.
Good to Have
- Familiarity with AWS services (experience with additional cloud platforms like GCP or Azure is a plus).
Soft Skills
- Excellent communication and collaboration skills, with the ability to work effectively with clients and internal teams.
- Certifications
- AWS/GCP/Azure Data Engineer Certification.
Company Description
eShipz is a rapidly expanding logistics automation platform designed to optimize shipping operations and enhance post-purchase customer experiences. The platform offers solutions such as multi-carrier integrations, real-time tracking, NDR management, returns, freight audits, and more. Trusted by over 350 businesses, eShipz provides easy-to-use analytics, automated shipping processes, and reliable customer support. As a trusted partner for eCommerce businesses and enterprises, eShipz delivers smarter, more efficient shipping solutions. Visit www.eshipz.com for more information.
Role Description
The Python Support Engineer at eShipz supports clients by providing technical solutions and resolving issues related to the platform. Responsibilities include troubleshooting reported problems, delivering technical support in a professional manner, and assisting with software functionality and operating systems. The engineer will also collaborate with internal teams to ensure a seamless customer experience. This is a full-time on-site role located in Sanjay Nagar, Greater Bengaluru Area.
Qualifications
- Strong proficiency in Troubleshooting and Technical Support skills to identify and address software or technical challenges effectively.
- Capability to provide professional Customer Support and Customer Service, ensuring high customer satisfaction and resolving inquiries promptly.
- Proficiency and knowledge of Operating Systems to diagnose and resolve platform-specific issues efficiently.
- Excellent problem-solving, communication, and interpersonal skills.
- Bachelor's degree in computer science, IT, or a related field.
- Experience working with Python and an understanding of backend systems is a plus.
- Technical Skill:
- Python Proficiency: Strong understanding of core Python (Data structures, decorators, generators, and exception handling).
- Frameworks: Familiarity with web frameworks like Django, Flask, or FastAPI.
- Databases: Proficiency in SQL (PostgreSQL/MySQL) and understanding of ORMs like SQLAlchemy or Django ORM.
- Infrastructure: Basic knowledge of Linux/Unix commands, Docker, and CI/CD pipelines (Jenkins/GitHub Actions).
- Version Control: Comfortable using Git for branching, merging, and pull requests.
- Soft Skill:
- Analytical Thinking: A logical approach to solving complex, "needle-in-a-haystack" problems.
- Communication: Ability to explain technical concepts to both developers and end-users.
- Patience & Empathy: Managing high-pressure situations when critical systems are down.
- Work Location: Sanjay Nagar, Bangalore (WFO)
- Work Timing:
- Mon-Fri (WFO): 9:45 A.M. - 6:15 P.M.
- 1st & 3rd Saturday (WFO): 9:00 A.M. - 2:00 P.M.
- 2nd & 4th Saturday (WFH): 9:00 A.M. - 2:00 P.M.
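The core-Python topics this role lists (decorators, generators, exception handling) fit into one small sketch. The function names and the error-translation choice are illustrative assumptions, not part of the eShipz codebase:

```python
import functools
import time

def log_calls(fn):
    """Decorator: record each call's duration and translate low-level
    errors into support-friendly ones (exception handling in action)."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        except ValueError as exc:
            raise RuntimeError(f"{fn.__name__} failed: {exc}") from exc
        finally:
            wrapper.last_duration = time.perf_counter() - start
    return wrapper

def batched(items, size):
    """Generator: yield items in fixed-size batches without materializing
    the whole input - the usual reason to reach for generators."""
    batch = []
    for item in items:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

@log_calls
def parse_amount(text):
    return float(text)  # raises ValueError on bad input
```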
About the role
We are seeking a seasoned Backend Tech Lead with deep expertise in Golang and Python to lead our backend team. The ideal candidate has 6+ years of experience in backend technologies and 2–3 years of proven engineering mentoring experience, having successfully scaled systems and shipped B2C applications in collaboration with product teams.
Responsibilities
Technical & Product Delivery
● Oversee design and development of backend systems operating at 10K+ RPM scale.
● Guide the team in building transactional systems (payments, orders, etc.) and behavioral systems (analytics, personalization, engagement tracking).
● Partner with product managers to scope, prioritize, and release B2C product features and applications.
● Ensure architectural best practices, high-quality code standards, and robust testing practices.
● Own delivery of projects end-to-end with a focus on scalability, reliability, and business impact.
Operational Excellence
● Champion observability, monitoring, and reliability across backend services.
● Continuously improve system performance, scalability, and resilience.
● Streamline development workflows and engineering processes for speed and quality.
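Observability at the "10K+ RPM" scale this role mentions starts with knowing your current request rate. A minimal sliding-window RPM tracker (the class name and injectable clock are illustrative; production systems would use Prometheus counters or similar):

```python
import time
from collections import deque

class RpmTracker:
    """Sliding one-minute window of request timestamps for monitoring RPM."""
    def __init__(self, window_seconds=60.0, clock=time.monotonic):
        self.window = window_seconds
        self.clock = clock  # injectable for testing
        self.hits = deque()

    def record(self):
        now = self.clock()
        self.hits.append(now)
        self._evict(now)

    def rpm(self):
        self._evict(self.clock())
        return len(self.hits)

    def _evict(self, now):
        # Drop timestamps that have aged out of the window.
        while self.hits and now - self.hits[0] > self.window:
            self.hits.popleft()
```

Storing one timestamp per request is fine for a sketch; at real 10K+ RPM scale you would bucket counts per second instead of keeping every hit.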
Requirements
● Experience:
7+ years of professional experience in backend technologies.
2-3 years as a Tech Lead driving delivery.
● Technical Skills:
Strong hands-on expertise in Golang and Python.
Proven track record with high-scale systems (≥10K RPM).
Solid understanding of distributed systems, APIs, SQL/NoSQL databases, and cloud platforms.
● Leadership Skills:
Demonstrated success in managing teams through 2–3 appraisal cycles.
Strong experience working with product managers to deliver consumer-facing applications.
● Excellent communication and stakeholder management abilities.
Nice-to-Have
● Familiarity with containerization and orchestration (Docker, Kubernetes).
● Experience with observability tools (Prometheus, Grafana, OpenTelemetry).
● Previous leadership experience in B2C product companies operating at scale.
What We Offer
● Opportunity to lead and shape a backend engineering team building at scale.
● A culture of ownership, innovation, and continuous learning.
● Competitive compensation, benefits, and career growth opportunities.
Lead Software Engineer
Bidgely is seeking an exceptional and visionary Lead Software Engineer to join its core team in Bangalore. As a Lead Software Engineer, you will work closely with EMs and org heads to shape the roadmap and planning, set the technical direction for the team, influence architectural decisions, and mentor other engineers while delivering highly reliable, scalable products powered by large data sets, advanced machine learning models, and responsive user interfaces. Renowned for your deep technical expertise, you are capable of deconstructing any system, solving complex problems creatively, and elevating those around you. Join our innovative and dynamic team that thrives on creativity, technical excellence, and a belief that nothing is impossible with collaboration and hard work.
Responsibilities
- Lead the design and delivery of complex, scalable web services, APIs, and backend data modules.
- Define and drive adoption of best practices in system architecture, component reusability, and software design patterns across teams.
- Provide technical leadership in product, architectural, and strategic engineering discussions.
- Mentor and guide engineers at all levels, fostering a culture of learning and growth.
- Collaborate with cross-functional teams (engineering, product management, data science, and UX) to translate business requirements into scalable, maintainable solutions.
- Champion and drive continuous improvement initiatives for code quality, performance, security, and reliability.
- Evaluate and implement emerging technologies, tools, and methodologies to ensure competitive advantage.
- Present technical concepts and results clearly to both technical and non-technical stakeholders; influence organizational direction and recommend key technical investments.
Requirements
- 6+ years of experience in designing and developing highly scalable backend and middle tier systems.
- BS/MS/PhD in Computer Science or a related field from a leading institution.
- Demonstrated mastery of data structures, algorithms, and system design; experience architecting large-scale distributed systems and leading significant engineering projects.
- Deep fluency in Java, Spring, Hibernate, J2EE, RESTful services; expertise in at least one additional backend language/framework.
- Strong hands-on experience with both SQL (e.g., MySQL, PostgreSQL) and NoSQL (e.g., MongoDB, Cassandra, Redis) databases, including schema design, optimization, and performance tuning for large data sets.
- Experience with Distributed Systems, Cloud Architectures, CI/CD, and DevOps principles.
- Strong leadership, mentoring, and communication skills; proven ability to drive technical vision and alignment across teams.
- Track record of delivering solutions in fast-paced and dynamic start-up environments.
- Commitment to quality, attention to detail, and a passion for coaching others.
Python Backend Developer
We are seeking a skilled Python Backend Developer responsible for managing the interchange of data between the server and the users. Your primary focus will be on developing server-side logic to ensure high performance and responsiveness to requests from the front end. You will also be responsible for integrating front-end elements built by your coworkers into the application, as well as managing AWS resources.
Roles & Responsibilities
- Develop and maintain scalable, secure, and robust backend services using Python
- Design and implement RESTful APIs and/or GraphQL endpoints
- Integrate user-facing elements developed by front-end developers with server-side logic
- Write reusable, testable, and efficient code
- Optimize components for maximum performance and scalability
- Collaborate with front-end developers, DevOps engineers, and other team members
- Troubleshoot and debug applications
- Implement data storage solutions (e.g., PostgreSQL, MySQL, MongoDB)
- Ensure security and data protection
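The "design and implement RESTful APIs" responsibility can be illustrated with the smallest possible Python backend: a WSGI application. This is a hedged sketch with an invented `/health` route, not a framework recommendation; in practice the role's stack would use Django, Flask, or FastAPI:

```python
import json

def app(environ, start_response):
    """Minimal WSGI JSON endpoint: GET /health returns service status."""
    if environ.get("PATH_INFO") == "/health" and environ.get("REQUEST_METHOD") == "GET":
        body = json.dumps({"status": "ok"}).encode("utf-8")
        status = "200 OK"
    else:
        body = json.dumps({"error": "not found"}).encode("utf-8")
        status = "404 Not Found"
    start_response(status, [("Content-Type", "application/json"),
                            ("Content-Length", str(len(body)))])
    return [body]
```

Because WSGI apps are plain callables, they can be unit-tested by passing a fake `environ` dict, no server required; the same property is what lets frameworks layer middleware around them.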
Mandatory Technical Skill Set
- Implementing optimal data storage (e.g., PostgreSQL, MySQL, MongoDB, S3)
- Python backend development experience
- Designing, implementing, and maintaining CI/CD pipelines using tools such as Jenkins, GitLab CI/CD, or GitHub Actions
- Implementing and managing containerization platforms such as Docker and orchestration tools like Kubernetes
- Previous hands-on experience in:
- EC2, S3, ECS, EMR, VPC, Subnets, SQS, CloudWatch, CloudTrail, Lambda, SageMaker, RDS, SES, SNS, IAM, Backup, AWS WAF
- SQL
Specific Knowledge/Skills
- 4-6 years of experience
- Proficiency in Python programming.
- Basic knowledge of front-end development.
- Basic knowledge of data manipulation and analysis libraries
- Code versioning and collaboration using Git
- Knowledge of libraries for extracting data from websites (web scraping)
- Knowledge of SQL and NoSQL databases
- Familiarity with RESTful APIs
- Familiarity with Cloud (Azure /AWS) technologies
Review Criteria:
- Strong Software Engineer fullstack profile using NodeJS / Python and React
- 6+ YOE in Software Development using Python OR NodeJS (For backend) & React (For frontend)
- Must have strong experience in working on Typescript
- Must have experience in message-based systems like Kafka, RabbitMQ, Redis
- Databases - PostgreSQL & NoSQL databases like MongoDB
- Product Companies Only
- Tier 1 Engineering Institutes (IIT, NIT, BITS, IIIT, DTU or equivalent)
Preferred:
- Experience in Fin-Tech, Payment, POS and Retail products is highly preferred
- Experience in mentoring, coaching the team.
Role & Responsibilities:
We are currently seeking a Senior Engineer to join our Financial Services team, contributing to the design and development of scalable systems.
The Ideal Candidate Will Be Able To-
- Take ownership of delivering performant, scalable, and high-quality cloud-based software on both the frontend and backend.
- Mentor team members to develop in line with product requirements.
- Collaborate with Senior Architect for design and technology choices for product development roadmap.
- Do code reviews.
Ideal Candidate:
- Thorough knowledge of developing cloud-based software, including backend APIs and React-based frontends.
- Thorough knowledge of scalable design patterns and message-based systems such as Kafka, RabbitMQ, and Redis, along with MongoDB, ORMs, and SQL.
- Experience with AWS services such as S3, IAM, Lambda etc.
- Expert-level coding skills in Python (FastAPI/Django), NodeJS, TypeScript, and ReactJS.
- Eye for user responsive designs on the frontend.
Perks, Benefits and Work Culture:
- We prioritize people above all else. While we're recognized for our innovative technology solutions, it's our people who drive our success. That’s why we offer a comprehensive and competitive benefits package designed to support your well-being and growth:
- Medical Insurance with coverage up to INR 8,00,000 for the employee and their family
Qualifications:
- Must have a Bachelor’s degree in computer science or equivalent.
- Must have at least 5 years' experience as an SDET.
- At least 1+ year of leadership experience or managing a team.
Responsibilities:
- Design, develop and execute automation scripts using open-source tools.
- Troubleshooting any errors and streamlining the testing procedures.
- Writing and executing detailed test plans, test design & test cases covering feature, integration, regression, certification, system level testing as well as release validation in production.
- Identify, analyze and create detailed records of problems that appear during testing, such as software defects, bugs, functionality issues, and output errors, and work directly with software developers to find solutions and develop retesting procedures.
- Good time-management skills and commitment to meet deadlines.
- Stay up-to-date with new testing tools and test strategies.
- Driving technical projects and providing leadership in an innovative and fast-paced environment.
Requirements:
- Experience in the Automation - API and UI as well as Manual Testing on Web Application.
- Experience in frameworks like Playwright / Selenium Web Driver / Robot Framework / Rest-Assured.
- Must be proficient in Performance Testing tools like K6 / Gatling / JMeter.
- Must be proficient in Core Java / TypeScript and Java 17.
- Experience in JUnit-5.
- Good to have TypeScript experience.
- Good to have RPA Experience using Java or any other tools like Robot Framework / Automation Anywhere.
- Experience in SQL (like MySQL, PG) & No-SQL Database (like MongoDB).
- Good understanding of software & systems architecture.
- Well acquainted with Agile Methodology, Software Development Life Cycle (SDLC), Software Test Life Cycle (STLC) and Automation Test Life Cycle.
- Strong experience in REST-based component, back-end, database, and microservices testing.
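Database-layer checks of the kind this role lists (SQL against MySQL/PG) often look like the referential-integrity probe below. Sketched here with Python's built-in sqlite3 so it is self-contained; the `orders`/`customers` schema is invented for illustration:

```python
import sqlite3

def find_orphan_orders(conn):
    """Return ids of orders whose customer_id has no matching customer row -
    a typical data-integrity assertion in back-end test suites."""
    cur = conn.execute("""
        SELECT o.id FROM orders o
        LEFT JOIN customers c ON c.id = o.customer_id
        WHERE c.id IS NULL
    """)
    return [row[0] for row in cur.fetchall()]
```

A test suite would assert this returns an empty list after every write path under test; any non-empty result pinpoints the offending rows.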
Work Location: Jayanagar - Bangalore.
🚀 Hiring: Java Developer at Deqode
⭐ Experience: 4+ Years
📍 Location: Indore, Pune, Mumbai, Nagpur, Noida, Kolkata, Bangalore, Chennai
⭐ Work Mode: Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Requirements
✅ Strong proficiency in Java (Java 8/11/17)
✅ Experience with Spring / Spring Boot
✅ Knowledge of REST APIs, Microservices architecture
✅ Familiarity with SQL/NoSQL databases
✅ Understanding of Git, CI/CD pipelines
✅ Problem-solving skills and attention to detail
Job Description:
Experience Range: 6 to 10 years
Qualifications:
- Minimum Bachelor's degree in Engineering, Computer Applications, or AI/Data Science
- Experience working in product companies/startups developing, validating, and productionizing AI models in recent projects within the last 3 years.
- Prior experience in Python, NumPy, Scikit-learn, Pandas, ETL/SQL, and BI tools in previous roles preferred
Require Skills:
- Must Have – Direct hands-on experience working in Python for scripting, automation, analysis, and orchestration
- Must Have – Experience working with ML Libraries such as Scikit-learn, TensorFlow, PyTorch, Pandas, NumPy etc.
- Must Have – Experience working with models such as Random Forest, K-means clustering, BERT, etc.
- Should Have – Exposure to querying warehouses and APIs
- Should Have – Experience with writing moderate to complex SQL queries
- Should Have – Experience analyzing and presenting data with BI tools or Excel
- Must Have – Very strong communication skills to work with technical and non-technical stakeholders in a global environment
Roles and Responsibilities:
- Work with Business stakeholders, Business Analysts, Data Analysts to understand various data flows and usage.
- Analyse and present insights about the data and processes to Business Stakeholders
- Validate and test appropriate AI/ML models based on the prioritization and insights developed while working with the Business Stakeholders
- Develop and deploy customized models on Production data sets to generate analytical insights and predictions
- Participate in cross functional team meetings and provide estimates of work as well as progress in assigned tasks.
- Highlight risks and challenges to the relevant stakeholders so that work is delivered in a timely manner.
- Share knowledge and best practices with broader teams to make everyone aware and more productive.
Job Title: QA Tester – FinTech (Manual + Automation Testing)
Location: Bangalore, India
Job Type: Full-Time
Experience Required: 3 Years
Industry: FinTech / Financial Services
Function: Quality Assurance / Software Testing
About the Role:
We are looking for a skilled QA Tester with 3 years of experience in both manual and automation testing, ideally in the FinTech domain. The candidate will work closely with development and product teams to ensure that our financial applications meet the highest standards of quality, performance, and security.
Key Responsibilities:
- Analyze business and functional requirements for financial products and translate them into test scenarios.
- Design, write, and execute manual test cases for new features, enhancements, and bug fixes.
- Develop and maintain automated test scripts using tools such as Selenium, TestNG, or similar frameworks.
- Conduct API testing using Postman, Rest Assured, or similar tools.
- Perform functional, regression, integration, and system testing across web and mobile platforms.
- Work in an Agile/Scrum environment and actively participate in sprint planning, stand-ups, and retrospectives.
- Log and track defects using JIRA or a similar defect management tool.
- Collaborate with developers, BAs, and DevOps teams to improve quality across the SDLC.
- Ensure test coverage for critical fintech workflows like transactions, KYC, lending, payments, and compliance.
- Assist in setting up CI/CD pipelines for automated test execution using tools like Jenkins, GitLab CI, etc.
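The API-testing responsibility above (Postman/Rest Assured against RESTful services) amounts to asserting on response structure. A stdlib-only sketch; the payment fields and status values are invented for illustration and would come from the real API contract:

```python
import json

# Assumed contract: field name -> accepted Python type(s)
REQUIRED_FIELDS = {"transaction_id": str, "amount": (int, float), "status": str}
VALID_STATUSES = {"PENDING", "SUCCESS", "FAILED"}

def validate_payment_response(body: str):
    """Return a list of validation errors for a payment API JSON response;
    an empty list means the response passes the contract checks."""
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return ["response is not valid JSON"]
    errors = []
    for field, types in REQUIRED_FIELDS.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], types):
            errors.append(f"wrong type for {field}")
    if "status" in payload and payload.get("status") not in VALID_STATUSES:
        errors.append("invalid status value")
    return errors
```

The same checks would typically run inside a Postman test script or a Rest Assured matcher; expressing them as a plain function makes them reusable from CI.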
Required Skills and Experience:
- 3+ years of hands-on experience in manual and automation testing.
- Solid understanding of QA methodologies, STLC, and SDLC.
- Experience in testing FinTech applications such as digital wallets, online banking, investment platforms, etc.
- Strong experience with Selenium WebDriver, TestNG, Postman, and JIRA.
- Knowledge of API testing, including RESTful services.
- Familiarity with SQL to validate data in databases.
- Understanding of CI/CD processes and basic scripting for automation integration.
- Good problem-solving skills and attention to detail.
- Excellent communication and documentation skills.
Preferred Qualifications:
- Exposure to financial compliance and regulatory testing (e.g., PCI DSS, AML/KYC).
- Experience with mobile app testing (iOS/Android).
- Working knowledge of test management tools like TestRail, Zephyr, or Xray.
- Performance testing experience (e.g., JMeter, LoadRunner) is a plus.
- Basic knowledge of version control systems (e.g., Git).
ROLES AND RESPONSIBILITIES:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
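Conceptually, the "reflections" mentioned above are materializations: an aggregate is precomputed once so repeated queries are answered from it instead of re-scanning raw data. The toy Python class below illustrates only that idea; it is not Dremio's API, and the row shape is invented:

```python
class AggReflection:
    """Toy stand-in for an aggregation reflection: precompute a grouped
    sum once, then answer repeated lookups from the materialization."""
    def __init__(self, rows, group_key, measure):
        self.materialized = {}
        for row in rows:
            k = row[group_key]
            self.materialized[k] = self.materialized.get(k, 0) + row[measure]

    def total(self, key):
        # Served from the precomputed aggregate - no raw-data scan.
        return self.materialized.get(key, 0)
```

Real reflections add the hard parts this sketch omits: automatic matching of incoming SQL to a suitable reflection, incremental refresh, and cost-based selection among candidates.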
IDEAL CANDIDATE:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
PREFERRED:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
An L2 Technical Support Engineer with Python knowledge is responsible for handling escalated, more complex technical issues that the Level 1 (L1) support team cannot resolve. Your primary goal is to perform deep-dive analysis, troubleshooting, and problem resolution to minimize customer downtime and ensure system stability.
Python is a key skill, used for scripting, automation, debugging, and data analysis in this role.
Key Responsibilities
- Advanced Troubleshooting & Incident Management:
- Serve as the escalation point for complex technical issues (often involving software bugs, system integrations, backend services, and APIs) that L1 support cannot resolve.
- Diagnose, analyze, and resolve problems, often requiring in-depth log analysis, code review, and database querying.
- Own the technical resolution of incidents end-to-end, adhering strictly to established Service Level Agreements (SLAs).
- Participate in on-call rotation for critical (P1) incident support outside of regular business hours.
- Python-Specific Tasks:
- Develop and maintain Python scripts for automation of repetitive support tasks, system health checks, and data manipulation.
- Use Python for debugging and troubleshooting by analyzing application code, API responses, or data pipeline issues.
- Write ad-hoc scripts to extract, analyze, or modify data in databases for diagnostic or resolution purposes.
- Potentially apply basic-to-intermediate code fixes in Python applications in collaboration with development teams.
- Collaboration and Escalation:
- Collaborate closely with L3 Support, Software Engineers, DevOps, and Product Teams to report bugs, propose permanent fixes, and provide comprehensive investigation details.
- Escalate issues that require significant product changes or deeper engineering expertise to the L3 team, providing clear, detailed documentation of all steps taken.
- Documentation and Process Improvement:
- Conduct Root Cause Analysis (RCA) for major incidents, documenting the cause, resolution, and preventative actions.
- Create and maintain a Knowledge Base (KB), runbooks, and Standard Operating Procedures (SOPs) for recurring issues to empower L1 and enable customer self-service.
- Proactively identify technical deficiencies in processes and systems and recommend improvements to enhance service quality.
- Customer Communication:
- Maintain professional, clear, and timely communication with customers, explaining complex technical issues and resolutions in an understandable manner.
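The "system health checks" automation mentioned under the Python-specific tasks often starts as small probes like these. A hedged sketch using only the standard library; the threshold and target host/port are placeholder values:

```python
import shutil
import socket

def disk_check(path="/", min_free_ratio=0.1):
    """Flag a volume with less than `min_free_ratio` of its space remaining."""
    usage = shutil.disk_usage(path)
    free_ratio = usage.free / usage.total
    return {"path": path, "free_ratio": round(free_ratio, 3),
            "ok": free_ratio >= min_free_ratio}

def port_check(host, port, timeout=2.0):
    """Is a TCP service reachable? Useful in automated health sweeps."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

In an L2 workflow, probes like these run on a schedule and feed a ticketing or alerting system so failures are caught before customers report them.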
Required Technical Skills
- Programming/Scripting:
- Strong proficiency in Python (for scripting, automation, debugging, and data manipulation).
- Experience with other scripting languages such as Bash or Shell.
- Databases:
- Proficiency in SQL for complex querying, debugging data flow issues, and data extraction.
- Application/Web Technologies:
- Understanding of API concepts (RESTful/SOAP) and experience troubleshooting them using tools like Postman or curl.
- Knowledge of application architectures (e.g., microservices, SOA) is a plus.
- Monitoring & Tools:
- Experience with support ticketing systems (e.g., JIRA, ServiceNow).
- Familiarity with log aggregation and monitoring tools (Kibana, Splunk, ELK Stack, Grafana)
An L2 Technical Support Engineer with Python knowledge is responsible for handling escalated, more complex technical issues that the Level 1 (L1) support team cannot resolve. Your primary goal is to perform deep-dive analysis, troubleshooting, and problem resolution to minimize customer downtime and ensure system stability.
Python is a key skill, used for scripting, automation, debugging, and data analysis in this role.
Key Responsibilities
- Advanced Troubleshooting & Incident Management:
- Serve as the escalation point for complex technical issues (often involving software bugs, system integrations, backend services, and APIs) that L1 support cannot resolve.
- Diagnose, analyze, and resolve problems, often requiring in-depth log analysis, code review, and database querying.
- Own the technical resolution of incidents end-to-end, adhering strictly to established Service Level Agreements (SLAs).
- Participate in on-call rotation for critical (P1) incident support outside of regular business hours.
- Python-Specific Tasks:
- Develop and maintain Python scripts for automation of repetitive support tasks, system health checks, and data manipulation.
- Use Python for debugging and troubleshooting by analyzing application code, API responses, or data pipeline issues.
- Write ad-hoc scripts to extract, analyze, or modify data in databases for diagnostic or resolution purposes.
- Potentially apply basic-to-intermediate code fixes in Python applications in collaboration with development teams.
- Collaboration and Escalation:
- Collaborate closely with L3 Support, Software Engineers, DevOps, and Product Teams to report bugs, propose permanent fixes, and provide comprehensive investigation details.
- Escalate issues that require significant product changes or deeper engineering expertise to the L3 team, providing clear, detailed documentation of all steps taken.
- Documentation and Process Improvement:
- Conduct Root Cause Analysis (RCA) for major incidents, documenting the cause, resolution, and preventative actions.
- Create and maintain a Knowledge Base (KB), runbooks, and Standard Operating Procedures (SOPs) for recurring issues to empower L1 and enable customer self-service.
- Proactively identify technical deficiencies in processes and systems and recommend improvements to enhance service quality.
- Customer Communication:
- Maintain professional, clear, and timely communication with customers, explaining complex technical issues and resolutions in an understandable manner.
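A minimal sketch of the kind of Python automation an L2 engineer in this role might maintain (a system health check plus log triage). The threshold, regex, and paths are hypothetical illustrations, not part of any specific product:

```python
# Illustrative sketch only: health-check and log-triage helpers of the kind
# described above. DISK_THRESHOLD and the log format are assumed, not real.
import re
import shutil
from pathlib import Path

DISK_THRESHOLD = 0.90  # assumed policy: alert when a volume is >90% full
ERROR_PATTERN = re.compile(r"\b(ERROR|CRITICAL)\b")

def check_disk(path: str = "/") -> dict:
    """Return the usage ratio for a mount point and whether it breaches the threshold."""
    usage = shutil.disk_usage(path)
    ratio = usage.used / usage.total
    return {"path": path, "used_ratio": round(ratio, 3), "alert": ratio > DISK_THRESHOLD}

def scan_log(log_path: Path, max_lines: int = 1000) -> list[str]:
    """Collect recent ERROR/CRITICAL lines from an application log for triage."""
    lines = log_path.read_text().splitlines()[-max_lines:]
    return [line for line in lines if ERROR_PATTERN.search(line)]

if __name__ == "__main__":
    print(check_disk("/"))
```

In practice, scripts like this would be wired into a scheduler or monitoring tool rather than run by hand.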
Required Technical Skills
- Programming/Scripting:
- Strong proficiency in Python (for scripting, automation, debugging, and data manipulation).
- Experience with other scripting languages like Bash or Shell.
- Databases:
- Proficiency in SQL for complex querying, debugging data flow issues, and data extraction.
- Application/Web Technologies:
- Understanding of API concepts (RESTful/SOAP) and experience troubleshooting them using tools like Postman or curl.
- Knowledge of application architectures (e.g., microservices, SOA) is a plus.
- Monitoring & Tools:
- Experience with support ticketing systems (e.g., JIRA, ServiceNow).
- Familiarity with log aggregation and monitoring tools (Kibana, Splunk, ELK Stack, Grafana).
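As a hedged illustration of the SQL side of this role, the snippet below uses Python's built-in sqlite3 as a stand-in for a production database to debug a common data-flow issue: orders that reference a missing customer. The schema and rows are invented for the example:

```python
# Sketch of an ad-hoc diagnostic query: find orphaned orders whose customer_id
# has no matching customer row. Tables and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders    (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders    VALUES (10, 1, 99.0), (11, 2, 45.0), (12, 3, 12.5);
""")

# LEFT JOIN + IS NULL exposes rows that break downstream processing.
orphans = conn.execute("""
    SELECT o.id, o.customer_id
    FROM orders o
    LEFT JOIN customers c ON c.id = o.customer_id
    WHERE c.id IS NULL
""").fetchall()
print(orphans)  # -> [(12, 3)]
```

The same LEFT JOIN/IS NULL pattern works on any SQL engine, which makes it a reliable first step when tracing data-flow bugs.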
Job Responsibilities :
- Work closely with product managers and other cross-functional teams to help define, scope, and deliver world-class products and high-quality features addressing key user needs.
- Translate requirements into system architecture and implement code while considering performance issues of dealing with billions of rows of data and serving millions of API requests every hour.
- Ability to take full ownership of the software development lifecycle from requirement to release.
- Writing and maintaining clear technical documentation enabling other engineers to step in and deliver efficiently.
- Embrace design and code reviews to deliver quality code.
- Play a key role in taking Trendlyne to the next level as a world-class engineering team.
- Develop and iterate on best practices for the development team, ensuring adherence through code reviews.
- As part of the core team, you will be working on cutting-edge technologies like AI products, online backtesting, data visualization, and machine learning.
- Develop and maintain scalable, robust backend systems using Python and Django framework.
- Apply a strong understanding of web and mobile application performance.
- Mentor junior developers and foster skill development within the team.
Job Requirements :
- 1+ years of experience with Python and Django.
- Strong understanding of relational databases like PostgreSQL or MySQL and Redis.
- (Optional) : Experience with web front-end technologies such as JavaScript, HTML, and CSS
Who are we :
Trendlyne is a Series-A product startup in the financial markets space, with cutting-edge analytics products aimed at businesses in stock markets and mutual funds.
Our founders are IIT + IIM graduates, with strong tech, analytics, and marketing experience. We have top finance and management experts on the Board of Directors.
What do we do :
We build powerful analytics products in the stock market space that are best in class. Organic growth in our B2B and B2C products has already made the company profitable. We serve 900 million+ API requests every month to B2B customers. Trendlyne analytics deals with hundreds of millions of rows of data to generate insights, scores, and visualizations that are an industry benchmark.
We are seeking a motivated Data Analyst to support business operations by analyzing data, preparing reports, and delivering meaningful insights. The ideal candidate should be comfortable working with data, identifying patterns, and presenting findings in a clear and actionable way.
Key Responsibilities:
- Collect, clean, and organize data from internal and external sources
- Analyze large datasets to identify trends, patterns, and opportunities
- Prepare regular and ad-hoc reports for business stakeholders
- Create dashboards and visualizations using tools like Power BI or Tableau
- Work closely with cross-functional teams to understand data requirements
- Ensure data accuracy, consistency, and quality across reports
- Document data processes and analysis methods
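The collect-clean-report loop described above can be sketched with only the Python standard library; the CSV columns and values below are invented for illustration:

```python
# Minimal illustration of cleaning raw data and summarizing it for a report.
# Column names ("region", "revenue") and the rows are hypothetical.
import csv
import io
from statistics import mean

RAW = """region,revenue
North,1200
South,
North,800
East,oops
South,400
"""

def clean_rows(text: str) -> list[dict]:
    """Drop rows whose revenue is missing or non-numeric."""
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        try:
            row["revenue"] = float(row["revenue"])
        except ValueError:
            continue  # skip unparseable values rather than guessing
        rows.append(row)
    return rows

def revenue_by_region(rows: list[dict]) -> dict:
    """Average revenue per region, the kind of summary a dashboard would show."""
    totals: dict[str, list[float]] = {}
    for row in rows:
        totals.setdefault(row["region"], []).append(row["revenue"])
    return {region: mean(vals) for region, vals in totals.items()}

print(revenue_by_region(clean_rows(RAW)))  # -> {'North': 1000.0, 'South': 400.0}
```

In a real setting the same logic would typically live in pandas or a BI tool, but the cleaning decisions (what to drop, what to coerce) are identical.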
Job Description
Wissen Technology is seeking an experienced C# .NET Developer to build and maintain applications related to streaming market data. This role involves developing message-based C#/.NET applications to process, normalize, and summarize large volumes of market data efficiently. The candidate should have a strong foundation in Microsoft .NET technologies and experience working with message-driven, event-based architecture. Knowledge of capital markets and equity market data is highly desirable.
Responsibilities
- Design, develop, and maintain message-based C#/.NET applications for processing real-time and batch market data feeds.
- Build robust routines to download and process data from AWS S3 buckets on a frequent schedule.
- Implement daily data summarization and data normalization routines.
- Collaborate with business analysts, data providers, and other developers to deliver high-quality, scalable market data solutions.
- Troubleshoot and optimize market data pipelines to ensure low latency and high reliability.
- Contribute to documentation, code reviews, and team knowledge sharing.
Required Skills and Experience
- 5+ years of professional experience programming in C# and Microsoft .NET framework.
- Strong understanding of message-based and real-time programming architectures.
- Experience working with AWS services, specifically S3, for data retrieval and processing.
- Experience with SQL and Microsoft SQL Server.
- Familiarity with Equity market data, FX, Futures & Options, and capital markets concepts.
- Excellent interpersonal and communication skills.
- Highly motivated, curious, and analytical mindset with the ability to work well both independently and in a team environment.
Review Criteria
- Strong background as an Implementation Manager / Customer Success Implementation / Technical Solutions / Post-Sales SaaS Delivery professional
- 3+ years of hands-on experience in software/tech Implementation roles within technical B2B SaaS companies, preferably working with global or US-based clients
- Must have direct experience leading end-to-end SaaS product implementations — including onboarding, workflow configuration, API integrations, data setup, and customer training
- Must have strong technical understanding — including ability to read and write basic SQL queries, debug API workflows, and interpret JSON payloads for troubleshooting or configuration validation.
- Must have worked in post-sales environments, owning customer success and delivery after deal closure, ensuring product adoption, accurate setup, and smooth go-live.
- Must have experience collaborating cross-functionally with product, engineering, and sales teams to ensure timely resolution of implementation blockers and seamless client onboarding.
- (Company): B2B SaaS startup or growth-stage company
- Mandatory (Note): Good growth opportunity; this role will offer a team-leading option after a few months
Preferred
- Preferred (Experience): Previous experience in FinTech SaaS like BillingTech, finance automation, or subscription management platforms will be a strong plus
Job Specific Criteria
- CV Attachment is mandatory
- Are you open to working US timings (4/5:00 PM - 3:00 AM IST) to target the US market?
- Please provide CTC Breakup (Fixed + Variable)?
- It’s a hybrid role with 1-3 days work from office (Indiranagar), with in-office hours 3:00 PM to 10:00 PM IST. Are you ok with hybrid mode?
Role & Responsibilities
As the new hire in this role, you'll be the voice of the customer in the company, and lead the charge in developing our customer-centric approach, working closely with our tech, design, and product teams.
What you will be doing:
You will be responsible for converting, onboarding, managing, and proactively ensuring success for our customers/prospective clients.
- Implementation
- Understand client billing models and configure company contracts, pricing, metering, and invoicing accurately.
- Lead pilots and implementation for new customers, ensuring complete onboarding within 3–8 weeks.
- Translate complex business requirements into structured company workflows and setup.
- Pre-sales & Technical Discovery
- Support sales with live demos, sandbox setups, and RFP responses.
- Participate in technical discovery calls to map company capabilities to client needs.
- Create and maintain demo environments showcasing relevant use cases.
- Internal Coordination & Escalation
- Act as the voice of the customer internally — share structured feedback with product and engineering.
- Create clear, well-scoped handoff documents when working with technical teams.
- Escalate time-sensitive issues appropriately and follow through on resolution.
- Documentation & Enablement
- Create client-specific documentation (e.g., onboarding guides, configuration references).
- Contribute to internal wikis, training material, and product documentation.
- Write simple, to-the-point communication — clear enough for a CXO and detailed enough for a developer.
Ideal Candidate
- 3-7 years of relevant experience
- Willing to work in the US time zone (until ~4:30 AM IST) on weekdays (Mon-Fri)
- Ability to understand and shape the product at a granular level
- Ability to empathize with the customers, and understand their pain points
- Understanding of SaaS architecture and APIs conceptually — ability to debug API workflows and usage issues
- Previous experience with Salesforce CRM
- Entrepreneurial drive, and willingness to wear multiple hats as per company’s requirements
- Strong analytical skills and a structured problem-solving approach
- (Strongly preferred) Computer science background and basic coding experience
- Ability to understand functional aspects related to the product, e.g., accounting/revenue recognition, receivables, billing, etc.
- Self-motivated and proactive in managing tasks and responsibilities, requiring minimal follow-ups.
- Self-driven individual with high ownership and strong work ethic
- Not taking yourself too seriously.
Job Description – SEO Specialist
Company: Capace Software Pvt. Ltd.
Location: Bhopal / Bangalore (On-site)
Experience: 2+ Years
Budget: Up to ₹4 LPA
Position: Full-Time
About the Role
Capace Software Pvt. Ltd. is looking for a skilled SEO Specialist with strong expertise in On-Page SEO, Off-Page SEO, and Technical SEO. The ideal candidate will be responsible for improving our search engine ranking, driving organic traffic, and ensuring technical search requirements are met across websites.
Key Responsibilities
🔹 On-Page SEO
- Optimize meta titles, descriptions, header tags, and URLs
- Conduct in-depth keyword research and implement strategic keyword placement
- Optimize website content for relevancy and readability
- Implement internal linking strategies
- Optimize images, schema, and site structure for SEO
- Ensure webpages follow SEO best practices
🔹 Off-Page SEO
- Create and execute backlink strategies
- Manage directory submissions, social bookmarking, classified listings
- Conduct competitor backlink analysis
- Build high-quality guest post links and outreach
- Improve brand visibility through digital promotions
🔹 Technical SEO
- Conduct website audits (crawl errors, index issues, technical fixes)
- Optimize website speed and performance
- Implement schema markup and structured data
- Manage XML sitemaps and robots.txt
- Resolve indexing, crawling, and canonical issues
- Work with developers to implement technical updates
Requirements
- Minimum 2+ years of experience in SEO
- Strong knowledge of On-Page, Off-Page & Technical SEO
- Experience with tools like:
- Google Analytics
- Google Search Console
- Ahrefs / SEMrush / Ubersuggest
- Screaming Frog (good to have)
- Understanding of HTML, CSS basics (preferred)
- Strong analytical and reporting skills
- Good communication and documentation skills
What We Offer
- Competitive salary up to ₹4 LPA
- Opportunity to work on multiple SaaS products and websites
- Supportive team & learning-focused environment
- Career growth in digital marketing & SEO domain
Job Summary:
We are seeking a highly skilled and self-driven Java Backend Developer with strong experience in designing and deploying scalable microservices using Spring Boot and Azure Cloud. The ideal candidate will have hands-on expertise in modern Java development, containerization, messaging systems like Kafka, and knowledge of CI/CD and DevOps practices.
Key Responsibilities:
- Design, develop, and deploy microservices using Spring Boot on Azure cloud platforms.
- Implement and maintain RESTful APIs, ensuring high performance and scalability.
- Work with Java 11+ features including Streams, Functional Programming, and Collections framework.
- Develop and manage Docker containers, enabling efficient development and deployment pipelines.
- Integrate messaging services like Apache Kafka into microservice architectures.
- Design and maintain data models using PostgreSQL or other SQL databases.
- Implement unit testing using JUnit and mocking frameworks to ensure code quality.
- Develop and execute API automation tests using Cucumber or similar tools.
- Collaborate with QA, DevOps, and other teams for seamless CI/CD integration and deployment pipelines.
- Work with Kubernetes for orchestrating containerized services.
- Utilize Couchbase or similar NoSQL technologies when necessary.
- Participate in code reviews, design discussions, and contribute to best practices and standards.
Required Skills & Qualifications:
- Strong experience in Java (11 or above) and Spring Boot framework.
- Solid understanding of microservices architecture and deployment on Azure.
- Hands-on experience with Docker, and exposure to Kubernetes.
- Proficiency in Kafka, with real-world project experience.
- Working knowledge of PostgreSQL (or any SQL DB) and data modeling principles.
- Experience in writing unit tests using JUnit and mocking tools.
- Experience with Cucumber or similar frameworks for API automation testing.
- Exposure to CI/CD tools, DevOps processes, and Git-based workflows.
Nice to Have:
- Azure certifications (e.g., Azure Developer Associate)
- Familiarity with Couchbase or other NoSQL databases.
- Familiarity with other cloud providers (AWS, GCP)
- Knowledge of observability tools (Prometheus, Grafana, ELK)
Soft Skills:
- Strong problem-solving and analytical skills.
- Excellent verbal and written communication.
- Ability to work in an agile environment and contribute to continuous improvement.
Why Join Us:
- Work on cutting-edge microservice architectures
- Strong learning and development culture
- Opportunity to innovate and influence technical decisions
- Collaborative and inclusive work environment
Job Description: Business Analyst – Data Integrations
Location: Bangalore / Hybrid / Remote
Company: LodgIQ
Industry: Hospitality / SaaS / Machine Learning
About LodgIQ
Headquartered in New York, LodgIQ delivers a revolutionary B2B SaaS platform to the travel industry. By leveraging machine learning and artificial intelligence, we enable precise forecasting and optimized pricing for hotel revenue management. Backed by Highgate Ventures and Trilantic Capital Partners, LodgIQ is a well-funded, high-growth startup with a global presence.
About the Role
We’re looking for a skilled Business Analyst – Data Integrations who can bridge the gap between business operations and technology teams, ensuring smooth, efficient, and scalable integrations. If you’re passionate about hospitality tech and enjoy solving complex data challenges, we’d love to hear from you!
What You’ll Do
Key Responsibilities
- Collaborate with vendors to gather requirements for API development and ensure technical feasibility.
- Collect API documentation from vendors; document and explain business logic to use external data sources effectively.
- Access vendor applications to create and validate sample data; ensure the accuracy and relevance of test datasets.
- Translate complex business logic into documentation for developers, ensuring clarity for successful integration.
- Monitor all integration activities and support tickets in Jira, proactively resolving critical issues.
- Lead QA testing for integrations, overseeing pilot onboarding and ensuring solution viability before broader rollout.
- Document onboarding processes and best practices to streamline future integrations and improve efficiency.
- Build, train, and deploy machine learning models for forecasting, pricing, and optimization, supporting strategic goals.
- Drive end-to-end execution of data integration projects, including scoping, planning, delivery, and stakeholder communication.
- Gather and translate business requirements into actionable technical specifications, liaising with business and technical teams.
- Oversee maintenance and enhancement of existing integrations, performing RCA and resolving integration-related issues.
- Document workflows, processes, and best practices for current and future integration projects.
- Continuously monitor system performance and scalability, recommending improvements to increase efficiency.
- Coordinate closely with Operations for onboarding and support, ensuring seamless handover and issue resolution.
Desired Skills & Qualifications
- Strong experience in API integration, data analysis, and documentation.
- Familiarity with Jira for ticket management and project workflow.
- Hands-on experience with machine learning model development and deployment.
- Excellent communication skills for requirement gathering and stakeholder engagement.
- Experience with QA test processes and pilot rollouts.
- Proficiency in project management, data workflow documentation, and system monitoring.
- Ability to manage multiple integrations simultaneously and work cross-functionally.
Required Qualifications
- Experience: Minimum 4 years in hotel technology or business analytics, preferably handling data integration or system interoperability projects.
- Technical Skills:
  - Basic proficiency in SQL or database querying.
  - Familiarity with data integration concepts such as APIs or ETL workflows (preferred but not mandatory).
  - Eagerness to learn and adapt to new tools, platforms, and technologies.
- Hotel Technology Expertise: Understanding of systems such as PMS, CRS, Channel Managers, or RMS.
- Project Management: Strong organizational and multitasking abilities.
- Problem Solving: Analytical thinker capable of troubleshooting and driving resolution.
- Communication: Excellent written and verbal skills to bridge technical and non-technical discussions.
- Attention to Detail: Methodical approach to documentation, testing, and deployment.
Preferred Qualification
- Exposure to debugging tools and troubleshooting methodologies.
- Familiarity with cloud environments (AWS).
- Understanding of data security and privacy considerations in the hospitality industry.
Why LodgIQ?
- Join a fast-growing, mission-driven company transforming the future of hospitality.
- Work on intellectually challenging problems at the intersection of machine learning, decision science, and human behavior.
- Be part of a high-impact, collaborative team with the autonomy to drive initiatives from ideation to production.
- Competitive salary and performance bonuses.
For more information, visit https://www.lodgiq.com

Global digital transformation solutions provider.
Role Proficiency:
This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning.
- Experience in data warehouse design and cost improvements.
- Apply and optimize data models for efficient storage retrieval and processing of large datasets.
- Communicate and explain design/development aspects to customers.
- Estimate time and resource requirements for developing/debugging features/components.
- Participate in RFP responses and solutioning.
- Mentor team members and guide them in relevant upskilling and certification.
Knowledge Examples:
- Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF.
- Proficient in SQL for analytics and windowing functions.
- Understanding of data schemas and models.
- Familiarity with domain-related data.
- Knowledge of data warehouse optimization techniques.
- Understanding of data security concepts.
- Awareness of patterns, frameworks, and automation practices.
Additional Comments:
# of Resources: 22
Role(s): Technical Role
Location(s): India
Planned Start Date: 1/1/2026
Planned End Date: 6/30/2026
Project Overview:
Role Scope / Deliverables: We are seeking highly skilled Data Engineers with strong experience in Databricks, PySpark, Python, SQL, and AWS to join our data engineering team by the first week of December 2025.
The candidate will be responsible for designing, developing, and optimizing large-scale data pipelines and analytics solutions that drive business insights and operational efficiency.
Design, build, and maintain scalable data pipelines using Databricks and PySpark.
Develop and optimize complex SQL queries for data extraction, transformation, and analysis.
Implement data integration solutions across multiple AWS services (S3, Glue, Lambda, Redshift, EMR, etc.).
Collaborate with analytics, data science, and business teams to deliver clean, reliable, and timely datasets.
Ensure data quality, performance, and reliability across data workflows.
Participate in code reviews, data architecture discussions, and performance optimization initiatives.
Support migration and modernization efforts for legacy data systems to modern cloud-based solutions.
Key Skills:
Hands-on experience with Databricks, PySpark & Python for building ETL/ELT pipelines.
Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).
Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
Experience with data modeling, schema design, and performance optimization.
Familiarity with CI/CD pipelines, version control (Git), and workflow orchestration (Airflow preferred).
Excellent problem-solving, communication, and collaboration skills.
Skills: Databricks, PySpark & Python, SQL, AWS Services
Must-Haves
Python/PySpark (5+ years), SQL (5+ years), Databricks (3+ years), AWS Services (3+ years), ETL tools (Informatica, Glue, DataProc) (3+ years)
Hands-on experience with Databricks, PySpark & Python for ETL/ELT pipelines.
Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).
Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
Experience with data modeling, schema design, and performance optimization.
Familiarity with CI/CD pipelines, Git, and workflow orchestration (Airflow preferred).
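As a small, hedged illustration of the SQL skills this listing asks for (CTEs and window functions), the sketch below runs against an in-memory SQLite database; on the job this would be Spark SQL on Databricks, and the table and rows are invented:

```python
# A CTE feeding a window function: running total of amount per user over time.
# The events table and its data are hypothetical illustrations.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id INTEGER, ts TEXT, amount REAL);
    INSERT INTO events VALUES
        (1, '2025-01-01', 10.0), (1, '2025-01-02', 30.0),
        (2, '2025-01-01', 5.0),  (2, '2025-01-03', 7.0);
""")

rows = conn.execute("""
    WITH ordered AS (
        SELECT user_id, ts, amount FROM events
    )
    SELECT user_id, ts,
           SUM(amount) OVER (PARTITION BY user_id ORDER BY ts) AS running_total
    FROM ordered
    ORDER BY user_id, ts
""").fetchall()
print(rows)
```

The same CTE-plus-window-function shape carries over unchanged to Spark SQL, Redshift, or Snowflake, which is why interviewers lean on it so heavily.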
Notice period - Immediate to 15 days
Location: Bangalore