50+ SQL Jobs in Pune | SQL Job openings in Pune
Apply to 50+ SQL Jobs in Pune on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.



About the company
We are the most trusted provider of data collection and management, marketing program management, and analytical solutions for our Crop and Animal Health industry clients. With data services at the core—surrounded by an extensible array of streamlined software solutions—our unified platform represents over three decades of innovation and expertise in the agriculture, crop protection, specialty chemical and animal health industries.
Backed by an entrepreneurial, creative and energetic work force, teammates at AGDATA are pushing the boundaries of technology to enhance our relationships with our clients. We are a growing team, focused on adding creative, knowledgeable individuals who are ready to jump right in and make an immediate impact.
- 30+ years of experience in the Crop and Animal Health industry
- More than 20 billion USD sales processed annually
- Over 215,000 payments issued via marketing programs yearly
What’s the role?
If you are looking for an opportunity to solve deep technical problems, build innovative solutions, and work with top-notch software developers in the Pune area, AGDATA might have the role for you.
You must be able to look at the big picture from both business and technology perspectives, possess strong analytical, design, and problem-solving skills, and enjoy working with data and algorithms.
You are not afraid of ambiguity or nebulous requirements, and you get excited about difficult challenges.
Our ideal candidate will have...
- 7+ years of software development experience with emphasis on web technologies, cloud computing (Azure preferred), and SaaS
- Deep hands-on experience with the Microsoft technology stack: .NET 6+, C# (strong knowledge of collections and async/await patterns), Web API, Windows services, and relational databases (MSSQL)
- Proven experience with front-end technologies such as Angular
- Expertise in RESTful APIs, SOA, microservices, AMQP, and distributed architecture and design
- Ability to understand complex data relationships
- Experience in Unit Testing
- Experience in Azure cloud services/ Azure DevOps
- Demonstrated skill in aligning application decisions to an overarching solution and systems architecture
- Structured thinker, effective communicator, with excellent programming and analytic skills
In this role, you will ...
- Take your problem-solving skills and expertise in system design to the next level by delivering innovative solutions
- Actively contribute to the development process by writing high-quality code
- Utilize your full stack development skills and work with diverse technologies to deliver outstanding results
- Adapt quickly to new technologies and leverage your past experiences to stay ahead
- Exhibit a passion for building software and delivering high-quality products, prioritizing user experience
- Engage in all phases of the software development life cycle, including design, implementation, and unit testing
- Think from the perspective of our customers, optimizing their experience with our software
How AGDATA will support you:
Supporting your health & well-being:
- Comprehensive medical coverage – up to INR 7.5 lakh for employee and dependents, including parents
- OPD benefit – up to INR 15,000 covering expenses across specialties
- Paternity leave up to 14 working days with the option to split leave
Emphasizing work life balance: Flexible hybrid work policy
Experiencing a work culture that promotes from within: In 2023, 14% of our associates were promoted internally
- Being comfortable in the office: Coming into our brand-new office space? Free snacks and top-class facilities will be available
AccioJob is conducting a Walk-In Hiring Drive with HummingBird Technologies for the position of Java Backend Developer.
To apply, register and select your slot here: https://go.acciojob.com/wNrG3R
Required Skills: DSA, OOPs, SQL, REST APIs
Eligibility:
- Degree: BTech./BE, MTech./ME, BCA, MCA, BSc., MSc
- Branch: Computer Science/CSE/Other CS related branch, Electrical/Other electrical related branches, IT
- Graduation Year: 2026
Work Details:
- Work Location: Pune (Onsite)
- CTC: 9 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Pune Centre
Further Rounds (for shortlisted candidates only):
- Profile Evaluation, Technical Interview 1
- Technical Interview 2
- HR Discussion
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/wNrG3R
AccioJob is conducting a Walk-In Hiring Drive with HummingBird Technologies for the position of Java Backend Developer.
To apply, register and select your slot here: https://go.acciojob.com/gqHtdK
Required Skills: DSA, OOPs, SQL, REST APIs
Eligibility:
- Degree: BTech./BE, MTech./ME, BCA, MCA, BSc., MSc
- Branch: Computer Science/CSE/Other CS related branch, Electrical/Other electrical related branches, IT
- Graduation Year: 2024, 2025
Work Details:
- Work Location: Pune (Onsite)
- CTC: 9 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Pune Centre
Further Rounds (for shortlisted candidates only):
- Profile Evaluation, Technical Interview 1
- Technical Interview 2
- HR Discussion
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/gqHtdK
- 8+ years of Data Engineering experience
- Strong SQL and Redshift experience
- CI/CD and orchestration experience using Bitbucket, Jenkins and Control-M
- Reporting experience, preferably Tableau
- Location – Pune, Hyderabad, Bengaluru
Job Description – Java Developer
Role: Java Developer
Location: Pune / Mumbai
Experience: 5 to 10 years
Required Skills:
We are looking for an experienced Java Developer with strong expertise in Core Java, Spring, Spring Boot, and Hibernate. The candidate should have solid experience in designing, developing, and deploying enterprise-grade applications, with strong understanding of OOPs concepts, data structures, and algorithms. Hands-on experience with RESTful APIs, Microservices, and Database technologies (MySQL/Oracle/SQL Server) is essential.
The ideal candidate should be well-versed in version control systems (Git), build tools (Maven/Gradle), and CI/CD pipelines (Jenkins). Exposure to cloud platforms (AWS/Azure/GCP) and containerization (Docker/Kubernetes) will be a strong plus.
Key Responsibilities:
- Design, develop, and maintain scalable and high-performance applications.
- Write clean, reusable, and efficient code following best practices.
- Collaborate with cross-functional teams to deliver quality solutions.
- Perform code reviews, debugging, and performance tuning.
- Ensure application security, reliability, and scalability.
Good To Have Skills:
- Knowledge of front-end technologies (JavaScript, Angular/React).
- Familiarity with Agile/Scrum methodologies.
- Strong problem-solving and analytical skills.

Role Overview
We are looking for a highly skilled Product Engineer to join our dynamic team. This is an exciting opportunity to work on innovative FinTech solutions and contribute to the future of global payments. If you're passionate about backend development, API design, and scalable architecture, we'd love to hear from you!
Key Responsibilities
- Design, develop, and maintain scalable, high-performance backend systems.
- Write clean, maintainable, and efficient code while following best practices.
- Build and optimize RESTful APIs and database queries.
- Collaborate with cross-functional teams to deliver 0 to 1 products.
- Ensure smooth CI/CD pipeline implementation and deployment automation.
- Contribute to open-source projects and stay updated with industry trends.
- Maintain a strong focus on security, performance, and reliability.
- Work with payment protocols and financial regulations to ensure compliance.
Required Skills & Qualifications
- ✅ 3+ years of professional software development experience.
- ✅ Proficiency in any backend language (with preference for Ruby on Rails).
- ✅ Strong foundation in architecture, design, and database optimization.
- ✅ Experience in building APIs and working with SQL/NoSQL databases.
- ✅ Familiarity with CI/CD practices and automation tools.
- ✅ Excellent problem-solving and analytical skills.
- ✅ Strong track record of open-source contributions (minimum 50 stars on GitHub).
- ✅ Passion for FinTech and payment systems.
- ✅ Strong communication skills and ability to work collaboratively in a team.
Nice to Have
- Prior experience in financial services or payment systems.
- Exposure to microservices architecture and cloud platforms.
- Knowledge of containerization tools like Docker & Kubernetes.

We’re hiring a Full Stack Developer (5+ years, Pune location) to join our growing team!
You’ll be working with React.js, Node.js, JavaScript, APIs, and cloud deployments to build scalable and high-performing web applications.
Responsibilities include developing responsive apps, building RESTful APIs, working with SQL/NoSQL databases, and deploying apps on AWS/Docker.
Experience with CI/CD, Git, secure coding practices (OAuth/JWT), and Agile collaboration is a must.
If you’re passionate about full stack development and want to work on impactful projects, we’d love to connect!
- Develop and maintain Java applications using Core Java, the Spring framework, JDBC, and threading concepts.
- Strong understanding of the Spring framework and its various modules.
- Experience with JDBC for database connectivity and manipulation
- Utilize database management systems to store and retrieve data efficiently.
- Proficiency in Core Java 8 and a thorough understanding of threading concepts and concurrent programming.
- Experience working with relational and NoSQL databases.
- Basic understanding of cloud platforms such as Azure and GCP; experience with DevOps practices is an added advantage.
- Knowledge of containerization technologies (e.g., Docker, Kubernetes)
- Perform debugging and troubleshooting of applications using log analysis techniques.
- Understand multi-service flow and integration between components.
- Handle large-scale data processing tasks efficiently and effectively.
- Hands-on experience using Spark is an added advantage.
- Good problem-solving and analytical abilities.
- Collaborate with cross-functional teams to identify and solve complex technical problems.
- Knowledge of Agile methodologies such as Scrum or Kanban
- Stay updated with the latest technologies and industry trends to continuously improve development processes and methodologies
If interested, please share your resume with the following details:
Total Experience -
Relevant Experience in Java, Spring, Data Structures, Algorithms, SQL -
Relevant Experience in Cloud - AWS/Azure/GCP -
Current CTC -
Expected CTC -
Notice Period -
Reason for change -
Profile: AWS Data Engineer
Mandatory skills: AWS + Databricks + PySpark + SQL
Location: Bangalore/Pune/Hyderabad/Chennai/Gurgaon
Notice Period: Immediate
Key Requirements :
- Design, build, and maintain scalable data pipelines to collect, process, and store data from multiple datasets.
- Optimize data storage solutions for better performance, scalability, and cost-efficiency.
- Develop and manage ETL/ELT processes to transform data as per schema definitions, apply slicing and dicing, and make it available for downstream jobs and other teams.
- Collaborate closely with cross-functional teams to understand system and product functionality, accelerate feature development, and capture evolving data requirements.
- Engage with stakeholders to gather requirements and create curated datasets for downstream consumption and end-user reporting.
- Automate deployment and CI/CD processes using GitHub workflows, identifying areas to reduce manual, repetitive work.
- Ensure compliance with data governance policies, privacy regulations, and security protocols.
- Utilize cloud platforms like AWS and work on Databricks for data processing with S3 storage.
- Work with distributed systems and big data technologies such as Spark, SQL, and Delta Lake.
- Integrate with SFTP to push data securely from Databricks to remote locations.
- Analyze and interpret Spark query execution plans to fine-tune queries for faster and more efficient processing.
- Strong problem-solving and troubleshooting skills in large-scale distributed systems.
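For illustration, a minimal PySpark sketch of the query-plan inspection workflow described above, assuming a Databricks-style Spark session; the table and column names are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession already exists as `spark`;
# the builder form keeps this sketch self-contained.
spark = SparkSession.builder.appName("plan-inspection").getOrCreate()

# Hypothetical Delta table of raw orders (names are illustrative).
orders = spark.read.table("raw.orders")

# A typical slice-and-dice transformation before downstream handoff.
daily = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .groupBy("order_date", "region")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.countDistinct("customer_id").alias("unique_customers"),
    )
)

# Inspect the physical plan for full scans or wide shuffles
# before scheduling the job (Spark 3.x formatted mode).
daily.explain(mode="formatted")
```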

• Data Pipeline Development: Design and implement scalable data pipelines using PySpark and Databricks on AWS cloud infrastructure
• ETL/ELT Operations: Extract, transform, and load data from various sources using Python, SQL, and PySpark for batch and streaming data processing
• Databricks Platform Management: Develop and maintain data workflows, notebooks, and clusters in Databricks environment for efficient data processing
• AWS Cloud Services: Utilize AWS services including S3, Glue, EMR, Redshift, Kinesis, and Lambda for comprehensive data solutions
• Data Transformation: Write efficient PySpark scripts and SQL queries to process large-scale datasets and implement complex business logic
• Data Quality & Monitoring: Implement data validation, quality checks, and monitoring solutions to ensure data integrity across pipelines
• Collaboration: Work closely with data scientists, analysts, and other engineering teams to support analytics and machine learning initiatives
• Performance Optimization: Monitor and optimize data pipeline performance, query efficiency, and resource utilization in Databricks and AWS environments
Required Qualifications:
• Experience: 3+ years of hands-on experience in data engineering, ETL development, or related field
• PySpark Expertise: Strong proficiency in PySpark for large-scale data processing and transformations
• Python Programming: Solid Python programming skills with experience in data manipulation libraries (pandas, etc.)
• SQL Proficiency: Advanced SQL skills including complex queries, window functions, and performance optimization
• Databricks Experience: Hands-on experience with Databricks platform, including notebook development, cluster management, and job scheduling
• AWS Cloud Services: Working knowledge of core AWS services (S3, Glue, EMR, Redshift, IAM, Lambda)
• Data Modeling: Understanding of dimensional modeling, data warehousing concepts, and ETL best practices
• Version Control: Experience with Git and collaborative development workflows
Preferred Qualifications:
• Education: Bachelor's degree in Computer Science, Engineering, Mathematics, or related technical field
• Advanced AWS: Experience with additional AWS services like Athena, QuickSight, Step Functions, and CloudWatch
• Data Formats: Experience working with various data formats (JSON, Parquet, Avro, Delta Lake)
• Containerization: Basic knowledge of Docker and container orchestration
• Agile Methodology: Experience working in Agile/Scrum development environments
• Business Intelligence Tools: Exposure to BI tools like Tableau, Power BI, or Databricks SQL Analytics
Technical Skills Summary:
Core Technologies:
- PySpark & Spark SQL
- Python (pandas, boto3)
- SQL (PostgreSQL, MySQL, Redshift)
- Databricks (notebooks, clusters, jobs, Delta Lake)
AWS Services:
- S3, Glue, EMR, Redshift
- Lambda, Athena
- IAM, CloudWatch
Development Tools:
- Git/GitHub
- CI/CD pipelines, Docker
- Linux/Unix command line
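As a small illustration of the core technologies listed above, a hedged PySpark sketch of a batch ETL step that reads raw JSON from S3 and writes partitioned Parquet; the bucket paths and field names are assumptions made for the example:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("s3-batch-transform").getOrCreate()

# Hypothetical bucket and prefixes; in practice these come from job config.
SRC = "s3://example-datalake/raw/events/"
DST = "s3://example-datalake/curated/events_daily/"

events = spark.read.json(SRC)

# Minimal quality gate: drop rows missing keys downstream jobs rely on.
clean = events.dropna(subset=["event_id", "event_ts"])

daily = (
    clean
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .count()
)

# Partitioning by date lets downstream readers prune efficiently.
daily.write.mode("overwrite").partitionBy("event_date").parquet(DST)
```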


About Data Axle:
Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle now has an established strategic global centre of excellence in Pune, which delivers mission-critical data services to global customers, powered by its proprietary cloud-based technology platform and by leveraging proprietary business and consumer databases.
Data Axle Pune is pleased to have achieved certification as a Great Place to Work!
Roles & Responsibilities:
We are looking for a Data Scientist to join the Data Science Client Services team to continue our success of identifying high quality target audiences that generate profitable marketing return for our clients. We are looking for experienced data science, machine learning and MLOps practitioners to design, build and deploy impactful predictive marketing solutions that serve a wide range of verticals and clients. The right candidate will enjoy contributing to and learning from a highly talented team and working on a variety of projects.
We are looking for a Senior Data Scientist who will be responsible for:
- Ownership of design, implementation, and deployment of machine learning algorithms in a modern Python-based cloud architecture
- Design or enhance ML workflows for data ingestion, model design, model inference and scoring
- Oversight on team project execution and delivery
- If senior, establish peer-review guidelines for high-quality coding to help develop junior team members' skill sets, cross-training, and team efficiency
- Visualize and publish model performance results and insights to internal and external audiences
Qualifications:
- Masters in a relevant quantitative, applied field (Statistics, Econometrics, Computer Science, Mathematics, Engineering)
- Minimum of 3.5 years of work experience in the end-to-end lifecycle of ML model development and deployment into production within a cloud infrastructure (Databricks is highly preferred)
- Proven ability to manage the output of a small team in a fast-paced environment and to lead by example in the fulfilment of client requests
- Exhibit deep knowledge of core mathematical principles relating to data science and machine learning (ML Theory + Best Practices, Feature Engineering and Selection, Supervised and Unsupervised ML, A/B Testing, etc.)
- Proficiency in Python and SQL required; PySpark/Spark experience a plus
- Ability to conduct productive peer reviews and maintain proper code structure in GitHub
- Proven experience developing, testing, and deploying various ML algorithms (neural networks, XGBoost, Bayes, and the like)
- Working knowledge of modern CI/CD methods
This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.
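For context, a minimal scikit-learn sketch of the train-and-evaluate loop at the heart of the ML lifecycle this role describes, run on synthetic data rather than any Data Axle dataset:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a response-propensity dataset.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=42)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_tr, y_tr)

# AUC is a common headline metric for audience-targeting models.
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"holdout AUC: {auc:.3f}")
```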
Job Overview:
We are looking for an experienced PL/SQL Developer to join our Professional Services team. The role involves developing and configuring enterprise-grade solutions, supporting clients during testing, and collaborating with internal teams. Candidates with strong expertise in PL/SQL and Unix/Linux are preferred, along with exposure to cloud, DevOps, or financial domains.
Key Responsibilities
- Develop and configure software features as per design specifications and enterprise standards.
- Interact with clients to resolve technical queries and support User Acceptance Testing (UAT).
- Collaborate with internal R&D, Professional Services, and Customer Support teams.
- Occasionally work at client sites or across different time zones.
- Ensure secure, scalable, and high-quality code.
Must-Have Skills
- Strong hands-on experience in PL/SQL, SQL, and Databases (Oracle, MS-SQL, MySQL, Postgres, MongoDB).
- Proficiency in Unix/Linux commands and shell scripting.
Nice-to-Have Skills
- Basic understanding of DevOps tools (Jenkins, Artifactory, Docker, Kubernetes).
- Exposure to Cloud environments (AWS preferred).
- Awareness of Python programming.
- Experience in AML, Fraud, or Financial Markets domain.
- Knowledge of Actimize (AIS/RCM/UDM).
Education & Experience
- Bachelor’s degree in Computer Science, Engineering, or equivalent.
- 4–8 years of overall IT experience, with 4+ years in software development.
- Strong problem-solving, communication, and customer interaction skills.
- Ability to work independently in time-sensitive environments.
Integration Developer
ROLE TITLE
Integration Developer
ROLE LOCATION(S)
Bangalore/Hyderabad/Chennai/Coimbatore/Noida/Kolkata/Pune/Indore
ROLE SUMMARY
The Integration Developer is a key member of the operations team, responsible for ensuring the smooth integration and functioning of various systems and software within the organization. This role involves technical support, system troubleshooting, performance monitoring, and assisting with the implementation of integration solutions.
ROLE RESPONSIBILITIES
· Design, develop, and maintain integration solutions using Spring Framework, Apache Camel, and other integration patterns such as RESTful APIs, SOAP services, file-based FTP/SFTP, and OAuth authentication.
· Collaborate with architects and cross-functional teams to design integration solutions that are scalable, secure, and aligned with business requirements.
· Resolve complex integration issues, performance bottlenecks, and data discrepancies across multiple systems. Support Production issues and fixes.
· Document integration processes, technical designs, APIs, and workflows to ensure clarity and ease of use.
· Participate in on-call rotation to provide 24/7 support for critical production issues.
· Develop source code using version control management in a collaborative work environment.
TECHNICAL QUALIFICATIONS
· 5+ years of experience in Java development with strong expertise in Spring Framework and Apache Camel for building enterprise-grade integrations.
· Proficient with Azure DevOps (ADO) for version control, CI/CD pipeline implementation, and project management.
· Hands-on experience with RESTful APIs, SOAP services, and file-based integrations using FTP and SFTP protocols.
· Strong analytical and troubleshooting skills for resolving complex integration and system issues.
· Experience in Azure Services, including Azure Service Bus, Azure Kubernetes Service (AKS), Azure Container Apps, and ideally Azure API Management (APIM) is a plus.
· Good understanding of containerization and cloud-native development, with experience in using Docker, Kubernetes, and Azure AKS.
· Experience with OAuth for secure authentication and authorization in integration solutions.
· Strong experience using GitHub for source control.
· Strong background in SQL databases (e.g., T-SQL, Stored Procedures) and working with data in an integration context.
GENERAL QUALIFICATIONS
· Excellent analytical and problem-solving skills, with a keen attention to detail.
· Effective communication skills, with the ability to collaborate with technical and non-technical stakeholders.
· Experience working in a fast paced, production support environment with a focus on incident management and resolution.
· Experience in the insurance domain is considered a plus.
EDUCATION REQUIREMENTS
· Bachelor’s degree in Computer Science, Information Technology, or related field.
🚀 Hiring: Java Developer
⭐ Experience: 4+ Years
📍 Location: Pune
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Requirements
✅ Strong proficiency in Java (Java 8/11/17)
✅ Experience with Spring / Spring Boot
✅ Knowledge of REST APIs, Microservices architecture
✅ Familiarity with SQL/NoSQL databases
✅ Understanding of Git, CI/CD pipelines
✅ Problem-solving skills and attention to detail

Job Title : Software Development Engineer (Python, Django & FastAPI + React.js)
Experience : 2+ Years
Location : Nagpur / Remote (India)
Job Type : Full Time
Collaboration Hours : 11:00 AM – 7:00 PM IST
About the Role :
We are seeking a Software Development Engineer to join our growing team. The ideal candidate will have strong expertise in backend development with Python, Django, and FastAPI, as well as working knowledge of AWS.
While backend development is the primary focus, you should also be comfortable contributing to frontend development using JavaScript, TypeScript, and React.
Mandatory Skills : Python, Django, FastAPI, AWS, JavaScript/TypeScript, React, REST APIs, SQL/NoSQL.
Key Responsibilities :
- Design, develop, and maintain backend services using Python (Django / FastAPI).
- Deploy, scale, and manage applications on AWS cloud services.
- Collaborate with frontend developers and contribute to React (JS/TS) development when required.
- Write clean, efficient, and maintainable code following best practices.
- Ensure system performance, scalability, and security.
- Participate in the full software development lifecycle: planning, design, development, testing, and deployment.
- Work collaboratively with cross-functional teams to deliver high-quality solutions.
Requirements :
- Bachelor’s degree in Computer Science, Computer Engineering, or related field.
- 2+ years of professional software development experience.
- Strong proficiency in Python, with hands-on experience in Django and FastAPI.
- Practical experience with AWS cloud services.
- Basic proficiency in JavaScript, TypeScript, and React for frontend development.
- Solid understanding of REST APIs, databases (SQL/NoSQL), and software design principles.
- Familiarity with Git and collaborative workflows.
- Strong problem-solving ability and adaptability in a fast-paced environment.
Good to Have :
- Experience with Docker for containerization.
- Knowledge of CI/CD pipelines and DevOps practices.
Shift timings: Afternoon
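For reference, a minimal FastAPI sketch of the kind of backend endpoint work this role involves; the in-memory store stands in for a real SQL/NoSQL backend and all names are illustrative:

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

# In-memory store standing in for a real SQL/NoSQL backend.
ITEMS: dict[int, dict] = {}

class Item(BaseModel):
    name: str
    price: float

@app.post("/items/{item_id}")
def create_item(item_id: int, item: Item):
    ITEMS[item_id] = item.model_dump()  # pydantic v2; use .dict() on v1
    return {"id": item_id, **ITEMS[item_id]}

@app.get("/items/{item_id}")
def read_item(item_id: int):
    if item_id not in ITEMS:
        raise HTTPException(status_code=404, detail="Item not found")
    return ITEMS[item_id]

# Run with: uvicorn main:app --reload
```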
Job Summary
We are seeking an experienced Senior Java Developer with strong expertise in legacy system migration, server management, and deployment. The candidate will be responsible for maintaining, enhancing, and migrating an existing Java/JSF (PrimeFaces), EJB, REST API, and SQL Server-based application to a modern Spring Boot architecture. The role involves ensuring smooth production deployments, troubleshooting server issues, and optimizing the existing infrastructure.
Key Responsibilities
● Maintain and enhance the existing Java, JSF (PrimeFaces), EJB, REST API, and SQL Server application.
● Migrate the legacy system to Spring Boot while ensuring minimal downtime.
● Manage deployments using Ansible, GlassFish/Payara, and deployer.sh scripts.
● Optimize and troubleshoot server performance (Apache, Payara, GlassFish).
● Handle XML file generation, email integrations, and REST API maintenance.
● Database management (SQL Server) including query optimization and schema updates.
● Collaborate with teams to ensure smooth transitions during migration.
● Automate CI/CD pipelines using Maven, Ansible, and shell scripts.
● Document migration steps, deployment processes, and system architecture.
Required Skills & Qualifications
● 8+ years of hands-on experience with Java, JSF (PrimeFaces), EJB, and REST APIs.
● Strong expertise in Spring Boot (migration experience from legacy Java is a must).
● Experience with Payara/GlassFish server management and deployment.
● Proficient in Apache, Ansible, and shell scripting (deployer.sh).
● Solid knowledge of SQL Server (queries, stored procedures, optimization).
● Familiarity with XML processing, email integrations, and Maven builds.
● Experience in production deployments, server troubleshooting, and performance tuning.
● Ability to work independently and lead migration efforts.
Preferred Skills
● Knowledge of microservices architecture (helpful for modernization).
● Familiarity with cloud platforms (AWS/Azure) is a plus.

Role Overview
We're looking for experienced Data Engineers who can independently design, build, and manage scalable data platforms. You'll work directly with clients and internal teams to develop robust data pipelines that support analytics, AI/ML, and operational systems.
You’ll also play a mentorship role and help establish strong engineering practices across our data projects.
Key Responsibilities
- Design and develop large-scale, distributed data pipelines (batch and streaming)
- Implement scalable data models, warehouses/lakehouses, and data lakes
- Translate business requirements into technical data solutions
- Optimize data pipelines for performance and reliability
- Ensure code is clean, modular, tested, and documented
- Contribute to architecture, tooling decisions, and platform setup
- Review code/design and mentor junior engineers
Must-Have Skills
- Strong programming skills in Python and advanced SQL
- Solid grasp of ETL/ELT, data modeling (OLTP & OLAP), and stream processing
- Hands-on experience with frameworks like Apache Spark, Flink, etc.
- Experience with orchestration tools like Airflow
- Familiarity with CI/CD pipelines and Git
- Ability to debug and scale data pipelines in production
Preferred Skills
- Experience with cloud platforms (AWS preferred, GCP or Azure also fine)
- Exposure to Databricks, dbt, or similar tools
- Understanding of data governance, quality frameworks, and observability
- Certifications (e.g., AWS Data Analytics, Solutions Architect, Databricks) are a bonus
What We’re Looking For
- Problem-solver with strong analytical skills and attention to detail
- Fast learner who can adapt across tools, tech stacks, and domains
- Comfortable working in fast-paced, client-facing environments
- Willingness to travel within India when required

Job Summary:
We are looking for a highly skilled and experienced Data Engineer with deep expertise in Airflow, dbt, Python, and Snowflake. The ideal candidate will be responsible for designing, building, and managing scalable data pipelines and transformation frameworks to enable robust data workflows across the organization.
Key Responsibilities:
- Design and implement scalable ETL/ELT pipelines using Apache Airflow for orchestration.
- Develop modular and maintainable data transformation models using dbt.
- Write high-performance data processing scripts and automation using Python.
- Build and maintain data models and pipelines on Snowflake.
- Collaborate with data analysts, data scientists, and business teams to deliver clean, reliable, and timely data.
- Monitor and optimize pipeline performance and troubleshoot issues proactively.
- Follow best practices in version control, testing, and CI/CD for data projects.
Must-Have Skills:
- Strong hands-on experience with Apache Airflow for scheduling and orchestrating data workflows.
- Proficiency in dbt (data build tool) for building scalable and testable data models.
- Expert-level skills in Python for data processing and automation.
- Solid experience with Snowflake, including SQL performance tuning, data modeling, and warehouse management.
- Strong understanding of data engineering best practices including modularity, testing, and deployment.
Good to Have:
- Experience working with cloud platforms (AWS/GCP/Azure).
- Familiarity with CI/CD pipelines for data (e.g., GitHub Actions, GitLab CI).
- Exposure to modern data stack tools (e.g., Fivetran, Stitch, Looker).
- Knowledge of data security and governance best practices.
Note: One face-to-face (F2F) round is mandatory, and as per the process, you will need to visit the office for this.
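As a rough illustration of the Airflow-plus-dbt orchestration described above, a minimal DAG sketch; the paths, task names, and extract script are assumptions, not part of the posting:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical layout: the dbt project lives at /opt/dbt/analytics and an
# extract script loads raw data into Snowflake before dbt transforms it.
with DAG(
    dag_id="daily_dbt_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ argument name
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract_to_snowflake",
        bash_command="python /opt/pipelines/extract_load.py",
    )
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )
    test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    extract >> transform >> test
```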
🚀 Hiring: Tableau Developer
⭐ Experience: 5+ Years
📍 Location: Pune, Gurgaon, Bangalore, Chennai, Hyderabad
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners or 15 Days
(Only immediate joiners & candidates serving notice period)
We are looking for a skilled Tableau Developer to join our team. The ideal candidate will have hands-on experience in creating, maintaining, and optimizing dashboards, reports, and visualizations that enable data-driven decision-making across the organization.
⭐ Key Responsibilities:
✅Develop and maintain Tableau dashboards & reports
✅Translate business needs into data visualizations
✅Work with SQL & multiple data sources for insights
✅Optimize dashboards for performance & usability
✅Collaborate with stakeholders for BI solutions
We are looking for a Java Developer with 1–3 years of experience for our Bangalore location. The candidate should have strong expertise in Core Java concepts (OOPs, Collections, Exception Handling, Multithreading) along with hands-on experience in Spring, Spring Boot, Hibernate/JPA, RESTful APIs, and Microservices. Proficiency in SQL databases (MySQL/PostgreSQL/Oracle), Maven/Gradle, and Git is essential. The role requires excellent problem-solving, coding, and debugging skills, with additional exposure to cloud platforms (AWS/Azure/GCP), CI/CD tools (Jenkins, Docker, Kubernetes), and Agile methodologies being an added advantage. During interviews, the panel should focus on evaluating the candidate’s Java fundamentals, Spring Boot/REST API development, database optimization, problem-solving abilities, and coding/debugging skills.

Primary Skills - Magento, Magento 1.x, SQL, .NET
Should be willing to work in shifts - 02:00 PM – 11:00 PM IST
5+ years of strong experience in Magento 1.x and Magento 2.x development (both front-end and back-end).
Hands-on experience with Magento Extensions – installation, customization, and development.
Proficiency in PHP, MySQL, and OOP principles.
Experience with Magento theme customization, module development, and extension integration.
Knowledge of JavaScript, Knockout.js, RequireJS, jQuery, HTML5, CSS3.
Understanding of Magento REST & GraphQL APIs for integrations.
Experience with version control (Git/Bitbucket) and build tools (Composer).
Exposure to production support environments (incident, problem, change management – ITIL knowledge preferred).
Exp: 10+ Years
CTC: 1.7 LPM
Location: Pune
SnowFlake Expertise Profile
Should hold 10+ years of experience, with a core understanding of cloud data warehouse principles and extensive experience in designing, building, optimizing, and maintaining robust and scalable data solutions on the Snowflake platform.
Possesses a strong background in data modelling, ETL/ELT, SQL development, performance tuning, scaling, monitoring, and security handling.
Responsibilities:
* Collaborate with the Data and ETL teams to review code, understand the current architecture, and help improve it based on Snowflake offerings and experience
* Review and implement best practices to design, develop, maintain, scale, and efficiently monitor data pipelines and data models on the Snowflake platform for ETL or BI.
* Optimize complex SQL queries for data extraction, transformation, and loading within Snowflake.
* Ensure data quality, integrity, and security within the Snowflake environment.
* Participate in code reviews and contribute to the team's development standards.
Education:
* Bachelor’s degree in Computer Science, Data Science, Information Technology, or an equivalent field.
* Relevant Snowflake certifications are a plus (e.g., Snowflake certified Pro / Architecture / Advanced).

🚀 We’re Hiring: Senior Python Backend Developer 🚀
📍 Location: Baner, Pune (Work from Office)
💰 Compensation: ₹6 LPA
🕑 Experience Required: Minimum 2 years as a Python Backend Developer
About Us
Foto Owl AI is a fast-growing product-based company headquartered in Baner, Pune.
We specialize in:
⚡ Hyper-personalized fan engagement
🤖 AI-powered real-time photo sharing
📸 Advanced media asset management
What You’ll Do
As a Senior Python Backend Developer, you’ll play a key role in designing, building, and deploying scalable backend systems that power our cutting-edge platforms.
Architect and develop complex, secure, and scalable backend services
Build and maintain APIs & data pipelines for web, mobile, and AI-driven platforms
Optimize SQL & NoSQL databases for high performance
Manage AWS infrastructure (EC2, S3, RDS, Lambda, CloudWatch, etc.)
Implement observability, monitoring, and security best practices
Collaborate cross-functionally with product & AI teams
Mentor junior developers and conduct code reviews
Troubleshoot and resolve production issues with efficiency
What We’re Looking For
✅ Strong expertise in Python backend development
✅ Solid knowledge of Data Structures & Algorithms
✅ Hands-on experience with SQL (PostgreSQL/MySQL) and NoSQL (MongoDB, Redis, etc.)
✅ Proficiency in RESTful APIs & Microservice design
✅ Knowledge of Docker, Kubernetes, and cloud-native systems
✅ Experience managing AWS-based deployments
Why Join Us?
At Foto Owl AI, you’ll be part of a passionate team building world-class media tech products used in sports, events, and fan engagement platforms. If you love scalable backend systems, real-time challenges, and AI-driven products, this is the place for you.
Responsibility:
- Develop and maintain code following predefined cost, company, and security standards.
- Work on bug fixes, supporting the maintenance and improvement of existing applications.
- Elaborate interfaces using standards and design principles defined by the team.
- Develop systems with high availability.
- Attend and contribute to development meetings.
- Be well versed in unit testing and PSR standards.
- Master the software development lifecycle, standards, and technologies used by the team.
- Deliver on time with high quality.
- Write automation tests before coding each API call, then code and test it.
- Strong troubleshooting and debugging skills.
- Perform technical documentation of the implemented tasks.


Job Requirement :
- 3-5 Years of experience in Data Science
- Strong expertise in statistical modeling, machine learning, deep learning, data warehousing, ETL, and reporting tools.
- Bachelor's/Master's in Data Science, Statistics, Computer Science, or Business Intelligence
- Experience with relevant programming languages and tools such as Python, R, SQL, Spark, Tableau, Power BI.
- Experience with machine learning frameworks like TensorFlow, PyTorch, or Scikit-learn
- Ability to think strategically and translate data insights into actionable business recommendations.
- Excellent problem-solving and analytical skills
- Adaptability and openness towards changing environment and nature of work
- This is a startup environment with evolving systems and procedures; the ideal candidate will be comfortable working in a fast-paced, dynamic environment and will have a strong desire to make a significant impact on the business.
Job Roles & Responsibilities:
- Conduct in-depth analysis of large-scale datasets to uncover insights and trends.
- Build and deploy predictive and prescriptive machine learning models for various applications.
- Design and execute A/B tests to evaluate the effectiveness of different strategies.
- Collaborate with product managers, engineers, and other stakeholders to drive data-driven decision-making.
- Stay up-to-date with the latest advancements in data science and machine learning.
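Since the role involves designing A/B tests, here is a small self-contained sketch of a two-proportion z-test on hypothetical conversion numbers:

```python
from math import sqrt

from scipy.stats import norm

# Hypothetical A/B results: conversions / visitors per variant.
conv_a, n_a = 420, 10_000
conv_b, n_b = 465, 10_000

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))

z = (p_b - p_a) / se
p_value = 2 * (1 - norm.cdf(abs(z)))  # two-sided

print(f"lift: {p_b - p_a:.4f}, z: {z:.2f}, p-value: {p_value:.4f}")
```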
Job Summary:
We are seeking a highly skilled and proactive DevOps Engineer with 4+ years of experience to join our dynamic team. This role requires strong technical expertise across cloud infrastructure, CI/CD pipelines, container orchestration, and infrastructure as code (IaC). The ideal candidate should also have direct client-facing experience and a proactive approach to managing both internal and external stakeholders.
Key Responsibilities:
- Collaborate with cross-functional teams and external clients to understand infrastructure requirements and implement DevOps best practices.
- Design, build, and maintain scalable cloud infrastructure on AWS (EC2, S3, RDS, ECS, etc.).
- Develop and manage infrastructure using Terraform or CloudFormation.
- Manage and orchestrate containers using Docker and Kubernetes (EKS).
- Implement and maintain CI/CD pipelines using Jenkins or GitHub Actions.
- Write robust automation scripts using Python and Shell scripting.
- Monitor system performance and availability, and ensure high uptime and reliability.
- Execute and optimize SQL queries for MSSQL and PostgreSQL databases.
- Maintain clear documentation and provide technical support to stakeholders and clients.
Required Skills:
- Minimum 4+ years of experience in a DevOps or related role.
- Proven experience in client-facing engagements and communication.
- Strong knowledge of AWS services – EC2, S3, RDS, ECS, etc.
- Proficiency in Infrastructure as Code using Terraform or CloudFormation.
- Hands-on experience with Docker and Kubernetes (EKS).
- Strong experience in setting up and maintaining CI/CD pipelines with Jenkins or GitHub.
- Solid understanding of SQL and working experience with MSSQL and PostgreSQL.
- Proficient in Python and Shell scripting.
Preferred Qualifications:
- AWS Certifications (e.g., AWS Certified DevOps Engineer) are a plus.
- Experience working in Agile/Scrum environments.
- Strong problem-solving and analytical skills.
Work Mode & Timing:
- Hybrid – Pune-based candidates preferred.
- Working hours: 12:30 PM to 9:30 PM IST to align with client time zones.
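As an example of the Python automation this role calls for, a short hedged boto3 sketch that flags running EC2 instances missing an expected tag; the region and tag key are assumptions:

```python
import boto3

# Assumes AWS credentials are already configured (profile, env vars,
# or an instance role); region and tag key are illustrative.
ec2 = boto3.client("ec2", region_name="ap-south-1")

reservations = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)["Reservations"]

# Flag running instances missing the tag our IaC should have applied.
for res in reservations:
    for inst in res["Instances"]:
        tags = {t["Key"]: t["Value"] for t in inst.get("Tags", [])}
        if "Owner" not in tags:
            print(f"untagged instance: {inst['InstanceId']}")
```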
- 5-10 years of experience in ETL Testing, Snowflake, and DWH concepts.
- Strong SQL knowledge & debugging skills are a must.
- Experience with Azure and Snowflake testing is a plus
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts; ETL tools like Talend Cloud Data Integration, Pentaho/Kettle
- Experience with JIRA and the Xray defect management tool is good to have.
- Exposure to the financial domain knowledge is considered a plus
- Test data readiness (data quality) and address code or data issues
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Demonstrated strong collaborative experience across regions (APAC, EMEA, and NA) to effectively and efficiently identify the root cause of code/data issues and arrive at a permanent solution
- Prior experience with State Street and Charles River Development (CRD) considered a plus
- Experience in tools such as PowerPoint, Excel, SQL
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus
Key Attributes include:
- Team player with professional and positive approach
- Creative, innovative and able to think outside of the box
- Strong attention to detail during root cause analysis and defect issue resolution
- Self-motivated & self-sufficient
- Effective communicator both written and verbal
- Brings a high level of energy with enthusiasm to generate excitement and motivate the team
- Able to work under pressure with tight deadlines and/or multiple projects
- Experience in negotiation and conflict resolution


- 5+ years of experience
- Flask API and REST API development experience
- Proficiency in Python programming
- Basic knowledge of front-end development
- Basic knowledge of data manipulation and analysis libraries
- Code versioning and collaboration (Git)
- Knowledge of libraries for extracting data from websites
- Knowledge of SQL and NoSQL databases
- Familiarity with RESTful APIs
- Familiarity with cloud technologies (Azure/AWS)
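A minimal Flask sketch of the REST API development listed above; the in-memory dictionary stands in for a real SQL/NoSQL database:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory store standing in for a SQL/NoSQL backend.
NOTES = {}

@app.route("/notes/<int:note_id>", methods=["GET", "PUT"])
def notes(note_id):
    if request.method == "PUT":
        NOTES[note_id] = request.get_json(force=True)
        return jsonify(NOTES[note_id]), 201
    if note_id not in NOTES:
        return jsonify(error="not found"), 404
    return jsonify(NOTES[note_id])

if __name__ == "__main__":
    app.run(debug=True)
```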
🚀 Hiring: Manual Tester
⭐ Experience: 5+ Years
📍 Location: Pan India
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Must-Have Skills:
✅5+ years of experience in Manual Testing
✅Solid experience in ETL, Database, and Report Testing
✅Strong expertise in SQL queries, RDBMS concepts, and DML/DDL operations
✅Working knowledge of BI tools such as Power BI
✅Ability to write effective Test Cases and Test Scenarios

Full Stack Developer (Node.js & React)
Location: Pune, India (Local or Ready to Relocate)
Employment Type: 6–8 Month Contract (Potential Conversion to FTE Based on Performance)
About the Role
We are seeking a highly skilled Full Stack Developer with expertise in Node.js and React to join our dynamic team in Pune. This role involves designing, developing, and deploying scalable web applications. You will collaborate with cross-functional teams to deliver high-impact solutions while adhering to best practices in coding, testing, and security.
Key Responsibilities
- Develop and maintain server-side applications using Node.js (Express/NestJS) and client-side interfaces with React.js (Redux/Hooks).
- Architect RESTful APIs and integrate with databases (SQL/NoSQL) and third-party services.
- Implement responsive UI/UX designs with modern front-end libraries (e.g., Material-UI, Tailwind CSS).
- Write unit/integration tests (Jest, Mocha, React Testing Library) and ensure code quality via CI/CD pipelines.
- Collaborate with product managers, designers, and QA engineers in an Agile environment.
- Troubleshoot performance bottlenecks and optimize applications for scalability.
- Document technical specifications and deployment processes.
Required Skills & Qualifications
- Experience: 3+ years in full-stack development with Node.js and React.
- Backend Proficiency:
- Strong knowledge of Node.js, Express, or NestJS.
- Experience with databases (PostgreSQL, MongoDB, Redis).
- API design (REST/GraphQL) and authentication (JWT/OAuth).
- Frontend Proficiency:
- Expertise in React.js (Functional Components, Hooks, Context API).
- State management (Redux, Zustand) and modern CSS frameworks.
- DevOps & Tools:
- Git, Docker, AWS/Azure, and CI/CD tools (Jenkins/GitHub Actions).
- Testing frameworks (Jest, Cypress, Mocha).
- Soft Skills:
- Problem-solving mindset and ability to work in a fast-paced environment.
- Excellent communication and collaboration skills.
- Location: Based in Pune or willing to relocate immediately.
Preferred Qualifications
- Experience with TypeScript, Next.js, or serverless architectures.
- Knowledge of microservices, message brokers (Kafka/RabbitMQ), or container orchestration (Kubernetes).
- Familiarity with Agile/Scrum methodologies.
- Contributions to open-source projects or a strong GitHub portfolio.
What We Offer
- Competitive Contract Compensation with timely payouts.
- Potential for FTE Conversion: Performance-based path to a full-time role.
- Hybrid Work Model: Flexible in-office (Pune) and remote options.
- Learning Opportunities: Access to cutting-edge tools and mentorship.
- Collaborative Environment: Work with industry experts on innovative projects.
Apply Now!
Ready to make an impact? Send your resume and GitHub/Portfolio links with the subject line:
"Full Stack Developer (Node/React) - Pune".
Local candidates or those relocating to Pune will be prioritized. Applications without portfolios will not be considered.
Equal Opportunity Employer
We celebrate diversity and are committed to creating an inclusive environment for all employees.
🌐 Job Title: Senior Azure Developer
🏢 Department: Digital Engineering
📍 Location: Pune (Work from Office)
📄 Job Type: Full-time
💼 Experience Required: 5+ years
💰 Compensation: Best in the industry
🔧 Roles & Responsibilities:
- Design, develop, and implement solutions using Microsoft Azure with .NET and other technologies.
- Collaborate with business analysts and end users to define system requirements.
- Work with QA teams to ensure solution integrity and functionality.
- Communicate frequently with stakeholders and team members to track progress and validate requirements.
- Evaluate and present technical solutions and recommendations.
- Provide technical mentoring and training to peers and junior developers.
💡 Technical Requirements:
- Minimum 2 years of hands-on development experience in:
- Azure Logic Apps
- Azure Service Bus
- Azure Web/API Apps
- Azure Functions
- Azure SQL Database / Cosmos DB
- Minimum 2 years’ experience in enterprise software development using .NET stack:
- REST APIs
- Web Applications
- Distributed Systems
- Familiarity with security best practices (e.g., OWASP).
- Knowledge of NoSQL data stores is an added advantage.
Job Title: PostgreSQL Database Administrator
Experience: 6–8 Years
Work Mode: Hybrid
Locations: Hyderabad / Pune
Joiners: Only immediate joiners & candidates who have completed notice period
Required Skills
- Strong hands-on experience in PostgreSQL administration (6+ years).
- Excellent understanding of SQL and query optimization techniques.
- Deep knowledge of database services, architecture, and internals.
- Experience in performance tuning at both DB and OS levels.
- Familiarity with DataGuard or similar high-availability solutions.
- Strong experience in job scheduling and automation.
- Comfortable with installing, configuring, and upgrading PostgreSQL.
- Basic to intermediate knowledge of Linux system administration.
- Hands-on experience with shell scripting for automation and monitoring tasks.
Key Responsibilities
- Administer and maintain PostgreSQL databases with 6+ years of hands-on experience.
- Write and optimize complex SQL queries for performance and scalability.
- Manage database storage structures and ensure optimal disk usage and performance.
- Monitor, analyze, and resolve database performance issues using tools and logs.
- Perform database tuning, configuration adjustments, and query optimization.
- Plan, schedule, and automate jobs using cron or other job scheduling tools at DB and OS levels.
- Install and upgrade PostgreSQL database software to new versions as required.
- Manage high availability and disaster recovery setups, including replication and DataGuard administration (or equivalent techniques).
- Perform regular database backups and restorations to ensure data integrity and availability.
- Apply security patches and updates on time.
- Collaborate with developers for schema design, stored procedures, and access privileges.
- Document configurations, processes, and performance tuning results.
Senior Software Engineer – Java
Location: Pune (Hybrid – 3 days from office)
Experience: 8–15 Years
Domain: Information Technology (IT)
Joining: Immediate joiners only
Preference: Local candidates only (Pune-based)
Job Description:
We are looking for experienced and passionate Senior Java Engineers to join a high-performing development team. The role involves building and maintaining robust, scalable, and low-latency backend systems and microservices in a fast-paced, agile environment.
Key Responsibilities:
- Work within a high-velocity scrum team to deliver enterprise-grade software solutions.
- Architect and develop scalable end-to-end web applications and microservices.
- Collaborate with cross-functional teams to analyze requirements and deliver optimal technical solutions.
- Participate in code reviews, unit testing, and deployment.
- Mentor junior engineers while remaining hands-on with development tasks.
- Provide accurate estimates and support the team lead in facilitating development processes.
Mandatory Skills & Experience:
- 6–7+ years of enterprise-level Java development experience.
- Strong in Java 8 or higher (Java 11 preferred), including lambda expressions, Stream API, and CompletableFuture.
- Minimum 4+ years working with Microservices, Spring Boot, and Hibernate.
- At least 3+ years of experience designing and developing RESTful APIs.
- Kafka – minimum 2 years’ hands-on experience in the current/most recent project.
- Solid experience with SQL.
- AWS – minimum 1.5 years of experience.
- Understanding of CI/CD pipelines and deployment processes.
- Exposure to asynchronous programming, multithreading, and performance tuning.
- Experience working in at least one Fintech domain project (mandatory).
Nice to Have:
- Exposure to Golang or Rust.
- Experience with any of the following tools: MongoDB, Jenkins, Sonar, Oracle DB, Drools, Adobe AEM, Elasticsearch/Solr/Algolia, Spark.
- Strong systems design and data modeling capabilities.
- Experience in payments or asset/wealth management domain.
- Familiarity with rules engines and CMS/search platforms.
Candidate Profile:
- Strong communication and client-facing skills.
- Proactive, self-driven, and collaborative mindset.
- Passionate about clean code and quality deliverables.
- Prior experience in building and deploying multiple products in production.
Note: Only candidates who are based in Pune and can join immediately will be considered.
🚀 Hiring: Postgres DBA at Deqode
⭐ Experience: 6+ Years
📍 Location: Pune & Hyderabad
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Looking for an experienced Postgres DBA with:-
✅ 6+ years in Postgres & strong SQL skills
✅ Good understanding of database services & storage management
✅ Performance tuning & monitoring expertise
✅ Knowledge of Dataguard admin, backups, upgrades
✅ Basic Linux admin & shell scripting
Job Overview
We are looking for a detail-oriented and skilled QA Engineer with expertise in Cypress to join our Quality Assurance team. In this role, you will be responsible for creating and maintaining automated test scripts to ensure the stability and performance of our web applications. You’ll work closely with developers, product managers, and other QA professionals to identify issues early and help deliver a high-quality user experience.
You should have a strong background in test automation, excellent analytical skills, and a passion for improving software quality through efficient testing practices.
Key Responsibilities
- Develop, maintain, and execute automated test cases using Cypress.
- Design robust test strategies and plans based on product requirements and user stories.
- Work with cross-functional teams to identify test requirements and ensure proper coverage.
- Perform regression, integration, smoke, and exploratory testing as needed.
- Report and track defects, and work with developers to resolve issues quickly.
- Collaborate in Agile/Scrum development cycles and contribute to sprint planning and reviews.
- Continuously improve testing tools, processes, and best practices.
- Optimize test scripts for performance, reliability, and maintainability.
Required Skills & Qualifications
- Hands-on experience with Cypress and JavaScript-based test automation.
- Strong understanding of QA methodologies, tools, and processes.
- Experience in testing web applications across multiple browsers and devices.
- Familiarity with REST APIs and tools like Postman or Swagger.
- Experience with version control systems like Git.
- Knowledge of CI/CD pipelines and integrating automated tests (e.g., GitHub Actions, Jenkins).
- Excellent analytical and problem-solving skills.
- Strong written and verbal communication.
Preferred Qualifications
- Experience with other automation tools (e.g., Selenium, Playwright) is a plus.
- Familiarity with performance testing or security testing.
- Background in Agile or Scrum methodologies.
- Basic understanding of DevOps practices.

Hybrid work mode
(Azure) EDW: Experience loading star-schema data warehouses using framework architectures, including loading Type 2 dimensions; ingesting data from various sources (structured and semi-structured); hands-on experience ingesting via APIs into lakehouse architectures.
Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Gen2 Storage, SQL (expert), Python (intermediate), Azure cloud services knowledge, data analysis (SQL), data warehousing, documentation – BRD, FRD, user story creation.
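To make the Type 2 dimension loading concrete, a hedged PySpark/Delta Lake sketch of one common pattern (expire changed rows, then append new current versions); the table, key, and attribute names are hypothetical:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical names: edw.dim_customer is the Type 2 dimension,
# staging.customer_updates the incoming feed; `address` is the
# tracked attribute.
updates = spark.read.table("staging.customer_updates")
dim = DeltaTable.forName(spark, "edw.dim_customer")
current = dim.toDF().filter("is_current = true")

# New keys plus rows whose tracked attribute changed.
changed = (
    updates.alias("s")
    .join(current.alias("t"),
          F.col("s.customer_id") == F.col("t.customer_id"), "left")
    .filter(F.col("t.customer_id").isNull()
            | (F.col("s.address") != F.col("t.address")))
    .select("s.*")
)

# Step 1: close out the superseded current rows.
(dim.alias("t")
    .merge(changed.alias("c"),
           "t.customer_id = c.customer_id AND t.is_current = true")
    .whenMatchedUpdate(set={"is_current": "false",
                            "end_date": "current_date()"})
    .execute())

# Step 2: append the new current versions.
(changed
    .withColumn("is_current", F.lit(True))
    .withColumn("start_date", F.current_date())
    .withColumn("end_date", F.lit(None).cast("date"))
    .write.format("delta").mode("append")
    .saveAsTable("edw.dim_customer"))
```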
- A minimum of 4-10 years of experience in data integration/orchestration services, service architecture, and providing data-driven solutions for client requirements.
- Experience with Microsoft Azure cloud and Snowflake SQL, and database query/performance tuning.
- Experience with Qlik Replicate and Qlik Compose (Change Data Capture) tools is considered a plus.
- Strong data warehousing concepts; experience with an ETL tool such as Talend Cloud Data Integration is a must.
- Exposure to the financial domain is considered a plus.
- Experience with cloud managed services such as GitHub source control and MS Azure/DevOps is considered a plus.
- Prior experience with State Street and Charles River Development (CRD) is considered a plus.
- Experience with tools such as Visio, PowerPoint, and Excel.
- Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus.
- Strong SQL knowledge and debugging skills are a must.

.NET + Angular Full Stack Developer (4–5 Years Experience)
Location: Pune/Remote
Experience Required: 4 to 5 years
Communication: Fluent English (verbal & written)
Technology: .NET, Angular
Only immediate joiners who can start on 21st July should apply.
Job Overview
We are seeking a skilled and experienced Full Stack Developer with strong expertise in .NET (C#) and Angular to join our dynamic team in Pune. The ideal candidate will have hands-on experience across the full development stack, a strong understanding of relational databases and SQL, and the ability to work independently with clients. Experience in microservices architecture is a plus.
Key Responsibilities
- Design, develop, and maintain modern web applications using .NET Core / .NET Framework and Angular
- Write clean, scalable, and maintainable code for both backend and frontend components
- Interact directly with clients for requirement gathering, demos, sprint planning, and issue resolution
- Work closely with designers, QA, and other developers to ensure high-quality product delivery
- Perform regular code reviews, ensure adherence to coding standards, and mentor junior developers if needed
- Troubleshoot and debug application issues and provide timely solutions
- Participate in discussions on architecture, design patterns, and technical best practices
Must-Have Skills
✅ Strong hands-on experience with .NET Core / .NET Framework (Web API, MVC)
✅ Proficiency in Angular (Component-based architecture, RxJS, State Management)
✅ Solid understanding of RDBMS and SQL (preferably with SQL Server)
✅ Familiarity with Entity Framework or Dapper
✅ Strong knowledge of RESTful API design and integration
✅ Version control using Git
✅ Excellent verbal and written communication skills
✅ Ability to work in a client-facing role and handle discussions independently
Good-to-Have / Optional Skills
Understanding or experience in Microservices Architecture
Exposure to CI/CD pipelines, unit testing frameworks, and cloud environments (e.g., Azure or AWS)


Job Title - Python Developer
Exp – 4 to 6 years
Location – Pune/Mumbai/Bengaluru
Please find the JD below:
Requirements:
- Proven experience as a Python Developer
- Strong knowledge of core Python and PySpark concepts
- Experience with web frameworks such as Django or Flask
- Good exposure to any cloud platform (GCP preferred)
- CI/CD exposure required
- Solid understanding of RESTful APIs and how to build them (see the Flask sketch after this list)
- Experience working with databases like Oracle DB and MySQL
- Ability to write efficient SQL queries and optimize database performance
- Strong problem-solving skills and attention to detail
- Strong SQL programming (stored procedures, functions)
- Excellent communication and interpersonal skills
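As a minimal sketch of the Flask-style REST endpoint building mentioned in the requirements; the /items resource and its in-memory store are purely illustrative:

    from flask import Flask, jsonify, request

    app = Flask(__name__)
    items = {}  # in-memory stand-in for a real database

    @app.route("/items/<int:item_id>", methods=["GET"])
    def get_item(item_id):
        if item_id not in items:
            return jsonify(error="not found"), 404
        return jsonify(items[item_id])

    @app.route("/items", methods=["POST"])
    def create_item():
        payload = request.get_json(force=True)
        item_id = len(items) + 1
        items[item_id] = {"id": item_id, **payload}
        return jsonify(items[item_id]), 201

    if __name__ == "__main__":
        app.run(debug=True)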
Roles and Responsibilities
- Design, develop, and maintain data pipelines and ETL processes using PySpark (see the sketch after this list)
- Work closely with data scientists and analysts to provide them with clean, structured data.
- Optimize data storage and retrieval for performance and scalability.
- Collaborate with cross-functional teams to gather data requirements.
- Ensure data quality and integrity through data validation and cleansing processes.
- Monitor and troubleshoot data-related issues to ensure data pipeline reliability.
- Stay up to date with industry best practices and emerging technologies in data engineering.
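A minimal PySpark pipeline sketch matching the responsibilities above, assuming a local Spark session; the paths, columns, and validation rules are placeholders, not from the posting:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    raw = spark.read.option("header", True).csv("/data/raw/orders.csv")

    clean = (
        raw.withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("order_id").isNotNull())   # basic validation
           .dropDuplicates(["order_id"])            # cleansing
    )

    # Write partitioned Parquet for efficient downstream retrieval.
    clean.write.mode("overwrite").partitionBy("order_date") \
         .parquet("/data/curated/orders")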

Job Purpose
Responsible for managing end-to-end database operations, ensuring data accuracy, integrity, and security across systems. The position plays a key role in driving data reliability, availability, and compliance with operational standards.
Key Responsibilities:
- Collate audit reports from the QA team and structure data in accordance with Standard Operating Procedures (SOP).
- Perform data transformation and validation for accuracy and consistency.
- Upload processed datasets into SQL Server using SSIS packages.
- Monitor and optimize database performance, identifying and resolving bottlenecks.
- Perform regular backups, restorations, and recovery checks to ensure data continuity.
- Manage user access and implement robust database security policies.
- Oversee database storage allocation and utilization.
- Conduct routine maintenance and support incident management, including root cause analysis and resolution.
- Design and implement scalable database solutions and architecture.
- Create and maintain stored procedures, views, and other database components.
- Optimize SQL queries for performance and scalability.
- Execute ETL processes and support seamless integration of multiple data sources.
- Maintain data integrity and quality through validation and cleansing routines.
- Collaborate with cross-functional teams on data solutions and project deliverables.
Educational Qualification: Any Graduate
Required Skills & Qualifications:
- Proven experience with SQL Server or similar relational database platforms.
- Strong expertise in SSIS, ETL processes, and data warehousing.
- Proficiency in SQL/T-SQL, including scripting, performance tuning, and query optimization.
- Experience in database security, user role management, and access control.
- Familiarity with backup/recovery strategies and database maintenance best practices.
- Strong analytical skills with experience working with large and complex datasets.
- Solid understanding of data modeling, normalization, and schema design.
- Knowledge of incident and change management processes.
- Excellent communication and collaboration skills.
- Experience with Python for data manipulation and automation is a strong plus.
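As a hedged illustration of that Python "strong plus" applied to the validate-and-load step described in the responsibilities, here is a sketch using pandas and SQLAlchemy with the Microsoft ODBC driver; every file, column, and connection detail below is a placeholder:

    import pandas as pd
    from sqlalchemy import create_engine

    df = pd.read_excel("audit_report.xlsx")  # collated QA audit data

    # Simple validation/cleansing before load, in the spirit of the SOP
    # checks above: drop rows without a key, dedupe, normalize dates.
    df = df.dropna(subset=["audit_id"]).drop_duplicates(subset=["audit_id"])
    df["audit_date"] = pd.to_datetime(df["audit_date"], errors="coerce")

    engine = create_engine(
        "mssql+pyodbc://user:pass@server/AuditDB"
        "?driver=ODBC+Driver+17+for+SQL+Server"
    )
    df.to_sql("audit_reports", engine, schema="dbo",
              if_exists="append", index=False)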
Job Title : Ab Initio Developer
Location : Pune
Experience : 5+ Years
Notice Period : Immediate Joiners Only
Job Summary :
We are looking for an experienced Ab Initio Developer to join our team in Pune.
The ideal candidate should have strong hands-on experience in Ab Initio development, data integration, and Unix scripting, with a solid understanding of SDLC and data warehousing concepts.
Mandatory Skills :
Ab Initio (GDE, EME, graphs, parameters), SQL/Teradata, Data Warehousing, Unix Shell Scripting, Data Integration, DB Load/Unload Utilities.
Key Responsibilities :
- Design and develop Ab Initio graphs/plans/sandboxes/projects using GDE and EME.
- Manage and configure standard environment parameters and multifile systems.
- Perform complex data integration from multiple source and target systems with business rule transformations.
- Utilize DB Load/Unload Utilities effectively for optimized performance.
- Implement generic graphs, ensure proper use of parallelism, and maintain project parameters.
- Work in a data warehouse environment involving SDLC, ETL processes, and data analysis.
- Write and maintain Unix Shell Scripts and use utilities like sed, awk, etc.
- Optimize and troubleshoot performance issues in Ab Initio jobs.
Mandatory Skills :
- Strong expertise in Ab Initio (GDE, EME, graphs, parallelism, DB utilities, multifile systems).
- Experience with SQL and databases like SQL Server or Teradata.
- Proficiency in Unix Shell Scripting and Unix utilities.
- Data integration and ETL from varied source/target systems.
Good to Have :
- Experience in Ab Initio and AWS integration.
- Knowledge of Message Queues and Continuous Graphs.
- Exposure to Metadata Hub.
- Familiarity with Big Data tools such as Hive, Impala.
- Understanding of job scheduling tools.
We’re hiring a Maximo Technical Lead with hands-on experience in Maximo 7.6 or higher, Java, and Oracle DB. The role involves leading Maximo implementations, upgrades, and support projects, especially for manufacturing clients.
Key Skills:
IBM Maximo (MAS 8.x preferred)
Java, Oracle 12c+, WebSphere
Maximo Mobile / Asset Management / Cognos / BIRT
SQL, scripting, troubleshooting
Experience leading tech teams and working with clients
Good to Have:
IBM Maximo Certification
MES/Infrastructure planning knowledge
Experience in the Rail or Manufacturing domain
Job Title : Oracle Analytics Cloud (OAC) / Fusion Data Intelligence (FDI) Specialist
Experience : 3 to 8 years
Location : All USI locations – Hyderabad, Bengaluru, Mumbai, Gurugram (preferred) and Pune, Chennai, Kolkata
Work Mode : Hybrid Only (2-3 days from office or all 5 days from office)
Mandatory Skills : Oracle Analytics Cloud (OAC), Fusion Data Intelligence (FDI), RPD, OAC Reports, Data Visualizations, SQL, PL/SQL, Oracle Databases, ODI, Oracle Cloud Infrastructure (OCI), DevOps tools, Agile methodology.
Key Responsibilities :
- Design, develop, and maintain solutions using Oracle Analytics Cloud (OAC).
- Build and optimize complex RPD models, OAC reports, and data visualizations.
- Utilize SQL and PL/SQL for data querying and performance optimization.
- Develop and manage applications hosted on Oracle Cloud Infrastructure (OCI).
- Support Oracle Cloud migrations, OBIEE upgrades, and integration projects.
- Collaborate with teams using the ODI (Oracle Data Integrator) tool for ETL processes.
- Implement cloud scripting using cURL for Oracle Cloud automation.
- Contribute to the design and implementation of Business Continuity and Disaster Recovery strategies for cloud applications.
- Participate in Agile development processes and DevOps practices including CI/CD and deployment orchestration.
Required Skills :
- Strong hands-on expertise in Oracle Analytics Cloud (OAC) and/or Fusion Data Intelligence (FDI).
- Deep understanding of data modeling, reporting, and visualization techniques.
- Proficiency in SQL, PL/SQL, and relational databases on Oracle.
- Familiarity with DevOps tools, version control, and deployment automation.
- Working knowledge of Oracle Cloud services, scripting, and monitoring.
Good to Have :
- Prior experience in OBIEE to OAC migrations.
- Exposure to data security models and cloud performance tuning.
- Certification in Oracle Cloud-related technologies.

Job Title : IBM Sterling Integrator Developer
Experience : 3 to 5 Years
Locations : Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, Pune
Employment Type : Full-Time
Job Description :
We are looking for a skilled IBM Sterling Integrator Developer with 3–5 years of experience to join our team across multiple locations.
The ideal candidate should have strong expertise in IBM Sterling and integration, along with scripting and database proficiency.
Key Responsibilities :
- Develop, configure, and maintain IBM Sterling Integrator solutions.
- Design and implement integration solutions using IBM Sterling.
- Collaborate with cross-functional teams to gather requirements and provide solutions.
- Work with custom languages and scripting to enhance and automate integration processes.
- Ensure optimal performance and security of integration systems.
Must-Have Skills :
- Hands-on experience with IBM Sterling Integrator and associated integration tools.
- Proficiency in at least one custom scripting language.
- Strong command of shell scripting, Python, and SQL (mandatory).
- Good understanding of EDI standards and protocols is a plus.
Interview Process :
- 2 Rounds of Technical Interviews.
Additional Information :
- Open to candidates from Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, and Pune.

Job Summary:
As an AWS Data Engineer, you will be responsible for designing, developing, and maintaining scalable, high-performance data pipelines using AWS services. With 6+ years of experience, you’ll collaborate closely with data architects, analysts, and business stakeholders to build reliable, secure, and cost-efficient data infrastructure across the organization.
Key Responsibilities:
- Design, develop, and manage scalable data pipelines using AWS Glue, Lambda, and other serverless technologies
- Implement ETL workflows and transformation logic using PySpark and Python on AWS Glue (see the job skeleton after this list)
- Leverage AWS Redshift for warehousing, performance tuning, and large-scale data queries
- Work with AWS DMS and RDS for database integration and migration
- Optimize data flows and system performance for speed and cost-effectiveness
- Deploy and manage infrastructure using AWS CloudFormation templates
- Collaborate with cross-functional teams to gather requirements and build robust data solutions
- Ensure data integrity, quality, and security across all systems and processes
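As a hedged skeleton of the kind of Glue PySpark job described above; the job-argument names, catalog database, table, and S3 path are assumptions, not from the posting:

    import sys
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext.getOrCreate())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read from the Glue Data Catalog, transform, write Parquet to S3.
    dyf = glue_context.create_dynamic_frame.from_catalog(
        database="sales_db", table_name="raw_orders")
    df = dyf.toDF().filter("amount > 0")  # PySpark transformation step

    df.write.mode("overwrite").parquet("s3://my-bucket/curated/orders/")
    job.commit()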
Required Skills & Experience:
- 6+ years of experience in Data Engineering with strong AWS expertise
- Proficient in Python and PySpark for data processing and ETL development
- Hands-on experience with AWS Glue, Lambda, DMS, RDS, and Redshift
- Strong SQL skills for building complex queries and performing data analysis
- Familiarity with AWS CloudFormation and infrastructure as code principles
- Good understanding of serverless architecture and cost-optimized design
- Ability to write clean, modular, and maintainable code
- Strong analytical thinking and problem-solving skills
Skill Name: ETL Automation Testing
Location: Bangalore, Chennai and Pune
Experience: 5+ Years
Required:
Experience in ETL Automation Testing
Strong experience in PySpark.
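As a minimal sketch of what ETL automation testing with PySpark can look like under pytest; the data paths and reconciliation rules are illustrative assumptions only:

    import pytest
    from pyspark.sql import SparkSession

    @pytest.fixture(scope="session")
    def spark():
        return (SparkSession.builder.master("local[2]")
                .appName("etl_tests").getOrCreate())

    def test_row_counts_match(spark):
        # Source-to-target reconciliation: no rows lost in the load.
        source = spark.read.parquet("/data/source/orders")
        target = spark.read.parquet("/data/warehouse/orders")
        assert source.count() == target.count()

    def test_no_null_keys(spark):
        # Key-integrity check on the loaded table.
        target = spark.read.parquet("/data/warehouse/orders")
        assert target.filter("order_id IS NULL").count() == 0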
Job Title : Data Engineer – Snowflake Expert
Location : Pune (Onsite)
Experience : 10+ Years
Employment Type : Contractual
Mandatory Skills : Snowflake, Advanced SQL, ETL/ELT (Snowpipe, Tasks, Streams), Data Modeling, Performance Tuning, Python, Cloud (preferably Azure), Security & Data Governance.
Job Summary :
We are seeking a seasoned Data Engineer with deep expertise in Snowflake to design, build, and maintain scalable data solutions.
The ideal candidate will have a strong background in data modeling, ETL/ELT, SQL optimization, and cloud data warehousing principles, with a passion for leveraging Snowflake to drive business insights.
Responsibilities :
- Collaborate with data teams to optimize and enhance data pipelines and models on Snowflake.
- Design and implement scalable ELT pipelines with performance and cost-efficiency in mind.
- Ensure high data quality, security, and adherence to governance frameworks.
- Conduct code reviews and align development with best practices.
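A hedged sketch of the Snowpipe/Streams/Tasks style of ELT named in the mandatory skills, wired up through the Snowflake Python connector; the account, credentials, and object names are placeholders:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="secret",
        warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
    )
    cur = conn.cursor()

    # Capture changes on the raw table...
    cur.execute("CREATE STREAM IF NOT EXISTS orders_stream ON TABLE raw_orders")

    # ...and apply them on a schedule, only when the stream has data.
    cur.execute("""
        CREATE TASK IF NOT EXISTS apply_orders
          WAREHOUSE = ETL_WH
          SCHEDULE  = '5 MINUTE'
        WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
        AS INSERT INTO curated_orders
           SELECT order_id, amount, updated_at FROM orders_stream
    """)
    cur.execute("ALTER TASK apply_orders RESUME")
    cur.close()
    conn.close()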
Qualifications :
- Bachelor’s in Computer Science, Data Science, IT, or related field.
- Snowflake certifications (Pro/Architect) preferred.

Required Skills:
- Hands-on experience with Databricks, PySpark
- Proficiency in SQL, Python, and Spark.
- Understanding of data warehousing concepts and data modeling.
- Experience with CI/CD pipelines and version control (e.g., Git).
- Fundamental knowledge of any cloud services, preferably Azure or GCP.
Good to Have:
- BigQuery
- Experience with performance tuning and data governance.
Competitive Salary
About Solidatus
At Solidatus, we empower organizations to connect and visualize their data relationships, making it easier to identify, access, and understand their data. Our metadata management technology helps businesses establish a sustainable data foundation, ensuring they meet regulatory requirements, drive digital transformation, and unlock valuable insights.
We’re experiencing rapid growth—backed by HSBC, Citi, and AlbionVC, we secured £14 million in Series A funding in 2021. Our achievements include recognition in the Deloitte UK Technology Fast 50, multiple A-Team Innovation Awards, and a top 1% place to work ranking from The Financial Technologist.
Now is an exciting time to join us as we expand internationally and continue shaping the future of data management.
About the Engineering Team
Engineering is the heart of Solidatus. Our team of world-class engineers, drawn from outstanding computer science and technical backgrounds, plays a critical role in crafting the powerful, elegant solutions that set us apart. We thrive on solving challenging visualization and data management problems, building technology that delights users and drives real-world impact for global enterprises.
As Solidatus expands its footprint, we are scaling our capabilities with a focus on building world-class connectors and integrations to extend the reach of our platform. Our engineers are trusted with the freedom to explore, innovate, and shape the product’s future — all while working in a collaborative, high-impact environment. Here, your code doesn’t just ship — it empowers some of the world's largest and most complex organizations to achieve their data ambitions.
Who We Are & What You’ll Do
Join our Data Integration team and help shape the way data flows!
Your Mission:
To expand and refine our suite of out-of-the-box integrations, using our powerful API and SDK to bring in metadata for visualisation from a vast range of sources including databases with diverse SQL dialects.
But that is just the beginning. At our core, we are problem-solvers and innovators. You’ll have the chance to:
- Design intuitive layouts representing the flow of data across complex deployments of diverse technologies
- Design and optimize API connectivity and parsers that read metadata from source systems
- Explore new paradigms for representing data lineage
- Enhance our data ingestion capabilities to handle massive volumes of data
- Dig deep into data challenges to build smarter, more scalable solutions
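Solidatus's own API and SDK are not shown here; purely as an illustration of the kind of metadata such integrations ingest, this sketch uses the open-source sqlglot parser to pull table-level lineage candidates out of a SQL statement:

    import sqlglot
    from sqlglot import exp

    sql = """
        INSERT INTO mart.daily_sales
        SELECT s.store_id, SUM(o.amount)
        FROM raw.orders AS o JOIN raw.stores AS s ON o.store_id = s.store_id
        GROUP BY s.store_id
    """

    # Parse with a chosen dialect (sqlglot supports many SQL dialects),
    # then collect every table reference in the statement.
    tree = sqlglot.parse_one(sql, read="snowflake")
    tables = sorted({f"{t.db}.{t.name}" for t in tree.find_all(exp.Table)})
    print(tables)  # ['mart.daily_sales', 'raw.orders', 'raw.stores']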
Beyond engineering, you’ll collaborate with users, troubleshoot tricky issues, streamline development workflows, and contribute to a culture of continuous improvement.
What We’re Looking For
- We don’t believe in sticking to a single tech stack just for the sake of it. We’re engineers first, and we pick the best tools for the job. More than ticking off a checklist, we value mindset, curiosity, and problem-solving skills.
- You’re quick to learn and love diving into new technologies
- You push for excellence and aren’t satisfied with “just okay”
- You can break down complex topics in a way that anyone can understand
- You should have 6–8 years of proven experience in developing and delivering high-quality, scalable software solutions
- You should be a strong self-starter with the ability to take ownership of tasks and drive them to completion with minimal supervision.
- You should be able to mentor junior developers, perform code reviews, and ensure adherence to best practices in software engineering.
Tech & Skills We’d Love to See
Must-have:
- Strong hands-on experience with Java, Spring Boot, RESTful APIs, and Node.js
- Solid knowledge of databases, SQL dialects, and data structures
Nice-to-have:
- Experience with C#, ASP.NET Core, TypeScript, React.js, or similar frameworks
- Bonus points for data experience—we love data wizards
If you’re passionate about engineering high-impact solutions, playing with cutting-edge tech, and making data work smarter, we’d love to have you on board!