50+ SQL Jobs in Pune
Apply to 50+ SQL Jobs in Pune on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.

About NonStop io Technologies:
NonStop io Technologies is a value-driven company with a strong focus on process-oriented software engineering. We specialize in Product Development and have a decade's worth of experience in building web and mobile applications across various domains. NonStop io Technologies follows core principles that guide its operations and believes in staying invested in a product's vision for the long term. We are a small but proud group of individuals who believe in the 'givers gain' philosophy and strive to provide value in order to seek value. We are committed to and specialize in building cutting-edge technology products and serving as trusted technology partners for startups and enterprises. We pride ourselves on fostering innovation, learning, and community engagement. Join us to work on impactful projects in a collaborative and vibrant environment.
Brief Description:
As a Business Systems Analyst at NonStop io, you will analyse and define system requirements to support software products in the clinical genetics domain. The BSA oversees software enhancements through the entire SDLC, from discovery and design through development and validation. The BSA is responsible for communicating technical and functional requirements to stakeholders and users, creating JIRA tickets detailing requirements for the Engineering team, creating timelines for deliverables, and reporting project status updates upon request. At Ambry Genetics, you will have the opportunity to create innovative solutions and leverage the latest technologies to make our diagnostics testing even better.
Essential Functions
● Communicate functional requirements with stakeholders and the internal scrum team
● Collect, synthesize, and document requirements following standards to ensure clear communication with business stakeholders and IT internal stakeholders, such as the scrum team
● Report project status updates to PMO to help senior leadership understand current state, next steps, and project timelines
● Work with Technical Leads to develop solutions that align with defined business logic, goals, and objectives, and determine the technical and operational feasibility of the proposed solution
● Identify technical constraints to the proposed software solution
● Identify and report the impact of the proposed software solution on other systems and workflows
● Develop diagrams and flowcharts to assist the development team in their understanding of the software development solution
● Ensure requirements are understood and approved by stakeholders
● Work with QA to develop test plans
● Create acceptance criteria and training material for end users
● Work with the QA team to test and validate developed features against requirements and coordinate end-user validation
● Coordinate change control protocols that ensure the business approval of production deployments
● Become a subject matter expert for software products
● Lead the team by providing requirements guidance, driving business initiatives, and overseeing the team to ensure business tasks are completed on time and with the expected quality
● Other duties as assigned
Qualifications
● Domain knowledge of genetics, genetic testing, provider/patient portals, and/or healthcare
● Bachelor's or advanced degree, preferably in a Biology-related discipline, Computer Science, or a similar relevant specialization, or equivalent work experience
● 2-5 years of business analysis experience with software development
● Strong working knowledge and experience using JIRA
● Strong verbal and written skills
● Experience facilitating project meetings and requirements gathering
● Knowledge of SDLC, Agile, and Scrum methodologies
● Scrum master experience is highly preferred
● Project Manager experience is nice to have
● Strong proficiency with SQL (e.g., if/then logic, joins), as illustrated in the sketch after this list
● Strong working experience with relational/non-relational databases
● Excellent problem-solving skills and attention to detail
● Strong communication skills and ability to work effectively in a collaborative team environment.
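For candidates wondering what "if/then logic and joins" means in practice, here is a minimal sketch using Python's built-in sqlite3 module; the tables and columns are invented for illustration only and do not reflect any actual schema at NonStop io or Ambry Genetics.

```python
# A minimal illustration of conditional CASE ("if/then") logic plus a
# join, run through Python's built-in sqlite3. All names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE patients (id INTEGER, name TEXT);
    CREATE TABLE orders (id INTEGER, patient_id INTEGER, status TEXT);
    INSERT INTO patients VALUES (1, 'A'), (2, 'B');
    INSERT INTO orders VALUES (10, 1, 'complete'), (11, 2, 'pending');
""")

rows = conn.execute("""
    SELECT p.name,
           CASE WHEN o.status = 'complete' THEN 'done'
                ELSE 'in progress' END AS progress   -- if/then logic in SQL
    FROM orders AS o
    JOIN patients AS p ON p.id = o.patient_id        -- inner join
""").fetchall()
print(rows)  # [('A', 'done'), ('B', 'in progress')]
```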
What We Offer:
- Competitive salary and benefits
- A collaborative and innovative work environment
- Opportunities for professional growth and development
If you are passionate about leveraging your analytical skills to transform healthcare systems, we would love to hear from you!


About Data Axle:
Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle now has an established strategic global centre of excellence in Pune. This centre delivers mission-critical data services to its global customers, powered by its proprietary cloud-based technology platform and by leveraging proprietary business & consumer databases.
Data Axle Pune is pleased to have achieved certification as a Great Place to Work!
Roles & Responsibilities:
We are looking for a Senior Data Scientist to join the Data Science Client Services team to continue our success of identifying high quality target audiences that generate profitable marketing return for our clients. We are looking for experienced data science, machine learning and MLOps practitioners to design, build and deploy impactful predictive marketing solutions that serve a wide range of verticals and clients. The right candidate will enjoy contributing to and learning from a highly talented team and working on a variety of projects.
We are looking for a Senior Data Scientist who will be responsible for:
- Ownership of design, implementation, and deployment of machine learning algorithms in a modern Python-based cloud architecture (a minimal sketch follows this list)
- Design or enhance ML workflows for data ingestion, model design, model inference and scoring
- Oversight on team project execution and delivery
- Establish peer review guidelines for high quality coding to help develop junior team members’ skill set growth, cross-training, and team efficiencies
- Visualize and publish model performance results and insights to internal and external audiences
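As a rough illustration of the train-and-score workflow such a role owns (not Data Axle's actual stack), here is a minimal sketch using scikit-learn on synthetic data:

```python
# A minimal train-and-score sketch; synthetic data and scikit-learn
# stand in for the real feature pipeline and cloud deployment.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))                 # stand-in audience features
y = (X[:, 0] + rng.normal(size=1000) > 0) * 1   # stand-in response label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)

scores = model.predict_proba(X_te)[:, 1]        # audience scores for targeting
print("holdout AUC:", round(roc_auc_score(y_te, scores), 3))
```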
Qualifications:
- Master's degree in a relevant quantitative, applied field (Statistics, Econometrics, Computer Science, Mathematics, Engineering)
- Minimum of 5 years of work experience in the end-to-end lifecycle of ML model development and deployment into production within a cloud infrastructure (Databricks is highly preferred)
- Proven ability to manage the output of a small team in a fast-paced environment and to lead by example in the fulfilment of client requests
- Exhibit deep knowledge of core mathematical principles relating to data science and machine learning (ML Theory + Best Practices, Feature Engineering and Selection, Supervised and Unsupervised ML, A/B Testing, etc.)
- Proficiency in Python and SQL required; PySpark/Spark experience a plus
- Ability to conduct productive peer reviews and maintain proper code structure in GitHub
- Proven experience developing, testing, and deploying various ML algorithms (neural networks, XGBoost, Bayes, and the like)
- Working knowledge of modern CI/CD methods
This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.
Job Summary:
We are seeking a highly skilled and proactive DevOps Engineer with 4+ years of experience to join our dynamic team. This role requires strong technical expertise across cloud infrastructure, CI/CD pipelines, container orchestration, and infrastructure as code (IaC). The ideal candidate should also have direct client-facing experience and a proactive approach to managing both internal and external stakeholders.
Key Responsibilities:
- Collaborate with cross-functional teams and external clients to understand infrastructure requirements and implement DevOps best practices.
- Design, build, and maintain scalable cloud infrastructure on AWS (EC2, S3, RDS, ECS, etc.).
- Develop and manage infrastructure using Terraform or CloudFormation.
- Manage and orchestrate containers using Docker and Kubernetes (EKS).
- Implement and maintain CI/CD pipelines using Jenkins or GitHub Actions.
- Write robust automation scripts using Python and Shell scripting (see the sketch after this list).
- Monitor system performance and availability, and ensure high uptime and reliability.
- Execute and optimize SQL queries for MSSQL and PostgreSQL databases.
- Maintain clear documentation and provide technical support to stakeholders and clients.
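As a flavour of the Python automation involved, here is a minimal, hypothetical sketch that lists production EC2 instances and their states with boto3; the region and tag filter are assumptions, and AWS credentials must already be configured:

```python
# A minimal automation sketch: report the state of tagged EC2 instances.
# Assumes boto3 and configured AWS credentials; the tag is hypothetical.
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")
resp = ec2.describe_instances(
    Filters=[{"Name": "tag:Environment", "Values": ["production"]}]
)
for reservation in resp["Reservations"]:
    for inst in reservation["Instances"]:
        print(inst["InstanceId"], inst["State"]["Name"])
```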
Required Skills:
- Minimum 4+ years of experience in a DevOps or related role.
- Proven experience in client-facing engagements and communication.
- Strong knowledge of AWS services – EC2, S3, RDS, ECS, etc.
- Proficiency in Infrastructure as Code using Terraform or CloudFormation.
- Hands-on experience with Docker and Kubernetes (EKS).
- Strong experience in setting up and maintaining CI/CD pipelines with Jenkins or GitHub Actions.
- Solid understanding of SQL and working experience with MSSQL and PostgreSQL.
- Proficient in Python and Shell scripting.
Preferred Qualifications:
- AWS Certifications (e.g., AWS Certified DevOps Engineer) are a plus.
- Experience working in Agile/Scrum environments.
- Strong problem-solving and analytical skills.
Work Mode & Timing:
- Hybrid – Pune-based candidates preferred.
- Working hours: 12:30 PM to 9:30 PM IST to align with client time zones.
- 5 -10 years of experience in ETL Testing, Snowflake, DWH Concepts.
- Strong SQL knowledge & debugging skills are a must.
- Experience with Azure and Snowflake testing is a plus
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts; experience with ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle
- Experience with JIRA and the Xray defect management tool is good to have.
- Exposure to the financial domain knowledge is considered a plus
- Test data readiness (data quality) and address code or data issues; a minimal sketch of this kind of check follows this list
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Demonstrated strong collaboration across regions (APAC, EMEA, and NA) to effectively and efficiently identify the root cause of code/data issues and arrive at a permanent solution
- Prior experience with State Street and Charles River Development (CRD) considered a plus
- Experience in tools such as PowerPoint, Excel, SQL
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus
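As a minimal illustration of the data-readiness checks mentioned above (not a prescribed toolchain), the sketch below reconciles a row count and a column checksum between a source and a target using pandas and SQLAlchemy; the connection strings and table name are placeholders:

```python
# A minimal source-vs-target reconciliation check. The DSNs and table
# name are placeholders; assumes pandas, SQLAlchemy, and the relevant
# database drivers.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("snowflake://user:pass@account/db")   # placeholder
target = create_engine("postgresql://user:pass@host/dwh")    # placeholder

q = "SELECT COUNT(*) AS n, SUM(amount) AS total FROM trades"
n_src, total_src = pd.read_sql(q, source).iloc[0]
n_tgt, total_tgt = pd.read_sql(q, target).iloc[0]

assert int(n_src) == int(n_tgt), "row counts differ"
assert abs(float(total_src) - float(total_tgt)) < 1e-6, "checksums differ"
print("source and target reconcile")
```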
Key Attributes include:
- Team player with professional and positive approach
- Creative, innovative and able to think outside of the box
- Strong attention to detail during root cause analysis and defect issue resolution
- Self-motivated & self-sufficient
- Effective communicator both written and verbal
- Brings a high level of energy with enthusiasm to generate excitement and motivate the team
- Able to work under pressure with tight deadlines and/or multiple projects
- Experience in negotiation and conflict resolution
Responsibilities:
- Design, develop, and maintain scalable applications using Java/Kotlin Core Concepts and Spring Boot MVC.
- Build and optimize REST APIs for seamless client-server communication.
- Develop and ensure efficient HTTP/HTTPS request-response mechanisms.
- Handle Java/Kotlin version upgrades confidently, ensuring code compatibility and leveraging the latest features.
- Solve complex business logic challenges with a methodical and innovative approach.
- Write optimized SQL queries with Postgres DB.
- Ensure code quality through adherence to design patterns (e.g., Singleton, Factory, Observer, MVC) and unit testing frameworks like JUnit.
- Integrate third-party APIs and develop large-scale systems with technical precision.
- Debug and troubleshoot production issues.
Requirements:
- 2 to 4 years of hands-on experience in Java/Kotlin Spring Boot development.
- Proven expertise in handling version upgrades for Java and Kotlin with confidence.
- Strong logical thinking and problem-solving skills, especially in implementing complex algorithms.
- Proficiency with Git, JIRA, and managing software package versions.
- Familiarity with SaaS-based products, XML parsing/generation, and generating PDFs, XLS, CSVs using Spring Boot.
- Strong understanding of JPA, Hibernate, and core Java concepts (OOP).
Skills (Good to Have):
- Exposure to Docker, Redis, and Elasticsearch.
- Knowledge of transaction management and solving computational problems.
- Eagerness to explore new technologies.


- 5+ years of experience
- Flask API and REST API development experience (a minimal sketch follows this list)
- Proficiency in Python programming.
- Basic knowledge of front-end development.
- Basic knowledge of data manipulation and analysis libraries
- Code versioning and collaboration (Git)
- Knowledge of libraries for extracting data from websites
- Knowledge of SQL and NoSQL databases
- Familiarity with RESTful APIs
- Familiarity with Cloud (Azure /AWS) technologies
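As a minimal, generic sketch of the Flask/REST work described (the /items resource is hypothetical):

```python
# A minimal Flask REST API with a GET and a POST endpoint.
# The in-memory list stands in for a real database.
from flask import Flask, jsonify, request

app = Flask(__name__)
items = [{"id": 1, "name": "sample"}]

@app.route("/items", methods=["GET"])
def list_items():
    return jsonify(items)

@app.route("/items", methods=["POST"])
def create_item():
    data = request.get_json()
    data["id"] = len(items) + 1
    items.append(data)
    return jsonify(data), 201

if __name__ == "__main__":
    app.run(debug=True)
```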

AccioJob is conducting a Walk-In Hiring Drive with Global Consulting and Services for the position of Python Automation Engineer.
To apply, register and select your slot here: https://go.acciojob.com/b7BZZZ
Required Skills: Excel, Python, Pandas, NumPy, SQL
Eligibility:
- Degree: BTech./BE, MTech./ME, BCA, MCA, BSc., MSc
- Branch: All
- Graduation Year: 2023, 2024, 2025
Work Details:
- Work Location: Pune (Onsite)
- CTC: 3 LPA to 6 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Pune Centre
Further Rounds (for shortlisted candidates only):
Profile & Background Screening Round,
Technical Interview 1
Technical Interview 2
Tech+Managerial Round
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/b7BZZZ
Or, apply through our newly launched app: https://go.acciojob.com/4wvBDe
🚀 Hiring: Manual Tester
⭐ Experience: 5+ Years
📍 Location: Pan India
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Must-Have Skills:
✅5+ years of experience in Manual Testing
✅Solid experience in ETL, Database, and Report Testing
✅Strong expertise in SQL queries, RDBMS concepts, and DML/DDL operations
✅Working knowledge of BI tools such as Power BI
✅Ability to write effective Test Cases and Test Scenarios

Full Stack Developer (Node.js & React)
Location: Pune, India (Local or Ready to Relocate)
Employment Type: 6–8 Month Contract (Potential Conversion to FTE Based on Performance)
About the Role
We are seeking a highly skilled Full Stack Developer with expertise in Node.js and React to join our dynamic team in Pune. This role involves designing, developing, and deploying scalable web applications. You will collaborate with cross-functional teams to deliver high-impact solutions while adhering to best practices in coding, testing, and security.
Key Responsibilities
- Develop and maintain server-side applications using Node.js (Express/NestJS) and client-side interfaces with React.js (Redux/Hooks).
- Architect RESTful APIs and integrate with databases (SQL/NoSQL) and third-party services.
- Implement responsive UI/UX designs with modern front-end libraries (e.g., Material-UI, Tailwind CSS).
- Write unit/integration tests (Jest, Mocha, React Testing Library) and ensure code quality via CI/CD pipelines.
- Collaborate with product managers, designers, and QA engineers in an Agile environment.
- Troubleshoot performance bottlenecks and optimize applications for scalability.
- Document technical specifications and deployment processes.
Required Skills & Qualifications
- Experience: 3+ years in full-stack development with Node.js and React.
- Backend Proficiency:
- Strong knowledge of Node.js, Express, or NestJS.
- Experience with databases (PostgreSQL, MongoDB, Redis).
- API design (REST/GraphQL) and authentication (JWT/OAuth).
- Frontend Proficiency:
- Expertise in React.js (Functional Components, Hooks, Context API).
- State management (Redux, Zustand) and modern CSS frameworks.
- DevOps & Tools:
- Git, Docker, AWS/Azure, and CI/CD tools (Jenkins/GitHub Actions).
- Testing frameworks (Jest, Cypress, Mocha).
- Soft Skills:
- Problem-solving mindset and ability to work in a fast-paced environment.
- Excellent communication and collaboration skills.
- Location: Based in Pune or willing to relocate immediately.
Preferred Qualifications
- Experience with TypeScript, Next.js, or serverless architectures.
- Knowledge of microservices, message brokers (Kafka/RabbitMQ), or container orchestration (Kubernetes).
- Familiarity with Agile/Scrum methodologies.
- Contributions to open-source projects or a strong GitHub portfolio.
What We Offer
- Competitive Contract Compensation with timely payouts.
- Potential for FTE Conversion: Performance-based path to a full-time role.
- Hybrid Work Model: Flexible in-office (Pune) and remote options.
- Learning Opportunities: Access to cutting-edge tools and mentorship.
- Collaborative Environment: Work with industry experts on innovative projects.
Apply Now!
Ready to make an impact? Send your resume and GitHub/Portfolio links with the subject line:
"Full Stack Developer (Node/React) - Pune".
Local candidates or those relocating to Pune will be prioritized. Applications without portfolios will not be considered.
Equal Opportunity Employer
We celebrate diversity and are committed to creating an inclusive environment for all employees.

We are looking for a skilled Automation Anywhere Engineer with a strong background in RPA development, Python scripting, and experience with CoPilot integrations. The ideal candidate will play a key role in designing, developing, and implementing automation solutions to streamline business processes and improve operational efficiency.
Required Skills:
- 2–6 years of hands-on experience in Automation Anywhere (A2019 or higher).
- Strong programming skills in Python for automation and integration.
- Good understanding of RPA concepts, lifecycle, and best practices.
- Experience working with CoPilot (Microsoft Power Platform/AI CoPilot or equivalent).
- Knowledge of API integration and web services (REST/SOAP); a minimal sketch follows this list.
- Familiarity with process analysis and design techniques.
- Ability to write clean, reusable, and well-documented code.
- Strong problem-solving and communication skills.
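As a minimal, hypothetical sketch of the Python-side REST integration such automations call out to (the endpoint URL and token are invented; assumes the requests library):

```python
# A minimal REST integration helper of the kind an RPA bot might invoke.
# The service URL and bearer token are hypothetical placeholders.
import requests

def fetch_invoice(invoice_id: str) -> dict:
    """Pull one invoice record from a (hypothetical) REST service."""
    resp = requests.get(
        f"https://api.example.com/invoices/{invoice_id}",
        headers={"Authorization": "Bearer <token>"},
        timeout=30,
    )
    resp.raise_for_status()   # surface HTTP errors to the calling bot
    return resp.json()

print(fetch_invoice("INV-1001"))
```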
🌐 Job Title: Senior Azure Developer
🏢 Department: Digital Engineering
📍 Location: Pune (Work from Office)
📄 Job Type: Full-time
💼 Experience Required: 5+ years
💰 Compensation: Best in the industry
🔧 Roles & Responsibilities:
- Design, develop, and implement solutions using Microsoft Azure with .NET and other technologies.
- Collaborate with business analysts and end users to define system requirements.
- Work with QA teams to ensure solution integrity and functionality.
- Communicate frequently with stakeholders and team members to track progress and validate requirements.
- Evaluate and present technical solutions and recommendations.
- Provide technical mentoring and training to peers and junior developers.
💡 Technical Requirements:
- Minimum 2 years of hands-on development experience in:
- Azure Logic Apps
- Azure Service Bus
- Azure Web/API Apps
- Azure Functions
- Azure SQL Database / Cosmos DB
- Minimum 2 years’ experience in enterprise software development using .NET stack:
- REST APIs
- Web Applications
- Distributed Systems
- Familiarity with security best practices (e.g., OWASP).
- Knowledge of NoSQL data stores is an added advantage.
Job Title: PostgreSQL Database Administrator
Experience: 6–8 Years
Work Mode: Hybrid
Locations: Hyderabad / Pune
Joiners: Only immediate joiners & candidates who have completed notice period
Required Skills
- Strong hands-on experience in PostgreSQL administration (6+ years).
- Excellent understanding of SQL and query optimization techniques.
- Deep knowledge of database services, architecture, and internals.
- Experience in performance tuning at both DB and OS levels.
- Familiarity with DataGuard or similar high-availability solutions.
- Strong experience in job scheduling and automation.
- Comfortable with installing, configuring, and upgrading PostgreSQL.
- Basic to intermediate knowledge of Linux system administration.
- Hands-on experience with shell scripting for automation and monitoring tasks.
Key Responsibilities
- Administer and maintain PostgreSQL databases with 6+ years of hands-on experience.
- Write and optimize complex SQL queries for performance and scalability.
- Manage database storage structures and ensure optimal disk usage and performance.
- Monitor, analyze, and resolve database performance issues using tools and logs (a minimal sketch follows this list).
- Perform database tuning, configuration adjustments, and query optimization.
- Plan, schedule, and automate jobs using cron or other job scheduling tools at DB and OS levels.
- Install and upgrade PostgreSQL database software to new versions as required.
- Manage high availability and disaster recovery setups, including replication and DataGuard administration (or equivalent techniques).
- Perform regular database backups and restorations to ensure data integrity and availability.
- Apply security patches and updates on time.
- Collaborate with developers for schema design, stored procedures, and access privileges.
- Document configurations, processes, and performance tuning results.
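As a minimal illustration of the kind of monitoring script this role writes and schedules via cron (the connection string is a placeholder; assumes psycopg2):

```python
# Flag queries that have been running for more than five minutes,
# using pg_stat_activity. The DSN is a hypothetical placeholder.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=monitor")  # placeholder DSN
with conn.cursor() as cur:
    cur.execute("""
        SELECT pid, now() - query_start AS runtime, left(query, 60)
        FROM pg_stat_activity
        WHERE state = 'active'
          AND now() - query_start > interval '5 minutes'
    """)
    for pid, runtime, query in cur.fetchall():
        print(f"long-running pid={pid} runtime={runtime} query={query}")
conn.close()
```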
Senior Software Engineer – Java
Location: Pune (Hybrid – 3 days from office)
Experience: 8–15 Years
Domain: Information Technology (IT)
Joining: Immediate joiners only
Preference: Local candidates only (Pune-based)
Job Description:
We are looking for experienced and passionate Senior Java Engineers to join a high-performing development team. The role involves building and maintaining robust, scalable, and low-latency backend systems and microservices in a fast-paced, agile environment.
Key Responsibilities:
- Work within a high-velocity scrum team to deliver enterprise-grade software solutions.
- Architect and develop scalable end-to-end web applications and microservices.
- Collaborate with cross-functional teams to analyze requirements and deliver optimal technical solutions.
- Participate in code reviews, unit testing, and deployment.
- Mentor junior engineers while remaining hands-on with development tasks.
- Provide accurate estimates and support the team lead in facilitating development processes.
Mandatory Skills & Experience:
- 6–7+ years of enterprise-level Java development experience.
- Strong in Java 8 or higher (Java 11 preferred), including lambda expressions, the Stream API, and CompletableFuture.
- Minimum 4+ years working with Microservices, Spring Boot, and Hibernate.
- At least 3+ years of experience designing and developing RESTful APIs.
- Kafka – minimum 2 years’ hands-on experience in the current/most recent project.
- Solid experience with SQL.
- AWS – minimum 1.5 years of experience.
- Understanding of CI/CD pipelines and deployment processes.
- Exposure to asynchronous programming, multithreading, and performance tuning.
- Experience working in at least one Fintech domain project (mandatory).
Nice to Have:
- Exposure to Golang or Rust.
- Experience with any of the following tools: MongoDB, Jenkins, Sonar, Oracle DB, Drools, Adobe AEM, Elasticsearch/Solr/Algolia, Spark.
- Strong systems design and data modeling capabilities.
- Experience in payments or asset/wealth management domain.
- Familiarity with rules engines and CMS/search platforms.
Candidate Profile:
- Strong communication and client-facing skills.
- Proactive, self-driven, and collaborative mindset.
- Passionate about clean code and quality deliverables.
- Prior experience in building and deploying multiple products in production.
Note: Only candidates who are based in Pune and can join immediately will be considered.
🚀 Hiring: Postgres DBA at Deqode
⭐ Experience: 6+ Years
📍 Location: Pune & Hyderabad
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Looking for an experienced Postgres DBA with:-
✅ 6+ years in Postgres & strong SQL skills
✅ Good understanding of database services & storage management
✅ Performance tuning & monitoring expertise
✅ Knowledge of Dataguard admin, backups, upgrades
✅ Basic Linux admin & shell scripting

Job Summary:
We are looking for an experienced Full Stack Developer with expertise in Angular 15+, PHP, Node.js and SQL databases. As a key member of our engineering team, you will design, develop, and maintain both the front-end and back-end of our applications. If you are passionate about building scalable, high-performance web solutions and have experience with cloud technologies, we encourage you to apply.
Key Responsibilities:
- Front-End Development:
- Develop responsive, high-performance web applications using Angular 15+.
- Ensure a seamless and engaging user experience by collaborating closely with UX/UI designers.
- Implement modern web technologies and best practices for building dynamic, scalable applications.
- Back-End Development:
- Build and maintain PHP-based server-side applications, ensuring reliability, security, and scalability.
- Work on backend systems using Node.js and PHP, maintaining seamless integration, performance, and security across multiple services.
- Develop and integrate RESTful APIs to support front-end functionality.
- Design and optimize database schemas and queries for SQL databases (e.g., MySQL, PostgreSQL).
- Cloud and Infrastructure Integration:
- Integrate and manage cloud services, including AWS Lambda, AWS SQS, Firebase, and Google Cloud Tasks.
- Work with the team to ensure efficient cloud-based deployments and architecture optimization.
- Collaboration and Code Quality:
- Collaborate with cross-functional teams to define and implement software requirements.
- Ensure code quality and maintainability by conducting code reviews and following industry best practices.
- Write unit and integration tests to ensure software reliability and robustness.
- Continuous Improvement:
- Stay up to date with emerging technologies and trends in web development and cloud services.
- Identify and resolve performance bottlenecks and improve application performance.
Required Skills and Qualifications:
- 4-5 years of professional experience in full-stack web development.
- Proficiency in Angular 15+, PHP, Node.js and SQL databases (MySQL, PostgreSQL, etc.).
- Strong understanding of web application architecture, APIs, and cloud integration.
- Experience with version control tools like Git.
- Solid understanding of front-end build tools and optimization techniques.
Preferred Skills:
- Experience with Joomla 3+ and its framework.
- Familiarity with cloud platforms such as AWS Lambda, Firebase, and Google Cloud Tasks.
- Knowledge of other cloud services and serverless architectures.
- Experience with Cypress for end-to-end testing and test automation.
Education:
- Bachelor’s degree in Computer Science, Engineering, or related field, or equivalent work experience.
Job Overview
We are looking for a detail-oriented and skilled QA Engineer with expertise in Cypress to join our Quality Assurance team. In this role, you will be responsible for creating and maintaining automated test scripts to ensure the stability and performance of our web applications. You’ll work closely with developers, product managers, and other QA professionals to identify issues early and help deliver a high-quality user experience.
You should have a strong background in test automation, excellent analytical skills, and a passion for improving software quality through efficient testing practices.
Key Responsibilities
- Develop, maintain, and execute automated test cases using Cypress.
- Design robust test strategies and plans based on product requirements and user stories.
- Work with cross-functional teams to identify test requirements and ensure proper coverage.
- Perform regression, integration, smoke, and exploratory testing as needed.
- Report and track defects, and work with developers to resolve issues quickly.
- Collaborate in Agile/Scrum development cycles and contribute to sprint planning and reviews.
- Continuously improve testing tools, processes, and best practices.
- Optimize test scripts for performance, reliability, and maintainability.
Required Skills & Qualifications
- Hands-on experience with Cypress and JavaScript-based test automation.
- Strong understanding of QA methodologies, tools, and processes.
- Experience in testing web applications across multiple browsers and devices.
- Familiarity with REST APIs and tools like Postman or Swagger.
- Experience with version control systems like Git.
- Knowledge of CI/CD pipelines and integrating automated tests (e.g., GitHub Actions, Jenkins).
- Excellent analytical and problem-solving skills.
- Strong written and verbal communication.
Preferred Qualifications
- Experience with other automation tools (e.g., Selenium, Playwright) is a plus.
- Familiarity with performance testing or security testing.
- Background in Agile or Scrum methodologies.
- Basic understanding of DevOps practices.

Hybrid work mode
(Azure) EDW experience: loading star-schema data warehouses using framework architectures, including loading Type 2 dimensions (a minimal sketch follows), and ingesting data from various sources (structured and semi-structured), with hands-on experience ingesting via APIs into lakehouse architectures.
Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Gen2 Storage, SQL (expert), Python (intermediate), Azure cloud services knowledge, data analysis (SQL), data warehousing, documentation (BRD, FRD, user story creation).
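As a rough sketch of a Type 2 dimension load of the kind described (assuming Databricks with Delta Lake, a staging table that holds only new or changed records, and hypothetical table/column names):

```python
# A minimal Type 2 dimension load: close out the current row for each
# changed customer, then append the new versions. Assumes the staging
# table contains only new or changed records; all names are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
updates = spark.table("staging.customer_updates")
dim = DeltaTable.forName(spark, "dw.dim_customer")

# Step 1: expire the currently active row for each incoming customer.
(dim.alias("d")
    .merge(updates.alias("u"),
           "d.customer_id = u.customer_id AND d.is_current = true")
    .whenMatchedUpdate(set={"is_current": "false",
                            "end_date": "current_date()"})
    .execute())

# Step 2: append the incoming records as the new current versions.
(updates
    .withColumn("start_date", F.current_date())
    .withColumn("end_date", F.lit(None).cast("date"))
    .withColumn("is_current", F.lit(True))
    .write.format("delta").mode("append").saveAsTable("dw.dim_customer"))
```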
- A minimum of 4-10 years of experience in data integration/orchestration services, service architecture, and providing data-driven solutions for client requirements
- Experience with Microsoft Azure cloud and Snowflake SQL, and database query/performance tuning.
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts; experience with ETL tools such as Talend Cloud Data Integration is a must
- Exposure to the financial domain knowledge is considered a plus.
- Cloud managed services such as GitHub source control and MS Azure/DevOps are considered a plus.
- Prior experience with State Street and Charles River Development (CRD) considered a plus.
- Experience in tools such as Visio, PowerPoint, Excel.
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus.
- Strong SQL knowledge and debugging skills are a must.

.NET + Angular Full Stack Developer (4–5 Years Experience)
Location: Pune/Remote
Experience Required: 4 to 5 years
Communication: Fluent English (verbal & written)
Technology: .NET, Angular
Only immediate joiners who can start on 21st July should apply.
Job Overview
We are seeking a skilled and experienced Full Stack Developer with strong expertise in .NET (C#) and Angular to join our dynamic team in Pune. The ideal candidate will have hands-on experience across the full development stack, a strong understanding of relational databases and SQL, and the ability to work independently with clients. Experience in microservices architecture is a plus.
Key Responsibilities
- Design, develop, and maintain modern web applications using .NET Core / .NET Framework and Angular
- Write clean, scalable, and maintainable code for both backend and frontend components
- Interact directly with clients for requirement gathering, demos, sprint planning, and issue resolution
- Work closely with designers, QA, and other developers to ensure high-quality product delivery
- Perform regular code reviews, ensure adherence to coding standards, and mentor junior developers if needed
- Troubleshoot and debug application issues and provide timely solutions
- Participate in discussions on architecture, design patterns, and technical best practices
Must-Have Skills
✅ Strong hands-on experience with .NET Core / .NET Framework (Web API, MVC)
✅ Proficiency in Angular (Component-based architecture, RxJS, State Management)
✅ Solid understanding of RDBMS and SQL (preferably with SQL Server)
✅ Familiarity with Entity Framework or Dapper
✅ Strong knowledge of RESTful API design and integration
✅ Version control using Git
✅ Excellent verbal and written communication skills
✅ Ability to work in a client-facing role and handle discussions independently
Good-to-Have / Optional Skills
Understanding or experience in Microservices Architecture
Exposure to CI/CD pipelines, unit testing frameworks, and cloud environments (e.g., Azure or AWS)


Job title: Python Developer
Experience: 4 to 6 years
Location: Pune/Mumbai/Bengaluru
Please find the job description below.
Requirements:
- Proven experience as a Python Developer
- Strong knowledge of core Python and PySpark concepts
- Experience with web frameworks such as Django or Flask
- Good exposure to any cloud platform (GCP Preferred)
- CI/CD exposure required
- Solid understanding of RESTful APIs and how to build them
- Experience working with databases like Oracle DB and MySQL
- Ability to write efficient SQL queries and optimize database performance
- Strong problem-solving skills and attention to detail
- Strong SQL programming (stored procedures, functions)
- Excellent communication and interpersonal skills
Roles and Responsibilities
- Design, develop, and maintain data pipelines and ETL processes using PySpark (a minimal sketch follows this list)
- Work closely with data scientists and analysts to provide them with clean, structured data.
- Optimize data storage and retrieval for performance and scalability.
- Collaborate with cross-functional teams to gather data requirements.
- Ensure data quality and integrity through data validation and cleansing processes.
- Monitor and troubleshoot data-related issues to ensure data pipeline reliability.
- Stay up to date with industry best practices and emerging technologies in data engineering.
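As a minimal, generic sketch of the PySpark pipeline work described (paths and column names are hypothetical):

```python
# A minimal PySpark ETL: read raw data, clean and validate it,
# aggregate, and write curated output. All names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw = spark.read.option("header", True).csv("/data/raw/orders.csv")

clean = (raw
    .dropDuplicates(["order_id"])
    .filter(F.col("amount").isNotNull())                 # basic validation
    .withColumn("amount", F.col("amount").cast("double")))

daily = (clean.groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"),
         F.count("order_id").alias("orders")))

daily.write.mode("overwrite").parquet("/data/curated/daily_orders")
```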
AccioJob is conducting a Walk-In hiring drive in partnership with MindStix to fill the SDE 1 position at their Pune office.
Apply, Register, and select your Slot here: https://go.acciojob.com/hLMAv4
Job Description:
- Role: SDE 1
- Work Location: Pune
- CTC: 5 LPA - 6 LPA
Eligibility Criteria:
- Degree: B.Tech, BE, M.Tech, MCA, BCA
- Branch: Open to all streams
- Graduation Year: 2024 and 2025
- Notice Period: Candidates should have a notice period of 10 days or less
Evaluation Process:
- Offline Assessment at AccioJob Pune Skill Centre
- Company-side Process: In-person Assignment, 2 Technical Rounds, 1 HR Round
Note: Please bring your laptop and microphone for the test.
Register Here: https://go.acciojob.com/hLMAv4

Job Purpose
Responsible for managing end-to-end database operations, ensuring data accuracy, integrity, and security across systems. The position plays a key role in driving data reliability, availability, and compliance with operational standards.
Key Responsibilities:
- Collate audit reports from the QA team and structure data in accordance with Standard Operating Procedures (SOP).
- Perform data transformation and validation for accuracy and consistency.
- Upload processed datasets into SQL Server using SSIS packages.
- Monitor and optimize database performance, identifying and resolving bottlenecks.
- Perform regular backups, restorations, and recovery checks to ensure data continuity.
- Manage user access and implement robust database security policies.
- Oversee database storage allocation and utilization.
- Conduct routine maintenance and support incident management, including root cause analysis and resolution.
- Design and implement scalable database solutions and architecture.
- Create and maintain stored procedures, views, and other database components.
- Optimize SQL queries for performance and scalability.
- Execute ETL processes and support seamless integration of multiple data sources.
- Maintain data integrity and quality through validation and cleansing routines.
- Collaborate with cross-functional teams on data solutions and project deliverables.
Educational Qualification: Any Graduate
Required Skills & Qualifications:
- Proven experience with SQL Server or similar relational database platforms.
- Strong expertise in SSIS, ETL processes, and data warehousing.
- Proficiency in SQL/T-SQL, including scripting, performance tuning, and query optimization.
- Experience in database security, user role management, and access control.
- Familiarity with backup/recovery strategies and database maintenance best practices.
- Strong analytical skills with experience working with large and complex datasets.
- Solid understanding of data modeling, normalization, and schema design.
- Knowledge of incident and change management processes.
- Excellent communication and collaboration skills.
- Experience with Python for data manipulation and automation is a strong plus.
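As a minimal, hypothetical sketch of the Python data manipulation and SQL Server load this role touches (file, table, and connection details are invented; assumes pandas, SQLAlchemy, and the pyodbc driver, with openpyxl for Excel):

```python
# Validate a QA audit extract with pandas, then load it into SQL Server.
# File name, schema, and connection string are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

df = pd.read_excel("audit_report.xlsx")            # collated QA extract

# Simple validation/cleansing before load.
df = df.dropna(subset=["case_id"]).drop_duplicates("case_id")
assert df["score"].between(0, 100).all(), "score out of range"

engine = create_engine(
    "mssql+pyodbc://user:pass@server/db?driver=ODBC+Driver+17+for+SQL+Server"
)
df.to_sql("qa_audit", engine, schema="dbo", if_exists="append", index=False)
```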
Job Title : Ab Initio Developer
Location : Pune
Experience : 5+ Years
Notice Period : Immediate Joiners Only
Job Summary :
We are looking for an experienced Ab Initio Developer to join our team in Pune.
The ideal candidate should have strong hands-on experience in Ab Initio development, data integration, and Unix scripting, with a solid understanding of SDLC and data warehousing concepts.
Mandatory Skills :
Ab Initio (GDE, EME, graphs, parameters), SQL/Teradata, Data Warehousing, Unix Shell Scripting, Data Integration, DB Load/Unload Utilities.
Key Responsibilities :
- Design and develop Ab Initio graphs/plans/sandboxes/projects using GDE and EME.
- Manage and configure standard environment parameters and multifile systems.
- Perform complex data integration from multiple source and target systems with business rule transformations.
- Utilize DB Load/Unload Utilities effectively for optimized performance.
- Implement generic graphs, ensure proper use of parallelism, and maintain project parameters.
- Work in a data warehouse environment involving SDLC, ETL processes, and data analysis.
- Write and maintain Unix Shell Scripts and use utilities like sed, awk, etc.
- Optimize and troubleshoot performance issues in Ab Initio jobs.
Mandatory Skills :
- Strong expertise in Ab Initio (GDE, EME, graphs, parallelism, DB utilities, multifile systems).
- Experience with SQL and databases like SQL Server or Teradata.
- Proficiency in Unix Shell Scripting and Unix utilities.
- Data integration and ETL from varied source/target systems.
Good to Have :
- Experience in Ab Initio and AWS integration.
- Knowledge of Message Queues and Continuous Graphs.
- Exposure to Metadata Hub.
- Familiarity with Big Data tools such as Hive, Impala.
- Understanding of job scheduling tools.
The Java developer will be responsible for many duties throughout the development lifecycle of applications, from concept and design right through to testing.
Duties/Responsibilities:
- To support and maintain existing Java code base, debug the application
- To analyse user and business requirements and design and implement appropriate solutions
- To design and code programs following in-house standards and good design principles
- To ensure that all programs are documented to the company standards
- To create unit test plans and perform unit testing of the programs
- To provide advice and guidance to other members of the team
Required Skills/Abilities:
- Hands on experience in designing and developing applications using Java EE platforms
- Object Oriented analysis and design using common design patterns
- Good knowledge of Relational Databases, SQL and ORM technologies (JPA2, Hibernate)
- Experience in the Spring Framework
- Experience in developing web applications using at least one popular web framework (JSF, Wicket, GWT, Spring MVC)
- Experience in RESTFul webservices
- Experience with test-driven development
- Exposure to portal/mobility development - Desired
- Exposure to any middleware solutions such as MQ, Oracle Fusion Middleware (WebLogic), WebSphere, or open source
We’re hiring a Maximo Technical Lead with hands-on experience in Maximo 7.6 or higher, Java, and Oracle DB. The role involves leading Maximo implementations, upgrades, and support projects, especially for manufacturing clients.
Key Skills:
IBM Maximo (MAS 8.x preferred)
Java, Oracle 12c+, WebSphere
Maximo Mobile / Asset Management / Cognos / BIRT
SQL, scripting, troubleshooting
Experience leading tech teams and working with clients
Good to Have:
IBM Maximo Certification
MES/Infrastructure planning knowledge
Experience with Rail or Manufacturing domain
AccioJob is conducting an offline hiring drive in partnership with MindStix to fill the SDE 1 position at their Pune office.
Apply, Register, and select your Slot here:
https://go.acciojob.com/Hb8ATw
Job Description:
- Role: SDE 1
- Work Location: Pune
- CTC: 5 LPA - 6 LPA
Eligibility Criteria:
- Degree: B.Tech, BE, M.Tech, MCA, BCA
- Branch: Open to all streams
- Graduation Year: 2024 and 2025
- Notice Period: Candidates should have a notice period of 10 days or less
Evaluation Process:
- Offline Assessment at AccioJob Pune Skill Centre
- Company-side Process: In-person Assignment, 2 Technical Rounds, 1 HR Round
Note: Please bring your laptop and microphone for the test.
Register Here: https://go.acciojob.com/Hb8ATw
Job Title : Oracle Analytics Cloud (OAC) / Fusion Data Intelligence (FDI) Specialist
Experience : 3 to 8 years
Location : All USI locations – Hyderabad, Bengaluru, Mumbai, Gurugram (preferred) and Pune, Chennai, Kolkata
Work Mode : Hybrid Only (2-3 days from office or all 5 days from office)
Mandatory Skills : Oracle Analytics Cloud (OAC), Fusion Data Intelligence (FDI), RPD, OAC Reports, Data Visualizations, SQL, PL/SQL, Oracle Databases, ODI, Oracle Cloud Infrastructure (OCI), DevOps tools, Agile methodology.
Key Responsibilities :
- Design, develop, and maintain solutions using Oracle Analytics Cloud (OAC).
- Build and optimize complex RPD models, OAC reports, and data visualizations.
- Utilize SQL and PL/SQL for data querying and performance optimization.
- Develop and manage applications hosted on Oracle Cloud Infrastructure (OCI).
- Support Oracle Cloud migrations, OBIEE upgrades, and integration projects.
- Collaborate with teams using the ODI (Oracle Data Integrator) tool for ETL processes.
- Implement cloud scripting using cURL for Oracle Cloud automation.
- Contribute to the design and implementation of Business Continuity and Disaster Recovery strategies for cloud applications.
- Participate in Agile development processes and DevOps practices including CI/CD and deployment orchestration.
Required Skills :
- Strong hands-on expertise in Oracle Analytics Cloud (OAC) and/or Fusion Data Intelligence (FDI).
- Deep understanding of data modeling, reporting, and visualization techniques.
- Proficiency in SQL, PL/SQL, and relational databases on Oracle.
- Familiarity with DevOps tools, version control, and deployment automation.
- Working knowledge of Oracle Cloud services, scripting, and monitoring.
Good to Have :
- Prior experience in OBIEE to OAC migrations.
- Exposure to data security models and cloud performance tuning.
- Certification in Oracle Cloud-related technologies.

Job Title : IBM Sterling Integrator Developer
Experience : 3 to 5 Years
Locations : Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, Pune
Employment Type : Full-Time
Job Description :
We are looking for a skilled IBM Sterling Integrator Developer with 3–5 years of experience to join our team across multiple locations.
The ideal candidate should have strong expertise in IBM Sterling and integration, along with scripting and database proficiency.
Key Responsibilities :
- Develop, configure, and maintain IBM Sterling Integrator solutions.
- Design and implement integration solutions using IBM Sterling.
- Collaborate with cross-functional teams to gather requirements and provide solutions.
- Work with custom languages and scripting to enhance and automate integration processes.
- Ensure optimal performance and security of integration systems.
Must-Have Skills :
- Hands-on experience with IBM Sterling Integrator and associated integration tools.
- Proficiency in at least one custom scripting language.
- Strong command over Shell scripting, Python, and SQL (mandatory).
- Good understanding of EDI standards and protocols is a plus.
Interview Process :
- 2 Rounds of Technical Interviews.
Additional Information :
- Open to candidates from Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, and Pune.

AccioJob is conducting an offline hiring drive in partnership with Our Partner Company to hire Junior Business/Data Analysts for an internship with a Pre-Placement Offer (PPO) opportunity.
Apply, Register and select your Slot here: https://go.acciojob.com/69d3Wd
Job Description:
- Role: Junior Business/Data Analyst (Internship + PPO)
- Work Location: Hyderabad
- Internship Stipend: 15,000 - 25,000/month
- Internship Duration: 3 months
- CTC on PPO: 5 LPA - 6 LPA
Eligibility Criteria:
- Degree: Open to all academic backgrounds
- Graduation Year: 2023, 2024, 2025
Required Skills:
- Proficiency in SQL, Excel, Power BI, and basic Python
- Strong analytical mindset and interest in solving business problems with data
Hiring Process:
- Offline Assessment at AccioJob Skill Centres (Hyderabad, Pune, Noida)
- 1 Assignment + 2 Technical Interviews (Virtual; In-person for Hyderabad candidates)
Note: Please bring your laptop and earphones for the test.
Register Here: https://go.acciojob.com/69d3Wd

Job Summary:
As an AWS Data Engineer, you will be responsible for designing, developing, and maintaining scalable, high-performance data pipelines using AWS services. With 6+ years of experience, you’ll collaborate closely with data architects, analysts, and business stakeholders to build reliable, secure, and cost-efficient data infrastructure across the organization.
Key Responsibilities:
- Design, develop, and manage scalable data pipelines using AWS Glue, Lambda, and other serverless technologies
- Implement ETL workflows and transformation logic using PySpark and Python on AWS Glue (a minimal sketch follows this list)
- Leverage AWS Redshift for warehousing, performance tuning, and large-scale data queries
- Work with AWS DMS and RDS for database integration and migration
- Optimize data flows and system performance for speed and cost-effectiveness
- Deploy and manage infrastructure using AWS CloudFormation templates
- Collaborate with cross-functional teams to gather requirements and build robust data solutions
- Ensure data integrity, quality, and security across all systems and processes
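As a rough sketch of a Glue job of the kind described (the catalog database, table, and S3 bucket are hypothetical):

```python
# A minimal AWS Glue job: read from the Glue Data Catalog, transform
# with PySpark, write Parquet to S3. All names are hypothetical.
from awsglue.context import GlueContext
from pyspark.context import SparkContext
from pyspark.sql import functions as F

glue = GlueContext(SparkContext.getOrCreate())

events = glue.create_dynamic_frame.from_catalog(
    database="raw", table_name="events").toDF()

daily = (events
    .filter(F.col("event_type") == "purchase")
    .groupBy("event_date")
    .agg(F.sum("amount").alias("revenue")))

daily.write.mode("overwrite").parquet("s3://my-curated-bucket/daily_revenue/")
```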
Required Skills & Experience:
- 6+ years of experience in Data Engineering with strong AWS expertise
- Proficient in Python and PySpark for data processing and ETL development
- Hands-on experience with AWS Glue, Lambda, DMS, RDS, and Redshift
- Strong SQL skills for building complex queries and performing data analysis
- Familiarity with AWS CloudFormation and infrastructure as code principles
- Good understanding of serverless architecture and cost-optimized design
- Ability to write clean, modular, and maintainable code
- Strong analytical thinking and problem-solving skills
Skill Name: ETL Automation Testing
Location: Bangalore, Chennai and Pune
Experience: 5+ Years
Required:
Experience in ETL Automation Testing
Strong experience in PySpark.
Job Title : Data Engineer – Snowflake Expert
Location : Pune (Onsite)
Experience : 10+ Years
Employment Type : Contractual
Mandatory Skills : Snowflake, Advanced SQL, ETL/ELT (Snowpipe, Tasks, Streams), Data Modeling, Performance Tuning, Python, Cloud (preferably Azure), Security & Data Governance.
Job Summary :
We are seeking a seasoned Data Engineer with deep expertise in Snowflake to design, build, and maintain scalable data solutions.
The ideal candidate will have a strong background in data modeling, ETL/ELT, SQL optimization, and cloud data warehousing principles, with a passion for leveraging Snowflake to drive business insights.
Responsibilities :
- Collaborate with data teams to optimize and enhance data pipelines and models on Snowflake.
- Design and implement scalable ELT pipelines with performance and cost-efficiency in mind (a minimal sketch follows this list).
- Ensure high data quality, security, and adherence to governance frameworks.
- Conduct code reviews and align development with best practices.
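As a rough, hypothetical sketch of the Streams/Tasks style of ELT named in the mandatory skills (object names and credentials are placeholders; assumes the snowflake-connector-python package):

```python
# A stream captures changes on a raw table; a scheduled task loads them
# downstream when the stream has data. All names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(user="...", password="...", account="...")
cur = conn.cursor()

cur.execute("CREATE OR REPLACE STREAM orders_stream ON TABLE raw.orders")
cur.execute("""
    CREATE OR REPLACE TASK load_orders
      WAREHOUSE = etl_wh
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO curated.orders
      SELECT order_id, amount, order_date FROM orders_stream
""")
cur.execute("ALTER TASK load_orders RESUME")  # tasks start suspended
```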
Qualifications :
- Bachelor’s in Computer Science, Data Science, IT, or related field.
- Snowflake certifications (Pro/Architect) preferred.

Required Skills:
- Hands-on experience with Databricks, PySpark
- Proficiency in SQL, Python, and Spark.
- Understanding of data warehousing concepts and data modeling.
- Experience with CI/CD pipelines and version control (e.g., Git).
- Fundamental knowledge of any cloud services, preferably Azure or GCP.
Good to Have:
- BigQuery
- Experience with performance tuning and data governance.
Competitive Salary
About Solidatus
At Solidatus, we empower organizations to connect and visualize their data relationships, making it easier to identify, access, and understand their data. Our metadata management technology helps businesses establish a sustainable data foundation, ensuring they meet regulatory requirements, drive digital transformation, and unlock valuable insights.
We’re experiencing rapid growth—backed by HSBC, Citi, and AlbionVC, we secured £14 million in Series A funding in 2021. Our achievements include recognition in the Deloitte UK Technology Fast 50, multiple A-Team Innovation Awards, and a top 1% place-to-work ranking from The Financial Technologist.
Now is an exciting time to join us as we expand internationally and continue shaping the future of data management.
About the Engineering Team
Engineering is the heart of Solidatus. Our team of world-class engineers, drawn from outstanding computer science and technical backgrounds, plays a critical role in crafting the powerful, elegant solutions that set us apart. We thrive on solving challenging visualization and data management problems, building technology that delights users and drives real-world impact for global enterprises.
As Solidatus expands its footprint, we are scaling our capabilities with a focus on building world-class connectors and integrations to extend the reach of our platform. Our engineers are trusted with the freedom to explore, innovate, and shape the product’s future — all while working in a collaborative, high-impact environment. Here, your code doesn’t just ship — it empowers some of the world's largest and most complex organizations to achieve their data ambitions.
Who We Are & What You’ll Do
Join our Data Integration team and help shape the way data flows!
Your Mission:
To expand and refine our suite of out-of-the-box integrations, using our powerful API and SDK to bring in metadata for visualisation from a vast range of sources including databases with diverse SQL dialects.
But that is just the beginning. At our core, we are problem-solvers and innovators. You’ll have the chance to:
- Design intuitive layouts representing the flow of data across complex deployments of diverse technologies
- Design and optimize API connectivity and parsers reading metadata from source systems
- Explore new paradigms for representing data lineage
- Enhance our data ingestion capabilities to handle massive volumes of data
- Dig deep into data challenges to build smarter, more scalable solutions
Beyond engineering, you’ll collaborate with users, troubleshoot tricky issues, streamline development workflows, and contribute to a culture of continuous improvement.
What We’re Looking For
- We don’t believe in sticking to a single tech stack just for the sake of it. We’re engineers first, and we pick the best tools for the job. More than ticking off a checklist, we value mindset, curiosity, and problem-solving skills.
- You’re quick to learn and love diving into new technologies
- You push for excellence and aren’t satisfied with “just okay”
- You can break down complex topics in a way that anyone can understand
- You should have 6–8 years of proven experience in developing and delivering high-quality, scalable software solutions
- You should be a strong self-starter with the ability to take ownership of tasks and drive them to completion with minimal supervision.
- You should be able to mentor junior developers, perform code reviews, and ensure adherence to best practices in software engineering.
Tech & Skills We’d Love to See
Must-have:
- Strong hands-on experience with Java, Spring Boot RESTful APIs, and Node.js
- Solid knowledge of databases, SQL dialects, and data structures
Nice-to-have:
- Experience with C#, ASP.NET Core, TypeScript, React.js, or similar frameworks
- Bonus points for data experience—we love data wizards
If you’re passionate about engineering high-impact solutions, playing with cutting-edge tech, and making data work smarter, we’d love to have you on board!

About the Role:
We are seeking an experienced Tech Lead with 8+ years of hands-on experience in backend development using .NET or Java. The ideal candidate will have strong leadership capabilities, the ability to mentor a team, and a solid technical foundation to deliver scalable and maintainable backend systems. Prior experience in the healthcare domain is a plus.
Key Responsibilities:
- Lead a team of backend developers to deliver product and project-based solutions.
- Oversee the development and implementation of backend services and APIs.
- Collaborate with cross-functional teams including frontend, QA, DevOps, and Product.
- Perform code reviews and enforce best practices in coding and design.
- Ensure performance, quality, and responsiveness of backend applications.
- Participate in sprint planning, estimations, and retrospectives.
- Troubleshoot, analyze, and optimize application performance.
Required Skills:
- 8+ years of backend development experience in .NET or Java.
- Proven experience as a Tech Lead managing development teams.
- Strong understanding of REST APIs, microservices, and software design patterns.
- Familiarity with SQL and NoSQL databases.
- Good knowledge of Agile/Scrum methodologies.
Preferred Skills:
- Experience in the healthcare domain.
- Exposure to frontend frameworks like Angular or React.
- Understanding of cloud platforms such as Azure/AWS/GCP.
- CI/CD and DevOps practices.
What We Offer:
- Collaborative and value-driven culture.
- Projects with real-world impact in critical domains.
- Flexibility and autonomy in work.
- Continuous learning and growth opportunities.

What You’ll Do:
As a Data Scientist, you will work closely across DeepIntent Analytics teams located in New York City, India, and Bosnia. The role will support internal and external business partners in defining patient and provider audiences, and generating analyses and insights related to measurement of campaign outcomes, Rx, patient journey, and supporting evolution of DeepIntent product suite. Activities in this position include creating and scoring audiences, reading campaign results, analyzing medical claims, clinical, demographic and clickstream data, performing analysis and creating actionable insights, summarizing, and presenting results and recommended actions to internal stakeholders and external clients, as needed.
- Explore ways to create better audiences
- Analyze medical claims, clinical, demographic and clickstream data to produce and present actionable insights
- Explore ways of using inference, statistical, and machine learning techniques to improve the performance of existing algorithms and decision heuristics (a minimal sketch follows this list)
- Design and deploy new iterations of production-level code
- Contribute posts to our upcoming technical blog
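As a rough illustration of the boosting-based scoring work described (synthetic data stands in for claims-derived features; this is not DeepIntent's actual pipeline):

```python
# A minimal XGBoost audience-scoring sketch on synthetic stand-in data.
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
X = rng.normal(size=(5000, 20))                  # stand-in patient features
y = (X[:, :3].sum(axis=1) + rng.normal(size=5000) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=7)
model = xgb.XGBClassifier(n_estimators=200, max_depth=4,
                          eval_metric="logloss")
model.fit(X_tr, y_tr)

scores = model.predict_proba(X_te)[:, 1]         # audience propensity scores
print("holdout AUC:", round(roc_auc_score(y_te, scores), 3))
```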
Who You Are:
- Bachelor’s degree in a STEM field, such as Statistics, Mathematics, Engineering, Biostatistics, Econometrics, Economics, Finance, Operations Research, or Data Science. Graduate degree is strongly preferred
- 3+ years of working experience as Data Analyst, Data Engineer, Data Scientist in digital marketing, consumer advertisement, telecom, or other areas requiring customer level predictive analytics
- Background in either data engineering or analytics
- Hands on technical experience is required, proficiency in performing statistical analysis in Python, including relevant libraries, required
- You have an advanced understanding of the ad-tech ecosystem and digital marketing and advertising data and campaigns, or familiarity with the US healthcare patient and provider systems (e.g., medical claims, medications)
- Experience in programmatic/DSP-related marketing predictive analytics, audience segmentation, or audience behaviour analysis, or medical/healthcare experience
- You have varied and hands-on predictive machine learning experience (deep learning, boosting algorithms, inference)
- Familiarity with data science tools such as XGBoost, PyTorch, and Jupyter, and strong LLM user experience (developer/API experience is a plus)
- You are interested in translating complex quantitative results into meaningful findings and interpretable deliverables, and communicating with less technical audiences orally and in writing
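The sketch below illustrates the flavor of the audience-scoring work described above. It is a hypothetical example, not DeepIntent's actual pipeline; the file name, feature columns, and model settings are all placeholders.

    # Hypothetical audience-scoring sketch with a boosted-tree model.
    import pandas as pd
    from xgboost import XGBClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    df = pd.read_csv("claims_features.csv")      # placeholder claims-derived features
    X = df.drop(columns=["in_audience"])         # e.g., visit counts, Rx history flags
    y = df["in_audience"]                        # 1 = belongs in the target audience

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)
    model = XGBClassifier(n_estimators=200, max_depth=4, eval_metric="logloss")
    model.fit(X_train, y_train)

    scores = model.predict_proba(X_test)[:, 1]   # audience-inclusion propensity
    print("AUC:", roc_auc_score(y_test, scores))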
Job Summary:
Seeking a seasoned SQL + ETL Developer with 4+ years of experience in managing large-scale datasets and cloud-based data pipelines. The ideal candidate is hands-on with MySQL, PySpark, AWS Glue, and ETL workflows, with proven expertise in AWS migration and performance optimization.
Key Responsibilities:
- Develop and optimize complex SQL queries and stored procedures to handle large datasets (100+ million records).
- Build and maintain scalable ETL pipelines using AWS Glue and PySpark (a minimal sketch follows this list).
- Work on data migration tasks in AWS environments.
- Monitor and improve database performance; automate key performance indicators and reports.
- Collaborate with cross-functional teams to support data integration and delivery requirements.
- Write shell scripts for automation and manage ETL jobs efficiently.
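As a rough sketch of the Glue/PySpark work above, here is a minimal Glue job skeleton. The catalog database, table, dropped field, and S3 path are placeholders, not details from this posting.

    # Minimal AWS Glue job skeleton (PySpark); all names and paths are placeholders.
    import sys
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext.getOrCreate())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read a large MySQL-backed table registered in the Glue Data Catalog
    src = glue_context.create_dynamic_frame.from_catalog(
        database="sales_db", table_name="orders")

    # Drop a field, then write partitioned Parquet to S3
    out = src.drop_fields(["internal_note"])
    glue_context.write_dynamic_frame.from_options(
        frame=out,
        connection_type="s3",
        connection_options={"path": "s3://example-bucket/orders/",
                            "partitionKeys": ["order_date"]},
        format="parquet")
    job.commit()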
Required Skills:
- Strong experience with MySQL, complex SQL queries, and stored procedures.
- Hands-on experience with AWS Glue, PySpark, and ETL processes.
- Good understanding of AWS ecosystem and migration strategies.
- Proficiency in shell scripting.
- Strong communication and collaboration skills.
Nice to Have:
- Working knowledge of Python.
- Experience with AWS RDS.

Profile: AWS Data Engineer
Mode- Hybrid
Experience- 5-7 years
Locations - Bengaluru, Pune, Chennai, Mumbai, Gurugram
Roles and Responsibilities
- Design and maintain ETL pipelines using AWS Glue and Python/PySpark
- Optimize SQL queries for Redshift and Athena
- Develop Lambda functions for serverless data processing (see the sketch after this list)
- Configure AWS DMS for database migration and replication
- Implement infrastructure as code with CloudFormation
- Build optimized data models for performance
- Manage RDS databases and AWS service integrations
- Troubleshoot and improve data processing efficiency
- Gather requirements from business stakeholders
- Implement data quality checks and validation
- Document data pipelines and architecture
- Monitor workflows and implement alerting
- Keep current with AWS services and best practices
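For illustration only, a minimal S3-triggered Lambda handler of the serverless-processing kind listed above; the event shape assumes an S3 put notification, and the row-counting logic is a stand-in for real processing.

    # Hypothetical S3-triggered Lambda: count rows in a newly landed CSV.
    import csv
    import io
    import boto3

    s3 = boto3.client("s3")

    def lambda_handler(event, context):
        # S3 put notifications carry the bucket/key under Records[0].s3
        record = event["Records"][0]["s3"]
        bucket, key = record["bucket"]["name"], record["object"]["key"]

        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        rows = list(csv.reader(io.StringIO(body.decode("utf-8"))))
        print(f"{key}: {len(rows)} rows landed")
        return {"rows": len(rows)}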
Required Technical Expertise:
- Python/PySpark for data processing
- AWS Glue for ETL operations
- Redshift and Athena for data querying
- AWS Lambda and serverless architecture
- AWS DMS and RDS management
- CloudFormation for infrastructure
- SQL optimization and performance tuning

Job Overview:
We are seeking an experienced AWS Data Engineer to join our growing data team. The ideal candidate will have hands-on experience with AWS Glue, Redshift, PySpark, and other AWS services to build robust, scalable data pipelines. This role is perfect for someone passionate about data engineering, automation, and cloud-native development.
Key Responsibilities:
- Design, build, and maintain scalable and efficient ETL pipelines using AWS Glue, PySpark, and related tools.
- Integrate data from diverse sources and ensure its quality, consistency, and reliability.
- Work with large datasets in structured and semi-structured formats across cloud-based data lakes and warehouses.
- Optimize and maintain data infrastructure, including Amazon Redshift, for high performance.
- Collaborate with data analysts, data scientists, and product teams to understand data requirements and deliver solutions.
- Automate data validation, transformation, and loading processes to support real-time and batch data processing.
- Monitor and troubleshoot data pipeline issues and ensure smooth operations in production environments.
Required Skills:
- 5 to 7 years of hands-on experience in data engineering roles.
- Strong proficiency in Python and PySpark for data transformation and scripting.
- Deep understanding and practical experience with AWS Glue, AWS Redshift, S3, and other AWS data services.
- Solid understanding of SQL and database optimization techniques.
- Experience working with large-scale data pipelines and high-volume data environments.
- Good knowledge of data modeling, warehousing, and performance tuning.
Preferred/Good to Have:
- Experience with workflow orchestration tools like Airflow or Step Functions (a minimal Airflow sketch follows this list).
- Familiarity with CI/CD for data pipelines.
- Knowledge of data governance and security best practices on AWS.
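A minimal Airflow sketch of the orchestration mentioned under preferred skills; the DAG id, schedule, and task bodies are illustrative stand-ins for real Glue/Redshift steps.

    # Hypothetical daily DAG chaining extract -> load steps (Airflow 2.x API).
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull from source")        # placeholder for a real extract step

    def load():
        print("write to warehouse")      # placeholder for a real load step

    with DAG(dag_id="example_etl", start_date=datetime(2024, 1, 1),
             schedule_interval="@daily", catchup=False) as dag:
        t1 = PythonOperator(task_id="extract", python_callable=extract)
        t2 = PythonOperator(task_id="load", python_callable=load)
        t1 >> t2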
Role - ETL Developer
Work Mode - Hybrid
Experience- 4+ years
Location - Pune, Gurgaon, Bengaluru, Mumbai
Required Skills - AWS, AWS Glue, PySpark, ETL, SQL
Required Skills:
- 4+ years of hands-on experience in MySQL, including SQL queries and procedure development
- Experience in PySpark, AWS, and AWS Glue
- Experience in AWS migration
- Experience with automated scripting and tracking KPIs/metrics for database performance (a short snippet follows this list)
- Proficiency in shell scripting and ETL.
- Strong communication skills and a collaborative team player
- Knowledge of Python and AWS RDS is a plus
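As a hedged illustration of the KPI-tracking item above, the snippet below samples a few MySQL status counters; the connection details and the chosen counters are placeholders.

    # Hypothetical snapshot of MySQL health counters for KPI tracking.
    import mysql.connector  # pip install mysql-connector-python

    conn = mysql.connector.connect(host="localhost", user="monitor",
                                   password="***", database="mysql")
    cur = conn.cursor()
    cur.execute("SHOW GLOBAL STATUS WHERE Variable_name IN "
                "('Slow_queries', 'Threads_connected', 'Questions')")
    for name, value in cur.fetchall():
        print(f"{name} = {value}")       # e.g., push these to a metrics store
    cur.close()
    conn.close()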

🚀 We’re Hiring: PHP Developer at Deqode
📍 Location: Pune (Hybrid)
🕒Experience: 4–6 Years
⏱️ Notice Period: Immediate Joiner
We're looking for a skilled PHP Developer to join our team. If you have a strong grasp of secure coding practices, are experienced in PHP upgrades, and thrive in a fast-paced deployment environment, we’d love to connect with you!
🔧 Key Skills:
- PHP | MySQL | JavaScript | Jenkins | Nginx | AWS
🔐 Security-Focused Responsibilities Include:
- Remediation of PenTest findings
- XSS mitigation (input/output sanitization)
- API rate limiting (a short illustrative sketch follows this list)
- 2FA integration
- PHP version upgrade
- Use of AWS Secrets Manager
- Secure session and password policies
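Although this role is PHP-centric, the rate-limiting idea is language-agnostic; here is a short token-bucket sketch in Python purely for illustration (the class name and numbers are invented).

    # Illustrative token-bucket rate limiter; the same idea ports to PHP middleware.
    import time

    class TokenBucket:
        def __init__(self, rate: float, capacity: int):
            self.rate = rate                 # tokens refilled per second
            self.capacity = capacity         # maximum burst size
            self.tokens = float(capacity)
            self.last = time.monotonic()

        def allow(self) -> bool:
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= 1:
                self.tokens -= 1
                return True
            return False

    bucket = TokenBucket(rate=5, capacity=10)  # ~5 req/s with bursts of 10
    print(bucket.allow())                      # True until the bucket drains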
AccioJob is conducting an exclusive diversity hiring drive with a reputed global IT consulting company for female candidates only.
Apply Here: https://links.acciojob.com/3SmQ0Bw
Key Details:
• Role: Application Developer
• CTC: ₹11.1 LPA
• Work Location: Pune, Chennai, Hyderabad, Gurgaon (Onsite)
• Required Skills: DSA, OOPs, SQL, and proficiency in any programming language
Eligibility Criteria:
• Graduation Year: 2024–2025
• Degree: B.E/B.Tech or M.E/M.Tech
• CS/IT branches: No prior experience required
• Non-CS/IT branches: Minimum 6 months of technical experience
• Minimum 60% in UG
Selection Process:
Offline Assessment at AccioJob Skill Center(s) in:
• Pune
• Hyderabad
• Noida
• Delhi
• Greater Noida
Further Rounds for Shortlisted Candidates Only:
• Coding Test
• Code Pairing Round
• Technical Interview
• Leadership Round
Note: Candidates must bring their own laptop & earphones for the assessment.
Apply Here: https://links.acciojob.com/3SmQ0Bw

Work Mode: Hybrid
Need B.Tech, BE, M.Tech, ME candidates - Mandatory
Must-Have Skills:
● Educational Qualification: B.Tech, BE, M.Tech, or ME in any field.
● Minimum of 3 years of proven experience as a Data Engineer.
● Strong proficiency in the Python programming language and SQL.
● Experience with Databricks and in setting up and managing data pipelines and data warehouses/lakes.
● Good comprehension and critical thinking skills.
● Kindly note: the salary bracket will vary according to the candidate's experience -
- Experience from 4 yrs to 6 yrs - Salary up to 22 LPA
- Experience from 5 yrs to 8 yrs - Salary up to 30 LPA
- Experience more than 8 yrs - Salary up to 40 LPA
Key Responsibilities would include:
1. Design, develop, and maintain enterprise-level Java applications.
2. Collaborate with cross-functional teams to gather and analyze requirements, and implement solutions.
3. Develop & customize the application using HTML5, CSS, and jQuery to create dynamic and responsive user interfaces.
4. Integrate with relational databases (RDBMS) to manage and retrieve data efficiently.
5. Write clean, maintainable, and efficient code following best practices and coding standards.
6. Participate in code reviews, debugging, and testing to ensure high-quality deliverables.
7. Troubleshoot and resolve issues in existing applications and systems.
Qualification requirement -
1. 4 years of hands-on experience in Java/J2EE development, preferably with enterprise-level projects.
2. Spring Framework, including SOA, AOP, and Spring Security
3. Proficiency in web technologies including HTML5, CSS, jQuery, and JavaScript.
4. Experience with RESTful APIs and web services.
5. Knowledge of build tools like Maven or Gradle
6. Strong knowledge of relational databases (e.g., MySQL, PostgreSQL, Oracle) and experience with SQL.
7. Experience with version control systems like Git.
8. Understanding of software development lifecycle (SDLC)
9. Strong problem-solving skills and attention to detail.

At least 5 years of experience in testing and developing automation tests.
A minimum of 3 years of experience writing tests in Python, with a preference for experience in designing automation frameworks.
Experience in developing automation for big data testing, including data ingestion, data processing, and data migration, is highly desirable.
Familiarity with Playwright or other browser application testing frameworks is a significant advantage (a minimal sketch follows this list).
Proficiency in object-oriented programming and principles is required.
Extensive knowledge of AWS services is essential.
Strong expertise in REST API testing and SQL is required.
A solid understanding of testing and development life cycle methodologies is necessary.
Knowledge of the financial industry and trading systems is a plus.
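A minimal Playwright-in-Python sketch of the browser testing mentioned above; the URL and the asserted title are placeholders.

    # Hypothetical smoke test using Playwright's sync API.
    from playwright.sync_api import sync_playwright

    def test_homepage_title():
        with sync_playwright() as p:
            browser = p.chromium.launch(headless=True)
            page = browser.new_page()
            page.goto("https://example.com")         # placeholder URL
            assert "Example Domain" in page.title()
            browser.close()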
Job Title: Sr. QA Engineer
Location: Pune, Baner
Mode - Hybrid
Major Responsibilities:
- Understand product requirements and design test plans/test cases.
- Collaborate with developers to discuss story design, test cases, code walkthroughs, etc.
- Design automation strategy for regression test cases.
- Execute tests and collaborate with developers in case of issues.
- Review and enhance existing unit test coverage
- Automate integration/end-to-end tests using JUnit/Mockito/Selenium/Cypress
Requirements:
- Experience in web application testing/test automation
- Good analytical skills
- Exposure to test design techniques
- Exposure to Agile Development methodology, Scrums
- Should be able to read and understand code.
- Review and understand unit test cases/ suggest additional unit-level coverage points.
- Exposure to multi-tier web application deployment/architecture (Spring Boot)
- Good exposure to SQL query language
- Exposure to configuration management tools for code investigation (GitHub)
- Exposure to Web Service / API testing
- Cucumber – use case-driven test automation
- System understanding, writing test cases from scratch, requirement analysis, thinking from a user perspective, and test design
In your role as Software Engineer/Lead, you will directly work with other developers, Product Owners, and Scrum Masters to evaluate and develop innovative solutions. The purpose of the role is to design, develop, test, and operate a complex set of applications or platforms in the IoT Cloud area.
The role involves the utilization of advanced tools and analytical methods for gathering facts to develop solution scenarios. The job holder needs to be able to write quality code, review code, and collaborate with other developers.
We have an excellent mix of people, which we believe makes for a more vibrant, more innovative, and more productive team.
- A bachelor’s or master’s degree in information technology, computer science, or another relevant field
- At least 5 years of experience as a Software Engineer in an enterprise context
- Experience in the design, development, and deployment of large-scale cloud-based applications and services
- Good knowledge of cloud (AWS) serverless application development, event-driven architecture, and SQL/NoSQL databases
- Experience with IoT products, backend services and design principles
- Good knowledge of at least one backend technology, such as Node.js (JavaScript, TypeScript) or the JVM (Java, Scala, Kotlin)
- Passionate about code quality, security and testing
- Microservice development experience with Java (Spring) is a plus
- Good command of both spoken and written English

We are looking for a Senior Data Engineer with strong expertise in GCP, Databricks, and Airflow to design and implement a GCP Cloud Native Data Processing Framework. The ideal candidate will work on building scalable data pipelines and help migrate existing workloads to a modern framework.
- Shift: 2 PM to 11 PM
- Work Mode: Hybrid (3 days a week) across Xebia locations
- Notice Period: Immediate joiners or those with a notice period of up to 30 days
Key Responsibilities:
- Design and implement a GCP Native Data Processing Framework leveraging Spark and GCP Cloud Services.
- Develop and maintain data pipelines using Databricks and Airflow to transform Raw → Silver → Gold data layers (a minimal PySpark sketch follows this list).
- Ensure data integrity, consistency, and availability across all systems.
- Collaborate with data engineers, analysts, and stakeholders to optimize performance.
- Document standards and best practices for data engineering workflows.
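A minimal PySpark sketch of the Raw → Silver promotion described in the responsibilities; the paths, dedupe key, and cleansing rules are placeholders.

    # Hypothetical Raw -> Silver step: dedupe, type, and filter raw events.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("raw_to_silver").getOrCreate()

    raw = spark.read.json("gs://example-raw/events/")       # placeholder GCS path
    silver = (raw
              .dropDuplicates(["event_id"])                 # hypothetical key
              .withColumn("event_ts", F.to_timestamp("event_ts"))
              .filter(F.col("event_id").isNotNull()))

    silver.write.format("delta").mode("overwrite").save("gs://example-silver/events/")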
Required Experience:
- 7-8 years of experience in data engineering, architecture, and pipeline development.
- Strong knowledge of GCP, Databricks, PySpark, and BigQuery.
- Experience with orchestration tools like Airflow, Dagster, or GCP equivalents.
- Understanding of Data Lake table formats (Delta, Iceberg, etc.).
- Proficiency in Python for scripting and automation.
- Strong problem-solving skills and collaborative mindset.
⚠️ Please apply only if you have not applied recently or are not currently in the interview process for any open roles at Xebia.
Looking forward to your response!
Best regards,
Vijay S
Assistant Manager - TAG
Job Description:
We are seeking a Tableau Developer with 5+ years of experience to join our Core Analytics team. The candidate will work on large-scale BI projects using Tableau and related tools.
Must Have:
- Strong expertise in Tableau Desktop and Server, including add-ons like Data Management and Server Management.
- Ability to interpret business requirements, build wireframes, and finalize KPIs, calculations, and designs.
- Participate in design discussions to implement best practices for dashboards and reports.
- Build scalable BI and Analytics products based on feedback while adhering to best practices.
- Propose multiple solutions for a given problem, leveraging toolset functionality.
- Optimize data sources and dashboards while ensuring business requirements are met.
- Collaborate with product, platform, and program teams for timely delivery of dashboards and reports.
- Provide suggestions and take feedback to deliver future-ready dashboards.
- Peer review team members’ dashboards, offering constructive feedback to improve overall design.
- Proficient in SQL, UI/UX practices, and Alation, with an understanding of good data models for reporting.
- Mentor less experienced team members.
Here is the Job Description -
Location - Viman Nagar, Pune
Mode - 5 Days Working
Required Tech Skills:
● Strong at PySpark, Python
● Good understanding of data structures
● Good at SQL queries and query optimization (a short sketch follows this list)
● Strong fundamentals of OOP
● Good understanding of AWS Cloud, Big Data.
● Data Lake, AWS Glue, Athena, S3, Kinesis, SQL/NoSQL DB
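To ground the SQL/optimization bullet above, here is a short PySpark sketch that runs SQL over a DataFrame and inspects the physical plan before tuning; the table name and path are placeholders.

    # Hypothetical: query a Parquet dataset with Spark SQL and check the plan.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("sql_check").getOrCreate()

    orders = spark.read.parquet("s3://example-bucket/orders/")  # placeholder path
    orders.createOrReplaceTempView("orders")

    top = spark.sql("""
        SELECT customer_id, SUM(amount) AS total
        FROM orders
        GROUP BY customer_id
        ORDER BY total DESC
        LIMIT 10
    """)
    top.explain()   # inspect the physical plan before tuning partitions/joins
    top.show()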