
50+ SQL Jobs in Pune | SQL Job openings in Pune

Apply to 50+ SQL Jobs in Pune on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.

Deqode

Posted by Apoorva Jain
Pune
4 - 8 yrs
₹4L - ₹14L / yr
Amazon Web Services (AWS)
Kubernetes
Docker
Terraform
Linux/Unix
+3 more

Job Summary:

We are seeking a highly skilled and proactive DevOps Engineer with 4+ years of experience to join our dynamic team. This role requires strong technical expertise across cloud infrastructure, CI/CD pipelines, container orchestration, and infrastructure as code (IaC). The ideal candidate should also have direct client-facing experience and a proactive approach to managing both internal and external stakeholders.


Key Responsibilities:

  • Collaborate with cross-functional teams and external clients to understand infrastructure requirements and implement DevOps best practices.
  • Design, build, and maintain scalable cloud infrastructure on AWS (EC2, S3, RDS, ECS, etc.).
  • Develop and manage infrastructure using Terraform or CloudFormation.
  • Manage and orchestrate containers using Docker and Kubernetes (EKS).
  • Implement and maintain CI/CD pipelines using Jenkins or GitHub Actions.
  • Write robust automation scripts using Python and Shell scripting.
  • Monitor system performance and availability, and ensure high uptime and reliability.
  • Execute and optimize SQL queries for MSSQL and PostgreSQL databases.
  • Maintain clear documentation and provide technical support to stakeholders and clients.
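One of the responsibilities above calls for robust automation scripts in Python. As a rough illustration (not part of the posting), a staple pattern in such scripts is retrying a flaky infrastructure call with exponential backoff; all function names here are hypothetical:

```python
import time
from functools import wraps

def retry(attempts=3, base_delay=0.1):
    """Retry a flaky call with exponential backoff."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for i in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if i == attempts - 1:
                        raise  # out of attempts: surface the error
                    time.sleep(base_delay * 2 ** i)  # 0.1s, 0.2s, 0.4s, ...
        return wrapper
    return decorator

@retry(attempts=3, base_delay=0.01)
def check_service():
    # Hypothetical health probe; a real script might call an AWS or HTTP API here.
    return "ok"
```

A real monitoring script would wrap API or SSH calls this way so transient network errors do not fail the whole job.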


Required Skills:

  • Minimum 4+ years of experience in a DevOps or related role.
  • Proven experience in client-facing engagements and communication.
  • Strong knowledge of AWS services – EC2, S3, RDS, ECS, etc.
  • Proficiency in Infrastructure as Code using Terraform or CloudFormation.
  • Hands-on experience with Docker and Kubernetes (EKS).
  • Strong experience in setting up and maintaining CI/CD pipelines with Jenkins or GitHub Actions.
  • Solid understanding of SQL and working experience with MSSQL and PostgreSQL.
  • Proficient in Python and Shell scripting.


Preferred Qualifications:

  • AWS Certifications (e.g., AWS Certified DevOps Engineer) are a plus.
  • Experience working in Agile/Scrum environments.
  • Strong problem-solving and analytical skills.


NonStop io Technologies Pvt Ltd

Posted by Kalyani Wadnere
Pune
8 - 12 yrs
Best in industry
Java
MySQL
SQL
Microservices
RESTful APIs
+1 more

About the Role:

We are seeking an experienced Tech Lead with 8+ years of hands-on experience in backend development using Java. The ideal candidate will have strong leadership capabilities, the ability to mentor a team, and a solid technical foundation to deliver scalable and maintainable backend systems. Prior experience in the healthcare domain is a plus.


Key Responsibilities:

  • Lead a team of backend developers to deliver product and project-based solutions.
  • Oversee the development and implementation of backend services and APIs.
  • Collaborate with cross-functional teams including frontend, QA, DevOps, and Product.
  • Perform code reviews and enforce best practices in coding and design.
  • Ensure performance, quality, and responsiveness of backend applications.
  • Participate in sprint planning, estimations, and retrospectives.
  • Troubleshoot, analyze, and optimize application performance.

Required Skills:

  • 8+ years of backend development experience in Java.
  • Proven experience as a Tech Lead managing development teams.
  • Strong understanding of REST APIs, microservices, and software design patterns.
  • Familiarity with SQL and NoSQL databases.
  • Good knowledge of Agile/Scrum methodologies.

Preferred Skills:

  • Experience in the healthcare domain.
  • Exposure to frontend frameworks like Angular or React.
  • Understanding of cloud platforms such as Azure/AWS/GCP.
  • CI/CD and DevOps practices.

What We Offer:

  • Collaborative and value-driven culture.
  • Projects with real-world impact in critical domains.
  • Flexibility and autonomy in work.
  • Continuous learning and growth opportunities.
HighQ-labs

Posted by Lakshmi dantuluri
Bengaluru (Bangalore), Pune, Kochi (Cochin)
3 - 6 yrs
₹15L - ₹16L / yr
software programming
rest api
HTML/CSS
Angular (2+)
Hibernate (Java)
+3 more

Responsibility:

  • Develop and maintain code following predefined cost, company and security standards.
  • Work on bug fixes, supporting the maintenance and improvement of existing applications.
  • Elaborate interfaces using standards and design principles defined by the team.
  • Develop systems with high availability.
  • Attend and contribute to development meetings.
  • Be well versed with unit testing and PSR standards.
  • Master the software development lifecycle, standards and technologies used by the team.
  • Deliver on time with high quality.
  • Write automation tests for API calls before implementing them.
  • Troubleshooting and debugging skills.
  • Perform technical documentation of the implemented tasks.

UniAthena

Posted by HR Athena
Pune
3 - 7 yrs
₹5L - ₹10L / yr
Python
PowerBI
SQL
Machine Learning (ML)
Predictive modelling
+1 more

Job Requirement :

  • 3-5 Years of experience in Data Science
  • Strong expertise in statistical modeling, machine learning, deep learning, data warehousing, ETL, and reporting tools.
  • Bachelor's or Master's degree in Data Science, Statistics, Computer Science, or Business Intelligence.
  • Experience with relevant programming languages and tools such as Python, R, SQL, Spark, Tableau, Power BI.
  • Experience with machine learning frameworks like TensorFlow, PyTorch, or Scikit-learn
  • Ability to think strategically and translate data insights into actionable business recommendations.
  • Excellent problem-solving and analytical skills
  • Adaptability and openness towards changing environment and nature of work
  • This is a startup environment with evolving systems and procedures; the ideal candidate will be comfortable working in a fast-paced, dynamic environment and have a strong desire to make a significant impact on the business.

Job Roles & Responsibilities:

  • Conduct in-depth analysis of large-scale datasets to uncover insights and trends.
  • Build and deploy predictive and prescriptive machine learning models for various applications.
  • Design and execute A/B tests to evaluate the effectiveness of different strategies.
  • Collaborate with product managers, engineers, and other stakeholders to drive data-driven decision-making.
  • Stay up-to-date with the latest advancements in data science and machine learning.
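For the A/B-testing responsibility above, a minimal sketch of the underlying arithmetic: a pooled two-proportion z-statistic computed with only the standard library. The sample numbers are invented for illustration:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic comparing conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)       # pooled conversion rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# 10% vs 13% conversion on 1,000 users per arm.
z = two_proportion_z(100, 1000, 130, 1000)
```

With |z| above roughly 1.96, the difference would be significant at the 5% level; real experiments also need power analysis and pre-registered stopping rules.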


Data Axle

Posted by Eman Khan
Pune
6 - 9 yrs
Best in industry
Machine Learning (ML)
Python
SQL
PySpark
XGBoost

About Data Axle:

Data Axle Inc. has been an industry leader in data, marketing solutions, sales and research for over 50 years in the USA. Data Axle now has an established strategic global centre of excellence in Pune. This centre delivers mission-critical data services to its global customers, powered by its proprietary cloud-based technology platform and by leveraging proprietary business and consumer databases.


Data Axle Pune is pleased to have achieved certification as a Great Place to Work!


Roles & Responsibilities:

We are looking for a Senior Data Scientist to join the Data Science Client Services team to continue our success of identifying high quality target audiences that generate profitable marketing return for our clients. We are looking for experienced data science, machine learning and MLOps practitioners to design, build and deploy impactful predictive marketing solutions that serve a wide range of verticals and clients. The right candidate will enjoy contributing to and learning from a highly talented team and working on a variety of projects.


We are looking for a Senior Data Scientist who will be responsible for:

  1. Ownership of design, implementation, and deployment of machine learning algorithms in a modern Python-based cloud architecture
  2. Design or enhance ML workflows for data ingestion, model design, model inference and scoring
  3. Oversight on team project execution and delivery
  4. Establish peer review guidelines for high quality coding to help develop junior team members’ skill set growth, cross-training, and team efficiencies
  5. Visualize and publish model performance results and insights to internal and external audiences


Qualifications:

  1. Masters in a relevant quantitative, applied field (Statistics, Econometrics, Computer Science, Mathematics, Engineering)
  2. Minimum of 5 years of work experience in the end-to-end lifecycle of ML model development and deployment into production within a cloud infrastructure (Databricks is highly preferred)
  3. Proven ability to manage the output of a small team in a fast-paced environment and to lead by example in the fulfilment of client requests
  4. Exhibit deep knowledge of core mathematical principles relating to data science and machine learning (ML Theory + Best Practices, Feature Engineering and Selection, Supervised and Unsupervised ML, A/B Testing, etc.)
  5. Proficiency in Python and SQL required; PySpark/Spark experience a plus
  6. Ability to conduct a productive peer review and proper code structure in Github
  7. Proven experience developing, testing, and deploying various ML algorithms (neural networks, XGBoost, Bayes, and the like)
  8. Working knowledge of modern CI/CD methods.

This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.

Deqode

Posted by Apoorva Jain
Pune
4 - 6 yrs
₹3L - ₹12L / yr
Amazon Web Services (AWS)
SQL
PL/SQL
DevOps
Docker
+5 more

Job Summary:

We are seeking a highly skilled and proactive DevOps Engineer with 4+ years of experience to join our dynamic team. This role requires strong technical expertise across cloud infrastructure, CI/CD pipelines, container orchestration, and infrastructure as code (IaC). The ideal candidate should also have direct client-facing experience and a proactive approach to managing both internal and external stakeholders.

Key Responsibilities:

  • Collaborate with cross-functional teams and external clients to understand infrastructure requirements and implement DevOps best practices.
  • Design, build, and maintain scalable cloud infrastructure on AWS (EC2, S3, RDS, ECS, etc.).
  • Develop and manage infrastructure using Terraform or CloudFormation.
  • Manage and orchestrate containers using Docker and Kubernetes (EKS).
  • Implement and maintain CI/CD pipelines using Jenkins or GitHub Actions.
  • Write robust automation scripts using Python and Shell scripting.
  • Monitor system performance and availability, and ensure high uptime and reliability.
  • Execute and optimize SQL queries for MSSQL and PostgreSQL databases.
  • Maintain clear documentation and provide technical support to stakeholders and clients.

Required Skills:

  • Minimum 4+ years of experience in a DevOps or related role.
  • Proven experience in client-facing engagements and communication.
  • Strong knowledge of AWS services – EC2, S3, RDS, ECS, etc.
  • Proficiency in Infrastructure as Code using Terraform or CloudFormation.
  • Hands-on experience with Docker and Kubernetes (EKS).
  • Strong experience in setting up and maintaining CI/CD pipelines with Jenkins or GitHub Actions.
  • Solid understanding of SQL and working experience with MSSQL and PostgreSQL.
  • Proficient in Python and Shell scripting.

Preferred Qualifications:

  • AWS Certifications (e.g., AWS Certified DevOps Engineer) are a plus.
  • Experience working in Agile/Scrum environments.
  • Strong problem-solving and analytical skills.

Work Mode & Timing:

  • Hybrid – Pune-based candidates preferred.
  • Working hours: 12:30 PM to 9:30 PM IST to align with client time zones.


Wissen Technology

Posted by Priyanka Seshadri
Hyderabad, Pune, Bengaluru (Bangalore)
6 - 13 yrs
Best in industry
SQL
ETL
Banking
  • 5-10 years of experience in ETL testing, Snowflake, and DWH concepts.
  • Strong SQL knowledge and debugging skills are a must.
  • Experience with Azure and Snowflake testing is a plus.
  • Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus.
  • Strong data warehousing concepts; experience with ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle.
  • Experience with the JIRA and Xray defect management tools is good to have.
  • Exposure to financial domain knowledge is considered a plus.
  • Test data readiness (data quality) and address code or data issues.
  • Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions.
  • Strong collaborative experience across regions (APAC, EMEA and NA) to effectively and efficiently identify the root cause of code/data issues and arrive at a permanent solution.
  • Prior experience with State Street and Charles River Development (CRD) is considered a plus.
  • Experience with tools such as PowerPoint, Excel, and SQL.
  • Exposure to third-party data providers such as Bloomberg, Reuters, MSCI and other rating agencies is a plus.
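The data-readiness bullet above amounts to reconciling what was extracted against what was loaded. A toy sketch of such a check (key and record values are made up for illustration; a real test would run SQL against source and target):

```python
def reconcile(source_rows, target_rows, key="id"):
    """Compare a source extract with the loaded target: row counts,
    keys dropped in flight, and keys duplicated by the load."""
    src_keys = [r[key] for r in source_rows]
    tgt_keys = [r[key] for r in target_rows]
    return {
        "count_match": len(src_keys) == len(tgt_keys),
        "missing_in_target": sorted(set(src_keys) - set(tgt_keys)),
        "duplicated_in_target": sorted(k for k in set(tgt_keys)
                                       if tgt_keys.count(k) > 1),
    }

report = reconcile(
    [{"id": 1}, {"id": 2}, {"id": 3}],
    [{"id": 1}, {"id": 1}, {"id": 2}],
)
# Row counts match, yet id 3 never landed and id 1 was loaded twice --
# exactly the kind of defect count-only checks miss.
```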

Key Attributes include:

  • Team player with professional and positive approach
  • Creative, innovative and able to think outside of the box
  • Strong attention to detail during root cause analysis and defect issue resolution
  • Self-motivated & self-sufficient
  • Effective communicator both written and verbal
  • Brings a high level of energy with enthusiasm to generate excitement and motivate the team
  • Able to work under pressure with tight deadlines and/or multiple projects
  • Experience in negotiation and conflict resolution
Noovosoft Technologies

Posted by Eman Khan
Pune
2 - 4 yrs
₹10L - ₹20L / yr
Java
Spring Boot
Hibernate (Java)
Kotlin
+10 more

Responsibilities:

  • Design, develop, and maintain scalable applications using Java/Kotlin Core Concepts and Spring Boot MVC.
  • Build and optimize REST APIs for seamless client-server communication.
  • Develop and ensure efficient HTTP/HTTPS request-response mechanisms.
  • Handle Java/Kotlin version upgrades confidently, ensuring code compatibility and leveraging the latest features.
  • Solve complex business logic challenges with a methodical and innovative approach.
  • Write optimized SQL queries with Postgres DB.
  • Ensure code quality through adherence to design patterns (e.g., Singleton, Factory, Observer, MVC) and unit testing frameworks like JUnit.
  • Integrate third-party APIs and develop large-scale systems with technical precision.
  • Debug and troubleshoot production issues.


Requirements:

  • 2 to 4 years of hands-on experience in Java/Kotlin Spring Boot development.
  • Proven expertise in handling version upgrades for Java and Kotlin with confidence.
  • Strong logical thinking and problem-solving skills, especially in implementing complex algorithms.
  • Proficiency with Git, JIRA, and managing software package versions.
  • Familiarity with SaaS-based products, XML parsing/generation, and generating PDFs, XLS, CSVs using Spring Boot.
  • Strong understanding of JPA, Hibernate, and core Java concepts (OOP).


Skills (Good to Have):

  • Exposure to Docker, Redis, and Elasticsearch.
  • Knowledge of transaction management and solving computational problems.
  • Eagerness to explore new technologies.
Wissen Technology

Posted by Anurag Sinha
Pune, Mumbai, Bengaluru (Bangalore)
5 - 9 yrs
Best in industry
Python
RESTful APIs
Flask
Kubernetes
DevOps
+2 more
  • 5+ years of experience.
  • Flask API and REST API development experience.
  • Proficiency in Python programming.
  • Basic knowledge of front-end development.
  • Basic knowledge of data manipulation and analysis libraries.
  • Code versioning and collaboration (Git).
  • Knowledge of libraries for extracting data from websites.
  • Knowledge of SQL and NoSQL databases.
  • Familiarity with RESTful APIs.
  • Familiarity with cloud (Azure/AWS) technologies.


Global Consulting and Services

Agency job via AccioJob by AccioJob Hiring Board
Pune
0 - 1 yrs
₹3L - ₹6L / yr
MS-Excel
Python
pandas
NumPy
SQL

AccioJob is conducting a Walk-In Hiring Drive with Global Consulting and Services for the position of Python Automation Engineer.


To apply, register and select your slot here: https://go.acciojob.com/b7BZZZ


Required Skills: Excel, Python, Pandas, NumPy, SQL


Eligibility:

  • Degree: BTech./BE, MTech./ME, BCA, MCA, BSc., MSc
  • Branch: All
  • Graduation Year: 2023, 2024, 2025


Work Details:

  • Work Location: Pune (Onsite)
  • CTC: 3 LPA to 6 LPA


Evaluation Process:

Round 1: Offline Assessment at AccioJob Pune Centre

Further rounds (for shortlisted candidates only):

  • Profile & Background Screening Round
  • Technical Interview 1
  • Technical Interview 2
  • Tech + Managerial Round


Important Note: Bring your laptop & earphones for the test.


Register here: https://go.acciojob.com/b7BZZZ

Or, apply through our newly launched app: https://go.acciojob.com/4wvBDe

Deqode

Posted by purvisha Bhavsar
Bengaluru (Bangalore), Pune, Jaipur, Bhopal, Gurugram, Hyderabad
5 - 7 yrs
₹5L - ₹18L / yr
Software Testing (QA)
Manual testing
SQL
ETL

🚀 Hiring: Manual Tester

⭐ Experience: 5+ Years

📍 Location: Pan India

⭐ Work Mode:- Hybrid

⏱️ Notice Period: Immediate Joiners

(Only immediate joiners & candidates serving notice period)


Must-Have Skills:

✅5+ years of experience in Manual Testing

✅Solid experience in ETL, Database, and Report Testing

✅Strong expertise in SQL queries, RDBMS concepts, and DML/DDL operations

✅Working knowledge of BI tools such as Power BI

✅Ability to write effective Test Cases and Test Scenarios

VDart

Agency job via VDart by Don Blessing
Pune
4 - 8 yrs
₹10L - ₹15L / yr
Fullstack Developer
React.js
NodeJS (Node.js)
SQL
NOSQL Databases
+6 more

Full Stack Developer (Node.js & React)

Location: Pune, India (Local or Ready to Relocate)

Employment Type: 6–8 Month Contract (Potential Conversion to FTE Based on Performance)


About the Role

We are seeking a highly skilled Full Stack Developer with expertise in Node.js and React to join our dynamic team in Pune. This role involves designing, developing, and deploying scalable web applications. You will collaborate with cross-functional teams to deliver high-impact solutions while adhering to best practices in coding, testing, and security.


Key Responsibilities

  • Develop and maintain server-side applications using Node.js (Express/NestJS) and client-side interfaces with React.js (Redux/Hooks).
  • Architect RESTful APIs and integrate with databases (SQL/NoSQL) and third-party services.
  • Implement responsive UI/UX designs with modern front-end libraries (e.g., Material-UI, Tailwind CSS).
  • Write unit/integration tests (Jest, Mocha, React Testing Library) and ensure code quality via CI/CD pipelines.
  • Collaborate with product managers, designers, and QA engineers in an Agile environment.
  • Troubleshoot performance bottlenecks and optimize applications for scalability.
  • Document technical specifications and deployment processes.


Required Skills & Qualifications

  • Experience: 3+ years in full-stack development with Node.js and React.
  • Backend Proficiency:
  • Strong knowledge of Node.js, Express, or NestJS.
  • Experience with databases (PostgreSQL, MongoDB, Redis).
  • API design (REST/GraphQL) and authentication (JWT/OAuth).
  • Frontend Proficiency:
  • Expertise in React.js (Functional Components, Hooks, Context API).
  • State management (Redux, Zustand) and modern CSS frameworks.
  • DevOps & Tools:
  • Git, Docker, AWS/Azure, and CI/CD tools (Jenkins/GitHub Actions).
  • Testing frameworks (Jest, Cypress, Mocha).
  • Soft Skills:
  • Problem-solving mindset and ability to work in a fast-paced environment.
  • Excellent communication and collaboration skills.
  • Location: Based in Pune or willing to relocate immediately.


Preferred Qualifications

  • Experience with TypeScript, Next.js, or serverless architectures.
  • Knowledge of microservices, message brokers (Kafka/RabbitMQ), or container orchestration (Kubernetes).
  • Familiarity with Agile/Scrum methodologies.
  • Contributions to open-source projects or a strong GitHub portfolio.


What We Offer

  • Competitive Contract Compensation with timely payouts.
  • Potential for FTE Conversion: Performance-based path to a full-time role.
  • Hybrid Work Model: Flexible in-office (Pune) and remote options.
  • Learning Opportunities: Access to cutting-edge tools and mentorship.
  • Collaborative Environment: Work with industry experts on innovative projects.


Apply Now!

Ready to make an impact? Send your resume and GitHub/Portfolio links with the subject line:

"Full Stack Developer (Node/React) - Pune".

Local candidates or those relocating to Pune will be prioritized. Applications without portfolios will not be considered.


Equal Opportunity Employer

We celebrate diversity and are committed to creating an inclusive environment for all employees.

Service based company

Agency job via Jobdost by Sathish Kumar
Pune
5 - 10 yrs
₹15L - ₹20L / yr
Windows Azure
Azure Logic
Azure Service
SQL

🌐 Job Title: Senior Azure Developer

🏢 Department: Digital Engineering

📍 Location: Pune (Work from Office)

📄 Job Type: Full-time

💼 Experience Required: 5+ years

💰 Compensation: Best in the industry


🔧 Roles & Responsibilities:

  • Design, develop, and implement solutions using Microsoft Azure with .NET and other technologies.
  • Collaborate with business analysts and end users to define system requirements.
  • Work with QA teams to ensure solution integrity and functionality.
  • Communicate frequently with stakeholders and team members to track progress and validate requirements.
  • Evaluate and present technical solutions and recommendations.
  • Provide technical mentoring and training to peers and junior developers.

💡 Technical Requirements:

  • Minimum 2 years of hands-on development experience in:
  • Azure Logic Apps
  • Azure Service Bus
  • Azure Web/API Apps
  • Azure Functions
  • Azure SQL Database / Cosmos DB
  • Minimum 2 years’ experience in enterprise software development using .NET stack:
  • REST APIs
  • Web Applications
  • Distributed Systems
  • Familiarity with security best practices (e.g., OWASP).
  • Knowledge of NoSQL data stores is an added advantage.


Deqode

Posted by Roshni Maji
Pune, Hyderabad
6 - 8 yrs
₹8L - ₹16L / yr
PostgreSQL
SQL
Oracle Data Guard

Job Title: PostgreSQL Database Administrator

Experience: 6–8 Years

Work Mode: Hybrid

Locations: Hyderabad / Pune

Joiners: Only immediate joiners & candidates who have completed notice period


Required Skills

  • Strong hands-on experience in PostgreSQL administration (6+ years).
  • Excellent understanding of SQL and query optimization techniques.
  • Deep knowledge of database services, architecture, and internals.
  • Experience in performance tuning at both DB and OS levels.
  • Familiarity with DataGuard or similar high-availability solutions.
  • Strong experience in job scheduling and automation.
  • Comfortable with installing, configuring, and upgrading PostgreSQL.
  • Basic to intermediate knowledge of Linux system administration.
  • Hands-on experience with shell scripting for automation and monitoring tasks.



Key Responsibilities

  • Administer and maintain PostgreSQL databases with 6+ years of hands-on experience.
  • Write and optimize complex SQL queries for performance and scalability.
  • Manage database storage structures and ensure optimal disk usage and performance.
  • Monitor, analyze, and resolve database performance issues using tools and logs.
  • Perform database tuning, configuration adjustments, and query optimization.
  • Plan, schedule, and automate jobs using cron or other job scheduling tools at DB and OS levels.
  • Install and upgrade PostgreSQL database software to new versions as required.
  • Manage high availability and disaster recovery setups, including replication and DataGuard administration (or equivalent techniques).
  • Perform regular database backups and restorations to ensure data integrity and availability.
  • Apply security patches and updates on time.
  • Collaborate with developers for schema design, stored procedures, and access privileges.
  • Document configurations, processes, and performance tuning results.
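The backup responsibility above is typically automated with cron plus pg_dump. A small sketch of composing such a nightly job; the directory and file-naming convention are assumptions for illustration, not a site standard:

```python
from datetime import date

def pg_dump_command(db, out_dir="/var/backups/pg", when=None):
    """Build a pg_dump invocation for a custom-format backup,
    which pg_restore can later restore selectively."""
    stamp = (when or date.today()).isoformat()
    return [
        "pg_dump",
        "--format=custom",                     # compressed, pg_restore-friendly
        f"--file={out_dir}/{db}-{stamp}.dump",
        db,
    ]

cmd = pg_dump_command("appdb", when=date(2025, 1, 1))
# A cron entry would run this nightly, e.g. subprocess.run(cmd, check=True),
# followed by a test restore to verify the backup is actually usable.
```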



Fusionpact Technologies Inc
Pune
8 - 15 yrs
₹25L - ₹28L / yr
Microservices
Apache Kafka
RESTful APIs
SQL
+3 more

Senior Software Engineer – Java


Location: Pune (Hybrid – 3 days from office)


Experience: 8–15 Years


Domain: Information Technology (IT)


Joining: Immediate joiners only


Preference: Local candidates only (Pune-based)


Job Description:


We are looking for experienced and passionate Senior Java Engineers to join a high-performing development team. The role involves building and maintaining robust, scalable, and low-latency backend systems and microservices in a fast-paced, agile environment.


Key Responsibilities:


  • Work within a high-velocity scrum team to deliver enterprise-grade software solutions.
  • Architect and develop scalable end-to-end web applications and microservices.
  • Collaborate with cross-functional teams to analyze requirements and deliver optimal technical solutions.
  • Participate in code reviews, unit testing, and deployment.
  • Mentor junior engineers while remaining hands-on with development tasks.
  • Provide accurate estimates and support the team lead in facilitating development processes.


Mandatory Skills & Experience:


  • 6–7+ years of enterprise-level Java development experience.
  • Strong in Java 8 or higher (Java 11 preferred), including lambda expressions, Stream API, Completable Future.
  • Minimum 4+ years working with Microservices, Spring Boot, and Hibernate.
  • At least 3+ years of experience designing and developing RESTful APIs.
  • Kafka – minimum 2 years’ hands-on experience in the current/most recent project.
  • Solid experience with SQL.
  • AWS – minimum 1.5 years of experience.
  • Understanding of CI/CD pipelines and deployment processes.
  • Exposure to asynchronous programming, multithreading, and performance tuning.
  • Experience working in at least one Fintech domain project (mandatory).


Nice to Have:


  • Exposure to Golang or Rust.
  • Experience with any of the following tools: MongoDB, Jenkins, Sonar, Oracle DB, Drools, Adobe AEM, Elasticsearch/Solr/Algolia, Spark.
  • Strong systems design and data modeling capabilities.
  • Experience in payments or asset/wealth management domain.
  • Familiarity with rules engines and CMS/search platforms.


Candidate Profile:


  • Strong communication and client-facing skills.
  • Proactive, self-driven, and collaborative mindset.
  • Passionate about clean code and quality deliverables.
  • Prior experience in building and deploying multiple products in production.


Note: Only candidates who are based in Pune and can join immediately will be considered.

Deqode

Posted by purvisha Bhavsar
Pune, Mumbai, Hyderabad, Bengaluru (Bangalore)
5 - 8 yrs
₹5L - ₹18L / yr
PostgreSQL
Oracle Data Guard
SQL
Databases

🚀 Hiring: Postgres DBA at Deqode

⭐ Experience: 6+ Years

📍 Location: Pune & Hyderabad

⭐ Work Mode:- Hybrid

⏱️ Notice Period: Immediate Joiners

(Only immediate joiners & candidates serving notice period)


Looking for an experienced Postgres DBA with:-

✅ 6+ years in Postgres & strong SQL skills

✅ Good understanding of database services & storage management

✅ Performance tuning & monitoring expertise

✅ Knowledge of Dataguard admin, backups, upgrades

✅ Basic Linux admin & shell scripting

Deqode

Posted by Apoorva Jain
Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Mumbai, Chennai, Nagpur, Ahmedabad, Kochi (Cochin)
4 - 10 yrs
₹3L - ₹14L / yr
cypress
Automation
UI
API
Postman
+4 more

Job Overview

We are looking for a detail-oriented and skilled QA Engineer with expertise in Cypress to join our Quality Assurance team. In this role, you will be responsible for creating and maintaining automated test scripts to ensure the stability and performance of our web applications. You’ll work closely with developers, product managers, and other QA professionals to identify issues early and help deliver a high-quality user experience.

You should have a strong background in test automation, excellent analytical skills, and a passion for improving software quality through efficient testing practices.

Key Responsibilities

  • Develop, maintain, and execute automated test cases using Cypress.
  • Design robust test strategies and plans based on product requirements and user stories.
  • Work with cross-functional teams to identify test requirements and ensure proper coverage.
  • Perform regression, integration, smoke, and exploratory testing as needed.
  • Report and track defects, and work with developers to resolve issues quickly.
  • Collaborate in Agile/Scrum development cycles and contribute to sprint planning and reviews.
  • Continuously improve testing tools, processes, and best practices.
  • Optimize test scripts for performance, reliability, and maintainability.

Required Skills & Qualifications

  • Hands-on experience with Cypress and JavaScript-based test automation.
  • Strong understanding of QA methodologies, tools, and processes.
  • Experience in testing web applications across multiple browsers and devices.
  • Familiarity with REST APIs and tools like Postman or Swagger.
  • Experience with version control systems like Git.
  • Knowledge of CI/CD pipelines and integrating automated tests (e.g., GitHub Actions, Jenkins).
  • Excellent analytical and problem-solving skills.
  • Strong written and verbal communication.

Preferred Qualifications

  • Experience with other automation tools (e.g., Selenium, Playwright) is a plus.
  • Familiarity with performance testing or security testing.
  • Background in Agile or Scrum methodologies.
  • Basic understanding of DevOps practices.


Read more
Cymetrix Software

at Cymetrix Software

2 candid answers
Netra Shettigar
Posted by Netra Shettigar
Noida, Bengaluru (Bangalore), Pune
6 - 9 yrs
₹10L - ₹18L / yr
Windows Azure
SQL Azure
SQL
Data Warehouse (DWH)
skill iconData Analytics
+3 more

Hybrid work mode


(Azure) EDW experience: loading Star-schema data warehouses using framework architectures, including loading Type 2 dimensions; ingesting data from various sources (structured and semi-structured); hands-on experience ingesting via APIs into lakehouse architectures.

Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Gen2 Storage, SQL (expert), Python (intermediate), Azure cloud services knowledge, data analysis (SQL), data warehousing, documentation – BRD, FRD, user story creation.
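The Type 2 dimension loading mentioned above can be illustrated with a minimal, hypothetical merge in plain Python. The column names (`id`, `city`, `current`, `end_date`) are invented for the example; in Azure Databricks this logic would typically be expressed as a `MERGE INTO` statement, but the bookkeeping is the same:

```python
def scd2_merge(dim_rows, incoming, load_date):
    """Expire changed current rows and append new current versions (SCD Type 2)."""
    out = []
    by_key = {r["id"]: r for r in dim_rows if r["current"]}
    # historical (already expired) rows pass through untouched
    out.extend(r for r in dim_rows if not r["current"])
    for key, cur in by_key.items():
        match = next((s for s in incoming if s["id"] == key), None)
        if match and match["city"] != cur["city"]:
            # close the old version, open a new current one
            out.append({**cur, "current": False, "end_date": load_date})
            out.append({"id": key, "city": match["city"], "current": True, "end_date": None})
        else:
            out.append(cur)
    # brand-new business keys become new current rows
    known = set(by_key)
    for s in incoming:
        if s["id"] not in known:
            out.append({"id": s["id"], "city": s["city"], "current": True, "end_date": None})
    return out
```

Each changed key thus keeps its full history: the prior row survives with an `end_date`, and exactly one row per key stays flagged current.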

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Anusha Varghese
Posted by Anusha Varghese
Hyderabad, Bengaluru (Bangalore), Pune
4 - 11 yrs
₹10L - ₹25L / yr
ETL
SQL
snowflake
  • A minimum of 4–10 years of experience in data integration/orchestration services, service architecture, and providing data-driven solutions for client requirements
  • Experience with the Microsoft Azure cloud and Snowflake SQL, including database query/performance tuning
  • Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
  • Strong data warehousing concepts; ETL tools such as Talend Cloud Data Integration are a must
  • Exposure to the financial domain is considered a plus
  • Cloud managed services such as source control (GitHub) and Azure DevOps are considered a plus
  • Prior experience with State Street and Charles River Development (CRD) is considered a plus
  • Experience in tools such as Visio, PowerPoint, and Excel
  • Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus
  • Strong SQL knowledge and debugging skills are a must


Read more
Remote only
4 - 6 yrs
₹10L - ₹15L / yr
skill iconAngular (2+)
skill icon.NET
SQL
Relational Database (RDBMS)
Dependency injection

.NET + Angular Full Stack Developer (4–5 Years Experience)

Location: Pune/Remote

Experience Required: 4 to 5 years

Communication: Fluent English (verbal & written)

Technology: .NET, Angular

Only immediate joiners who can start on 21st July should apply.


Job Overview

We are seeking a skilled and experienced Full Stack Developer with strong expertise in .NET (C#) and Angular to join our dynamic team in Pune. The ideal candidate will have hands-on experience across the full development stack, a strong understanding of relational databases and SQL, and the ability to work independently with clients. Experience in microservices architecture is a plus.


Key Responsibilities

  • Design, develop, and maintain modern web applications using .NET Core / .NET Framework and Angular
  • Write clean, scalable, and maintainable code for both backend and frontend components
  • Interact directly with clients for requirement gathering, demos, sprint planning, and issue resolution
  • Work closely with designers, QA, and other developers to ensure high-quality product delivery
  • Perform regular code reviews, ensure adherence to coding standards, and mentor junior developers if needed
  • Troubleshoot and debug application issues and provide timely solutions
  • Participate in discussions on architecture, design patterns, and technical best practices

Must-Have Skills

✅ Strong hands-on experience with .NET Core / .NET Framework (Web API, MVC)

✅ Proficiency in Angular (Component-based architecture, RxJS, State Management)

✅ Solid understanding of RDBMS and SQL (preferably with SQL Server)

✅ Familiarity with Entity Framework or Dapper

✅ Strong knowledge of RESTful API design and integration

✅ Version control using Git

✅ Excellent verbal and written communication skills

✅ Ability to work in a client-facing role and handle discussions independently

Good-to-Have / Optional Skills

Understanding or experience in Microservices Architecture

Exposure to CI/CD pipelines, unit testing frameworks, and cloud environments (e.g., Azure or AWS)

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Praffull Shinde
Posted by Praffull Shinde
Pune, Mumbai, Bengaluru (Bangalore)
4 - 8 yrs
₹14L - ₹26L / yr
skill iconPython
PySpark
skill iconDjango
skill iconFlask
RESTful APIs
+3 more

Job title - Python developer

Exp – 4 to 6 years

Location – Pune/Mumbai/Bengaluru

 

Please find the job description below.

Requirements:

  • Proven experience as a Python Developer
  • Strong knowledge of core Python and PySpark concepts
  • Experience with web frameworks such as Django or Flask
  • Good exposure to any cloud platform (GCP preferred)
  • CI/CD exposure required
  • Solid understanding of RESTful APIs and how to build them
  • Experience working with databases like Oracle DB and MySQL
  • Ability to write efficient SQL queries and optimize database performance
  • Strong problem-solving skills and attention to detail
  • Strong SQL programming (stored procedures, functions)
  • Excellent communication and interpersonal skills

Roles and Responsibilities

  • Design, develop, and maintain data pipelines and ETL processes using PySpark
  • Work closely with data scientists and analysts to provide them with clean, structured data.
  • Optimize data storage and retrieval for performance and scalability.
  • Collaborate with cross-functional teams to gather data requirements.
  • Ensure data quality and integrity through data validation and cleansing processes.
  • Monitor and troubleshoot data-related issues to ensure data pipeline reliability.
  • Stay up to date with industry best practices and emerging technologies in data engineering.
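The data-quality responsibility above might look like this in miniature. A hedged, illustrative cleansing function — the `name`/`amount` schema is invented for the example, and in a real PySpark pipeline this logic would run per partition or be expressed as DataFrame operations:

```python
def cleanse(records):
    """Drop rows missing required fields; normalise whitespace and types."""
    clean, rejected = [], []
    for rec in records:
        name = (rec.get("name") or "").strip()
        amount = rec.get("amount")
        if not name or amount is None:
            # route bad rows aside for inspection instead of silently dropping them
            rejected.append(rec)
            continue
        clean.append({"name": name, "amount": float(amount)})
    return clean, rejected
```

Keeping a separate rejected list is what makes pipeline monitoring possible: a sudden spike in rejections is usually the first sign of an upstream schema change.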
Read more
Mindstix Software Labs
Agency job
via AccioJob by AccioJobHiring Board
Pune
0 - 1 yrs
₹5L - ₹6L / yr
DSA
SQL
Object Oriented Programming (OOPs)

AccioJob is conducting a Walk-In hiring drive in partnership with MindStix to fill the SDE 1 position at their Pune office.


Apply, Register, and select your Slot here: https://go.acciojob.com/hLMAv4


Job Description:

  • Role: SDE 1
  • Work Location: Pune
  • CTC: 5 LPA - 6 LPA

Eligibility Criteria:

  • Degree: B.Tech, BE, M.Tech, MCA, BCA
  • Branch: Open to all streams
  • Graduation Year: 2024 and 2025
  • Notice Period: Candidates should have a notice period of 10 days or less

Evaluation Process:

  1. Offline Assessment at AccioJob Pune Skill Centre
  2. Company-side Process: In-person Assignment, 2 Technical Rounds, 1 HR Round

Note: Please bring your laptop and microphone for the test.


Register Here: https://go.acciojob.com/hLMAv4

Read more
DEMAND MEDIA BPM LLP

at DEMAND MEDIA BPM LLP

2 candid answers
Darshana Mate
Posted by Darshana Mate
Pune
1 - 5 yrs
₹2L - ₹6L / yr
SQL
PowerBI
skill iconPython

Job Purpose

Responsible for managing end-to-end database operations, ensuring data accuracy, integrity, and security across systems. The position plays a key role in driving data reliability, availability, and compliance with operational standards.


Key Responsibilities:

  • Collate audit reports from the QA team and structure data in accordance with Standard Operating Procedures (SOP).
  • Perform data transformation and validation for accuracy and consistency.
  • Upload processed datasets into SQL Server using SSIS packages.
  • Monitor and optimize database performance, identifying and resolving bottlenecks.
  • Perform regular backups, restorations, and recovery checks to ensure data continuity.
  • Manage user access and implement robust database security policies.
  • Oversee database storage allocation and utilization.
  • Conduct routine maintenance and support incident management, including root cause analysis and resolution.
  • Design and implement scalable database solutions and architecture.
  • Create and maintain stored procedures, views, and other database components.
  • Optimize SQL queries for performance and scalability.
  • Execute ETL processes and support seamless integration of multiple data sources.
  • Maintain data integrity and quality through validation and cleansing routines.
  • Collaborate with cross-functional teams on data solutions and project deliverables.
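As a rough sketch of the "structure data per SOP" step above — the field names are assumptions — collating QA audit rows down to the latest revision per case, the kind of de-duplication typically done before an SSIS upload to SQL Server:

```python
def collate_audits(rows):
    """Keep one row per (auditor, case_id), preferring the highest revision."""
    latest = {}
    for row in rows:
        key = (row["auditor"], row["case_id"])
        # later revisions supersede earlier ones for the same composite key
        if key not in latest or row["revision"] > latest[key]["revision"]:
            latest[key] = row
    return sorted(latest.values(), key=lambda r: (r["auditor"], r["case_id"]))
```

The sorted, de-duplicated output then maps cleanly onto a staging table, so the SSIS package can load it without violating uniqueness constraints.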

 

Educational Qualification: Any Graduate

Required Skills & Qualifications:

  • Proven experience with SQL Server or similar relational database platforms.
  • Strong expertise in SSIS, ETL processes, and data warehousing.
  • Proficiency in SQL/T-SQL, including scripting, performance tuning, and query optimization.
  • Experience in database security, user role management, and access control.
  • Familiarity with backup/recovery strategies and database maintenance best practices.
  • Strong analytical skills with experience working with large and complex datasets.
  • Solid understanding of data modeling, normalization, and schema design.
  • Knowledge of incident and change management processes.
  • Excellent communication and collaboration skills.
  • Experience with Python for data manipulation and automation is a strong plus.


Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Pune
5 - 8 yrs
₹10L - ₹18L / yr
Ab Initio
GDE
EME
SQL
Teradata
+5 more

Job Title : Ab Initio Developer

Location : Pune

Experience : 5+ Years

Notice Period : Immediate Joiners Only


Job Summary :

We are looking for an experienced Ab Initio Developer to join our team in Pune.

The ideal candidate should have strong hands-on experience in Ab Initio development, data integration, and Unix scripting, with a solid understanding of SDLC and data warehousing concepts.


Mandatory Skills :

Ab Initio (GDE, EME, graphs, parameters), SQL/Teradata, Data Warehousing, Unix Shell Scripting, Data Integration, DB Load/Unload Utilities.


Key Responsibilities :

  • Design and develop Ab Initio graphs/plans/sandboxes/projects using GDE and EME.
  • Manage and configure standard environment parameters and multifile systems.
  • Perform complex data integration from multiple source and target systems with business rule transformations.
  • Utilize DB Load/Unload Utilities effectively for optimized performance.
  • Implement generic graphs, ensure proper use of parallelism, and maintain project parameters.
  • Work in a data warehouse environment involving SDLC, ETL processes, and data analysis.
  • Write and maintain Unix Shell Scripts and use utilities like sed, awk, etc.
  • Optimize and troubleshoot performance issues in Ab Initio jobs.
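Where the responsibilities above mention Unix utilities like sed and awk, the same field extraction is often prototyped in Python. A small illustrative sketch, assuming a pipe-delimited feed layout (the delimiter and field positions are placeholders):

```python
def extract_fields(lines, indexes=(1, 4), delim="|"):
    """awk-style extraction: pull selected delimited fields from each line."""
    out = []
    for line in lines:
        parts = line.rstrip("\n").split(delim)
        # skip short/malformed records rather than raising mid-feed
        if len(parts) > max(indexes):
            out.append(tuple(parts[i].strip() for i in indexes))
    return out
```

The equivalent one-liner would be `awk -F'|' '{print $2, $5}'`; the Python version is easier to extend with validation or DB-load hooks.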

Mandatory Skills :

  • Strong expertise in Ab Initio (GDE, EME, graphs, parallelism, DB utilities, multifile systems).
  • Experience with SQL and databases like SQL Server or Teradata.
  • Proficiency in Unix Shell Scripting and Unix utilities.
  • Data integration and ETL from varied source/target systems.

Good to Have :

  • Experience in Ab Initio and AWS integration.
  • Knowledge of Message Queues and Continuous Graphs.
  • Exposure to Metadata Hub.
  • Familiarity with Big Data tools such as Hive, Impala.
  • Understanding of job scheduling tools.
Read more
TCS

at TCS

Agency job
via Aavyan Consulting by Jayatri Paul
Bengaluru (Bangalore), Pune, Chennai
5 - 8 yrs
₹8L - ₹12L / yr
Maximo
skill iconJava
Oracle
SQL

We’re hiring a Maximo Technical Lead with hands-on experience in Maximo 7.6 or higher, Java, and Oracle DB. The role involves leading Maximo implementations, upgrades, and support projects, especially for manufacturing clients.


Key Skills:

IBM Maximo (MAS 8.x preferred)

Java, Oracle 12c+, WebSphere

Maximo Mobile / Asset Management / Cognos / BIRT

SQL, scripting, troubleshooting

Experience leading tech teams and working with clients


Good to Have:

IBM Maximo Certification

MES/Infrastructure planning knowledge

Experience with Rail or Manufacturing domain


https://lnkd.in/getubzJd

Read more
Mindstix Software Labs
Agency job
via AccioJob by AccioJobHiring Board
Pune
0 - 1 yrs
₹5L - ₹6L / yr
DSA
SQL
Object Oriented Programming (OOPs)

AccioJob is conducting an offline hiring drive in partnership with MindStix to fill the SDE 1 position at their Pune office.


Apply, Register, and select your Slot here: 

https://go.acciojob.com/Hb8ATw

Job Description:

  • Role: SDE 1
  • Work Location: Pune
  • CTC: 5 LPA - 6 LPA

Eligibility Criteria:

  • Degree: B.Tech, BE, M.Tech, MCA, BCA
  • Branch: Open to all streams
  • Graduation Year: 2024 and 2025
  • Notice Period: Candidates should have a notice period of 10 days or less

Evaluation Process:

  1. Offline Assessment at AccioJob Pune Skill Centre
  2. Company-side Process: In-person Assignment, 2 Technical Rounds, 1 HR Round

Note: Please bring your laptop and microphone for the test.


Register Here: https://go.acciojob.com/Hb8ATw

Read more
Bengaluru (Bangalore), Mumbai, Gurugram, Pune, Hyderabad, Chennai, Kolkata
3 - 8 yrs
₹5L - ₹20L / yr
Oracle Analytics Cloud (OAC)
Fusion Data Intelligence (FDI) Specialist
RPD
OAC Reports
Data Visualization
+7 more

Job Title : Oracle Analytics Cloud (OAC) / Fusion Data Intelligence (FDI) Specialist

Experience : 3 to 8 years

Location : All USI locations – Hyderabad, Bengaluru, Mumbai, Gurugram (preferred) and Pune, Chennai, Kolkata

Work Mode : Hybrid Only (2-3 days from office or all 5 days from office)


Mandatory Skills : Oracle Analytics Cloud (OAC), Fusion Data Intelligence (FDI), RPD, OAC Reports, Data Visualizations, SQL, PL/SQL, Oracle Databases, ODI, Oracle Cloud Infrastructure (OCI), DevOps tools, Agile methodology.


Key Responsibilities :

  • Design, develop, and maintain solutions using Oracle Analytics Cloud (OAC).
  • Build and optimize complex RPD models, OAC reports, and data visualizations.
  • Utilize SQL and PL/SQL for data querying and performance optimization.
  • Develop and manage applications hosted on Oracle Cloud Infrastructure (OCI).
  • Support Oracle Cloud migrations, OBIEE upgrades, and integration projects.
  • Collaborate with teams using the ODI (Oracle Data Integrator) tool for ETL processes.
  • Implement cloud scripting using CURL for Oracle Cloud automation.
  • Contribute to the design and implementation of Business Continuity and Disaster Recovery strategies for cloud applications.
  • Participate in Agile development processes and DevOps practices including CI/CD and deployment orchestration.

Required Skills :

  • Strong hands-on expertise in Oracle Analytics Cloud (OAC) and/or Fusion Data Intelligence (FDI).
  • Deep understanding of data modeling, reporting, and visualization techniques.
  • Proficiency in SQL, PL/SQL, and relational databases on Oracle.
  • Familiarity with DevOps tools, version control, and deployment automation.
  • Working knowledge of Oracle Cloud services, scripting, and monitoring.

Good to Have :

  • Prior experience in OBIEE to OAC migrations.
  • Exposure to data security models and cloud performance tuning.
  • Certification in Oracle Cloud-related technologies.
Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Bengaluru (Bangalore), Mumbai, Gurugram, Pune, Hyderabad, Chennai
3 - 6 yrs
₹5L - ₹20L / yr
IBM Sterling Integrator Developer
IBM Sterling B2B Integrator
Shell Scripting
skill iconPython
SQL
+1 more

Job Title : IBM Sterling Integrator Developer

Experience : 3 to 5 Years

Locations : Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, Pune

Employment Type : Full-Time


Job Description :

We are looking for a skilled IBM Sterling Integrator Developer with 3–5 years of experience to join our team across multiple locations.

The ideal candidate should have strong expertise in IBM Sterling and integration, along with scripting and database proficiency.

Key Responsibilities :

  • Develop, configure, and maintain IBM Sterling Integrator solutions.
  • Design and implement integration solutions using IBM Sterling.
  • Collaborate with cross-functional teams to gather requirements and provide solutions.
  • Work with custom languages and scripting to enhance and automate integration processes.
  • Ensure optimal performance and security of integration systems.

Must-Have Skills :

  • Hands-on experience with IBM Sterling Integrator and associated integration tools.
  • Proficiency in at least one custom scripting language.
  • Strong command over Shell scripting, Python, and SQL (mandatory).
  • Good understanding of EDI standards and protocols is a plus.

Interview Process :

  • 2 Rounds of Technical Interviews.

Additional Information :

  • Open to candidates from Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, and Pune.
Read more
Partner Company

Partner Company

Agency job
via AccioJob by AccioJobHiring Board
Hyderabad, Pune, Noida
0 - 0 yrs
₹5L - ₹6L / yr
SQL
MS-Excel
PowerBI
skill iconPython

AccioJob is conducting an offline hiring drive in partnership with Our Partner Company to hire Junior Business/Data Analysts for an internship with a Pre-Placement Offer (PPO) opportunity.


Apply, Register and select your Slot here: https://go.acciojob.com/69d3Wd


Job Description:

  • Role: Junior Business/Data Analyst (Internship + PPO)
  • Work Location: Hyderabad
  • Internship Stipend: ₹15,000 – ₹25,000/month
  • Internship Duration: 3 months
  • CTC on PPO: 5 LPA - 6 LPA

Eligibility Criteria:

  • Degree: Open to all academic backgrounds
  • Graduation Year: 2023, 2024, 2025

Required Skills:

  • Proficiency in SQL, Excel, Power BI, and basic Python
  • Strong analytical mindset and interest in solving business problems with data

Hiring Process:

  1. Offline Assessment at AccioJob Skill Centres (Hyderabad, Pune, Noida)
  2. 1 Assignment + 2 Technical Interviews (Virtual; In-person for Hyderabad candidates)

Note: Please bring your laptop and earphones for the test.


Register Here: https://go.acciojob.com/69d3Wd

Read more
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Bengaluru (Bangalore), Mumbai, Pune, Chennai, Gurugram
5.6 - 7 yrs
₹10L - ₹28L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
PySpark
SQL

Job Summary:

As an AWS Data Engineer, you will be responsible for designing, developing, and maintaining scalable, high-performance data pipelines using AWS services. With 6+ years of experience, you’ll collaborate closely with data architects, analysts, and business stakeholders to build reliable, secure, and cost-efficient data infrastructure across the organization.

Key Responsibilities:

  • Design, develop, and manage scalable data pipelines using AWS Glue, Lambda, and other serverless technologies
  • Implement ETL workflows and transformation logic using PySpark and Python on AWS Glue
  • Leverage AWS Redshift for warehousing, performance tuning, and large-scale data queries
  • Work with AWS DMS and RDS for database integration and migration
  • Optimize data flows and system performance for speed and cost-effectiveness
  • Deploy and manage infrastructure using AWS CloudFormation templates
  • Collaborate with cross-functional teams to gather requirements and build robust data solutions
  • Ensure data integrity, quality, and security across all systems and processes
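The serverless data-processing responsibility above can be sketched as a Lambda-style handler in pure Python. The event shape and field names are assumptions for illustration, not an actual AWS trigger contract:

```python
import json

def handler(event, context=None):
    """Validate and reshape incoming records; return counts for monitoring."""
    ok, bad = [], []
    for rec in event.get("records", []):
        try:
            body = json.loads(rec["body"])
            ok.append({"id": int(body["id"]), "value": float(body["value"])})
        except (KeyError, ValueError, TypeError):
            # malformed records are counted, not allowed to fail the batch
            bad.append(rec)
    return {"processed": len(ok), "failed": len(bad), "rows": ok}
```

Returning processed/failed counts from the handler is what lets CloudWatch-style alerting catch data-quality regressions without inspecting payloads.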

Required Skills & Experience:

  • 6+ years of experience in Data Engineering with strong AWS expertise
  • Proficient in Python and PySpark for data processing and ETL development
  • Hands-on experience with AWS Glue, Lambda, DMS, RDS, and Redshift
  • Strong SQL skills for building complex queries and performing data analysis
  • Familiarity with AWS CloudFormation and infrastructure as code principles
  • Good understanding of serverless architecture and cost-optimized design
  • Ability to write clean, modular, and maintainable code
  • Strong analytical thinking and problem-solving skills


Read more
Bengaluru (Bangalore), Pune, Chennai
5 - 12 yrs
₹5L - ₹25L / yr
PySpark
Automation
SQL

Skill Name: ETL Automation Testing

Location: Bangalore, Chennai and Pune

Experience: 5+ Years


Required:

Experience in ETL Automation Testing

Strong experience in PySpark.

Read more
NeoGenCode Technologies Pvt Ltd
Pune
8 - 15 yrs
₹5L - ₹24L / yr
Data engineering
Snow flake schema
SQL
ETL
ELT
+5 more

Job Title : Data Engineer – Snowflake Expert

Location : Pune (Onsite)

Experience : 10+ Years

Employment Type : Contractual

Mandatory Skills : Snowflake, Advanced SQL, ETL/ELT (Snowpipe, Tasks, Streams), Data Modeling, Performance Tuning, Python, Cloud (preferably Azure), Security & Data Governance.


Job Summary :

We are seeking a seasoned Data Engineer with deep expertise in Snowflake to design, build, and maintain scalable data solutions.

The ideal candidate will have a strong background in data modeling, ETL/ELT, SQL optimization, and cloud data warehousing principles, with a passion for leveraging Snowflake to drive business insights.

Responsibilities :

  • Collaborate with data teams to optimize and enhance data pipelines and models on Snowflake.
  • Design and implement scalable ELT pipelines with performance and cost-efficiency in mind.
  • Ensure high data quality, security, and adherence to governance frameworks.
  • Conduct code reviews and align development with best practices.
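A hedged illustration of the Snowpipe/Streams/Tasks style of ELT named in the mandatory skills: generating Stream and Task DDL from Python. `SYSTEM$STREAM_HAS_DATA` is a real Snowflake function; the object names, schedule, and target table here are placeholders, and a real deployment would execute these via the Snowflake connector with proper roles and warehouses:

```python
def stream_task_ddl(table, task_name, warehouse, schedule="5 MINUTE"):
    """Build Snowflake DDL for a change stream plus a task that drains it."""
    stream = f"CREATE OR REPLACE STREAM {table}_stream ON TABLE {table};"
    task = (
        f"CREATE OR REPLACE TASK {task_name} "
        f"WAREHOUSE = {warehouse} SCHEDULE = '{schedule}' "
        # only run when the stream actually holds new change rows
        f"WHEN SYSTEM$STREAM_HAS_DATA('{table.upper()}_STREAM') "
        f"AS INSERT INTO {table}_hist SELECT * FROM {table}_stream;"
    )
    return stream, task
```

Gating the task on `SYSTEM$STREAM_HAS_DATA` is the usual cost-efficiency trick: the warehouse never spins up for an empty stream.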

Qualifications :

  • Bachelor’s in Computer Science, Data Science, IT, or related field.
  • Snowflake certifications (Pro/Architect) preferred.
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vishakha Walunj
Posted by Vishakha Walunj
Bengaluru (Bangalore), Pune, Mumbai
7 - 12 yrs
Best in industry
PySpark
databricks
SQL
skill iconPython

Required Skills:

  • Hands-on experience with Databricks, PySpark
  • Proficiency in SQL, Python, and Spark.
  • Understanding of data warehousing concepts and data modeling.
  • Experience with CI/CD pipelines and version control (e.g., Git).
  • Fundamental knowledge of any cloud services, preferably Azure or GCP.


Good to Have:

  • Bigquery
  • Experience with performance tuning and data governance.


Read more
Solidatus
Pune
6 - 8 yrs
₹0.5L - ₹0.5L / yr
skill iconJava
skill iconSpring Boot
skill iconNodeJS (Node.js)
Databases
SQL
+6 more

Competitive Salary


About Solidatus


At Solidatus, we empower organizations to connect and visualize their data relationships, making it easier to identify, access, and understand their data. Our metadata management technology helps businesses establish a sustainable data foundation, ensuring they meet regulatory requirements, drive digital transformation, and unlock valuable insights. 

 

We’re experiencing rapid growth—backed by HSBC, Citi, and AlbionVC, we secured £14 million in Series A funding in 2021. Our achievements include recognition in the Deloitte UK Technology Fast 50, multiple A-Team Innovation Awards, and a top 1% place to work ranking from The Financial Technologist.

 

Now is an exciting time to join us as we expand internationally and continue shaping the future of data management. 


About the Engineering Team


Engineering is the heart of Solidatus. Our team of world-class engineers, drawn from outstanding computer science and technical backgrounds, plays a critical role in crafting the powerful, elegant solutions that set us apart. We thrive on solving challenging visualization and data management problems, building technology that delights users and drives real-world impact for global enterprises.

As Solidatus expands its footprint, we are scaling our capabilities with a focus on building world-class connectors and integrations to extend the reach of our platform. Our engineers are trusted with the freedom to explore, innovate, and shape the product’s future — all while working in a collaborative, high-impact environment. Here, your code doesn’t just ship — it empowers some of the world's largest and most complex organizations to achieve their data ambitions.


Who We Are & What You’ll Do


Join our Data Integration team and help shape the way data flows! 


Your Mission:


To expand and refine our suite of out-of-the-box integrations, using our powerful API and SDK to bring in metadata for visualisation from a vast range of sources including databases with diverse SQL dialects.

But that is just the beginning. At our core, we are problem-solvers and innovators. You’ll have the chance to:

  • Design intuitive layouts representing the flow of data across complex deployments of diverse technologies
  • Design and optimize API connectivity and parsers reading metadata from source systems
  • Explore new paradigms for representing data lineage
  • Enhance our data ingestion capabilities to handle massive volumes of data
  • Dig deep into data challenges to build smarter, more scalable solutions

Beyond engineering, you’ll collaborate with users, troubleshoot tricky issues, streamline development workflows, and contribute to a culture of continuous improvement.


What We’re Looking For


  • We don’t believe in sticking to a single tech stack just for the sake of it. We’re engineers first, and we pick the best tools for the job. More than ticking off a checklist, we value mindset, curiosity, and problem-solving skills.
  • You’re quick to learn and love diving into new technologies
  • You push for excellence and aren’t satisfied with “just okay”
  • You can break down complex topics in a way that anyone can understand
  • You should have 6–8 years of proven experience in developing, and delivering high-quality, scalable software solutions 
  • You should be a strong self-starter with the ability to take ownership of tasks and drive them to completion with minimal supervision.
  • You should be able to mentor junior developers, perform code reviews, and ensure adherence to best practices in software engineering.


Tech & Skills We’d Love to See


Must-have:

  • Strong hands-on experience with Java, Spring Boot RESTful APIs, and Node.js
  • Solid knowledge of databases, SQL dialects, and data structures


Nice-to-have:

  • Experience with C#, ASP.NET Core, TypeScript, React.js, or similar frameworks
  • Bonus points for data experience—we love data wizards


If you’re passionate about engineering high-impact solutions, playing with cutting-edge tech, and making data work smarter, we’d love to have you on board!

Read more
DeepIntent

at DeepIntent

2 candid answers
17 recruiters
Indrajeet Deshmukh
Posted by Indrajeet Deshmukh
Pune
4 - 10 yrs
Best in industry
skill iconPython
Spark
Apache Airflow
skill iconDocker
SQL
+2 more

What You’ll Do:


As a Data Scientist, you will work closely across DeepIntent Analytics teams located in New York City, India, and Bosnia. The role will support internal and external business partners in defining patient and provider audiences, and generating analyses and insights related to measurement of campaign outcomes, Rx, patient journey, and supporting evolution of DeepIntent product suite. Activities in this position include creating and scoring audiences, reading campaign results, analyzing medical claims, clinical, demographic and clickstream data, performing analysis and creating actionable insights, summarizing, and presenting results and recommended actions to internal stakeholders and external clients, as needed.

  • Explore ways to create better audiences
  • Analyze medical claims, clinical, demographic and clickstream data to produce and present actionable insights 
  • Explore ways of using inference, statistical, machine learning techniques to improve the performance of existing algorithms and decision heuristics
  • Design and deploy new iterations of production-level code
  • Contribute posts to our upcoming technical blog  

Who You Are:

  • Bachelor’s degree in a STEM field, such as Statistics, Mathematics, Engineering, Biostatistics, Econometrics, Economics, Finance, OR, or Data Science. Graduate degree is strongly preferred 
  • 3+ years of working experience as Data Analyst, Data Engineer, Data Scientist in digital marketing, consumer advertisement, telecom, or other areas requiring customer level predictive analytics
  • Background in either data engineering or analytics
  • Hands-on technical experience is required; proficiency in performing statistical analysis in Python, including relevant libraries, is a must
  • You have an advanced understanding of the ad-tech ecosystem, digital marketing and advertising data and campaigns or familiarity with the US healthcare patient and provider systems (e.g. medical claims, medications)
  • Experience in programmatic, DSP related, marketing predictive analytics, audience segmentation or audience behaviour analysis or medical / healthcare experience
  • You have varied and hands-on predictive machine learning experience (deep learning, boosting algorithms, inference) 
  • Familiarity with data science tools such as XGBoost, PyTorch, and Jupyter, and strong LLM user experience (developer/API experience is a plus)
  • You are interested in translating complex quantitative results into meaningful findings and interpretable deliverables, and communicating with less technical audiences orally and in writing


Read more
Deqode

at Deqode

1 recruiter
Mokshada Solanki
Posted by Mokshada Solanki
Bengaluru (Bangalore), Mumbai, Pune, Gurugram
4 - 5 yrs
₹4L - ₹20L / yr
SQL
skill iconAmazon Web Services (AWS)
Migration
PySpark
ETL

Job Summary:

Seeking a seasoned SQL + ETL Developer with 4+ years of experience in managing large-scale datasets and cloud-based data pipelines. The ideal candidate is hands-on with MySQL, PySpark, AWS Glue, and ETL workflows, with proven expertise in AWS migration and performance optimization.


Key Responsibilities:

  • Develop and optimize complex SQL queries and stored procedures to handle large datasets (100+ million records).
  • Build and maintain scalable ETL pipelines using AWS Glue and PySpark.
  • Work on data migration tasks in AWS environments.
  • Monitor and improve database performance; automate key performance indicators and reports.
  • Collaborate with cross-functional teams to support data integration and delivery requirements.
  • Write shell scripts for automation and manage ETL jobs efficiently.
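For the 100-million-record scale mentioned above, batching with keyset (seek) pagination is a common pattern, since `OFFSET` degrades linearly on large tables. A hypothetical sketch — table and column names are invented, and the SQL here is built for illustration rather than executed:

```python
def batch_query(table, key, last_seen, limit=50_000):
    """Keyset pagination: seek past the last key instead of using OFFSET."""
    return (
        f"SELECT * FROM {table} WHERE {key} > {last_seen} "
        f"ORDER BY {key} LIMIT {limit};"
    )

def paginate(fetch, start=0, limit=50_000):
    """Call fetch(last_key, limit) repeatedly until it returns no rows."""
    last, total = start, 0
    while True:
        rows = fetch(last, limit)
        if not rows:
            return total
        total += len(rows)
        last = rows[-1]  # rows are ordered by key; remember where we stopped
    # each batch is a fresh index seek, so cost stays flat across the table
```

Because every batch starts from an indexed `WHERE key > last` seek, batch N is as cheap as batch 1, which is what makes full-table reprocessing of very large datasets tractable.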


Required Skills:

  • Strong experience with MySQL, complex SQL queries, and stored procedures.
  • Hands-on experience with AWS Glue, PySpark, and ETL processes.
  • Good understanding of AWS ecosystem and migration strategies.
  • Proficiency in shell scripting.
  • Strong communication and collaboration skills.


Nice to Have:

  • Working knowledge of Python.
  • Experience with AWS RDS.



Read more
Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Bengaluru (Bangalore), Pune, Chennai, Mumbai, Gurugram
5 - 7 yrs
₹5L - ₹19L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
PySpark
SQL
redshift

Profile: AWS Data Engineer

Mode: Hybrid

Experience: 5–7 years

Locations - Bengaluru, Pune, Chennai, Mumbai, Gurugram


Roles and Responsibilities

  • Design and maintain ETL pipelines using AWS Glue and Python/PySpark
  • Optimize SQL queries for Redshift and Athena
  • Develop Lambda functions for serverless data processing
  • Configure AWS DMS for database migration and replication
  • Implement infrastructure as code with CloudFormation
  • Build optimized data models for performance
  • Manage RDS databases and AWS service integrations
  • Troubleshoot and improve data processing efficiency
  • Gather requirements from business stakeholders
  • Implement data quality checks and validation
  • Document data pipelines and architecture
  • Monitor workflows and implement alerting
  • Keep current with AWS services and best practices
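One bullet above covers Lambda functions for serverless data processing. The sketch below shows only the handler shape; the event format and field names are illustrative, not a real AWS trigger payload:

```python
import json

def handler(event, context):
    """Minimal Lambda-style handler: validate and reshape incoming
    records before they are loaded downstream (e.g. into Redshift).
    Bad records are counted rather than failing the whole batch."""
    cleaned, rejected = [], []
    for rec in event.get("records", []):
        try:
            row = {"id": int(rec["id"]),
                   "amount": round(float(rec["amount"]), 2)}
            cleaned.append(row)
        except (KeyError, TypeError, ValueError):
            rejected.append(rec)
    return {"statusCode": 200,
            "body": json.dumps({"loaded": len(cleaned),
                                "rejected": len(rejected)})}

result = handler({"records": [{"id": "1", "amount": "9.996"},
                              {"id": "x", "amount": "3"}]}, None)
print(result["body"])  # → {"loaded": 1, "rejected": 1}
```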


Required Technical Expertise:

  • Python/PySpark for data processing
  • AWS Glue for ETL operations
  • Redshift and Athena for data querying
  • AWS Lambda and serverless architecture
  • AWS DMS and RDS management
  • CloudFormation for infrastructure
  • SQL optimization and performance tuning
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Pune, Mumbai, Bengaluru (Bangalore), Chennai
4 - 7 yrs
₹5L - ₹15L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
PySpark
AWS Glue
Amazon Redshift
+1 more

Job Overview:

We are seeking an experienced AWS Data Engineer to join our growing data team. The ideal candidate will have hands-on experience with AWS Glue, Redshift, PySpark, and other AWS services to build robust, scalable data pipelines. This role is perfect for someone passionate about data engineering, automation, and cloud-native development.

Key Responsibilities:

  • Design, build, and maintain scalable and efficient ETL pipelines using AWS Glue, PySpark, and related tools.
  • Integrate data from diverse sources and ensure its quality, consistency, and reliability.
  • Work with large datasets in structured and semi-structured formats across cloud-based data lakes and warehouses.
  • Optimize and maintain data infrastructure, including Amazon Redshift, for high performance.
  • Collaborate with data analysts, data scientists, and product teams to understand data requirements and deliver solutions.
  • Automate data validation, transformation, and loading processes to support real-time and batch data processing.
  • Monitor and troubleshoot data pipeline issues and ensure smooth operations in production environments.
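The data-validation bullet above often boils down to a small set of rule checks run before each load. A minimal sketch with illustrative field names (a real pipeline would run similar checks inside a Glue or Lambda step):

```python
def run_quality_checks(rows, required_fields, unique_key):
    """Flag rows that are missing required fields or that repeat the
    unique key; returns row indexes per failure type so the pipeline
    can quarantine them instead of loading bad data."""
    failures = {"missing": [], "duplicate": []}
    seen = set()
    for i, row in enumerate(rows):
        if any(row.get(f) in (None, "") for f in required_fields):
            failures["missing"].append(i)
            continue
        key = row[unique_key]
        if key in seen:
            failures["duplicate"].append(i)
        seen.add(key)
    return failures

rows = [
    {"id": 1, "name": "a"},
    {"id": 2, "name": ""},   # missing required value
    {"id": 1, "name": "c"},  # duplicate key
]
print(run_quality_checks(rows, ["id", "name"], "id"))
# → {'missing': [1], 'duplicate': [2]}
```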

Required Skills:

  • 5 to 7 years of hands-on experience in data engineering roles.
  • Strong proficiency in Python and PySpark for data transformation and scripting.
  • Deep understanding and practical experience with AWS Glue, AWS Redshift, S3, and other AWS data services.
  • Solid understanding of SQL and database optimization techniques.
  • Experience working with large-scale data pipelines and high-volume data environments.
  • Good knowledge of data modeling, warehousing, and performance tuning.

Preferred/Good to Have:

  • Experience with workflow orchestration tools like Airflow or Step Functions.
  • Familiarity with CI/CD for data pipelines.
  • Knowledge of data governance and security best practices on AWS.
Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Pune, Mumbai, Bengaluru (Bangalore), Gurugram
4 - 6 yrs
₹5L - ₹10L / yr
ETL
SQL
skill iconAmazon Web Services (AWS)
PySpark
KPI

Role - ETL Developer

Work Mode - Hybrid

Experience- 4+ years

Location - Pune, Gurgaon, Bengaluru, Mumbai

Required Skills - AWS, AWS Glue, PySpark, ETL, SQL

Required Skills:

  • 4+ years of hands-on experience in MySQL, including SQL query and stored procedure development
  • Experience in PySpark, AWS, and AWS Glue
  • Experience in AWS migration
  • Experience with automated scripting and tracking KPIs/metrics for database performance
  • Proficiency in shell scripting and ETL
  • Strong communication skills and a collaborative team player
  • Knowledge of Python and AWS RDS is a plus
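Tracking KPIs/metrics for database performance, as listed above, typically means reducing raw query timings to a few headline numbers. A stdlib-only sketch; the 500 ms slow-query threshold is an illustrative choice, not a standard:

```python
import statistics

def query_kpis(latencies_ms):
    """Summarise query latencies into common database-performance KPIs:
    average, p95 (nearest-rank method), and a slow-query count over an
    illustrative 500 ms threshold."""
    ordered = sorted(latencies_ms)
    rank = max(1, round(0.95 * len(ordered)))  # nearest-rank p95
    return {
        "avg_ms": round(statistics.mean(ordered), 1),
        "p95_ms": ordered[rank - 1],
        "slow_queries": sum(1 for v in ordered if v > 500),
    }

timings = [120, 80, 95, 700, 110, 90, 130, 85, 100, 105]
print(query_kpis(timings))
# → {'avg_ms': 161.5, 'p95_ms': 700, 'slow_queries': 1}
```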


Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Pune, Indore
4 - 6 yrs
₹10L - ₹18L / yr
skill iconAmazon Web Services (AWS)
skill iconJavascript
skill iconPHP
SQL

🚀 We’re Hiring: PHP Developer at Deqode

📍 Location: Pune (Hybrid)

🕒Experience: 4–6 Years

⏱️ Notice Period: Immediate Joiner


We're looking for a skilled PHP Developer to join our team. If you have a strong grasp of secure coding practices, are experienced in PHP upgrades, and thrive in a fast-paced deployment environment, we’d love to connect with you!


🔧 Key Skills:

- PHP | MySQL | JavaScript | Jenkins | Nginx | AWS


🔐 Security-Focused Responsibilities Include:

- Remediation of PenTest findings

- XSS mitigation (input/output sanitization)

- API rate limiting

- 2FA integration

- PHP version upgrade

- Use of AWS Secrets Manager

- Secure session and password policies
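API rate limiting from the list above is most often implemented as a token bucket. The idea is language-agnostic; the sketch below is Python with an injected clock for determinism, and the same logic would port directly to the PHP service:

```python
class TokenBucket:
    """Token-bucket limiter: each request spends one token; tokens
    refill at `rate` per second up to `capacity`. The clock is injected
    so behaviour is deterministic and easy to test."""

    def __init__(self, rate, capacity, clock):
        self.rate, self.capacity, self.clock = rate, capacity, clock
        self.tokens = capacity
        self.last = clock()

    def allow(self):
        now = self.clock()
        # refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

t = [0.0]  # fake clock we can advance by hand
bucket = TokenBucket(rate=1, capacity=2, clock=lambda: t[0])
print([bucket.allow() for _ in range(3)])  # → [True, True, False]
t[0] = 1.0                                 # one second later: one token back
print(bucket.allow())                      # → True
```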



Top tier global IT consulting company

Top tier global IT consulting company

Agency job
via AccioJob by AccioJob Hiring Board
Pune, Hyderabad, Gurugram, Chennai
0 - 1 yrs
₹11.1L - ₹11.1L / yr
Data Structures
Algorithms
Object Oriented Programming (OOPs)
SQL
Any programming language

AccioJob is conducting an exclusive diversity hiring drive with a reputed global IT consulting company for female candidates only.


Apply Here: https://links.acciojob.com/3SmQ0Bw


Key Details:

• Role: Application Developer

• CTC: ₹11.1 LPA

• Work Location: Pune, Chennai, Hyderabad, Gurgaon (Onsite)

• Required Skills: DSA, OOPs, SQL, and proficiency in any programming language


Eligibility Criteria:

• Graduation Year: 2024–2025

• Degree: B.E/B.Tech or M.E/M.Tech

• CS/IT branches: No prior experience required

• Non-CS/IT branches: Minimum 6 months of technical experience

• Minimum 60% in UG


Selection Process:

Offline Assessment at AccioJob Skill Center(s) in:

• Pune

• Hyderabad

• Noida

• Delhi

• Greater Noida


Further Rounds for Shortlisted Candidates Only:

• Coding Test

• Code Pairing Round

• Technical Interview

• Leadership Round


Note: Candidates must bring their own laptop & earphones for the assessment.


Apply Here: https://links.acciojob.com/3SmQ0Bw

ZeMoSo Technologies

at ZeMoSo Technologies

11 recruiters
Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Mumbai, Bengaluru (Bangalore), Hyderabad, Chennai, Pune
4 - 8 yrs
₹10L - ₹15L / yr
Data engineering
skill iconPython
SQL
Data Warehouse (DWH)
skill iconAmazon Web Services (AWS)
+3 more

Work Mode: Hybrid


Need B.Tech, BE, M.Tech, ME candidates - Mandatory



Must-Have Skills:

● Educational Qualification: B.Tech, BE, M.Tech, or ME in any field.

● Minimum of 3 years of proven experience as a Data Engineer.

● Strong proficiency in Python programming language and SQL.

● Experience in Databricks and in setting up and managing data pipelines and data warehouses/lakes.

● Good comprehension and critical thinking skills.


● Kindly note: the salary bracket will vary according to the candidate's experience -

- Experience from 4 yrs to 6 yrs - Salary up to 22 LPA

- Experience from 5 yrs to 8 yrs - Salary up to 30 LPA

- Experience more than 8 yrs - Salary up to 40 LPA

QAgile Services

at QAgile Services

1 recruiter
Radhika Chotai
Posted by Radhika Chotai
Pune
4 - 8 yrs
₹7L - ₹18L / yr
skill iconJava
skill iconJavascript
skill iconHTML/CSS
skill iconPostgreSQL
SQL
+8 more

Key Responsibilities would include: 


1. Design, develop, and maintain enterprise-level Java applications. 

2. Collaborate with cross-functional teams to gather and analyze requirements, and implement solutions. 

3. Develop & customize the application using HTML5, CSS, and jQuery to create dynamic and responsive user interfaces. 

4. Integrate with relational databases (RDBMS) to manage and retrieve data efficiently. 

5. Write clean, maintainable, and efficient code following best practices and coding standards. 

6. Participate in code reviews, debugging, and testing to ensure high-quality deliverables. 

7. Troubleshoot and resolve issues in existing applications and systems. 


Qualification requirement - 


1. 4 years of hands-on experience in Java/J2EE development, preferably with enterprise-level projects.

2. Spring Framework, including SOA, AOP, and Spring Security

3. Proficiency in web technologies including HTML5, CSS, jQuery, and JavaScript.

4. Experience with RESTful APIs and web services.

5. Knowledge of build tools like Maven or Gradle

6. Strong knowledge of relational databases (e.g., MySQL, PostgreSQL, Oracle) and experience with SQL.

7. Experience with version control systems like Git.

8. Understanding of software development lifecycle (SDLC) 

9. Strong problem-solving skills and attention to detail.

Wissen Technology

at Wissen Technology

4 recruiters
Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Pune, Ahmedabad
4 - 9 yrs
₹10L - ₹35L / yr
skill iconPython
pytest
skill iconAmazon Web Services (AWS)
Test Automation (QA)
SQL

At least 5 years of experience in testing and developing automation tests.

A minimum of 3 years of experience writing tests in Python, with a preference for experience in designing automation frameworks.

Experience in developing automation for big data testing, including data ingestion, data processing, and data migration, is highly desirable.

Familiarity with Playwright or other browser application testing frameworks is a significant advantage.

Proficiency in object-oriented programming and principles is required.

Extensive knowledge of AWS services is essential.

Strong expertise in REST API testing and SQL is required.

A solid understanding of testing and development life cycle methodologies is necessary.

Knowledge of the financial industry and trading systems is a plus.

Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Pune
3 - 5 yrs
₹6L - ₹15L / yr
Software Testing (QA)
SQL
TestNG
Selenium
Automation

Job Title: Sr. QA Engineer

Location: Pune (Baner)

Mode - Hybrid


Major Responsibilities:


  • Understand product requirements and design test plans/ test cases.
  • Collaborate with developers for discussing story design/ test cases/code walkthrough etc.
  • Design automation strategy for regression test cases.
  • Execute tests and collaborate with developers in case of issues.
  • Review unit test coverage/ enhance existing unit test coverage
  • Automate integration/end-to-end tests using JUnit/Mockito/Selenium/Cypress


Requirements: 


  • Experience of web application testing/ test automation
  • Good analytical skills
  • Exposure to test design techniques
  • Exposure to Agile Development methodology, Scrums
  • Should be able to read and understand code.
  • Review and understand unit test cases/ suggest additional unit-level coverage points.
  • Exposure to multi-tier web application deployment/architecture (SpringBoot)
  • Good exposure to SQL query language
  • Exposure to Configuration management tool for code investigation - GitHub
  • Exposure to Web Service / API testing
  • Cucumber – use case-driven test automation
  • System understanding, writing test cases from scratch, requirement analysis, thinking from a user perspective, test designing, and requirement analysis
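Among the test design techniques mentioned above, boundary-value analysis picks inputs at and just around each decision edge. A small sketch using Python's unittest against a hypothetical fee function (the stack here favours JUnit, where the structure is the same):

```python
import unittest

def shipping_fee(weight_kg):
    """Hypothetical function under test: flat fee bands by weight."""
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    if weight_kg <= 1:
        return 50
    if weight_kg <= 5:
        return 100
    return 200

class TestShippingFeeBoundaries(unittest.TestCase):
    # Boundary-value analysis: assert at each band edge and just past it.
    def test_band_edges(self):
        self.assertEqual(shipping_fee(1), 50)
        self.assertEqual(shipping_fee(1.01), 100)
        self.assertEqual(shipping_fee(5), 100)
        self.assertEqual(shipping_fee(5.01), 200)

    def test_invalid_weight(self):
        with self.assertRaises(ValueError):
            shipping_fee(0)

suite = unittest.TestLoader().loadTestsFromTestCase(TestShippingFeeBoundaries)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # → True
```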


Innominds

at Innominds

1 video
1 recruiter
Reshika Mendiratta
Posted by Reshika Mendiratta
Pune
5yrs+
Up to ₹35L / yr (varies)
skill iconJava
skill iconAmazon Web Services (AWS)
SQL
Internet of Things (IOT)
Spring
+1 more

In your role as Software Engineer/Lead, you will directly work with other developers, Product Owners, and Scrum Masters to evaluate and develop innovative solutions. The purpose of the role is to design, develop, test, and operate a complex set of applications or platforms in the IoT Cloud area.


The role involves the utilization of advanced tools and analytical methods for gathering facts to develop solution scenarios. The job holder needs to be able to execute quality code, review code, and collaborate with other developers.


We have an excellent mix of people, which we believe makes for a more vibrant, more innovative, and more productive team.


  • A bachelor’s degree, or master’s degree in information technology, computer science, or other relevant education
  • At least 5 years of experience as Software Engineer, in an enterprise context
  • Experience in design, development and deployment of large-scale cloud-based applications and services
  • Good knowledge in cloud (AWS) serverless application development, event driven architecture and SQL / No-SQL databases
  • Experience with IoT products, backend services and design principles
  • Good knowledge at least of one backend technology like node.js (JavaScript, TypeScript) or JVM (Java, Scala, Kotlin)
  • Passionate about code quality, security and testing
  • Microservice development experience with Java (Spring) is a plus
  • Good command of English in both Oral & Written


Xebia IT Architects

at Xebia IT Architects

2 recruiters
Vijay S
Posted by Vijay S
Bengaluru (Bangalore), Gurugram, Pune, Hyderabad, Chennai, Bhopal, Jaipur
10 - 15 yrs
₹30L - ₹40L / yr
Spark
Google Cloud Platform (GCP)
skill iconPython
Apache Airflow
PySpark
+1 more

We are looking for a Senior Data Engineer with strong expertise in GCP, Databricks, and Airflow to design and implement a GCP Cloud Native Data Processing Framework. The ideal candidate will work on building scalable data pipelines and help migrate existing workloads to a modern framework.


  • Shift: 2 PM – 11 PM
  • Work Mode: Hybrid (3 days a week) across Xebia locations
  • Notice Period: Immediate joiners or those with a notice period of up to 30 days


Key Responsibilities:

  • Design and implement a GCP Native Data Processing Framework leveraging Spark and GCP Cloud Services.
  • Develop and maintain data pipelines using Databricks and Airflow for transforming Raw → Silver → Gold data layers.
  • Ensure data integrity, consistency, and availability across all systems.
  • Collaborate with data engineers, analysts, and stakeholders to optimize performance.
  • Document standards and best practices for data engineering workflows.
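The Raw → Silver → Gold layering above is, at its core, a clean-then-aggregate sequence. A minimal plain-Python sketch with illustrative fields (in the actual stack these steps would be Databricks/PySpark jobs orchestrated by Airflow):

```python
def raw_to_silver(raw_rows):
    """Raw → Silver: drop malformed records, standardise types and values."""
    silver = []
    for r in raw_rows:
        try:
            silver.append({
                "region": r["region"].strip().lower(),
                "amount": float(r["amount"]),
            })
        except (KeyError, AttributeError, ValueError):
            continue  # malformed raw record, excluded from Silver
    return silver

def silver_to_gold(silver_rows):
    """Silver → Gold: aggregate cleaned records into business-ready totals."""
    gold = {}
    for r in silver_rows:
        gold[r["region"]] = round(gold.get(r["region"], 0.0) + r["amount"], 2)
    return gold

raw = [
    {"region": " North ", "amount": "10.5"},
    {"region": "north", "amount": "4.5"},
    {"region": "south", "amount": "bad"},  # dropped in Silver
]
gold = silver_to_gold(raw_to_silver(raw))
print(gold)  # → {'north': 15.0}
```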

Required Experience:


  • 7-8 years of experience in data engineering, architecture, and pipeline development.
  • Strong knowledge of GCP, Databricks, PySpark, and BigQuery.
  • Experience with Orchestration tools like Airflow, Dagster, or GCP equivalents.
  • Understanding of Data Lake table formats (Delta, Iceberg, etc.).
  • Proficiency in Python for scripting and automation.
  • Strong problem-solving skills and collaborative mindset.


⚠️ Please apply only if you have not applied recently or are not currently in the interview process for any open roles at Xebia.


Looking forward to your response!


Best regards,

Vijay S

Assistant Manager - TAG

https://www.linkedin.com/in/vijay-selvarajan/

Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Bengaluru (Bangalore), Pune, Gurugram, Chennai, Bhopal, Jaipur
5 - 10 yrs
₹15L - ₹24L / yr
Tableau
SQL

Job Description:

We are seeking a Tableau Developer with 5+ years of experience to join our Core Analytics team. The candidate will work on large-scale BI projects using Tableau and related tools.


Must Have:

  • Strong expertise in Tableau Desktop and Tableau Server, including add-ons like Data Management and Server Management.
  • Ability to interpret business requirements, build wireframes, and finalize KPIs, calculations, and designs.
  • Participate in design discussions to implement best practices for dashboards and reports.
  • Build scalable BI and Analytics products based on feedback while adhering to best practices.
  • Propose multiple solutions for a given problem, leveraging toolset functionality.
  • Optimize data sources and dashboards while ensuring business requirements are met.
  • Collaborate with product, platform, and program teams for timely delivery of dashboards and reports.
  • Provide suggestions and take feedback to deliver future-ready dashboards.
  • Peer review team members’ dashboards, offering constructive feedback to improve overall design.
  • Proficient in SQL, UI/UX practices, and Alation, with an understanding of good data models for reporting.
  • Mentor less experienced team members.


Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Pune
2 - 5 yrs
₹3L - ₹10L / yr
PySpark
skill iconAmazon Web Services (AWS)
AWS Lambda
SQL
Data engineering
+2 more


Here is the Job Description -

Location - Viman Nagar, Pune

Mode - 5 days working


Required Tech Skills:


 ● Strong at PySpark, Python

 ● Good understanding of Data Structure 

 ● Good at SQL query/optimization 

 ● Strong fundamentals of OOPs programming 

 ● Good understanding of AWS Cloud, Big Data. 

 ● Data Lake, AWS Glue, Athena, S3, Kinesis, SQL/NoSQL DB  


Jio Tesseract
TARUN MISHRA
Posted by TARUN MISHRA
Bengaluru (Bangalore), Pune, Hyderabad, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Navi Mumbai, Kolkata, Rajasthan
5 - 24 yrs
₹9L - ₹70L / yr
skill iconC
skill iconC++
Visual C++
Embedded C++
Artificial Intelligence (AI)
+32 more

JioTesseract, a digital arm of Reliance Industries, is India's leading and largest AR/VR organization with the mission to democratize mixed reality for India and the world. We make products at the intersection of hardware, software, content, and services, with a focus on making India the leader in spatial computing. We specialize in creating solutions in AR, VR, and AI, with notable products such as JioGlass, JioDive, 360 Streaming, Metaverse, and AR/VR headsets for the consumer and enterprise space.


Mon–Fri role, in office, with excellent perks and benefits!


Position Overview

We are seeking a Software Architect to lead the design and development of high-performance robotics and AI software stacks utilizing NVIDIA technologies. This role will focus on defining scalable, modular, and efficient architectures for robot perception, planning, simulation, and embedded AI applications. You will collaborate with cross-functional teams to build next-generation autonomous systems.


Key Responsibilities:

1. System Architecture & Design

● Define scalable software architectures for robotics perception, navigation, and AI-driven decision-making.

● Design modular and reusable frameworks that leverage NVIDIA’s Jetson, Isaac ROS, Omniverse, and CUDA ecosystems.

● Establish best practices for real-time computing, GPU acceleration, and edge AI inference.


2. Perception & AI Integration

● Architect sensor fusion pipelines using LIDAR, cameras, IMUs, and radar with DeepStream, TensorRT, and ROS2.

● Optimize computer vision, SLAM, and deep learning models for edge deployment on Jetson Orin and Xavier.

● Ensure efficient GPU-accelerated AI inference for real-time robotics applications.


3. Embedded & Real-Time Systems

● Design high-performance embedded software stacks for real-time robotic control and autonomy.

● Utilize NVIDIA CUDA, cuDNN, and TensorRT to accelerate AI model execution on Jetson platforms.

● Develop robust middleware frameworks to support real-time robotics applications in ROS2 and Isaac SDK.


4. Robotics Simulation & Digital Twins

● Define architectures for robotic simulation environments using NVIDIA Isaac Sim & Omniverse.

● Leverage synthetic data generation (Omniverse Replicator) for training AI models.

● Optimize sim-to-real transfer learning for AI-driven robotic behaviors.


5. Navigation & Motion Planning

● Architect GPU-accelerated motion planning and SLAM pipelines for autonomous robots.

● Optimize path planning, localization, and multi-agent coordination using Isaac ROS Navigation.

● Implement reinforcement learning-based policies using Isaac Gym.


6. Performance Optimization & Scalability

● Ensure low-latency AI inference and real-time execution of robotics applications.

● Optimize CUDA kernels and parallel processing pipelines for NVIDIA hardware.

● Develop benchmarking and profiling tools to measure software performance on edge AI devices.


Required Qualifications:

● Master’s or Ph.D. in Computer Science, Robotics, AI, or Embedded Systems.

● Extensive experience (7+ years) in software development, with at least 3-5 years focused on architecture and system design, especially for robotics or embedded systems.

● Expertise in CUDA, TensorRT, DeepStream, PyTorch, TensorFlow, and ROS2.

● Experience in NVIDIA Jetson platforms, Isaac SDK, and GPU-accelerated AI.

● Proficiency in programming languages such as C++, Python, or similar, with deep understanding of low-level and high-level design principles.

● Strong background in robotic perception, planning, and real-time control.

● Experience with cloud-edge AI deployment and scalable architectures.


Preferred Qualifications

● Hands-on experience with NVIDIA DRIVE, NVIDIA Omniverse, and Isaac Gym

● Knowledge of robot kinematics, control systems, and reinforcement learning

● Expertise in distributed computing, containerization (Docker), and cloud robotics

● Familiarity with automotive, industrial automation, or warehouse robotics

● Experience designing architectures for autonomous systems or multi-robot systems.

● Familiarity with cloud-based solutions, edge computing, or distributed computing for robotics

● Experience with microservices or service-oriented architecture (SOA)

● Knowledge of machine learning and AI integration within robotic systems

● Knowledge of testing on edge devices with HIL and simulations (Isaac Sim, Gazebo, V-REP etc.)
