
50+ Python Jobs in Pune | Python Job openings in Pune

Apply to 50+ Python Jobs in Pune on CutShort.io. Explore the latest Python Job opportunities across top companies like Google, Amazon & Adobe.

NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Pune
6 - 10 yrs
₹12L - ₹23L / yr
Machine Learning (ML)
Deep Learning
Natural Language Processing (NLP)
Computer Vision
Data engineering
+8 more

Job Title : AI Architect

Location : Pune (On-site | 3 Days WFO)

Experience : 6+ Years

Shift : US or flexible shifts


Job Summary :

We are looking for an experienced AI Architect to design and deploy AI/ML solutions that align with business goals.

The role involves leading end-to-end architecture, model development, deployment, and integration using modern AI/ML tools and cloud platforms (AWS/Azure/GCP).


Key Responsibilities :

  • Define AI strategy and identify business use cases
  • Design scalable AI/ML architectures
  • Collaborate on data preparation, model development & deployment
  • Ensure data quality, governance, and ethical AI practices
  • Integrate AI into existing systems and monitor performance

Must-Have Skills :

  • Machine Learning, Deep Learning, NLP, Computer Vision
  • Data Engineering, Model Deployment (CI/CD, MLOps)
  • Python Programming, Cloud (AWS/Azure/GCP)
  • Distributed Systems, Data Governance
  • Strong communication & stakeholder collaboration

Good to Have :

  • AI certifications (Azure/GCP/AWS)
  • Experience in big data and analytics
Blitzy

Posted by Eman Khan
Pune
6 - 10 yrs
₹40L - ₹70L / yr
Python
Django
Flask
FastAPI
Google Cloud Platform (GCP)
+1 more

Requirements

  • 7+ years of experience with Python
  • Strong expertise in Python frameworks (Django, Flask, or FastAPI)
  • Experience with GCP, Terraform, and Kubernetes
  • Deep understanding of REST API development and GraphQL
  • Strong knowledge of SQL and NoSQL databases
  • Experience with microservices architecture
  • Proficiency with CI/CD tools (Jenkins, CircleCI, GitLab)
  • Experience with container orchestration using Kubernetes
  • Understanding of cloud architecture and serverless computing
  • Experience with monitoring and logging solutions
  • Strong background in writing unit and integration tests
  • Familiarity with AI/ML concepts and integration points
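The requirements above lean heavily on REST API development in Python. As a rough illustration of that kind of work, here is a minimal sketch of a JSON endpoint written as a bare WSGI callable (stdlib only, no Django/Flask/FastAPI assumed installed); the route and payload are hypothetical, not from the listing:

```python
import json

# Minimal WSGI application exposing one JSON endpoint.
# The /health route and its payload are illustrative only.
def app(environ, start_response):
    if environ.get("PATH_INFO") == "/health" and environ.get("REQUEST_METHOD") == "GET":
        body = json.dumps({"status": "ok"}).encode("utf-8")
        status = "200 OK"
    else:
        body = json.dumps({"error": "not found"}).encode("utf-8")
        status = "404 Not Found"
    start_response(status, [("Content-Type", "application/json"),
                            ("Content-Length", str(len(body)))])
    return [body]

# The callable can be exercised directly, without starting a server:
def call(path, method="GET"):
    captured = {}
    def start_response(status, headers):
        captured["status"] = status
    body = b"".join(app({"PATH_INFO": path, "REQUEST_METHOD": method}, start_response))
    return captured["status"], json.loads(body)
```

Calling `call("/health")` returns the 200 status and decoded JSON body, which is also how framework test clients (Django's, Flask's) exercise an app in unit tests.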


Responsibilities

  • Design and develop scalable backend services for our AI platform
  • Architect and implement complex systems with high reliability
  • Build and maintain APIs for internal and external consumption
  • Work closely with AI engineers to integrate ML functionality
  • Optimize application performance and resource utilization
  • Make architectural decisions that balance immediate needs with long-term scalability
  • Mentor junior engineers and promote best practices
  • Contribute to the evolution of our technical standards and processes
Wissen Technology

Posted by Anurag Sinha
Pune, Mumbai, Bengaluru (Bangalore)
5 - 9 yrs
Best in industry
Python
RESTful APIs
Flask
Kubernetes
DevOps
+2 more
  • 5+ years of experience
  • Flask and REST API development experience
  • Proficiency in Python programming
  • Basic knowledge of front-end development
  • Basic knowledge of data manipulation and analysis libraries
  • Code versioning and collaboration (Git)
  • Knowledge of libraries for extracting data from websites
  • Knowledge of SQL and NoSQL databases
  • Familiarity with RESTful APIs
  • Familiarity with cloud (Azure/AWS) technologies


Global Consulting and Services

Agency job via AccioJob
Pune
0 - 1 yrs
₹3L - ₹6L / yr
MS-Excel
Python
pandas
NumPy
SQL

AccioJob is conducting a Walk-In Hiring Drive with Global Consulting and Services for the position of Python Automation Engineer.


To apply, register and select your slot here: https://go.acciojob.com/b7BZZZ


Required Skills: Excel, Python, Pandas, NumPy, SQL
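Drives like this typically screen on small data-wrangling tasks with the listed stack. A minimal sketch of the SQL side using only the stdlib `sqlite3` module (pandas is not assumed installed here; the table and figures are invented for illustration):

```python
import sqlite3

# Hypothetical sales table; names and amounts are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("West", 120.0), ("East", 80.0), ("West", 30.0)])

# Group-by aggregation: the SQL counterpart of
# pandas' df.groupby("region")["amount"].sum()
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
```

`rows` comes back as `[("East", 80.0), ("West", 150.0)]`, the same result a pandas groupby would produce.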


Eligibility:

  • Degree: BTech./BE, MTech./ME, BCA, MCA, BSc., MSc
  • Branch: All
  • Graduation Year: 2023, 2024, 2025


Work Details:

  • Work Location: Pune (Onsite)
  • CTC: 3 LPA to 6 LPA


Evaluation Process:

Round 1: Offline Assessment at AccioJob Pune Centre

Further Rounds (for shortlisted candidates only):

Profile & Background Screening Round,

Technical Interview 1

Technical Interview 2

Tech+Managerial Round


Important Note: Bring your laptop & earphones for the test.


Register here: https://go.acciojob.com/b7BZZZ

Or, apply through our newly launched app: https://go.acciojob.com/4wvBDe

Tekdi Technologies Pvt. Ltd.
Posted by Anuja Gangurde
Pune
1 - 3 yrs
₹3L - ₹6L / yr
Machine Learning (ML)
Generative AI
Artificial Intelligence (AI)
Python
Large Language Models (LLM)

We are looking for a highly skilled AI/ML/Gen AI Data Scientist with expertise in Generative AI, Machine Learning, Deep Learning, and Natural Language Processing (NLP). The ideal candidate should have a strong foundation in Python-based AI frameworks and experience in developing, deploying, and optimizing AI models for real-world applications.


Key Responsibilities:

  • Develop and implement AI/ML models.
  • Work with Deep Learning architectures like Transformers (BERT, GPT, LLaMA) and CNNs/RNNs.
  • Fine-tune and optimize Large Language Models (LLMs) for various applications.
  • Design and train custom Machine Learning models using structured and unstructured data.
  • Leverage NLP techniques such as text summarization and Named Entity Recognition (NER).
  • Implement ML pipelines and deploy models in cloud environments (AWS/GCP/Azure).
  • Collaborate with cross-functional teams to integrate AI-driven solutions into business applications.
  • Stay updated with the latest AI advancements and apply innovative techniques to improve model performance.
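The responsibilities mention text summarization. As a rough, model-free illustration of the idea, here is a frequency-based extractive summarizer in plain Python (no transformer models assumed; real work on this role would use the LLM stacks named above):

```python
import re
from collections import Counter

def summarize(text: str, n_sentences: int = 1) -> str:
    """Pick the n highest-scoring sentences by word frequency (extractive)."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    # Score each sentence by the summed corpus frequency of its words.
    def score(s: str) -> int:
        return sum(freq[w] for w in re.findall(r"[a-z']+", s.lower()))

    chosen = set(sorted(sentences, key=score, reverse=True)[:n_sentences])
    # Emit chosen sentences in their original order.
    return " ".join(s for s in sentences if s in chosen)
```

For example, `summarize("Cats sleep a lot. Cats chase mice and cats purr. Dogs bark.")` returns the middle sentence, since it accumulates the highest word-frequency score.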


Required Skills & Qualifications:

  • 1 to 3 years of experience in AI/ML, Deep Learning, and Generative AI.
  • Strong proficiency in Python and ML frameworks like TensorFlow, PyTorch, Hugging Face, and Scikit-learn.
  • Hands-on experience with NLP models, including BERT, GPT, T5, LLaMA, and Stable Diffusion.
  • Expertise in data preprocessing, feature engineering, and model evaluation.
  • Experience with MLOps, cloud-based AI deployment, and containerization (Docker, Kubernetes).
  • Knowledge of vector databases and retrieval-augmented generation (RAG) techniques.
  • Ability to fine-tune LLMs and work with prompt engineering.
  • Strong problem-solving skills and ability to work in agile environments.


Educational Requirements:

Bachelor's, Master's, or PhD in Computer Science/Artificial Intelligence/Information Technology


Preferred Skills (Good to Have):

  • Experience with Reinforcement Learning (RLHF) and multi-modal AI.
  • Familiarity with AutoML, Responsible AI (RAI), and AI Ethics.
  • Exposure to Graph Neural Networks (GNNs) and time-series forecasting.
  • Contributions to open-source AI projects or research papers.


Data Axle

Posted by Eman Khan
Pune
5 - 9 yrs
Best in industry
C++
Docker
Kubernetes
ECS
Amazon Web Services (AWS)
+9 more

We are looking for a Senior Software Engineer to join our team and contribute to key business functions. The ideal candidate will bring relevant experience, strong problem-solving skills, and a collaborative mindset.


Responsibilities:

  • Design, build, and maintain high-performance systems using modern C++
  • Architect and implement containerized services using Docker, with orchestration via Kubernetes or ECS
  • Build, monitor, and maintain data ingestion, transformation, and enrichment pipelines
  • Deep understanding of cloud platforms (preferably AWS) and hands-on experience in deploying and managing applications in the cloud
  • Implement and maintain modern CI/CD pipelines, ensuring seamless integration, testing, and delivery
  • Participate in system design, peer code reviews, and performance tuning


Qualifications:

  • 5+ years of software development experience, with strong command over modern C++
  • Deep understanding of cloud platforms (preferably AWS) and hands-on experience in deploying and managing applications in the cloud.
  • Experience with Apache Airflow for orchestrating complex data workflows.
  • Experience with EKS (Elastic Kubernetes Service) for managing containerized workloads.
  • Proven expertise in designing and managing robust data pipelines & Microservices.
  • Proficient in building and scaling data processing workflows and working with structured/unstructured data
  • Strong hands-on experience with Docker, container orchestration, and microservices architecture
  • Working knowledge of CI/CD practices, Git, and build/release tools
  • Strong problem-solving, debugging, and cross-functional collaboration skills


This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.

Wissen Technology

Posted by Chetna Jain
Bengaluru (Bangalore), Pune, Mumbai, Chennai
2 - 8 yrs
Best in industry
Robotic process automation (RPA)
Automation Anywhere
Python
SQL

We are looking for a skilled Automation Anywhere Engineer with a strong background in RPA development, Python scripting, and experience with CoPilot integrations. The ideal candidate will play a key role in designing, developing, and implementing automation solutions to streamline business processes and improve operational efficiency.


Required Skills:

  • 2–6 years of hands-on experience in Automation Anywhere (A2019 or higher).
  • Strong programming skills in Python for automation and integration.
  • Good understanding of RPA concepts, lifecycle, and best practices.
  • Experience working with CoPilot (Microsoft Power Platform/AI CoPilot or equivalent).
  • Knowledge of API integration and web services (REST/SOAP).
  • Familiarity with process analysis and design techniques.
  • Ability to write clean, reusable, and well-documented code.
  • Strong problem-solving and communication skills.
Deqode

Posted by Sneha Jain
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad
3.5 - 9 yrs
₹3L - ₹13L / yr
Python
Amazon Web Services (AWS)
AWS Lambda
Django
Amazon S3

Job Summary:

We are looking for a skilled and motivated Python AWS Engineer to join our team. The ideal candidate will have strong experience in backend development using Python, cloud infrastructure on AWS, and building serverless or microservices-based architectures. You will work closely with cross-functional teams to design, develop, deploy, and maintain scalable and secure applications in the cloud.

Key Responsibilities:

  • Develop and maintain backend applications using Python and frameworks like Django or Flask
  • Design and implement serverless solutions using AWS Lambda, API Gateway, and other AWS services
  • Develop data processing pipelines using services such as AWS Glue, Step Functions, S3, DynamoDB, and RDS
  • Write clean, efficient, and testable code following best practices
  • Implement CI/CD pipelines using tools like CodePipeline, GitHub Actions, or Jenkins
  • Monitor and optimize system performance and troubleshoot production issues
  • Collaborate with DevOps and front-end teams to integrate APIs and cloud-native services
  • Maintain and improve application security and compliance with industry standards
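The serverless work described above centers on AWS Lambda behind API Gateway. A minimal handler sketch in the standard Lambda signature is shown below; the event shape follows the API Gateway proxy integration format, while the query parameter and response body are hypothetical. One advantage of this shape is that it can be unit-tested locally by calling the function directly:

```python
import json

def lambda_handler(event, context):
    """Minimal API Gateway proxy-style handler: echoes a greeting.

    The event shape follows the API Gateway proxy integration format;
    the query parameter and response body are illustrative only.
    """
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

Locally, `lambda_handler({"queryStringParameters": {"name": "Pune"}}, None)` returns a 200 response whose body decodes to `{"message": "hello, Pune"}`.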

Required Skills:

  • Strong programming skills in Python
  • Solid understanding of AWS cloud services (Lambda, S3, EC2, DynamoDB, RDS, IAM, API Gateway, CloudWatch, etc.)
  • Experience with infrastructure as code (e.g., CloudFormation, Terraform, or AWS CDK)
  • Good understanding of RESTful API design and microservices architecture
  • Hands-on experience with CI/CD, Git, and version control systems
  • Familiarity with containerization (Docker, ECS, or EKS) is a plus
  • Strong problem-solving and communication skills

Preferred Qualifications:

  • Experience with PySpark, Pandas, or data engineering tools
  • Working knowledge of Django, Flask, or other Python frameworks
  • AWS Certification (e.g., AWS Certified Developer – Associate) is a plus

Educational Qualification:

  • Bachelor's or Master’s degree in Computer Science, Engineering, or related field


Nirmitee.io

Posted by Gitashri K
Pune
5 - 7 yrs
₹5L - ₹12L / yr
MERN Stack
React Native
MongoDB
Python

Job description

Opportunity to work on cutting-edge tech pieces & build from scratch

Ensure seamless performance while handling large volumes of data without system slowdowns

Collaborate with cross-functional teams to meet business goals


Required Skills:

  • Frontend: ReactJS (Next.js is a must)
  • Backend: experience in Node.js, Python, or Java
  • Databases: SQL (must), MongoDB (nice to have)
  • Caching & Messaging: experience with Redis, Kafka, or Cassandra
  • Cloud certification is a bonus

Data Axle

Posted by Nikita Sinha
Pune
6 - 9 yrs
Up to ₹40L / yr (varies)
Python
Django
Flask
React.js
Angular (2+)
+1 more

General Summary:

The Senior Software Engineer will be responsible for designing, developing, testing, and maintaining full-stack solutions. This role involves hands-on coding (80% of time), performing peer code reviews, handling pull requests, and engaging in architectural discussions with stakeholders. You'll contribute to the development of large-scale, data-driven SaaS solutions using best practices like TDD, DRY, KISS, YAGNI, and SOLID principles. The ideal candidate is an experienced full-stack developer who thrives in a fast-paced, Agile environment.


Essential Job Functions:

  • Design, develop, and maintain scalable applications using Python and Django.
  • Build responsive and dynamic user interfaces using React and TypeScript.
  • Implement and integrate GraphQL APIs for efficient data querying and real-time updates.
  • Apply design patterns such as Factory, Singleton, Observer, Strategy, and Repository to ensure maintainable and scalable code.
  • Develop and manage RESTful APIs for seamless integration with third-party services.
  • Design, optimize, and maintain SQL databases like PostgreSQL, MySQL, and MSSQL.
  • Use version control systems (primarily Git) and follow collaborative workflows.
  • Work within Agile methodologies such as Scrum or Kanban, participating in daily stand-ups, sprint planning, and retrospectives.
  • Write and maintain unit tests, integration tests, and end-to-end tests, following Test-Driven Development (TDD).
  • Collaborate with cross-functional teams, including Product Managers, DevOps, and UI/UX Designers, to deliver high-quality products
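Among the patterns the role calls out (Factory, Singleton, Observer, Strategy, Repository), here is a minimal Observer-pattern sketch in Python; the event-bus class and the "user.created" topic are hypothetical names chosen for illustration:

```python
from typing import Any, Callable

class EventBus:
    """Tiny Observer-pattern sketch: subscribers register callbacks per topic."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Any], None]]] = {}

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic: str, payload: Any) -> None:
        # Notify every observer registered for this topic.
        for handler in self._subscribers.get(topic, []):
            handler(payload)

# Usage: a hypothetical audit log reacting to user events.
audit: list[str] = []
bus = EventBus()
bus.subscribe("user.created", lambda p: audit.append(f"created:{p}"))
bus.publish("user.created", "alice")
```

After the publish call, `audit` holds `["created:alice"]`. The same publish/subscribe decoupling underlies the event-driven tooling (Kafka, RabbitMQ) mentioned later in the listing.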


Essential functions are the basic job duties that an employee must be able to perform, with or without reasonable accommodation. The function is considered essential if the reason the position exists is to perform that function.


Supportive Job Functions:

  • Remain knowledgeable of new emerging technologies and their impact on internal systems.
  • Available to work on call when needed.
  • Perform other miscellaneous duties as assigned by management.


These tasks do not meet the Americans with Disabilities Act definition of essential job functions and usually equal 5% or less of time spent. However, these tasks still constitute important performance aspects of the job.


Skills

  • The ideal candidate must have strong proficiency in Python and Django, with a solid understanding of Object-Oriented Programming (OOP) principles. Expertise in JavaScript, TypeScript, and React is essential, along with hands-on experience in GraphQL for efficient data querying.
  • The candidate should be well-versed in applying design patterns such as Factory, Singleton, Observer, Strategy, and Repository to ensure scalable and maintainable code architecture.
  • Proficiency in building and integrating REST APIs is required, as well as experience working with SQL databases like PostgreSQL, MySQL, and MSSQL.
  • Familiarity with version control systems (especially Git) and working within Agile methodologies like Scrum or Kanban is a must.
  • The candidate should also have a strong grasp of Test-Driven Development (TDD) principles.
  • In addition to the above, it is good to have experience with Next.js for server-side rendering and static site generation, as well as knowledge of cloud infrastructure such as AWS or GCP.
  • Familiarity with NoSQL databases, CI/CD pipelines using tools like GitHub Actions or Jenkins, and containerization technologies like Docker and Kubernetes is highly desirable.
  • Experience with microservices architecture and event-driven systems (using tools like Kafka or RabbitMQ) is a plus, along with knowledge of caching technologies such as Redis or Memcached.
  • Understanding OAuth 2.0, JWT, and SSO authentication mechanisms, and adhering to API security best practices following OWASP guidelines, is beneficial.
  • Additionally, experience with Infrastructure as Code (IaC) tools like Terraform or CloudFormation, and familiarity with performance monitoring tools such as New Relic or Datadog, will be considered an advantage.


Abilities:

  • Ability to organize, prioritize, and handle multiple assignments on a daily basis.
  • Strong and effective interpersonal and communication skills.
  • Ability to interact professionally with a diverse group of clients and staff.
  • Must be able to work flexible hours on-site and remote.
  • Must be able to coordinate with other staff and provide technological leadership.
  • Ability to work in a complex, dynamic team environment with minimal supervision.
  • Must possess good organizational skills.


Education, Experience, and Certification:

  • Associate or bachelor's degree preferred (Computer Science, Engineering, etc.), but equivalent work experience in a technology-related area may substitute.
  • 2+ years of relevant experience required.
  • Experience using version control daily in a developer environment.
  • Experience with Python, JavaScript, and React is required.
  • Experience using rapid development frameworks like Django or Flask.
  • Experience using front end build tools.


Scope of Job:

  1. No direct reports.
  2. No supervisory responsibility.
  3. Consistent work week with minimal travel
  4. Errors may be serious, costly, and difficult to discover.
  5. Contact with others inside and outside the company is regular and frequent.
  6. Some access to confidential data.


Blitzy

Posted by Eman Khan
Pune
5+ yrs
₹11L - ₹30L / yr
Python
Selenium
Playwright
Git
Google Cloud Platform (GCP)
+2 more

About the role

We are looking for a Senior Automation Engineer to architect and implement automated testing frameworks that validate the runtime behavior of code generated by our AI platform. This role is critical in ensuring that our platform's output performs correctly in production environments. You'll work at the intersection of AI and quality assurance, creating innovative testing solutions that can validate AI-generated applications during actual execution.


What Success Looks Like

  • You architect and implement automated testing frameworks that validate the runtime behavior and performance of AI-generated applications
  • You develop intelligent test suites that can automatically assess application functionality in production environments
  • You create testing frameworks that can validate runtime behavior across multiple languages and frameworks
  • You establish quality metrics and testing protocols that measure real-world performance of generated applications
  • You build systems to automatically detect and flag runtime issues in deployed applications
  • You collaborate with our AI team to improve the platform based on runtime performance data
  • You implement automated integration and end-to-end testing that ensures generated applications work as intended in production
  • You develop metrics and monitoring systems to track runtime performance across different customer deployments


Areas of Ownership 

Our hiring process is designed for you to demonstrate deep expertise in automation testing with a focus on AI-powered systems.


Required Technical Experience:

  • 4+ years of experience with Selenium and automated testing frameworks
  • Strong expertise in Python (our primary automation language)
  • Experience with CI/CD tools (Jenkins, CircleCI, or similar)
  • Proficiency in version control systems (Git)
  • Experience testing distributed systems
  • Understanding of modern software development practices
  • Experience working with cloud platforms (GCP preferred)
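The role revolves around building automated test suites in Python. Since Selenium needs a live browser, here is a browser-free sketch of the same discipline: a hypothetical unit under test checked by a `unittest` suite that is run programmatically, the way a CI step would invoke it:

```python
import unittest

def normalize_status(code: int) -> str:
    """Hypothetical unit under test: maps HTTP codes to coarse outcomes."""
    if 200 <= code < 300:
        return "ok"
    if 400 <= code < 500:
        return "client_error"
    return "other"

class TestNormalizeStatus(unittest.TestCase):
    def test_success_range(self):
        self.assertEqual(normalize_status(204), "ok")

    def test_client_error_range(self):
        self.assertEqual(normalize_status(404), "client_error")

# Run the suite programmatically, as a CI pipeline step might.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestNormalizeStatus)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In a Selenium or Playwright framework the test methods would drive a browser instead of calling a function, but the suite structure and runner wiring stay the same.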


Ways to stand out

  • Experience with runtime monitoring and testing of distributed systems
  • Knowledge of performance testing and APM (Application Performance Monitoring)
  • Experience with end-to-end testing of complex applications
  • Background in developing testing systems for enterprise-grade applications
  • Understanding of distributed tracing and monitoring
  • Experience with chaos engineering and resilience testing
Cymetrix Software

Posted by Netra Shettigar
Noida, Bengaluru (Bangalore), Pune
6 - 9 yrs
₹10L - ₹18L / yr
Windows Azure
SQL Azure
SQL
Data Warehouse (DWH)
Data Analytics
+3 more

Hybrid work mode


(Azure) EDW: experience loading Star-schema data warehouses using framework architectures, including loading Type 2 dimensions; ingesting data from various sources (structured and semi-structured); hands-on experience ingesting via APIs into lakehouse architectures.

Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Gen 2 Storage, SQL (expert), Python (intermediate), Azure cloud services knowledge, data analysis (SQL), data warehousing, documentation (BRD, FRD, user story creation).

Tekdi Technologies Pvt. Ltd.
Posted by Tekdi Recruitment
Pune
4 - 5 yrs
Best in industry
NodeJS (Node.js)
React.js
Python

Job Title: Software Engineer (Node.js)

Experience: 4+ Years

Location: Pune

About the Role:

We are looking for a talented and experienced Node.js Developer with a minimum of 4 years of hands-on experience to join our dynamic team. In this role, you will design, develop, and maintain high-performance applications. You should be passionate about writing clean, efficient, and scalable code.

Key Responsibilities:

  • Develop and maintain secure, scalable, and high-performance server-side applications.
  • Implement authentication, authorization, and data protection measures across the backend.
  • Follow backend best practices in code structure, error handling, and system design.
  • Stay up to date with backend security trends and evolving best practices.


Mandatory skills:


  • Strong hands-on experience in Node.js development (4+ years).
  • Knowledge of security best practices in backend development (e.g., input validation and sanitization, secure data storage).
  • Familiarity with authentication and authorization methods such as JWT, OAuth2, or session-based auth.
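The mandatory skills include JWT-style authentication. Although this role is Node.js-centric, the signing idea is language-agnostic; below is a minimal Python sketch of an HMAC-signed token (not a full JWT implementation — no header segment or expiry claims; the secret and claims are hypothetical):

```python
import base64
import hashlib
import hmac
import json
from typing import Optional

SECRET = b"demo-secret"  # hypothetical; never hardcode real secrets

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(claims: dict) -> str:
    """JWT-style token: base64url(payload) + '.' + base64url(HMAC-SHA256 tag)."""
    payload = b64url(json.dumps(claims, sort_keys=True).encode())
    tag = hmac.new(SECRET, payload.encode(), hashlib.sha256).digest()
    return f"{payload}.{b64url(tag)}"

def verify(token: str) -> Optional[dict]:
    payload, _, tag = token.partition(".")
    expected = b64url(hmac.new(SECRET, payload.encode(), hashlib.sha256).digest())
    if not hmac.compare_digest(tag, expected):
        return None  # signature mismatch: reject the token
    padded = payload + "=" * (-len(payload) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))
```

A signed token round-trips through `verify`, while any tampering with the signature segment makes verification return `None`; real JWTs add a header, registered claims like `exp`, and library support (`jsonwebtoken` in Node.js, `PyJWT` in Python).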


Good to Have Skills:


  • Experience with React.js for building dynamic user interfaces.
  • Working knowledge of Python for scripting or backend tasks.


Qualifications

  • Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent practical experience).



Required Soft Skills:

  • Verbal Communication
  • Written Communication
  • Cooperation, Teamwork & Interpersonal Skills
  • Customer Focus & Business Acumen
  • Critical Thinking
  • Initiative, Accountability & Result Orientation
  • Learning and Continuous Improvement


Deqode

Posted by Roshni Maji
Pune, Bengaluru (Bangalore), Gurugram, Chennai
4 - 8 yrs
₹7L - ₹26L / yr
SRE
Reliability engineering
Amazon Web Services (AWS)
Python

Job Title: Site Reliability Engineer (SRE)

Experience: 4+ Years

Work Location: Bangalore / Chennai / Pune / Gurgaon

Work Mode: Hybrid or Onsite (based on project need)

Domain Preference: Candidates with past experience working in shoe/footwear retail brands (e.g., Nike, Adidas, Puma) are highly preferred.


🛠️ Key Responsibilities

  • Design, implement, and manage scalable, reliable, and secure infrastructure on AWS.
  • Develop and maintain Python-based automation scripts for deployment, monitoring, and alerting.
  • Monitor system performance, uptime, and overall health using tools like Prometheus, Grafana, or Datadog.
  • Handle incident response, root cause analysis, and ensure proactive remediation of production issues.
  • Define and implement Service Level Objectives (SLOs) and Error Budgets in alignment with business requirements.
  • Build tools to improve system reliability, automate manual tasks, and enforce infrastructure consistency.
  • Collaborate with development and DevOps teams to ensure robust CI/CD pipelines and safe deployments.
  • Conduct chaos testing and participate in on-call rotations to maintain 24/7 application availability.
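The responsibilities above include defining SLOs and error budgets. The underlying arithmetic is simple enough to sketch: the error budget is the downtime allowed by the SLO over a period, and the numbers below (99.9% over 30 days) are illustrative:

```python
def error_budget_minutes(slo: float, period_days: int = 30) -> float:
    """Allowed downtime in minutes for an availability SLO over a period.

    budget = (1 - SLO) * total minutes in the period
    """
    return (1.0 - slo) * period_days * 24 * 60

def budget_remaining(slo: float, downtime_minutes: float, period_days: int = 30) -> float:
    """Fraction of the error budget still unspent (negative means blown)."""
    budget = error_budget_minutes(slo, period_days)
    return (budget - downtime_minutes) / budget
```

A 99.9% SLO over 30 days yields a budget of 43.2 minutes; after 21.6 minutes of downtime, half the budget remains, which is the signal SRE teams use to slow or freeze risky deployments.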


Must-Have Skills

  • 4+ years of experience in Site Reliability Engineering or DevOps with a focus on reliability, monitoring, and automation.
  • Strong programming skills in Python (mandatory).
  • Hands-on experience with AWS cloud services (EC2, S3, Lambda, ECS/EKS, CloudWatch, etc.).
  • Expertise in monitoring and alerting tools like Prometheus, Grafana, Datadog, CloudWatch, etc.
  • Strong background in Linux-based systems and shell scripting.
  • Experience implementing infrastructure as code using tools like Terraform or CloudFormation.
  • Deep understanding of incident management, SLOs/SLIs, and postmortem practices.
  • Prior working experience in footwear/retail brands such as Nike or similar is highly preferred.


Atomic Loops

Agency job via AccioJob
Pune
0 - 4 yrs
₹4L - ₹5L / yr
Python
Artificial Intelligence (AI)
Machine Learning (ML)
Prompt engineering

AccioJob is conducting a Walk-In Hiring Drive with Atomic Loops for the position of AI/ML Developer Intern.


To apply, register, and select your slot here: https://go.acciojob.com/E8wPb8


Required Skills: Python, AI, Prompting, ML understanding


Eligibility: ALL

Degree: ALL

Branch: ALL

Graduation Year: 2019, 2020, 2021, 2022, 2023, 2024, 2025, 2026


Work Details:

  • Work Location: Pune (Onsite)
  • CTC: 4 LPA to 5 LPA


Evaluation Process:

Round 1: Offline Assessment at AccioJob Pune Centre

Further Rounds (for shortlisted candidates only):

Profile & Background Screening Round, Company Side Process

Company Side Process

2 rounds will be for the intern role, and 3 rounds will be for the full-time role (Virtual or Face-to-Face)


Important Note: Bring your laptop & earphones for the test.


Register here: https://go.acciojob.com/E8wPb8

Atomic Loops

Agency job via AccioJob
Pune
0 - 1 yrs
₹3.6L - ₹4.2L / yr
Python
Django
FastAPI

AccioJob is conducting a Walk-In Hiring Drive with Atomic Loops for the position of Python Developer Intern.


To apply, register and select your slot here: https://go.acciojob.com/Bg2vSq


Required Skills: Python, Django, FastAPI


Eligibility: ALL

Degree: ALL

Branch: ALL

Graduation Year: 2025, 2026


Work Details:

  • Work Location: Pune (Onsite)
  • CTC: 3.6 LPA to 4.2 LPA


Evaluation Process:

Round 1: Offline Assessment at AccioJob Pune Centre

Further Rounds (for shortlisted candidates only):

Profile & Background Screening Round, Interview Round 1, Interview Round 2


Important Note: Bring your laptop & earphones for the test.


Register here: https://go.acciojob.com/Bg2vSq

Hyderabad, Bengaluru (Bangalore), Mumbai, Delhi, Pune, Chennai
0 - 1 yrs
₹10L - ₹20L / yr
Python
Object Oriented Programming (OOPs)
JavaScript
Java
Data Structures
+1 more


About NxtWave


NxtWave is one of India’s fastest-growing ed-tech startups, reshaping the tech education landscape by bridging the gap between industry needs and student readiness. With prestigious recognitions such as Technology Pioneer 2024 by the World Economic Forum and Forbes India 30 Under 30, NxtWave’s impact continues to grow rapidly across India.

Our flagship on-campus initiative, NxtWave Institute of Advanced Technologies (NIAT), offers a cutting-edge 4-year Computer Science program designed to groom the next generation of tech leaders, located in Hyderabad’s global tech corridor.

Know more:

🌐 NxtWave | NIAT

About the Role

As a PhD-level Software Development Instructor, you will play a critical role in building India’s most advanced undergraduate tech education ecosystem. You’ll be mentoring bright young minds through a curriculum that fuses rigorous academic principles with real-world software engineering practices. This is a high-impact leadership role that combines teaching, mentorship, research alignment, and curriculum innovation.


Key Responsibilities

  • Deliver high-quality classroom instruction in programming, software engineering, and emerging technologies.
  • Integrate research-backed pedagogy and industry-relevant practices into classroom delivery.
  • Mentor students in academic, career, and project development goals.
  • Take ownership of curriculum planning, enhancement, and delivery aligned with academic and industry excellence.
  • Drive research-led content development, and contribute to innovation in teaching methodologies.
  • Support capstone projects, hackathons, and collaborative research opportunities with industry.
  • Foster a high-performance learning environment in classes of 70–100 students.
  • Collaborate with cross-functional teams for continuous student development and program quality.
  • Actively participate in faculty training, peer reviews, and academic audits.


Eligibility & Requirements

  • Ph.D. in Computer Science, IT, or a closely related field from a recognized university.
  • Strong academic and research orientation, preferably with publications or project contributions.
  • Prior experience in teaching/training/mentoring at the undergraduate/postgraduate level is preferred.
  • A deep commitment to education, student success, and continuous improvement.

Must-Have Skills

  • Expertise in Python, Java, JavaScript, and advanced programming paradigms.
  • Strong foundation in Data Structures, Algorithms, OOP, and Software Engineering principles.
  • Excellent communication, classroom delivery, and presentation skills.
  • Familiarity with academic content tools like Google Slides, Sheets, Docs.
  • Passion for educating, mentoring, and shaping future developers.

Good to Have

  • Industry experience or consulting background in software development or research-based roles.
  • Proficiency in version control systems (e.g., Git) and agile methodologies.
  • Understanding of AI/ML, Cloud Computing, DevOps, Web or Mobile Development.
  • A drive to innovate in teaching, curriculum design, and student engagement.

Why Join Us?

  • Be at the forefront of shaping India’s tech education revolution.
  • Work alongside IIT/IISc alumni, ex-Amazon engineers, and passionate educators.
  • Competitive compensation with strong growth potential.
  • Create impact at scale by mentoring hundreds of future-ready tech leaders.


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Praffull Shinde
Posted by Praffull Shinde
Pune, Mumbai, Bengaluru (Bangalore)
4 - 8 yrs
₹14L - ₹26L / yr
skill iconPython
PySpark
skill iconDjango
skill iconFlask
RESTful APIs
+3 more

Job title - Python developer

Exp – 4 to 6 years

Location – Pune/Mum/B’lore

 

PFB JD

Requirements:

  • Proven experience as a Python Developer
  • Strong knowledge of core Python and PySpark concepts
  • Experience with web frameworks such as Django or Flask
  • Good exposure to any cloud platform (GCP Preferred)
  • CI/CD exposure required
  • Solid understanding of RESTful APIs and how to build them
  • Experience working with databases like Oracle DB and MySQL
  • Ability to write efficient SQL queries and optimize database performance
  • Strong problem-solving skills and attention to detail
  • Strong SQL programming (stored procedures, functions)
  • Excellent communication and interpersonal skills

Roles and Responsibilities

  • Design, develop, and maintain data pipelines and ETL processes using PySpark
  • Work closely with data scientists and analysts to provide them with clean, structured data.
  • Optimize data storage and retrieval for performance and scalability.
  • Collaborate with cross-functional teams to gather data requirements.
  • Ensure data quality and integrity through data validation and cleansing processes.
  • Monitor and troubleshoot data-related issues to ensure data pipeline reliability.
  • Stay up to date with industry best practices and emerging technologies in data engineering.
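The responsibilities above boil down to an extract-transform-load loop. A minimal sketch in plain Python (production pipelines would use PySpark DataFrames; the field names here are hypothetical):

```python
import csv
import io

def extract(raw_csv):
    """Read raw records from a CSV source (an in-memory string here)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Validate and cleanse: drop rows missing an id, cast amounts to float."""
    clean = []
    for row in rows:
        if not row.get("id"):
            continue  # data-quality rule: id is mandatory
        clean.append({"id": row["id"], "amount": float(row["amount"])})
    return clean

def load(rows, sink):
    """Append cleansed rows to the target store (a list stands in for a table)."""
    sink.extend(rows)

raw = "id,amount\n1,10.5\n,99\n2,20.25\n"
warehouse = []
load(transform(extract(raw)), warehouse)
```

The same three stages map one-to-one onto a PySpark job, with `spark.read`, DataFrame transformations, and a writer in place of these helpers.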
Read more
Techno Comp
shravan c
Posted by shravan c
Pune
6 - 8 yrs
₹5L - ₹9L / yr
ADF
Azure Data Factory
skill iconPython
databricks


Job Title: Developer

Work Location: Pune, MH

Skills Required: Azure Data Factory

Experience Range in Required Skills: 6-8 Years

Job Description: Azure, ADF, Databricks, Python

Essential Skills: Azure, ADF, Databricks, Python

Desirable Skills: Azure, ADF, Databricks, Python

Read more
VyTCDC
Gobinath Sundaram
Posted by Gobinath Sundaram
Chennai, Bengaluru (Bangalore), Hyderabad, Pune, Mumbai
4 - 12 yrs
₹3.5L - ₹37L / yr
skill iconPython
AIML

Job Summary:

We are seeking a skilled Python Developer with a strong foundation in Artificial Intelligence and Machine Learning. You will be responsible for designing, developing, and deploying intelligent systems that leverage large datasets and cutting-edge ML algorithms to solve real-world problems.

Key Responsibilities:

  • Design and implement machine learning models using Python and libraries like TensorFlow, PyTorch, or Scikit-learn.
  • Perform data preprocessing, feature engineering, and exploratory data analysis.
  • Develop APIs and integrate ML models into production systems using frameworks like Flask or FastAPI.
  • Collaborate with data scientists, DevOps engineers, and backend teams to deliver scalable AI solutions.
  • Optimize model performance and ensure robustness in real-time environments.
  • Maintain clear documentation of code, models, and processes.

Required Skills:

  • Proficiency in Python and ML libraries (NumPy, Pandas, Scikit-learn, TensorFlow, PyTorch).
  • Strong understanding of ML algorithms (classification, regression, clustering, deep learning).
  • Experience with data pipeline tools (e.g., Airflow, Spark) and cloud platforms (AWS, Azure, or GCP).
  • Familiarity with containerization (Docker, Kubernetes) and CI/CD practices.
  • Solid grasp of RESTful API development and integration.

Preferred Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Data Science, or related field.
  • 2–5 years of experience in Python development with a focus on AI/ML.
  • Exposure to MLOps practices and model monitoring tools.
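As a small illustration of the preprocessing and feature-engineering work described above, here is z-score standardization in plain Python; in practice you would reach for `sklearn.preprocessing.StandardScaler`, which this sketch approximates:

```python
import math

def standardize(values):
    """Z-score scaling: (x - mean) / std, a common preprocessing step."""
    mean = sum(values) / len(values)
    # population standard deviation, as StandardScaler uses by default
    std = math.sqrt(sum((x - mean) ** 2 for x in values) / len(values))
    return [(x - mean) / std for x in values]

scaled = standardize([2.0, 4.0, 6.0, 8.0])
```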


Read more
DEMAND MEDIA BPM LLP

at DEMAND MEDIA BPM LLP

2 candid answers
Darshana Mate
Posted by Darshana Mate
Pune
1 - 5 yrs
₹2L - ₹6L / yr
SQL
PowerBI
skill iconPython

Job Purpose

Responsible for managing end-to-end database operations, ensuring data accuracy, integrity, and security across systems. The position plays a key role in driving data reliability, availability, and compliance with operational standards.


Key Responsibilities:

  • Collate audit reports from the QA team and structure data in accordance with Standard Operating Procedures (SOP).
  • Perform data transformation and validation for accuracy and consistency.
  • Upload processed datasets into SQL Server using SSIS packages.
  • Monitor and optimize database performance, identifying and resolving bottlenecks.
  • Perform regular backups, restorations, and recovery checks to ensure data continuity.
  • Manage user access and implement robust database security policies.
  • Oversee database storage allocation and utilization.
  • Conduct routine maintenance and support incident management, including root cause analysis and resolution.
  • Design and implement scalable database solutions and architecture.
  • Create and maintain stored procedures, views, and other database components.
  • Optimize SQL queries for performance and scalability.
  • Execute ETL processes and support seamless integration of multiple data sources.
  • Maintain data integrity and quality through validation and cleansing routines.
  • Collaborate with cross-functional teams on data solutions and project deliverables.

 

Educational Qualification: Any Graduate

Required Skills & Qualifications:

  • Proven experience with SQL Server or similar relational database platforms.
  • Strong expertise in SSIS, ETL processes, and data warehousing.
  • Proficiency in SQL/T-SQL, including scripting, performance tuning, and query optimization.
  • Experience in database security, user role management, and access control.
  • Familiarity with backup/recovery strategies and database maintenance best practices.
  • Strong analytical skills with experience working with large and complex datasets.
  • Solid understanding of data modeling, normalization, and schema design.
  • Knowledge of incident and change management processes.
  • Excellent communication and collaboration skills.
  • Experience with Python for data manipulation and automation is a strong plus.
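As an illustration of the "Python for data manipulation" point above, a minimal validation-and-cleansing pass of the kind run before an SSIS upload (the record fields are hypothetical):

```python
def cleanse(records, required=("case_id", "auditor")):
    """Drop records missing required fields and de-duplicate on case_id,
    keeping the first occurrence - a typical pre-load validation pass."""
    seen = set()
    out = []
    for rec in records:
        if any(not rec.get(field) for field in required):
            continue  # fails the required-field rule
        if rec["case_id"] in seen:
            continue  # duplicate key
        seen.add(rec["case_id"])
        out.append(rec)
    return out

audit_rows = [
    {"case_id": "A1", "auditor": "QA1"},
    {"case_id": "A1", "auditor": "QA2"},   # duplicate case
    {"case_id": "A2", "auditor": ""},      # missing auditor
    {"case_id": "A3", "auditor": "QA1"},
]
clean_rows = cleanse(audit_rows)
```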


Read more
QAgile Services

at QAgile Services

1 recruiter
Radhika Chotai
Posted by Radhika Chotai
Pune, Hyderabad
3 - 7 yrs
₹11L - ₹15L / yr
skill iconMachine Learning (ML)
skill iconPython
skill icongrafana
AWS CloudFormation
Terraform
+4 more

We are seeking a highly skilled and motivated MLOps Engineer with 3-5 years of experience to join our engineering team. The ideal candidate should possess a strong foundation in DevOps or software engineering principles with practical exposure to machine learning operational workflows. You will be instrumental in operationalizing ML systems, optimizing the deployment lifecycle, and strengthening the integration between data science and engineering teams.

Required Skills:

• Hands-on experience with MLOps platforms such as MLflow and Kubeflow.

• Proficiency in Infrastructure as Code (IaC) tools like Terraform or Ansible.

• Strong familiarity with monitoring and alerting frameworks (Prometheus, Grafana, Datadog, AWS CloudWatch).

• Solid understanding of microservices architecture, service discovery, and load balancing.

• Excellent programming skills in Python, with experience in writing modular, testable, and maintainable code.

• Proficient in Docker and container-based application deployments.

• Experience with CI/CD tools such as Jenkins or GitLab CI.

• Basic working knowledge of Kubernetes for container orchestration.

• Practical experience with cloud-based ML platforms such as AWS SageMaker, Databricks, or Google Vertex AI.



Good-to-Have Skills:

• Awareness of security practices specific to ML pipelines, including secure model endpoints and data protection.

• Experience with scripting languages like Bash or PowerShell for automation tasks.

• Exposure to database scripting and data integration pipelines.

Experience & Qualifications:

• 3-5+ years of experience in MLOps, Site Reliability Engineering (SRE), or Software Engineering roles.

• At least 2+ years of hands-on experience working on ML/AI systems in production settings.

Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by Jhansi Padiy
AnyWhareIndia, Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Indore, Kolkata
5 - 11 yrs
₹6L - ₹30L / yr
Snowflake
skill iconPython
PySpark
SQL

Role descriptions / Expectations from the Role

·        6-7 years of IT development experience with min 3+ years hands-on experience in Snowflake

·        Strong experience in building/designing the data warehouse or data lake, and data mart end-to-end implementation experience focusing on large enterprise scale and Snowflake implementations on any of the hyper scalers.

·        Strong experience with building productionized data ingestion and data pipelines in Snowflake

·        Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities

·        Should have good experience with Snowflake RBAC and data security.

·        Strong experience with Snowflake features, including newly released capabilities.

·        Should have good experience in Python/PySpark.

·        Should have experience in AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF)

·        Should have experience/knowledge of orchestration and scheduling tools like Airflow

·        Should have a good understanding of ETL or ELT processes and ETL tools.
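Zero-Copy Cloning and Time Travel, mentioned above, are plain SQL features. A small sketch that builds the statements from Python, as one might inside an orchestration task; in production these would be executed through the Snowflake connector, and the table names are illustrative:

```python
def clone_sql(source, target, minutes_ago=None):
    """Build a Zero-Copy Clone statement, optionally cloning the table
    as it existed N minutes ago via Time Travel's AT(OFFSET => ...)."""
    stmt = f"CREATE TABLE {target} CLONE {source}"
    if minutes_ago is not None:
        # Time Travel offsets are given in seconds, negative = in the past
        stmt += f" AT(OFFSET => -{minutes_ago * 60})"
    return stmt + ";"

plain = clone_sql("sales.orders", "sales.orders_backup")
historical = clone_sql("sales.orders", "sales.orders_1h_ago", minutes_ago=60)
```

The clone shares the source table's micro-partitions, so it is metadata-only and near-instant regardless of table size.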

Read more
Tata Consultancy Services
Hyderabad, Bengaluru (Bangalore), Chennai, Pune, Noida, Gurugram, Mumbai, Kolkata
5 - 8 yrs
₹7L - ₹20L / yr
Snowflake
skill iconPython
SQL Azure
Data Warehouse (DWH)
skill iconAmazon Web Services (AWS)

5+ years of IT development experience with min 3+ years hands-on experience in Snowflake

  • Strong experience in building/designing data warehouses, data lakes, and data marts, with end-to-end implementation experience focusing on large enterprise-scale Snowflake implementations on any of the hyperscalers.
  • Strong experience with building productionized data ingestion and data pipelines in Snowflake
  • Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities
  • Should have good experience with Snowflake RBAC and data security.
  • Strong experience with Snowflake features, including newly released capabilities.
  • Should have good experience in Python/PySpark.
  • Should have experience in AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF)
  • Should have experience/knowledge of orchestration and scheduling tools like Airflow
  • Should have a good understanding of ETL or ELT processes and ETL tools.

Read more
VyTCDC
Gobinath Sundaram
Posted by Gobinath Sundaram
Hyderabad, Bengaluru (Bangalore), Pune
6 - 11 yrs
₹8L - ₹26L / yr
skill iconData Science
skill iconPython
Large Language Models (LLM)
Natural Language Processing (NLP)

POSITION / TITLE: Data Science Lead

Location: Offshore – Hyderabad/Bangalore/Pune

Who are we looking for?

Individuals with 8+ years of experience implementing and managing data science projects. Excellent working knowledge of traditional machine learning and LLM techniques. 

The candidate must demonstrate the ability to navigate and advise on complex ML ecosystems from a model-building and evaluation perspective. Experience in the NLP and chatbot domains is preferred.

We acknowledge the job market is blurring the line between data roles: while software skills are necessary, the emphasis of this position is on data science skills, not on data-, ML- nor software-engineering.

Responsibilities:

· Lead data science and machine learning projects, contributing to model development, optimization and evaluation. 

· Perform data cleaning, feature engineering, and exploratory data analysis.  

· Translate business requirements into technical solutions, document and communicate project progress, manage non-technical stakeholders.

· Collaborate with other DS and engineers to deliver projects.

Technical Skills – Must have:

· Experience in and understanding of the natural language processing (NLP) and large language model (LLM) landscape.

· Proficiency with Python for data analysis, supervised & unsupervised learning ML tasks.

· Ability to translate complex machine learning problem statements into specific deliverables and requirements.

· Should have worked with major cloud platforms such as AWS, Azure or GCP.

· Working knowledge of SQL and no-SQL databases.

· Ability to create data and ML pipelines for more efficient and repeatable data science projects using MLOps principles.

· Keep abreast of new tools, algorithms, and techniques in machine learning and work to implement them in the organization.

· Strong understanding of evaluation and monitoring metrics for machine learning projects.

Technical Skills – Good to have:

· Track record of getting ML models into production

· Experience building chatbots.

· Experience with closed and open source LLMs.

· Experience with frameworks and technologies like scikit-learn, BERT, langchain, autogen…

· Certifications or courses in data science.

Education:

· Master’s/Bachelors/PhD Degree in Computer Science, Engineering, Data Science, or a related field. 

Process Skills:

· Understanding of Agile and Scrum methodologies.

· Ability to follow SDLC processes and contribute to technical documentation.  

Behavioral Skills :

· Self-motivated and capable of working independently with minimal management supervision.

· Well-developed design, analytical & problem-solving skills

· Excellent communication and interpersonal skills.  

· Excellent team player, able to work with virtual teams in several time zones.

Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Pune
3 - 6 yrs
₹15L - ₹21L / yr
skill iconPython
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
Large Language Models (LLM) tuning
Retrieval Augmented Generation (RAG)
+1 more
  • Strong AI/ML OR Software Developer Profile
  • Mandatory (Experience 1) - Must have 3+ YOE in Core Software Development (SDLC)
  • Mandatory (Experience 2) - Must have 2+ years of experience in AI/ML, preferably in the conversational AI domain (speech to text, text to speech, speech emotion recognition) or agentic AI systems.
  • Mandatory (Experience 3) - Must have hands-on experience in fine-tuning LLMs/SLMs, model optimization (quantization, distillation), and RAG
  • Mandatory (Experience 4) - Hands-on programming experience in Python, TensorFlow, PyTorch, and model APIs (Hugging Face, LangChain, OpenAI, etc.)


Read more
Deqode

at Deqode

1 recruiter
Apoorva Jain
Posted by Apoorva Jain
Bengaluru (Bangalore), Mumbai, Gurugram, Noida, Pune, Chennai, Nagpur, Indore, Ahmedabad, Kochi (Cochin), Delhi
3.5 - 8 yrs
₹4L - ₹15L / yr
skill iconGo Programming (Golang)
skill iconAmazon Web Services (AWS)
skill iconPython

Role Overview:


We are looking for a skilled Golang Developer with 3.5+ years of experience in building scalable backend services and deploying cloud-native applications using AWS. This is a key position that requires a deep understanding of Golang and cloud infrastructure to help us build robust solutions for global clients.


Key Responsibilities:

  • Design and develop backend services, APIs, and microservices using Golang.
  • Build and deploy cloud-native applications on AWS using services like Lambda, EC2, S3, RDS, and more.
  • Optimize application performance, scalability, and reliability.
  • Collaborate closely with frontend, DevOps, and product teams.
  • Write clean, maintainable code and participate in code reviews.
  • Implement best practices in security, performance, and cloud architecture.
  • Contribute to CI/CD pipelines and automated deployment processes.
  • Debug and resolve technical issues across the stack.


Required Skills & Qualifications:

  • 3.5+ years of hands-on experience with Golang development.
  • Strong experience with AWS services such as EC2, Lambda, S3, RDS, DynamoDB, CloudWatch, etc.
  • Proficient in developing and consuming RESTful APIs.
  • Familiar with Docker, Kubernetes or AWS ECS for container orchestration.
  • Experience with Infrastructure as Code (Terraform, CloudFormation) is a plus.
  • Good understanding of microservices architecture and distributed systems.
  • Experience with monitoring tools like Prometheus, Grafana, or ELK Stack.
  • Familiarity with Git, CI/CD pipelines, and agile workflows.
  • Strong problem-solving, debugging, and communication skills.


Nice to Have:

  • Experience with serverless applications and architecture (AWS Lambda, API Gateway, etc.)
  • Exposure to NoSQL databases like DynamoDB or MongoDB.
  • Contributions to open-source Golang projects or an active GitHub portfolio.


Read more
Pune, Mohali
4 - 6 yrs
₹5L - ₹11L / yr
skill iconPython
TensorFlow
PyTorch
skill iconMachine Learning (ML)
Spark
+3 more

Skill Sets:

  • Expertise in ML/DL, model lifecycle management, and MLOps (MLflow, Kubeflow)
  • Proficiency in Python, TensorFlow, PyTorch, Scikit-learn, and Hugging Face models
  • Strong experience in NLP, fine-tuning transformer models, and dataset preparation
  • Hands-on with cloud platforms (AWS, GCP, Azure) and scalable ML deployment (SageMaker, Vertex AI)
  • Experience in containerization (Docker, Kubernetes) and CI/CD pipelines
  • Knowledge of distributed computing (Spark, Ray), vector databases (FAISS, Milvus), and model optimization (quantization, pruning)
  • Familiarity with model evaluation, hyperparameter tuning, and model monitoring for drift detection

Roles and Responsibilities:

  • Design and implement end-to-end ML pipelines from data ingestion to production
  • Develop, fine-tune, and optimize ML models, ensuring high performance and scalability
  • Compare and evaluate models using key metrics (F1-score, AUC-ROC, BLEU, etc.)
  • Automate model retraining, monitoring, and drift detection
  • Collaborate with engineering teams for seamless ML integration
  • Mentor junior team members and enforce best practices
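Of the evaluation metrics listed above, F1 is the simplest to show from first principles. A self-contained sketch of binary F1 (in practice you would use `sklearn.metrics.f1_score`):

```python
def f1_score(y_true, y_pred):
    """Binary F1 = harmonic mean of precision and recall."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# 2 true positives, 1 false positive, 1 false negative
score = f1_score([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```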


Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Bengaluru (Bangalore), Mumbai, Gurugram, Pune, Hyderabad, Chennai
3 - 6 yrs
₹5L - ₹20L / yr
IBM Sterling Integrator Developer
IBM Sterling B2B Integrator
Shell Scripting
skill iconPython
SQL
+1 more

Job Title : IBM Sterling Integrator Developer

Experience : 3 to 5 Years

Locations : Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, Pune

Employment Type : Full-Time


Job Description :

We are looking for a skilled IBM Sterling Integrator Developer with 3–5 years of experience to join our team across multiple locations.

The ideal candidate should have strong expertise in IBM Sterling and integration, along with scripting and database proficiency.

Key Responsibilities :

  • Develop, configure, and maintain IBM Sterling Integrator solutions.
  • Design and implement integration solutions using IBM Sterling.
  • Collaborate with cross-functional teams to gather requirements and provide solutions.
  • Work with custom languages and scripting to enhance and automate integration processes.
  • Ensure optimal performance and security of integration systems.

Must-Have Skills :

  • Hands-on experience with IBM Sterling Integrator and associated integration tools.
  • Proficiency in at least one custom scripting language.
  • Strong command over Shell scripting, Python, and SQL (mandatory).
  • Good understanding of EDI standards and protocols is a plus.

Interview Process :

  • 2 Rounds of Technical Interviews.

Additional Information :

  • Open to candidates from Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, and Pune.
Read more
Client based at Pune location.

Client based at Pune location.

Agency job
Pune
5 - 10 yrs
₹15L - ₹25L / yr
Cloud Developer
skill iconAmazon Web Services (AWS)
large scale financial tracking system
grpc
cloudflare
+8 more

Minimum requirements

5+ years of industry software engineering experience (excluding internships and co-ops)

Strong coding skills in any programming language (we understand new languages can be learned on the job so our interview process is language agnostic)

Strong collaboration skills, can work across workstreams within your team and contribute to your peers’ success

Ability to thrive with a high level of autonomy and responsibility, with an entrepreneurial mindset

Interest in working as a generalist across varying technologies and stacks to solve problems and delight both internal and external users

Preferred Qualifications

Experience with large-scale financial tracking systems

Good understanding and practical knowledge in cloud based services (e.g. gRPC, GraphQL, Docker/Kubernetes, cloud services such as AWS, etc.)

Read more
Partner Company

Partner Company

Agency job
via AccioJob by AccioJobHiring Board
Hyderabad, Pune, Noida
0 - 0 yrs
₹5L - ₹6L / yr
SQL
MS-Excel
PowerBI
skill iconPython

AccioJob is conducting an offline hiring drive in partnership with Our Partner Company to hire Junior Business/Data Analysts for an internship with a Pre-Placement Offer (PPO) opportunity.


Apply, Register and select your Slot here: https://go.acciojob.com/69d3Wd


Job Description:

  • Role: Junior Business/Data Analyst (Internship + PPO)
  • Work Location: Hyderabad
  • Internship Stipend: 15,000 - 25,000/month
  • Internship Duration: 3 months
  • CTC on PPO: 5 LPA - 6 LPA

Eligibility Criteria:

  • Degree: Open to all academic backgrounds
  • Graduation Year: 2023, 2024, 2025

Required Skills:

  • Proficiency in SQL, Excel, Power BI, and basic Python
  • Strong analytical mindset and interest in solving business problems with data

Hiring Process:

  1. Offline Assessment at AccioJob Skill Centres (Hyderabad, Pune, Noida)
  2. 1 Assignment + 2 Technical Interviews (Virtual; In-person for Hyderabad candidates)

Note: Please bring your laptop and earphones for the test.


Register Here: https://go.acciojob.com/69d3Wd

Read more
Gameberry

at Gameberry

5 recruiters
Agency job
via AccioJob by AccioJobHiring Board
Hyderabad, Pune, Noida
0 - 1 yrs
₹10L - ₹15L / yr
DSA
Object Oriented Programming (OOPs)
skill iconJava
skill iconPython
skill iconGo Programming (Golang)

AccioJob is organizing an exclusive offline hiring drive in collaboration with GameBerry Labs for the role of Software Development Engineer 1 (SDE 1).


To Apply, Register and select your Slot here: https://go.acciojob.com/Zq2UnA


Job Description:

  • Role: SDE 1
  • Work Location: Bangalore
  • CTC: 10 LPA - 15 LPA

Eligibility Criteria:

  • Education: B.Tech, BE, BCA, MCA, M.Tech
  • Branches: Circuit Branches (CSE, ECE, IT, etc.)
  • Graduation Year:
  • 2024 (Minimum 9 months of experience)
  • 2025 (Minimum 3-6 months of experience)

Evaluation Process:

  1. Offline Assessment at AccioJob Skill Centres (Hyderabad, Bangalore, Pune, Noida)
  2. Technical Interviews (2 Rounds - Virtual for most; In-person for Bangalore candidates)

Note: Carry your laptop and earphones for the assessment.


Register Here: https://go.acciojob.com/Zq2UnA

Read more
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Bengaluru (Bangalore), Mumbai, Pune, Chennai, Gurugram
5.6 - 7 yrs
₹10L - ₹28L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
PySpark
SQL

Job Summary:

As an AWS Data Engineer, you will be responsible for designing, developing, and maintaining scalable, high-performance data pipelines using AWS services. With 6+ years of experience, you’ll collaborate closely with data architects, analysts, and business stakeholders to build reliable, secure, and cost-efficient data infrastructure across the organization.

Key Responsibilities:

  • Design, develop, and manage scalable data pipelines using AWS Glue, Lambda, and other serverless technologies
  • Implement ETL workflows and transformation logic using PySpark and Python on AWS Glue
  • Leverage AWS Redshift for warehousing, performance tuning, and large-scale data queries
  • Work with AWS DMS and RDS for database integration and migration
  • Optimize data flows and system performance for speed and cost-effectiveness
  • Deploy and manage infrastructure using AWS CloudFormation templates
  • Collaborate with cross-functional teams to gather requirements and build robust data solutions
  • Ensure data integrity, quality, and security across all systems and processes

Required Skills & Experience:

  • 6+ years of experience in Data Engineering with strong AWS expertise
  • Proficient in Python and PySpark for data processing and ETL development
  • Hands-on experience with AWS Glue, Lambda, DMS, RDS, and Redshift
  • Strong SQL skills for building complex queries and performing data analysis
  • Familiarity with AWS CloudFormation and infrastructure as code principles
  • Good understanding of serverless architecture and cost-optimized design
  • Ability to write clean, modular, and maintainable code
  • Strong analytical thinking and problem-solving skills


Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by susmitha o
Bengaluru (Bangalore), Pune, Kolkata
4 - 6 yrs
₹7L - ₹24L / yr
skill iconPython
skill iconAmazon Web Services (AWS)
NumPy
pandas

Key Technical Skillsets:

  • Design, develop, and maintain scalable applications using AWS services, Python, and Boto3.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Implement best practices for cloud architecture and application development.
  • Optimize applications for maximum speed and scalability.
  • Troubleshoot and resolve issues in development, test, and production environments.
  • Write clean, maintainable, and efficient code.
  • Participate in code reviews and contribute to team knowledge sharing.
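Troubleshooting production issues, as described above, often means handling transient AWS failures. A common pattern is retrying with exponential backoff, sketched here against a stubbed call rather than a real S3 client (Boto3 also ships built-in retry modes via `botocore.config.Config`):

```python
import time

def with_retries(call, max_attempts=4, base_delay=0.01):
    """Retry a flaky operation with exponential backoff between attempts."""
    for attempt in range(1, max_attempts + 1):
        try:
            return call()
        except ConnectionError:
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))  # 0.01, 0.02, 0.04...

attempts = {"n": 0}

def flaky_put_object():
    """Stand-in for e.g. s3_client.put_object: fails twice, then succeeds."""
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient network error")
    return "ok"

result = with_retries(flaky_put_object)
```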


Read more
InfoBeans

at InfoBeans

2 recruiters
Sanjana Thakur
Posted by Sanjana Thakur
Pune, Indore
7 - 13 yrs
₹12L - ₹35L / yr
skill iconPython
Automation
pytest
playwright

Python Automation Engineer -

JD : 


  • Engage with development teams to improve the quality of the application.
  • Provide in-depth technical mentoring across the test automation team.
  • Provide highly innovative solutions to automatically qualify the application.
  • Routinely exercise independent judgment in test automation methods, techniques and criteria for achieving objectives.

Experience/Exposure:

  • Mid-level programming skills in Python
  • Experience with UI driven test automation framework such as selenium, Playwright
  • Experience with CI/CD tool
  • Ability to troubleshoot complex software / hardware configuration problems
  • Strong analytical & problem solving, documentation, and communication skills
  • Passion for product quality and eagerness to learn new technologies.
  • Ability to function effectively in a fast-paced environment and manage continuously changing business needs. Excellent time management skills required.
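A staple of UI-driven test automation with Selenium or Playwright, as called for above, is the Page Object pattern. A sketch with a stubbed page so it runs without a browser; the selectors and class names are illustrative:

```python
class FakePage:
    """Stub standing in for a Playwright/Selenium page in this sketch."""
    def __init__(self):
        self.filled = {}
        self.clicked = []

    def fill(self, selector, value):
        self.filled[selector] = value

    def click(self, selector):
        self.clicked.append(selector)

class LoginPage:
    """Page Object: tests call intent-level methods instead of raw
    selectors, so UI changes are absorbed in one place."""
    USER, PASSWORD, SUBMIT = "#user", "#password", "#submit"

    def __init__(self, page):
        self.page = page

    def login(self, user, password):
        self.page.fill(self.USER, user)
        self.page.fill(self.PASSWORD, password)
        self.page.click(self.SUBMIT)

page = FakePage()
LoginPage(page).login("qa-bot", "s3cret")
```

Swapping `FakePage` for a real Playwright `page` object leaves the test code unchanged, which is the point of the pattern.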


Read more
TCS

TCS

Agency job
via Risk Resources LLP hyd by susmitha o
Hyderabad, Mumbai, kolkata, Pune, chennai
4 - 10 yrs
₹7L - ₹20L / yr
skill iconMachine Learning (ML)
MLOps
skill iconPython
NumPy
  • Design and implement cloud solutions, build MLOps on Azure
  • Build CI/CD pipeline orchestration with GitLab CI, GitHub Actions, CircleCI, Airflow, or similar tools
  • Review data science models; run code refactoring and optimization, containerization, deployment, versioning, and monitoring of model quality
  • Data science model testing, validation, and test automation
  • Deployment of code and pipelines across environments
  • Model performance metrics
  • Service performance metrics
  • Communicate with a team of data scientists, data engineers, and architects; document the processes


Read more
NeoGenCode Technologies Pvt Ltd
Pune
8 - 15 yrs
₹5L - ₹24L / yr
Data engineering
Snow flake schema
SQL
ETL
ELT
+5 more

Job Title : Data Engineer – Snowflake Expert

Location : Pune (Onsite)

Experience : 10+ Years

Employment Type : Contractual

Mandatory Skills : Snowflake, Advanced SQL, ETL/ELT (Snowpipe, Tasks, Streams), Data Modeling, Performance Tuning, Python, Cloud (preferably Azure), Security & Data Governance.


Job Summary :

We are seeking a seasoned Data Engineer with deep expertise in Snowflake to design, build, and maintain scalable data solutions.

The ideal candidate will have a strong background in data modeling, ETL/ELT, SQL optimization, and cloud data warehousing principles, with a passion for leveraging Snowflake to drive business insights.

Responsibilities :

  • Collaborate with data teams to optimize and enhance data pipelines and models on Snowflake.
  • Design and implement scalable ELT pipelines with performance and cost-efficiency in mind.
  • Ensure high data quality, security, and adherence to governance frameworks.
  • Conduct code reviews and align development with best practices.

Qualifications :

  • Bachelor’s in Computer Science, Data Science, IT, or related field.
  • Snowflake certifications (Pro/Architect) preferred.
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vishakha Walunj
Posted by Vishakha Walunj
Bengaluru (Bangalore), Pune, Mumbai
7 - 12 yrs
Best in industry
PySpark
databricks
SQL
skill iconPython

Required Skills:

  • Hands-on experience with Databricks, PySpark
  • Proficiency in SQL, Python, and Spark.
  • Understanding of data warehousing concepts and data modeling.
  • Experience with CI/CD pipelines and version control (e.g., Git).
  • Fundamental knowledge of any cloud services, preferably Azure or GCP.


Good to Have:

  • BigQuery
  • Experience with performance tuning and data governance.


Read more
Third Rock Techkno
Pune
5 - 7 yrs
₹10L - ₹15L / yr
ASP.NET
Entity Framework
skill iconC#
SQL server
skill iconAmazon Web Services (AWS)
+4 more

Required Qualifications:

  • 5+ years of professional software development experience.
  • Post-secondary degree in computer science, software engineering or related discipline, or equivalent working experience.
  • Development of distributed applications with Microsoft technologies: C# .NET/Core, SQL Server, Entity Framework.
  • Deep expertise with microservices architectures and design patterns.
  • Cloud Native AWS experience with services such as Lambda, SQS, RDS/Aurora, S3, Lex, and Polly.
  • Mastery of both Windows and Linux environments and their use in the development and management of complex distributed systems architectures.
  • Git source code repository and continuous integration tools.


Read more
Top tier global IT consulting company

Top tier global IT consulting company

Agency job
via AccioJob by AccioJobHiring Board
Hyderabad, Pune, Noida
0 - 0 yrs
₹11L - ₹11L / yr
Computer Networking
Linux administration
skill iconPython
Bash
Object Oriented Programming (OOPs)
+2 more

AccioJob is conducting a Walk-In Hiring Drive with a reputed global IT consulting company at AccioJob Skill Centres for the position of Infrastructure Engineer, specifically for female candidates.


To Apply, Register and select your Slot here: https://go.acciojob.com/kcYTAp


We will not consider your application if you do not register and select slot via the above link.


Required Skills: Linux, Networking, One scripting language among Python, Bash, and PowerShell, OOPs, Cloud Platforms (AWS, Azure)


Eligibility:


  • Degree: B.Tech/BE
  • Branch: CSE Core With Cloud Certification
  • Graduation Year: 2024 & 2025


Note: Only Female Candidates can apply for this job opportunity


Work Details:


  • Work Mode: Work From Office
  • Work Location: Bangalore & Coimbatore
  • CTC: 11.1 LPA


Evaluation Process:


  • Round 1: Offline Assessment at AccioJob Skill Centre in Noida, Pune, Hyderabad.


  • Further Rounds (for Shortlisted Candidates only)

 

  1. HackerRank Online Assessment
  2. Coding Pairing Interview
  3. Technical Interview
  4. Cultural Alignment Interview


Important Note: Please bring your laptop and earphones for the test.


Register here: https://go.acciojob.com/kcYTAp

Read more
Top tier global IT consulting company

Top tier global IT consulting company

Agency job
via AccioJob by AccioJob Hiring Board
Hyderabad, Pune, Noida
0 - 0 yrs
₹11L - ₹11L / yr
Python
MySQL
Big Data

AccioJob is conducting a Walk-In Hiring Drive with a reputed global IT consulting company at AccioJob Skill Centres for the position of Data Engineer, specifically for female candidates.


To Apply, Register and select your Slot here: https://go.acciojob.com/8p9ZXN


We will not consider your application if you do not register and select a slot via the above link.


Required Skills: Python, Database (MySQL), Big Data (Spark, Kafka)


Eligibility:


  • Degree: B.Tech/BE
  • Branch: CSE – AI & DS / AI & ML
  • Graduation Year: 2024 & 2025


Note: Only Female Candidates can apply for this job opportunity


Work Details:


  • Work Mode: Work From Office
  • Work Location: Bangalore & Coimbatore
  • CTC: 11.1 LPA


Evaluation Process:


  • Round 1: Offline Assessment at AccioJob Skill Centres in Noida, Pune, and Hyderabad.


  • Further Rounds (for Shortlisted Candidates only)

 

  1. HackerRank Online Assessment
  2. Coding Pairing Interview
  3. Technical Interview
  4. Cultural Alignment Interview


Important Note: Please bring your laptop and earphones for the test.


Register here: https://go.acciojob.com/8p9ZXN

Read more
TCS

TCS

Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, PAN India
5 - 10 yrs
₹10L - ₹25L / yr
Test Automation
Selenium
Java
Python
JavaScript

Test Automation Engineer Job Description

A Test Automation Engineer is responsible for designing, developing, and implementing automated testing solutions to ensure the quality and reliability of software applications. Here's a breakdown of the job:


Key Responsibilities

- Test Automation Framework: Design and develop test automation frameworks using tools like Selenium, Appium, or Cucumber.

- Automated Test Scripts: Create and maintain automated test scripts to validate software functionality, performance, and security.

- Test Data Management: Develop and manage test data, including data generation, masking, and provisioning.

- Test Environment: Set up and maintain test environments, including configuration and troubleshooting.

- Collaboration: Work with cross-functional teams, including development, QA, and DevOps to ensure seamless integration of automated testing.


Essential Skills

- Programming Languages: Proficiency in programming languages like Java, Python, or C#.

- Test Automation Tools: Experience with test automation tools like Selenium.

- Testing Frameworks: Knowledge of testing frameworks like TestNG, JUnit, or PyUnit.

- Agile Methodologies: Familiarity with Agile development methodologies and CI/CD pipelines.
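The framework-design responsibility above usually centres on the Page Object pattern. Below is a minimal illustrative sketch only: `FakeDriver`, `LoginPage`, and the locators are invented for the example, and the stub stands in for a real Selenium WebDriver so the code runs without a browser.

```python
# Minimal sketch of the Page Object pattern used in Selenium-style
# frameworks. FakeDriver is a stand-in for a real WebDriver; with
# Selenium installed you would pass webdriver.Chrome() instead.

class FakeDriver:
    """Stub that records interactions instead of driving a browser."""
    def __init__(self):
        self.fields = {}
        self.clicked = []

    def type_into(self, locator, text):
        self.fields[locator] = text

    def click(self, locator):
        self.clicked.append(locator)


class LoginPage:
    """Page object: locators and actions live here, not in the test."""
    USERNAME = "#username"
    PASSWORD = "#password"
    SUBMIT = "#submit"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.type_into(self.USERNAME, user)
        self.driver.type_into(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)


driver = FakeDriver()
LoginPage(driver).login("qa_user", "secret")
print(driver.clicked)  # ['#submit']
```

The point of the pattern is that tests call `login()` rather than repeating locators, so a UI change touches one class instead of every script.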

Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Pune, Bengaluru (Bangalore), Gurugram, Chennai, Mumbai
5 - 7 yrs
₹6L - ₹20L / yr
Amazon Web Services (AWS)
Amazon Redshift
AWS Glue
Python
PySpark

Position: AWS Data Engineer

Experience: 5 to 7 Years

Location: Bengaluru, Pune, Chennai, Mumbai, Gurugram

Work Mode: Hybrid (3 days work from office per week)

Employment Type: Full-time

About the Role:

We are seeking a highly skilled and motivated AWS Data Engineer with 5–7 years of experience in building and optimizing data pipelines, architectures, and data sets. The ideal candidate will have strong experience with AWS services including Glue, Athena, Redshift, Lambda, DMS, RDS, and CloudFormation. You will be responsible for managing the full data lifecycle from ingestion to transformation and storage, ensuring efficiency and performance.

Key Responsibilities:

  • Design, develop, and optimize scalable ETL pipelines using AWS Glue, Python/PySpark, and SQL.
  • Work extensively with AWS services such as Glue, Athena, Lambda, DMS, RDS, Redshift, CloudFormation, and other serverless technologies.
  • Implement and manage data lake and warehouse solutions using AWS Redshift and S3.
  • Optimize data models and storage for cost-efficiency and performance.
  • Write advanced SQL queries to support complex data analysis and reporting requirements.
  • Collaborate with stakeholders to understand data requirements and translate them into scalable solutions.
  • Ensure high data quality and integrity across platforms and processes.
  • Implement CI/CD pipelines and best practices for infrastructure as code using CloudFormation or similar tools.

Required Skills & Experience:

  • Strong hands-on experience with Python or PySpark for data processing.
  • Deep knowledge of AWS Glue, Athena, Lambda, Redshift, RDS, DMS, and CloudFormation.
  • Proficiency in writing complex SQL queries and optimizing them for performance.
  • Familiarity with serverless architectures and AWS best practices.
  • Experience in designing and maintaining robust data architectures and data lakes.
  • Ability to troubleshoot and resolve data pipeline issues efficiently.
  • Strong communication and stakeholder management skills.
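The pipeline responsibilities above follow a common extract-transform-load shape. The sketch below is a language-level illustration only: csv text and an in-memory sqlite3 table stand in for the S3 sources and Redshift targets a real Glue/PySpark job would use, and the data is invented.

```python
# Language-level sketch of the extract-transform-load pattern these
# pipelines implement. A real Glue job would read from S3 and write to
# Redshift; csv text and sqlite3 stand in so the shape is visible.
import csv
import io
import sqlite3

RAW = "order_id,amount\n1,250.0\n2,ignore\n3,99.5\n"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Drop rows that fail validation, cast the rest.
    out = []
    for r in rows:
        try:
            out.append((int(r["order_id"]), float(r["amount"])))
        except ValueError:
            continue
    return out

def load(rows, conn):
    conn.execute("CREATE TABLE orders (order_id INT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 349.5
```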


Read more
DeepIntent

at DeepIntent

2 candid answers
17 recruiters
Indrajeet Deshmukh
Posted by Indrajeet Deshmukh
Pune
4 - 10 yrs
Best in industry
Python
Spark
Apache Airflow
Docker
SQL
+2 more

What You’ll Do:


As a Data Scientist, you will work closely across DeepIntent Analytics teams located in New York City, India, and Bosnia. The role will support internal and external business partners in defining patient and provider audiences, and generating analyses and insights related to measurement of campaign outcomes, Rx, patient journey, and supporting evolution of DeepIntent product suite. Activities in this position include creating and scoring audiences, reading campaign results, analyzing medical claims, clinical, demographic and clickstream data, performing analysis and creating actionable insights, summarizing, and presenting results and recommended actions to internal stakeholders and external clients, as needed.

  • Explore ways to create better audiences
  • Analyze medical claims, clinical, demographic and clickstream data to produce and present actionable insights 
  • Explore ways of using inference, statistical, machine learning techniques to improve the performance of existing algorithms and decision heuristics
  • Design and deploy new iterations of production-level code
  • Contribute posts to our upcoming technical blog  

Who You Are:

  • Bachelor’s degree in a STEM field such as Statistics, Mathematics, Engineering, Biostatistics, Econometrics, Economics, Finance, Operations Research, or Data Science; a graduate degree is strongly preferred
  • 3+ years of working experience as a Data Analyst, Data Engineer, or Data Scientist in digital marketing, consumer advertising, telecom, or other areas requiring customer-level predictive analytics
  • Background in either data engineering or analytics
  • Hands-on technical experience is required; proficiency in performing statistical analysis in Python, including relevant libraries, is a must
  • You have an advanced understanding of the ad-tech ecosystem, digital marketing and advertising data and campaigns, or familiarity with the US healthcare patient and provider systems (e.g. medical claims, medications)
  • Experience in programmatic/DSP marketing predictive analytics, audience segmentation, or audience behaviour analysis, or medical/healthcare experience
  • You have varied and hands-on predictive machine learning experience (deep learning, boosting algorithms, inference)
  • Familiarity with data science tools such as XGBoost, PyTorch, and Jupyter, and strong LLM user experience (developer/API experience is a plus)
  • You are interested in translating complex quantitative results into meaningful findings and interpretable deliverables, and communicating with less technical audiences orally and in writing
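As a rough illustration of the campaign-measurement work described above, the snippet below compares conversion rates between an exposed audience and a control group and reports lift. The counts are invented for the example; a real analysis would add significance testing and covariate controls.

```python
# Toy illustration of one measurement task: comparing conversion rates
# between an exposed audience and a control group and reporting lift.
exposed = {"users": 10000, "conversions": 240}
control = {"users": 10000, "conversions": 200}

def rate(group):
    return group["conversions"] / group["users"]

lift = rate(exposed) / rate(control) - 1.0
print(f"exposed {rate(exposed):.2%}, control {rate(control):.2%}, lift {lift:+.1%}")
# exposed 2.40%, control 2.00%, lift +20.0%
```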


Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Bengaluru (Bangalore), Pune, Mumbai, Chennai, Gurugram
5 - 7 yrs
₹5L - ₹19L / yr
Python
PySpark
Amazon Web Services (AWS)
Amazon Redshift
+1 more

Position: AWS Data Engineer

Experience: 5 to 7 Years

Location: Bengaluru, Pune, Chennai, Mumbai, Gurugram

Work Mode: Hybrid (3 days work from office per week)

Employment Type: Full-time

About the Role:

We are seeking a highly skilled and motivated AWS Data Engineer with 5–7 years of experience in building and optimizing data pipelines, architectures, and data sets. The ideal candidate will have strong experience with AWS services including Glue, Athena, Redshift, Lambda, DMS, RDS, and CloudFormation. You will be responsible for managing the full data lifecycle from ingestion to transformation and storage, ensuring efficiency and performance.

Key Responsibilities:

  • Design, develop, and optimize scalable ETL pipelines using AWS Glue, Python/PySpark, and SQL.
  • Work extensively with AWS services such as Glue, Athena, Lambda, DMS, RDS, Redshift, CloudFormation, and other serverless technologies.
  • Implement and manage data lake and warehouse solutions using AWS Redshift and S3.
  • Optimize data models and storage for cost-efficiency and performance.
  • Write advanced SQL queries to support complex data analysis and reporting requirements.
  • Collaborate with stakeholders to understand data requirements and translate them into scalable solutions.
  • Ensure high data quality and integrity across platforms and processes.
  • Implement CI/CD pipelines and best practices for infrastructure as code using CloudFormation or similar tools.

Required Skills & Experience:

  • Strong hands-on experience with Python or PySpark for data processing.
  • Deep knowledge of AWS Glue, Athena, Lambda, Redshift, RDS, DMS, and CloudFormation.
  • Proficiency in writing complex SQL queries and optimizing them for performance.
  • Familiarity with serverless architectures and AWS best practices.
  • Experience in designing and maintaining robust data architectures and data lakes.
  • Ability to troubleshoot and resolve data pipeline issues efficiently.
  • Strong communication and stakeholder management skills.
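To make the "advanced SQL queries" requirement above concrete, here is one common analytical shape: picking the latest event per key with a window function. This is a sketch only — sqlite3 stands in for Redshift/Athena, the table is invented, and `ROW_NUMBER() OVER (...)` is ANSI SQL that works on all three engines (SQLite 3.25+).

```python
# Sketch of an analytical window-function query: latest event per user.
# sqlite3 stands in for Redshift/Athena; the syntax is ANSI SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id INT, ts TEXT, action TEXT);
    INSERT INTO events VALUES
        (1, '2024-01-01', 'view'),
        (1, '2024-01-03', 'purchase'),
        (2, '2024-01-02', 'view');
""")
rows = conn.execute("""
    SELECT user_id, action FROM (
        SELECT user_id, action,
               ROW_NUMBER() OVER (
                   PARTITION BY user_id ORDER BY ts DESC
               ) AS rn
        FROM events
    ) WHERE rn = 1
    ORDER BY user_id
""").fetchall()
print(rows)  # [(1, 'purchase'), (2, 'view')]
```

On Redshift, the same query benefits from a sort key on `(user_id, ts)`; on Athena, partitioning the underlying files by date limits the scan.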


Read more
hirezyai
Aardra Suresh
Posted by Aardra Suresh
Pune
3 - 20 yrs
₹20L - ₹50L / yr
Java
Python
Bash
PowerShell
Agile Environment
+2 more

KEY DUTIES

  • Independently own and resolve high-priority or complex customer issues with minimal supervision
  • Reproduce and analyze product defects using advanced troubleshooting techniques and tools
  • Collaborate with developers to identify root causes and drive timely resolution of defects
  • Identify trends in escalations and provide feedback to improve product quality and customer experience
  • Document investigation findings, root causes, and resolution steps clearly for both internal and external audiences
  • Contribute to knowledge base articles and process improvements to enhance team efficiency
  • Represent the escalation team in product reviews or defect triage meetings
  • Build subject matter expertise in specific products or components
  • Mentor and assist junior team members by reviewing their investigations and coaching through complex cases
  • Participate in Agile ceremonies and contribute to team planning and backlog refinement
  • Other duties as assigned

BASIC QUALIFICATIONS

  • Typically requires 3–6 years of technical experience in a support, development, or escalation role
  • Strong technical troubleshooting and root cause analysis skills
  • Proficient in debugging tools, logs, and test environments
  • Ability to independently manage multiple complex issues and drive them to closure
  • Experience working with cross-functional teams in a collaborative, Agile environment
  • Proficiency with relevant scripting or programming languages (e.g., Python, Bash, PowerShell, Java)
  • Exceptional written and verbal communication skills — especially when engaging with customers in critical or escalated situations
  • Demonstrated customer-first mindset with an emphasis on clarity, empathy, and follow-through
  • Proactive and detail-oriented, with the ability to document and communicate technical concepts clearly
  • Comfortable presenting findings or recommendations to both technical and non-technical stakeholders
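The debugging-from-logs skill above often starts with a quick triage script. The sketch below counts error signatures to spot the dominant failure in an escalation; the log lines are invented for the illustration.

```python
# Small example of log triage: counting error signatures to find the
# dominant failure class before digging into a reproduction.
import re
from collections import Counter

LOG = """\
2024-05-01 10:00:01 ERROR TimeoutError: upstream call exceeded 30s
2024-05-01 10:00:05 INFO  request served
2024-05-01 10:00:09 ERROR TimeoutError: upstream call exceeded 30s
2024-05-01 10:00:12 ERROR KeyError: 'tenant_id'
"""

errors = Counter(
    m.group(1)
    for line in LOG.splitlines()
    if (m := re.search(r"ERROR (\w+):", line))
)
print(errors.most_common(1))  # [('TimeoutError', 2)]
```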
Read more
Metron Security Private Limited
Prathamesh Shinde
Posted by Prathamesh Shinde
Pune
2 - 5 yrs
₹5L - ₹8L / yr
Python
Java
Node.js
React.js

Mandatory Skills

  • Able to efficiently design and implement software features.
  • Expertise in at least one object-oriented language or framework (Python, TypeScript, Java, Node.js, Angular, React.js, C#, C++).
  • Good knowledge of data structures and their correct usage.
  • Open to learn any new software development skill if needed for the project.
  • Alignment and utilisation of the core enterprise technology stacks and integration capabilities throughout the transition states.
  • Participate in planning, definition, and high-level design of the solution and exploration of solution alternatives.
  • Identify bottlenecks and bugs, and devise appropriate solutions.
  • Define, explore, and support the implementation of enablers to evolve solution intent, working directly with Agile teams to implement them.
  • Good knowledge of the implications of cyber security on production.
  • Experience architecting & estimating deep technical custom solutions & integrations.
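"Correct usage" of data structures, as asked for above, mostly means matching the structure to the access pattern. A small invented example: deduplicating a FIFO work queue with a set for O(1) membership and a deque for O(1) left pops, where a list would make both operations O(n).

```python
# Matching data structures to access patterns: set for membership,
# deque for a FIFO queue (list.pop(0) and `x in list` are both O(n)).
from collections import deque

seen = set()                  # O(1) average membership checks
queue = deque([3, 1, 3, 2])   # O(1) pops from the left

order = []
while queue:
    task = queue.popleft()
    if task in seen:
        continue
    seen.add(task)
    order.append(task)

print(order)  # [3, 1, 2]
```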


Read more
Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Bengaluru (Bangalore), Pune, Chennai, Mumbai, Gurugram
5 - 7 yrs
₹5L - ₹19L / yr
Amazon Web Services (AWS)
Python
PySpark
SQL
Redshift

Profile: AWS Data Engineer

Mode: Hybrid

Experience: 5–7 years

Locations - Bengaluru, Pune, Chennai, Mumbai, Gurugram


Roles and Responsibilities

  • Design and maintain ETL pipelines using AWS Glue and Python/PySpark
  • Optimize SQL queries for Redshift and Athena
  • Develop Lambda functions for serverless data processing
  • Configure AWS DMS for database migration and replication
  • Implement infrastructure as code with CloudFormation
  • Build optimized data models for performance
  • Manage RDS databases and AWS service integrations
  • Troubleshoot and improve data processing efficiency
  • Gather requirements from business stakeholders
  • Implement data quality checks and validation
  • Document data pipelines and architecture
  • Monitor workflows and implement alerting
  • Keep current with AWS services and best practices


Required Technical Expertise:

  • Python/PySpark for data processing
  • AWS Glue for ETL operations
  • Redshift and Athena for data querying
  • AWS Lambda and serverless architecture
  • AWS DMS and RDS management
  • CloudFormation for infrastructure
  • SQL optimization and performance tuning
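The serverless item above can be sketched with a Lambda handler that unpacks an S3 event. The `(event, context)` signature is the real Lambda contract; everything else — the bucket name, the key, the trimmed event shape — is invented for the example, and no AWS SDK call is made so the sketch runs locally.

```python
# Hedged sketch of a serverless data-processing step: an AWS Lambda
# handler that pulls (bucket, key) pairs out of an S3 notification
# event. A real function would then read each object with boto3.
def lambda_handler(event, context):
    objects = [
        (rec["s3"]["bucket"]["name"], rec["s3"]["object"]["key"])
        for rec in event.get("Records", [])
    ]
    return {"processed": objects}

# Trimmed stand-in for the S3 event Lambda would receive.
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "raw-zone"},
                "object": {"key": "2024/05/01/orders.json"}}}
    ]
}
result = lambda_handler(sample_event, context=None)
print(result)  # {'processed': [('raw-zone', '2024/05/01/orders.json')]}
```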
Read more
Gruve
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore), Pune
5yrs+
Up to ₹50L / yr (varies)
Python
SQL
Data engineering
Apache Spark
PySpark
+6 more

About the Company:

Gruve is an innovative Software Services startup dedicated to empowering Enterprise Customers in managing their Data Life Cycle. We specialize in Cyber Security, Customer Experience, Infrastructure, and advanced technologies such as Machine Learning and Artificial Intelligence. Our mission is to assist our customers in their business strategies utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks.

 

Why Gruve:

At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you’re passionate about technology and eager to make an impact, we’d love to hear from you.

Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.

 

Position summary:

We are seeking a Senior Software Development Engineer – Data Engineering with 5-8 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions. 


Key Roles & Responsibilities:

  • Design, develop, and optimize ETL/ELT pipelines using Apache Spark, PySpark, Databricks, and Snowflake.
  • Implement real-time and batch data processing workflows in cloud environments (AWS, Azure, GCP).
  • Develop high-performance, scalable data pipelines for structured, semi-structured, and unstructured data.
  • Work with Delta Lake and Lakehouse architectures to improve data reliability and efficiency.
  • Optimize Snowflake and Databricks performance, including query tuning, caching, partitioning, and cost optimization.
  • Implement data governance, security, and compliance best practices.
  • Build and maintain data models, transformations, and data marts for analytics and reporting.
  • Collaborate with data scientists, analysts, and business teams to define data engineering requirements.
  • Automate infrastructure and deployments using Terraform, Airflow, or dbt.
  • Monitor and troubleshoot data pipeline failures, performance issues, and bottlenecks.
  • Develop and enforce data quality and observability frameworks using Great Expectations, Monte Carlo, or similar tools.
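The last responsibility above — data quality and observability frameworks — boils down to declarative checks over batches of records. The sketch below is a minimal expectation-style check of the kind a framework such as Great Expectations formalises; the rule names, columns, and sample rows are invented.

```python
# Minimal expectation-style data quality checks: each returns a result
# dict naming the rule, whether it passed, and which row indices failed.
def check_not_null(rows, column):
    bad = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"check": f"{column} not null", "passed": not bad,
            "failing_rows": bad}

def check_in_range(rows, column, lo, hi):
    bad = [i for i, r in enumerate(rows)
           if r.get(column) is not None and not lo <= r[column] <= hi]
    return {"check": f"{column} in [{lo}, {hi}]", "passed": not bad,
            "failing_rows": bad}

rows = [
    {"user_id": 1, "age": 34},
    {"user_id": 2, "age": None},
    {"user_id": 3, "age": 212},
]
results = [check_not_null(rows, "age"),
           check_in_range(rows, "age", 0, 120)]
for r in results:
    print(r)
```

In a production pipeline these results would feed an alerting channel rather than `print`, and a failing batch would be quarantined before loading.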


Basic Qualifications:

  • Bachelor’s or Master’s Degree in Computer Science or Data Science.
  • 5–8 years of experience in data engineering, big data processing, and cloud-based data platforms.
  • Hands-on expertise in Apache Spark, PySpark, and distributed computing frameworks.
  • Strong experience with Snowflake (Warehouses, Streams, Tasks, Snowpipe, Query Optimization).
  • Experience in Databricks (Delta Lake, MLflow, SQL Analytics, Photon Engine).
  • Proficiency in SQL, Python, or Scala for data transformation and analytics.
  • Experience working with data lake architectures and storage formats (Parquet, Avro, ORC, Iceberg).
  • Hands-on experience with cloud data services (AWS Redshift, Azure Synapse, Google BigQuery).
  • Experience in workflow orchestration tools like Apache Airflow, Prefect, or Dagster.
  • Strong understanding of data governance, access control, and encryption strategies.
  • Experience with CI/CD for data pipelines using GitOps, Terraform, dbt, or similar technologies.


Preferred Qualifications:

  • Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub).
  • Experience in BI and analytics tools (Tableau, Power BI, Looker).
  • Familiarity with data observability tools (Monte Carlo, Great Expectations).
  • Experience with machine learning feature engineering pipelines in Databricks.
  • Contributions to open-source data engineering projects.
Read more
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Pune, Mumbai, Bengaluru (Bangalore), Chennai
4 - 7 yrs
₹5L - ₹15L / yr
Amazon Web Services (AWS)
Python
PySpark
AWS Glue
Amazon Redshift
+1 more

Job Overview:

We are seeking an experienced AWS Data Engineer to join our growing data team. The ideal candidate will have hands-on experience with AWS Glue, Redshift, PySpark, and other AWS services to build robust, scalable data pipelines. This role is perfect for someone passionate about data engineering, automation, and cloud-native development.

Key Responsibilities:

  • Design, build, and maintain scalable and efficient ETL pipelines using AWS Glue, PySpark, and related tools.
  • Integrate data from diverse sources and ensure its quality, consistency, and reliability.
  • Work with large datasets in structured and semi-structured formats across cloud-based data lakes and warehouses.
  • Optimize and maintain data infrastructure, including Amazon Redshift, for high performance.
  • Collaborate with data analysts, data scientists, and product teams to understand data requirements and deliver solutions.
  • Automate data validation, transformation, and loading processes to support real-time and batch data processing.
  • Monitor and troubleshoot data pipeline issues and ensure smooth operations in production environments.

Required Skills:

  • 5 to 7 years of hands-on experience in data engineering roles.
  • Strong proficiency in Python and PySpark for data transformation and scripting.
  • Deep understanding and practical experience with AWS Glue, AWS Redshift, S3, and other AWS data services.
  • Solid understanding of SQL and database optimization techniques.
  • Experience working with large-scale data pipelines and high-volume data environments.
  • Good knowledge of data modeling, warehousing, and performance tuning.

Preferred/Good to Have:

  • Experience with workflow orchestration tools like Airflow or Step Functions.
  • Familiarity with CI/CD for data pipelines.
  • Knowledge of data governance and security best practices on AWS.
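The orchestration tools named above (Airflow, Step Functions) fundamentally run tasks in dependency order. The toy sketch below shows just that core idea with the standard library's `graphlib` (Python 3.9+); the task names are invented, and a real Airflow DAG would attach schedules, retries, and operators to each node.

```python
# Toy illustration of what an orchestrator automates: resolving a DAG
# of tasks into a valid execution order. Each key depends on the tasks
# in its value set.
from graphlib import TopologicalSorter

dag = {
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
    "extract": set(),
}
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'validate', 'load']
```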
Read more
Get to hear about interesting companies hiring right now

Follow Cutshort

Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.
Find more jobs