
50+ Python Jobs in Pune | Python Job openings in Pune

Apply to 50+ Python Jobs in Pune on CutShort.io. Explore the latest Python Job opportunities across top companies like Google, Amazon & Adobe.

VyTCDC
Posted by Gobinath Sundaram
Chennai, Bengaluru (Bangalore), Hyderabad, Pune, Mumbai
4 - 12 yrs
₹3.5L - ₹37L / yr
Python
AI/ML

Job Summary:

We are seeking a skilled Python Developer with a strong foundation in Artificial Intelligence and Machine Learning. You will be responsible for designing, developing, and deploying intelligent systems that leverage large datasets and cutting-edge ML algorithms to solve real-world problems.

Key Responsibilities:

  • Design and implement machine learning models using Python and libraries like TensorFlow, PyTorch, or Scikit-learn.
  • Perform data preprocessing, feature engineering, and exploratory data analysis.
  • Develop APIs and integrate ML models into production systems using frameworks like Flask or FastAPI (see the sketch after this list).
  • Collaborate with data scientists, DevOps engineers, and backend teams to deliver scalable AI solutions.
  • Optimize model performance and ensure robustness in real-time environments.
  • Maintain clear documentation of code, models, and processes.
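
Illustrating the Flask/FastAPI responsibility above, a minimal sketch of serving a trained Scikit-learn model over HTTP with FastAPI; the model file, feature shape, and route are hypothetical, not taken from the posting.

```python
# minimal_model_api.py -- a sketch, not a production service
import joblib
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # hypothetical pre-trained Scikit-learn model

class Features(BaseModel):
    values: list[float]  # flat feature vector; length must match training features

@app.post("/predict")
def predict(features: Features):
    X = np.array(features.values).reshape(1, -1)  # single-row batch
    return {"prediction": model.predict(X).tolist()}
```

Run locally with `uvicorn minimal_model_api:app` and POST JSON like {"values": [...]} to /predict.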

Required Skills:

  • Proficiency in Python and ML libraries (NumPy, Pandas, Scikit-learn, TensorFlow, PyTorch).
  • Strong understanding of ML algorithms (classification, regression, clustering, deep learning).
  • Experience with data pipeline tools (e.g., Airflow, Spark) and cloud platforms (AWS, Azure, or GCP).
  • Familiarity with containerization (Docker, Kubernetes) and CI/CD practices.
  • Solid grasp of RESTful API development and integration.

Preferred Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Data Science, or related field.
  • 2–5 years of experience in Python development with a focus on AI/ML.
  • Exposure to MLOps practices and model monitoring tools.


DEMAND MEDIA BPM LLP

Posted by Darshana Mate
Pune
1 - 5 yrs
₹2L - ₹6L / yr
SQL
PowerBI
Python

Job Purpose

Responsible for managing end-to-end database operations, ensuring data accuracy, integrity, and security across systems. The position plays a key role in driving data reliability, availability, and compliance with operational standards.


Key Responsibilities:

  • Collate audit reports from the QA team and structure data in accordance with Standard Operating Procedures (SOP).
  • Perform data transformation and validation for accuracy and consistency.
  • Upload processed datasets into SQL Server using SSIS packages.
  • Monitor and optimize database performance, identifying and resolving bottlenecks.
  • Perform regular backups, restorations, and recovery checks to ensure data continuity.
  • Manage user access and implement robust database security policies.
  • Oversee database storage allocation and utilization.
  • Conduct routine maintenance and support incident management, including root cause analysis and resolution.
  • Design and implement scalable database solutions and architecture.
  • Create and maintain stored procedures, views, and other database components.
  • Optimize SQL queries for performance and scalability.
  • Execute ETL processes and support seamless integration of multiple data sources.
  • Maintain data integrity and quality through validation and cleansing routines.
  • Collaborate with cross-functional teams on data solutions and project deliverables.

 

Educational Qualification: Any Graduate

Required Skills & Qualifications:

  • Proven experience with SQL Server or similar relational database platforms.
  • Strong expertise in SSIS, ETL processes, and data warehousing.
  • Proficiency in SQL/T-SQL, including scripting, performance tuning, and query optimization.
  • Experience in database security, user role management, and access control.
  • Familiarity with backup/recovery strategies and database maintenance best practices.
  • Strong analytical skills with experience working with large and complex datasets.
  • Solid understanding of data modeling, normalization, and schema design.
  • Knowledge of incident and change management processes.
  • Excellent communication and collaboration skills.
  • Experience with Python for data manipulation and automation is a strong plus.
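
A minimal sketch of the Python-assisted validate-and-load step this last bullet hints at, assuming pyodbc, an installed ODBC driver, and hypothetical file, table, and column names.

```python
import pandas as pd
import pyodbc  # assumes the Microsoft ODBC driver for SQL Server is installed

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=audit_db;Trusted_Connection=yes"  # placeholder DSN
)

df = pd.read_csv("qa_audit_report.csv")   # hypothetical QA export
df = df.dropna(subset=["case_id"])        # basic validation/cleansing step

cursor = conn.cursor()
cursor.fast_executemany = True            # speeds up bulk inserts
cursor.executemany(
    "INSERT INTO audit_staging (case_id, score) VALUES (?, ?)",
    list(df[["case_id", "score"]].itertuples(index=False, name=None)),
)
conn.commit()
conn.close()
```

In practice the posting's SSIS packages would own the heavy lifting; this only illustrates the Python-side manipulation called out as a plus.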


QAgile Services

Posted by Radhika Chotai
Pune, Hyderabad
3 - 7 yrs
₹11L - ₹15L / yr
Machine Learning (ML)
Python
Grafana
AWS CloudFormation
Terraform
+4 more

We are seeking a highly skilled and motivated MLOps Engineer with 3-5 years of experience to join our engineering team. The ideal candidate should possess a strong foundation in DevOps or software engineering principles with practical exposure to machine learning operational workflows. You will be instrumental in operationalizing ML systems, optimizing the deployment lifecycle, and strengthening the integration between data science and engineering teams.

Required Skills:

• Hands-on experience with MLOps platforms such as MLflow and Kubeflow (see the sketch after this list).

• Proficiency in Infrastructure as Code (IaC) tools like Terraform or Ansible.

• Strong familiarity with monitoring and alerting frameworks (Prometheus, Grafana, Datadog, AWS CloudWatch).

• Solid understanding of microservices architecture, service discovery, and load balancing.

• Excellent programming skills in Python, with experience in writing modular, testable, and maintainable code.

• Proficient in Docker and container-based application deployments.

• Experience with CI/CD tools such as Jenkins or GitLab CI.

• Basic working knowledge of Kubernetes for container orchestration.

• Practical experience with cloud-based ML platforms such as AWS SageMaker, Databricks, or Google Vertex AI.
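
As a taste of the MLflow platform named in the first bullet, a minimal experiment-tracking sketch; the model, parameter, and metric are illustrative only.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    model = Ridge(alpha=0.5).fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    mlflow.log_param("alpha", 0.5)            # hyperparameter
    mlflow.log_metric("mse", mse)             # evaluation metric
    mlflow.sklearn.log_model(model, "model")  # versioned model artifact
```

`mlflow ui` then shows the tracked run locally; the same pattern scales out to the registry and deployment workflows this role covers.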



Good-to-Have Skills:

• Awareness of security practices specific to ML pipelines, including secure model endpoints and data protection.

• Experience with scripting languages like Bash or PowerShell for automation tasks.

• Exposure to database scripting and data integration pipelines.

Experience & Qualifications:

• 3-5+ years of experience in MLOps, Site Reliability Engineering (SRE), or Software Engineering roles.

• At least 2+ years of hands-on experience working on ML/AI systems in production settings.

Tata Consultancy Services
Agency job
via Risk Resources LLP Hyd by Jhansi Padiy
Anywhere in India, Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Indore, Kolkata
5 - 11 yrs
₹6L - ₹30L / yr
Snowflake
Python
PySpark
SQL

Role descriptions / Expectations from the Role

  • 6-7 years of IT development experience, with min 3+ years hands-on experience in Snowflake
  • Strong experience in building/designing data warehouses or data lakes, with end-to-end data mart implementation experience focused on large enterprise-scale Snowflake implementations on any of the hyperscalers
  • Strong experience with building productionized data ingestion and data pipelines in Snowflake (see the sketch after this list)
  • Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities
  • Should have good experience with Snowflake RBAC and data security
  • Strong experience with Snowflake features, including newly released capabilities
  • Should have good experience in Python/PySpark
  • Should have experience in AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF)
  • Should have experience/knowledge of orchestration and scheduling tools like Airflow
  • Should have a good understanding of ETL or ELT processes and ETL tools
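
A minimal sketch of a productionized-style ingestion step with the snowflake-connector-python package; the account, credentials, file, and table are placeholders.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder account identifier
    user="etl_user",        # placeholder credentials
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

cur = conn.cursor()
try:
    # Stage-and-load pattern: PUT a local file to the table stage, then COPY INTO
    cur.execute("PUT file://daily_orders.csv @%ORDERS")
    cur.execute("COPY INTO ORDERS FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")
finally:
    cur.close()
    conn.close()
```

Real pipelines would typically drive this through Snowpipe or an Airflow DAG, per the orchestration bullet above.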

Tata Consultancy Services
Hyderabad, Bengaluru (Bangalore), Chennai, Pune, Noida, Gurugram, Mumbai, Kolkata
5 - 8 yrs
₹7L - ₹20L / yr
Snowflake
Python
SQL Azure
Data Warehouse (DWH)
Amazon Web Services (AWS)

  • 5+ years of IT development experience, with min 3+ years hands-on experience in Snowflake
  • Strong experience in building/designing data warehouses or data lakes, with end-to-end data mart implementation experience focused on large enterprise-scale Snowflake implementations on any of the hyperscalers
  • Strong experience with building productionized data ingestion and data pipelines in Snowflake
  • Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities
  • Should have good experience with Snowflake RBAC and data security
  • Strong experience with Snowflake features, including newly released capabilities
  • Should have good experience in Python/PySpark
  • Should have experience in AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF)
  • Should have experience/knowledge of orchestration and scheduling tools like Airflow
  • Should have a good understanding of ETL or ELT processes and ETL tools

VyTCDC
Posted by Gobinath Sundaram
Hyderabad, Bengaluru (Bangalore), Pune
6 - 11 yrs
₹8L - ₹26L / yr
Data Science
Python
Large Language Models (LLM)
Natural Language Processing (NLP)

POSITION / TITLE: Data Science Lead

Location: Offshore – Hyderabad/Bangalore/Pune

Who are we looking for?

Individuals with 8+ years of experience implementing and managing data science projects. Excellent working knowledge of traditional machine learning and LLM techniques.

The candidate must demonstrate the ability to navigate and advise on complex ML ecosystems from a model building and evaluation perspective. Experience in the NLP and chatbot domains is preferred.

We acknowledge the job market is blurring the line between data roles: while software skills are necessary, the emphasis of this position is on data science skills, not on data, ML, or software engineering.

Responsibilities:

· Lead data science and machine learning projects, contributing to model development, optimization and evaluation. 

· Perform data cleaning, feature engineering, and exploratory data analysis.  

· Translate business requirements into technical solutions, document and communicate project progress, manage non-technical stakeholders.

· Collaborate with other DS and engineers to deliver projects.

Technical Skills – Must have:

· Experience in and understanding of the natural language processing (NLP) and large language model (LLM) landscape.

· Proficiency with Python for data analysis and supervised & unsupervised ML tasks (see the pipeline sketch after this list).

· Ability to translate complex machine learning problem statements into specific deliverables and requirements.

· Should have worked with major cloud platforms such as AWS, Azure or GCP.

· Working knowledge of SQL and NoSQL databases.

· Ability to create data and ML pipelines for more efficient and repeatable data science projects using MLOps principles.

· Keep abreast of new tools, algorithms and techniques in machine learning, and work to implement them in the organization.

· Strong understanding of evaluation and monitoring metrics for machine learning projects.
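
Illustrating the supervised-NLP proficiency above, a minimal Scikit-learn text-classification pipeline; the toy documents and labels are invented for the example.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

docs = ["refund my order", "love this product", "item arrived broken", "great support"]
labels = ["complaint", "praise", "complaint", "praise"]  # toy labels

clf = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),  # word + bigram features
    ("model", LogisticRegression()),
])
clf.fit(docs, labels)
print(clf.predict(["my item arrived broken again"]))  # -> ['complaint']
```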

Technical Skills – Good to have:

· Track record of getting ML models into production

· Experience building chatbots.

· Experience with closed and open source LLMs.

· Experience with frameworks and technologies like scikit-learn, BERT, langchain, autogen…

· Certifications or courses in data science.

Education:

· Master’s/Bachelor’s/PhD degree in Computer Science, Engineering, Data Science, or a related field.

Process Skills:

· Understanding of  Agile and Scrum  methodologies.  

· Ability to follow SDLC processes and contribute to technical documentation.  

Behavioral Skills :

· Self-motivated and capable of working independently with minimal management supervision.

· Well-developed design, analytical & problem-solving skills

· Excellent communication and interpersonal skills.  

· Excellent team player, able to work with virtual teams in several time zones.

Talent Pro
Posted by Mayank choudhary
Pune
3 - 6 yrs
₹15L - ₹21L / yr
Python
Artificial Intelligence (AI)
Machine Learning (ML)
Large Language Models (LLM) tuning
Retrieval Augmented Generation (RAG)
+1 more
  • Strong AI/ML or Software Developer profile
  • Mandatory (Experience 1) - Must have 3+ YOE in core software development (SDLC)
  • Mandatory (Experience 2) - Must have 2+ years of experience in AI/ML, preferably in the conversational AI domain (speech to text, text to speech, speech emotion recognition) or agentic AI systems.
  • Mandatory (Experience 3) - Must have hands-on experience in fine-tuning LLMs/SLMs, model optimization (quantization, distillation) and RAG (see the sketch below)
  • Mandatory (Experience 4) - Hands-on programming experience in Python, TensorFlow, PyTorch and model APIs (Hugging Face, LangChain, OpenAI, etc.)
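
A minimal sketch of the retrieval half of the RAG pattern named above, assuming the sentence-transformers and faiss-cpu packages; the corpus and model choice are illustrative.

```python
import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

corpus = [
    "Our return window is 30 days.",
    "Support is available 24/7 via chat.",
    "Orders ship within two business days.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = encoder.encode(corpus, normalize_embeddings=True)

index = faiss.IndexFlatIP(embeddings.shape[1])  # inner product = cosine on normalized vectors
index.add(np.asarray(embeddings, dtype="float32"))

query = encoder.encode(["how long do I have to return an item?"], normalize_embeddings=True)
scores, ids = index.search(np.asarray(query, dtype="float32"), k=1)
print(corpus[ids[0][0]])  # retrieved passage to prepend to the LLM prompt
```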


Deqode

Posted by Apoorva Jain
Bengaluru (Bangalore), Mumbai, Gurugram, Noida, Pune, Chennai, Nagpur, Indore, Ahmedabad, Kochi (Cochin), Delhi
3.5 - 8 yrs
₹4L - ₹15L / yr
Go Programming (Golang)
Amazon Web Services (AWS)
Python

Role Overview:


We are looking for a skilled Golang Developer with 3.5+ years of experience in building scalable backend services and deploying cloud-native applications using AWS. This is a key position that requires a deep understanding of Golang and cloud infrastructure to help us build robust solutions for global clients.


Key Responsibilities:

  • Design and develop backend services, APIs, and microservices using Golang.
  • Build and deploy cloud-native applications on AWS using services like Lambda, EC2, S3, RDS, and more.
  • Optimize application performance, scalability, and reliability.
  • Collaborate closely with frontend, DevOps, and product teams.
  • Write clean, maintainable code and participate in code reviews.
  • Implement best practices in security, performance, and cloud architecture.
  • Contribute to CI/CD pipelines and automated deployment processes.
  • Debug and resolve technical issues across the stack.


Required Skills & Qualifications:

  • 3.5+ years of hands-on experience with Golang development.
  • Strong experience with AWS services such as EC2, Lambda, S3, RDS, DynamoDB, CloudWatch, etc.
  • Proficient in developing and consuming RESTful APIs.
  • Familiar with Docker, Kubernetes or AWS ECS for container orchestration.
  • Experience with Infrastructure as Code (Terraform, CloudFormation) is a plus.
  • Good understanding of microservices architecture and distributed systems.
  • Experience with monitoring tools like Prometheus, Grafana, or ELK Stack.
  • Familiarity with Git, CI/CD pipelines, and agile workflows.
  • Strong problem-solving, debugging, and communication skills.


Nice to Have:

  • Experience with serverless applications and architecture (AWS Lambda, API Gateway, etc.)
  • Exposure to NoSQL databases like DynamoDB or MongoDB.
  • Contributions to open-source Golang projects or an active GitHub portfolio.


Pune, Mohali
4 - 6 yrs
₹5L - ₹11L / yr
Python
TensorFlow
PyTorch
Machine Learning (ML)
Spark
+3 more

Skill Sets:

  • Expertise in ML/DL, model lifecycle management, and MLOps (MLflow, Kubeflow)
  • Proficiency in Python, TensorFlow, PyTorch, Scikit-learn, and Hugging Face models
  • Strong experience in NLP, fine-tuning transformer models, and dataset preparation
  • Hands-on with cloud platforms (AWS, GCP, Azure) and scalable ML deployment (Sagemaker, Vertex AI)
  • Experience in containerization (Docker, Kubernetes) and CI/CD pipelines
  • Knowledge of distributed computing (Spark, Ray), vector databases (FAISS, Milvus), and model optimization (quantization, pruning)
  • Familiarity with model evaluation, hyperparameter tuning, and model monitoring for drift detection

Roles and Responsibilities:

  • Design and implement end-to-end ML pipelines from data ingestion to production
  • Develop, fine-tune, and optimize ML models, ensuring high performance and scalability
  • Compare and evaluate models using key metrics (F1-score, AUC-ROC, BLEU, etc.; see the sketch after this list)
  • Automate model retraining, monitoring, and drift detection
  • Collaborate with engineering teams for seamless ML integration
  • Mentor junior team members and enforce best practices
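
A minimal sketch of the metric comparison mentioned above, using Scikit-learn on toy predictions; the numbers are invented.

```python
from sklearn.metrics import f1_score, roc_auc_score

y_true = [0, 1, 1, 0, 1, 0]                # toy ground-truth labels
y_pred = [0, 1, 0, 0, 1, 1]                # hard predictions from a candidate model
y_score = [0.2, 0.9, 0.4, 0.1, 0.8, 0.6]   # predicted probabilities for the same rows

print("F1:     ", f1_score(y_true, y_pred))
print("AUC-ROC:", roc_auc_score(y_true, y_score))
```

The same pattern extends to BLEU (e.g. via nltk) for text-generation tasks.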


Codnatives
Agency job
via Vysystem Pvt Ltd by Raja Ram
Chennai, Mumbai, Bengaluru (Bangalore), Hyderabad, Pune
4 - 12 yrs
₹8L - ₹24L / yr
Python
Artificial Intelligence (AI)
Machine Learning (ML)

Job Description:

We are seeking a highly skilled Python Developer with expertise in Artificial Intelligence and Machine Learning to join our innovative AI team. The ideal candidate will design, build, and deploy machine learning solutions while writing clean, scalable Python code to power real-world applications.

🔧 Responsibilities:

  • Develop and maintain robust Python applications focused on AI and ML use cases.
  • Design, train, and evaluate ML models (e.g., regression, classification, NLP, or computer vision).
  • Work with data scientists and ML engineers to productionize models using frameworks like Flask, FastAPI, or Docker.
  • Optimize algorithms for performance, scalability, and accuracy.
  • Build APIs and pipelines to integrate ML models into applications.
  • Implement and maintain unit/integration tests and participate in code reviews.
  • Use cloud platforms (AWS, Azure, GCP) for deploying AI/ML services.

💡 Required Skills:

  • Strong proficiency in Python and experience with ML libraries (e.g., Scikit-learn, TensorFlow, PyTorch).
  • Experience with data processing tools (Pandas, NumPy, Spark).
  • Understanding of ML lifecycle, including data collection, cleaning, feature engineering, model training, and evaluation.
  • Experience building and consuming RESTful APIs.
  • Familiarity with SQL/NoSQL databases (e.g., PostgreSQL, MongoDB).
  • Version control using Git.


NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Bengaluru (Bangalore), Mumbai, Gurugram, Pune, Hyderabad, Chennai
3 - 6 yrs
₹5L - ₹20L / yr
IBM Sterling Integrator Developer
IBM Sterling B2B Integrator
Shell Scripting
Python
SQL
+1 more

Job Title : IBM Sterling Integrator Developer

Experience : 3 to 5 Years

Locations : Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, Pune

Employment Type : Full-Time


Job Description :

We are looking for a skilled IBM Sterling Integrator Developer with 3–5 years of experience to join our team across multiple locations.

The ideal candidate should have strong expertise in IBM Sterling B2B Integrator, along with scripting and database proficiency.

Key Responsibilities :

  • Develop, configure, and maintain IBM Sterling Integrator solutions.
  • Design and implement integration solutions using IBM Sterling.
  • Collaborate with cross-functional teams to gather requirements and provide solutions.
  • Work with custom languages and scripting to enhance and automate integration processes.
  • Ensure optimal performance and security of integration systems.

Must-Have Skills :

  • Hands-on experience with IBM Sterling Integrator and associated integration tools.
  • Proficiency in at least one custom scripting language.
  • Strong command over Shell scripting, Python, and SQL (mandatory).
  • Good understanding of EDI standards and protocols is a plus.

Interview Process :

  • 2 Rounds of Technical Interviews.

Additional Information :

  • Open to candidates from Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, and Pune.
Client based at Pune location.


Agency job
Pune
5 - 10 yrs
₹15L - ₹25L / yr
Cloud Developer
Amazon Web Services (AWS)
large scale financial tracking system
grpc
cloudflare
+8 more

Minimum requirements

5+ years of industry software engineering experience (excluding internships and co-ops)

Strong coding skills in any programming language (we understand new languages can be learned on the job so our interview process is language agnostic)

Strong collaboration skills, can work across workstreams within your team and contribute to your peers’ success

Ability to thrive with a high level of autonomy and responsibility, and an entrepreneurial mindset

Interest in working as a generalist across varying technologies and stacks to solve problems and delight both internal and external users

Preferred Qualifications

Experience with large-scale financial tracking systems

Good understanding and practical knowledge of cloud-based services (e.g. gRPC, GraphQL, Docker/Kubernetes, cloud services such as AWS, etc.)

IT Company


Agency job
via Jobdost by Saida Jabbar
Pune
5 - 9 yrs
₹16L - ₹20L / yr
Python
Angular (2+)
React.js
Flask
API
+3 more

Job Title: Python Full Stack Developer

Location: Pune (Work from Office)

Experience: Minimum 5 Years


Job Summary:

We are looking for a highly skilled Python Full Stack Developer with a minimum of 5 years of hands-on experience. The ideal candidate will be proficient in building scalable web applications and APIs using Python technologies, and comfortable working with front-end frameworks like Angular or React. Experience with DevOps tools and practices is a plus.

Key Responsibilities:

  • Design, develop, and maintain scalable full-stack applications using Python.
  • Develop RESTful APIs using Flask (see the sketch after this list).
  • Build responsive front-end interfaces using Angular or React.
  • Integrate front-end components with server-side logic.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Ensure the performance, quality, and responsiveness of applications.
  • Implement DevOps practices for CI/CD and deployment automation.
  • Participate in code reviews, testing, and bug fixing.
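
A minimal sketch of the Flask REST work described above; the resource name and payload shape are hypothetical.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
ITEMS = {}  # in-memory store standing in for a real database

@app.post("/items")
def create_item():
    data = request.get_json()
    ITEMS[data["id"]] = data
    return jsonify(data), 201

@app.get("/items/<item_id>")
def get_item(item_id):
    item = ITEMS.get(item_id)
    return (jsonify(item), 200) if item else (jsonify(error="not found"), 404)

if __name__ == "__main__":
    app.run(debug=True)
```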

Required Skills:

  • Python (5+ years)
  • Flask or similar frameworks
  • REST APIs development
  • Angular or React (Front-end development)
  • DevOps tools and practices (CI/CD, Docker, etc.)
  • Strong understanding of software development best practices and design patterns

Eligibility Criteria:

  • Minimum 5 years of overall experience in software development
  • At least 5 years of strong hands-on experience in Python full stack development

Work Mode:

  • Work from Office – Pune location


Partner Company


Agency job
via AccioJob by AccioJobHiring Board
Hyderabad, Pune, Noida
0 - 0 yrs
₹5L - ₹6L / yr
SQL
MS-Excel
PowerBI
Python

AccioJob is conducting an offline hiring drive in partnership with Our Partner Company to hire Junior Business/Data Analysts for an internship with a Pre-Placement Offer (PPO) opportunity.


Apply, Register and select your Slot here: https://go.acciojob.com/69d3Wd


Job Description:

  • Role: Junior Business/Data Analyst (Internship + PPO)
  • Work Location: Hyderabad
  • Internship Stipend: ₹15,000 - ₹25,000/month
  • Internship Duration: 3 months
  • CTC on PPO: 5 LPA - 6 LPA

Eligibility Criteria:

  • Degree: Open to all academic backgrounds
  • Graduation Year: 2023, 2024, 2025

Required Skills:

  • Proficiency in SQL, Excel, Power BI, and basic Python
  • Strong analytical mindset and interest in solving business problems with data

Hiring Process:

  1. Offline Assessment at AccioJob Skill Centres (Hyderabad, Pune, Noida)
  2. 1 Assignment + 2 Technical Interviews (Virtual; In-person for Hyderabad candidates)

Note: Please bring your laptop and earphones for the test.


Register Here: https://go.acciojob.com/69d3Wd

Gameberry

Agency job
via AccioJob by AccioJobHiring Board
Hyderabad, Pune, Noida
0 - 1 yrs
₹10L - ₹15L / yr
DSA
Object Oriented Programming (OOPs)
Java
Python
Go Programming (Golang)

AccioJob is organizing an exclusive offline hiring drive in collaboration with GameBerry Labs for the role of Software Development Engineer 1 (SDE 1).


To Apply, Register and select your Slot here: https://go.acciojob.com/Zq2UnA


Job Description:

  • Role: SDE 1
  • Work Location: Bangalore
  • CTC: 10 LPA - 15 LPA

Eligibility Criteria:

  • Education: B.Tech, BE, BCA, MCA, M.Tech
  • Branches: Circuit Branches (CSE, ECE, IT, etc.)
  • Graduation Year:
  • 2024 (Minimum 9 months of experience)
  • 2025 (Minimum 3-6 months of experience)

Evaluation Process:

  1. Offline Assessment at AccioJob Skill Centres (Hyderabad, Bangalore, Pune, Noida)
  2. Technical Interviews (2 Rounds - Virtual for most; In-person for Bangalore candidates)

Note: Carry your laptop and earphones for the assessment.


Register Here: https://go.acciojob.com/Zq2UnA

Deqode

Posted by Alisha Das
Bengaluru (Bangalore), Mumbai, Pune, Chennai, Gurugram
5.6 - 7 yrs
₹10L - ₹28L / yr
Amazon Web Services (AWS)
Python
PySpark
SQL

Job Summary:

As an AWS Data Engineer, you will be responsible for designing, developing, and maintaining scalable, high-performance data pipelines using AWS services. With 6+ years of experience, you’ll collaborate closely with data architects, analysts, and business stakeholders to build reliable, secure, and cost-efficient data infrastructure across the organization.

Key Responsibilities:

  • Design, develop, and manage scalable data pipelines using AWS Glue, Lambda, and other serverless technologies
  • Implement ETL workflows and transformation logic using PySpark and Python on AWS Glue (see the sketch after this list)
  • Leverage AWS Redshift for warehousing, performance tuning, and large-scale data queries
  • Work with AWS DMS and RDS for database integration and migration
  • Optimize data flows and system performance for speed and cost-effectiveness
  • Deploy and manage infrastructure using AWS CloudFormation templates
  • Collaborate with cross-functional teams to gather requirements and build robust data solutions
  • Ensure data integrity, quality, and security across all systems and processes
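
A minimal local PySpark sketch of the Glue-style transform step above; in an actual Glue job the session comes from GlueContext, and the paths here are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV (on Glue this would typically be an s3:// path)
orders = spark.read.option("header", True).csv("raw/orders.csv")

# Transform: cast types, filter, and build a daily revenue aggregate
daily = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("status") == "COMPLETED")
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Load: write partitioned, columnar output (placeholder path)
daily.write.mode("overwrite").parquet("curated/daily_revenue/")
```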

Required Skills & Experience:

  • 6+ years of experience in Data Engineering with strong AWS expertise
  • Proficient in Python and PySpark for data processing and ETL development
  • Hands-on experience with AWS Glue, Lambda, DMS, RDS, and Redshift
  • Strong SQL skills for building complex queries and performing data analysis
  • Familiarity with AWS CloudFormation and infrastructure as code principles
  • Good understanding of serverless architecture and cost-optimized design
  • Ability to write clean, modular, and maintainable code
  • Strong analytical thinking and problem-solving skills


Tata Consultancy Services
Agency job
via Risk Resources LLP Hyd by Susmitha O
Bengaluru (Bangalore), Pune, Kolkata
4 - 6 yrs
₹7L - ₹24L / yr
Python
Amazon Web Services (AWS)
NumPy
pandas

Key Technical Skillsets-

  • Design, develop, and maintain scalable applications using AWS services, Python, and Boto3 (see the sketch after this list).
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Implement best practices for cloud architecture and application development.
  • Optimize applications for maximum speed and scalability.
  • Troubleshoot and resolve issues in development, test, and production environments.
  • Write clean, maintainable, and efficient code.
  • Participate in code reviews and contribute to team knowledge sharing.
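
A minimal Boto3 sketch of the AWS-backed application work above; the bucket and key names are placeholders.

```python
import boto3

s3 = boto3.client("s3")  # credentials resolved from environment or instance role

# Upload a local artifact, then list what sits under the prefix
s3.upload_file("report.csv", "my-data-bucket", "reports/2024/report.csv")

response = s3.list_objects_v2(Bucket="my-data-bucket", Prefix="reports/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```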


IT Company


Agency job
via Jobdost by Saida Jabbar
Pune
7 - 12 yrs
₹20L - ₹25L / yr
Machine Learning (ML)
Computer Vision
Python
Deep Learning
Data Science
+2 more

Job Overview:

We are looking for a skilled professional with:

 

  • 7+ years of overall experience, including minimum 5 years in Computer Vision, Machine Learning, Deep Learning, and algorithm development.
  • Proficiency in Data Science and Data Analysis techniques.
  • Hands-on programming experience with Python, R, MATLAB or Octave.
  • Experience with AI frameworks like TensorFlow, PySpark, Theano, and libraries such as PyTorch, Pandas, NumPy, etc.
  • Strong understanding of algorithms like Regression, SVM, Decision Trees, KNN, and Neural Networks.

Key Skills & Attributes: 

  • Fast learner with strong problem-solving abilities
  • Innovative thinking and approach
  • Excellent communication skills
  • High standards of integrity, accountability, and transparency
  • Exposure to or experience with international work environments

Notice Period: Immediate to 30 days


InfoBeans

Posted by Sanjana Thakur
Pune, Indore
7 - 13 yrs
₹12L - ₹35L / yr
Python
Automation
pytest
Playwright

Python Automation Engineer

JD:


  • Engage with development teams to improve the quality of the application.
  • Provide in-depth technical mentoring across the test automation team.
  • Provide highly innovative solutions to automatically qualify the application.
  • Routinely exercise independent judgment in test automation methods, techniques and criteria for achieving objectives.

Experience/Exposure:

  • Mid-level programming skills in Python
  • Experience with UI-driven test automation frameworks such as Selenium or Playwright (see the sketch after this list)
  • Experience with CI/CD tool
  • Ability to troubleshoot complex software / hardware configuration problems
  • Strong analytical & problem solving, documentation, and communication skills
  • Passion for product quality and eagerness to learn new technologies.
  • Ability to function effectively in a fast-paced environment and manage continuously changing business needs. Excellent time management skills required.
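
A minimal sketch of a UI check in the pytest + Playwright style named above, assuming the pytest-playwright plugin (which injects the page fixture); the target site is a placeholder.

```python
# test_home.py -- run with: pytest (requires pytest-playwright and `playwright install`)
import re

from playwright.sync_api import Page, expect

def test_homepage_title(page: Page):
    page.goto("https://example.com")  # placeholder URL
    expect(page).to_have_title(re.compile("Example"))

def test_navigation_link(page: Page):
    page.goto("https://example.com")
    page.get_by_role("link", name="More information").click()
    expect(page).to_have_url(re.compile("iana"))
```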


TCS


Agency job
via Risk Resources LLP Hyd by Susmitha O
Hyderabad, Mumbai, Kolkata, Pune, Chennai
4 - 10 yrs
₹7L - ₹20L / yr
Machine Learning (ML)
MLOps
Python
NumPy
  • Design and implement cloud solutions and build MLOps pipelines on Azure
  • Build CI/CD pipeline orchestration with GitLab CI, GitHub Actions, CircleCI, Airflow or similar tools
  • Review data science models: run code refactoring and optimization, containerization, deployment, versioning, and monitoring of quality
  • Test and validate data science models, and automate testing
  • Deploy code and pipelines across environments
  • Track model performance metrics
  • Track service performance metrics
  • Communicate with a team of data scientists, data engineers and architects, and document the processes


Blitzy

Posted by Eman Khan
Pune
6 - 10 yrs
₹40L - ₹70L / yr
Python
Django
Flask
FastAPI
Google Cloud Platform (GCP)
+1 more

Requirements

  • 7+ years of experience with Python
  • Strong expertise in Python frameworks (Django, Flask, or FastAPI)
  • Experience with GCP, Terraform, and Kubernetes
  • Deep understanding of REST API development and GraphQL
  • Strong knowledge of SQL and NoSQL databases
  • Experience with microservices architecture
  • Proficiency with CI/CD tools (Jenkins, CircleCI, GitLab)
  • Experience with container orchestration using Kubernetes
  • Understanding of cloud architecture and serverless computing
  • Experience with monitoring and logging solutions
  • Strong background in writing unit and integration tests
  • Familiarity with AI/ML concepts and integration points


Responsibilities

  • Design and develop scalable backend services for our AI platform
  • Architect and implement complex systems with high reliability
  • Build and maintain APIs for internal and external consumption
  • Work closely with AI engineers to integrate ML functionality
  • Optimize application performance and resource utilization
  • Make architectural decisions that balance immediate needs with long-term scalability
  • Mentor junior engineers and promote best practices
  • Contribute to the evolution of our technical standards and processes
NeoGenCode Technologies Pvt Ltd
Pune
8 - 15 yrs
₹5L - ₹24L / yr
Data engineering
Snowflake schema
SQL
ETL
ELT
+5 more

Job Title : Data Engineer – Snowflake Expert

Location : Pune (Onsite)

Experience : 10+ Years

Employment Type : Contractual

Mandatory Skills : Snowflake, Advanced SQL, ETL/ELT (Snowpipe, Tasks, Streams), Data Modeling, Performance Tuning, Python, Cloud (preferably Azure), Security & Data Governance.


Job Summary :

We are seeking a seasoned Data Engineer with deep expertise in Snowflake to design, build, and maintain scalable data solutions.

The ideal candidate will have a strong background in data modeling, ETL/ELT, SQL optimization, and cloud data warehousing principles, with a passion for leveraging Snowflake to drive business insights.

Responsibilities :

  • Collaborate with data teams to optimize and enhance data pipelines and models on Snowflake.
  • Design and implement scalable ELT pipelines with performance and cost-efficiency in mind.
  • Ensure high data quality, security, and adherence to governance frameworks.
  • Conduct code reviews and align development with best practices.

Qualifications :

  • Bachelor’s in Computer Science, Data Science, IT, or related field.
  • Snowflake certifications (Pro/Architect) preferred.
Wissen Technology

Posted by Vishakha Walunj
Bengaluru (Bangalore), Pune, Mumbai
7 - 12 yrs
Best in industry
PySpark
Databricks
SQL
Python

Required Skills:

  • Hands-on experience with Databricks, PySpark
  • Proficiency in SQL, Python, and Spark.
  • Understanding of data warehousing concepts and data modeling.
  • Experience with CI/CD pipelines and version control (e.g., Git).
  • Fundamental knowledge of any cloud services, preferably Azure or GCP.


Good to Have:

  • BigQuery
  • Experience with performance tuning and data governance.


Third Rock Techkno
Pune
5 - 7 yrs
₹10L - ₹15L / yr
ASP.NET
Entity Framework
C#
SQL Server
Amazon Web Services (AWS)
+4 more

Required Qualifications:

  • 5+ years of professional software development experience.
  • Post-secondary degree in computer science, software engineering or related discipline, or equivalent working experience.
  • Development of distributed applications with Microsoft technologies: C# .NET/Core, SQL Server, Entity Framework.
  • Deep expertise with microservices architectures and design patterns.
  • Cloud Native AWS experience with services such as Lambda, SQS, RDS/Aurora, S3, Lex, and Polly.
  • Mastery of both Windows and Linux environments and their use in the development and management of complex distributed systems architectures.
  • Git source code repository and continuous integration tools.


Top tier global IT consulting company


Agency job
via AccioJob by AccioJobHiring Board
Hyderabad, Pune, Noida
0 - 0 yrs
₹11L - ₹11L / yr
Computer Networking
Linux administration
Python
Bash
Object Oriented Programming (OOPs)
+2 more

AccioJob is conducting a Walk-In Hiring Drive with a reputed global IT consulting company at AccioJob Skill Centres for the position of Infrastructure Engineer, specifically for female candidates.


To Apply, Register and select your Slot here: https://go.acciojob.com/kcYTAp


We will not consider your application if you do not register and select a slot via the above link.


Required Skills: Linux, Networking, One scripting language among Python, Bash, and PowerShell, OOPs, Cloud Platforms (AWS, Azure)


Eligibility:


  • Degree: B.Tech/BE
  • Branch: CSE Core With Cloud Certification
  • Graduation Year: 2024 & 2025


Note: Only Female Candidates can apply for this job opportunity


Work Details:


  • Work Mode: Work From Office
  • Work Location: Bangalore & Coimbatore
  • CTC: 11.1 LPA


Evaluation Process:


  • Round 1: Offline Assessment at AccioJob Skill Centre in Noida, Pune, Hyderabad.


  • Further Rounds (for Shortlisted Candidates only)

 

  1. HackerRank Online Assessment
  2. Coding Pairing Interview
  3. Technical Interview
  4. Cultural Alignment Interview


Important Note: Please bring your laptop and earphones for the test.


Register here: https://go.acciojob.com/kcYTAp

Top tier global IT consulting company


Agency job
via AccioJob by AccioJobHiring Board
Hyderabad, Pune, Noida
0 - 0 yrs
₹11L - ₹11L / yr
Python
MySQL
Big Data

AccioJob is conducting a Walk-In Hiring Drive with a reputed global IT consulting company at AccioJob Skill Centres for the position of Data Engineer, specifically for female candidates.


To Apply, Register and select your Slot here: https://go.acciojob.com/8p9ZXN


We will not consider your application if you do not register and select a slot via the above link.


Required Skills: Python, Database (MySQL), Big Data (Spark, Kafka)


Eligibility:


  • Degree: B.Tech/BE
  • Branch: CSE – AI & DS / AI & ML
  • Graduation Year: 2024 & 2025


Note: Only Female Candidates can apply for this job opportunity


Work Details:


  • Work Mode: Work From Office
  • Work Location: Bangalore & Coimbatore
  • CTC: 11.1 LPA


Evaluation Process:


  • Round 1: Offline Assessment at AccioJob Skill Centre in Noida, Pune, Hyderabad.


  • Further Rounds (for Shortlisted Candidates only)

 

  1. HackerRank Online Assessment
  2. Coding Pairing Interview
  3. Technical Interview
  4. Cultural Alignment Interview


Important Note: Please bring your laptop and earphones for the test.


Register here: https://go.acciojob.com/8p9ZXN

TCS


Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Pan India
5 - 10 yrs
₹10L - ₹25L / yr
Test Automation
Selenium
Java
Python
JavaScript

Test Automation Engineer Job Description

A Test Automation Engineer is responsible for designing, developing, and implementing automated testing solutions to ensure the quality and reliability of software applications. Here's a breakdown of the job:


Key Responsibilities

- Test Automation Framework: Design and develop test automation frameworks using tools like Selenium, Appium, or Cucumber.

- Automated Test Scripts: Create and maintain automated test scripts to validate software functionality, performance, and security (see the sketch after this list).

- Test Data Management: Develop and manage test data, including data generation, masking, and provisioning.

- Test Environment: Set up and maintain test environments, including configuration and troubleshooting.

- Collaboration: Work with cross-functional teams, including development, QA, and DevOps to ensure seamless integration of automated testing.
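
A minimal Selenium sketch of the automated-script work above, assuming Selenium 4+ (whose Selenium Manager resolves the browser driver); the URL and assertion are placeholders.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # Selenium Manager fetches a matching chromedriver
try:
    driver.get("https://example.com")        # placeholder URL
    heading = driver.find_element(By.TAG_NAME, "h1")
    assert heading.text == "Example Domain"  # simple functional check
finally:
    driver.quit()
```

In a framework, navigation and assertions would live behind page objects and run from TestNG/JUnit (Java) or pytest (Python).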


Essential Skills

- Programming Languages: Proficiency in programming languages like Java, Python, or C#.

- Test Automation Tools: Experience with test automation tools like Selenium.

- Testing Frameworks: Knowledge of testing frameworks like TestNG, JUnit, or PyUnit.

- Agile Methodologies: Familiarity with Agile development methodologies and CI/CD pipelines.

Deqode

Posted by Roshni Maji
Pune, Bengaluru (Bangalore), Gurugram, Chennai, Mumbai
5 - 7 yrs
₹6L - ₹20L / yr
Amazon Web Services (AWS)
Amazon Redshift
AWS Glue
Python
PySpark

Position: AWS Data Engineer

Experience: 5 to 7 Years

Location: Bengaluru, Pune, Chennai, Mumbai, Gurugram

Work Mode: Hybrid (3 days work from office per week)

Employment Type: Full-time

About the Role:

We are seeking a highly skilled and motivated AWS Data Engineer with 5–7 years of experience in building and optimizing data pipelines, architectures, and data sets. The ideal candidate will have strong experience with AWS services including Glue, Athena, Redshift, Lambda, DMS, RDS, and CloudFormation. You will be responsible for managing the full data lifecycle from ingestion to transformation and storage, ensuring efficiency and performance.

Key Responsibilities:

  • Design, develop, and optimize scalable ETL pipelines using AWS Glue, Python/PySpark, and SQL.
  • Work extensively with AWS services such as Glue, Athena, Lambda, DMS, RDS, Redshift, CloudFormation, and other serverless technologies.
  • Implement and manage data lake and warehouse solutions using AWS Redshift and S3.
  • Optimize data models and storage for cost-efficiency and performance.
  • Write advanced SQL queries to support complex data analysis and reporting requirements.
  • Collaborate with stakeholders to understand data requirements and translate them into scalable solutions.
  • Ensure high data quality and integrity across platforms and processes.
  • Implement CI/CD pipelines and best practices for infrastructure as code using CloudFormation or similar tools.

Required Skills & Experience:

  • Strong hands-on experience with Python or PySpark for data processing.
  • Deep knowledge of AWS Glue, Athena, Lambda, Redshift, RDS, DMS, and CloudFormation.
  • Proficiency in writing complex SQL queries and optimizing them for performance.
  • Familiarity with serverless architectures and AWS best practices.
  • Experience in designing and maintaining robust data architectures and data lakes.
  • Ability to troubleshoot and resolve data pipeline issues efficiently.
  • Strong communication and stakeholder management skills.


DeepIntent

Posted by Indrajeet Deshmukh
Pune
4 - 10 yrs
Best in industry
Python
Spark
Apache Airflow
Docker
SQL
+2 more

What You’ll Do:


As a Data Scientist, you will work closely across DeepIntent Analytics teams located in New York City, India, and Bosnia. The role will support internal and external business partners in defining patient and provider audiences, and generating analyses and insights related to measurement of campaign outcomes, Rx, patient journey, and supporting evolution of DeepIntent product suite. Activities in this position include creating and scoring audiences, reading campaign results, analyzing medical claims, clinical, demographic and clickstream data, performing analysis and creating actionable insights, summarizing, and presenting results and recommended actions to internal stakeholders and external clients, as needed.

  • Explore ways to create better audiences
  • Analyze medical claims, clinical, demographic and clickstream data to produce and present actionable insights 
  • Explore ways of using inference, statistical, machine learning techniques to improve the performance of existing algorithms and decision heuristics
  • Design and deploy new iterations of production-level code
  • Contribute posts to our upcoming technical blog  

Who You Are:

  • Bachelor’s degree in a STEM field, such as Statistics, Mathematics, Engineering, Biostatistics, Econometrics, Economics, Finance, Operations Research, or Data Science. Graduate degree is strongly preferred
  • 3+ years of working experience as Data Analyst, Data Engineer, Data Scientist in digital marketing, consumer advertisement, telecom, or other areas requiring customer level predictive analytics
  • Background in either data engineering or analytics
  • Hands-on technical experience is required; proficiency in performing statistical analysis in Python, including relevant libraries, is required
  • You have an advanced understanding of the ad-tech ecosystem, digital marketing and advertising data and campaigns or familiarity with the US healthcare patient and provider systems (e.g. medical claims, medications)
  • Experience in programmatic, DSP related, marketing predictive analytics, audience segmentation or audience behaviour analysis or medical / healthcare experience
  • You have varied and hands-on predictive machine learning experience (deep learning, boosting algorithms, inference) 
  • Familiarity with data science tools such as XGBoost, PyTorch, Jupyter, and strong LLM user experience (developer/API experience is a plus)
  • You are interested in translating complex quantitative results into meaningful findings and interpretable deliverables, and communicating with less technical audiences orally and in writing


Data Axle

Posted by Eman Khan
Pune
6 - 9 yrs
Best in industry
Machine Learning (ML)
Python
SQL
PySpark
XGBoost

About Data Axle:

Data Axle Inc. has been an industry leader in data, marketing solutions, sales and research for over 50 years in the USA. Data Axle now has an established strategic global centre of excellence in Pune. This centre delivers mission-critical data services to its global customers, powered by its proprietary cloud-based technology platform and by leveraging proprietary business & consumer databases.


Data Axle Pune is pleased to have achieved certification as a Great Place to Work!


Roles & Responsibilities:

We are looking for a Senior Data Scientist to join the Data Science Client Services team to continue our success of identifying high quality target audiences that generate profitable marketing return for our clients. We are looking for experienced data science, machine learning and MLOps practitioners to design, build and deploy impactful predictive marketing solutions that serve a wide range of verticals and clients. The right candidate will enjoy contributing to and learning from a highly talented team and working on a variety of projects.


We are looking for a Senior Data Scientist who will be responsible for:

  1. Ownership of design, implementation, and deployment of machine learning algorithms in a modern Python-based cloud architecture
  2. Design or enhance ML workflows for data ingestion, model design, model inference and scoring
  3. Oversight on team project execution and delivery
  4. Establish peer review guidelines for high quality coding to help develop junior team members’ skill set growth, cross-training, and team efficiencies
  5. Visualize and publish model performance results and insights to internal and external audiences


Qualifications:

  1. Masters in a relevant quantitative, applied field (Statistics, Econometrics, Computer Science, Mathematics, Engineering)
  2. Minimum of 5 years of work experience in the end-to-end lifecycle of ML model development and deployment into production within a cloud infrastructure (Databricks is highly preferred)
  3. Proven ability to manage the output of a small team in a fast-paced environment and to lead by example in the fulfilment of client requests
  4. Exhibit deep knowledge of core mathematical principles relating to data science and machine learning (ML Theory + Best Practices, Feature Engineering and Selection, Supervised and Unsupervised ML, A/B Testing, etc.)
  5. Proficiency in Python and SQL required; PySpark/Spark experience a plus
  6. Ability to conduct a productive peer review and proper code structure in Github
  7. Proven experience developing, testing, and deploying various ML algorithms (neural networks, XGBoost, Bayes, and the like; see the sketch below)
  8. Working knowledge of modern CI/CD methods

This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.
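
Illustrating the kind of algorithm work in qualification 7, a minimal XGBoost train-and-evaluate sketch on a synthetic dataset; assumes the xgboost package, with invented hyperparameters.

```python
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=7)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=7)

model = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout AUC: {auc:.3f}")
```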

Gruve
Bengaluru (Bangalore), Pune
5 - 9 yrs
Upto ₹60L / yr (varies)
Generative AI
Retrieval Augmented Generation (RAG)
Chatbot
Amazon Web Services (AWS)
Windows Azure
+2 more

We are seeking a talented Engineer to join our AI team. You will technically lead experienced software and machine learning engineers to develop, test, and deploy AI-based solutions, with a primary focus on large language models and other machine learning applications. This is an excellent opportunity to apply your software engineering skills in a dynamic, real-world environment and gain hands-on experience in cutting-edge AI technology.


Key Roles & Responsibilities: 

  • Design and Develop AI-Powered Solutions: Architect and implement scalable AI/ML systems, focusing on Large Language Models (LLMs) and other deep learning applications.
  • End-to-End Model Development: Lead the entire lifecycle of AI models—from data collection and preprocessing to training, fine-tuning, evaluation, and deployment.
  • Fine-Tuning & Customization: Leverage techniques like LoRA (Low-Rank Adaptation) and Q-LoRA to efficiently fine-tune large models for specific business applications (see the sketch after this list).
  • Reasoning Model Implementation: Work with advanced reasoning models such as DeepSeek-R1, exploring their applications in enterprise AI workflows.
  • Data Engineering & Dataset Creation: Design and curate high-quality datasets optimized for fine-tuning AI models, ensuring robust training and validation processes.
  • Performance Optimization & Efficiency: Optimize model inference, computational efficiency, and resource utilization for large-scale AI applications.
  • MLOps & CI/CD Pipelines: Implement best practices for MLOps, ensuring automated training, deployment, monitoring, and continuous improvement of AI models.
  • Cloud & Edge AI Deployment: Deploy and manage AI solutions in cloud environments (AWS, Azure, GCP) and explore edge AI deployment where applicable.
  • API Development & Microservices: Develop RESTful APIs and microservices to integrate AI models seamlessly into enterprise applications.
  • Security, Compliance & Ethical AI: Ensure AI solutions comply with industry standards, data privacy laws (e.g., GDPR, HIPAA), and ethical AI guidelines.
  • Collaboration & Stakeholder Engagement: Work closely with product managers, data engineers, and business teams to translate business needs into AI-driven solutions.
  • Mentorship & Technical Leadership: Guide and mentor junior engineers, fostering best practices in AI/ML development, model fine-tuning, and software engineering.
  • Research & Innovation: Stay updated with emerging AI trends, conduct experiments with cutting-edge architectures and fine-tuning techniques, and drive innovation within the team.
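
A minimal sketch of the LoRA fine-tuning setup mentioned above, using Hugging Face transformers with peft; the base model and hyperparameters are illustrative, and the training loop/data are omitted.

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM, AutoTokenizer

base = "facebook/opt-350m"  # small base model, chosen only for illustration
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

lora = LoraConfig(
    r=8,                                  # rank of the low-rank update matrices
    lora_alpha=16,                        # scaling factor for the updates
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # only a tiny fraction of the base weights train
```

Q-LoRA layers 4-bit quantization of the frozen base model under the same adapter setup.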

Basic Qualifications: 

  • A master's degree or PhD in Computer Science, Data Science, Engineering, or a related field 
  • Experience: 5-8 Years 
  • Strong programming skills in Python and Java 
  • Good understanding of machine learning fundamentals 
  • Hands-on experience with Python and common ML libraries (e.g., PyTorch, TensorFlow, scikit-learn) 
  • Familiar with frontend development and frameworks like React 
  • Basic knowledge of LLMs and transformer-based architectures is a plus.

Preferred Qualifications  

  • Excellent problem-solving skills and an eagerness to learn in a fast-paced environment 
  • Strong attention to detail and ability to communicate technical concepts clearly


Gruve
Pune, Bengaluru (Bangalore)
3 - 5 yrs
Upto ₹30L / yr (varies)
Retrieval Augmented Generation (RAG)
Generative AI
Chatbot
Amazon Web Services (AWS)
Windows Azure
+3 more

We are seeking a talented Engineer to join our AI team. You will technically lead experienced software and machine learning engineers to develop, test, and deploy AI-based solutions, with a primary focus on large language models and other machine learning applications. This is an excellent opportunity to apply your software engineering skills in a dynamic, real-world environment and gain hands-on experience in cutting-edge AI technology.


Key Roles & Responsibilities: 

  • Design and implement software solutions that power machine learning models, particularly in LLMs 
  • Create robust data pipelines, handling data preprocessing, transformation, and integration for machine learning projects 
  • Collaborate with the engineering team to build and optimize machine learning models, particularly LLMs, that address client-specific challenges 
  • Partner with cross-functional teams, including business stakeholders, data engineers, and solutions architects to gather requirements and evaluate technical feasibility 
  • Design and implement scalable infrastructure for developing and deploying GenAI solutions 
  • Support model deployment and API integration to ensure interaction with existing enterprise systems.

Basic Qualifications: 

  • A master's degree or PhD in Computer Science, Data Science, Engineering, or a related field 
  • Experience: 3-5 Years 
  • Strong programming skills in Python and Java 
  • Good understanding of machine learning fundamentals 
  • Hands-on experience with Python and common ML libraries (e.g., PyTorch, TensorFlow, scikit-learn) 
  • Familiar with frontend development and frameworks like React 
  • Basic knowledge of LLMs and transformer-based architectures is a plus.

Preferred Qualifications 

  • Excellent problem-solving skills and an eagerness to learn in a fast-paced environment 
  • Strong attention to detail and ability to communicate technical concepts clearly 
Deqode

Posted by Roshni Maji
Bengaluru (Bangalore), Pune, Mumbai, Chennai, Gurugram
5 - 7 yrs
₹5L - ₹19L / yr
Python
PySpark
Amazon Web Services (AWS)
Amazon Redshift
+1 more

Position: AWS Data Engineer

Experience: 5 to 7 Years

Location: Bengaluru, Pune, Chennai, Mumbai, Gurugram

Work Mode: Hybrid (3 days work from office per week)

Employment Type: Full-time

About the Role:

We are seeking a highly skilled and motivated AWS Data Engineer with 5–7 years of experience in building and optimizing data pipelines, architectures, and data sets. The ideal candidate will have strong experience with AWS services including Glue, Athena, Redshift, Lambda, DMS, RDS, and CloudFormation. You will be responsible for managing the full data lifecycle from ingestion to transformation and storage, ensuring efficiency and performance.

Key Responsibilities:

  • Design, develop, and optimize scalable ETL pipelines using AWS Glue, Python/PySpark, and SQL.
  • Work extensively with AWS services such as Glue, Athena, Lambda, DMS, RDS, Redshift, CloudFormation, and other serverless technologies.
  • Implement and manage data lake and warehouse solutions using AWS Redshift and S3.
  • Optimize data models and storage for cost-efficiency and performance.
  • Write advanced SQL queries to support complex data analysis and reporting requirements.
  • Collaborate with stakeholders to understand data requirements and translate them into scalable solutions.
  • Ensure high data quality and integrity across platforms and processes.
  • Implement CI/CD pipelines and best practices for infrastructure as code using CloudFormation or similar tools.

Required Skills & Experience:

  • Strong hands-on experience with Python or PySpark for data processing.
  • Deep knowledge of AWS Glue, Athena, Lambda, Redshift, RDS, DMS, and CloudFormation.
  • Proficiency in writing complex SQL queries and optimizing them for performance.
  • Familiarity with serverless architectures and AWS best practices.
  • Experience in designing and maintaining robust data architectures and data lakes.
  • Ability to troubleshoot and resolve data pipeline issues efficiently.
  • Strong communication and stakeholder management skills.


hirezyai
Posted by Aardra Suresh
Pune
3 - 20 yrs
₹20L - ₹50L / yr
Java
Python
Bash
PowerShell
Agile Environment
+2 more

KEY DUTIES

  • Independently own and resolve high-priority or complex customer issues with minimal supervision
  • Reproduce and analyze product defects using advanced troubleshooting techniques and tools
  • Collaborate with developers to identify root causes and drive timely resolution of defects
  • Identify trends in escalations and provide feedback to improve product quality and customer experience
  • Document investigation findings, root causes, and resolution steps clearly for both internal and external audiences
  • Contribute to knowledge base articles and process improvements to enhance team efficiency
  • Represent the escalation team in product reviews or defect triage meetings
  • Build subject matter expertise in specific products or components
  • Mentor and assist junior team members by reviewing their investigations and coaching through complex cases
  • Participate in Agile ceremonies and contribute to team planning and backlog refinement
  • Other duties as assigned

BASIC QUALIFICATIONS

  • Typically requires 3–6 years of technical experience in a support, development, or escalation role
  • Strong technical troubleshooting and root cause analysis skills
  • Proficient in debugging tools, logs, and test environments (a log-triage sketch follows this list)
  • Ability to independently manage multiple complex issues and drive them to closure
  • Experience working with cross-functional teams in a collaborative, Agile environment
  • Proficiency with relevant scripting or programming languages (e.g., Python, Bash, PowerShell, Java)
  • Exceptional written and verbal communication skills — especially when engaging with customers in critical or escalated situations
  • Demonstrated customer-first mindset with an emphasis on clarity, empathy, and follow-through
  • Proactive and detail-oriented, with the ability to document and communicate technical concepts clearly
  • Comfortable presenting findings or recommendations to both technical and non-technical stakeholders
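
To give a concrete sense of the scripting this role calls for, here is a small Python sketch that triages a log file by counting recurring ERROR signatures; the log format and file name are assumptions for illustration, not a specific product's format:

    import re
    from collections import Counter

    # Assumed line shape: "... ERROR [component.name] message ..."
    ERROR_RE = re.compile(r"ERROR\s+\[(?P<component>[\w.]+)\]\s+(?P<message>.*)")

    def summarize_errors(log_path, top_n=10):
        """Print the most frequent ERROR signatures found in a log file."""
        counts = Counter()
        with open(log_path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                match = ERROR_RE.search(line)
                if match:
                    key = (match.group("component"), match.group("message")[:80])
                    counts[key] += 1
        for (component, message), n in counts.most_common(top_n):
            print(f"{n:>6}  {component}  {message}")

    summarize_errors("app.log")  # hypothetical log file
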
Read more
Metron Security Private Limited
Prathamesh Shinde
Posted by Prathamesh Shinde
Pune
2 - 5 yrs
₹5L - ₹8L / yr
skill iconPython
skill iconJava
skill iconNodeJS (Node.js)
skill iconReact.js

Mandatory Skills

  • Able to design and implement software features efficiently.
  • Expertise in at least one object-oriented programming language (Python, TypeScript, Java, Node.js, Angular, React.js, C#, C++).
  • Good knowledge of data structures and their correct usage.
  • Open to learning any new software development skill if needed for the project.
  • Alignment with and utilization of the core enterprise technology stacks and integration capabilities throughout the transition states.
  • Participate in planning, definition, and high-level design of the solution and exploration of solution alternatives.
  • Identify bottlenecks and bugs, and devise appropriate solutions.
  • Define, explore, and support the implementation of enablers to evolve solution intent, working directly with Agile teams to implement them.
  • Good knowledge of the implications of cyber security on production systems.
  • Experience architecting & estimating deep technical custom solutions & integrations.


Read more
Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Bengaluru (Bangalore), Pune, Chennai, Mumbai, Gurugram
5 - 7 yrs
₹5L - ₹19L / yr
Amazon Web Services (AWS)
Python
PySpark
SQL
Redshift

Profile: AWS Data Engineer

Mode- Hybrid

Experience - 5 to 7 years

Locations - Bengaluru, Pune, Chennai, Mumbai, Gurugram


Roles and Responsibilities

  • Design and maintain ETL pipelines using AWS Glue and Python/PySpark
  • Optimize SQL queries for Redshift and Athena
  • Develop Lambda functions for serverless data processing (a minimal handler sketch follows this list)
  • Configure AWS DMS for database migration and replication
  • Implement infrastructure as code with CloudFormation
  • Build optimized data models for performance
  • Manage RDS databases and AWS service integrations
  • Troubleshoot and improve data processing efficiency
  • Gather requirements from business stakeholders
  • Implement data quality checks and validation
  • Document data pipelines and architecture
  • Monitor workflows and implement alerting
  • Keep current with AWS services and best practices
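
As a rough illustration of the serverless item above, here is a minimal S3-triggered Lambda handler in Python; the bucket names and the JSON-array payload format are hypothetical assumptions:

    import json
    import urllib.parse
    import boto3

    s3 = boto3.client("s3")

    def handler(event, context):
        """Validate each newly arrived S3 object and copy it to a curated prefix."""
        records = event.get("Records", [])
        for record in records:
            bucket = record["s3"]["bucket"]["name"]
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            if not isinstance(json.loads(body), list):
                raise ValueError(f"{key}: expected a JSON array of records")
            s3.copy_object(
                Bucket="example-curated-bucket",  # hypothetical target bucket
                Key=f"validated/{key}",
                CopySource={"Bucket": bucket, "Key": key},
            )
        return {"processed": len(records)}
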


Required Technical Expertise:

  • Python/PySpark for data processing
  • AWS Glue for ETL operations
  • Redshift and Athena for data querying
  • AWS Lambda and serverless architecture
  • AWS DMS and RDS management
  • CloudFormation for infrastructure
  • SQL optimization and performance tuning
Read more
Gruve
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore), Pune
5yrs+
Upto ₹50L / yr (Varies)
Python
SQL
Data engineering
Apache Spark
PySpark
+6 more

About the Company:

Gruve is an innovative Software Services startup dedicated to empowering Enterprise Customers in managing their Data Life Cycle. We specialize in Cyber Security, Customer Experience, Infrastructure, and advanced technologies such as Machine Learning and Artificial Intelligence. Our mission is to assist our customers in their business strategies by utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks.

 

Why Gruve:

At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you’re passionate about technology and eager to make an impact, we’d love to hear from you.

Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.

 

Position summary:

We are seeking a Senior Software Development Engineer – Data Engineering with 5-8 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions. 

Key Roles & Responsibilities:

  • Design, develop, and optimize ETL/ELT pipelines using Apache Spark, PySpark, Databricks, and Snowflake.
  • Implement real-time and batch data processing workflows in cloud environments (AWS, Azure, GCP).
  • Develop high-performance, scalable data pipelines for structured, semi-structured, and unstructured data.
  • Work with Delta Lake and Lakehouse architectures to improve data reliability and efficiency (a merge sketch follows this list).
  • Optimize Snowflake and Databricks performance, including query tuning, caching, partitioning, and cost optimization.
  • Implement data governance, security, and compliance best practices.
  • Build and maintain data models, transformations, and data marts for analytics and reporting.
  • Collaborate with data scientists, analysts, and business teams to define data engineering requirements.
  • Automate infrastructure and deployments using Terraform, Airflow, or dbt.
  • Monitor and troubleshoot data pipeline failures, performance issues, and bottlenecks.
  • Develop and enforce data quality and observability frameworks using Great Expectations, Monte Carlo, or similar tools.
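
For illustration of the Delta Lake work mentioned above, here is a minimal upsert (MERGE) sketch, assuming a Databricks/Delta runtime; the table paths and join key are hypothetical:

    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("orders-merge").getOrCreate()

    # Incremental batch landing in object storage (hypothetical path)
    updates = spark.read.parquet("s3://example-landing/orders_increment/")

    # Upsert into the Delta target on the business key
    target = DeltaTable.forPath(spark, "s3://example-lake/orders_delta/")
    (target.alias("t")
        .merge(updates.alias("s"), "t.order_id = s.order_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())

    # Compact small files so downstream scans stay fast (Databricks SQL)
    spark.sql("OPTIMIZE delta.`s3://example-lake/orders_delta/`")
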


Basic Qualifications:

  • Bachelor’s or Master’s Degree in Computer Science or Data Science.
  • 5–8 years of experience in data engineering, big data processing, and cloud-based data platforms.
  • Hands-on expertise in Apache Spark, PySpark, and distributed computing frameworks.
  • Strong experience with Snowflake (Warehouses, Streams, Tasks, Snowpipe, Query Optimization).
  • Experience in Databricks (Delta Lake, MLflow, SQL Analytics, Photon Engine).
  • Proficiency in SQL, Python, or Scala for data transformation and analytics.
  • Experience working with data lake architectures and storage formats (Parquet, Avro, ORC, Iceberg).
  • Hands-on experience with cloud data services (AWS Redshift, Azure Synapse, Google BigQuery).
  • Experience in workflow orchestration tools like Apache Airflow, Prefect, or Dagster.
  • Strong understanding of data governance, access control, and encryption strategies.
  • Experience with CI/CD for data pipelines using GitOps, Terraform, dbt, or similar technologies.


Preferred Qualifications:

  • Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub).
  • Experience in BI and analytics tools (Tableau, Power BI, Looker).
  • Familiarity with data observability tools (Monte Carlo, Great Expectations).
  • Experience with machine learning feature engineering pipelines in Databricks.
  • Contributions to open-source data engineering projects.
Read more
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Pune, Mumbai, Bengaluru (Bangalore), Chennai
4 - 7 yrs
₹5L - ₹15L / yr
Amazon Web Services (AWS)
Python
PySpark
Glue semantics
Amazon Redshift
+1 more

Job Overview:

We are seeking an experienced AWS Data Engineer to join our growing data team. The ideal candidate will have hands-on experience with AWS Glue, Redshift, PySpark, and other AWS services to build robust, scalable data pipelines. This role is perfect for someone passionate about data engineering, automation, and cloud-native development.

Key Responsibilities:

  • Design, build, and maintain scalable and efficient ETL pipelines using AWS Glue, PySpark, and related tools.
  • Integrate data from diverse sources and ensure its quality, consistency, and reliability.
  • Work with large datasets in structured and semi-structured formats across cloud-based data lakes and warehouses.
  • Optimize and maintain data infrastructure, including Amazon Redshift, for high performance.
  • Collaborate with data analysts, data scientists, and product teams to understand data requirements and deliver solutions.
  • Automate data validation, transformation, and loading processes to support real-time and batch data processing (a validation sketch follows this list).
  • Monitor and troubleshoot data pipeline issues and ensure smooth operations in production environments.
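
As a sketch of the automated validation mentioned above, here are a few PySpark data-quality checks; the dataset path, column names, and rules are hypothetical placeholders:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()
    df = spark.read.parquet("s3://example-raw/customers/")  # hypothetical path

    checks = {
        "null_ids": df.filter(F.col("customer_id").isNull()).count(),
        "duplicate_ids": df.count() - df.dropDuplicates(["customer_id"]).count(),
        "bad_emails": df.filter(~F.col("email").rlike(r"^[^@\s]+@[^@\s]+$")).count(),
    }

    # Fail the pipeline loudly if any rule is violated
    failed = {name: n for name, n in checks.items() if n > 0}
    if failed:
        raise RuntimeError(f"Data quality checks failed: {failed}")
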

Required Skills:

  • 5 to 7 years of hands-on experience in data engineering roles.
  • Strong proficiency in Python and PySpark for data transformation and scripting.
  • Deep understanding and practical experience with AWS Glue, AWS Redshift, S3, and other AWS data services.
  • Solid understanding of SQL and database optimization techniques.
  • Experience working with large-scale data pipelines and high-volume data environments.
  • Good knowledge of data modeling, warehousing, and performance tuning.

Preferred/Good to Have:

  • Experience with workflow orchestration tools like Airflow or Step Functions.
  • Familiarity with CI/CD for data pipelines.
  • Knowledge of data governance and security best practices on AWS.
Read more
Metron Security Private Limited
Vikram Nippani
Posted by Vikram Nippani
Pune
3 - 8 yrs
₹5L - ₹20L / yr
Python
PowerShell
JavaScript
Node.js
PostgreSQL
+4 more

We are looking for passionate developers with 4 - 8 years of experience in software development to join the Metron Security team as a Software Engineer.

 

Metron Security provides automation and integration services to leading Cyber Security companies. Our engineering team works on leading security platforms including Splunk, IBM's QRadar, ServiceNow, CrowdStrike, Cybereason, and other SIEM and SOAR platforms.

 

The Software Engineer role is a challenging one within Cyber Security integration development. It involves developing a product or service that achieves high-performance data exchange between two or more Cyber Security platforms. A Software Engineer is responsible for end-to-end delivery of a project, from gathering requirements from the customer to deploying the project for them on-premises or in the cloud, depending on the nature of the project. We follow engineering best practices and keep evolving; we are agile. The Software Engineer is at the core of that evolution.

 

Each integration requires reskilling yourself in the technology needed for that project. If you are passionate about programming and believe in software engineering best practices, here is what we offer:


  • Developer-centric culture - no bureaucracy or red tape
  • Chance to work on 200+ security platforms and more
  • Opportunity to engage directly with end-users (customers) - not just be a cog in the wheel


Position: Senior Software Engineer

Location: Pune 


Mandatory Skills

  • Able to design and implement software features efficiently.
  • Expertise in at least one object-oriented programming language (Python, TypeScript, Java, Node.js, Angular, React.js, C#, C++).
  • Good knowledge of data structures and their correct usage.
  • Open to learning any new software development skill if needed for the project.
  • Alignment with and utilization of the core enterprise technology stacks and integration capabilities throughout the transition states.
  • Participate in planning, definition, and high-level design of the solution and exploration of solution alternatives.
  • Identify bottlenecks and bugs, and devise appropriate solutions.
  • Define, explore, and support the implementation of enablers to evolve solution intent, working directly with Agile teams to implement them.
  • Good knowledge of the implications of cyber security on production systems.
  • Experience architecting & estimating deep technical custom solutions & integrations.


Added advantage:

  • You have experience in the Cyber Security domain.
  • You have developed software using web technologies.
  • You have handled a project from start to end.
  • You have worked in an Agile Development project and have experience of writing and estimating User Stories.
  • Contribution to open source - Please share your link in the application/resume. 


Read more
Metron Security Private Limited
Chanchal Kale
Posted by Chanchal Kale
Pune
2 - 4 yrs
₹5L - ₹7L / yr
Python
MERN Stack
Integration
Go Programming (Golang)
TypeScript
+2 more

Job Summary:


  • We are looking for a highly motivated and skilled Software Engineer to join our team.
  • This role requires a strong understanding of the software development lifecycle, proficiency in coding, and excellent communication skills.
  • The ideal candidate will be responsible for production monitoring, resolving minor technical issues, collecting client information, providing effective client interactions, and supporting our development team in resolving challenges.


Key Responsibilities:


  • Client Interaction: Serve as the primary point of contact for client queries, provide excellent communication, and ensure timely issue resolution.
  • Issue Resolution: Troubleshoot and resolve minor issues related to software applications in a timely manner.
  • Information Collection: Gather detailed technical information from clients, understand the problem context, and relay the information to the development leads for further action.
  • Collaboration: Work closely with development leads and cross-functional teams to provide timely support and resolution for customer issues.
  • Documentation: Document client issues, actions taken, and resolutions for future reference and continuous improvement.
  • Software Development Lifecycle: Be involved in maintaining, supporting, and optimizing software through its lifecycle, including bug fixes and enhancements.
  • Automating Redundant Support Tasks (good to have): Ability to automate redundant, repetitive support tasks.


Required Skills and Qualifications:

Mandatory Skills:


  • Expertise in at least one object-oriented programming language (Python, Java, C#, C++, React.js, Node.js).
  • Good knowledge of data structures and their correct usage.
  • Open to learning any new software development skill if needed for the project.
  • Alignment with and utilization of the core enterprise technology stacks and integration capabilities throughout the transition states.
  • Participate in planning, definition, and high-level design of the solution and exploration of solution alternatives.
  • Define, explore, and support the implementation of enablers to evolve solution intent, working directly with Agile teams to implement them.
  • Good knowledge of the implications of cyber security on production systems.
  • Experience architecting & estimating deep technical custom solutions & integrations.


Added advantage:


  • You have developed software using web technologies.
  • You have handled a project from start to end.
  • You have worked in an Agile Development project and have experience of writing and estimating User Stories
  • Communication Skills: Excellent verbal and written communication skills, with the ability to clearly explain technical issues to non-technical clients.
  • Client-Facing Experience: Strong ability to interact with clients, gather necessary information, and ensure a high level of customer satisfaction.
  • Problem-Solving: Quick-thinking and proactive in resolving minor issues, with a focus on providing excellent user experience.
  • Team Collaboration: Ability to collaborate with development leads, engineering teams, and other stakeholders to escalate complex issues or gather additional technical support when required.


Preferred Skills:


  • Familiarity with Cloud Platforms and Cyber Security tools: Knowledge of cloud computing platforms and services (AWS, Azure, Google Cloud) and Cortex XSOAR, SIEM, SOAR, XDR tools is a plus.
  • Automation and Scripting: Experience with automating processes or writing scripts to support issue resolution is an advantage.


Work Environment:

  • This is a rotational-shift position.
  • During the evening shift (5 PM to 2 AM), you will be expected to work independently and efficiently.
  • The position may require occasional weekend shifts depending on project requirements.
  • A night-shift allowance is provided as an additional benefit.
Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Pune, Gurugram, Noida, Bhopal, Bengaluru (Bangalore)
4 - 8 yrs
₹8L - ₹22L / yr
MLOps
Amazon Web Services (AWS)
AWS SageMaker
Python

Role - MLOps Engineer

Location - Pune, Gurgaon, Noida, Bhopal, Bangalore 

Mode - Hybrid


Role Overview

We are looking for an experienced MLOps Engineer to join our growing AI/ML team. You will be responsible for automating, monitoring, and managing machine learning workflows and infrastructure in production environments. This role is key to ensuring our AI solutions are scalable, reliable, and continuously improving.


Key Responsibilities

  • Design, build, and manage end-to-end ML pipelines, including model training, validation, deployment, and monitoring.
  • Collaborate with data scientists, software engineers, and DevOps teams to integrate ML models into production systems.
  • Develop and manage scalable infrastructure using AWS, particularly AWS SageMaker (a minimal deployment sketch follows this list).
  • Automate ML workflows using CI/CD best practices and tools.
  • Ensure model reproducibility, governance, and performance tracking.
  • Monitor deployed models for data drift, model decay, and performance metrics.
  • Implement robust versioning and model registry systems.
  • Apply security, performance, and compliance best practices across ML systems.
  • Contribute to documentation, knowledge sharing, and continuous improvement of our MLOps capabilities.
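
As one concrete example of the SageMaker work referenced above, here is a minimal deployment sketch using the SageMaker Python SDK; the model artifact, IAM role, entry point, and endpoint name are hypothetical placeholders:

    import sagemaker
    from sagemaker.sklearn.model import SKLearnModel

    session = sagemaker.Session()

    model = SKLearnModel(
        model_data="s3://example-models/churn/model.tar.gz",          # hypothetical artifact
        role="arn:aws:iam::123456789012:role/ExampleSageMakerRole",   # hypothetical role
        entry_point="inference.py",   # custom load/predict hooks
        framework_version="1.2-1",
        sagemaker_session=session,
    )

    # Stand up a real-time HTTPS endpoint for scoring
    predictor = model.deploy(
        initial_instance_count=1,
        instance_type="ml.m5.large",
        endpoint_name="churn-scoring",
    )
    print(predictor.predict([[0.4, 12, 3]]))

In a production setup this deployment would usually be wired into a CI/CD pipeline with a model registry rather than invoked by hand.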


Required Skills & Qualifications

  • 4+ years of experience in Software Engineering or MLOps, preferably in a production environment.
  • Proven experience with AWS services, especially AWS SageMaker, for model development and deployment.
  • Working knowledge of AWS DataZone (preferred).
  • Strong programming skills in Python, with exposure to R, Scala, or Apache Spark.
  • Experience with ML model lifecycle management, version control, containerization (Docker), and orchestration tools (e.g., Kubernetes).
  • Familiarity with MLflow, Airflow, or similar pipeline/orchestration tools.
  • Experience integrating ML systems into CI/CD workflows using tools like Jenkins, GitHub Actions, or AWS CodePipeline.
  • Solid understanding of DevOps and cloud-native infrastructure practices.
  • Excellent problem-solving skills and the ability to work collaboratively across teams.


Read more
Cambridge Wealth (Baker Street Fintech)
Sangeeta Bhagwat
Posted by Sangeeta Bhagwat
Pune
3 - 6 yrs
₹7L - ₹9L / yr
Agile/Scrum
Python
Google Analytics
Product Management
Product Lifecycle Management (PLM)
+4 more

 

ABOUT US  

We are a fast-growing, excellence-oriented mutual fund distribution and fintech firm delivering exceptional solutions to domestic/NRI/retail and ultra-HNI clients. Cambridge Wealth is a respected brand in the wealth segment, having won awards from BSE and Mutual Fund houses. Learn more about us at www.cambridgewealth.in

 

JOB OVERVIEW  

Drive product excellence through data-backed decisions while ensuring efficient delivery and continuous improvement.

 

KEY RESPONSIBILITIES  

  • Sprint & Timeline Management: Drive Agile sprints with clear milestones to prevent scope creep
  • Process Optimization: Identify bottlenecks early and implement standardised workflows
  • Market Research: Analyze competitive landscape and customer preferences to inform strategy
  • Feature Development: Refine product features based on customer feedback and data analysis
  • Performance Analysis: Create actionable dashboards tracking KPIs and user behavior metrics
  • Risk Management: Proactively identify potential roadblocks and develop contingency plans
  • User Testing: Conduct testing sessions and translate feedback into product improvements
  • Documentation: Develop comprehensive specs and user stories for seamless implementation
  • Cross-Team Coordination: Align stakeholders on priorities and deliverables throughout development

 

TECHNICAL REQUIREMENTS  

  • Data Analysis: SQL proficiency for data extraction and manipulation (a small query sketch follows this list)
  • Project Management: Expert in Agile methods and tracking tools
  • Advanced Excel/Google Sheets/Zoho Sheets: Expertise in pivot tables, VLOOKUP, and complex formulas
  • Analytics Platforms: Experience with Mixpanel, Amplitude, or Google Analytics, Zoho Analytics
  • Financial Knowledge: Understanding of mutual funds and fintech industry metrics
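
To illustrate the SQL expectation above, here is a small extraction sketch; sqlite3 stands in for whatever production SQL engine is in use, and the schema (sip_transactions and its columns) is hypothetical:

    import sqlite3  # stand-in engine; swap for the production SQL connector

    conn = sqlite3.connect("analytics.db")  # hypothetical database file
    sql = """
        SELECT strftime('%Y-%m', invested_on) AS month,
               COUNT(DISTINCT client_id)      AS active_clients,
               SUM(amount)                    AS gross_inflow
        FROM sip_transactions
        GROUP BY month
        ORDER BY month
    """
    for month, clients, inflow in conn.execute(sql):
        print(month, clients, inflow)
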

 

QUALIFICATIONS  

  • 2+ years experience in product analysis or similar role
  • Strong analytical skills with the ability to collect, analyze, and interpret data from various sources.
  • Basic understanding of user experience (UX) principles and methodologies.
  • Excellent verbal and written communication skills for translating complex findings.
  • Ability to work collaboratively in a team environment and adapt to changing priorities.
  • Eagerness to learn, take initiative, and contribute ideas to improve products and processes.

 

READY TO SHAPE THE FUTURE OF FINTECH?  

Apply now to join our award-winning team


Our Hiring Process:

  1. You Apply and answer a couple of quick questions [5 min]
  2. Recruiter screening phone interview [30 min]
  3. Online Technical assessment [60 min]
  4. Technical interview [45 min]
  5. Founder's interview [30 min]
  6. We make you an offer and proceed for reference and BGV check
Read more
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Hyderabad, Pune
4 - 10 yrs
₹10L - ₹24L / yr
Java
Artificial Intelligence (AI)
Automation
IDX
Spring Boot
+4 more

Job Title : Senior Backend Engineer – Java, AI & Automation

Experience : 4+ Years

Location : Any Cognizant location (India)

Work Mode : Hybrid

Interview Rounds :

  1. Virtual
  2. Face-to-Face (In-person)

Job Description :

Join our Backend Engineering team to design and maintain services on the Intuit Data Exchange (IDX) platform.

You'll work on scalable backend systems powering millions of daily transactions across Intuit products.


Key Qualifications :

  • 4+ years of backend development experience.
  • Strong in Java, Spring framework.
  • Experience with microservices, databases, and web applications.
  • Proficient in AWS and cloud-based systems.
  • Exposure to AI and automation tools (Workato preferred).
  • Python development experience.
  • Strong communication skills.
  • Comfortable with occasional US shift overlap.
Read more
InfoBeans

at InfoBeans

2 recruiters
Sanjana Thakur
Posted by Sanjana Thakur
Pune, Indore
8 - 13 yrs
Best in industry
Python
Amazon Web Services (AWS)
Django
Microservices

Job Title: Python Django Microservices Lead (Django Backend Lead Developer)


Location: Indore/Pune (Hybrid - Wednesday and Thursday WFO)

Timings: 12.30 PM to 9.30 PM

Experience Level: 8+ Years


Job Overview: We are seeking an experienced Django Backend Lead Developer to join our team. The ideal candidate will have a strong background in backend development, cloud technologies, and big data processing. This role involves leading technical projects, mentoring junior developers, and ensuring the delivery of high-quality solutions.


Responsibilities:

  • Lead the development of backend systems using Django.
  • Design and implement scalable and secure APIs (a minimal endpoint sketch follows this list).
  • Integrate Azure Cloud services for application deployment and management.
  • Utilize Azure Databricks for big data processing and analytics.
  • Implement data processing pipelines using PySpark.
  • Collaborate with front-end developers, product managers, and other stakeholders to deliver comprehensive solutions.
  • Conduct code reviews and ensure adherence to best practices.
  • Mentor and guide junior developers.
  • Optimize database performance and manage data storage solutions.
  • Ensure high performance and security standards for applications.
  • Participate in architecture design and technical decision-making.
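
For illustration of the API work in the second item above, here is a minimal paginated Django endpoint; the Order model and its field names are hypothetical:

    # views.py
    from django.core.paginator import Paginator
    from django.http import JsonResponse

    from .models import Order  # hypothetical model

    def order_list(request):
        """Return one page of orders as JSON, newest first."""
        rows = Order.objects.order_by("-created_at").values("id", "status", "total")
        paginator = Paginator(rows, per_page=50)
        page = paginator.get_page(request.GET.get("page", 1))
        return JsonResponse({"count": paginator.count, "results": list(page)})

    # urls.py
    from django.urls import path
    from . import views

    urlpatterns = [
        path("api/v1/orders/", views.order_list, name="order-list"),
    ]
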


Qualifications:

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • 8+ years of experience in backend development.
  • 8+ years of experience with Django.
  • Proven experience with Azure Cloud services.
  • Experience with Azure Databricks and PySpark.
  • Strong understanding of RESTful APIs and web services.
  • Excellent communication and problem-solving skills.
  • Familiarity with Agile methodologies.
  • Experience with database management (SQL and NoSQL).

Skills: Django, Python, Azure Cloud, Azure Databricks, Delta Lake and Delta tables, PySpark, SQL/NoSQL databases, RESTful APIs, Git, and Agile methodologies.

Read more
Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Bengaluru (Bangalore), Pune, Bhopal, Jaipur
4 - 6 yrs
₹4L - ₹20L / yr
Amazon Web Services (AWS)
Python
SageMaker
MLOps

Role - MLOps Engineer

Required Experience - 4 Years

Location - Pune, Gurgaon, Noida, Bhopal, Bangalore 

Mode - Hybrid


Key Requirements:

  • 4+ years of experience in Software Engineering with MLOps focus
  • Strong expertise in AWS, particularly AWS SageMaker (required)
  • AWS Data Zone experience (preferred)
  • Proficiency in Python, R, Scala, or Spark
  • Experience developing scalable, reliable, and secure applications
  • Track record of production-grade development, integration and support


 

Read more
Onelab Ventures

Agency job
via AccioJob by AccioJobHiring Board
Pune
0 - 3 yrs
₹5L - ₹6L / yr
Python
FastAPI
Flask
Django
PyTorch
+3 more

AccioJob is conducting an offline hiring drive with OneLab Ventures for the position of:


  • AI/ML Engineer / Intern - Python, FastAPI, Flask/Django, PyTorch, TensorFlow, Scikit-learn, GenAI Tools


Apply Now: https://links.acciojob.com/44MJQSB


Eligibility:

  • Degree: BTech / BSc / BCA / MCA / MTech / MSc / BCS / MCS
  • Graduation Year:
  • For Interns - 2024 and 2025
  • For experienced - 2024 and before
  • Branch: All Branches
  • Location: Pune (work from office)


Salary:

  • For interns - 25K for 6 months and 5-6 LPA PPO
  • For experienced - Hike on the current CTC


Evaluation Process:

  • Assessment at AccioJob Pune Skill Centre.
  • Company side process: 2 rounds of tech interviews (Virtual + F2F) + 1 HR round


Apply Now: https://links.acciojob.com/44MJQSB


Important: Please bring your laptop & earphones for the test.



Read more
Onelab Ventures

Agency job
via AccioJob by AccioJobHiring Board
Pune
0 - 3 yrs
₹5L - ₹6L / yr
Python
Django
FastAPI
Flask
HTML/CSS
+3 more

AccioJob is conducting an offline hiring drive with OneLab Ventures for the position of:


  • Python Full Stack Engineer / Intern - Python, FastAPI, Flask/Django, HTML, CSS, JavaScript, and frameworks like React.js or Node.js


Apply Now: https://links.acciojob.com/4d0Gtd6


Eligibility:

  • Degree: BTech / BSc / BCA / MCA / MTech / MSc / BCS / MCS
  • Graduation Year:
  • For Interns - 2024 and 2025
  • For experienced - 2024 and before
  • Branch: All Branches
  • Location: Pune (work from office)


Salary:

  • For interns - 25K for 6 months and 5-6 LPA PPO
  • For experienced - Hike on the current CTC


Evaluation Process:

  • Assessment at AccioJob Pune Skill Centre.
  • Company side process: 2 rounds of tech interviews (Virtual + F2F) + 1 HR round


Apply Now: https://links.acciojob.com/4d0Gtd6


Important: Please bring your laptop & earphones for the test.


Read more
ZeMoSo Technologies

at ZeMoSo Technologies

11 recruiters
Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Mumbai, Bengaluru (Bangalore), Hyderabad, Chennai, Pune
4 - 8 yrs
₹10L - ₹15L / yr
Data engineering
Python
SQL
Data Warehouse (DWH)
Amazon Web Services (AWS)
+3 more

Work Mode: Hybrid


Need B.Tech, BE, M.Tech, ME candidates - Mandatory



Must-Have Skills:

● Educational Qualification: B.Tech, BE, M.Tech, ME in any field.

● Minimum of 3 years of proven experience as a Data Engineer.

● Strong proficiency in Python programming language and SQL.

● Experience in Databricks and in setting up and managing data pipelines and data warehouses/lakes.

● Good comprehension and critical thinking skills.


● Kindly note: the salary bracket will vary according to the candidate's experience -

- Experience from 4 yrs to 6 yrs - Salary up to 22 LPA

- Experience from 5 yrs to 8 yrs - Salary up to 30 LPA

- Experience more than 8 yrs - Salary up to 40 LPA

Read more
Data Axle

at Data Axle

2 candid answers
Nikita Sinha
Posted by Nikita Sinha
Pune
6 - 10 yrs
Best in industry
Google Cloud Platform (GCP)
ETL
Python
Java
Scala
+7 more

About Data Axle:

Data Axle Inc. has been an industry leader in data, marketing solutions, sales and research for over 45 years in the USA. Data Axle has set up a strategic global center of excellence in Pune. This center delivers mission critical data services to its global customers powered by its proprietary cloud-based technology platform and by leveraging proprietary business & consumer databases. Data Axle is headquartered in Dallas, TX, USA.


Roles and Responsibilities:

  • Design, implement, and manage scalable analytical data infrastructure, enabling efficient access to large datasets and high-performance computing on Google Cloud Platform (GCP).
  • Develop and optimize data pipelines using GCP-native services like BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Data Fusion, and Cloud Storage (a minimal BigQuery sketch follows this list).
  • Work with diverse data sources to extract, transform, and load data into enterprise-grade data lakes and warehouses, ensuring high availability and reliability.
  • Implement and maintain real-time data streaming solutions using Pub/Sub, Dataflow, and Kafka.
  • Research and integrate the latest big data and visualization technologies to enhance analytics capabilities and improve efficiency.
  • Collaborate with cross-functional teams to implement machine learning models and AI-driven analytics solutions using Vertex AI and BigQuery ML.
  • Continuously improve existing data architectures to support scalability, performance optimization, and cost efficiency.
  • Enhance data security and governance by implementing industry best practices for access control, encryption, and compliance.
  • Automate and optimize data workflows to simplify reporting, dashboarding, and self-service analytics using Looker and Data Studio.
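
As a small illustration of the BigQuery side of this stack, here is a query sketch with the google-cloud-bigquery client; the project, dataset, and table names are hypothetical placeholders:

    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # hypothetical project

    sql = """
        SELECT state, COUNT(*) AS businesses
        FROM `example-project.curated.business_profiles`
        GROUP BY state
        ORDER BY businesses DESC
        LIMIT 10
    """

    # Run the query and stream the results
    for row in client.query(sql).result():
        print(f"{row.state}: {row.businesses}")
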


Basic Qualifications

  • 7+ years of experience in data engineering, software development, business intelligence, or data science, with expertise in large-scale data processing and analytics.
  • Strong proficiency in SQL and experience with BigQuery for data warehousing.
  • Hands-on experience in designing and developing ETL/ELT pipelines using GCP services (Cloud Composer, Dataflow, Dataproc, Data Fusion, or Apache Airflow).
  • Expertise in distributed computing and big data processing frameworks, such as Apache Spark, Hadoop, or Flink, particularly within Dataproc and Dataflow environments.
  • Experience with business intelligence and data visualization tools, such as Looker, Tableau, or Power BI.
  • Knowledge of data governance, security best practices, and compliance requirements in cloud environments.


Preferred Qualifications:

  • Degree/Diploma in Computer Science, Engineering, Mathematics, or a related technical field.
  • Experience working with GCP big data technologies, including BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud SQL.
  • Hands-on experience with real-time data processing frameworks, including Kafka and Apache Beam.
  • Proficiency in Python, Java, or Scala for data engineering and pipeline development.
  • Familiarity with DevOps best practices, CI/CD pipelines, Terraform, and infrastructure-as-code for managing GCP resources.
  • Experience integrating AI/ML models into data workflows, leveraging BigQuery ML, Vertex AI, or TensorFlow.
  • Understanding of Agile methodologies, software development life cycle (SDLC), and cloud cost optimization strategies.
Read more
Data Axle

at Data Axle

2 candid answers
Eman Khan
Posted by Eman Khan
Pune
9 - 12 yrs
Best in industry
Python
PySpark
Machine Learning (ML)
SQL
Data Science
+1 more

Roles & Responsibilities:  

We are looking for a Data Scientist to join the Data Science Client Services team and continue our success in identifying high-quality target audiences that generate profitable marketing returns for our clients. We are looking for experienced data science, machine learning, and MLOps practitioners to design, build, and deploy impactful predictive marketing solutions that serve a wide range of verticals and clients. The right candidate will enjoy contributing to and learning from a highly talented team and working on a variety of projects.


We are looking for a Lead Data Scientist who will be responsible for:

  • Ownership of design, implementation, and deployment of machine learning algorithms in a modern Python-based cloud architecture
  • Design or enhancement of ML workflows for data ingestion, model design, model inference, and scoring (a minimal training-and-scoring sketch follows this list)
  • Oversight of team project execution and delivery
  • Establishing peer review guidelines for high-quality coding to help develop junior team members' skill-set growth, cross-training, and team efficiencies
  • Visualizing and publishing model performance results and insights to internal and external audiences
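
As a minimal, self-contained illustration of the modeling loop referenced above, here is a train-and-score sketch on synthetic data standing in for a response-propensity dataset; in practice this would run against client data on the cloud platform:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Synthetic, imbalanced stand-in for a marketing response dataset
    X, y = make_classification(
        n_samples=5000, n_features=20, weights=[0.9], random_state=42
    )
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, stratify=y, random_state=42
    )

    model = GradientBoostingClassifier(random_state=42).fit(X_train, y_train)
    scores = model.predict_proba(X_test)[:, 1]
    print(f"Holdout AUC: {roc_auc_score(y_test, scores):.3f}")
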


Qualifications:  

  • Masters in a relevant quantitative, applied field (Statistics, Econometrics, Computer Science, Mathematics, Engineering)  
  • Minimum of 9+ years of work experience in the end-to-end lifecycle of ML model development and deployment into production within a cloud infrastructure (Databricks is highly preferred)
  • Exhibit deep knowledge of core mathematical principles relating to data science and machine learning (ML Theory + Best Practices, Feature Engineering and Selection, Supervised and Unsupervised ML, A/B Testing, etc.)  
  • Proficiency in Python and SQL required; PySpark/Spark experience a plus  
  • Ability to conduct productive peer reviews and maintain proper code structure in GitHub
  • Proven experience developing, testing, and deploying various ML algorithms (neural networks, XGBoost, Bayes, and the like)  
  • Working knowledge of modern CI/CD methods  


This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level. 

Read more