Data manipulation Jobs in Pune

2+ Data manipulation Jobs in Pune | Data manipulation Job openings in Pune

Apply to 2+ Data manipulation Jobs in Pune on CutShort.io. Explore the latest Data manipulation Job opportunities across top companies like Google, Amazon & Adobe.

Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
6 - 12 yrs
₹15L - ₹30L / yr
Machine Learning (ML)
Amazon Web Services (AWS)
Kubernetes
ECS
Amazon Redshift
+14 more

Core Responsibilities:

  • The MLE will design, build, test, and deploy scalable machine learning systems, optimizing model accuracy and efficiency.
  • Model Development: Design algorithms and architectures ranging from traditional statistical methods to deep learning, including the use of LLMs within modern frameworks.
  • Data Preparation: Prepare, cleanse, and transform data for model training and evaluation.
  • Algorithm Implementation: Implement and optimize machine learning algorithms and statistical models.
  • System Integration: Integrate models into existing systems and workflows.
  • Model Deployment: Deploy models to production environments and monitor performance.
  • Collaboration: Work closely with data scientists, software engineers, and other stakeholders.
  • Continuous Improvement: Identify areas for improvement in model performance and systems.

 

Skills:

  • Programming and Software Engineering: Knowledge of software engineering best practices (version control, testing, CI/CD).
  • Data Engineering: Ability to build data pipelines and handle data cleaning and feature engineering. Proficiency in SQL for data manipulation (see the sketch after this list), plus Kafka and ChaosSearch logs for troubleshooting; other technology touch points include ScyllaDB (similar to Bigtable), OpenSearch, and the Neo4j graph database.
  • Model Deployment and Monitoring: MLOps experience deploying ML models to production environments; knowledge of model monitoring and performance evaluation.
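To make the SQL requirement concrete, here is a minimal, hedged sketch of feature engineering in SQL driven from Python. The table and column names (user_events, user_id, amount) are hypothetical, and an in-memory SQLite database stands in for the warehouse purely so the example is self-contained; against Redshift the same query shape would run through a proper connector.

  import sqlite3

  # Hypothetical event data standing in for a warehouse table.
  conn = sqlite3.connect(":memory:")
  conn.executescript("""
      CREATE TABLE user_events (user_id INTEGER, event_date TEXT, amount REAL);
      INSERT INTO user_events VALUES
          (1, '2024-01-01', 10.0), (1, '2024-01-03', 25.0), (2, '2024-01-02', 5.0);
  """)

  # Feature engineering in SQL: per-user aggregates suitable for model training.
  features = conn.execute("""
      SELECT user_id,
             COUNT(*)        AS event_count,
             SUM(amount)     AS total_amount,
             AVG(amount)     AS avg_amount,
             MAX(event_date) AS last_event_date
      FROM user_events
      GROUP BY user_id
  """).fetchall()

  for row in features:
      print(row)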

 

Required experience:

  • Amazon SageMaker: Deep understanding of SageMaker's capabilities for building, training, and deploying ML models; understanding of SageMaker pipelines, with the ability to analyze gaps and recommend/implement improvements (a minimal sketch follows this list)
  • AWS Cloud Infrastructure: Familiarity with S3, EC2, and Lambda, and with using these services in ML workflows
  • AWS data services: Redshift, Glue
  • Containerization and Orchestration: Understanding of Docker and Kubernetes, and their implementation within AWS (EKS, ECS)
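As a rough illustration of the SageMaker workflow named above, here is a minimal sketch using the sagemaker Python SDK, not a definitive pipeline: the S3 paths and IAM role ARN are hypothetical placeholders, and a production setup would more likely be expressed as SageMaker Pipelines with parameterized, reusable steps.

  import sagemaker
  from sagemaker.estimator import Estimator

  session = sagemaker.Session()
  role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # hypothetical IAM role

  # Train a built-in XGBoost container on features already staged in S3 (e.g. by a Glue job).
  estimator = Estimator(
      image_uri=sagemaker.image_uris.retrieve("xgboost", session.boto_region_name, version="1.7-1"),
      role=role,
      instance_count=1,
      instance_type="ml.m5.xlarge",
      output_path="s3://example-bucket/models/",  # hypothetical bucket
      sagemaker_session=session,
  )
  estimator.fit({"train": "s3://example-bucket/features/train/"})

  # Deploy the trained model behind a real-time inference endpoint.
  predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")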

 

Skills: AWS, AWS Cloud, Amazon Redshift, EKS

 

Must-Haves

Machine Learning + AWS + (EKS OR ECS OR Kubernetes) + (Redshift AND Glue) + SageMaker

Notice period: 0 to 15 days only

Hybrid work mode: 3 days in office, 2 days at home

Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
6 - 12 yrs
₹25L - ₹30L / yr
Machine Learning (ML)
AWS CloudFormation
Online machine learning
Amazon Web Services (AWS)
ECS
+20 more

MUST-HAVES: 

  • Machine Learning + AWS + (EKS OR ECS OR Kubernetes) + (Redshift AND Glue) + SageMaker
  • Notice period: 0 to 15 days only
  • Hybrid work mode: 3 days in office, 2 days at home


SKILLS: AWS, AWS CLOUD, AMAZON REDSHIFT, EKS


ADDITIONAL GUIDELINES:

  • Interview process: 2 technical rounds + 1 client round
  • Hybrid model: 3 days in office


CORE RESPONSIBILITIES:

  • The MLE will design, build, test, and deploy scalable machine learning systems, optimizing model accuracy and efficiency.
  • Model Development: Design algorithms and architectures ranging from traditional statistical methods to deep learning, including the use of LLMs within modern frameworks.
  • Data Preparation: Prepare, cleanse, and transform data for model training and evaluation.
  • Algorithm Implementation: Implement and optimize machine learning algorithms and statistical models.
  • System Integration: Integrate models into existing systems and workflows.
  • Model Deployment: Deploy models to production environments and monitor performance.
  • Collaboration: Work closely with data scientists, software engineers, and other stakeholders.
  • Continuous Improvement: Identify areas for improvement in model performance and systems.


SKILLS:

  • Programming and Software Engineering: Knowledge of software engineering best practices (version control, testing, CI/CD).
  • Data Engineering: Ability to build data pipelines and handle data cleaning and feature engineering. Proficiency in SQL for data manipulation, plus Kafka and ChaosSearch logs for troubleshooting; other technology touch points include ScyllaDB (similar to Bigtable), OpenSearch, and the Neo4j graph database.
  • Model Deployment and Monitoring: MLOps experience deploying ML models to production environments; knowledge of model monitoring and performance evaluation (a short sketch follows this list).
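To make the monitoring requirement concrete, here is a minimal, hedged sketch of post-deployment checks: an AUC score on a labelled slice of production traffic and a population stability index (PSI) for feature drift. The synthetic data, metric choices, and the 0.2 drift threshold are illustrative assumptions, not part of the job description.

  import numpy as np
  from sklearn.metrics import roc_auc_score

  def population_stability_index(baseline, current, bins=10):
      """PSI between a training-time feature distribution and live traffic."""
      edges = np.histogram_bin_edges(baseline, bins=bins)
      base_pct = np.histogram(baseline, bins=edges)[0] / len(baseline) + 1e-6
      curr_pct = np.histogram(current, bins=edges)[0] / len(current) + 1e-6
      return float(np.sum((curr_pct - base_pct) * np.log(curr_pct / base_pct)))

  # Hypothetical scores and labels standing in for a labelled production slice.
  rng = np.random.default_rng(0)
  y_true = rng.integers(0, 2, 1000)
  y_score = np.clip(y_true * 0.6 + rng.normal(0.2, 0.3, 1000), 0, 1)

  print(f"AUC = {roc_auc_score(y_true, y_score):.3f}")
  psi = population_stability_index(rng.normal(0, 1, 1000), rng.normal(0.1, 1.1, 1000))
  print(f"PSI = {psi:.3f}")
  if psi > 0.2:  # illustrative drift threshold
      print("Feature drift detected; consider alerting or retraining.")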


REQUIRED EXPERIENCE:

  • Amazon SageMaker: Deep understanding of SageMaker's capabilities for building, training, and deploying ML models; understanding of SageMaker pipelines, with the ability to analyze gaps and recommend/implement improvements
  • AWS Cloud Infrastructure: Familiarity with S3, EC2, and Lambda, and with using these services in ML workflows
  • AWS data services: Redshift and Glue (a brief sketch of orchestrating these from Python follows this list)
  • Containerization and Orchestration: Understanding of Docker and Kubernetes, and their implementation within AWS (EKS, ECS)
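As a rough illustration of driving the Redshift and Glue pieces from Python, here is a minimal boto3 sketch; the Glue job name, cluster identifier, database, user, and SQL are hypothetical placeholders, and the calls assume credentials and IAM permissions are already configured.

  import boto3

  glue = boto3.client("glue")
  redshift_data = boto3.client("redshift-data")

  # Kick off a Glue ETL job that prepares training features (hypothetical job name).
  run = glue.start_job_run(JobName="feature-build-job")
  print("Started Glue run:", run["JobRunId"])

  # Query the resulting table in Redshift via the Data API (asynchronous call).
  stmt = redshift_data.execute_statement(
      ClusterIdentifier="analytics-cluster",  # hypothetical cluster
      Database="analytics",
      DbUser="ml_user",
      Sql="SELECT user_id, total_amount FROM features LIMIT 10",
  )
  print("Submitted Redshift statement:", stmt["Id"])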