11+ Data cleansing Jobs in Pune | Data cleansing Job openings in Pune
MUST-HAVES:
- Machine Learning + AWS + (EKS OR ECS OR Kubernetes) + (Redshift AND Glue) + SageMaker
- Notice period - 0 to 15 days only
- Hybrid work mode: 3 days in office, 2 days at home
SKILLS: AWS, AWS CLOUD, AMAZON REDSHIFT, EKS
ADDITIONAL GUIDELINES:
- Interview process: 2 technical rounds + 1 client round
- 3 days in office, hybrid model
CORE RESPONSIBILITIES:
- The MLE will design, build, test, and deploy scalable machine learning systems, optimizing model accuracy and efficiency.
- Model Development: Develop models using algorithms and architectures ranging from traditional statistical methods to deep learning, including LLMs in modern frameworks.
- Data Preparation: Prepare, cleanse, and transform data for model training and evaluation (a minimal cleansing sketch follows this list).
- Algorithm Implementation: Implement and optimize machine learning algorithms and statistical models.
- System Integration: Integrate models into existing systems and workflows.
- Model Deployment: Deploy models to production environments and monitor performance.
- Collaboration: Work closely with data scientists, software engineers, and other stakeholders.
- Continuous Improvement: Identify areas for improvement in model performance and systems.
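For illustration only, here is a minimal PySpark sketch of the kind of data-preparation and cleansing work described above. The table, column names, and S3 paths (e.g., raw_events, event_ts, the example bucket) are hypothetical, not part of the role description.

```python
# Minimal data-cleansing sketch in PySpark (illustrative; all names/paths are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("data-cleansing-example").getOrCreate()

raw = spark.read.parquet("s3://example-bucket/raw_events/")  # hypothetical source path

cleaned = (
    raw
    .dropDuplicates(["event_id"])                           # remove duplicate records
    .filter(F.col("event_ts").isNotNull())                  # drop rows missing a timestamp
    .withColumn("amount", F.col("amount").cast("double"))   # normalize data types
    .fillna({"country": "UNKNOWN"})                         # impute a sentinel for missing values
)

cleaned.write.mode("overwrite").parquet("s3://example-bucket/clean_events/")
```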
SKILLS:
- Programming and Software Engineering: Knowledge of software engineering best practices (version control, testing, CI/CD).
- Data Engineering: Ability to build data pipelines, perform data cleaning, and engineer features. Proficiency in SQL for data manipulation, plus Kafka and ChaosSearch logs for troubleshooting; other technology touchpoints include ScyllaDB (similar to Bigtable), OpenSearch, and the Neo4j graph database.
- Model Deployment and Monitoring: MLOps experience deploying ML models to production environments.
- Knowledge of model monitoring and performance evaluation.
REQUIRED EXPERIENCE:
- Amazon SageMaker: Deep understanding of SageMaker's capabilities for building, training, and deploying ML models; understanding of the SageMaker pipeline, with the ability to analyze gaps and recommend/implement improvements (see the illustrative sketch after this list)
- AWS Cloud Infrastructure: Familiarity with S3, EC2, and Lambda, and with using these services in ML workflows
- AWS data services: Redshift, Glue
- Containerization and Orchestration: Understanding of Docker and Kubernetes, and their implementation within AWS (EKS, ECS)
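As a rough illustration of the SageMaker deployment skill listed above, the sketch below uses the sagemaker Python SDK to deploy a pre-trained scikit-learn artifact to a real-time endpoint. The model artifact path, IAM role ARN, and inference script name are hypothetical placeholders, and the exact framework version would depend on the project.

```python
# Minimal SageMaker deployment sketch (illustrative; artifact path, role ARN,
# and inference.py handler are hypothetical placeholders).
import sagemaker
from sagemaker.sklearn import SKLearnModel

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/ExampleSageMakerRole"  # hypothetical role

model = SKLearnModel(
    model_data="s3://example-bucket/models/model.tar.gz",  # hypothetical trained artifact
    role=role,
    entry_point="inference.py",        # hypothetical inference handler script
    framework_version="1.2-1",
    sagemaker_session=session,
)

# Deploy to a real-time endpoint and send a sample request.
predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.large")
print(predictor.predict([[0.1, 0.2, 0.3]]))
```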
SENIOR DATA ENGINEER:
ROLE SUMMARY:
Own the design and delivery of petabyte-scale data platforms and pipelines across AWS and modern Lakehouse stacks. You’ll architect, code, test, optimize, and operate ingestion, transformation, storage, and serving layers. This role requires autonomy, strong engineering judgment, and partnership with project managers, infrastructure teams, testers, and customer architects to land secure, cost-efficient, and high-performing solutions.
RESPONSIBILITIES:
- Architecture and design: Create HLD/LLD/SAD, source–target mappings, data contracts, and optimal designs aligned to requirements.
- Pipeline development: Build and test robust ETL/ELT for batch, micro-batch, and streaming across RDBMS, flat files, APIs, and event sources.
- Performance and cost tuning: Profile and optimize jobs, right-size infrastructure, and model license/compute/storage costs.
- Data modeling and storage: Design schemas and SCD strategies; manage relational, NoSQL, data lakes, Delta Lakes, and Lakehouse tables.
- DevOps and release: Establish coding standards, templates, CI/CD, configuration management, and monitored release processes.
- Quality and reliability: Define DQ rules and lineage; implement SLA tracking, failure detection, RCA, and proactive defect mitigation.
- Security and governance: Enforce IAM best practices, retention, audit/compliance; implement PII detection and masking.
- Orchestration: Schedule and govern pipelines with Airflow and serverless event-driven patterns (a minimal Airflow sketch follows this list).
- Stakeholder collaboration: Clarify requirements, present design options, conduct demos, and finalize architectures with customer teams.
- Leadership: Mentor engineers, set FAST goals, drive upskilling and certifications, and support module delivery and sprint planning.
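To illustrate the orchestration responsibility above, here is a minimal Airflow DAG sketch (Airflow 2.4+ style). The DAG id, task names, and the extract/load callables are hypothetical stand-ins for real pipeline steps.

```python
# Minimal Airflow DAG sketch (illustrative; DAG id, tasks, and callables are hypothetical).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from source systems")  # placeholder for a real extract step


def load():
    print("write curated data to the lakehouse")  # placeholder for a real load step


with DAG(
    dag_id="example_daily_ingestion",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # extract runs before load
```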
REQUIRED QUALIFICATIONS:
- Experience: 15+ years designing distributed systems at petabyte scale; 10+ years building data lakes and multi-source ingestion.
- Cloud (AWS): IAM, VPC, EC2, EKS/ECS, S3, RDS, DMS, Lambda, CloudWatch, CloudFormation, CloudTrail.
- Programming: Python (preferred), PySpark, and SQL for analytics, window functions, and performance tuning (see the window-function sketch after this list).
- ETL tools: AWS Glue, Informatica, Databricks, GCP DataProc; orchestration with Airflow.
- Lakehouse/warehousing: Snowflake, BigQuery, Delta Lake/Lakehouse; schema design, partitioning, clustering, performance optimization.
- DevOps/IaC: Extensive hands-on Terraform practice; CI/CD with GitHub Actions and Jenkins; configuration governance and release management.
- Serverless and events: Design event-driven distributed systems on AWS.
- NoSQL: 2–3 years with DocumentDB including data modeling and performance considerations.
- AI services: AWS Entity Resolution, Amazon Comprehend; run custom LLMs on Amazon SageMaker; use LLMs for PII classification.
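For illustration of the window-function skill listed above, the following PySpark sketch keeps only the latest record per customer, a common deduplication / "current record" pattern. The orders DataFrame and its columns are hypothetical.

```python
# Minimal PySpark window-function sketch (illustrative; the "orders" data is hypothetical).
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("window-function-example").getOrCreate()

orders = spark.createDataFrame(
    [("c1", "2024-01-01", 100.0), ("c1", "2024-01-05", 40.0), ("c2", "2024-01-02", 75.0)],
    ["customer_id", "order_date", "amount"],
)

# Rank each customer's orders by recency and keep the most recent one.
w = Window.partitionBy("customer_id").orderBy(F.col("order_date").desc())
latest_orders = (
    orders
    .withColumn("rn", F.row_number().over(w))
    .filter(F.col("rn") == 1)
    .drop("rn")
)
latest_orders.show()
```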
NICE-TO-HAVE QUALIFICATIONS:
- Data governance automation: 10+ years defining audit, compliance, retention standards and automating governance workflows.
- Table and file formats: Apache Parquet; Apache Iceberg as an analytical table format.
- Advanced LLM workflows: RAG and agentic patterns over proprietary data; re-ranking with index/vector store results.
- Multi-cloud exposure: Azure ADF/ADLS, GCP Dataflow/DataProc; FinOps practices for cross-cloud cost control.
OUTCOMES AND MEASURES:
- Engineering excellence: Adherence to processes, standards, and SLAs; reduced defects and non-compliance; fewer recurring issues.
- Efficiency: Faster run times and lower resource consumption with documented cost models and performance baselines.
- Operational reliability: Faster detection, response, and resolution of failures; quick turnaround on production bugs; strong release success.
- Data quality and security: High DQ pass rates, robust lineage, minimal security incidents, and audit readiness.
- Team and customer impact: On-time milestones, clear communication, effective demos, improved satisfaction, and completed certifications/training.
LOCATION AND SCHEDULE:
● Location: Outside US (OUS).
● Schedule: Minimum 6 hours of overlap with US time zones.
Job Description:
We’re looking for a Full Time Project Manager (Payments & Crypto) to oversee initiatives around payment infrastructure, digital wallets, and blockchain integrations. You’ll coordinate between technical, product, and business teams to ensure smooth execution of payment and crypto-related projects.
Responsibilities
• Manage and track progress of ongoing payment and crypto projects.
• Coordinate between product, engineering, design, and compliance teams.
• Set timelines, milestones, and deliverables for payment feature rollouts.
• Report project updates and performance metrics to leadership.
• Identify risks, dependencies, and opportunities for optimization.
Requirements
• 2+ years of experience managing fintech, payments, or blockchain projects.
• Strong understanding of payment gateways, wallets, and crypto ecosystems.
• Excellent communication and stakeholder management skills.
• Hands-on experience with tools like Jira, Asana, or Notion.
Looking for an experienced and organized Project Manager to oversee end-to-end project execution, ensure timely delivery, and coordinate between teams, clients, and stakeholders. The role requires strong leadership, planning, and communication skills to successfully manage projects and achieve business objectives.
Key Responsibilities:
- Plan, execute, and deliver projects within scope, budget, and timelines.
- Define project objectives, deliverables, and resource requirements.
- Coordinate with cross-functional teams (sales, operations, documentation, counseling, etc.).
- Track project progress, prepare reports, and communicate updates to management and stakeholders.
- Identify project risks, issues, and bottlenecks; implement corrective actions.
- Ensure client requirements are clearly understood and met with quality standards.
- Manage project documentation, compliance, and reporting.
- Lead, mentor, and motivate team members to achieve performance goals.
- Optimize workflows and improve efficiency in project execution.
Requirements:
- Proven work experience as a Project Manager.
- Strong knowledge of project management tools, methodologies, and reporting.
- Excellent leadership, problem-solving, and organizational skills.
- Strong communication and stakeholder management abilities.
- Ability to handle multiple projects simultaneously and meet deadlines.
- Proficiency in MS Office, project management tools (e.g., Trello, Asana, MS Project, Jira).
✨ Role: Store Executive
📍 Locations:
• Erode, Tamil Nadu
• Pune, Maharashtra
💼 Industry: Footwear / Retail
💰 Budget: ₹15,000 per month
🎓 Eligibility: Freshers can apply
🔑 Key Responsibilities:
• Assist customers and deliver excellent service
• Support day-to-day store operations
• Help in billing, stock handling, and visual merchandising
• Maintain store hygiene and brand standards
• Coordinate with the store team to achieve daily targets
🌟 What We’re Looking For:
• Good communication skills
• Customer-first attitude
• Willingness to learn and grow in retail
• Flexible to work in shifts
Expected Scope of work / Responsibilities:
Support as a process expert in designing the process template for managing SAP projects.
General Requirements:
• Implementation Experience: Minimum of two end-to-end SAP EWM implementations.
• Support Experience: At least two projects involving post-go-live support or full-time support roles.
• Rollout Experience: Understanding of template rollout methodologies.
• Integration & Interface Expertise: Hands-on experience with IDOCs, CIF, RFC for setup and issue resolution.
• Solution Design: Ability to translate customer requirements into EWM solutions effectively.
Technical Expertise (6–10 years):
• EWM Versions: Experience in SAP EWM 9.5 and S/4HANA Embedded EWM, with at least one year in S/4HANA.
• Functional Specifications: Ability to prepare Functional Specs for EWM developments.
• EWM-ERP Integration: Expertise in EWM-ERP integration setup and configuration.
• EWM Features: Must have hands-on experience with all mandatory EWM features (as per provided reference).
• Nice to Have: Knowledge of Value-Added Services (VAS) and Automated Warehouse processes.
Process Knowledge & Experience:
• Inbound, Outbound, Warehousing, and Packaging: Must be strong in at least one process area, with good knowledge of the other three.
o Definition of "Strong": Ability to list worked-on processes, explain customer requirements, and describe how the process meets business needs.
• Process Variants: Understanding of process variations for different business scenarios.
• EWM Production Integration: Good knowledge of production replenishment processes and integration features.
Industry Expertise:
• Auto/Manufacturing Industry: Minimum two years of experience working in EWM within automotive or manufacturing industries.
Role Overview:
• Position: Customer-Facing SAP EWM Functional Consultant.
• Function: Serve as a Process Solution Expert, supporting Plant Users in addressing queries and issues.
Key Responsibilities:
• Customer Engagement:
o Conduct gap analysis and identify new business requirements.
o Collaborate with the Process Responsible team to develop new solutions.
• User & Process Support:
o Assist Plant Key Users & Plant Users in testing phases (Unit, Integration, UAT).
o Support data migration activities (Test Migration, Productive Migration).
o Conduct training sessions for Key Users.
o Provide Hypercare Support post go-live.
o Troubleshoot defects during testing and post-go-live phases.
Certifications (Preferred):
• SAP EWM Certification with knowledge of:
o Embedded EWM
o Decentralized EWM
• Optional: SAP WM Certification
Soft Skills & Competencies:
• Strong verbal and written communication skills in English.
• Ability to troubleshoot and resolve issues independently.
• Proficiency in MS Office (Excel, PowerPoint, etc.).
• Strong collaboration skills to work with internal and external customers effectively.
Travel Requirements:
• Willingness to travel for short-term business trips (up to 4 weeks) domestically and internationally.
Nice to Have:
• Exposure to SAP Transportation Management (SAP TM).
Skill Name: ETL Automation Testing
Location: Bangalore, Chennai and Pune
Experience: 5+ Years
Required:
Experience in ETL Automation Testing
Strong experience in PySpark (a minimal test sketch follows).
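As a rough illustration of ETL automation testing with PySpark, here is a minimal pytest-based sketch. The transformation under test (add_full_name) and its columns are hypothetical examples, not part of the role description.

```python
# Minimal PySpark ETL test sketch with pytest (illustrative; add_full_name is hypothetical).
import pytest
from pyspark.sql import SparkSession, functions as F


def add_full_name(df):
    """Hypothetical transformation: concatenate first and last name."""
    return df.withColumn("full_name", F.concat_ws(" ", "first_name", "last_name"))


@pytest.fixture(scope="session")
def spark():
    # Local Spark session for fast, isolated tests.
    return SparkSession.builder.master("local[1]").appName("etl-tests").getOrCreate()


def test_add_full_name(spark):
    source = spark.createDataFrame([("Asha", "Rao")], ["first_name", "last_name"])
    result = add_full_name(source).collect()
    assert result[0]["full_name"] == "Asha Rao"
```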
Budget: Industry Standards
Location: Bangalore, Pune, Noida and Chennai (WFO)
Positions: Multiple
Skills:
- .NET with MVC/Angular/Web API
- Experience in SQL
Designation: Linux/System/Support Engineer (L2)
Experience: 2–5 years
Notice period: Immediate to 30 days
- Server Monitoring
- Deployments
- Collecting information about the reported issues
- Ensuring that all information about reported issues has been logged in the ticketing system
- Must be able to follow and execute instructions specified in user guides and emails to run, monitor, and troubleshoot
- Must be able and willing to document activities and procedures
- Must have troubleshooting skills and knowledge of antivirus, firewall, and gateway tools
- Should be ready to work extended shifts if required
- Good customer management skills combined with good communication skills
- Databases: concepts and the ability to use DB tools such as psql
- Good understanding of Oracle, WebLogic, and Linux/Unix terminology, with the ability to execute commands
- Internet technologies: Tomcat/Apache concepts, basic HTML, etc.
- Able to use MS Excel and PowerPoint
- Demonstrate, present and promote services to salons, spas & gyms (Beauty & Wellness Industry)
- Should have excellent verbal & presentation skills
- Knowledge of the local language is a must
- Should be innovative and target-driven
- Willing to travel
- Demonstrate honesty & integrity