11+ Data cleansing Jobs in Pune | Data cleansing Job openings in Pune
MUST-HAVES:
- Machine Learning + AWS + (EKS OR ECS OR Kubernetes) + (Redshift AND Glue) + SageMaker
- Notice period - 0 to 15 days only
- Hybrid work mode- 3 days office, 2 days at home
SKILLS: AWS, AWS Cloud, Amazon Redshift, EKS
ADDITIONAL GUIDELINES:
- Interview process: 2 technical rounds + 1 client round
CORE RESPONSIBILITIES:
- The MLE will design, build, test, and deploy scalable machine learning systems, optimizing model accuracy and efficiency
- Model Development: Build models spanning traditional statistical methods to deep learning, including LLMs in modern frameworks.
- Data Preparation: Prepare, cleanse, and transform data for model training and evaluation.
- Algorithm Implementation: Implement and optimize machine learning algorithms and statistical models.
- System Integration: Integrate models into existing systems and workflows.
- Model Deployment: Deploy models to production environments and monitor performance.
- Collaboration: Work closely with data scientists, software engineers, and other stakeholders.
- Continuous Improvement: Identify areas for improvement in model performance and systems.
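The data-preparation responsibility above can be sketched in a few lines of plain Python; the field names and cleansing rules here are hypothetical, not from the listing:

```python
def cleanse(records, required=("id", "value")):
    """Drop records missing required fields, strip strings, coerce numerics."""
    clean = []
    for rec in records:
        # Drop incomplete rows before training/evaluation
        if any(rec.get(f) in (None, "") for f in required):
            continue
        out = {}
        for key, val in rec.items():
            if isinstance(val, str):
                val = val.strip()  # normalize whitespace
            out[key] = val
        out["value"] = float(out["value"])  # coerce to numeric for the model
        clean.append(out)
    return clean

rows = [
    {"id": 1, "value": " 3.5 "},
    {"id": 2, "value": None},   # dropped: missing required field
    {"id": 3, "value": "7"},
]
print(cleanse(rows))  # [{'id': 1, 'value': 3.5}, {'id': 3, 'value': 7.0}]
```

In a real pipeline these rules would typically live in pandas or a Glue/Spark job, but the shape of the work (drop, normalize, coerce) is the same.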
SKILLS:
- Programming and Software Engineering: Knowledge of software engineering best practices (version control, testing, CI/CD).
- Data Engineering: Ability to handle data pipelines, data cleaning, and feature engineering. Proficiency in SQL for data manipulation; Kafka and ChaosSearch logs for troubleshooting. Other technology touchpoints include ScyllaDB (BigTable-like), OpenSearch, and the Neo4j graph database.
- Model Deployment and Monitoring: MLOps experience deploying ML models to production environments.
- Knowledge of model monitoring and performance evaluation.
REQUIRED EXPERIENCE:
- Amazon SageMaker: Deep understanding of SageMaker's capabilities for building, training, and deploying ML models; understanding of the SageMaker pipeline, with the ability to analyze gaps and recommend/implement improvements
- AWS Cloud Infrastructure: Familiarity with S3, EC2, and Lambda, and with using these services in ML workflows
- AWS data: Redshift, Glue
- Containerization and Orchestration: Understanding of Docker and Kubernetes, and their implementation within AWS (EKS, ECS)
Strong Brand Marketing Manager profiles, with experience at well-known B2C brands
Mandatory (Experience 1): Must have 6+ years of total experience, with the most recent 3+ years in end-to-end brand marketing (including brand building, positioning, and brand strategy)
Mandatory (Experience 2): Must have strong hands-on brand marketing experience at top, well-known B2C or fintech brands
Mandatory (Core Skills): Must possess creative thinking, strong project management skills, and an execution focus; should have managed brand campaigns spanning creative development, production, social content, media planning, and buying.
Mandatory (Education): Candidates from top colleges such as the IIMs, ISB, MICA, VGSOM-IIT Kharagpur, the IITs, etc. (Note: the college requirement can be waived for candidates from unicorn companies like BharatPe, Unilever, Reliance, Groww, etc.)
Mandatory (Company): Well-known B2C brands in FMCG, fintech, BFSI, consumer tech, or lifestyle.
Job Description:
We are looking for a Senior Java Developer with strong expertise in Apache Kafka and backend systems. The ideal candidate will have hands-on experience in Java (8/11+), Spring Boot, and building scalable, real-time data pipelines using Kafka.
Key Responsibilities:
- Develop and maintain backend services using Java and Spring Boot
- Design and implement Kafka-based messaging and streaming solutions
- Optimize Kafka performance (topics, partitions, consumers)
- Collaborate with cross-functional teams to deliver scalable microservices
- Ensure code quality and maintain best practices in a distributed environment
Required Skills:
- 6+ years in Java development
- 3+ years of hands-on Kafka experience (producers, consumers, streams)
- Strong knowledge of Spring Boot, REST APIs, and microservices
- Familiarity with Kafka Connect, Schema Registry, and stream processing
- Experience with containerization (Docker), CI/CD, and cloud platforms (AWS/GCP/Azure)
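One concrete piece of the Kafka tuning mentioned above (topics, partitions, consumers) is keyed partitioning: messages with the same key must land on the same partition so per-key ordering is preserved. A minimal sketch of that idea follows; real Kafka clients hash the key bytes with murmur2, while CRC32 is used here only to keep the example dependency-free:

```python
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Pick a partition for a keyed message.

    Kafka's default partitioner uses murmur2 on the key; CRC32 stands in
    here so the sketch needs no third-party library. Same key -> same
    partition, which is what preserves per-key ordering for consumers.
    """
    return zlib.crc32(key) % num_partitions

# All events for one order id map to the same partition:
p1 = partition_for(b"order-42", 6)
p2 = partition_for(b"order-42", 6)
assert p1 == p2
```

Partition count is therefore a capacity *and* an ordering decision: adding partitions later remaps keys, which is why topic sizing is called out as a performance concern.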
What you'll do:
· Perform complex application programming activities with an emphasis on mobile development: Node.js, TypeScript, JavaScript, RESTful APIs and related backend frameworks
· Assist in the definition of system architecture and detailed solution design that are scalable and extensible
· Collaborate with Product Owners, Designers, and other engineers on different permutations to find the best solution possible
· Own the quality of the code and do your own testing. Write unit tests and improve test coverage.
· Deliver amazing solutions to production that knock everyone’s socks off
· Mentor junior developers on the team
What we’re looking for:
· Amazing technical instincts. You know how to evaluate and choose the right technology and approach for the job. You have stories you could share about what problem you thought you were solving at first, but through testing and iteration, came to solve a much bigger and better problem that resulted in positive outcomes all-around.
· A love for learning. Technology is continually evolving around us, and you want to keep up to date to ensure we are using the right tech at the right time.
· A love for working in ambiguity—and making sense of it. You can take in a lot of disparate information and find common themes, recommend clear paths forward and iterate along the way. You don’t form an opinion and sell it as if it’s gospel; this is all about being flexible, agile, dependable, and responsive in the face of many moving parts.
· Confidence, not ego. You have an ability to collaborate with others and see all sides of the coin to come to the best solution for everyone.
· Flexible and willing to accept change in priorities, as necessary
· Demonstrable passion for technology (e.g., personal projects, open-source involvement)
· Enthusiastic embrace of DevOps culture and collaborative software engineering
· Ability and desire to work in a dynamic, fast paced, and agile team environment
· Enthusiasm for cloud computing platforms such as AWS or Azure
Basic Qualifications:
· A minimum of a B.S. / M.S. in Computer Science or a related discipline from an accredited college or university
· At least 4 years of experience designing, developing, and delivering backend applications with Node.js, TypeScript
· At least 2 years of experience building internet facing services
· At least 2 years of experience with AWS and/or OpenShift
· Exposure to some of the following concepts: object-oriented programming, software engineering techniques, quality engineering, parallel programming, databases, etc.
· Experience integrating APIs with front-end and/or mobile-specific frameworks
· Proficiency in building and consuming RESTful APIs
· Ability to manage multiple tasks and consistently meet established timelines
· Strong collaboration skills
· Excellent written and verbal communications skills
Preferred Qualifications:
· Experience with the Apache Cordova framework
· Demonstrable knowledge of native iOS and Android development
· Experience developing and deploying applications within Kubernetes-based containers
· Experience with Agile and Scrum development techniques
Qualifications / Requirements:
- 4-12 years of core development experience in the VisionPLUS product in the credit card processing domain
- Exposure to VMx / WNGSFM is required
- Strong hands-on working knowledge of the CMS/FAS modules
- Strong knowledge of CICS
- Strong knowledge of the VisionPLUS online architecture, with troubleshooting experience in the online area
- Experience with z/OS software change management tools (Endevor and ChangeMan)
- Experience in the design and development of solutions to medium-complexity problems
- Good knowledge and experience of DevOps and Agile disciplines
- Strong interpersonal and communication skills
Job description
Ruby on Rails Developer Responsibilities :
- Designing and developing new web applications.
- Maintaining and troubleshooting existing web applications.
- Writing and maintaining reliable Ruby code.
- Integrating data storage solutions.
- Creating back-end components.
- Identifying and fixing bottlenecks and bugs.
- Integrating user-facing elements designed by the front-end team.
- Connecting applications with additional web servers.
- Maintaining APIs.
Ruby on Rails Developer Requirements :
- Bachelor's degree in Computer Science, Computer Engineering, or related field.
- Experience working with Ruby on Rails as well as libraries like Resque and RSpec.
- Ability to write clean Ruby code.
- Proficiency with code versioning tools, including Git, GitHub, SVN, and Mercurial.

Hiring for a Leading InsureTech Platform in Mumbai/Pune
- Strong experience with one or more general-purpose programming languages, including but not limited to: Python, Java, C/C++, C#
- Demonstrated expertise working with at least one modern enterprise application framework, such as Spring Boot, Play Framework, or Django
- Demonstrated expertise in building scalable distributed applications in microservices architecture
- Expert knowledge of best practice software engineering methodologies and coding standards
- Strong and proven advocacy for Test Driven Development is preferred
- Experience with SQL (MySQL, Postgres, etc.) and NoSQL (MongoDB, DynamoDB, Aerospike, or Redis) databases
- Production experience in running cloud-based enterprise-grade systems at scale
- Natural ability to process requirements, figure out multiple execution options and their complexity, and estimate the scope of work required to get tasks done
- DevOps experience
- Cloud experience (AWS required; Google Cloud Platform a bonus)
What You'll Do
You will be part of our data platform and data engineering team. As part of this agile team, you will work in our cloud-native environment and perform the following activities to support core product development and client-specific projects:
- You will develop the core engineering frameworks for an advanced self-service data analytics product.
- You will work with multiple types of data storage technologies such as relational, blobs, key-value stores, document databases and streaming data sources.
- You will work with the latest technologies for data federation with MPP (Massively Parallel Processing) capabilities.
- Your work will entail backend architecture to enable data modeling, data queries and API development for both back-end and front-end data interfaces.
- You will support client-specific data processing needs using SQL and Python/PySpark.
- You will integrate our product with other data products through Django APIs
- You will partner with other team members in understanding the functional / non-functional business requirements, and translate them into software development tasks
- You will follow software development best practices, ensuring that the architecture and quality of the code you write meet the high standard expected of enterprise software
- You will be a proactive contributor to team and project discussions
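The data-query and API-backend work above can be sketched with Python's built-in sqlite3 module; the table and aggregation here are purely illustrative stand-ins for the product's relational store:

```python
import sqlite3

# In-memory database standing in for the product's relational store;
# the schema and query are hypothetical, not from the job description.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (client TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("acme", 10.0), ("acme", 5.0), ("globex", 7.5)],
)

# The kind of per-client aggregation an analytics API endpoint might serve
rows = conn.execute(
    "SELECT client, SUM(amount) FROM events GROUP BY client ORDER BY client"
).fetchall()
print(rows)  # [('acme', 15.0), ('globex', 7.5)]
```

In production this query would run against Postgres or an MPP engine via Django's ORM or raw SQL, but the request-to-aggregate shape is the same.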
Who you are
- Strong education track record - a Bachelor's or an advanced degree in Computer Science or a related engineering discipline from an Indian Institute of Technology or an equivalent premier institute.
- 2-3 years of experience in data queries, data processing and data modeling
- Excellent ANSI SQL skills to handle complex queries
- Excellent Python and Django programming skills.
- Strong knowledge and experience in modern and distributed data stack components such as Spark, Hive, Airflow, Kubernetes, Docker, etc.
- Experience with cloud environments (AWS, Azure) and native cloud technologies for data storage and data processing
- Experience with relational SQL and NoSQL databases, including Postgres, blob stores, MongoDB, etc.
- Familiarity with ML models is highly preferred
- Experience with Big Data processing and performance optimization
- Should know how to write modular, optimized and documented code.
- Should have good knowledge around error handling.
- Experience with version control systems such as Git
- Strong problem solving and communication skills.
- Self-starter, continuous learner.
Good to have some exposure to
- Start-up experience is highly preferred
- Exposure to any Business Intelligence (BI) tools like Tableau, Dundas, Power BI etc.
- Agile software development methodologies.
- Working in multi-functional, multi-location teams
What You'll Love About Us – Do ask us about these!
- Be an integral part of the founding team. You will work directly with the founder
- Work Life Balance. You can't do a good job if your job is all you do!
- Prepare for the Future. Academy – we are all learners; we are all teachers!
- Diversity & Inclusion. HeForShe!
- Internal Mobility. Grow with us!
- Business knowledge of multiple sectors
• Excellent programming skills in C and C++11
• Strong working experience in developing applications on Linux
• Experience working with multithreading, IPC mechanisms, and queuing is a must
• Effective requirement analysis and effort estimation skills.
• In-depth knowledge of Object-Oriented Programming.
• In-depth understanding of docker-containers
• Understanding of container orchestration tools (e.g. Kubernetes, docker swarm)
• Experience using MQTT, SSL/TLS, Boost, etc.
• Basic hands-on experience with Python
• Unit Testing and Test-Driven Development
• Experience interfacing with or implementing any protocol (BACnet, Modbus, etc.) will be an added advantage
• Knowledge of developing applications for data collection and real-time monitoring systems will be an added advantage
• Experienced with all phases of projects in the development, testing, deployment, and management of enterprise solutions
• Aware of Agile methodologies, Scrum, and CI/CD methods
• Coordinate application implementations and follow up on client problems
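The multithreading-and-queuing requirement above boils down to a producer/consumer handoff through a thread-safe queue. The role calls for C/C++ (e.g. std::thread plus a mutex-guarded queue); Python's stdlib is used here only to keep the sketch short and runnable:

```python
import queue
import threading

# One producer hands work items to one consumer via a thread-safe queue;
# a None sentinel signals end-of-stream. All names here are illustrative.
q = queue.Queue()
results = []

def producer():
    for i in range(5):
        q.put(i)
    q.put(None)  # sentinel: no more work

def consumer():
    while (item := q.get()) is not None:
        results.append(item * 2)  # stand-in for real processing

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(results)  # [0, 2, 4, 6, 8]
```

The same pattern underlies MQTT message handling and data-collection pipelines: producers enqueue readings, consumers drain and process them without the two sides blocking each other.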


