11+ Data cleansing Jobs in Chennai | Data cleansing Job openings in Chennai
Apply to 11+ Data cleansing Jobs in Chennai on CutShort.io. Explore the latest Data cleansing Job opportunities across top companies like Google, Amazon & Adobe.
- Partner with internal business owners (product, marketing, editorial, etc.) to understand their needs and develop custom analyses that optimize user engagement and retention
- Build a good understanding of the underlying business and the workings of cross-functional teams to execute successfully
- Design and develop analyses based on business requirements and challenges
- Apply statistical analysis to consumer research and data mining projects, including segmentation, clustering, factor analysis, multivariate regression, and predictive modeling
- Provide statistical analysis on custom research projects and consult on A/B testing and other statistical analyses as needed; deliver other reports and custom analyses as required
- Identify and use appropriate investigative and analytical technologies to interpret and verify results
- Apply and learn a wide variety of tools and languages to achieve results
- Use best practices to develop statistical and/or machine learning models that address business needs
Requirements
- 2 - 4 years of relevant experience in Data science.
- Preferred education: Bachelor's degree in a technical field or equivalent experience.
- Experience in advanced analytics, model building, statistical modeling, optimization, and machine learning algorithms.
- Machine Learning Algorithms: clear understanding, plus hands-on coding, implementation, error-analysis, and model-tuning experience with Linear Regression, Logistic Regression, SVMs, shallow Neural Networks, clustering, Decision Trees, Random Forests, XGBoost, Recommender Systems, ARIMA, and Anomaly Detection; feature selection, hyperparameter tuning, model selection, error analysis, and boosting/ensemble methods.
- Strong programming skills in Python and data processing with SQL (or equivalent), with the ability to experiment with newer open-source tools.
- Experience normalizing data so it is homogeneous and consistently formatted, enabling sorting, querying, and analysis.
- Experience designing, developing, implementing, and maintaining databases and programs to manage data-analysis efforts.
- Experience with big data and cloud computing, e.g. Spark and Hadoop (MapReduce, Pig, Hive).
- Experience in risk and credit score domains preferred.
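The modeling workflow the requirements above describe (train/evaluate, hyperparameter tuning, model selection via cross-validation) can be sketched without any ML library at all. The snippet below is a minimal illustration, not a production pipeline: the dataset is synthetic, and the "model" is a single decision threshold so the grid search and k-fold evaluation stay visible.

```python
import random
import statistics

random.seed(0)

# Synthetic 1-D dataset: class 1 tends to have larger feature values.
data = [(random.gauss(0.0, 1.0), 0) for _ in range(100)] + \
       [(random.gauss(2.0, 1.0), 1) for _ in range(100)]
random.shuffle(data)

def accuracy(threshold, samples):
    """Fraction of samples correctly classified by a simple threshold rule."""
    return sum((x > threshold) == bool(y) for x, y in samples) / len(samples)

def cross_validate(threshold, samples, k=5):
    """Mean accuracy across k folds -- the basis for model selection."""
    fold = len(samples) // k
    scores = []
    for i in range(k):
        held_out = samples[i * fold:(i + 1) * fold]
        scores.append(accuracy(threshold, held_out))
    return statistics.mean(scores)

# "Hyperparameter" grid search: pick the decision threshold
# with the best cross-validated score.
grid = [t / 10 for t in range(-10, 31)]
best = max(grid, key=lambda t: cross_validate(t, data))
print(round(cross_validate(best, data), 2))
```

In a real project the threshold rule would be replaced by a fitted model (e.g. logistic regression or XGBoost) and the grid by that model's hyperparameters, but the selection loop keeps the same shape.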
We are looking for a MuleSoft RPA & IDP Developer with at least 5 years of experience in developing and implementing MuleSoft-based automation and integration solutions. The ideal candidate should be proficient in MuleSoft RPA (Robotic Process Automation) and IDP (Intelligent Document Processing) and have hands-on experience with Mule ESB, Anypoint Platform, API Management, and integration workflows.
Key Responsibilities:
- Develop and implement automation solutions using MuleSoft RPA & IDP.
- Design and build integration flows using Mule ESB, Anypoint Platform, and API Management.
- Create and configure MuleSoft Connectors for seamless data exchange between systems.
- Optimize business processes by integrating automation solutions into enterprise applications.
- Troubleshoot and debug integration issues to ensure seamless data flow and automation.
- Collaborate with stakeholders to gather requirements and implement best practices for integration.
- Ensure compliance with security standards and industry best practices.
Required Skills & Experience:
- 5+ years of experience in MuleSoft development, including Mule ESB, Anypoint Platform, and API Management.
- Hands-on experience with MuleSoft RPA & IDP for automating business processes.
- Strong experience in designing RESTful APIs, SOAP services, and DataWeave transformations.
- Expertise in integrating cloud-based and on-premise applications using MuleSoft Connectors.
- Knowledge of error handling, logging, security, and performance optimization in MuleSoft.
- Experience in CI/CD deployment processes for MuleSoft applications.
- Strong problem-solving skills and ability to work independently.
Preferred Qualifications:
- MuleSoft certifications (MuleSoft Certified Developer / Architect)
- Experience with Java, Spring Boot, and other integration tools is a plus.
- Prior experience in banking, healthcare, or financial domains is a bonus.
Application Process:
- Immediate joiners preferred. Candidates currently serving their notice period and available to join before month-end will be given priority.
Job Description :
Responsibilities:
• Develop and maintain automated test scripts using Cypress for web applications.
• Conduct API testing using Postman to ensure the functionality, reliability, and performance of APIs.
• Collaborate with developers and other stakeholders to identify and resolve defects and issues.
• Create and execute test plans and test cases for both frontend and backend systems.
• Design and implement automation frameworks to enhance test coverage and efficiency.
• Perform load and stress testing to identify performance bottlenecks.
• Analyze and interpret test results, providing detailed reports and recommendations.
• Stay up-to-date with industry trends and best practices in automation testing.
Requirements:
• Proven experience in automation testing with Cypress.
• Strong knowledge of API testing tools such as Postman.
• Proficiency in scripting languages like JavaScript or Python.
• Familiarity with software QA methodologies, tools, and processes.
• Experience with continuous integration tools like Jenkins.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills
Must Have:
Automation, Cypress, API testing, strong Java or .NET coding, SQL
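The API checks described above are typically scripted in Postman's test tab or a Cypress spec; the same kind of assertions can be sketched in plain Python. The snippet below is illustrative only: the endpoint's JSON schema is hypothetical, and the response is canned so the example needs no network access.

```python
import json

# Hypothetical API response, canned so the example is self-contained.
# In practice this body would come from a GET against the service under test.
raw = '{"status": "ok", "data": {"id": 42, "name": "widget"}, "errors": []}'

def check_response(body: str) -> list:
    """Run the kind of assertions a Postman test tab would: parseability,
    status field, required keys, and types. Returns failure messages."""
    failures = []
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return ["response is not valid JSON"]
    if payload.get("status") != "ok":
        failures.append("status is not 'ok'")
    data = payload.get("data")
    if not isinstance(data, dict):
        failures.append("missing 'data' object")
    elif not isinstance(data.get("id"), int):
        failures.append("'data.id' is not an integer")
    if payload.get("errors"):
        failures.append("unexpected errors reported")
    return failures

print(check_response(raw))  # → []
```

An empty failure list means the response passed every check; in a CI pipeline each failure message would be surfaced as a distinct test result.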
Job Description:
We are seeking a Senior AI Developer to join our dynamic team and lead AI-driven projects. This role requires a blend of advanced technical expertise in AI and machine learning, exceptional problem-solving abilities, and experience in building and deploying scalable AI solutions.
Key Skills & Responsibilities:
- Machine Learning & AI: Expertise in ML algorithms, especially for classification, anomaly detection, and predictive analytics.
- Cloud Platforms: Proficiency in Google Cloud Platform (GCP) services, including Cloud Run, BigQuery, and Vertex AI.
- Programming Languages: Advanced knowledge of Python; experience with libraries like TensorFlow, PyTorch, Scikit-learn, and Pandas.
- Data Processing: Skilled in SQL, data transformations, and ETL processes, especially with BigQuery.
- Anomaly Detection: Proven experience in building and deploying anomaly detection models, particularly for network data.
- Network Data Analysis: Familiarity with analyzing network logs, IP traffic, and related data structures.
- DevOps & CI/CD: Experience with CI/CD pipelines for ML model deployment, particularly with tools like Cloud Build.
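Before reaching for a Vertex AI model, anomaly detection on network counters often starts with a statistical baseline. The sketch below is a simple z-score detector over synthetic per-minute request counts; the values and the 2.5σ threshold are illustrative assumptions, not the production pipeline the role describes.

```python
import statistics

# Synthetic per-minute request counts from a network log; the spike at the
# end stands in for anomalous traffic. Values are illustrative only.
counts = [120, 115, 130, 125, 118, 122, 128, 119, 124, 121, 480]

def zscore_anomalies(values, threshold=2.5):
    """Flag points more than `threshold` standard deviations from the mean --
    a classical baseline to beat before deploying an ML model."""
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

print(zscore_anomalies(counts))  # → [10]
```

In production the same idea generalizes to per-feature scores over IP traffic aggregates, with the threshold tuned against labeled incidents.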
Are you interested to work at a place where you are given the opportunity to grow and level up your skills?
Are you looking to build products that are impactful?
Do you want to work with an energetic team, like-minded and passionate programmers?
Do you like to create impactful and scalable products?
What we Look for
- Bring in a "can-do" attitude
- Passion towards technology and software engineering - open source, pet projects, conference talks, getting better at writing clean code
- Participate in a highly fluid environment applying agile software development principles
- Carry out unit tests and other quality control mechanisms to inform and validate the designs and code
- Work with product owner/BAs to bring an end to end perspective of the problem
- Ability to ask the right questions and communicate your ideas with clarity
- Ability to collaborate with various stakeholders and take complete ownership
- Participate actively in building one of the most impactful organizations in retail technology
As a Software Engineer, you will:
- Quickly adapt to our startup environment which is both demanding and fast-paced (but is also a lot of fun)
- Eager to learn new concepts and technologies and to become productive in a short time
- Convert a raw idea into usable real-life software products
- Write modular, readable and maintainable code
- Good understanding and working knowledge of the technologies you work with
- Demonstrate good problem-solving skills
- Own and manage small to medium-sized modules
- Demonstrate ability to guide and coach a new team member or fresher
About us:
HappyFox is a software-as-a-service (SaaS) support platform. We offer an enterprise-grade help desk ticketing system and intuitively designed live chat software.
We serve over 12,000 companies in 70+ countries. HappyFox is used by companies that span across education, media, e-commerce, retail, information technology, manufacturing, non-profit, government and many other verticals that have an internal or external support function.
To know more, visit https://www.happyfox.com/
Responsibilities
- Build and scale production infrastructure in AWS for the HappyFox platform and its products.
- Research and build/implement systems, services and tooling to improve the uptime, reliability and maintainability of our backend infrastructure, and to meet our internal SLOs and customer-facing SLAs.
- Implement consistent observability, deployment and IaC setups
- Lead incident management and actively respond to escalations/incidents in the production environment from customers and the support team.
- Hire/Mentor other Infrastructure engineers and review their work to continuously ship improvements to production infrastructure and its tooling.
- Build and manage development infrastructure, and CI/CD pipelines for our teams to ship & test code faster.
- Lead infrastructure security audits
Requirements
- At least 7 years of experience in handling/building Production environments in AWS.
- At least 3 years of programming experience in building API/backend services for customer-facing applications in production.
- Proficient in managing/patching servers with Unix-based operating systems like Ubuntu Linux.
- Proficient in writing automation scripts or building infrastructure tools using Python/Ruby/Bash/Golang
- Experience in deploying and managing production Python/NodeJS/Golang applications to AWS EC2, ECS or EKS.
- Experience in security hardening of infrastructure, systems and services.
- Proficient in containerised environments such as Docker, Docker Compose, Kubernetes
- Experience in setting up and managing test/staging environments, and CI/CD pipelines.
- Experience in IaC tools such as Terraform or AWS CDK
- Exposure/Experience in setting up or managing Cloudflare, Qualys and other related tools
- Passion for making systems reliable, maintainable, scalable and secure.
- Excellent verbal and written communication skills to address, escalate and express technical ideas clearly
- Bonus points – Hands-on experience with Nginx, Postgres, Postfix, Redis or Mongo systems.
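Much of the "tooling to improve uptime" mentioned above comes down to handling transient failures gracefully when talking to cloud APIs. A common building block is retry with exponential backoff and jitter; the sketch below is a generic illustration (the operation and parameters are hypothetical), with the sleep function injectable so it can be exercised without waiting.

```python
import random
import time

def retry_with_backoff(operation, max_attempts=5, base_delay=0.5, sleep=time.sleep):
    """Retry a flaky operation with exponential backoff plus jitter -- a
    standard reliability pattern for cloud API calls. `sleep` is injectable
    so tests don't actually wait."""
    for attempt in range(max_attempts):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the last error
            delay = base_delay * (2 ** attempt) * random.uniform(0.5, 1.5)
            sleep(delay)

# Usage: simulate an endpoint that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(retry_with_backoff(flaky, sleep=lambda d: None))  # → ok
```

The jitter spreads retries out so a fleet of clients doesn't hammer a recovering service in lockstep; production tooling would also restrict which exception types are retried.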
Job Profile: Oracle Solution Architect
Experience: 6+ years
Location: Bangalore/ Chennai/ Pune/ Noida
Salary: 30 LPA
Qualification: Any
Key Skills: Oracle ERP cloud solutions, rollout or implementation, Oracle EBS
Roles & Responsibilities-
- Solution architect experience in the delivery of Oracle ERP Cloud solutions: leading the design and implementation of Oracle ERP Cloud solutions across a range of client industries, leading customer engagements, and advising on Oracle Fusion-related topics.
- Strong hands-on experience with Oracle Cloud ERP solution architectures, design, rollout and implementation leadership.
- Minimum 8 years of experience in Oracle EBS and Oracle ERP Cloud, with the majority of recent experience in Oracle Fusion (Cloud) applications.
- Minimum 4-6 years of experience in Oracle Cloud solution implementation, including recent experience in a lead role in at least 2 full ERP Cloud implementations.
Full Time / Part Time: Full Time
Remote / On-site: WFO
Responsibilities
- Collaborate with multiple stakeholders to understand the business context
- Take responsibility for developing product features
- Implement development best practices
Requirements
- Rich experience in JavaScript and front-end frameworks like React, Angular or Vue
- Experience building backend APIs using Java, Node, Python or Golang
- Proficiency in leveraging Cloud Native components in AWS, Azure or GCP
- Experience in building scalable applications using Microservices principles is a plus
- Experience in designing for performance is a big plus
- Ability to write high quality code
- Experience in polyglot persistence using relational (MySQL, Postgres) and NoSQL (MongoDB, Cassandra, DynamoDB, Redis, etc.) databases
- Familiarity with DevOps tools and technologies is a plus
- Passion for continuous learning of new technologies
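The polyglot-persistence requirement above usually means pairing a durable relational store with a fast key-value cache. The sketch below illustrates the cache-aside pattern under stand-in assumptions: sqlite3 plays the role of MySQL/Postgres, and a plain dict plays the role of Redis.

```python
import sqlite3

# Durable writes go to a relational store (sqlite3 standing in for
# MySQL/Postgres); hot reads are served from a key-value cache
# (a dict standing in for Redis).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
cache = {}

def save_user(user_id, name):
    """Write-through: persist relationally, then populate the cache."""
    db.execute("INSERT OR REPLACE INTO users VALUES (?, ?)", (user_id, name))
    db.commit()
    cache[user_id] = name

def get_user(user_id):
    """Cache-aside read: try the fast store first, fall back to SQL."""
    if user_id in cache:
        return cache[user_id]
    row = db.execute("SELECT name FROM users WHERE id = ?",
                     (user_id,)).fetchone()
    if row:
        cache[user_id] = row[0]  # repopulate the cache on a miss
    return row[0] if row else None

save_user(1, "asha")
cache.clear()        # simulate a cache eviction
print(get_user(1))   # → asha  (served from SQL, then re-cached)
```

The trade-off to design for is staleness: a real system would add TTLs or invalidation on write so the cache and the relational source of truth do not drift apart.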
Responsibilities:
- Designing and implementing Java-based applications.
- Analyzing user requirements to inform application design.
- Defining application objectives and functionality.
- Aligning application design with business goals.
- Developing and testing software.
- Debugging and resolving technical problems that arise.
- Producing detailed design documentation.
- Recommending changes to existing Java infrastructure.
- Developing multimedia applications.
- Developing documentation to assist users.
- Ensuring continuous professional self-development.
Requirements:
- Good knowledge of Java integration, Spring, and OAuth/REST integration