
Responsibilities:
- Write and maintain production-level Python code for deploying machine learning models
- Create and maintain deployment pipelines using CI/CD tools (preferably GitLab CI)
- Implement alerts and monitoring for prediction accuracy and data drift detection
- Implement automated pipelines for training and replacing models
- Work closely with the data science team to deploy new models to production
Required Qualifications:
- Degree in Computer Science, Data Science, IT, or a related discipline
- 2+ years of experience in software engineering or data engineering
- Programming experience in Python
- Experience in data profiling, ETL development, testing, and implementation
- Experience in deploying machine learning models
Good to have:
- Experience with AWS resources for ML and data engineering (SageMaker, Glue, Athena, Redshift, S3)
- Experience in deploying TensorFlow models
- Experience in deploying and managing MLflow
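As a concrete illustration of the drift-detection responsibility above, here is a minimal, framework-free sketch of a Population Stability Index (PSI) check, a common way to flag feature or score drift between a training baseline and live data. The binning scheme and the 0.25 alert threshold are illustrative assumptions, not part of the posting.

```python
import math

def psi(expected, actual, bins=10):
    """Population Stability Index between a baseline sample and a new sample.
    By convention, PSI < 0.1 is often read as 'no significant drift' and
    PSI > 0.25 as a strong drift signal (thresholds are a common heuristic)."""
    lo, hi = min(expected), max(expected)
    edges = [lo + (hi - lo) * i / bins for i in range(1, bins)]

    def fractions(sample):
        counts = [0] * bins
        for x in sample:
            counts[sum(x > e for e in edges)] += 1  # bin index for x
        n = len(sample)
        return [max(c / n, 1e-6) for c in counts]   # floor avoids log(0)

    e, a = fractions(expected), fractions(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

baseline = [i / 100 for i in range(100)]                   # uniform scores
same = [i / 100 for i in range(100)]                       # identical sample
shifted = [min(i / 100 + 0.4, 0.999) for i in range(100)]  # drifted sample

print(psi(baseline, same))            # 0.0: identical distributions
print(psi(baseline, shifted) > 0.25)  # True: large PSI signals drift
```

In a deployment pipeline, a check like this would typically run as a scheduled job and raise an alert (e.g., via the CI/CD or monitoring stack) when the index crosses the chosen threshold.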

About Carsome
Key Responsibilities
Identify and bid on relevant projects on platforms like Upwork, Freelancer, Fiverr, Guru, and PeoplePerHour.
Write compelling business proposals, cover letters, and client responses to secure projects.
Engage with potential clients, understand their requirements, and provide tailored solutions.
Negotiate terms, finalize contracts, and ensure smooth project onboarding.
Maintain strong follow-ups with leads and nurture relationships for long-term business growth.
Collaborate with the technical team to ensure the successful execution of projects.
Stay updated with market trends and competitors to strategize effectively.
Required Skills & Qualifications
✅ 2+ years of experience in bidding and proposal writing.
✅ Proven track record of successfully acquiring projects from Upwork, Freelancer, and similar platforms.
✅ Excellent written and verbal communication skills.
✅ Strong understanding of IT services, web development, and digital solutions.
✅ Ability to negotiate deals and handle client queries professionally.
✅ Strong analytical and problem-solving skills to create effective proposals.
Why Join Us?
🚀 Exciting Growth Opportunities – Work with international clients and high-value projects.
💰 Attractive Incentives – Performance-based bonuses and rewards.
🤝 Collaborative Team Culture – Work with experienced professionals in a dynamic environment.
🏡 Skill Enhancement – Continuous learning and development opportunities.
Job Title: Developer
Work Location: Pune, MH
Skills Required: Azure Data Factory
Experience Range in Required Skills: 6-8 Years
Job Description: Azure, ADF, Databricks, Python
Essential Skills: Azure, ADF, Databricks, Python
Desirable Skills: Azure, ADF, Databricks, Python
Job Description:
We are currently seeking a talented and experienced SAP SF Data Migration Specialist to join our team and drive the successful migration from SAP ECC to SAP S/4HANA.
As the SAP SF Data Migration Specialist, you will play a crucial role in overseeing the design, development, and implementation of data solutions within our SAP SF environment. You will collaborate closely with cross-functional teams to ensure data integrity, accuracy, and usability to support business processes and decision-making.
About the Company:
We are a dynamic and innovative company committed to delivering exceptional solutions that empower our clients to succeed. With our headquarters in the UK and a global footprint across the US, Noida, and Pune in India, we bring a decade of expertise to every endeavour, driving real results. We take a holistic approach to project delivery, providing end-to-end services that encompass everything from initial discovery and design to implementation, change management, and ongoing support. Our goal is to help clients leverage the full potential of the Salesforce platform to achieve their business objectives.
What Makes VE3 The Best For You? We think of your family as our family, no matter the shape or size. We offer maternity leave, PF fund contributions, and a 5-day working week, along with a generous paid time off program that helps you balance your work and personal life.
Requirements
Responsibilities:
- Lead the design and implementation of data migration strategies and solutions within SAP SF environments.
- Develop and maintain data migration plans, ensuring alignment with project timelines and objectives.
- Collaborate with business stakeholders to gather and analyse data requirements, ensuring alignment with business needs and objectives.
- Design and implement data models, schemas, and architectures to support SAP data structures and functionalities.
- Lead data profiling and analysis activities to identify data quality issues, gaps, and opportunities for improvement.
- Define data transformation rules and processes to ensure data consistency, integrity, and compliance with business rules and regulations.
- Manage data cleansing, enrichment, and standardization efforts to improve data quality and usability.
- Coordinate with technical teams to implement data migration scripts, ETL processes, and data loading mechanisms.
- Develop and maintain data governance policies, standards, and procedures to ensure data integrity, security, and privacy.
- Lead data testing and validation activities to ensure accuracy and completeness of migrated data.
- Provide guidance and support to project teams, including training, mentoring, and knowledge sharing on SAP data best practices and methodologies.
- Stay current with SAP data management trends, technologies, and best practices, and recommend innovative solutions to enhance data capabilities and performance.
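The validation responsibilities above (data testing, completeness checks on migrated data) can be sketched as a small reconciliation routine. This is a generic illustration, not an SAP tool; the field names `employee_id` and `hire_date` are hypothetical examples of a key and a mandatory field.

```python
def reconcile(source_rows, target_rows, key="employee_id", required=("hire_date",)):
    """Compare source and migrated records: report missing/unexpected keys
    and records whose mandatory fields are empty after migration."""
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    report = {
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        "unexpected_in_target": sorted(tgt.keys() - src.keys()),
        "null_required": sorted(
            k for k, r in tgt.items() if any(not r.get(f) for f in required)
        ),
    }
    report["clean"] = not any(
        report[k] for k in ("missing_in_target", "unexpected_in_target", "null_required")
    )
    return report

src = [{"employee_id": 1, "hire_date": "2015-03-01"},
       {"employee_id": 2, "hire_date": "2018-07-15"}]
tgt = [{"employee_id": 1, "hire_date": "2015-03-01"},
       {"employee_id": 2, "hire_date": None}]  # mandatory field lost in migration
print(reconcile(src, tgt)["null_required"])  # [2]
```

In practice such checks run after each migration load cycle, and the report feeds the cutover sign-off documentation.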
Requirements:
- Bachelor’s degree in Computer Science, Information Systems, or a related field; Master’s degree preferred.
- 10+ years of experience in SAP and Non-SAP data management, with a focus on data migration, data modelling, and data governance.
- Demonstrable experience as an SAP Data Consultant, ideally working across SAP SuccessFactors and non-SAP systems.
- Highly knowledgeable and experienced in managing HR data migration projects in SAP SuccessFactors environments.
- Demonstrated knowledge of how data aspects need to be considered within the overall SAP solution design.
- Manage the workstream activities and plan, including stakeholder management, engagement with the business, and the production of governance documentation.
- Proven track record of leading successful SAP data migration projects from conception to completion.
- Excellent analytical, problem-solving, and communication skills, with the ability to collaborate effectively with cross-functional teams.
- Experience with SAP Activate methodologies preferred.
- SAP certifications in data management or related areas are a plus.
- Ability to work independently and thrive in a fast-paced, dynamic environment.
- Lead the data migration workstream, with a direct team of circa 5 resources in addition to other third-party and client resources.
- Work flexibly and remotely. Occasional UK travel will be required.
Benefits
- Competitive salary and comprehensive benefits package.
- Opportunity to work in a dynamic and challenging environment on critical migration projects.
- Professional growth opportunities in a supportive and forward-thinking organization.
- Engagement with cutting-edge SAP technologies and methodologies in data migration.
Urgent Opening for Mobile Automation Testing with Value Labs
Kindly share your details;-
Total Experience:
Relevant Experience:
Current CTC:
Exp CTC:
Offer:
Notice Period:
JD:
• Well versed with functional/manual testing and testing documentation
• SeeTest Automation/Appium automation tools
• Must have mobile automation experience
• Good communication skills; should be able to handle projects independently
• Should be flexible, ready to support when required, and able to work in a handshake model with onsite counterparts
• Very good communication skills to handle dev, client, and onsite communication
• ReadyAPI automation expertise
• JIRA tool experience
• Should be willing to learn new libraries and apply them in projects quickly
• Willing to learn UI automation test frameworks
• Should be willing to work the 2 PM to 11 PM shift
Regards
Ajay Kumar (kindly reach me at ajay.kumar@saivasystemdotcom)
PRIMARY RESPONSIBILITIES :-
- Should have developed professional applications in PHP using CodeIgniter.
- Additional advantage if worked on frameworks like Laravel and Zend.
- Extensive knowledge of JavaScript, jQuery, or Angular.
- Advanced-level SQL knowledge.
- Additional advantage if familiar with server configuration and monitoring.
- Should have experience working with Git repositories.
- Candidate should have exceptionally good debugging skills.
- Clear understanding of object-oriented programming concepts.
SECONDARY RESPONSIBILITIES :-
- Ability to work independently and take ownership of the project.
- Should be capable of working in a team.
- Must have strong communication skills.
- Should be enthusiastic and able to take on challenges.
Duration :- Full Time
Company Name :- Fragma Data
Main Skills are Java, Microservices, and SQL
Primary Responsibilities
Development of applications in Java including:
Building data processing platforms.
Developing micro-service-oriented applications.
Interact with stakeholders of the applications being developed.
Desired Skills
Must have experience in Java EE, the Spring Framework, and microservices
Experience in SQL and JDBC
Experience with build tools such as Maven and Git
Experience with cloud platforms (AWS, Azure) is a plus.
We are looking for an experienced engineer with superb technical skills. You will primarily be responsible for architecting and building large-scale data pipelines that deliver AI and analytical solutions to our customers. The right candidate will enthusiastically take ownership of developing and managing continuously improving, robust, and scalable software solutions. The successful candidate will be curious, creative, ambitious, self-motivated, flexible, and have a bias towards taking action. As part of the early engineering team, you will have a chance to make a measurable impact on the future of Thinkdeeply, as well as a significant amount of responsibility.
Although your primary responsibilities will be around back-end work, we prize individuals who are willing to step in and contribute to other areas, including automation, tooling, and management applications. Experience with, or a desire to learn, machine learning is a plus.
Experience
12+ Years
Location
Hyderabad
Skills
Bachelor's/Master's/PhD in CS or equivalent industry experience
10+ years of industry experience in Java-related frameworks such as Spring and/or Typesafe
Experience with scripting languages; Python experience highly desirable (5+ years of industry experience in Python)
Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
Demonstrated expertise in building and shipping cloud-native applications
Experience in administering (including setting up, managing, and monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka, the ELK Stack, and Fluentd
Experience in API development using Swagger
Strong expertise with containerization technologies, including Kubernetes and Docker Compose
Experience with cloud platform services such as AWS, Azure or GCP.
Implementing automated testing platforms and unit tests
Proficient understanding of code versioning tools, such as Git
Familiarity with continuous integration, Jenkins
Responsibilities
Architect, design, and implement large-scale data processing pipelines
Design and implement APIs
Assist in DevOps operations
Identify performance bottlenecks and bugs, and devise solutions to these problems
Help maintain code quality, organization, and documentation
Communicate with stakeholders regarding various aspects of the solution
Mentor team members on best practices
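The pipeline-architecture responsibility above can be illustrated with a minimal sketch of a composable batch pipeline: independent stages chained as generators, with a terminal aggregation. This is an illustrative toy; a production pipeline would sit on frameworks such as Kafka or a Spark cluster, as the skills list suggests.

```python
# Each stage consumes an iterable and yields/returns results, so stages can be
# composed, tested, and swapped independently.
def parse(lines):
    """Split raw CSV-like lines into fields."""
    for line in lines:
        yield line.strip().split(",")

def valid(rows):
    """Filter to well-formed (key, integer value) records."""
    for row in rows:
        if len(row) == 2 and row[1].isdigit():
            yield (row[0], int(row[1]))

def total_by_key(rows):
    """Terminal stage: aggregate values per key."""
    totals = {}
    for key, value in rows:
        totals[key] = totals.get(key, 0) + value
    return totals

raw = ["a,1", "b,2", "a,3", "bad line"]   # one malformed record
print(total_by_key(valid(parse(raw))))    # {'a': 4, 'b': 2}
```

Because the stages are lazy generators, the same composition pattern works for both batch files and unbounded streams, which mirrors the streaming-and-batch requirement above.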
