Data migration Jobs in Pune


Apply to 5+ Data migration Jobs in Pune on CutShort.io. Explore the latest Data migration Job opportunities across top companies like Google, Amazon & Adobe.

Pune, Hybrid
4 - 8 yrs
₹10L - ₹25L / yr
SAP
Data migration
ETL

Job Description:


We are currently seeking a talented and experienced SAP SF (SuccessFactors) Data Migration Specialist to join our team and drive the successful migration from SAP ECC to SAP S/4.


As the SAP SF Data Migration Specialist, you will play a crucial role in overseeing the design, development, and implementation of data solutions within our SAP SF environment. You will collaborate closely with cross-functional teams to ensure data integrity, accuracy, and usability to support business processes and decision-making. 



About the Company:


We are a dynamic and innovative company committed to delivering exceptional solutions that empower our clients to succeed. With our headquarters in the UK and a global footprint across the US, Noida, and Pune in India, we bring a decade of expertise to every endeavour, driving real results. We take a holistic approach to project delivery, providing end-to-end services that encompass everything from initial discovery and design to implementation, change management, and ongoing support. Our goal is to help clients leverage the full potential of the Salesforce platform to achieve their business objectives.



What Makes VE3 The Best For You: We think of your family as our family, no matter the shape or size. We offer maternity leave, PF contributions, and a five-day working week, along with a generous paid-time-off program that helps you balance your work and personal life.


Requirements

Responsibilities:

  • Lead the design and implementation of data migration strategies and solutions within SAP SF environments.
  • Develop and maintain data migration plans, ensuring alignment with project timelines and objectives.
  • Collaborate with business stakeholders to gather and analyse data requirements, ensuring alignment with business needs and objectives.
  • Design and implement data models, schemas, and architectures to support SAP data structures and functionalities.
  • Lead data profiling and analysis activities to identify data quality issues, gaps, and opportunities for improvement.
  • Define data transformation rules and processes to ensure data consistency, integrity, and compliance with business rules and regulations.
  • Manage data cleansing, enrichment, and standardization efforts to improve data quality and usability.
  • Coordinate with technical teams to implement data migration scripts, ETL processes, and data loading mechanisms.
  • Develop and maintain data governance policies, standards, and procedures to ensure data integrity, security, and privacy.
  • Lead data testing and validation activities to ensure accuracy and completeness of migrated data (a minimal reconciliation sketch follows this list).
  • Provide guidance and support to project teams, including training, mentoring, and knowledge sharing on SAP data best practices and methodologies.
  • Stay current with SAP data management trends, technologies, and best practices, and recommend innovative solutions to enhance data capabilities and performance.
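
For illustration only: a minimal reconciliation check of the kind described above, written as a Python sketch. The file names and the EMPLOYEE_ID / HIRE_DATE columns are hypothetical placeholders, not part of any actual project setup.

    import pandas as pd

    # Hypothetical extracts: a legacy (ECC) employee file and the
    # corresponding file loaded into the target system.
    legacy = pd.read_csv("legacy_employees.csv")
    migrated = pd.read_csv("migrated_employees.csv")

    # Record counts must match between source and target.
    assert len(legacy) == len(migrated), (
        f"Row count mismatch: {len(legacy)} legacy vs {len(migrated)} migrated"
    )

    # Every legacy key must survive the migration.
    missing = set(legacy["EMPLOYEE_ID"]) - set(migrated["EMPLOYEE_ID"])
    if missing:
        print(f"{len(missing)} employee IDs not migrated, e.g. {sorted(missing)[:10]}")

    # Mandatory fields in the target must not be null.
    print(migrated[["EMPLOYEE_ID", "HIRE_DATE"]].isna().sum())

Checks like these would typically run per migration object and per mock load, and again at cutover.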

Requirements:

  • Bachelor’s degree in Computer Science, Information Systems, or a related field; master’s degree preferred.
  • 10+ years of experience in SAP and Non-SAP data management, with a focus on data migration, data modelling, and data governance.
  • Demonstrable experience as an SAP Data Consultant, ideally working across SAP SuccessFactors and non-SAP systems.
  • Highly knowledgeable and experienced in managing HR data migration projects in SAP SuccessFactors environments.
  • Demonstrated knowledge of how data aspects need to be considered within the overall SAP solution design.
  • Manage the workstream activities and plan, including stakeholder management, engagement with the business, and the production of governance documentation.
  • Proven track record of leading successful SAP data migration projects from conception to completion.
  • Excellent analytical, problem-solving, and communication skills, with the ability to collaborate effectively with cross-functional teams.
  • Experience with SAP Activate methodologies preferred.
  • SAP certifications in data management or related areas are a plus.
  • Ability to work independently and thrive in a fast-paced, dynamic environment.
  • Lead the data migration workstream, with a direct team of circa 5 resources in addition to third-party and client resources.
  • Work flexibly and remotely. Occasional UK travel will be required.


Benefits

  • Competitive salary and comprehensive benefits package.
  • Opportunity to work in a dynamic and challenging environment on critical migration projects.
  • Professional growth opportunities in a supportive and forward-thinking organization.
  • Engagement with cutting-edge SAP technologies and methodologies in data migration.
Wissen Technology
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore), Pune, Mumbai
10 - 22 yrs
Best in industry
DynamoDB
AWS
EMR
Data migration
Data modeling

Responsibilities:

 

  • Design, implement, and maintain scalable and reliable database solutions on the AWS platform.
  • Architect, deploy, and optimize DynamoDB databases for performance, scalability, and cost-efficiency (a brief design sketch follows this list).
  • Configure and manage AWS OpenSearch (formerly Amazon Elasticsearch Service) clusters for real-time search and analytics capabilities.
  • Design and implement data processing and analytics solutions using AWS EMR (Elastic MapReduce) for large-scale data processing tasks.
  • Collaborate with cross-functional teams to gather requirements, design database solutions, and implement best practices.
  • Perform performance tuning, monitoring, and troubleshooting of database systems to ensure high availability and performance.
  • Develop and maintain documentation, including architecture diagrams, configurations, and operational procedures.
  • Stay current with the latest AWS services, database technologies, and industry trends to provide recommendations for continuous improvement.
  • Participate in the evaluation and selection of new technologies, tools, and frameworks to enhance database capabilities.
  • Provide guidance and mentorship to junior team members, fostering knowledge sharing and skill development.
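
As a concrete illustration of the DynamoDB work referenced in this list, here is a minimal boto3 sketch. The table, key names, and query values are hypothetical.

    import boto3
    from boto3.dynamodb.conditions import Key

    # Hypothetical access pattern: fetch a customer's orders by date range,
    # so customer_id is the partition key and order_date the sort key.
    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.create_table(
        TableName="Orders",
        KeySchema=[
            {"AttributeName": "customer_id", "KeyType": "HASH"},
            {"AttributeName": "order_date", "KeyType": "RANGE"},
        ],
        AttributeDefinitions=[
            {"AttributeName": "customer_id", "AttributeType": "S"},
            {"AttributeName": "order_date", "AttributeType": "S"},
        ],
        BillingMode="PAY_PER_REQUEST",  # on-demand capacity: no throughput guesswork
    )
    table.wait_until_exists()

    # Key-condition queries stay on a single partition, which is what keeps
    # well-modelled DynamoDB access fast and cheap.
    resp = table.query(
        KeyConditionExpression=Key("customer_id").eq("C123")
        & Key("order_date").begins_with("2024-")
    )
    print(resp["Items"])

Modelling tables around access patterns, rather than normalising first, is the core of DynamoDB optimisation.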

 

Requirements:

 

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Proven experience as an AWS Architect or similar role, with a focus on database technologies.
  • Hands-on experience designing, implementing, and optimizing DynamoDB databases in production environments.
  • In-depth knowledge of AWS OpenSearch (Elasticsearch) and experience configuring and managing clusters for search and analytics use cases (an index-setup sketch follows this list).
  • Proficiency in working with AWS EMR (Elastic MapReduce) for big data processing and analytics.
  • Strong understanding of database concepts, data modelling, indexing, and query optimization.
  • Experience with AWS services such as S3, EC2, RDS, Redshift, Lambda, and CloudFormation.
  • Excellent problem-solving skills and the ability to troubleshoot complex database issues.
  • Solid understanding of cloud security best practices and experience implementing security controls in AWS environments.
  • Strong communication and collaboration skills with the ability to work effectively in a team environment.
  • AWS certifications such as AWS Certified Solutions Architect, AWS Certified Database - Specialty, or equivalent certifications are a plus.
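
For illustration, an OpenSearch index might be configured from Python with the opensearch-py client as below. The endpoint, index name, and mapping are placeholders, and a real AWS domain would additionally need authentication configured.

    from opensearchpy import OpenSearch

    # Hypothetical domain endpoint.
    client = OpenSearch(
        hosts=[{"host": "search-example.us-east-1.es.amazonaws.com", "port": 443}],
        use_ssl=True,
    )

    # Explicit shard/replica settings and a simple field mapping.
    client.indices.create(
        index="products",
        body={
            "settings": {"number_of_shards": 3, "number_of_replicas": 1},
            "mappings": {"properties": {
                "name": {"type": "text"},
                "price": {"type": "float"},
            }},
        },
    )

    # Full-text search against the new index.
    hits = client.search(index="products", body={"query": {"match": {"name": "laptop"}}})
    print(hits["hits"]["total"])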


Tredence
Posted by Rohit S
Chennai, Pune, Bengaluru (Bangalore), Gurugram
11 - 16 yrs
₹20L - ₹32L / yr
Data Warehouse (DWH)
Google Cloud Platform (GCP)
Amazon Web Services (AWS)
Data engineering
Data migration
+1 more
• Engages with the leadership of Tredence’s clients to identify critical business problems, define the need for data engineering solutions, and build the strategy and roadmap.
• S/he possesses wide exposure to the complete lifecycle of data, from creation to consumption.
• S/he has in the past built repeatable tools / data models to solve specific business problems.
• S/he should have hands-on experience of having worked on projects (either as a consultant or within a company) that needed them to:
  o Provide consultation to senior client personnel
  o Implement and enhance data warehouses or data lakes
  o Work with business teams, or as part of the team, that implemented process re-engineering driven by data analytics/insights
• Should have a deep appreciation of how data can be used in decision-making.
• Should have perspective on newer ways of solving business problems, e.g. external data, innovative techniques, newer technology.
• S/he must have a solution-creation mindset, with the ability to design and enhance scalable data platforms to address the business need.
• Working experience on data engineering tools for one or more cloud platforms: Snowflake, AWS/Azure/GCP.
• Engage with technology teams from Tredence and clients to create last-mile connectivity of the solutions:
  o Should have experience of working with technology teams
• Demonstrated ability in thought leadership: articles, white papers, interviews.

Mandatory skills: Program Management, Data Warehouse, Data Lake, Analytics, Cloud Platform
Tredence
Bengaluru (Bangalore), Pune, Gurugram, Chennai
8 - 12 yrs
₹12L - ₹30L / yr
Snow flake schema
Snowflake
SQL
Data modeling
Data engineering
+1 more

JOB DESCRIPTION: THE IDEAL CANDIDATE WILL:

• Ensure new features and subject areas are modelled to integrate with existing structures and provide a consistent view. Develop and maintain documentation of the data architecture, data flow and data models of the data warehouse appropriate for various audiences. Provide direction on adoption of Cloud technologies (Snowflake) and industry best practices in the field of data warehouse architecture and modelling.

• Providing technical leadership to large enterprise-scale projects. You will also be responsible for preparing estimates and defining technical solutions for proposals (RFPs). This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project.

ELIGIBILITY CRITERIA: Desired Experience/Skills:
• Must have 5+ years total in IT, including 2+ years’ experience working as a Snowflake Data Architect and 4+ years in data warehouse, ETL, and BI projects.
• Must have experience of at least two end-to-end implementations of the Snowflake cloud data warehouse, plus three end-to-end on-premise data warehouse implementations, preferably on Oracle.

• Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
• Expertise in Snowflake advanced concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and time travel, and an understanding of how to use these features (a short clone/time-travel sketch follows this section).
• Expertise in deploying Snowflake features such as data sharing, events and lake-house patterns
• Hands-on experience with Snowflake utilities, SnowSQL, SnowPipe, Big Data model techniques using Python
• Experience in Data Migration from RDBMS to Snowflake cloud data warehouse
• Deep understanding of relational as well as NoSQL data stores, methods and approaches (star and snowflake, dimensional modelling)
• Experience with data security and data access controls and design
• Experience with AWS or Azure data storage and management technologies such as S3 and ADLS
• Build processes supporting data transformation, data structures, metadata, dependency and workload management
• Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting.
• Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface
• Must have expertise in AWS or Azure Platform as a Service (PAAS)
• Certified Snowflake cloud data warehouse Architect (Desirable)
• Should be able to troubleshoot problems across infrastructure, platform and application domains.
• Must have experience of Agile development methodologies
• Strong written communication skills; effective and persuasive in both written and oral communication.
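
To make the zero-copy clone and time travel items above concrete, here is a minimal sketch using the snowflake-connector-python package. Connection parameters and table names are placeholders.

    import snowflake.connector

    # Placeholder credentials; real projects would use key-pair auth or SSO.
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="...",
        warehouse="DEV_WH", database="DWH", schema="STAGING",
    )
    cur = conn.cursor()

    # Zero-copy clone: an instant, storage-free copy, e.g. to rehearse a load.
    cur.execute("CREATE OR REPLACE TABLE ORDERS_TEST CLONE ORDERS")

    # Time travel: query the table as it looked an hour ago, e.g. to compare
    # row counts before and after a migration batch.
    cur.execute("SELECT COUNT(*) FROM ORDERS AT(OFFSET => -3600)")
    print(cur.fetchone())

    cur.close()
    conn.close()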

Nice-to-have Skills/Qualifications:
• Bachelor’s and/or master’s degree in Computer Science or equivalent experience.
• Strong communication, analytical and problem-solving skills with a high attention to detail.

 

About you:
• You are self-motivated, collaborative, eager to learn, and hands on
• You love trying out new apps, and find yourself coming up with ideas to improve them
• You stay ahead with all the latest trends and technologies
• You are particular about following industry best practices and have high standards regarding quality

MNC Pune based IT company
Agency job
via Bhs Staffing Solutions Pvt Ltd by Bhagyesh Shinde
Pune
10 - 18 yrs
₹35L - ₹40L / yr
Google Cloud Platform (GCP)
Dataflow architecture
Data migration
Data processing
Big Data
+4 more

CANDIDATE WILL BE DEPLOYED IN A FINANCIAL CAPTIVE ORGANIZATION @ PUNE (KHARADI)

 

Below are the job details:

 

Experience: 10 to 18 years

 

Mandatory skills:

  • Data migration
  • Dataflow

The ideal candidate for this role will have the below experience and qualifications:  

  • Experience of building a range of Services in a Cloud Service provider (ideally GCP)  
  • Hands-on design and development on Google Cloud Platform (GCP) across a wide range of GCP services, including hands-on experience with GCP storage and database technologies.
  • Hands-on experience in architecting, designing, or implementing solutions on GCP, K8s, and other Google technologies, including security and compliance, e.g. IAM and cloud compliance/auditing/monitoring tools.
  • Desired skills within the GCP stack: Cloud Run, GKE, Serverless, Cloud Functions, Vision API, DLP, Dataflow, Data Fusion.
  • Prior experience of migrating on-prem applications to cloud environments. Knowledge and hands-on experience of Stackdriver, Pub/Sub, VPCs, subnets, route tables, load balancers, and firewalls, both on-premise and in GCP.
  • Integrate, configure, deploy, and manage centrally provided common cloud services (e.g. IAM, networking, logging, operating systems, containers).
  • Manage SDN in GCP. Knowledge and experience of DevOps technologies around continuous integration and delivery in GCP using Jenkins.
  • Hands-on experience with Terraform, Kubernetes, Docker, and Stackdriver.
  • Programming experience in one or more of the following languages: Python, Ruby, Java, JavaScript, Go, Groovy, Scala  
  • Knowledge or experience in DevOps tooling such as Jenkins, Git, Ansible, Splunk, Jira or Confluence, AppD, Docker, Kubernetes  
  • Act as a consultant and subject matter expert for internal teams to resolve technical deployment obstacles and improve the product’s vision. Ensure compliance with centrally defined security standards.
  • Financial experience is preferred 
  • Ability to learn new technologies and rapidly prototype newer concepts 
  • Top-down thinker, excellent communicator, and great problem solver

 

Experience: 10 to 18 years

 

Location:- Pune

 

Candidate must have experience in the areas below; a minimal streaming-pipeline sketch follows the list.

  • GCP Data Platform
  • Data processing: Dataflow, Dataprep, Data Fusion
  • Data storage: BigQuery, Cloud SQL
  • Pub/Sub, GCS buckets
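
For illustration, the Dataflow / Pub/Sub / BigQuery combination above corresponds to a streaming pipeline like the following Apache Beam sketch in Python. The project, topic, and table names are placeholders; a real job would also set the project, region, and DataflowRunner in the pipeline options.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    opts = PipelineOptions(streaming=True)

    with beam.Pipeline(options=opts) as p:
        (
            p
            # Read raw bytes from a hypothetical Pub/Sub topic.
            | "Read" >> beam.io.ReadFromPubSub(topic="projects/demo/topics/events")
            # Decode each message into a simple row dict.
            | "Decode" >> beam.Map(lambda b: {"raw": b.decode("utf-8")})
            # Append rows into a hypothetical BigQuery table.
            | "Write" >> beam.io.WriteToBigQuery(
                "demo:analytics.events",
                schema="raw:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )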