DMS Jobs in Bangalore (Bengaluru)


Apply to 2+ DMS Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest DMS Job opportunities across top companies like Google, Amazon & Adobe.

GrowthArc

Posted by Reshika Mendiratta
Remote, Bengaluru (Bangalore)
7+ yrs
Up to ₹38L / yr (varies)
Amazon Web Services (AWS)
Kubernetes
Docker
Terraform
Amazon EC2
+7 more

We are seeking an AWS Migration Engineer with 7+ years of hands-on experience to lead cloud migration projects, assess legacy systems, and ensure seamless transitions to AWS infrastructure. The role focuses on strategy, execution, optimization, and minimizing downtime during migrations.


Key Responsibilities:

  • Conduct assessments of on-premises and legacy systems for AWS migration feasibility.
  • Design and execute migration strategies using AWS Migration Hub, DMS, and Server Migration Service.
  • Plan and implement lift-and-shift, re-platforming, and refactoring approaches.
  • Optimize workloads post-migration for cost, performance, and security.
  • Collaborate with stakeholders to define migration roadmaps and timelines.
  • Perform data migration, application re-architecture, and hybrid cloud setups.
  • Monitor migration progress, troubleshoot issues, and ensure business continuity.
  • Document processes and provide post-migration support and training.
  • Manage and troubleshoot Kubernetes/EKS networking components including VPC CNI, Service Mesh, Ingress controllers, and Network Policies.
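
As a rough illustration of the DMS-based migration strategy above, the sketch below assembles the parameters for a boto3 `dms.create_replication_task` call. The task name, schema, and ARNs are placeholders, not values from this posting:

```python
import json

def dms_task_params(task_id, source_arn, target_arn, instance_arn,
                    schema="public", migration_type="full-load-and-cdc"):
    """Build the parameter dict for dms_client.create_replication_task().

    'full-load-and-cdc' migrates existing data, then replicates ongoing
    changes -- the usual choice for minimizing cutover downtime.
    """
    table_mappings = {
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-schema",
            "object-locator": {"schema-name": schema, "table-name": "%"},
            "rule-action": "include",
        }]
    }
    return {
        "ReplicationTaskIdentifier": task_id,
        "SourceEndpointArn": source_arn,
        "TargetEndpointArn": target_arn,
        "ReplicationInstanceArn": instance_arn,
        "MigrationType": migration_type,
        # DMS expects the table-mapping rules as a JSON string
        "TableMappings": json.dumps(table_mappings),
    }

# A real call would be: boto3.client("dms").create_replication_task(**params)
params = dms_task_params("legacy-db-cutover",
                         "arn:aws:dms:...:endpoint/src",
                         "arn:aws:dms:...:endpoint/tgt",
                         "arn:aws:dms:...:rep/inst")
```

The same parameter shape covers plain `full-load` (one-shot copy) and `cdc` (change replication only), which map onto the lift-and-shift versus phased-cutover approaches mentioned above.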


Required Qualifications:

  • 7+ years of IT experience, with a minimum of 4 years focused on AWS migrations.
  • AWS Certified Solutions Architect or Migration Specialty certification preferred.
  • Expertise in AWS services: EC2, S3, RDS, VPC, Direct Connect, DMS, SMS.
  • Strong knowledge of cloud migration tools and frameworks (AWS MGN, Snowball).
  • Experience with infrastructure as code (CloudFormation, Terraform).
  • Proficiency in scripting (Python, PowerShell) and automation.
  • Familiarity with security best practices (IAM, encryption, compliance).
  • Hands-on experience with Kubernetes/EKS networking components and best practices.
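
On the EKS networking side, a default-deny ingress NetworkPolicy is a common baseline. The sketch below expresses one as a plain Python dict in the shape you would hand to the Kubernetes API; the namespace name is made up:

```python
def default_deny_ingress(namespace):
    """Kubernetes NetworkPolicy manifest that selects every pod in the
    namespace (empty podSelector) and lists no ingress rules, so all
    inbound traffic is denied until further policies open it up."""
    return {
        "apiVersion": "networking.k8s.io/v1",
        "kind": "NetworkPolicy",
        "metadata": {"name": "default-deny-ingress", "namespace": namespace},
        "spec": {
            "podSelector": {},           # empty selector = all pods
            "policyTypes": ["Ingress"],  # no ingress rules => deny all ingress
        },
    }

manifest = default_deny_ingress("payments")
```

Note that on EKS with the default VPC CNI, enforcing NetworkPolicies requires a policy engine (the CNI's network-policy support or an add-on such as Calico) to be enabled.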


Preferred Skills:

  • Experience with hybrid/multi-cloud environments.
  • Knowledge of DevOps tools (Jenkins, GitLab CI/CD).
  • Excellent problem-solving and communication skills.
A fast-growing Big Data company

Agency job
via Careerconnects by Kumar Narayanan
Noida, Bengaluru (Bangalore), Chennai, Hyderabad
6 - 8 yrs
₹10L - ₹15L / yr
AWS Glue
SQL
Python
PySpark
Data engineering
+6 more

AWS Glue Developer 

Work Experience: 6 to 8 Years

Work Location:  Noida, Bangalore, Chennai & Hyderabad

Must-Have Skills: AWS Glue, DMS, SQL, Python, PySpark, data integration, and DataOps

Job Reference ID: BT/F21/IND


Job Description:

Design, build and configure applications to meet business process and application requirements.


Responsibilities:

  • 7 years of work experience with ETL, data modelling, and data architecture.
  • Proficient in ETL optimization, designing, coding, and tuning big data processes using PySpark.
  • Extensive experience building data platforms on AWS using core services (Step Functions, EMR, Lambda, Glue, Athena, Redshift, Postgres, RDS, etc.) and designing/developing data engineering solutions.
  • Orchestration using Airflow.
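
To picture the Glue part of this stack: registering a PySpark ETL script stored in S3 comes down to one `glue.create_job` call. The sketch below builds its argument dict with boto3's parameter names; the job name, role ARN, and bucket path are placeholders:

```python
def glue_job_definition(name, role_arn, script_s3_path, workers=5):
    """Parameters for glue_client.create_job(): a Spark ('glueetl')
    job with job bookmarks enabled so reruns only process new data."""
    return {
        "Name": name,
        "Role": role_arn,
        "Command": {
            "Name": "glueetl",               # Spark ETL job type
            "ScriptLocation": script_s3_path,
            "PythonVersion": "3",
        },
        "DefaultArguments": {
            "--job-bookmark-option": "job-bookmark-enable",
            "--enable-metrics": "true",      # emit CloudWatch job metrics
        },
        "GlueVersion": "4.0",
        "WorkerType": "G.1X",
        "NumberOfWorkers": workers,
    }

job = glue_job_definition("orders-etl",
                          "arn:aws:iam::...:role/glue-etl",
                          "s3://my-etl-bucket/scripts/orders.py")
# A real call would be: boto3.client("glue").create_job(**job)
```

An Airflow DAG can then trigger the job by name (e.g. via the Amazon provider's Glue operator), which is one common way the Glue/Airflow orchestration above fits together.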


Technical Experience:

Hands-on experience developing a data platform and its components: data lake, cloud data warehouse, APIs, and batch and streaming data pipelines. Experience building data pipelines and applications to stream and process large datasets at low latency.


➢ Enhancements, new development, defect resolution and production support of Big data ETL development using AWS native services.

➢ Create data pipeline architecture by designing and implementing data ingestion solutions.

➢ Integrate data sets using AWS services such as Glue, Lambda functions/ Airflow.

➢ Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, RDS, S3, Athena.

➢ Author ETL processes using Python, Pyspark.

➢ Build Redshift Spectrum direct transformations and data modelling using data in S3.

➢ ETL process monitoring using CloudWatch events.

➢ You will work in collaboration with other teams. Good communication is a must.

➢ Must have experience using AWS service APIs, the AWS CLI, and SDKs.
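
As a sketch of the Redshift Spectrum item above: an external table maps directly onto files in S3, so SQL transformations can read lake data without loading it into the cluster first. The helper below renders illustrative DDL; the schema, table, column, and bucket names are all made up:

```python
def spectrum_external_table(schema, table, columns, s3_prefix):
    """Render CREATE EXTERNAL TABLE DDL for Redshift Spectrum over
    Parquet files in S3. `columns` is a list of (name, type) pairs,
    and `schema` is an external schema backed by the Glue catalog."""
    cols = ",\n    ".join(f"{name} {ctype}" for name, ctype in columns)
    return (
        f"CREATE EXTERNAL TABLE {schema}.{table} (\n"
        f"    {cols}\n"
        f")\n"
        f"STORED AS PARQUET\n"
        f"LOCATION '{s3_prefix}';"
    )

ddl = spectrum_external_table(
    "spectrum", "clickstream",
    [("event_id", "varchar(64)"),
     ("ts", "timestamp"),
     ("revenue", "double precision")],
    "s3://my-data-lake/clickstream/",
)
```

Once defined, the external table can be joined against local Redshift tables in ordinary SQL, which is what "direct transformations using data in S3" amounts to in practice.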


Professional Attributes:

➢ Experience operating very large data warehouses or data lakes. Expert-level skills in writing and optimizing SQL. Extensive real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technology.

➢ Must have 6+ years of big data ETL experience using Python, S3, Lambda, DynamoDB, Athena, and Glue in an AWS environment.

➢ Expertise in S3, RDS, Redshift, Kinesis, and EC2 clusters is highly desired.


Qualification:

➢ Degree in Computer Science, Computer Engineering, or an equivalent field.


Salary: Commensurate with experience and demonstrated competence
