AWS Data Migration Consultant-Delhi
TSG Global Services Private Limited
Posted by Sony Pathak
10 - 15 yrs
₹10L - ₹15L / yr
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
Skills
ETL
Informatica
Data Warehouse (DWH)
Amazon Web Services (AWS)
Migration

Greetings!

Looking urgently!

Experience: minimum 10 years

Location: Delhi

Salary: negotiable



Role

AWS Data Migration Consultant

Provide data migration strategy, expert review, and guidance on migrating data from on-premises infrastructure to AWS infrastructure that includes AWS Fargate, PostgreSQL, and DynamoDB. This includes review and SME input on:

·       Data migration plan, architecture, policies, procedures

·       Migration testing methodologies

·       Data integrity, consistency, resiliency.

·       Performance and Scalability

·       Capacity planning

·       Security, access control, encryption

·       DB replication and clustering techniques

·       Migration risk mitigation approaches

·       Verification and integrity testing, reporting (Record and field level verifications)

·       Schema consistency and mapping

·       Logging, error recovery

·       Dev-test, staging and production artifact promotions and deployment pipelines

·       Change management

·       Backup, DR approaches and best practices.
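One of the SME review areas above is record- and field-level verification. A minimal sketch of how that check can work, comparing per-record checksums between source and target row sets; the table rows, field names, and the drifted value below are invented for illustration:

```python
import hashlib

def row_digest(row, fields):
    """Stable per-record checksum over the selected fields."""
    payload = "|".join(str(row[f]) for f in fields)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def verify_migration(source_rows, target_rows, key, fields):
    """Compare source and target record sets field by field.

    Returns (missing_keys, mismatched_keys) so a migration report can
    list exactly which records failed verification.
    """
    src = {r[key]: row_digest(r, fields) for r in source_rows}
    tgt = {r[key]: row_digest(r, fields) for r in target_rows}
    missing = sorted(k for k in src if k not in tgt)
    mismatched = sorted(k for k in src if k in tgt and src[k] != tgt[k])
    return missing, mismatched

source = [
    {"id": 1, "name": "Asha", "balance": "1200.50"},
    {"id": 2, "name": "Ravi", "balance": "88.00"},
    {"id": 3, "name": "Meena", "balance": "510.75"},
]
target = [
    {"id": 1, "name": "Asha", "balance": "1200.50"},
    {"id": 2, "name": "Ravi", "balance": "88.10"},  # field-level drift
]

missing, mismatched = verify_migration(source, target, "id", ["name", "balance"])
print(missing)      # [3]
print(mismatched)   # [2]
```

In practice the two row sets would be fetched in batches from the on-prem database and the AWS target via their respective drivers; the checksum comparison itself stays the same.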


Qualifications

  • Worked on mid-to-large-scale data migration projects, specifically from on-prem to AWS, preferably in the BFSI domain
  • Deep expertise in AWS Redshift, PostgreSQL, and DynamoDB from a data management, performance, scalability, and consistency standpoint
  • Strong knowledge of AWS Cloud architecture, components, solutions, and well-architected frameworks
  • Expertise in SQL and DB performance-related aspects
  • Solution architecture work for enterprise-grade BFSI applications
  • Successful track record of defining and implementing data migration strategies
  • Excellent communication and problem-solving skills
  • 10+ years of experience in technology, with at least 4 years in AWS and DBA/DB management/migration-related work
  • Bachelor's degree or higher in Engineering or a related field



About TSG Global Services Private Limited

Founded: 1999
Size: 20-100
Stage: Profitable
About: N/A
Connect with the team: Sony Pathak
Company social profiles: blog, LinkedIn, Facebook

Similar jobs

Startup Focused on simplifying Buying Intent
Bengaluru (Bangalore)
4 - 9 yrs
₹28L - ₹56L / yr
Big Data
Apache Spark
Spark
Hadoop
ETL
+7 more
  • 5+ years of experience in a Data Engineer role.
  • Proficiency in Linux.
  • Must have SQL knowledge and experience working with relational databases and query authoring (SQL), as well as familiarity with databases including MySQL, Mongo, Cassandra, and Athena.
  • Must have experience with Python/Scala.
  • Must have experience with Big Data technologies like Apache Spark.
  • Must have experience with Apache Airflow.
  • Experience with data pipeline and ETL tools like AWS Glue.
  • Experience working with AWS cloud services: EC2, S3, RDS, Redshift.
Bengaluru (Bangalore)
1 - 6 yrs
₹2L - ₹8L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+9 more

ROLE AND RESPONSIBILITIES

Should be able to work as an individual contributor and maintain good relationships with stakeholders. Should be proactive in learning new skills per business requirements. Familiar with extracting relevant data, then cleansing and transforming it into insights that drive business value, through data analytics, data visualization, and data modeling techniques.


QUALIFICATIONS AND EDUCATION REQUIREMENTS

Technical Bachelor’s Degree.

Non-Technical Degree holders should have 1+ years of relevant experience.

Perfios
Agency job
via Seven N Half by Susmitha Goddindla
Bengaluru (Bangalore)
4 - 6 yrs
₹4L - ₹15L / yr
SQL
ETL tool
Python developer
MongoDB
Data Science
+15 more
Job Description
1. ROLE AND RESPONSIBILITIES
1.1. Implement next generation intelligent data platform solutions that help build high performance distributed systems.
1.2. Proactively diagnose problems and envisage long term life of the product focusing on reusable, extensible components.
1.3. Ensure agile delivery processes.
1.4. Work collaboratively with stake holders including product and engineering teams.
1.5. Build best-practices in the engineering team.
2. PRIMARY SKILL REQUIRED
2.1. 2-6 years of core software product development experience.
2.2. Experience of working with data-intensive projects, with a variety of technology stacks including different programming languages (Java,
Python, Scala)
2.3. Experience in building infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data
sources to support other teams to run pipelines/jobs/reports etc.
2.4. Experience in Open-source stack
2.5. Experiences of working with RDBMS databases, NoSQL Databases
2.6. Knowledge of enterprise data lakes, data analytics, reporting, in-memory data handling, etc.
2.7. A core computer science academic background
2.8. An aspiration to continue pursuing a career in the technical stream
3. Optional Skill Required:
3.1. Understanding of Big Data technologies and Machine learning/Deep learning
3.2. Understanding of diverse set of databases like MongoDB, Cassandra, Redshift, Postgres, etc.
3.3. Understanding of Cloud Platform: AWS, Azure, GCP, etc.
3.4. Experience in BFSI domain is a plus.
4. PREFERRED SKILLS
4.1. A startup mentality: comfort with ambiguity, and a willingness to test, learn, and improve rapidly
Uber9 Business Process Services Pvt Ltd
Posted by Lakshmi J
Chennai
1 - 4 yrs
₹1L - ₹4L / yr
MongoDB
Machine Learning (ML)
Deep Learning
Natural Language Processing (NLP)
Amazon Web Services (AWS)

You will work alongside a highly motivated advanced Machine Learning team; key responsibilities are to research, design, develop, and implement applications that will be integrated into our workflows.

Responsibilities and Accountabilities:

 

  • Provide ML and Deep Learning solutions and build models for day-to-day needs.
  • Work on end-to-end automation of complex workflows.
  • Should have the ability to read and understand the necessary Deep Learning research papers and draw solutions from them.
  • Extract information from various kinds of documents submitted by our customers. These documents will be images (different formats and resolutions) and PDFs (text and scanned images).
  • Apply advanced Natural Language Processing algorithms to extract metadata and drive research-based workflows.
  • Work collaboratively with the Engineering and Product teams to design and implement the company's technical vision.

Experience:

  • Nature of experience: practical experience applying machine learning to computer vision tasks or in NLP
  • Length of experience: 1-2 years (freshers with extraordinary projects considered)


Skill Set & Personality Traits required:

  • Have a proven understanding of computer vision and machine learning theory.
  • Candidates should be able to analyze and synthesize data both syntactically and semantically using NLP techniques based on neural networks (Transformers such as BERT and its variants, RNN, LSTM, Bi-LSTM).
  • Reasoning on knowledge graphs
  • Applied and computational linguistics
  • Should have in-depth knowledge of computer vision (image classification (CNNs) and processing) and Natural Language Processing (syntactic and semantic regimes)
  • Must have the following machine learning skills: probabilistic learning (Naive Bayes), neural networks (CNN, RNN, LSTM, Bi-LSTM, GCNN, object-detection networks such as YOLO).
  • Proficiency in Python (mandatory); Scala is an add-on
  • Must have working knowledge of frameworks: TensorFlow/PyTorch/Keras, Spark, scikit-learn, Flask, FastAPI
  • Must have working knowledge of databases: MongoDB
  • Must have working knowledge of cloud: AWS (S3 and Lambda)
  • Have the vision and experience to build end-to-end Machine Learning platform solutions.
  • Proven experience working in a product-driven environment, building and shipping early-stage technologies.
  • Professionally credible, with strong integrity.
  • Good communication skills.
  • Strong interpersonal skills.
  • Organizational skills and the ability to manage deadlines.
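The probabilistic-learning skill named above (Naive Bayes) can be illustrated with a tiny from-scratch text classifier; the support-ticket snippets and labels below are invented for the example:

```python
import math
from collections import Counter, defaultdict

def train_naive_bayes(docs, labels):
    """Multinomial Naive Bayes with add-one (Laplace) smoothing."""
    word_counts = defaultdict(Counter)   # per-class word frequencies
    class_counts = Counter(labels)
    vocab = set()
    for doc, label in zip(docs, labels):
        words = doc.lower().split()
        word_counts[label].update(words)
        vocab.update(words)
    return word_counts, class_counts, vocab

def predict(doc, word_counts, class_counts, vocab):
    total_docs = sum(class_counts.values())
    best_label, best_score = None, float("-inf")
    for label in class_counts:
        # log prior + sum of smoothed log likelihoods
        score = math.log(class_counts[label] / total_docs)
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in doc.lower().split():
            score += math.log((word_counts[label][word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

docs = ["refund my payment", "payment failed again",
        "great fast delivery", "delivery was great"]
labels = ["billing", "billing", "logistics", "logistics"]
model = train_naive_bayes(docs, labels)
print(predict("payment refund", *model))   # billing
print(predict("fast delivery", *model))    # logistics
```

Production systems would use a library implementation (e.g. scikit-learn, which the listing names), but the smoothing and log-probability arithmetic are the same idea.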
Mactores Cognition Private Limited
Remote only
2 - 15 yrs
₹6L - ₹40L / yr
Amazon Web Services (AWS)
PySpark
Athena
Data engineering

As an AWS Data Engineer, you are a full-stack data engineer who loves solving business problems. You work with business leads, analysts, and data scientists to understand the business domain, and engage with fellow engineers to build data products that empower better decision-making. You are passionate about the data quality of our business metrics and the flexibility of your solution as it scales to answer broader business questions.


If you love to solve problems using your skills, then come join the Team Mactores. We have a casual and fun office environment that actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself.

What you will do

  • Write efficient code in PySpark and Amazon Glue
  • Write SQL queries in Amazon Athena and Amazon Redshift
  • Explore new technologies and learn new techniques to solve business problems creatively
  • Collaborate with many teams, engineering and business, to build better data products and services
  • Deliver projects collaboratively with the team and keep customers updated on time
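Amazon Athena and Amazon Redshift are both queried with standard SQL, so the second bullet above is largely about writing aggregations like the one below. A minimal sketch, run against SQLite purely for illustration; the `orders` table and its values are invented:

```python
import sqlite3

# In-memory database standing in for an Athena/Redshift table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("north", 120.0), ("north", 80.0), ("south", 250.0)],
)

# The kind of business-metric aggregation the role describes.
rows = conn.execute(
    """
    SELECT region, COUNT(*) AS n_orders, SUM(amount) AS revenue
    FROM orders
    GROUP BY region
    ORDER BY revenue DESC
    """
).fetchall()
print(rows)  # [('south', 1, 250.0), ('north', 2, 200.0)]
```

The same statement would run unchanged on Athena or Redshift, with the engine-specific parts (external tables, partitions, distribution keys) layered on top.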


What are we looking for?

  • 1 to 3 years of experience in Apache Spark, PySpark, and Amazon Glue
  • 2+ years of experience writing ETL jobs using PySpark and SparkSQL
  • 2+ years of experience with SQL queries and stored procedures
  • A deep understanding of the DataFrame API and the transformation functions supported by Spark 2.7+


You will be preferred if you have

  • Prior experience in working on AWS EMR, Apache Airflow
  • Certifications: AWS Certified Big Data – Specialty, Cloudera Certified Big Data Engineer, or Hortonworks Certified Big Data Engineer
  • Understanding of DataOps Engineering


Life at Mactores


We care about creating a culture that makes a real difference in the lives of every Mactorian. Our 10 Core Leadership Principles that honor Decision-making, Leadership, Collaboration, and Curiosity drive how we work.


1. Be one step ahead

2. Deliver the best

3. Be bold

4. Pay attention to the detail

5. Enjoy the challenge

6. Be curious and take action

7. Take leadership

8. Own it

9. Deliver value

10. Be collaborative


We would like you to read more details about the work culture on https://mactores.com/careers 


The Path to Joining the Mactores Team

At Mactores, our recruitment process is structured around three distinct stages:


Pre-Employment Assessment: 

You will be invited to participate in a series of pre-employment evaluations to assess your technical proficiency and suitability for the role.


Managerial Interview: The hiring manager will engage with you in multiple discussions, lasting anywhere from 30 minutes to an hour, to assess your technical skills, hands-on experience, leadership potential, and communication abilities.


HR Discussion: During this 30-minute session, you'll have the opportunity to discuss the offer and next steps with a member of the HR team.


At Mactores, we are committed to providing equal opportunities in all of our employment practices, and we do not discriminate based on race, religion, gender, national origin, age, disability, marital status, military status, genetic information, or any other category protected by federal, state, and local laws. This policy extends to all aspects of the employment relationship, including recruitment, compensation, promotions, transfers, disciplinary action, layoff, training, and social and recreational programs. All employment decisions will be made in compliance with these principles.

Persistent Systems
Agency job via Milestone Hr Consultancy by Haina khan
Bengaluru (Bangalore), Hyderabad, Pune
9 - 16 yrs
₹7L - ₹32L / yr
Big Data
Scala
Spark
Hadoop
Python
+1 more
Greetings!

We have an urgent requirement for the post of Big Data Architect at a reputed MNC.
 
 


Location: Pune/Nagpur, Goa, Hyderabad/Bangalore

Job Requirements:

  • 9+ years of total experience, preferably in the big data space.
  • Creating Spark applications using Scala to process data.
  • Experience in scheduling and troubleshooting/debugging Spark jobs in steps.
  • Experience in Spark job performance tuning and optimization.
  • Should have experience in processing data using Kafka/Python.
  • Should have experience and understanding in configuring Kafka topics to optimize performance.
  • Should be proficient in writing SQL queries to process data in a Data Warehouse.
  • Hands-on experience working with Linux commands to troubleshoot/debug issues and creating shell scripts to automate tasks.
  • Experience with AWS services like EMR.
Simform Solutions
Posted by Dipali Pithava
Ahmedabad
4 - 8 yrs
₹5L - ₹12L / yr
ETL
Informatica
Data Warehouse (DWH)
Relational Database (RDBMS)
DBA
+4 more
We are looking for a Lead DBA with 4-7 years of experience.

We are a fast-growing digital, cloud, and mobility services provider whose principal market is North America. We are looking for talented database/SQL experts for the management and analytics of large data in various enterprise projects.

Responsibilities

  • Translate business needs into technical specifications
  • Manage and maintain various database servers (backups, replicas, shards, jobs)
  • Develop and execute database queries and conduct analyses
  • Occasionally write scripts for ETL jobs
  • Create tools to store data (e.g. OLAP cubes)
  • Develop and update technical documentation

Requirements

  • Proven experience as a database programmer and administrator
  • Background in data warehouse design (e.g. dimensional modeling) and data mining
  • Good understanding of SQL and NoSQL databases, online analytical processing (OLAP), and the ETL (extract, transform, load) framework
  • Advanced knowledge of SQL queries, SQL Server Reporting Services (SSRS), and SQL Server Integration Services (SSIS)
  • Familiarity with BI technologies (strong Tableau hands-on experience) is a plus
  • An analytical mind with a problem-solving aptitude
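The occasional ETL scripting this role mentions can be sketched end to end with the standard library alone; the CSV payload, column names, and staging table below are invented for the example:

```python
import csv
import io
import sqlite3

# Extract: parse a CSV feed (inlined here; normally read from a file or object store).
raw = "id,name,salary\n1,asha,50000\n2,ravi,\n3,meena,72000\n"
records = list(csv.DictReader(io.StringIO(raw)))

# Transform: drop rows with a missing salary, normalise names.
cleaned = [
    (int(r["id"]), r["name"].title(), float(r["salary"]))
    for r in records
    if r["salary"]
]

# Load: insert into a warehouse staging table (SQLite standing in).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_employees (id INTEGER, name TEXT, salary REAL)")
conn.executemany("INSERT INTO staging_employees VALUES (?, ?, ?)", cleaned)

count, total = conn.execute(
    "SELECT COUNT(*), SUM(salary) FROM staging_employees"
).fetchone()
print(count, total)  # 2 122000.0
```

A production job would add logging, rejected-row handling, and idempotent loads, but the extract/transform/load shape stays the same.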
EASEBUZZ
Posted by Amala Baby
Pune
2 - 4 yrs
₹2L - ₹20L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+12 more

Company Profile:

 

Easebuzz is a payment solutions company (a fintech organisation) that enables online merchants to accept, process, and disburse payments through developer-friendly APIs. We focus on building plug-and-play products, including the payment infrastructure, to solve complete business problems. It is definitely a wonderful place where all the action related to payments, lending, subscriptions, and eKYC happens at the same time.

 

We have been consistently profitable and are constantly developing new innovative products; as a result, we have grown 4x over the past year alone. We are well capitalised, having closed a fundraise of $4M in March 2021 from prominent VC firms and angel investors. The company is based out of Pune and has a total strength of 180 employees. Easebuzz's corporate culture is tied to the vision of building a workplace that breeds open communication and minimal bureaucracy. An equal-opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.

 

Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.

 


Salary: As per company standards.

 

Designation: Data Engineer

 

Location: Pune

 

Experience with ETL, Data Modeling, and Data Architecture

Design, build, and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with third-party tools: Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue.

Experience with AWS cloud data lake for development of real-time or near real-time use cases

Experience with messaging systems such as Kafka/Kinesis for real time data ingestion and processing

Build data pipeline frameworks to automate high-volume and real-time data delivery

Create prototypes and proof-of-concepts for iterative development.

Experience with NoSQL databases, such as DynamoDB, MongoDB etc

Create and maintain optimal data pipeline architecture.

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.


Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.

Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.

Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.

Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.

Evangelize a very high standard of quality, reliability and performance for data models and algorithms that can be streamlined into the engineering and sciences workflow

Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.

 

Employment Type

Full-time

 

MNC
Agency job
via Fragma Data Systems by geeti gaurav mohanty
Bengaluru (Bangalore), Hyderabad
3 - 6 yrs
₹10L - ₹15L / yr
Big Data
Spark
ETL
Apache
Hadoop
+2 more
Desired Skill, Experience, Qualifications, and Certifications:
  • 5+ years' experience developing and maintaining modern ingestion pipelines using technologies like Spark, Apache NiFi, etc.
  • 2+ years' experience with Healthcare Payors (focusing on Membership, Enrollment, Eligibility, Claims, Clinical)
  • Hands-on experience with AWS Cloud and its native components like S3, Athena, Redshift, and Jupyter Notebooks
  • Strong in Spark Scala and Python pipelines (ETL and streaming)
  • Strong experience in metadata management tools like AWS Glue
  • Strong experience in coding with languages like Java and Python
  • Worked on designing ETL and streaming pipelines in Spark Scala/Python
  • Good experience in requirements gathering, design, and development
  • Working with cross-functional teams to meet strategic goals
  • Experience in high-volume data environments
  • Critical thinking and excellent verbal and written communication skills
  • Strong problem-solving and analytical abilities; should be able to work and deliver individually
  • Good to have: AWS Developer certification, Scala coding experience, Postman API, and Apache Airflow or similar scheduler experience
  • Nice to have: experience in healthcare messaging standards like HL7, CCDA, EDI, 834, 835, 837
  • Good communication skills
Data
Agency job
via parkcom by Ravi P
Pune
6 - 15 yrs
₹7L - ₹15L / yr
ETL
Oracle
Talend
Ab Initio

We are looking for a Senior Database Developer to provide a senior-level contribution to designing, developing, and implementing critical business enterprise applications for marketing systems.

 

  1. Play a lead role in developing, deploying, and managing our databases (Oracle, MySQL, and Mongo) on public clouds.
  2. Design and develop PL/SQL processes to perform complex ETL processes.
  3. Develop UNIX and Perl scripts for data auditing and automation.
  4. Responsible for database builds and change requests.
  5. Holistically define the overall reference architecture and manage its implementation in the production systems.
  6. Identify architecture gaps that can improve availability, performance, and security for both production systems and database systems, and work towards resolving those issues.
  7. Work closely with Engineering, Architecture, Business, and Operations teams to provide necessary and continuous feedback.
  8. Automate all the manual steps for the database platform.
  9. Deliver solutions for access management, availability, security, replication, and patching.
  10. Troubleshoot application database performance issues.
  11. Participate in daily huddles (30 min.) to collaborate with onshore and offshore teams.
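Item 3 in the list above mentions scripts for data auditing and automation. A minimal sketch of such an audit in Python (standing in for the UNIX/Perl scripting the role names); the `customers` table and its columns are invented for the example:

```python
import sqlite3

def audit_table(conn, table, not_null_cols):
    """Basic data audit: row count plus NULL counts for required columns.

    Table and column names are interpolated directly, so they must come
    from trusted configuration, never from user input.
    """
    report = {"rows": conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]}
    for col in not_null_cols:
        nulls = conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL"
        ).fetchone()[0]
        report[f"null_{col}"] = nulls
    return report

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT, segment TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "a@x.com", "retail"), (2, None, "retail"), (3, "c@x.com", None)],
)

report = audit_table(conn, "customers", ["email", "segment"])
print(report)  # {'rows': 3, 'null_email': 1, 'null_segment': 1}
```

A scheduled job would run such checks against each managed database and flag any report whose counts drift from expectations.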

 

Qualifications: 

 

  1. 5+ years of experience in database development.
  2. Bachelor’s degree in Computer Science, Computer Engineering, Math, or similar.
  3. Experience using ETL tools (Talend or Ab Initio a plus).
  4. Experience with relational database programming, processing, and tuning (Oracle, PL/SQL, MySQL, MS SQL Server, SQL, T-SQL).
  5. Familiarity with BI tools (Cognos, Tableau, etc.).
  6. Experience with Cloud technology (AWS, etc.).
  7. Agile or Waterfall methodology experience preferred.
  8. Experience with API integration.
  9. Advanced software development and scripting skills for use in automation and interfacing with databases.
  10. Knowledge of software development lifecycles and methodologies.
  11. Knowledge of developing procedures, packages and functions in a DW environment.
  12. Knowledge of UNIX, Linux and Service Oriented Architecture (SOA).
  13. Ability to multi-task, to work under pressure, and think analytically.
  14. Ability to work with minimal supervision and meet deadlines.
  15. Ability to write technical specifications and documents.
  16. Ability to communicate effectively with individuals at all levels in the company and with various business contacts outside of the company in an articulate, professional manner.
  17. Knowledge of CDP, CRM, MDM and Business Intelligence is a plus.
  18. Flexible work hours.

 

This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.

 
