Data Migration Developer

at Qvantel Software Solutions Ltd

Posted by Srinivas Bollipally
Hyderabad • 3 - 7 yrs • ₹6L - ₹20L / yr • Full time
Skills
Data Migration
BSS
ETL
We are now looking for passionate Data Migration Developers to work at our Hyderabad site.

Role Description: We are looking for data migration developers for our BSS delivery projects. Your main goal is to analyse the migration data, design the migration solution, and execute the data migration. You will work as part of the migration team in cooperation with our migration architect and the BSS delivery project manager. You have a solid background in telecom BSS and experience with data migrations. You will be expected to interpret data analysis produced by Business Analysts, raise issues or questions, and work directly with the client on-site to resolve them. You must therefore be capable of understanding the telecom business behind a technical solution.

Requirements:
– An understanding of different data migration approaches and the ability to adapt requirements to migration tool development and use
– The ability to analyse the shape and health of source data
– Extraction of data from multiple legacy sources
– Building transformation code that adheres to the data mappings
– Loading data into new or existing target solutions (an illustrative extract/transform/load sketch follows at the end of this posting)

We appreciate:
– Deep knowledge of ETL processes and/or other migration tools
– Proven experience with high-volume data migrations in business-critical telecom systems
– Experience with telecom business support systems
– The ability to bring innovation and improvement to data migration/support processes and to manage multiple priorities effectively

We can offer you:
– Interesting and challenging work in a fast-growing, customer-oriented company
– An international and multicultural working environment with experienced and enthusiastic colleagues
– Plenty of opportunities to learn, grow and progress in your career

At Qvantel we have built a young, dynamic culture where people are motivated to learn and develop themselves, are used to working both independently and in teams, and have a systematic, hands-on working style and a can-do attitude. Our people are used to communicating across cultures and time zones. A sense of humour can also come in handy.

Don't hesitate to ask for more information from Srinivas Bollipally, our Recruitment Specialist, reachable at [email protected]
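The requirements above describe a classic extract/transform/load migration flow. As a rough, non-authoritative sketch of that flow in Python: the legacy and target databases, table names, and column mapping below are hypothetical and not Qvantel's actual tooling.

```python
# Minimal, hypothetical sketch of the extract/transform/load flow described above.
# Table names, column mappings, and the SQLite databases are illustrative only.
import sqlite3

# Mapping from legacy column names to the target model (hypothetical).
COLUMN_MAPPING = {"cust_no": "customer_id", "msisdn": "phone_number", "plan_cd": "tariff_plan"}

def extract(legacy_db: str) -> list[dict]:
    """Extract subscriber rows from a legacy source."""
    with sqlite3.connect(legacy_db) as conn:
        conn.row_factory = sqlite3.Row
        rows = conn.execute("SELECT cust_no, msisdn, plan_cd FROM legacy_subscribers").fetchall()
    return [dict(row) for row in rows]

def transform(rows: list[dict]) -> list[dict]:
    """Rename columns per the mapping and normalise phone numbers."""
    out = []
    for row in rows:
        mapped = {COLUMN_MAPPING[k]: v for k, v in row.items()}
        mapped["phone_number"] = str(mapped["phone_number"]).lstrip("+")
        out.append(mapped)
    return out

def load(target_db: str, rows: list[dict]) -> None:
    """Load transformed rows into the target solution."""
    with sqlite3.connect(target_db) as conn:
        conn.executemany(
            "INSERT INTO subscribers (customer_id, phone_number, tariff_plan) "
            "VALUES (:customer_id, :phone_number, :tariff_plan)",
            rows,
        )

if __name__ == "__main__":
    load("target.db", transform(extract("legacy.db")))
```

In a real BSS migration the extract and load steps would target the operator's actual source and target systems, and the mapping would come from the migration architect's data-mapping specification.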

About Qvantel Software Solutions Ltd

What if your Business Support Solutions (BSS) operations were run as efficiently and in as modern a way as today's Internet services?
Founded 2000 • Products & Services • 100-1000 employees • Profitable

Similar jobs

Data Scientist

at Gormalone LLP

Founded 2017  •  Products & Services  •  20-100 employees  •  Bootstrapped
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
Data Analytics
EDA
Python
Statistical Analysis
ETL
Artificial Intelligence (AI)
TensorFlow
Deep Learning
Artificial Neural Network (ANN)
DNN
Long short-term memory (LSTM)
Keras
PyTorch
Model Building
Airflow
Mlops
CNN
RNN
YOLO
Bengaluru (Bangalore) • 2 - 5 yrs • ₹5L - ₹20L / yr

DATA SCIENTIST-MACHINE LEARNING                           

GormalOne LLP. Mumbai IN

 

Job Description

GormalOne is a social impact agri-tech enterprise focused on farmer-centric projects. Our vision is to make farming highly profitable for the smallest farmer, thereby ensuring India's “Nutrition security”. Our mission is driven by the use of advanced technology. Our technology will be highly user-friendly for the majority of farmers, who are digitally naive. We are looking for people who are keen to use their skills to transform farmers' lives. You will join a highly energized and competent team that is working on advanced global technologies such as OCR, facial recognition, and AI-led disease prediction, amongst others.

 

GormalOne is looking for a machine learning engineer to join its team. This collaborative yet dynamic role is suited for candidates who enjoy the challenge of building, testing, and deploying end-to-end ML pipelines and incorporating MLOps best practices across different technology stacks supporting a variety of use cases. We seek candidates who are curious not only about furthering their own knowledge of MLOps best practices through hands-on experience but who can also help uplift the knowledge of their colleagues.

 

Location: Bangalore

 

Roles & Responsibilities

  • Individual contributor
  • Developing and maintaining an end-to-end data science project
  • Deploying scalable applications on different platforms
  • Ability to analyze and enhance the efficiency of existing products

 

What are we looking for?

  • 3 to 5 years of experience as a Data Scientist
  • Skilled in Data Analysis, EDA, and Model Building
  • Basic coding skills in Python
  • Decent knowledge of Statistics
  • Creating pipelines for ETL and ML models
  • Experience in the operationalization of ML models
  • Good exposure to Deep Learning: ANN, DNN, CNN, RNN, and LSTM
  • Hands-on experience with Keras, PyTorch, or TensorFlow (a minimal sketch follows below)
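For orientation only, here is a minimal Keras sketch of the kind of model building and deep-learning exposure listed above; the data, shapes, and hyperparameters are invented for illustration and are not part of the role.

```python
# Minimal, illustrative Keras example: a small LSTM classifier on dummy data.
# Shapes and hyperparameters are hypothetical, chosen only to show the API.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Dummy dataset: 100 sequences of length 20 with 8 features, binary labels.
x = np.random.rand(100, 20, 8).astype("float32")
y = np.random.randint(0, 2, size=(100,))

model = keras.Sequential([
    layers.Input(shape=(20, 8)),
    layers.LSTM(32),                      # recurrent layer (LSTM, as listed in the skills)
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=2, batch_size=16, verbose=0)
print(model.evaluate(x, y, verbose=0))
```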

 

 

Basic Qualifications

  • B.Tech/BE in Computer Science or Information Technology
  • Certification in AI, ML, or Data Science is preferred.
  • A Master's degree or Ph.D. in a relevant field is preferred.

Preferred Requirements

  • Experience with tools and packages like TensorFlow, MLflow, and Airflow
  • Experience with object detection techniques like YOLO
  • Exposure to cloud technologies
  • Operationalization of ML models
  • Good understanding of and exposure to MLOps

Kindly note: Salary shall be commensurate with qualifications and experience

Job posted by
Dhwani Rambhia

Senior Data Engineer

at Innovaccer

Founded 2014  •  Products & Services  •  100-1000 employees  •  Profitable
ETL
SQL
Data Warehouse (DWH)
Informatica
Datawarehousing
ELT
SSIS
Noida, Bengaluru (Bangalore), Pune, Hyderabad • 4 - 7 yrs • ₹4L - ₹16L / yr

We are looking for a Senior Data Engineer to join the Customer Innovation team, who will be responsible for acquiring, transforming, and integrating customer data onto our Data Activation Platform from customers’ clinical, claims, and other data sources. You will work closely with customers to build data and analytics solutions to support their business needs, and be the engine that powers the partnership that we build with them by delivering high-fidelity data assets.

In this role, you will work closely with our Product Managers, Data Scientists, and Software Engineers to build the solution architecture that will support customer objectives. You'll work with some of the brightest minds in the industry, work with one of the richest healthcare data sets in the world, use cutting-edge technology, and see your efforts affect products and people on a regular basis. The ideal candidate is someone that

  • Has healthcare experience and is passionate about helping heal people,
  • Loves working with data,
  • Has an obsessive focus on data quality,
  • Is comfortable with ambiguity and making decisions based on available data and reasonable assumptions,
  • Has strong data interrogation and analysis skills,
  • Defaults to written communication and delivers clean documentation, and,
  • Enjoys working with customers and problem solving for them.

A day in the life at Innovaccer:

  • Define the end-to-end solution architecture for projects by mapping customers’ business and technical requirements against the suite of Innovaccer products and Solutions.
  • Measure and communicate impact to our customers.
  • Enable customers to activate data themselves using SQL, BI tools, or APIs so they can answer their questions at speed (a brief illustrative SQL example follows below).
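As a purely illustrative example of activating data with SQL (the claims table, its columns, and the SQLite database are hypothetical, not Innovaccer's schema), a question such as "which clinics submitted the most claims last month?" could be answered with a query like this:

```python
# Hypothetical example of answering a business question directly with SQL.
# The claims table and its columns are illustrative only.
import sqlite3

QUERY = """
SELECT clinic_id,
       COUNT(*)           AS claim_count,
       SUM(billed_amount) AS total_billed
FROM   claims
WHERE  service_date >= date('now', 'start of month', '-1 month')
  AND  service_date <  date('now', 'start of month')
GROUP  BY clinic_id
ORDER  BY claim_count DESC
LIMIT  10;
"""

with sqlite3.connect("customer_data.db") as conn:
    for clinic_id, claim_count, total_billed in conn.execute(QUERY):
        print(clinic_id, claim_count, total_billed)
```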

What You Need:

  • 4+ years of experience in a Data Engineering role and a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
  • 4+ years of experience working with relational databases like Snowflake, Redshift, or Postgres.
  • Intermediate to advanced level SQL programming skills.
  • Data Analytics and Visualization (using tools like PowerBI)
  • The ability to engage with both the business and technical teams of a client - to document and explain technical problems or concepts in a clear and concise way.
  • Ability to work in a fast-paced and agile environment.
  • Easily adapt and learn new things whether it’s a new library, framework, process, or visual design concept.

What we offer:

  • Industry certifications: We want you to be a subject matter expert in what you do. So, whether it’s our product or our domain, we’ll help you dive in and get certified.
  • Quarterly rewards and recognition programs: We foster learning and encourage people to take risks. We recognize and reward your hard work.
  • Health benefits: We cover health insurance for you and your loved ones.
  • Sabbatical policy: We encourage people to take time off and rejuvenate, learn new skills, and pursue their interests so they can generate new ideas with Innovaccer.
  • Pet-friendly office and open floor plan: No boring cubicles.
Job posted by
Jyoti Kaushik
Data Warehouse (DWH)
Informatica
ETL
CI/CD
SQL
Bengaluru (Bangalore) • 4 - 8 yrs • ₹9L - ₹14L / yr

 

Role: Talend Production Support Consultant

 

Brief Job Description:  

  • Be involved in release deployment and monitoring of the ETL pipelines.
  • Work closely with the development team and business team to provide operational support.
  • The candidate should have good knowledge of and hands-on experience with the following tools/technologies:

Talend (Talend Studio, TAC, TMC), SAP BODS, SQL, Hive & Azure (Azure fundamentals, ADB, ADF)

  • Hands-on experience in CI/CD is an added advantage.

As discussed, please provide your LinkedIn profile URL and a valid ID proof.

Please also confirm that you are willing to relocate to Bangalore when required.

Job posted by
Srabanti Saha
Data Warehouse (DWH)
ETL
ADF
Business Intelligence (BI)
Data architecture
SQL Azure
Azure Databricks
Gurugram (Gurgaon), Delhi, Noida, Ghaziabad, Faridabad • 7 - 15 yrs • ₹25L - ₹45L / yr
Responsibilities

* Formulates and recommends standards for achieving maximum performance and efficiency of the DW ecosystem.
* Participates in pre-sales activities for solutions to various customer problem statements/situations.
* Develops business cases and ROI for customers/clients.
* Interviews stakeholders and develops a BI roadmap for success, given project prioritization.
* Evangelizes self-service BI and visual discovery while helping to automate any manual process at the client site.
* Works closely with the Engineering Manager to ensure prioritization of customer deliverables.
* Champions data quality, integrity, and reliability throughout the organization by designing and promoting best practices.

Implementation (20%):

* Helps DW/DE team members with issues needing technical expertise or complex systems and/or programming knowledge.
* Provides on-the-job training for new or less experienced team members.
* Develops a technical excellence team.

Requirements

- Experience designing business intelligence solutions
- Experience with ETL processes and data warehouse architecture
- Experience with Azure Data services, i.e., ADF, ADLS Gen 2, Azure SQL DB, Synapse, Azure Databricks, and Power BI
- Good analytical and problem-solving skills
- Fluent in relational database concepts and flat-file processing concepts
- Must be knowledgeable in software development lifecycles/methodologies
Job posted by
Vikas Shelke

Data Engineer

at surusha technology Pvt Ltd

Founded 2016  •  Products & Services  •  20-100 employees  •  Profitable
C#
SQL Azure
ETL
OLAP
SQL
Data engineering
Remote only • 3 - 6 yrs • ₹3L - ₹6L / yr
Role: Data Engineer

Skillsets: Azure, OLAP, ETL, SQL, Python, C#

Experience range: 3+ to 4 years

Salary: best in industry

Notice period: currently serving notice period (immediate joiners are preferred)

Location: remote work

Job type: permanent role (full time and fully remote)

Note: The interview has three rounds: a technical round, a manager/client round, and an HR round.
Job posted by
subham kumar

Data Analyst

at KGISL

Founded 1994  •  Services  •  100-1000 employees  •  Profitable
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
Data Analytics
Amazon Web Services (AWS)
Amazon Redshift
SQL
ETL
quicksight
Bengaluru (Bangalore), Coimbatore • 4 - 10 yrs • ₹12L - ₹18L / yr
The Data Analyst will be responsible for supporting the data and reporting needs of the Brokerage. This role will involve interacting with internal stakeholders for business problem formulation, requirements gathering, identifying relevant data sources or sub-processes in which to deploy new KPIs, communicating new data extract requirements, analyzing and synthesizing data from multiple sources, and ultimately producing high-quality insights that demonstrate a full narrative. This role requires a significant amount of collaboration. The successful person will be highly adaptable and able to prioritize multiple critical projects.

Why work with us?

We're helping Canadian families finance their dream homes, and our people are at the heart of everything we do. We are the fastest-growing mortgage brokerage and among the Top 20 brokerages in Canada. We were founded on innovation and are driven by technology. This translates into a process and service that is consistent, reliable, and scalable. We provide a state-of-the-art employment facility, cutting-edge technology, and a proven sales process with continuous training and support.

Responsibilities:

 

  • Applies scripting/programming skills to assemble various types of source data (unstructured, semi-structured, and structured) into well-prepared datasets at multiple levels of granularity (e.g., demographics, customers, products, transactions); a brief illustrative pandas sketch follows this list.
  • Leads the development of standard and customized reporting, dashboards, and analysis of information.
  • Leads the development of tools, methodologies, and statistical models.
  • Provides hands-on development and support in creating and launching various tools and reporting.
  • Develops analytical solutions and makes recommendations based on an understanding of the business strategy and stakeholder needs.
  • Works with various data owners to discover and select available data from internal sources to fulfill analytical needs.
  • Summarizes statistical findings, draws conclusions, and presents actionable business recommendations in a simple, clear way that drives action.
  • Uses appropriate algorithms to discover patterns in the data.
  • Works independently on a range of complex tasks, which may include unique assignments.
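For illustration only, a small pandas sketch of assembling prepared datasets at more than one level of granularity; the columns and values below are invented, not the Brokerage's data.

```python
# Hypothetical pandas sketch: build aggregates at two levels of granularity
# (per customer and per product) from a raw transactions table.
import pandas as pd

transactions = pd.DataFrame({
    "customer_id": [1, 1, 2, 3],
    "product_id": ["A", "B", "A", "C"],
    "amount": [120.0, 80.0, 200.0, 50.0],
})

# Customer-level granularity: total spend and number of transactions per customer.
by_customer = transactions.groupby("customer_id").agg(
    total_spend=("amount", "sum"),
    n_transactions=("amount", "count"),
).reset_index()

# Product-level granularity: revenue per product.
by_product = transactions.groupby("product_id")["amount"].sum().rename("revenue").reset_index()

print(by_customer)
print(by_product)
```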

 

 

Qualifications, Skills & Competencies:

 

  • Post Secondary Degree – Computer Science, Information Technology or other relevant degrees with curriculum related to data structures and analysis

  • Minimum 5 years of experience as an analyst
  • Minimum 5 years of knowledge of business intelligence tools and programming languages
  • Advanced skills in data analysis and profiling, data mapping, data modeling, data lakes, and analytics
  • Data Analytics: AWS Quicksight and Redshift
  • Data Migration: solid in SQL and ETL
  • Scripting and Integration: REST APIs, GraphQL, Nodejs, AWS Lambda/API Gateway
  • Experience working with data mining and performing quantitative analysis
  • Experience with Machine Learning algorithms and associated data sets
  • Business acumen; results-oriented
  • Proactive/takes initiative/self-starter
  • Excellent written and oral communication skills
  • Ability to create, coordinate and facilitate presentations
  • Time management, highly organized
  • Collaboration and Team Engagement
  • Analytical and Problem Solving
  • Data-driven/Metrics Driven


Job posted by
Abhilash P

Data Engineer

at software and consultancy company

Agency job
via Exploro Solutions
Amazon Web Services (AWS)
ETL
Informatica
Data Warehouse (DWH)
SQL
Bengaluru (Bangalore), Chennai • 6 - 8 yrs • ₹12L - ₹30L / yr

Primary Skills

  • 6 to 8 years of relevant work experience with ETL tools
  • Good knowledge of working with AWS cloud databases such as Aurora DB, the surrounding ecosystem, and tools (AWS DMS)
  • Experience migrating databases to the AWS cloud is mandatory
  • Sound knowledge of SQL and a procedural language
  • Solid experience writing complex SQL queries and optimizing SQL query performance
  • Knowledge of data ingestion patterns: one-off feeds, change data capture, and incremental batch loads (an illustrative sketch follows below)
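As a rough, hypothetical illustration of the incremental-batch ingestion pattern mentioned above: the orders table, watermark store, and SQLite connections are invented, and a real pipeline would typically use AWS DMS or an ETL tool instead.

```python
# Hypothetical sketch of incremental batch ingestion using a watermark column.
# Table and column names are illustrative; real jobs would use AWS DMS or an ETL tool.
import sqlite3

def read_watermark(conn: sqlite3.Connection) -> str:
    row = conn.execute(
        "SELECT last_loaded_at FROM etl_watermarks WHERE table_name = 'orders'"
    ).fetchone()
    return row[0] if row else "1970-01-01 00:00:00"

def incremental_load(source: sqlite3.Connection, target: sqlite3.Connection) -> None:
    watermark = read_watermark(target)
    # Pull only rows changed since the last successful load.
    rows = source.execute(
        "SELECT order_id, status, updated_at FROM orders WHERE updated_at > ?", (watermark,)
    ).fetchall()
    target.executemany(
        "INSERT OR REPLACE INTO orders (order_id, status, updated_at) VALUES (?, ?, ?)", rows
    )
    if rows:
        new_watermark = max(r[2] for r in rows)
        target.execute(
            "INSERT OR REPLACE INTO etl_watermarks (table_name, last_loaded_at) VALUES ('orders', ?)",
            (new_watermark,),
        )
    target.commit()
```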

 

Additional Skills :

Experience in Unix/Linux systems and writing shell scripts would be nice to have

Java knowledge would be an added advantage

Knowledge of Spark with Python (PySpark) for building ETL pipelines in the cloud would be preferable

Job posted by
ramya ramchandran

Snowflake Developer

at Our Client company is into Computer Software. (EC1)

Agency job
via Multi Recruit
ETL
Snowflake
Data engineering
SQL
DWH
Bengaluru (Bangalore) • 3 - 5 yrs • ₹12L - ₹15L / yr
  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Author data services using a variety of programming languages
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Snowflake Cloud Datawarehouse as well as SQL and Azure ‘big data’ technologies
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and Azure regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
  • Work in an Agile environment with Scrum teams.
  • Ensure data quality and help in achieving data governance.

Basic Qualifications

  • 3+ years of experience in a Data Engineer or Software Engineer role
  • Undergraduate degree required (Graduate degree preferred) in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
  • Experience using the following software/tools:
  • Experience with “Snowflake Cloud Datawarehouse”
  • Experience with Azure cloud services: ADLS, ADF, ADLA, AAS
  • Experience with data pipeline and workflow management tools
  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
  • Understanding of Datawarehouse (DWH) systems, and migration from DWH to data lakes/Snowflake
  • Understanding of ELT and ETL patterns and when to use each, and of data models and how to transform data into them (an illustrative ELT sketch follows this list)
  • Strong analytic skills related to working with unstructured datasets
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management
  • Experience supporting and working with cross-functional teams in a dynamic environment.
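Purely as a hedged illustration of the ELT pattern referenced above, using the snowflake-connector-python client; the account details, stage, and table names are placeholders, not a real environment.

```python
# Hypothetical ELT sketch with Snowflake: land raw data first, then transform in-warehouse.
# Connection parameters, stage, and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

# 1. Load: copy raw files from a named stage into a landing table (no transformation yet).
cur.execute(
    "COPY INTO raw_orders FROM @landing_stage/orders/ "
    "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
)

# 2. Transform: shape the data inside the warehouse with SQL (the "T" happens after the "L").
cur.execute("""
    CREATE OR REPLACE TABLE ANALYTICS.CORE.ORDERS AS
    SELECT order_id,
           TRY_TO_TIMESTAMP(order_ts)   AS order_ts,
           UPPER(TRIM(status))          AS status,
           TRY_TO_NUMBER(amount, 12, 2) AS amount
    FROM   raw_orders
    WHERE  order_id IS NOT NULL
""")

cur.close()
conn.close()
```

The point of ELT is that raw data lands in the warehouse first and the transformation runs inside Snowflake with SQL, whereas ETL transforms data before it is loaded.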
Job posted by
Fiona RKS

ETL Developer

at PriceSenz

Founded 2015  •  Services  •  20-100 employees  •  Profitable
ETL
SQL
Informatica PowerCenter
Remote only • 2 - 15 yrs • ₹1L - ₹20L / yr

If you are an outstanding ETL Developer with a passion for technology and looking forward to being part of a great development organization, we would love to hear from you. We are offering technology consultancy services to our Fortune 500 customers with a primary focus on digital technologies. Our customers are looking for top-tier talents in the industry and willing to compensate based on your skill and expertise. The nature of our engagement is Contract in most cases. If you are looking for the next big step in your career, we are glad to partner with you. 

 

Below is the job description for your review.

  • Extensive hands-on experience in designing and developing ETL packages using SSIS
  • Extensive experience in performance tuning of SSIS packages
  • In-depth knowledge of data warehousing concepts and ETL systems, and of relational databases like SQL Server 2012/2014

Job posted by
Karthik Padmanabhan

Talend Developer

at Helical IT Solution

Founded 2012  •  Products & Services  •  20-100 employees  •  Profitable
ETL
Big Data
TAC
PL/SQL
Relational Database (RDBMS)
MySQL
Hyderabad • 1 - 5 yrs • ₹3L - ₹8L / yr

ETL Developer – Talend

Job Duties:

  • The ETL Developer is responsible for the design and development of ETL jobs that follow standards and best practices and are maintainable, modular, and reusable.
  • Proficiency with Talend or Pentaho Data Integration / Kettle.
  • The ETL Developer will analyze and review complex object and data models and the metadata repository in order to structure the processes and data for better management and efficient access.
  • Working on multiple projects, and delegating work to Junior Analysts to deliver projects on time.
  • Training and mentoring Junior Analysts and building their proficiency in the ETL process.
  • Preparing mapping documents to extract, transform, and load data, ensuring compatibility with all tables and requirement specifications.
  • Experience in ETL system design and development with Talend / Pentaho PDI is essential.
  • Create quality rules in Talend.
  • Tune Talend / Pentaho jobs for performance optimization.
  • Write relational (SQL) and multidimensional (MDX) database queries (a brief illustrative example of each follows this list).
  • Functional knowledge of Talend Administration Center / Pentaho Data Integrator, job servers and load-balancing setup, and all their administrative functions.
  • Develop, maintain, and enhance unit test suites to verify the accuracy of ETL processes, dimensional data, OLAP cubes, and various forms of BI content including reports, dashboards, and analytical models.
  • Exposure to the MapReduce components of Talend / Pentaho PDI.
  • Comprehensive understanding and working knowledge of data warehouse loading, tuning, and maintenance.
  • Working knowledge of relational database theory and dimensional database models.
  • Creating and deploying Talend / Pentaho custom components is an add-on advantage.
  • Java knowledge is nice to have.
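For orientation only, a hypothetical pair of queries contrasting the relational (SQL) and multidimensional (MDX) query styles mentioned in the duties above; the fact table, dimension, cube, and member names are invented for illustration.

```python
# Hypothetical comparison of a relational (SQL) query and a multidimensional (MDX) query.
# The sales tables, the Sales cube, and all member names are invented for illustration.

SQL_QUERY = """
SELECT c.calendar_year, SUM(f.sales_amount) AS sales_amount
FROM   fact_sales f
JOIN   dim_date c ON c.date_key = f.date_key
GROUP  BY c.calendar_year
ORDER  BY c.calendar_year;
"""

MDX_QUERY = """
SELECT {[Measures].[Sales Amount]} ON COLUMNS,
       [Date].[Calendar Year].Members ON ROWS
FROM   [Sales];
"""

if __name__ == "__main__":
    print("Relational query:\n", SQL_QUERY)
    print("Multidimensional query:\n", MDX_QUERY)
```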

Skills and Qualification:

  • BE, B.Tech / MS Degree in Computer Science, Engineering or a related subject.
  • 3+ years of experience.
  • Proficiency with Talend or Pentaho Data Integration / Kettle.
  • Ability to work independently.
  • Ability to handle a team.
  • Good written and oral communication skills.
Job posted by
Niyotee Gupta