ETL Jobs in Pune

Explore top ETL job opportunities in Pune from top companies and startups. All jobs are added by verified employees, who can be contacted directly below.
Posted by Alfiya Khan
Pune, Bengaluru (Bangalore)
6 - 8 yrs
₹15L - ₹25L / yr
Big Data
Data Warehouse (DWH)
Data modeling
Apache Spark
Data integration
+10 more
Company Profile
XpressBees – a logistics company started in 2015 – is among the fastest-growing companies in its sector. While we started rather humbly in the space of e-commerce B2C logistics, the last five years have seen us steadily expand our presence. Our vision of evolving into a strong full-service logistics organization is reflected in our new lines of business, such as 3PL, B2B Xpress, and cross-border operations. Our strong domain expertise and constant focus on meaningful innovation have helped us rapidly evolve into one of India's most trusted logistics partners. We have progressively built best-in-class technology platforms, an extensive network reach, and a seamless last-mile management system. While on this aggressive growth path, we seek to become the one-stop shop for end-to-end logistics solutions. Our big focus areas for the very near future include strengthening our presence as the service provider of choice and leveraging the power of technology to improve efficiencies for our clients.

Job Profile
As a Lead Data Engineer on the Data Platform Team at XpressBees, you will build the data platform and infrastructure that support high-quality, agile decision-making across our supply chain and logistics workflows. You will define how we collect and operationalize data (structured and unstructured), and build production pipelines for our machine learning models and for our real-time, near-real-time, and batch reporting and dashboarding requirements. You will use your experience with modern cloud and data frameworks to build products (with storage and serving systems) that drive optimisation and resilience in the supply chain through data visibility, intelligent decision-making, insights, anomaly detection, and prediction.

What You Will Do
• Design and develop the data platform and data pipelines for reporting, dashboarding, and machine learning models. These pipelines will productionize machine learning models and integrate with agent review tools (a minimal sketch follows this list).
• Meet data completeness, correctness, and freshness requirements.
• Evaluate and select the data store and data streaming technologies.
• Lead the design of the logical model and implement the physical model to support business needs. Produce logical and physical database designs across platforms (MPP, MapReduce, Hive/Pig) that are optimal for different use cases (structured and semi-structured data), and choose the data modelling, physical design, and performance optimization approach each problem requires.
• Support your colleagues by reviewing code and designs.
• Diagnose and solve issues in our existing data pipelines, and envision and build their successors.
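
To make the pipeline work above concrete, here is a minimal sketch of the kind of batch pipeline the role describes, written in PySpark. The bucket paths, column names, and checks are illustrative assumptions, not details from the listing.

    # Minimal PySpark batch pipeline sketch: ingest raw shipment events,
    # apply basic completeness/correctness checks, and publish a
    # partitioned Parquet table for reporting and ML consumers.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("shipment-events-etl").getOrCreate()

    raw = spark.read.json("s3a://xb-raw/shipment_events/")  # hypothetical source path

    cleaned = (
        raw.dropDuplicates(["event_id"])             # completeness: no double-counted events
           .filter(F.col("event_ts").isNotNull())    # correctness: drop rows missing timestamps
           .withColumn("dt", F.to_date("event_ts"))  # date partition for downstream readers
    )

    # Basic freshness/volume check: abort the run if no valid events arrived.
    if cleaned.count() == 0:
        raise RuntimeError("no valid events ingested in this batch")

    cleaned.write.mode("overwrite").partitionBy("dt").parquet(
        "s3a://xb-curated/shipment_events/"          # hypothetical curated zone
    )

In a production version, the same job would be scheduled and monitored, and the checks would be reported rather than simply raising.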

Qualifications & Experience Relevant for the Role

• A bachelor's degree in Computer Science or a related field, with 6 to 9 years of technology experience.
• Knowledge of relational and NoSQL data stores, stream processing, and micro-batching, sufficient to make technology and design choices.
• Strong experience in system integration, application development, ETL, and data platform projects; talented across the technologies used in the enterprise space.
• Software development experience, including:
  • Expertise in relational and dimensional modelling
  • Exposure across the full SDLC
  • Experience in cloud architecture (AWS)
• A proven track record of maintaining existing technical skills and developing new ones, so that you can make strong contributions to deep architecture discussions around systems and applications in the cloud (AWS).
• The qualities of a forward thinker and self-starter who flourishes with new challenges and learns quickly.
• Ability to work with cross-functional teams of consulting professionals across multiple projects.
• A knack for helping an organization understand application architectures and integration approaches, architect advanced cloud-based solutions, and launch the build-out of those systems.
• A passion for educating, training, designing, and building end-to-end systems.
Posted by Komal Samudrala
Hyderabad, Bengaluru (Bangalore), Pune
5 - 8 yrs
₹25L - ₹32L / yr
Alteryx
AWS CloudFormation
Google Cloud Platform (GCP)
ETL
SQL
+3 more
Alteryx Engineer

Job Description

Qualifications

• Like us, you're a high performer who's an expert at your craft, constantly challenging the status quo
• 5+ years in a data-focused business development/alliances, sales engineering, solutions architect, consulting, or engineering role
• Experience with complex enterprise systems and strong problem-solving skills
• Deep experience with one or more cloud platforms and architectures (AWS, GCP, Azure)
• Experience with one or more enterprise-level security deployments: Single Sign-On (SSO), Active Directory (AD), Lightweight Directory Access Protocol (LDAP), Kerberos, or equivalent
• Data industry experience in analytic applications, application integration, Extract Transform Load (ETL), Extract Load Transform (ELT), business intelligence, data visualization, Software as a Service (SaaS), Platform as a Service (PaaS), and cloud data warehouse/data lake technologies (BigQuery/Snowflake/Redshift/Databricks/Synapse)
• Demonstrable working knowledge of Unix/Linux OS and filesystems
• Demonstrable working knowledge of SQL
• Excellent verbal, written, and presentation skills
• Comfortable and quick with learning new technologies as needed
• Experience with Kubernetes
• Experience selling enterprise subscription-based cloud software
• Experience supporting workflows using Airflow or other enterprise workflow tools (see the sketch after this list)
• Some programming experience in Python/Java/R
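
Since the qualifications mention Airflow, here is a minimal sketch of an Airflow DAG wiring an extract step to a load step. It assumes Airflow 2.4+ (where `schedule` replaces `schedule_interval`); the DAG id and task bodies are placeholders, not details from the listing.

    # Two-step daily workflow sketch: extract, then load.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_orders():
        print("extracting orders")       # placeholder for the real extract

    def load_warehouse():
        print("loading warehouse")       # placeholder for the real load

    with DAG(
        dag_id="nightly_orders_elt",     # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_orders)
        load = PythonOperator(task_id="load", python_callable=load_warehouse)
        extract >> load                  # run extract before load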

Responsibilities

• Engage partners at every level; this ranges from executive-level discussions on overall business strategy to deep technical engagements with product and engineering teams
• Develop and continually refine deep knowledge of the Trifacta product, which is part of the Alteryx Analytic Cloud
• Be data savvy and proficient at communicating data manipulation concepts
• Create and execute high-impact technical/architectural presentations and top-notch programs/workshops/demos for partner technical and architectural enablement
• Enable and support partner-led presentations, demonstrations, and technical evaluations by providing a technical environment and product expertise
• Distill and communicate partner needs and product feedback to Product Management, Engineering, Marketing, and Sales
• Collaborate with Alteryx's Cloud Alliances leadership and Product Management to develop comprehensive technical plans for strategic partners, including identifying, incubating, and bringing to market service/solution offerings based on the Alteryx cloud platform and services
• Provide oversight, guidance, and assistance during the partner's sales process to ensure mutual success
• Represent Alteryx at partner events, and work with partners to develop integrated solutions, demos, joint blog posts, and whitepapers


Posted by Komal Samudrala
Hyderabad, Bengaluru (Bangalore), Pune
3 - 5 yrs
₹10L - ₹15L / yr
ETL
T-SQL
Azure Data Factory
Data Warehouse (DWH)
Informatica
+2 more

Data Engineer

 

ESSENTIAL DUTIES AND RESPONSIBILITIES

  • Data warehouse architecture;
  • Data model design;
  • Data pipeline maintenance and testing;
  • Machine learning algorithm deployment;
  • Managing data and metadata;
  • Setting up data-access tools;
  • Maintaining system reliability;
  • Other related duties as required.

JOB REQUIREMENTS

  • Good understanding of data warehouse models, including data marts and data lakes;
  • Strong skillset in T-SQL, DAX, and Power Query;
  • Strong skillset in Azure Data Factory, Azure Synapse, and Power BI;
  • Knowledge or experience in handling security-sensitive data;
  • Good understanding of ETL fundamentals and building efficient data pipelines;
  • Prior experience developing, integrating, maintaining, monitoring, and performance-tuning ETL jobs, data pipelines, APIs, data extracts, and ad hoc queries;
  • Strong organizational and time management skills;
  • Strong interpersonal and communication skills (both oral and written) and the ability to work well with employees at all levels of the organization;
  • Able to work independently and collaboratively in a team environment.

EDUCATION, EXPERIENCE AND/OR CREDENTIALS

  • BA/BS in Computer Science or similar technical discipline (or equivalent experience);
  • 3+ years in a Data Engineering or Data Warehousing role;
  • 3+ years of Python coding experience;
  • Cloud-certified Azure Data Engineer preferred.

 

Pune
2 - 5 yrs
₹3L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
SQL
Oracle
Job Description:

Roles and Responsibilities:
  • Designing and coding the data warehousing system to desired company specifications 
  • Conducting preliminary testing of the warehousing environment before data is extracted
  • Extracting company data and transferring it into the new warehousing environment
  • Testing the new storage system once all the data has been transferred
  • Troubleshooting any issues that may arise
  • Providing maintenance support
  • Consulting with data management teams to get a big-picture idea of the company’s data storage needs
  • Presenting the company with warehousing options based on their storage needs
Requirements:
  • Experience of 1-3 years in Informatica PowerCenter
  • Excellent knowledge of Oracle database and PL/SQL, including stored procedures, functions, user-defined functions, table partitioning, indexes, views, etc.
  • Knowledge of SQL Server database
  • Hands-on experience in Informatica PowerCenter and in database performance tuning and optimization, including complex query optimization techniques
  • Understanding of ETL control frameworks
  • Experience in UNIX shell/Perl scripting
  • Good communication skills, including the ability to write clearly
  • Able to function effectively as a member of a team
  • Proactive with respect to personal and technical development

A reputed firm providing world-class consulting services

Agency job
via Jobdost by Saida Jabbar
Ahmedabad, Hyderabad, Pune, Delhi
5 - 8 yrs
₹25L - ₹30L / yr
Snowflake schema
Amazon Web Services (AWS)
AWS Lambda
ETL
Informatica
+1 more

Data Engineer 

 

Mandatory Requirements 

  • Expertise in ETL and Snowflake
  • Experience in AWS ETL using AWS Glue and AWS Lambda
  • Proficient in blob storage and data lakes
  • Understanding of file-based ingestion best practices.

CORE RESPONSIBILITIES

  • Ingest data from different data sources that expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems, and implement ingestion and processing with the help of Big Data technologies (see the sketch after this list)
  • Process and transform data using various technologies such as Spark and cloud services; understand your part of the business logic and implement it using the language supported by the base data platform
  • Develop automated data quality checks to make sure the right data enters the platform, and verify the results of the calculations
  • Develop an infrastructure to collect, transform, combine, and publish/distribute customer data
  • Define process improvement opportunities to optimize data collection, insights, and displays
  • Ensure data and results are accessible, scalable, efficient, accurate, complete, and flexible
  • Identify and interpret trends and patterns in complex data sets
  • Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders
  • Be a key participant in regular Scrum ceremonies with the agile teams
  • Be proficient at developing queries, writing reports, and presenting findings
  • Mentor junior members and bring in industry best practices
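
As a sketch of the ingestion and quality-check responsibilities above, here is the shape of an AWS Glue job script (Glue jobs are PySpark scripts with Glue's bootstrap around them). It runs only inside AWS Glue, and the bucket paths and the invoice_id column are illustrative assumptions.

    import sys

    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext

    # Standard Glue bootstrap: resolve job arguments and initialise contexts.
    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext.getOrCreate())
    spark = glue_context.spark_session
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # File-based ingestion: read a hypothetical raw CSV drop from the lake.
    df = spark.read.option("header", "true").csv("s3://lake-raw/invoices/")

    # Automated data quality check: reject the batch if mandatory keys are missing.
    bad = df.filter("invoice_id IS NULL").count()
    if bad > 0:
        raise ValueError(f"{bad} rows missing invoice_id; aborting load")

    df.write.mode("append").parquet("s3://lake-curated/invoices/")
    job.commit()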

 

QUALIFICATIONS

  • 5-7+ years' experience as a data engineer in the consumer finance, manufacturing, or oil & gas industry
  • Strong background in math, statistics, computer science, data science, or a related discipline
  • Advanced knowledge of one of Python, R, or C#
  • Production experience with HDFS, YARN, Hive, Spark, Kafka, Azure, Docker/Kubernetes, SQL Server, Synapse, Snowflake, and AWS
  • Proficient with:
    • Data mining/programming tools (e.g. SAS, SQL, R, Python)
    • Database technologies (e.g. MongoDB, PostgreSQL, Redshift, Snowflake, and Greenplum)
    • Data visualization (e.g. Tableau, Power BI, Qlik Sense)
  • Comfortable learning about and deploying new technologies and tools
  • Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines
  • Good written and oral communication skills and the ability to present results to non-technical audiences
  • Knowledge of business intelligence and analytical tools, technologies, and techniques

 

Posted by Jyoti Kaushik
Noida, Bengaluru (Bangalore), Pune, Hyderabad
4 - 7 yrs
₹4L - ₹16L / yr
ETL
SQL
Data Warehouse (DWH)
Informatica
Datawarehousing
+2 more

We are looking for a Senior Data Engineer to join the Customer Innovation team, who will be responsible for acquiring, transforming, and integrating customer data onto our Data Activation Platform from customers’ clinical, claims, and other data sources. You will work closely with customers to build data and analytics solutions to support their business needs, and be the engine that powers the partnership that we build with them by delivering high-fidelity data assets.

In this role, you will work closely with our Product Managers, Data Scientists, and Software Engineers to build the solution architecture that will support customer objectives. You'll work with some of the brightest minds in the industry, work with one of the richest healthcare data sets in the world, use cutting-edge technology, and see your efforts affect products and people on a regular basis. The ideal candidate is someone who

  • Has healthcare experience and is passionate about helping heal people,
  • Loves working with data,
  • Has an obsessive focus on data quality,
  • Is comfortable with ambiguity and making decisions based on available data and reasonable assumptions,
  • Has strong data interrogation and analysis skills,
  • Defaults to written communication and delivers clean documentation, and,
  • Enjoys working with customers and problem solving for them.

A day in the life at Innovaccer:

  • Define the end-to-end solution architecture for projects by mapping customers' business and technical requirements against the suite of Innovaccer products and solutions.
  • Measure and communicate impact to our customers.
  • Enable customers to activate data themselves using SQL, BI tools, or APIs, so they can answer their questions at speed.

What You Need:

  • 4+ years of experience in a Data Engineering role, and a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
  • 4+ years of experience working with relational databases like Snowflake, Redshift, or Postgres.
  • Intermediate to advanced SQL programming skills (see the sketch after this list).
  • Data analytics and visualization skills (using tools like Power BI).
  • The ability to engage with both the business and technical teams of a client, and to document and explain technical problems or concepts in a clear and concise way.
  • Ability to work in a fast-paced and agile environment.
  • Easily adapt and learn new things, whether it's a new library, framework, process, or visual design concept.
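
As a small illustration of the SQL skills above, here is a data-interrogation query run from Python with psycopg2. The connection string, schema, and column names are illustrative assumptions, and the FILTER clause is Postgres-flavoured SQL.

    # Per-source-file load audit on a hypothetical claims staging table.
    import psycopg2

    conn = psycopg2.connect("dbname=clinical user=etl host=localhost")  # placeholder DSN
    with conn, conn.cursor() as cur:
        cur.execute("""
            SELECT source_file,
                   COUNT(*) AS rows_loaded,
                   COUNT(*) FILTER (WHERE member_id IS NULL) AS missing_member_id
            FROM staging.claims
            GROUP BY source_file
            ORDER BY missing_member_id DESC;
        """)
        for source_file, rows_loaded, missing in cur.fetchall():
            print(source_file, rows_loaded, missing)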

What we offer:

  • Industry certifications: We want you to be a subject matter expert in what you do. So, whether it’s our product or our domain, we’ll help you dive in and get certified.
  • Quarterly rewards and recognition programs: We foster learning and encourage people to take risks. We recognize and reward your hard work.
  • Health benefits: We cover health insurance for you and your loved ones.
  • Sabbatical policy: We encourage people to take time off and rejuvenate, learn new skills, and pursue their interests so they can generate new ideas with Innovaccer.
  • Pet-friendly office and open floor plan: No boring cubicles.
Posted by Baiju Sukumaran
Vadodara, Bengaluru (Bangalore), Ahmedabad, Pune, Kolkata, Hyderabad
6 - 8 yrs
Best in industry
Datawarehousing
Microsoft Windows Azure
ETL
Relational Database (RDBMS)
SQL Server Integration Services (SSIS)
+7 more
Technical Skills
Mandatory (Minimum 4 years of working experience)
• 3+ years of experience leading data warehouse implementations, covering technical architectures, ETL/ELT, reporting/analytic tools, and scripting (end-to-end implementation)
• Experienced in Microsoft Azure (Azure SQL Managed Instance, Data Factory, Azure Synapse, Azure Monitor, Azure DevOps, Event Hubs, Azure AD security)
• Deep experience in using BI tools such as Power BI/Tableau, QlikView/SAP BO, etc.
• Experienced in ETL tools such as SSIS and Talend/Informatica/Pentaho
• Expertise in using RDBMSs like Oracle and SQL Server as source or target, and in online analytical processing (OLAP)
• Experienced in SQL/T-SQL: DML/DDL statements, stored procedures, functions, triggers, indexes, cursors
• Expertise in building and organizing advanced DAX calculations and SSAS cubes
• Experience in data/dimensional modelling, analysis, design, testing, development, and implementation
• Experienced in advanced data warehouse concepts using structured, semi-structured, and unstructured data
• Experienced with real-time ingestion, change data capture, and real-time & batch processing
• Good knowledge of metadata management and data governance
• Great problem-solving skills, with a strong bias for quality and design excellence
• Experienced in developing dashboards with a focus on usability, performance, flexibility, testability, and standardization
• Familiarity with development in cloud environments like AWS/Azure/Google

Good To Have (1+ years of working experience)
• Experience working with Snowflake or Amazon Redshift

Soft Skills
• Good verbal and written communication skills
• Ability to collaborate and work effectively in a team
• Excellent analytical and logical skills
Posted by Baiju Sukumaran
Remote, Vadodara, Bengaluru (Bangalore), Pune, Ahmedabad
7 - 9 yrs
Best in industry
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+6 more
Key Responsibilities
• Interpret and map business, functional, and non-functional requirements to technical specifications
• Interact with diverse stakeholders such as clients, the project manager/scrum master, business analysts, testing, and other cross-functional teams as part of Business Intelligence projects
• Develop solutions following established technical designs, application development standards, and quality processes, delivering efficient, reusable, and reliable code with complete ownership
• Assess the impact of changes in functional requirements on the technical design
• Develop the full life cycle of a BI project (data movement and visualization), including requirements analysis, platform selection, architecture, application design and development, testing, and deployment
• Provide architectural leadership with a strong emphasis on data architecture around ETL and governance
• Troubleshoot highly complex technical problems in OLAP/OLTP/DW-based environments
• Provide support for application bugs or issues within defined SLAs
• Proactively identify and communicate technical risks, issues, and challenges, with mitigations
• Perform independent code reviews and guide junior team members in corrections

Must-Have Technical Skills:
• 3+ years of experience leading data warehouse implementations, covering technical architectures, ETL/ELT, reporting/analytic tools, and scripting (end-to-end implementation)
• Experienced in Microsoft Azure (Azure SQL Managed Instance, Data Factory, Azure Synapse, Azure Monitor, Azure DevOps, Event Hubs, Azure AD security)
• Deep experience in using BI tools such as Power BI/Tableau, QlikView/SAP BO, etc.
• Experienced in ETL tools such as SSIS and Talend/Informatica/Pentaho
• Expertise in using RDBMSs like Oracle and SQL Server as source or target, and in online analytical processing (OLAP)
• Experienced in SQL/T-SQL: DML/DDL statements, stored procedures, functions, triggers, indexes, cursors
• Expertise in building and organizing advanced DAX calculations and SSAS cubes
• Experience in data/dimensional modelling, analysis, design, testing, development, and implementation
• Experienced in advanced data warehouse concepts using structured, semi-structured, and unstructured data
• Experienced with real-time ingestion, change data capture, and real-time & batch processing
• Good knowledge of metadata management and data governance
• Great problem-solving skills, with a strong bias for quality and design excellence
• Experienced in developing dashboards with a focus on usability, performance, flexibility, testability, and standardization
• Familiarity with development in cloud environments like AWS/Azure/Google
Bengaluru (Bangalore), Hyderabad, Pune, Indore, Gurugram, Noida
10 - 17 yrs
₹25L - ₹50L / yr
Product Management
Big Data
Data Warehouse (DWH)
ETL
Hi all,
Greetings! We are looking for a Product Manager for our data modernization product. We need a resource with good knowledge of Big Data/DWH, plus strong stakeholder management and presentation skills.
Posted by Amala Baby
Pune
2 - 4 yrs
₹2L - ₹20L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+12 more

Company Profile:

 

Easebuzz is a payment solutions company (a fintech organisation) that enables online merchants to accept, process, and disburse payments through developer-friendly APIs. We focus on building plug-and-play products, including the payment infrastructure, to solve complete business problems. It is definitely a wonderful place where all the action related to payments, lending, subscriptions, and eKYC happens at the same time.

We have been consistently profitable and are constantly developing new, innovative products; as a result, we grew 4x over the past year alone. We are well capitalised and closed a fundraise of $4M in March 2021 from prominent VC firms and angel investors. The company is based out of Pune and has a total strength of 180 employees. Easebuzz's corporate culture is tied to the vision of building a workplace that breeds open communication and minimal bureaucracy. An equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.

Easebuzz Pvt. Ltd. has a presence in Pune, Bangalore, and Gurugram.

 


Salary: As per company standards.

 

Designation: Data Engineer

 

Location: Pune

 

Experience with ETL, Data Modeling, and Data Architecture

Design, build, and operationalize large-scale enterprise data solutions and applications using AWS data and analytics services, in combination with third-party tools such as Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, and Glue.

Experience with AWS cloud data lake for development of real-time or near real-time use cases

Experience with messaging systems such as Kafka/Kinesis for real time data ingestion and processing
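
To illustrate the messaging requirement above, here is a minimal sketch of real-time ingestion with Kinesis via boto3. The stream name, region, and payload shape are assumptions for the example, not details from the listing.

    import json

    import boto3

    kinesis = boto3.client("kinesis", region_name="ap-south-1")

    def publish_payment_event(event: dict) -> None:
        """Push one payment event onto a hypothetical Kinesis stream."""
        kinesis.put_record(
            StreamName="payments-events",            # hypothetical stream name
            Data=json.dumps(event).encode("utf-8"),
            PartitionKey=event["merchant_id"],       # keeps a merchant's events ordered
        )

    publish_payment_event({"merchant_id": "m-42", "amount": 129.0, "status": "captured"})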

Build data pipeline frameworks to automate high-volume and real-time data delivery

Create prototypes and proof-of-concepts for iterative development.

Experience with NoSQL databases, such as DynamoDB, MongoDB etc

Create and maintain optimal data pipeline architecture.

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.


Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.

Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.

Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.

Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.

Evangelize a very high standard of quality, reliability and performance for data models and algorithms that can be streamlined into the engineering and sciences workflow

Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.

 

Employment Type

Full-time

 


Client of People First Consultants

Agency job
Pune, Hyderabad
3 - 6 yrs
₹4L - ₹8L / yr
Python
NumPy
pandas
Django
Flask
+2 more

Key skills: Python, NumPy, pandas, SQL, ETL

Roles and Responsibilities:

 

- The work will involve the development of workflows triggered by events from other systems

- Design, develop, test, and deliver software solutions in the FX Derivatives group

- Analyse requirements for the solutions you deliver, to ensure that they provide the right solution

- Develop easy-to-use documentation for the frameworks and tools you develop, for adoption by other teams

- Familiarity with event-driven programming in Python (see the sketch below)
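
As a minimal sketch of event-driven programming in Python (the event names and handler below are made up for illustration, not part of the listing):

    from collections import defaultdict
    from typing import Callable

    # Registry mapping an event type to the handlers subscribed to it.
    _handlers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def on(event_type: str):
        """Decorator that subscribes a handler to an event type."""
        def register(fn):
            _handlers[event_type].append(fn)
            return fn
        return register

    def dispatch(event: dict) -> None:
        """Invoke every handler registered for the event's type."""
        for handler in _handlers[event["type"]]:
            handler(event)

    @on("trade.booked")
    def recompute_risk(event: dict) -> None:
        print("recomputing risk for", event["trade_id"])

    dispatch({"type": "trade.booked", "trade_id": "FX-123"})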

- Must have unit testing and debugging skills

- Good problem-solving and analytical skills

- Python packages such as NumPy and scikit-learn

- Testing and debugging applications

- Developing back-end components


With a global provider of Business Process Management services.

Agency job
via Jobdost by Mamatha A
Bengaluru (Bangalore), Pune, Delhi, Gurugram, Nashik, Vizag
3 - 5 yrs
₹8L - ₹12L / yr
Oracle
Business Intelligence (BI)
PowerBI
Oracle Warehouse Builder
Informatica
+3 more
Oracle BI developer with 6+ years of experience working on Oracle warehouse design, development, and testing
Good knowledge of Informatica ETL and Oracle Analytics Server
Analytical ability to design warehouses as per user requirements, mainly in the Finance and HR domains
Good skills in analyzing existing ETL and dashboards to understand the logic and make enhancements as per requirements
Good communication skills, including written communication
Qualifications
Master's or Bachelor's degree in Engineering/Computer Science/Information Technology
Additional information
Excellent verbal and written communication skills
Posted by Ashwini Dhaipule
Pune
6 - 10 yrs
₹6L - ₹15L / yr
Software Testing (QA)
Shell Scripting
Data management
ETL QA
ETL
+3 more

ABOUT US:
Pingahla was founded by a group of people passionate about making the world a better place by harnessing the power of data. We are a data management firm with offices in New York and India. Our mission is to help transform the way companies operate and think about their business. We make it easier to adopt and stay ahead of the curve in the ever-changing digital landscape. One of our core beliefs is excellence in everything we do!

JOB DESCRIPTION:
Pingahla is recruiting an ETL & BI Test Manager who can build and lead a team and establish infrastructure, processes, and best practices for our Quality Assurance vertical. Candidates are expected to have at least 5+ years of experience with ETL testing and with testing on data management projects. Being a growing company, we are able to provide very good career opportunities and very attractive remuneration.

JOB ROLE & RESPONSIBILITIES:
• Plan and manage the testing activities;
• Defect management and weekly & monthly test report generation;
• Work as a Test Manager to design the test strategy and approach for the DW&BI (ETL & BI) solution;
• Provide leadership and direction to the team on quality standards and testing best practices;
• Ensure that project deliverables are produced, including but not limited to quality assurance plans, test plans, testing priorities, status reports, user documentation, and online help; manage and motivate teams to accomplish significant deliverables within tight deadlines;
• Test data management; review and approve all test cases prior to execution;
• Coordinate and review offshore work efforts for projects and maintenance activities.

REQUIRED SKILLSET:
• Experience in Quality Assurance management, program management, and DW (ETL & BI) management
• Minimum 5 years in ETL testing, with at least 2 years in a team lead role
• Technical abilities complemented by sound communication skills, user interaction abilities, requirement gathering and analysis, and skills in data migration and conversion strategies
• Proficient in test definition; capable of developing test plans and test cases from technical specifications
• Able to single-handedly own complete delivery from the testing side
• Experience working with remote teams across multiple time zones
• Must have strong knowledge of QA processes and methodologies
• Strong UNIX and Perl scripting skills
• Expertise in ETL testing, with hands-on experience of ETL tools like Informatica or DataStage; PL/SQL is a plus
• Excellent problem-solving, analytical, and technical troubleshooting skills
• Familiarity with data management projects
• Eager to learn, adopt, and apply rapidly changing new technologies and methodologies
• Efficient and effective at approaching and escalating quality issues when appropriate

Pune, Hyderabad
6 - 12 yrs
₹11L - ₹25L / yr
PL/SQL
MySQL
SQL server
SQL
Linux/Unix
+4 more

We at Datametica Solutions Private Limited are looking for an SQL Lead / Architect who has a passion for the cloud, with knowledge of different on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.

Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.



Job Description :

Experience: 6+ Years

Work Location: Pune / Hyderabad



Technical Skills :

  • Good programming experience as an Oracle PL/SQL, MySQL, and SQL Server developer
  • Knowledge of database performance tuning techniques
  • Rich experience in database development
  • Experience in designing and implementing business applications using the Oracle relational database management system
  • Experience in developing complex database objects like stored procedures, functions, packages, and triggers using SQL and PL/SQL

Required Candidate Profile:

  • Excellent communication, interpersonal, and analytical skills, and a strong ability to drive teams
  • Analyzes data requirements and data dictionaries for moderate to complex projects
  • Leads data-model-related analysis discussions while collaborating with Application Development teams, Business Analysts, and Data Analysts during joint requirements analysis sessions
  • Translates business requirements into technical specifications, with an emphasis on highly available and scalable global solutions
  • Stakeholder management and client engagement skills
  • Strong communication skills (written and verbal)

About Us!

A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data, workloads, ETL, and analytics to the cloud by leveraging automation.

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

We have our own products!

Eagle – Data Warehouse Assessment & Migration Planning product

Raven – Automated Workload Conversion product

Pelican – Automated Data Validation product, which helps automate and accelerate data migration to the cloud.



Why join us!

Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over these years are the key factors in achieving our success.



Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy



Check out more about us on our website below!

www.datametica.com

Pune
4 - 7 yrs
₹5L - ₹15L / yr
ETL
Informatica PowerCenter
Teradata
Data Warehouse (DWH)
IBM InfoSphere DataStage
Requirements:
  • Must have 4 to 7 years of experience in ETL design and development using Informatica components.
  • Should have extensive knowledge of Unix shell scripting.
  • Understanding of DW principles (fact and dimension tables, dimensional modelling, and data warehousing concepts).
  • Research, develop, document, and modify ETL processes as per data architecture and modeling requirements.
  • Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
  • Should be good at writing complex SQL queries (see the sketch after this list).
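
As an example of the kind of complex SQL the role expects, the query below uses a window function to keep only the latest record per key, a common warehouse deduplication pattern. It uses sqlite3 purely so the sketch runs anywhere (SQLite 3.25+ supports window functions); the table and columns are made up.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE stg_orders(order_id INT, status TEXT, updated_at TEXT);
        INSERT INTO stg_orders VALUES
            (1, 'placed',  '2024-01-01'),
            (1, 'shipped', '2024-01-03'),
            (2, 'placed',  '2024-01-02');
    """)

    # Latest status per order: rank rows within each order_id by recency.
    rows = conn.execute("""
        SELECT order_id, status
        FROM (
            SELECT order_id, status,
                   ROW_NUMBER() OVER (
                       PARTITION BY order_id
                       ORDER BY updated_at DESC
                   ) AS rn
            FROM stg_orders
        )
        WHERE rn = 1;
    """).fetchall()
    print(rows)  # e.g. [(1, 'shipped'), (2, 'placed')]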
Opportunities:
  • Selected candidates will be provided training opportunities in one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka
  • Would get the chance to be part of enterprise-grade implementations of Cloud and Big Data systems
  • Will play an active role in setting up the modern data platform based on Cloud and Big Data
  • Would be part of teams with rich experience in various aspects of distributed systems and computing
Pune
3 - 8 yrs
₹3L - ₹20L / yr
Artificial Intelligence (AI)
Deep Learning
Machine Learning (ML)
Data extraction
+3 more
Responsibilities
● Frame ML/AI use cases that can improve the company's product
● Implement and develop ML/AI/data-driven rule-based algorithms as software items
● For example, build a chatbot that replies with an answer from the relevant FAQ, and reinforce the system with a feedback loop so that the bot improves

Must-have skills:
● Data extraction and ETL
● Python (NumPy, pandas, comfortable with OOP)
● Django
● Knowledge of basic Machine Learning / Deep Learning / AI algorithms, and the ability to implement them
● Good understanding of the SDLC
● Has deployed an ML/AI model in a mobile/web product
● Soft skills: strong communication skills and critical-thinking ability

Good to have:
● Full-stack development experience
Required Qualification:
B.Tech. / B.E. degree in Computer Science or an equivalent software engineering degree
Pune, Hyderabad
4 - 10 yrs
₹5L - ₹20L / yr
ETL
SQL
Data engineering
Analytics
PL/SQL
+3 more

We at Datametica Solutions Private Limited are looking for SQL Engineers who have a passion for the cloud, with knowledge of different on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.

Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.

Job Description

Experience : 4-10 years

Location : Pune

 


Mandatory Skills -

  • Strong in ETL/SQL development
  • Strong data warehousing skills
  • Hands-on experience working with Unix/Linux
  • Development experience on enterprise data warehouse projects
  • Good to have: experience working with Python and shell scripting

Opportunities -

  • Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps Tools, Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
  • Would get chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
  • Will play an active role in setting up the Modern data platform based on Cloud and Big Data
  • Would be part of teams with rich experience in various aspects of distributed systems and computing


 

About Us!

A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data, workloads, ETL, and analytics to the cloud by leveraging automation.

 

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

 

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

 

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

 

Why join us!

Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over these years are the key factors in achieving our success.

 

 

Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy

 

Check out more about us on our website below!

www.datametica.com

Pune
3 - 8 yrs
₹5L - ₹20L / yr
ETL
Data Warehouse (DWH)
IBM InfoSphere DataStage
DataStage
SQL
+1 more

Datametica is hiring a DataStage Developer.

  • Must have 3 to 8 years of experience in ETL design and development using IBM DataStage components.
  • Should have extensive knowledge of Unix shell scripting.
  • Understanding of DW principles (fact and dimension tables, dimensional modelling, and data warehousing concepts).
  • Research, develop, document, and modify ETL processes as per data architecture and modeling requirements.
  • Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
  • Should be good at writing complex SQL queries.

About Us!

A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data, workloads, ETL, and analytics to the cloud by leveraging automation.

 

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

 

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

 

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

 

Why join us!

Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over these years are the key factors in achieving our success.

 

Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy

 

Check out more about us on our website below!

www.datametica.com

 

Pune
12 - 20 yrs
₹20L - ₹35L / yr
Data Warehouse (DWH)
ETL
Big Data
Business Intelligence (BI)
Project Management
+1 more

Job Description

Experience : 10+ Years

Location : Pune


Job Requirements:

  • Minimum of 10+ years of experience with a proven record of increased responsibility
  • Hands-on experience in designing, developing, and managing Big Data, Cloud, data warehousing, and Business Intelligence projects
  • Experience managing projects in Big Data, Cloud, data warehousing, and Business Intelligence, using open-source or top-of-the-line tools and technologies
  • Good knowledge of dimensional modeling
  • Experience working with any ETL and BI reporting tools
  • Experience managing medium to large projects, preferably on Big Data
  • Proven experience in project planning, estimation, execution, and implementation of medium to large projects
  • Should be able to communicate effectively in English
  • Strong management and leadership skills, with a proven ability to develop and manage client relationships
  • Proven problem-solving skills from both technical and managerial perspectives
  • Attention to detail and a commitment to excellence and high standards
  • Excellent interpersonal and communication skills, both verbal and written
  • The position is remote, with occasional travel to other offices, client sites, conventions, training locations, etc.
  • Bachelor's degree in Computer Science, Business/Economics, or a related field, or demonstrated equivalent/practical knowledge or experience

Job Responsibilities:

  • Day-to-day project management, scrum, and agile management, including project planning, delivery, and execution of Big Data projects
  • Primary point of contact for the customer for all project engagements, delivery, and project escalations
  • Design the right architecture and technology stack depending on business requirements, across Cloud / Big Data and BI technologies, both on-premise and on cloud
  • Liaise with key stakeholders to define the Cloud / Big Data solutions roadmap and prioritize the deliverables
  • Responsible for end-to-end project delivery of Cloud / Big Data solutions, from project estimation, project planning, and resourcing to monitoring
  • Drive and participate in requirements-gathering workshops, estimation discussions, design meetings, and status review meetings
  • Support and assist the team in resolving issues during testing and when the system is in production
  • Be involved in the full customer lifecycle, with a goal to make customers successful and increase revenue and retention
  • Interface with the offshore engineering team to solve customer issues
  • Develop programs that meet customer needs with respect to functionality, performance, scalability, reliability, schedule, principles, and recognized industry standards
  • Requirement analysis and documentation
  • Manage day-to-day operational aspects of a project and its scope
  • Prepare for engagement reviews and quality assurance procedures
  • Visit and/or host clients to strengthen business relationships

at 1CH

Posted by Sathish Sukumaran
Chennai, Bengaluru (Bangalore), Hyderabad, NCR (Delhi | Gurgaon | Noida), Mumbai, Pune
4 - 15 yrs
₹10L - ₹25L / yr
Data engineering
Data engineer
ETL
SSIS
ADF
+3 more
  • Expertise in designing and implementing enterprise-scale database (OLTP) and data warehouse solutions.
  • Hands-on experience implementing Azure SQL Database, Azure SQL Data Warehouse (Azure Synapse Analytics), and big data processing using Azure Databricks and Azure HDInsight.
  • Expert in T-SQL programming for complex stored procedures, functions, views, and query optimization (see the sketch after this list).
  • Should be aware of database development for both on-premise and SaaS applications using SQL Server and PostgreSQL.
  • Experience in ETL and ELT implementations using Azure Data Factory V2 and SSIS.
  • Experience and expertise in building machine learning models using logistic and linear regression, decision tree, and random forest algorithms.
  • PolyBase queries for exporting and importing data into Azure Data Lake.
  • Building data models, both tabular and multidimensional, using SQL Server Data Tools.
  • Writing data preparation, cleaning, and processing steps using Python, Scala, and R.
  • Programming experience using the Python libraries NumPy, pandas, and Matplotlib.
  • Implementing NoSQL databases and writing queries using Cypher.
  • Designing end-user visualizations using Power BI, QlikView, and Tableau.
  • Experience working with all versions of SQL Server 2005/2008/2008R2/2012/2014/2016/2017/2019.
  • Experience using the expression languages MDX and DAX.
  • Experience in migrating on-premise SQL Server databases to Microsoft Azure.
  • Hands-on experience using Azure Blob Storage, Azure Data Lake Storage Gen1, and Azure Data Lake Storage Gen2.
  • Performance tuning complex SQL queries; hands-on experience using SQL Extended Events.
  • Data modeling using Power BI for ad hoc reporting.
  • Raw data load automation using T-SQL and SSIS.
  • Expert in migrating existing on-premise databases to SQL Azure.
  • Experience using U-SQL for Azure Data Lake Analytics.
  • Hands-on experience generating SSRS reports using MDX.
  • Experience designing predictive models using Python and SQL Server.
  • Developing machine learning models using Azure Databricks and SQL Server.
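
As a small illustration of the T-SQL work above, here is a parameterized query against Azure SQL using pyodbc. The server, database, credentials, and table are placeholders, not details from the listing.

    import pyodbc

    # Hypothetical Azure SQL connection; fill in real credentials.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=myserver.database.windows.net;"
        "DATABASE=dw;UID=etl_user;PWD=...;Encrypt=yes;"
    )
    cursor = conn.cursor()

    # Parameterized TOP keeps the query plan reusable.
    cursor.execute(
        """
        SELECT TOP (?) customer_id, SUM(amount) AS total
        FROM dbo.fact_sales
        GROUP BY customer_id
        ORDER BY total DESC;
        """,
        10,
    )
    for row in cursor.fetchall():
        print(row.customer_id, row.total)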

European MNC

Agency job
via Kavayah People Consulting by Kavita Singh
Pune
3 - 8 yrs
₹8L - ₹15L / yr
ETL
Data Warehouse (DWH)
SQL
Technical support
The Support Engineer (L2) will serve as a technical support champion for both internal and external customers. 
Key Responsibilities
Support mission-critical applications and technologies
Adhere to agreed SLAs.
 
Required Experience, Skills and Qualifications
3-8 years of relevant experience
Proven track record of supporting ETL/Data Warehouse/Business Intelligence solutions
Strong SQL / Unix skills
Excellent written and verbal communication
High degree of analytical and problem-solving skills
Exposure to handling customers from various geographies
Strong debugging and troubleshooting skills
Ability to work with minimum supervision
Team player who shares ideas and resources
Tools and Technologies
ETL Tools: Talend or Informatica experience
BI Tools: Experience supporting Tableau or Jaspersoft or Pentaho or Qlikview
Database: Experience in Oracle or any RDBMS

Swiss Healthcare MNC

Agency job
via Kavayah People Consulting by Kavita Singh
Pune
2 - 5 yrs
₹10L - ₹14L / yr
ETL
Datawarehousing
Data Warehouse (DWH)
SQL
Informatica

• Review all job requirements and specifications required for deploying the solution into the production environment.

• Perform various unit tests as per the checklist of deployment steps, with the help of test cases, and maintain documents for the same.

• Work with the Lead to resolve all issues within the required timeframe and flag any delays.

• Collaborate with the development team to review new programs for implementation activities, manage communication (if required) with different functions to resolve issues, and assist implementation leads in managing production deployments.

• Document all issues during the deployment phase, document all findings from logs and during the actual deployment, and share the analysis.

• Review and maintain all technical and business documents. Conduct and monitor the software implementation lifecycle, and assist with/make appropriate customizations to the software for clients as per the deployment/implementation guide.

• Train new members on product deployment and issues, and identify issues in processes and provide solutions for the same.

• Ensure project tasks are appropriately updated in JIRA / the ticketing tool (in progress/done) and raise issues.

• Should take self-initiative to learn and understand the technologies, i.e. Vertica SQL, the internal data integration tool (Athena), the Pulse framework, and Tableau.

• Flexible to work during non-business hours in some exceptional cases (for a few days), as required to meet client time zones.

Experience with the following tools and technologies is preferred:

ETL tools: Talend, Informatica, Ab Initio, or DataStage

BI tools: Tableau, Jaspersoft, Pentaho, or QlikView

Database: Experience in Oracle or SQL Server

Methodology: Experience in SDLC and/or Agile methodology

Agency job
via Nu-Pie by Jerrin Thomas
Pune, Hyderabad, Mumbai, Bengaluru (Bangalore)
6 - 10 yrs
₹6L - ₹13L / yr
MicroStrategy
OLAP
SQL
ETL
  • 5 years of experience in Business Intelligence development.
  • Experience with MicroStrategy toolset: Desktop, Report Services, Architect, OLAP, Administrator
  • Strong experience in design, creation, and deployment of reports and dashboards
  • Experience with designing reusable MicroStrategy components for business reporting
  • Excellent communication skills to interact with users at various levels within the organization.
  • Strong SQL skills to perform queries, data/file validation, analysis, profiling, etc., as needed
  • Creating and maintaining documentation is a plus
  • Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals.
  • Experience with ETL and Collibra is a plus
  • Previous experience in the banking industry is preferred
Agency job
via Nu-Pie by Sanjay Biswakarma
Remote, Pune, Hyderabad, Mumbai, Bengaluru (Bangalore)
5 - 7 yrs
₹6L - ₹19L / yr
Business Intelligence (BI)
MSTR
OLAP
MicroStrategy
Desktop
+7 more
JD
  • 5 years of experience in Business Intelligence development.
  • Experience with MicroStrategy toolset: Desktop, Report Services, Architect, OLAP, Administrator
  • Strong experience in design, creation, and deployment of reports and dashboards
  • Experience with designing reusable MicroStrategy components for business reporting
  • Excellent communication skills to interact with users at various levels within the organization.
  • Strong SQL skills to perform queries, data/file validation, analysis, profiling, etc., as needed
  • Creating and maintaining documentation is a plus
  • Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals.
  • Experience with ETL and Collibra is a plus
  • Previous experience in the banking industry is preferred

IT Giant

Agency job
Remote, Chennai, Bengaluru (Bangalore), Hyderabad, Pune, Mumbai, NCR (Delhi | Gurgaon | Noida), Kolkata
10 - 18 yrs
₹15L - ₹30L / yr
ETL
Informatica
Informatica PowerCenter
Windows Azure
SQL Azure
+2 more
Key skills:
Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake

Job Description
A minimum of 15 years of experience with Informatica ETL and database technologies, including experience with Azure database technologies (Azure SQL Server, Azure Data Lake) and exposure to change data capture technology. You will lead and guide the development of an Informatica-based ETL architecture, develop solutions in a highly demanding environment, and provide hands-on guidance to other team members.

Responsibilities: head complex ETL requirements and design; implement an Informatica-based ETL solution fulfilling stringent performance requirements; collaborate with product development teams and senior designers to develop architectural requirements; assess requirements for completeness and accuracy, and determine whether they are actionable for the ETL team; conduct impact assessments and size the effort based on requirements; develop full SDLC project plans to implement the ETL solution and identify resource requirements; perform an active, leading role in shaping and enhancing the overall ETL Informatica architecture, and identify, recommend, and implement ETL process and architecture improvements; assist with and verify the design of the solution and the production of all design-phase deliverables; and manage the build phase and quality-assure the code to ensure it fulfils requirements and adheres to the ETL architecture.

Data

Agency job
via parkcom by Ravi P
Pune
6 - 15 yrs
₹7L - ₹15L / yr
ETL
Oracle
Talend
Ab Initio

We are looking for a Senior Database Developer to provide a senior-level contribution to the design, development, and implementation of critical business enterprise applications for marketing systems.

 

  1. Play a lead role in developing, deploying, and managing our databases (Oracle, MySQL, and Mongo) on public clouds.
  2. Design and develop PL/SQL processes to perform complex ETL processes.
  3. Develop UNIX and Perl scripts for data auditing and automation (see the sketch after this list).
  4. Responsible for database builds and change requests.
  5. Holistically define the overall reference architecture and manage its implementation in the production systems.
  6. Identify architecture gaps that can improve availability, performance, and security for both production systems and database systems, and work towards resolving those issues.
  7. Work closely with Engineering, Architecture, Business, and Operations teams to provide necessary and continuous feedback.
  8. Automate all the manual steps for the database platform.
  9. Deliver solutions for access management, availability, security, replication, and patching.
  10. Troubleshoot application database performance issues.
  11. Participate in daily huddles (30 min.) to collaborate with onshore and offshore teams.
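
The listing asks for UNIX/Perl auditing scripts; below is the same idea sketched in Python instead: reconcile row counts between a source extract and its warehouse copy and flag drift. It assumes two SQLite files that each already contain an orders table; in practice the connections would point at Oracle/MySQL.

    import sqlite3  # stand-in for the Oracle/MySQL drivers the role would use

    def row_count(conn, table: str) -> int:
        """Count rows in a table (table name is trusted, not user input)."""
        return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

    src = sqlite3.connect("source.db")  # hypothetical source extract
    tgt = sqlite3.connect("target.db")  # hypothetical warehouse copy

    src_n, tgt_n = row_count(src, "orders"), row_count(tgt, "orders")
    if src_n != tgt_n:
        print(f"AUDIT FAIL: orders source={src_n} target={tgt_n}")
    else:
        print(f"AUDIT OK: orders rows={src_n}")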

 

Qualifications: 

 

  1. 5+ years of experience in database development.
  2. Bachelor’s degree in Computer Science, Computer Engineering, Math, or similar.
  3. Experience using ETL tools (Talend or Ab Initio a plus).
  4. Experience with relational database programming, processing, and tuning (Oracle, PL/SQL, MySQL, MS SQL Server, SQL, T-SQL).
  5. Familiarity with BI tools (Cognos, Tableau, etc.).
  6. Experience with Cloud technology (AWS, etc.).
  7. Agile or Waterfall methodology experience preferred.
  8. Experience with API integration.
  9. Advanced software development and scripting skills for use in automation and interfacing with databases.
  10. Knowledge of software development lifecycles and methodologies.
  11. Knowledge of developing procedures, packages and functions in a DW environment.
  12. Knowledge of UNIX, Linux and Service Oriented Architecture (SOA).
  13. Ability to multi-task, to work under pressure, and think analytically.
  14. Ability to work with minimal supervision and meet deadlines.
  15. Ability to write technical specifications and documents.
  16. Ability to communicate effectively with individuals at all levels in the company and with various business contacts outside of the company in an articulate, professional manner.
  17. Knowledge of CDP, CRM, MDM and Business Intelligence is a plus.
  18. Flexible work hours.

 

This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.

 

Pune
3 - 9 yrs
₹6L - ₹15L / yr
Amazon Web Services (AWS)
ETL
Linux/Unix
Bonzai is a cloud-based enterprise software platform that helps brands create, traffic, measure, and optimize their HTML5 ads. The self-serve drag-and-drop tool allows brands to build dynamic and responsive ad units for all screens, without a single line of code.