
35+ ETL Jobs in Pune | ETL Job openings in Pune

Apply to 35+ ETL Jobs in Pune on CutShort.io. Explore the latest ETL Job opportunities across top companies like Google, Amazon & Adobe.

Nirmitee.io
Posted by Disha Karia
Pune
5 - 10 yrs
₹8L - ₹18L / yr
ETL
IBM InfoSphere DataStage
SQL

Job Title: Senior ETL Developer (DataStage and SQL)

Location: Pune

Overview:

We’re looking for a Senior ETL Developer with 5+ years of experience in ETL development, strong DataStage and SQL skills, and a track record in complex data integration projects.

Responsibilities:

  • Develop and maintain ETL processes using IBM DataStage and SQL for data warehousing.
  • Write advanced SQL queries to transform and validate data (see the sketch after this list).
  • Troubleshoot ETL jobs, optimize performance, and ensure data quality.
  • Document ETL workflows and adhere to coding standards.
  • Lead and mentor junior developers, providing technical guidance.
  • Collaborate with architects and analysts to deliver scalable solutions.
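
For context, a data-validation query of the kind described above might look like the following minimal sketch. It uses Python's built-in sqlite3 module purely so the example is self-contained; the table and column names (stg_orders, dw_orders, amount) are hypothetical and not taken from this posting:

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 100.0), (2, 250.5), (3, NULL);
    INSERT INTO dw_orders  VALUES (1, 100.0), (2, 250.5);
""")

# Flag rows the ETL load dropped, plus NULLs in a mandatory column.
cur.execute("""
    SELECT s.order_id,
           CASE WHEN t.order_id IS NULL THEN 'missing in target'
                ELSE 'null amount' END AS issue
    FROM stg_orders s
    LEFT JOIN dw_orders t ON t.order_id = s.order_id
    WHERE t.order_id IS NULL OR s.amount IS NULL
""")
print(cur.fetchall())   # [(3, 'missing in target')]

In a DataStage job, equivalent logic would typically live in a lookup or join stage, with the SQL pushed down to the warehouse.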

Qualifications:

  • 5+ years in ETL development; 5+ years with IBM DataStage.
  • Advanced SQL skills and experience with relational databases.
  • Strong understanding of data warehousing and data integration.
  • Experience in performance tuning and ETL process optimization.
  • Team player with leadership abilities and excellent problem-solving skills.


Nirmitee.io
Posted by Gitashri K
Pune
4 - 7 yrs
₹5L - ₹16L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+3 more

Responsibilities:

  • Proficient in SQL
  • 4+ years of experience
  • Banking domain expertise 
  • Excellent data analysis and problem-solving skills
  • Attention to data details and accuracy 
  • ETL knowledge and experience 
  • Identifies, creates, and analyzes data, information, and reports to make recommendations and enhance organizational capability.
  • Excellent Communication skills
  • Experience in using Business Analysis tools and techniques
  • Knowledge and understanding of various Business Analysis methodologies
  • Attention to detail and problem-solving skills
  • Notice period: immediate joiner


Pune, Hybrid
4 - 8 yrs
₹10L - ₹25L / yr
SAP
Data migration
ETL

Job Description:


We are currently seeking a talented and experienced SAP SF Data Migration Specialist to join our team and drive the successful migration from SAP ECC to SAP S/4.


As the SAP SF Data Migration Specialist, you will play a crucial role in overseeing the design, development, and implementation of data solutions within our SAP SF environment. You will collaborate closely with cross-functional teams to ensure data integrity, accuracy, and usability to support business processes and decision-making. 



About the Company:


We are a dynamic and innovative company committed to delivering exceptional solutions that empower our clients to succeed. With our headquarters in the UK and a global footprint across the US, Noida, and Pune in India, we bring a decade of expertise to every endeavour, driving real results. We take a holistic approach to project delivery, providing end-to-end services that encompass everything from initial discovery and design to implementation, change management, and ongoing support. Our goal is to help clients leverage the full potential of the Salesforce platform to achieve their business objectives.



What Makes VE3 The Best For You: We think of your family as our family, no matter the shape or size. We offer maternity leave, PF fund contributions and a 5-day working week, along with a generous paid time off program that helps you balance your work & personal life.


Requirements

Responsibilities:

  • Lead the design and implementation of data migration strategies and solutions within SAP SF environments.
  • Develop and maintain data migration plans, ensuring alignment with project timelines and objectives.
  • Collaborate with business stakeholders to gather and analyse data requirements, ensuring alignment with business needs and objectives.
  • Design and implement data models, schemas, and architectures to support SAP data structures and functionalities.
  • Lead data profiling and analysis activities to identify data quality issues, gaps, and opportunities for improvement.
  • Define data transformation rules and processes to ensure data consistency, integrity, and compliance with business rules and regulations.
  • Manage data cleansing, enrichment, and standardization efforts to improve data quality and usability (a sketch of such a rule follows this list).
  • Coordinate with technical teams to implement data migration scripts, ETL processes, and data loading mechanisms.
  • Develop and maintain data governance policies, standards, and procedures to ensure data integrity, security, and privacy.
  • Lead data testing and validation activities to ensure accuracy and completeness of migrated data.
  • Provide guidance and support to project teams, including training, mentoring, and knowledge sharing on SAP data best practices and methodologies.
  • Stay current with SAP data management trends, technologies, and best practices, and recommend innovative solutions to enhance data capabilities and performance.
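
As a flavour of the cleansing and standardization work described above, the minimal pandas sketch below trims, normalizes, de-duplicates, and date-validates a batch of legacy HR records. The field names (emp_id, email, hire_date) are invented for illustration and are not actual SAP SuccessFactors fields:

import pandas as pd

legacy = pd.DataFrame({
    "emp_id": [" 001", "002", "002", None],
    "email": ["A.Smith@CO.com", "b.jones@co.com", "b.jones@co.com", "c@co.com"],
    "hire_date": ["2021-01-05", "2021-02-05", "2021-02-05", "2020-13-01"],
})

cleaned = (
    legacy
    .dropna(subset=["emp_id"])            # mandatory key must be present
    .assign(
        emp_id=lambda d: d["emp_id"].str.strip(),   # trim stray whitespace
        email=lambda d: d["email"].str.lower(),     # standardize case
        hire_date=lambda d: pd.to_datetime(d["hire_date"], errors="coerce"),  # invalid dates become NaT
    )
    .drop_duplicates(subset=["emp_id"])   # de-duplicate on the business key
)
print(cleaned)

A real migration would express such rules in the agreed transformation specification and route rejected records to an error queue rather than silently dropping them.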

Requirements:

  • Bachelor’s degree in computer science, Information Systems, or related field; master’s degree preferred.
  • 10+ years of experience in SAP and Non-SAP data management, with a focus on data migration, data modelling, and data governance.
  • Have demonstrable experience as an SAP Data Consultant, ideally working across SAP SuccessFactors and non-SAP systems
  • Highly knowledgeable and experienced in managing HR data migration projects in SAP SuccessFactors environments
  • Demonstrate knowledge of how data aspects need to be considered within overall SAP solution design
  • Manage the workstream activities and plan, including stakeholder management, engagement with the business and the production of governance documentation.
  • Proven track record of leading successful SAP data migration projects from conception to completion.
  • Excellent analytical, problem-solving, and communication skills, with the ability to collaborate effectively with cross-functional teams.
  • Experience with SAP Activate methodologies preferred.
  • SAP certifications in data management or related areas are a plus.
  • Ability to work independently and thrive in a fast-paced, dynamic environment.
  • Lead the data migration workstream, with a direct team of circa 5 resources in addition to other 3rd-party and client resources.
  • Work flexibly and remotely. Occasional UK travel will be required.


Benefits

  • Competitive salary and comprehensive benefits package.
  • Opportunity to work in a dynamic and challenging environment on critical migration projects.
  • Professional growth opportunities in a supportive and forward-thinking organization.
  • Engagement with cutting-edge SAP technologies and methodologies in data migration.
Wissen Technology
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore), Pune
5 - 10 yrs
Best in industry
ETL
SQL
Snow flake schema
Data Warehouse (DWH)

Job Description for QA Engineer:

  • 6-10 years of experience in ETL testing, Snowflake, and DWH concepts.
  • Strong SQL knowledge & debugging skills are a must.
  • Experience with Azure and Snowflake testing is a plus.
  • Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus.
  • Strong data warehousing concepts; experience with ETL tools like Talend Cloud Data Integration and Pentaho/Kettle.
  • Experience with JIRA and the Xray defect management tool is good to have.
  • Exposure to financial domain knowledge is considered a plus.
  • Testing data readiness (data quality) and addressing code or data issues (see the reconciliation sketch after this list).
  • Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
  • Demonstrate strong collaborative experience across regions (APAC, EMEA and NA) to effectively and efficiently identify root cause of code/data issues and come up with a permanent solution
  • Prior experience with State Street and Charles River Development (CRD) considered a plus
  • Experience in tools such as PowerPoint, Excel, SQL
  • Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus
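
To make the data-readiness bullet concrete, a reconciliation test of the kind an ETL tester automates might look like this minimal sketch. SQLite keeps it self-contained; in practice the queries would run against the source system and the Snowflake target, and the names (src_trades, tgt_trades, notional) are invented:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_trades (trade_id INTEGER, notional REAL);
    CREATE TABLE tgt_trades (trade_id INTEGER, notional REAL);
    INSERT INTO src_trades VALUES (1, 1000.0), (2, 2500.0);
    INSERT INTO tgt_trades VALUES (1, 1000.0), (2, 2500.0);
""")

def profile(table):
    # COUNT(*) catches dropped rows; SUM(...) catches value-level drift.
    rows, total = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM(notional), 0) FROM {table}").fetchone()
    return {"rows": rows, "notional_sum": total}

assert profile("src_trades") == profile("tgt_trades"), "source/target mismatch"
print("reconciliation passed")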


Key Attributes include:

  • Team player with professional and positive approach
  • Creative, innovative and able to think outside of the box
  • Strong attention to detail during root cause analysis and defect issue resolution
  • Self-motivated & self-sufficient
  • Effective communicator both written and verbal
  • Brings a high level of energy with enthusiasm to generate excitement and motivate the team
  • Able to work under pressure with tight deadlines and/or multiple projects
  • Experience in negotiation and conflict resolution


Nirmitee.io
Posted by Gitashri K
Pune
5 - 7 yrs
₹5L - ₹20L / yr
ETL

Job Description:

Responsibilities:

  • Design, develop, and implement ETL processes using IBM DataStage.
  • Collaborate with business analysts and data architects to understand data requirements and translate them into ETL solutions.
  • Develop and maintain data integration workflows to ensure data accuracy and integrity.
  • Optimize ETL processes for performance and scalability.
  • Troubleshoot and resolve data issues and ETL job failures.
  • Document ETL processes, data flows, and mappings.
  • Ensure compliance with data governance and security policies.
  • Participate in code reviews and provide constructive feedback to team members.
  • Stay updated with the latest trends and best practices in ETL and data integration technologies.

Requirements:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Proven experience as an ETL Developer with expertise in IBM DataStage.
  • Strong knowledge of SQL and database management systems (e.g., Oracle, SQL Server, DB2).
  • Experience with data modeling and data warehousing concepts.
  • Familiarity with data quality and data governance principles.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and teamwork abilities.

Location : Pune

Onepoint IT Consulting Pvt Ltd
Posted by Lalitha Goparaju
Pune
0 - 1 yrs
₹2L - ₹4L / yr
Data Warehouse (DWH)
Informatica
ETL
SQL

We will invite candidates for the selection process who meet the following criteria:


- Graduates/post graduates from the computer stream only (16 years of education – 10+2+4 or 10+3+3) having passed out in 2023/24

- 65% minimum marks in all semesters, cleared in one go

- Excellent communication skills - written and verbal in English


Further details on salary, benefits, location and working hours:


- Compensation: Rs. 20,000/- (inclusive of PF) stipend for the first 3 months while on training; on successful completion, Rs. 4 LPA.

- Location: Pune

- Working hours: UK timing (8am – 5pm).

- Health insurance coverage of Rs. 5L will be provided for the duration of your employment.


Selection process


The selection process will consist of an aptitude assessment (1.5 hr), followed by a technical round (Java and SQL - MCQ) test (20 min) and a Java programming assignment. After clearing all the above rounds, the next round will be a personal interview.


We request that you only reply to this email message if you meet the selection criteria and agree with the terms and conditions. Please also mention which position you are applying for.


We will also ask you to answer the following screening questions if you wish to apply for any of these open vacancies.


Why should Onepoint consider you for an interview?

Which values are important for you at the workplace and why?

Where would you like to be in 2 to 3 years’ time in terms of your career?


TVARIT GmbH
Posted by Shivani Kawade
Remote, Pune
2 - 4 yrs
₹8L - ₹20L / yr
Python
PySpark
ETL
databricks
Azure
+6 more

TVARIT GmbH develops and delivers solutions in the field of artificial intelligence (AI) for the manufacturing, automotive, and process industries. With its software products, TVARIT makes it possible for its customers to make intelligent and well-founded decisions, e.g., in predictive maintenance, increasing OEE, and predictive quality. We have renowned reference customers, competent technology, a good research team from renowned universities, and the award of a renowned AI prize (e.g., EU Horizon 2020), which makes TVARIT one of the most innovative AI companies in Germany and Europe.

 

 

We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English. 

 

 

We are seeking a skilled and motivated Data Engineer from the manufacturing industry with over two years of experience to join our team. As a Data Engineer, you will be responsible for designing, building, and maintaining the infrastructure required for the collection, storage, processing, and analysis of large and complex data sets. The ideal candidate will have a strong foundation in ETL pipelines and Python, with additional experience in Azure and Terraform being a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives.

 

 

Skills Required 

  • Experience in the manufacturing industry (metal industry is a plus)  
  • 2+ years of experience as a Data Engineer 
  • Experience in data cleaning & structuring and data manipulation 
  • ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines (a minimal example follows this list).
  • Python: Strong proficiency in Python programming for data manipulation, transformation, and automation.
  • Experience in SQL and data structures
  • Knowledge of big data technologies such as Apache Spark, Flink, and Hadoop, and NoSQL databases.
  • Knowledge of cloud technologies (at least one) such as AWS, Azure, and Google Cloud Platform. 
  • Proficient in data management and data governance  
  • Strong analytical and problem-solving skills. 
  • Excellent communication and teamwork abilities. 
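
As a minimal illustration of the ETL-pipeline bullet above, the sketch below wires an extract / transform / load flow together in plain Python; the sensor-style CSV, column names, and temperature threshold are invented for the example:

import csv, io, sqlite3

raw_csv = io.StringIO(
    "machine_id,temp_c\n"
    "M1,71.5\n"
    "M2,\n"          # missing reading, dropped during transform
    "M3,64.0\n"
)

def extract(fh):
    return list(csv.DictReader(fh))

def transform(rows):
    out = []
    for r in rows:
        if not r["temp_c"]:
            continue                                      # basic data cleaning
        temp = float(r["temp_c"])
        out.append((r["machine_id"], temp, temp > 70.0))  # derived "hot" flag
    return out

def load(rows, conn):
    conn.execute("CREATE TABLE readings (machine_id TEXT, temp_c REAL, hot INTEGER)")
    conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(raw_csv)), conn)
print(conn.execute("SELECT * FROM readings").fetchall())

A production pipeline would swap the in-memory pieces for object storage, an orchestrator, and a real warehouse, but the staged shape stays the same.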

 


Nice To Have 

  • Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database). 
  • Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud resources.


TVARIT GmbH
Posted by Shivani Kawade
Remote, Pune
2 - 6 yrs
₹8L - ₹25L / yr
SQL Azure
databricks
Python
SQL
ETL
+9 more

TVARIT GmbH develops and delivers solutions in the field of artificial intelligence (AI) for the manufacturing, automotive, and process industries. With its software products, TVARIT makes it possible for its customers to make intelligent and well-founded decisions, e.g., in predictive maintenance, increasing OEE, and predictive quality. We have renowned reference customers, competent technology, a good research team from renowned universities, and the award of a renowned AI prize (e.g., EU Horizon 2020), which makes TVARIT one of the most innovative AI companies in Germany and Europe.


We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English.


We are seeking a skilled and motivated Senior Data Engineer from the manufacturing industry with over four years of experience to join our team. The Senior Data Engineer will oversee the department's data infrastructure, including developing a data model, integrating large amounts of data from different systems, building & enhancing a data lake-house & subsequent analytics environment, and writing scripts to facilitate data analysis. The ideal candidate will have a strong foundation in ETL pipelines and Python, with additional experience in Azure and Terraform being a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives.


Skills Required:


  • Experience in the manufacturing industry (metal industry is a plus)
  • 4+ years of experience as a Data Engineer
  • Experience in data cleaning & structuring and data manipulation
  • Architect and optimize complex data pipelines, leading the design and implementation of scalable data infrastructure, and ensuring data quality and reliability at scale
  • ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines.
  • Python: Strong proficiency in Python programming for data manipulation, transformation, and automation.
  • Experience in SQL and data structures
  • Knowledge of big data technologies such as Apache Spark, Flink, and Hadoop, and NoSQL databases.
  • Knowledge of cloud technologies (at least one) such as AWS, Azure, and Google Cloud Platform.
  • Proficient in data management and data governance
  • Strong analytical experience & skills that can extract actionable insights from raw data to help improve the business.
  • Strong analytical and problem-solving skills.
  • Excellent communication and teamwork abilities.


Nice To Have:

  • Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database).
  • Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud resources.
  • Bachelor’s degree in computer science, Information Technology, Engineering, or a related field from top-tier Indian Institutes of Information Technology (IIITs).

Benefits and Perks:
  • A culture that fosters innovation, creativity, continuous learning, and resilience
  • Progressive leave policy promoting work-life balance
  • Mentorship opportunities with highly qualified internal resources and industry-driven programs
  • Multicultural peer groups and supportive workplace policies
  • Annual workcation program allowing you to work from various scenic locations
  • Experience the unique environment of a dynamic start-up


Why should you join TVARIT?


Working at TVARIT, a deep-tech German IT startup, offers a unique blend of innovation, collaboration, and growth opportunities. We seek individuals eager to adapt and thrive in a rapidly evolving environment.


If this opportunity excites you and aligns with your career aspirations, we encourage you to apply today!

TVARIT GmbH
Posted by Shivani Kawade
Pune
8 - 15 yrs
₹20L - ₹25L / yr
Python
CI/CD
Systems Development Life Cycle (SDLC)
ETL
JIRA
+5 more

TVARIT GmbH develops and delivers solutions in the field of artificial intelligence (AI) for the manufacturing, automotive, and process industries. With its software products, TVARIT makes it possible for its customers to make intelligent and well-founded decisions, e.g., in predictive maintenance, increasing OEE, and predictive quality. We have renowned reference customers, competent technology, a good research team from renowned universities, and the award of a renowned AI prize (e.g., EU Horizon 2020), which makes TVARIT one of the most innovative AI companies in Germany and Europe.



Requirements:

  • Python Experience: Minimum 3+ years.
  • Software Development Experience: Minimum 8+ years.
  • Data Engineering and ETL Workloads: Minimum 2+ years.
  • Familiarity with Software Development Life Cycle (SDLC).
  • CI/CD Pipeline Development: Experience in developing CI/CD pipelines for large projects.
  • Agile Framework & Sprint Methodology: Experience with Jira.
  • Source Version Control: Experience with GitHub or similar SVC.
  • Team Leadership: Experience leading a team of software developers/data scientists.

Good to Have:

  • Experience with Golang.
  • DevOps/Cloud Experience (preferably AWS).
  • Experience with React and TypeScript.

Responsibilities:

  • Mentor and train a team of data scientists and software developers.
  • Lead and guide the team in best practices for software development and data engineering.
  • Develop and implement CI/CD pipelines.
  • Ensure adherence to Agile methodologies and participate in sprint planning and execution.
  • Collaborate with the team to ensure the successful delivery of projects.
  • Provide on-site support and training in Pune.

Skills and Attributes:

  • Strong leadership and mentorship abilities.
  • Excellent problem-solving skills.
  • Effective communication and teamwork.
  • Ability to work in a fast-paced environment.
  • Passionate about technology and continuous learning.


Note: This is a part-time position paid on an hourly basis. The initial commitment is 4-8 hours per week, with potential fluctuations.


Join TVARIT and be a pivotal part of shaping the future of software development and data engineering.

Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Pune, Hyderabad, Ahmedabad, Chennai
3 - 7 yrs
₹8L - ₹15L / yr
AWS Lambda
Amazon S3
Amazon VPC
Amazon EC2
Amazon Redshift
+3 more

Technical Skills:


  • Ability to understand and translate business requirements into design.
  • Proficient in AWS infrastructure components such as S3, IAM, VPC, EC2, and Redshift.
  • Experience in creating ETL jobs using Python/PySpark.
  • Proficiency in creating AWS Lambda functions for event-based jobs.
  • Knowledge of automating ETL processes using AWS Step Functions.
  • Competence in building data warehouses and loading data into them.


Responsibilities:


  • Understand business requirements and translate them into design.
  • Assess AWS infrastructure needs for development work.
  • Develop ETL jobs using Python/PySpark to meet requirements.
  • Implement AWS Lambda for event-based tasks (a handler sketch follows this list).
  • Automate ETL processes using AWS Step Functions.
  • Build data warehouses and manage data loading.
  • Engage with customers and stakeholders to articulate the benefits of proposed solutions and frameworks.
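
As an illustration of the Lambda item above, a handler that reacts to an S3 object-created event and kicks off a downstream ETL job might look like the sketch below. It assumes boto3 and a Glue job named nightly-etl; both the job name and the argument key are placeholders, not details from this posting:

import boto3

glue = boto3.client("glue")

def lambda_handler(event, context):
    # S3 event notifications carry the bucket and key under Records[].s3
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        glue.start_job_run(
            JobName="nightly-etl",                           # hypothetical Glue job
            Arguments={"--input_path": f"s3://{bucket}/{key}"},
        )
    return {"status": "started"}

Wiring several such steps together, with retries and branching, is what the Step Functions item above delegates to the orchestration layer.
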
InfoCepts
Posted by Lalsaheb Bepari
Chennai, Pune, Nagpur
7 - 10 yrs
₹5L - ₹15L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+5 more

Responsibilities:

 

• Designing Hive/HCatalog data models, including creating table definitions, file formats and compression techniques for structured & semi-structured data processing

• Implementing Spark-based ETL frameworks

• Implementing Big Data pipelines for data ingestion, storage, processing & consumption

• Modifying the Informatica-Teradata & Unix based data pipeline

• Enhancing the Talend-Hive/Spark & Unix based data pipelines

• Developing and deploying Scala/Python based Spark jobs for ETL processing (a PySpark sketch follows this list)

• Strong SQL & DWH concepts.
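
To ground the Hive and Spark items above, the illustrative PySpark sketch below pairs a Hive table definition (explicit file format and compression) with a static-partition load. All object names are hypothetical and a Hive-enabled Spark build is assumed:

from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("etl-sketch")
         .enableHiveSupport()       # needs a Spark build with Hive support
         .getOrCreate())

# Table definition with an explicit file format and compression choice
spark.sql("""
    CREATE TABLE IF NOT EXISTS sales_orc (
        order_id BIGINT,
        amount   DOUBLE
    )
    PARTITIONED BY (order_date STRING)
    STORED AS ORC
    TBLPROPERTIES ('orc.compress' = 'SNAPPY')
""")

# Static-partition insert standing in for the ETL load step
spark.sql("""
    INSERT INTO sales_orc PARTITION (order_date = '2024-01-01')
    VALUES (1, 99.0)
""")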

 

Preferred Background:

 

• Function as integrator between business needs and technology solutions, helping to create technology solutions to meet clients’ business needs

• Lead project efforts in defining scope, planning, executing, and reporting to stakeholders on strategic initiatives

• Understanding of EDW system of business and creating High level design document and low level implementation document

• Understanding of Big Data Lake system of business and creating High level design document and low level implementation document

• Designing Big data pipeline for Data Ingestion, Storage, Processing & Consumption

EnterpriseMinds
Posted by Rani Galipalli
Bengaluru (Bangalore), Pune, Mumbai
6 - 8 yrs
₹25L - ₹28L / yr
ETL
Informatica
Data Warehouse (DWH)
ETL management
SQL
+1 more

Your key responsibilities

 

  • Create and maintain optimal data pipeline architecture. Should have experience in building batch/real-time ETL Data Pipelines. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
  • The individual will be responsible for solution design, integration, data sourcing, transformation, database design and implementation of complex data warehousing solutions.
  • Responsible for development, support, maintenance, and implementation of a complex project module
  • Provide expertise in area and advanced knowledge of applications programming and ensure application design adheres to the overall architecture blueprint
  • Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
  • Resolve variety of high impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
  • Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for complete reporting solutions.
  • Preparation of the HLD document describing the application architecture and high-level design.
  • Preparation of the LLD document covering job design, job descriptions, and detailed information about the jobs.
  • Preparation of Unit Test cases and execution of the same.
  • Provide technical guidance and mentoring to application development teams throughout all the phases of the software development life cycle

Skills and attributes for success

 

  • Strong experience in SQL. Proficient in writing performant SQL working with large data volumes. Proficiency in writing and debugging complex SQLs.
  • Strong experience with the Microsoft Azure database platform; experienced in Azure Data Factory.
  • Strong in Data Warehousing concepts. Experience with large-scale data warehousing architecture and data modelling.
  • Should have enough experience to work on PowerShell scripting
  • Able to guide the team through the development, testing and implementation stages and review the completed work effectively
  • Able to make quick decisions and solve technical problems to provide an efficient environment for project implementation
  • Primary owner of delivery and timelines. Review code written by other engineers.
  • Maintain highest levels of development practices including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, writing clean, modular and self-sustaining code, with repeatable quality and predictability
  • Must have understanding of business intelligence development in the IT industry
  • Outstanding written and verbal communication skills
  • Should be adept in SDLC process - requirement analysis, time estimation, design, development, testing and maintenance
  • Hands-on experience in installing, configuring, operating, and monitoring CI/CD pipeline tools
  • Should be able to orchestrate and automate pipelines
  • Good to have: knowledge of distributed systems such as Hadoop, Hive, Spark

 

To qualify for the role, you must have

 

  • Bachelor's Degree in Computer Science, Economics, Engineering, IT, Mathematics, or related field preferred
  • More than 6 years of experience in ETL development projects
  • Proven experience in delivering effective technical ETL strategies
  • Microsoft Azure project experience
  • Technologies: ETL- ADF, SQL, Azure components (must-have), Python (nice to have)

 

Ideally, you’ll also have

AArete Technosoft Pvt Ltd
Pune
7 - 12 yrs
₹25L - ₹30L / yr
Snowflake
Snow flake schema
ETL
Data Warehouse (DWH)
Python
+8 more
Help us modernize our data platforms, with a specific focus on Snowflake.
• Work with various stakeholders, understand requirements, and build solutions/data pipelines that address the needs at scale
• Bring key workloads to the clients' Snowflake environment using scalable, reusable data ingestion and processing frameworks to transform a variety of datasets
• Apply best practices for Snowflake architecture, ELT and data models

Skills - 50% of below:
• A passion for all things data; understanding how to work with it at scale, and more importantly, knowing how to get the most out of it
• Good understanding of native Snowflake capabilities like data ingestion, data sharing, zero-copy cloning, tasks, Snowpipe etc. (a short sketch of cloning and tasks follows this list)
• Expertise in data modeling, with a good understanding of modeling approaches like Star schema and/or Data Vault
• Experience in automating deployments
• Experience writing code in Python, Scala, Java or PHP
• Experience in ETL/ELT either via a code-first approach or using low-code tools like AWS Glue, Appflow, Informatica, Talend, Matillion, Fivetran etc.
• Experience in one or more AWS services, especially in relation to integration with Snowflake
• Familiarity with data visualization tools like Tableau, PowerBI, Domo or any similar tool
• Experience with Data Virtualization tools like Trino, Starburst, Denodo, Data Virtuality, Dremio etc.
• Certified SnowPro Advanced: Data Engineer is a must.
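
For readers unfamiliar with the Snowflake capabilities named above, the sketch below demonstrates zero-copy cloning and a scheduled task through the snowflake-connector-python package. The account, credentials, and object names are all placeholders:

import snowflake.connector

conn = snowflake.connector.connect(
    account="myaccount", user="etl_user", password="...",   # placeholders
    warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()

# Zero-copy clone: an instant, storage-free copy of a table for dev/testing
cur.execute("CREATE OR REPLACE TABLE orders_dev CLONE orders")

# A scheduled task that keeps a derived table fresh
cur.execute("""
    CREATE OR REPLACE TASK refresh_daily_sales
      WAREHOUSE = ETL_WH
      SCHEDULE = 'USING CRON 0 2 * * * UTC'
    AS
      INSERT INTO daily_sales
      SELECT order_date, SUM(amount) FROM orders GROUP BY order_date
""")
cur.execute("ALTER TASK refresh_daily_sales RESUME")   # tasks are created suspended
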
Humancloud Technology Pvt Ltd
Posted by yash gurav
Pune
9 - 12 yrs
₹12L - ₹17L / yr
Project Management
IT project management
Software project management
Problem solving
Client Management
+12 more
About us:
Humancloud Technologies is a leading digital technology and innovation partner transforming businesses across the globe through its services and solutions. We believe in helping our businesses stay ahead of the curve by enabling them to leverage the new-age technology services of Blockchain, IoT, Cloud and Experience Design. We, at Humancloud, have nurtured ideas from validation to production and shaped them into scalable businesses. An experienced IIT Delhi alumni leadership coupled with a team of talented and supportive peers looks forward to your onboarding.

 Roles and Responsibilities:

You will be the leader of a team of high-performing individuals who own the entire product lifecycle from strategy to evaluation.
Guide the team in achieving both individual and collective goals.
Create and manage a process that drives toward a scalable product portfolio that will in turn drive the organization’s product profitability.
Guide the team to transform the product ideas into actionable concepts aligned with the business objectives within scope and budget.
Brainstorming for new initiatives and roadmaps, keeping an eye on the market, the customers, new opportunities and the business cases for new product opportunities.
Develop and monitor data-driven analytics for project reporting and status checks.
Prepare and maintain the RAID register and include it in planning as appropriate.
Keep all the stakeholders updated on project progress and risks if any
Initiate continuous improvements (CI) within project teams.
As and when necessary, deep dive with code reviews and RCA.

Desired Qualification and Experience :

Able to manage projects in technologies such as Angular, JAVA, J2EE, Microservices, .NET, SQL Server, Microsoft SSIS, SSRS, and ETL, with hands-on experience in one or more of these technologies.
Experience in automation and system integration.
Understand products and working of the commerce ecosystem in depth, have a good understanding of funnels, sprint & agile ways of working, PRDs & wireframes.
Align with Agile methodology and collaborate with the development team and business users.

Able to demonstrate & present the necessary reports and achievements.
Are comfortable with ambiguity, believe in first principles and have the skill to transform broad ideas into action plans
Engage with global stakeholders from a project delivery perspective and drive the project suavely.
Establish relevant policies, processes and procedures, and monitor compliance.


Why Join Us:
Are you inquisitive? Are you someone who believes in facing the odds with the determination to drive ideas forward? Then, Humancloud Technologies is the place for you.
At Humancloud Technologies, we are committed to fostering a culture of innovation, integrity, passion, courage and empathy. We believe in human potential and the numerous ways it can serve humanity through adopting technology. If you share a similar belief, then we welcome you to join us.

For Further Information:
Visit our website: www.humancloud.co.in
Follow us on social media: LinkedIn
xpressbees
Posted by Alfiya Khan
Pune, Bengaluru (Bangalore)
6 - 8 yrs
₹15L - ₹25L / yr
Big Data
Data Warehouse (DWH)
Data modeling
Apache Spark
Data integration
+10 more
Company Profile
XpressBees – a logistics company started in 2015 – is amongst the fastest growing companies of its sector. While we started off rather humbly in the space of ecommerce B2C logistics, the last 5 years have seen us steadily progress towards expanding our presence. Our vision to evolve into a strong full-service logistics organization reflects itself in our new lines of business like 3PL, B2B Xpress and cross border operations. Our strong domain expertise and constant focus on meaningful innovation have helped us rapidly evolve as the most trusted logistics partner of India. We have progressively carved our way towards best-in-class technology platforms, an extensive network reach, and a seamless last mile management system. While on this aggressive growth path, we seek to become the one-stop-shop for end-to-end logistics solutions. Our big focus areas for the very near future include strengthening our presence as service providers of choice and leveraging the power of technology to improve efficiencies for our clients.

Job Profile
As a Lead Data Engineer in the Data Platform Team at XpressBees, you will build the data platform and infrastructure to support high quality and agile decision-making in our supply chain and logistics workflows. You will define the way we collect and operationalize data (structured / unstructured), and build production pipelines for our machine learning models, and (RT, NRT, Batch) reporting & dashboarding requirements. As a Senior Data Engineer in the XB Data Platform Team, you will use your experience with modern cloud and data frameworks to build products (with storage and serving systems) that drive optimisation and resilience in the supply chain via data visibility, intelligent decision making, insights, anomaly detection and prediction.

What You Will Do
• Design and develop the data platform and data pipelines for reporting, dashboarding and machine learning models. These pipelines would productionize machine learning models and integrate with agent review tools.
• Meet the data completeness, correction and freshness requirements.
• Evaluate and identify the data store and data streaming technology choices.
• Lead the design of the logical model and implement the physical model to support business needs. Come up with logical and physical database designs across platforms (MPP, MR, Hive/PIG) which are optimal physical designs for different use cases (structured/semi-structured). Envision & implement the optimal data modelling, physical design, performance optimization technique/approach required for the problem.
• Support your colleagues by reviewing code and designs.
• Diagnose and solve issues in our existing data pipelines and envision and build their successors.

Qualifications & Experience relevant for the role

• A bachelor's degree in Computer Science or related field with 6 to 9 years of technology experience.
• Knowledge of Relational and NoSQL data stores, stream processing and micro-batching to make technology & design choices.
• Strong experience in System Integration, Application Development, ETL, Data-Platform projects. Talented across technologies used in the enterprise space.
• Software development experience using:
• Expertise in relational and dimensional modelling
• Exposure across all the SDLC process
• Experience in cloud architecture (AWS)
• Proven track record in keeping existing technical skills and developing new ones, so that you can make strong contributions to deep architecture discussions around systems and applications in the cloud (AWS).
• Characteristics of a forward thinker and self-starter that flourishes with new challenges and adapts quickly to learning new knowledge
• Ability to work with cross-functional teams of consulting professionals across multiple projects.
• Knack for helping an organization to understand application architectures and integration approaches, to architect advanced cloud-based solutions, and to help launch the build-out of those systems
• Passion for educating, training, designing, and building end-to-end systems.
GradMener Technology Pvt. Ltd.
Pune
2 - 5 yrs
₹3L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
SQL
Oracle
Job Description : 
 
Roles and Responsibility : 
 
  • Designing and coding the data warehousing system to desired company specifications 
  • Conducting preliminary testing of the warehousing environment before data is extracted
  • Extracting company data and transferring it into the new warehousing environment
  • Testing the new storage system once all the data has been transferred
  • Troubleshooting any issues that may arise
  • Providing maintenance support
  • Consulting with data management teams to get a big-picture idea of the company’s data storage needs
  • Presenting the company with warehousing options based on their storage needs
Requirements :
  • Experience of 1-3 years in Informatica Power Center
  • Excellent knowledge of Oracle database and PL/SQL, such as stored procedures, functions, user-defined functions, table partitioning, indexes, views, etc.
  • Knowledge of SQL Server database
  • Hands-on experience in Informatica PowerCenter and database performance tuning and optimization, including complex query optimization techniques; understanding of ETL control frameworks
  • Experience in UNIX shell/Perl Scripting
  • Good communication skills, including the ability to write clearly
  • Able to function effectively as a member of a team 
  • Proactive with respect to personal and technical development
InnovAccer
Posted by Jyoti Kaushik
Noida, Bengaluru (Bangalore), Pune, Hyderabad
4 - 7 yrs
₹4L - ₹16L / yr
ETL
SQL
Data Warehouse (DWH)
Informatica
Datawarehousing
+2 more

We are looking for a Senior Data Engineer to join the Customer Innovation team, who will be responsible for acquiring, transforming, and integrating customer data onto our Data Activation Platform from customers’ clinical, claims, and other data sources. You will work closely with customers to build data and analytics solutions to support their business needs, and be the engine that powers the partnership that we build with them by delivering high-fidelity data assets.

In this role, you will work closely with our Product Managers, Data Scientists, and Software Engineers to build the solution architecture that will support customer objectives. You'll work with some of the brightest minds in the industry, work with one of the richest healthcare data sets in the world, use cutting-edge technology, and see your efforts affect products and people on a regular basis. The ideal candidate is someone who:

  • Has healthcare experience and is passionate about helping heal people,
  • Loves working with data,
  • Has an obsessive focus on data quality,
  • Is comfortable with ambiguity and making decisions based on available data and reasonable assumptions,
  • Has strong data interrogation and analysis skills,
  • Defaults to written communication and delivers clean documentation, and,
  • Enjoys working with customers and problem solving for them.

A day in the life at Innovaccer:

  • Define the end-to-end solution architecture for projects by mapping customers’ business and technical requirements against the suite of Innovaccer products and Solutions.
  • Measure and communicate impact to our customers.
  • Enable customers to activate data themselves using SQL, BI tools, or APIs to answer the questions they have at speed.

What You Need:

  • 4+ years of experience in a Data Engineering role, a Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
  • 4+ years of experience working with relational databases like Snowflake, Redshift, or Postgres.
  • Intermediate to advanced level SQL programming skills.
  • Data Analytics and Visualization (using tools like PowerBI)
  • The ability to engage with both the business and technical teams of a client - to document and explain technical problems or concepts in a clear and concise way.
  • Ability to work in a fast-paced and agile environment.
  • Easily adapt and learn new things whether it’s a new library, framework, process, or visual design concept.

What we offer:

  • Industry certifications: We want you to be a subject matter expert in what you do. So, whether it’s our product or our domain, we’ll help you dive in and get certified.
  • Quarterly rewards and recognition programs: We foster learning and encourage people to take risks. We recognize and reward your hard work.
  • Health benefits: We cover health insurance for you and your loved ones.
  • Sabbatical policy: We encourage people to take time off and rejuvenate, learn new skills, and pursue their interests so they can generate new ideas with Innovaccer.
  • Pet-friendly office and open floor plan: No boring cubicles.
Impetus Technologies
Posted by Gangadhar T.M
Bengaluru (Bangalore), Hyderabad, Pune, Indore, Gurugram, Noida
10 - 17 yrs
₹25L - ₹50L / yr
Product Management
Big Data
Data Warehouse (DWH)
ETL
Hi All,
Greetings! We are looking for a Product Manager for our data modernization product. We need a resource with good knowledge of Big Data/DWH, along with strong stakeholder management and presentation skills.
EASEBUZZ
Posted by Amala Baby
Pune
2 - 4 yrs
₹2L - ₹20L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+12 more

Company Profile:

 

Easebuzz is a payment solutions (fintech organisation) company which enables online merchants to accept, process and disburse payments through developer friendly APIs. We are focusing on building plug n play products including the payment infrastructure to solve complete business problems. Definitely a wonderful place where all the actions related to payments, lending, subscription, eKYC are happening at the same time.

 

We have been consistently profitable and are constantly developing new innovative products, as a result, we are able to grow 4x over the past year alone. We are well capitalised and have recently closed a fundraise of $4M in March, 2021 from prominent VC firms and angel investors. The company is based out of Pune and has a total strength of 180 employees. Easebuzz’s corporate culture is tied into the vision of building a workplace which breeds open communication and minimal bureaucracy. An equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.

 

Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.

 


Salary: As per company standards.

 

Designation: Data Engineering

 

Location: Pune

 

Experience with ETL, Data Modeling, and Data Architecture

Design, build and operationalize large scale enterprise data solutions and applications using one or more of AWS data and analytics services in combination with 3rd parties - Spark, EMR, DynamoDB, RedShift, Kinesis, Lambda, Glue.

Experience with AWS cloud data lake for development of real-time or near real-time use cases

Experience with messaging systems such as Kafka/Kinesis for real time data ingestion and processing
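
A minimal sketch of that kind of real-time ingestion, publishing an event to a Kinesis data stream with boto3 (the stream name and event shape are invented for the example):

import json
import boto3

kinesis = boto3.client("kinesis")

def publish_payment_event(order_id: str, amount: float) -> None:
    kinesis.put_record(
        StreamName="payment-events",                  # hypothetical stream
        Data=json.dumps({"order_id": order_id, "amount": amount}).encode("utf-8"),
        PartitionKey=order_id,                        # keeps one order's events in order
    )

publish_payment_event("ord-123", 499.0)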

Build data pipeline frameworks to automate high-volume and real-time data delivery

Create prototypes and proof-of-concepts for iterative development.

Experience with NoSQL databases, such as DynamoDB, MongoDB etc

Create and maintain optimal data pipeline architecture.

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.


Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.

Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.

Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.

Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.

Evangelize a very high standard of quality, reliability and performance for data models and algorithms that can be streamlined into the engineering and sciences workflow

Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.

 

Employment Type

Full-time

 

Client of People First Consultants
Agency job via People First Consultants, posted by Aishwarya KA
Pune, Hyderabad
3 - 6 yrs
₹4L - ₹8L / yr
Python
NumPy
pandas
Django
Flask
+2 more

Key skills: Python, NumPy, pandas, SQL, ETL

Roles and Responsibilities:

 

- The work will involve the development of workflows triggered by events from other systems

- Design, develop, test, and deliver software solutions in the FX Derivatives group

- Analyse requirements for the solutions they deliver, to ensure that they provide the right solution

- Develop easy to use documentation for the frameworks and tools developed for adaption by other teams

- Familiarity with event-driven programming in Python (a minimal dispatcher sketch follows this list)

- Must have unit testing and debugging skills

- Good problem solving and analytical skills

- Python packages such as NumPy, Scikit learn

- Testing and debugging applications.

- Developing back-end components.
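
A tiny illustration of that event-driven style: workflows are registered as handlers and fired when events arrive from other systems. The event names and payload are made up for the example:

from collections import defaultdict

_handlers = defaultdict(list)

def on(event_name):
    """Decorator registering a workflow for a named event."""
    def register(fn):
        _handlers[event_name].append(fn)
        return fn
    return register

def dispatch(event_name, payload):
    for fn in _handlers[event_name]:
        fn(payload)

@on("trade_booked")
def revalue_position(payload):
    print(f"revaluing position for trade {payload['trade_id']}")

dispatch("trade_booked", {"trade_id": "FX-001"})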

With a global provider of Business Process Management.
Agency job via Jobdost, posted by Mamatha A
Bengaluru (Bangalore), Pune, Delhi, Gurugram, Nashik, Vizag
3 - 5 yrs
₹8L - ₹12L / yr
Oracle
Business Intelligence (BI)
PowerBI
Oracle Warehouse Builder
Informatica
+3 more
Oracle BI developer with 6+ years of experience working on Oracle warehouse design, development and testing
Good knowledge of Informatica ETL and Oracle Analytics Server
Analytical ability to design warehouses as per user requirements, mainly in the Finance and HR domains
Good skills to analyze existing ETL and dashboards, understand the logic and make enhancements as per requirements
Good communication skills and written communication
Qualifications
Master's or Bachelor's degree in Engineering / Computer Science / Information Technology
Additional information
Excellent verbal and written communication skills
Pinghala
Posted by Ashwini Dhaipule
Pune
6 - 10 yrs
₹6L - ₹15L / yr
Software Testing (QA)
Shell Scripting
Data management
ETL QA
ETL
+3 more

ABOUT US:
Pingahla was founded by a group of people passionate about making the world a better place by harnessing the power of Data. We are a data management firm with offices in New York and India. Our mission is to help transform the way companies operate and think about their business. We make it easier to adopt and stay ahead of the curve in the ever-changing digital landscape. One of our core beliefs is excellence in everything we do!

JOB DESCRIPTION:
Pingahla is recruiting an ETL & BI Test Manager who can build and lead a team and establish infrastructure, processes and best practices for our Quality Assurance vertical. Candidates are expected to have at least 5+ years of experience with ETL testing and working in data management project testing. Being a growing company, we will be able to provide very good career opportunities and a very attractive remuneration.

JOB ROLE & RESPONSIBILITIES:
• Plan and manage the testing activities
• Handle defect management and weekly & monthly test report generation
• Work as a Test Manager to design the test strategy and approach for the DW&BI (ETL & BI) solution
• Provide leadership and direction to the team on quality standards and testing best practices
• Ensure that project deliverables are produced, including, but not limited to: quality assurance plans, test plans, testing priorities, status reports, user documentation and online help; manage and motivate teams to accomplish significant deliverables within tight deadlines
• Manage test data; review and approve all test cases prior to execution
• Coordinate and review offshore work efforts for projects and maintenance activities

REQUIRED SKILLSET:
• Experience in Quality Assurance Management, Program Management and DW (ETL & BI) Management
• Minimum 5 years in ETL testing, at least 2 years in a Team Lead role
• Technical abilities complemented by sound communication skills, user interaction abilities, requirement gathering and analysis, and skills in data migration and conversion strategies
• Proficient in test definition, capable of developing test plans and test cases from technical specifications
• Able to single-handedly look after complete delivery from the testing side
• Experience working with remote teams, across multiple time zones
• Must have a strong knowledge of QA processes and methodologies
• Strong UNIX and PERL scripting skills
• Expertise in ETL testing and hands-on experience with ETL tools like Informatica and DataStage; PL/SQL is a plus
• Excellent problem solving, analytical and technical troubleshooting skills
• Familiar with Data Management projects
• Eager to learn, adopt and apply rapidly changing new technologies and methodologies
• Efficient and effective at approaching and escalating quality issues when appropriate

DataMetica
Posted by Sayali Kachi
Pune, Hyderabad
6 - 12 yrs
₹11L - ₹25L / yr
PL/SQL
MySQL
SQL server
SQL
Linux/Unix
+4 more

We at Datametica Solutions Private Limited are looking for an SQL Lead / Architect who has a passion for the cloud, with knowledge of different on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks and the like.

Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.



Job Description :

Experience: 6+ Years

Work Location: Pune / Hyderabad



Technical Skills :

  • Good programming experience as an Oracle PL/SQL, MySQL, and SQL Server Developer
  • Knowledge of database performance tuning techniques
  • Rich experience in a database development
  • Experience in Designing and Implementation Business Applications using the Oracle Relational Database Management System
  • Experience in developing complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL

Required Candidate Profile :

  • Excellent communication, interpersonal, analytical skills and strong ability to drive teams
  • Analyzes data requirements and data dictionary for moderate to complex projects
  • Leads data model related analysis discussions while collaborating with Application Development teams, Business Analysts, and Data Analysts during joint requirements analysis sessions
  • Translate business requirements into technical specifications with an emphasis on highly available and scalable global solutions
  • Stakeholder management and client engagement skills
  • Strong communication skills (written and verbal)

About Us!

A global leader in the Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican – Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.



Why join us!

Datametica is a place to innovate, bring new ideas to live, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over these years are the key factors in achieving our success.



Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy



Check out more about us on our website below!

www.datametica.com

DataMetica
Posted by Shivani Mahale
Pune
4 - 7 yrs
₹5L - ₹15L / yr
ETL
Informatica PowerCenter
Teradata
Data Warehouse (DWH)
IBM InfoSphere DataStage
Requirement -
  • Must have 4 to 7 years of experience in ETL Design and Development using Informatica Components.
  • Should have extensive knowledge in Unix shell scripting.
  • Understanding of DW principles (Fact, Dimension tables, Dimensional Modelling and Data warehousing concepts).
  • Research, development, document and modification of ETL processes as per data architecture and modeling requirements.
  • Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
  • Should be good in writing complex SQL queries.
Opportunities -
  • Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps Tools, Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
  • Would get a chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
  • Will play an active role in setting up the Modern data platform based on Cloud and Big Data
  • Would be part of teams with rich experience in various aspects of distributed systems and computing.
Avegen India Pvt. Ltd
Posted by Shubham Shinde
Pune
3 - 8 yrs
₹3L - ₹20L / yr
Intelligence
Artificial Intelligence (AI)
Deep Learning
Machine Learning (ML)
Data extraction
+3 more
Responsibilities
● Frame ML / AI use cases that can improve the company's product
● Implement and develop ML / AI / data-driven rule-based algorithms as software items
● For example, building a chatbot that replies with an answer from the relevant FAQ, and reinforcing the system with a feedback loop so that the bot improves
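
A minimal sketch of that FAQ-bot idea, retrieving the closest answer with TF-IDF similarity from scikit-learn; the FAQ entries are invented, and the feedback loop mentioned above is omitted for brevity:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

faqs = [
    ("How do I reset my password?", "Use the 'Forgot password' link on the login page."),
    ("How can I contact support?", "Email support@example.com."),
]

vectorizer = TfidfVectorizer()
faq_matrix = vectorizer.fit_transform([q for q, _ in faqs])

def answer(user_query: str) -> str:
    sims = cosine_similarity(vectorizer.transform([user_query]), faq_matrix)
    return faqs[sims.argmax()][1]   # answer of the most similar FAQ question

print(answer("password reset"))   # -> "Use the 'Forgot password' link on the login page."

The feedback loop would log which retrieved answers users accept and feed that signal back into ranking, which is the reinforcement idea in the bullet above.
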
Must have skills:
● Data extraction and ETL
● Python (NumPy, pandas, comfortable with OOP)
● Django
● Knowledge of basic Machine Learning / Deep Learning / AI algorithms and the ability to implement them
● Good understanding of SDLC
● Deployed ML / AI models in a mobile / web product
● Soft skills: strong communication skills & critical thinking ability

Good to have:
● Full stack development experience
Required Qualification:
B.Tech. / B.E. degree in Computer Science or equivalent software engineering
DataMetica
Posted by Sayali Kachi
Pune, Hyderabad
4 - 10 yrs
₹5L - ₹20L / yr
ETL
SQL
Data engineering
Analytics
PL/SQL
+3 more

We at Datametica Solutions Private Limited are looking for SQL Engineers who have a passion for the cloud, with knowledge of different on-premise and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks and the like.

Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.

Job Description

Experience : 4-10 years

Location : Pune

 


Mandatory Skills - 

  • Strong in ETL/SQL development
  • Strong Data Warehousing skills
  • Hands-on experience working with Unix/Linux
  • Development experience in Enterprise Data warehouse projects
  • Good to have experience working with Python, shell scripting

Opportunities:

  • Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
  • Would get the chance to be part of enterprise-grade implementations of Cloud and Big Data systems
  • Will play an active role in setting up the modern data platform based on Cloud and Big Data
  • Would be part of teams with rich experience in various aspects of distributed systems and computing


 

About Us!

A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data / Workload / ETL / Analytics to the Cloud, leveraging Automation.

 

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica and Greenplum platforms, along with ETL tools like Informatica, DataStage, Ab Initio and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes and cloud optimization.

 

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

 

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

 

Why join us!

Datametica is a place to innovate, bring new ideas to life and learn new things. We believe in building a culture of innovation, growth and belonging. Our people and their dedication over the years are the key factors in achieving our success.

 

 

Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy

 

Check out more about us on our website below!

www.datametica.com

DataMetica

at DataMetica

1 video
7 recruiters
Sumangali Desai
Posted by Sumangali Desai
Pune
3 - 8 yrs
₹5L - ₹20L / yr
ETL
Data Warehouse (DWH)
IBM InfoSphere DataStage
DataStage
SQL
+1 more

Datametica is hiring DataStage Developers.

  • Must have 3 to 8 years of experience in ETL design and development using IBM DataStage components.
  • Should have extensive knowledge of Unix shell scripting.
  • Understanding of DW principles (fact and dimension tables, dimensional modelling and data warehousing concepts).
  • Research, develop, document and modify ETL processes as per data architecture and modelling requirements.
  • Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
  • Should be proficient in writing complex SQL queries (see the sketch after this list).
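To illustrate the dimensional-modelling concepts above, here is a minimal sketch of a Type 2 slowly-changing-dimension update in Python with SQLite. The customer schema is a hypothetical stand-in; a DataStage job would express the same logic through stage components rather than hand-written statements:

```python
# A minimal Type 2 slowly-changing-dimension (SCD2) update sketch:
# close the current row and open a new one when an attribute changes.
# The customer schema and in-memory database are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_id INTEGER, city TEXT,
        valid_from TEXT, valid_to TEXT, is_current INTEGER
    );
    INSERT INTO dim_customer VALUES (42, 'Pune', '2020-01-01', NULL, 1);
""")

def scd2_update(conn, customer_id, new_city, change_date):
    """Expire the current row and insert a new current row on change."""
    row = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if row and row[0] != new_city:
        conn.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (change_date, customer_id))
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, change_date))

scd2_update(conn, 42, 'Mumbai', '2024-06-01')
for r in conn.execute("SELECT * FROM dim_customer ORDER BY valid_from"):
    print(r)
```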

About Us!

A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data / Workload / ETL / Analytics to the Cloud, leveraging Automation.

 

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica and Greenplum platforms, along with ETL tools like Informatica, DataStage, Ab Initio and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes and cloud optimization.

 

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

 

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

 

Why join us!

Datametica is a place to innovate, bring new ideas to life and learn new things. We believe in building a culture of innovation, growth and belonging. Our people and their dedication over the years are the key factors in achieving our success.

 

Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy

 

Check out more about us on our website below!

www.datametica.com

 

DataMetica

at DataMetica

1 video
7 recruiters
Nikita Aher
Posted by Nikita Aher
Pune
12 - 20 yrs
₹20L - ₹35L / yr
Data Warehouse (DWH)
ETL
Big Data
Business Intelligence (BI)
Project Management
+1 more

Job Description

Experience: 10+ years

Location: Pune


Job Requirements:

  • Minimum of 10+ years of experience with a proven record of increased responsibility
  • Hands-on experience in designing, developing and managing Big Data, Cloud, Data Warehousing and Business Intelligence projects
  • Experience managing projects in Big Data, Cloud, Data Warehousing and Business Intelligence using open-source or top-of-the-line tools and technologies
  • Good knowledge of Dimensional Modeling
  • Experience working with any ETL and BI reporting tools
  • Experience managing medium to large projects, preferably on Big Data
  • Proven experience in project planning, estimation, execution and implementation of medium to large projects
  • Should be able to communicate effectively in English
  • Strong management and leadership skills, with a proven ability to develop and manage client relationships
  • Proven problem-solving skills from both technical and managerial perspectives
  • Attention to detail and a commitment to excellence and high standards
  • Excellent interpersonal and communication skills, both verbal and written
  • The position is remote, with occasional travel to other offices, client sites, conventions, training locations, etc.
  • Bachelor's degree in Computer Science, Business/Economics or a related field, or demonstrated equivalent/practical knowledge or experience

Job Responsibilities:

  • Day-to-day project management, scrum and agile management, including project planning, delivery and execution of Big Data / Cloud projects
  • Primary point of contact for the customer for all project engagements, delivery and project escalations
  • Design the right architecture and technology stack depending on business requirements, across Cloud / Big Data and BI related technologies, both on-premise and on cloud
  • Liaise with key stakeholders to define the Cloud / Big Data solutions roadmap and prioritize the deliverables
  • Responsible for end-to-end project delivery of Cloud / Big Data solutions, from a project estimation, project planning, resourcing and monitoring perspective
  • Drive and participate in requirements-gathering workshops, estimation discussions, design meetings and status review meetings
  • Support and assist the team in resolving issues during testing and when the system is in production
  • Involved in the full customer lifecycle, with a goal to make customers successful and increase revenue and retention
  • Interface with the offshore engineering team to solve customer issues
  • Develop programs that meet customer needs with respect to functionality, performance, scalability, reliability, schedule, principles and recognized industry standards
  • Requirement analysis and documentation
  • Manage day-to-day operational aspects of a project and its scope
  • Prepare for engagement reviews and quality assurance procedures
  • Visit and/or host clients to strengthen business relationships
1CH

at 1CH

1 recruiter
Sathish Sukumar
Posted by Sathish Sukumar
Chennai, Bengaluru (Bangalore), Hyderabad, NCR (Delhi | Gurgaon | Noida), Mumbai, Pune
4 - 15 yrs
₹10L - ₹25L / yr
Data engineering
Data engineer
ETL
SSIS
ADF
+3 more
  • Expertise in designing and implementing enterprise scale database (OLTP) and Data warehouse solutions.
  • Hands on experience in implementing Azure SQL Database, Azure SQL Date warehouse (Azure Synapse Analytics) and big data processing using Azure Databricks and Azure HD Insight.
  • Expert in writing T-SQL programming for complex stored procedures, functions, views and query optimization.
  • Should be aware of database development for both on-premise and SaaS applications using SQL Server and PostgreSQL.
  • Experience in ETL and ELT implementations using Azure Data Factory V2 and SSIS.
  • Experience and expertise in building machine learning models using logistic and linear regression, decision tree and random forest algorithms (see the sketch after this list).
  • PolyBase queries for exporting and importing data into Azure Data Lake.
  • Building data models both tabular and multidimensional using SQL Server data tools.
  • Writing data preparation, cleaning and processing steps using Python, SCALA, and R.
  • Programming experience using python libraries NumPy, Pandas and Matplotlib.
  • Implementing NoSQL databases and writing queries using Cypher.
  • Designing end user visualizations using Power BI, QlikView and Tableau.
  • Experience working with all versions of SQL Server 2005/2008/2008R2/2012/2014/2016/2017/2019
  • Experience using the expression languages MDX and DAX.
  • Experience in migrating on-premise SQL server database to Microsoft Azure.
  • Hands on experience in using Azure blob storage, Azure Data Lake Storage Gen1 and Azure Data Lake Storage Gen2.
  • Performance tuning complex SQL queries, hands on experience using SQL Extended events.
  • Data modeling using Power BI for Adhoc reporting.
  • Raw data load automation using T-SQL and SSIS
  • Expert in migrating existing on-premise databases to SQL Azure.
  • Experience in using U-SQL for Azure Data Lake Analytics.
  • Hands on experience in generating SSRS reports using MDX.
  • Experience in designing predictive models using Python and SQL Server.
  • Developing machine learning models using Azure Databricks and SQL Server
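As a flavour of the model-building named in the list above, here is a minimal scikit-learn sketch comparing logistic regression and a random forest on a bundled toy dataset. It is purely illustrative, not a stand-in for the real modelling workflow, which would use domain data, feature engineering and proper validation:

```python
# A minimal sketch comparing two of the algorithms this listing names.
# The bundled toy dataset is a stand-in for real domain data.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

models = (LogisticRegression(max_iter=5000), RandomForestClassifier(random_state=42))
for model in models:
    model.fit(X_train, y_train)
    print(type(model).__name__, round(model.score(X_test, y_test), 3))
```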
European MNC

European MNC

Agency job
via Kavayah People Consulting by Kavita Singh
Pune
3 - 8 yrs
₹8L - ₹15L / yr
ETL
Data Warehouse (DWH)
SQL
Technical support
The Support Engineer (L2) will serve as a technical support champion for both internal and external customers. 
Key Responsibilities
Support mission critical applications and technologies
Adhere to agreed SLAs.
 
Required Experience, Skills and Qualifications
3-8 years of relevant experience
Proven track record of supporting ETL/Data Warehouse/Business Intelligence solutions
Strong SQL / Unix skills
Excellent written and verbal communication
High degree of analytical and problem-solving skills
Exposure to handling customers from various geographies
Strong debugging and troubleshooting skills
Ability to work with minimum supervision
Team player who shares ideas and resources
Tools and Technologies
ETL tools: experience with Talend or Informatica
BI tools: experience supporting Tableau, Jaspersoft, Pentaho or QlikView
Database: experience with Oracle or any RDBMS
Swiss Healthcare MNC

Swiss Healthcare MNC

Agency job
via Kavayah People Consulting by Kavita Singh
Pune
2 - 5 yrs
₹10L - ₹14L / yr
ETL
Datawarehousing
Data Warehouse (DWH)
SQL
Informatica

Review all job requirements and specifications required for deploying the solution into the production environment.

Perform unit tests as per the checklist of deployment steps, with the help of test cases, and maintain documentation for the same.

Work with the lead to resolve all issues within the required timeframe and communicate any delays.

Collaborate with the development team to review new programs for implementation activities, manage communication (if required) with different functions to resolve issues, and assist implementation leads in managing production deployments.

Document all issues during the deployment phase, record all findings from logs and actual deployments, and share the analysis.

Review and maintain all technical and business documents.

Conduct and monitor the software implementation lifecycle, and make appropriate customizations to the software for clients as per the deployment/implementation guide.

Train new members on product deployment and known issues; identify issues in processes and provide solutions for the same.

Ensure project tasks are appropriately updated in JIRA / the ticketing tool (in progress / done) and raise issues as needed.

Should take the initiative to learn and understand the relevant technologies, i.e. Vertica SQL, the internal data integration tool (Athena), the Pulse framework and Tableau.

Flexibility to work during non-business hours in exceptional cases (for a few days) to accommodate client time zones.

 

Experience with the following tools and technologies preferred:

ETL tools: Talend, Informatica, Ab Initio or DataStage

BI tools: experience with Tableau, Jaspersoft, Pentaho or QlikView

Database: Experience in Oracle or SS 

Methodology: Experience in SDLC and/or Agile Methodology

IT Giant

IT Giant

Agency job
Remote, Chennai, Bengaluru (Bangalore), Hyderabad, Pune, Mumbai, NCR (Delhi | Gurgaon | Noida), Kolkata
10 - 18 yrs
₹15L - ₹30L / yr
ETL
Informatica
Informatica PowerCenter
Windows Azure
SQL Azure
+2 more
Key skills:
Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake

Job Description
  • Minimum of 15 years of experience with Informatica ETL and database technologies
  • Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake
  • Exposure to Change Data Capture technology (see the sketch below)
  • Lead and guide the development of an Informatica-based ETL architecture
  • Develop solutions in a highly demanding environment and provide hands-on guidance to other team members
  • Head complex ETL requirements and design
  • Implement an Informatica-based ETL solution fulfilling stringent performance requirements
  • Collaborate with product development teams and senior designers to develop architectural requirements
  • Assess requirements for completeness and accuracy, and determine whether they are actionable for the ETL team
  • Conduct impact assessments and determine the size of effort based on requirements
  • Develop full SDLC project plans to implement the ETL solution and identify resource requirements
  • Perform an active, leading role in shaping and enhancing the overall Informatica ETL architecture; identify, recommend and implement ETL process and architecture improvements
  • Assist with and verify the design of the solution and the production of all design-phase deliverables
  • Manage the build phase and quality-assure code to ensure it fulfils requirements and adheres to the ETL architecture
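By way of illustration, here is a minimal sketch of incremental extraction using a high-watermark column — a simplified, polling-based stand-in for the log-based Change Data Capture named above. The schema and in-memory SQLite database are hypothetical:

```python
# A minimal high-watermark incremental-extraction sketch -- a simplified
# stand-in for log-based Change Data Capture. The schema is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER PRIMARY KEY, updated_at TEXT, amount REAL);
    INSERT INTO src_orders VALUES
        (1, '2024-06-01T10:00:00', 10.0),
        (2, '2024-06-02T09:30:00', 20.0);
""")

def extract_changes(conn, last_watermark: str):
    """Pull only rows changed since the last successful load."""
    rows = conn.execute(
        "SELECT id, updated_at, amount FROM src_orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_watermark,)).fetchall()
    new_watermark = rows[-1][1] if rows else last_watermark
    return rows, new_watermark

changes, watermark = extract_changes(conn, '2024-06-01T12:00:00')
print(changes)    # only order 2, changed after the previous watermark
print(watermark)  # '2024-06-02T09:30:00', persisted for the next run
```

Unlike true log-based CDC, a watermark scan misses deletes and intra-interval updates, which is why dedicated CDC technology is called out in the listing.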
Data

Data

Agency job
via parkcom by Ravi P
Pune
6 - 15 yrs
₹7L - ₹15L / yr
ETL
Oracle
Talend
Ab Initio

We are looking for a Senior Database Developer to provide a senior-level contribution to the design, development and implementation of critical business enterprise applications for marketing systems.

 

  1. Play a lead role in developing, deploying and managing our databases (Oracle, MySQL and MongoDB) on public clouds.
  2. Design and develop PL/SQL processes to perform complex ETL processing.
  3. Develop UNIX and Perl scripts for data auditing and automation (see the sketch after this list).
  4. Responsible for database builds and change requests.
  5. Holistically define the overall reference architecture and manage its implementation in the production systems.
  6. Identify architecture gaps that can improve availability, performance and security for both production systems and database systems, and work towards resolving those issues.
  7. Work closely with Engineering, Architecture, Business and Operations teams to provide necessary and continuous feedback.
  8. Automate all the manual steps for the database platform.
  9. Deliver solutions for access management, availability, security, replication and patching.
  10. Troubleshoot application database performance issues.
  11. Participate in daily huddles (30 min.) to collaborate with onshore and offshore teams.
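The listing asks for UNIX and Perl auditing scripts; purely as an illustration, here is an equivalent audit check sketched in Python, flagging missing or negative amounts in a hypothetical feed file:

```python
# A minimal data-auditing sketch (a Python stand-in for a Perl/UNIX script):
# flag rows in a hypothetical feed whose amounts are missing or negative.
import csv
import io

feed = io.StringIO(
    "id,amount\n"
    "1,10.50\n"
    "2,\n"
    "3,-4.00\n"
)

for line_no, row in enumerate(csv.DictReader(feed), start=2):  # header is line 1
    amount = row["amount"].strip()
    if not amount:
        print(f"line {line_no}: id={row['id']} missing amount")
    elif float(amount) < 0:
        print(f"line {line_no}: id={row['id']} negative amount {amount}")
```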

 

Qualifications: 

 

  1. 5+ years of experience in database development.
  2. Bachelor’s degree in Computer Science, Computer Engineering, Math, or similar.
  3. Experience using ETL tools (Talend or Ab Initio a plus).
  4. Experience with relational database programming, processing and tuning (Oracle, PL/SQL, MySQL, MS SQL Server, SQL, T-SQL).
  5. Familiarity with BI tools (Cognos, Tableau, etc.).
  6. Experience with Cloud technology (AWS, etc.).
  7. Agile or Waterfall methodology experience preferred.
  8. Experience with API integration.
  9. Advanced software development and scripting skills for use in automation and interfacing with databases.
  10. Knowledge of software development lifecycles and methodologies.
  11. Knowledge of developing procedures, packages and functions in a DW environment.
  12. Knowledge of UNIX, Linux and Service Oriented Architecture (SOA).
  13. Ability to multi-task, to work under pressure, and think analytically.
  14. Ability to work with minimal supervision and meet deadlines.
  15. Ability to write technical specifications and documents.
  16. Ability to communicate effectively with individuals at all levels in the company and with various business contacts outside of the company in an articulate, professional manner.
  17. Knowledge of CDP, CRM, MDM and Business Intelligence is a plus.
  18. Flexible work hours.

 

This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.

 

Bonzai Digital Pvt. Ltd.

at Bonzai Digital Pvt. Ltd.

2 recruiters
Neha Vyas
Posted by Neha Vyas
Pune
3 - 9 yrs
₹6L - ₹15L / yr
skill iconAmazon Web Services (AWS)
ETL
Linux/Unix
Bonzai is a cloud-based enterprise software platform that helps brands create, traffic, measure and optimize their HTML5 ads. The self-serve drag-and-drop tool allows brands to build dynamic, responsive ad units for all screens without a single line of code.
Bonzai Digital Pvt. Ltd.

at Bonzai Digital Pvt. Ltd.

2 recruiters
Neha Vyas
Posted by Neha Vyas
Pune
3+ yrs
₹6L - ₹15L / yr
skill iconAmazon Web Services (AWS)
ETL
Linux/Unix
Bonzai is a cloud-based enterprise software platform that helps brands create, traffic, measure and optimize their HTML5 ads. The self-serve drag-and-drop tool allows brands to build dynamic, responsive ad units for all screens without a single line of code.