Data Warehouse (DWH) Jobs in Pune


Apply to 19+ Data Warehouse (DWH) Jobs in Pune on CutShort.io. Explore the latest Data Warehouse (DWH) Job opportunities across top companies like Google, Amazon & Adobe.

AxionConnect Infosolutions Pvt Ltd
Posted by Shweta Sharma
Pune, Bengaluru (Bangalore), Hyderabad, Nagpur, Chennai
5.5 - 7 yrs
₹20L - ₹25L / yr
Django
Flask
Snowflake
Snowflake schema
SQL
+4 more

Job Location: Hyderabad / Bangalore / Chennai / Pune / Nagpur

Notice period: Immediate - 15 days

 

Python Developer with Snowflake

 

Job Description:


  1. 5.5+ years of strong Python development experience with Snowflake.
  2. Strong hands-on experience with SQL and the ability to write complex queries.
  3. Strong understanding of how to connect to Snowflake using Python; should be able to handle any type of file (see the sketch below).
  4. Development of data analysis and data processing engines using Python.
  5. Good experience in data transformation using Python.
  6. Experience in Snowflake data loads using Python.
  7. Experience in creating user-defined functions in Snowflake.
  8. SnowSQL implementation.
  9. Knowledge of query performance tuning will be an added advantage.
  10. Good understanding of Data Warehouse (DWH) concepts.
  11. Interpret/analyze business requirements and functional specifications.
  12. Good to have: dbt, Fivetran, and AWS knowledge.
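
As a rough, illustrative sketch of items 3, 6, and 7: the snippet below connects to Snowflake from Python, loads a local file, and creates a user-defined function, assuming the snowflake-connector-python package; the account, credentials, file, and table names are invented placeholders, not details from the posting.

import snowflake.connector

# Placeholder credentials -- not from the posting.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()
try:
    # Stage a local CSV into the table's internal stage, then load it.
    cur.execute("PUT file:///tmp/sales.csv @%SALES")
    cur.execute("COPY INTO SALES FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")
    # A simple SQL user-defined function (item 7).
    cur.execute(
        "CREATE OR REPLACE FUNCTION margin(revenue FLOAT, cost FLOAT) "
        "RETURNS FLOAT AS 'revenue - cost'"
    )
finally:
    cur.close()
    conn.close()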
InfoCepts
Posted by Lalsaheb Bepari
Chennai, Pune, Nagpur
7 - 10 yrs
₹5L - ₹15L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+5 more

Responsibilities:

 

• Designing Hive/HCatalog data models, including table definitions, file formats, and compression techniques for structured and semi-structured data processing

• Implementing Spark-based ETL processing frameworks (see the sketch after this list)

• Implementing big data pipelines for data ingestion, storage, processing, and consumption

• Modifying Informatica-Teradata and Unix-based data pipelines

• Enhancing Talend-Hive/Spark and Unix-based data pipelines

• Developing and deploying Scala/Python-based Spark jobs for ETL processing

• Applying strong SQL and DWH concepts
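
As a rough illustration of the Spark-based ETL work described above, here is a minimal PySpark job sketch; the input path, column names, and output table are invented for the example.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-etl")
    .enableHiveSupport()  # write to Hive/HCatalog-managed tables
    .getOrCreate()
)

# Extract: semi-structured JSON landed by the ingestion layer.
orders = spark.read.json("hdfs:///landing/orders/")

# Transform: cleanse and aggregate to a daily grain.
daily = (
    orders.filter(F.col("status") == "COMPLETE")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

# Load: store as Parquet in a Hive table for consumption.
daily.write.mode("overwrite").format("parquet").saveAsTable("analytics.daily_orders")

spark.stop()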

 

Preferred Background:

 

• Function as an integrator between business needs and technology, helping to create technology solutions that meet clients' business needs

• Lead project efforts in defining scope, planning, executing, and reporting to stakeholders on strategic initiatives

• Understanding of the business's EDW system, and creating high-level design documents and low-level implementation documents

• Understanding of the business's Big Data Lake system, and creating high-level design documents and low-level implementation documents

• Designing big data pipelines for data ingestion, storage, processing, and consumption

AArete Technosoft Pvt Ltd
Pune
7 - 12 yrs
₹25L - ₹30L / yr
Snowflake
Snowflake schema
ETL
Data Warehouse (DWH)
Python
+8 more
Help us modernize our data platforms, with a specific focus on Snowflake
• Work with various stakeholders, understand requirements, and build solutions/data pipelines that address the needs at scale
• Bring key workloads to the clients' Snowflake environment using scalable, reusable data ingestion and processing frameworks to transform a variety of datasets
• Apply best practices for Snowflake architecture, ELT, and data models
Skills (at least 50% of the below):
• A passion for all things data; understanding how to work with it at scale and, more importantly, knowing how to get the most out of it
• Good understanding of native Snowflake capabilities like data ingestion, data sharing, zero-copy cloning, tasks, Snowpipe, etc. (see the sketch after this list)
• Expertise in data modeling, with a good understanding of modeling approaches like Star schema and/or Data Vault
• Experience in automating deployments
• Experience writing code in Python, Scala, Java, or PHP
• Experience in ETL/ELT, either via a code-first approach or using low-code tools like AWS Glue, AppFlow, Informatica, Talend, Matillion, Fivetran, etc.
• Experience in one or more AWS services, especially in relation to integration with Snowflake
• Familiarity with data visualization tools like Tableau, Power BI, Domo, or similar
• Experience with data virtualization tools like Trino, Starburst, Denodo, Data Virtuality, Dremio, etc.
• SnowPro Advanced: Data Engineer certification is a must.
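
To make the "native Snowflake capabilities" bullet concrete, here is an illustrative sketch (not a prescribed implementation) of zero-copy cloning, Snowpipe, and tasks, issued through the Python connector; every object name, schedule, and credential below is hypothetical.

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password"
)
cur = conn.cursor()

# Zero-copy clone: an instant, storage-free copy for dev/test work.
cur.execute("CREATE DATABASE analytics_dev CLONE analytics")

# Snowpipe: continuous ingestion from a stage as files arrive.
cur.execute("""
    CREATE PIPE raw.events_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw.events FROM @raw.events_stage
    FILE_FORMAT = (TYPE = JSON)
""")

# Task: a scheduled ELT step that maintains a star-schema fact table.
cur.execute("""
    CREATE TASK build_fact_sales
    WAREHOUSE = transform_wh
    SCHEDULE = '60 MINUTE'
    AS INSERT INTO marts.fact_sales
       SELECT * FROM raw.sales_staging
""")
cur.close()
conn.close()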
Tredence
Posted by Rohit S
Chennai, Pune, Bengaluru (Bangalore), Gurugram
11 - 16 yrs
₹20L - ₹32L / yr
Data Warehouse (DWH)
Google Cloud Platform (GCP)
Amazon Web Services (AWS)
Data engineering
Data migration
+1 more
• Engages with the leadership of Tredence's clients to identify critical business problems, define the need for data engineering solutions, and build the strategy and roadmap
• Possesses wide exposure to the complete lifecycle of data, from creation to consumption
• Has in the past built repeatable tools/data models to solve specific business problems
• Should have hands-on experience of having worked on projects (either as a consultant or within a company) that needed them to:
o Provide consultation to senior client personnel
o Implement and enhance data warehouses or data lakes
o Work with business teams, or as part of the team, on process re-engineering driven by data analytics/insights
• Should have a deep appreciation of how data can be used in decision-making
• Should have perspective on newer ways of solving business problems, e.g. external data, innovative techniques, newer technology
• Must have a solution-creation mindset and the ability to design and enhance scalable data platforms to address business needs
• Working experience with data engineering tools for one or more cloud platforms - Snowflake, AWS/Azure/GCP
• Engage with technology teams from Tredence and clients to create last-mile connectivity for the solutions
o Should have experience of working with technology teams
• Demonstrated ability in thought leadership - articles/white papers/interviews

Mandatory Skills: Program Management, Data Warehouse, Data Lake, Analytics, Cloud Platform
XpressBees
Posted by Alfiya Khan
Pune, Bengaluru (Bangalore)
6 - 8 yrs
₹15L - ₹25L / yr
Big Data
Data Warehouse (DWH)
Data modeling
Apache Spark
Data integration
+10 more
Company Profile
XpressBees – a logistics company started in 2015 – is amongst the fastest-growing companies in its sector. While we started off rather humbly in the space of e-commerce B2C logistics, the last 5 years have seen us steadily progress towards expanding our presence. Our vision to evolve into a strong full-service logistics organization reflects itself in our new lines of business like 3PL, B2B Xpress, and cross-border operations. Our strong domain expertise and constant focus on meaningful innovation have helped us rapidly evolve as the most trusted logistics partner of India. We have progressively carved our way towards best-in-class technology platforms, an extensive network reach, and a seamless last-mile management system. While on this aggressive growth path, we seek to become the one-stop shop for end-to-end logistics solutions. Our big focus areas for the very near future include strengthening our presence as a service provider of choice and leveraging the power of technology to improve efficiencies for our clients.

Job Profile
As a Lead Data Engineer in the Data Platform Team at XpressBees, you will build the data platform and infrastructure to support high-quality and agile decision-making in our supply chain and logistics workflows. You will define the way we collect and operationalize data (structured/unstructured), and build production pipelines for our machine learning models and (RT, NRT, batch) reporting and dashboarding requirements. You will use your experience with modern cloud and data frameworks to build products (with storage and serving systems) that drive optimisation and resilience in the supply chain via data visibility, intelligent decision-making, insights, anomaly detection, and prediction.

What You Will Do
• Design and develop the data platform and data pipelines for reporting, dashboarding, and machine learning models. These pipelines would productionize machine learning models and integrate with agent review tools.
• Meet data completeness, correctness, and freshness requirements.
• Evaluate and identify data store and data streaming technology choices.
• Lead the design of the logical model and implement the physical model to support business needs. Come up with logical and physical database designs across platforms (MPP, MR, Hive/Pig) that are optimal for different use cases (structured/semi-structured). Envision and implement the optimal data modelling, physical design, and performance optimization technique/approach required for the problem.
• Support your colleagues by reviewing code and designs.
• Diagnose and solve issues in our existing data pipelines, and envision and build their successors.

Qualifications & Experience relevant for the role

• A bachelor's degree in Computer Science or a related field, with 6 to 9 years of technology experience.
• Knowledge of relational and NoSQL data stores, stream processing, and micro-batching, to make technology and design choices.
• Strong experience in system integration, application development, ETL, and data-platform projects. Talented across technologies used in the enterprise space.
• Software development experience, including:
• Expertise in relational and dimensional modelling
• Exposure across all of the SDLC process
• Experience in cloud architecture (AWS)
• Proven track record in keeping existing technical skills current and developing new ones, so that you can make strong contributions to deep architecture discussions around systems and applications in the cloud (AWS).
• Characteristics of a forward thinker and self-starter who flourishes with new challenges and adapts quickly to learning new knowledge
• Ability to work with cross-functional teams of consulting professionals across multiple projects.
• Knack for helping an organization to understand application architectures and integration approaches, to architect advanced cloud-based solutions, and to help launch the build-out of those systems
• Passion for educating, training, designing, and building end-to-end systems.
GradMener Technology Pvt. Ltd.
Pune, Chennai
5 - 9 yrs
₹15L - ₹20L / yr
Scala
PySpark
Spark
SQL Azure
Hadoop
+4 more
  • 5+ years of experience in a Data Engineering role in a cloud environment
  • Must have good experience in Scala/PySpark (preferably in a Databricks environment)
  • Extensive experience with Transact-SQL
  • Experience in Databricks/Spark
  • Strong experience in data warehouse projects
  • Expertise in database development projects with ETL processes
  • Manage and maintain data engineering pipelines
  • Develop batch processing, streaming, and integration solutions (see the sketch after this list)
  • Experienced in building and operationalizing large-scale enterprise data solutions and applications
  • Using one or more Azure data and analytics services in combination with custom solutions
  • Azure Data Lake, Azure SQL DW (Synapse), and SQL Database products, or equivalent products from other cloud service providers
  • In-depth understanding of data management (e.g. permissions, security, and monitoring)
  • Cloud repositories, e.g. Azure, GitHub, Git
  • Experience in an agile environment (preferably Azure DevOps)
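
A minimal sketch of the batch, streaming, and Delta work the list mentions, in PySpark as it might run on Databricks; the storage paths, schema, and table names are assumptions for illustration only.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("shipments-pipeline").getOrCreate()

# Batch: load a dimension from the data lake into a Delta table.
carriers = spark.read.parquet("abfss://lake@account.dfs.core.windows.net/raw/carriers/")
carriers.write.format("delta").mode("overwrite").saveAsTable("dims.carriers")

# Streaming: micro-batch ingestion of shipment events into Delta.
events = (
    spark.readStream
    .format("json")
    .schema("shipment_id STRING, status STRING, event_time TIMESTAMP")
    .load("abfss://lake@account.dfs.core.windows.net/landing/events/")
)

query = (
    events.withColumn("ingest_date", F.to_date("event_time"))
    .writeStream
    .format("delta")
    .option("checkpointLocation", "/checkpoints/events")
    .outputMode("append")
    .toTable("facts.shipment_events")
)
query.awaitTermination()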

Good to have

  • Manage source data access security
  • Automate Azure Data Factory pipelines
  • Continuous Integration/Continuous Deployment (CI/CD) pipelines and source repositories
  • Experience in implementing and maintaining CI/CD pipelines
  • Power BI understanding; Delta Lakehouse architecture
  • Knowledge of software development best practices
  • Excellent analytical and organization skills
  • Effective working in a team as well as independently
  • Strong written and verbal communication skills
  • Expertise in database development projects and ETL processes
Impetus
Agency job via Impetus by Gangadhar TM
Bengaluru (Bangalore), Pune, Hyderabad, Indore, Noida, Gurugram
10 - 16 yrs
₹30L - ₹50L / yr
Big Data
Data Warehouse (DWH)
Product Management

Job Title: Product Manager

 

Job Description

Bachelor's or master's degree in Computer Science, or equivalent experience.
Has worked as a Product Owner before and taken responsibility for product or project delivery.
Well-versed with data warehouse modernization to Big Data and Cloud environments.
Good knowledge* of any of the Clouds (AWS/Azure/GCP) – Must Have
Practical experience with continuous integration and continuous delivery workflows.
Self-motivated, with strong organizational/prioritization skills and the ability to multi-task with close attention to detail.
Good communication skills.
Experience in working within a distributed agile team.
Experience in handling migration projects – Good to Have
 

*Data Ingestion, Processing, and Orchestration knowledge

 

Roles & Responsibilities


Responsible for coming up with innovative and novel ideas for the product.
Define product releases, features, and the roadmap.
Collaborate with product teams on defining product objectives, including creating a product roadmap, delivery, market research, customer feedback, and stakeholder inputs.
Work with the Engineering teams to communicate release goals and be a part of the product lifecycle. Work closely with the UX and UI team to create the best user experience for the end customer.
Work with the Marketing team to define GTM activities.
Interface with Sales & Customer teams to identify customer needs and product gaps.
Carry out market and competition analysis activities.
Participate in the Agile ceremonies with the team; define epics, user stories, and acceptance criteria.
Ensure product usability from the end-user perspective.

 

Mandatory Skills

Product Management, DWH, Big Data

Accion Labs
Posted by Anjali Mohandas
Remote, Bengaluru (Bangalore), Pune, Hyderabad, Mumbai
4 - 8 yrs
₹15L - ₹28L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+5 more

4-6 years of total experience in data warehousing and business intelligence

3+ years of solid Power BI experience (Power Query, M-Query, DAX, Aggregates)

2 years’ experience building Power BI using cloud data (Snowflake, Azure Synapse, SQL DB, data lake)

Strong experience building visually appealing UI/UX in Power BI

Understand how to design Power BI solutions for performance (composite models, incremental refresh, analysis services)

Experience building Power BI using large data in direct query mode

Expert SQL background (query building, stored procedures, optimizing performance)

Impetus Technologies
Posted by Gangadhar T.M
Bengaluru (Bangalore), Hyderabad, Pune, Indore, Gurugram, Noida
10 - 17 yrs
₹25L - ₹50L / yr
Product Management
Big Data
Data Warehouse (DWH)
ETL
Hi all,
Greetings! We are looking for a Product Manager for our data modernization product. We need a resource with good knowledge of Big Data/DWH, strong stakeholder management, and presentation skills.
JetSynthesys Pvt. Ltd.
Agency job via Jobdost by Mamatha A
Remote, Pune
5 - 7 yrs
₹12L - ₹16L / yr
Javascript
NodeJS (Node.js)
PHP
SQL
C#
+12 more
A leading marketing software company provides mobile app developers a powerful set of solutions to grow their mobile apps. Its technology platform enables developers to market, monetize, analyze, and publish their apps. The company's first-party content includes over 200 popular, engaging apps, and its technology brings that content to millions of users around the world. The company is headquartered in Palo Alto, California, with several offices globally.
 
The company is a Certified Great Place to Work, one of Inc.'s Best Workplaces, and a recipient of the 2019 Glassdoor Top CEO employees' choice award. The San Francisco Business Times and Silicon Valley Business Journal awarded the company one of the Bay Area's Best Places to Work in 2019, 2020, and 2021, and the Workplace Wellness Award in 2019, which recognizes businesses that are leaders in improving worker well-being.
 
We are looking for a Senior Full Stack Engineer to join our team! As a Senior Full Stack Engineer, you will lead our dashboards and backend engineering team and help oversee our AWS Cloud infrastructure. You will help build marketing tools to help our growth team, develop our web portal to interface with third-party developers and support backend services and data pipelines for our games. If you are a driven, results-focused engineer looking to work with a fun, collaborative team - we’d love to talk to you!
 
Location / time zone preferences: Mostly overlap with West Coast / PST time zone
Language preferences: English B1-B2 or above
 

Responsibilities:

    • Take the lead in building tools to increase the productivity of our business and product teams
    • Build a client-facing portal to support the submission and integration of games from external developers
    • Collaborate with teams in a range of disciplines
    • Clearly communicate challenges and progress to stakeholders
    • Adopt and learn new technologies

Basic Qualifications:

    • 5+ years of professional experience in software development and a BS or MS in Computer Science or a related field
    • Solid understanding of Javascript, NodeJS, PHP, SQL, C#
    • Strong knowledge of AWS Cloud architecture, services, and DevOps
    • Adhere to software design patterns and have knowledge of algorithms
    • Experience with databases and database programming (MySQL, NoSQL, etc.)
    • Comfortable understanding and implementing REST APIs, knowledge of AJAX patterns and principles
    • In-depth knowledge of modern HTML/CSS
    • Strong understanding of web architecture, security, cookies, reverse-proxies
    • Have a solid knowledge of web debugging tools (Firebug or Chrome Developer Console)

Pluses:

    • Bonus points for data warehouse experience (Snowflake, Redshift)
    • Experience in game programming and Unity development
    • Knowledge of unit testing and test driven development
    • A passion for games (of any type) as well as a passion for code
    • Knowledge of mobile gaming metrics and the mobile gaming industry

Perks:

    • Free medical, dental, and vision insurance
    • Work from home stipend on each paycheck
    • Competitive Salary
    • Flexible Time Off - work hard and take time when you need it
    • Interested? Send us your resume and let's talk!
The company is an equal opportunity employer and considers qualified applicants without regard to race, gender, sexual orientation, gender identity or expression, genetic information, national origin, age, disability, medical condition, religion, marital status or veteran status, or any other basis protected by law.
Mobile Programming India Pvt Ltd
Posted by Inderjit Kaur
Bengaluru (Bangalore), Chennai, Pune, Gurugram
4 - 8 yrs
₹9L - ₹14L / yr
ETL
Data Warehouse (DWH)
Data engineering
Data modeling
BRIEF JOB RESPONSIBILITIES:

• Responsible for designing, deploying, and maintaining an analytics environment that processes data at scale
• Contribute design, configuration, deployment, and documentation for components that manage data ingestion, real-time streaming, batch processing, data extraction, transformation, enrichment, and loading of data into a variety of cloud data platforms, including AWS and Microsoft Azure
• Identify gaps in, and improve, the existing platform to improve quality, robustness, maintainability, and speed
• Evaluate new and upcoming big data solutions and make recommendations for adoption, extending our platform to meet advanced analytics use cases such as predictive modeling and recommendation engines
• Data modelling and data warehousing at cloud scale using cloud-native solutions
• Perform development, QA, and dev-ops roles as needed to ensure total end-to-end responsibility for solutions

COMPETENCIES
• Experience building, maintaining, and improving data models, processing pipelines, and routing in large-scale environments
• Fluency in common query languages, API development, data transformation, and integration of data streams
• Strong experience with large-dataset platforms (e.g. Amazon EMR, Amazon Redshift, AWS Lambda & Fargate, Amazon Athena, Azure SQL Database, Azure Database for PostgreSQL, Azure Cosmos DB, Databricks)
• Fluency in multiple programming languages, such as Python, shell scripting, SQL, Java, or similar languages and tools appropriate for large-scale data processing
• Experience with acquiring data from varied sources such as APIs, data queues, flat files, and remote databases (see the sketch after this list)
• Understanding of traditional Data Warehouse components (e.g. ETL, Business Intelligence tools)
• Creativity to go beyond current tools to deliver the best solution to the problem
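
As one hedged example of the "varied sources" bullet, the sketch below pulls pages from a REST API with requests and stages them in S3 with boto3 for later warehouse loading; the endpoint, bucket, and key layout are made up for illustration.

import json

import boto3
import requests

API_URL = "https://api.example.com/v1/shipments"  # hypothetical source API
BUCKET = "analytics-landing"                      # hypothetical S3 bucket

def ingest(page_limit: int = 10) -> None:
    s3 = boto3.client("s3")
    page = 1
    while page <= page_limit:
        resp = requests.get(API_URL, params={"page": page}, timeout=30)
        resp.raise_for_status()
        records = resp.json()
        if not records:  # stop when the API runs out of pages
            break
        s3.put_object(
            Bucket=BUCKET,
            Key=f"shipments/page={page:05d}.json",
            Body=json.dumps(records).encode("utf-8"),
        )
        page += 1

if __name__ == "__main__":
    ingest()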
Gurugram, Pune, Mumbai, Bengaluru (Bangalore), Chennai, Nashik
4 - 12 yrs
₹12L - ₹15L / yr
Data engineering
Data modeling
data pipeline
Data integration
Data Warehouse (DWH)
+12 more

 

 

Designation: Deputy Manager - TS


Job Description

  1. Total of 8/9 years of development experience in Data Engineering (B1/BII role).
  2. Minimum of 4/5 years in AWS data integrations, with very good data modelling skills.
  3. Should be very proficient in end-to-end AWS data solution design, which includes not only strong data ingestion and integration skills (both data at rest and data in motion) but also complete DevOps knowledge.
  4. Should have experience delivering at least 4 Data Warehouse or Data Lake solutions on AWS.
  5. Should have very strong experience with Glue, Lambda, Data Pipeline, Step Functions, RDS, CloudFormation, etc. (see the sketch after this list).
  6. Strong Python skills.
  7. Should be an expert in cloud design principles, performance tuning, and cost modelling. AWS certifications will be an added advantage.
  8. Should be a team player with excellent communication, able to manage their work independently with minimal or no supervision.
  9. Life Science & Healthcare domain background will be a plus.
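
To make point 5 concrete, here is a hedged sketch of a small AWS Glue PySpark job; the catalog database, table, and S3 path are placeholders rather than details from the role.

import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (e.g. populated by a crawler).
frame = glue_context.create_dynamic_frame.from_catalog(
    database="clinical_raw", table_name="lab_results"
)

# Light transformation, then write curated Parquet back to the lake.
curated = frame.drop_fields(["_corrupt_record"])
glue_context.write_dynamic_frame.from_options(
    frame=curated,
    connection_type="s3",
    connection_options={"path": "s3://curated-zone/lab_results/"},
    format="parquet",
)
job.commit()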

Qualifications

BE/BTech/ME/MTech

 

DataMetica
Posted by Sayali Kachi
Pune, Hyderabad
6 - 12 yrs
₹11L - ₹25L / yr
PL/SQL
MySQL
SQL server
SQL
Linux/Unix
+4 more

We at Datametica Solutions Private Limited are looking for an SQL Lead / Architect who has a passion for the cloud, with knowledge of different on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.

Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.



Job Description:

Experience: 6+ Years

Work Location: Pune / Hyderabad



Technical Skills:

  • Good programming experience as an Oracle PL/SQL, MySQL, and SQL Server developer
  • Knowledge of database performance tuning techniques
  • Rich experience in database development
  • Experience in designing and implementing business applications using the Oracle Relational Database Management System
  • Experience in developing complex database objects like stored procedures, functions, packages, and triggers using SQL and PL/SQL (see the sketch after this list)
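
As an illustrative sketch of the last bullet, the snippet below creates and calls a simple PL/SQL stored procedure from Python using the python-oracledb driver; the connection details, table, and procedure are hypothetical.

import oracledb

conn = oracledb.connect(user="app", password="secret",
                        dsn="dbhost:1521/ORCLPDB1")
cur = conn.cursor()

# A small PL/SQL procedure of the kind the role develops.
cur.execute("""
    CREATE OR REPLACE PROCEDURE raise_salary (
        p_emp_id IN employees.employee_id%TYPE,
        p_pct    IN NUMBER
    ) AS
    BEGIN
        UPDATE employees
           SET salary = salary * (1 + p_pct / 100)
         WHERE employee_id = p_emp_id;
    END;
""")

# Invoke it and persist the change.
cur.callproc("raise_salary", [101, 7.5])
conn.commit()
cur.close()
conn.close()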

Required Candidate Profile :

  • Excellent communication, interpersonal, and analytical skills, and a strong ability to drive teams
  • Analyzes data requirements and data dictionaries for moderate to complex projects
  • Leads data-model-related analysis discussions while collaborating with Application Development teams, Business Analysts, and Data Analysts during joint requirements analysis sessions
  • Translates business requirements into technical specifications with an emphasis on highly available and scalable global solutions
  • Stakeholder management and client engagement skills
  • Strong communication skills (written and verbal)

About Us!

A global leader in the Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.

We have expertise in transforming legacy platforms such as Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum, along with ETL tools like Informatica, DataStage, AbInitio, and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes, and cloud optimization.

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

We have our own products!

Eagle Data warehouse Assessment & Migration Planning Product

Raven Automated Workload Conversion Product

Pelican Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.



Why join us!

Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over these years are the key factors in achieving our success.



Benefits we Provide!

Working with highly technical, passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy



Check out more about us on our website below!

www.datametica.com

DataMetica
Posted by Shivani Mahale
Pune
4 - 7 yrs
₹5L - ₹15L / yr
ETL
Informatica PowerCenter
Teradata
Data Warehouse (DWH)
IBM InfoSphere DataStage
Requirements:
  • Must have 4 to 7 years of experience in ETL design and development using Informatica components.
  • Should have extensive knowledge of Unix shell scripting.
  • Understanding of DW principles (fact and dimension tables, dimensional modelling, and data warehousing concepts).
  • Research, develop, document, and modify ETL processes as per data architecture and modeling requirements.
  • Ensure appropriate documentation for all new development and modifications of ETL processes and jobs.
  • Should be good at writing complex SQL queries.
Opportunities:
  • Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka
  • Would get a chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
  • Will play an active role in setting up the modern data platform based on Cloud and Big Data
  • Would be part of teams with rich experience in various aspects of distributed systems and computing.
DataMetica
Posted by Nikita Aher
Pune
2 - 6 yrs
₹3L - ₹15L / yr
SQL
Linux/Unix
Shell Scripting
SQL server
PL/SQL
+3 more

Datametica is looking for talented SQL engineers who would get training & the opportunity to work on Cloud and Big Data Analytics.

 

Mandatory Skills:

  • Strong in SQL development
  • Hands-on with at least one scripting language, preferably shell scripting
  • Development experience in data warehouse projects

Opportunities:

  • Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka
  • Would get a chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
  • Will play an active role in setting up the Modern data platform based on Cloud and Big Data
  • Would be part of teams with rich experience in various aspects of distributed systems and computing
DataMetica
Posted by Nikita Aher
Pune
12 - 20 yrs
₹20L - ₹35L / yr
Data Warehouse (DWH)
ETL
Big Data
Business Intelligence (BI)
Project Management
+1 more

Job Description

Experience: 10+ years

Location: Pune


Job Requirements:

  • Minimum of 10+ years of experience, with a proven record of increased responsibility
  • Hands-on experience in designing, developing, and managing Big Data, Cloud, Data Warehousing, and Business Intelligence projects
  • Experience of managing projects in Big Data, Cloud, Data Warehousing, and Business Intelligence using open-source or top-of-the-line tools and technologies
  • Good knowledge of dimensional modeling
  • Experience of working with any ETL and BI reporting tools
  • Experience of managing medium to large projects, preferably on Big Data
  • Proven experience in project planning, estimation, execution, and implementation of medium to large projects
  • Should be able to communicate effectively in English
  • Strong management and leadership skills, with a proven ability to develop and manage client relationships
  • Proven problem-solving skills from both technical and managerial perspectives
  • Attention to detail and a commitment to excellence and high standards
  • Excellent interpersonal and communication skills, both verbal and written
  • The position is remote, with occasional travel to other offices, client sites, conventions, training locations, etc.
  • Bachelor's degree in Computer Science, Business/Economics, or a related field, or demonstrated equivalent/practical knowledge or experience

Job Responsibilities:

  • Day-to-day project management, scrum, and agile management, including project planning, delivery, and execution of Big Data and Cloud projects
  • Primary point of contact for the customer for all project engagements, delivery, and project escalations
  • Design the right architecture and technology stack depending on business requirements, on Cloud / Big Data and BI related technologies, both on-premise and on cloud
  • Liaise with key stakeholders to define the Cloud / Big Data solutions roadmap and prioritize the deliverables
  • Responsible for end-to-end project delivery of Cloud / Big Data solutions, from project estimation, project planning, and resourcing to monitoring
  • Drive and participate in requirements gathering workshops, estimation discussions, design meetings, and status review meetings
  • Support and assist the team in resolving issues during testing and when the system is in production
  • Involved in the full customer lifecycle, with a goal to make customers successful and increase revenue and retention
  • Interface with the offshore engineering team to solve customer issues
  • Develop programs that meet customer needs with respect to functionality, performance, scalability, reliability, schedule, principles, and recognized industry standards
  • Requirement analysis and documentation
  • Manage day-to-day operational aspects of a project and scope
  • Prepare for engagement reviews and quality assurance procedures
  • Visit and/or host clients to strengthen business relationships
DataMetica
Posted by Nikita Aher
Pune
2.5 - 6 yrs
₹1L - ₹8L / yr
Big Data
Hadoop
Apache Hive
Spark
Data engineering
+3 more
Job Title/Designation: Big Data Engineers - Hadoop, Pig, Hive, Spark
Employment Type: Full Time, Permanent

Job Description:
 
Work Location - Pune
Work Experience - 2.5 to 6 Years
 
Note - Candidates with short notice periods will be given preference.
 
Mandatory Skills:
  • Working knowledge and hands-on experience of Big Data / Hadoop tools and technologies
  • Experience of working in Pig, Hive, Flume, Sqoop, Kafka, etc.
  • Database development experience with a solid understanding of core database concepts, relational database design, ODS, and DWH
  • Expert-level knowledge of SQL and scripting, preferably Unix shell scripting and Perl scripting
  • Working knowledge of data integration solutions; well-versed with any ETL tool (Informatica/DataStage/AbInitio/Pentaho, etc.)
  • Strong problem-solving and logical reasoning ability
  • Excellent understanding of all aspects of the Software Development Lifecycle
  • Excellent written and verbal communication skills
  • Experience in Java will be an added advantage
  • Knowledge of object-oriented programming concepts
  • Exposure to ISMS policies and procedures
EZEU (OPC) India Pvt Ltd
Posted by HR Ezeu
Pune, Bengaluru (Bangalore)
5 - 10 yrs
₹10L - ₹20L / yr
DevOps
CI/CD
Software deployment
Automation
Python
+16 more
What you will do
• Develop and maintain CI/CD tools to build and deploy scalable web and responsive applications in production environment
• Design and implement monitoring solutions that identify both system bottlenecks and production issues
• Design and implement workflows for continuous integration, including provisioning, deployment, testing, and version control of the software.
• Develop self-service solutions for the engineering team in order to deliver sites/software with great speed and quality
o Automating infrastructure creation
o Providing easy-to-use solutions to the engineering team
• Conduct research on, test, and implement new metrics-collection systems that can be reused and applied as engineering best practices
o Update our processes and design new processes as needed.
o Establish DevOps Engineer team best practices.
o Stay current with industry trends and source new ways for our business to improve.
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
• Manage timely resolution of all critical and/or complex problems
• Maintain, monitor, and establish best practices for containerized environments.
• Mentor new DevOps engineers
What you will bring
• The desire to work in a fast-paced environment.
• 5+ years’ experience building, maintaining, and deploying production infrastructures in AWS or other cloud providers
• Containerization experience with applications deployed on Docker and Kubernetes
• Understanding of NoSQL and Relational Database with respect to deployment and horizontal scalability
• Demonstrated knowledge of distributed and scalable systems
• Experience with maintaining and deploying critical infrastructure components through Infrastructure-as-Code and configuration management tooling across multiple environments (Ansible, Terraform, etc.)
• Strong knowledge of DevOps and CI/CD pipeline (GitHub, BitBucket, Artifactory etc)
• Strong understanding of cloud and infrastructure components (server, storage, network, data, and applications) to deliver end-to-end cloud Infrastructure architectures and designs and recommendations
o AWS services like S3, CloudFront, Kubernetes, RDS, Data Warehouses to come up with architecture/suggestions for new use cases.
• Test our system integrity, implemented designs, application developments and other processes related to infrastructure, making improvements as needed
Good to have
• Experience with code quality tools, static or dynamic code analysis, and compliance, and undertaking and resolving issues identified by vulnerability and compliance scans of our infrastructure
• Good knowledge of REST/SOAP/JSON web service API implementation
IQVIA
Posted by Nishigandha Wagh
Pune
3 - 6 yrs
₹5L - ₹15L / yr
Data Warehouse (DWH)
Business Intelligence (BI)
Amazon Web Services (AWS)
SQL
MDM
+1 more
Consultants will have the opportunity to:
- Build a team with skills in ETL, reporting, MDM, and ad-hoc analytics support
- Build technical solutions using the latest open-source and cloud-based technologies
- Work closely with offshore senior consultants, the onshore team, and the client's business and IT teams to gather project requirements
- Assist overall project execution from India - starting from project planning, team formation, system design and development, testing, UAT, and deployment
- Build demos and POCs in support of business development for new and existing clients
- Prepare project documents and PowerPoint presentations for client communication
- Conduct training sessions to train associates and help shape their growth