
Data Warehouse (DWH) Jobs

Explore top Data Warehouse (DWH) job opportunities at top companies and startups. All jobs are added by verified employees, who can be contacted directly.

Senior Consultant

via IQVIA
Founded 1969
Company type: Products and services
Location: Pune
Experience: 5 - 9 years
Salary: ₹8,00,000 – ₹16,00,000

Key Responsibilities
- Work with India and US managers to design end-to-end technology solutions in the DWH/BI space
- Work with the India manager to manage overall project delivery, and lead project planning, system design and development, testing, UAT, and deployment activities
- Work closely with the onshore team and the client's business and IT teams to gather project requirements
- Develop client relationships and serve as the primary contact for all project-related communications
- Build technical solutions using the latest open-source and cloud-based technologies, such as AWS Redshift, RDS, Glue, and Apache Airflow
- Build demos and POCs in support of business development for new and existing clients
- Lead the creation of PowerPoint slides and online visualizations (e.g., Tableau, Qlik, Sisense) to communicate findings
- Work with the India manager to build and grow a team of analysts and consultants with expertise in ETL, BI reporting, Python, and analytics support
- Mentor a team of 5 to 8 consultants/analysts on an ongoing basis
- Conduct training sessions to train analysts and help shape their growth
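The responsibilities above centre on building ETL flows into a warehouse. As a rough sketch of that extract-transform-load pattern — not code from the posting; the `stg_sales` table name and the in-memory sqlite3 store are illustrative assumptions standing in for a real warehouse:

```python
import sqlite3

def extract(rows):
    """Extract: in a real pipeline this would read from an RDBMS, S3, or an API."""
    return rows

def transform(rows):
    """Transform: normalise customer names and keep only positive amounts."""
    return [(name.strip().title(), amount) for name, amount in rows if amount > 0]

def load(conn, rows):
    """Load: write the cleaned rows into a warehouse staging table."""
    conn.execute("CREATE TABLE IF NOT EXISTS stg_sales (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO stg_sales VALUES (?, ?)", rows)
    conn.commit()

# Run the pipeline against an in-memory database for demonstration.
conn = sqlite3.connect(":memory:")
raw = [(" alice ", 120.0), ("BOB", -5.0), ("carol", 80.0)]
load(conn, transform(extract(raw)))
total = conn.execute("SELECT SUM(amount) FROM stg_sales").fetchone()[0]
print(total)  # 200.0
```

In production the same three stages would typically be scheduled as tasks in an orchestrator such as Apache Airflow, which the listing names.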

Job posted by Nishigandha Wagh
Apply for job

Consultant

via IQVIA
Founded 1969
Company type: Products and services
Location: Pune
Experience: 3 - 6 years
Salary: ₹5,00,000 – ₹15,00,000

Consultants will have the opportunity to:
- Build a team with skills in ETL, reporting, MDM, and ad-hoc analytics support
- Build technical solutions using the latest open-source and cloud-based technologies
- Work closely with the offshore senior consultant, the onshore team, and the client's business and IT teams to gather project requirements
- Assist overall project execution from India, from project planning and team formation through system design and development, testing, UAT, and deployment
- Build demos and POCs in support of business development for new and existing clients
- Prepare project documents and PowerPoint presentations for client communication
- Conduct training sessions to train associates and help shape their growth

Job posted by Nishigandha Wagh
Apply for job

Senior Data Engineer

via upGrad
Founded 2015
Company type: Products and services
Location: Remote, Mumbai, Bengaluru (Bangalore)
Experience: 6 - 10 years
Salary: ₹20,00,000 – ₹30,00,000

About the Role
We are looking for a Data Engineer to help us scale the existing data infrastructure and, in parallel, build the next-generation data platform for analytics at scale, machine-learning infrastructure, and data validation systems. In this role, you will be responsible for communicating effectively with data consumers to fine-tune data platform systems (existing or new), taking ownership of delivering high-performing systems and data pipelines, and helping the team scale them to endure ever-growing traffic. This is a growing team, which makes for many opportunities to be involved directly with product management, development, sales, and support teams. Everybody on the team is passionate about their work, and we're looking for similarly motivated "get stuff done" people to join us!

Roles & Responsibilities
- Engineer data pipelines (batch and real-time) that aid in the creation of data-driven products for our platform
- Design, develop, and maintain a robust and scalable data warehouse and data lake
- Work closely with product managers and data scientists to bring the various datasets together and cater to our business-intelligence and analytics use cases
- Design and develop solutions using data science techniques, ranging from statistics and algorithms to machine learning
- Perform hands-on DevOps work to keep the data platform secure and reliable

Skills Required
- Bachelor's degree in Computer Science, Information Systems, or a related engineering discipline
- 6+ years' experience with ETL, data mining, data modeling, and working with large-scale datasets
- 6+ years' experience with an object-oriented programming language such as Python, Scala, or Java
- Extremely proficient at writing performant SQL against large data volumes
- Experience with MapReduce, Spark, Kafka, Presto, and their ecosystem
- Experience building automated analytical systems utilizing large data sets
- Experience designing, scaling, and optimizing cloud-based data warehouses (like AWS Redshift) and data lakes
- Familiarity with AWS technologies preferred

Qualification: B.Tech/M.Tech/MCA (IT/Computer Science)
Years of experience: 6 - 9

Job posted by Omkar Pradhan
Apply for job

Data Engineer for Thane, Mumbai

Founded 2010
Company type: Products and services
Location: Mumbai, Thane
Experience: 3 - 7 years
Salary: ₹3,00,000 – ₹6,00,000

Vervali is seeking a Data Engineer for Thane, Mumbai.

Salary: 20 to 30% hike
Notice period: Immediate, or 10 to 20 days
Qualification: BE/B.Tech in Computer Science/Information Technology
Relevant experience: 4-5 years of experience with programming and data-transformation tools for ETL

Key Responsibilities:
- Data warehouse development and ETL design
- Program using Python and R technologies
- Build and maintain SQL procedures and ETL processes
- Design and implement a technical vision for client projects

Must have:
- Qualified individuals possess the attributes of being smart, curious, committed to vision, passionate, fun/pleasant, an achiever, and having a sense of urgency
- 5+ years of data-focused software development and design experience
- 3+ years of experience designing and developing ETL solutions using Informatica PowerCenter 9.x, Matillion, or Talend
- Experience designing and developing database solutions using SQL Server and/or cloud database solutions (Hadoop, Redshift, Snowflake, BigQuery, MySQL, etc.)
- 3+ years of experience with Python and Bash shell scripting; experience with cloud technologies (AWS, GCP, Azure) is a plus

Thanks & regards,
Darshit Mandavia

Job posted by Darshit Mandavia
Apply for job

Software Architect/Solution Architect/CTO

Founded 2010
Company type: Products and services
Location: Mumbai
Experience: 7 - 10 years
Salary: ₹12,00,000 – ₹20,00,000

- At least 8 years' combined experience in Information Technology, consulting, the insurance industry, or financial services at a mid-to-senior management level
- At least 1-2 years' combined experience as a technical architect
- Insurance-industry product knowledge across general and commercial product lines is highly desirable
- Excellent analytical and documentation skills
- Strong programming experience with Node.js, Java, JavaScript, HTML, and CSS
- Front-end framework experience with Angular.js or React
- Ability to work with cloud providers like Azure/AWS
- Knowledge of container deployments like Docker
- Solid foundational concepts: algorithms, APIs, scaling, and performance
- Sturdy object-oriented design and programming skills
- Experience using and designing RESTful services
- Previous experience working in an agile environment is strongly encouraged
- Experience with database modelling (SQL and NoSQL), with DWH exposure
- Experience with analytics tools is an added advantage
- Significant experience developing scalable systems that serve millions of users
- Writing and executing unit test cases using JUnit
- App servers (JBoss/WebLogic/WebSphere)
- Eclipse/NetBeans/JBoss/IntelliJ developer studio
- SOA concepts and design patterns
- Rich internet applications using the JavaScript skill set
- Build management (Ant/Maven/Jenkins/Hudson)
- An inherent desire to learn and grow, and a passion to move things forward
- Above all else, you're a creative and innovative problem solver
- Strong organisational skills combined with the ability to multi-task, and excellent time-management skills

Job posted by Lithin Raj
Apply for job

Data Warehouse Developer

Founded 2009
Company type: Products and services
Location: Remote, NCR (Delhi | Gurgaon | Noida)
Experience: 3 - 12 years
Salary: ₹8,00,000 – ₹14,00,000

Responsible for planning, connecting, designing, scheduling, and deploying data warehouse systems. Develops, monitors, and maintains ETL processes, reporting applications, and data warehouse design.

Role and Responsibility
- Plan, create, coordinate, and deploy data warehouses
- Design the end-user interface
- Create best practices for data loading and extraction
- Develop data architecture, data modeling, and ETL mapping solutions within a structured data warehouse environment
- Develop reporting applications and maintain data warehouse consistency
- Facilitate requirements gathering using expert listening skills, and develop unique, simple solutions to meet the immediate and long-term needs of business customers
- Supervise design throughout the implementation process
- Design and build cubes while performing custom scripts
- Develop and implement ETL routines according to the DWH design and architecture
- Support the development and validation required through the lifecycle of the DWH and business-intelligence systems, maintain user connectivity, and provide adequate security for the data warehouse
- Monitor the performance and integrity of the DWH and BI systems; provide corrective and preventative maintenance as required
- Manage multiple projects at once

Desirable Skill Set
- Experience with technologies such as MySQL, MongoDB, and SQL Server 2008, as well as newer ones like SSIS and stored procedures
- Exceptional experience developing code, testing for quality assurance, administering RDBMSs, and monitoring databases
- High proficiency in dimensional modeling techniques and their applications
- Strong analytical, consultative, and communication skills, as well as the ability to exercise good judgment and work with both technical and business personnel
- Several years' working experience with Tableau, MicroStrategy, Information Builders, and other reporting and analytical tools
- Working knowledge of SAS and R code used in data processing and modeling tasks
- Strong experience with Hadoop, Impala, Pig, Hive, YARN, and other "big data" technologies such as AWS Redshift or Google BigQuery
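The dimensional-modeling proficiency asked for above usually means star schemas: a central fact table joined to small dimension tables. A minimal illustrative sketch follows — the table names and data are invented for the example, with sqlite3 standing in for a real warehouse:

```python
import sqlite3

# A toy star schema: one fact table keyed to two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales  (date_id INTEGER REFERENCES dim_date,
                          product_id INTEGER REFERENCES dim_product,
                          amount REAL);
INSERT INTO dim_date VALUES (1, 2020, 1), (2, 2020, 2);
INSERT INTO dim_product VALUES (10, 'Widget', 'Hardware'), (11, 'Gadget', 'Hardware');
INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 10, 75.0);
""")

# A typical rollup: revenue by month across the joined dimensions.
rows = conn.execute("""
    SELECT d.year, d.month, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_date d ON d.date_id = f.date_id
    GROUP BY d.year, d.month
    ORDER BY d.year, d.month
""").fetchall()
print(rows)  # [(2020, 1, 150.0), (2020, 2, 75.0)]
```

The same shape scales up: BI tools like Tableau or MicroStrategy issue exactly this kind of fact-to-dimension join when building reports.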

Job posted by harpreet kaur
Apply for job

Engineering Head

Founded 2019
Company type: Products and services
Location: Bengaluru (Bangalore)
Experience: 9 - 15 years
Salary: ₹50,00,000 – ₹70,00,000

Main responsibilities:
- Management of a growing technical team
- Continued technical-architecture design based on the product roadmap
- Annual performance reviews
- Work with DevOps to design and implement the product infrastructure

Strategic:
- Testing strategy
- Security policy
- Performance and performance-testing policy
- Logging policy

Experience:
- 9-15 years of experience, including managing teams of developers
- Technical and architectural expertise; you have evolved a growing code base, technology stack, and architecture over many years
- Have delivered distributed cloud applications
- Understand the value of high-quality code and can effectively manage technical debt
- Stakeholder management
- Work experience in consumer-focused early-stage (Series A, B) startups is a big plus

Other innate skills:
- A great motivator of people, able to lead by example
- Understand how to get the most out of people
- Delivery of products to tight deadlines, but with a focus on high-quality code
- Up-to-date knowledge of technical applications

Job posted by Jennifer Jocelyn
Apply for job

BI Developer (SQL writer for analytical queries)

at bipp
Founded 2017
Company type: Products and services
Location: Remote, Pune, Hyderabad, NCR (Delhi | Gurgaon | Noida), Bengaluru (Bangalore), Mumbai
Experience: 3 - 9 years
Salary: ₹4,00,000 – ₹8,00,000

Do NOT apply if you:
- Want to be a Power BI, Qlik, or Tableau-only developer
- Are a machine-learning aspirant
- Are a data scientist
- Want to write Python scripts
- Want to do AI
- Want to do 'big' data
- Want to do Hadoop
- Are a fresh graduate

Apply if you:
- Write SQL for complicated analytical queries
- Understand the client's existing business problem and can map their needs to the schema that they have
- Can neatly disassemble the problem into components and solve the needs using SQL
- Have worked on existing BI products

You will develop solutions with our exciting new BI product for our clients. You should be very experienced and comfortable with writing SQL against very complicated schemas to help answer business questions, and have an analytical thought process.
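For a sense of the "complicated analytical queries" this role revolves around, here is one common analytical pattern — a window function ranking rows within a partition. The `orders` table is invented for the example, and the snippet assumes an SQLite build of 3.25 or newer for window-function support:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (region TEXT, amount REAL);
INSERT INTO orders VALUES
  ('North', 100.0), ('North', 300.0), ('South', 200.0), ('South', 50.0);
""")

# Rank each order within its region by amount, largest first --
# a windowed query a BI developer would write for a "top N per group" report.
rows = conn.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY region, rnk
""").fetchall()
print(rows)
# [('North', 300.0, 1), ('North', 100.0, 2), ('South', 200.0, 1), ('South', 50.0, 2)]
```

Filtering on `rnk` (e.g. `WHERE rnk = 1` in an outer query) then answers questions like "largest order per region" without any procedural code.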

Job posted by Vish Josh
Apply for job

Data ETL Engineer

Founded 2013
Company type: Products and services
Location: Chennai
Experience: 1 - 3 years
Salary: ₹5,00,000 – ₹12,00,000

Responsibilities:
- Design and develop the ETL framework and data pipelines in Python 3
- Orchestrate complex data flows from various data sources (RDBMS, REST APIs, etc.) to the data warehouse and vice versa
- Develop app modules (in Django) for enhanced ETL monitoring
- Devise technical strategies for making data seamlessly available to the BI and Data Sciences teams
- Collaborate with engineering, marketing, sales, and finance teams across the organization, and help Chargebee develop complete data solutions
- Serve as a subject-matter expert for available data elements and analytic capabilities

Qualification:
- Expert programming skills with the ability to write clean and well-designed code
- Expertise in Python, with knowledge of at least one Python web framework
- Strong SQL knowledge and high proficiency in writing advanced SQL
- Hands-on experience modeling relational databases
- Experience integrating with third-party platforms is an added advantage
- Genuine curiosity, proven problem-solving ability, and a passion for programming and data
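To illustrate the kind of "ETL framework in Python" the responsibilities describe — purely a hedged sketch under assumed requirements, not Chargebee's actual design — a pipeline can be modelled as a chain of list-to-list steps:

```python
from typing import Callable, List

class Pipeline:
    """A minimal composable ETL pipeline: each step maps a list of records
    to a new list, and run() threads the data through every step in order."""

    def __init__(self) -> None:
        self.steps: List[Callable[[list], list]] = []

    def step(self, fn: Callable[[list], list]) -> "Pipeline":
        self.steps.append(fn)
        return self  # allow chaining: p.step(a).step(b)

    def run(self, records: list) -> list:
        for fn in self.steps:
            records = fn(records)
        return records

# Hypothetical steps for a billing feed: parse amounts, then drop refunds.
pipeline = (
    Pipeline()
    .step(lambda rs: [{**r, "amount": float(r["amount"])} for r in rs])
    .step(lambda rs: [r for r in rs if r["amount"] > 0])
)

result = pipeline.run([{"amount": "49.99"}, {"amount": "-10.00"}, {"amount": "99.00"}])
print(result)  # [{'amount': 49.99}, {'amount': 99.0}]
```

A real framework would add logging, retries, and incremental loads around the same core idea of small, independently testable steps.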

Job posted by Vinothini Sundaram
Apply for job

Data Engineer

Founded 2015
Company type: Products and services
Location: NCR (Delhi | Gurgaon | Noida), Mumbai
Experience: 2 - 7 years
Salary: ₹10,00,000 – ₹30,00,000

About the job:
- You will work with data scientists to architect, code, and deploy ML models
- You will solve problems of storing and analyzing large-scale data in milliseconds, and architect and develop data processing and warehouse systems
- You will code, drink, breathe, and live Python, sklearn, and pandas. It's good to have experience in these, but it's not a necessity as long as you're super comfortable in a language of your choice
- You will develop tools and products that give analysts ready access to the data

About you:
- Strong CS fundamentals
- You have strong experience working with production environments
- You write code that is clean, readable, and tested
- Instead of doing it a second time, you automate it
- You have worked with some of the commonly used databases and computing frameworks (PostgreSQL, S3, Hadoop, Hive, Presto, Spark, etc.)
- It will be great if you have a Kaggle or GitHub profile to share
- You are an expert in one or more programming languages (Python preferred); experience with Python-based application development and data science libraries is also good to have
- Ideally, you have 2+ years of experience in tech and/or data
- A degree in CS/Maths from a Tier-1 institute

Job posted by Pragya Singh
Apply for job

Data Engineer

Founded 2015
Company type: Products and services
Location: Mumbai
Experience: 1 - 5 years
Salary: ₹7,00,000 – ₹12,00,000

JOB DESCRIPTION:
We are looking for a Data Engineer with a solid background in scalable systems to work with our engineering team to improve and optimize our platform. You will have significant input into the team's architectural approach and execution. We are looking for a hands-on programmer who enjoys designing and optimizing data pipelines for large-scale data. This is NOT a "data scientist" role, so please don't apply if you're looking for that.

RESPONSIBILITIES:
1. Build, maintain, and test performant, scalable data pipelines
2. Work with data scientists and application developers to implement scalable pipelines for data ingest, processing, machine learning, and visualization
3. Build interfaces for ingest across various data stores

MUST-HAVE:
1. A track record of building and deploying data pipelines as part of work or side projects
2. Ability to work with an RDBMS, MySQL or Postgres
3. Ability to deploy over cloud infrastructure, at least AWS
4. Demonstrated ability and hunger to learn

GOOD-TO-HAVE:
1. A computer science degree
2. Expertise in at least one of Python, Java, or Scala
3. Expertise and experience in deploying solutions based on Spark and Kafka
4. Knowledge of container systems like Docker or Kubernetes
5. Experience with NoSQL/graph databases
6. Knowledge of machine learning

Kindly apply only if you are skilled in building data pipelines.

Job posted by Zeimona Dsouza
Apply for job

Data Architect

Founded 2011
Company type: Products and services
Location: Hyderabad
Experience: 9 - 13 years
Salary: ₹10,00,000 – ₹23,00,000

A Data Architect who will lead a team of 5 members. Required skills: Spark, Scala, Hadoop.

Job posted by Sravanthi Alamuri
Apply for job

Database Architect

Founded 2017
Company type: Products and services
Location: Bengaluru (Bangalore)
Experience: 5 - 10 years
Salary: ₹10,00,000 – ₹20,00,000

The candidate will be responsible for all aspects of data acquisition, data transformation, and analytics scheduling and operationalization, to drive high-visibility, cross-division outcomes. Expected deliverables include the development of big-data ELT jobs using a mix of technologies, stitching together complex and seemingly unrelated data sets for mass consumption, and automating and scaling analytics into GRAND's data lake.

Key Responsibilities:
- Create a GRAND data lake and warehouse that pools the data from GRAND's different regions and stores in GCC
- Ensure source data quality measurement, enrichment, and reporting of data quality
- Manage all ETL and data-model update routines
- Integrate new data sources into the DWH
- Manage the DWH cloud (AWS/Azure/Google) and infrastructure

Skills Needed:
- Very strong in SQL, with demonstrated experience with RDBMSs (e.g., Postgres, MongoDB); Unix shell scripting preferred
- Experience with Unix and comfortable working with the shell (bash or cron preferred)
- Good understanding of data warehousing concepts and big-data systems: Hadoop, NoSQL, HBase, HDFS, MapReduce
- Aligning with the systems engineering team to propose and deploy the new hardware and software environments required for Hadoop, and to expand existing environments
- Working with data delivery teams to set up new Hadoop users, including setting up Linux users and setting up and testing HDFS, Hive, Pig, and MapReduce access for the new users
- Cluster maintenance, as well as creation and removal of nodes, using tools like Ganglia, Nagios, and Cloudera Manager Enterprise
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines
- Screen Hadoop cluster job performance and do capacity planning
- Monitor Hadoop cluster connectivity and security
- File-system management and monitoring; HDFS support and maintenance
- Collaborate with application teams to install operating-system and Hadoop updates, patches, and version upgrades when required
- Define, develop, document, and maintain Hive-based ETL mappings and scripts

Job posted by Rahul Malani
Apply for job
Why apply via CutShort?
Connect with actual hiring teams and get their fast response. No spam.