Data Engineering Lead
Agency job
via Qrata
6 - 10 yrs
₹20L - ₹30L / yr
Bengaluru (Bangalore)
Skills
Google Cloud Platform (GCP)
BigQuery

Description: 

As a Data Engineering Lead at Company, you will be at the forefront of shaping and managing our data infrastructure with a primary focus on Google Cloud Platform (GCP). You will lead a team of data engineers to design, develop, and maintain our data pipelines, ensuring data quality, scalability, and availability for critical business insights. 


Key Responsibilities: 

1. Team Leadership: 

a. Lead and mentor a team of data engineers, providing guidance, coaching, and performance management. 

b. Foster a culture of innovation, collaboration, and continuous learning within the team. 

2. Data Pipeline Development (Google Cloud Focus): 

a. Design, develop, and maintain scalable data pipelines on Google Cloud Platform (GCP) using services such as BigQuery, Dataflow, and Dataprep.

b. Implement best practices for data extraction, transformation, and loading (ETL) processes on GCP. 

3. Data Architecture and Optimization: 

a. Define and enforce data architecture standards, ensuring data is structured and organized efficiently. 

b. Optimize data storage, processing, and retrieval for maximum performance and cost-effectiveness on GCP.

4. Data Governance and Quality: 

a. Establish data governance frameworks and policies to maintain data quality, consistency, and compliance with regulatory requirements.

b. Implement data monitoring and alerting systems to proactively address data quality issues.

5. Cross-functional Collaboration: 

a. Collaborate with data scientists, analysts, and other cross-functional teams to understand data requirements and deliver data solutions that drive business insights. 

b. Participate in discussions regarding data strategy and provide technical expertise. 

6. Documentation and Best Practices: 

a. Create and maintain documentation for data engineering processes, standards, and best practices. 

b. Stay up-to-date with industry trends and emerging technologies, making recommendations for improvements as needed. 
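The pipeline-development and data-quality responsibilities above can be sketched in miniature. The following is illustrative only: the record schema, field names, and quality rule are hypothetical, and a production pipeline would run on GCP services such as Dataflow and BigQuery rather than on in-memory lists.

```python
from datetime import date

# Illustrative ETL stages. In production the extract step might read from Cloud
# Storage and the load step write to BigQuery; plain Python stands in here.

def extract(rows):
    """Extract: yield raw records from a source (stubbed as an in-memory list)."""
    yield from rows

def transform(record):
    """Transform: normalize types and enforce a simple data-quality rule."""
    if not record.get("patient_id"):  # quality gate: reject records missing a key field
        raise ValueError("missing patient_id")
    return {
        "patient_id": record["patient_id"],
        "visit_date": date.fromisoformat(record["visit_date"]),
        "amount": round(float(record["amount"]), 2),
    }

def load(records, sink):
    """Load: append validated records to a destination table (stubbed as a list)."""
    sink.extend(records)
    return len(records)

raw = [{"patient_id": "p1", "visit_date": "2024-01-15", "amount": "120.50"}]
table = []
loaded = load([transform(r) for r in extract(raw)], table)
```

The same extract/transform/load separation carries over directly when the stubs are replaced by Dataflow transforms or BigQuery load jobs.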


Qualifications 

● Bachelor's or Master's degree in Computer Science, Data Engineering, or related field. 

● 5+ years of experience in data engineering, with a strong emphasis on Google Cloud Platform. 

● Proficiency in Google Cloud services, including BigQuery, Dataflow, Dataprep, and Cloud Storage. 

● Experience with data modeling, ETL processes, and data integration.

● Strong programming skills in languages like Python or Java.

● Excellent problem-solving and communication skills. 

● Leadership experience and the ability to manage and mentor a team.




Similar jobs

Fintrac Global services
Hyderabad
5 - 8 yrs
₹5L - ₹15L / yr
Python
Bash
Google Cloud Platform (GCP)
Amazon Web Services (AWS)
Windows Azure
+2 more

Required Qualifications: 

• Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent experience.

• 5+ years of experience in a DevOps role, preferably for a SaaS or software company.

• Expertise in cloud computing platforms (e.g., AWS, Azure, GCP).

• Proficiency in scripting languages (e.g., Python, Bash, Ruby).

• Extensive experience with CI/CD tools (e.g., Jenkins, GitLab CI, Travis CI).

• Extensive experience with NGINX and similar web servers.

• Strong knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).

• Familiarity with infrastructure-as-code tools (e.g., Terraform, CloudFormation).

• Ability to work on-call as needed and respond to emergencies in a timely manner.

• Experience with high-transaction e-commerce platforms.


Preferred Qualifications: 

• Certifications in cloud computing or DevOps are a plus (e.g., AWS Certified DevOps Engineer, Azure DevOps Engineer Expert).

• Experience in a high-availability, 24x7x365 environment.

• Strong collaboration, communication, and interpersonal skills.

• Ability to work independently and as part of a team.
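Several of the qualifications above come down to day-to-day operational scripting: probing services and reacting to failures. Below is a minimal, hedged sketch of one such task, polling a health check with retries. The probe is an injected callable so the sketch runs without a network; a real script might instead request a (hypothetical) NGINX /healthz endpoint.

```python
import time

def wait_until_healthy(check, retries=3, delay=0.01):
    """Poll a zero-argument health check until it passes or retries run out.

    `check` is injected so this runs offline; in practice it might hit a
    (hypothetical) load-balancer or NGINX health endpoint.
    """
    for attempt in range(1, retries + 1):
        if check():
            return attempt  # number of probes it took
        time.sleep(delay)
    raise TimeoutError("service did not become healthy")

# Simulated service that only answers healthy on the third probe.
state = {"probes": 0}
def flaky_check():
    state["probes"] += 1
    return state["probes"] >= 3

attempts = wait_until_healthy(flaky_check)
```

Injecting the probe also makes scripts like this easy to unit-test, which matters in a CI/CD pipeline.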

Intuitive Technology Partners
shalu Jain
Posted by shalu Jain
Remote only
9 - 20 yrs
Best in industry
Architecture
Presales
Postsales
Amazon Web Services (AWS)
databricks
+13 more

Intuitive cloud (www.intuitive.cloud) is one of the fastest-growing top-tier Cloud Solutions and SDx Engineering solution and services companies, supporting 80+ global enterprise customers across the Americas, Europe, and the Middle East.

Intuitive is a recognized professional and managed services partner with core strengths in cloud (public/hybrid), security, GRC, DevSecOps, SRE, application modernization/containers/K8s-as-a-service, and cloud application delivery.


Data Engineering:

  • 9+ years' experience as a data engineer.
  • Must have 4+ years implementing data engineering solutions with Databricks.
  • This is a hands-on role building data pipelines using Databricks; hands-on technical experience with Apache Spark.
  • Must have deep expertise in one of the programming languages for data processing (Python, Scala). Experience with Python, PySpark, Hadoop, Hive, and/or Spark to write data pipelines and data-processing layers.
  • Must have worked with relational databases like Snowflake. Good SQL experience writing complex SQL transformations.
  • Performance tuning of Spark SQL running on S3/Data Lake/Delta Lake storage, and strong knowledge of Databricks and cluster configurations.
  • Hands-on architectural experience.
  • Nice to have: Databricks administration, including the security and infrastructure features of Databricks.
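One of the "complex SQL transformations" mentioned above, keeping only the latest record per key, is commonly written in Spark SQL with ROW_NUMBER() OVER (PARTITION BY key ORDER BY timestamp DESC) and a filter on row number 1. As a rough illustration of the same logic (the field names are hypothetical, and this plain-Python version is a sketch, not how it would be done at Databricks scale):

```python
# Keep only the latest record per key. In Spark SQL this is typically
# ROW_NUMBER() OVER (PARTITION BY id ORDER BY updated_at DESC) ... WHERE rn = 1.

def latest_per_key(rows, key="id", ts="updated_at"):
    best = {}
    for row in rows:
        k = row[key]
        # ISO-8601 timestamps compare correctly as strings
        if k not in best or row[ts] > best[k][ts]:
            best[k] = row
    return sorted(best.values(), key=lambda r: r[key])

rows = [
    {"id": 1, "updated_at": "2024-01-01", "status": "stale"},
    {"id": 1, "updated_at": "2024-02-01", "status": "current"},
    {"id": 2, "updated_at": "2024-01-15", "status": "only"},
]
deduped = latest_per_key(rows)
```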
British Telecom
Agency job
via posterity consulting by Kapil Tiwari
Bengaluru (Bangalore)
3 - 7 yrs
₹8L - ₹14L / yr
Data engineering
Big Data
Google Cloud Platform (GCP)
ETL
Datawarehousing
+6 more
You'll have the following skills & experience:

• Problem Solving: Resolving production issues to fix P1-P4 service issues, problems relating to introducing new technology, and major issues in the platform and/or service.
• Software Development Concepts: Understands and is experienced with a wide range of programming concepts, and is aware of and has applied a range of algorithms.
• Commercial & Risk Awareness: Able to understand and evaluate both obvious and subtle commercial risks, especially in relation to a programme.

Experience you would be expected to have:

• Cloud: experience with one of the following cloud vendors: AWS, Azure, or GCP
• GCP: experience preferred, but willingness to learn is essential.
• Big Data: experience with Big Data methodology and technologies
• Programming: Python or Java, having worked with data (ETL)
• DevOps: understands how to work in a DevOps and agile way / versioning / automation / defect management – mandatory
• Agile methodology: knowledge of Jira
Marktine
at Marktine
1 recruiter
Vishal Sharma
Posted by Vishal Sharma
Remote, Bengaluru (Bangalore)
3 - 6 yrs
₹10L - ₹25L / yr
Cloud
Google Cloud Platform (GCP)
BigQuery
Python
SQL
+2 more

Specific Responsibilities

  • Minimum of 2 years' experience with Google BigQuery and Google Cloud Platform.
  • Design and develop the ETL framework using BigQuery.
  • Expertise in BigQuery concepts like nested queries, clustering, partitioning, etc.
  • Working experience with clickstream databases and Google Analytics/Adobe Analytics.
  • Should be able to automate data loads from BigQuery using APIs or a scripting language.
  • Good experience with advanced SQL concepts.
  • Good experience with Adobe Launch web, mobile & e-commerce tag implementation.
  • Identify complex, fuzzy problems, break them down into smaller parts, and implement creative, data-driven solutions.
  • Responsible for defining, analyzing, and communicating key metrics and business trends to management teams.
  • Identify opportunities to improve conversion & user experience through data; influence product & feature roadmaps.
  • Must have a passion for data quality and constantly look to improve the system. Drive data-driven decision-making with stakeholders & drive change management.
  • Understand requirements to translate business and technical problems into analytics problems.
  • Effective storyboarding and presentation of the solution to the client and leadership.
  • Client engagement & management.
  • Ability to interface effectively with multiple levels of management and functional disciplines.
  • Assist in developing/coaching individuals technically as well as on soft skills during the project and as part of the client project's training program.
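As a small illustration of the partitioning and clustering concepts listed above: BigQuery tables are commonly partitioned by a date column and clustered by frequently filtered columns, both of which reduce the bytes a query scans. The helper below only builds the DDL string; the table and column names are hypothetical.

```python
# Builds (but does not run) BigQuery DDL for a date-partitioned, clustered table.
# Partitioning prunes whole partitions at query time; clustering sorts data
# within them -- both cut scan costs. Names below are hypothetical.

def partitioned_table_ddl(table, columns, partition_col, cluster_cols):
    cols = ",\n  ".join(f"{name} {typ}" for name, typ in columns)
    return (
        f"CREATE TABLE `{table}` (\n  {cols}\n)\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

ddl = partitioned_table_ddl(
    table="analytics.events",
    columns=[("event_date", "DATE"), ("user_id", "STRING"), ("page", "STRING")],
    partition_col="event_date",
    cluster_cols=["user_id"],
)
```

A load-automation script would submit such DDL (or an equivalent load job) through the BigQuery API and then verify row counts, which is the kind of scripting the role calls for.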

 

Work Experience
  • 2 to 3 years of working experience in Google Big Query & Google Cloud Platform
  • Relevant experience in Consumer Tech/CPG/Retail industries
  • Bachelor's in Engineering, Computer Science, Math, Statistics, or a related discipline
  • Strong problem solving and web analytical skills. Acute attention to detail.
  • Experience in analyzing large, complex, multi-dimensional data sets.
  • Experience in one or more roles in an online eCommerce or online support environment.
 
Skills
  • Expertise in Google BigQuery & Google Cloud Platform
  • Experience in advanced SQL and a scripting language (Python/R)
  • Hands-on experience with BI tools (Tableau, Power BI)
  • Working experience with & understanding of Adobe Analytics or Google Analytics
  • Experience in creating and debugging website & app tracking (Omnibus, Dataslayer, GA debugger, etc.)
  • Excellent analytical thinking, analysis, and problem-solving skills.
  • Knowledge of other GCP services is a plus
 
DataMetica
at DataMetica
1 video
7 recruiters
Sayali Kachi
Posted by Sayali Kachi
Pune, Hyderabad
2 - 6 yrs
₹3L - ₹15L / yr
Google Cloud Platform (GCP)
SQL
BQ

Datametica is looking for talented BigQuery engineers.

 

Total Experience - 2+ yrs.

Notice Period – 0 - 30 days

Work Location – Pune, Hyderabad

 

Job Description:

  • Sound understanding of Google Cloud Platform; should have worked on BigQuery, Workflow, or Composer.
  • Experience in migrating to GCP and integration projects in large-scale environments; ETL technical design, development, and support.
  • Good SQL skills and Unix scripting; programming experience with Python, Java, or Spark would be desirable.
  • Experience in SOA and services-based data solutions would be advantageous.

 

About the Company: 

www.datametica.com

Datametica is among the world's leading Cloud and Big Data analytics companies.

Datametica was founded in 2013 and has grown at an accelerated pace within a short span of 8 years. We are providing a broad and capable set of services that encompass a vision of success, driven by innovation and value addition that helps organizations in making strategic decisions influencing business growth.

Datametica is the global leader in migrating legacy data warehouses to the cloud. Datametica moves data warehouses to the cloud faster, at a lower cost, and with fewer errors, even running them in parallel with full data validation for months.

Datametica's specialized team of Data Scientists has implemented award-winning analytical models for use cases involving both unstructured and structured data.

Datametica has earned the highest level of partnership with Google, AWS, and Microsoft, which enables Datametica to deliver successful projects for clients across industry verticals at a global level, with teams deployed in the USA, EU, and APAC.

 

Recognition:

We are gratified to be recognized as a Top 10 Big Data Global Company by CIO story.

 

If it excites you, please apply.

Multinational Company
Remote only
5 - 15 yrs
₹27L - ₹30L / yr
Data engineering
Google Cloud Platform (GCP)
Python

• The incumbent should have hands-on experience in data engineering and GCP data technologies.

• Work with client teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from the Google Cloud Platform.

• Work with Agile and DevOps techniques and implementation approaches in the delivery.

• Showcase your GCP data engineering experience when communicating with clients on their requirements, turning these into technical data solutions.

• Build and deliver data solutions using GCP products and offerings.

• Hands-on experience with Python.

• Experience with SQL or MySQL. Experience with Looker is an added advantage.

Fast paced Startup
Pune
3 - 6 yrs
₹15L - ₹22L / yr
Big Data
Data engineering
Hadoop
Spark
Apache Hive
+6 more

Years of Exp: 3-6+ Years
Skills: Scala, Python, Hive, Airflow, Spark

Languages: Java, Python, Shell Scripting

GCP: Bigtable, Dataproc, BigQuery, GCS, Pub/Sub

OR
AWS: Athena, Glue, EMR, S3, Redshift

MongoDB, MySQL, Kafka

Platforms: Cloudera / Hortonworks
AdTech domain experience is a plus.
Job Type - Full Time 

Nisum Technologies
at Nisum Technologies
8 recruiters
Sameena Shaik
Posted by Sameena Shaik
Hyderabad
4 - 12 yrs
₹1L - ₹20L / yr
Big Data
Hadoop
Spark
Apache Kafka
Scala
+4 more
  • 5+ years of experience in a Data Engineer role
  • Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases such as Cassandra.
  • Experience with AWS cloud services: EC2, EMR, Athena
  • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
  • Advanced SQL knowledge and experience working with relational databases, query authoring (SQL) as well as familiarity with unstructured datasets.
  • Deep problem-solving skills to perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
MNC
at MNC
Agency job
via I Squaresoft by Khadri SH
Remote only
5 - 8 yrs
₹10L - ₹20L / yr
ETL
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
SSIS
Cloud Datawarehouse
Hi,

Job Description

Problem Formulation: Identifies possible options to address the business problems; must possess a good understanding of dimensional modelling.

Must have worked on at least one end-to-end project using any cloud data warehouse (Azure Synapse, AWS Redshift, Google BigQuery).

Good to have an understanding of Power BI and its integration with cloud services like Azure or GCP.

Experience working with SQL Server and SSIS (preferred).

Applied Business Acumen: Supports the development of business cases and recommendations. Owns delivery of project activity and tasks assigned by others. Supports process updates and changes. Solves business issues.

Data Transformation/Integration/Optimization:

The ETL developer is responsible for designing and creating the data warehouse and all related data extraction, transformation, and loading functions in the company.

The developer should provide oversight and planning of data models, database structural design, and deployment, and work closely with the data architect and business analyst.

Duties include working in cross-functional software development teams (business analysts, testers, developers) following agile ceremonies and development practices.

The developer plays a key role in contributing to the design, evaluation, selection, implementation, and support of database solutions.

Development and Testing: Develops code for the required solution by determining the appropriate approach and leveraging business, technical, and data requirements.

Creates test cases to review and validate the proposed solution design. Works on POCs and deploys the software to production servers.

Good to Have (Preferred Skills):

  • Minimum 4-8 years of experience in data warehouse design and development for large-scale applications
  • Minimum 3 years of experience with star schemas, dimensional modelling, and extract-transform-load (ETL) design and development
  • Expertise working with various databases (SQL Server, Oracle)
  • Experience developing packages, procedures, views, and triggers
  • Nice to have: Big Data technologies
  • The individual must have good written and oral communication skills.
  • Nice to have: SSIS

 

Education and Experience

  • Minimum 4-8 years of software development experience
  • Bachelor's and/or Master’s degree in computer science

Please reply with the below details:

Total Experience:
Relevant Experience:

Current CTC:
Expected CTC:

Any offers: Y/N

Notice Period:

Qualification:

DOB:
Present Company Name:

Designation:

Domain

Reason for job change:

Current Location:

Pune
10 - 18 yrs
₹35L - ₹40L / yr
Google Cloud Platform (GCP)
Dataflow architecture
Data migration
Data processing
Big Data
+4 more

The candidate will be deployed in a financial captive organization in Pune (Kharadi).

 

Below are the job Details :-

 

Experience 10 to 18 years

 

Mandatory skills –

  • Data migration
  • Dataflow

The ideal candidate for this role will have the below experience and qualifications:  

  • Experience building a range of services with a cloud service provider (ideally GCP)
  • Hands-on design and development on Google Cloud Platform (GCP) across a wide range of GCP services, including hands-on experience with GCP storage & database technologies.
  • Hands-on experience in architecting, designing, or implementing solutions on GCP, K8s, and other Google technologies; security and compliance, e.g. IAM and cloud compliance/auditing/monitoring tools.
  • Desired skills within the GCP stack: Cloud Run, GKE, Serverless, Cloud Functions, Vision API, DLP, Dataflow, Data Fusion
  • Prior experience migrating on-prem applications to cloud environments. Knowledge and hands-on experience of Stackdriver, Pub/Sub, VPCs, subnets, route tables, load balancers, and firewalls, both on-premise and in GCP.
  • Integrate, configure, deploy, and manage centrally provided common cloud services (e.g. IAM, networking, logging, operating systems, containers).
  • Manage SDN in GCP. Knowledge and experience of DevOps technologies around continuous integration & delivery in GCP using Jenkins.
  • Hands-on experience with Terraform, Kubernetes, Docker, and Stackdriver.
  • Programming experience in one or more of the following languages: Python, Ruby, Java, JavaScript, Go, Groovy, Scala
  • Knowledge or experience of DevOps tooling such as Jenkins, Git, Ansible, Splunk, Jira or Confluence, AppD, Docker, Kubernetes
  • Act as a consultant and subject matter expert for internal teams to resolve technical deployment obstacles and improve the product's vision. Ensure compliance with centrally defined security.
  • Financial-services experience is preferred.
  • Ability to learn new technologies and rapidly prototype newer concepts.
  • Top-down thinker, excellent communicator, and great problem solver.

 

Exp:- 10  to 18 years

 

Location:- Pune

 

Candidate must have experience in below.

  • GCP Data Platform
  • Data processing: Dataflow, Dataprep, Data Fusion
  • Data storage: BigQuery, Cloud SQL
  • Pub/Sub, GCS buckets
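Pub/Sub, listed above, decouples data producers from consumers: each subscription on a topic receives its own copy of every published message. A toy in-memory sketch of that fan-out behaviour follows; the real service is accessed via the google-cloud-pubsub client, and all names here are illustrative.

```python
# Toy fan-out: every subscription on a topic gets its own copy of each published
# message, which lets independent consumers (an ETL job, an audit log) process
# the same stream. All names are illustrative stand-ins for real Pub/Sub objects.

class Topic:
    def __init__(self):
        self._subscriptions = []

    def subscribe(self):
        queue = []  # stand-in for a real subscription's message backlog
        self._subscriptions.append(queue)
        return queue

    def publish(self, message):
        for queue in self._subscriptions:
            queue.append(message)

topic = Topic()
etl_queue = topic.subscribe()
audit_queue = topic.subscribe()
topic.publish({"table": "events", "rows": 42})
```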