
GCP Data Engineer Job Description
A GCP Data Engineer is responsible for designing, building, and maintaining data pipelines, architectures, and systems on Google Cloud Platform (GCP). Here's a breakdown of the job:
Key Responsibilities
- Data Pipeline Development: Design and develop data pipelines using GCP services like Dataflow, BigQuery, and Cloud Pub/Sub (see the sketch after this list).
- Data Architecture: Design and implement data architectures to meet business requirements.
- Data Processing: Process and analyze large datasets using GCP services like BigQuery and Cloud Dataflow.
- Data Integration: Integrate data from various sources using GCP services like Cloud Data Fusion and Cloud Pub/Sub.
- Data Quality: Ensure data quality and integrity by implementing data validation and data cleansing processes.
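
To make the pipeline-development bullet concrete, here is a minimal Apache Beam sketch of the kind of streaming Dataflow job described: read JSON events from a Pub/Sub topic and append them to a BigQuery table. The project, topic, table, and schema names are hypothetical placeholders, not part of the role.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Streaming options; on Dataflow you would also pass --runner=DataflowRunner,
# --project, --region, and --temp_location.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events")  # hypothetical topic
        | "ParseJson" >> beam.Map(json.loads)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",  # hypothetical table
            schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```

The same pipeline runs locally with the DirectRunner and on Dataflow by switching the runner flag, which is the usual Beam development workflow.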
Essential Skills
- GCP Services: Strong understanding of GCP services like BigQuery, Cloud Dataflow, Cloud Pub/Sub, and Cloud Storage.
- Data Engineering: Experience with data engineering concepts, including data pipelines, data warehousing, and data integration.
- Programming Languages: Proficiency in programming languages like Python, Java, or Scala.
- Data Processing: Knowledge of data processing frameworks like Apache Beam and Apache Spark.
- Data Analysis: Understanding of data analysis concepts and of tools such as SQL and data visualization (a small query sketch follows this list).
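
As a small illustration of the SQL/analysis expectation, here is a sketch using the official BigQuery Python client; the table and column names are hypothetical.

```python
# Run an aggregation in BigQuery from Python and print the result.
from google.cloud import bigquery

client = bigquery.Client()  # picks up default project and credentials

query = """
    SELECT event, COUNT(*) AS n
    FROM `my-project.analytics.events`
    GROUP BY event
    ORDER BY n DESC
    LIMIT 10
"""

for row in client.query(query).result():
    print(f"{row.event}: {row.n}")
```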

Responsibilities:
● Create and maintain optimal data pipeline architecture.
● Assemble large, complex data sets that meet functional and non-functional business requirements.
● Build and optimize ‘big data’ pipelines, architectures, and data sets.
● Maintain, organize, and automate data processes for various use cases.
● Identify trends, perform follow-up analysis, and prepare visualizations.
● Create daily, weekly, and monthly reports of product KPIs (see the rollup sketch after this list).
● Create informative, actionable, and repeatable reporting that highlights relevant business trends and opportunities for improvement.
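
A minimal sketch of the kind of daily/weekly/monthly KPI rollup mentioned above, using pandas; the event table and its columns are invented for illustration.

```python
import pandas as pd

# Toy event table standing in for real product data (hypothetical columns).
events = pd.DataFrame({
    "ts": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-09"]),
    "user_id": ["a", "b", "a"],
})

# Resample the same events at three granularities to produce the reports.
for label, freq in [("daily", "D"), ("weekly", "W"), ("monthly", "MS")]:
    kpi = events.resample(freq, on="ts")["user_id"].agg(["count", "nunique"])
    kpi.columns = ["events", "unique_users"]
    print(f"--- {label} ---")
    print(kpi)
```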
Required Skills And Experience:
● 2-5 years of work experience in data analytics, including analyzing large data sets.
● BTech in Mathematics or Computer Science.
● Strong analytical, quantitative, and data interpretation skills.
● Hands-on experience with Python, Apache Spark, Hadoop, NoSQL databases (MongoDB preferred), and Linux is a must (see the PySpark sketch below).
● Experience building and optimizing ‘big data’ pipelines, architectures, and data sets.
● Experience with Google Cloud data analytics products such as BigQuery, Dataflow, Dataproc, etc. (or similar cloud-based platforms).
● Experience working in a Linux computing environment and with command-line tools, including shell/Python scripting for automating common tasks.
● Previous experience working at startups and/or in fast-paced environments.
● Previous experience as a data engineer or in a similar role.
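
To illustrate the Python/Spark expectation, here is a short PySpark sketch of a typical batch job; the input path, output path, and column names are hypothetical.

```python
# Read raw JSON event logs, derive a date column, and write a daily rollup.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-events-rollup").getOrCreate()

events = spark.read.json("hdfs:///data/events/*.json")  # hypothetical path

daily = (
    events
    .withColumn("day", F.to_date("ts"))
    .groupBy("day", "event")
    .agg(F.count("*").alias("n"))
    .orderBy("day")
)

daily.write.mode("overwrite").parquet("hdfs:///data/reports/daily_events")
spark.stop()
```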