2+ Dataflow architecture Jobs in India
Experience Level
10+ years of experience in data engineering, with at least 3–5 years providing architectural guidance, leading teams, and standardizing enterprise data solutions. Must have deep expertise in Databricks, GCP, and modern data architecture patterns.
Key Responsibilities
- Provide architectural guidance and define standards for data engineering implementations.
- Lead and mentor a team of data engineers, fostering best practices in design, development, and operations.
- Own and drive improvements in performance, scalability, and reliability of data pipelines and platforms.
- Standardize data architecture patterns and reusable frameworks across multiple projects.
- Collaborate with cross-functional stakeholders (Product, Analytics, Business) to align data solutions with organizational goals.
- Design data models, schemas, and dataflows for efficient storage, querying, and analytics.
- Establish and enforce strong data governance practices, ensuring security, compliance, and data quality.
- Work closely with governance teams to implement lineage, cataloging, and access control in compliance with standards.
- Design and optimize ETL pipelines using Databricks, PySpark, and SQL (see the ingestion sketch after this list).
- Ensure robust CI/CD practices are implemented for data workflows, leveraging Terraform and modern DevOps practices.
- Leverage GCP services such as Cloud Functions, Cloud Run, BigQuery, Pub/Sub, and Dataflow for building scalable solutions.
- Evaluate and adopt emerging technologies, including Gen AI and advanced analytics capabilities.
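To make the pipeline responsibilities above concrete, here is a minimal PySpark sketch of an incremental ingest with Databricks Auto Loader. It assumes a Databricks runtime (where the `spark` session is predefined); the bucket path, schema/checkpoint locations, and target table name are hypothetical placeholders, not part of this posting.

```python
# Minimal Auto Loader sketch (Databricks runtime assumed: `spark` is predefined).
# All paths and table names are hypothetical placeholders.
raw = (
    spark.readStream
    .format("cloudFiles")                                        # Auto Loader source
    .option("cloudFiles.format", "json")                         # incoming file format
    .option("cloudFiles.schemaLocation", "/mnt/schemas/orders")  # schema tracking dir
    .load("gs://raw-zone/orders/")                               # landing bucket
)

(
    raw.writeStream
    .option("checkpointLocation", "/mnt/checkpoints/orders")  # exactly-once bookkeeping
    .trigger(availableNow=True)                               # drain backlog, then stop
    .toTable("bronze.orders")                                 # Delta target table
)
```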
Qualifications & Skills
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
- Extensive hands-on experience with Databricks (Auto Loader, Delta Live Tables, Delta Lake, Change Data Feed) and PySpark (see the CDF sketch after this list).
- Expertise in SQL and advanced query optimization.
- Proficiency in Python for data engineering and automation tasks.
- Strong expertise with GCP services: Cloud Functions, Cloud Run, BigQuery, Pub/Sub, Dataflow, GCS.
- Deep understanding of CI/CD pipelines, infrastructure-as-code (Terraform), and DevOps practices.
- Proven ability to provide architectural guidance and lead technical teams.
- Experience designing data models, schemas, and governance frameworks.
- Knowledge of Gen AI concepts and ability to evaluate practical applications.
- Excellent communication, leadership, and stakeholder management skills.
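To illustrate the Change Data Feed (CDF) requirement, here is a minimal sketch of reading row-level changes from a Delta table. It again assumes a Databricks runtime with `spark` predefined; the table name and starting version are hypothetical.

```python
# Minimal Delta Lake Change Data Feed read (Databricks runtime assumed).
# Requires the source table to have delta.enableChangeDataFeed = true.
changes = (
    spark.read.format("delta")
    .option("readChangeFeed", "true")   # return change rows, not current state
    .option("startingVersion", 5)       # hypothetical starting table version
    .table("bronze.orders")             # hypothetical source table
)

# Each row carries _change_type (insert / update_preimage / update_postimage /
# delete), _commit_version, and _commit_timestamp alongside the table's columns.
changes.show()
```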
The candidate will be deployed at a financial captive organization in Pune (Kharadi).
Below are the job details:
Experience: 10 to 18 years
Mandatory skills:
- Data migration
- Dataflow
The ideal candidate for this role will have the following experience and qualifications:
- Experience building a range of services with a cloud service provider (ideally GCP).
- Hands-on design and development on Google Cloud Platform (GCP) across a wide range of services, including GCP storage and database technologies.
- Hands-on experience architecting, designing, or implementing solutions on GCP, Kubernetes, and other Google technologies, including security and compliance (e.g., IAM and cloud compliance/auditing/monitoring tools).
- Desired skills within the GCP stack: Cloud Run, GKE, serverless, Cloud Functions, Vision API, DLP, Dataflow, Data Fusion.
- Prior experience migrating on-prem applications to cloud environments. Knowledge of and hands-on experience with Stackdriver, Pub/Sub, VPCs, subnets, route tables, load balancers, and firewalls, both on-premises and on GCP (see the Pub/Sub sketch after this list).
- Integrate, configure, deploy, and manage centrally provided common cloud services (e.g., IAM, networking, logging, operating systems, containers).
- Manage SDN in GCP. Knowledge and experience of DevOps technologies for continuous integration and delivery in GCP using Jenkins.
- Hands-on experience with Terraform, Kubernetes, Docker, and Stackdriver.
- Programming experience in one or more of the following languages: Python, Ruby, Java, JavaScript, Go, Groovy, Scala
- Knowledge of or experience with DevOps tooling such as Jenkins, Git, Ansible, Splunk, Jira/Confluence, AppDynamics, Docker, and Kubernetes.
- Act as a consultant and subject matter expert for internal teams, resolving technical deployment obstacles and improving the product vision. Ensure compliance with centrally defined security standards.
- Experience in the financial domain is preferred.
- Ability to learn new technologies and rapidly prototype new concepts.
- Top-down thinker, excellent communicator, and great problem solver
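Since Pub/Sub appears repeatedly in this role, here is a minimal Python sketch of publishing a message with the google-cloud-pubsub client. The project and topic names are hypothetical, and the client assumes default application credentials are already configured.

```python
# Minimal Pub/Sub publish sketch (google-cloud-pubsub; default credentials assumed).
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# Hypothetical project and topic names.
topic_path = publisher.topic_path("my-gcp-project", "orders-events")

# publish() returns a future; result() blocks until the server acks the message
# and yields the server-assigned message ID. Extra kwargs become attributes.
future = publisher.publish(topic_path, b'{"order_id": 42}', source="onprem-migration")
print(f"Published message id: {future.result()}")
```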
Experience: 10 to 18 years
Location: Pune
The candidate must have experience with the following:
- GCP data platform
- Data processing: Dataflow, Dataprep, Data Fusion (see the pipeline sketch after this list)
- Data storage: BigQuery, Cloud SQL
- Pub/Sub, GCS buckets
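For context on the Dataflow/Pub/Sub/BigQuery combination listed above, here is a minimal Apache Beam pipeline in Python that reads from Pub/Sub and appends rows to BigQuery. Every project, bucket, topic, and table name is a hypothetical placeholder; running it on Dataflow also requires the apache-beam[gcp] package and appropriate credentials.

```python
# Minimal streaming Beam pipeline sketch for the Dataflow runner.
# All resource names are hypothetical placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    streaming=True,
    runner="DataflowRunner",
    project="my-gcp-project",
    region="asia-south1",
    temp_location="gs://my-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            topic="projects/my-gcp-project/topics/events")
        | "Decode" >> beam.Map(lambda msg: {"raw": msg.decode("utf-8")})
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-gcp-project:analytics.events",   # project:dataset.table target
            schema="raw:STRING",                 # single-column schema
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```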


