
3+ Data flow Jobs in India

Apply to 3+ Data flow Jobs on CutShort.io. Find your next job, effortlessly. Browse Data flow Jobs and apply today!

ByteFoundry AI

Posted by Bisman Gill
Remote only
2 - 5 yrs
Up to ₹30L / yr (varies)
Google Cloud Platform (GCP)
Databricks
Data-flow analysis
BigQuery
Spark
+2 more

About the Role

We are looking for a Big Data Engineer with 2–5 years of experience in designing, building, and operating large-scale data processing systems, preferably on Google Cloud Platform (GCP). This role is suited for engineers who understand modern data architectures and are comfortable working across multiple stages of the data lifecycle.

We do not expect expertise in every GCP data service. Instead, candidates should have strong hands-on experience in at least one service from each core data area listed below and the ability to learn and adapt to new tools quickly.

Key Responsibilities

● Design, develop, and maintain scalable data pipelines on GCP.

● Build batch and streaming data processing workflows using managed cloud services.

● Develop and maintain data transformation workflows using SQL and Python.

● Create and manage workflow orchestration using DAG-based schedulers.

● Collaborate with analytics, product, and engineering teams to deliver reliable datasets.

● Optimize data pipelines for performance, cost, and reliability.

● Ensure data quality, monitoring, and observability across pipelines.

● Participate in code reviews and contribute to data engineering best practices.

Core Experience Areas (At Least One From Each)

1. Data Warehousing & Analytics

● BigQuery

● Dataproc (Spark / Hadoop)

● Other cloud data warehouse or analytics platforms
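
For illustration only, a minimal sketch of the kind of warehouse query work this area covers, using the google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical, and application-default credentials are assumed to be configured.

from google.cloud import bigquery

# Assumes application-default credentials for a GCP project are already set up.
client = bigquery.Client()

# Hypothetical table, used only to show a typical analytics aggregation.
query = """
    SELECT event_date, COUNT(*) AS events
    FROM `example-project.analytics.events`
    GROUP BY event_date
    ORDER BY event_date
"""

for row in client.query(query).result():
    print(row.event_date, row.events)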

2. Data Processing and Pipelines

● Dataflow (Apache Beam)

● Cloud Run Jobs / Cloud Run Services

● Apache Spark (batch or streaming)

● dbt for transformations
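
As a rough sketch of the pipeline work listed above, a minimal Apache Beam batch job in Python (the SDK that Dataflow executes); the bucket paths are hypothetical placeholders, and without Dataflow pipeline options it runs locally on the DirectRunner.

import apache_beam as beam

# Minimal word-count-style batch pipeline; paths are hypothetical.
with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.txt")
        | "Split" >> beam.FlatMap(lambda line: line.split())
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "CountPerWord" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda word, count: f"{word},{count}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/counts")
    )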

3. Databases & Storage

● Bigtable

● Cloud Storage

● Relational databases (PostgreSQL, MySQL, Cloud SQL)

● NoSQL databases

4. Data Preparation & Exploration

● SQL-based data analysis

● Python for data manipulation (Pandas, PySpark)

● Exploratory data analysis on large datasets
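
A minimal sketch of the kind of exploratory analysis meant here, using Pandas; the file name and the order_date and amount columns are hypothetical.

import pandas as pd

# Hypothetical dataset, used only to illustrate routine exploration steps.
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

print(orders.shape)          # row / column counts
print(orders.dtypes)         # column types
print(orders.isna().mean())  # share of missing values per column

# Simple aggregation: daily revenue
daily_revenue = orders.groupby(orders["order_date"].dt.date)["amount"].sum()
print(daily_revenue.head())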

Workflow Orchestration & Scheduling

● Cloud Composer (Airflow)

● Cloud Scheduler

● Experience creating and maintaining DAGs in Python
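
A minimal sketch of a DAG of the sort Cloud Composer (Airflow) schedules; the DAG id, schedule, and task callables are hypothetical placeholders.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder extraction step (e.g. landing files in Cloud Storage).
    print("extracting...")


def transform():
    # Placeholder transformation step (e.g. a BigQuery or dbt job).
    print("transforming...")


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task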


Required Skills & Qualifications

● 2–5 years of experience in data engineering or big data processing.

● Hands-on experience with Google Cloud Platform (preferred).

● Strong proficiency in Python and SQL.

● Understanding of distributed data processing concepts.

● Experience with CI/CD, Git, and production-grade data systems.

● Ability to work across ambiguous problem statements and evolving requirements.

AI & System Mindset

Experience working with AI-powered systems is a strong plus. Candidates should be comfortable integrating AI agents, third-party APIs, and automation workflows into applications, and should demonstrate curiosity and adaptability toward emerging AI technologies.

Good to Have

● Experience with streaming data (Pub/Sub, Kafka).

● Cost optimization experience on cloud data platforms.

● Exposure to AI/ML pipelines or feature engineering.

● Experience working in product-driven or startup environments.

Education

Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field, or equivalent practical experience.

Miracle Software Systems, Inc
Posted by Ratnakumari Modhalavalasa
Visakhapatnam
3 - 5 yrs
₹2L - ₹4L / yr
Hadoop
Apache Sqoop
Apache Hive
Apache Spark
Apache Pig
+9 more
Position: Data Engineer

Duration: Full Time

Location: Visakhapatnam, Bangalore, Chennai

Years of experience: 3+ years

Job Description :

- 3+ years of experience working as a Data Engineer, with a thorough understanding of data frameworks that collect, manage, transform, and store data to derive business insights.

- Strong communication skills (written and verbal), along with being a good team player.

- 2+ years of experience within the Big Data ecosystem (Hadoop, Sqoop, Hive, Spark, Pig, etc.)

- 2+ years of strong experience with SQL and Python (Data Engineering focused).

- Experience with GCP Data Services such as BigQuery, Dataflow, Dataproc, etc. is preferred and an added advantage.

- Any prior experience in ETL tools such as DataStage, Informatica, DBT, Talend, etc. is an added advantage for the role.
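
As a rough illustration of the Big Data ecosystem skills above, a minimal PySpark snippet that reads a Hive table and aggregates it; the database, table, and column names are hypothetical, and the cluster is assumed to be configured with a Hive metastore.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Assumes Spark is configured against a Hive metastore (e.g. on a Hadoop cluster).
spark = (
    SparkSession.builder
    .appName("hive-aggregation-example")
    .enableHiveSupport()
    .getOrCreate()
)

# Hypothetical Hive table: total sales amount per region.
sales = spark.table("retail_db.sales")
summary = sales.groupBy("region").agg(F.sum("amount").alias("total_amount"))
summary.show()
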
Top startup of India - News App

Agency job
via Jobdost by Sathish Kumar
Noida
2 - 5 yrs
₹20L - ₹35L / yr
Linux/Unix
Python
Hadoop
Apache Spark
MongoDB
+4 more
Responsibilities
● Create and maintain optimal data pipeline architecture.
● Assemble large, complex data sets that meet functional / non-functional business requirements.
● Build and optimize ‘big data’ data pipelines, architectures and data sets.
● Maintain, organize & automate data processes for various use cases.
● Identify trends, do follow-up analysis, and prepare visualizations.
● Create daily, weekly and monthly reports of product KPIs.
● Create informative, actionable and repeatable reporting that highlights relevant business trends and opportunities for improvement.

Required Skills And Experience:
● 2-5 years of work experience in data analytics, including analyzing large data sets.
● BTech in Mathematics/Computer Science.
● Strong analytical, quantitative and data interpretation skills.
● Hands-on experience with Python, Apache Spark, Hadoop, NoSQL databases (MongoDB preferred), and Linux is a must.
● Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
● Experience with Google Cloud Data Analytics products such as BigQuery, Dataflow, Dataproc, etc. (or similar cloud-based platforms).
● Experience working within a Linux computing environment and using command-line tools, including knowledge of shell/Python scripting for automating common tasks.
● Previous experience working at startups and/or in fast-paced environments.
● Previous experience as a data engineer or in a similar role.
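
To make the KPI-reporting side of the role concrete, a minimal sketch that pulls events from MongoDB with pymongo and computes daily active users with Pandas; the connection string, database, collection, and field names are all hypothetical.

import pandas as pd
from pymongo import MongoClient

# Hypothetical connection and schema, for illustration only.
client = MongoClient("mongodb://localhost:27017")
events = client["newsapp"]["events"]

# Fetch article-view events and compute a simple KPI: daily active users.
records = list(events.find({"event_type": "article_view"},
                           {"user_id": 1, "timestamp": 1, "_id": 0}))
df = pd.DataFrame(records)

if not df.empty:
    df["date"] = pd.to_datetime(df["timestamp"]).dt.date
    dau = df.groupby("date")["user_id"].nunique()
    print(dau.tail(7))  # daily active users for the last seven days
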
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.