GCP Data Engineer
Egen Solutions
Posted by Hemavathi Panduri
4 - 8 yrs
₹12L - ₹25L / yr
Hyderabad
Skills
Python
Google Cloud Platform (GCP)
ETL
Apache Airflow

We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.


Key Responsibilities:

  • Design, develop, test, and maintain scalable ETL data pipelines using Python (see the pipeline sketch after this list).
  • Work extensively on Google Cloud Platform (GCP) services such as:
      • Dataflow for real-time and batch data processing
      • Cloud Functions for lightweight serverless compute
      • BigQuery for data warehousing and analytics
      • Cloud Composer (managed Apache Airflow) for orchestration of data workflows
      • Google Cloud Storage (GCS) for managing data at scale
      • IAM for access control and security
      • Cloud Run for containerized applications
  • Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
  • Implement and enforce data quality checks, validation rules, and monitoring.
  • Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
  • Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
  • Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
  • Document pipeline designs, data flow diagrams, and operational support procedures.
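To give a concrete picture of this kind of pipeline, here is a minimal Cloud Composer (Airflow 2.x) sketch that lands files from GCS into BigQuery and then runs a SQL transformation. It is an illustration only: the DAG name, bucket, dataset, and table names are hypothetical, and it assumes the Airflow Google provider package is available in the Composer environment.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="gcs_to_bq_orders",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Extract + Load: copy the day's CSV files from GCS into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-raw-bucket",                           # placeholder bucket
        source_objects=["orders/{{ ds }}/*.csv"],
        destination_project_dataset_table="analytics.staging_orders",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform: cleanse and de-duplicate into a curated table with SQL.
    build_curated = BigQueryInsertJobOperator(
        task_id="build_curated_orders",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE analytics.orders AS "
                    "SELECT DISTINCT * FROM analytics.staging_orders "
                    "WHERE order_id IS NOT NULL"
                ),
                "useLegacySql": False,
            }
        },
    )

    load_raw >> build_curated   # run the SQL step only after the load succeeds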

Required Skills:

  • 4–8 years of hands-on experience in Python for backend or data engineering projects.
  • Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
  • Solid understanding of data pipeline architecture, data integration, and transformation techniques.
  • Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
  • Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
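As a small illustration of the SQL-based validation and data-quality checks referred to above, the sketch below uses the google-cloud-bigquery Python client to count rows with a missing business key. Table and column names are placeholders, and credentials are assumed to come from Application Default Credentials.

from google.cloud import bigquery

def count_null_keys(table: str, key_column: str) -> int:
    """Return the number of rows in `table` whose business key is NULL."""
    client = bigquery.Client()                      # uses Application Default Credentials
    sql = f"SELECT COUNT(*) AS null_keys FROM `{table}` WHERE {key_column} IS NULL"
    rows = client.query(sql).result()               # blocks until the query finishes
    return next(iter(rows)).null_keys

if __name__ == "__main__":
    # Hypothetical curated table and key column, used only for illustration.
    nulls = count_null_keys("analytics.orders", "order_id")
    if nulls:
        raise ValueError(f"Validation failed: {nulls} rows have a NULL order_id")
    print("Validation passed")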




About Egen Solutions

Founded: 2009
Type: Services
Size: 100-1000
Stage: Profitable

We build next-generation cloud-native data platforms and applications. Our clients rely on us to architect and execute next-generation technology strategies, from building and migrating to cloud-native data platforms to designing new, modern business models.


Similar jobs

NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Gurugram
3 - 8 yrs
₹4L - ₹10L / yr
Python
Django
RabbitMQ
Redis
Celery
+4 more

Position Title : Python Django Developer

Location : Gurgaon (6 Days WFO)

Experience : 3+ Years


Job Overview :

We are looking for a skilled Python Django Developer with a strong background in developing scalable, high-performance web applications. The ideal candidate must have 3+ Years of hands-on experience in Django and related technologies, including RabbitMQ, Redis, Celery, and PostgreSQL, to ensure seamless background task management, caching, and database performance.


Key Responsibilities :

  • Develop, maintain, and enhance Django-based web applications and APIs.
  • Design and implement message broker solutions using RabbitMQ for asynchronous communication.
  • Integrate Redis for caching and session management to optimize application performance.
  • Implement and manage task queues using Celery for efficient background processing (see the sketch after this list).
  • Work with PostgreSQL, ensuring proper database design, query optimization, and performance tuning.
  • Collaborate with front-end developers, DevOps engineers, and stakeholders to deliver high-quality software solutions.
  • Write clean, modular, and well-documented code following best practices.
  • Debug, troubleshoot, and resolve issues across the application stack.
  • Participate in code reviews, system design discussions, and team meetings.
  • Ensure scalability, reliability, and security of applications.
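For context on how RabbitMQ, Redis, and Celery typically fit together in a Django project, here is a minimal, hedged sketch: RabbitMQ as the broker, Redis as the result backend, and one background task. The project name, connection URLs, and task are placeholders, not this company's actual setup.

from celery import Celery

app = Celery(
    "webshop",                                       # hypothetical project name
    broker="amqp://guest:guest@localhost:5672//",    # RabbitMQ as the message broker
    backend="redis://localhost:6379/0",              # Redis stores task results
)

@app.task(bind=True, max_retries=3)
def generate_invoice(self, order_id: int) -> str:
    """Render an invoice outside the request/response cycle."""
    try:
        # ... fetch the order, render a PDF, upload it to storage ...
        return f"invoice-{order_id}.pdf"
    except Exception as exc:
        # Retry transient failures with a 30-second delay between attempts.
        raise self.retry(exc=exc, countdown=30)

# In a Django view the task would be enqueued rather than called inline:
#   generate_invoice.delay(order.id)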


Required Technical Skills :

  • Minimum 3+ Years of relevant experience in Python and Django framework.
  • Proficiency in RabbitMQ for message brokering.
  • Hands-on experience with Redis for caching and session management.
  • Strong knowledge of Celery for distributed task queues.
  • Experience with PostgreSQL, including database design, indexing, and optimization.
  • Expertise in RESTful API design and development.
  • Understanding of Docker and containerized applications.


Preferred Skills :

  • Experience with CI/CD pipelines for automated deployments.
  • Familiarity with cloud platforms like AWS or GCP.
  • Knowledge of Django ORM and its performance optimizations.
  • Basic understanding of front-end technologies (HTML, CSS, JavaScript).

Soft Skills

  • Strong problem-solving and analytical abilities.
  • Excellent communication and collaboration skills.
  • Ability to adapt to an agile development environment and evolving requirements.

Educational Qualifications :

  • Bachelor’s degree in Computer Science, Software Engineering, or a related field.
Deqode
Posted by Purvisha Bhavsar
Gurugram, Delhi, Noida, Ghaziabad, Faridabad
6 - 10 yrs
₹5L - ₹15L / yr
Google Cloud Platform (GCP)
Python
PySpark
.NET
Scala

🚀 Hiring: Data Engineer | GCP + Spark + Python + .NET | 6–10 Yrs | Gurugram (Hybrid)


We’re looking for a skilled Data Engineer with strong hands-on experience in GCP, Spark-Scala, Python, and .NET.


📍 Location: Suncity, Sector 54, Gurugram (Hybrid – 3 days onsite)

💼 Experience: 6–10 Years

⏱️ Notice Period: Immediate Joiner


Required Skills:

  • 5+ years of experience in distributed computing (Spark) and software development.
  • 3+ years of experience in Spark-Scala
  • 5+ years of experience in Data Engineering.
  • 5+ years of experience in Python.
  • Fluency in working with databases (preferably Postgres).
  • Have a sound understanding of object-oriented programming and development principles.
  • Experience working in an Agile Scrum or Kanban development environment.
  • Experience working with version control software (preferably Git).
  • Experience with CI/CD pipelines.
  • Experience with automated testing, including integration/delta, load, and performance testing.
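As a rough, hedged illustration of the Spark and Postgres experience listed above, the PySpark sketch below reads a table over JDBC and computes a daily aggregate. Connection details, table, and column names are placeholders, and it assumes the Postgres JDBC driver is available on the Spark classpath.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-rollup").getOrCreate()

# Read a Postgres table in parallel over JDBC (placeholder connection details).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://db-host:5432/sales")
    .option("dbtable", "public.orders")
    .option("user", "report_user")
    .option("password", "change-me")
    .load()
)

# Daily revenue per country, computed across the cluster.
daily_revenue = (
    orders.groupBy("country", F.to_date("created_at").alias("day"))
    .agg(F.sum("amount").alias("revenue"))
    .orderBy("day")
)

daily_revenue.write.mode("overwrite").parquet("/data/marts/daily_revenue")
spark.stop()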
Bluecopa
Agency job
Bengaluru (Bangalore)
4 - 7 yrs
₹10L - ₹15L / yr
DevOps
Python
Kubernetes
Amazon Web Services (AWS)
Windows Azure
+2 more

Role: DevOps Engineer


Exp: 4 - 7 Years

CTC: up to 28 LPA


Key Responsibilities

•   Design, build, and manage scalable infrastructure on cloud platforms (GCP, AWS, Azure, or OCI)

•   Administer and optimize Kubernetes clusters and container runtimes (Docker, containerd)

•   Develop and maintain CI/CD pipelines for multiple services and environments

•   Manage infrastructure as code using tools like Terraform and/or Pulumi

•   Automate operations with Python and shell scripting for deployment, monitoring, and maintenance (see the sketch after this list)

•   Ensure high availability and performance of production systems and troubleshoot incidents effectively

•   Monitor system metrics and implement observability best practices using tools like Prometheus, Grafana, ELK, etc.

•   Collaborate with development, security, and product teams to align infrastructure with business needs

•   Apply best practices in cloud networking, Linux administration, and configuration management

•   Support compliance and security audits; assist with implementation of cloud security measures (e.g., firewalls, IDS/IPS, IAM hardening)

•   Participate in on-call rotations and incident response activities
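As a small, hedged example of the Python-based monitoring automation mentioned in the responsibilities, the sketch below polls the Prometheus HTTP API for instances whose `up` metric is 0. The Prometheus URL is a placeholder, and the alerting step is only indicated in a comment.

import sys
import requests

PROMETHEUS_URL = "http://prometheus.example.internal:9090"   # placeholder endpoint

def down_instances() -> list[str]:
    """Return the instances whose `up` metric is currently 0."""
    resp = requests.get(
        f"{PROMETHEUS_URL}/api/v1/query",
        params={"query": "up == 0"},
        timeout=10,
    )
    resp.raise_for_status()
    return [r["metric"].get("instance", "unknown")
            for r in resp.json()["data"]["result"]]

if __name__ == "__main__":
    down = down_instances()
    if down:
        # In practice this would page the on-call rotation (Slack, PagerDuty, ...).
        print(f"ALERT: {len(down)} instance(s) down: {', '.join(down)}")
        sys.exit(1)
    print("All scraped instances are up")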

Building the world's largest search intelligence products.
Agency job
via Qrata by Prajakta Kulkarni
Bengaluru (Bangalore)
3 - 6 yrs
₹8L - ₹18L / yr
Java
Python
Machine Learning (ML)
XSD
XML
+10 more

About the Role-

This role is about thinking big and executing beyond what is expected. The challenges cut across algorithmic problem solving, systems engineering, machine learning, and infrastructure at massive scale.

Reason to Join-

An opportunity for innovators, problem solvers, and learners. The work is innovative, empowering, rewarding, and fun, with an amazing office, competitive pay, and an excellent benefits package.

 

Requirements and Responsibilities- (please read carefully before applying)

  • Overall experience of 3-6 years with Java/Python frameworks and machine learning.
  • Experience developing web services with REST, XSD, XML, Java, Python, AWS, and APIs.
  • Experience with Elasticsearch, Solr, or Lucene: search engines, text mining, and indexing.
  • Experience with highly scalable tools such as Kafka, Spark, Aerospike, etc.
  • Hands-on experience in design, architecture, implementation, performance and scalability, and distributed systems.
  • Design, implement, and deploy highly scalable and reliable systems.
  • Troubleshoot the Solr indexing process and query engine.
  • Bachelor's or Master's in Computer Science from a Tier 1 institution.
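As a brief, hedged illustration of the search-engine work described above, the sketch below indexes and queries documents with the Elasticsearch Python client (the role equally mentions Solr/Lucene). The endpoint, index name, and documents are placeholders, and an Elasticsearch 8.x cluster is assumed.

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")          # placeholder endpoint

# Index a couple of documents; text fields are analysed for full-text search.
es.index(index="articles", id="1",
         document={"title": "Search intelligence at scale", "views": 120})
es.index(index="articles", id="2",
         document={"title": "Distributed indexing pipelines", "views": 45})
es.indices.refresh(index="articles")                 # make them searchable immediately

# Relevance-ranked full-text query.
resp = es.search(index="articles", query={"match": {"title": "indexing"}})
for hit in resp["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])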
Noodle.ai
Posted by Ankita Ghosh
Remote only
8 - 15 yrs
₹20L - ₹70L / yr
TensorFlow
pandas
Python
Artificial Intelligence (AI)
Machine Learning (ML)
+2 more

Must have:

  • 8+ years of experience with a significant focus on developing, deploying & supporting AI solutions in production environments.
  • Proven experience in building enterprise software products for B2B businesses, particularly in the supply chain domain.
  • Good understanding of generics, OOP concepts, and design patterns.
  • Solid engineering and coding skills, with the ability to write high-performance, production-quality code in Python.
  • Proficiency with ML libraries and frameworks (e.g., Pandas, TensorFlow, PyTorch, scikit-learn).
  • Strong expertise in time series forecasting using statistical, ML, DL, and foundation models.
  • Experience processing time series data using techniques such as decomposition, clustering, and outlier detection and treatment (see the sketch after this list).
  • Exposure to generative AI models and agent architectures on platforms such as AWS Bedrock, Crew AI, Mosaic/Databricks, and Azure.
  • Experience working with modern data architectures, including data lakes and data warehouses, having leveraged one or more frameworks such as Airbyte, Airflow, Dagster, AWS Glue, Snowflake, and dbt.
  • Hands-on experience with cloud platforms (e.g., AWS, Azure, GCP) and deploying ML models in cloud environments.
  • Excellent problem-solving skills and the ability to work independently as well as in a collaborative team environment.
  • Effective communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
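As a hedged sketch of the time series preprocessing mentioned above, the snippet below flags outliers with a rolling median/MAD robust z-score and interpolates over them using pandas. The series, window size, and threshold are illustrative placeholders, not the company's actual method.

import numpy as np
import pandas as pd

# Hypothetical weekly demand series with one injected anomaly.
rng = pd.date_range("2024-01-07", periods=52, freq="W")
noise = np.random.default_rng(7).normal(0, 2, 52)
demand = pd.Series(100 + 10 * np.sin(np.arange(52) / 4) + noise, index=rng)
demand.iloc[20] = 400                                  # injected spike

window = 9
rolling_median = demand.rolling(window, center=True, min_periods=4).median()
abs_dev = (demand - rolling_median).abs()
mad = abs_dev.rolling(window, center=True, min_periods=4).median()

# Robust z-score: large where a point strays far from its local neighbourhood.
robust_z = 0.6745 * (demand - rolling_median) / mad
is_outlier = robust_z.abs() > 3.5

# Treatment: mask the outliers and interpolate over the gaps in time.
cleaned = demand.mask(is_outlier).interpolate(method="time")

print(f"Flagged {int(is_outlier.sum())} outlier(s) out of {len(demand)} points")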


Good To Have:

  • Experience with MLOps tools and practices for continuous integration and deployment of ML models.
  • Familiarity with deploying applications on Kubernetes.
  • Knowledge of supply chain management principles and challenges.
  • A Master's or Ph.D. in Computer Science, Machine Learning, Data Science, or a related field is preferred.
Vamstar
Posted by Sayantan Paul
Remote only
2 - 4 yrs
₹5L - ₹10L / yr
Python
Amazon Web Services (AWS)
Vamstar is a data-science-powered B2B healthcare marketplace that helps suppliers find the most relevant healthcare contracting leads using Artificial Intelligence (AI). We apply real-time machine learning forecasts, analyse buyer and competitor behaviour, and perform analytics at scale.

We are looking for a full-time remote Senior Backend Developer who has worked with big data and stream processing to solve big technical challenges at scale that will reshape the healthcare industry for generations. You will get the opportunity to be involved in big data engineering, novel machine learning pipelines, and highly scalable backend development. The successful candidate will work in a team of highly skilled and experienced developers, data scientists, and the CTO.

Job Requirements

1) Writing well tested, readable code using Python that is capable of processing large volumes of data

2) Experience with cloud platforms such as GCP, Azure or AWS is essential

3) The ability to work to project deadlines efficiently and with minimum guidance

4) A positive attitude and a love of working within a globally distributed team

Skills

1) Highly proficient working with Python

2) Comfort working with large data sets and high-velocity data streams

3) Experienced with microservices and backend services

4) Good relational and NoSQL database working knowledge

5) An interest in healthcare and medical sectors

6) Technical degree with a minimum of 2+ years of backend data-heavy development or data engineering experience in Python

7) Desirable: ETL/ELT

8) Desirable: Apache Spark, big data pipelines, and stream data processing (e.g. Kafka, Flink, Kinesis, Event Hub)
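As a hedged sketch of the stream-processing experience listed as desirable above, the snippet below consumes JSON events from a Kafka topic with the kafka-python client and keeps a running per-region count. The broker address, topic, group id, and field names are placeholders.

import json
from collections import Counter

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "tender-events",                               # hypothetical topic
    bootstrap_servers=["localhost:9092"],          # placeholder broker
    group_id="lead-scoring",                       # consumer group, allows scaling out
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
)

events_by_region = Counter()

# Poll the stream indefinitely; each message is one hypothetical tender event.
for message in consumer:
    event = message.value
    events_by_region[event.get("region", "unknown")] += 1
    if sum(events_by_region.values()) % 1000 == 0:
        print(dict(events_by_region))              # periodic progress snapshot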
Solutionec Private Limited
Posted by Ajith Gopi
Bengaluru (Bangalore), Paris
6 - 10 yrs
₹16L - ₹22L / yr
Cloud Computing
Big Data
Microsoft Business Intelligence (MSBI)
Amazon Web Services (AWS)
SQL Server Integration Services (SSIS)
+6 more
Exciting opportunity for a contractor to work with a start-up firm operating in both the product and services space. We are looking for someone with rich experience in the skills mentioned below who can join us immediately.

This role is for 1 month, during which the person will work from the client site in Paris to understand and document the system architecture. Contract extension will depend purely on individual performance.

Since the requirement is immediate and critical, we need someone who can join us soon and travel to Paris in December.

- Hands-on experience handling multiple data sources/datasets
- Experience in a data/BI architect role
- Expert in SSIS, SSRS, and SSAS
- Should have knowledge of writing MDX queries
- Technical document preparation
- Should have excellent communication skills
- Process oriented
- Strong project management
- Should be able to think out of the box and provide ideas for better solutions
- Outstanding team player with a positive attitude
BrowserStack
Posted by Aditi Singh
Mumbai
4 - 8 yrs
Best in industry
Java
Python
NodeJS (Node.js)
Ruby
Ruby on Rails (ROR)
● Good experience in at least one scripting language: Ruby, Node.js, Python, AppleScript, Unix shell, or similar
● Familiarity with one compiled language: C, Java, C++, Go, or similar
● Good knowledge of operating systems and networking concepts
● Reasonable knowledge of Windows and/or Linux operating systems
● Ability to work on Windows and Linux platforms below the application layer, including file systems, kernels, custom installations, shell scripting, internal APIs, etc.
● Aggressive problem diagnosis and creative problem-solving skills
● Startup mentality, high willingness to learn, and hardworking
● Experience of 4+ years
www.talentdekho.in
Posted by Saloni Mehta
Mumbai
2 - 7 yrs
₹4L - ₹17L / yr
NodeJS (Node.js)
NoSQL Databases
PHP
Python
Ruby on Rails (ROR)
+1 more
- 2+ yrs of experience in Node.js with an API framework such as Sails or Express.js
- Needs to be familiar with MongoDB, Cassandra, or any other NoSQL database
- Experience with a message queue such as RabbitMQ will be an add-on
Perspective
Posted by Nayan Goenka
Mumbai
3 - 7 yrs
₹4L - ₹7L / yr
AngularJS (1.x)
JavaScript
Python
Ruby on Rails (ROR)
NodeJS (Node.js)
+2 more
A high-functioning product team is being created for a SaaS product and two social media products.