

We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively on Google Cloud Platform (GCP) services such as:
  - Dataflow for real-time and batch data processing
  - Cloud Functions for lightweight serverless compute
  - BigQuery for data warehousing and analytics
  - Cloud Composer (based on Apache Airflow) for orchestration of data workflows
  - Google Cloud Storage (GCS) for managing data at scale
  - IAM for access control and security
  - Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.
Required Skills:
- 4–8 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
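
For illustration only (not part of the posting), a minimal Python sketch of the kind of data quality check and SQL validation described above, assuming the google-cloud-bigquery client library; the project, dataset, table, and column names are hypothetical:

```python
# Hypothetical data quality check against a BigQuery staging table.
# Project, dataset, table, and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

QUALITY_SQL = """
    SELECT
      COUNTIF(customer_id IS NULL) AS missing_ids,
      COUNTIF(order_total < 0)     AS negative_totals,
      COUNT(*)                     AS total_rows
    FROM `my-project.sales.orders_staging`
"""

def run_quality_checks() -> None:
    row = list(client.query(QUALITY_SQL).result())[0]
    if row.missing_ids or row.negative_totals:
        # In a Cloud Composer DAG this would fail the task so that
        # downstream loads do not run on bad data.
        raise ValueError(
            f"Quality check failed: {row.missing_ids} missing ids, "
            f"{row.negative_totals} negative totals out of {row.total_rows} rows"
        )

if __name__ == "__main__":
    run_quality_checks()
```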



Job Title: .NET Developer with Cloud Migration Experience
Job Description:
We are seeking a skilled .NET Developer with experience in C#, MVC, and ASP.NET to join our team. The ideal candidate will also have hands-on experience with cloud migration projects, particularly in migrating on-premise applications to cloud platforms such as Azure or AWS.
Responsibilities:
- Develop, test, and maintain .NET applications using C#, MVC, and ASP.NET
- Collaborate with cross-functional teams to define, design, and ship new features
- Participate in code reviews and ensure coding best practices are followed
- Work closely with the infrastructure team to migrate on-premise applications to the cloud
- Troubleshoot and debug issues that arise during migration and post-migration phases
- Stay updated with the latest trends and technologies in .NET development and cloud computing
Requirements:
- Bachelor's degree in Computer Science or related field
- X+ years of experience in .NET development using C#, MVC, and ASP.NET
- Hands-on experience with cloud migration projects, preferably with Azure or AWS
- Strong understanding of cloud computing concepts and principles
- Experience with database technologies such as SQL Server
- Excellent problem-solving and communication skills
Preferred Qualifications:
- Microsoft Azure or AWS certification
- Experience with other cloud platforms such as Google Cloud Platform (GCP)
- Familiarity with DevOps practices and tools

Dear,
We are excited to inform you about an exclusive opportunity at Xebia for a Senior Backend Engineer role.
📌 Job Details:
- Role: Senior Backend Engineer
- Shift: 1 PM – 10 PM
- Work Mode: Hybrid (3 days a week) across Xebia locations
- Notice Period: Immediate joiners or up to 30 days
🔹 Job Responsibilities:
✅ Design and develop scalable, reliable, and maintainable backend solutions
✅ Work on event-driven microservices architecture
✅ Implement REST APIs and optimize backend performance
✅ Collaborate with cross-functional teams to drive innovation
✅ Mentor junior and mid-level engineers
🔹 Required Skills:
✔ Backend Development: Scala (preferred), Java, Kotlin
✔ Cloud: AWS or GCP
✔ Databases: MySQL, NoSQL (Cassandra)
✔ DevOps & CI/CD: Jenkins, Terraform, Infrastructure as Code
✔ Messaging & Caching: Kafka, RabbitMQ, Elasticsearch
✔ Agile Methodologies: Scrum, Kanban
⚠ Please apply only if you have not applied recently and are not currently in the interview process for any open roles at Xebia.
Looking forward to your response! Also, feel free to refer anyone in your network who might be a good fit.
Best regards,
Vijay S
Assistant Manager - TAG
Java Developer with GCP
Skills: Java and Spring Boot, GCP, Cloud Storage, BigQuery, RESTful APIs
Experience: SA (6-10 years)
Location: Bangalore, Mangalore, Chennai, Coimbatore, Pune, Mumbai, Kolkata
Notice period: Immediate to 60 days
Kindly share your updated resume via WA - 91five000260seven


At F5, we strive to bring a better digital world to life. Our teams empower organizations across the globe to create, secure, and run applications that enhance how we experience our evolving digital world. We are passionate about cybersecurity, from protecting consumers from fraud to enabling companies to focus on innovation.
Everything we do centers around people. That means we obsess over how to make the lives of our customers, and their customers, better. And it means we prioritize a diverse F5 community where each individual can thrive.
F5, Inc. is seeking a Software Engineer III with experience in building highly available and highly scalable services on public clouds such as AWS, Azure, and GCP. In this role you will help develop networking and security technologies as a service (SaaS) to solve customers’ multi-cloud problems. You will be part of the Cloud Orchestration team working on the F5 Distributed Cloud platform.
Primary Responsibilities
- Design and develop highly available and highly scalable services using public cloud and F5 Distributed Cloud services.
- Understand product requirements related to multi-cloud and propose solutions.
- Follow the software development lifecycle for feature development, i.e., design, develop, test, and support features.
- Create prototypes to validate use cases and get feedback from the product team and architects.
- Work cohesively with a geographically distributed team.
Knowledge, Skills and Abilities
- Experience in designing and implementing solutions for services in the public cloud
- Experience in developing software in a SaaS environment
- Extensive hands-on experience with Infrastructure as Code (IaC) tools such as Terraform (preferred) or CloudFormation
- Extensive hands-on experience in programming languages such as Golang (preferred), Python, Java, or Rust
- Solid understanding of AWS VPC networking services such as Transit Gateway, Virtual Private Gateway, Direct Connect, Gateway Load Balancer, and PrivateLink
- Solid understanding of Azure networking services such as Virtual Network (VNet), ExpressRoute, and Azure load balancers
- Solid understanding of GCP networking services
- Good understanding of and experience with L2 to L7 networking protocols, including but not limited to Ethernet, TCP/IP, VLAN, BGP, and HTTP
- Good understanding of container technologies such as Docker and Kubernetes
- Working knowledge of CI/CD tools such as GitLab and Argo
- Ability to implement all phases of a development cycle for a software product, from understanding requirements through design, development, and deployment
- Self-motivated and willing to delve into new areas and take on new challenges in a proactive manner
- Excellent written and verbal communication skills
Qualifications
- Minimum of 5 years of related experience with a Bachelor's degree in Computer Science or a related field


Rich data in large volumes is being collected at the edge (outside a data center) in use cases such as autonomous vehicles, smart manufacturing, satellite imagery, smart retail, and smart agriculture, all on the path toward an autonomous world. These datasets are unstructured (images/videos), large (petabytes per month), and distributed (across edge, on-prem, and cloud), and they form the input for training AI models to reach higher degrees of automation.
Akridata builds products that solve these unique challenges and aims to be at the forefront of this edge data revolution.
The company is backed by prominent VCs, has its entire software engineering team based out of India, and provides ample opportunities for from-scratch design and development.
Role:
This is an individual contributor role responsible for developing web server backends for Akridata's management plane software, which provides a "single pane of glass" for users to manage assets and to specify and monitor large-volume data pipelines at scale, involving tens of petabytes of data.
This role involves:
1. Working with tech leads and the rest of the team on feature design activities and picking appropriate tools and techniques for implementation.
2. Being a hands-on developer able to independently make correct implementation choices and follow sound development practices to ensure an enterprise-grade application.
3. Guiding and mentoring junior team members.
What we are looking for:
1. A Bachelor’s or Master’s degree in computer science with strong CS fundamentals and problem solving.
2. 5+ years of hands-on experience in software development, with 3+ years in web backend development.
3. A good understanding of backend application interactions with relational databases like MySQL and Postgres.
4. Knowledge of web server development frameworks, preferably in Python.
5. Enthusiasm to work in a dynamic, fast-paced startup environment.
Good to have:
1. Hands-on experience with designing database schemas and implementing and debugging SQL queries for optimal performance on large datasets.
2. Experience working with applications deployed on Kubernetes clusters.
3. Experience working on a product from the early stages of its development, typically in a startup environment.
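
For illustration only, a minimal sketch of the kind of Python web backend endpoint this role describes; Flask and SQLAlchemy are assumed choices (the posting asks only for a Python web framework and MySQL/Postgres experience), and the table and column names are hypothetical:

```python
# Illustrative sketch only: a small Python web backend endpoint backed by a
# relational database. Flask and SQLAlchemy are assumed choices.
from flask import Flask, jsonify
from sqlalchemy import create_engine, text

app = Flask(__name__)
# Placeholder connection string; in practice this would point at MySQL or Postgres.
engine = create_engine("sqlite:///assets.db")

@app.route("/assets/<int:asset_id>")
def get_asset(asset_id: int):
    # Parameterized query to avoid SQL injection.
    with engine.connect() as conn:
        row = conn.execute(
            text("SELECT id, name, size_bytes FROM assets WHERE id = :id"),
            {"id": asset_id},
        ).fetchone()
    if row is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(dict(row._mapping))

if __name__ == "__main__":
    app.run(debug=True)
```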


- Deliver features in an end-to-end manner: technical design, development, testing, deployment, and maintenance.
- Provide technical leadership and own specific areas of the platform.
- Work closely with Product Managers to translate product requirements to engineering specifications.
- Lead code review efforts and quality efforts in your area of ownership.
- Participate in product discussions, taking ownership and initiative.
- Work independently in a fast-paced environment.
- Mentor and guide junior engineers.
Must Haves:
- Minimum 2 years of working experience in Ruby on Rails, Python, or Node.js.
- Strong problem-solving skills, data structures, and algorithms.
- Solid experience with databases such as MySQL, PostgreSQL, etc.
- Familiarity with tools for code reviews and version control (Git).
- A knack for writing clean, readable Ruby/Python/Node.js code.
- Experience in application server hosting and monitoring.
- Understanding of fundamental design principles behind a scalable application.
Nice to Have:
- Hands-on with search platforms (Solr, Elasticsearch, etc).
- Prior experience with microservices-based architecture/SOA/distributed systems.
- Able to implement automated testing platforms and unit tests.
- Experience with common AWS services such as EC2, RDS, S3, SES, etc.
Key Competencies:
- Building Effective Teams
- Solving Problems Creatively
- Learning Agility
- Drive for Results


JD - Senior Dev with more than 5 years of relevant experience
1. Working experience in web development using Python/Django
2. Comfortable using the Django framework
3. Understanding of web servers
4. Knowledge of PostgreSQL, MySQL, and other database queries
5. Experience using Redis as a cache
6. Understanding of web technologies
7. Experience in REST API development using Django
8. Strong command of Apache
9. Integration of data storage solutions
10. Ability to integrate multiple data sources and databases into one system
11. Understanding of JavaScript and front-end JS frameworks like Vue.js, React, and AngularJS
12. Good communication skills
13. .NET/.NET Core will be an added advantage
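
As an illustration of items 5 and 7 above (not part of the JD), a minimal sketch of a Django REST Framework view that caches a database-backed list in Redis; the model, cache key, and DRF/redis-py choices are assumptions:

```python
# Illustrative sketch only: a Django REST Framework view that caches a
# database-backed list in Redis. Assumes an existing Django project with
# DRF and redis-py installed; `myapp.models.Product` is hypothetical.
import json

import redis
from rest_framework.response import Response
from rest_framework.views import APIView

from myapp.models import Product  # hypothetical model

cache = redis.Redis(host="localhost", port=6379, db=0)

class ProductListView(APIView):
    def get(self, request):
        cached = cache.get("products:all")
        if cached is not None:
            return Response(json.loads(cached))
        # Cache miss: fall back to the database (e.g. PostgreSQL or MySQL).
        data = list(Product.objects.values("id", "name", "price"))
        cache.set("products:all", json.dumps(data, default=str), ex=60)  # 60 s TTL
        return Response(data)
```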


Develop state-of-the-art algorithms in the fields of Computer Vision, Machine Learning, and Deep Learning.
Provide software specifications and production code on time to meet project milestones.
Qualifications
BE or Master's with 3+ years of experience
Must have prior knowledge and experience in image processing and video processing
Should have knowledge of object detection and recognition
Must have experience in feature extraction, segmentation, and classification of images
Face detection, alignment, recognition, tracking, and attribute recognition
Excellent understanding of and project experience in Machine Learning, particularly in areas of Deep Learning – CNN, RNN, TensorFlow, Keras, etc.
Real-world expertise in deep learning applied to Computer Vision problems
Strong foundation in Mathematics
Strong development skills in Python
Must have worked with vision and deep learning libraries and frameworks such as OpenCV, TensorFlow, PyTorch, and Keras
Quick learner of new technologies
Ability to work independently as well as part of a team
Knowledge of working closely with version control (Git)
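
For illustration only, a minimal Python sketch of the face detection work mentioned above, using OpenCV's bundled Haar cascade; the image path is a placeholder:

```python
# Minimal face-detection sketch with OpenCV's bundled Haar cascade.
# Illustrative only; the image path is a placeholder.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def detect_faces(image_path: str):
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # Returns (x, y, w, h) bounding boxes for detected faces.
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

if __name__ == "__main__":
    print(detect_faces("sample.jpg"))
```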

Good knowledge of and experience working with backend systems; designing server-side architecture and production maintenance experience are must-haves.
At least 1-2 years of experience in a programming language such as Java, Ruby, PHP, Python, or Node.js (Node.js preferred).
Understanding of microservices-oriented architecture.
Experience with database design (SQL, NoSQL) and analytics.
Experience in driving and delivering complex features/software modules from technical design to launch.
Expertise with unit testing and Test-Driven Development (TDD).
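
As a small illustration of the unit testing and TDD expectation above (not part of the posting), a Python sketch using pytest as an assumed test framework; the function under test is hypothetical:

```python
# Tiny illustration of a test-driven workflow, using pytest as an assumed
# test framework. The function under test is hypothetical.
import pytest

def apply_discount(price: float, percent: float) -> float:
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_apply_discount():
    assert apply_discount(200.0, 25) == 150.0

def test_apply_discount_rejects_bad_percent():
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)
```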


- Spend most of your time developing brand new software in Python
- Build out a secure data replication infrastructure to replicate transactional data across a hierarchical system
- Build out APIs for supporting a fleet of 20k devices
- Build out a configurable data analytics and reporting framework
- Build out real time dashboards for getting a high level and deep dive view of the overall system
- Be responsible for independently designing and building out modules of the overall system
- Work with a team of smart peers and take ownership of the resulting system
- Push yourself, create impact, and learn and grow as an engineer
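
For illustration only, a minimal Python sketch of a device-facing ingest endpoint of the kind described above, assuming FastAPI and Pydantic as framework choices; the endpoint and field names are hypothetical, and an in-memory list stands in for the replicated transactional store:

```python
# Illustrative sketch only: a minimal device-facing ingest endpoint.
# FastAPI and Pydantic are assumed choices; field names are hypothetical.
from datetime import datetime, timezone
from typing import List

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class DeviceReport(BaseModel):
    device_id: str
    metric: str
    value: float

# Stand-in for the replicated transactional store described in the posting.
REPORTS: List[dict] = []

@app.post("/reports")
def ingest_report(report: DeviceReport) -> dict:
    record = {
        "device_id": report.device_id,
        "metric": report.metric,
        "value": report.value,
        "received_at": datetime.now(timezone.utc).isoformat(),
    }
    REPORTS.append(record)
    return {"status": "ok", "stored": len(REPORTS)}
```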

