Senior Data Engineer
Experience: 5+ years
Location: Chennai / Trichy (Hybrid)
Type: Full-time
Skills: GCP + Airflow + BigQuery + Python + Docker
The Role
As a Senior Data Engineer, you will own new initiatives and design and build world-class platforms to measure and optimize ad performance. You will ensure industry-leading scalability and reliability for mission-critical systems that process billions of real-time transactions a day, and you will apply state-of-the-art technologies, frameworks, and strategies to complex challenges in Big Data processing and analytics. You will work closely with talented engineers across time zones to build industry-first solutions for measuring and optimizing ad performance.
What you’ll do
● Write solid, high-performance code for services that require high throughput and low latency
● Architect, design, and build big data processing platforms that handle tens of TB per day, serve thousands of clients, and support advanced analytics workloads
● Provide meaningful, relevant feedback to junior developers and stay up to date with system changes
● Explore the technological landscape for new ways of producing, processing, and analyzing data to gain insights into both our users and our product features
● Design, develop, and test data-driven products, features, and APIs that scale
● Continuously improve the quality of deliverables and SDLC processes
● Operate production environments, investigate issues, assess their impact, and develop feasible solutions
● Understand business needs and work with product owners to establish priorities
● Bridge the gap between business/product requirements and technical details
● Work in multi-functional agile teams with end-to-end responsibility for product development and delivery
Who you are
● 3-5+ years of programming experience across object-oriented design and/or functional programming, including Python or a related language
● Love what you do, are passionate about crafting clean code, and have a solid engineering foundation
● Deep understanding of distributed-system technologies, standards, and protocols, with 2+ years of experience working with distributed systems such as Airflow, BigQuery, Spark, and the Kafka ecosystem (Kafka Connect, Kafka Streams) or Kinesis, and building data pipelines at scale
● Excellent SQL and dbt query-writing skills, and a strong understanding of data
● Care about agile software processes, data-driven development, reliability, and responsible experimentation
● Genuine desire to automate decision-making, processes, and workflows
● Experience working with orchestration tools like Airflow
● Good understanding of semantic layers and experience with tools like LookML and Kube
● Excellent communication skills and a team-player attitude
● Experience with Google BigQuery or Snowflake
● Experience with cloud environments, particularly Google Cloud Platform
● Experience with container technologies such as Docker and Kubernetes
● Experience with ad-serving technologies and standards
● Familiarity with AI tools like Cursor and Copilot