
Big Data Engineer/Data Engineer
What we are solving
Welcome to today’s business data world where:
• Unification of all customer data into one platform is a challenge
• Extraction is expensive
• Business users do not have the time/skill to write queries
• High dependency on tech team for written queries
These challenges may look daunting, but real-time self-serve analytics offers solutions:
• Fully automated data integration from any kind of data source into a universal schema
• Analytics database that streamlines data indexing, query and analysis into a single platform.
• Start generating value from Day 1 through deep dives, root cause analysis and micro segmentation
At Propellor.ai, this is what we do.
• We help our clients reduce effort and increase effectiveness quickly
• By clearly defining the scope of Projects
• Using dependable, scalable, future-proof technology solutions such as Big Data solutions and cloud platforms
• Engaging Data Scientists and Data Engineers to provide end-to-end solutions, leading to the industrialisation of Data Science model development and deployment
What we have achieved so far
Since we started in 2016,
• We have worked across 9 countries with 25+ global brands and 75+ projects
• We have 50+ clients, 100+ Data Sources and 20TB+ data processed daily
Work culture at Propellor.ai
We are a small, remote team that believes in
• Working with a few, but only the highest-quality, team members who want to become the very best in their fields.
• Each member's belief and faith in what we are solving helps us collectively see the Big Picture.
• With no hierarchy, anyone can reach the decision maker without hesitation, so that our actions have fruitful and aligned outcomes.
• Each person is the CEO of their domain. So, every choice we make is aimed at letting our employees and clients succeed together!
To read more about us, click here:
https://bit.ly/3idXzs0
About the role
We are building an exceptional team of Data Engineers who are passionate developers and want to push the boundaries to solve complex business problems using the latest tech stack. As a Big Data Engineer, you will work with various Technology and Business teams to deliver our Data Engineering offerings to our clients across the globe.
Role Description
• The role involves big data pre-processing and reporting workflows, including collecting, parsing, managing, analyzing, and visualizing large data sets to turn information into business insights
• Develop the software and systems needed for end-to-end execution on large projects
• Work across all phases of SDLC, and use Software Engineering principles to build scalable solutions
• Build the knowledge base required to deliver increasingly complex technology projects
• The role would also involve testing various machine learning models on Big Data and deploying learned models for ongoing scoring and prediction.
Education & Experience
• B.Tech. or equivalent degree in CS/CE/IT/ECE/EEE
• 3+ years of experience designing technological solutions to complex data problems, and developing & testing modular, reusable, efficient, and scalable code to implement those solutions
Must have (hands-on) experience
• Python and SQL expertise
• Distributed computing frameworks (Hadoop Ecosystem & Spark components)
• Proficiency in at least one cloud computing platform (AWS/Azure/GCP); GCP experience preferred (BigQuery/Bigtable, Pub/Sub, Dataflow, App Engine)
• Linux environment, SQL, and shell scripting
Desirable
• Statistical or machine learning DSLs such as R
• Distributed and low-latency (streaming) application architecture
• Distributed NoSQL DBMSs such as Cassandra, CouchDB, MongoDB, etc.
• Familiarity with API design
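To illustrate the Python and SQL fluency expected above, here is a minimal, hedged sketch using Python's built-in sqlite3 module; the table name and data are purely illustrative and not from this posting:

```python
import sqlite3

# Hypothetical example: an in-memory table standing in for a real analytics store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (source TEXT, bytes INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("web", 120), ("web", 80), ("crm", 300)],
)

# Aggregate ingested volume per source, largest first.
rows = conn.execute(
    "SELECT source, SUM(bytes) AS total "
    "FROM events GROUP BY source ORDER BY total DESC"
).fetchall()
print(rows)  # [('crm', 300), ('web', 200)]
conn.close()
```

In practice the same GROUP BY pattern would run against a warehouse such as BigQuery rather than sqlite3.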
Hiring Process:
1. One phone screening round to gauge your interest and knowledge of fundamentals
2. An assignment to test your skills and ability to come up with solutions in a certain time
3. Interview 1 with our Data Engineer lead
4. Final Interview with our Data Engineer Lead and the Business Teams
Immediate joiners preferred.

About Propellor.ai
Who we are
At Propellor, we are passionate about solving Data Unification challenges faced by our clients. We build solutions using the latest tech stack. We believe all solutions lie in the congruence of Business, Technology, and Data Science. Combining the 3, our team of young Data professionals solves some real-world problems. Here's what we live by:
Skin in the game
We believe that Individual and Collective success orientations both propel us ahead.
Cross Fertility
Borrowing from and building on one another’s varied perspectives means we are always viewing business problems with a fresh lens.
Sub 25's
A bunch of young Turks who keep our explorer mindset alive and kicking.
Future-proofing
Keeping an eye ahead, we are upskilling constantly, staying relevant at any given point in time.
Tech Agile
Tech changes quickly. Whatever your stack, we adapt speedily and easily.
If you are evaluating us to be your next employer, we urge you to read more about our team and culture here: https://bit.ly/3ExSNA2. We assure you, it's worth a read!
Similar jobs
Job Title : Senior QA Automation Architect (Cloud & Kubernetes)
Experience : 6+ Years
Location : India (Multiple Offices)
Shift Timings : 12 PM to 9 PM (Noon Shift)
Working Days : 5 Days WFO (NO Hybrid)
About the Role :
We’re looking for a Senior QA Automation Architect with deep expertise in cloud-native systems, Kubernetes, and automation frameworks.
You’ll design scalable test architectures, enhance automation coverage, and ensure product reliability across hybrid-cloud and distributed environments.
Key Responsibilities :
- Architect and maintain test automation frameworks for microservices.
- Integrate automated tests into CI/CD pipelines (Jenkins, GitHub Actions).
- Ensure reliability, scalability, and observability of test systems.
- Work closely with DevOps and Cloud teams to streamline automation infrastructure.
Mandatory Skills :
- Kubernetes, Helm, Docker, Linux
- Cloud Platforms : AWS / Azure / GCP
- CI/CD Tools : Jenkins, GitHub Actions
- Scripting : Python, Pytest, Bash
- Monitoring & Performance : Prometheus, Grafana, Jaeger, K6
- IaC Practices : Terraform / Ansible
Good to Have :
- Experience with Service Mesh (Istio/Linkerd).
- Container Security or DevSecOps exposure.
GradRight is an ed-fin-tech company committed to driving transparency and accountability in the global higher education sector.
We are the world's first platform that integrates 'Selection' with 'Financing', bringing students, banks, and universities into a SaaS environment to make smarter decisions. Using data science, technology, and strategic partnerships across the industry, we enable students to find the “Right University” at the “Right Cost”. We are on a mission to help a million students find their best-fit universities and financial offerings by 2025.
Our flagship product, FundRight, is the world’s first student loan bidding platform. In just two years, we have enabled students to get the best deals on $1 billion of loan requests and facilitated disbursements of more than $120 million in loans. We are India's largest single pipeline of education loans and are poised to scale up globally.
Our second product, SelectRight, is the world's first MarTech SaaS platform that redefines how students select the right colleges and how colleges find and engage with the right students.
GradRight is a Delaware, USA-registered company with a wholly-owned subsidiary in India.
Brief:
Obsessed with customer needs, passionate about customer experience, and thriving on data: if that resonates with you, we’d love to hear from you. As a product manager, you’ll be the voice of reason for our entire product ecosystem. You’ll join a fast-paced environment and work with cross-functional teams to design and deliver products aligned with the company’s vision and strategy.
Responsibilities:
- Collect and analyze customer needs, collaborate with key stakeholders to ideate and deliver compelling solutions that create incremental value for customers
- Create buy-in for product vision across all stakeholders
- Develop positioning strategies that can drive growth and improve market share
- Create and maintain detailed product requirements and roadmap
- Create and track product success metrics and use them to identify problems and opportunities to pursue
- Work with designers to build wireframes, validate and translate them into full-blown designs
- Collaborate with engineering teams to drive feasibility analysis and refine product requirements
- Work closely with engineering teams to deliver the roadmap while optimizing for time-to-market
- Own the hand-off between design and engineering teams, own release timelines, and work with business teams to plan and execute product launches
- Evaluate marketing and sales strategies to ensure that they are in line with product strategy
- Standardize the product design process and evangelize the same across all stakeholders
Requirements:
- At least 6 years’ experience as a product manager at a B2C startup, with war stories to share
- Proven track record of managing all aspects of product development lifecycle
- Strong understanding of software development workflows and ability to work with top-notch engineering teams
- Strong problem-solving skills and willingness to roll up one’s sleeves to get the job done
- Excellent written and verbal communication skills
- Strong presentational and public speaking skills
- Bias for experimentation and ability to ask the right questions
Good to have:
- Both B2C and B2B experience, preferably on marketplace-like initiatives
- Exposure to edTech and/or finTech domains
- Worked on products that addressed an international audience
- Worked on products that scaled to millions of users
We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively on Google Cloud Platform (GCP) services such as:
- Dataflow for real-time and batch data processing
- Cloud Functions for lightweight serverless compute
- BigQuery for data warehousing and analytics
- Cloud Composer for orchestration of data workflows (based on Apache Airflow)
- Google Cloud Storage (GCS) for managing data at scale
- IAM for access control and security
- Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.
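The extract-transform-load shape described in the responsibilities above can be sketched with the standard library alone; this is a hedged, hypothetical example (inline CSV source, invented field names), whereas a real pipeline would read from GCS and load into BigQuery via the google-cloud client libraries or Dataflow:

```python
import csv
import io
import json

# Hypothetical inline source standing in for a file landed in Cloud Storage.
raw = io.StringIO("user_id,amount\n1,10.5\n2,\n3,4.0\n")

def extract(stream):
    """Read raw CSV rows into dicts."""
    return list(csv.DictReader(stream))

def transform(rows):
    """Cast types and apply a basic quality check: drop rows missing an amount."""
    return [
        {"user_id": int(r["user_id"]), "amount": float(r["amount"])}
        for r in rows
        if r["amount"]
    ]

def load(rows):
    """Stand-in for a warehouse load: newline-delimited JSON, the format BigQuery ingests."""
    return "\n".join(json.dumps(r) for r in rows)

ndjson = load(transform(extract(raw)))
print(ndjson)
```

The validation step here (dropping rows with a missing amount) is a placeholder for the data quality checks and validation rules the role calls for.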
Required Skills:
- 4–8 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
React.js, JavaScript, Redux, Angular; Frontend Developer / UI Developer / Full Stack Developer
Experience: 8+ years
AWS certification is a must.
Location: Pan India
Candidates should be self-motivated and proactive, and should have worked in agile environments
Minimum 5 years of relevant experience in Drupal
Migration experience in Drupal 7 or Drupal 8 is mandatory*
Experience with Drupal 9 is preferred
Should be an individual contributor
Should be well-versed in OOP concepts
Good experience in frameworks such as Laravel, CodeIgniter, CakePHP, or Symfony
Experience with React would be an added advantage.
- Work experience as a Python Developer
- Expertise in at least one popular Python framework (like Django, Flask or Pyramid)
- Knowledge of object-relational mapping (ORM)
- Familiarity with front-end technologies (like JavaScript and HTML5)
- Team spirit
- Good problem-solving skills
- Write effective, scalable code
- Develop back-end components to improve responsiveness and overall performance
- Integrate user-facing elements into applications
- Test and debug programs
- Improve functionality of existing systems
- Implement security and data protection solutions
- B.Tech/MTech from tier 1 institution
- 8+years of experience in machine learning techniques like logistic regression, random forest, boosting, trees, neural networks, etc.
- Demonstrated experience with Python and SQL, and proficiency in scikit-learn, pandas, NumPy, Keras, and TensorFlow/PyTorch
- Experience working with Qlik Sense or Tableau is a plus
Job Description:
- Creation and execution of content strategy across social media platforms (Facebook, Twitter, LinkedIn, Instagram, TikTok, etc.) adhering to the brand's requirements and guidelines
- Develop engaging, creative, innovative campaigns and content for regularly scheduled posts, which enlighten audiences and promote brand-focused messages
Technical Skills required:
- Working knowledge of prominent social media content management dashboards.
- Understanding of key social media engagement metrics.
- Working knowledge of MS Excel and MS PowerPoint; strong hands-on knowledge of MS Excel is a plus.
Other requirements:
- Ability to work in a small yet closely-knit team, and willingness to go the extra mile to help colleagues in times of need.
- Keenness to learn new things and grow.
- Good communication skills; you may often be required to speak with clients directly.
- Ability to put across your own point of view.
- Willingness to work long hours, and occasionally on weekends, during peak workload periods; these would be seldom but may occur.
- Jovial and hardworking nature.