

- Collaborate with business teams to understand the organization's data environment; develop and lead the data science team to test and scale new algorithms through pilots and subsequent scale-up of the solutions
- Influence, build, and maintain the large-scale data infrastructure required for AI projects, and integrate it with external IT infrastructure/services
- Act as the single point of contact for all data-related queries; maintain a strong understanding of internal and external data sources; provide inputs in deciding data schemas
- Design, develop, and maintain the framework for the analytics solutions pipeline
- Provide inputs to the organization's data-quality initiatives and help implement the related frameworks and tools
- Work in cross-functional teams of software/machine learning engineers, data scientists, product managers, and others to build the AI ecosystem
- Collaborate with external organizations, including vendors, where required, on all data-related queries and implementation initiatives


Role Overview:
As a Backend Developer at LearnTube.ai, you will ship the backbone that powers 2.3 million learners in 64 countries, owning APIs that crunch 1 billion learning events and the AI that supports them, all at <200 ms latency.
Skip the wait and get noticed faster by completing our AI-powered screening. Click this link to start your quick interview. It only takes a few minutes and could be your shortcut to landing the job! - https://bit.ly/LT_Python
What You'll Do:
At LearnTube, we’re pushing the boundaries of Generative AI to revolutionise how the world learns. As a Backend Engineer, you will build the backend for our AI systems and work directly on the AI stack. Your roles and responsibilities will include:
- Ship Micro-services – Build FastAPI services that handle ≈ 800 req/s today and will triple within a year (sub-200 ms p95); a minimal sketch follows this list.
- Power Real-Time Learning – Drive the quiz-scoring & AI-tutor engines that crunch millions of events daily.
- Design for Scale & Safety – Model data (Postgres, Mongo, Redis, SQS) and craft modular, secure back-end components from scratch.
- Deploy Globally – Roll out Dockerised services behind NGINX on AWS (EC2, S3, SQS) and GCP (GKE) via Kubernetes.
- Automate Releases – GitLab CI/CD + blue-green / canary = multiple safe prod deploys each week.
- Own Reliability – Instrument with Prometheus / Grafana, chase 99.9 % uptime, trim infra spend.
- Expose Gen-AI at Scale – Publish LLM inference & vector-search endpoints in partnership with the AI team.
- Ship Fast, Learn Fast – Work with founders, PMs, and designers in weekly ship rooms; take a feature from Figma to prod in < 2 weeks.
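For candidates new to FastAPI, the sketch below shows, in a deliberately minimal and hypothetical form, the kind of service the first bullet describes: a single quiz-scoring endpoint with typed request/response models. The endpoint path, model fields, and scoring logic are illustrative assumptions, not LearnTube's actual API.

```python
# Illustrative only: a minimal FastAPI micro-service with one quiz-scoring
# endpoint. Endpoint names and models are hypothetical placeholders.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="quiz-scoring-sketch")

class QuizSubmission(BaseModel):
    learner_id: str
    answers: dict[str, str]      # question_id -> chosen option
    answer_key: dict[str, str]   # question_id -> correct option

class QuizScore(BaseModel):
    learner_id: str
    correct: int
    total: int

@app.post("/v1/quiz/score", response_model=QuizScore)
async def score_quiz(submission: QuizSubmission) -> QuizScore:
    # Pure in-process scoring keeps the handler well under a 200 ms budget;
    # a real service would also persist the event (e.g. to Postgres or SQS).
    correct = sum(
        1
        for qid, answer in submission.answers.items()
        if submission.answer_key.get(qid) == answer
    )
    return QuizScore(
        learner_id=submission.learner_id,
        correct=correct,
        total=len(submission.answer_key),
    )

# Run locally with: uvicorn quiz_service:app --reload
```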
What makes you a great fit?
Must-Haves:
- 2+ yrs Python back-end experience (FastAPI)
- Strong with Docker & container orchestration
- Hands-on with GitLab CI/CD, AWS (EC2, S3, SQS) or GCP (GKE / Compute) in production
- SQL/NoSQL (Postgres, MongoDB) + You’ve built systems from scratch & have solid system-design fundamentals
Nice-to-Haves:
- k8s at scale, Terraform
- Experience with AI/ML inference services (LLMs, vector DBs); a toy vector-search sketch follows this list
- Go / Rust for high-perf services
- Observability: Prometheus, Grafana, OpenTelemetry
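For the vector-DB nice-to-have, here is a toy, self-contained illustration of the core idea behind vector search: rank stored embeddings by cosine similarity to a query embedding. The random embeddings and dimensions are placeholders; a production system would use a dedicated vector database and real model embeddings.

```python
# Toy vector search: rank stored embeddings by cosine similarity to a query.
# Embeddings here are random placeholders; a production system would use a
# dedicated vector database (e.g. pgvector) and real model embeddings.
import numpy as np

rng = np.random.default_rng(0)
doc_embeddings = rng.normal(size=(1000, 384))   # 1000 docs, 384-dim vectors
query = rng.normal(size=384)

def top_k_cosine(query: np.ndarray, docs: np.ndarray, k: int = 5) -> np.ndarray:
    # Normalise, then a single matrix-vector product gives all similarities.
    docs_norm = docs / np.linalg.norm(docs, axis=1, keepdims=True)
    query_norm = query / np.linalg.norm(query)
    scores = docs_norm @ query_norm
    return np.argsort(scores)[::-1][:k]          # indices of the k best docs

print(top_k_cosine(query, doc_embeddings))
```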
About Us:
At LearnTube, we’re on a mission to make learning accessible, affordable, and engaging for millions of learners globally. Using Generative AI, we transform scattered internet content into dynamic, goal-driven courses with:
- AI-powered tutors that teach live, solve doubts in real time, and provide instant feedback.
- Seamless delivery through WhatsApp, mobile apps, and the web, with over 1.4 million learners across 64 countries.
Meet the Founders:
LearnTube was founded by Shronit Ladhani and Gargi Ruparelia, who bring deep expertise in product development and ed-tech innovation. Shronit, a TEDx speaker, is an advocate for disrupting traditional learning, while Gargi’s focus on scalable AI solutions drives our mission to build an AI-first company that empowers learners to achieve career outcomes. We’re proud to be recognised by Google as a Top 20 AI Startup and are part of their 2024 Startups Accelerator: AI First Program, giving us access to cutting-edge technology, credits, and mentorship from industry leaders.
Why Work With Us?
At LearnTube, we believe in creating a work environment that’s as transformative as the products we build. Here’s why this role is an incredible opportunity:
- Cutting-Edge Technology: You’ll work on state-of-the-art generative AI applications, leveraging the latest advancements in LLMs, multimodal AI, and real-time systems.
- Autonomy and Ownership: Experience unparalleled flexibility and independence in a role where you’ll own high-impact projects from ideation to deployment.
- Rapid Growth: Accelerate your career by working on impactful projects that pack three years of learning and growth into one.
- Founder and Advisor Access: Collaborate directly with founders and industry experts, including the CTO of Inflection AI, to build transformative solutions.
- Team Culture: Join a close-knit team of high-performing engineers and innovators, where every voice matters, and Monday morning meetings are something to look forward to.
- Mission-Driven Impact: Be part of a company that’s redefining education for millions of learners and making AI accessible to everyone.

Job Title - MERN Stack Trainer
Skills Required - Java, Python, React, Node.js, MongoDB
Experience - 3-4 Years
Package - 6-10 LPA
Location - Coimbatore
Purpose of Job:
We are looking for an exceptionally talented senior data engineer with experience in implementing AWS services to build data pipelines, API integrations, and data warehouse designs.
Job Responsibilities:
• Total 4+ years of experience as a Data Engineer
• A minimum of 3 years of AWS Cloud experience
• Well versed in languages such as Python, PySpark, SQL, Node.js, etc.
• Extensive experience with the Spark ecosystem, covering both real-time and batch processing
• Experience with AWS Glue, EMR, DMS, Lambda, S3, DynamoDB, Step Functions, Airflow, RDS, Aurora, etc. (a minimal pipeline sketch follows this list)
• Experience with modern database systems such as Redshift, Presto, Hive, etc.
• Experience building data lakes on S3 or Apache Hudi
• Solid understanding of data warehousing concepts
• Good to have: experience with tools such as Kafka or Kinesis
• Good to have: AWS Developer Associate or Solutions Architect Associate certification
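As a rough illustration of the Glue/Spark-style pipeline work listed above, the sketch below reads raw JSON events from S3 with PySpark, rolls them up into a daily summary, and writes Parquet back to S3. Bucket names, paths, and column names are assumptions made for the example, not a specific project's layout.

```python
# Illustrative PySpark batch job: aggregate raw JSON events from S3 into a
# daily summary written back as Parquet. Bucket names, paths, and column
# names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-event-rollup").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/events/2024-01-01/")

daily_summary = (
    raw.withColumn("event_date", F.to_date("event_timestamp"))
       .groupBy("event_date", "event_type")
       .agg(
           F.count("*").alias("event_count"),
           F.countDistinct("user_id").alias("unique_users"),
       )
)

(
    daily_summary.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-curated-bucket/daily_summary/")
)

spark.stop()
```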
Qualifications:
At least a bachelor’s degree in Science, Engineering, or Applied Mathematics.
Other Requirements: Learning attitude, ownership skills
- Develops software solutions by studying information needs, conferring with users, studying systems flow, data usage, and work processes; investigating problem areas; and following the software development lifecycle.
- Determines operational feasibility by evaluating analysis, problem definition, requirements, solution development, and proposed solutions.
- Documents and demonstrates solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code.
- Prepares and installs solutions by determining and designing system specifications, standards, and programming.
- Improves operations by conducting systems analysis and recommending changes in policies and procedures.
- Updates job knowledge by studying state-of-the-art development tools, programming techniques, and computing equipment, and by participating in educational opportunities, reading professional publications, maintaining personal networks, and participating in professional organizations.
- Protects operations by keeping information confidential.
- Provides information by collecting, analyzing, and summarizing development and service issues.
- Accomplishes engineering and organization mission by completing related results as needed.
- Supports and develops software engineers by providing advice, coaching, and educational opportunities.


Responsibilities
- Analyse business requirements, prepare design and implementation recommendations, and estimate development effort
- Collaborate with cross-functional teams to define, design, and ship new features
- Lead or participate in design reviews, code reviews and architecture evolution discussions
- Unit-test code for robustness, including edge cases, usability, and general reliability
- Work on bug fixing and improving application performance
- Continuously discover, evaluate, and implement new technologies to maximize development efficiency
Required Skills
- Experience with front-end programming using React JS.
- Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model.
- Thorough understanding of React.js and its core principles.
- Experience with popular React.js workflows (such as Flux or Redux)
- Experience with .NET Core
- Experience with MS SQL databases

Job Overview:
You will work in engineering and development teams to integrate and develop cloud solutions and virtualized deployments of a software-as-a-service product. This requires an understanding of the software system architecture as well as its performance and security requirements. The DevOps Engineer is also expected to have expertise in available cloud solutions and services, administration of virtual machine clusters, performance tuning and configuration of cloud computing resources, configuration of security, and scripting and automation of monitoring functions. This position requires the deployment and management of multiple virtual clusters and working with compliance organizations to support security audits. The design and selection of cloud computing solutions that are reliable, robust, extensible, and easy to migrate are also important.
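Because the role stresses scripting and automation of monitoring functions, here is a minimal, generic sketch of the kind of health-check script such automation might start from; the endpoint URLs and thresholds are placeholders, not any specific product's configuration.

```python
# Illustrative monitoring helper: poll a set of HTTP health endpoints and
# report any that are down or slow. URLs and thresholds are placeholders.
import time
import urllib.request
import urllib.error

HEALTH_ENDPOINTS = [
    "https://service-a.example.com/healthz",
    "https://service-b.example.com/healthz",
]
TIMEOUT_SECONDS = 5
LATENCY_WARN_SECONDS = 1.0

def check(url: str) -> None:
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=TIMEOUT_SECONDS) as resp:
            elapsed = time.monotonic() - start
            status = resp.status
    except (urllib.error.URLError, TimeoutError) as exc:
        print(f"DOWN  {url}: {exc}")
        return
    if status != 200:
        print(f"WARN  {url}: HTTP {status}")
    elif elapsed > LATENCY_WARN_SECONDS:
        print(f"SLOW  {url}: {elapsed:.2f}s")
    else:
        print(f"OK    {url}: {elapsed:.2f}s")

if __name__ == "__main__":
    for endpoint in HEALTH_ENDPOINTS:
        check(endpoint)
```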
Experience:
- Experience working on billing and budgets for a GCP project - MUST
- Experience working on optimizations on GCP based on vendor recommendations - NICE TO HAVE
- Experience implementing such recommendations on GCP
- Architect Certifications on GCP - MUST
- Excellent communication skills (both verbal & written) - MUST
- Excellent documentation skills on processes and steps and instructions- MUST
- At least 2 years of experience on GCP.
Basic Qualifications:
- Bachelor’s/Master’s Degree in Engineering OR Equivalent.
- Extensive scripting or programming experience (Shell Script, Python).
- Extensive experience working with CI/CD (e.g. Jenkins).
- Extensive experience working with GCP, Azure, or Cloud Foundry.
- Experience working with databases (PostgreSQL, Elasticsearch).
- Must have a minimum of 2 years of experience on GCP, along with GCP certification.
Benefits :
- Competitive salary.
- Work from anywhere.
- Learning and gaining experience rapidly.
- Reimbursement for basic working set up at home.
- Insurance (including top-up insurance for COVID).
Location :
Remote - work from anywhere.
Ideal joining preferences:
Immediate or 15 days
We are looking for a high-performing Enterprise Sales Manager to meet customer acquisition targets. You will be responsible for maximising our sales team's potential, driving customer acquisition, and executing the sales strategy.
- Build and promote strong, long-lasting customer relationships by partnering with them and understanding their needs.
- Identifying the correct decision makers in target companies and setting up meetings to understand their requirements and how our products can help them.
- A seasoned sales professional with an innate hunter mentality who has a proven track record of over-achieving sales revenue growth targets
- Actively approach targeted business clients (telephone, email, social networks, events, etc.)
- Search for new client leads and ability to build a new sales pipeline.
- Responsible for generating contacts and leads.
- Maintain and expand the database of prospects.
- Developing and managing relationships with CXO-level stakeholders
- An individual with an innate sense of ownership who successfully works towards closing deals.
- Negotiating with the required stakeholders and closing the deal
- Maintaining the relationship with the client
- Delivering growth targets across geographies, customer segments, and products
Desired Skills and Requirement:
- MBA desired
- Proven experience in new client acquisition.
- Willing to travel and meet prospects
- An attitude to get things done and willingness to work in a high-growth start-up environment
- Excellent Relationship Management skills
- Excellent presentation skills.
- Good communication skills




- Writing clean code.
- Building reusable code
- Leverage native APIs for deep integrations with both platforms; diagnose and fix bugs and performance bottlenecks so that performance feels native.
- Architecting and building new apps from scratch.
- Converting existing browser-based applications to mobile.
- Developing high performance multiuser social media networking and analytics centric mobile apps.
- Developing high performance multiuser mobile apps.
- Performing and developing proper unit tests.
- Performing additional duties as determined by business needs and as directed by management.
- Working on bug fixing and improving application performance.
- The ideal candidate will have 2-3 years of experience.
- Entrepreneurial with a founder mindset.
- Ambitious, willing to work hard and invest in building a great career.
- Candidates who care more about what they learn and the impact they make.
- Take ownership of all tasks.
- Believe in work-life integration: passion > work-life balance.
- Must have experience in Android/iOS development
- Must have experience in hybrid app development
- Willing to work with cross-platform frameworks
- Experience with consuming REST APIs.
- Experience with Git
As a Java Developer, you will be responsible for developing cutting edge health-tech applications that include high scale transaction processing, intelligent bot based programs and data analytics.
What you will do:
- Building components for the company’s advanced health tech platform using Java, Solr, SpringBoot, DialogFlow
- Communicating effectively in a cross-functional product development team and presenting ideas and solutions effectively
- Prioritizing and managing workload and meeting critical project milestones and deadlines
- Effectively collaborating with a team as well as taking initiative and working independently to solve problems
Desired Candidate Profile
What you need to have:
- B.Tech/B.E. (with 65% marks)
- Expertise in hands-on programming in Java and J2EE
- Proven expertise in Java interfaces with MongoDB (or similar NoSQL databases) as well as relational databases (MySQL, Postgres, etc.)
- Experience in at least one 6+ month development project involving SpringBoot and Hibernate
- Strong understanding of application server infrastructure
- Good working knowledge of Maven based build systems
- Good understanding of build and deployment pipelines that involve ANT and Jenkins
- Understanding of code versioning tools, such as Git or SVN
- Good knowledge of working with REST APIs and web services
- Excellent problem-solving and interpersonal skills
- Hands-on experience with Lucene/Solr
- Familiarity with DialogFlow-based chatbot building
- Knowledge of NLP
- Experience in product-based companies


Job Location: Gurgaon Sector 15
Min Experience: 4 years in Front End
Salary: Hike on current salary as per company norms
Your core responsibilities:
- Design and build efficient, scalable & high performance systems.
- Interact with Product, Design and Engineering teams to spec, build, test and deploy new features.
- Perform Peer Code Reviews and mentor other engineers.
- Investigate production issues pertaining to our retail applications and implement intelligent solutions.
- Work on experimental projects and prototypes and build them into stable solutions.
What will help you thrive in this role?
- You have 3+ years of experience building applications on the web platform with JavaScript, HTML and CSS.
- Expertise in JavaScript and any JavaScript frameworks/libraries like ReactJS, VueJS, Angular, Redux, etc.
- Strong fundamentals on Web Standards, Browser Performance and Rendering Pipelines, HTTP and other relevant concepts.
- You can write testable, maintainable code that’s easy to understand.
- Experience with common front-end development tools such as Babel, Webpack, NPM, etc.
- Knowledge of server-side programming using Node.js is a plus.
- A knack for benchmarking and optimization.

