
We're at the forefront of building advanced AI systems, from fully autonomous agents that handle intelligent customer interaction to data-analysis tools that deliver actionable business insights. We are seeking enthusiastic interns who are passionate about AI and ready to tackle real-world problems using the latest technologies.
Duration: 6 months
Perks:
- Hands-on experience with real AI projects.
- Mentoring from industry experts.
- A collaborative, innovative, and flexible work environment.
After completion of the internship, there is a chance of a full-time role as an AI/ML Engineer (up to 12 LPA).
Compensation:
- Joining Bonus: A one-time bonus of INR 2,500 will be awarded upon joining.
- Stipend: The base stipend is INR 8,000 and can increase up to INR 20,000 depending on performance metrics.
Key Responsibilities
- Work with Python, LLMs, deep learning, NLP, and related technologies.
- Utilize GitHub for version control, including pushing and pulling code updates.
- Work with Hugging Face and OpenAI platforms for deploying models and exploring open-source AI models.
- Engage in prompt engineering and the fine-tuning process of AI models.
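To give a flavor of the prompt-engineering work described above, here is a minimal, library-free sketch of templating and sanity-checking a prompt before it would be sent to a model API (the template text and the `build_prompt` helper are illustrative, not part of any internal tooling; the actual API call is omitted):

```python
# Minimal prompt-engineering sketch: fill a template and guard against
# oversized prompts before sending them to a model API (call omitted).

PROMPT_TEMPLATE = (
    "You are a customer-support assistant.\n"
    "Context: {context}\n"
    "Question: {question}\n"
    "Answer concisely."
)

def build_prompt(context: str, question: str, max_chars: int = 2000) -> str:
    """Render the template; truncate the context if the prompt is too long."""
    prompt = PROMPT_TEMPLATE.format(context=context.strip(),
                                    question=question.strip())
    if len(prompt) > max_chars:
        # Trim the context first, since the question is usually short.
        overflow = len(prompt) - max_chars
        prompt = PROMPT_TEMPLATE.format(
            context=context.strip()[:-overflow], question=question.strip()
        )
    return prompt

prompt = build_prompt("Order #123 shipped on Monday.", "Where is my order?")
print(prompt)
```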
Requirements
- Proficiency in Python programming.
- Experience with GitHub and version control workflows.
- Familiarity with AI platforms such as Hugging Face and OpenAI.
- Understanding of prompt engineering and model fine-tuning.
- Excellent problem-solving abilities and a keen interest in AI technology.
To apply, click the link below and submit the assignment.

Similar jobs
Job Title : Python Django Developer
Experience : 3+ Years
Location : Gurgaon
Working Days : 6 Days (Monday to Saturday)
Job Summary :
We are looking for a skilled Python Django Developer with strong foundational knowledge in backend development, data structures, and operating system concepts.
The ideal candidate should have experience in Django and PostgreSQL, along with excellent logical thinking and multithreading knowledge.
Technical Skills : Python, Django (or Flask), PostgreSQL/MySQL, SQL & NoSQL ORM, REST API development, JSON/XML, strong knowledge of data structures, multithreading, and OS concepts.
Key Responsibilities :
- Write efficient, reusable, testable, and scalable code using the Django framework.
- Develop backend components, server-side logic, and statistical models.
- Design and implement high-availability, low-latency applications with robust data protection and security.
- Contribute to the development of highly responsive web applications.
- Collaborate with cross-functional teams on system design and integration.
Mandatory Skills :
- Strong programming skills in Python and Django (or similar frameworks like Flask).
- Proficiency with PostgreSQL / MySQL and experience in writing complex queries.
- Strong understanding of SQL and NoSQL ORM.
- Solid grasp of data structures, multithreading, and operating system concepts.
- Experience with RESTful API development and implementation of API security.
- Knowledge of JSON/XML and their use in data exchange.
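Since the mandatory skills call out multithreading, here is a small, self-contained sketch of the classic producer/consumer pattern with a worker thread and a lock-protected shared list; the job names and squaring step are made up purely for illustration:

```python
import threading
import queue

# Producer/consumer sketch: one worker thread drains a queue of jobs
# while the main thread enqueues work.

jobs: "queue.Queue[int | None]" = queue.Queue()
results = []
lock = threading.Lock()

def worker() -> None:
    while True:
        item = jobs.get()
        if item is None:          # sentinel: stop the worker
            jobs.task_done()
            break
        with lock:                # protect the shared results list
            results.append(item * item)
        jobs.task_done()

t = threading.Thread(target=worker)
t.start()
for n in range(5):
    jobs.put(n)
jobs.put(None)                    # tell the worker to exit
jobs.join()                       # wait until every job is processed
t.join()
print(sorted(results))  # [0, 1, 4, 9, 16]
```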
Good-to-Have Skills :
- Experience with Redis, MQTT, and message queues like RabbitMQ or Kafka
- Understanding of microservice architecture and third-party API integrations (e.g., payment gateways, SMS/email APIs)
- Familiarity with MongoDB and other NoSQL databases
- Exposure to data science libraries such as Pandas, NumPy, Scikit-learn
- Knowledge in building and integrating statistical learning models.
Job Description:
- The candidate must possess a strong technology background with advanced knowledge of a Java- and Python-based technology stack.
- Java, JEE, Spring MVC, Python, JPA, Spring Boot, REST API, Database, Playwright, CI/CD pipelines
- At least 3 years of hands-on Java EE and Core Java experience with strong leadership qualities.
- Experience with web service development, REST, and service-oriented architecture.
- Expertise in object-oriented design, design patterns, architecture, and application integration.
- Working knowledge of databases, including design and SQL proficiency.
- Strong experience with development and automated-testing frameworks such as Spring Boot, JUnit, and BDD.
- Experience with Unix/Linux operating systems and basic Linux commands.
- Strong development skills with the ability to understand a technical design and translate it into a workable solution.
- Basic knowledge of Python and hands-on experience with Python scripting.
- Build, deploy, and monitor applications using CI/CD pipelines.
- Experience with agile development methodology.
- Good to have: Elasticsearch, MongoDB or other NoSQL databases, Docker deployments, cloud deployments, any AI/ML, and Snowflake experience.
The requirements are as follows:
1) Familiarity with the Django REST Framework.
2) Experience with the FastAPI framework will be a plus.
3) Strong grasp of basic Python programming concepts (we ask a lot of questions on this in our interviews :) ).
4) Experience with databases like MongoDB, Postgres, Elasticsearch, and Redis will be a plus.
5) Experience with any ML library will be a plus.
6) Familiarity with Git, writing unit tests for all code, and CI/CD concepts will be a plus as well.
7) Familiarity with basic code patterns like MVC.
8) Grasp of basic data structures.
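For candidates brushing up on the MVC pattern mentioned above, here is a framework-free sketch of the split: the model holds the data, the view formats it, and the controller wires the two together. The `Task*` class names are invented for this example, not from any real codebase:

```python
# Library-free MVC sketch: model = data, view = presentation,
# controller = coordination between the two.

class TaskModel:
    def __init__(self):
        self._tasks = []

    def add(self, title: str) -> None:
        self._tasks.append(title)

    def all(self) -> list:
        return list(self._tasks)

class TaskView:
    @staticmethod
    def render(tasks) -> str:
        return "\n".join(f"- {t}" for t in tasks)

class TaskController:
    def __init__(self, model: TaskModel, view: TaskView):
        self.model, self.view = model, view

    def add_task(self, title: str) -> str:
        self.model.add(title)
        return self.view.render(self.model.all())

controller = TaskController(TaskModel(), TaskView())
controller.add_task("write unit tests")
output = controller.add_task("review PR")
print(output)
```

Frameworks like Django follow the same idea (it calls the pieces model, template, and view), so the vocabulary transfers directly.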
We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively on Google Cloud Platform (GCP) services such as:
- Dataflow for real-time and batch data processing
- Cloud Functions for lightweight serverless compute
- BigQuery for data warehousing and analytics
- Cloud Composer for orchestration of data workflows (based on Apache Airflow)
- Google Cloud Storage (GCS) for managing data at scale
- IAM for access control and security
- Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.
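The responsibilities above can be sketched as a plain extract-transform-load step. In the real pipeline the extract and load halves would call GCP clients (BigQuery, GCS), which are omitted here so the sketch stays self-contained; the sample rows and the "amount required" validation rule are invented for illustration:

```python
# ETL skeleton in plain Python: extract rows, apply cleansing and
# validation rules, then load (here, just count) the clean rows.

def extract() -> list:
    # Stand-in for reading rows from a source system.
    return [
        {"id": 1, "amount": " 19.99 ", "country": "in"},
        {"id": 2, "amount": "5.00", "country": "IN"},
        {"id": 3, "amount": "", "country": "us"},   # bad row: empty amount
    ]

def transform(rows: list) -> list:
    # Cleansing + validation: normalize fields, drop rows that fail checks.
    clean = []
    for row in rows:
        amount = row["amount"].strip()
        if not amount:               # data-quality rule: amount is required
            continue
        clean.append({
            "id": row["id"],
            "amount": float(amount),
            "country": row["country"].upper(),
        })
    return clean

def load(rows: list) -> int:
    # Stand-in for writing to BigQuery; here we just report the row count.
    return len(rows)

loaded = load(transform(extract()))
print(loaded)  # 2
```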
Required Skills:
- 4–8 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
Key Responsibilities:
● Work closely with product managers, designers, frontend developers, and other
cross-functional teams to ensure the seamless integration and alignment of frontend and
backend technologies, driving cohesive and high-quality product delivery.
● Develop and implement coding standards and best practices for the backend team.
● Document technical specifications and procedures.
● Stay up-to-date with the latest backend technologies, trends, and best practices.
● Collaborate with other departments to identify and address backend-related issues.
● Conduct code reviews and ensure code quality and consistency across the backend team.
● Create technical documentation, ensuring clarity for future development and
maintenance.
Requirements:
● Experience: 4-6 years of hands-on experience in backend development, with a strong
background in product-based companies or startups.
● Education: Bachelor’s degree or above in Computer Science or a related field.
● Programming skills: Proficient in Python and software development principles, with a
focus on clean, maintainable code, and industry best practices. Experienced in unit
testing, AI-driven code reviews, version control with Git, CI/CD pipelines using GitHub
Actions, and integrating New Relic for logging and APM into backend systems.
● Database Development: Proficiency in developing and optimizing backend systems in
both relational and non-relational database environments, such as MySQL and NoSQL
databases.
● GraphQL: Proven experience in developing and managing robust GraphQL APIs,
preferably using Apollo Server. Ability to design type-safe GraphQL schemas and
resolvers, ensuring seamless integration and high performance.
● Cloud Platforms: Familiar with AWS and experienced in Docker containerization and
orchestrating containerized systems.
● System Architecture: Proficient in system design and architecture with experience in
developing multi-tenant platforms, including security implementation, user onboarding,
payment integration, and scalable architecture.
● Linux Systems: Familiarity with Linux systems is mandatory, including deployment and
management.
● Continuous Learning: Stay current with industry trends and emerging technologies to
influence architectural decisions and drive continuous improvement.
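The production GraphQL stack here is Apollo Server, which is JavaScript; purely to illustrate the resolver idea in this document's main language, here is a library-free Python sketch where each query field maps to a function that fetches it and the response is shaped by the selection set. The `USERS` data and field names are invented:

```python
# GraphQL-style resolution sketch (no library): a resolver per field,
# and the response contains only the sub-fields the caller selected.

USERS = {1: {"id": 1, "name": "Asha", "email": "asha@example.com"}}

RESOLVERS = {
    "user": lambda args: USERS.get(args["id"]),
}

def execute(query_field: str, args: dict, selection: list) -> dict:
    """Resolve one field and return only the selected sub-fields,
    mimicking how a GraphQL query shapes its response."""
    obj = RESOLVERS[query_field](args)
    return {k: obj[k] for k in selection}

result = execute("user", {"id": 1}, ["id", "name"])
print(result)  # {'id': 1, 'name': 'Asha'}
```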
Benefits:
● Competitive salary.
● Health insurance.
● Casual dress code.
● Dynamic & Collaboration friendly office.
● Hybrid work schedule.
Industry
- IT Services and IT Consulting
Employment Type
Full-time
Mandatory skills
Hands-on Python programming. 5+ years of data engineering experience. Skill sets: Python, SQL (Snowflake), S3.
Good to have
AWS familiarity would help
As an employee of our company, you will collaborate with each department to create and deploy disruptive products. Come work at a growing company that offers great benefits with opportunities to move forward and learn alongside accomplished leaders. We're seeking an experienced and outstanding member of staff.
This position is both creative and rigorous by nature: you need to think outside the box. We expect the candidate to be proactive and have a "get it done" spirit. To be successful, you will need solid problem-solving skills.
Responsibilities
- Design & Develop Odoo Apps
- Translate requirements into clean and efficient code
- Analyze, diagnose, and resolve errors related to the applications.
Must Have
- Bachelor's Degree In Computers/MCA
- Object-oriented programming language
- Basics of Python/HTML/CSS/JS
- Highly creative and autonomous
Nice To Have
- Python/Javascript Knowledge
- Linux, Github
- Contribution in open source projects
- Strong communication skills
What's great in the job?
- Great team of smart people, in a friendly and open culture
- No dumb managers, no stupid tools to use, no rigid working hours
- No waste of time in enterprise processes, real responsibilities and autonomy
- Expand your knowledge of various business industries
- Create content that will help our users on a daily basis
- Real responsibilities and challenges in a fast evolving company
Rapido is India’s largest bike taxi player, focused on solving the first- and last-mile connectivity problem for India. The primary focus is mobility and changing all facets of mobility across India. We believe that two-wheelers are the right mode of transport for developing countries like India and have much more scope than four-wheelers, which is also reflected in the fact that the number of two-wheelers is significantly higher than the number of four-wheelers. We have operations in close to 100 cities and are the undisputed market leader in this space. Growing close to 500% year-on-year, we have ambitious targets set for ourselves in the future as well.
What you will do :
We are looking for a Technical Architect to design the structure of our IT systems and oversee programs to ensure the proper architecture is implemented.
You should have experience in data modelling, distributed system design, microservice architecture, and communication protocols, and be passionate about writing code and the art of management.
In this role, you should be an excellent communicator who is able to translate complex requirements into functional architecture. We'd also like you to have hands-on software development experience and be able to manage complex programs, overseeing their development and implementation and providing technical leadership and support to software development teams. Your goal will be to ensure our internal IT framework operates properly.
Responsibilities:
- Understand company needs to define system specifications
- Plan and design the structure of a technology solution
- Communicate system requirements to software development teams
- Evaluate and select appropriate software or hardware and suggest integration methods
- Oversee assigned programs (e.g. conduct code reviews) and provide guidance to team members
- Assist with solving technical problems when they arise
- Ensure the implementation of agreed architecture and infrastructure
- Address technical concerns, ideas and suggestions
- Monitor systems to ensure they meet both user needs and business goals
Requirements:
- Proven experience as a Technical Architect
- Hands-on experience with software development and system administration
- Understanding of strategic IT solutions
- Experience in project management and service-oriented architecture (SOA)
- Knowledge of selected coding languages (e.g. JavaScript, Java)
- Familiarity with various operating systems
- Experience in cloud technologies
- Excellent communication skills
- Problem-solving aptitude
- Organisational and leadership skills
Role Competencies:
- Proven work experience as a Back-end developer.
- Hands-on experience with programming languages like Java/Node.js/Golang and JavaScript; familiarity with Git.
- Databases (SQL/MySQL/NoSQL); good to have: Kafka or another queuing/messaging system.
- Familiarity with front-end languages (e.g. HTML, JavaScript and CSS)
- Strong knowledge of design principles, user interfaces, web standards and usability.
Functional/Behavioral:
- Excellent analytical, time-management, and people-management skills.
- Teamwork skills with a problem-solving attitude.
About Us
DataWeave provides Retailers and Brands with “Competitive Intelligence as a Service” that enables them to take
key decisions that impact their revenue. Powered by AI, we provide easily consumable and actionable
competitive intelligence by aggregating and analyzing billions of publicly available data points on the Web to
help businesses develop data-driven strategies and make smarter decisions.
Products@DataWeave
We, the Products team at DataWeave, build data products that provide timely insights that are readily
consumable and actionable, at scale. Our underpinnings are: scale, impact, engagement, and visibility. We help
businesses take data-driven decisions every day. We also give them insights for long-term strategy. We are
focused on creating value for our customers and helping them succeed.
How we work
It's hard to tell what we love more, problems or solutions! Every day, we choose to address some of the hardest
data problems that there are. We are in the business of making sense of messy public data on the web. At
serious scale! Read more on Become a DataWeaver
What do we offer?
● Opportunity to work on some of the most compelling data products that we are building for online
retailers and brands.
● Ability to see the impact of your work and the value you are adding to our customers almost immediately.
● Opportunity to work on a variety of challenging problems and technologies to figure out what really
excites you.
● A culture of openness. Fun work environment. A flat hierarchy. Organization wide visibility. Flexible
working hours.
● Learning opportunities with courses, trainings, and tech conferences. Mentorship from seniors in the
team.
● Last but not the least, competitive salary packages and fast paced growth opportunities.
Role and Responsibilities
● Build a low latency serving layer that powers DataWeave's Dashboards, Reports, and Analytics
functionality
● Build robust RESTful APIs that serve data and insights to DataWeave and other products
● Design user interaction workflows on our products and integrate them with data APIs
● Help stabilize and scale our existing systems. Help design the next generation systems.
● Scale our back end data and analytics pipeline to handle increasingly large amounts of data.
● Work closely with the Head of Products and UX designers to understand the product vision and design
philosophy
● Lead/be a part of all major tech decisions. Bring in best practices. Mentor younger team members and
interns.
● Constantly think scale, think automation. Measure everything. Optimize proactively.
● Be a tech thought leader. Add passion and vibrance to the team. Push the envelope.
Skills and Requirements
● 4-7 years of experience building and scaling APIs and web applications.
● Experience building and managing large scale data/analytics systems.
● Have a strong grasp of CS fundamentals and excellent problem solving abilities. Have a good
understanding of software design principles and architectural best practices.
● Be passionate about writing code and have experience coding in multiple languages, including at least
one scripting language, preferably Python.
● Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision
is right/wrong, and so on.
● Be a self-starter—someone who thrives in fast paced environments with minimal ‘management’.
● Have experience working with multiple storage and indexing technologies such as MySQL, Redis,
MongoDB, Cassandra, Elastic.
● Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ.
● Use the command line like a pro. Be proficient in Git and other essential software development tools.
● Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
● Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana,
Graylog, StatsD, Datadog etc.
● Working knowledge of building websites and apps. Good understanding of integration complexities and
dependencies.
● Working knowledge of Linux server administration as well as the AWS ecosystem is desirable.
● It's a huge bonus if you have personal projects (including open source contributions) that you work
on in your spare time. Show off some of the projects you have hosted on GitHub.
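As one concrete taste of the "low latency serving layer" work above, here is a tiny sketch of memoizing an expensive lookup with Python's standard `functools.lru_cache`; in production this role would more likely reach for Redis than an in-process cache, and the `product_summary` function and call counter are invented for the example:

```python
from functools import lru_cache

# Count how many times the "expensive" lookup actually runs, so the
# caching effect is visible.
CALLS = {"count": 0}

@lru_cache(maxsize=1024)
def product_summary(product_id: int) -> tuple:
    # Stand-in for an expensive aggregation query behind a dashboard.
    CALLS["count"] += 1
    return (product_id, f"summary-for-{product_id}")

product_summary(42)
product_summary(42)    # served from the cache; the body does not re-run
print(CALLS["count"])  # 1
```

A tuple is returned rather than a dict so cached results cannot be mutated by one caller and leak into another, a common pitfall with in-process caches.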
