Research Scientist - Machine Learning/Artificial Intelligence




TensorIoT:
- AWS Advanced Consulting Partner (for ML and GenAI solutions)
- Pioneers in IoT and Generative AI products.
- Committed to diversity and inclusion in our teams.
TensorIoT is an AWS Advanced Consulting Partner. We help companies realize the value and efficiency of the AWS ecosystem. From building Proof of Concepts and product prototypes to production-ready applications, we analyze complex business problems every day and develop solutions that drive our customers’ success.
The founders of TensorIoT previously helped build world-class IoT and AI platforms at AWS and Google. Our mission now is to help connect devices and make them intelligent. We firmly believe in the transformative power of smart devices to enhance our quality of life.
TensorIoT is proud to be an equal opportunity employer. We are committed to diversity and inclusion and encourage people from all backgrounds to apply. We do not tolerate discrimination or harassment of any kind, and make our hiring decisions based solely on qualifications, merit, and business needs at the time.
Job Description
At the TensorIoT India team, we look forward to bringing on board mid-level and senior Machine Learning Engineers / Data Scientists. In this section, we briefly describe the role and the minimum and preferred requirements to qualify for the first round of the selection process. Please note: a candidate needs to meet at least 4 of the 7 preferred qualifications listed below to be eligible for the task / first round of interviews.
What are the kinds of tasks Data Scientists do at TensorIoT?
As a Data Scientist, your tasks revolve around the data we have and the business objectives of the client. The work generally involves: studying, understanding, and analyzing datasets; feature engineering; proposing solutions, evaluating them scientifically, and communicating with the client; implementing ETL pipelines with database/data lake tools; and conducting and presenting scientific research/experiments within the team and to the client.
Preferred Requirements:
- PhD in the domain of Data Science / Machine Learning
- M.Sc | M.Tech in the domain of Computer Science / Machine Learning
- Hands-on experience in MLOps (model deployment, maintenance).
- Some experience working with Generative AI (LLM).
- Some experience in creating cloud-native technologies, and microservices design.
- Well-rounded exposure to Computer Vision, Natural Language Processing, and Time-Series Analysis.
- Hands-on experience with Docker.
Minimum Requirements:
- Master's + 3 years of work experience in Machine Learning Engineering, OR B.Tech (Computer Science or related) + 5 years of work experience in Machine Learning Engineering, plus 1 year of cloud experience.
- Clear understanding of the following concepts:
  - Supervised Learning, Unsupervised Learning, Reinforcement Learning
  - Statistical Modelling, Deep Learning
  - Interpretable Machine Learning
  - Linear Algebra
- Scientific & Analytical mindset, proactive learning, adaptability to changes.
- Strong interpersonal and language skills in English, to communicate within the team and with the clients.
CV Tips:
Your CV is an integral part of your application process. We would appreciate it if the CV prioritizes the following:
- Focus:
  - More focus on technical skills relevant to the job description.
  - Less or no focus on your roles and responsibilities as a manager, team lead, etc.
  - Less or no focus on the design aspect of the document.
- Regarding the projects you completed at your previous companies:
  - Mention the problem statement very briefly.
  - Your role and responsibilities in that project.
  - Technologies & tools used in the project.
- Always good to mention (if relevant):
  - Scientific papers published, Master's thesis, Bachelor's thesis.
  - GitHub link, relevant blog articles.
  - Link to LinkedIn profile.
- Mention skills that are relevant to the job description and that you can demonstrate during the interview/tasks in the selection process.
We appreciate your interest in the company and look forward to your application.

ROLES AND RESPONSIBILITIES
As a Full Stack Developer at GoQuest Media, you will play a key role in building and maintaining web applications that deliver seamless user experiences for our global clients. From brainstorming features with the team to executing back-end logic, you will be involved in every aspect of our application development process.
You will be working with modern technologies like NodeJS, ReactJS, NextJS, and Tailwind CSS to create performant, scalable applications. Your role will span both front-end and back-end development as you build efficient and dynamic solutions to meet the company’s and users’ needs.
What will you be accountable for?
● End-to-End Development:
  ● Design and develop highly scalable and interactive web applications from scratch.
  ● Take ownership of both front-end (ReactJS, NextJS, Tailwind CSS) and back-end (NodeJS) development processes.
● Feature Implementation:
  ● Work closely with designers and product managers to translate ideas into highly interactive and responsive interfaces.
● Maintenance and Debugging:
  ● Ensure applications are optimized for performance, scalability, and reliability.
  ● Perform regular maintenance, debugging, and testing of existing apps to ensure they remain in top shape.
● Collaboration:
  ● Collaborate with cross-functional teams, including designers, product managers, and stakeholders, to deliver seamless and robust applications.
● Innovation:
  ● Stay updated with the latest trends and technologies to suggest and implement improvements in the development process.
Tech Stack
● Front-end: ReactJS, NextJS, Tailwind CSS
● Back-end: NodeJS, ExpressJS
● Database: MongoDB (preferred), MySQL
● Version Control: Git
● Tools: Webpack, Docker (optional but a plus)
Preferred Location
This role is based out of our Andheri Office, Mumbai.
Growth Opportunities for You
● Lead exciting web application projects end-to-end and own key product initiatives.
● Develop cutting-edge apps used by leading media clients around the globe.
● Gain experience working in a high-growth company in the media and tech industry.
● Potential to grow into a team lead role.
Who Should Apply?
● Individuals with a passion for coding and web technologies.
● Minimum 3-5 years of experience in full-stack development using NodeJS, ReactJS, NextJS, and Tailwind CSS.
● Strong understanding of both front-end and back-end development and the ability to write efficient, reusable, and scalable code.
● Familiarity with databases like MongoDB and MySQL.
● Experience with CI/CD pipelines and cloud infrastructure (AWS, Google Cloud) is a plus.
● Team players with excellent communication skills and the ability to work in a fast-paced environment.
Who Should Not Apply?
● If you're not comfortable with both front-end and back-end development.
● If you don’t enjoy problem-solving or tackling complex development challenges.
● If working in a dynamic, evolving environment doesn’t appeal to you.


At Livello we are building machine-learning-based demand forecasting tools as well as computer-vision-based multi-camera product recognition solutions that detect people and products and track the items inserted into or removed from shelves based on users' hand movements. We are building models to determine real-time inventory levels and user behaviour, and to predict how much of each product needs to be reordered so that the right products are delivered to the right locations at the right time to fulfil customer demand.
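As a rough illustration of the reorder-forecasting idea described above, here is a toy Python sketch (not Livello's actual system; the smoothing approach, parameter values and numbers are all invented for illustration):

    # Toy sketch: forecast short-term demand per product with exponential
    # smoothing and compare it against current shelf stock to size a reorder.
    import numpy as np

    def exp_smooth_forecast(daily_sales, alpha=0.3):
        """Exponentially smoothed one-step-ahead demand forecast."""
        level = daily_sales[0]
        for x in daily_sales[1:]:
            level = alpha * x + (1 - alpha) * level
        return level

    def reorder_quantity(daily_sales, current_stock, lead_time_days=2, safety_factor=1.2):
        """Units to reorder so expected demand over the lead time is covered."""
        forecast = exp_smooth_forecast(np.asarray(daily_sales, dtype=float))
        needed = forecast * lead_time_days * safety_factor
        return max(0, int(np.ceil(needed - current_stock)))

    # Example: a product that sold 4-7 units/day with 5 units left on the shelf.
    print(reorder_quantity([4, 6, 5, 7, 6, 5], current_stock=5))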
Responsibilities
- Lead the CV and DS Team
- Work in the area of Computer Vision and Machine Learning, with a focus on product (primarily food) and people recognition (position, movement, age, gender; GDPR/DSGVO compliant).
- Your work will include the formulation and development of Machine Learning models to solve the underlying problem.
- Help build our smart supply chain system, keep up to date with the latest algorithmic improvements in forecasting and prediction, and challenge the status quo.
- Statistical data modelling and machine learning research.
- Conceptualize, implement and evaluate algorithmic solutions for supply forecasting, inventory optimization, predicting sales, and automating business processes
- Conduct applied research involving the modelling of complex dependencies, statistical inference and predictive modelling
- Technological conception, design and implementation of new features
- Quality assurance of the software through planning, creation and execution of tests
- Work with a cross-functional team to define, build, test, and deploy applications
Requirements:
- Master's/PhD in Mathematics, Statistics, Engineering, Econometrics, Computer Science or a related field.
- 3-4 years of experience with computer vision and data science.
- Relevant Data Science experience, deep technical background in applied data science (machine learning algorithms, statistical analysis, predictive modelling, forecasting, Bayesian methods, optimization techniques).
- Experience building production-quality and well-engineered Computer Vision and Data Science products.
- Experience in image processing, algorithms and neural networks.
- Knowledge of the tools, libraries and cloud services for Data Science, ideally Google Cloud Platform.
- Solid Python engineering skills and experience with TensorFlow and Docker.
- Cooperative and independent work, analytical mindset, and willingness to take responsibility
- Fluency in English, both written and spoken.

Company Name : LMES Academy Private Limited
Website : https://lmes.in/
Linkedin : https://www.linkedin.com/company/lmes-academy/mycompany/
Role : Machine Learning Engineer
Experience: 2 Year to 4 Years
Location: Urapakkam, Chennai, Tamil Nadu.
Job Overview:
We are looking for a Machine Learning Engineer to join our team and help us advance our AI capabilities.
Requirements
• Model Training and Fine-Tuning: Utilize and refine large language models using techniques such as distillation and supervised fine-tuning to enhance performance and efficiency.
• Retrieval-Augmented Generation (RAG): Good understanding of RAG systems to improve the quality and relevance of generated content (see the retrieval sketch after this list).
• Vector Databases: Familiarity with vector databases to support fast and accurate similarity searches and other ML-driven functionalities.
• API Integration: Proficiency with REST APIs and integrating third-party APIs, including OpenAI, Google Vertex AI, and Cloudflare Workers AI, to extend our AI capabilities.
• Generative AI: Experience with generative AI applications, including text-to-image, speech recognition, and text-to-speech systems.
• Collaboration: Work collaboratively with cross-functional teams, including data scientists, developers, and product managers, to deliver innovative AI solutions.
• Adaptability: Thrive in a fast-paced environment with loosely defined tasks and competing priorities, ensuring timely delivery of high-quality results.
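To illustrate the RAG and vector-database items above, here is a minimal retrieval sketch (an illustrative example, not LMES's stack; it assumes the sentence-transformers and faiss-cpu packages, and the model name and documents are placeholders). In a full RAG pipeline, the retrieved passages would then be inserted into the LLM prompt for generation.

    # Minimal retrieval step of a RAG pipeline (illustrative only).
    import numpy as np
    import faiss
    from sentence_transformers import SentenceTransformer

    documents = [
        "Distillation compresses a large teacher model into a smaller student.",
        "Supervised fine-tuning adapts a pretrained LLM to labelled examples.",
        "Vector databases index embeddings for fast similarity search.",
    ]

    encoder = SentenceTransformer("all-MiniLM-L6-v2")  # placeholder model
    doc_vecs = np.asarray(encoder.encode(documents, normalize_embeddings=True), dtype="float32")

    index = faiss.IndexFlatIP(int(doc_vecs.shape[1]))  # inner product = cosine on normalized vectors
    index.add(doc_vecs)

    query = "How do I adapt an LLM to my own data?"
    q_vec = np.asarray(encoder.encode([query], normalize_embeddings=True), dtype="float32")
    scores, ids = index.search(q_vec, 2)

    for rank, (i, s) in enumerate(zip(ids[0], scores[0]), start=1):
        print(f"{rank}. score={s:.3f} :: {documents[i]}")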
Responsibilities
- Selecting features, building and optimizing classifiers using machine learning techniques
- Data mining using state-of-the-art methods
- Extending the company's data with third-party sources of information when needed
- Enhancing data collection procedures to include information that is relevant for building analytic systems
- Processing, cleansing, and verifying the integrity of data used for analysis
- Doing ad-hoc analyses and presenting results in a clear manner
- Creating automated anomaly detection systems and constantly tracking their performance
Key Skills
- Hands-on experience with analysis tools like R and advanced Python
- Must-have knowledge of statistical techniques and machine learning algorithms
- Artificial Intelligence
- Understanding of text analysis and Natural Language Processing (NLP)
- Knowledge of Google Cloud Platform
- Advanced Excel and PowerPoint skills
- Advanced communication (written and oral) and strong interpersonal skills
- Ability to work cross-culturally
- Deep Learning is good to have
- VBA and visualization tools like Tableau, Power BI, Qlik Sense and QlikView will be an added advantage

About Quizizz
Quizizz is one of the fastest-growing EdTech platforms in the world. Our team is on a mission to motivate every student and our learning platform is used by more than 75 million people per month in over 125 countries, including 80% of U.S. schools.
We have phenomenal investors, we’re profitable, and we’re committed to growing and improving every day. If you’re excited about international SaaS and want to build towards a mission that you can be proud of, then Quizizz might be a good fit for you.
We currently have offices in India and the U.S. with incredible team members around the world and we hope you’ll join us.
Role
We are looking for an experienced Product Analyst. The role offers an exciting opportunity to significantly shape the future of the product. The team is responsible for supporting decisions taken by other teams to improve the growth, engagement and revenue of the platform. Furthermore, the team sets up and maintains internal tools, apps, dashboards and processes, and acts as the arbiter of information within the organization.
The variety of tasks is immense and will give you the chance to play to your strengths. Tasks could include improving search and recommendations, data mining, identifying potential customers, ad-hoc analyses, creating APIs for internal consumption, and so on.
Some of the challenges you will face include:
- Working cross-functionally with design, engineering, sales and marketing teams to aid in decision making.
- Analyzing and drawing conclusions from experiments on new product features.
- Creating, maintaining, and modifying internal dashboards, apps and reports being used, as part of the larger analytics function at Quizizz.
- Deep diving into data to extract insights that could help explain a certain phenomenon.
- Organizing the analytics warehouse, as and when new data is added.
Requirements:
- At least 2 years of industry experience, providing solutions to business problems in a cross-functional team.
- Versatility to communicate clearly with both technical and non-technical audiences.
- Expertise in SQL and strong programming skills (Python preferred).
- Mathematical thinking.
- Attention to Detail.
Good to have:
- Experience with Jupyter (/iPython) notebooks.
- Experience using a data visualization tool such as Tableau, Google Data Studio, Qlikview, Power BI, RShiny.
- Ability to create simple data apps/APIs (we use Flask or Node.js); a minimal sketch follows this list.
- Knowledge of Natural Language Processing techniques.
- Data analytical and data engineering experience.
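As a minimal illustration of the data apps/APIs item above, here is a sketch assuming Flask (the posting names Flask or Node.js); the endpoint, metric and numbers are invented placeholders, not Quizizz internals.

    # Minimal internal data API sketch (Flask); values are hard-coded
    # stand-ins for a query against the analytics warehouse.
    from flask import Flask, jsonify

    app = Flask(__name__)

    DAILY_ACTIVE_USERS = {"2024-01-01": 120345, "2024-01-02": 118902}

    @app.route("/metrics/dau/<date>")
    def dau(date):
        value = DAILY_ACTIVE_USERS.get(date)
        if value is None:
            return jsonify(error="no data for date"), 404
        return jsonify(date=date, dau=value)

    if __name__ == "__main__":
        app.run(port=5000)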
Benefits:
At Quizizz, we have built a world-class team of talented individuals. While we all care deeply about our work, we also ensure that we maintain a healthy work-life balance. Our policies are designed to ensure the well-being and comfort of our employees. Some of the benefits we offer include:
- Healthy work-life balance. Put in your 8 hours, and enjoy the rest of your day.
- Flexible leave policy. Take time off when you need it.
- Comprehensive health coverage of Rs. 6 lakhs, covering the employee and their parents, spouse and children. Pre-existing conditions are covered from day 1, along with benefits like free doctor consultations and more.
- Relocation support including travel and accommodation, and we'll also pay for a broker to find your home in Bangalore!
- Rs. 20,000 annual health and wellness allowance.
- Professional development support. We will reimburse you for relevant courses and books that you need to become a better professional.
- Delicious meals, including breakfast and lunch served at the office, and a fully stocked pantry for all your snacking needs.



About the company: https://www.hectar.in/
Hectar Global is a financial technology company that provides a disruptive cross-border trading platform for the agricultural commodities market. Our platform utilizes machine learning and data analysis to provide insights and improve the trading experience for farmers, traders, and other stakeholders in the agricultural industry. Our mission is to bring transparency and efficiency to the agricultural commodities market, which has traditionally been fragmented and opaque. We are committed to driving innovation in the industry and providing a user-friendly and accessible platform for our customers.
Job Overview:
We are seeking a highly skilled Head of Engineering to lead the development of our innovative and disruptive cross-border trading platform at Hectar Global. This person will be responsible for spearheading all technical aspects of the project, from architecture and infrastructure to data and machine learning.
Responsibilities:
- Lead and manage a team of engineers, including hiring, training, and mentoring
- Design and implement the technical architecture of the cross-border trading platform
- Ensure the platform is scalable, efficient, and secure
- Oversee the development of data models and machine learning algorithms
- Collaborate with cross-functional teams, including product and design, to ensure the platform meets user needs and is visually appealing and easy to use
- Work with stakeholders to identify business requirements and translate them into technical solutions
- Develop and maintain technical documentation, including system specifications, design documents, and user manuals
- Keep up to date with emerging trends and technologies in software engineering, machine learning, and data science
Requirements:
- Bachelor's or Master's degree in Computer Science or related field
- 10+ years of experience in software engineering, with a focus on web applications and machine learning
- Proven track record of leading and managing a team of engineers
- Expertise in software architecture and design patterns
- Experience with data modeling and machine learning techniques
- Strong problem-solving and analytical skills
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams
- Experience working in an agile development environment
- Strong knowledge of front-end technologies, such as HTML, CSS, and JavaScript
- Familiarity with modern web frameworks, such as React or Angular
Preferred qualifications:
- Experience with cloud computing platforms, such as AWS or Azure
- Familiarity with data visualization tools, such as D3.js or Tableau
- Experience with containerization and orchestration tools, such as Docker and Kubernetes
- Understanding of financial markets and trading platforms
If you are a passionate leader with a proven track record of building innovative and disruptive products and teams, we would love to hear from you.

Sr AI Scientist, Bengaluru
Job Description
Introduction
Synapsica is a growth-stage HealthTech startup founded by alumni of IIT Kharagpur, AIIMS New Delhi, and IIM Ahmedabad. We believe healthcare needs to be transparent and objective while remaining affordable. Every patient has the right to know exactly what is happening in their body and shouldn’t have to rely on cryptic two-liners given as a diagnosis. Towards this aim, we are building an artificial-intelligence-enabled, cloud-based platform to analyse medical images and create v2.0 of advanced radiology reporting. We are backed by Y Combinator and other investors from India, the US and Japan. We are proud to have GE, AIIMS, and Spinal Kinetics as our partners.
Your Roles and Responsibilities
The role involves computer vision tasks including the development, customization and training of Convolutional Neural Networks (CNNs); the application of ML techniques (SVM, regression, clustering, etc.); and traditional image processing (OpenCV, etc.). The role is research focused and involves reading and implementing existing research papers, deep-dive problem analysis, generating new ideas, and automating and optimizing key processes.
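For illustration only, here is a minimal PyTorch CNN of the kind such work builds on (the architecture, sizes and dummy data are placeholders, not Synapsica's models):

    # Minimal illustrative CNN classifier and one training step (PyTorch).
    import torch
    import torch.nn as nn

    class SmallCNN(nn.Module):
        def __init__(self, num_classes=2):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Linear(32 * 56 * 56, num_classes)

        def forward(self, x):
            x = self.features(x)  # (N, 32, 56, 56) for 224x224 input
            return self.classifier(x.flatten(1))

    # One training step on dummy grayscale images standing in for scans.
    model = SmallCNN()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    criterion = nn.CrossEntropyLoss()

    images = torch.randn(4, 1, 224, 224)
    labels = torch.randint(0, 2, (4,))
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()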
Requirements:
- 4+ years of relevant experience in solving complex real-world problems at scale via computer vision based deep learning.
- Strong problem-solving ability
- Prior experience with Python, cuDNN, TensorFlow, PyTorch, Keras, Caffe (or similar deep learning frameworks).
- Extensive understanding of computer vision / image processing applications like object classification, segmentation, object detection, etc.
- Ability to write custom Convolutional Neural Network architectures in PyTorch (or similar)
- Experience of GPU/DSP/other Multi-core architecture programming
- Effective communication with other project members and project stakeholders
- Detail-oriented, eager to learn, acquire new skills
- Prior Project Management and Team Leadership experience
- Ability to plan work and meet deadlines
Roles and Responsibilities:
You will be responsible for creating, debugging, maintaining and optimizing the game's engine, editor and related tools used for the game's development.
To ensure success as a unity developer, you should have extensive experience working with Unity and Unity3D software, excellent coding skills, and a good eye for detail.
You will be responsible for maintaining, fixing and improving products on the go.
Offer technical solutions, innovate and improve the quality of implementation, performance and usability of the editor, tools and tool chain as a whole.
Implement new game features and services in close partnership with the project's content team, with respect for the player's comfort and the game's performance.
Write technical design documents, for milestones and internal use.
While applying, please share your public GitHub profile or any projects in your resume where we can review code you have written.
Required:
2+ years of hands-on experience with Unity and at least 2 published games.
C# 7 / .NET 4.7.
IL2CPP for Android & iOS.
Ideal:
Json.NET: writing converters and contract resolvers.
Objective-C: specifically for writing native iOS plugins.
Java: specifically for writing native Android plugins.
Make our tiny eyes grow with delight.
PlayFab: .NET API.
Facebook Unity SDK.
Amplitude: game analytics.
Cloud infra: AWS and Google Cloud.
Notice Period: immediate to 30 days.


- You'd have to set up your own shop, work with design customers to find generalizable use cases, and build them out.
- Ability to collaborate with cross-functional teams to build and ship new features
- 2-5 years of experience
- Predictive Analytics – Machine Learning algorithms, Logistic & Linear Regression, Decision Trees, Clustering.
- Exploratory Data Analysis – Data Preparation, Data Exploration, and Data Visualization.
- Analytics Tools – R, Python, SQL, Power BI, MS Excel.

