
Job Title: AI/ML Intern
Location: Hyderabad (On-site)
Internship Duration: 1 year
Experience Required: Basic knowledge with some hands-on project experience (0–1 year preferred)
Terms and Conditions:
- Stipend: ₹10,000 per month (subject to revision based on performance)
- Bond: 1-year bond with SSC certificate submission
Role Overview:
We are looking for a passionate AI/ML Intern to join our growing team at WINIT. As an intern, you will work on real-world AI/ML projects that support the development of an intelligent Sales Supervisor Agent system. This is an exciting opportunity to apply your machine learning skills in sales performance analytics, pattern recognition, and recommendation systems.
Key Responsibilities:
1. AI/ML, Computer Vision & Gen AI Development Support:
- Assist in designing machine learning models for detecting sales trends and behavior patterns
- Support the development of recommendation engines for sales strategy optimization
- Work on computer vision tasks such as product identification, object detection, and image analysis using OpenCV or deep learning models (see the sketch after this list)
- Contribute to Generative AI use cases like content generation, sales pitch automation, data summarization, and LLM-based applications
- Help build and refine real-time dashboards for sales performance tracking
- Support integration of APIs for model deployment and system communication
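
To give a flavor of the computer vision work mentioned above, here is a minimal sketch of product identification using OpenCV template matching. The image paths and the 0.8 match threshold are illustrative assumptions; a production system would more likely rely on a trained detection model.

# Minimal product-identification sketch using OpenCV template matching.
# File paths and the 0.8 threshold are illustrative assumptions.
import cv2

def find_product(shelf_path: str, product_path: str, threshold: float = 0.8):
    """Return (found, score, top_left) for a product template in a shelf image."""
    shelf = cv2.imread(shelf_path, cv2.IMREAD_GRAYSCALE)
    product = cv2.imread(product_path, cv2.IMREAD_GRAYSCALE)
    if shelf is None or product is None:
        raise FileNotFoundError("Could not read one of the input images")
    # Slide the template over the shelf image and score every position.
    result = cv2.matchTemplate(shelf, product, cv2.TM_CCOEFF_NORMED)
    _, max_score, _, max_loc = cv2.minMaxLoc(result)
    return max_score >= threshold, float(max_score), max_loc

if __name__ == "__main__":
    found, score, loc = find_product("shelf.jpg", "product_template.jpg")
    print(f"found={found} score={score:.2f} top_left={loc}")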
2. Programming and Tools:
- Write clean Python code for ML model development
- Gain hands-on exposure to frameworks like PyTorch or TensorFlow
- Support backend development tasks using FastAPI or Flask (a minimal sketch follows this list)
- Collaborate using Git for version control
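
As a rough illustration of what the backend support could involve, the sketch below wraps a placeholder scoring function in a FastAPI endpoint. The route name, request fields, and dummy model are assumptions for illustration, not the actual WINIT stack.

# Minimal FastAPI sketch for serving a prediction endpoint.
# The /predict route, request fields, and dummy scoring logic are assumptions.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="sales-scoring-sketch")

class SalesFeatures(BaseModel):
    monthly_visits: int
    avg_order_value: float
    region: str

def dummy_score(features: SalesFeatures) -> float:
    # Placeholder for a trained model (in practice, e.g. loaded with joblib).
    return min(1.0, 0.01 * features.monthly_visits + 0.001 * features.avg_order_value)

@app.post("/predict")
def predict(features: SalesFeatures) -> dict:
    return {"score": dummy_score(features)}

# Run locally with uvicorn (assumption: it is installed): uvicorn main:app --reload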
3. Foundational AI/ML Understanding:
- Learn and explore concepts of LLMs, Transformers, and RAG (a toy retrieval sketch follows this list)
- Apply basic NLP techniques in project use cases
- Understand the ML workflow, from data preprocessing to deployment
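
To make the RAG idea concrete, the toy sketch below uses TF-IDF retrieval from scikit-learn in place of a real vector store and LLM. The documents and the query are made up for illustration; a real pipeline would pass the retrieved context to a language model.

# Toy illustration of the retrieval step behind RAG, using TF-IDF instead of
# a vector database or LLM. Documents and query are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Outlet A missed its monthly sales target by 12 percent.",
    "Outlet B exceeded its target after a new promotion.",
    "Route 7 has the highest average order value this quarter.",
]

def retrieve(query: str, k: int = 1) -> list:
    vectorizer = TfidfVectorizer()
    doc_vectors = vectorizer.fit_transform(documents)
    query_vector = vectorizer.transform([query])
    scores = cosine_similarity(query_vector, doc_vectors)[0]
    top = scores.argsort()[::-1][:k]
    return [documents[i] for i in top]

query = "Which outlet is underperforming?"
context = retrieve(query)
# A real system would feed this context into an LLM prompt; here we just print it.
print(f"Question: {query}\nContext: {context[0]}")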
4. Data Analysis & Handling:
- Work with SQL and Python libraries like Pandas/NumPy for data analysis (see the sketch after this list)
- Support data cleaning, transformation, and visualization
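
A small sketch of the kind of cleaning and aggregation this step involves is shown below, assuming a hypothetical sales.csv with order_date, region, and amount columns.

# Hypothetical sales-data cleaning and aggregation with Pandas.
# The file name and column names are assumptions for illustration.
import pandas as pd

df = pd.read_csv("sales.csv", parse_dates=["order_date"])

# Basic cleaning: drop exact duplicates and rows missing the amount.
df = df.drop_duplicates()
df = df.dropna(subset=["amount"])
df["region"] = df["region"].str.strip().str.title()

# Monthly sales per region, ready for a dashboard or further modelling.
monthly = (
    df.groupby([df["order_date"].dt.to_period("M"), "region"])["amount"]
    .sum()
    .reset_index(name="total_sales")
)
print(monthly.head())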
Preferred (Good to Have):
- Mini-projects or coursework in ML, computer vision, or AI
- Exposure to time series forecasting
- Awareness of vector databases (Faiss, Pinecone)
- Experience using RESTful APIs
- Familiarity with cloud platforms (AWS, Azure)
- Interest in developer tools like Cursor_AI, Codegiant, Vercel
Eligibility:
- Pursuing or recently completed B.Tech/B.E. in Computer Science, Data Science, or a related field
- Academic grounding in AI/ML fundamentals
What We’re Looking For:
- Curiosity and willingness to learn
- Problem-solving mindset
- Strong communication and collaboration skills
About WINIT:
WINIT is a pioneer in mobile Sales Force Automation (mSFA) with over 25 years of experience. We serve more than 600 global enterprises, helping them enhance efficiency, streamline logistics, and leverage AI/ML to optimize sales operations. With a commitment to innovation and global support, WINIT continues to lead digital transformation in sales.

Similar jobs
JD for SharePoint and Power Apps Developer
Job Summary: We are looking for an experienced SharePoint and Power Apps Developer to join our team. The successful candidate will be responsible for designing, developing, testing, and deploying custom solutions on SharePoint and Power Apps platforms. The ideal candidate should have a strong background in SharePoint and Power Apps development, as well as experience with Microsoft Power Platform technologies.
Responsibilities:
• Design, develop, test, and deploy custom solutions on SharePoint and Power Apps platforms
• Develop and implement custom workflows and forms using Microsoft Power Platform technologies
• Develop and maintain SharePoint and Power Apps solutions using best practices
• Create and maintain technical documentation and user manuals
• Work closely with stakeholders to identify requirements and develop solutions that meet business needs
• Collaborate with other developers and team members to ensure timely delivery of high-quality solutions
• Stay up to date with the latest SharePoint and Power Apps development trends and technologies
Requirements:
• Bachelor’s degree in Computer Science, Information Systems, or related field
• Minimum of 5 years of experience in SharePoint and Power Apps development
• Strong experience with Microsoft Power Platform technologies such as Power Automate, Power BI, and Power Virtual Agents
• Strong experience with SharePoint development, including SharePoint Framework (SPFx), SharePoint Designer, and SharePoint Online
• Experience with SharePoint migration and administration is a plus
• Knowledge of HTML, CSS, JavaScript, and TypeScript
• Strong analytical and problem-solving skills
• Ability to work independently and as part of a team
• Excellent communication and interpersonal skills
• Ability to multitask and manage multiple priorities
Job Type: Full-time
Experience: 5+ years (minimum)
Location: Hyderabad (WFO)


Requirements:
1. 3+ years of strong programming experience with .NET Framework and .NET Core based software applications (.NET, .NET Core, C#, ASP.NET, JavaScript frameworks, Web API, MS SQL Server, Cosmos DB, Docker, Azure Cloud).
2. Strong knowledge of microservices-based architecture, design patterns, and principles (optional); hands-on experience with React JS/Redux (optional).
3. Prior experience in handling applications with large volumes of data.
4. 3+ years of experience designing high-performance enterprise software applications.
5. Works with the application development team to solve technical challenges using industry best practices.
6. Addresses and resolves complex technical issues with internal/external customers.
7. Designs software systems with various Microsoft technologies and ensures compliance with all architecture requirements.
8. Reviews infrastructure for any issues and recommends solutions.
9. Must be a strong communicator and keep the management team updated with weekly/monthly status.
10. Analyzes enterprise system performance. Develops and implements system performance improvements.
11. Participates in development activities including code reviews, as well as coding and testing of new enhancements.

- Guiding team members in handling technical challenges
- Conducting training sessions
- Handling user issues and providing corrective solutions
- Fixing vulnerabilities and upgrading to new stable versions
- Sustenance and maintenance of the Archer tool
- Good scripting knowledge
Desired Candidate Profiles:
- Certified RSA Archer Professional, Internal Audit & Controls, Risk (Threat), ISO 27001
- Minimum of 5 years’ experience in the respective field
- Experience of managing a GRC Team
- Strong experience in the implementation, commissioning, and enhancement of GRC product modules
- Strong understanding of process workflows and identification of manual workflows
- Expertise in configuring GRC tool (Archer)
- Experience with all SDLC activities related to GRC program implementation
Title: Data Engineer (Azure) (Location: Gurgaon/Hyderabad)
Salary: Competitive as per Industry Standard
We are expanding our Data Engineering Team and hiring passionate professionals with extensive knowledge and experience in building and managing large enterprise data and analytics platforms. We are looking for creative individuals with strong programming skills who can understand complex business and architectural problems and develop solutions. The individual will work closely with the rest of our data engineering and data science team in implementing and managing scalable Smart Data Lakes, Data Ingestion Platforms, Machine Learning and NLP based Analytics Platforms, Hyper-Scale Processing Clusters, Data Mining and Search Engines.
What You’ll Need:
- 3+ years of industry experience in creating and managing end-to-end data solutions, optimal data processing pipelines, and architecture dealing with large-volume, big data sets of varied data types.
- Proficiency in Python, Linux and shell scripting.
- Strong knowledge of working with PySpark and Pandas dataframes for writing efficient pre-processing and other data manipulation tasks (see the sketch after this list).
- Strong experience in developing the infrastructure required for data ingestion and the optimal extraction, transformation, and loading of data from a wide variety of data sources, using tools like Azure Data Factory and Azure Databricks (or Jupyter notebooks/Google Colab, or other similar tools).
- Working knowledge of GitHub or other version control tools.
- Experience with creating RESTful web services and API platforms.
- Work with data science and infrastructure team members to implement practical machine learning solutions and pipelines in production.
- Experience with cloud providers like Azure/AWS/GCP.
- Experience with SQL and NoSQL databases: MySQL, Azure Cosmos DB, HBase, MongoDB, Elasticsearch, etc.
- Experience with stream-processing systems such as Spark Streaming and Kafka, and working experience with event-driven architectures.
- Strong analytic skills related to working with unstructured datasets.
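
To give a flavor of the PySpark work referenced above, here is a minimal pre-processing sketch; the input path, column names, and output location are assumptions, not a description of any existing pipeline.

# Minimal PySpark pre-processing sketch; paths and columns are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-preprocessing-sketch").getOrCreate()

raw = spark.read.csv("/data/raw/sales/*.csv", header=True, inferSchema=True)

clean = (
    raw.dropDuplicates()
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    .filter(F.col("amount") > 0)
)

daily = clean.groupBy("order_date", "region").agg(F.sum("amount").alias("total_sales"))

# Write the aggregate back to the lake in a columnar format.
daily.write.mode("overwrite").parquet("/data/curated/daily_sales")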
Good to have (to filter or prioritize candidates)
- Experience with testing libraries such as pytest for writing unit tests for the developed code (a toy example follows this list).
- Knowledge of Machine Learning algorithms and libraries would be good to have; implementation experience would be an added advantage.
- Knowledge and experience of data lakes, Docker, and Kubernetes would be good to have.
- Knowledge of Azure Functions, Elasticsearch, etc. will be good to have.
- Having experience with model versioning (MLflow) and data versioning will be beneficial.
- Having experience with microservices libraries or with Python libraries such as Flask for hosting ML services and models would be great.
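
For the pytest point above, a toy example is sketched below; clean_amounts is a hypothetical pre-processing helper defined inline for illustration, and a real test module would import it from the project code.

# Toy pytest example for a hypothetical pre-processing helper.
# clean_amounts is made up here; a real test would import it from the codebase.
import pandas as pd

def clean_amounts(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows with missing or non-positive amounts."""
    return df.dropna(subset=["amount"]).query("amount > 0").reset_index(drop=True)

def test_clean_amounts_drops_bad_rows():
    df = pd.DataFrame({"amount": [100.0, None, -5.0, 20.0]})
    cleaned = clean_amounts(df)
    assert list(cleaned["amount"]) == [100.0, 20.0]

def test_clean_amounts_returns_empty_when_nothing_valid():
    df = pd.DataFrame({"amount": [float("nan"), -1.0]})
    assert clean_amounts(df).empty

# Run with: pytest test_cleaning.py (assumption: file saved under that name)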

Benefits
Support for continuous learning
Competitive salary
Quarterly webinars and annual conferences

