Job Description:
Machine Learning / AI Engineer (with 3+ years of experience)
We are seeking a highly skilled and passionate Machine Learning / AI Engineer to join our newly established data science practice area. In this role, you will primarily focus on working with Large Language Models (LLMs) and contribute to building generative AI applications. This position offers an exciting opportunity to shape the future of AI technology while charting an interesting career path within our organization.
Responsibilities:
1. Develop and implement machine learning models: Utilize your expertise in machine learning and artificial intelligence to design, develop, and deploy cutting-edge models, with a particular emphasis on Large Language Models (LLMs). Apply your knowledge to solve complex problems and optimize performance.
2. Building generative AI applications: Collaborate with cross-functional teams to conceptualize, design, and build innovative generative AI applications. Work on projects that push the boundaries of AI technology and deliver impactful solutions to real-world problems.
3. Data preprocessing and analysis: Collect, clean, and preprocess large volumes of data for training and evaluation purposes. Conduct exploratory data analysis to gain insights and identify patterns that can enhance the performance of AI models.
4. Model training and evaluation: Develop robust training pipelines for machine learning models, incorporating best practices in model selection, feature engineering, and hyperparameter tuning. Evaluate model performance using appropriate metrics and iterate on the models to improve accuracy and efficiency.
5. Research and stay up to date: Keep abreast of the latest advancements in machine learning, natural language processing, and generative AI. Stay informed about industry trends, emerging techniques, and open-source libraries, and apply relevant findings to enhance the team's capabilities.
6. Collaborate and communicate effectively: Work closely with a multidisciplinary team of data scientists, software engineers, and domain experts to drive AI initiatives. Clearly communicate complex technical concepts and findings to both technical and non-technical stakeholders.
7. Experimentation and prototyping: Explore novel ideas, experiment with new algorithms, and prototype innovative solutions. Foster a culture of innovation and contribute to the continuous improvement of AI methodologies and practices within the organization.
Requirements:
1. Education: Bachelor's or Master's degree in Computer Science, Data Science, or a related field. Relevant certifications in machine learning, deep learning, or AI are a plus.
2. Experience: A minimum of 3 years of professional experience as a Machine Learning / AI Engineer, with a proven track record of developing and deploying machine learning models in real-world applications.
3. Strong programming skills: Proficiency in Python and experience with machine learning frameworks (e.g., TensorFlow, PyTorch) and libraries (e.g., scikit-learn, pandas). Experience with cloud platforms (e.g., AWS, Azure, GCP) for model deployment is preferred.
4. Deep-learning expertise: Strong understanding of deep learning architectures (e.g., convolutional neural networks, recurrent neural networks, transformers) and familiarity with Large Language Models (LLMs) such as GPT-3, GPT-4, or equivalent.
5. Natural Language Processing (NLP) knowledge: Familiarity with NLP techniques, including tokenization, word embeddings, named entity recognition, sentiment analysis, text classification, and language generation.
6. Data manipulation and preprocessing skills: Proficiency in data manipulation using SQL and experience with data preprocessing techniques (e.g., cleaning, normalization, feature engineering). Familiarity with big data tools (e.g., Spark) is a plus.
7. Problem-solving and analytical thinking: Strong analytical and problem-solving abilities, with a keen eye for detail. Demonstrated experience in translating complex business requirements into practical machine learning solutions.
8. Communication and collaboration: Excellent verbal and written communication skills, with the ability to explain complex technical concepts to diverse stakeholders.
About ZeMoSo Technologies
We are a product-market-fit studio founded and run by veterans of successful corporate innovation. We take products from napkin sketch to product-market fit, and our staff are some of the smartest and best engineers, designers, and marketers around.
We provide end-to-end product design and development services for the most disruptive and innovative products around the world. We exponentially increase the odds of success for new products by applying lean methodologies and design thinking to the entire process: from napkin to product-market fit to scale. Zemoso Labs has been ranked as one of India’s fastest-growing companies by Deloitte, for two years in a row.
We bring a Silicon Valley-style operating model to startups across the US and Europe.
Our startup customers have raised over $1.2 billion and created ~$8 billion in value after working with us.
We were featured among the Deloitte Technology Fast 50 India tech companies thrice (2016, 2018, and 2019). We were also featured in the Deloitte Technology Fast 500 Asia Pacific in both 2016 and 2018. Our engineering studio has won O'Reilly's Architectural Katas event as well (Spring 2022).
What does that mean for our people?
Our clients are building products that are changing the course of their industries. So, staying on the cutting-edge is non-negotiable and essential to our success. That means you will learn more here than anywhere else.
Responsibilities
- Research, develop, and maintain machine learning and statistical models for business requirements
- Work across the spectrum of statistical modelling, including supervised, unsupervised, and deep learning techniques, to apply the right level of solution to the right problem
- Coordinate with different functional teams to monitor outcomes and refine/improve the machine learning models
- Implement models to uncover patterns and predictions, creating business value and innovation
- Identify unexplored data opportunities for the business to unlock and maximize the potential of digital data within the organization
- Develop NLP concepts and algorithms to classify and summarize structured/unstructured text data
Qualifications
- 3+ years of experience solving complex business problems using machine learning.
- Fluency in Python is a must, along with experience in NLP models such as BERT.
- Strong analytical and critical thinking skills.
- Experience in building production-quality models using state-of-the-art technologies.
- Familiarity with databases is desirable.
- Ability to collaborate on projects and work independently when required.
- Previous experience in the fintech/payments domain is a bonus.
- Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, or another quantitative field from a top-tier institute.
- Strong industrial experience in NLP for 2+ years. Experienced in applying different NLP techniques to problems such as text classification, text summarization, question answering, information retrieval, knowledge extraction, and conversational bot design, with both traditional and deep learning techniques. In-depth exposure to tools/techniques such as spaCy, NLTK, Gensim, CoreNLP, and NLU/NLG tools. Ability to design and develop a practical analytical approach that accounts for data quality and availability, feasibility, scalability, and turnaround time. Demonstrated capability to integrate NLP technologies to improve chatbot experience is desirable; exposure to frameworks such as Dialogflow, Rasa NLU, and LUIS is preferred. Contributions to open-source software projects are an added advantage.
- Experience in analyzing large amounts of user-generated content and processing data in large-scale environments using cloud infrastructure is desirable.
Company Name: Curl Tech
Location: Bangalore
Website: www.curl.tech
Company Profile: Curl Tech is a deep-tech firm based in Bengaluru, India. Curl develops products and solutions leveraging emerging technologies such as Machine Learning, Blockchain (DLT), and IoT, in domains such as Commodity Trading, Banking & Financial Services, Healthcare, Logistics, and Retail.
Curl was founded by technology enthusiasts with rich industry experience. Products and solutions developed at Curl have gone on to considerable success and have in turn become separate companies (focused on that product/solution).
If you are looking for a job that will challenge you, and you want to work with an organization that disrupts entire value chains, Curl is the right place for you!
Designation: Data Scientist or Junior Data Scientist (according to experience)
Job Description:
Strong in machine learning and deep learning, with good programming and mathematics skills.
Details: The candidate will work on many image analytics/numerical data analytics projects. The work involves data collection, building machine learning models, deployment, client interaction, and publishing academic papers.
Responsibilities:
- The candidate will work on many image analytics/numerical data projects.
- The candidate will build various machine learning models depending on the requirements.
- The candidate will be responsible for deployment of the machine learning models.
- The candidate will be the face of the company in front of clients and will have regular client interactions to understand the client's requirements.
What we are looking for in candidates:
- Basic understanding of statistics, time series, machine learning, deep learning, and their fundamentals and mathematical underpinnings.
- Proven code proficiency in Python, C/C++, or any other AI language of choice.
- Strong algorithmic thinking, creative problem solving, and the ability to take ownership and do independent research.
- Understanding how things work internally in ML and DL models is a must.
- Understanding of the fundamentals of computer vision and image processing techniques would be a plus.
- Expertise in OpenCV and in ML/neural network technologies and frameworks such as PyTorch and TensorFlow would be a plus.
- Educational background in any quantitative field (Computer Science / Mathematics / Computational Sciences and related disciplines) will be given preference.
Education: BE/BTech/B.Sc. (Physics or Mathematics), or a Master's in Mathematics, Physics, or related branches.
About Us
We are an AI-Powered CX Cloud that enables enterprises to transform customer experience and boost revenue with our APIs by automating and analyzing customer interactions at scale. We assist across multiple voice and non-voice channels in 30+ languages, whilst coaching and training agents with minimal costs.
The problem we are solving
Compared to worldwide norms, customer support in traditional contact centers is quite poor. Due to a high volume of queries, insufficient agent capacity, and inadequate customer support systems, businesses struggle with a multi-fold rise in customer discontent and bounce rates, resulting in failure points between them and their customers. To address this issue, IITian couple Manish and Rashi Gupta founded Rezo's AI-Powered CX Cloud for Enterprises in 2018 to help businesses avoid customer churn and boost revenue, without incurring heavy costs, by providing 24x7 real-time responses to customer inquiries with minimal human interaction.
Roles and Responsibilities :
- Speech Recognition model development across multiple languages.
- Solve critical real-world scenarios: noisy-channel ASR performance, multi-speaker detection, etc.
- Implement and deliver PoC/UAT products on the Rezo platform.
- Responsible for product performance, robustness and reliability.
Requirements:
- 2+ years of experience, with a Bachelor's/Master's degree focused on CS, Machine Learning, and Signal Processing.
- Strong knowledge of various ML concepts/algorithms and hands-on experience in relevant projects.
- Experience with machine learning platforms such as TensorFlow and PyTorch, and solid programming skills (Python, C, C++, etc.).
- Ability to learn new tools, languages, and frameworks quickly.
- Familiarity with databases and data transformation techniques, and the ability to work with unstructured data such as OCR/speech/text data.
- Previous experience working in conversational AI is a plus.
- A Git portfolio will be helpful.
Life at Rezo.AI
- We take transparency very seriously. Along with a full view of team goals, get a top-level view across the board with our regular town hall meetings.
- A highly inclusive work culture that promotes a relaxed, creative, and productive environment.
- Practice autonomy, open communication, and growth opportunities, while maintaining a perfect work-life balance.
- Go on company-sponsored offsites, and blow off steam with your work buddies.
Perks & Benefits
Learning is a way of life. Unlock your full potential, backed by cutting-edge tools and mentorship.
Get best-in-class medical insurance, programs for taking care of your mental health, and a contemporary leave policy (beyond sick leaves).
Why Us?
We are a fast-paced start-up with some of the best talent from diverse backgrounds, working together to solve customer service problems. We believe a diverse workforce is a powerful multiplier of innovation and growth, which is key to providing our clients with the best possible service and our employees with the best possible career. Diversity makes us smarter, more competitive, and more innovative.
Explore more here
www.rezo.ai
Zycus is looking for applicants with a strong background in Analytics and Data mining (Web, Social and Big data), Machine Learning, Pattern Recognition, Natural Language Processing, Computational Linguistics, Statistical Modelling, Inferencing, Information Retrieval, Large Scale Distributed Systems, Cloud Computing, Econometrics, Quantitative Marketing, Applied Game Theory, Mechanism Design, Operations Research, Optimization, Human Computer Interaction and Information Visualization. Applicants with a background in other quantitative areas are also encouraged to apply.
We are looking for someone who can create and implement AI solutions. If you have built a product like IBM Watson in the past, and not just used Watson to build applications, this could be the perfect role for you.
All successful candidates are expected to dive deep into problem areas of Zycus’ interest and invent technology solutions to not only advance the current products, but also to generate new product options that can strategically advantage the organization.
Roles & Responsibilities:
- Act as a technical thought leader in collaboration with the analytics leadership team, helping to set the strategy and standards for Machine Learning and advanced analytics
- Work with senior leaders from all functions to explore opportunities for using advanced analytics
- Provide technical leadership, coaching, and mentoring to talented data scientists and analytics professionals
- Guide data scientists in the use of advanced statistical, machine learning, and artificial intelligence methodologies
- Guide the work of other Machine learning team members to provide support and assistance, while also ensuring quality
Job Description
▪ You are responsible for setting up, operating, and monitoring LS system solutions on premise and in the cloud
▪ You are responsible for the analysis and long-term elimination of system errors
▪ You provide support in the area of information and IT security
▪ You will work on the strategic further development and optimize the platform used
▪ You will work in a global, international team
Requirement profile:
▪ You have successfully completed an apprenticeship / degree in the field of IT
▪ You can demonstrate in-depth knowledge and experience in the following areas:
▪ PostgreSQL databases
▪ Linux (e.g. Ubuntu, Oracle Linux, RHEL)
▪ Windows (e.g. Windows Server 2019/2022)
▪ Automation / IaC (e.g. Ansible, Terraform)
▪ Containerization with Kubernetes / virtualization with VMware is an advantage
▪ Service APIs (AWS, Azure)
▪ You have very good knowledge of English, knowledge of German is an advantage
▪ You are a born team player, show high commitment, and are resilient
The Merck Data Engineering Team is responsible for designing, developing, testing, and supporting automated end-to-end data pipelines and applications on Merck’s data management and global analytics platform (Palantir Foundry, Hadoop, AWS and other components).
The Foundry platform comprises multiple different technology stacks, which are hosted on Amazon Web Services (AWS) infrastructure or on-premises in Merck's own data centers. Developing pipelines and applications on Foundry requires:
• Proficiency in SQL / Java / Python (Python required; all 3 not necessary)
• Proficiency in PySpark for distributed computation
• Familiarity with Postgres and ElasticSearch
• Familiarity with HTML, CSS, and JavaScript and basic design/visual competency
• Familiarity with common databases and access layers (e.g. JDBC, MySQL, Microsoft SQL Server); not all types required
This position will be project based and may work across multiple smaller projects or a single large project utilizing an agile project methodology.
Roles & Responsibilities:
• Develop data pipelines by ingesting various data sources – structured and unstructured – into Palantir Foundry
• Participate in the end-to-end project lifecycle, from requirements analysis to go-live and operations of an application
• Act as a business analyst, developing requirements for Foundry pipelines
• Review code developed by other data engineers against platform-specific standards, cross-cutting concerns, coding and configuration standards, and the functional specification of the pipeline
• Document technical work in a professional and transparent way; create high-quality technical documentation
• Work out the best possible balance between technical feasibility and business requirements (the latter can be quite strict)
• Deploy applications on Foundry platform infrastructure with clearly defined checks
• Implement changes and bug fixes via Merck's change management framework and according to system engineering practices (additional training will be provided)
• Set up DevOps projects following Agile principles (e.g. Scrum)
• Besides working on projects, act as third-level support for critical applications; analyze and resolve complex incidents/problems, debugging across the full Foundry stack and code based on Python, PySpark, and Java
• Work closely with business users and data scientists/analysts to design physical data models
Responsibilities:
- Act as a technical resource for the Data Science team and be involved in creating and implementing current and future analytics projects, such as data lake and data warehouse design
- Analyze and design ETL solutions to store/fetch data from multiple systems such as Google Analytics, CleverTap, and CRM systems
- Develop and maintain data pipelines for real-time as well as batch analytics use cases
- Collaborate with data scientists and actively work in the feature engineering and data preparation phases of model building
- Collaborate with product development and DevOps teams in implementing data collection and aggregation solutions
- Ensure quality and consistency of the data in the data warehouse and follow best data governance practices
- Analyse large amounts of information to discover trends and patterns
- Mine and analyse data from company databases to drive optimization and improvement of product development, marketing techniques, and business strategies
Requirements
- Bachelor's or Master's degree in a highly numerate discipline such as Engineering, Science, or Economics
- 2-6 years of proven experience working as a Data Engineer, preferably in an e-commerce/web-based or consumer technology company
- Hands-on experience working with big data tools such as Hadoop, Spark, Flink, Kafka, and so on
- Good understanding of the AWS ecosystem for big data analytics
- Hands-on experience in creating data pipelines, either using tools or by independently writing scripts
- Hands-on experience in scripting languages such as Python, Scala, Unix shell scripting, and so on
- Strong problem-solving skills with an emphasis on product development
- Experience using business intelligence tools (e.g. Tableau, Power BI) would be an added advantage (not mandatory)
Who Are We
A research-oriented company with expertise in computer vision and artificial intelligence, Orbo is, at its core, a comprehensive platform of AI-based visual enhancement technologies. This way, companies can find a suitable product for their needs, where deep-learning-powered technology can automatically improve their imagery.
ORBO's solutions are helping the BFSI, beauty and personal care, and e-commerce image retouching industries with digital transformation in multiple ways.
WHY US
- Join top AI company
- Grow with your best companions
- Continuous pursuit of excellence, equality, respect
- Competitive compensation and benefits
You'll be a part of the core team and will be working directly with the founders in building and iterating upon the core products that make cameras intelligent and images more informative.
To learn more about how we work, please check out
Description:
We are looking for a computer vision engineer to lead our team in developing a factory floor analytics SaaS product. This would be a fast-paced role and the person will get an opportunity to develop an industrial grade solution from concept to deployment.
Responsibilities:
- Research and develop computer vision solutions for industries (BFSI, Beauty and personal care, E-commerce, Defence etc.)
- Lead a team of ML engineers in developing an industrial AI product from scratch
- Setup end-end Deep Learning pipeline for data ingestion, preparation, model training, validation and deployment
- Tune the models to achieve high accuracy rates and minimum latency
- Deploying developed computer vision models on edge devices after optimization to meet customer requirements
Requirements:
- Bachelor’s degree
- Understanding of the depth and breadth of computer vision and deep learning algorithms.
- Experience in taking an AI product from scratch to commercial deployment.
- Experience in image enhancement, object detection, image segmentation, and image classification algorithms.
- Experience in deployment with OpenVINO, ONNX Runtime, and TensorRT.
- Experience in deploying computer vision solutions on edge devices such as Intel Movidius and NVIDIA Jetson.
- Experience with machine/deep learning frameworks such as TensorFlow and PyTorch.
- Proficient understanding of code versioning tools, such as Git.
Our perfect candidate is someone who:
- is proactive and an independent problem solver
- is a constant learner. We are a fast growing start-up. We want you to grow with us!
- is a team player and good communicator
What We Offer:
- You will have fun working with a fast-paced team on a product that can impact the business model of E-commerce and BFSI industries. As the team is small, you will easily be able to see a direct impact of what you build on our customers (Trust us - it is extremely fulfilling!)
- You will be in charge of what you build and be an integral part of the product development process
- Technical and financial growth!