Machine Learning Engineer

at Carsome

Posted by Piyush Palkar
Remote, Kuala Lumpur
2 - 5 yrs
₹20L - ₹30L / yr
Full time
Skills
Python
Amazon Web Services (AWS)
Django
Flask
TensorFlow
Big Data
athena
Carsome is a growing startup that uses data to improve the experience of second-hand car shoppers. This involves developing, deploying and maintaining the machine learning models that improve our customers' experience. We are looking for candidates who understand the machine learning project lifecycle and can help manage ML deployments.

Responsibilities:

  • Write and maintain production-level code in Python for deploying machine learning models
  • Create and maintain deployment pipelines through CI/CD tools (preferably GitLab CI)
  • Implement alerts and monitoring for prediction accuracy and data drift detection
  • Implement automated pipelines for training and replacing models
  • Work closely with the data science team to deploy new models to production

Required Qualifications:

  • Degree in Computer Science, Data Science, IT or a related discipline
  • 2+ years of experience in software engineering or data engineering
  • Programming experience in Python
  • Experience in data profiling, ETL development, testing and implementation
  • Experience in deploying machine learning models
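The data drift detection mentioned in the responsibilities is often implemented as a Population Stability Index (PSI) check comparing live feature values against the training-time baseline. A minimal sketch in plain Python (the bin count and the common rule-of-thumb alert threshold of 0.2 are assumptions, not Carsome's actual pipeline):

```python
import math

def psi(expected, actual, bins=10):
    """Population Stability Index between a baseline and a live sample.

    Values above ~0.2 are commonly treated as significant drift.
    """
    lo = min(min(expected), min(actual))
    hi = max(max(expected), max(actual))
    width = (hi - lo) / bins or 1.0

    def frac(sample, i):
        count = sum(1 for x in sample if lo + i * width <= x < lo + (i + 1) * width)
        if i == bins - 1:  # include the right edge in the last bin
            count += sum(1 for x in sample if x == hi)
        return max(count / len(sample), 1e-6)  # floor to avoid log(0)

    return sum(
        (frac(actual, i) - frac(expected, i)) * math.log(frac(actual, i) / frac(expected, i))
        for i in range(bins)
    )

baseline = [i / 100 for i in range(100)]       # training-time feature values
shifted = [0.5 + i / 200 for i in range(100)]  # live values drifted upward
assert psi(baseline, baseline) < 0.01          # identical samples: no drift
assert psi(baseline, shifted) > 0.2            # shifted sample would trigger an alert
```

In a deployment like the one described, a check of this shape would run on a schedule over recent predictions and raise an alert when the index crosses the threshold.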

Good to have:

  • Experience with AWS resources for ML and data engineering (SageMaker, Glue, Athena, Redshift, S3)
  • Experience in deploying TensorFlow models
  • Experience in deploying and managing MLflow

About Carsome

Carsome is Southeast Asia’s largest integrated car e-commerce platform. With presence across Malaysia, Indonesia, Thailand and Singapore, it aims to digitalize the region’s used car industry by reshaping and elevating the car buying and selling experiences with complete peace-of-mind.
Founded
2015
Type
Product
Size
1000-5000 employees
Stage
Raised funding

Similar jobs

Data Scientist

at EnterpriseMinds

Founded 2017  •  Products & Services  •  100-1000 employees  •  Profitable
Machine Learning (ML)
Natural Language Processing (NLP)
Python
Data Science
PySpark
Bengaluru (Bangalore)
3 - 6 yrs
₹7L - ₹30L / yr

Exp: 3-6 Yrs
Location: Bangalore
Notice: Immediate to 15 days

Responsibilities:

  • Develop advanced algorithms that solve problems of large dimensionality in a computationally efficient and statistically effective manner;
  • Execute statistical and data mining techniques (e.g. hypothesis testing, machine learning and retrieval processes) on large data sets to identify trends, figures and other relevant information;
  • Evaluate emerging datasets and technologies that may contribute to our analytical platform;
  • Participate in development of select assets/accelerators that create scale;
  • Contribute to thought leadership through research and publication support;
  • Guide and mentor Associates on teams.

Qualifications:

 
  • 3-6 years of relevant post-collegiate work experience;
  • Knowledge of big data/advanced analytics concepts and algorithms (e.g. text mining, social listening, recommender systems, predictive modeling, etc.);
  • Should have experience with NLP and PySpark
  • Exposure to tools/platforms (e.g. Hadoop eco system and database systems);
  • Agile project planning and project management skills;
  • Relevant domain knowledge preferred; (healthcare/transportation/hi-tech/insurance);
  • Excellent oral and written communication skills;
  • Strong attention to detail, with a research-focused mindset;
  • Excellent critical thinking and problem solving skills;
  • High motivation, good work ethic and maturity.
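As a minimal illustration of the recommender-system concepts listed in the qualifications, a bag-of-words cosine-similarity ranker can be sketched in pure Python (a toy example with hypothetical documents, not a production retrieval system):

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def recommend(query, documents):
    """Rank documents by similarity to the query text, dropping zero matches."""
    q = Counter(query.lower().split())
    scored = [(cosine(q, Counter(d.lower().split())), d) for d in documents]
    return [d for score, d in sorted(scored, reverse=True) if score > 0]

docs = [
    "used car price prediction model",
    "restaurant recommendation engine",
    "car insurance claims analytics",
]
assert recommend("car price", docs)[0] == "used car price prediction model"
```

Real systems replace the raw counts with TF-IDF or learned embeddings, but the ranking step keeps this same shape.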
Job posted by
Komal Samudrala

Machine Learning Engineer

at E-Commerce Aggregator Platform

Agency job
via Qrata
Machine Learning (ML)
Natural Language Processing (NLP)
Python
Algorithms
Deep Learning
Remote only
4 - 8 yrs
₹14L - ₹22L / yr
  • 6+ years of applied machine learning experience with a focus on natural language processing. Some of our current projects require knowledge of natural language generation.
  • 3+ years of software engineering experience.
  • Advanced knowledge of Python, with 2+ years in a production environment.
  • Experience with practical applications of deep learning.
  • Experience with agile, test-driven development, continuous integration, and automated testing.
  • Experience with productionizing machine learning models and integrating them into web services.
  • Experience with the full software development life cycle, including requirements collection, design, implementation, testing, and operational support.
  • Excellent verbal and written communication, teamwork, decision making and influencing skills.
  • Hustle. Thrives in an evolving, fast-paced, ambiguous work environment.
Job posted by
Blessy Fernandes

Data Engineer

at Consulting and Services company

Agency job
via Jobdost
Amazon Web Services (AWS)
Apache
Python
PySpark
Hyderabad, Ahmedabad
5 - 10 yrs
₹5L - ₹30L / yr

Data Engineer 

  

Mandatory Requirements  

  • Experience in AWS Glue 
  • Experience in Apache Parquet  
  • Proficient in AWS S3 and data lake  
  • Knowledge of Snowflake 
  • Understanding of file-based ingestion best practices. 
  • Scripting language - Python & PySpark 

 

CORE RESPONSIBILITIES 

  • Create and manage cloud resources in AWS  
  • Data ingestion from different data sources that expose data through different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time series data from various proprietary systems. Implement data ingestion and processing with the help of Big Data technologies  
  • Data processing/transformation using technologies such as Spark and cloud services. You will need to understand your part of the business logic and implement it using the language supported by the base data platform  
  • Develop automated data quality checks to make sure the right data enters the platform and to verify the results of the calculations  
  • Develop an infrastructure to collect, transform, combine and publish/distribute customer data. 
  • Define process improvement opportunities to optimize data collection, insights and displays. 
  • Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible  
  • Identify and interpret trends and patterns from complex data sets  
  • Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.  
  • Key participant in regular Scrum ceremonies with the agile teams   
  • Proficient at developing queries, writing reports and presenting findings  
  • Mentor junior members and bring best industry practices  

 

QUALIFICATIONS 

  • 5-7+ years’ experience as data engineer in consumer finance or equivalent industry (consumer loans, collections, servicing, optional product, and insurance sales)  
  • Strong background in math, statistics, computer science, data science or related discipline 
  • Advanced knowledge of one of: Java, Scala, Python, C#  
  • Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake   
  • Proficient with:
  • Data mining/programming tools (e.g. SAS, SQL, R, Python) 
  • Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum) 
  • Data visualization tools (e.g. Tableau, Looker, MicroStrategy) 
  • Comfortable learning about and deploying new technologies and tools.  
  • Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.  
  • Good written and oral communication skills and ability to present results to non-technical audiences  
  • Knowledge of business intelligence and analytical tools, technologies and techniques. 

 

Familiarity and experience in the following is a plus:  

  • AWS certification 
  • Spark Streaming  
  • Kafka Streaming / Kafka Connect  
  • ELK Stack  
  • Cassandra / MongoDB  
  • CI/CD: Jenkins, GitLab, Jira, Confluence and other related tools 
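The automated data quality checks called for in the core responsibilities can be sketched as a small rule engine in plain Python (a minimal illustration; the column names and rules below are hypothetical):

```python
def run_quality_checks(rows, required, ranges):
    """Validate ingested rows against completeness and range rules.

    Returns a list of (row_index, message) failures; empty means the batch passes.
    """
    failures = []
    for i, row in enumerate(rows):
        for col in required:  # completeness: required columns must be populated
            if row.get(col) in (None, ""):
                failures.append((i, f"missing {col}"))
        for col, (lo, hi) in ranges.items():  # sanity: values must fall in range
            value = row.get(col)
            if value is not None and not (lo <= value <= hi):
                failures.append((i, f"{col} out of range"))
    return failures

batch = [
    {"loan_id": "L1", "amount": 5000},
    {"loan_id": "", "amount": 5000},    # missing identifier
    {"loan_id": "L3", "amount": -10},   # negative amount
]
issues = run_quality_checks(batch, required=["loan_id"], ranges={"amount": (0, 1_000_000)})
assert issues == [(1, "missing loan_id"), (2, "amount out of range")]
```

In a pipeline of the kind described above, a failing batch would typically be quarantined rather than loaded, with the failure list routed to monitoring.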
Job posted by
Sathish Kumar

Sr. AI Engineer

at Matellio India Private Limited

Founded 1998  •  Services  •  100-1000 employees  •  Profitable
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Computer Vision
Deep Learning
Python
Linear regression
Linear algebra
Big Data
Spark
API
Artificial Intelligence (AI)
icon
Remote only
icon
8 - 15 yrs
icon
₹10L - ₹27L / yr

Responsibilities include: 

  • Convert the machine learning models into application program interfaces (APIs) so that other applications can use them
  • Build AI models from scratch and help the different components of the organization (such as product managers and stakeholders) understand what results they gain from the model
  • Build data ingestion and data transformation infrastructure
  • Automate infrastructure that the data science team uses
  • Perform statistical analysis and tune the results so that the organization can make better-informed decisions
  • Set up and manage AI development and product infrastructure
  • Be a good team player, as coordinating with others is a must
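Converting a model into an API, the first responsibility above, boils down to wrapping a predict function in a JSON-in/JSON-out handler. A minimal sketch in plain Python (the linear-model weights are hypothetical stand-ins for a trained model):

```python
import json

# Hypothetical coefficients standing in for a trained pricing model.
WEIGHTS = {"age_years": -0.8, "mileage_10k": -0.5}
BIAS = 10.0

def predict(features):
    """Score a feature dict with the linear model: bias + sum(w * x)."""
    return BIAS + sum(w * features[name] for name, w in WEIGHTS.items())

def handle_request(body):
    """JSON-in/JSON-out handler; mountable behind Flask, Django, or any WSGI app."""
    features = json.loads(body)
    return json.dumps({"prediction": predict(features)})

response = handle_request('{"age_years": 5, "mileage_10k": 4}')
assert abs(json.loads(response)["prediction"] - 4.0) < 1e-9
```

A real service would add input validation, model versioning and error handling around this core, but the request/response contract stays the same.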
Job posted by
Harshit Sharma

Lead Game Analyst

at Kwalee

Founded 2011  •  Product  •  100-500 employees  •  Profitable
Data Science
Data Analytics
Python
SQL
Bengaluru (Bangalore)
0 - 8 yrs
Best in industry

Kwalee is one of the world’s leading multiplatform game publishers and developers, with well over 750 million downloads worldwide for mobile hits such as Draw It, Teacher Simulator, Let’s Be Cops 3D, Airport Security and Makeover Studio 3D. Alongside this, we also have a growing PC and Console team of incredible pedigree that is on the hunt for great new titles to join TENS!, Eternal Hope, Die by the Blade and Scathe. 

We have a team of talented people collaborating daily between our studios in Leamington Spa, Bangalore and Beijing, or on a remote basis from Turkey, Brazil, the Philippines and many more places, and we’ve recently acquired our first external studio, TicTales which is based in France. We have a truly global team making games for a global audience. And it’s paying off: Kwalee has been recognised with the Best Large Studio and Best Leadership awards from TIGA (The Independent Game Developers’ Association) and our games have been downloaded in every country on earth!

Founded in 2011 by David Darling CBE, a key architect of the UK games industry who previously co-founded and led Codemasters for many years, our team also includes legends such as Andrew Graham (creator of Micro Machines series) and Jason Falcus (programmer of classics including NBA Jam) alongside a growing and diverse team of global gaming experts. Everyone contributes creatively to Kwalee’s success, with all employees eligible to pitch their own game ideas on Creative Wednesdays, and we’re proud to have built our success on this inclusive principle. Could your idea be the next global hit?

What’s the job?

As the Lead Game Analyst you will own the optimisation of in-game features and design, utilising A/B testing and multivariate testing of in-game components.


What you will be doing

  • Play a crucial role in finding the best people to work on your team.

  • Manage the delivery of reports and analysis from the team.

  • Investigate how millions of players interact with Kwalee games.

  • Perform statistical analysis to quantify the relationships between game elements and player engagement.

  • Design experiments which extract the most valuable information in the shortest time.

  • Develop testing plans which reveal complex interactions between game elements. 

  • Collaborate with the design team to come up with the most effective tests.

  • Regularly communicate results with development, management and data science teams.
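The statistical analysis behind the experiments described above often reduces to comparing conversion or retention rates between two variants. A two-proportion z-test in pure Python (illustrative numbers, not Kwalee data):

```python
import math

def ab_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test; returns (z statistic, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Variant B converts 12% of 5000 players vs 10% of 5000 on A.
z, p = ab_z_test(500, 5000, 600, 5000)
assert z > 3
assert p < 0.01  # the uplift is statistically significant at this sample size
```

Designing experiments that "extract the most valuable information in the shortest time" is largely about choosing sample sizes so that tests like this reach significance quickly for the effect sizes that matter.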


How you will be doing this

  • You’ll be part of an agile, multidisciplinary and creative team and work closely with them to ensure the best results.

  • You'll think creatively and be motivated by challenges and constantly striving for the best.

  • You’ll work with cutting edge technology, if you need software or hardware to get the job done efficiently, you will get it. We even have a robot!


Team

Our talented team is our signature. We have a highly creative atmosphere with more than 200 staff where you’ll have the opportunity to contribute daily to important decisions. You’ll work within an extremely experienced, passionate and diverse team, including David Darling and the creator of the Micro Machines video games.


Skills and Requirement

  • A degree in a numerically focused discipline such as Maths, Physics, Economics, Chemistry, Engineering, or Biological Sciences.

  • An extensive record of outstanding contribution to data analysis projects.

  • Expert in using Python for data analysis and visualisation.

  • Experience manipulating data in SQL databases.

  • Experience managing and onboarding new team members.


We offer

  • We want everyone involved in our games to share our success, that’s why we have a generous team profit sharing scheme from day 1 of employment

  • In addition to a competitive salary we also offer private medical cover and life assurance

  • Creative Wednesdays! (Design and make your own games every Wednesday)

  • 20 days of paid holidays plus bank holidays 

  • Hybrid model available depending on the department and the role

  • Relocation support available 

  • Great work-life balance with flexible working hours

  • Quarterly team building days - work hard, play hard!

  • Monthly employee awards

  • Free snacks, fruit and drinks


Our philosophy

We firmly believe in creativity and innovation and that a fundamental requirement for a successful and happy company is having the right mix of individuals. With the right people in the right environment anything and everything is possible.

Kwalee makes games to bring people, their stories, and their interests together. As an employer, we’re dedicated to making sure that everyone can thrive within our team by welcoming and supporting people of all ages, races, colours, beliefs, sexual orientations, genders and circumstances. With the inclusion of diverse voices in our teams, we bring plenty to the table that’s fresh, fun and exciting; it makes for a better environment and helps us to create better games for everyone! This is how we move forward as a company – because these voices are the difference that make all the difference.

Job posted by
Michael Hoppitt

Software Architect/CTO

at Blenheim Chalcot IT Services India Pvt Ltd

SQL Azure
ADF
Azure data factory
Azure Datalake
Azure Databricks
ETL
PowerBI
Apache Synapse
Data Warehouse (DWH)
API
SFTP
JSON
Java
Python
C#
Javascript
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
Mumbai
5 - 8 yrs
₹25L - ₹30L / yr
As a hands-on Data Architect, you will be part of a team responsible for building enterprise-grade Data Warehouse and Analytics solutions that aggregate data across diverse sources and data types, including text, video and audio through to live streams and IoT, in an agile project delivery environment with a focus on DataOps and Data Observability. You will work with Azure SQL Databases, Synapse Analytics, Azure Data Factory, Azure Datalake Gen2, Azure Databricks, Azure Machine Learning, Azure Service Bus, Azure Serverless (LogicApps, FunctionApps), Azure Data Catalogue and Purview, among other tools, gaining opportunities to learn some of the most advanced and innovative techniques in the cloud data space.

You will be building Power BI based analytics solutions to provide actionable insights into customer data, and to measure operational efficiencies and other key business performance metrics.

You will be involved in the development, build, deployment, and testing of customer solutions, with responsibility for the design, implementation and documentation of the technical aspects, including integration to ensure the solution meets customer requirements. You will be working closely with fellow architects, engineers, analysts, team leads and project managers to plan, build and roll out data-driven solutions.
Expertise:

  • Proven expertise in developing data solutions with Azure SQL Server and Azure SQL Data Warehouse (now Synapse Analytics)
  • Demonstrated expertise in data modelling and data warehouse methodologies and best practices.
  • Ability to write efficient data pipelines for ETL using Azure Data Factory or equivalent tools.
  • Integration of data feeds utilising both structured (e.g. XML/JSON) and flat schemas (e.g. CSV, TXT, XLSX) across a wide range of electronic delivery mechanisms (API/SFTP/etc.)
  • Azure DevOps knowledge essential for CI/CD of data ingestion pipelines and integrations.
  • Experience with object-oriented/object function scripting languages such as Python, Java, JavaScript, C#, Scala, etc. is required.
  • Expertise in creating technical and architecture documentation (e.g. HLD/LLD) is a must.
  • Proven ability to rapidly analyse and design solution architecture in client proposals is an added advantage.
  • Expertise with big data tools (Hadoop, Spark, Kafka, NoSQL databases, stream-processing systems) is a plus.
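The structured and flat feed integration noted above amounts to normalising JSON and CSV feeds into one record shape before loading. A sketch using only the Python standard library (the field names are hypothetical):

```python
import csv
import io
import json

def ingest_json(feed):
    """Structured feed: a JSON array of records."""
    return [{"id": r["id"], "amount": float(r["amount"])} for r in json.loads(feed)]

def ingest_csv(feed):
    """Flat feed: CSV with a header row, normalised to the same record shape."""
    reader = csv.DictReader(io.StringIO(feed))
    return [{"id": r["id"], "amount": float(r["amount"])} for r in reader]

json_feed = '[{"id": "a1", "amount": "10.5"}]'
csv_feed = "id,amount\na2,20.0\n"
records = ingest_json(json_feed) + ingest_csv(csv_feed)
assert records == [{"id": "a1", "amount": 10.5}, {"id": "a2", "amount": 20.0}]
```

In an Azure Data Factory pipeline the same normalisation step would typically live in a mapping data flow or a Databricks notebook, with the common record shape defined once for all feeds.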
Essential Experience:

  • 5 or more years of hands-on experience in a data architect role with the development of ingestion, integration, data auditing, reporting, and testing with the Azure SQL tech stack.
  • Full data and analytics project lifecycle experience (including costing and cost management of data solutions) in an Azure PaaS environment is essential.
  • Microsoft Azure and Data certifications, at least fundamentals, are a must.
  • Experience using agile development methodologies, version control systems and repositories is a must.
  • A good, applied understanding of the end-to-end data process development life cycle.
  • A good working knowledge of data warehouse methodology using Azure SQL.
  • A good working knowledge of the Azure platform, its components, and the ability to leverage its resources to implement solutions is a must.
  • Experience working in the public sector or in an organisation servicing the public sector is a must.
  • Ability to work to demanding deadlines, keep momentum and deal with conflicting priorities in an environment undergoing a programme of transformational change.
  • The ability to contribute and adhere to standards, excellent attention to detail, and being strongly driven by quality.
Desirables:

  • Experience with AWS or Google Cloud platforms will be an added advantage.
  • Experience with Azure ML services will be an added advantage.

Personal Attributes:

  • Articulate and clear in communications to mixed audiences - in writing, through presentations and one-to-one.
  • Ability to present highly technical concepts and ideas in business-friendly language.
  • Ability to effectively prioritise and execute tasks in a high-pressure environment.
  • Calm and adaptable in the face of ambiguity and in a fast-paced, quick-changing environment.
  • Extensive experience working in a team-oriented, collaborative environment as well as working independently.
  • Comfortable with the multi-project, multi-tasking consulting Data Architect lifestyle.
  • Excellent interpersonal skills with teams and building trust with clients.
  • Ability to support and work with cross-functional teams in a dynamic environment.
  • A passion for achieving business transformation; the ability to energise and excite those you work with.
  • Initiative; the ability to work flexibly in a team, working comfortably without direct supervision.
Job posted by
VIJAYAKIRON ABBINENI

Computer Vision

at Quidich

Founded 2014  •  Products & Services  •  20-100 employees  •  Bootstrapped
Computer Vision
TensorFlow
C++
slam
EKF
Linear algebra
3D Geometry
Probability
3D rendering
Machine Learning (ML)
Deep Learning
Mumbai
0 - 9 yrs
₹2L - ₹14L / yr

About Quidich


Quidich Innovation Labs pioneers products and customized technology solutions for the Sports Broadcast & Film industry. With a mission to bring machines and machine learning to sports, we use camera technology to develop services using remote controlled systems like drones and buggies that add value to any broadcast or production. Quidich provides services to some of the biggest sports & broadcast clients in India and across the globe. A few recent projects include Indian Premier League, ICC World Cup for Men and Women, Kaun Banega Crorepati, Bigg Boss, Gully Boy & Sanju.

What’s Unique About Quidich?

  • Your work will be consumed by millions of people within months of your joining and will impact consumption patterns of how live sport is viewed across the globe.
  • You work with passionate, talented, and diverse people who inspire and support you to achieve your goals.
  • You work in a culture of trust, care, and compassion.
  • You have the autonomy to shape your role, and drive your own learning and growth. 

Opportunity

  • You will be a part of world class sporting events
  • Your contribution to the software will help shape the final output seen on television
  • You will have an opportunity to work in live broadcast scenarios
  • You will work in a close knit team that is driven by innovation

Role

We are looking for a tech enthusiast who can work with us to help further the development of our Augmented Reality product, Spatio, to keep us ahead of the technology curve. We are one of the few companies in the world currently offering this product for live broadcast. We have a tight product roadmap that needs enthusiastic people to solve problems in the realm of software development and computer vision systems. Qualified candidates will be driven self-starters, robust thinkers, strong collaborators, and adept at operating in a highly dynamic environment. We look for candidates that are passionate about the product and embody our values.




Responsibilities

  • Working with the research team to develop, evaluate and optimize various state of the art algorithms.
  • Deploying high performance, readable, and reliable code on edge devices or any other target environments.
  • Continuously exploring new frameworks and identifying ways to incorporate those in the product.
  • Collaborating with the core team to bring ideas to life and keep pace with the latest research in Computer Vision, Deep Learning etc.

Minimum Qualifications, Skills and Competencies

  • B.E/B.Tech or Masters in Computer Science, Mathematics or relevant experience
  • 3+ years of experience with computer vision algorithms such as SfM/SLAM, optical flow, and visual-inertial odometry
  • Experience in sensor fusion (camera, IMU, LiDAR) and in probabilistic filters - EKF, UKF
  • Proficiency in programming - C++ and algorithms
  • Strong mathematical understanding - linear algebra, 3D geometry, probability.
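The probabilistic filters listed above (EKF, UKF) build on the basic Kalman measurement update, where the Kalman gain blends the prior estimate with a new measurement in proportion to their uncertainties. A one-dimensional sketch of that idea:

```python
def kalman_update(mean, var, measurement, meas_var):
    """One measurement update of a 1-D Kalman filter.

    The gain weights the measurement by how certain it is relative to the prior.
    """
    gain = var / (var + meas_var)
    new_mean = mean + gain * (measurement - mean)
    new_var = (1 - gain) * var
    return new_mean, new_var

# Prior belief: position 0.0 with variance 4.0; sensor reads 2.0 with variance 1.0.
mean, var = kalman_update(0.0, 4.0, 2.0, 1.0)
assert abs(mean - 1.6) < 1e-9  # pulled strongly toward the (more certain) measurement
assert abs(var - 0.8) < 1e-9   # uncertainty shrinks after fusing the measurement
```

The EKF generalises this to multivariate state with nonlinear motion and measurement models linearised at each step, which is the form used in SLAM and visual-inertial odometry.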

Preferred Qualifications, Skills and Competencies

  • Proven experience in optical flow, multi-camera geometry, 3D reconstruction
  • Strong background in Machine Learning and Deep Learning frameworks.

Reporting To: Product Lead 

Joining Date: Immediate (Mumbai)

Job posted by
Parag Sule
Informatica
Big Data
Spark
Hadoop
SQL
Abu Dhabi, Dubai
8 - 15 yrs
₹35L - ₹50L / yr
Skills- Informatica with Big Data Management
 
1. Minimum 6 to 8 years of experience in Informatica BDM development
 
2. Experience working on Spark/SQL
 
3. Develops Informatica mappings/SQL
 
4. Should have experience in Hadoop, Spark, etc.

Work Days-
 
Sunday to Thursday- Day shift
 
(Friday and Saturday would be weekly off.)
Job posted by
Evelyn Charles

SQL Developer

at Datametica Solutions Private Limited

Founded 2013  •  Products & Services  •  100-1000 employees  •  Profitable
SQL
Linux/Unix
Shell Scripting
SQL server
PL/SQL
Data Warehouse (DWH)
Big Data
Hadoop
Pune
2 - 6 yrs
₹3L - ₹15L / yr

Datametica is looking for talented SQL engineers who would get training & the opportunity to work on Cloud and Big Data Analytics.

 

Mandatory Skills:

  • Strong in SQL development
  • Hands-on at least one scripting language - preferably shell scripting
  • Development experience in Data warehouse projects

Opportunities:

  • Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka
  • Would get a chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
  • Will play an active role in setting up the Modern data platform based on Cloud and Big Data
  • Would be part of teams with rich experience in various aspects of distributed systems and computing
Job posted by
Nikita Aher

Business Analyst

at WyngCommerce

Founded 2017  •  Product  •  20-100 employees  •  Raised funding
Data Analytics
Predictive analytics
Business Analysis
Data Science
Python
Bengaluru (Bangalore)
1 - 3 yrs
₹6L - ₹8L / yr
WyngCommerce is building a Global Enterprise AI Platform for top-tier brands and retailers to drive profitability for our clients. Our vision is to develop a self-learning retail backend that enables our clients to become more agile and responsive to demand and supply volatilities. We are looking for a Business Analyst to join our team. As a BA, you will take end-to-end ownership of onboarding new clients, running proofs-of-concept and pilots with them on different AI product applications, and ensuring timely product roll-out and customer success for the clients. You will also be expected to provide significant inputs to the sales, engineering, data science and product teams to help us build for scale. An eye for detail, the ability to process and analyze data quickly, and communicating effectively with different teams (client, sales and engineering) are the qualities we are looking for. There will be opportunities to grow within the same job family (lead a team of analysts) or to move to other areas of the business like customer success, data science or product management.
KEY RESPONSIBILITIES:

  • Understand the client deliverables from the sales team and come back to them with the timelines of solution/product delivery
  • Coordinate with relevant stakeholders within the client team to configure the WyngCommerce platform to their business, and set up processes for regular data sharing
  • Drive relevant anomaly detection analyses and work on data pre-processing to prepare the data for the WyngCommerce analytics engine
  • Drive rigorous testing of results from the WyngCommerce engine and apply manual overrides, wherever required, before pushing the results to the client
  • Evaluate business outcomes of different engagements with the client and prepare the analysis of the business benefits to the clients

KEY REQUIREMENTS:

  • 0-2 years of experience in an analytics role (preferably client-facing, requiring you to interface with multiple stakeholders)
  • Hands-on experience in data analysis (descriptive), visualization and data pre-processing
  • Hands-on experience in Python, especially in data processing and visualization libraries like pandas, numpy, matplotlib, seaborn
  • Good understanding of statistical and predictive modeling concepts (you need not be completely hands-on)
  • Excellent analytical thinking and problem-solving skills
  • Experience in project management and handling client communications
  • Excellent communication (written/verbal) skills, including logically structuring and delivering presentations
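The anomaly detection analyses mentioned in the responsibilities can be illustrated with a simple z-score rule in plain Python (the threshold and sample data are hypothetical; production work would typically use pandas and more robust statistics):

```python
import math

def zscore_anomalies(values, threshold=3.0):
    """Flag points whose z-score against the sample mean exceeds the threshold."""
    mean = sum(values) / len(values)
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    if std == 0:
        return []
    return [v for v in values if abs(v - mean) / std > threshold]

daily_sales = [100, 102, 98, 101, 99, 100, 500]  # one clearly abnormal day
assert zscore_anomalies(daily_sales, threshold=2.0) == [500]
```

A check like this, run per SKU or per store during data pre-processing, catches bad feeds before they reach the analytics engine.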
Job posted by
Ankit Jain