Our Engineering team values productivity, integrity, and pragmatism.
We are looking for a seasoned ML engineer who can understand customer needs alongside technical constraints, conceive product ideas based on the variety and volume of data available to us (in the Big Data ecosystem), and translate them into product features that lead to the success of our customers. The ML engineer will also be expected to design the overall functionality of the product, code ML applications in distributed environments, and see them through to production.
● Degree in Computer Science, Mathematics, Data Science, Statistics or equivalent proven experience.
● Experience as an ML Engineer or Data scientist, ideally from a data or software engineering background.
● Minimum 4 years of hands-on experience developing and deploying (Docker, Kubernetes, etc.) machine learning solutions in production environments.
● Excellent programming skills in Python and Scala and hands-on with Spark.
● Proficiency with at least one machine learning or deep learning framework, such as scikit-learn, Spark ML, TensorFlow/Keras, or PyTorch
● Strong system design and architecture skills, having individually led a project from design to delivery
● Exposure to container orchestration in production environments, DevOps, and CI/CD
● Experience with a JVM based language (Java, Scala, Kotlin) is a plus.
● Experience with time series data analysis and models
● Familiarity with cloud environments (GCP, AWS, Azure).
Kwalee is one of the world’s leading multiplatform game publishers and developers, with well over 750 million downloads worldwide for mobile hits such as Draw It, Teacher Simulator, Let’s Be Cops 3D, Traffic Cop 3D and Makeover Studio 3D. Alongside this, we also have a growing PC and Console team of incredible pedigree that is on the hunt for great new titles to join TENS!, Eternal Hope and Die by the Blade.
With a team of talented people collaborating daily between our studios in Leamington Spa, Bangalore and Beijing, or on a remote basis from Turkey, Brazil, the Philippines and many more places, we have a truly global team making games for a global audience. And it’s paying off: Kwalee games have been downloaded in every country on earth! If you think you’re a good fit for one of our remote vacancies, we want to hear from you wherever you are based.
Founded in 2011 by David Darling CBE, a key architect of the UK games industry who previously co-founded and led Codemasters for many years, our team also includes legends such as Andrew Graham (creator of Micro Machines series) and Jason Falcus (programmer of classics including NBA Jam) alongside a growing and diverse team of global gaming experts. Everyone contributes creatively to Kwalee’s success, with all employees eligible to pitch their own game ideas on Creative Wednesdays, and we’re proud to have built our success on this inclusive principle. Could your idea be the next global hit?
What’s the job?
As a Junior Data Scientist, you will help utilise the masses of data generated by Kwalee players all over the world to solve complex problems using cutting-edge techniques.
What you tell your friends you do
"My models optimise the performance of Kwalee games and advertising every day!"
What you will really be doing
- Building intelligent systems which generate value from the data which our players and marketing activities produce.
- Leveraging statistical modelling and machine learning techniques to perform automated decision making on a large scale.
- Developing complex, multi-faceted and highly valuable data products which fuel the growth of Kwalee and our games.
- Owning and managing data science projects from concept to deployment.
- Collaborating with key stakeholders across the company to develop new products and avenues of research.
How you will be doing this
- You’ll be part of an agile, multidisciplinary and creative team and work closely with them to ensure the best results.
- You'll think creatively, be motivated by challenges, and constantly strive for the best.
- You’ll work with cutting-edge technology; if you need software or hardware to get the job done efficiently, you will get it. We even have a robot!
Our talented team is our signature. We have a highly creative atmosphere with more than 200 staff where you’ll have the opportunity to contribute daily to important decisions. You’ll work within an extremely experienced, passionate and diverse team, including David Darling and the creator of the Micro Machines video games.
Skills and Requirements
- A degree in a numerically focused discipline such as Maths, Physics, Economics, Chemistry, Engineering or Biological Sciences
- A record of outstanding contribution to data science projects.
- Experience using Python for data analysis and visualisation.
- A good understanding of a deep learning framework such as TensorFlow.
- Experience manipulating data in SQL and/or NoSQL databases
- We want everyone involved in our games to share our success; that’s why we have a generous team profit-sharing scheme from day one of employment
- In addition to a competitive salary, we also offer private medical cover and life assurance
- Creative Wednesdays! (Design and make your own games every Wednesday)
- 20 days of paid holidays plus bank holidays
- Hybrid model available depending on the department and the role
- Relocation support available
- Great work-life balance with flexible working hours
- Quarterly team building days - work hard, play hard!
- Monthly employee awards
- Free snacks, fruit and drinks
We firmly believe in creativity and innovation and that a fundamental requirement for a successful and happy company is having the right mix of individuals. With the right people in the right environment anything and everything is possible.
Kwalee makes games to bring people, their stories, and their interests together. As an employer, we’re dedicated to making sure that everyone can thrive within our team by welcoming and supporting people of all ages, races, colours, beliefs, sexual orientations, genders and circumstances. With the inclusion of diverse voices in our teams, we bring plenty to the table that’s fresh, fun and exciting; it makes for a better environment and helps us to create better games for everyone! This is how we move forward as a company – because these voices are the difference that make all the difference.
About Us:
Docsumo is Document AI software that helps enterprises capture data and analyze customer documents. We convert documents such as invoices, ID cards, and bank statements into actionable data. We work with clients such as PayU, Arbor, and Hitachi, and are backed by Sequoia, Barclays, Techstars, and Better Capital.
As a Senior Machine Learning Engineer, you will work directly with the CTO to develop end-to-end API products for the US market in the information extraction domain.
- You will be designing and building systems that help Docsumo process visual data, i.e. PDFs and images of documents.
- You'll work in our Machine Intelligence team, a close-knit group of scientists and engineers who incubate new capabilities from whiteboard sketches all the way to finished apps.
- You will get to learn the ins and outs of building core capabilities & API products that can scale globally.
- Should have hands-on experience applying advanced statistical learning techniques to different types of data.
- Should be able to design, build and work with RESTful Web Services in JSON and XML formats. (Flask preferred)
- Should follow Agile principles and processes including (but not limited to) standup meetings, sprints and retrospectives.
Skills / Requirements:
- Minimum 3 years' experience working in machine learning, text processing, data science, information retrieval, deep learning, natural language processing, text mining, regression, classification, etc.
- Must have a full-time degree in Computer Science or similar (Statistics/Mathematics)
- Working with OpenCV, TensorFlow and Keras
- Working with Python: NumPy, scikit-learn, Matplotlib, Pandas
- Familiarity with Version Control tools such as Git
- Theoretical and practical knowledge of SQL / NoSQL databases with hands-on experience in at least one database system.
- Must be self-motivated, flexible, collaborative, with an eagerness to learn
We are establishing infrastructure for internal and external reporting using Tableau and are looking for someone with experience building visualizations and dashboards in Tableau and using Tableau Server to deliver them to internal and external users.
- Implementation of interactive visualizations using Tableau Desktop
- Integration with Tableau Server and support of production dashboards and embedded reports with it
- Writing and optimization of SQL queries
- Proficient in Python, including the use of the Pandas and NumPy libraries to perform data exploration and analysis
- 3 years of experience working as a Software Engineer / Senior Software Engineer
- Bachelor's in Engineering (Electronics and Communication, Computer Science, or IT)
- Well versed in basic data structures, algorithms, and system design
- Should be capable of working well in a team, with very good communication skills
- Self-motivated, organized, and fun to work with
- Productive and efficient working remotely
- Test driven mindset with a knack for finding issues and problems at earlier stages of development
- Interest in learning and picking up a wide range of cutting edge technologies
- Should be curious and interested in learning some Data science related concepts and domain knowledge
- Work alongside other engineers on the team to elevate technology and consistently apply best practices
- Data Analytics
- Experience in AWS cloud or any cloud technologies
- Experience with big data and streaming technologies such as PySpark and Kafka is a big plus
- Shell scripting
- Preferred tech stack: Python, REST APIs, microservices, Flask/FastAPI, Pandas, NumPy, Linux, shell scripting, Airflow, PySpark
- Strong backend experience, having worked with microservices and REST APIs (Flask, FastAPI) and both relational and non-relational databases
We are looking for a Data Scientist who is excited by the prospect of mining big datasets, deriving actionable insights, building production-ready predictive models, and having a direct impact on the business.
- Hands-on knowledge of Python for building production-ready data products.
- Strong Statistical Analysis and Modeling skills.
- Proven skills in Data Science, Data Mining, Machine Learning, covering the spectrum of supervised as well as unsupervised learning algorithms.
- Ability to grasp and work with new technologies quickly.
- Knowledge of Reinforcement Learning is a plus.
- Some knowledge of big data technologies such as Hadoop and Apache Spark will be an added advantage
- Bachelor’s or Master's degree in STEM.
- 3-7 years of relevant experience in applied Data Science.
- Get a chance to work with some of the most popular big data technologies in the market
- Build algorithms that work on web-scale data
- Participate in every aspect of building a data-based product
- A chance to make a dent in the multi-billion-dollar digital advertising industry
YOU CAN BE A GREAT FIT IF YOU HAVE:
- An analytical approach towards problem-solving
- Experience with data extraction and management
- Experience with R and Python for data analysis and data modeling.
- Ability to set goals and meet deadlines in a fast-paced working environment
- Understanding of E-Commerce and Advertising as a domain.
- Willingness to work for a startup.
- Individual Contributor with a sense of Business Acumen and a hunger to make an impact.
- Participate in the full machine learning lifecycle, from data collection, cleaning, and preprocessing to training models and deploying them to production.
- Discover data sources, get access to them, ingest them, clean them up, and make them “machine learning ready”.
- Work with data scientists to create and refine features from the underlying data and build pipelines to train and deploy models.
- Partner with data scientists to understand and implement machine learning algorithms.
- Support A/B tests, gather data, perform analysis, draw conclusions on the impact of your models.
- Work cross-functionally with product managers, data scientists, and product engineers, and communicate results to peers and leaders.
- Mentor junior team members
Who we have in mind:
- Graduate in Computer Science or related field, or equivalent practical experience.
- 4+ years of experience in software engineering with 2+ years of direct experience in the machine learning field.
- Proficiency with SQL, Python, Spark, and basic libraries such as Scikit-learn, NumPy, Pandas.
- Familiarity with deep learning frameworks such as TensorFlow or Keras
- Experience with Computer Vision (OpenCV) and NLP frameworks (NLTK, spaCy, BERT).
- Basic knowledge of machine learning techniques (e.g. classification, regression, and clustering).
- Understand machine learning principles (training, validation, etc.)
- Strong hands-on knowledge of data query and data processing tools (e.g. SQL)
- Software engineering fundamentals: version control systems (e.g. Git, GitHub) and workflows, and the ability to write production-ready code.
- Experience deploying highly scalable software supporting millions or more users
- Experience building applications on cloud (AWS or Azure)
- Experience working in scrum teams with Agile tools like JIRA
- Strong oral and written communication skills. Ability to explain complex concepts and technical material to non-technical users
About us: Nexopay helps transform digital payments and enables instant financing for parents across schools and colleges worldwide.
- Work with stakeholders throughout the organisation and across entities to identify opportunities for leveraging internal and external data to drive business impact
- Mine and analyze data to improve and optimise performance, capture meaningful insights and turn them into business advantages
- Assess the effectiveness and accuracy of new data sources and data gathering techniques
- Develop custom data models and algorithms to apply to data sets
- Use predictive modeling to predict outcomes and identify key drivers
- Coordinate with different functional teams to implement models and monitor outcomes
- Develop processes and tools to monitor and analyze model performance and data accuracy
- Experience in solving business problem using descriptive analytics, statistical modelling / machine learning
- 2+ years of strong working knowledge of SQL language
- Experience with visualization tools, e.g. Tableau, Power BI
- Working knowledge of handling analytical projects end to end using industry-standard tools (e.g. R, Python)
- Strong presentation and communication skills
- Experience in education sector is a plus
- Fluency in English
- Experience with relational SQL & NoSQL databases including MySQL & MongoDB.
- Familiar with the basic principles of distributed computing and data modeling.
- Experience with distributed data pipeline frameworks like Celery, Apache Airflow, etc.
- Experience with NLP and NER models is a bonus.
- Experience building reusable code and libraries for future use.
- Experience building REST APIs.
Preference for candidates working in tech product companies
WE ARE GRAPHENE
Graphene is an award-winning AI company, developing customized insights and data solutions for corporate clients. With a focus on healthcare, consumer goods and financial services, our proprietary AI platform is disrupting market research with an approach that allows us to get into the mind of customers to a degree unprecedented in traditional market research.
Graphene was founded by corporate leaders from Microsoft and P&G and works closely with the Singapore Government & universities in creating cutting edge technology. We are gaining traction with many Fortune 500 companies globally.
Graphene has a 6-year track record of delivering financially sustainable growth and is one of the few start-ups that is self-funded yet profitable and debt-free.
We already have a strong bench of leaders in place. Now, we are looking to groom more talent for our expansion into the US. Join us and take your growth, and ours, to the next level!
WHAT WILL THE ENGINEER-ML DO?
- Primary Purpose: As part of a highly productive and creative AI (NLP) analytics team, optimize algorithms/models for performance and scalability, engineer & implement machine learning algorithms into services and pipelines to be consumed at web-scale
- Daily Grind: Interface with data scientists, project managers, and the engineering team to achieve sprint goals on the product roadmap, and ensure healthy models, endpoints, and CI/CD.
- Career Progression: Senior ML Engineer, ML Architect
YOU CAN EXPECT TO
- Work in a product-development team capable of independently authoring software products.
- Guide junior programmers, set up the architecture, and follow modular development approaches.
- Design and develop code which is well documented.
- Optimize the application for maximum speed and scalability.
- Adhere to information security and DevOps best practices.
- Research and develop new approaches to problems.
- Design and implement schemas and databases with respect to the AI application
- Cross-pollinate with other teams.
HARD AND SOFT SKILLS
- Problem-solving abilities
- Extremely strong programming background – data structures and algorithms
- Advanced Machine Learning: TensorFlow, Keras
- Python, spaCy, NLTK, Word2Vec, Graph databases, Knowledge-graph, BERT (derived models), Hyperparameter tuning
- Experience with OOP and design patterns
- Exposure to RDBMS/NoSQL
- Test Driven Development Methodology
Good to Have
- Working in cloud-native environments (preferably Azure)
- Enterprise Design Patterns
- Microservices Architecture
- Distributed Systems