Required skills and experience:
• Solid experience working in Big Data ETL environments with Spark and Java/Scala/Python
• Strong experience with AWS cloud technologies (EC2, EMR, S3, Kinesis, etc.)
• Experience building monitoring/alerting frameworks with tools like New Relic, with escalations via Slack, email, dashboard integrations, etc.
• Executive-level communication, prioritization, and team leadership skills
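The monitoring/alerting integration described above can be sketched in a few lines. This is a minimal, hypothetical example of formatting a pipeline-failure alert for a Slack-style incoming webhook; the function name, fields, and job names are invented for illustration, and a real framework would also handle routing, deduplication, and retries.

```python
import json

def build_failure_alert(job_name, error, severity="critical"):
    """Build a Slack-style webhook payload for a failed ETL job."""
    return {
        "text": f"[{severity.upper()}] ETL job '{job_name}' failed: {error}",
        "severity": severity,
        "job": job_name,
    }

# In a real pipeline this payload would be POSTed to a webhook URL.
payload = build_failure_alert("daily_spark_ingest", "S3 read timeout")
print(json.dumps(payload))
```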
We are looking for candidates who have demonstrated both a strong business sense and deep understanding of the quantitative foundations of modelling.
• Excellent analytical and problem-solving skills, including the ability to disaggregate issues, identify root causes and recommend solutions
• Experience with statistical programming software such as SPSS, and comfort working with large data sets
• R, Python, SAS and SQL are preferred but not mandatory
• Excellent time management skills
• Good written and verbal communication skills; understanding of both written and spoken English
• Strong interpersonal skills
• Ability to act autonomously, bringing structure and organization to work
• Creative and action-oriented mindset
• Ability to interact in a fluid, demanding and unstructured environment where priorities evolve constantly, and methodologies are regularly challenged
• Ability to work under pressure and deliver on tight deadlines
Qualifications and Experience:
• Graduate degree in Statistics/Economics/Econometrics/Computer Science/Engineering/Mathematics/MBA (with a strong quantitative background), or
• Strong track record of work experience in the field of business intelligence, market research, and/or advanced analytics
• Knowledge of data collection methods (focus groups, surveys, etc.)
• Knowledge of statistical packages (SPSS, SAS, R, Python, or similar), databases, and MS Office (Excel, PowerPoint, Word)
• Strong analytical and critical thinking skills
• Industry experience in Consumer Experience/Healthcare a plus
Kwalee is one of the world’s leading multiplatform game publishers and developers, with well over 750 million downloads worldwide for mobile hits such as Draw It, Teacher Simulator, Let’s Be Cops 3D, Airport Security and Makeover Studio 3D. Alongside this, we also have a growing PC and Console team of incredible pedigree that is on the hunt for great new titles to join TENS!, Eternal Hope, Die by the Blade and Scathe.
We have a team of talented people collaborating daily between our studios in Leamington Spa, Bangalore and Beijing, or on a remote basis from Turkey, Brazil, the Philippines and many more places, and we’ve recently acquired our first external studio, TicTales which is based in France. We have a truly global team making games for a global audience. And it’s paying off: Kwalee has been recognised with the Best Large Studio and Best Leadership awards from TIGA (The Independent Game Developers’ Association) and our games have been downloaded in every country on earth!
Founded in 2011 by David Darling CBE, a key architect of the UK games industry who previously co-founded and led Codemasters for many years, our team also includes legends such as Andrew Graham (creator of Micro Machines series) and Jason Falcus (programmer of classics including NBA Jam) alongside a growing and diverse team of global gaming experts. Everyone contributes creatively to Kwalee’s success, with all employees eligible to pitch their own game ideas on Creative Wednesdays, and we’re proud to have built our success on this inclusive principle. Could your idea be the next global hit?
What’s the job?
As our Data Scientist (Game / Concept Testing), you’ll be overseeing our critical ‘game discovery’ process, ensuring that we’re using the best data-driven methods for identifying hit hyper-casual games. You’ll be working with multiple teams within the company as well as external network partners to learn how we currently test concepts, then aim to iterate on and improve these methods in step with developments in the market.
A great opportunity for those with a love for growth hacking, data-driven testing and hyper-casual.
What you tell your friends you do
“I figure out how we find our hit hyper-casual titles!”
What you will really be doing
Analysing UA campaigns across a range of social channels and ad networks to assess their ability to identify profitable hyper-casual titles
Devising new methods and techniques for testing games
Running customised and experimental campaigns to test new methods and concepts
Working with the Head of Digital Marketing and Head of Growth to define overall strategy for Game Discovery for both our internal development team and publishing team
Partnering with our Data Science team to invent and prepare in-depth reporting and information sharing for game discovery tests
Presenting developments and findings to a multi-disciplinary group of experts across our internal teams
Communicating with external network partners to better understand their systems
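The campaign analysis described above boils down to a few core metrics. This is a minimal sketch, with invented campaign names, numbers, and field names, of computing cost-per-install (CPI) and day-7 return on ad spend (ROAS) to flag potentially profitable concepts; real analysis would pull these figures from the ad networks' reporting APIs.

```python
# Hypothetical per-campaign figures from a concept test.
campaigns = [
    {"name": "draw_it_test", "spend": 500.0, "installs": 2000, "d7_revenue": 650.0},
    {"name": "concept_x",    "spend": 500.0, "installs": 800,  "d7_revenue": 120.0},
]

def summarise(c):
    """Compute cost-per-install and day-7 ROAS for one campaign."""
    cpi = c["spend"] / c["installs"]
    roas = c["d7_revenue"] / c["spend"]
    return {"name": c["name"], "cpi": round(cpi, 3), "d7_roas": round(roas, 3)}

results = [summarise(c) for c in campaigns]
# Flag concepts whose day-7 revenue already covers spend (threshold is illustrative).
promising = [r["name"] for r in results if r["d7_roas"] >= 1.0]
```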
How you will be doing this
You’ll be part of an agile, multidisciplinary and creative team and work closely with them to ensure the best results.
You'll think creatively, be motivated by challenges, and constantly strive for the best.
You’ll work with cutting-edge technology; if you need software or hardware to get the job done efficiently, you will get it. We even have a robot!
Our talented team is our signature. We have a highly creative atmosphere with more than 200 staff where you’ll have the opportunity to contribute daily to important decisions. You’ll work within an extremely experienced, passionate and diverse team, including David Darling and the creator of the Micro Machines video games.
Skills and Requirements
At least 2 years of experience working within the Mobile UA / Mobile Marketing field, either within an app publisher or an ad network
A degree in a numerically focused discipline such as Maths, Physics, Economics, Chemistry, Engineering, or Biological Sciences
Experience working across multiple teams, with the ability to organise and manage
A record of outstanding contribution to data science projects.
People-person with excellent communication skills
Passion for mobile games
Excellent organisation skills with strong attention to detail
We want everyone involved in our games to share our success; that’s why we have a generous team profit-sharing scheme from day one of employment
In addition to a competitive salary we also offer private medical cover and life assurance
Creative Wednesdays! (Design and make your own games every Wednesday)
20 days of paid holidays plus bank holidays
Hybrid model available depending on the department and the role
Relocation support available
Great work-life balance with flexible working hours
Quarterly team building days - work hard, play hard!
Monthly employee awards
Free snacks, fruit and drinks
We firmly believe in creativity and innovation and that a fundamental requirement for a successful and happy company is having the right mix of individuals. With the right people in the right environment anything and everything is possible.
Kwalee makes games to bring people, their stories, and their interests together. As an employer, we’re dedicated to making sure that everyone can thrive within our team by welcoming and supporting people of all ages, races, colours, beliefs, sexual orientations, genders and circumstances. With the inclusion of diverse voices in our teams, we bring plenty to the table that’s fresh, fun and exciting; it makes for a better environment and helps us to create better games for everyone! This is how we move forward as a company – because these voices are the difference that make all the difference.
Experience with AWS Glue
Experience with Apache Parquet
Proficient in AWS S3 and data lakes
Knowledge of Snowflake
Understanding of file-based ingestion best practices
Scripting in Python and PySpark
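One file-based ingestion best practice mentioned above is idempotent loading: a re-delivered file should be skipped rather than ingested twice. This is a minimal, library-free sketch of checksum-based deduplication; in a real pipeline the ledger would live in S3 object metadata or a Snowflake control table rather than an in-memory set.

```python
import hashlib

def sha256_bytes(data: bytes) -> str:
    """Content fingerprint used to recognise duplicate deliveries."""
    return hashlib.sha256(data).hexdigest()

class IngestionLedger:
    """Tracks checksums of files already loaded (illustrative, in-memory only)."""
    def __init__(self):
        self.seen = set()

    def should_ingest(self, data: bytes) -> bool:
        digest = sha256_bytes(data)
        if digest in self.seen:
            return False  # duplicate delivery: skip it
        self.seen.add(digest)
        return True

ledger = IngestionLedger()
first = ledger.should_ingest(b"id,amount\n1,10\n")
second = ledger.should_ingest(b"id,amount\n1,10\n")  # same file re-delivered
```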
Create and manage cloud resources in AWS
Ingest data from sources that expose it through different technologies, such as RDBMSs, flat files, streams, and time-series data from various proprietary systems; implement data ingestion and processing with the help of Big Data technologies
Process and transform data using technologies such as Spark and cloud services; you will need to understand your part of the business logic and implement it using the language supported by the base data platform
Develop automated data quality checks to make sure the right data enters the platform, and verify the results of calculations
Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
Define process improvement opportunities to optimize data collection, insights and displays.
Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
Identify and interpret trends and patterns from complex data sets
Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
Key participant in regular Scrum ceremonies with the agile teams
Proficient at developing queries, writing reports and presenting findings
Mentor junior members and bring in industry best practices.
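The automated data quality checks described above can be sketched with plain Python. The row shape, required fields, and range rule below are illustrative assumptions; production checks would typically run inside the Spark/Glue job or a dedicated quality framework.

```python
def check_rows(rows, required=("id", "amount"), amount_range=(0, 1_000_000)):
    """Split rows into (valid, errors) using null and range checks."""
    valid, errors = [], []
    lo, hi = amount_range
    for i, row in enumerate(rows):
        missing = [f for f in required if row.get(f) is None]
        if missing:
            errors.append((i, f"missing fields: {missing}"))
            continue
        if not (lo <= row["amount"] <= hi):
            errors.append((i, f"amount out of range: {row['amount']}"))
            continue
        valid.append(row)
    return valid, errors

valid, errors = check_rows([
    {"id": 1, "amount": 250},     # passes both checks
    {"id": 2, "amount": None},    # fails null check
    {"id": 3, "amount": -5},      # fails range check
])
```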
5-7+ years’ experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
Strong background in math, statistics, computer science, data science or related discipline
Advanced knowledge of at least one language: Java, Scala, Python, or C#
Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
Data mining/programming tools (e.g. SAS, SQL, R, Python)
Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
Data visualization (e.g. Tableau, Looker, MicroStrategy)
Comfortable learning about and deploying new technologies and tools.
Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
Good written and oral communication skills and ability to present results to non-technical audiences
Knowledge of business intelligence and analytical tools, technologies and techniques.
Familiarity and experience in the following is a plus:
Kafka Streaming / Kafka Connect
Cassandra / MongoDB
CI/CD: Jenkins, GitLab, Jira, Confluence other related tools
The data science team is responsible for solving business problems with complex data. Data complexity can be characterized in terms of volume, dimensionality, and multiple touchpoints/sources. We understand the data, ask fundamental first-principles questions, and apply our analytical and machine learning skills to solve the problem in the best way possible.
Our ideal candidate
This is a client-facing role, so good communication skills are a must.
The candidate should have the ability to communicate complex models and analysis in a clear and precise manner.
The candidate would be responsible for:
- Comprehending business problems properly: what to predict, how to build the dependent variable (DV), what value the work adds for the client, etc.
- Understanding and analyzing large, complex, multi-dimensional datasets and building features relevant to the business
- Understanding the math behind algorithms and choosing one over another
- Understanding approaches like stacking and ensembling, and applying them correctly to increase accuracy
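The stacking idea mentioned above can be illustrated in miniature: base learners make predictions, and a meta-rule combines them. This is a toy, library-free sketch; real work would use something like scikit-learn's StackingClassifier with out-of-fold predictions, and the hand-written "models" and data here are purely illustrative.

```python
def base_model_a(x):
    """Toy base learner: predicts positive when feature 0 is large."""
    return 1 if x[0] > 0.5 else 0

def base_model_b(x):
    """Toy base learner: predicts positive when feature 1 is large."""
    return 1 if x[1] > 0.5 else 0

def meta_model(preds):
    """Toy meta-learner combining base predictions (here, a simple OR rule)."""
    return 1 if sum(preds) >= 1 else 0

def stacked_predict(x):
    return meta_model([base_model_a(x), base_model_b(x)])

# A point that only one base model flags; the ensemble still catches it.
pred = stacked_predict((0.9, 0.1))
```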
Desired technical requirements
- Proficiency with Python and the ability to write production-ready code
- Experience with PySpark, machine learning, and deep learning
- Big data experience (e.g. familiarity with Spark or Hadoop) is highly preferred
- Familiarity with SQL or other databases
- Understanding the business requirements so as to formulate the problems to solve and restrict the slice of data to be explored.
- Collecting data from various sources.
- Performing cleansing, processing, and validation on the data to be analyzed, in order to ensure its quality.
- Exploring and visualizing data.
- Performing statistical analysis and experiments to derive business insights.
- Clearly communicating the findings from the analysis to turn information into something actionable through reports, dashboards, and/or presentations.
- Experience solving problems in the project’s business domain.
- Experience with data integration from multiple sources
- Proficiency in at least one query language, especially SQL.
- Working experience with NoSQL databases, such as MongoDB and Elasticsearch.
- Working experience with popular statistical and machine learning techniques, such as clustering, linear regression, KNN, decision trees, etc.
- Good scripting skills using Python, R or any other relevant language
- Proficiency in at least one data visualization tool, such as Matplotlib, Plotly, D3.js, ggplot, etc.
- Great communication skills.
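One of the techniques listed above (KNN) is compact enough to sketch with the standard library alone. This is an illustrative implementation with invented toy data, not a production classifier; real work would use an optimized library implementation with proper feature scaling and cross-validated choice of k.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (point, label) pairs; returns the majority label
    among the k training points nearest to query (Euclidean distance)."""
    nearest = sorted(train, key=lambda pl: math.dist(pl[0], query))
    labels = [label for _, label in nearest[:k]]
    return Counter(labels).most_common(1)[0][0]

# Two well-separated toy clusters.
train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
```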
- In this role, you will work closely with business leaders and subject matter experts to design, define, measure, and track suitable metrics for the efficiency and effectiveness of the business.
- Influence product roadmap for website and mobile through high quality digital analytical insights and recommendations.
- Drive customer conversions and increase customer delight via system interventions designed based on data driven insights.
- You will be responsible for modeling complex problems and processes using various analytical tools and methods.
- Effectively translate business requirements into functional specifications that satisfy business needs and drive the necessary system modifications.
- Compile and analyse data related to business issues.
- Develop clear visualizations to convey complicated data in a straightforward fashion
- Excellent analytical abilities: data-driven, with an understanding of key analytical techniques
- Hands-on with web and app analytics tools (GA, AppsFlyer, CleverTap, etc.), Excel, and SQL
- Experience quantifying the impact of product features for sales conversion improvement, A/B testing, attribution modelling, campaign analysis, etc.
- Champion of understanding user behaviour on digital platforms (websites, mobile applications, etc.)
- Effectively analyse and monitor services, market trends, and customer requirements.
- Effectively undertake requirements elicitation, documentation and analysis, solution design, and test definition and execution.
- Knowledge about the different functionalities and features of the main web browsers
- Ability to write clear reports and maintain important records.
- Effective organizational and management skills.
- Expert time management, with the ability to achieve given tasks within the allocated time frame.
- At least 3 to 6 years of experience with a growing e-commerce company or in analytics is very important.
- Capable of taking initiative for process and flow improvements.
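The A/B-testing skill mentioned above often reduces to a two-proportion z-test on conversion counts. This is a standard-library sketch with made-up counts; real analyses would also consider power, sequential-testing corrections, and practical significance.

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing conversion rates of variants A and B."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))         # two-sided p-value
    return z, p_value

# Hypothetical experiment: 10% control conversion vs 13% variant conversion.
z, p = two_proportion_ztest(conv_a=100, n_a=1000, conv_b=130, n_b=1000)
```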
Data Engineer JD:
- Designing, developing, constructing, installing, testing, and maintaining complete data management and processing systems.
- Building a highly scalable, robust, fault-tolerant, and secure user data platform, adhering to data protection laws.
- Taking care of the complete ETL (Extract, Transform & Load) process.
- Ensuring architecture is planned in such a way that it meets all the business requirements.
- Exploring new ways of using existing data, to provide more insights out of it.
- Proposing ways to improve data quality, reliability & efficiency of the whole system.
- Creating data models to reduce system complexity and hence increase efficiency & reduce cost.
- Introducing new data management tools & technologies into the existing system to make it more efficient.
- Setting up monitoring and alarming on data pipeline jobs to detect failures and anomalies
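The anomaly-detection side of the monitoring described above can be sketched simply: flag a pipeline run whose output deviates sharply from recent history. This is a minimal illustration using a 3-sigma rule on row counts; the history values, threshold, and metric are assumptions, and a real setup would persist history and route alarms through the alerting stack.

```python
import statistics

def is_anomalous(history, todays_count, sigmas=3.0):
    """True if today's row count is more than `sigmas` standard
    deviations away from the mean of recent runs."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return abs(todays_count - mean) > sigmas * stdev

# Hypothetical row counts from the last five successful runs.
history = [10_000, 10_200, 9_900, 10_100, 10_050]
```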
What do we expect from you?
- BS/MS in Computer Science or equivalent experience
- 5 years of recent experience in Big Data Engineering.
- Good experience working with Hadoop and Big Data technologies like HDFS, Pig, Hive, ZooKeeper, Storm, Spark, Airflow, and NoSQL systems
- Excellent programming and debugging skills in Java or Python.
- Hands-on experience with Apache Spark and Python, including deploying ML models
- Experience working on streaming and real-time pipelines
- Experience with Apache Kafka, or with any of Spark Streaming, Flume, or Storm
- Data structures and algorithms
- Problem solving and coding
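The core operation in the streaming pipelines mentioned above is windowed aggregation. This is a toy, standard-library sketch of a tumbling-window count over a finite event list, purely to illustrate the idea that Spark Streaming or Kafka Streams performs over unbounded streams at scale; the event shape and window size are invented.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """events: iterable of (timestamp_seconds, key) pairs.
    Returns {(window_start, key): count} for non-overlapping windows."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical click/view events spanning two one-minute windows.
events = [(5, "click"), (30, "click"), (65, "click"), (70, "view")]
counts = tumbling_window_counts(events)
```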
• Total of 4+ years of experience in development, architecting/designing and implementing Software solutions for enterprises.
• Must have strong programming experience in either Python or Java/J2EE.
• Minimum of 4+ years’ experience working with various cloud platforms, preferably Google Cloud Platform.
• Experience architecting and designing solutions leveraging Google Cloud products such as BigQuery, Dataflow, Pub/Sub, Bigtable, and TensorFlow will be highly preferred.
• Presentation skills with a high degree of comfort speaking with management and developers
• The ability to work in a fast-paced work environment
• Excellent communication, listening, and influencing skills
• Lead teams to implement and deliver software solutions for Enterprises by understanding their requirements.
• Communicate efficiently and document the Architectural/Design decisions to customer stakeholders/subject matter experts.
• Quickly learn new products and rapidly comprehend new technical and functional areas, applying detailed and critical thinking to customer solutions.
• Implementing and optimizing cloud solutions for customers.
• Migrate workloads from on-premises or other public clouds to Google Cloud Platform.
• Provide solutions to team members for complex scenarios.
• Promote good design and programming practices with various teams and subject matter experts.
• Ability to work on any product on the Google cloud platform.
• Must be hands-on and be able to write code as required.
• Ability to lead junior engineers and conduct code reviews
• Minimum B.Tech/B.E. Engineering graduate