We are looking for an engineer with an ML/DL background.
The ideal candidate should have the following skillset:
1) Experience building and deploying systems
2) Experience with Theano/Torch/Caffe/Keras (all useful)
3) Experience with data warehousing/storage/management would be a plus
4) Experience writing production software would be a plus
5) Should have developed their own DL architectures, apart from using open-source architectures
6) Extensive experience with computer vision applications
The candidate will be responsible for building Deep Learning models to solve specific problems. The workflow looks as follows:
1) Define Problem Statement (input -> output)
2) Preprocess Data
3) Build DL model
4) Test on different datasets using Transfer Learning
5) Parameter Tuning
6) Deployment to production
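The six steps above can be sketched end to end. This is a minimal illustration using scikit-learn in place of a DL framework so the example stays self-contained and quick to run; the synthetic dataset, the MLP model, and the parameter grid are all illustrative assumptions, not part of the job spec.

```python
# Minimal sketch of the workflow: define the problem, preprocess,
# build a model, tune parameters, and evaluate before deployment.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier

# 1) Problem statement: feature vectors -> binary label
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 2) Preprocess + 3) build model, chained as one pipeline
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("clf", MLPClassifier(max_iter=500, random_state=0)),
])

# 5) Parameter tuning via cross-validated grid search
grid = GridSearchCV(pipe, {"clf__hidden_layer_sizes": [(16,), (32,)]}, cv=3)
grid.fit(X_train, y_train)

# 4)/6) Evaluate on held-out data before promoting to production
test_acc = grid.score(X_test, y_test)
print(f"held-out accuracy: {test_acc:.2f}")
```

In real work the `MLPClassifier` stage would be replaced by a Keras/Torch model, and step 4's transfer learning would mean reusing trained weights across datasets rather than refitting from scratch.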
Candidate should have experience working on Deep Learning with an engineering degree from a top tier institute (preferably IIT/BITS or equivalent)
Manager | Data Engineering
Bangalore | Full Time
At Porter, we are passionate about improving productivity. We want to help businesses, large and small, optimize their last-mile operations and empower them to unleash the growth of their core functions. Last-mile delivery logistics is one of the biggest and fastest-growing sectors of the economy, with a market size upwards of 50 billion USD and a growth rate exceeding 15% CAGR.
Porter is the fastest-growing leader in this sector, with operations in 14 major cities, a fleet of over 1 lakh (100,000) registered and 50k active driver partners, and a customer base of 3.5M monthly active users. Backed by our industry-best technology platform, we have raised over 50 million USD from investors including Sequoia Capital, Kae Capital, the Mahindra Group and LGT Aspada. We are addressing a massive problem and going after a huge market.
We’re trying to create a household name in transportation and our ambition is to disrupt all facets of last mile logistics including warehousing and LTL transportation. At Porter, we’re here to do the best work of our lives.
If you want to do the same and love the challenges and opportunities of a fast paced work environment, then we believe Porter is the right place for you.
Data Strategy and Alignment
- Work closely with data analysts and business / product teams to understand requirements and provide data ready for analysis and reporting.
- Apply, help define, and champion data governance: data quality, testing, documentation, coding best practices and peer reviews.
- Continuously discover, transform, test, deploy, and document data sources and data models.
- Work closely with the Infrastructure team to build and improve our Data Infrastructure.
- Develop and execute data roadmap (and sprints) - with a keen eye on industry trends and direction.
Data Stores and System Development
- Design and implement high-performance, reusable, and scalable data models for our data warehouse to ensure our end-users get consistent and reliable answers when running their own analyses.
- Focus on test driven design and results for repeatable and maintainable processes and tools.
- Create and maintain optimal data pipeline architecture - and data flow logging framework.
- Build the data products, features, tools, and frameworks that enable and empower Data, and Analytics teams across Porter.
- Drive project execution using effective prioritization and resource allocation.
- Resolve blockers through technical expertise, negotiation, and delegation.
- Strive for complete, on-time solutions through stand-ups and course-correction.
- Manage and elevate a team of 5-8 members.
- Hold regular one-on-ones with teammates to ensure their welfare.
- Provide periodic assessments and actionable feedback on progress.
- Recruit new members with a view to long-term resource planning through effective collaboration with the hiring team.
- Set the bar for the quality of technical and data-based solutions the team ships.
- Enforce code quality standards and establish good code review practices - using this as a nurturing tool.
- Set up communication channels and feedback loops for knowledge sharing and stakeholder management.
- Explore the latest best practices and tools for constant up-skilling.
Data Engineering Stack
- Analytics: Python / R / SQL + Excel / PPT, Google Colab
- Database: PostgreSQL, Amazon Redshift, DynamoDB, Aerospike
- Warehouse: Redshift, S3
- ETL: Airflow + dbt + custom Python + Amundsen (discovery)
- Business Intelligence / Visualization: Metabase + Google Data Studio
- Frameworks: Spark + Dash + Streamlit
- Collaboration: Git, Notion
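The ETL layer above centres on Airflow-orchestrated extract/transform/load tasks with a logging framework around the data flow. The dependency pattern can be sketched in plain Python (so the example runs without an Airflow installation); task names, table contents, and the 18% tax figure are all illustrative assumptions.

```python
# Sketch of the extract -> transform -> load pattern that Airflow
# orchestrates, with per-task logging standing in for a data flow
# logging framework.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def extract():
    log.info("extract: pulling rows from source")
    return [{"trip_id": 1, "fare": 120.0}, {"trip_id": 2, "fare": 80.0}]

def transform(rows):
    log.info("transform: computing fare with tax")
    return [{**r, "fare_with_tax": round(r["fare"] * 1.18, 2)} for r in rows]

def load(rows):
    log.info("load: writing %d rows to warehouse", len(rows))
    return len(rows)

# An Airflow DAG would express this as extract >> transform >> load;
# here the tasks simply run in dependency order.
loaded = load(transform(extract()))
print("rows loaded:", loaded)
```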
Kwalee is one of the world’s leading multiplatform game publishers and developers, with well over 750 million downloads worldwide for mobile hits such as Draw It, Teacher Simulator, Let’s Be Cops 3D, Traffic Cop 3D and Makeover Studio 3D. Alongside this, we also have a growing PC and Console team of incredible pedigree that is on the hunt for great new titles to join TENS!, Eternal Hope and Die by the Blade.
With a team of talented people collaborating daily between our studios in Leamington Spa, Bangalore and Beijing, or on a remote basis from Turkey, Brazil, the Philippines and many more places, we have a truly global team making games for a global audience. And it’s paying off: Kwalee games have been downloaded in every country on earth! If you think you’re a good fit for one of our remote vacancies, we want to hear from you wherever you are based.
Founded in 2011 by David Darling CBE, a key architect of the UK games industry who previously co-founded and led Codemasters for many years, our team also includes legends such as Andrew Graham (creator of Micro Machines series) and Jason Falcus (programmer of classics including NBA Jam) alongside a growing and diverse team of global gaming experts. Everyone contributes creatively to Kwalee’s success, with all employees eligible to pitch their own game ideas on Creative Wednesdays, and we’re proud to have built our success on this inclusive principle. Could your idea be the next global hit?
What’s the job?
As a Data Science Core Developer you will build tools and develop technology that deliver data science products to a team of strategists, marketing experts and game developers.
What you will be doing
- Create analytical tools, from simple scripts to full stack applications.
- Develop successful prototype tools into highly tested automated programs
- Work with the marketing, publishing and development teams to understand the problems they are facing and how to solve them, and deliver products that are understandable to non-data scientists
- Solve challenging data management and data flow problems to fuel Kwalee’s analysis
How you will be doing this
- You’ll be part of an agile, multidisciplinary and creative team and work closely with them to ensure the best results.
- You'll think creatively, be motivated by challenges, and constantly strive for the best.
- You’ll work with cutting-edge technology; if you need software or hardware to get the job done efficiently, you will get it. We even have a robot!
Our talented team is our signature. We have a highly creative atmosphere with more than 200 staff where you’ll have the opportunity to contribute daily to important decisions. You’ll work within an extremely experienced, passionate and diverse team, including David Darling and the creator of the Micro Machines video games.
Skills and Requirements
- A proven track record of writing high quality program code in Python
- Experience with machine learning python frameworks and libraries such as Tensorflow and Scikit-Learn
- The ability to write quick scripts to accelerate manual tasks
- Knowledge of NoSQL and SQL databases like Couchbase, Elasticsearch and PostgreSQL is helpful but not necessary
- An avid interest in the development, marketing and monetisation of mobile games
- We want everyone involved in our games to share our success; that’s why we have a generous team profit-sharing scheme from day 1 of employment
- In addition to a competitive salary we also offer private medical cover and life assurance
- Creative Wednesdays! (Design and make your own games every Wednesday)
- 20 days of paid holidays plus bank holidays
- Hybrid model available depending on the department and the role
- Relocation support available
- Great work-life balance with flexible working hours
- Quarterly team building days - work hard, play hard!
- Monthly employee awards
- Free snacks, fruit and drinks
We firmly believe in creativity and innovation and that a fundamental requirement for a successful and happy company is having the right mix of individuals. With the right people in the right environment anything and everything is possible.
Kwalee makes games to bring people, their stories, and their interests together. As an employer, we’re dedicated to making sure that everyone can thrive within our team by welcoming and supporting people of all ages, races, colours, beliefs, sexual orientations, genders and circumstances. With the inclusion of diverse voices in our teams, we bring plenty to the table that’s fresh, fun and exciting; it makes for a better environment and helps us to create better games for everyone! This is how we move forward as a company – because these voices are the difference that make all the difference.
- Overall 3 to 5 years of experience in designing and implementing complex, large-scale software.
- Strong Python skills are a must.
- Experience with Apache Spark, Scala, Java and Delta Lake
- Experience designing and implementing templated ETL/ELT data pipelines
- Expert-level experience in data pipeline orchestration using Apache Airflow for large-scale production deployments
- Experience visualizing data from various tasks in the data pipeline using Apache Zeppelin/Plotly or any other visualization library
- Log management and log monitoring using ELK/Grafana
- GitHub integration
Technology Stack: Apache Spark, Apache Airflow, Python, AWS, EC2, S3, Kubernetes, ELK, Grafana, Apache Arrow, Java
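A "templated" ETL/ELT pipeline, as the requirements above describe, usually means one generic pipeline definition instantiated per dataset from configuration. A hypothetical sketch in plain Python (rather than Spark, for self-containment); the config keys and the orders dataset are invented for illustration.

```python
# One pipeline template, instantiated per dataset from a config dict:
# the config names the columns to project and an optional row filter.
from typing import Callable

def make_pipeline(config: dict) -> Callable[[list], list]:
    """Build an ETL step from a template config."""
    keep = config["columns"]                        # projection
    where = config.get("filter", lambda row: True)  # optional predicate
    def run(rows: list) -> list:
        return [{k: r[k] for k in keep} for r in rows if where(r)]
    return run

# Instantiate the template for a hypothetical "orders" dataset
orders_etl = make_pipeline({
    "columns": ["order_id", "amount"],
    "filter": lambda r: r["amount"] > 0,
})

raw = [
    {"order_id": 1, "amount": 250, "note": "ok"},
    {"order_id": 2, "amount": 0, "note": "void"},
]
clean = orders_etl(raw)
print(clean)  # keeps order 1 only, with just the configured columns
```

In a Spark/Delta Lake setting the same idea applies, with the config driving `select`/`filter` on DataFrames and an Airflow DAG generated per configured dataset.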
- Understand current state architecture, including pain points.
- Create and document future state architectural options to address specific issues or initiatives using Machine Learning.
- Innovate and scale architectural best practices around building and operating ML workloads by collaborating with stakeholders across the organization.
- Develop CI/CD & ML pipelines that help to achieve end-to-end ML model development lifecycle from data preparation and feature engineering to model deployment and retraining.
- Provide recommendations around security, cost, performance, reliability, and operational efficiency, and implement them
- Provide thought leadership around the use of industry standard tools and models (including commercially available models and tools) by leveraging experience and current industry trends.
- Collaborate with the Enterprise Architect, consulting partners and client IT team as warranted to establish and implement strategic initiatives.
- Make recommendations and assess proposals for optimization.
- Identify operational issues and recommend and implement strategies to resolve problems.
- 3+ years of experience in developing CI/CD & ML pipelines for end-to-end ML model/workloads development
- Strong knowledge in ML operations and DevOps workflows and tools such as Git, AWS CodeBuild & CodePipeline, Jenkins, AWS CloudFormation, and others
- Background in ML algorithm development, AI/ML Platforms, Deep Learning, ML Operations in the cloud environment.
- Strong programming skillset with high proficiency in Python, R, etc.
- Strong knowledge of AWS cloud and its technologies such as S3, Redshift, Athena, Glue, SageMaker etc.
- Working knowledge of databases, data warehouses, data preparation and integration tools, along with big data parallel processing layers such as Apache Spark or Hadoop
- Knowledge of pure and applied math, ML and DL frameworks, and ML techniques, such as random forest and neural networks
- Ability to collaborate with data scientists, data engineers, leaders, and other IT teams
- Ability to work with multiple projects and work streams at one time. Must be able to deliver results based upon project deadlines.
- Willing to flex daily work schedule to allow for time-zone differences for global team communications
- Strong interpersonal and communication skills
- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions - Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, CosmosDB, EventHub/IoTHub.
- Experience in migrating on-premises data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions
- Azure Synapse or Azure SQL Data Warehouse
- Spark on Azure (available in HDInsight and Databricks)
- B.Tech/M.Tech from a tier-1 institution
- 8+ years of experience in machine learning techniques like logistic regression, random forest, boosting, trees, neural networks, etc.
- Demonstrated experience with Python and SQL, and proficiency in scikit-learn, Pandas, NumPy, Keras and TensorFlow/PyTorch
- Experience working with Qlik Sense or Tableau is a plus
The person holding this position is responsible for leading the solution development and implementing advanced analytical approaches across a variety of industries in the supply chain domain.
At this position you act as an interface between the delivery team and the supply chain team, effectively understanding the client business and supply chain.
Candidates will be expected to lead projects across several areas such as
- Demand forecasting
- Inventory management
- Simulation & Mathematical optimization models.
- Procurement analytics
- Distribution/Logistics planning
- Network planning and optimization
Qualification and Experience
- 4+ years of analytics experience in supply chain – preferably in hi-tech, consumer technology, CPG, automobile, retail or e-commerce supply chains.
- Master’s in Statistics/Economics, or MBA, or M.Sc./M.Tech with Operations Research/Industrial Engineering/Supply Chain
- Hands-on experience in delivery of projects using statistical modelling
Skills / Knowledge
- Hands on experience in statistical modelling software such as R/ Python and SQL.
- Experience in advanced analytics / statistical techniques – regression, decision trees, ensemble machine learning algorithms, etc. – will be considered an added advantage.
- Highly proficient with Excel, PowerPoint and Word applications.
- APICS-CSCP or PMP certification will be an added advantage
- Strong knowledge of supply chain management
- Working knowledge of linear/nonlinear optimization
- Ability to structure problems through a data driven decision-making process.
- Excellent project management skills, including time and risk management and project structuring.
- Ability to identify and draw on leading-edge analytical tools and techniques to develop creative approaches and new insights to business issues through data analysis.
- Ability to liaise effectively with multiple stakeholders and functional disciplines.
- Experience with optimization tools like CPLEX, ILOG, GAMS will be an added advantage.
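The distribution/logistics and network planning work above typically reduces to linear or mixed-integer optimization models. As an illustration, here is a tiny transportation problem solved with SciPy's `linprog`; commercial solvers like CPLEX handle the same model class at industrial scale. All costs, supplies and demands below are made-up numbers.

```python
# Ship from two warehouses to two stores at minimum total cost.
# Decision variables: x = [w1_s1, w1_s2, w2_s1, w2_s2] shipment quantities.
from scipy.optimize import linprog

cost = [4, 6, 5, 3]            # per-unit shipping cost for each lane

A_ub = [[1, 1, 0, 0],          # warehouse 1 ships at most its capacity
        [0, 0, 1, 1]]          # warehouse 2 likewise
b_ub = [30, 40]

A_eq = [[1, 0, 1, 0],          # store 1 demand must be met exactly
        [0, 1, 0, 1]]          # store 2 demand must be met exactly
b_eq = [20, 30]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
print("optimal cost:", res.fun)   # 170.0: w1 -> s1 = 20, w2 -> s2 = 30
```

The optimum routes each store to its cheapest warehouse here because neither capacity binds; with tighter capacities the solver trades off lanes, which is exactly the structure of network planning models.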
Niki is an artificially intelligent ordering application (niki.ai/app). Our founding team is from IIT Kharagpur, and we are looking for a Natural Language Processing Engineer to join our engineering team.
The ideal candidate will have industry experience solving language-related problems using statistical methods on vast quantities of data available from Indian mobile consumers and elsewhere.
Major responsibilities would be:
1. Create language models from text data. These language models draw heavily from recent statistical, deep learning and rule-based research on building taggers, parsers, knowledge-graph-based dictionaries, etc.
2. Develop highly scalable classifiers and tools leveraging machine learning, regression, and rule-based models
3. Work closely with product teams to implement algorithms that power user and developer-facing products
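Responsibility 2 above can be sketched concretely: a small intent classifier of the sort that could route "recharge" vs "cab" requests, built from TF-IDF features and logistic regression. The toy utterances and intent labels below are invented for illustration, not Niki's actual training data or pipeline.

```python
# Tiny intent classifier: TF-IDF features + logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

utterances = [
    "recharge my phone", "do a mobile topup", "recharge my 3g pack",
    "book a cab", "get me a taxi to the airport", "cab booking please",
]
intents = ["recharge", "recharge", "recharge", "cab", "cab", "cab"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(utterances, intents)

pred = clf.predict(["recharge my number"])[0]
print("predicted intent:", pred)
```

A production system would add tagging, parsing, and entity resolution on top of (or in place of) such a classifier, per responsibility 1.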
We work mostly in Java and Python, and object-oriented concepts are a must to fit in with the team. Basic eligibility criteria are:
1. Industry experience of at least 5 years.
2. Strong background in Natural Language Processing and Machine Learning
3. Some experience leading a team, big or small.
4. Experience with Hadoop/HBase/Pig or MapReduce/Sawzall/Bigtable is a plus
What We're Building
We are building an automated messaging platform to simplify the ordering experience for consumers. We have launched the Android app (niki.ai/app). In its current avatar, Niki can process mobile phone recharges and book cabs for consumers. It assists in finding the right recharge plans across topup, 2G and 3G, and completes the transaction. In cab booking, it handles end-to-end booking along with tracking and cancellation within the app. You can also compare available cabs to find the nearest or cheapest one.
Being an instant messaging app, it works seamlessly on 2G / 3G / WiFi and is lightweight at around 3.6 MB. You can check it out at: https://niki.ai/