Strong experience in Scala/Spark
End client: Sapient
Mode of Hiring: FTE
Notice period should be less than 30 days
Required skills and experience:
- Solid experience working in Big Data ETL environments with Spark and Java/Scala/Python
- Strong experience with AWS cloud technologies (EC2, EMR, S3, Kinesis, etc.)
- Experience building monitoring/alerting frameworks with tools like New Relic, with escalations via Slack/email/dashboard integrations (a brief sketch follows below)
- Executive-level communication, prioritization, and team leadership skills
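For illustration only, here is a minimal sketch of the kind of alerting escalation mentioned above: a small Python hook that posts to a Slack incoming webhook when a batch job processes fewer records than expected. The webhook variable, job name, and thresholds are hypothetical placeholders, not details from the posting.

```python
import os
import requests  # assumes the `requests` package is installed

# Hypothetical escalation hook: post to a Slack incoming webhook when a
# pipeline metric (records processed) falls below an expected threshold.
SLACK_WEBHOOK_URL = os.environ.get("SLACK_WEBHOOK_URL", "")  # placeholder

def escalate_if_needed(job_name: str, records_processed: int, expected_min: int) -> None:
    """Send a Slack alert if a batch job processed fewer records than expected."""
    if records_processed >= expected_min:
        return
    message = (
        f":warning: {job_name} processed only {records_processed} records "
        f"(expected at least {expected_min}). Please investigate."
    )
    if SLACK_WEBHOOK_URL:
        requests.post(SLACK_WEBHOOK_URL, json={"text": message}, timeout=10)

if __name__ == "__main__":
    escalate_if_needed("daily_s3_to_emr_etl", records_processed=1200, expected_min=100000)
```

The same hook could be wired to email or a dashboard instead of Slack; the point is simply a threshold check followed by an escalation call.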
Kwalee is one of the world’s leading multiplatform game publishers and developers, with well over 750 million downloads worldwide for mobile hits such as Draw It, Teacher Simulator, Let’s Be Cops 3D, Traffic Cop 3D and Makeover Studio 3D. Alongside this, we also have a growing PC and Console team of incredible pedigree that is on the hunt for great new titles to join TENS!, Eternal Hope and Die by the Blade.
With a team of talented people collaborating daily between our studios in Leamington Spa, Bangalore and Beijing, or on a remote basis from Turkey, Brazil, the Philippines and many more places, we have a truly global team making games for a global audience. And it’s paying off: Kwalee games have been downloaded in every country on earth! If you think you’re a good fit for one of our remote vacancies, we want to hear from you wherever you are based.
Founded in 2011 by David Darling CBE, a key architect of the UK games industry who previously co-founded and led Codemasters for many years, our team also includes legends such as Andrew Graham (creator of Micro Machines series) and Jason Falcus (programmer of classics including NBA Jam) alongside a growing and diverse team of global gaming experts. Everyone contributes creatively to Kwalee’s success, with all employees eligible to pitch their own game ideas on Creative Wednesdays, and we’re proud to have built our success on this inclusive principle. Could your idea be the next global hit?
What’s the job?
As an Analyst (Research) in the Mobile Publishing division, you'll draw on your experience in analysing market trends to pull usable insights from numerous sources and spot trends others might miss.
What you tell your friends you do
“I provide insights that help guide the direction of Kwalee’s mobile publishing team as they expand their operation”
What you will really be doing
- Use our internal and external data sources to generate insights
- Assess market trends and make recommendations to our publishing team on which opportunities to pursue and which to decline
- Evaluate market movements and use data to assess new opportunities
- Create frameworks to predict how successful new content can be and the metrics games are likely to achieve
- Evaluate business opportunities and conduct due diligence on potential business partners we are planning to work with
- Be an expert on industry data sets and how we can best use them
How you will be doing this
- You’ll be part of an agile, multidisciplinary and creative team and work closely with them to ensure the best results.
- You'll think creatively, be motivated by challenges and constantly strive for the best.
- You'll work with cutting-edge technology; if you need software or hardware to get the job done efficiently, you will get it. We even have a robot!
Team
Our talented team is our signature. We have a highly creative atmosphere with more than 200 staff where you’ll have the opportunity to contribute daily to important decisions. You’ll work within an extremely experienced, passionate and diverse team, including David Darling and the creator of the Micro Machines video games.
Skills and Requirements
- Previous experience of working with big data sets, preferably in a gaming or tech environment
- An advanced degree in a related field
- A keen interest in video games and the market, particularly in the mobile space
- Familiarity with industry tools and data providers
- A can-do attitude and ability to move projects forward even when outcomes may not be clear
We offer
- We want everyone involved in our games to share our success; that's why we have a generous team profit-sharing scheme from day 1 of employment
- In addition to a competitive salary we also offer private medical cover and life assurance
- Creative Wednesdays! (Design and make your own games every Wednesday)
- 20 days of paid holidays plus bank holidays
- Hybrid model available depending on the department and the role
- Relocation support available
- Great work-life balance with flexible working hours
- Quarterly team building days - work hard, play hard!
- Monthly employee awards
- Free snacks, fruit and drinks
Our philosophy
We firmly believe in creativity and innovation and that a fundamental requirement for a successful and happy company is having the right mix of individuals. With the right people in the right environment anything and everything is possible.
Kwalee makes games to bring people, their stories, and their interests together. As an employer, we’re dedicated to making sure that everyone can thrive within our team by welcoming and supporting people of all ages, races, colours, beliefs, sexual orientations, genders and circumstances. With the inclusion of diverse voices in our teams, we bring plenty to the table that’s fresh, fun and exciting; it makes for a better environment and helps us to create better games for everyone! This is how we move forward as a company – because these voices are the difference that make all the difference.
Job Requirements:
- Define, implement and validate solution frameworks and architecture patterns for data modeling, data integration, processing, reporting, analytics and visualization using leading cloud, big data, open-source and other enterprise technologies.
- Develop scalable data and analytics solutions leveraging standard platforms, frameworks, patterns and full stack development skills.
- Analyze, characterize and understand data sources, participate in design discussions and provide guidance related to database technology best practices.
- Write tested, robust code that can be quickly moved into production
Responsibilities:
- Experience with distributed data processing and management systems.
- Experience with cloud technologies including Spark SQL, Java/ Scala, HDFS, AWS EC2, AWS S3, etc.
- Familiarity with leveraging and modifying open source libraries to build custom frameworks.
Primary Technical Skills:
- Spark SQL, Java/Scala, sbt/Maven/Gradle, HDFS, Hive, AWS (EC2, S3, SQS, EMR, Glue Scripts, Lambda, Step Functions), IntelliJ IDE, JIRA, Git, Bitbucket/GitLab, Linux, Oozie.
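As a rough illustration of the Spark SQL / Hive / S3 stack listed above, here is a minimal PySpark sketch (the posting names Java/Scala, but the same Spark SQL pattern applies). The database, table, columns, and S3 bucket are hypothetical placeholders.

```python
from pyspark.sql import SparkSession

# Hypothetical example: aggregate a Hive table with Spark SQL and write the
# result to S3 as Parquet. Table, columns and bucket are placeholders.
spark = (
    SparkSession.builder
    .appName("orders-daily-aggregate")
    .enableHiveSupport()          # allows reading Hive-managed tables
    .getOrCreate()
)

daily_totals = spark.sql("""
    SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS total_amount
    FROM sales_db.orders
    GROUP BY order_date
""")

(daily_totals.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/daily_order_totals/"))

spark.stop()
```

On EMR the `s3://` path would resolve through EMRFS; the same job could equally be packaged with sbt/Maven in Scala as the skills list suggests.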
Notice Period: Max 30-45 days only
Job Sector: IT, Software
Job Type: Permanent
Location: Chennai
Experience: 10 - 20 Years
Salary: 12 – 40 LPA
Education: Any Graduate
Notice Period: Immediate
Key Skills: Python, Spark, AWS, SQL, PySpark
Contact at triple eight two zero nine four two double seven
Job Description:
Requirements
- Minimum 12 years' experience
- In-depth understanding of distributed computing with Spark.
- Deep understanding of Spark Architecture and internals
- Proven experience in data ingestion, data integration and data analytics with Spark, preferably PySpark (a brief sketch follows this list).
- Expertise in ETL processes, data warehousing and data lakes.
- Hands-on with Python for big data and analytics.
- Hands-on experience with the Agile Scrum model is an added advantage.
- Knowledge of CI/CD and orchestration tools is desirable.
- Knowledge of AWS S3, Redshift and Lambda is preferred.
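As a hedged sketch of the ingestion/ETL skills listed above: a small PySpark job that reads raw JSON from S3, applies light cleansing, and writes partitioned Parquet to a data-lake path from which Redshift (Spectrum or COPY) could later load it. All paths, columns, and the schema are invented for illustration.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical ingestion job: raw JSON events in S3 -> cleaned, partitioned
# Parquet in the data lake. All paths and column names are placeholders.
spark = SparkSession.builder.appName("raw-events-ingest").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/events/2024-01-01/")

cleaned = (
    raw
    .dropDuplicates(["event_id"])                      # basic de-duplication
    .filter(F.col("event_type").isNotNull())           # drop malformed rows
    .withColumn("event_date", F.to_date("event_ts"))   # derive a partition key
)

(cleaned.write
    .mode("append")
    .partitionBy("event_date")
    .parquet("s3://example-lake-bucket/curated/events/"))

spark.stop()
```

In a CI/CD setup this job would typically be scheduled by an orchestration tool and validated with unit tests before promotion, in line with the requirements above.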
Job Description
Job Title: Data Engineer
Tech Job Family: DACI
• Bachelor's Degree in Engineering, Computer Science, CIS, or related field (or equivalent work experience in a related field)
• 2 years of experience in Data, BI or Platform Engineering, Data Warehousing/ETL, or Software Engineering
• 1 year of experience working on project(s) involving the implementation of solutions applying development life cycles (SDLC)
Preferred Qualifications:
• Master's Degree in Computer Science, CIS, or related field
• 2 years of IT experience developing and implementing business systems within an organization
• 4 years of experience working with defect or incident tracking software
• 4 years of experience with technical documentation in a software development environment
• 2 years of experience working with an IT Infrastructure Library (ITIL) framework
• 2 years of experience leading teams, with or without direct reports
• Experience with application and integration middleware
• Experience with database technologies
Data Engineering
• 2 years of experience in Hadoop or any cloud Big Data components (specific to the Data Engineering role)
• Expertise in Java/Scala/Python, SQL, scripting, Teradata, Hadoop (Sqoop, Hive, Pig, MapReduce), Spark (Spark Streaming, MLlib), Kafka or equivalent cloud Big Data components (specific to the Data Engineering role; see the streaming sketch below)
BI Engineering
• Expertise in MicroStrategy/Power BI/SQL, scripting, Teradata or equivalent RDBMS, Hadoop (OLAP on Hadoop), dashboard development, mobile development (specific to the BI Engineering role)
Platform Engineering
• 2 years of experience in Hadoop, NoSQL, RDBMS or any cloud Big Data components, Teradata, MicroStrategy (specific to the Platform Engineering role)
• Expertise in Python, SQL, scripting, Teradata, Hadoop utilities like Sqoop, Hive, Pig, MapReduce, Spark, Ambari, Ranger, Kafka or equivalent cloud Big Data components (specific to the Platform Engineering role)
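To make the Spark Streaming / Kafka expertise named above concrete, here is a minimal Structured Streaming sketch. It is hedged: the broker, topic, output paths, and checkpoint location are placeholders, and the job assumes the spark-sql-kafka package is available on the cluster.

```python
from pyspark.sql import SparkSession

# Hypothetical streaming job: consume a Kafka topic and land micro-batches as
# Parquet. Requires the spark-sql-kafka package; all names are placeholders.
spark = SparkSession.builder.appName("clickstream-consumer").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "clickstream")
    .load()
    .selectExpr("CAST(value AS STRING) AS payload", "timestamp")
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3://example-bucket/streams/clickstream/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/clickstream/")
    .trigger(processingTime="1 minute")
    .start()
)

query.awaitTermination()
```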
Lowe’s is an equal opportunity employer and administers all personnel practices without regard to race, color, religion, sex, age, national origin, disability, sexual orientation, gender identity or expression, marital status, veteran status, genetics or any other category protected under applicable law.
Work Timing: 5 Days A Week
Responsibilities include:
• Ensure the right stakeholders get the right information at the right time
• Requirements gathering with stakeholders to understand their data requirements
• Creating and deploying reports
• Participate actively in datamart design discussions
• Work on both RDBMS as well as Big Data for designing BI solutions
• Write code (queries/procedures) in SQL / Hive / Drill that is both functional and elegant, following appropriate design patterns (a brief sketch follows this list)
• Design and plan BI solutions to automate regular reporting
• Debugging, monitoring and troubleshooting BI solutions
• Creating and deploying datamarts
• Writing relational and multidimensional database queries
• Integrate heterogeneous data sources into BI solutions
• Ensure Data Integrity of data flowing from heterogeneous data sources into BI solutions.
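As a hedged illustration of the SQL / Hive reporting work described in the list above, here is a small sketch that runs a datamart aggregation over PyHive. The host, user, database, table, and column names are hypothetical; any HiveServer2-compatible connection would work similarly.

```python
from pyhive import hive  # assumes the PyHive package and a reachable HiveServer2

# Hypothetical reporting query: monthly order totals from a datamart table.
# Connection details, database and column names are placeholders.
REPORT_SQL = """
    SELECT order_month, region, SUM(order_amount) AS total_amount
    FROM sales_mart.fct_orders
    GROUP BY order_month, region
    ORDER BY order_month, region
"""

def run_monthly_report(host: str = "hiveserver.example.internal", port: int = 10000) -> None:
    conn = hive.Connection(host=host, port=port, username="report_user")
    try:
        cursor = conn.cursor()
        cursor.execute(REPORT_SQL)
        for order_month, region, total_amount in cursor.fetchall():
            print(order_month, region, total_amount)
    finally:
        conn.close()

if __name__ == "__main__":
    run_monthly_report()
```

The same query could feed an automated report or a datamart refresh once wrapped in a scheduled job.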
Minimum Job Qualifications:
• BE/B.Tech in Computer Science/IT from Top Colleges
• 1-5 years of experience in data warehousing and SQL
• Excellent Analytical Knowledge
• Excellent technical as well as communication skills
• Attention to even the smallest detail is mandatory
• Knowledge of SQL query writing and performance tuning
• Knowledge of Big Data technologies like Apache Hadoop, Apache Hive, Apache Drill
• Knowledge of fundamentals of Business Intelligence
• In-depth knowledge of RDBMS systems, data warehousing and datamarts
• Smart, motivated and team oriented
Desirable Requirements
• Sound knowledge of software development and programming (preferably Java)
• Knowledge of the software development lifecycle (SDLC) and models
Data Platform Engineer (SDE 1/2/3)
at Urban Company (formerly known as UrbanClap)
Why are we building Urban Company?
Organized service commerce is a large yet young industry in India. While India is a very large market for home and local services (~USD 50 Billion in retail spends) and expected to double in the next 5 years, there is no billion-dollar company in this segment today.
The industry is barely ~20 years old, with a sub-optimal market architecture typical of an unorganized market: a fragmented supply side operated by middlemen. As a result, experiences are broken for both customers and service professionals, each largely relying upon word of mouth to discover the other. The industry can easily be 1.5-2x larger than it is today if the frictions in user and professional journeys are removed and the experiences made more meaningful and joyful.
The Urban Company team is young and passionate, and we see a massive disruption opportunity in this industry. By leveraging technology and a set of simple yet powerful processes, we wish to build a platform that can organize the world of services and bring them to your fingertips. We believe there is immense value (akin to serendipity) in bringing together customers and professionals looking for each other. In the process, we hope to impact the lives of millions of service entrepreneurs, and transform service commerce the way Amazon transformed product commerce.
Urban Company has grown 3x YoY, and so has our tech stack. We have evolved a data-driven approach to solving for the product over the last few years. We deal with around 10 TB in data analytics, at around 50Mn/day. We adopted platform thinking at a pretty early stage of UC: we started building central platform teams dedicated to solving core engineering problems around 2-3 years ago, and this has now evolved into a full-fledged vertical. Our platform vertical majorly includes Data Engineering, Service and Core Platform, Infrastructure, and Security. We are looking for Data Engineers: people who love solving for standardization, have strong platform thinking and opinions, and have solved for Data Engineering, Data Science and analytics platforms.
Job Responsibilities
- Platform-first approach to engineering problems.
- Creating highly autonomous systems with minimal manual intervention.
- Frameworks which can be extended to larger audiences through open source.
- Extending and modifying open-source projects to adapt them to Urban Company's use cases.
- Developer productivity.
- Highly abstracted and standardized frameworks like microservices, event-driven architecture, etc.
Job Requirements/Potential Backgrounds
- Bachelor's/Master's in Computer Science from a top-tier engineering school.
- Experience with data pipeline and workflow management tools like Luigi, Airflow, etc. (a brief DAG sketch follows this list).
- Proven ability to work in a fast-paced environment.
- Familiarity with server-side development of APIs, databases, DevOps and systems.
- Fanatic about building scalable, reliable data products.
- Experience with big data tools: Hadoop, Kafka/Kinesis, Flume, etc. is an added advantage.
- Experience with relational SQL and NoSQL databases like HBase, Cassandra, etc.
- Experience with stream processing engines like Spark, Flink, Storm, etc. is an added advantage.
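Since the requirements above call out workflow tools like Airflow, here is a minimal, hedged Airflow 2.x DAG sketch; the DAG id, schedule, and task bodies are placeholders invented for illustration.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical two-step pipeline: extract raw data, then load it downstream.
# DAG id, schedule and task bodies are placeholders for illustration only.

def extract(**context):
    print("extracting raw events...")

def load(**context):
    print("loading curated events...")

with DAG(
    dag_id="example_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```

In practice the two callables would hand off to Spark jobs or warehouse loads, with retries and alerting configured on the operators.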
What UC has in store for you
- A phenomenal work environment, with massive ownership and growth opportunities.
- A high performance, high-velocity environment at the cutting edge of growth.
- Strong ownership expectation and freedom to fail.
- Quick iterations and deployments – fail-fast attitude.
- Opportunity to work on cutting edge technologies.
- The massive and direct impact of the work you do on the lives of people.