2 - 3 yrs
₹15L - ₹20L / yr
Bengaluru (Bangalore)
Skills
skill iconPython
skill iconScala
Hadoop
Spark
Data Engineer
Kafka
Luigi
Airflow
Nosql
  • We are looking for a Data Engineer to build the next-generation mobile applications for our world-class fintech product.
  • The candidate will be responsible for expanding and optimising our data and data pipeline architecture, as well as optimising data flow and collection for cross-functional teams.
  • The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimising data systems and building them from the ground up.
  • We are looking for a person with a strong ability to analyse data and provide valuable insights to the product and business teams to solve daily business problems.
  • You should be able to work in a high-volume environment and have outstanding planning and organisational skills.

 

Qualifications for Data Engineer

 

  • Working knowledge of SQL and experience with relational databases, including query authoring, as well as familiarity with a variety of database systems.
  • Experience building and optimising ‘big data’ data pipelines, architectures, and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets. Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Looking for a candidate with 2-3 years of experience in a Data Engineer role who is a CS graduate or has equivalent experience.
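As a flavor of the query-authoring skill listed above, here is a minimal sketch using Python's built-in sqlite3 module; the table and data are invented for illustration:

```python
import sqlite3

# In-memory database with an illustrative orders table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "alice", 120.0), (2, "bob", 80.0), (3, "alice", 50.0)],
)

# Typical aggregation query: revenue per customer, highest first
rows = conn.execute(
    "SELECT customer, SUM(amount) AS revenue "
    "FROM orders GROUP BY customer ORDER BY revenue DESC"
).fetchall()
print(rows)  # [('alice', 170.0), ('bob', 80.0)]
```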

 

What we're looking for

 

  • Experience with big data tools: Hadoop, Spark, Kafka, and similar.
  • Experience with relational SQL and NoSQL databases, including MySQL/Postgres and MongoDB.
  • Experience with data pipeline and workflow management tools: Luigi, Airflow.
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift.
  • Experience with stream-processing systems: Storm, Spark Streaming.
  • Experience with object-oriented/functional scripting languages: Python, Java, Scala.
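Workflow managers like Luigi and Airflow model a pipeline as a DAG of dependent tasks and run each task only after its upstream tasks finish. A stdlib-only toy of that core idea (task names are invented; real Airflow DAGs use its own operator API, not this):

```python
from graphlib import TopologicalSorter

# Pipeline steps and their upstream dependencies (illustrative names)
deps = {
    "extract": set(),
    "clean": {"extract"},
    "aggregate": {"clean"},
    "load": {"aggregate"},
}

# A valid execution order: each task appears after everything it depends on
order = list(TopologicalSorter(deps).static_order())
print(order)  # ['extract', 'clean', 'aggregate', 'load']
```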
About Big revolution in the e-gaming industry. (GK1)


Similar jobs

TensorGo Software Private Limited
Deepika Agarwal
Posted by Deepika Agarwal
Remote only
5 - 8 yrs
₹5L - ₹15L / yr
skill iconPython
PySpark
apache airflow
Spark
Hadoop
+4 more

Requirements:

● Understanding our data sets and how to bring them together.

● Working with our engineering team to support custom solutions offered to the product development.

● Filling the gap between development, engineering and data ops.

● Creating, maintaining and documenting scripts to support ongoing custom solutions.

● Excellent organizational skills, including attention to precise details

● Strong multitasking skills and ability to work in a fast-paced environment

● 5+ years of experience developing scripts with Python.

● Know your way around RESTful APIs (able to integrate; publishing not necessary).

● You are familiar with pulling and pushing files from SFTP and AWS S3.

● Experience with any Cloud solutions including GCP / AWS / OCI / Azure.

● Familiarity with SQL programming to query and transform data from relational Databases.

● Familiarity with Linux (and the Linux work environment).

● Excellent written and verbal communication skills

● Extracting, transforming, and loading data into internal databases and Hadoop

● Optimizing our new and existing data pipelines for speed and reliability

● Deploying product build and product improvements

● Documenting and managing multiple repositories of code

● Experience with SQL and NoSQL databases (Cassandra, MySQL)

● Hands-on experience in data pipelining and ETL (any of these frameworks/tools: Hadoop, BigQuery, RedShift, Athena)

● Hands-on experience with Airflow

● Understanding of best practices and common coding patterns around storing, partitioning, warehousing, and indexing of data

● Experience reading data from Kafka topics (both live stream and offline)

● Experience with PySpark and DataFrames
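The extract/transform/load bullets above follow a common shape; here is a stdlib-only sketch with invented sample data (a real pipeline here would pull from SFTP/S3 and load into Hadoop or an internal database, as the listing describes):

```python
import csv
import io
import sqlite3

# Extract: parse CSV from an (illustrative) upstream source
raw = "user,clicks\nalice,3\nbob,5\nalice,2\n"
records = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and aggregate clicks per user
totals = {}
for r in records:
    totals[r["user"]] = totals.get(r["user"], 0) + int(r["clicks"])

# Load: write the aggregates into an internal table
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE clicks (user TEXT PRIMARY KEY, total INTEGER)")
db.executemany("INSERT INTO clicks VALUES (?, ?)", totals.items())
db.commit()

print(sorted(db.execute("SELECT * FROM clicks")))  # [('alice', 5), ('bob', 5)]
```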

Responsibilities:

You will be:

● Collaborating across an agile team to continuously design, iterate, and develop big data systems.

● Extracting, transforming, and loading data into internal databases.

● Optimizing our new and existing data pipelines for speed and reliability.

● Deploying new products and product improvements.

● Documenting and managing multiple repositories of code.

Rivos Inc
Deepa Savant
Posted by Deepa Savant
Bengaluru (Bangalore)
4 - 18 yrs
₹3L - ₹15L / yr
DFT
System on a chip
skill iconPython
CPU
Bangalore, India / Engineering – Silicon Engineering / Full-time
 
Positions are open for full-time roles in DFT design from unit level to chip level, involving all aspects of DFT design from scan and MBIST to ATPG, including CPU and SoC DFT design and verification.

Responsibilities

    • Define DFT strategy and methodologies
    • Design the DFT features
    • Define test structures, debug structures, and test plans
    • Create test vectors or oversee their creation
    • Collaborate with physical design team to close requirements
    • Validate DFT requirements are being met
    • Work with designers to increase test coverage, debug observability and flexibility
    • Verify post-PD designs meet DFT requirements
    • Work with verification engineers, stepping in to run tests when needed

Requirements

    • Good knowledge of digital logic design, microprocessor, debug feature, DFT architecture, CPU architecture, and microarchitecture
    • Knowledge of DFT and structural debug concepts and methodologies: JTAG, IEEE1500, MBIST, scan dump, memory dump
    • Knowledge of Verilog/SystemVerilog and experience with simulators and waveform debugging tools
    • Knowledge of Python, shell scripting, Makefiles, and TCL a plus
    • Excellent problem-solving, written and verbal communication, and organizational skills; highly self-motivated.
    • Ability to work well in a team and be productive under aggressive schedules.
Education and Experience
PhD, Master's, or Bachelor's degree in a technical subject area.


Note:
Annual job salary: The annual salary mentioned in this posting is a Cutshort default and is inaccurate; it has not been mentioned or disclosed by Rivos.

Resumes:
Interested candidates with 3 to 20 years of experience in silicon DFT: please reach out to the recruiter, Deepa Savant, to learn more about the job and discuss details.
Cloudbloom Systems LLP
at Cloudbloom Systems LLP
5 recruiters
Sahil Rana
Posted by Sahil Rana
Bengaluru (Bangalore)
5 - 10 yrs
₹10L - ₹15L / yr
Test Automation (QA)
Software Testing (QA)
skill iconPython
skill iconJava
Selenium
+8 more


Preference:
Experience in the telecom domain.

You Will Do...

  • Test the performance and robustness of the product to help determine stability, scalability, and dimensioning
  • Oversee preparation and verification of test environments in conjunction with developing test simulators and automation
  • Capture and detail the outcome of test executions and all information needed to support ongoing measurements and reporting on risks, defects, tests, and coverage
  • Report on test execution in a timely manner with attention given to achieving outstanding quality
  • Identify issues, propose system improvements, and perform repetitive test execution to resolve identified issues
  • Evaluate and research new tools and practices to improve the execution and processes
  • Define the automation strategy to reduce the lead time of Performance testing
  • Collaborate with microservice teams on test and integration with a focus on customer experience
  • Design & execute test automation suite on solution level
  • Continuously improve quality assurance through test management
  • Develop and maintain the test automation infrastructure
  • Operate pipelines for continuous delivery of microservices and SW packages
  • Work with internal/external organizations and partners.

You will bring...

  • Education: BS or MS in Computer Science or an acceptable equivalent
  • Experience with UNIX/Linux operating systems
  • 5 to 7 years of experience in Quality Assurance, Test Automation, or a similar development role
  • Strong understanding of test methodology, reporting, and automation
  • Understanding of and experience in software performance testing and analysis of highly integrated enterprise applications
  • Knowledge of Kubernetes deployments on cloud infrastructure (AWS EKS cluster)
  • Hands-on experience with Helm charts, ConfigMap configuration changes, the kubectl utility, and Helm install/upgrade
  • Working knowledge of load-testing tools like JMeter, Postman, SoapUI, and LoadRunner
  • Hands-on knowledge of databases such as Oracle, EDB, and NoSQL DBs like Cassandra
  • Experience with CI/CD pipelines using Jenkins
  • Good troubleshooting skills analyzing thread dumps and memory heap dumps
  • Experience analyzing CPU/memory consumption with JProfiler or other tools
  • Strong scripting and programming skills: Python, Java, Ansible, shell
  • System Monitoring and Reporting Tools - Grafana, Prometheus, JIRA, X-Ray, Dynatrace
  • Good knowledge of wireless communication systems, lab troubleshooting and test automation technique/tools
  • Good understanding of test automation framework with TestNG or JUnit
  • Knowledge of cloud-native core principles, DevOps, ADP, Docker, Kubernetes, WRCP, RH OCP
  • Experience writing shell scripts on demand to automate small or medium day-to-day tasks
  • Experience communicating directly with daily status updates, risk mitigation, and results closure
  • Strong programming, scripting, testing, and debugging skills
  • Ability to work with Architect and Development team to define the key metrics and use cases to test the product stability and scalability
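Performance testing ultimately means measuring latency and asserting it against a budget. A toy, stdlib-only sketch of that idea (the "service" and the 1-second budget are invented for illustration; real runs would use tools like JMeter or LoadRunner as listed above):

```python
import time

def average_latency(fn, runs=5):
    """Average wall-clock time of fn over several runs, in seconds."""
    start = time.perf_counter()
    for _ in range(runs):
        fn()
    return (time.perf_counter() - start) / runs

def fake_service():
    # Stand-in for a real service call under test (illustrative)
    sum(range(1000))

avg = average_latency(fake_service)
# Assert against an (invented) latency budget of 1 second
assert avg < 1.0, f"latency regression: {avg:.4f}s"
```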
Propellor.ai
at Propellor.ai
5 candid answers
1 video
Kajal Jain
Posted by Kajal Jain
Remote only
1 - 4 yrs
₹5L - ₹15L / yr
skill iconPython
SQL
Spark
Hadoop
Big Data
+2 more

Big Data Engineer/Data Engineer


What we are solving
Welcome to today’s business data world where:
• Unification of all customer data into one platform is a challenge

• Extraction is expensive
• Business users do not have the time/skill to write queries
• High dependency on tech team for written queries

These facts may look scary but there are solutions with real-time self-serve analytics:
• Fully automated data integration from any kind of data source into a universal schema
• Analytics database that streamlines data indexing, query and analysis into a single platform.
• Start generating value from Day 1 through deep dives, root cause analysis and micro segmentation

At Propellor.ai, this is what we do.
• We help our clients reduce effort and increase effectiveness quickly
• By clearly defining the scope of Projects
• Using Dependable, scalable, future proof technology solution like Big Data Solutions and Cloud Platforms
• Engaging with Data Scientists and Data Engineers to provide End to End Solutions leading to industrialisation of Data Science Model Development and Deployment

What we have achieved so far
Since we started in 2016,
• We have worked across 9 countries with 25+ global brands and 75+ projects
• We have 50+ clients, 100+ Data Sources and 20TB+ data processed daily

Work culture at Propellor.ai
We are a small, remote team that believes in
• Working with a few, but only the highest-quality, team members who want to become the very best in their fields.
• With each member's belief and faith in what we are solving, we collectively see the big picture.
• No hierarchy: we believe in reaching the decision maker without hesitation so that our actions can have fruitful and aligned outcomes.
• Each one is the CEO of their domain. So the criterion behind every choice is that our employees and clients can succeed together!

To read more about us click here:
https://bit.ly/3idXzs0

About the role
We are building an exceptional team of data engineers who are passionate developers and want to push the boundaries of solving complex business problems using the latest tech stack. As a Big Data Engineer, you will work with various technology and business teams to deliver our data engineering offerings to our clients across the globe.

Role Description

• The role would involve big data pre-processing & reporting workflows including collecting, parsing, managing, analysing, and visualizing large sets of data to turn information into business insights
• Develop the software and systems needed for end-to-end execution on large projects
• Work across all phases of SDLC, and use Software Engineering principles to build scalable solutions
• Build the knowledge base required to deliver increasingly complex technology projects
• The role would also involve testing various machine learning models on Big Data and deploying learned models for ongoing scoring and prediction.

Education & Experience
• B.Tech. or equivalent degree in CS/CE/IT/ECE/EEE; 3+ years of experience designing technological solutions to complex data problems, developing and testing modular, reusable, efficient, and scalable code to implement those solutions.

Must have (hands-on) experience
• Python and SQL expertise
• Distributed computing frameworks (Hadoop Ecosystem & Spark components)
• Proficiency in a cloud computing platform (AWS/Azure/GCP); GCP experience (BigQuery/Bigtable, Pub/Sub, Dataflow, App Engine) preferred

• Linux environment, SQL, and shell scripting

Desirable
• Statistical or machine learning DSL like R
• Distributed and low latency (streaming) application architecture
• Row-store distributed DBMSs such as Cassandra, CouchDB, MongoDB, etc.
• Familiarity with API design

Hiring Process:
1. One phone screening round to gauge your interest and knowledge of fundamentals
2. An assignment to test your skills and ability to come up with solutions in a certain time
3. Interview 1 with our Data Engineer lead
4. Final Interview with our Data Engineer Lead and the Business Teams

Immediate joiners preferred.

Hyderabad, Bengaluru (Bangalore), Delhi
2 - 5 yrs
₹3L - ₹8L / yr
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
skill iconPython
Agile/Scrum
Job Description

Artificial Intelligence (AI) Researchers and Developers

The successful candidate will be part of highly productive teams working on implementing core AI algorithms, cryptography libraries, AI-enabled products, and intelligent 3D interfaces. Candidates will work on cutting-edge products and technologies in highly challenging domains and will need the highest level of commitment and interest in learning new technologies and domain-specific subject matter very quickly. Successful completion of projects will require travel and working in remote locations with customers for extended periods.

Education Qualification: Bachelor's, Master's, or PhD degree in Computer Science, Mathematics, Electronics, or Information Systems from a reputed university, and/or equivalent knowledge and skills.

Location : Hyderabad, Bengaluru, Delhi, Client Location (as needed)

Skillset and Expertise
• Strong software development experience using Python
• Strong background in mathematical, numerical and scientific computing using Python.
• Knowledge in Artificial Intelligence/Machine learning
• Experience working with SCRUM software development methodology
• Strong experience with implementing Web services, Web clients and JSON protocol is required
• Experience with Python Meta programming
• Strong analytical and problem-solving skills
• Design, develop and debug enterprise grade software products and systems
• Software systems testing methodology, including writing and execution of test plans, debugging, and testing scripts and tools
• Excellent written and verbal communication skills; proficiency in English. Verbal communication in Hindi and other local Indian languages
• Ability to effectively communicate product design, functionality and status to management, customers and other stakeholders
• Highest level of integrity and work ethic

Frameworks
1. Scikit-learn
2. Tensorflow
3. Keras
4. OpenCV
5. Django
6. CUDA
7. Apache Kafka

Mathematics
1. Advanced Calculus
2. Numerical Analysis
3. Complex Function Theory
4. Probability

Concepts (One or more of the below)
1. OpenGL based 3D programming
2. Cryptography
3. Artificial Intelligence (AI) algorithms: a) statistical modelling, b) DNN, c) RNN, d) LSTM, e) GAN, f) CNN
JetSynthesys Pvt. Ltd.
at JetSynthesys Pvt. Ltd.
1 recruiter
Agency job
via Jobdost by Mamatha A
Remote, Pune
5 - 8 yrs
₹12L - ₹16L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
skill iconDjango
skill iconFlask
RESTful APIs
+1 more
A leading marketing software company provides mobile app developers with a powerful set of solutions to grow their mobile apps. Its technology platform enables developers to market, monetize, analyze, and publish their apps. The company's first-party content includes over 200 popular, engaging apps, and its technology brings that content to millions of users around the world. It is headquartered in Palo Alto, California, with several offices globally.
 
It is a Certified Great Place to Work, one of Inc.'s Best Workplaces, and a recipient of the 2019 Glassdoor Top CEO employees' choice award. The San Francisco Business Times and Silicon Valley Business Journal named it one of the Bay Area's Best Places to Work in 2019, 2020, and 2021, and awarded it the Workplace Wellness Award in 2019, which recognizes businesses that are leaders in improving worker well-being.
 
Location / time zone preferences: should have at least approx. 4 hours of overlap with the Eastern time zone
Language preferences: good English speaker (B1 - B2)
Project timeline: 3 months minimum, including onboarding/design familiarity + development + integration/testing
 
Project details
The Web Shop is a new way of buying IAP from games aside from the actual mobile-platform-based IAP. MZ first initiated building their own web shop; other potential customers are studios across the AL portfolio.
Technically, the Web Shop is a web UI (React) connected to a backend (Python) that is linked to the actual game backend, plus a CMS to edit available options, etc.
Current estimates for the engineering needs are 1 senior web UI developer, 1 senior Python developer, and 1 middle developer, on a 3-month proposal including onboarding/design familiarity + development + integration/testing.
 
About the role:
The project's purpose is to provide players with the ability to purchase digital content for the game through a web page linked to the player's personal account.
You will be working in a team of front-end web and backend developers as well as designers. The current timeline for the project is 3 months, with onboarding/design familiarity, development time, and integration/testing included. The team will be managed by an internal project manager and tech lead. On successful completion of the project, we expect to support it and work on its modules and some minor parts; this scope is to be discussed and is outside of the current estimate.
 

Responsibilities:

    • Help design and implement functional requirements
    • Build efficient back-end features in Python
    • Integrate front-end components into applications
    • Manage testing and bug fixes
    • Prepare technical documentation
    • Collaborate with UX/UI designers to implement design into the code
    • Code review
    • Implement software enhancements and suggest improvements

What we are looking for:

    • Solid experience as Python Developer
    • Experience with Python frameworks (e.g. Django, Flask, Bottle)
    • Familiarity with Amazon Web Services (AWS) and REST API
    • Understanding of databases and SQL
    • Attention to detail
    • Leadership skills
    • Self-starter, able to work independently

Bonus skills:

    • Cloud deployment services - Docker, Kubernetes/AWS/Azure/Openshift/GCS etc.
    • API deployment / WSGI frameworks - Flask/Django/Bottle/FastAPI etc.
    • Basic database operations with Python (CRUD)
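As an illustration of the basic-CRUD bullet, here is a minimal sqlite3 sketch; the table and values are invented, not taken from the project:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")

# Create
db.execute("INSERT INTO items (name) VALUES (?)", ("gem pack",))
# Read
name = db.execute("SELECT name FROM items WHERE id = 1").fetchone()[0]
# Update
db.execute("UPDATE items SET name = ? WHERE id = 1", ("mega gem pack",))
# Delete
db.execute("DELETE FROM items WHERE id = 1")

count = db.execute("SELECT COUNT(*) FROM items").fetchone()[0]
print(name, count)  # gem pack 0
```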
The company is an equal opportunity employer and considers qualified applicants without regard to race, gender, sexual orientation, gender identity or expression, genetic information, national origin, age, disability, medical condition, religion, marital status, veteran status, or any other basis protected by law.
Clairvoyant India Private Limited
Taruna Roy
Posted by Taruna Roy
Remote, Pune
3 - 8 yrs
₹4L - ₹15L / yr
Big Data
Hadoop
skill iconJava
Spark
Hibernate (Java)
+5 more
Job Title/Designation:
Mid / Senior Big Data Engineer
Job Description:
Role: Big Data Engineer
Number of open positions: 5
Location: Pune
At Clairvoyant, we're building a thriving big data practice to help enterprises enable and accelerate the adoption of big data and cloud services. In the big data space, we lead and serve as innovators, troubleshooters, and enablers. The big data practice at Clairvoyant focuses on solving our customers' business problems by delivering products designed with best-in-class engineering practices and a commitment to keeping the total cost of ownership to a minimum.
Must Have:
  • 4-10 years of experience in software development.
  • At least 2 years of relevant work experience on large scale Data applications.
  • Strong coding experience in Java is mandatory
  • Good aptitude, strong problem solving abilities, and analytical skills, ability to take ownership as appropriate
  • Should be able to do coding, debugging, performance tuning, and deploying apps to production.
  • Should have good working experience on
  • o Hadoop ecosystem (HDFS, Hive, Yarn, File formats like Avro/Parquet)
  • o Kafka
  • o J2EE Frameworks (Spring/Hibernate/REST)
  • o Spark Streaming or any other streaming technology.
  • Ability to work on the sprint stories to completion along with Unit test case coverage.
  • Experience working in Agile Methodology
  • Excellent communication and coordination skills
  • Knowledgeable (and preferred hands on) - UNIX environments, different continuous integration tools.
  • Must be able to integrate quickly into the team and work independently towards team goals
Role & Responsibilities:
  • Take the complete responsibility of the sprint stories' execution
  • Be accountable for the delivery of the tasks in the defined timelines with good quality.
  • Follow the processes for project execution and delivery.
  • Follow agile methodology
  • Work with the team lead closely and contribute to the smooth delivery of the project.
  • Understand/define the architecture and discuss the pros-cons of the same with the team
  • Involve in the brainstorming sessions and suggest improvements in the architecture/design.
  • Work with other team leads to get the architecture/design reviewed.
  • Work with the clients and counterparts (in the US) of the project.
  • Keep all the stakeholders updated about the project/task status/risks/issues if there are any.
Education: BE/B.Tech from reputed institute.
Experience: 4 to 9 years
Keywords: java, scala, spark, software development, hadoop, hive
Locations: Pune
GreedyGame
at GreedyGame
1 video
5 recruiters
Shreyoshi Ghosh
Posted by Shreyoshi Ghosh
Bengaluru (Bangalore)
1 - 2 yrs
₹4L - ₹12L / yr
MS-Excel
SQL
skill iconData Analytics
skill iconPython
skill iconR Language
+1 more

About Us:

GreedyGame is looking for a Business Analyst to join its clan. We are looking for an enthusiastic Business Analyst who likes to play with data. You'll be building insights from data, creating analytical dashboards, and monitoring KPI values. You will also coordinate with teams working on different layers of the infrastructure.

 

Job details:

 

Seniority Level: Associate

Industry: Marketing & Advertising

Employment Type: Full Time

Job Location: Bangalore

Experience: 1-2 years

 

WHAT ARE WE LOOKING FOR?

 

  • Excellent planning, organizational, and time management skills.
  • Exceptional analytical and conceptual thinking skills.
  • A previous experience of working closely with Operations and Product Teams.
  • Competency in Excel and SQL is a must.
  • Experience with a programming language like Python is required.
  • Knowledge of Marketing Tools is preferable.

 

 

WHAT WILL BE YOUR RESPONSIBILITIES?

 

  • Evaluating business processes, anticipating requirements, uncovering areas for improvement, developing and implementing solutions.
  • Should be able to generate meaningful insights to help the marketing team and product team in enhancing the user experience for Mobile and Web Apps.
  • Leading ongoing reviews of business processes and developing optimization strategies.
  • Performing requirements analysis from a user and business point of view
  • Combining data from multiple sources like SQL tables, Google Analytics, in-house analytics signals, etc., and deriving relevant insights
  • Deciding the success metrics and KPIs for different Products and features and making sure they are achieved.
  • Acting as a quality assurance liaison prior to the release of new data analyses or applications.
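Combining data from sources like SQL tables usually reduces to joins plus aggregation; here is a stdlib sketch with invented tables (amounts kept as integer cents to stay exact):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE installs (user TEXT, campaign TEXT)")
db.execute("CREATE TABLE revenue (user TEXT, cents INTEGER)")
db.executemany("INSERT INTO installs VALUES (?, ?)",
               [("u1", "search"), ("u2", "social")])
db.executemany("INSERT INTO revenue VALUES (?, ?)",
               [("u1", 999), ("u2", 499), ("u1", 299)])

# Revenue per acquisition campaign: join the two sources, then aggregate
rows = db.execute(
    "SELECT i.campaign, SUM(r.cents) "
    "FROM installs i JOIN revenue r ON r.user = i.user "
    "GROUP BY i.campaign ORDER BY i.campaign"
).fetchall()
print(rows)  # [('search', 1298), ('social', 499)]
```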

 

Skills and Abilities:

  • Python
  • SQL
  • Business Analytics
  • BigQuery

 

WHAT'S IN IT FOR YOU?

  • An opportunity to be a part of a fast scaling start-up in the AdTech space that offers unmatched services and products.
  • To work with a team of young enthusiasts who are always upbeat and self-driven to achieve bigger milestones in shorter time spans.
  • A workspace that is wide open as per the open door policy at the company, located in the most happening center of Bangalore.
  • A well-fed stomach makes the mind work better and therefore we provide - free lunch with a wide variety on all days of the week, a stocked-up pantry to satiate your want for munchies, a Foosball table to burst stress and above all a great working environment.
  • We believe that we grow as you grow. Once you are a part of our team, your growth also becomes essential to us, and to make sure that happens, timely formal and informal feedback is given.
Global content marketplace
Agency job
via Qrata by Mrunal Kokate
Mumbai
4 - 8 yrs
₹20L - ₹30L / yr
skill iconMachine Learning (ML)
Natural Language Processing (NLP)
skill iconPython

We are building a global content marketplace that brings companies and content creators together to scale up content creation processes across 50+ content verticals and 150+ industries. Over the past 2.5 years, we've worked with companies like India Today, Amazon India, Adobe, Swiggy, Dunzo, Businessworld, Paisabazaar, IndiGo Airlines, Apollo Hospitals, Infoedge, Times Group, Digit, BookMyShow, UpGrad, Yulu, YourStory, and 350+ other brands.
Our mission is to become the world’s largest content creation and distribution platform for all kinds of content creators and brands.

 

Our Team

 

We are a 25+ member company and are scaling up rapidly in both team size and ambition.

If we were to define the kind of people and the culture we have, it would be -

a) Individuals with an Extreme Sense of Passion About Work

b) Individuals with Strong Customer and Creator Obsession

c) Individuals with Extraordinary Hustle, Perseverance & Ambition

We are on the lookout for individuals who are always open to going the extra mile and thrive in a fast-paced environment. We are strong believers in building a great, enduring company that can outlast its builders and create a massive impact on the lives of our employees, creators, and customers alike.

 

Our Investors

 

We are fortunate to be backed by some of the industry’s most prolific angel investors - Kunal Bahl and Rohit Bansal (Snapdeal founders), YourStory Media. (Shradha Sharma); Dr. Saurabh Srivastava, Co-founder of IAN and NASSCOM; Slideshare co-founder Amit Ranjan; Indifi's Co-founder and CEO Alok Mittal; Sidharth Rao, Chairman of Dentsu Webchutney; Ritesh Malik, Co-founder and CEO of Innov8; Sanjay Tripathy, former CMO, HDFC Life, and CEO of Agilio Labs; Manan Maheshwari, Co-founder of WYSH; and Hemanshu Jain, Co-founder of Diabeto.
Backed by Lightspeed Venture Partners



Job Responsibilities:
● Design, develop, test, deploy, maintain and improve ML models
● Implement novel learning algorithms and recommendation engines
● Apply Data Science concepts to solve routine problems of target users
● Translates business analysis needs into well-defined machine learning problems, and
selecting appropriate models and algorithms
● Create an architecture, implement, maintain and monitor various data source pipelines
that can be used across various different types of data sources
● Monitor performance of the architecture and conduct optimization
● Produce clean, efficient code based on specifications
● Verify and deploy programs and systems
● Troubleshoot, debug and upgrade existing applications
● Guide junior engineers for productive contribution to the development
The ideal candidate must have -

ML and NLP Engineer
● 4 or more years of experience in ML Engineering
● Proven experience in NLP
● Familiarity with generative language models such as GPT-3
● Ability to write robust code in Python
● Familiarity with ML frameworks and libraries
● Hands-on experience with AWS services like SageMaker and Personalize
● Exposure to state of the art techniques in ML and NLP
● Understanding of data structures, data modeling, and software architecture
● Outstanding analytical and problem-solving skills
● Team player with the ability to work cooperatively with other engineers.
● Ability to make quick decisions in high-pressure environments with limited information.
Sagacito
at Sagacito
2 recruiters
Neha Verma
Posted by Neha Verma
NCR (Delhi | Gurgaon | Noida)
3 - 9 yrs
₹6L - ₹18L / yr
skill iconMachine Learning (ML)
Natural Language Processing (NLP)
skill iconPython
skill iconData Science
Location: Gurgaon

Role:
• The person will be part of the data science team, working closely with the business analysts and the technology team to deliver the data science portion of the project and product.
• Data science contribution to a project can range between 30% and 80%.
• Day-to-day activities will include data exploration to solve specific problems, researching methods to apply as solutions, setting up ML processes in the context of a specific engagement/requirement, contributing to building a DS platform, coding the solution, interacting with clients on explanations, integrating the DS solution with the technology solution, and data cleaning and structuring.
• The nature of the work will depend on the stage of a specific engagement, available engagements, and individual skill.

At least 2-6 years of experience in:
• Machine learning (including deep learning methods): algorithm design, analysis, development, and performance improvement
o Strong understanding of statistical and predictive modeling concepts, machine-learning approaches, clustering, classification, regression techniques, and recommendation (collaborative filtering) algorithms
o Time series analysis
o Optimization techniques and work experience with solvers for MILP and global optimization
• Data science
o Good experience in exploratory data analysis and feature design & development
o Experience applying and evaluating ML algorithms in practical predictive modeling scenarios in various verticals, including (but not limited to) FMCG, media, e-commerce, and hospitality
• Proficiency in programming in Python (must have) and PySpark (good to have); parallel ML algorithm design, development, and usage for maximal performance on multi-core, distributed, and/or GPU architectures
• Ability to write production-ready code with reusable components and integration into the data science platform
• Strong inclination to write structured code as per prevailing coding standards and best practices
• Ability to design a data science architecture for repeatability of solutions
• Preparedness to manage the whole cycle from data preparation to algorithm design to client presentation at an individual level
• Comfort working on AWS, including managing data science AWS servers
• Team player with good communication and interpersonal skills
• Good experience in natural language processing and its applications (good to have)