Lead Computer Vision Engineer
Agency job
via Qrata
5 - 10 yrs
₹25L - ₹70L / yr
Gurugram, Delhi, Noida, Ghaziabad, Faridabad
Skills
Computer Vision
OpenCV
Python
TensorFlow
PyTorch
Job Title: Lead Computer Vision Engineer
Location: Gurgaon

About the company:
The company is changing the way cataloging is done across the globe. Our vision is to empower the smallest of sellers, situated in the farthest of corners, to create superior product images and videos without the need for any external professional help. Imagine 30M+ merchants shooting product images or videos on their smartphones, then choosing filters for Amazon, ASOS, Airbnb, DoorDash, etc. to instantly compose high-quality, "tuned-in" product visuals. The company has built the world's leading image-editing AI software to capture and process beautiful product images for online selling. We are also fortunate and proud to be backed by the biggest names in the investment community, including the likes of Accel Partners, AngelList, and prominent founders and internet company operators, who believe there is a more intelligent and efficient way of doing digital production than how the world operates today.

Job Description:
- We are looking for a seasoned Computer Vision Engineer with AI/ML/CV and Deep Learning skills to play a senior leadership role in our Product & Technology Research Team.
- You will lead a team of CV researchers building models that automatically transform millions of raw e-commerce, automobile, food, and real-estate images into processed final images.
- You will be responsible for researching the state of the art in computer vision, designing the solution architecture for our offerings, and leading the Computer Vision teams to build the core algorithmic models and deploy them on cloud infrastructure.
- You will work with the Data team to ensure your data pipelines are well set up and models are constantly trained and updated.
- You will work alongside the Product team to ensure that AI capabilities are built as democratized tools that internal as well as external stakeholders can innovate on top of, making our customers successful.
- You will work closely with the Product & Engineering teams to convert the models into beautiful products that will be used by thousands of businesses every day to transform their images and videos.

Job Requirements:
- Minimum 3+ years of work experience in Computer Vision, with 5-10 years of work experience overall
- BS/MS/PhD degree in Computer Science, Engineering, or a related subject from an Ivy League institute
- Exposure to deep learning techniques and TensorFlow/PyTorch
- Prior expertise in building image-processing applications using GANs, CNNs, and diffusion models
- Expertise with image-processing Python libraries like OpenCV (a brief illustrative sketch follows this list)
- Good hands-on experience with Python and the Flask or Django framework
- Authored publications at peer-reviewed AI conferences (e.g. NeurIPS, CVPR, ICML, ICLR, ICCV, ACL)
- Prior experience managing teams and building large-scale AI/CV projects is a big plus
- Great interpersonal and communication skills
- Critical thinking and problem-solving skills
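A minimal, hypothetical sketch of the kind of product-image processing this role works on, using the OpenCV and Python skills listed above. The file names, denoising parameters, and 1024-pixel white canvas are illustrative assumptions, not details from the posting.

# Hypothetical product-image touch-up: lightly denoise a photo and pad it
# onto a square white canvas of the kind marketplaces typically expect.
import cv2
import numpy as np

img = cv2.imread("input.jpg")  # assumed input file
if img is None:
    raise FileNotFoundError("input.jpg not found")

denoised = cv2.fastNlMeansDenoisingColored(img, None, 5, 5, 7, 21)

size = 1024  # assumed target canvas size
h, w = denoised.shape[:2]
scale = size / max(h, w)
resized = cv2.resize(denoised, (int(w * scale), int(h * scale)))

canvas = np.full((size, size, 3), 255, dtype=np.uint8)  # white background
y0 = (size - resized.shape[0]) // 2
x0 = (size - resized.shape[1]) // 2
canvas[y0:y0 + resized.shape[0], x0:x0 + resized.shape[1]] = resized

cv2.imwrite("output.jpg", canvas)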
Similar jobs

Series B funded product startup
Agency job
via Qrata by Blessy Fernandes
Delhi
2 - 5 yrs
₹8L - ₹14L / yr
Data Science
Machine Learning (ML)
Python
Java

Job Title - Data Scientist

 

Job Duties

  1. Data Scientist responsibilities include planning projects and building analytics models.
  2. You should have a strong problem-solving ability and a knack for statistical analysis.
  3. If you're also able to align our data products with our business goals, we'd like to meet you. Your ultimate goal will be to help improve our products and business decisions by making the most out of our data.

 

Responsibilities

Own end-to-end business problems and metrics, build and implement ML solutions using cutting-edge technology.

Create scalable solutions to business problems using statistical techniques, machine learning, and NLP.

Design, experiment with, and evaluate highly innovative models for predictive learning.

Work closely with software engineering teams to drive real-time model experiments, implementations, and new feature creation.

Establish scalable, efficient, and automated processes for large-scale data analysis, model development, deployment, experimentation, and evaluation.

Research and implement novel machine learning and statistical approaches.

 

Requirements

 

2-5 years of experience in data science.

In-depth understanding of modern machine learning techniques and their mathematical underpinnings.

Demonstrated ability to build PoCs for complex, ambiguous problems and scale them up.

Strong programming skills (Python, Java)

High proficiency in at least one of the following broad areas: machine learning, statistical modelling/inference, information retrieval, data mining, NLP

Experience with SQL and NoSQL databases

Strong organizational and leadership skills

Excellent communication skills

CodeCraft Technologies Private Limited
Posted by Chandana B
Bengaluru (Bangalore), Mangalore
6 - 15 yrs
Best in industry
Data Science
Machine Learning (ML)
Python
Artificial Intelligence (AI)

CodeCraft Technologies is an award-winning creative engineering company where highly skilled designers and engineers work closely together to bring user-focused solutions to life.


Proven design & development methodologies are leveraged, and the latest technologies are explored, to deliver best-in-class mobile and web solutions. Our success is built on a team of talented and motivated individuals who drive excellence in everything they do. We are seeking a highly skilled and experienced Lead Data Scientist to join our growing team.


Responsibilities:

● Work with stakeholders across the organization to identify opportunities for leveraging company data to drive business solutions.

● Develop custom data models and algorithms to apply to data sets.

● Use predictive modeling to improve and optimize business processes and solutions.

● Research and development of AI algorithms and their applicability in business-related problems to build intelligent systems.

● Build a Solid Data Science Team: Provide strategic direction for the data science team. Lead, mentor, and inspire a team of data scientists, fostering a culture of collaboration and continuous learning.

● Explore the latest technologies in the Data science domain and develop POCs.

● Establish a Technology Partnership with the leading technology providers in the AI/ML space.

● MLOps – Deploy ML solutions to the cloud.

● Collaborate with the content team to produce tech blogs, case studies, etc.


Required Skill Set:

● Strong foundational knowledge of data science concepts, machine learning algorithms, and programming skills in Python (and/or R).

● Expertise in Generative AI (GenAI), Large Language Models (LLM), Natural Language Processing (NLP), image processing and/or video analytics

● Proven track record of supporting global clients or internal stakeholders in data science projects.

● Experience in data analytics, descriptive analytics and predictive analytics

● Experience using AI/ML tools available from cloud service providers like AWS/AZURE/GCP including TensorFlow, SageMaker, and Azure ML

● Experience in deploying solutions to the cloud [AWS/Azure/GCP]

● Experience with Data Visualization tools like PowerBI, Tableau

● Proficient in SQL and other database technologies.

● Good understanding of the latest research and technologies in AI.

● Experience working across multiple geographic borders and time zones

● Outstanding communication and presentation skills


Education:

● Graduation/Post-graduation in Computers/Engineering/Statistics from a reputed institute

Accrete
Agency job
Mumbai
5 - 10 yrs
₹50L - ₹70L / yr
Python
Docker
Machine Learning (ML)
Kubernetes

Responsibilities:

  • Collaborating with data scientists and machine learning engineers to understand their requirements and design scalable, reliable, and efficient machine learning platform solutions.
  • Building and maintaining the applications and infrastructure to support end-to-end machine learning workflows, including inference and continual training.
  • Developing systems for the definition, deployment, and operation of the different phases of the machine learning and data life cycles.
  • Working within Kubernetes to orchestrate and manage containers, ensuring high availability and fault tolerance of applications.
  • Documenting the platform's best practices, guidelines, and standard operating procedures and contributing to knowledge sharing within the team.


Requirements:

  • 3+ years of hands-on experience in developing and managing machine learning or data platforms
  • Proficiency in programming languages commonly used in machine learning and data applications such as Python, Rust, Bash, Go
  • Experience with containerization technologies, such as Docker, and container orchestration platforms like Kubernetes.
  • Familiarity with CI/CD pipelines for automated model training and deployment. Basic understanding of DevOps principles and practices.
  • Knowledge of data storage solutions and database technologies commonly used in machine learning and data workflows.
Personal Care Product Manufacturing
Mumbai
3 - 8 yrs
₹12L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark

DATA ENGINEER


Overview

They started with a singular belief - what is beautiful cannot and should not be defined in marketing meetings. It's defined by the regular people like us, our sisters, our next-door neighbours, and the friends we make on the playground and in lecture halls. That's why we stand for people-proving everything we do. From the inception of a product idea to testing the final formulations before launch, our consumers are a part of each and every process. They guide and inspire us by sharing their stories with us. They tell us not only about the product they need and the skincare issues they face but also the tales of their struggles, dreams and triumphs. Skincare goes deeper than skin. It's a form of self-care for many. Wherever someone is on this journey, we want to cheer them on through the products we make, the content we create and the conversations we have. What we wish to build is more than a brand. We want to build a community that grows and glows together - cheering each other on, sharing knowledge, and ensuring people always have access to skincare that really works.

 

Job Description:

We are seeking a skilled and motivated Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, developing, and maintaining the data infrastructure and systems that enable efficient data collection, storage, processing, and analysis. You will collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to implement data pipelines and ensure the availability, reliability, and scalability of our data platform.


Responsibilities:

Design and implement scalable and robust data pipelines to collect, process, and store data from various sources.

Develop and maintain data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation (a minimal Spark-based sketch follows this list).

Optimize and tune the performance of data systems to ensure efficient data processing and analysis.

Collaborate with data scientists and analysts to understand data requirements and implement solutions for data modeling and analysis.

Identify and resolve data quality issues, ensuring data accuracy, consistency, and completeness.

Implement and maintain data governance and security measures to protect sensitive data.

Monitor and troubleshoot data infrastructure, perform root cause analysis, and implement necessary fixes.

Stay up-to-date with emerging technologies and industry trends in data engineering and recommend their adoption when appropriate.
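The responsibilities above call for Spark-style ETL pipelines; below is a minimal, hypothetical PySpark sketch of one such step. The S3 paths, column names, and partitioning scheme are illustrative assumptions, not details from this posting.

# Hypothetical ETL step: read raw CSV orders, clean them, and write a
# date-partitioned Parquet table. Paths and columns are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

cleaned = (
    raw.dropDuplicates(["order_id"])                        # basic de-duplication
       .withColumn("order_ts", F.to_timestamp("order_ts"))  # normalize types
       .filter(F.col("amount").isNotNull())                 # drop incomplete rows
)

(cleaned
    .withColumn("order_date", F.to_date("order_ts"))
    .write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/orders/"))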


Qualifications:

Bachelor’s or higher degree in Computer Science, Information Systems, or a related field.

Proven experience as a Data Engineer or similar role, working with large-scale data processing and storage systems.

Strong programming skills in languages such as Python, Java, or Scala.

Experience with big data technologies and frameworks like Hadoop, Spark, or Kafka.

Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle).

Familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery).

Solid understanding of data modeling, data warehousing, and ETL principles.

Knowledge of data integration techniques and tools (e.g., Apache Nifi, Talend, or Informatica).

Strong problem-solving and analytical skills, with the ability to handle complex data challenges.

Excellent communication and collaboration skills to work effectively in a team environment.


Preferred Qualifications:

Advanced knowledge of distributed computing and parallel processing.

Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink).

Familiarity with machine learning concepts and frameworks (e.g., TensorFlow, PyTorch).

Knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).

Experience with data visualization and reporting tools (e.g., Tableau, Power BI).

Certification in relevant technologies or data engineering disciplines.



Matellio India Private Limited
Posted by Harshit Sharma
Remote only
8 - 15 yrs
₹10L - ₹27L / yr
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Computer Vision
Deep Learning

Responsibilities include: 

  • Convert the machine learning models into application program interfaces (APIs) so that other applications can use them (a minimal sketch follows this list)
  • Build AI models from scratch and help different parts of the organization (such as product managers and stakeholders) understand what results they gain from the model
  • Build data ingestion and data transformation infrastructure
  • Automate infrastructure that the data science team uses
  • Perform statistical analysis and tune the results so that the organization can make better-informed decisions
  • Set up and manage AI development and product infrastructure
  • Be a good team player, as coordinating with others is a must
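A minimal, hypothetical sketch of wrapping a trained model behind an HTTP API, as the first responsibility above describes. Flask is assumed as the web framework, and the model file, feature names, and port are illustrative assumptions rather than details from the posting.

# Hypothetical model-serving endpoint: load a pickled model and expose a
# /predict route. "model.pkl" and the two-feature payload are assumptions.
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

with open("model.pkl", "rb") as f:   # assumed pre-trained model artifact
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json(force=True)
    features = [[payload["feature_a"], payload["feature_b"]]]  # assumed schema
    prediction = model.predict(features)[0]
    return jsonify({"prediction": float(prediction)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)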
xpressbees
Posted by Alfiya Khan
Pune, Bengaluru (Bangalore)
6 - 8 yrs
₹15L - ₹25L / yr
Big Data
Data Warehouse (DWH)
Data modeling
Apache Spark
Data integration
Company Profile
XpressBees – a logistics company started in 2015 – is amongst the fastest growing companies in its sector. While we started off rather humbly in the space of ecommerce B2C logistics, the last 5 years have seen us steadily progress towards expanding our presence. Our vision to evolve into a strong full-service logistics organization reflects itself in our new lines of business like 3PL, B2B Xpress, and cross-border operations. Our strong domain expertise and constant focus on meaningful innovation have helped us rapidly evolve as the most trusted logistics partner of India. We have progressively carved our way towards best-in-class technology platforms, an extensive network reach, and a seamless last-mile management system. While on this aggressive growth path, we seek to become the one-stop shop for end-to-end logistics solutions. Our big focus areas for the very near future include strengthening our presence as the service provider of choice and leveraging the power of technology to improve efficiencies for our clients.

Job Profile
As a Lead Data Engineer in the Data Platform Team at XpressBees, you will build the data platform and infrastructure to support high-quality and agile decision-making in our supply chain and logistics workflows. You will define the way we collect and operationalize data (structured/unstructured), and build production pipelines for our machine learning models and our (RT, NRT, batch) reporting and dashboarding requirements. You will use your experience with modern cloud and data frameworks to build products (with storage and serving systems) that drive optimisation and resilience in the supply chain via data visibility, intelligent decision-making, insights, anomaly detection, and prediction.

What You Will Do
• Design and develop the data platform and data pipelines for reporting, dashboarding, and machine learning models. These pipelines would productionize machine learning models and integrate with agent review tools.
• Meet data completeness, correctness, and freshness requirements.
• Evaluate and identify the data store and data streaming technology choices.
• Lead the design of the logical model and implement the physical model to support business needs. Come up with logical and physical database designs across platforms (MPP, MR, Hive/Pig) that are optimal for different use cases (structured/semi-structured). Envision and implement the optimal data modelling, physical design, and performance optimization techniques/approaches required for the problem.
• Support your colleagues by reviewing code and designs.
• Diagnose and solve issues in our existing data pipelines, and envision and build their successors.

Qualifications & Experience relevant for the role

• A bachelor's degree in Computer Science or a related field, with 6 to 9 years of technology experience.
• Knowledge of relational and NoSQL data stores, stream processing, and micro-batching to make technology and design choices.
• Strong experience in System Integration, Application Development, ETL, and Data-Platform projects. Talented across technologies used in the enterprise space.
• Software development experience using:
• Expertise in relational and dimensional modelling
• Exposure across the full SDLC process
• Experience in cloud architecture (AWS)
• Proven track record of keeping existing technical skills current and developing new ones, so that you can make strong contributions to deep architecture discussions around systems and applications in the cloud (AWS).

• Characteristics of a forward thinker and self-starter who flourishes with new challenges and adapts quickly to learning new knowledge.
• Ability to work with cross-functional teams of consulting professionals across multiple projects.
• Knack for helping an organization understand application architectures and integration approaches, to architect advanced cloud-based solutions, and to help launch the build-out of those systems.
• Passion for educating, training, designing, and building end-to-end systems.
B2B SaaS platform For BFSI
Agency job
via Unnati by Samta Arora
Mumbai
1 - 5 yrs
₹10L - ₹11L / yr
Amazon Web Services (AWS)
SQL
NoSQL Databases
Python

Our client focuses on providing solutions in terms of data, analytics, decisioning and automation. They focus on providing solutions to the lending lifecycle of financial institutions and their products are designed to focus on systemic fraud prevention, risk management, compliance etc.

 

Our client is a one stop solution provider, catering to the authentication, verification and diligence needs of various industries including but not limited to, banking, insurance, payments etc.

 

Headquartered in Mumbai, our client was founded in 2015 by a team of three veteran entrepreneurs, two of whom are chartered accountants and one is a graduate from IIT, Kharagpur. They have been funded by tier 1 investors and have raised $1.1M in funding.

 
As a Web Scraper, you will be responsible for applying your knowledge to fetch data from multiple online sources, cleanse it, and build APIs on top of it.

What you will do:

  • Developing a deep understanding of our vast data sources on the web and knowing exactly how, when, and which data to scrape, parse, and store
  • Working closely with Database Administrators to store data in SQL and NoSQL databases
  • Developing frameworks for automating and maintaining constant flow of data from multiple sources
  • Working independently with little supervision to research and test innovative solutions

 

Desired Candidate Profile

What you need to have:

  • Bachelor/ Master’s degree in Computer science/ Computer Engineering/ Information Technology
  • 1 - 5 years of relevant experience
  • Strong coding experience in Python (knowledge of Java, JavaScript is a plus)
  • Experience with SQL and NoSQL databases
  • Experience with multi-processing, multi-threading, and AWS/Azure
  • Strong knowledge of Python scraping frameworks (Requests, Beautiful Soup), Web Harvest, and others (a minimal sketch follows this list)
  • In-depth knowledge of algorithms and data structures, and previous experience with web crawling, is a must
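A minimal, hypothetical scraping sketch using Requests and Beautiful Soup, the libraries named above. The URL, CSS selectors, and output fields are illustrative assumptions only.

# Hypothetical scrape-and-parse step: fetch a page, extract product names
# and prices, and keep them as plain dicts ready to be stored or served
# via an API. The URL and page structure are assumptions.
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/products"          # placeholder source

response = requests.get(URL, timeout=10)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

items = []
for card in soup.select("div.product-card"):  # assumed page structure
    name = card.select_one("h2.name")
    price = card.select_one("span.price")
    if name and price:
        items.append({"name": name.get_text(strip=True),
                      "price": price.get_text(strip=True)})

print(items)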

 

Infogain
Agency job
via Technogen India Pvt Ltd by Rahul Batta
Bengaluru (Bangalore), Pune, Noida, NCR (Delhi | Gurgaon | Noida)
7 - 10 yrs
₹20L - ₹25L / yr
Data engineering
Python
SQL
Spark
PySpark
Sr. Data Engineer:

Core Skills – Data Engineering, Big Data, PySpark, Spark SQL and Python

Candidates with a prior Palantir Cloud Foundry or Clinical Trial Data Model background are preferred

Major accountabilities:

  • Responsible for Data Engineering, Foundry data pipeline creation, Foundry analysis & reporting, Slate application development, re-usable code development & management, and integrating internal or external systems with Foundry for high-quality data ingestion.
  • Has a good understanding of the Foundry Platform landscape and its capabilities.
  • Performs the data analysis required to troubleshoot data-related issues and assists in their resolution.
  • Defines company data assets (data models) and the PySpark/Spark SQL jobs that populate them.
  • Designs data integrations and the data quality framework.
  • Designs and implements integration with internal and external systems and the F1 AWS platform using the Foundry Data Connector or Magritte agent.
  • Collaborates with data scientists, data analysts, and technology teams to document and leverage their understanding of the Foundry integration with different data sources; actively participates in agile work practices.
  • Coordinates with the Quality Engineer to ensure that all quality controls, naming conventions, and best practices are followed.

Desired Candidate Profile :

  • Strong data engineering background
  • Experience with Clinical Data Model is preferred
  • Experience in
    • SQL Server, Postgres, Cassandra, Hadoop, and Spark for distributed data storage and parallel computing
    • Java and Groovy for our back-end applications and data integration tools
    • Python for data processing and analysis
    • Cloud infrastructure based on AWS EC2 and S3
  • 7+ years IT experience, 2+ years’ experience in Palantir Foundry Platform, 4+ years’ experience in Big Data platform
  • 5+ years of Python and Pyspark development experience
  • Strong troubleshooting and problem solving skills
  • BTech or master's degree in computer science or a related technical field
  • Experience designing, building, and maintaining big data pipelines systems
  • Hands-on experience on Palantir Foundry Platform and Foundry custom Apps development
  • Able to design and implement data integration between Palantir Foundry and external Apps based on Foundry data connector framework
  • Hands-on with programming languages, primarily Python, R, Java, and Unix shell scripts
  • Hands-on experience with the AWS / Azure cloud platform and stack
  • Strong in API based architecture and concept, able to do quick PoC using API integration and development
  • Knowledge of machine learning and AI
  • Skill and comfort working in a rapidly changing environment with dynamic objectives and iteration with users.

 Demonstrated ability to continuously learn, work independently, and make decisions with minimal supervision

Remote, Dubai
7 - 12 yrs
₹25L - ₹25L / yr
Data Science
Machine Learning (ML)
Python
Oracle
R Programming

High Level Scope of Work :

 

  • Work with the AI/Analytics team to prioritize identified machine learning use cases for development and rollout.
  • Meet and understand current retail/marketing requirements and how the AI/ML solution will address and automate the decision process.
  • Develop AI/ML programs using the Dataiku solution, Python, or open-source technologies, with a focus on delivering high-quality and accurate ML prediction models.
  • Gather additional and external data sources to support the AI/ML model as needed.
  • Support the ML model and fine-tune it to ensure consistently high accuracy.
  • Example use cases: customer segmentation (see the sketch at the end of this posting), product recommendation, price optimization, retail customer personalization offers, next-best location for business establishment, CCTV computer vision, NLP, and voice recognition solutions.

Required technology expertise :

  • Deep Knowledge & Understanding on MACHINE LEARNING ALGORITHMS (Supervised / Unsupervised Learning / Deep Learning Models)
  • Hands on EXP for at least 5+ years with PYTHON and R STATISTICS PROGRAMMING Languages
  • Strong Database Development knowledge using SQL and PL/SQL
  • Must have EXP using Commercial Data Science Solution particularly DATAIKU and (Altryx, SAS, Azure ML, Google ML, Oracle ML is a plus)
  • Strong hands on EXP with BIG DATA Solution Architecture and Optimization for AI/ML Workload.
  • Data Analytics and BI Tools Hand on EXP particularly (Oracle OBIEE and Power BI)
  • Have implemented and Developed at least 3 successful AI/ML Projects with tangible Business Outcomes In retail Focused Industry
  • Have at least 5+ Years EXP in Retail Industry and Customer Focus Business.
  • Ability to communicate with Business Owner & stakeholders to understand their current issues and provide MACHINE LEARNING Solution accordingly.

Qualifications

  • Bachelor's or Master's degree in Data Science, Artificial Intelligence, or Computer Science
  • Certification as a Data Scientist or Machine Learning expert
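A minimal, hypothetical sketch of the customer-segmentation use case mentioned above, using Python and scikit-learn. The toy purchase-behaviour features and the choice of two clusters are assumptions for illustration; the posting itself names Dataiku and other commercial tools rather than any specific library.

# Hypothetical customer segmentation with k-means. The recency/frequency/
# monetary features and k=2 are assumptions, not requirements of the role.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# toy purchase-behaviour features: [recency_days, frequency, monetary_value]
customers = np.array([
    [5, 40, 1200.0],
    [90, 2, 80.0],
    [30, 12, 450.0],
    [200, 1, 25.0],
])

scaled = StandardScaler().fit_transform(customers)  # put features on a common scale
segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)
print(segments)  # one cluster label per customer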
Artivatic
Posted by Layak Singh
Bengaluru (Bangalore)
3 - 7 yrs
₹6L - ₹14L / yr
Python
Machine Learning (ML)
Artificial Intelligence (AI)
Deep Learning
Natural Language Processing (NLP)
About Artivatic:
Artivatic is a technology startup that uses AI/ML/deep learning to build intelligent products and solutions for finance, healthcare, and insurance businesses. It is based out of Bangalore with a 25+ member team focused on technology. Artivatic is building cutting-edge solutions to enable 750 million+ people to get insurance, financial access, and health benefits using alternative data sources, increasing their productivity, efficiency, automation power, and profitability, and hence improving the way they do business more intelligently and seamlessly. Artivatic offers lending underwriting, credit/insurance underwriting, fraud prediction, personalization, recommendation, risk profiling, consumer profiling intelligence, KYC automation & compliance, healthcare, automated decisions, monitoring, claims processing, sentiment/psychology behaviour, auto insurance claims, travel insurance, disease prediction for insurance, and more.

Job Description:
We at Artivatic are seeking a passionate, talented, and research-focused Natural Language Processing engineer with a strong machine learning and mathematics background to help build industry-leading technology. The ideal candidate will have research/implementation experience in modeling and developing NLP tools, and experience working with machine learning/deep learning algorithms.

Roles and Responsibilities:
- Developing novel algorithms and modeling techniques to advance the state of the art in Natural Language Processing.
- Developing NLP-based tools and solutions end to end.
- Working closely with R&D and Machine Learning engineers, implementing algorithms that power user- and developer-facing products.
- Being responsible for measuring and optimizing the quality of your algorithms.

Requirements:
- Hands-on experience building NLP models using different NLP libraries and toolkits like NLTK, Stanford NLP, etc. (a minimal sketch follows this posting)
- Good understanding of rule-based, statistical, and probabilistic NLP techniques.
- Good knowledge of NLP approaches and concepts like topic modeling, text summarization, semantic modeling, named entity recognition, etc.
- Good understanding of machine learning and deep learning algorithms.
- Good knowledge of data structures and algorithms.
- Strong programming skills in Python/Java/Scala/C/C++.
- Strong problem-solving and logical skills.
- A go-getter attitude with the willingness to learn new technologies.
- Well versed in software design paradigms and good development practices.

Basic Qualifications:
- Bachelor's or Master's degree in Computer Science, Mathematics, or a related field, with a specialization in Natural Language Processing, Machine Learning, or Deep Learning.
- A publication record in conferences/journals is a plus.
- 2+ years of working/research experience building NLP-based solutions is preferred.

If you feel that you are the ideal candidate and can bring a lot of value to our culture and the company's vision, then please do apply. If your profile matches our requirements, you will hear from one of our team members. We are looking for someone who can be part of our team, not just an employee.

Job Perks: Insurance, travel compensation, and others.
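A minimal, hypothetical sketch of the kind of NLP tooling named in the requirements above, using NLTK. The sample sentence is an assumption, and the download calls fetch the standard NLTK data packages on first run (package names can differ slightly between NLTK versions).

# Hypothetical NLTK pipeline: tokenize a sentence, tag parts of speech,
# and run NLTK's shallow named-entity chunker.
import nltk

for pkg in ("punkt", "averaged_perceptron_tagger", "maxent_ne_chunker", "words"):
    nltk.download(pkg, quiet=True)  # standard NLTK data packages

text = "Artivatic builds AI products for insurance businesses in Bangalore."
tokens = nltk.word_tokenize(text)   # word tokens
tagged = nltk.pos_tag(tokens)       # part-of-speech tags
entities = nltk.ne_chunk(tagged)    # shallow named-entity tree
print(entities)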