
Product Solutions Engineer

Agency job
2 - 4 yrs
₹12L - ₹16L / yr
Bengaluru (Bangalore)
Skills
Python
Bash
MySQL
Elasticsearch
Amazon Web Services (AWS)

What are we looking for:

 

  1. Strong experience with MySQL, including writing advanced queries
  2. Strong experience with Bash and Python
  3. Familiarity with Elasticsearch, Redis, Java, Node.js, ClickHouse, and S3
  4. Exposure to cloud services such as AWS, Azure, or GCP
  5. 2+ years of experience in production support
  6. Strong experience with log management and performance monitoring tools such as ELK, Prometheus + Grafana, and the logging services of various cloud platforms
  7. Strong understanding of Linux distributions such as Ubuntu, CentOS, or Red Hat
  8. Interest in learning new languages/frameworks as needed
  9. Good written and oral communication skills
  10. A growth mindset, a passion for building things from the ground up, and, most importantly, being fun to work with

 

As a product solutions engineer, you will:

 

  1. Analyze recorded runtime issues, diagnose them, and make occasional code fixes of low to medium complexity
  2. Work with developers to find and correct more complex issues
  3. Address urgent issues quickly; work within and measure against customer SLAs
  4. Write shell and Python scripts to automate manual and repetitive activities (a rough sketch follows this list)
  5. Build anomaly detectors wherever applicable
  6. Relay customer feedback to the development and product teams
  7. Maintain an ongoing record of problem analysis and resolution in an on-call monitoring system
  8. Provide technical support needed during development
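
For illustration only (not part of the role description): a minimal sketch, in plain Python, of the kind of scripted anomaly detection mentioned above. The metric series, window size, and 3-sigma threshold are assumptions chosen for the example.

```python
# Illustrative sketch: flag points that deviate sharply from a rolling baseline,
# e.g., error counts per minute pulled from logs. All data here is synthetic.
from statistics import mean, stdev

def detect_anomalies(values, window=10, sigmas=3.0):
    """Return (index, value) pairs that sit more than `sigmas` standard
    deviations away from the mean of the preceding `window` points."""
    anomalies = []
    for i in range(window, len(values)):
        baseline = values[i - window:i]
        mu, sd = mean(baseline), stdev(baseline)
        if sd > 0 and abs(values[i] - mu) > sigmas * sd:
            anomalies.append((i, values[i]))
    return anomalies

if __name__ == "__main__":
    errors_per_minute = [2, 3, 1, 2, 2, 4, 3, 2, 1, 2] * 5 + [40]  # synthetic spike
    print(detect_anomalies(errors_per_minute))  # -> [(50, 40)]
```

In practice the input series would come from a log or monitoring stack such as ELK or Prometheus rather than a hard-coded list.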

 



Similar jobs

Fatakpay
Posted by Ajit Kumar
Mumbai
2 - 4 yrs
₹8L - ₹12L / yr
SQL
Python
Problem solving
Data Warehouse (DWH)
Excel VBA

1. Bridging the gap between IT and the business using data analytics to assess processes, determine requirements and deliver data-driven recommendations and reports to executives and stakeholders.


2. Ability to search, extract, transform and load data from various databases, cleanse and refine data until it is fit-for-purpose


3. Work within various time constraints to meet critical business needs, while measuring and identifying activities performed and ensuring service requirements are met


4. Prioritization of issues to meet deadlines while ensuring high-quality delivery


5. Ability to pull data and to perform ad hoc reporting and analysis as needed


6. Ability to adapt quickly to new and changing technical environments as well as strong analytical and problem-solving abilities


7. Strong interpersonal and presentation skills


SKILLS:


1. Advanced skills in designing reporting interfaces and interactive dashboards in Google Sheets and Excel


2. Experience working with senior decision-makers


3. Strong advanced SQL/MySQL and Python skills, with the ability to fetch data from the data warehouse as per stakeholders' requirements


4. Good knowledge of and experience in Excel VBA and advanced Excel


5. Good experience in building analytical Tableau dashboards as per stakeholders' reporting requirements


6. Strong communication/interpersonal skills


PERSONA:


1. Experience in working on ad hoc requirements


2. Ability to adapt to shifting priorities


3. Experience working in the fintech or e-commerce industry is preferable


4. Engineering background with 2+ years of experience as a Business Analyst for finance processes

Personal Care Product Manufacturing
Mumbai
3 - 8 yrs
₹12L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+9 more

DATA ENGINEER


Overview

They started with a singular belief - what is beautiful cannot and should not be defined in marketing meetings. It's defined by the regular people like us, our sisters, our next-door neighbours, and the friends we make on the playground and in lecture halls. That's why we stand for people-proving everything we do. From the inception of a product idea to testing the final formulations before launch, our consumers are a part of each and every process. They guide and inspire us by sharing their stories with us. They tell us not only about the product they need and the skincare issues they face but also the tales of their struggles, dreams and triumphs. Skincare goes deeper than skin. It's a form of self-care for many. Wherever someone is on this journey, we want to cheer them on through the products we make, the content we create and the conversations we have. What we wish to build is more than a brand. We want to build a community that grows and glows together - cheering each other on, sharing knowledge, and ensuring people always have access to skincare that really works.

 

Job Description:

We are seeking a skilled and motivated Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, developing, and maintaining the data infrastructure and systems that enable efficient data collection, storage, processing, and analysis. You will collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to implement data pipelines and ensure the availability, reliability, and scalability of our data platform.


Responsibilities:

Design and implement scalable and robust data pipelines to collect, process, and store data from various sources (a rough sketch follows these responsibilities).

Develop and maintain data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation.

Optimize and tune the performance of data systems to ensure efficient data processing and analysis.

Collaborate with data scientists and analysts to understand data requirements and implement solutions for data modeling and analysis.

Identify and resolve data quality issues, ensuring data accuracy, consistency, and completeness.

Implement and maintain data governance and security measures to protect sensitive data.

Monitor and troubleshoot data infrastructure, perform root cause analysis, and implement necessary fixes.

Stay up-to-date with emerging technologies and industry trends in data engineering and recommend their adoption when appropriate.
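
As a rough illustration of the pipeline work described above (a sketch, not a prescribed implementation), here is a minimal PySpark batch job that reads raw CSV events, cleans them, aggregates daily totals, and writes Parquet. The paths, column names, and schema are hypothetical placeholders.

```python
# Illustrative sketch of a batch ETL step in PySpark; source/sink paths and
# column names (order_id, amount, created_at) are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_order_rollup").getOrCreate()

raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("s3a://example-bucket/raw/orders/"))          # hypothetical source

clean = (raw
         .dropDuplicates(["order_id"])                    # assumed unique key
         .filter(F.col("amount").isNotNull())
         .withColumn("order_date", F.to_date("created_at")))

daily = (clean
         .groupBy("order_date")
         .agg(F.count("*").alias("orders"),
              F.sum("amount").alias("revenue")))

(daily.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3a://example-bucket/curated/daily_orders/"))  # hypothetical sink

spark.stop()
```

A production pipeline would typically add schema enforcement, data-quality checks, and orchestration on top of a step like this.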


Qualifications:

Bachelor’s or higher degree in Computer Science, Information Systems, or a related field.

Proven experience as a Data Engineer or similar role, working with large-scale data processing and storage systems.

Strong programming skills in languages such as Python, Java, or Scala.

Experience with big data technologies and frameworks like Hadoop, Spark, or Kafka.

Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle).

Familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery).

Solid understanding of data modeling, data warehousing, and ETL principles.

Knowledge of data integration techniques and tools (e.g., Apache Nifi, Talend, or Informatica).

Strong problem-solving and analytical skills, with the ability to handle complex data challenges.

Excellent communication and collaboration skills to work effectively in a team environment.


Preferred Qualifications:

Advanced knowledge of distributed computing and parallel processing.

Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink).

Familiarity with machine learning concepts and frameworks (e.g., TensorFlow, PyTorch).

Knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).

Experience with data visualization and reporting tools (e.g., Tableau, Power BI).

Certification in relevant technologies or data engineering disciplines.



Cubera Tech India Pvt Ltd
Bengaluru (Bangalore), Chennai
5 - 8 yrs
Best in industry
Data engineering
Big Data
Java
Python
Hibernate (Java)
+10 more

Data Engineer - Senior

Cubera is a data company revolutionizing big data analytics and Adtech through data share value principles wherein the users entrust their data to us. We refine the art of understanding, processing, extracting, and evaluating the data that is entrusted to us. We are a gateway for brands to increase their lead efficiency as the world moves towards web3.

What are you going to do?

Design & Develop high performance and scalable solutions that meet the needs of our customers.

Work closely with Product Management, Architects, and cross-functional teams.

Build and deploy large-scale systems in Java/Python.

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

Create data tools for analytics and data scientist team members that assist them in building and optimizing their algorithms.

Follow best practices that can be adopted in the Big Data stack.

Use your engineering experience and technical skills to drive the features and mentor the engineers.

What are we looking for (Competencies):

Bachelor’s degree in computer science, computer engineering, or related technical discipline.

Overall 5 to 8 years of programming experience in Java and Python, including object-oriented design.

Data handling frameworks: Should have a working knowledge of one or more data handling frameworks like Hive, Spark, Storm, Flink, Beam, Airflow, NiFi, etc.

Data Infrastructure: Should have experience in building, deploying, and maintaining applications on popular cloud infrastructure like AWS, GCP, etc.

Data Store: Must have expertise in one of the general-purpose NoSQL data stores like Elasticsearch, MongoDB, Redis, Redshift, etc.

Strong sense of ownership, focus on quality, responsiveness, efficiency, and innovation.

Ability to work with distributed teams in a collaborative and productive manner.

Benefits:

Competitive Salary Packages and benefits.

A collaborative, lively, and upbeat work environment with young professionals.

Job Category: Development

Job Type: Full Time

Job Location: Bangalore

 

Marktine
Posted by Vishal Sharma
Remote, Bengaluru (Bangalore)
3 - 6 yrs
₹10L - ₹25L / yr
Cloud
Google Cloud Platform (GCP)
BigQuery
Python
SQL
+2 more

Specific Responsibilities

  • Minimum of 2 years of experience in Google BigQuery and Google Cloud Platform.
  • Design and develop the ETL framework using BigQuery.
  • Expertise in BigQuery concepts like nested queries, clustering, partitioning, etc.
  • Working experience with clickstream databases and Google Analytics / Adobe Analytics.
  • Should be able to automate data loads from BigQuery using APIs or a scripting language (a rough sketch follows this list).
  • Good experience in advanced SQL concepts.
  • Good experience with Adobe Launch for web, mobile, and e-commerce tag implementation.
  • Identify complex fuzzy problems, break them down into smaller parts, and implement creative, data-driven solutions.
  • Responsible for defining, analyzing, and communicating key metrics and business trends to the management teams
  • Identify opportunities to improve conversion & user experience through data. Influence product & feature roadmaps.
  • Must have a passion for data quality and be constantly looking to improve the system. Drive data-driven decision making through the stakeholders & drive Change Management
  • Understand requirements to translate business problems & technical problems into analytics problems.
  • Effective storyboarding and presentation of the solution to the client and leadership.
  • Client engagement & management
  • Ability to interface effectively with multiple levels of management and functional disciplines.
  • Assist in developing/coaching individuals technically as well as on soft skills during the project and as part of Client Project’s training program.
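
For illustration only: a minimal sketch of automating a BigQuery data pull with the google-cloud-bigquery Python client, as referenced above. The project, dataset, table, and query are hypothetical placeholders, and credentials are assumed to be configured in the environment.

```python
# Illustrative sketch: run a scheduled BigQuery query and dump the result to CSV.
# Table and column names are hypothetical; auth is assumed to be set up via
# GOOGLE_APPLICATION_CREDENTIALS or gcloud default credentials.
import csv
from google.cloud import bigquery

def export_daily_users(output_path="daily_users.csv"):
    client = bigquery.Client()  # default project and credentials
    query = """
        SELECT event_date, COUNT(DISTINCT user_pseudo_id) AS users
        FROM `example-project.analytics.events`  -- hypothetical table
        GROUP BY event_date
        ORDER BY event_date
    """
    rows = client.query(query).result()  # waits for the job to finish
    with open(output_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["event_date", "users"])
        for row in rows:
            writer.writerow([row["event_date"], row["users"]])

if __name__ == "__main__":
    export_daily_users()
```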

 

Work Experience
  • 2 to 3 years of working experience in Google Big Query & Google Cloud Platform
  • Relevant experience in Consumer Tech/CPG/Retail industries
  • Bachelor’s in engineering, Computer Science, Math, Statistics or related discipline
  • Strong problem solving and web analytical skills. Acute attention to detail.
  • Experience in analyzing large, complex, multi-dimensional data sets.
  • Experience in one or more roles in an online eCommerce or online support environment.
 
Skills
  • Expertise in Google Big Query & Google Cloud Platform
  • Experience in Advanced SQL, Scripting language (Python/R)
  • Hands-on experience in BI tools (Tableau, Power BI)
  • Working Experience & understanding of Adobe Analytics or Google Analytics
  • Experience in creating and debugging website & app tracking (Omnibus, Dataslayer, GA debugger, etc.)
  • Excellent analytical thinking, analysis, and problem-solving skills.
  • Knowledge of other GCP services is a plus
 
Helical IT Solutions
Posted by Bhavani Thanga
Hyderabad
0 - 0 yrs
₹1.2L - ₹3.5L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+4 more

Job description

About Company
Helical Insight, an open-source Business Intelligence tool from Helical IT Solutions Pvt. Ltd., based out of Hyderabad, is looking for freshers with strong knowledge of SQL. Helical Insight has more than 50 clients from various sectors and has been awarded the most promising company in the Business Intelligence space. We are looking for a rockstar teammate to join our company.

Job Brief
We are looking for a Business Intelligence (BI) Developer to create and manage BI and analytics solutions that turn data into knowledge.
In this role, you should have a background in data and business analysis. You should be analytical and an excellent communicator. If you also have business acumen and a problem-solving aptitude, we'd like to meet you. Excellent knowledge of SQL queries is required. Basic knowledge of HTML, CSS, and JS is required.
You would work closely with customers from various domains to understand their data and business requirements and deliver the required analytics in the form of various reports, dashboards, etc. This is an excellent client-interfacing role with the opportunity to work across various sectors and geographies, as well as various kinds of databases, including NoSQL, RDBMS, graph DB, columnar DB, etc.

Skill set and qualifications required

Responsibilities
  • Attend client calls to gather requirements and show progress
  • Translate business needs into technical specifications
  • Design, build, and deploy BI solutions (e.g. reporting tools)
  • Maintain and support data analytics platforms
  • Conduct unit testing and troubleshooting
  • Evaluate and improve existing BI systems
  • Collaborate with teams to integrate systems
  • Develop and execute database queries and conduct analyses
  • Create visualizations and reports for requested projects
  • Develop and update technical documentation

Requirements
  • Excellent expertise in SQL queries
  • Proven experience as a BI Developer or Data Scientist
  • Background in data warehouse design (e.g. dimensional modeling) and data mining
  • In-depth understanding of database management systems, online analytical processing (OLAP), and ETL (Extract, Transform, Load) frameworks
  • Familiarity with BI technologies
  • Proven ability to take initiative and be innovative
  • Analytical mind with a problem-solving aptitude
  • BE in Computer Science/IT

Education: BE/BTech/MCA/BCA/MTech/MS or equivalent preferred.
Interested candidates can call us on +91 7569 765 162.
Pune
2 - 6 yrs
₹12L - ₹16L / yr
SQL
ETL
Data engineering
Big Data
Java
+2 more
  • Design, create, test, and maintain data pipeline architecture in collaboration with the Data Architect.
  • Build the infrastructure required for extraction, transformation, and loading of data from a wide variety of data sources using Java, SQL, and Big Data technologies.
  • Support the translation of data needs into technical system requirements. Support in building complex queries required by the product teams.
  • Build data pipelines that clean, transform, and aggregate data from disparate sources
  • Develop, maintain and optimize ETLs to increase data accuracy, data stability, data availability, and pipeline performance.
  • Engage with Product Management and Business to deploy and monitor products/services on cloud platforms.
  • Stay up-to-date with advances in data persistence and big data technologies and run pilots to design the data architecture to scale with the increased data sets of consumer experience.
  • Handle data integration, consolidation, and reconciliation activities for digital consumer / medical products.

Job Qualifications:

  • Bachelor’s or master's degree in Computer Science, Information management, Statistics or related field
  • 5+ years of experience in the Consumer or Healthcare industry in an analytical role, with a focus on building data pipelines, querying data, analyzing, and clearly presenting analyses to members of the data science team.
  • Technical expertise with data models and data mining.
  • Hands-on knowledge of programming languages such as Java, Python, R, and Scala.
  • Strong knowledge of Big Data tools like Snowflake, AWS Redshift, Hadoop, MapReduce, etc.
  • Knowledge of tools like AWS Glue, S3, AWS EMR, streaming data pipelines, and Kafka/Kinesis is desirable.
  • Hands-on knowledge of SQL and NoSQL database design.
  • Knowledge of CI/CD for building and hosting solutions.
  • An AWS certification is an added advantage.
  • Strong knowledge of visualization tools like Tableau and QlikView is an added advantage.
  • A team player capable of working and integrating across cross-functional teams for implementing project requirements. Experience in technical requirements gathering and documentation.
  • Ability to work effectively and independently in a fast-paced agile environment with tight deadlines
  • A flexible, pragmatic, and collaborative team player with the innate ability to engage with data architects, analysts, and scientists
Artivatic
Posted by Layak Singh
Bengaluru (Bangalore)
2 - 7 yrs
₹5L - ₹12L / yr
OpenCV
Machine Learning (ML)
Deep Learning
Python
Artificial Intelligence (AI)
+1 more
About Artivatic: Artivatic is a technology startup that uses AI/ML/deep learning to build intelligent products and solutions for finance, healthcare, and insurance businesses. It is based out of Bangalore with a 20+ member team focused on technology. Artivatic is building cutting-edge solutions to enable 750 million plus people to get insurance, financial access, and health benefits using alternative data sources, increasing their productivity, efficiency, automation power, and profitability, and hence improving the way they do business more intelligently and seamlessly. Artivatic offers lending underwriting, credit/insurance underwriting, fraud prediction, personalization, recommendation, risk profiling, consumer profiling intelligence, KYC automation and compliance, automated decisions, monitoring, claims processing, sentiment/psychology behaviour, auto insurance claims, travel insurance, disease prediction for insurance, and more. We have raised US $300K earlier, built products successfully, and also done a few PoCs successfully with some top enterprises in the insurance, banking, and health sectors. Currently, we are 4 months away from generating continuous revenue.

Skills: We at Artivatic are seeking a passionate, talented, and research-focused computer engineer with a strong machine learning and computer vision background to help build industry-leading technology with a focus on document text extraction and parsing using OCR across different languages (an illustrative preprocessing sketch follows this description).

Qualifications:
- Bachelor's or Master's degree in Computer Science, Computer Vision, or a related field with a specialization in image processing or machine learning.
- Research experience in deep learning models for image processing or OCR-related fields is preferred.
- A publication record in deep learning models at computer vision conferences/journals is a plus.

Required Skills:
- Excellent skills developing in Python in a Linux environment. Programming skills with multi-threaded GPU CUDA computing and API solutions.
- Experience applying machine learning and computer vision principles to real-world data and working with scanned and documented images.
- Good knowledge of computer science, math, and statistics fundamentals (algorithms and data structures, meshing, sampling theory, linear algebra, etc.).
- Knowledge of data science technologies such as Python, Pandas, SciPy, NumPy, matplotlib, etc.
- Broad computer vision knowledge: construction, feature detection, segmentation, classification; machine/deep learning: algorithm evaluation, preparation, analysis, modeling, and execution.
- Familiarity with OpenCV, Dlib, YOLO, Capsule Networks or similar, and open-source AR platforms and products.
- Strong problem-solving and logical skills.
- A go-getter attitude with a willingness to learn new technologies.
- Well versed in software design paradigms and good development practices.

Responsibilities:
- Develop novel algorithms and modeling techniques to advance the state of the art in document and text extraction, image recognition, object identification, and visual recognition.
- Work closely with R&D and machine learning engineers implementing algorithms that power user- and developer-facing products.
- Be responsible for measuring and optimizing the quality of your algorithms.

Experience: 3+ years
Location: Sony World Signal, Koramangala 4th Block, Bangalore
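
For illustration only (not Artivatic's actual pipeline): a minimal OpenCV preprocessing step of the kind commonly applied to scanned documents before OCR, using a library listed above. File paths and parameters are assumptions.

```python
# Illustrative sketch: grayscale, lightly denoise, and binarize a scanned page
# with Otsu's threshold before handing it to an OCR engine. Paths are hypothetical.
import cv2

def preprocess_scan(input_path="scan.png", output_path="scan_binarized.png"):
    image = cv2.imread(input_path)
    if image is None:
        raise FileNotFoundError(f"Could not read {input_path}")
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (3, 3), 0)        # light denoise
    _, binary = cv2.threshold(blurred, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    cv2.imwrite(output_path, binary)
    return binary

if __name__ == "__main__":
    preprocess_scan()
```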
Octro Inc
Posted by Reshma Suleman
Noida, NCR (Delhi | Gurgaon | Noida)
1 - 7 yrs
₹10L - ₹20L / yr
Data Science
R Programming
Python

Octro Inc. is looking for a Data Scientist who will support the product, leadership and marketing teams with insights gained from analyzing multiple sources of data. The ideal candidate is adept at using large data sets to find opportunities for product and process optimization and using models to test the effectiveness of different courses of action. 

 

They must have strong experience using a variety of data mining/data analysis methods, using a variety of data tools, building and implementing models, using/creating algorithms and creating/running simulations. They must have a proven ability to drive business results with their data-based insights. 

 

They must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large data sets and working with stakeholders to improve business outcomes.

Responsibilities :

- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.

- Mine and analyze data from multiple databases to drive optimization and improvement of product development, marketing techniques and business strategies.

- Assess the effectiveness and accuracy of new data sources and data gathering techniques.

- Develop custom data models and algorithms to apply to data sets.

- Use predictive modelling to increase and optimize user experiences, revenue generation, ad targeting and other business outcomes.

- Develop various A/B testing frameworks and test model qualities (a small illustrative sketch follows this list).

- Coordinate with different functional teams to implement models and monitor outcomes.

- Develop processes and tools to monitor and analyze model performance and data accuracy.
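
For illustration only (not Octro's framework): a minimal two-proportion z-test of the kind used to read out a simple A/B experiment, as referenced in the responsibilities above. The conversion counts are made-up numbers.

```python
# Illustrative sketch: two-sided two-proportion z-test for an A/B experiment.
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for H0: rate_a == rate_b."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    phi = lambda x: 0.5 * (1 + erf(x / sqrt(2)))  # standard normal CDF
    p_value = 2 * (1 - phi(abs(z)))
    return z, p_value

if __name__ == "__main__":
    # Hypothetical experiment: variant B converts at 5.8% vs 5.0% for A.
    z, p = two_proportion_ztest(conv_a=500, n_a=10000, conv_b=580, n_b=10000)
    print(round(z, 2), round(p, 3))  # roughly -2.5 and 0.012
```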

Qualifications :

- Strong problem solving skills with an emphasis on product development and improvement.

- Advanced knowledge of SQL and its use in data gathering/cleaning.

- Experience using statistical computer languages (R, Python, etc.) to manipulate data and draw insights from large data sets.

- Experience working with and creating data architectures.

- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.

- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.) and experience with applications.

- Excellent written and verbal communication skills for coordinating across teams.

FarmGuide
Posted by Anupam Arya
NCR (Delhi | Gurgaon | Noida)
0 - 8 yrs
₹7L - ₹14L / yr
Computer Security
Image processing
OpenCV
Python
Rational ClearCase
+8 more
FarmGuide is a data-driven tech startup aiming to digitize existing periodic processes and bring information symmetry to the agriculture supply chain through transparent, dynamic, and interactive software solutions. We at FarmGuide (https://angel.co/farmguide) help the Government in relevant and efficient policy making by ensuring a seamless flow of information between stakeholders.

Job Description:
We are looking for individuals who want to help us design cutting-edge, scalable products to meet our rapidly growing business. We are building out the data science team and looking to hire across levels.
- Solving complex problems in the agri-tech sector, which are long-standing open problems at the national level.
- Applying computer vision techniques to satellite imagery to deduce artefacts of interest.
- Applying various machine learning techniques to digitize the existing physical corpus of knowledge in the sector.

Key Responsibilities:
- Develop computer vision algorithms for production use on satellite and aerial imagery
- Implement models and data pipelines to analyse terabytes of data
- Deploy built models in a production environment
- Develop tools to assess algorithm accuracy
- Implement algorithms at scale in the commercial cloud

Skills Required:
- B.Tech/M.Tech in CS or other related fields such as EE or MCA, from IIT/NIT/BITS but not compulsory
- Demonstrable interest in Machine Learning and Computer Vision, such as coursework, open-source contributions, etc.
- Experience with digital image processing techniques
- Familiarity/experience with geospatial, planetary, or astronomical datasets is valuable
- Experience in writing algorithms to manipulate geospatial data
- Hands-on knowledge of GDAL or open-source GIS tools is a plus
- Familiarity with cloud systems (AWS/Google Cloud) and cloud infrastructure is a plus
- Experience with high-performance or large-scale computing infrastructure might be helpful
- Coding ability in R or Python
- Self-directed team player who thrives in a continually changing environment

What is on offer:
- High-impact role in a young startup with colleagues from IITs and other Tier 1 colleges
- Chance to work on the cutting edge of ML (yes, we do train neural nets on GPUs)
- Lots of freedom in terms of the work you do and how you do it
- Flexible timings
- Best start-up salary in the industry, with additional tax benefits
GreedyGame
Posted by Debdutta Pal
Bengaluru (Bangalore)
2 - 4 yrs
₹10L - ₹12L / yr
Python
MySQL
Data Science
NoSQL Databases
GreedyGame is looking for a data scientist who will help us make sense of the vast amount of available data in order to make smarter decisions and develop high-quality products. Your primary focus will be using data mining techniques, statistical analysis, and machine learning to build high-quality prediction systems and strong consumer engagement profiles.

Responsibilities
- Build the statistical models and heuristics required to predict, optimize, and guide various aspects of our business based on available data
- Interact with product and operations teams to identify gaps, questions, and issues for data analysis and experiments
- Develop and code software programs and algorithms, and create automated processes that cleanse, integrate, and evaluate large datasets from multiple sources
- Create systems that use data from user behavior to identify actionable insights, and convey these insights to product and operations teams from time to time
- Help redefine the ad-viewing experience for consumers on a global scale

Skills Required
- Coding experience in Python, MySQL, NoSQL, and building prototypes for algorithms
- Comfortable and willing to learn any machine learning algorithm, reading research papers and delving deep into its maths
- Passionate and curious to learn the latest trends, methods, and technologies in this field

What's in it for you?
- Opportunity to be a part of the big disruption we are creating in the ad-tech space
- Work with complete autonomy, and take on multiple responsibilities
- Work in a fast-paced environment, with uncapped opportunities to learn and grow
- Office in one of the most happening places in India
- Amazing colleagues, weekly lunches, and beer on Fridays!

What we are building: GreedyGame is a platform that enables blending of ads within the mobile gaming experience using assets like backgrounds, characters, and power-ups. It helps advertisers engage audiences while they are playing games, empowers game developers to monetize their game development efforts through non-intrusive advertising, and allows gamers to enjoy gaming content without having to deal with distracting advertising.