Data Scientist

Posted by Thahaseen Salahuddin
3 - 6 yrs
₹3L - ₹8L / yr
Bengaluru (Bangalore)
Skills
Pyomo
Watson
Python
Big Data
Data Science
Machine Learning (ML)
FlyNava Technologies is a start-up whose vision is to create the finest airline software for distinct competitive advantages in revenue generation and cost management. Its products are designed and built by veterans of the airline and airline-IT industry to meet the needs of this specialised customer segment. The software takes an innovative approach to age-old practices of pricing, hedging, and aircraft induction, and aims to be path-breaking to use, encouraging users to rely and depend on its capabilities.

We leverage our competitive edge by incorporating new technology, big-data models, operations research, and predictive analytics into our products, creating interest and creativity in their use. That interest and creativity can increase potential revenues or reduce costs considerably, creating a distinct competitive differentiation. FlyNava is convinced that when airline users create that differentiation easily, their alignment with the products will be self-motivated rather than mandated.

Further competitive advantages flow from the following:
  • All products, solutions, and services will be copyrighted; as sole owner, FlyNava benefits from high IPR value, including the underlying thesis and research.
  • Existing product companies are investing in other core areas, while our business areas remain predominantly manual processes.
  • Solutions are based on master's-level theses that take 2-3 years to complete, plus more time to make them relevant for software development; expertise in these areas is scarce.

Responsibilities:
  • Collect, catalogue, and filter data, and benchmark solutions.
  • Contribute to model-related data analytics and reporting.
  • Contribute to secured software release activities.
Education & Experience:
  • B.E/B.Tech or M.Tech/MCA in Computer Science, Information Science, or Electronics & Communication
  • 3-6 years of experience

Must have:
  • Strong data analytics skills with Pyomo (optimization), scikit-learn (small-data ML algorithms), and MLlib (Apache Spark big-data ML algorithms)
  • Strong at representing metrics and reports via JSON
  • Strong Python scripting
  • Familiarity with machine learning and pattern-recognition algorithms
  • Familiarity with the software development life cycle
  • Effective interpersonal skills

Good to have:
  • Social analytics
  • Big data
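As a flavour of the "metrics and reports via JSON" requirement, here is a minimal sketch using only the standard library; the `metrics_report` helper and the metric names are invented for illustration, not FlyNava's actual API.

```python
import json

def metrics_report(model_name, metrics):
    """Serialize model evaluation metrics into a stable, readable JSON string."""
    report = {
        "model": model_name,
        # round so reports diff cleanly across runs
        "metrics": {name: round(float(value), 4) for name, value in metrics.items()},
    }
    return json.dumps(report, indent=2, sort_keys=True)

report = metrics_report("demand_forecast_v1", {"rmse": 12.3456789, "mape": 0.0712})
print(report)
```

Sorting keys and fixing the precision keeps successive reports comparable, which matters when they feed dashboards or regression checks.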
Users love Cutshort
Read about what our users have to say about finding their next opportunity on Cutshort.
Shubham Vishwakarma

Full Stack Developer - Averlon
I had an amazing experience. It was a delight getting interviewed via Cutshort. The entire end to end process was amazing. I would like to mention Reshika, she was just amazing wrt guiding me through the process. Thank you team.
Companies hiring on Cutshort

About FlyNava Technologies

Founded: 2015
Type: Product
Size: 20-100
Stage: Bootstrapped

About

Founded in 2015, FlyNava Technologies is a bootstrapped company based in Bangalore. It currently has 6-50 employees and works in the IT consultancy domain.

Connect with the team

Bhavesh Chowdary
Thahaseen Salahuddin
Shammi YK
Mahesh Shastry

Company social profiles

Blog | LinkedIn | Twitter | Facebook

Similar jobs

Searce Inc
3 recruiters
Posted by Tejashree Kokare
Bengaluru (Bangalore), Pune, Mumbai
6 - 15 yrs
Best in industry
Google Cloud Platform (GCP)
Data engineering
Data warehouse architecture
Data architecture
Data modeling
+6 more

Solutions Architect - Data Engineering


As a Searce lead FDS ('forward deployed solver'), you will provide modern tech solutions advisory and 'futurify' consulting, architecting scalable data platforms and robust data engineering solutions that power intelligent insights and fuel AI innovation.

If you’re a tech-savvy, consultative seller with the brain of a strategist, the heart of a builder, and the charisma of a storyteller — we’ve got a seat for you at the front of the table.

You're not a sales lead. You're the transformation driver.


What are we looking for

real solver?

Solver? Absolutely. But not the usual kind. We're searching for the architects of the audacious & the pioneers of the possible. If you're the type to dismantle assumptions, re-engineer ‘best practices,’ and build solutions that make the future possible NOW, then you're speaking our language.

  • Improver. Solver. Futurist.
  • Great sense of humor.
  • ‘Possible. It is.’ Mindset.
  • Compassionate collaborator. Bold experimenter. Tireless iterator.
  • Natural creativity that doesn’t just challenge the norm, but solves to design what’s better.
  • Thinks in systems. Solves at scale.


This Isn’t for Everyone. But if you’re the kind who questions why things are done a certain way— and then identifies 3 better ways to do it — we’d love to chat with you.


Your Responsibilities

what you will wake up to solve.


You are not just a Solutions Architect; you are a futurifier of our data universe and the primary enabler of our AI ambitions. With a deep-seated passion for data engineering, you will architect and build the foundational data infrastructure that powers the customer's entire data intelligence ecosystem.

As the Directly Responsible Individual (DRI) for our enterprise-grade data platforms, you own the outcome, end-to-end. You are the definitive solver for our customer's most complex data challenges, leveraging a powerful tech stack including Snowflake, Databricks, etc. and core GCP & AWS services (BigQuery, Spanner, Airflow, Kafka). This is a hands-on-keys role where you won't just design solutions—you'll build them, break them, and perfect them.


  • Solution Design & Pre-sales Excellence: Collaborate with cross-functional teams, including sales, engineering, and operations, to ensure successful project delivery.
  • Design Core Data Engineering: Master data modeling, architecting high-performance data ingestion pipelines and ensuring data quality and governance throughout the data lifecycle.
  • Enable Cloud & AI: Design and implement solutions utilizing core GCP data services, building foundational data platforms that efficiently support advanced analytics and AI/ML initiatives.
  • Optimize Performance & Cost: Continuously optimize data architectures and implementations for performance, efficiency, and cost-effectiveness within the cloud environment.
  • Bridge Business & Tech: Translate complex business requirements into clear technical designs, providing technical leadership and guidance to data engineering teams.
  • Stay Ahead of the Curve: Continuously research and evaluate new data technologies, architectural patterns, and industry trends to keep our data platforms at the cutting edge.
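The data-quality and governance work called out above often starts with a validation gate at ingestion time. A minimal pure-Python sketch, with an invented `validate_rows` helper rather than any specific Searce tooling:

```python
def validate_rows(rows, required_fields):
    """Split incoming records into valid rows and rejects, recording why each reject failed."""
    valid, rejects = [], []
    for row in rows:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            rejects.append({"row": row, "missing": missing})
        else:
            valid.append(row)
    return valid, rejects

batch = [{"id": 1, "ts": "2024-01-01"}, {"id": 2, "ts": ""}]
valid, rejects = validate_rows(batch, ["id", "ts"])
```

Keeping the rejects (with reasons) rather than silently dropping them is what makes data-quality issues observable downstream.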


Functional Skills:


  • Enterprise Data Architecture Design: Expert ability to design holistic, scalable, and resilient data architectures for complex enterprise environments.
  • Cloud Data Platform Strategy: Proven capability to strategize, design, and implement cloud-native data platforms.
  • Pre-Sales & Technical Storyteller: Crafts compelling, client-ready proposals, architectural decks, and technical demonstrations. Doesn't just present; shapes the strategic technical narrative behind every proposed solution.
  • Advanced Data Modelling: Mastery in designing various data models for analytical, operational, and transactional use cases.
  • Data Ingestion & Pipeline Orchestration: Strong expertise in designing and optimizing robust data ingestion and transformation pipelines.
  • Stakeholder Communication: Exceptional skills in articulating complex technical concepts and architectural decisions to both technical and non-technical stakeholders.
  • Performance & Cost Optimization: Adept at optimizing data solutions for performance, efficiency, and cost within a cloud environment.


Tech Superpowers:


  • Cloud Data Mastery: You're a wizard at leveraging public cloud data services, with deep expertise in GCP (BigQuery, Spanner, etc.) and expert proficiency in modern data warehouse solutions like Snowflake.
  • Data Engineering Core: Highly skilled in designing, implementing, and managing data workflows using tools like Apache Airflow and Apache Kafka. You're also an authority on advanced data modeling and ETL/ELT patterns.
  • AI/ML Data Foundation: You instinctively design data pipelines and structures that efficiently feed and empower Machine Learning and Artificial Intelligence applications.
  • Programming for Data: You have a strong command over key programming languages (Python, SQL) for scripting, automation, and building data processing applications.


Experience & Relevance:


  • Architectural Leadership (8+ Years): You bring extensive experience (8+ years) specifically in a Solutions Architect role, focused on data engineering and platform building.
  • Cloud Data Expertise: You have a proven track record of designing and implementing production-grade data solutions leveraging major public cloud platforms, with significant experience in Google Cloud Platform (GCP).
  • Data Warehousing & Data Platform: Demonstrated hands-on experience in the end-to-end design, implementation, and optimization of modern data warehouses and comprehensive data platforms.
  • Databricks & BigQuery Mastery: You possess significant practical experience with Databricks as a core data warehouse and GCP BigQuery for analytical workloads.
  • Data Ingestion & Orchestration: Proven experience designing and implementing complex data ingestion pipelines and workflow orchestration using tools like Airflow and real-time streaming technologies like Kafka.
  • AI/ML Data Enablement: Experience in building data foundations specifically geared towards supporting Machine Learning and Artificial Intelligence initiatives.


Join the ‘real solvers’

ready to futurify?

If you are excited by the possibilities of what an AI-native engineering-led, modern tech consultancy can do to futurify businesses, apply here and experience the ‘Art of the possible’.


Don’t Just Send a Resume. Send a Statement.


So, if you are passionate about tech, the future, and what you read above (we really are!), apply here to experience the 'Art of the Possible'.

Nevis Software Solutions Pvt Ltd
Pune
3 - 5 yrs
₹7L - ₹12L / yr
Django
Python
RESTful APIs
Web API

About the Role

We are looking for an experienced Django Developer to join our on-site engineering team in Pune. This role involves building and scaling high-performance backend systems for our SaaS products. You will work closely with product, frontend, and DevOps teams to design robust APIs, optimize databases, and deliver production-grade solutions.

This is a hands-on role with ownership, technical depth, and real impact.


Key Responsibilities

  • Design, develop, test, and maintain scalable backend services using Django & Python
  • Architect and implement secure, high-performance RESTful APIs
  • Work extensively with PostgreSQL for schema design, query optimization, indexing, and performance tuning
  • Build and manage asynchronous workflows using Celery
  • Implement real-time features using Daphne, Redis, and WebSockets (ASGI stack)
  • Containerize applications using Docker; manage Docker Compose and environment setups
  • Collaborate with frontend developers, product managers, and designers for seamless delivery
  • Perform code reviews, mentor junior developers, and enforce best practices
  • Ensure application security, scalability, and reliability
  • Monitor system performance and handle debugging, logging, and error management
  • Maintain clear documentation for APIs, services, and deployment workflows
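The PostgreSQL schema-design and indexing responsibility above can be sketched with stdlib sqlite3 standing in for PostgreSQL; the `orders` table and index names are invented, but the principle (indexed lookups instead of full scans) carries over.

```python
import sqlite3

# In-memory database as a stand-in for a PostgreSQL instance.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 50, i * 1.5) for i in range(1000)],
)
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# The query planner now resolves this lookup through the index, not a table scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7"
).fetchone()
print(plan)
```

On PostgreSQL the equivalent check is `EXPLAIN ANALYZE`, and the tuning loop (measure, index, re-measure) is the same.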

Required Skills & Qualifications

  • 3-4 years of hands-on experience with Django & Python
  • Strong expertise in REST API design and backend architecture
  • Advanced knowledge of PostgreSQL (queries, indexing, optimization)
  • Solid experience with Celery for background tasks
  • Hands-on experience with Daphne, Redis, and WebSockets
  • Strong command over Docker & containerized deployments
  • Proficiency with Git/GitHub workflows, PR reviews, and basic CI
  • Excellent understanding of ORM concepts and database modeling
  • Strong problem-solving, debugging, and communication skills
  • Experience using AI/LLM tools to improve productivity is a plus

Nice-to-Have

  • Experience with cloud platforms (AWS / GCP / Azure)
  • Exposure to CI/CD pipelines and deployment automation
  • Familiarity with monitoring tools (Sentry, Prometheus, Grafana, etc.)
  • Basic frontend understanding (HTML, CSS, JavaScript)
  • Experience handling high-traffic systems and performance optimization
  • Exposure to Agile / Scrum environments

What We Offer

  • Competitive salary package
  • Opportunity to work on scalable SaaS and AI-driven platforms
  • Strong engineering culture with ownership and autonomy
  • On-site collaborative environment with fast decision-making
  • Learning, growth, and leadership opportunities
  • Challenging projects with end-to-end responsibility

Expectations & Deliverables

  • Production-ready, well-tested, and maintainable code
  • Proactive communication and ownership of deliverables
  • High-quality documentation and clean architecture practices
  • Adherence to security, compliance, and IP standards


Wissen Technology
4 recruiters
Posted by Gagandeep Kaur
Bengaluru (Bangalore), Mumbai, Pune
4 - 7 yrs
Best in industry
Python
PySpark
pandas
Airflow
Data engineering

Wissen Technology is hiring for Data Engineer

About Wissen Technology: At Wissen Technology, we deliver niche, custom-built products that solve complex business challenges across industries worldwide. Founded in 2015, our core philosophy is built around a strong product engineering mindset, ensuring every solution is architected and delivered right the first time.

Today, Wissen Technology has a global footprint with 2000+ employees across offices in the US, UK, UAE, India, and Australia. Our commitment to excellence translates into delivering 2X impact compared to traditional service providers. How do we achieve this? Through a combination of deep domain knowledge, cutting-edge technology expertise, and a relentless focus on quality. We don't just meet expectations; we exceed them by ensuring faster time-to-market, reduced rework, and greater alignment with client objectives.

We have a proven track record of building mission-critical systems across industries, including financial services, healthcare, retail, manufacturing, and more. Wissen stands apart through its unique delivery models: our outcome-based projects ensure predictable costs and timelines, while our agile pods give clients the flexibility to adapt to evolving business needs. Wissen leverages its thought leadership and technology prowess to drive superior business outcomes. Our success is powered by top-tier talent, and our mission is clear: to be the partner of choice for building world-class custom products that deliver exceptional impact, the first time, every time.

Job Summary: Wissen Technology is hiring a Data Engineer with expertise in Python, Pandas, Airflow, and Azure Cloud Services. The ideal candidate will have strong communication skills and experience with Kubernetes.

Experience: 4-7 years

Notice Period: Immediate to 15 days

Location: Pune, Mumbai, Bangalore

Mode of Work: Hybrid

Key Responsibilities:

  • Develop and maintain data pipelines using Python and Pandas.
  • Implement and manage workflows using Airflow.
  • Utilize Azure Cloud Services for data storage and processing.
  • Collaborate with cross-functional teams to understand data requirements and deliver solutions.
  • Ensure data quality and integrity throughout the data lifecycle.
  • Optimize and scale data infrastructure to meet business needs.
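A minimal sketch of the kind of Python/Pandas pipeline step described above; the column names and the `clean_events` helper are invented for illustration.

```python
import pandas as pd

def clean_events(df):
    """Typical pipeline step: drop incomplete records, coerce types, aggregate per user."""
    df = df.dropna(subset=["user_id", "amount"])
    df = df.assign(amount=df["amount"].astype(float))
    return df.groupby("user_id", as_index=False)["amount"].sum()

raw = pd.DataFrame({
    "user_id": ["a", "a", "b", None],  # last record is incomplete
    "amount": [10, 5, 7, 3],
})
summary = clean_events(raw)
```

In an Airflow deployment, a function like this would typically become the callable behind a `PythonOperator` task, with ingestion and load steps on either side.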

Qualifications and Required Skills:

  • Proficiency in Python (Must Have).
  • Strong experience with Pandas (Must Have).
  • Expertise in Airflow (Must Have).
  • Experience with Azure Cloud Services.
  • Good communication skills.

Good to Have Skills:

  • Experience with Pyspark.
  • Knowledge of Kubernetes.

Snaphyr
Agency job
via SnapHyr by MUKESHKUMAR CHAUHAN
Mumbai
3 - 6 yrs
₹10L - ₹25L / yr
Python
Data-flow analysis
Backend testing
Market analysis
Market Research
+1 more

🚀 We’re Hiring: Python Developer – Quant Strategies & Backtesting | Mumbai (Goregaon East)


Are you a skilled Python Developer passionate about financial markets and quantitative trading?


We’re looking for someone to join our growing Quant Research & Algo Trading team, where you’ll work on:

🔹 Developing & optimizing trading strategies in Python

🔹 Building backtesting frameworks across multiple asset classes

🔹 Processing and analyzing large market datasets

🔹 Collaborating with quant researchers & traders on real-world strategies


What we’re looking for:

✔️ 3+ years of experience in Python development (preferably in fintech/trading/quant domains)

✔️ Strong knowledge of Pandas, NumPy, SciPy, SQL

✔️ Experience in backtesting, data handling & performance optimization

✔️ Familiarity with financial markets is a big plus
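To give a flavour of the backtesting work described above, here is a toy moving-average crossover backtest in pure Python; the price series, window sizes, and function names are invented, and a real framework would add costs, slippage, and proper position sizing.

```python
def sma(prices, n):
    """Simple moving average; None until enough history has accrued."""
    return [None if i + 1 < n else sum(prices[i + 1 - n:i + 1]) / n
            for i in range(len(prices))]

def backtest(prices, fast=2, slow=3):
    """Hold 1 unit while the fast SMA is above the slow SMA; return cumulative P&L."""
    f, s = sma(prices, fast), sma(prices, slow)
    pnl, position = 0.0, 0
    for i in range(1, len(prices)):
        pnl += position * (prices[i] - prices[i - 1])  # mark yesterday's position to market
        if f[i] is not None and s[i] is not None:
            position = 1 if f[i] > s[i] else 0         # rebalance on today's signal
    return pnl

prices = [100, 101, 103, 102, 105, 107]
result = backtest(prices)
```

Note the one-bar lag between signal and P&L: the position set at bar i only earns the move from bar i to i+1, which avoids look-ahead bias.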


📍 Location: Goregaon East, Mumbai

💼 Competitive package + exposure to cutting-edge quant strategies


Tekit Software Solution Pvt Ltd
Posted by Himanshi Tripathi
Mumbai, Hyderabad
9 - 10 yrs
₹20L - ₹24L / yr
Tableau
Tableau architecture
Python
PowerShell
Shell Scripting
+4 more

Tableau Server Administrator (10+ Yrs Exp.) 📊🔒

📍Location: Remote

🗓️ Experience: 10+ years



Mandatory Skills & Qualifications:

1. Proven expertise in Tableau architecture, clustering, scalability, and high availability.

2. Proficiency in PowerShell, Python, or Shell scripting.

3. Experience with cloud platforms (AWS, Azure, GCP) and Tableau Cloud.

4. Familiarity with database systems (SQL Server, Oracle, Snowflake).

5. Any certification is a plus.





YOptima Media Solutions Pvt Ltd
Bengaluru (Bangalore)
8 - 11 yrs
₹40L - ₹60L / yr
Generative AI
Google Cloud Platform (GCP)
Python

Why This Role Matters

We are looking for a Staff Engineer to lead the technical direction and hands-on development of our next-generation, agentic AI-first marketing platforms. This is a high-impact role to architect, build, and ship products that change how marketers interact with data, plan campaigns, and make decisions.


What You'll Do

  • Build Gen-AI native products: Architect, build, and ship platforms powered by LLMs, agents, and predictive AI
  • Stay hands-on: Design systems, write code, debug, and drive product excellence
  • Lead with depth: Mentor a high-caliber team of full stack engineers.
  • Speed to market: Rapidly ship and iterate on MVPs to maximize learning and feedback.
  • Own the full stack: from backend data pipelines to intuitive UIs; from Airflow to React, from BigQuery to embeddings.
  • Scale what works: Ensure scalability, security, and performance in multi-tenant, cloud-native environments (GCP).
  • Collaborate deeply: Work closely with product, growth, and leadership to align tech with business priorities.


What You Bring

  • 8+ years of experience building and scaling full-stack, data-driven products
  • Proficiency in backend (Node.js, Python) and frontend (React), with solid GCP experience
  • Strong grasp of data pipelines, analytics, and real-time data processing
  • Familiarity with Gen-AI frameworks (LangChain, LlamaIndex, OpenAI APIs, vector databases)
  • Proven architectural leadership and technical ownership
  • Product mindset with a bias for execution and iteration


Our Tech Stack

  • Cloud: Google Cloud Platform
  • Backend: Node.js, Python, Airflow
  • Data: BigQuery, Cloud SQL
  • AI/ML: TensorFlow, OpenAI APIs, custom agents
  • Frontend: React.js


What You Get

  • Meaningful equity in a high-growth startup
  • The chance to build global products from India
  • A culture that values clarity, ownership, learning, humility, and candor
  • A rare opportunity to build with Gen-AI from the ground up


Who You Are

  • You’re initiative-driven, not interruption-driven.
  • You code because you love building things that matter.
  • You enjoy ambiguity and solve problems from first principles.
  • You believe true leadership is contextual, hands-on, and grounded.
  • You’re here to build — not just maintain.
  • You care deeply about seeing your products empower real users, run reliably at scale, and adapt intelligently with minimal manual effort.
  • You know that elegant code is just 30% of the job — the real craft lies in the engineering rigour, edge-case handling, and production resilience that make great products truly dependable.
DataToBiz Pvt. Ltd.
2 recruiters
Posted by Vibhanshi Bakliwal
Pune
8 - 12 yrs
₹15L - ₹18L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+6 more

We are seeking a highly skilled and experienced Power BI Lead / Architect to join our growing team. The ideal candidate will have a strong understanding of data warehousing, data modeling, and business intelligence best practices. This role will be responsible for leading the design, development, and implementation of complex Power BI solutions that provide actionable insights to key stakeholders across the organization.


Location - Pune (Hybrid 3 days)


Responsibilities:


  • Lead the design, development, and implementation of complex Power BI dashboards, reports, and visualizations.
  • Develop and maintain data models (star schema, snowflake schema) for optimal data analysis and reporting.
  • Perform data analysis, data cleansing, and data transformation using SQL and other ETL tools.
  • Collaborate with business stakeholders to understand their data needs and translate them into effective and insightful reports.
  • Develop and maintain data pipelines and ETL processes to ensure data accuracy and consistency.
  • Troubleshoot and resolve technical issues related to Power BI dashboards and reports.
  • Provide technical guidance and mentorship to junior team members.
  • Stay abreast of the latest trends and technologies in the Power BI ecosystem.
  • Ensure data security, governance, and compliance with industry best practices.
  • Contribute to the development and improvement of the organization's data and analytics strategy.
  • May lead and mentor a team of junior Power BI developers.


Qualifications:


  • 8-12 years of experience in Business Intelligence and Data Analytics.
  • Proven expertise in Power BI development, including DAX and advanced data modeling techniques.
  • Strong SQL skills, including writing complex queries, stored procedures, and views.
  • Experience with ETL/ELT processes and tools.
  • Experience with data warehousing concepts and methodologies.
  • Excellent analytical, problem-solving, and communication skills.
  • Strong teamwork and collaboration skills.
  • Ability to work independently and proactively.
  • Bachelor's degree in Computer Science, Information Systems, or a related field preferred.

Fintech lead
Agency job
via The Hub by Sridevi Viswanathan
Remote only
3 - 6 yrs
₹5L - ₹25L / yr
BERT
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
Python

Data Scientist

We are looking for an experienced Data Scientist to join our engineering team and help us enhance our mobile application with data. In this role, we're looking for people who are passionate about developing ML/AI across various domains to solve enterprise problems. We are keen on hiring someone who loves working in a fast-paced start-up environment and wants to solve some challenging engineering problems.

As one of the earliest members of the engineering team, you will have the flexibility to design the models and architecture from the ground up. As at any early-stage start-up, we expect you to be comfortable wearing various hats and to be a proactive contributor to building something truly remarkable.

Responsibilities

  • Research, develop, and maintain machine learning and statistical models for business requirements
  • Work across the spectrum of statistical modelling, including supervised, unsupervised, and deep learning techniques, to apply the right level of solution to the right problem
  • Coordinate with different functional teams to monitor outcomes and refine/improve the machine learning models
  • Implement models to uncover patterns and predictions, creating business value and innovation
  • Identify unexplored data opportunities for the business to unlock and maximize the potential of digital data within the organization
  • Develop NLP concepts and algorithms to classify and summarize structured/unstructured text data
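As a toy illustration of the text-classification responsibility, here is a bag-of-words classifier in pure Python; the labels, texts, and helper names are invented, and a production system would use learned representations such as BERT embeddings instead of raw word counts.

```python
from collections import Counter

def train(labeled_docs):
    """Accumulate per-label word counts from (text, label) pairs."""
    counts = {}
    for text, label in labeled_docs:
        counts.setdefault(label, Counter()).update(text.lower().split())
    return counts

def classify(counts, text):
    """Pick the label whose training vocabulary best covers the document."""
    words = text.lower().split()
    return max(counts, key=lambda label: sum(counts[label][w] for w in words))

docs = [("refund failed for my card payment", "payments"),
        ("cannot log in to my account", "auth")]
model = train(docs)
```

Even this crude overlap score captures the core idea: classification reduces to scoring a document against per-label statistics and taking the argmax.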

Qualifications

  • 3+ years of experience solving complex business problems using machine learning
  • Fluency in Python and NLP tooling such as BERT is a must
  • Strong analytical and critical thinking skills
  • Experience building production-quality models using state-of-the-art technologies
  • Familiarity with databases such as MySQL, Oracle, SQL Server, NoSQL, etc. is desirable
  • Ability to collaborate on projects and work independently when required
  • Previous experience in the fintech/payments domain is a bonus
  • Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, or another quantitative field from a top-tier institute

TestMu AI (Formerly LambdaTest)
Remote, Noida, NCR (Delhi | Gurgaon | Noida)
1 - 5 yrs
₹6L - ₹15L / yr
Software Development
Test Automation (QA)
SDET
Python
Perl
+2 more

LambdaTest (www.lambdatest.com) is a cloud-based testing platform aimed at bringing the whole testing ecosystem to the cloud. LambdaTest provides access to a powerful cloud network of 2000+ real browsers and operating systems that helps testers with cross-browser and cross-platform compatibility testing. The product roadmap is evolving, and many more functionalities and features will be added to the product. The company is angel-funded by leading investors and entrepreneurs of the industry, and we are growing at a fantastic rate. This is an incredible opportunity for someone talented and ambitious to make a huge impact.

Qualities: Ability to create tools, microsites, DevOps setups, and technical solutions for testing. Good debugging skills and programming knowledge. Good at manual testing and white-box testing.

Requirements and Qualifications: 

  • 1-3 years of job experience (startup preferred)
  • Expert in at least one programming language
  • QA framework knowledge
  • Developer attitude with good problem-solving skills
  • Any raw development talent with a QA attitude will work for us
  • Ability to write unit and integration tests
  • Understanding of testing frameworks
  • Good knowledge of WebDriver APIs such as the Actions class, Select class, and Alert
  • Experience with Continuous Integration and Continuous Delivery tools such as Jenkins
  • Knowledge of DevOps tools
  • Experience in bug tracking and reporting
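The "unit and integration tests" requirement can be sketched with stdlib unittest; `checkout_total` is a made-up function under test, not part of the LambdaTest product.

```python
import unittest

def checkout_total(prices, discount=0.0):
    """Function under test: sum a cart and apply a fractional discount."""
    if not 0.0 <= discount < 1.0:
        raise ValueError("discount must be in [0, 1)")
    return round(sum(prices) * (1.0 - discount), 2)

class CheckoutTests(unittest.TestCase):
    def test_plain_total(self):
        self.assertEqual(checkout_total([10.0, 5.5]), 15.5)

    def test_discount_applied(self):
        self.assertEqual(checkout_total([100.0], discount=0.1), 90.0)

    def test_invalid_discount_rejected(self):
        # error paths deserve tests too
        with self.assertRaises(ValueError):
            checkout_total([10.0], discount=1.5)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(CheckoutTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The same happy-path/edge-case/error-path structure scales up to Selenium-based integration suites, where each test drives a browser session instead of calling a function.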
Ejohri Jewel Hub
3 recruiters
Posted by Jitandra Singh
Mumbai
4 - 10 yrs
₹12L - ₹20L / yr
Python
Django
Amazon Web Services (AWS)
React.js
  • Work experience as a Python developer
  • Experience developing and working on consumer-facing web/app products in the Django framework
  • Experience working with React.js front-end design
  • Thorough knowledge of data stores and MyScore AWS services; experience with EC2, ELB, Auto Scaling, CloudFront, S3
  • Experience in front-end codebases using HTML, CSS, and JavaScript
  • Good understanding of data structures, algorithms, and operating systems
  • Good experience developing and integrating APIs
  • Knowledge of object-relational mapping (ORM)
  • Able to integrate multiple data sources and databases into one system (MySQL)
  • Understanding of front-end technologies such as JavaScript, HTML5, and CSS3
  • Proficient understanding of code versioning tools (Git)
  • Good problem-solving skills