Hadoop Developer
Securonix
Posted by Ramakrishna Murthy
3 - 7 yrs
₹10L - ₹15L / yr
Pune
Skills
HDFS
Apache Flume
Apache HBase
Hadoop
Impala
Apache Kafka
SolrCloud
Apache Spark
Securonix is a Big Data security analytics product company, and the only product that delivers real-time behavior analytics (UEBA) on Big Data.
Users love Cutshort
Read about what our users have to say about finding their next opportunity on Cutshort.

Shubham Vishwakarma
Full Stack Developer - Averlon
I had an amazing experience. It was a delight getting interviewed via Cutshort. The entire end to end process was amazing. I would like to mention Reshika, she was just amazing wrt guiding me through the process. Thank you team.

About Securonix

Founded :
2007
Type :
Products & Services
Size :
Stage :
Bootstrapped

About

The Securonix platform delivers positive security outcomes with zero infrastructure to manage. It provides analytics-driven next-generation SIEM, UEBA, and security data lake capabilities as a pure cloud solution, without compromise. Built on an open big data platform, Securonix Next-Gen SIEM provides unlimited scalability and log management, behavior analytics-based advanced threat detection, and automated incident response on a single platform. Customers use it to address their insider threat, cyber threat, cloud security, and application security monitoring requirements.

Securonix UEBA leverages sophisticated machine learning and behavior analytics to analyze and correlate interactions between users, systems, applications, IP addresses, and data. Light, nimble, and quick to deploy, it detects advanced insider threats, cyber threats, fraud, cloud data compromise, and non-compliance. Built-in automated response playbooks and customizable case management workflows allow security teams to respond to threats quickly and accurately.

Securonix Security Data Lake is a massively scalable, fault-tolerant, open data platform that ingests massive amounts of data per day and supports reliable, economical, long-term data retention. It transforms raw log data into meaningful security insights using super-enriched data, blazing-fast search, and elegant visualizations to uncover comprehensive, actionable insights into your organization’s security posture.

Connect with the team

Ramakrishna Murthy

Company social profiles

LinkedIn | Twitter | Facebook

Similar jobs

Bengaluru (Bangalore)
5 - 10 yrs
₹25L - ₹50L / yr
NodeJS (Node.js)
React.js
Python
Java
Data engineering
+10 more

Job Title : Senior Software Engineer (Full Stack — AI/ML & Data Applications)

Experience : 5 to 10 Years

Location : Bengaluru, India

Employment Type : Full-Time | Onsite


Role Overview :

We are seeking a Senior Full Stack Software Engineer with strong technical leadership and hands-on expertise in AI/ML, data-centric applications, and scalable full-stack architectures.

In this role, you will design and implement complex applications integrating ML/AI models, lead full-cycle development, and mentor engineering teams.


Mandatory Skills :

Full Stack Development (React/Angular/Vue + Node.js/Python/Java), Data Engineering (Spark/Kafka/ETL), ML/AI Model Integration (TensorFlow/PyTorch/scikit-learn), Cloud & DevOps (AWS/GCP/Azure, Docker, Kubernetes, CI/CD), SQL/NoSQL Databases (PostgreSQL/MongoDB).


Key Responsibilities :

  • Architect, design, and develop scalable full-stack applications for data and AI-driven products.
  • Build and optimize data ingestion, processing, and pipeline frameworks for large datasets.
  • Deploy, integrate, and scale ML/AI models in production environments.
  • Drive system design, architecture discussions, and API/interface standards.
  • Ensure engineering best practices across code quality, testing, performance, and security.
  • Mentor and guide junior developers through reviews and technical decision-making.
  • Collaborate cross-functionally with product, design, and data teams to align solutions with business needs.
  • Monitor, diagnose, and optimize performance issues across the application stack.
  • Maintain comprehensive technical documentation for scalability and knowledge-sharing.
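The pipeline-building responsibility above can be sketched as a chain of Python generator stages (a toy illustration only; the stage names and record shape are invented for the example, not taken from this listing):

```python
# Toy ETL pipeline built from chained generators: each stage consumes
# an iterator of records and yields transformed records lazily, so a
# large dataset streams through without being held in memory at once.

def extract(rows):
    # "Extract": parse raw CSV-like strings into dicts.
    for row in rows:
        user_id, amount = row.split(",")
        yield {"user_id": user_id, "amount": float(amount)}

def transform(records):
    # "Transform": drop invalid records and normalise amounts to cents.
    for rec in records:
        if rec["amount"] > 0:
            rec["amount_cents"] = int(rec["amount"] * 100)
            yield rec

def load(records):
    # "Load": here we just collect; a real pipeline would write to a store.
    return list(records)

raw = ["u1,10.50", "u2,-3.00", "u3,2.25"]
result = load(transform(extract(raw)))
# u2 is filtered out; u1 and u3 pass through with amount_cents added.
```

Frameworks such as Spark or Kafka Streams generalize the same extract/transform/load decomposition across partitioned, distributed data.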

Required Skills & Experience :

  • Education : B.E./B.Tech/M.E./M.Tech in Computer Science, Data Science, or equivalent fields.
  • Experience : 5+ years in software development with at least 2+ years in a senior or lead role.
  • Full Stack Proficiency : Front-end (React / Angular / Vue.js); Back-end (Node.js / Python / Java)
  • Data Engineering : Experience with data frameworks such as Apache Spark, Kafka, and ETL pipeline development.
  • AI/ML Expertise : Practical exposure to TensorFlow, PyTorch, or scikit-learn and deploying ML models at scale.
  • Databases : Strong knowledge of SQL & NoSQL systems (PostgreSQL, MongoDB) and warehousing tools (Snowflake, BigQuery).
  • Cloud & DevOps : Working knowledge of AWS, GCP, or Azure; containerization & orchestration (Docker, Kubernetes); CI/CD; MLflow/SageMaker is a plus.
  • Visualization : Familiarity with modern data visualization tools (D3.js, Tableau, Power BI).

Soft Skills :

  • Excellent communication and cross-functional collaboration skills.
  • Strong analytical mindset with structured problem-solving ability.
  • Self-driven with ownership mentality and adaptability in fast-paced environments.

Preferred Qualifications (Bonus) :

  • Experience deploying distributed, large-scale ML or data-driven platforms.
  • Understanding of data governance, privacy, and security compliance.
  • Exposure to domain-driven data/AI use cases in fintech, healthcare, retail, or e-commerce.
  • Experience working in Agile environments (Scrum/Kanban).
  • Active open-source contributions or a strong GitHub technical portfolio.
Tarento Group
Posted by Bisman Gill
Bengaluru (Bangalore)
6+ yrs
Up to ₹27L / yr (varies)
Java
Spring Boot
Microservices
Windows Azure
RESTful APIs
+2 more

About Tarento:

 

Tarento is a fast-growing technology consulting company headquartered in Stockholm, with a strong presence in India and clients across the globe. We specialize in digital transformation, product engineering, and enterprise solutions, working across diverse industries including retail, manufacturing, and healthcare. Our teams combine Nordic values with Indian expertise to deliver innovative, scalable, and high-impact solutions.

 

We're proud to be recognized as a Great Place to Work, a testament to our inclusive culture, strong leadership, and commitment to employee well-being and growth. At Tarento, you’ll be part of a collaborative environment where ideas are valued, learning is continuous, and careers are built on passion and purpose.


Job Summary:

We are seeking a highly skilled and self-driven Senior Java Backend Developer with strong experience in designing and deploying scalable microservices using Spring Boot and Azure Cloud. The ideal candidate will have hands-on expertise in modern Java development, containerization, messaging systems like Kafka, and knowledge of CI/CD and DevOps practices.


Key Responsibilities:

  • Design, develop, and deploy microservices using Spring Boot on Azure cloud platforms.
  • Implement and maintain RESTful APIs, ensuring high performance and scalability.
  • Work with Java 11+ features including Streams, Functional Programming, and Collections framework.
  • Develop and manage Docker containers, enabling efficient development and deployment pipelines.
  • Integrate messaging services like Apache Kafka into microservice architectures.
  • Design and maintain data models using PostgreSQL or other SQL databases.
  • Implement unit testing using JUnit and mocking frameworks to ensure code quality.
  • Develop and execute API automation tests using Cucumber or similar tools.
  • Collaborate with QA, DevOps, and other teams for seamless CI/CD integration and deployment pipelines.
  • Work with Kubernetes for orchestrating containerized services.
  • Utilize Couchbase or similar NoSQL technologies when necessary.
  • Participate in code reviews, design discussions, and contribute to best practices and standards.


Required Skills & Qualifications:

  • Strong experience in Java (11 or above) and Spring Boot framework.
  • Solid understanding of microservices architecture and deployment on Azure.
  • Hands-on experience with Docker, and exposure to Kubernetes.
  • Proficiency in Kafka, with real-world project experience.
  • Working knowledge of PostgreSQL (or any SQL DB) and data modeling principles.
  • Experience in writing unit tests using JUnit and mocking tools.
  • Experience with Cucumber or similar frameworks for API automation testing.
  • Exposure to CI/CD tools, DevOps processes, and Git-based workflows.


Nice to Have:

  • Azure certifications (e.g., Azure Developer Associate)
  • Familiarity with Couchbase or other NoSQL databases.
  • Familiarity with other cloud providers (AWS, GCP)
  • Knowledge of observability tools (Prometheus, Grafana, ELK)


Soft Skills:

  • Strong problem-solving and analytical skills.
  • Excellent verbal and written communication.
  • Ability to work in an agile environment and contribute to continuous improvement.


Why Join Us:

  • Work on cutting-edge microservice architectures
  • Strong learning and development culture
  • Opportunity to innovate and influence technical decisions
  • Collaborative and inclusive work environment
Trential Technologies
Garima Jangid
Posted by Garima Jangid
Gurugram
5 - 8 yrs
₹30L - ₹45L / yr
NodeJS (Node.js)
JavaScript
RabbitMQ
Apache Kafka
Redis
+14 more

About us:

Trential is engineering the future of digital identity with W3C Verifiable Credentials—secure, decentralized, privacy-first. We make identity and credentials verifiable anywhere, instantly.


We are looking for a Team Lead to architect, build, and scale high-performance web applications that power our core products. You will lead the full development lifecycle, from system design to deployment, while mentoring the team and driving best engineering practices across frontend and backend stacks.


  • Design & Implement: Lead the design, implementation, and management of Trential products.
  • Lead by example: Be the most senior and impactful engineer on the team, setting the technical bar through your direct contributions.
  • Code Quality & Best Practices: Enforce high standards for code quality, security, and performance through rigorous code reviews, automated testing, and continuous delivery pipelines.
  • Standards Adherence: Ensure all solutions comply with relevant open standards such as W3C Verifiable Credentials (VCs), Decentralized Identifiers (DIDs), and privacy laws, maintaining global interoperability.
  • Continuous Improvement: Lead the charge to continuously evaluate and improve products and processes. Instill a culture of metrics-driven process improvement to boost team efficiency and product quality.
  • Cross-Functional Collaboration: Work closely with the co-founders and product team to translate business requirements and market needs into clear, actionable technical specifications and stories. Represent Trential in interactions with external stakeholders for integrations.


What we're looking for:

  • Experience: 5+ years of experience in software development, with at least 2 years as a Technical Lead.
  • Technical Depth: Deep proficiency in JavaScript and experience in building and operating distributed, fault-tolerant systems.
  • Cloud & Infrastructure: Hands-on experience with cloud platforms (AWS & GCP) and modern DevOps practices (e.g., CI/CD, Infrastructure as Code, Docker).
  • Databases: Strong knowledge of SQL/NoSQL databases and data modeling for high-throughput, secure applications.


Preferred Qualifications (Nice to Have)

  • Identity & Credentials: Knowledge of decentralized identity principles, Verifiable Credentials (W3C VCs), DIDs, and relevant protocols (e.g., OpenID4VC, DIDComm).
  • Familiarity with data privacy and security standards (GDPR, SOC 2, ISO 27001) and experience designing systems that comply with them.
  • Experience integrating AI/ML models into verification or data extraction workflows.

Capace Software Private Limited
Bengaluru (Bangalore), Bhopal
5 - 10 yrs
₹4L - ₹10L / yr
Django
CI/CD
Software deployment
RESTful APIs
Flask
+8 more

Senior Python Django Developer 

Experience: Back-end development: 6 years (Required)


Location: Bangalore / Bhopal

Job Description:

We are looking for a highly skilled Senior Python Django Developer with extensive experience in building and scaling financial or payments-based applications. The ideal candidate has a deep understanding of system design, architecture patterns, and testing best practices, along with a strong grasp of the start-up environment.

This role requires a balance of hands-on coding, architectural design, and collaboration across teams to deliver robust and scalable financial products.

Responsibilities:

  • Design and develop scalable, secure, and high-performance applications using Python (Django framework).
  • Architect system components, define database schemas, and optimize backend services for speed and efficiency.
  • Lead and implement design patterns and software architecture best practices.
  • Ensure code quality through comprehensive unit testing, integration testing, and participation in code reviews.
  • Collaborate closely with Product, DevOps, QA, and Frontend teams to build seamless end-to-end solutions.
  • Drive performance improvements, monitor system health, and troubleshoot production issues.
  • Apply domain knowledge in payments and finance, including transaction processing, reconciliation, settlements, wallets, UPI, etc.
  • Contribute to technical decision-making and mentor junior developers.

Requirements:

  • 6 to 10 years of professional backend development experience with Python and Django.
  • Strong background in payments/financial systems or FinTech applications.
  • Proven experience in designing software architecture in a microservices or modular monolith environment.
  • Experience working in fast-paced startup environments with agile practices.
  • Proficiency in RESTful APIs, SQL (PostgreSQL/MySQL), NoSQL (MongoDB/Redis).
  • Solid understanding of Docker, CI/CD pipelines, and cloud platforms (AWS/GCP/Azure).
  • Hands-on experience with test-driven development (TDD) and frameworks like pytest, unittest, or factory_boy.
  • Familiarity with security best practices in financial applications (PCI compliance, data encryption, etc.).
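The test-driven development requirement above can be illustrated test-first. The sketch below uses the standard library's unittest to stay dependency-free (the listing also names pytest and factory_boy); the settle() helper and its fee rules are invented purely for the example:

```python
import unittest

def settle(gross_paise: int, fee_bps: int) -> int:
    """Hypothetical settlement helper: deduct a fee quoted in basis
    points (1 bps = 0.01%) from a gross amount held in paise.
    Integer arithmetic sidesteps float rounding, which matters for money."""
    fee = gross_paise * fee_bps // 10_000
    return gross_paise - fee

class SettleTests(unittest.TestCase):
    # In TDD these tests are written first and drive the implementation.
    def test_two_percent_fee(self):
        self.assertEqual(settle(10_000, 200), 9_800)  # 2% of ₹100.00

    def test_fractional_fee_rounds_down(self):
        self.assertEqual(settle(101, 100), 100)       # 1% of 101 = 1.01 -> 1

    def test_zero_fee(self):
        self.assertEqual(settle(500, 0), 500)

# Run the suite programmatically (equivalent to `python -m unittest`):
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SettleTests)
outcome = unittest.TextTestRunner(verbosity=0).run(suite)
```

With pytest the same tests shrink to plain functions with bare `assert`s; the discipline of writing them before the implementation is what the role is asking for.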

Preferred Skills:

  • Exposure to event-driven architecture (Celery, Kafka, RabbitMQ).
  • Experience integrating with third-party payment gateways, banking APIs, or financial instruments.
  • Understanding of DevOps and monitoring tools (Prometheus, ELK, Grafana).
  • Contributions to open-source or personal finance-related projects.

Job Types: Full-time, Permanent


Schedule:

  • Day shift

Supplemental Pay:

  • Performance bonus
  • Yearly bonus

Ability to commute/relocate:

  • JP Nagar, 5th Phase, Bangalore, Karnataka or Indrapuri, Bhopal, Madhya Pradesh: Reliably commute or willing to relocate with an employer-provided relocation package (Preferred)


iMerit
Bengaluru (Bangalore)
6 - 9 yrs
₹10L - ₹15L / yr
DevOps
Terraform
Apache Kafka
Python
Go Programming (Golang)
+4 more

Exp: 7 - 10 Years

CTC: up to 35 LPA


Skills:

  • 6–10 years DevOps / SRE / Cloud Infrastructure experience
  • Expert-level Kubernetes (networking, security, scaling, controllers)
  • Terraform Infrastructure-as-Code mastery
  • Hands-on Kafka production experience
  • AWS cloud architecture and networking expertise
  • Strong scripting in Python, Go, or Bash
  • GitOps and CI/CD tooling experience


Key Responsibilities:

  • Design highly available, secure cloud infrastructure supporting distributed microservices at scale
  • Lead multi-cluster Kubernetes strategy optimized for GPU and multi-tenant workloads
  • Implement Infrastructure-as-Code using Terraform across full infrastructure lifecycle
  • Optimize Kafka-based data pipelines for throughput, fault tolerance, and low latency
  • Deliver zero-downtime CI/CD pipelines using GitOps-driven deployment models
  • Establish SRE practices with SLOs, p95 and p99 monitoring, and FinOps discipline
  • Ensure production-ready disaster recovery and business continuity testing
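The p95/p99 monitoring practice mentioned above reduces to computing high percentiles over latency samples. A dependency-free sketch using only the standard library (the sample latencies are invented):

```python
import statistics

def latency_percentile(samples_ms, pct):
    # Cut the sample distribution into 100 slices with the "inclusive"
    # method (treats the data as the whole population) and return the
    # requested cut point: cuts[94] is p95, cuts[98] is p99.
    cuts = statistics.quantiles(samples_ms, n=100, method="inclusive")
    return cuts[pct - 1]

latencies_ms = list(range(1, 101))  # toy samples: 1..100 ms
p95 = latency_percentile(latencies_ms, 95)
p99 = latency_percentile(latencies_ms, 99)
# An SLO check then becomes a simple comparison, e.g. p99 <= 250 ms.
```

In production the samples would come from request logs or a metrics store; histogram-based estimators (as used by Prometheus) trade exactness for constant memory at high volume.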



If interested, kindly share your updated resume at 82008 31681

Posspole
Vibha Shashidhar
Posted by Vibha Shashidhar
Bengaluru (Bangalore)
3 - 6 yrs
₹8L - ₹14L / yr
Apache Kafka
IBM DB2
  • Strong knowledge of Kafka development and architecture.
  • Hands-on experience with the KSQL database.
  • Very good communication, analytical, and problem-solving skills.
  • Proven hands-on development experience with Kafka platforms such as Lenses and Confluent.
  • Strong knowledge of the Kafka Connect framework.
  • Very comfortable with shell scripting and Linux commands.
  • Experience with the DB2 database.


Fintech product company with low code and no code technology
Agency job via The Hub by Sridevi Viswanathan
Gurugram, Mumbai, Bengaluru (Bangalore)
2 - 8 yrs
₹2L - ₹15L / yr
Java
JavaScript
React.js
Angular (2+)
AngularJS (1.x)
+10 more

What you need to succeed in this job?


  • MS or BS/B.Tech in computer science or equivalent experience from a top college.
  • Minimum 2+ years of experience in Java 8, Spring Boot, Spring Cloud, Spring Cloud Gateway, etc.
  • Good understanding of design pattern usage and implementation.
  • REST services and understanding and implementation of microservices architecture.
  • Unit testing tools – JUnit & Mockito.
  • Experience with the PostgreSQL database is a must.
  • Excellent data structures & algorithms and problem-solving skills.
  • Being an active contributor to developer communities like Stack Overflow is an added advantage.
  • Experience and knowledge of open-source tools & frameworks and broader cutting-edge technologies around server-side development (Prometheus, Elasticsearch, Kafka).
  • Must be a proven performer and team player who enjoys challenging assignments in a high-energy, fast-growing start-up workplace.
  • Must be a self-starter who can work well with minimal guidance and in a fluid environment.
Conviva
Posted by Adarsh Sikarwar
Bengaluru (Bangalore)
4 - 8 yrs
₹15L - ₹40L / yr
Apache Kafka
Redis
Systems design
Data Structures
Algorithms
+5 more

Have you streamed a program on Disney+, watched your favorite binge-worthy series on Peacock or cheered your favorite team on during the World Cup from one of the 20 top streaming platforms around the globe? If the answer is yes, you’ve already benefitted from Conviva technology, helping the world’s leading streaming publishers deliver exceptional streaming experiences and grow their businesses. 


Conviva is the only global streaming analytics platform for big data that collects, standardizes, and puts trillions of cross-screen, streaming data points in context, in real time. The Conviva platform provides comprehensive, continuous, census-level measurement through real-time, server side sessionization at unprecedented scale. If this sounds important, it is! We measure a global footprint of more than 500 million unique viewers in 180 countries watching 220 billion streams per year across 3 billion applications streaming on devices. With Conviva, customers get a unique level of actionability and scale from continuous streaming measurement insights and benchmarking across every stream, every screen, every second.

 

What you get to do in this role:

Work on extremely high-scale Rust web services or backend systems.

Design and develop solutions for highly scalable web and backend systems.

Proactively identify and solve performance issues.

Maintain a high bar on code quality and unit testing.

 

What you bring to the role:

5+ years of hands-on software development experience.

At least 2+ years of Rust development experience.

Knowledge of Cargo crates for Kafka, Redis, etc.

Strong CS fundamentals, including system design, data structures and algorithms.

Expertise in backend and web services development.

Good analytical and troubleshooting skills.

 

What will help you stand out:

Experience working with large scale web services and applications.

Exposure to Golang, Scala or Java

Exposure to Big data systems like Kafka, Spark, Hadoop etc.

 

Underpinning the Conviva platform is a rich history of innovation. More than 60 patents represent award-winning technologies and standards, including first-of-its-kind innovations like time-state analytics and AI-automated data modeling, that surface actionable insights. By understanding real-world human experiences and having the ability to act within seconds of observation, our customers can solve business-critical issues and focus on growing their business ahead of the competition. Examples of the brands Conviva has helped fuel streaming growth for include: DAZN, Disney+, HBO, Hulu, NBCUniversal, Paramount+, Peacock, Sky, Sling TV, Univision and Warner Bros Discovery.


Privately held, Conviva is headquartered in Silicon Valley, California with offices and people around the globe. For more information, visit us at www.conviva.com. Join us to help extend our leadership position in big data streaming analytics to new audiences and markets! 

BlueYonder
Bengaluru (Bangalore), Hyderabad
10 - 14 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Gradle
+13 more

  • Core responsibilities include analyzing business requirements and designs for accuracy and completeness, and developing and maintaining the relevant product.

  • BlueYonder is seeking a Senior/Principal Architect in the Data Services department (under the Luminate Platform) to act as one of the key technology leaders who build and manage BlueYonder's technology assets in the Data Platform and Services.

  • This individual will act as a trusted technical advisor and strategic thought leader to the Data Services department. The successful candidate will have the opportunity to lead, participate, guide, and mentor others on the team on architecture and design in a hands-on manner. You will be responsible for the technical direction of the Data Platform. This position reports to the Global Head, Data Services and will be based in Bangalore, India.

  • Core responsibilities include architecting and designing (along with counterparts and distinguished architects) a ground-up cloud-native (we use Azure) SaaS product for order management and micro-fulfillment.

  • The team currently comprises 60+ global associates across the US, India (COE), and the UK and is expected to grow rapidly. The incumbent will need leadership qualities to mentor junior and mid-level software associates on our team. This person will lead the Data Platform architecture (streaming and bulk) with Snowflake, Elasticsearch, and other tools.

Our current technical environment:

  • Software: Java, Spring Boot, Gradle, GIT, Hibernate, REST API, OAuth, Snowflake

  • Application Architecture: Scalable, resilient, event-driven, secure multi-tenant microservices architecture

  • Cloud Architecture: MS Azure (ARM templates, AKS, HDInsight, Application Gateway, Virtual Networks, Event Hub, Azure AD)

  • Frameworks/Others: Kubernetes, Kafka, Elasticsearch, Spark, NoSQL, RDBMS, Spring Boot, Gradle, GIT, Ignite

Threado
Posted by Abhishek Nalin
Bengaluru (Bangalore)
4 - 10 yrs
₹40L - ₹60L / yr
Java
Spring
Spring Boot
Spring Security
PostgreSQL
+4 more

Looking for someone with 6+ years of experience who has worked on scalable systems and has a good understanding of architecting systems with Redis, Elasticsearch, and Kafka.


We are looking for a Senior Java Developer with good product development experience to join our founding engineering team.

👋 Hi! We are Threado

At Threado, we are building the future of community experience for businesses. Community-led growth is the most sustainable way to build businesses and we want to help drive this shift in the years to come. We are a seed stage product-led startup building the best-in-class community management platform for community professionals across the globe.

Threado was founded by Pramod Rao and Abhishek Nalin. Prior to Threado, Pramod was a founding team member and VP, Marketing at Zomato. He comes with a decade of experience in community building and user growth. Abhishek was the Director of Engineering at BillTrim and CTO at Smart Audit. He has years of experience in engineering design, architecture and building SaaS products.

⛰Welcome to ground zero!

"The journey of a thousand miles begins with one step." - Lao Tzu

You'll be joining us at the early stages of our journey. We are a small, fun and passionate team with an ambition to build the next generation of community infrastructure. If you love technology, enabling engaging social experiences, and are interested in building a product for the global market, you are one of us. Join us in the journey ahead!

🛠What can you expect at Threado?

You'll be involved in:

  • Shaping the API integrations marketplace and designing and developing solutions on top of APIs.
  • Taking a business problem, coming up with solutions, leading the technical design and implementation of the solution.
  • Writing clean, maintainable and reusable code along with test cases.
  • Mentoring junior developers.


🥷 We are looking for:

  • Proficient with Java. Good understanding of Spring framework.
  • Good understanding of SQL (MySQL or PostgreSQL). Experience with No-SQL (Cassandra, MongoDB, DynamoDB) will be a plus.
  • Experience in server-side services using Redis, Elasticsearch, Kafka will be a plus. Working experience of Microservices would be a plus.
  • Experience with AWS stack. Experience with CI/CD processes.
  • Good written and verbal communication skills with the ability to present complex technical information clearly and concisely to a variety of audiences.
  • Bachelor's Degree in Computer Science or related field with 4+ years of experience in software development.

💭 Parting thoughts on why Threado:

  • Opportunity to join a small passionate team in the early days of building a global SaaS business out of India
  • Take the path less traveled, have fun building and enjoy the learning journey

 

Why apply to jobs via Cutshort

Personalized job matches
Stop wasting time. Get matched with jobs that meet your skills, aspirations and preferences.

Verified hiring teams
See actual hiring teams, find common social connections or connect with them directly.

Move faster with AI
We use AI to get you faster responses, recommendations and unmatched user experience.

Did not find a job you were looking for?
Search for relevant jobs from 10000+ companies such as Google, Amazon & Uber actively hiring on Cutshort.

Get to hear about interesting companies hiring right now
Follow Cutshort