Snowflake Developer - Architect/Manager
Posted by Jyoti Chetry
8 - 12 yrs
₹12L - ₹30L / yr
Full time
Bengaluru (Bangalore), Pune, Gurugram, Chennai
Skills
Snowflake schema
Snowflake
SQL
Data modeling
Data engineering
Data migration

JOB DESCRIPTION: THE IDEAL CANDIDATE WILL:

• Ensure new features and subject areas are modelled to integrate with existing structures and provide a consistent view.
• Develop and maintain documentation of the data architecture, data flows, and data models of the data warehouse, appropriate for various audiences.
• Provide direction on the adoption of cloud technologies (Snowflake) and industry best practices in data warehouse architecture and modelling.

• Provide technical leadership on large enterprise-scale projects. You will also be responsible for preparing estimates and defining technical solutions for proposals (RFPs). This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project.

ELIGIBILITY CRITERIA: Desired Experience/Skills:
• Must have a total of 5+ years in IT, including 2+ years working as a Snowflake Data Architect and 4+ years on data warehouse, ETL, and BI projects.
• Must have at least two end-to-end implementations of the Snowflake cloud data warehouse and three end-to-end on-premise data warehouse implementations, preferably on Oracle.

• Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts
• Expertise in advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, and time travel, and an understanding of how to use these features
• Expertise in deploying Snowflake features such as data sharing, events, and lakehouse patterns
• Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and Big Data modelling techniques using Python
• Experience in data migration from RDBMS to the Snowflake cloud data warehouse
• Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modelling)
• Experience with data security and data access controls and design
• Experience with AWS or Azure data storage and management technologies such as S3 and ADLS
• Build processes supporting data transformation, data structures, metadata, dependency and workload management
• Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
• Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface
• Must have expertise in AWS or Azure Platform as a Service (PaaS)
• Certified Snowflake cloud data warehouse Architect (Desirable)
• Should be able to troubleshoot problems across infrastructure, platform, and application domains
• Must have experience with Agile development methodologies
• Strong communication skills; effective and persuasive in both written and oral communication
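The dimensional-modelling concepts called for above (fact tables rolled up through dimension attributes in a star schema) can be shown with a minimal, dependency-free Python sketch; all table and column names here are hypothetical, not from the posting:

```python
# Minimal star-schema sketch: one fact table keyed to two dimension tables.
# All table and column names are illustrative.

dim_date = {
    1: {"date": "2024-01-01", "quarter": "Q1"},
    2: {"date": "2024-04-01", "quarter": "Q2"},
}

dim_product = {
    10: {"name": "Widget", "category": "Hardware"},
    11: {"name": "Gadget", "category": "Hardware"},
}

fact_sales = [
    {"date_key": 1, "product_key": 10, "amount": 250.0},
    {"date_key": 1, "product_key": 11, "amount": 100.0},
    {"date_key": 2, "product_key": 10, "amount": 300.0},
]

def sales_by_quarter(facts, dates):
    """Roll the fact table up to a dimension attribute (quarter)."""
    totals = {}
    for row in facts:
        quarter = dates[row["date_key"]]["quarter"]
        totals[quarter] = totals.get(quarter, 0.0) + row["amount"]
    return totals

print(sales_by_quarter(fact_sales, dim_date))  # {'Q1': 350.0, 'Q2': 300.0}
```

In a warehouse such as Snowflake the same roll-up would be a join and a GROUP BY; the point here is only the fact/dimension separation.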

Nice-to-have Skills/Qualifications:
• Bachelor's and/or master's degree in computer science, or equivalent experience
• Strong communication, analytical and problem-solving skills with a high attention to detail.

 

About you:
• You are self-motivated, collaborative, eager to learn, and hands on
• You love trying out new apps, and find yourself coming up with ideas to improve them
• You stay ahead with all the latest trends and technologies
• You are particular about following industry best practices and have high standards regarding quality


About Tredence

Founded: 2013
Size: 1000-5000
Stage: Profitable
About

Tredence is an analytics services and solutions company serving some of the leading Fortune 500 clients, with a successful track record of delivering business impact.


Similar jobs

Semi Stealth Mode startup in Delhi
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
3 - 6 yrs
₹35L - ₹40L / yr
Data Analytics
Python
Data Visualization
SQL

A Delhi NCR based Applied AI & Consumer Tech company tackling one of the largest unsolved consumer internet problems of our time. We are a motley crew of smart, passionate and nice people who believe you can build a high-performing company with a culture of respect, aka a sports team with a heart, aka a caring meritocracy.

Our illustrious angels include unicorn founders, serial entrepreneurs with exits, tech & consumer industry stalwarts and investment professionals/bankers.

We are hiring for our founding team (in Delhi NCR only, no remote) that will take the product from prototype to a landing! Opportunity for disproportionate non-linear impact, learning and wealth creation in a classic 0-1 with a Silicon Valley caliber founding team.


Key Responsibilities:

1. Data Strategy and Vision:

• Develop and drive the company's data analytics strategy, aligning it with overall business goals.

• Define the vision for data analytics, outlining clear objectives and key results (OKRs) to measure success.

2. Data Analysis and Interpretation:

• Oversee the analysis of complex datasets to extract valuable insights, trends, and patterns.

• Utilize statistical methods and data visualization techniques to present findings in a clear and compelling manner to both technical and non-technical stakeholders.

3. Data Infrastructure and Tools:

• Evaluate, select, and implement advanced analytics tools and platforms to enhance data processing and analysis capabilities.

• Collaborate with IT teams to ensure a robust and scalable data infrastructure, including data storage, retrieval, and security protocols.

4. Collaboration and Stakeholder Management:

• Collaborate cross-functionally with teams such as marketing, sales, and product development to identify opportunities for data-driven optimizations.

• Act as a liaison between technical and non-technical teams, ensuring effective communication of data insights and recommendations.

5. Performance Measurement:

• Establish key performance indicators (KPIs) and metrics to measure the impact of data analytics initiatives on business outcomes.

• Continuously assess and improve the accuracy and relevance of analytical models and methodologies.
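The KPI-measurement responsibility above can be made concrete with a small sketch; the metric definition and the figures are invented for illustration:

```python
# Illustrative KPI computation: weekly conversion rate from raw event counts.
# The metric definition and the numbers are hypothetical.

weekly_events = [
    {"week": "2024-W01", "visits": 1000, "signups": 50},
    {"week": "2024-W02", "visits": 1200, "signups": 72},
]

def conversion_rate(row):
    """Signups per visit for one week of events."""
    return row["signups"] / row["visits"]

kpis = {row["week"]: round(conversion_rate(row), 3) for row in weekly_events}
print(kpis)  # {'2024-W01': 0.05, '2024-W02': 0.06}
```

In practice the inputs would come from a warehouse query rather than literals, but the shape of the work (define the metric once, compute it per period, compare) is the same.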


Qualifications:

  • Bachelor's or Master's degree in Data Science, Statistics, Computer Science, or related field.
  • Proven experience (5+ years) in data analytics, with a focus on leading analytics teams and driving strategic initiatives.
  • Proficiency in data analysis tools such as Python, R, SQL, and advanced knowledge of data visualization tools.
  • Strong understanding of statistical methods, machine learning algorithms, and predictive modelling techniques.
  • Excellent communication skills, both written and verbal, to effectively convey complex findings to diverse audiences.
Redica Systems
Posted by Nikhil Mohite
Bengaluru (Bangalore)
5 - 8 yrs
₹35L - ₹40L / yr
Python
SQL
Spark
Snowflake



About Us-

Redica Systems is a data analytics platform built to help life sciences companies improve their quality and stay on top of evolving regulations. Our proprietary processes transform one of the industry’s most complete data sets, aggregated from hundreds of health agencies and unique Freedom of Information Act (FOIA) sourcing, into meaningful answers and insights that reduce regulatory and compliance risk. Founded in 2010, Redica Systems serves over 200 customers in the Pharma, BioPharma, MedTech, and Food and Cosmetics industries, including 19 of the top 20 Pharma companies and 9 of the 10 top MedTech companies. Redica Systems’ headquarters are in Pleasanton, CA, but we are a geographically distributed company. More information is available at redica.com.


The Role-

We’re looking for an experienced Senior Data Engineer II to join our team as we continue to develop the first-of-its-kind quality and regulatory intelligence (QRI) platform for the life science industry. The ideal candidate will come with experience leading/mentoring a team of developers and maintaining a high bar of quality while remaining hands-on in the code.


Core Responsibilities

Full understanding of the technical architecture and the different sub-systems

Able to work as a lead in an Agile Scrum environment, with a keen focus on delivering sustainable, high-performance, scalable, and easily maintainable enterprise solutions

Helps prioritize technical issues with engineering managers

Proactively guides technical decisions in a domain of expertise

Recommend and validate different ways to improve data reliability, efficiency, and quality

Identify optimal approaches for resolving data quality or consistency issues

Ensure successful system delivery to the production environment and assist the operations and support team in resolving production issues, as necessary

Lead the acquisition of data from a variety of sources, intelligent change monitoring, data mapping, transformations, and analysis

Develop, test, and maintain architectures for data stores, databases, processing systems, and microservices

Integrate various sub-systems or components to deliver end-to-end solutions

Integrate data pipeline with NLP/ML services
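The pipeline responsibilities above (acquiring data, transforming it, and resolving data-quality issues) might look, in miniature and with made-up field names, like this:

```python
# Sketch of one extract-transform-load step with a basic reliability check,
# of the kind described above. Field names are invented for illustration.

raw_records = [
    {"id": "1", "agency": "fda", "date": "2024-01-15"},
    {"id": "2", "agency": "",    "date": "2024-02-01"},   # missing agency
]

def transform(record):
    """Cast types and normalize values for the target store."""
    return {
        "id": int(record["id"]),
        "agency": record["agency"].upper(),
        "date": record["date"],
    }

def run_pipeline(records):
    """Drop records failing validation; transform the rest."""
    valid, rejected = [], []
    for r in records:
        (valid if r["agency"] else rejected).append(r)
    return [transform(r) for r in valid], rejected

loaded, rejects = run_pipeline(raw_records)
print(len(loaded), len(rejects))  # 1 1
```

A production pipeline would route the rejects to a quarantine table for investigation instead of discarding them silently, which is the "resolving data quality or consistency issues" part of the role.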


Qualifications-

5+ years of senior or lead developer experience with an emphasis on technical mentorship, code/system architecture, and quality output

Extensive experience designing and building data pipelines, data APIs, and ETL/ELT processes

Extensive experience in data modelling and data warehouse concepts

Deep, hands-on experience in Python

Hands-on experience setting up, configuring, and maintaining SQL and NoSQL databases (MySQL/MariaDB, PostgreSQL, MongoDB, Snowflake)

Computer Science, Computer Engineering, or similar technical degree


Bonus Points-

Experience with the data engineering stack within AWS is a major plus (S3, Lake Formation, Lambda, Fargate, Kinesis Data Streams/Data Firehose, DynamoDB, Neptune DB)

Experience with event-driven data architectures

Experience with the ELK stack is a major plus (Elasticsearch, Logstash, Kibana)


Hyderabad
5 - 10 yrs
₹19L - ₹25L / yr
ETL
Informatica
Data Warehouse (DWH)
Windows Azure
Microsoft Windows Azure

A Business Transformation Organization that partners with businesses to co-create customer-centric, hyper-personalized solutions to achieve exponential growth. Invente offers platforms and services that enable businesses to provide a human-free customer experience and business process automation.


Location: Hyderabad (WFO)

Budget: Open

Position: Azure Data Engineer

Experience: 5+ years of commercial experience


Responsibilities

● Design and implement Azure data solutions using ADLS Gen 2.0, Azure Data Factory, Synapse, Databricks, SQL, and Power BI

● Build and maintain data pipelines and ETL processes to ensure efficient data ingestion and processing

● Develop and manage data warehouses and data lakes

● Ensure data quality, integrity, and security

● Implement existing use cases required by the AI and analytics teams

● Collaborate with other teams to integrate data solutions with other systems and applications

● Stay up-to-date with emerging data technologies and recommend new solutions to improve our data infrastructure
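The data-quality responsibility above can be sketched as a generic quality gate (completeness plus key uniqueness) run before loading; the column names and rules are assumptions for illustration:

```python
# Hedged sketch of a generic data-quality gate: flag null required columns
# and duplicate keys before rows reach the warehouse. Names are illustrative.

rows = [
    {"order_id": 1, "amount": 99.5},
    {"order_id": 2, "amount": None},   # completeness violation
    {"order_id": 2, "amount": 10.0},   # uniqueness violation
]

def quality_report(rows, key="order_id", required=("amount",)):
    """Return (row_index, issue) pairs for every rule violation found."""
    seen, issues = set(), []
    for i, row in enumerate(rows):
        if row[key] in seen:
            issues.append((i, "duplicate key"))
        seen.add(row[key])
        for col in required:
            if row[col] is None:
                issues.append((i, f"null {col}"))
    return issues

print(quality_report(rows))  # [(1, 'null amount'), (2, 'duplicate key')]
```

Tools like ADF data flows or Databricks expectations express the same checks declaratively; the logic is what matters here.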


Deep-Rooted.co (formerly Clover)
Posted by Likhithaa D
Bengaluru (Bangalore)
3 - 6 yrs
₹12L - ₹15L / yr
Java
Python
SQL
AWS Lambda
HTTP

Deep-Rooted.Co is on a mission to get fresh, clean, community (local farmer) produce from harvest to your home with a promise of quality first! Our values are rooted in trust, convenience, and dependability, with a bunch of learning & fun thrown in.


Founded out of Bangalore by Arvind, Avinash, Guru, and Santosh, we have raised $7.5 million to date in Seed, Series A, and debt funding from investors including Accel, Omnivore, and Mayfield. Our brand Deep-Rooted.Co, launched in August 2020, was the first of its kind in India's Fruits & Vegetables (F&V) space. It is present in Bangalore and Hyderabad and on a journey of expansion to newer cities, managed seamlessly through a tech platform designed and built to transform the Agri-Tech sector.


Deep-Rooted.Co is committed to building a diverse and inclusive workplace and is an equal-opportunity employer.  

How is this possible? It's because we work with smart people. We are looking for Engineers in Bangalore to work with the Product Leader (Founder) (https://www.linkedin.com/in/gururajsrao/) and CTO (https://www.linkedin.com/in/sriki77/). This is a meaningful project for us, and we are sure you will love it as it touches everyday life and is fun. This will be a virtual consultation.


We want to start the conversation about the project we have for you, but before that, we want to connect with you to know what’s on your mind. Do drop a note sharing your mobile number and letting us know when we can catch up.

Purpose of the role:

* As a startup, we have data distributed across various sources like Excel, Google Sheets, databases, etc. As we grow, we need swift decision-making based on the large amount of data that exists. You will help us bring all this data together and put it into a data model that can be used in business decision-making.
* Handle the nuances of the Excel and Google Sheets APIs.
* Pull data in and manage its growth, freshness, and correctness.
* Transform data into a format that aids easy decision-making for Product, Marketing, and Business Heads.
* Understand the business problem, solve it using technology, and take it to production - no hand-offs; the full path to production is yours.

Technical expertise:
* Good knowledge of and experience with programming languages - Java, SQL, Python.
* Good knowledge of data warehousing and data architecture.
* Experience with data transformations and ETL.
* Experience with API tools and more closed systems like Excel, Google Sheets, etc.
* Experience with the AWS Cloud Platform and Lambda.
* Experience with distributed data processing tools.
* Experience with container-based deployments on cloud.
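Pulling sheet-style data into a usable model, as described above, often starts with turning the header-row-plus-value-rows shape that the Sheets/Excel APIs return into typed records; the columns and casts below are assumptions for illustration:

```python
# Minimal sketch: normalize sheet-style data (header row + value rows, as
# spreadsheet APIs commonly return it) into typed records for downstream use.

sheet_values = [
    ["sku", "qty", "price"],          # header row
    ["TOM-1", "4", "32.50"],
    ["POT-2", "10", "18.00"],
]

def to_records(values):
    """Zip each data row against the header and cast known columns."""
    header, *rows = values
    casts = {"qty": int, "price": float}
    return [
        {col: casts.get(col, str)(cell) for col, cell in zip(header, row)}
        for row in rows
    ]

records = to_records(sheet_values)
print(records[0])  # {'sku': 'TOM-1', 'qty': 4, 'price': 32.5}
```

From here the records can be loaded into a warehouse table or fed to a reporting model; the freshness and correctness checks the role mentions would wrap around this step.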

Skills:
Java, SQL, Python, Data Build Tool, Lambda, HTTP, Rest API, Extract Transform Load.
Global Media Agency
Gurugram
2 - 6 yrs
₹10L - ₹12L / yr
SQL
Campaign Analytics
Google Analytics
Data Analytics
We are looking for a Manager - Campaign Analytics for one of the leading Global Media agencies in Delhi.
 
Role - Manager (Campaign Analytics)
Experience - 2 to 6 years
Location - Gurgaon/Gurugram, Delhi

About our Client:-

Our client is part of a global media agency whose mission is to make advertising more valuable to the world. They do this by employing the world's very best talent to solve some of the toughest challenges of today's digital marketing landscape. They hire people whose values reflect their own: genuine, results-focused, daring, and insightful. They promise you a workplace that invests in your career, cares for you, and is fun and engaging.

About the role:-

- You will be accountable for quantifying and measuring the success of our executions and for delivering insights that enable us to innovate the work we deliver at Essence. You will support analyses for campaigns and help develop new analytical tools.

Some of the things we’d like you to do :-

● Take ownership of delivery for designated clients/projects
● Collaborate with the campaign team to identify opportunities where our analytics offering can add value
● Gain an understanding of marketing plans and their objectives to be able to assist with comprehensive measurement, and test & learn plans
● Develop analytical techniques to analyze results from experiments/studies/observations
● Explore methods/techniques for solving business problems given the data available (potentially 3rd party)
● Help internal and external clients understand the capability and limitations of methods used
● Develop and present compelling and innovative presentations.

A bit about yourself :-

● Degree from a top-tier College, 3.0 GPA or equivalent (preferably numerical)
● Proficiency with systems such as SQL, Social Analytics tools (Crimson Hexagon), Python, and ‘R’
● Have some understanding of marketing campaigns and their objectives
● Strong analytical skills - ability to analyze raw data, find insights, and provide actionable strategic recommendations
● Ability to manage someone effectively to bring out the best in their skill sets, motivating them to succeed, and keeping their focus
● Strong work ethic, with ability to manage multiple projects, people, and time zones to meet deadlines and deliver results
GradMener Technology Pvt. Ltd.
Pune
2 - 5 yrs
₹3L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
SQL
Oracle
Job Description : 
 
Roles and Responsibilities :
 
  • Designing and coding the data warehousing system to desired company specifications 
  • Conducting preliminary testing of the warehousing environment before data is extracted
  • Extracting company data and transferring it into the new warehousing environment
  • Testing the new storage system once all the data has been transferred
  • Troubleshooting any issues that may arise
  • Providing maintenance support
  • Consulting with data management teams to get a big-picture idea of the company’s data storage needs
  • Presenting the company with warehousing options based on their storage needs
Requirements :
  • Experience of 1-3 years in Informatica PowerCenter
  • Excellent knowledge of Oracle database and PL/SQL, such as stored procedures, functions, user-defined functions, table partitioning, indexes, views, etc.
  • Knowledge of SQL Server database
  • Hands-on experience in Informatica PowerCenter and database performance tuning and optimization, including complex query optimization techniques; understanding of ETL control frameworks
  • Experience in UNIX shell/Perl scripting
  • Good communication skills, including the ability to write clearly
  • Able to function effectively as a member of a team
  • Proactive with respect to personal and technical development
Marktine
Posted by Vishal Sharma
Remote, Bengaluru (Bangalore)
2 - 4 yrs
₹10L - ₹20L / yr
Data Science
R Programming
Python
SQL
Natural Language Processing (NLP)

- Modeling complex problems, discovering insights, and identifying opportunities through the use of statistical, algorithmic, mining, and visualization techniques

- Experience working with the business to understand requirements, create problem statements, and build scalable and dependable analytical solutions

- Must have hands-on and strong experience in Python

- Broad knowledge of fundamentals and state-of-the-art in NLP and machine learning

- Strong analytical & algorithm development skills

- Deep knowledge of techniques such as Linear Regression, gradient descent, Logistic Regression, Forecasting, Cluster analysis, Decision trees, Linear Optimization, Text Mining, etc

- Ability to collaborate across teams and strong interpersonal skills
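Two of the techniques listed above, linear regression fitted by gradient descent, can be shown in a few lines of dependency-free Python; the dataset, learning rate, and iteration count are arbitrary choices for illustration:

```python
# Fit y = w*x + b by batch gradient descent on a tiny synthetic dataset.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # generated by y = 2x + 1

w, b, lr = 0.0, 0.0, 0.05
for _ in range(2000):
    n = len(xs)
    # Gradients of mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # 2.0 1.0
```

Libraries like scikit-learn do this (and far more) in one call; the loop above is only to make the gradient-descent mechanics explicit.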

 

Skills

- Sound theoretical knowledge of ML algorithms and their application

- Hands-on experience in statistical modeling tools such as R, Python, and SQL

- Hands-on experience in Machine learning/data science

- Strong knowledge of statistics

- Experience in advanced analytics / Statistical techniques – Regression, Decision trees, Ensemble machine learning algorithms, etc

- Experience in Natural Language Processing & Deep Learning techniques 

- Pandas, NLTK, Scikit-learn, SpaCy, Tensorflow

BDI Plus Lab
Posted by Puja Kumari
Remote only
2 - 6 yrs
₹6L - ₹20L / yr
Apache Hive
Spark
Scala
PySpark
Data engineering
We are looking for big data engineers to join our transformational consulting team serving one of our top US clients in the financial sector. You'd get an opportunity to develop big data pipelines and convert business requirements into production-grade services and products. With less emphasis on prescribing how to do a particular task, we believe in giving people the opportunity to think out of the box and come up with their own innovative solutions to problems.
You will primarily be developing, managing, and executing multiple prospect campaigns as part of the Prospect Marketing Journey to ensure the best conversion and retention rates. Below are the roles, responsibilities, and skillsets we are looking for; if these resonate with you, please get in touch with us by applying to this role.
Roles and Responsibilities:
• You'd be responsible for the development and maintenance of applications using Enterprise Java and distributed technologies.
• You'd collaborate with developers, product managers, business analysts, and business users in conceptualizing, estimating, and developing new software applications and enhancements.
• You'd assist in the definition, development, and documentation of software objectives, business requirements, deliverables, and specifications in collaboration with multiple cross-functional teams.
• You'd assist in the design and implementation process for new products, and research and create POCs for possible solutions.
Skillset:
• Bachelor's or Master's degree in a technology-related field preferred.
• Overall experience of 2-3 years with Big Data technologies.
• Hands-on experience with Spark (Java/Scala)
• Hands-on experience with Hive and shell scripting
• Knowledge of HBase and Elasticsearch
• Development experience in Java/Python is preferred
• Familiar with profiling, code coverage, logging, common IDEs, and other development tools.
• Demonstrated verbal and written communication skills, and ability to interface with Business, Analytics, and IT organizations.
• Ability to work effectively in a short-cycle, team-oriented environment, managing multiple priorities and tasks.
• Ability to identify non-obvious solutions to complex problems
Bungee Tech India
Posted by Abigail David
Remote, NCR (Delhi | Gurgaon | Noida), Chennai
5 - 10 yrs
₹10L - ₹30L / yr
Big Data
Hadoop
Apache Hive
Spark
ETL

Company Description

At Bungee Tech, we help retailers and brands meet customers everywhere, on every occasion they are in. We believe that accurate, high-quality data matched with compelling market insights empowers retailers and brands to keep their customers at the center of all the innovation and value they deliver.

 

We provide a clear and complete omnichannel picture of their competitive landscape to retailers and brands. We collect billions of data points every day and multiple times in a day from publicly available sources. Using high-quality extraction, we uncover detailed information on products or services, which we automatically match, and then proactively track for price, promotion, and availability. Plus, anything we do not match helps to identify a new assortment opportunity.

 

Empowered with this unrivalled intelligence, we unlock compelling analytics and insights that, once blended with verified partner data from trusted sources such as Nielsen, paint a complete, consolidated picture of the competitive landscape.

We are looking for a Big Data Engineer who will work on collecting, storing, processing, and analyzing huge data sets. The primary focus will be on choosing optimal solutions to use for these purposes, then maintaining, implementing, and monitoring them.

You will also be responsible for integrating them with the architecture used in the company.

 

We're working on the future. If you are seeking an environment where you can drive innovation, if you want to apply state-of-the-art software technologies to solve real-world problems, and if you want the satisfaction of providing visible benefit to end users in an iterative, fast-paced environment, this is your opportunity.

 

Responsibilities

As an experienced member of the team, in this role, you will:

 

  • Contribute to evolving the technical direction of analytical systems and play a critical role in their design and development

 

  • You will research, design, code, troubleshoot, and support. What you create is also what you own.

 

  • Develop the next generation of automation tools for monitoring and measuring data quality, with associated user interfaces.

 

  • Be able to broaden your technical skills and work in an environment that thrives on creativity, efficient execution, and product innovation.

 

BASIC QUALIFICATIONS

  • Bachelor’s degree or higher in an analytical area such as Computer Science, Physics, Mathematics, Statistics, Engineering or similar.
  • 5+ years of relevant professional experience in Data Engineering and Business Intelligence
  • 5+ years with advanced SQL (analytical functions), ETL, and Data Warehousing
  • Strong knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ ELT and reporting/analytic tools and environments, data structures, data modeling and performance tuning.
  • Ability to effectively communicate with both business and technical teams.
  • Excellent coding skills in Java, Python, C++, or equivalent object-oriented programming language
  • Understanding of relational and non-relational databases and basic SQL
  • Proficiency with at least one of these scripting languages: Perl / Python / Ruby / shell script

 

PREFERRED QUALIFICATIONS

 

  • Experience with building data pipelines from application databases.
  • Experience with AWS services - S3, Redshift, Spectrum, EMR, Glue, Athena, ELK etc.
  • Experience working with Data Lakes.
  • Experience providing technical leadership and mentoring other engineers on best practices in the data engineering space
  • Sharp problem-solving skills and ability to resolve ambiguous requirements
  • Experience working with Big Data
  • Knowledge of and experience working with Hive and the Hadoop ecosystem
  • Knowledge of Spark
  • Experience working with Data Science teams
NeoQuant Solutions Pvt Ltd
Posted by Shehnaz Siddiki
Mumbai
3 - 6 yrs
₹8L - ₹11L / yr
Microsoft Business Intelligence (MSBI)
SSIS
SQL Server Reporting Services (SSRS)
SQL server
Microsoft SQL Server

MSBI Developer

We have the following opening in our organization:

Years of Experience: 4-8 years.

Location: Mumbai (Thane)/BKC/Andheri
Notice period: Max 15 days or immediate

Educational Qualification: MCA/ME/MSc-IT/BE/B.Tech/BCA/BSc IT in Computer Science

Requirements:

  • 3-8 years of consulting or relevant work experience
  • Should be good in SQL Server 2008 R2 and above
  • Should be excellent at SQL, SSRS, SSIS, and SSAS
  • Data modeling, fact and dimension design, and work on data warehouse or DWH architecture design
  • Implementing new technology like Power BI and Power BI modeling
  • Knowledge of Azure or R programming is an added advantage
  • Experience in BI and visualization technology (Tableau, Power BI)
  • Advanced T-SQL programming skills
  • Can scope out a simple or semi-complex project based on business requirements and achievable benefits
  • Evaluate, design, and implement enterprise IT-based business solutions, often working on-site to help customers deploy their solutions