Datawarehouse Developer
Posted by Sukanya Mohan
5 - 8 yrs
₹1L - ₹18L / yr
Bengaluru (Bangalore)
Skills
Informatica PowerCenter
ETL
Teradata DBA
Shell Scripting

Position Description


We are looking for a highly motivated, hands-on Senior Database/Data Warehouse and Data Analytics developer to work at our Bangalore, India location. The ideal candidate will have a solid software technology background and the ability to build and support robust, secure, multi-platform financial applications, contributing to the Fully Paid Lending (FPL) project. The successful candidate will be a proficient and productive developer and a team leader, with good communication skills and a strong sense of ownership.

 

Responsibilities


  • Produce service metrics, analyze trends, and identify opportunities to improve service levels and reduce cost as appropriate.
  • Design, develop, and maintain database schemas and objects throughout the application lifecycle.
  • Support implemented solutions by monitoring and tuning queries and data loads, addressing user questions about data integrity, monitoring performance, and communicating functional and technical issues.
  • Take ownership of production releases on behalf of the team.
  • Troubleshoot data issues and work with data providers to resolve them.
  • Work closely with business and application teams to implement the right design and solution for business applications.
  • Build reporting solutions for WM Risk Applications.
  • Work as part of a banking Agile Squad/Fleet.
  • Perform proofs of concept in new areas of development.
  • Support continuous improvement of automated systems.
  • Participate in all aspects of the SDLC (analysis, design, coding, testing, and implementation).

 

Required Skills

  • 5 to 7 years of strong database (SQL) knowledge, ETL (Informatica PowerCenter), and Unix shell scripting.
  • Database knowledge (preferably Teradata): database design, performance tuning, writing complex DB programs, etc.
  • Proficient in analyzing and resolving application performance problems.
  • Database fundamentals; relational and data warehouse concepts.
  • Should be able to lead a team of 2-3 members and guide them technically and functionally in their day-to-day work.
  • Ensure designs, code, and processes are optimized for performance, scalability, security, reliability, and maintainability.
  • Understanding of the requirements of large enterprise applications (security, entitlements, etc.).
  • Provide technical leadership throughout the design process and guidance on practices, procedures, and techniques. Serve as a guide and mentor for junior-level software development engineers.
  • Exposure to JIRA or other ALM tools to create a productive, high-quality development environment.
  • Proven experience working within an Agile framework.
  • Strong problem-solving skills and the ability to produce high-quality work independently and work well in a team.
  • Excellent communication skills (written, interpersonal, presentation), with the ability to easily and effectively interact and negotiate with business stakeholders.
  • Ability and strong desire to learn new languages, frameworks, tools, and platforms quickly.
  • Growth mindset, personal excellence, collaborative spirit.
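Several of the duties above (monitoring data loads, tuning, release support) reduce to simple post-load checks. As a purely illustrative sketch, not part of the role description, here is a row-count reconciliation in Python, using SQLite as a stand-in for Teradata; the table names are hypothetical:

```python
import sqlite3

def reconcile_counts(conn, source_table, target_table):
    """Compare row counts between a staging (source) table and the
    warehouse (target) table after a load -- a basic post-load check."""
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return {"source": src, "target": tgt, "match": src == tgt}

# Demo with an in-memory database standing in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_trades (id INTEGER);
    CREATE TABLE dw_trades (id INTEGER);
    INSERT INTO stg_trades VALUES (1), (2), (3);
    INSERT INTO dw_trades SELECT id FROM stg_trades;
""")
print(reconcile_counts(conn, "stg_trades", "dw_trades"))
```

In a real Teradata/Informatica setup the same check would typically run in a shell script after each workflow, alerting when the counts diverge.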

Good-to-have skills

  • Prior work experience with Azure or other cloud platforms such as Google Cloud, AWS, etc.
  • Exposure to programming languages such as Python, R, or Java, and experience implementing data analytics projects.
  • Experience with Git and development workflows.
  • Prior experience in the banking and financial domain.
  • Exposure to securities-based lending is a plus.
  • Experience with reporting/BI tools is a plus.



About Wissen Technology

Founded: 2000
Size: 1000-5000
Stage: Profitable

About

The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains.

With offices in the US, India, UK, Australia, Mexico, and Canada, we offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation.


Leveraging our multi-site operations in the USA and India and availability of world-class infrastructure, we offer a combination of on-site, off-site and offshore service models. Our technical competencies, proactive management approach, proven methodologies, committed support and the ability to quickly react to urgent needs make us a valued partner for any kind of Digital Enablement Services, Managed Services, or Business Services.


We believe that the technology and thought leadership that we command in the industry is the direct result of the kind of people we have been able to attract, to form this organization (you are one of them!).


Our workforce consists of 1000+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like MIT, Wharton, IITs, IIMs, and BITS and with rich work experience in some of the biggest companies in the world.


Wissen Technology has been certified as a Great Place to Work®. The technology and thought leadership that the company commands in the industry is the direct result of the kind of people Wissen has been able to attract. Wissen is committed to providing them the best possible opportunities and careers, which extends to providing the best possible experience and value to our clients.

Connect with the team
Lokesh Manikappa
Vijayalakshmi Selvaraj
Adishi Sood
Shiva Kumar J Goud
Company social profiles
Blog, LinkedIn, Facebook

Similar jobs

Nyteco
Posted by Alokha Raj
Remote only
4 - 6 yrs
₹17L - ₹20L / yr
Data Transformation Tool (DBT)
ETL
SQL
Big Data
Google Cloud Platform (GCP)
+2 more

Join Our Journey

Jules develops an amazing end-to-end solution for recycled materials traders, importers and exporters, which means a lot of internal, structured data to play with in order to provide reporting, alerting and insights to end users. With about 200 tables covering all business processes, from order management to payments, including logistics, hedging and claims, the wealth that the data entered in Jules can unlock is massive.


After working with a simple stack made of Postgres, SQL queries and a visualization solution, the company is now ready to set up its data stack and only misses you. We are thinking DBT, Redshift or Snowflake, Fivetran, Metabase or Luzmo, etc. We also have an AI team already playing with text-driven data interaction.


As a Data Engineer at Jules AI, your duties will involve both data engineering and product analytics, enhancing our data ecosystem. You will collaborate with cross-functional teams to design, develop, and sustain data pipelines, and conduct detailed analyses to generate actionable insights.


Roles And Responsibilities:

  • Work with stakeholders to determine data needs, and design and build scalable data pipelines.
  • Develop and sustain ELT processes to guarantee timely and precise data availability for analytical purposes.
  • Construct and oversee large-scale data pipelines that collect data from various sources.
  • Expand and refine our DBT setup for data transformation.
  • Engage with our data platform team to address customer issues.
  • Apply your advanced SQL and big data expertise to develop innovative data solutions.
  • Enhance and debug existing data pipelines for improved performance and reliability.
  • Generate and update dashboards and reports to share analytical results with stakeholders.
  • Implement data quality controls and validation procedures to maintain data accuracy and integrity.
  • Work with various teams to incorporate analytics into product development efforts.
  • Use technologies like Snowflake, DBT, and Fivetran effectively.
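The data-quality bullet above is easy to make concrete. A minimal, illustrative sketch (not Jules code; the field names are invented) of two common validation checks, key uniqueness and per-column null rate, in Python:

```python
def data_quality_report(rows, key, required):
    """Tiny data-quality check over a list of dict records:
    counts duplicate primary keys and the null/empty rate per
    required column."""
    n = len(rows)
    keys = [r.get(key) for r in rows]
    return {
        "rows": n,
        "duplicate_keys": n - len(set(keys)),
        "null_rates": {
            col: sum(1 for r in rows if r.get(col) in (None, "")) / n
            for col in required
        },
    }

# Hypothetical order records from a recycled-materials trade.
orders = [
    {"order_id": 1, "material": "copper", "price": 710.0},
    {"order_id": 2, "material": "aluminium", "price": None},
    {"order_id": 2, "material": "steel", "price": 120.5},
]
print(data_quality_report(orders, key="order_id", required=["material", "price"]))
```

In a DBT-based stack, checks like these would more likely live as schema tests (`unique`, `not_null`) rather than hand-rolled Python, but the logic is the same.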


Mandatory Qualifications:

  • Hold a Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
  • Possess at least 4 years of experience in Data Engineering, ETL Building, database management, and Data Warehousing.
  • Demonstrated expertise as an Analytics Engineer or in a similar role.
  • Proficient in SQL, a scripting language (Python), and a data visualization tool.
  • Mandatory experience in working with DBT.
  • Experience working with Airflow, and with cloud platforms such as AWS or GCP, or with Snowflake.
  • Deep knowledge of ETL/ELT patterns.
  • Require at least 1 year of experience in building Data pipelines and leading data warehouse projects.
  • Experienced in mentoring data professionals across all levels, from junior to senior.
  • Proven track record in establishing new data engineering processes and navigating through ambiguity.

Preferred Skills:

  • Knowledge of Snowflake and reverse ETL tools is advantageous.


Grow, Develop, and Thrive With Us

  • Global Collaboration: Work with a dynamic team that’s making an impact across the globe, in the recycling industry and beyond. We have customers in India, Singapore, the United States, Mexico, Germany, France and more.
  • Professional Growth: a clear path toward setting up a great data team and evolving into a leader.
  • Flexible Work Environment: Competitive compensation, performance-based rewards, health benefits, paid time off, and flexible working hours to support your well-being.


Apply directly: https://nyteco.keka.com/careers/jobdetails/41442

Startup Focused on simplifying Buying Intent
Agency job via Qrata by Blessy Fernandes
Bengaluru (Bangalore)
4 - 9 yrs
₹28L - ₹56L / yr
Big Data
Apache Spark
Spark
Hadoop
ETL
+7 more
  • 5+ years of experience in a Data Engineer role.
  • Proficiency in Linux.
  • Must have SQL knowledge and experience working with relational databases, including query authoring (SQL) and familiarity with MySQL, Mongo, Cassandra, and Athena.
  • Must have experience with Python/Scala.
  • Must have experience with Big Data technologies like Apache Spark.
  • Must have experience with Apache Airflow.
  • Experience with data pipeline and ETL tools like AWS Glue.
  • Experience working with AWS cloud services: EC2, S3, RDS, Redshift.
Personal Care Product Manufacturing
Agency job via Qrata by Rayal Rajan
Mumbai
3 - 8 yrs
₹12L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+9 more

DATA ENGINEER


Overview

They started with a singular belief - what is beautiful cannot and should not be defined in marketing meetings. It's defined by the regular people like us, our sisters, our next-door neighbours, and the friends we make on the playground and in lecture halls. That's why we stand for people-proving everything we do. From the inception of a product idea to testing the final formulations before launch, our consumers are a part of each and every process. They guide and inspire us by sharing their stories with us. They tell us not only about the product they need and the skincare issues they face but also the tales of their struggles, dreams and triumphs. Skincare goes deeper than skin. It's a form of self-care for many. Wherever someone is on this journey, we want to cheer them on through the products we make, the content we create and the conversations we have. What we wish to build is more than a brand. We want to build a community that grows and glows together - cheering each other on, sharing knowledge, and ensuring people always have access to skincare that really works.

 

Job Description:

We are seeking a skilled and motivated Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, developing, and maintaining the data infrastructure and systems that enable efficient data collection, storage, processing, and analysis. You will collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to implement data pipelines and ensure the availability, reliability, and scalability of our data platform.


Responsibilities:

Design and implement scalable and robust data pipelines to collect, process, and store data from various sources.

Develop and maintain data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation.

Optimize and tune the performance of data systems to ensure efficient data processing and analysis.

Collaborate with data scientists and analysts to understand data requirements and implement solutions for data modeling and analysis.

Identify and resolve data quality issues, ensuring data accuracy, consistency, and completeness.

Implement and maintain data governance and security measures to protect sensitive data.

Monitor and troubleshoot data infrastructure, perform root cause analysis, and implement necessary fixes.

Stay up-to-date with emerging technologies and industry trends in data engineering and recommend their adoption when appropriate.


Qualifications:

Bachelor’s or higher degree in Computer Science, Information Systems, or a related field.

Proven experience as a Data Engineer or similar role, working with large-scale data processing and storage systems.

Strong programming skills in languages such as Python, Java, or Scala.

Experience with big data technologies and frameworks like Hadoop, Spark, or Kafka.

Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle).

Familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery).

Solid understanding of data modeling, data warehousing, and ETL principles.

Knowledge of data integration techniques and tools (e.g., Apache Nifi, Talend, or Informatica).

Strong problem-solving and analytical skills, with the ability to handle complex data challenges.

Excellent communication and collaboration skills to work effectively in a team environment.


Preferred Qualifications:

Advanced knowledge of distributed computing and parallel processing.

Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink).

Familiarity with machine learning concepts and frameworks (e.g., TensorFlow, PyTorch).

Knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).

Experience with data visualization and reporting tools (e.g., Tableau, Power BI).

Certification in relevant technologies or data engineering disciplines.



bitsCrunch technology pvt ltd
Remote only
3 - 7 yrs
₹5L - ₹10L / yr
SQL
Python
JavaScript
NOSQL Databases
Web3js
+2 more

Job Description: 

We are looking for an experienced SQL Developer to become a valued member of our dynamic team. In the role of SQL Developer, you will be tasked with creating top-notch database solutions, fine-tuning SQL databases, and providing support for our applications and systems. Your proficiency in SQL database design, development, and optimization will be instrumental in delivering efficient and dependable solutions to fulfil our business requirements.


Responsibilities:

 ● Create high-quality database solutions that align with the organization's requirements and standards.

● Design, manage, and fine-tune SQL databases, queries, and procedures to achieve optimal performance and scalability.

● Collaborate on the development of DBT pipelines to facilitate data transformation and modelling within our data warehouse.

● Evaluate and interpret ongoing business report requirements, gaining a clear understanding of the data necessary for insightful reporting.

● Conduct research to gather the essential data for constructing relevant and valuable reporting materials for stakeholders.

● Analyse existing SQL queries to identify areas for performance enhancements, implementing optimizations for greater efficiency.

● Propose new queries to extract meaningful insights from the data and enhance reporting capabilities.

● Develop procedures and scripts to ensure smooth data migration between systems, safeguarding data integrity.

● Deliver timely management reports on a scheduled basis to support decision-making processes.

● Investigate exceptions related to asset movements to maintain accurate and dependable data records.
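To make the query-optimization responsibilities above concrete: the most common win is adding an index so the planner can replace a full table scan with an index search. A small illustrative sketch using Python's built-in sqlite3 (the posting's actual databases may differ; the `movements` table is hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE movements (asset_id INTEGER, moved_at TEXT)")
conn.executemany("INSERT INTO movements VALUES (?, ?)",
                 [(i % 100, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)])

query = "SELECT COUNT(*) FROM movements WHERE asset_id = ?"

# Without an index the planner must scan the whole table...
before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# ...after adding one, it can use an index search instead.
conn.execute("CREATE INDEX idx_movements_asset ON movements (asset_id)")
after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# The last column of each plan row is a human-readable description,
# e.g. 'SCAN movements' before vs. 'SEARCH movements USING ... INDEX ...' after.
print(before[0][-1])
print(after[0][-1])
```

The exact plan text varies between SQLite versions, but the scan-to-search transition is the pattern to look for when analysing slow queries in any engine.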


Requirements:

● A minimum of 3 years of hands-on experience in SQL development and administration, showcasing a strong proficiency in database management.

● A solid grasp of SQL database design, development, and optimization techniques.

● A Bachelor's degree in Computer Science, Information Technology, or a related field.

● An excellent understanding of DBT (Data Build Tool) and its practical application in data transformation and modelling.

● Proficiency in either Python or JavaScript, as these are commonly utilized for data-related tasks.

● Familiarity with NoSQL databases and their practical application in specific scenarios.

● Demonstrated commitment and pride in your work, with a focus on contributing to the company's overall success.

● Strong problem-solving skills and the ability to collaborate effectively within a team environment.

● Excellent interpersonal and communication skills that facilitate productive collaboration with colleagues and stakeholders.

● Familiarity with Agile development methodologies and tools that promote efficient project management and teamwork.

Mumbai
10 - 15 yrs
₹8L - ₹15L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+6 more

Experience: minimum 10 years

Location: Mumbai

Salary: negotiable

 

 

Power BI, Tableau, QlikView

 

 

Solution Architect/Technology Lead – Data Analytics

 

Role

Looking for a Business Intelligence Lead (BI Lead) with hands-on experience in BI tools (Tableau, SAP Business Objects, Financial and Accounting modules, Power BI), SAP integration, and database knowledge including one or more of Azure Synapse/Data Factory, SQL Server, Oracle, and the cloud-based DB Snowflake. Good knowledge of AI/ML and Python is also expected.

  • You will be expected to work closely with our business users. Development will follow an Agile methodology based on Scrum (time boxing, daily scrum meetings, retrospectives, etc.) and XP (continuous integration, refactoring, unit testing, etc.) best practices. Candidates must therefore be able to work collaboratively, demonstrate good ownership and leadership, and work well in teams.

Responsibilities:

  • Design, development and support of multiple/hybrid data sources and data visualization frameworks using Power BI, Tableau, SAP Business Objects, etc., together with ETL tools, scripting and Python scripting.
  • Implementing DevOps techniques and practices like Continuous Integration, Continuous Deployment, Test Automation, Build Automation and Test-Driven Development to enable the rapid delivery of working code, utilizing tools like Git.

Requirements

  • 10+ years working as a hands-on developer in Information Technology across Database, ETL and BI (SAP Business Objects, integration with SAP Financial and Accounting modules, Tableau, Power BI) & prior team management experience
  • Tableau/PowerBI integration with SAP and knowledge of SAP modules related to finance is a must
  • 3+ years of hands-on development experience in Data Warehousing and Data Processing
  • 3+ years of Database development experience with a solid understanding of core database concepts and relational database design, SQL, Performance tuning
  • 3+ years of hands-on development experience with Tableau
  • 3+ years of Power BI experience including parameterized reports and publishing it on PowerBI Service
  • Excellent understanding and practical experience delivering under an Agile methodology
  • Ability to work with business users to provide technical support
  • Ability to get involved in all stages of the project lifecycle, including analysis, design, development and testing.

Good-to-have skills:

  • Experience with other visualization and reporting tools, such as SAP Business Objects.

 

Synechron
Posted by Ranjini N
Bengaluru (Bangalore), Hyderabad
6 - 10 yrs
₹2L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
Python
Shell Scripting
+2 more

Position: ETL Developer

Location: Mumbai

Experience level: 4+ years

Required Skills:

* Strong scripting knowledge in Python and Shell (Unix/K-shell)

* Strong relational database skills, especially with DB2/Sybase

* Ability to create high-quality, optimized stored procedures and queries

* Strong knowledge base of relational database performance and tuning such as: proper use of indices, database statistics/reorgs, de-normalization concepts.

* Familiar with lifecycle of a trade and flows of data in an investment banking operation is a plus.

* Experienced in Agile development process

* Java Knowledge is a big plus but not essential

* Experience in delivery of metrics / reporting in an enterprise environment (e.g. demonstrated experience in BI tools such as Business Objects, Tableau, report design & delivery) is a plus

* Experience on ETL processes and tools such as Informatica is a plus. Real time message processing experience is a big plus.

* Good team player; Integrity & ownership

EnterpriseMinds
Posted by Komal S
Remote only
4 - 10 yrs
₹10L - ₹35L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+2 more

Enterprise Minds, with a core focus on engineering products, automation and intelligence, partners with customers on the trajectory towards increasing outcomes, relevance, and growth.

Harnessing the power of Data and the forces that define AI, Machine Learning and Data Science, we believe in institutionalizing go-to-market models and not just explore possibilities. 

We believe in a customer-centric ethic without and a people-centric paradigm within. With a strong sense of community, ownership, and collaboration, our people work in a spirit of co-creation, co-innovation, and co-development to engineer next-generation software products with the help of accelerators.

Through Communities we connect and attract talent that shares skills and expertise. Through Innovation Labs and global design studios we deliver creative solutions. 
We create vertical isolated pods which have a narrow but deep focus. We also create horizontal pods to collaborate and deliver sustainable outcomes.

We follow Agile methodologies to fail fast and deliver scalable, modular solutions. We constantly self-assess and realign to work with each customer in the most impactful manner.

Pre-requisites for the Role 

 

  1. Job ID: EMBD0120PS
  2. Primary skills: GCP Data Engineer, BigQuery, ETL
  3. Secondary skills: Hadoop, Python, Spark
  4. Years of experience: 5-8 years
  5. Location: Remote

 

Budget: Open

Notice period: Immediate

 

 

GCP DATA ENGINEER 

Position description 

  • Designing and implementing software systems 
  • Creating systems for collecting data and for processing that data 
  • Using Extract Transform Load operations (the ETL process) 
  • Creating data architectures that meet the requirements of the business 
  • Researching new methods of obtaining valuable data and improving its quality 
  • Creating structured data solutions using various programming languages and tools 
  • Mining data from multiple areas to construct efficient business models 
  • Collaborating with data analysts, data scientists, and other teams. 

Candidate profile 

  • Bachelor’s or master’s degree in information systems/engineering, computer science and management or related. 
  • 5-8 years professional experience as Big Data Engineer 
  • Proficiency in modelling and maintaining data lakes with PySpark (preferred).
  • Experience with Big Data technologies (e.g., Databricks).
  • Ability to model and optimize workflows in GCP.
  • Experience with streaming analytics services (e.g., Kafka, Grafana).
  • Analytical, innovative and solution-oriented mindset 
  • Teamwork, strong communication and interpersonal skills 
  • Rigor and organizational skills 
  • Fluency in English (spoken and written). 
Mobile Programming LLC
Posted by keerthi varman
Bengaluru (Bangalore)
3 - 8 yrs
₹10L - ₹14L / yr
Oracle SQL Developer
PL/SQL
ETL
Informatica
Data Warehouse (DWH)
+4 more
The role and responsibilities of an Oracle PL/SQL Developer and Database Administrator:

• Working knowledge of XML, JSON, Shell, and other DBMS scripts
• Hands-on experience with Oracle 11g and 12c; working knowledge of Oracle 18c and 19c
• Analysis, design, coding, testing, debugging, and documentation; complete knowledge of the Software Development Life Cycle (SDLC)
• Writing complex queries, stored procedures, functions, and packages
• Knowledge of REST services, UTL functions, DBMS functions, and data integration is required
• Good knowledge of table-level partitions and row locks, and experience in OLTP
• Should be aware of ETL tools, data migration, and data mapping functionalities
• Understand business requirements and transform/design them into business solutions; perform data modelling and implement business rules using Oracle database objects
• Define source-to-target data mapping and data transformation logic as per business need
• Should have worked on materialized view creation and maintenance; experience in performance tuning and impact analysis is required
• Monitoring and optimizing database performance; planning for backup and recovery of database information; maintaining archived data; backing up and restoring databases
• Hands-on experience with SQL Developer
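Source-to-target data mapping, mentioned above, is often expressed as a column-by-column mapping with an optional transform per column. An illustrative sketch in Python (not Oracle PL/SQL, and the column names are invented for the example):

```python
# Each target column maps to a source field plus an optional transform.
MAPPING = {
    "customer_name": ("cust_nm", str.strip),
    "balance_usd":   ("bal_amt", lambda v: round(float(v), 2)),
    "open_date":     ("opn_dt",  None),  # straight copy, no transform
}

def apply_mapping(source_row, mapping=MAPPING):
    """Apply a source-to-target column mapping to one source record."""
    target = {}
    for tgt_col, (src_col, transform) in mapping.items():
        value = source_row[src_col]
        target[tgt_col] = transform(value) if transform else value
    return target

row = {"cust_nm": "  Asha Rao ", "bal_amt": "1024.50", "opn_dt": "2021-07-01"}
print(apply_mapping(row))
```

In an Oracle shop the same logic would typically live in PL/SQL packages or in an ETL tool's mapping layer; the table-driven shape makes the mapping itself reviewable as data.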
zyoin
Posted by RAKESH RANJAN
Bengaluru (Bangalore)
4 - 8 yrs
₹15L - ₹40L / yr
MySQL
MySQL DBA
Shell Scripting
Scripting
• Complete involvement in database requirements, starting from the design phase of every project.
• Deploying required database assets to production (DDL, DML).
• Good understanding of MySQL replication (master-slave, master-master, GTID-based).
• Understanding of MySQL partitioning.
• A good understanding of MySQL logs and configuration.
• Ways to schedule backup and restoration.
• Good understanding of MySQL versions and their features.
• Good understanding of the InnoDB engine.
• Exploring ways to optimize the current environment and lay a good platform for new projects.
• Able to understand and resolve any database-related production outages.
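Scheduling backups, as mentioned above, is commonly done by running mysqldump from cron. An illustrative Python helper that only builds the command string, it does not run it (directory and names are hypothetical; verify the flags against your MySQL version):

```python
from datetime import date

def mysqldump_command(user, database, backup_dir="/var/backups/mysql"):
    """Build a mysqldump command suitable for a nightly cron job.
    --single-transaction takes a consistent snapshot of InnoDB tables
    without blocking writers; --routines/--triggers include stored
    programs in the dump."""
    outfile = f"{backup_dir}/{database}-{date.today():%Y%m%d}.sql"
    return (f"mysqldump -u {user} --single-transaction "
            f"--routines --triggers {database} > {outfile}")

print(mysqldump_command("backup_user", "orders_db"))
```

A cron entry would then invoke this command nightly; restoration is the reverse, feeding the dump file back through the `mysql` client.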
Netconnect Pvt. Ltd.
Posted by Ruchika M
Pune
3 - 5 yrs
₹3L - ₹10L / yr
Perl
Shell Scripting
PL/SQL
Skills required:

• Experienced developer in shell scripting

• Perl scripting

• PL/SQL knowledge is required

• Advanced communication skills are a must

• Ability to learn new applications and technologies