Backend Software Engineer (Data)
Posted by Reshika Mendiratta
7yrs+
₹50L - ₹70L / yr (ESOP available)
Remote only
Skills
Data Structures
Algorithms
Object Oriented Programming (OOPs)
ETL
ETL architecture
Data Warehouse (DWH)
Business Intelligence (BI)
DBT
Airflow
Data architecture

Branch Overview


Imagine a world where every person has improved access to financial services. People could start new businesses, pay for their children’s education, cover their emergency medical bills – the possibilities to improve life are endless. 


Branch is a global technology company revolutionizing financial access for millions of underserved banking customers today across Africa and India. By leveraging the rapid adoption of smartphones, machine learning and other technology, Branch is pioneering new ways to improve access and value for those overlooked by banks. From instant loans to market-leading investment yields, Branch offers a variety of products that help our customers be financially empowered.


Branch’s mission-driven team is led by the co-founders of Kiva.org and one of the earliest product leaders of PayPal. Branch has raised over $100 million from leading Silicon Valley investors, including Andreessen Horowitz (a16z) and Visa. 

 

With over 32 million downloads, Branch is one of the most popular finance apps in the world.

 

Job Overview

Branch launched in India in early 2019 and has seen rapid adoption and growth. In 2020 we started building out a full Engineering team in India to accelerate our success here. This team is working closely with our engineering team (based in the United States, Nigeria, and Kenya) to strengthen the capabilities of our existing product and build out new product lines for the company.


You will work closely with our Product and Data Science teams to design and maintain multiple technologies, including our API backend, credit scoring and underwriting systems, payments integrations, and operations tools. We face numerous interesting technical challenges ranging from maintaining complex financial systems to accessing and processing creative data sources for our algorithmic credit model. 


As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. As an engineering team, we value bottom-up innovation and decentralized decision-making: We believe the best ideas can come from anyone in the company, and we are working hard to create an environment where everyone feels empowered to propose solutions to the challenges we face. We are looking for individuals who thrive in a fast-moving, innovative, and customer-focused setting.


Responsibilities

  • Make significant contributions to Branch’s data platform including data models, transformations, warehousing, and BI systems by bringing in best practices.
  • Build customer facing and internal products and APIs with industry best practices around security and performance in mind.
  • Influence and shape the company’s technical and product roadmap by providing timely and accurate inputs and owning various outcomes.
  • Collaborate with peers in other functional areas (Machine Learning, DevOps, etc.) to identify potential growth areas and systems needed.
  • Guide and mentor junior engineers around you.
  • Scale our systems to ever-growing levels of traffic and handle complexity.
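The skills list names Airflow and DBT; at their core, both run data transformations in dependency order. Below is a minimal standard-library sketch of that idea — the task names and data are invented for illustration, and a real deployment would use Airflow operators or DBT models rather than plain functions:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Shared state standing in for a warehouse; purely illustrative.
RESULTS = {}

def extract_loans():
    RESULTS["raw_loans"] = [{"id": 1, "amount": 5000}, {"id": 2, "amount": 12000}]

def transform_loans():
    # Keep only loans above a (hypothetical) reporting threshold.
    RESULTS["big_loans"] = [r for r in RESULTS["raw_loans"] if r["amount"] >= 10000]

def load_warehouse():
    RESULTS["warehouse_rows"] = len(RESULTS["big_loans"])

# Dependency graph: each task maps to the set of tasks it depends on.
DAG = {
    "extract_loans": set(),
    "transform_loans": {"extract_loans"},
    "load_warehouse": {"transform_loans"},
}
TASKS = {
    "extract_loans": extract_loans,
    "transform_loans": transform_loans,
    "load_warehouse": load_warehouse,
}

# Execute tasks in a valid topological order, as a scheduler would.
for name in TopologicalSorter(DAG).static_order():
    TASKS[name]()

print(RESULTS["warehouse_rows"])  # number of rows loaded
```

Schedulers like Airflow add retries, backfills, and distributed execution on top of exactly this dependency-ordered core.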


Qualifications

  • You have strong experience (8+ years) designing, coding, and shipping data and backend software for web-based or mobile products.
  • Experience coordinating and collaborating with various business stakeholders and company leadership on critical functional decisions and technical roadmap.
  • You have strong knowledge of software development fundamentals, including relevant background in computer science fundamentals, distributed systems, data storage and processing, and agile development methodologies.
  • Have experience designing maintainable and scalable data architecture for ETL and BI purposes.
  • You are able to utilize your knowledge and expertise to code and ship quality products in a timely manner.
  • You are pragmatic and combine a strong understanding of technology and product needs to arrive at the best solution for a given problem.
  • You are highly entrepreneurial and thrive in taking ownership of your own impact. You take the initiative to solve problems before they arise.
  • You are an excellent collaborator & communicator. You know that startups are a team sport. You listen to others, aren’t afraid to speak your mind and always try to ask the right questions. 
  • You are excited by the prospect of working in a distributed team and company, working with teammates from all over the world.

Benefits of Joining

  • Mission-driven, fast-paced and entrepreneurial environment
  • Competitive salary and equity package
  • A collaborative and flat company culture
  • Remote first, with the option to work in-person occasionally
  • Fully-paid Group Medical Insurance and Personal Accidental Insurance
  • Unlimited paid time off including personal leave, bereavement leave, sick leave
  • Fully paid parental leave - 6 months maternity leave and 3 months paternity leave
  • Monthly WFH stipend alongside a one time home office set-up budget
  • $500 Annual professional development budget 
  • Discretionary trips to our offices across the globe, with global travel medical insurance 
  • Team meals and social events- Virtual and In-person

Branch International is an Equal Opportunity Employer. The company does not and will not discriminate in employment on any basis prohibited by applicable law. We're looking for more than just qualifications, so if you're unsure whether you meet the criteria, please do not hesitate to apply!

 


About Branch International

Founded: 2015
Stage: Raised funding
About

Branch delivers world-class financial services to the mobile generation. With offices in San Francisco, Lagos, Nairobi, Mumbai and Mexico City, Branch is a for-profit socially conscious company that uses the power of data science to reduce the cost of delivering financial services in emerging markets. We believe that everyone, everywhere deserves fair financial access. The rapid spread of smartphones presents an opportunity for the world's emerging middle class to access banking options and achieve financial flexibility.


Branch's mission-driven team is led by founder and former CEO of Kiva.org. The company presents a rich opportunity for our team members to drive meaningful growth in rapidly evolving and changing markets. Most recently, Branch announced its Series C and has garnered more than $100M in funding with investments from leading Silicon Valley firms.


Our mission is to deliver world-class financial services to the mobile generation.


We're interested in the rising stars with a worldly perspective, a deep interest in financial technology, and an appetite for growth.


Why Branch?


The future looks bright:

We've built a profitable business model, secured multiple rounds of funding, and already disbursed millions of loans. Now it's time to scale exponentially.


Passion is rewarded:

We're not just building a new gadget or solving for far-out futures – we're designing financial services that impact real people, communities and economies.


Your work matters:

Our teams are lean and agile which means every person not only contributes but can leave a lasting mark on the product, the business, and the lives of our global customers.


Challenges await:

The work we do at Branch isn't superficial or easily tackled – it's a real utility for people and with that comes tough challenges. That's why we need the best minds on the forefront of our mission.


More than just a paycheck

While we believe interesting work is the best reward, we also like to incentivize our employees with unlimited time off, discretionary travel to global offices, equity, and more.


Get out of your bubble

We don't solve problems for people, we solve problems with people. As part of Branch, you'll get to work in exciting places and meet people from all over the globe. And yes, you'll want to have a passport handy.


Candid answers by the company

What does the company do?
Branch makes it easy for people to access a loan, anytime, anywhere. People can complete the application in seconds and receive the loan straight into their bank account. Visit our website to learn more.

Experience the ease of accessing instant credit within minutes. Say goodbye to tedious paperwork and physical documentation – our 100% online system ensures a hassle-free experience.

Similar jobs

Globe Teleservices
Posted by Deepshikha Thapar
Bengaluru (Bangalore)
5 - 10 yrs
₹20L - ₹25L / yr
ETL
Python
Informatica
Talend



  • Good experience in the Extraction, Transformation, and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager) and PowerConnect as the ETL tool on Oracle and SQL Server databases.
  • Knowledge of Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, combined with project scoping, analysis, requirements gathering, data modeling, ETL design, development, system testing, implementation, and production support.
  • Strong experience in dimensional modeling using Star and Snowflake schemas, and in identifying facts and dimensions.
  • Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Developed mapping parameters and variables to support SQL override.
  • Created applets for reuse across different mappings.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
  • Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Modified existing mappings to accommodate new business requirements.
  • Involved in performance tuning at the source, target, mapping, session, and system levels.
  • Prepared migration documents to move mappings from the development repository to testing and then to production.
  • Extensive experience in developing stored procedures, functions, views, and triggers, and in writing complex SQL queries using PL/SQL.
  • Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica/Talend sessions, and performance-tuning mappings and sessions.
  • Experience in all phases of data warehouse development, from requirements gathering through coding, unit testing, and documentation.
  • Extensive experience in writing UNIX shell scripts and automating ETL processes with UNIX shell scripting.
  • Experience in using automation and scheduling tools such as Control-M.
  • Hands-on experience across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, systems integration, and user acceptance testing.
  • Build, operate, monitor, and troubleshoot Hadoop infrastructure.
  • Develop tools and libraries, and maintain processes for other engineers to access data and write MapReduce programs.
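The Type 1 and Type 2 Slowly Changing Dimension mappings mentioned above differ in whether history is kept: Type 1 overwrites the dimension row in place, while Type 2 expires the current row and inserts a new one. A minimal Type 2 sketch, using SQLite in place of Informatica — the table and column names are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        is_current  INTEGER,   -- 1 for the active row, 0 for history
        valid_from  TEXT,
        valid_to    TEXT
    )
""")
conn.execute("INSERT INTO dim_customer VALUES (42, 'Mumbai', 1, '2020-01-01', NULL)")

def scd2_update(customer_id, new_city, as_of):
    """Type 2: expire the current row, then insert a new current row."""
    conn.execute(
        "UPDATE dim_customer SET is_current = 0, valid_to = ? "
        "WHERE customer_id = ? AND is_current = 1",
        (as_of, customer_id),
    )
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, 1, ?, NULL)",
        (customer_id, new_city, as_of),
    )

scd2_update(42, "Pune", "2021-06-01")
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer "
    "WHERE customer_id = 42 ORDER BY valid_from"
).fetchall()
print(rows)  # history preserved: [('Mumbai', 0), ('Pune', 1)]
```

A Type 1 update would instead be a single `UPDATE ... SET city = ?`, discarding the old value.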

Impact Guru
Posted by Fahad Kazi
Mumbai
2 - 6 yrs
₹3L - ₹9L / yr
Data Analysis
Business Analysis
Business Intelligence (BI)
Tableau
SQL
+6 more
Experience: 2 to 5 years
Location: Andheri (Mumbai)

Job Responsibilities:

  • Excellent problem-solving and analytical skills: the ability to develop hypotheses, understand and interpret data within the context of the product/business, solve problems, and distill data into actionable recommendations.
  • Strong communication skills, with the ability to confidently work with cross-functional teams across the globe and to present information to all levels of the organization.
  • Intellectual and analytical curiosity: the initiative to dig into the why, what, and how.
  • Strong number-crunching and quantitative skills.
  • Advanced knowledge of MS Excel and PowerPoint.
  • Good hands-on SQL skills.
  • Experience with Google Analytics, Optimize, Tag Manager, and other Google Suite tools.
  • Understanding of business analytics tools and statistical programming languages (R, SAS, SPSS, Tableau) is a plus.
  • Inherent interest in e-commerce and marketplace technology platforms, and broadly in the consumer Internet and mobile space.
  • 1+ years of previous experience in a product analytics role at a product company.
  • Strong understanding of building and interpreting product funnels.
Bengaluru (Bangalore)
4 - 7 yrs
₹20L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+5 more
Roles & Responsibilities
What will you do?
  • Deliver plugins for our Python-based ETL pipelines
  • Deliver Python microservices for provisioning and managing cloud infrastructure
  • Implement algorithms to analyse large data sets
  • Draft design documents that translate requirements into code
  • Effectively manage challenges associated with handling large volumes of data working to tight deadlines
  • Manage expectations with internal stakeholders and context-switch in a fast-paced environment
  • Thrive in an environment that uses AWS and Elasticsearch extensively
  • Keep abreast of technology and contribute to the engineering strategy
  • Champion best development practices and provide mentorship to others
What are we looking for?
  • First and foremost you are a Python developer, experienced with the Python Data stack
  • You love and care about data
  • Your code is an artistic manifesto reflecting how elegant you are in what you do
  • You feel sparks of joy when a new abstraction or pattern arises from your code
  • You follow the DRY (Don't Repeat Yourself) and KISS (Keep It Short and Simple) principles
  • You are a continuous learner
  • You have a natural willingness to automate tasks
  • You have critical thinking and an eye for detail
  • Excellent ability and experience of working to tight deadlines
  • Sharp analytical and problem-solving skills
  • Strong sense of ownership and accountability for your work and delivery
  • Excellent written and oral communication skills
  • Mature collaboration and mentoring abilities
  • We are keen to know your digital footprint (community talks, blog posts, certifications, courses you have participated in or you are keen to, your personal projects as well as any kind of contributions to the open-source communities if any)
Nice to have:
  • Delivering complex software, ideally in a FinTech setting
  • Experience with CI/CD tools such as Jenkins, CircleCI
  • Experience with code versioning (git / mercurial / subversion)
Blend360
Posted by VasimAkram Shaik
Hyderabad
5 - 13 yrs
Best in industry
Tableau
SQL
Business Intelligence (BI)
Spotfire
Qlikview
+3 more

Key Responsibilities:

  • Design, develop, support, and maintain automated business intelligence products in Tableau.
  • Rapidly design, develop, and implement reporting applications that embed KPI metrics and actionable insights into the operational, tactical, and strategic activities of key business functions.
  • Communicate clearly and effectively with users and other tech teams.
  • Identify business requirements, design processes that leverage/adapt the business logic, and regularly communicate with business stakeholders to ensure delivery meets business needs.
  • Design, code, and review business intelligence projects developed in Tableau and Power BI.
  • Work as a team member and lead teams to implement BI solutions for our customers.
  • Develop dashboards and data sources that meet and exceed customer requirements.
  • Partner with business information architects to understand the business use cases that support and fulfill the business and data strategy.
  • Partner with Product Owners and cross-functional teams in a collaborative and agile environment.
  • Provide best practices for data visualization and Tableau implementations.
  • Work with solution architects on RFI/RFP response solution design, customer presentations, demonstrations, POCs, etc., to support growth.

Desired Candidate Profile:

  • 6-10 years of programming experience and demonstrated proficiency with Tableau; Tableau certifications are highly preferred.
  • Ability to architect and scope complex projects.
  • Strong understanding of SQL and a basic understanding of programming languages; experience with SAQL, SOQL, Python, or R is a plus.
  • Applied experience with Agile development processes (Scrum).
  • Ability to independently learn new technologies.
  • Ability to show initiative and work independently with minimal direction.
  • Presentation skills: a demonstrated ability to simplify complex situations and ideas and distill them into compelling, effective written and oral presentations.
  • Quick learner: able to rapidly comprehend new functional and technical areas and apply detailed, critical thinking to customer solutions.

Education:

  • Bachelor's/master's degree in Computer Science, Computer Engineering, or quantitative studies such as Statistics, Math, Operations Research, Economics, or Advanced Analytics.

DataMetica
Posted by Sayali Kachi
Pune, Hyderabad
4 - 10 yrs
₹5L - ₹20L / yr
ETL
SQL
Data engineering
Analytics
PL/SQL
+3 more

We at Datametica Solutions Private Limited are looking for SQL Engineers who have a passion for the cloud, with knowledge of different on-premise and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.

Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.

Job Description

Experience: 4-10 years

Location: Pune

 


Mandatory Skills - 

  • Strong in ETL/SQL development
  • Strong Data Warehousing skills
  • Hands-on experience working with Unix/Linux
  • Development experience in Enterprise Data warehouse projects
  • Good to have experience working with Python, shell scripting

Opportunities -

  • Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka
  • Would get a chance to be part of enterprise-grade implementations of Cloud and Big Data systems
  • Will play an active role in setting up a modern data platform based on Cloud and Big Data
  • Would be part of teams with rich experience in various aspects of distributed systems and computing


 

About Us!

A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data, workloads, ETL, and analytics to the cloud, leveraging automation.

 

We have expertise in transforming legacy systems such as Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum, along with ETL tools like Informatica, DataStage, Ab Initio, and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes, and cloud optimization.

 

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

 

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

 

Why join us!

Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in our success.

 

 

Benefits we Provide!

Working with highly technical, passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy

 

Check out more about us on our website below!

www.datametica.com

Mumbai
5 - 7 yrs
₹20L - ₹25L / yr
ETL
Talend
OLAP
Data governance
SQL
+8 more
  • Key responsibility is to design and develop a data pipeline, including the architecture, prototyping, and development of data extraction, transformation/processing, cleansing/standardizing, and loading into the Data Warehouse at real-time or near-real-time frequency. Source data can be in structured, semi-structured, and/or unstructured formats.
  • Provide technical expertise to design efficient data ingestion solutions that consolidate data from RDBMSs, APIs, messaging queues, weblogs, images, audio, documents, etc. of enterprise applications, SaaS applications, and external third-party sites or APIs, through ETL/ELT, API integrations, Change Data Capture, Robotic Process Automation, custom Python/Java coding, etc.
  • Develop complex data transformations using Talend (Big Data edition), Python/Java transformations in Talend, SQL/Python/Java UDxs, AWS S3, etc., to load an OLAP Data Warehouse in structured/semi-structured form.
  • Develop data models and create transformation logic to populate the models for faster data consumption with simple SQL.
  • Implement automated audit and quality-assurance checks in the data pipeline.
  • Document and maintain data lineage to enable data governance.
  • Coordinate with BIU, IT, and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading into downstream systems, NoSQL databases, etc.

Requirements

  • Programming experience using Python/Java to create functions/UDxs
  • Extensive technical experience with SQL on RDBMSs (Oracle, MySQL, PostgreSQL, etc.), including code optimization techniques
  • Strong ETL/ELT skill set using Talend Big Data Edition; experience with Talend CDC and MDM functionality is an advantage
  • Experience and expertise in implementing complex data pipelines, including semi-structured and unstructured data processing
  • Expertise in designing efficient data ingestion solutions that consolidate data from RDBMSs, APIs, messaging queues, weblogs, images, audio, documents, etc. of enterprise applications, SaaS applications, and external third-party sites or APIs, through ETL/ELT, API integrations, Change Data Capture, Robotic Process Automation, custom Python/Java coding, etc.
  • Good understanding of and working experience with OLAP data warehousing solutions (Redshift, Synapse, Snowflake, Teradata, Vertica, etc.) and cloud-native data lakes (S3, ADLS, BigQuery, etc.)
  • Familiarity with the AWS tool stack for storage and processing; able to recommend the right tools/solutions for a given technical problem
  • Good knowledge of database performance and tuning, troubleshooting, and query optimization
  • Good analytical skills, with the ability to synthesize data to design and deliver meaningful information
  • Good knowledge of the design, development, and performance tuning of 3NF/flat/hybrid data models
  • Know-how of any NoSQL DB (DynamoDB, MongoDB, CosmosDB, etc.) is an advantage
  • Ability to understand business functionality, processes, and flows
  • Good combination of technical and interpersonal skills, with strong written and verbal communication; detail-oriented, with the ability to work independently

Functional knowledge

  • Data Governance & Quality Assurance
  • Distributed computing
  • Linux
  • Data structures and algorithms
  • Unstructured Data Processing
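The extract → cleanse/standardize → load stages described above can be sketched in plain Python. This is only an illustration of the pattern, with in-memory stand-ins for the Talend jobs and the warehouse; the record fields and the rejection rule are invented:

```python
import json
from datetime import datetime, timezone

def extract(raw_records):
    """Extract: parse semi-structured (JSON string) source records."""
    return [json.loads(r) for r in raw_records]

def cleanse(records):
    """Cleanse/standardize: reject incomplete rows, normalize types and casing."""
    out = []
    for r in records:
        if "id" not in r or r.get("amount") is None:
            continue  # quality-assurance check: drop incomplete rows
        out.append({
            "id": int(r["id"]),
            "amount": float(r["amount"]),
            "country": str(r.get("country", "unknown")).upper(),
        })
    return out

def load(records, warehouse):
    """Load: append rows with an audit timestamp for lineage."""
    loaded_at = datetime.now(timezone.utc).isoformat()
    for r in records:
        warehouse.append({**r, "loaded_at": loaded_at})
    return len(records)

warehouse = []
raw = ['{"id": "1", "amount": "250.5", "country": "in"}',
       '{"id": "2", "amount": null}']
n = load(cleanse(extract(raw)), warehouse)
print(n, warehouse[0]["country"])  # 1 IN
```

In a real pipeline each stage would be a Talend job or SQL/Python UDx, and the audit timestamp would feed the lineage documentation the role calls for.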
Mobile Programming India Pvt Ltd
Posted by Inderjit Kaur
Bengaluru (Bangalore), Chennai, Pune, Gurugram
4 - 8 yrs
₹9L - ₹14L / yr
ETL
Data Warehouse (DWH)
Data engineering
Data modeling
BRIEF JOB RESPONSIBILITIES:

  • Responsible for designing, deploying, and maintaining an analytics environment that processes data at scale
  • Contribute to the design, configuration, deployment, and documentation of components that manage data ingestion, real-time streaming, batch processing, data extraction, transformation, enrichment, and loading of data into a variety of cloud data platforms, including AWS and Microsoft Azure
  • Identify gaps in the existing platform and improve its quality, robustness, maintainability, and speed
  • Evaluate new and upcoming big data solutions and recommend adoptions that extend our platform to meet advanced analytics use cases, such as predictive modeling and recommendation engines
  • Data modeling and data warehousing at cloud scale using cloud-native solutions
  • Perform development, QA, and DevOps roles as needed to ensure end-to-end responsibility for solutions

COMPETENCIES

  • Experience building, maintaining, and improving data models, processing pipelines, and routing in large-scale environments
  • Fluency in common query languages, API development, data transformation, and integration of data streams
  • Strong experience with large-dataset platforms (e.g., Amazon EMR, Amazon Redshift, AWS Lambda and Fargate, Amazon Athena, Azure SQL Database, Azure Database for PostgreSQL, Azure Cosmos DB, Databricks)
  • Fluency in multiple programming languages, such as Python, shell scripting, SQL, Java, or similar languages and tools appropriate for large-scale data processing
  • Experience acquiring data from varied sources such as APIs, data queues, flat files, and remote databases
  • Understanding of traditional data warehouse components (e.g., ETL, business intelligence tools)
  • Creativity to go beyond current tools to deliver the best solution to the problem
GitHub
Posted by Nataliia Mediana
Remote only
3 - 8 yrs
$24K - $60K / yr
ETL
PySpark
Data engineering
Data engineer
athena
+9 more
We are a nascent quant hedge fund; we need to stage financial data and make it easy to run and re-run various preprocessing and ML jobs on the data.
- We are looking for an experienced data engineer to join our team.
- The preprocessing involves ETL tasks using PySpark and AWS Glue, staging data in Parquet format on S3, and querying it with Athena.

To succeed in this data engineering position, you should care about well-documented, testable code and data integrity. We have DevOps engineers who can help with AWS permissions.
We would like to build up a consistent data lake with staged, ready-to-use data, and to build up various scripts that will serve as blueprints for additional data ingestion and transforms.

If you enjoy setting up something which many others will rely on, and have the relevant ETL expertise, we'd like to work with you.

Responsibilities
- Analyze and organize raw data
- Build data pipelines
- Prepare data for predictive modeling
- Explore ways to enhance data quality and reliability
- Potentially, collaborate with data scientists to support various experiments

Requirements
- Previous experience as a data engineer with the above technologies
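The staging pattern this posting describes (partitioned, ready-to-use data that preprocessing jobs can be re-run against) can be illustrated without Spark or S3. Below is a standard-library sketch that writes date-partitioned JSON-lines files, mirroring the `dt=YYYY-MM-DD` partition layout a Glue/PySpark job would typically write as Parquet; the paths and tick fields are invented:

```python
import json
from pathlib import Path
from tempfile import mkdtemp

def stage(records, lake_root):
    """Stage records into date-partitioned paths (dt=YYYY-MM-DD/part-000.jsonl),
    echoing the s3://bucket/table/dt=.../part.parquet layout of a data lake.
    Each partition is rewritten in full, so re-runs are idempotent."""
    root = Path(lake_root)
    by_date = {}
    for rec in records:
        by_date.setdefault(rec["dt"], []).append(rec)
    written = []
    for dt, recs in by_date.items():
        part_dir = root / f"dt={dt}"
        part_dir.mkdir(parents=True, exist_ok=True)
        path = part_dir / "part-000.jsonl"
        path.write_text("\n".join(json.dumps(r) for r in recs))
        written.append(path.relative_to(root).as_posix())
    return sorted(written)

lake = mkdtemp()
ticks = [
    {"dt": "2023-01-02", "symbol": "AAPL", "price": 125.1},
    {"dt": "2023-01-03", "symbol": "AAPL", "price": 126.4},
    {"dt": "2023-01-02", "symbol": "MSFT", "price": 239.6},
]
parts = stage(ticks, lake)
print(parts)  # ['dt=2023-01-02/part-000.jsonl', 'dt=2023-01-03/part-000.jsonl']
```

Because each run rewrites its partitions in full, re-running a preprocessing job leaves the lake in the same state, which is what makes staged data safe to "run and re-run" jobs against.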
KAUSHIK BAKERY
Posted by Neha Sharma
NCR (Delhi | Gurgaon | Noida)
0 - 7 yrs
₹2L - ₹5L / yr
Data Science
Business Analysis
Business Intelligence (BI)
Data Analytics
fintech
Agency job
via Talentojcom by Raksha Pant
Remote only
2 - 6 yrs
₹9L - ₹30L / yr
ETL
Druid Database
Java
Scala
SQL
+2 more
  • Education in a science, technology, engineering, or mathematics discipline, preferably a bachelor's degree or equivalent experience
  • Knowledge of database fundamentals and fluency in advanced SQL, including concepts such as windowing functions
  • Knowledge of popular scripting languages for data processing, such as Python, as well as familiarity with common frameworks such as Pandas
  • Experience building streaming ETL pipelines with tools such as Apache Flink, Apache Beam, Google Cloud Dataflow, DBT, and equivalents
  • Experience building batch ETL pipelines with tools such as Apache Airflow, Spark, DBT, or custom scripts
  • Experience working with messaging systems such as Apache Kafka (and hosted equivalents such as Amazon MSK) and Apache Pulsar
  • Familiarity with BI applications such as Tableau, Looker, or Superset
  • Hands-on coding experience in Java or Scala