Data Analyst, Business Intelligence

at HighLevel LLC

Posted by Shivam Tiwari

Remote only  •  3 - 7 yrs  •  Best in industry  •  Full time
Skills
Data Analytics
Business Intelligence (BI)
Data Analyst
PowerBI
Tableau
Data Analysis
Data Visualization
Microsoft Excel
SQL
Storytelling
Reporting
Google Sheets

About HighLevel

Founded in 2018, HighLevel Inc. (www.gohighlevel.com) is a two-tier B2B SaaS platform focused on marketing agencies. We aspire to be a one-stop shop for marketing agencies, enabling them to serve their clients with ease using the industry's best cutting-edge tools. Our clientele includes digital marketing agencies, ads agencies, SEO agencies, call center/sales agencies and freelancers. We operate across niches like real estate, dental & medical, local businesses, e-commerce, professional services and field services.


Our platform comprises various product areas including CRM, funnel builder, website builder, forms & surveys, WordPress hosting, email marketing, telephony, reviews management, omni-box communications, social media, and invoicing & payments, to name a few. We currently serve over 15,000 agencies and 120,000 small & medium businesses with a 200+ strong team that works entirely remotely across 15 nations.


We encourage you to check out our YouTube channel to learn more about our platform: https://www.youtube.com/channel/UCXFiV4qDX5ipE-DQcsm1j4g


Why should you join HighLevel?

HighLevel is an exciting place to work because of the passionate, driven team we have. At HighLevel:

  • It’s never somebody else’s job
  • We are passionately focused on adding value for our users
  • We deliver fast using lean principles, going to market in weeks instead of quarters
  • A good idea always gets tested
  • We take care of our team so our team can take care of our users
  • We embrace that improvement is constant and iterative
  • You will learn how to scale a B2B SaaS startup and build relevant, impactful products for customers

About the role

HighLevel Inc is looking for a Data Analyst (Business Intelligence) who will enable other teams like Product, Marketing, Services & Leadership to make data-driven business decisions. You will be responsible for coordinating instrumentation practices with the product, sales, marketing, customer success & user education teams and will own our data strategy end to end. This encompasses revenue analytics collected in Stripe and ChartMogul and behavioural data collected in Pendo & GA. You will analyze and identify key correlations, data legitimacy, business opportunities and data maintenance needs in conjunction with other leaders in the organization.


This is a highly cross-functional role that will require you to work closely with product managers, sales, marketing, customer success & support teams to build a comprehensive strategy of instrumenting, managing, analyzing and reporting on the data we collect.


Type - Full Time (Remote in India)


Your Responsibilities

  1. Build a data strategy for HighLevel which includes how we collect data, how it is stored, how it is analyzed and how insights are consumed by various teams
  2. Build strategies to leverage tools like Stripe Sigma, ChartMogul & Pendo to better understand the revenue levers in our model
  3. Own the data initiatives at HighLevel and work closely with sales, marketing, branding & content creation teams to build data competency
  4. Create buy-in across verticals and mobilize different teams to achieve a common outcome
  5. Plan, design and vet world-class data practices at HighLevel

Your Core Skills

  1. You love data! You trust only in God; everyone else must bring data
  2. You have demonstrated the ability to guide business decisions using your data findings
  3. You like presenting complex scenarios as stories that can be understood by non-technical stakeholders and customers
  4. You have strong data sanitization, segmentation and analytical skills
  5. You have working knowledge of SQL and are comfortable with Excel & Google Sheets
  6. You have experience working with revenue analytics and platform analytics tools
  7. You can build visual reports to guide other teams in their functioning
  8. You easily gain the trust of various stakeholders while working in a flat, cross-functional environment and can lead without authority
  9. You strive to educate other teams to be data-driven and understand the value of data-driven decision making
  10. You are comfortable dealing with abstract, high-level problems and can break them down into bite-sized executable tasks
  11. You have high standards of personal accountability and a positive outlook
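As a hypothetical illustration of the SQL comfort described above (the table, plans and figures here are invented for the sketch, not HighLevel's actual schema), a basic revenue-segmentation query might look like this, run via Python's built-in sqlite3:

```python
import sqlite3

# Invented subscriptions table: customer, plan tier, monthly recurring revenue.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE subscriptions (customer TEXT, plan TEXT, mrr REAL)")
conn.executemany(
    "INSERT INTO subscriptions VALUES (?, ?, ?)",
    [("a", "agency", 297.0), ("b", "agency", 297.0), ("c", "starter", 97.0)],
)

# Revenue per plan, largest first -- a typical segmentation query.
rows = conn.execute(
    "SELECT plan, COUNT(*) AS customers, SUM(mrr) AS total_mrr "
    "FROM subscriptions GROUP BY plan ORDER BY total_mrr DESC"
).fetchall()
print(rows)  # [('agency', 2, 594.0), ('starter', 1, 97.0)]
```

The same GROUP BY/aggregate pattern transfers directly to Stripe Sigma, which exposes billing data through a SQL interface.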

Additional Skills

  1. Experience in a B2B SaaS or agency environment

About HighLevel LLC

The fastest-growing all-in-one platform for SMBs and digital marketing agencies. CRM, email, 2-way SMS, phone system, Facebook, Instagram, WhatsApp, email marketing, social media posting, websites, funnel builder, WordPress hosting & more!


We have a very strong and independent team. We value tinkerers and people with an entrepreneurial spirit. We want people to come to work and explore their curiosity every day. Our growth offers a unique opportunity for the right individual to scale and build world-class products.


Some of the perks we offer

  • 100% Remote
  • Uncapped Leave Policy
  • WFH Setup
  • Champion Big Problems
Founded: 2018  •  Type: Product  •  Size: 100-500 employees  •  Stage: Profitable

Similar jobs

Big Data Engineer

at Propellor.ai

Founded 2016  •  Products & Services  •  20-100 employees  •  Raised funding
Python
SQL
Spark
Hadoop
Big Data
Data engineering
PySpark
Remote only  •  1 - 4 yrs  •  ₹5L - ₹15L / yr

Big Data Engineer/Data Engineer


What we are solving
Welcome to today’s business data world where:
• Unification of all customer data into one platform is a challenge

• Extraction is expensive
• Business users do not have the time/skill to write queries
• High dependency on tech team for written queries

These facts may look scary but there are solutions with real-time self-serve analytics:
• Fully automated data integration from any kind of a data source into a universal schema
• Analytics database that streamlines data indexing, query and analysis into a single platform.
• Start generating value from Day 1 through deep dives, root cause analysis and micro segmentation

At Propellor.ai, this is what we do:
• We help our clients reduce effort and increase effectiveness quickly
• By clearly defining the scope of projects
• By using dependable, scalable, future-proof technology solutions like Big Data and cloud platforms
• By engaging with data scientists and data engineers to provide end-to-end solutions, leading to the industrialisation of data science model development and deployment

What we have achieved so far
Since we started in 2016,
• We have worked across 9 countries with 25+ global brands and 75+ projects
• We have 50+ clients, 100+ Data Sources and 20TB+ data processed daily

Work culture at Propellor.ai
We are a small, remote team that believes in
• Working with a few, but only the highest-quality, team members who want to become the very best in their fields
• With each member's belief and faith in what we are solving, we collectively see the big picture
• No hierarchy: we believe in reaching the decision maker without any hesitation so that our actions can have fruitful and aligned outcomes
• Each one is a CEO of their domain. So, the criterion behind every choice is that our employees and clients can succeed together!

To read more about us, click here: https://bit.ly/3idXzs0

About the role
We are building an exceptional team of data engineers who are passionate developers and want to push the boundaries of solving complex business problems using the latest tech stack. As a Big Data Engineer, you will work with various technology and business teams to deliver our data engineering offerings to our clients across the globe.

Role Description

• The role would involve big data pre-processing & reporting workflows including collecting, parsing, managing, analysing, and visualizing large sets of data to turn information into business insights
• Develop the software and systems needed for end-to-end execution on large projects
• Work across all phases of SDLC, and use Software Engineering principles to build scalable solutions
• Build the knowledge base required to deliver increasingly complex technology projects
• The role would also involve testing various machine learning models on Big Data and deploying learned models for ongoing scoring and prediction.

Education & Experience
• B.Tech. or equivalent degree in CS/CE/IT/ECE/EEE
• 3+ years of experience designing technological solutions to complex data problems, developing & testing modular, reusable, efficient and scalable code to implement those solutions.

Must have (hands-on) experience
• Python and SQL expertise
• Distributed computing frameworks (Hadoop ecosystem & Spark components)
• Proficiency in a cloud computing platform (AWS/Azure/GCP); GCP experience (BigQuery/Bigtable, Pub/Sub, Dataflow, App Engine) preferred
• Linux environment, SQL and shell scripting

Desirable
• Statistical or machine learning DSL such as R
• Distributed and low-latency (streaming) application architecture
• Row-store distributed DBMSs such as Cassandra, CouchDB, MongoDB, etc.
• Familiarity with API design

Hiring Process:
1. One phone screening round to gauge your interest and knowledge of fundamentals
2. An assignment to test your skills and ability to come up with solutions in a certain time
3. Interview 1 with our Data Engineer lead
4. Final Interview with our Data Engineer Lead and the Business Teams

Immediate joiners preferred

Job posted by
Kajal Jain

Data Analyst

at Vahak

Founded 2016  •  Product  •  20-100 employees  •  Raised funding
Data Analytics
Data Analyst
MS-Excel
SQL
Tableau
Python
PowerBI
Bengaluru (Bangalore)  •  4 - 10 yrs  •  ₹10L - ₹15L / yr

Who Are We?

Vahak (https://www.vahak.in) is India’s largest & most trusted online transport marketplace & directory for road transport businesses and individual commercial vehicle (Trucks, Trailers, Containers, Hyva, LCVs) owners for online truck and load booking, transport business branding and transport business network expansion. Lorry owners can find intercity and intracity loads from all over India and connect with other businesses to find trusted transporters and best deals in the Indian logistics services market. With the Vahak app, users can book loads and lorries from a live transport marketplace with over 5 Lakh + Transporters and Lorry owners in over 10,000+ locations for daily transport requirements.

 

Vahak has raised a capital of $5 Million in a “Pre Series A” round from RTP Global along with participation from Luxor Capital and Leo Capital. The other marquee angel investors include Kunal Shah, Founder and CEO, CRED; Jitendra Gupta, Founder and CEO, Jupiter; Vidit Aatrey and Sanjeev Barnwal, Co-founders, Meesho; Mohd Farid, Co-founder, Sharechat; Amrish Rau, CEO, Pine Labs; Harsimarbir Singh, Co-founder, Pristyn Care; Rohit and Kunal Bahl, Co-founders, Snapdeal; and Ravish Naresh, Co-founder and CEO, Khatabook.


Responsibilities for Data Analyst:

  • Undertake preprocessing of structured and unstructured data
  • Propose solutions and strategies to business challenges
  • Present information using data visualization techniques
  • Identify valuable data sources and automate collection processes
  • Analyze large amounts of information to discover trends and patterns
  • Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques and business strategies.
  • Proactively analyze data to answer key questions from stakeholders or out of self-initiated curiosity with an eye for what drives business performance.

Qualifications for Data Analyst

  • Experience using business intelligence tools (e.g. Tableau, Power BI – not mandatory)
  • Strong SQL or Excel skills with the ability to learn other analytic tools
  • Conceptual understanding of various modelling techniques, pros and cons of each technique
  • Strong problem solving skills with an emphasis on product development.
  • Experience in programming, advanced computing, developing algorithms and predictive modeling
  • Experience using statistical computer languages (R, Python, SQL, etc.) to manipulate data and draw insights from large data sets.
  • Advantage - Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
  • Demonstrated experience applying data analysis methods to real-world data problems
Job posted by
Vahak Talent

Data Analyst

at Extramarks Education India Pvt Ltd

Founded 2007  •  Product  •  1000-5000 employees  •  Profitable
Tableau
PowerBI
Data Analytics
SQL
Python
Noida, Delhi, Gurugram, Ghaziabad, Faridabad  •  3 - 5 yrs  •  ₹8L - ₹10L / yr

Required Experience

· 3+ years of relevant technical experience in a data analyst role

· Intermediate / expert skills with SQL and basic statistics

· Experience in advanced SQL

· Python programming- Added advantage

· Strong problem solving and structuring skills

· Automation in connecting various sources to the data and representing it through various dashboards

· Excellent with numbers; able to communicate data points through various reports/templates

· Ability to communicate effectively internally and outside Data Analytics team

· Proactively take up work responsibilities and ad-hoc requests as and when needed

· Ability and desire to take ownership of and initiative for analysis; from requirements clarification to deliverable

· Strong technical communication skills; both written and verbal

· Ability to understand and articulate the "big picture" and simplify complex ideas

· Ability to identify and learn applicable new techniques independently as needed

· Must have worked with various Databases (Relational and Non-Relational) and ETL processes

· Must have experience handling large volumes of data and adhering to optimization and performance standards

· Should have the ability to analyse and provide relationship views of the data from different angles

· Must have excellent Communication skills (written and oral).

· Knowing Data Science is an added advantage

Required Skills

MySQL, Advanced Excel, Tableau, reporting and dashboards, MS Office, VBA, analytical skills

Preferred Experience

· Strong understanding of relational databases such as MySQL

· Prior experience working remotely full-time

· Prior experience working in advanced SQL

· Experience with one or more BI tools, such as Superset, Tableau etc.

· High level of logical and mathematical ability in Problem Solving

Job posted by
Prachi Sharma

Data Scientist

at Information Solution Provider Company

Agency job
via Jobdost
SQL
Hadoop
Spark
Machine Learning (ML)
Data Science
Algorithms
Python
Big Data
Delhi, Gurugram, Noida, Ghaziabad, Faridabad  •  3 - 7 yrs  •  ₹10L - ₹15L / yr

Job Description:

The data science team is responsible for solving business problems with complex data. Data complexity can be characterized in terms of volume, dimensionality and multiple touchpoints/sources. We understand the data, ask fundamental, first-principles questions, and apply our analytical and machine learning skills to solve the problem in the best way possible.

 

Our ideal candidate

The role would be a client facing one, hence good communication skills are a must. 

The candidate should have the ability to communicate complex models and analysis in a clear and precise manner. 

 

The candidate would be responsible for:

  • Comprehending business problems properly - what to predict, how to build DV, what value addition he/she is bringing to the client, etc.
  • Understanding and analyzing large, complex, multi-dimensional datasets and build features relevant for business
  • Understanding the math behind algorithms and choosing one over another
  • Understanding approaches like stacking, ensemble and applying them correctly to increase accuracy

Desired technical requirements

  • Proficiency with Python and the ability to write production-ready code.
  • Experience in pyspark, machine learning and deep learning
  • Big data experience, e.g. familiarity with Spark, Hadoop, is highly preferred
  • Familiarity with SQL or other databases.
Job posted by
Sathish Kumar

Data Scientist

at Blue Sky Analytics

Founded 2018  •  Product  •  20-100 employees  •  Raised funding
NumPy
SciPy
Data Science
Python
pandas
Git
GitHub
SQL
Amazon S3
Amazon EC2
GIS analysis
GDAL
QGIS
Remote only  •  1 - 5 yrs  •  Best in industry

About the Company

Blue Sky Analytics is a Climate Tech startup that combines the power of AI & Satellite data to aid in the creation of a global environmental data stack. Our funders include Beenext and Rainmatter. Over the next 12 months, we aim to expand to 10 environmental data-sets spanning water, land, heat, and more!


We are looking for a data scientist to join our growing team. This position will require you to think and act on the geospatial architecture and data needs (specifically geospatial data) of the company. This position is strategic and will also require you to collaborate closely with data engineers, data scientists, software developers and even colleagues from other business functions. Come save the planet with us!


Your Role

Manage: It goes without saying that you will be handling large amounts of image and location datasets. You will develop dataframes and automated pipelines of data from multiple sources. You are expected to know how to visualize them and use machine learning algorithms to be able to make predictions. You will be working across teams to get the job done.

Analyze: You will curate and analyze vast amounts of geospatial datasets like satellite imagery, elevation data, meteorological datasets, openstreetmaps, demographic data, socio-econometric data and topography to extract useful insights about the events happening on our planet.

Develop: You will be required to develop processes and tools to monitor and analyze data and its accuracy. You will develop innovative algorithms which will be useful in tracking global environmental problems like depleting water levels, illegal tree logging, and even tracking of oil-spills.

Demonstrate: A familiarity with working in geospatial libraries such as GDAL/Rasterio for reading/writing of data, and use of QGIS in making visualizations. This will also extend to using advanced statistical techniques and applying concepts like regression, properties of distribution, and conduct other statistical tests.

Produce: With all the hard work being put into data creation and management, it has to be used! You will be able to produce maps showing (but not limited to) spatial distribution of various kinds of data, including emission statistics and pollution hotspots. In addition, you will produce reports that contain maps, visualizations and other resources developed over the course of managing these datasets.

Requirements

These are must have skill-sets that we are looking for:

  • Excellent coding skills in Python (including deep familiarity with NumPy, SciPy, pandas).
  • Significant experience with git, GitHub, SQL, AWS (S3 and EC2).
  • Experience working with GIS and familiarity with geospatial libraries such as GDAL and rasterio to read/write data, a GIS application such as QGIS for visualisation and querying, and basic machine learning algorithms to make predictions.
  • Demonstrable experience implementing efficient neural network models and deploying them in a production environment.
  • Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.) and experience with applications.
  • Capable of writing clear and lucid reports and demystifying data for the rest of us.
  • Be curious and care about the planet!
  • Minimum 2 years of demonstrable industry experience working with large and noisy datasets.
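The "regression" item in the statistics bullet above can be sketched in a few lines with NumPy (one of the libraries the posting names). The data and model here are synthetic and invented purely for illustration, not part of the company's actual workflow:

```python
import numpy as np

# Ordinary least squares on a noiseless synthetic line: the fit should
# recover the slope (3.0) and intercept (2.0) used to generate the data.
x = np.arange(50, dtype=float)
y = 3.0 * x + 2.0

# polyfit with degree 1 returns [slope, intercept] for the best-fit line.
slope, intercept = np.polyfit(x, y, 1)
print(round(float(slope), 6), round(float(intercept), 6))  # 3.0 2.0
```

With real, noisy datasets the recovered coefficients would come with uncertainty, which is where the distribution properties and statistical tests mentioned above come in.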

Benefits

  • Work from anywhere: Work by the beach or from the mountains.
  • Open source at heart: We are building a community that you can use, contribute to and collaborate on.
  • Own a slice of the pie: Possibility of becoming an owner by investing in ESOPs.
  • Flexible timings: Fit your work around your lifestyle.
  • Comprehensive health cover: Health cover for you and your dependents to keep you tension free.
  • Work Machine of choice: Buy a device and own it after completing a year at BSA.
  • Quarterly Retreats: Yes, there's work, but then there's all the non-work fun aspect, aka the retreat!
  • Yearly vacations: Take time off to rest and get ready for the next big assignment by availing of your paid leave.
Job posted by
Balahun Khonglanoh

Data Analyst

at Games 24x7

Agency job
via zyoin
PowerBI
Big Data
Hadoop
Apache Hive
Business Intelligence (BI)
Data Warehouse (DWH)
SQL
Python
Tableau
Java
Bengaluru (Bangalore)  •  0 - 6 yrs  •  ₹10L - ₹21L / yr
Location: Bangalore
Work Timing: 5 Days A Week

Responsibilities include:

• Ensure the right stakeholders get the right information at the right time
• Requirement gathering with stakeholders to understand their data requirement
• Creating and deploying reports
• Participate actively in datamarts design discussions
• Work on both RDBMS as well as Big Data for designing BI Solutions
• Write code (queries/procedures) in SQL / Hive / Drill that is both functional and elegant, following appropriate design patterns
• Design and plan BI solutions to automate regular reporting
• Debugging, monitoring and troubleshooting BI solutions
• Creating and deploying datamarts
• Writing relational and multidimensional database queries
• Integrate heterogeneous data sources into BI solutions
• Ensure Data Integrity of data flowing from heterogeneous data sources into BI solutions.

Minimum Job Qualifications:
• BE/B.Tech in Computer Science/IT from Top Colleges
• 1-5 years of experience in data warehousing and SQL
• Excellent Analytical Knowledge
• Excellent technical as well as communication skills
• Attention to even the smallest detail is mandatory
• Knowledge of SQL query writing and performance tuning
• Knowledge of Big Data technologies like Apache Hadoop, Apache Hive, Apache Drill
• Knowledge of fundamentals of Business Intelligence
• In-depth knowledge of RDBMS systems, data warehousing and datamarts
• Smart, motivated and team oriented
Desirable Requirements
• Sound knowledge of software development in Programming (preferably Java )
• Knowledge of the software development lifecycle (SDLC) and models
Job posted by
Shubha N

DataStage Developer

at Datametica Solutions Private Limited

Founded 2013  •  Products & Services  •  100-1000 employees  •  Profitable
ETL
Data Warehouse (DWH)
IBM InfoSphere DataStage
DataStage
SQL
Linux/Unix
Pune  •  3 - 8 yrs  •  ₹5L - ₹20L / yr

Datametica is Hiring for Datastage Developer

  • Must have 3 to 8 years of experience in ETL Design and Development using IBM Datastage Components.
  • Should have extensive knowledge in Unix shell scripting.
  • Understanding of DW principles (Fact, Dimension tables, Dimensional Modelling and Data warehousing concepts).
  • Research, development, document and modification of ETL processes as per data architecture and modeling requirements.
  • Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
  • Should be good in writing complex SQL queries.

About Us!

A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data/workloads/ETL/analytics to the cloud, leveraging automation.

 

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

 

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

 

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

 

Why join us!

Datametica is a place to innovate, bring new ideas to life and learn new things. We believe in building a culture of innovation, growth and belonging. Our people and their dedication over the years are the key factors in achieving our success.

 

Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy

 

Check out more about us on our website below!

www.datametica.com

 

Job posted by
Sumangali Desai

Data Analyst

at MSMEx

Founded 2019  •  Product  •  20-100 employees  •  Raised funding
Data Analytics
Data Analysis
Data Analyst
SQL
Python
Google Analytics
CleverTap
Team Management
metabase
Remote, Mumbai, Pune  •  4 - 6 yrs  •  ₹5L - ₹12L / yr

We are looking for a Data Analyst to oversee organisational data analytics. This will require you to design and help implement the data analytics platform that keeps the organisation running. The team will be the go-to for all data needs for the app, and we are looking for a self-starter who is hands-on and yet able to abstract problems and anticipate data requirements.
This person should be a very strong technical data analyst who can design and implement data systems on their own. They also need to be proficient in business reporting and should have a keen interest in providing the data needed for the business.

 

Tools familiarity: SQL, Python, Mixpanel, Metabase, Google Analytics, CleverTap, App Analytics

Responsibilities

  • Processes and frameworks for metrics, analytics, experimentation and user insights, lead the data analytics team
  • Metrics alignment across teams to make them actionable and promote accountability
  • Data based frameworks for assessing and strengthening Product Market Fit
  • Identify viable growth strategies through data and experimentation
  • Experimentation for product optimisation and understanding user behaviour
  • Structured approach towards deriving user insights, answer questions using data
  • This person needs to closely work with Technical and Business teams to get this implemented.

Skills

  • 4 to 6 years at a relevant role in data analytics in a Product Oriented company
  • Highly organised, technically sound & good at communication
  • Ability to handle & build for cross functional data requirements / interactions with teams
  • Great with Python, SQL
  • Can build, mentor a team
  • Knowledge of key business metrics like cohort, engagement cohort, LTV, ROAS, ROE

 

Eligibility

BTech or MTech in Computer Science/Engineering from Tier 1 or Tier 2 colleges

 

Good knowledge of data analytics and data visualization tools. A formal certification would be an added advantage.

We are more interested in what you CAN DO than your location, education, or experience levels.

 

Send us your code samples / GitHub profile / published articles if applicable.

Job posted by
Sujata Ranjan

Sr Product Analyst

at High-Growth Fintech Startup

Agency job
via Unnati
Product Analyst
Product Management
Product Manager
SQL
Product Strategy
Google Analytics
Web Analytics
Business Analysis
Process automation
feature prioritization
Mumbai  •  3 - 5 yrs  •  ₹7L - ₹11L / yr
Want to join the trailblazing Fintech company which is leveraging software and technology to change the face of short-term financing in India!

Our client is an innovative Fintech company that is revolutionizing the business of short term finance. The company is an online lending startup that is driven by an app-enabled technology platform to solve the funding challenges of SMEs by offering quick-turnaround, paperless business loans without collateral. It counts over 2 million small businesses across 18 cities and towns as its customers.
 
Its founders are IIT and ISB alumni with deep experience in the fin-tech industry, from earlier working with organizations like Axis Bank, Aditya Birla Group, Fractal Analytics, and Housing.com. It has raised funds of Rs. 100 Crore from finance industry stalwarts and is growing by leaps and bounds.
 
As a Sr Product Analyst, you will partner with business & product teams to define goals and to perform extensive analysis.
 
What you will do:
  • Performing extensive analysis on SQL, Google Analytics & Excel from a product standpoint to provide quick recommendations to the management
  • Estimating impact and weighing in on feature prioritization, looking for insights and anomalies across the lending funnel
  • Defining key metrics and monitoring them on a day-to-day basis
  • Helping the marketing, product and UX teams define segments by conducting user interviews and gathering data-backed insights

 


Candidate Profile:

What you need to have:

  • B.Tech/B.E. or any graduate degree
  • Strong background in statistical concepts & calculations to perform analysis/ modeling
  • Proficient in SQL
  • Good knowledge of Google Analytics and any other web analytics platforms (preferred)
  • Strong analytical and problem solving skills to analyze large quantum of datasets
  • Ability to work independently and bring innovative solutions to the team
  • Experience of working with a start-up or a product organization (preferred)
Job posted by
Prabha Ramamurthy

ETL developer

at fintech

Agency job
via Talentojcom
ETL
Druid Database
Java
Scala
SQL
Tableau
Python
Remote only  •  2 - 6 yrs  •  ₹9L - ₹30L / yr
● Education in a science, technology, engineering, or mathematics discipline, preferably a bachelor’s degree or equivalent experience
● Knowledge of database fundamentals and fluency in advanced SQL, including concepts such as windowing functions
● Knowledge of popular scripting languages for data processing such as Python, as well as familiarity with common frameworks such as Pandas
● Experience building streaming ETL pipelines with tools such as Apache Flink, Apache Beam, Google Cloud Dataflow, DBT and equivalents
● Experience building batch ETL pipelines with tools such as Apache Airflow, Spark, DBT, or custom scripts
● Experience working with messaging systems such as Apache Kafka (and hosted equivalents such as Amazon MSK), Apache Pulsar
● Familiarity with BI applications such as Tableau, Looker, or Superset
● Hands-on coding experience in Java or Scala
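The "windowing functions" concept named in the SQL bullet can be demonstrated in a few lines. This sketch uses Python's built-in sqlite3 (SQLite supports window functions since 3.25) with an invented events table; the tools in a real pipeline would differ:

```python
import sqlite3

# Invented events table: day number and a transaction amount per day.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (day INTEGER, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?)", [(1, 10.0), (2, 5.0), (3, 20.0)])

# SUM() OVER (ORDER BY day) computes a running total without collapsing rows,
# which is what distinguishes a window function from a plain GROUP BY aggregate.
rows = conn.execute(
    "SELECT day, SUM(amount) OVER (ORDER BY day) AS running_total "
    "FROM events ORDER BY day"
).fetchall()
print(rows)  # [(1, 10.0), (2, 15.0), (3, 35.0)]
```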
Job posted by
Raksha Pant