Metrics management Jobs in Bangalore (Bengaluru)


Apply to 11+ Metrics management Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest Metrics management Job opportunities across top companies like Google, Amazon & Adobe.

Bengaluru (Bangalore)
3 - 8 yrs
₹20L - ₹35L / yr
SQL
Python
Metrics management
Data Analytics

Responsibilities

  • Work with large and complex blockchain data sets and derive investment relevant metrics in close partnership with financial analysts and blockchain engineers.
  • Apply knowledge of statistics, programming, data modeling, simulation, and advanced mathematics to recognize patterns, identify opportunities, pose business questions, and make valuable discoveries leading to the development of fundamental metrics needed to evaluate various crypto assets.
  • Build a strong understanding of existing metrics used to value various decentralized applications and protocols.
  • Build customer facing metrics and dashboards.
  • Work closely with analysts, engineers, Product Managers and provide feedback as we develop our data analytics and research platform.

Qualifications

  • Bachelor's degree in an analytical field (e.g. Mathematics, Statistics, Computer Science, Engineering, Operations Research, Management Science) or equivalent practical experience
  • 3+ years experience with data analysis and metrics development
  • 3+ years experience analyzing and interpreting data, drawing conclusions, defining recommended actions, and reporting results across stakeholders
  • 2+ years experience writing SQL queries
  • 2+ years experience scripting in Python
  • Demonstrated curiosity in and excitement for Web3/blockchain technologies
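Much of the metrics-development work described above reduces to turning raw records into aggregates with SQL. As a hedged illustration (the table, columns, and data below are invented, and in-memory SQLite stands in for the real warehouse), here is how a simple "daily active addresses" metric might be derived:

```python
# Illustrative sketch: deriving a "daily active addresses" metric from a
# hypothetical on-chain transaction table (schema and data are invented).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE transactions (
        tx_day   TEXT,   -- date of the transaction
        sender   TEXT,   -- sending address
        receiver TEXT,   -- receiving address
        amount   REAL    -- transferred amount
    )
""")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?, ?)",
    [
        ("2023-01-01", "a1", "a2", 5.0),
        ("2023-01-01", "a2", "a3", 2.5),
        ("2023-01-02", "a1", "a3", 1.0),
    ],
)

# An "active address" is any distinct sender or receiver seen that day.
rows = conn.execute("""
    SELECT tx_day, COUNT(DISTINCT addr) AS active_addresses
    FROM (
        SELECT tx_day, sender AS addr FROM transactions
        UNION
        SELECT tx_day, receiver AS addr FROM transactions
    )
    GROUP BY tx_day
    ORDER BY tx_day
""").fetchall()
print(rows)  # [('2023-01-01', 3), ('2023-01-02', 2)]
```

The same shape of query (dedupe, group, count) underlies most activity-style metrics; the real work is in choosing the definition with analysts and engineers.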
SmartHub Innovation Pvt Ltd
Sathya Venkatesh
Posted by Sathya Venkatesh
Bengaluru (Bangalore)
5 - 7 yrs
₹15L - ₹20L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+3 more

JD Code: SHI-LDE-01 

Version#: 1.0 

Date of JD Creation: 27-March-2023 

Position Title: Lead Data Engineer 

Reporting to: Technical Director 

Location: Bangalore Urban, India (on-site) 

 

SmartHub.ai (www.smarthub.ai) is a fast-growing startup headquartered in Palo Alto, CA, with offices in Seattle and Bangalore. We operate at the intersection of AI, IoT & Edge Computing. With strategic investments from leaders in infrastructure & data management, SmartHub.ai is redefining the Edge IoT space. Our “Software Defined Edge” products help enterprises rapidly accelerate their Edge Infrastructure Management & Intelligence. We empower enterprises to leverage their Edge environments to increase revenue, improve operational efficiency, and manage safety and digital risks using Edge and AI technologies. 

 

SmartHub is an equal opportunity employer committed to nurturing a workplace culture that supports, inspires and respects all individuals, and that encourages employees to bring their best selves to work, laugh and share. We seek builders from a variety of backgrounds, perspectives and skills to join our team.  

Summary 

This role requires the candidate to translate business and product requirements into building, maintaining and optimizing data systems, which can be relational or non-relational in nature. The candidate is expected to tune and analyse data, including short- and long-term trend analysis, reporting, and AI/ML use cases. 

We are looking for a talented technical professional with at least 8 years of proven experience in owning, architecting, designing, operating and optimising databases that are used for large scale analytics and reports. 

Responsibilities 

  • Provide technical & architectural leadership for the next generation of product development. 
  • Innovate: research and evaluate new technologies and tools for quality output. 
  • Architect, design and implement solutions ensuring scalability, performance and security. 
  • Code and implement new algorithms to solve complex problems. 
  • Analyze complex data; develop, optimize and transform large structured and unstructured data sets. 
  • Deploy and administer databases, continuously tuning for performance, especially on container orchestration stacks such as Kubernetes. 
  • Develop analytical models and solutions. Mentor junior members technically in architecture, design and robust coding. 
  • Work in an Agile development environment while continuously evaluating and improving engineering processes. 

Required 

  • At least 8 years of experience with significant depth in designing and building scalable distributed database systems for enterprise-class products; experience working in product development companies. 
  • Should have been feature/component lead for several complex features involving large datasets. 
  • Strong background in relational and non-relational databases such as Postgres, MongoDB and Hadoop. 
  • Deep expertise in database optimization and tuning; SQL, time-series databases, Apache Drill, HDFS and Spark are good to have. 
  • Excellent analytical and problem-solving skills. 
  • Experience designing for high throughput is highly desirable. 
  • Exposure to database provisioning in Kubernetes/non-Kubernetes environments, configuration and tuning in a highly available mode. 
  • Demonstrated ability to provide technical leadership and mentoring to the team. 
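As a rough illustration of the optimization and tuning work listed above (SQLite stands in for whichever engine is in play; the table, index and data are invented), one routine step is adding an index and checking the query plan to confirm the engine actually uses it:

```python
# Illustrative sketch: verify that an index changes a full scan into an
# index lookup. Table, column and index names are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (device_id INTEGER, ts INTEGER, value REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i % 100, i, float(i)) for i in range(10_000)],
)

query = "SELECT COUNT(*) FROM events WHERE device_id = 42"

plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
conn.execute("CREATE INDEX idx_events_device ON events(device_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# The plan detail is the last column of each row; before the index it is
# typically a full "SCAN", afterwards a "SEARCH ... idx_events_device".
print(plan_before[0][-1])
print(plan_after[0][-1])
```

The same habit (measure the plan, not just the wall clock) carries over to Postgres `EXPLAIN ANALYZE` and similar tooling in other engines.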


Bengaluru (Bangalore)
5 - 8 yrs
₹20L - ₹35L / yr
Big Data
Data engineering
Big Data Engineering
Data Engineer
ETL
+5 more

Data Engineer JD:

  • Designing, developing, constructing, installing, testing and maintaining the complete data management & processing systems.
  • Building a highly scalable, robust, fault-tolerant & secure user data platform adhering to data protection laws.
  • Taking care of the complete ETL (Extract, Transform & Load) process.
  • Ensuring architecture is planned in such a way that it meets all the business requirements.
  • Exploring new ways of using existing data to derive more insights from it.
  • Proposing ways to improve data quality, reliability & efficiency of the whole system.
  • Creating data models to reduce system complexity and hence increase efficiency & reduce cost.
  • Introducing new data management tools & technologies into the existing system to make it more efficient.
  • Setting up monitoring and alarming on data pipeline jobs to detect failures and anomalies
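The ETL and monitoring responsibilities above can be sketched, in heavily simplified form, as a plain-Python pipeline (all names, records and validation rules here are invented for illustration):

```python
# Minimal, library-free sketch of extract-transform-load with a
# failure-detection hook, as described in the responsibilities above.
def extract():
    # Stand-in for reading from a source system.
    return [
        {"user_id": "1", "amount": "10.5"},
        {"user_id": "2", "amount": "not-a-number"},  # bad record
        {"user_id": "3", "amount": "7.0"},
    ]

def transform(records, alerts):
    clean = []
    for rec in records:
        try:
            clean.append({"user_id": int(rec["user_id"]),
                          "amount": float(rec["amount"])})
        except ValueError:
            # Monitoring hook: surface anomalies instead of dropping silently.
            alerts.append(f"bad record: {rec}")
    return clean

def load(records, sink):
    # Stand-in for writing to the warehouse.
    sink.extend(records)

alerts, sink = [], []
load(transform(extract(), alerts), sink)
print(len(sink), len(alerts))  # 2 good rows loaded, 1 alert raised
```

In production the same three stages map onto orchestrated jobs (e.g. Airflow tasks) and the `alerts` hook onto real alarming, but the shape of the logic is the same.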

What do we expect from you?

  • BS/MS in Computer Science or equivalent experience
  • 5 years of recent experience in Big Data Engineering.
  • Good experience in working with Hadoop and Big Data technologies like HDFS, Pig, Hive, Zookeeper, Storm, Spark, Airflow and NoSQL systems
  • Excellent programming and debugging skills in Java or Python.
  • Hands-on experience with Apache Spark and Python, including deploying ML models.
  • Experience with streaming and real-time pipelines.
  • Experience with Apache Kafka, or with any of Spark Streaming, Flume or Storm.
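For readers unfamiliar with the streaming stack named above, the core pattern can be simulated in plain Python: events arrive in micro-batches and a running aggregate is updated incrementally. This is an illustration only, not actual Kafka or Spark Streaming API code:

```python
# Toy simulation of micro-batch stream processing: per-key counts are
# updated incrementally as each batch arrives (events are invented).
from collections import Counter

def process_batch(batch, running_counts):
    """Update running per-event-type counts with one micro-batch."""
    for event in batch:
        running_counts[event["type"]] += 1

stream = [  # micro-batches as they would arrive from a topic
    [{"type": "click"}, {"type": "view"}],
    [{"type": "click"}, {"type": "click"}],
]

counts = Counter()
for batch in stream:
    process_batch(batch, counts)

print(dict(counts))  # {'click': 3, 'view': 1}
```

Real streaming jobs add the hard parts this sketch omits: checkpointing, late data, and exactly-once delivery.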

Focus Area:

  • R1: Data Structures & Algorithms
  • R2: Problem Solving + Coding
  • R3: Design (LLD)
 

Bengaluru (Bangalore)
2 - 5 yrs
₹10L - ₹15L / yr
Underwriting
Data Analytics
Investment analysis
Payment gateways
Credit Analytics
+5 more
If you are interested in joining a purpose-driven community dedicated to creating ambitious and inclusive workplaces, then be a part of a high-growth startup with a world-class team, building a revolutionary product!

Our client is a vertical fintech play focused on solving industry-specific financing gaps in the food sector through the application of data. The platform provides skin-in-the-game growth capital to much-loved F&B brands. Founded in 2019, they are VC-funded and based out of Singapore and Bangalore, India.

The founders are alumni of IIT-D, IIM-B and Wharton, with 12+ years of experience spanning venture capital and corporate entrepreneurship at DFJ, Vertex and InMobi, a VP role at Snyder UAE, investment banking at Unitus Capital (leading the financial services practice), and institutional equities at Kotak. They have brought together a team of high-quality professionals on a mission to disrupt convention.

As a Data Analyst - Underwriting & Risk, you will be developing a first-of-its-kind risk engine for revenue-based financing in India and automating investment appraisals for the company's different revenue-based financing products.

What you will do:
  • Identifying alternate data sources beyond financial statements and implementing them as a part of assessment criteria.
  • Automating appraisal mechanisms for all newly launched products and revisiting them for existing products.
  • Back-testing investment appraisal models at regular intervals to improve the same.
  • Complementing appraisals with portfolio data analysis and portfolio monitoring at regular intervals.
  • Working closely with the business and the technology team to ensure the portfolio is performing as per internal benchmarks and that relevant checks are put in place at various stages of the investment lifecycle.
  • Identifying relevant sub-sector criteria to score and rate investment opportunities internally.
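The back-testing responsibility above can be made concrete with a toy sketch: compare a simple revenue-based appraisal model's projections against realized revenue and report the error. The model, portfolio and figures below are all invented for illustration:

```python
# Hedged sketch of back-testing an investment appraisal model: project
# revenue, compare with actuals, report mean absolute percentage error.
def predicted_monthly_revenue(avg_last_3_months, growth_rate):
    """Toy appraisal model: project next month from the trailing average."""
    return avg_last_3_months * (1 + growth_rate)

portfolio = [
    # (trailing 3-month avg revenue, assumed growth, realized revenue)
    (100_000, 0.05, 108_000),
    (250_000, 0.02, 240_000),
    (80_000, 0.10, 90_000),
]

errors = []
for avg, growth, actual in portfolio:
    pred = predicted_monthly_revenue(avg, growth)
    errors.append(abs(pred - actual) / actual)

mape = sum(errors) / len(errors)
print(f"back-test MAPE: {mape:.1%}")
```

Tracking this error at regular intervals, per product and per sub-sector, is what turns a one-off appraisal into a calibrated risk engine.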

 


Candidate Profile:

What you need to have:

  • Bachelor’s degree and CA/MBA, with at least 3 years of relevant work experience (mandatory).
  • Experience in working in lending/investing fintech (mandatory).
  • Strong Excel skills (mandatory).
  • Previous experience in credit rating or credit scoring or investment analysis (preferred).
  • Prior exposure to working on data-led models on payment gateways or accounting systems (preferred).
  • Proficiency in data analysis (preferred)
  • Good verbal and written skills.
Novo

at Novo

2 recruiters
Dishaa Ranjan
Posted by Dishaa Ranjan
Bengaluru (Bangalore), Gurugram
4 - 6 yrs
₹25L - ₹35L / yr
SQL
Python
pandas
Scikit-Learn
TensorFlow
+1 more

About Us: 

Small businesses are the backbone of the US economy, comprising almost half of the GDP and the private workforce. Yet, big banks don’t provide the access, assistance and modern tools that owners need to successfully grow their business. 


We started Novo to challenge the status quo—we’re on a mission to increase the GDP of the modern entrepreneur by creating the go-to banking platform for small businesses (SMBs). Novo is flipping the script of the banking world, and we’re excited to lead the small business banking revolution.


At Novo, we’re here to help entrepreneurs, freelancers, startups and SMBs achieve their financial goals by empowering them with an operating system that makes business banking as easy as iOS. We developed modern bank accounts and tools to help to save time and increase cash flow. Our unique product integrations enable easy access to tracking payments, transferring money internationally, managing business transactions and more. We’ve made a big impact in a short amount of time, helping thousands of organizations access powerfully simple business banking.  



We are looking for a Senior Data Scientist who is enthusiastic about using data and technology to solve complex business problems. If you're passionate about leading and helping to architect and develop thoughtful data solutions, then we want to chat. Are you ready to revolutionize the small business banking industry with us?


About the Role:


  • Build and manage predictive models focussed on credit risk, fraud, conversions, churn, consumer behaviour, etc.
  • Provide best practices and direction for data analytics and business decision making across multiple projects and functional areas
  • Implement performance optimizations and best practices for scalable data models, pipelines and modelling
  • Resolve blockers and help the team stay productive
  • Take part in building the team and iterating on hiring processes

Requirements for the Role:


  • 4+ years of experience in data science roles focussed on managing data processes, modelling and dashboarding
  • Strong experience in Python and SQL, and an in-depth understanding of modelling techniques
  • Experience working with pandas, scikit-learn, and visualization libraries like Plotly, Bokeh, etc.
  • Prior experience with credit risk modelling will be preferred
  • Deep knowledge of Python for writing scripts to manipulate data and generate automated reports
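To make the credit-risk modelling requirement concrete, here is a deliberately tiny from-scratch logistic-regression scorer trained by gradient descent. In practice this would be pandas plus scikit-learn as the requirements above suggest; the single feature, synthetic data and hyperparameters below are all invented:

```python
# Illustrative only: a minimal logistic-regression credit-risk scorer.
import math

# Feature: debt-to-income ratio; label: 1 = defaulted (synthetic data).
data = [(0.1, 0), (0.2, 0), (0.3, 0), (0.6, 1), (0.7, 1), (0.9, 1)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):                      # plain batch gradient descent
    gw = gb = 0.0
    for x, y in data:
        err = sigmoid(w * x + b) - y       # prediction error for this row
        gw += err * x
        gb += err
    w -= lr * gw / len(data)
    b -= lr * gb / len(data)

def risk_score(dti):
    """Estimated probability of default for a given debt-to-income ratio."""
    return sigmoid(w * dti + b)

# Low-leverage borrowers should score well below high-leverage ones.
print(round(risk_score(0.15), 2), round(risk_score(0.85), 2))
```

A production model adds many features, regularization, calibration and validation splits, but the fit-then-score loop is the same.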

How We Define Success:


  • Expand access to data driven decision making across the organization
  • Solve problems in risk, marketing, growth, customer behaviour through analytics models that increase efficacy

Nice To Have, but Not Required:

  • Experience in dashboarding libraries like Python Dash and exposure to CI/CD 
  • Exposure to big data tools like Spark, and some core tech knowledge around APIs, data streaming, etc.


Novo values diversity as a core tenet of the work we do and the businesses we serve. We are an equal opportunity employer, indiscriminate of race, religion, ethnicity, national origin, citizenship, gender, gender identity, sexual orientation, age, veteran status, disability, genetic information or any other protected characteristic. 

Marktine

at Marktine

1 recruiter
Vishal Sharma
Posted by Vishal Sharma
Remote, Bengaluru (Bangalore)
3 - 7 yrs
₹10L - ₹24L / yr
Data Science
R Programming
Python
SQL
Machine Learning (ML)
+1 more

Responsibilities:

  • Design and develop strong analytics systems and predictive models
  • Manage a team of data scientists, machine learning engineers and big data specialists
  • Identify valuable data sources and automate data collection processes
  • Undertake pre-processing of structured and unstructured data
  • Analyze large amounts of information to discover trends and patterns
  • Build predictive models and machine-learning algorithms
  • Combine models through ensemble modeling
  • Present information using data visualization techniques
  • Propose solutions and strategies to business challenges
  • Collaborate with engineering and product development teams
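The ensemble-modeling responsibility above has a simple intuition worth illustrating: combining models whose errors point in opposite directions can cancel those errors out. The two toy "models" and data below are invented stand-ins for real fitted models:

```python
# Minimal sketch of ensemble modeling by simple averaging: two biased
# predictors whose errors cancel when combined (all numbers synthetic).
def model_a(x):
    return 2.0 * x            # systematically over-predicts

def model_b(x):
    return 2.0 * x - 4.0      # systematically under-predicts

def ensemble(x):
    return (model_a(x) + model_b(x)) / 2

truth = [(1, 0.0), (2, 2.0), (3, 4.0)]  # (x, y) with y = 2x - 2

def mse(model):
    """Mean squared error of a model over the reference points."""
    return sum((model(x) - y) ** 2 for x, y in truth) / len(truth)

print(mse(model_a), mse(model_b), mse(ensemble))  # ensemble error is lowest
```

Real ensembles (bagging, boosting, stacking) combine models less trivially, but the benefit comes from the same error-cancellation effect.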

Requirements:

  • Proven experience as a seasoned Data Scientist
  • Good experience with data mining processes
  • Understanding of machine learning; knowledge of operations research is a value addition
  • Strong understanding of and experience in R, SQL and Python; familiarity with Scala, Java or C++ is an asset
  • Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop)
  • Strong math skills (e.g. statistics, algebra)
  • Problem-solving aptitude
  • Excellent communication and presentation skills
  • Experience in Natural Language Processing (NLP)
  • Strong competitive coding skills
  • BSc/BA in Computer Science, Engineering or relevant field; graduate degree in Data Science or other quantitative field is preferred
Bengaluru (Bangalore)
3 - 6 yrs
₹15L - ₹30L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+6 more

Responsibilities:

  • Ensure and own Data integrity across distributed systems.
  • Extract, Transform and Load data from multiple systems for reporting into BI platform.
  • Create Data Sets and Data models to build intelligence upon.
  • Develop and own various integration tools and data points.
  • Hands-on development and/or design within the project in order to maintain timelines.
  • Work closely with the Project manager to deliver on business requirements OTIF (on time in full)
  • Understand the cross-functional business data points thoroughly and be SPOC for all data-related queries.
  • Work with both Web Analytics and Backend Data analytics.
  • Support the rest of the BI team in generating reports and analysis
  • Quickly learn and use bespoke and third-party SaaS reporting tools with little documentation.
  • Assist in presenting demos and preparing materials for Leadership.

 Requirements:

  • Strong experience in Datawarehouse modeling techniques and SQL queries
  • A good understanding of designing, developing, deploying, and maintaining Power BI report solutions
  • Ability to create KPIs, visualizations, reports and dashboards based on business requirements
  • Knowledge and experience in prototyping, designing, and requirement analysis
  • Be able to implement row-level security on data and understand application security layer models in Power BI
  • Proficiency in making DAX queries in Power BI desktop.
  • Expertise in using advanced level calculations on data sets
  • Experience in the Fintech domain and stakeholder management.
upGrad

at upGrad

1 video
19 recruiters
Priyanka Muralidharan
Posted by Priyanka Muralidharan
Bengaluru (Bangalore), Mumbai
4 - 6 yrs
₹19L - ₹24L / yr
SQL
Python
Tableau
Team Management
Statistical Analysis

Role Summary

We are looking for an analytically inclined, insights-driven Product Analyst to make our organisation more data-driven. In this role you will be responsible for creating dashboards that drive insights for product and business teams. From day-to-day decisions to long-term impact assessment, and from measuring the efficacy of different products to that of individual teams, you'll be empowering each of them. The growing nature of the team will require you to be in touch with all of the teams at upGrad. If you are the go-to person everyone looks to for data, then this role is for you.

 

Roles & Responsibilities

  • Lead and own the analysis of highly complex data sources, identifying trends and patterns in data and provide insights/recommendations based on analysis results
  • Build, maintain, own and communicate detailed reports to assist Marketing, Growth/Learning Experience and Other Business/Executive Teams
  • Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions.
  • Analyze data and generate insights in the form of user analysis, user segmentation, performance reports, etc.
  • Facilitate review sessions with management, business users and other team members
  • Design and create visualizations to present actionable insights related to data sets and business questions at hand
  • Develop intelligent models around channel performance, user profiling, and personalization
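The user-segmentation analysis mentioned above can be sketched as a toy bucketing exercise; the thresholds, segment names and data below are invented for illustration:

```python
# Toy user segmentation: bucket users by activity level, then summarize
# each segment (thresholds and data are made up for the example).
users = [
    {"id": 1, "sessions": 30}, {"id": 2, "sessions": 2},
    {"id": 3, "sessions": 12}, {"id": 4, "sessions": 0},
    {"id": 5, "sessions": 25},
]

def segment(sessions_per_month):
    """Assign a user to a segment based on monthly session count."""
    if sessions_per_month >= 20:
        return "power"
    if sessions_per_month >= 5:
        return "regular"
    return "dormant"

segments = {}
for u in users:
    segments.setdefault(segment(u["sessions"]), []).append(u["id"])

print({k: len(v) for k, v in sorted(segments.items())})
```

Segment sizes like these are exactly what ends up on a dashboard; the interesting analyst work is choosing thresholds that actually separate behaviour.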

Skills Required

  • 4-6 years of hands-on experience with product-related analytics and reporting
  • Experience with building dashboards in Tableau or other data visualization tools such as D3
  • Strong data, statistics, and analytical skills with a good grasp of SQL.
  • Programming experience in Python is a must
  • Comfortable managing large data sets
  • Good Excel/data management skills
Simplilearn Solutions

at Simplilearn Solutions

1 video
36 recruiters
Aniket Manhar Nanjee
Posted by Aniket Manhar Nanjee
Bengaluru (Bangalore)
2 - 5 yrs
₹6L - ₹10L / yr
Data Science
R Programming
Python
Scala
Tableau
+1 more
Simplilearn.com is the world’s largest professional certifications company and an Onalytica Top 20 influential brand. With a library of 400+ courses, we've helped 500,000+ professionals advance their careers, delivering $5 billion in pay raises. Simplilearn has over 6500 employees worldwide and our customers include Fortune 1000 companies, top universities, leading agencies and hundreds of thousands of working professionals. We are growing over 200% year on year and having fun doing it.

Description

We are looking for candidates with strong technical skills and a proven track record in building predictive solutions for enterprises. This is a very challenging role and provides an opportunity to work on developing insights-based Ed-Tech software products used by a large set of customers across the globe. It provides an exciting opportunity to work on various advanced analytics and data science problem statements using cutting-edge modern technologies, collaborating with product, marketing and sales teams.

Responsibilities

  • Work on enterprise-level advanced reporting requirements and data analysis.
  • Solve various data science problems: customer engagement, dynamic pricing, lead scoring, NPS improvement, optimization, chatbots, etc.
  • Work on data engineering problems utilizing our tech stack: S3 Datalake, Spark, Redshift, Presto, Druid, Airflow, etc.
  • Collect relevant data from source systems, or use crawling and parsing infrastructure to put together data sets.
  • Craft, conduct and analyse A/B experiments to evaluate machine learning models/algorithms.
  • Communicate findings and take algorithms/models to production with ownership.

Desired Skills

  • BE/BTech/MSc/MS in Computer Science or a related technical field.
  • 2-5 years of experience in the advanced analytics discipline with solid data engineering and visualization skills.
  • Strong SQL skills and BI skills using Tableau, and the ability to perform various complex analytics on data.
  • Ability to propose hypotheses and design experiments in the context of specific problems using statistics and ML algorithms.
  • Good overlap with modern data processing frameworks such as AWS Lambda and Spark using Scala or Python.
  • Dedication and diligence in understanding the application domain, collecting/cleaning data and conducting various A/B experiments.
  • A Bachelor's degree in Statistics or prior experience with Ed-Tech is a plus.
Japan Based Leading Company
Bengaluru (Bangalore)
3 - 10 yrs
₹0L - ₹20L / yr
Big Data
Amazon Web Services (AWS)
Java
Python
MySQL
+2 more
We are looking for a data engineer with AWS Cloud infrastructure experience to join our Big Data Operations team. This role will provide advanced operations support, contribute to automation and system improvements, and work directly with enterprise customers to provide excellent customer service.
The candidate,
1. Must have very good hands-on technical experience of 3+ years with Java or Python
2. Working experience and good understanding of AWS Cloud; Advanced experience with IAM policy and role management
3. Infrastructure Operations: 5+ years supporting systems infrastructure operations, upgrades, deployments using Terraform, and monitoring
4. Hadoop: Experience with Hadoop (Hive, Spark, Sqoop) and / or AWS EMR
5. Knowledge of PostgreSQL/MySQL/DynamoDB backend operations
6. DevOps: Experience with DevOps automation - Orchestration/Configuration Management and CI/CD tools (Jenkins)
7. Version Control: Working experience with one or more version control platforms like GitHub or GitLab
8. Knowledge of AWS QuickSight reporting
9. Monitoring: Hands-on experience with monitoring tools such as AWS CloudWatch, AWS CloudTrail, Datadog and Elasticsearch
10. Networking: Working knowledge of TCP/IP networking, SMTP, HTTP, load-balancers (ELB) and high availability architecture
11. Security: Experience implementing role-based security, including AD integration, security policies, and auditing in a Linux/Hadoop/AWS environment. Familiar with penetration testing and scan tools for remediation of security vulnerabilities.
12. Demonstrated successful experience learning new technologies quickly
WHAT WILL BE THE ROLES AND RESPONSIBILITIES?
1. Create procedures/run books for operational and security aspects of AWS platform
2. Improve AWS infrastructure by developing and enhancing automation methods
3. Provide advanced business and engineering support services to end users
4. Lead other admins and platform engineers through design and implementation decisions to achieve balance between strategic design and tactical needs
5. Research and deploy new tools and frameworks to build a sustainable big data platform
6. Assist with creating programs for training and onboarding for new end users
7. Lead Agile/Kanban workflows and team process work
8. Troubleshoot issues to resolve problems
9. Provide status updates to Operations product owner and stakeholders
10. Track all details in the issue tracking system (JIRA)
11. Provide issue review and triage problems for new service/support requests
12. Use DevOps automation tools, including Jenkins build jobs
13. Fulfil any ad-hoc data or report request queries from different functional groups
US Healthcare
Agency job
via turtlebowl by swati m
Bengaluru (Bangalore)
4 - 8 yrs
₹10L - ₹11L / yr
Data Analytics
Relational Database (RDBMS)
Dashboard Manager
Reporting
Trend analysis
+1 more
 
Relevant Years of Experience - Minimum 4-8 years of experience in data analysis and data reporting, identifying and analyzing patterns/trends.
 
Knowledge Skill Sets -

  • Experience with Tableau dashboards
  • Careful and attentive to details
  • Willing and eager to call out mistakes
  • Beginner to intermediate knowledge of relational databases, reporting and business intelligence
  • Professional communicator
  • Inquisitive/curious, readily asking questions about anything that doesn’t make sense or feel right
  • Good interpersonal skills with a proven ability to communicate effectively (both written and verbal)
  • Well-developed MS Excel skills
  • Displays awareness of the need for confidentiality in sensitive matters
  • Eye for detail
 
Role Description - 

  • Execute tasks assigned by reporting manager and/or Bedford SPOC
  • Identify, analyze, and interpret trends or patterns
  • Audit and report discrepancies/inconsistencies in Tableau reports/dashboards
  • Publish weekly/monthly reports in pre-defined format and frequency to reporting manager

Job Purpose - 

  • Prepare reports using Tableau for delivery to clients
  • Adjust parameters and prepare custom reports using previously built dashboards
  • Print reports to PDF and deliver to folders on a predetermined schedule
  • Become familiar with Tableau - our clients, created workbooks, parameters, filters, and databases
  • QA existing dashboards and look for inconsistencies in naming, filters, charts, tables, etc.
 
Note - Looking for an immediate joiner with a notice period of up to 30 days.
Timing - US shift (6 p.m. - 3.30 a.m.)
Benefits - Transport facility + night shift allowance.
Location - Domlur.
Working - 5 days a week.
 
Reach out ASAP with an updated resume if you are exploring this opportunity -
--
Thanks and Regards,
M.Swati 
Associate Consultant
 
#intelligenthiring
India | Singapore 
www.turtlebowl.com 