Risk analysis Jobs in Mumbai

Explore top risk analysis job opportunities in Mumbai from top companies and startups. All jobs are added by verified employees who can be contacted directly below.

High-Growth Fintech Startup

Agency job
via Unnati by Sarika Tamhane
Mumbai
1 - 6 yrs
₹10L - ₹12L / yr
Risk analysis
Want to join a trailblazing fintech company that is leveraging software and technology to change the face of short-term financing in India?

Our client is an innovative fintech company that is revolutionizing the business of short-term finance. The company is an online lending startup driven by an app-enabled technology platform to solve the funding challenges of SMEs by offering quick-turnaround, paperless business loans without collateral. It counts over 2 million small businesses across 18 cities and towns as its customers. Its founders are IIT and ISB alumni with deep experience in the fintech industry, having earlier worked with organizations such as Axis Bank, Aditya Birla Group, Fractal Analytics, and Housing.com. It has raised funds of Rs. 100 Crore from finance industry stalwarts and is growing by leaps and bounds.
 
As a Credit Risk Analyst, you will be responsible for analyzing data to better understand potential risks, concerns and outcomes of decisions.

What you will do:

  • Reviewing the portfolio monitoring/early warning signals mechanism on an ongoing basis
  • Monitoring internal and external data points that may affect the risk level of a decision
  • Aggregating data from multiple sources to provide a comprehensive assessment (an illustrative sketch follows this list)
  • Proposing solutions to reduce risk
  • Bringing fresh ideas to the table and keeping a keen eye on trends in the analytics and financial services industries
  • Creating reports, summaries, presentations and process documents to display results
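
A minimal sketch of the kind of aggregation involved, assuming hypothetical input files, column names and thresholds (pandas, Python):

    import pandas as pd

    # Hypothetical inputs: internal repayment history and an external bureau feed.
    repayments = pd.read_csv("repayments.csv")   # columns: loan_id, days_past_due
    bureau = pd.read_csv("bureau_scores.csv")    # columns: loan_id, bureau_score

    # Aggregate internal data to one row per loan.
    internal = (repayments.groupby("loan_id", as_index=False)
                          .agg(max_dpd=("days_past_due", "max")))

    # Combine internal and external data points into a single view.
    portfolio = internal.merge(bureau, on="loan_id", how="left")

    # Simple early-warning rule: high delinquency or a weak bureau score.
    portfolio["early_warning"] = (portfolio["max_dpd"] > 30) | (portfolio["bureau_score"] < 600)

    print(portfolio[portfolio["early_warning"]])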

 

Desired Candidate Profile

What you need to have:
 
  • MBA/BE/Master's in Statistics/Mathematics, with 1-5 years of work experience in a similar company or related field
  • Work experience in analytics consulting for financial services (Indian banks'/NBFCs' in-house analytics units) or in fintech/analytics start-ups would be a plus

 


A global business process management company

Agency job
via Jobdost by Saida Jabbar
Pune, Bengaluru (Bangalore), Chennai, Mumbai, Gurugram, Nashik
5 - 10 yrs
₹20L - ₹22L / yr
Data Science
Kofax
Data Scientist
Machine Learning (ML)
Natural Language Processing (NLP)
+5 more

B1 – Data Scientist  -  Kofax Accredited Developers

 

Requirement – 3

 

Mandatory –

  • Accreditation of Kofax KTA / KTM
  • Experience in Kofax Total Agility Development – 2-3 years minimum
  • Ability to develop and translate functional requirements to design
  • Experience in requirement gathering, analysis, development, testing, documentation, version control, SDLC, Implementation and process orchestration
  • Experience in Kofax Customization, writing Custom Workflow Agents, Custom Modules, Release Scripts
  • Application development using Kofax and KTM modules
  • Good/advanced understanding of Machine Learning/NLP/Statistics
  • Exposure to or understanding of RPA/OCR/Cognitive Capture tools like Appian/UiPath/Automation Anywhere, etc.
  • Excellent communication skills and collaborative attitude
  • Ability to work with multiple internal teams and stakeholders, such as Analytics, RPA, Technology and Project Management
  • Good understanding of compliance, data governance and risk control processes

Total Experience – 7-10 Years in BPO/KPO/ ITES/BFSI/Retail/Travel/Utilities/Service Industry

Good to have

  • Previous experience of working in an Agile & hybrid delivery environment
  • Knowledge of VB.NET, C# (C-Sharp), SQL Server and web services

 

Qualification -

  • Master's in Statistics/Mathematics/Economics/Econometrics, or BE/B.Tech, MCA or MBA

 

Mumbai
10 - 15 yrs
₹8L - ₹15L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+6 more

Exp - Min 10 years

Location - Mumbai

Salary - Negotiable

Power BI, Tableau, QlikView

Solution Architect/Technology Lead – Data Analytics

 

Role

Looking for a Business Intelligence Lead (BI Lead) with hands-on experience in BI tools (Tableau, SAP Business Objects, Financial and Accounting modules, Power BI), SAP integration, and database knowledge including one or more of Azure Synapse/Data Factory, SQL Server, Oracle, and the cloud-based DB Snowflake. Good knowledge of AI/ML and Python is also expected.

You will be expected to work closely with our business users. The development will be performed using an Agile methodology based on Scrum (time boxing, daily scrum meetings, retrospectives, etc.) and XP (continuous integration, refactoring, unit testing, etc.) best practices. Candidates must therefore be able to work collaboratively, demonstrate good ownership and leadership, and work well in teams.

Responsibilities:

  • Design, development and support of multiple/hybrid data sources and data visualization frameworks using Power BI, Tableau, SAP Business Objects, etc., as well as ETL tools and Python scripting
  • Implementing DevOps techniques and practices like Continuous Integration, Continuous Deployment, Test Automation, Build Automation and Test-Driven Development to enable the rapid delivery of working code, utilizing tools like Git

Primary Skills

Requirements

  • 10+ years working as a hands-on developer in Information Technology across Database, ETL and BI (SAP Business Objects, integration with SAP Financial and Accounting modules, Tableau, Power BI), plus prior team management experience
  • Tableau/Power BI integration with SAP and knowledge of SAP modules related to finance is a must
  • 3+ years of hands-on development experience in Data Warehousing and Data Processing (a minimal ETL sketch follows this list)
  • 3+ years of database development experience with a solid understanding of core database concepts, relational database design, SQL and performance tuning
  • 3+ years of hands-on development experience with Tableau
  • 3+ years of Power BI experience, including parameterized reports and publishing them on the Power BI Service
  • Excellent understanding of, and practical experience delivering under, an Agile methodology
  • Ability to work with business users to provide technical support
  • Ability to get involved in all stages of the project lifecycle, including analysis, design, development and testing

Good to have skills:

  • Experience with other visualization and reporting tools, such as SAP Business Objects
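
A minimal ETL sketch of the data-processing side of the role, assuming hypothetical connection strings, table and column names (pandas + SQLAlchemy, Python):

    import pandas as pd
    from sqlalchemy import create_engine

    # Hypothetical connections; replace with the real source system and warehouse.
    source = create_engine("mssql+pyodbc://user:password@source_dsn")
    warehouse = create_engine("postgresql://user:password@warehouse-host/analytics")

    # Extract: pull finance postings from the source system.
    postings = pd.read_sql("SELECT posting_date, account, amount FROM finance_postings", source)

    # Transform: aggregate to a monthly summary that a Tableau/Power BI report can consume.
    postings["month"] = pd.to_datetime(postings["posting_date"]).dt.to_period("M").dt.to_timestamp()
    summary = postings.groupby(["month", "account"], as_index=False)["amount"].sum()

    # Load: write the reporting table that the dashboards point at.
    summary.to_sql("monthly_account_summary", warehouse, if_exists="replace", index=False)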

 


at ORBO

4 recruiters
Posted by Hardika Bhansali
Mumbai, Noida
2 - 5 yrs
₹6L - ₹20L / yr
Machine Learning (ML)
Data Science
Computer Vision
Deep Learning
OpenCV
+6 more

Who Are We

 

A research-oriented company with expertise in computer vision and artificial intelligence, Orbo is, at its core, a comprehensive platform offering an AI-based visual enhancement stack. Companies can find a product suited to their needs, where deep learning powered technology automatically improves their imagery.

 

ORBO's solutions are helping the BFSI, beauty and personal care, and e-commerce image retouching industries with their digital transformation in multiple ways.

 

WHY US

  • Join top AI company
  • Grow with your best companions
  • Continuous pursuit of excellence, equality, respect
  • Competitive compensation and benefits

You'll be a part of the core team and will be working directly with the founders in building and iterating upon the core products that make cameras intelligent and images more informative.

 

To learn more about how we work, please check out

https://www.orbo.ai/.

 

Description:

We are looking for a computer vision engineer to lead our team in developing a factory floor analytics SaaS product. This would be a fast-paced role and the person will get an opportunity to develop an industrial grade solution from concept to deployment.

 

Responsibilities:

  • Research and develop computer vision solutions for industries (BFSI, beauty and personal care, e-commerce, defence, etc.)
  • Lead a team of ML engineers in developing an industrial AI product from scratch
  • Set up an end-to-end deep learning pipeline for data ingestion, preparation, model training, validation and deployment
  • Tune the models to achieve high accuracy and minimum latency
  • Deploy the developed computer vision models on edge devices after optimization to meet customer requirements

 

 

Requirements:

  • Bachelor's degree
  • Understanding of the depth and breadth of computer vision and deep learning algorithms
  • 4+ years of industrial experience in computer vision and/or deep learning
  • Experience in taking an AI product from scratch to commercial deployment
  • Experience in image enhancement, object detection, image segmentation and image classification algorithms
  • Experience in deployment with OpenVINO, ONNXruntime and TensorRT (a minimal inference sketch follows this list)
  • Experience in deploying computer vision solutions on edge devices such as Intel Movidius and Nvidia Jetson
  • Experience with machine/deep learning frameworks like TensorFlow and PyTorch
  • Proficient understanding of code versioning tools, such as Git
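
A minimal inference sketch for ONNX Runtime deployment, assuming a hypothetical model file and a 224x224 RGB input (Python):

    import numpy as np
    import onnxruntime as ort

    # Hypothetical exported model; adjust the path and input shape to the actual network.
    session = ort.InferenceSession("detector.onnx", providers=["CPUExecutionProvider"])

    input_name = session.get_inputs()[0].name
    dummy_image = np.random.rand(1, 3, 224, 224).astype(np.float32)  # NCHW float32 tensor

    # Run inference; the result is a list of numpy arrays, one per model output.
    outputs = session.run(None, {input_name: dummy_image})
    print([o.shape for o in outputs])

The same exported model can then be passed to TensorRT or OpenVINO for edge-device optimization.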

Our perfect candidate is someone who:

  • is proactive and an independent problem solver
  • is a constant learner. We are a fast-growing start-up. We want you to grow with us!
  • is a team player and good communicator

 

What We Offer:

  • You will have fun working with a fast-paced team on a product that can impact the business model of E-commerce and BFSI industries. As the team is small, you will easily be able to see a direct impact of what you build on our customers (Trust us - it is extremely fulfilling!)
  • You will be in charge of what you build and be an integral part of the product development process
  • Technical and financial growth!
Remote, Mumbai, Navi Mumbai, Pune, Nashik
7 - 12 yrs
₹15L - ₹16L / yr
PostgreSQL
PL/SQL
Big Data
Optimization
Stored Procedures

Job Role : Associate Manager (Database Development)


Key Responsibilities:

  • Optimizing the performance of stored procedures and SQL queries to deliver large amounts of data within a few seconds
  • Designing and developing numerous complex queries, views, functions and stored procedures to work seamlessly with the Application/Development team's data needs
  • Responsible for providing solutions to all data-related needs to support existing and new applications
  • Creating scalable structures to cater to large user bases and manage high workloads
  • Involved in every step of a project, from requirement gathering to implementation and maintenance
  • Developing custom stored procedures and packages to support new enhancement needs
  • Working with multiple teams to design, develop and deliver early warning systems
  • Reviewing query performance and optimizing code (a minimal sketch follows this list)
  • Writing queries used for front-end applications
  • Designing and coding database tables to store the application data
  • Data modelling to visualize database structure
  • Working with application developers to create optimized queries
  • Maintaining database performance by troubleshooting problems
  • Accomplishing platform upgrades and improvements by supervising system programming
  • Securing the database by developing policies, procedures and controls
  • Designing and managing deep statistical systems
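
A minimal sketch of reviewing query performance from Python, assuming a hypothetical DSN, table and query (psycopg2, PostgreSQL):

    import psycopg2

    # Hypothetical connection and statement; the idea is to capture the planner's
    # view of a slow query before and after tuning.
    conn = psycopg2.connect("dbname=appdb user=report_user password=secret host=localhost")

    query = "SELECT customer_id, SUM(amount) FROM transactions GROUP BY customer_id"

    with conn.cursor() as cur:
        cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + query)
        for (line,) in cur.fetchall():
            print(line)   # plan nodes, row estimates, timings

    conn.close()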

Desired Skills and Experience:

  • 7+ years of experience in database development
  • Minimum 4+ years of experience in PostgreSQL is a must
  • Experience and in-depth knowledge in PL/SQL
  • Ability to come up with multiple possible ways of solving a problem and to decide on the approach best suited to the use case
  • Have knowledge of Database Administration and have the ability and experience of using the CLI tools for administration
  • Experience in Big Data technologies is an added advantage
  • Secondary platforms: MS SQL 2005/2008, Oracle, MySQL
  • Ability to take ownership of tasks and flexibility to work individually or in a team
  • Ability to communicate with teams and clients across time zones and global regions
  • Good communication skills and self-motivation
  • Should have the ability to work under pressure
  • Knowledge of NoSQL and Cloud Architecture will be an advantage

at upGrad

1 video
19 recruiters
Posted by Priyanka Muralidharan
Bengaluru (Bangalore), Mumbai
2 - 5 yrs
₹14L - ₹20L / yr
product analyst
Data Analytics
Python
SQL
Tableau

About Us

upGrad is an online education platform building the careers of tomorrow by offering the most industry-relevant programs in an immersive learning experience. Our mission is to create a new digital-first learning experience to deliver tangible career impact to individuals at scale. upGrad currently offers programs in Data Science, Machine Learning, Product Management, Digital Marketing, Entrepreneurship, and more. upGrad is looking for people passionate about management and education to help design learning programs for working professionals to stay sharp and stay relevant, and to help build the careers of tomorrow.

  • upGrad was awarded the Best Tech for Education by IAMAI for 2018-19
  • upGrad was also ranked as one of the LinkedIn Top Startups 2018: The 25 most sought-after startups in India
  • upGrad was earlier selected as one of the top ten most innovative companies in India by FastCompany
  • We were also covered by the Financial Times along with other disruptors in Ed-Tech
  • upGrad is the official education partner for the Government of India - Startup India program
  • Our program with IIIT B has been ranked the #1 program in the country in the domain of Artificial Intelligence and Machine Learning

Role Summary

We are looking for an analytically inclined, insights-driven Data Analyst to make our organisation more data-driven. In this role you will be responsible for creating dashboards to drive insights for the product and business teams, empowering them in day-to-day decisions as well as long-term impact assessment and in measuring the efficacy of different products or teams. The growing nature of the team will require you to be in touch with all of the teams at upGrad. Are you the go-to person everyone looks to for data? Then this role is for you.

    Roles & Responsibilities

    • Lead and own the analysis of highly complex data sources, identifying trends and patterns in data and provide insights/recommendations based on analysis results

    • Build, maintain, own and communicate detailed reports to assist Marketing, Growth/Learning Experience and Other Business/Executive Teams

    • Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions.

    • Analyze data and generate insights in the form of user analysis, user segmentation, performance reports, etc.

    • Facilitate review sessions with management, business users and other team members

    • Design and create visualizations to present actionable insights related to data sets and business questions at hand

    • Develop intelligent models around channel performance, user profiling, and personalization
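
A minimal user-segmentation sketch, assuming a hypothetical export of learner sessions and placeholder column names (pandas, Python):

    import pandas as pd

    # Hypothetical event export: one row per learner session.
    events = pd.read_csv("learner_sessions.csv")  # columns: user_id, session_minutes, completed_module

    # Roll up activity per user.
    users = (events.groupby("user_id")
                   .agg(total_minutes=("session_minutes", "sum"),
                        modules_done=("completed_module", "sum"))
                   .reset_index())

    # Simple engagement segments by quartile of time spent.
    users["segment"] = pd.qcut(users["total_minutes"], 4,
                               labels=["dormant", "low", "medium", "high"])

    print(users.groupby("segment")["modules_done"].mean())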

Skills Required

  • 3-5 years of hands-on experience with product-related analytics and reporting
  • Experience with building dashboards in Tableau or other data visualization tools such as D3
  • Strong data, statistics, and analytical skills with a good grasp of SQL
  • Programming experience in Python is a must
  • Comfortable managing large data sets
  • Good Excel/data management skills


at Episource LLC

11 recruiters
Posted by Ahamed Riaz
Mumbai
5 - 12 yrs
₹18L - ₹30L / yr
Big Data
Python
Amazon Web Services (AWS)
Serverless
DevOps
+4 more

ABOUT EPISOURCE:


Episource has devoted more than a decade to building solutions for risk adjustment to measure healthcare outcomes. As one of the leading companies in healthcare, we have helped numerous clients optimize their medical records, data and analytics to enable better documentation of care for patients with chronic diseases.


The backbone of our consistent success has been our obsession with data and technology. At Episource, all of our strategic initiatives start with the question - how can data be “deployed”? Our analytics platforms and datalakes ingest huge quantities of data daily, to help our clients deliver services. We have also built our own machine learning and NLP platform to infuse added productivity and efficiency into our workflow. Combined, these build a foundation of tools and practices used by quantitative staff across the company.


What’s our poison you ask? We work with most of the popular frameworks and technologies like Spark, Airflow, Ansible, Terraform, Docker, ELK. For machine learning and NLP, we are big fans of keras, spacy, scikit-learn, pandas and numpy. AWS and serverless platforms help us stitch these together to stay ahead of the curve.


ABOUT THE ROLE:


We’re looking to hire someone to help scale Machine Learning and NLP efforts at Episource. You’ll work with the team that develops the models powering Episource’s product focused on NLP driven medical coding. Some of the problems include improving our ICD code recommendations, clinical named entity recognition, improving patient health, clinical suspecting and information extraction from clinical notes.


This is a role for highly technical data engineers who combine outstanding oral and written communication skills, and the ability to code up prototypes and productionalize using a large range of tools, algorithms, and languages. Most importantly they need to have the ability to autonomously plan and organize their work assignments based on high-level team goals.


You will be responsible for setting an agenda to develop and ship data-driven architectures that positively impact the business, working with partners across the company including operations and engineering. You will use research results to shape strategy for the company and help build a foundation of tools and practices used by quantitative staff across the company.


During the course of a typical day with our team, expect to work on one or more projects around the following:

1. Create and maintain optimal data pipeline architectures for ML
2. Develop a strong API ecosystem for ML pipelines
3. Build CI/CD pipelines for ML deployments using GitHub Actions, Travis, Terraform and Ansible
4. Design and develop distributed, high-volume, high-velocity multi-threaded event processing systems
5. Apply software engineering best practices across the development lifecycle: coding standards, code reviews, source management, build processes, testing, and operations
6. Deploy data pipelines in production using Infrastructure-as-Code platforms
7. Design scalable implementations of the models developed by our Data Science teams
8. Big data and distributed ML with PySpark on AWS EMR, and more! (A brief illustrative sketch follows this list.)
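
A brief illustrative sketch of the PySpark side, assuming placeholder S3 paths and column names (Python):

    from pyspark.sql import SparkSession, functions as F

    # Aggregate a large event table stored on S3 into per-member features.
    spark = SparkSession.builder.appName("claims-feature-aggregation").getOrCreate()

    events = spark.read.parquet("s3://example-bucket/claims/events/")

    features = (events
                .groupBy("member_id")
                .agg(F.count("*").alias("n_events"),
                     F.countDistinct("icd_code").alias("n_distinct_codes")))

    features.write.mode("overwrite").parquet("s3://example-bucket/claims/features/")
    spark.stop()

On AWS EMR the same script would typically be submitted as a Spark step rather than run locally.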



BASIC REQUIREMENTS 


  1.  Bachelor’s degree or greater in Computer Science, IT or related fields

  2.  Minimum of 5 years of experience in cloud, DevOps, MLOps & data projects

  3. Strong experience with bash scripting, unix environments and building scalable/distributed systems

  4. Experience with automation/configuration management using Ansible, Terraform, or equivalent

  5. Very strong experience with AWS and Python

  6. Experience building CI/CD systems

  7. Experience with containerization technologies like Docker, Kubernetes, ECS, EKS or equivalent

  8. Ability to build and manage application and performance monitoring processes


at 1CH

1 recruiter
Posted by Sathish Sukumar
Chennai, Bengaluru (Bangalore), Hyderabad, NCR (Delhi | Gurgaon | Noida), Mumbai, Pune
4 - 15 yrs
₹10L - ₹25L / yr
Data engineering
Data engineer
ETL
SSIS
ADF
+3 more
  • Expertise in designing and implementing enterprise-scale database (OLTP) and data warehouse solutions.
  • Hands-on experience in implementing Azure SQL Database, Azure SQL Data Warehouse (Azure Synapse Analytics) and big data processing using Azure Databricks and Azure HDInsight.
  • Expert in writing T-SQL programming for complex stored procedures, functions, views and query optimization.
  • Should be aware of database development for both on-premise and SaaS applications using SQL Server and PostgreSQL.
  • Experience in ETL and ELT implementations using Azure Data Factory V2 and SSIS.
  • Experience and expertise in building machine learning models using logistic and linear regression, decision tree and random forest algorithms.
  • PolyBase queries for exporting and importing data into Azure Data Lake.
  • Building data models, both tabular and multidimensional, using SQL Server Data Tools.
  • Writing data preparation, cleaning and processing steps using Python, Scala, and R.
  • Programming experience using the Python libraries NumPy, pandas and Matplotlib (a minimal sketch follows this list).
  • Implementing NoSQL databases and writing queries using Cypher.
  • Designing end-user visualizations using Power BI, QlikView and Tableau.
  • Experience working with all versions of SQL Server 2005/2008/2008R2/2012/2014/2016/2017/2019
  • Experience using the expression languages MDX and DAX.
  • Experience in migrating on-premise SQL server database to Microsoft Azure.
  • Hands on experience in using Azure blob storage, Azure Data Lake Storage Gen1 and Azure Data Lake Storage Gen2.
  • Performance tuning complex SQL queries, hands on experience using SQL Extended events.
  • Data modeling using Power BI for ad hoc reporting.
  • Raw data load automation using T-SQL and SSIS
  • Expert in migrating existing on-premise database to SQL Azure.
  • Experience in using U-SQL for Azure Data Lake Analytics.
  • Hands on experience in generating SSRS reports using MDX.
  • Experience in designing predictive models using Python and SQL Server.
  • Developing machine learning models using Azure Databricks and SQL Server
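
A minimal data-preparation sketch, assuming a hypothetical raw extract and column names (pandas + Matplotlib, Python):

    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical raw extract with duplicates and missing values.
    raw = pd.read_csv("sales_extract.csv", parse_dates=["order_date"])

    # Basic cleaning: drop duplicates, fill gaps, derive a reporting column.
    clean = (raw.drop_duplicates()
                .assign(amount=lambda d: d["amount"].fillna(0.0),
                        month=lambda d: d["order_date"].dt.to_period("M").astype(str)))

    # Quick profile of monthly totals before the data is loaded into the warehouse.
    monthly = clean.groupby("month")["amount"].sum()
    monthly.plot(kind="bar", title="Monthly total (cleaned extract)")
    plt.tight_layout()
    plt.savefig("monthly_total.png")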
Posted by Techknomatic Services
Pune, Mumbai
2 - 6 yrs
₹4L - ₹9L / yr
Tableau
SQL
Business Intelligence (BI)
Role Summary:
Lead and drive development in the BI domain using the Tableau ecosystem, with deep technical and BI ecosystem knowledge. The resource will be responsible for dashboard design, development, and delivery of BI services using the Tableau ecosystem.

Key functions & responsibilities:
  • Communication and interaction with the Project Manager to understand the requirement
  • Dashboard designing, development and deployment using the Tableau ecosystem
  • Ensure delivery within a given time frame while maintaining quality
  • Stay up to date with current tech and bring relevant ideas to the table
  • Proactively work with the Management team to identify and resolve issues
  • Perform other related duties as assigned or advised
  • Be a leader who sets the standard and expectations through example in his/her conduct, work ethic, integrity and character
  • Contribute to dashboard designing, R&D and project delivery using Tableau

Candidate's Profile

Academics:
  • Bachelor's degree, preferably in Computer Science
  • A Master's degree would be an added advantage

Experience:
  • Overall 2-5 years of experience in DWBI development projects, having worked on BI and visualization technologies (Tableau, QlikView) for at least 2 years
  • At least 2 years of experience covering the Tableau implementation lifecycle, including hands-on development/programming, managing security, data modelling, data blending, etc.

Technology & Skills:
 Hands on expertise of Tableau administration and maintenance
 Strong working knowledge and development experience with Tableau Server and Desktop
 Strong knowledge in SQL, PL/SQL and Data modelling
 Knowledge of databases like Microsoft SQL Server, Oracle, etc.
 Exposure to alternate Visualization technologies like Qlikview, Spotfire, Pentaho etc.
 Good communication & Analytical skills with Excellent creative and conceptual thinking
abilities
 Superior organizational skills, attention to detail/level of quality, Strong communication
skills, both verbal and written
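
A minimal publishing sketch for Tableau Server, assuming the tableauserverclient package and a placeholder server URL, credentials, project id and workbook file (Python):

    import tableauserverclient as TSC

    auth = TSC.TableauAuth("analyst", "password", site_id="analytics")
    server = TSC.Server("https://tableau.example.com", use_server_version=True)

    with server.auth.sign_in(auth):
        # Publish (or overwrite) a packaged workbook into a given project.
        workbook = TSC.WorkbookItem(project_id="project-id")
        workbook = server.workbooks.publish(workbook, "sales_dashboard.twbx",
                                            mode=TSC.Server.PublishMode.Overwrite)
        print("Published workbook id:", workbook.id)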

at UnFound

1 recruiter
Posted by Ankur Pandey
Mumbai
1 - 40 yrs
₹5L - ₹5L / yr
Machine Learning (ML)
Deep Learning
Natural Language Processing (NLP)
Python
Microservices
+3 more
Does the current state of media frustrate you? Do you want to change the way we consume news? Are you a kickass machine learning practitioner and aspiring entrepreneur, who has opinions on world affairs as well? If so, continue reading!

We at UnFound are developing a product which simplifies complex and cluttered news into simple themes, removes bias by showing all (and often unheard of) perspectives, and produces crisp summaries, all with minimal human intervention!

We are looking for a passionate and experienced machine learning ENGINEER/INTERN, *preferably* with experience in NLP. We want someone who can take initiative. If you need to be micro-managed, this is NOT the role for you.

1. Demonstrable background in machine learning, especially NLP, information retrieval, etc.
2. Hands-on with popular data science frameworks: Python, Jupyter, TensorFlow, PyTorch.
3. Implementation-ready background in deep learning techniques like word embeddings, CNN, RNN/LSTM, etc. (a minimal sketch follows this list).
4. Experience with productionizing machine learning solutions, especially ML-powered mobile/web apps/bots.
5. Hands-on experience with AWS and other cloud platforms. GPU experience is strongly preferred.
6. Thorough understanding of back-end concepts and databases (SQL, Postgres, NoSQL, etc.).
7. Good Kaggle (or similar) scores; MOOCs (Udacity, Coursera, fast.ai, etc.) preferred.
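
A minimal sketch of the kind of model implied by item 3, with word embeddings feeding an LSTM and a small classification head; vocabulary size, dimensions and label count are placeholders (PyTorch, Python):

    import torch
    import torch.nn as nn

    class NewsClassifier(nn.Module):
        def __init__(self, vocab_size=20000, embed_dim=128, hidden_dim=64, n_classes=3):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, embed_dim)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.head = nn.Linear(hidden_dim, n_classes)

        def forward(self, token_ids):               # token_ids: (batch, seq_len) int64
            embedded = self.embedding(token_ids)
            _, (hidden, _) = self.lstm(embedded)    # hidden: (1, batch, hidden_dim)
            return self.head(hidden[-1])            # logits: (batch, n_classes)

    model = NewsClassifier()
    dummy_batch = torch.randint(0, 20000, (4, 50))  # 4 documents, 50 tokens each
    print(model(dummy_batch).shape)                 # torch.Size([4, 3])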

at mPaani Solutions Pvt Ltd

1 video
2 recruiters
Posted by Julie K
Mumbai
3 - 7 yrs
₹5L - ₹15L / yr
Machine Learning (ML)
Python
Data Science
Big Data
R Programming
+2 more
Data Scientist - We are looking for a candidate to build great recommendation engines and power an intelligent m.Paani user journey. (An illustrative recommendation sketch follows this description.)

Responsibilities:
  • Data mining using methods like associations, correlations, inferences, clustering, graph analysis, etc.
  • Scale the machine learning algorithms that power our platform to support our growing customer base and increasing data volume
  • Design and implement machine learning, information extraction and probabilistic matching algorithms and models
  • Care about designing the full machine learning pipeline
  • Extend the company's data with third-party sources
  • Enhance data collection procedures
  • Process, clean and verify the data collected
  • Perform ad hoc analysis of the data and present clear results
  • Create advanced analytics products that provide actionable insights

The Individual - we are looking for a candidate with the following skills, experience and attributes:

Required:
  • Someone with 2+ years of work experience in machine learning
  • Educational qualification relevant to the role: a degree in Statistics, certificate courses in Big Data, Machine Learning, etc.
  • Knowledge of machine learning techniques and algorithms
  • Knowledge of languages and toolkits like Python, R, NumPy
  • Knowledge of data visualization tools like D3.js, ggplot2
  • Knowledge of query languages like SQL, Hive, Pig
  • Familiarity with Big Data architecture and tools like Hadoop, Spark, MapReduce
  • Familiarity with NoSQL databases like MongoDB, Cassandra, HBase
  • Good applied statistics skills: distributions, statistical testing, regression, etc.

Compensation & Logistics:
This is a full-time opportunity. Compensation will be in line with a startup, and will be based on qualifications and experience. The position is based in Mumbai, India, and the candidate must live in Mumbai or be willing to relocate.
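
An illustrative recommendation sketch using simple item-based collaborative filtering on a toy interaction matrix; real data would come from transaction history (NumPy, Python):

    import numpy as np

    # Toy user-item interaction matrix (rows = users, columns = items).
    interactions = np.array([
        [1, 0, 1, 0],
        [0, 1, 1, 0],
        [1, 1, 0, 1],
    ], dtype=float)

    # Item-item cosine similarity.
    norms = np.linalg.norm(interactions, axis=0, keepdims=True)
    norms[norms == 0] = 1.0
    normalized = interactions / norms
    item_sim = normalized.T @ normalized

    # Score unseen items for user 0 and recommend the highest-scoring one.
    user = interactions[0]
    scores = item_sim @ user
    scores[user > 0] = -np.inf   # do not re-recommend items already seen
    print("Recommend item:", int(np.argmax(scores)))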