Credit Risk Analyst

at Niro

Posted by Vinay Gurram
Bengaluru (Bangalore)
2 - 4 yrs
₹7L - ₹15L / yr (ESOP available)
Full time
Skills
Risk assessment
Risk Management
Risk analysis
Python
SAS
SPSS
R
Responsibilities:
  • Gather information from multiple data sources and make approval decisions mechanically
  • Read and interpret credit-related information on borrowers
  • Interpret, analyze and assess all forms of complex information
  • Carry out risk assessment analysis
  • Maintain the company's credit exposure within acceptable risk levels and set limits
  • Build strategies to minimize risk and increase approval rates
  • Design champion/challenger tests, implement them and read the results (a minimal sketch follows after this list)
  • Build credit line assignment strategies
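
To make the champion/challenger item concrete, here is a minimal, hypothetical sketch of randomly routing a share of applications to a challenger strategy and then reading the results; the column names, split ratio and approval flag are illustrative, not Niro's actual setup.

```python
import numpy as np
import pandas as pd

def assign_strategy(applications: pd.DataFrame, challenger_share: float = 0.1,
                    seed: int = 42) -> pd.DataFrame:
    """Randomly route a small share of applications to the challenger strategy."""
    rng = np.random.default_rng(seed)
    applications = applications.copy()
    applications["strategy"] = np.where(
        rng.random(len(applications)) < challenger_share, "challenger", "champion"
    )
    return applications

# Reading results: compare approval rate (and, later, bad rate) by test arm.
# df = assign_strategy(df)
# print(df.groupby("strategy")["approved"].mean())
```
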
Skills required:
- Credit Risk Modeling
- Statistical data understanding and interpretation
- Basic regression and advanced machine learning models
- Conversant with coding in Python using libraries such as scikit-learn
- Build and understand decision trees (a minimal sketch follows below)
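
As a loose illustration of the scikit-learn and decision-tree points above, a minimal credit-approval modelling sketch might look like the following; the feature names and data are entirely made up for illustration.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical applicant features; real data would come from bureau + platform sources.
data = pd.DataFrame({
    "monthly_income": [30000, 85000, 42000, 120000, 25000, 60000],
    "bureau_score":   [620, 780, 700, 810, 580, 730],
    "utilization":    [0.9, 0.2, 0.5, 0.1, 0.95, 0.4],
    "defaulted":      [1, 0, 0, 0, 1, 0],
})

X = data.drop(columns="defaulted")
y = data["defaulted"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)

# A shallow tree keeps the approval rules easy to read and explain.
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X_train, y_train)
print(export_text(tree, feature_names=list(X.columns)))
```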

About Niro

QuantiFi's core hypothesis rests on working with consumer internet companies such as Bounce, Quikr, MakeMyTrip, Swiggy and TrueCaller to leverage their data and distribution, ultimately reducing acquisition costs, operating costs and risk. QuantiFi aims to manufacture and distribute competitive financial products through innovation in segment coverage, products and delivery, in leading B2B2C partnerships.

Embedded finance is revolutionising distribution in the financial services space by working in close liaison with brands that have large, engaged customer bases. As of today, embedded offerings predominantly focus on increasing consumers' propensity to purchase products; the most popular structures are add-on insurance, 'buy-now-pay-later' and consumption loans. These structures make a compelling use case for mid-to-high ticket purchases and for brands with a well-defined commercial value proposition.

The embedded finance hypothesis, however, can be extended to practically any business regardless of transaction frequency and size. Consumers of any platform can be offered targeted, competitive financial products through a native, user-friendly experience, with targeting performed by data models and algorithms built on platform-generated user behaviour. QuantiFi aims to achieve this frictionless delivery of financial products, which could range from a personal loan or credit line to a bank deposit or investment advisory.

The founders of QuantiFi have extensive experience in financial services, across both the corporate world and the fintech start-up ecosystem. Aditya started his career as an investment banker with Lehman Brothers and later founded Qbera in late 2015. In between, he founded an international school in Kolkata and grew his family-owned businesses. Qbera was acquired by InCred Financial Services in 2020, and he headed the personal loans and platform business at InCred until 2021.

Sankalp is an engineer from IIT Kanpur, class of 2003. He has spent most of his career with banks such as HSBC, Citi and Barclays, and the last five years with the start-ups KredX and MoneyTap. During his corporate stints he worked across multiple geographies and on products such as cards, loans and premium banking. At the two high-growth start-ups he worked on invoice discounting and personal credit line products. He has extensive experience in analytics, underwriting, collections, modelling and forecasting.

QuantiFi is hiring and building its core team across Technology, Data Science, Product, Revenue, Collections and Operations. We aim to build a great team of leaders who can deliver across the board, drive decision-making and build an explosive growth engine for the years to come.
Founded 2021  •  Product  •  20-100 employees  •  Raised funding

Similar jobs

Data Engineer

at Tech Prescient

Founded 2017  •  Products & Services  •  20-100 employees  •  Profitable
Amazon Web Services (AWS)
Python
Spark
Amazon Redshift
Snow flake schema
Hadoop
Remote only
2 - 4 yrs
₹10L - ₹20L / yr

We are looking for: Data Engineers

 

The Data Engineer will be responsible for conceiving, building and managing the data warehouse and data platforms.

 

Typical responsibilities include: Data pipelines, analytics engineering, distributed computing, big data and small data management, caching, data warehouse, cluster management, graph processing, dynamic programming

 

Qualifications

  • 2+ years of production software experience
  • Experience with cloud platforms, preferably AWS.
  • Strong knowledge of popular database and data warehouse technologies and concepts, such as BigQuery, Redshift and Snowflake.
  • Experience in data modeling, data design and persistence on large, complex datasets.
  • Experience with object-oriented design and development (preferably Python/Java)
  • Background in Spark or other big data technologies and non-relational databases is a plus (a minimal sketch follows after this list).
  • Experience with software development best practices, including unit testing and continuous delivery.
  • Desire to apply agile development principles in a fast-paced startup environment.
  • Strong teamwork and communication skills.
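
As a rough sketch of the Spark-on-AWS background mentioned above, a minimal PySpark pipeline could look like this; the bucket names, paths and columns are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Read raw CSV landed in S3 (hypothetical bucket/path).
orders = (spark.read
          .option("header", True)
          .csv("s3a://example-raw-bucket/orders/2022/*.csv"))

# Light transformation: type casting and a daily revenue aggregate.
daily_revenue = (orders
                 .withColumn("amount", F.col("amount").cast("double"))
                 .withColumn("order_date", F.to_date("created_at"))
                 .groupBy("order_date")
                 .agg(F.sum("amount").alias("revenue")))

# Write partitioned Parquet back to a warehouse staging area (hypothetical path).
(daily_revenue.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3a://example-curated-bucket/daily_revenue/"))
```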

 

Other non-negotiable requirements are:

1. Good academics

2. Good communication skills

 

 

About Tech Prescient: We are a technology-based product development services company working with technology companies to build their products. We work with customers to design and develop their product stack, so the quality of the work we produce is always premium. We are looking for equally motivated people to join our vibrant team and are sure we will make it a win-win situation.

Job posted by
Rutuja Prayag

Data Engineer - (Azure, Logic apps, Powerapp)

at Onepoint IT Consulting Pvt Ltd

Founded 2008  •  Services  •  20-100 employees  •  Profitable
SQL
logic apps
ADF
snowflake
power apps
Pune
2 - 8 yrs
₹7L - ₹15L / yr
This developer will focus on developing new features, implementing unit tests and helping develop forward-thinking development strategies.

Is this you?

 

  • I am passionate about developing software and scripts to maintain complex Systems.
  • I have vision and talent to contribute in new emerging areas such as cloud technologies.
  • I love logic and solving puzzles.
  • I thrive working with a diverse, highly skilled team based in the UK and India.
  • I am fluent in English, both written and spoken.

 

Qualifications

 

  • Experience in Azure (ADF, Logic Apps) and Azure Functions (a minimal sketch follows after this list)
  • Experience with SQL and databases.
  • Knowledge of data warehousing and data modeling
  • Knowledge of Snowflake
  • Knowledge of or experience with low-code applications (e.g. PowerApps)
  • Knowledge of design patterns and the Software Development Life Cycle.
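
As a loose illustration of the Azure Functions item above, a minimal HTTP-triggered function in Python (v1 programming model; the accompanying function.json binding is not shown, and the parameter names are hypothetical) might look like this:

```python
import json
import logging

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    """Echo a 'name' parameter back as JSON; stands in for real business logic."""
    logging.info("Processing request to %s", req.url)
    name = req.params.get("name")
    if not name:
        try:
            name = (req.get_json() or {}).get("name")
        except ValueError:
            name = None
    if not name:
        return func.HttpResponse("Pass a 'name' query parameter or JSON field.",
                                 status_code=400)
    return func.HttpResponse(json.dumps({"greeting": f"Hello, {name}"}),
                             mimetype="application/json")
```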

 

Competencies

 

  • Excellent written and verbal communication skills in English and Hindi.
  • Excellent interpersonal skills to collaborate with various stakeholders.
  • Good reasoning and logical ability.
  • Ability to identify the right questions and understand the big picture.
  • A constant learner who enjoys new challenges.
  • Self-starter with excellent time management skills.

 

Benefits

Excellent work-life balance, including flexible working hours within core working hours.

  • Active encouragement to take part in decision-making at all levels.
  • Assigned mentor for self-development.
  • 18 days of annual leave.
  • Medical insurance and Provident Fund.
  • Flexible work culture
Job posted by
Maithili Shetty

Data scientist

at SocialPrachar.com

Founded 2014  •  Services  •  20-100 employees  •  Profitable
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
recommendation algorithm
Hyderabad
0 - 1 yrs
₹1.8L - ₹3L / yr

We are looking for a Data Science / AI professional for our KPHB, Hyderabad branch. Requirements:

• Minimum 6 months to 1 year of experience in Data Science / AI

• Proven experience, with a good GitHub profile and projects

• Good grasp of Data Science and AI concepts

• Good with Python, ML, statistics, deep learning, NLP, OpenCV, etc.

• Good communication and presentation skills

Job posted by
Mahesh Ch

Senior Data Engineer

at Velocity.in

Founded 2019  •  Product  •  20-100 employees  •  Raised funding
ETL
Informatica
Data Warehouse (DWH)
Data engineering
Oracle
PostgreSQL
DevOps
Amazon Web Services (AWS)
NodeJS (Node.js)
Ruby on Rails (ROR)
React.js
Python
Bengaluru (Bangalore)
4 - 9 yrs
₹15L - ₹35L / yr

We are an early stage start-up, building new fintech products for small businesses. Founders are IIT-IIM alumni, with prior experience across management consulting, venture capital and fintech startups. We are driven by the vision to empower small business owners with technology and dramatically improve their access to financial services. To start with, we are building a simple, yet powerful solution to address a deep pain point for these owners: cash flow management. Over time, we will also add digital banking and 1-click financing to our suite of offerings.

 

We have developed an MVP which is being tested in the market. We have closed our seed funding from marquee global investors and are now actively building a world class tech team. We are a young, passionate team with a strong grip on this space and are looking to on-board enthusiastic, entrepreneurial individuals to partner with us in this exciting journey. We offer a high degree of autonomy, a collaborative fast-paced work environment and most importantly, a chance to create unparalleled impact using technology.

 

Reach out if you want to get in on the ground floor of something which can turbocharge SME banking in India!

 

The technology stack at Velocity comprises a wide variety of cutting-edge technologies such as NodeJS, Ruby on Rails, reactive programming, Kubernetes, AWS, Python, ReactJS, Redux (Saga), Redis, Lambda, etc.

 

Key Responsibilities

  • Responsible for building data and analytical engineering pipelines with standard ELT patterns, implementing data compaction pipelines, data modelling and overseeing overall data quality

  • Work with the Office of the CTO as an active member of our architecture guild

  • Writing pipelines to consume the data from multiple sources

  • Writing a data transformation layer using DBT to transform millions of records into warehouse models (a minimal sketch follows after this list).

  • Implement Data warehouse entities with common re-usable data model designs with automation and data quality capabilities

  • Identify downstream implications of data loads/migration (e.g., data quality, regulatory)
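
As a rough sketch of how the ELT and DBT responsibilities above often fit together, assuming Airflow 2.x and dbt invoked via its CLI; the DAG id, schedule and paths here are hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="warehouse_elt",
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",  # one run per day of source data
    catchup=False,
) as dag:
    # Extract/load step: in practice this could be a managed connector or custom Python.
    load_raw = BashOperator(
        task_id="load_raw",
        bash_command="python /opt/pipelines/load_raw_sources.py",
    )

    # Transform step: dbt builds the modelled layer on top of the raw tables.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run --profiles-dir .",
    )

    # Data quality step: dbt tests guard the downstream consumers.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt_project && dbt test --profiles-dir .",
    )

    load_raw >> dbt_run >> dbt_test
```
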

 

What To Bring

  • 5+ years of software development experience; startup experience is a plus.

  • Past experience of working with Airflow and DBT is preferred

  • 5+ years of experience working in any backend programming language. 

  • Strong first-hand experience with data pipelines and relational databases such as Oracle, Postgres, SQL Server or MySQL

  • Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test Driven Development)

  • Experienced in formulating ideas, building proofs-of-concept (POC) and converting them into production-ready projects

  • Experience building and deploying applications on on-premise infrastructure and on AWS or Google Cloud

  • A basic understanding of Kubernetes and Docker is a must.

  • Experience in data processing (ETL, ELT) and/or cloud-based platforms

  • Working proficiency and communication skills in verbal and written English.

 

Job posted by
Newali Hazarika

Python developer

at Gauge Data Solutions Pvt Ltd

Founded 2014  •  Products & Services  •  20-100 employees
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
Artificial Intelligence (AI)
Python
OOAD
Data storage
recommendation algorithm
Noida
0 - 4 yrs
₹3L - ₹8L / yr

Essential Skills :

- Develop, enhance and maintain Python related projects, data services, platforms and processes.

- Apply and maintain data quality checks to ensure data integrity and completeness.

- Able to integrate multiple data sources and databases.

- Collaborate with cross-functional teams across Decision Sciences, Search and Database Management to design innovative solutions, capture requirements and drive a common future vision.

Technical Skills/Capabilities :

- Hands on experience in Python programming language.

- Understanding and proven application of computer science fundamentals in object-oriented design, data structures, algorithm design, regular expressions, data storage procedures, problem solving, and complexity analysis.

- Understanding of natural language processing and basic ML algorithms will be a plus.

- Good troubleshooting and debugging skills.

- Strong individual contributor, self-motivated, and a proven team player.

- Eager to learn and develop new experience and skills.

- Good communication and interpersonal skills.

About Company Profile :

Gauge Data Solutions Pvt Ltd :

- We are a leading company in Data Science, Machine Learning and Artificial Intelligence.

- Within Gauge Data we have a competitive environment for developers and engineers.

- We at Gauge create solutions for real-world problems. One example of our engineering is Casemine.

- Casemine is a legal research platform powered by Artificial Intelligence. It helps lawyers, judges and law researchers in their day-to-day work.

- Casemine provides exhaustive case results to its users with the use of cutting-edge technologies.

- It is developed with the efforts of great engineers at Gauge Data.

- One such opportunity is now open for you. We at Gauge Data invite applications from competitive, self-motivated Python developers.

Purpose of the Role :

- This position will play a central role in developing new features and enhancements for the products and services at Gauge Data.

- To know more about what we do and how we do it, feel free to read these articles:

- https://bit.ly/2YfVAsv

- https://bit.ly/2rQArJc

- You can also visit us at https://www.casemine.com/.

- For more information visit us at: - www.gaugeanalytics.com

- Join us on LinkedIn, Twitter & Facebook
Job posted by
Deeksha Dewal

Data Scientist

at upGrad

Founded 2015  •  Product  •  100-500 employees  •  Raised funding
Data Science
R Programming
Python
SQL
Natural Language Processing (NLP)
Machine Learning (ML)
Tableau
Bengaluru (Bangalore), Mumbai
4 - 6 yrs
₹10L - ₹21L / yr

About Us

upGrad is an online education platform building the careers of tomorrow by offering the most industry-relevant programs in an immersive learning experience. Our mission is to create a new digital-first learning experience to deliver tangible career impact to individuals at scale. upGrad currently offers programs in Data Science, Machine Learning, Product Management, Digital Marketing, Entrepreneurship and more. upGrad is looking for people passionate about management and education to help design learning programs for working professionals, so they can stay sharp and relevant while we help build the careers of tomorrow.

  • upGrad was awarded the Best Tech for Education by IAMAI for 2018-19

  • upGrad was also ranked as one of the LinkedIn Top Startups 2018: the 25 most sought-after startups in India

  • upGrad was earlier selected as one of the top ten most innovative companies in India by FastCompany.

  • We were also covered by the Financial Times along with other disruptors in Ed-Tech

  • upGrad is the official education partner for the Government of India's Startup India program

  • Our program with IIIT B has been ranked the #1 program in the country in the domain of Artificial Intelligence and Machine Learning

     

    Role Summary

    Are you excited by the challenge and the opportunity of applying data-science and data-analytics techniques to the fast-developing education technology domain? Do you look forward to the sense of ownership and achievement that comes with innovating, creating data products from scratch and pushing them live into production systems? Do you want to work with a team of highly motivated members who are on a mission to empower individuals through education?
    If this is you, come join us and become a part of the upGrad technology team. At upGrad, the technology team enables all facets of the business: bringing efficiency to our marketing and sales initiatives, enhancing our student learning experience, empowering our content, delivery and student success teams, and aiding our students towards their desired career outcomes. We bring together data and tech to solve these business problems and opportunities.
    We are looking for a highly skilled, experienced and passionate data scientist who can come on board and help create the next generation of data-powered education tech products. The ideal candidate has worked in a data science role before, is comfortable working with unknowns, can evaluate data and the feasibility of applying scientific techniques to business problems and products, and has a track record of developing and deploying data-science models into live applications. We want someone with a strong math, stats and data-science background, comfortable handling data (structured and unstructured), with strong engineering know-how to implement and support such data products in a production environment.
    Ours is a highly iterative and fast-paced environment, so flexibility, clear communication and attention to detail are very important too. The ideal candidate should be passionate about customer impact and comfortable working with multiple stakeholders across the company.


    Roles & Responsibilities

      • 3+ years of experience in analytics, data science, machine learning or comparable role
      • Bachelor's degree in Computer Science, Data Science/Data Analytics, Math/Statistics or related discipline 
      • Experience in building and deploying Machine Learning models in Production systems
      • Strong analytical skills: ability to make sense out of a variety of data and its relation/applicability to the business problem or opportunity at hand
      • Strong programming skills: comfortable with Python - pandas, numpy, scipy, matplotlib; Databases - SQL and noSQL
      • Strong communication skills: ability to both formulate/understand the business problem at hand as well as ability to discuss with non data-science background stakeholders 
      • Comfortable dealing with ambiguity and competing objectives

       

      Skills Required

      • Experience in Text Analytics, Natural Language Processing

      • Advanced degree in Data Science/Data Analytics or Math/Statistics

      • Comfortable with data-visualization tools and techniques

      • Knowledge of AWS and Data Warehousing

      • Passion for building data products for production systems: a strong desire to impact the product through data-science techniques

Job posted by
Priyanka Muralidharan

Data Engineer

at a client company in Analytics (RF1)

Agency job
via Multi Recruit
Data Engineer
Big Data
Python
Amazon Web Services (AWS)
SQL
Java
ETL
Bengaluru (Bangalore)
3 - 5 yrs
₹12L - ₹14L / yr
  • We are looking for a Data Engineer with 3-5 years of experience in Python, SQL, AWS (EC2, S3, Elastic Beanstalk, API Gateway) and Java.
  • The applicant must be able to perform data mapping (data type conversion, schema harmonization) using Python, SQL and Java (a minimal sketch follows after this list).
  • The applicant must be familiar with and have programmed ETL interfaces (OAuth, REST API, ODBC) using the same languages.
  • The company is looking for someone who shows an eagerness to learn and who asks concise questions when communicating with teammates.
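
As a small, hypothetical illustration of the data-mapping point above (the column names and target schema are made up), a pandas sketch might look like this:

```python
import pandas as pd

# Hypothetical source extract with inconsistent names and string-typed values.
source = pd.DataFrame({
    "CustID": ["001", "002"],
    "signup_dt": ["2021-03-01", "2021-04-15"],
    "ltv": ["120.5", "98.0"],
})

# Target schema: rename to harmonized names and cast to the expected types.
column_map = {"CustID": "customer_id", "signup_dt": "signup_date", "ltv": "lifetime_value"}
target = (source.rename(columns=column_map)
          .astype({"customer_id": "int64", "lifetime_value": "float64"})
          .assign(signup_date=lambda df: pd.to_datetime(df["signup_date"])))

print(target.dtypes)
```
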
Job posted by
Ragul Ragul

Data Engineer

at Aptus Data LAbs

Founded 2014  •  Products & Services  •  100-1000 employees  •  Profitable
Data engineering
Big Data
Hadoop
Data Engineer
Apache Kafka
Apache Spark
Python
Elastic Search
Kibana
Cisco Certified Network Associate (CCNA)
Bengaluru (Bangalore)
5 - 10 yrs
₹6L - ₹15L / yr

Roles & Responsibilities

  1. Proven experience deploying and tuning open-source components into enterprise-ready production tooling. Experience with data centre (Metal as a Service, MAAS) and cloud deployment technologies (AWS or GCP Architect certificates required).
  2. Deep understanding of Linux, from kernel mechanisms through user-space management.
  3. Experience with CI/CD (Continuous Integration and Deployment) system solutions (Jenkins).
  4. Use of monitoring tools (local and on public cloud platforms) such as Nagios, Prometheus, Sensu, ELK, CloudWatch, Splunk and New Relic to trigger instant alerts, reports and dashboards. Work closely with the development and infrastructure teams to analyze and design solutions with four-nines (99.99%) uptime on globally distributed, clustered, production and non-production virtualized infrastructure.
  5. Wide understanding of IP networking as well as data centre infrastructure.

Skills

  1. Expert with software development tools and source code management: understanding and managing issues and code changes, and grouping them into deployment releases in a stable and measurable way to maximize production uptime. Must be expert at developing and using Ansible roles and configuring deployment templates with Jinja2.
  2. Solid understanding of data collection tools like Flume, Filebeat, Metricbeat and JMX Exporter agents.
  3. Extensive experience operating and tuning the Kafka streaming data platform, specifically as a message queue for big data processing (a minimal consumer sketch follows after this list).
  4. Strong understanding of and hands-on experience with:
  5. the Apache Spark framework, specifically Spark Core and Spark Streaming,
  6. orchestration platforms: Mesos and Kubernetes,
  7. data storage platforms: Elastic Stack, Carbon, ClickHouse, Cassandra, Ceph, HDFS,
  8. core presentation technologies: Kibana and Grafana.
  9. Excellent scripting and programming skills (Bash, Python, Java, Go, Rust). Must have previous experience with Rust in order to support and improve in-house developed products.
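
As a loose illustration of the Kafka item above, assuming the kafka-python client; the topic, brokers and consumer group are hypothetical:

```python
import json

from kafka import KafkaConsumer  # kafka-python client

consumer = KafkaConsumer(
    "events.raw",                              # hypothetical topic
    bootstrap_servers=["broker1:9092", "broker2:9092"],
    group_id="bigdata-ingest",
    auto_offset_reset="earliest",
    enable_auto_commit=False,                  # commit manually after processing
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # In a real pipeline this would land in Spark/HDFS/Elasticsearch, not stdout.
    print(message.topic, message.partition, message.offset, event.get("type"))
    consumer.commit()
```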

Certification

Red Hat Certified Architect certificate or equivalent required. CCNA certificate required. 3-5 years of experience running open-source big data platforms.

Job posted by
Merlin Metilda

Machine Learning Engineer

at Centime

Agency job
via FlexAbility
Machine Learning (ML)
Artificial Intelligence (AI)
Deep Learning
Java
Python
Hyderabad
8 - 14 yrs
₹15L - ₹35L / yr

Required skill

  • Around 6 to 8.5 years of experience, with around 4+ years in the AI/machine learning space
  • Extensive experience designing large-scale machine learning solutions for ML use cases and large-scale deployments, and establishing continuous automated improvement/retraining frameworks.
  • Strong experience in Python and Java is required.
  • Hands-on experience with scikit-learn, Pandas and NLTK
  • Experience in handling time-series data and associated techniques like Prophet and LSTM (a minimal sketch follows after this list)
  • Experience with regression, clustering and classification algorithms
  • Extensive experience building traditional machine learning models (SVM, XGBoost, decision trees) and deep neural network models (RNN, feedforward) is required.
  • Experience with AutoML tools like TPOT or others
  • Must have strong hands-on experience with deep learning frameworks like Keras, TensorFlow or PyTorch
  • Knowledge of Capsule Networks, reinforcement learning or SageMaker is a desirable skill
  • Understanding of the financial domain is a desirable skill
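
As a minimal sketch of the time-series point above, using Prophet's required ds/y column naming on a made-up series:

```python
import pandas as pd
from prophet import Prophet  # formerly distributed as fbprophet

# Prophet expects a frame with columns 'ds' (timestamp) and 'y' (value).
history = pd.DataFrame({
    "ds": pd.date_range("2021-01-01", periods=90, freq="D"),
    "y": pd.Series(range(90)) * 0.5 + 100,   # hypothetical upward-trending series
})

model = Prophet(weekly_seasonality=True, yearly_seasonality=False)
model.fit(history)

# Forecast the next 30 days and inspect the point estimate with its interval.
future = model.make_future_dataframe(periods=30)
forecast = model.predict(future)
print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())
```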

 Responsibilities 

  • Design and implementation of solutions for ML use cases
  • Productionize systems and maintain them
  • Lead and implement the data acquisition process for ML work
  • Learn new methods and models quickly and utilize them in solving use cases
Job posted by
srikanth voona

Senior Machine Learning Engineer

at AthenasOwl

Founded 2017  •  Product  •  100-500 employees  •  Raised funding
Deep Learning
Natural Language Processing (NLP)
Machine Learning (ML)
Computer vision
Python
Data Structures
Mumbai
3 - 7 yrs
₹10L - ₹20L / yr

Company Profile and Job Description  

About us:  

AthenasOwl (AO) is our “AI for Media” solution that helps content creators and broadcasters create and curate smarter content. We launched the product in 2017 as an AI-powered suite meant for the media and entertainment industry. Clients use AthenasOwl's context-adapted technology for redesigning content, making better targeting decisions, automating hours of post-production work and monetizing massive content libraries.

For more details visit: www.athenasowl.tv   

  

Role:   

Senior Machine Learning Engineer  

Experience Level:   

4 -6 Years of experience  

Work location:   

Mumbai (Malad W)   

  

Responsibilities:   

  • Develop cutting edge machine learning solutions at scale to solve computer vision problems in the domain of media, entertainment and sports
  • Collaborate with media houses and broadcasters across the globe to solve niche problems in the field of post-production, archiving and viewership
  • Manage a team of highly motivated engineers to deliver high-impact solutions quickly and at scale

 

 

The ideal candidate should have:   

  • Strong programming skills in any one or more programming languages like Python and C/C++
  • Sound fundamentals of data structures, algorithms and object-oriented programming
  • Hands-on experience with any one popular deep learning framework like TensorFlow, PyTorch, etc. (a minimal sketch follows after this list)
  • Experience in implementing Deep Learning Solutions (Computer Vision, NLP etc.)
  • Ability to quickly learn and communicate the latest findings in AI research
  • Creative thinking for leveraging machine learning to build end-to-end intelligent software systems
  • A pleasantly forceful personality and charismatic communication style
  • Someone who will raise the average effectiveness of the team and has demonstrated exceptional abilities in some area of their life. In short, we are looking for a “Difference Maker”
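
As a loose illustration of the deep-learning framework item above, a tiny PyTorch model sketch; the architecture, labels and tensor shapes are purely illustrative:

```python
import torch
from torch import nn

class TinyFrameClassifier(nn.Module):
    """A toy CNN that tags a video frame with one of a few scene labels."""

    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

# One fake batch of 8 RGB frames at 64x64; real input would come from decoded video.
frames = torch.randn(8, 3, 64, 64)
logits = TinyFrameClassifier()(frames)
print(logits.shape)  # torch.Size([8, 4])
```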

 

Job posted by
Ericsson Fernandes