Head- Data Science
Fintech Pioneer | GGN
Agency job
via Unnati
8 - 13 yrs
₹60L - ₹70L / yr
Delhi, Gurugram, Noida
Skills
Data Science
Data Scientist
Python
SQL
Machine Learning (ML)
Analytics
Predictive modelling
Amazon Web Services (AWS)
Quality management
Join a leading mCommerce company and set your career on a flight towards success and growth.
 
Our client is one of the oldest fintech companies, taking banking and financial services to all customers through its online platform. Having served over 50 million customers in the last 15 years, it enables over 7 million banking transactions each month through a network of nearly 200,000 (2 lakh) merchants. Using this vast network of merchant outlets, the platform reaches lower- and mid-income groups who deal in cash, enabling them to remit money across the country digitally. It now plans to take its unique digital financial solutions to developing markets across the globe. As pioneers of mobile-based payment services in India, they empower retailers, individuals and businesses to have an online presence and earn or save a little extra through their transactions.
 
As Head - Data Science, you will be part of the leadership team and will be expected to manage ambiguity and help the founders and other leaders build the roadmap forward for the business.
 
You will be expected to adopt an "iron sharpens iron" attitude, focusing on making everyone and every data-driven process better, and blending people leadership and management skills with predictive modelling and analytics expertise, cloud computing skills and operational know-how.
 
What you will do:
  • Working closely with business stakeholders to define, strategize and execute crucial business problem statements that lie at the core of improving current and future data-backed product offerings
  • Building and refining underwriting models for extending credit to sellers and API partners, in collaboration with the lending team
  • Conceiving, planning and prioritizing data projects and managing timelines
  • Building analytical systems and predictive models as a part of the agile ecosystem
  • Testing the performance of data-driven products and participating in sprint-wise feature releases
  • Managing a team of data scientists and data engineers to develop, train and test predictive models
  • Managing collaboration with internal and external stakeholders
  • Building a data-centric culture from within: partnering with every team, learning deeply about the business, and working with highly experienced, sharp and insanely ambitious colleagues
 

What you need to have:

  • B.Tech/ M.Tech/ MS/ PhD in Data Science, Computer Science, Statistics, or Mathematics & Computation from IIT, BITS Pilani or ISI, with a demonstrated track record of leading an Analytics and Data Science team
  • 8+ years of experience in the Data Science and Analytics domain, with 3+ years of experience leading a data science team: understanding which projects to prioritize and how the team's strategy aligns with the organization's mission
  • Deep understanding of the credit risk landscape; should have built or maintained underwriting models for unsecured lending products
  • Should have managed a leadership team in a tech startup, preferably a fintech/ lending/ credit risk startup
  • We value entrepreneurial spirit: experience starting your own venture is an added advantage
  • Strategic thinker with agility and endurance
  • Aware of the latest industry trends in Data Science and Analytics with respect to Fintech, Digital Transformations and Credit-lending domain
  • Excellent communication skills: the key to managing multiple stakeholders such as the leadership team, product teams, and existing & new investors
  • Cloud computing, Python, SQL, ML algorithms, analytics and a problem-solving mindset
  • Knowledge of and demonstrated skills in AWS

Similar jobs

Publicis Sapient
10 recruiters
Posted by Mohit Singh
Bengaluru (Bangalore), Pune, Hyderabad, Gurugram, Noida
5 - 11 yrs
₹20L - ₹36L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+7 more


Job Summary:

As a Senior Associate L2 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and will independently drive design discussions to ensure the necessary health of the overall solution.

The role requires a hands-on technologist with a strong programming background in Java / Scala / Python; experience in data ingestion, integration and wrangling, computation and analytics pipelines; and exposure to Hadoop ecosystem components. You are also required to have hands-on knowledge of at least one of the AWS, GCP or Azure cloud platforms.


Role & Responsibilities:

Your role is focused on the design, development and delivery of solutions involving:

• Data Integration, Processing & Governance

• Data Storage and Computation Frameworks, Performance Optimizations

• Analytics & Visualizations

• Infrastructure & Cloud Computing

• Data Management Platforms

• Implement scalable architectural models for data processing and storage

• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode

• Build functionality for data analytics, search and aggregation

Experience Guidelines:

Mandatory Experience and Competencies:


1. Overall 5+ years of IT experience, with 3+ years in data-related technologies

2. Minimum 2.5 years of experience in Big Data technologies and working exposure to related data services on at least one cloud platform (AWS / Azure / GCP)

3. Hands-on experience with the Hadoop stack: HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow and other components required to build end-to-end data pipelines

4. Strong experience in at least one of the programming languages Java, Scala or Python; Java preferred

5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.

6. Well-versed, working knowledge of data platform related services on at least one cloud platform, including IAM and data security


Preferred Experience and Knowledge (Good to Have):


1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience

2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.

3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures

4. Performance tuning and optimization of data pipelines

5. CI/CD: infrastructure provisioning on cloud, automated build & deployment pipelines, code quality

6. Cloud data specialty and other related Big Data technology certifications


Personal Attributes:

• Strong written and verbal communication skills

• Articulation skills

• Good team player

• Self-starter who requires minimal oversight

• Ability to prioritize and manage multiple tasks

• Process orientation and the ability to define and set up processes


Bengaluru (Bangalore)
3 - 6 yrs
₹20L - ₹40L / yr
Data Science
Weka
Data Scientist
Statistical Modeling
Mathematics
+5 more
Roles and Responsibilities
● Research and develop advanced statistical and machine learning models for analysis of large-scale, high-dimensional data.
● Dig deeper into data, understand its characteristics, evaluate alternate models and validate hypotheses through theoretical and empirical approaches.
● Productize proven or working models into production-quality code.
● Collaborate with product management, marketing and engineering teams in business units to elicit and understand their requirements and challenges, and develop potential solutions.
● Stay current with the latest research and technology ideas; share knowledge by clearly articulating results and ideas to key decision makers.
● File patents for innovative solutions that add to the company's IP portfolio.

Requirements
● 4 to 6 years of strong experience in data mining, machine learning and statistical analysis.
● BS/MS/PhD in Computer Science, Statistics, Applied Math or related areas from premier institutes (only IITs / IISc / BITS / top NITs or a top US university).
● Experience productizing models into code in a fast-paced start-up environment.
● Expertise in the Python programming language and fluency in analytical tools such as Matlab, R, Weka, etc.
● Strong intuition for data and a keen aptitude for large-scale data analysis.
● Strong communication and collaboration skills.
Top IT MNC
Chennai, Bengaluru (Bangalore), Kochi (Cochin), Coimbatore, Hyderabad, Pune, Kolkata, Noida, Gurugram, Mumbai
5 - 13 yrs
₹8L - ₹20L / yr
Snowflake schema
Python
Snowflake
Greetings,

We are looking for a Snowflake developer for one of our premium clients, for their PAN-India locations.
Perfios
Agency job
via Seven N Half by Susmitha Goddindla
Bengaluru (Bangalore)
4 - 6 yrs
₹4L - ₹15L / yr
SQL
ETL tool
Python developer
MongoDB
Data Science
+15 more
Job Description
1. ROLE AND RESPONSIBILITIES
1.1. Implement next generation intelligent data platform solutions that help build high performance distributed systems.
1.2. Proactively diagnose problems and envisage long term life of the product focusing on reusable, extensible components.
1.3. Ensure agile delivery processes.
1.4. Work collaboratively with stakeholders, including product and engineering teams.
1.5. Build best-practices in the engineering team.
2. PRIMARY SKILLS REQUIRED
2.1. 2-6 years of core software product development experience.
2.2. Experience working on data-intensive projects with a variety of technology stacks, including different programming languages (Java, Python, Scala).
2.3. Experience building the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of data sources, to support other teams in running pipelines/jobs/reports, etc.
2.4. Experience with open-source stacks.
2.5. Experience working with RDBMS and NoSQL databases.
2.6. Knowledge of enterprise data lakes, data analytics, reporting, in-memory data handling, etc.
2.7. A core computer science academic background.
2.8. Aspiration to continue pursuing a career in the technical stream.
3. OPTIONAL SKILLS
3.1. Understanding of Big Data technologies and machine learning/deep learning.
3.2. Understanding of a diverse set of databases like MongoDB, Cassandra, Redshift, Postgres, etc.
3.3. Understanding of cloud platforms: AWS, Azure, GCP, etc.
3.4. Experience in the BFSI domain is a plus.
4. PREFERRED SKILLS
4.1. A startup mentality: comfort with ambiguity and a willingness to test, learn and improve rapidly.
Amagi Media Labs
3 recruiters
Posted by Rajesh C
Bengaluru (Bangalore)
2 - 4 yrs
₹10L - ₹14L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+3 more
1. 2 to 4 years of experience
2. Hands-on experience using Python, SQL, Tableau
3. Data Analyst
About Amagi (www.amagi.com): Amagi is a market leader in cloud-based media technology services for channel creation, distribution and ad monetization. Amagi's cloud technology and managed services are used by TV networks, content owners, sports rights owners and pay TV / OTT platforms to create 24x7 linear channels for OTT and broadcast and deliver them to end consumers. Amagi's pioneering and market-leading cloud platform has won numerous accolades and is deployed in over 40 countries by 400+ TV networks. Customers of Amagi include A+E Networks, Comcast, Google, NBC Universal, Roku, Samsung and Warner Media. This is a unique and transformative opportunity to participate in and grow a world-class technology company that changes the tenets of TV. Amagi is a private-equity-backed firm with investments from KKR (Emerald Media Fund), Premji Invest and Mayfield. Amagi has offices in New York, Los Angeles, London, New Delhi and Bangalore. LinkedIn page: https://www.linkedin.com/company/amagicorporation News: https://www.amagi.com/about/newsroom/amagi-clocks-120-yoy-quarterly-growth-as-channels-on-its-platform-grows-to-400/ Cofounder on YouTube: https://www.youtube.com/watch?v=EZ0nBT3ht0E
 

About Amagi & Growth


Amagi Corporation is a next-generation media technology company that provides cloud broadcast and targeted advertising solutions to broadcast TV and streaming TV platforms. Amagi enables content owners to launch, distribute and monetize live linear channels on Free-Ad-Supported TV and video services platforms. Amagi also offers 24x7 cloud managed services bringing simplicity, advanced automation, and transparency to the entire broadcast operations. Overall, Amagi supports 500+ channels on its platform for linear channel creation, distribution, and monetization with deployments in over 40 countries. Amagi has offices in New York (Corporate office), Los Angeles, and London, broadcast operations in New Delhi, and our Development & Innovation center in Bangalore. Amagi is also expanding in Singapore, Canada and other countries.

Amagi has seen phenomenal growth as a global organization over the last 3 years. Amagi has been a profitable firm for the last 2 years, and is now looking at investing in multiple new areas. Amagi has been backed by 4 investors - Emerald, Premji Invest, Nadathur and Mayfield. As of the fiscal year ending March 31, 2021, the company witnessed stellar growth in the areas of channel creation, distribution, and monetization, enabling customers to extend distribution and earn advertising dollars while saving up to 40% in cost of operations compared to traditional delivery models. Some key highlights of this include:

·   Annual revenue growth of 136%
·   44% increase in customers
·   50+ Free Ad Supported Streaming TV (FAST) platform partnerships and 100+ platform partnerships globally
·   250+ channels added to its cloud platform taking the overall tally to more than 500
·   Approximately 2 billion ad opportunities every month supporting OTT ad-insertion for 1000+ channels
·   60% increase in workforce in the US, UK, and India to support strong customer growth (current headcount being 360 full-time employees + Contractors)
·   5-10x growth in ad impressions among top customers
 
Over the last 4 years, Amagi has grown more than 400%. Amagi now has an aggressive growth plan over the next 3 years - to grow 10X in terms of Revenue. In terms of headcount, Amagi is looking to grow to more than 600 employees over the next 1 year. Amagi is building several key organizational processes to support the high growth journey and has gone digital in a big way.
 
Technology service company
Remote only
5 - 10 yrs
₹10L - ₹20L / yr
Relational Database (RDBMS)
NOSQL Databases
NOSQL
Performance tuning
SQL
+10 more

Preferred Education & Experience:

  • Bachelor's or master's degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics or a related technical field, or equivalent practical experience; at least 3 years of relevant experience in lieu of the above if from a different stream of education.

  • Well-versed, with 5+ years of demonstrable hands-on experience, in:
    ▪ Data Analysis & Data Modeling
    ▪ Database Design & Implementation
    ▪ Database Performance Tuning & Optimization
    ▪ PL/pgSQL & SQL

  • 5+ years of hands-on development experience in Relational Database (PostgreSQL/SQL Server/Oracle).

  • 5+ years of hands-on development experience in SQL, PL/PgSQL, including stored procedures, functions, triggers, and views.

  • Demonstrable hands-on working experience with database design principles, SQL query optimization techniques, index management, integrity checks, statistics and isolation levels.

  • Demonstrable hands-on working experience in database read & write performance tuning and optimization.

  • Knowledge of and experience working with Domain-Driven Design (DDD) concepts, Object-Oriented Programming (OOP) concepts, cloud architecture concepts and NoSQL database concepts are added values.

  • Knowledge of and working experience in the Oil & Gas, Financial and Automotive domains is a plus.

  • Hands-on development experience in one or more NoSQL data stores such as Cassandra, HBase, MongoDB, DynamoDB, Elasticsearch, Neo4j, etc. is a plus.

Falcon Autotech
1 recruiter
Posted by Rohit Kaushik
Noida
3 - 7 yrs
₹4L - ₹7L / yr
Data Analytics
Data Analyst
Tableau
MySQL
SQL
  • Strong analytical skills, with the ability to collect, organize, analyze and disseminate significant amounts of information with attention to detail and accuracy
  • Expertise in SQL/PL-SQL: ability to write procedures and create queries for reporting purposes
  • Must have worked on a reporting tool such as Power BI/Tableau
  • Strong knowledge of Excel/Google Sheets: must have worked with pivot tables, aggregate functions and logical IF conditions
  • Strong verbal and written communication skills for coordination with departments
  • An analytical mind and an inclination for problem-solving
NCR (Delhi | Gurgaon | Noida)
2 - 12 yrs
₹25L - ₹40L / yr
Data governance
DevOps
Data integration
Data engineering
Python
+14 more
Data Platforms (Data Integration) is responsible for envisioning, building and operating the Bank's data integration platforms. The successful candidate will work out of Gurgaon as part of a high-performing team distributed across our two development centers, Copenhagen and Gurugram. The individual must be driven, passionate about technology, and display a level of customer service that is second to none.

Roles & Responsibilities

  • Designing and delivering a best-in-class, highly scalable data governance platform
  • Improving processes and applying best practices
  • Contributing to all scrum ceremonies, assuming the role of 'scrum master' on a rotational basis
  • Developing, managing and operating our infrastructure to ensure it is easy to deploy, scalable, secure and fault-tolerant
  • Being flexible on working hours as per business needs
Bengaluru (Bangalore)
5 - 7 yrs
₹14.5L - ₹16.5L / yr
Data Science
Data Scientist
Data Analytics
Machine Learning (ML)
Python
+2 more
  • Actively engage with internal business teams to understand their challenges and deliver robust, data-driven solutions.
  • Work alongside global counterparts to solve data-intensive problems using standard analytical frameworks and tools.
  • Be encouraged and expected to innovate and be creative in your data analysis, problem-solving, and presentation of solutions.
  • Network and collaborate with a broad range of internal business units to define and deliver joint solutions.
  • Work alongside customers to leverage cutting-edge technology (machine learning, streaming analytics, and ‘real’ big data) to creatively solve problems and disrupt existing business models.

In this role, we are looking for:

  • A problem-solving mindset with the ability to understand business challenges and how to apply your analytics expertise to solve them.
  • The unique person who can present complex mathematical solutions in a simple manner that most will understand, including customers.
  • An individual excited by innovation and new technology and eager to find ways to employ these innovations in practice.
  • A team mentality, empowered by the ability to work with a diverse set of individuals.

Basic Qualifications

  • A Bachelor's degree in Data Science, Math, Statistics, Computer Science or a related field with an emphasis on analytics.
  • 5+ years of professional experience in a data scientist/analyst role or similar.
  • Proficiency in your statistics/analytics/visualization tools of choice, preferably the Microsoft Azure suite, including Azure ML Studio and Power BI, as well as R, Python and SQL.

Preferred Qualifications

  • Excellent communication, organizational transformation, and leadership skills
  • Demonstrated excellence in Data Science, Business Analytics and Engineering
TechChefs Software
2 recruiters
Posted by Shilpa Yadav
Remote, anywhere from India
5 - 10 yrs
₹1L - ₹15L / yr
ETL
Informatica
Python
SQL

Responsibilities

  • Installing and configuring Informatica components, including high availability; managing server activations and deactivations for all environments; ensuring that all systems and procedures adhere to organizational best practices
  • Day-to-day administration of the Informatica suite of services (PowerCenter, IDS, Metadata, Glossary and Analyst)
  • Informatica capacity planning and ongoing monitoring (e.g. CPU, memory) to proactively increase capacity as needed
  • Managing backup and security of the Data Integration infrastructure
  • Designing, developing and maintaining all data warehouse, data mart and ETL functions for the organization as part of an infrastructure team
  • Consulting with users, management, vendors and technicians to assess computing needs and system requirements
  • Developing and interpreting organizational goals, policies and procedures
  • Evaluating the organization's technology use and needs and recommending improvements, such as software upgrades
  • Preparing and reviewing operational reports or project progress reports
  • Assisting in the daily operations of the Architecture Team: analyzing workflow, establishing priorities, developing standards and setting deadlines
  • Working with vendors to manage support SLAs and influence vendor product roadmaps
  • Providing leadership and guidance in technical meetings, defining standards and providing status updates
  • Working with cross-functional operations teams such as systems, storage and network to design technology stacks

 

Preferred Qualifications

  • Minimum 6+ years' experience in an Informatica engineer/developer role
  • Minimum 5+ years' experience as a developer in an ETL environment
  • Minimum 5+ years' experience in SQL coding and understanding of databases
  • Proficiency in Python
  • Proficiency in command line troubleshooting
  • Proficiency in writing code in Perl/Shell scripting languages
  • Understanding of Java and concepts of Object-oriented programming
  • Good understanding of systems, networking, and storage
  • Strong knowledge of scalability and high availability