Data Analyst

at App-based lending platform (AF1)

Agency job
2 - 4 yrs
₹6.5L - ₹8.5L / yr
Bengaluru (Bangalore)
Skills
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
Data Analytics
  • Bachelor's degree in Computer Science, Engineering, Operations Research, Math, or related discipline.
  • Minimum of 2 years of experience in an Analyst role preferred.
  • Highly proficient in Microsoft Office and Windows-based applications.
  • Hands-on knowledge of SQL and Excel is a must.
  • Demonstrated analytical ability; experience in a results-oriented environment with external customer interaction.
  • Excellent written and verbal communication and presentation skills, with the ability to express thoughts logically and succinctly.
  • Entrepreneurial drive and demonstrated ability to achieve stretch goals in an innovative and fast-paced environment.

 

Preferred Qualifications

  • Experience with e-commerce, retail, and business analytics would be an advantage.
  • Understanding of data warehousing and data modeling concepts, and of building new DW tables.
  • Advanced SQL skills, fluency in R and/or Python, and advanced Microsoft Office skills, particularly Excel, along with experience with analytical platforms (see the worked sketch below).
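As a hedged illustration of the SQL and Excel-style analysis this role calls for (referenced in the last bullet above), the sketch below aggregates a hypothetical e-commerce orders table with SQLite and pivots the result with pandas. The table, column names, and figures are invented for the example.

```python
# Illustrative sketch only: the orders table, columns, and figures are hypothetical.
import sqlite3

import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, city TEXT, category TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'Bengaluru', 'Electronics', 1200.0),
        (2, 'Bengaluru', 'Apparel', 450.0),
        (3, 'Mumbai', 'Electronics', 980.0),
        (4, 'Mumbai', 'Apparel', 300.0);
""")

# SQL aggregation: order count and revenue per city and category.
query = """
    SELECT city, category, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM orders
    GROUP BY city, category
"""
df = pd.read_sql_query(query, conn)

# Excel-style pivot: cities as rows, categories as columns, revenue as values.
pivot = df.pivot_table(index="city", columns="category", values="revenue", aggfunc="sum")
print(pivot)
```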

 

Similar jobs

TIFIN FINTECH
Posted by Vrishali Mishra
Mumbai
2 - 5 yrs
Best in industry
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+2 more

Quant Research, TIFIN

Mumbai, India


WHO WE ARE:

TIFIN is a fintech platform backed by industry leaders including JP Morgan, Morningstar, Broadridge, Hamilton Lane, Franklin Templeton, Motive Partners and a who's who of the financial services industry. We are creating engaging wealth experiences to better financial lives through AI- and investment-intelligence-powered personalization. We are working to change the world of wealth in the way personalization has changed the world of movies, music and more, but with the added responsibility of delivering better wealth outcomes.

We use design and behavioral thinking to enable engaging experiences through software and application programming interfaces (APIs). We use investment science and intelligence to build algorithmic engines inside the software and APIs to enable better investor outcomes.

In a world where every individual is unique, we match them to financial advice and investments with a recognition of their distinct needs and goals across our investment marketplace and our advice and planning divisions.


OUR VALUES: Go with your GUT

●     Grow at the Edge. We are driven by personal growth. We get out of our comfort zone and keep egos aside to find our genius zones. With self-awareness and integrity we strive to be the best we can possibly be. No excuses.

●     Understanding through Listening and Speaking the Truth. We value transparency. We communicate with radical candor, authenticity and precision to create a shared understanding. We challenge, but once a decision is made, commit fully.

●     I Win for Teamwin. We believe in staying within our genius zones to succeed and we take full ownership of our work. We inspire each other with our energy and attitude. We fly in formation to win together.

 

 

WHAT YOU'LL BE DOING:

We are looking for an experienced quantitative professional to develop, implement, test, and maintain the core algorithms and R&D framework for our investment and investment-advisory platform. The ideal candidate for this role has successfully implemented and maintained quantitative and statistical modules using modular software design constructs. The candidate needs to be a responsible product owner, a problem solver, and a team player looking to make a significant impact on a fast-growing company. The successful candidate will report directly to the Head of Quant Research & Development.

 

 

RESPONSIBILITIES:

  • End-to-end research, development, and maintenance of the investment platform, its data, and its algorithms
  • Take part in building out the R&D backtesting and simulation engines
  • Thoroughly vet investment algorithm results
  • Contribute to the research data platform design
  • Investigate datasets for use in new or existing algorithms
  • Participate in agile development practices
  • Liaise with stakeholders to gather and understand the functional requirements
  • Take part in code reviews, ensuring quality meets the highest standards
  • Develop software using high quality standards and best practices, conduct thorough end-to-end unit testing, and provide support during testing and post go-live
  • Support research innovation through creative and aggressive experimentation with cutting-edge hardware, software, processes, procedures, and methods
  • Collaborate with technology teams to ensure appropriate requirements, standards, and integration

 

REQUIREMENTS:

  • Experience in a quant research & development role
  • Proficient in Python, Git and Jira
  • Knowledge in SQL and database development (PostgreSQL is a plus)
  • Understanding of R and RMarkdown is a plus
  • Bachelor’s degree in computer science, computational mathematics, or financial engineering 
  • Master’s degree or advanced training is a strong plus
  • Excellent mathematical foundation and hands-on experience working in the finance industry
  • Proficient in quantitative, statistical, and ML/AI techniques and their implementation using Python modules such as Pandas, NumPy, SciPy, scikit-learn, etc. (a minimal illustrative sketch follows this list)
  • Strong communication (written and oral) and analytical problem-solving skills
  • Strong sense of attention to detail, pride in delivering high quality work and willingness to learn
  • An understanding of or exposure to financial capital markets, various financial instruments (such as stocks, ETFs, Mutual Funds, etc.), and financial tools (such as Bloomberg, Reuters, etc.)
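As a rough, illustrative sketch of the kind of backtesting and statistical work described above, the snippet below runs a vectorized moving-average crossover backtest with pandas and NumPy on a synthetic price series. The strategy, parameters, and data are assumptions made for illustration only, not details of TIFIN's platform.

```python
# Minimal sketch: vectorized backtest of a moving-average crossover strategy.
# The price series, windows, and thresholds are illustrative, not taken from this listing.
import numpy as np
import pandas as pd

def backtest_ma_crossover(prices: pd.Series, fast: int = 20, slow: int = 50) -> pd.DataFrame:
    """Return daily positions and strategy returns for a simple long/flat MA crossover."""
    fast_ma = prices.rolling(fast).mean()
    slow_ma = prices.rolling(slow).mean()
    # Hold the asset when the fast MA is above the slow MA; lag the signal one day
    # to avoid look-ahead bias.
    position = (fast_ma > slow_ma).astype(int).shift(1).fillna(0)
    daily_returns = prices.pct_change().fillna(0)
    strategy_returns = position * daily_returns
    return pd.DataFrame({"position": position, "strategy_return": strategy_returns})

if __name__ == "__main__":
    # Synthetic price series for demonstration only.
    rng = np.random.default_rng(42)
    prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0.0003, 0.01, 750))))
    result = backtest_ma_crossover(prices)
    sharpe = np.sqrt(252) * result["strategy_return"].mean() / result["strategy_return"].std()
    print(f"Annualized Sharpe (toy example): {sharpe:.2f}")
```

Lagging the signal by one day is a standard guard against look-ahead bias in daily-frequency backtests.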


 

BENEFITS PACKAGE:

TIFIN offers a competitive benefits package that includes:

· Performance-linked variable compensation

· Medical insurance

· Tax-saving benefits

· Flexible PTO policy and company-paid holidays

· Parental leave: 6 months' paid maternity leave, 2 weeks' paid paternity leave

· Access to our Wellness trainers, including 1:1 personal coaching for executives and rising stars

 

A note on location: while we have team centres in Boulder, New York City, San Francisco, Charlotte, and Bangalore, this role is based out of Mumbai.

 

TIFIN is proud to be an equal opportunity workplace and values the multitude of talents and perspectives that a diverse workforce brings. All qualified applicants will receive consideration for employment without regard to race, national origin, religion, age, color, sex, sexual orientation, gender identity, disability, or protected veteran status.

Metadata Technologies, North America
Agency job
via RS Consultants by Biswadeep RS
Remote only
4 - 8 yrs
₹15L - ₹45L / yr
Java
Go Programming (Golang)
Data engineering
Network
Multithreading
+12 more

 

We are looking for an exceptional Software Developer for our Data Engineering India team who can contribute to building a world-class big data engineering stack that will be used to fuel our Analytics and Machine Learning products. This person will be contributing to the architecture, operation, and enhancement of our petabyte-scale data platform, with a key focus on finding solutions that can support the Analytics and Machine Learning product roadmap. Every day, terabytes of ingested data need to be processed and made available for querying and insights extraction for various use cases.

About the Organisation:

 

- It provides a dynamic, fun workplace filled with passionate individuals. We are at the cutting edge of advertising technology and there is never a dull moment at work.

 

- We have a truly global footprint, with our headquarters in Singapore and offices in Australia, United States, Germany, United Kingdom, and India.

 

- You will gain work experience in a global environment. We speak over 20 different languages, come from more than 16 different nationalities, and over 42% of our staff are multilingual.


Job Description

Position: Software Developer, Data Engineering team
Location: Pune (initially 100% remote due to COVID-19 for the coming year)

 

  • Our bespoke Machine Learning pipelines. This will also provide opportunities to contribute to the prototyping, building, and deployment of Machine Learning models.

You:

  • Have at least 4 years of experience.
  • Deep technical understanding of Java or Golang.
  • Production experience with Python is a big plus and an extremely valuable supporting skill for us.
  • Exposure to modern Big Data tech: Cassandra/Scylla, Kafka, Ceph, the Hadoop Stack, Spark, Flume, Hive, Druid etc., while at the same time understanding that certain problems may require completely novel solutions (see the illustrative sketch after this description).
  • Exposure to one or more modern ML tech stacks: Spark ML-Lib, TensorFlow, Keras, GCP ML Stack, AWS SageMaker - is a plus.
  • Experience working in an Agile/Lean model.
  • Experience with supporting and troubleshooting large systems.
  • Exposure to configuration management tools such as Ansible or Salt.
  • Exposure to IaaS platforms such as AWS, GCP, Azure…
  • Good addition: experience working with large-scale data.
  • Good addition: experience architecting, developing, and operating data warehouses, big data analytics platforms, and high-velocity data pipelines.

Note: We are not looking for a Big Data Developer / Hadoop Developer.
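To make the streaming and high-velocity pipeline exposure above concrete (flagged in the Big Data bullet), here is a minimal, hedged sketch of a Spark Structured Streaming job that reads JSON events from Kafka and writes hourly aggregates. The topic, schema, broker address, and output paths are assumptions for illustration and not details from this listing; the Kafka source also requires the spark-sql-kafka connector package on the classpath.

```python
# Minimal sketch of a streaming ingestion job; topic, schema, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("event-ingest-sketch").getOrCreate()

# Assumed event schema, for illustration only.
event_schema = StructType([
    StructField("user_id", StringType()),
    StructField("event_type", StringType()),
    StructField("value", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
       .option("subscribe", "events")                      # placeholder topic
       .load())

events = (raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
             .select("e.*"))

# Hourly counts per event type, tolerating 10 minutes of late data.
hourly = (events
          .withWatermark("event_time", "10 minutes")
          .groupBy(F.window("event_time", "1 hour"), "event_type")
          .count())

query = (hourly.writeStream
         .outputMode("append")
         .format("parquet")
         .option("path", "/tmp/hourly_counts")             # placeholder output path
         .option("checkpointLocation", "/tmp/checkpoints") # placeholder checkpoint dir
         .start())
query.awaitTermination()
```

The watermark bounds how long the job waits for late events before finalizing each hourly window.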

OSBIndia Private Limited
Bengaluru (Bangalore)
6 - 12 yrs
₹12L - ₹18L / yr
ETL
Informatica
Data Warehouse (DWH)
SSIS
SQL Server Integration Services (SSIS)
+1 more

1.      Core Responsibilities

·        Build, maintain and manage a team capable of delivering the data operations needs of the bank's data teams and other stakeholders, ensuring the team is right-sized, motivated and focused on key goals and SLAs.

·        Maintain, develop and enhance the tools and environments used by the data teams to ensure availability of Development, Test and production environments

·        Manage all data operations database changes observing industry standard software development life cycle approaches with development, testing and deployment supported by comprehensive documentation

·        Manage releases of code from the engineering team to ensure separation of duty

·        As a subject matter expert, take a key contributing role in data initiatives such as infrastructure development, software tool evaluation and intra-company data integration

·        Assess system performance and process efficiency, making recommendations for change

·        Identify gaps and technical issues affecting data flows and lead the development of solutions to these

·        Work with internal and external stakeholders to ensure that data solutions are reliably populated on time

·        Ensure that you fully understand and comply with the organisation’s Risk Management Policies as they relate to your area of responsibility and demonstrate in your day to day work that you put customers at the heart of everything you do.

·        Ensure that you fully understand and comply with the organisation’s Data Governance Policies as they relate to your area of responsibility and demonstrate in your day to day work that you treat data as an important corporate asset which must be protected and managed.

·        Maintain the company’s compliance standards and ensure timely completion of all mandatory on-line training modules and attestations.

 

2.      Experience Requirements

·        5 years' previous experience supporting data warehousing and data lake solutions is essential

·        5 years' previous experience with Microsoft SQL Server SSIS is desirable

·        Experience with monitoring and incident management is essential

·        Experience of managing a technical team is desirable

·        Experience working in an IT environment

·        Experience working within banking or other financial institutions is desirable

 

3.      Knowledge Requirements

·        Strong background of working in data teams

·        Robust knowledge of RDBMS principles and how best to manage environments

·        Strong knowledge of a standardised SDLC is desirable

Deltacubes
Posted by Bavithra Kanniyappan
Remote only
5 - 12 yrs
₹10L - ₹15L / yr
Python
Amazon Web Services (AWS)
PySpark
Scala
Spark
+3 more

Hiring - Python Developer Freelance Consultant (WFH-Remote)

Greetings from Deltacubes Technology!!

 

Skillset Required:

  • Python
  • PySpark
  • AWS
  • Scala

Experience: 5+ years

 

Thanks

Bavithra

 

Remote only
0 - 2 yrs
₹2.4L - ₹10L / yr
Quantitative analyst
Algorithmic trading
R Programming
Matlab
SPSS
+3 more

Work Experience: 0-2 years

Responsibilities:

- Design and implement mathematical models for fundamental valuation of securities (a toy valuation sketch follows the skills list below). The person will need to understand the latest research in quantitative finance and implement it.

- Design, back-testing and implementation of high-frequency trading strategies on international exchanges. Work as part of the market-making team to determine the signals and trading strategies to go live with.

- Conduct performance attribution of live portfolios.

Required Skills:

- Strong candidates should have 0-2 years of work experience and a successful track record in quantitative analysis, preferably in the capital markets domain.

- Post-Graduate degree in statistics, finance, mathematics, engineering (Computer Science preferred) or other quantitative or computational disciplines

- Experience in using some or all of the following packages: R, MATLAB, SPSS, CART, C# .Net, Python

- Good written and oral communication skills.

- Strong experience working both independently and in a team-oriented collaborative environment.

- Entrepreneurial, self-motivated individual - high energy, high activity levels - passion for working with an innovative, small but rapidly growing company.
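As a toy-scale illustration of the fundamental-valuation modelling mentioned in the responsibilities (see the first bullet), the sketch below discounts a short stream of projected cash flows plus a Gordon-growth terminal value. All figures and rates are invented for the example.

```python
# Toy discounted-cash-flow valuation; every number here is hypothetical.
def dcf_value(cash_flows, discount_rate, terminal_growth):
    """Present value of projected cash flows plus a Gordon-growth terminal value."""
    pv_cash_flows = sum(
        cf / (1 + discount_rate) ** year
        for year, cf in enumerate(cash_flows, start=1)
    )
    # Terminal value at the end of the projection horizon, then discounted back.
    terminal_value = cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
    pv_terminal = terminal_value / (1 + discount_rate) ** len(cash_flows)
    return pv_cash_flows + pv_terminal

if __name__ == "__main__":
    projected = [110.0, 121.0, 133.0, 146.0, 161.0]  # five years of projected free cash flow
    value = dcf_value(projected, discount_rate=0.10, terminal_growth=0.03)
    print(f"Estimated enterprise value (toy inputs): {value:,.1f}")
```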

Banyan Data Services
Posted by Sathish Kumar
Bengaluru (Bangalore)
3 - 15 yrs
₹6L - ₹20L / yr
Data Science
Data Scientist
MongoDB
Java
Big Data
+14 more

Senior Big Data Engineer 

Note: Notice period is 45 days.

Banyan Data Services (BDS) is a US-based data-focused Company that specializes in comprehensive data solutions and services, headquartered in San Jose, California, USA. 

 

We are looking for a Senior Hadoop Bigdata Engineer who has expertise in solving complex data problems across a big data platform. You will be a part of our development team based out of Bangalore. This team focuses on the most innovative and emerging data infrastructure software and services to support highly scalable and available infrastructure. 

 

It's a once-in-a-lifetime opportunity to join our rocket ship startup run by a world-class executive team. We are looking for candidates that aspire to be a part of the cutting-edge solutions and services we offer that address next-gen data evolution challenges. 

 

 

Key Qualifications

 

·   5+ years of experience working with Java and Spring technologies

· At least 3 years of programming experience working with Spark on big data; including experience with data profiling and building transformations

· Knowledge of microservices architecture is a plus

· Experience with any NoSQL databases such as HBase, MongoDB, or Cassandra

· Experience with Kafka or any streaming tools

· Knowledge of Scala would be preferable

· Experience with agile application development 

· Exposure to any cloud technologies, including containers and Kubernetes

· Demonstrated experience performing DevOps for platforms

· Strong skills in data structures and algorithms, with a focus on writing efficient, low-complexity code

· Exposure to Graph databases

· Passion for learning new technologies and the ability to do so quickly 

· A Bachelor's degree in a computer-related field or equivalent professional experience is required

 

Key Responsibilities

 

· Scope and deliver solutions with the ability to design solutions independently based on high-level architecture

· Design and develop the big data-focused micro-Services

· Be involved in big data infrastructure, distributed systems, data modeling, and query processing

· Build software with cutting-edge technologies on the cloud

· Willingness to learn new technologies and take on research-oriented projects

· Proven interpersonal skills, contributing to team effort by accomplishing related results as needed

netmedscom
Posted by Vijay Hemnath
Chennai
2 - 5 yrs
₹6L - ₹25L / yr
Big Data
Hadoop
Apache Hive
Scala
Spark
+12 more

We are looking for an outstanding Big Data Engineer with experience setting up and maintaining data warehouses and data lakes for an organization. This role will collaborate closely with the Data Science team and assist the team in building and deploying machine learning and deep learning models on big data analytics platforms.

Roles and Responsibilities:

  • Develop and maintain scalable data pipelines and build out new integrations and processes required for optimal extraction, transformation, and loading of data from a wide variety of data sources using 'Big Data' technologies.
  • Develop programs in Scala and Python as part of data cleaning and processing.
  • Assemble large, complex data sets that meet functional and non-functional business requirements, fostering data-driven decision-making across the organization.
  • Design and develop distributed, high-volume, high-velocity multi-threaded event processing systems.
  • Implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for key stakeholders and the business processes that depend on it.
  • Perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Provide high operational excellence, guaranteeing high availability and platform stability.
  • Closely collaborate with the Data Science team and assist the team in building and deploying machine learning and deep learning models on big data analytics platforms.

Skills:

  • Experience with Big Data pipeline, Big Data analytics, Data warehousing.
  • Experience with SQL/No-SQL, schema design and dimensional data modeling.
  • Strong understanding of Hadoop architecture and the HDFS ecosystem, and experience with the Big Data technology stack, such as HBase, Hadoop, Hive, MapReduce.
  • Experience in designing systems that process structured as well as unstructured data at large scale.
  • Experience in AWS/Spark/Java/Scala/Python development.
  • Strong skills in PySpark (Python and Spark): the ability to create, manage and manipulate Spark DataFrames, plus expertise in Spark query tuning and performance optimization (see the sketch after this list).
  • Experience in developing efficient software code/frameworks for multiple use cases leveraging Python and big data technologies.
  • Prior exposure to streaming data sources such as Kafka.
  • Knowledge of shell scripting and Python scripting.
  • High proficiency in database skills (e.g., Complex SQL), for data preparation, cleaning, and data wrangling/munging, with the ability to write advanced queries and create stored procedures.
  • Experience with NoSQL databases such as Cassandra / MongoDB.
  • Solid experience in all phases of Software Development Lifecycle - plan, design, develop, test, release, maintain and support, decommission.
  • Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test Driven Development).
  • Experience building and deploying applications on on-premise and cloud-based infrastructure.
  • Having a good understanding of machine learning landscape and concepts. 
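To ground the PySpark point above (flagged in that bullet), here is a small, hedged sketch showing DataFrame creation and two common tuning moves, a broadcast join and an explicit repartition before aggregation. The data and column names are made up for the example, not taken from the job description.

```python
# Illustrative PySpark DataFrame sketch; the data and tuning choices are examples only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dataframe-tuning-sketch").getOrCreate()

orders = spark.createDataFrame(
    [(1, "paracetamol", 2, 30.0), (2, "vitamin_c", 1, 150.0), (3, "paracetamol", 5, 75.0)],
    ["order_id", "sku", "qty", "amount"],
)
products = spark.createDataFrame(
    [("paracetamol", "pharma"), ("vitamin_c", "wellness")],
    ["sku", "category"],
)

# Broadcast the small dimension table to avoid shuffling the larger fact table.
enriched = orders.join(F.broadcast(products), on="sku", how="left")

# Aggregate revenue per category; repartitioning by the grouping key can help
# downstream writes when the real data is large.
revenue = (enriched
           .repartition("category")
           .groupBy("category")
           .agg(F.sum("amount").alias("revenue"), F.sum("qty").alias("units")))

revenue.explain()  # inspect the physical plan when tuning
revenue.show()
```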

 

Qualifications and Experience:

Engineering and postgraduate candidates, preferably in Computer Science from premier institutions, with 3-5 years of proven work experience as a Big Data Engineer or in a similar role.

Certifications:

Good to have at least one of the Certifications listed here:

    AZ 900 - Azure Fundamentals

    DP 200, DP 201, DP 203, AZ 204 - Data Engineering

    AZ 400 - Devops Certification

Dataweave Pvt Ltd
Posted by Megha M
Bengaluru (Bangalore)
0 - 1 yrs
Best in industry
Data engineering
Internship
Python
Looking for candidates with good coding, scraping, and problem-solving skills.
www.couponcrown.com
Posted by Raghu Ram
Hyderabad
0 - 2 yrs
₹0L - ₹2L / yr
Data Science
Python
R Programming
We are looking for a data scientist with some coding skills.
INSTAFUND INTERNET PRIVATE LIMITED
Posted by Pruthiraj Rath
Chennai
1 - 3 yrs
₹3L - ₹6L / yr
React.js
Javascript
Python
LAMP Stack
MongoDB
+2 more
At Daddyswallet, we're using today's technology to bring significant disruptive innovation to the financial industry. We focus on improving the lives of consumers by delivering simple, honest and transparent financial products. We are looking for a full-stack developer with skills mainly in React Native, React.js, Python, and Node.js.