Risk Analysis Jobs in Mumbai
Our client is an innovative fintech company that is revolutionizing the business of short-term finance. The company is an online lending startup driven by an app-enabled technology platform that solves the funding challenges of SMEs by offering quick-turnaround, paperless business loans without collateral. It counts over 2 million small businesses across 18 cities and towns as its customers. Its founders are IIT and ISB alumni with deep experience in the fintech industry, having earlier worked with organizations like Axis Bank, Aditya Birla Group, Fractal Analytics, and Housing.com. It has raised funds of Rs. 100 crore from finance industry stalwarts and is growing by leaps and bounds.
What you will do:
- Reviewing the portfolio monitoring/early warning signals mechanism on an ongoing basis
- Monitoring internal and external data points that may affect the risk level of a decision
- Aggregating data from multiple sources to provide a comprehensive assessment (a minimal sketch follows this list)
- Proposing solutions to reduce risk
- Bringing fresh ideas to the table and keeping a keen eye on trends in the analytics and financial services industries
- Creating reports, summaries, presentations, and process documents to present results
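To illustrate the data-aggregation bullet above, here is a minimal sketch in pandas of merging loan and repayment data into an early-warning watchlist; the file names, columns, and the 30-day threshold are hypothetical illustrations, not the client's actual mechanism.

```python
# A minimal sketch of aggregating repayment data into an early-warning view.
# File names, columns, and the 30-day overdue threshold are hypothetical.
import pandas as pd

loans = pd.read_csv("loans.csv")        # loan_id, customer_id, emi_due_date
payments = pd.read_csv("payments.csv")  # loan_id, paid_date

merged = loans.merge(payments, on="loan_id", how="left")

# Unpaid EMIs (missing paid_date) are treated as outstanding as of today.
merged["days_overdue"] = (
    pd.to_datetime(merged["paid_date"]).fillna(pd.Timestamp.today())
    - pd.to_datetime(merged["emi_due_date"])
).dt.days

# Flag accounts breaching the (illustrative) 30-day early-warning threshold.
watchlist = merged[merged["days_overdue"] > 30]
print(watchlist[["loan_id", "customer_id", "days_overdue"]])
```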
Desired Candidate Profile
What you need to have:
- MBA/BE/Master's in Statistics/Mathematics, with 1-5 years of work experience in a similar company or related field
- Work experience in analytics consulting for financial services, in the in-house analytics units of Indian banks/NBFCs, or at fintech/analytics start-ups would be a plus
B1 – Data Scientist - Kofax Accredited Developers
Requirement – 3
Mandatory –
- Accreditation of Kofax KTA / KTM
- Experience in Kofax Total Agility Development – 2-3 years minimum
- Ability to develop and translate functional requirements to design
- Experience in requirement gathering, analysis, development, testing, documentation, version control, SDLC, Implementation and process orchestration
- Experience in Kofax Customization, writing Custom Workflow Agents, Custom Modules, Release Scripts
- Application development using Kofax and KTM modules
- Good/advanced understanding of Machine Learning/NLP/Statistics
- Exposure to or understanding of RPA/OCR/cognitive capture tools such as Appian, UiPath, Automation Anywhere, etc.
- Excellent communication skills and collaborative attitude
- Ability to work with multiple internal teams and stakeholders, such as the Analytics, RPA, Technology, and Project Management teams
- Good understanding of compliance, data governance and risk control processes
Total Experience – 7-10 years in the BPO/KPO/ITES/BFSI/Retail/Travel/Utilities/Service industry
Good to have
- Previous experience of working on Agile & Hybrid delivery environment
- Knowledge of VB.NET, C# (C-Sharp), SQL Server, web services
Qualification -
- Master's in Statistics/Mathematics/Economics/Econometrics, or BE/B.Tech, MCA, or MBA
Experience – Min 10 years
Location – Mumbai
Salary – Negotiable
Power BI, Tableau, QlikView
Solution Architect/Technology Lead – Data Analytics
Role
Looking for a Business Intelligence lead (BI Lead) with hands-on experience in BI tools (Tableau, SAP Business Objects, financial and accounting modules, Power BI), SAP integration, and database knowledge including one or more of Azure Synapse/Data Factory, SQL Server, Oracle, and cloud-based databases such as Snowflake. Good knowledge of AI/ML and Python is also expected.
- You will be expected to work closely with our business users. Development will be performed using an Agile methodology based on Scrum (time-boxing, daily scrum meetings, retrospectives, etc.) and XP (continuous integration, refactoring, unit testing, etc.) best practices. Candidates must therefore be able to work collaboratively, demonstrate good ownership and leadership, and work well in teams.
Responsibilities:
- Design, develop, and support multiple/hybrid data sources and data visualization frameworks using Power BI, Tableau, SAP Business Objects, etc., along with ETL tools and Python scripting
- Implementing DevOps techniques and practices like Continuous Integration, Continuous Deployment, Test Automation, Build Automation, and Test-Driven Development to enable the rapid delivery of working code, utilizing tools like Git (a minimal sketch of a BI extract step follows this list)
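To illustrate the data-sourcing side of these responsibilities, here is a minimal sketch of pulling a summarized extract from a SQL Server source into pandas for the BI layer; the connection string, table, and column names are hypothetical and not taken from the posting.

```python
# A minimal sketch of a BI data-extract step, assuming a SQL Server source
# reachable via ODBC. Server, database, table, and columns are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection string; adjust driver/host/credentials as needed.
engine = create_engine(
    "mssql+pyodbc://user:password@bi-sql-host/SalesDW"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

# Pull a summarized fact table that a Tableau/Power BI dashboard could consume.
query = """
    SELECT region, SUM(net_amount) AS revenue
    FROM fact_sales
    GROUP BY region
"""
df = pd.read_sql(query, engine)
df.to_csv("revenue_by_region.csv", index=False)  # extract for the BI layer
```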
Requirements
- 10+ years working as a hands-on developer in Information Technology across Database, ETL and BI (SAP Business Objects, integration with SAP Financial and Accounting modules, Tableau, Power BI) & prior team management experience
- Tableau/PowerBI integration with SAP and knowledge of SAP modules related to finance is a must
- 3+ years of hands-on development experience in Data Warehousing and Data Processing
- 3+ years of Database development experience with a solid understanding of core database concepts and relational database design, SQL, Performance tuning
- 3+ years of hands-on development experience with Tableau
- 3+ years of Power BI experience, including parameterized reports and publishing them to the Power BI Service
- Excellent understanding and practical experience delivering under an Agile methodology
- Ability to work with business users to provide technical support
- Ability to get involved in all stages of the project lifecycle, including analysis, design, development, and testing
Good to have Skills
- Experience with other Visualization tools and reporting tools like SAP Business Objects.
Who Are We
Orbo is a research-oriented company with expertise in computer vision and artificial intelligence. At its core, Orbo is a comprehensive platform built on an AI-based visual-enhancement stack, so companies can find a product suited to their needs, where deep-learning-powered technology automatically improves their imagery.
ORBO's solutions are helping the BFSI, beauty and personal care, and e-commerce image-retouching industries with digital transformation in multiple ways.
WHY US
- Join top AI company
- Grow with your best companions
- Continuous pursuit of excellence, equality, respect
- Competitive compensation and benefits
You'll be a part of the core team and will be working directly with the founders in building and iterating upon the core products that make cameras intelligent and images more informative.
Description:
We are looking for a computer vision engineer to lead our team in developing a factory floor analytics SaaS product. This will be a fast-paced role, and the person will get an opportunity to develop an industrial-grade solution from concept to deployment.
Responsibilities:
- Research and develop computer vision solutions for industries (BFSI, Beauty and personal care, E-commerce, Defence etc.)
- Lead a team of ML engineers in developing an industrial AI product from scratch
- Set up an end-to-end deep learning pipeline for data ingestion, preparation, model training, validation, and deployment
- Tune the models to achieve high accuracy rates and minimum latency
- Deploy optimized computer vision models on edge devices to meet customer requirements (a minimal inference sketch follows this list)
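As an illustration of the deployment bullet above, here is a minimal sketch of running an optimized vision model with ONNX Runtime (one of the engines named in the requirements below); the model file, input name, and input shape are hypothetical.

```python
# A minimal sketch of edge inference with ONNX Runtime.
# The model path and the 1x3x224x224 input are hypothetical placeholders.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("detector.onnx", providers=["CPUExecutionProvider"])

# Dummy image tensor; a real pipeline would feed preprocessed camera frames.
frame = np.random.rand(1, 3, 224, 224).astype(np.float32)

input_name = session.get_inputs()[0].name
outputs = session.run(None, {input_name: frame})
print([o.shape for o in outputs])  # inspect output tensor shapes
```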
Requirements:
- Bachelor’s degree
- Deep and broad understanding of computer vision and deep learning algorithms.
- 4+ years of industrial experience in computer vision and/or deep learning
- Experience in taking an AI product from scratch to commercial deployment.
- Experience in Image enhancement, object detection, image segmentation, image classification algorithms
- Experience in deployment with OpenVINO, ONNX Runtime, and TensorRT
- Experience in deploying computer vision solutions on edge devices such as Intel Movidius and Nvidia Jetson
- Experience with machine/deep learning frameworks such as TensorFlow and PyTorch.
- Proficient understanding of code versioning tools, such as Git
Our perfect candidate is someone who:
- is proactive and an independent problem solver
- is a constant learner. We are a fast growing start-up. We want you to grow with us!
- is a team player and good communicator
What We Offer:
- You will have fun working with a fast-paced team on a product that can impact the business model of E-commerce and BFSI industries. As the team is small, you will easily be able to see a direct impact of what you build on our customers (Trust us - it is extremely fulfilling!)
- You will be in charge of what you build and be an integral part of the product development process
- Technical and financial growth!
Job Role : Associate Manager (Database Development)
Key Responsibilities:
- Optimizing the performance of stored procedures and SQL queries to deliver large amounts of data in under a few seconds.
- Designing and developing numerous complex queries, views, functions, and stored procedures to work seamlessly with the Application/Development teams' data needs.
- Responsible for providing solutions for all data-related needs to support existing and new applications.
- Creating scalable structures to cater to large user bases and manage high workloads
- Responsible for every step of a project, from requirement gathering through implementation and maintenance.
- Developing custom stored procedures and packages to support new enhancement needs.
- Working with multiple teams to design, develop and deliver early warning systems.
- Reviewing query performance and optimizing code (a minimal sketch follows this list)
- Writing queries used for front-end applications
- Designing and coding database tables to store the application data
- Data modelling to visualize database structure
- Working with application developers to create optimized queries
- Maintaining database performance by troubleshooting problems.
- Accomplishing platform upgrades and improvements by supervising system programming.
- Securing database by developing policies, procedures, and controls.
- Designing and managing deep statistical systems.
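To illustrate the query-review responsibility above, here is a minimal sketch, assuming a PostgreSQL backend (listed under desired skills below) and the psycopg2 driver; the DSN, table, and columns are hypothetical.

```python
# A minimal sketch of the query-review step on PostgreSQL with psycopg2.
# The connection string and schema are hypothetical placeholders.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=report_user password=secret host=db-host")
with conn, conn.cursor() as cur:
    # EXPLAIN ANALYZE executes the query and reports the actual plan and
    # timings, the starting point for index or rewrite decisions.
    cur.execute("""
        EXPLAIN ANALYZE
        SELECT customer_id, SUM(amount)
        FROM payments
        WHERE paid_at >= NOW() - INTERVAL '30 days'
        GROUP BY customer_id
    """)
    for (plan_line,) in cur.fetchall():
        print(plan_line)
```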
Desired Skills and Experience :
- 7+ years of experience in database development
- Minimum 4+ years of experience in PostgreSQL is a must
- Experience and in-depth knowledge in PL/SQL
- Ability to come up with multiple possible ways of solving a problem and to decide on the approach that best suits the use case
- Knowledge of database administration, and the ability and experience to use CLI tools for administration
- Experience in Big Data technologies is an added advantage
- Secondary platforms: MS SQL 2005/2008, Oracle, MySQL
- Ability to take ownership of tasks and flexibility to work individually or in a team
- Ability to communicate with teams and clients across time zones and global regions
- Good communication skills and self-motivation
- Should have the ability to work under pressure
- Knowledge of NoSQL and Cloud Architecture will be an advantage
About Us
upGrad is an online education platform building the careers of tomorrow by offering the most industry-relevant programs in an immersive learning experience. Our mission is to create a new digital-first learning experience to deliver tangible career impact to individuals at scale. upGrad currently offers programs in Data Science, Machine Learning, Product Management, Digital Marketing, and Entrepreneurship, among others. upGrad is looking for people passionate about management and education to help design learning programs that help working professionals stay sharp and relevant, and to help build the careers of tomorrow.
ABOUT EPISOURCE:
Episource has devoted more than a decade to building risk-adjustment solutions that measure healthcare outcomes. As one of the leading companies in healthcare, we have helped numerous clients optimize their medical records, data, and analytics to enable better documentation of care for patients with chronic diseases.
The backbone of our consistent success has been our obsession with data and technology. At Episource, all of our strategic initiatives start with the question - how can data be “deployed”? Our analytics platforms and data lakes ingest huge quantities of data daily to help our clients deliver services. We have also built our own machine learning and NLP platform to infuse added productivity and efficiency into our workflow. Combined, these build a foundation of tools and practices used by quantitative staff across the company.
What’s our poison you ask? We work with most of the popular frameworks and technologies like Spark, Airflow, Ansible, Terraform, Docker, ELK. For machine learning and NLP, we are big fans of keras, spacy, scikit-learn, pandas and numpy. AWS and serverless platforms help us stitch these together to stay ahead of the curve.
ABOUT THE ROLE:
We’re looking to hire someone to help scale Machine Learning and NLP efforts at Episource. You’ll work with the team that develops the models powering Episource’s product focused on NLP-driven medical coding. Some of the problems include improving our ICD code recommendations, clinical named entity recognition, improving patient health, clinical suspecting, and information extraction from clinical notes.
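As a minimal illustration of the named-entity-recognition work described above, the sketch below uses spaCy (part of Episource's stated stack); the general-purpose en_core_web_sm model is a stand-in assumption, since a real clinical pipeline would use a domain-specific model.

```python
# A minimal sketch of entity extraction from a clinical note with spaCy.
# en_core_web_sm is a general-purpose stand-in, not a clinical model.
import spacy

nlp = spacy.load("en_core_web_sm")
note = "Patient reports chest pain and was prescribed atorvastatin 40 mg."
doc = nlp(note)

# Print each recognized entity span with its label.
for ent in doc.ents:
    print(ent.text, ent.label_)
```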
This is a role for highly technical data engineers who combine outstanding oral and written communication skills with the ability to code up prototypes and productionize them using a large range of tools, algorithms, and languages. Most importantly, they need the ability to autonomously plan and organize their work assignments based on high-level team goals.
You will be responsible for setting an agenda to develop and ship data-driven architectures that positively impact the business, working with partners across the company including operations and engineering. You will use research results to shape strategy for the company and help build a foundation of tools and practices used by quantitative staff across the company.
During the course of a typical day with our team, expect to work on one or more projects around the following:
1. Create and maintain optimal data pipeline architectures for ML
2. Develop a strong API ecosystem for ML pipelines
3. Building CI/CD pipelines for ML deployments using Github Actions, Travis, Terraform and Ansible
4. Design and develop distributed, high-volume, high-velocity, multi-threaded event-processing systems
5. Knowledge of software engineering best practices across the development lifecycle, coding standards, code reviews, source management, build processes, testing, and operations
6. Deploying data pipelines in production using Infrastructure-as-Code platforms
7. Designing scalable implementations of the models developed by our Data Science teams
8. Big data and distributed ML with PySpark on AWS EMR, and more (a minimal sketch follows this list)!
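As an illustration of item 8, here is a minimal PySpark sketch of a distributed feature-aggregation job; the S3 paths and column names are hypothetical, and on AWS EMR this would typically be submitted via spark-submit.

```python
# A minimal sketch of a distributed ML-prep job with PySpark.
# Paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims-features").getOrCreate()

# Read raw claim events and aggregate per-member features for a model.
claims = spark.read.parquet("s3://example-bucket/claims/")
features = (
    claims
    .groupBy("member_id")
    .agg(
        F.count(F.lit(1)).alias("claim_count"),
        F.sum("allowed_amount").alias("total_allowed"),
    )
)
features.write.mode("overwrite").parquet("s3://example-bucket/features/")
```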
BASIC REQUIREMENTS
- Bachelor’s degree or greater in Computer Science, IT, or related fields
- Minimum of 5 years of experience in cloud, DevOps, MLOps & data projects
- Strong experience with bash scripting, unix environments, and building scalable/distributed systems
- Experience with automation/configuration management using Ansible, Terraform, or equivalent
- Very strong experience with AWS and Python
- Experience building CI/CD systems
- Experience with containerization technologies like Docker, Kubernetes, ECS, EKS, or equivalent
- Ability to build and manage application and performance monitoring processes
- Expertise in designing and implementing enterprise scale database (OLTP) and Data warehouse solutions.
- Hands-on experience in implementing Azure SQL Database, Azure SQL Data Warehouse (Azure Synapse Analytics), and big data processing using Azure Databricks and Azure HDInsight.
- Expert in writing T-SQL programming for complex stored procedures, functions, views and query optimization.
- Aware of database development for both on-premises and SaaS applications using SQL Server and PostgreSQL.
- Experience in ETL and ELT implementations using Azure Data Factory V2 and SSIS.
- Experience and expertise in building machine learning models using logistic and linear regression, decision tree, and random forest algorithms (a minimal sketch follows this list).
- PolyBase queries for exporting and importing data into Azure Data Lake.
- Building data models both tabular and multidimensional using SQL Server data tools.
- Writing data preparation, cleaning, and processing steps using Python, Scala, and R.
- Programming experience using python libraries NumPy, Pandas and Matplotlib.
- Implementing NoSQL databases and writing queries using Cypher.
- Designing end user visualizations using Power BI, QlikView and Tableau.
- Experience working with all versions of SQL Server 2005/2008/2008R2/2012/2014/2016/2017/2019
- Experience using the expression languages MDX and DAX.
- Experience in migrating on-premise SQL server database to Microsoft Azure.
- Hands on experience in using Azure blob storage, Azure Data Lake Storage Gen1 and Azure Data Lake Storage Gen2.
- Performance tuning complex SQL queries, hands on experience using SQL Extended events.
- Data modeling using Power BI for ad hoc reporting.
- Raw data load automation using T-SQL and SSIS
- Expert in migrating existing on-premise database to SQL Azure.
- Experience in using U-SQL for Azure Data Lake Analytics.
- Hands on experience in generating SSRS reports using MDX.
- Experience in designing predictive models using Python and SQL Server.
- Developing machine learning models using Azure Databricks and SQL Server
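To illustrate the model-building bullet above, here is a minimal scikit-learn sketch training the named algorithms on a synthetic dataset; the data and settings are illustrative stand-ins, not part of the posting.

```python
# A minimal sketch of logistic regression and random forest training
# on a synthetic classification dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Fit both model families and compare held-out accuracy.
for model in (LogisticRegression(max_iter=1000), RandomForestClassifier()):
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(type(model).__name__, round(acc, 3))
```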
Lead and drive development in the BI domain using the Tableau ecosystem, with deep technical and BI-ecosystem knowledge. The resource will be responsible for dashboard design, development, and delivery of BI services using the Tableau ecosystem.
Key functions & responsibilities:
Communication and interaction with the Project Manager to understand requirements
Dashboard design, development, and deployment using the Tableau ecosystem
Ensure delivery within a given time frame while maintaining quality
Stay up to date with current tech and bring relevant ideas to the table
Proactively work with the Management team to identify and resolve issues
Perform other related duties as assigned or advised
He/she should be a leader who sets the standard and expectations through example in his/her conduct, work ethic, integrity, and character
Contribute to dashboard design, R&D, and project delivery using Tableau
Candidate’s Profile
Academics:
Bachelor's degree, preferably in Computer Science.
A Master's degree would be an added advantage.
Experience:
Overall 2-5 years of experience in DW/BI development projects, having worked on BI and visualization technologies (Tableau, QlikView) for at least 2 years.
At least 2 years of experience covering the Tableau implementation lifecycle, including hands-on development/programming, managing security, data modelling, data blending, etc. (a minimal administration sketch follows).
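As a minimal illustration of scripted Tableau Server administration relevant to the lifecycle duties above, the sketch below uses the tableauserverclient library; the server URL, site, and credentials are hypothetical.

```python
# A minimal sketch of auditing published content on Tableau Server.
# Server URL, site name, and credentials are hypothetical placeholders.
import tableauserverclient as TSC

tableau_auth = TSC.TableauAuth("admin_user", "password", site_id="analytics")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(tableau_auth):
    # List published workbooks, e.g. to review what each project exposes.
    all_workbooks, _ = server.workbooks.get()
    for wb in all_workbooks:
        print(wb.name, wb.project_name)
```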
Technology & Skills:
Hands-on expertise in Tableau administration and maintenance
Strong working knowledge and development experience with Tableau Server and Desktop
Strong knowledge in SQL, PL/SQL and Data modelling
Knowledge of databases like Microsoft SQL Server, Oracle, etc.
Exposure to alternative visualization technologies like QlikView, Spotfire, Pentaho, etc.
Good communication and analytical skills, with excellent creative and conceptual thinking abilities
Superior organizational skills, attention to detail/level of quality, and strong communication skills, both verbal and written