
11+ ACL Jobs in Mumbai | ACL Job openings in Mumbai

Apply to 11+ ACL Jobs in Mumbai on CutShort.io. Explore the latest ACL Job opportunities across top companies like Google, Amazon & Adobe.

A Reputed Consulting Firm
Agency job
via 2COMS by Rafikhunnisa Shaik
Gurugram, Bengaluru (Bangalore), Mumbai
1 - 8 yrs
₹5L - ₹12L / yr
Data Analytics
Data Visualization
PowerBI
Tableau
Qlikview
+3 more

Job Role: Audit Analytics – candidates from a non-banking domain

Job Location: Gurgaon / Mumbai / Bengaluru


  • Understanding of business processes and potential risk scenarios.
  • Ability to conceptualize appropriate logic for analyzing potential risk scenarios.
  • Ability to understand requirements clearly, be flexible in learning new data sources and technologies, meet tight deadlines, and deliver quality reports for auditors.
  • Maintain strong client focus by building positive relationships with clients and scheduling, conducting, and presenting at key client meetings.
  • Should be able to write and optimize complex scripts in the technology of expertise, review results, and identify false positives based on business understanding.
  • Should be a self-starter, eager to tackle business problems using experience and skills.
  • Play a key role in the development of less experienced staff through mentoring, training, and advising.
  • 30% travel in India and overseas, if required.
  • Excellent communication skills and willingness to stretch and multi-task.
  • May be assigned to a project on a long-term basis.
  • Responsibilities include managing projects involving audit analytics and continuous control monitoring.
    • Understanding of business processes (Accounts Payable, Revenue, Fixed Assets, Inventory, MJEs) from an analytics-requirements perspective.
    • Understanding of ERPs (SAP / JDE / Oracle / Concur, etc.) on the techno-functional side (tables and reports).
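Audit analytics of the kind described above usually reduces to scripted tests over transaction data, followed by a review of the flagged records for false positives. As a minimal, hypothetical sketch (the record fields and the duplicate criterion are assumptions, not this firm's actual logic), a duplicate-payment check over Accounts Payable data might look like:

```python
from collections import defaultdict

def find_duplicate_payments(payments):
    """Flag potential duplicate payments: the same vendor, invoice number,
    and amount appearing more than once. Returns all flagged records,
    which an auditor would then review for false positives."""
    groups = defaultdict(list)
    for p in payments:
        key = (p["vendor"], p["invoice_no"], p["amount"])
        groups[key].append(p)
    # Keep only groups with more than one payment -- candidate duplicates
    return [p for recs in groups.values() if len(recs) > 1 for p in recs]

payments = [
    {"vendor": "V001", "invoice_no": "INV-9", "amount": 1200.0},
    {"vendor": "V001", "invoice_no": "INV-9", "amount": 1200.0},
    {"vendor": "V002", "invoice_no": "INV-3", "amount": 560.0},
]
flagged = find_duplicate_payments(payments)
print(len(flagged))  # 2 records share vendor/invoice/amount
```

In ACL or SQL the same test would be a grouped count with a HAVING clause; the business-understanding step is deciding which matches are legitimate (e.g. installment payments) rather than errors.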

 

Qualifications

Minimum qualifications

  • Preferred: Post Graduates – MCom / MSc (IT) / MBA (IT) / BE
  • Years of experience in a related field of Audit / Business / Financial analytics (non-banking)
  • Working knowledge of analytical / BI tools:
  • ACL, SQL / R / Python, Alteryx – should know any 2 or more
  • VBA, Power BI / Tableau / QlikView
  • GRC Solutions, AWS/Azure cloud-based analytical solutions

Good to have

  • Experience in data analytics support work, whether tool implementation, automation of controls, or MIS development
Mumbai
5 - 10 yrs
₹8L - ₹20L / yr
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
recommendation algorithm
+6 more


Data Scientist – Delivery & New Frontiers Manager 

Job Description:   

We are seeking a highly skilled and motivated data scientist to join our Data Science team. The successful candidate will play a pivotal role in our data-driven initiatives and be responsible for designing, developing, and deploying data science solutions that drive business value for stakeholders. The role involves mapping business problems to formal data science solutions, working with a wide range of structured and unstructured data, designing architectures, creating sophisticated models, setting up operations for the data science product with support from the MLOps team, and facilitating business workshops. In a nutshell, this person will represent data science and provide expertise across the full project cycle. Expectations of the successful candidate are above those of a typical data scientist: beyond technical expertise, problem solving in complex set-ups will be key to success in this role.

Responsibilities: 

  • Collaborate with cross-functional teams, including software engineers, product managers, and business stakeholders, to understand business needs and identify data science opportunities.
  • Map complex business problems to data science problems and design solutions on the GCP / Azure Databricks platform.
  • Collect, clean, and preprocess large datasets from various internal and external sources.
  • Streamline the data science process, working with the Data Engineering and Technology teams.
  • Manage multiple analytics projects within a function to deliver end-to-end data science solutions, create insights, and identify patterns.
  • Develop and maintain data pipelines and infrastructure to support the data science projects.
  • Communicate findings and recommendations to stakeholders through data visualizations and presentations.
  • Stay up to date with the latest data science trends and technologies, specifically on GCP.

 

Education / Certifications:  

Bachelor’s or Master’s in Computer Science, Engineering, Computational Statistics, Mathematics. 

Job specific requirements:  

  • Brings 5+ years of deep data science experience
  • Strong knowledge of machine learning and statistical modeling techniques in a cloud-based environment such as GCP, Azure, or AWS
  • Experience with programming languages such as Python, R, and Spark
  • Experience with data visualization tools such as Tableau, Power BI, and D3.js
  • Strong understanding of data structures, algorithms, and software design principles
  • Experience with GCP platforms and services such as BigQuery, Cloud ML Engine, and Cloud Storage
  • Experience configuring and setting up version control of code, data, and machine learning models using GitHub
  • Self-driven; able to work with cross-functional teams in a fast-paced environment and adapt to changing business needs
  • Strong analytical and problem-solving skills
  • Excellent verbal and written communication skills
  • Working knowledge of application architecture, data security, and compliance


Media.net

Posted by Akshata Kulkarni
Mumbai
0 - 0 yrs
₹2L - ₹3.5L / yr
SQL
MS-Excel
Communication Skills
SAS
SPSS

Please note: only candidates who graduated in 2022 or 2023, are based in Mumbai, and can join immediately will be considered for this role.



JD - Data Operations Analyst


What is the job and team like?

  • As a Data Operations Analyst, you manage business reporting for numerous teams and constantly monitor performance
  • Check the integrity of the revenue reporting done by the different systems so that the correct profitability is reported to the CXOs
  • Send reports periodically and alert stakeholders to changes in the key performance metrics
  • Allocate efforts to different business implementations that help build the Profit/Loss statement for the Financials
  • Track crucial data points which affect the core of the business and escalate them to senior stakeholders


Roles and Responsibilities


  • Graduate in an IT background (BE / BSc IT / BCA); 2022 and 2023 graduates only
  • Execute a set of business processes daily/weekly/monthly as per business requirements
  • Provide ad-hoc data support on any urgent reports and material in an expedited manner
  • Maintain a list of open tasks and escalations, and send updates to the relevant stakeholders
  • Have an eye for detail: the ability to look at numbers, spot trends, and identify gaps
  • Identify efficient and meaningful ways to communicate data and analysis through ongoing reports and dashboards
  • Proficiency in SQL and Excel; any statistical and analytical tools such as SAS or SPSS are a big plus
  • Manage master data, including creation, updates, and deletion
  • Ability to work in a fast-paced, technical, cross-functional environment
  • Familiarity with the Internet industry and the online advertising business is a plus


Ideal candidate


  • Import and export large volumes of data to database tables as required
  • Should be able to write Data Definition Language (DDL) and Data Manipulation Language (DML) SQL commands
  • Develop programs and methodologies to get analyzable data on a regular basis
  • Good team player and multi-tasker
  • Should have the ability to learn and adapt to change
  • Self-starter; must be productive with minimal direction
  • High-level written and verbal communication skills
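The DDL/DML expectation above can be illustrated with Python's built-in sqlite3 module; the table and column names here are made up purely for the sketch, and a production setup would target the team's actual database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: define the schema
cur.execute("""
    CREATE TABLE revenue (
        team    TEXT NOT NULL,
        month   TEXT NOT NULL,
        amount  REAL NOT NULL
    )
""")

# DML: insert rows, then update a subset
cur.executemany(
    "INSERT INTO revenue (team, month, amount) VALUES (?, ?, ?)",
    [("ads", "2024-01", 150.0), ("ads", "2024-02", 180.0)],
)
cur.execute("UPDATE revenue SET amount = amount * 1.1 WHERE month = '2024-02'")
conn.commit()

total = cur.execute("SELECT SUM(amount) FROM revenue").fetchone()[0]
print(round(total, 2))  # 150.0 + 180.0 * 1.1 = 348.0
```

The same CREATE/INSERT/UPDATE/SELECT statements carry over to any SQL database; only the connection setup changes.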


Job Details

Work mode- In office

Must have skills - SQL, MS Excel, Communications


TreQ

Posted by Nidhi Tiwari
Mumbai
2 - 5 yrs
₹7L - ₹12L / yr
ETL
Informatica
Data Warehouse (DWH)
Amazon Web Services (AWS)
PostgreSQL
+1 more

Responsibilities :


  • Participate in planning, design, development, and maintenance of large-scale data repositories, pipelines, analytical solutions, and knowledge management strategy
  • Build and maintain optimal data pipeline architecture to ensure scalability, and connect operational systems data to analytics and business intelligence (BI) systems
  • Build data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader
  • Report and derive insights from large data chunks on import/export, and communicate relevant pointers to help decision-making
  • Prepare, analyze, and present reports to management for further developmental activities
  • Anticipate, identify, and solve issues concerning data management to improve data quality


Requirements :


  • Ability to build and maintain ETL pipelines
  • Technical business analysis experience and hands-on experience developing functional specs
  • Good understanding of data engineering principles, including data modeling methodologies
  • Sound understanding of PostgreSQL
  • Strong analytical and interpersonal skills as well as reporting capabilities
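The ETL requirement above follows a standard extract-transform-load shape. A dependency-free sketch (the shipment records and field names are hypothetical; a real pipeline would extract from operational systems and load into PostgreSQL rather than a list):

```python
def extract():
    # Stand-in for querying an operational system or reading source files
    return [
        {"shipment_id": 1, "weight_kg": "120.5", "route": "BOM-DEL"},
        {"shipment_id": 2, "weight_kg": "bad",   "route": "BOM-BLR"},
        {"shipment_id": 3, "weight_kg": "98.0",  "route": "BOM-DEL"},
    ]

def transform(rows):
    # Clean types and drop records that fail validation (data quality step)
    clean = []
    for r in rows:
        try:
            clean.append({**r, "weight_kg": float(r["weight_kg"])})
        except ValueError:
            continue  # in production: log/quarantine the bad record
    return clean

def load(rows, warehouse):
    # Stand-in for INSERTs into the warehouse (e.g. PostgreSQL)
    warehouse.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # 2 valid records loaded; 1 rejected in transform
```

Tools like Informatica industrialize the same pattern with scheduling, lineage, and monitoring around each stage.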
Crisp Analytics

Posted by Seema Pahwa
Mumbai
2 - 6 yrs
₹6L - ₹15L / yr
Big Data
Spark
Scala
Amazon Web Services (AWS)
Apache Kafka

 

The Data Engineering team is one of the core technology teams of Lumiq.ai and is responsible for creating all the Data related products and platforms which scale for any amount of data, users, and processing. The team also interacts with our customers to work out solutions, create technical architectures and deliver the products and solutions.

If you are someone who is always pondering how to make things better, how technologies can interact, how various tools, technologies, and concepts can help a customer or how a customer can use our products, then Lumiq is the place of opportunities.

 

Who are you?

  • Enthusiast is your middle name. You know what’s new in Big Data technologies and how things are moving
  • Apache is your toolbox and you have been a contributor to open source projects or have discussed the problems with the community on several occasions
  • You use cloud for more than just provisioning a Virtual Machine
  • Vim is friendly to you and you know how to exit Nano
  • You check logs before screaming about an error
  • You are a solid engineer who writes modular code and commits to Git
  • You are a doer who doesn’t say “no” without first understanding
  • You understand the value of documentation of your work
  • You are familiar with Machine Learning Ecosystem and how you can help your fellow Data Scientists to explore data and create production-ready ML pipelines

 

Eligibility

Experience

  • At least 2 years of Data Engineering Experience
  • Have interacted with Customers


Must Have Skills

  • Amazon Web Services (AWS) - EMR, Glue, S3, RDS, EC2, Lambda, SQS, SES
  • Apache Spark
  • Python
  • Scala
  • PostgreSQL
  • Git
  • Linux


Good to have Skills

  • Apache NiFi
  • Apache Kafka
  • Apache Hive
  • Docker
  • Amazon Certification

 

 

nymbleUP
Remote, Mumbai
3 - 5 yrs
₹6L - ₹12L / yr
Artificial Intelligence (AI)
Machine Learning (ML)
Python
NumPy
Keras
+3 more
ML/AI engineer with hands-on experience working with time-series data, layering data, and adopting complex parameters. At least 3-5 years of experience working with customer data and handling ETL operations; experience converting machine learning models into APIs.

Responsibilities

  1. Create data funnels to feed into models via web, structured, and unstructured data
  2. Maintain coding standards using SDLC, Git, AWS deployments, etc.
  3. Keep abreast of developments in the field
  4. Deploy models in production and monitor them
  5. Document processes and logic
  6. Take ownership of the solution from code to deployment and performance
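"Converting machine learning models into APIs", as the role requires, means putting a trained model behind an HTTP endpoint. A minimal stdlib-only sketch (the `/predict` path and the averaging "model" are stand-ins; a real service would load a serialized trained model and typically use a framework like Flask or FastAPI):

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(features):
    # Stand-in "model": a real service would call a trained model here
    return {"forecast": sum(features) / len(features)}

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        features = json.loads(body)["features"]
        payload = json.dumps(predict(features)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        pass  # silence per-request logging for the demo

# Serve on an ephemeral port in a background thread
server = HTTPServer(("127.0.0.1", 0), PredictHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client call: POST features, receive a JSON prediction
req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/predict",
    data=json.dumps({"features": [10, 12, 14]}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    result = json.loads(resp.read())
print(result)  # {'forecast': 12.0}
server.shutdown()
```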

 

Bengaluru (Bangalore), Mumbai, Gurugram, Nashik, Pune, Visakhapatnam, Chennai, Noida
3 - 5 yrs
₹8L - ₹12L / yr
Oracle Analytics
OAS
OAC
Oracle OAS
Oracle
+8 more

Oracle OAS Developer

 

 

Senior OAS/OAC (Oracle Analytics) designer and developer with 3+ years of experience. Has worked on the new Oracle Analytics platform, used the latest features and custom plug-ins, and designed new ones using Java. Has a good understanding of the various graph data points and their usage for appropriate financial data display. Has worked on performance tuning and building complex data security requirements.

Qualifications



Bachelor's degree in Engineering/Computer Science.

Additional information

Knowledge of Financial and HR dashboards

 

Nascentvision

Posted by Shanu Mohan
Gurugram, Mumbai, Bengaluru (Bangalore)
2 - 4 yrs
₹10L - ₹17L / yr
Python
PySpark
Amazon Web Services (AWS)
Spark
Scala
+2 more
  • Hands-on experience in any cloud platform
  • Versed in Spark, Scala/Python, SQL
  • Microsoft Azure experience
  • Experience working on real-time data processing pipelines
Mumbai
5 - 7 yrs
₹20L - ₹25L / yr
AWS Kinesis
Data engineering
AWS Lambda
DynamoDB
data pipeline
+11 more
  • Key responsibility is to design and develop a data pipeline for real-time data integration and processing, executing the model (if required), and exposing output via MQ / API / NoSQL DB for consumption
  • Provide technical expertise to design efficient data ingestion solutions to store and process unstructured data such as documents, audio, images, weblogs, etc.
  • Develop API services to provide data as a service
  • Prototype solutions for complex data processing problems using AWS cloud-native services
  • Implement automated audit and quality assurance checks in the data pipeline
  • Document and maintain data lineage from various sources to enable data governance
  • Coordinate with BIU, IT, and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading into downstream systems, NoSQL databases, etc.

Skills

  • Programming experience using Python and SQL
  • Extensive working experience in data engineering projects, using AWS Kinesis, AWS S3, DynamoDB, EMR, Lambda, Athena, etc. for event processing
  • Experience and expertise in implementing complex data pipelines
  • Strong familiarity with the AWS toolset for storage and processing; able to recommend the right tools/solutions for specific data processing problems
  • Hands-on experience in unstructured (audio, image, documents, weblogs, etc.) data processing
  • Good analytical skills with the ability to synthesize data to design and deliver meaningful information
  • Know-how of any NoSQL DB (DynamoDB, MongoDB, CosmosDB, etc.) will be an advantage
  • Ability to understand business functionality, processes, and flows
  • Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently

Functional knowledge

  • Real-time Event Processing
  • Data Governance & Quality assurance
  • Containerized deployment
  • Linux
  • Unstructured Data Processing
  • AWS Toolsets for Storage & Processing
  • Data Security

 

IT Company
Agency job
via Volibits by Manasi D
Pune, Mumbai, Bengaluru (Bangalore)
5 - 8 yrs
₹1L - ₹10L / yr
JDE
JDE DSI
JDE DSI (integration tool) – 4-8 yrs
JD Edwards Data System Integration; data-source agnostic; modeling structure
Quantiphi Inc.

Posted by Bhavisha Mansukhani
Mumbai
1 - 5 yrs
₹3L - ₹15L / yr
Data Science
Decision Science
Data modeling
Statistical Modeling
Python
+3 more
About us: Quantiphi is a category-defining Data Science and Machine Learning software and services company focused on helping organizations translate the big promise of Big Data and Machine Learning technologies into quantifiable business impact. We were founded on the belief that machine learning and artificial intelligence are transformative technologies that will create the next quantum gain in customer experience and unit economics of businesses. Quantiphi helps clients find and capture hidden value from data through a unique blend of business acumen, big data, machine learning, and intuitive information design.

AthenasOwl (AO) is our "AI for Media" solution that helps content creators and broadcasters create and curate smarter content. We launched the product in 2017 as an AI-powered suite meant for the media and entertainment industry. Clients use AthenasOwl's context-adapted technology for redesigning content, taking better targeting decisions, automating hours of post-production work, and monetizing massive content libraries. Please find the attached fact sheet for your reference. For more details visit: www.quantiphi.com ; www.athenasowl.tv

Job Description:

  • Developing high-level solution architecture related to different use-cases in the media industry
  • Leveraging both structured and unstructured data from external sources and our proprietary AI/ML models to build solutions and workflows that can be used to give data-driven insights
  • Developing sophisticated yet easy-to-digest interpretations and communicating insights to clients that lead to quantifiable business impact
  • Building deep relationships with clients by understanding their stated but, more importantly, latent needs
  • Working closely with the client-side delivery managers to ensure a seamless communication and delivery cadence

Essential Skills and Qualifications:

  • Hands-on experience with statistical tools and techniques in Python
  • Great analytical skills, with expertise in analytical toolkits such as Logistic Regression, Cluster Analysis, Factor Analysis, Multivariate Regression, statistical modelling, and predictive analysis
  • Advanced knowledge of supervised and unsupervised machine learning algorithms like Random Forest, Boosting, SVM, Neural Networks, Collaborative Filtering, etc.
  • Ability to think creatively and work well both as part of a team and as an individual contributor
  • Critical eye for the quality of data and strong desire to get it right
  • Strong communication skills; should be able to read a paper and quickly implement ideas from scratch
  • A pleasantly forceful personality and charismatic communication style