11+ Alteryx Jobs in Mumbai | Alteryx Job openings in Mumbai
Apply to 11+ Alteryx Jobs in Mumbai on CutShort.io. Explore the latest Alteryx Job opportunities across top companies like Google, Amazon & Adobe.
Job Role : Audit Analytics – candidates from a non-banking domain required
Job Location : Gurgaon / Mumbai / Bengaluru
- Understanding of business processes and potential risk scenarios.
- Ability to conceptualize appropriate logic for analyzing potential risk scenarios
- Ability to understand requirements clearly and to be flexible in learning new data sources and technologies, meeting tight deadlines, and delivering quality reports for auditors.
- Maintain strong client focus by building positive relationships with clients, and by scheduling, conducting and presenting at key client meetings.
- Should be able to write/optimize complex scripts in the technology of expertise, review results, and identify false positives based on business understanding.
- Should be a self-starter, eager to tackle business problems using experience and skills.
- Play a key role in the development of less experienced staff through mentoring, training and advising.
- 30% Travel in India and Overseas, if required
- Excellent communication skills and willingness to stretch and multi-task
- May be assigned to a project on a long-term basis.
- Responsibilities include managing projects involving audit analytics and continuous control monitoring.
- Understanding of business processes (Accounts Payable, Revenue, Fixed Assets, Inventory, MJEs) from an analytics-requirements standpoint
- Understanding of ERPs (SAP / JDE / Oracle / Concur etc.) on the techno-functional side (tables and reports)
Qualifications
Minimum qualifications
- Postgraduates preferred: M.Com / M.Sc (IT) / MBA (IT) / BE
- Relevant experience in audit / business / financial analytics (non-banking).
- Working knowledge of analytical / BI tools:
- ACL, SQL / R / Python, Alteryx – any two or more required
- VBA, Power BI / Tableau / QlikView
- GRC solutions, AWS/Azure cloud-based analytical solutions – good to have
- Experience in data analytics support work, whether tool implementation, automation of controls, or MIS development
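Much of the work above (scripting in SQL/R/Python, reviewing results for false positives) comes down to writing detection logic over transactional data. As an illustrative sketch only — the record layout (`vendor`, `invoice_no`, `amount`) is hypothetical — a duplicate-payment test over Accounts Payable data might look like this in plain Python:

```python
from collections import defaultdict

def flag_duplicate_payments(invoices):
    """Group invoices by (vendor, normalized invoice number, amount) and
    flag any key that appears more than once as a potential duplicate."""
    groups = defaultdict(list)
    for inv in invoices:
        # Normalize the invoice number so trivially re-keyed copies still match.
        key = (inv["vendor"], inv["invoice_no"].strip().upper(), round(inv["amount"], 2))
        groups[key].append(inv["id"])
    return {key: ids for key, ids in groups.items() if len(ids) > 1}

invoices = [
    {"id": 1, "vendor": "ACME", "invoice_no": "INV-001", "amount": 1500.00},
    {"id": 2, "vendor": "ACME", "invoice_no": "inv-001 ", "amount": 1500.00},  # same invoice, re-keyed
    {"id": 3, "vendor": "ACME", "invoice_no": "INV-002", "amount": 980.00},
]

print(flag_duplicate_payments(invoices))  # one duplicate group: invoices 1 and 2
```

The flagged groups are exactly the candidates an auditor would then review by hand to weed out false positives.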
at LS Spectrum Solutions Private Limited
Job Description
▪ You are responsible for setting up, operating, and monitoring LS system solutions on premise and in the cloud
▪ You are responsible for the analysis and long-term elimination of system errors
▪ You provide support in the area of information and IT security
▪ You will work on the strategic further development and optimize the platform used
▪ You will work in a global, international team
Requirement profile:
▪ You have successfully completed an apprenticeship / degree in the field of IT
▪ You can demonstrate in-depth knowledge and experience in the following areas:
▪ PostgreSQL databases
▪ Linux (e.g. Ubuntu, Oracle Linux, RHEL)
▪ Windows (e.g. Windows Server 2019/2022)
▪ Automation / IaC (e.g. Ansible, Terraform)
▪ Containerization with Kubernetes / virtualization with VMware is an advantage
▪ Service APIs (AWS, Azure)
▪ You have very good knowledge of English, knowledge of German is an advantage
▪ You are a born team player, show high commitment and are resilient
Professional experience in Python – Mandatory
Basic knowledge of any BI Tool (Microsoft Power BI, Tableau etc.) and experience in R will be an added advantage
Proficient in Excel
Good verbal and written communication skills
Key Responsibilities:
Analyze data trends and provide intelligent business insights; monitor operational and business metrics
Complete ownership of the business excellence dashboard and preparation of reports for senior management stating trends, patterns, and predictions using relevant data
Review, validate and analyse data points and implement new data analysis methodologies
Perform data profiling to identify and understand anomalies
Perform analysis to assess quality and meaning of data
Develop policies and procedures for the collection and analysis of data
Analyse existing processes with the help of data and propose process changes and/or lead process re-engineering initiatives
Use BI tools (Microsoft Power BI/Tableau) to develop and manage BI solutions
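The data-profiling responsibility above can be sketched with a simple z-score check. This is a minimal illustration in plain Python; the threshold and the sample data are assumptions, not part of the role:

```python
import statistics

def zscore_outliers(values, threshold=3.0):
    """Flag values whose z-score exceeds the threshold — a simple first
    pass for profiling a metric to identify anomalies."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # constant series: nothing can be an outlier
    return [v for v in values if abs(v - mean) / stdev > threshold]

daily_orders = [102, 98, 105, 99, 101, 97, 480]  # one suspicious spike
print(zscore_outliers(daily_orders, threshold=2.0))  # → [480]
```

In practice each anomaly flagged this way would be investigated for root cause before any dashboard or report treats it as a real trend.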
Job Description
Position: Sr Data Engineer – Databricks & AWS
Experience: 4 - 5 Years
Company Profile:
Exponentia.ai is an AI tech organization with a presence across India, Singapore, the Middle East, and the UK. We are an innovative and disruptive organization, working on cutting-edge technology to help our clients transform into the enterprises of the future. We provide artificial intelligence-based products/platforms capable of automated cognitive decision-making to improve productivity, quality, and economics of the underlying business processes. Currently, we are transforming ourselves and rapidly expanding our business.
Exponentia.ai has developed long-term relationships with world-class clients such as PayPal, PayU, SBI Group, HDFC Life, Kotak Securities, Wockhardt and Adani Group amongst others.
One of the top partners of Cloudera (a leading analytics player) and Qlik (a leader in BI technologies), Exponentia.ai was awarded the ‘Innovation Partner Award’ by Qlik in 2017.
Get to know more about us on our website: http://www.exponentia.ai/ and Life @Exponentia.
Role Overview:
· A Data Engineer understands the client requirements and develops and delivers the data engineering solutions as per the scope.
· The role requires good skills in the development of solutions using various services required for data architecture on Databricks Delta Lake, streaming, AWS, ETL Development, and data modeling.
Job Responsibilities
• Design of data solutions on Databricks including delta lake, data warehouse, data marts and other data solutions to support the analytics needs of the organization.
• Apply best practices during design in data modeling (logical, physical) and ETL pipelines (streaming and batch) using cloud-based services.
• Design, develop and manage the pipelining (collection, storage, access), data engineering (data quality, ETL, Data Modelling) and understanding (documentation, exploration) of the data.
• Interact with stakeholders regarding data landscape understanding, conducting discovery exercises, developing proof of concepts and demonstrating it to stakeholders.
Technical Skills
• More than 2 years of experience developing data lakes and data marts on the Databricks platform.
• Proven skill set in AWS Data Lake services such as AWS Glue, S3, Lambda, SNS and IAM, along with skills in Spark, Python, and SQL.
• Experience in Pentaho
• Good understanding of developing a data warehouse, data marts etc.
• Has a good understanding of system architectures, and design patterns and should be able to design and develop applications using these principles.
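Much of the Databricks/Delta Lake work described above revolves around MERGE-style upserts into data marts. In practice this is done with Spark SQL's `MERGE INTO` on a Delta table; the sketch below mimics only the matched-update / not-matched-insert semantics in plain Python, as a language-neutral illustration (the `id`/`status` schema is invented):

```python
def merge_upsert(target, updates, key="id"):
    """Mimic Delta Lake MERGE semantics: rows in `updates` overwrite
    matching rows in `target` by key; unmatched rows are inserted."""
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        if row[key] in by_key:
            by_key[row[key]].update(row)   # WHEN MATCHED THEN UPDATE
        else:
            by_key[row[key]] = dict(row)   # WHEN NOT MATCHED THEN INSERT
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "status": "open"}, {"id": 2, "status": "open"}]
updates = [{"id": 2, "status": "closed"}, {"id": 3, "status": "open"}]
print(merge_upsert(target, updates))
```

The same idempotent upsert shape underlies both the batch and streaming pipelines the role calls for: replaying the same update batch twice leaves the target unchanged.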
Personality Traits
• Good collaboration and communication skills
• Excellent problem-solving skills to be able to structure the right analytical solutions.
• Strong sense of teamwork, ownership, and accountability
• Analytical and conceptual thinking
• Ability to work in a fast-paced environment with tight schedules.
• Good presentation skills with the ability to convey complex ideas to peers and management.
Education:
BE / ME / MS / MCA.
Responsibilities :
- Be involved in planning, design, development and maintenance of large-scale data repositories, pipelines, analytical solutions and knowledge management strategy
- Build and maintain optimal data pipeline architecture to ensure scalability, connect operational systems data for analytics and business intelligence (BI) systems
- Build data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader
- Reporting and obtaining insights from large volumes of import/export data, and communicating relevant pointers to aid decision-making
- Preparation, analysis, and presentation of reports to the management for further developmental activities
- Anticipate, identify and solve issues concerning data management to improve data quality
Requirements :
- Ability to build and maintain ETL pipelines
- Technical business analysis experience and hands-on experience developing functional specs
- Good understanding of Data Engineering principles including data modeling methodologies
- Sound understanding of PostgreSQL
- Strong analytical and interpersonal skills as well as reporting capabilities
In today’s marketing-dependent consumer world, businesses demand innovative brand solutions and our client is engaged in providing brand solutions across businesses. They import renowned global brands & distribute the same. They have a dedicated eCommerce practice to build digital retail footprints for their brands in India. They focus on major categories under FMCG such as F&B, Personal care, etc. and also Home & Fabric Care.
Founded in 2003, our client has over 250 Pan-India Distributors and more than 300 SKUs. They deal with 25+ brands and some of the brands are Mogu Mogu, SAN REMO, Simpkins, Moccona, OMINO BIANCO, etc.
As an Assistant Manager - Business Intelligence, you will be responsible for supporting the building and execution of the portfolio and the respective brand execution strategies by mining, understanding, internalising, and creating action plans around the inflow and outflow of information from different internal and external sources.
What you will do:
- Assimilating sales & inventory data streams from different ecommerce partners daily
- Planning different inventory streams for all portfolio brands vis-à-vis PO requirements
- Building future projection models for brands to support healthy and sufficient inventory capacities
Desired Candidate Profile
What you need to have:
- Self-direction
- Ability to think beyond operational hurdles
Job Details:-
Designation - Data Scientist
Urgently required (notice period of maximum 15 days).
Location:- Mumbai
Experience:- 5-7 years.
Package Offered:- Rs.5,00,000/- to Rs.9,00,000/- pa.
Data Scientist
Job Description:-
Responsibilities:
- Identify valuable data sources and automate collection processes
- Undertake preprocessing of structured and unstructured data
- Analyze large amounts of information to discover trends and patterns
- Build predictive models and machine-learning algorithms
- Combine models through ensemble modeling
- Present information using data visualization techniques
- Propose solutions and strategies to business challenges
- Collaborate with engineering and product development teams
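The "combine models through ensemble modeling" responsibility above can be as simple as averaging the predictions of several models; weighted averaging and stacking are natural extensions. A minimal plain-Python sketch, where the toy models are placeholders for real trained estimators:

```python
def ensemble_average(models, x):
    """Combine model predictions by simple averaging — the most basic
    form of ensemble modeling."""
    preds = [m(x) for m in models]
    return sum(preds) / len(preds)

# Three toy 'models': each a function mapping a feature to a prediction.
models = [lambda x: 2 * x, lambda x: 2 * x + 1, lambda x: 2 * x - 1]
print(ensemble_average(models, 5))  # → 10.0
```

Averaging reduces the variance of the combined prediction when the individual models make uncorrelated errors, which is the core reason ensembles tend to outperform single models.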
Requirements:
- Proven experience as a Data Scientist or Data Analyst
- Experience in data mining
- Understanding of machine-learning and operations research
- Knowledge of R, SQL and Python; familiarity with Scala, Java is an asset
- Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop)
- Analytical mind and business acumen
- Strong math skills (e.g. statistics, algebra)
- Problem-solving aptitude
- Excellent communication and presentation skills
- BSc/BA in Computer Science, Engineering or relevant field; graduate degree in Data Science or other quantitative field is preferred
Should be able to use transformation components to transform the data
Should possess knowledge of incremental load, full load etc.
Should design, build and deploy effective packages
Should be able to schedule these packages through task schedulers
Implement stored procedures and effectively query a database
Translate requirements from the business and analysts into technical code
Identify and test for bugs and bottlenecks in the ETL solution
Ensure the best possible performance and quality in the packages
Provide support and fix issues in the packages
Write advanced SQL, including some query tuning
Experience in the identification of data quality issues
Some database design experience is helpful
Experience designing and building complete ETL/SSIS processes moving and transforming data for ODS, staging, and data warehousing
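The incremental load mentioned above is typically watermark-based: only rows changed since the last successful run are extracted, whereas a full load copies every source row. A minimal sketch in plain Python — the column names and string-date watermark are assumptions; in SSIS this would be a parameterized source query plus a watermark table:

```python
def incremental_load(source_rows, target_rows, watermark):
    """Watermark-based incremental load: append only rows modified after
    the last recorded watermark, then advance the watermark."""
    new_rows = [r for r in source_rows if r["modified"] > watermark]
    target_rows.extend(new_rows)
    # ISO-format date strings compare correctly as plain strings.
    new_watermark = max((r["modified"] for r in new_rows), default=watermark)
    return target_rows, new_watermark

source = [
    {"id": 1, "modified": "2024-01-01"},
    {"id": 2, "modified": "2024-01-05"},
    {"id": 3, "modified": "2024-01-09"},
]
target, wm = incremental_load(source, [], watermark="2024-01-03")
print(len(target), wm)  # 2 rows loaded; watermark advances to 2024-01-09
```

Persisting the returned watermark between runs is what makes the package safe to schedule repeatedly through a task scheduler.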
Our client is an innovative Fintech company that is revolutionizing the business of short term finance. The company is an online lending startup that is driven by an app-enabled technology platform to solve the funding challenges of SMEs by offering quick-turnaround, paperless business loans without collateral. It counts over 2 million small businesses across 18 cities and towns as its customers. Its founders are IIT and ISB alumni with deep experience in the fin-tech industry, from earlier working with organizations like Axis Bank, Aditya Birla Group, Fractal Analytics, and Housing.com. It has raised funds of Rs. 100 Crore from finance industry stalwarts and is growing by leaps and bounds.
- Ensuring ease of data availability, with relevant dimensions, using Business Intelligence tools.
- Providing strong reporting and analytical information support to the management team.
- Transforming raw data into essential metrics based on the needs of relevant stakeholders.
- Performing data analysis for generating reports on a periodic basis.
- Converting essential data into easy-to-reference visuals using Data Visualization tools (PowerBI, Metabase).
- Providing recommendations to update current MIS to improve reporting efficiency and consistency.
- Bringing fresh ideas to the table and keenly observing trends in the analytics and financial services industry.
What you need to have:
- Graduate (B.Tech / BE) or postgraduate (MBA / PGDM), with work experience of 3+ years.
- Experience in Reporting, Data Management (SQL, MongoDB), Visualization (PowerBI, Metabase, Data studio)
- Work experience in financial services (Indian banks'/NBFCs' in-house analytics units or fintech/analytics start-ups) would be a plus.
- Skilled at writing & optimizing large complicated SQL queries & MongoDB scripts.
- Strong knowledge of Banking/ Financial Services domain
- Experience with some of the modern relational databases
- Self-driven, with the ability to work on multiple projects of a different nature
- Liaise with cross-functional teams to resolve data issues and build strong reports
- Data Steward:
The Data Steward will collaborate and work closely with the group's software engineering and business divisions. The Data Steward has overall accountability for the group's/division's data and reporting posture by responsibly managing data assets, data lineage, and data access, and by supporting sound data analysis. This role requires focus on data strategy, execution, and support for projects, programs, application enhancements, and production data fixes. The role makes well-thought-out decisions on complex or ambiguous data issues and establishes the data stewardship and information management strategy and direction for the group, and effectively communicates with individuals at various levels of the technical and business communities. This individual will become part of the corporate Data Quality and Data Management/entity resolution team supporting various systems across the board.
Primary Responsibilities:
- Responsible for data quality and data accuracy across all group/division delivery initiatives.
- Responsible for data analysis, data profiling, data modeling, and data mapping capabilities.
- Responsible for reviewing and governing data queries and DML.
- Accountable for the assessment, delivery, quality, accuracy, and tracking of any production data fixes.
- Accountable for the performance, quality, and alignment to requirements for all data query design and development.
- Responsible for defining standards and best practices for data analysis, modeling, and queries.
- Responsible for understanding end-to-end data flows and identifying data dependencies in support of delivery, release, and change management.
- Responsible for the development and maintenance of an enterprise data dictionary that is aligned to data assets and the business glossary for the group.
- Responsible for the definition and maintenance of the group's data landscape, including overlays with the technology landscape, end-to-end data flows/transformations, and data lineage.
- Responsible for rationalizing the group's reporting posture through the definition and maintenance of a reporting strategy and roadmap.
- Partners with the data governance team to ensure data solutions adhere to the organization’s data principles and guidelines.
- Owns group's data assets including reports, data warehouse, etc.
- Understand customer business use cases and be able to translate them to technical specifications and vision on how to implement a solution.
- Accountable for defining the performance tuning needs for all group data assets and managing the implementation of those requirements within the context of group initiatives as well as steady-state production.
- Partners with others in test data management and masking strategies and the creation of a reusable test data repository.
- Responsible for solving data-related issues and communicating resolutions with other solution domains.
- Actively and consistently support all efforts to simplify and enhance the Clinical Trial Prediction use cases.
- Apply knowledge in analytic and statistical algorithms to help customers explore methods to improve their business.
- Contribute toward analytical research projects through all stages including concept formulation, determination of appropriate statistical methodology, data manipulation, research evaluation, and final research report.
- Visualize and report data findings creatively in a variety of visual formats that appropriately provide insight to the stakeholders.
- Achieve defined project goals within customer deadlines; proactively communicate status and escalate issues as needed.
Additional Responsibilities:
- Strong understanding of the Software Development Life Cycle (SDLC) with Agile Methodologies
- Knowledge and understanding of industry-standard/best practices requirements gathering methodologies.
- Knowledge and understanding of Information Technology systems and software development.
- Experience with data modeling and test data management tools.
- Experience in data integration projects
- Good problem-solving & decision-making skills.
- Good communication skills within the team, site, and with the customer
Knowledge, Skills and Abilities
- Technical expertise in data architecture principles and design aspects of various DBMS and reporting concepts.
- Solid understanding of key DBMS platforms like SQL Server, Azure SQL
- Results-oriented, diligent, and works with a sense of urgency. Assertive, responsible for his/her own work (self-directed), have a strong affinity for defining work in deliverables, and be willing to commit to deadlines.
- Experience in MDM tools like MS DQ, SAS DM Studio, Tamr, Profisee, Reltio etc.
- Experience in Report and Dashboard development
- Statistical and Machine Learning models
- Python (sklearn, numpy, pandas, gensim)
- Nice to Have:
- 1yr of ETL experience
- Natural Language Processing
- Neural networks and Deep learning
- Experience in the Keras, TensorFlow, spaCy, NLTK and LightGBM Python libraries
Interaction : Frequently interacts with subordinate supervisors.
Education : Bachelor’s degree, preferably in Computer Science, B.E or other quantitative field related to the area of assignment. Professional certification related to the area of assignment may be required
Experience : 7 years of Pharmaceutical /Biotech/life sciences experience, 5 years of Clinical Trials experience and knowledge, Excellent Documentation, Communication, and Presentation Skills including PowerPoint
2. Should understand the importance, and the know-how, of taking a machine-learning-based solution to the consumer.
3. Hands-on experience with statistical, machine-learning tools and techniques
4. Good exposure to Deep learning libraries like Tensorflow, PyTorch.
5. Experience in implementing Deep Learning techniques, Computer Vision and NLP. The candidate should be able to develop the solution from scratch, with exposure to relevant GitHub codebases.
6. Should be able to read research papers and pick ideas to quickly reproduce research in the most comfortable Deep Learning library.
7. Should be strong in data structures and algorithms. Should be able to do code complexity analysis/optimization for smooth delivery to production.
8. Expert level coding experience in Python.
9. Technologies: Backend - Python (Programming Language)
10. Should have the ability to think about long-term solutions, modularity, and reusability of components.
11. Should be able to work in a collaborative way. Should be open to learning from peers as well as constantly bring new ideas to the table.
12. Self-driven. Open to peer criticism and feedback, and able to take it positively. Ready to be held accountable for the responsibilities undertaken.