9+ Data-flow analysis Jobs in Mumbai | Data-flow analysis Job openings in Mumbai
ROLES AND RESPONSIBILITIES:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud platforms (AWS/Azure), integrated with Snowflake/Databricks ecosystems.
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
IDEAL CANDIDATE:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
PREFERRED:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
We are seeking a motivated Data Analyst to support business operations by analyzing data, preparing reports, and delivering meaningful insights. The ideal candidate should be comfortable working with data, identifying patterns, and presenting findings in a clear and actionable way.
Key Responsibilities:
- Collect, clean, and organize data from internal and external sources
- Analyze large datasets to identify trends, patterns, and opportunities
- Prepare regular and ad-hoc reports for business stakeholders
- Create dashboards and visualizations using tools like Power BI or Tableau
- Work closely with cross-functional teams to understand data requirements
- Ensure data accuracy, consistency, and quality across reports
- Document data processes and analysis methods
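Day-to-day work like the responsibilities above is commonly done in pandas. A minimal sketch of cleaning a dataset and summarizing a trend for a report; the column names and sample values are hypothetical, not from any posting:

```python
import pandas as pd

# Hypothetical sales data; column names and values are illustrative only.
raw = pd.DataFrame({
    "order_date": ["2024-01-05", "2024-01-05", "2024-02-10", None],
    "region": ["West", "West", "East", "East"],
    "revenue": ["100", "100", "250", "300"],
})

# Clean: drop duplicate rows, parse types, remove rows missing a date.
df = (
    raw.drop_duplicates()
       .assign(order_date=lambda d: pd.to_datetime(d["order_date"]),
               revenue=lambda d: pd.to_numeric(d["revenue"]))
       .dropna(subset=["order_date"])
)

# Summarize: monthly revenue by region, ready for a dashboard or report.
monthly = (
    df.groupby([df["order_date"].dt.to_period("M"), "region"])["revenue"]
      .sum()
      .reset_index()
)
print(monthly)
```

The same clean-then-aggregate shape feeds directly into Power BI or Tableau extracts mentioned above.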
AI Agent Builder – Internal Functions and Data Platform Development Tools
About the Role:
We are seeking a forward-thinking AI Agent Builder to lead the design, development, deployment, and usage reporting of Microsoft Copilot and other AI-powered agents across our data platform development tools and internal business functions. This role will be instrumental in driving automation, improving onboarding, and enhancing operational efficiency through intelligent, context-aware assistants.
This role is central to our GenAI transformation strategy. You will help shape the future of how our teams interact with data, reduce administrative burden, and unlock new efficiencies across the organization. Your work will directly contribute to our “Art of the Possible” initiative—demonstrating tangible business value through AI.
You Will:
• Copilot Agent Development: Use Microsoft Copilot Studio and Agent Builder to create, test, and deploy AI agents that automate workflows, answer queries, and support internal teams.
• Data Engineering Enablement: Build agents that assist with data connector scaffolding, pipeline generation, and onboarding support for engineers.
• Knowledge Base Integration: Curate and integrate documentation (e.g., ERDs, connector specs) into Copilot-accessible repositories (SharePoint, Confluence) to support contextual AI responses.
• Prompt Engineering: Design reusable prompt templates and conversational flows to streamline repeated tasks and improve agent usability.
• Tool Evaluation & Integration: Assess and integrate complementary AI tools (e.g., GitLab Duo, Databricks AI, NotebookLM) to extend Copilot capabilities.
• Cross-Functional Collaboration: Partner with product, delivery, PMO, and security teams to identify high-value use cases and scale successful agent implementations.
• Governance & Monitoring: Ensure agents align with Responsible AI principles, monitor performance, and iterate based on feedback and evolving business needs.
• Adoption and Usage Reporting: Use Microsoft Viva Insights and other tools to report on user adoption, usage, and business value delivered.
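The prompt-engineering responsibility above (reusable templates and conversational flows) can be sketched in plain Python. The template text, field names, and helper function are illustrative assumptions, not a Copilot Studio API:

```python
# A reusable prompt template for a hypothetical data-onboarding assistant.
# Template wording and variables are assumptions for illustration; they are
# not tied to any specific Copilot Studio feature.
from string import Template

CONNECTOR_HELP = Template(
    "You are an onboarding assistant for data engineers.\n"
    "Using the connector spec for $connector, explain how to\n"
    "configure a pipeline that lands data in $target_table.\n"
    "Cite the relevant section of the spec in your answer."
)

def build_prompt(connector: str, target_table: str) -> str:
    """Fill the template so the same flow works for any connector."""
    return CONNECTOR_HELP.substitute(
        connector=connector, target_table=target_table
    )

prompt = build_prompt("Salesforce", "raw.sf_accounts")
print(prompt)
```

Keeping the template separate from the fill-in logic is what makes a prompt "reusable": one reviewed template can back many agent flows.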
What We're Looking For:
• Proven experience with Microsoft 365 Copilot, Copilot Studio, or similar AI platforms (e.g., ChatGPT, Claude).
• Strong understanding of data engineering workflows, tools (e.g., Git, Databricks, Unity Catalog), and documentation practices.
• Familiarity with SharePoint, Confluence, and Microsoft Graph connectors.
• Experience in prompt engineering and conversational UX design.
• Ability to translate business needs into scalable AI solutions.
• Excellent communication and collaboration skills across technical and non-technical teams.
Bonus Points:
• Experience with GitLab Duo, NotebookLM, or other AI developer tools.
• Background in enterprise data platforms, ETL pipelines, or internal business systems.
• Exposure to AI governance, security, and compliance frameworks.
• Prior work in a regulated industry (e.g., healthcare, finance) is a plus.
Key Responsibilities:
● Client Engagement: Serve as the primary point of contact for assigned clients, understanding their unique operation processes and requirements. Build and maintain strong relationships to facilitate successful implementations.
● Project Management: Lead the end-to-end implementation, ensuring projects are delivered on time, within scope, and within budget. Coordinate with cross-functional teams to align resources and objectives.
● Process Analysis and Improvement: Evaluate clients' existing operation workflows, identify inefficiencies, and recommend optimized processes leveraging the platform. Utilize process mapping and data analysis to drive continuous improvement.
● Data Analysis: Analyze substantial datasets to ensure accurate configuration and integration. Employ statistical tools and SQL-based queries to interpret data and provide actionable insights.
● Problem Solving: Break down complex problems into manageable components, developing effective solutions in collaboration with clients and internal teams.
● Process Excellence: Advocate for and implement best practices in process management, utilizing methodologies such as Lean Six Sigma to enhance operational efficiency.
● Customer Excellence: Ensure a superior customer experience by proactively addressing client needs, providing training and support, and promptly resolving any issues that arise.
Qualifications:
● Minimum of 3 years of experience in project management, preferably in financial services, software implementation, consulting or analytics.
● Strong analytical skills with experience in data analysis, SQL querying, and handling large datasets.
● Excellent communication and interpersonal skills, with the ability to manage client relationships effectively.
● Demonstrated ability to lead cross-functional teams and manage multiple projects concurrently.
● Proven expertise in financial operation processes and related software solutions is a plus.
● Proficiency in developing business intelligence solutions or with low-code tools is a plus.
Why Join ?
● Opportunity to work with a cutting-edge financial technology company.
● Collaborative and innovative work environment.
● Competitive compensation and benefits package.
● Professional development and growth opportunities.
- Must have 3+ years of project/program management experience in Financial Services/Banking/NBFC/Fintech companies only.
- Hands-on proficiency in data analysis and SQL querying, with ability to work on large datasets
- Ability to lead end-to-end implementation projects and manage cross-functional teams effectively.
- Experience in process analysis, optimization, and mapping for operational efficiency.
- Strong client-facing communication and stakeholder management capabilities.
- Good expertise in financial operations processes and workflows with proven implementation experience.
🚀 We’re Hiring: Python Developer – Quant Strategies & Backtesting | Mumbai (Goregaon East)
Are you a skilled Python Developer passionate about financial markets and quantitative trading?
We’re looking for someone to join our growing Quant Research & Algo Trading team, where you’ll work on:
🔹 Developing & optimizing trading strategies in Python
🔹 Building backtesting frameworks across multiple asset classes
🔹 Processing and analyzing large market datasets
🔹 Collaborating with quant researchers & traders on real-world strategies
What we’re looking for:
✔️ 3+ years of experience in Python development (preferably in fintech/trading/quant domains)
✔️ Strong knowledge of Pandas, NumPy, SciPy, SQL
✔️ Experience in backtesting, data handling & performance optimization
✔️ Familiarity with financial markets is a big plus
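Backtesting work of the kind described above is usually vectorized with pandas/NumPy. A minimal sketch of a hypothetical moving-average crossover backtest; the prices are synthetic and the window lengths are illustrative, not a real strategy:

```python
import numpy as np
import pandas as pd

# Synthetic daily prices; a real backtest would load market data instead.
rng = np.random.default_rng(seed=0)
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0.0005, 0.01, 500))))

# Hypothetical moving-average crossover: long when the fast MA is above
# the slow MA; shift(1) enters the position on the next bar (no lookahead).
fast = prices.rolling(20).mean()
slow = prices.rolling(50).mean()
position = (fast > slow).astype(int).shift(1).fillna(0)

# Daily strategy returns and a simple performance summary.
returns = prices.pct_change().fillna(0)
strat = position * returns
equity = (1 + strat).cumprod()
sharpe = np.sqrt(252) * strat.mean() / strat.std()
print(f"final equity: {equity.iloc[-1]:.3f}, sharpe: {sharpe:.2f}")
```

The shift-before-multiply step is the part most often gotten wrong: without it the backtest trades on information it would not have had at the time.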
📍 Location: Goregaon East, Mumbai
💼 Competitive package + exposure to cutting-edge quant strategies
🚨 Priority Requirement – Salesforce Solution Architect
📍 Location: Bengaluru | Hyderabad | Mumbai | Pune | Mohali | Delhi
🕒 Shift: Noon | 🏢 Work from Office
✨ Key Focus: Highly skilled candidates with excellent communication skills
🔑 Role: Salesforce Solution Architect (Classic ➝ Lightning Migration)
Responsibilities:
- 🚀 Lead end-to-end Classic → Lightning migration
- 🗂️ Redesign data model (optimize custom → standard objects)
- ⚡ Configure Lightning features (Dynamic Forms, Flows, Omni-Channel, Pages)
- 🛠️ Recommend & implement AppExchange solutions
- 🤝 Collaborate with dev, admin & QA for releases
- 🔒 Ensure security, profiles & permissions compliance
- 🎯 Act as trusted advisor for Salesforce roadmap
Qualifications:
- 🏆 7+ yrs Salesforce exp. | 3+ yrs Lightning implementations
- ☁️ Expertise: Service Cloud, Marketing Cloud, Case Mgmt, Digital Engagement
- 🔄 Proven Classic → Lightning migration exp.
- ⚙️ Strong low-code/no-code (Flows, Dynamic Actions)
- 🎓 Salesforce Certified Architect (preferred)
- 💬 Excellent communication & stakeholder management
Key Responsibilities:
● Design, develop, and maintain scalable web applications using .NET Core, .NET Framework, C#, and related technologies.
● Participate in all phases of the SDLC, including requirements gathering, architecture design, coding, testing, deployment, and support.
● Build and integrate RESTful APIs, and work with SQL Server, Entity Framework, and modern front-end technologies such as Angular, React, and JavaScript.
● Conduct thorough code reviews, write unit tests, and ensure adherence to coding standards and best practices.
● Lead or support .NET Framework to .NET Core migration initiatives, ensuring minimal disruption and optimal performance.
● Implement and manage CI/CD pipelines using tools like Azure DevOps, Jenkins, or GitLab CI/CD.
● Containerize applications using Docker and deploy/manage them on orchestration platforms like Kubernetes or GKE.
● Lead and execute database migration projects, particularly transitioning from SQL Server to PostgreSQL.
● Manage and optimize Cloud SQL for PostgreSQL, including configuration, tuning, and ongoing maintenance.
● Leverage Google Cloud Platform (GCP) services such as GKE, Cloud SQL, Cloud Run, and Dataflow to build and maintain cloud-native solutions.
● Handle schema conversion and data transformation tasks as part of migration and modernization efforts.
Required Skills & Experience:
● 5+ years of hands-on experience with C#, .NET Core, and .NET Framework.
● Proven experience in application modernization and cloud-native development.
● Strong knowledge of containerization (Docker) and orchestration tools like Kubernetes/GKE.
● Expertise in implementing and managing CI/CD pipelines.
● Solid understanding of relational databases and experience in SQL Server to PostgreSQL migrations.
● Familiarity with cloud infrastructure, especially GCP services relevant to application hosting and data processing.
● Excellent problem-solving, communication, and collaboration skills.
We have an urgent requirement for an IBM MDM (AE) profile.
Notice period: should be 15-30 days
Shift: Night
Responsibilities
- Design conversational chatbots using state-of-the-art technology
- Collaborate with the senior team (System Architect and Senior Programmer) on application architecture design and decision-making
- Clean / analyze data coming from bot conversations
- Define recurring questions that can be handled automatically / defined by the client
- Improve dialog flow to handle those recurring questions / develop new actions
- Help with the handling of multilingual support
- Develop internal testing tools