8+ Data-flow analysis Jobs in Pune | Data-flow analysis Job openings in Pune
About the Role
We are looking for a Big Data Engineer with 2–5 years of experience in designing, building, and operating large-scale data processing systems, preferably on Google Cloud Platform
(GCP). This role is suited for engineers who understand modern data architectures and are comfortable working across multiple stages of the data lifecycle.
We do not expect expertise in every GCP data service. Instead, candidates should have
strong hands-on experience in at least one service from each core data area listed below
and the ability to learn and adapt to new tools quickly.
Key Responsibilities
● Design, develop, and maintain scalable data pipelines on GCP.
● Build batch and streaming data processing workflows using managed cloud services.
● Develop and maintain data transformation workflows using SQL and Python.
● Create and manage workflow orchestration using DAG-based schedulers.
● Collaborate with analytics, product, and engineering teams to deliver reliable datasets.
● Optimize data pipelines for performance, cost, and reliability.
● Ensure data quality, monitoring, and observability across pipelines.
● Participate in code reviews and contribute to data engineering best practices.
Core Experience Areas (At Least One From Each)
1. Data Warehousing & Analytics
● BigQuery
● Dataproc (Spark / Hadoop)
● Other cloud data warehouse or analytics platforms
2. Data Processing and Pipelines
● Dataflow (Apache Beam)
● Cloud Run Jobs / Cloud Run Services
● Apache Spark (batch or streaming)
● dbt for transformations
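The services above share a common batch-processing shape: map raw records to key/value pairs, group by key, then aggregate. As a conceptual sketch only (hypothetical data; a real job would use `beam.Map`/`GroupByKey` in Dataflow or the equivalent Spark operators), the pattern looks like this in plain Python:

```python
# Illustrative only: a tiny batch "pipeline" showing the map -> group -> aggregate
# shape shared by Dataflow (Apache Beam) and Spark jobs. Names and data are
# hypothetical stand-ins for real pipeline stages.
from itertools import groupby
from operator import itemgetter

def run_pipeline(events):
    # "Map" stage: extract (key, value) pairs from raw records.
    pairs = [(e["user"], e["bytes"]) for e in events]
    # "GroupByKey" stage: sort then group by key (a shuffle in a real engine).
    pairs.sort(key=itemgetter(0))
    grouped = groupby(pairs, key=itemgetter(0))
    # "Aggregate" stage: sum the values per key.
    return {user: sum(v for _, v in vals) for user, vals in grouped}

events = [
    {"user": "a", "bytes": 100},
    {"user": "b", "bytes": 50},
    {"user": "a", "bytes": 25},
]
print(run_pipeline(events))  # {'a': 125, 'b': 50}
```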
3. Databases & Storage
● Bigtable
● Cloud Storage
● Relational databases (PostgreSQL, MySQL, Cloud SQL)
● NoSQL databases
4. Data Preparation & Exploration
● SQL-based data analysis
● Python for data manipulation (Pandas, PySpark)
● Exploratory data analysis on large datasets
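As a minimal sketch of the SQL-based analysis mentioned above (using Python's built-in sqlite3; the table and rows are hypothetical, and on GCP the same aggregation would typically run as a BigQuery query):

```python
# SQL-based data analysis sketch using the standard library's sqlite3 module.
# Table name, columns, and data are hypothetical examples.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("east", 120.0), ("west", 80.0), ("east", 30.0)],
)

# Aggregate revenue per region, largest first.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM orders "
    "GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('east', 150.0), ('west', 80.0)]
conn.close()
```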
5. Workflow Orchestration & Scheduling
● Cloud Composer (Airflow)
● Cloud Scheduler
● Experience creating and maintaining DAGs in Python
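The core idea behind DAG-based schedulers like Cloud Composer (Airflow) can be sketched with the standard library's `graphlib`: tasks run in dependency order determined by a topological sort. The task names and edges below are hypothetical, not Airflow API calls:

```python
# Conceptual sketch of what a DAG-based scheduler (e.g., Cloud Composer /
# Airflow) does under the hood: order tasks by their dependencies.
# Requires Python 3.9+ for graphlib.
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on (hypothetical names).
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields tasks so that every dependency runs first.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'load', 'report']
```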
Required Skills & Qualifications
● 2–5 years of experience in data engineering or big data processing.
● Hands-on experience with Google Cloud Platform (preferred).
● Strong proficiency in Python and SQL.
● Understanding of distributed data processing concepts.
● Experience with CI/CD, Git, and production-grade data systems.
● Ability to work across ambiguous problem statements and evolving requirements.
AI & System Mindset
Experience working with AI-powered systems is a strong plus. Candidates should be
comfortable integrating AI agents, third-party APIs, and automation workflows into
applications, and should demonstrate curiosity and adaptability toward emerging AI technologies.
Good to Have
● Experience with streaming data (Pub/Sub, Kafka).
● Cost optimization experience on cloud data platforms.
● Exposure to AI/ML pipelines or feature engineering.
● Experience working in product-driven or startup environments.
Education
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
Data Analyst
We are seeking a motivated Data Analyst to support business operations by analyzing data, preparing reports, and delivering meaningful insights. The ideal candidate should be comfortable working with data, identifying patterns, and presenting findings in a clear and actionable way.
Key Responsibilities:
- Collect, clean, and organize data from internal and external sources
- Analyze large datasets to identify trends, patterns, and opportunities
- Prepare regular and ad-hoc reports for business stakeholders
- Create dashboards and visualizations using tools like Power BI or Tableau
- Work closely with cross-functional teams to understand data requirements
- Ensure data accuracy, consistency, and quality across reports
- Document data processes and analysis methods
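The "collect, clean, and organize" step above can be sketched with only the standard library (field names and records are hypothetical; in practice this is often done with pandas or inside Power BI / Tableau data prep):

```python
# Minimal data-cleaning sketch: normalize fields, drop incomplete records,
# and de-duplicate. All field names and sample rows are hypothetical.
def clean_rows(rows):
    seen, out = set(), []
    for row in rows:
        name = row.get("name", "").strip().title()
        city = row.get("city", "").strip().title()
        if not name:            # drop records missing the key field
            continue
        key = (name, city)
        if key in seen:         # de-duplicate on (name, city)
            continue
        seen.add(key)
        out.append({"name": name, "city": city})
    return out

raw = [
    {"name": " alice ", "city": "pune"},
    {"name": "Alice", "city": "Pune"},   # duplicate after normalization
    {"name": "", "city": "Mumbai"},      # missing name -> dropped
    {"name": "bob", "city": "delhi"},
]
print(clean_rows(raw))
```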
AI Agent Builder – Internal Functions and Data Platform Development Tools
About the Role:
We are seeking a forward-thinking AI Agent Builder to lead the design, development, deployment, and usage reporting of Microsoft Copilot and other AI-powered agents across our data platform development tools and internal business functions. This role will be instrumental in driving automation, improving onboarding, and enhancing operational efficiency through intelligent, context-aware assistants.
This role is central to our GenAI transformation strategy. You will help shape the future of how our teams interact with data, reduce administrative burden, and unlock new efficiencies across the organization. Your work will directly contribute to our “Art of the Possible” initiative—demonstrating tangible business value through AI.
You Will:
• Copilot Agent Development: Use Microsoft Copilot Studio and Agent Builder to create, test, and deploy AI agents that automate workflows, answer queries, and support internal teams.
• Data Engineering Enablement: Build agents that assist with data connector scaffolding, pipeline generation, and onboarding support for engineers.
• Knowledge Base Integration: Curate and integrate documentation (e.g., ERDs, connector specs) into Copilot-accessible repositories (SharePoint, Confluence) to support contextual AI responses.
• Prompt Engineering: Design reusable prompt templates and conversational flows to streamline repeated tasks and improve agent usability.
• Tool Evaluation & Integration: Assess and integrate complementary AI tools (e.g., GitLab Duo, Databricks AI, Notebook LM) to extend Copilot capabilities.
• Cross-Functional Collaboration: Partner with product, delivery, PMO, and security teams to identify high-value use cases and scale successful agent implementations.
• Governance & Monitoring: Ensure agents align with Responsible AI principles, monitor performance, and iterate based on feedback and evolving business needs.
• Adoption and Usage Reporting: Use Microsoft Viva Insights and other tools to report on user adoption, usage and business value delivered.
What We're Looking For:
• Proven experience with Microsoft 365 Copilot, Copilot Studio, or similar AI platforms (e.g., ChatGPT, Claude).
• Strong understanding of data engineering workflows, tools (e.g., Git, Databricks, Unity Catalog), and documentation practices.
• Familiarity with SharePoint, Confluence, and Microsoft Graph connectors.
• Experience in prompt engineering and conversational UX design.
• Ability to translate business needs into scalable AI solutions.
• Excellent communication and collaboration skills across technical and non-technical audiences.
Bonus Points:
• Experience with GitLab Duo, Notebook LM, or other AI developer tools.
• Background in enterprise data platforms, ETL pipelines, or internal business systems.
• Exposure to AI governance, security, and compliance frameworks.
• Prior work in a regulated industry (e.g., healthcare, finance) is a plus.
🚨 Priority Requirement – Salesforce Solution Architect
📍 Location: Bengaluru | Hyderabad | Mumbai | Pune | Mohali | Delhi
🕒 Shift: Noon | 🏢 Work from Office
✨ Key Focus: Highly skilled candidates with excellent communication skills
🔑 Role: Salesforce Solution Architect (Classic ➝ Lightning Migration)
Responsibilities:
- 🚀 Lead end-to-end Classic → Lightning migration
- 🗂️ Redesign data model (optimize custom → standard objects)
- ⚡ Configure Lightning features (Dynamic Forms, Flows, Omni-Channel, Pages)
- 🛠️ Recommend & implement AppExchange solutions
- 🤝 Collaborate with dev, admin & QA for releases
- 🔒 Ensure security, profiles & permissions compliance
- 🎯 Act as trusted advisor for Salesforce roadmap
Qualifications:
- 🏆 7+ yrs Salesforce exp. | 3+ yrs Lightning implementations
- ☁️ Expertise: Service Cloud, Marketing Cloud, Case Mgmt, Digital Engagement
- 🔄 Proven Classic → Lightning migration exp.
- ⚙️ Strong low-code/no-code (Flows, Dynamic Actions)
- 🎓 Salesforce Certified Architect (preferred)
- 💬 Excellent communication & stakeholder management
We have an urgent requirement for the post of IBM MDM (AE) profile.
Notice period: should be 15–30 days
Shift: Night
Responsibilities
- Design conversational chatbots using state-of-the-art technology
- Collaborate with the senior team (System Architect and Senior Programmer) on application architecture design and decision-making
- Clean / analyze data coming from bot conversation
- Define recurring questions that can be handled automatically / defined by the client
- Improve dialog flow to handle those recurring questions / develop new actions
- Help with the handling of multilingual support
- Develop internal testing tools
Qualifications:
- Advanced degree in computer science, math, statistics, or a related discipline (master's degree required)
- Extensive data modeling and data architecture skills
- Programming experience in Python and R
- Background in machine learning frameworks such as TensorFlow or Keras
- Knowledge of Hadoop or other distributed computing systems
- Experience working in an Agile environment
- Advanced math skills (important): linear algebra, discrete math, differential equations (ODEs and numerical methods), theory of statistics, numerical analysis (numerical linear algebra and quadrature), abstract algebra, number theory, real analysis, complex analysis, and intermediate analysis (point-set topology)
- Strong written and verbal communication skills
- Hands-on experience with NLP and NLG
- Experience with advanced statistical techniques and concepts (GLM/regression, random forests, boosting, trees, text mining) and their practical application
• Gain in-depth knowledge of current Operations processes, inefficiencies, issues, and risks.
• Pro-actively engage, manage, and build strong relationships with both Operations business and technology stakeholders.
• Knowledge of Agile.
• Ability to drive multiple agendas and effectively manage priorities
• Strong communication and presentation skills, with senior stakeholders, with an excellent
standard of English (written and spoken)
• Work with the business to understand their requirements, define new processes and
workflows.
• Be an active, vocal team member of workshops, discussions, and working groups.
• Document requirements in a clear, unambiguous manner. Ensure they are understood by all parties and sign-off is achieved.
• Propose relevant solutions and alternatives to meet business needs. Translate into clear,
automated and globally standardized future state models
• Create functional and design specifications, ensuring all stakeholders, globally, are in
agreement and provide sign off
• Prepare data flow diagrams, manual process-flow DFDs in Visio, and UML diagrams.
• Ensure the overall quality of business requirements, functional specifications, and supporting
documents meet the bank's standards
• Ensure development and testing teams understand the requirements and future state
processes and that they are fully engaged early on in the project.
• Provide ongoing support to the technology and testing teams as required, e.g. reviewing test cases, closing out open issues/questions, and managing the change request process.
• Assist the business with UAT, including communications, planning, preparation, and test cases
• Proficiency in Access, PowerPoint, and MS Word
• Identify risks and issues early. Communicate them to the relevant people in a timely manner.
Propose and plan mitigating steps.
• Work closely with Project Managers to ensure stakeholders are fully engaged,
communications are regular and honest and expectations are well managed.