29+ PowerBI Jobs in Pune | PowerBI Job openings in Pune
We are seeking a motivated Data Analyst to support business operations by analyzing data, preparing reports, and delivering meaningful insights. The ideal candidate should be comfortable working with data, identifying patterns, and presenting findings in a clear and actionable way.
Key Responsibilities:
- Collect, clean, and organize data from internal and external sources
- Analyze large datasets to identify trends, patterns, and opportunities
- Prepare regular and ad-hoc reports for business stakeholders
- Create dashboards and visualizations using tools like Power BI or Tableau
- Work closely with cross-functional teams to understand data requirements
- Ensure data accuracy, consistency, and quality across reports
- Document data processes and analysis methods
Requirements:
- Must have strong SQL skills (queries, optimization, procedures, triggers); an illustrative SQL sketch follows this list
- Must have advanced Excel skills
- Should have 3+ years of relevant experience
- Should have reporting and dashboard creation experience
- Should have database development & maintenance experience
- Must have strong communication skills for client interactions
- Should have the ability to work independently
- Willingness to work from client locations
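For illustration, a minimal sketch of the kind of SQL this role calls for: a grouped aggregation combined with a window function, of the sort used for trend reporting. The table and column names (sales, order_date, region, amount) are hypothetical, not from the posting.

```python
# Illustrative only: monthly sales totals plus a per-region average,
# using a window function over an aggregate.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (order_date TEXT, region TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('2024-01-15', 'West', 120.0),
        ('2024-01-20', 'West',  80.0),
        ('2024-02-05', 'East', 200.0);
""")

query = """
SELECT strftime('%Y-%m', order_date)               AS month,
       region,
       SUM(amount)                                 AS total_sales,
       AVG(SUM(amount)) OVER (PARTITION BY region) AS avg_monthly_sales_by_region
FROM sales
GROUP BY month, region
ORDER BY month, region;
"""
for row in conn.execute(query):
    print(row)
```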
About the company:
Inteliment is a niche business analytics company with almost two decades' proven track record of partnering with hundreds of Fortune 500 global companies. Inteliment operates its ISO-certified development centre in Pune, India, and has business operations in multiple countries through subsidiaries in Singapore and Europe, with headquarters in India.
About the role:
As a Technical Project Manager, you will lead the planning, execution, and delivery of complex technical projects while ensuring alignment with business objectives and timelines. You will act as a bridge between technical teams and stakeholders, managing resources, risks, and communications to deliver high-quality solutions. This role demands strong leadership, project management expertise, and technical acumen to drive project success in a dynamic and collaborative environment.
Qualifications:
- Education Background: Any ME / M Tech / BE / B Tech
Key Competencies:
Technical Skills
1. Data & BI Technologies:
- Proficiency in SQL & PL/SQL for database querying and optimization.
- Understanding of data warehousing concepts, dimensional modeling, and data lake/lakehouse architectures.
- Experience with BI tools such as Power BI, Tableau, Qlik Sense/View.
- Familiarity with traditional platforms like Oracle, Informatica, SAP BO, BODS, BW.
2. Cloud & Data Engineering:
- Strong knowledge of AWS (EC2, S3, Lambda, Glue, Redshift), Azure (Data Factory, Synapse, Databricks, ADLS), Snowflake (warehouse architecture, performance tuning), and Databricks (Delta Lake, Spark).
- Experience with cloud-based ETL/ELT pipelines, data ingestion, orchestration, and workflow automation.
3. Programming:
- Hands-on experience in Python or similar scripting languages for data processing and automation.
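Since the posting calls out scripting for data processing and automation, here is a minimal, hedged sketch of that kind of task in Python with pandas; the file and column names are made up for illustration.

```python
# Hypothetical example: clean a daily extract and write a monthly summary.
import pandas as pd

df = pd.read_csv("daily_orders.csv", parse_dates=["order_date"])
df = df.dropna(subset=["customer_id"]).drop_duplicates()

summary = (
    df.groupby(df["order_date"].dt.to_period("M"))["amount"]
      .agg(["count", "sum", "mean"])   # orders, revenue, average order value
)
summary.to_csv("monthly_order_summary.csv")
```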
Soft Skills
- Strong leadership and team management skills.
- Excellent verbal and written communication for stakeholder alignment.
- Structured problem-solving and decision-making capability.
- Ability to manage ambiguity and handle multiple priorities.
Tools & Platforms
- Cloud: AWS, Azure
- Data Platforms: Snowflake, Databricks
- BI Tools: Power BI, Tableau, Qlik
- Data Management: Oracle, Informatica, SAP BO
- Project Tools: JIRA, MS Project, Confluence
Key Responsibilities:
- End-to-End Project Management: Lead the team through the full project lifecycle, delivering techno-functional solutions.
- Methodology Expertise: Apply Agile, PMP, and other frameworks to ensure effective project execution and resource management.
- Technology Integration: Oversee technology integration and ensure alignment with business goals.
- Stakeholder & Conflict Management: Manage relationships with customers, partners, and vendors, addressing expectations and conflicts proactively.
- Technical Guidance: Provide expertise in software design, architecture, and ensure project feasibility.
- Change Management: Analyse new requirements/change requests, ensuring alignment with project goals.
- Effort & Cost Estimation: Estimate project efforts and costs and identify potential risks early.
- Risk Mitigation: Proactively identify risks and develop mitigation strategies, escalating issues in advance.
- Hands-On Contribution: Participate in coding, code reviews, testing, and documentation as needed.
- Project Planning & Monitoring: Develop detailed project plans, track progress, and monitor task dependencies.
- Scope Management: Manage project scope, deliverables, and exclusions, ensuring technical feasibility.
- Effective Communication: Communicate with stakeholders to ensure agreement on scope, timelines, and objectives.
- Reporting: Provide status and RAG reports, proactively addressing risks and issues.
- Change Control: Manage changes in project scope, schedule, and costs using appropriate verification techniques.
- Performance Measurement: Measure project performance with tools and techniques to ensure progress.
- Operational Process Management: Oversee operational tasks like timesheet approvals, leave, appraisals, and invoicing.
About the company:
Inteliment is a niche business analytics company with almost two decades' proven track record of partnering with hundreds of Fortune 500 global companies. Inteliment operates its ISO-certified development centre in Pune, India, and has business operations in multiple countries through subsidiaries in Singapore and Europe, with headquarters in India.
About the role:
As a Power BI Developer, you will work closely with business analysts, data engineers, and key stakeholders to transform complex datasets into actionable insights. Your expertise will be pivotal in designing and delivering visually engaging reports, dashboards, and data-driven stories that empower informed decision-making across the organization. By translating raw data into meaningful visuals, you will play a critical role in driving strategic initiatives and fostering a culture of data-driven excellence.
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- Certifications in a related field will be an added advantage
Key Competencies:
- Technical Skills: Proficiency in Power BI, DAX, Power Query, SQL, and data visualization best practices.
- Additional Tools: Familiarity with Azure Data Factory, Power Automate, and other components of the Power Platform is advantageous.
- Soft Skills: Strong analytical thinking, problem-solving, and communication skills for interacting with technical and non-technical audiences.
- Additional skills: Domain understanding is a plus
Key Responsibilities:
1. Data Integration & Modelling
- Extract, transform, and load (ETL) data from various sources (SQL, Excel, APIs, etc.).
- Design and develop efficient data models to support reporting needs.
- Ensure data integrity and optimize performance through best practices.
2. Report Development
- Understand the business requirement and build reports to provide analytical insights
- Build visually appealing, interactive dashboards and reports in Power BI.
- Implement DAX (Data Analysis Expressions) for complex calculations and measures.
- Design user-friendly layouts that align with stakeholder requirements.
3. Collaboration
- Work with stakeholders to gather business requirements and translate them into technical solutions.
- Collaborate with data engineers and analysts to ensure cohesive reporting strategies.
- Provide support and training for end-users to maximize adoption and usage of Power BI solutions
4. Performance Optimization
- Optimize dashboards and reports for better speed and responsiveness.
- Monitor and improve data refresh processes for real-time reporting.
5. Governance and Security
- Implement row-level security (RLS) and adhere to organizational data governance policies.
- Manage Power BI workspaces and permissions.
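As one concrete, hedged illustration of managing workspace permissions programmatically, the sketch below calls the Power BI REST API's add-group-user endpoint from Python. The bearer token, workspace id, and email are placeholders, and token acquisition (e.g., via MSAL or azure-identity) is assumed to happen elsewhere.

```python
import requests

ACCESS_TOKEN = "<azure-ad-bearer-token>"  # placeholder: acquire via MSAL/azure-identity
WORKSPACE_ID = "<workspace-guid>"         # placeholder

# Grant a user read-only (Viewer) access to a workspace.
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/users",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={
        "identifier": "analyst@example.com",   # placeholder UPN
        "principalType": "User",
        "groupUserAccessRight": "Viewer",
    },
)
resp.raise_for_status()
```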
6. Continuous Improvement
- Stay updated with Power BI features and industry trends.
- Proactively recommend enhancements to existing solutions
Strong Data Engineer profile
Mandatory (Experience 1): Must have 6 months+ of hands-on Data Engineering experience.
Mandatory (Experience 2): Must have end-to-end experience in building & maintaining ETL/ELT pipelines (not just BI/reporting).
Mandatory (Technical): Must have strong SQL capability
Preferred
Preferred (Experience): Worked on Call center data
Job Specific Criteria
CV Attachment is mandatory
Have you used Databricks or any notebook environment?
Have you worked on ETL/ELT workflow?
We have alternate Saturdays working. Are you comfortable working from home on the 1st and 4th Saturdays?
Role: Azure Fabric Data Engineer
Experience: 5–10 Years
Location: Pune/Bangalore
Employment Type: Full-Time
About the Role
We are looking for an experienced Azure Data Engineer with strong expertise in Microsoft Fabric and Power BI to build scalable data pipelines, Lakehouse architectures, and enterprise analytics solutions on the Azure cloud.
Key Responsibilities
- Design & build data pipelines using Microsoft Fabric (Pipelines, Dataflows Gen2, Notebooks).
- Develop and optimize Lakehouse / Data Lake / Delta Lake architectures (a minimal bronze-to-silver sketch follows this list).
- Build ETL/ELT workflows using Fabric, Azure Data Factory, or Synapse.
- Create and optimize Power BI datasets, data models, and DAX calculations.
- Implement semantic models, incremental refresh, and Direct Lake/DirectQuery.
- Work with Azure services: ADLS Gen2, Azure SQL, Synapse, Event Hub, Functions, Databricks.
- Build dimensional models (Star/Snowflake) and support BI teams.
- Ensure data governance & security using Purview, RBAC, and AAD.
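As a hedged illustration of the Lakehouse work above, here is a minimal bronze-to-silver Delta Lake step in PySpark, as it might appear in a Fabric or Databricks notebook. The paths, table names, and quality rules are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw (bronze) orders; the path is a placeholder.
bronze = spark.read.format("delta").load("Tables/bronze_orders")

# Apply basic conformance and data-quality rules to produce the silver layer.
silver = (
    bronze.dropDuplicates(["order_id"])
          .filter(F.col("amount") > 0)                      # drop invalid rows
          .withColumn("order_date", F.to_date("order_ts"))  # conform types
)

silver.write.format("delta").mode("overwrite").save("Tables/silver_orders")
```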
Required Skills
- Strong hands-on experience with Microsoft Fabric (Lakehouse, Pipelines, Dataflows, Notebooks).
- Expertise in Power BI (DAX, modeling, Dataflows, optimized datasets).
- Deep knowledge of Azure Data Engineering stack (ADF, ADLS, Synapse, SQL).
- Strong SQL, Python/PySpark skills.
- Experience in Delta Lake, Medallion architecture, and data quality frameworks.
Nice to Have
- Azure Certifications (DP-203, PL-300, Fabric Analytics Engineer).
- Experience with CI/CD (Azure DevOps/GitHub).
- Databricks experience (preferred).
Note: One technical round must be conducted face-to-face (F2F) at either the Pune or Bangalore office
Review Criteria
- Strong Senior Data Engineer profile
- 4+ years of hands-on Data Engineering experience
- Must have experience owning end-to-end data architecture and complex pipelines
- Must have advanced SQL capability (complex queries, large datasets, optimization)
- Must have strong Databricks hands-on experience
- Must be able to architect solutions, troubleshoot complex data issues, and work independently
- Must have Power BI integration experience
- CTC structure is 80% fixed and 20% variable
Preferred
- Worked on call center data and understands the nuances of data generated in call centers
- Experience implementing data governance, quality checks, or lineage frameworks
- Experience with orchestration tools (Airflow, ADF, Glue Workflows), Python, Delta Lake, Lakehouse architecture
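For the orchestration tools mentioned above, a minimal Airflow 2.x DAG sketch; the task logic and names are placeholders, not from the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source")   # placeholder task logic

def transform():
    print("clean and model")    # placeholder task logic

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task   # extract runs before transform
```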
Job Specific Criteria
- CV Attachment is mandatory
- Are you comfortable integrating with Power BI datasets?
- We have alternate Saturdays working. Are you comfortable working from home on the 1st and 4th Saturdays?
Role & Responsibilities
We are seeking a highly experienced Senior Data Engineer with strong architectural capability, excellent optimisation skills, and deep hands-on experience in modern data platforms. The ideal candidate will have advanced SQL skills, strong expertise in Databricks, and practical experience working across cloud environments such as AWS and Azure. This role requires end-to-end ownership of complex data engineering initiatives, including architecture design, data governance implementation, and performance optimisation. You will collaborate with cross-functional teams to build scalable, secure, and high-quality data solutions.
Key Responsibilities-
- Lead the design and implementation of scalable data architectures, pipelines, and integration frameworks.
- Develop, optimise, and maintain complex SQL queries, transformations, and Databricks-based data workflows.
- Architect and deliver high-performance ETL/ELT processes across cloud platforms.
- Implement and enforce data governance standards, including data quality, lineage, and access control.
- Partner with analytics, BI (Power BI), and business teams to enable reliable, governed, and high-value data delivery.
- Optimise large-scale data processing, ensuring efficiency, reliability, and cost-effectiveness.
- Monitor, troubleshoot, and continuously improve data pipelines and platform performance.
- Mentor junior engineers and contribute to engineering best practices, standards, and documentation.
Ideal Candidate
- Proven industry experience as a Senior Data Engineer, with ownership of high-complexity projects.
- Advanced SQL skills with experience handling large, complex datasets.
- Strong expertise with Databricks for data engineering workloads.
- Hands-on experience with major cloud platforms — AWS and Azure.
- Deep understanding of data architecture, data modelling, and optimisation techniques.
- Familiarity with BI and reporting environments such as Power BI.
- Strong analytical and problem-solving abilities with a focus on data quality and governance
- Proficiency in Python or another programming language is a plus.
PERKS, BENEFITS AND WORK CULTURE:
Our people define our passion and our audacious, incredibly rewarding achievements. The company is one of India’s most diversified non-banking financial companies and among Asia’s top 10 large workplaces. If you have the drive to get ahead, we can help you find an opportunity at any of our 500+ locations across India.
ROLES AND RESPONSIBILITIES:
We are looking for a Junior Data Engineer who will work under guidance to support data engineering tasks, perform basic coding, and actively learn modern data platforms and tools. The ideal candidate should have foundational SQL knowledge, basic exposure to Databricks. This role is designed for early-career professionals who are eager to grow into full data engineering responsibilities while contributing to data pipeline operations and analytical support.
Key Responsibilities-
- Support the development and maintenance of data pipelines and ETL/ELT workflows under mentorship.
- Write basic SQL queries, transformations, and assist with Databricks notebook tasks.
- Help troubleshoot data issues and contribute to ensuring pipeline reliability.
- Work with senior engineers and analysts to understand data requirements and deliver small tasks.
- Assist in maintaining documentation, data dictionaries, and process notes.
- Learn and apply data engineering best practices, coding standards, and cloud fundamentals.
- Support basic tasks related to Power BI data preparation or integrations as needed.
IDEAL CANDIDATE:
- Foundational SQL skills with the ability to write and understand basic queries.
- Basic exposure to Databricks, data transformation concepts, or similar data tools.
- Understanding of ETL/ELT concepts, data structures, and analytical workflows.
- Eagerness to learn modern data engineering tools, technologies, and best practices.
- Strong problem-solving attitude and willingness to work under guidance.
- Good communication and collaboration skills to work with senior engineers and analysts.
PERKS, BENEFITS AND WORK CULTURE:
Our people define our passion and our audacious, incredibly rewarding achievements. Bajaj Finance Limited is one of India’s most diversified non-banking financial companies and among Asia’s top 10 large workplaces. If you have the drive to get ahead, we can help you find an opportunity at any of our 500+ locations across India.
- Strong Senior Data Engineer profile
- Mandatory (Experience 1): Must have 4+ years of hands-on Data Engineering experience
- Mandatory (Experience 2): Must have experience owning end-to-end data architecture and complex pipelines
- Mandatory (Technical 1): Must have advanced SQL capability (complex queries, large datasets, optimization)
- Mandatory (Technical 2): Must have strong Databricks hands-on experience
- Mandatory (Role Requirement): Must be able to architect solutions, troubleshoot complex data issues, and work independently
- Mandatory (BI Requirement): Must have Power BI integration experience
- Mandatory (Note): Bajaj CTC has 80% fixed and 20% variable
ROLES AND RESPONSIBILITIES:
We are seeking a skilled Data Engineer who can work independently on data pipeline development, troubleshooting, and optimisation tasks. The ideal candidate will have strong SQL skills, hands-on experience with Databricks, and familiarity with cloud platforms such as AWS and Azure. You will be responsible for building and maintaining reliable data workflows, supporting analytical teams, and ensuring high-quality, secure, and accessible data across the organisation.
KEY RESPONSIBILITIES:
- Design, develop, and maintain scalable data pipelines and ETL/ELT workflows.
- Build, optimise, and troubleshoot SQL queries, transformations, and Databricks data processes.
- Work with large datasets to deliver efficient, reliable, and high-performing data solutions.
- Collaborate closely with analysts, data scientists, and business teams to support data requirements.
- Ensure data quality, availability, and security across systems and workflows.
- Monitor pipeline performance, diagnose issues, and implement improvements.
- Contribute to documentation, standards, and best practices for data engineering processes.
IDEAL CANDIDATE:
- Proven experience as a Data Engineer or in a similar data-focused role (3+ years).
- Strong SQL skills with experience writing and optimising complex queries.
- Hands-on experience with Databricks for data engineering tasks.
- Experience with cloud platforms such as AWS and Azure.
- Understanding of ETL/ELT concepts, data modelling, and pipeline orchestration.
- Familiarity with Power BI and data integration with BI tools.
- Strong analytical and troubleshooting skills, with the ability to work independently.
- Experience working end-to-end on data engineering workflows and solutions.
PERKS, BENEFITS AND WORK CULTURE:
Our people define our passion and our audacious, incredibly rewarding achievements. The company is one of India’s most diversified Non-banking financial companies, and among Asia’s top 10 Large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations we’re present in India.
About the Company:
Verinite is a global technology consulting and services company laser-focused on the banking & financial services sector, especially in cards, payments, lending, trade, and treasury.
They partner with banks, fintechs, payment processors, and other financial institutions to modernize their systems, improve operational resilience, and accelerate digital transformation. Their services include consulting, digital strategy, data, application modernization, quality engineering (testing), cloud & infrastructure, and application maintenance.
Skill – Authorization, Clearing and Settlement
1. Individual should have worked on schemes (Visa, Amex, Discover, RuPay & Mastercard) on either the authorization or clearing side.
2. Should be able to read scheme specifications and create business requirements/mapping for authorization and clearing.
3. Should have hands-on experience in implementing scheme-related changes.
4. Should be able to validate and certify the change post-development based on the mapping created.
5. Should be able to work with the dev team, explaining and guiding on a time-to-time basis.
6. Able to communicate with various teams & senior stakeholders.
7. Go-getter and great googler.
8. Schemes – VISA/MC/AMEX/JCB/CUP/Mercury – Discover and Diners, CBUAE, Jaywan (local scheme from UAE).
9. Experience on the issuing side is a plus (good to have).
Review Criteria
- Strong Lead – User Research & Analyst profile (behavioural/user/product/UX analytics)
- 10+ years of experience in Behavioral Data Analytics, User Research, or Product Insights, driving data-informed decision-making for B2C digital products (web and app).
- 6 months+ experience in analyzing user journeys, clickstream, and behavioral data using tools such as Google Analytics, Mixpanel, CleverTap, Firebase, or Amplitude.
- Experience in leading cross-functional user research and analytics initiatives in collaboration with Product, Design, Engineering, and Business teams to translate behavioral insights into actionable strategies.
- Strong expertise in A/B testing and experimentation, including hypothesis design, execution, statistical validation, and impact interpretation (a worked z-test sketch follows this list).
- Ability to identify behavioral patterns, funnel drop-offs, engagement trends, and user journey anomalies using large datasets and mixed-method analysis.
- Hands-on proficiency in SQL, Excel, and data visualization/storytelling tools such as Tableau, Power BI, or Looker for executive reporting and dashboard creation.
- Deep understanding of UX principles, customer journey mapping, and product experience design, with experience integrating qualitative and quantitative insights.
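As a sketch of the A/B-testing and statistical-validation skill referenced above: a two-proportion z-test on conversion counts in Python. The numbers are invented for illustration.

```python
from math import sqrt
from scipy.stats import norm

conv_a, n_a = 480, 10_000   # control: conversions, users (made-up data)
conv_b, n_b = 540, 10_000   # variant

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))

z = (p_b - p_a) / se
p_value = 2 * (1 - norm.cdf(abs(z)))                # two-sided test

print(f"lift={p_b - p_a:.4f}, z={z:.2f}, p={p_value:.4f}")
```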
Preferred
- Ability to build insightful dashboards and executive reports highlighting user engagement, retention, and behavioral metrics; familiarity with mixed-method research, AI-assisted insight tools (Dovetail, EnjoyHQ, Qualtrics, UserZoom), and mentoring junior researchers
Job Specific Criteria
- CV Attachment is mandatory
- We have alternate Saturdays working. Are you comfortable working from home on the 1st and 4th Saturdays?
Role & Responsibilities
Product Conceptualization & UX Strategy Development:
- Conceptualize customer experience strategies
- Collaborate with product managers to conceptualize new products & align UX with product roadmaps.
- Develop and implement UX strategies that align with business objectives.
- Stay up-to-date with industry trends and best practices in UX & UI for AI.
- Assist in defining product requirements and features.
- Use data analytics to inform product strategy and prioritize features.
- Ensure product alignment with customer needs and business goals.
- Develop platform blueprints that include a features and functionalities map, ecosystem map, and information architecture.
- Create wireframes, prototypes, and mock-ups using tools like Figma
- Conduct usability testing and iterate designs based on feedback
- Employ tools like XMind for brainstorming and mind mapping
Customer Journey Analysis:
- Understand and map out customer journeys and scenarios.
- Identify pain points and opportunities for improvement.
- Develop customer personas and empathy maps.
Cross-Functional Collaboration:
- Work closely with internal units such as UX Research, Design, UX Content, and UX QA to ensure seamless delivery of CX initiatives.
- Coordinate with development teams to ensure UX designs are implemented accurately.
Data Analytics and Tools:
- Utilize clickstream and analytics tools like Google Analytics, CleverTap, and Medallia to gather and analyse user data.
- Leverage data to drive decisions and optimize customer experiences.
- Strong background in data analytics, including proficiency in interpreting complex datasets to inform UX decisions.
Ideal Candidate
- Bachelor’s or Master’s degree in a relevant field (e.g., UX Design, Human-Computer Interaction, Computer Science, Marketing).
- 5+ years of experience in CX/UX roles, preferably in a B2C environment.
- Proficiency in analytics tools (Google Analytics, CleverTap, Medallia, Hotjar, etc).
- Strong understanding of wireframing and prototyping tools (Figma, XMind, etc).
- Excellent communication and collaboration skills.
- Proven experience in managing cross-functional teams and projects.
- Strong background in data analytics and data-driven decision-making.
- Expert understanding of user experience and user-centered design approaches
- Detail orientation, with the experience and will to continuously learn, adapt, and evolve
- Creating and measuring the success and impact of your CX designs
- Knowledge of testing tools like Maze, UsabilityHub, UserZoom would be a plus
- Experienced in designing responsive websites as well as mobile apps
- Understanding of iOS and Android design guidelines
- Passion for great customer-focused design, a purposeful aesthetic sense, and generating simple solutions to complex problems.
- Excellent communication skills to be able to present their work and ideas to the leadership team.
Knowledge & Experience:
- Providing technical leadership and guidance to teams in data and analytics engineering solutions and platforms
- Strong problem-solving skills and the ability to translate business requirements into actionable data science solutions.
- Excellent communication skills, with the ability to effectively convey complex ideas to technical and non-technical stakeholders.
- Strong team player with excellent interpersonal and collaboration skills.
- Ability to manage multiple projects simultaneously and deliver high-quality results within specified timelines.
- Proven ability to work collaboratively in a global, matrixed environment and engage effectively with global stakeholders across multiple business groups.
Relevant Experience:
- 12+ years of IT experience in delivering medium-to-large data engineering, and analytics solutions
- Min. 4 years of experience working with Azure Databricks, Azure Data Factory, Azure Data Lake, Azure SQL DW, Azure SQL, Power BI, SAC, and other BI, data visualization, and exploration tools
- Deep understanding of master data management & governance concepts and methodologies
- Experience in Data Modelling & Source System Analysis
- Familiarity with PySpark
- Mastery of SQL
- Experience with the Python programming language for data engineering purposes.
- Ability to conduct data profiling, cataloging, and mapping for technical design and construction of technical data flows.
Preferred but not required:
- Microsoft Certified: Azure Data Engineer Associate
- Experience in preparing data for Data Science and Machine Learning
- Knowledge of Jupyter Notebooks or Databricks Notebooks for Python development
- Power BI dataset development and DAX
- Power BI Report development
- Exposure to AI services in Azure and Agentic Analytics solutions
Role: Data Scientist (Python + R Expertise)
Exp: 8-12 Years
CTC: up to 30 LPA
Required Skills & Qualifications:
- 8–12 years of hands-on experience as a Data Scientist or in a similar analytical role.
- Strong expertise in Python and R for data analysis, modeling, and visualization.
- Proficiency in machine learning frameworks (scikit-learn, TensorFlow, PyTorch, caret, etc.).
- Strong understanding of statistical modeling, hypothesis testing, regression, and classification techniques.
- Experience with SQL and working with large-scale structured and unstructured data.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and deployment practices (Docker, MLflow).
- Excellent analytical, problem-solving, and communication skills.
Preferred Skills:
- Experience with NLP, time series forecasting, or deep learning projects.
- Exposure to data visualization tools (Tableau, Power BI, or R Shiny).
- Experience working in product or data-driven organizations.
- Knowledge of MLOps and model lifecycle management is a plus.
If interested, kindly share your updated resume on 82008 31681
CTC: 15 LPA to 21 LPA
Exp: 5 to 8 Years
Mandatory
- Strong Behavioral Data Analyst profile
- Mandatory (Experience 1): Minimum 4+ years of experience in user analytics or behavioural data analysis, focusing on user app and web journeys
- Mandatory (Experience 2): Experience in analyzing clickstream and behavioral data using tools such as Google Analytics, Mixpanel, CleverTap, or Firebase
- Mandatory (Skills 1): Hands-on experience in A/B testing, including hypothesis design, experimentation, and result interpretation.
- Mandatory (Skills 2): Strong analytical ability to identify behavioral patterns, anomalies, funnel drop-offs, and engagement trends from large datasets (a minimal funnel sketch follows this list).
- Mandatory (Skills 3): Hands-on proficiency in SQL, Excel, and data visualization tools such as Tableau or Power BI for dashboard creation and data storytelling.
- Mandatory (Skills 4): Basic understanding of UX principles and customer journey mapping, collaborating effectively with UX/CX teams
- Mandatory (Company): B2C product Companies (fintech, or e-commerce organizations with large user behavior dataset is a plus)
- Mandatory (Note): We are not looking for pure data analysts but for business/product/user analysts
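A minimal funnel drop-off sketch in Python, as referenced above. The event names and data are hypothetical, and this simple version counts unique users who fired each event type (a strict sequential funnel would also check event ordering per user).

```python
import pandas as pd

events = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3, 3, 3, 4],
    "event":   ["view", "add_to_cart", "checkout",
                "view", "add_to_cart",
                "view", "add_to_cart", "checkout",
                "view"],
})

funnel = ["view", "add_to_cart", "checkout"]
users_at_step = [events.loc[events["event"] == step, "user_id"].nunique()
                 for step in funnel]

for step, n in zip(funnel, users_at_step):
    print(f"{step:12s} {n} users ({n / users_at_step[0]:.0%} of top of funnel)")
```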
Ideal Candidate:
- Bachelor’s or Master’s degree in a relevant field (e.g., UX Design, Human-Computer Interaction, Computer Science, Marketing).
- 5+ years of experience in CX/UX roles, preferably in a B2C environment.
- Proficiency in analytics tools (Google Analytics, CleverTap, Medallia, Hotjar, etc).
- Strong understanding of wireframing and prototyping tools (Figma, XMind, etc).
- Excellent communication and collaboration skills.
- Proven experience in managing cross-functional teams and projects.
- Strong background in data analytics and data-driven decision-making.
- Expert understanding of user experience and user-centered design approaches
- Detail orientation, with the experience and will to continuously learn, adapt, and evolve
- Creating and measuring the success and impact of your CX designs
- Knowledge of testing tools like Maze, UsabilityHub, UserZoom would be a plus
- Experienced in designing responsive websites as well as mobile apps
- Understanding of iOS and Android design guidelines
- Passion for great customer-focused design, a purposeful aesthetic sense, and generating simple solutions to complex problems.
- Excellent communication skills to be able to present their work and ideas to the leadership team.
If interested, kindly share your updated resume on 82008 31681
Exp: 10+ Years
CTC: 1.7 LPM
Location: Pune
Snowflake Expertise Profile
Should hold 10+ years of experience, with strong skills, a core understanding of cloud data warehouse principles, and extensive experience in designing, building, optimizing, and maintaining robust and scalable data solutions on the Snowflake platform.
Possesses a strong background in data modelling, ETL/ELT, SQL development, performance tuning, scaling, monitoring, and security handling.
Responsibilities:
* Collaborate with the data and ETL teams to review code, understand the current architecture, and help improve it based on Snowflake offerings and experience.
* Review and implement best practices to design, develop, maintain, scale, and efficiently monitor data pipelines and data models on the Snowflake platform for ETL or BI.
* Optimize complex SQL queries for data extraction, transformation, and loading within Snowflake.
* Ensure data quality, integrity, and security within the Snowflake environment.
* Participate in code reviews and contribute to the team's development standards.
Education:
* Bachelor’s degree in Computer Science, Data Science, Information Technology, or equivalent.
* Relevant Snowflake certifications are a plus (e.g., SnowPro Core or SnowPro Advanced: Architect).
Job Role - Power BI Lead
9 to 12 Years Experience Required.
Location - Pune (Baner / Viman Nagar)
Work Model - Hybrid (Wednesday and Thursday WFO) 12 PM to 9 PM
Experience with Banking or GRC Domain is preferred.
- JOB SUMMARY
- Role Overview: We are seeking a highly skilled Power BI expert to lead the design, development, implementation, and governance of Power BI solutions. The ideal candidate will have in-depth knowledge of Power BI architecture, data modeling, governance, embedded analytics, and database management. The role requires expertise in Power BI Data Gateways, report deployment, and governance frameworks, ensuring scalable and secure data solutions.
- PRIMARY RESPONSIBILITIES
- Power BI Lead & Implementation:
- Design, develop, and deploy interactive Power BI reports and dashboards.
- Create efficient data models to optimize performance and scalability.
- Develop complex DAX expressions for business logic and calculations.
- Optimize report performance by using best practices in Power BI and SQL
- Power BI Architecture & Configuration:
- Configure and manage Power BI Data Gateways for secure and seamless data access
- Define and enforce Power BI workspace, dataset, and security policies.
- Implement row-level security (RLS) and data governance best practices.
- Establish data refresh schedules and ensure efficient data ingestion pipelines.
- Maintain and enhance Power BI Premium and Embedded solutions.
- Embedded Analytics & Integration:
- Integrate Power BI reports with external applications using Power BI Embedded.
- Work with Power BI REST APIs to automate workflows (a minimal token-generation sketch follows this sub-list).
- Integrate Power BI with Oracle, SQL Server, MySQL, Microsoft SharePoint, Excel, cloud data sources, etc.
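A hedged sketch of the REST-API automation mentioned above: generating a Power BI Embedded token for a report via the Reports - Generate Token In Group endpoint. The ids and AAD token are placeholders, and the app-registration/capacity setup is assumed.

```python
import requests

ACCESS_TOKEN = "<azure-ad-bearer-token>"                    # placeholder
GROUP_ID, REPORT_ID = "<workspace-guid>", "<report-guid>"   # placeholders

# Request a view-only embed token for one report in a workspace.
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/reports/{REPORT_ID}/GenerateToken",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"accessLevel": "View"},
)
resp.raise_for_status()
embed_token = resp.json()["token"]   # passed to the client-side embed config
```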
- Database & Performance Optimization:
- Write optimized SQL queries and stored procedures for report development.
- Ensure high-performance data refreshes and query execution.
- Work with the ETL team to improve data integration with Power BI
- Governance & Security:
- Define Power BI governance framework and best practices for standardization.
- Monitor user access, performance, and usage analytics to drive efficiency.
- Manage user roles, access controls, and data security.
- PowerApps & Power Automate (Nice to Have):
- Build PowerApps applications to extend Power BI functionality and create interactive business solutions
- Automate data flows and reporting updates using Power Automate (Flows, Triggers, Approvals, Notifications, etc.)
- Integrate Power BI, PowerApps, and Power Automate to create end-to-end business process automation
- Stakeholder Collaboration & Training:
- Work closely with business users, data engineers, and leadership teams to understand and document reporting requirements.
- Provide training and best practice guidance to Power BI users across the organization.
- Develop self-service Power BI frameworks to empower business teams for reporting.
- Troubleshoot Power BI performance and user issues.
We are seeking a highly skilled and experienced Power BI Lead / Architect to join our growing team. The ideal candidate will have a strong understanding of data warehousing, data modeling, and business intelligence best practices. This role will be responsible for leading the design, development, and implementation of complex Power BI solutions that provide actionable insights to key stakeholders across the organization.
Location - Pune (Hybrid 3 days)
Responsibilities:
Lead the design, development, and implementation of complex Power BI dashboards, reports, and visualizations.
Develop and maintain data models (star schema, snowflake schema) for optimal data analysis and reporting; an illustrative star-schema sketch follows this list.
Perform data analysis, data cleansing, and data transformation using SQL and other ETL tools.
Collaborate with business stakeholders to understand their data needs and translate them into effective and insightful reports.
Develop and maintain data pipelines and ETL processes to ensure data accuracy and consistency.
Troubleshoot and resolve technical issues related to Power BI dashboards and reports.
Provide technical guidance and mentorship to junior team members.
Stay abreast of the latest trends and technologies in the Power BI ecosystem.
Ensure data security, governance, and compliance with industry best practices.
Contribute to the development and improvement of the organization's data and analytics strategy.
May lead and mentor a team of junior Power BI developers.
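As referenced in the responsibilities above, an illustrative star-schema layout with all names hypothetical, shown as SQL DDL executed from Python for concreteness.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key  INTEGER PRIMARY KEY,   -- surrogate key, e.g., 20240115
        full_date TEXT,
        year      INTEGER,
        month     INTEGER
    );
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT,
        category     TEXT
    );
    -- Fact table at the grain of one product sold per day.
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        units       INTEGER,
        revenue     REAL
    );
""")
```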
Qualifications:
8-12 years of experience in Business Intelligence and Data Analytics.
Proven expertise in Power BI development, including DAX and advanced data modeling techniques.
Strong SQL skills, including writing complex queries, stored procedures, and views.
Experience with ETL/ELT processes and tools.
Experience with data warehousing concepts and methodologies.
Excellent analytical, problem-solving, and communication skills.
Strong teamwork and collaboration skills.
Ability to work independently and proactively.
Bachelor's degree in Computer Science, Information Systems, or a related field preferred.
Experience: 4+ years.
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion (a minimal Snowpipe sketch follows this list).
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profiling, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
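As a hedged sketch of the Snowpipe-based ingestion mentioned above: creating an auto-ingest pipe via the Snowflake Python connector. The account, credentials, database objects, and stage are placeholders, and the Kafka Connect or cloud-event wiring that drops files onto the stage is assumed to be configured separately.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",  # placeholders
    warehouse="LOAD_WH", database="RAW", schema="EVENTS",
)

# A pipe that auto-ingests staged JSON files into a raw orders table.
conn.cursor().execute("""
    CREATE PIPE IF NOT EXISTS raw.events.orders_pipe
      AUTO_INGEST = TRUE
      AS
      COPY INTO raw.events.orders
      FROM @raw.events.orders_stage
      FILE_FORMAT = (TYPE = 'JSON')
""")
```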
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Pipeline Monitoring and Optimization: Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.

Looking for a trainer who has experience in any one of the following data analytics tools: Tableau, Power BI, Excel, MySQL.
Job description
Experience: At least 1 year of experience in imparting training. Bachelor's/Master's degree in any field.
Requirements :
- Good interpersonal and excellent communication skills
- Imparting training on Tableau, Excel, SQL, Python or R programming, and Statistics.
- Good understanding of Data Analysis concepts and an understanding of Machine Learning, Big Data, and Business Intelligence tools.
- Conduct online and classroom training sessions by providing practical use cases and assignments
- Able to cater to college seminars, workshops, and corporate training.
- Provide interview preparation and placement assistance to students
- Manage Data Analytics training content, including session presentations, assignments, and quizzes.
• Work with various stakeholders, understand requirements, and build solutions/data pipelines that address the needs at scale
• Bring key workloads to the clients’ Snowflake environment using scalable, reusable data ingestion and processing frameworks to transform a variety of datasets
• Apply best practices for Snowflake architecture, ELT, and data models
Skills - 50% of the below:
• A passion for all things data; understanding how to work with it at scale and, more importantly, knowing how to get the most out of it
• Good understanding of native Snowflake capabilities like data ingestion, data sharing, zero-copy cloning, tasks, Snowpipe, etc.
• Expertise in data modeling, with a good understanding of modeling approaches like Star schema and/or Data Vault
• Experience in automating deployments
• Experience writing code in Python, Scala, Java, or PHP
• Experience in ETL/ELT, either via a code-first approach or using low-code tools like AWS Glue, AppFlow, Informatica, Talend, Matillion, Fivetran, etc.
• Experience with one or more AWS services, especially in relation to integration with Snowflake
• Familiarity with data visualization tools like Tableau, Power BI, Domo, or any similar tool
• Experience with data virtualization tools like Trino, Starburst, Denodo, Data Virtuality, Dremio, etc.
• Certified SnowPro Advanced: Data Engineer is a must.
Senior Software Engineer
MUST HAVE:
Power BI with PL/SQL
Experience: 5+ years
Cost: 18 LPA
WFH - Hybrid
Job Description:
- Bookkeeping and accounting in Tally ERP, Xero, QuickBooks, and applicable accounting software
- Responsible for preparation and management of books of accounts, records, and documents for foreign entities
- Preparation and reporting of Monthly/periodical MIS.
- Managing billing, receivables, and collection.
- Liaising with foreign consultants with respect to bookkeeping and compliances
- Ensure compliance under various laws for payroll and non-payroll compliances.
- Managing Audits of the offshore entities under different statutes (GST/Sales Tax, Companies House)
- Managing payroll and payroll compliances
- Managing Banking operations and payments and operational fund flow/cash flow.
- Desired Candidate Profile:
- Must have good communication skills to deal with foreign clients.
- Should have good knowledge of MS Office and Tally.
- Experience in Corporate Reporting, MIS, Power BI and Tableau etc.
4-6 years of total experience in data warehousing and business intelligence
3+ years of solid Power BI experience (Power Query, M-Query, DAX, Aggregates)
2 years’ experience building Power BI using cloud data (Snowflake, Azure Synapse, SQL DB, data lake)
Strong experience building visually appealing UI/UX in Power BI
Understand how to design Power BI solutions for performance (composite models, incremental refresh, analysis services)
Experience building Power BI using large data in direct query mode
Expert SQL background (query building, stored procedure, optimizing performance)
Requirements
Must have experience in the BFSI domain.
Exp - Min 3 yrs
Location - Pune
Mandatory Skills - Experience in Power BI/Tableau, SQL, basic Python
Data warehousing architect/designer
Data migration architect/designer
- Building and operationalizing large scale enterprise data solutions and applications using one or more of AZURE data and analytics services in combination with custom solutions - Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsights, Databricks, CosmosDB, EventHub/IOTHub.
- Experience in migrating on-premise data warehouses to data platforms on AZURE cloud.
- Designing and implementing data engineering, ingestion, and transformation functions
- Experience with Azure Analysis Services
- Experience in Power BI
- Experience with third-party solutions like Attunity/StreamSets, Informatica
- Experience with PreSales activities (Responding to RFPs, Executing Quick POCs)
- Capacity Planning and Performance Tuning on Azure Stack and Spark.