10+ Data Visualization Jobs in Pune | Data Visualization Job openings in Pune
About Vijay Sales
Vijay Sales is one of India’s leading electronics retail brands with 160+ stores nationwide and a fast-growing digital presence. We are on a mission to build the most advanced data-driven retail intelligence ecosystem—using AI, predictive analytics, LLMs, and real-time automation to transform customer experience, supply chain, and omnichannel operations.
Role Overview
We are looking for a highly capable AI Engineer who is passionate about building production-grade AI systems, designing scalable ML architecture, and working with cutting-edge AI/ML tools. This role involves hands-on work with Databricks, SQL, PySpark, modern LLM/GenAI frameworks, and full lifecycle ML system design.
Key Responsibilities
Machine Learning & AI Development
- Build, train, and optimize ML models for forecasting, recommendation, personalization, churn prediction, inventory optimization, anomaly detection, and pricing intelligence.
- Develop GenAI solutions using modern LLM frameworks (e.g., LangChain, LlamaIndex, HuggingFace Transformers).
- Explore and implement RAG (Retrieval Augmented Generation) pipelines for product search, customer assistance, and support automation.
- Fine-tune LLMs on company-specific product and sales datasets (using QLoRA, PEFT, and Transformers).
- Develop scalable feature engineering pipelines leveraging Delta Lake and Databricks Feature Store.
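The RAG pipelines mentioned above revolve around one core step: retrieving the most relevant documents by embedding similarity before handing them to an LLM. The sketch below is a toy illustration only; random vectors stand in for real embeddings (which would come from an embedding model and live in a vector database), and all document texts and names are hypothetical.

```python
import numpy as np

# Toy RAG retrieval step: rank documents by cosine similarity
# to a query embedding. Random vectors stand in for real
# embeddings; all names here are illustrative.
rng = np.random.default_rng(42)

documents = [
    "LED TV 55 inch warranty terms",
    "Refrigerator installation guide",
    "Air conditioner service schedule",
]
doc_embeddings = rng.normal(size=(len(documents), 8))

def retrieve(query_embedding, doc_embeddings, documents, k=2):
    """Return the k documents most similar to the query by cosine similarity."""
    q = query_embedding / np.linalg.norm(query_embedding)
    d = doc_embeddings / np.linalg.norm(doc_embeddings, axis=1, keepdims=True)
    scores = d @ q                      # cosine similarity per document
    top = np.argsort(scores)[::-1][:k]  # indices of the k best matches
    return [documents[i] for i in top]

query_embedding = rng.normal(size=8)
context = retrieve(query_embedding, doc_embeddings, documents)
print(context)
```

In a production pipeline the retrieved `context` would be injected into the LLM prompt; frameworks like LangChain or LlamaIndex wrap this same pattern.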
Databricks / Data Engineering
- Build end-to-end ML workflows on Databricks using PySpark, MLflow, Unity Catalog, Delta Live Tables.
- Optimize Databricks clusters for cost, speed, and stability.
- Maintain reusable notebooks and parameterized pipelines for model ingestion, validation, and deployment.
- Use MLflow for tracking experiments, model registry, and lifecycle management.
Data Handling & SQL
- Write advanced SQL for multi-source data exploration, aggregation, and anomaly detection.
- Work on large, complex datasets from ERP, POS, CRM, Website, and Supply Chain systems.
- Automate ingestion of streaming and batch data into Databricks pipelines.
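The "advanced SQL for anomaly detection" above typically combines CTEs with window functions. A minimal, self-contained sketch using SQLite (table and column names are invented for illustration, not taken from any real schema):

```python
import sqlite3

# Windowed SQL sketch: flag days whose sales deviate sharply
# from a store's own average. Schema and data are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE daily_sales (store TEXT, day INTEGER, amount REAL);
INSERT INTO daily_sales VALUES
  ('PUNE-01', 1, 100), ('PUNE-01', 2, 110),
  ('PUNE-01', 3, 95),  ('PUNE-01', 4, 500);
""")
rows = conn.execute("""
WITH stats AS (
  SELECT store, day, amount,
         AVG(amount) OVER (PARTITION BY store) AS avg_amount
  FROM daily_sales
)
SELECT store, day, amount
FROM stats
WHERE amount > 2 * avg_amount   -- crude anomaly rule
""").fetchall()
print(rows)  # the day-4 spike
```

The same CTE-plus-window-function shape carries over directly to Spark SQL or Databricks SQL, usually with per-store standard deviations rather than a fixed multiplier.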
Deployment & MLOps
- Deploy ML models using REST APIs, Databricks Model Serving, Docker, or cloud-native endpoints.
- Build CI/CD pipelines for ML using GitHub Actions, Azure DevOps, or Databricks Workflows.
- Implement model monitoring for drift, accuracy decay, and real-time alerts.
- Maintain GPU/CPU environments for training workflows.
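One common way to implement the drift monitoring listed above is the Population Stability Index (PSI), which compares a feature's binned distribution at training time against production. A minimal sketch on synthetic data follows; the 0.2 alert threshold is a widely used rule of thumb, not a universal standard.

```python
import numpy as np

# Drift monitoring sketch: Population Stability Index (PSI)
# between a training-time sample and a production sample.
def psi(expected, actual, bins=10):
    edges = np.histogram_bin_edges(expected, bins=bins)
    e_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    a_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # floor the proportions to avoid log(0)
    e_pct = np.clip(e_pct, 1e-6, None)
    a_pct = np.clip(a_pct, 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

rng = np.random.default_rng(0)
train = rng.normal(0, 1, 5000)
stable = rng.normal(0, 1, 5000)      # same distribution
shifted = rng.normal(1.5, 1, 5000)   # simulated drift

print(psi(train, stable))    # small: distribution unchanged
print(psi(train, shifted))   # large: would trigger an alert
```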
Must-Have Technical Skills
Core AI/ML
- Strong fundamentals in machine learning: regression, classification, time-series forecasting, clustering.
- Experience in deep learning using PyTorch or TensorFlow/Keras.
- Expertise in LLMs, embeddings, vector databases, and GenAI architecture.
- Hands-on experience with HuggingFace, embedding models, and RAG.
Databricks & Big Data
- Hands-on experience with Databricks (PySpark, SQL, Delta Lake, MLflow, Feature Store).
- Strong understanding of Spark execution, partitioning, and optimization.
Programming
- Strong proficiency in Python.
- Experience writing high-performance SQL with window functions, CTEs, and analytical queries.
- Knowledge of Git, CI/CD, REST APIs, and Docker.
MLOps & Production Engineering
- Experience deploying models to production and monitoring them.
- Familiarity with tools like MLflow, Weights & Biases, or SageMaker equivalents.
- Experience in building automated training pipelines and handling model drift/feedback loops.
Preferred Domain Experience
- Retail/e-commerce analytics
- Demand forecasting
- Inventory optimization
- Customer segmentation & personalization
- Price elasticity and competitive pricing
Review Criteria
- Strong Lead – User Research & Analytics profile (behavioural/user/product/UX analytics)
- 10+ years of experience in Behavioral Data Analytics, User Research, or Product Insights, driving data-informed decision-making for B2C digital products (web and app).
- 6+ months of experience analyzing user journeys, clickstream, and behavioral data using tools such as Google Analytics, Mixpanel, CleverTap, Firebase, or Amplitude.
- Experience in leading cross-functional user research and analytics initiatives in collaboration with Product, Design, Engineering, and Business teams to translate behavioral insights into actionable strategies.
- Strong expertise in A/B testing and experimentation, including hypothesis design, execution, statistical validation, and impact interpretation.
- Ability to identify behavioral patterns, funnel drop-offs, engagement trends, and user journey anomalies using large datasets and mixed-method analysis.
- Hands-on proficiency in SQL, Excel, and data visualization/storytelling tools such as Tableau, Power BI, or Looker for executive reporting and dashboard creation.
- Deep understanding of UX principles, customer journey mapping, and product experience design, with experience integrating qualitative and quantitative insights.
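The A/B testing and statistical-validation expertise listed above can be illustrated with a minimal two-proportion z-test on conversion rates. All numbers below are invented for the example.

```python
import math

# A/B test validation sketch: two-proportion z-test comparing
# conversion rates of control (A) and variant (B).
def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)       # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))     # two-sided p-value
    return z, p_value

# Invented example: B converts 5.8% vs. A's 5.0% on 20k users each.
z, p = two_proportion_z(1000, 20000, 1160, 20000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

In practice this sits at the end of an experiment pipeline, after hypothesis design, sample-size planning, and randomization checks.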
Preferred
- Ability to build insightful dashboards and executive reports highlighting user engagement, retention, and behavioral metrics; familiarity with mixed-method research, AI-assisted insight tools (Dovetail, EnjoyHQ, Qualtrics, UserZoom), and mentoring junior researchers
Job Specific Criteria
- CV Attachment is mandatory
- We work on alternate Saturdays. Are you comfortable working from home on the 1st and 4th Saturdays?
Role & Responsibilities
Product Conceptualization & UX Strategy Development:
- Conceptualize customer experience strategies
- Collaborate with product managers to conceptualize new products & align UX with product roadmaps.
- Develop and implement UX strategies that align with business objectives.
- Stay up-to-date with industry trends and best practices in UX & UI for AI.
- Assist in defining product requirements and features.
- Use data analytics to inform product strategy and prioritize features.
- Ensure product alignment with customer needs and business goals.
- Develop platform blueprints that include a features and functionalities map, ecosystem map, and information architecture.
- Create wireframes, prototypes, and mock-ups using tools like Figma.
- Conduct usability testing and iterate designs based on feedback.
- Employ tools like XMind for brainstorming and mind mapping.
Customer Journey Analysis:
- Understand and map out customer journeys and scenarios.
- Identify pain points and opportunities for improvement.
- Develop customer personas and empathy maps.
Cross-Functional Collaboration:
- Work closely with internal units such as UX Research, Design, UX Content, and UX QA to ensure seamless delivery of CX initiatives.
- Coordinate with development teams to ensure UX designs are implemented accurately.
Data Analytics and Tools:
- Utilize clickstream and analytics tools like Google Analytics, CleverTap, and Medallia to gather and analyse user data.
- Leverage data to drive decisions and optimize customer experiences.
- Strong background in data analytics, including proficiency in interpreting complex datasets to inform UX decisions.
Ideal Candidate
- Bachelor’s or Master’s degree in a relevant field (e.g., UX Design, Human-Computer Interaction, Computer Science, Marketing).
- 5+ years of experience in CX/UX roles, preferably in a B2C environment.
- Proficiency in analytics tools (Google Analytics, CleverTap, Medallia, Hotjar, etc).
- Strong understanding of wireframing and prototyping tools (Figma, XMind, etc).
- Excellent communication and collaboration skills.
- Proven experience in managing cross-functional teams and projects.
- Strong background in data analytics and data-driven decision-making.
- Expert understanding of user experience and user-centered design approaches
- Detail-oriented, with the drive to continuously learn, adapt, and evolve
- Experience creating and measuring the success and impact of CX designs
- Knowledge of testing tools like Maze, UsabilityHub, UserZoom would be a plus
- Experienced in designing responsive websites as well as mobile apps
- Understanding of iOS and Android design guidelines
- Passion for great customer-focused design, a purposeful aesthetic sense, and generating simple solutions to complex problems.
- Excellent communication skills to present your work and ideas to the leadership team.
Role: Data Scientist (Python + R Expertise)
Exp: 8 -12 Years
CTC: up to 30 LPA
Required Skills & Qualifications:
- 8–12 years of hands-on experience as a Data Scientist or in a similar analytical role.
- Strong expertise in Python and R for data analysis, modeling, and visualization.
- Proficiency in machine learning frameworks (scikit-learn, TensorFlow, PyTorch, caret, etc.).
- Strong understanding of statistical modeling, hypothesis testing, regression, and classification techniques.
- Experience with SQL and working with large-scale structured and unstructured data.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and deployment practices (Docker, MLflow).
- Excellent analytical, problem-solving, and communication skills.
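The statistical modeling and regression expertise required above can be sketched in a few lines: an ordinary least squares fit with R² on synthetic data. This is a minimal illustration, not a substitute for the frameworks listed (scikit-learn, caret, etc.).

```python
import numpy as np

# Regression sketch: OLS fit via least squares, plus R^2,
# on synthetic data with known slope 3 and intercept 2.
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 200)
y = 3.0 * x + 2.0 + rng.normal(0, 1, 200)

X = np.column_stack([x, np.ones_like(x)])        # design matrix [x, 1]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
slope, intercept = beta

pred = X @ beta
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(slope, intercept, r2)
```

The equivalent fit in R would be `lm(y ~ x)`, with R² read from `summary()`.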
Preferred Skills:
- Experience with NLP, time series forecasting, or deep learning projects.
- Exposure to data visualization tools (Tableau, Power BI, or R Shiny).
- Experience working in product or data-driven organizations.
- Knowledge of MLOps and model lifecycle management is a plus.
If interested, kindly share your updated resume on 82008 31681.
Job Title : Oracle Analytics Cloud (OAC) / Fusion Data Intelligence (FDI) Specialist
Experience : 3 to 8 years
Location : All USI locations – Hyderabad, Bengaluru, Mumbai, Gurugram (preferred) and Pune, Chennai, Kolkata
Work Mode : Hybrid Only (2-3 days from office or all 5 days from office)
Mandatory Skills : Oracle Analytics Cloud (OAC), Fusion Data Intelligence (FDI), RPD, OAC Reports, Data Visualizations, SQL, PL/SQL, Oracle Databases, ODI, Oracle Cloud Infrastructure (OCI), DevOps tools, Agile methodology.
Key Responsibilities :
- Design, develop, and maintain solutions using Oracle Analytics Cloud (OAC).
- Build and optimize complex RPD models, OAC reports, and data visualizations.
- Utilize SQL and PL/SQL for data querying and performance optimization.
- Develop and manage applications hosted on Oracle Cloud Infrastructure (OCI).
- Support Oracle Cloud migrations, OBIEE upgrades, and integration projects.
- Collaborate with teams using the ODI (Oracle Data Integrator) tool for ETL processes.
- Implement cloud scripting using cURL for Oracle Cloud automation.
- Contribute to the design and implementation of Business Continuity and Disaster Recovery strategies for cloud applications.
- Participate in Agile development processes and DevOps practices including CI/CD and deployment orchestration.
Required Skills :
- Strong hands-on expertise in Oracle Analytics Cloud (OAC) and/or Fusion Data Intelligence (FDI).
- Deep understanding of data modeling, reporting, and visualization techniques.
- Proficiency in SQL, PL/SQL, and relational databases on Oracle.
- Familiarity with DevOps tools, version control, and deployment automation.
- Working knowledge of Oracle Cloud services, scripting, and monitoring.
Good to Have :
- Prior experience in OBIEE to OAC migrations.
- Exposure to data security models and cloud performance tuning.
- Certification in Oracle Cloud-related technologies.
Company Description
UpSolve Solutions is a company specializing in Video and Text Analytics to drive business decisions. We aim to solve business problems that are taking more time and resources than expected, and turn them into opportunities for growth and improvement. We are located in Pune and offer innovative solutions to help businesses succeed.
Role Description
This is a full-time on-site role for a QlikView Developer. As a QlikView Developer, you will be responsible for developing data models, creating dashboards, and utilizing your analytical skills. You will also work with data warehousing and ETL (Extract Transform Load) processes to ensure effective data management. You will be an integral part of our team in Pune.
Qualifications
- Data Modeling and Dashboard development skills
- Strong analytical skills
- Experience in data warehousing and ETL processes
- Proficiency in QlikView development
- Good understanding of business requirements and data analysis
- Excellent problem-solving and communication skills
- Experience in the software industry is a plus
- Bachelor's degree in Computer Science, Information Technology, or related field
Vue.JS Developer
1. Should have worked in Agile methodology
2. Should have at least 5+ years of experience
3. Should have expertise in creating reusable components and component coupling, including routing
4. Should have good knowledge of data visualization in Reports/Charts
- 4-6 years of total experience in data warehousing and business intelligence
- 3+ years of solid Power BI experience (Power Query, M-Query, DAX, Aggregates)
- 2 years' experience building Power BI using cloud data (Snowflake, Azure Synapse, SQL DB, data lake)
- Strong experience building visually appealing UI/UX in Power BI
- Understanding of how to design Power BI solutions for performance (composite models, incremental refresh, analysis services)
- Experience building Power BI using large data in DirectQuery mode
- Expert SQL background (query building, stored procedures, optimizing performance)
Job Role & Responsibility:
The VMware NSBU HCX team is looking for passionate engineers with a startup mindset who are ready to solve business problems by applying engineering knowledge. You will work on the enterprise-class HCX cloud mobility platform, helping customers address datacenter evacuation, consolidation, and hybrid cloud use cases. You'll join a dynamic engineering team where a passion for innovation is key to solving technical problems and improving performance and scalability, and where continuous improvement is of paramount importance. As part of the R&D development team, you will work on various aspects of HCX, from workload mobility to networking in the hybrid cloud paradigm, alongside senior architects and other engineers to deliver a world-class enterprise product.
Required Skills:
- Background with Computer Science fundamentals (based on a BS or MS in CS or related field) with 4+ years of substantial professional experience
- Strong programming skills in Java/Go/Python
- Knowledge of distributed systems
- Understanding of Micro Services architecture, REST APIs Design and Development
- Understanding of Kubernetes, Kafka, NoSQL, Java/Spring, client-side MVC
- Exposure to one or more UI technologies
- Organized and passionate about details; able to effectively perform multiple/concurrent tasks within deadlines in a dynamic environment
Preferred Skills:
- Strong knowledge about virtualization, and/or container technologies
- Experience in SDN/Networking/Network Management domain is added plus
- Exposure to AWS, Azure is added plus
Pingahla is recruiting Business Intelligence Consultants/Senior Consultants who can help us with information management projects (domestic, onshore, and offshore) as developers and team leads. Candidates are expected to have 3-6 years of experience with Informatica PowerCenter/Talend DI/Informatica Cloud and must be very proficient with business intelligence in general. The job is based out of our Pune office.
Responsibilities:
- Manage the customer relationship by serving as the single point of contact before, during and after engagements.
- Architect data management solutions.
- Provide technical leadership to other consultants and/or customer/partner resources.
- Design, develop, test and deploy data integration solutions in accordance with customer’s schedule.
- Supervise and mentor all intermediate and junior level team members.
- Provide regular reports to communicate status both internally and externally.
Qualifications:
A typical profile suited to this position would have the following background:
- A graduate from a reputed engineering college
- Excellent IQ and analytical skills, with the ability to grasp new concepts and learn new technologies quickly.
- A willingness to work with a small team in a fast-growing environment.
- A good knowledge of Business Intelligence concepts
Mandatory Requirements:
- Knowledge of Business Intelligence
- Good knowledge of at least one of the following data integration tools - Informatica PowerCenter, Talend DI, Informatica Cloud
- Knowledge of SQL
- Excellent English and communication skills
- Intelligent, quick to learn new technologies
- Track record of accomplishment and effectiveness with handling customers and managing complex data management needs
Job Description:
Roles & Responsibilities:
· You will be involved in every part of the project lifecycle, right from identifying the business problem and proposing a solution, to data collection, cleaning, and preprocessing, to training and optimizing ML/DL models and deploying them to production.
· You will often be required to design and execute proof-of-concept projects that can demonstrate business value and build confidence with CloudMoyo’s clients.
· You will be involved in designing and delivering data visualizations that utilize the ML models to generate insights and intuitively deliver business value to CXOs.
Desired Skill Set:
· Candidates should have strong Python coding skills and be comfortable working with various ML/DL frameworks and libraries.
· Hands-on skills and industry experience in one or more of the following areas is necessary:
1) Deep Learning (CNNs/RNNs, Reinforcement Learning, VAEs/GANs)
2) Machine Learning (Regression, Random Forests, SVMs, K-means, ensemble methods)
3) Natural Language Processing
4) Graph Databases (Neo4j, Apache Giraph)
5) Azure Bot Service
6) Azure ML Studio / Azure Cognitive Services
7) Log Analytics with NLP/ML/DL
· Previous experience with data visualization, C# or Azure Cloud platform and services will be a plus.
· Candidates should have excellent communication skills and be highly technical, with the ability to discuss ideas at any level from executive to developer.
· Creative problem-solving, unconventional approaches and a hacker mindset is highly desired.
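Of the techniques in the skill list above, clustering is compact enough to sketch end to end. Below is a bare-bones k-means (Lloyd's algorithm) on two synthetic, well-separated clusters; it is an illustration of the method, not production code.

```python
import numpy as np

# Minimal k-means (Lloyd's algorithm) on synthetic 2-D data.
def kmeans(points, k, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    # initialize centers from k distinct data points
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center
        d = np.linalg.norm(points[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return centers, labels

rng = np.random.default_rng(3)
pts = np.vstack([
    rng.normal([0, 0], 0.3, (50, 2)),   # cluster near the origin
    rng.normal([5, 5], 0.3, (50, 2)),   # cluster near (5, 5)
])
centers, labels = kmeans(pts, k=2)
print(np.round(centers, 1))
```

In practice one would use `sklearn.cluster.KMeans` (or a Spark ML equivalent), which adds smarter initialization and convergence checks.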