11+ Dashboard Manager Jobs in Pune | Dashboard Manager Job openings in Pune

Job Description
Create intelligent dashboards and analytics for the business team using advanced Excel/Access and PowerPoint
Understand sales and leadership metrics and create business reports with insights and recommendations
Maintain documentation related to reports, procedures, and personal tasks
Collaborate with internal teams to gather required inputs within deadlines
Perform quality checks for data correctness and execute with close attention to detail
2-5 years of industry experience in MIS reporting
Proficiency in advanced Excel, including macros
High attention to detail, with the ability to keep track of assigned tasks
Willingness to work flexible hours; ability to plan, estimate deadlines, and prioritize workload
Skills required:
1. Advanced Excel knowledge
2. Practical knowledge of macros
3. A graduate degree is required
4. Minimum 2-3 years of experience
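The quality-check responsibility above can be sketched as code. The posting centres on Excel macros, but the same check logic is shown here in Python for brevity; all column names and validation rules are invented for illustration:

```python
# Minimal data-correctness check for an MIS report extract.
# Column names ("region", "revenue") and rules are illustrative assumptions.

def validate_rows(rows):
    """Return a list of (row_index, problem) pairs for bad records."""
    problems = []
    for i, row in enumerate(rows):
        if not row.get("region"):
            problems.append((i, "missing region"))
        if row.get("revenue", 0) < 0:
            problems.append((i, "negative revenue"))
    return problems

sample = [
    {"region": "West", "revenue": 1200},
    {"region": "", "revenue": 800},
    {"region": "North", "revenue": -50},
]
issues = validate_rows(sample)
```

The same rule set would translate directly into an Excel macro that flags rows before the dashboard refreshes.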
About the Company:
Gruve is an innovative Software Services startup dedicated to empowering Enterprise Customers in managing their Data Life Cycle. We specialize in Cyber Security, Customer Experience, Infrastructure, and advanced technologies such as Machine Learning and Artificial Intelligence. Our mission is to assist our customers in using their data to make more intelligent business decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks.
Why Gruve:
At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you’re passionate about technology and eager to make an impact, we’d love to hear from you.
Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.
Position summary:
We are seeking an experienced and highly skilled Technical Lead with a strong background in Java, SaaS architectures, firewalls and cybersecurity products, including SIEM and SOAR platforms. The ideal candidate will lead technical initiatives, design and implement scalable systems, and drive best practices across the engineering team. This role requires deep technical expertise, leadership abilities, and a passion for building secure and high-performing security solutions.
Key Roles & Responsibilities:
- Lead the design and development of scalable and secure software solutions using Java.
- Architect and build SaaS-based cybersecurity applications, ensuring high availability, performance, and reliability.
- Provide technical leadership, mentoring, and guidance to the development team.
- Ensure best practices in secure coding, threat modeling, and compliance with industry standards.
- Collaborate with cross-functional teams including Product Management, Security, and DevOps to deliver high-quality security solutions.
- Design and implement security analytics, automation workflows and ITSM integrations.
- Drive continuous improvements in engineering processes, tools, and technologies.
- Troubleshoot complex technical issues and lead incident response for critical production systems.
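The automation-workflow responsibility above can be illustrated with a toy SOAR-style triage step. The role itself is Java-centric; Python is used here only for brevity, and the severity thresholds and ticket fields are invented, not taken from any real SIEM/SOAR product:

```python
# Hypothetical SOAR-style triage: route a normalized alert to an action.
# Thresholds and the ticket schema are illustrative assumptions.

def triage(alert):
    """Decide an automation action for a normalized alert dict."""
    score = alert.get("severity", 0)
    if score >= 8:
        return {"action": "open_ticket", "priority": "P1", "id": alert["id"]}
    if score >= 5:
        return {"action": "open_ticket", "priority": "P3", "id": alert["id"]}
    return {"action": "log_only", "id": alert["id"]}

decision = triage({"id": "a-42", "severity": 9})
```

In a real platform the "open_ticket" branch would call an ITSM integration rather than return a dict.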
Basic Qualifications:
- A bachelor’s or master’s degree in computer science, electronics engineering, or a related field.
- 8-10 years of software development experience, with expertise in Java.
- Strong background in building SaaS applications with cloud-native architectures (AWS, GCP, or Azure).
- In-depth understanding of microservices architecture, APIs, and distributed systems.
- Experience with containerization and orchestration tools like Docker and Kubernetes.
- Knowledge of DevSecOps principles, CI/CD pipelines, and infrastructure as code (Terraform, Ansible, etc.).
- Strong problem-solving skills and ability to work in an agile, fast-paced environment.
- Excellent communication and leadership skills, with a track record of mentoring engineers.
Preferred Qualifications:
- Experience with cybersecurity solutions, including SIEM (e.g., Splunk, ELK, IBM QRadar) and SOAR (e.g., Palo Alto XSOAR, Swimlane).
- Knowledge of zero-trust security models and secure API development.
- Hands-on experience with machine learning or AI-driven security analytics.

Position summary:
We are seeking a Senior Software Development Engineer – Data Engineering with 5-8 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions.
Key Roles & Responsibilities:
- Design, develop, and optimize ETL/ELT pipelines using Apache Spark, PySpark, Databricks, and Snowflake.
- Implement real-time and batch data processing workflows in cloud environments (AWS, Azure, GCP).
- Develop high-performance, scalable data pipelines for structured, semi-structured, and unstructured data.
- Work with Delta Lake and Lakehouse architectures to improve data reliability and efficiency.
- Optimize Snowflake and Databricks performance, including query tuning, caching, partitioning, and cost optimization.
- Implement data governance, security, and compliance best practices.
- Build and maintain data models, transformations, and data marts for analytics and reporting.
- Collaborate with data scientists, analysts, and business teams to define data engineering requirements.
- Automate infrastructure and deployments using Terraform, Airflow, or dbt.
- Monitor and troubleshoot data pipeline failures, performance issues, and bottlenecks.
- Develop and enforce data quality and observability frameworks using Great Expectations, Monte Carlo, or similar tools.
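The transform-plus-quality-gate pattern that the pipeline and data-quality responsibilities describe can be sketched in a few lines. In production this would run on Spark/Databricks with a tool like Great Expectations; plain Python is used here so the pattern stays visible, and the field names are invented:

```python
# Toy ELT step: normalize raw events, then apply a quality gate before "load".
# Field names ("user", "amount") are illustrative assumptions.

def transform(raw_events):
    """Normalize user names and cast amounts to float."""
    return [
        {"user": e["user"].strip().lower(), "amount": float(e["amount"])}
        for e in raw_events
    ]

def quality_gate(rows, max_null_ratio=0.0):
    """Fail the batch if too many required fields are empty
    (a Great Expectations-style expectation, hand-rolled)."""
    bad = sum(1 for r in rows if not r["user"])
    if rows and bad / len(rows) > max_null_ratio:
        raise ValueError(f"{bad} rows failed the user-not-empty expectation")
    return rows

loaded = quality_gate(transform([{"user": " Ada ", "amount": "3.5"}]))
```

Failing the batch loudly, rather than loading partial data, is the point of the gate: downstream marts stay trustworthy.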
Basic Qualifications:
- Bachelor’s or Master’s Degree in Computer Science or Data Science.
- 5–8 years of experience in data engineering, big data processing, and cloud-based data platforms.
- Hands-on expertise in Apache Spark, PySpark, and distributed computing frameworks.
- Strong experience with Snowflake (Warehouses, Streams, Tasks, Snowpipe, Query Optimization).
- Experience in Databricks (Delta Lake, MLflow, SQL Analytics, Photon Engine).
- Proficiency in SQL, Python, or Scala for data transformation and analytics.
- Experience working with data lake architectures and storage formats (Parquet, Avro, ORC, Iceberg).
- Hands-on experience with cloud data services (AWS Redshift, Azure Synapse, Google BigQuery).
- Experience in workflow orchestration tools like Apache Airflow, Prefect, or Dagster.
- Strong understanding of data governance, access control, and encryption strategies.
- Experience with CI/CD for data pipelines using GitOps, Terraform, dbt, or similar technologies.
Preferred Qualifications:
- Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub).
- Experience in BI and analytics tools (Tableau, Power BI, Looker).
- Familiarity with data observability tools (Monte Carlo, Great Expectations).
- Experience with machine learning feature engineering pipelines in Databricks.
- Contributions to open-source data engineering projects.
Responsibilities
Designing a user interface (UI)
· Create responsive, logical, and visually appealing user interfaces for web and mobile applications.
· Collaborate with cross-functional teams to ensure uniform, unified UI designs across the product.
· Create wireframes, mock-ups, and prototypes to communicate design ideas effectively.
Designing for User Experience (UX)
· Conduct user research to better understand user needs, preferences, and behaviours.
· Create user personas, user journeys, and empathy maps to guide design choices.
· Iterate on and refine user flows to improve the product's overall usability and user experience.
Interaction Design
· Define and implement interactive elements and animations to enhance user engagement and delight.
· Work closely with development teams to ensure seamless integration of design elements and interactions into the product.
Usability Testing
· Conduct usability testing and gather feedback from users to identify areas for improvement.
· Analyze user feedback and iterate designs based on insights obtained from testing.
Design Guidelines and Documentation:
· Create and maintain design guidelines and documentation to ensure design consistency and scalability across the product.
Cross-Functional Collaboration:
· Collaborate with product managers, developers, and other stakeholders to align design goals with product objectives and technical feasibility.
· Participate in brainstorming sessions and provide creative input to product development discussions.
Qualifications
· Bachelor's degree in Design, Human-Computer Interaction, or related field.
· 2-3 years of professional experience in UI/UX design for digital products.
· Proficient in design tools such as Adobe XD, Sketch, Figma, or similar.
· Strong understanding of design principles, user-centered design, and design thinking.
· Knowledge of HTML, CSS, and JavaScript is a plus.
· Excellent communication and collaboration skills.
· Ability to work in a fast-paced, dynamic environment and adapt to changing priorities.


We are looking for a Data Analyst to oversee organisational data analytics. You will design and help implement the data analytics platform that keeps the organisation running. The team will be the go-to for all data needs for the app, and we are looking for a self-starter who is hands-on yet able to abstract problems and anticipate data requirements.
This person should be a very strong technical data analyst who can design and implement data systems independently, and should also be proficient in business reporting, with a keen interest in providing the data the business needs.
Tools familiarity: SQL, Python, Mixpanel, Metabase, Google Analytics, CleverTap, App Analytics
Responsibilities
- Lead the data analytics team; build processes and frameworks for metrics, analytics, experimentation, and user insights
- Align metrics across teams to make them actionable and promote accountability
- Build data-based frameworks for assessing and strengthening product-market fit
- Identify viable growth strategies through data and experimentation
- Run experiments for product optimisation and for understanding user behaviour
- Take a structured approach to deriving user insights and answering questions with data
- Work closely with technical and business teams to get these implemented
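The experimentation responsibility above usually reduces to comparing conversion rates between variants. A minimal stdlib-only readout is a two-proportion z-test; the counts below are made up for illustration:

```python
import math

# Minimal A/B experiment readout: pooled two-proportion z-test.
# Conversion counts and sample sizes below are invented for the example.

def ab_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) for rates conv/n per arm."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)                # pooled rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # pooled standard error
    z = (p_b - p_a) / se
    # two-sided p-value from the normal CDF, via erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = ab_z_test(120, 1000, 150, 1000)  # 12% control vs 15% variant
```

With these made-up numbers the lift sits right at the conventional 5% significance boundary, which is exactly the kind of call this role would be expected to adjudicate.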
Skills
- 4 to 6 years in a relevant data analytics role at a product-oriented company
- Highly organised, technically sound, and a good communicator
- Ability to handle and build for cross-functional data requirements and interactions with teams
- Strong Python and SQL skills
- Ability to build and mentor a team
- Knowledge of key business metrics such as cohorts, engagement cohorts, LTV, ROAS, and ROE
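Cohort metrics like those listed can be computed with very little machinery. A toy sketch, assuming events are simply (user, week) pairs (an invented shape for illustration):

```python
from collections import defaultdict

# Toy weekly retention cohort: group users by signup week and count
# how many are active at each week offset. Event shape is an assumption.

def retention(signups, activity):
    """signups: {user: cohort_week}; activity: set of (user, week) events.
    Returns {cohort_week: {offset_in_weeks: retained_user_count}}."""
    out = defaultdict(lambda: defaultdict(int))
    for user, week in activity:
        cohort = signups.get(user)
        if cohort is not None and week >= cohort:
            out[cohort][week - cohort] += 1
    return out

signups = {"u1": 0, "u2": 0, "u3": 1}
activity = {("u1", 0), ("u1", 1), ("u2", 0), ("u3", 1), ("u3", 2)}
table = retention(signups, activity)
```

In practice the same aggregation would be a SQL `GROUP BY` over an events table, but the offset-from-cohort idea is identical.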
Eligibility
BTech or MTech in Computer Science/Engineering from a Tier1, Tier2 colleges
Good knowledge on Data Analytics, Data Visualization tools. A formal certification would be added advantage.
We are more interested in what you CAN DO than your location, education, or experience levels.
Send us your code samples / GitHub profile / published articles if applicable.
We are looking for a Product Manager for our data modernization product. The candidate should have good knowledge of Big Data/DWH, along with strong stakeholder management and presentation skills.
Experience in Spring Boot, Spring Cloud, Spring Security, and web services
Good communication skills
- Designing and deploying databases
- Ensuring the entire stack is designed and built for speed and scalability
- Designing and constructing REST APIs
- Mentoring other developers on the team with code and design reviews
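The REST API responsibility above can be sketched with a framework-free dispatch table. Python is used here purely for brevity (the posting's stack is Golang/Node/PHP), and the route and handler are hypothetical:

```python
# Minimal REST-style dispatch: map (method, path) pairs to handlers.
# The "/users" route and its payload are invented for illustration.

ROUTES = {}

def route(method, path):
    """Decorator that registers a handler under (method, path)."""
    def register(fn):
        ROUTES[(method, path)] = fn
        return fn
    return register

@route("GET", "/users")
def list_users():
    return 200, [{"id": 1, "name": "Ada"}]

def dispatch(method, path):
    """Look up and invoke a handler, or return a 404 response."""
    handler = ROUTES.get((method, path))
    if handler is None:
        return 404, {"error": "not found"}
    return handler()

status, body = dispatch("GET", "/users")
```

Real frameworks (Express, Laravel, Gin) add parameter parsing and middleware on top, but the routing table is the core idea.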
What you need to have:
- Strong proficiency in the primary stack (Golang, Node.js, Express, ES6, Docker, AWS, PHP, Laravel, microservices, REST APIs)
- Strong proficiency in database tools (MongoDB, Mongoose, MySQL, Postgres, Eloquent, Sequelize, DynamoDB, Lucid models, PDO, Redis, Memcached, GraphQL)
- Experience implementing testing platforms and unit tests; proficiency with Git
- Proficiency in supporting tools (Ajax, Axios, TDD, OOP, MVC, jQuery, npm, Webpack, Guzzle, Git, HTML, CSS, Linux, Kubernetes, SVN, Blade, Ubuntu, PHPUnit, Jest, JIRA)
- Strong proficiency in AWS or similar environments (microservices, Docker, Lambda, S3, SQS)


Very good knowledge of Material Design components
Angular CLI and CDK
Creating new components
String interpolation and property binding
In-depth knowledge of the Angular core package
Two-way binding
Use cases for Angular lifecycle methods
Built-in directives and custom directives
Creating custom events and event binding between components
Communication between two custom components
Local references, and ViewChild and ContentChild options
Services and dependency injection, and their implementation in various use cases
Angular routers and routing
Observables, their use, and implementation
Pipes to transform output
Creating HTTP requests (API handling)
Dynamic components
Good knowledge of HTML5, CSS3, jQuery, and TypeScript
Proficient understanding of code versioning tools such as Git and SVN
Adherence to design best practices
Good written and verbal communication skills