50+ Data Visualization Jobs in India
Apply to 50+ Data Visualization Jobs on CutShort.io. Find your next job, effortlessly. Browse Data Visualization Jobs and apply today!
Job Title : Senior Full Stack Developer (Dashboard Applications)
Experience : 7+ Years
Location : Gurgaon (Mandatory)
Duration : 6 to 9 Months Contract (Initial 6 Months Commitment Mandatory)
Start Date : 1st April (Immediate Joiners Preferred)
Work Mode : Hybrid (Work From Home + 1 Day/Week at Home Office in Gurgaon)
Work Schedule : 45 Hours/Week + Mandatory Weekend Availability (Saturday & Sunday)
About the Role :
We are hiring a highly skilled Full Stack Developer based out of Gurgaon to work directly with the leadership team of a Cybersecurity Product Company.
This role focuses on building advanced, high-performance dashboard applications to significantly enhance the company’s product capabilities.
This is a high-ownership, high-impact role, where the selected candidate will work closely with the founder/leadership and contribute directly to core product innovation.
Mandatory Skills :
React + TypeScript, Dashboard/Data Visualization (ECharts/D3.js), Node.js (Fastify), Kafka (real-time streaming), ClickHouse, PostgreSQL, Redis, Microservices Architecture, Performance Optimization, Gurgaon-based candidate with 7+ years experience.
Important Note (Strict Requirements) :
- Candidate must be based in Gurgaon (No remote-only candidates)
- Must be comfortable working on weekends (Sat & Sun)
- Must commit to minimum 6 months
- Must be available for in-person collaboration (1 day/week at home office)
- No compromise on the above requirements
Key Responsibilities :
- Design and develop scalable, real-time dashboard applications
- Build data visualization systems using modern frontend frameworks
- Work on end-to-end full stack development (UI to backend APIs)
- Integrate large-scale data pipelines and streaming systems
- Optimize dashboards for performance, usability, and responsiveness
- Collaborate directly with leadership to translate business needs into technical solutions
- Contribute to architecture decisions and system design
Required Technical Skills :
Frontend :
- React.js (18+)
- TypeScript
- TailwindCSS
- State Management (Zustand / TanStack Query)
- Data Visualization:
- Apache ECharts
- D3.js
- Leaflet.js (Maps)
Backend :
- Node.js (20 LTS)
- Fastify (preferred)
- REST API Development
- Prisma ORM
Data & Streaming :
- Apache Kafka / KafkaJS
- Apache Flink (or Kafka Streams)
Databases :
- ClickHouse (must have for analytics use cases)
- PostgreSQL
- Redis
Good to Have :
- Graph DB (Neo4j)
- Object Storage (MinIO)
Machine Learning (Nice to Have) :
- Python (FastAPI)
- Basic experience with:
- scikit-learn
- pandas / numpy
- Understanding of anomaly detection systems
DevOps & Infrastructure :
- Docker & Kubernetes
- Terraform
- Nginx
- Monitoring: Prometheus, Grafana
- Observability: OpenTelemetry
- Authentication: Keycloak / JWT
Testing & Quality :
- Unit & Integration Testing:
- Vitest / Jest
- Supertest
- E2E Testing:
- Playwright
- Performance Testing:
- k6
What We’re Looking For :
- Strong experience in building complex dashboards or analytics platforms
- Ability to work independently as a single contributor
- High ownership and accountability mindset
- Strong problem-solving and system design skills
- Someone who is practical, execution-focused, and reliable
Why Join :
- Direct collaboration with leadership
- Opportunity to build cutting-edge cybersecurity dashboards
- High-impact role with end-to-end ownership
- Flexible work setup (with meaningful in-person collaboration)
Description
Power BI JD
Mandatory:
• 5+ years of Power BI Report development experience.
• Building Analysis Services reporting models.
• Developing visual reports, KPI scorecards, and dashboards using Power BI desktop.
• Connecting data sources, importing data, and transforming data for Business intelligence.
• Analytical thinking for translating data into informative reports and visuals.
• Capable of implementing row-level security on data along with an understanding of application security layer models in Power BI.
• Strong command of writing DAX queries in Power BI Desktop.
• Expert in applying advanced calculations to datasets.
• Responsible for design methodology and project documentation.
• Should be able to develop tabular and multidimensional models that are compatible with data warehouse standards.
• Very good communication skills; must be able to discuss requirements effectively with client teams and internal teams.
• Experience working with the Microsoft Business Intelligence stack: Power BI, SSAS, SSRS, and SSIS.
• Must have experience with BI tools and systems such as Power BI, Tableau, and SAP.
• Must have 3-4 years of experience in data-specific roles.
• Knowledge of database fundamentals such as multidimensional database design and relational database design.
• Knowledge of all the Power BI products (Power BI Premium, Power BI Report Server, Power BI Service, Power Query, etc.)
• Strong grip on data analytics.
• Interact with customers to understand their business problems and provide best-in-class analytics solutions
• Proficient in SQL and Query performance tuning skills
• Understand data governance, quality and security and integrate analytics with these corporate platforms
• Attention to detail and ability to deliver accurate client outputs
• Experience of working with large and multiple datasets / data warehouses
• Ability to derive insights from data and analysis and create presentations for client teams
• Experience with performance optimization of the dashboards
• Interact with UX/UI designers to create best-in-class visualizations for the business, harnessing all product capabilities.
• Resilience under pressure and against deadlines.
• Proactive attitude and an open outlook.
• Strong analytical problem-solving skills
• Skill in identifying data issues and anomalies during the analysis
• Strong business acumen and a demonstrated aptitude for analytics that incite action.
• Ability to execute on design requirements defined by business
• Ability to understand required Power BI functionality from wireframes/ requirement documents
• Ability to architect and design reporting solutions based on client needs.
• Being able to communicate with internal/external customers, desire to develop communication and client-facing skills.
• Ability to work seamlessly with MS Excel; working knowledge of pivot tables and related functions.
Good to have:
• Experience in working with Azure and connecting Synapse with Tableau
• Demonstrate strength in data modelling, ETL development, and data warehousing
• Knowledge of leading large-scale data warehousing and analytics projects using Azure, Synapse, MS SQL DB
• Good knowledge of building/operating highly available, distributed systems of data extraction, ingestion, and processing of large data sets
• Knowledge of the Supply Chain domain.
Role & Responsibilities
- Expertise with various Pharma datasets such as Sales, Rx, Claims, Specialty Pharmacy etc.
- Consulting with US pharma clients to plan, design and execute consulting projects.
- Understand business problems and requirements to recommend solutions.
- Designing dashboards with the use of visualization tools like Power BI, or Tableau
- Strong knowledge of Data Management concepts
- Optimize BI Solutions for performance and intuitiveness.
- Own accounts independently and ensure delivery excellence and quality
- Managing team of offshore Analysts for delivery operations
- Manage project deadlines and deliverables with minimal supervision.
- Develop solutions based on BI platforms such as custom extensions, and NLG/NLP-based solutions.
- Analyzing the data to identify trends and share insights.
- Providing support to pre-sales and marketing initiatives within Datazymes
- Participate in client discussions, demos, and Proof of Concepts development.
- Mentor team on data, technology, and business aspects to ensure seamless delivery.
Ideal Candidate
- Mandatory (Experience 1) : Must have 4+ years of Data Visualization / Data Analytics Dashboard experience, of which 2+ years are in the pharma industry
- Mandatory (Experience 2) : Must have 2+ years of hands-on experience working with pharma datasets (Sales, Rx, Claims, Specialty Pharmacy data)
- Mandatory (Skill 1) : Must have experience translating business requirements into analytics / BI solutions and generating actionable insights
- Mandatory (Skill 2) : Must have strong experience in BI tools (Tableau / Power BI / Qlik) including dashboard design, development, and optimization
- Mandatory (Skill 3) : Must have strong SQL skills and working knowledge of Python for data analysis
- Mandatory (Skill 4) : Must have strong understanding of commercial analytics and data modelling
- Mandatory (Client Exposure) : Must have experience working with external clients or stakeholders and participating in client discussions, demos, or consulting engagements
- Mandatory (Ownership) : Must have experience handling end-to-end project delivery, managing timelines, and owning accounts independently
- Mandatory (Company) : Pharma / life sciences analytics / consulting companies
- Mandatory (Note 1) : Hybrid, WFH flexibility 6 days a month
- Mandatory (Note 2) : CTC is inclusive of 10% variable
- Preferred (Skill 1) : Experience in predictive modeling (regression, classification, clustering)
- Preferred (Skill 2) : Exposure to advanced analytics (NLP/NLG, POCs, pre-sales support)
- Preferred (Experience) : Experience working with US pharma clients or global stakeholders
Position : Senior Power BI Developer
Experience : 4+ Years
Location : Ahmedabad - WFO
Key Responsibilities:
- Design, develop, and maintain interactive and user-friendly Power BI dashboards and reports.
- Translate business requirements into functional and technical specifications.
- Perform data modeling, DAX calculations, and Power Query transformations.
- Integrate data from multiple sources including SQL Server, Excel, SharePoint, and APIs.
- Optimize Power BI datasets, reports, and dashboards for performance and usability.
- Collaborate with business analysts, data engineers, and stakeholders to ensure data accuracy and relevance.
- Ensure security and governance best practices in Power BI workspaces and datasets.
- Provide ongoing support and troubleshooting for existing Power BI solutions.
- Stay updated with Power BI updates, best practices, and industry trends.
Required Skills & Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Data Analytics, or a related field.
- 4+ years of professional experience in data analytics or business intelligence.
- 3+ years of hands-on experience with Power BI (Power BI Desktop, Power BI Service).
- Strong expertise in DAX, Power Query (M Language), and data modeling (star/snowflake schema).
- Proficiency in writing complex SQL queries and optimizing them for performance.
- Experience in working with large and complex datasets.
- Experience in BigQuery, MySQL, or Looker Studio is a plus.
- Ecommerce Industry Experience will be an added advantage.
- Solid understanding of data warehousing concepts and ETL processes.
- Experience with Power Platform tools such as Power Apps & Power Automate would be a plus.
Preferred Qualifications:
- Microsoft Power BI Certification (PL-300 or equivalent) is a plus.
- Experience with Azure Data Services (Azure Data Factory, Azure SQL, Synapse).
- Knowledge of other BI tools (Tableau, Qlik) is a plus.
- Familiarity with scripting languages (Python, R) for data analysis is a bonus.
- Experience integrating Power BI into web portals using Power BI Embedded.
Job Description: Data Analyst Intern
Location: On-site, Bangalore
Duration: 6 months (Full-time)
About us:
- Optimo Capital is a newly established NBFC founded by Prashant Pitti, who is also a co-founder of EaseMyTrip (a billion-dollar listed startup that grew profitably without any funding).
- Our mission is to serve the underserved MSME businesses with their credit needs in India. With less than 15% of MSMEs having access to formal credit, we aim to bridge this credit gap through a phygital model (physical branches + digital decision-making). As a technology and data-first company, tech lovers and data enthusiasts play a crucial role in building the analytics & tech at Optimo that helps the company thrive.
What we offer:
- Join our dynamic startup team and play a crucial role in core data analytics projects involving credit risk, lending strategy, credit features analytics, collections, and portfolio management.
- The analytics team at Optimo works closely with the Credit & Risk departments, helping them make data-backed decisions.
- This is an exceptional opportunity to learn, grow, and make a significant impact in a fast-paced startup environment.
- We believe that the freedom and accountability to make decisions in analytics and technology brings out the best in you and helps us build the best for the company.
- This environment offers you a steep learning curve and an opportunity to experience the direct impact of your analytics contributions. Along with this, we offer industry-standard compensation.
What we look for:
- We are looking for individuals with a strong analytical mindset, high levels of initiative / ownership, ability to drive tasks independently, clear communication and comfort working across teams.
- We value not only your skills but also your attitude and hunger to learn, grow, lead, and thrive, both individually and as part of a team.
- We encourage you to take on challenges, bring in new ideas, implement them, and build the best analytics systems.
Key Responsibilities:
- Conduct analytical deep-dives such as funnel analysis, cohort tracking, branch-wise performance reviews, TAT analysis, portfolio diagnostic, credit risk analytics that lead to clear actions.
- Work closely with stakeholders to convert business questions into measurable analyses and clearly communicated outputs.
- Support digital underwriting initiatives, including assisting in the development and analysis of underwriting APIs that enable decisioning on borrower eligibility (“whom to lend”) and exposure sizing (“how much to lend”).
- Develop and maintain periodic MIS and KPI reporting for key business functions (e.g., pipeline, disbursals, TAT, conversion, collections performance, portfolio trends).
- Use Python (pandas, numpy) to clean, transform, and analyse datasets; automate recurring reports and data workflows.
- Perform basic scripting to support data validation, extraction, and lightweight automation.
Required Skills and Qualifications:
- Strong proficiency in Excel, including pivots, lookup functions, data cleaning, and structured analysis.
- Strong working knowledge of SQL, including joins, aggregations, CTEs, and window functions.
- Proficiency in Python for data analysis (pandas, numpy); ability to write clean, maintainable scripts/notebooks.
- Strong logical reasoning and attention to detail, including the ability to identify errors and validate results rigorously.
- Ability to work with ambiguous requirements and imperfect datasets while maintaining output quality.
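As a rough illustration of the Excel-to-Python skills listed above, the sketch below shows how a pivot table and a VLOOKUP translate into pandas; the dataset and column names are hypothetical, not from this posting.

```python
import pandas as pd

# Hypothetical loan data; column names are illustrative only.
loans = pd.DataFrame({
    "branch": ["BLR", "BLR", "DEL", "DEL"],
    "status": ["disbursed", "pipeline", "disbursed", "disbursed"],
    "amount": [120000, 80000, 95000, 110000],
})
branches = pd.DataFrame({"branch": ["BLR", "DEL"], "region": ["South", "North"]})

# Pivot-table equivalent: total amount by branch and status.
pivot = loans.pivot_table(index="branch", columns="status",
                          values="amount", aggfunc="sum", fill_value=0)

# VLOOKUP equivalent: enrich each loan row with its branch's region.
enriched = loans.merge(branches, on="branch", how="left")
```

The same pattern scales from spreadsheet-sized data to the MIS/KPI reports described in the responsibilities.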
Preferred (Good to Have):
- REST APIs: A fundamental understanding of APIs and previous experience or projects related to API development/integrations.
- Familiarity with basic AWS tools/services: (S3, lambda, EC2, Glue Jobs).
- Experience with Git and basic engineering practices.
- Any experience with the lending/finance industry.
JOB DETAILS:
* Job Title: Head - Visual Communication (Consumer Electronics)
* Industry: ECommerce and Electronics Industry
* Salary: Best in Industry
* Experience: 10-15 years
* Location: Gurugram
Role & Responsibilities
- Head and manage work intake and the overall design project assignment process.
- Interpreting abstract business concepts and turning them into creative ideas.
- Head and direct the team, providing key ideas, methods, and brand positioning.
- Developing strategic design plans with projected timelines.
- Pitching ideas and the creative vision and communicating the project outline to the design team.
- Choosing the design elements for different projects.
- Overseeing the design projects, from start to finish, and monitoring the team members.
- Analyzing market research to create more effective designs.
Ideal Candidate
- Strong Creative Director or Design Lead profiles
- Must have minimum 10+ YOE in Visual / Graphic Design, Branding and Marketing Campaigns
- Must have strong experience in brand campaigns for Consumer Electronics / Durable products (like smartphones, smartwatches, consumer electronics) or Automobile brands (review the clients/brands worked for)
- Must be a Design focused profile, not copywriting focused
- Must be managing a Creative team currently (Lead or Above in Current role)
- (Portfolio) - Very Strong portfolio of Visual Design / branding works for Physical Consumer Products (Candidate should demonstrate strong portfolio evidence of creative direction, with competency in ideation, visualization, and Design)
- Candidates with international exposure and experience on global brands are highly preferred.
Job Description: Data Analyst
About the Role
We are seeking a highly skilled Data Analyst with strong expertise in SQL/PostgreSQL, Python (Pandas), Data Visualization, and Business Intelligence tools to join our team. The candidate will be responsible for analyzing large-scale datasets, identifying trends, generating actionable insights, and supporting business decisions across marketing, sales, operations, and customer experience.
Key Responsibilities
- Data Extraction & Management
- Write complex SQL queries in PostgreSQL to extract, clean, and transform large datasets.
- Ensure accuracy, reliability, and consistency of data across different platforms.
- Data Analysis & Insights
- Conduct deep-dive analyses to understand customer behavior, funnel drop-offs, product performance, campaign effectiveness, and sales trends.
- Perform cohort, LTV (lifetime value), retention, and churn analysis to identify opportunities for growth.
- Provide recommendations to improve conversion rates, average order value (AOV), and repeat purchase rates.
- Business Intelligence & Visualization
- Build and maintain interactive dashboards and reports using BI tools (e.g., Power BI, Metabase, or Looker).
- Create visualizations that simplify complex datasets for stakeholders and management.
- Python (Pandas)
- Use Python (Pandas, NumPy) for advanced analytics.
- Collaboration & Stakeholder Management
- Work closely with product, operations, and leadership teams to provide insights that drive decision-making.
- Communicate findings in a clear, concise, and actionable manner to both technical and non-technical stakeholders.
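The cohort and retention analysis mentioned in the responsibilities can be sketched in a few lines of pandas; the order log below is a made-up example, assuming one row per (user, active month).

```python
import pandas as pd

# Hypothetical order log: one row per (user, active month).
orders = pd.DataFrame({
    "user_id":     [1, 1, 2, 2, 3, 3, 4],
    "order_month": ["2024-01", "2024-02", "2024-01",
                    "2024-03", "2024-02", "2024-03", "2024-03"],
})

# A user's cohort is the month of their first order.
orders["cohort"] = orders.groupby("user_id")["order_month"].transform("min")

# Retention matrix: distinct active users per cohort per month.
retention = (orders.groupby(["cohort", "order_month"])["user_id"]
                   .nunique()
                   .unstack(fill_value=0))
```

Dividing each row by its cohort size would turn the counts into the retention percentages typically plotted on a dashboard.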
Required Skills
- SQL/PostgreSQL
- Complex joins, window functions, CTEs, aggregations, query optimization.
- Python (Pandas & Analytics)
- Data wrangling, cleaning, transformations, exploratory data analysis (EDA).
- Libraries: Pandas, NumPy, Matplotlib, Seaborn
- Data Visualization & BI Tools
- Expertise in creating dashboards and reports using Metabase or Looker.
- Ability to translate raw data into meaningful visual insights.
- Business Intelligence
- Strong analytical reasoning to connect data insights with e-commerce KPIs.
- Experience in funnel analysis, customer journey mapping, and retention analysis.
- Analytics & E-commerce Knowledge
- Understanding of metrics like CAC, ROAS, LTV, churn, contribution margin.
- General Skills
- Strong communication and presentation skills.
- Ability to work cross-functionally in fast-paced environments.
- Problem-solving mindset with attention to detail.
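To make the CTE and window-function requirements above concrete, here is a minimal sketch using Python's built-in sqlite3 module (window functions need SQLite 3.25+, bundled with modern Python); the table and values are invented, and production PostgreSQL syntax for this query is the same.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, month TEXT, revenue REAL)")
con.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("North", "2024-01", 100), ("North", "2024-02", 150),
     ("South", "2024-01", 80), ("South", "2024-02", 120)],
)

# CTE + window function: per-region running revenue total.
rows = con.execute("""
    WITH monthly AS (
        SELECT region, month, SUM(revenue) AS revenue
        FROM sales
        GROUP BY region, month
    )
    SELECT region, month, revenue,
           SUM(revenue) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM monthly
    ORDER BY region, month
""").fetchall()
```

Each region's `running_total` accumulates month by month independently of the other regions, which is exactly the kind of trend query behind a sales dashboard.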
Education: Bachelor’s degree in Data Science, Computer Science, or a related data field
Role: Creative Producer – Web Series
Industry: Media and Entertainment Industry
Function: Product Management
Age: Up to 38 Years
City/State: Noida
Education: Graduation
Work Mode : ONSITE
Working Days : 5
Required Skills: Creative Vision & IP Development, End-to-End Content Production Leadership, Cinematic Storytelling & Audience Psychology, Team Leadership & Cross-Functional Collaboration, Data-Driven Content Strategy
Description :
Job Overview:
The Creative Producer – Web Series will lead the overall creative vision, strategy, and execution of Web Series content across all formats. The role demands a visionary storyteller who deeply understands cinematic language, audience psychology, and cultural nuances, while maintaining creative excellence and market relevance.
The Creative Producer will be responsible for ideating and developing high-impact original IPs, supervising the full content lifecycle, and ensuring every story aligns with Company’s mission — “Entertainment First in Culture.”
Key Responsibilities:
1. Creative Strategy & Vision:
● Drive the creative vision for company Web Series, ensuring all projects align with the platform’s cultural and entertainment goals.
● Lead ideation and development of original IPs across genres with strong emotional and cinematic value.
● Work closely with screenwriters, directors, and producers to develop highly engaging and culturally resonant scripts.
● Ensure creative consistency, strong storytelling structure, and high production quality across all stages — from concept to scripting, production, post, and release.
2. Content Development & Execution:
● Responsible for content commissioning as per Company’s overall strategy and creative roadmap.
● Ensure smooth execution of projects within defined budget and TATs, maintaining excellence in storytelling and visual quality.
● Maintain a yearly content pipeline, pre-planning content commissioning and licensing in sync with business and cultural objectives.
● Supervise the pre-production, production, and post-production stages of Web Series content to ensure alignment with creative vision.
3. Team Supervision & Collaboration:
● Lead and mentor a team of associate creative producers, writers, producers, and content strategists, fostering innovation and creative ownership.
● Supervise cross-functional teams (internal & external) through all phases — from ideation and scripting to marketing and release.
● Collaborate with marketing, insights, and analytics teams to align creative output with performance goals and audience insights.
4. Platform & Audience Insights:
● Monitor content trends, audience behavior, and performance data to refine content strategies.
● Scout emerging formats, genres, creators, and storytelling trends to keep the company ahead of the curve.
● Analyze content performance metrics and deliver stories with high engagement and completion rates.
● Identify opportunities for IP expansion, spin-offs, or multi-platform storytelling to strengthen the Company's brand footprint.
Qualifications & Skillset:
Creative & Storytelling Skills:
● In-depth understanding of characters, emotions, and visual storytelling.
● Strong grasp of concept selection, narrative pacing, and audience connection.
● Proven ability to develop culturally rooted, high-engagement content.
Technical & Process Understanding
● Deep understanding of Pre-Production (casting, budgeting, locations, costumes, treatment).
● Strong knowledge of Post-Production (editing, sound design, BGM, DI, VFX).
● Ability to maintain storytelling quality across production pipelines.
Strategic & Analytical Strengths:
● Strong understanding of market trends and audience insights.
● Skilled in using data and analytics to shape creative decisions and assess content performance.
Leadership & Decision-Making
● Proactive problem-solver with strong creative judgment.
● Excellent team management and multi-tasking capabilities.
● Ability to balance creative ambition with business strategy and timelines.
Experience:
● Minimum 8+ years of experience in web series creation and creative leadership roles.
● At least 5+ years of on-ground experience in Web Series production and 3+ years in OTT/platform-based content creation.
● High on learning, high on passion, and driven by creative innovation.
Cultural Understanding:
● Deeply rooted understanding of regional culture or broader Hindi-speaking belt sensibilities.
● Ability to translate cultural authenticity into mass, relatable storytelling.
Ideal Candidate:
This role is ideal for a visionary creative leader who lives and breathes cinema — someone with the rare blend of creative instinct, analytical acumen, and cultural depth. The ideal candidate will be passionate about building a cinematic universe that entertains, represents, and elevates regional India on a global scale.
Senior Full Stack Developer – Analytics Dashboard
Job Summary
We are seeking an experienced Full Stack Developer to design and build a scalable, data-driven analytics dashboard platform. The role involves developing a modern web application that integrates with multiple external data sources, processes large datasets, and presents actionable insights through interactive dashboards.
The ideal candidate should be comfortable working across the full stack and have strong experience in building analytical or reporting systems.
Key Responsibilities
- Design and develop a full-stack web application using modern technologies.
- Build scalable backend APIs to handle data ingestion, processing, and storage.
- Develop interactive dashboards and data visualisations for business reporting.
- Implement secure user authentication and role-based access.
- Integrate with third-party APIs using OAuth and REST protocols.
- Design efficient database schemas for analytical workloads.
- Implement background jobs and scheduled tasks for data syncing.
- Ensure performance, scalability, and reliability of the system.
- Write clean, maintainable, and well-documented code.
- Collaborate with product and design teams to translate requirements into features.
Required Technical Skills
Frontend
- Strong experience with React.js
- Experience with Next.js
- Knowledge of modern UI frameworks (Tailwind, MUI, Ant Design, etc.)
- Experience building dashboards using chart libraries (Recharts, Chart.js, D3, etc.)
Backend
- Strong experience with Node.js (Express or NestJS)
- REST and/or GraphQL API development
- Background job systems (cron, queues, schedulers)
- Experience with OAuth-based integrations
Database
- Strong experience with PostgreSQL
- Data modelling and performance optimisation
- Writing complex analytical SQL queries
DevOps / Infrastructure
- Cloud platforms (AWS)
- Docker and basic containerisation
- CI/CD pipelines
- Git-based workflows
Experience & Qualifications
- 5+ years of professional full stack development experience.
- Proven experience building production-grade web applications.
- Prior experience with analytics, dashboards, or data platforms is highly preferred.
- Strong problem-solving and system design skills.
- Comfortable working in a fast-paced, product-oriented environment.
Nice to Have (Bonus Skills)
- Experience with data pipelines or ETL systems.
- Knowledge of Redis or caching systems.
- Experience with SaaS products or B2B platforms.
- Basic understanding of data science or machine learning concepts.
- Familiarity with time-series data and reporting systems.
- Familiarity with meta ads/Google ads API
Soft Skills
- Strong communication skills.
- Ability to work independently and take ownership.
- Attention to detail and focus on code quality.
- Comfortable working with ambiguous requirements.
Ideal Candidate Profile (Summary)
A senior-level full stack engineer who has built complex web applications, understands data-heavy systems, and enjoys creating analytical products with a strong focus on performance, scalability, and user experience.
JOB DETAILS:
- Job Title: Senior Business Analyst
- Industry: Ride-hailing
- Experience: 4-7 years
- Working Days: 5 days/week
- Work Mode: ONSITE
- Job Location: Bangalore
- CTC Range: Best in Industry
Required Skills: Data Visualization, Data Analysis, Strong in Python and SQL, Cross-Functional Communication & Stakeholder Management
Criteria:
1. Candidate must have 4–7 years of experience in analytics / business analytics roles.
2. Candidate must be currently based in Bangalore only (no relocation allowed).
3. Candidate must have hands-on experience with Python and SQL.
4. Candidate must have experience working with databases/APIs (Mongo, Presto, REST or similar).
5. Candidate must have experience building dashboards/visualizations (Tableau, Metabase or similar).
6. Candidate must be available for face-to-face interviews in Bangalore.
7. Candidate must have experience working closely with business, product, and operations teams.
Description
Job Responsibilities:
● Acquiring data from primary/secondary data sources like MongoDB, Presto, and REST APIs.
● Candidate must have strong hands-on experience in Python and SQL.
● Build visualizations to communicate data to key decision-makers and preferably familiar with building interactive dashboards in Tableau/Metabase
● Establish relationships between output metrics and their drivers to identify the critical drivers, then control those drivers to achieve the desired value of the output metric
● Partner with operations/business teams to consult, develop and implement KPIs, automated reporting/process solutions, and process improvements to meet business needs
● Collaborate with business owners and product teams to perform data analysis of experiments and recommend the next best action for the business; this involves being embedded in business decision teams to drive faster decision-making
● Collaborate with several functional teams within the organization, using raw data and metrics to back up assumptions, develop hypotheses/business cases, and complete root cause analyses, thereby delivering output to business users
Job Requirements:
● Undergraduate and/or graduate degree in Math, Economics, Statistics, Engineering, Computer Science, or other quantitative field.
● Around 4-6 years of experience being embedded in analytics and adjacent business teams working as analyst aiding decision making
● Proficiency in Excel and ability to structure and present data in creative ways to drive insights
● Some basic understanding of (or experience in) evaluating financial parameters like return-on-investment (ROI), cost allocation, optimization, etc. is good to have
What’s there for you?
● Opportunity to understand the overall business & collaborate across all functional departments
● Prospect to disrupt the existing mobility industry business models (ideate, pilot, monitor & scale)
● Deal with the ambiguity of decision making while balancing long-term/strategic business needs and short-term/tactical moves
● Full business ownership working style which translates to freedom to pick problem statements/workflow and self-driven culture
The Power BI Intern will assist the analytics team in using Microsoft Power BI to create interactive dashboards and reports. Working with real datasets to support informed business decision-making, this position provides practical exposure to data analysis, visualization, and business intelligence techniques.
About the Role:
We are looking for a detail-oriented and analytical Senior Data Analyst to join our Chennai team. The ideal candidate will have strong skills in T-SQL/PL-SQL, data platform management, and reporting tools, along with experience in database administration and performance tuning. You will play a key role in transforming raw data into meaningful insights that drive business decisions.
Key Responsibilities:
- Write complex queries using T-SQL / PL-SQL to extract, manipulate, and analyze data.
- Develop and maintain reports, dashboards, and visualizations using modern reporting platforms (Power BI, SSRS, etc.).
- Manage and maintain the data platform to ensure availability, accuracy, and security.
- Support database administration tasks including backups, restores, access management, and monitoring.
- Perform performance tuning of SQL queries and databases to ensure optimal response times.
- Collaborate with cross-functional teams to understand data requirements and deliver actionable insights.
- Ensure data integrity and consistency across all reporting systems.
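For illustration, the kind of windowed reporting query this role describes (ranking within groups for a report or dashboard) can be sketched in standard SQL, run here through Python's sqlite3. The table and column names are hypothetical, and the syntax is generic SQL rather than T-SQL or PL-SQL specifically:

```python
import sqlite3

# Hypothetical sales table; names and figures are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, rep TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('North', 'Asha', 120), ('North', 'Ravi', 90),
        ('South', 'Meena', 200), ('South', 'Karthik', 150);
""")

# Rank reps within each region by sales amount -- a typical
# windowed aggregation behind a regional performance report.
rows = conn.execute("""
    SELECT region, rep, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()

for row in rows:
    print(row)
```

In T-SQL or PL/SQL the window-function syntax is essentially the same; what differs is the surrounding procedural layer (stored procedures, cursors, error handling).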
Mandatory Skills:
- T-SQL / PL-SQL – Advanced query writing and stored procedure development.
- Reporting Platforms – Experience with tools like Power BI, SSRS, Tableau, etc.
- Data Platform Management – Knowledge of modern data ecosystems (SQL Server, Oracle, Azure Data Services, etc.).
- Database Administration – Proficient in handling database maintenance, security, and monitoring.
- Performance Tuning – Strong understanding of optimizing SQL queries and database performance.
Preferred Qualifications:
- Experience with cloud-based data services (Azure, AWS, etc.)
- Knowledge of ETL tools and data warehousing concepts
- Understanding of data governance and compliance
What We Offer:
- Great Culture – A workplace that feels like home! We believe in collaboration, trust, and creating a supportive environment.
- Growth Opportunities – We invest in your learning and development to help you grow both personally and professionally.
- Health & Wellness – Comprehensive wellness programs and benefits to support your well-being.
- Performance Recognition – We celebrate achievements and recognize efforts through regular rewards and appreciation.
Drop us a message or apply now to be part of something big.
Kindly mention your:
- Current Location
- Total Experience
- Current CTC
- Expected CTC (ECTC)
- Notice Period
Let's Grow Together
Job Type: Full-time
Benefits:
- Food provided
- Health insurance
- Paid sick time
- Paid time off
- Provident Fund
Schedule:
- Day shift
We are seeking a motivated Data Analyst to support business operations by analyzing data, preparing reports, and delivering meaningful insights. The ideal candidate should be comfortable working with data, identifying patterns, and presenting findings in a clear and actionable way.
Key Responsibilities:
- Collect, clean, and organize data from internal and external sources
- Analyze large datasets to identify trends, patterns, and opportunities
- Prepare regular and ad-hoc reports for business stakeholders
- Create dashboards and visualizations using tools like Power BI or Tableau
- Work closely with cross-functional teams to understand data requirements
- Ensure data accuracy, consistency, and quality across reports
- Document data processes and analysis methods

Global digital transformation solutions provider.
Job Description – Senior Technical Business Analyst
Location: Trivandrum (Preferred) | Open to any location in India
Shift Timings: an 8-hour window between 7:30 PM IST and 4:30 AM IST
About the Role
We are seeking highly motivated and analytically strong Senior Technical Business Analysts who can work seamlessly with business and technology stakeholders to convert a one-line problem statement into a well-defined project or opportunity. This role suits candidates with a strong foundation in data analytics, data engineering, data visualization, and data science, along with a strong drive to learn, collaborate, and grow in a dynamic, fast-paced environment.
As a Technical Business Analyst, you will be responsible for translating complex business challenges into actionable user stories, analytical models, and executable tasks in Jira. You will work across the entire data lifecycle—from understanding business context to delivering insights, solutions, and measurable outcomes.
Key Responsibilities
Business & Analytical Responsibilities
- Partner with business teams to understand one-line problem statements and translate them into detailed business requirements, opportunities, and project scope.
- Conduct exploratory data analysis (EDA) to uncover trends, patterns, and business insights.
- Create documentation including Business Requirement Documents (BRDs), user stories, process flows, and analytical models.
- Break down business needs into concise, actionable, and development-ready user stories in Jira.
Data & Technical Responsibilities
- Collaborate with data engineering teams to design, review, and validate data pipelines, data models, and ETL/ELT workflows.
- Build dashboards, reports, and data visualizations using leading BI tools to communicate insights effectively.
- Apply foundational data science concepts such as statistical analysis, predictive modeling, and machine learning fundamentals.
- Validate and ensure data quality, consistency, and accuracy across datasets and systems.
Collaboration & Execution
- Work closely with product, engineering, BI, and operations teams to support the end-to-end delivery of analytical solutions.
- Assist in development, testing, and rollout of data-driven solutions.
- Present findings, insights, and recommendations clearly and confidently to both technical and non-technical stakeholders.
Required Skillsets
Core Technical Skills
- 6+ years of Technical Business Analyst experience within an overall professional experience of 8+ years
- Data Analytics: SQL, descriptive analytics, business problem framing.
- Data Engineering (Foundational): Understanding of data warehousing, ETL/ELT processes, cloud data platforms (AWS/GCP/Azure preferred).
- Data Visualization: Experience with Power BI, Tableau, or equivalent tools.
- Data Science (Basic/Intermediate): Python/R, statistical methods, fundamentals of ML algorithms.
Soft Skills
- Strong analytical thinking and structured problem-solving capability.
- Ability to convert business problems into clear technical requirements.
- Excellent communication, documentation, and presentation skills.
- High curiosity, adaptability, and eagerness to learn new tools and techniques.
Educational Qualifications
- BE/B.Tech or equivalent in:
- Computer Science / IT
- Data Science
What We Look For
- Demonstrated passion for data and analytics through projects and certifications.
- Strong commitment to continuous learning and innovation.
- Ability to work both independently and in collaborative team environments.
- Passion for solving business problems using data-driven approaches.
- Proven ability (or aptitude) to convert a one-line business problem into a structured project or opportunity.
Why Join Us?
- Exposure to modern data platforms, analytics tools, and AI technologies.
- A culture that promotes innovation, ownership, and continuous learning.
- Supportive environment to build a strong career in data and analytics.
Skills: Data Analytics, Business Analysis, SQL
Must-Haves
Technical Business Analyst (6+ years), SQL, Data Visualization (Power BI, Tableau), Data Engineering (ETL/ELT, cloud platforms), Python/R
What you will do:-
● Partnering with Product Managers and cross-functional teams to define metrics, build dashboards, and track product performance.
● Conducting deep-dive analyses of large-scale data to identify trends, user behavior patterns, growth gaps, and improvement opportunities.
● Performing competitive benchmarking and industry research to support product strategy and prioritization.
● Generating data-backed insights to drive feature enhancements, product experiments, and business decisions.
● Tracking post-launch impact by measuring adoption, engagement, retention, and ROI of new features.
● Working with Data, Engineering, Business, and Ops teams to design and measure experiments (A/B tests, cohorts, funnels).
● Creating reports, visualizations, and presentations that simplify complex data for stakeholders and leadership.
● Supporting the product lifecycle with relevant data inputs during research, ideation, launch, and optimization phases.
What we are looking for:-
● Bachelor’s degree in engineering, statistics, business, economics, mathematics, data science, or a related field.
● Strong analytical, quantitative, and problem-solving skills.
● Proficiency in SQL and ability to work with large datasets.
● Experience with data visualization/reporting tools (e.g., Excel, Google Sheets, Power BI, Tableau, Looker, Mixpanel, GA).
● Excellent communication skills — able to turn data into clear narratives and actionable recommendations.
● Ability to work collaboratively in cross-functional teams.
● Passion for product, user behavior, and data-driven decision-making.
● Prior internship or work experience in product analytics, business analysis, consulting, or growth teams.
● Familiarity with experimentation techniques (A/B testing, funnels, cohorts, retention metrics).
● Understanding of product management concepts and tools (Jira, Confluence, etc.).
● Knowledge of Python or R for data analysis (optional but beneficial).
● Exposure to consumer tech, mobility, travel, or marketplaces.
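The experimentation techniques listed above usually come down to comparing conversion rates between two variants. A minimal sketch of the significance check behind a simple A/B test, using only the standard library (the counts are hypothetical), might look like:

```python
from math import sqrt, erf

def ab_test_pvalue(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for an A/B experiment."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical experiment: 120/1000 conversions in A vs. 150/1000 in B.
p = ab_test_pvalue(120, 1000, 150, 1000)
print(f"p-value = {p:.4f}")
```

In practice a library such as statsmodels would be used for this, but the underlying test is the one sketched here.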
MANDATORY CRITERIA:
- Candidate must be a graduate of IIT, NIT, NSUT, or DTU.
- 1–2 years of pure Product Analyst experience is mandatory.
- Strong hands-on experience in product and data analysis with Python.
- Python skill of at least 3 on a scale of 5.
- Proficiency in SQL and ability to work with large datasets.
- Experience with A/B testing, cohorts, funnels, retention, and product metrics.
- Hands-on experience with data visualization tools (Tableau, Power BI, Looker, Mixpanel, GA, etc.).
- Experience with Jira.
- Strong communication skills with the ability to work with Product, Engineering, Business, and Ops teams.
OVERVIEW:
As a Product Analyst, you will play a critical role in driving product decisions through data insights and operational understanding. You’ll work closely with Product Managers, Engineering, Business, and Operations teams to analyze user behavior, monitor feature performance, and identify opportunities that accelerate growth, improve user experience, and increase revenue. Your focus will be on translating data into actionable strategies, supporting product roadmaps, and enabling informed decision-making across demand-side projects and operations.
About Vijay Sales
Vijay Sales is one of India’s leading electronics retail brands with 160+ stores nationwide and a fast-growing digital presence. We are on a mission to build the most advanced data-driven retail intelligence ecosystem—using AI, predictive analytics, LLMs, and real-time automation to transform customer experience, supply chain, and omnichannel operations.
Role Overview
We are looking for a highly capable AI Engineer who is passionate about building production-grade AI systems, designing scalable ML architecture, and working with cutting-edge AI/ML tools. This role involves hands-on work with Databricks, SQL, PySpark, modern LLM/GenAI frameworks, and full lifecycle ML system design.
Key Responsibilities
Machine Learning & AI Development
- Build, train, and optimize ML models for forecasting, recommendation, personalization, churn prediction, inventory optimization, anomaly detection, and pricing intelligence.
- Develop GenAI solutions using modern LLM frameworks (e.g., LangChain, LlamaIndex, HuggingFace Transformers).
- Explore and implement RAG (Retrieval Augmented Generation) pipelines for product search, customer assistance, and support automation.
- Fine-tune LLMs on company-specific product and sales datasets (using QLoRA, PEFT, and Transformers).
- Develop scalable feature engineering pipelines leveraging Delta Lake and Databricks Feature Store.
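The RAG pipelines mentioned above hinge on a retrieval step: embedding the query and ranking documents by vector similarity before augmenting the prompt. A minimal sketch in plain Python, with made-up toy vectors standing in for real embedding-model output:

```python
import math

# Toy embedding store: the vectors are invented for illustration; in a
# real RAG pipeline they would come from an embedding model.
docs = {
    "return policy":   [0.9, 0.1, 0.0],
    "warranty terms":  [0.7, 0.3, 0.1],
    "store locations": [0.0, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, k=2):
    """Top-k documents by cosine similarity -- the retrieval step that
    precedes prompt augmentation in a RAG pipeline."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

print(retrieve([0.8, 0.2, 0.05]))
```

Production systems replace the dictionary with a vector database and add chunking, reranking, and prompt assembly, but the ranking logic is the same.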
Databricks / Data Engineering
- Build end-to-end ML workflows on Databricks using PySpark, MLflow, Unity Catalog, Delta Live Tables.
- Optimize Databricks clusters for cost, speed, and stability.
- Maintain reusable notebooks and parameterized pipelines for model ingestion, validation, and deployment.
- Use MLflow for tracking experiments, model registry, and lifecycle management.
Data Handling & SQL
- Write advanced SQL for multi-source data exploration, aggregation, and anomaly detection.
- Work on large, complex datasets from ERP, POS, CRM, Website, and Supply Chain systems.
- Automate ingestion of streaming and batch data into Databricks pipelines.
Deployment & MLOps
- Deploy ML models using REST APIs, Databricks Model Serving, Docker, or cloud-native endpoints.
- Build CI/CD pipelines for ML using GitHub Actions, Azure DevOps, or Databricks Workflows.
- Implement model monitoring for drift, accuracy decay, and real-time alerts.
- Maintain GPU/CPU environments for training workflows.
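One common way to implement the drift monitoring listed above is the Population Stability Index (PSI), which compares the binned score distribution in production against the one seen at training time. A minimal sketch with hypothetical distributions:

```python
import math

def psi(expected, actual):
    """Population Stability Index between two binned distributions
    (each a list of proportions summing to 1). A PSI above ~0.2 is a
    common rule-of-thumb alarm threshold for drift."""
    return sum((a - e) * math.log(a / e) for e, a in zip(expected, actual))

# Hypothetical score distributions: training-time vs. production.
train = [0.25, 0.25, 0.25, 0.25]
prod  = [0.10, 0.20, 0.30, 0.40]
print(f"PSI = {psi(train, prod):.3f}")
```

A monitoring job would compute this per feature and per model score on a schedule, raising a real-time alert when the threshold is crossed.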
Must-Have Technical Skills
Core AI/ML
- Strong fundamentals in machine learning: regression, classification, time-series forecasting, clustering.
- Experience in deep learning using PyTorch or TensorFlow/Keras.
- Expertise in LLMs, embeddings, vector databases, and GenAI architecture.
- Hands-on experience with HuggingFace, embedding models, and RAG.
Databricks & Big Data
- Hands-on experience with Databricks (PySpark, SQL, Delta Lake, MLflow, Feature Store).
- Strong understanding of Spark execution, partitioning, and optimization.
Programming
- Strong proficiency in Python.
- Experience writing high-performance SQL with window functions, CTEs, and analytical queries.
- Knowledge of Git, CI/CD, REST APIs, and Docker.
MLOps & Production Engineering
- Experience deploying models to production and monitoring them.
- Familiarity with tools like MLflow, Weights & Biases, or SageMaker equivalents.
- Experience in building automated training pipelines and handling model drift/feedback loops.
Preferred Domain Experience
- Retail/e-commerce analytics
- Demand forecasting
- Inventory optimization
- Customer segmentation & personalization
- Price elasticity and competitive pricing
About Ven Analytics
At Ven Analytics, we don’t just crunch numbers — we decode them to uncover insights that drive real business impact. We’re a data-driven analytics company that partners with high-growth startups and enterprises to build powerful data products, business intelligence systems, and scalable reporting solutions. With a focus on innovation, collaboration, and continuous learning, we empower our teams to solve real-world business problems using the power of data.
Role Overview
We’re looking for a Power BI Data Analyst who is not just proficient in the tools but passionate about building insightful, scalable, and high-performing dashboards. The ideal candidate has strong fundamentals in data modeling, a flair for storytelling through data, and the technical skills to implement robust data solutions using Power BI, Python, and SQL.
Key Responsibilities
- Technical Expertise: Develop scalable, accurate, and maintainable data models using Power BI, with a clear understanding of Data Modeling, DAX, Power Query, and visualization principles.
- Programming Proficiency: Use SQL and Python for complex data manipulation, automation, and analysis.
- Business Problem Translation: Collaborate with stakeholders to convert business problems into structured data-centric solutions considering performance, scalability, and commercial goals.
- Hypothesis Development: Break down complex use-cases into testable hypotheses and define relevant datasets required for evaluation.
- Solution Design: Create wireframes, proof-of-concepts (POC), and final dashboards in line with business requirements.
- Dashboard Quality: Ensure dashboards meet high standards of data accuracy, visual clarity, performance, and support SLAs.
- Performance Optimization: Continuously enhance user experience by improving performance, maintainability, and scalability of Power BI solutions.
- Troubleshooting & Support: Quick resolution of access, latency, and data issues as per defined SLAs.
- Power BI Development: Use Power BI Desktop for report building and Power BI Service for distribution.
- Backend Development: Develop optimized SQL queries that are easy to consume, maintain, and debug.
- Version Control: Strict control of versions by tracking CRs and bug fixes, and maintaining separate Prod and Dev dashboards.
- Client Servicing: Engage with clients to understand their data needs, gather requirements, present insights, and ensure timely, clear communication throughout project cycles.
- Team Management: Lead and mentor a small team by assigning tasks, reviewing work quality, guiding technical problem-solving, and ensuring timely delivery of dashboards and reports.
Must-Have Skills
- Strong experience building robust data models in Power BI
- Hands-on expertise with DAX (complex measures and calculated columns)
- Proficiency in M Language (Power Query) beyond drag-and-drop UI
- Clear understanding of data visualization best practices (less fluff, more insight)
- Solid grasp of SQL and Python for data processing
- Strong analytical thinking and ability to craft compelling data stories
- Background in client servicing
Good-to-Have (Bonus Points)
- Experience using DAX Studio and Tabular Editor
- Prior work in a high-volume data processing production environment
- Exposure to modern CI/CD practices or version control with BI tools
Why Join Ven Analytics?
- Be part of a fast-growing startup that puts data at the heart of every decision.
- Opportunity to work on high-impact, real-world business challenges.
- Collaborative, transparent, and learning-oriented work environment.
- Flexible work culture and focus on career development.
Experience- 6 to 8 years
Location- Bangalore
Job Description-
- Extensive experience with machine learning using current analytical models in Python, i.e., experience generating data-driven insights that play a key role in rapid decision-making and in driving business outcomes.
- Extensive experience using Tableau, table design, PowerApps, Power BI, Power Automate, and cloud environments, or equivalent experience designing and implementing data analysis pipelines and visualizations.
- Extensive experience using AI agent platforms for data analysis.
- A statistics degree or an equivalent ability to interpret statistical analysis results.
Review Criteria
- Strong Lead – User Research & Analyst profile (behavioural/user/product/ux analytics)
- 10+ years of experience in Behavioral Data Analytics, User Research, or Product Insights, driving data-informed decision-making for B2C digital products (web and app).
- 6 months+ experience in analyzing user journeys, clickstream, and behavioral data using tools such as Google Analytics, Mixpanel, CleverTap, Firebase, or Amplitude.
- Experience in leading cross-functional user research and analytics initiatives in collaboration with Product, Design, Engineering, and Business teams to translate behavioral insights into actionable strategies.
- Strong expertise in A/B testing and experimentation, including hypothesis design, execution, statistical validation, and impact interpretation.
- Ability to identify behavioral patterns, funnel drop-offs, engagement trends, and user journey anomalies using large datasets and mixed-method analysis.
- Hands-on proficiency in SQL, Excel, and data visualization/storytelling tools such as Tableau, Power BI, or Looker for executive reporting and dashboard creation.
- Deep understanding of UX principles, customer journey mapping, and product experience design, with experience integrating qualitative and quantitative insights.
Preferred
- Ability to build insightful dashboards and executive reports highlighting user engagement, retention, and behavioral metrics; familiarity with mixed-method research, AI-assisted insight tools (Dovetail, EnjoyHQ, Qualtrics, UserZoom), and mentoring junior researchers
Job Specific Criteria
- CV Attachment is mandatory
- We work on alternate Saturdays. Are you comfortable working from home on the 1st and 4th Saturdays?
Role & Responsibilities
Product Conceptualization & UX Strategy Development:
- Conceptualize customer experience strategies
- Collaborate with product managers to conceptualize new products & align UX with product roadmaps.
- Develop and implement UX strategies that align with business objectives.
- Stay up-to-date with industry trends and best practices in UX & UI for AI.
- Assist in defining product requirements and features.
- Use data analytics to inform product strategy and prioritize features.
- Ensure product alignment with customer needs and business goals.
- Develop platform blueprints that include a features-and-functionalities map, ecosystem map, and information architecture.
- Create wireframes, prototypes, and mock-ups using tools like Figma.
- Conduct usability testing and iterate on designs based on feedback.
- Employ tools like XMind for brainstorming and mind mapping.
Customer Journey Analysis:
- Understand and map out customer journeys and scenarios.
- Identify pain points and opportunities for improvement.
- Develop customer personas and empathy maps.
Cross-Functional Collaboration:
- Work closely with internal units such as UX Research, Design, UX Content, and UX QA to ensure seamless delivery of CX initiatives.
- Coordinate with development teams to ensure UX designs are implemented accurately.
Data Analytics and Tools:
- Utilize clickstream and analytics tools like Google Analytics, CleverTap, and Medallia to gather and analyze user data.
- Leverage data to drive decisions and optimize customer experiences.
- Strong background in data analytics, including proficiency in interpreting complex datasets to inform UX decisions.
Ideal Candidate
- Bachelor’s or Master’s degree in a relevant field (e.g., UX Design, Human-Computer Interaction, Computer Science, Marketing).
- 5+ years of experience in CX/UX roles, preferably in a B2C environment.
- Proficiency in analytics tools (Google Analytics, CleverTap, Medallia, Hotjar, etc).
- Strong understanding of wireframing and prototyping tools (Figma, XMind, etc).
- Excellent communication and collaboration skills.
- Proven experience in managing cross-functional teams and projects.
- Strong background in data analytics and data-driven decision-making.
- Expert understanding of user experience and user-centered design approaches.
- Detail orientation, with the drive to continuously learn, adapt, and evolve.
- Experience creating CX designs and measuring their success and impact.
- Knowledge of testing tools like Maze, UsabilityHub, and UserZoom would be a plus.
- Experience designing responsive websites as well as mobile apps.
- Understanding of iOS and Android design guidelines.
- Passion for great customer-focused design, a purposeful aesthetic sense, and generating simple solutions to complex problems.
- Excellent communication skills, with the ability to present work and ideas to the leadership team.
Job Title: Digital Marketing Executive
Location: Kolkata – Chinar Park
Salary: ₹18,000 – ₹22,000 per month
Job Type: Full-time
Job Summary:
We are looking for a results-driven Digital Marketing Executive to plan, execute, and manage digital marketing campaigns across multiple platforms. The ideal candidate should have hands-on experience in social media management, SEO, paid ads, content creation, and analytics.
Key Responsibilities:
1. Social Media Management
- Develop, schedule, and publish posts across all social media channels.
- Monitor engagement, respond to comments/messages, and grow community presence.
- Create campaigns to increase brand awareness and lead generation.
2. Performance Marketing
- Run and optimize paid campaigns on Meta Ads, Google Ads, and other platforms.
- Track ROI, monitor performance, and adjust strategies for maximum conversions.
3. SEO & Website Management
- Implement on-page and off-page SEO strategies.
- Conduct keyword research and competitor analysis.
- Coordinate with developers/designers for website updates and performance improvements.
4. Content Creation
- Create engaging content for social media, blogs, emailers, and ads.
- Collaborate with the design team for creatives, videos, and promotional materials.
5. Analytics & Reporting
- Monitor campaign performance using tools like Google Analytics, Meta Business Suite, etc.
- Generate weekly and monthly performance reports.
- Provide insights and suggestions for improving campaign results.
Required Skills & Qualifications:
- Bachelor’s degree in Marketing, Mass Communication, or related field.
- Proven experience (1–3 years) in digital marketing.
- Strong knowledge of Google Ads, Meta Ads, SEO, and social media platforms.
- Proficiency in tools like Google Analytics, Canva, Meta Business Suite, and other digital tools.
- Excellent communication, creativity, and analytical skills.
- Ability to manage multiple tasks and meet deadlines.
Preferred Qualifications:
- Experience with video editing or basic graphic design.
- Knowledge of email marketing tools.
- Understanding of branding and consumer psychology.
Benefits:
- Competitive salary and incentives.
- Opportunity to work on multiple brands and campaigns.
- Growth and learning opportunities.
- Friendly and collaborative work environment.
DataHavn IT Solutions specializes in big data and cloud computing, artificial intelligence and machine learning, application development, and consulting services. We aim to be a frontrunner in everything to do with data, and we have the expertise to transform customer businesses by making the right use of it.
About the Role:
As a Data Scientist specializing in Google Cloud, you will play a pivotal role in driving data-driven decision-making and innovation within our organization. You will leverage the power of Google Cloud's robust data analytics and machine learning tools to extract valuable insights from large datasets, develop predictive models, and optimize business processes.
Key Responsibilities:
- Data Ingestion and Preparation:
- Design and implement efficient data pipelines for ingesting, cleaning, and transforming data from various sources (e.g., databases, APIs, cloud storage) into Google Cloud Platform (GCP) data warehouses (BigQuery) or data lakes (Cloud Storage), using services such as Dataflow.
- Perform data quality assessments, handle missing values, and address inconsistencies to ensure data integrity.
- Exploratory Data Analysis (EDA):
- Conduct in-depth EDA to uncover patterns, trends, and anomalies within the data.
- Utilize visualization techniques (e.g., Tableau, Looker) to communicate findings effectively.
- Feature Engineering:
- Create relevant features from raw data to enhance model performance and interpretability.
- Explore techniques like feature selection, normalization, and dimensionality reduction.
- Model Development and Training:
- Develop and train predictive models using machine learning algorithms (e.g., linear regression, logistic regression, decision trees, random forests, neural networks) on GCP platforms like Vertex AI.
- Evaluate model performance using appropriate metrics and iterate on the modeling process.
- Model Deployment and Monitoring:
- Deploy trained models into production environments using GCP's ML tools and infrastructure.
- Monitor model performance over time, identify drift, and retrain models as needed.
- Collaboration and Communication:
- Work closely with data engineers, analysts, and business stakeholders to understand their requirements and translate them into data-driven solutions.
- Communicate findings and insights in a clear and concise manner, using visualizations and storytelling techniques.
Required Skills and Qualifications:
- Strong proficiency in Python or R programming languages.
- Experience with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Cloud Dataproc, and Vertex AI.
- Familiarity with machine learning algorithms and techniques.
- Knowledge of data visualization tools (e.g., Tableau, Looker).
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Strong communication and interpersonal skills.
Preferred Qualifications:
- Experience with cloud-native data technologies (e.g., Apache Spark, Kubernetes).
- Knowledge of distributed systems and scalable data architectures.
- Experience with natural language processing (NLP) or computer vision applications.
- Certifications in Google Cloud Platform or relevant machine learning frameworks.
About the Role:
We are seeking a talented Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. As a Data Engineer, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization's analytics and decision-making processes.
Responsibilities:
- Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes.
- Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts.
- Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes.
- Team Management: Ability to lead and manage a team.
- Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency.
- Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements.
- Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms).
- Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes.
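The extract-transform-load cycle in the first responsibility can be sketched with the standard library; the CSV source and table below are made up, and an in-memory SQLite database stands in for the warehouse:

```python
import csv
import io
import sqlite3

# Extract: a hypothetical CSV feed as it might arrive from an upstream system
raw = "order_id,amount\n1,100\n2,\n3,250\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: drop rows that fail validation (missing amount), cast types
clean = [(int(r["order_id"]), float(r["amount"])) for r in rows if r["amount"]]

# Load into the warehouse stand-in
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)", clean)

# Data-quality check: loaded row count must match the validated input
loaded = con.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
assert loaded == len(clean)
print(loaded, "rows loaded")
```

Real pipelines add scheduling, retries, and quarantine tables for rejected rows, but the validate-before-load shape stays the same.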
Skills:
- Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
- Experience with data warehousing and data lake technologies (e.g., Snowflake, AWS Redshift, Databricks).
- Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and cloud-based data services.
- Understanding of data modeling and data architecture concepts.
- Experience with ETL/ELT tools and frameworks.
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
Preferred Qualifications:
- Experience with real-time data processing and streaming technologies (e.g., Kafka, Flink).
- Knowledge of machine learning and artificial intelligence concepts.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Certification in cloud platforms or data engineering.
Project Details:
- Task: Annotate and label 25,000 images as per defined guidelines.
- Duration: 2 months (Immediate start)
- Compensation: ₹10,000 per month (₹20,000 total for 2 months)
- Location: Remote (candidates from South India also welcome)
🖥️ Requirements:
- Personal laptop with good performance (capable of handling annotation tools)
- Stable high-speed internet connection
- Prior experience in data annotation / labeling on platforms like Freelancer, Upwork, Toloka, or RWS
- Strong attention to detail and ability to meet weekly targets
Role: Data Scientist (Python + R Expertise)
Exp: 8 -12 Years
CTC: up to 30 LPA
Required Skills & Qualifications:
- 8–12 years of hands-on experience as a Data Scientist or in a similar analytical role.
- Strong expertise in Python and R for data analysis, modeling, and visualization.
- Proficiency in machine learning frameworks (scikit-learn, TensorFlow, PyTorch, caret, etc.).
- Strong understanding of statistical modeling, hypothesis testing, regression, and classification techniques.
- Experience with SQL and working with large-scale structured and unstructured data.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and deployment practices (Docker, MLflow).
- Excellent analytical, problem-solving, and communication skills.
Preferred Skills:
- Experience with NLP, time series forecasting, or deep learning projects.
- Exposure to data visualization tools (Tableau, Power BI, or R Shiny).
- Experience working in product or data-driven organizations.
- Knowledge of MLOps and model lifecycle management is a plus.
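The classification techniques listed above are ultimately judged by simple counting metrics; a toy sketch of precision and recall with invented labels:

```python
# Hypothetical evaluation labels: 1 = positive class, 0 = negative class
actual    = [1, 0, 1, 1, 0, 1, 0, 0]
predicted = [1, 0, 1, 0, 0, 1, 1, 0]

# Confusion-matrix counts
tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)

precision = tp / (tp + fp)  # of everything flagged positive, how much was right
recall = tp / (tp + fn)     # of all true positives, how many were found
print(f"precision={precision:.2f} recall={recall:.2f}")
```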
If interested, kindly share your updated resume at 82008 31681
Location: Bangalore
Experience: 5–8 Years
CTC: Up to 22 LPA
Required Skills:
- Strong expertise in Power BI (dashboarding, DAX, data modeling, report building).
- Basic to working knowledge of Tableau.
- Solid understanding of SQL and relational databases.
- Experience in connecting to various data sources (Excel, SQL Server, APIs, etc.).
- Strong analytical, problem-solving, and communication skills.
Nice to Have:
- Experience with cloud data platforms (Azure, AWS, GCP).
- Knowledge of data warehousing concepts and ETL processes.
About Analyticsliv:
We are a Google Partner Agency & Google Tag Manager Certified Partner that believes that when the right set of people forms a team, they can achieve formidable things. India's digital industry is going to grow 10-fold in the next 5 years, and we intend to play a strong role in that growth. If you think you have the potential to keep pace with a fast-growing, process-driven organization, we are waiting to hear from you.
Requirements:
- Bachelor's degree in Marketing or a related business or technology field
- Experience with Google Analytics 4 reporting in depth (default & custom) & Universal Analytics
- Extensive digital expertise including digital measurement and associated data and technology platforms
- In depth knowledge of the Google Analytics KPIs
- Ability to identify data quality issues, navigate multiple data sources, and work to resolve data quality issues.
- Ability to create impactful dashboards using Data Visualization tools
- Experience analysing web analytics data
- Strong analytical skills with high attention to detail
- Excellent verbal and written communication skills
Perks:
- Exposure to global brands and large-scale campaign management.
- Supportive and growth-oriented work culture.
- 5 Days working
- Flexible working hours
- Mediclaim benefits
- Maternity Leave benefits
- Paternity Leave benefits
- Birthday Leave benefits
- Yearly Performance Incentives
- Bereavement Leave
Reports Developer
Description - Data Insights Analyst specializing in dashboard development, data validation, and ETL testing using Tableau, Cognos, and SQL.
Work Experience: 5-9 years
Key Responsibilities
Insights Solution Development:
• Develop, maintain, and enhance dashboards and static reports using Tableau and IBM Cognos.
• Collaborate with Senior Data Insights specialists to design solutions that meet customer needs.
• Utilize data from modeled Business Data sources, Structured DB2 Datamarts, and DB2 Operational Data stores to fulfill business requirements.
• Conduct internal analytics testing on all data products to ensure accuracy and reliability.
• Use SQL to pull and prepare data for use in Dashboards and Reports
Data Management Tasks:
• Test new ETL developments, break-fixes, and enhancements using SQL-based tools to ensure the accuracy, volume, and quality of ETL changes.
• Participate in data projects, delivering larger-scale data solutions as part of a project team.
• Report defects using Jira and work closely with IT team professionals to ensure the timely retesting of data defects
• Utilize Spec. Documentation and Data lineage tools to understand flow of data into Analytics Data sources
• Develop repeatable testing processes using SQL based tools
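A repeatable SQL-based reconciliation test (matching row counts and checksums between a source and its ETL target) might look like the sketch below; the table names and values are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src (id INTEGER, amount REAL);
CREATE TABLE tgt (id INTEGER, amount REAL);
INSERT INTO src VALUES (1, 10.0), (2, 20.5);
INSERT INTO tgt VALUES (1, 10.0), (2, 20.5);
""")

def etl_checks(con, source, target):
    """Repeatable reconciliation: row counts and amount checksums must match."""
    results = {}
    for metric, sql in [("count", "SELECT COUNT(*) FROM {}"),
                        ("checksum", "SELECT ROUND(SUM(amount), 2) FROM {}")]:
        s = con.execute(sql.format(source)).fetchone()[0]
        t = con.execute(sql.format(target)).fetchone()[0]
        results[metric] = (s == t)
    return results

print(etl_checks(con, "src", "tgt"))  # {'count': True, 'checksum': True}
```

Because the checks are parameterized by table name, the same function can be re-run after every ETL break-fix or enhancement, which is what makes the testing process repeatable.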
Technical Experience
• SQL
• Tableau
• Data Visualization
• Report Design
• Cognos Analytics
• Cognos Transformer
• OLAP Modeling (Cognos)
Additional Skills
An Ideal Candidate would have the following additional Skills
• Python
• SAS Programming
• MS Access
• MS Excel
Work Hours: We would like the majority of work hours to align with the U.S. Eastern time zone, with people working until 2 p.m. or 3 p.m. EST, so that work hours overlap with the times the Senior Analysts and the databases are available.
🚀 Hiring: MIS Executive (CA | Power BI Expert) – Andheri East
💼 Company Industry: Financial Services (CFO Services for clients across Manufacturing, Retail, FMCG, Pharma, and more, with approx. ₹150 Cr. turnover)
📍 Location: Client Site – Andheri East, Mumbai
🕒 Timings: 10 AM – 7 PM | 6 Days
Role Overview
We are looking for a detail-oriented MIS Executive with a flair for data visualization & financial reporting. The ideal candidate will be a CA with strong analytical skills, hands-on experience in Power BI, and the ability to transform complex financial/operational data into actionable insights.
Key Responsibilities
Develop & maintain MIS reports & dashboards for management review.
Collect, consolidate & analyze data to generate insights for business decisions.
Automate reports & streamline processes using Power BI & Excel.
Support teams with ad-hoc analysis & data visualization.
Ensure accuracy, consistency & timeliness of all reports.
Qualifications & Skills
✔ Chartered Accountant (CA) with 1–2 years’ experience
✔ Expertise in Power BI (mandatory) & Advanced Excel
✔ Strong knowledge of financial reporting & data analysis
✔ Excellent communication & presentation skills
✔ Detail-oriented & deadline-driven
Job Description for Senior Frontend Engineer
Job Title: Senior Frontend Engineer
Company: Mydbops
Job Overview:
We’re building an agentic AI database platform where intelligent automation meets intuitive design. Join our small, execution-focused development team powered by deep database expertise as we redefine how teams monitor and optimize database performance.
About Mydbops
Mydbops helps fast-growing teams and unicorns run mission-critical databases with speed, scale, and reliability. Trusted by 800+ clients, including 30+ unicorns, we optimize and manage systems like MySQL, PostgreSQL, MongoDB, MariaDB, TiDB, and Cassandra, powering 10B+ transactions daily across industries such as fintech, SaaS, healthcare, logistics, and e-commerce.
With 9+ years of deep expertise, we build automation tools, lead complex migrations, and solve high-impact performance and database challenges. We are ISO and PCI-DSS certified, combining engineering excellence with enterprise-grade security and compliance.
Role:
You'll architect and build our frontend from the ground up, crafting a modern, production-ready agentic AI application that makes complex database insights clear, accessible, and actionable. In this high-impact role, your technical decisions will shape how our platform scales and evolves.
What You'll Own
- Design and implement the entire frontend architecture using modern technologies
- Build responsive, performant UIs that handle real-time data and complex user workflows
- Collaborate closely with backend and AI teams to define system boundaries, advocate for optimal data flows, and craft experiences that make AI accessible
- Establish engineering standards for testing, CI/CD, and code quality
- Balance technical excellence with product intuition and a deep focus on user experience
What We're Looking For
Core Requirements:
- 3 to 5 Years building production frontends with proven performance, security, and architectural ownership
- Strong proficiency in React, TypeScript, and modern component-based architecture
- Deep knowledge of authentication, RBAC, session management, and frontend security best practices
- Experience with state management (e.g., Redux, Zustand) and integrating complex or real-time APIs
- Strong design intuition with exposure to design systems, responsive UX, and accessibility standards
- Experience building real-time UIs, data visualizations, or telemetry dashboards
- Familiarity with data visualization libraries like Recharts, D3, Chart.js, or similar open-source solutions
- A track record of clean, maintainable code and a proactive ownership mindset
Culture Fit
We value execution over meetings, ownership over hand-holding, and long-term thinking over quick fixes. We're looking for someone who:
- Takes initiative and drives projects to completion independently
- Writes clean, maintainable code with testing in mind
- Cares equally about performance, security, and user experience
- Balances technical debt with product velocity and user impact
This role likely isn't a fit if:
- You need daily task breakdowns to make progress
- You haven't built and shipped production-grade applications end-to-end
- You avoid dealing with performance, authentication, or architecture
- You prefer polishing UI components over solving deep product problems
Ready to help build intelligent database performance tools? We'd love to hear from you.
Job Details:
- Job Type: Full-time opportunity
- Work time: General Shift
- Mode of Employment - Hybrid (Chennai Location)
- Experience - 3 to 5 years
Job Summary:
Position : Senior Power BI Developer
Experience : 4+ Years
Location : Ahmedabad - WFO
Key Responsibilities:
- Design, develop, and maintain interactive and user-friendly Power BI dashboards and reports.
- Translate business requirements into functional and technical specifications.
- Perform data modeling, DAX calculations, and Power Query transformations.
- Integrate data from multiple sources including SQL Server, Excel, SharePoint, and APIs.
- Optimize Power BI datasets, reports, and dashboards for performance and usability.
- Collaborate with business analysts, data engineers, and stakeholders to ensure data accuracy and relevance.
- Ensure security and governance best practices in Power BI workspaces and datasets.
- Provide ongoing support and troubleshooting for existing Power BI solutions.
- Stay updated with Power BI updates, best practices, and industry trends.
Required Skills & Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Data Analytics, or a related field.
- 4+ years of professional experience in data analytics or business intelligence.
- 3+ years of hands-on experience with Power BI (Power BI Desktop, Power BI Service).
- Strong expertise in DAX, Power Query (M Language), and data modeling (star/snowflake schema).
- Proficiency in writing complex SQL queries and optimizing them for performance.
- Experience in working with large and complex datasets.
- Experience with BigQuery, MySQL, and Looker Studio is a plus.
- Ecommerce Industry Experience will be an added advantage.
- Solid understanding of data warehousing concepts and ETL processes.
- Experience with Power Platform tools such as Power Apps & Power Automate would be a plus.
Preferred Qualifications:
- Microsoft Power BI Certification (PL-300 or equivalent) is a plus.
- Experience with Azure Data Services (Azure Data Factory, Azure SQL, Synapse).
- Knowledge of other BI tools (Tableau, Qlik) is a plus.
- Familiarity with scripting languages (Python, R) for data analysis is a bonus.
- Experience integrating Power BI into web portals using Power BI Embedded.
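The star/snowflake schema modeling called out above can be illustrated with a toy fact table surrounded by two dimensions (all table names and figures invented); an in-memory SQLite database keeps the sketch self-contained:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Minimal star schema: one fact table joined to its dimension tables
con.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (date_key INTEGER, product_key INTEGER, revenue REAL);
INSERT INTO dim_date VALUES (20250101, 'Jan'), (20250201, 'Feb');
INSERT INTO dim_product VALUES (1, 'Electronics'), (2, 'Apparel');
INSERT INTO fact_sales VALUES
  (20250101, 1, 500), (20250101, 2, 120), (20250201, 1, 300);
""")

# Slice the fact table by both dimensions, as a matrix visual would
rows = con.execute("""
    SELECT d.month, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month, p.category
""").fetchall()
print(rows)
```

In Power BI the joins become model relationships and the SUM becomes a DAX measure, but the star shape (narrow fact, descriptive dimensions) is the same design.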
We’re seeking a detail-oriented Data Analyst with proven experience in corporate settings to transform raw data into actionable insights. You’ll collaborate across departments to support strategic decision-making, optimize operations, and enhance business performance through data-driven analysis.
We are seeking a highly motivated and knowledgeable DADS Trainer to conduct hands-on training in Data Analytics and Data Science. The ideal candidate will have strong domain expertise, coding proficiency, and a passion for teaching concepts in Python, statistics, machine learning, data visualization, and tools like Excel, Power BI, and SQL.
Senior Data Engineer Job Description
Overview
The Senior Data Engineer will design, develop, and maintain scalable data pipelines and infrastructure to support data-driven decision-making and advanced analytics. This role requires deep expertise in data engineering, strong problem-solving skills, and the ability to collaborate with cross-functional teams to deliver robust data solutions.
Key Responsibilities
Data Pipeline Development: Design, build, and optimize scalable, secure, and reliable data pipelines to ingest, process, and transform large volumes of structured and unstructured data.
Data Architecture: Architect and maintain data storage solutions, including data lakes, data warehouses, and databases, ensuring performance, scalability, and cost-efficiency.
Data Integration: Integrate data from diverse sources, including APIs, third-party systems, and streaming platforms, ensuring data quality and consistency.
Performance Optimization: Monitor and optimize data systems for performance, scalability, and cost, implementing best practices for partitioning, indexing, and caching.
Collaboration: Work closely with data scientists, analysts, and software engineers to understand data needs and deliver solutions that enable advanced analytics, machine learning, and reporting.
Data Governance: Implement data governance policies, ensuring compliance with data security, privacy regulations (e.g., GDPR, CCPA), and internal standards.
Automation: Develop automated processes for data ingestion, transformation, and validation to improve efficiency and reduce manual intervention.
Mentorship: Guide and mentor junior data engineers, fostering a culture of technical excellence and continuous learning.
Troubleshooting: Diagnose and resolve complex data-related issues, ensuring high availability and reliability of data systems.
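One pattern behind the Automation and Troubleshooting responsibilities above is idempotent ingestion, so a failed or duplicated batch can simply be re-run without corrupting the target; a minimal SQLite sketch with hypothetical data:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE metrics (day TEXT PRIMARY KEY, value REAL)")

def ingest(con, batch):
    """Idempotent load: re-running the same batch never duplicates rows."""
    con.executemany(
        "INSERT INTO metrics VALUES (?, ?) "
        "ON CONFLICT(day) DO UPDATE SET value = excluded.value",
        batch,
    )

batch = [("2025-01-01", 10.0), ("2025-01-02", 12.5)]
ingest(con, batch)
ingest(con, batch)  # retry after a hypothetical failure: safe to re-run

count = con.execute("SELECT COUNT(*) FROM metrics").fetchone()[0]
print(count)  # 2
```

The same upsert-on-key idea appears as MERGE statements in warehouses like BigQuery and Snowflake, and it is what makes automated retries safe.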
Required Qualifications
Education: Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
Experience: 5+ years of experience in data engineering or a related role, with a proven track record of building scalable data pipelines and infrastructure.
Technical Skills:
Proficiency in programming languages such as Python, Java, or Scala.
Expertise in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra).
Strong experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., Redshift, BigQuery, Snowflake).
Hands-on experience with ETL/ELT tools (e.g., Apache Airflow, Talend, Informatica) and data integration frameworks.
Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and distributed systems.
Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus.
Soft Skills:
Excellent problem-solving and analytical skills.
Strong communication and collaboration abilities.
Ability to work in a fast-paced, dynamic environment and manage multiple priorities.
Certifications (optional but preferred): Cloud certifications (e.g., AWS Certified Data Analytics, Google Professional Data Engineer) or relevant data engineering certifications.
Preferred Qualifications
Experience with real-time data processing and streaming architectures.
Familiarity with machine learning pipelines and MLOps practices.
Knowledge of data visualization tools (e.g., Tableau, Power BI) and their integration with data pipelines.
Experience in industries with high data complexity, such as finance, healthcare, or e-commerce.
Work Environment
Location: Hybrid/Remote/On-site (depending on company policy).
Team: Collaborative, cross-functional team environment with data scientists, analysts, and business stakeholders.
Hours: Full-time, with occasional on-call responsibilities for critical data systems.
Responsibilities
· Design and architect data virtualization solutions using Denodo.
· Collaborate with business analysts and data engineers to understand data requirements and translate them into technical specifications.
· Implement best practices for data governance and security within Denodo environments.
· Lead the integration of Denodo with various data sources, ensuring performance optimization.
· Conduct training sessions and provide guidance to technical teams on Denodo capabilities.
· Participate in the evaluation and selection of data technologies and tools.
· Stay current with industry trends in data integration and virtualization.
Requirements
· Bachelor's degree in Computer Science, Information Technology, or a related field.
· 10+ years of experience in data architecture, with a focus on Denodo solutions.
· Strong knowledge of data virtualization principles and practices.
· Experience with SQL and data modeling techniques.
· Familiarity with ETL processes and data integration tools.
· Excellent communication and presentation skills.
· Ability to lead technical discussions and provide strategic insights.
· Certifications related to Denodo or data architecture are a plus
Job Title : Oracle Analytics Cloud (OAC) / Fusion Data Intelligence (FDI) Specialist
Experience : 3 to 8 years
Location : All USI locations – Hyderabad, Bengaluru, Mumbai, Gurugram (preferred) and Pune, Chennai, Kolkata
Work Mode : Hybrid Only (2-3 days from office or all 5 days from office)
Mandatory Skills : Oracle Analytics Cloud (OAC), Fusion Data Intelligence (FDI), RPD, OAC Reports, Data Visualizations, SQL, PL/SQL, Oracle Databases, ODI, Oracle Cloud Infrastructure (OCI), DevOps tools, Agile methodology.
Key Responsibilities :
- Design, develop, and maintain solutions using Oracle Analytics Cloud (OAC).
- Build and optimize complex RPD models, OAC reports, and data visualizations.
- Utilize SQL and PL/SQL for data querying and performance optimization.
- Develop and manage applications hosted on Oracle Cloud Infrastructure (OCI).
- Support Oracle Cloud migrations, OBIEE upgrades, and integration projects.
- Collaborate with teams using the ODI (Oracle Data Integrator) tool for ETL processes.
- Implement cloud scripting using cURL for Oracle Cloud automation.
- Contribute to the design and implementation of Business Continuity and Disaster Recovery strategies for cloud applications.
- Participate in Agile development processes and DevOps practices including CI/CD and deployment orchestration.
Required Skills :
- Strong hands-on expertise in Oracle Analytics Cloud (OAC) and/or Fusion Data Intelligence (FDI).
- Deep understanding of data modeling, reporting, and visualization techniques.
- Proficiency in SQL, PL/SQL, and relational databases on Oracle.
- Familiarity with DevOps tools, version control, and deployment automation.
- Working knowledge of Oracle Cloud services, scripting, and monitoring.
Good to Have :
- Prior experience in OBIEE to OAC migrations.
- Exposure to data security models and cloud performance tuning.
- Certification in Oracle Cloud-related technologies.
This is a full-time, onsite trainer-cum-developer role. The goal is to prepare placement students with the technical knowledge, skills, and confidence required to succeed in campus recruitment drives, technical interviews, and entry-level job roles in the industry.
About the company:
Ketto is Asia's largest tech-enabled crowdfunding platform with a vision - Healthcare for all. We are a profit-making organization with a valuation of more than 100 Million USD. With over 1,100 crores raised from more than 60 lakh donors, we have positively impacted the lives of 2 lakh+ campaigners. Ketto has embarked on a high-growth journey, and we would like you to be part of our family, helping us to create a large-scale impact on a daily basis by taking our product to the next level.
Role Overview:
Ketto, Asia's largest crowdfunding platform, is looking for an innovative Product Analyst to take charge of our data systems, reporting frameworks, and generative AI initiatives. This role is pivotal in ensuring data integrity and reliability, driving key insights that fuel strategic decisions, and implementing automation through AI. This position encompasses the full data and analytics lifecycle—from requirements gathering to design planning—alongside implementing advanced analytics and generative AI solutions to support Ketto’s mission.
Key Responsibilities
● Data Strategy & Automation:
○ Lead data collection, processing, and quality assurance processes to ensure accuracy, completeness, and relevance.
○ Explore opportunities to incorporate generative AI models to automate and optimize processes, enhancing efficiencies in analytics, reporting, and decision-making.
● Data Analysis & Insight Generation:
○ Conduct in-depth analyses of user behaviour, campaign performance, and platform metrics to uncover insights that support crowdfunding success.
○ Translate complex data into clear, actionable insights that drive strategic decisions, providing stakeholders with the necessary information to enhance business outcomes.
● Reporting & Quality Assurance:
○ Design and maintain a robust reporting framework to deliver timely insights, enhancing data reliability and ensuring stakeholders are well-informed.
○ Monitor and improve data accuracy, consistency, and integrity across all data processes, identifying and addressing areas for enhancement.
● Collaboration & Strategic Planning:
○ Work closely with Business, Product, and IT teams to align data initiatives with Ketto’s objectives and growth strategy.
○ Propose data-driven strategies that leverage AI and automation to tackle business challenges and scale impact across the platform.
○ Mentor junior data scientists and analysts, fostering a culture of data-driven decision-making.
Required Skills and Qualifications
● Technical Expertise:
○ Strong background in SQL, Statistics and Maths
● Analytical & Strategic Mindset:
○ Proven ability to derive meaningful, actionable insights from large data sets and translate findings into business strategies.
○ Experience with statistical analysis, advanced analytics
● Communication & Collaboration:
○ Exceptional written and verbal communication skills, capable of explaining complex data insights to non-technical stakeholders.
○ Strong interpersonal skills to work effectively with cross-functional teams, aligning data initiatives with organisational goals.
● Preferred Experience:
○ Proven experience in advanced analytics roles
○ Experience leading data lifecycle management, model deployment, and quality assurance initiatives.
Why Join Ketto?
At Ketto, we’re committed to driving social change through innovative data and AI solutions. As our Sr. Product Analyst, you’ll have the unique opportunity to leverage advanced data science and generative AI to shape the future of crowdfunding in Asia. If you’re passionate about using data and AI for social good, we’d love to hear from you!
We are seeking an experienced Data Scientist to join our data-driven team. As a Data Scientist, you will work with large datasets, apply advanced analytics techniques, and build machine learning models to provide actionable insights that drive business decisions. You will collaborate with various teams to translate complex data into clear recommendations and innovative solutions.
Key Responsibilities:
- Analyze large datasets to identify trends, patterns, and insights that can inform business strategy.
- Develop, implement, and maintain machine learning models and algorithms to solve complex problems.
- Work closely with stakeholders to understand business objectives and translate them into data science tasks.
- Preprocess, clean, and organize raw data from various sources for analysis.
- Conduct statistical analysis and build predictive models to support data-driven decision-making.
- Create data visualizations and reports to communicate findings clearly and effectively to both technical and non-technical teams.
- Design experiments and A/B testing to evaluate business initiatives.
- Ensure the scalability and performance of data pipelines and machine learning models.
- Collaborate with engineering teams to integrate data science solutions into production systems.
- Continuously stay updated with the latest developments in data science, machine learning, and analytics technologies.
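The A/B testing responsibility above often reduces to a two-proportion z-test on conversion rates; a self-contained sketch using the normal approximation (the experiment numbers below are invented):

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference in conversion rates (normal approx.)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # p-value from the standard normal CDF, via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant A converts 120/1000, variant B 150/1000
z, p = two_proportion_z(120, 1000, 150, 1000)
print(f"z={z:.2f} p={p:.4f}")
```

Libraries such as statsmodels provide the same test with continuity corrections and power analysis, but the pooled-variance calculation above is the core of it.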
Job Title : Sr. Data Scientist
Experience : 5+ Years
Location : Noida (Hybrid – 3 Days in Office)
Shift Timing : 2 PM to 11 PM
Availability : Immediate
Job Description :
We are seeking a Senior Data Scientist to develop and implement machine learning models, predictive analytics, and data-driven solutions.
The role involves data analysis, dashboard development (Looker Studio), NLP, Generative AI (LLMs, Prompt Engineering), and statistical modeling.
Strong expertise in Python (Pandas, NumPy), Cloud Data Science (AWS SageMaker, Azure OpenAI), Agile (Jira, Confluence), and stakeholder collaboration is essential.
Mandatory skills : Machine Learning, Cloud Data Science (AWS SageMaker, Azure OpenAI), Python (Pandas, NumPy), Data Visualization (Looker Studio), NLP & Generative AI (LLMs, Prompt Engineering), Statistical Modeling, Agile (Jira, Confluence), and strong stakeholder communication.
Role Summary:
Lead and drive the development in BI domain using Tableau eco-system with deep technical and BI ecosystem knowledge. The resource will be responsible for the dashboard design, development, and delivery of BI services using Tableau eco-system.
Key functions & responsibilities:
· Communication & interaction with Project Manager to understand the requirement
· Dashboard designing, development and deployment using Tableau eco-system
· Ensure delivery within given time frame while maintaining quality
· Stay up to date with current tech and bring relevant ideas to the table
· Proactively work with the Management team to identify and resolve issues
· Performs other related duties as assigned or advised
· Should be a leader who sets the standard and expectations through example in their conduct, work ethic, integrity, and character
· Contribute in dashboard designing, R&D and project delivery using Tableau
Experience:
· Overall 3-7 years of experience in DWBI development projects, having worked on BI and visualization technologies (Tableau, QlikView) for at least 3 years.
· At least 3 years of experience covering Tableau implementation lifecycle including hands-on development/programming, managing security, data modeling, data blending, etc.
Technology & Skills:
· Hands-on expertise of Tableau administration and maintenance
· Strong working knowledge and development experience with Tableau Server and Desktop
· Strong knowledge in SQL, PL/SQL and Data modelling
· Knowledge of databases like Microsoft SQL Server, Oracle, etc.
· Exposure to alternate visualization technologies like QlikView, Spotfire, Pentaho, etc.
· Good communication & Analytical skills with Excellent creative and conceptual thinking abilities
· Superior organizational skills, attention to detail/level of quality, Strong communication skills, both verbal and written
We are seeking a skilled Qlik Developer with 4-5 years of experience in Qlik development to join our team. The ideal candidate will have expertise in QlikView and Qlik Sense, along with strong communication skills for interacting with business stakeholders. Knowledge of other BI tools such as Power BI and Tableau is a plus.
Must-Have Skills:
QlikView and Qlik Sense Development: 4-5 years of hands-on experience in developing and maintaining QlikView/Qlik Sense applications and dashboards.
Data Visualization: Proficiency in creating interactive reports and dashboards, with a deep understanding of data storytelling.
ETL (Extract, Transform, Load): Experience in data extraction from multiple data sources (databases, flat files, APIs) and transforming it into actionable insights.
Qlik Scripting: Knowledge of Qlik scripting, set analysis, and expressions to create efficient solutions.
Data Modeling: Expertise in designing and implementing data models for reporting and analytics.
Stakeholder Communication: Strong communication skills to collaborate with non-technical business users and translate their requirements into effective BI solutions.
Troubleshooting and Support: Ability to identify, troubleshoot, and resolve issues related to Qlik applications.
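The ETL bullet above describes a pattern that is largely tool-agnostic: extract raw rows from a source, transform them into aggregates, and load them for the reporting layer. A minimal Python sketch of that flow, offered only as an illustration of the pattern (the `region`/`amount` fields and the sample data are hypothetical, standing in for a real flat-file source):

```python
import csv
import io
from collections import defaultdict

def extract(flat_file: str) -> list[dict]:
    """Extract: read raw rows from a flat file (CSV here)."""
    with io.StringIO(flat_file) as fh:  # stands in for open(path)
        return list(csv.DictReader(fh))

def transform(rows: list[dict]) -> dict[str, float]:
    """Transform: aggregate amounts per region, skipping malformed rows."""
    totals: dict[str, float] = defaultdict(float)
    for row in rows:
        try:
            totals[row["region"]] += float(row["amount"])
        except (KeyError, ValueError):
            continue  # quality check: drop rows with missing/bad fields
    return dict(totals)

def load(totals: dict[str, float]) -> list[tuple[str, float]]:
    """Load: emit sorted rows ready for the reporting layer."""
    return sorted(totals.items())

RAW = "region,amount\nNorth,100\nSouth,50\nNorth,25\nSouth,bad\n"
print(load(transform(extract(RAW))))  # → [('North', 125.0), ('South', 50.0)]
```

In Qlik itself the same steps would live in the load script with set analysis on top; the sketch only shows the shape of the work.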
Nice-to-Have Skills:
Other BI Tools: Experience in using other business intelligence tools such as Power BI and Tableau.
SQL & Data Querying: Familiarity with SQL for data querying and database management.
Cloud Platforms: Experience with cloud services like Azure, AWS, or Google Cloud in relation to BI and data solutions.
Programming Knowledge: Exposure to programming languages like Python or R.
Agile Methodologies: Understanding of Agile frameworks for project delivery.
We are seeking a motivated AI Engineer who is passionate about exploring the future of artificial intelligence and eager to transition into the IT field. This role is ideal for individuals with a career gap who have undergone professional training in AI and are looking to apply their skills in a dynamic environment.
Job Responsibilities
- Develop AI Models: Design and implement machine learning algorithms and deep learning models to extract insights from large datasets.
- Collaborate with Teams: Work closely with cross-functional teams to identify business needs and integrate AI solutions that enhance operational efficiency.
- Research and Experimentation: Conduct experiments to improve AI system performance and stay updated with the latest advancements in AI technologies.
- Documentation: Maintain comprehensive documentation for AI models, algorithms, and processes.
Qualifications
- Education: A bachelor’s degree in Computer Science, Engineering, or a related field is preferred.
- Training: Completion of professional training programs in AI, machine learning, or data science.
- Programming Skills: Proficiency in programming languages such as Python or R, with experience in data processing techniques.
- Analytical Skills: Strong problem-solving abilities and a keen analytical mindset.
Desired Attributes
- A genuine interest in the evolving landscape of AI technologies.
- Willingness to learn and adapt in a fast-paced environment.
- Excellent communication skills for effective collaboration within teams.
This position offers a unique opportunity for those looking to pivot their careers into IT while contributing to innovative AI projects. If you are ready to embrace this challenge, we encourage you to apply!
Role: Data Analyst (Apprentice - 1 Year contract)
Location: Bangalore, Hybrid Model
Job Summary: Join our team as an Apprentice for one year and take the next step in your career while supporting FOX’s unique corporate services and Business Intelligence (BI) platforms. This role offers a fantastic opportunity to leverage your communication skills, technical expertise, analytical and problem-solving competencies, and customer-focused experience.
Key Responsibilities:
· Assist in analyzing and solving data-related challenges.
· Support the development and maintenance of corporate service and BI platforms.
· Collaborate with cross-functional teams to enhance user experience and operational efficiency.
· Participate in training sessions to further develop technical and analytical skills.
· Conduct research and analysis to identify trends and insights in data.
· Prepare reports and presentations to communicate findings to stakeholders.
· Engage with employees to understand their needs and provide support.
· Contribute insights and suggestions during team meetings to drive continuous improvement.
Qualifications:
· Bachelor’s degree in Engineering (2024 pass-out).
· Strong analytical and technical skills with attention to detail.
· Excellent communication skills, both verbal and written.
· Ability to work collaboratively in a team-oriented environment.
· Proactive attitude and a strong willingness to learn.
· Familiarity with data analysis tools and software (e.g., Excel, SQL, Tableau) is a plus.
· Basic understanding of programming languages (e.g., Python, R) is an advantage.
Additional Information:
- This position offers a hybrid work model, allowing flexibility between remote and in-office work.
- Opportunities for professional development and skill enhancement through buddy and mentorship programs.
- Exposure to real-world projects and the chance to contribute to impactful solutions.
- A supportive and inclusive team environment that values diverse perspectives.
Thirumoolar IT Solutions is looking for a motivated and enthusiastic Fresher Trained Dataset Engineer to join our team. This entry-level position is ideal for recent graduates who are eager to apply their academic knowledge in a practical setting and contribute to the development of high-quality datasets for machine learning applications.
Responsibilities
Assist in the collection, cleaning, and preprocessing of data to ensure it is ready for training machine learning models.
Collaborate with senior dataset engineers and data scientists to understand the requirements for specific machine learning tasks.
Participate in the annotation and labeling of datasets, ensuring accuracy and consistency in data representation.
Conduct quality checks on datasets to identify and rectify errors or inconsistencies.
Support the development of documentation and guidelines for data annotation processes.
Stay updated with the latest tools and techniques in data processing and machine learning.
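The collection-and-cleaning responsibilities above can be sketched with the Pandas library this posting mentions; the records, column names, and quality issues below are invented purely for illustration:

```python
import pandas as pd

# Hypothetical raw annotation records with typical quality issues:
# a missing label, inconsistent casing, and a duplicate row.
raw = pd.DataFrame({
    "text": ["good movie", "bad movie", "good movie", "great film", "ok film"],
    "label": ["Positive", "negative", "Positive", "POSITIVE", None],
})

def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Basic preprocessing: drop missing labels, normalise casing, dedupe."""
    out = df.dropna(subset=["label"]).copy()
    out["label"] = out["label"].str.lower()
    return out.drop_duplicates().reset_index(drop=True)

cleaned = clean(raw)
print(cleaned)
```

Real pipelines add schema validation and inter-annotator consistency checks on top of this, but the drop/normalise/dedupe core is the same.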
Skills and Qualifications
Bachelor’s degree in Computer Science, Data Science, Mathematics, or a related field.
Basic understanding of machine learning concepts and the importance of high-quality datasets.
Familiarity with programming languages such as Python or R is a plus.
Knowledge of data manipulation libraries (e.g., Pandas, NumPy) is advantageous.
Strong analytical skills and attention to detail.
Excellent communication and teamwork abilities.
A passion for learning and a desire to grow in the field of data engineering.
Preferred Location
Candidates based in Tamil Nadu or those willing to work from home are encouraged to apply.
Company Description
UpSolve Solutions is a company specializing in Video and Text Analytics to drive business decisions. We aim to solve business problems that are taking more time and resources than expected, and turn them into opportunities for growth and improvement. We are located in Pune and offer innovative solutions to help businesses succeed.
Role Description
This is a full-time on-site role for a QlikView Developer. As a QlikView Developer, you will be responsible for developing data models, creating dashboards, and utilizing your analytical skills. You will also work with data warehousing and ETL (Extract Transform Load) processes to ensure effective data management. You will be an integral part of our team in Pune.
Qualifications
- Data Modeling and Dashboard development skills
- Strong analytical skills
- Experience in data warehousing and ETL processes
- Proficiency in QlikView development
- Good understanding of business requirements and data analysis
- Excellent problem-solving and communication skills
- Experience in the software industry is a plus
- Bachelor's degree in Computer Science, Information Technology, or related field
Optiblack is a product growth firm that works with product leaders to drive revenue from data. Optiblack has worked with 50+ firms and delivered $300M+ in impact.
Optiblack is looking to add a Product Analyst who will work with its clients on client engagements.
Job Description
0. Monitoring
Review data from various sources daily to surface insights
Identify areas of improvement
Create dashboards as per requirements
Build the tracking plan
1. Data Analysis
Gather and analyze data from various sources to extract actionable insights.
Use statistical techniques and data visualization to interpret trends and patterns.
Identify opportunities for product improvements based on data analysis.
2. Product Performance Evaluation
Monitor key product metrics and performance indicators.
Conduct regular evaluations to assess product performance against goals and benchmarks.
Identify areas of improvement and recommend strategies for optimization.
3. Market Research
Conduct market research to understand industry trends, customer needs, and competitive landscape.
Analyze market data and customer feedback to identify market opportunities.
Provide insights on target market segments and potential customer segments.
4. User Experience Enhancement
Collaborate with UX/UI designers to improve the user experience of the product.
Conduct user research, interviews, and usability testing to gather feedback on product usability and satisfaction.
Recommend enhancements and features to improve user experience.
5. Requirement Gathering
Collaborate with stakeholders to gather and document product requirements.
Conduct interviews, workshops, and surveys to understand user and business needs.
Translate requirements into clear and actionable user stories or product specifications.
6. A/B Testing
Plan, execute, and analyze A/B Tests to evaluate the impact of changes or new features.
Use data from A/B tests to make data-driven decisions and optimize the product.
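The analysis step above typically ends in a significance test on conversion rates. A minimal sketch using a two-sided, two-proportion z-test; the traffic and conversion numbers are invented for illustration:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(conv_a: int, n_a: int,
                     conv_b: int, n_b: int) -> tuple[float, float]:
    """Two-sided z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical experiment: control converts 200/4000, variant 260/4000.
z, p = two_proportion_z(200, 4000, 260, 4000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

At the conventional 5% threshold this invented variant would be called a significant lift; production setups usually also pre-register sample sizes and guard against peeking.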
7. Reporting and Presentation
Prepare reports and presentations to communicate insights, findings, and recommendations.
Present data analysis and product performance evaluations to stakeholders and cross-functional teams.
Clearly articulate complex concepts in a concise and understandable manner.
Skills Required for a Product Analyst
1. Technical Skills
Data Analysis: Proficiency in using analytical tools and techniques to gather, analyze, and interpret data.
Tools: Mixpanel or similar
Market Research: Knowledge of market research methodologies, including data collection, analysis, and competitor analysis.
Product Management Tools: Familiarity with product management tools such as Jira, Trello, or Asana to manage product backlogs, roadmaps, and user stories.
User Experience (UX) Design: Understanding of UX principles and the ability to work closely with designers to enhance the user experience of the product.
A/B Testing: Experience in planning, executing, and analyzing A/B tests to measure the impact of product changes.
SQL and Database Knowledge: Proficiency in SQL to extract and analyze data from databases.
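The SQL proficiency described above is often exercised on funnel-style questions. A small self-contained sketch using Python's built-in sqlite3; the table name, schema, and event data are hypothetical:

```python
import sqlite3

# Hypothetical product-events table; schema and rows are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE events (user_id INT, event TEXT);
    INSERT INTO events VALUES
        (1, 'signup'), (1, 'purchase'),
        (2, 'signup'),
        (3, 'signup'), (3, 'purchase');
""")

# Funnel query: how many signed-up users went on to purchase?
row = con.execute("""
    SELECT COUNT(DISTINCT s.user_id) AS signups,
           COUNT(DISTINCT p.user_id) AS purchasers
    FROM events s
    LEFT JOIN events p
           ON p.user_id = s.user_id AND p.event = 'purchase'
    WHERE s.event = 'signup'
""").fetchone()
print(row)  # → (3, 2)
```

Tools like Mixpanel expose the same funnel as a UI report; knowing the underlying SQL lets an analyst validate or extend it against the warehouse directly.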
2. Workplace Skills
Communication: Strong verbal and written communication skills to effectively collaborate with stakeholders, present findings, and document requirements.
Problem-Solving: Ability to identify problems, gather relevant information, and propose practical solutions to improve product performance and user experience.
Critical Thinking: Capacity to think analytically, assess situations, and make data-driven decisions.
Stakeholder Management: Skill in managing relationships with stakeholders, understanding their needs, and balancing competing priorities.
Teamwork and Collaboration: Ability to work collaboratively with cross-functional teams, including product managers, designers, developers, and marketers.
Adaptability: Flexibility to navigate in a dynamic and fast-paced work environment, adjusting priorities as needed.
Attention to Detail: Strong attention to detail to ensure accuracy and precision in data analysis, documentation, and requirement gathering.
Role Overview
We are looking for a Tech Lead with a strong background in fintech, especially with experience or a strong interest in fraud prevention and Anti-Money Laundering (AML) technologies.
This role is critical in leading our fintech product development, ensuring the integration of robust security measures, and guiding our team in Hyderabad towards delivering high-quality, secure, and compliant software solutions.
Responsibilities
- Lead the development of fintech solutions, focusing on fraud prevention and AML, using TypeScript, React, Python, and SQL databases.
- Architect and deploy secure, scalable applications on AWS or Azure, adhering to the best practices in financial security and data protection.
- Design and manage databases with an emphasis on security, integrity, and performance, ensuring compliance with fintech regulatory standards.
- Guide and mentor the development team, promoting a culture of excellence, innovation, and continuous learning in the fintech space.
- Collaborate with stakeholders across the company, including product management, design, and QA, to ensure project alignment with business goals and regulatory requirements.
- Keep abreast of the latest trends and technologies in fintech, fraud prevention, and AML, applying this knowledge to drive the company's objectives.
Requirements
- 5-7 years of experience in software development, with a focus on fintech solutions and a strong understanding of fraud prevention and AML strategies.
- Expertise in TypeScript and React, and familiarity with Python.
- Proven experience with SQL databases and cloud services (AWS or Azure), with certifications in these areas being a plus.
- Demonstrated ability to design and implement secure, high-performance software architectures in the fintech domain.
- Exceptional leadership and communication skills, with the ability to inspire and lead a team towards achieving excellence.
- A bachelor's degree in Computer Science, Engineering, or a related field, with additional certifications in fintech, security, or compliance being highly regarded.
Why Join Us?
- Opportunity to be at the cutting edge of fintech innovation, particularly in fraud prevention and AML.
- Contribute to a company with ambitious goals to revolutionize software development and make a historic impact.
- Be part of a visionary team dedicated to creating a lasting legacy in the tech industry.
- Work in an environment that values innovation, leadership, and the long-term success of its employees.
- Minimum of 8 years of experience, of which at least 4 years should be applied data mining experience in disciplines such as call centre metrics.
- Strong experience in advanced statistics and analytics, including segmentation, modelling, regression, and forecasting.
- Experience with leading and managing large teams.
- Demonstrated pattern of success in using advanced quantitative analytic methods to solve business problems.
- Demonstrated experience with Business Intelligence/Data Mining tools to work with data, investigate anomalies, construct data sets, and build models.
- It is critical to share details of projects undertaken (preferably in the telecom industry), specifically analysis drawn from CRM data.
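The regression and forecasting skills listed above can be illustrated with a minimal ordinary-least-squares trend fit; the monthly call-volume numbers below are invented, and deliberately noise-free so the forecast is easy to verify by hand:

```python
def fit_line(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Ordinary least squares: return (slope, intercept)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Hypothetical call-centre volumes for months 1-5.
months = [1, 2, 3, 4, 5]
calls = [100, 110, 120, 130, 140]

slope, intercept = fit_line(months, calls)
forecast_m6 = slope * 6 + intercept   # extrapolate one month ahead
print(forecast_m6)  # → 150.0
```

In practice this would be done with a statistics library and validated against held-out periods; the sketch only shows the mechanics behind the "regression, forecasting" requirement.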