11+ OLAP Jobs in Mumbai | OLAP Job openings in Mumbai
Apply to 11+ OLAP Jobs in Mumbai on CutShort.io. Explore the latest OLAP Job opportunities across top companies like Google, Amazon & Adobe.
- Key responsibility is to design, develop & maintain efficient data models for the organization, ensuring optimal query performance for the consumption layer.
- Developing, Deploying & maintaining a repository of UDXs written in Java / Python.
- Develop optimal data model designs, analyze complex distributed data deployments, and make recommendations to optimize performance based on data consumption patterns, performance expectations, the queries executed on the tables/databases, etc.
- Perform periodic database health checks and maintenance
- Design collections in a NoSQL database for efficient performance
- Document & maintain a data dictionary from various sources to enable data governance
- Coordinate with business teams, IT, and other stakeholders to provide best-in-class data pipeline solutions: exposing data via APIs, loading into downstream systems and NoSQL databases, etc.
- Data Governance Process Implementation and ensuring data security
Requirements
- Extensive working experience in Designing & Implementing Data models in OLAP Data Warehousing solutions (Redshift, Synapse, Snowflake, Teradata, Vertica, etc).
- Programming experience using Python / Java.
- Working knowledge in developing & deploying User-defined Functions (UDXs) using Java / Python.
- Strong understanding & extensive working experience in OLAP Data Warehousing (Redshift, Synapse, Snowflake, Teradata, Vertica, etc) architecture and cloud-native Data Lake (S3, ADLS, BigQuery, etc) Architecture.
- Strong knowledge in Design, Development & Performance tuning of 3NF/Flat/Hybrid Data Model.
- Extensive technical experience in SQL including code optimization techniques.
- Strong knowledge of database performance tuning and troubleshooting.
- Knowledge of collection design in any No-SQL DB (DynamoDB, MongoDB, CosmosDB, etc), along with implementation of best practices.
- Ability to understand business functionality, processes, and flows.
- Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently.
- Any OLAP DWH DBA experience and user management will be an added advantage.
- Knowledge of financial industry-specific data models such as FSLDM, IBM Financial Data Model, etc. will be an added advantage.
- Experience in Snowflake will be an added advantage.
- Working experience in BFSI/NBFC and data understanding of loan/mortgage data will be an added advantage.
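The UDX responsibilities above can be sketched with a minimal example. The function below is the kind of Python scalar logic typically wrapped by a warehouse's own `CREATE FUNCTION` DDL (e.g. Redshift or Snowflake Python UDFs); the function name and the masking rule are hypothetical illustrations, not a specific schema.

```python
# Sketch of a scalar UDF body. The warehouse (e.g. Redshift plpythonu or
# a Snowflake Python UDF) would register this via its CREATE FUNCTION DDL;
# the name and masking rule here are hypothetical examples.

def mask_account_number(account):
    """Keep the last 4 characters visible; mask the rest."""
    if account is None or len(account) <= 4:
        return account
    return "*" * (len(account) - 4) + account[-4:]

print(mask_account_number("1234567890"))  # → ******7890
```

Keeping the UDF body a plain, testable Python function makes it easy to maintain the repository of UDXs mentioned above and unit-test the logic outside the warehouse.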
Functional knowledge
- Data Governance & Quality Assurance
- Modern OLAP Database Architecture & Design
- Linux
- Data structures, algorithm & data modeling techniques
- No-SQL database architecture
- Data Security
Review Criteria
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, with a minimum of 3 years hands-on in Dremio
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred
- Preferred (Nice-to-have) – Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Job Specific Criteria
- CV Attachment is mandatory
- How many years of experience do you have with Dremio?
- Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
- Are you okay with 3 Days WFO?
- The virtual interview requires video to be on; are you okay with that?
Role & Responsibilities
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
Ideal Candidate
- Bachelor’s or Master’s in Computer Science, Information Systems, or a related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Role & Responsibilities
- Develop and deliver automation software to build and improve platform functionality
- Ensure reliability, availability, and manageability of applications and cloud platforms
- Champion adoption of Infrastructure as Code (IaC) practices
- Design and build self-service, self-healing, monitoring, and alerting platforms
- Automate development and testing workflows through CI/CD pipelines (Git, Jenkins, SonarQube, Artifactory, Docker containers)
- Build and manage container hosting platforms using Kubernetes
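The container-platform and IaC bullets above can be illustrated with a minimal Kubernetes Deployment manifest; the image name, labels, and replica count are hypothetical placeholders, not a prescribed setup.

```yaml
# Minimal Deployment sketch: 3 replicas of a hypothetical service image,
# with resource requests so the scheduler can place pods sensibly and a
# liveness probe so the platform can self-heal crashed containers.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-service        # hypothetical name
  labels:
    app: example-service
spec:
  replicas: 3
  selector:
    matchLabels:
      app: example-service
  template:
    metadata:
      labels:
        app: example-service
    spec:
      containers:
        - name: example-service
          image: registry.example.com/example-service:1.0.0  # placeholder
          ports:
            - containerPort: 8080
          resources:
            requests:
              cpu: "250m"
              memory: "256Mi"
          livenessProbe:
            httpGet:
              path: /healthz
              port: 8080
```

In an IaC workflow this manifest would live in version control and be applied by the CI/CD pipeline (e.g. `kubectl apply -f deployment.yaml`) rather than edited by hand on the cluster.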
Requirements
- Strong experience deploying and maintaining GCP cloud infrastructure
- Well-versed in service-oriented and cloud-based architecture design patterns
- Knowledge of cloud services including compute, storage, networking, messaging, and automation tools (e.g., CloudFormation/Terraform equivalents)
- Experience with relational and NoSQL databases (Postgres, Cassandra)
- Hands-on experience with automation/configuration tools (Puppet, Chef, Ansible, Terraform)
Additional Skills
- Strong Linux system administration and troubleshooting skills
- Programming/scripting exposure (Bash, Python, Core Java, or Scala)
- CI/CD pipeline experience (Jenkins, Git, Maven, etc.)
- Experience integrating solutions in multi-region environments
- Familiarity with Agile/Scrum/DevOps methodologies
Job Title: Power Automate Developer
Experience Required: 2 to 4 Years
Location: Mumbai [Only for Applicants based in Mumbai]
Job Type: Full-time
Job Summary:
We are looking for a skilled and detail-oriented Power Automate Developer with 2 to 4 years of hands-on experience in designing, developing, and maintaining automation workflows using Microsoft Power Platform, especially Power Automate. The ideal candidate should have a solid understanding of process automation, business workflows, and integrations with Microsoft 365 and other third-party systems.
Key Responsibilities:
- Design, develop, test, and deploy automated workflows using Power Automate (Flow).
- Integrate Power Automate with SharePoint, Microsoft Teams, Outlook, Excel, PowerApps, and third-party APIs.
- Gather requirements and work closely with business teams to understand and optimize business processes.
- Create and manage custom connectors, triggers, and actions in Power Automate.
- Monitor and troubleshoot workflows to ensure smooth execution and error handling.
- Work with PowerApps, Power BI, and other tools in the Microsoft Power Platform suite as needed.
- Create technical documentation and provide end-user training/support as necessary.
- Ensure automation solutions meet performance, security, and compliance requirements.
Required Skills and Qualifications:
- 2 to 4 years of experience working with Power Automate / Microsoft Flow.
- Strong knowledge of Microsoft 365 (O365) ecosystem and tools like SharePoint Online, Outlook, Excel, Teams.
- Experience in integrating Power Automate with third-party systems using APIs, HTTP connectors, and JSON.
- Good understanding of workflow logic, triggers, conditions, loops, and expressions in Power Automate.
- Experience in PowerApps development is a plus.
- Basic knowledge of scripting languages (e.g., JavaScript, PowerShell, or VB) is an advantage.
- Strong analytical and problem-solving skills.
- Excellent communication and documentation skills.
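The API/HTTP-connector integration listed above typically means POSTing JSON to a flow's "When an HTTP request is received" trigger URL. A minimal sketch of building such a payload in Python follows; the field names and the trigger URL are hypothetical, matching whatever JSON schema the flow itself defines.

```python
import json

# Hypothetical JSON body for a flow triggered by
# "When an HTTP request is received"; the flow would parse these
# fields against its own JSON schema. Field names are illustrative.
def build_flow_payload(requester, item_id, approved):
    payload = {
        "requester": requester,
        "itemId": item_id,
        "status": "approved" if approved else "rejected",
    }
    return json.dumps(payload)

body = build_flow_payload("asha@example.com", 42, True)
print(body)
# The request itself would be sent with something like:
#   requests.post(FLOW_TRIGGER_URL, data=body,
#                 headers={"Content-Type": "application/json"})
# where FLOW_TRIGGER_URL is the trigger URL Power Automate generates.
```

Validating the payload shape in code like this before wiring it to the flow keeps the JSON schema in the trigger and the caller in sync.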
Preferred Qualifications:
- Microsoft Power Platform certifications (e.g., PL-900, PL-100, PL-400).
- Experience with Dataverse, SQL Server, Azure Logic Apps, or Dynamics 365.
- Knowledge of Agile/Scrum methodologies.
Job Description – Ground Team (Field Engineers)
Qualification: 10+2 with a hardware and networking course, or a graduate with a hardware and networking course; core hardware and networking field profile.
Communication: Average English and good Hindi or local language
Location: Delhi, Mumbai, Gurgaon, Noida, Chennai, Bangalore
Package: INR 19000 – 21000 CTC Per Month Plus Conveyance as per Policy + Insurance
Experience:
• 1+ years (troubleshooting card-level hardware issues related to laptops, desktops, tablets, mobiles, printers, and other accessories)
Roles and Responsibilities:
• The Field Engineer provides on-site technical support and troubleshooting to corporate and retail clients.
• This role is critical for ensuring that the organization's end-users have a great experience with the brand's representative.
• The Field Engineer will visit the PUDO Centre twice a day: in the morning to collect the hardware parts needed for the day's calls, and in the evening to deposit the parts collected from clients back at the PUDO. The Field Engineer is responsible for successfully closing the allocated calls for the day.
• Must know laptop repairs for all brands and close tickets successfully.
Must Have:
• 2-wheeler with a valid license.
• Open for field profile.
(Leading Health & Nutrition based Startup in Mumbai)
- Creating Campaigns in Google ads, Facebook marketing & other online marketing Platforms
- Keyword Research & Monitoring
- Writing Ad copy/description
- Basic knowledge of Google Sheets
- Must be proficient in the English Language
- Monitors the latest trends in social media, including advertising formats, channels and technologies in order to improve campaign performance and provide recommendations
- Compile data across several platforms and create weekly/monthly reports, including analysis for insights, optimizations and future strategy development
- Displays organizational capabilities to track progress, execution and consistency of social advertising campaigns
- Running Email/SMS Campaigns
- Demonstrates understanding of and ability to facilitate and manage forecasting, budgeting and pacing, campaign creation and optimization
- Coordinate with Ad Agencies & take regular updates
- Coordinate with content team to get the required creatives
- Coordinate with production team to ensure inventory is in place before running campaigns
Pivotroots Digital is urgently looking for a Magento Developer.
Company Website - https://www.pivotroots.com/
Skills and Qualifications
- Solid understanding of Magento and ecommerce technologies
- Experience in working in Magento 2.x
- Performance improvement and Security updates for Magento
- Proficient in UI, HTML, and JavaScript usage
- Strong understanding of PHP back-end development
- Strong knowledge of MySQL database
- Good to have knowledge of WordPress
- Comfortable working with debugging tools like Firebug, Chrome inspector, etc.
- Knowledge of how to interact with RESTful APIs and formats (JSON, XML)
- Proficient understanding of code versioning tools such as GIT
What you'll be doing:
- You'll be creating best-in-class mobile designs for users on the Android platform, including visual & motion graphics and flash animation to explain concepts.
- Asset creation and icon design for the final deliverables for the phone release.
- You'll continuously keep an eye on the latest user-interaction designs and leverage them in your own and the team's work as necessary.
- Create a look consistent with all of Indus OS while pleasing our users.
- Present the user interface visually so information is easy to read, easy to understand, and easy to find.
Your Profile:
- Candidate with 4+ years of experience.
- B.Des / M.Des from IIT (IDC) / NID / Symbiosis / MIT / other design colleges.
- Creative and imaginative; loves to explore.
- Ability to translate a concept into a visual form that explains it, and able to justify each design decision.
- Strong penchant for all things technology.
- Comfortable presenting ideas and designs.
- Capable of working within deadlines in a fast-paced environment, often on multiple projects.
- Exceptional understanding of the graphic and the functional aspects of design and the ability to execute both.
- A strong and creative portfolio with varied projects (this is important to evaluate the capabilities of interested candidates).
- Proficient with Adobe design software (Photoshop, Illustrator, Dreamweaver, Flash).
- Key Technical Skills: Deep experience on Performance Engineering with understanding of Java/J2EE technologies.
- Experienced in defining and realizing end-to-end Technical Architecture for large-scale real-time enterprise systems. Ability to identify and define non-functional requirements and design systems to meet the same.
- Ability to review existing Architectures and identify Risks, Trade-offs, and share recommendations for addressing the identified issues.
- Demonstrate strong understanding of cloud architecture considerations when scaling and tuning application deployments. Must have hands on experience working on any of the Cloud deployments on AWS or Azure.
- Good experience on leveraging APM tools to provide deep dive analysis on performance problems. Deep understanding of the dashboards which can be built for CIO level interactions. Must have relevant experience on APM tools like Dynatrace or AppDynamics.
- Experience in performance optimization of J2EE systems on any of different types of application servers - WebLogic, WebSphere, JBoss etc. Deep expertise in any one of the application servers is a must.
- Experience in creating and reviewing technical documents like Architecture blueprint, Design specifications, Deployment architecture.
- Experience working on performance testing projects. Fair understanding of performance testing tools - Apache JMeter / Gatling / HP LoadRunner for load testing. Must be in a position to review performance testing programs and steer direction towards the right workload model and an appropriate test and monitoring strategy, build performance models, and arrive at the right capacity planning.
- Experience in Big Data Analytics like - Apache Kafka, Apache Storm, Apache Hadoop, Apache Spark.
- Good database skills across RDBMS (Oracle, MS-SQL, MySQL) and NoSQL stores (Cassandra, MongoDB)
- Exposure to Agile methodologies & Continuous Integration Tools
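The workload-model and capacity-planning points above can be illustrated with a small latency-measurement sketch. A real project would drive load with JMeter, Gatling, or LoadRunner against an actual endpoint; the target function here is only a stub, and the request count and concurrency are arbitrary example values.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

# Toy stand-in for the system under test; a real test would hit an
# actual endpoint via JMeter/Gatling rather than call a local stub.
def handle_request():
    time.sleep(0.001)  # simulate ~1 ms of service time

def measure_latencies(n_requests, concurrency):
    """Fire n_requests with the given concurrency; return latencies in ms."""
    def timed_call(_):
        start = time.perf_counter()
        handle_request()
        return (time.perf_counter() - start) * 1000.0
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        return list(pool.map(timed_call, range(n_requests)))

latencies = measure_latencies(n_requests=200, concurrency=10)
# statistics.quantiles with n=100 yields 99 cut points: index 49 is p50,
# index 94 is p95 - the percentiles a capacity plan is usually built on.
cuts = statistics.quantiles(latencies, n=100)
print(f"p50={cuts[49]:.2f} ms  p95={cuts[94]:.2f} ms")
```

Tracking p95 (rather than the mean) against a workload model is what lets the test program feed a defensible capacity plan.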
- Entrepreneur / Intrapreneur (someone who has built technology teams ground-up, built new solutions from scratch)
- Very sound understanding of technology and have a consultative approach.
- Sound understanding of complex enterprise IT environment and issues faced by CIOs in the digital era.
- Excellent Pre-sales experience and have played a key role in winning business along with the sales team.
- Excellent communication, interpersonal, liaison and problem-solving skills with the ability to work in a multi-cultural environment
- Good negotiation skills
- Go-getter and results-oriented
- High energy level with ability to work well under pressure
- Good relationship building skills. Someone who enjoys CIOs trust and has an ability to develop relationships at all levels (technology teams) of the customer organization.
- Data pre-processing, data transformation, data analysis, and feature engineering
- Performance optimization of scripts (code) and Productionizing of code (SQL, Pandas, Python or PySpark, etc.)
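The pre-processing and feature-engineering responsibilities above can be sketched in Pandas. The column names, the sample records, and the loan-amount binning rule below are hypothetical illustrations (echoing the loan/mortgage data mentioned elsewhere on this page), not an actual schema.

```python
import pandas as pd

# Hypothetical raw records; column names are illustrative only.
raw = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "loan_amount": [25000, None, 120000, 60000],
    "disbursed_on": ["2023-01-15", "2023-02-01", None, "2023-03-10"],
})

def preprocess(df):
    """Impute, cast, and derive simple features from raw loan records."""
    out = df.copy()
    # Impute missing loan amounts with the median (a common baseline).
    out["loan_amount"] = out["loan_amount"].fillna(out["loan_amount"].median())
    # Parse dates; invalid/missing values become NaT rather than raising.
    out["disbursed_on"] = pd.to_datetime(out["disbursed_on"], errors="coerce")
    # Derived features: month of disbursal and a coarse amount bucket.
    out["disbursal_month"] = out["disbursed_on"].dt.month
    out["amount_bucket"] = pd.cut(
        out["loan_amount"],
        bins=[0, 50000, 100000, float("inf")],
        labels=["small", "medium", "large"],
    )
    return out

features = preprocess(raw)
print(features[["customer_id", "loan_amount", "amount_bucket"]])
```

Keeping the transformation in a single pure function like `preprocess` is what makes the later productionizing step (PySpark port, performance tracing) tractable.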
- Required skills:
- Bachelor's in Computer Science, Data Science, Computer Engineering, IT, or equivalent
- Fluency in Python (Pandas), PySpark, SQL, or similar
- Azure data factory experience (min 12 months)
- Able to write efficient code using traditional and OO concepts and modular programming, following the SDLC process.
- Experience in production optimization and end-to-end performance tracing (technical root cause analysis)
- Ability to work independently with demonstrated experience in project or program management
- Azure experience; ability to translate data scientist code in Python and make it efficient for production cloud deployment