10+ OLAP Jobs in India
Key Responsibilities:
- Design, develop, and maintain ETL processes and data pipelines.
- Work with OLAP databases and ensure efficient data storage and retrieval.
- Utilize Apache Pinot for real-time data analytics.
- Implement and manage data integration using Airbyte.
- Orchestrate data workflows with Apache Airflow.
- Develop and maintain RESTful and GraphQL APIs for data services.
- Deploy and manage applications on Hasura Cloud.
- Collaborate with cross-functional teams to understand data requirements and provide scalable solutions.
- Ensure data quality, integrity, and security across all pipelines.
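The pipeline work described above can be sketched in miniature as a plain-Python extract-transform-load flow. This is a hedged illustration only: the field names and inline source data are hypothetical, and a real deployment would use Airflow operators, an Airbyte connection, and a Pinot table rather than in-memory functions.

```python
import csv
import io

# Hypothetical inline source standing in for an Airbyte-synced CSV extract.
RAW = """user_id,event,amount
1,purchase,10.50
2,refund,-3.25
1,purchase,4.00
"""

def extract(source: str) -> list[dict]:
    """Parse delimited text into row dicts (the 'extract' step of the pipeline)."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Cast types and drop non-positive amounts; real pipelines add validation here."""
    out = []
    for r in rows:
        amount = float(r["amount"])
        if amount > 0:
            out.append({"user_id": int(r["user_id"]), "amount": amount})
    return out

def load(rows: list[dict], store: dict) -> None:
    """Aggregate per user into a dict, standing in for a real-time OLAP store."""
    for r in rows:
        store[r["user_id"]] = store.get(r["user_id"], 0.0) + r["amount"]

store: dict[int, float] = {}
load(transform(extract(RAW)), store)
print(store)  # {1: 14.5}
```

In production each stage would be a separately scheduled, monitored task; the point here is only the shape of the flow.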
Required Skills and Experience:
- Proven experience in ETL development and data pipeline management.
- Strong understanding of OLAP systems.
- Hands-on experience with Apache Pinot.
- Proficiency in using Airbyte for data integration.
- Expertise in Apache Airflow for workflow orchestration.
- Experience in developing RESTful and GraphQL APIs.
- Familiarity with Hasura Cloud or similar cloud platforms.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills and ability to work in a collaborative team environment.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in backend engineering with a focus on data pipelines and ETL processes.
- Demonstrated ability to work in a fast-paced, dynamic environment.
Note: This is a contractual job for 6 months.
Job Description - Jasper
- Knowledge of Jasper report server administration, installation and configuration
- Knowledge of report deployment and configuration
- Knowledge of Jaspersoft Architecture and Deployment
- Knowledge of User Management in Jaspersoft Server
- Experience in developing Complex Reports using Jaspersoft Server and Jaspersoft Studio
- Understand the Overall architecture of Jaspersoft BI
- Experience in creating Ad Hoc Reports, OLAP, Views, Domains
- Experience in report server (Jaspersoft) integration with web application
- Experience with the JasperReports Server web services API and the Jaspersoft Visualize.js API
- Experience in creating dashboards with visualizations
- Experience in security and auditing, metadata layer
- Experience in Interacting with stakeholders for requirement gathering and Analysis
- Good knowledge of ETL design and development, and of logical and physical data modeling (relational and dimensional)
- Strong self-initiative to strive for both personal and technical excellence
- Coordinate efforts across Product development team and Business Analyst team.
- Strong business and data analysis skills.
- Domain knowledge of healthcare is an advantage
- Should be strong at coordinating with onshore resources on development
- Data-oriented professional with good communication skills and a great eye for detail
- Interpret data, analyze results and provide insightful inferences
- Maintain relationship with Business Intelligence stakeholders
- Strong Analytical and Problem Solving skills
Company Overview:
At Codvo, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day.
We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.
Job Description :
- Candidate should have strong technical and analytical skills, particularly in SQL Server, reporting tools such as Tableau, Power BI, and SSRS, and .NET.
- Candidate should have enough experience to properly understand the project deliverables.
- Candidate will be responsible for the tasks assigned to them in the project.
- Candidate will be responsible for delivering with proper quality, on time and within budget, adhering to the industry standards defined for the project.
- Candidate should be involved in client interaction.
- Candidate should possess excellent communication skills.
Required Skills : BI Gateway, MS SQL Server, Tableau, Power BI, .NET, OLAP, UI/UX, Dashboard Building
Experience : 5+ years
Job Location : Remote/Saudi Arabia
Work Timings : 2:30 pm to 11:30 pm
Power BI Engineer
What we're looking for :
- You should have strong technical and analytical skills, particularly in SQL Server, reporting tools such as Tableau, Power BI, and SSRS, and .NET.
- You should have experience in OLAP, UI/UX, and dashboard building.
- You should have enough experience to properly understand the project deliverables.
- You should possess excellent communication skills.
Responsibilities:
- You will be responsible for the tasks assigned to you in the project.
- You will be responsible for delivering with proper quality, on time and within budget, adhering to the industry standards defined for the project.
- You will be involved in client interaction.
- Manage different versions of complex code and distribute them to different teams in the organization using TFS.
- Develop ASP.NET applications to input and manage a production schedule, production statistical analysis, and trend reporting.
- Create routines for importing data from XML, CSV, and comma-delimited files.
- Filter and cleanse OLTP data with complex stored procedures and SSIS packages in the staging area.
- Develop a dimensional database and OLAP cube using SSAS for analysis, maintenance and good customer service.
- Involve in development and implementation of SSIS, SSRS and SSAS application solutions for various business units across the organization.
- Extract data from XML and load it into the dimensional model.
- Maintain SQL scripts, indexes, and complex queries for data analysis and extraction.
- Create advanced reports such as dashboards and scorecards using SharePoint and Power Pivot for better presentation of data.
- Create SSAS cubes for reports that require complex calculations, such as calculating the premium for a particular policy.
- Create SharePoint sub sites, lists, libraries, folders and apply site permissions according to the given requirement.
- Work with business analysts, subject matter experts, and other team members to determine data extraction and transformation requirements.
- Develop T-SQL functions and stored procedures to support complex business requirements.
- Create and configure an OLTP replication database for disaster recovery purposes.
- Design, implement, and maintain OLAP servers and processes to replicate production data to the server.
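As a rough illustration of what an OLAP cube in SSAS computes, here is a minimal group-by rollup in plain Python. The dimensions, members, and premium figures are hypothetical; a real cube would be defined over a dimensional model and queried with MDX or DAX rather than built in application code.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical fact rows: (region, product, premium).
facts = [
    ("north", "auto", 100.0),
    ("north", "home", 250.0),
    ("south", "auto", 80.0),
]

def rollup(rows):
    """Aggregate premium over every subset of the two dimensions,
    mimicking the cells of a tiny OLAP cube ('ALL' = aggregated-away axis)."""
    cube = defaultdict(float)
    for region, product, premium in rows:
        dims = {"region": region, "product": product}
        for k in range(len(dims) + 1):
            for keys in combinations(sorted(dims), k):
                cell = tuple((d, dims[d] if d in keys else "ALL")
                             for d in sorted(dims))
                cube[cell] += premium
    return dict(cube)

cube = rollup(facts)
# Grand total (both dimensions aggregated away):
print(cube[(("product", "ALL"), ("region", "ALL"))])  # 430.0
# Subtotal for product=auto across all regions:
print(cube[(("product", "auto"), ("region", "ALL"))])  # 180.0
```

Each cell of the dict corresponds to one coordinate of the cube, which is why slicing by any dimension combination is a constant-time lookup once the rollup is built.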
What is the role?
You will be responsible for developing and designing front-end web architecture, ensuring the responsiveness of applications, and working alongside graphic designers on web design features, among other duties. You will also be responsible for the functional/technical track of the project.
Key Responsibilities
- Develop and automate large-scale, high-performance data processing systems (batch and/or streaming).
- Build high-quality software engineering practices towards building data infrastructure and pipelines at scale.
- Lead data engineering projects to ensure pipelines are reliable, efficient, testable, and maintainable.
- Optimize performance to meet high-throughput and scale requirements.
What are we looking for?
- 4+ years of relevant industry experience.
- Working with data at the terabyte scale.
- Experience designing, building and operating robust distributed systems.
- Experience designing and deploying high throughput and low latency systems with reliable monitoring and logging practices.
- Building and leading teams.
- Working knowledge of relational databases like PostgreSQL/MySQL.
- Experience with Python / Spark / Kafka / Celery
- Experience working with OLTP and OLAP systems
- Excellent communication skills, both written and verbal.
- Experience working in cloud e.g., AWS, Azure or GCP
Whom will you work with?
You will work with a top-notch tech team, working closely with the architect and engineering head.
What can you look for?
A wholesome opportunity in a fast-paced environment that will enable you to juggle between concepts, yet maintain the quality of content, interact and share your ideas and have loads of learning while at work. Work with a team of highly talented young professionals and enjoy the benefits of being at this company
We are
We strive to make selling fun with our SaaS incentive gamification product. Company is the #1 gamification software that automates and digitizes Sales Contests and Commission Programs. With game-like elements, rewards, recognitions, and complete access to relevant information, Company turbocharges an entire salesforce. Company also empowers Sales Managers with easy-to-publish game templates, leaderboards, and analytics to help accelerate performances and sustain growth.
We are a fun and high-energy team, with people from diverse backgrounds - united under the passion of getting things done. Rest assured that you shall get complete autonomy in your tasks and ample opportunities to develop your strengths.
Way forward
If you find this role exciting and want to join us in Bangalore, India, then apply by clicking below. Provide your details and upload your resume. All received resumes will be screened, shortlisted candidates will be requested to join for a discussion and on mutual alignment and agreement, we will proceed with hiring.
Skillsets - Azure, OLAP, ETL, SQL, Python, C#
Experience range - 3 to 4 years
Salary - best in industry
Notice period - currently serving notice period (immediate joiners are preferred)
Location - remote
Job type - permanent role, full time, and fully remote
Note: The interview has 3 rounds - a technical round, a manager/client round, and an HR round.
They provide both wholesale and retail funding.
- Key responsibility is to design, develop & maintain efficient Data models for the organization maintained to ensure optimal query performance by the consumption layer.
- Developing, Deploying & maintaining a repository of UDXs written in Java / Python.
- Develop optimal data model designs, analyze complex distributed data deployments, and make recommendations to optimize performance based on data consumption patterns, performance expectations, the queries executed on the tables/databases, etc.
- Periodic database health checks and maintenance
- Designing collections in a NoSQL database for efficient performance
- Document and maintain a data dictionary from various sources to enable data governance
- Coordinate with business teams, IT, and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading into downstream systems, NoSQL databases, etc.
- Data Governance Process Implementation and ensuring data security
Requirements
- Extensive working experience in Designing & Implementing Data models in OLAP Data Warehousing solutions (Redshift, Synapse, Snowflake, Teradata, Vertica, etc).
- Programming experience using Python / Java.
- Working knowledge in developing & deploying User-defined Functions (UDXs) using Java / Python.
- Strong understanding & extensive working experience in OLAP Data Warehousing (Redshift, Synapse, Snowflake, Teradata, Vertica, etc) architecture and cloud-native Data Lake (S3, ADLS, BigQuery, etc) Architecture.
- Strong knowledge in Design, Development & Performance tuning of 3NF/Flat/Hybrid Data Model.
- Extensive technical experience in SQL including code optimization techniques.
- Strong knowledge of database performance tuning and troubleshooting.
- Knowledge of collection design in any No-SQL DB (DynamoDB, MongoDB, CosmosDB, etc), along with implementation of best practices.
- Ability to understand business functionality, processes, and flows.
- Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently.
- Any OLAP DWH DBA experience and user management experience will be an added advantage.
- Knowledge of financial-industry-specific data models such as FSLDM, the IBM Financial Data Model, etc. will be an added advantage.
- Experience with Snowflake will be an added advantage.
- Working experience in BFSI/NBFC and a data understanding of loan/mortgage data will be an added advantage.
Functional knowledge
- Data Governance & Quality Assurance
- Modern OLAP Database Architecture & Design
- Linux
- Data structures, algorithm & data modeling techniques
- No-SQL database architecture
- Data Security
Work closely with different Front Office and Support Function stakeholders, including but not restricted to Business Management, Accounts, Regulatory Reporting, Operations, Risk, Compliance, and HR, on all data collection and reporting use cases.
Collaborate with Business and Technology teams to understand enterprise data, create an innovative narrative to explain, engage and enlighten regular staff members as well as executive leadership with data-driven storytelling
Solve data consumption and visualization through data as a service distribution model
Articulate findings clearly and concisely for different target use cases, including through presentations, design solutions, visualizations
Perform ad-hoc / automated report generation tasks using Power BI, Oracle BI, Informatica
Perform data access/transfer and ETL automation tasks using Python, SQL, OLAP / OLTP, RESTful APIs, and IT tools (CFT, MQ-Series, Control-M, etc.)
Provide support and maintain the availability of BI applications irrespective of the hosting location
Resolve issues escalated from Business and Functional areas on data quality, accuracy, and availability, provide incident-related communications promptly
Work with strict deadlines on high priority regulatory reports
Serve as a liaison between business and technology to ensure that data-related business requirements for protecting sensitive data are clearly defined, communicated, well understood, and considered as part of operational prioritization and planning
To work for APAC Chief Data Office and coordinate with a fully decentralized team across different locations in APAC and global HQ (Paris).
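The data-quality escalations mentioned above come down to checks like the following minimal sketch. The record shape and field names are hypothetical; production checks would typically run as SQL against the warehouse before a regulatory report is generated.

```python
# Hypothetical report rows: flag missing fields and duplicate IDs
# before a regulatory report is generated from them.
rows = [
    {"id": "T1", "desk": "FX", "notional": 1_000_000},
    {"id": "T2", "desk": None, "notional": 500_000},
    {"id": "T1", "desk": "FX", "notional": 1_000_000},
]

def quality_report(records):
    """Return counts of null field values and duplicate IDs."""
    nulls = sum(1 for r in records for v in r.values() if v is None)
    seen, dupes = set(), 0
    for r in records:
        if r["id"] in seen:
            dupes += 1
        seen.add(r["id"])
    return {"null_values": nulls, "duplicate_ids": dupes}

print(quality_report(rows))  # {'null_values': 1, 'duplicate_ids': 1}
```

A report like this gives the incident communication something concrete to point at: which completeness or uniqueness rule failed, and by how much.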
General Skills:
Excellent knowledge of RDBMS and hands-on experience with complex SQL is a must, some experience in NoSQL and Big Data Technologies like Hive and Spark would be a plus
Experience with industrialized reporting on BI tools like Power BI and Informatica
Knowledge of data related industry best practices in the highly regulated CIB industry, experience with regulatory report generation for financial institutions
Knowledge of industry-leading data access, data security, Master Data, and Reference Data Management, and establishing data lineage
5+ years of experience in Data Visualization / Business Intelligence / ETL developer roles
Ability to multi-task and manage various projects simultaneously
Attention to detail
Ability to present to Senior Management, ExCo; excellent written and verbal communication skills
2) Expertise in developing OLAP cubes and developing complex calculations, aggregations, and dynamic security models using MDX/DAX functions in Azure Analysis Services or SSAS
3) Extensive use of Performance Monitor / SQL Profiler / DMVs to resolve deadlocks, monitor long-running queries, and troubleshoot cubes, SQL, and T-SQL
Roles & Responsibilities :
1) SSAS or Azure Analysis Services lead developer with 7+ years of experience in SSAS Azure data model development, SSAS data model deployment in Azure, and querying data from SSAS Azure to build reports
2) Design and create SSAS/OLAP/OLTP/Tabular cubes and automate processes for analytical needs
3) Write optimized SQL queries for integration with other applications; maintain data quality and oversee database security, partitions, and indexes