11+ OLAP Jobs in Hyderabad | OLAP Job openings in Hyderabad
Apply to 11+ OLAP Jobs in Hyderabad on CutShort.io. Explore the latest OLAP Job opportunities across top companies like Google, Amazon & Adobe.

Global digital transformation solutions provider.
JOB DETAILS:
* Job Title: Lead II - Software Engineering - AWS, Apache Spark (PySpark/Scala), Apache Kafka
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 5-8 years
* Location: Hyderabad
Job Summary
We are seeking a skilled Data Engineer to design, build, and optimize scalable data pipelines and cloud-based data platforms. The role involves working with large-scale batch and real-time data processing systems, collaborating with cross-functional teams, and ensuring data reliability, security, and performance across the data lifecycle.
Key Responsibilities
ETL Pipeline Development & Optimization
- Design, develop, and maintain complex end-to-end ETL pipelines for large-scale data ingestion and processing.
- Optimize data pipelines for performance, scalability, fault tolerance, and reliability.
Big Data Processing
- Develop and optimize batch and real-time data processing solutions using Apache Spark (PySpark/Scala) and Apache Kafka.
- Ensure fault-tolerant, scalable, and high-performance data processing systems.
Cloud Infrastructure Development
- Build and manage scalable, cloud-native data infrastructure on AWS.
- Design resilient and cost-efficient data pipelines adaptable to varying data volume and formats.
Real-Time & Batch Data Integration
- Enable seamless ingestion and processing of real-time streaming and batch data sources (e.g., AWS MSK).
- Ensure consistency, data quality, and a unified view across multiple data sources and formats.
Data Analysis & Insights
- Partner with business teams and data scientists to understand data requirements.
- Perform in-depth data analysis to identify trends, patterns, and anomalies.
- Deliver high-quality datasets and present actionable insights to stakeholders.
CI/CD & Automation
- Implement and maintain CI/CD pipelines using Jenkins or similar tools.
- Automate testing, deployment, and monitoring to ensure smooth production releases.
Data Security & Compliance
- Collaborate with security teams to ensure compliance with organizational and regulatory standards (e.g., GDPR, HIPAA).
- Implement data governance practices ensuring data integrity, security, and traceability.
Troubleshooting & Performance Tuning
- Identify and resolve performance bottlenecks in data pipelines.
- Apply best practices for monitoring, tuning, and optimizing data ingestion and storage.
Collaboration & Cross-Functional Work
- Work closely with engineers, data scientists, product managers, and business stakeholders.
- Participate in agile ceremonies, sprint planning, and architectural discussions.
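The anomaly-screening responsibility above ("identify trends, patterns, and anomalies") can be sketched in plain Python. This is an illustrative stand-in, not part of the listing: the z-score rule, the 2.0 threshold, and the sample counts are all assumptions.

```python
import statistics

def find_anomalies(values, z_threshold=2.0):
    """Flag points whose z-score exceeds the threshold.

    A simple stand-in for the kind of anomaly screening a data
    engineer might run on daily record counts before handing a
    dataset to stakeholders; the threshold is an assumption.
    """
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []
    return [v for v in values if abs(v - mean) / stdev > z_threshold]

# Daily record counts with one obvious spike (illustrative data).
counts = [100, 102, 98, 101, 99, 100, 500]
print(find_anomalies(counts))
```

In production this check would run inside the pipeline (e.g., as a Spark aggregation), but the logic is the same.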
Skills & Qualifications
Mandatory (Must-Have) Skills
- AWS Expertise
- Hands-on experience with AWS Big Data services such as EMR, Managed Apache Airflow, Glue, S3, DMS, MSK, and EC2.
- Strong understanding of cloud-native data architectures.
- Big Data Technologies
- Proficiency in PySpark or Scala Spark and SQL for large-scale data transformation and analysis.
- Experience with Apache Spark and Apache Kafka in production environments.
- Data Frameworks
- Strong knowledge of Spark DataFrames and Datasets.
- ETL Pipeline Development
- Proven experience in building scalable and reliable ETL pipelines for both batch and real-time data processing.
- Database Modeling & Data Warehousing
- Expertise in designing scalable data models for OLAP and OLTP systems.
- Data Analysis & Insights
- Ability to perform complex data analysis and extract actionable business insights.
- Strong analytical and problem-solving skills with a data-driven mindset.
- CI/CD & Automation
- Basic to intermediate experience with CI/CD pipelines using Jenkins or similar tools.
- Familiarity with automated testing and deployment workflows.
Good-to-Have (Preferred) Skills
- Knowledge of Java for data processing applications.
- Experience with NoSQL databases (e.g., DynamoDB, Cassandra, MongoDB).
- Familiarity with data governance frameworks and compliance tooling.
- Experience with monitoring and observability tools such as AWS CloudWatch, Splunk, or Dynatrace.
- Exposure to cost optimization strategies for large-scale cloud data platforms.
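The fault-tolerant batch processing this role calls for (Spark/Kafka in practice) can be sketched in plain Python. A minimal stand-in, assuming an illustrative JSON record schema: bad records are routed to a dead-letter list instead of failing the whole batch, the same pattern a Spark job would apply per partition.

```python
import json

def transform_batch(raw_records):
    """Parse and validate a batch of JSON records.

    Pure-Python stand-in for a Spark transformation stage: good
    records are normalized, bad ones are routed to a dead-letter
    list (fault tolerance). The id/amount fields are illustrative.
    """
    good, dead_letter = [], []
    for raw in raw_records:
        try:
            rec = json.loads(raw)
            good.append({"id": int(rec["id"]), "amount": float(rec["amount"])})
        except (ValueError, KeyError, TypeError) as exc:
            dead_letter.append({"record": raw, "error": str(exc)})
    return good, dead_letter

batch = ['{"id": 1, "amount": "9.50"}', '{"id": 2}', 'not json']
good, bad = transform_batch(batch)
print(len(good), len(bad))
```

Dead-lettered records would typically land in a separate S3 prefix or Kafka topic for later inspection rather than being silently dropped.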
Skills: big data, scala spark, apache spark, ETL pipeline development
******
Notice period - 0 to 15 days only
Job stability is mandatory
Location: Hyderabad
Note: If a candidate is a short joiner, based in Hyderabad, and fits within the approved budget, we will proceed with an offer.
F2F Interview: 14th Feb 2026
3 days in office, Hybrid model.
Position Overview:
We are looking for a React Native Developer with 1-2 years of experience in developing mobile applications. The ideal candidate should have a solid understanding of JavaScript, TypeScript, and React Native, along with a keen interest in building high-quality, scalable mobile applications.
Key Responsibilities:
- Develop and maintain cross-platform mobile applications using React Native.
- Work closely with designers, product managers, and backend developers to deliver seamless user experiences.
- Integrate third-party services such as Google Maps, Firebase, and payment gateways.
- Optimize applications for performance, responsiveness, and scalability.
- Work with RESTful APIs for backend integration.
- Debug and troubleshoot application issues to ensure a smooth user experience.
- Stay updated with the latest React Native features and industry trends.
Required Skills:
- 1-2 years of hands-on experience in React Native mobile app development.
- Proficiency in JavaScript, TypeScript, and React Hooks.
- Experience in mobile UI development, animations, and responsiveness.
- Familiarity with iOS (Xcode) and Android (Android Studio) environments.
- Experience integrating APIs, Firebase, push notifications, and authentication mechanisms.
- Knowledge of Git and Agile methodologies.
- Strong problem-solving skills and a collaborative mindset.
Job Description
Experience: 4 – 8 Years
Location: Hyderabad
Employment Type: Fulltime
Key Responsibilities
- Design, develop, and implement backend services using Java (latest version), Spring Boot, and Microservices architecture.
- Participate in the end-to-end development lifecycle, from requirement analysis to deployment and support.
- Collaborate with cross-functional teams (UI/UX, DevOps, Product) to deliver high-quality, scalable software solutions.
- Integrate APIs and manage data flow between services and front-end systems.
- Work on cloud-based deployment using AWS or GCP environments.
- Ensure performance, security, and scalability of services in production.
- Contribute to technical documentation, code reviews, and best practice implementations.
Required Skills
- Strong hands-on experience with Core Java (latest versions), Spring Boot, and Microservices.
- Solid understanding of RESTful APIs, JSON, and distributed systems.
- Basic knowledge of Kubernetes (K8s) for containerization and orchestration.
- Working experience or strong conceptual understanding of cloud platforms (AWS / GCP).
- Exposure to CI/CD pipelines, version control (Git), and deployment automation.
- Familiarity with security best practices, logging, and monitoring tools.
Preferred Skills
- Experience with end-to-end deployment on AWS or GCP.
- Familiarity with payment gateway integrations or fintech applications.
- Understanding of DevOps concepts and infrastructure-as-code tools (Added advantage).
3+ years of experience as an Oracle Cloud Techno-Functional Consultant
(70% technical, 30% functional).
5-7 years of experience in the IT field.
Strong hands-on experience in Oracle SCM or Oracle Financials modules as a technical consultant.
Strong experience working with the Oracle Cloud ERP application, particularly on reporting (BI and OTBI).
Advanced understanding of Oracle PaaS offerings and architecture.
Must have knowledge of:
1) Experience in creation and customization of reports/forms and XML Publisher. Should have worked in functional modules (SCM/Finance).
2) Experience in developing or customizing reports using BI templates.
3) Experience in building Oracle BIP reports, Analysis reports, OTBI, and reports using data models.
4) Hands-on experience in BI Publisher (RTF design/eText/scheduling/parameter handling/bursting/backup and migration of reports to different pods).
5) Strong knowledge of writing SQL queries using Oracle Financials or Oracle SCM database tables.
6) Very strong SQL/PLSQL skills to create custom queries in BI Publisher and the Oracle Fusion environment.
7) Proficiency in creating reports using templates like RTF, Excel, pipe-delimited, stylesheet, eText, etc.
8) Expertise in working with cubes, including extracting data from cubes, joins, etc.
9) Hands-on experience in converting reports to ESS jobs for scheduled reporting.
10) Hands-on experience in migrating reports between environments.
11) Excellent communication skills.
Good to have knowledge of:
1) Expertise in conversions and integrations via FBDI and web services.
2) Experience in AP check printing, positive pay templates, PO printing, AP printing, automated AR invoicing, Lockbox processing, and bank statement integration.
3) Solid understanding of performance-tuning best practices and experience improving end-to-end processing times.
4) Additional reporting tools: Analysis Dashboard, Essbase, Financial Reporting Studio (FRS), and Smart View reports.
5) Knowledge of integrations and the OIC module.
We are seeking a highly skilled and experienced Offshore Data Engineer. The role involves designing, implementing, and testing data pipelines and products.
Qualifications & Experience:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data engineering, with expertise in data architecture and pipeline development.
- Proven experience with GCP, BigQuery, Databricks, Airflow, Spark, dbt, and GCP services.
- Hands-on experience with ETL processes, SQL, PostgreSQL, MySQL, MongoDB, and Cassandra.
- Strong proficiency in Python and data modelling.
- Experience in testing and validation of data pipelines.
- Preferred: Experience with eCommerce systems, data visualization tools (Tableau, Looker), and cloud certifications.
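The "testing and validation of data pipelines" requirement above can be sketched as plain-Python checks on a pipeline's output; the column names and rules here are illustrative assumptions, not anything the listing specifies.

```python
def validate_output(rows, required_columns, not_null=()):
    """Return a list of validation failures for a pipeline's output.

    Illustrative checks only: schema completeness and not-null
    constraints. Real pipelines would add type, range, and
    referential checks (e.g., via a framework like dbt tests).
    """
    failures = []
    for i, row in enumerate(rows):
        missing = required_columns - row.keys()
        if missing:
            failures.append((i, f"missing columns: {sorted(missing)}"))
        for col in not_null:
            if row.get(col) is None:
                failures.append((i, f"null value in {col}"))
    return failures

# Hypothetical pipeline output with one bad row.
sample_rows = [{"order_id": 1, "total": 9.5}, {"order_id": None, "total": 3.0}]
print(validate_output(sample_rows, {"order_id", "total"}, not_null=("order_id",)))
```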
If you meet the above criteria and are interested, please share your updated CV along with the following details:
Total Experience:
Current CTC:
Expected CTC:
Current Location:
Preferred Location:
Notice Period / Last Working Day (if serving notice):
⚠️ Kindly share your details only if you have not applied recently or are not currently in the interview process for any open roles at Xebia.
Looking forward to your response!
Technical Skills:
- Ability to understand and translate business requirements into design.
- Proficient in AWS infrastructure components such as S3, IAM, VPC, EC2, and Redshift.
- Experience in creating ETL jobs using Python/PySpark.
- Proficiency in creating AWS Lambda functions for event-based jobs.
- Knowledge of automating ETL processes using AWS Step Functions.
- Competence in building data warehouses and loading data into them.
Responsibilities:
- Understand business requirements and translate them into design.
- Assess AWS infrastructure needs for development work.
- Develop ETL jobs using Python/PySpark to meet requirements.
- Implement AWS Lambda for event-based tasks.
- Automate ETL processes using AWS Step Functions.
- Build data warehouses and manage data loading.
- Engage with customers and stakeholders to articulate the benefits of proposed solutions and frameworks.
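The event-based Lambda pattern above can be sketched as a plain handler. The bucket/key extraction follows the documented S3 event payload shape (keys arrive URL-encoded); the function and bucket names and the returned dict are illustrative assumptions.

```python
import urllib.parse

def handler(event, context=None):
    """Entry point for an S3-triggered ETL job (sketch).

    Extracts bucket/key pairs from an S3 event payload; a real
    function would then read each object and kick off the ETL step
    (or hand off to a Step Functions execution).
    """
    processed = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        bucket = s3["bucket"]["name"]
        # S3 URL-encodes object keys in event payloads ('+' = space).
        key = urllib.parse.unquote_plus(s3["object"]["key"])
        processed.append((bucket, key))
    return {"processed": processed}

# Fake event in the documented S3 put-event shape (trimmed).
fake_event = {
    "Records": [
        {"s3": {"bucket": {"name": "raw-zone"},
                "object": {"key": "orders/2024/part+0001.json"}}}
    ]
}
print(handler(fake_event))
```

A Step Functions state machine would typically sit behind this handler to sequence the downstream transform and load steps.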
Overview:
We are currently seeking an energetic Sales Development Representative (SDR). This position is an integral part of our sales engine. SDRs are a specialised group of individuals always "hunting" for high-value opportunities.
SDRs are focused on the front end of the sales cycle and are the face of KEKA. This individual must be a highly motivated self-starter who is able to identify and develop opportunities from multiple sources, including prospect lists, discovery, and individual research.
A typical day at KEKA being an SDR would involve:
- Generating qualified opportunities for the company by rigorously prospecting and researching in the given region/market
- Striking up and initiating conversations with high-profile personas at the companies you are prospecting into
- Being the face of KEKA and introducing the company and its products to prospects
- Getting creative with writing emails/InMails and grabbing the attention of prospects
- Being adaptable to change and eager to learn new ways to reach prospects and bring them on board for a meeting
Requirements:
- 1-4 years of work experience in any customer-facing sales like Sales/inside sales/cold calling/business development
- Clear, concise, and effective written and oral communication skills.
- Empathy towards customers and understanding their needs.
- Interest, curiosity, and openness to learning new technologies.
- Prior exposure to tools like LinkedIn Sales Navigator, DiscoverOrg, ZoomInfo, etc.
- Good interpersonal skills and ability to collaborate with internal stakeholders as well as end customers.
- Learning mindset and the right attitude that will help you thrive and adapt in a fast-paced, performance-driven environment.
- Ability to handle rejections and stay focused and driven.
- Ability to multi-task and manage your tasks effectively.
- Flexibility to work in different shifts/regions; this is absolutely mandatory.
Good To Have:
- Prior work experience in SaaS product companies in domains relevant to Keka products.
- A proven track record of consistency in overachieving targets.
SKILL AND EXPERIENCE REQUIRED:
- 2+ years of experience with Java.
- 2+ years of hands-on HTML5/CSS3 experience.
- Experience with popular JavaScript frameworks such as ReactJS or Angular 2.0+ preferred.
- Experience working with HTTP/1.1 and HTTP/2.
- Experience with RESTful APIs and JSON-RPC.
- Ability to write clean, bug-free code that is easy to understand and maintain.
- Experience following Git workflows.
- Working knowledge of DevOps tools (e.g., Terraform, Ansible, Jenkins, Kubernetes, Helm) and CI/CD pipelines.
- Microsoft development experience using C#, ASP.NET Core Web API, MVC, and authentication and authorization; proficient in developing large-scale web applications using the .NET framework.
- Hands-on experience setting up applications using Azure Functions, Azure SQL, and NoSQL.
- Guide the development team in building applications on Azure Cloud.
- Learn and research new solutions for application development that can be applied to the business problems we solve.
Requirements:
- Problem solver who uses data and has a deep understanding of designing and setting up solutions.
- Minimum of 2 years of experience using and implementing Azure Active Directory and Azure Functions with various triggers.
- Strong interpersonal and communication skills, and flexibility to work US hours if needed.
- Flexibility to learn new technologies and apply them to solve business problems.
- Strong communication skills (verbal and written).
- Extremely detail-oriented and well-organized.
- Ability to positively engage with clients and build strong long-term relationships.
- Ability to work efficiently and effectively in a fast-paced environment and under deadlines.
About Us:
DataBeat.io is a data and analytics services company that provides big data, analytics, and operations management services to companies globally.
Working at DataBeat.io puts you at the forefront of the big data and analytics ecosystem. You will work with clients who are leading companies developing breakthrough solutions, concepts that are shaping the technology world, and cutting-edge tools. It is a fast-growing company where your performance and contribution could move you into leadership positions fairly quickly.
Must Have
- 1 to 6 years of development experience in Java/J2EE.
- 1+ years of experience in Spring and Hibernate.
- 1+ years of experience in developing REST APIs.
- 1+ years of experience in developing Spring Boot applications.
- Hands-on experience in unit testing.
- Hands-on experience in MVC frameworks (AngularJS/Angular 7/8).
- Understanding of microservices.
- Understanding of Agile methodologies.
- Working experience with DB technologies.
- Strong analytical and problem-solving skills.
- Aptitude for innovation, working independently, and thinking 'outside of the box'.
• Bachelor's Degree in Computer Science or related field.
• 2+ years in full-stack engineering roles.
• 2+ years of strong coding skills in at least one programming language that we use.
Desired Skills:
• Excellent written and verbal communication skills.
• Willingness to learn new technologies.
• Ability to debug, solve and fix bugs independently.
Responsibilities:
• Build new and enhance products and services.
• Design, develop, validate, maintain, release, and operate our applications.
• Take complete ownership of the project or features from design to implementation and to deployment.
• Collaborate with the rest of the team to deliver the products and features.
Candidates with less than 2 years of experience will not be considered for this position. Please do not apply if you are a fresher or a professional with less than 2 years of work experience.



