11+ OLAP Jobs in Hyderabad | OLAP Job openings in Hyderabad
Apply to 11+ OLAP Jobs in Hyderabad on CutShort.io. Explore the latest OLAP Job opportunities across top companies like Google, Amazon & Adobe.

Global digital transformation solutions provider.
JOB DETAILS:
* Job Title: Lead II - Software Engineering - AWS, Apache Spark (PySpark/Scala), Apache Kafka
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 5-8 years
* Location: Hyderabad
Job Summary
We are seeking a skilled Data Engineer to design, build, and optimize scalable data pipelines and cloud-based data platforms. The role involves working with large-scale batch and real-time data processing systems, collaborating with cross-functional teams, and ensuring data reliability, security, and performance across the data lifecycle.
Key Responsibilities
ETL Pipeline Development & Optimization
- Design, develop, and maintain complex end-to-end ETL pipelines for large-scale data ingestion and processing.
- Optimize data pipelines for performance, scalability, fault tolerance, and reliability.
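A pipeline of this shape typically follows an extract → transform → load pattern. As a minimal pure-Python sketch (function and field names are illustrative only; production pipelines for this role would run on PySpark/AWS EMR rather than plain Python):

```python
# Minimal ETL sketch. All names (extract, transform, load, "amount")
# are illustrative; real sources would be S3, Kafka, or a database.

def extract():
    # Stand-in for reading raw records from an upstream source.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "bad"}]

def transform(records):
    # Cast fields and drop malformed rows -- the cleansing step.
    clean = []
    for r in records:
        try:
            clean.append({"id": r["id"], "amount": float(r["amount"])})
        except ValueError:
            continue  # in production, route bad rows to a dead-letter store
    return clean

def load(records):
    # Stand-in for writing to a warehouse table; returns rows written.
    return len(records)

loaded = load(transform(extract()))
print(loaded)  # 1 row survives cleansing
```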
Big Data Processing
- Develop and optimize batch and real-time data processing solutions using Apache Spark (PySpark/Scala) and Apache Kafka.
- Ensure fault-tolerant, scalable, and high-performance data processing systems.
Cloud Infrastructure Development
- Build and manage scalable, cloud-native data infrastructure on AWS.
- Design resilient and cost-efficient data pipelines adaptable to varying data volume and formats.
Real-Time & Batch Data Integration
- Enable seamless ingestion and processing of real-time streaming and batch data sources (e.g., AWS MSK).
- Ensure consistency, data quality, and a unified view across multiple data sources and formats.
Data Analysis & Insights
- Partner with business teams and data scientists to understand data requirements.
- Perform in-depth data analysis to identify trends, patterns, and anomalies.
- Deliver high-quality datasets and present actionable insights to stakeholders.
CI/CD & Automation
- Implement and maintain CI/CD pipelines using Jenkins or similar tools.
- Automate testing, deployment, and monitoring to ensure smooth production releases.
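For the Jenkins item above, a declarative pipeline is the usual starting point. A hedged sketch (stage names, shell commands, and the deploy script are placeholders, not this employer's actual configuration):

```groovy
// Illustrative declarative Jenkinsfile; commands are placeholders.
pipeline {
    agent any
    stages {
        stage('Test') {
            steps { sh 'pytest tests/' }          // automated test step
        }
        stage('Build') {
            steps { sh 'docker build -t app:latest .' }
        }
        stage('Deploy') {
            when { branch 'main' }                // release only from main
            steps { sh './deploy.sh' }            // placeholder deploy script
        }
    }
}
```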
Data Security & Compliance
- Collaborate with security teams to ensure compliance with organizational and regulatory standards (e.g., GDPR, HIPAA).
- Implement data governance practices ensuring data integrity, security, and traceability.
Troubleshooting & Performance Tuning
- Identify and resolve performance bottlenecks in data pipelines.
- Apply best practices for monitoring, tuning, and optimizing data ingestion and storage.
Collaboration & Cross-Functional Work
- Work closely with engineers, data scientists, product managers, and business stakeholders.
- Participate in agile ceremonies, sprint planning, and architectural discussions.
Skills & Qualifications
Mandatory (Must-Have) Skills
- AWS Expertise
- Hands-on experience with AWS Big Data services such as EMR, Managed Apache Airflow, Glue, S3, DMS, MSK, and EC2.
- Strong understanding of cloud-native data architectures.
- Big Data Technologies
- Proficiency in PySpark or Scala Spark and SQL for large-scale data transformation and analysis.
- Experience with Apache Spark and Apache Kafka in production environments.
- Data Frameworks
- Strong knowledge of Spark DataFrames and Datasets.
- ETL Pipeline Development
- Proven experience in building scalable and reliable ETL pipelines for both batch and real-time data processing.
- Database Modeling & Data Warehousing
- Expertise in designing scalable data models for OLAP and OLTP systems.
- Data Analysis & Insights
- Ability to perform complex data analysis and extract actionable business insights.
- Strong analytical and problem-solving skills with a data-driven mindset.
- CI/CD & Automation
- Basic to intermediate experience with CI/CD pipelines using Jenkins or similar tools.
- Familiarity with automated testing and deployment workflows.
Good-to-Have (Preferred) Skills
- Knowledge of Java for data processing applications.
- Experience with NoSQL databases (e.g., DynamoDB, Cassandra, MongoDB).
- Familiarity with data governance frameworks and compliance tooling.
- Experience with monitoring and observability tools such as AWS CloudWatch, Splunk, or Dynatrace.
- Exposure to cost optimization strategies for large-scale cloud data platforms.
Skills: big data, scala spark, apache spark, ETL pipeline development
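On the OLAP/OLTP modeling point above: OLTP schemas are normalized for fast writes, while OLAP models typically denormalize into a star schema, a fact table joined to dimension tables, to make aggregation cheap. A toy pure-Python rollup along one dimension (table and column names are made up for illustration):

```python
# Toy star schema: a sales fact table keyed to a region dimension.
# Table and column names are illustrative only.
dim_region = {1: "APAC", 2: "EMEA"}

fact_sales = [
    {"region_id": 1, "amount": 100},
    {"region_id": 1, "amount": 50},
    {"region_id": 2, "amount": 75},
]

def rollup_by_region(facts, dim):
    """Aggregate the fact table along the region dimension (an OLAP rollup)."""
    totals = {}
    for row in facts:
        name = dim[row["region_id"]]
        totals[name] = totals.get(name, 0) + row["amount"]
    return totals

print(rollup_by_region(fact_sales, dim_region))  # {'APAC': 150, 'EMEA': 75}
```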
******
Notice period - 0 to 15 days only
Job stability is mandatory
Location: Hyderabad
Note: If a candidate is a short joiner, based in Hyderabad, and fits within the approved budget, we will proceed with an offer
F2F Interview: 14th Feb 2026
3 days in office, Hybrid model.
Role - RSA Archer Technical Specialist
Location preferred - Bangalore + key metro
Exp Band - 10+
JD
Experience in application development using the Archer platform
- Proficiency in Archer configuration, including custom fields, rules, and workflows
- Strong understanding of GRC concepts and the business context of Archer solutions
- Experience with web technologies including HTML, JavaScript, and CSS
- Familiarity with integration techniques and APIs
- Excellent problem-solving and analytical skills
- Able to work independently and collaboratively in a fast-paced environment
- Strong communication skills to interact with various stakeholders effectively
Associate Software Developer
Location: Hyderabad
Job Type: Full-time, In-Office
Department: Software Development
Blurgs AI is a deep-tech startup specializing in data-intelligence solutions for the maritime and defense sectors. The company's main product, Trident, enhances domain awareness for these industries by integrating data from various sensors like AIS, Radar, SAR, and EO/IR.
Blurgs AI has a collaborative, innovative, and inclusive work culture. They encourage new hires to contribute their ideas and offer opportunities for personal and professional growth. The company's goal is to solve real-world challenges using cutting-edge technology that has a global impact.
Role and Responsibilities
In this role, you will work in an on-premises environment to develop and maintain software solutions. You will use FastAPI, MongoDB, Angular + TypeScript, and Docker to build these solutions.
Your responsibilities will include:
- Developing back-end services using FastAPI and MongoDB.
- Building and enhancing front-end applications with Angular and TypeScript.
- Using Docker for containerization and deployment.
- Collaborating with the team to deliver software solutions.
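For the containerization responsibility, a FastAPI service of this kind is commonly packaged with a Dockerfile along these lines (base image, module path, and port are assumptions, not Blurgs AI's actual setup):

```dockerfile
# Illustrative Dockerfile for a FastAPI service; paths and port are
# placeholders, not the company's actual configuration.
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# uvicorn serves the FastAPI app object defined in main.py
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```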
Requirements
- A Bachelor's degree in Computer Science or a related field (or equivalent experience).
- Freshers with experience in FastAPI, MongoDB, Angular + TypeScript, and Docker are encouraged to apply.
- Strong problem-solving skills and a willingness to learn.
The position is offered as a 12-month contract.
• C++, Unix environment (Linux/AIX/HP-UX), Oracle/MySQL
• Excellent command of OOP concepts
• Minimum of 3 years (for Mid and Junior) of hands-on work experience in C++, Unix
• Oracle/MySQL
• Hands-on experience of using data structures, STL, Boost libraries, Design patterns
• Exposure to XML or Edifact is desired
• Exposure to XSLT mappings is a plus
• Excellent troubleshooting skills
• Exposure to CppUnit (or similar tools)
Experience range:
• 4 to 8 years of experience
Joining Location:
• Pune, Gandhinagar & Hyderabad (Preferably Pune & Gandhinagar)
· Problem-solving skills; should be able to turn an idea on paper into working code
· Bachelor’s degree in Computer Science or a related field, or equivalent professional experience.
· 0 - 4 years of database experience (Oracle SQL, PL/SQL)
· Proficiency in Oracle, with hands-on experience in database design
· Creation & implementation of data models.
· Strong experience with Oracle functions, procedures, triggers, and packages.
· Willingness to learn, grasp, and quickly adopt needed cutting-edge tools & technologies in a short timeframe.
· Should be able to write basic procedures & functions.
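The triggers and stored-procedure concepts listed above are Oracle PL/SQL features. As a portable illustration of the trigger idea only (Oracle's syntax differs, and the table names here are made up), Python's built-in sqlite3 can demonstrate it:

```python
import sqlite3

# Portable illustration of a database trigger; Oracle PL/SQL syntax
# differs, and table/column names are made up for the example.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE audit_log (order_id INTEGER, amount REAL);
    -- Trigger: mirror every insert into an audit table.
    CREATE TRIGGER log_order AFTER INSERT ON orders
    BEGIN
        INSERT INTO audit_log VALUES (NEW.id, NEW.amount);
    END;
""")
conn.execute("INSERT INTO orders (amount) VALUES (49.99)")
rows = conn.execute("SELECT order_id, amount FROM audit_log").fetchall()
print(rows)  # [(1, 49.99)]
```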
About Company:
The company is a global leader in secure payments and trusted transactions. They are at the forefront of the digital revolution that is shaping new ways of paying, living, doing business and building relationships that pass on trust along the entire payments value chain, enabling sustainable economic growth. Their innovative solutions, rooted in a rock-solid technological base, are environmentally friendly, widely accessible and support social transformation.
Requirements:
1. Java 8+, J2EE
2. Core Java - strong hands-on knowledge of the Collections framework
3. JPA - Spring, Hibernate
4. Web services (SOAP/REST)
5. Knowledge of SQL & PL/SQL (Oracle/MySQL)
6. Experience with Agile tools & processes
7. Good analytical & communication skills
Responsibilities & ownership
- Lead, build, deliver and ensure customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product.
- Work on distributed systems for data processing with efficient protocols and communication, locking and consensus, schedulers, resource management, low latency access to distributed storage, auto scaling, and self healing.
- Understand and reason about concurrency and parallelization to deliver scalability and performance in a multithreaded and distributed environment.
- Lead the team to solve complex and unknown problems
- Solve technical problems and customer issues with technical expertise
- Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure
- Mentor other team members for high quality and design
- Collaborate with Product Management to deliver on customer requirements and innovation
- Collaborate with Support and field teams to ensure that customers are successful with Dremio
Requirements
- B.S./M.S/Equivalent in Computer Science or a related technical field or equivalent experience
- Fluency in Java/C++ with 3+ years of experience developing production-level software
- Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models, and their use in developing distributed and scalable systems
- 5+ years experience in developing complex and scalable distributed systems and delivering, deploying, and managing microservices successfully
- Hands-on experience in query processing or optimization, distributed systems, concurrency control, data replication, code generation, networking, and storage systems
- Passion for quality, zero downtime upgrades, availability, resiliency, and uptime of the platform
- Passion for learning and delivering using latest technologies
- Ability to solve ambiguous, unexplored, and cross-team problems effectively
- Hands-on experience working on projects on AWS, Azure, and Google Cloud Platform
- Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and Google Cloud)
- Understanding of distributed file systems such as S3, ADLS, or HDFS
- Excellent communication skills and affinity for collaboration and teamwork
- Ability to work individually and collaboratively with other team members
- Ability to scope and plan solutions for big problems and mentor others on the same
- Interested and motivated to be part of a fast-moving startup with a fun and accomplished team
Total Experience: 3 to 15 Years
Notice period: Immediate to 30 Days.
Responsibilities:
Implementation of Manufacturing, Costing, Planning Central, Demand Management, and Sales and Operations Planning in Oracle ERP.
Drive requirement gathering, Fit-Gap, solution design, build, CRP, SIT, UAT, cutover/go-live, and post-production support for the above applications.
Perform configuration/application setups.
Capture solution design, configuration, documentation, and post-production support.
Responsible for the rollout of all Oracle modules for new state operations.
Train new users and prepare user manuals.
We are looking for a copywriter with 2 years of experience who can take care of different marketing content throughout its journey until it gets published. This means you would not just write content but also use Canva to design images, coordinate with the SEO team to optimize the content for search engines, and work with different team members to get the content uploaded.
We are looking for a writer who loves reading and for whom writing is second nature. If you read books, especially novels and newspapers, we would love to talk to you.
Our target audience comprises businesses. Therefore, it is essential that you can write in business language while keeping it engaging.
PLEASE NOTE:
1. This is an in-office role
2. You must complete a series of in-office tasks if you are shortlisted.
3. When we say 2 years of experience, we mean experience working for an organization as an employee. It won't include freelancing.
4. You would be working with an enthusiastic and dynamic team of digital marketing specialists who have built the digital presence of Focus Softnet from scratch.
5. We expect you to be self-driven. You would get strong support from the team to help you in your work. However, we expect you to be committed and responsible regarding your own work.
Responsibilities:
· Read and absorb everything related to our products and software
· Write about our products and software in an engaging way
· Ensure the content is in line with our branding and messaging style
· Create new content to assist marketing campaigns
· Work closely with marketing team members
· Optimize content using SEO best practices
· Write blog articles based on SEO guidelines and keywords which you would be provided with
· Write content for Case Studies and White Papers after carrying out thorough research on relevant topics
Qualifications:
· Any graduate or post-graduate with proficiency in the English language
· Must be a regular reader
· Proficiency in major digital and print platforms
Responsibilities:
· Analyze, design and support implementation of business specific Pega solutions and/or framework
· Responsible for implementing technical solutions on Pega 8.x
· Ability to create reusable components that can be leveraged across the enterprise for providing top-notch user experience
· Ability to translate complex business requirement into functional technical requirements using Pega Systems Smart BPM methodology and DCO tools
· Good hands-on experience implementing Pega integration services using REST, SOAP, Email, etc. Good understanding of Pega case management features
· Design and implement product features in collaboration with business and IT stakeholders
· Design reusable components, frameworks and libraries
· Ability to interact with business analysts and business team members to refine requirements and use cases
· Perform all phases of software engineering including requirements analysis, application design, code development and testing
· Work very closely with architecture groups and drive solutions
· Participate in an Agile / Scrum methodology to deliver high-quality software releases
· Design and develop innovative solutions to meet the needs of the business
· Review code and provide feedback relative to best practices and improving performance
· Troubleshoot production support issues post-deployment and come up with solutions as required
· Mentor and guide other software/Pega engineers within the team
· Experience with troubleshooting Production log files and performance issues
· Responsible for unit testing more complex individual and team deliverables based on established test plans
· Responsible for ensuring efficient integration of programming deliverables, timely builds, and overall code quality
· Contributes to the delivery of new applications and to the maintenance and enhancement of existing applications with shared responsibility for the technical issues
Experience and Qualifications:
· Bachelor’s degree in Computer Science, Information Systems, or Engineering is required; in lieu of a degree, demonstrated equivalent work experience is mandatory.
· Pega Systems Certified Senior System Architect (CSSA)
· 5+ years of experience on Pega PRPC (8.x/7.x versions) and Framework Case Management
· In-Depth knowledge of Pega Upgrades, Integration, Performance Management and all Pega relevant Tools
· Ability to negotiate and allocate resources appropriately for development and implementation
· Excellent written, communication and presentation skills


