11+ OLAP Jobs in Hyderabad | OLAP Job openings in Hyderabad
Apply to 11+ OLAP Jobs in Hyderabad on CutShort.io. Explore the latest OLAP Job opportunities across top companies like Google, Amazon & Adobe.

Global digital transformation solutions provider.
JOB DETAILS:
* Job Title: Lead II - Software Engineering - AWS, Apache Spark (PySpark/Scala), Apache Kafka
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 5-8 years
* Location: Hyderabad
Job Summary
We are seeking a skilled Data Engineer to design, build, and optimize scalable data pipelines and cloud-based data platforms. The role involves working with large-scale batch and real-time data processing systems, collaborating with cross-functional teams, and ensuring data reliability, security, and performance across the data lifecycle.
Key Responsibilities
ETL Pipeline Development & Optimization
- Design, develop, and maintain complex end-to-end ETL pipelines for large-scale data ingestion and processing.
- Optimize data pipelines for performance, scalability, fault tolerance, and reliability.
Big Data Processing
- Develop and optimize batch and real-time data processing solutions using Apache Spark (PySpark/Scala) and Apache Kafka.
- Ensure fault-tolerant, scalable, and high-performance data processing systems.
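The fault-tolerance requirement above usually comes down to committing progress only after a batch is fully processed. The toy consumer below sketches that idea in plain Python; in a real system this logic lives inside Spark Structured Streaming or a Kafka consumer, and the names here (MicroBatchConsumer, poll, committed_offset) are invented for illustration.

```python
from collections import defaultdict

class MicroBatchConsumer:
    """Consumes an event log in fixed-size micro-batches, committing the
    offset only after a batch is fully aggregated, so a crash replays the
    uncommitted batch instead of losing it (at-least-once semantics)."""

    def __init__(self, batch_size=3):
        self.batch_size = batch_size
        self.committed_offset = 0       # stand-in for a Kafka consumer offset
        self.counts = defaultdict(int)  # stand-in for a durable state store

    def poll(self, log):
        """Process every complete micro-batch currently available."""
        while self.committed_offset + self.batch_size <= len(log):
            start = self.committed_offset
            batch = log[start:start + self.batch_size]
            for key in batch:           # aggregate the batch
                self.counts[key] += 1
            self.committed_offset = start + self.batch_size  # commit last

events = ["click", "view", "click", "view", "view", "click", "click"]
c = MicroBatchConsumer(batch_size=3)
c.poll(events)              # 7 events -> two full batches processed
print(dict(c.counts))       # {'click': 3, 'view': 3}
print(c.committed_offset)   # 6 (the trailing event waits for a full batch)
```

Committing after processing trades duplicate work on restart for zero data loss, which is the usual choice for analytics pipelines.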
Cloud Infrastructure Development
- Build and manage scalable, cloud-native data infrastructure on AWS.
- Design resilient and cost-efficient data pipelines adaptable to varying data volume and formats.
Real-Time & Batch Data Integration
- Enable seamless ingestion and processing of real-time streaming and batch data sources (e.g., AWS MSK).
- Ensure consistency, data quality, and a unified view across multiple data sources and formats.
Data Analysis & Insights
- Partner with business teams and data scientists to understand data requirements.
- Perform in-depth data analysis to identify trends, patterns, and anomalies.
- Deliver high-quality datasets and present actionable insights to stakeholders.
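The anomaly-screening duty above is tool-agnostic; as a minimal stdlib-only sketch, a z-score check flags outliers in a numeric series before a dataset is handed to stakeholders. The threshold of 2.5 is a common convention chosen for this example, not a requirement from the posting.

```python
import statistics

def zscore_anomalies(values, threshold=2.5):
    """Return (index, value) pairs whose population z-score exceeds
    the threshold -- a simple stand-in for pre-delivery data screening."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # constant series: nothing can be anomalous
    return [(i, v) for i, v in enumerate(values)
            if abs(v - mean) / stdev > threshold]

daily_orders = [100, 98, 102, 101, 99, 100, 480, 97, 103]
print(zscore_anomalies(daily_orders))   # [(6, 480)]
```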
CI/CD & Automation
- Implement and maintain CI/CD pipelines using Jenkins or similar tools.
- Automate testing, deployment, and monitoring to ensure smooth production releases.
Data Security & Compliance
- Collaborate with security teams to ensure compliance with organizational and regulatory standards (e.g., GDPR, HIPAA).
- Implement data governance practices ensuring data integrity, security, and traceability.
Troubleshooting & Performance Tuning
- Identify and resolve performance bottlenecks in data pipelines.
- Apply best practices for monitoring, tuning, and optimizing data ingestion and storage.
Collaboration & Cross-Functional Work
- Work closely with engineers, data scientists, product managers, and business stakeholders.
- Participate in agile ceremonies, sprint planning, and architectural discussions.
Skills & Qualifications
Mandatory (Must-Have) Skills
- AWS Expertise
- Hands-on experience with AWS Big Data services such as EMR, Amazon Managed Workflows for Apache Airflow (MWAA), Glue, S3, DMS, MSK, and EC2.
- Strong understanding of cloud-native data architectures.
- Big Data Technologies
- Proficiency in PySpark or Scala Spark and SQL for large-scale data transformation and analysis.
- Experience with Apache Spark and Apache Kafka in production environments.
- Data Frameworks
- Strong knowledge of Spark DataFrames and Datasets.
- ETL Pipeline Development
- Proven experience in building scalable and reliable ETL pipelines for both batch and real-time data processing.
- Database Modeling & Data Warehousing
- Expertise in designing scalable data models for OLAP and OLTP systems.
- Data Analysis & Insights
- Ability to perform complex data analysis and extract actionable business insights.
- Strong analytical and problem-solving skills with a data-driven mindset.
- CI/CD & Automation
- Basic to intermediate experience with CI/CD pipelines using Jenkins or similar tools.
- Familiarity with automated testing and deployment workflows.
Good-to-Have (Preferred) Skills
- Knowledge of Java for data processing applications.
- Experience with NoSQL databases (e.g., DynamoDB, Cassandra, MongoDB).
- Familiarity with data governance frameworks and compliance tooling.
- Experience with monitoring and observability tools such as AWS CloudWatch, Splunk, or Dynatrace.
- Exposure to cost optimization strategies for large-scale cloud data platforms.
Skills: big data, scala spark, apache spark, ETL pipeline development
Notice period: 0 to 15 days only
Job stability is mandatory.
Location: Hyderabad
Note: If a candidate is a short joiner, based in Hyderabad, and fits within the approved budget, we will proceed with an offer.
F2F interview: 14th Feb 2026
Hybrid model: 3 days in office.

Global Digital Transformation Solutions Provider
MUST-HAVES:
- LLM Integration & Prompt Engineering
- Context & Knowledge Base Design
- Experience running LLM evals
NOTICE PERIOD: Immediate – 30 Days
SKILLS: LLM, AI, PROMPT ENGINEERING
NICE TO HAVES:
- Data Literacy & Modelling Awareness
- Familiarity with Databricks, AWS, and ChatGPT environments
ROLE PROFICIENCY:
Role Scope / Deliverables:
- Serve as the link between business intelligence, data engineering, and AI application teams, ensuring the Large Language Model (LLM) interacts effectively with the modeled dataset.
- Define and curate the context and knowledge base that enables GPT to provide accurate, relevant, and compliant business insights.
- Collaborate with Data Analysts and System SMEs to identify, structure, and tag data elements that feed the LLM environment.
- Design, test, and refine prompt strategies and context frameworks that align GPT outputs with business objectives.
- Conduct evaluation and performance testing (evals) to validate LLM responses for accuracy, completeness, and relevance.
- Partner with IT and governance stakeholders to ensure secure, ethical, and controlled AI behavior within enterprise boundaries.
KEY DELIVERABLES:
- LLM Interaction Design Framework: Documentation of how GPT connects to the modeled dataset, including context injection, prompt templates, and retrieval logic.
- Knowledge Base Configuration: Curated and structured domain knowledge to enable precise and useful GPT responses (e.g., commercial definitions, data context, business rules).
- Evaluation Scripts & Test Results: Defined eval sets, scoring criteria, and output analysis to measure GPT accuracy and quality over time.
- Prompt Library & Usage Guidelines: Standardized prompts and design patterns to ensure consistent business interactions and outcomes.
- AI Performance Dashboard / Reporting: Visualizations or reports summarizing GPT response quality, usage trends, and continuous improvement metrics.
- Governance & Compliance Documentation: Inputs to data security, bias prevention, and responsible AI practices in collaboration with IT and compliance teams.
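The "Evaluation Scripts & Test Results" deliverable can be sketched as a tiny eval harness. The model here is stubbed with a canned lookup standing in for a GPT call, and the names (EVAL_SET, run_evals) plus the keyword-overlap scoring rule are invented for this illustration; real eval sets would use richer criteria.

```python
EVAL_SET = [
    {"prompt": "Define ARR.",
     "expected_keywords": {"annual", "recurring", "revenue"}},
    {"prompt": "Which region led Q3 sales?",
     "expected_keywords": {"emea"}},
]

def fake_model(prompt):
    """Stand-in for a GPT API call, so the harness runs offline."""
    canned = {
        "Define ARR.": "ARR is annual recurring revenue.",
        "Which region led Q3 sales?": "EMEA led Q3 sales.",
    }
    return canned.get(prompt, "")

def score_response(response, expected_keywords):
    """Fraction of expected keywords present in the response."""
    words = set(response.lower().replace(".", "").split())
    return len(expected_keywords & words) / len(expected_keywords)

def run_evals(model, eval_set, passing=0.8):
    """Score every case and summarize mean score and pass rate."""
    scores = [score_response(model(c["prompt"]), c["expected_keywords"])
              for c in eval_set]
    return {"mean_score": sum(scores) / len(scores),
            "pass_rate": sum(s >= passing for s in scores) / len(scores)}

print(run_evals(fake_model, EVAL_SET))   # {'mean_score': 1.0, 'pass_rate': 1.0}
```

Running the same eval set after each prompt or knowledge-base change is what makes GPT quality measurable over time.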
KEY SKILLS:
Technical & Analytical Skills:
- LLM Integration & Prompt Engineering – Understanding of how GPT models interact with structured and unstructured data to generate business-relevant insights.
- Context & Knowledge Base Design – Skilled in curating, structuring, and managing contextual data to optimize GPT accuracy and reliability.
- Evaluation & Testing Methods – Experience running LLM evals, defining scoring criteria, and assessing model quality across use cases.
- Data Literacy & Modeling Awareness – Familiar with relational and analytical data models to ensure alignment between data structures and AI responses.
- Familiarity with Databricks, AWS, and ChatGPT Environments – Capable of working in cloud-based analytics and AI environments for development, testing, and deployment.
- Scripting & Query Skills (e.g., SQL, Python) – Ability to extract, transform, and validate data for model training and evaluation workflows.
Business & Collaboration Skills:
- Cross-Functional Collaboration – Works effectively with business, data, and IT teams to align GPT capabilities with business objectives.
- Analytical Thinking & Problem Solving – Evaluates LLM outputs critically, identifies improvement opportunities, and translates findings into actionable refinements.
- Commercial Context Awareness – Understands how sales and marketing intelligence data should be represented and leveraged by GPT.
- Governance & Responsible AI Mindset – Applies enterprise AI standards for data security, privacy, and ethical use.
- Communication & Documentation – Clearly articulates AI logic, context structures, and testing results for both technical and non-technical audiences.
Hiring: Customer Relationship Executive – Hyderabad
We are looking for a Customer Relationship Executive to manage customer interactions for our School Management Software at Assentcode Technologies.
Key Focus
- Strong communication skills (English + Telugu or English + Hindi)
- Ability to handle customer queries with clarity and confidence
- Good listening and problem-solving approach
Role Overview
- Handle customer enquiries via calls, messages, and emails
- Provide clear and accurate information about the product
- Follow up with customers to ensure issue resolution
- Maintain records of customer interactions and updates
- Coordinate with internal teams for quick support
Additional Expectations
- Maintain a professional and polite tone with customers
- Ability to manage multiple conversations and priorities
- Adapt to a fast-paced startup environment
Eligibility
- Fresher to 1 year experience
- Based in Hyderabad
- Immediate joiners preferred
Ideal Candidate
- Customer-focused with a positive attitude
- Good communication and interpersonal skills
- Responsible and consistent in follow-ups
Role: Mobile Automation Engineer (SDET) — On-site, India
Role & Responsibilities
- Design, build and maintain scalable mobile test automation frameworks for Android and iOS using Appium, Espresso, XCUITest or equivalent tools to support continuous delivery.
- Create and own automated test suites (functional, regression, UI, and smoke) that run reliably in CI/CD pipelines (Jenkins/GitHub Actions) and on cloud device farms (BrowserStack/Sauce Labs).
- Collaborate with Developers and Product Owners to translate requirements into test strategies, write robust test cases, and automate end-to-end and integration scenarios (including API tests).
- Investigate, triage, and debug failures — use device logs, ADB, Xcode traces, and performance tools to isolate flakiness and reliability issues and drive fixes.
- Integrate automated tests into build pipelines, enforce quality gates, and provide actionable reporting and metrics for release readiness.
- Advocate and implement test automation best practices: code quality, modular frameworks, reusability, CI parallelization, and maintainable test data strategies.
Skills & Qualifications
- Must-Have
- 3+ years in mobile QA/automation with hands-on experience in Appium or native frameworks (Espresso/XCUITest) across Android and iOS.
- Strong programming skills in Java/Kotlin or Swift and working knowledge of Python or JavaScript for scripting and test tooling.
- Experience integrating automated suites into CI/CD (Jenkins/GitHub Actions) and executing on real & virtual device clouds (BrowserStack/Sauce Labs).
- Practical experience with API testing (REST), test frameworks (TestNG/JUnit/Mocha), and source control (Git).
- Solid debugging skills using ADB, Xcode, Android SDK, and familiarity with mobile performance profiling.
- Preferred
- Experience building custom automation frameworks, parallel test execution, and reliability/flakiness reduction strategies.
- Knowledge of CI orchestration, containerized test runners, and mobile security or accessibility testing.
- ISTQB or equivalent QA certification, prior experience in Agile/Scrum teams, and exposure to device lab management.
Experience: 8+ Years
Work Location: Hyderabad
Mode of work: Work from Office
Senior Data Engineer / Architect
Summary of the Role
The Senior Data Engineer / Architect will be a key role within the data and technology team, responsible for engineering and building data solutions that enable seamless use of data within the organization.
Core Activities
- Work closely with the business teams and business analysts to understand and document data usage requirements
- Develop designs relating to data engineering solutions including data pipelines, ETL, data warehouse, data mart and data lake solutions
- Develop data designs for reporting and other data use requirements
- Develop data governance solutions that provide data governance services including data security, data quality, data lineage etc.
- Lead implementation of data use and data quality solutions
- Provide operational support for users for the implemented data solutions
- Support development of solutions that automate reporting and business intelligence requirements
- Support development of machine learning and AI solution using large scale internal and external datasets
Other activities
- Work on and manage technology projects as and when required
- Provide user and technical training on data solutions
Skills and Experience
- At least 5-8 years of experience in a senior data engineer / architect role
- Strong experience with AWS based data solutions including AWS Redshift, analytics and data governance solutions
- Strong experience with industry standard data governance / data quality solutions
- Strong experience managing a PostgreSQL data environment
- Background as a software developer working in AWS / Python will be beneficial
- Experience with BI tools like Power BI and Tableau
- Strong written and oral communication skills
Greetings! We are looking for a Product Manager for our data modernization product. The candidate should have strong knowledge of Big Data/DWH, along with strong stakeholder management and presentation skills.
• Technical end-to-end design for stories by collaborating with business analyst/product owner, technical architect and clients
• Implement stories end-to-end, including the frontend, Backends for Frontend (BFF), caching, and in some cases service orchestration, using best engineering practices such as test-driven development, SOLID principles, and consideration of non-functional requirements like performance, scalability, security, or cloud readiness (as applicable) at the story level
• Automate testing at the unit, module, and integration levels as needed, using tools relevant to the platform (e.g., Jasmine, Jest, Karma, WebdriverIO)
• Focus on quality by implementing best practices like logging, calling out technical debt, and meeting KPIs using code quality tools such as SonarQube, ESLint (customized and synced with Sonar), and Stylelint/Sasslint
• Automate redundant work, such as common component structure, both through scaffolding and by applying the right design patterns and abstractions
• Mentor junior engineers by reviewing code and guiding technical decision-making
Key Requirements:
• 4-9 years’ experience
• Hands-on knowledge of HTML5, CSS3, JavaScript
• Hands-on experience working with Object Oriented JavaScript (OOJS), JavaScript - and practical uses in building complex interactive experiences, primarily with ECMAScript 2015+. You should be up-to-date with new specifications (different stages), and also, using transpilers like Babel to use features from all stages
• Good understanding of creational and structural design patterns in JavaScript
• Experience working with Single Page Applications (SPA) with universal rendering capabilities.
• Good understanding of React.js, its core principles - lifecycle methods, virtual DOM etc. and at least 6 – 12 months hands-on experience with the same
• Understanding of one-way data flow and the Flux architecture.
• Understanding of stateful (container) vs stateless (presentational) components and how to break down the application page into components
• Core understanding of how React’s state management works out of the box.
• Understanding of different CSS architectures that go hand-in-hand with React.js application development. Pros and cons of component-level CSS vs global and where to apply each
• Thorough understanding of Webpack bundler (version 2+).
• Good understanding of the Web Components standard.
• Hands-on experience with building Progressive Web Apps (PWAs) on any framework.
• Hands-on experience with Frontend Performance Optimization – especially in a React application with respect to resource loading strategy, CPU/Memory profiling on the browser.
• Hands-on experience with test-driven development using Jest or equivalent. Familiar with snapshot testing, code coverage. Experience working with utilities like Enzyme
• Experience using fluid grids, and building responsive/adaptive websites using Bootstrap etc.
• Understanding of SEO and accessibility and making it part of the development workflow
• Moderate to Strong graphics manipulation/optimization skills using Photoshop is a plus.
About the role: The Paytm Transit business deals in products related to commuting, viz. NCMC/transit cards, QR ticketing, smart card recharges, etc.
- Achieving targets for key performance metrics viz user acquisition, retention and business revenue goals.
- Increase productivity of team by continuous training, review and supervision.
- Ensuring the team follows best in class sales tools and processes laid out by the business.
- Recruiting, retention and maintaining pipeline of field sales force.
- Build penetration in market by creating a strong network of reseller / influencers
- Provide ground feedback that would open up avenues for increasing business.
Expectations/ Requirements
- 3+ years of Direct sales / business development experience.
- Managed a team of Front line sales persons for at least 1 year.
- Working knowledge of MS Office.
- Worked on any Sales force app.
- Educational Qualification – Graduate & above.
- Good communication skills and proficiency in regional language of the geography where this position is based.
- Have exposure in cold calling.
Superpowers/ Skills that will help you succeed in this role
● High level of drive, initiative and self-motivation
● Ability to take internal and external stakeholders along
● Understanding of Distribution set up and ability to build the business
● Strong people management exposure
● Love for simplifying
● Growth Mindset
● Willingness to experiment and improve continuously
Why join us
● A collaborative, output-driven program that brings cohesiveness across businesses through technology
● Improve the average revenue per user by increasing cross-sell opportunities
● Solid 360° feedback from your peer teams on your support of their goals
● Respect that is earned, not demanded, from your peers and manager
Compensation: If you are the right fit, we believe in creating wealth for you
Edvak is committed to delivering customized enterprise solutions that allow businesses to benefit from the use of advanced technologies to meet their organization’s objectives. We firmly believe that Artificial Intelligence can provide great productivity gains to any business. We support our clients by solving their business problems and increasing their operational efficiency by integrating and leveraging AI technological solutions.
Website: https://edvak.com/
Hiring for Angular Developer:
- Delivering a complete front-end application
- Hands-on experience in Angular 4+ version
- Ensuring high performance & adaptability on mobile and desktop
- Writing tested, idiomatic, and documented JavaScript, HTML, and CSS
- Coordinating the workflow between the graphic designer, the HTML coder, and yourself
- Cooperating with the back-end developer in the process of building the RESTful API
- Communicating with external web services
- Deep knowledge of Angular practices and commonly used modules based on extensive work experience
- Creating self-contained, reusable, custom, and testable modules, directives, pipes, and components
- Ensuring a clear dependency chain, regarding the app logic as well as the file system
- Experience with Angular Material themes and SCSS best practices
- Extensive TypeScript knowledge for building Angular applications
- Thorough understanding of the responsibilities of the platform, database, API, caching layer, proxies, and other web services used in the system
- Validating user actions on the client side and providing responsive feedback
- Writing non-blocking code, and resorting to advanced techniques such as multi-threading, when needed
- Experience with all levels of operation available to the front end, from creating XHRs in vanilla JS to implementing custom HTTP interceptors
- Architecting and automating the build process for production, using task runners or scripts
- Application knowledge of unit testing. TDD knowledge is an added advantage
- Good understanding of Git and related workflows for daily development activities
- Hands-on debugging and troubleshooting skills
- Working knowledge of Agile methodology and scrum
Roles and Responsibilities
To build cross-platform mobile apps for Android, iOS, and the web, including building responsive UIs, querying data efficiently, and managing state in an optimized manner.
Desired Candidate Profile
- Deep experience contributing to and managing high-scale production mobile apps. You must have previous experience in Flutter development.
Responsibilities and Duties :
- 2–5 years working as a full-time professional mobile developer, with at least 2 years in Flutter (mandatory)
- Experience building complex mobile applications; Flutter experience is an advantage
- Apps published on the Play Store/App Store, or other demos, are good to have
- Willing to work with cross-platform frameworks
- Willing to learn and work on different mobile platforms frameworks when needed.
- Familiarity with REST APIs and WebSockets in mobile integration
- Working with a version control system (i.e., Git)
- Good communication skills.
- Strong problem-solving skills.
- Team player.
- Very comfortable learning new technologies, tools, and platforms.
- Highly motivated.
- Initiative and passionate.
- 6+ years of experience working with MongoDB or other NoSQL databases.
- Maintain and configure MongoDB (developer)
- Keep clear documentation of the database setup and architecture.
- Backup and Disaster Recovery management.
- Adept with all the best practices and design patterns in MongoDB for designing document schemas.
- Good grasp of MongoDB’s aggregation framework.
- Ensure that the databases achieve maximum performance and availability.
- Design indexing strategies.
- Configure, monitor, and deploy replica sets.
- Should have experience with MongoDB Atlas.
- Should have minimum experience with development and performance tuning.
- Create roles and users and set their permissions.
- Excellent written and verbal communication skills, and critical thinking skills
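The aggregation framework mentioned above chains stages such as $match and $group. The sketch below emulates those two stages in plain Python purely to show the data flow; against a real deployment you would express the same logic as a pipeline list passed to PyMongo's collection.aggregate(). The sample documents and helper names (match, group_sum) are invented for illustration.

```python
orders = [
    {"status": "shipped", "region": "south", "total": 120},
    {"status": "shipped", "region": "north", "total": 80},
    {"status": "pending", "region": "south", "total": 50},
    {"status": "shipped", "region": "south", "total": 30},
]

def match(docs, query):
    """Emulates a $match stage with simple equality filters."""
    return [d for d in docs if all(d.get(k) == v for k, v in query.items())]

def group_sum(docs, key, field):
    """Emulates {"$group": {"_id": "$key", "total": {"$sum": "$field"}}}."""
    acc = {}
    for d in docs:
        acc[d[key]] = acc.get(d[key], 0) + d[field]
    return [{"_id": k, "total": v} for k, v in sorted(acc.items())]

shipped_by_region = group_sum(match(orders, {"status": "shipped"}),
                              "region", "total")
print(shipped_by_region)
# [{'_id': 'north', 'total': 80}, {'_id': 'south', 'total': 150}]
```

Pushing the $match stage before $group, as here, is also the standard server-side optimization: it lets an index prune documents before the grouping work begins.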



