Hotel Management Jobs in Hyderabad


Apply to 11+ Hotel Management Jobs in Hyderabad on CutShort.io. Explore the latest Hotel Management Job opportunities across top companies like Google, Amazon & Adobe.

NCR (Delhi | Gurgaon | Noida), Goa, Kolkata, Navi Mumbai, Hyderabad, Chennai, Bengaluru (Bangalore), Jaipur
1 - 5 yrs
₹1L - ₹3L / yr
Sales
Direct sales
Aviation
Hospitality
Hotel Management
1. Candidates from the aviation industry who are looking for ground jobs or want to build a career in the hotel industry.
2. Candidates from luxury retail stores who want to work with luxury hotels.
3. Candidates from luxury hotels, banquets, or travel sales backgrounds.
Bengaluru (Bangalore), Mumbai, Gurugram, Pune, Hyderabad, Chennai, Kolkata
3 - 8 yrs
₹5L - ₹20L / yr
Oracle Analytics Cloud (OAC)
Fusion Data Intelligence (FDI) Specialist
RPD
OAC Reports
Data Visualization

Job Title : Oracle Analytics Cloud (OAC) / Fusion Data Intelligence (FDI) Specialist

Experience : 3 to 8 years

Location : All USI locations – Hyderabad, Bengaluru, Mumbai, Gurugram (preferred) and Pune, Chennai, Kolkata

Work Mode : Hybrid Only (2-3 days from office or all 5 days from office)


Mandatory Skills : Oracle Analytics Cloud (OAC), Fusion Data Intelligence (FDI), RPD, OAC Reports, Data Visualizations, SQL, PL/SQL, Oracle Databases, ODI, Oracle Cloud Infrastructure (OCI), DevOps tools, Agile methodology.


Key Responsibilities :

  • Design, develop, and maintain solutions using Oracle Analytics Cloud (OAC).
  • Build and optimize complex RPD models, OAC reports, and data visualizations.
  • Utilize SQL and PL/SQL for data querying and performance optimization.
  • Develop and manage applications hosted on Oracle Cloud Infrastructure (OCI).
  • Support Oracle Cloud migrations, OBIEE upgrades, and integration projects.
  • Collaborate with teams using the ODI (Oracle Data Integrator) tool for ETL processes.
  • Implement cloud scripting using CURL for Oracle Cloud automation.
  • Contribute to the design and implementation of Business Continuity and Disaster Recovery strategies for cloud applications.
  • Participate in Agile development processes and DevOps practices including CI/CD and deployment orchestration.
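To illustrate the SQL querying side of these responsibilities, here is a minimal Python sketch that runs a reporting-style query against an Oracle database using the python-oracledb driver. The connection details, table, and column names are placeholders, not details taken from this role.

```python
# Illustrative sketch only: a reporting-style aggregate query of the kind that
# might back an OAC visualization. Credentials, DSN, and table names are placeholders.
import oracledb

conn = oracledb.connect(user="analytics", password="***", dsn="dbhost:1521/ORCLPDB1")
with conn.cursor() as cur:
    cur.execute("""
        SELECT region, SUM(amount) AS total_sales
        FROM sales_fact
        GROUP BY region
        ORDER BY total_sales DESC
    """)
    for region, total_sales in cur:
        print(region, total_sales)
conn.close()
```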

Required Skills :

  • Strong hands-on expertise in Oracle Analytics Cloud (OAC) and/or Fusion Data Intelligence (FDI).
  • Deep understanding of data modeling, reporting, and visualization techniques.
  • Proficiency in SQL, PL/SQL, and relational databases on Oracle.
  • Familiarity with DevOps tools, version control, and deployment automation.
  • Working knowledge of Oracle Cloud services, scripting, and monitoring.

Good to Have :

  • Prior experience in OBIEE to OAC migrations.
  • Exposure to data security models and cloud performance tuning.
  • Certification in Oracle Cloud-related technologies.
VyTCDC
Gobinath Sundaram
Posted by Gobinath Sundaram
Hyderabad, Bengaluru (Bangalore), Pune
6 - 11 yrs
₹8L - ₹26L / yr
Data Science
Python
Large Language Models (LLM)
Natural Language Processing (NLP)

POSITION / TITLE: Data Science Lead

Location: Offshore – Hyderabad/Bangalore/Pune

Who are we looking for?

Individuals with 8+ years of experience implementing and managing data science projects, and excellent working knowledge of traditional machine learning and LLM techniques.

The candidate must demonstrate the ability to navigate and advise on complex ML ecosystems from a model-building and evaluation perspective. Experience in the NLP and chatbot domains is preferred.

We acknowledge that the job market is blurring the line between data roles: while software skills are necessary, the emphasis of this position is on data science skills, not on data, ML, or software engineering.

Responsibilities:

· Lead data science and machine learning projects, contributing to model development, optimization and evaluation. 

· Perform data cleaning, feature engineering, and exploratory data analysis.  

· Translate business requirements into technical solutions, document and communicate project progress, manage non-technical stakeholders.

· Collaborate with other data scientists and engineers to deliver projects.

Technical Skills – Must have:

· Experience in and understanding of the natural language processing (NLP) and large language model (LLM) landscape.

· Proficiency with Python for data analysis and supervised & unsupervised machine-learning tasks.

· Ability to translate complex machine learning problem statements into specific deliverables and requirements.

· Should have worked with major cloud platforms such as AWS, Azure or GCP.

· Working knowledge of SQL and NoSQL databases.

· Ability to create data and ML pipelines for more efficient and repeatable data science projects using MLOps principles.

· Keep abreast of new tools, algorithms, and techniques in machine learning and work to implement them in the organization.

· Strong understanding of evaluation and monitoring metrics for machine learning projects.
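As a rough, illustrative sketch of the supervised-learning baseline work implied by these skills (not a requirement of this posting), the snippet below trains a TF-IDF plus logistic regression text classifier in scikit-learn and prints a held-out evaluation report; the dataset and all names are placeholders.

```python
# Illustrative baseline for a text-classification task; dataset and names are placeholders.
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

# Toy corpus: two newsgroup categories stand in for a real business dataset.
data = fetch_20newsgroups(subset="all", categories=["sci.space", "rec.autos"])
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, test_size=0.2, random_state=42
)

# TF-IDF features + logistic regression: a cheap baseline to beat before reaching for an LLM.
model = Pipeline([
    ("tfidf", TfidfVectorizer(max_features=20_000, ngram_range=(1, 2))),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)

# Evaluation and monitoring start with a held-out report like this one.
print(classification_report(y_test, model.predict(X_test), target_names=data.target_names))
```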

Technical Skills – Good to have:

· Track record of getting ML models into production

· Experience building chatbots.

· Experience with closed and open source LLMs.

· Experience with frameworks and technologies like scikit-learn, BERT, langchain, autogen…

· Certifications or courses in data science.

Education:

· Master’s/Bachelor’s/PhD degree in Computer Science, Engineering, Data Science, or a related field.

Process Skills:

· Understanding of Agile and Scrum methodologies.

· Ability to follow SDLC processes and contribute to technical documentation.  

Behavioral Skills :

· Self-motivated and capable of working independently with minimal management supervision.

· Well-developed design, analytical & problem-solving skills

· Excellent communication and interpersonal skills.  

· Excellent team player, able to work with virtual teams in several time zones.

Inncircles
Gangadhar M
Posted by Gangadhar M
Hyderabad
3 - 5 yrs
Best in industry
PySpark
Spark
Python
ETL
Amazon EMR


We are looking for a highly skilled Sr. Big Data Engineer with 3-5 years of experience in building large-scale data pipelines, real-time streaming solutions, and batch/stream processing systems. The ideal candidate should be proficient in Spark, Kafka, Python, and AWS Big Data services, with hands-on experience in implementing CDC (Change Data Capture) pipelines and integrating multiple data sources and sinks.


Responsibilities

  • Design, develop, and optimize batch and streaming data pipelines using Apache Spark and Python.
  • Build and maintain real-time data ingestion pipelines leveraging Kafka and AWS Kinesis.
  • Implement CDC (Change Data Capture) pipelines using Kafka Connect, Debezium or similar frameworks.
  • Integrate data from multiple sources and sinks (databases, APIs, message queues, file systems, cloud storage).
  • Work with AWS Big Data ecosystem: Glue, EMR, Kinesis, Athena, S3, Lambda, Step Functions.
  • Ensure pipeline scalability, reliability, and performance tuning of Spark jobs and EMR clusters.
  • Develop data transformation and ETL workflows in AWS Glue and manage schema evolution.
  • Collaborate with data scientists, analysts, and product teams to deliver reliable and high-quality data solutions.
  • Implement monitoring, logging, and alerting for critical data pipelines.
  • Follow best practices for data security, compliance, and cost optimization in cloud environments.
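As an illustrative sketch of the streaming/CDC work described above (the broker address, topic, schema, and bucket paths are assumptions, not details from this role), the following PySpark Structured Streaming job reads change events from a Kafka topic and lands them as Parquet on S3.

```python
# Illustrative streaming ingestion sketch; topic, schema, and S3 paths are placeholders.
# Requires the spark-sql-kafka connector package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("orders-cdc-stream").getOrCreate()

# Schema for a hypothetical CDC payload emitted by Debezium / Kafka Connect.
payload_schema = StructType([
    StructField("order_id", StringType()),
    StructField("status", StringType()),
    StructField("updated_at", TimestampType()),
])

# Read the change stream from Kafka and parse the JSON value column.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders.cdc")
    .load()
    .select(from_json(col("value").cast("string"), payload_schema).alias("e"))
    .select("e.*")
)

# Land the parsed events as Parquet on S3 for downstream Athena/Glue queries.
query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-data-lake/orders_cdc/")
    .option("checkpointLocation", "s3a://example-data-lake/_checkpoints/orders_cdc/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```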


Required Skills & Experience

  • Programming: Strong proficiency in Python (PySpark, data frameworks, automation).
  • Big Data Processing: Hands-on experience with Apache Spark (batch & streaming).
  • Messaging & Streaming: Proficient in Kafka (brokers, topics, partitions, consumer groups) and AWS Kinesis.
  • CDC Pipelines: Experience with Debezium / Kafka Connect / custom CDC frameworks.
  • AWS Services: AWS Glue, EMR, S3, Athena, Lambda, IAM, CloudWatch.
  • ETL/ELT Workflows: Strong knowledge of data ingestion, transformation, partitioning, schema management.
  • Databases: Experience with relational databases (MySQL, Postgres, Oracle) and NoSQL (MongoDB, DynamoDB, Cassandra).
  • Data Formats: JSON, Parquet, Avro, ORC, Delta/Iceberg/Hudi.
  • Version Control & CI/CD: Git, GitHub/GitLab, Jenkins, or CodePipeline.
  • Monitoring/Logging: CloudWatch, Prometheus, ELK/Opensearch.
  • Containers & Orchestration (nice-to-have): Docker, Kubernetes, Airflow/Step Functions for workflow orchestration.


Preferred Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
  • Experience in large-scale data lake / lakehouse architectures.
  • Knowledge of data warehousing concepts and query optimization.
  • Familiarity with data governance, lineage, and cataloging tools (Glue Data Catalog, Apache Atlas).
  • Exposure to ML/AI data pipelines is a plus.


Tools & Technologies (must-have exposure)

  • Big Data & Processing: Apache Spark, PySpark, AWS EMR, AWS Glue
  • Streaming & Messaging: Apache Kafka, Kafka Connect, Debezium, AWS Kinesis
  • Cloud & Storage: AWS (S3, Athena, Lambda, IAM, CloudWatch)
  • Programming & Scripting: Python, SQL, Bash
  • Orchestration: Airflow / Step Functions
  • Version Control & CI/CD: Git, Jenkins/CodePipeline
  • Data Formats: Parquet, Avro, ORC, JSON, Delta, Iceberg, Hudi
HighQ-labs
Lakshmi dantuluri
Posted by Lakshmi dantuluri
Bengaluru (Bangalore), Delhi, Gurugram, Noida, Pune, Hyderabad, Mumbai
5 - 10 yrs
₹15L - ₹30L / yr
LOANIQ
  • Act as a bridge between business users and technical teams, translating business requirements into LOANIQ solution designs.
  • Gather, analyze, and document detailed functional requirements for new features, system enhancements, and regulatory changes.
  • Work closely with development teams to validate LOANIQ configurations, test workflows, and review technical designs.
  • Lead or support UAT, coordinate with QA teams, and ensure sign-off from business stakeholders.
  • Conduct gap analysis and suggest best practices using LOANIQ's capabilities to improve loan operations.
  • Create detailed BRDs, FRDs, process flow diagrams, and user stories aligned with Agile/Scrum methodology.
  • Support LOANIQ implementation, upgrades, and version migrations.
  • Conduct training, prepare user guides, and provide post-go-live support.


Information Technology and Services

Agency job
via Jobdost by Sathish Kumar
Bengaluru (Bangalore), Gurugram, Hyderabad
3 - 8 yrs
₹8L - ₹10L / yr
Java
J2EE
Spring Boot
Hibernate (Java)

Responsibilities : 


• Design, code, and implement highly scalable and reliable web-based applications.
• Coordinate with other teams' architects, engineers, and vendors as necessary.
• Deliver on all phases of development work, from initial kick-off and technical setup through application development and support.
• Identify exciting opportunities for adopting new technologies to solve existing needs and predict future challenges.
• Perform ongoing refactoring of code, utilizing visualization and other techniques to fast-track concepts and deliver continuous improvement.
• Work with product managers to prioritize features for ongoing sprints and manage a list of technical requirements based on industry trends, new technologies, known defects, and issues.
• Manage your own time, and work well both independently and as part of a team.
• Quickly generate and update proofs of concept for testing and team feedback.
• Embrace emerging standards while promoting best practices.


Qualifications : 


• Must have Java experience.
• Experience in computer science or computer engineering.
• Web development work experience preferred.
• Demonstrated experience in Agile development, application design, software development, and testing.
• Experience with building RESTful APIs.
• Expertise in object-oriented analysis and design across a variety of platforms.
• Thorough understanding of JSON, web service technologies, and data structure fundamentals.
• Experience with adaptive and responsive development techniques.
• Aptitude for learning and applying programming concepts.
• Ability to effectively communicate with internal and external business partners.
• Experience with a broad range of software languages and payments technologies is a plus.

consulting & implementation services in the area of Oil & Gas, Mining and Manufacturing Industry

Agency job
via Jobdost by Sathish Kumar
Ahmedabad, Hyderabad, Pune, Delhi
5 - 7 yrs
₹18L - ₹25L / yr
AWS Lambda
AWS Simple Notification Service (SNS)
AWS Simple Queuing Service (SQS)
Python
PySpark
Data Engineer

Required skill set: AWS Glue, AWS Lambda, AWS SNS/SQS, AWS Athena, Spark, Snowflake, Python

Mandatory Requirements  

  • Experience in AWS Glue
  • Experience in Apache Parquet 
  • Proficient in AWS S3 and data lake 
  • Knowledge of Snowflake
  • Understanding of file-based ingestion best practices.
  • Scripting languages: Python & PySpark

CORE RESPONSIBILITIES 

  • Create and manage cloud resources in AWS 
  • Ingest data from different sources that expose data through different technologies, such as RDBMS, REST/HTTP APIs, flat files, streams, and time-series data from various proprietary systems. Implement data ingestion and processing with the help of Big Data technologies.
  • Process and transform data using various technologies such as Spark and cloud services. You will need to understand your part of the business logic and implement it using the language supported by the base data platform.
  • Develop automated data quality checks to make sure the right data enters the platform, and verify the results of the calculations (a sketch follows this list).
  • Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
  • Define process improvement opportunities to optimize data collection, insights and displays.
  • Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible 
  • Identify and interpret trends and patterns from complex data sets 
  • Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders. 
  • Key participant in regular Scrum ceremonies with the agile teams  
  • Proficient at developing queries, writing reports and presenting findings 
  • Mentor junior members and bring in industry best practices.
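Purely as an illustration of the ingestion and data-quality responsibilities above, here is a small PySpark sketch that reads a flat-file drop from S3, applies a basic not-null check, and writes partitioned Parquet; the bucket paths and column names are hypothetical, not taken from this posting.

```python
# Illustrative batch ingestion with a simple data-quality gate; paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("flat-file-ingestion").getOrCreate()

# Ingest a flat-file drop from S3.
raw = spark.read.option("header", True).csv("s3a://example-raw-bucket/loans/2024-06-01/")

# Automated data-quality check: fail fast if key columns contain nulls.
bad_rows = raw.filter(col("loan_id").isNull() | col("balance").isNull()).count()
if bad_rows > 0:
    raise ValueError(f"{bad_rows} rows failed the not-null check; aborting load")

# Normalize types and write query-friendly Parquet, partitioned by business date.
curated = raw.withColumn("as_of_date", to_date(col("as_of_date"), "yyyy-MM-dd"))
curated.write.mode("overwrite").partitionBy("as_of_date").parquet(
    "s3a://example-data-lake/curated/loans/"
)
```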

QUALIFICATIONS 

  • 5-7+ years' experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
  • Strong background in math, statistics, computer science, data science or related discipline
  • Advanced knowledge of at least one of the following languages: Java, Scala, Python, C#
  • Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake  
  • Proficient with:
    • Data mining/programming tools (e.g. SAS, SQL, R, Python)
    • Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
    • Data visualization tools (e.g. Tableau, Looker, MicroStrategy)
  • Comfortable learning about and deploying new technologies and tools. 
  • Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines. 
  • Good written and oral communication skills and ability to present results to non-technical audiences 
  • Knowledge of business intelligence and analytical tools, technologies and techniques.

  

Familiarity and experience in the following is a plus:  

  • AWS certification
  • Spark Streaming 
  • Kafka Streaming / Kafka Connect 
  • ELK Stack 
  • Cassandra / MongoDB 
  • CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
Wissen Technology

Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore), Hyderabad
7 - 15 yrs
Best in industry
ETL
Snowflake schema

Job Description for QA Engineer:

  • 6-10 years of experience in ETL Testing, Snowflake, DWH Concepts.
  • Strong SQL knowledge & debugging skills are a must.
  • Experience with Azure and Snowflake testing is a plus.
  • Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus.
  • Strong data warehousing concepts and experience with ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle.
  • Experience with JIRA and the Xray defect management tool is good to have.
  • Exposure to the financial domain is considered a plus.
  • Testing data readiness (data quality) and addressing code or data issues (see the sketch after this list).
  • Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
  • Demonstrate strong collaborative experience across regions (APAC, EMEA and NA) to effectively and efficiently identify root cause of code/data issues and come up with a permanent solution
  • Prior experience with State Street and Charles River Development (CRD) considered a plus
  • Experience in tools such as PowerPoint, Excel, SQL
  • Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus
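As a hedged sketch of the data-readiness testing mentioned in this list, the snippet below reconciles row counts between a staging table and its target in Snowflake using the snowflake-connector-python package; the account, credentials, and table names are placeholders, not details of this role.

```python
# Illustrative reconciliation check; connection details and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="qa_user",
    password="***",
    warehouse="QA_WH",
    database="EDW",
)

def row_count(table: str) -> int:
    # Run a simple COUNT(*) and return the scalar result.
    with conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]

source_count = row_count("STAGING.TRADES_RAW")
target_count = row_count("MART.TRADES")

# A basic data-readiness assertion: the load should not drop or duplicate rows.
assert source_count == target_count, (
    f"Row count mismatch: staging={source_count}, target={target_count}"
)
print(f"Reconciliation passed: {target_count} rows")
conn.close()
```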

Key Attributes include:

  • Team player with professional and positive approach
  • Creative, innovative and able to think outside of the box
  • Strong attention to detail during root cause analysis and defect issue resolution
  • Self-motivated & self-sufficient
  • Effective communicator both written and verbal
  • Brings a high level of energy with enthusiasm to generate excitement and motivate the team
  • Able to work under pressure with tight deadlines and/or multiple projects
  • Experience in negotiation and conflict resolution


My client is a Top 5 Services Company

Agency job
Bengaluru (Bangalore), Hyderabad, Gurugram
3 - 10 yrs
₹3L - ₹30L / yr
Java
Spring Boot
J2EE
Hibernate (Java)
Object Oriented Programming (OOPs)

Job Role: Java Spring Boot Developer

 

Job Description:

  • Should have expertise in Core Java, Java, and J2EE.
  • Experience building MVC-based web applications using the JSP/Struts framework.
  • Hands-on experience with OOP concepts, Hibernate, Spring 3.x, and Spring Boot.
  • Hands-on experience with application servers such as Tomcat and WebLogic.
  • Hands-on experience with RESTful services and web services.
  • Experience deploying Spring Boot in microservices.
  • Good verbal and written communication.
  • Excellent team player, with the ability to work in a global team and follow through on deadlines.

 

Primary Skills:

  • Core Java, Java & J2EE
  • Spring Boot
Premium IT and Analytics Company

Agency job
via Syllogistics AI by Srilatha K
Hyderabad
3 - 9 yrs
₹5L - ₹15L / yr
React.js
JavaScript
Redux/Flux

Work Location: Hyderabad

 

Job Description:

  • Minimum 3+ years of professional experience in Angular 2+, React JS, and scripting languages.
  • Excellence in modern JavaScript, HTML5, and design patterns.
  • Thorough understanding of the responsibilities of the platform, database, API, caching layer, proxies, React Testing Library, server-side rendering, and TypeScript.
  • Validating user input on the client side and implementing meaningful feedback.
  • Skill in designing a modern build process that integrates testing and continuous delivery.
  • Hands-on experience with creating configuration, build, and test scripts for continuous integration environments.
I Base IT

Sravanthi Alamuri
Posted by Sravanthi Alamuri
Hyderabad
9 - 15 yrs
₹11L - ₹22L / yr
AngularJS (1.x)
Java
NodeJS (Node.js)
Angular (2+)
Skills to have: Angular, HTML, JavaScript, jQuery, Node.js