11+ ODI Jobs in Hyderabad
Job Title : Oracle Analytics Cloud (OAC) / Fusion Data Intelligence (FDI) Specialist
Experience : 3 to 8 years
Location : All USI locations – Hyderabad, Bengaluru, Mumbai, Gurugram (preferred) and Pune, Chennai, Kolkata
Work Mode : Hybrid Only (2-3 days from office or all 5 days from office)
Mandatory Skills : Oracle Analytics Cloud (OAC), Fusion Data Intelligence (FDI), RPD, OAC Reports, Data Visualizations, SQL, PL/SQL, Oracle Databases, ODI, Oracle Cloud Infrastructure (OCI), DevOps tools, Agile methodology.
Key Responsibilities :
- Design, develop, and maintain solutions using Oracle Analytics Cloud (OAC).
- Build and optimize complex RPD models, OAC reports, and data visualizations.
- Utilize SQL and PL/SQL for data querying and performance optimization.
- Develop and manage applications hosted on Oracle Cloud Infrastructure (OCI).
- Support Oracle Cloud migrations, OBIEE upgrades, and integration projects.
- Collaborate with teams using the ODI (Oracle Data Integrator) tool for ETL processes.
- Implement cloud scripting using cURL for Oracle Cloud automation.
- Contribute to the design and implementation of Business Continuity and Disaster Recovery strategies for cloud applications.
- Participate in Agile development processes and DevOps practices including CI/CD and deployment orchestration.
Required Skills :
- Strong hands-on expertise in Oracle Analytics Cloud (OAC) and/or Fusion Data Intelligence (FDI).
- Deep understanding of data modeling, reporting, and visualization techniques.
- Proficiency in SQL, PL/SQL, and relational databases on Oracle.
- Familiarity with DevOps tools, version control, and deployment automation.
- Working knowledge of Oracle Cloud services, scripting, and monitoring.
Good to Have :
- Prior experience in OBIEE to OAC migrations.
- Exposure to data security models and cloud performance tuning.
- Certification in Oracle Cloud-related technologies.
Review Criteria:
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, including at least 3 years of hands-on Dremio experience
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Role & Responsibilities:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
Ideal Candidate:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
We’re Hiring: Pricing Analyst 💼
📍 Location: Hyderabad
🕘 Timings: 9:30 AM – 6:30 PM | 🗓️ 5 Days Working
💼 Experience: 2+ Years
🏢 Industry: Design | Hospitality | Facade
✨ Key Responsibilities:
📊 Analyze market trends, competitor pricing & cost structures
💡 Recommend optimal pricing strategies for profitability
📈 Monitor margins, costs & sales performance
🤝 Collaborate with Sales, Procurement & Finance teams
📑 Prepare pricing models & forecasts for business planning
📉 Support contract negotiations with data-driven insights
🎓 Requirements:
✅ Bachelor’s degree (any field)
💪 2+ yrs in Pricing / Financial / Business Analysis
📘 Strong Excel & analytical skills
🗣️ Excellent communication & presentation abilities
📩 Share your resume
Tasks:
· You will take part in designing the architecture and developing modern software-based back-end applications (Node.js, TypeScript, NestJS)
· You support the conception, programming, and rollout of new features
· You help manage IoT-to-cloud services, microservices, Kubernetes, and Docker on GCP
Requirements:
· Bachelor's or Master's student in an IT-related field of study
· Ability to design backend applications holistically across multiple technologies
· Experience in agile software development with JavaScript, TypeScript, Node.js, and ideally NestJS
· First experience with cloud platforms (AWS/Azure/GCP) and Docker
Nice to Have:
· Experience with GraphQL, Microservices, Kubernetes
· Understanding of IoT device management and message brokers like AMQP or MQTT
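Since the role mentions message brokers like MQTT, here is a minimal, self-contained sketch (not from the posting; the helper name and topics are made up) of how MQTT-style topic filters with the `+` and `#` wildcards resolve against a concrete topic:

```javascript
// Illustrative sketch of MQTT topic-filter matching:
// '+' matches exactly one topic level, '#' matches all remaining levels.
// Hypothetical helper, not tied to any specific broker or client library.
function topicMatches(filter, topic) {
  const f = filter.split("/");
  const t = topic.split("/");
  for (let i = 0; i < f.length; i++) {
    if (f[i] === "#") return true;                    // '#' swallows everything below this level
    if (i >= t.length) return false;                  // topic ran out of levels
    if (f[i] !== "+" && f[i] !== t[i]) return false;  // literal level must match exactly
  }
  return f.length === t.length;                       // no trailing topic levels left over
}
```

For example, `topicMatches("devices/+/telemetry", "devices/d1/telemetry")` is true, while a `devices/#` subscription also receives deeper topics like `devices/d1/status/battery`.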
Benefits:
· A responsible position in a fast-growing and highly innovative start-up
· An agile and diverse team with colleagues from all over the world
· An English-speaking, open work environment with flat hierarchies and short decision-making paths
· Advanced technology stack leveraging cutting-edge IoT hardware and software
· Creative freedom for own ideas, projects and personal development
· Team building, learning and start-up events
You have experience tackling organisation-wide challenges: complex initiatives that span multiple teams and products. You have a strong bias for operational and engineering excellence, and you consider it critical that your voice is heard in product and business decisions. You like to find patterns that can be reused across multiple stacks, and you are willing to help organisations adopt newer technologies. You are a great communicator who takes care to set high benchmarks in technical design choices. You spend a lot of time on research, love to discuss the pros and cons of every option, and help others make careful decisions.
Responsibilities:
- You will participate in all aspects of our development and provide expertise on architecture decisions we'll need to make to solve organisational problems.
- Driving technical roadmaps for the organisation.
- Set the north star for all key technical metrics to monitor and set high standards.
- Identify and articulate the current-state architecture, bring clarity to the target-state architecture, and build organisation-level influence to drive the transition to conclusion.
- Coaching / mentoring folks across the organisation on various technologies.
- Improve the recruiting and hiring process.
- Think about culture and how to shape it.
- Build, develop and scale our platform that powers real estate agents, buyers and sellers.
- Become a domain expert on real estate technology and products and an empathetic partner to our customers.
- Inspire, recruit and mentor your engineering colleagues.
- Operate in a scalable engineering culture that leverages modern principles of decoupled systems and automated CI/CD, testing, and monitoring to drive efficiencies.
Requirements:
- BS in CS or EE or equivalent.
- Proven track record working in top talent and high performance environments.
- Experience working on large scale systems in rapid growth environments.
- Experience with modern web frameworks (e.g. Go/React), distributed computing (e.g. Spark), public cloud platforms (e.g. AWS), and data pipelines.
- Familiarity with containerisation, microservices architecture, continuous integration and delivery.
- Experience with multiple well-known products throughout their life cycles, from idea conception to product release and maintenance.
- 10+ years experience.
- Published Engineering blogs.
- External presentations in events.
- 6+ years of experience working with MongoDB or other NoSQL databases.
- Maintain and configure MongoDB (developer)
- Keep clear documentation of the database setup and architecture.
- Backup and Disaster Recovery management.
- Adept with all the best practices and design patterns in MongoDB for designing document schemas.
- Good grasp of MongoDB’s aggregation framework.
- Ensure that the databases achieve maximum performance and availability.
- Design indexing strategies.
- Configure, monitor, and deploy replica sets.
- Should have experience with MongoDB Atlas.
- Should have hands-on experience with development and performance tuning.
- Create roles and users and set their permissions.
- Excellent written, verbal communication skills and critical thinking skills
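To illustrate what "a good grasp of MongoDB's aggregation framework" means in practice, here is an in-memory sketch (hypothetical data and function name, not from the posting) that mirrors a `$match` + `$group`/`$sum` pipeline; in a real deployment the equivalent work runs server-side via `db.orders.aggregate([...])`:

```javascript
// Hypothetical sample documents, standing in for a MongoDB collection.
const orders = [
  { status: "shipped", region: "APAC", amount: 120 },
  { status: "shipped", region: "EMEA", amount: 80 },
  { status: "pending", region: "APAC", amount: 50 },
  { status: "shipped", region: "APAC", amount: 30 },
];

// Equivalent shell pipeline:
// db.orders.aggregate([
//   { $match: { status: "shipped" } },
//   { $group: { _id: "$region", total: { $sum: "$amount" } } },
// ]);
function shippedTotalsByRegion(docs) {
  const totals = {};
  for (const d of docs.filter((d) => d.status === "shipped")) { // $match stage
    totals[d.region] = (totals[d.region] || 0) + d.amount;      // $group with $sum
  }
  return totals;
}
```

An indexing strategy complements this: a compound index such as `{ status: 1, region: 1 }` would let the server satisfy the `$match` stage without a collection scan.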
Please find the JD details below.
Tech Stack Expected: Java 8/11, Spring Boot, Spring Cloud, JUnit/Mockito, MySQL, Kafka, Avro, git, Jenkins, PCF, DDD/BDD/TDD.
- We are looking for candidates with strong programming experience of more than 3 years and deep knowledge on data structure and algorithms.
- Strong knowledge of Spring Cloud is required
- Strong experience in Microservices
- Data structures and algorithms: strong problem-solving skills and the ability to optimize code
- PCF: developers manage the pipeline and need to own it
- Fundamental knowledge of Docker and Kubernetes will be helpful, including know-how on scaling up and down
- Secure coding practices
- Nice to have: pair programming experience




