11+ NTP Jobs in Hyderabad | NTP Job openings in Hyderabad
AD Skills:
- Implementing Active Directory Infrastructure
- Expertise in integrating Windows infrastructure components such as DNS and DHCP
- Assessing domain controllers, domain architecture, centralized migrations, GPO designs, Windows DNS/DHCP management, and forest-level trust architecture
- Cloud service integration with Active Directory, including Amazon Web Services and Azure Active Directory services
- Multi-forest/domain Active Directory and DNS migration
- In-depth knowledge of AD site topology, OU structure, Group Policies, NTP, LDAP, and implementation of AD components
- Good knowledge of AD security and identity principals
- Strong experience integrating ADFS with on-premises Active Directory
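The skills list pairs NTP with directory services because domain controllers depend on synchronized clocks (Kerberos is sensitive to clock skew). As a rough illustration of what an SNTP exchange looks like on the wire, here is a minimal Python sketch that builds a client request packet and decodes the transmit timestamp from a response. The function names are illustrative; a real AD deployment would rely on `w32tm` and the domain time hierarchy rather than hand-rolled packets.

```python
import struct

NTP_UNIX_DELTA = 2208988800  # seconds between the NTP epoch (1900) and the Unix epoch (1970)

def build_sntp_request() -> bytes:
    # First byte packs LI=0, VN=4, Mode=3 (client): 0b00_100_011 = 0x23.
    packet = bytearray(48)
    packet[0] = 0x23
    return bytes(packet)

def parse_transmit_time(packet: bytes) -> float:
    # Transmit Timestamp lives at bytes 40-47 as 32.32 fixed point,
    # counting seconds since 1900; convert to a Unix timestamp.
    secs, frac = struct.unpack("!II", packet[40:48])
    return secs - NTP_UNIX_DELTA + frac / 2**32
```

To query a live server, you would send `build_sntp_request()` over UDP port 123 and pass the 48-byte reply to `parse_transmit_time`.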
Review Criteria
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, including at least 3 years of hands-on Dremio experience
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, and Iceberg, along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred
- Preferred (Nice-to-have) – Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Job Specific Criteria
- CV Attachment is mandatory
- How many years of experience do you have with Dremio?
- Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
- Are you okay with 3 Days WFO?
- The virtual interview requires video to be on; are you okay with that?
Role & Responsibilities
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
Ideal Candidate
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
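Since the role leans heavily on SQL optimization and query performance tuning, here is a small, self-contained illustration of the scan-versus-index distinction using Python's built-in sqlite3 as a stand-in engine. Dremio's tuning levers (reflections, partition pruning) are mechanically different, but the habit of inspecting the query plan before and after an optimization carries over directly. The table and index names are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("APAC", 100.0), ("EMEA", 250.0), ("APAC", 75.0)],
)

# Before indexing: the filter forces a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM sales WHERE region = ?", ("APAC",)
).fetchall()

# Add an index on the filter column, then re-check the plan:
# the engine now searches the index instead of scanning every row.
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM sales WHERE region = ?", ("APAC",)
).fetchall()

total = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = ?", ("APAC",)
).fetchone()[0]
```

The same before/after discipline applies when validating that a Dremio reflection is actually being matched by a workload's queries.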
Key Responsibilities:
● Work closely with product managers, designers, frontend developers, and other cross-functional teams to ensure the seamless integration and alignment of frontend and backend technologies, driving cohesive and high-quality product delivery.
● Develop and implement coding standards and best practices for the backend team.
● Document technical specifications and procedures.
● Stay up-to-date with the latest backend technologies, trends, and best practices.
● Collaborate with other departments to identify and address backend-related issues.
● Conduct code reviews and ensure code quality and consistency across the backend team.
● Create technical documentation, ensuring clarity for future development and maintenance.
Requirements:
● Experience: 4-6 years of hands-on experience in backend development, with a strong
background in product-based companies or startups.
● Education: Bachelor’s degree or above in Computer Science or a related field.
● Programming skills: Proficient in Python and software development principles, with a
focus on clean, maintainable code, and industry best practices. Experienced in unit
testing, AI-driven code reviews, version control with Git, CI/CD pipelines using GitHub
Actions, and integrating New Relic for logging and APM into backend systems.
● Database Development: Proficiency in developing and optimizing backend systems in
both relational and non-relational database environments, such as MySQL and NoSQL
databases.
● GraphQL: Proven experience in developing and managing robust GraphQL APIs,
preferably using Apollo Server. Ability to design type-safe GraphQL schemas and
resolvers, ensuring seamless integration and high performance.
● Cloud Platforms: Familiar with AWS and experienced in Docker containerization and
orchestrating containerized systems.
● System Architecture: Proficient in system design and architecture with experience in
developing multi-tenant platforms, including security implementation, user onboarding,
payment integration, and scalable architecture.
● Linux Systems: Familiarity with Linux systems is mandatory, including deployment and
management.
● Continuous Learning: Stay current with industry trends and emerging technologies to
influence architectural decisions and drive continuous improvement.
Benefits:
● Competitive salary.
● Health insurance.
● Casual dress code.
● Dynamic & collaboration-friendly office.
● Hybrid work schedule.
Industry
- IT Services and IT Consulting
Employment Type
Full-time
About You:
● Bachelor of Science in Computer Science or a related engineering degree.
● 12+ years of experience developing APIs, abstraction layers, and application software.
● 5+ years of experience building scalable, serverless solutions in GCP or AWS.
● 4+ years of experience with Python and MongoDB.
● Experience with large-scale distributed systems and streaming data services.
● Experience building, developing, and maintaining cloud-native infrastructure, serverless architecture, micro-operations, and workflow automation.
● A hardworking problem-solver who thrives on finding solutions to difficult technical challenges.
● Experience with modern high-level languages and databases, including JavaScript, MongoDB, and Python.
● Experience with GitHub, GitLab, CI/CD, Jira, unit testing, integration testing, regression testing, and collaborative documentation.
● Expertise with GCP, Kubernetes, Docker, or containerization is a great plus.
● Ability to write and assess clean, functional, high-quality, testable code for each of our projects.
● A positive, proactive, solution-focused contributor and team motivator.
We are seeking a Senior Data Scientist with hands-on experience in Generative AI (GenAI) and Large Language Models (LLM). The ideal candidate will have expertise in building, fine-tuning, and deploying LLMs, as well as managing the lifecycle of AI models through LLMOps practices. You will play a key role in driving AI innovation, developing advanced algorithms, and optimizing model performance for various business applications.
Key Responsibilities:
- Develop, fine-tune, and deploy Large Language Models (LLM) for various business use cases.
- Implement and manage the operationalization of LLMs using LLMOps best practices.
- Collaborate with cross-functional teams to integrate AI models into production environments.
- Optimize and troubleshoot model performance to ensure high accuracy and scalability.
- Stay updated with the latest advancements in Generative AI and LLM technologies.
Required Skills and Qualifications:
- Strong hands-on experience with Generative AI, LLMs, and NLP techniques.
- Proven expertise in LLMOps, including model deployment, monitoring, and maintenance.
- Proficiency in programming languages like Python and frameworks such as TensorFlow, PyTorch, or Hugging Face.
- Solid understanding of AI/ML algorithms and model optimization.
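On the LLMOps side, "model deployment, monitoring, and maintenance" often starts with something as simple as tracking per-request latency against a budget. The sketch below is a hypothetical, stdlib-only illustration; the class name, window size, and threshold are invented for the example, and a production stack would typically export these metrics to a monitoring system rather than compute them in-process.

```python
from collections import deque
from statistics import quantiles

class LLMServiceMonitor:
    """Rolling-window latency monitor for a deployed LLM endpoint (illustrative)."""

    def __init__(self, window: int = 100, p95_budget_ms: float = 2000.0):
        self.latencies = deque(maxlen=window)  # keep only the most recent requests
        self.p95_budget_ms = p95_budget_ms

    def record(self, latency_ms: float) -> None:
        self.latencies.append(latency_ms)

    def p95(self) -> float:
        # quantiles(..., n=20) returns 19 cut points; index 18 is the 95th percentile
        return quantiles(self.latencies, n=20)[18]

    def breached(self) -> bool:
        # Only alert once enough samples have accumulated to make p95 meaningful.
        return len(self.latencies) >= 20 and self.p95() > self.p95_budget_ms
```

In practice, `record()` would be called from request middleware and `breached()` from a scheduled health check that pages the on-call engineer or triggers a rollback.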
EverestEngineering -Software Engineering when you need it. High quality, scalable, distributed development teams are ready to help you now. Sustainable software development. Fit for purpose. Doing the right thing both for our customers and for yours.
As a team, we are passionate and motivated by the impact our organization can have on our customers, employees, the industry, and the world. We are here to accelerate software innovation by digitally connecting the global workforce. Through our experience, we know that working with remote teams and getting it right can become a competitive advantage for organizations. We want to provide this advantage to customers and partners that are trying to have a positive impact on the world through software innovation
Excellence is at the forefront of our mission. We see an opportunity to shift the narrative of working with offshore teams from a frustrating cost-cutting exercise to a beneficial value addition. https://everest.engineering/
Our experience means we understand the problems software companies face trying to build offshore distributed teams because we've been there before ourselves.
We work iteratively together to manage these problems for you, or to provide extra capacity at peak times, so that you can do what you do best: deliver amazing innovations and delight your customers.
To see the quality of our code, you can check out some of our open source projects: https://github.com/everest-engineering
Specialties-
Big Data, Web Development, Mobile Development, Agile Development, Data Analytics, Software Product Development, and Remote Working
Headquarters - Melbourne, Victoria
We love people who in general -
- have a passion to own and create amazing products.
- are able to clearly understand the customer’s problem.
- are a good collaborative problem solver.
- are a really really good team player.
- are open to learn from others and teach others.
- are able to take meaningful feedback and improve continuously.
- can commit to inclusion, equality & diversity.
- can maintain integrity at work.
You are the one if -
- you love solving problems.
- you have a keen interest in understanding and analysing the customer problems and solutions.
- you possess good listening, communication (verbal & written) and presentation skills.
- you can empathise easily with customers.
- you can facilitate and lead workshops that generate customised business solutions.
To be successful in this role, you need to -
- analyse the as-is system and collaborate with clients to create artefacts (personas, journeys, epics, stories, to-be system etc.) to outline business vision, objective, product roadmap, and a project release plan.
- effectively manage the scope and constraints of the project.
- work in agile teams that help in the successful delivery of a project.
- effectively prioritise and obtain buy-in from all the stakeholders to help the team with the requirements.
- write detailed stories to help the team understand the requirements.
- demonstrate flexibility in picking up things/roles needed for the successful delivery of the project.
- collaborate with the UX team to understand and contribute to the user research & design process.
- be an effective liaison between the client and your team to manage the product backlog and keep an eye on the software delivery.
1. Hands-on Hyperion implementation experience in EPBCS.
2. Must have completed at least two implementations of Hyperion Planning.
3. Must understand application dimensionality, metadata setup, data form creation, dashboard creation, and setting up task lists.
4. Demonstrated ability to work independently as well as in a collaborative team environment.
5. Experience with EPBCS preferred.
6. Must have handled scripting and data rule creation.
7. Should be strong in business rules and calculations.
- Perform complex and varied applications support. Build, maintain & modify within a large environment;
- Identify issues, evaluate possible solutions, and make recommendations on the most appropriate action
- Design, build, test, and document solution-specific information and perform related work as required.
- Demonstrate functional knowledge in relevant clinical applications (OE-PCS-PHA-EDM-ITS-RAD-LAB-MIC-BBK-PTH)
- Demonstrate functional knowledge in Clinical Documentation
- Must have hands-on build and troubleshooting experience
- Must be able to quickly identify and resolve complex system issues
- Must be able to build using technical specifications.
Basic Qualifications:
- Implementation and support experience within a MEDITECH Environment
- Bachelor’s degree in Business, Healthcare Administration, Communications, Marketing, or a related field, or equivalent relevant work experience
Preferred Qualifications:
- Knowledge of ITIL Incident, Problem, and Change management
- Working knowledge of Agile, Product and Project methodologies
Experience with Node.js, Express, and FeathersJS
Third-party API integration knowledge
Database: MySQL or NoSQL
Kafka client integration with Node.js
Redis integration using Node.js
Education:
Bachelor’s/Master’s degree in a related field with 2+ years of experience in ERP product sales.
Skills
- Excellent communication (verbal and written) and time management skills; a fast learner, self-motivated, and comfortable taking initiative and handling multiple projects simultaneously
- Excellent customer approach, interpersonal and influencing skills
- Proficient and demonstrable experience in prospecting, qualifying, creating value-based demonstrations
- A salesperson with a proven, successful track record of 2+ years in ERP sales.
Job Responsibilities:
- Responsible for New Business Development via prospecting, qualifying, selling and closing software product sales in the Enterprise Division.
- Develop and implement a sales strategy in the assigned region to maximize growth opportunities, strengthen market share, and maximize customer retention.
- Attracting new clients by innovating and overseeing the sales process for the business.
- Responsible for enhancing revenue, within existing and new clients, through continuous client engagement
- Set up meetings with potential clients, listen to their wishes and concerns, and improve the sales strategy accordingly.
- Research and identify new market opportunities.
- Create and conduct effective presentations and product demos
- Build & Strengthen market intelligence & sales analytics for identification of opportunities, effective client solutioning and deal conversion.
- Consistently meet and exceed sales targets
- Foster a collaborative environment within the organization.
- Identify trends and customer needs, building a short/medium/long-term sales pipeline in accordance with targets
- Prepare and deliver pitches to potential investors.
- Create and deliver proposals and presentations based on clients’ needs and close orders
- Conduct webinars about our product
- Provide timely weekly, monthly, and quarterly sales reports
- Client Relationship management



