11+ Kerberos Jobs in Hyderabad | Kerberos Job openings in Hyderabad
Apply to 11+ Kerberos Jobs in Hyderabad on CutShort.io. Explore the latest Kerberos Job opportunities across top companies like Google, Amazon & Adobe.
ROLES AND RESPONSIBILITIES:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
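As a rough illustration of the semantic-modeling responsibility above, a governed, curated view over raw data can be sketched with Python's built-in sqlite3 (table and view names are hypothetical; in practice Dremio would define similar views over Parquet/S3 sources rather than SQLite):

```python
import sqlite3

# In-memory database standing in for a raw lakehouse source (illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, region TEXT, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?, ?)",
    [(1, "south", 120.0, "complete"),
     (2, "south", 80.0, "cancelled"),
     (3, "north", 200.0, "complete")],
)

# Semantic layer: a curated view that BI tools query instead of the raw table,
# encoding the business rule (only completed orders count) in one governed place.
conn.execute("""
    CREATE VIEW sales_by_region AS
    SELECT region, SUM(amount) AS total_sales, COUNT(*) AS order_count
    FROM raw_orders
    WHERE status = 'complete'
    GROUP BY region
""")

for row in conn.execute("SELECT * FROM sales_by_region ORDER BY region"):
    print(row)
```

Consumers see only the governed view, so the filtering and aggregation logic lives in the semantic layer rather than in each dashboard.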
IDEAL CANDIDATE:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
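The SQL-optimization expertise asked for above largely comes down to helping the query planner avoid full scans. A small, engine-agnostic demonstration using Python's sqlite3 as a stand-in for a distributed engine (the principle, not the engine, is the point; table and index names are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT, payload TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i % 50, f"2024-01-{i % 28 + 1:02d}", "x") for i in range(1000)],
)

query = "SELECT COUNT(*) FROM events WHERE user_id = 7"

# Without an index the planner falls back to a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_before)

conn.execute("CREATE INDEX idx_events_user ON events(user_id)")

# With the index, the same query becomes an index search.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_after)
```

In Dremio the analogous lever is a reflection rather than an index, but the workflow is the same: inspect the plan, add a physical optimization, and confirm the plan changed.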
PREFERRED:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
Criteria
Mandatory
Strong Dremio / Lakehouse Data Architect profile
Mandatory (Experience 1) – 5+ years of experience in Data Architecture / Data Engineering, with a minimum of 3 years of hands-on Dremio experience
Mandatory (Experience 2) – Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
Mandatory (Technical Skills 1) – Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
Mandatory (Technical Skills 2) – Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
Mandatory (Architecture) – Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
Mandatory (Governance) – Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
Mandatory (Stakeholder Management) – Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
Mandatory (Company) – Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
We are seeking an experienced Senior Software Engineer to join our Vet Healthcare Technology team. In this role, you will design, develop, and maintain cloud-native applications on Azure that power our Practice Management platform. You’ll collaborate closely with cross-functional teams—clinical SMEs, architects, QA, and DevOps—to deliver robust, scalable, and secure solutions utilizing .NET 8, React, and modern Azure services.
Key Responsibilities
Architecture & Design
- Lead design discussions and apply proven design patterns (e.g., CQRS, Repository, Factory) to ensure clean, maintainable code.
- Define microservice boundaries and integration strategies (APIs, messaging) for HL7 and FHIR data flows.
Development & Integration
- Build backend services in .NET 8, leveraging Azure Functions, Logic Apps, Service Bus, API Gateway, and Storage Services.
- Develop responsive front-end interfaces using React, TypeScript, and state-management libraries (e.g., Redux or Context API).
- Implement data persistence layers for SQL Server and PostgreSQL, including schema design, stored procedures, and performance tuning.
- Integrate with healthcare standards (HL7 v2/v3, FHIR R4) and third-party systems via secure, high-throughput interfaces.
Quality & Compliance
- Write unit and integration tests to ensure code quality; participate in code reviews and pair-programming sessions.
- Follow best practices for security, privacy, and compliance in healthcare (HIPAA, GDPR, etc.).
Mentorship & Collaboration
- Mentor mid-level engineers, drive knowledge-sharing sessions, and contribute to technical roadmaps.
- Work in an Agile/Scrum environment: estimate user stories, attend sprint ceremonies, and deliver incremental value.
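HL7 v2 messages, referenced in the integration responsibilities above, are pipe-delimited text with one segment per line. A minimal parsing sketch in Python (illustrative only; the segment content is hypothetical, and the platform itself is .NET, where a dedicated HL7 library would be used):

```python
def parse_hl7(message: str) -> dict:
    """Split an HL7 v2 message into {segment_id: [field lists, ...]}."""
    segments = {}
    for line in message.strip().split("\r"):  # segments are CR-separated
        fields = line.split("|")              # fields are pipe-delimited
        segments.setdefault(fields[0], []).append(fields[1:])
    return segments

# Hypothetical two-segment message: header (MSH) plus patient identification (PID).
msg = ("MSH|^~\\&|VetApp|Clinic|PMS|HQ|202401011200||ADT^A01|123|P|2.5\r"
       "PID|1||PAT-42||Rex^Canine")
parsed = parse_hl7(msg)
print(parsed["PID"][0][2])  # third PID field: the patient identifier
```

Real HL7 handling also needs component (`^`) and repetition (`~`) delimiters, escape sequences, and MSH's special field numbering; this sketch only shows the segment/field layering.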
React Developer at BeyondScale
BeyondScale is a technology company on a mission to democratise AI for small and medium-sized businesses (SMBs). We're building Sitara, an AI-powered ERP suite that simplifies core business processes for the service sector. Imagine a suite of intuitive micro-apps - a pocket CRM, a streamlined POS system, and essential tools - all powered by intelligent automation and a user interface (UI) that fades into the background.
The Opportunity:
We're seeking a talented React Developer to join our growing team and play a crucial role in building the future of AI-powered ERP. You'll be responsible for crafting beautiful, user-friendly interfaces that bring Sitara's intuitive design to life.
What You'll Do:
- Develop and maintain highly interactive and responsive user interfaces (UIs) using ReactJS and its ecosystem.
- Collaborate with designers and engineers to translate UI mockups into clean, maintainable React code.
- Implement state management solutions, component-based design patterns, and utilise hooks effectively.
- Translate UI designs into pixel-perfect CSS, ensuring a seamless user experience across devices.
- Write clean, well-documented code that prioritises maintainability and future growth.
- Actively participate in testing, debugging, and problem-solving to ensure a flawless user experience.
- Utilise Git version control and adhere to modern development workflows for seamless collaboration.
- Prior experience with React Native is a plus.
You're a Great Fit If You:
- Have 1+ years of experience building web applications with ReactJS.
- Possess a strong understanding of core computer science principles like data structures and algorithms.
- Are passionate about creating user-centric interfaces and delivering exceptional UX.
- Thrive in a collaborative environment and enjoy working closely with designers and engineers.
- Have excellent communication and problem-solving skills.
- Are a coding craftsman who prioritises clean, maintainable code.
Why BeyondScale?
- Impact: Be a part of a revolutionary product that empowers service-based businesses to thrive.
- Innovation: Work at the forefront of AI and help shape the future of ERP software.
- Autonomy & Growth: Take ownership of your work, gain access to production systems, and grow alongside the company.
- Ground Floor Opportunity: Join a high-potential startup on the cusp of explosive growth.
Position Title: Sr. Manager/Manager Designs – Power Evacuation
Experience: 14 To 20 Years
Industry Type: Renewable Energy
Key Skills: AutoCAD, ETAP, Power World
Job Objectives:
- Lead the design of efficient power evacuation systems to integrate solar power plants with the national or regional grid.
- Design the layout of transmission lines, substations, step-up transformers, and switchgear for large-scale solar projects.
- Perform load flow studies, short circuit studies, and other electrical analyses to ensure the robustness of evacuation systems.
- Lead the planning, design, and implementation of power evacuation systems, ensuring they are aligned with project timelines, budgets, and specifications.
- Coordinate with external consultants, EPC contractors, and vendors to ensure seamless execution of the power evacuation systems.
- Ensure that all evacuation designs are compliant with local, national, and international standards (such as IEC, IEEE).
- Lead and mentor a team of engineers and technical specialists.
- Work closely with project managers, regulatory bodies, and other stakeholders to ensure project alignment and timely execution.
- Proficiency in Unity3D and C# programming language.
- Strong understanding of Unity 3D and 2D development principles and best practices.
- Familiarity with version control systems such as Git.
- Excellent problem-solving and communication skills.
- Ability to work effectively in a collaborative team environment.
- Familiarity with VR/XR/AR hardware and platforms, including Oculus, HTC Vive, HoloLens, and ARKit/ARCore.
- Ability to adapt to evolving technologies and project requirements.
Dear Candidates,
Greetings from Nowfloats.
We are looking for a QA Engineering Manager with experience in team leadership and stakeholder management.
- Total experience of 7-10 years; should be interested in teaching and research
- 3+ years’ experience in data engineering which includes data ingestion, preparation, provisioning, automated testing, and quality checks.
- 3+ years of hands-on experience with Big Data cloud platforms like AWS and GCP, data lakes, and data warehouses
- 3+ years with Big Data and analytics technologies; experience in SQL and in writing Spark code using Python, Scala, or Java
- Experience in designing, building, and maintaining ETL systems
- Experience in data pipeline and workflow management tools like Airflow
- Application development background, along with knowledge of analytics libraries, open-source Natural Language Processing, statistical, and big data computing libraries
- Familiarity with Visualization and Reporting Tools like Tableau, Kibana.
- Should be good at storytelling with technology and data
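The ingestion, preparation, and quality-check flow listed above can be sketched in plain Python (a real pipeline would use Spark and Airflow; the function names, fields, and validity rules here are all illustrative):

```python
def ingest(raw_rows):
    """Ingest: parse raw CSV-like strings into records."""
    return [dict(zip(("id", "city", "temp_c"), r.split(","))) for r in raw_rows]

def prepare(records):
    """Prepare: cast types and normalise text fields."""
    return [{"id": int(r["id"]),
             "city": r["city"].strip().title(),
             "temp_c": float(r["temp_c"])} for r in records]

def quality_check(records):
    """Quality check: drop rows that violate simple validity rules."""
    return [r for r in records if -60.0 <= r["temp_c"] <= 60.0]

# A malformed sensor reading (99999) is caught by the quality gate.
raw = ["1, hyderabad ,34.5", "2,delhi,99999", "3,chennai,31.0"]
loaded = quality_check(prepare(ingest(raw)))
print(loaded)
```

Each stage is a pure function over records, which is the same shape these steps take as Spark transformations or Airflow tasks, just at toy scale.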
Qualification: B.Tech / BE / M.Sc / MBA / B.Sc. Certifications in Big Data technologies and cloud platforms like AWS, Azure, and GCP are preferred.
Primary Skills: Big Data + Python + Spark + Hive + Cloud Computing
Secondary Skills: NoSQL+ SQL + ETL + Scala + Tableau
Selection Process: 1 Hackathon, 1 Technical round and 1 HR round
Benefit: Free-of-cost training on Data Science from top-notch professors
• Strong experience in Java/J2EE development is required
• Excellent working knowledge of Spring MVC and Spring Boot
• Strong background in developing and deploying software that runs in a real-time, multi-threaded environment
• Good knowledge and experience with concepts of MVC, JDBC and RESTful API Integration
• Experience with threaded and asynchronous environment
• Experience with any of the following Frameworks is Desired: Spring, Spring Boot, Hibernate
• Fundamental understanding of design patterns
• Working knowledge of SOAP/XML/WSDL
• Proven experience in MongoDB
• Experience supporting and troubleshooting problems in a highly complex environment
• Familiar with agile / scrum development methodologies
• Proficient understanding of code versioning tools like Git/Bitbucket and SVN
• Familiarity with Continuous Integration and tools such as Maven and Jenkins.
Role & Responsibilities:
• You will be responsible for Java development and building large scale applications that are high performance, scalable, and resilient in an SOA environment
• Working closely with end-users and other members of the team to identify and employ the best solutions
• Developing and implementing strong algorithms/techniques for solving problems in a high-volume, high-availability environment
• Engaging end-users to identify new requirements, strategic direction and highlight issues
• Defining and building maintainable processes that provide resilient and stable platforms, which support end user’s business/technical demand
• Integrating new services and providing clean APIs and services for applications
• Understanding volume growth to ensure the systems scale to meet demand