11+ DHCP administration Jobs in Hyderabad | DHCP administration Job openings in Hyderabad
Apply to 11+ DHCP administration Jobs in Hyderabad on CutShort.io. Explore the latest DHCP administration Job opportunities across top companies like Google, Amazon & Adobe.
AD Skills:
- Implementing Active Directory Infrastructure
- Expertise in integrating Windows infrastructure components such as DNS, DHCP, etc.
- Assess Domain Controllers, Domain Architecture, Centralized Migrations, GPO Designs, Windows DNS/DHCP Management and Forest Level Trust Architecture
- Cloud Service integration with Active Directory including Amazon Web Services and Azure Active Directory Services
- Multi Forest/Domain Active Directory and DNS migration
- In-depth knowledge of AD site topology, OU structure, Group Policies, NTP, LDAP, and the implementation of AD components
- Good knowledge of AD security and identity principals
- Strong experience integrating ADFS with on-premises Active Directory
Responsibilities:
Build and optimize batch and streaming data pipelines using Apache Beam (Dataflow); a minimal sketch follows this list
Design and maintain BigQuery datasets using best practices in partitioning, clustering, and materialized views
Develop and manage Airflow DAGs in Cloud Composer for workflow orchestration
Implement SQL-based transformations using Dataform (or dbt)
Leverage Pub/Sub for event-driven ingestion and Cloud Storage for raw/lake layer data architecture
Drive engineering best practices across CI/CD, testing, monitoring, and pipeline observability
Partner with solution architects and product teams to translate data requirements into technical designs
Mentor junior data engineers and support knowledge-sharing across the team
Contribute to documentation, code reviews, sprint planning, and agile ceremonies
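As a hedged illustration of the pipeline work listed above, the sketch below is a minimal Apache Beam (Python) streaming job that reads events from Pub/Sub and appends them to a BigQuery table. The project, subscription, and table names are placeholders, not details from this posting.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # streaming=True tells Dataflow to treat the Pub/Sub source as unbounded.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="example-project:analytics.events",  # partitioned/clustered table assumed
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```

In practice such a job would be submitted with the DataflowRunner and scheduled or monitored from Cloud Composer, but the runner and orchestration details are omitted here.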
Requirements
2+ years of hands-on experience in data engineering, with at least 2 years on GCP
Proven expertise in BigQuery, Dataflow (Apache Beam), Cloud Composer (Airflow)
Strong programming skills in Python and/or Java
Experience with SQL optimization, data modeling, and pipeline orchestration
Familiarity with Git, CI/CD pipelines, and data quality monitoring frameworks
Exposure to Dataform, dbt, or similar tools for ELT workflows
Solid understanding of data architecture, schema design, and performance tuning
Excellent problem-solving and collaboration skills
Bonus Skills:
GCP Professional Data Engineer certification
Experience with Vertex AI, Cloud Functions, Dataproc, or real-time streaming architectures
Familiarity with data governance tools (e.g., Atlan, Collibra, Dataplex)
Exposure to Docker/Kubernetes, API integration, and infrastructure-as-code (Terraform)
Skills Required:
- 4+ years' experience in application development in the following:
- Mandatory: C# ASP.NET, MVC, REST API, T-SQL, OOP, client-facing skills
- Preferred: ASP.NET Core, JavaScript or Angular, Azure, VB.NET, WWF (Windows Workflow Foundation), Microservices
Job Description:
- Mandatory: Extensive hands-on experience in developing web applications using C# ASP.NET and MVC
- Mandatory: T-SQL: ability to develop and analyze existing complex flows within stored procedures
- Mandatory: Utilizing REST APIs to develop background processes, with an emphasis on security (OIDC/OAuth) and performance
- Mandatory: Extensive understanding of Object-Oriented Design and Programming.
- Mandatory: Ability to talk to international clients and explain product features as well as support issues (the client checks this ability by asking candidates to explain their current projects)
- Preferred: Angular (not AngularJS) and/or JavaScript
- Preferred: Azure-based DevOps, checking source into Git, and maintaining version control
- Preferred: Windows Workflow Foundation (workflow activities and business process automation); the tool uses VB.NET syntax
- Preferred: Azure function apps for serverless workloads
- Preferred: Experience implementing integrations (utilizing SOAP, SFTP, XML, etc.)
- Preferred: Experience with microservice architectures

One of the top multinational IT and consulting companies
Work Location: PAN India

Requirement 1: Sr. Sales Executive (Internal Requirement – Only Female Candidates)
- Experience Required: Minimum 2 years in Direct/Perm Staffing and Contingent Staffing Sales for the USA market.
- Scadea Payroll
- Preferred Candidate: Female
- Budget: 6 LPA (do not exceed)
- No cab facility
- Shift: Night Shift (Onsite – Bangalore)
- Skills Required:
- Excellent communication skills
- Extensive experience in lead generation, cold calling, cold emailing, and US staffing.
- Additional Criteria: Must have stayed at the same company for over a year.
Publicis Sapient Overview:
As a Senior Associate L1 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and you will independently drive design discussions to ensure the overall health of the solution.
Job Summary:
As a Senior Associate L1 in Data Engineering, you will create technical designs and implement components for data engineering solutions, utilizing a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions. You will independently drive design discussions to ensure the overall health of the solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, and wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferable.
Role & Responsibilities:
Job Title: Senior Associate L1 – Data Engineering
Your role is focused on the design, development, and delivery of solutions involving:
• Data Ingestion, Integration and Transformation
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time
• Build functionality for data analytics, search and aggregation
Experience Guidelines:
Mandatory Experience and Competencies:
1. Overall 3.5+ years of IT experience, with 1.5+ years in data-related technologies
2. Minimum 1.5 years of experience in Big Data technologies
3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required to build end-to-end data pipelines; working knowledge of real-time data pipelines is an added advantage (a minimal PySpark sketch follows this list)
4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferable
5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
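As an illustrative sketch of the batch pipeline work described in item 3 above (not a prescribed solution; the paths and column names are hypothetical), a minimal PySpark job might look like this:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-ingest").getOrCreate()

# Read raw CSV from a landing zone (path is a placeholder).
raw = spark.read.option("header", True).csv("hdfs:///landing/orders/2024-01-01/")

# Basic wrangling: type the timestamp, derive a partition column, drop duplicates.
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
)

# Write to a curated zone as partitioned Parquet for downstream analytics.
clean.write.mode("overwrite").partitionBy("order_date").parquet("hdfs:///curated/orders/")

spark.stop()
```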
Preferred Experience and Knowledge (Good to Have):
1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience
2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.
3. Knowledge of distributed messaging frameworks like ActiveMQ, RabbitMQ, or Solace, search & indexing, and microservices architectures
4. Performance tuning and optimization of data pipelines
5. CI/CD – infrastructure provisioning on cloud, automated build & deployment pipelines, code quality
6. Working knowledge of data platform related services on at least one cloud platform, IAM, and data security
7. Cloud data specialty and other related Big Data technology certifications
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
We are seeking a highly skilled Java Developer to join our team. The ideal candidate should have at least 5 years of experience in Java development and be able to work from the office.
Responsibilities:
- Develop high-quality software design and architecture using Java technology
- Produce clean, efficient, and maintainable code
- Conduct code reviews and testing to identify and fix software defects
- Collaborate with cross-functional teams to identify and solve complex software problems
- Design, develop and maintain software applications using Java technology
- Debug and troubleshoot software defects
Requirements:
- Proven experience as a Java Developer with at least 5 years of experience
- Knowledge of web frameworks such as Spring, Hibernate, and Struts
- Familiarity with Agile methodologies and software development life cycle (SDLC)
- Experience with source code management tools such as Git or SVN
- Excellent problem-solving skills and ability to work independently
- Strong written and verbal communication skills
- Immediate joiners preferred
If you are a Java Developer with a strong work ethic and a passion for developing high-quality software solutions, we encourage you to apply.
Company: A technology products company leading digital transformation for enterprises using blockchain. It offers services in blockchain application development, training & consulting, and an innovative suite of specialized products for e-Governance, Financial Services, Insurance, Sustainable Supply Chain & Healthcare.
Established in 2017 in Pune, Snapper Future Tech has raised Pre-Seed and Seed rounds through Enemtech Capital and strategic investors globally. A Hyperledger Certified Service Provider (HCSP) & Training Partner (HTP), the company participates in open-source initiatives across the globe and has robust technological alliances & partnerships with Hyperledger, Oracle, Amazon Web Services, IBM, Trust over IP & Sovrin.
Responsibilities
· Writing scalable, robust, testable, efficient, and easily maintainable code
· Translating software requirements into stable, working, high performance software
· Playing a key role in architectural and design decisions, building toward an efficient microservices-based distributed architecture
· Strong knowledge of Go programming language, paradigms, constructs, and idioms
· Knowledge of common Go routine and channel patterns
· Experience with the full suite of Go frameworks and tools, including:
o Dependency management tools such as Godep, Sltr, etc.
o Go’s templating language
o Go’s code generation tools, such as Stringer
o Popular Go web frameworks, such as Revel
o Router packages, such as Gorilla Mux
· Ability to write clean and effective Godoc comments
Qualifications
· Bachelor’s degree in Computer Science and 3+ years of experience in web development
· Strong programming skills
· Good verbal and written communication skills
· Working experience in Microservices
· Should have knowledge of Docker & Kubernetes
· Should have working experience with messaging queues (RabbitMQ).
· Should have some knowledge of Cryptography
· Outstanding understanding of data structures and algorithms.
· Good understanding of relational, NoSQL & in-memory databases like Redis.
Location: Hyderabad
Joining Date: ASAP
We offer exciting opportunities to learn new technologies and a fast career growth path, and we ensure a successful career for our people.
We are looking for a passionate full-stack developer to be responsible for all platform-related duties, from developing designs for complicated applications to analyzing code. Key responsibilities include writing and testing code, debugging programs, integrating applications with third-party web services, optimizing applications, ensuring UI/UX feasibility, and implementing API designs and architecture.
You should know Python and be comfortable writing server-side logic. Ultimately, you'll build highly responsive web applications that align with our business needs. A minimal back-end sketch follows the responsibilities list below.
Selected day-to-day responsibilities include:
- Writing effective, scalable, and sustainable code
- Developing back-end components to improve responsiveness and overall performance
- Integrating user-facing elements into applications
- Testing and debugging programs
- Improving the functionality of existing systems
- Implementing security and data protection solutions
- Assessing and prioritizing feature requests
- Ensuring the feasibility of UI/UX designs
- Coordinating with internal teams to understand user requirements and provide technical solutions
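As a minimal sketch of the back-end work described above: the posting does not name a web framework, so FastAPI and the feature-request endpoint below are assumptions used purely for illustration.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class FeatureRequest(BaseModel):
    # Hypothetical fields, used only for illustration.
    title: str
    priority: int = 3


# In-memory store; a real application would persist to a database.
_requests: list[dict] = []


@app.post("/feature-requests")
def create_feature_request(req: FeatureRequest) -> dict:
    # Validated request body arrives as a typed model; assign a simple id.
    item = {"id": len(_requests) + 1, "title": req.title, "priority": req.priority}
    _requests.append(item)
    return item


@app.get("/feature-requests")
def list_feature_requests() -> list[dict]:
    return _requests
```

Served with an ASGI server such as uvicorn, this sketch covers the request-validation and API-design aspects of the role; security, data protection, and third-party integrations would be layered on top.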
An ideal candidate must possess excellent logical and analytical skills. You will work in a team as well as on diverse projects. The candidate must be able to deal smoothly and confidently with clients and personnel.
Key roles and Responsibilities:
⦁ Able to design and build efficient, testable and reliable code.
⦁ Should be a team player, sharing ideas with the team to continuously improve the development process.
⦁ Good knowledge of Spring Boot, Spring MVC, J2EE, and SQL queries.
⦁ Stay updated on new tools, libraries, and best practices.
⦁ Adaptable and self-motivated; must be willing to learn new things.
⦁ Good knowledge of HTML, CSS, and JavaScript.
Basic Requirements:
⦁ Bachelor's degree in Computer Science Engineering / IT or a related discipline with a good academic record.
⦁ Excellent communication and interpersonal skills.
⦁ Knowledge of the SDLC flow from requirement analysis to the deployment phase.
⦁ Should be able to design, develop and deploy applications.
⦁ Able to identify bugs and devise solutions to address and resolve the issues.


