Dynatrace Jobs in Hyderabad


Global digital transformation solutions provider.

Agency job
via Peak Hire Solutions by Dhara Thakkar
Hyderabad
5 - 8 yrs
₹11L - ₹20L / yr
PySpark
Apache Kafka
Data architecture
Amazon Web Services (AWS)
EMR

JOB DETAILS:

* Job Title: Lead II - Software Engineering - AWS, Apache Spark (PySpark/Scala), Apache Kafka

* Industry: Global digital transformation solutions provider

* Salary: Best in Industry

* Experience: 5-8 years

* Location: Hyderabad

 

Job Summary

We are seeking a skilled Data Engineer to design, build, and optimize scalable data pipelines and cloud-based data platforms. The role involves working with large-scale batch and real-time data processing systems, collaborating with cross-functional teams, and ensuring data reliability, security, and performance across the data lifecycle.


Key Responsibilities

ETL Pipeline Development & Optimization

  • Design, develop, and maintain complex end-to-end ETL pipelines for large-scale data ingestion and processing.
  • Optimize data pipelines for performance, scalability, fault tolerance, and reliability.

Big Data Processing

  • Develop and optimize batch and real-time data processing solutions using Apache Spark (PySpark/Scala) and Apache Kafka.
  • Ensure fault-tolerant, scalable, and high-performance data processing systems.
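The fault-tolerance requirement above typically hinges on replay-safe processing: Kafka delivers at-least-once, so a consumer must tolerate seeing the same record twice after a failure and restart. A minimal stdlib sketch of that deduplication idea (the `Event` class and `process_batch` function are illustrative names, not from this posting):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    """A Kafka-style record: (partition, offset) uniquely identifies it."""
    partition: int
    offset: int
    payload: str

def process_batch(events, seen, sink):
    """Apply events idempotently: skip any (partition, offset) already seen.

    Replaying a batch after a failure then produces no duplicate output,
    which is the effectively-once behaviour a Spark/Kafka pipeline needs.
    """
    for ev in events:
        key = (ev.partition, ev.offset)
        if key in seen:
            continue  # duplicate from a replay; drop it
        seen.add(key)
        sink.append(ev.payload)

batch = [Event(0, 1, "a"), Event(0, 2, "b")]
seen, sink = set(), []
process_batch(batch, seen, sink)
process_batch(batch, seen, sink)  # simulated replay after a failure: no-op
```

In PySpark Structured Streaming the same concern is usually handled with checkpointing and watermark-based `dropDuplicates` rather than hand-rolled state, but the invariant being enforced is the one shown here.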

Cloud Infrastructure Development

  • Build and manage scalable, cloud-native data infrastructure on AWS.
  • Design resilient and cost-efficient data pipelines adaptable to varying data volume and formats.

Real-Time & Batch Data Integration

  • Enable seamless ingestion and processing of real-time streaming sources (e.g., AWS MSK) and batch data sources.
  • Ensure consistency, data quality, and a unified view across multiple data sources and formats.

Data Analysis & Insights

  • Partner with business teams and data scientists to understand data requirements.
  • Perform in-depth data analysis to identify trends, patterns, and anomalies.
  • Deliver high-quality datasets and present actionable insights to stakeholders.

CI/CD & Automation

  • Implement and maintain CI/CD pipelines using Jenkins or similar tools.
  • Automate testing, deployment, and monitoring to ensure smooth production releases.

Data Security & Compliance

  • Collaborate with security teams to ensure compliance with organizational and regulatory standards (e.g., GDPR, HIPAA).
  • Implement data governance practices ensuring data integrity, security, and traceability.

Troubleshooting & Performance Tuning

  • Identify and resolve performance bottlenecks in data pipelines.
  • Apply best practices for monitoring, tuning, and optimizing data ingestion and storage.

Collaboration & Cross-Functional Work

  • Work closely with engineers, data scientists, product managers, and business stakeholders.
  • Participate in agile ceremonies, sprint planning, and architectural discussions.


Skills & Qualifications

Mandatory (Must-Have) Skills

  1. AWS Expertise
  • Hands-on experience with AWS Big Data services such as EMR, Amazon Managed Workflows for Apache Airflow (MWAA), Glue, S3, DMS, MSK, and EC2.
  • Strong understanding of cloud-native data architectures.
  2. Big Data Technologies
  • Proficiency in PySpark or Scala Spark and SQL for large-scale data transformation and analysis.
  • Experience with Apache Spark and Apache Kafka in production environments.
  3. Data Frameworks
  • Strong knowledge of Spark DataFrames and Datasets.
  4. ETL Pipeline Development
  • Proven experience in building scalable and reliable ETL pipelines for both batch and real-time data processing.
  5. Database Modeling & Data Warehousing
  • Expertise in designing scalable data models for OLAP and OLTP systems.
  6. Data Analysis & Insights
  • Ability to perform complex data analysis and extract actionable business insights.
  • Strong analytical and problem-solving skills with a data-driven mindset.
  7. CI/CD & Automation
  • Basic to intermediate experience with CI/CD pipelines using Jenkins or similar tools.
  • Familiarity with automated testing and deployment workflows.
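The OLAP modelling skill above usually means star schemas: a narrow fact table joined to descriptive dimension tables at query time, then aggregated. A minimal stdlib sketch of that rollup (the table contents are invented sample data, not from this posting):

```python
# Dimension table: product_id -> category (invented sample data)
dim_product = {1: "books", 2: "toys", 3: "books"}

# Fact table: one row per sale, keyed by product_id (invented sample data)
fact_sales = [(1, 10.0), (2, 5.0), (3, 7.5), (1, 2.5)]

def rollup_by_category(facts, dim):
    """Star-schema rollup: join facts to the dimension, then aggregate."""
    totals = {}
    for product_id, amount in facts:
        category = dim[product_id]  # the dimension join
        totals[category] = totals.get(category, 0.0) + amount
    return totals

totals = rollup_by_category(fact_sales, dim_product)
```

In Spark the same shape appears as `fact_df.join(dim_df, "product_id").groupBy("category").sum("amount")`; the point of the model is that facts stay narrow while dimensions carry the descriptive attributes.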

 

Good-to-Have (Preferred) Skills

  • Knowledge of Java for data processing applications.
  • Experience with NoSQL databases (e.g., DynamoDB, Cassandra, MongoDB).
  • Familiarity with data governance frameworks and compliance tooling.
  • Experience with monitoring and observability tools such as AWS CloudWatch, Splunk, or Dynatrace.
  • Exposure to cost optimization strategies for large-scale cloud data platforms.

 

Skills: Big Data, Scala Spark, Apache Spark, ETL pipeline development

 

******

Notice period - 0 to 15 days only

Job stability is mandatory

Location: Hyderabad

Note: If a candidate can join on short notice, is based in Hyderabad, and fits within the approved budget, we will proceed with an offer.

F2F Interview: 14th Feb 2026

Hybrid model: 3 days in office.

 


CGI Inc
Posted by Shruthi BT
Hyderabad
6 - 8 yrs
₹4L - ₹12L / yr
Bridge calls
Dynatrace
Splunk

Application Monitoring


Position Description

JD2 – SRE: Digital Channels Application Support

Experience: 6–9 years in Production Application Support / Site Reliability Engineering (Banking/Financial Services)

Work Model: 24x7 rotational shifts, ODC environment (no personal digital devices)

Responsibilities

  • Act as SRE for Digital Banking, Middleware, and Infra components.
  • Provide production support for Internet Banking, Mobile Banking, Cards, Payments, Digital Wallets, and APIs.
  • Troubleshoot customer authentication flows (SSO, OAuth, 2FA, session management).
  • Lead and drive P1/P2 bridge calls, coordinating with app, infra, DB, and network teams.
  • Support API-driven integrations with third-party fintech/payment partners.
  • Perform proactive monitoring, certificate renewals, and infra health checks.
  • Enhance observability (Dynatrace, LogScale, Splunk, Grafana, BigPanda, ThousandEyes).
  • Propose and implement automation solutions (Jenkins, Ansible AWX, Python, Shell) to reduce manual work.
  • Support release and deployment readiness for digital channels in the ODC environment.
  • Conduct RCA for major incidents and recommend resilience and scalability improvements.
  • Monitor SLIs/SLOs for digital user experience (latency, availability, throughput).
  • Ensure security and compliance for API and digital banking platforms (SSL/TLS, PCI, fraud monitoring).
  • Partner with client stakeholders to maintain operational stability and high availability.
  • Collaborate with business stakeholders on banking-critical issues and ensure compliance with SLAs and regulatory standards.
  • Participate in DR drills, HA testing, and compliance audits.
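The SLI/SLO monitoring item in the list above reduces to simple ratios: an availability SLI is good events over total events, compared against a target, with the gap to the target expressed as error budget. A minimal sketch (the 99.9% target and request counts are invented for illustration):

```python
def availability_sli(good_requests, total_requests):
    """Availability SLI: fraction of requests that met the success/latency bar."""
    return good_requests / total_requests

def error_budget_remaining(sli, slo_target):
    """Share of the error budget still unspent (1.0 = untouched, < 0 = blown)."""
    budget = 1.0 - slo_target  # e.g. 0.001 for a 99.9% SLO
    burned = 1.0 - sli         # actual failure rate so far
    return (budget - burned) / budget

sli = availability_sli(999_500, 1_000_000)     # 99.95% of requests were good
remaining = error_budget_remaining(sli, 0.999)  # half the budget is left
```

In practice a tool such as Dynatrace or Grafana computes these ratios over sliding windows and alerts on burn rate; the arithmetic being alerted on is the one shown here.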

Core Technologies (L2 / Deep Knowledge)

OpenShift (Kubernetes), Linux/Unix/Windows, Database, Networking, Middleware.

Common Tools (L1.5 – Awareness for All)

Bigpanda, Grafana, ThousandEyes, Splunk, ServiceNow, Jira, Confluence, Netcool, Siren, Glassbox.

Must Have Skills

  • Strong experience in Digital Banking & Middleware application support.
  • Hands-on experience with Internet/Mobile banking platforms, card services, and payment gateways.
  • Infra knowledge (servers, certificates, OS, networking).
  • Strong observability and monitoring skills (Dynatrace, LogScale, Splunk).
  • Incident management and RCA leadership, especially in client-facing bridge calls.
  • Strong knowledge of ITIL processes (Incident, Problem, Change, Major Incident Management).
  • Awareness of banking compliance standards (PCI-DSS, SOX, InfoSec policies).
  • Ability to work under strict SLAs and regulatory environments.

Nice to Have

  • Kafka, VFaas, F5/DataPower, Akamai Control Center.
  • Salesforce (Apex), Pega, SharePoint.
  • Java, Node.js, JavaScript, Python, C#.
  • ETL, Hadoop, Teradata, Big Data.
  • Evolven, Ansible AWX, CI/CD and automation (Jenkins, Ansible).

