EMR Jobs in Hyderabad

Apply to 4+ EMR jobs in Hyderabad on CutShort.io. Explore the latest EMR job opportunities across top companies like Google, Amazon & Adobe.

Global digital transformation solutions provider.

Agency job
via Peak Hire Solutions by Dhara Thakkar
Hyderabad
5 - 8 yrs
₹11L - ₹20L / yr
PySpark
Apache Kafka
Data architecture
Amazon Web Services (AWS)
EMR

JOB DETAILS:

* Job Title: Lead II - Software Engineering - AWS, Apache Spark (PySpark/Scala), Apache Kafka

* Industry: Global digital transformation solutions provider

* Salary: Best in Industry

* Experience: 5-8 years

* Location: Hyderabad

 

Job Summary

We are seeking a skilled Data Engineer to design, build, and optimize scalable data pipelines and cloud-based data platforms. The role involves working with large-scale batch and real-time data processing systems, collaborating with cross-functional teams, and ensuring data reliability, security, and performance across the data lifecycle.


Key Responsibilities

ETL Pipeline Development & Optimization

  • Design, develop, and maintain complex end-to-end ETL pipelines for large-scale data ingestion and processing.
  • Optimize data pipelines for performance, scalability, fault tolerance, and reliability.

Big Data Processing

  • Develop and optimize batch and real-time data processing solutions using Apache Spark (PySpark/Scala) and Apache Kafka.
  • Ensure fault-tolerant, scalable, and high-performance data processing systems.

Cloud Infrastructure Development

  • Build and manage scalable, cloud-native data infrastructure on AWS.
  • Design resilient and cost-efficient data pipelines adaptable to varying data volume and formats.

Real-Time & Batch Data Integration

  • Enable seamless ingestion and processing of real-time streaming and batch data sources (e.g., AWS MSK).
  • Ensure consistency, data quality, and a unified view across multiple data sources and formats.

Data Analysis & Insights

  • Partner with business teams and data scientists to understand data requirements.
  • Perform in-depth data analysis to identify trends, patterns, and anomalies.
  • Deliver high-quality datasets and present actionable insights to stakeholders.

CI/CD & Automation

  • Implement and maintain CI/CD pipelines using Jenkins or similar tools.
  • Automate testing, deployment, and monitoring to ensure smooth production releases.

Data Security & Compliance

  • Collaborate with security teams to ensure compliance with organizational and regulatory standards (e.g., GDPR, HIPAA).
  • Implement data governance practices ensuring data integrity, security, and traceability.

Troubleshooting & Performance Tuning

  • Identify and resolve performance bottlenecks in data pipelines.
  • Apply best practices for monitoring, tuning, and optimizing data ingestion and storage.

Collaboration & Cross-Functional Work

  • Work closely with engineers, data scientists, product managers, and business stakeholders.
  • Participate in agile ceremonies, sprint planning, and architectural discussions.


Skills & Qualifications

Mandatory (Must-Have) Skills

  1. AWS Expertise
  • Hands-on experience with AWS Big Data services such as EMR, Managed Apache Airflow, Glue, S3, DMS, MSK, and EC2.
  • Strong understanding of cloud-native data architectures.
  2. Big Data Technologies
  • Proficiency in PySpark or Scala Spark and SQL for large-scale data transformation and analysis.
  • Experience with Apache Spark and Apache Kafka in production environments.
  3. Data Frameworks
  • Strong knowledge of Spark DataFrames and Datasets.
  4. ETL Pipeline Development
  • Proven experience in building scalable and reliable ETL pipelines for both batch and real-time data processing.
  5. Database Modeling & Data Warehousing
  • Expertise in designing scalable data models for OLAP and OLTP systems.
  6. Data Analysis & Insights
  • Ability to perform complex data analysis and extract actionable business insights.
  • Strong analytical and problem-solving skills with a data-driven mindset.
  7. CI/CD & Automation
  • Basic to intermediate experience with CI/CD pipelines using Jenkins or similar tools.
  • Familiarity with automated testing and deployment workflows.

 

Good-to-Have (Preferred) Skills

  • Knowledge of Java for data processing applications.
  • Experience with NoSQL databases (e.g., DynamoDB, Cassandra, MongoDB).
  • Familiarity with data governance frameworks and compliance tooling.
  • Experience with monitoring and observability tools such as AWS CloudWatch, Splunk, or Dynatrace.
  • Exposure to cost optimization strategies for large-scale cloud data platforms.

 

Skills: big data, scala spark, apache spark, ETL pipeline development

 

******

Notice period - 0 to 15 days only

Job stability is mandatory

Location: Hyderabad

Note: If a candidate is a short joiner, based in Hyderabad, and fits within the approved budget, we will proceed with an offer.

F2F Interview: 14th Feb 2026

Hybrid model: 3 days in office.

 


Global product development and platform engineering company

Agency job
via Triunity Software Inc by Prashant Rathore
Hyderabad
10 - 10 yrs
₹12L - ₹15L / yr
Python
JavaScript
AWS Cloud
Django
TypeScript

Hi,

This is Prashant, a Senior Recruiter from Triunity Software Inc., a leading staffing organization.


Title: Staff Software Engineer with Cloud & Healthcare Experience (Need Local Candidates)


Job Location: Hyderabad, India (hybrid, with 3 days a week at our offshore location)

 

Job Summary:

Responsible for full-stack software definition, development, and maintenance for cloud-based software applications used in healthcare, specifically for all external interfaces with the cloud applications. Work closely with the Cloud Engineering team and project leaders to define requirements, develop code, and conduct unit- and system-level tests for the software you develop.

 

Job Responsibilities (but not limited to):

 

·    Work closely with Cloud Engineering team members in architecting and designing cloud-based solutions.

·    Develop Ultrasound DICOM and Electronic Medical Record (EMR) interfaces.

·    Implement OAuth and Single Sign-On (SSO) interfaces with cloud applications.

·    Build the workflow engine that automates and manages workflows in the applications.

·    Assist in the architecture, design, and deployment of the full stack product.

·    Build software that meets HIPAA and Cybersecurity requirements for medical products.

·    Be part of a small, cross-functional product team working alongside product managers, design engineers, clinical engineers and software engineers.

·    Meet all Quality Management System (QMS) requirements for design, development, testing and product release.

 

Minimum Education/Experience:

 

·    Bachelor’s degree in computer science or equivalent.

·    10+ years of software development experience with Python, Django and TypeScript/JavaScript.

·    Experience with Terraform, CI/CD and AWS.

·    Familiarity with medical imaging and interface protocols used in healthcare, including DICOM, HL7, FHIR, LDAP, and Single Sign-On (SSO), is preferable.


Thanks & Regards

Prashant Rathore | Sr. IT Recruiter | Triunity Software Inc.


German-based automotive company

Agency job
via Signio GlobalTech by Sneha Kurri
Hyderabad
3 - 10 yrs
₹8L - ₹15L / yr
Xtend
RCP
Java
Rhapsody
JavaScript
Strong in Java programming with Xtend, Xtext, RCP
Knowledge on Model to Code Generation
Ability to work independently, with minimal training and direct guidance
Ability to respond to customer inquiries quickly
Ability to quickly modify/setup routes
Familiarity with Rhapsody secure transmission protocols, e.g. Secure File Transfer (SFT) and Simple Object Access Protocol (SOAP) routes
Prior experience with protocols like OSLC, SOAP and REST APIs
Ability to identify and resolve exceptions in electronic data exchange between EMR data submitters and data recipients
Knowledge of HL7/XML/FHIR/EDI standards
Strong in building JUnit tests during development
Clarus RCM

Agency job
via Equavantage by Richa Shiromani
Chennai, Hyderabad, Coimbatore
0 - 3 yrs
₹3L - ₹4L / yr
HCC coding AAPC/AHIMA Certification
EMR
HCC Coding
HCC documentation and coding. AAPC/AHIMA certification. Knowledge of EMR for reviewing records. Strong analytical skills.