
Role Overview:
We are seeking a Senior Software Engineer (SSE) with strong expertise in Kafka, Python, and Azure Databricks to lead and contribute to our healthcare data engineering initiatives. This role is pivotal in building scalable, real-time data pipelines and processing large-scale healthcare datasets in a secure and compliant cloud environment.
The ideal candidate will have a solid background in real-time streaming, big data processing, and cloud platforms, along with strong leadership and stakeholder engagement capabilities.
Key Responsibilities:
- Design and develop scalable real-time data streaming solutions using Apache Kafka and Python.
- Architect and implement ETL/ELT pipelines using Azure Databricks for both structured and unstructured healthcare data.
- Optimize and maintain Kafka applications, Python scripts, and Databricks workflows to ensure performance and reliability.
- Ensure data integrity, security, and compliance with healthcare standards such as HIPAA and HITRUST.
- Collaborate with data scientists, analysts, and business stakeholders to gather requirements and translate them into robust data solutions.
- Mentor junior engineers, perform code reviews, and promote engineering best practices.
- Stay current with evolving technologies in cloud, big data, and healthcare data standards.
- Contribute to the development of CI/CD pipelines and containerized environments (Docker, Kubernetes).
Required Skills & Qualifications:
- 4+ years of hands-on experience in data engineering roles.
- Strong proficiency in Kafka (including Kafka Streams, Kafka Connect, Schema Registry).
- Proficient in Python for data processing and automation.
- Experience with Azure Databricks (or readiness to ramp up quickly).
- Solid understanding of cloud platforms, with a preference for Azure (AWS/GCP is a plus).
- Strong knowledge of SQL and NoSQL databases; data modeling for large-scale systems.
- Familiarity with containerization tools like Docker and orchestration using Kubernetes.
- Exposure to CI/CD pipelines for data applications.
- Prior experience with healthcare datasets (EHR, HL7, FHIR, claims data) is highly desirable.
- Excellent problem-solving abilities and a proactive mindset.
- Strong communication and interpersonal skills to work in cross-functional teams.

Similar jobs
- Good experience in the Extraction, Transformation, and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager) and PowerConnect as ETL tools on Oracle and SQL Server databases.
- Knowledge of Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, combined with project scoping, analysis, requirements gathering, data modeling, ETL design, development, system testing, implementation, and production support.
- Strong experience in dimensional modeling using Star and Snowflake schemas, and in identifying facts and dimensions.
- Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
- Developed mapping parameters and variables to support SQL overrides.
- Created mapplets for reuse across different mappings.
- Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
- Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
- Modified existing mappings to accommodate new business requirements.
- Involved in performance tuning at the source, target, mapping, session, and system levels.
- Prepared migration documents to move mappings from development to testing and then to production repositories.
- Extensive experience in developing stored procedures, functions, views, and triggers, as well as complex SQL queries, using PL/SQL.
- Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica/Talend sessions, and performance-tuning mappings and sessions.
- Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
- Extensive experience in writing UNIX shell scripts and automating ETL processes with UNIX shell scripting.
- Experience with automation and scheduling tools such as Control-M.
- Hands-on experience across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, systems integration, and user acceptance testing.
- Build, operate, monitor, and troubleshoot Hadoop infrastructure.
- Develop tools and libraries, and maintain processes for other engineers to access data and write MapReduce programs.
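The Type 1 and Type 2 SCD mappings mentioned above are a standard warehousing technique; as a rough illustration outside Informatica, a Type 2 merge can be sketched in plain Python (all field names hypothetical):

```python
from datetime import date

def scd2_merge(dim_rows, incoming, today=None):
    """Apply a Type 2 SCD merge to an in-memory dimension table.

    For each incoming record: a new business key is inserted as the
    current row; a changed key expires the existing current row in
    place (valid_to/is_current) and appends a new current version;
    unchanged keys are left alone, preserving full history.
    """
    today = today or date.today()
    out = list(dim_rows)  # shallow copy; existing rows are closed in place
    current = {r["key"]: r for r in out if r["is_current"]}
    for rec in incoming:
        cur = current.get(rec["key"])
        if cur is None:
            # New business key: insert as the current row.
            out.append({**rec, "valid_from": today,
                        "valid_to": None, "is_current": True})
        elif cur["attrs"] != rec["attrs"]:
            # Changed attributes: close the old version, open a new one.
            cur["valid_to"] = today
            cur["is_current"] = False
            out.append({**rec, "valid_from": today,
                        "valid_to": None, "is_current": True})
    return out
```

A Type 1 mapping, by contrast, would simply overwrite `cur["attrs"]` in place, keeping no history.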
Skill Set:
• Good communication skills.
• Excellent analytical and problem-solving skills.
• Strong programming skills in HTML, CSS, JavaScript, Ionic, Cordova, and Node.js.
• Good exposure to object-oriented programming and MVC coding standards.
• Excellent knowledge of MySQL/MS SQL databases, including database design and SQL queries.
• Knowledge of optimization techniques.
• Minimum of 5 years of experience in Magento development, with in-depth knowledge of the Magento framework, frontend architecture, themes, modules, functionality, and configuration.
• In-depth knowledge of Magento’s code structure, extension architecture, theming hierarchy, and fallback components.
• Strong knowledge of implementing API services such as REST and SOAP.
• Must have the ability to develop Magento modules, themes, and UI components/widgets, and to customize existing themes/modules.
• Good understanding of Magento’s theme, layout, and templating systems.
• Experience with Less and the Grunt workflow.
• Experience with Knockout, RequireJS, and Underscore.
• Experience customizing Magento jQuery widgets.
• Knowledge of HTML/CSS and frameworks such as Bootstrap.
• Experience working with Magento 2.0.
Roles and Responsibilities:
- Design and implementation of low-latency, high-availability, and performant APIs
- Writing reusable, testable, and efficient code to build features and improvements to the Plum product in a secure, well-tested, and performant way
- Collaborate with Product Management and other stakeholders within Engineering (Frontend, UX, etc.) to maintain a high bar for quality in a fast-paced, iterative environment
- Advocate for improvements to product quality, security, and performance
Qualifications:
- 5+ years of experience shipping backend applications in Node.js, with knowledge of Express.
- Experience with Next.js and TypeScript is a plus. Able to integrate multiple data sources and databases into one system.
- Understanding of fundamental design principles behind a scalable API infrastructure.
- Shows the drive to own a project from concept to production, including proposal, discussion, and execution. Self-motivated and self-managing, with strong organizational skills.
- Experience maintaining a production-grade open-source project is a plus.
• Bachelor’s or Master’s degree in Computer Science or Software Engineering from a reputed university.
• 5 to 8 years of experience working in the eCommerce domain, including at least 3-5 years working with Django and Python.
• React and/or Angular for the front end.
• Proficiency in MongoDB and MySQL.
• Technical skills: JIRA, GitLab, REST APIs, GCP or AWS.
