
ABOUT EPISOURCE:
Episource has spent more than a decade building risk adjustment solutions to measure healthcare outcomes. As one of the leading companies in healthcare, we have helped numerous clients optimize their medical records, data, and analytics to enable better documentation of care for patients with chronic diseases.
The backbone of our consistent success has been our obsession with data and technology. At Episource, all of our strategic initiatives start with the question: how can data be “deployed”? Our analytics platforms and data lakes ingest huge quantities of data daily to help our clients deliver services. We have also built our own machine learning and NLP platform to add productivity and efficiency to our workflow. Combined, these form a foundation of tools and practices used by quantitative staff across the company.
What’s our poison, you ask? We work with most of the popular frameworks and technologies, like Spark, Airflow, Ansible, Terraform, Docker, and ELK. For machine learning and NLP, we are big fans of keras, spacy, scikit-learn, pandas, and numpy. AWS and serverless platforms help us stitch these together to stay ahead of the curve.
ABOUT THE ROLE:
We’re looking to hire someone to help scale machine learning and NLP efforts at Episource. You’ll work with the team that develops the models powering Episource’s product for NLP-driven medical coding. Some of the problems include improving our ICD code recommendations, clinical named entity recognition, clinical suspecting, and information extraction from clinical notes to improve patient health.
This is a role for highly technical data engineers who combine outstanding oral and written communication skills with the ability to code up prototypes and productionize them using a large range of tools, algorithms, and languages. Most importantly, they need the ability to autonomously plan and organize their work based on high-level team goals.
You will be responsible for setting an agenda to develop and ship data-driven architectures that positively impact the business, working with partners across the company, including operations and engineering. You will use research results to shape company strategy and help build the foundation of tools and practices used by quantitative staff across the company.
During the course of a typical day with our team, expect to work on one or more projects in the following areas:
1. Create and maintain optimal data pipeline architectures for ML
2. Develop a strong API ecosystem for ML pipelines
3. Build CI/CD pipelines for ML deployments using GitHub Actions, Travis, Terraform, and Ansible
4. Design and develop distributed, high-volume, high-velocity, multi-threaded event processing systems
5. Apply software engineering best practices across the development lifecycle: coding standards, code reviews, source management, build processes, testing, and operations
6. Deploy data pipelines in production using Infrastructure-as-Code platforms
7. Design scalable implementations of the models developed by our Data Science teams
8. Work on big data and distributed ML with PySpark on AWS EMR, and more!
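As a small illustration of the pipeline work above, here is a minimal validate-and-filter stage sketched in stdlib Python. The record fields and the 0.8 confidence threshold are hypothetical, not Episource's actual schema:

```python
# Minimal sketch of one data-pipeline stage: validate records, then filter.
# Field names and the 0.8 threshold are hypothetical, for illustration only.
from dataclasses import dataclass
from typing import Iterable, Iterator, List

@dataclass
class Record:
    patient_id: str
    icd_code: str
    confidence: float  # model confidence for the recommended ICD code

def validate(records: Iterable[Record]) -> Iterator[Record]:
    """Drop malformed records before they reach downstream jobs."""
    for r in records:
        if r.patient_id and r.icd_code and 0.0 <= r.confidence <= 1.0:
            yield r

def keep_high_confidence(records: Iterable[Record], threshold: float = 0.8) -> List[Record]:
    """Keep only recommendations at or above the confidence threshold."""
    return [r for r in records if r.confidence >= threshold]

raw = [
    Record("p1", "E11.9", 0.93),
    Record("p2", "", 0.99),     # malformed: missing ICD code
    Record("p3", "I10", 0.42),  # valid but below threshold
]
kept = keep_high_confidence(validate(raw))
print([r.patient_id for r in kept])  # ['p1']
```

In a production pipeline each stage would typically be an Airflow task or Spark job rather than an in-process generator, but the validate-then-transform shape is the same.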
BASIC REQUIREMENTS
- Bachelor’s degree or greater in Computer Science, IT, or related fields
- Minimum of 5 years of experience in cloud, DevOps, MLOps, and data projects
- Strong experience with bash scripting, Unix environments, and building scalable/distributed systems
- Experience with automation/configuration management using Ansible, Terraform, or equivalent
- Very strong experience with AWS and Python
- Experience building CI/CD systems
- Experience with containerization technologies like Docker, Kubernetes, ECS, EKS, or equivalent
- Ability to build and manage application and performance monitoring processes
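The last requirement, performance monitoring, can be sketched as a simple percentile check over a window of request latencies. The 250 ms SLO threshold and sample values below are invented for illustration:

```python
# Illustrative performance-monitoring check: flag when the p95 latency of a
# window of requests exceeds a threshold. The 250 ms SLO is a made-up value.
from statistics import quantiles

def p95(samples):
    """95th percentile of latency samples (ms): last cut point of 20-quantiles."""
    return quantiles(samples, n=20)[-1]

def breaches_slo(samples, threshold_ms=250.0):
    """True if the window's p95 latency exceeds the SLO threshold."""
    return p95(samples) > threshold_ms

window = [120, 130, 110, 140, 900]  # one slow outlier drags up the p95
print(breaches_slo(window))
```

A real monitoring process would feed this from metrics scraped by a tool like Prometheus and page on sustained breaches rather than a single bad window.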

EMPLOYMENT TYPE: Full-Time, Permanent
LOCATION: Remote
SHIFT TIMINGS: 11:00 AM - 8:00 PM IST
Role: Lead Data Analyst
Qualifications:
● Bachelor’s or Master’s degree in Computer Science, Data Analytics, Information Systems, or a related field.
● 7–10 years of experience in data operations, data management, or analytics.
● Strong understanding of data governance, ETL processes, and quality control methodologies.
● Hands-on experience with SQL, Excel/Google Sheets, and data visualization tools.
● Experience with automation tools such as Python scripting is a plus.
● Must be capable of working independently and delivering stable, efficient and reliable software.
● Excellent written and verbal communication skills in English.
● Experience supporting and working with cross-functional teams in a dynamic environment.
Preferred Skills:
● Experience in SaaS, B2B data, or lead intelligence industry.
● Exposure to data privacy regulations (GDPR, CCPA) and compliance practices.
● Ability to work effectively in cross-functional, global, and remote environments.
Strong UX / Product Designer profile
Mandatory (Experience 1) - Must have 3+ YOE in UX / Product Design, with a focus towards B2C Products
Mandatory (Experience 2) - Must have recent 2+ YOE in a Good Product Company
Mandatory (Experience 3) - Must have worked on both UX and UI / Visual Designs (some UI / Visual Design experience is required)
Mandatory (Experience 4) - Strong collaboration experience with Product Managers, Researchers, and other Product stakeholders
Mandatory (Portfolio) - Strong portfolio of UX Design and UI/Visual Designs (both are Mandatory) for B2C Products
Mandatory (Company) - Product Company only
Job Description: DevOps Engineer
About Hyno:
Hyno Technologies is a unique blend of top-notch designers and world-class developers for new-age product development. Within the last 2 years we have collaborated with 32 young startups from India, the US, and the EU to find the optimum solutions to their complex business problems. We have helped them address issues of scalability and optimisation through the use of technology at minimal cost. To us, any new challenge is an opportunity.
As part of its expansion plans, Hyno, in partnership with Sparity, is seeking an experienced DevOps Engineer to join our dynamic team. As a DevOps Engineer, you will play a crucial role in enhancing our software development processes, optimising system infrastructure, and ensuring the seamless deployment of applications. If you are passionate about leveraging cutting-edge technologies to drive efficiency, reliability, and scalability in software development, this is the perfect opportunity for you.
Position: DevOps Engineer
Experience: 5-7 years
Responsibilities:
- Collaborate with cross-functional teams to design, develop, and implement CI/CD pipelines for automated application deployment, testing, and monitoring.
- Manage and maintain cloud infrastructure using tools like AWS, Azure, or GCP, ensuring scalability, security, and high availability.
- Develop and implement infrastructure as code (IaC) using tools like Terraform or CloudFormation to automate the provisioning and management of resources.
- Constantly evaluate continuous integration and continuous deployment solutions as the industry evolves, and develop standardised best practices.
- Work closely with development teams to provide support and guidance in building applications with a focus on scalability, reliability, and security.
- Perform regular security assessments and implement best practices for securing the entire development and deployment pipeline.
- Troubleshoot and resolve issues related to infrastructure, deployment, and application performance in a timely manner.
- Follow regulatory and ISO 13485 requirements.
- Stay updated with industry trends and emerging technologies in the DevOps and cloud space, and proactively suggest improvements to current processes.
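The CI/CD responsibilities above reduce to running ordered stages that stop on the first failure. A minimal, illustrative Python sketch follows; the stage names and the lambda bodies are hypothetical placeholders standing in for real commands (a test runner, docker build, terraform apply, and so on):

```python
# Illustrative CI/CD skeleton: run stages in order, stop on the first failure.
# Stage names and the lambda bodies are placeholders for real commands.
from typing import Callable, List, Tuple

Stage = Tuple[str, Callable[[], bool]]

def run_pipeline(stages: List[Stage]) -> List[str]:
    """Execute stages in order; return the names of the stages that ran."""
    ran = []
    for name, step in stages:
        ran.append(name)
        if not step():   # short-circuit: later stages never run after a failure
            break
    return ran

stages = [
    ("lint", lambda: True),
    ("test", lambda: False),  # simulate a failing test stage
    ("deploy", lambda: True),
]
print(run_pipeline(stages))  # ['lint', 'test'] -- deploy is skipped
```

Tools like Jenkins, GitLab CI/CD, or GitHub Actions implement exactly this ordered, fail-fast semantics declaratively in their pipeline configuration files.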
Requirements:
- Bachelor's degree in Computer Science, Engineering, or related field (or equivalent work experience).
- Minimum of 5 years of hands-on experience in DevOps, system administration, or related roles.
- Solid understanding of containerization technologies (Docker, Kubernetes) and orchestration tools.
- Strong experience with cloud platforms such as AWS, Azure, or GCP, including services like ECS, S3, RDS, and more.
- Proficiency in at least one programming/scripting language such as Python, Bash, or PowerShell.
- Demonstrated experience in building and maintaining CI/CD pipelines using tools like Jenkins, GitLab CI/CD, or CircleCI.
- Familiarity with configuration management tools like Ansible, Puppet, or Chef.
- Experience with container (Docker, ECS, EKS), serverless (Lambda), and Virtual Machine (VMware, KVM) architectures.
- Experience with infrastructure as code (IaC) tools like Terraform, CloudFormation, or Pulumi.
- Strong knowledge of monitoring and logging tools such as Prometheus, ELK stack, or Splunk.
- Excellent problem-solving skills and the ability to work effectively in a fast-paced, collaborative environment.
- Strong communication skills and the ability to work independently as well as in a team.
Nice to Have:
- Relevant certifications such as AWS Certified DevOps Engineer, Azure DevOps Engineer, Certified Kubernetes Administrator (CKA), etc.
- Experience with microservices architecture and serverless computing.
Soft Skills:
- Excellent written and verbal communication skills.
- Ability to manage conflict effectively.
- Ability to adapt and be productive in a dynamic environment.
- Strong communication and collaboration skills supporting multiple stakeholders and business operations.
- Self-starter, self-managed, and a team player.
Join us in shaping the future of DevOps at Hyno in collaboration with Sparity. If you are a highly motivated and skilled DevOps Engineer, eager to make an impact in a remote setting, we'd love to hear from you.

Bachelor's degree in information security, computer science, or a related field.
Strong DevOps experience of at least 4 years
Strong experience in Unix/Linux/Python scripting
Strong networking knowledge; vSphere networking stack knowledge desired
Experience with Docker and Kubernetes
Experience with cloud technologies (AWS/Azure)
Exposure to continuous deployment tools such as Jenkins or Spinnaker
Exposure to configuration management systems such as Ansible
Knowledge of resource monitoring systems
Ability to scope and estimate work
Strong verbal and written communication skills
Advanced knowledge of Docker and Kubernetes
Exposure to Blockchain as a Service (BaaS) platforms such as Chainstack, IBM Blockchain Platform, Oracle Blockchain Cloud, Rubix, and VMware
Capable of provisioning and maintaining local enterprise blockchain platforms for development and QA (Hyperledger Fabric, BaaS, Corda, Ethereum)
JOB DESCRIPTION
A London-based foreign bank is shifting its credit MIS to the back office and has an immediate opening for a contract role in the MIS function.
Job Responsibilities:
You will be required to prepare MIS reports based on existing standard operating procedures (SOPs).
| Process | Description |
| ------- | ----------- |
| ODA file | Update limits and limit expiry dates; calculate availment of limits for multi-currency OD limits, limits against FD, etc. |
| Limit sanction statement | Preparation of a list covering all credit facilities sanctioned during the reporting month |
| Consolidated data | Preparation of industry-wise and country-wise data for the bank's entire credit portfolio |
| Credit dashboard | Pictorial representation of the credit portfolio based on various parameters |
| India-based exposure | Reporting of India-based exposure to the parent bank |
| Real estate exposure | Add new accounts opened |
| Real estate exposure | Update exposure with the latest position of undisbursed loan amounts as on the reporting date |
| Real estate exposure | Update value of security, cost of project, Gross Development Value, etc. |
| Real estate exposure | Add loans under process and loans sanctioned but not disbursed; remove loans which have been disbursed |
| Key risk indicators | Update the KRIs as required by the Risk Department |
| Largest and second-largest exposure | Select the largest and second-largest inflow |
| Restructured accounts | Reporting of accounts which have been restructured |
| Loan file | Instalment-wise breakup of the term loans in the credit portfolio and identification of additions of irregular accounts |
| Control of capital adequacy | Analysis of inflow and outflow based on proposals under process and expected repayments |
| Valuation, visit, insurance | Reporting of pending valuations, visits, and insurance |
| IFRS 9 | Collating information from various sources to populate the respective information sheet (3c File data) |
| BTL regulatory reporting | Reporting for residential BTL loans |
The candidate should have good knowledge of the Microsoft Office suite (Excel, Word), including HLOOKUP/VLOOKUP, data flow charts and graphs, and pivot tables.
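For illustration, the lookup-and-pivot work described above maps to a keyed join plus a group-by. A stdlib Python sketch, with all accounts and amounts invented for the example:

```python
# Illustrative only: a VLOOKUP is a keyed join; a pivot table is a group-by
# plus an aggregate. All accounts and amounts below are invented.
from collections import defaultdict

# Lookup sheet: account -> industry (what VLOOKUP would fetch)
industry_by_account = {"A1": "Real estate", "A2": "Manufacturing", "A3": "Real estate"}

# Exposure sheet: (account, exposure amount)
exposures = [("A1", 500.0), ("A2", 300.0), ("A3", 200.0)]

# Pivot: total exposure by industry
totals = defaultdict(float)
for account, amount in exposures:
    totals[industry_by_account[account]] += amount

print(dict(totals))  # {'Real estate': 700.0, 'Manufacturing': 300.0}
```

This is the same industry-wise consolidation the table above describes, just expressed in code rather than spreadsheet formulas.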
Candidates having knowledge of the following will be preferred:
- Fair understanding of credit functions
- IFRS 9 application: ECL, stages 1/2/3
- Banking products related to loans and advances: TL, OD, etc.
- NPA: days past due, standard, sub-standard
- LTV, LTC, GDV
- Property type: residential, commercial, W/H, industrial
- ROI: Libor, fixed, floating
QUALIFICATIONS: B.E./ B.Tech. (Electronics/ Electrical/ Mechatronics/ Mechanical)
EMPLOYMENT: Permanent / Full Time
LOCATION: Ahmedabad
EXPERIENCE: 1 to 3 years in the automotive industry / software development
ROLES AND RESPONSIBILITIES
- Develop and debug Simulink/Stateflow models according to functional and technical requirements and MAAB guidelines.
- Derive software requirements from system-level requirements and establish bi-directional traceability within the models and generated code.
- Generate code from the developed models, generate test vectors, automate tests, and document code reviews and test results.
- Automate the data exchange flow between MATLAB models, Excel sheets, and code testing using m-scripts.
SKILLS AND EXPERIENCE
- Experience in model-based development and testing using MathWorks tools such as MATLAB/Simulink and Stateflow, and code generation tools such as Embedded Coder or TargetLink.
- Good understanding of the Automotive software development cycle and tool chains
- Should possess a good understanding of C coding and m-scripting, with good debugging and software testing skills.
- Experience in developing MATLAB/ Simulink models from scratch for automotive applications.
- Should have experience in requirements management, and be able to write test cases from them.
- Experience in MBD testing such as MIL/SIL/PIL.
- Good communication skills in English and Hindi.
- Experience with MATLAB verification and validation toolboxes such as Simulink Design Verifier, Simulink Test, and Simulink Coverage is advantageous.
- Basic understanding of Control System Design and Electric Vehicles is a plus.
Digital Marketing Executive duties and responsibilities
- Build, plan and implement the overall digital marketing strategy
- Manage the strategy
- Manage and train the rest of the team
- Stay up to date with the latest technology and best practices
- Manage all digital marketing channels
- Measure ROI and KPIs
- Prepare and manage a digital marketing budget
- Oversee all the company's social media accounts
- Manage and improve online content, considering SEO and Google Analytics
- Build an inbound marketing plan
- Forecast sales performance trends
- Motivate the digital marketing team to achieve goals
- Monitor competition and provide suggestions for improvement
Digital Marketing Executive requirements
- 2 years of experience as a Digital Marketing Executive
- 2 years of experience in developing and implementing digital marketing strategies
- Good knowledge of all different digital marketing channels
- Good knowledge and experience with online marketing tools and best practices
- 2 years of hands-on experience with SEO/SEM, Google Analytics, and CRM software
- Familiarity with web design
- Sense of ownership and pride in your performance and its impact on a company’s success
- Critical thinker and problem-solving skills
- Team player
- Good time-management skills
- Great interpersonal and communication skills
- Must have digital marketing experience in the solar industry
Requirements
∙ Proficient in the Node.js and React development stack
∙ Basic understanding of web technologies including HTML, CSS, JavaScript, AJAX, etc.
∙ Passion for best design and coding practices and a desire to develop bold new ideas
∙ Good to have: knowledge of AWS, Redis, Elasticsearch
Education: Minimum of a graduate degree in a related discipline
Requirements:
Bachelor’s or Master’s Degree in Computer Science, Engineering or related field of study.
3+ years of experience developing web applications using JavaScript, HTML, and CSS.
2+ years of experience using ReactJS
Skill Set:
- Expert in React
- Proficient understanding of web markup, including HTML5 and CSS3, client-side scripting, and JavaScript frameworks, including jQuery.
- Good understanding of server-side CSS pre-processing platforms, such as LESS and SASS
- Good understanding of asynchronous request handling, partial page updates, and AJAX
- Basic knowledge of image authoring tools, to be able to crop, resize, or perform small adjustments on an image.
- Familiarity with tools such as Gimp or Photoshop is a plus.
- Good understanding of cross-browser compatibility issues and ways to work around them








