Molecular Connections
http://www.molecularconnections.com
Jobs at Molecular Connections
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
Responsibilities:
· Analyze complex data sets to answer specific questions using MMIT’s market access data, Norstella claims data, and third-party claims data (IQVIA LAAD, Symphony SHA). Applicants must have prior hands-on experience with these specific data sets.
· Deliver consultative services to clients related to MMIT RWD sets
· Produce complex analytical reports using data visualization tools such as Power BI or Tableau
· Define customized technical specifications to surface MMIT RWD in MMIT tools.
· Execute work on time and with high accuracy while managing competing priorities; perform thorough troubleshooting and QA; communicate with internal teams to obtain required data
· Ensure adherence to documentation requirements, process workflows, timelines, and escalation protocols
· And other duties as assigned.
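As a rough illustration of the SQL-driven analysis described above, the sketch below runs a payer-level aggregation over a tiny in-memory table. All table, column, and payer names here are hypothetical stand-ins; real MMIT and claims data sets (IQVIA LAAD, Symphony SHA) have far richer schemas.

```python
import sqlite3

# Hypothetical mini claims table; payer and status values are invented examples.
rows = [
    ("PayerA", "approved"), ("PayerA", "rejected"), ("PayerA", "approved"),
    ("PayerB", "approved"), ("PayerB", "approved"),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (payer TEXT, status TEXT)")
conn.executemany("INSERT INTO claims VALUES (?, ?)", rows)

# A typical market-access style question: claim volume and approval rate by payer.
results = list(conn.execute("""
    SELECT payer,
           COUNT(*)                           AS total_claims,
           ROUND(AVG(status = 'approved'), 2) AS approval_rate
    FROM claims
    GROUP BY payer
    ORDER BY payer
"""))
for payer, total, rate in results:
    print(payer, total, rate)
```

In production such a query would run against the warehouse hosting the claims feed rather than an in-memory SQLite database, with the results surfaced through Power BI or Tableau.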
Requirements:
· Bachelor’s Degree or relevant experience required
· 2-5 years of professional experience in RWD analytics using SQL
· Fundamental understanding of the pharma and market access space
· Strong analysis skills and proficiency with tools such as Tableau or Power BI
· Excellent written and verbal communication skills.
· Analytical, critical thinking and creative problem-solving skills.
· Relationship building skills.
· Solid organizational skills including attention to detail and multitasking skills.
· Excellent time management and prioritization skills.
Job Description: Data Engineer
Experience: Over 4 years
Responsibilities:
- Design, develop, and maintain scalable data pipelines for efficient data extraction, transformation, and loading (ETL) processes.
- Architect and implement data storage solutions, including data warehouses, data lakes, and data marts, aligned with business needs.
- Implement robust data quality checks and data cleansing techniques to ensure data accuracy and consistency.
- Optimize data pipelines for performance, scalability, and cost-effectiveness.
- Collaborate with data analysts and data scientists to understand data requirements and translate them into technical solutions.
- Develop and maintain data security measures to ensure data privacy and regulatory compliance.
- Automate data processing tasks using scripting languages (Python, Bash) and big data frameworks (Spark, Hadoop).
- Monitor data pipelines and infrastructure for performance and troubleshoot any issues.
- Stay up to date with the latest trends and technologies in data engineering, including cloud platforms (AWS, Azure, GCP).
- Document data pipelines, processes, and data models for maintainability and knowledge sharing.
- Contribute to the overall data governance strategy and best practices.
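The extract-transform-load and data quality responsibilities above can be sketched as a minimal pipeline. This is an illustrative pattern only; every function, field, and rule here is an invented example, not a specific production design.

```python
# Minimal ETL sketch: extract from an in-memory "source", validate, transform, load.

def extract():
    # Pretend these rows came from an upstream API or file drop.
    return [
        {"id": 1, "amount": "120.50", "region": "EU"},
        {"id": 2, "amount": "95.00", "region": "US"},
        {"id": 3, "amount": "", "region": "US"},   # dirty row: missing amount
    ]

def validate(rows):
    # Quality gate: drop rows with missing amounts and report how many were dropped.
    clean = [r for r in rows if r["amount"]]
    print(f"quality check: dropped {len(rows) - len(clean)} of {len(rows)} rows")
    return clean

def transform(rows):
    # Cast the amount field from string to float.
    return [{**r, "amount": float(r["amount"])} for r in rows]

def load(rows, warehouse):
    # Stand-in for writing to a warehouse table.
    warehouse.extend(rows)

warehouse = []
load(transform(validate(extract())), warehouse)
print(len(warehouse), "rows loaded")
```

In a real pipeline each stage would be an orchestrated task (e.g. in Airflow or a similar scheduler) with the quality metrics emitted to monitoring rather than printed.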
Qualifications:
- Strong understanding of data architectures, data modelling principles, and ETL processes.
- Proficiency in SQL (e.g., MySQL, PostgreSQL) and experience with big data querying languages (e.g., Hive, Spark SQL).
- Experience with scripting languages (Python, Bash) for data manipulation and automation.
- Experience with distributed data processing frameworks (Spark, Hadoop) (preferred).
- Familiarity with cloud platforms (AWS, Azure, GCP) for data storage and processing (a plus).
- Experience with data quality tools and techniques.
- Excellent problem-solving, analytical, and critical thinking skills.
- Strong communication, collaboration, and teamwork abilities.
Job Description: React Native Developer
Experience: Over 4 years
Responsibilities:
- Architect, design, develop, and maintain complex, scalable React Native applications using clean code principles.
- Collaborate with designers to translate UI/UX mock-ups into pixel-perfect, native-feeling mobile interfaces.
- Leverage React Native's capabilities to build reusable UI components and implement performant animations.
- Effectively utilize native modules and APIs to achieve platform-specific functionalities when necessary.
- Write unit and integration tests to ensure code quality and maintainability.
- Identify and troubleshoot bugs, diagnose performance bottlenecks, and implement optimizations.
- Stay up to date with the latest trends and advancements in the React Native ecosystem.
- Participate in code reviews, provide mentorship to junior developers, and foster a collaborative development environment.
Qualifications:
- Experience in professional software development with a strong focus on mobile development.
- Proven experience building production-ready React Native applications.
- In-depth knowledge of React, JavaScript (ES6+), and related web technologies (HTML, CSS).
- Strong understanding of mobile development concepts and best practices.
- Experience with Redux or similar state management libraries for complex applications.
- Experience with unit testing frameworks (Jest, Mocha) and UI testing tools.
- Excellent communication, collaboration, and problem-solving skills.
- Ability to work independently and manage multiple tasks effectively.
- A passion for building high-quality, user-centric mobile applications.
Nice To Have:
- Experience with native development (iOS/Android) for deep integrations.
- Experience with containerization technologies (Docker, Kubernetes).
- Experience with continuous integration/continuous delivery (CI/CD) pipelines.
- Experience with GraphQL or RESTful APIs.
Responsibilities:
- Conduct thorough manual testing of software applications across various functionalities.
- Develop and maintain automated test scripts using MSTest framework.
- Identify and report bugs and defects through a bug tracking system.
- Analyze test results, diagnose issues, and collaborate with developers to resolve them.
- Participate in code reviews to identify potential defects early in the development process.
- Stay up-to-date with the latest QA methodologies and best practices.
- Contribute to the improvement of existing testing processes and documentation.
- Work effectively within an Agile development environment.
- Clearly communicate test findings and recommendations to technical and non-technical audiences.
Qualifications:
- Proven experience in manual testing methodologies (e.g., black-box testing, exploratory testing).
- Expertise in developing and maintaining automated test scripts using MSTest.
- Strong understanding of software development lifecycle (SDLC) and Agile methodologies.
- Excellent analytical and problem-solving skills.
- Ability to prioritize tasks, manage time effectively, and meet deadlines.
- Strong written and verbal communication skills.
- Experience with API testing is a plus.
- Familiarity with other automation frameworks (e.g., Selenium) is a plus.
Responsibilities:
- Design, develop, and implement robust and efficient backend services using microservices architecture principles.
- Write clean, maintainable, and well-documented code using C# and the .NET framework.
- Develop and implement data access layers using Entity Framework.
- Utilize Azure DevOps for version control, continuous integration, and continuous delivery (CI/CD) pipelines.
- Design and manage databases on Azure SQL.
- Perform code reviews and participate in pair programming to ensure code quality.
- Troubleshoot and debug complex backend issues.
- Optimize backend performance and scalability to ensure a smooth user experience.
- Stay up-to-date with the latest advancements in backend technologies and cloud platforms.
- Collaborate effectively with frontend developers, product managers, and other stakeholders.
- Clearly communicate technical concepts to both technical and non-technical audiences.
Qualifications:
- Strong understanding of microservices architecture principles and best practices.
- In-depth knowledge of C# programming language and the .NET framework (ASP.NET MVC/Core, Web API).
- Experience working with Entity Framework for data access.
- Proficiency with Azure DevOps for CI/CD pipelines and version control (Git).
- Experience with Azure SQL for database design and management.
- Experience with unit testing and integration testing methodologies.
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Strong written and verbal communication skills.
- A passion for building high-quality, scalable, and secure software applications.
Responsibilities:
- Design, develop, and maintain highly interactive and responsive web applications using Angular 14.
- Write clean, maintainable, and well-documented code that adheres to best practices.
- Leverage Tailwind CSS for rapid UI development and ensure consistent design across the application.
- Implement user interface components and features according to design specifications.
- Integrate with backend APIs and services to retrieve and manipulate data.
- Optimize application performance for a smooth user experience across all devices and browsers.
- Conduct unit testing and participate in integration testing to ensure code quality.
- Collaborate with designers, backend engineers, product managers, and other stakeholders throughout the development lifecycle.
- Stay up-to-date with the latest advancements in frontend technologies and frameworks.
- Contribute to the improvement of existing frontend development processes and documentation.
Qualifications:
- Strong understanding of Angular 14 architecture, components, directives, and services.
- Proficiency with HTML5, CSS3, and JavaScript (ES6+).
- In-depth knowledge of Tailwind CSS and its utility classes for rapid UI development.
- Experience with responsive web design (RWD) principles and best practices.
- Understanding of web accessibility guidelines (WCAG).
- Familiarity with unit testing frameworks (e.g., Jasmine, Karma) is a plus.
- Experience with build tools (e.g., Webpack) is a plus.
- Excellent problem-solving and analytical skills.
- Strong attention to detail and a passion for creating pixel-perfect user interfaces.
- Excellent communication and collaboration skills.
- A proactive and results-oriented individual with a strong work ethic.
Responsibilities:
- Design, implement, and maintain robust CI/CD pipelines in Azure DevOps for the continuous integration and delivery of software applications.
- Provision and manage infrastructure resources on Microsoft Azure, including virtual machines, containers, storage, and networking components.
- Implement and manage Kubernetes clusters for containerized application deployments and orchestration.
- Configure and utilize Azure Container Registry (ACR) for secure container image storage and management.
- Automate infrastructure provisioning and configuration management using tools like Azure Resource Manager (ARM) templates.
- Monitor application performance and identify potential bottlenecks using Azure monitoring tools.
- Collaborate with developers and operations teams to identify and implement continuous improvement opportunities for the DevOps process.
- Troubleshoot and resolve DevOps-related issues, ensuring smooth and efficient software delivery.
- Stay up-to-date with the latest advancements in cloud technologies, DevOps tools, and best practices.
- Maintain a strong focus on security throughout the software delivery lifecycle.
- Participate in code reviews to identify potential infrastructure and deployment issues.
- Effectively communicate with technical and non-technical audiences on DevOps processes and initiatives.
Qualifications:
- Proven experience in designing and implementing CI/CD pipelines using Azure DevOps.
- In-depth knowledge of Microsoft Azure cloud platform services (IaaS, PaaS, SaaS).
- Expertise in deploying and managing containerized applications using Kubernetes.
- Experience with Infrastructure as Code (IaC) tools like ARM templates.
- Familiarity with Azure monitoring tools and troubleshooting techniques.
- A strong understanding of DevOps principles and methodologies (Agile, Lean).
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Strong written and verbal communication skills.
- A minimum of one relevant Microsoft certification (e.g., Azure Administrator Associate, DevOps Engineer Expert) is highly preferred.
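A CI/CD pipeline of the kind described above is usually defined declaratively. The fragment below is a deliberately minimal `azure-pipelines.yml` sketch; the trigger branch, VM image, and step commands are placeholders to be replaced with a real build and deployment flow.

```yaml
# Minimal Azure DevOps pipeline sketch; step contents are placeholders.
trigger:
  - main                      # run on pushes to main

pool:
  vmImage: ubuntu-latest      # Microsoft-hosted build agent

steps:
  - script: echo "restore and build"
    displayName: Build
  - script: echo "run unit tests"
    displayName: Test
  - script: echo "publish artifact / deploy"
    displayName: Deploy
```

A production pipeline would typically split these steps into stages, push images to Azure Container Registry, and deploy to Kubernetes via environment approvals.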
- Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components across ingestion, data modeling, querying, processing, storage, analysis, and data integration, including implementing enterprise-level Big Data systems.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, the Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala, and Oozie.
- Hands-on experience creating real-time data streaming solutions using Apache Spark Core, Spark SQL and DataFrames, Kafka, Spark Streaming, and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of a Hadoop cluster, including the NameNode, DataNode, ResourceManager, NodeManager, and Job History Server.
- Worked with both the Cloudera and Hortonworks Hadoop distributions; experience managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration, and management of Big Data workloads and the underlying infrastructure of a Hadoop cluster.
- Hands on experience in coding MapReduce/Yarn Programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked with Spark using Scala on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications combining Spark with Hive and SQL/Oracle.
- Implemented Spark jobs in Python using the DataFrame and Spark SQL APIs for faster data processing; imported data from different sources into HDFS using Sqoop and performed transformations with Hive and MapReduce before loading the results into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib for predictive intelligence and customer segmentation, and with maintaining Spark Streaming applications.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Created data pipelines for ingesting and aggregating event data and loading consumer response data into Hive external tables in HDFS, serving as the feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 and S3, including automating the provisioning and scaling of clusters in the AWS cloud.
- Extensively worked with Spark using Python on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications combining Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad hoc queries with Cloudera Impala, including its analytical functions.
- Experience creating DataFrames with PySpark and performing operations on them in Python.
- In-depth understanding of Hadoop architecture and its various components, such as HDFS, the MapReduce programming paradigm, High Availability, and the YARN architecture.
- Established connections to multiple Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various knowledge reports in Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Well experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying, and utilizing most of the AWS stack (including EC2 and S3), with a focus on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development and with software methodologies such as Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience importing data with Sqoop and SFTP from various sources such as RDBMS, Teradata, mainframes, Oracle, and Netezza into HDFS, and performing transformations on it using Hive, Pig, and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases, including HBase, Cassandra, and MongoDB, and their integration with Hadoop clusters.
- Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
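Several bullets above mention converting Hive/SQL queries into Spark RDD transformations. The dependency-free sketch below illustrates the idea with plain Python standing in for Spark (a real job would use `pyspark`'s `map`/`reduceByKey`); the table and data are invented:

```python
import sqlite3
from itertools import groupby
from operator import itemgetter

# Toy data standing in for a Hive table; in Spark this would be an RDD/DataFrame.
sales = [("US", 10), ("EU", 5), ("US", 7), ("EU", 3)]

# 1) The SQL form of the query (run here via sqlite as a stand-in for Hive).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, qty INT)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", sales)
sql_result = dict(conn.execute(
    "SELECT region, SUM(qty) FROM sales GROUP BY region"))

# 2) The same query as map/reduce-style transformations, mirroring how one
#    would chain map -> reduceByKey in actual PySpark.
keyed = sorted(sales, key=itemgetter(0))          # shuffle-by-key stand-in
rdd_result = {k: sum(v for _, v in grp)           # reduceByKey stand-in
              for k, grp in groupby(keyed, key=itemgetter(0))}

print(sql_result, rdd_result)
```

Both forms produce the same per-region totals, which is the essence of translating a declarative GROUP BY into explicit key-value transformations.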
We are looking to fill the role of Kubernetes engineer. To join our growing team, please review the list of responsibilities and qualifications.
Kubernetes Engineer Responsibilities
- Install, configure, and maintain Kubernetes clusters.
- Develop Kubernetes-based solutions.
- Improve Kubernetes infrastructure.
- Work with other engineers to troubleshoot Kubernetes issues.
Kubernetes Engineer Requirements & Skills
- Kubernetes administration experience, including installation, configuration, and troubleshooting
- Kubernetes development experience
- Linux/Unix experience
- Strong analytical and problem-solving skills
- Excellent communication and interpersonal skills
- Ability to work independently and as part of a team
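Day to day, the cluster and deployment work above centers on Kubernetes manifests. Below is a deliberately minimal Deployment sketch; the app name, image, port, and resource figures are all placeholders:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-app                 # placeholder name
spec:
  replicas: 2
  selector:
    matchLabels:
      app: example-app
  template:
    metadata:
      labels:
        app: example-app
    spec:
      containers:
        - name: example-app
          image: registry.example.com/example-app:1.0   # placeholder image
          ports:
            - containerPort: 8080
          resources:
            requests:
              cpu: 100m
              memory: 128Mi
```

A manifest like this would typically be applied with `kubectl apply -f deployment.yaml` and inspected with `kubectl get deployments`.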