Molecular Connections
Founded: 2001
Type: Products & Services
Size: 1000-5000
Stage: Profitable
About
With over two decades of experience in Big Data and Data Science solutions, Molecular Connections (MC) has been using AI-powered proprietary models to help customers achieve digital transformation, building data-driven decision-making strategies for their transformation journeys. MC leverages AI, ML and its Linked Data Store to build efficiencies across verticals and to generate new revenue streams for its customers. Decades of industry presence and a strong focus on innovation have led MC to work with the world's leading pharma and STEM organisations, offering end-to-end software development and data insights powered by proprietary workflows and platforms that enable content engineering across multiple domains. With over 70% of its workforce being women, MC is ranked among the top 15 best companies for women to work for in India.

Subsidiary companies:
1. Molecular Connections Analytics Pvt. Ltd. | URL: https://mcanalytics.co.in/
2. Molecular Connections Research Pvt. Ltd. | URL: https://mcresearch.co.in/
Connect with the team
Molecular Connections
Chendil Kumar
Gurminder Kaur
Lokanath Khamari
Company social profiles
LinkedIn

Jobs at Molecular Connections

Posted by Molecular Connections

Bengaluru (Bangalore)
2 - 5 yrs
₹13L - ₹16L / yr
Data Analytics
Data Visualization
PowerBI
Tableau
Qlikview
+4 more

Responsibilities:

·       Analyze complex data sets to answer specific questions using MMIT's market access data, Norstella claims data, and third-party claims data (IQVIA LAAD, Symphony SHA). Applicants must have prior experience working with these specific data sets (a brief illustrative sketch follows the Requirements list below).

·       Deliver consultative services to clients related to MMIT real-world data (RWD) sets

·       Produce complex analytical reports using data visualization tools such as Power BI or Tableau

·       Define customized technical specifications to surface MMIT RWD in MMIT tools. 

·       Execute work in a timely fashion and with high accuracy while managing competing priorities; perform thorough troubleshooting and QA; communicate with internal teams to obtain required data

·       Ensure adherence to documentation requirements, process workflows, timelines, and escalation protocols

·       Other duties as assigned.

 

Requirements:

·       Bachelor’s Degree or relevant experience required

·       2-5 yrs. of professional experience in RWD analytics using SQL

·       Fundamental understanding of the pharma and market access space

·       Strong analysis skills and proficiency with tools such as Tableau or PowerBI

·       Excellent written and verbal communication skills.

·       Analytical, critical thinking and creative problem-solving skills.

·       Relationship building skills.

·       Solid organizational skills including attention to detail and multitasking skills.

·       Excellent time management and prioritization skills.
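To give a flavour of the claims-data analysis this role involves, below is a minimal, hypothetical sketch in Python/pandas; the file name, column names and metric are illustrative only and do not reflect MMIT's actual schemas.

    import pandas as pd

    # Hypothetical extract of third-party pharmacy claims (illustrative columns only).
    claims = pd.read_csv("claims_extract.csv", parse_dates=["fill_date"])

    # Approved vs. rejected fills by payer and month - a typical market-access cut.
    summary = (
        claims
        .assign(month=claims["fill_date"].dt.to_period("M"))
        .groupby(["payer_name", "month", "claim_status"])
        .agg(claim_count=("claim_id", "count"),
             patient_count=("patient_id", "nunique"))
        .reset_index()
    )

    # Export as a feed for a Power BI or Tableau report.
    summary.to_csv("claims_summary.csv", index=False)

In practice the same aggregation would typically be written in SQL against the warehouse and surfaced directly in Power BI or Tableau.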

 

Posted by Molecular Connections

Bengaluru (Bangalore)
4 - 9 yrs
₹8L - ₹12L / yr
ETL
Informatica
Data Warehouse (DWH)
Spark
Hadoop
+5 more

Job Description: Data Engineer


Experience: Over 4 years


Responsibilities:

-       Design, develop, and maintain scalable data pipelines for efficient data extraction, transformation, and loading (ETL) processes.

-       Architect and implement data storage solutions, including data warehouses, data lakes, and data marts, aligned with business needs.

-       Implement robust data quality checks and data cleansing techniques to ensure data accuracy and consistency.

-       Optimize data pipelines for performance, scalability, and cost-effectiveness.

-       Collaborate with data analysts and data scientists to understand data requirements and translate them into technical solutions.

-       Develop and maintain data security measures to ensure data privacy and regulatory compliance.

-       Automate data processing tasks using scripting languages (Python, Bash) and big data frameworks (Spark, Hadoop).

-       Monitor data pipelines and infrastructure for performance and troubleshoot any issues.

-       Stay up to date with the latest trends and technologies in data engineering, including cloud platforms (AWS, Azure, GCP).

-        Document data pipelines, processes, and data models for maintainability and knowledge sharing.

-       Contribute to the overall data governance strategy and best practices.

 

Qualifications:

-       Strong understanding of data architectures, data modelling principles, and ETL processes.

-       Proficiency in SQL (e.g., MySQL, PostgreSQL) and experience with big data querying languages (e.g., Hive, Spark SQL).

-       Experience with scripting languages (Python, Bash) for data manipulation and automation.

-       Experience with distributed data processing frameworks such as Spark or Hadoop is preferred (a minimal PySpark sketch follows this list).

-       Familiarity with cloud platforms (AWS, Azure, GCP) for data storage and processing (a plus).

-       Experience with data quality tools and techniques.

-       Excellent problem-solving, analytical, and critical thinking skills.

-       Strong communication, collaboration, and teamwork abilities.
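For illustration, a minimal PySpark ETL sketch of the pipeline work described above; the paths, column names and data-quality rule are hypothetical.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Extract: read raw CSV files landed in the data lake (hypothetical path and schema).
    raw = spark.read.option("header", True).csv("s3a://raw-zone/orders/")

    # Transform: basic cleansing plus a simple data-quality rule.
    clean = (
        raw.dropDuplicates(["order_id"])
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
    )

    # Load: write partitioned Parquet into the curated zone for downstream analysts.
    clean.write.mode("overwrite").partitionBy("order_date").parquet("s3a://curated-zone/orders/")

    spark.stop()

A production pipeline would add logging, schema enforcement and orchestration (for example Airflow or Oozie), but the extract-transform-load shape stays the same.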

Posted by Molecular Connections

Bengaluru (Bangalore)
4 - 9 yrs
₹11L - ₹13L / yr
React.js
Redux/Flux
Jest
Docker
Kubernetes
+6 more

Job Description: React Native Developer


Experience: Over 4 years


Responsibilities:

-       Architect, design, develop, and maintain complex, scalable React Native applications using clean code principles.

-       Collaborate with designers to translate UI/UX mock-ups into pixel-perfect, native-feeling mobile interfaces.

-       Leverage React Native's capabilities to build reusable UI components and implement performant animations.

-       Effectively utilize native modules and APIs to achieve platform-specific functionalities when necessary.

-       Write unit and integration tests to ensure code quality and maintainability.

-       Identify and troubleshoot bugs, diagnose performance bottlenecks, and implement optimizations.

-       Stay up to date with the latest trends and advancements in the React Native ecosystem.

-       Participate in code reviews, provide mentorship to junior developers, and foster a collaborative development environment.

Qualifications:

-       Experience in professional software development with a strong focus on mobile development.

-       Proven experience building production-ready React Native applications.

-       In-depth knowledge of React, JavaScript (ES6+), and related web technologies (HTML, CSS).

-       Strong understanding of mobile development concepts and best practices.

-       Experience with Redux or similar state management libraries for complex applications.

-       Experience with unit testing frameworks (Jest, Mocha) and UI testing tools.

-       Excellent communication, collaboration, and problem-solving skills.

-       Ability to work independently and manage multiple tasks effectively.

-       A passion for building high-quality, user-centric mobile applications.

Nice To Have:

-       Experience with native development (iOS/Android) for deep integrations.

-       Experience with containerization technologies (Docker, Kubernetes).

-       Experience with continuous integration/continuous delivery (CI/CD) pipelines.

-       Experience with GraphQL or RESTful APIs.

Posted by Molecular Connections

Bengaluru (Bangalore)
3 - 4 yrs
₹4L - ₹8L / yr
Test Automation (QA)
Manual testing


Responsibilities:

-       Conduct thorough manual testing of software applications across various functionalities.

-       Develop and maintain automated test scripts using the MSTest framework.

-       Identify and report bugs and defects through a bug tracking system.

-       Analyze test results, diagnose issues, and collaborate with developers to resolve them.

-       Participate in code reviews to identify potential defects early in the development process.

-       Stay up-to-date with the latest QA methodologies and best practices.

-       Contribute to the improvement of existing testing processes and documentation.

-       Work effectively within an Agile development environment.

-       Clearly communicate test findings and recommendations to technical and non-technical audiences.


Qualifications:

-       Proven experience in manual testing methodologies (e.g., black-box testing, exploratory testing).

-       Expertise in developing and maintaining automated test scripts using MSTest.

-       Strong understanding of software development lifecycle (SDLC) and Agile methodologies.

-       Excellent analytical and problem-solving skills.

-       Ability to prioritize tasks, manage time effectively, and meet deadlines.

-       Strong written and verbal communication skills.

-       Experience with API testing is a plus.

-       Familiarity with other automation frameworks (e.g., Selenium) is a plus.


Posted by Molecular Connections

Remote, Bengaluru (Bangalore), Mumbai
5 - 10 yrs
₹7L - ₹15L / yr
Java
Python
Ruby
Ruby on Rails (ROR)
Go Programming (Golang)
+8 more

Responsibilities:

- Design, develop, and implement robust and efficient backend services using microservices architecture principles.

-  Write clean, maintainable, and well-documented code using C# and the .NET framework.

-  Develop and implement data access layers using Entity Framework.

-  Utilize Azure DevOps for version control, continuous integration, and continuous delivery (CI/CD) pipelines.

-  Design and manage databases on Azure SQL.

-  Perform code reviews and participate in pair programming to ensure code quality.

-  Troubleshoot and debug complex backend issues.

-  Optimize backend performance and scalability to ensure a smooth user experience.

-  Stay up-to-date with the latest advancements in backend technologies and cloud platforms.

-  Collaborate effectively with frontend developers, product managers, and other stakeholders.

-  Clearly communicate technical concepts to both technical and non-technical audiences.

Qualifications:

-  Strong understanding of microservices architecture principles and best practices.

-  In-depth knowledge of C# programming language and the .NET framework (ASP.NET MVC/Core, Web API).

-  Experience working with Entity Framework for data access.

-  Proficiency with Azure DevOps for CI/CD pipelines and version control (Git).

-  Experience with Azure SQL for database design and management.

-  Experience with unit testing and integration testing methodologies.

-  Excellent problem-solving and analytical skills.

-   Ability to work independently and as part of a team.

-   Strong written and verbal communication skills.

-   A passion for building high-quality, scalable, and secure software applications.

Posted by Molecular Connections

Bengaluru (Bangalore)
4 - 7 yrs
₹7L - ₹10L / yr
Angular (2+)
AngularJS (1.x)
HTML/CSS
JavaScriptMVC
Jasmine (JavaScript Testing Framework)
+1 more

Responsibilities:

- Design, develop, and maintain highly interactive and responsive web applications using Angular 14.

-  Write clean, maintainable, and well-documented code that adheres to best practices.

- Leverage Tailwind CSS for rapid UI development and ensure consistent design across the application.

- Implement user interface components and features according to design specifications.

- Integrate with backend APIs and services to retrieve and manipulate data.

- Optimize application performance for a smooth user experience across all devices and browsers.

- Conduct unit testing and participate in integration testing to ensure code quality.

- Collaborate with designers, backend engineers, product managers, and other stakeholders throughout the development lifecycle.

- Stay up-to-date with the latest advancements in frontend technologies and frameworks.

- Contribute to the improvement of existing frontend development processes and documentation.

 

Qualifications:

-  Strong understanding of Angular 14 architecture, components, directives, and services.

-  Proficiency with HTML5, CSS3, and JavaScript (ES6+).

-  In-depth knowledge of Tailwind CSS and its utility classes for rapid UI development.

- Experience with responsive web design (RWD) principles and best practices.

- Understanding of web accessibility guidelines (WCAG).

- Familiarity with unit testing frameworks (e.g., Jasmine, Karma) is a plus.

-  Experience with build tools (e.g., Webpack) is a plus.

-  Excellent problem-solving and analytical skills.

-   Strong attention to detail and a passion for creating pixel-perfect user interfaces.

-   Excellent communication and collaboration skills.

-   A proactive and results-oriented individual with a strong work ethic.

Posted by Molecular Connections

Bengaluru (Bangalore)
3 - 5 yrs
₹10L - ₹15L / yr
DevOps
Kubernetes
Docker
Windows Azure
Google Cloud Platform (GCP)
+1 more

Responsibilities:

- Design, implement, and maintain robust continuous integration and continuous delivery (CI/CD) pipelines using Azure DevOps for software applications.

- Provision and manage infrastructure resources on Microsoft Azure, including virtual machines, containers, storage, and networking components.

-  Implement and manage Kubernetes clusters for containerized application deployments and orchestration.

-  Configure and utilize Azure Container Registry (ACR) for secure container image storage and management.

-  Automate infrastructure provisioning and configuration management using tools like Azure Resource Manager (ARM) templates.

- Monitor application performance and identify potential bottlenecks using Azure monitoring tools.

- Collaborate with developers and operations teams to identify and implement continuous improvement opportunities for the DevOps process.

- Troubleshoot and resolve DevOps-related issues, ensuring smooth and efficient software delivery.

- Stay up-to-date with the latest advancements in cloud technologies, DevOps tools, and best practices.

- Maintain a strong focus on security throughout the software delivery lifecycle.

- Participate in code reviews to identify potential infrastructure and deployment issues.

-  Effectively communicate with technical and non-technical audiences on DevOps processes and initiatives.

Qualifications:

- Proven experience in designing and implementing CI/CD pipelines using Azure DevOps.

- In-depth knowledge of Microsoft Azure cloud platform services (IaaS, PaaS, SaaS).

- Expertise in deploying and managing containerized applications using Kubernetes.

-  Experience with Infrastructure as Code (IaC) tools like ARM templates.

- Familiarity with Azure monitoring tools and troubleshooting techniques.

-  A strong understanding of DevOps principles and methodologies (Agile, Lean).

-  Excellent problem-solving and analytical skills.

-   Ability to work independently and as part of a team.

-   Strong written and verbal communication skills.

-   A minimum of one relevant Microsoft certification (e.g., Azure Administrator Associate, DevOps Engineer Expert) is highly preferred.


Posted by Molecular Connections

Bengaluru (Bangalore)
8 - 10 yrs
₹15L - ₹20L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+4 more
  1. Big Data developer with 8+ years of professional IT experience, with expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis and data integration, and in implementing enterprise-level Big Data systems.
  2. A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
  3. Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
  4. Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm.
  5. Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
  6. Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
  7. Well versed in the installation, configuration and management of Big Data workloads and the underlying infrastructure of Hadoop clusters.
  8. Hands-on experience in coding MapReduce/YARN programs using Java, Scala and Python for analyzing Big Data.
  9. Exposure to Cloudera development environment and management using Cloudera Manager.
  10. Extensively worked on Spark using Scala on clusters for analytics, installed it on top of Hadoop, and built advanced analytical applications using Spark with Hive and SQL/Oracle.
  11. Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing of data; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce before loading the data into HDFS.
  12. Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
  13. Hands on experience in MLlib from Spark which are used for predictive intelligence, customer segmentation and for smooth maintenance in Spark streaming.
  14. Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
  15. Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
  16. Worked on creating data pipelines for different ingestion and aggregation events, loading consumer response data into Hive external tables in HDFS to serve as a feed for Tableau dashboards.
  17. Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
  18. In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
  19. Hands on expertise in real time analytics with Apache Spark.
  20. Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
  21. Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
  22. Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
  23. Extensively worked on Spark using Python on clusters for analytics, installed it on top of Hadoop, and built advanced analytical applications using Spark with Hive and SQL.
  24. Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
  25. Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
  26. Experienced in writing Ad Hoc queries using Cloudera Impala, also used Impala analytical functions.
  27. Experience in creating DataFrames using PySpark and performing operations on them using Python (see the sketch after this list).
  28. In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
  29. Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
  30. Generated various kinds of knowledge reports using Power BI based on business specifications.
  31. Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
  32. Experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
  33. Experienced in designing, building, deploying and utilizing much of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance and auto-scaling.
  34. Good experience with use-case development, with Software methodologies like Agile and Waterfall.
  35. Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
  36. Good working experience in importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, Mainframes, Oracle and Netezza to HDFS, and performing transformations on it using Hive, Pig and Spark.
  37. Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
  38. Proficient in NoSQL databases including HBase, Cassandra, MongoDB and its integration with Hadoop cluster.
  39. Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
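As an illustration of the PySpark DataFrame work referenced above, here is a minimal sketch; the database, table and column names are hypothetical.

    from pyspark.sql import SparkSession, functions as F

    # Hive support lets Spark SQL query existing Hive external tables.
    spark = (SparkSession.builder
             .appName("consumer_response_analytics")
             .enableHiveSupport()
             .getOrCreate())

    # Hypothetical Hive external table fed by the ingestion pipeline.
    responses = spark.table("analytics.consumer_response")

    # A Hive/SQL-style aggregation expressed as DataFrame operations.
    daily_counts = (responses
                    .filter(F.col("event_type") == "click")
                    .groupBy("event_date", "campaign_id")
                    .agg(F.countDistinct("user_id").alias("unique_users"),
                         F.count("*").alias("events")))

    # Persist the result as a table that a Tableau dashboard can read.
    daily_counts.write.mode("overwrite").saveAsTable("analytics.daily_campaign_summary")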
Agency job via Molecular Connections

Bengaluru (Bangalore)
3 - 5 yrs
₹5L - ₹10L / yr
Docker
Kubernetes
DevOps
Amazon Web Services (AWS)
Windows Azure
+2 more
We are looking for a freelance Kubernetes Engineer to support our existing team. Please review the list of responsibilities and qualifications.

Responsibilities:

  • Install, configure, and maintain Kubernetes clusters.
  • Develop Kubernetes-based solutions.
  • Improve Kubernetes infrastructure.
  • Work with other engineers to troubleshoot Kubernetes issues (a minimal troubleshooting sketch follows the requirements list below).

Kubernetes Engineer Requirements & Skills

  • Kubernetes administration experience, including installation, configuration, and troubleshooting
  • Kubernetes development experience
  • Linux/Unix experience
  • Strong analytical and problem-solving skills
  • Excellent communication and interpersonal skills
  • Ability to work independently and as part of a team
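To give a flavour of the troubleshooting work mentioned above, here is a minimal sketch using the official Kubernetes Python client; the namespace is hypothetical and a valid kubeconfig is assumed.

    from kubernetes import client, config

    # Load credentials from the local kubeconfig (e.g., ~/.kube/config).
    config.load_kube_config()
    v1 = client.CoreV1Api()

    # List pods in a (hypothetical) namespace and flag any that are not running.
    pods = v1.list_namespaced_pod(namespace="staging")
    for pod in pods.items:
        phase = pod.status.phase
        if phase != "Running":
            print(f"{pod.metadata.name}: {phase}")

The same check is often done interactively with kubectl; the client library is useful when the check needs to run inside automation.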
Agency job via Molecular Connections

Bengaluru (Bangalore)
2 - 4 yrs
₹5L - ₹10L / yr
DevOps
Kubernetes
Docker
Amazon Web Services (AWS)
Windows Azure
+2 more

We are looking to fill the role of Kubernetes Engineer. To join our growing team, please review the list of responsibilities and qualifications.

Kubernetes Engineer Responsibilities

  • Install, configure, and maintain Kubernetes clusters.
  • Develop Kubernetes-based solutions.
  • Improve Kubernetes infrastructure.
  • Work with other engineers to troubleshoot Kubernetes issues.

Kubernetes Engineer Requirements & Skills

  • Kubernetes administration experience, including installation, configuration, and troubleshooting
  • Kubernetes development experience
  • Linux/Unix experience
  • Strong analytical and problem-solving skills
  • Excellent communication and interpersonal skills
  • Ability to work independently and as part of a team

Similar companies

ConnectedH
https://www.connectedh.com
Founded: 2017
Type: Products & Services
Size: 0-20
Stage: Raised funding

About the company

We want to create a connected healthcare ecosystem where all the generated medical data is automatically collated and stored at one single location. This will help remove inefficiencies in healthcare delivery, healthcare insurance & financing, population healthcare management, and build new solutions in predictive healthcare and diagnosis.

 

We are also building a customer-facing product where users will be able to look up information about medical tests, including interpretation of results, and other health-related queries.

Jobs: 3

Live Connections
https://www.liveconnections.in/
Founded: 1996
Type: Services
Size: 200-500
Stage: Profitable

About the company

Live Connections Placements Pvt. Ltd. (LiveC, as we are popularly known) is a 23+ year-old search & recruitment organisation that specializes in finding and placing professionals across several sectors around the globe. We bring to the table cumulative recruitment experience built over two decades. We understand the business life cycles in recruitment and have placed over 30,000 people across 400+ clients. We extend our services to multiple sectors and functions across various geographies and are now operational in 7 locations in 4 countries (India, UAE, Singapore and Qatar). We have a goal-oriented, highly focused approach with all our clients, and we take the time to get to know and identify the needs of everyone we work with to build strong, long-lasting relationships.

Jobs: 2

Elucidata Corporation
http://www.elucidata.io
Founded: 2015
Type: Products & Services
Size: 20-100
Stage: Raised funding

About the company

Elucidata is using data science to transform decision-making processes in R&D labs in biotechnology and pharmaceutical companies.

Jobs: 1

Datazymes
http://www.datazymes.com
Founded: 2016
Type: Products & Services
Size: 20-100
Stage: Profitable

About the company

DataZymes Inc provides data management and analytics solutions for healthcare organizations. Driven by a passion for data, we leverage machine learning algorithms to deliver timely, relevant and high-quality insights, helping organizations get more value from their data.

Jobs: 0

Excelra Knowledge Solutions
https://www.excelra.com/
Founded: 2001
Type: Services
Size: 100-1000
Stage: Profitable

About the company

Excelra is a leading global Biopharma Data and Analytics company. With over 15 years of experience, we have built a high-quality scientific data engine powered by human and machine intelligence. This has helped us create valuable data assets from drug chemistry (for medicinal and computational drug discovery) to biomarkers (for translational biology) and clinical and real-world evidence (for pharmacometric modelling and HEOR/MA). Scientists from Excelra’s Pharma Analytics team consult with Biopharma companies offering bespoke solutions leveraging Cheminformatics, Bioinformatics, Computational Biology, and Data Science. Our proprietary databases for Structure-Activity Relationships (GOSTAR), Biomarkers (GOBIOM) and Clinical Trials Outcome (CTOD) have been well received by top 20 pharma and biotech. Excelra has a strong focus on data analytics across the Pharma value chain, right from Discovery to Commercialization.

Jobs: 0

CDM Connect
http://www.cdmconnect.com
Founded: 2017
Type: Services
Size: 0-20
Stage: Bootstrapped

About the company

We are accessible and available to fulfill your every digital need.

Jobs: 0

SciLynk
http://www.scilynk.in
Founded: 2018
Type: Services
Size: 0-20
Stage: Bootstrapped

About the company

Home - SciLynk

Jobs: 0

Climate Connect
http://www.climate-connect.com
Founded: 2010
Type: Products & Services
Size: 20-100
Stage: Profitable

About the company

Climate Connect develops an AI-based analytics platform that provides portfolio management and storage optimization solutions for the energy sector.

Jobs: 0

Covalense Technologies Pvt ltd
http://covalenseglobal.com
Founded: 2006
Type: Services
Size: 100-1000
Stage: Profitable

About the company

Covalense Global is a recognised software services and niche solutions company and a reliable IT transformation partner to our clients.

Jobs: 1

Covalense Digital Solutions Pvt
https://www.google.com
Founded: 2006
Type: Products & Services
Size: 100-1000
Stage: Profitable

About the company

Jobs: 0
