
Molecular Connections
http://www.molecularconnections.com
Jobs at Molecular Connections

About the Role:
We are seeking a talented and passionate DevOps Engineer to join our dynamic team. You will be responsible for designing, implementing, and managing scalable and secure infrastructure across multiple cloud platforms. The ideal candidate will have a deep understanding of DevOps best practices and a proven track record in automating and optimizing complex workflows.
Key Responsibilities:
Cloud Management:
- Design, implement, and manage cloud infrastructure on AWS, Azure, and GCP.
- Ensure high availability, scalability, and security of cloud resources.
Containerization & Orchestration:
- Develop and manage containerized applications using Docker.
- Deploy, scale, and manage Kubernetes clusters.
CI/CD Pipelines:
- Build and maintain robust CI/CD pipelines to automate the software delivery process.
- Implement monitoring and alerting to ensure pipeline efficiency.
Version Control & Collaboration:
- Manage code repositories and workflows using Git.
- Collaborate with development teams to optimize branching strategies and code reviews.
Automation & Scripting:
- Automate infrastructure provisioning and configuration using tools like Terraform, Ansible, or similar.
- Write scripts to optimize and maintain workflows (a minimal illustrative sketch follows this list).
Monitoring & Logging:
- Implement and maintain monitoring solutions to ensure system health and performance.
- Analyze logs and metrics to troubleshoot and resolve issues.
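For illustration of the scripting and monitoring work above, here is a minimal, hedged Python sketch that polls service health endpoints and posts an alert. The endpoint URLs and webhook are hypothetical placeholders, and a real setup would normally lean on dedicated tooling such as Prometheus or CloudWatch alarms:

    # health_check.py - illustrative sketch only; endpoints and webhook are hypothetical.
    import json
    import logging
    import urllib.request

    logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")

    SERVICES = {  # hypothetical health-check URLs
        "api": "https://api.example.internal/healthz",
        "worker": "https://worker.example.internal/healthz",
    }
    ALERT_WEBHOOK = "https://hooks.example.internal/alerts"  # hypothetical

    def is_healthy(name: str, url: str) -> bool:
        """Return True if the service answers HTTP 200 within 5 seconds."""
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                ok = resp.status == 200
        except OSError:  # covers URLError, timeouts, connection errors
            ok = False
        logging.info("%s healthy=%s", name, ok)
        return ok

    def send_alert(name: str) -> None:
        """POST a small JSON alert for an unhealthy service."""
        body = json.dumps({"service": name, "status": "unhealthy"}).encode()
        req = urllib.request.Request(
            ALERT_WEBHOOK, data=body, headers={"Content-Type": "application/json"}
        )
        urllib.request.urlopen(req, timeout=5)

    if __name__ == "__main__":
        for service, url in SERVICES.items():
            if not is_healthy(service, url):
                send_alert(service)

In practice a scheduler (cron, a systemd timer, or a pipeline job) would run such a check, and alerts would route through the team's incident tooling rather than a bare webhook.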
Required Skills & Qualifications:
- 3-5 years of experience with AWS, Azure, and Google Cloud Platform (GCP).
- Proficiency in containerization tools like Docker and orchestration tools like Kubernetes.
- Hands-on experience building and managing CI/CD pipelines.
- Proficient in using Git for version control.
- Experience with scripting languages such as Bash, Python, or PowerShell.
- Familiarity with infrastructure-as-code tools like Terraform or CloudFormation.
- Solid understanding of networking, security, and system administration.
- Excellent problem-solving and troubleshooting skills.
- Strong communication and teamwork skills.
Preferred Qualifications:
- Certifications such as AWS Certified DevOps Engineer, Azure DevOps Engineer, or Google Professional DevOps Engineer.
- Experience with monitoring tools like Prometheus, Grafana, or the ELK Stack (a brief Prometheus-client sketch follows this list).
- Familiarity with serverless architectures and microservices.
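As a hedged sketch of the monitoring-tool familiarity listed above, the snippet below exposes a couple of custom metrics with the Prometheus Python client; the metric names and port are arbitrary examples, not a prescribed setup:

    # metrics_demo.py - illustrative only; assumes the prometheus_client package is installed.
    import random
    import time

    from prometheus_client import Counter, Gauge, start_http_server

    REQUESTS_TOTAL = Counter("demo_requests_total", "Total simulated requests handled")
    QUEUE_DEPTH = Gauge("demo_queue_depth", "Simulated work-queue depth")

    if __name__ == "__main__":
        start_http_server(8000)  # metrics served at http://localhost:8000/metrics
        while True:
            REQUESTS_TOTAL.inc()
            QUEUE_DEPTH.set(random.randint(0, 50))
            time.sleep(1)

A Prometheus server would then scrape the /metrics endpoint and Grafana would chart the resulting series.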


A niche, specialist position in an interdisciplinary team focused on end-to-end solutions. Projects range from proof-of-concept innovative applications and parallel implementations driven by end-user requests to scaling up and continuous monitoring for improvement. The majority of projects focus on providing automation solutions, both through custom builds and by adapting generic machine learning standards to specific use cases and domains.
Clientele includes major publishers from the US and Europe, leading pharmaceutical companies, and government-funded projects.
As a Senior Fullstack Developer, you will be responsible for designing, building, and maintaining scalable and performant web applications using modern technologies. You will work with cutting-edge tools and cloud infrastructure (primarily Google Cloud), building front-end applications with React and TypeScript and robust back-end services with Koa.js, MongoDB, and Redis, while ensuring reliable and efficient monitoring with OpenTelemetry and logging with Bunyan. Your expertise in CI/CD pipelines and modern testing frameworks will be key to maintaining a smooth and efficient software development lifecycle.
Key Responsibilities:
- Fullstack Development: Design, develop, and maintain web applications using JavaScript (Node.js for the back-end and React.js with TypeScript for the front-end).
- Cloud Infrastructure: Leverage Google Cloud services (like Compute Engine, Cloud Storage, Pub/Sub, etc.) to build scalable and resilient cloud solutions.
- API Development: Implement RESTful APIs and microservices with Koa.js, ensuring high performance, security, and scalability.
- Database Management: Manage MongoDB databases for storing and retrieving application data, and use Redis for caching and session management.
- Logging and Monitoring: Utilize Bunyan for structured logging and OpenTelemetry for distributed tracing and monitoring to ensure system health and performance.
- CI/CD: Design, implement, and maintain efficient CI/CD pipelines for continuous integration and deployment, ensuring fast and reliable code delivery.
- Testing & Quality Assurance: Write unit and integration tests using Jest, Mocha, and React Testing Library to ensure code reliability and maintainability.
- Collaboration: Work closely with front-end and back-end engineers to deliver high-quality software solutions, following agile development practices.
- Optimization & Scaling: Identify performance bottlenecks, troubleshoot production issues, and scale the system as needed.
- Code Reviews & Mentorship: Conduct peer code reviews, share best practices, and mentor junior developers to improve team efficiency and code quality.
Must-Have Skills:
- Google Cloud (GCP): Hands-on experience with various Google Cloud services (Compute Engine, Cloud Storage, Pub/Sub, Firestore, etc.) for building scalable applications.
- React.js: Strong experience in building modern, responsive user interfaces with React.js and TypeScript.
- Koa.js: Strong experience in building web servers and APIs with Koa.js.
- MongoDB & Redis: Proficiency in working with MongoDB (NoSQL databases) and Redis for caching and session management.
- Bunyan: Experience using Bunyan for structured logging and tracking application events.
- OpenTelemetry Ecosystem: Hands-on experience with the OpenTelemetry ecosystem for monitoring and distributed tracing.
- CI/CD: Proficient in setting up CI/CD pipelines using tools like CircleCI, Jenkins, or GitLab CI.
- Testing Frameworks: Solid understanding and experience with Jest, Mocha, and React Testing Library for testing both back-end and front-end applications.
- JavaScript & Node.js: Strong proficiency in JavaScript (ES6+), and experience working with Node.js for back-end services.
Desired Skills & Experience:
- Experience with other cloud platforms (AWS, Azure).
- Familiarity with containerization and orchestration tools like Docker and Kubernetes.
- Experience working with TypeScript.
- Knowledge of other logging and monitoring tools.
- Familiarity with agile methodologies and project management tools (JIRA, Trello, etc.).
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5-10 years of hands-on experience as a Fullstack Developer.
- Strong problem-solving skills and ability to debug complex systems.
- Excellent communication skills and ability to work in a team-oriented, collaborative environment.

Responsibilities:
· Analyze complex data sets to answer specific questions using MMIT’s market access data, Norstella claims data, and third-party claims data (IQVIA LAAD, Symphony SHA). Applicants must have prior experience working with these specific data sets (a simplified illustrative query follows this list).
· Deliver consultative services to clients related to MMIT RWD sets
· Produce complex analytical reports using data visualization tools such as Power BI or Tableau
· Define customized technical specifications to surface MMIT RWD in MMIT tools.
· Execute work in a timely fashion with high accuracy, while managing various competing priorities; Perform thorough troubleshooting and execute QA; Communicate with internal teams to obtain required data
· Ensure adherence to documentation requirements, process workflows, timelines, and escalation protocols
· And other duties as assigned.
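To make the analytical work above concrete, here is a simplified, hypothetical Python/SQL sketch using an in-memory SQLite stand-in. The claims table, columns, and values are invented for illustration; real work would run against the MMIT/Norstella or IQVIA LAAD/Symphony SHA extracts named above:

    # claims_summary.py - illustrative only; the table, columns, and rows are hypothetical.
    import sqlite3
    import pandas as pd

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE claims (claim_id TEXT, payer TEXT, product TEXT,
                             service_month TEXT, paid_amount REAL);
        INSERT INTO claims VALUES
          ('C1', 'Payer A', 'Drug X', '2024-01', 120.0),
          ('C2', 'Payer A', 'Drug X', '2024-02', 130.0),
          ('C3', 'Payer B', 'Drug X', '2024-01',  95.0);
    """)

    # Claim counts and paid amounts by payer and month - the kind of cut that
    # would typically feed a Power BI or Tableau report.
    query = """
        SELECT payer,
               service_month,
               COUNT(*)         AS claim_count,
               SUM(paid_amount) AS total_paid
        FROM claims
        GROUP BY payer, service_month
        ORDER BY payer, service_month;
    """
    summary = pd.read_sql_query(query, conn)
    print(summary)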
Requirements:
· Bachelor’s Degree or relevant experience required
· 2-5 yrs. of professional experience in RWD analytics using SQL
· Fundamental understanding of the pharma and market access space
· Strong analysis skills and proficiency with tools such as Tableau or Power BI
· Excellent written and verbal communication skills.
· Analytical, critical thinking and creative problem-solving skills.
· Relationship building skills.
· Solid organizational skills including attention to detail and multitasking skills.
· Excellent time management and prioritization skills.


Job Description: React Native Developer
Experience: Over 4 years
Responsibilities:
- Architect, design, develop, and maintain complex, scalable React Native applications using clean code principles.
- Collaborate with designers to translate UI/UX mock-ups into pixel-perfect, native-feeling mobile interfaces.
- Leverage React Native's capabilities to build reusable UI components and implement performant animations.
- Effectively utilize native modules and APIs to achieve platform-specific functionalities when necessary.
- Write unit and integration tests to ensure code quality and maintainability.
- Identify and troubleshoot bugs, diagnose performance bottlenecks, and implement optimizations.
- Stay up to date with the latest trends and advancements in the React Native ecosystem.
- Participate in code reviews, provide mentorship to junior developers, and foster a collaborative development environment.
Qualifications:
- Experience in professional software development with a strong focus on mobile development.
- Proven experience building production-ready React Native applications.
- In-depth knowledge of React, JavaScript (ES6+), and related web technologies (HTML, CSS).
- Strong understanding of mobile development concepts and best practices.
- Experience with Redux or similar state management libraries for complex applications.
- Experience with unit testing frameworks (Jest, Mocha) and UI testing tools.
- Excellent communication, collaboration, and problem-solving skills.
- Ability to work independently and manage multiple tasks effectively.
- A passion for building high-quality, user-centric mobile applications.
Nice To Have:
- Experience with native development (iOS/Android) for deep integrations.
- Experience with containerization technologies (Docker, Kubernetes).
- Experience with continuous integration/continuous delivery (CI/CD) pipelines.
- Experience with GraphQL or RESTful APIs.

Responsibilities:
- Conduct thorough manual testing of software applications across various functionalities.
- Develop and maintain automated test scripts using the MSTest framework.
- Identify and report bugs and defects through a bug tracking system.
- Analyze test results, diagnose issues, and collaborate with developers to resolve them.
- Participate in code reviews to identify potential defects early in the development process.
- Stay up-to-date with the latest QA methodologies and best practices.
- Contribute to the improvement of existing testing processes and documentation.
- Work effectively within an Agile development environment.
- Clearly communicate test findings and recommendations to technical and non-technical audiences.
Qualifications:
- Proven experience in manual testing methodologies (e.g., black-box testing, exploratory testing).
- Expertise in developing and maintaining automated test scripts using MSTest.
- Strong understanding of software development lifecycle (SDLC) and Agile methodologies.
- Excellent analytical and problem-solving skills.
- Ability to prioritize tasks, manage time effectively, and meet deadlines.
- Strong written and verbal communication skills.
- Experience with API testing is a plus.
- Familiarity with other automation frameworks (e.g., Selenium) is a plus.




Responsibilities:
- Design, develop, and implement robust and efficient backend services using microservices architecture principles.
- Write clean, maintainable, and well-documented code using C# and the .NET framework.
- Develop and implement data access layers using Entity Framework.
- Utilize Azure DevOps for version control, continuous integration, and continuous delivery (CI/CD) pipelines.
- Design and manage databases on Azure SQL.
- Perform code reviews and participate in pair programming to ensure code quality.
- Troubleshoot and debug complex backend issues.
- Optimize backend performance and scalability to ensure a smooth user experience.
- Stay up-to-date with the latest advancements in backend technologies and cloud platforms.
- Collaborate effectively with frontend developers, product managers, and other stakeholders.
- Clearly communicate technical concepts to both technical and non-technical audiences.
Qualifications:
- Strong understanding of microservices architecture principles and best practices.
- In-depth knowledge of C# programming language and the .NET framework (ASP.NET MVC/Core, Web API).
- Experience working with Entity Framework for data access.
- Proficiency with Azure DevOps for CI/CD pipelines and version control (Git).
- Experience with Azure SQL for database design and management.
- Experience with unit testing and integration testing methodologies.
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Strong written and verbal communication skills.
- A passion for building high-quality, scalable, and secure software applications.


Responsibilities:
- Design, develop, and maintain highly interactive and responsive web applications using Angular 14.
- Write clean, maintainable, and well-documented code that adheres to best practices.
- Leverage Tailwind CSS for rapid UI development and ensure consistent design across the application.
- Implement user interface components and features according to design specifications.
- Integrate with backend APIs and services to retrieve and manipulate data.
- Optimize application performance for a smooth user experience across all devices and browsers.
- Conduct unit testing and participate in integration testing to ensure code quality.
- Collaborate with designers, backend engineers, product managers, and other stakeholders throughout the development lifecycle.
- Stay up-to-date with the latest advancements in frontend technologies and frameworks.
- Contribute to the improvement of existing frontend development processes and documentation.
Qualifications:
- Strong understanding of Angular 14 architecture, components, directives, and services.
- Proficiency with HTML5, CSS3, and JavaScript (ES6+).
- In-depth knowledge of Tailwind CSS and its utility classes for rapid UI development.
- Experience with responsive web design (RWD) principles and best practices.
- Understanding of web accessibility guidelines (WCAG).
- Familiarity with unit testing frameworks (e.g., Jasmine, Karma) is a plus.
- Experience with build tools (e.g., Webpack) is a plus.
- Excellent problem-solving and analytical skills.
- Strong attention to detail and a passion for creating pixel-perfect user interfaces.
- Excellent communication and collaboration skills.
- A proactive and results-oriented individual with a strong work ethic.

- Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components across ingestion, data modeling, querying, processing, storage, analysis, data integration, and implementation of enterprise-level Big Data systems.
- A skilled developer with strong problem-solving, debugging, and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala, and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming, and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including the NameNode, DataNode, ResourceManager, NodeManager, and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in installing, configuring, and managing Big Data workloads and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience coding MapReduce/YARN programs in Java, Scala, and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark with Scala on clusters for computational analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark jobs in Python using the DataFrame and Spark SQL APIs for faster data processing; handled importing data from different sources into HDFS using Sqoop and performing transformations using Hive and MapReduce.
- Used the Spark DataFrames API on the Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence and customer segmentation, and with keeping Spark Streaming jobs running smoothly.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for ingestion and aggregation of different events, loading consumer response data into Hive external tables at HDFS locations to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark with Python on clusters for computational analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, and in using Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them in Python (see the sketch after this list).
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Well experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying, and utilizing almost all of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development and with software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience importing data into HDFS using Sqoop and SFTP from various sources such as RDBMS, Teradata, mainframes, Oracle, and Netezza, and performing transformations on it using Hive, Pig, and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra, and MongoDB, and in their integration with Hadoop clusters.
- Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
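Purely as an illustration of the PySpark DataFrame and Spark SQL work described above (see the note in the DataFrames bullet), a minimal sketch might look like the following; the HDFS path, column names, and output table are hypothetical:

    # consumer_response_agg.py - illustrative PySpark sketch; paths, schema, and table names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (SparkSession.builder
             .appName("consumer-response-aggregation")
             .enableHiveSupport()   # assumes a configured Hive metastore
             .getOrCreate())

    # Hypothetical raw event data landed in HDFS (e.g. by Sqoop or Flume).
    events = spark.read.option("header", "true").csv("hdfs:///data/raw/consumer_events/")

    # DataFrame-style aggregation: responses per product per day.
    daily = (events
             .withColumn("event_date", F.to_date("event_ts"))
             .groupBy("product_id", "event_date")
             .agg(F.count("*").alias("responses")))

    # Equivalent aggregation expressed as Spark SQL over a temporary view.
    events.createOrReplaceTempView("consumer_events")
    daily_sql = spark.sql("""
        SELECT product_id, to_date(event_ts) AS event_date, COUNT(*) AS responses
        FROM consumer_events
        GROUP BY product_id, to_date(event_ts)
    """)

    # Persist to a Hive table that could serve as a feed for Tableau dashboards.
    daily.write.mode("overwrite").saveAsTable("analytics.daily_consumer_responses")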


We are looking to fill the role of Kubernetes engineer. To join our growing team, please review the list of responsibilities and qualifications.
Kubernetes Engineer Responsibilities
- Install, configure, and maintain Kubernetes clusters.
- Develop Kubernetes-based solutions.
- Improve Kubernetes infrastructure.
- Work with other engineers to troubleshoot Kubernetes issues (an illustrative triage script follows the requirements list below).
Kubernetes Engineer Requirements & Skills
- Kubernetes administration experience, including installation, configuration, and troubleshooting
- Kubernetes development experience
- Linux/Unix experience
- Strong analytical and problem-solving skills
- Excellent communication and interpersonal skills
- Ability to work independently and as part of a team
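As a hedged illustration of the troubleshooting work referenced above, the sketch below uses the official Kubernetes Python client to flag pods that look unhealthy. It assumes a reachable cluster, a local kubeconfig, and the kubernetes package; the namespace and restart threshold are arbitrary examples:

    # pod_triage.py - illustrative sketch; assumes the `kubernetes` package and a valid kubeconfig.
    from kubernetes import client, config

    def list_unhealthy_pods(namespace: str = "default") -> None:
        """Print pods that are not Running/Succeeded, or that restart frequently."""
        config.load_kube_config()  # use config.load_incluster_config() when running in-cluster
        core = client.CoreV1Api()
        for pod in core.list_namespaced_pod(namespace).items:
            phase = pod.status.phase
            restarts = sum(cs.restart_count for cs in (pod.status.container_statuses or []))
            if phase not in ("Running", "Succeeded") or restarts > 5:
                print(f"{pod.metadata.name}: phase={phase}, restarts={restarts}")

    if __name__ == "__main__":
        list_unhealthy_pods("default")

The same information is available from kubectl get pods; a script like this is simply a starting point for automating routine triage.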

Similar companies
About the company
We are the fastest growing all-in-one platform for SMBs and digital marketing agencies. We offer services related to CRM, Email, 2-way SMS, phone systems, Facebook, Instagram, WhatsApp, Email marketing, Social media posting, Websites, Funnel Builder, WordPress hosting & more!
We have a very strong and independent team. We value tinkerers and people with an entrepreneurial spirit. We want people to come to work and explore their curiosity every day. Our growth offers a unique opportunity for the right individual to scale and build world class products.
About HighLevel:
HighLevel is a cloud-based, all-in-one white-label marketing and sales platform that empowers marketing agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth. With a focus on streamlining marketing efforts and providing comprehensive solutions, HighLevel helps businesses of all sizes achieve their marketing goals. We currently have 1000+ employees across 15 countries, working remotely as well as in our headquarters, which is located in Dallas, Texas. Our goal as an employer is to maintain a strong company culture, foster creativity and collaboration, and encourage a healthy work-life balance for our employees wherever they call home.
Our Customers:
HighLevel serves a diverse customer base, including over 60K agencies & entrepreneurs and 500K businesses globally. Our customers range from small and medium-sized businesses to enterprises, spanning various industries and sectors.
Scale at HighLevel:
We work at scale; our infrastructure handles around 3 Billion+ API hits & 2 Billion+ message events monthly and over 25M views of customer pages daily. We also handle over 80 Terabytes of data across 5 Databases.
About the Team:
Currently we have millions of sales funnels, websites, attributions, forms, and survey tools for lead generation. Our B2B customers use these tools to bring leads into the HighLevel CRM system. We are working to continuously improve the functionality of these tools to solve our customers’ business needs. In this role, you will be expected to be autonomous, guide other developers who might need technical help, and collaborate with other technical teams, product, support, and customer success.
Some of the perks we offer:
- 100% remote
- Uncapped leave policy
- WFH setup
- Champion big problems
About the company
About AiSensy
AiSensy is a WhatsApp-based Marketing & Engagement platform helping businesses like Skullcandy, Vivo, Rentomojo, Physicswallah, and Cosco grow their revenues via WhatsApp.
- Enabling 100,000+ businesses with WhatsApp engagement & marketing
- 400 crore+ WhatsApp messages exchanged between businesses and users via AiSensy
- Working with top brands like Skullcandy, Vivo, Rentomojo, Physicswallah & more
- High impact, as businesses drive 25-80% of their revenues using the AiSensy platform
- Mission-Driven and Growth Stage Startup backed by Marsshot.vc, Bluelotus.vc & 50+ Angel Investors
About the company
The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains.
With offices in the US, India, the UK, Australia, Mexico, and Canada, we offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation.
Leveraging our multi-site operations in the USA and India and availability of world-class infrastructure, we offer a combination of on-site, off-site and offshore service models. Our technical competencies, proactive management approach, proven methodologies, committed support and the ability to quickly react to urgent needs make us a valued partner for any kind of Digital Enablement Services, Managed Services, or Business Services.
We believe that the technology and thought leadership that we command in the industry is the direct result of the kind of people we have been able to attract, to form this organization (you are one of them!).
Our workforce consists of 1000+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like MIT, Wharton, IITs, IIMs, and BITS and with rich work experience in some of the biggest companies in the world.
Wissen Technology has been certified as a Great Place to Work®. The technology and thought leadership that the company commands in the industry is the direct result of the kind of people Wissen has been able to attract. Wissen is committed to providing them the best possible opportunities and careers, which extends to providing the best possible experience and value to our clients.
About the company
Welcome to Neogencode Technologies, an IT services and consulting firm that provides innovative solutions to help businesses achieve their goals. Our team of experienced professionals is committed to providing tailored services to meet the specific needs of each client. Our comprehensive range of services includes software development, web design and development, mobile app development, cloud computing, cybersecurity, digital marketing, and skilled resource acquisition. We specialize in helping our clients find the right skilled resources to meet their unique business needs. At Neogencode Technologies, we prioritize communication and collaboration with our clients, striving to understand their unique challenges and provide customized solutions that exceed their expectations. We value long-term partnerships with our clients and are committed to delivering exceptional service at every stage of the engagement. Whether you are a small business looking to improve your processes or a large enterprise seeking to stay ahead of the competition, Neogencode Technologies has the expertise and experience to help you succeed. Contact us today to learn more about how we can support your business growth and provide skilled resources to meet your business needs.
About the company
Gruve was founded on the premise that new technologies in Machine Learning, Data Sciences, Artificial Intelligence, and Software Development are transforming Enterprise Services. Our goal is to harness these advancements to deliver services with superior efficiency and tangible outcomes.
Our Team
Our team is built with a strong background in Software and Services, united by a shared sense of Purpose: to achieve the best outcomes for our clients. We value all our stakeholders, recognizing that People are our most important assets. We adopt a Process framework that ensures the delivery of high-quality results every time.
What Sets Us Apart
Our differentiation is straightforward: we genuinely care, we innovate, we disrupt, and we work hard.
Our Core Values:
Customer Success: Putting customers first.
Positive Feedback Loop: Embracing continuous improvement.
Pursuit & Persevere: Staying resilient and ambitious.
Integrity and Ethics: Acting with honesty and ethics.
Team & Trust: Collaborating with trust and respect.
Giving Back: Committing to community and responsibility.
Gruve is Norwegian for "To Mine or Mining Activity"