11+ QoS Jobs in Mumbai | QoS Job openings in Mumbai
Apply to 11+ QoS Jobs in Mumbai on CutShort.io. Explore the latest QoS Job opportunities across top companies like Google, Amazon & Adobe.



Job Description
We are looking for an experienced engineer to join our data science team to help us design, develop, and deploy machine learning models in production. You will develop robust models and prepare their deployment into production in a controlled manner, while providing appropriate means to monitor their performance and stability after deployment.
What you'll do includes (but is not limited to):
- Preparing datasets needed to train and validate our machine learning models
- Anticipating and building solutions for problems that interrupt availability, performance, and stability in our systems, services, and products at scale
- Defining and implementing metrics to evaluate the performance of the models, both for computing performance (such as CPU and memory usage) and for ML performance (such as precision, recall, and F1; see the sketch after this list)
- Supporting the deployment of machine learning models on our infrastructure, including containerization, instrumentation, and versioning
- Supporting the whole lifecycle of our machine learning models, including gathering data for retraining, A/B testing, and redeployments
- Developing, testing, and evaluating tools for machine learning model deployment, monitoring, and retraining
- Working closely within a distributed team to analyze and apply innovative solutions over billions of documents
- Supporting solutions ranging from rule-based systems and classical ML techniques to the latest deep learning systems
- Partnering with cross-functional team members to bring large scale data engineering solutions to production
- Communicating your approach and results to a wider audience through presentations
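A minimal sketch of the ML-performance side of the metrics point above, assuming scikit-learn is available; the label arrays are hypothetical placeholders for labels and predictions collected from production traffic:

    # Minimal sketch: precision, recall, and F1 for a deployed classifier.
    # Assumes scikit-learn; y_true / y_pred are hypothetical placeholders.
    from sklearn.metrics import precision_score, recall_score, f1_score

    y_true = [1, 0, 1, 1, 0, 1, 0, 0]   # ground-truth labels
    y_pred = [1, 0, 1, 0, 0, 1, 1, 0]   # model predictions

    metrics = {
        "precision": precision_score(y_true, y_pred),
        "recall": recall_score(y_true, y_pred),
        "f1": f1_score(y_true, y_pred),
    }
    print(metrics)  # e.g. push these values to a monitoring/alerting pipeline

The computing-performance side (CPU and memory usage) would typically come from the serving infrastructure rather than from the model code itself.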
Your Qualifications:
- Demonstrated success with machine learning in a SaaS or Cloud environment, with hands-on knowledge of model creation and deployments in production at scale
- Good knowledge of traditional machine learning methods and neural networks
- Experience with practical machine learning modeling, especially on time-series forecasting, analysis, and causal inference.
- Experience with data mining algorithms and statistical modeling techniques for anomaly detection in time series, such as clustering, classification, ARIMA, and decision trees, is preferred (a minimal ARIMA-based sketch follows this list).
- Ability to implement data import, cleansing and transformation functions at scale
- Fluency in Docker, Kubernetes
- Working knowledge of relational and dimensional data models with appropriate visualization techniques such as PCA.
- Solid English skills to effectively communicate with other team members
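A minimal sketch of the ARIMA-based anomaly detection mentioned above, assuming statsmodels and numpy; the synthetic series, the (1, 0, 1) order, and the 3-sigma threshold are hypothetical placeholders:

    # Minimal sketch: flag points whose ARIMA residual exceeds 3 standard deviations.
    # Assumes statsmodels and numpy; the series and model order are hypothetical.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(0)
    series = np.sin(np.linspace(0, 20, 200)) + rng.normal(0, 0.1, 200)
    series[120] += 2.0                      # injected anomaly

    result = ARIMA(series, order=(1, 0, 1)).fit()
    residuals = result.resid
    threshold = 3 * residuals.std()
    anomalies = np.where(np.abs(residuals) > threshold)[0]
    print("anomalous indices:", anomalies)   # should include index 120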
Due to the nature of the role, it would be nice if you have also:
- Experience with large datasets and distributed computing, especially with the Google Cloud Platform
- Fluency in at least one deep learning framework: PyTorch, TensorFlow / Keras
- Experience with NoSQL and Graph databases
- Experience working in a Colab, Jupyter, or Python notebook environment
- Some experience with monitoring, analysis, and alerting tools like New Relic, Prometheus, and the ELK stack
- Knowledge of the Java, Scala, or Go programming languages
- Familiarity with KubeFlow
- Experience with transformers, for example the Hugging Face libraries (see the sketch after this list)
- Experience with OpenCV
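A minimal sketch of the Hugging Face point above, assuming the transformers package is installed; the checkpoint name and input text are hypothetical placeholders rather than a recommendation for this role:

    # Minimal sketch: running inference with a Hugging Face pipeline.
    # Assumes the transformers package; the checkpoint and input are placeholders.
    from transformers import pipeline

    classifier = pipeline("sentiment-analysis",
                          model="distilbert-base-uncased-finetuned-sst-2-english")
    print(classifier("Document classification at scale is fun."))
    # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]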
About Egnyte
In a content-critical age, Egnyte fuels business growth by enabling content-rich business processes, while also providing organizations with visibility and control over their content assets. Egnyte's cloud-native content services platform leverages the industry's leading content intelligence engine to deliver a simple, secure, and vendor-neutral foundation for managing enterprise content across business applications and storage repositories. More than 16,000 customers trust Egnyte to enhance employee productivity, automate data management, and reduce file-sharing cost and complexity. Investors include Google Ventures, Kleiner Perkins Caufield & Byers, and Goldman Sachs. For more information, visit www.egnyte.com.
#LI-Remote
Key Responsibilities:
• Install, configure, and maintain Hadoop clusters.
• Monitor cluster performance and ensure high availability.
• Manage Hadoop ecosystem components (HDFS, YARN, Ozone, Spark, Kudu, Hive).
• Perform routine cluster maintenance and troubleshooting.
• Implement and manage security and data governance.
• Monitor system health and optimize performance (see the sketch after this list).
• Collaborate with cross-functional teams to support big data applications.
• Perform Linux administration tasks and manage system configurations.
• Ensure data integrity and backup procedures.
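A minimal sketch of the cluster-health monitoring responsibility above, assuming Python with the requests library and the default Hadoop 3.x web ports; the hostnames are hypothetical placeholders:

    # Minimal sketch: polling cluster health over the standard Hadoop REST APIs.
    # Assumes the requests library; hostnames and ports (9870 for the NameNode UI,
    # 8088 for the ResourceManager) are hypothetical placeholders.
    import requests

    NAMENODE = "http://namenode.example.com:9870"
    RESOURCE_MANAGER = "http://resourcemanager.example.com:8088"

    # HDFS capacity and missing-block counts from the NameNode JMX endpoint
    jmx = requests.get(f"{NAMENODE}/jmx",
                       params={"qry": "Hadoop:service=NameNode,name=FSNamesystem"},
                       timeout=10).json()
    fs = jmx["beans"][0]
    print("capacity remaining (bytes):", fs["CapacityRemaining"])
    print("missing blocks:", fs["MissingBlocks"])

    # Aggregate node and application metrics from the YARN ResourceManager REST API
    metrics = requests.get(f"{RESOURCE_MANAGER}/ws/v1/cluster/metrics",
                           timeout=10).json()["clusterMetrics"]
    print("active nodes:", metrics["activeNodes"],
          "| apps running:", metrics["appsRunning"])

In practice these checks would feed an alerting system rather than print statements.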

PHP Developer Responsibilities:
- Conducting analysis of website and application requirements.
- Writing back-end code and building efficient PHP modules.
- Developing back-end portals with an optimized database.
- Troubleshooting application and code issues.
- Integrating data storage solutions.
- Responding to integration requests from front-end developers.
- Finalizing back-end features and testing web applications.
- Updating and altering application features to enhance performance.
PHP Developer Requirements:
- Bachelor’s degree in computer science or a similar field.
- Knowledge of PHP web frameworks (including Laravel), plus MySQL, API development, OAuth, JWT, and Git (a minimal JWT sketch follows this list).
- Understanding of object-oriented PHP programming.
- Previous experience creating scalable applications.
- Proficient with code versioning tools including Git.
- Familiarity with MySQL databases.
- Ability to manage projects.
- Good problem-solving skills.
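The OAuth/JWT requirement above is language-agnostic; here is a minimal sketch of issuing and verifying a token, written in Python with PyJWT purely for illustration (the role itself would do the equivalent in PHP, and the secret, claims, and expiry are hypothetical placeholders):

    # Minimal sketch: issuing and verifying a signed JWT.
    # Assumes the PyJWT package; secret, claims, and expiry are placeholders.
    import datetime
    import jwt  # PyJWT

    SECRET = "change-me"  # hypothetical signing key

    token = jwt.encode(
        {"sub": "user-123",
         "exp": datetime.datetime.utcnow() + datetime.timedelta(hours=1)},
        SECRET,
        algorithm="HS256",
    )

    claims = jwt.decode(token, SECRET, algorithms=["HS256"])
    print(claims["sub"])  # "user-123" if the signature and expiry check out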
Company link: https://www.catalystmi.com/
Job Location: Mumbai (Hybrid)
Job Title: Fullstack AEM Developer
Location: Mumbai, India
Experience: 4+ years
Overview:
We are seeking a talented Fullstack AEM (Adobe Experience Manager) Developer with proficiency in React or Angular frameworks to join our team in Mumbai. The ideal candidate should have a solid understanding of front-end and back-end development, with a focus on implementing and maintaining AEM-based solutions. This role requires strong communication skills and the ability to collaborate effectively with cross-functional teams.
Responsibilities:
- Develop, implement, and maintain AEM-based web applications and components.
- Collaborate with cross-functional teams to gather requirements and translate them into technical solutions.
- Design and develop custom AEM components, workflows, and templates.
- Integrate AEM with other systems and third-party applications as needed.
- Optimize application performance and ensure scalability and reliability.
- Stay updated with the latest AEM features, trends, and best practices.
- Implement responsive design principles to ensure a seamless user experience across devices.
- Troubleshoot and debug issues related to AEM implementation and integration.
- Write clean, well-documented code and ensure adherence to coding standards.
- Participate in code reviews and provide constructive feedback to peers.
- Mentor junior developers and share knowledge with the team.
- Collaborate with UX/UI designers to create visually appealing and user-friendly interfaces.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 4+ years of experience in full-stack development, with a focus on AEM.
- Strong proficiency in Adobe Experience Manager (AEM) and associated technologies (AEM Sites, AEM Assets, etc.).
- Experience with front-end technologies such as React or Angular.
- Proficiency in HTML5, CSS3, JavaScript, and related frameworks/libraries.
- Experience with server-side languages such as Java.
- Knowledge of RESTful web services and APIs.
- Familiarity with version control systems such as Git.
- Experience with agile development methodologies.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.
- Ability to work effectively in a collaborative team environment.
- Experience with cloud platforms (e.g., AWS, Azure) is a plus.
- Relevant Adobe certifications (e.g., Adobe Certified Expert) are desirable but not required.
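A minimal sketch of the RESTful-integration point above, assuming an AEM instance with the Sling Model Exporter enabled and anonymous read access; the host and content path are hypothetical placeholders, and the example uses Python requests purely for illustration:

    # Minimal sketch: pulling page content from an AEM JSON endpoint over REST.
    # Assumes the Sling Model Exporter is enabled; host and path are placeholders.
    import requests

    AEM_HOST = "https://aem.example.com"
    PAGE_PATH = "/content/my-site/en/home"

    resp = requests.get(f"{AEM_HOST}{PAGE_PATH}.model.json", timeout=10)
    resp.raise_for_status()
    model = resp.json()
    print(model.get(":type"), "-", model.get("title"))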
We have an opening for an Area Sales Manager - Mumbai for medical devices, preferably Anaesthesia Products.

Key responsibility areas:
- Work closely with Project Managers and other members of the Development Team to both develop detailed specification documents with clear project deliverables and timelines, and to ensure timely completion of deliverables.
- Attend client meetings during the sales process and during development.
- Work with clients and Project Managers to build and refine graphic designs for websites.
- Convert raw images and layouts from a graphic designer into CSS/XHTML themes.
- Determine appropriate architecture and other technical solutions, and make relevant recommendations to clients.
- Communicate any progress and/or delays to the Project Manager efficiently and accurately.
- Engage in outside-the-box thinking to provide high value-of-service to clients. Alert colleagues to emerging technologies or applications and the opportunities to integrate them into operations and activities.
- Be actively involved in and contribute regularly to the development community of the CMS of your choice. Develop innovative, reusable Web-based tools for activism and community building.
Database Administrator Lead - PostgreSQL
ABOUT Ashnik
Established in 2009, Ashnik is a leading open-source solutions and consulting company in South East Asia and India, headquartered in Singapore. We enable digital transformation for large enterprises through our design, architecting, and solution skills. Over 100 large enterprises in the region have acknowledged our expertise in delivering solutions using key open-source technologies. Our offerings form a critical part of digital transformation, big data platforms, cloud and web acceleration, and IT modernization. We represent EDB, Pentaho, Docker, Couchbase, MongoDB, Elastic, NGINX, Sysdig, Redis Labs, Confluent, and HashiCorp as their key partners in the region. Our team members bring decades of experience in delivering confidence to enterprises in adopting open-source software and are known for their thought leadership.
THE POSITION
Ashnik is looking for talented and passionate people to be part of the team for an upcoming project at a client location.
RESPONSIBILITIES
· Monitoring database performance
· Optimizing queries and handling escalations
· Analysing and assessing the impact and risk of low- to medium-risk changes on high-profile production databases
· Implementing security features
· DR implementation and switchover
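A minimal sketch of the monitoring and escalation-handling work above, assuming Python with psycopg2 and a reachable PostgreSQL instance; the connection string and the 5-minute threshold are hypothetical placeholders:

    # Minimal sketch: list long-running queries from pg_stat_activity.
    # Assumes psycopg2; the DSN and threshold are hypothetical placeholders.
    import psycopg2

    conn = psycopg2.connect("dbname=appdb user=monitor host=db.example.com")
    with conn, conn.cursor() as cur:
        cur.execute("""
            SELECT pid, now() - query_start AS runtime, state, query
            FROM pg_stat_activity
            WHERE state <> 'idle'
              AND now() - query_start > interval '5 minutes'
            ORDER BY runtime DESC
        """)
        for pid, runtime, state, query in cur.fetchall():
            print(pid, runtime, state, query[:80])
    conn.close()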
QUALIFICATION AND EXPERIENCE
· Preferably 4+ years of working experience on production PostgreSQL databases
· Experience of working in a production support environment
· Engineering or Equivalent degree
· Passion for open-source technologies is desired
ADDITIONAL SKILLS
· Install and configure PostgreSQL and EnterpriseDB
· Technical capability with PostgreSQL 9.x, 10.x, 11.x
· Server tuning
· Troubleshooting of Database issues
· Linux Shell Scripting
· Install, configure, and maintain failover mechanisms
· Backup and restoration, point-in-time database recovery
· A demonstrable ability to articulate and sell the benefits of modern platforms, software and technologies.
· A real passion for being curious and a continuous learner. You are someone that invests in yourself as much as you invest in your professional relationships.
LOCATION: Bangalore & Mumbai
Experience: 7+ years
Package: up to 20 LPA

The Data Engineering team is one of the core technology teams of Lumiq.ai and is responsible for creating all the Data related products and platforms which scale for any amount of data, users, and processing. The team also interacts with our customers to work out solutions, create technical architectures and deliver the products and solutions.
If you are someone who is always pondering how to make things better, how technologies can interact, how various tools, technologies, and concepts can help a customer or how a customer can use our products, then Lumiq is the place of opportunities.
Who are you?
- Enthusiast is your middle name. You know what’s new in Big Data technologies and how things are moving
- Apache is your toolbox and you have been a contributor to open source projects or have discussed the problems with the community on several occasions
- You use cloud for more than just provisioning a Virtual Machine
- Vim is friendly to you and you know how to exit Nano
- You check logs before screaming about an error
- You are a solid engineer who writes modular code and commits in Git
- You are a doer who doesn’t say “no” without first understanding
- You understand the value of documentation of your work
- You are familiar with the machine learning ecosystem and how you can help your fellow data scientists explore data and create production-ready ML pipelines
Eligibility
Experience
- At least 2 years of Data Engineering Experience
- Have interacted with Customers
Must Have Skills
- Amazon Web Services (AWS) - EMR, Glue, S3, RDS, EC2, Lambda, SQS, SES
- Apache Spark
- Python
- Scala
- PostgreSQL
- Git
- Linux
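A minimal sketch tying together a few of the must-have skills above (Apache Spark, S3, PostgreSQL), assuming a Spark runtime with the S3A connector and the PostgreSQL JDBC driver on the classpath; the bucket, table, and connection details are hypothetical placeholders:

    # Minimal sketch: read Parquet from S3 with PySpark and write to PostgreSQL.
    # Assumes the S3A connector and PostgreSQL JDBC driver are available;
    # bucket, table, and credentials are hypothetical placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("s3-to-postgres-sketch").getOrCreate()

    events = spark.read.parquet("s3a://example-bucket/events/2024-01-01/")
    daily = events.groupBy("customer_id").count()

    (daily.write
          .format("jdbc")
          .option("url", "jdbc:postgresql://db.example.com:5432/analytics")
          .option("dbtable", "daily_event_counts")
          .option("user", "etl_user")
          .option("password", "change-me")
          .mode("overwrite")
          .save())

    spark.stop()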
Good to have Skills
- Apache NiFi
- Apache Kafka
- Apache Hive
- Docker
- Amazon Certification

- 3-8+ years of experience programming in a backend language (Java / Python), with a good understanding of troubleshooting errors.
- 5+ years of experience in Kafka / 3+ years of experience in Confluent Kafka
- Cloud Kafka, Control Center, REST Proxy, HAProxy, Confluent Kafka Connect, Confluent Kafka security features
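A minimal sketch of producing to Kafka with the Confluent Python client, assuming the confluent-kafka package and a reachable broker; the bootstrap address, topic, and payload are hypothetical placeholders:

    # Minimal sketch: produce a message with the Confluent Kafka Python client.
    # Assumes the confluent-kafka package; broker, topic, and payload are placeholders.
    from confluent_kafka import Producer

    producer = Producer({"bootstrap.servers": "broker.example.com:9092"})

    def on_delivery(err, msg):
        # Called once per message after the broker acknowledges (or rejects) it.
        if err is not None:
            print("delivery failed:", err)
        else:
            print(f"delivered to {msg.topic()} [partition {msg.partition()}]")

    producer.produce("orders", key="order-123", value=b'{"qty": 2}',
                     callback=on_delivery)
    producer.flush()  # block until all queued messages are delivered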
