11+ Quality control Jobs in Hyderabad | Quality control Job openings in Hyderabad
Apply to 11+ Quality control Jobs in Hyderabad on CutShort.io. Explore the latest Quality control Job opportunities across top companies like Google, Amazon & Adobe.
Roles and Responsibilities:
- Develop and implement quality management systems.
- Conduct quality audits and analyses to ensure compliance with ISO standards.
- Establish and maintain quality control standards and procedures.
- Develop and implement checklists for quality documentation and inspection.
- Collaborate with cross-functional teams to identify areas for improvement in renewable energy projects.
- Evaluate the efficiency of existing quality systems and recommend improvements.
- Ensure timely completion of all quality-related tasks within budget constraints.
- Provide technical support to resolve quality-related issues during project execution.
- Oversee and manage high-capacity solar projects as the project head.
- Lead and manage teams efficiently to achieve project goals.
Job Title : Power Automate Developer
Experience Level : 4 to 10 Years
Location : PAN India
Work Mode : Hybrid
Notice Period : Immediate Joiners Only
Job Summary:
Capgemini is looking for experienced Power Automate Developers with a strong background in automating business processes using Microsoft Power Platform tools.
Candidates must have hands-on experience and relevant certifications (PL-500 or PL-900).
Mandatory Skills : 4+ Years of hands-on experience with Power Automate, PL-500/PL-900 certification, workflow automation, system integration, and low-code/no-code development.
Key Responsibilities :
- Develop, implement, and support Power Automate workflows to streamline business processes.
- Integrate Power Automate with Microsoft 365, Dynamics 365, SharePoint, and other third-party services.
- Collaborate with business analysts and stakeholders to understand requirements and translate them into scalable automation solutions.
- Troubleshoot and optimize existing workflows to enhance performance and reliability.
- Ensure solutions comply with best practices, security policies, and performance standards.
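A common way to integrate Power Automate with external systems is the "When an HTTP request is received" trigger, which gives a flow a POST URL. A minimal Python sketch of invoking such a flow is below; the flow URL, payload fields, and helper name are illustrative, not taken from the posting:

```python
import json
import urllib.request

def build_flow_request(flow_url: str, payload: dict) -> urllib.request.Request:
    """Build a POST request for an HTTP-triggered Power Automate flow.

    The flow URL (with its SAS signature) comes from the trigger's
    "HTTP POST URL" field; the payload must match the trigger's JSON schema.
    Both are placeholders here.
    """
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        flow_url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical flow URL and payload, for illustration only.
req = build_flow_request(
    "https://prod-00.westus.logic.azure.com/workflows/abc/triggers/manual/paths/invoke?sig=placeholder",
    {"invoiceId": "INV-1001", "status": "approved"},
)
# urllib.request.urlopen(req) would fire the flow.
```

Keeping request construction separate from sending makes the integration easy to unit-test without calling the live flow.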

An American bank holding company: a community-focused financial institution that provides accessible banking services to its members, operating on a not-for-profit basis.
Position: AI/ML Python Engineer
Location: Kothapet, Hyderabad (Hybrid, 4 days a week onsite)
Contract-to-hire (full-time with the client).
5+ years of Python experience scripting ML workflows to deploy ML pipelines as real-time, batch, event-triggered, and edge deployments.
4+ years of experience using AWS SageMaker to deploy ML pipelines and ML models, including SageMaker Pipelines, SageMaker MLflow, SageMaker Feature Store, etc.
3+ years of experience developing APIs using FastAPI, Flask, or Django.
3+ years of experience with ML frameworks and tools such as scikit-learn, PyTorch, XGBoost, LightGBM, and MLflow.
Solid understanding of ML lifecycle: model development, training, validation, deployment and monitoring
Solid understanding of CI/CD pipelines for ML workflows using Bitbucket, Jenkins, Nexus, and AUTOSYS for scheduling.
Experience with ETL processes for ML pipelines using PySpark, Kafka, and AWS EMR Serverless.
Good to have: experience with H2O.ai.
Good to have: experience with containerization using Docker and orchestration using Kubernetes.
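The deployment modes listed above (real-time, batch, event-triggered) typically share one entry point that routes an incoming event to the right pipeline. A stdlib-only sketch of that pattern follows, with the actual SageMaker calls stubbed out as comments; the function name and event fields are illustrative assumptions, not from the posting:

```python
def route_ml_event(event: dict) -> str:
    """Dispatch an incoming event to the matching ML deployment mode.

    In a real deployment each branch would call SageMaker (invoke an
    endpoint, start a batch transform, or start a pipeline execution);
    here they are stubbed so the routing logic stands alone.
    """
    mode = event.get("mode")
    if mode == "realtime":
        # e.g. sagemaker-runtime invoke_endpoint(...)
        return f"invoked endpoint for record {event['record_id']}"
    if mode == "batch":
        # e.g. start a SageMaker batch transform over an S3 prefix
        return f"started batch transform over {event['s3_prefix']}"
    if mode == "event":
        # e.g. an EventBridge-triggered SageMaker Pipeline execution
        return "started pipeline execution"
    raise ValueError(f"unknown mode: {mode!r}")
```

Keeping the routing pure makes it unit-testable without AWS credentials or a live endpoint.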
Job Summary:
The Senior Forensic Analyst has strong technical skills and an eagerness to lead projects and work with our clients. Apply Incident Response, forensics, log analysis, and malware triage skills to solve complex intrusion cases at organizations around the world. Our consultants must be comfortable working in teams to tackle challenging projects, communicating with clients, and creating and presenting high-quality deliverables.
ROLES AND RESPONSIBILITIES
· Investigate breaches leveraging forensic tools including EnCase, FTK, X-Ways, SIFT, Splunk, and custom investigation tools to determine the source of compromises and malicious activity that occurred in client environments. The candidate should be able to perform forensic analysis on:
· Host-based systems such as Windows, Linux, and Mac OS X
· Firewall, web, database, and other log sources to identify evidence and artifacts of malicious and compromised activity
· Cloud-based platforms such as Office 365, Google, Azure, AWS, etc.
· Perform analysis on identified malicious artifacts
· Contribute to the curation of threat intelligence related to breach investigations
· Communicate clearly, both verbally and in writing, and present technical findings to audiences of varying technical expertise
· Be responsible for integrity in analysis, quality in client deliverables, as well as gathering caseload intelligence.
· Responsible for developing the forensic report for breach investigations related to ransomware, data theft, and other misconduct investigations.
· Must also be able to manage multiple projects daily.
· Manage junior analysts and/or external consultants providing investigative support
· Act as the most senior forensic analyst: assist staff, review all forensic work product to ensure consistency and accuracy, and provide support based on workload or complexity of matters
· Ability to analyze workflow, processes, tools, and procedures to create further efficiency in forensic investigations
· Ability to work more than 40 hours per week as needed
DISCLAIMER: The above statements are intended to describe the general nature and level of work being performed. They are not intended to be an exhaustive list of all responsibilities, duties, and skills required of personnel so classified.
SKILLS AND KNOWLEDGE
· Proficient with host-based forensics, network forensics, malware analysis, and data breach response
· Experienced with EnCase, Axiom, X-Ways, FTK, SIFT, ELK, Redline, Volatility, and open-source forensic tools
· Experience with a common scripting or programming language, including Perl, Python, Bash, or PowerShell
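Scripting work in this role often starts with evidence triage. A minimal Python sketch of the kind of task involved, hashing files for an evidence manifest, is below; the function names and algorithm choice are illustrative:

```python
import hashlib
from pathlib import Path

def hash_file(path: Path, algorithm: str = "sha256", chunk_size: int = 65536) -> str:
    """Compute a file hash in chunks so large evidence files fit in memory."""
    digest = hashlib.new(algorithm)
    with path.open("rb") as fh:
        while chunk := fh.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def inventory(root: Path) -> dict[str, str]:
    """Map every file under root to its hash, forming a triage manifest
    that can later be compared against known-good or known-bad hash sets."""
    return {str(p): hash_file(p) for p in sorted(root.rglob("*")) if p.is_file()}
```

A manifest like this is typically the first artifact produced before deeper analysis in tools such as EnCase or X-Ways.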
JOB REQUIREMENTS
· Must have at least 5 years of incident response or digital forensics experience and a passion for cybersecurity
· Consulting experience preferred.
WORK ENVIRONMENT
While performing the responsibilities of this position, the work environment characteristics listed below are representative of the environment the employee will encounter: Usual office working conditions. Reasonable accommodations may be made to enable people with disabilities to perform the essential functions of this job.
PHYSICAL DEMANDS
· No physical exertion is required.
· Travel within or outside of the state.
· Light work: exerting up to 20 pounds of force occasionally, and/or up to 10 pounds of force as frequently as needed to move objects.
DevOps & Automation:
- Experience with CI/CD tools such as Azure DevOps (YAML pipelines), Git, and GitHub. Capable of automating build, test, and deployment processes to streamline application delivery.
- Hands-on experience with Infrastructure as Code (IaC) tools such as Bicep (preferred), Terraform, Ansible, and ARM Templates.
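A pipeline combining the CI/CD and IaC skills above is usually defined in an `azure-pipelines.yml`. A minimal sketch follows; the service connection name, resource group, build commands, and Bicep file path are placeholders:

```yaml
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

steps:
  - script: dotnet build --configuration Release
    displayName: Build
  - script: dotnet test --configuration Release
    displayName: Test
  - task: AzureCLI@2
    displayName: Deploy infrastructure (Bicep)
    inputs:
      azureSubscription: my-service-connection   # placeholder service connection
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        az deployment group create \
          --resource-group my-rg \
          --template-file infra/main.bicep
```

Real pipelines typically split build and deploy into stages with approvals, but the shape is the same.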
Cloud Services & Architecture:
- Experience in Azure Cloud services, including Web Apps, AKS, Application Gateway, APIM, and Logic Apps.
- Good understanding of cloud design patterns, security best practices, and cost optimization strategies.
Scripting & Automation:
- Experience in developing and maintaining automation scripts using PowerShell to manage, monitor, and support applications.
- Familiar with Azure CLI, REST APIs, and automating workflows using Azure DevOps Pipelines.
Data Integration & ADF:
- Working knowledge or basic hands-on experience with Azure Data Factory (ADF), focusing on developing and managing data pipelines and workflows.
- Knowledge of data integration practices, including ETL/ELT processes and data transformations.
Application Management & Monitoring:
- Ability to provide comprehensive support for both new and legacy applications.
- Proficient in managing and monitoring application performance using tools like Azure Monitor, Log Analytics, and Application Insights.
- Understanding of application security principles and best practices.
Database Skills:
- Basic experience with SQL and Azure SQL, including database backups, restores, and application data management.
Key Responsibilities:
- Manage on-time and on-budget delivery with planned profitability, consistently looking to improve quality and profitability
- Lead the onsite project teams and ensure they understand the client environment
- Responsible for Backlog growth - Existing projects + Renewals/Extensions of current projects + Rate Revisions
- Responsible for driving RFPs / Proactive bids
- Build senior and strategic relationships through delivery excellence
- Understand the client environment, issues, and priorities
- Serve as the day-to-day point of contact for the clients
Greetings! We are looking for a Product Manager for our data modernization product. Candidates need good knowledge of Big Data/DWH and should have strong stakeholder management and presentation skills.
Who you are:
We are looking for experienced, energetic Demo Jockeys who can represent the 500apps brand well and engage our prospective customers, in partnership with the sales team, in ways that are meaningful for both customers and 500apps. Ideal candidates will share our passion to learn and master the apps so that customers appreciate 500apps in their own situation and context. The successful candidate will be business savvy, detail oriented, and able to build effective demos to suit a prospective customer's use case. She/he will have relevant experience in a SaaS / product-based company.
What you will be doing (responsibilities):
1. Work as part of the Sales POD in a matrixed environment to close deals.
2. Pull together effective demos using the necessary apps for the desired outcomes.
3. Keep an eye on the sales pipeline and build / customize demos as needed prior to client meetings, in partnership with the sales team.
4. Gather feedback from customers after demonstrations and provide it to the Product team for potential enhancements / tweaks to fit the customer's environment.
5. Be available / flexible for demos as per the assigned shift / region.
6. Build and maintain a repository of ready-to-use demos for the most frequent use cases.
7. Create and submit necessary reports.
8. Be the product champion / evangelist for 500apps.com
9. Stay on top of updates in current apps and new apps being launched - explore new app combinations for enhanced results.
Skills, Qualifications, Requirements:
- Clear and articulate communicator in English, able to effectively engage international customers.
- Ability to learn and apply apps standalone as well as in combination with other apps to deliver enhanced results for customer use cases.
- Ideal candidate has a broad understanding of industries, domains, functions and disciplines to be able to effectively and quickly apply the apps in appropriate scenarios.
- You will ideally have a combination of both Product (Computer Sciences) and Business education and experience.
- At least 3 years of relevant experience as Demo Specialist or Onboarding Specialist in a B2B setting.
- Should be able to work effectively in a Matrixed environment - working closely with Sales and Customer Success Teams.
- Ability to learn and adapt fast.
Summary
Our Kafka developer combines technical skills, communication skills, and business knowledge, and should be able to work on multiple medium to large projects. The successful candidate will have excellent technical skills in Apache/Confluent Kafka and enterprise data warehousing (preferably GCP BigQuery or an equivalent cloud EDW), and will be able to take oral and written business requirements and develop efficient code to meet set deliverables.
Must Have Skills
- Participate in the development, enhancement and maintenance of data applications both as an individual contributor and as a lead.
- Lead the identification, isolation, resolution, and communication of problems within the production environment.
- Lead development, applying technical skills in Apache/Confluent Kafka (preferred) or AWS Kinesis (optional), and a cloud enterprise data warehouse: Google BigQuery (preferred), AWS Redshift, or Snowflake (optional).
- Design and recommend the best approach for data movement from different sources to the cloud EDW using Apache/Confluent Kafka.
- Performs independent functional and technical analysis for major projects supporting several corporate initiatives.
- Communicate and work with IT partners and the user community at all levels, from senior management to developers to business SMEs, for project definition.
- Works on multiple platforms and multiple projects concurrently.
- Perform code and unit testing for complex-scope modules and projects.
- Provide expertise and hands-on experience working with Kafka Connect and the schema registry in a very high-volume environment (~900 million messages).
- Provide expertise in Kafka brokers, ZooKeeper, KSQL, KStream, and Kafka Control Center.
- Provide expertise and hands-on experience working with AvroConverter, JsonConverter, and StringConverter.
- Provide expertise and hands-on experience working with Kafka connectors such as MQ connectors, Elasticsearch connectors, JDBC connectors, file-stream connectors, and JMS source connectors, as well as tasks, workers, converters, and transforms.
- Provide expertise and hands-on experience building custom connectors using Kafka core concepts and the API.
- Working knowledge of the Kafka REST Proxy.
- Ensure optimum performance, high availability and stability of solutions.
- Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply knowledge of best practices.
- Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms.
- Leverage Hadoop ecosystem knowledge to design and develop capabilities to deliver solutions using Spark, Scala, Python, Hive, Kafka, and other parts of the Hadoop ecosystem.
- Use automation tools like Jenkins and uDeploy, or relevant technologies, for provisioning.
- Ability to perform data related benchmarking, performance analysis and tuning.
- Strong skills in In-memory applications, Database Design, Data Integration.
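The producer stubs mentioned above can start very small. A sketch using the confluent-kafka client follows, with the message-building logic kept separate so it can be unit-tested without a broker; the topic, broker address, and function names are illustrative:

```python
import json

def build_message(record: dict) -> bytes:
    """Serialize a record to a JSON wire format (JsonConverter-style).

    In a schema-registry setup, Avro serialization would replace this.
    """
    return json.dumps(record, sort_keys=True).encode("utf-8")

def produce(topic: str, record: dict, bootstrap: str = "localhost:9092") -> None:
    """Send one record to Kafka. The confluent-kafka import is deferred
    so the pure serialization logic above stays testable without a broker."""
    from confluent_kafka import Producer  # requires the confluent-kafka package
    producer = Producer({"bootstrap.servers": bootstrap})
    producer.produce(topic, value=build_message(record))
    producer.flush()  # block until delivery, fine for a stub
```

Separating serialization from delivery mirrors how connect-style converters decouple wire format from transport.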





