11+ System testing Jobs in Pune | System testing Job openings in Pune

About the Company:
Gruve is an innovative Software Services startup dedicated to empowering Enterprise Customers in managing their Data Life Cycle. We specialize in Cyber Security, Customer Experience, Infrastructure, and advanced technologies such as Machine Learning and Artificial Intelligence. Our mission is to assist our customers in their business strategies utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks.
Why Gruve:
At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you’re passionate about technology and eager to make an impact, we’d love to hear from you.
Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.
Position Summary
We are seeking a System Test Engineer with expertise in SaaS applications providing cybersecurity solutions to join our dynamic team. The ideal candidate will play a critical role in testing, validating, and ensuring the reliability and security of our SaaS-based cybersecurity platform. This position requires strong analytical skills, hands-on experience with automation, and a deep understanding of cloud environments, networking protocols, firewalls, and security frameworks.
Key Roles & Responsibilities
- Design, develop, and execute system-level test plans, test cases, and automated test scripts for a SaaS-based cyber security platform.
- Validate end-to-end functionality, scalability, and performance of security applications integrated with external ITSM systems.
- Develop and maintain automation frameworks to streamline test execution and enhance test coverage.
- Conduct security, performance, and regression testing to identify vulnerabilities, bottlenecks, and reliability issues.
- Test and validate the functionality of agents that connect with the SaaS platform.
- Work closely with development, product management, and DevOps teams to troubleshoot issues and ensure high-quality product releases.
- Implement and execute API testing, system integration testing, and user acceptance testing.
- Participate in test strategy planning and provide feedback for continuous improvement of the test process.
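To illustrate the kind of automated, system-level validation this role involves, here is a minimal PyTest-style sketch. The alert schema, field names, and checks below are illustrative assumptions, not part of this posting; a real suite would exercise the live platform rather than an in-memory payload.

```python
# Minimal sketch of a system-level validation test in the PyTest style.
# SAMPLE_ALERT and its schema are hypothetical, invented for illustration.

SAMPLE_ALERT = {
    "id": "a-123",
    "severity": "high",
    "source_ip": "10.0.0.5",
    "tls_version": "TLSv1.3",
}

REQUIRED_FIELDS = {"id", "severity", "source_ip", "tls_version"}
ALLOWED_SEVERITIES = {"low", "medium", "high", "critical"}

def validate_alert(alert: dict) -> list[str]:
    """Return a list of validation errors for one alert record."""
    errors = []
    missing = REQUIRED_FIELDS - alert.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if alert.get("severity") not in ALLOWED_SEVERITIES:
        errors.append(f"bad severity: {alert.get('severity')}")
    if not alert.get("tls_version", "").startswith("TLSv1."):
        errors.append("connection did not negotiate TLS 1.x")
    return errors

def test_alert_schema():
    # A well-formed alert should produce no validation errors.
    assert validate_alert(SAMPLE_ALERT) == []

def test_alert_rejects_bad_severity():
    bad = dict(SAMPLE_ALERT, severity="urgent")
    assert validate_alert(bad) == ["bad severity: urgent"]
```

Run with `python -m pytest`; the same validator can back both regression suites and CI gates.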
Basic Qualifications
- A bachelor’s or master’s degree in computer science, electronics engineering, or a related field.
- 3-12 years of experience in system testing for SaaS applications and Cyber Security platforms.
- Strong knowledge of networking protocols (TCP/IP, HTTP/HTTPS, DNS, VPN, IPSec, TLS, etc.).
- Strong understanding of security concepts such as firewalls, IDS/IPS, zero-trust architecture, and cloud security controls.
- Hands-on experience with test automation tools (Selenium, Robot Framework, PyTest, etc.).
- Proficiency in scripting and automation using Python, Bash, or similar languages.
- Experience working with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with containerization and orchestration tools like Docker and Kubernetes.
- Experience with CI/CD pipelines and DevOps processes.
- Strong troubleshooting and debugging skills in distributed systems and cloud environments.
Preferred Qualifications
- Knowledge of security frameworks such as SOC2, ISO 27001, and compliance standards.
- Experience with security testing tools such as Burp Suite, Nessus, Wireshark, or Metasploit.
- Familiarity with Infrastructure as Code (IaC) tools such as Terraform or Ansible.
- Certifications such as AWS Certified Security - Specialty, CCNA Security, CISSP, or CEH are a plus.


Requirements
• Extensive, expert-level programming experience in at least one general-purpose language (e.g. Java, C, C++) and its tech stack, writing maintainable, scalable, unit-tested code.
• Experience with multi-threading and concurrent programming.
• Strong object-oriented design skills, knowledge of design patterns, and a passion for designing intuitive modules and class-level interfaces.
• Excellent coding skills - should be able to convert design into code fluently.
• Knowledge of Test-Driven Development.
• Good understanding of relational databases (e.g. MySQL) and NoSQL stores (e.g. HBase, Elasticsearch, Aerospike).
• 4+ years of experience writing code and solving problems at large scale.
• Open communicator who shares thoughts and opinions, listens intently, and takes constructive feedback.
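The role is Java-centric, but the Test-Driven Development rhythm it asks for is the same in any language: write a failing test first, then the smallest implementation that passes. A minimal sketch (the `RateLimiter` class is a made-up example, not from this posting):

```python
import unittest

# TDD step 1: write the test first. It fails until RateLimiter exists
# and behaves as specified.
class TestRateLimiter(unittest.TestCase):
    def test_allows_up_to_limit(self):
        limiter = RateLimiter(limit=2)
        self.assertTrue(limiter.allow())
        self.assertTrue(limiter.allow())
        self.assertFalse(limiter.allow())

# TDD step 2: the smallest implementation that makes the test pass.
class RateLimiter:
    def __init__(self, limit: int):
        self.limit = limit
        self.count = 0

    def allow(self) -> bool:
        if self.count < self.limit:
            self.count += 1
            return True
        return False
```

Run with `python -m unittest`; further behavior (e.g. a time window) would be driven by adding the next failing test.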


- Design and implement cloud solutions and build MLOps on Azure
- Build CI/CD pipeline orchestration with GitLab CI, GitHub Actions, CircleCI, Airflow, or similar tools
- Review data science models; refactor and optimize code; handle containerization, deployment, versioning, and quality monitoring
- Test and validate data science models and automate those tests
- Deploy code and pipelines across environments
- Track model performance metrics
- Track service performance metrics
- Communicate with a team of data scientists, data engineers, and architects; document the processes
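As a sketch of the "model performance metrics" item above, here is a small, standard-library-only computation of precision and recall for a binary classifier. The label vectors are invented for illustration; in a real pipeline these numbers would come from a held-out validation set and feed a monitoring dashboard.

```python
def precision_recall(y_true: list[int], y_pred: list[int]) -> tuple[float, float]:
    """Compute precision and recall for binary labels (1 = positive class)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# Illustrative labels, not from any real model.
p, r = precision_recall([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```

Automating such checks per release lets a CI gate fail when a retrained model regresses below an agreed threshold.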
Role Description
This is a full-time hybrid role. As a GCP Data Engineer, you will be responsible for managing large sets of structured and unstructured data and for developing processes that convert data into insights, information, and knowledge.
Skill Name: GCP Data Engineer
Experience: 7-10 years
Notice Period: 0-15 days
Location: Pune
If you have a passion for data engineering and possess the following skills, we would love to hear from you:
🔹 7 to 10 years of experience working on Software Development Life Cycle (SDLC)
🔹 4+ years of experience with Google Cloud Platform, with a focus on BigQuery
🔹 Proficiency in Java and Python, along with experience in Google Cloud SDK & API Scripting
🔹 Experience in the Finance/Revenue domain would be considered an added advantage
🔹 Familiarity with GCP Migration activities and the DBT Tool would also be beneficial
You will play a crucial role in developing and maintaining our data infrastructure on the Google Cloud platform.
Your expertise in SDLC, BigQuery, Java, Python, and Google Cloud SDK & API scripting will be instrumental in ensuring the smooth operation of our data systems.
Join our dynamic team and contribute to our mission of harnessing the power of data to make informed business decisions.
Hello,
Greetings from Talentika!
We have a job vacancy for a Java Developer with the following skills:
Java, Microservices, Kubernetes, AWS/Azure, and a DevOps background.
Best Regards,
Priya
Headquartered in Redwood City, CA, our client is a communication and assessment SaaS startup that enables movement health professionals (athletic trainers, physical therapists, recovery specialists) to seamlessly capture and assess patient care data through cutting-edge technologies such as machine learning and AI. It improves outcomes for healthcare organisations by identifying early signs of clinical deterioration in chronically-ill patients, thereby decreasing hospital admissions and reducing unnecessary spending.
This revolutionary startup has raised $3 Mn in a seed funding round led by top investors. It is all set to democratise the accessibility and affordability of movement health through its full stack digital health platform.
What you will do:
- Facilitating all Board Meeting and CEO reviews (weekly/ monthly/ quarterly)
- Flagging risks, from the smallest to the largest, that can derail projects or cause target misses
- Owning and organizing quarterly planning sessions with internal partners with content, agenda, preparation, and oversight of deliverables
- Helping drive data driven presentations for the leadership team in both internal and external engagements
- Driving cross-functional objectives at the leadership level with exceptional project management and ownership
Desired Candidate Profile
What you need to have:
- Graduation/Post-graduation (preferably in Strategy and Finance)
- 3+ years of experience in assisting executive leadership
- Excellent communication skills (English & 1 regional language)
- Competency in using MS Word, Excel & PowerPoint
- Working knowledge of data tools
- Understanding of business operations and related finance
- Comfort running strategic workshops with executive teams
- Extensive program management skills and ability to lead and facilitate strategic initiatives in a cross-functional environment
- Ability to partner, execute, and lead through influence
• Responsible for designing, deploying, and maintaining an analytics environment that processes data at scale
• Contribute to the design, configuration, deployment, and documentation of components that manage data ingestion, real-time streaming, batch processing, and data extraction, transformation, enrichment, and loading into a variety of cloud data platforms, including AWS and Microsoft Azure
• Identify gaps in the existing platform and improve its quality, robustness, maintainability, and speed
• Evaluate new and upcoming big data solutions and recommend adoptions that extend our platform to advanced analytics use cases, such as predictive modeling and recommendation engines
• Data modelling and data warehousing at cloud scale using cloud-native solutions
• Perform development, QA, and DevOps roles as needed to ensure end-to-end ownership of solutions
COMPETENCIES
• Experience building, maintaining, and improving Data Models / Processing Pipeline / routing in large scale environments
• Fluency in common query languages, API development, data transformation, and integration of data streams
• Strong experience with large-dataset platforms (e.g. Amazon EMR, Amazon Redshift, AWS Lambda & Fargate, Amazon Athena, Azure SQL Database, Azure Database for PostgreSQL, Azure Cosmos DB, Databricks)
• Fluency in multiple programming languages, such as Python, shell scripting, SQL, and Java, and in tools appropriate for large-scale data processing
• Experience acquiring data from varied sources such as APIs, data queues, flat files, and remote databases
• Understanding of traditional Data Warehouse components (e.g. ETL, Business Intelligence Tools)
• Creativity to go beyond current tools to deliver the best solution to the problem
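The extract-transform-enrich-load shape described above can be sketched in a few lines. The record format, field names, and in-memory CSV source below are assumptions for illustration only; a real pipeline would read from queues or remote databases and load into a warehouse such as Redshift or Azure SQL.

```python
import csv
import io
import json

# Extract: a flat-file source, represented here as an in-memory CSV.
RAW = "user_id,amount\n42,20.00\n7,5.00\n42,1.00\n"

def extract(text: str) -> list[dict]:
    """Parse CSV rows into dictionaries keyed by header."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> dict:
    """Enrichment step: aggregate spend per user, casting types on the way."""
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + float(row["amount"])
    return totals

def load(totals: dict) -> str:
    """Load step stubbed as JSON serialization; a real pipeline would
    write to a cloud data platform instead."""
    return json.dumps(totals, sort_keys=True)

result = load(transform(extract(RAW)))
```

Keeping each stage a pure function makes the pipeline easy to unit-test and to re-route across platforms.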


Data warehousing architect/designer
Data migration architect/designer
Technical expertise
Responsible for sales activities, meeting and achieving sales targets
Primarily engaged in field sales activity, i.e. visits to customers for technical plant studies, problem evaluation, technical feasibility studies, and ROI determination of our products
Preparation, submission, and follow-up of techno-commercial proposals
Visits to customers for follow-up of enquiries and techno-commercial discussions
Responsible for negotiation and finalization of proposals
Responsible for preparation of order documents with necessary technical inputs
Coordination of post-order activities and other compliance through order commissioning
Preparation and submission of weekly and monthly reports for sales visits