Test scenarios job openings in Chennai
● The individual will be responsible for developing manual test scripts.
● Responsible for executing automated and manual test cases.
● Responsible for tracking and logging defects during execution.
● Generate reports such as defect analysis report, project status report, etc.
● Should understand customer/end-user requirements and own issues through to closure.
● Actively participate in requirement analysis and test design.
● Work with product team/designer on front end side UI design.
● Identify opportunities for continuous improvement.
● Provide proactive advice and feedback throughout the software testing lifecycle to prevent and correct quality problems early.
● Demonstrate relationship-building and teamwork skills at all levels of the organization in a collaborative effort.
● Must be adaptable in a changing environment and effective in many different business settings.
Required skills
● 2 - 5 years' experience testing Web and Mobile Applications (either manual or automated)
● Good knowledge in Web/REST API testing
● Experience working on Agile based projects
● Good experience in writing clear and thorough test plans and test cases
● Good understanding of Software testing lifecycle
● Should have expertise in various test development and design methodologies
● A high standard of written and verbal English
Desirable skills
● Working knowledge on REST API.
● Understanding of web and Mobile app design standards.
Nice-to-have
● 1+ year(s) experience writing automated test scripts (not record/replay)
● Experience in the design of great user experiences
● Knowledge of Postman or similar for WebAPI testing
● Robotic process automation (RPA) tool experience using UiPath / Automation Anywhere / Blue Prism
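The required skills above call for Web/REST API testing and for writing clear test cases. As an illustration only (the endpoint shape and payload below are hypothetical, not from any real system), a manual-style API check can be captured as a small function that validates a response and returns defect descriptions for logging:

```python
import json

# Hypothetical response captured from a GET /users/42 call (illustrative only).
SAMPLE_RESPONSE = '{"status": 200, "data": {"id": 42, "email": "user@example.com"}}'

def validate_user_response(raw: str) -> list:
    """Return a list of defect descriptions; an empty list means all checks pass."""
    defects = []
    body = json.loads(raw)
    if body.get("status") != 200:
        defects.append("unexpected status code")
    user = body.get("data", {})
    if not isinstance(user.get("id"), int):
        defects.append("id missing or not an integer")
    if "@" not in user.get("email", ""):
        defects.append("email missing or malformed")
    return defects
```

Each non-empty return value maps directly onto a defect to be tracked and logged during execution, which keeps the test case, its expected result, and the defect report consistent.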
The Knowledge Graph Architect is responsible for designing, developing, and implementing knowledge graph technologies to enhance organizational data understanding and decision-making capabilities. This role involves collaborating with data scientists, engineers, and business stakeholders to integrate complex data into accessible and insightful knowledge graphs.
Work you’ll do
1. Design and develop scalable and efficient knowledge graph architectures.
2. Implement knowledge graph integration with existing data systems and business processes.
3. Lead the ontology design, data modeling, and schema development for knowledge representation.
4. Collaborate with IT and business units to understand data needs and deliver comprehensive knowledge graph solutions.
5. Manage the lifecycle of knowledge graph data, including quality, consistency, and updates.
6. Provide expertise in semantic technologies and machine learning to enhance data interconnectivity and retrieval.
7. Develop and maintain documentation and specifications for system architectures and designs.
8. Stay updated with the latest industry trends in knowledge graph technologies and data management.
The Team
Innovation & Technology (I&T) anticipates how technology will shape the future and begins building future capabilities and practices today. I&T drives the ideation, incubation, and scaling of hybrid businesses and tech-enabled offerings across a prioritized offering portfolio and industry interactions.
It drives cultural and capability transformation from solely services-based businesses to hybrid businesses. While others bet on the future, I&T builds it with you.
I&T encompasses many teams—dreamers, designers, builders—and partners with the business to bring a unique POV to deliver services and products for clients.
Qualifications and Experience
Required:
1. Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
2. 6-10 years of professional experience in data engineering, with proven experience designing and implementing knowledge graph systems.
3. Strong understanding of semantic web technologies (RDF, SPARQL, GraphQL, OWL, etc.).
4. Experience with graph databases such as Neo4j, Amazon Neptune, or others.
5. Proficiency in programming languages relevant to data management (e.g., Python, Java, JavaScript).
6. Excellent analytical and problem-solving abilities.
7. Strong communication and collaboration skills to work effectively across teams.
Preferred:
1. Experience with machine learning and natural language processing.
2. Experience with Industry 4.0 technologies and principles
3. Prior exposure to cloud platforms and services like AWS, Azure, or Google Cloud.
4. Experience with containerization technologies like Docker and Kubernetes
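The role centers on the RDF-style (subject, predicate, object) data model. A deliberately tiny in-memory sketch, with made-up facts, shows the pattern-matching idea that SPARQL and graph databases such as Neo4j implement at scale:

```python
# Toy triple store; every fact is a (subject, predicate, object) tuple.
# All names below are illustrative, not from any real dataset.
TRIPLES = {
    ("Alice", "worksFor", "Acme"),
    ("Acme", "locatedIn", "Chennai"),
    ("Alice", "knows", "Bob"),
}

def query(s=None, p=None, o=None):
    """Match triples against a pattern; None plays the role of a SPARQL variable."""
    return {
        t for t in TRIPLES
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    }
```

A real knowledge graph adds ontology constraints, inference, and persistence on top of exactly this matching primitive, which is why ontology design and schema development lead the responsibilities above.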
Key responsibilities
• Design, build, and maintain robust CI/CD pipelines using Azure DevOps Services (Azure Pipelines) and Git-based workflows.
• Implement and manage infrastructure as code (IaC) using ARM templates, Bicep, and/or Terraform for repeatable environment provisioning.
• Containerize applications (Docker) and manage container orchestration platforms such as AKS (Azure Kubernetes Service).
• Automate build, test, release, and rollback processes; integrate automated testing and quality gates into pipelines.
• Monitor and improve platform reliability and observability using logging and monitoring tools (e.g., Azure Monitor, Application Insights, Prometheus, Grafana).
• Drive platform security and compliance through pipeline controls, secrets management (Key Vault / Vault), and secure configuration practices.
• Implement cost-optimization and governance for Azure resources (tags, policies, budgets).
• Troubleshoot build/release failures, production incidents, and performance bottlenecks; perform root-cause analysis and implement permanent fixes.
• Mentor developers in Git workflows, pipeline authoring, best practices for IaC, and cloud-native design.
• Maintain clear documentation: runbooks, deployment playbooks, architecture diagrams, and pipeline templates.
Required skills & experience
• 4+ years hands-on experience working with Azure and cloud-native application delivery.
• Deep experience with Azure DevOps (Repos, Pipelines, Artifacts, Boards).
• Strong IaC skills with Terraform, ARM templates, or Bicep.
• Solid experience with CI/CD design and YAML pipeline authoring.
• Practical knowledge of containerization (Docker) and Kubernetes — preferably AKS.
• Scripting skills: PowerShell, Bash, and/or Python for automation.
• Experience with Git workflows (branching strategies, PRs, code reviews).
• Familiarity with configuration management and secrets management (Azure Key Vault, HashiCorp Vault).
• Understanding of networking, identity (Azure AD), and security fundamentals in Azure.
• Strong troubleshooting, debugging, and incident response skills.
• Good collaboration and communication skills; ability to work across teams.
Certification
AZ-400 (Microsoft Certified: DevOps Engineer Expert), AZ-104, AZ-305, or Terraform Associate.
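The scripting requirement above (PowerShell, Bash, and/or Python for automation) often shows up as post-deployment gates. A minimal sketch, assuming a caller supplies the actual probe (in practice it might hit an AKS ingress health endpoint; that part is hypothetical and environment-specific):

```python
import time

def wait_for_healthy(check, retries=5, base_delay=1.0):
    """Poll a health check with exponential backoff; True once it passes.

    `check` is any zero-argument callable returning bool. Wiring it to a real
    endpoint (curl/urllib against your service) is left to the pipeline.
    """
    delay = base_delay
    for _ in range(retries):
        if check():
            return True
        time.sleep(delay)
        delay *= 2  # back off between probes to avoid hammering a warming service
    return False
```

In an Azure Pipelines YAML stage this would run as a script step after deployment, failing the stage (and triggering rollback) when it returns False.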
Job Description:
- Extensive experience in Appian BPM application development
- Knowledge of Appian architecture and best practices for its objects
- Participate in analysis, design, and new development of Appian based applications
- Provide team leadership and technical leadership to Scrum teams
- Must be able to multi-task, work in a fast-paced environment, and resolve problems faced by the team
- Build applications: interfaces, process flows, expressions, data types, sites, and integrations
- Proficient with SQL queries and with accessing data present in DB tables and views
- Experience in Analysis, Designing process models, Records, Reports, SAIL, forms, gateways, smart services, integration services and web services
- Experience working with different Appian Object types, query rules, constant rules and expression rules
Qualifications
- At least 6 years of experience in Implementing BPM solutions using Appian 19.x or higher
- Over 8 years in Implementing IT solutions using BPM or integration technologies
- Certification mandatory: L1 and L2
- Experience in Scrum/Agile methodologies with Enterprise level application development projects
- Good understanding of database concepts and strong working knowledge of at least one major database (e.g., Oracle, SQL Server, MySQL)
Additional information: Skills required
- Appian BPM application development on version 19.x or higher
- Experience with integrations using web services (e.g., XML, REST, WSDL, SOAP API, JDBC, JMS)
- Good leadership skills and the ability to lead a team of software engineers technically
- Experience working in Agile Scrum teams
- Good Communication skills
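The Appian role asks for proficiency with SQL queries against tables and views. A self-contained sketch using SQLite stands in for the actual database (the `cases` table and its columns are invented for illustration; Appian itself would query these via a data store):

```python
import sqlite3

# In-memory database; table/view names are made up for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE cases (id INTEGER PRIMARY KEY, status TEXT, owner TEXT);
    INSERT INTO cases VALUES (1, 'OPEN', 'asha'), (2, 'CLOSED', 'ravi'),
                             (3, 'OPEN', 'asha');
    -- A view filters the base table, like the views Appian records often sit on.
    CREATE VIEW open_cases AS SELECT * FROM cases WHERE status = 'OPEN';
""")
rows = conn.execute(
    "SELECT owner, COUNT(*) FROM open_cases GROUP BY owner"
).fetchall()
```

Querying through the view rather than the base table is the relevant habit here: Appian records and reports are typically backed by exactly this kind of pre-filtered view.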
Job Description:
1. Be a hands-on problem solver with a consultative approach who can apply Machine Learning and Deep Learning algorithms to solve business challenges
a. Use knowledge of a wide variety of AI/ML techniques and algorithms to find which combinations of techniques best solve the problem
b. Improve model accuracy to deliver greater business impact
c. Estimate the business impact of deploying a model
2. Work with domain/customer teams to understand the business context and data dictionaries, and apply the relevant Deep Learning solution to the given business challenge
3. Work with tools and scripts for pre-processing data and feature engineering for model development – Python / R / SQL / cloud data pipelines
4. Design, develop, and deploy Deep Learning models using TensorFlow / PyTorch
5. Experience in using Deep Learning models with text, speech, image, and video data
a. Design and develop NLP models for text classification, custom entity recognition, relationship extraction, text summarization, topic modeling, reasoning over knowledge graphs, and semantic search, using NLP tools like spaCy and open-source TensorFlow, PyTorch, etc.
b. Design and develop image recognition and video analysis models using Deep Learning algorithms and open-source tools like OpenCV
c. Knowledge of state-of-the-art Deep Learning algorithms
6. Optimize and tune Deep Learning models for the best possible accuracy
7. Use visualization tools/modules to explore and analyze outcomes and for model validation, e.g., using Power BI / Tableau
8. Work with application teams to deploy models on the cloud as a service or on-prem
a. Deploy models in a test/control framework for tracking
b. Build CI/CD pipelines for ML model deployment
9. Integrate AI/ML models with other applications using REST APIs and other connector technologies
10. Constantly upskill and stay up to date with the latest techniques and best practices. Write white papers and create demonstrable assets to summarize the AI/ML work and its impact.
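Text classification (item 5a above) in production would use TensorFlow, PyTorch, or spaCy, but the end-to-end workflow can be shown with a deliberately tiny nearest-centroid bag-of-words classifier; the labels and training snippets are invented for illustration:

```python
import math
from collections import Counter

# Two made-up intent classes with toy training snippets.
TRAIN = {
    "billing": ["invoice payment overdue refund", "refund charged twice invoice"],
    "shipping": ["package delayed delivery tracking", "tracking number delivery lost"],
}

def vectorize(text):
    """Bag-of-words term counts; real pipelines would tokenize far more carefully."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# One centroid per class, built from all of that class's training text.
CENTROIDS = {label: vectorize(" ".join(docs)) for label, docs in TRAIN.items()}

def classify(text):
    vec = vectorize(text)
    return max(CENTROIDS, key=lambda label: cosine(vec, CENTROIDS[label]))
```

The same shape (featurize, score against learned parameters, take the argmax) carries over directly when the featurizer becomes an embedding model and the scorer a neural network.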
· Technology/Subject Matter Expertise
- Sufficient expertise in machine learning and the mathematical and statistical sciences
- Use of versioning and collaboration tools like Git / GitHub
- Good understanding of landscape of AI solutions – cloud, GPU based compute, data security and privacy, API gateways, microservices based architecture, big data ingestion, storage and processing, CUDA Programming
- Develop prototype level ideas into a solution that can scale to industrial grade strength
- Ability to quantify & estimate the impact of ML models.
· Softskills Profile
- Curiosity to think in fresh and unique ways with the intent of breaking new ground.
- Must have the ability to share, explain and “sell” their thoughts, processes, ideas and opinions, even outside their own span of control
- Ability to think ahead, and anticipate the needs for solving the problem will be important
· Ability to communicate key messages effectively, and articulate strong opinions in large forums
· Desirable Experience:
- Keen contributor to open source communities, and communities like Kaggle
- Ability to process huge amounts of data using PySpark/Hadoop
- Development & Application of Reinforcement Learning
- Knowledge of Optimization/Genetic Algorithms
- Operationalizing Deep learning model for a customer and understanding nuances of scaling such models in real scenarios
- Optimize and tune deep learning model for best possible accuracy
- Understanding of stream data processing, RPA, edge computing, AR/VR etc
- An appreciation of digital ethics and data privacy will be important
- Experience with AI and cognitive services platforms like Azure ML, IBM Watson, AWS SageMaker, or Google Cloud will be a big plus
- Experience with platforms like DataRobot, CognitiveScale, H2O.ai, etc. will be a big plus
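Item 9 above (integrating models with other applications via REST APIs) can be sketched end to end with the standard library. The route, payload shape, and stub model below are all invented for illustration; a real service would load a trained model and run behind a production server:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(features):
    """Stub model: any real predictor can sit behind this same interface."""
    return {"label": "positive" if sum(features) > 0 else "negative"}

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps(predict(payload["features"])).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the example quiet
        pass

def serve(port=0):
    """Start the server on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), PredictHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Wrapping the model behind HTTP like this is what makes the test/control tracking and CI/CD deployment in items 8a-8b possible: the model becomes a versioned, independently deployable service.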
CaratLane is a technology-driven organization and India’s first omnichannel jewelry brand. It was founded in 2008 by Mithun Sacheti with a simple but courageous objective – to make beautiful jewelry accessible, affordable, and forever wearable. With a strategic investment from Titan Company Limited, CaratLane is now partnered with India’s largest retail jeweler, Tanishq.
Under the leadership of our co-founders Gurukeerthi Gurunathan and Avnish Anand, CaratLane
aims to work towards a common mission – to offer customers beautiful jewelry and a
distinctive shopping experience that fits today’s values and lifestyles.
Desired candidate profile :
● 4 to 6 years of iOS Native Application Development with Swift
● Web services/API interactions, Audio/Video streaming, SQLite, JSON/XML parsing.
● Expertise in Auto Layout, custom UI elements, and IBDesignable.
● Strong grasp of Data structure and algorithms
● Good knowledge of Object Oriented Programming and Protocol Oriented
Programming.
● Experience with design patterns like MVC/MVVM/VIPER
● Experience with Unit testing, TDD and UI Testing.
Nice to Have :
● Domain knowledge in eCommerce
● Previous experience in a product company is a plus.
What we value as a team:
● Proactive in communication
● Collaborate with other members of the agile ecosystem
● Out-of-the-box thinking to resolve issues and bring new ideas that improve application quality
Job Location: Hyderabad / Bangalore / Chennai / Pune / Nagpur
Notice period: Immediate - 15 days
1. Python Developer with Snowflake
Job Description :
- 5.5+ years of strong Python development experience with Snowflake.
- Strong hands-on experience with SQL and the ability to write complex queries.
- Strong understanding of how to connect to Snowflake using Python; should be able to handle any type of file.
- Development of data analysis and data processing engines using Python.
- Good experience in data transformation using Python.
- Experience in Snowflake data loads using Python.
- Experience in creating user-defined functions in Snowflake.
- SnowSQL implementation.
- Knowledge of query performance tuning is an added advantage.
- Good understanding of data warehouse (DWH) concepts.
- Interpret/analyze business requirements and functional specifications.
- Good to have: dbt, Fivetran, and AWS knowledge.
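The data transformation requirement typically means cleaning records in Python before the Snowflake load (the load itself, via the Snowflake Python connector or a COPY INTO stage, is not shown here). A stdlib-only sketch with invented column names:

```python
import csv
import io

# Invented sample extract; real data would come from a file or stage.
RAW = """order_id,amount,country
1, 100.5 ,in
2,,us
3,250.0,IN
"""

def transform(raw_csv):
    """Trim whitespace, drop rows missing an amount, upper-case country codes."""
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        amount = row["amount"].strip()
        if not amount:
            continue  # reject incomplete records before they reach the warehouse
        rows.append({
            "order_id": int(row["order_id"]),
            "amount": float(amount),
            "country": row["country"].strip().upper(),
        })
    return rows
```

Pushing this kind of rejection and normalization upstream of the load is what keeps the DWH-side queries (and any dbt models built on them) simple.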
JOB Role : Senior Platform Consultant
Essential Skills :
- Expertise in SFMC Email Marketing Tool
- Good Experience in SQL
- Experience in Data model design
- Expertise in Campaign Orchestration
- Life cycle programs and personalization
- Expertise in Marketing data optimization and List cleansing
- Experience in Data optimization and Hygiene
- Experience in Personalization in campaigns
- Good Business understanding of Digital Marketing
- Excellent communication skills
- Preferably with SFMC Certification (Marketing Cloud Specialist or Email Specialist)
Role Description :
- The person will be required to work on SFMC platform for creating and executing marketing campaigns.
- Ability to work independently and work directly with Marketing teams
- Drive the strategy for the email program to deliver strong engagement and results, through audience segmentation, deliverability, best practices, and data analysis.
- Analyse the effectiveness of Digital Marketing deliverables, measure against objectives, and report results.
- Flawless delivery of daily campaigns, troubleshoot issues that arise and implement solutions to mitigate future problems
Notice period: Immediate joiners preferred, or a maximum of 1 month.
Location - Gurgaon / Bangalore / Chennai
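The list cleansing and data hygiene skills above usually begin with a pass like the following sketch: dedupe case-insensitively and drop syntactically invalid addresses. The regex is a deliberate simplification; real deliverability and suppression checks inside SFMC go well beyond syntax:

```python
import re

# Loose syntax check only -- illustrative, not RFC 5322 compliant.
EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$")

def cleanse(addresses):
    """Normalize, dedupe, and filter a send list, preserving first-seen order."""
    seen, clean = set(), []
    for addr in addresses:
        norm = addr.strip().lower()
        if norm in seen or not EMAIL_RE.match(norm):
            continue
        seen.add(norm)
        clean.append(norm)
    return clean
```

In practice this logic would live in a SQL Query Activity against a data extension rather than Python, but the hygiene steps (normalize, validate, dedupe) are the same.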
Responsibilities:
The role ensures the successful delivery of contracted services to the customer together with local consultants. You will:
· Act as an interface between local consulting teams and RIB R&D to ensure customer implementation needs are met
· Ensure successful system deployments and deliver standard configurations in collaboration with RIB
· Scoping of System integration and proposal design in collaboration with RIB
· Facilitate customer data migration
· Configure the MTWO deployment to customer needs by the following means:
o Workflow configuration with JavaScript and SQL
o Dashboard/Control Tower 4.0 configuration
o Forms (Regional standard forms, client data entry forms with Java)
o Reports (Regional standard reports, 4.0 business/management reports, client reports with Java)
o Prepare and execute end-user training & manuals
**Apply technical knowledge and customer insights to create a roadmap for the deployment of MTWO to meet business and IT needs, ensuring technical viability of the project and successful deployments
**Maintain and advance deep technical skills and knowledge, keeping up to date with market trends and competitive insights, and share within the Practice
**Assess the customers' knowledge of MTWO and overall cloud readiness to support them through a structured learning plan and ensure its delivery through the consulting team.
**Create, maintain, and drive take-up of template documents, IT & system architecture, and standards
**Handover of repeatable processes to local consulting teams
**Advocate technology and business enablement to advise the best-practice methods in how to utilize MTWO to drive competitive advantage.
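Workflow configuration (the first bullet under the configuration means above) amounts to defining states and allowed transitions. In MTWO this is done with JavaScript and SQL inside the platform; the Python sketch below, with invented states and actions, only illustrates the shape of that logic:

```python
# Invented approval workflow: (current state, action) -> next state.
TRANSITIONS = {
    ("draft", "submit"): "in_review",
    ("in_review", "approve"): "approved",
    ("in_review", "reject"): "draft",
}

def advance(state, action):
    """Apply an action to a workflow state; invalid moves raise ValueError."""
    try:
        return TRANSITIONS[(state, action)]
    except KeyError:
        raise ValueError(f"action {action!r} not allowed in state {state!r}")
```

Keeping the transition table declarative like this is what makes a deployment configurable per customer without touching the engine that applies it.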
Qualifications:
*Highly motivated and results-oriented
*Solid problem-solving skills required.
*Excellent verbal and written communication skills as well as strong business acumen.
*Experience migrating or transforming legacy customer solutions to the cloud
*Demonstrated ability to adapt to new technologies and learn quickly
*Presentation skills with a high degree of comfort with both large and small audiences (Senior Executives, IT management and conferences)
*Experience conducting proofs of concept of new technology, and providing thought leadership at the highest levels
*Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
*Ability to evaluate requirements and technical specifications for solution design fit
*Address technical specifications (e.g., via user stories)
*Knowledge of ticket systems and agile working practices
*Strong analytical and critical thinking skills
*Contribute to knowledge sharing within a community (e.g. team, practice, or project).
*Experience and desire to work in a cloud technology consulting environment.
*Preferably previous experience in solution rollout as well as user training & onboarding.
Required Technical Skills:
**5+ years at consultant level and 8+ years at lead level
**Java
**JavaScript
**SQL
**Dashboard configuration

