11+ Black-box testing Jobs in Pune | Black-box testing Job openings in Pune
Our job is one of empowerment. Specifically, empowering our clients to improve their business through advances in technology and data. As the ad tech ecosystem continues to become more complex, we are counted on to provide expertise on digital marketing technologies, incorporate strategy and manage tracking of all digital marketing efforts. But as our client roster continues to grow, we’re going to need more help.
YOUR RESPONSIBILITIES
- Automated testing of a complex SPA web application (brand-new software products built on a unified platform, new features, bug verification); requirements testing, analysis, and clarification (direct communication with the product owner)
- Communication with different cross-functional teams
- Following automation testing standards set in the team
- Help build a fully automated testing pipeline
REQUIRED SKILLS AND EXPERIENCE
- 3-5 years of experience in automation testing
- Testing web applications (UI and API), including SPAs
- Black box testing (boundary values, equivalence partitioning)
- UI/UX Testing
- C# experience of 2+ years
- NUnit or similar
- Selenium WebDriver
- CSS-selectors
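The boundary-value and equivalence-partitioning techniques named above are language-agnostic, even though this posting's stack is C#/NUnit. Below is a minimal sketch in Python of how test cases might be derived for a hypothetical input field; the `validate_quantity` function and its 1–100 range are illustrative assumptions, not part of the posting.

```python
# Hypothetical system under test: a quantity field that accepts 1..100.
def validate_quantity(value: int) -> bool:
    """Return True when value is inside the valid range [1, 100]."""
    return 1 <= value <= 100

# Equivalence partitions: one representative value per class is enough,
# because every value in a class should behave the same way.
partitions = {
    "below_range": (-5, False),   # invalid class: value < 1
    "in_range":    (50,  True),   # valid class:   1 <= value <= 100
    "above_range": (250, False),  # invalid class: value > 100
}

# Boundary values: defects cluster at the edges of each partition.
boundaries = [(0, False), (1, True), (100, True), (101, False)]

for name, (value, expected) in partitions.items():
    assert validate_quantity(value) == expected, name

for value, expected in boundaries:
    assert validate_quantity(value) == expected, value
```

In an NUnit suite the same derived values would typically become `[TestCase]` parameters rather than a loop.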
DESIRABLE SKILLS AND EXPERIENCE
- Basic knowledge of core JavaScript
- Desire to learn tools and techniques
- Confluence or similar
- Bug tracking system (Jira)
- TeamCity
- Allure or similar
- Jira X-Ray or similar
- Performance testing experience
- Cross-browser testing experience
- Security testing experience
PERSONAL SKILLS
- Sharp, curious personality
- Strong soft skills: conflict-resistant, self-motivated, result-oriented, responsible, honest, open, courageous
- Configure, optimize, document, and support the infrastructure components of software products (hosted in colocated facilities and cloud services such as AWS)
- Design and build tools and frameworks that support deployment and management of platforms
- Design, build, and deliver cloud computing solutions, hosted services, and underlying software infrastructures
- Build core functionality of our cloud-based platform product, deliver secure, reliable services and construct third party integrations
- Assist in coaching application developers on proper DevOps techniques for building scalable applications in the microservices paradigm
- Foster collaboration with software product development and architecture teams to ensure releases are delivered with repeatable and auditable processes
- Support and troubleshoot scalability, high availability, performance, monitoring, backup, and restores of different environments
- Work independently across multiple platforms and applications to understand dependencies
- Evaluate new tools, technologies, and processes to improve speed, efficiency, and scalability of continuous integration environments
- Design and architect solutions for existing client-facing applications as they are moved into cloud environments such as AWS
Competencies:
- Full understanding of scripting and automated process management in languages such as Shell, Ruby, and/or Python
- Working knowledge of SCM tools such as Git, GitHub, Bitbucket, etc.
- Working knowledge of Amazon Web Services and related APIs
- Ability to deliver and manage web or cloud-based services
- General familiarity with monitoring tools
- General familiarity with configuration/provisioning tools such as Terraform
Experience:
- Experience working within an Agile type environment
- 4+ years of experience with cloud-based provisioning (Azure, AWS, Google), monitoring, troubleshooting, and related DevOps technologies
- 4+ years of experience with containerization/orchestration technologies like Rancher, Docker and Kubernetes
Experience: 2 to 4 years
Location: Viman Nagar, Pune (WFO)
Skills: Java + Spring Boot
Budget: 50 to 70K per month
Mandatory Skills:
Minimum 3+ years of experience
Solid and proficient skills in Java, Spring Framework, JDBC
Solid and proficient skills in Angular 6+
Strong foundation in Restful design practices
Experience in unit testing, data mocking, and test automation
Strong communication
Knowledge of Control-M
Good to have:
Knowledge of Scrum and Agile
Knowledge of DevOps tooling (e.g., Jenkins, Git, Maven)
Knowledge of basics of Cloud Computing
Knowledge of Python
Knowledge of Jenkins
Location: Hybrid / Remote
Responsibilities:
• Designing Hive/HCatalog data models, including table definitions, file formats, and compression techniques for structured and semi-structured data processing
• Implementing Spark-based ETL processing frameworks
• Implementing big data pipelines for data ingestion, storage, processing, and consumption
• Modifying the Informatica-Teradata and Unix-based data pipelines
• Enhancing the Talend-Hive/Spark and Unix-based data pipelines
• Developing and deploying Scala/Python-based Spark jobs for ETL processing
• Applying strong SQL and data warehouse (DWH) concepts
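The ingestion → processing → consumption shape these responsibilities describe can be sketched in plain Python; a real implementation would use Spark DataFrames/RDDs over Hive or HDFS, so the in-memory CSV source, record shape, and function names below are illustrative assumptions only.

```python
import csv
import io

# Hypothetical raw feed; in the posting's context this would arrive from
# Hive/HDFS or a Kafka topic rather than an in-memory string.
RAW = "id,amount\n1,10.5\n2,bad\n3,7.25\n"

def extract(text):
    """Ingestion: yield raw records from a CSV source."""
    yield from csv.DictReader(io.StringIO(text))

def transform(rows):
    """Processing: cast types and drop malformed rows (bad amounts)."""
    for row in rows:
        try:
            yield {"id": int(row["id"]), "amount": float(row["amount"])}
        except ValueError:
            continue  # in Spark this would be a filter/flatMap stage

def load(rows, sink):
    """Consumption: write the cleaned records to a sink (here, a list)."""
    sink.extend(rows)

sink = []
load(transform(extract(RAW)), sink)
assert sink == [{"id": 1, "amount": 10.5}, {"id": 3, "amount": 7.25}]
```

Because each stage is a generator, records stream through one at a time, which mirrors how a distributed engine pipelines narrow transformations.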
Preferred Background:
• Function as integrator between business needs and technology solutions, helping to create technology solutions to meet clients’ business needs
• Lead project efforts in defining scope, planning, executing, and reporting to stakeholders on strategic initiatives
• Understanding the business's EDW system and creating high-level design and low-level implementation documents
• Understanding the business's big data lake system and creating high-level design and low-level implementation documents
• Designing Big data pipeline for Data Ingestion, Storage, Processing & Consumption
Company Profile:
Easebuzz is a payment solutions company (fintech organisation) that enables online merchants to accept, process, and disburse payments through developer-friendly APIs. We focus on building plug-and-play products, including the payment infrastructure, to solve complete business problems. It is a wonderful place where all the action related to payments, lending, subscriptions, and eKYC happens at the same time.
We have been consistently profitable and are constantly developing new, innovative products; as a result, we have grown 4x over the past year alone. We are well capitalised and recently closed a fundraise of $4M in March 2021 from prominent VC firms and angel investors. The company is based out of Pune and has a total strength of 180 employees. Easebuzz's corporate culture is tied to the vision of building a workplace that breeds open communication and minimal bureaucracy. An equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.
Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.
Salary: As per company standards.
Designation: Data Engineer
Location: Pune
Experience with ETL, Data Modeling, and Data Architecture
Design, build, and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services (Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue) in combination with third-party tools
Experience with AWS cloud data lake for development of real-time or near real-time use cases
Experience with messaging systems such as Kafka/Kinesis for real time data ingestion and processing
Build data pipeline frameworks to automate high-volume and real-time data delivery
Create prototypes and proof-of-concepts for iterative development.
Experience with NoSQL databases, such as DynamoDB, MongoDB etc
Create and maintain optimal data pipeline architecture.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
Evangelize a very high standard of quality, reliability, and performance for data models and algorithms that can be streamlined into engineering and science workflows
Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.
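The real-time ingestion pattern described above (Kafka/Kinesis producers feeding pipeline consumers) can be sketched with a stdlib queue standing in for the topic; nothing here is an actual Kafka or AWS API, and the record shape is an illustrative assumption.

```python
import queue
import threading

# Stand-in for a Kafka/Kinesis topic: a thread-safe FIFO queue.
topic = queue.Queue()
SENTINEL = None  # end-of-stream marker for this toy example
results = []

def producer():
    """Publish a handful of events, then signal the end of the stream."""
    for event_id in range(5):
        topic.put({"event_id": event_id})
    topic.put(SENTINEL)

def consumer():
    """Consume records until the sentinel arrives, processing each one."""
    while True:
        record = topic.get()
        if record is SENTINEL:
            break
        results.append(record["event_id"] * 2)  # per-record processing

t_prod = threading.Thread(target=producer)
t_cons = threading.Thread(target=consumer)
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()
assert results == [0, 2, 4, 6, 8]
```

A real pipeline replaces the queue with a durable partitioned log, which adds replay, ordering-per-partition, and consumer-group semantics this sketch omits.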
Employment Type
Full-time
Shift: Night
Responsibilities
- Design conversational chatbots using state-of-the-art technology
- Collaborate with the senior team (System Architect and Senior Programmer) on application architecture design and decision-making
- Clean / analyze data coming from bot conversation
- Define recurring questions that can be handled automatically / defined by the client
- Improve dialog flow to handle those recurring questions / develop new actions
- Help with the handling of multilingual support
- Develop internal testing tools
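The "define recurring questions / handle them automatically" responsibility above amounts to matching an incoming utterance against known FAQs. A minimal sketch, assuming a hypothetical FAQ table and a simple string-similarity threshold (production bots would use intent classification instead):

```python
from difflib import SequenceMatcher

# Hypothetical FAQ table; in practice these pairs would be mined from
# real bot conversations, as the responsibilities describe.
FAQ = {
    "what are your opening hours": "We are open 9am-6pm, Monday to Friday.",
    "how do i reset my password": "Use the 'Forgot password' link on the login page.",
}

def answer(question: str, threshold: float = 0.6):
    """Return the canned answer for the closest recurring question,
    or None so the dialog flow can escalate to a human / new action."""
    question = question.lower().strip("?! ")
    best_key, best_score = None, 0.0
    for key in FAQ:
        score = SequenceMatcher(None, question, key).ratio()
        if score > best_score:
            best_key, best_score = key, score
    return FAQ[best_key] if best_score >= threshold else None

assert answer("How do I reset my password?") is not None
assert answer("Tell me a joke") is None
```

The `None` branch is the important design point: anything below the threshold falls through to the normal dialog flow rather than guessing a wrong canned answer.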
- 3-10 years experience
- Strong React, TypeScript, JavaScript
- Experience with NextJS and Material UI
- Experience with popular React.js workflows (such as Flux or Redux)
- Demonstrated mastery of HTML, CSS, and JavaScript (ES6+)
- Good understanding of jQuery, Bootstrap 3/4, JSON, and AJAX
- Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model
- Backend server development & support with Node, JavaScript, JSON, REST, NoSQL, and cloud-native technologies like Docker & Registry, Kubernetes & Helm
- Skilled in data structures, algorithms, modularization, OOP, microservices, and design patterns
- Skilled in coding best practices using containers, packaging (npm, yarn), agility (with Git, Jira), unit testing (JEST), CI/CD (Jenkins), debugging, and ensuring high productivity & quality
- Exposure to security (OIDC/JWT, RBAC, monitoring, auditing)
- Good with learning, problem solving & innovation
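On the OIDC/JWT exposure mentioned above: a JWT is three base64url segments (`header.payload.signature`). A stdlib-only sketch of reading the claims follows; it deliberately skips signature verification, which real services must do with a vetted library, and the token contents here are made up for illustration.

```python
import base64
import json

def b64url_decode(segment: str) -> bytes:
    """Decode a base64url segment, restoring the stripped '=' padding."""
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

def jwt_claims(token: str) -> dict:
    """Read the (UNVERIFIED) claims from a JWT's payload segment."""
    _header_seg, payload_seg, _signature_seg = token.split(".")
    return json.loads(b64url_decode(payload_seg))

def b64url(data: dict) -> str:
    """Encode a dict as an unpadded base64url JSON segment."""
    raw = json.dumps(data, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(raw).decode().rstrip("=")

# Build a toy token; the signature is fake because this sketch does not
# perform the HMAC/RSA verification a real OIDC flow requires.
token = ".".join([
    b64url({"alg": "HS256", "typ": "JWT"}),
    b64url({"sub": "user-42", "role": "admin"}),
    "fake-signature",
])
assert jwt_claims(token) == {"sub": "user-42", "role": "admin"}
```

In an RBAC check, the `role` claim would only be trusted after the signature and the `iss`/`aud`/`exp` claims have been validated.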
About the Company
Leap Info Systems Pvt. Ltd. is a software product company with products and solutions in convergent lighting controls and automation. Leap recently acquired elitedali, the world's first Niagara-based lighting controls and automation solution, from a leading US organization.
We are a passionate team on a mission to develop innovative controls and automation products. We are expanding our product development team and are in search of like-minded highly passionate team members who would like to contribute to the leading lighting controls and automation product. We have customers in India, Europe, USA, and Australia.
Eligibility
Any suitable graduates (0 to 2 years of experience) who meet the prescribed qualities and job responsibilities. Preferred, but not limited to, engineering graduates in Computer/IT/Instrumentation/E&TC, etc.
Job Responsibilities
Be a part of the product development team, maintaining existing products as well as developing new ones based on Java and the Niagara software framework.
Qualities
- Self-Learner
- Analytical skills
- Able to work with Cross-functional team
- Problem-solving approach
- Good Team Player with Positive vibes
What you must have,
- Hands-on programming at academic or at the hobby level.
- Good knowledge of Java/J2EE related technologies
- Good knowledge of solution-based approaches
- Good communication skills - phone, email and in-person.
What you can expect from LEAP,
- A positive, conducive working environment in which to grow with cross-functional team members.
- Flexible working hours with defined responsibilities.
- Opportunity to work with a leading open-standard automation framework like Niagara software and lighting controls technologies like DALI, wireless, etc.
- Being an active part of an emerging global product company.