Performance Engineering (JMeter/Gatling with Cloud) - QA Lead/Manager
at a CMMI Level 5 IT MNC from the USA
Must-have skills:
1. Load testing tools: JMeter or Gatling
2. Containerization: Docker or Kubernetes
3. Experience creating CI/CD pipelines using Jenkins and integrating them with JMeter
4. APM tools: AppDynamics/Dynatrace
5. Heap dump/thread dump analysis
Primary Responsibilities:
Define performance test strategies for integrated, n-tier application architectures supporting medium- to high-complexity business functionality
You play a key role in the planning, design, scripting, execution, and analysis of various
performance tests of software applications, and design workload models based on NFRs (non-functional requirements).
Apply functional knowledge and technical expertise to deliver performance testing projects,
implementation of configuration changes for client applications, and other maintenance of
software applications to meet business process requirements.
The ability to think beyond just what is on the monitor to the servers and architecture of the entire system.
Load Tools: Performance Center, LoadRunner, StormRunner, JMeter, BlazeMeter, SOASTA,
OpenSTA, SoapUI.
Frontend Testing: WebPageTest, Google PageSpeed, Google Lighthouse
Monitoring Tools: AppDynamics, Dynatrace, New Relic, Nagios/OpsView, Cacti, Wily
Introscope, and SiteScope
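Much of the JMeter analysis work described above comes down to post-processing JMeter's CSV results (.jtl) files. As a minimal sketch — assuming the default CSV output, which includes `timeStamp`, `elapsed` (ms), and `label` columns among others — per-label count, average, and 90th-percentile response times could be computed like this:

```python
import csv
import io
import math
import statistics

def percentile(values, p):
    """Nearest-rank percentile (p in 0-100) of a list of numbers."""
    ordered = sorted(values)
    k = math.ceil(p / 100 * len(ordered))  # nearest-rank method
    return ordered[max(k, 1) - 1]

def summarize_jtl(jtl_text):
    """Summarize a JMeter CSV results (.jtl) file: count, average, and
    90th-percentile elapsed time (ms) per sampler label."""
    samples = {}
    for row in csv.DictReader(io.StringIO(jtl_text)):
        samples.setdefault(row["label"], []).append(int(row["elapsed"]))
    return {
        label: {
            "count": len(times),
            "avg_ms": statistics.mean(times),
            "p90_ms": percentile(times, 90),
        }
        for label, times in samples.items()
    }

# Synthetic three-column JTL excerpt for illustration
jtl = "timeStamp,elapsed,label\n" + "\n".join(
    f"0,{ms},Login" for ms in [120, 80, 95, 300, 110]
)
print(summarize_jtl(jtl))
```

In a Jenkins-integrated pipeline like the one named in the must-have skills, a script along these lines could fail the build when the p90 exceeds the NFR threshold.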
The Content and Communications Manager is responsible for creating, managing, and executing content strategies to support the organization's communication and marketing goals. This role involves producing high-quality content across various platforms, managing communication channels, and ensuring consistent messaging that enhances the company's brand and engages its target audience.
Key Responsibilities:
Content Strategy:
Develop and implement a comprehensive content strategy aligned with the company's goals and target audience.
Conduct content audits to identify gaps and opportunities for improvement.
Stay up-to-date with industry trends and competitor activities to inform content strategy.
Content Creation:
Produce high-quality, engaging content for various platforms, including website, blog, social media, email, and print materials.
Collaborate with internal teams, such as marketing, product, and sales, to gather information and create relevant content.
Edit and proofread content to ensure accuracy, clarity, and consistency.
Communications Management:
Develop and manage the company's communication channels, including social media, newsletters, press releases, and internal communications.
Craft compelling messaging that aligns with the company's brand voice and resonates with the target audience.
Manage relationships with media contacts, influencers, and other external partners to amplify the company's reach.
SEO and Analytics:
Optimize content for search engines (SEO) to improve organic visibility and drive traffic to the company's website.
Monitor and analyze content performance using tools like Google Analytics, social media insights, and other relevant metrics.
Use data-driven insights to refine content strategies and improve engagement and conversion rates.
Project Management:
Manage content projects from conception to completion, ensuring timely delivery.
- Job Title- Java + AWS Developer
- Experience - 5+ Years
- Location - Pune
- Work Mode - Hybrid
- Qualification - Any Computer/Engineering Degree
- Job Description
- Experience with AWS services such as EKS, ECR, Aurora, S3, KVS, SQS
- Experience with the Java technology stack, including Java SE, Java EE, JDBC, Spring, Spring Boot, Microservices, Hibernate
- Experience with Eclipse, Git
- Experience with SQL, NoSQL databases, messaging systems
- Understanding of MQTT & AMQP, experience with RabbitMQ
- Understanding of CI/CD (continuous integration/continuous delivery) tools, frameworks and deployment processes
- Thorough understanding of OOP, SOLID, and RESTful services
- Thorough understanding of multi-threading best practices, especially with regard to Java
- Thorough understanding of database query optimization and Java code optimization
- Thorough understanding of dependency injection, cloud development and maintaining a large-scale cloud platform
We are looking for a QA Automation Engineer who understands how to ensure the quality of a product that is changing very fast. If you're passionate about writing code to test financial products at scale, then we want to chat. Are you ready to revolutionize the small business banking industry with us?
About the Role
- Collaborate with developers and product managers to deliver the highest quality with each release
- Own the acceptance criteria process with developers and create detailed, comprehensive, and well-structured test plans and test cases for features
- Create the necessary automation libraries and frameworks as required, build automation tests, and integrate them into the CI/CD pipeline
- Communicate using synchronous and asynchronous channels to collaborate effectively
Requirements for the Role
- 5+ years of experience with manual and automation testing
- Experience with QA methodologies for large scale web and mobile applications, understanding of software development process
- Experience with mobile (Android and iOS), web, and API automation testing using tools such as Selenium, Appium, and Postman
- Demonstrated ability to work well with others in a team environment and with geographically distributed teams
- Experience working with SCM tools like Git
- Experience in implementing CI/CD regression pipelines
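As an illustration of the API automation testing mentioned in the requirements, here is a minimal, self-contained sketch using only Python's standard library: it spins up a hypothetical stub service exposing a `/health` endpoint and runs one automated check against it. The endpoint path and JSON payload are assumptions made up for the example, not any particular product's API.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubHandler(BaseHTTPRequestHandler):
    """Hypothetical service under test: a stub exposing GET /health."""
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging in test output

def check_health(base_url):
    """One automated API check: GET /health, validate status and payload."""
    with urllib.request.urlopen(base_url + "/health") as resp:
        assert resp.status == 200, f"unexpected status {resp.status}"
        payload = json.loads(resp.read())
        assert payload == {"status": "ok"}, f"unexpected body {payload}"
    return True

def run_health_check():
    """Start the stub on a free port, run the check, then shut down."""
    server = HTTPServer(("127.0.0.1", 0), StubHandler)  # port 0 = any free port
    threading.Thread(target=server.serve_forever, daemon=True).start()
    try:
        return check_health(f"http://127.0.0.1:{server.server_port}")
    finally:
        server.shutdown()
        server.server_close()

print(run_health_check())
```

In a real suite the same check would run under a test runner (e.g. pytest) inside the CI/CD regression pipeline, pointed at a deployed environment rather than a local stub.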
Role Summary
As a Data Engineer, you will be an integral part of our Data Engineering team supporting an event-driven, serverless data engineering pipeline on the AWS cloud, responsible for assisting in the end-to-end analysis, development & maintenance of data pipelines and systems (DataOps). You will work closely with fellow data engineers & production support to ensure the availability and reliability of data for analytics and business intelligence purposes.
Requirements:
· Around 4 years of working experience in data warehousing / BI systems.
· Strong hands-on experience with Snowflake AND strong programming skills in Python
· Strong hands-on SQL skills
· Knowledge of any of the cloud databases such as Snowflake, Redshift, Google BigQuery, RDS, etc.
· Knowledge of dbt for cloud databases
· AWS Services such as SNS, SQS, ECS, Docker, Kinesis & Lambda functions
· Solid understanding of ETL processes, and data warehousing concepts
· Familiarity with version control systems (e.g., Git/Bitbucket) and collaborative development practices in an agile framework
· Experience with scrum methodologies
· Infrastructure build tools such as CloudFormation (CFT) / Terraform are a plus.
· Knowledge on Denodo, data cataloguing tools & data quality mechanisms is a plus.
· Strong team player with good communication skills.
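To ground the "event-driven serverless pipeline" described in the role summary, here is a hedged sketch of an AWS Lambda handler consuming an SQS batch. The order-event schema (an `order_id` field) is a made-up assumption for illustration; the `batchItemFailures` shape follows Lambda's partial-batch-response convention for SQS, while the `processed` key is added purely for the example.

```python
import json

def lambda_handler(event, context=None):
    """Sketch of a Lambda handler for an SQS batch event. Each record body
    is assumed (for illustration) to be a JSON order event; failed records
    are reported back so SQS retries only those (partial-batch response)."""
    failures = []
    processed = 0
    for record in event.get("Records", []):
        try:
            order = json.loads(record["body"])
            if "order_id" not in order:
                raise ValueError("missing order_id")
            processed += 1  # a real handler would load to the warehouse here
        except Exception:
            # Report only the failed message ID so SQS retries just this one
            failures.append({"itemIdentifier": record["messageId"]})
    return {"processed": processed, "batchItemFailures": failures}

# Synthetic SQS event: one valid and one malformed record
event = {"Records": [
    {"messageId": "m1", "body": json.dumps({"order_id": 42})},
    {"messageId": "m2", "body": "not json"},
]}
print(lambda_handler(event))
```

The same handler shape pairs naturally with the SNS/SQS/Kinesis/Lambda services listed in the requirements; the partial-batch response keeps a single bad message from forcing the whole batch to be reprocessed.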
Overview Optisol Business Solutions
OptiSol was named on this year's Best Companies to Work For list by Great Place to Work. We are a team of 500+ Agile employees with a development center in India and global offices in the US, UK (United Kingdom), Australia, Ireland, Sweden, and Dubai. In our 16+ years of joyful journey, we have built 500+ digital solutions and serve 200+ happy and satisfied clients across 24 countries.
Benefits, working with Optisol
· Great Learning & Development program
· Flextime, Work-at-Home & Hybrid Options
· A knowledgeable, high-achieving, experienced & fun team.
· Spot Awards & Recognition.
· The chance to be a part of the next success story.
· A competitive base salary.
More Than Just a Job, We Offer an Opportunity To Grow. Are you the one looking to build your future and build your dream? We have the job for you, to make your dream come true.
Experience: 8+ Years
Work Location: Hyderabad
Mode of work: Work from Office
Senior Data Engineer / Architect
Summary of the Role
The Senior Data Engineer / Architect will be a key role within the data and technology team, responsible for engineering and building data solutions that enable seamless use of data within the organization.
Core Activities
- Work closely with the business teams and business analysts to understand and document data usage requirements
- Develop designs relating to data engineering solutions including data pipelines, ETL, data warehouse, data mart and data lake solutions
- Develop data designs for reporting and other data use requirements
- Develop data governance solutions that provide data governance services including data security, data quality, data lineage etc.
- Lead implementation of data use and data quality solutions
- Provide operational support for users for the implemented data solutions
- Support development of solutions that automate reporting and business intelligence requirements
- Support development of machine learning and AI solution using large scale internal and external datasets
Other activities
- Work on and manage technology projects as and when required
- Provide user and technical training on data solutions
Skills and Experience
- At least 5-8 years of experience in a senior data engineer / architect role
- Strong experience with AWS based data solutions including AWS Redshift, analytics and data governance solutions
- Strong experience with industry standard data governance / data quality solutions
- Strong experience with managing a PostgreSQL data environment
- Background as a software developer working in AWS / Python will be beneficial
- Experience with BI tools like Power BI and Tableau
- Strong written and oral communication skills
Skills required:
- Excellent front-end/UI skills (JS, HTML, Angular, Flutter) and/or Android skills
- Understanding of Node.js and server-side technologies; exposure to databases
- Deep know-how of data structures and algorithms
- Hands-on development across technologies
- B.E/B.Tech (Computer Science/Equivalent) from a reputed institute
• Should have hands-on experience in Java + Spring Boot and REST/JSON APIs
• Should have hands-on experience in Java + Angular, REST/JSON APIs, and Spring ORM (Spring Data JPA or Hibernate)
• Basic HTML, CSS, Bootstrap
• Basic JavaScript, jQuery, or UI frameworks
• Knowledge of SQL will be an advantage
• Ability to quickly diagnose problem areas and come up with solutions and/or workarounds
• Understanding of source code management and necessary technical documentation
• Hands-on experience and knowledge of REST APIs and JSON
• Should have been part of the development of production-grade applications on Java Spring + REST/JSON + HTML + JavaScript
• Should have been part of the development of production-grade applications on Java + Angular + REST/JSON + HTML + JavaScript
• Good communication skills, as the role requires direct client interaction
Roles & Responsibilities:
Technical Skills:
- .NET – C#, .NET Core, MVC, Framework, Web API, Web Services, Microservices, and SQL
- Azure – Azure Cloud, SaaS, PaaS, IaaS, Azure relational and NoSQL databases, Big Data services
Responsibilities
- Good understanding of and experience in working on Microsoft Azure (IaaS/PaaS/SaaS)
- Ability to architect, design, and implement cloud-based solutions
- Proven track record of designing and implementing IoT-based solutions/Big Data solutions/applications on the Azure cloud platform.
- Experience in building .Net-based enterprise distributed solutions in Windows and Linux.
- Experience in using CI and CD tools such as Jenkins/Azure Pipelines and Terraform. Experience in using other tooling such as Ansible, CloudFormation, etc.
- Good understanding of HA/DR Setups in Cloud
- Experience and working knowledge of Virtualization, Networking, Data Center, and Security
- Deep hands-on experience in the design, development, and deployment of business software at scale.
- Strong hands-on experience in Azure Cloud Platform
- Experience in Kubernetes, Docker, and other cloud deployment, container technologies
- Experience / knowledge of other cloud offerings (e.g. AWS, GCP) will be added advantage
- Experience with monitoring tools like Prometheus, Grafana, Datadog, etc.
- Team Name - SDET
- Skills and Stacks - Java, Spring Boot, MySQL, AWS stack, HTTP/gRPC
- Project 1-line description - The team will be required to close the P0 E2E automation