
Selenium WebDriver Jobs in Bangalore (Bengaluru)

Explore top Selenium WebDriver job opportunities in Bangalore (Bengaluru) at top companies and startups. All jobs are added by verified employees, who can be contacted directly below.

MTS SDET at Rubrik

Founded 2012 · Product · 250+ employees · Raised funding
Skills: Test Automation (QA), Python, Selenium WebDriver, Algorithms, Data Structures, Virtualization, Storage & Networking, Software Testing (QA)
Location: Bengaluru (Bangalore)
Experience: 3 - 15 years
Salary: 20 - 60 lacs/annum

Rubrik is creating the cloud data management space. We make it easy for businesses to protect, search, secure, and analyze all of their data simply and scalably. As cloud continues to grow at an astounding rate, we'll be solving some of its most interesting challenges while building a product unlike anything seen before. This is a massive challenge and we're just getting started, so there is a lot of opportunity for personal growth and contribution.

We take quality very seriously. As an SDET at Rubrik, you will work closely with our engineering team to maintain a high level of quality within the product. Quality is a team responsibility, and our SDETs serve as advocates who help the entire team build a comprehensive and scalable test and automation framework to make sure we deliver the best possible product.

Responsibilities
- Lead assessment and planning of the test efforts required for the Rubrik Cloud Data Management platform.
- Implement and maintain a test environment for Rubrik's products.
- Design and document detailed test cases covering all levels of test, including performance and scalability under load/stress.
- Ensure adequate coverage of functional requirements and define the acceptance criteria.
- Analyze test results to ensure functionality and recommend appropriate action.
- Work with product development engineers and track all problem reports to closure.

Requirements
- BS in Computer Science or a related technical field
- Excellent coding skills in C, C++, Java, or Python
- Scripting skills in Python, Perl, Shell, or another common language
- Familiarity with one of the following domains: databases (Oracle/MSSQL/NoSQL/RDBMS), storage (SAN/NAS/HCI), or virtualization (AHV, Hyper-V, ESXi, XenServer)
- Understanding of web technologies (HTTP/S, HTML, JavaScript, XML, JSON)
- Hands-on development of automated tests using tools like Selenium, Appium, TestNG, Jenkins, and Maven (a minimal example is sketched after this listing)
- Experience with automated testing of RESTful web services
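As a rough illustration of the kind of Selenium-based automated UI test this role describes (not Rubrik's actual framework; the URL and the element locator below are hypothetical placeholders), a minimal Python sketch:

```python
# Minimal Selenium WebDriver check, assuming chromedriver is installed and on PATH.
# The URL and the "username" locator are hypothetical placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC


def test_login_page_shows_username_field():
    driver = webdriver.Chrome()
    try:
        driver.get("https://example.com/login")
        # Wait up to 10 seconds for the username field to appear, then assert it is visible.
        field = WebDriverWait(driver, 10).until(
            EC.presence_of_element_located((By.ID, "username"))
        )
        assert field.is_displayed()
    finally:
        driver.quit()
```

A check like this can run under pytest and be wired into a CI job such as Jenkins, which the listing mentions alongside TestNG and Maven.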

Job posted by Sunil Goyal

Data Crawler at EdGE Networks

Founded 2012 · Product · 51-250 employees · Profitable
Skills: Python, Selenium WebDriver, Scrapy, Web crawling, Apache Nutch, output.io, Crawlera, Cassandra
Location: Bengaluru (Bangalore)
Experience: 2 - 7 years
Salary: 5 - 20 lacs/annum

Brief About the Company
EdGE Networks Pvt. Ltd. is an innovative HR technology solutions provider focused on helping organizations meet their talent-related challenges. With our expertise in Artificial Intelligence, Semantic Analysis, Data Science, Machine Learning and Predictive Modelling, we enable HR organizations to lead with data and intelligence. Our solutions significantly improve workforce availability, billing and allocation, and drive straight bottom-line impact. For more details, please log on to www.edgenetworks.in and www.hirealchemy.com.

Summary of the Role
We are looking for a skilled and enthusiastic Data Procurement Specialist for web crawling and public data scraping. You will:
- Design, build, and improve our distributed system of web crawlers.
- Integrate with third-party APIs to improve results.
- Integrate the crawled and scraped data into our databases.
- Create more and better ways to crawl relevant information.

Requirements:
- Strong knowledge of web technologies (HTML, CSS, JavaScript, XPath, RegEx)
- Good knowledge of Linux command-line tools
- Experienced in Python, with knowledge of the Scrapy framework (a minimal spider is sketched after this listing)
- Strong knowledge of Selenium (Selenium WebDriver is a must)
- Familiarity with web frontiers like Frontera
- Familiarity with distributed messaging middleware (Kafka)

Desired:
- Practical, hands-on experience with modern Agile development methodologies
- Ability to thrive in a fast-paced, test-driven, collaborative, and iterative programming environment
- Experience with web crawling projects
- Experience with NoSQL databases (HBase, Cassandra, MongoDB, etc.)
- Experience with CI tools (Git, Jenkins, etc.)
- Experience with distributed systems
- Familiarity with data loading tools like Flume
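As a rough sketch of the Scrapy-based crawling work this role describes (not EdGE Networks' actual crawler; the start URL, CSS selectors, and field names below are hypothetical placeholders):

```python
# Minimal Scrapy spider sketch. The start URL and selectors are hypothetical placeholders.
import scrapy


class ListingSpider(scrapy.Spider):
    name = "listings"
    start_urls = ["https://example.com/jobs"]

    def parse(self, response):
        # Yield one item per listing block found on the page.
        for listing in response.css("div.job-listing"):
            yield {
                "title": listing.css("h2::text").get(),
                "location": listing.css("span.location::text").get(),
            }
        # Follow pagination if a "next" link is present.
        next_page = response.css("a.next::attr(href)").get()
        if next_page:
            yield response.follow(next_page, callback=self.parse)
```

A spider like this can be run standalone with "scrapy runspider listing_spider.py -o listings.json"; for JavaScript-heavy pages the listing also calls for Selenium WebDriver, which can render a page before its HTML is handed to a parser.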

Job posted by Naveen Taalanki
Why apply via CutShort?
Connect directly with the actual hiring teams and get a fast response. No third-party recruiters. No spam.