• Strong knowledge of SQL and ETL Testing
• Extensive experience in ETL/data warehouse back-end testing and Business Intelligence report testing
• Hands-on back-end testing skills and a strong grasp of RDBMS concepts and testing methodologies
• Expertise in test management and defect tracking tools, e.g. HP Quality Center, Jira
• Proficient in working with the SDLC and Agile methodology
• Excellent knowledge of database systems: Vertica/Oracle/Teradata
• Knowledge of security testing is an added advantage
• Experience testing various Business Intelligence reports using Tableau
• Strong comprehension, analytical, and problem-solving skills
• Good interpersonal and communication skills, quick learner, and good troubleshooting capabilities
• Good knowledge of the Python programming language
• Working knowledge of AWS
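The back-end ETL testing described above typically reduces to reconciling source and target tables with SQL. A minimal sketch, using an in-memory SQLite database as a stand-in for Vertica/Oracle/Teradata (the table and column names are illustrative, not from any specific system):

```python
import sqlite3

# In-memory database standing in for a source system and a warehouse target.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def scalar(sql):
    """Run a query and return its single scalar result."""
    return conn.execute(sql).fetchone()[0]

# Check 1: row counts must match between source and target.
src_count = scalar("SELECT COUNT(*) FROM src_orders")
dw_count = scalar("SELECT COUNT(*) FROM dw_orders")
assert src_count == dw_count, f"row count mismatch: {src_count} vs {dw_count}"

# Check 2: aggregates must match, catching value-level drift the count misses.
assert scalar("SELECT SUM(amount) FROM src_orders") == \
       scalar("SELECT SUM(amount) FROM dw_orders"), "amount totals diverge"

# Check 3: no orphan keys -- every source order must have landed in the target.
missing = scalar("""
    SELECT COUNT(*) FROM src_orders s
    LEFT JOIN dw_orders d ON s.order_id = d.order_id
    WHERE d.order_id IS NULL
""")
assert missing == 0, f"{missing} source rows missing from target"
print("reconciliation passed")
```

In practice the same count/aggregate/orphan-key pattern runs against the real source and warehouse connections, usually wrapped in a test framework so each check reports separately.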
Cloud DevOps Architect
· Practices self-leadership and promotes learning in others by building relationships with cross-functional stakeholders; communicating information and providing advice to drive projects forward; adapting to competing demands and new responsibilities; providing feedback to others; mentoring junior team members; creating and executing plans to capitalize on strengths and improve opportunity areas; and adapting to and learning from change, difficulties, and feedback.
· Ensure appropriate translation of business requirements and functional specifications into physical program designs, code modules, stable application systems, and software solutions by partnering with Business Analysts and other team members to understand business needs and functional specifications.
· Build use cases/scenarios and reference architectures to enable rapid adoption of cloud services in the product’s cloud journey.
· Provide insight into recommendations for technical solutions that meet design and functional needs.
· Experience or familiarity with Firewall/NGFW deployed in a variety of form factors (Checkpoint, Imperva, Palo Alto, Azure Firewall).
· Establish credibility & build deep relationships with senior technical individuals to enable them to be cloud advocates.
· Participate in deep architectural discussions to build confidence and ensure engineering success when building new and migrating existing applications, software, and services to AWS and GCP.
· Conduct deep-dive hands-on education/training sessions to transfer knowledge to DevOps and engineering teams considering or already using public cloud services.
· Be a cloud (Amazon Web Services, Google Cloud Platform) and DevOps evangelist and advise stakeholders on cloud readiness, workload identification, migration, and identifying the right multi-cloud mix to effectively accomplish business objectives.
· Understand engineering requirements and architect scalable solutions adopting DevOps and leveraging advanced technologies such as AWS CodePipeline, AWS CodeCommit, ECS containers, API Gateway, CloudFormation templates, AWS Kinesis, Splunk, Dome9, AWS SQS, AWS SNS, SonarQube, microservices, and Kubernetes to realize stronger benefits and future-proof outcomes for customer-facing applications.
· Be an integral part of the technology and architecture community in the public cloud partners (AWS, GCP, Azure) and bring in new services launched by cloud providers into 8K Miles Product Platform scope.
· Capture and share best-practice knowledge amongst the DevOps and Cloud community.
· Act as a technical liaison between product management, service engineering, and support teams.
· Qualifications:
o Master’s Degree in Computer Science/Engineering with 12+ years’ experience in information technology (networking, infrastructure, database).
o Strong and recent exposure to AWS/GCP/Azure cloud platforms and designing hybrid multi-cloud solutions. Certification as an AWS Architect Professional or similar is preferred.
· Working knowledge of UNIX shell scripting.
· Strong hands-on programming experience in Python
· Working knowledge of data visualization tools – Tableau.
· Experience working in cloud environment — AWS.
· Experience working with modern tools in the Agile Software Development Life Cycle.
· Version Control Systems (e.g. Git, GitHub, Stash/Bitbucket), Knowledge Management (e.g. Confluence, Google Docs), Development Workflow (e.g. Jira), Continuous Integration (e.g. Bamboo), Real-Time Collaboration (e.g. HipChat, Slack).
Sr IT Administrator
Experience: 4-5 years
Location: Hyderabad (Hybrid)
Mandatory skills: AWS, VMware, troubleshooting, SQL, problem solving
Responsibilities:
Diagnose and troubleshoot technical issues - Windows
Upgrade the network infrastructure - Windows
Install servers (VMware and Dell servers), devices, and firewalls (SonicWall)
Monitor the performance of servers, software and hardware
Ensure the smooth deployment of new applications and SQL DB changes (manual, scripted, and CI/CD) in a Windows environment
Update technical documentation.
Requirements and skills:
Solid understanding of LAN/WAN and Hybrid networks
Good problem-solving and communication skills
Good experience with AD, DNS, MS SQL Server, IIS, VMware, and AWS
Hands-on experience with migration of servers (file, AD, and application servers)
Good experience with AWS and VMware ESXi networks (migration, upgrade, and troubleshooting)
Knowledge of security techniques.
Should have worked on complex projects
All experience is expected to be in the Windows domain only, not Linux or Unix.
Working knowledge in AWS is also mandatory.
We need people with good exposure, a good attitude, and a willingness to work in a team.
Title: Test Automation Engineer (Senior)
Type: Permanent
Region: India (Mumbai)
Location: Hybrid
About the Role:
Join our dynamic QA Automation Team and take your career to the next level! We are looking for Senior Level Automation Engineers to join our team and grow with us long term. You will have the opportunity to work with cutting-edge technologies and bring your expertise in using Bash and JavaScript, testing APIs, web services, and webpages, and working with the Cloud (Azure, AWS, GCP). This is a hybrid role, meaning you will need to spend 1-2 days in the office each week.
Responsibilities:
· Develop and execute automation scripts using open-source tools
· Identify, record, document thoroughly and track bugs
· Perform thorough regression testing when bugs are resolved
· Monitor debugging process results
· Track quality assurance metrics, like defect densities and open defect counts
· Create detailed, comprehensive, and well-structured test plans and test cases
· Estimate, prioritize, plan, and execute testing activities
· Stay up to date with new testing tools and test strategies
Required Experience:
· At least 5 years’ experience in test automation for Senior Level.
· Proven work experience in web-based and/or mobile-based quality assurance best practices
· Strong knowledge of software QA methodologies and processes
· Strong experience with automated testing tools (e.g., JMeter, Selenium)
· Scripting experience using Bash, JavaScript or NodeJS
· Experience in testing APIs, web services, and webpages
· Experience in writing clear, concise, and comprehensive test plans and test cases.
· Experience in working in an Agile/Scrum development process
· Knowledge of Continuous Integration and Continuous build
· Experience working with Cloud (Azure, AWS, GCP)
· Advanced experience in defect management and prioritization
· Excellent analytical skills, with demonstrable experience driving issues to resolution
· A good eye for identifying opportunities to add greater value and accuracy to our current testing processes
· Experience with high-traffic public websites, security testing, and the scalability and performance challenges of server-side code; knowledge of Kubernetes, Docker, or any container orchestration
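The API and web-service testing listed above usually centres on asserting response structure and business rules. A minimal, framework-free sketch in Python (the payload shape and required fields are invented for illustration; in practice the body would come from an HTTP client and the checks would run under pytest or a similar runner):

```python
import json

# A canned response body standing in for the result of a real HTTP call.
raw_response = '{"id": 42, "status": "active", "items": [{"sku": "A1", "qty": 2}]}'

def validate_order_payload(raw: str) -> list:
    """Return a list of validation errors (an empty list means the payload passed)."""
    errors = []
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]

    # Required top-level fields with their expected types.
    for field, expected_type in [("id", int), ("status", str), ("items", list)]:
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            errors.append(f"wrong type for {field}")

    # Business rule: every line item needs a positive quantity.
    for item in payload.get("items", []):
        if item.get("qty", 0) <= 0:
            errors.append(f"non-positive qty for sku {item.get('sku')}")
    return errors

errors = validate_order_payload(raw_response)
assert errors == [], errors
print("payload valid")
```

Collecting errors into a list rather than failing on the first check lets a single test run report every problem with a response at once.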
About Ovyo
Ovyo works globally with companies in the TV, Media & Networks industries, including household content brands and operators. Our people provide consulting services to build the platforms, test the apps, and drive the programmes that shape the way the world watches video and connects. Our management team is in the UK, and we have technical and operations teams based in India, South Africa, Europe, and the Americas.
Familiar with the MicroStrategy architecture; Admin Certification preferred
· Familiar with administrative functions, using Object Manager, Command Manager, installation/configuration of MSTR in clustered architecture, applying patches, hot-fixes
· Monitor and manage existing Business Intelligence development/production systems
· MicroStrategy installation, upgrade, and administration on Windows and Linux platforms
· Ability to support and administer multi-tenant MicroStrategy infrastructure including server security troubleshooting and general system maintenance.
· Analyze application and system logs while troubleshooting and root cause analysis
· Work on operations such as deploying and managing packages, user management, schedule management, governing-settings best practices, and database instance and security configuration.
· Monitor, report and investigate solutions to improve report performance.
· Continuously improve the platform through tuning, optimization, governance, automation, and troubleshooting.
· Provide support for the platform, report execution and implementation, user community and data investigations.
· Identify improvement areas in Environment hosting and upgrade processes.
· Identify automation opportunities and participate in automation implementations
· Provide on-call support for Business Intelligence issues
· Experience working on MSTR 2021, including knowledge of Enterprise Manager and new features like Platform Analytics, HyperIntelligence, Collaboration, MSTR Library, etc.
· Familiar with AWS, Linux Scripting
· Knowledge of MSTR Mobile
· Knowledge of capacity planning and system’s scaling needs
Key Responsibilities & Job Duties
● Provide functional support for our ERP Finance applications (Oracle E-Business Suite)
○ Assist users with their questions
○ Manage functional requests
● Train users to maintain the right level of skill
○ Organize audits of system use
○ Detect and collect issues
○ Implement action plans and coordinate actions
● Manage enhancements in accordance with the procedures that apply in the department
○ Collect enhancements from users
○ Challenge the user requirements
● Define priorities
○ Write functional specifications and coordinate developments with the technical team
● Test and train the users
○ Regularly communicate to the users the status of the enhancement
○ Report activity to the management
○ Collaborate with functional support in other zones to share information
○ Optionally, may manage some projects
Key Technical & Functional Competencies
● Functional competencies
○ Strong knowledge of business processes: Finance
● Technical competencies
○ Good knowledge of Oracle SQL
○ Strong knowledge of Oracle E-Business Suite modules: AP, AR, GL, FA (at least at a professional level for two modules)
○ Basic knowledge of Oracle E-Business Suite modules (INV, PO, OE, WSH) is preferable
Company Profile:
Easebuzz is a payment solutions company (fintech organisation) which enables online merchants to accept, process, and disburse payments through developer-friendly APIs. We are focused on building plug-and-play products, including the payment infrastructure, to solve complete business problems. It is a wonderful place where payments, lending, subscriptions, and eKYC all happen at the same time.
We have been consistently profitable and are constantly developing new innovative products; as a result, we have grown 4x over the past year alone. We are well capitalised and recently closed a fundraise of $4M in March 2021 from prominent VC firms and angel investors. The company is based out of Pune and has a total strength of 180 employees. Easebuzz’s corporate culture is tied to the vision of building a workplace that breeds open communication and minimal bureaucracy. An equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.
Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.
Salary: As per company standards.
Designation: Data Engineer
Location: Pune
Experience with ETL, Data Modeling, and Data Architecture
Design, build, and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with third-party tools: Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue.
Experience with AWS cloud data lake for development of real-time or near real-time use cases
Experience with messaging systems such as Kafka/Kinesis for real time data ingestion and processing
Build data pipeline frameworks to automate high-volume and real-time data delivery
Create prototypes and proof-of-concepts for iterative development.
Experience with NoSQL databases, such as DynamoDB, MongoDB etc
Create and maintain optimal data pipeline architecture.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
Evangelize a very high standard of quality, reliability and performance for data models and algorithms that can be streamlined into the engineering and sciences workflow
Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.
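The extract-transform-load responsibilities above can be illustrated with a deliberately small, dependency-free pipeline. This is a sketch only: an in-memory CSV stands in for S3/Kafka sources, SQLite stands in for the warehouse, and the schema and field names are invented; in production the same shape would be built on the Spark/Glue/Redshift stack listed above.

```python
import csv
import io
import sqlite3

# Extract: read raw records (an in-memory CSV stands in for real sources).
raw = io.StringIO("user_id,event,value\n1,click,3\n2,click,5\n1,view,\n")
rows = list(csv.DictReader(raw))

# Transform: drop incomplete records and cast string fields to proper types.
clean = [
    {"user_id": int(r["user_id"]), "event": r["event"], "value": int(r["value"])}
    for r in rows
    if r["value"]  # skip rows with a missing value
]

# Load: write the cleaned records into a warehouse-style table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event TEXT, value INTEGER)")
conn.executemany("INSERT INTO events VALUES (:user_id, :event, :value)", clean)

# A downstream "analytics" query over the loaded data.
total_clicks = conn.execute(
    "SELECT SUM(value) FROM events WHERE event = 'click'"
).fetchone()[0]
print(total_clicks)  # 8 (3 + 5)
```

Keeping extract, transform, and load as separate stages, as here, is what makes each stage independently testable and swappable when the pipeline is later re-platformed onto managed services.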
Employment Type
Full-time
- 5+ years of Quality Assurance/Testing experience.
- 3+ years of Data Quality experience, or SDET experience with a focus on data, data warehousing, reporting, etc.
- 3+ years of Data Quality experience, or QA experience with a focus on Android, iOS, Roku, and connected devices.
- 3+ years of testing experience working within an Agile environment, and with Agile Management tools such as JIRA.
- Experience with Automation Framework development using Java.
- Experience with Performance Test Design, Development, and load testing execution.
- Design, create and maintain assets used to execute performance tests and contribute to the execution and monitoring of performance test executions using ApacheJMeter, LoadRunner, or similar tools.
- Working knowledge of Java, JVM, Spring Boot, data warehousing, data integration, SQL Server, Apache Kafka, data streaming, big data, MongoDB, SQL, web services, microservices, ETL, change data capture (CDC), and DevOps.
- Strong SQL experience, with knowledge of AWS Redshift, Snowflake, or columnar databases.
- Experience with reporting or analytics tools like Tableau or Mode.
- Experience working with Amazon Web Services, querying, and working with data in various AWS services.
- Programming experience in a language such as Python, Java, etc. for the purposes of parsing files and running queries.
- Experience with analytics implementations (network events, ad beacons, user action events, etc.) in a web or mobile application.
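The data-quality work listed above commonly reduces to automated checks on completeness, uniqueness, and validity. A small illustrative sketch in plain Python (the records and thresholds are invented; in a real pipeline these checks would run as SQL against Redshift/Snowflake tables):

```python
# Sample records standing in for a warehouse extract.
records = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": "b@example.com", "age": 28},
    {"id": 3, "email": None,            "age": 41},
    {"id": 3, "email": "c@example.com", "age": 19},  # duplicate id
]

def quality_report(rows):
    """Compute completeness, uniqueness, and validity metrics for a batch."""
    ids = [r["id"] for r in rows]
    return {
        # Completeness: share of rows with a non-null email.
        "email_completeness": sum(r["email"] is not None for r in rows) / len(rows),
        # Uniqueness: the primary-key column must not repeat.
        "duplicate_ids": len(ids) - len(set(ids)),
        # Validity: ages must fall in a plausible range.
        "invalid_ages": sum(not (0 < r["age"] < 130) for r in rows),
    }

report = quality_report(records)

# Fail the batch if any metric breaches its threshold.
failures = []
if report["email_completeness"] < 0.95:
    failures.append("email completeness below 95%")
if report["duplicate_ids"] > 0:
    failures.append("duplicate primary keys found")
if report["invalid_ages"] > 0:
    failures.append("out-of-range ages found")
print(failures)
```

Separating metric computation from thresholding, as above, lets the same report feed both hard gates in CI and trend dashboards in a reporting tool.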
Your mission is to help lead the team toward creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing, and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex and mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer, you will be responsible for the development of data pipelines for numerous applications handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially in Spark and Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum 3+ years of relevant experience working on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big Data technologies: Hadoop, Spark, Hive, Kafka
- Writing well-optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and AGILE.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- A data integration tool like Pentaho, NiFi, SSIS, etc. (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
As an Engineering Manager, your role would involve architecting systems capable of serving as the brains of complex distributed products. In addition, you'd also closely manage engineers on the team and contribute to team building.
A strong technologist at Meesho cares about code modularity, scalability, re-usability and thrives in a complex and ambiguous environment.
Required skill & Experience:
- Bachelor's/Master's in Computer Science or equivalent from a premier institute, with at least 8+ years of overall professional experience and at least 2+ years of experience managing/leading software development teams.
- Create clear career paths for team members and help them grow with regular & deep mentoring. Perform regular performance evaluation and share and seek feedback.
- Able to drive sprints and OKRs.
- Exceptional team management skills; experience in building large-scale distributed systems
- Experience in Scalable Systems - transactional systems (B2C)
- Expertise in Java/J2EE and multithreading
- Deep understanding of transactional and NoSQL DBs
- Deep understanding of messaging systems: Kafka
- Good experience on cloud infrastructure - AWS preferably
- Good to have: Data pipelines, ES
- Ability to think and analyze both breadth-wise and depth-wise while designing and implementing services
- Excellent teamwork skills, flexibility, and ability to handle multiple tasks.