25+ Shell Scripting Jobs in Hyderabad | Shell Scripting Job openings in Hyderabad
Apply to 25+ Shell Scripting Jobs in Hyderabad on CutShort.io. Explore the latest Shell Scripting Job opportunities across top companies like Google, Amazon & Adobe.
Linux System Administrator and Database Administrator at KITAAB TECHNOLOGIES PRIVATE LIMITED
JOB DESCRIPTION
We are seeking a skilled Linux System Administrator and Database Administrator to join our team. The ideal candidate will possess a strong understanding of Linux commands and shell scripting, along with experience in handling database administration tasks, particularly with PostgreSQL. Additionally, proficiency in AWS EC2 and cloud fundamentals, AWS networking, server and database monitoring, and securing production environments is essential for this role.
RESPONSIBILITIES
1. Manage Linux-based systems, including installation, configuration, troubleshooting, and maintenance.
2. Develop and maintain shell scripts for automating system tasks and processes.
3. Administer PostgreSQL databases, including installation, configuration, performance tuning, backup, and recovery (a minimal backup-script sketch follows this list).
4. Deploy and manage AWS EC2 instances, ensuring optimal performance and security.
5. Implement and manage AWS networking configurations, such as VPCs, subnets, and security groups.
6. Monitor servers and databases for performance, availability, and security compliance.
7. Handle production workloads effectively, ensuring high availability and reliability.
8. Implement security best practices to safeguard production environments from unauthorized access and threats.
9. Manage Elasticsearch and Redis instances, if required, to support application needs (experience in these areas is a plus).
10. Collaborate with cross-functional teams to troubleshoot and resolve technical issues promptly.
11. Stay updated with emerging technologies and best practices in Linux systems administration, database management, and cloud computing.
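Responsibilities 2 and 3 often meet in a nightly backup job. Below is a minimal sketch in shell, assuming a local PostgreSQL instance and a role with dump privileges; the paths, database name, and retention window are illustrative, not part of the posting.

```bash
#!/usr/bin/env bash
# Nightly PostgreSQL backup with simple retention (illustrative values).
set -euo pipefail

BACKUP_DIR=/var/backups/postgres   # assumed location
DB_NAME=appdb                      # hypothetical database name
RETENTION_DAYS=7
STAMP=$(date +%F)

mkdir -p "$BACKUP_DIR"

# Custom-format dump; individual objects can be restored with pg_restore.
pg_dump --format=custom --file="$BACKUP_DIR/${DB_NAME}_${STAMP}.dump" "$DB_NAME"

# Drop dumps older than the retention window.
find "$BACKUP_DIR" -name "${DB_NAME}_*.dump" -mtime +"$RETENTION_DAYS" -delete
```

Scheduled from cron, a script like this covers the routine half of backup administration; recovery should still be rehearsed separately with pg_restore.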
SKILLS
1. Proven experience as a Linux System Administrator and Database Administrator.
2. Strong knowledge of Linux commands, shell scripting, and system administration.
3. Hands-on experience with PostgreSQL database administration, including performance tuning and backup/recovery.
4. Proficiency in AWS services, particularly EC2, VPC, IAM, and CloudWatch.
5. Experience with networking concepts in AWS, including VPCs, subnets, and security groups.
6. Familiarity with server and database monitoring tools (e.g., Nagios, Zabbix, Prometheus).
7. Ability to manage production workloads and prioritize tasks effectively.
8. Knowledge of security best practices and techniques for securing production environments.
9. Experience with Elasticsearch and Redis is a plus.
10. Excellent problem-solving skills and attention to detail.
11. Strong communication and interpersonal skills, with the ability to collaborate effectively in a team environment.
Aprajita Consultancy
Role: Oracle DBA Developer
Location: Hyderabad
Required Experience: 8+ years
Skills: DBA, Terraform, Ansible, Python, Shell Script, DevOps activities, Oracle DBA, SQL Server, Cassandra, Oracle SQL/PLSQL, MySQL/Oracle/MSSQL/Mongo/Cassandra, security measure configuration
Roles and Responsibilities:
1. 8+ years of hands-on DBA experience in one or many of the following: SQL Server, Oracle, Cassandra
2. DBA experience in a SRE environment will be an advantage.
3. Experience in automation and in building databases by providing self-service tools; analyze and implement solutions for database administration (e.g., backups, performance tuning, troubleshooting, capacity planning).
4. Analyze solutions and implement best practices for cloud database and their components.
5. Build and enhance tooling, automation, and CI/CD workflows (Jenkins etc.) that provide safe self-service capabilities to the engineering teams.
6. Implement proactive monitoring and alerting to detect issues before they impact users. Use a metrics-driven approach to identify and root-cause performance and scalability bottlenecks in the system.
7. Work on automation of database infrastructure and help engineering succeed by providing self-service tools.
8. Write database documentation, including data standards, procedures, and definitions for the data dictionary (metadata)
9. Monitor database performance, control access permissions and privileges, plan capacity, implement changes, and apply new patches and versions when required (a minimal monitoring sketch follows this list).
10. Recommend query and schema changes to optimize the performance of database queries.
11. Have experience with cloud-based environments (OCI, AWS, Azure) as well as On-Premises.
12. Have experience with cloud databases such as SQL Server, Oracle, and Cassandra.
13. Have experience with infrastructure automation and configuration management (Jira, Confluence, Ansible, Gitlab, Terraform)
14. Have excellent written and verbal English communication skills.
15. Plan, manage, and scale data stores to ensure a business's complex data requirements are met and its data can be accessed quickly, reliably, and safely.
16. Ensure the quality of orchestration and integration of tools needed to support daily operations by patching together existing infrastructure with cloud solutions and additional data infrastructure.
17. Protect data through rigorous testing of backup and recovery processes and frequent auditing of well-regulated security procedures.
18. Use software and tooling to automate manual tasks, enabling engineers to move fast without the concern of losing data during their experiments.
19. Define service level objectives (SLOs) and perform risk analysis to determine which problems to address and which to automate.
20. Bachelor's Degree in a technical discipline required.
21. DBA Certifications required: Oracle, SQLServer, Cassandra (2 or more)
22. Cloud and DevOps certifications will be an advantage.
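Items 6 and 9 above come down to scripted monitoring in practice. The sketch below queries Oracle tablespace usage through sqlplus and flags anything above a threshold; the SYSDBA connection, the threshold, and plain-print alerting are assumptions for illustration.

```bash
#!/usr/bin/env bash
# Flag Oracle tablespaces above a usage threshold (illustrative sketch).
set -euo pipefail
THRESHOLD=85   # percent used; assumed alert level

sqlplus -s / as sysdba <<'EOF' | awk -v t="$THRESHOLD" '$2 + 0 > t {print "ALERT:", $0}'
SET PAGESIZE 0 FEEDBACK OFF HEADING OFF
SELECT tablespace_name || ' ' || ROUND(used_percent, 1)
FROM   dba_tablespace_usage_metrics;
EOF
```

In a real SRE setup the print would be replaced by a pager or metrics push, and the check would be scheduled alongside the other health probes.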
Must have Skills:
- Oracle DBA with development
- SQL
- DevOps tools
- Cassandra
Machint Solutions, a US-registered IT & digital automation products and services organization, is seeking to hire a couple of LINUX ADMINISTRATORS for its office in Whitefields, Kondapur, Hyderabad, Telangana.
Job description
- Minimum 5 years of strong Linux (RHEL & SuSE) Admin knowledge & troubleshooting skills.
- Must know Storage integration with Linux.
- Must have strong scripting (Bash or Shell) knowledge (a basic health-check sketch follows these bullets).
- Cluster Knowledge (RHEL & SuSE)
- Monitoring & Patching tools knowledge
- Should have good experience with AWS Cloud & VMware
- Networking Knowledge with respect to Linux
- Work Location: Machint Solutions Private Limited., Whitefields, Kondapur, Hyderabad
- Notice period: Candidates who can join in 2 weeks are preferred.
- Interview: F2F at our office - Between 11 AM and 6 PM Monday through Friday
- Budget: Market standards
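As a flavor of the scripting and monitoring knowledge asked for above, here is a basic health-check sketch; the thresholds are illustrative and alerting is reduced to plain output.

```bash
#!/usr/bin/env bash
# Basic Linux health check: disk and memory pressure (illustrative thresholds).
set -euo pipefail
DISK_LIMIT=90   # percent used

# Flag any filesystem above the disk threshold.
df -hP | awk -v limit="$DISK_LIMIT" 'NR > 1 {
    gsub(/%/, "", $5)
    if ($5 + 0 > limit) print "WARN: " $6 " at " $5 "% used"
}'

# Report available memory in MiB (procps "available" column).
free -m | awk '/^Mem:/ {print "Memory available: " $7 " MiB"}'
```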
Please share your updated resume on ram dot n at machint dot com with salary and notice period info to initiate the hiring process.
Mandatory skill set: C++ and Python - UNIX - Database - SQL or Postgres
Developer role experience: 3 to 5 years
Location: Bangalore / Chennai / Hyderabad
1. Strong proficiency in C++ , with fair knowledge of the language specification (Telecom experience is preferred).
2. Proficient understanding of standard template library (STL): algorithms, containers, functions, and iterators
3. Must have experience on Unix platforms, should possess shell scripting skills.
4. Knowledge of compilers (gcc, g++) and debuggers (dbx). Knowledge of libraries and linking.
5. Good understanding of code versioning tools (e.g. Git, CVS etc.)
6. Able to write and understand python scripts (both python2 and python3)
7. Hands-on with logic implementation in Python; should be familiar with list comprehensions and comfortable integrating Python with C++ and Unix scripts (a small integration sketch follows this list).
8. Able to implement multithreading in both C++ and Python environment.
9. Familiar with PostgreSQL.
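Point 7's Python-with-Unix integration usually looks like embedding a small piece of Python logic inside a shell pipeline. A minimal sketch, assuming python3 is on the PATH; the data and the filter are illustrative.

```bash
#!/usr/bin/env bash
# Post-process a Unix pipeline with a Python list comprehension.
set -euo pipefail

# In practice the left side would be a C++ binary's stdout.
printf '%s\n' 3 14 7 42 9 | python3 -c '
import sys
# List comprehension as an inline parse-and-filter step.
values = [int(line) for line in sys.stdin]
print("\n".join(str(v) for v in values if v > 10))
'
```

The same pattern lets a C++ program's output be filtered or reshaped in Python without temporary files or glue code.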
C++ developer with Python as secondary - 3 to 4 years' experience / should be CW.
Position: ETL Developer
Location: Mumbai
Experience Level: 4+ years
Required Skills:
* Strong scripting knowledge such as: Python and Shell
* Strong relational database skills especially with DB2/Sybase
* Create high quality and optimized stored procedures and queries
* Strong with scripting languages such as Python and Unix/K-Shell
* Strong knowledge of relational database performance and tuning, such as proper use of indices, database statistics/reorgs, and de-normalization concepts.
* Familiarity with the lifecycle of a trade and the flows of data in an investment banking operation is a plus.
* Experienced in Agile development process
* Java Knowledge is a big plus but not essential
* Experience in delivery of metrics / reporting in an enterprise environment (e.g. demonstrated experience in BI tools such as Business Objects, Tableau, report design & delivery) is a plus
* Experience on ETL processes and tools such as Informatica is a plus. Real time message processing experience is a big plus.
* Good team player; Integrity & ownership
7-13 years as an Oracle DBA with production support experience, including Oracle 10g and 11g.
7-13 years of experience as an Oracle Apps DBA with Oracle EBS 11.5.10 and R12.
Strong understanding of Oracle RDBMS architecture
Hands-on experience with implementation, upgrade, and tuning of:
Large Oracle EBS environments, including Oracle EBS in production throughout all stages of the lifecycle
Enterprise Manager Grid Control
Backup and recovery concepts, including RMAN (a minimal RMAN wrapper sketch follows this section)
Oracle R12, Oracle RAC
Patch assessment and application in an EBS environment with activation of different modules
Performance tuning (e.g., SQL queries)
Proven experience with multiple operating systems: Linux/Unix/Solaris/Windows
Strong verbal, written, organizational and documentation skills.
Ability to provide accurate and realistic effort estimates, commit and deliver accordingly
Ability to work in a fast-paced environment, with occasional non-business-hours (evenings and weekends) flexibility, and able to prioritize and juggle multiple projects.
Ability to work productively in a team environment and participate as a positive and cooperative team member.
MS in CS or BS/BA in Computer Science with five years of relevant work experience.
Candidates should be flexible for Night shifts/Weekend Shifts
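For the RMAN point above, day-to-day backups are commonly wrapped in shell. A minimal sketch, assuming OS authentication to the target database and a configured recovery area; the log path is illustrative.

```bash
#!/usr/bin/env bash
# Full RMAN backup plus archived logs, with obsolete-backup cleanup (sketch).
set -euo pipefail
LOG=/var/log/rman_backup_$(date +%F).log   # assumed log location

rman target / <<'EOF' | tee "$LOG"
BACKUP DATABASE PLUS ARCHIVELOG;
DELETE NOPROMPT OBSOLETE;
EXIT;
EOF
```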
Preferred Skills:
Experience with Engineered Systems (e.g., ODAs)
Oracle Data Guard
Oracle RAC/Multitenant architecture, WebLogic, OEM 13c/12c
R12.2 Upgrade Experience on multiple projects
Experienced with Oracle Identity Management product suite
Immediate Joiners preferred
Job Title:
Telephony Engineer
Job Description:
- Support the operations team with debugging issues related to the telephony platform
- Identify issues with calls using monitoring and analysis tools such as VoIPmonitor
- Write scripts to automate tasks and monitor the services and functionalities (a minimal watchdog sketch follows this description)
- Contribute to the improvement of the system by providing ideas
- Help with capacity planning
- Perform the scheduled maintenance activities
- Modify the existing code to accommodate new features
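For the script-writing duty above, a minimal watchdog sketch; the systemd service name and restart-on-failure policy are assumptions, and the channel count uses the standard Asterisk CLI.

```bash
#!/usr/bin/env bash
# Watchdog: restart Asterisk if it is down, then log the live channel count.
set -euo pipefail

if ! systemctl is-active --quiet asterisk; then
    echo "$(date -Is) asterisk down, restarting" >&2
    systemctl restart asterisk
fi

# Query the running Asterisk instance from the shell.
asterisk -rx "core show channels count"
```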
Experience Range:
3-6 years
Educational Qualifications:
Any graduation
Job Responsibilities:
• Knowledge of open source technologies, VoIP, SIP, WebRTC etc.
• Shell scripting.
• Experience in Asterisk and Kamailio or OpenSIPS.
• C/C++ or JS programming with Linux.
• 5-8 years' experience.
• Good communication skills.
• Willingness to work at night, whenever required, to handle support-related issues.
Skills Required:
VoIP, OpenSIPS, SIP, Shell Scripting, Asterisk, Kamailio, C++, Linux, Development
This role is work-from-office.
Job Description
Roles & Responsibilities
- Work across the entire landscape that spans network, compute, storage, databases, applications, and business domain
- Use the Big Data and AI-driven features of vuSmartMaps to provide solutions that will enable customers to improve the end-user experience for their applications
- Create detailed designs, solutions and validate with internal engineering and customer teams, and establish a good network of relationships with customers and experts
- Understand the application architecture and transaction-level workflow to identify touchpoints and metrics to be monitored and analyzed
- Analyze data and provide insights and recommendations
- Constantly stay ahead in communicating with customers. Manage planning and execution of platform implementation at customer sites.
- Work with the product team in developing new features, identifying solution gaps, etc.
- Interest and aptitude in learning new technologies - Big Data, NoSQL databases, Elasticsearch, MongoDB, DevOps.
Skills & Experience
- At least 2 years of experience in IT infrastructure management
- Experience in working with large-scale IT infra, including applications, databases, and networks.
- Experience in working with monitoring tools, automation tools
- Hands-on experience in Linux and scripting.
- Knowledge/Experience in the following technologies will be an added plus: ElasticSearch, Kafka, Docker Containers, MongoDB, Big Data, SQL databases, ELK stack, REST APIs, web services, and JMX.
A software development company in HR applications
(Candidates from service-based companies may apply; looking for automation (shell or Python scripting) experience)
Shift: US East Coast or West Coast hours (2:30 PM to 10:30 PM India time, or 5 PM to 2 AM India time)
Experience: 5 to 8 years
Salary: up to 25 LPA
Hyderabad based candidates preferred!
Immediate joiners would be preferred!!
Role Objective:
- Ability to identify processes where efficiency could be improved via automation
- Ability to research, prototype, iterate and test automation solutions
- Good Technical understanding of Cloud service offering, with a sound appreciation of the associated business processes.
- Ability to build & maintain a strong working relationship with other Technical teams using the agile methodology (internal and external), Infrastructure Partners and Service Engagement Managers.
- Ability to shape and coordinate delivery of key initiatives to deliver improvements in stability
- Good understanding of the cost of the e2e service provision, and delivery of associated savings.
- Knowledge of web security principles
- Strong Linux experience – comfortable working from command line
- Some networking knowledge (routing, DNS)
- Knowledge of HA and DR concepts and experience implementing them
- Work with the team to analyze and design infrastructure with 99.99% uptime.
Qualifications:
- Infrastructure automation through DevOps scripting (e.g., Python, Ruby, PowerShell, Java, shell) or previous software development experience
- Experience in building and managing production cloud environments from the ground up.
- Hands-on working experience with primary AWS services (EC2, VPC, RDS, Route53, S3); a small AWS CLI sketch follows this list
- Knowledge on repository management (GitHub, SVN)
- Solid understanding of web application architecture and RDBMS (SQL Server preferred).
- Experience with IT compliance and risk management requirements is a bonus (e.g., security, privacy, HIPAA, SOX).
- Strong logical, analytical and problem-solving skills with excellent communication skills.
- Should have degree in computer science, MIS, engineering or equivalent with 5+ years of experience.
- Should be willing to work in rotational shifts (including nights)
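As a flavor of the hands-on AWS work in the qualifications above, a small AWS CLI sketch; the region is illustrative and credentials are assumed to be configured already.

```bash
#!/usr/bin/env bash
# List running EC2 instances with ID, private IP, and Name tag (sketch).
set -euo pipefail
export AWS_DEFAULT_REGION=ap-south-1   # assumed region

aws ec2 describe-instances \
  --filters "Name=instance-state-name,Values=running" \
  --query 'Reservations[].Instances[].[InstanceId,PrivateIpAddress,Tags[?Key==`Name`]|[0].Value]' \
  --output table
```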
Perks and benefits:
- Health & Wellness
- Paid time off
- Learning at work
- Fun at work
- Night shift allowance
- Comp off
- Pick-up and drop facility available up to a certain distance
Job Description
Must have Skill Sets
- Go lang + Microservices
- Familiarity with MAC/Linux environment, Shell script
- gRPC
- JavaScript & JSON
- Knowledge of microservices and architecture
- Knowledge of the Uber tech stack would be a bonus; selected candidates will undergo training on the Uber stack as part of their induction to Uber
- Basic SQL knowledge
Expectations from the candidate
- Strong hands-on experience in understanding requirements and creating microservices using Golang.
- Exceptional debugging and problem-solving skills on large codebase
- Be a proactive thinker and demonstrate keen sense to find solutions to challenging problems
- Product sense: Create more than beautiful code. Play a crucial role in choosing what we build and how we build it.
- A penchant for collaboration and a team player: work cross-collaboratively to drive impact across orgs; be open to candid feedback for improvement. When required, step up to the role of a tech lead to ensure effective coordination and communication.
- Independently explore all the endpoints in the Uber environment and coordinate with different lines of business to identify the correct microservice for a business use case.
- Ability to convert a high-level PRD into a detailed ERD for execution. Coordinate with business teams to understand the functional requirement and convert it into engineering logic
Primary Skills:
4+ years with Java
2+ years with Microservices
6 months to 1 year with Golang
Experience: 4+ years
Location: Bangalore/Hyderabad
Max Budget: 28 Lakhs
End Client: UBER
• Good coding skills (SOLID, DI, Design Patterns, etc.)
• Able to unit test the framework built
• Working with repositories (preferably Git)
• Automation experience (Cucumber/SpecFlow, xUnit, TestNG, etc.)
• Web UI testing experience (different browsers)
• API testing experience
• Mobile web/app testing experience (if the automation skills are good, then this can be a "nice to have")
• Ability to work as part of a team or on their own (as we are consultants)
• Integrate automation tests into CI/CD pipelines
Nice to have
• Mocks and stubs
• Advanced DevOps knowledge
• Deep Agile Development knowledge
• Ability to lead projects and design test strategy
Leading Healthcare Company
ONLY FOR WOMEN/RETURNING WOMEN
Client: A Leading Product Healthcare Company
Position: Software Engineer - Automation Testing
Work Location: Hyderabad remote working
Roles & Description
- Experience in automation testing using Selenium with JScript, and familiarity with Protractor
- Should have proven experience of Framework development
- Strong knowledge and experience in working with Jasmine and with POM framework
- Expert knowledge of XPath construction and of Node.js; debugging scripts and running them through Git
Required Skills
- Good verbal and communication skills, with 3 or more years of experience
- Strong problem solver who can work independently and having good analytical skills
- B.Tech /MCA/ ENGG qualification
- Flexible work hours
Beyond Pinks Will Offer
- We will offer free training and over one week of preparation help covering resume writing, interviews, and technical interviews.
- We will also provide training on managing work-life balance, speaking confidently, salary negotiation, and bouncing back after a career break.
We at Datametica Solutions Private Limited are looking for an SQL Lead / Architect who has a passion for the cloud, with knowledge of different on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description :
Experience: 6+ Years
Work Location: Pune / Hyderabad
Technical Skills :
- Good programming experience as an Oracle PL/SQL, MySQL, and SQL Server Developer
- Knowledge of database performance tuning techniques
- Rich experience in database development
- Experience in designing and implementing business applications using the Oracle Relational Database Management System
- Experience in developing complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL
Required Candidate Profile :
- Excellent communication, interpersonal, analytical skills and strong ability to drive teams
- Analyze data requirements and the data dictionary for moderate to complex projects
- Lead data-model-related analysis discussions while collaborating with Application Development teams, Business Analysts, and Data Analysts during joint requirements analysis sessions
- Translate business requirements into technical specifications with an emphasis on highly available and scalable global solutions
- Stakeholder management and client engagement skills
- Strong communication skills (written and verbal)
About Us!
A global leader in the Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data Warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican – Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in achieving our success.
Benefits we Provide!
Working with Highly Technical and Passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
Responsibilities for Data Engineer
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies (a small extract-and-load sketch follows this list).
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
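A single extract-and-load step from the pipeline responsibilities above might be scripted as below. This is a sketch, not a prescribed stack: the DATABASE_URL connection string, the orders table, and the S3 bucket are all hypothetical.

```bash
#!/usr/bin/env bash
# Extract yesterday's orders from Postgres to CSV and stage the file in S3.
set -euo pipefail

DAY=$(date -d yesterday +%F)   # GNU date; assumes a Linux host
OUT=/tmp/orders_${DAY}.csv
S3_URI=s3://example-data-lake/raw/orders/${DAY}.csv   # hypothetical bucket

# \copy runs client-side, so no server filesystem access is needed.
psql "$DATABASE_URL" -c "\copy (SELECT * FROM orders WHERE created_at::date = '${DAY}') TO '${OUT}' WITH CSV HEADER"

aws s3 cp "$OUT" "$S3_URI"
```

In production this step would sit inside a workflow manager (Airflow, Luigi, etc.) rather than cron, so retries and dependencies are handled for you.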
Qualifications for Data Engineer
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
Develop complex queries, pipelines and software programs to solve analytics and data mining problems
Interact with other data scientists, product managers, and engineers to understand business problems, technical requirements to deliver predictive and smart data solutions
Prototype new applications or data systems
Lead data investigations to troubleshoot data issues that arise along the data pipelines
Collaborate with different product owners to incorporate data science solutions
Maintain and improve data science platform
Must Have
BS/MS/PhD in Computer Science, Electrical Engineering or related disciplines
Strong fundamentals: data structures, algorithms, database
5+ years of software industry experience, with 2+ years in analytics, data mining, and/or data warehousing
Fluency with Python
Experience developing web services using REST approaches.
Proficiency with SQL/Unix/Shell
Experience in DevOps (CI/CD, Docker, Kubernetes)
Self-driven, challenge-loving, detail oriented, teamwork spirit, excellent communication skills, ability to multi-task and manage expectations
Preferred
Industry experience with big data processing technologies such as Spark and Kafka
Experience with machine learning algorithms and/or R is a plus
Experience in Java/Scala is a plus
Experience with any MPP analytics engines like Vertica
Experience with data integration tools like Pentaho/SAP Analytics Cloud
We at Datametica Solutions Private Limited are looking for SQL Engineers who have a passion for the cloud, with knowledge of different on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description
Experience: 4-10 years
Location: Pune
Mandatory Skills -
- Strong in ETL/SQL development
- Strong Data Warehousing skills
- Hands-on experience working with Unix/Linux
- Development experience in Enterprise Data warehouse projects
- Good to have experience working with Python, shell scripting
Opportunities -
- Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps Tools, Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
- Would get a chance to be part of enterprise-grade implementations of Cloud and Big Data systems
- Will play an active role in setting up the Modern data platform based on Cloud and Big Data
- Would be part of teams with rich experience in various aspects of distributed systems and computing
About Us!
A global Leader in the Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in achieving our success.
Benefits we Provide!
Working with Highly Technical and Passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
Experience range: 3 to 6 years.
Please share details such as your notice period and years of experience in automation, and then apply.
Skills required: Automation Testing, Java or Python, web and mobile testing.
**Immediate to 15-day joiners are preferred. Those currently serving notice can also apply.**
- 2-4 years of experience, with at least 1-2 years in testing web applications/services and UI testing.
- Good exposure to non-functional test strategies like load, performance, and chaos testing.
- Good understanding of testing principles: UI and Usability testing, Stress testing, Functional/Regression testing, Code coverage, TDD/BDD, UAT.
- Experience in deploying solutions into AWS is a major plus.
- Good team player, having effective and professional oral and written communication skills.
About the Role
The Dremio India team owns the development of the cloud infrastructure and services that power Dremio's Data Lake Engine. With a focus on query performance optimization and support for modern table formats like Iceberg, Delta Lake, and Nessie, this team provides endless opportunities to define the products for the next generation of data analytics.
In this role, you will get opportunities to impact high-performance system software and scalable SaaS services through the application of continuous performance management. You will plan, design, automate, and execute test runs, followed by deep analysis and identification of key performance fixes in collaboration with developers. An open and flexible work culture, combined with giving employees ownership of the work they do, will help you develop as a leader. The inclusive culture of the company will provide you a platform to bring fresh ideas and innovate.
Responsibilities
- Deliver end to end performance testing independently using agile methodologies
- Prepare performance test plans, load simulators, and test harnesses to thoroughly test the products against the approved specifications (a minimal timing-harness sketch follows this list)
- Translate deep insight of architecture, product & usage into an enhanced automated performance measurement & evaluation framework to support continuous performance management.
- Evaluate & apply the latest tools, techniques and research insights to drive improvements into a world-class data analytics engine
- Collaborate with other engineering and customer success functions to simulate customer data, usage patterns, and workloads; execute performance runs; identify and fix customer issues; and make sure that customers get a highly performant, optimized, and scalable Dremio experience
- Analyze performance bottlenecks, root-cause issues, file defects, and follow up with developers, documentation, and other teams on resolution.
- Publish performance benchmark report based on test runs in accordance with industry standards
- Regularly communicate to the leadership team an assessment of the performance, scalability, reliability, and robustness of products before they are exposed to customers
- Analyze and debug performance issues in customer environments.
- Understand and reason about concurrency and parallelization to deliver scalability and performance in a multithreaded and distributed environment.
- Actively participate in code and design reviews to maintain exceptional quality and deepen your understanding of the system architecture and implementation
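The plan-run-analyze loop above usually starts from a small timing harness. A minimal sketch, assuming a hypothetical run_query.sh workload driver; the iteration count and millisecond reporting are illustrative.

```bash
#!/usr/bin/env bash
# Time N repetitions of a workload and report min/avg/max latency (sketch).
set -euo pipefail
RUNS=10
WORKLOAD=./run_query.sh   # hypothetical workload driver

for _ in $(seq 1 "$RUNS"); do
    start=$(date +%s%N)                  # nanoseconds (GNU date)
    "$WORKLOAD" > /dev/null
    end=$(date +%s%N)
    echo $(( (end - start) / 1000000 ))  # elapsed milliseconds
done | awk '{ sum += $1; if (min == "" || $1 < min) min = $1; if ($1 > max) max = $1 }
            END { printf "runs=%d min=%dms avg=%.1fms max=%dms\n", NR, min, sum/NR, max }'
```

Real benchmark runs would add warm-up iterations and percentile reporting, but the shape of the harness stays the same.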
Basic Requirements
- B.Tech/M.Tech/Equivalent in Computer Science or a related technical field
- 8+ years of performance automation engineering experience on large scale distributed systems
- Proficiency in any of Java/C++/Python/Go and automation frameworks
- Hands-on experience integrating performance automation into CI/CD tools like Jenkins
- Experience in planning and executing performance engineering tasks to completion and taking ownership of performance epics during a set of sprints.
- Experience in designing, implementing, executing and analyzing automated performance tests for complex, production system software.
- Experience in analyzing performance bottlenecks in system, performing root cause analysis, and following issue resolution workflow to tune the system to provide optimized performance
- Ability to derive meaningful insights from the collected performance data, articulate performance findings effectively with senior team members to evaluate design choices.
- Experience with database systems internals, query optimization, understanding and tuning query access plans, and query execution instrumentation.
- Hands-on experience working on projects on AWS, Azure, and Google Cloud Platform
- Understanding of distributed file systems like S3 or ADLS or HDFS and HIVE
- Ability to create reusable components to automate repeatable, manual activities
- Ability to write technical reports and summaries and present them to the leadership team
- Passion for learning and delivering using latest technologies
- Excellent communication skills and affinity for collaboration and teamwork
- Passion and ability to work in a fast paced and agile development environment.
Preferred Qualification
- Hands-on experience with multi-threaded and asynchronous programming models
- Hands-on experience in query processing or optimization, distributed systems, concurrency control, data replication, code generation, networking, and storage systems
Aqua Security enables enterprises to secure their container-based and cloud-native applications from development to production, accelerating container adoption and bridging the gap between DevOps and IT security.
About The Position
Aqua is looking for a passionate and experienced QA automation engineer who feels at home in any Linux and cloud environment.
- Be a part of our product development cycle from the design to production
- Design and write frameworks, tools, and tests for automated test scenarios
- Execute automation tests and analyze results
Requirements:
- At least 5 years of experience as a QA engineer
- Passionate about the quality and perfection level of the product
- Knowledge of the Linux environment is a must
- 1+ years of experience with Python is a must; experience with Bash is an advantage
- Proven capabilities in creating test environments from scratch
- Experience working with at least one cloud environment (AWS, Google Cloud, Azure, etc.) or any orchestrator (Kubernetes, OpenShift, etc.) - an advantage
- Background from Security/Cyber companies - advantage
- Strong technical skills; ability to deep dive into complex problems and find their root cause
- Experience with CI/CD systems, Jenkins or similar - an advantage
- Familiar with Docker containers - an advantage
A 15-year-old US-based product company
- Should have good hands-on experience in Informatica MDM Customer 360, Data Integration (ETL) using PowerCenter, and Data Quality.
- Must have strong skills in Data Analysis, Data Mapping for ETL processes, and Data Modeling.
- Experience with the SIF framework including real-time integration
- Should have experience in building C360 Insights using Informatica
- Should have good experience in creating performant designs using Mapplets, Mappings, and Workflows for Data Quality (cleansing) and ETL.
- Should have experience in building different data warehouse architectures like Enterprise, Federated, and Multi-Tier architecture.
- Should have experience in configuring Informatica Data Director in reference to the data governance of users, IT Managers, and Data Stewards.
- Should have good knowledge in developing complex PL/SQL queries.
- Should have working experience on UNIX and shell scripting to run the Informatica workflows and control the ETL flow (a minimal pmcmd sketch follows this list).
- Should know about Informatica Server installation and knowledge on the Administration console.
- Working experience with the Developer tool along with Administration is added knowledge.
- Working experience in Amazon Web Services (AWS) is an added advantage, particularly with AWS S3, Data Pipeline, Lambda, Kinesis, DynamoDB, and EMR.
- Should be responsible for the creation of automated BI solutions, including requirements, design, development, testing, and deployment.
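The UNIX-and-shell control of the ETL flow mentioned above typically goes through Informatica's pmcmd utility. A minimal sketch; the integration service, domain, credentials, folder, and workflow names are all placeholders.

```bash
#!/usr/bin/env bash
# Start an Informatica workflow and wait for it to finish via pmcmd (sketch).
# All identifiers below are hypothetical placeholders.

pmcmd startworkflow \
    -sv IntService -d Domain_dev \
    -u "$INFA_USER" -p "$INFA_PASS" \
    -f CustomerMDM -wait wf_load_c360

# pmcmd's exit code reflects workflow success or failure.
if [ $? -ne 0 ]; then
    echo "wf_load_c360 failed" >&2
    exit 1
fi
```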