8+ Bash Jobs in Hyderabad | Bash Job openings in Hyderabad
Apply to 8+ Bash Jobs in Hyderabad on CutShort.io. Explore the latest Bash Job opportunities across top companies like Google, Amazon & Adobe.
Fintrac Global services
Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent experience.
- 5+ years of experience in a DevOps role, preferably for a SaaS or software company.
- Expertise in cloud computing platforms (e.g., AWS, Azure, GCP).
- Proficiency in scripting languages (e.g., Python, Bash, Ruby); a short scripting sketch follows this list.
- Extensive experience with CI/CD tools (e.g., Jenkins, GitLab CI, Travis CI).
- Extensive experience with NGINX and similar web servers.
- Strong knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).
- Familiarity with infrastructure-as-code tools (e.g., Terraform, CloudFormation).
- Ability to work on-call as needed and respond to emergencies in a timely manner.
- Experience with high-transaction e-commerce platforms.
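For a sense of the scripting this role calls for, here is a minimal Bash sketch, not part of the listing itself, that builds a container image, tags it with the current commit, and publishes it (the registry and image name are placeholders):

    #!/usr/bin/env bash
    # Minimal build-and-publish sketch; registry and image name are hypothetical.
    set -euo pipefail

    IMAGE="registry.example.com/web-app"    # placeholder registry/image
    TAG="$(git rev-parse --short HEAD)"     # tag the image with the current commit

    docker build -t "${IMAGE}:${TAG}" .
    docker push "${IMAGE}:${TAG}"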
Preferred Qualifications:
- Certifications in cloud computing or DevOps (e.g., AWS Certified DevOps Engineer, Azure DevOps Engineer Expert) are a plus.
- Experience in a high-availability, 24x7x365 environment.
- Strong collaboration, communication, and interpersonal skills.
- Ability to work independently and as part of a team.
Machint Solutions, a US-registered IT & digital automation products and services organization, is seeking to hire a couple of LINUX ADMINISTRATORS for its office in WHITEFIELDS, KONDAPUR, HYDERABAD, TELANGANA.
Job description
- Minimum 5 years of strong Linux (RHEL & SuSE) Admin knowledge & troubleshooting skills.
- Must know Storage integration with Linux.
- Must have strong scripting (Bash or Shell) knowledge; a minimal monitoring sketch appears at the end of this posting.
- Cluster knowledge (RHEL & SuSE)
- Knowledge of monitoring and patching tools
- Should have good experience with AWS Cloud and VMware
- Networking knowledge with respect to Linux
- Work Location: Machint Solutions Private Limited., Whitefields, Kondapur, Hyderabad
- Notice period: Candidates who can join in 2 weeks are preferred.
- Interview: F2F at our office - Between 11 AM and 6 PM Monday through Friday
- Budget: Market standards
Please share your updated resume on ram dot n at machint dot com with salary and notice period info to initiate the hiring process.
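As an illustration of the Bash scripting asked for above, a minimal host health check of the kind a Linux administrator might run could look like the following (the 85% threshold is an arbitrary example):

    #!/usr/bin/env bash
    # Simple health sketch for RHEL/SuSE hosts; the threshold is an arbitrary example.
    set -euo pipefail

    THRESHOLD=85   # warn when a filesystem is more than 85% full

    # Flag any filesystem above the threshold.
    df -hP | awk -v limit="$THRESHOLD" \
        'NR > 1 { gsub("%", "", $5); if ($5 + 0 > limit + 0) print "WARNING: " $6 " at " $5 "%" }'

    # List any systemd units that are in a failed state.
    systemctl --failed --no-legend || true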
Main tasks
- Supervision of the CI/CD process for automated builds and deployments of web services, web applications, and desktop tools in cloud and container environments
- Responsibility for the operations side of a DevOps organization, especially for development around container technology and orchestration, e.g. with Kubernetes
- Installation, operation, and monitoring of web applications in cloud data centers, both for development and test purposes and for running our own production cloud
- Implementation of installations of the solution, especially in the container context
- Introduction, maintenance, and improvement of installation solutions for development in desktop and server environments as well as in the cloud and with on-premise Kubernetes
- Maintenance of the system installation documentation and delivery of training
- Execution of internal software tests and support of involved teams and stakeholders
- Hands-on experience with Azure DevOps
Qualification profile
- Bachelor’s or master’s degree in communications engineering, electrical engineering, physics or comparable qualification
- Experience in software
- Installation and administration of Linux and Windows systems including network and firewalling aspects
- Experience with build and deployment automation with tools like Jenkins, Gradle, Argo, AnangoDB or similar, as well as system scripting (Bash, PowerShell, etc.); a deployment sketch follows this section
- Interest in operation and monitoring of applications in virtualized and containerized environments in cloud and on-premise
- Server environments, especially application, web, and database servers
- Knowledge of VMware/K3D/Rancher is an advantage
- Good spoken and written knowledge of English
This role is work-from-office.
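To illustrate the kind of deployment automation described under the main tasks, here is a rough Bash sketch, under assumed names (namespace, deployment, and registry are placeholders), that rolls a new container image out to Kubernetes and backs out if the rollout stalls:

    #!/usr/bin/env bash
    # Deployment sketch; namespace, deployment, and image names are placeholders.
    set -euo pipefail

    NAMESPACE="staging"
    DEPLOYMENT="web-service"
    IMAGE="registry.example.com/web-service:${1:?usage: deploy.sh <tag>}"

    kubectl -n "$NAMESPACE" set image "deployment/$DEPLOYMENT" "$DEPLOYMENT=$IMAGE"

    # Wait for the rollout; undo it if the new pods never become ready.
    if ! kubectl -n "$NAMESPACE" rollout status "deployment/$DEPLOYMENT" --timeout=120s; then
        kubectl -n "$NAMESPACE" rollout undo "deployment/$DEPLOYMENT"
        exit 1
    fi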
Job Description
Roles & Responsibilities
- Work across the entire landscape that spans network, compute, storage, databases, applications, and business domain
- Use the Big Data and AI-driven features of vuSmartMaps to provide solutions that will enable customers to improve the end-user experience for their applications
- Create detailed designs, solutions and validate with internal engineering and customer teams, and establish a good network of relationships with customers and experts
- Understand the application architecture and transaction-level workflow to identify touchpoints and metrics to be monitored and analyzed
- Analyze data and provide insights and recommendations
- Communicate proactively with customers. Manage planning and execution of platform implementations at customer sites.
- Work with the product team in developing new features, identifying solution gaps, etc.
- Interest and aptitude in learning new technologies - Big Data, NoSQL databases, Elasticsearch, MongoDB, DevOps.
Skills & Experience
- 2+ years of experience in IT infrastructure management
- Experience in working with large-scale IT infra, including applications, databases, and networks.
- Experience in working with monitoring tools, automation tools
- Hands-on experience in Linux and scripting.
- Knowledge/experience in the following technologies will be an added plus: Elasticsearch, Kafka, Docker containers, MongoDB, Big Data, SQL databases, the ELK stack, REST APIs, web services, and JMX (a monitoring sketch follows this list).
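As a small example of the Linux scripting and monitoring experience listed above, a Bash check against an Elasticsearch cluster's health endpoint might look like this (the host name is hypothetical):

    #!/usr/bin/env bash
    # Monitoring sketch: poll an Elasticsearch health endpoint; the host is hypothetical.
    set -euo pipefail

    ES_HOST="http://es.example.internal:9200"

    # The /_cluster/health API reports green, yellow, or red.
    status=$(curl -s "${ES_HOST}/_cluster/health" |
        python3 -c 'import json, sys; print(json.load(sys.stdin)["status"])')

    if [ "$status" != "green" ]; then
        echo "Elasticsearch cluster status is ${status}" >&2
        exit 1
    fi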
- Interfaces with other processes and/or business functions to ensure they can leverage the benefits provided by the AWS Platform process
- Responsible for managing the configuration of all IaaS assets across the platforms
- Hands-on Python experience
- Manages the entire AWS platform (Python, Flask, REST API, serverless framework) and recommends those that best meet the organization's requirements; a CLI sketch follows this list
- Has a good understanding of the various AWS services, particularly S3, Athena, Python code, Glue, Lambda, CloudFormation, and other AWS serverless resources
- AWS certification is a plus
- Knowledge of best practices for IT operations in an always-on, always-available service model
- Responsible for the execution of the process controls, ensuring that staff comply with process and data standards
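For illustration only, day-to-day checks on such a platform could be scripted with the AWS CLI roughly as follows (the bucket, function, and template names are placeholders, not details from this listing):

    #!/usr/bin/env bash
    # AWS CLI sketch; bucket, function, and template names are placeholders.
    set -euo pipefail

    # List the most recent objects under a data prefix.
    aws s3 ls s3://example-data-bucket/exports/ --recursive | tail -n 5

    # Invoke a serverless function and show its response payload.
    aws lambda invoke --function-name example-report-builder /tmp/response.json
    cat /tmp/response.json

    # Validate a CloudFormation template before deploying it.
    aws cloudformation validate-template --template-body file://template.yaml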
Qualifications
Bachelor's degree in Computer Science, Business Information Systems or relevant experience and accomplishments
3 to 6 years of experience in the IT field
AWS Python developer
AWS, Serverless/Lambda, Middleware.
- Strong AWS skills including Data Pipeline, S3, RDS, Redshift, with familiarity with other components like Lambda, Glue, Step Functions, CloudWatch
- Must have created REST APIs with AWS Lambda
- 3 years of relevant Python experience
- Good to have: experience working on projects and problem-solving with large-scale multi-vendor teams
- Good to have: knowledge of Agile development
- Good knowledge of the SDLC
- Hands-on with AWS databases (RDS, etc.)
- Good to have: unit testing experience
- Good to have: working knowledge of CI/CD
- Good communication skills, as there will be client interaction and documentation
Education (degree): Bachelor's degree in Computer Science, Business Information Systems or relevant experience and accomplishments
Years of Experience: 3-6 years
Technical Skills
Linux/Unix system administration
Continuous Integration/Continuous Delivery tools like Jenkins
Cloud provisioning and management – Azure, AWS, GCP
Ansible, Chef, or Puppet
Python, PowerShell & Bash (a configuration-management sketch follows this list)
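As a rough sketch of how the tools above fit together (the inventory path and playbook name are placeholders, not part of this listing), a Bash wrapper around Ansible might look like:

    #!/usr/bin/env bash
    # Configuration-management sketch; inventory and playbook names are placeholders.
    set -euo pipefail

    INVENTORY="inventories/production/hosts.ini"

    # Confirm all managed hosts are reachable before applying changes.
    ansible all -i "$INVENTORY" -m ping

    # Dry-run the baseline playbook first, then apply it.
    ansible-playbook -i "$INVENTORY" site.yml --check
    ansible-playbook -i "$INVENTORY" site.yml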
Job Details
JOB TITLE/JOB CODE: AWS Python Developer, III-Sr. Analyst
RC: TBD
PREFERRED LOCATION: HYDERABAD, IND
POSITION REPORTS TO: Manager USI T&I Cloud Managed Platform
CAREER LEVEL: 3
Work Location:
Hyderabad
- Experience with infrastructure-as-code (IaC) tools like Terraform and CloudFormation; a workflow sketch follows this list.
- Proficiency in cloud-native technologies and architectures (Docker/Kubernetes) and CI/CD pipelines.
- Good experience with JavaScript.
- Expertise in Linux/Windows environments.
- Good experience with scripting languages like PowerShell/Bash/Python.
- Proficiency in revision control tools like Git and in DevOps best practices.
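For illustration, a minimal Bash wrapper around a Terraform workflow of the kind this role involves might look like the following (the working directory is a placeholder):

    #!/usr/bin/env bash
    # Infrastructure-as-code workflow sketch; the working directory is a placeholder.
    set -euo pipefail

    cd infrastructure/

    terraform init -input=false
    terraform validate

    # Save a plan file so the apply step uses exactly what was reviewed.
    terraform plan -input=false -out=tfplan
    terraform apply -input=false tfplan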