
Automation With Unix Shell Scripting
at Leading Payment Solutions Company
Requirements:
- Proven industry experience in software development
- Excellent communication skills
- Excellent command of Unix and shell scripting, and the Vi editor (Python is a plus)
- Development experience in test automation with C++
- Development experience in test automation with shell scripting
- Strong analytical skills, with good comprehension and abstraction abilities
- Experience/expertise in Docker
- Good experience working with Oracle/MySQL databases: SQL queries, joins, views, etc.
- Good understanding of C++: STL, string operations, design patterns, OOP concepts
- Know-how of Eclipse
- CI (Continuous Integration), Jenkins, GIT
- Working experience with Agile methodology, JIRA
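As a small illustration of the test-automation-with-scripting skill this posting asks for, here is a hedged Python sketch that wraps a shell command in an automated check. The command and expected marker below are hypothetical stand-ins for a real service check, not anything from the posting:

```python
import subprocess

def run_check(cmd, expected_substring):
    """Run a shell command and verify it succeeds and prints a marker."""
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    return result.returncode == 0 and expected_substring in result.stdout

# Example: `echo` stands in for a real health-check command
assert run_check("echo service OK", "OK")
```

A real automation suite would loop such checks over a list of commands and report failures, but the run-and-grep pattern is the core building block.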



Job Title : Software Engineer (.NET, Azure)
Location : Remote
Employment Type : [Full-time/Contract]
Experience Level : 3+ Years
Job Summary :
We are looking for a skilled Software Engineer (.NET, Azure) to develop high-quality, secure, and scalable software solutions. You will collaborate with product owners, security specialists, and engineers to deliver robust applications.
Responsibilities :
- Develop and maintain server-side software using .NET (C#), SQL, and Azure.
- Build and secure RESTful APIs.
- Deploy and manage applications on Azure.
- Ensure version control using Azure DevOps/Git.
- Write clean, maintainable, and scalable code.
- Debug and optimize application performance.
Qualifications :
- 3+ Years of server-side development experience.
- Strong proficiency in .NET, SQL, and Azure.
- Experience with RESTful APIs and DevOps/Git.
- Ability to work collaboratively and independently.
- Familiarity with Scrum methodologies.
About the job
As a Senior SIP Engineer, you will take complete ownership of supporting all VoIP infrastructure: debugging issues in specific servers, software, or remote clients such as SIP devices (both virtual, such as a soft-phone or WebRTC client, and physical, such as a desk phone or an on-premise PBX), and providing fixes.
- Support customers in the EST timezone during critical releases or emergency incidents
- 5+ years of supporting global VoIP services and/or applications on cloud-based servers
- Expertise in SIP call flow analysis and debugging
- Expertise in setting up and maintaining SIP-based monitoring, debugging, and alerting services
- Experience scripting call flows, dialplans, and custom routing with FreeSWITCH using Lua and XML
- Experience in debugging Kamailio- and FreeSWITCH-based applications is a must
- Good problem-solving and analytical skills
- Excellent written and verbal communication
- Experience working with open-source projects
- Exposure to SIP carrier integration
- Advanced experience with cloud media infrastructure (load balancers, gateways, SBCs, STUN, TURN)
- Advanced knowledge of all modern VoIP protocols/platforms, including SIP, RTP & SDP, RTCP, TCP, UDP, HTTPS, and SSL/TLS
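SIP call flow analysis usually starts with picking apart raw SIP messages. As a minimal sketch (in Python for illustration; the INVITE below is contrived, not from any real trace):

```python
def parse_sip_headers(raw):
    """Split a raw SIP message into its start line and a header dict (simplified:
    ignores multi-line folding and repeated headers)."""
    lines = raw.split("\r\n")
    start_line = lines[0]
    headers = {}
    for line in lines[1:]:
        if not line:
            break  # blank line marks the end of the headers
        name, _, value = line.partition(":")
        headers[name.strip().lower()] = value.strip()
    return start_line, headers

msg = ("INVITE sip:bob@example.com SIP/2.0\r\n"
       "Via: SIP/2.0/UDP client.example.com;branch=z9hG4bK776\r\n"
       "Call-ID: a84b4c76e66710\r\n"
       "CSeq: 314159 INVITE\r\n"
       "\r\n")
start, hdrs = parse_sip_headers(msg)
```

In practice the Call-ID header extracted this way is what lets you correlate the legs of a dialog across captures when debugging a flow.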
Job Description:
• Experience in Core Java, Spring Boot.
• Experience in microservices.
• Extensive experience in developing enterprise-scale systems for global organizations. Should possess good architectural knowledge and be aware of enterprise application design patterns.
• Should be able to analyze, design, develop and test complex, low-latency client-facing applications.
• Good development experience with RDBMS in SQL Server, Postgres, Oracle or DB2
• Good knowledge of multi-threading
• Basic working knowledge of Unix/Linux
• Excellent problem-solving and coding skills in Java
• Strong interpersonal, communication, and analytical skills.
• Should be able to articulate design ideas and thought processes clearly.
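The multi-threading knowledge asked for above is largely language-agnostic. A minimal producer/consumer sketch (shown in Python for brevity, though this role is Java-focused) illustrates the core pattern of worker threads draining a shared queue, with a sentinel value for clean shutdown:

```python
import queue
import threading

def worker(tasks, results):
    """Consume integers from the task queue until a None sentinel arrives."""
    while True:
        item = tasks.get()
        if item is None:
            break  # sentinel: this worker is done
        results.put(item * 2)

tasks = queue.Queue()
results = queue.Queue()
threads = [threading.Thread(target=worker, args=(tasks, results)) for _ in range(4)]
for t in threads:
    t.start()
for i in range(10):
    tasks.put(i)
for _ in threads:
    tasks.put(None)  # one sentinel per worker
for t in threads:
    t.join()
total = sum(results.get() for _ in range(10))
```

The same shape maps directly onto Java's `BlockingQueue` and `ExecutorService`.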
Please Apply - https://zrec.in/GzLLD?source=CareerSite
About Us
Infra360 Solutions is a services company specializing in Cloud, DevSecOps, Security, and Observability solutions. We help technology companies adopt a DevOps culture by focusing on a long-term DevOps roadmap: we identify the technical and cultural issues that arise in implementing DevOps practices, and work with the respective teams to fix them and increase overall productivity. We also run training sessions for developers to convey the importance of DevOps.

We provide these services: DevOps, DevSecOps, FinOps, Cost Optimization, CI/CD, Observability, Cloud Security, Containerization, Cloud Migration, Site Reliability, Performance Optimization, SIEM and SecOps, Serverless Automation, Well-Architected Reviews, MLOps, and Governance, Risk & Compliance.

We assess a technology company's architecture, security, governance, compliance, and DevOps maturity model, and help them optimize their cloud costs, streamline their technology architecture, and set up processes that improve the availability and reliability of their websites and applications. We set up tools for monitoring, logging, and observability, and focus on bringing a DevOps culture to the organization to improve its efficiency and delivery.
Job Description
Job Title: DevOps Engineer AWS
Department: Technology
Location: Gurgaon
Work Mode: On-site
Working Hours: 10 AM - 7 PM
Terms: Permanent
Experience: 2-4 years
Education: B.Tech/MCA/BCA
Notice Period: Immediately
Infra360.io is searching for a DevOps Engineer to lead our group of IT specialists in maintaining and improving our software infrastructure. You'll collaborate with software engineers, QA engineers, and other IT professionals in deploying, automating, and managing the software infrastructure. As a DevOps Engineer, you will also be responsible for setting up CI/CD pipelines, monitoring programs, and cloud infrastructure.
Below is a detailed description of the role's responsibilities and expectations.
Tech Stack :
- Kubernetes: Deep understanding of Kubernetes clusters, container orchestration, and its architecture.
- Terraform: Extensive hands-on experience with Infrastructure as Code (IaC) using Terraform for managing cloud resources.
- ArgoCD: Experience in continuous deployment and using ArgoCD to maintain GitOps workflows.
- Helm: Expertise in Helm for managing Kubernetes applications.
- Cloud Platforms: Expertise in AWS, GCP or Azure will be an added advantage.
- Debugging and Troubleshooting: The DevOps Engineer must be proficient in identifying and resolving complex issues in a distributed environment, ranging from networking issues to misconfigurations in infrastructure or application components.
Key Responsibilities:
- CI/CD and configuration management
- Performing RCA of production issues and providing resolutions
- Setting up failover, DR, backups, logging, monitoring, and alerting
- Containerizing different applications on the Kubernetes platform
- Capacity planning of each environment's infrastructure
- Ensuring zero outages of critical services
- Database administration of SQL and NoSQL databases
- Infrastructure as Code (IaC)
- Keeping infrastructure costs to a minimum
- Setting up the right set of security measures
Ideal Candidate Profile:
- A graduate or postgraduate degree in Computer Science or a related field
- 2-4 years of strong DevOps experience in a Linux environment
- Strong interest in working with our tech stack
- Excellent communication skills
- Able to work with minimal supervision; a self-starter
- Hands-on experience with at least one scripting language: Bash, Python, Go, etc.
- Experience with version control systems like Git
- Strong experience with Amazon Web Services (EC2, RDS, VPC, S3, Route53, IAM, etc.)
- Strong experience managing production systems day in and day out
- Experience finding and fixing issues across the layers of the architecture in a production environment
- Knowledge of SQL and NoSQL databases, ElasticSearch, Solr, etc.
- Knowledge of networking, firewalls, load balancers, Nginx, Apache, etc.
- Experience with automation tools like Ansible/SaltStack and Jenkins
- Experience with the Docker/Kubernetes platform and managing OpenStack (desirable)
- Experience with HashiCorp tools, e.g., Vault, Vagrant, Terraform, Consul, VirtualBox (desirable)
- Experience managing/mentoring a small team of 2-3 people (desirable)
- Experience with monitoring tools like Prometheus/Grafana/Elastic APM
- Experience with logging tools like ELK/Loki
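As a small illustration of the scripting-for-monitoring skills listed above, here is a hedged Python sketch of a disk-usage check of the kind that might feed an alerting pipeline. The mount point and threshold are hypothetical:

```python
import shutil

def disk_usage_pct(path="/"):
    """Percentage of the filesystem at `path` that is in use."""
    usage = shutil.disk_usage(path)
    return usage.used / usage.total * 100

def should_alert(path="/", threshold_pct=90.0):
    """True when usage crosses the (hypothetical) alert threshold."""
    return disk_usage_pct(path) >= threshold_pct
```

In a real setup, the boolean from a check like this would be exported as a metric (e.g. to Prometheus) rather than acted on directly, so alert routing and silencing stay in one place.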
A Senior Mechanical Design Engineer is required at one of the leading global recycling platforms, with operations across the United States, UK, and Asia. Industry preference: process industries such as paper, sugar, chemical, paint, fertilizer, and oil & gas. Candidates who are genuinely interested and able to join immediately may apply.
Responsibilities
1. Design, build, and commission SPMs and Automated Equipment in collaboration with the fabrication team
2. Calculate and problem-solve design challenges in various equipment, from conceptual proposals to detailed design.
3. Improve existing equipment with targeted performance and automation upgrades
4. Create detailed engineering drawings and bills of materials
5. Develop equipment operational and maintenance documentation in collaboration with process & fabrication teams and conduct relevant training
6. Develop models to capture engineering calculations and FEA to validate equipment designs
7. Develop and implement control narratives in collaboration with the automation and fabrication teams
8. Create conceptual plant layouts, Process & Instrumentation Diagrams, and manuals in collaboration with the projects and process teams
9. Manage a team of mechanical engineers for design
Requirements
1. B. Tech. / M. Tech. in Mechanical Engineering with extensive hands-on experience in SPM design for diverse applications
2. Preference for candidates with prior work experience in process industries (e.g. paper, sugar, chemical, paint, fertilizer, oil & gas)
3. Knowledge of machine shop fabrication processes
4. Proficiency with all types of automation components, sensors, electromechanical actuators, motion control, hydraulics, pneumatics etc.
5. Good experience in FEA
6. Ability to work both independently and in teams
7. Proficiency with 3D CAD solid modelling: SolidWorks, AutoCAD
8. Extensive knowledge of material specifications/grades/properties/uses with engineering codes, and extensive knowledge of engineering accessories like pumps, valves, motors, gearboxes, etc.
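For the "models to capture engineering calculations" mentioned in the responsibilities, a minimal Python sketch of a standard hand calculation: tip deflection of an end-loaded cantilever, delta = P*L^3 / (3*E*I). The load, span, and section dimensions below are illustrative numbers only:

```python
def rect_section_inertia(b, h):
    """Second moment of area of a b x h rectangular section: I = b*h^3 / 12."""
    return b * h**3 / 12

def cantilever_tip_deflection(P, L, E, I):
    """Tip deflection of an end-loaded cantilever: delta = P*L^3 / (3*E*I)."""
    return P * L**3 / (3 * E * I)

# 1 kN end load on a 2 m steel cantilever (E = 200 GPa), 60 mm x 40 mm section
I = rect_section_inertia(0.06, 0.04)
delta = cantilever_tip_deflection(1000.0, 2.0, 200e9, I)
```

Encoding such formulas in code (rather than a one-off spreadsheet) makes them easy to sweep over candidate sections before moving to FEA.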
Sales executives are responsible for selling a company’s products or services to potential customers. They are often regarded as the face of the company and the primary point of contact for clients.
A sales executive usually performs the following tasks:
- Find new business opportunities and customers
- Contact potential customers to showcase company products or services
- Develop relationships with existing customers and keep in touch with them
- Negotiate prices and terms with customers
- Prepare sales contracts and keep track of sales activities
- Work with other team members to ensure customer satisfaction



Ultra Commerce is a software and hosting company. It is based in Australia with additional offices in the US, UK, New Zealand, and India. Ultra Commerce provides managed-hosting services for commerce applications with its cloud platform and offers its own digital commerce platform. It serves both B2B and B2C.
Role & Responsibilities:
- 5 years' experience in web development with a server-side language such as PHP (frameworks such as Laravel, Zend, CodeIgniter)
- Experience in Python
- Experience in JavaScript frameworks (especially React)
- Experience in web services, Rest API, JSON, SOAP, Linux/Unix.
- Advanced knowledge in MVC architecture and object-oriented programming
- Proven track record of delivering high-quality customer facing software solutions on time and within deadlines.
- Solid understanding of web application security
- Professional experience of version control (Git)
- Ability to estimate, design and implement development tasks in the form of writing and editing code, as well as writing and executing database queries and applying configuration changes.
- Great communication, critical thinking, problem solving, and time-management skills.
- Ability to create development tasks based on analysis of client business rules, desired functionality, and functional specs
- Mentor developers by answering questions on ongoing project related tasks.
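As a small illustration of the REST/JSON handling listed above, a hedged Python sketch that extracts fields from a hypothetical API response body. The schema (an `orders` array with `id` and `total`) is invented for illustration:

```python
import json

def extract_orders(payload):
    """Pull (id, total) pairs from a hypothetical REST response body,
    keeping only orders with a positive total."""
    data = json.loads(payload)
    return [(o["id"], o["total"]) for o in data["orders"] if o["total"] > 0]

body = '{"orders": [{"id": 1, "total": 9.5}, {"id": 2, "total": 0}]}'
orders = extract_orders(body)
```

The same decode-validate-extract shape applies regardless of whether the server side is PHP, Python, or anything else.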
Job Location- Gurgaon sector 39 Unitech Cyber Park
5 Days working- Work from office
www.ultracommerce.co
Reporting to the Development Product Manager, the candidate will work in a distributed development environment, suggesting innovative design patterns.
Qualifications
• 7+ years of software development experience using technologies such as ReactJS and NodeJS.
• Strong understanding of UI design patterns
• Should have exposure to Single Page Applications.
• HTML5, JavaScript, CSS3, Less
• Should be able to write web APIs in NodeJS
Preference shall be given to candidates with a strong hold on ReactJS and good UI development concepts.

Your mission is to help lead the team towards creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing, and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise across technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer, you will be responsible for developing data pipelines for numerous applications handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially of Spark and Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum 3+ years of relevant experience on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data, and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big Data technologies: Hadoop, Spark, Hive, Kafka
- Writing well-optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- A data integration tool like Pentaho, NiFi, SSIS, etc. (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
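As a toy illustration of the extract-transform-load pattern these skills revolve around, here is a self-contained Python sketch using an in-memory SQLite database in place of a real warehouse; the table and column names are hypothetical:

```python
import sqlite3

def run_etl(rows):
    """Toy ETL: load raw rows, filter and aggregate them, and write the
    result to a target table, returning it as a dict for inspection."""
    conn = sqlite3.connect(":memory:")
    # Extract: land the raw records
    conn.execute("CREATE TABLE raw_events (user TEXT, amount REAL)")
    conn.executemany("INSERT INTO raw_events VALUES (?, ?)", rows)
    # Transform + load: drop bad rows, aggregate per user into a target table
    conn.execute("CREATE TABLE user_totals AS "
                 "SELECT user, SUM(amount) AS total FROM raw_events "
                 "WHERE amount > 0 GROUP BY user")
    return dict(conn.execute("SELECT user, total FROM user_totals"))

totals = run_etl([("a", 10.0), ("a", 5.0), ("b", -1.0), ("b", 3.0)])
```

A production pipeline would swap SQLite for Spark or a warehouse and wrap each stage in an Airflow task, but the extract / transform / load staging is the same.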


