Shell Scripting Jobs in Chennai


Apply to 11+ Shell Scripting Jobs in Chennai on CutShort.io. Explore the latest Shell Scripting Job opportunities across top companies like Google, Amazon & Adobe.

codersbrain


1 recruiter
Tanuj Uppal
Posted by Tanuj Uppal
Hyderabad, Bengaluru (Bangalore), Chennai
3 - 5 yrs
Best in industry
Python
Django
Flask
C++
Telecom
+5 more

Mandatory skill set: C++ and Python, UNIX, Database (SQL or Postgres)

 

Developer Role Experience: 3 to 5 yrs

Location: Bangalore / Chennai / Hyderabad

1. Strong proficiency in C++, with fair knowledge of the language specification (Telecom experience is preferred).

2. Proficient understanding of standard template library (STL): algorithms, containers, functions, and iterators 

3. Must have experience on Unix platforms, should possess shell scripting skills.

4. Knowledge of compilers (gcc, g++) and debuggers (dbx). Knowledge of libraries and linking.

5. Good understanding of code versioning tools (e.g. Git, CVS etc.)

6. Able to write and understand Python scripts (both Python 2 and Python 3)

7. Hands-on with logic implementation in Python; familiar with list comprehensions and comfortable integrating Python with C++ and Unix scripts

8. Able to implement multithreading in both C++ and Python environments.

9. Familiar with PostgreSQL.
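Items 7 and 8 above can be sketched briefly; the snippet below is a minimal illustration of Python list comprehensions combined with stdlib threading (the record format and all names are invented for the example, not taken from the posting):

```python
import threading

def normalize(records):
    # List comprehension: keep well-formed "key=value" pairs only.
    return [r.strip().lower() for r in records if "=" in r]

def worker(chunk, results, index, lock):
    cleaned = normalize(chunk)
    with lock:  # guard shared state, analogous to a mutex-protected section in C++
        results[index] = cleaned

records = ["HOST=db1 ", "bad-record", "PORT=5432"]
results = {}
lock = threading.Lock()
threads = [
    threading.Thread(target=worker, args=(records[i::2], results, i, lock))
    for i in range(2)
]
for t in threads:
    t.start()
for t in threads:
    t.join()

merged = [item for i in sorted(results) for item in results[i]]
print(merged)  # → ['host=db1', 'port=5432']
```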

 

C++ developer with Python as secondary - 3 to 4 yrs exp / should be CW.

Mobile Programming LLC


1 video
34 recruiters
Sukhdeep Singh
Posted by Sukhdeep Singh
Chennai
4 - 7 yrs
₹13L - ₹15L / yr
Data Analytics
Data Visualization
PowerBI
Tableau
Qlikview
+10 more

Title: Platform Engineer
Location: Chennai
Work Mode: Hybrid (Remote and Chennai Office)
Experience: 4+ years
Budget: 16 - 18 LPA

Responsibilities:

  • Parse data using Python, create dashboards in Tableau.
  • Utilize Jenkins for Airflow pipeline creation and CI/CD maintenance.
  • Migrate Datastage jobs to Snowflake, optimize performance.
  • Work with HDFS, Hive, Kafka, and basic Spark.
  • Develop Python scripts for data parsing, quality checks, and visualization.
  • Conduct unit testing and web application testing.
  • Implement Apache Airflow and handle production migration.
  • Apply data warehousing techniques for data cleansing and dimension modeling.
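The Python data-parsing and quality-check work described above might look like the following minimal sketch (the feed layout, column names, and rules are purely illustrative):

```python
import csv
import io

# Hypothetical feed: the column names and rules are invented for this example.
raw = """order_id,amount,country
1001,250.0,IN
1002,,IN
1003,99.5,SG
"""

def quality_check(text, required=("order_id", "amount")):
    """Return rows that pass simple not-null checks, plus a failure count."""
    good, failed = [], 0
    for row in csv.DictReader(io.StringIO(text)):
        if all(row[col] not in ("", None) for col in required):
            good.append(row)
        else:
            failed += 1
    return good, failed

rows, failures = quality_check(raw)
print(len(rows), failures)  # → 2 1
```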

Requirements:

  • 4+ years of experience as a Platform Engineer.
  • Strong Python skills, knowledge of Tableau.
  • Experience with Jenkins, Snowflake, HDFS, Hive, and Kafka.
  • Proficient in Unix Shell Scripting and SQL.
  • Familiarity with ETL tools like DataStage and DMExpress.
  • Understanding of Apache Airflow.
  • Strong problem-solving and communication skills.

Note: Only candidates willing to work in Chennai and available for immediate joining will be considered. Budget for this position is 16 - 18 LPA.

VuNet Systems


2 recruiters
Gaurav Giri
Posted by Gaurav Giri
Hyderabad, Mumbai, Chennai, Bengaluru (Bangalore)
2 - 7 yrs
₹5L - ₹15L / yr
Red Hat Linux
System Administration
Linux administration
Monitoring
System monitoring
+9 more
At VuNet, we are building next-generation, full-stack products that use big data and machine learning in innovative ways to monitor customer journeys and improve user experience. Our next-generation systems are helping the largest financial institutions improve their digital payment experience, driving more financial inclusion across the country.

This is a work-from-office role.

Job Description

Roles & Responsibilities
  1. Work across the entire landscape that spans network, compute, storage, databases, applications, and business domain
  2. Use the Big Data and AI-driven features of vuSmartMaps to provide solutions that will enable customers to improve the end-user experience for their applications
  3. Create detailed designs, solutions and validate with internal engineering and customer teams, and establish a good network of relationships with customers and experts
  4. Understand the application architecture and transaction-level workflow to identify touchpoints and metrics to be monitored and analyzed
  5. Analyze data and provide insights and recommendations
  6. Constantly stay ahead in communicating with customers. Manage planning and execution of platform implementation at customer sites.
  7. Work with the product team in developing new features, identifying solution gaps, etc.
  8. Interest and aptitude in learning new technologies - Big Data, NoSQL databases, Elasticsearch, MongoDB, DevOps.

Skills & Experience

  1. 2+ years of experience in IT Infrastructure Management
  2. Experience in working with large-scale IT infra, including applications, databases, and networks.
  3. Experience in working with monitoring tools, automation tools
  4. Hands-on experience in Linux and scripting.
  5. Knowledge/Experience in the following technologies will be an added plus: Elasticsearch, Kafka, Docker containers, MongoDB, Big Data, SQL databases, ELK stack, REST APIs, web services, and JMX.
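As a rough illustration of the hands-on Linux scripting and monitoring this role calls for, here is a small stdlib-only disk-usage check (the threshold and paths are assumptions for the example, not from the posting):

```python
import shutil

# Illustrative threshold; real monitoring tools make this configurable.
THRESHOLD_PCT = 90

def disk_alerts(paths, threshold=THRESHOLD_PCT):
    """Return mount points whose usage exceeds the threshold percentage."""
    alerts = []
    for path in paths:
        usage = shutil.disk_usage(path)
        pct = usage.used / usage.total * 100
        if pct >= threshold:
            alerts.append((path, round(pct, 1)))
    return alerts

print(disk_alerts(["/"]))
```

In practice a script like this would run from cron or an agent and push alerts to a monitoring backend rather than printing them.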
Lister Technologies


8 recruiters
Vignesh Paramasivam
Posted by Vignesh Paramasivam
Chennai, Coimbatore
4 - 9 yrs
₹4L - ₹14L / yr
cypress
Test Automation (QA)
Selenium
Software Testing (QA)
Shell Scripting
+1 more

ESSENTIAL SKILLS:

·       4+ years' hands-on experience with web automation using Selenium and JavaScript technologies

·       Good knowledge and experience in unit testing tools such as Jasmine, Jest, Mocha/Chai or equivalent framework

·       Hands-on experience in Cypress

·       Decent experience in testing web applications built on technologies like Node.js / React.js

·       Excellent programming skills in Java, JavaScript, or related languages, written custom test automation scripts, automation tools and frameworks

·       Ability to learn quickly and to adapt to existing / new tools and technologies

·       Work exposure with custom CI/CD Pipelines e.g.: Jenkins

·       Experience with Version control systems like Git

·       Experience with Bug tracking software like Jira

·       Excellent written/verbal communication

·       Ability to collaborate with different stakeholders including engineers, other QA, Product owners

 

ROLE DESCRIPTION:

·       Build the test suite for the UI automation where web pages are generated using UI as a service

·       The test suite is expected to be built on Cypress.io

·       Develop and implement automated test scripts by enhancing the existing / new framework and architecture

·       Write functional and end-to-end tests using JavaScript technologies, Selenium, etc.

·       Develop reusable utilities and test scripts which can be shared across the platform

·       Document defects effectively and collaborate with engineers and other stakeholders to have them resolved

·       Participate in the Agile execution and take part in the required agile ceremonies

·       Be an integral part of the development process, evangelising TDD and collaborating with software engineers across the development cycle.
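Reusable utilities like those described above often start with an explicit-wait helper; the sketch below is a generic stdlib version (the simulated "element" is hypothetical — a real suite would poll a browser driver or page object instead):

```python
import time

def wait_until(condition, timeout=5.0, interval=0.1):
    """Poll `condition` until it returns a truthy value or `timeout` elapses.

    Automation suites typically centralize this kind of explicit wait in a
    shared utility module so individual test scripts stay declarative.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(interval)
    raise TimeoutError("condition not met within %.1fs" % timeout)

# Usage sketch: simulate an element that "appears" after a few polls.
state = {"calls": 0}

def element_visible():
    state["calls"] += 1
    return state["calls"] >= 3

assert wait_until(element_visible, timeout=2.0)
print("element appeared after", state["calls"], "polls")
```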

 

Mobile Programming LLC


1 video
34 recruiters
Apurva kalsotra
Posted by Apurva kalsotra
Mohali, Gurugram, Pune, Bengaluru (Bangalore), Hyderabad, Chennai
3 - 8 yrs
₹2L - ₹9L / yr
Data engineering
Data engineer
Spark
Apache Spark
Apache Kafka
+13 more

Responsibilities for Data Engineer

  • Create and maintain optimal data pipeline architecture,
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.

Qualifications for Data Engineer

  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:

  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift
  • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
  • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
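The extract-transform-load work described above can be sketched with composable Python generators; this is a toy illustration of the pattern, not a production pipeline (the source format and field names are invented):

```python
# Extract -> transform -> load as composable generators.
def extract(lines):
    """Yield raw records from a source (here, an in-memory list)."""
    for line in lines:
        yield line.strip()

def transform(records):
    """Drop blanks and parse "user,amount" records into dicts."""
    for rec in records:
        if rec:
            user, _, amount = rec.partition(",")
            yield {"user": user, "amount": float(amount)}

def load(rows, sink):
    """Write rows to a sink and return the count loaded."""
    count = 0
    for row in rows:
        sink.append(row)
        count += 1
    return count

source = ["alice,10.5", "", "bob,3.0"]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
print(loaded, warehouse[0]["user"])  # → 2 alice
```

Because each stage is a generator, records stream through one at a time — the same shape that tools like Spark or Airflow tasks scale up.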
Mobile Programming LLC


1 video
34 recruiters
Apurva kalsotra
Posted by Apurva kalsotra
Mohali, Gurugram, Bengaluru (Bangalore), Chennai, Hyderabad, Pune
3 - 8 yrs
₹3L - ₹9L / yr
Data Warehouse (DWH)
Big Data
Spark
Apache Kafka
Data engineering
+14 more
Day-to-day Activities
Develop complex queries, pipelines and software programs to solve analytics and data mining problems
Interact with other data scientists, product managers, and engineers to understand business problems, technical requirements to deliver predictive and smart data solutions
Prototype new applications or data systems
Lead data investigations to troubleshoot data issues that arise along the data pipelines
Collaborate with different product owners to incorporate data science solutions
Maintain and improve data science platform
Must Have
BS/MS/PhD in Computer Science, Electrical Engineering or related disciplines
Strong fundamentals: data structures, algorithms, database
5+ years of software industry experience with 2+ years in analytics, data mining, and/or data warehouse
Fluency with Python
Experience developing web services using REST approaches.
Proficiency with SQL/Unix/Shell
Experience in DevOps (CI/CD, Docker, Kubernetes)
Self-driven, challenge-loving, detail oriented, teamwork spirit, excellent communication skills, ability to multi-task and manage expectations
Preferred
Industry experience with big data processing technologies such as Spark and Kafka
Experience with machine learning algorithms and/or R a plus 
Experience in Java/Scala a plus
Experience with any MPP analytics engines like Vertica
Experience with data integration tools like Pentaho/SAP Analytics Cloud
Plivo


11 recruiters
Anusha Reddy
Posted by Anusha Reddy
Remote, Bengaluru (Bangalore), Hyderabad, Chennai, Pune, Noida, NCR (Delhi | Gurgaon | Noida)
2 - 5 yrs
₹9L - ₹17L / yr
Software Development
SDET
Software Testing (QA)
Test Automation (QA)
Python
+5 more
  • 2-4 years of experience, with at least 1-2 years in testing Web Applications/Services and UI testing.
  • Good exposure to Non-functional tests strategies like Load, Perf and Chaos.
  • Good understanding of testing principles: UI and Usability testing, Stress testing, Functional/Regression testing, Code coverage, TDD/BDD, UAT.
  • Experience in deploying solutions into AWS is a major plus.
  • Good team player, having effective and professional oral and written communication skills.
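A minimal sketch of a load-test harness along the lines of the non-functional (Load/Perf) testing mentioned above, using only the stdlib — the simulated service call stands in for a real HTTP request to the system under test:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def call_service():
    """Stand-in for an HTTP request; returns observed latency in seconds."""
    start = time.monotonic()
    time.sleep(0.01)  # simulated service latency
    return time.monotonic() - start

def load_test(workers=8, requests=40):
    """Fire `requests` calls across `workers` threads; report percentiles."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        latencies = list(pool.map(lambda _: call_service(), range(requests)))
    latencies.sort()
    return {
        "p50": latencies[len(latencies) // 2],
        "p95": latencies[int(len(latencies) * 0.95)],
    }

stats = load_test()
print(stats)
```

Dedicated tools (Locust, JMeter, k6) add ramp-up schedules and reporting on top of this same fan-out-and-measure idea.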
Chennai
2 - 8 yrs
₹4L - ₹9L / yr
Tosca
Shell Scripting
Automation
Test Automation (QA)
Selenium
Automation Testing (TOSCA) Job Opening in Leading MNC, Chennai

Greetings from People First Consultants!!

We are hiring for one of leading client in Chennai

Work Location: Chennai

Exp: 3-6 yrs (Associate).

Notice: Immediate to 30 days.

 

Required Skills: 

  • Experience with Test Design/Execution with Tricentis (Tosca), Selenium, Apptus
  • Experience with SAP Agile & Salesforce.
  • Automation of Web Portal/Service.
  • Framework creation, script creation and maintenance.
  • Managing requirements in terms of automation.
  • Ability to work individually and also in a team as and when required.
  • Writing and executing code with respect to the software and its specifications.
  • Maintaining test script inventories
  • Able to create test plans/strategies for automated tests
  • Creating required data for test cases
Chennai, Bengaluru (Bangalore), Hyderabad
4 - 10 yrs
₹9L - ₹20L / yr
Informatica
informatica developer
Informatica MDM
Data integration
Informatica Data Quality
+7 more
  • Should have good hands-on experience in Informatica MDM Customer 360, Data Integration(ETL) using PowerCenter, Data Quality.
  • Must have strong skills in Data Analysis, Data Mapping for ETL processes, and Data Modeling.
  • Experience with the SIF framework including real-time integration
  • Should have experience in building C360 Insights using Informatica
  • Should have good experience in creating performant design using Mapplets, Mappings, Workflows for Data Quality(cleansing), ETL.
  • Should have experience in building different data warehouse architectures like Enterprise, Federated, and Multi-Tier architecture.
  • Should have experience in configuring Informatica Data Director in reference to the Data Governance of users, IT Managers, and Data Stewards.
  • Should have good knowledge in developing complex PL/SQL queries.
  • Should have working experience on UNIX and shell scripting to run the Informatica workflows and to control the ETL flow.
  • Should know about Informatica Server installation and knowledge on the Administration console.
  • Working experience with Developer with Administration is added knowledge.
  • Working experience in Amazon Web Services (AWS) is an added advantage. Particularly on AWS S3, Data pipeline, Lambda, Kinesis, DynamoDB, and EMR.
  • Should be responsible for the creation of automated BI solutions, including requirements, design, development, testing, and deployment
Chennai, Bengaluru (Bangalore)
5 - 12 yrs
₹8L - ₹16L / yr
Informatica MDM
Informatica PowerCenter
Informatica
MDM
Java
+3 more
  • Looking only for immediate to 15 days candidate.
  • Looking for an experienced Integration specialist with a good expertise in ETL Informatica and a strong Application integration background
  • Minimum of 3+ years relevant experience in Informatica MDM required. Powercenter is a core skill set.
  • Having experience in a broader Informatica toolset is strongly preferred
  • Should prove a very strong implementation experience in Application integration, should demonstrate expertise/presentation with multiple use cases
  • Passionate coders with a strong Application development background, years of experience could range from 5+ to 15+
  • Should have application development experience outside of ETL, (just learning ETL as a tool is not enough), experience in writing application outside of ETL will bring in more value
  • Strong database skills with a strong understanding of data, data quality, data governance, understanding and developing standalone and integrated database layers (sql, packages, functions, performance tuning ), i.e. Expert with a strong integration background who has more application integration background than just an ETL Informatica tool
  • Experience in integration with XML/JSON-based payloads, heavily involving JMS MQ (read/write)
  • Experience in SOAP- and REST-based APIs exchanging both XML and JSON files used for request and response
  • Experience with Salesforce.com Integration using Informatica power exchange module is a plus but not needed
  • Experience with Informatica MDM as a technology stack that is used for integration of senior market members with Salesforce.com is a plus but not needed,
  • Very strong scripting background (C/bourne shell/Perl/Java)
  • Should be able to understand Java; we do have development around Java, i.e. the ability to work around a solution in a programming language like Java when implementation is not possible through ETL
  • Ability to communicate effectively via multiple channels (verbal, written, etc.) with technical and non-technical staff.
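The SOAP/REST payload handling listed above can be illustrated with Python's stdlib XML and JSON parsers; the envelope structure and field names here are hypothetical, invented for the example:

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical payloads; element and field names are illustrative only.
soap_response = """<Envelope><Body>
  <GetCustomerResponse><id>42</id><status>ACTIVE</status></GetCustomerResponse>
</Body></Envelope>"""

rest_response = '{"id": 42, "status": "ACTIVE"}'

def parse_soap(xml_text):
    """Extract the response fields from a (namespace-free) SOAP body."""
    root = ET.fromstring(xml_text)
    node = root.find("./Body/GetCustomerResponse")
    return {"id": int(node.findtext("id")), "status": node.findtext("status")}

def parse_rest(json_text):
    """Parse the equivalent REST JSON payload."""
    return json.loads(json_text)

assert parse_soap(soap_response) == parse_rest(rest_response)
print("payloads agree")  # → payloads agree
```

Real SOAP responses carry XML namespaces, which `ElementTree` handles via a namespace map passed to `find`; that detail is omitted here for brevity.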
HeyMath


1 recruiter
Sivakumar Periadurai
Posted by Sivakumar Periadurai
Chennai
2 - 6 yrs
₹5L - ₹9L / yr
Test Automation (QA)
Shell Scripting
Software Testing (QA)
Selenium
manual testing
We are looking to hire someone who is passionate to work on Automation along with Manual testing.

About Us

Incorporated in June 2000, HeyMath! is an Ed-tech product company, based in Chennai and focused on improving the quality of Maths education globally through our flagship product: HeyMath! We develop our products in formal collaboration with the University of Cambridge, who we have partnered with since our inception. You can check us out here: https://plp.heymath.com/

Over 95% of our work is with overseas customers, and we are investing heavily in building world-class capabilities in Software Engineering to support our next phase of growth, which will include consumer-facing Apps in multiple languages, offline delivery capabilities as well as server-side integration with our distribution partners to deliver the best user experience. The office environment is very collegiate and you will get to work in small, high-impact teams.