11+ Shell Scripting Jobs in Chennai | Shell Scripting Job openings in Chennai
Apply to 11+ Shell Scripting Jobs in Chennai on CutShort.io. Explore the latest Shell Scripting Job opportunities across top companies like Google, Amazon & Adobe.
Mandatory skill set: C++ and Python, UNIX, databases (SQL or Postgres)
Developer role experience: 3 to 5 years
Location: Bangalore / Chennai / Hyderabad
1. Strong proficiency in C++, with fair knowledge of the language specification (telecom experience is preferred).
2. Proficient understanding of the Standard Template Library (STL): algorithms, containers, functions, and iterators.
3. Must have experience on Unix platforms and possess shell scripting skills.
4. Knowledge of compilers (gcc, g++) and debuggers (dbx); knowledge of libraries and linking.
5. Good understanding of code versioning tools (e.g., Git, CVS).
6. Able to write and understand Python scripts (both Python 2 and Python 3).
7. Hands-on with logic implementation in Python; familiar with list comprehensions and comfortable integrating Python with C++ and Unix scripts (see the sketch after this list).
8. Able to implement multithreading in both C++ and Python environments.
9. Familiar with PostgreSQL.
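Purely as an illustration of the Python expectations above (list comprehensions, driving Unix commands, and basic multithreading), a minimal Python 3 sketch might look like this; the paths and command are placeholders:

```python
import subprocess
import threading

def disk_usage(path):
    # Shell out to a standard Unix command (du) and capture its output
    result = subprocess.run(["du", "-sh", path], capture_output=True, text=True)
    print(f"{path}: {result.stdout.strip()}")

paths = ["/tmp", "/var/log"]  # placeholder directories

# List comprehension: one worker thread per path
threads = [threading.Thread(target=disk_usage, args=(p,)) for p in paths]
for t in threads:
    t.start()
for t in threads:
    t.join()
```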
C++ developer with Python as a secondary skill: 3 to 4 years' experience / should be CW.
Title: Platform Engineer
Location: Chennai
Work Mode: Hybrid (remote and Chennai office)
Experience: 4+ years
Budget: 16-18 LPA
Responsibilities:
- Parse data using Python, create dashboards in Tableau.
- Utilize Jenkins for Airflow pipeline creation and CI/CD maintenance.
- Migrate DataStage jobs to Snowflake and optimize performance.
- Work with HDFS, Hive, Kafka, and basic Spark.
- Develop Python scripts for data parsing, quality checks, and visualization.
- Conduct unit testing and web application testing.
- Implement Apache Airflow and handle production migration (a minimal DAG sketch follows this list).
- Apply data warehousing techniques for data cleansing and dimensional modeling.
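As a rough sketch of the Airflow piece of this role, a minimal DAG wrapping a Python parsing/quality-check step could look like the following (Airflow 2.4+ assumed; the DAG id, schedule, and check logic are hypothetical):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def parse_and_check():
    # Placeholder for the real parsing / data-quality logic
    rows = [{"id": 1, "value": 42}]
    assert all(r["value"] is not None for r in rows), "quality check failed"

with DAG(
    dag_id="parse_and_quality_check",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="parse", python_callable=parse_and_check)
```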
Requirements:
- 4+ years of experience as a Platform Engineer.
- Strong Python skills, knowledge of Tableau.
- Experience with Jenkins, Snowflake, HDFS, Hive, and Kafka.
- Proficient in Unix Shell Scripting and SQL.
- Familiarity with ETL tools like DataStage and DMExpress.
- Understanding of Apache Airflow.
- Strong problem-solving and communication skills.
Note: Only candidates willing to work in Chennai and available for immediate joining will be considered.
This role is work-from-office.
Job Description
Roles & Responsibilities
- Work across the entire landscape that spans network, compute, storage, databases, applications, and business domain
- Use the Big Data and AI-driven features of vuSmartMaps to provide solutions that will enable customers to improve the end-user experience for their applications
- Create detailed designs, solutions and validate with internal engineering and customer teams, and establish a good network of relationships with customers and experts
- Understand the application architecture and transaction-level workflow to identify touchpoints and metrics to be monitored and analyzed
- Analyze data and provide insights and recommendations
- Communicate proactively with customers; manage planning and execution of platform implementations at customer sites.
- Work with the product team in developing new features, identifying solution gaps, etc.
- Interest and aptitude in learning new technologies: Big Data, NoSQL databases, Elasticsearch, MongoDB, DevOps.
Skills & Experience
- 2+ years of experience in IT Infrastructure Management
- Experience in working with large-scale IT infra, including applications, databases, and networks.
- Experience in working with monitoring and automation tools
- Hands-on experience in Linux and scripting.
- Knowledge/experience in the following technologies is an added plus: Elasticsearch, Kafka, Docker containers, MongoDB, Big Data, SQL databases, the ELK stack, REST APIs, web services, and JMX (a small query sketch follows).
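By way of illustration only, querying an Elasticsearch node over its REST API from Python might look like the sketch below; the node address, index name, and field are all hypothetical:

```python
import requests

# Hypothetical local node and index
ES_URL = "http://localhost:9200/app-logs/_search"

query = {
    "query": {"match": {"level": "ERROR"}},  # match error-level log entries
    "size": 5,
}

resp = requests.post(ES_URL, json=query, timeout=10)
resp.raise_for_status()
for hit in resp.json()["hits"]["hits"]:
    print(hit["_source"])
```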
ESSENTIAL SKILLS:
- 4+ years of hands-on experience with web automation using Selenium and JavaScript technologies
- Good knowledge and experience in unit testing tools such as Jasmine, Jest, or Mocha/Chai, or an equivalent framework
- Hands-on experience in Cypress
- Decent experience in testing web applications built on technologies like Node.js / React.js
- Excellent programming skills in Java, JavaScript, or related languages; has written custom test automation scripts, automation tools, and frameworks
- Ability to learn quickly and to adapt to existing / new tools and technologies
- Work exposure with custom CI/CD pipelines, e.g., Jenkins
- Experience with version control systems like Git
- Experience with bug tracking software like Jira
- Excellent written/verbal communication
- Ability to collaborate with different stakeholders, including engineers, other QA, and product owners
ROLE DESCRIPTION:
- Build the test suite for UI automation where web pages are generated using UI as a service
- The test suite is expected to be built on Cypress.io
- Develop and implement automated test scripts by enhancing the existing / new framework and architecture
- Write functional and end-to-end tests using JavaScript technologies, Selenium, etc. (an illustrative sketch follows this list)
- Develop reusable utilities and test scripts that can be shared across the platform
- Document defects effectively and collaborate with engineers and other stakeholders to have them resolved
- Participate in Agile execution and take part in the required Agile ceremonies
- Be an integral part of the development process, evangelizing TDD and collaborating with software engineers across the development cycle.
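The suite itself would be written in JavaScript on Cypress.io; purely to illustrate the navigate-locate-assert pattern these tests follow, here is a minimal equivalent using Selenium's Python bindings (Selenium is also named in the skills list; the URL and assertion are placeholders):

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # assumes a local Chrome install (Selenium 4.6+ manages the driver)
try:
    driver.get("https://example.com")                 # placeholder URL
    heading = driver.find_element(By.TAG_NAME, "h1")  # locate an element
    assert "Example" in heading.text, "unexpected page heading"
finally:
    driver.quit()
```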
Responsibilities for Data Engineer
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies (a toy sketch follows this list).
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
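The real stack here is SQL plus AWS ‘big data’ services; as a toy extract-transform-load sketch only, with SQLite standing in for the warehouse and a hypothetical events.csv as the source:

```python
import csv
import sqlite3

con = sqlite3.connect(":memory:")  # stand-in for the real warehouse
con.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")

# Extract + transform: read the hypothetical source file, cast types, drop blanks
with open("events.csv", newline="") as f:
    rows = [
        (int(r["user_id"]), float(r["amount"]))
        for r in csv.DictReader(f)
        if r["amount"]
    ]

# Load, then run a sanity-check aggregate
con.executemany("INSERT INTO events VALUES (?, ?)", rows)
total, = con.execute("SELECT SUM(amount) FROM events").fetchone()
print(f"loaded {len(rows)} rows, total amount {total}")
```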
Qualifications for Data Engineer
- Advanced working SQL knowledge: experience with relational databases and query authoring, plus working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Experience building processes supporting data transformation, data structures, metadata, dependency management, and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc. (a minimal Kafka consumer sketch follows this list)
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
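For illustration, consuming a stream from Kafka in Python might look like the sketch below (using the third-party kafka-python client; the broker address and topic are placeholders):

```python
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "events",                            # placeholder topic
    bootstrap_servers="localhost:9092",  # placeholder broker
    auto_offset_reset="earliest",
    value_deserializer=lambda v: v.decode("utf-8"),
)

for message in consumer:
    print(f"partition={message.partition} offset={message.offset} value={message.value}")
```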
Develop complex queries, pipelines, and software programs to solve analytics and data mining problems (a toy example follows this list)
Interact with other data scientists, product managers, and engineers to understand business problems and technical requirements, and deliver predictive and smart data solutions
Prototype new applications or data systems
Lead data investigations to troubleshoot data issues that arise along the data pipelines
Collaborate with different product owners to incorporate data science solutions
Maintain and improve data science platform
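As a toy example of the analytics work described above, a per-user aggregate-and-rank in pandas, with the data frame standing in for real pipeline output:

```python
import pandas as pd

# Hypothetical transactions standing in for real pipeline output
df = pd.DataFrame({
    "user":   ["a", "a", "b", "b", "b"],
    "amount": [10.0, 20.0, 5.0, 7.0, 9.0],
})

# "Complex query" in miniature: per-user aggregates plus a rank
stats = df.groupby("user")["amount"].agg(total="sum", mean="mean")
stats["rank"] = stats["total"].rank(ascending=False).astype(int)
print(stats)
```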
Must Have
BS/MS/PhD in Computer Science, Electrical Engineering or related disciplines
Strong fundamentals: data structures, algorithms, database
5+ years of software industry experience with 2+ years in analytics, data mining, and/or data warehouse
Fluency with Python
Experience developing web services using REST approaches (a minimal sketch follows this list).
Proficiency with SQL/Unix/Shell
Experience in DevOps (CI/CD, Docker, Kubernetes)
Self-driven, challenge-loving, detail-oriented, with a teamwork spirit, excellent communication skills, and the ability to multi-task and manage expectations
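A minimal REST web service sketch in Python, using Flask only as one illustrative choice (the endpoint and payload are hypothetical):

```python
from flask import Flask, jsonify  # pip install flask

app = Flask(__name__)

@app.get("/health")  # hypothetical read-only endpoint
def health():
    return jsonify(status="ok")

if __name__ == "__main__":
    app.run(port=8080)
```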
Preferred
Industry experience with big data processing technologies such as Spark and Kafka
Experience with machine learning algorithms and/or R is a plus
Experience in Java/Scala is a plus
Experience with any MPP analytics engines like Vertica
Experience with data integration tools like Pentaho/SAP Analytics Cloud
- 2-4 years of experience, with at least 1-2 years in testing web applications/services and UI testing.
- Good exposure to non-functional test strategies like load, performance, and chaos testing (a minimal load-test sketch follows this list).
- Good understanding of testing principles: UI and Usability testing, Stress testing, Functional/Regression testing, Code coverage, TDD/BDD, UAT.
- Experience in deploying solutions into AWS is a major plus.
- Good team player with effective and professional oral and written communication skills.
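As one illustrative way to express a load test in Python, a minimal Locust user class might look like this (the target host and endpoint are placeholders):

```python
from locust import HttpUser, task, between  # pip install locust

class WebsiteUser(HttpUser):
    host = "https://example.com"  # placeholder target
    wait_time = between(1, 3)     # simulated per-user think time

    @task
    def load_home(self):
        self.client.get("/")      # hit the landing page under load
```

Run with `locust -f <file>` and ramp up simulated users from the web UI.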
IT Service Company- Leading MNC, Chennai
Greetings from People First Consultants!
We are hiring for one of our leading clients in Chennai.
Work Location: Chennai
Experience: 3-6 years (Associate)
Notice: Immediate to 30 days.
Required Skills:
- Experience with test design/execution using Tricentis Tosca, Selenium, and Apptus
- Experience with SAP Agile & Salesforce
- Automation of web portals/services
- Framework creation, script creation, and maintenance
- Managing requirements in terms of automation
- Ability to work individually and also in a team as and when required
- Writing and executing code with respect to the software and its specifications
- Maintaining test script inventories
- Able to create test plans/strategies for automated tests
- Creating required data for test cases (a data-driven example follows this list)
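As a tool-agnostic illustration of data-driven test cases (the function under test and its data are hypothetical), one common Python pattern uses pytest parametrization:

```python
import pytest

def apply_discount(price, pct):
    # Hypothetical function under test
    return round(price * (1 - pct / 100), 2)

# Each tuple is one test case: "creating required data for test cases"
@pytest.mark.parametrize("price,pct,expected", [
    (100.0, 10, 90.0),
    (59.99, 0, 59.99),
    (20.0, 50, 10.0),
])
def test_apply_discount(price, pct, expected):
    assert apply_discount(price, pct) == expected
```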
15-year-old US-based product company
- Should have good hands-on experience in Informatica MDM Customer 360, Data Integration (ETL) using PowerCenter, and Data Quality.
- Must have strong skills in Data Analysis, Data Mapping for ETL processes, and Data Modeling.
- Experience with the SIF framework including real-time integration
- Should have experience in building C360 Insights using Informatica
- Should have good experience in creating performant designs using Mapplets, Mappings, and Workflows for Data Quality (cleansing) and ETL.
- Should have experience in building different data warehouse architectures like Enterprise, Federated, and Multi-Tier.
- Should have experience in configuring Informatica Data Director with reference to the data governance of users, IT Managers, and Data Stewards.
- Should have good knowledge in developing complex PL/SQL queries.
- Should have working experience on UNIX and shell scripting to run the Informatica workflows and control the ETL flow (a pmcmd sketch follows this list).
- Should know about Informatica server installation and have knowledge of the Administration Console.
- Working experience as a Developer combined with Administration is added knowledge.
- Working experience in Amazon Web Services (AWS) is an added advantage, particularly S3, Data Pipeline, Lambda, Kinesis, DynamoDB, and EMR.
- Should be responsible for the creation of automated BI solutions, including requirements, design, development, testing, and deployment
- Looking only for candidates who can join immediately or within 15 days.
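Running a PowerCenter workflow from a script is typically a wrapper around Informatica's pmcmd CLI; a Python stand-in is sketched below, with every name a placeholder for a real environment:

```python
import subprocess

# All names below are hypothetical placeholders for a real PowerCenter setup
cmd = [
    "pmcmd", "startworkflow",
    "-sv", "INT_SVC",       # integration service
    "-d", "DOMAIN_DEV",     # domain
    "-u", "etl_user",
    "-p", "secret",         # in practice, prefer pmcmd's environment-variable options
    "-f", "CUSTOMER_360",   # repository folder
    "-wait",                # block until the workflow finishes
    "wf_load_customers",    # workflow name
]

result = subprocess.run(cmd)
raise SystemExit(result.returncode)  # propagate workflow status to the calling shell
```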
- Looking for an experienced Integration specialist with a good expertise in ETL Informatica and a strong Application integration background
- Minimum of 3+ years of relevant experience in Informatica MDM required; PowerCenter is a core skill set.
- Having experience in a broader Informatica toolset is strongly preferred
- Should demonstrate very strong implementation experience in application integration, with expertise across multiple use cases
- Passionate coders with a strong application development background; years of experience could range from 5+ to 15+
- Should have application development experience outside of ETL (just learning ETL as a tool is not enough); experience writing applications outside of ETL will bring in more value
- Strong database skills with a strong understanding of data, data quality, and data governance, and of developing standalone and integrated database layers (SQL, packages, functions, performance tuning); i.e., an expert with a strong integration background rather than just an ETL Informatica tool user
- Experience in integration with XML/JSON-based systems, heavily involving JMS MQ (read/write)
- Experience with SOAP- and REST-based APIs exchanging both XML and JSON for request and response (a minimal sketch follows at the end of this listing)
- Experience with Salesforce.com integration using the Informatica PowerExchange module is a plus but not required
- Experience with Informatica MDM as the technology stack used to integrate senior market members with Salesforce.com is a plus but not required.
- Very strong scripting background (C, Bourne shell, Perl, Java)
- Should be able to understand Java; we do have development around Java, i.e., the ability to work out a solution in a programming language like Java when implementation is not possible through ETL
- Ability to communicate effectively via multiple channels (verbal, written, etc.) with technical and non-technical staff.
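To illustrate the XML-request / JSON-response exchange mentioned above, a minimal Python sketch (the endpoint and payload are hypothetical):

```python
import requests
import xml.etree.ElementTree as ET

URL = "https://api.example.com/orders"  # hypothetical endpoint

# Build an XML request body
order = ET.Element("order")
ET.SubElement(order, "id").text = "42"
xml_body = ET.tostring(order, encoding="unicode")

resp = requests.post(
    URL,
    data=xml_body,
    headers={"Content-Type": "application/xml"},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # response parsed as JSON
```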