43+ Linux/Unix Jobs in Chennai | Linux/Unix Job openings in Chennai
Are you interested in joining the team behind Amazon’s newest innovation? Come help us work on world class software for our customers!
The Amazon Kindle Reader and Shopping Support Engineering team provides production engineering support and offers multifaceted services to the Kindle digital product family of development teams, working with production operations teams on software product release coordination and deployment. This job requires you to hit the ground running; your ability to learn quickly and to work on disparate and overlapping tasks will define your success.
Job responsibilities
- Provide support for incoming tickets, including extensive troubleshooting, with responsibilities covering multiple products, features and services
- Work on operations and maintenance driven coding projects, primarily in Java and C++
- Software deployment support in staging and production environments
- Develop tools to aid operations and maintenance
- System and Support status reporting
- Ownership of one or more Digital products or components
- Customer notification and workflow coordination and follow-up to maintain service level agreements
- Work with the support team to hand off or take over active support issues and to build a team-specific knowledge base and skill set
BASIC QUALIFICATIONS
- 4+ years of software development, or 4+ years of technical support experience
- Experience troubleshooting and debugging technical systems
- Experience in Unix
- Experience scripting in modern programming languages
- Experience in agile/scrum or related collaborative workflow
- Experience troubleshooting and documenting findings
PREFERRED QUALIFICATIONS
- Knowledge of distributed applications/enterprise applications
- Knowledge of UNIX/Linux operating system
- Experience analyzing and troubleshooting RESTful web API calls
- Exposure to iOS (SWIFT) and Android (Native) App support & development
Mandatory skill set: C++ and Python, UNIX, databases (SQL or Postgres)
Developer role experience: 3 to 5 years
Location: Bangalore / Chennai / Hyderabad
1. Strong proficiency in C++, with fair knowledge of the language specification (telecom experience preferred).
2. Proficient understanding of the Standard Template Library (STL): algorithms, containers, functions, and iterators.
3. Must have experience on Unix platforms and possess shell scripting skills.
4. Knowledge of compilers (gcc, g++) and debuggers (dbx); knowledge of libraries and linking.
5. Good understanding of code versioning tools (e.g. Git, CVS).
6. Able to write and understand Python scripts (both Python 2 and Python 3).
7. Hands-on with logic implementation in Python; should be familiar with list comprehensions and comfortable integrating Python with C++ and Unix scripts.
8. Able to implement multithreading in both C++ and Python environments.
9. Familiar with PostgreSQL.
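A minimal Python sketch of the list-comprehension and multithreading skills asked for above; the data and worker logic are invented for illustration only:

```python
import threading

# List comprehension: filter and transform in one expression.
readings = [3, 8, 1, 9, 4]
scaled = [r * 10 for r in readings if r > 2]  # keeps 3, 8, 9, 4

# Basic multithreading: each worker sums a slice of the data and
# appends its result to a shared list; a lock guards the append.
results, lock = [], threading.Lock()

def worker(chunk):
    total = sum(chunk)
    with lock:
        results.append(total)

threads = [threading.Thread(target=worker, args=(readings[i::2],))
           for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(scaled)        # [30, 80, 90, 40]
print(sum(results))  # 25
```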
C++ developer with Python as a secondary skill, 3 to 4 years of experience; should be CW.
Title: Platform Engineer
Location: Chennai
Work Mode: Hybrid (Remote and Chennai Office)
Experience: 4+ years
Budget: 16 - 18 LPA
Responsibilities:
- Parse data using Python, create dashboards in Tableau.
- Utilize Jenkins for Airflow pipeline creation and CI/CD maintenance.
- Migrate Datastage jobs to Snowflake, optimize performance.
- Work with HDFS, Hive, Kafka, and basic Spark.
- Develop Python scripts for data parsing, quality checks, and visualization.
- Conduct unit testing and web application testing.
- Implement Apache Airflow and handle production migration.
- Apply data warehousing techniques for data cleansing and dimension modeling.
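To illustrate the Python data-parsing and quality-check work described above, here is a minimal sketch; the CSV schema and validation rules are hypothetical, not from any actual pipeline:

```python
import csv
import io

# Hypothetical feed: the column names and rules below are illustrative.
raw = """id,amount,country
1,250.0,IN
2,,US
3,-40.5,IN
"""

def quality_check(text):
    """Parse CSV text and report rows failing basic data-quality checks."""
    failures = []
    for row in csv.DictReader(io.StringIO(text)):
        if not row["amount"]:
            failures.append((row["id"], "missing amount"))
        elif float(row["amount"]) < 0:
            failures.append((row["id"], "negative amount"))
    return failures

print(quality_check(raw))  # [('2', 'missing amount'), ('3', 'negative amount')]
```

In a real pipeline a report like this would feed an alerting step or a dashboard rather than being printed.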
Requirements:
- 4+ years of experience as a Platform Engineer.
- Strong Python skills, knowledge of Tableau.
- Experience with Jenkins, Snowflake, HDFS, Hive, and Kafka.
- Proficient in Unix Shell Scripting and SQL.
- Familiarity with ETL tools like DataStage and DMExpress.
- Understanding of Apache Airflow.
- Strong problem-solving and communication skills.
Note: Only candidates willing to work in Chennai and available for immediate joining will be considered. Budget for this position is 16 - 18 LPA.
About us:
HappyFox is a software-as-a-service (SaaS) support platform. We offer an enterprise-grade help desk ticketing system and intuitively designed live chat software.
We serve over 12,000 companies in 70+ countries. HappyFox is used by companies that span across education, media, e-commerce, retail, information technology, manufacturing, non-profit, government and many other verticals that have an internal or external support function.
To know more, visit https://www.happyfox.com/
We’re looking for a Lead Backend Engineer with 5+ years of experience in building web services to join our engineering team to help architect, build and run our growing list of products. You should have prior experience being responsible for building sufficiently complex products/services and mentoring software engineers.
Responsibilities:
- Lead a team of engineers working on our product roadmap. You are expected to contribute to feature development with hands-on development tasks
- Oversee software architecture, source control workflows, and CI/CD processes and perform code reviews to ensure exceptional code quality
- Improve the development experience and the quality of the codebase. You will define and uphold best practices and coding standards for the team
- Responsible for architecture and design decisions
- Own stability and performance of the service that you work on
- Work with the Engineering Manager to ship stable software on time
- Contribute to the vision and long-term strategy in your area of expertise
Requirements:
- 2+ years of experience in a technical lead role designing and building complex backend systems
- 5+ years of professional software development experience
- Excellent knowledge of best practices and coding patterns
- Deep knowledge of backend programming languages like Python and web application frameworks like Django
- Solid experience in building web services using relational databases like PostgreSQL or NoSQL databases like MongoDB
- Proficiency with Amazon Web Services (AWS) or Google Cloud Platform or Azure
- Extensive experience with Linux/UNIX production environments
- Strong sense of quality in terms of both program architecture and code style
- Passion to dig into technically complex problems to troubleshoot and figure out a solution
- Desire to continuously improve and ship the best quality product
- Pragmatic approach to make sure technical decisions align with actual business needs.
- Good understanding of the trade-offs when building for product value, reliability, or performance
- Excellent communication skills with the ability to discuss and explain your point of view clearly and effectively
- An engineering degree is a must (B.E. in CS preferred)
Responsibilities:
• Designing Hive/HCatalog data models, including table definitions, file formats, and compression techniques for structured & semi-structured data processing
• Implementing Spark-based ETL frameworks
• Implementing big data pipelines for data ingestion, storage, processing & consumption
• Modifying the Informatica-Teradata & Unix based data pipeline
• Enhancing the Talend-Hive/Spark & Unix based data pipelines
• Developing and deploying Scala/Python based Spark jobs for ETL processing
• Strong SQL & DWH concepts required
Preferred Background:
• Function as integrator between business needs and technology solutions, helping to create technology solutions to meet clients’ business needs
• Lead project efforts in defining scope, planning, executing, and reporting to stakeholders on strategic initiatives
• Understanding of EDW system of business and creating High level design document and low level implementation document
• Understanding of Big Data Lake system of business and creating High level design document and low level implementation document
• Designing Big data pipeline for Data Ingestion, Storage, Processing & Consumption
• Strong in basic C++, STL, Linux
• OOPs, exception handling
• Design patterns and SOLID principles; concepts related to UML representation
• Solution, design, and architecture concepts
• Knowledge of pointers and smart pointers
• IO streams, files and streams, and lambda expressions in C++ an added advantage
• Features of C++17 and usage of STL in C++ an added advantage
• Templates in C++
• Communication skills, attitude, learnability
This is a work-from-office role.
Job Description
Roles & Responsibilities
- Work across the entire landscape that spans network, compute, storage, databases, applications, and business domain
- Use the Big Data and AI-driven features of vuSmartMaps to provide solutions that will enable customers to improve the end-user experience for their applications
- Create detailed designs, solutions and validate with internal engineering and customer teams, and establish a good network of relationships with customers and experts
- Understand the application architecture and transaction-level workflow to identify touchpoints and metrics to be monitored and analyzed
- Analyze data and provide insights and recommendations
- Constantly stay ahead in communicating with customers. Manage planning and execution of platform implementation at customer sites.
- Work with the product team in developing new features, identifying solution gaps, etc.
- Interest and aptitude in learning new technologies: Big Data, NoSQL databases, Elasticsearch, MongoDB, DevOps.
Skills & Experience
- 2+ years of experience in IT Infrastructure Management
- Experience in working with large-scale IT infra, including applications, databases, and networks.
- Experience in working with monitoring tools, automation tools
- Hands-on experience in Linux and scripting.
- Knowledge/Experience in the following technologies will be an added plus: ElasticSearch, Kafka, Docker Containers, MongoDB, Big Data, SQL databases, ELK stack, REST APIs, web services, and JMX.
AI First, New-age Technology Company for the Digital
Location:Bangalore / Chennai
Type: Permanent
Company Type: AI / Digital Product Design Service based
Job Description
* Experience in C/C++, with strong experience in embedded application development and integration on Linux
* Good in creating unit tests, performing code reviews, optimizing performance and ensuring standards for maintainability
* Debugging, profiling, and performance optimization skills
* Have experience working in a fast-paced Agile/Scrum environment
at Altimetrik
Location: Chennai, Bangalore, Pune, Jaipur
Experience: 5 to 8 years
- Implement best practices for the engineering team across code hygiene, overall architecture design, testing, and deployment activities
- Drive technical decisions for building data pipelines, data lakes, and analyst access.
- Act as a leader within the engineering team, providing support and mentorship for teammates across functions
- Bachelor’s Degree in Computer Science or equivalent job experience
- Experienced developer in large data environments
- Experience using Git productively in a team environment
- Experience with Docker
- Experience with Amazon Web Services
- Ability to sit with business or technical SMEs to listen, learn and propose technical solutions to business problems
· Experience using and adapting to new technologies
· Take and understand business requirements and goals
· Work collaboratively with project managers and stakeholders to make sure that all aspects of the project are delivered as planned
· Strong SQL skills with MySQL or PostgreSQL
- Experience with non-relational databases and their role in web architectures desired
Knowledge and Experience:
- Good experience with Elixir and functional programming a plus
- Several years of python experience
- Excellent analytical and problem-solving skills
- Excellent organizational skills
Proven verbal and written cross-department and customer communication skills
Notice period should be immediate or a maximum of 20 days
● Experience in Core Java and Spring.
● Extensive experience in developing enterprise-scale n-tier applications for the financial domain. Should possess good architectural knowledge and be aware of enterprise application design patterns.
● Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
● Good development experience with RDBMS, preferably Sybase database.
● Good knowledge of multi-threading and high-volume server-side development.
● Experience in sales and trading platforms in investment banking/capital markets.
● Basic working knowledge of Unix/Linux.
● Excellent problem-solving and coding skills in Java.
● Strong interpersonal, communication and analytical skills.
● Should have the ability to express their design ideas and thoughts.
at Altimetrik
DevOps Architect
Experience: 10-12+ years of relevant experience in DevOps
Locations : Bangalore, Chennai, Pune, Hyderabad, Jaipur.
Qualification:
• Bachelor's or advanced degree in Computer Science, Software Engineering or equivalent is required.
• Certifications in specific areas are desired
Technical Skillset: Skills Proficiency level
- Build tools (Ant or Maven) - Expert
- CI/CD tool (Jenkins or Github CI/CD) - Expert
- Cloud DevOps (AWS CodeBuild, CodeDeploy, Code Pipeline etc) or Azure DevOps. - Expert
- Infrastructure As Code (Terraform, Helm charts etc.) - Expert
- Containerization (Docker, Docker Registry) - Expert
- Scripting (Linux) - Expert
- Cluster deployment (Kubernetes) & maintenance - Expert
- Programming (Java) - Intermediate
- Application Types for DevOps (Streaming like Spark, Kafka, Big data like Hadoop etc) - Expert
- Artifactory (JFrog) - Expert
- Monitoring & Reporting (Prometheus, Grafana, PagerDuty etc.) - Expert
- Ansible, MySQL, PostgreSQL - Intermediate
• Source Control (like Git, Bitbucket, Svn, VSTS etc)
• Continuous Integration (like Jenkins, Bamboo, VSTS )
• Infrastructure Automation (like Puppet, Chef, Ansible)
• Deployment Automation & Orchestration (like Jenkins, VSTS, Octopus Deploy)
• Container Concepts (Docker)
• Orchestration (Kubernetes, Mesos, Swarm)
• Cloud (like AWS, Azure, GoogleCloud, Openstack)
Roles and Responsibilities
• A DevOps architect should automate processes with appropriate tools.
• Developing appropriate DevOps channels throughout the organization.
• Evaluating, implementing and streamlining DevOps practices.
• Establishing a continuous-build environment to accelerate software development and deployment.
• Engineering general and effective processes.
• Helping operations and development teams solve their problems.
• Supervising, examining and handling technical operations.
• Defining DevOps processes and operations.
• Capacity to lead teams with a leadership attitude.
• Must possess excellent automation skills and the ability to drive initiatives to automate processes.
• Building strong cross-functional leadership skills and working together with the operations and engineering teams to make sure that systems are scalable and secure.
• Excellent knowledge of software development and software testing methodologies along with configuration management practices in Unix and Linux-based environment.
• Possess sound knowledge of cloud-based environments.
• Experience in handling automated deployment CI/CD tools.
• Must possess excellent knowledge of infrastructure automation tools (Ansible, Chef, and Puppet).
• Hands-on experience working with Amazon Web Services (AWS).
• Must have strong expertise in operating Linux/Unix environments and scripting languages like Python, Perl, and Shell.
• Ability to review deployment and delivery pipelines i.e., implement initiatives to minimize chances of failure, identify bottlenecks and troubleshoot issues.
• Previous experience in implementing continuous delivery and DevOps solutions.
• Experience in designing and building solutions to move data and process it.
• Must possess expertise in any of the coding languages depending on the nature of the job.
• Experience with containers and container orchestration tools (AKS, EKS, OpenShift, Kubernetes, etc)
• Experience with version control systems a must (GIT an advantage)
• Belief in "Infrastructure as Code" (IaC), including experience with open-source tools such as Terraform
• Treats best practices for security as a requirement, not an afterthought
• Extensive experience with version control systems like GitLab and their use in release management, branching, merging, and integration strategies
• Experience working with Agile software development methodologies
• Proven ability to work on cross-functional Agile teams
• Mentor other engineers in best practices to improve their skills
• Creating suitable DevOps channels across the organization.
• Designing efficient practices.
• Delivering comprehensive best practices.
• Managing and reviewing technical operations.
• Ability to work independently and as part of a team.
• Exceptional communication skills, be knowledgeable about the latest industry trends, and highly innovative
The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products.
Responsibilities for Data Engineer
• Create and maintain optimal data pipeline architecture.
• Assemble large, complex data sets that meet functional / non-functional business requirements.
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies.
• Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
• Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
• Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
• Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer
• Experience building and optimizing big data ETL pipelines, architectures and data sets.
• Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
• Strong analytic skills related to working with unstructured datasets.
• Build processes supporting data transformation, data structures, metadata, dependency and workload management.
• A successful history of manipulating, processing and extracting value from large disconnected datasets.
Technical Skills Required – Mandatory:
- Cloud Technologies – AWS or Big Data (basic level)
- Python (highly experienced)
- Databases – SQL (highly experienced)
- Operating Systems – Unix (highly experienced)
Strong knowledge of writing complex SQL queries and of performance tuning
Excellent working knowledge of the production support role and strong debugging/troubleshooting skills
Provide 24×7 operational support for all production processes, including holidays and weekends.
Hands-on experience in production support for data engineering processes, root cause analysis, and identifying opportunities to improve existing processes.
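As a small, self-contained illustration of SQL performance tuning, the `sqlite3` module in Python's standard library can show how adding an index changes a query plan from a full table scan to an index search. The table and data here are invented; production databases expose the same idea through their own EXPLAIN facilities:

```python
import sqlite3

# In-memory database with a toy orders table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(i, i % 100, i * 1.5) for i in range(1000)])

query = "SELECT SUM(total) FROM orders WHERE customer_id = 42"
plan_before = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# Add an index on the filtered column, then re-examine the plan.
con.execute("CREATE INDEX idx_orders_cust ON orders (customer_id)")
plan_after = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# The detail column reads roughly "SCAN orders" before the index and
# "SEARCH orders USING INDEX idx_orders_cust ..." after; exact wording
# varies by SQLite version.
print(plan_before[0][-1])
print(plan_after[0][-1])
```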
Roles and Responsibilities:
Set up and optimize the existing integration, monitoring and alerting processes.
Working knowledge of ETL/ELT in Hadoop, Spark and MPP databases is required.
Effectively collaborate with the partners (SMEs, DBA, and Business users) to ensure the reliability of the Data systems.
Responsibilities :
- Develop server side applications in C++
- Contribute to analysis based on requirements; analysis can take the form of research and producing proofs of concept.
- Produce documents and specifications wherever required.
- Work collaboratively with other developers and QA engineers to deliver the solutions.
CoreStack, an AI-powered multi-cloud governance solution, empowers enterprises to rapidly achieve Continuous and Autonomous Cloud Governance at Scale. CoreStack enables enterprises to realize outcomes such as 40% decrease in cloud costs and 50% increase in operational efficiencies by governing operations, security, cost, access, and resources. CoreStack also assures 100% compliance with standards such as ISO, FedRAMP, NIST, HIPAA, PCI-DSS, AWS CIS & Well Architected Framework (WAF). We work with many large global customers across multiple industries including Financial Services, Healthcare, Retail, Education, Telecommunications, Technology and Government.
Responsibilities:
- Part of a Cloud Governance product team responsible for installing, configuring, automating and monitoring various Cloud Services (IaaS, PaaS, and SaaS)
- Be at the forefront of Cloud technology, assisting a global list of customers that consume multiple cloud environments.
- Ensure availability of internal and customers' hosts and services through monitoring, analyzing metric trends, and investigating alerts.
- Explore and implement a broad spectrum of open source technologies. Help the team/customer resolve technical issues.
- Extremely customer focused; flexible to be available on-call for solving critical problems.
- Contribute towards process improvement involving product deployments, Cloud Governance & Customer Success.
Skills Required
- Minimum 3+ Years of experience with a B.E/B.Tech
- Experience in managing Azure IaaS, PaaS services for customer production environments
- Well versed in DevOps technologies, automation, infrastructure orchestration, configuration management and CI/CD
- Experience in Linux and Windows Administration, server hardening and security compliance
- Web and Application Server technologies (e.g. Apache, Nginx, IIS)
- Good command in at least one scripting language (e.g. Bash, PowerShell, Ruby, Python)
- Networking protocols such as HTTP, DNS and TCP/IP
- Experience in managing version control platforms (e.g. Git, SVN)
- 4-15 years of experience in Application Support
- Must have good knowledge in Java/J2EE, Microservices, PL/SQL, Unix
- Good to have knowledge in Agile, JIRA, Splunk, Service Now
- Good understanding and hands-on experience in Incident, Problem and Change Management
- Provide technical leadership to the team & contribute in the skill development within team
Interact with internal teams and client stakeholders to troubleshoot tickets/incidents and manage other support activities. Good communication skills are necessary; must be a team player and inquisitive.
- Strong customer service and support focus with a desire to deliver a high-quality service
- Ability to multi-task, work under pressure and to tight deadlines
- Flexible in working outside of office business hours at short notice (as required)
- Should be able to examine the system and identify areas for service improvements and value adds.
- 3 to 4 years of professional experience as a DevOps / System Engineer
- Command-line experience with Linux, including writing bash scripts
- Programming in Python, Java or similar
- Fluent in Python and Python testing best practices
- Extensive experience working within AWS and with its managed products (EC2, ECS, ECR, R53, SES, ElastiCache, RDS, VPCs, etc.)
- Strong experience with containers (Docker, Compose, ECS)
- Version control system experience (e.g. Git)
- Networking fundamentals
- Ability to learn and apply new technologies through self-learning
Responsibilities
- As part of a team implement DevOps infrastructure projects
- Design and implement secure automation solutions for development, testing, and production environments
- Build and deploy automation, monitoring, and analysis solutions
- Manage our continuous integration and delivery pipeline to maximize efficiency
- Implement industry best practices for system hardening and configuration management
- Secure, scale, and manage Linux virtual environments
- Develop and maintain solutions for operational administration, system/data backup, disaster recovery, and security/performance monitoring
A minimum of 4 years of experience in Java development
- Experience delivering services (REST, SOAP) and web applications in a microservices architecture
- Experience developing and deploying Java solutions to cloud
- Experience in Spring Boot and components of Spring framework
- Experience in a JavaScript framework such as Angular or React
- Experience in TDD using Junit or similar frameworks
· Experience in Design Patterns and service oriented architectural principles, Data structures and Algorithms.
· Individual should be an active participant in the product design and code reviews for self and team and can competently review any aspect of their product or major sub-system.
· Experience in SQL and Unix skills.
· Good communication skills
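The TDD requirement above names JUnit; the same red-green workflow can be sketched in Python's unittest, a "similar framework" in the listing's words. The `slugify` function is a made-up example, not from any product codebase:

```python
import unittest

# Test-first sketch: the tests below are written in JUnit-style
# arrange/act/assert form against a small function under test.
def slugify(title):
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())

class SlugifyTest(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_collapses_runs_of_spaces(self):
        self.assertEqual(slugify("a   b"), "a-b")

# Run the suite programmatically (unittest.main() would exit the process).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SlugifyTest)
outcome = unittest.TextTestRunner(verbosity=0).run(suite)
print(outcome.wasSuccessful())  # True
```

In strict TDD the tests are committed first, fail (red), and the minimal implementation is then written to make them pass (green).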
Intuitive is the fastest growing top-tier Cloud Solutions and Services company supporting Global Enterprise Customer across Americas, Europe and Middle East.
Intuitive is looking for highly talented hands-on Cloud Infrastructure Architects to help accelerate our growing Professional Services consulting Cloud & DevOps practice. This is an excellent opportunity to join Intuitive’s global world class technology teams, working with some of the best and brightest engineers while also developing your skills and furthering your career working with some of the largest customers.
Job Description :
- Extensive experience with K8s (EKS/GKE) and k8s ecosystem tooling, e.g., Prometheus, ArgoCD, Grafana, Istio, etc.
- Extensive AWS/GCP Core Infrastructure skills
- Infrastructure/ IAC Automation, Integration - Terraform
- Kubernetes resources engineering and management
- Experience with DevOps tools, CICD pipelines and release management
- Good at creating documentation (runbooks, design documents, implementation plans)
Linux Experience :
- Namespace
- Virtualization
- Containers
Networking Experience
- Virtual networking
- Overlay networks
- Vxlans, GRE
Kubernetes Experience :
Should have experience bringing up a Kubernetes cluster manually, without using the kubeadm tool.
Observability
Experience in observability is a plus
Cloud automation :
Familiarity with cloud platforms, especially AWS, and DevOps tools like Jenkins, Terraform, etc.
client of peoplefirst consultants
Software Engineer - C++ Developer.
Experience: 1-3 years
Requirements:
- A minimum of 2 years' experience as a C++ software developer
- Linux operating systems (basic idea of structure, file types and memory)
- Socket programming
- Version control tools like Git, SVN (basic operations)
- Current knowledge of C++ standards and specifications
- Proficiency in related languages such as C, Java, and Python
- Good understanding of backend concepts like OOPS, algorithms, data structures and design patterns
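A minimal illustration of the socket-programming requirement: a TCP echo round-trip on localhost using only the standard library. The echo protocol is just a stand-in for a real service:

```python
import socket
import threading

# Server side: accept one connection and echo whatever it receives.
def serve_once(srv):
    conn, _ = srv.accept()
    with conn:
        conn.sendall(conn.recv(1024))

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

t = threading.Thread(target=serve_once, args=(srv,))
t.start()

# Client side: connect, send a message, read the echo back.
with socket.create_connection(("127.0.0.1", port)) as cli:
    cli.sendall(b"ping")
    reply = cli.recv(1024)

t.join()
srv.close()
print(reply)  # b'ping'
```

The same socket/bind/listen/accept and connect/send/recv calls map directly onto the C socket API the C++ role would use.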
Job Description
Embedded Software/Firmware Design and Development on OS/Non-OS based platform
Necessary system architecture development and implementation
Device, sensor and gateway selection and integration based on industry use case.
Development on Image Analytics and Cloud/Platform integration
Development or integration of Cloud/Platform communication protocol
Technical Documentation and Testing of the developed system
Desired Skills, Knowledge& Experience
B. Tech/M. Tech/MCA with 8-10 years of industry experience
Strong programming skills in C/C++ and Python development in Linux
Experience in device or sensor communication interfaces (wired/wireless, short range/long range).
Experience of microcontroller and microprocessor.
Knowledge of IoT gateway firmware development, container, and OS hardening.
Experience in device communication protocol, Edge computing and processing
Competent in application development for Image processing, transmission, and storage
Good understanding of programming primitives, data structures, multi-threading and memory management techniques
Strong command over complex sensor and hardware control logic to work seamlessly with edge devices or gateways
Proactive and self-motivated
Strong verbal and written communication skills
· Strong knowledge on Windows and Linux
· Experience working in Version Control Systems like git
· Hands-on experience in tools Docker, SonarQube, Ansible, Kubernetes, ELK.
· Basic understanding of SQL commands
· Experience working on Azure Cloud DevOps
• Provides remote planning (design), implementation and/or administrative support on Dell server and storage products involving software.
• Performs initial installation, implementation, customization, integration and outline orientation for the customer.
• Works closely with other Dell teams, the account team and the customer.
Essential Skill Requirements:
• Understanding of the compute environment ecosystem
• Dell PowerEdge servers & modular servers – planning, implementation and/or administration
• Dell PowerVault – MD / ME4 series storage planning, implementation and/or administration
• Dell Storage – NX, SC series storage planning and implementation
• Experience with basic network switch technologies (Ethernet, Fibre Channel), IP networking and L2 switches
• Operating system installation and configuration:
  o Windows Server (inclusive of Hyper-V clustering)
  o VMware ESXi and virtualization
  o Red Hat Linux
• File, P2V and/or V2V migration experience is an added advantage
Desirable Requirements:
• Customer service skills
• Stakeholder management
• Excellent problem-solving, communication and organizational skills
• Flexibility, dependability and excellent time management skills
• Good presentation skills
• Analytical, articulate, results-oriented and able to provide excellent follow-ups
• Strong technical aptitude
• Ability to multi-task and influence others to achieve results
• Professional certification from Cisco/VMware/Microsoft/Red Hat/Cloud is an added advantage
Responsibilities for Data Engineer
- Create and maintain optimal data pipeline architecture,
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer
- Advanced working knowledge of SQL, including query authoring, and experience working with a variety of relational databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
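The qualifications above center on SQL-driven extraction, transformation, and loading. A minimal sketch of that pattern, using Python's built-in sqlite3 in place of a real warehouse (the `orders` and `daily_revenue` tables and their columns are hypothetical, chosen only to illustrate the extract-transform-load shape):

```python
# Minimal ETL sketch: extract raw rows, transform via SQL aggregation,
# load the result into a summary table. In production the same shape
# would target Redshift/EMR rather than an in-memory SQLite database.
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract raw orders, aggregate to daily revenue, load into a summary table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS daily_revenue (day TEXT PRIMARY KEY, revenue REAL)"
    )
    # Transform: truncate timestamps to a date and sum amounts per day.
    rows = conn.execute(
        "SELECT substr(ordered_at, 1, 10) AS day, SUM(amount) FROM orders GROUP BY day"
    ).fetchall()
    # Load: upsert the aggregated rows into the reporting table.
    conn.executemany(
        "INSERT OR REPLACE INTO daily_revenue (day, revenue) VALUES (?, ?)", rows
    )
    conn.commit()
    return len(rows)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (ordered_at TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?)",
        [("2024-01-01T09:00", 10.0), ("2024-01-01T17:30", 5.0), ("2024-01-02T11:15", 7.5)],
    )
    print(run_etl(conn))  # number of summary rows loaded
```

The same extract/transform/load split maps directly onto the pipeline tools listed (Airflow or Luigi would schedule each step as a task).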
Develop complex queries, pipelines and software programs to solve analytics and data mining problems
Interact with other data scientists, product managers, and engineers to understand business problems and technical requirements, and deliver predictive and smart data solutions
Prototype new applications or data systems
Lead data investigations to troubleshoot data issues that arise along the data pipelines
Collaborate with different product owners to incorporate data science solutions
Maintain and improve data science platform
Must Have
BS/MS/PhD in Computer Science, Electrical Engineering or related disciplines
Strong fundamentals: data structures, algorithms, database
5+ years of software industry experience with 2+ years in analytics, data mining, and/or data warehouse
Fluency with Python
Experience developing web services using REST approaches.
Proficiency with SQL/Unix/Shell
Experience in DevOps (CI/CD, Docker, Kubernetes)
Self-driven, challenge-loving, detail oriented, teamwork spirit, excellent communication skills, ability to multi-task and manage expectations
Preferred
Industry experience with big data processing technologies such as Spark and Kafka
Experience with machine learning algorithms and/or R a plus
Experience in Java/Scala a plus
Experience with any MPP analytics engines like Vertica
Experience with data integration tools like Pentaho/SAP Analytics Cloud
Requirements
You will make an ideal candidate if you have:
- Overall 6+ years of software development experience in building web-based and highly scalable applications
- Excellent understanding of core computer science concepts like algorithms, data structures, system design, OOP, etc.
- Deep knowledge and development expertise in Core Java and Spring Boot
- Experience in using Docker and Linux
- Hands-on coding experience in developing web applications with REST APIs
- Good understanding of/exposure to Kafka and Elasticsearch (must have)
- Understanding of Logstash is an add-on
- Experience working in a CI/CD environment with Jenkins and handling deployments yourself
- Experience working in an agile team with software engineering practices like automated testing, test-driven development, continuous integration, etc. is a big plus
Requirements:-
- Bachelor’s or Master’s degree in Computer Science, Engineering, Software Engineering, or a relevant field.
- Strong experience with Windows/Linux-based infrastructures and Linux/Unix administration.
- Knowledge of Jira, Bitbucket, Jenkins, Xray, Ansible, Windows, and .NET as core skills.
- Strong experience with databases such as MS SQL, MySQL, and NoSQL stores.
- Knowledge of scripting languages such as Shell, Python, PHP, Groovy, and Bash.
- Experience with Agile project management and workflow tools such as Jira, Workfront, etc.
- Experience with open-source technologies and cloud services.
- Experience in working with Puppet or Chef for automation and configuration.
- Strong communication skills and ability to explain protocol and processes with team and management.
- Experience in a DevOps Engineer role (or similar role)
- Experience in software development and infrastructure development is a plus
Job Specifications:-
- Building and maintaining tools, solutions and micro services associated with deployment and our operations platform, ensuring that all meet our customer service standards and reduce errors.
- Actively troubleshoot any issues that arise during testing and production, catching and solving issues before launch.
- Test our system integrity, implemented designs, application developments and other processes related to infrastructure, making improvements as needed
- Deploy product updates as required while implementing integrations when they arise.
- Automate our operational processes as needed, with accuracy and in compliance with our security requirements.
- Specify, document, and develop new product features, and write automation scripts. Manage code deployments, fixes, updates, and related processes.
- Work with open-source technologies as needed.
- Work with CI and CD tools, and source control such as GIT and SVN.
- Lead the team through development and operations.
- Offer technical support where needed, developing software for our back-end systems.
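Several of the duties above boil down to automating troubleshooting so issues are caught before launch. A small sketch of that kind of tooling: scanning a service log for error lines and summarizing the most frequent ones (the `LEVEL message` log format is an assumption for illustration; a real deployment would plug in its own parser and alerting):

```python
# Hypothetical troubleshooting helper: count distinct ERROR messages in a
# log so the most frequent failure surfaces first.
from collections import Counter

def summarize_errors(log_lines):
    """Return a Counter of messages for lines whose level is ERROR."""
    errors = Counter()
    for line in log_lines:
        # Split "LEVEL message" on the first space.
        level, _, message = line.partition(" ")
        if level == "ERROR":
            errors[message.strip()] += 1
    return errors

if __name__ == "__main__":
    log = [
        "INFO service started",
        "ERROR db connection refused",
        "WARN slow query",
        "ERROR db connection refused",
    ]
    print(summarize_errors(log).most_common(1))  # [('db connection refused', 2)]
```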
An IT Services Major, hiring for a leading insurance player.
Job Description:
- To resolve the tickets raised as per the defined standards of time, cost, and quality for increased customer satisfaction.
Skills and Experience
- Good SQL & PLSQL knowledge.
- Good Unix basic commands knowledge.
- Good Unix scripting knowledge will be an advantage.
- Skills: ANSI SQL, Unix
- Experience: Fresher
We are hiring candidates who are looking to work in a cloud environment and ready to learn and adapt to the evolving technologies.
Linux Administrator Roles & Responsibilities:
- 5+ years of professional experience with strong working expertise in Agile environments
- Deep knowledge in managing Linux servers.
- Managing Windows servers(Not Mandatory).
- Manage Web servers (Apache, Nginx).
- Manage Application servers.
- Strong background & experience in any one scripting language (Bash, Python)
- Manage firewall rules.
- Perform root cause analysis for production errors.
- Basic administration of MySQL, MSSQL.
- Ready to learn and adapt to business requirements.
- Manage information security controls with best practices and processes.
- Support business requirements beyond working hours.
- Ensure the highest uptime of services.
- Monitor resource usage.
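The monitoring duty above is typically scripted. A minimal sketch in Python's standard library: alert when a filesystem crosses a usage threshold (the 90% default and the paths checked are illustrative assumptions, not a prescribed policy):

```python
# Disk-usage check of the kind a Linux administrator would schedule via
# cron or a monitoring agent. Uses only the standard library.
import shutil

def disk_usage_percent(path: str) -> float:
    """Return used space on the filesystem holding `path`, as a percentage."""
    usage = shutil.disk_usage(path)
    return 100.0 * usage.used / usage.total

def over_threshold(path: str, threshold: float = 90.0) -> bool:
    """True if the filesystem is fuller than `threshold` percent."""
    return disk_usage_percent(path) > threshold

if __name__ == "__main__":
    for mount in ("/",):
        pct = disk_usage_percent(mount)
        print(f"{mount}: {pct:.1f}% used, alert={over_threshold(mount)}")
```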
Skills/Requirements
- Bachelor’s Degree or Diploma in Computer Science, Engineering, Software Engineering or a relevant field.
- Experience with Linux-based infrastructures, Linux/Unix administration.
- Knowledge of managing databases such as MySQL and MS SQL.
- Knowledge of scripting languages such as Python, Bash.
- Knowledge in open-source technologies and cloud services like AWS, Azure is a plus. Candidates willing to learn will be preferred.
- Experience in managing web applications.
- Problem-solving attitude.
- 5+ years experience in the IT industry.
Our client company is in financial services. (O1)
- Experience working on Linux based infrastructure
- Strong hands-on knowledge of setting up production, staging, and dev environments on AWS/GCP/Azure
- Strong hands-on knowledge of technologies like Terraform, Docker, Kubernetes
- Strong understanding of continuous testing environments such as Travis-CI, CircleCI, Jenkins, etc.
- Configuration and managing databases such as MySQL, Mongo
- Excellent troubleshooting skills
- Working knowledge of various tools, open-source technologies, and cloud services
- Awareness of critical concepts in DevOps and Agile principles
A Product Based IT Startup
We are looking for networking professionals with the following skill set:
Experience: 6+ years of experience in the networking domain
Key skills:
- Must have 6+ years of experience in C/C++ programming language.
- Knowledge of Go programming language and Python programming language is a big plus.
- Strong background in L4-L7 Internet protocols: TCP, HTTP, HTTP/2, gRPC, and HTTPS/SSL/TLS.
- Background in Internet security related products such as Web Application Firewalls, API Security Gateways, Reverse Proxies and Forward Proxies
- Proven knowledge of Linux kernel internals (process scheduler, memory management, etc.)
- Experience with eBPF is a plus.
- Hands-on experience in cloud architectures (SaaS, PaaS, IaaS, distributed systems) with continuous delivery
- Familiar with containerization solutions like Docker/Kubernetes etc.
- Familiar with serverless technologies such as AWS Lambda.
- Exposure to machine learning technologies and distributed systems is a plus
Your role in helping us shape the future:
Gogo Commercial Aviation, an Intelsat company, has an exciting opportunity for a Software Engineer. You will help us design and develop solutions aimed at providing a means to manage and deploy the configuration specifics of aircraft, based on airlines, deployed aircraft technology, fleet, and installation requirements. This solution uses a combination of configuration management and cloud-based deployment using AWS, which also requires an understanding of network/telecom protocols.
Are you up to the challenge?
The ideal candidate for this role will be a working member of an engineering team delivering quality technical solutions, carrying through from design to deployment. This role requires a solid understanding of the design and development process, using CI/CD technologies. Your role will be crucial in designing and delivering complex products and solutions across technologies, products, and tools, improving code quality by following coding standards/guidelines, using static code analyzers, maximizing unit test coverage, and automating test cases.
Can you drive these processes?
- Develop groundbreaking and distributed solutions with high availability and impact.
- Work with the Architects/ Technical lead / Product Owner to understand user requirements, convert BDD scenarios or user stories into product functionality.
- Work collectively with team members to own and deliver features, and be flexible to take on any type of task (development, automation, testing) needed to complete a feature.
- Perform code reviews and associated engineering quality checks clearly and consistently.
You should definitely have:
- Bachelor's degree in Computer Science, Engineering, or related discipline
- 2 - 5 Years of experience in systems implementations with a focus on both custom application development & commercial systems software implementations
- Knowledge and experience in multiple technical disciplines required (development, QA, devops, etc)
- 2+ years of in-depth development experience with Java / C++ / C / Go.
- Experience in CI/CD and AWS deployments (EC2, SQS, SNS, Lambda, S3, Aurora, DynamoDB)
- Advanced knowledge of software development lifecycles; expert knowledge in Agile and Lean methods.
- Proponent of DevOps, TDD, Agile/XP practices, and CI/CD pipelines
- Ability to work under pressure, prioritize work, coordinate with onsite stakeholders, and stay well organized.
- Flexibility in working with technologies and platforms.
- A commitment to excellence, best practices, and the continuous improvement of our products, code base, processes, and tools.
It would be nice if you had:
- Experience with Scaled Agile Framework (SAFe) work environments
What it’s like to work with us:
Intelsat is connecting the world and transforming the satellite landscape by reaching beyond the traditional satellite industry. We are defining new products that will open new, profitable markets. To help us reach this goal, you should be a bold thinker who will perform a key role in shaping Intelsat innovation for years to come.
- We emphasize personal and professional growth
- Awesome benefits including Leave Benefits, medical and training
- Fun, diverse, and inclusive culture
We are an upcoming profitable social enterprise, and as a part of the team we are looking for a candidate who can work with our team to build better analytics and intelligence into our platform, Prabhaav.
We are looking for a Software Developer to build and implement functional programs. You will work with other Developers and Product Managers throughout the software development life cycle.
In this role, you should be a team player with a keen eye for detail and problem-solving skills. Experience with Agile frameworks and popular coding languages (e.g. JavaScript) is also expected.
Your goal will be to build efficient programs and systems that serve user needs.
Technical Skills we are looking for are:
- Producing clean, efficient code based on specifications
- Coding abilities in HTML, PHP, JS, JSP/Servlets, Java, and DevOps (basic knowledge).
- Additional skills (preferred): NodeJS, Python, AngularJS.
- System administrator experience: Linux (Ubuntu/RedHat), Windows CE Embedded.
- Database experience: MySQL, Postgres, MongoDB.
- Data format experience: JSON, XML, AJAX, jQuery.
- Depth in software architecture design, especially for stand-alone software products or SaaS platforms.
- Basic experience/knowledge in microservices, REST APIs, and SOAP methodologies.
- Should have built backend architecture for long-standing applications.
- Good HTML design sense.
- Experience with AWS services like EC2 and Lightsail is preferred.
- Testing and deploying programs and systems
- Fixing and improving existing software
- Good understanding of OOP and similar concepts.
- Research on new JS frameworks like React JS and Angular JS.
Experience areas we are looking for:
- Proven experience as a Software Developer, Software Engineer, or similar role
- Familiarity with Agile development methodologies
- Experience with software design and development in a test-driven environment
- Knowledge of coding languages (e.g. Java, JavaScript) and frameworks/systems (e.g. AngularJS, Git)
- Experience with databases and Object-Relational Mapping (ORM) frameworks (e.g. Hibernate)
- Ability to learn new languages and technologies
- Excellent communication skills
- Resourcefulness and troubleshooting aptitude
- Attention to detail
- Expertise in Infrastructure & Application design & architecture
- Expertise in AWS, OS & networking
- Having good exposure on Infra & Application security
- Expertise in Python, Shell scripting
- Proficient with DevOps tools: Terraform, Jenkins, Ansible, Docker, Git
- Solid background in systems engineering and operations
- Strong in DevOps methodologies and processes
- Strong in CI/CD pipelines and the SDLC.
Location: Chennai- Guindy Industrial Estate
Duration: Full time role
Company: Mobile Programming (https://www.mobileprogramming.com/)
Client Name: Samsung
We are looking for a Data Engineer to join our growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Responsibilities for Data Engineer
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer
- Experience building and optimizing big data ETL pipelines, architectures and data sets.
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 3-6 years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Spark, Kafka, HBase, Hive, etc.
- Experience with relational SQL and NoSQL databases
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark Streaming, etc.
- Experience with object-oriented/functional scripting languages: Python, Java, Scala, etc.
Skills: Big Data, AWS, Hive, Spark, Python, SQL
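The stream-processing systems named in these listings (Storm, Spark Streaming) are built around windowed aggregation over an event stream. A pure-Python sketch of the core idea, a tumbling count window, with event fields and the window size chosen only for illustration:

```python
# Tumbling-window aggregation sketch: bucket (timestamp, key) events into
# fixed, non-overlapping windows and count occurrences of each key per
# window. Real engines do this incrementally over unbounded streams.
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group (epoch_seconds, key) events into fixed windows; count keys per window."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Each event belongs to the window starting at the nearest
        # multiple of window_seconds at or before its timestamp.
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

if __name__ == "__main__":
    events = [(0, "click"), (30, "click"), (61, "view"), (75, "click")]
    print(tumbling_window_counts(events))
    # {0: {'click': 2}, 60: {'view': 1, 'click': 1}}
```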