48+ Shell Scripting Jobs in Pune | Shell Scripting Job openings in Pune
Apply to 48+ Shell Scripting Jobs in Pune on CutShort.io. Explore the latest Shell Scripting Job opportunities across top companies like Google, Amazon & Adobe.
Salary: INR 15 to INR 30 lakhs per annum
Performance Bonus: Up to 10% of the base salary can be added
Location: Bangalore or Pune
Experience: 2-5 years
About AbleCredit:
AbleCredit is on a mission to solve the Credit Gap of emerging economies. In India alone, the Credit Gap is over USD 5 trillion. This is the single largest contributor to poverty, a poor Gini index, and a lack of opportunities. Our vision is to deploy AI reliably and safely to solve some of the greatest problems of humanity.
Job Description:
This role is ideal for someone with a strong foundation in deep learning and hands-on experience with AI technologies.
- You will be tasked with solving complex, real-world problems using advanced machine learning models in a privacy-sensitive domain, where your contributions will have a direct impact on business-critical processes.
- As a Machine Learning Engineer at AbleCredit, you will collaborate closely with the founding team, who bring decades of industry expertise to the table.
- You’ll work on deploying cutting-edge Generative AI solutions at scale, ensuring they align with strict privacy requirements and optimize business outcomes.
This is an opportunity for experienced engineers to bring creative AI solutions to one of the most challenging and evolving sectors, while making a significant difference to the company’s growth and success.
Requirements:
- Experience: 2-4 years of hands-on experience in applying machine learning and deep learning techniques to solve complex business problems.
- Technical Skills: Proficiency in standard ML tools and languages, including:
- Python: Strong coding ability for building, training, and deploying machine learning models.
- PyTorch (or MLX or JAX): Solid experience in one or more deep learning frameworks for developing and fine-tuning models (see the training-step sketch after this list).
- Shell scripting: Familiarity with Unix/Linux shell scripting for automation and system-level tasks.
- Mathematical Foundation: Good understanding of the mathematical principles behind machine learning and deep learning (linear algebra, calculus, probability, optimization).
- Problem Solving: A passion for solving tough, ambiguous problems using AI, especially in data-sensitive, large-scale environments.
- Privacy & Security: Awareness and understanding of working in privacy-sensitive domains, adhering to best practices in data security and compliance.
- Collaboration: Ability to work closely with cross-functional teams, including engineers, product managers, and business stakeholders, and communicate technical ideas effectively.
- Work Experience: This position is for experienced candidates only.
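To illustrate the Python and PyTorch skills listed above, here is a minimal, hedged sketch of a single supervised training step. It uses random tensors as stand-ins for real, privacy-sensitive data and is not AbleCredit's actual model or pipeline.

```python
# Minimal sketch of one training step in PyTorch; model, data and hyperparameters are illustrative only.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Random batch standing in for real features and labels.
inputs = torch.randn(4, 16)
targets = torch.randint(0, 2, (4,))

optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.4f}")
```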
Additional Information:
- Location: Pune or Bangalore
- Work Environment: Collaborative and entrepreneurial, with close interactions with the founders.
- Growth Opportunities: Exposure to large-scale AI systems, GenAI, and working in a data-driven privacy-sensitive domain.
- Compensation: Competitive salary and ESOPs, based on experience and performance
- Industry Impact: You’ll be at the forefront of applying Generative AI to solve high-impact problems in the finance/credit space, helping shape the future of AI in the business world.
at Wissen Technology
Job Requirements:
- Intermediate Linux knowledge (required): experience with shell scripting; familiarity with Linux commands such as grep, awk, and sed
- Advanced Python scripting knowledge (required): strong expertise in Python
- Ruby (nice to have)
- Basic knowledge of network protocols (required): understanding of TCP/UDP and multicast/unicast (see the receiver sketch after this list)
- Packet captures (nice to have): experience with tools like Wireshark, tcpdump, and tshark
- High-performance messaging libraries (nice to have): familiarity with tools like Tibco, 29West, LBM, and Aeron
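As a small illustration of the network protocol knowledge listed above (UDP multicast in particular), here is a hedged sketch of a multicast receiver using only Python's standard library; the group address and port are made up for the example.

```python
# Minimal UDP multicast receiver sketch; 239.1.1.1:5007 is a hypothetical group/port, not a real feed.
import socket
import struct

GROUP, PORT = "239.1.1.1", 5007

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Join the multicast group on all interfaces.
mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

data, addr = sock.recvfrom(65535)
print(f"received {len(data)} bytes from {addr}")
```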
at DeepIntent
DeepIntent is leading the healthcare advertising industry with data-driven solutions built for the future. From day one, our mission has been to improve patient outcomes through the artful use of advertising, data science, and real-world clinical data.
What You’ll Do:
We are looking for a talented candidate with several years of experience in software quality assurance to join our QA team. This position is at an individual contributor level as part of a collaborative, fast-paced team. As a member of the QA team, you will work closely with Product Managers and Developers to understand application features, create robust, comprehensive test plans, write test cases, and work closely with developers to make the applications more testable. We are looking for a well-rounded candidate with solid analytical skills, an enthusiasm for taking ownership of features, a strong commitment to quality, and the ability to work closely and communicate effectively with development and other teams. Experience with the following is preferred:
- Python
- Perl
- Shell Scripting
- Selenium
- Test Automation (QA)
- Software Testing (QA)
- Software Development (MUST HAVE)
- SDET (MUST HAVE)
- MySQL
- CI/CD
Who You Are:
- Hands-on experience with QA automation framework development and design (preferred language: Python); see the Selenium sketch after this list
- Strong understanding of testing methodologies
- Scripting
- Strong problem analysis and troubleshooting skills
- Experience in databases, preferably MySQL
- Debugging skills
- REST/API testing experience is a plus
- Integrate end-to-end tests with CI/CD pipelines and monitor and improve metrics around test coverage
- Ability to work in a dynamic and agile development environment and be adaptable to changing requirements
- Performance testing experience with relevant automation and monitoring tools
- Exposure to Dockerization or Virtualization is a plus
- Experience working in the Linux/Unix environment
- Basic understanding of OS
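To illustrate the Python-based QA automation and Selenium experience listed above, here is a hedged sketch of a simple browser check using the Selenium 4 Python bindings; the URL, locators, and credentials are placeholders, not DeepIntent's application.

```python
# Illustrative Selenium 4 sketch; assumes the selenium package and a local Chrome/chromedriver setup.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.com/login")  # placeholder URL
    driver.find_element(By.NAME, "username").send_keys("qa_user")
    driver.find_element(By.NAME, "password").send_keys("not-a-real-password")
    driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()
    assert "dashboard" in driver.current_url.lower()
finally:
    driver.quit()
```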
DeepIntent is committed to bringing together individuals from different backgrounds and perspectives. We strive to create an inclusive environment where everyone can thrive, feel a sense of belonging, and do great work together.
DeepIntent is an Equal Opportunity Employer, providing equal employment and advancement opportunities to all individuals. We recruit, hire and promote into all job levels the most qualified applicants without regard to race, color, creed, national origin, religion, sex (including pregnancy, childbirth and related medical conditions), parental status, age, disability, genetic information, citizenship status, veteran status, gender identity or expression, transgender status, sexual orientation, marital, family or partnership status, political affiliation or activities, military service, immigration status, or any other status protected under applicable federal, state and local laws. If you have a disability or special need that requires accommodation, please let us know in advance.
DeepIntent’s commitment to providing equal employment opportunities extends to all aspects of employment, including job assignment, compensation, discipline and access to benefits and training.
L2 Support
Location : Mumbai, Pune, Bangalore
Requirement details : (Mandatory Skills)
- Excellent communication skills
- Production Support, Incident Management
- SQL (must have experience in writing complex queries)
- Unix (must have working experience on the Linux operating system)
- Perl/Shell Scripting
- Candidates working in the Investment Banking domain will be preferred
- Seeking an individual with around 5+ years of experience.
- Must have skills - Jenkins, Groovy, Ansible, Shell Scripting, Python, Linux Admin
- Deep knowledge of Terraform and AWS to automate and provision EC2, EBS, and SQL Server, plus cost optimization and CI/CD pipelines using Jenkins; serverless automation is a plus (see the boto3 sketch after this list).
- Excellent writing and communication skills in English. Enjoy writing crisp and understandable documentation
- Comfortable programming in one or more scripting languages
- Enjoys tinkering with tooling and researching easier ways to handle systems; strong awareness of build vs. buy.
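As a hedged illustration of the AWS automation experience listed above (the Terraform/Jenkins stack itself is not shown here), this sketch provisions a single EC2 instance with the boto3 SDK; the AMI ID, region, and tags are placeholders.

```python
# Illustrative EC2 provisioning sketch with boto3; values are placeholders, not a real environment.
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical AMI ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "ci-runner"}],
    }],
)
print(response["Instances"][0]["InstanceId"])
```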
one of the world's leading multinational investment banks
- Provide hands on technical support and post-mortem root cause analysis using ITIL standards of Incident Management, Service Request fulfillment, Change Management, Knowledge Management, and Problem Management.
- Actively address and work on user and system tickets in the ServiceNow ticketing application. Create and implement change tickets for enhancements, new monitoring, and assisting development groups.
- Create, test, and implement Non-Functional Requirements (NFR) for current and new applications.
- Build up technical subject matter expertise on the applications being supported, including business flows, application architecture, and hardware configuration. Maintain documentation, knowledge articles, and runbooks.
- Conduct real-time monitoring using an array of monitoring tools to ensure application OLAs/SLAs are met and application availability (uptime) is maximized.
- Assist in the process of approving change tickets for application code releases, as well as tasks assigned to the support team to perform and validate the associated implementation plan.
- Approach support with a proactive attitude, desire to seek root cause, in-depth analysis and triage, and strive to reduce inefficiencies and manual efforts.
As an engineer, you will help with the implementation and launch of many key product features. You will get an opportunity to work on a wide range of technologies (including Spring, AWS Elastic Search, Lambda, ECS, Redis, Spark, Kafka, etc.) and apply new technologies to solve problems. You will have an influence on defining product features, drive operational excellence, and spearhead the best practices that enable a quality product. You will get to work with skilled and motivated engineers who are already contributing to building high-scale, highly available systems.
If you are looking for an opportunity to work on leading technologies, build product technology that caters to millions of customers while providing them the best experience, and relish large ownership and diverse technologies, join our team today!
What You'll Do:
- Creating detailed design, working on development and performing code reviews.
- Implementing validation and support activities in line with architecture requirements
- Help the team translate the business requirements into R&D tasks and manage the roadmap of the R&D tasks.
- Designing, building, and implementation of the product; participating in requirements elicitation, validation of architecture, creation and review of high and low level design, assigning and reviewing tasks for product implementation.
- Work closely with product managers, UX designers, and end users, and integrate software components into a fully functional system
- Ownership of the product/feature end-to-end for all phases, from development to production.
- Ensuring the developed features are scalable and highly available with no quality concerns.
- Work closely with senior engineers to refine the design and implementation.
- Management and execution against project plans and delivery commitments.
- Assist directly and indirectly in the continual hiring and development of technical talent.
- Create and execute appropriate quality plans, project plans, test strategies and processes for development activities in concert with business and project management efforts.
The ideal candidate is an engineer who is passionate about delivering experiences that delight customers and creating solutions that are robust. He/she should be able to commit to and own the deliveries end-to-end.
What You'll Need:
- A Bachelor's degree in Computer Science or related technical discipline.
- 2-3+ years of Software Development experience with proficiency in Java or equivalent object-oriented languages, coupled with design and SOA
- Fluency with Java and Spring is good to have.
- Experience with JEE applications and frameworks/tools like Struts, Spring, MyBatis, Maven, and Gradle
- Strong knowledge of Data Structures, Algorithms and CS fundamentals.
- Experience with at least one shell scripting language, SQL (e.g., SQL Server, PostgreSQL), and data modeling skills
- Excellent analytical and reasoning skills
- Ability to learn new domains and deliver output
- Hands on Experience with the core AWS services
- Experience working with CI/CD tools (Jenkins, Spinnaker, Nexus, GitLab, TeamCity, GoCD, etc.)
- Expertise in at least one of the following:
- Kafka, ZeroMQ, AWS SNS/SQS, or equivalent streaming technology
- Distributed cache/in memory data grids like Redis, Hazelcast, Ignite, or Memcached
- Distributed column store databases like Snowflake, Cassandra, or HBase
- Spark, Flink, Beam, or equivalent streaming data processing frameworks
- Proficiency in writing and reviewing Python and other object-oriented language(s) is a plus
- Experience building automation and CI/CD pipelines (integration, testing, deployment)
- Experience with Kubernetes would be a plus.
- Good understanding of working with distributed teams using Agile: Scrum, Kanban
- Strong interpersonal skills as well as excellent written and verbal communication skills
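To illustrate the streaming option listed above (Kafka and equivalents), here is a hedged sketch of producing a message with the third-party kafka-python client; the broker address and topic name are assumptions for the example.

```python
# Minimal Kafka producer sketch using kafka-python; broker and topic names are illustrative.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # assumed local broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

producer.send("order-events", {"order_id": 123, "status": "CREATED"})
producer.flush()  # block until the message is actually sent
producer.close()
```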
• Attention to detail and quality, and the ability to work well in and across teams
• Experienced developer in Shell scripting
• PERL scripting
• PL/SQL knowledge is required.
• Advanced communication skills are a must.
• Ability to learn new applications and technologies
- Building and setting up new development tools and infrastructure
- Understanding the needs of stakeholders and conveying this to developers
- Working on ways to automate and improve development and release processes
- Ensuring that systems are safe and secure against cybersecurity threats
- Identifying technical problems and developing software updates and 'fixes'
- Working with software developers and software engineers to ensure that development follows established processes and works as intended
Daily and Monthly Responsibilities :
- Deploy updates and fixes
- Provide Level 2 technical support
- Build tools to reduce occurrences of errors and improve customer experience
- Develop software to integrate with internal back end systems
- Perform root cause analysis for production errors
- Investigate and resolve technical issues
- Develop scripts to automate visualization
- Design procedures for system troubleshooting and maintenance
Skills and Qualifications :
- Bachelors in Computer Science, Engineering or relevant field
- Experience as a DevOps Engineer or similar software engineering role
- Proficient with git and git workflows
- Good knowledge of Python
- Working knowledge of databases such as MySQL and PostgreSQL, and of SQL
- Problem-solving attitude
- Collaborative team spirit
- Detailed knowledge of Linux systems (Ubuntu)
- Proficient in AWS console and should have handled the infrastructure of any product (Including dev and prod environments)
Mandatory hands-on experience in the following:
- Python-based application deployment and maintenance
- NGINX web server
- AWS services: EC2, VPC, EBS, S3
- IAM setup
- Database configuration: MySQL, PostgreSQL (see the psycopg2 sketch after this list)
- Linux flavoured OS
- Instance/Disaster management
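As a small illustration of the Python plus PostgreSQL configuration experience listed above, here is a hedged connectivity-check sketch using the third-party psycopg2 driver; the host, database, and credentials are placeholders.

```python
# Illustrative PostgreSQL health check with psycopg2; connection details are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="db.internal.example",  # hypothetical host
    dbname="appdb",
    user="app",
    password="change-me",
)
with conn, conn.cursor() as cur:
    cur.execute("SELECT version();")
    print(cur.fetchone()[0])
conn.close()
```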
Designation: Linux/System/Support Engineer (L2); Experience: 2-5 years
Notice period: immediate to 30 days
- Server Monitoring
- Deployments
- Collecting information about the reported issues
- Ensuring all the information has been logged in the ticketing system
- Must be able to follow and execute instructions specified in user guides and emails to run, monitor, and troubleshoot
- Must be able and willing to document activities and procedures
- Must have troubleshooting skills and knowledge of antivirus, firewall, and gateway products
- Should be ready to work extended shifts, if required
- Good customer management skills bundled with good communication skills
- Databases: concepts and ability to use DB tools such as psql
- Good understanding of Oracle, WebLogic, and Linux/Unix terminology, and able to execute commands
- Internet technologies: Tomcat/Apache concepts, basic HTML, etc.
- Able to use MS Excel and PowerPoint
at FPL Technologies Pvt Ltd
SDET (Software Development Engineer in Test)
Build a REST API automation framework from the ground up using best-in-class open-source and cloud technologies. Help build an automation-first quality culture in a fast-growing product team. The ideal candidate is a self-starter and operates with a high sense of ownership.
What you will do:
- Author an extensible, manageable REST API automation suite.
- Work closely with the development team to ship high-quality, predictable releases.
- Develop a detailed understanding of the overall product architecture and engage at design time to assess automation and test environment impact.
- Build Test plans, create prioritized regression suites.
Experience Range:
You have 2 - 4 years of proven automation and quality assurance experience with any product.
Technical Expertise:
- Strong hands on programming experience in either Java or Python.
- Deep understanding of REST APIs; have built test suites validating product functionality using API automation frameworks like Tavern, Postman, etc. (see the sketch after this list).
- Good to have: an understanding of AWS (ALBs, VPCs, IAM roles) and mobile app automation.
- Understanding of Docker is nice to have.
- Good understanding of Object oriented concepts.
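To illustrate the kind of REST API test suite described above (framework choice aside), here is a hedged pytest-style sketch using the requests library; the base URL, endpoints, and payload are hypothetical, not FPL's actual API.

```python
# Minimal REST API test sketch (pytest + requests); endpoint and fields are made up for illustration.
import requests

BASE_URL = "https://api.example.com"  # placeholder

def test_create_and_fetch_user():
    created = requests.post(f"{BASE_URL}/v1/users", json={"name": "Test User"}, timeout=10)
    assert created.status_code == 201
    user_id = created.json()["id"]

    fetched = requests.get(f"{BASE_URL}/v1/users/{user_id}", timeout=10)
    assert fetched.status_code == 200
    assert fetched.json()["name"] == "Test User"
```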
Opportunity for Unix Developer!!
We at Datametica are looking for talented Unix engineers who would get trained and will get the opportunity to work on Google Cloud Platform, DWH and Big Data.
Experience - 2 to 7 years
Job location - Pune
Mandatory Skills:
Strong experience in Unix with Shell Scripting development.
What opportunities do we offer?
- Selected candidates will be provided training opportunities in one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka
- You would get a chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
- You will play an active role in setting up the Modern data platform based on Cloud and Big Data
- You would be a part of teams with rich experience in various aspects of distributed systems and computing.
ABOUT US:
Pingahla was founded by a group of people passionate about making the world a better place by harnessing the power of data. We are a data management firm with offices in New York and India. Our mission is to help transform the way companies operate and think about their business. We make it easier to adopt and stay ahead of the curve in the ever-changing digital landscape. One of our core beliefs is excellence in everything we do!
JOB DESCRIPTION:
Pingahla is recruiting an ETL & BI Test Manager who can build and lead a team and establish infrastructure, processes, and best practices for our Quality Assurance vertical. Candidates are expected to have at least 5 years of experience in ETL testing and in Data Management project testing. As a growing company, we are able to provide very good career opportunities and very attractive remuneration.
JOB ROLE & RESPONSIBILITIES:
• Plan and manage testing activities
• Defect management and weekly & monthly test report generation
• Work as a Test Manager to design the test strategy and approach for the DW & BI (ETL & BI) solution
• Provide leadership and direction to the team on quality standards and testing best practices
• Ensure that project deliverables are produced, including but not limited to quality assurance plans, test plans, testing priorities, status reports, user documentation, and online help; manage and motivate teams to accomplish significant deliverables within tight deadlines
• Test data management; review and approve all test cases prior to execution
• Coordinate and review offshore work efforts for projects and maintenance activities
REQUIRED SKILLSET:
• Experience in Quality Assurance management, program management, and DW (ETL & BI) management
• Minimum 5 years in ETL testing, with at least 2 years in a team lead role
• Technical abilities complemented by sound communication skills, user interaction abilities, requirement gathering and analysis, and skills in data migration and conversion strategies
• Proficient in test definition, capable of developing test plans and test cases from technical specifications
• Able to single-handedly look after complete delivery from the testing side
• Experience working with remote teams across multiple time zones
• Strong knowledge of QA processes and methodologies
• Strong UNIX and Perl scripting skills
• Expertise in ETL testing and hands-on experience with an ETL tool like Informatica or DataStage; PL/SQL is a plus
• Excellent problem-solving, analytical, and technical troubleshooting skills
• Familiarity with Data Management projects
• Eagerness to learn, adopt, and apply rapidly changing new technologies and methodologies
• Efficient and effective at approaching and escalating quality issues when appropriate
We at Datametica Solutions Private Limited are looking for an SQL Lead / Architect who has a passion for the cloud, with knowledge of different on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, and Hortonworks.
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description :
Experience: 6+ Years
Work Location: Pune / Hyderabad
Technical Skills :
- Good programming experience as an Oracle PL/SQL, MySQL, and SQL Server Developer
- Knowledge of database performance tuning techniques
- Rich experience in database development
- Experience in designing and implementing business applications using the Oracle Relational Database Management System
- Experience in developing complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL
Required Candidate Profile :
- Excellent communication, interpersonal, analytical skills and strong ability to drive teams
- Analyze data requirements and the data dictionary for moderate to complex projects
- Lead data-model-related analysis discussions while collaborating with Application Development teams, Business Analysts, and Data Analysts during joint requirements analysis sessions
- Translate business requirements into technical specifications with an emphasis on highly available and scalable global solutions
- Stakeholder management and client engagement skills
- Strong communication skills (written and verbal)
About Us!
A global leader in the Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican – Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over these years are the key factors in achieving our success.
Benefits we Provide!
Working with Highly Technical and Passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
Responsibilities for Data Engineer
- Create and maintain optimal data pipeline architecture,
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
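As a hedged illustration of the Spark experience listed above, here is a small PySpark extract-transform-load sketch; the bucket paths and column names are illustrative, not a real pipeline.

```python
# Illustrative PySpark ETL step; paths and schema are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

orders = spark.read.option("header", True).csv("s3a://example-bucket/raw/orders.csv")

daily_revenue = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

daily_revenue.write.mode("overwrite").parquet("s3a://example-bucket/curated/daily_revenue")
```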
Develop complex queries, pipelines and software programs to solve analytics and data mining problems
Interact with other data scientists, product managers, and engineers to understand business problems, technical requirements to deliver predictive and smart data solutions
Prototype new applications or data systems
Lead data investigations to troubleshoot data issues that arise along the data pipelines
Collaborate with different product owners to incorporate data science solutions
Maintain and improve data science platform
Must Have
BS/MS/PhD in Computer Science, Electrical Engineering or related disciplines
Strong fundamentals: data structures, algorithms, database
5+ years of software industry experience with 2+ years in analytics, data mining, and/or data warehouse
Fluency with Python
Experience developing web services using REST approaches.
Proficiency with SQL/Unix/Shell
Experience in DevOps (CI/CD, Docker, Kubernetes)
Self-driven, challenge-loving, detail oriented, teamwork spirit, excellent communication skills, ability to multi-task and manage expectations
Preferred
Industry experience with big data processing technologies such as Spark and Kafka
Experience with machine learning algorithms and/or R a plus
Experience in Java/Scala a plus
Experience with any MPP analytics engines like Vertica
Experience with data integration tools like Pentaho/SAP Analytics Cloud
- Expertise in test planning and test strategy design for various types of testing: integration, functional, system, and regression testing.
- Ability to quickly learn the functional aspects of the project and be the quality gatekeeper for the code release.
- Experience in testing the developed UI pages against the UX design to ensure the experience is realised.
- Experience in creating knowledge artifacts and authoring guides for end users.
- Experience in creating test scenarios, test cases, and traceability.
- Hands-on experience with JIRA, WIKI, or Zephyr applications.
- Understanding and working knowledge of Scrum methodology.
- Experience in web, mobile, cross-browser, and device testing.
- Experience in automation testing with Java and Selenium.
- Experience in frameworks like TestNG and Cucumber.
We at Datametica Solutions Private Limited are looking for SQL Engineers who have a passion for the cloud, with knowledge of different on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, and Hortonworks.
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description
Experience : 4-10 years
Location : Pune
Mandatory Skills -
- Strong in ETL/SQL development
- Strong Data Warehousing skills
- Hands-on experience working with Unix/Linux
- Development experience in Enterprise Data warehouse projects
- Good to have experience working with Python, shell scripting
Opportunities -
- Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps Tools, Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
- Would get a chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
- Will play an active role in setting up the Modern data platform based on Cloud and Big Data
- Would be part of teams with rich experience in various aspects of distributed systems and computing
About Us!
A global Leader in the Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over these years are the key factors in achieving our success.
Benefits we Provide!
Working with Highly Technical and Passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
Experience - 2 to 6 Years
Work Location - Pune
Datametica is looking for talented SQL engineers who would get training & the opportunity to work on Cloud and Big Data Analytics.
Mandatory Skills:
- Strong in SQL development
- Hands-on experience with at least one scripting language, preferably shell scripting
- Development experience in Data warehouse projects
Opportunities:
- Selected candidates will be provided learning opportunities on one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka
- Would get a chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
- Will play an active role in setting up the Modern data platform based on Cloud and Big Data
- Would be part of teams with rich experience in various aspects of distributed systems and computing
Position: QE Automation Engineer / SDET
Job Location: Pune (work from home during the pandemic)
Salary: As per company standards
Experience: 8+ years in software engineering/testing, with 2+ years of hands-on experience using Selenium and Cucumber.
Responsibility:
Skills needed for Automation SDETs are :
Excellent communication skills
Must have knowledge of –
Core Java
Selenium with standard Maven, TestNG/JUnit
Cucumber / BDD (Karate will also do)
Rest Assured (including Postman)
GitHub / Sourcetree / GitLab
Good to have skills are –
Jenkins
Shell / Groovy script
Any Cloud experience
Skills we can train on as per need –
GCP Basics
DB (brushup)
Skills needed for Site Reliability Engineers (SREs) are -
Must have –
Java / Python scripting
Shell / Groovy
Jenkins / Bamboo and related devops areas
Cloud experience
Good to have skills are –
GCP Intermediate skills
DB skills
Early-joining candidates (able to start within 2 or 3 weeks) are preferred.
We have an urgent requirement for an Automation Tester with at least 2 years of experience in automation testing.
Immediate joiners will be preferred, or a maximum notice period of 15 days.
Job Requirements:
- Maintaining a suite of automated tests for Web and Mobile applications
- Maintaining automation framework and implementing improvements
- Conversion of the manual test suite to automation
- Create and maintain test cases.
- Maintain and update regression testing scripts to close test escapes.
- Perform exploratory testing, integration testing, regression testing, automated testing.
- Execution and triage of automated test executions.
- Defect management in Jira.
- Maintenance on Jenkins integration.
- Exposure to Test and Defect Management Tools like JIRA, Test rail, Zephyr etc.
Skills required
- Writing reliable and maintainable automated tests.
- Demonstrable experience in Automation testing using Selenium or similar
- Experience with the Serenity BDD test framework (Cucumber, Selenium, Rest Assured)
- Programming skills in Java, JavaScript, Python, or similar
- Unit testing frameworks such as TestNG, JUnit, and/or NUnit
- Testing APIs with SoapUI, Postman, or similar
- Performance testing with JMeter, NeoLoad, or similar
- Familiarity with Swagger or Open API Specifications.
- Familiarity with JSON objects, SQL queries, data structures, log file analysis
- Strong knowledge of Scripting Languages like Java, JavaScript, Python, Ruby, Groovy etc.
at Syscort Technologies
Job Description
We are looking for a Senior Test Engineer who will work closely with the project team members and ensure the quality of the delivery.
Roles and Responsibilities
- Develop automated tests for product validation (regression and integration).
- Develop and execute test cases, scripts, plans, and procedures (manual and automated).
- Integrate and execute automated tests in CI/CD environments
- Troubleshoot, and improve existing automation scripts
- Act as a lead team member as the team grows, providing support and training to other team members
- Execute test cases against various software applications and document the results
- Perform ad hoc testing of new and existing features to ensure they meet design requirements, quality, and usability standards
- Follow-up with and document resolution of bugs
- Maintain test cases for manual and automated tests
- Communicate defects to stakeholders in a clear and concise manner
Technical Skills
- Minimum 3-5 years of experience in a diverse technology environment in a Quality Assurance role
- Experience developing testing strategies and test plans
- Experience creating and executing automated end-to-end tests, using tools such as Selenium, Cucumber, or other automation frameworks
- Exposure to ‘Leapwork’ is good to have; expertise in Java and OOP concepts
- Good understanding of software development lifecycles
- Ability to work in multiple application environments
- Strong data analysis, data verification, and problem-solving abilities
You should have
- Fluency in written and verbal communication in English
- Strong ability to work collaboratively in a team environment
- Excellent organizational skills, including management of multiple assignments, setting personal goals and targets, and document management
- Excellent focus on detail through adherence to quality assurance best practices
- 2-4 years of experience, with at least 1-2 years of experience in testing Web Applications/Services, UI testing.
- Good exposure to non-functional test strategies like load, performance, and chaos testing.
- Good understanding of testing principles: UI and Usability testing, Stress testing, Functional/Regression testing, Code coverage, TDD/BDD, UAT.
- Experience in deploying solutions into AWS is a major plus.
- Good team player, having effective and professional oral and written communication skills.
Monitor, analyze and fix issues for consumer and business VoIP services and endpoints.
Roles and Responsibilities:
- As a senior engineer, take complete ownership of supporting customer VoIP deployments by debugging issues and providing fixes.
- Document existing call flows
- Build customer confidence to grow/lead local team
- Support customer during PST timezone during critical releases
Required Skills:
- 5+ yrs of supporting VoIP services/applications on Linux platform
- Strong scripting skills with bash/python/perl
- Expertise in SIP call flow analysis and debugging using Wireshark
- Experience in debugging Kamailio and Freeswitch/asterisk based applications is a must
- Good problem solving/analytical skills
- Excellent written and verbal communication
Preferred Skills:
- Experience working with open source projects
- Exposure to Level 3-Carrier Integration
- Knowledge of networking protocols
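To illustrate the scripting-around-Wireshark skill set listed above, here is a hedged Python sketch that shells out to tshark to list the Call-IDs of SIP INVITEs in a capture; it assumes tshark is installed, and the capture filename is a placeholder.

```python
# Illustrative sketch: list SIP INVITE Call-IDs from a pcap via tshark (assumes tshark is on PATH).
import subprocess

result = subprocess.run(
    [
        "tshark", "-r", "calls.pcap",           # placeholder capture file
        "-Y", 'sip.Method == "INVITE"',          # Wireshark display filter for INVITEs
        "-T", "fields", "-e", "sip.Call-ID",
    ],
    capture_output=True,
    text=True,
    check=True,
)

for call_id in sorted(set(result.stdout.split())):
    print(call_id)
```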
Yugabyte
You Will:
- Design, develop and maintain automation framework, system and functional test suites and contribute to Database platform development.
- Test the product for performance, resiliency, security, scalability, and reliability.
- Fix defects identified via testing in Database platform.
- Understand the end-to-end configuration, technical dependencies, code paths, and overall behavioural characteristics of the products you test.
- Analyze and understand existing test coverage and test cases, identifying opportunities for redesign, replacement, reusability, and improvement in efficiency and performance.
- Define and inspire changes to our product with our development engineering team based on feedback from tests and customer issues.
- Develop and contribute to internal and external knowledge bases.
- Be a champion for our customers: go above and beyond to ensure customers are getting the most out of their investment in the Yugabyte platform.
You'll Need:
- Strong programming skills (C++, Java, Python and UI automation tools) and experience in developing automation frameworks and testing tools.
- Working knowledge of SQL and/or CQL.
- Professional experience in Databases and/or Distributed Systems.
- Customer Obsession – you are passionate about delivering a high-quality product.
- You judge your own success by the success of the team and the happiness of our customers.
- Excellent written and verbal communication skills – you’re able to work with a wide variety of people, collaborate with geographically distributed teams, and effectively communicate everything from data points to critical feedback.
- Entrepreneurial spirit and are not afraid to take on new challenges.
- 5+ years of relevant work experience with BS in CS or equivalent technical degree.
Nice to have
- Experience working in a continuous integration / continuous delivery development environment.
- Have expertise with automation and build tools such as Selenium, JMeter, and Jenkins.
- Thrive on working on open source technologies.
WFH Opportunity
We are looking for a Python Tester / Python QA Automation Engineer.
Experience: 6 to 12 years
CTC: As per industry norms
Required Key Skills:
- Python + Selenium, automation testing, manual testing, shell scripting, API testing (REST Assured / REST APIs)
- Robot Framework
client of People first consultant pvt ltd
- Candidate with 3+ years of experience
- REST API endpoint testing experience using the REST-Assured framework
- Experience in a BDD (Behavior Driven Development) framework using Cucumber
- Preferred: experience with Java, Selenium, and Postman
- Solid experience in manual testing
- Familiarity with test tracking and reporting systems
- Experience with Agile development practices
- Excellent written and verbal communication skills
- Ability to effectively articulate technical challenges and solutions
- Skilled in interfacing with internal and external technical resources
- Involve in Platform Sprint activities.
- Write scripts to test various services, REST APIs developed
- Manual and automation testing of APIs and services, application
1. Understand client business requirements and interpret them into technical solutions
2. Build and maintain database stored procedures
3. Build and maintain ETL workflows
4. Perform quality assurance and testing at the unit level
5. Write and maintain user and technical documentation
6. Integrate Merkle database solutions with web services and cloud-based platforms
Must have: SQL Server stored procedures
Good/Nice to have: UNIX shell scripting, Talend/Tidal/Databricks/Informatica, Java/Python
Experience: 2 to 10 years of experience
at Olacabs.com
Role: SDET III
Location: Pune
Department: Engineering
About Us:
Ola is India’s largest mobility platform and one of the world’s largest ride-hailing companies, serving 250+ cities across India, Australia, New Zealand, and the UK. The Ola app offers mobility solutions by connecting customers to drivers and a wide range of vehicles across bikes, auto-rickshaws, metered taxis, and cabs, enabling convenience and transparency for hundreds of millions of consumers and over 1.5 million driver-partners.
Ola’s core mobility offering in India is supplemented by its electric-vehicle arm, Ola Electric; India’s largest fleet management business, Ola Fleet Technologies and Ola Skilling, that aims to enable millions of livelihood opportunities for India's youth. With its acquisition of Ridlr, India’s leading public transportation app and investment in Vogo, a dockless scooter sharing solution, Ola is looking to build mobility for the next billion Indians. Ola also extends its consumer offerings like micro-insurance and credit led payments through Ola Financial Services and a range of owned food brands through India’s largest network of kitchens under its Food business.
Ola was founded in 2011 by Bhavish Aggarwal and Ankit Bhati with a mission to build mobility for a billion people. For more details, visit www.olacabs.com/media.
Roles and Responsibilities
- Review requirements, specifications and technical design documents to provide timely and meaningful feedback.
- Create detailed, comprehensive and well-structured functional, system, and regression test plans and test cases.
- You will understand the requirements and write automation tests for integration, load, and performance.
- Estimate, prioritize, plan and coordinate testing activities.
- Write and Implement Tests using Selenium and Java.
- Write and Implement Load and performance tests.
- Design, develop and troubleshoot automated test scripts to validate the technical and functional integrity of web and mobile-based application components, backend API, and reports.
- Collect, analyze and interpret test metrics.
- Summarize test data and report findings.
- Liaise with internal teams (e.g. developers and product managers) to identify system requirements.
- Monitor debugging process results.
Critical Functional Skills
- Exposure to best practices in SQA and software development, including code reviews, debugging, troubleshooting and CI processes
- Strong knowledge of Rest-Assured/API testing, Selenium and Appium/Robotium.
- Hands-on experience in performance tools.
- Strong in Core Java fundamentals and Object-Oriented Programming concepts
- Proven work experience in software quality assurance.
- Strong knowledge of software QA methodologies, tools and processes.
- Experience in writing clear, concise and comprehensive test plans and test cases.
- Hands-on experience with both white box and black-box testing.
- Hands-on experience with functional and non-functional testing.
- Experience working in an Agile/Scrum development process.
- Excellent communication skills with the ability to present complex technical information in a clear and concise manner to a variety of audiences, both technical and non-technical
Experience Required
- 6+ years of experience in QA
- QA experience across multiple projects.
Minimum Qualifications Required
BS/MS degree in Computer Science, Engineering or a related subject.
A USA-based product engineering company in the medical industry
Total Experience: 6 – 12 Years
Required Skills and Experience
- 3+ years of relevant experience with DevOps tools such as Jenkins, Ansible, Chef, etc.
- 3+ years of experience in continuous integration/deployment and software tools development with Python, shell scripts, etc.
- Building and running Docker images and deployment on Amazon ECS
- Working with AWS services (EC2, S3, ELB, VPC, RDS, Cloudwatch, ECS, ECR, EKS)
- Knowledge and experience working with container technologies such as Docker and Amazon ECS, EKS, Kubernetes
- Experience with source code and configuration management tools such as Git, Bitbucket, and Maven
- Ability to work with and support Linux environments (Ubuntu, Amazon Linux, CentOS)
- Knowledge and experience in cloud orchestration tools such as AWS Cloudformation/Terraform etc
- Experience with implementing "infrastructure as code", “pipeline as code” and "security as code" to enable continuous integration and delivery
- Understanding of IAM, RBAC, NACLs, and KMS
- Good communication skills
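As a hedged illustration of the Docker image build-and-run experience listed above (the ECS deployment itself is out of scope here), this sketch uses the third-party docker SDK for Python; the image tag and build path are placeholders.

```python
# Illustrative build-and-run sketch with the docker SDK for Python; assumes a local Docker daemon.
import docker

client = docker.from_env()

# Build an image from a Dockerfile in the current directory (placeholder tag).
image, build_logs = client.images.build(path=".", tag="example-app:latest")

# Run it detached and print the container id.
container = client.containers.run("example-app:latest", detach=True)
print(container.id)
```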
Good to have:
- Strong understanding of security concepts and methodologies, and the ability to apply them: SSH, public key encryption, access credentials, certificates, etc.
- Knowledge of database administration such as MongoDB.
- Knowledge of maintaining and using tools such as Jira, Bitbucket, Confluence.
Responsibilities
- Work with Leads and Architects in designing and implementation of technical infrastructure, platform, and tools to support modern best practices and facilitate the efficiency of our development teams through automation, CI/CD pipelines, and ease of access and performance.
- Establish and promote DevOps thinking, guidelines, best practices, and standards.
- Contribute to architectural discussions, Agile software development process improvement, and DevOps best practices.
We are a self-organized engineering team with a passion for programming and solving business problems for our customers. We are looking to expand our team's capabilities on the DevOps front and are on the lookout for four DevOps professionals with 4-8 years of relevant hands-on technical experience.
We encourage our team to continuously learn new technologies and apply the learnings in the day to day work even if the new technologies are not adopted. We strive to continuously improve our DevOps practices and expertise to form a solid backbone for the product, customer relationships and sales teams which enables them to add new customers every week to our financing network.
As a DevOps Engineer, you :
- Will work collaboratively with the engineering and customer support teams to deploy and operate our systems.
- Build and maintain tools for deployment, monitoring and operations.
- Help automate and streamline our operations and processes.
- Troubleshoot and resolve issues in our test and production environments.
- Take control of various mandates and change management processes to ensure compliance for various certifications (PCI and ISO 27001 in particular)
- Monitor and optimize the usage of various cloud services.
- Setup and enforce CI/CD processes and practices
Skills required :
- Strong experience with AWS services (EC2, ECS, ELB, S3, SES, to name a few)
- Strong background in Linux/Unix administration and hardening
- Experience with automation using Ansible, Terraform or equivalent
- Experience with continuous integration and continuous deployment tools (Jenkins)
- Experience with container related technologies (docker, lxc, rkt, docker swarm, kubernetes)
- Working understanding of code and script (Python, Perl, Ruby, Java)
- Working understanding of SQL and databases
- Working understanding of version control system (GIT is preferred)
- Managing IT operations, setting up best practices, and tuning them from time to time.
- Ensuring that process overheads do not reduce the productivity and effectiveness of a small team.
- Willingness to explore and learn new technologies and continuously refactor the tools and processes.
- Hands on experience in following is a must: Unix, Python and Shell Scripting.
- Hands on experience in creating infrastructure on cloud platform AWS is a must.
- Must have experience in industry standard CI/CD tools like Git/BitBucket, Jenkins, Maven, Artifactory and Chef.
- Must be good at these DevOps tools:
Version Control Tools: Git, CVS
Build Tools: Maven and Gradle
CI Tools: Jenkins
- Hands-on experience with Analytics tools, ELK stack.
- Knowledge of Java will be an advantage.
- Experience designing and implementing an effective and efficient CI/CD flow that gets code from dev to prod with high quality and minimal manual effort.
- Ability to help debug and optimise code and automate routine tasks.
- Should have excellent communication skills
- Experience in dealing with difficult situations and making decisions with a sense of urgency.
- Experience with Agile and Jira is an added advantage
You will work on:
Your primary work involves developing and maintaining tools for build, release, deployment, monitoring and operations both on cloud as well as on-premises infrastructure. You are required to work closely with Developers and Cloud Architects and own infrastructure automation, CI/CD processes and support operations.
What you will do (Responsibilities):
- Day-to-day operational support of CI/CD infrastructure relied upon by teams deploying software to the cloud or on-premise
- Write Code to develop deployment of various services to private or public cloud/on-premise environments.
- Participate in cloud projects to implement new technology solutions, Proof of concepts to improve cloud technology offerings.
- Work with developers to deploy to private or public cloud/on-premise services, debug and resolve issues.
- On call responsibilities to respond to emergency situations and scheduled maintenance.
- Contribute to and maintain documentation for systems, processes, procedures and infrastructure configuration
What you bring (Skills):
- Strong Linux System skills
- Scripting in bash, python
- Basic file handling & networking
- Comfortable in Git repositories specifically on GitHub, Gitlab, Bitbucket, Gerrit
- Comfortable in interfacing with SQL and No-SQL databases like MySQL, Postgres, MongoDB, ElasticSearch, Redis
Great if you know (Skills):
- Understanding various build and CI/CD systems – Maven, Gradle, Jenkins, Gitlab CI, Spinnaker or Cloud based build systems
- Exposure to deploying and automating on any public cloud – GCP, Azure or AWS
- Private cloud experience – VMWare or OpenStack
- Big DataOps experience – managing infrastructure and processes for Apache Airflow, Beam, Hadoop clusters
- Containerized applications – Docker-based image builds and maintenance.
- Kubernetes applications – deploy and develop operators, helm charts, and manifests, among other artifacts (see the sketch after this list).
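To illustrate the Kubernetes automation listed above, here is a hedged sketch using the official kubernetes Python client to list pods in a namespace; it assumes a local kubeconfig, and the namespace is a placeholder.

```python
# Illustrative sketch with the kubernetes Python client; assumes ~/.kube/config points at a cluster.
from kubernetes import client, config

config.load_kube_config()
v1 = client.CoreV1Api()

pods = v1.list_namespaced_pod(namespace="default")  # placeholder namespace
for pod in pods.items:
    print(pod.metadata.name, pod.status.phase)
```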
Advantage Cognologix:
- Higher degree of autonomy, startup culture & small teams
- Opportunities to become expert in emerging technologies
- Remote working options for the right maturity level
- Competitive salary & family benefits
- Performance based career advancement
About Cognologix:
Cognologix helps companies disrupt by reimagining their business models and innovate like a Startup. We are at the forefront of digital disruption and take a business first approach to help meet our client’s strategic goals.
We are DevOps focused organization helping our clients focus on their core product activities by handling all aspects of their infrastructure, integration and delivery.
Benefits Working With Us:
- Health & Wellbeing
- Learn & Grow
- Evangelize
- Celebrate Achievements
- Financial Wellbeing
- Medical and Accidental cover.
- Flexible Working Hours.
- Sports Club & much more.
You will work on:
You will be working on some of our clients' massive-scale infrastructure and DevOps requirements, designing for microservices and large-scale data analytics. You will be working on enterprise-scale problems but will be part of our agile team that delivers like a startup. You will have the opportunity to be part of a team that's building and managing a large private cloud.
What you will do (Responsibilities):
- Work on cloud marketplace enablements for some of our clients products
- Write Kubernetes Operators to automate custom PaaS solutions
- Participate in cloud projects to implement new technology solutions, Proof of concepts to improve cloud technology offerings.
- Work with developers to deploy to private or public cloud/on-premise services, debug and resolve issues.
- On call responsibilities to respond to emergency situations and scheduled maintenance.
- Contribute to and maintain documentation for systems, processes, procedures and infrastructure configuration
What you bring (Skills):
- Experience with administering of and debugging on Linux based systems with programming skills in Scripting, Golang, Python among others
- Expertise in Git repositories specifically on GitHub, Gitlab, Bitbucket, Gerrit
- Comfortable with DevOps for Big Data databases like Teradata, Netezza, Hadoop-based ecosystems, BigQuery, and RedShift, among others
- Comfortable in interfacing with SQL and No-SQL databases like MySQL, Postgres, MongoDB, ElasticSearch, Redis
Great if you know (Skills):
- Understanding various build and CI/CD systems – Maven, Gradle, Jenkins, Gitlab CI, Spinnaker or Cloud based build systems
- Exposure to deploying and automating on any public cloud – GCP, Azure or AWS
- Private cloud experience – VMWare or OpenStack
- Big DataOps experience – managing infrastructure and processes for Apache Airflow, Beam, Hadoop clusters
- Containerized applications – Docker-based image builds and maintenance.
- Kubernetes applications – deploy and develop operators, helm charts, manifests among other artifacts.
Advantage Cognologix:
- Higher degree of autonomy, startup culture & small teams
- Opportunities to become expert in emerging technologies
- Remote working options for the right maturity level
- Competitive salary & family benefits
- Performance based career advancement
About Cognologix:
Cognologix helps companies disrupt by reimagining their business models and innovating like a startup. We are at the forefront of digital disruption and take a business-first approach to help meet our clients' strategic goals.
We are a DevOps-focused organization, helping our clients focus on their core product activities by handling all aspects of their infrastructure, integration, and delivery.
Benefits Working With Us:
- Health & Wellbeing
- Learn & Grow
- Evangelize
- Celebrate Achievements
- Financial Wellbeing
- Medical and Accidental cover.
- Flexible Working Hours.
- Sports Club & much more.
We are looking for a System Administrator to maintain, upgrade and manage our software, hardware and networks.
Resourcefulness is a necessary skill in this role. You should be able to diagnose and resolve problems quickly. You should also have the patience to communicate with a variety of interdisciplinary teams and users.
Your goal will be to ensure that our technology infrastructure runs smoothly and efficiently.
Responsibilities
- Install and configure software and hardware
- Administer Microsoft 365 Enterprise Mobility and Security suite + Intune
- Set up accounts and workstations
- Monitor performance and maintain systems according to requirements
- Troubleshoot issues and outages
- Ensure security through access controls
- Upgrade systems with new releases and models
Must-have Skills
Technical Skills
- Functional & operational understanding of Microsoft Active Directory, Office 365 and Microsoft Intune or Microsoft Enterprise Mobility and Security
- Functional knowledge of Active Directory security groups
- Experience with configuring Microsoft 365 security policies
- Ability to troubleshoot issues related to Windows 10 operating systems
- Basic hardware troubleshooting skills
Soft skills
- Eager to learn and apply new technologies and tools to get the job done
- Strong analytical and organizational skills
- Good communication skills, to work with a variety of people and understand the underlying issues
Bonus Points
- Microsoft Certifications: MCSA/MCSE/MS-101 Enterprise Mobility and Security
- Network configuration and management
- Fortinet firewall configuration
at Niyuj Enterprise Software Solutions Pvt. Ltd.
Software Engineers in Test/Test Engineers work within agile software development teams to ensure software is designed and implemented for testability. They write automated unit, component, and system tests to ensure code quality and detect regressions early in the development cycle. They are responsible for accurate test execution documentation, and they also maintain the integration and test frameworks used by multiple development teams. They must be able to support a fast-paced agile software release process for one of Convergent's cybersecurity customers.
Job Duties
- Develop software to perform unit, component, and system testing
- Develop and maintain test execution and tracking software
- Contribute to architecture designs, providing feedback on testability
- Decompose user stories into tasks and estimate story points for user stories and tasks
- Perform manual tests as required
Experience and Skills:
Required Skills
- Experience in test design and implementation
- Experience in release management
- Experience with Python
- Experience with Docker
- Experience with Linux systems (CentOS and RHEL preferred)
- Experience with automation tools and frameworks (unittest, Jenkins, Selenium); a brief wrapper sketch follows this list
- Experience with Linux Bash scripting and administration
- Works and communicates well in a distributed/remote team environment
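As a purely illustrative sketch (the tests directory and log path are hypothetical, not part of this posting), a Bash wrapper of the sort a Jenkins job might call to run a Python unittest suite and fail the build on regressions:

    #!/usr/bin/env bash
    # Hypothetical CI wrapper; the discovery path "tests" is a placeholder.
    set -euo pipefail

    # pipefail makes the pipeline's exit status reflect the unittest result,
    # so a failing test fails this script and, in turn, the Jenkins stage.
    python3 -m unittest discover -s tests -p 'test_*.py' -v 2>&1 | tee test-run.log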
Desired Skills
- Experience with Git and Ansible
- Experience with Red Hat Package Manager (RPM) and yum repositories
- Experience testing applications and deploying them to cloud microservice architectures (AWS, OpenShift, Azure, etc.)
- Experience with Django-based full-stack web development
- Experience testing web application frontends (React preferred)
- Familiarity with KVM/ESX virtual environments
- BS in Computer Science, Engineering, or a related field (3 years of work experience in place of a degree)
If you are interested in the above position, please send your updated CV along with the following details:
Current CTC:
Expected CTC:
Notice period:
Can relocate to Baner Pune (Y / N) :
Experience with automation tools and frameworks (unittest, Jenkins, Selenium) in years:
Experience in years in Python Scripting:
Experience with Linux systems (CentOS and RHEL preferred) in years:
Experience in Docker in years:
Company Name - Focus Datatech System Pvt Ltd, Pune
Job Description -
OVM AND SERVER TASKS
- Good understanding of virtualization technologies such as VMware/Oracle VM, and of Linux administration including installation, configuration, and maintenance.
- Use ITIL best practices in the operational running of the IT system, including change/incident/problem/capacity planning.
- Manage NFS file sharing and the Linux automounter.
- Perform installation and configuration of operating systems, applications, and storage provisioning.
- Drive process improvements and make recommendations for continuous quality improvement.
- Deliver changes to the UNIX platform through the change control process, communicating with and seeking approvals from business owners.
- Work closely with internal development teams, database teams, and implementation teams to deliver new solutions.
- Manage 24x7 production support.
- Troubleshoot problems related to Apache, mail, DNS, FTP, MySQL, etc.
- Provide technical support and configuration for the development team according to requirements.
- Create and execute shell (Bash) scripts to automate work and to interact with databases (a brief sketch follows this list).
- Resolve login problems, and handle troubleshooting, installation, and more.
- Configure and install NIS/Samba/NFS with Active Directory integration, and provide support and maintenance.
- Solve complex Cloud Infrastructure problems.
- Drive DevOps culture in the organization by working with engineering and product teams.
- Be a trusted technical advisor to developers and help them architect scalable, robust, and highly available systems.
- Frequently collaborate with developers to help them learn how to run and maintain systems in production.
- Drive a culture of CI/CD. Find bottlenecks in the software delivery pipeline. Fix bottlenecks with developers to help them deliver working software faster. Develop and maintain infrastructure solutions for automation, alerting, monitoring, and agility.
- Evaluate cutting edge technologies and build PoCs, feasibility reports, and implementation strategies.
- Work with engineering teams to identify and remove infrastructure bottlenecks enabling them to move fast. (In simple words you'll be a bridge between tech, operations & product)
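To illustrate the shell-scripting and automation duties above (database name, credentials file, and backup path are placeholders, not details from this posting), a minimal Bash sketch of a nightly MySQL backup that could be scheduled from cron:

    #!/usr/bin/env bash
    # Hypothetical nightly backup; all names below are placeholders.
    set -euo pipefail

    BACKUP_DIR=/var/backups/mysql
    STAMP=$(date +%F)

    mkdir -p "$BACKUP_DIR"
    mysqldump --defaults-extra-file=/etc/mysql/backup.cnf appdb \
      | gzip > "$BACKUP_DIR/appdb-$STAMP.sql.gz"

    # Keep two weeks of history
    find "$BACKUP_DIR" -name 'appdb-*.sql.gz' -mtime +14 -delete

    logger -t mysql-backup "Completed dump appdb-$STAMP.sql.gz"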
Skills required:
Must have:
- Deep understanding of open source DevOps tools.
- Scripting experience in one or more among Python, Shell, Go, etc.
- Strong experience with AWS (EC2, S3, VPC, Security, Lambda, CloudFormation, SQS, etc.); a small CLI sketch follows this list.
- Knowledge of distributed system deployment.
- Deployed and orchestrated applications with Kubernetes.
- Implemented CI/CD for multiple applications.
- Set up monitoring and alert systems for services using the ELK stack or similar.
- Knowledge of Ansible, Jenkins, Nginx.
- Worked with Queue based systems.
- Implemented batch jobs and automated recurring tasks.
- Implemented caching infrastructure and policies.
- Implemented central logging.
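Purely as an illustration of the AWS automation mentioned in the list above (region, bucket, and report path are hypothetical), a small shell sketch that reports running EC2 instances and archives the report to S3:

    #!/usr/bin/env bash
    # Hypothetical EC2 inventory report; region and bucket are placeholders.
    set -euo pipefail

    REGION=ap-south-1
    REPORT=/tmp/ec2-running-$(date +%F).txt

    aws ec2 describe-instances \
      --region "$REGION" \
      --filters Name=instance-state-name,Values=running \
      --query 'Reservations[].Instances[].[InstanceId,InstanceType,PrivateIpAddress]' \
      --output table > "$REPORT"

    aws s3 cp "$REPORT" "s3://example-ops-reports/ec2/$(basename "$REPORT")"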
Good to have:
- Experience dealing with PI information security.
- Experience conducting internal Audits and assisting External Audits.
- Experience implementing solutions on-premise.
- Experience with blockchain.
- Experience with Private Cloud setup.
Required Experience:
- B.Tech. / B.E. degree in Computer Science or equivalent software engineering degree/experience.
- You need to have 2-4 years of DevOps & Automation experience.
- Need to have a deep understanding of AWS.
- Need to be an expert with Git or similar version control systems.
- Deep understanding of at least one open-source distributed system (Kafka, Redis, etc.)
- Ownership attitude is a must.
We offer a suite of memberships and subscriptions to spice up your lifestyle. We believe in practicing an ultimate work-life balance and satisfaction. Working hard doesn't mean clocking in extra hours; it means having the zeal to contribute the best of your talents. Our people culture helps us put in place measures and benefits that help you feel confident and happy each and every day. Whether you'd like to skill up, go off the grid, attend your favourite events, or be an epitome of fitness, we have you covered.
- Health Memberships
- Sports Subscriptions
- Entertainment Subscriptions
- Key Conferences and Event Passes
- Learning Stipend
- Team Lunches and Parties
- Travel Reimbursements
- ESOPs
That's what we think will brighten up your personal life, as a gesture of thanks for helping us with your talents.
Join us to be a part of our exciting journey to build one Digital Identity Platform!