Noida, NCR (Delhi | Gurgaon | Noida)
3 - 5 years
₹7L – ₹20L per year

Skills

DevOps
CI/CD
Amazon Web Services (AWS)
Amazon EC2
Python
Shell Scripting
Kubernetes
Docker
EKS

Job description

About Extramarks Education India Pvt Ltd


Extramarks is leading the Education Technology sector in India by providing 360° education support to learners through new-age digital education solutions. These solutions are used by schools in the classroom for imparting education, and by students at home to make learning easy and effective. Keeping pace with globalization and technology in education, Extramarks empowers young learners to step in with the latest technology and have anytime-anywhere access to quality learning. In a very short period, Extramarks has become extremely popular among schools and students. More than 8,000 schools use Extramarks' digital learning and technology solutions across India, Singapore, Kuwait, the UAE, and South Africa.

 

The Extramarks Learning App allows students to learn at home at their own pace and space, and provides complete educational support, eliminating the need for a tutor. The three-pronged pedagogical approach of Learn, Practice, and Test ensures better learning outcomes for students. All concepts are first explained in an easy-to-learn manner with the help of rich media; then students are allowed to practice the concept. Virtual practice modules and Q&A allow the retention of knowledge, which is tested on a robust teaching platform to identify learning gaps.

 

Extramarks is currently rapidly scaling up its entire technology workforce by hiring experienced professionals with unique skills that stretch from strategy to execution. Extramarks is driving student learning through data-driven techniques, providing personalized learning recommendations to students. Apart from such customer-facing innovations, we are also implementing internal operational-excellence solutions through similar data-driven approaches. It is a once-in-a-lifetime opportunity to join Extramarks and help change the face of education in India.

Founded: 2007
Type: Product
Size: 250+ employees


Similar jobs

DevOps Engineer (Automation)

Location: Mumbai
Experience: 3 - 8 years
Salary: ₹25L – ₹30L per year

ABOUT US
Established in 2009, Ashnik is a leading open-source solutions and consulting company in South East Asia and India, headquartered in Singapore. We enable digital transformation for large enterprises through our design, architecting, and solution skills. Over 100 large enterprises in the region have acknowledged our expertise in delivering solutions using key open-source technologies. Our offerings form a critical part of digital transformation, big data platforms, cloud and web acceleration, and IT modernization. We represent EDB, Pentaho, Docker, Couchbase, MongoDB, Elastic, NGINX, Sysdig, Redis Labs, Confluent, and HashiCorp as their key partners in the region. Our team members bring decades of experience in delivering confidence to enterprises in adopting open source software and are known for their thought leadership.

LOCATION: Mumbai

THE POSITION
Ashnik is looking for a talented and passionate technical consultant to be part of the training team and work with customers on DevOps solutions. You will be responsible for implementation and consulting work for customers across SEA and India. We are looking for candidates with personal qualities like:
- Passion for working with different customers and different environments.
- Excellent communication and articulation skills.
- Aptitude for learning new technology and willingness to understand technologies he/she is not directly working on.
- Willingness to travel within and outside the country.
- Ability to work independently at the customer site and navigate through different teams.

RESPONSIBILITIES
First 2 months:
- Get an in-depth understanding of containers, Kubernetes, CI/CD, and IaC.
- Get hands-on experience with various technologies: Mirantis Kubernetes Engine, Terraform, Vault, Sysdig.
After 2 months, the ideal candidate will ensure the following outcomes from every client deployment:
- Utilize various open-source technologies and tools to orchestrate solutions.
- Write scripts and automation using Perl/Python/Groovy/Java/Bash.
- Build independent tools and solutions to effectively scale the infrastructure.
- Be involved in automation using CI/CD/DevOps concepts.
- Document procedures for building and deploying.
- Work on a cloud-based infrastructure spanning Amazon Web Services and Microsoft Azure.
- Work with the pre-sales and sales teams to help customers during their evaluation of Terraform, Vault, and other open-source technologies.
- Conduct workshops for customers as needed for technical hand-holding and technical handover.

SKILLS AND EXPERIENCE
- Graduate/Post Graduate in any technology.
- Hands-on experience with Terraform, AWS CloudFormation, Ansible, Jenkins, Docker, Git, Jira, etc.
- Hands-on experience in at least one scripting language such as Perl/Python/Groovy/Bash.
- Knowledge of Java/JVM-based languages.
- Experience in Jenkins maintenance and scalability; designing and implementing advanced automation pipelines with Jenkins.
- Experience with a repository manager like JFrog Artifactory.
- Strong background in Git, GitHub/Bitbucket, and code branching/merging strategies.
- Ability to understand and make trade-offs among different DevOps tools.

ADDITIONAL SKILLS
- Experience with Kubernetes, AWS, Google Cloud and/or Azure is a strong plus.
- Some experience with secrets/key management, preferably with HashiCorp Vault.
- Experience using monitoring solutions such as Datadog, Prometheus, ELK Stack, New Relic, Nagios, etc.

The role will report to the Mumbai office post-COVID. Package: ₹25–30 lakhs.
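The automation responsibilities above lean heavily on small scripts. As an illustrative sketch (not from the posting), here is a minimal Python retry helper of the sort such deployment scripts tend to start with; the flaky operation is invented for demonstration:

```python
import time

def retry(fn, attempts=3, delay=0.01, exceptions=(Exception,)):
    """Call fn until it succeeds or attempts are exhausted,
    sleeping `delay` seconds between tries (fixed backoff)."""
    for i in range(attempts):
        try:
            return fn()
        except exceptions:
            if i == attempts - 1:
                raise
            time.sleep(delay)

# Hypothetical flaky operation: fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(retry(flaky))  # prints "ok" after two retries
```

In real automation the same wrapper would guard calls to cloud APIs or remote shells, usually with exponential rather than fixed backoff.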

Job posted by Sandeepa Kasala

Head Kubernetes

Location: Mumbai
Experience: 5 - 20 years
Salary: ₹20L – ₹30L per year

Docker/Kubernetes Engineer, Remote (Pan India)

ABOUT US
Established in 2009, Ashnik is a leading open-source solutions and consulting company in South East Asia and India, headquartered in Singapore (see the full company profile in the first Ashnik listing above).

THE POSITION
Ashnik is looking for a talented and passionate technical consultant to be part of the training team and work with customers on DevOps solutions. You will be responsible for implementation and consulting work for customers across SEA and India. We are looking for candidates with personal qualities like:
- Passion for working with different customers and different environments.
- Excellent communication and articulation skills.
- Aptitude for learning new technology and willingness to understand technologies he/she is not directly working on.
- Willingness to travel within and outside the country.
- Ability to work independently at the customer site and navigate through different teams.

SKILLS AND EXPERIENCE
Essential skills:
- 3+ years of experience with a B.E/B.Tech, MCA, or graduation degree with higher education in a technical field.
- Prior experience with Docker containers, Swarm and/or Kubernetes (must have).
- Understanding of operating systems, processes, networking, and containers.
- Experience/exposure in writing and managing Dockerfiles or creating container images.
- Experience working on the Linux platform; ability to perform installations and system/software configuration on Linux.
- Awareness of networking and SSL/TLS basics.
- Awareness of tools used in complex enterprise IT infrastructure, e.g. LDAP, AD, centralized logging solutions, etc.
A preferred candidate will have:
- Prior experience with open-source Kubernetes or another container management platform like OpenShift, EKS, ECS, etc.
- CNCF Certified Kubernetes Administrator/Developer.
- Experience in container monitoring using Prometheus, Datadog, etc.
- Experience with CI/CD tools and their usage.
- Knowledge of a scripting language, e.g. shell scripting.

RESPONSIBILITIES
First 2 months:
- Get an in-depth understanding of containers, Docker Engine and runtime, Swarm, and Kubernetes.
- Get hands-on experience with various features of Mirantis Kubernetes Engine and Mirantis Secure Registry.
After 2 months:
- Work with Ashnik's pre-sales and solution architects to deploy Mirantis Kubernetes Engine for customers.
- Work with customer teams and guide them on how the Kubernetes platform can be integrated with other tools in their environment (CI/CD, identity management, storage, etc).
- Work with customers to containerize their applications; write Dockerfiles for customers during the implementation phase.
- Help customers design their network and security policy for effective management of applications deployed using Swarm and/or Kubernetes.
- Help customers design their deployments with either Swarm services or Kubernetes Deployments.
- Work with the pre-sales and sales teams to help customers during their evaluation of the Mirantis Kubernetes platform.
- Conduct workshops for customers as needed for technical hand-holding and technical handover.

Package: up to ₹30L. Office: Mumbai.
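Since the role centers on writing Kubernetes Deployments for customers, here is a hedged sketch of what such a manifest contains, built as a plain Python dict (the app name, image, and replica count are illustrative, not from the posting; in practice you would serialize this to YAML):

```python
def deployment_manifest(name, image, replicas=2, port=80):
    """Build a minimal Kubernetes Deployment manifest as a dict.
    The selector labels must match the pod template labels, or the
    API server rejects the Deployment."""
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {"containers": [
                    {"name": name, "image": image,
                     "ports": [{"containerPort": port}]}
                ]},
            },
        },
    }

m = deployment_manifest("web", "nginx:1.25", replicas=3)
print(m["spec"]["replicas"])  # 3
```

The selector/template label coupling shown in the comment is the most common beginner mistake this kind of helper prevents.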

Job posted by Sandeepa Kasala

DevOps Engineer - SDE 3

Founded: 2016
Location: Bengaluru (Bangalore)
Experience: 6 - 9 years
Salary: ₹20L – ₹30L per year

Verloop is looking for an enthusiastic and self-driven individual who is passionate about working at a fast-growing start-up and can handle chaotic situations at times. We are in the business of building high-impact products to make customers' lives easier. This is a great opportunity to join a team of superstars where your passion and desire to succeed will be rewarded with significant career transformation. The ideal candidate will have a zeal to learn and an unending thirst to look for the next big challenge. If you enjoy finding solutions to complex customer engagement problems, can coordinate between our customers and the internal product team, and are willing to challenge the current status quo of chat, we are looking for you. As part of the DevOps team, you will help ensure that our systems are always running fine for our customers and end-users.

SKILLS & EXPERIENCE
- 6-9 years of experience in DevOps.
- Experience with at least one of the GCP and Azure cloud platforms.
- Experience managing a large-scale microservices-based system on the Kubernetes platform.
- Experience working with messaging infrastructure: Kafka, SQS/SNS, etc.
- Strong UNIX, OS, and network skills; experience with GitHub, deployment pipelines, caching, databases, and cloud CDNs.
- Experience with alerting, monitoring, and logging systems like Grafana, Prometheus, and ELK.
- Strong debugging skills to help engineering resolve issues.
- Familiarity with JavaScript, Golang, and Python to assist engineering with deployments and debugging.

JOB RESPONSIBILITIES
- Work towards automating systems and services as they are built and deployed.
- Constantly monitor systems for health and take corrective action as required.
- Help with scaling and growing systems as demand requires.
- Manage logging, alerting, and monitoring infrastructure across multiple clouds.
- Work with engineering to help identify and resolve problems in systems.
- Ramp up and scale healthy infrastructure and security practices in the organization.
- Be available on-call during emergencies to handle and resolve problems quickly and efficiently.
- Participate in peer reviews of solution designs.
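Alerting of the kind described above usually reduces to a threshold over a sliding window of recent requests. A toy Python version of that idea (real systems would express this as a Prometheus alert rule; the window size and threshold here are invented):

```python
from collections import deque

class ErrorRateAlert:
    """Fire when the error rate over the last `window` requests
    exceeds `threshold` (illustrative sliding-window alert)."""
    def __init__(self, window=100, threshold=0.05):
        self.events = deque(maxlen=window)  # 1 = failure, 0 = success
        self.threshold = threshold

    def record(self, ok: bool) -> bool:
        """Record one request outcome; return True if alerting."""
        self.events.append(0 if ok else 1)
        rate = sum(self.events) / len(self.events)
        return rate > self.threshold

alert = ErrorRateAlert(window=10, threshold=0.2)
fired = [alert.record(ok) for ok in [True] * 7 + [False] * 3]
print(fired[-1])  # True: 3 failures out of 10 exceeds 20%
```

A deque with `maxlen` keeps the window bounded without manual eviction, which is why it is the idiomatic choice here.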

Job posted by Preeti Chhetri

Cloud AWS Administrator

Founded: 1987
Location: Remote only
Experience: 3 - 8 years
Salary: ₹20L – ₹30L per year

Designation: Senior Cloud Administrator (ABL_SS_501)

Position description:
- Serve as the technical SME for cloud infrastructure.
- Provide strategic architecture leadership with mastery in infrastructure on projects and initiatives.
- Apply advanced troubleshooting techniques to provide solutions to issues pertaining to service availability, performance, and resiliency.
- Monitor and optimize performance using AWS dashboards and logs.
- Partner with engineering leaders and peers in delivering technology solutions that meet the business requirements.
- Provide input to project teams, including determining development and deployment strategies: code management, continuous integration, and other aspects of DevOps.
- Set up and performance-tune infrastructure for Kafka, Redis, and big data tools.
- Experience in cost optimization and consolidated billing management.

Primary responsibilities: Perform setup and performance tuning of AWS-hosted B2C mobile apps, for parameters like AWS, storage, and security.

Reporting designation: Head - Big Data Engineering and Cloud Development (ABL_SS_414)
Reporting department: Application Development (2487)

Required skills:
- Experience with Kafka, Redis, ZooKeeper, and big data administration.
- Experience working as an AWS cloud admin; AWS certification preferred.
- AWS networking (API Gateway, VPC, VPN, Transit Gateway, subnets, route tables, security groups).
- Good understanding of monitoring (CloudWatch, alarms, logs, custom metrics, SNS configuration).
- Experience in security administration: Identity and Access Management (IAM), Config, Key Management Service (KMS), CloudTrail.
- Experience with DevOps and automation.
- Knowledge of storage administration (S3, lifecycle management, event configuration).
- Hands-on experience working with Kubernetes and Docker preferred.
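Cost optimization under consolidated billing typically starts by attributing spend to resource tags. A self-contained sketch on sample data (this is not a real Cost Explorer API call; the resource IDs, costs, and the `team` tag are invented for illustration):

```python
from collections import defaultdict

def cost_by_tag(resources, tag="team"):
    """Aggregate monthly cost per tag value; resources missing the
    tag are grouped under 'untagged' so unattributed spend is visible."""
    totals = defaultdict(float)
    for r in resources:
        totals[r["tags"].get(tag, "untagged")] += r["monthly_cost"]
    return dict(totals)

sample = [
    {"id": "i-1", "monthly_cost": 310.0, "tags": {"team": "data"}},
    {"id": "i-2", "monthly_cost": 95.5,  "tags": {"team": "web"}},
    {"id": "i-3", "monthly_cost": 42.0,  "tags": {}},
]
print(cost_by_tag(sample))
# {'data': 310.0, 'web': 95.5, 'untagged': 42.0}
```

Surfacing the "untagged" bucket explicitly is the point: enforcement of a tagging policy usually begins with measuring how much spend escapes it.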

Job posted by Naim Punasiya

Senior DevOps Engineer

Founded: 2016
Location: Remote, Pune
Experience: 5 - 8 years
Salary: ₹10L – ₹15L per year

You will work on:
You will be working on some of our clients' massive-scale infrastructure and DevOps requirements, designing for microservices and large-scale data analytics. You will be working on enterprise-scale problems, but will be part of our agile team that delivers like a startup. You will have the opportunity to be part of a team that is building and managing a large private cloud.

What you will do (Responsibilities):
- Work on cloud marketplace enablement for some of our clients' products.
- Write Kubernetes Operators to automate custom PaaS solutions.
- Participate in cloud projects to implement new technology solutions and proofs of concept to improve cloud technology offerings.
- Work with developers to deploy to private or public cloud/on-premise services, and debug and resolve issues.
- Take on-call responsibilities to respond to emergency situations and scheduled maintenance.
- Contribute to and maintain documentation for systems, processes, procedures, and infrastructure configuration.

What you bring (Skills):
- Experience administering and debugging Linux-based systems, with programming skills in scripting, Golang, Python, among others.
- Expertise in Git repositories, specifically on GitHub, GitLab, Bitbucket, Gerrit.
- Comfortable with DevOps for big data databases like Teradata, Netezza, Hadoop-based ecosystems, BigQuery, Redshift, among others.
- Comfortable interfacing with SQL and NoSQL databases like MySQL, Postgres, MongoDB, Elasticsearch, Redis.

Great if you know (Skills):
- Understanding of various build and CI/CD systems: Maven, Gradle, Jenkins, GitLab CI, Spinnaker, or cloud-based build systems.
- Exposure to deploying and automating on any public cloud: GCP, Azure, or AWS.
- Private cloud experience: VMware or OpenStack.
- Big DataOps experience: managing infrastructure and processes for Apache Airflow, Beam, Hadoop clusters.
- Containerized applications: Docker-based image builds and maintenance.
- Kubernetes applications: deploying and developing operators, Helm charts, manifests, among other artifacts.

Advantage Cognologix:
- Higher degree of autonomy, startup culture & small teams.
- Opportunities to become an expert in emerging technologies.
- Remote working options for the right maturity level.
- Competitive salary & family benefits.
- Performance-based career advancement.

About Cognologix:
Cognologix helps companies disrupt by reimagining their business models and innovating like a startup. We are at the forefront of digital disruption and take a business-first approach to help meet our clients' strategic goals. We are a DevOps-focused organization helping our clients focus on their core product activities by handling all aspects of their infrastructure, integration, and delivery.
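The Helm charts and manifests mentioned above are, at heart, templated YAML: values are substituted into a parameterized document. A toy Python stand-in using `string.Template` (the ConfigMap name and fields are illustrative, not from any real chart):

```python
from string import Template

# A parameterized manifest, in the spirit of a Helm template.
MANIFEST = Template("""\
apiVersion: v1
kind: ConfigMap
metadata:
  name: $app-config
data:
  LOG_LEVEL: $log_level
""")

def render(values):
    """Fill the manifest from a values dict, the way `helm template`
    fills chart templates from values.yaml."""
    return MANIFEST.substitute(values)

out = render({"app": "analytics", "log_level": "info"})
print(out)
```

Real charts add conditionals, loops, and schema validation on top, but the substitution step shown here is the core mechanic.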

Job posted by Rupa Kadam

DevOps Engineer

Founded: 2015
via Qrata
Location: Mumbai
Experience: 1 - 4 years
Salary: ₹10L – ₹16L per year

Role
We are looking for an experienced DevOps engineer who will help our team establish DevOps practice. You will work closely with the technical lead to identify and establish DevOps practices in the company. You will also help us build scalable, efficient cloud infrastructure. You'll implement monitoring for automated system health checks. Lastly, you'll build our CI pipeline, and train and guide the team in DevOps practices.

Responsibilities:
- Deployment, automation, management, and maintenance of production systems.
- Ensuring availability, performance, security, and scalability of production systems.
- Evaluation of new technology alternatives and vendor products.
- System troubleshooting and problem resolution across various application domains and platforms.
- Providing recommendations for architecture and process improvements.
- Definition and deployment of systems for metrics, logging, and monitoring on the AWS platform.
- Manage the establishment and configuration of SaaS infrastructure in an agile way by storing infrastructure as code and employing automated configuration management tools, with a goal of being able to re-provision environments at any point in time.
- Be accountable for proper backup and disaster recovery procedures.
- Drive operational cost reductions through service optimizations and demand-based auto-scaling.
- Have on-call responsibilities.
- Perform root cause analysis for production errors.
- Use open-source technologies and tools to accomplish specific use cases encountered within the project.
- Use coding languages or scripting methodologies to solve a problem with a custom workflow.

Requirements:
- Systematic problem-solving approach, coupled with strong communication skills and a sense of ownership and drive.
- Prior experience as a software developer in a couple of high-level programming languages.
- Extensive experience in any JavaScript-based framework, since we will be deploying services to Node.js on AWS Lambda (serverless).
- Strong Linux system administration background.
- Ability to present and communicate the architecture in a visual form.
- Strong knowledge of AWS (e.g. IAM, EC2, VPC, ELB, ALB, Auto Scaling, Lambda, NAT gateway, DynamoDB).
- Experience maintaining and deploying highly available, fault-tolerant systems at scale (~1 lakh users a day).
- A drive towards automating repetitive tasks (e.g. scripting via Bash, Python, Ruby, etc).
- Expertise with Git.
- Experience implementing CI/CD (e.g. Jenkins, Travis CI).
- Strong experience with databases such as MySQL, NoSQL, Elasticsearch, Redis, and/or Mongo.
- Stellar troubleshooting skills with the ability to spot issues before they become problems.
- Current with industry trends, IT ops, and industry best practices, with the ability to identify the ones we should implement.
- Time and project management skills, with the capability to prioritize and multitask as needed.
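Deploying services behind AWS Lambda, as this role describes, means writing handlers that follow the Lambda proxy event/response shape. A Python sketch of that shape, testable locally without AWS (the route and query parameter are hypothetical; the posting's actual services run on Node.js):

```python
import json

def handler(event, context=None):
    """Minimal Lambda-proxy-style handler: read a query parameter
    from the event dict, return a statusCode/headers/body response."""
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello {name}"}),
    }

# Invoke locally with a fabricated API Gateway-style event.
resp = handler({"queryStringParameters": {"name": "ops"}})
print(resp["statusCode"], resp["body"])
```

Because the handler is a pure function of the event dict, CI can unit-test it exactly like this, with no cloud resources involved.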

Job posted by Mrunal Kokate

Senior DevOps Engineer

Founded: 2018
Location: Remote, Bengaluru (Bangalore)
Experience: 6 - 10 years
Salary: ₹15L – ₹45L per year

Roles & responsibilities:
- Manage systems on AWS infrastructure, including application servers and database servers.
- Proficiency with EC2, Redshift, RDS, Elasticsearch, MongoDB, and other AWS services.
- Proficiency with managing a distributed service architecture with multiple microservices, including maintaining dev, QA, staging, and production environments, managing zero-downtime releases, ensuring failure rollbacks with zero downtime, and scaling on demand.
- Containerization of workloads and rapid deployment.
- Driving cost optimization while balancing performance.
- Manage high availability of existing systems and proactively address system maintenance issues.
- Manage AWS infrastructure configurations along with Application Load Balancers, HTTPS configurations, and network configurations (VPCs).
- Work with the software engineering team to automate code deployment.
- Build and maintain tools for deployment, monitoring, and operations, and troubleshoot and resolve issues in our dev, test, and production environments.
- Familiarity with managing Spark-based ETL pipelines is a plus.
- Experience managing a team of DevOps engineers.

Required qualifications:
- Bachelor's or Master's degree in a quantitative field.
- Cloud computing experience, Amazon Web Services (AWS); bonus if you've worked on Azure, GCP, and on cost optimization.
- Prior experience working on a distributed microservices architecture and containerization.
- Strong background in Linux/Windows administration and scripting.
- Experience with CI/CD pipelines, Git, deployment configuration, and monitoring tools.
- Working understanding of the various components of web architecture.
- A working understanding of code and scripts (JavaScript, Angular, Python).
- Excellent communication, problem-solving, and troubleshooting skills.
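Zero-downtime releases, mentioned above, hinge on never taking too many replicas out of service at once. A small, illustrative batch planner (the 25% max-unavailable figure is an assumption, mirroring the default rolling-update budget in Kubernetes; this is a sketch, not the team's actual tooling):

```python
import math

def rollout_batches(replicas, max_unavailable_pct=25):
    """Plan a rolling release: replace at most max_unavailable_pct
    of replicas per batch, so capacity never drops below the budget."""
    batch = max(1, math.floor(replicas * max_unavailable_pct / 100))
    batches, remaining = [], replicas
    while remaining > 0:
        step = min(batch, remaining)
        batches.append(step)
        remaining -= step
    return batches

print(rollout_batches(10))  # [2, 2, 2, 2, 2]
```

Orchestrators do this accounting for you, but knowing the arithmetic helps when debugging why a rollout stalls with a too-strict disruption budget.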

Job posted by Manu Panwar

DevOps Engineer

Founded: 2016
via PeerXP
Location: Gurgaon, NCR (Delhi | Gurgaon | Noida)
Experience: 2 - 3 years
Salary: ₹3L – ₹5L per year

- 2 years of experience in DevOps.
- Hands-on knowledge and experience with version control tools such as Git and SVN.
- Experience working with Apache, Nginx, JBoss, and Tomcat servers.
- Understanding of load balancing technologies.
- Knowledge of containerization technologies such as Docker and Kubernetes.
- Strong in GitLab CI/CD.
- Passionate about resolving reliability issues and identifying strategies to mitigate them going forward.
- Basic awareness, with a strong interest to learn, work, and grow in the DevOps area.
- DevOps/operations, dev, or QA background with some awareness or practical experience working on cloud (AWS services: ELB, VPC, S3, RDS), Python/shell scripting, Jenkins, Docker, Kubernetes.
- Knowledge of infrastructure-as-code concepts using AWS, Terraform, CloudFormation.
- Should be good in Unix and aware of basic networking concepts (DNS, DHCP, VPN, NAT, TCP/IP).
- Proficiency in scripting languages, including Bash and Python.
- Help increase system performance with a focus on high availability and scalability.
- Propose, scope, design, and implement various infrastructure architectures.
- Strong knowledge of configuration management tools.
- Keep up to date on modern technologies and trends and advocate for their inclusion within products when it makes sense.
- Ability to learn and apply new technologies through self-learning.
- Experience with system backup & recovery.
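The networking basics listed above (subnets, NAT, VPC addressing) reduce to CIDR arithmetic that Python's standard library handles directly. A short sketch, with example addresses chosen arbitrarily:

```python
import ipaddress

def in_subnet(ip: str, cidr: str) -> bool:
    """Check whether an address falls inside a CIDR block,
    the day-to-day arithmetic behind VPC subnet planning."""
    return ipaddress.ip_address(ip) in ipaddress.ip_network(cidr)

print(in_subnet("10.0.1.25", "10.0.1.0/24"))  # True
print(in_subnet("10.0.2.25", "10.0.1.0/24"))  # False
# A /24 block spans 256 addresses (including network and broadcast).
print(ipaddress.ip_network("10.0.1.0/24").num_addresses)  # 256
```

The same `ipaddress` module also enumerates hosts and splits networks into subnets, which makes it handy for quick capacity checks before touching cloud consoles.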

Job posted by Sweta Sharma

Tech Lead (System Requirement and Data Management Lead)

Founded: 2015
Location: Hyderabad
Experience: 10 - 18 years
Salary: ₹15L – ₹25L per year

We are looking for a System Engineer who can manage requirements and data management in Rational DOORS and Siemens Polarion. You will be part of a global development team with resources in China, Sweden, and the US.

Responsibilities and tasks:
- Import requirement specifications into DOORS modules:
  - Create the module structure according to a written specification (e-mail, Word, etc).
  - Formats: ReqIF, Word, Excel, PDF, CSV.
  - Make adjustments to data as required to be able to import into the tool.
  - Review that the result is readable and possible to work with.
- Import information into new or existing modules in DOORS:
  - Feed back compliance status from an Excel compliance matrix to a module in DOORS.
  - Import requirements from one module to another based on a baseline/filter.
  - Import lists of items (test cases, documents, etc) in Excel or CSV into a module.
  - Provide guidance on format to the information holder at the client.
- Link information/attribute data from one module to others:
  - Status, test results, comments.
  - Link requirements according to information from the client in any given format.
- Export data and reports:
  - Assemble reports based on data from one or several modules according to filters/baselines/written requests in any given format.
  - Export statistics from data in DOORS modules.
  - Create filters in DOORS modules.
Note: Polarion activities are the same as DOORS activities, but process, results, and structure may vary.

Requirements - must have:
- 10+ years of overall experience in the automotive industry.
- Requirement management experience in the automotive industry.
- 3+ years of experience as a Rational DOORS user.
- Knowledge of Siemens Polarion; working knowledge is a plus.
- Experience in offshore delivery for more than 7 years.
- Able to lead a team of 3 to 5 people and manage temporary additions to the team.
- Working knowledge of ASPICE and handling requirements according to ASPICE L2.
- Experience in setting up offshore delivery that best fits the expectations of the customer.
- Experience in setting up quality processes and ways of working.
- Experience in metrics management: propose, capture, and share metrics with internal/external stakeholders.
- Good communication skills in English.

Requirements - good to have (in falling priority order):
- Experience in a DevOps framework of delivery.
- Interest in learning new languages.
- Handling requirements according to ASPICE L3.
- Willingness to travel; travel to Sweden may be needed (approx. 1-2 trips per year).

Soft skills:
- The candidate must be a driven and proactive person, able to work with minimum supervision, and will be asked to give example situations in interviews.
- Good team player with attention to detail; self-disciplined, able to manage their own time and workload, proactive, and motivated.
- Strong sense of responsibility and commitment, innovative thinking.
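An Excel/CSV import step like those above can be prototyped before touching DOORS itself. A simplified Python sketch on an invented two-row CSV (real imports would go through ReqIF and the tools' own APIs; the column names and requirement texts are made up):

```python
import csv
import io

# Hypothetical CSV export of a requirement specification.
RAW = """id,text,status
REQ-1,System shall respond within 2 s,approved
REQ-2,Data shall be encrypted at rest,draft
"""

def load_requirements(raw):
    """Parse a CSV requirement export into dicts keyed by id,
    a stand-in for the 'adjust data, then import' step."""
    return {row["id"]: row for row in csv.DictReader(io.StringIO(raw))}

reqs = load_requirements(RAW)
print(len(reqs), reqs["REQ-2"]["status"])  # 2 draft
```

Keying by requirement ID makes the follow-on tasks (linking attributes across modules, feeding back compliance status) simple dictionary lookups.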

Job posted by Sateesh Hegde

Manager - Information Security

Founded: 2007
Location: Mumbai
Experience: 7 - 9 years
Salary: ₹14L – ₹17L per year

Overall purpose of the job: This role is responsible for identifying and implementing mitigations, practices, and controls, ensuring an adequate application and infrastructure security posture is maintained at all times.

Key performance areas:
- Good at application threat modeling and application risk identification & remediation.
- Strong web application security experience with a thorough understanding of web application vulnerabilities.
- Knowledge of database, application, and web server design and implementation.
- Familiarity with security standards/frameworks and groups (OWASP, OSSTMM, WASC, FISMA).
- Experience with dynamic and static application vulnerability scanners like HP WebInspect, IBM AppScan, HP Fortify, etc.
- Create, implement & review the data protection strategy across the organization.
- Experience in client handling, including interaction with developers to explain the mitigations.
- Experience on mobility platforms like PhoneGap/native Android/Worklight and MDM/MAM.
- Knowledge of DevOps and other upcoming technologies used in the SDLC.
- Experience in manual verification of false positives reported by automated tools.
- Devise and enforce standards and best practices for data protection in line with international standards and industry best practices.
- Evaluate the adequacy of security measures, including network security, to protect organizational data and information assets.
- Define and implement projects as per the approved plan of action.
- Identify security solutions as per business needs.
- Manage POCs for agreed and approved solutions as per the defined process.
- Conduct partner reviews.
- Coordinate with vendors/partners on closure of projects/activities.
- Manage intra- and inter-department conflict amicably.
- Benchmark and compare security practices with the industry.
- Implementation, operation, and maintenance of the Information Security Management System based on standards like ISO/IEC 27001, COBIT, ITIL, etc. as applicable.
- Information security risk assessments and controls selection activities.
- Track all audit schedules and ensure closure of all security gaps.
- Reporting of all critical security issues.
- Coordinate risk assessment of IT systems and third-party workloads.
- Facilitate internal process and IT audits.
- Ensure software license compliance at all times.
- Implement tools and processes related to compliance monitoring as per internal security policies and applicable laws and regulations.
- Facilitate and drive internal audit initiatives for information technology, and update management on closure and identified risks.
- Review third-party applications/systems and network security on a monthly basis.
- Adherence to change management processes.

Job posted by Kushal Dadhich