

Candidates MUST HAVE product-based company experience and a minimum of 3 years of experience in DevOps.
What you will do (or learn):
1. Build our application stack on AWS using infrastructure as code (read: Terraform).
2. Build state-of-the-art CI/CD pipelines.
3. Manage data warehouses and data pipelines.
4. Work on infrastructure and data security.
5. Build a state-of-the-art log management system and the tooling around it.
6. Build our monitoring and alerting systems.
What do we expect from you?
1. 3 to 10 years of experience with DevOps or SRE principles.
2. Good fundamentals of database management and other distributed systems management.
3. Experience in infrastructure as code or other configuration management systems.
4. Experience with scripting languages (e.g. Bash, Python, Go); a minimal scripting sketch follows this list.
5. Good understanding of Linux systems
6. Strong debugging and troubleshooting skills
7. Experience with tooling around monitoring, CI/CD, and log management systems.
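For illustration, here is a minimal sketch of the kind of scripting and monitoring glue this role involves: a Python health check that could feed an alerting system. The endpoint URL and the alerting mechanism are placeholders, not part of our actual stack.

```python
# Minimal health-check script (illustrative only; URL and alerting are placeholders).
import sys
import urllib.request

HEALTH_URL = "http://localhost:8080/healthz"  # hypothetical service endpoint

def check_health(url: str, timeout: float = 5.0) -> bool:
    """Return True if the service answers HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

if __name__ == "__main__":
    if not check_health(HEALTH_URL):
        # In practice this would page via PagerDuty/Slack rather than print.
        print(f"ALERT: {HEALTH_URL} is unhealthy", file=sys.stderr)
        sys.exit(1)
    print("OK")
```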

Similar jobs

We are looking for a QA engineer with experience in Python, AWS, and chaos engineering tools (Chaos Monkey, Gremlin).
- Strong understanding of distributed systems, cloud computing (AWS), and networking principles.
- Ability to understand complex trading systems and prepare and execute plans to induce failures
- Python.
- Experience with chaos engineering tooling such as Chaos Monkey, Gremlin, or similar (a minimal sketch of a chaos-style experiment follows below)
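As a hedged illustration of what such an experiment can look like, the self-contained Python sketch below injects random latency into a fake dependency and measures how often the caller still meets its latency budget. The fake dependency, probabilities, and thresholds are invented for the example and are not taken from any real trading system.

```python
# Toy chaos-style experiment: inject latency into a dependency, measure the caller's success rate.
import random
import time

def flaky_dependency() -> str:
    """Simulated downstream call with injected latency (the 'fault')."""
    if random.random() < 0.3:              # 30% of calls get extra latency injected
        time.sleep(random.uniform(0.2, 0.5))
    return "ok"

def call_with_budget(timeout: float = 0.25) -> bool:
    """Caller under test: succeeds only if the dependency answers within its budget."""
    start = time.monotonic()
    flaky_dependency()
    return (time.monotonic() - start) <= timeout

if __name__ == "__main__":
    results = [call_with_budget() for _ in range(100)]
    success_rate = sum(results) / len(results)
    print(f"success rate under injected latency: {success_rate:.0%}")
    # A real experiment would assert this against an agreed SLO, e.g. >= 99%.
```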


• Should know OOP concepts, Core Java, and basic Android development
• Able to design, develop, test, and implement an Android application
• Basic knowledge of JavaScript and jQuery; candidates with this background get a chance to work on React Native
• Understanding of Linux/Ubuntu, web servers, and cross-browser compatibility
• Strong knowledge of UI development
• Knowledge of implementing third-party APIs in iOS & Android app development is a plus



Responsibilities
- Execute and schedule all tasks related to assigned projects' deliverable dates
- Optimize and debug existing code to make it scalable and improve performance
- Design, develop, and deliver tested code and machine learning models into production environments
- Work effectively in teams, including managing and leading them
- Provide effective, constructive feedback to the delivery leader
- Manage client expectations and work with an agile mindset with machine learning and AI technology
- Design and prototype data-driven solutions
Eligibility
- Highly experienced in designing, building, and shipping scalable, production-quality machine learning algorithms in Python
- Working knowledge and experience in NLP core components (NER, Entity Disambiguation, etc.)
- In-depth expertise in Data Munging and Storage (Experienced in SQL, NoSQL, MongoDB, Graph Databases)
- Expertise in writing scalable APIs for machine learning models
- Experience with maintaining code logs, task schedulers, and security
- Working knowledge of machine learning techniques: feed-forward, recurrent, and convolutional neural networks; entropy models; supervised and unsupervised learning
- Experience with at least one of the following: Keras, TensorFlow, Caffe, or PyTorch (a minimal PyTorch sketch follows this list)
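By way of illustration only, here is a minimal PyTorch sketch of supervised training of a small feed-forward network; the data, dimensions, and hyperparameters are made up for the example and do not reflect any specific project.

```python
# Toy supervised training loop for a feed-forward network in PyTorch.
import torch
from torch import nn

torch.manual_seed(0)
X = torch.randn(256, 10)               # 256 samples, 10 features (synthetic)
y = (X.sum(dim=1) > 0).long()          # toy binary labels

model = nn.Sequential(
    nn.Linear(10, 32),
    nn.ReLU(),
    nn.Linear(32, 2),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(20):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

accuracy = (model(X).argmax(dim=1) == y).float().mean().item()
print(f"final loss {loss.item():.3f}, train accuracy {accuracy:.2%}")
```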


Key Responsibilities:
- Design, develop, implement, and evolve our backend platform (whose APIs and functionality are used by other game services, apps, and the admin back office)
- Develop highly efficient, robust, quality code for applications and services
- Maintain clean coding practices
- Enhance and support existing functions, some of which run 24/7
- Support product owners as required
- Provide support to the testing team during project testing phases
- Contribute to solution proposals as required
- Create work effort estimates as required
- Mentor and train junior software associates and other team members on best practices.
Key Competencies (Functional):
- Clean coding practices
- Knowledge of scalable web architectures is required
- A solid understanding of Linux and the JVM is required
- Knowledge of DevOps, Docker, Build Pipelines is good to have
- Knowledge of git workflow is essential
- Experience working with SOA and microservices.
Any specific knowledge / experience:
- Functional programming experience with immutable data structures is required (a short illustration follows this list)
- Knowledge of Erlang / Elixir is a plus.
- Knowledge of Clojure is ideal.
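To illustrate what is meant by functional programming with immutable data structures, here is a small sketch in Python (the same idea carries over to Erlang, Elixir, and Clojure); the Wallet type and its fields are hypothetical and chosen only for the example.

```python
# Immutable value type plus a pure transformation: no in-place mutation anywhere.
from dataclasses import dataclass, replace

@dataclass(frozen=True)          # instances cannot be mutated after creation
class Wallet:
    player_id: str
    balance: int

def credit(wallet: Wallet, amount: int) -> Wallet:
    """Pure function: returns a new Wallet instead of mutating the old one."""
    return replace(wallet, balance=wallet.balance + amount)

w1 = Wallet("p-42", 100)
w2 = credit(w1, 50)
assert w1.balance == 100 and w2.balance == 150   # the original value is unchanged
```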
Job description
Objectives of this Role
- Source senior level, highly specialist candidates for leading global businesses, covering the European and International markets
- Screen and interview candidates to ensure we put forward the best quality candidates to clients
- Consult with clients on overall hiring strategies and tailor your approach accordingly
- Keep up-to-date with latest industry trends to ensure candidates can be evaluated against industry standard assessments
- Build and develop your client portfolio, providing expert consultation to ensure repeat business
- Generate new leads and clients using your network of contacts
- Network online and offline with potential candidates to promote our employer brand and ensure we attract the best professionals
Daily and Monthly Responsibilities
- Liaise with clients to understand role requirements in order to source the most suitable candidates
- Write and post technical job descriptions on specialist IT job boards, social media and any other relevant channels
- Source, screen and compile a shortlist of qualified candidates for various technical roles
- Interview candidates combining various methods (e.g. structured interviews, technical assessments and behavioral questions)
- Build a candidate CRM to ensure a solid pipeline of qualified candidates - ensuring candidate data is kept updated
- Participate in tech conferences and meetups to network with IT professionals
- Keep up-to-date with new technological trends in order to form strategic conversations with clients on future hiring needs
Skills and Qualifications
- Proven work experience in recruitment - ideally as a Technical Recruiter
- Hands-on experience with various interview formats (e.g. Teams, Zoom, Google Hangouts)
- Technical expertise with an ability to understand and explain job requirements for IT roles
- Experience using LinkedIn Talent Solutions to source quality candidates
- Excellent verbal and written communication skills
- Strong tenacity and ability to build a solid network
Preferred Qualifications
- Degree in HR, communications, marketing, business or similar
- 2 years IT or Tech recruitment experience in a 360 agency role
- Strong experience in sales, business development or client development roles



- A Data and MLOps Engineering lead with a good understanding of modern data engineering frameworks, with a focus on Microsoft Azure and Azure Machine Learning, its development lifecycle, and DevOps.
- Aims to solve the problems encountered when turning data transformations and data science code into production machine learning systems. Some of these challenges include:
- ML orchestration - how can I automate my ML workflows across multiple environments?
- Scalability - how can I take advantage of the huge computational power available in the cloud?
- Serving - how can I make my ML models available to make predictions reliably when needed? (A minimal sketch follows this list.)
- Monitoring - how can I effectively monitor my ML system in production to ensure reliability? Not just system metrics, but also get insight into how my models are performing over time
- Reuse – how can I promote reuse of the artefacts built, and establish templates and patterns?
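As a hedged illustration of the serving challenge, the sketch below exposes a trained model behind an HTTP endpoint with FastAPI (which appears in the tech stack further down). The tiny inline model, the feature schema, and the file name are placeholders invented to keep the example self-contained; a real deployment would load a model produced elsewhere in the ML pipeline.

```python
# Minimal model-serving sketch with FastAPI (illustrative stand-ins throughout).
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.linear_model import LogisticRegression

# Stand-in for a model trained and persisted elsewhere in the pipeline.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X.sum(axis=1) > 0).astype(int)
model = LogisticRegression().fit(X, y)

app = FastAPI()

class Features(BaseModel):
    values: list[float]          # one flat feature vector (3 values in this toy example)

@app.post("/predict")
def predict(features: Features) -> dict:
    prediction = model.predict([features.values])[0]
    return {"prediction": int(prediction)}

# Run locally with: uvicorn serve:app --reload   (assuming this file is named serve.py)
```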
The MLOps team works closely with the ML Engineering and DevOps teams. Rather than focusing on individual use cases, the role specialises in building the platforms and tools that help drive adoption of MLOps across the organisation, and in developing the best practices and ways of working needed for a state-of-the-art MLOps capability.
A good understanding of AI/Machine Learning and software engineering best practices such as Cloud Engineering, Infrastructure-as-Code, and CI/CD.
Have excellent communication and consulting skills, while delivering innovative AI solutions on Azure.
Responsibilities will include:
- Building state-of-the-art MLOps platforms and tooling to help drive adoption of MLOps across the organisation
- Designing cloud ML architectures and providing a roadmap of flexible patterns
- Optimizing solutions for performance and scalability
- Leading and driving the evolving best practices for MLOps
- Helping to showcase expertise and leadership in this field
Tech stack
These are some of the tools and technologies that we use day to day. Key to success will be attitude and aptitude, with a vision to build the next big thing in the AI/ML field.
- Python - including poetry for dependency management, pytest for automated testing and fastapi for building APIs
- Microsoft Azure Platform - primarily focused on Databricks, Azure ML
- Containers
- CI/CD – Azure DevOps
- Strong programming skills in Python
- Solid understanding of cloud concepts
- Demonstrable interest in Machine Learning
- Understanding of IaC and CI/CD concepts
- Strong communication and presentation skills.
Remuneration: Best in the industry
Connect: https://www.linkedin.com/in/shweta-gupta-a361511


- Experience in leading a software team and managing software delivery projects.
- Strong experience in Python development in a full-stack environment is a requirement, including Flask (a minimal Flask + SQLAlchemy sketch follows this list)
- Strong hands-on experience with VueJS/Vuex, JavaScript, React, Angular
- Experience with SQLAlchemy or similar ORM frameworks
- Experience working with mapping APIs (e.g., Google Maps, Mapbox)
- Experience using Elasticsearch and a Docker environment is a plus
- Strong knowledge of SQL, comfortable working with MySQL and/or PostgreSQL databases
- Understanding of data modeling concepts
- Experience with REST.
- Experience with Git, GitFlow, code review process
- Good understanding of basic UI and UX principles
- Must enjoy problem solving and have excellent communication skills
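As a hedged sketch of the Flask + SQLAlchemy + REST combination listed above, the example below defines a hypothetical table of places with coordinates (a nod to the mapping-API requirement) and serves it over a small REST endpoint; the table, columns, and SQLite file are invented for illustration and would be swapped for the real schema and a MySQL/PostgreSQL database in practice.

```python
# Minimal Flask + SQLAlchemy REST sketch (hypothetical "places" model).
from flask import Flask, jsonify
from sqlalchemy import create_engine, select, Column, Integer, String, Float
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class Place(Base):                      # hypothetical model for illustration
    __tablename__ = "places"
    id = Column(Integer, primary_key=True)
    name = Column(String(120), nullable=False)
    lat = Column(Float)
    lng = Column(Float)

engine = create_engine("sqlite:///places.db")   # swap for MySQL/PostgreSQL in practice
Base.metadata.create_all(engine)

app = Flask(__name__)

@app.get("/api/places")
def list_places():
    # Return all places as JSON; a real service would add pagination and filtering.
    with Session(engine) as session:
        rows = session.execute(select(Place)).scalars().all()
        return jsonify([{"id": p.id, "name": p.name, "lat": p.lat, "lng": p.lng} for p in rows])

if __name__ == "__main__":
    app.run(debug=True)
```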

Must have hands-on experience in the relevant domain, including:
– JavaScript, jQuery, Angular, React/Redux, Backbone
– RSpec, Capybara, Factory Girl
– Resque, Sidekiq
– Elasticsearch, Sphinx
– Deploy with Capistrano, Mina or Heroku
– Chef, Amazon Web Services
– PostgreSQL, MongoDB, Redis, Memcached
– Active Admin
– Spree, LocomotiveCMS
– REST API
– CoffeeScript, HAML, SASS, LESS

