
DataOps Engineer

Posted by Dhwani Thakkar
5 - 15 yrs
₹5L - ₹21L / yr
Remote only
Skills
Data Warehouse (DWH)
Informatica
ETL
Amazon Web Services (AWS)
Amazon S3
Amazon EMR
PySpark
Apache Airflow

Mactores is a trusted leader among businesses in providing modern data platform solutions. Since 2008, Mactores has been enabling businesses to accelerate their value through automation by providing end-to-end data solutions that are automated, agile, and secure. We collaborate with customers to strategize, navigate, and accelerate an ideal path forward in their digital transformation via assessments, migration, or modernization.


We are looking for a DataOps Engineer with expertise in operating a data lake. The data lake is built on Amazon S3 and Amazon EMR, with Apache Airflow for workflow management.


You have experience building and running data lake platforms on AWS, exposure to operating PySpark-based ETL jobs in Apache Airflow and Amazon EMR, and expertise in monitoring services such as Amazon CloudWatch.


If you love solving problems and want an unusual and fun work environment that actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself, you will fit right in at Mactores.


What will you do?


  • Operate the current data lake deployed on AWS with Amazon S3, Amazon EMR, and Apache Airflow (a minimal sketch of this stack follows the list below).
  • Debug and fix production issues in PySpark.
  • Perform root cause analysis (RCA) for production issues.
  • Collaborate with product teams on L3/L4 production issues in PySpark.
  • Contribute to enhancing ETL efficiency.
  • Build CloudWatch dashboards to improve operational efficiency.
  • Handle escalation tickets from L1 monitoring engineers.
  • Assign tickets to L1 engineers based on their expertise.
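
For context, here is a minimal, illustrative sketch of operating this stack: an Airflow DAG that submits a PySpark step to an existing EMR cluster and waits for it to complete. The cluster ID, bucket, and script path are hypothetical, and the imports assume Airflow 2.x with the Amazon provider package installed.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.emr import EmrAddStepsOperator
from airflow.providers.amazon.aws.sensors.emr import EmrStepSensor

# One Spark step, run via EMR's command-runner; the script lives on S3.
SPARK_STEP = [{
    "Name": "daily_etl",
    "ActionOnFailure": "CONTINUE",
    "HadoopJarStep": {
        "Jar": "command-runner.jar",
        "Args": ["spark-submit", "s3://example-bucket/jobs/daily_etl.py"],
    },
}]

with DAG(
    dag_id="data_lake_daily_etl",
    start_date=datetime(2023, 1, 1),
    schedule="@daily",  # Airflow 2.4+ argument; older versions use schedule_interval
    catchup=False,
) as dag:
    add_step = EmrAddStepsOperator(
        task_id="add_step",
        job_flow_id="j-EXAMPLECLUSTER",  # hypothetical EMR cluster ID
        steps=SPARK_STEP,
    )
    # The operator pushes the new step IDs to XCom; the sensor polls the first one.
    wait_for_step = EmrStepSensor(
        task_id="wait_for_step",
        job_flow_id="j-EXAMPLECLUSTER",
        step_id="{{ task_instance.xcom_pull(task_ids='add_step')[0] }}",
    )
    add_step >> wait_for_step
```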


What are we looking for?


  • AWS DataOps engineer.
  • 5+ years of overall experience in the software industry, with experience developing and architecting data applications using Python or Scala, Airflow, and Kafka on the AWS data platform.
  • Must have set up or led a project to enable DataOps on AWS or another cloud data platform.
  • Strong data engineering experience on a cloud platform, preferably AWS.
  • Experience designing data pipelines for reuse through parameterization (see the sketch after this list).
  • Experience with pipelines designed to solve common ETL problems.
  • Understanding of, or experience with, how AWS services such as Amazon EMR and Apache Airflow can be codified to enable DataOps.
  • Experience building data pipelines using CI/CD infrastructure.
  • Understanding of Infrastructure as Code for DataOps enablement.
  • Ability to work with ambiguity and create quick PoCs.
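
As an illustration of reuse through parameterization, here is a sketch of a single PySpark script that can serve many tables by taking its source, target, and run date as arguments; all paths and column names are hypothetical.

```python
import argparse

from pyspark.sql import SparkSession


def main() -> None:
    # One generic script, parameterized per dataset and per run.
    parser = argparse.ArgumentParser(description="Generic partition-load ETL")
    parser.add_argument("--source", required=True)    # e.g. s3://raw/orders/
    parser.add_argument("--target", required=True)    # e.g. s3://curated/orders/
    parser.add_argument("--run-date", required=True)  # e.g. 2023-01-01
    args = parser.parse_args()

    spark = SparkSession.builder.appName("generic-etl").getOrCreate()
    df = spark.read.json(args.source)
    (df.where(df["event_date"] == args.run_date)      # hypothetical column
       .write.mode("overwrite")
       .parquet(f"{args.target}event_date={args.run_date}/"))
    spark.stop()


if __name__ == "__main__":
    main()
```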


You will be preferred if


  • Expertise in Amazon EMR, Apache Airflow, Terraform, and CloudWatch (a small dashboard-as-code sketch follows this list).
  • Exposure to MLOps using Amazon SageMaker is a plus.
  • AWS Solutions Architect Professional or Associate Level Certificate.
  • AWS DevOps Professional Certificate.
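
As a small example of codifying monitoring, this sketch creates a CloudWatch dashboard with boto3; the metric, cluster ID, and region are placeholders, not the team's actual setup.

```python
import json

import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# A single widget charting a standard EMR metric for a hypothetical cluster.
dashboard_body = {
    "widgets": [{
        "type": "metric",
        "x": 0, "y": 0, "width": 12, "height": 6,
        "properties": {
            "title": "EMR apps running",
            "metrics": [["AWS/ElasticMapReduce", "AppsRunning",
                         "JobFlowId", "j-EXAMPLECLUSTER"]],
            "period": 300,
            "stat": "Average",
            "region": "us-east-1",
        },
    }],
}

cloudwatch.put_dashboard(
    DashboardName="data-lake-ops",
    DashboardBody=json.dumps(dashboard_body),
)
```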


Life at Mactores


We care about creating a culture that makes a real difference in the lives of every Mactorian. Our 10 Core Leadership Principles, which honor decision-making, leadership, collaboration, and curiosity, drive how we work.


1. Be one step ahead

2. Deliver the best

3. Be bold

4. Pay attention to detail

5. Enjoy the challenge

6. Be curious and take action

7. Take leadership

8. Own it

9. Deliver value

10. Be collaborative


You can read more about our work culture at https://mactores.com/careers


The Path to Joining the Mactores Team

At Mactores, our recruitment process is structured around three distinct stages:


Pre-Employment Assessment: You will be invited to participate in a series of pre-employment evaluations to assess your technical proficiency and suitability for the role.


Managerial Interview: The hiring manager will engage with you in multiple discussions, lasting anywhere from 30 minutes to an hour, to assess your technical skills, hands-on experience, leadership potential, and communication abilities.


HR Discussion: During this 30-minute session, you'll have the opportunity to discuss the offer and next steps with a member of the HR team.


At Mactores, we are committed to providing equal opportunities in all of our employment practices, and we do not discriminate based on race, religion, gender, national origin, age, disability, marital status, military status, genetic information, or any other category protected by federal, state, and local laws. This policy extends to all aspects of the employment relationship, including recruitment, compensation, promotions, transfers, disciplinary action, layoff, training, and social and recreational programs. All employment decisions will be made in compliance with these principles.



About Mactores Cognition Private Limited

Founded: 2008
Size: 20-100
Stage: Bootstrapped

Mactores is a global technology consulting and product company focused on delivering solutions in Cloud, Big Data, Deep Analytics, DevOps, IoT, and AI.


Similar jobs

Vola Finance
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
3yrs+
Up to ₹20L / yr (varies)
Amazon Web Services (AWS)
Data engineering
Spark
SQL
Data Warehouse (DWH)
+4 more



Roles & Responsibilities


Basic Qualifications:

● The position requires a four-year degree from an accredited college or university.

● Three years of data engineering / AWS Architecture and security experience.


Top candidates will also have:

Proven/strong understanding of and/or experience in many of the following:

● Experience designing scalable AWS architectures.

● Ability to create modern data pipelines and data processing using AWS PaaS components (Glue, etc.) or open-source tools (Spark, HBase, Hive, etc.); a brief sketch follows this list.

● Ability to develop SQL structures that support high volumes and scalability using an RDBMS such as SQL Server, MySQL, Aurora, etc.

● Ability to model and design modern data structures, SQL/NoSQL databases, data lakes, and cloud data warehouses.

● Experience creating network architectures for secure, scalable solutions.

● Experience with message brokers such as Kinesis, Kafka, RabbitMQ, AWS SQS, AWS SNS, and Apache ActiveMQ. Hands-on experience with AWS serverless architectures such as Glue, Lambda, Redshift, etc.

● Working knowledge of load balancers, AWS Shield, AWS GuardDuty, VPCs, subnets, network gateways, Route 53, etc.

● Knowledge of building disaster management systems and security-log notification systems.

● Knowledge of building scalable microservice architectures with AWS.

● Ability to create a framework for monthly security checks, and broad knowledge of AWS services.

● Experience deploying software using CI/CD tools such as CircleCI, Jenkins, etc.

● ML/AI model deployment and production maintenance experience is mandatory.

● Experience with API tools such as REST, Swagger, Postman, and Assertible.

● Version management tools such as GitHub, Bitbucket, and GitLab.

● Debugging and maintaining software on Linux or Unix platforms.

● Test-driven development.

● Experience building transactional databases.

● Python and PySpark programming experience.

● Must have experience engineering solutions in AWS.

● Working AWS experience; AWS certification is required prior to hiring.

● Experience working in Agile/Kanban frameworks.

● Must demonstrate solid knowledge of computer science fundamentals like data structures and algorithms.

● Passion for technology and an eagerness to contribute to a team-oriented environment.

● Demonstrated leadership on medium- to large-scale projects impacting strategic priorities.

● A bachelor's degree in Computer Science, Electrical Engineering, or a related field is required.
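
For a flavor of the Glue-based pipelines mentioned above, here is a hedged sketch of starting a (hypothetical) Glue job from Python with boto3 and checking its status; production code would add retries, backoff, and a timeout.

```python
import boto3

glue = boto3.client("glue", region_name="us-east-1")

# Kick off a hypothetical Glue job with a runtime argument.
response = glue.start_job_run(
    JobName="nightly-orders-etl",
    Arguments={"--run_date": "2023-01-01"},
)
run_id = response["JobRunId"]

# Check the run state once (a real pipeline would poll until a terminal state).
status = glue.get_job_run(JobName="nightly-orders-etl", RunId=run_id)
print(status["JobRun"]["JobRunState"])
```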

DeepIntent
Posted by Indrajeet Deshmukh
Pune
2 - 5 yrs
Best in industry
Data Warehouse (DWH)
Informatica
ETL
SQL
Java
+1 more

Who You Are:


- In-depth, strong knowledge of SQL.

- Basic knowledge of Java.

- Basic scripting knowledge.

- Strong analytical skills.

- Excellent debugging and problem-solving skills.


What You’ll Do:


- Comfortable working across EST and IST time zones.

- Troubleshoot complex issues discovered in-house as well as in customer environments.

- Replicate customer environments/issues on the platform and data, and work to identify the root cause or provide an interim workaround as needed.

- Debug SQL queries associated with data pipelines.

- Monitor and debug ETL jobs on a daily basis.

- Provide technical action plans to take a customer/product issue from start to resolution.

- Capture and document any data incidents identified on the platform, and maintain a history of such issues along with their resolutions.

- Identify product bugs and improvements based on customer environments and work to close them.

- Ensure implementation and continuous improvement of formal processes to support product development activities.

- Communicate well, both externally and internally, across stakeholders.

Factory Edtech
Agency job
via Qrata by Blessy Fernandes
Delhi
2 - 5 yrs
₹10L - ₹12L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+5 more
Job Title:
Data Analyst

Job Brief:
The successful candidate will turn data into information, information into insight, and insight into business decisions.

Data Analyst Job Duties
Data analyst responsibilities include conducting full lifecycle analysis covering requirements, activities, and design. Data analysts will develop analysis and reporting capabilities. They will also monitor performance and quality control plans to identify improvements.

About Us
We began in 2015 with an entrepreneurial vision to bring digital change to the manufacturing landscape of India. With a team of 300+, we are working towards the digital transformation of businesses in the manufacturing industry across domains like footwear, apparel, textiles, and accessories. We are backed by investors such as Info Edge (Naukri.com), Matrix Partners, Sequoia, WaterBridge Ventures, and select industry leaders. Today, we have enabled 2000+ manufacturers to digitize their distribution channels.

Responsibilities
● Interpret data, analyze results using statistical techniques, and provide ongoing reports.
● Develop and implement databases, data collection systems, data analytics, and other strategies that optimize statistical efficiency and quality.
● Acquire data from primary or secondary data sources and maintain databases/data systems.
● Identify, analyze, and interpret trends or patterns in complex data sets.
● Filter and “clean” data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems.
● Work with management to prioritize business and information needs.
● Locate and define new process improvement opportunities (a small sketch of this acquire-clean-analyze loop follows the list).
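
As a small, hypothetical illustration of that loop, using pandas as a stand-in for the packages listed below (the file and column names are made up):

```python
import pandas as pd

df = pd.read_csv("sales.csv")                        # acquire
df = df.dropna(subset=["region", "revenue"])         # clean: drop incomplete rows
df["revenue"] = pd.to_numeric(df["revenue"], errors="coerce")

summary = (df.groupby("region")["revenue"]
             .agg(["count", "mean", "sum"])
             .sort_values("sum", ascending=False))   # analyze per region
print(summary)                                       # report
```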
Requirements
● Proven working experience as a Data Analyst or Business Data Analyst.
● Technical expertise regarding data models, database design and development, data mining, and segmentation techniques.
● Strong knowledge of and experience with reporting packages (Business Objects, etc.), databases (SQL, etc.), and programming (XML, JavaScript, or ETL frameworks).
● Knowledge of statistics and experience using statistical packages for analyzing datasets (Excel, SPSS, SAS, etc.).
● Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
● Adept at queries, report writing, and presenting findings.


Job Location
South Delhi, New Delhi
Gurugram, Bengaluru (Bangalore), Chennai
2 - 9 yrs
₹9L - ₹27L / yr
DevOps
Microsoft Windows Azure
gitlab
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
+15 more
Greetings!

We are looking for a technically driven "MLOps Engineer" for one of our premium clients.

COMPANY DESCRIPTION:
This company is a global management consulting firm and a trusted advisor to the world's leading businesses, governments, and institutions. We work with leading organizations across the private, public, and social sectors. Our scale, scope, and knowledge allow us to address


Key Skills
• Excellent hands-on expert knowledge of cloud platform infrastructure and administration (Azure/AWS/GCP), with strong knowledge of cloud services integration and cloud security.
• Expertise setting up CI/CD processes and building and maintaining secure DevOps pipelines with at least 2 major DevOps stacks (e.g., Azure DevOps, GitLab, Argo).
• Experience with modern development methods and tooling: containers (e.g., Docker) and container orchestration (K8s), CI/CD tools (e.g., CircleCI, Jenkins, GitHub Actions, Azure DevOps), version control (Git, GitHub, GitLab), and orchestration/DAG tools (e.g., Argo, Airflow, Kubeflow).
• Hands-on coding skills in Python 3 (e.g., APIs), including automated testing frameworks and libraries (e.g., pytest), Infrastructure as Code (e.g., Terraform), and Kubernetes artifacts (e.g., deployments, operators, Helm charts).
• Experience setting up at least one contemporary MLOps tool (e.g., experiment tracking, model governance, packaging, deployment, feature store); a brief experiment-tracking sketch follows this list.
• Practical knowledge delivering and maintaining production software such as APIs and cloud infrastructure.
• Knowledge of SQL (intermediate level or better preferred) and familiarity working with at least one common RDBMS (MySQL, Postgres, SQL Server, Oracle).
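
As a minimal sketch of the experiment-tracking side of MLOps tooling, here is what logging a run with MLflow can look like; the experiment, parameter, and metric names are illustrative.

```python
import mlflow

mlflow.set_experiment("churn-model")

with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("learning_rate", 0.01)
    mlflow.log_param("max_depth", 6)
    # ... train and evaluate the model here ...
    mlflow.log_metric("val_auc", 0.87)  # placeholder result
```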
Hyderabad
4 - 7 yrs
₹14L - ₹25L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+5 more

Roles and Responsibilities

  • At least 3 to 4 years of relevant experience as a Big Data Engineer.
  • Minimum 1 year of relevant hands-on experience with the Spark framework.
  • Minimum 4 years of application development experience using a programming language such as Scala, Java, or Python.
  • Hands-on experience with major components of the Hadoop ecosystem such as HDFS, MapReduce, Hive, or Impala.
  • Strong programming experience building applications/platforms using Scala, Java, or Python.
  • Experienced in implementing Spark RDD transformations and actions to implement business analysis (see the sketch below).
  • An efficient interpersonal communicator with sound analytical, problem-solving, and management capabilities.
  • Strives to keep the learning curve steep and able to quickly adapt to new environments and technologies.
  • Good knowledge of agile software development methodology.
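
For reference, a short PySpark sketch of RDD transformations and actions — the classic word count over a hypothetical input file:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("rdd-demo").getOrCreate()
sc = spark.sparkContext

counts = (sc.textFile("s3://example-bucket/input.txt")   # load an RDD
            .flatMap(lambda line: line.split())          # transformation
            .map(lambda word: (word, 1))                 # transformation
            .reduceByKey(lambda a, b: a + b))            # transformation
print(counts.take(10))                                   # action triggers execution
spark.stop()
```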
Simform Solutions
Posted by Dipali Pithava
Ahmedabad
4 - 8 yrs
₹5L - ₹12L / yr
ETL
Informatica
Data Warehouse (DWH)
Relational Database (RDBMS)
DBA
+4 more
We are looking for a Lead DBA with 4-7 years of experience.

We are a fast-growing digital, cloud, and mobility services provider whose principal market is North America. We are looking for talented database/SQL experts for the management and analytics of large data in various enterprise projects.

Responsibilities
  • Translate business needs to technical specifications
  • Manage and maintain various database servers (backup, replicas, shards, jobs)
  • Develop and execute database queries and conduct analyses
  • Occasionally write scripts for ETL jobs
  • Create tools to store data (e.g. OLAP cubes)
  • Develop and update technical documentation

Requirements
  • Proven experience as a database programmer and administrator
  • Background in data warehouse design (e.g. dimensional modeling) and data mining
  • Good understanding of SQL and NoSQL databases, online analytical processing (OLAP), and ETL (Extract, Transform, Load) frameworks
  • Advanced knowledge of SQL queries, SQL Server Reporting Services (SSRS), and SQL Server Integration Services (SSIS)
  • Familiarity with BI technologies (strong Tableau hands-on experience) is a plus
  • Analytical mind with a problem-solving aptitude
DataMetica
Posted by Shivani Mahale
Pune
4 - 7 yrs
₹5L - ₹15L / yr
ETL
Informatica PowerCenter
Teradata
Data Warehouse (DWH)
IBM InfoSphere DataStage
Requirements -
  • Must have 4 to 7 years of experience in ETL design and development using Informatica components.
  • Should have extensive knowledge of Unix shell scripting.
  • Understanding of DW principles (fact and dimension tables, dimensional modelling, and data warehousing concepts).
  • Research, develop, document, and modify ETL processes as per data architecture and modeling requirements.
  • Ensure appropriate documentation for all new development and modifications of ETL processes and jobs.
  • Should be good at writing complex SQL queries.
Opportunities -
  • Selected candidates will be provided training opportunities in one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka, and would get a chance to be part of enterprise-grade implementations of Cloud and Big Data systems.
  • Will play an active role in setting up a modern data platform based on Cloud and Big Data.
  • Will be part of teams with rich experience in various aspects of distributed systems and computing.
Remote only
4 - 8 yrs
₹10L - ₹15L / yr
Big Data
Hadoop
Kafka
Spark
Amazon Web Services (AWS)
  • Hands-on programming expertise in Java or Python.
  • Strong production experience with Spark (minimum of 1-2 years).
  • Experience building data pipelines using Big Data technologies (Hadoop, Spark, Kafka, etc.) on large-scale unstructured data sets (a short streaming sketch follows this list).
  • Working experience with, and a good understanding of, public cloud environments (AWS, Azure, or Google Cloud).
  • Experience with IAM policy and role management is a plus.
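
As a short illustration of such a pipeline, here is a sketch of Spark Structured Streaming reading from Kafka and landing data on S3; the broker, topic, and paths are hypothetical, and it assumes the spark-sql-kafka connector package is available.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
       .option("subscribe", "events")                     # hypothetical topic
       .load())

# Kafka values arrive as bytes; cast to string before downstream parsing.
parsed = raw.selectExpr("CAST(value AS STRING) AS json")

query = (parsed.writeStream.format("parquet")
         .option("path", "s3a://example-bucket/events/")
         .option("checkpointLocation", "s3a://example-bucket/checkpoints/events/")
         .start())
query.awaitTermination()
```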
GitHub
Posted by Nataliia Mediana
Remote only
3 - 15 yrs
$50K - $80K / yr
Data Science
Data Scientist
Data engineering
Financial analysis
Finance
+8 more

We are a nascent quantitative hedge fund led by an MIT PhD and Math Olympiad medallist, offering opportunities to grow with us as we build out the team. Our fund has world-class investors and big data experts as part of the GP, and top-notch ML experts as advisers, plus equity funding to grow the team, license data, and scale the data processing.

We are interested in researching and taking live a variety of quantitative strategies based on historic and live market data, alternative datasets, social media data (both audio and video), and stock fundamental data.

You would join, and if qualified lead, a growing team of data scientists and researchers, and be responsible for the complete lifecycle of quantitative strategy implementation and trading.

Requirements:

  • At least 3 years of relevant ML experience.
  • Graduation date: 2018 or earlier.
  • 3-5 years of experience in high-level Python programming.
  • Master's degree (or PhD) in a quantitative discipline such as Statistics, Mathematics, Physics, or Computer Science from a top university.
  • Good knowledge of applied and theoretical statistics, linear algebra, and machine learning techniques.
  • Ability to leverage financial and statistical insights to research, explore, and harness a large collection of quantitative strategies and financial datasets in order to build strong predictive models.
  • Should take ownership of the research, design, development, and implementation of strategy development, and communicate effectively with other teammates.
  • Prior experience with, and good knowledge of, the lifecycle and pitfalls of algorithmic strategy development and modelling.
  • Good practical knowledge of financial statements, value investing, and portfolio and risk management techniques.
  • A proven ability to lead and drive innovation to solve challenges and roadblocks in project completion.
  • A valid GitHub profile with some activity in it.

Bonus to have:

  • Experience storing and retrieving data from large and complex time-series databases.
  • Very good practical knowledge of time-series modelling and forecasting (ARIMA, ARCH, and stochastic modelling); a brief forecasting sketch follows this list.
  • Prior experience optimizing and backtesting quantitative strategies, doing return and risk attribution, and feature/factor evaluation.
  • Knowledge of the AWS/Cloud ecosystem is an added plus (EC2, Lambda, EKS, SageMaker, etc.).
  • Knowledge of REST APIs and data extraction and cleaning techniques.
  • Good to have experience in PySpark or other big data programming/parallel computing.
  • Familiarity with derivatives, and knowledge of multiple asset classes along with equities.
  • Any progress towards CFA or FRM is a bonus.
  • Average tenure of at least 1.5 years per company.
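
As a brief, illustrative taste of the time-series modelling mentioned above, an ARIMA fit and forecast with statsmodels on a synthetic random-walk series:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
prices = pd.Series(100 + np.cumsum(rng.normal(0, 1, 250)))  # synthetic series

model = ARIMA(prices, order=(1, 1, 1))  # (p, d, q) chosen arbitrarily here
result = model.fit()
print(result.forecast(steps=5))  # forecast the next five periods
```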
Remote, NCR (Delhi | Gurgaon | Noida)
3 - 12 yrs
₹8L - ₹14L / yr
Data Warehouse (DWH)
ETL
Amazon Redshift

Responsible for planning, connecting, designing, scheduling, and deploying data warehouse systems. Develops, monitors, and maintains ETL processes, reporting applications, and data warehouse design.

Roles and Responsibilities

  • Plan, create, coordinate, and deploy data warehouses.
  • Design end-user interfaces.
  • Create best practices for data loading and extraction.
  • Develop data architecture, data modeling, and ETL mapping solutions within a structured data warehouse environment.
  • Develop reporting applications and maintain data warehouse consistency.
  • Facilitate requirements gathering using expert listening skills, and develop unique, simple solutions to meet the immediate and long-term needs of business customers.
  • Supervise design throughout the implementation process.
  • Design and build cubes, writing custom scripts where needed.
  • Develop and implement ETL routines according to the DWH design and architecture (a toy sketch follows this list).
  • Support the development and validation required through the lifecycle of the DWH and Business Intelligence systems, maintain user connectivity, and provide adequate security for the data warehouse.
  • Monitor DWH and BI system performance and integrity; provide corrective and preventative maintenance as required.
  • Manage multiple projects at once.
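
As a toy sketch only (the stack described here is SQL Server/SSIS, but the idea is the same): loading and querying a minimal dimensional model, one fact table joined to one dimension, using Python's built-in sqlite3.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    sale_date   TEXT,
    amount      REAL
);
""")

# A trivial "ETL routine": load one dimension row and one fact row.
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
conn.execute("INSERT INTO fact_sales VALUES (1, '2023-01-01', 19.99)")

# The star-schema query pattern: facts aggregated by a dimension attribute.
for row in conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales AS f JOIN dim_product AS p USING (product_key)
    GROUP BY p.name
"""):
    print(row)
```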

DESIRABLE SKILL SET

  • Experience with technologies such as MySQL, MongoDB, and SQL Server 2008, as well as with newer ones like SSIS and stored procedures.
  • Exceptional experience developing code, testing for quality assurance, administering RDBMSs, and monitoring databases.
  • High proficiency in dimensional modeling techniques and their applications.
  • Strong analytical, consultative, and communication skills, as well as the ability to exercise good judgment and work with both technical and business personnel.
  • Several years of working experience with Tableau, MicroStrategy, Information Builders, and other reporting and analytical tools.
  • Working knowledge of SAS and R code used in data processing and modeling tasks.
  • Strong experience with Hadoop, Impala, Pig, Hive, YARN, and other “big data” technologies such as AWS Redshift or Google Big Data.
