
Mactores is a trusted leader among businesses in providing modern data platform solutions. Since 2008, Mactores has been enabling businesses to accelerate value through automation by providing end-to-end data solutions that are automated, agile, and secure. We collaborate with customers to strategize, navigate, and accelerate an ideal path forward in their digital transformation via assessments, migration, or modernization.
We are looking for a DataOps Engineer with expertise in operating a data lake. The data lake is built on Amazon S3 and Amazon EMR, with Apache Airflow for workflow management.
You have experience building and running data lake platforms on AWS, exposure to operating PySpark-based ETL jobs in Apache Airflow and Amazon EMR, and expertise in monitoring services such as Amazon CloudWatch.
If you love solving problems and have a professional services background, you will enjoy our casual and fun office environment, which actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself.
What will you do?
- Operate the current data lake deployed on AWS with Amazon S3, Amazon EMR, and Apache Airflow
- Debug and fix production issues in PySpark.
- Determine the root cause analysis (RCA) for production issues.
- Collaborate with product teams for L3/L4 production issues in PySpark.
- Contribute to enhancing the ETL efficiency
- Build CloudWatch dashboards to improve operational efficiency
- Handle escalation tickets from L1 Monitoring engineers
- Assign the tickets to L1 engineers based on their expertise
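The responsibilities above revolve around running PySpark jobs on Amazon EMR from Airflow. As a rough, illustrative sketch (the bucket path, script name, and arguments below are hypothetical, not from this posting), an EMR step for a spark-submit job is just a JSON structure that an orchestrator passes to the EMR AddSteps API:

```python
def spark_submit_step(name, script_s3_path, extra_args=None):
    """Build an Amazon EMR step definition that runs a PySpark script
    via spark-submit. The dict matches the shape EMR's AddSteps API expects."""
    args = ["spark-submit", "--deploy-mode", "cluster", script_s3_path]
    if extra_args:
        args += list(extra_args)
    return {
        "Name": name,
        "ActionOnFailure": "CONTINUE",  # keep the cluster alive if the job fails
        "HadoopJarStep": {
            "Jar": "command-runner.jar",  # EMR's built-in command runner
            "Args": args,
        },
    }

# Example: a nightly ETL step (bucket and script name are made up)
step = spark_submit_step(
    "nightly-etl",
    "s3://example-bucket/jobs/nightly_etl.py",
    extra_args=["--date", "2024-01-01"],
)
```

In Airflow, a step dict like this would typically be handed to the Amazon provider's EmrAddStepsOperator, with CloudWatch picking up the resulting cluster and job metrics.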
What are we looking for?
- AWS DataOps engineer with 5+ years of overall experience in the software industry.
- Experience and expertise in developing and architecting data applications using Python or Scala, Airflow, and Kafka on the AWS data platform.
- Must have set up or led a project to enable DataOps on AWS or another cloud data platform.
- Strong data engineering experience on a cloud platform, preferably AWS.
- Experience designing data pipelines for reuse and parameterization.
- Experience designing pipelines that solve common ETL problems.
- Understanding of, or experience with, codifying AWS services such as Amazon EMR and Apache Airflow to enable DataOps.
- Experience in building data pipelines using CI/CD infrastructure.
- Understanding of Infrastructure as Code for DataOps enablement.
- Ability to work with ambiguity and create quick PoCs.
You will be preferred if you have
- Expertise in Amazon EMR, Apache Airflow, Terraform, CloudWatch
- Exposure to MLOps using Amazon SageMaker is a plus.
- AWS Solutions Architect Professional or Associate Level Certificate
- AWS DevOps Professional Certificate
Life at Mactores
We care about creating a culture that makes a real difference in the lives of every Mactorian. Our 10 Core Leadership Principles that honor Decision-making, Leadership, Collaboration, and Curiosity drive how we work.
1. Be one step ahead
2. Deliver the best
3. Be bold
4. Pay attention to the detail
5. Enjoy the challenge
6. Be curious and take action
7. Take leadership
8. Own it
9. Deliver value
10. Be collaborative
You can read more about our work culture at https://mactores.com/careers
The Path to Joining the Mactores Team
At Mactores, our recruitment process is structured around three distinct stages:
Pre-Employment Assessment:
You will be invited to participate in a series of pre-employment evaluations to assess your technical proficiency and suitability for the role.
Managerial Interview: The hiring manager will engage with you in multiple discussions, lasting anywhere from 30 minutes to an hour, to assess your technical skills, hands-on experience, leadership potential, and communication abilities.
HR Discussion: During this 30-minute session, you'll have the opportunity to discuss the offer and next steps with a member of the HR team.
At Mactores, we are committed to providing equal opportunities in all of our employment practices, and we do not discriminate based on race, religion, gender, national origin, age, disability, marital status, military status, genetic information, or any other category protected by federal, state, and local laws. This policy extends to all aspects of the employment relationship, including recruitment, compensation, promotions, transfers, disciplinary action, layoff, training, and social and recreational programs. All employment decisions will be made in compliance with these principles.

Similar jobs
Job Summary:
The company is seeking a Senior Graphic Designer (3 - 10 years of experience) to lead and execute creative and compliant packaging designs for our exported pharmaceutical products. This role demands expertise in international packaging regulations, brand consistency, and visual communication for a global market.
Core Responsibility:
Lead all aspects of pharmaceutical packaging design for export, from concept to final artwork, ensuring adherence to diverse international regulatory requirements and brand alignment.
Qualifications:
- Bachelor's degree in Graphic Design or related field.
- 3 - 10 years of proven experience in pharmaceutical packaging design, with exposure to export requirements.
- Strong knowledge of international packaging regulations and guidelines.
- Expertise in graphic design software, including CorelDraw, Adobe Illustrator, and other relevant design tools.

DevOps Engineer
Our engineering team is looking for Big Data DevOps engineers to help us automate the build, release, packaging, infrastructure provisioning, and support processes. The candidate is expected to own the full life cycle of provisioning, configuration management, monitoring, maintenance, and support for cloud as well as on-premises deployments.
Requirements
- 3-plus years of DevOps experience managing the Big Data application stack including HDFS, YARN, Spark, Hive and Hbase
- Deep understanding of all the configurations required for installing and maintaining the infrastructure in the long run
- Experience setting up high availability, configuring resource allocation, setting up capacity schedulers, handling data recovery tasks
- Experience with middle-layer technologies including web servers (httpd, nginx), application servers (JBoss, Tomcat), and database systems (Postgres, MySQL)
- Experience setting up enterprise security solutions including setting up active directories, firewalls, SSL certificates, Kerberos KDC servers, etc.
- Experience maintaining and hardening the infrastructure by regularly applying required security packages and patches
- Experience supporting on-premise solutions as well as on AWS cloud
- Experience working with and supporting Spark-based applications on YARN
- Experience with one or more automation tools such as Ansible, Terraform, etc
- Experience working with CI/CD tools like Jenkins and various test report and coverage Plugins
- Experience defining and automating the build, versioning and release processes for complex enterprise products
- Experience supporting clients remotely and on-site
- Experience working with and supporting Java- and Python-based tech stacks would be a Plus
Experience: 4+ years.
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profile, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
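Several of the bullets above concern streaming ingestion into Snowflake via Snowpipe. As a minimal sketch of what that setup looks like (the pipe, table, and stage names below are invented for illustration), a Snowpipe is essentially DDL wrapping a COPY INTO statement, which a small helper can render:

```python
def create_pipe_ddl(pipe, table, stage, file_format="(TYPE = JSON)"):
    """Render Snowflake DDL for a Snowpipe that auto-ingests files
    landing on an external stage into a target table."""
    return (
        f"CREATE PIPE {pipe} AUTO_INGEST = TRUE AS\n"
        f"  COPY INTO {table}\n"
        f"  FROM @{stage}\n"
        f"  FILE_FORMAT = {file_format};"
    )

# Hypothetical pipe feeding Kafka-staged events into a raw table
ddl = create_pipe_ddl("events_pipe", "raw.events", "kafka_stage")
```

With AUTO_INGEST enabled, Snowpipe loads new files as the stage's cloud storage emits event notifications, which is what makes it suitable for the near-real-time Kafka-to-Snowflake path the role describes.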
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Pipeline Monitoring and Optimization: Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
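On the automated data validation point above, the core idea is a set of per-batch quality checks that run between ingestion and transformation. A minimal, framework-free sketch (the field names are hypothetical):

```python
def validate_batch(rows, required_fields):
    """Run simple quality checks on a batch of ingested records.
    Returns a list of human-readable failure messages (empty = passed)."""
    failures = []
    if not rows:
        failures.append("batch is empty")
        return failures
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                failures.append(f"row {i}: missing required field '{field}'")
    return failures

# Hypothetical batch with one bad record
batch = [
    {"order_id": "A1", "amount": 10.5},
    {"order_id": "", "amount": 3.0},
]
problems = validate_batch(batch, required_fields=["order_id", "amount"])
```

In practice such checks would be expressed in a validation framework (e.g. Great Expectations or dbt tests) rather than hand-rolled, but the shape of the logic is the same.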

Experience: 4-6 years (Senior Developer), 6-8 years (Lead)
Responsibilities
- Interact closely with design, product, and development teams to create elegant, usable, responsive and interactive interfaces across multiple devices.
- Turn UI/UX designs into prototypes, create excellent interactions from designs, write reusable content modules, and maintain code quality.
- Implement UI development principles to ensure that the product client-side serves at scale.
- Review and optimize the app usage by monitoring key metrics and rectifying the issues proactively.
- Review and Optimize application usage, by monitoring key metrics, for maximum speed and scalability
- Mentoring and guiding the team members.
- Ability to perform well in a fast-paced environment and maintain an optimal workflow amid rapidly changing design and technology.
Requirements
- 5+ years of relevant work experience as a web developer, UI developer, Angular Developer or front-end engineer.
- Sound knowledge in HTML, CSS & JavaScript.
- Familiar with UI layouts, SASS, Bootstrap, and the CSS GRID system
- Proficient with Typescript (Angular 2 & above)
- Thorough understanding of the responsibilities of the platform, database, API, caching layer, proxies, and other web services used in the system.
- Passionate to create good design and usability
- A team player with excellent communication skills
The ideal candidate is well-versed in Photoshop and experienced in all manner of web-design elements, such as the design and layout of buttons, links, menus, and text.
Will work creatively and directly with User Interface Head to create and deliver quality work in a timely manner.
Will be responsible for creating the look and feel of the public website/marketplace/eCommerce
Creating responsive designs using media queries and Bootstrap.
Web and mobile UI/UX design, wireframe mockups, and prototypes using Photoshop, Illustrator, and online mockup tools. Graphic work such as logo, brochure, magazine ad, poster, icon, and infographic design using Photoshop and Illustrator. Any previous eCommerce/marketplace experience will be an added advantage.
Before a product is released to the public, it goes through a lot of quality assurance testing: the QA lead organizes and manages that testing. In fact, the QA lead is just as responsible for a successful product launch as the engineers or developers involved in that project. This is a skilled profession that requires previous experience in quality assurance, preferably in the same industry as the company posting the job, along with prior management experience. QA leads typically keep full-time hours, but they may experience periods of overtime closer to product rollouts. Candidates who enjoy testing products and working in fast-paced environments are a great fit for this role.


Note: This is an “Equity-linked Founding Member Role”. We are targeting a $100Bn+ Addressable Market (D2C e-commerce) enabling tremendous wealth creation for founding team members.
__________________________________________________________
What’s your favorite chocolate? Something from the house of Cadbury’s, Nestle or Amul? But did you know there are more than 100 Indian D2C Brands that have launched more than 1000 variants of Bars and Chocolates?
In Fact, there are more than 10,000 Indian D2C brands that have launched lakhs of exciting new-age products across Food, Beverage, Beauty and Wellness!
Want to try them for free? That’s a good enough reason to sign up on Blingg here: https://blingg.app.link/install
But how about we enable the discovery of these exciting D2C Brands and their products to millions of Indian consumers…well that’s what we are aiming to do.
About Blingg:
Blingg is India’s first D2C Trials Membership, at just Rs.99, built for urban millennials. Blingg delivers a customized box of products from new D2C brands for you to sample across Food, Beverage, Beauty, Personal Care and Wellness.
We are a team of 6, but our goal is singular: to democratize discovery and user acquisition for D2C brands. We are revolutionizing e-commerce in India, one BlinggBox at a time, and you can hop on board and make it a whole lot better with your passion, skills and experience.
About the Role:
For a product that brings two worlds together, Users and D2C Brands, the need for a seamless, robust and enticing experience is the key. That’s where you and your technical experience and expertise come in. Our hunt is for a techie at heart, who has done this before and has built an app/platform which is both simple and intuitive.
We are revolutionizing e-commerce, one BlinggBox at a time, you can hop on board and make it a whole lot better with your skills and experience.
Technical Requirements:
->3-6 years of proven working experience in core Flutter + Android + iOS development and deep proficiency with MySQL, Linux, and JavaScript or similar languages.
->Hands-on experience with Firebase, AWS, RDBMS, Git
->Relevant backend experience: Node.js, Postgres/MySQL, or algorithms
->Experience with third-party libraries and APIs
->Hands-on engineering experience with modern software stacks, cloud-based platforms, and high-stakes infrastructure.
-> Previous experience of work in early-stage startups is highly preferred.
-> Demonstrated ability to work independently and make decisions with minimal supervision.
What Will You Do?
->Taking responsibility for the overall planning, execution, and success of product releases.
->Taking responsibility for the complete architecture of the product
->Driving the adoption of best practices & leading code reviews, design reviews, and architecture
->Experiment with new & relevant technologies and tools, and drive adoption while measuring yourself on the impact you can create
->Implementation of long-term technology vision for Blingg
What’s in it for you?
- Once in a lifetime opportunity to Build, Manage, Lead and Blitzscale D2C Commerce
- We are targeting a $100Bn+ Addressable Market (D2C e-commerce) enabling tremendous wealth creation for founding team members.
- Super Attractive ESOPs+ Salary
Still confused about whether you should apply for this role? Watch this video and decide for yourself: https://www.youtube.com/watch?v=um8tRxuD9po
- Create and lead innovative programs, software, and analytics that drive improvements to the availability, scalability, efficiency, and latency of the application
- Work with Directors to define and execute technical strategy and arbitrate technical processes and decisions.
- Work with development teams to guide future technology choices and foster cross-team collaboration
- Work across all teams to help define and clarify requirements, explore technical feasibility, and help define product directions and plans.
- Work closely with product & engineering leaders and tech leads to turn requirements into actionable plans that teams can understand.
- Define, Refine & Develop POCs in quick turnaround time and demonstrate the same to all stakeholders effortlessly
- Embed with teams from time to time to write code, provide technical guidance, etc.
- Very deep knowledge of the entire tech stack. Excellent understanding of the entire SDLC.
- Provides a multiplier effect in getting stuff done.
- Acts as SME for the team or product area within the Engineering organization and for cross-functional organizations as well.
- Teaches others why new features are important. Good understanding of customer use cases.
- A Bachelor's degree in any technical discipline
- Minimum 5 years of experience administering AEM applications
- Build management using Bamboo/Jenkins or relevant tech
- Configuration and Release management
- Design and build enablement
- Good communication skills
- Strong development experience in Core Java, J2EE, Spring Boot, Oracle SQL/PLSQL, app servers like WebLogic and JBoss, and Unix
- Good Knowledge of SOAP and REST API
- Should have knowledge of SOLID principles & design patterns
- Should have working experience in UI technologies like JSF, JSP, HTML, CSS, and JavaScript/jQuery
- Experience with full-lifecycle development (i.e. design, coding, testing, debugging, etc.)
- Working experience fixing common vulnerabilities and security threats in SOA/Microservices applications
- Knowledge of OWASP standards and working knowledge of fixing security issues, data encryption, and cryptography
- Prior experience and knowledge of security tools like Fortify, Sonatype, and WebInspect is a plus
- Good to have knowledge and working experience in AngularJS
- Translate business requirements into detailed specs/designs
- Design thinking while arriving at solution
- Strong technical troubleshooting, diagnosing and problem-solving skills
- Ability to work with distributed teams in a collaborative and productive manner
- Solving complex business and workflow issues with solid scalable technical solutions
- Must be a self-motivated, proven performer who enjoys challenging assignments in a high-energy, fast-growing workplace
- Agility and ability to adapt quickly to changing requirements, scope, and priorities
- Good in communication, both written and verbal
- Should have Agile scrum experience.
- Team Player with very good attitude
- Attention to detail and focus on quality
- Knowledge in financial services domain is a plus
- Good to have knowledge on Static AppSec Testing (SAST) and Dynamic AppSec Testing (DAST)

