
• Oversee lifecycle marketing strategies that identify valuable consumers and opportunities, and ensure error-free, continuous implementation
• Pilot new campaigns that drive increased engagement and revenue
• Create and continually refine email strategy to increase acquisition, retention (CLV) and subscription rates
• Manage external agency and multiple vendor relationships related to CRM, personalized consumer marketing, and database management
• Serve as a CRM business intelligence leader, utilizing analytics to drive customer-centric action plans
• Manage the distribution and presentation of CRM reporting and status to executive stakeholders
Digital Activation / DMP
• Develop audience segments by applying CRM learnings
• Create test and learn approach to optimize performance and develop insights on best practices
• Provide support in measurement and analysis of precision media campaigns
Qualifications
• 2–8 years of CRM and/or Direct Marketing experience, preferably in the Retail, CPG, or ecommerce industry
• Knowledge of database marketing strategies, including the use of 1st-, 2nd-, and 3rd-party data
• Creative thinking and exceptional analytical skills; experience conducting CRM campaign analytics and consumer research to inform marketing decisions

Similar jobs
KEY ACCOUNTABILITIES
Pre-production:
Prepare the ideal workplace layout and ensure adherence to the standardised methods and layouts.
Take up discrepancies in the sewing operations with the respective design team / customer for resolution.
Identify potential over-consumption of accessories and estimate the same.
Arrive at the Standard Minute Value (SMV) from first principles using GSD software.
Line Balancing
Coordinate with the head supervisor and allocate SMOs/machines appropriately to the various operations, maintain optimum WIP flow between operations, and clear bottlenecks (a small worked example of the underlying operator-requirement calculation follows).
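For context, line balancing usually starts from the operator requirement implied by the SMV; below is a minimal Python sketch of that standard calculation. All figures are illustrative examples, not values from this document.

# Theoretical operator requirement for a sewing line (illustrative figures only).
smv = 12.5            # Standard Minute Value of the style (minutes per piece)
target_per_day = 400  # required output (pieces per day)
shift_minutes = 480   # working minutes available per operator per day
efficiency = 0.70     # expected line efficiency (70%)

# operators = (output * SMV) / (available minutes * efficiency)
operators = (target_per_day * smv) / (shift_minutes * efficiency)
print(round(operators, 1))  # -> 14.9, i.e. staff roughly 15 operators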
Loading Plan
Give feedback on the loading plan to Shift Managers and the people concerned in planning, and maintain efficiency in the assigned lines as per plan, so as to keep overall factory efficiency on plan.
Style Overlap
Plan and schedule style overlaps in coordination with Group Leaders, and allocate work effectively, so as to have a minimal drop in efficiency during style changes.
Prepare the Work Distribution Plan and the Work Flow Charts, and plan the machine set-up along with the Group, so as to achieve the required throughput time.
Pilot Study for new styles
Conduct pilot studies to identify and document variations against the Article Description (AD), so as to minimise bottlenecks during bulk production.
Submit the First Commission Report, identifying the process bottlenecks (critical operations, sewing methods, work aids), so as to minimise the occurrence of related issues during bulk production.
Training and development
Operator Improvement
Observe, improve and monitor the sewing method of the operators according to the AD, ensure that they follow the correct method and improve their performance, as well as suggest new work aids, so as to improve the overall efficiency of the production floor.
Multi-skilling
Identify operators and train them in coordination with the Group Leaders on various operations (Multi-Skilling), so as to facilitate line balancing.
Facilitate the pilot run for new styles: prepare machine layouts, feeding plans and Standard Work Sheets (STW), avoid wastage before the completion of 50 pieces, and participate in change-over meetings and inform the persons concerned.
Ensure all style analyses are carried out; verify and obtain all relevant information on the styles, including operation sequence, methods, standard minute values (SMV), and machine / attachment requirements.
Review the following reports and forward to relevant personnel to facilitate production:
Daily efficiency report
Hourly production file
Machine requirement reports
Daily Production Start Date (PSD) report
Reports
Prepare various reports, viz. the Work Distribution Plan, Pilot Study Report, Work Flow Charts, Monthly Operational Performance Report, etc., to facilitate decision making.
Perform additional duties commensurate with the current role, as and when requested by management.
MAJOR CHALLENGES
(Describe the major challenges you face in carrying out your job, and what you do in order to overcome them)
1. METHOD FOLLOW UP & SETTING – THROUGH STW AUDIT
1. BE in Production / Mechanical / Industrial / Textile Engineering
2. 3 years' experience on the production floor in the garments industry

About the Company:
Gruve is an innovative Software Services startup dedicated to empowering Enterprise Customers in managing their Data Life Cycle. We specialize in Cyber Security, Customer Experience, Infrastructure, and advanced technologies such as Machine Learning and Artificial Intelligence. Our mission is to assist our customers in their business strategies by utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks.
Why Gruve:
At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you’re passionate about technology and eager to make an impact, we’d love to hear from you.
Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.
Position summary:
We are seeking a Senior Software Development Engineer – Data Engineering with 5-8 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions.
Key Roles & Responsibilities:
- Design, develop, and optimize ETL/ELT pipelines using Apache Spark, PySpark, Databricks, and Snowflake (a minimal illustrative sketch follows this list).
- Implement real-time and batch data processing workflows in cloud environments (AWS, Azure, GCP).
- Develop high-performance, scalable data pipelines for structured, semi-structured, and unstructured data.
- Work with Delta Lake and Lakehouse architectures to improve data reliability and efficiency.
- Optimize Snowflake and Databricks performance, including query tuning, caching, partitioning, and cost optimization.
- Implement data governance, security, and compliance best practices.
- Build and maintain data models, transformations, and data marts for analytics and reporting.
- Collaborate with data scientists, analysts, and business teams to define data engineering requirements.
- Automate infrastructure and deployments using Terraform, Airflow, or dbt.
- Monitor and troubleshoot data pipeline failures, performance issues, and bottlenecks.
- Develop and enforce data quality and observability frameworks using Great Expectations, Monte Carlo, or similar tools.
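As a hedged illustration of the pipeline work described above, here is a minimal PySpark batch-ETL sketch: raw semi-structured events are cleaned, de-duplicated, and written as a partitioned Delta table. The bucket paths, column names, and schema are hypothetical, and Delta Lake support is assumed to be configured on the cluster.

from pyspark.sql import SparkSession, functions as F

# Hypothetical batch ETL: raw JSON events -> curated Delta table.
# Paths, table names, and columns are illustrative only.
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/orders/")  # semi-structured input

cleaned = (
    raw.filter(F.col("order_id").isNotNull())              # drop malformed rows
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))    # partition column
       .dropDuplicates(["order_id"])                       # keep re-runs idempotent
)

# Write as a partitioned Delta table (assumes the Delta Lake libraries are installed).
(cleaned.write.format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .save("s3://example-bucket/curated/orders/"))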
Basic Qualifications:
- Bachelor’s or Master’s Degree in Computer Science or Data Science.
- 5–8 years of experience in data engineering, big data processing, and cloud-based data platforms.
- Hands-on expertise in Apache Spark, PySpark, and distributed computing frameworks.
- Strong experience with Snowflake (Warehouses, Streams, Tasks, Snowpipe, Query Optimization).
- Experience in Databricks (Delta Lake, MLflow, SQL Analytics, Photon Engine).
- Proficiency in SQL, Python, or Scala for data transformation and analytics.
- Experience working with data lake architectures and storage formats (Parquet, Avro, ORC, Iceberg).
- Hands-on experience with cloud data services (AWS Redshift, Azure Synapse, Google BigQuery).
- Experience in workflow orchestration tools like Apache Airflow, Prefect, or Dagster.
- Strong understanding of data governance, access control, and encryption strategies.
- Experience with CI/CD for data pipelines using GitOps, Terraform, dbt, or similar technologies.
Preferred Qualifications:
- Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub).
- Experience in BI and analytics tools (Tableau, Power BI, Looker).
- Familiarity with data observability tools (Monte Carlo, Great Expectations).
- Experience with machine learning feature engineering pipelines in Databricks.
- Contributions to open-source data engineering projects.
Integrations SME
- 5 - 7 years of IT experience relevant to this position.
- Mandatory Skills:
- Experience in developing integrations using OIC for Oracle Fusion modules such as Financials / SCM / HCM.
- Good hands-on experience in monitoring and debugging OIC integrations and in migrating OIC components.
- Excellent client-interfacing skills: working with both IT and business stakeholders, and writing technical design documents.
- Able to review requirements and designs with the business, and to provide estimates for various integration enhancements.
- Hands-on experience in data migration/integration methods: SOAP and REST web services, FBDI, BIP, and HDL (see the sketch after this section).
- Hands-on analysis, design, testing, implementation, and post-implementation experience.
- Good to Have:
- Excellent skills in web-service technologies such as XML / XPath / XSLT / SOAP / WSDL / XSD / JSON, and REST technologies.
- Implemented integrations using web-service and technology adapters such as FTP / File / SOAP / REST / DB.
- Hands-on experience with encryption and decryption over FTP.
- Hands-on experience with MuleSoft, Dell Boomi, and SOA.
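To illustrate the REST-based integration skills listed above, here is a minimal Python sketch that pulls records from a REST resource of the kind exposed by Oracle Fusion. The host, resource path, credentials, paging parameter, and response shape are all placeholders, not a documented API contract.

import requests

# Minimal REST integration sketch; every identifier below is a placeholder,
# not a documented Oracle Fusion endpoint.
BASE_URL = "https://fusion-host.example.com"
ENDPOINT = "/example/rest/v1/invoices"          # hypothetical resource path

resp = requests.get(
    BASE_URL + ENDPOINT,
    auth=("integration_user", "change-me"),     # placeholder basic-auth credentials
    params={"limit": 25},                       # hypothetical paging parameter
    timeout=30,
)
resp.raise_for_status()                          # surface HTTP errors for debugging

for item in resp.json().get("items", []):        # assumes a JSON {"items": [...]} payload
    print(item)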

Position: PHP Sr. Software Engineer
Exp: 4-8 Years
Location: Noida, Sector 62
As a PHP developer, you must possess in-depth knowledge of core PHP concepts and object-oriented programming, design patterns such as MVC and MVP, writing queries against popular RDBMSs, and strong knowledge of front-end technologies including HTML, JavaScript, and CSS.
Responsibilities:
Analyzing requirements.
Turning requirements into functionalities using PHP, CakePHP, JavaScript, HTML, & CSS.
Troubleshooting application and code issues.
Updating and altering application features to enhance performance.
PHP Developer Requirements:
A degree in computer science: B.Sc. (IT), BCA, MCA, B.E., or B.Tech.
Knowledge of PHP web frameworks including CakePHP, Laravel, and Symfony.
Strong Knowledge of front-end technologies including CSS, JavaScript, and HTML.
Understanding of object-oriented PHP programming.
Previous experience creating scalable applications.
Proficient with code versioning tools including Git, Mercurial, CVS, and SVN.
Familiarity with SQL/NoSQL databases.
Good problem-solving skills.
Nice to have:
Knowledge of JIRA, GitHub, AWS Services, Smarty Templates

Job Description
- Strong understanding of the Ruby programming language and Rails framework
- Proficiency in JavaScript, including experience with React, Angular, or Vue.
- Database understanding (Postgres / MySQL).
- Previous experience maintaining production applications.
- Focus on writing clear, maintainable, tested code.
Skills Required
- Navigating and understanding a large codebase.
- Experience with Git, continuous integration, and regular deployments.
- Understanding and appreciation of UX and usability.
- Excellent communication skills and the diligence to contribute to the team by performing code reviews.
- Writing tests using RSpec and Capybara.
You will push server-side technologies to the limits, and work with our team of talented engineers to design and build the next generation of our mobile-related applications.
Node.js
Startup experience




