
Senior Manager - Category & Catalog
at a fast-growing SaaS commerce company (permanent; WFH & office)
Responsibilities:
- P&L ownership: Drive and ensure 100% achievement of the targeted sales numbers for the categories.
- Ensure Profitability by negotiating the best category margins, securing marketing support from brands, and focusing on delivering the budgeted top-line and bottom-line.
- Assortment planning: Ensure we have a competitive catalog compared to other players across brands and sub-categories globally.
- Customer Experience: Anticipate and create solutions for customer pain points. Should be able to resolve customer escalations and deliver on the expected NPS for the category.
- Category marketing: Create weekly, monthly, and quarterly calendars; have joint marketing plans signed off with partner brands; and execute the plans in coordination with the internal team.
- Operational efficiency: Work with cross-functional teams such as CS, Finance, and Sales to keep the system running smoothly and to deliver on the required set of parameters measured across each function.
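The NPS target mentioned above has a standard definition: on a 0-10 survey, Net Promoter Score is the percentage of promoters (9-10) minus the percentage of detractors (0-6). A minimal sketch, assuming plain survey scores as input (the example data is hypothetical, not from this role):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 6 promoters, 2 passives, 2 detractors out of 10 responses
print(nps([10, 9, 9, 10, 9, 9, 8, 7, 5, 3]))  # → 40
```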
The following skills are mandatory:
- 5-10 years of experience in global category roles, preferably in digital categories like gift cards, perks, deals, experiences, loans, insurance, merchandise, etc.
- Hands-on in Sales, Operations, Catalog and Category Management.
- Strong analytical aptitude in problem-solving; a multi-tasker, critical thinker, and tactical executor with a proven track record of finding solutions to business problems.
- Good communication and team management skills.
- An engineering degree plus an MBA from a reputed college is preferred.
What can you look for?
A wholesome opportunity in a fast-paced environment that will enable you to juggle between concepts while maintaining the quality of content, interact, share your ideas, and learn plenty while at work. Work with a team of highly talented young professionals and enjoy the benefits of being at Plum.

Desired Competencies (Technical/Behavioral Competency)
Must-Have
Snowflake, DBT, PL/SQL, Azure/AWS (overall knowledge), DB modelling, data warehouse concepts, well versed with Agile delivery, ETL tools (Informatica/ADF)
Good-to-Have
Azure 900/104/204 Certified, Informatica/SSIS/ADF
Responsibility of / Expectations from the Role
1. Assist in the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on Azure.
2. Profound experience in designing and developing data integration solutions using ETL tools such as DBT.
3. Hands-on experience implementing cloud data warehouses using Snowflake and Azure Data Factory.
4. Solid MS SQL Server skills, including reporting experience.
5. Work closely with product managers and engineers to design, implement, test, and continually improve scalable data solutions and services running on DBT and Snowflake cloud platforms.
6. Implement critical and non-critical system data integration and ingestion fixes for the data platform and environment.
7. Ensure root-cause resolution of identified problems.
8. Monitor and support Data Solutions jobs and processes to meet the daily SLA.
9. Analyze the current analytics environment and make recommendations for appropriate data warehouse modernization and migration to the cloud.
10. Develop Snowflake deployment (using Azure DevOps or a similar CI/CD tool) and usage best practices.
11. Follow best practices and standards around data governance, security, and privacy.
12. Comfortable working in a fast-paced team environment coordinating multiple projects.
13. Effective software development life cycle management skills and experience with GitHub.
14. Leverage tools like Fivetran, DBT, Snowflake, and GitHub to drive ETL, data modeling, and analytics.
15. Data transformation and data analytics documentation.
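The ingestion-fix and SLA-monitoring duties above often come down to running data-quality checks on a batch before it is loaded; DBT expresses similar checks declaratively as `not_null` and `unique` tests. A minimal Python sketch of the idea, with hypothetical column names:

```python
def validate_batch(rows, key="id", required=("id", "amount")):
    """Run not-null and uniqueness checks on a batch before loading,
    mirroring the kind of tests DBT runs against warehouse tables."""
    errors = []
    seen = set()
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) is None:
                errors.append(f"row {i}: {col} is null")
        k = row.get(key)
        if k in seen:
            errors.append(f"row {i}: duplicate {key}={k}")
        seen.add(k)
    return errors

batch = [{"id": 1, "amount": 9.5}, {"id": 1, "amount": None}]
print(validate_batch(batch))  # reports a null amount and a duplicate id
```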
Job Title: Lead DevOps Engineer
Experience Required: 4 to 5 years in DevOps or related fields
Employment Type: Full-time
About the Role:
We are seeking a highly skilled and experienced Lead DevOps Engineer. This role will focus on driving the design, implementation, and optimization of our CI/CD pipelines, cloud infrastructure, and operational processes. As a Lead DevOps Engineer, you will play a pivotal role in enhancing the scalability, reliability, and security of our systems while mentoring a team of DevOps engineers to achieve operational excellence.
Key Responsibilities:
Infrastructure Management: Architect, deploy, and maintain scalable, secure, and resilient cloud infrastructure (e.g., AWS, Azure, or GCP).
CI/CD Pipelines: Design and optimize CI/CD pipelines to improve development velocity and deployment quality.
Automation: Automate repetitive tasks and workflows, such as provisioning cloud resources, configuring servers, managing deployments, and implementing infrastructure as code (IaC) using tools like Terraform, CloudFormation, or Ansible.
Monitoring & Logging: Implement robust monitoring, alerting, and logging systems for enterprise and cloud-native environments using tools like Prometheus, Grafana, the ELK Stack, New Relic, or Datadog.
Security: Ensure the infrastructure adheres to security best practices, including vulnerability assessments and incident response processes.
Collaboration: Work closely with development, QA, and IT teams to align DevOps strategies with project goals.
Mentorship: Lead, mentor, and train a team of DevOps engineers to foster growth and technical expertise.
Incident Management: Oversee production system reliability, including root cause analysis and performance tuning.
Required Skills & Qualifications:
Technical Expertise:
Strong proficiency in cloud platforms like AWS, Azure, or GCP.
Advanced knowledge of containerization technologies (e.g., Docker, Kubernetes).
Expertise in IaC tools such as Terraform, CloudFormation, or Pulumi.
Hands-on experience with CI/CD tools, particularly Bitbucket Pipelines, Jenkins, GitLab CI/CD, GitHub Actions, or CircleCI.
Proficiency in scripting languages (e.g., Python, Bash, PowerShell).
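The IaC tooling listed above shares one core idea: diff a desired configuration against current infrastructure and emit a plan of create/update/delete actions. A toy Python sketch of that plan step, with hypothetical resource names (real tools like Terraform do far more, e.g. dependency ordering and state locking):

```python
def plan(desired, current):
    """Compute create/update/delete actions, the way an IaC tool
    diffs desired configuration against current infrastructure."""
    actions = []
    for name, cfg in desired.items():
        if name not in current:
            actions.append(("create", name))
        elif current[name] != cfg:
            actions.append(("update", name))
    for name in current:
        if name not in desired:
            actions.append(("delete", name))
    return actions

desired = {"web": {"size": "m5.large"}, "db": {"size": "db.r5.xlarge"}}
current = {"web": {"size": "m5.small"}, "cache": {"size": "t3.micro"}}
print(plan(desired, current))
# → [('update', 'web'), ('create', 'db'), ('delete', 'cache')]
```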
Soft Skills:
Excellent communication and leadership skills.
Strong analytical and problem-solving abilities.
Proven ability to manage and lead a team effectively.
Experience:
4+ years of experience in DevOps or Site Reliability Engineering (SRE).
4+ years in a leadership or team lead role, with proven experience managing distributed teams, mentoring team members, and driving cross-functional collaboration.
Strong understanding of microservices, APIs, and serverless architectures.
Nice to Have:
Certifications like AWS Certified Solutions Architect, Kubernetes Administrator, or similar.
Experience with GitOps tools such as ArgoCD or Flux.
Knowledge of compliance standards (e.g., GDPR, SOC 2, ISO 27001).
Perks & Benefits:
Competitive salary and performance bonuses.
Comprehensive health insurance for you and your family.
Professional development opportunities and certifications, including sponsored certifications and access to training programs to help you grow your skills and expertise.
Flexible working hours and remote work options.
Collaborative and inclusive work culture.
Join us to build and scale world-class systems that empower innovation and deliver exceptional user experiences.
You can directly contact us: 9316120132
Roles & Responsibilities:
Lead Generation & Prospecting: Identify potential clients via LinkedIn, cold calls, emails, and networking.
Sales & Revenue Growth: Develop sales strategies to meet targets and drive enrollments.
Client Acquisition & Relationship Management: Build strong relationships with students, professionals, and corporate clients.
Pitching & Presentations: Deliver compelling sales presentations and course demos.
Market Research & Competitive Analysis: Stay updated on industry trends and competitor offerings.
Collaboration & Reporting: Coordinate with marketing, operations, and academic teams while maintaining sales reports.
Negotiation & Deal Closure: Handle pricing discussions, close deals, and ensure customer satisfaction.
*Qualifications:*
1. 10+ years of experience, with 3+ years as Database Architect or related role
2. Technical expertise in data schemas, Amazon Redshift, Amazon S3, and Data Lakes
3. Analytical skills in data warehouse design and business intelligence
4. Strong problem-solving and strategic thinking abilities
5. Excellent communication skills
6. Bachelor's degree in Computer Science or related field; Master's degree preferred
*Skills Required:*
1. Database architecture and design
2. Data warehousing and business intelligence
3. Cloud-based data infrastructure (Amazon Redshift, S3, Data Lakes)
4. Data governance and security
5. Analytical and problem-solving skills
6. Strategic thinking and communication
7. Collaboration and team management
2. Prepares work to be accomplished by gathering information and materials.
3. Illustrates concepts by designing rough layouts of art and copy regarding arrangement, size, type size and style, and related aesthetic concepts.
Role Description :
- Good Understanding of Java 8 with proven hands-on skills
- Experience in Spring Framework - Spring Boot, Spring Data, Spring REST
- Experience in Spring Reactive
- Experience in Git, Gradle / Maven
- Practitioner of clean code and SOLID principles
- Able to test drive features
- Can debug code at ease and ensure quality code is produced
- Knowledge of design patterns
- Comfortable with agile practices, user stories and task breakdown
- Understands REST principles and Micro services
- Experienced with technologies such as:
  - Required: Java, Spring Boot
  - Good to have: Maven, Git, Swagger, PCF, RabbitMQ
- Good API skills, with technologies such as REST web services
- Good foundation in data structures, algorithms and OO Design with rock solid programming skills
- Experienced in creating unit tests using JUnit, Mockito, or PowerMock
- Experienced with formats such as JSON and YAML
- Experienced in using quality and security scan tools such as Sonar, Fortify, and WebInspect
- Experienced with agile methodology
- Expertise in designing and implementing enterprise scale database (OLTP) and Data warehouse solutions.
- Hands-on experience in implementing Azure SQL Database, Azure SQL Data Warehouse (Azure Synapse Analytics), and big data processing using Azure Databricks and Azure HDInsight.
- Expert in writing T-SQL programming for complex stored procedures, functions, views and query optimization.
- Should be aware of database development for both on-premises and SaaS applications using SQL Server and PostgreSQL.
- Experience in ETL and ELT implementations using Azure Data Factory V2 and SSIS.
- Experience and expertise in building machine learning models using logistic and linear regression, decision tree, and random forest algorithms.
- PolyBase queries for exporting and importing data into Azure Data Lake.
- Building data models both tabular and multidimensional using SQL Server data tools.
- Writing data preparation, cleaning and processing steps using Python, SCALA, and R.
- Programming experience using python libraries NumPy, Pandas and Matplotlib.
- Implementing NoSQL databases and writing queries using Cypher.
- Designing end user visualizations using Power BI, QlikView and Tableau.
- Experience working with all versions of SQL Server 2005/2008/2008R2/2012/2014/2016/2017/2019
- Experience using the expression languages MDX and DAX.
- Experience in migrating on-premises SQL Server databases to Microsoft Azure.
- Hands on experience in using Azure blob storage, Azure Data Lake Storage Gen1 and Azure Data Lake Storage Gen2.
- Performance tuning complex SQL queries, hands on experience using SQL Extended events.
- Data modeling using Power BI for ad-hoc reporting.
- Raw data load automation using T-SQL and SSIS
- Expert in migrating existing on-premises databases to SQL Azure.
- Experience in using U-SQL for Azure Data Lake Analytics.
- Hands on experience in generating SSRS reports using MDX.
- Experience in designing predictive models using Python and SQL Server.
- Developing machine learning models using Azure Databricks and SQL Server
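Several of the points above mention linear and logistic regression. As a self-contained illustration of the simplest case (pure Python, not tied to Azure Databricks or SQL Server), one-variable ordinary least squares recovers slope and intercept from the usual closed form:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b in one variable."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # slope = covariance(x, y) / variance(x); intercept from the means
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

a, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])  # data lies exactly on y = 2x + 1
print(a, b)  # → 2.0 1.0
```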


Job Role:
- Build pixel-perfect, buttery smooth UIs across both mobile platforms.
- Leverage native APIs for deep integrations with both platforms.
- Diagnose and fix bugs and performance bottlenecks for performance that feels native.
- Maintain and write clean code to ensure the product is of the highest quality.

Required experience and skills
- Bachelor's degree in Computer Science and Engineering. Bonus: Master's degree.
- 3+ years of total full-time work experience, preferably shipping SaaS applications.
- Startup experience strongly desired and comfortable wearing multiple hats.
- Proficiency in:
  1. Ruby, Ruby on Rails, MySQL, Linux, Git, AWS, CI/CD, New Relic.
  2. A broad range of internet technologies and applications.
- Knowledge of frontend technologies and frontend development experience is a plus. We use React and React Native for our web app and mobile apps respectively.
- Keep performance, scalability, availability, and security in mind to build, extend, and maintain the web front end, mobile, and backend components of an evolving, real-time application
- Write bug-free, clean, elegant, testable code that scales well and is delivered on time.
- Very curious and self-driven.
Responsibilities
Your responsibilities include:
- Completing moderately complex projects containing some ambiguity with minimal oversight
- Troubleshooting and resolving most common production issues without assistance; determining basic troubleshooting steps for uncommon production issues and occasionally contributing to the on-call runbook
- Removing most blockers individually and proactively escalating or seeking assistance to become unblocked quickly
About Tophatter
Tophatter is re-imagining discovery commerce in a world increasingly connected by smartphones. We are the world's fastest, most entertaining marketplace for mobile shoppers.
Role and Responsibilities
The candidate for this role will be responsible for enabling a single view of data from multiple sources.
- Work on creating data pipelines to graph database from data lake
- Design graph database
- Write Graph Database queries for front end team to use for visualization
- Enable machine learning algorithms on graph databases
- Guide and enable junior team members
Qualifications and Education Requirements
B.Tech with 2-7 years of experience
Preferred Skills
Must Have
- Hands-on exposure to graph databases like Neo4j, JanusGraph, etc.
- Hands-on exposure to programming and scripting languages like Python and PySpark
- Knowledge of working on cloud platforms like GCP, AWS etc.
- Knowledge of Graph Query languages like CQL, Gremlin etc.
- Knowledge and experience of Machine Learning
Good to Have
- Knowledge of working on Hadoop environment
- Knowledge of graph algorithms
- Ability to work on tight deadlines
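The graph query languages listed above (Cypher, Gremlin) ultimately express traversals over nodes and edges. A minimal Python sketch of a breadth-first traversal over an adjacency map shows the kind of result a variable-length `MATCH` query returns; the sample graph and names are purely illustrative:

```python
from collections import deque

def neighbors_within(graph, start, depth):
    """Return all nodes reachable from start within `depth` hops,
    similar to what a Cypher variable-length MATCH pattern yields."""
    seen = {start}
    frontier = deque([(start, 0)])
    found = set()
    while frontier:
        node, d = frontier.popleft()
        if d == depth:
            continue  # do not expand past the hop limit
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                found.add(nxt)
                frontier.append((nxt, d + 1))
    return found

g = {"alice": ["bob"], "bob": ["carol"], "carol": ["dave"]}
print(neighbors_within(g, "alice", 2))  # bob and carol are within 2 hops
```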

