
1. Must be willing to work from Guwahati.
2. Must be from FMCG, FMCD, Manufacturing, or Plant-based setups ONLY.
3. Must have 15+ years in core finance, accounts & controllership.
4. Must be a Chartered Accountant (CA), Cost Accountant (ICWA), or hold an MBA in Finance.
5. Must be proficient in SAP (preferably SAP HANA) and MIS/financial reporting tools.
6. Must have experience in P&L ownership, plant finance, sales accounting, and regional controllership.
7. Must be open to travel across regional plants and operational locations as required by the role.

Similar jobs
Role: Quality Analyst
Responsibilities:
Reviewing software requirements and preparing test scenarios
Executing tests on software usability
Conducting unit and integration testing, troubleshooting, and managing bugs for the modules
Preparing reports on all aspects of the software testing carried out on a module and reporting to the technical team
Minimum Qualification:
4-year (Full-time) Bachelor’s Degree in IT/CSE/Electronics & Communication / Computer Applications or equivalent OR MCA from any university/ Professional Institution recognized by Government
6+ years of relevant experience including experience in manual or automation testing
Ability to multi-task and a strong results-oriented approach
Good communication, project execution & planning skills
Excellent interpersonal and team skills and the ability to work in a multi-cultural environment.
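The unit-testing responsibility above can be sketched in pytest style. The `parse_price` function and its expected behaviour are hypothetical, purely to illustrate the kind of test a Quality Analyst would write and run:

```python
# Illustrative only: `parse_price` is a hypothetical function under test,
# standing in for a module a QA analyst would cover with unit tests.

def parse_price(text: str) -> float:
    """Convert a price string like 'Rs. 1,499.50' to a float."""
    cleaned = text.replace("Rs.", "").replace(",", "").strip()
    return float(cleaned)

def test_parse_price_strips_currency_and_commas():
    assert parse_price("Rs. 1,499.50") == 1499.50

def test_parse_price_plain_number():
    assert parse_price("250") == 250.0

if __name__ == "__main__":
    # pytest would discover these automatically; calling them directly
    # keeps the sketch runnable without any test framework installed.
    test_parse_price_strips_currency_and_commas()
    test_parse_price_plain_number()
    print("all tests passed")
```

Each failing assertion would then feed into the bug-management workflow described above.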
- Looking for a senior resource with at least 5 years of hands-on experience in SAP CPQ.
- Must have experience in SAP CPQ (CallidusCloud), with product knowledge of the following:
  - CPQ product integration
  - Save Configuration
  - Price Object and Delivery Object customization
  - Product feature customization
  - Performance improvement
  - SAP CPQ Quote implementation
- AWS
- Python, IronPython, HTML, CSS
- Hands-on experience in JavaScript, jQuery, Postman, GitHub
- Good to have: experience in scripting languages such as Shell scripting
Key Responsibilities:
- Manage and grow sales across multiple marketplaces such as Amazon and Flipkart, and quick-commerce platforms such as Zepto, Blinkit, Swiggy Instamart, and Dunzo.
- Develop and execute strategies for product listings, pricing, promotions, and inventory management.
- Build strong relationships with account managers across marketplaces and quick commerce platforms.
- Optimize product visibility and performance through SEO, A+ content, paid ads, and promotional campaigns.
- Analyze sales data to forecast demand, identify growth opportunities, and maximize revenue.
- Ensure smooth coordination of operations, from order fulfilment to returns management.
- Coordinate with the internal marketing team for content creation and marketing campaigns.
- Monitor and adapt to evolving marketplace algorithms, policies, and competitor activity.
Key Skills:
- Strong understanding of e-commerce and quick commerce ecosystems.
- Proven experience in managing multi-channel marketplace operations.
- Expertise in tools and strategies for marketplace sales optimization.
- Analytical mindset with proficiency in sales forecasting and performance tracking.
- Excellent negotiation and communication skills.
- Prior experience managing marketplaces in a D2C business is required. Experience in the nutraceuticals, wellness, or FMCG sector is a plus.

Job Description: Full Stack Developer – RattanIndia Technologies Private Limited
Based at: New Delhi
Overview:
Required: a Full Stack Developer with 3 to 6 years of experience and competency in developing web applications in the financial services domain on the Salesforce development platform. The developer will be responsible for requirement analysis, design, coding, testing, and implementation of web applications.
Key Accountabilities:
- Working closely with business teams on the analysis to understand the requirements.
- Responsible for the design, coding, testing, and implementation of web applications.
- Responsible for system design and application development of web applications.
- Work closely with product managers and other team members to build features, functionality, and applications.
- Design and implement low-latency, high-availability, high-performance applications.
- Implementation of security and data protection.
- Demonstrate creativity and sound analytical skills in identifying effective approaches to develop solutions independently.
- Deliver features with high quality, on-time as per project plans and delivery commitments.
- Delivering status updates to the management and stakeholders regularly.
Academic Qualification:
- B.Tech in Computer Science/MCA from top-tier institutes.
Skillset Requirements:
- Expertise with Node.js and associated technologies such as React and Express.
- Fluent in HTML5, CSS3, and JavaScript
- Experience with SQL (Postgres) data models
- Extensive experience developing and working with REST APIs.
- Experience working in a DevOps environment and using tools like Travis, Jenkins.
- Solid experience with containerization using Docker and Kubernetes.
- Proficiency with source control and team collaboration tools (GitHub, Asana, Jira, Slack)
- Experience with data interchange formats such as XML or JSON.
- Solid Experience with AWS, Heroku and Salesforce.
About RattanIndia Ltd.
For details, visit: www.rattanindia.com
Title: Data Engineer (Azure) (Location: Gurgaon/Hyderabad)
Salary: Competitive as per Industry Standard
We are expanding our Data Engineering Team and hiring passionate professionals with extensive
knowledge and experience in building and managing large enterprise data and analytics platforms. We
are looking for creative individuals with strong programming skills, who can understand complex
business and architectural problems and develop solutions. The individual will work closely with the rest
of our data engineering and data science team in implementing and managing Scalable Smart Data
Lakes, Data Ingestion Platforms, Machine Learning and NLP based Analytics Platforms, Hyper-Scale
Processing Clusters, Data Mining and Search Engines.
What You’ll Need:
- 3+ years of industry experience in creating and managing end-to-end data solutions, optimal data processing pipelines, and architecture dealing with large-volume big data sets of varied data types.
- Proficiency in Python, Linux and shell scripting.
- Strong knowledge of working with PySpark dataframes, Pandas dataframes for writing efficient pre-processing and other data manipulation tasks.
- Strong experience in developing the infrastructure required for data ingestion and the optimal extraction, transformation, and loading of data from a wide variety of data sources, using tools like Azure Data Factory and Azure Databricks (or Jupyter notebooks/Google Colab, or other similar tools).
- Working knowledge of GitHub or other version control tools.
- Experience with creating Restful web services and API platforms.
- Work with data science and infrastructure team members to implement practical machine
learning solutions and pipelines in production.
- Experience with cloud providers like Azure/AWS/GCP.
- Experience with SQL and NoSQL databases: MySQL, Azure Cosmos DB, HBase, MongoDB, Elasticsearch, etc.
- Experience with stream-processing systems such as Spark Streaming and Kafka, and working experience with event-driven architectures.
- Strong analytic skills related to working with unstructured datasets.
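As a minimal sketch of the dataframe pre-processing mentioned above (shown with pandas; the column names, records, and cleaning rules are invented for illustration, and the same steps translate directly to PySpark dataframes):

```python
import pandas as pd

# Hypothetical raw records; in practice these would arrive via an
# ingestion pipeline (e.g. Azure Data Factory feeding Databricks).
raw = pd.DataFrame({
    "user_id": [1, 2, 2, 3],
    "amount": ["10.5", "7.0", "7.0", None],
})

# Typical pre-processing steps: de-duplicate, coerce types, drop bad rows.
clean = (
    raw.drop_duplicates()
       .assign(amount=lambda df: pd.to_numeric(df["amount"], errors="coerce"))
       .dropna(subset=["amount"])
)

print(clean["amount"].sum())  # total over the cleaned rows
```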
Good to have (to filter or prioritize candidates):
- Experience with testing libraries such as pytest for writing unit-tests for the developed code.
- Knowledge of Machine Learning algorithms and libraries would be good to have; implementation experience would be an added advantage.
- Knowledge and experience of data lakes, Docker, and Kubernetes would be good to have.
- Knowledge of Azure Functions, Elasticsearch, etc. would be good to have.
- Experience with model versioning (MLflow) and data versioning will be beneficial.
- Experience with microservices libraries, or with Python libraries such as Flask for hosting ML services and models, would be great.
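The model-hosting idea in the last point can be sketched without any framework. Here a hypothetical `predict` function (a trivial scoring rule standing in for a real model) is wrapped in a JSON handler of the kind a Flask route would dispatch requests to:

```python
import json

# Hypothetical "model": a trivial scoring rule standing in for a real
# ML model that would normally be loaded from a registry (e.g. MLflow).
def predict(features: dict) -> float:
    return 2.0 * features.get("x", 0.0) + 1.0

def handle_request(body: str) -> str:
    """Decode a JSON request body, score it, and encode a JSON response.

    In a Flask app this logic would live inside a route such as
    @app.route("/predict", methods=["POST"]).
    """
    features = json.loads(body)
    return json.dumps({"prediction": predict(features)})

print(handle_request('{"x": 3.0}'))  # {"prediction": 7.0}
```

Keeping the handler free of framework objects like this also makes it easy to unit-test before it is mounted behind a web server.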
- Client Communication & Handling
- Providing demos to clients as required
- Lead Generation
- Conceiving and developing efficient and intuitive marketing strategies
- Coordinating with clients to keep them updated on services from time to time
- This will be purely a field marketing job.
Scala/Akka, microservices, Akka Streams, Play, Kafka, NoSQL. Java/J2EE, Spring, architecture & design, enterprise integration. ML/data analytics, Big Data technologies, Python/R programming, ML algorithms/data models, ML tools (H2O/TensorFlow/PredictionIO), real-time dashboards & batch, cluster compute (Spark/Akka), distributed data platforms (HDFS, Elasticsearch, Splunk, Cassandra), Kafka with cross-DC replication, NoSQL & in-memory DBs.
NAME OF THE ORGANIZATION:
Truminds Software Systems
LOCATION:
Gurgaon
POSITION TITLE
Backend Engineer
YEARS OF EXPERIENCE:
4-7 years
JOB DESCRIPTION:
Good programming experience with Node.js, Express, and promises, having developed production applications using these frameworks.
Should be able to architect and design the backend services without supervision.
Interface and integration with MySQL or MongoDB databases.
Deployment experience on AWS, including an understanding of Nginx and AWS services, would be desirable.
EDUCATIONAL QUALIFICATION:
• B.E./B.Tech. in Computer Science/ IT or MCA would be preferred.
• Excellent oral and written communication skills

