
About the Company
We are a fast-growing D2C FMCG food brand focused on delivering high-quality food products through both owned and third-party e-commerce platforms. The organization operates in a dynamic, customer-first environment and emphasizes innovation, digital expansion, and operational excellence across online marketplaces. With a strong presence across leading e-commerce and quick-commerce platforms, the company is rapidly scaling its digital footprint nationwide.
Roles and Responsibilities
1. Online Catalog Management
- Ensure product catalog hygiene across all e-commerce platforms
- Update product listings, descriptions, pricing, and availability
- Conduct regular audits to identify and resolve listing issues or inconsistencies
2. E-Commerce Platform Operations
- Manage day-to-day operations on platforms such as Amazon, Shopify, BigBasket, Swiggy Instamart, Blinkit, and others
- Coordinate with platform points of contact (POCs) for promotions, campaigns, and operational alignment
- Resolve technical and operational issues in a timely manner
3. Sales and Performance Optimization
- Drive sales growth through data-backed strategies and promotions
- Plan and execute initiatives to improve product visibility and conversion
- Identify and onboard new e-commerce platforms to expand digital reach
- Analyse sales trends and optimize listings for performance
4. Customer Coordination
- Handle customer queries, complaints, and feedback across platforms
- Work closely with customer support teams to enhance the buying experience
- Implement strategies to improve customer satisfaction and online ratings
5. Logistics and Order Fulfilment
- Coordinate with production, warehousing, and logistics teams
- Track and resolve issues related to delays, returns, and cancellations
- Monitor inventory levels to avoid stock-outs or overstocking
6. Online Reputation Management
- Monitor brand reviews and ratings across e-commerce platforms
- Respond to customer feedback professionally and promptly
- Collaborate with marketing teams to maintain a positive brand image
7. Data Analysis and Reporting
- Analyse sales and platform performance using Excel and related tools
- Prepare weekly and monthly performance reports
- Share actionable insights to optimize campaigns and product placement
8. Cross-Functional Collaboration
- Work closely with marketing, production, warehouse, and logistics teams
- Coordinate with digital media teams on platform-specific promotions
- Stay updated on e-commerce trends and best practices
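The reporting workflow in item 7 can be sketched in a few lines. This is a minimal, illustrative example only: it assumes a simple CSV export of per-platform sales, and all column names, platforms, and figures here are invented for the sketch.

```python
import csv
import io
from collections import defaultdict

# Hypothetical CSV export of daily platform sales; columns are assumptions.
SAMPLE = """date,platform,units,revenue
2024-01-01,Amazon,10,2500
2024-01-01,Blinkit,4,900
2024-01-02,Amazon,7,1750
2024-01-02,Swiggy Instamart,5,1200
"""

def weekly_summary(csv_text):
    """Aggregate units and revenue per platform from a CSV export."""
    totals = defaultdict(lambda: {"units": 0, "revenue": 0.0})
    for row in csv.DictReader(io.StringIO(csv_text)):
        t = totals[row["platform"]]
        t["units"] += int(row["units"])
        t["revenue"] += float(row["revenue"])
    # Rank platforms by revenue so the report highlights top performers first.
    return sorted(totals.items(), key=lambda kv: kv[1]["revenue"], reverse=True)

for platform, t in weekly_summary(SAMPLE):
    print(f"{platform}: {t['units']} units, revenue {t['revenue']:.0f}")
```

In practice the same aggregation would usually live in Excel pivot tables or a BI tool, but the logic of the weekly report is the same: group by platform, sum, and rank.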
Desired Candidate Profile
Experience:
- Minimum 1 year of experience in e-commerce operations
- FMCG or D2C background preferred
Platform Expertise:
- Hands-on experience with Amazon, Shopify, Swiggy Instamart, Flipkart, and similar platforms
Education:
- Bachelor’s degree (BBA, BMS, or equivalent preferred)
Job Title: Business Development Intern (0–1 Year Experience)
Location: Gurgaon (On-site)
About the Role:
We are looking for a motivated Business Development Intern to join our team. This role is ideal for freshers who have strong communication skills and basic knowledge of business development or IT sales.
Key Requirements:
- 0–1 years of experience (Freshers welcome)
- Excellent communication skills (verbal and written)
- Basic knowledge of IT sales, lead generation, or B2B business development
Responsibilities:
- Assist in lead generation and client outreach through calls, emails, and LinkedIn
- Support the team in communicating with potential clients
- Learn and contribute to the sales process
Education:
- B.Tech or MBA preferred, but candidates with strong communication skills are encouraged to apply
Role: Mainframe Developer
Skill Set: Mainframe, COBOL, JCL, DB2
Years of exp: 5 to 12 yrs
Location: Pune, Chennai
Mode of work: WFO
Job Description:
A mid-senior-level Mainframe lead with 4 to 8 years of hands-on experience in COBOL programming, JCL, and DB2 coding and processing, to deliver a critical project for one of our biggest clients in the banking domain. The individual should be passionate about technology and experienced in developing and managing cutting-edge technology applications.
Technical Skills:
- Strong hands-on experience in COBOL and JCL, with solid DB2 knowledge.
- Preferably, good exposure to a Mainframe-to-distributed migration project.
- A master of DB2.
- Good analytical and development skills in COBOL programming and JCL.
- Capable of analyzing requirements and developing software per the project-defined software process.
- Develop and peer-review LLDs (initiate/participate in peer reviews).
- Should have good writing and verbal communication skills
Job Title: Tech Lead and SSE – Kafka, Python, and Azure Databricks (Healthcare Data Project)
Experience: 4 to 12 years
Role Overview:
We are looking for a highly skilled Tech Lead with expertise in Kafka, Python, and Azure Databricks (preferred) to drive our healthcare data engineering projects. The ideal candidate will have deep experience in real-time data streaming, cloud-based data platforms, and large-scale data processing. This role requires strong technical leadership, problem-solving abilities, and the ability to collaborate with cross-functional teams.
Key Responsibilities:
- Lead the design, development, and implementation of real-time data pipelines using Kafka, Python, and Azure Databricks.
- Architect scalable data streaming and processing solutions to support healthcare data workflows.
- Develop, optimize, and maintain ETL/ELT pipelines for structured and unstructured healthcare data.
- Ensure data integrity, security, and compliance with healthcare regulations (HIPAA, HITRUST, etc.).
- Collaborate with data engineers, analysts, and business stakeholders to understand requirements and translate them into technical solutions.
- Troubleshoot and optimize Kafka streaming applications, Python scripts, and Databricks workflows.
- Mentor junior engineers, conduct code reviews, and ensure best practices in data engineering.
- Stay updated with the latest cloud technologies, big data frameworks, and industry trends.
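The consumer-side work described above (parse a message, enforce compliance rules, aggregate) can be sketched without a live broker. This is a hedged illustration only: the stream is simulated with an in-memory list rather than a Kafka topic, and the event fields and identifier list are invented for the example, not taken from any real schema.

```python
import json
from collections import Counter

# Assumed direct-identifier fields to strip before analytics (illustrative,
# not a complete HIPAA de-identification list).
PHI_FIELDS = {"patient_name", "ssn", "address"}

def deidentify(message: dict) -> dict:
    """Drop direct identifiers before a record enters downstream storage."""
    return {k: v for k, v in message.items() if k not in PHI_FIELDS}

def process_stream(raw_messages):
    """Simulates consuming a topic: parse JSON, de-identify, count event types."""
    counts = Counter()
    cleaned = []
    for raw in raw_messages:
        event = deidentify(json.loads(raw))
        counts[event["event_type"]] += 1
        cleaned.append(event)
    return cleaned, counts

stream = [
    '{"event_type": "admission", "patient_name": "A. B.", "ward": "ICU"}',
    '{"event_type": "discharge", "patient_name": "C. D.", "ward": "ICU"}',
    '{"event_type": "admission", "patient_name": "E. F.", "ward": "ER"}',
]
cleaned, counts = process_stream(stream)
print(counts["admission"])          # 2
print("patient_name" in cleaned[0])  # False
```

In a real pipeline the loop body would sit inside a Kafka consumer (or a Databricks Structured Streaming job), with the schema enforced via Schema Registry rather than ad-hoc JSON parsing.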
Required Skills & Qualifications:
- 4+ years of experience in data engineering, with strong proficiency in Kafka and Python.
- Expertise in Kafka Streams, Kafka Connect, and Schema Registry for real-time data processing.
- Experience with Azure Databricks (or willingness to learn and adopt it quickly).
- Hands-on experience with cloud platforms (Azure preferred, AWS or GCP is a plus).
- Proficiency in SQL, NoSQL databases, and data modeling for big data processing.
- Knowledge of containerization (Docker, Kubernetes) and CI/CD pipelines for data applications.
- Experience working with healthcare data (EHR, claims, HL7, FHIR, etc.) is a plus.
- Strong analytical skills, problem-solving mindset, and ability to lead complex data projects.
- Excellent communication and stakeholder management skills.
Sr. HR Applications Architect
Job Title – Technical Solution Architect/ Sr HR Applications Architect
Team – GIS
Role Type – Individual Contributor
Key relationships –
HR leads of various verticals
Technology and implementation teams
You will be supported by your peers and experts across many fields who will help you succeed.
Job Responsibilities:
The COMPANY HR-Applications team is looking for a passionate, engaging Sr. HR Applications Technical Architect to join our growing team. This role will perform technology evaluation and identification, solution design, and design execution across the entire HR-applications ecosystem, and will provide technical production support.
Designs, develops, modifies, debugs and evaluates programs for functional areas, including but not limited to finance, human resources, manufacturing and marketing. Analyzes existing programs or formulates logic for new systems, devises logic procedures, prepares flowcharting, performs coding and tests/debugs programs. Develops conversion and system implementation plans. Prepares and obtains approval of system and programming documentation. Recommends changes in development, maintenance and system standards. Trains users in conversion and implementation of system. May be internal or external, client-focused, working in conjunction with Professional Services and outsourcing functions. May include company-wide, web-enabled solutions.
Role Purpose:
Lead design and implementation of the HR systems of the organization mainly in SAP SuccessFactors and Cornerstone on Demand
Interface with business stakeholders, assess feasibility of the requirements and guide the Technology Leads and Implementation teams to align the solution development
Front-run the LMS migration initiative from SAP SuccessFactors to Cornerstone on Demand, ensuring a scalable solution that accommodates future enhancements and adoption across all BUs of the Company
Explore new technologies and practices, be a part of the core team building an HR COE and define the standards and best practices
Act as the SPOC/L3 for current product-support activities and the Learning HR ecosystem
Cross-train teams and drive knowledge transfer across business functions
Qualifications & Experience:
Excellent grasp of one or more HR systems, preferably SuccessFactors and Cornerstone on Demand
Proven experience leading System Integrations, Data Migrations, Implementations, Assessments and Process Improvements on technical stack
10+ years of experience as an HR Enterprise Architect
Knowledge of Learning Management Systems (LMS) is desired
Experience working in a Global Production support model
Looking for a Dot Net Developer (Gurgaon) to join a team of rockstar developers. The candidate should have a minimum of 3 years of experience. There are multiple openings. If you're looking for career growth and a chance to work with the top 0.1% of developers in the industry, this one is for you! You will report to IIT/BITS graduates with 10+ years of development experience and work with F500 companies (our customers).
Company Background - CodeVyasa is a Software Product-Engineering and Development company that helps Early-stage & Mid-Market Product companies with IT Consulting, App Development, and On-demand Tech Resources. Our journey over the last 3 years has been nothing short of a roller-coaster. Along the way, we've won some of the most prestigious awards while driving immense value to our customers & employees. Here's the link to our website (codevyasa.com). To give you a sense of our growth rate, we've added 70+ employees in the last 6 weeks alone and expect another 125+ by the end of Q1 2024.
Requirements:
- Bachelor's degree in Computer Science, Information Technology, or related field (or equivalent experience).
- Minimum of 3 years of experience as a .NET/C# developer.
- Proficiency in .NET Core development.
- Strong knowledge of SQL Server and API development.
- Aptitude for learning new technologies quickly.
- Good problem-solving and analytical skills
Role - Senior FW Engineer
Location - Ahmedabad
Looking for a person to work on our control system firmware for electric vehicles. You will spearhead the development of automotive-quality control system firmware, working with the System Intelligence team and the Hardware team to bring out next-generation subsystems for a product impacting the entire EV market.
Qualifications: B.E./ B.Tech. (Electronics/ Electrical/ Mechatronics/ Mechanical)
Employment: Permanent / Full Time
Location: Ahmedabad
Experience: 3 to 5 years in Firmware development
Job Responsibilities
- Working with motor controller firmware and control systems.
- Working with PFC and PSFB sections of charge controllers.
- Working closely with the System Intelligence team to implement system control algorithms.
- Write maintainable, MISRA compliant, safety compliant code.
- Unit test the system for its requirements.
Eligibility
- 3 to 5 years of experience in firmware development.
- Automotive firmware development experience is a plus.
- Familiar with firmware concepts such as RTOS and bootloaders.
- Experience in working with safety critical systems.
- Experience with MISRA C.
- Experience with unit testing frameworks.
- Familiarity with TDD methodologies.
- Experience with version control tools such as Git and SVN.
About the Company
Blue Sky Analytics is a Climate Tech startup that combines the power of AI & Satellite data to aid in the creation of a global environmental data stack. Our funders include Beenext and Rainmatter. Over the next 12 months, we aim to expand to 10 environmental data-sets spanning water, land, heat, and more!
We are looking for a data scientist to join our growing team. This position will require you to think and act on the geospatial architecture and data needs (specifically geospatial data) of the company. This position is strategic and will also require you to collaborate closely with data engineers, data scientists, software developers and even colleagues from other business functions. Come save the planet with us!
Your Role
Manage: It goes without saying that you will be handling large amounts of image and location datasets. You will develop dataframes and automated pipelines of data from multiple sources. You are expected to know how to visualize them and use machine learning algorithms to make predictions. You will be working across teams to get the job done.
Analyze: You will curate and analyze vast amounts of geospatial datasets like satellite imagery, elevation data, meteorological datasets, openstreetmaps, demographic data, socio-econometric data and topography to extract useful insights about the events happening on our planet.
Develop: You will be required to develop processes and tools to monitor and analyze data and its accuracy. You will develop innovative algorithms which will be useful in tracking global environmental problems like depleting water levels, illegal tree logging, and even tracking of oil-spills.
Demonstrate: Familiarity with geospatial libraries such as GDAL/Rasterio for reading and writing data, and with QGIS for making visualizations. This also extends to using advanced statistical techniques and applying concepts like regression, properties of distributions, and conducting other statistical tests.
Produce: With all the hard work being put into data creation and management, it has to be used! You will be able to produce maps showing (but not limited to) spatial distribution of various kinds of data, including emission statistics and pollution hotspots. In addition, you will produce reports that contain maps, visualizations and other resources developed over the course of managing these datasets.
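The raster analysis mentioned above can be illustrated with a small NDVI computation. This is a sketch on synthetic arrays: a real pipeline would read the red and near-infrared bands from satellite imagery with rasterio (e.g. `dataset.read(band)`), but the band values below are made up for the example.

```python
import numpy as np

# Synthetic red and near-infrared bands standing in for satellite imagery.
red = np.array([[0.1, 0.2], [0.3, 0.4]])
nir = np.array([[0.5, 0.6], [0.3, 0.8]])

def ndvi(nir_band, red_band, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    eps guards against division by zero over no-data pixels.
    """
    return (nir_band - red_band) / (nir_band + red_band + eps)

index = ndvi(nir, red)
print(np.round(index, 3))
# Dense vegetation pushes NDVI toward 1; bare soil and water sit near 0 or below.
```

The same vectorized pattern (band arithmetic on NumPy arrays, then thresholding or classification) underlies many of the indices used to track water levels, tree cover, and burn scars.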
Requirements
These are must have skill-sets that we are looking for:
- Excellent coding skills in Python (including deep familiarity with NumPy, SciPy, pandas).
- Significant experience with git, GitHub, SQL, AWS (S3 and EC2).
- Experience with GIS and familiarity with geospatial libraries such as GDAL and Rasterio for reading/writing data, GIS software such as QGIS for visualisation and querying, and basic machine learning algorithms for making predictions.
- Demonstrable experience implementing efficient neural network models and deploying them in a production environment.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.) and experience with applications.
- Capable of writing clear and lucid reports and demystifying data for the rest of us.
- Be curious and care about the planet!
- Minimum 2 years of demonstrable industry experience working with large and noisy datasets.
Benefits
- Work from anywhere: Work by the beach or from the mountains.
- Open source at heart: We are building a community whose tools you can use, contribute to, and collaborate on.
- Own a slice of the pie: Possibility of becoming an owner by investing in ESOPs.
- Flexible timings: Fit your work around your lifestyle.
- Comprehensive health cover: Health cover for you and your dependents to keep you tension free.
- Work Machine of choice: Buy a device and own it after completing a year at BSA.
- Quarterly Retreats: Yes, there's work, but then there's all the non-work fun, aka the retreat!
- Yearly vacations: Take time off to rest and get ready for the next big assignment by availing paid leave.
- Build a team with skills in ETL, reporting, MDM and ad-hoc analytics support
- Build technical solutions using latest open source and cloud based technologies
- Work closely with offshore senior consultant, onshore team and client's business and IT teams to gather project requirements
- Assist overall project execution from India, starting from project planning, team formation, system design and development, testing, UAT and deployment
- Build demos and POCs in support of business development for new and existing clients
- Prepare project documents and PowerPoint presentations for client communication
- Conduct training sessions to train associates and help shape their growth