

Understand business problems and translate business requirements into technical requirements.
Conduct complex data analysis to ensure data quality and reliability, i.e., make the data talk by extracting, preparing, and transforming it.
Identify, develop and implement statistical techniques and algorithms to address business challenges and add value to the organization.
Gather requirements and communicate findings to stakeholders in the form of a meaningful story.
Build and implement data models using predictive modelling techniques. Interact with clients and provide support for queries and delivery adoption.
Lead and mentor data analysts.
What we are looking for:
Apart from your love for data and ability to code even while sleeping, you would need the following.
Minimum of 2 years of experience in designing and delivering data science solutions.
You should have successful retail/BFSI/FMCG/Manufacturing/QSR projects in your kitty to show off.
Deep understanding of various statistical techniques, mathematical models, and algorithms to start the conversation with the data in hand.
Ability to choose the right model for the data and translate that into a code using R, Python, VBA, SQL, etc.
Bachelor's/Master's degree in Engineering/Technology, an MBA from a Tier-1 B-school, or an MSc in Statistics or Mathematics.

About Ganit Business Solutions
Ganit Inc. is in the business of enhancing the Decision Making Power (DMP) of businesses by offering solutions that lie at the crossroads of discovery-based artificial intelligence, hypothesis-based analytics, and the Internet of Things (IoT).
The company's offerings consist of a functioning product suite and bespoke services. The goal is to integrate these solutions into the core of their clients' decision-making processes as seamlessly as possible. Customers in the FMCG/CPG, Retail, Logistics, Hospitality, Media, Insurance, and Banking sectors are served by Ganit's offices in both India and the United States. The company views data as a strategic resource that can assist other businesses in growing both their top and bottom lines. We build and implement AI and ML solutions that are purpose-built for specific sectors to increase decision velocity and decrease decision risk.

Job Title : Lead Web App Developer – Frontend
Experience : 10+ Years
Location : Bengaluru
Work Mode : Hybrid (3 days in office)
Role Overview :
We are seeking a Lead Frontend Engineer to build and maintain responsive, cloud-based web applications. This role involves developing efficient user interfaces using React.js, collaborating closely with UX and backend teams, and ensuring high-quality code.
Key Responsibilities:
- Frontend Development: Create and maintain web applications using React.js, HTML, CSS, and JavaScript/TypeScript.
- Collaboration: Work with UX designers to deliver excellent user experiences and ensure seamless backend integration.
- Code Quality: Write clean, efficient, and maintainable code while staying updated with modern practices.
- Build & Deployment: Manage build pipelines with Jenkins, deploy to CDNs, and leverage GCP for scalable solutions.
- Technical Expertise: Implement JavaScript frameworks and libraries for web analytics and use Git for version control.
Requirements:
Technical Skills:
- Proficiency in React.js, HTML, CSS, JavaScript/TypeScript, and Node.js frameworks.
- Strong debugging and problem-solving skills.
- Exposure to RDBMS and SQL (preferred).
Soft Skills:
- Excellent communication and collaboration abilities.
- Leadership skills to guide and inspire a team of developers.
- Education: Bachelor's/Master's in Computer Science or related fields.
Notice Period: 30 days preferred
Interview Process:
- Technical Round: Online technical assessment (1 hour).
- Face-to-Face Interview: In-person interview with the team at the Bengaluru office (3 hours).
- Final Round: Online interview with the CEO (30 minutes).
Additional Information:
- Candidates from Bengaluru or willing to relocate are preferred.
- F2F interview attendance on a weekday is mandatory.
- 3+ years of experience in development in Java technology.
- Strong Java basics
- Spring Boot or Spring MVC
- Experience in AWS.
- Hands-on experience with relational databases (SQL queries or Hibernate) + Mongo (JSON parsing)
- Proficient in REST API development
- Messaging queues (RabbitMQ or Kafka)
- Microservices
- Any caching mechanism
- Good at problem solving
Good to Have Skills:
- 3+ years of experience in using Java/J2EE tech stacks
- Good understanding of data structures and algorithms.
- Excellent analytical and problem solving skills.
- Ability to work in a fast-paced internet start-up environment.
- Experience in technical mentorship/coaching is highly desirable.
- Understanding of AI/ML algorithms is a plus.
- 2.5+ years of experience in development in Java technology.
- Strong Java basics
- Spring Boot or Spring MVC
- Experience in AWS.
- Hands-on experience with relational databases (SQL queries or Hibernate) + Mongo (JSON parsing)
- Proficient in REST API development
- Messaging queues (RabbitMQ or Kafka)
- Microservices
- Any caching mechanism
- Good at problem solving

Requirements:
● Understanding our data sets and how to bring them together.
● Working with our engineering team to support custom solutions offered for product development.
● Filling the gap between development, engineering and data ops.
● Creating, maintaining and documenting scripts to support ongoing custom solutions.
● Excellent organizational skills, including attention to precise details
● Strong multitasking skills and ability to work in a fast-paced environment
● 5+ years of experience using Python to develop scripts.
● Know your way around RESTful APIs (able to integrate; publishing not necessary).
● You are familiar with pulling files from and pushing files to SFTP and AWS S3.
● Experience with any Cloud solutions including GCP / AWS / OCI / Azure.
● Familiarity with SQL programming to query and transform data from relational Databases.
● Familiarity with Linux (and a Linux work environment).
● Excellent written and verbal communication skills
● Extracting, transforming, and loading data into internal databases and Hadoop
● Optimizing our new and existing data pipelines for speed and reliability
● Deploying product build and product improvements
● Documenting and managing multiple repositories of code
● Experience with SQL and NoSQL databases (Cassandra, MySQL)
● Hands-on experience in data pipelining and ETL (any of these frameworks/tools: Hadoop, BigQuery, Redshift, Athena)
● Hands-on experience in Airflow
● Understanding of best practices and common coding patterns around storing, partitioning, warehousing, and indexing of data
● Experience in reading data from Kafka topics (both live stream and offline)
● Experience in PySpark and DataFrames
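As a rough illustration of the extract-transform-load work described in these requirements, here is a minimal sketch in plain Python (standard library only; the field names and data are hypothetical, and a real pipeline would pull from SFTP, S3, or Kafka rather than an inline string):

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; in practice this would arrive via SFTP, S3, or Kafka.
RAW_CSV = """sku,units,unit_price
A1,3,9.99
B2,0,4.50
C3,5,2.00
"""

def extract(raw):
    """Extract: parse the raw CSV into a list of dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: cast types, drop zero-unit rows, derive revenue."""
    out = []
    for r in rows:
        units = int(r["units"])
        if units == 0:
            continue  # filter out rows with no sales
        out.append((r["sku"], units, round(units * float(r["unit_price"]), 2)))
    return out

def load(records):
    """Load: write the transformed records into a SQLite table."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE sales (sku TEXT, units INT, revenue REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?, ?)", records)
    return con

con = load(transform(extract(RAW_CSV)))
# Only A1 (revenue 29.97) and C3 (revenue 10.0) survive the transform.
total = con.execute("SELECT SUM(revenue) FROM sales").fetchone()[0]
```

The same three-stage shape carries over to PySpark or Airflow-orchestrated jobs; only the extract sources and load targets change.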
Responsibilities:
You’ll:
● Collaborate across an agile team to continuously design, iterate, and develop big data systems.
● Extract, transform, and load data into internal databases.
● Optimize our new and existing data pipelines for speed and reliability.
● Deploy new products and product improvements.
● Document and manage multiple repositories of code.
The candidate will proactively take initiatives from time to time to enhance design services.
We are a self-organized engineering team with a passion for programming and solving business problems for our customers. We are looking to expand our team's capabilities on the DevOps front and are on the lookout for four DevOps professionals with 4-8 years of relevant hands-on technical experience.
We encourage our team to continuously learn new technologies and apply the learnings in day-to-day work even if the new technologies are not adopted. We strive to continuously improve our DevOps practices and expertise to form a solid backbone for the product, customer relationship, and sales teams, which enables them to add new customers every week to our financing network.
As a DevOps Engineer, you :
- Will work collaboratively with the engineering and customer support teams to deploy and operate our systems.
- Build and maintain tools for deployment, monitoring and operations.
- Help automate and streamline our operations and processes.
- Troubleshoot and resolve issues in our test and production environments.
- Take control of various mandates and change management processes to ensure compliance for various certifications (PCI and ISO 27001 in particular).
- Monitor and optimize the usage of various cloud services.
- Setup and enforce CI/CD processes and practices
Skills required :
- Strong experience with AWS services (EC2, ECS, ELB, S3, SES, to name a few)
- Strong background in Linux/Unix administration and hardening
- Experience with automation using Ansible, Terraform or equivalent
- Experience with continuous integration and continuous deployment tools (Jenkins)
- Experience with container related technologies (docker, lxc, rkt, docker swarm, kubernetes)
- Working understanding of code and script (Python, Perl, Ruby, Java)
- Working understanding of SQL and databases
- Working understanding of version control system (GIT is preferred)
- Managing IT operations, setting up best practices, and tuning them from time to time.
- Ensuring that process overheads do not reduce the productivity and effectiveness of a small team.
- Willingness to explore and learn new technologies and continuously refactor the tools and processes.
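Much of the monitoring and automation work described above comes down to small glue scripts. A purely illustrative sketch in Python (the service name, payload shape, and `/health` endpoint are hypothetical assumptions, not part of any real stack named here):

```python
import json

# Hypothetical payload, as a service's health endpoint might return it.
SAMPLE = json.dumps({
    "service": "payments",
    "checks": {"db": "ok", "cache": "ok", "queue": "degraded"},
})

def should_page(payload, critical=("db", "queue")):
    """Return True if any critical dependency reports a non-'ok' status."""
    checks = json.loads(payload)["checks"]
    return any(checks.get(name) != "ok" for name in critical)

alert = should_page(SAMPLE)  # True here: the queue is degraded
```

In practice a script like this would sit behind a monitoring or CI/CD hook; the point is that a working grasp of scripting, as the list above asks for, lets you encode operational policy in a few testable lines.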
Client Retention & Servicing
Coordinating client requests with back-end teams.
Supervision (Only) of Centre Operations
- Must Have Experience in JavaScript and jQuery
- Immediate joiner required
- Work from office
- Paid leave facility
- Owns the end-to-end implementation of the assigned data processing components/product features, i.e., design, development, deployment, and testing of the data processing components and associated flows, conforming to best coding practices
- Creation and optimization of data engineering pipelines for analytics projects
- Support data and cloud transformation initiatives
- Contribute to our cloud strategy based on prior experience
- Independently work with all stakeholders across the organization to deliver enhanced functionalities
- Create and maintain automated ETL processes with a special focus on data flow, error recovery, and exception handling and reporting
- Gather and understand data requirements, work in the team to achieve high-quality data ingestion, and build systems that can process and transform the data
- Be able to comprehend the application of database indexes and transactions
- Be involved in the design and development of a Big Data predictive analytics SaaS-based customer data platform using object-oriented analysis, design and programming skills, and design patterns
- Implement ETL workflows for data matching, data cleansing, data integration, and management
- Maintain existing data pipelines and develop new data pipelines using big data technologies
- Responsible for leading the effort of continuously improving the reliability, scalability, and stability of microservices and the platform
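One common way to realize the error recovery and exception handling mentioned in these responsibilities is a retry-plus-dead-letter pattern. A minimal sketch in plain Python (the record format and the `int` transform are hypothetical stand-ins for a real parsing step):

```python
import logging

logging.basicConfig(level=logging.WARNING, format="%(levelname)s %(message)s")
log = logging.getLogger("etl")

def process_with_recovery(records, transform, max_retries=2):
    """Apply `transform` to each record; retry failures up to `max_retries`
    times, then route unrecoverable records to a dead-letter list so they
    can be reported instead of silently dropped."""
    loaded, dead_letter = [], []
    for rec in records:
        for attempt in range(max_retries + 1):
            try:
                loaded.append(transform(rec))
                break
            except Exception as exc:
                if attempt == max_retries:
                    log.warning("dead-lettering %r: %s", rec, exc)
                    dead_letter.append((rec, str(exc)))
    return loaded, dead_letter

# Hypothetical input: "x" is malformed and will be dead-lettered.
ok, failed = process_with_recovery(["10", "x", "32"], int)
# ok == [10, 32]; failed holds ("x", ...) with its error message
```

Retries only help with transient faults (here the failure is deterministic, so the retries simply exhaust); the dead-letter list is what makes the pipeline's failures visible for the reporting requirement above.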


As a Full Stack Software Engineer at Clipboard Health, you will help shape the way nurses find jobs and the way healthcare systems hire nurses and other practitioners today. We help nurses work when and where they want, and help healthcare facilities avoid staffing shortages.
We're a Y Combinator backed company (W2017) and are looking for a full-stack engineer. We currently use React, Node, Express, Mongo and MySQL.
What we are looking for:
• 2+ years experience in a software engineering role
• Have written and are comfortable writing apps in React and Node.js
• Ability to ship quickly and with elegant code
• Expertise in at least one language (JavaScript, Python, Ruby) and proficiency in JavaScript
• Bonus if you're product-oriented
• Passion to build a business that solves problems for real people
About Our Benefits:
• Unlimited vacation
• Autonomy and ownership over your work

