CoffeeBeans
Founded: 2017
Type: Products & Services
Size: 20-100
Stage: Bootstrapped

About

CoffeeBeans Consulting is a technology partner dedicated to driving business transformation. With deep expertise in Cloud, Data, MLOps, AI, infrastructure services, application modernization, Blockchain, and Big Data, we help organizations tackle complex challenges and seize growth opportunities in today’s fast-paced digital landscape. We’re more than just a tech service provider; we’re a catalyst for meaningful change.


Tech stack

Java
SQL
NoSQL Databases

Candid answers by the company

What does the company do?
What is the location preference of jobs?

CoffeeBeans Consulting, founded in 2017, is a high-end technology consulting firm that helps businesses build better products and improve delivery quality through a mix of engineering, product, and process expertise. They work across domains to deliver scalable backend systems, data engineering pipelines, and AI-driven solutions, often using modern stacks like Java, Spring Boot, Python, Spark, Snowflake, Azure, and AWS. With a strong focus on clean architecture, performance optimization, and practical problem-solving, CoffeeBeans partners with clients for both internal and external projects—driving meaningful business outcomes through tech excellence.

Company social profiles

LinkedIn · Twitter

Jobs at CoffeeBeans

Posted by Nikita Sinha
Bengaluru (Bangalore), Pune, Hyderabad
5 - 8 yrs
Up to ₹28L / yr (varies)
Apache Spark
Scala
Python

Focus Areas:

  • Build applications and solutions that process and analyze large-scale data.
  • Develop data-driven applications and analytical tools.
  • Implement business logic, algorithms, and backend services.
  • Design and build APIs for secure and efficient data exchange.

Key Responsibilities:

  • Develop and maintain data processing applications using Apache Spark and Hadoop.
  • Write MapReduce jobs and complex data transformation logic.
  • Implement machine learning models and analytics solutions for business use cases.
  • Optimize code for performance and scalability; perform debugging and troubleshooting.
  • Work hands-on with Databricks for data engineering and analysis.
  • Design and manage Airflow DAGs for orchestration and automation.
  • Integrate and maintain CI/CD pipelines (preferably using Jenkins).
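The MapReduce-style transformation logic mentioned above can be pictured with a toy word count in plain Python. This is only a conceptual sketch: the three phases mirror what a Spark or Hadoop job would run at scale, and all function and variable names here are illustrative, not part of any framework API.

```python
from collections import defaultdict

# Conceptual MapReduce word count in plain Python; a real job would
# express the same map/shuffle/reduce phases in Spark or Hadoop.

def map_phase(records):
    # Map: emit (word, 1) pairs for every word in every record.
    for record in records:
        for word in record.split():
            yield (word, 1)

def shuffle_phase(pairs):
    # Shuffle: group values by key, as the framework does between
    # the map and reduce stages.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word.
    return {key: sum(values) for key, values in groups.items()}

records = ["spark and hadoop", "spark at scale"]
counts = reduce_phase(shuffle_phase(map_phase(records)))
print(counts)
```

The same shape (a pure map step, a keyed grouping, an associative reduction) is what makes such logic parallelizable across a cluster.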

Primary Skills & Qualifications:

  • Strong programming skills in Scala and Python.
  • Expertise in Apache Spark for large-scale data processing.
  • Solid understanding of data structures and algorithms.
  • Proven experience in application development and software engineering best practices.
  • Experience working in agile and collaborative environments.


Posted by Nikita Sinha
Bengaluru (Bangalore), Pune
7 - 9 yrs
Up to ₹32L / yr (varies)
Python
ETL
Data modeling
CI/CD
Databricks
+2 more

We are looking for experienced Data Engineers who can independently build, optimize, and manage scalable data pipelines and platforms.

In this role, you’ll:

  • Work closely with clients and internal teams to deliver robust data solutions powering analytics, AI/ML, and operational systems.
  • Mentor junior engineers and bring engineering discipline into our data engagements.

Key Responsibilities

  • Design, build, and optimize large-scale, distributed data pipelines for both batch and streaming use cases.
  • Implement scalable data models, warehouses/lakehouses, and data lakes to support analytics and decision-making.
  • Collaborate with stakeholders to translate business requirements into technical solutions.
  • Drive performance tuning, monitoring, and reliability of data pipelines.
  • Write clean, modular, production-ready code with proper documentation and testing.
  • Contribute to architectural discussions, tool evaluations, and platform setup.
  • Mentor junior engineers and participate in code/design reviews.

Must-Have Skills

  • Strong programming skills in Python and advanced SQL expertise.
  • Deep understanding of ETL/ELT, data modeling (OLTP & OLAP), warehousing, and stream processing.
  • Hands-on with distributed data processing frameworks (Apache Spark, Flink, or similar).
  • Experience with orchestration tools like Airflow (or similar).
  • Familiarity with CI/CD pipelines and Git.
  • Ability to debug, optimize, and scale data pipelines in production.
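The ETL/ELT and SQL expertise listed above can be illustrated with a minimal sketch using only the Python standard library and an in-memory SQLite database. The `orders` table, its columns, and the sample rows are invented for illustration; a production pipeline would target a real warehouse or lakehouse.

```python
import sqlite3

# Minimal ELT sketch: load raw rows first, then transform with SQL
# inside the database (the "T" happens after the "L").
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")

# Extract + Load: raw records land in the table as-is.
raw_rows = [("acme", 120.0), ("acme", 80.0), ("globex", 50.0)]
conn.executemany("INSERT INTO orders VALUES (?, ?)", raw_rows)

# Transform: aggregate in-database with SQL rather than in Python.
totals = dict(conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer"
))
print(totals)
```

Pushing the transformation into the database engine, as here, is the same design choice that warehouse-centric ELT tools make at scale.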

Good to Have

  • Experience with cloud platforms (AWS preferred; GCP/Azure also welcome).
  • Exposure to Databricks, dbt, or similar platforms.
  • Understanding of data governance, quality frameworks, and observability.
  • Certifications (e.g., AWS Data Analytics, Solutions Architect, or Databricks).

Other Expectations

  • Comfortable working in fast-paced, client-facing environments.
  • Strong analytical and problem-solving skills with attention to detail.
  • Ability to adapt across tools, stacks, and business domains.
  • Willingness to travel within India for short/medium-term client engagements, as needed.
Posted by Nikita Sinha
Bengaluru (Bangalore)
5 - 8 yrs
Up to ₹26L / yr (varies)
Spark
Scala
SQL
NoSQL Databases
Microsoft Azure
+2 more

Roles and responsibilities:


- Act as tech lead in one of the feature teams; work alongside the team lead to manage the team with minimal guidance

- Good communication and leadership skills

- Nurture and build next level talent within the team

- Work in collaboration with other vendors and client development team(s)

- Flexible to learn new tech areas

- Lead complete lifecycle of feature - from feature inception to solution, story grooming, delivery, and support features in production

- Build and ensure the controls and processes for continuous delivery of applications, covering all stages of the process and their automation

- Interact with teammates from across the business and be comfortable explaining technical concepts to non-technical audiences

- Create robust, scalable, flexible, and relevant solutions that help transform product and businesses


Must haves: 

- Spark

- Scala

- Postgres (or any SQL DB)

- Elasticsearch (or any NoSQL DB)

- Azure (or experience with another cloud)

- Big data processing


Good to have:

- Golang

- Databricks

- Kubernetes

Posted by Nikita Sinha
Hyderabad
5 - 8 yrs
Up to ₹25L / yr (varies)
DevOps
Kubernetes
Amazon Web Services (AWS)
Microsoft Azure
Google Cloud Platform (GCP)

We are seeking an experienced Lead DevOps Engineer with deep expertise in Kubernetes infrastructure design and implementation. This role requires someone who can architect, build, and manage enterprise-grade Kubernetes clusters from the ground up. You’ll lead modernization initiatives, shape infrastructure strategy, and work with cutting-edge cloud-native technologies.


🚀 Key Responsibilities

Infrastructure Design & Implementation

  • Architect and design enterprise-grade Kubernetes clusters across AWS, Azure, and GCP.
  • Build production-ready Kubernetes infrastructure with HA, scalability, and security best practices.
  • Implement Infrastructure as Code with Terraform, Helm, and GitOps workflows.
  • Set up monitoring, logging, and observability for Kubernetes workloads.
  • Design and execute backup and disaster recovery strategies for containerized applications.

Leadership & Team Management

  • Lead a team of 3–4 DevOps engineers, providing technical mentorship.
  • Drive best practices in containerization, orchestration, and cloud-native development.
  • Collaborate with development teams to optimize deployment strategies.
  • Conduct code reviews and maintain infrastructure quality standards.
  • Build knowledge-sharing culture with documentation and training.

Operational Excellence

  • Manage and scale CI/CD pipelines integrated with Kubernetes.
  • Implement security policies (RBAC, network policies, container scanning).
  • Optimize cluster performance and cost-efficiency.
  • Automate operations to minimize manual interventions.
  • Ensure 99.9% uptime for production workloads.

Strategic Planning

  • Define the infrastructure roadmap aligned with business needs.
  • Evaluate and adopt new cloud-native technologies.
  • Perform capacity planning and cloud cost optimization.
  • Drive risk assessment and mitigation strategies.

🛠 Must-Have Technical Skills

Kubernetes Expertise

  • 6+ years of hands-on Kubernetes experience in production.
  • Deep knowledge of Kubernetes architecture (etcd, API server, scheduler, kubelet).
  • Advanced Kubernetes networking (CNI, Ingress, Service mesh).
  • Strong grasp of Kubernetes storage (CSI, PVs, StorageClasses).
  • Experience with Operators and Custom Resource Definitions (CRDs).

Infrastructure as Code

  • Terraform (advanced proficiency).
  • Helm (developing and managing complex charts).
  • Config management tools (Ansible, Chef, Puppet).
  • GitOps workflows (ArgoCD, Flux).

Cloud Platforms

  • Hands-on experience with at least 2 of the following:
    • AWS: EKS, EC2, VPC, IAM, CloudFormation
    • Azure: AKS, VNets, ARM templates
    • GCP: GKE, Compute Engine, Deployment Manager

CI/CD & DevOps Tools

  • Jenkins, GitLab CI, GitHub Actions, Azure DevOps
  • Docker (advanced optimization and security practices)
  • Container registries (ECR, ACR, GCR, Docker Hub)
  • Strong Git workflows and branching strategies

Monitoring & Observability

  • Prometheus & Grafana (metrics and dashboards)
  • ELK/EFK stack (centralized logging)
  • Jaeger/Zipkin (tracing)
  • AlertManager (intelligent alerting)

💡 Good-to-Have Skills

  • Service Mesh (Istio, Linkerd, Consul)
  • Serverless (Knative, OpenFaaS, AWS Lambda)
  • Running databases in Kubernetes (Postgres, MongoDB operators)
  • ML pipelines (Kubeflow, MLflow)
  • Security tools (Aqua, Twistlock, Falco, OPA)
  • Compliance (SOC2, PCI-DSS, GDPR)
  • Python/Go for automation
  • Advanced Shell scripting (Bash/PowerShell)

🎓 Qualifications

  • Bachelor’s in Computer Science, Engineering, or related field.
  • Certifications (preferred):
    • Certified Kubernetes Administrator (CKA)
    • Certified Kubernetes Application Developer (CKAD)
    • Cloud provider certifications (AWS/Azure/GCP)

Experience

  • 6–7 years of DevOps/Infrastructure engineering.
  • 4+ years of Kubernetes in production.
  • 2+ years in a lead role managing teams.
  • Experience with large-scale distributed systems and microservices.


Posted by Nikita Sinha
Bengaluru (Bangalore), Pune
5 - 7 yrs
Up to ₹22L / yr (varies)
Python
SQL
ETL
Data modeling
Spark
+6 more

Role Overview

We're looking for experienced Data Engineers who can independently design, build, and manage scalable data platforms. You'll work directly with clients and internal teams to develop robust data pipelines that support analytics, AI/ML, and operational systems.

You’ll also play a mentorship role and help establish strong engineering practices across our data projects.

Key Responsibilities

  • Design and develop large-scale, distributed data pipelines (batch and streaming)
  • Implement scalable data models, warehouses/lakehouses, and data lakes
  • Translate business requirements into technical data solutions
  • Optimize data pipelines for performance and reliability
  • Ensure code is clean, modular, tested, and documented
  • Contribute to architecture, tooling decisions, and platform setup
  • Review code/design and mentor junior engineers

Must-Have Skills

  • Strong programming skills in Python and advanced SQL
  • Solid grasp of ETL/ELT, data modeling (OLTP & OLAP), and stream processing
  • Hands-on experience with frameworks like Apache Spark, Flink, etc.
  • Experience with orchestration tools like Airflow
  • Familiarity with CI/CD pipelines and Git
  • Ability to debug and scale data pipelines in production
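The stream-processing skill listed above can be sketched with a toy tumbling-window aggregation in plain Python. This stands in for what Flink or Spark Structured Streaming would do continuously over unbounded input; the window size and sample event times are invented for illustration.

```python
from collections import defaultdict

# Toy tumbling-window count: bucket events into fixed-length windows
# by event time, the core idea behind windowed aggregations in
# Flink or Spark Structured Streaming.

WINDOW_SECONDS = 60

def window_counts(events):
    # events: iterable of (epoch_seconds, key) pairs
    counts = defaultdict(int)
    for ts, key in events:
        # Align each event to the start of its 60-second window.
        window_start = ts - (ts % WINDOW_SECONDS)
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (30, "click"), (75, "click"), (90, "view")]
result = window_counts(events)
print(result)
```

Real engines add the hard parts this sketch omits: out-of-order events, watermarks, and state that survives restarts.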

Preferred Skills

  • Experience with cloud platforms (AWS preferred, GCP or Azure also fine)
  • Exposure to Databricks, dbt, or similar tools
  • Understanding of data governance, quality frameworks, and observability
  • Certifications (e.g., AWS Data Analytics, Solutions Architect, Databricks) are a bonus

What We’re Looking For

  • Problem-solver with strong analytical skills and attention to detail
  • Fast learner who can adapt across tools, tech stacks, and domains
  • Comfortable working in fast-paced, client-facing environments
  • Willingness to travel within India when required

Similar companies


HealthAsyst

http://www.healthasyst.com
Founded: 1999
Type: Services
Size: 100-1000
Stage: Profitable

About the company

Among the multitude of healthcare IT companies, HealthAsyst stands out for its deep expertise in the US healthcare domain, which enables it to offer world-class healthcare IT services and solutions that improve productivity and efficiency.


HealthAsyst® is currently a leading provider of IT services to some of the largest healthcare IT vendors in the United States.


We bring the value of cutting-edge technology through our deep expertise in product engineering, custom software development, testing, large-scale healthcare IT implementation and integrations, on-going maintenance and support, BI & Analytics, and remote monitoring platforms.


As a true partner, we help our customers navigate a complex regulatory landscape, deal with cost pressures, and offer high-quality services. As a healthcare transformation agent, we enable innovation in technology and accelerate problem-solving while delivering unmatched cost benefits to healthcare technology companies.


Since 1999, HealthAsyst IT Services has been delivering technology solutions through a systematic, consultative approach to prestigious healthcare tech companies.

Jobs

4


Certa

https://www.getcerta.com
Founded: 2018
Type: Products & Services
Size: 100-1000
Stage: Raised funding

About the company

Certa’s no-code platform makes it easy to digitize and manage the lifecycle of all your suppliers, partners, and customers. With automated onboarding, contract lifecycle management, and ESG management, Certa eliminates the procurement bottleneck and allows companies to onboard third-parties 3x faster.

Jobs

4


Data Axle

https://data-axle.com/
Founded: 1972
Type: Services
Size: 1000-5000
Stage: Profitable

About the company

Data Axle is a product company that offers various data and technology solutions, including software-as-a-service (SaaS) and data-as-a-service (DaaS). These solutions help businesses manage and leverage data for marketing, sales, and business intelligence.


They are a data-driven marketing solutions provider that helps clients with clean data, lead generation, strategy development, campaign design, and day-to-day execution needs. They solve the problem of inaccurate and incomplete data, enabling businesses to make informed decisions and drive growth. Data Axle operates in various industries, including healthcare, finance, retail, and technology.


About Data Axle:

Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle now has an established strategic global center of excellence in Pune. This center delivers mission-critical data services to its global customers, powered by its proprietary cloud-based technology platform and by leveraging proprietary business and consumer databases.


Data Axle India is recognized as a Great Place to Work!

This prestigious designation is a testament to our collective efforts in fostering an exceptional workplace culture and creating an environment where every team member can thrive.

Jobs

4


Quadron iSolutions Pvt Ltd

http://www.quadronisolutions.com
Founded: 2017
Type: Services
Size: 20-100
Stage: Profitable

About the company

Quadron iSolutions Pvt. Ltd believes in providing best-quality service on a 24x7 basis. We ensure that our clients have end-to-end communication with their customers, yielding the best client satisfaction. Our blend of well-trained, qualified staff provides services at par with industry standards in the most cost-effective manner, and we work as a strategic partner to help our clients streamline their business operations.

Company details: Headquartered in Pune, Maharashtra. Founded in 2017. Privately held, 11-50 employees. Specialties: Business Intelligence, Data Entry, Data Mining, BPO Services, Compliance, Web Development, Digital Marketing, Electrical Design.

Jobs

1


Koolioai

https://www.koolio.ai/
Founded: 2023
Type: Product
Size: 0-20
Stage: Raised funding

About the company

Jobs

2


OIP Insurtech

https://www.oipinsurtech.com/
Founded: 2012
Type: Products & Services
Size: 1000-5000
Stage: Profitable

About the company

OIP Insurtech streamlines insurance operations and optimizes workflows by combining deep industry knowledge with advanced technology. Established in 2012, OIP InsurTech partners with carriers, MGAs, program managers, and TPAs in the US, Canada, and Europe, especially the UK.


With 1,200 professionals serving over 100 clients, we deliver insurance process automation, custom software development, high-quality underwriting services, and skilled tech staff to augment our clients’ teams.


While saving time and money is the immediate win, the real game-changer is giving our clients the freedom to grow their books, run their businesses, and focus on what they love. We’re proud to support them on this journey and make a positive impact on the industry!

Jobs

4


OpenIAM

http://www.openiam.com
Founded: 2008
Type: Product
Size: 20-100
Stage: Bootstrapped

About the company

OpenIAM is a pioneering Identity and Access Management (IAM) solutions provider that has been transforming enterprise security since 2008. Based in New York, this self-funded and profitable company has established itself as an innovator in the IAM space, being the first to introduce a converged architecture stack and fully containerized suite for cloud environments. With a global presence and partnerships with major systems integrators like Thales and Indra, OpenIAM serves mid to large enterprises across various sectors including financial services, healthcare, education, and manufacturing.

Jobs

2


OneSpider Technologies LLP

https://www.onespider.in
Founded: 2021
Type: Products & Services
Size: 0-20
Stage: Bootstrapped

About the company

OneSpider Technologies LLP is a leading provider of software and mobile application solutions for the Pharma and FMCG sector. Our products help distributors and retailers streamline operations.

Jobs

1


Automate Accounts

https://www.automateaccounts.com/
Founded: 2015
Type: Services
Size: 0-20
Stage: Bootstrapped

About the company

Automate Accounts is a technology-driven company dedicated to building intelligent automation solutions that streamline business operations and boost efficiency. We leverage modern platforms and tools to help businesses transform their workflows with cutting-edge solutions.

Jobs

1

Founded: 2022
Type: Product
Size: 20-100
Stage: Raised funding

About the company

Jobs

2

Want to work at CoffeeBeans?
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.