

Koantek
https://koantek.com
Jobs at Koantek
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
The Sr AWS/Azure/GCP Databricks Data Engineer at Koantek will use comprehensive modern data engineering techniques and methods with Advanced Analytics to support business decisions for our clients. Your goal is to support the use of data-driven insights to help our clients achieve business outcomes and objectives. You will collect, aggregate, and analyze structured/unstructured data from multiple internal and external sources and communicate patterns, insights, and trends to decision-makers. You will help design and build data pipelines, data streams, reporting tools, information dashboards, data service APIs, data generators, and other end-user information portals and insight tools. You will be a critical part of the data supply chain, ensuring that stakeholders can access and manipulate data for routine and ad hoc analysis to drive business outcomes using Advanced Analytics. You are expected to function as a productive member of a team, working and communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.

Requirements:
- Strong experience as an AWS/Azure/GCP Data Engineer; must have AWS/Azure/GCP Databricks experience
- Expert proficiency in Spark (Scala and Python)
- Must have data migration experience from on-prem to cloud
- Hands-on experience with Kinesis to process and analyze streaming data, Event/IoT Hubs, and Cosmos DB
- In-depth understanding of Azure/AWS/GCP cloud, data lake, and analytics solutions
- Expert-level hands-on experience designing and developing applications on Databricks
- Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services
- In-depth understanding of Spark architecture, including Spark Streaming, Spark Core, Spark SQL, DataFrames, RDD caching, and Spark MLlib
- Hands-on experience with the industry technology stack for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
- Hands-on knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake
- Good working knowledge of code versioning tools such as Git, Bitbucket, or SVN
- Hands-on experience using Spark SQL with various data sources such as JSON, Parquet, and key-value pairs
- Experience preparing data for data science and machine learning, with exposure to model selection, model lifecycle, hyperparameter tuning, model serving, deep learning, etc.
- Demonstrated experience preparing data and automating and building data pipelines for AI use cases (text, voice, image, IoT data, etc.)
- Programming experience with .NET or Spark/Scala is good to have
- Experience creating tables and partitioning, bucketing, loading, and aggregating data using Spark Scala, Spark SQL, and PySpark
- Knowledge of AWS/Azure/GCP DevOps processes such as CI/CD, as well as Agile tools and processes including Git, Jenkins, Jira, and Confluence
- Working experience with Visual Studio, PowerShell scripting, and ARM templates; able to build ingestion to ADLS and enable a BI layer for analytics
- Strong understanding of data modeling and defining conceptual, logical, and physical data models
- Big data/analytics/information analysis/database management in the cloud; IoT/event-driven/microservices in the cloud; experience with private and public cloud architectures, their pros and cons, and migration considerations
- Ability to stay current with industry standards and technological advancements that enhance data quality and reliability to advance strategic initiatives
- Working knowledge of RESTful APIs, the OAuth2 authorization framework, and security best practices for API gateways
- Guide customers in transforming big data projects, including development and deployment of big data and AI applications
- Guide customers on data engineering best practices; provide proofs of concept, architect solutions, and collaborate when needed
- 2+ years of hands-on experience designing and implementing multi-tenant solutions using AWS/Azure/GCP Databricks for data governance, data pipelines for near-real-time data warehousing, and machine learning solutions
- 5+ years' overall experience in software development, data engineering, or data analytics using Python, PySpark, Scala, Spark, Java, or equivalent technologies; hands-on expertise in Apache Spark (Scala or Python)
- 3+ years of experience in query tuning, performance tuning, troubleshooting, and debugging Spark and other big data solutions
- Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
- Ability to manage competing priorities in a fast-paced environment
- Ability to resolve issues
- Basic experience with or knowledge of agile methodologies

Certifications:
- AWS Certified Solutions Architect - Professional
- Databricks Certified Associate Developer for Apache Spark
- Microsoft Certified: Azure Data Engineer Associate
- Google Cloud Certified - Professional
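The table-creation and partitioning work called out in the requirements can be sketched in a few lines of PySpark. This is a minimal local illustration, not Koantek's actual pipeline: the column names, sample values, and output path are all hypothetical, and on Databricks the SparkSession would be provided by the cluster rather than built by hand.

```python
import tempfile
from pyspark.sql import SparkSession

# A local session for illustration; on Databricks the session is provided.
spark = SparkSession.builder.master("local[1]").appName("partition-demo").getOrCreate()

# Hypothetical sales records; column names are illustrative only.
df = spark.createDataFrame(
    [("2024-01", "US", 100.0), ("2024-01", "IN", 80.0), ("2024-02", "US", 120.0)],
    ["month", "region", "revenue"],
)

# Partition by a low-cardinality column so queries can prune files by month.
path = tempfile.mkdtemp() + "/sales_by_month"
df.write.mode("overwrite").partitionBy("month").parquet(path)

# Query the partitioned data with Spark SQL; the filter touches one partition.
spark.read.parquet(path).createOrReplaceTempView("sales")
jan_total = spark.sql(
    "SELECT SUM(revenue) AS total FROM sales WHERE month = '2024-01'"
).collect()[0]["total"]
```

Partitioning on a date-like column is the common choice here because downstream analytical queries usually filter by time range, letting Spark skip whole directories of files.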
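The RESTful API and OAuth2 requirement can likewise be illustrated with a small sketch using only Python's standard library. The endpoint URL and token below are hypothetical; the point is simply that, per the Bearer token scheme, the access token travels in the `Authorization` header rather than in the URL.

```python
import urllib.request

def build_authorized_request(url: str, token: str) -> urllib.request.Request:
    """Build a GET request carrying an OAuth2 access token as a Bearer credential."""
    req = urllib.request.Request(url, method="GET")
    # Bearer scheme: the gateway validates this header before routing the call.
    req.add_header("Authorization", f"Bearer {token}")
    return req

# Hypothetical gateway endpoint; no network call is made here.
req = build_authorized_request("https://api.example.com/v1/orders", "test-token")
```

Keeping the token out of the query string is a gateway best practice: URLs are routinely logged by proxies and access logs, while headers are easier to keep out of audit trails.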