
The Sr. AWS/Azure/GCP Databricks Data Engineer at Koantek will use comprehensive modern data engineering techniques and methods with Advanced Analytics to support business decisions for our clients. Your goal is to support the use of data-driven insights to help our clients achieve business outcomes and objectives.

You will collect, aggregate, and analyze structured/unstructured data from multiple internal and external sources and communicate patterns, insights, and trends to decision-makers. You will help design and build data pipelines, data streams, reporting tools, information dashboards, data service APIs, data generators, and other end-user information portals and insight tools. You will be a critical part of the data supply chain, ensuring that stakeholders can access and manipulate data for routine and ad hoc analysis to drive business outcomes using Advanced Analytics. You are expected to function as a productive member of a team, working and communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.

Requirements:
Strong experience as an AWS/Azure/GCP Data Engineer; must have AWS/Azure/GCP Databricks experience
Expert proficiency in Spark, Scala, and Python
Must have data migration experience from on-prem to cloud
Hands-on experience using Kinesis to process and analyze streaming data, and with Event/IoT Hubs and Cosmos DB
In-depth understanding of AWS/Azure/GCP cloud, data lake, and analytics solutions
Expert-level hands-on experience designing and developing applications on Databricks
Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services
In-depth understanding of Spark architecture, including Spark Streaming, Spark Core, Spark SQL, DataFrames, RDD caching, and Spark MLlib
Hands-on experience with the technology stack available in the industry for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
Hands-on knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake
Good working knowledge of code versioning tools (such as Git, Bitbucket, or SVN)
Hands-on experience using Spark SQL with various data sources such as JSON, Parquet, and key-value pair formats
Experience preparing data for Data Science and Machine Learning, with exposure to model selection, model lifecycle, hyperparameter tuning, model serving, deep learning, etc.
Demonstrated experience preparing data and automating and building data pipelines for AI use cases (text, voice, image, IoT data, etc.)
Good to have: programming experience with .NET or Spark/Scala
Experience in creating tables, partitioning, bucketing, loading and aggregating data
using Spark Scala, Spark SQL/PySpark
Knowledge of AWS/Azure/GCP DevOps processes like CI/CD as well as Agile tools
and processes including Git, Jenkins, Jira, and Confluence
Working experience with Visual Studio, PowerShell scripting, and ARM templates
Able to build ingestion to ADLS and enable a BI layer for analytics
Strong understanding of data modeling and of defining conceptual, logical, and physical data models
Big data/analytics/information analysis/database management in the cloud
IoT/event-driven/microservices in the cloud; experience with private and public cloud architectures, their pros/cons, and migration considerations
Ability to remain up to date with industry standards and technological advancements that will enhance data quality and reliability to advance strategic initiatives
Working knowledge of RESTful APIs, the OAuth2 authorization framework, and security best practices for API gateways
Guide customers in transforming big data projects, including development and
deployment of big data and AI applications
Guide customers on Data engineering best practices, provide proof of concept, architect solutions and collaborate when needed
2+ years of hands-on experience designing and implementing multi-tenant solutions using AWS/Azure/GCP Databricks for data governance, data pipelines for a near-real-time data warehouse, and machine learning solutions
Overall 5+ years of experience in a software development, data engineering, or data analytics field using Python, PySpark, Scala, Spark, Java, or equivalent technologies
Hands-on expertise in Apache Spark (Scala or Python)
3+ years of experience in query tuning, performance tuning, troubleshooting, and debugging Spark and other big data solutions
Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
Ability to manage competing priorities in a fast-paced environment
Ability to resolve issues
Basic experience with or knowledge of agile methodologies
AWS Certified: Solutions Architect Professional
Databricks Certified Associate Developer for Apache Spark
Microsoft Certified: Azure Data Engineer Associate
Google Cloud Certified: Professional

At BigThinkCode, our technology solves complex problems. We are looking for a talented problem solver with middleware and backend development expertise to join our technology team at Chennai.
Must have skills:
- Programming: Node.js and TypeScript.
- Hands-on experience using middleware, e.g., Apache Kafka, RabbitMQ, or Redis.
- Awareness of asynchronous communication and loose coupling.
- Familiarity with API gateways and microservice design patterns.
- Experience working on distributed systems - Nice to have.
- Event-driven architecture (EDA) microservices project experience - Nice to have.
The job description is below for your reference; if interested, share your profile to connect and discuss.
Company: BigThinkCode Technologies
URL: https://www.bigthinkcode.com/
Role: Senior engineer (backend - Node.js)
Experience required: 3-5 years
Work location: Chennai
Joining time: Immediate - 2 weeks
Work Mode: Work from office (Hybrid)
About the Role:
Scalable Systems & AI Integration - We are seeking a talented Senior Software Engineer to join our team and lead the development of high-performance, distributed systems. This role is ideal for an engineer who thrives in fast-paced environments, enjoys tackling complex data challenges, and is eager to bridge the gap between traditional microservices and modern Agentic AI workflows.
Key Responsibilities:
- Architect & Build: Design and implement highly scalable microservices capable of processing massive data volumes with low latency.
- Event-Driven Systems: Develop robust, asynchronous communication patterns using event-driven architecture to ensure system resilience and decoupling.
- AI Integration: Lead the integration of Large Language Models (LLMs) and Agentic AI workflows into existing enterprise applications to enhance automation and intelligence.
- Database Management: Optimize data storage and retrieval strategies; leverage MongoDB and explore/implement Graph Databases (e.g., Neo4j, Google Spanner) for complex relationship mapping.
- Deployment & Scaling: Containerize applications using Docker and manage deployments within Kubernetes (K8s) environments.
- Collaboration: Work closely with cross-functional teams to adapt to a rapidly evolving tech stack and translate business requirements into technical reality.
Core Requirements:
- Scalable Systems: Proven track record of building and maintaining production-grade systems handling high throughput and large datasets.
- Microservices: Good understanding of microservice design patterns and API gateways.
- Event-Driven Architecture: Strong experience with message brokers (e.g., Kafka, RabbitMQ, or Redis).
- Programming: While we are open to various backgrounds, the Node.js/MongoDB stack is our primary environment.
Why Join Us:
- Collaborative work environment.
- Exposure to modern tools and scalable application architectures.
- Medical cover for employee and eligible dependents.
- Tax beneficial salary structure.
- Comprehensive leave policy
- Competency development training programs.
1) Minimum 4 years of experience as a QA engineer.
2) Skills required: Manual testing, Automation testing, SDLC.
3) Location: Mumbai
SKILLS: UI Development, Angular, JavaScript, HTML, CSS, Monitoring & Management, etc.
Role
We are looking for a Lead UI Engineer to design and build multiple channels for user interaction.
Responsibilities
- Design/Architect and develop core features of the UI/frontend.
- Design/Architect and develop policy framework to provide rich UI for various features of the solution.
Requirements:
- Must have 6+ years of experience working with frontend technologies
- Must have at least 2-3 years of experience with Angular 2 or above
- Must have at least 5+ years of experience with HTML/CSS/JavaScript/TypeScript
- Experience with PrimeNG and Vega is a plus
- Strong background in developing UI for Monitoring and Management systems, dealing with topology, and different telemetry such as metrics, traces and logs
- Familiar with containerization solutions like Docker/Kubernetes etc.
- Familiar with serverless technologies like AWS Lambda.
- B.E/B.Tech/MS degree in Computer Science, or equivalent
Candidates must have a minimum of 2 years of experience in recruitment and selection. Only immediate joiners should apply.
Roles & Responsibilities
1. Build and develop commercial relationships with our key targeted B2B partners (e-rickshaw, battery dealers and distributors).
2. Establish a systematic approach for partner outreach and B2B relationship management.
3. Strategize newer ways to grow current partnerships & form new ones.
4. Understand the current market trends and build & grow relationships accordingly.
5. Negotiate and finalize the partnership deals.
Desired Skills
1. 3+ years of relevant work experience.
2. Successfully built and managed B2B partnerships in the past.
3. Excellent at Opportunity Analysis and Business Planning.
If you are someone
1. Who is resourceful and able to effectively negotiate and close B2B deals, and who enjoys traveling and visiting B2B partners.
2. Who understands internal dynamics and partner profiles to create a win-win situation.
3. Who is energetic, with a hunger to learn.
4. Who is comfortable getting their hands dirty and is high on execution.
This Opportunity is for YOU!
Roles and Responsibilities :
Key Skills: Responsible for creating successful campaigns and generating ROI for the company; should have great knowledge of how ads work and of landing pages.
About Company: Workhours is India's 1st ed-tech company focusing on self-employment. We want to change the way people think about their careers, from how to get a job to how to become self-employed, much as digital payments changed people's behaviour from paying with cash to paying with digital apps.
Job Description: We are looking for a skilled Social Media Marketing executive who excels at creating successful campaigns, takes responsibility for generating ROI for the company, and has great knowledge of how ads work and of landing pages.
Roles and Responsibilities:
- Responsible for creating Successful Campaigns.
- Responsible for generating the ROI of the company.
- Should have great knowledge of how ads work.
- Knowledge of landing pages.
Mandatory Skills:
- Excellent communication skills
- Work Report Submissions
- Team Player
- Able to Handle Multitasking
- Should be Good at Training (Very Very Important)
- Good at Planning Strategies.
- Understand functional and non-functional testing requirements for APIs
- Document functional and non-functional testing scenarios and cases where needed, on one of the test management tools
- Identify or create data for testing
- Automation framework design and implementation according to project structure
- Develop test automation scripts using tools like RestAssured and SoapUI
- Must be able to use all HTTP methods (GET, PUT, POST, DELETE, etc.)
- Validate feedback, response time, and error code
- Validate XML and JSON bodies using a JSON parser
- Raise PR to check in code for test automation scripts
- Conduct peer reviews of test cases and automation scripts
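The validation steps above can be sketched in plain Python with only the standard library; the expected fields, status code, and response-time threshold here are illustrative assumptions, not from the posting, and a real suite would use REST Assured, Postman, or SoapUI as listed below.

```python
# Hypothetical sketch of the checks described above: validate the error code
# (HTTP status), response time, and JSON body using a JSON parser.
import json

MAX_RESPONSE_SECONDS = 2.0  # assumed non-functional requirement


def validate_json_response(raw_body: str, status_code: int, elapsed: float) -> dict:
    """Apply the three checks named above and return the parsed body."""
    assert status_code == 200, f"unexpected error code: {status_code}"
    assert elapsed <= MAX_RESPONSE_SECONDS, f"response too slow: {elapsed:.2f}s"
    body = json.loads(raw_body)  # the parser rejects malformed JSON
    assert "id" in body, "required field 'id' missing from body"
    return body


# Canned response standing in for a live GET/POST call:
parsed = validate_json_response('{"id": 42, "status": "ok"}', 200, 0.31)
print(parsed["status"])  # → ok
```

In an automation framework these assertions would live in test cases driven by the test data identified earlier, with failures surfaced through the CI pipeline.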
Mandatory Skills
- Intermediate to advanced level skill in programming using Java and/or Python
- Intermediate to advanced level skill in using API test automation tools and developing test automation frameworks - RESTAssured, Postman, SoapUI, Karate, Robot
- Intermediate to advanced level skill in working with databases preferably SQL
- Configuration management - Git
- Build management - Maven
- Continuous Integration - Jenkins
- Excellent verbal and written communication skills
To be the right fit, you'll need:








