

JK Technosoft Ltd
https://jktech.com
Jobs at JK Technosoft Ltd
About the Role:
We are looking for a Data Architect with a strong background in data engineering and cloud data platforms. The ideal candidate will design and implement scalable data architectures that power enterprise analytics, AI/ML, and GenAI solutions, ensuring data availability, quality, and governance across the organization.
Key Responsibilities:
Data Architecture & Strategy
- Design & Architecture: Design and implement robust, scalable, and optimized data engineering solutions on the Databricks platform. Architect data pipelines that scale efficiently and reliably.
- Data Pipeline Development: Develop ETL/ELT pipelines leveraging Databricks notebooks, Delta Lake, the Snowflake tech stack, Azure Data Factory, etc.
- Cloud Integration: Work closely with cloud platforms like Azure, AWS, or GCP to integrate Databricks or Snowflake with data storage (e.g., ADLS, S3, etc.), databases, and other services.
- Performance Optimization: Optimize the performance of data workflows by tuning Databricks clusters, improving query performance, and identifying bottlenecks in data processing.
- Collaboration: Collaborate with data scientists, analysts, and business stakeholders to understand business requirements and translate them into scalable data solutions.
- Data Governance & Security: Ensure best practices for data security, governance, and compliance when working with sensitive or large datasets.
- Automation & Monitoring: Automate data pipeline deployments and create monitoring dashboards for ongoing performance checks.
- Continuous Improvement: Stay up to date with the latest Databricks features and Snowflake ecosystem best practices to continuously improve existing systems and processes.
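As a concrete illustration of the pipeline-development responsibility above, here is a minimal pure-Python sketch of an extract/transform/load step following the bronze-to-silver pattern common on Databricks and Delta Lake. The function names and sample records are hypothetical; a production pipeline would express this logic as Spark jobs in Databricks notebooks, orchestrated by a tool such as Azure Data Factory.

```python
# Minimal sketch of an extract -> transform -> load step, illustrating the
# bronze (raw) -> silver (cleaned) layering used on Databricks/Delta Lake.
# All names and sample data here are hypothetical stand-ins.

def extract(raw_records):
    """Bronze layer: ingest records as-is, tagging each with its source."""
    return [dict(rec, _source="orders_api") for rec in raw_records]

def transform(bronze):
    """Silver layer: drop malformed rows and normalise types."""
    silver = []
    for rec in bronze:
        if rec.get("order_id") is None:      # basic data-quality gate
            continue
        silver.append({
            "order_id": int(rec["order_id"]),
            "amount": round(float(rec.get("amount", 0.0)), 2),
            "_source": rec["_source"],
        })
    return silver

def load(silver, table):
    """Append the cleaned rows to a (here: in-memory) target table."""
    table.extend(silver)
    return len(silver)

raw = [{"order_id": "1", "amount": "19.991"}, {"order_id": None}]
target = []
loaded = load(transform(extract(raw)), target)
print(loaded, target[0]["amount"])  # 1 19.99
```

The same three-stage shape scales up directly: `extract` becomes a cloud-storage read, `transform` a Spark DataFrame operation, and `load` a Delta table write.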
Required Skills & Experience:
- 12+ years of experience in Data Architecture / Data Engineering roles.
- Proven expertise in data modeling, ETL/ELT design, and cloud-based data solutions (AWS Redshift, Snowflake, BigQuery, or Synapse).
- Hands-on experience with data pipeline orchestration tools (Airflow, dbt, Azure Data Factory, etc.).
- Proficiency in Python, SQL, and Spark for data processing and integration.
- Experience with API integrations and data APIs for AI systems.
- Excellent communication and stakeholder management skills.
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
We are looking for a Technical Lead - GenAI with a strong foundation in Python, Data Analytics, Data Science or Data Engineering, system design, and practical experience in building and deploying Agentic Generative AI systems. The ideal candidate is passionate about solving complex problems using LLMs, understands the architecture of modern AI agent frameworks like LangChain/LangGraph, and can deliver scalable, cloud-native back-end services with a GenAI focus.
Key Responsibilities:
- Design and implement robust, scalable back-end systems for GenAI agent-based platforms.
- Work closely with AI researchers and front-end teams to integrate LLMs and agentic workflows into production services.
- Develop and maintain services using Python (FastAPI/Django/Flask), with best practices in modularity and performance.
- Leverage and extend frameworks like LangChain, LangGraph, and similar to orchestrate tool-augmented AI agents.
- Design and deploy systems in Azure Cloud, including usage of serverless functions, Kubernetes, and scalable data services.
- Build and maintain event-driven / streaming architectures using Kafka, Event Hubs, or other messaging frameworks.
- Implement inter-service communication using gRPC and REST.
- Contribute to architectural discussions, especially around distributed systems, data flow, and fault tolerance.
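The agent-orchestration responsibilities above follow a common pattern that frameworks like LangChain and LangGraph implement at scale: the model selects a tool, the tool runs, and the observation is fed back. A minimal sketch of that loop, with a stub standing in for the LLM (the tool names and routing logic here are hypothetical):

```python
# Minimal sketch of a tool-augmented agent loop. The fake "LLM" below
# is a stub that picks a tool by keyword; a real system would make an
# LLM call here and parse its tool-selection response.

def calculator(expr: str) -> str:
    # Toy tool: arithmetic only, with builtins disabled for safety.
    return str(eval(expr, {"__builtins__": {}}))

def echo(text: str) -> str:
    return text.upper()

TOOLS = {"calculator": calculator, "echo": echo}

def fake_llm(query: str) -> tuple[str, str]:
    """Stub standing in for the model's tool-selection step."""
    if any(ch.isdigit() for ch in query):
        return "calculator", query
    return "echo", query

def run_agent(query: str) -> str:
    tool_name, tool_input = fake_llm(query)     # 1. model chooses a tool
    observation = TOOLS[tool_name](tool_input)  # 2. tool is executed
    return f"[{tool_name}] {observation}"       # 3. observation is returned

print(run_agent("2+3"))     # [calculator] 5
print(run_agent("hello"))   # [echo] HELLO
```

In LangGraph terms, each step of this loop would be a graph node, with the routing decision expressed as a conditional edge.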
Required Skills & Qualifications:
- Strong hands-on back-end development experience in Python along with Data Analytics or Data Science.
- Strong track record on platforms like LeetCode or in real-world algorithmic/system problem-solving.
- Deep knowledge of at least one Python web framework (e.g., FastAPI, Flask, Django).
- Solid understanding of LangChain, LangGraph, or equivalent LLM agent orchestration tools.
- 2+ years of hands-on experience in Generative AI systems and LLM-based platforms.
- Proven experience with system architecture, distributed systems, and microservices.
- Strong familiarity with cloud infrastructure (any major provider) and deployment practices.
- Data engineering or analytics expertise is preferred, e.g. Azure Data Factory, Snowflake, Databricks, ETL tools (Talend, Informatica), BI tools (Power BI, Tableau), data modelling, and data warehouse development.
Roles and Responsibilities:
- Design, develop, and maintain the end-to-end MLOps infrastructure from the ground up, leveraging open-source systems across the entire MLOps landscape.
- Create pipelines for data ingestion and transformation, and for building, testing, and deploying machine learning models, as well as monitoring and maintaining the performance of these models in production.
- Manage the MLOps stack, including version control systems, continuous integration and deployment tools, containerization, orchestration, and monitoring systems.
- Ensure that the MLOps stack is scalable, reliable, and secure.
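The experiment-tracking portion of such an MLOps stack can be sketched in a few lines. The `Tracker` class below is a simplified stand-in for the run/params/metrics model that tools like MLflow provide; it is not MLflow's actual API, and a real stack would also version model artifacts and surface metrics to a monitoring dashboard.

```python
# Minimal sketch of experiment tracking, mimicking the run/params/metrics
# model of tools like MLflow. All class and run names are hypothetical.

import time

class Tracker:
    def __init__(self):
        self.runs = []

    def start_run(self, name: str) -> dict:
        # Each run records its parameters and metrics under a named entry.
        run = {"name": name, "start": time.time(), "params": {}, "metrics": {}}
        self.runs.append(run)
        return run

    def log_param(self, run: dict, key: str, value) -> None:
        run["params"][key] = value

    def log_metric(self, run: dict, key: str, value: float) -> None:
        run["metrics"][key] = value

tracker = Tracker()
run = tracker.start_run("baseline-model")
tracker.log_param(run, "learning_rate", 0.01)
tracker.log_metric(run, "val_accuracy", 0.93)
print(run["params"]["learning_rate"], run["metrics"]["val_accuracy"])  # 0.01 0.93
```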
Skills Required:
- 3-6 years of MLOps experience
- Preferably worked in the startup ecosystem
Primary Skills:
- Experience with end-to-end MLOps systems like ClearML, Kubeflow, MLflow, etc.
- Technical expertise in MLOps: Should have a deep understanding of the MLOps landscape and be able to leverage open-source systems to build scalable, reliable, and secure MLOps infrastructure.
- Programming skills: Proficiency in at least one programming language, such as Python, and experience with data science libraries such as TensorFlow, PyTorch, or Scikit-learn.
- DevOps experience: Should have experience with DevOps tools and practices, such as Git, Docker, Kubernetes, and Jenkins.
Secondary Skills:
- Version Control Systems (VCS) tools like Git and Subversion
- Containerization technologies like Docker and Kubernetes
- Cloud Platforms like AWS, Azure, and Google Cloud Platform
- Data Preparation and Management tools like Apache Spark, Apache Hadoop, and SQL databases like PostgreSQL and MySQL
- Machine Learning Frameworks like TensorFlow, PyTorch, and Scikit-learn
- Monitoring and Logging tools like Prometheus, Grafana, and Elasticsearch
- Continuous Integration and Continuous Deployment (CI/CD) tools like Jenkins, GitLab CI, and CircleCI
- Explainability and interpretability tools like LIME and SHAP
Roles and Responsibilities:
- Design and implement scalable web applications and platforms using technologies such as TypeScript, NestJS, Angular, NodeJS, ExpressJS, TypeORM, and Postgres
- Good understanding of web and REST API design patterns
- Experience with AWS technologies such as EKS, ECS, ECR, Fargate, EC2, Lambda, and ALB will be an added advantage
- Hands-on experience with unit-test frameworks like Jest
- Good working knowledge of JIRA, Confluence, and Git
- Basic knowledge of Kubernetes and Terraform for infrastructure as code
- Basic knowledge of Docker and Docker Compose
- Strong understanding of microservices architecture and the ability to implement components independently
- Proven track record of problem-solving skills
- Excellent communication skills
Experience: 8-10 years
Location: NCR
Roles and Responsibilities:
System Analyst
The individual in this role will gather, document, and analyze client functional, transactional, and insurance business requirements across all insurance functions and third-party integrations. The System Analyst will also work within a cross-functional project team to provide business analytical support and leadership from the business side. The individual will play a highly visible, client-facing, consultative role and must be able to offer system solutions that enhance client implementations and transform client workflows and business processes. The individual should also be adept at mapping business functions and attributes to insurance rules and data.
Skills Required:
A successful candidate in this role must:
- Have good hands-on skills in Oracle, PL/SQL, and T-SQL
- Have solid object-oriented knowledge
- Have functional knowledge of P&C (Property & Casualty) insurance
- Bring a roughly 60% technical / 40% functional skills mix
Primary Skills:
- Good Data Mapping knowledge, RDBMS / SQL Knowledge
Secondary Skills:
- Oracle experience is a big plus
Roles and Responsibilities:
- Hands-on experience with Angular, CSS, scripting, and NodeJS/Express
- Experience in responsive web development, automation, CI/CD, GitHub, microservices, and Postgres (or any other RDBMS)
- Experience with AWS (SQS, SNS, Cognito)
- Experience implementing observability/monitoring
- Strong experience with REST APIs