

JK Technosoft Ltd
https://jktech.com
Jobs at JK Technosoft Ltd
About the Role:
We are looking for a Data Architect with a strong background in data engineering & cloud data platforms. The ideal candidate will design and implement scalable data architectures that power enterprise analytics, AI/ML, and GenAI solutions — ensuring data availability, quality, and governance across the organization.
Key Responsibilities:
Data Architecture & Strategy
- Design & Architecture: Design and implement robust, scalable, and optimized data engineering solutions on the Databricks platform. Architect data pipelines that scale efficiently and reliably.
- Data Pipeline Development: Develop ETL/ELT pipelines leveraging Databricks notebooks, Delta Lake, the Snowflake tech stack, Azure Data Factory, etc.
- Cloud Integration: Work closely with cloud platforms like Azure, AWS, or GCP to integrate Databricks or Snowflake with data storage (e.g., ADLS, S3, etc.), databases, and other services.
- Performance Optimization: Optimize the performance of data workflows by tuning Databricks clusters, improving query performance, and identifying bottlenecks in data processing.
- Collaboration: Collaborate with data scientists, analysts, and business stakeholders to understand business requirements and translate them into scalable data solutions.
- Data Governance & Security: Ensure best practices for data security, governance, and compliance when working with sensitive or large datasets.
- Automation & Monitoring: Automate data pipeline deployments and create monitoring dashboards for ongoing performance checks.
- Continuous Improvement: Stay up to date with the latest Databricks features and Snowflake ecosystem best practices to continuously improve existing systems and processes.
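The pipeline-development responsibilities above come down to the classic extract-transform-load pattern. A minimal, dependency-free sketch of that shape (function names and the sample records are purely illustrative; in practice this logic would live in a Databricks notebook writing to Delta Lake or an Azure Data Factory pipeline):

```python
# Illustrative ETL sketch. All names (extract, transform, load, "region")
# are hypothetical stand-ins, not from any specific platform or posting.

def extract(rows):
    """Simulate pulling raw records from a source system."""
    return list(rows)

def transform(rows):
    """Clean and normalize: drop records missing an id, uppercase region."""
    return [
        {**r, "region": r["region"].upper()}
        for r in rows
        if r.get("id") is not None
    ]

def load(rows, sink):
    """Append validated records to a target store (a list stands in for a table)."""
    sink.extend(rows)
    return len(rows)

raw = [
    {"id": 1, "region": "emea"},
    {"id": None, "region": "apac"},  # rejected by transform: missing id
    {"id": 2, "region": "amer"},
]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

The same extract/transform/load boundaries are what get distributed across clusters and orchestrated by the tools the role names; the pattern itself stays this simple.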
Required Skills & Experience:
- 12+ years of experience in Data Architecture / Data Engineering roles.
- Proven expertise in data modeling, ETL/ELT design, and cloud-based data solutions (AWS Redshift, Snowflake, BigQuery, or Synapse).
- Hands-on experience with data pipeline orchestration tools (Airflow, DBT, Azure Data Factory, etc.).
- Proficiency in Python, SQL, and Spark for data processing and integration.
- Experience with API integrations and data APIs for AI systems.
- Excellent communication and stakeholder management skills.
We are looking for a Technical Lead - GenAI with a strong foundation in Python, Data Analytics, Data Science or Data Engineering, system design, and practical experience in building and deploying Agentic Generative AI systems. The ideal candidate is passionate about solving complex problems using LLMs, understands the architecture of modern AI agent frameworks like LangChain/LangGraph, and can deliver scalable, cloud-native back-end services with a GenAI focus.
Key Responsibilities:
- Design and implement robust, scalable back-end systems for GenAI agent-based platforms.
- Work closely with AI researchers and front-end teams to integrate LLMs and agentic workflows into production services.
- Develop and maintain services using Python (FastAPI/Django/Flask), with best practices in modularity and performance.
- Leverage and extend frameworks like LangChain, LangGraph, and similar to orchestrate tool-augmented AI agents.
- Design and deploy systems in Azure Cloud, including usage of serverless functions, Kubernetes, and scalable data services.
- Build and maintain event-driven / streaming architectures using Kafka, Event Hubs, or other messaging frameworks.
- Implement inter-service communication using gRPC and REST.
- Contribute to architectural discussions, especially around distributed systems, data flow, and fault tolerance.
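The agent-orchestration work described above centers on a tool-dispatch loop, which frameworks like LangChain and LangGraph automate. A minimal sketch of one step of that loop (the "LLM" here is a stub heuristic, and the tool names are hypothetical, for illustration only):

```python
# Illustrative agent tool-dispatch sketch. decide() stands in for an LLM's
# tool-selection step; real frameworks route this through a model call.

TOOLS = {
    "calculator": lambda expr: str(eval(expr, {"__builtins__": {}})),
    "echo": lambda text: text,
}

def decide(query):
    """Stand-in for an LLM choosing a tool and its input."""
    if any(ch.isdigit() for ch in query):
        return "calculator", query
    return "echo", query

def run_agent(query):
    """One agent step: route the query to a tool and return its output."""
    tool_name, tool_input = decide(query)
    return TOOLS[tool_name](tool_input)
```

Production agents wrap this loop with state, retries, and observability, but the core contract (decide, dispatch, return) is the same.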
Required Skills & Qualifications:
- Strong hands-on back-end development experience in Python along with Data Analytics or Data Science.
- Strong track record on platforms like LeetCode or in real-world algorithmic/system problem-solving.
- Deep knowledge of at least one Python web framework (e.g., FastAPI, Flask, Django).
- Solid understanding of LangChain, LangGraph, or equivalent LLM agent orchestration tools.
- 2+ years of hands-on experience in Generative AI systems and LLM-based platforms.
- Proven experience with system architecture, distributed systems, and microservices.
- Strong familiarity with cloud infrastructure (any major provider) and deployment practices.
- Data Engineering or Analytics expertise preferred, e.g. Azure Data Factory, Snowflake, Databricks, ETL tools (Talend, Informatica), BI tools (Power BI, Tableau), data modelling, and data warehouse development.
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
Roles and Responsibilities:
- Design, develop, and maintain the end-to-end MLOps infrastructure from the ground up, leveraging open-source systems across the entire MLOps landscape.
- Create pipelines for data ingestion, data transformation, building, testing, and deploying machine learning models, as well as monitoring and maintaining the performance of these models in production.
- Manage the MLOps stack, including version control systems, continuous integration and deployment tools, containerization, orchestration, and monitoring systems.
- Ensure that the MLOps stack is scalable, reliable, and secure.
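The monitoring-and-maintenance responsibility above usually reduces to SLO checks against a deployed model's live metrics. A minimal sketch of such a gate (the thresholds, metric names, and function name are hypothetical; real stacks would encode this as Prometheus alert rules or pipeline checks):

```python
# Illustrative production-model monitoring check. Metric names and
# thresholds are assumptions for the sketch, not from any real system.

def should_alert(metrics, min_accuracy=0.90, max_latency_ms=200):
    """Return the list of SLO breaches for a deployed model's live metrics."""
    reasons = []
    if metrics["accuracy"] < min_accuracy:
        reasons.append("accuracy below threshold")
    if metrics["p95_latency_ms"] > max_latency_ms:
        reasons.append("latency above threshold")
    return reasons

healthy = should_alert({"accuracy": 0.95, "p95_latency_ms": 120})
degraded = should_alert({"accuracy": 0.85, "p95_latency_ms": 250})
```

An empty list means the model is within its SLOs; a non-empty list is what a monitoring system would turn into an alert or an automated rollback.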
Skills Required:
- 3-6 years of MLOps experience
- Preferably worked in the startup ecosystem
Primary Skills:
- Experience with end-to-end MLOps systems like ClearML, Kubeflow, MLflow, etc.
- Technical expertise in MLOps: Should have a deep understanding of the MLOps landscape and be able to leverage open-source systems to build scalable, reliable, and secure MLOps infrastructure.
- Programming skills: Proficient in at least one programming language, such as Python, and have experience with data science libraries, such as TensorFlow, PyTorch, or Scikit-learn.
- DevOps experience: Should have experience with DevOps tools and practices, such as Git, Docker, Kubernetes, and Jenkins.
Secondary Skills:
- Version Control Systems (VCS) tools like Git and Subversion
- Containerization technologies like Docker and Kubernetes
- Cloud Platforms like AWS, Azure, and Google Cloud Platform
- Data Preparation and Management tools like Apache Spark, Apache Hadoop, and SQL databases like PostgreSQL and MySQL
- Machine Learning Frameworks like TensorFlow, PyTorch, and Scikit-learn
- Monitoring and Logging tools like Prometheus, Grafana, and Elasticsearch
- Continuous Integration and Continuous Deployment (CI/CD) tools like Jenkins, GitLab CI, and CircleCI
- Explainability and interpretability tools like LIME and SHAP
Roles and Responsibilities:
· Design and implement scalable web applications and platforms using technologies such as TypeScript, NestJS, Angular, NodeJS, ExpressJS, TypeORM, and Postgres
· Good understanding of web and REST API design patterns
· Experience with AWS technologies such as EKS, ECS, ECR, Fargate, EC2, Lambda, ALB will be an added advantage
· Hands-on experience with unit test frameworks like Jest
· Good working knowledge of JIRA, Confluence, Git
· Basic knowledge of Kubernetes and Terraform for infrastructure as code
· Basic knowledge of Docker and Docker Compose
· Strong understanding of microservices architecture and ability to implement components independently
· Proven track record of problem-solving skills
· Excellent communication skills
Experience – 8–10 years
Location - NCR
Roles and Responsibilities:
System Analyst
The individual in this role will gather, document, and analyze client functional, transactional, and insurance business requirements across all insurance functions and third-party integrations. The System Analyst will also work within a cross-functional project team to provide business analytical support and leadership from the business side. The individual will play a highly visible, client-facing, consultative role, offering system solutions to enhance client implementations and transform client workflows and business processes. The individual should be strong at mapping business functions and attributes to insurance rules and data.
Skills Required:
A successful candidate in this role must have:
- Strong hands-on skills in Oracle, PL/SQL, and T-SQL
- Good object-oriented knowledge
- Functional knowledge of P&C (property and casualty) insurance
- An overall split of roughly 60% technical and 40% functional work
Primary Skills:
- Good Data Mapping knowledge, RDBMS / SQL Knowledge
Secondary Skills:
- Oracle expertise is a big plus
Roles and Responsibilities:
- Hands-on experience with Angular, CSS, scripting, and NodeJS/Express
- Experience in responsive web, automation, CI/CD, GitHub, microservices, and Postgres (or any other RDBMS)
- Experience with AWS (SQS, SNS, Cognito)
- Implementing observability/monitoring
- Strong experience in REST APIs
Similar companies
About the company
CAW Studios is a product development studio. WE BUILD TRUE PRODUCT TEAMS for our clients. Each team is a small, well-balanced group of geeks and a product manager that together produce relevant and high-quality products. We use data to make decisions, bringing big data and analysis to software development. We believe the product development process is broken, as most studios operate like IT services. We operate like a software factory that applies manufacturing principles of product development to software.
About the company
We are a globally awarded UX & UI design studio trusted by brands like Adobe, Kotak, AU Bank, VMware, Fossil, and many more.
We are a team of deep-thinking designers and strategists driven by the impact of design. We believe great design isn't just functional — it's thoughtful, scalable, and emotionally resonant. At Ungrammary, we go beyond crafting interfaces — we design experiences that solve real-world problems and touch millions of users.
Specializing in fintech, healthcare, SaaS, e-commerce, and enterprise tech, we've helped global enterprises and high-growth startups create products that are as intelligent as they are intuitive. With a strong foundation in user experience, user research, and design thinking, we bring clarity to complexity.
If you're ready to be part of a fast-paced, high-impact design team or want to work on world-class products, drop us a line at [email protected] — we’d love to hear from you.
More about us:
- 🏆 iF Design Award 2024, IDA 2022, DNA Paris Design Award 2022, MUSE Creative Gold Award 2021
- Ranked among the Top 20 Global UX Agencies by Clutch
- 70+ brands served across 10+ countries.
- 100+ million user interactions impacted.
- Trusted by global brands like Adobe, Kotak Mahindra Bank, Adani Capital, AU Bank, VMware, Fossil, Brooks, and more.
- Designed products across 15+ industries including fintech, healthcare, SaaS, and e-commerce
About the company
We are an award-winning software agency that specializes in crafting cutting-edge solutions for a variety of industries.
Our expertise spans across a wide range of areas, including Full Stack development, Blockchain Technology, and Game Development.
About the company
Albert Invent is a cutting-edge R&D software company that’s built by scientists, for scientists. Their cloud-based platform unifies lab data, digitises workflows and uses AI/ML to help materials and chemical R&D teams invent faster and smarter. With thousands of researchers in 30+ countries already using the platform, Albert Invent is helping transform how chemistry-led companies go from idea to product.
What sets them apart: built specifically for chemistry and materials science (not generic SaaS), with deep integrations (ELN, LIMS, AI/ML) and enterprise-grade security and compliance.
About the company
SimplyFI is a secure digital marketplace developed to provide both liquidity and efficiency in the import-export supply chain. SimplyFI is a B2B marketplace and one single source for both supply chain finance and trade services, powered by cutting-edge cryptography, which ensures both data privacy and auditability in real time.
SimplyFI is built on enterprise-grade Blockchain Technology to support global trade deals with the infrastructure needed to participate with a highly available and highly scalable network where trade services, which include financial, shipping, and insurance services, quality services, etc., can interoperate and collaborate in a secure way.
We also specialize in delivering end-to-end solutions for the smart connected world through IoT and AI technologies. We uncover new ways of harnessing data and generate insights to create a more holistic and better user experience with our products.
About the company
Shopflo is an enterprise technology company providing a specialized checkout infrastructure platform designed to boost conversion rates for direct-to-consumer (D2C) e-commerce brands. Founded in 2021, it focuses on enhancing the online buying experience through fast, customizable, and secure checkout pages that reduce cart abandonment.
We aim to supercharge conversions for e-commerce websites at checkout by improving user experience, helping build stronger intent and trust during the purchase.
Problem statement -
(1) There is ~70% drop-off at checkout for most independent e-commerce retailers (outside of large marketplaces).
(2) E-commerce cart platforms allow minimal flexibility at checkout, with an experience largely unchanged from the last decade.
(3) Meanwhile, user expectations are now set by consumer platforms such as Swiggy, Amazon, etc.
There is a fundamental unbundling of monolithic shopping cart platforms globally for mid-market and enterprise customers, who are moving towards headless (read: modular) architecture.
Shopflo aims to be the global default for checkout experiences.




