
🚀 We're Hiring: Senior AI Engineer (Customer Facing) | Remote
Are you passionate about building and deploying enterprise-grade AI solutions?
Do you enjoy combining deep technical expertise with customer-facing problem-solving?
We're looking for a Senior AI Engineer to design, deliver, and integrate cutting-edge AI/LLM applications for global enterprise clients.
What You'll Do:
🔹 Partner directly with enterprise customers to understand business requirements & deliver AI solutions
🔹 Architect and integrate intelligent agent systems (LangChain, LangGraph, CrewAI)
🔹 Build LLM pipelines with RAG and client-specific knowledge (see the sketch after this list)
🔹 Collaborate with internal teams to ensure seamless integration
🔹 Champion engineering best practices with production-grade Python code
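For context, here is a minimal sketch of the RAG flow such a pipeline implies: embed a query, retrieve the closest client documents, and ground the model's answer in them. The `embed` stand-in, document list, and prompt format are illustrative assumptions, not any specific product's API.

```python
# Illustrative RAG flow: embed -> retrieve -> grounded prompt.
import hashlib
import numpy as np

def embed(text: str) -> np.ndarray:
    """Stand-in for a real embedding model (normally an API/model call)."""
    seed = int(hashlib.md5(text.encode()).hexdigest(), 16) % (2**32)
    return np.random.default_rng(seed).standard_normal(384)

def retrieve(query: str, docs: list[str], k: int = 3) -> list[str]:
    """Rank documents by cosine similarity to the query embedding."""
    q = embed(query)
    def score(d: str) -> float:
        v = embed(d)
        return float(np.dot(v, q) / (np.linalg.norm(v) * np.linalg.norm(q)))
    return sorted(docs, key=score, reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Grounded prompt; in production this string goes to the LLM client."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQ: {query}"
```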
What We're Looking For:
✔️ 5+ years of hands-on experience in AI/ML engineering or backend systems
✔️ Proven track record with LLMs & intelligent agents
✔️ Strong Python and backend expertise
✔️ Experience with vector databases (Pinecone, Weaviate, FAISS)
✔️ Excellent communication & customer-facing skills
Preferred: Cloud (AWS/Azure/GCP), MLOps knowledge, and startup/AI services experience.
🌍 Remote role | High-impact opportunity | Backed by strong leadership & growth
If this sounds like you (or someone in your network), let's connect!

Similar jobs

Job Summary:
As a Cloud Architect at our organization, you will play a pivotal role in designing, implementing, and maintaining our multi-cloud infrastructure. You will work closely with various teams to ensure our cloud solutions are scalable, secure, and efficient across different cloud providers. Your expertise in multi-cloud strategies, database management, and microservices architecture will be essential to our success.
Key Responsibilities:
- Design and implement scalable, secure, and high-performance cloud architectures across multiple cloud platforms (AWS, Azure, Google Cloud Platform).
- Lead and manage cloud migration projects, ensuring seamless transitions between on-premises and cloud environments.
- Develop and maintain cloud-native solutions leveraging services from various cloud providers.
- Architect and deploy microservices using REST and GraphQL to support our application development needs (see the sketch after this list).
- Collaborate with DevOps and development teams to ensure best practices in continuous integration and deployment (CI/CD).
- Provide guidance on database architecture, including relational and NoSQL databases, ensuring optimal performance and security.
- Implement robust security practices and policies to protect cloud environments and data.
- Design and implement data management strategies, including data governance, data integration, and data security.
- Stay up to date with the latest industry trends and emerging technologies to drive continuous improvement and innovation.
- Troubleshoot and resolve cloud infrastructure issues, ensuring high availability and reliability.
- Optimize cost and performance across different cloud environments.
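As a concrete illustration of the GraphQL microservice work mentioned above, here is a minimal schema sketch using the strawberry-graphql Python library (one of several options); the `Deployment` type and its fields are hypothetical, not part of any stated stack:

```python
# Minimal GraphQL schema sketch with strawberry-graphql.
import strawberry

@strawberry.type
class Deployment:
    name: str
    cloud: str    # e.g., "aws", "azure", "gcp"
    region: str

@strawberry.type
class Query:
    @strawberry.field
    def deployments(self, cloud: str | None = None) -> list[Deployment]:
        # In a real service this would query a database or another API.
        data = [
            Deployment(name="billing-api", cloud="aws", region="us-east-1"),
            Deployment(name="auth-svc", cloud="gcp", region="europe-west1"),
        ]
        return [d for d in data if cloud is None or d.cloud == cloud]

schema = strawberry.Schema(query=Query)

# Quick local check of the schema without running a server:
result = schema.execute_sync('{ deployments(cloud: "aws") { name region } }')
print(result.data)
```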
Qualifications, Experience & Skills Required:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience: 10 - 15 Years
- Proven experience as a Cloud Architect or in a similar role, with a strong focus on multi-cloud environments.
- Expertise in cloud migration projects, both lift-and-shift and greenfield implementations.
- Strong knowledge of cloud-native solutions and microservices architecture.
- Proficiency in using GraphQL for designing and implementing APIs.
- Solid understanding of database technologies, including SQL, NoSQL, and cloud-based database solutions.
- Experience with DevOps practices and tools, including CI/CD pipelines.
- Excellent problem-solving skills and ability to troubleshoot complex issues.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Deep understanding of cloud security practices and data protection regulations (e.g., GDPR, HIPAA).
- Experience with data management, including data governance, data integration, and data security.
Preferred Skills:
- Certifications in multiple cloud platforms (e.g., AWS Certified Solutions Architect, Google Certified Professional Cloud Architect, Microsoft Certified: Azure Solutions Architect).
- Experience with containerization technologies (Docker, Kubernetes).
- Familiarity with cloud cost management and optimization tools.
Position Overview: We are seeking a talented Data Engineer with expertise in Power BI to join our team. The ideal candidate will be responsible for designing and implementing data pipelines, as well as developing insightful visualizations and reports using Power BI. Additionally, the candidate should have strong skills in Python, data analytics, PySpark, and Databricks. This role requires a blend of technical expertise, analytical thinking, and effective communication skills.
Key Responsibilities:
- Design, develop, and maintain data pipelines and architectures using PySpark and Databricks (see the sketch after this list).
- Implement ETL processes to extract, transform, and load data from various sources into data warehouses or data lakes.
- Collaborate with data analysts and business stakeholders to understand data requirements and translate them into actionable insights.
- Develop interactive dashboards, reports, and visualizations using Power BI to communicate key metrics and trends.
- Optimize and tune data pipelines for performance, scalability, and reliability.
- Monitor and troubleshoot data infrastructure to ensure data quality, integrity, and availability.
- Implement security measures and best practices to protect sensitive data.
- Stay updated with emerging technologies and best practices in data engineering and data visualization.
- Document processes, workflows, and configurations to maintain a comprehensive knowledge base.
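To make the pipeline work above concrete, here is a minimal PySpark ETL sketch; the paths, column names, and aggregation are hypothetical placeholders (on Databricks, a `spark` session is provided for you):

```python
# Illustrative extract -> transform -> load step in PySpark.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw source data (path is illustrative).
orders = spark.read.json("/mnt/raw/orders/")

# Transform: derive a reporting-friendly daily aggregate.
daily = (
    orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

# Load: write a partitioned dataset that Power BI can consume downstream.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "/mnt/curated/daily_revenue/"
)
```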
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's degree preferred).
- Proven experience as a Data Engineer with expertise in Power BI, Python, PySpark, and Databricks.
- Strong proficiency in Power BI, including data modeling, DAX calculations, and creating interactive reports and dashboards.
- Solid understanding of data analytics concepts and techniques.
- Experience working with Big Data technologies such as Hadoop, Spark, or Kafka.
- Proficiency in programming languages such as Python and SQL.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud.
- Excellent analytical and problem-solving skills with attention to detail.
- Strong communication and collaboration skills to work effectively with cross-functional teams.
- Ability to work independently and manage multiple tasks simultaneously in a fast-paced environment.
Preferred Qualifications:
- Advanced degree in Computer Science, Engineering, or related field.
- Certifications in Power BI or related technologies.
- Experience with data visualization tools other than Power BI (e.g., Tableau, QlikView).
- Knowledge of machine learning concepts and frameworks.
ABOUT EPISOURCE:
Episource has devoted more than a decade to building solutions for risk adjustment to measure healthcare outcomes. As one of the leading companies in healthcare, we have helped numerous clients optimize their medical records, data, and analytics to enable better documentation of care for patients with chronic diseases.
The backbone of our consistent success has been our obsession with data and technology. At Episource, all of our strategic initiatives start with the question: how can data be "deployed"? Our analytics platforms and data lakes ingest huge quantities of data daily to help our clients deliver services. We have also built our own machine learning and NLP platform to infuse added productivity and efficiency into our workflow. Combined, these build a foundation of tools and practices used by quantitative staff across the company.
What's our poison, you ask? We work with most of the popular frameworks and technologies, like Spark, Airflow, Ansible, Terraform, Docker, and ELK. For machine learning and NLP, we are big fans of keras, spacy, scikit-learn, pandas, and numpy. AWS and serverless platforms help us stitch these together to stay ahead of the curve.
ABOUT THE ROLE:
We're looking to hire someone to help scale Machine Learning and NLP efforts at Episource. You'll work with the team that develops the models powering Episource's product focused on NLP-driven medical coding. Some of the problems include improving our ICD code recommendations, clinical named entity recognition, improving patient health, clinical suspecting, and information extraction from clinical notes.
This is a role for highly technical data engineers who combine outstanding oral and written communication skills with the ability to code up prototypes and productionize them using a wide range of tools, algorithms, and languages. Most importantly, they need the ability to autonomously plan and organize their work assignments based on high-level team goals.
You will be responsible for setting an agenda to develop and ship data-driven architectures that positively impact the business, working with partners across the company including operations and engineering. You will use research results to shape strategy for the company and help build a foundation of tools and practices used by quantitative staff across the company.
During the course of a typical day with our team, expect to work on one or more projects around the following:
1. Create and maintain optimal data pipeline architectures for ML
2. Develop a strong API ecosystem for ML pipelines (see the sketch after this list)
3. Build CI/CD pipelines for ML deployments using GitHub Actions, Travis, Terraform, and Ansible
4. Design and develop distributed, high-volume, high-velocity, multi-threaded event processing systems
5. Apply software engineering best practices across the development lifecycle: coding standards, code reviews, source management, build processes, testing, and operations
6. Deploy data pipelines in production using Infrastructure-as-Code platforms
7. Design scalable implementations of the models developed by our Data Science teams
8. Big data and distributed ML with PySpark on AWS EMR, and more!
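As one possible shape for item 2, here is a minimal model-serving endpoint sketched with FastAPI; the route, schemas, and `fake_model` stand-in are hypothetical, not Episource's actual service:

```python
# Illustrative ML-serving endpoint for an NLP coding model.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Note(BaseModel):
    text: str

class Prediction(BaseModel):
    icd_codes: list[str]
    confidence: float

def fake_model(text: str) -> Prediction:
    """Stand-in for a real model artifact loaded at startup."""
    return Prediction(icd_codes=["E11.9"], confidence=0.42)

@app.post("/predict", response_model=Prediction)
def predict(note: Note) -> Prediction:
    # In production this would call the deployed NLP model.
    return fake_model(note.text)
```

Run locally with, e.g., `uvicorn app:app --reload` (assuming the file is named `app.py`).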
BASIC REQUIREMENTS
- Bachelor's degree or greater in Computer Science, IT, or related fields
- Minimum of 5 years of experience in cloud, DevOps, MLOps & data projects
- Strong experience with bash scripting, Unix environments, and building scalable/distributed systems
- Experience with automation/configuration management using Ansible, Terraform, or equivalent
- Very strong experience with AWS and Python
- Experience building CI/CD systems
- Experience with containerization technologies like Docker, Kubernetes, ECS, EKS, or equivalent
- Ability to build and manage application and performance monitoring processes
- Solid experience in designing, implementing, and securing cloud environments, including services such as EC2, S3, RDS, IAM, VPC, and CloudTrail (see the sketch after this list).
- Strong understanding of DevOps methodologies and experience with CI/CD pipelines and tools (e.g., Jenkins, GitHub, SonarQube).
- In-depth knowledge of cloud security best practices, industry standards, and compliance frameworks (e.g., NIST, CIS, ISO 27001).
- Proficiency in scripting languages such as Python, Bash, Groovy.
- Experience with Infrastructure-as-Code (IaC) tools like AWS CloudFormation or Terraform.
- Familiarity with security scanning and monitoring tools, such as AWS Security Hub, GuardDuty, Inspector, or third-party solutions.
- Strong understanding of network security concepts, including firewalls, VPNs, and secure network architectures.
- Knowledge of secure coding practices and experience with application security testing tools (e.g., SAST, DAST, fuzzing, and secure coding patterns).
- Excellent problem-solving skills and ability to work collaboratively in a team-oriented environment.
- Participate in incident handling and other related duties to support the information security function.
- The ability to learn and apply new concepts quickly
- Strong written and oral communication skills
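In the spirit of the cloud security items above, here is a small illustrative audit script that flags S3 buckets missing a full public access block; it assumes AWS credentials are already configured for boto3, and is a sketch rather than a complete security control:

```python
# Flag S3 buckets whose public access block is absent or incomplete.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        cfg = s3.get_public_access_block(Bucket=name)[
            "PublicAccessBlockConfiguration"
        ]
        compliant = all(cfg.values())  # all four block settings enabled
    except ClientError:
        compliant = False  # no configuration at all counts as a finding
    if not compliant:
        print(f"FINDING: {name} lacks a full public access block")
```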
Neokred is a FinTech company based in Bangalore and an ISO 9001 | 27001 & 20000-1 and PCI DSS certified firm in information and data security. The company builds consumer tech for the financial infrastructure stack to provide curated versions of embedded banking in the payment ecosystem. We've created a platform that enables corporates, banks, FinTechs, retail companies, and start-ups to launch their own banking services or financial products, such as issuing co-branded cards, facilitating lending, and providing virtual bank accounts and KYC for their customers or employees, with the help of a low-code, plug-and-play technology stack.
BRIEF DESCRIPTION OF THE ROLE:
We are looking for an analytical, results-driven Senior Java Developer who will use their understanding of programming languages and tools to build and analyse code, formulate more efficient processes, solve problems, and create a more seamless experience for users.
Your KRAs will include the following:
- You will design, build, and own APIs and Services, which will be the core of the product.
- You will participate in continuing education and training to remain current on best practices, learn new programming languages and better assist other team members.
- You will be part of developing ideas for new programs, products, or features by monitoring industry developments and trends.
- You will take the lead on projects, compile and analyse data, processes, and code to troubleshoot problems and identify areas of improvement.
YOU SHOULD POSSESS:
- Minimum 4+ years of experience with a proficient understanding of Java, Hibernate, and Spring Boot.
- Fluency in Java; operating systems knowledge may be required, along with experience with databases such as MySQL or PostgreSQL.
- Proficiency with Spring Boot, Spring Security, and Hibernate.
- Strong understanding of Computer Science fundamentals, data structures and algorithms, SOLID design principles, and REST patterns.
- Focus on efficiency, user experience, and process improvement.
- Excellent project and time management skills.
- Strong problem-solving and communication skills.
- Ability to work independently or with a group.
• Develop white-box test cases from API functional specifications
• Write maintainable scripts for API automation testing (see the sketch after this list)
• Follow release cycles and commitment to deadlines
• Collaborate with the team and communicate effectively
• Ability to work in a fast-paced start-up
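For illustration only, here is the shape of a modular API automation test, sketched in Python with pytest and requests for brevity (the role itself calls for Java or JavaScript); the endpoint, payload, and expected status are hypothetical:

```python
# Modular API test: reusable request helper + focused assertion.
import requests

BASE_URL = "https://api.example.com"  # placeholder service

def create_user(payload: dict) -> requests.Response:
    """Reusable helper; one call per function keeps scripts modular."""
    return requests.post(f"{BASE_URL}/users", json=payload, timeout=10)

def test_create_user_returns_201():
    resp = create_user({"name": "Ada", "email": "ada@example.com"})
    assert resp.status_code == 201
    assert resp.json()["email"] == "ada@example.com"
```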
CANDIDATE MUST HAVE
• Modular/reusable test scripts using Java / JavaScript
• Load testing
• Test tools, e.g., JMeter, Apache Benchmark, etc.
DESIRED SKILLS & EXPERIENCE
• BE/BTech in Computer Science or a related technical discipline
• Good knowledge of Java / JavaScript-based test frameworks
• Should have experience in building API automation from scratch
• Experience in writing modular/reusable test scripts using Java / JavaScript
• Experience with performance and load testing
• Experience with test tools, e.g., JMeter, Apache Benchmark, etc.
• Knowledge of JSON-based RESTful web services
• Experience in working with penetration testing tools will be a plus
• Knowledge of Git, Bitbucket, JIRA, Linux shell scripting, and CI/CD processes







