
Mandatory
Strong Senior / Lead Software Engineer profile
Mandatory (Experience 1) - Must have a minimum of 6 years of experience in software development, including 1-2 years in a Senior or Lead role.
Mandatory (Experience 2) - Must have experience with Python and Django, Flask, or a similar framework.
Mandatory (Experience 3) - Must have experience with relational databases (e.g., MySQL, PostgreSQL, Oracle).
Mandatory (Experience 4) - Must have solid experience with microservices or distributed-system frameworks (e.g., Kafka, Google Pub/Sub, AWS SNS, Azure Service Bus) or message brokers (e.g., RabbitMQ).
Mandatory (Location) - Candidate must be based in Bengaluru.
Mandatory (Company) - Product or start-up companies only.
Mandatory (Stability) - Should have worked for at least 2 years at one company in the last 3 years.
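As a rough illustration of the Python-plus-relational-database experience asked for above, here is a minimal sketch using Python's built-in sqlite3 module as a stand-in for MySQL/PostgreSQL (the table and column names are hypothetical):

```python
import sqlite3

# In-memory database as a stand-in for MySQL/PostgreSQL.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
)

# Insert a few rows inside a transaction.
with conn:
    conn.executemany(
        "INSERT INTO orders (customer, amount) VALUES (?, ?)",
        [("alice", 120.0), ("bob", 75.5), ("alice", 30.0)],
    )

# Aggregate per customer -- the bread and butter of relational work.
rows = conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer ORDER BY customer"
).fetchall()
print(rows)  # [('alice', 150.0), ('bob', 75.5)]
```

The same parameterized-query and transaction habits carry over directly to MySQL or PostgreSQL via their respective drivers.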
Job Title: AI Engineer - NLP/LLM Data Product Engineer
Location: Chennai, TN - Hybrid
Duration: Full time
Job Summary:
About the Role:
We are growing our Data Science and Data Engineering team and are looking for an
experienced AI Engineer specializing in creating GenAI LLM solutions. This position involves collaborating with clients and their teams, discovering gaps for automation using AI, designing customized AI solutions, and implementing technologies to streamline data entry processes within the healthcare sector.
Responsibilities:
· Conduct detailed consultations with clients' functional teams to understand client requirements; one use case involves handwritten medical records.
· Analyze existing data entry workflows and propose automation opportunities.
Design:
· Design tailored AI-driven solutions for the extraction and digitization of information from handwritten medical records.
· Collaborate with clients to define project scopes and objectives.
Technology Selection:
· Evaluate and recommend AI technologies, focusing on NLP, LLMs, and machine learning.
· Ensure seamless integration with existing systems and workflows.
Prototyping and Testing:
· Develop prototypes and proof-of-concept models to demonstrate the feasibility of proposed solutions.
· Conduct rigorous testing to ensure accuracy and reliability.
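Before committing to a full LLM pipeline, a throwaway proof of concept for record digitization often starts with structured pattern matching over already-transcribed text. A minimal sketch (the record format and field names are invented for illustration; actual handwritten records would first need an OCR or vision-model step):

```python
import re

# Hypothetical transcribed snippet from a handwritten record (format assumed).
record = """Patient Name: R. Kumar
DOB: 12/03/1981
BP: 130/85
Diagnosis: Type 2 Diabetes"""

# Simple per-field patterns; a production pipeline would use an OCR + LLM stack.
patterns = {
    "name": r"Patient Name:\s*(.+)",
    "dob": r"DOB:\s*([\d/]+)",
    "bp": r"BP:\s*(\d+/\d+)",
    "diagnosis": r"Diagnosis:\s*(.+)",
}

# Extract each field, or None if the pattern is absent.
extracted = {
    field: (m.group(1).strip() if (m := re.search(pat, record)) else None)
    for field, pat in patterns.items()
}
print(extracted["bp"])  # 130/85
```

A prototype like this also doubles as a baseline to measure the accuracy gains of an LLM-based extractor against.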
Implementation and Integration:
· Work closely with clients and IT teams to integrate AI solutions effectively.
· Provide technical support during the implementation phase.
Training and Documentation:
· Develop training materials for end-users and support staff.
· Create comprehensive documentation for implemented solutions.
Continuous Improvement:
· Monitor and optimize the performance of deployed solutions.
· Identify opportunities for further automation and improvement.
Qualifications:
· Advanced degree in Computer Science, Artificial Intelligence, or a related field (Master's or PhD required).
· Proven experience in developing and implementing AI solutions for data entry automation.
· Expertise in NLP, LLMs, and other machine-learning techniques.
· Strong programming skills, especially in Python.
· Familiarity with healthcare data privacy and regulatory requirements.
Additional Qualifications (great to have):
An ideal candidate will have expertise in the most current LLM/NLP models, particularly in extracting data from clinical reports, lab reports, and radiology reports, along with a deep understanding of EMR/EHR applications and patient-related data.
Experience working in Waterfall or Agile (Agile preferred). Solid understanding of Python scripting and/or frameworks like Django and Flask.
- Leads more than one project end-to-end and collaborates across functions. Drives planning, estimation, and execution.
- Manages stakeholder expectations and offers scalable, reliable, performant, and easy-to-maintain solutions.
- Consistently delivers complex, well-backed, and bug-free products on time.
- Consistently makes well-thought-out technical/design decisions.
- Develops expertise in more than one area and shares knowledge with others; able to mentor/train in areas that are new to them.
- Drives people to solve engineering challenges.
- Enjoys the high respect of Tech and other cross-functional teams.
- Demonstrates effective communication with the project team, management, and internal/external clients as necessary.
- Surfaces both technical and non-technical team challenges and helps resolve them.
- Champions SDLC best practices and high-quality standards.
- Expert in RoR, Golang, NodeJS, or Python. Good to have exposure to ML.
- Must have experience in cloud computing
- Operates independently with almost no oversight
- Is able to apply domain expertise to think critically and make wise decisions for the
team, taking into account tradeoffs and constraints.
- Communicates tech decisions through design docs and tech talks
- Has delivered multiple projects with end-to-end engineering ownership
- Keeps track of new technology/tools and embraces them as necessary
- 7+ years of experience in a product-driven organization
- A Bachelor's or Master's degree in engineering from a reputed institute
Experience: 8-12 yrs
Location: Noida
Notice Period: Immediate Or 15 days
Job Description:
• 8-12 yrs. experience in Java, J2EE, SQL, JavaScript, HTML, CSS, XML, Oracle, SQL Server
• Strong in Core Java, J2EE and MVC architecture.
• Good written and oral communication skills (English required).
• Good interpersonal skills, with a focus on listening and questioning skills.
• Ability to absorb and retain information quickly.
• Proven analytical and problem-solving abilities.
Must Have:
* Min 3 years in Angular.
* 1 year in AWS.
* Basic knowledge of Python (3-4 months of exposure).
Responsibilities
- Lead the development of the backend systems for our first product
- Build reliable, secure and performant backend systems
- Drive test coverage and continuous delivery automation within the team
- Mentor and provide feedback to teammates
Requirements
- 5+ years of software development experience
- Strong computer science fundamentals
- Deep and wide knowledge of Java ecosystem
- Can write code that is readable, maintainable, secure and performant
- Know the importance of tests and how to approach writing different types of tests
- Good intuition for REST API design
- Deep understanding of relational databases, transactions, entity-relationship modeling
- Comfortable writing highly concurrent systems
- Experienced in using profilers, tuning garbage collection, optimizing SQL queries
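The concurrency expectation above is language-agnostic; as one small sketch (in Python rather than Java, purely for illustration), here is the classic shared-counter race condition and its lock-based fix:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

# A shared counter guarded by a lock -- the textbook concurrency pitfall
# (lost updates from unsynchronized increments) and its fix in one example.
counter = 0
lock = threading.Lock()

def increment(n: int) -> None:
    global counter
    for _ in range(n):
        with lock:  # without the lock, concurrent read-modify-writes can be lost
            counter += 1

# 8 workers each add 10,000; with the lock the result is deterministic.
with ThreadPoolExecutor(max_workers=8) as pool:
    for _ in range(8):
        pool.submit(increment, 10_000)

print(counter)  # 80000
```

In Java the equivalent discussion would involve `synchronized`, `AtomicLong`, or `java.util.concurrent` primitives; the underlying reasoning about shared mutable state is the same.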
Desired Skills and Experience
- Golang , Java , Python , Ruby
About Us
DataWeave provides Retailers and Brands with “Competitive Intelligence as a Service” that enables them to take
key decisions that impact their revenue. Powered by AI, we provide easily consumable and actionable
competitive intelligence by aggregating and analyzing billions of publicly available data points on the Web to
help businesses develop data-driven strategies and make smarter decisions.
Products@DataWeave
We, the Products team at DataWeave, build data products that provide timely insights that are readily
consumable and actionable, at scale. Our underpinnings are: scale, impact, engagement, and visibility. We help
businesses take data-driven decisions every day. We also give them insights for long-term strategy. We are
focused on creating value for our customers and helping them succeed.
How we work
It's hard to tell what we love more, problems or solutions! Every day, we choose to address some of the hardest
data problems that there are. We are in the business of making sense of messy public data on the web. At
serious scale! Read more on Become a DataWeaver
What do we offer?
● Opportunity to work on some of the most compelling data products that we are building for online
retailers and brands.
● Ability to see the impact of your work and the value you are adding to our customers almost immediately.
● Opportunity to work on a variety of challenging problems and technologies to figure out what really
excites you.
● A culture of openness. Fun work environment. A flat hierarchy. Organization-wide visibility. Flexible
working hours.
● Learning opportunities with courses, trainings, and tech conferences. Mentorship from seniors in the
team.
● Last but not least, competitive salary packages and fast-paced growth opportunities.
Role and Responsibilities
● Build a low latency serving layer that powers DataWeave's Dashboards, Reports, and Analytics
functionality
● Build robust RESTful APIs that serve data and insights to DataWeave and other products
● Design user interaction workflows on our products and integrate them with data APIs
● Help stabilize and scale our existing systems. Help design the next generation systems.
● Scale our back end data and analytics pipeline to handle increasingly large amounts of data.
● Work closely with the Head of Products and UX designers to understand the product vision and design
philosophy
● Lead/be a part of all major tech decisions. Bring in best practices. Mentor younger team members and
interns.
● Constantly think scale, think automation. Measure everything. Optimize proactively.
● Be a tech thought leader. Add passion and vibrance to the team. Push the envelope.
Skills and Requirements
● 4-7 years of experience building and scaling APIs and web applications.
● Experience building and managing large scale data/analytics systems.
● Have a strong grasp of CS fundamentals and excellent problem solving abilities. Have a good
understanding of software design principles and architectural best practices.
● Be passionate about writing code and have experience coding in multiple languages, including at least
one scripting language, preferably Python.
● Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision
is right/wrong, and so on.
● Be a self-starter: someone who thrives in fast-paced environments with minimal 'management'.
● Have experience working with multiple storage and indexing technologies such as MySQL, Redis,
MongoDB, Cassandra, Elastic.
● Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ.
● Use the command line like a pro. Be proficient in Git and other essential software development tools.
● Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
● Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana,
Graylog, StatsD, Datadog etc.
● Working knowledge of building websites and apps. Good understanding of integration complexities and
dependencies.
● Working knowledge of Linux server administration as well as the AWS ecosystem is desirable.
● It's a huge bonus if you have some personal projects (including open source contributions) that you work
on during your spare time. Show off some of your projects you have hosted on GitHub.
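The MapReduce model mentioned in the list above boils down to a map step producing partial results and a reduce step merging them. A toy single-process word count makes the shape of the computation concrete (document contents are made up):

```python
from collections import Counter
from functools import reduce

docs = [
    "competitive intelligence as a service",
    "intelligence from public data",
    "public data at scale",
]

# Map: each document -> its local word counts.
mapped = [Counter(doc.split()) for doc in docs]

# Reduce: merge the partial counts, as a combiner/reducer would across nodes.
totals = reduce(lambda a, b: a + b, mapped, Counter())
print(totals["data"])  # 2
```

Hadoop and Spark add partitioning, shuffling, and fault tolerance on top, but the map and reduce functions a developer writes follow exactly this pattern.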











