

Senior Generative AI Engineer
Job Id: QX016
About Us:
QX Impact was launched with a mission to make AI accessible and affordable, and to deliver AI products and solutions at scale for enterprises by bringing the power of data, AI, and engineering to drive digital transformation. We believe that without insights, businesses will continue to struggle to understand their customers, and may even lose them.
Secondly, without insights, businesses won't be able to deliver differentiated products and services; and finally, without insights, businesses can't achieve the new level of "Operational Excellence" that is crucial to remaining competitive, meeting rising customer expectations, expanding into new markets, and digitalizing.
Job Summary:
We are seeking a highly experienced Senior Generative AI Engineer to focus on the development, implementation, and engineering of Gen AI applications using the latest LLMs and frameworks. This role requires hands-on expertise in Python programming, cloud platforms, and advanced AI techniques, along with additional skills in front-end technologies, data modernization, and API integration. The Senior Gen AI Engineer will be responsible for building applications from the ground up, ensuring robust, scalable, and efficient solutions.
Responsibilities:
· Build GenAI solutions such as virtual assistants, data augmentation, automated insights, and predictive analytics.
· Design, develop, and fine-tune generative AI models (GANs, VAEs, Transformers).
· Handle data preprocessing, augmentation, and synthetic data generation.
· Work with NLP, text generation, and contextual comprehension tasks.
· Develop backend services using Python or .NET for LLM-powered applications.
· Build and deploy AI applications on cloud platforms (Azure, AWS, GCP).
· Optimize AI pipelines and ensure scalability.
· Stay updated with advancements in AI and ML.
Skills & Requirements:
- Strong knowledge of machine learning, deep learning, and NLP.
- Proficiency in Python, TensorFlow, PyTorch, and Keras.
- Experience with cloud services, containerization (Docker, Kubernetes), and AI model deployment.
- Understanding of LLMs, embeddings, and retrieval-augmented generation (RAG).
- Ability to work independently and as part of a team.
- Bachelor’s degree in Computer Science, Mathematics, Engineering, or a related field.
- 6+ years of experience in Gen AI or related roles.
- Experience with AI/ML model integration into data pipelines.
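The RAG requirement in the list above hides a concrete mechanism worth sketching: embed the query, rank stored documents by vector similarity, and hand the top hits to the LLM as context. Below is a minimal sketch of the retrieval step only, using tiny hand-made vectors in place of a real embedding model (an assumption for illustration; production systems would use FAISS or Pinecone with model-generated embeddings):

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, corpus, top_k=1):
    # Rank documents by similarity to the query embedding; in a real RAG
    # pipeline the winners are pasted into the LLM prompt as context.
    ranked = sorted(corpus, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["text"] for d in ranked[:top_k]]

# Toy "documents" with hand-made 3-d embeddings (a real system would embed
# the text with a model such as SentenceTransformers).
corpus = [
    {"text": "Refund policy", "vec": [0.9, 0.1, 0.0]},
    {"text": "Shipping times", "vec": [0.1, 0.8, 0.1]},
]
print(retrieve([0.85, 0.15, 0.0], corpus))  # → ['Refund policy']
```

A vector database replaces the `sorted` scan with an approximate nearest-neighbor index, but the interface idea is the same.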
Core Competencies for Generative AI Engineers:
1. Programming & Software Development
a. Python – Proficiency in writing efficient and scalable code, with strong knowledge of NumPy, Pandas, TensorFlow, PyTorch, and Scikit-learn.
b. LLM Frameworks – Experience with Hugging Face Transformers, LangChain, OpenAI API, and similar tools for building and deploying large language models.
c. API development and integration using frameworks such as FastAPI, Flask, or Django, along with RESTful APIs and WebSockets.
d. Knowledge of version control, containerization, CI/CD pipelines, and unit testing.
2. Vector Database & Cloud AI Solutions
a. Pinecone, FAISS, ChromaDB, Neo4j
b. Azure Redis / Cognitive Search
c. Azure OpenAI Service
d. Azure ML Studio Models
e. AWS (Relevant Services)
3. Data Engineering & Processing
- Handling large-scale structured & unstructured datasets.
- Proficiency in SQL, NoSQL (PostgreSQL, MongoDB), Spark, and Hadoop.
- Feature engineering and data augmentation techniques.
4. NLP & Computer Vision
- NLP: Tokenization, embeddings (Word2Vec, BERT, T5, LLaMA).
- CV: Image generation using GANs, VAEs, Stable Diffusion.
- Document Embedding – Experience with vector databases (FAISS, ChromaDB, Pinecone) and embedding models (BGE, OpenAI, SentenceTransformers).
- Text Summarization – Knowledge of extractive and abstractive summarization techniques using models like T5, BART, and Pegasus.
- Named Entity Recognition (NER) – Experience in fine-tuning NER models and using pre-trained models from SpaCy, NLTK, or Hugging Face.
- Document Parsing & Classification – Hands-on experience with OCR (Tesseract, Azure Form Recognizer), NLP-based document classifiers, and tools like LayoutLM, PDFMiner.
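The extractive side of the summarization bullet can be shown without any model at all: score each sentence by the text-wide frequency of its words and keep the top scorers. A toy sketch of that idea (real work here would use T5, BART, or Pegasus as listed above; the sentence-splitting regex is a simplifying assumption):

```python
import re
from collections import Counter

def extractive_summary(text, n=1):
    # Split into sentences, build word frequencies over the whole text,
    # then keep the n sentences whose words are most frequent overall.
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"\w+", text.lower()))
    ranked = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r"\w+", s.lower())),
        reverse=True,
    )
    return ranked[:n]

text = "Cats sleep. Cats eat fish. Dogs bark."
print(extractive_summary(text))  # → ['Cats eat fish.']
```

Abstractive models instead generate new sentences, which is why they need sequence-to-sequence architectures rather than a scoring loop.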
5. Model Deployment & Optimization
- Model compression (quantization, pruning, distillation).
- Deployment using CI/CD (e.g., Azure Pipelines) and runtimes such as ONNX, TensorRT, and OpenVINO on Azure, AWS, or GCP.
- Model monitoring (MLflow, Weights & Biases) and automated workflows (Azure Pipeline).
- API integration with front-end applications.
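Of the compression techniques listed above, quantization is the simplest to show end to end: map float weights onto a small integer range with one scale factor, then map back. A toy sketch of symmetric post-training linear quantization (real deployments would lean on ONNX Runtime or TensorRT tooling; the int8 range here is an illustrative assumption):

```python
def quantize(weights):
    # One symmetric scale so the largest |weight| maps to 127 (int8 range).
    scale = max(abs(w) for w in weights) / 127 or 1.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    # Reverse mapping; each value differs from the original by at most
    # half a quantization step.
    return [v * scale for v in q]

weights = [0.5, -1.27, 0.03]
q, scale = quantize(weights)
restored = dequantize(q, scale)
```

The payoff is storage and bandwidth: each weight shrinks from 32 bits to 8 at the cost of the bounded rounding error shown here.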
6. AI Ethics & Responsible AI
- Bias detection, interpretability (SHAP, LIME), and security (adversarial attacks).
7. Mathematics & Statistics
- Linear Algebra, Probability, and Optimization (Gradient Descent, Regularization, etc.).
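The optimization bullet can be made concrete with a one-dimensional worked example: gradient descent on (w - 3)² + λw², i.e. least squares with L2 regularization. The learning rate and λ values are illustrative assumptions:

```python
def grad_descent(lr=0.1, lam=0.01, steps=200):
    # Minimize f(w) = (w - 3)**2 + lam * w**2 by stepping against f'(w).
    w = 0.0
    for _ in range(steps):
        grad = 2 * (w - 3) + 2 * lam * w  # f'(w)
        w -= lr * grad
    return w

w = grad_descent()
# The closed-form minimum is 3 / (1 + lam) ≈ 2.9703; note how the
# regularization term pulls the solution slightly toward zero.
```

The same update rule, applied per-parameter to a loss over millions of weights, is what the deep learning frameworks in the next section automate.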
8. Machine Learning & Deep Learning
a. Expertise in supervised, unsupervised, and reinforcement learning.
b. Proficiency in TensorFlow, PyTorch, and JAX.
c. Experience with Transformers, GANs, VAEs, Diffusion Models, and LLMs (GPT, BERT, T5).
Personal Attributes:
- Strong problem-solving skills with a passion for data architecture.
- Excellent communication skills with the ability to explain complex data concepts to non-technical stakeholders.
- Highly collaborative, capable of working with cross-functional teams.
- Ability to thrive in a fast-paced, agile environment while managing multiple priorities effectively.
Why Join Us?
- Be part of a collaborative and agile team driving cutting-edge AI and data engineering solutions.
- Work on impactful projects that make a difference across industries.
- Opportunities for professional growth and continuous learning.
- Competitive salary and benefits package.
Ready to make an impact? Apply today and become part of the QX impact team!
We are looking for an enthusiastic developer with a strong understanding of core Ruby and the Rails framework, along with the PostgreSQL database; someone who is passionate about coding and loves to work in a continuously challenging environment. You will be part of a talented software team, will need to deliver consistently in a fast-paced environment, and should be more than willing to build software that people love to use.
Key Responsibilities
The individual role that you’ll play in our team:
● Developing large multi-tenant applications in Rails.
● Understanding Rails best practices and religiously introducing those to our
codebase.
● Knowledge of how to do effective refactoring.
● Ability to write unit tests and to follow those practices religiously.
● Working closely with the Product managers and UX team.
● Helping QAs to write automated integration tests.
● Staying up-to-date with current and future Backend technologies and
architectures.
Read the ‘Skills and Experience’ section carefully; it is not the usual yada yada, and you'll be asked specific questions on it.
Skills and Experience
● Ruby on Rails architecture best practices
● Knowledge of the latest versions of Ruby on Rails
● Strong OOP knowledge in Ruby.
● Asynchronous Networking in Ruby
● Designing RESTful HTTP APIs using JSON-Schema or JSON API (jsonapi.org).
● Ability to architect and develop API only backend
● Experience in using ActiveModel::Serializers
● Understanding OAuth2 or JWT (JSON Web Token) authentication mechanisms.
● How to use RSpec
● Rails Security Best Practices
● PostgreSQL and Rails.
● SQL concepts like Joins, Relationships etc.
● Understanding DB Partition strategies.
● Knowledge about refactoring ActiveRecord Models (read this - “7 Patterns to
Refactor Fat ActiveRecord Models”).
● Understanding scaling strategies for high-traffic Rails applications (2 million+ requests a day).
● Background Job processing using Redis and Sidekiq
● Experience in using Amazon Web Services (AWS) tools.
● Writing automated Deployment Scripts using Capistrano, Ansible etc.
● Sending emails in Rails
● Knowledge in Linux and Git is mandatory
Optional Skills
● Knowledge in using Chef or Puppet
● Ability to do basic DevOps like setting up a Linux server.
● WebSocket communication in Rails 5.
● Node.js
● JRuby
3-5 years of experience, including a minimum of 2 years of experience with PHP MVC frameworks like Laravel (preferred), CodeIgniter, etc.
Strong experience in PHP and MySQL and its declarative query language, SQL
Understanding the fully synchronous behaviour of PHP
Strong experience in JavaScript, and good to have working knowledge of a JS framework like Angular (preferred) or React
Understanding of MVC design patterns
Good understanding of front-end technologies: JavaScript, jQuery, Angular, HTML, and CSS3
Knowledge of object-oriented PHP programming
Understanding accessibility and security compliance
Strong knowledge of the common PHP or web server exploits and their solutions
Understanding fundamental design principles behind a scalable application
User authentication and authorization between multiple systems, servers, and environments
Integration of multiple data sources and databases into one system
Familiarity with limitations of PHP as a platform and its workarounds
Creating database schemas that represent and support business processes
Proficient understanding of code versioning tools, such as Git
Basic knowledge of AWS and Shell command
You will join an amazingly passionate team of data scientists and engineers who are striving to achieve a pioneering vision.
Responsibilities:
Our web crawling team is unique in the industry - while we have many “single-site”
crawlers, our unique proposition and technical efforts are all geared towards building
“generic” bots that can crawl and parse data from thousands of websites, all using the
same code. This requires a whole different level of thinking, planning, and coding.
Here’s what you’ll do -
● Build, improve, and run our generic robots to extract data from both the web and
documents – handling critical information among a wide variety of structures and
formats without error.
● Derive common patterns from semi-structured data, build code to handle them,
and be able to deal with exceptions as well.
● Be responsible for the live execution of our robots, managing turnaround times,
exceptions, QA, and delivery, and building a bleeding-edge infrastructure to
handle volume and scope.
Requirements:
● Either hands-on experience with, or relevant certifications in, Robotic Process Automation tools (preferably Kofax Kapow, Automation Anywhere, UiPath, or Blue Prism), OR crawling purely in Python and relevant libraries.
● 1-3 years of experience with building, running, and maintaining crawlers.
● Successfully worked on projects that delivered 99%+ accuracy despite a wide variety of formats.
● Excellent SQL or MongoDB/ElasticSearch skills and familiarity with Regular
Expressions, Data Mining, Cloud Infrastructure, etc.
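The regular-expression requirement maps directly onto the "generic bot" idea described above: one pattern that survives several page formats. A hedged sketch (the price pattern and the snippets are invented for illustration; real crawlers would combine patterns like this with proper HTML parsing and exception handling):

```python
import re

# One pattern intended to survive several page layouts: "$19.99",
# "USD 250", and "$7" all match, with the numeric part in group 1.
PRICE_RE = re.compile(r"(?:USD|\$)\s?(\d+(?:\.\d{2})?)")

snippets = [
    '<span class="price">$19.99</span>',
    '<div>Price: USD 250</div>',
    '<p>now only $7</p>',
]
prices = [float(PRICE_RE.search(s).group(1)) for s in snippets]
print(prices)  # → [19.99, 250.0, 7.0]
```

Deriving one pattern per data field, rather than one parser per site, is what lets the same code run against thousands of websites.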
Other Infrastructure Requirements
● High-speed internet connectivity for video calls and efficient work.
● Capable business-grade computer (e.g., modern processor, 8 GB+ of RAM, and no other obstacles to uninterrupted, efficient work).
● Headphones with clear audio quality.
● Stable power connection and backups in case of internet/power failure.
Position: Frontend Engineer / MEAN Stack Developer
Location: Bangalore
Experience: 2.5 to 4 years, preferably in an agile environment
Strong knowledge of MongoDB, Express.js, AngularJS, and Node.js
Strong knowledge of user interface and user experience design
Good knowledge of the deployment process (Nginx, Docker, AWS, DigitalOcean)
Basic knowledge of other stacks like ReactJS and Vue.js
- Minimum 3+ years' experience as a software developer.
- Proficiency in JavaScript, Angular.js, HTML 5, CSS.
- Familiarity with Git.
- Linux and/or OS X experience.
- Experience consuming API endpoints.
- Good to have - Kafka/queuing system
- Should be well-versed in using multiple databases - SQL/NoSQL.
- Should have experience using in-memory databases (Redis).
- Experience with Unit-Testing.
- Experience building web applications with responsive design.
- Experience with Node.js and/or other server-side JavaScript technologies and tools.
- Strong knowledge of design principles, user interfaces, web standards and usability.
- ES6, ReactJS/JSX, Redux, Webpack, Immutable.js.








