
Key Responsibilities
- Backend & API Engineering (60% Focus)
- Design and implement REST APIs and microservices for high-performance AI systems.
- Apply enterprise object-oriented programming best practices for secure, scalable backend services.
- Integrate AI-powered features with cloud-native architectures.
- Generative AI & LLM Development (40% Focus)
- Build LLM-powered features with the OpenAI API or other LLM APIs (reasoning and non-reasoning models, temperature tuning, version control).
- Implement retrieval-augmented generation (RAG).
- Apply advanced prompt engineering and model tuning techniques for optimized results.
- Deploy and manage solutions using Docker and secure integrations (e.g., SSO, Google Drive).
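The RAG responsibility above can be sketched end-to-end. This is a minimal illustration only: it uses a toy bag-of-words retriever over an in-memory corpus (a real system would use vector embeddings and a vector store), and the final LLM call is left as a comment since it requires API credentials; all names (`DOCS`, `retrieve`, `build_prompt`) are hypothetical.

```python
from collections import Counter
import math

# Toy corpus standing in for an indexed document store.
DOCS = [
    "Docker containers package an app with its dependencies.",
    "Retrieval-augmented generation grounds LLM answers in retrieved documents.",
    "SSO lets users authenticate once across multiple applications.",
]

def embed(text: str) -> Counter:
    # Bag-of-words "embedding"; a real system would use a vector model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 1) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str) -> str:
    context = "\n".join(retrieve(query))
    # The assembled prompt would be sent to an LLM API (e.g. a chat
    # completions endpoint); the network call itself is omitted here.
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

prompt = build_prompt("How does retrieval-augmented generation work?")
```

The key design point is the separation of retrieval from generation: the retriever grounds the model in known documents, so swapping the toy retriever for a real vector store does not change the prompt-assembly step.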
What We’re Looking For
- 10+ years of backend engineering experience (REST APIs, microservices, OO enterprise architecture).
- 3+ years of experience building AI/ML solutions (LLMs, RAG, OpenAI API).
- Strong hands-on Python expertise and object-oriented design patterns.
- Hands-on experience with Langchain, Lambda, Docker, and secure system integrations.
- Proven track record delivering production-ready, scalable applications.

About the Role:
We are seeking a talented Lead Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. As a Data Engineer, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization's analytics and decision-making processes.
Responsibilities:
- Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes.
- Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts.
- Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes.
- Team Management: Lead, mentor, and coordinate the data engineering team.
- Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency.
- Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements.
- Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms).
- Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes.
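The extract-transform-load responsibility above can be sketched in miniature. This is a hedged illustration, not a production pipeline: the source rows, the `sales` schema, and the in-memory SQLite "warehouse" are all stand-ins for real sources and targets.

```python
import sqlite3

def extract() -> list[dict]:
    # Hypothetical source rows, standing in for an API or file extract.
    return [
        {"id": 1, "amount": "19.99", "region": " us-east "},
        {"id": 2, "amount": "5.00", "region": "EU-WEST"},
        {"id": 2, "amount": "5.00", "region": "EU-WEST"},  # duplicate to drop
    ]

def transform(rows: list[dict]) -> list[tuple]:
    seen, out = set(), []
    for r in rows:
        if r["id"] in seen:  # deduplicate on the business key
            continue
        seen.add(r["id"])
        # Cleanse/standardize: cast types, trim whitespace, normalize casing.
        out.append((r["id"], float(r["amount"]), r["region"].strip().lower()))
    return out

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id INTEGER PRIMARY KEY, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")  # stand-in for a data warehouse
load(transform(extract()), conn)
```

Keeping extract, transform, and load as separate functions mirrors how the stages would be scheduled and tested independently in a real pipeline.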
Skills:
- Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
- Experience with data warehousing and data lake technologies (e.g., Snowflake, AWS Redshift, Databricks).
- Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and cloud-based data services.
- Understanding of data modeling and data architecture concepts.
- Experience with ETL/ELT tools and frameworks.
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
Preferred Qualifications:
- Experience with real-time data processing and streaming technologies (e.g., Kafka, Flink).
- Knowledge of machine learning and artificial intelligence concepts.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Certification in cloud platforms or data engineering.
We are looking for Product Marketers (2023 MBA graduates) who can join immediately.
Key Responsibilities:
- Develop and identify new ICPs, IBPs, potential buyer-in-market signals and more for new products and services we or our clients offer to the market
- Responsible for the creation and execution of end-to-end marketing campaigns to position our products as solutions for our customers and partners worldwide.
- Communicate the value of new products and services to the sales and marketing teams; speak and present both internally and externally to promote the story of our or our clients' offerings
- Develop and implement promotional activities such as promotions and product launches
- Understand the market, target customers, their needs, the competition landscape to identify growth opportunities
- Build highly engaging campaigns across user funnels to increase conversion rates
- Create messaging and hooks for the entire customer journey: top, middle, and bottom of the funnel
- Represent the customer voice in product development, marketing, and all related communications.
- Develop comprehensive marketing plans for sales and marketing teams.
- Work closely with the product/service development team to determine the most profitable course for each existing and new product.
- Help the marketing team to generate strong momentum ahead of new product launches.
- Focus on implementing programs that consistently generate new, high-quality leads for our company
- Increase our digital presence with meaningful content, messaging, and tactics
Key Requirement:
- Data-driven with keen creative capabilities
- Ability to decode and grasp complex technical domains and convert them into customer problem statements
- Incredible ownership, accountability, follow-up/follow-through skills
- Strong collaboration across teams
- Strong numerical and analytical aptitude.
- Ability to think and react in a high-energy, fast-paced environment.
- Good people-management and organizational skills, including prioritizing, scheduling, time management, and meeting deadlines.
- Technical aptitude and agility to learn web-based tools
- Very strong written communication and presentation skills
- Project ownership and using customer data to identify and prioritize opportunities
- MBA degree is preferred
This profile will include the following responsibilities:
- Develop Parsers for XML and JSON Data sources/feeds
- Write Automation Scripts for product development
- Build API Integrations for 3rd Party product integration
- Perform Data Analysis
- Research on Machine learning algorithms
- Understand AWS cloud architecture and work with third-party vendors for deployments
- Resolve issues in the AWS environment
We are looking for candidates with:
Qualification: BE/BTech/Bsc-IT/MCA
Programming Language: Python
Web Development: Basic understanding of Web Development. Working knowledge of Python Flask is desirable
Database & Platform: AWS/Docker/MySQL/MongoDB
Basic Understanding of Machine Learning Models & AWS Fundamentals is recommended.
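The parser responsibilities listed above (XML and JSON data feeds) might look like this minimal Python sketch, using only the standard library; the feed payloads and the record shape are invented for illustration.

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical feed payloads; real feeds would arrive over HTTP or as files.
JSON_FEED = '{"items": [{"sku": "A1", "qty": 3}, {"sku": "B2", "qty": 1}]}'
XML_FEED = '<items><item sku="A1" qty="3"/><item sku="B2" qty="1"/></items>'

def parse_json_feed(payload: str) -> list[dict]:
    # Normalize the JSON feed into a flat list of records.
    return [{"sku": i["sku"], "qty": int(i["qty"])}
            for i in json.loads(payload)["items"]]

def parse_xml_feed(payload: str) -> list[dict]:
    # Normalize the XML feed into the same record shape.
    root = ET.fromstring(payload)
    return [{"sku": e.get("sku"), "qty": int(e.get("qty"))}
            for e in root.findall("item")]
```

Emitting one canonical record shape from both parsers keeps downstream analysis code source-agnostic, which is the usual goal when ingesting third-party feeds.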
Keycloak IAM Engineer
Should have 3-4 years of experience in the IAM domain, including at least one project implementation of the Keycloak IAM product. Specific skill-set requirements are:
- Understanding user organization and its mapping onto LDAP
- Agent/agent-less, OAuth, SAML-based, and custom-token-based SSO
- User reconciliation from HR and creation in LDAP/IAM using automation
- SSO with enterprise applications such as telecom billing and ERP applications
- Integration of OOTB connectors and creation of custom connectors (REST/SOAP API based)
- Advanced customization at the UI and workflow levels
- Setting up groups and roles
- Setting up systems and processes for access certifications
- Target application account provisioning and birthright role assignment to users based on default and request-based policies
- Using the UMS email and SMPP drivers to configure email/SMS notifications
- Custom web application for forgot-password, reset-password, and LDAP account unlock for users who can't access IAM over the internet
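The OAuth-based SSO item above starts with the authorization-code flow. Below is a minimal sketch of building the first redirect URL; the host, realm, and client details are hypothetical, and the endpoint path assumes Keycloak's standard `/realms/<realm>/protocol/openid-connect/auth` layout.

```python
from urllib.parse import urlencode

# Hypothetical Keycloak realm base URL; adjust host and realm for a real deployment.
BASE = "https://sso.example.com/realms/acme/protocol/openid-connect"

def authorization_url(client_id: str, redirect_uri: str, state: str) -> str:
    # Step 1 of the OAuth2 authorization-code flow: the browser is
    # redirected here, Keycloak authenticates the user, then redirects
    # back to redirect_uri with a one-time authorization code.
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",
        "scope": "openid",
        "state": state,  # CSRF protection: verify it on the callback
    }
    return f"{BASE}/auth?{urlencode(params)}"

url = authorization_url("billing-app", "https://billing.example.com/cb", "xyz123")
```

The subsequent steps (exchanging the code for tokens at the realm's token endpoint, validating the ID token) are omitted here since they require a live realm.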
About the Internship:
The selected intern's day-to-day responsibilities include:
1. Develop server-side/client-side applications.
2. Apply basic knowledge of data structures.
3. Test and document the developed code.
4. Work with at least one of the following technologies: PHP, JavaScript, Golang, Dart.
Note - Only those candidates can apply who:
1. are available for full-time (in-office) internship
2. can start the internship between 1st Aug'22 and 15th Aug'22
- Design, create, test, and maintain data pipeline architecture in collaboration with the Data Architect.
- Build the infrastructure required for extraction, transformation, and loading of data from a wide variety of data sources using Java, SQL, and Big Data technologies.
- Support the translation of data needs into technical system requirements. Support in building complex queries required by the product teams.
- Build data pipelines that clean, transform, and aggregate data from disparate sources
- Develop, maintain and optimize ETLs to increase data accuracy, data stability, data availability, and pipeline performance.
- Engage with Product Management and Business to deploy and monitor products/services on cloud platforms.
- Stay up-to-date with advances in data persistence and big data technologies and run pilots to design the data architecture to scale with the increased data sets of consumer experience.
- Handle data integration, consolidation, and reconciliation activities for digital consumer / medical products.
Job Qualifications:
- Bachelor’s or master's degree in Computer Science, Information management, Statistics or related field
- 5+ years of experience in the Consumer or Healthcare industry in an analytical role with a focus on building data pipelines, querying data, analyzing, and clearly presenting analyses to members of the data science team.
- Technical expertise with data models and data mining.
- Hands-on knowledge of programming languages such as Java, Python, R, and Scala.
- Strong knowledge of big data tools such as Snowflake, AWS Redshift, Hadoop, MapReduce, etc.
- Knowledge of tools such as AWS Glue, S3, AWS EMR, streaming data pipelines, and Kafka/Kinesis is desirable.
- Hands-on knowledge of SQL and NoSQL database design.
- Knowledge of CI/CD for building and hosting solutions.
- AWS certification is an added advantage.
- Strong knowledge of visualization tools such as Tableau and QlikView is an added advantage.
- A team player capable of working and integrating across cross-functional teams for implementing project requirements. Experience in technical requirements gathering and documentation.
- Ability to work effectively and independently in a fast-paced agile environment with tight deadlines
- A flexible, pragmatic, and collaborative team player with the innate ability to engage with data architects, analysts, and scientists
Responsibilities
- Manual Testing + Test Automation
- Manual testing (Functional), Database testing
- Exposure to Selenium, JavaScript, etc.
- Exposure to Agile and DevOps
- Excellent communication skills
- Review requirements, specifications and technical design documents to provide timely and meaningful feedback
- Create detailed, comprehensive and well-structured test plans and test cases
- Estimate, prioritize, plan and coordinate testing activities
- Design, develop and execute automation scripts using open source tools
- Identify, record, document thoroughly and track bugs
- Perform thorough regression testing when bugs are resolved
- Develop and apply testing processes for new and existing products to meet client needs
- Liaise with internal teams (e.g. developers and product managers) to identify system requirements
- Monitor debugging process results
- Investigate the causes of non-conforming software and train users to implement solutions
- Track quality assurance metrics, like defect densities and open defect counts
- Stay up to date with new testing tools and test strategies
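A detailed, well-structured automated test case, as the responsibilities above call for, can be sketched with plain `unittest` (used here as a browser-free stand-in for a Selenium suite; the function under test is invented for illustration).

```python
import unittest

# Hypothetical function under test, standing in for application code.
def apply_discount(price: float, percent: float) -> float:
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class TestApplyDiscount(unittest.TestCase):
    """Each method maps to one row of a written test plan."""

    def test_typical_discount(self):           # positive path
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    def test_zero_discount_is_identity(self):  # boundary value
        self.assertEqual(apply_discount(80.0, 0), 80.0)

    def test_invalid_percent_rejected(self):   # negative path
        with self.assertRaises(ValueError):
            apply_discount(50.0, 120)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestApplyDiscount)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Covering positive, boundary, and negative paths per requirement is the same discipline expected of the Selenium scripts, just at a smaller scale.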
Requirements
- Proven work experience in software development
- Proven work experience in software quality assurance
- Strong knowledge of software QA methodologies, tools and processes
- Experience in writing clear, concise and comprehensive test plans and test cases
- Hands-on experience with both white box and black box testing
- Hands-on experience with automated testing tools
- Solid knowledge of SQL and scripting
- Experience working in an Agile/Scrum development process
- Experience with performance and/or security testing is a plus
- BS/MS degree in Computer Science, Engineering or a related subject
- Knowledge about API/Service Testing would be important