
with the engineering team to strategize and execute the development of data products
● Execute analytical experiments methodically to help solve various problems and make a true impact across
various domains and industries
NLP ENGINEER at KARZA TECHNOLOGIES
● Identify relevant data sources and sets to mine for client business needs, and collect large structured and
unstructured datasets and variables
● Devise and utilize algorithms and models to mine big data stores, perform data and error analysis to improve
models, and clean and validate data for uniformity and accuracy
● Analyze data for trends and patterns, and interpret data with a clear objective in mind
● Implement analytical models into production by collaborating with software developers and machine
learning engineers
● Communicate analytic solutions to stakeholders and implement improvements as needed to operational
systems
What you need to work with us:
● Good understanding of data structures, algorithms, and the first principles of mathematics.
● Proficient in Python and packages like NLTK, NumPy, and Pandas
● Should have worked with deep learning frameworks (e.g., TensorFlow, Keras, PyTorch)
● Hands-on experience in Natural Language Processing and sequence/RNN-based models
● Mathematical intuition of ML and DL algorithms
● Should be able to perform thorough model evaluation by creating hypotheses on the basis of statistical
analyses
● Should be comfortable in going through open-source code and reading research papers.
● Should be curious or thoughtful enough to answer the “WHYs” pertaining to the most cherished
observations, thumb rules, and ideas across the data science community.
Qualification and Experience Required:
● 1 - 4 years of relevant experience
● Bachelor’s/Master’s degree in Computer Science / Computer Engineering / Information Technology

Similar jobs
🚀 Hiring: Data Engineer | GCP + Spark + Python + .NET | 6–10 Yrs | Gurugram (Hybrid)
We’re looking for a skilled Data Engineer with strong hands-on experience in GCP, Spark-Scala, Python, and .NET.
📍 Location: Suncity, Sector 54, Gurugram (Hybrid – 3 days onsite)
💼 Experience: 6–10 Years
⏱️ Notice Period: Immediate joiners only
Required Skills:
- 5+ years of experience in distributed computing (Spark) and software development.
- 3+ years of experience in Spark-Scala
- 5+ years of experience in Data Engineering.
- 5+ years of experience in Python.
- Fluency in working with databases (preferably Postgres).
- Have a sound understanding of object-oriented programming and development principles.
- Experience working in an Agile Scrum or Kanban development environment.
- Experience working with version control software (preferably Git).
- Experience with CI/CD pipelines.
- Experience with automated testing, including integration/delta, load, and performance testing.
5+ years of experience in designing, building, and implementing Configure, Price, Quote (CPQ) solutions on the Conga CPQ platform. Conga CPQ/CLM expertise required.
Design and Development:
Develop and design Conga CPQ solutions that align with business objectives and technology best practices.
Implementation:
Lead and guide the implementation of Conga CPQ projects, ensuring seamless integration and optimal performance.
Technical Leadership:
Provide technical leadership and guidance to teams, including code reviews and establishing development standards.
Problem Solving:
Troubleshoot and resolve technical issues related to Conga CPQ, including those arising from integrations.
Integration Expertise:
Design and manage integrations between Conga CPQ and other systems, including Salesforce.
Stakeholder Management:
Collaborate with business stakeholders to understand requirements and ensure that solutions meet their needs
About the Role
We are looking for an experienced Senior Backend Developer to design and build scalable, secure, and high-performance backend systems. The ideal candidate will have deep expertise in Python/Django, microservices architecture, and cloud technologies, along with strong problem-solving skills and leadership capabilities.
Key Responsibilities
•Design and develop backend services using Django and Python.
•Architect and implement microservices-based solutions for scalability and maintainability.
•Work with PostgreSQL and Redis for efficient data storage and caching.
•Build and maintain RESTful APIs and ensure robust API design principles.
•Implement system design best practices for high availability and fault tolerance.
•Containerize applications using Docker and manage deployments with Kubernetes.
•Integrate with cloud platforms (AWS/Azure) for hosting and infrastructure management.
•Apply security best practices to protect data and application integrity.
•Collaborate with frontend, QA, and DevOps teams for seamless delivery.
•Mentor junior developers and conduct code reviews to maintain quality standards.
Required Skills & Expertise
•Django/Python – Advanced proficiency in backend development.
•Microservices Architecture – Strong understanding of distributed systems.
•PostgreSQL & Redis – Expertise in relational and in-memory databases.
•Docker/Kubernetes – Hands-on experience with containerization and orchestration.
•API Design & System Design – Ability to design scalable and secure systems.
•Cloud (AWS/Azure) – Practical experience with cloud services and deployments.
•Security Best Practices – Knowledge of authentication, authorization, and data protection.
Preferred Qualifications
•Experience with CI/CD pipelines and DevOps practices.
•Familiarity with message queues (e.g., RabbitMQ, Kafka).
•Exposure to monitoring tools (Prometheus, Grafana).
What We Offer
•Competitive salary and benefits.
•Opportunity to work on cutting-edge backend technologies.
•Collaborative and growth-oriented work environment.
🌍 We’re Hiring: Senior Field AI Engineer | Remote | Full-time
Are you passionate about pioneering enterprise AI solutions and shaping the future of agentic AI?
Do you thrive in strategic technical leadership roles where you bridge advanced AI engineering with enterprise business impact?
We’re looking for a Senior Field AI Engineer to serve as the technical architect and trusted advisor for enterprise AI initiatives. You’ll translate ambitious business visions into production-ready applied AI systems, implementing agentic AI solutions for large enterprises.
What You’ll Do:
🔹 Design and deliver custom agentic AI solutions for mid-to-large enterprises
🔹 Build and integrate intelligent agent systems using frameworks like LangChain, LangGraph, CrewAI
🔹 Develop advanced RAG pipelines and production-grade LLM solutions
🔹 Serve as the primary technical expert for enterprise accounts and build long-term customer relationships
🔹 Collaborate with Solutions Architects, Engineering, and Product teams to drive innovation
🔹 Represent technical capabilities at industry conferences and client reviews
What We’re Looking For:
✔️ 7+ years of experience in AI/ML engineering with production deployment expertise
✔️ Deep expertise in agentic AI frameworks and multi-agent system design
✔️ Advanced Python programming and scalable backend service development
✔️ Hands-on experience with LLM platforms (GPT, Gemini, Claude) and prompt engineering
✔️ Experience with vector databases (Pinecone, Weaviate, FAISS) and modern ML infrastructure
✔️ Cloud platform expertise (AWS, Azure, GCP) and MLOps/CI-CD knowledge
✔️ Strategic thinker able to balance technical vision with hands-on delivery in fast-paced environments
✨ Why Join Us:
- Drive enterprise AI transformation for global clients
- Work with a category-defining AI platform bridging agents and experts
- High-impact, customer-facing role with strategic influence
- Competitive benefits: medical, vision, dental insurance, 401(k)
Airflow developer:
Experience: 5 to 10 years, with at least 4 years of relevant experience.
Work location: Hyderabad (Hybrid Model)
Job description:
· Experience in working on Airflow.
· Experience in SQL, Python, and Object-oriented programming.
· Experience in the data warehouse, database concepts, and ETL tools (Informatica, DataStage, Pentaho, etc.).
· Azure experience and exposure to Kubernetes.
· Experience in Azure Data Factory, Azure Databricks, and Snowflake.
Required Skills: Azure Databricks/Data Factory, Kubernetes/Docker, DAG development, hands-on Python coding.
Assist with the development, execution, management, and measurement of digital marketing programs and campaigns that drive customer awareness, traffic, sales, and retention across D2C eCommerce touchpoints.
- The desired candidate must be from a D2C company.
- Strong hands-on experience in handling e-commerce platforms/websites.
- MBA in Marketing with a minimum of 3 years’ experience in the same role.
Responsibilities:
1. Develop, propose, and execute a robust promotion plan in partnership with brand teams to drive sales on owned D2C platforms
2. Create D2C channel from scratch for a brand as well as run an existing one
3. Activate key marketing channels (social, SEM, SEO, display, email, influencer, and affiliate marketing) to drive qualified traffic
4. Utilize analytics tools such as Google Analytics and Adobe Analytics to assess and optimize the performance of sites, campaigns, and marketing channels
5. Utilize web and sales analytics data to analyze business performance and provide insights and recommendations for growth
6. Develop and propose innovative marketing programs and initiatives to grow the D2C business.
7. Effectively provide executive stakeholders with timely promotion and marketing campaign performance reports
8. Ensure content marketing and content updates on owned channels to engage consumers
9. Launch a long-running program to ensure higher retention and drive repeat purchases
Requirements:
Experience in running D2C channels from an end-to-end perspective
Experience in data analytics and use of web analytics tools, such as Google Analytics, to assess and report site and channel performance
Understanding of consumer behavior and market trends
High result orientation – getting things done on time and as per agreed-upon norms.
Good Communication Skills
Attention to detail
Creativity
Good research capabilities
As one of the first members of the frontend development team, you will be the sole owner of the entire frontend development cycle for our consumer-facing web products. You will also be responsible for shaping the entire system for scale and collaborating intensively with the backend and design teams to create the best consumer experiences.
Responsibilities :
Develop all user-facing products preferably in React
Build reusable components and front-end libraries for future use
Translate designs and wireframes into high-quality code
Optimize components for maximum performance across a vast array of web-capable
devices and browsers
Focus on code maintainability and performance of the application.
Provide technical advice and assist in solving programming problems.
Enhance SEO, Analytics and overall frontend architecture for better performance
Requirements :
Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object
model
Good foundation in design and a knack for designing interactions and elegant interfaces
Good to have - understanding of React.js and its core principles
Good to have - experience with popular React.js workflows (such as Flux or Redux)
Experience with data structure libraries
Familiarity with RESTful APIs
Familiarity with modern front-end build pipelines and tools
A knack for benchmarking and optimization
Familiarity with code versioning tools (such as Git)
Proficient in industry-standard best practices such as Design Patterns, Coding Standards,
Coding modularity, Prototypes, etc.
Experience in using debugging tools used for profiling/optimizing performance
Experience in debugging, tuning and optimizing UI components
A minimum of two (2) years of relevant development or engineering experience.
