
Role: Data Engineer
Company: PayU
Location: Bangalore / Mumbai
Experience: 2-5 years
About Company:
PayU is the payments and fintech business of Prosus, a global consumer internet group and one of the largest technology investors in the world. Operating and investing globally in markets with long-term growth potential, Prosus builds leading consumer internet companies that empower people and enrich communities.
The leading online payment service provider in 36 countries, PayU is dedicated to creating a fast, simple and efficient payment process for merchants and buyers. Focused on empowering people through financial services and creating a world without financial borders where everyone can prosper, PayU is one of the biggest investors in the fintech space globally, with investments totalling $700 million to date. PayU also specializes in credit products and services for emerging markets across the globe. We are dedicated to removing risks for merchants, allowing consumers to use credit in ways that suit them and enabling a greater number of global citizens to access credit services.
Our local operations in Asia, Central and Eastern Europe, Latin America, the Middle East, Africa and South East Asia enable us to combine the expertise of high growth companies with our own unique local knowledge and technology to ensure that our customers have access to the best financial services.
India is the biggest market for PayU globally, and the company has already invested $400 million in this region in the last 4 years. PayU, in its next phase of growth, is developing a full regional fintech ecosystem providing multiple digital financial services in one integrated experience. We are going to do this through three mechanisms: build; co-build/partner; and select strategic investments.
PayU supports more than 350,000 merchants and millions of consumers making payments online, with over 250 payment methods and 1,800+ payment specialists. The markets in which PayU operates represent a potential consumer base of nearly 2.3 billion people and huge growth potential for merchants.
Job responsibilities:
- Design data infrastructure, primarily (but not exclusively) for consumption by machine learning applications
- Define database architecture needed to combine and link data, and ensure integrity across different sources
- Ensure the performance of data systems serving machine learning workloads, from customer-facing web and mobile applications built on cutting-edge open-source frameworks, to highly available RESTful services, to back-end Java-based systems
- Work with large, fast, complex data sets to solve difficult, non-routine analysis problems, applying advanced data handling techniques if needed
- Build data pipelines, including implementing, testing, and maintaining infrastructural components of the data engineering stack
- Work closely with Data Engineers, ML Engineers and SREs to gather data engineering requirements to prototype, develop, validate and deploy data science and machine learning solutions
Requirements to be successful in this role:
- Strong knowledge of and experience with Python, Pandas, data wrangling, ETL processes, statistics, data visualisation, data modelling and Informatica
- Strong experience with scalable data platforms such as Kafka and Snowflake
- Strong experience with workflow management libraries and tools such as Airflow, AWS Step Functions, etc.
- Strong experience with data engineering practices (i.e. data ingestion pipelines and ETL)
- A good understanding of machine learning methods, algorithms, pipelines, testing practices and frameworks
- (Preferred) MEng/MSc/PhD degree in computer science, engineering, mathematics, physics, or equivalent (preference: DS/AI)
- Experience with designing and implementing tools that support sharing of data, code, practices across organizations at scale
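The pipeline-building responsibilities above reduce to an extract-transform-load pass. Below is a minimal stdlib-only sketch of that pattern, not PayU's actual stack: the `payments` table, the field names, and the SQLite target are all illustrative assumptions.

```python
# Minimal ETL sketch (illustrative only: field names and the SQLite
# target are assumptions, not the actual production stack).
import sqlite3

def extract(raw_rows):
    """Extract: yield raw payment records (here, plain dicts)."""
    yield from raw_rows

def transform(rows):
    """Transform: drop malformed rows and normalise merchant/amount."""
    for row in rows:
        if row.get("amount") is None or row.get("merchant") is None:
            continue  # skip records that fail basic integrity checks
        yield (row["merchant"].strip().lower(), round(float(row["amount"]), 2))

def load(rows, conn):
    """Load: write cleaned rows into a SQLite table."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (merchant TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    conn.commit()

raw = [
    {"merchant": " Acme ", "amount": "100.5"},
    {"merchant": None, "amount": "10"},  # malformed: dropped in transform
    {"merchant": "Beta", "amount": 42},
]
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone()
print(total)  # (2, 142.5)
```

In a real deployment each stage would typically be a separate task in an orchestrator such as Airflow, with the integrity checks in `transform` enforcing the cross-source consistency the role description calls for.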

Job Summary
We are looking for a skilled Python AI Developer with experience in Flask, AI/LLM integrations, and Model Context Protocol (MCP) to build intelligent, scalable, and production-ready AI-powered applications. You will work on AI agents, tool integrations, and backend APIs that interact with modern LLM platforms like OpenAI/Claude.
Key Responsibilities
- Develop and maintain backend services using Python and Flask/FastAPI
- Build and integrate AI/LLM-based features using OpenAI, Claude, or similar models
- Implement Model Context Protocol (MCP) / tool-function calling frameworks
- Design and develop AI agents that interact with external tools, APIs, and databases
- Build RAG pipelines using vector databases for intelligent retrieval
- Manage context, memory, and conversation state in AI workflows
- Create scalable REST APIs for AI-powered applications
- Optimize performance, security, and reliability of AI services
- Collaborate with product, frontend, and data teams
Required Skills
- Strong Python programming
- Experience with Flask or FastAPI
- Hands-on with OpenAI / GPT / Claude / LLM APIs
- Experience in MCP / Tool Calling / Function Calling / AI Agents
- Knowledge of RAG, embeddings, and vector databases (Pinecone, FAISS, Chroma, Weaviate)
- API development, JSON, async programming
- Understanding of prompt engineering and context handling
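The RAG and vector-database skills listed above come down to similarity search over embeddings. A toy sketch, assuming made-up 3-dimensional vectors in place of real model embeddings and a plain dict standing in for a vector database such as FAISS or Chroma:

```python
# Toy retrieval step of a RAG pipeline: rank documents by cosine
# similarity of embeddings. The vectors and doc ids are invented for
# illustration; a real system would embed text with a model and store
# the vectors in Pinecone, FAISS, Chroma, or similar.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# (doc_id -> embedding) pairs standing in for a vector store
store = {
    "refund-policy": [0.9, 0.1, 0.0],
    "api-auth":      [0.1, 0.9, 0.2],
    "rate-limits":   [0.0, 0.3, 0.9],
}

def retrieve(query_vec, k=2):
    """Return the top-k doc ids most similar to the query embedding."""
    ranked = sorted(store, key=lambda d: cosine(query_vec, store[d]), reverse=True)
    return ranked[:k]

print(retrieve([0.8, 0.2, 0.1]))  # ['refund-policy', 'api-auth']
```

The retrieved documents would then be injected into the LLM prompt as context, which is where the prompt-engineering and context-handling skills above come in.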
Good to Have
- LangChain / LlamaIndex
- Docker / Cloud (AWS, GCP, Azure)
- WebSockets / Streaming responses
- Authentication, API security, JWT
- Basic ML / NLP knowledge
- CI/CD and deployment experience
Key Responsibilities
1. Sourcing and Procurement
- Identify, evaluate, and onboard suppliers and vendors for various products and services
- Compare suppliers based on pricing, quality standards, delivery timelines, and reliability
- Negotiate contracts, pricing, payment terms, and delivery conditions to achieve cost efficiency
2. Purchase Order Management
- Prepare and issue purchase orders with accurate specifications, quantities, and delivery schedules
- Track purchase orders and follow up with suppliers to ensure on-time delivery
- Coordinate with internal departments to align procurement requirements
- Resolve discrepancies related to purchase orders, invoices, and deliveries
3. Inventory Management
- Monitor inventory levels to ensure uninterrupted operations
- Coordinate with warehouse and logistics teams for timely stock replenishment
- Assist in inventory planning to minimize excess stock and avoid shortages
4. Supplier Relationship Management
- Develop and maintain strong, long-term relationships with suppliers and vendors
- Address supplier performance issues, quality concerns, and delivery delays
- Conduct regular supplier performance reviews
5. Compliance and Documentation
- Ensure all procurement activities comply with company policies, procedures, and statutory regulations
- Maintain accurate records of purchases, supplier contracts, pricing, and procurement documentation
- Support audits and internal reviews by providing required procurement data
Skills and Qualifications
Required Qualifications
- Proven experience in the Purchase/Procurement field or a similar role, preferably in a relevant industry
- Strong understanding of procurement processes, policies, and best practices
- Excellent negotiation, communication, and interpersonal skills
- Strong analytical, problem-solving, and decision-making abilities
- Proficiency in MS Office (Excel, Word, Outlook)
- Hands-on experience with ERP systems for purchase and inventory management
Preferred Skills
- Ability to manage multiple priorities in a fast-paced environment
- Strong coordination skills with internal teams and external vendors
Location: Hybrid/ Remote
Type: Contract / Full‑Time
Experience: 5+ Years
Qualification: Bachelor’s or Master’s in Computer Science or a related technical field
Responsibilities:
- Architect & implement the RAG pipeline: embeddings ingestion, vector search (MongoDB Atlas or similar), and context-aware chat generation.
- Design and build Python‑based services (FastAPI) for generating and updating embeddings.
- Host and apply LoRA/QLoRA adapters for per‑user fine‑tuning.
- Automate data pipelines to ingest daily user logs, chunk text, and upsert embeddings into the vector store.
- Develop Node.js/Express APIs that orchestrate embedding, retrieval, and LLM inference for real‑time chat.
- Manage vector index lifecycle and similarity metrics (cosine/dot‑product).
- Deploy and optimize on AWS (Lambda, EC2, SageMaker), containerization (Docker), and monitoring for latency, costs, and error rates.
- Collaborate with frontend engineers to define API contracts and demo endpoints.
- Document architecture diagrams, API specifications, and runbooks for future team onboarding.
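The "chunk text" step in the ingestion responsibilities above can be sketched as an overlapping word-window chunker. The chunk size, overlap, and sample log line are illustrative assumptions, not values from any real pipeline:

```python
# Sketch of the text-chunking step before embedding: split input into
# fixed-size word windows with overlap so context isn't lost at chunk
# boundaries. Size/overlap values here are arbitrary for demonstration.
def chunk_text(text, size=5, overlap=2):
    """Split text into word chunks of `size` words, overlapping by `overlap`."""
    words = text.split()
    step = size - overlap
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + size]))
        if start + size >= len(words):
            break  # final chunk reached; avoid trailing fragments
    return chunks

log = "user opened dashboard then exported the weekly report as csv today"
for i, chunk in enumerate(chunk_text(log)):
    print(i, chunk)
```

Each chunk would then be embedded and upserted into the vector index; the overlap is what keeps a sentence that straddles a boundary retrievable from either side.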
Required Skills
- Strong Python expertise (FastAPI, async programming).
- Proficiency with Node.js and Express for API development.
- Experience with vector databases (MongoDB Atlas Vector Search, Pinecone, Weaviate) and similarity search.
- Familiarity with OpenAI’s APIs (embeddings, chat completions).
- Hands‑on with parameter‑efficient fine‑tuning (LoRA, QLoRA, PEFT/Hugging Face).
- Knowledge of LLM hosting best practices on AWS (EC2, Lambda, SageMaker).
- Containerization skills (Docker).
- Good understanding of RAG architectures, prompt design, and memory management.
- Strong Git workflow and collaborative development practices (GitHub, CI/CD).
Nice‑to‑Have:
- Experience with Llama family models or other open‑source LLMs.
- Familiarity with MongoDB Atlas free tier and cluster management.
- Background in data engineering for streaming or batch processing.
- Knowledge of monitoring & observability tools (Prometheus, Grafana, CloudWatch).
- Frontend skills in React to prototype demo UIs.
Job Title: Manual QA Tester
Location: Bangalore
Experience Required: 4+ Years
Work Mode: Onsite / Hybrid (as per company policy)
Notice Period: Immediate joiners or candidates who have completed their notice period only
About the Role
We are looking for a detail-oriented and methodical Manual QA Tester with strong analytical skills and a passion for ensuring high-quality software. You will be responsible for understanding requirements, planning test coverage, identifying defects, and ensuring a seamless user experience across web and API interfaces.
Key Responsibilities
- Design, develop, and execute comprehensive test cases based on functional requirements and specifications
- Perform manual testing of web-based applications, APIs, and mobile apps (if applicable)
- Identify, record, and track bugs using JIRA or similar tools
- Work closely with developers and product teams to understand features and resolve issues
- Maintain clear and concise documentation of test results, reports, and testing processes
- Validate bug fixes, regression cases, and perform smoke testing
- Perform cross-browser and device compatibility testing
- Analyze and debug issues using logs, browser developer tools, and relevant tools
Interview Focus Areas
- Strong black-box & white-box testing mindset
- Ability to prioritize logically and structure test coverage effectively
- Attention to detail and discipline in documentation
- Quality of clarifying questions during ambiguity
- Ability to segregate API vs UI test cases and cover edge cases & security
- Understanding of web-based systems, APIs, and integration points
Mandatory Tools & Skills
- Postman – for API testing
- Chrome Developer Tools – for UI inspection, console & network analysis
- JIRA – for defect tracking and test management
- Log Analysis – basic familiarity with reading and debugging logs
- Strong command of test case writing, reporting, and scenario-based testing
Good-to-Have Skills
- Experience with tools like TestRail, Zephyr, or Confluence
- Exposure to Agile/Scrum methodology
- Understanding of database queries (basic SQL)
- Experience testing responsive designs and mobile UIs

- Experience building and managing large-scale data/analytics systems.
- Have a strong grasp of CS fundamentals and excellent problem-solving abilities. Have a good understanding of software design principles and architectural best practices.
- Be passionate about writing code and have experience coding in multiple languages, including at least one scripting language, preferably Python.
- Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision is right/wrong, and so on.
- Be a self-starter: someone who thrives in fast-paced environments with minimal ‘management’.
- Have exposure to and working knowledge of AI environments, with machine learning experience.
- Have experience working with multiple storage and indexing technologies such as MySQL, Redis, MongoDB, Cassandra and Elastic.
- Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ.
- Use the command line like a pro. Be proficient in Git and other essential software development tools.
- Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
- Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana, Graylog, StatsD, Datadog, etc.
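The MapReduce model mentioned above is easiest to see in the classic word count, sketched here as a single-process Python run of the map, shuffle, and reduce phases that a framework like Hadoop or Spark would distribute across nodes:

```python
# Classic MapReduce word count, single-process for illustration only:
# a real framework distributes each phase across a cluster.
from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word."""
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle(pairs):
    """Shuffle: group emitted values by key, as the framework would."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: sum the grouped counts for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

lines = ["kafka feeds spark", "spark writes to redis", "kafka again"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["kafka"], counts["spark"])  # 2 2
```

Spark's `rdd.flatMap(...).reduceByKey(...)` expresses the same three phases, with the shuffle handled implicitly by the engine.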
- Should be able to do mock-up design, prototyping and wireframing
- Should be able to create mobile and application designs
AAFT, a venture of Marwah Studio under visionary leader Mr. Sandeep Marwah, has been Asia's first film school, providing Media and Arts education for more than 27 years. We are India's first media university, skilling creative talents across the country.
Established in 1990, the Asian Academy of Film and Television (AAFT) is a media and communication academy located in Delhi/NCR. Based in Film City Noida, it is known for offering various undergraduate, postgraduate, diploma and short-term courses in the fields of cinema, mass communication, animation, fashion and photography. AAFT has been facilitating media education for more than 27 years, and all its courses are professional in nature. The academy is approved by the UGC and recognised by the Ministry of Education.
AAFT has international accreditation from the International Chamber of Media & Entertainment Industry. To date, AAFT has trained more than 1 lakh students from 180 countries. The academy also has collaborations with international universities such as New York University, USA; Oxford Business College, United Kingdom; and Deakin University, Melbourne, Australia. The training faculty at AAFT comprises industry experts and celebrities like Sonam Kapoor, Anil Kapoor and Nawazuddin Siddiqui.
The group has ventured into the Ed-Tech domain by launching AAFT Online, which envisions a single platform for skilling and promoting creative talent across the globe. AAFT Online aims at building a consortium of like-minded creative professionals by facilitating and promoting creative art skills across the world, sourcing, engaging and collaborating via a technologically assisted eLearning platform.
We are on a mission to create a pool of market-ready talent in the creative arts: an integrated e-Learning platform exhaustively covering national, international and multilingual programs in creative arts skills, intended to be accessible, affordable and inclusive for every creative person, anytime, anywhere.
AAFT Online is a world-class Ed-Tech platform to promote, share and collaborate with creative people. It's a common platform for learners and facilitators of creative talent.
Department: Sales
Designation: Learning Consultant
Key Responsibilities:
We are looking to hire passionate Inside Sales professionals for our Sales team. The role will be based out of Gurgaon and involves high-volume calling coupled with attractive monthly incentives!
Your primary job responsibilities will include (but are not limited to):
- Driving inquiries & admissions for the various short-term/long-term learning programs being offered.
- Managing the entire sales cycle: from engaging with a large volume of well-qualified leads to the final conversion, i.e. enrollment in the program.
- Counseling students via call/email on which program would be useful for their career progression and suggesting the best options.
- Updating the CRM on a regular basis.
Mandates:
- Job Role: Inside Sales
- Location: Sector 28, Gurgaon
- Required Work Experience: 1-5 years (sales work exp.)
- Educational Qualification: Graduation
- Budget: 5-7 LPA fixed CTC + uncapped incentives
- Shift Details: 6 days working; 10 AM-7 PM, Sundays fixed off
- Age Bracket: Not beyond 29 years
- Gender: Any
- Should be target-oriented and a self-driven individual
- Proficient in Microsoft Office, especially Excel
- Excellent communication and interpersonal skills
Role Description
- Office 365 Migration Specialist
- Position requires prior Office 365 mailbox migration experience, scheduling migration batches and process management.
- Under the general direction of the customer's IT/Security/Okta teams, this role is responsible for assisting in the planning and support of the migration from multiple on-premises Exchange environments and third-party mail services into a single Azure-hosted Office 365 environment.
- Candidates must be flexible to work Monday through Friday
Primary roles and responsibilities
- Hands-on migration/deployment of Exchange 2007/2010/2016 to Office 365 Azure-hosted solution
- Provide Level II/ Level III support for email and messaging systems
- Perform large-scale infrastructure implementations and migrations
- Configure tools for monitoring and use them to proactively identify issues within email & messaging systems
- Deployment of O365 applications (OneDrive, SharePoint, Teams) with existing data migration
- Implementation and migration of Exchange Archiving and Journaling.
- Create and maintain technical documentation.
- Develop and train Service Desk and Endpoint Support to resolve level 1 and 1.5 Office 365 issues
- Participate in monthly maintenance windows
- Performs related duties as assigned
Experience and educational requirements
- Bachelor’s degree/Any equivalent Post Graduation.
Minimum skills, knowledge, and ability requirements
- Hands-on Exchange to O365 migration experience is a must.
- Very good knowledge of OneDrive and SharePoint
- Integration of O365 with Okta
- Good communication and customer-handling skills; candidates need to be able to work with minimal supervision.
- Solid understanding of Microsoft Exchange 2007, 2010, 2013, 2016, O365.
- Solid understanding of ITIL/ITSM best practices.
- Excellent problem solving and decision-making skills.
- Experience with Veritas Enterprise Vault a plus.
- Ability to work independently and as part of a team.
- Certifications in this field are a plus.

Description
Job title: Flutter Developer
Location: Chennai
Experience: 1.5 to 3 yrs
Primary Skills: Flutter, Dart, JavaScript
Good to have: MongoDB (NoSQL), MySQL and exposure to AWS
Job Description –
* Have two or more iOS/Android apps developed with Flutter, either deployed on the App Store/Google Play or available on GitHub
* Experience with third-party libraries and APIs
* Understanding of the Agile development life-cycle
* Experience with version control and CI tools such as Git and Jenkins
* Good to have NoSQL (MongoDB) and MySQL exposure
* Good to have experience in an AWS cloud environment
* Managing self-directed assignments to resolve software defects or implement features to meet schedule commitments with a high level of quality
* Acquiring knowledge of industry and company standards, tools and technology, with a focus on pragmatic application to business-focused software solutions through initiative and self-study
Benefits
SALARY: 3,00,000 to 5,00,000 per annum (3-5 LPA)
