


Company: PluginLive
About the company:
PluginLive Technology Pvt Ltd is a leading provider of innovative HR solutions. Our mission is to transform the hiring process through technology and make it easier for organizations to find, attract, and hire top talent. We are looking for a passionate and experienced Data Engineering Lead to guide the data strategy and engineering efforts for our Campus Hiring Digital Recruitment SaaS Platform.
Role Overview:
The Data Engineering Lead will be responsible for leading the data engineering team and driving the development of data infrastructure, pipelines, and analytics capabilities for our Campus Hiring Digital Recruitment SaaS Platform. This role requires a deep understanding of data engineering, big data technologies, and team leadership. The ideal candidate will have a strong technical background, excellent leadership skills, and a proven track record of building robust data systems.
Job Description
Position: Data Engineering Lead - Campus Hiring Digital Recruitment SaaS Platform
Location: Chennai
Minimum Qualification: Bachelor’s degree in Computer Science, Engineering, Data Science, or a related field. A Master’s degree or equivalent is a plus.
Experience: 7+ years of experience in data engineering, with at least 3 years in a leadership role.
CTC: 20-30 LPA
Employment Type: Full Time
Key Responsibilities:
Data Strategy and Vision:
- Develop and communicate a clear data strategy and vision for the Campus Hiring Digital Recruitment SaaS Platform.
- Conduct market research and competitive analysis to identify trends, opportunities, and data needs.
- Define and prioritize the data roadmap, aligning it with business goals and customer requirements.
Data Infrastructure Development:
- Design, build, and maintain scalable data infrastructure and pipelines to support data collection, storage, processing, and analysis.
- Ensure the reliability, scalability, and performance of the data infrastructure.
- Implement best practices in data management, including data governance, data quality, and data security.
Data Pipeline Management:
- Oversee the development and maintenance of ETL (Extract, Transform, Load) processes.
- Ensure data is accurately and efficiently processed and available for analytics and reporting.
- Monitor and optimize data pipelines for performance and cost efficiency.
Data Analytics and Reporting:
- Collaborate with data analysts and data scientists to build and deploy advanced analytics and machine learning models.
- Develop and maintain data models, dashboards, and reports to provide insights and support decision-making.
- Ensure data is easily accessible and usable by stakeholders across the organization.
Team Leadership:
- Lead, mentor, and guide a team of data engineers, fostering a culture of collaboration, continuous improvement, and innovation.
- Conduct code reviews, provide constructive feedback, and ensure adherence to development standards.
- Collaborate with cross-functional teams including product, engineering, and marketing to ensure alignment and delivery of data goals.
Stakeholder Collaboration:
- Work closely with stakeholders to understand business requirements and translate them into technical specifications.
- Communicate effectively with non-technical stakeholders to explain data concepts and progress.
- Participate in strategic planning and decision-making processes.
Skills Required:
- Proven experience in designing and building scalable data infrastructures and pipelines.
- Strong proficiency in programming languages such as Python and R, and in data visualization tools such as Power BI, Tableau, Qlik, and Google Analytics.
- Expertise in big data technologies such as Apache Airflow, Hadoop, Spark, and Kafka, and cloud data platforms such as AWS and Oracle Cloud.
- Solid understanding of database technologies, both SQL and NoSQL.
- Experience with data modeling, data warehousing, and ETL processes.
- Strong analytical and problem-solving abilities.
- Excellent communication, collaboration, and leadership skills.
Preferred Qualifications:
- Experience in HR technology or recruitment platforms.
- Familiarity with machine learning and AI technologies.
- Knowledge of data governance and data security best practices.
- Contributions to open-source projects or active participation in the tech community.


Job Title : Python Backend Engineer (with MLOps & LLMOps Experience)
Experience : 4 to 8 Years
Location : Gurgaon Sector - 43
Employment Type : Full-time
Job Summary :
We are looking for an experienced Python Backend Engineer with a strong background in FastAPI, Django, and hands-on exposure to MLOps and LLMOps practices.
The ideal candidate will be responsible for building scalable backend solutions, integrating AI/ML models into production environments, and implementing efficient pipelines for machine learning and large language model operations.
Mandatory Skills : Python, FastAPI, Django, MLOps, LLMOps, REST API development, Docker, Kubernetes, Cloud (AWS/Azure/GCP), CI/CD.
Key Responsibilities :
- Develop, optimize, and maintain backend services using Python (FastAPI, Django).
- Design and implement API endpoints for high-performance and secure data exchange.
- Collaborate with data science teams to deploy ML/LLM models into production using MLOps/LLMOps best practices.
- Build and manage CI/CD pipelines for ML models and ensure seamless integration with backend systems.
- Implement model monitoring, versioning, and retraining workflows for machine learning and large language models.
- Optimize backend performance for scalability and reliability in AI-driven applications.
- Work with Docker, Kubernetes, and cloud platforms (AWS/Azure/GCP) for deployment and orchestration.
- Ensure best practices in code quality, testing, and security for all backend and model deployment workflows.
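The model versioning and rollback workflow described in the responsibilities above can be sketched in a framework-agnostic way. Everything below (the `ModelRegistry` class and its method names) is a hypothetical illustration of the pattern, not any team's actual stack; production setups typically use a dedicated tool such as MLflow.

```python
"""Minimal sketch of a model-versioning registry (hypothetical names)."""
from dataclasses import dataclass, field
from typing import Any, Dict, Optional


@dataclass
class ModelRegistry:
    # version number -> model artifact (any serializable object in practice)
    _versions: Dict[int, Any] = field(default_factory=dict)
    _production: Optional[int] = None

    def register(self, model: Any) -> int:
        """Store a new model artifact and return its version number."""
        version = max(self._versions, default=0) + 1
        self._versions[version] = model
        return version

    def promote(self, version: int) -> None:
        """Mark a registered version as the production model."""
        if version not in self._versions:
            raise KeyError(f"unknown model version {version}")
        self._production = version

    def production_model(self) -> Any:
        """Return the model currently serving production traffic."""
        if self._production is None:
            raise RuntimeError("no model promoted yet")
        return self._versions[self._production]


registry = ModelRegistry()
v1 = registry.register("model-artifact-v1")
v2 = registry.register("model-artifact-v2")
registry.promote(v2)
```

Rolling back is then just promoting an earlier version, which is the operational property the "versioning and retraining workflows" bullet is after.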
Required Skills & Qualifications :
- 4 to 8 years of experience as a Backend Engineer with strong expertise in Python.
- Proficient in FastAPI and Django frameworks for API and backend development.
- Hands-on experience with MLOps and LLMOps workflows (model deployment, monitoring, scaling).
- Familiarity with machine learning model lifecycle and integration into production systems.
- Strong knowledge of RESTful APIs, microservices architecture, and asynchronous programming.
- Experience with Docker, Kubernetes, and cloud environments (AWS, Azure, or GCP).
- Exposure to CI/CD pipelines and DevOps tools.
- Good understanding of Git, version control, and testing frameworks.
Nice to Have :
- Experience with LangChain, Hugging Face, or similar LLM frameworks.
- Knowledge of data pipelines, feature engineering, and ML frameworks (TensorFlow, PyTorch, etc.).
- Understanding of vector databases (Pinecone, Chroma, etc.).
Education :
- Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.


About the Role
We are hiring a Senior Backend Developer for our client, a US-based tech company, to take ownership of backend architecture, scalability, and infrastructure. This is a critical engineering role where you'll lead feature development and ensure high system performance as the company grows rapidly.
Key Responsibilities
- Build and maintain scalable backend systems using Python (Django, Flask, or FastAPI)
- Optimize and manage relational databases (PostgreSQL, MySQL)
- Design and implement asynchronous processing using Redis and RabbitMQ
- Architect and deploy containerized microservices with a focus on performance and scalability
- Provision and monitor infrastructure on AWS (EC2, RDS, S3)
- Diagnose system performance issues and implement robust solutions
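The asynchronous-processing responsibility above follows a standard producer/consumer pattern. In production the broker would be Redis or RabbitMQ (often via Celery); in this runnable sketch the stdlib `queue.Queue` stands in for the broker so only the pattern itself is shown.

```python
"""Producer/consumer sketch of asynchronous task processing."""
import queue
import threading

tasks: "queue.Queue" = queue.Queue()
results = []
results_lock = threading.Lock()


def worker() -> None:
    # Pull tasks until the producer signals shutdown with a None sentinel.
    while True:
        item = tasks.get()
        if item is None:
            tasks.task_done()
            break
        with results_lock:
            results.append(item * item)  # stand-in for real work
        tasks.task_done()


threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()

for n in range(5):   # producer: enqueue work without blocking the caller
    tasks.put(n)
for _ in threads:    # one shutdown sentinel per worker
    tasks.put(None)

tasks.join()
for t in threads:
    t.join()
```

Swapping the in-process queue for a Redis list or a RabbitMQ queue keeps the same shape while letting producers and consumers live in separate services.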
Minimum Requirements
- 3+ years of backend development experience
- Proficient in Python and at least one framework (Django, Flask, or FastAPI)
- Strong knowledge of PostgreSQL or MySQL
- Experience with Redis and RabbitMQ
- Hands-on experience with AWS (EC2, RDS, S3)
- Strong understanding of data structures, algorithms, and backend system design
Good to Have
- Familiarity with JavaScript and frontend frameworks like React.js or Vue.js
- Experience with WebSockets for real-time features
- Exposure to VoIP/WebRTC/SIP/IP PBX technologies
Key Traits We Value
- Smart, logical, and curious problem-solvers
- Self-driven individuals who thrive in fast-paced environments
- Passion for learning and adapting to new technologies
What We Offer
- Competitive Compensation
- 100% remote work
- A collaborative, growth-focused environment
- Challenging projects with real impact


Key Skills Required:
1. Proficiency in Python & Django
2. Solid understanding of Python concepts
3. Experience with some form of Machine Learning (ML)
4. Experience using libraries such as NumPy and Pandas
5. Some experience with NLP and Deep Learning using any of PyTorch, TensorFlow, Keras, scikit-learn, or similar
6. Hands on experience with RDBMS such as Postgres or MySQL
7. Experience building REST APIs using DRF or Flask
8. Comfort with Git repositories, branching and deployment using Git
9. Working experience with Docker
10. Experience deploying Django applications to AWS, DigitalOcean, or Heroku
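Item 4 above names NumPy and Pandas; as a small illustration of the vectorised style that skill implies, the sketch below min-max scales a score column without an explicit Python loop. The data is made up for the example.

```python
"""NumPy sketch: vectorised min-max normalisation of a score column."""
import numpy as np

scores = np.array([40.0, 55.0, 70.0, 100.0])

# (x - min) / (max - min), computed element-wise over the whole array
scaled = (scores - scores.min()) / (scores.max() - scores.min())
```

The same expression works unchanged on a Pandas `Series`, which is why fluency in one usually transfers to the other.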



Mandatory Skill Set: C++ and Python, UNIX, Databases (SQL or Postgres)
Developer Role Experience: 3 to 5 years
Location : Bangalore /Chennai/Hyderabad
1. Strong proficiency in C++, with fair knowledge of the language specification (telecom experience is preferred).
2. Proficient understanding of standard template library (STL): algorithms, containers, functions, and iterators
3. Must have experience on Unix platforms, should possess shell scripting skills.
4. Knowledge of compilers (gcc, g++) and debuggers (dbx). Knowledge of libraries and linking.
5. Good understanding of code versioning tools (e.g. Git, CVS etc.)
6. Able to write and understand Python scripts (both Python 2 and Python 3)
7. Hands-on with logic implementation in Python; should be familiar with list comprehensions and comfortable integrating Python with C++ and Unix scripts
8. Able to implement multithreading in both C++ and Python environment.
9. Familiar with PostgreSQL.
C++ developer with Python as secondary - 3 to 4 yrs exp / should be CW.
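Two of the Python skills this posting names, list comprehensions (item 7) and multithreading (item 8), can be sketched together. The `checksum` function and input lines are hypothetical stand-ins for per-record work handed off from a C++/Unix pipeline.

```python
"""Sketch: list comprehensions and multithreading via ThreadPoolExecutor."""
from concurrent.futures import ThreadPoolExecutor


def checksum(line: str) -> int:
    # Stand-in for real per-record processing work.
    return sum(ord(c) for c in line)


lines = ["alpha", "beta", "gamma"]

# List comprehension: filter and transform in a single expression.
long_lines = [l.upper() for l in lines if len(l) >= 5]

# Multithreading: map the work across a pool of worker threads.
with ThreadPoolExecutor(max_workers=3) as pool:
    checksums = list(pool.map(checksum, lines))
```

For CPU-bound C++ extension calls that release the GIL, the same `ThreadPoolExecutor` pattern gives genuine parallelism.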

Job Responsibilities
● Implement and maintain Django-based applications
● Use server-side logic to integrate user-facing elements.
● Develop software related to asset management
● Write and implement software solutions that integrate different systems.
● Identify and suggest various opportunities to improve efficiency and functionality.
● Coordinating the workflow between the graphic designer, the HTML coder, and yourself
● Creating self-contained, reusable, and testable modules and components
● Continuously discover, evaluate, and implement new technologies to maximize development efficiency.
● Unit-test code for robustness, including edge cases, usability, and general reliability.
● Should have the ability to work with both old and new versions of Django.
● Understand existing code base and adapt to business needs as required
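The unit-testing responsibility above specifically calls out edge cases; the sketch below shows that style with a hypothetical helper function and the stdlib `unittest` framework, where the zero-denominator case is the edge being pinned down.

```python
"""Sketch of unit-testing a function's edge cases with unittest."""
import unittest


def safe_ratio(numerator: float, denominator: float) -> float:
    """Return numerator/denominator, treating a zero denominator as 0.0."""
    if denominator == 0:
        return 0.0
    return numerator / denominator


class SafeRatioTests(unittest.TestCase):
    def test_normal_case(self):
        self.assertEqual(safe_ratio(6, 3), 2.0)

    def test_zero_denominator_edge_case(self):
        self.assertEqual(safe_ratio(1, 0), 0.0)

    def test_negative_values(self):
        self.assertEqual(safe_ratio(-4, 2), -2.0)


# Run the suite programmatically so the result object can be inspected.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SafeRatioTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```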
Required Skills
● 3 years of experience in the software industry
● Minimum 2 years of experience in Python
● Minimum 1 year of experience in Django
● Basic understanding of front-end technologies like HTML, CSS, JavaScript, and jQuery
● Ability to build user interfaces using the latest web standards
● Familiarity with event-driven programming in Python
● Able to create database schemas that represent and support business processes
● Strong unit test and debugging skills
● Experience working in Linux
● Excellent problem solving skills
● Excellent verbal and written communication skills
● Ability to work well in a team development environment

What You'll Do
You will be part of our data platform & data engineering team. As part of this agile team, you will work in our cloud-native environment and perform the following activities to support core product development and client-specific projects:
- You will develop the core engineering frameworks for an advanced self-service data analytics product.
- You will work with multiple types of data storage technologies such as relational, blobs, key-value stores, document databases and streaming data sources.
- You will work with the latest technologies for data federation with MPP (Massively Parallel Processing) capabilities.
- Your work will entail backend architecture to enable data modeling, data queries and API development for both back-end and front-end data interfaces.
- You will support client-specific data processing needs using SQL and Python/PySpark
- You will integrate our product with other data products through Django APIs
- You will partner with other team members in understanding the functional / non-functional business requirements, and translate them into software development tasks
- You will follow the software development best practices in ensuring that the code architecture and quality of code written by you is of high standard, as expected from an enterprise software
- You will be a proactive contributor to team and project discussions
Who you are
- Strong education track record - Bachelor's or an advanced degree in Computer Science or a related engineering discipline from an Indian Institute of Technology or an equivalent premier institute.
- 2-3 years of experience in data queries, data processing and data modeling
- Excellent ANSI SQL skills to handle complex queries
- Excellent Python and Django programming skills.
- Strong knowledge of and experience with modern, distributed data stack components such as Spark, Hive, Airflow, Kubernetes, Docker, etc.
- Experience with cloud environments (AWS, Azure) and native cloud technologies for data storage and data processing
- Experience with relational SQL and NoSQL databases, including Postgres, blob stores, MongoDB, etc.
- Familiarity with ML models is highly preferred
- Experience with Big Data processing and performance optimization
- Should know how to write modular, optimized and documented code.
- Should have good knowledge around error handling.
- Experience with version control systems such as Git
- Strong problem solving and communication skills.
- Self-starter, continuous learner.
Good to have some exposure to
- Start-up experience is highly preferred
- Exposure to any Business Intelligence (BI) tools like Tableau, Dundas, Power BI etc.
- Agile software development methodologies.
- Working in multi-functional, multi-location teams
What You'll Love About Us – Do ask us about these!
- Be an integral part of the founding team. You will work directly with the founder
- Work Life Balance. You can't do a good job if your job is all you do!
- Prepare for the Future. Academy – we are all learners; we are all teachers!
- Diversity & Inclusion. HeForShe!
- Internal Mobility. Grow with us!
- Business knowledge of multiple sectors


About LatentBridge:
We are a global intelligent automation firm with a market-leading pay-as-you-go SaaS platform and proprietary automation accelerators that can optimise and scale enterprises’ digital programs. We provide end-to-end automation solutions through advisory, implementation, and managed services. Our cloud platform and industry-focused AI products enable us to make automation accessible to every enterprise.
Skills: Python, JavaScript, HTML, CSS.
Experience Range - 3 - 7 Years.
Roles and Responsibilities
- Write automated tests with Pytest
- Develop utility modules and libraries
- Maintain a unified server for API testing
- Build automation frameworks for performance and functional testing
- Build mock services
- Work in a self-driven manner
Desired Candidate Profile
- Python
- Automation
- Development

- Write clean, well-designed code
- Produce detailed specifications
- Troubleshoot, test and maintain the core product software and databases to ensure strong optimization and functionality
- Contribute in all phases of the development lifecycle
- Follow industry best practices
- Develop and deploy new features, along with related procedures and tools where necessary.
Requirements :
- Knows best practices for front-end development
- Strong knowledge of JavaScript APIs.
- Must have experience in building web applications in Python/Django
- Able to handle multiple databases in Django.
- Knows how to set up multiple environments (production/staging) in Django.
- Knows how to set up/deploy Django apps on AWS.
- Knowledge of MongoDB, PostgreSQL, and MySQL
- Familiar with Docker/Docker Compose
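The multiple-databases requirement above is usually met in Django with a database router. The sketch below is a hypothetical illustration (the `analytics` app label and `analytics_db` alias are invented for the example); a router is a plain Python class, so it runs here without Django installed.

```python
"""Sketch of a Django database router for a second database."""


class AnalyticsRouter:
    """Send the analytics app to its own database; returning None lets
    every other app fall through to Django's default routing."""

    route_app_labels = {"analytics"}

    def db_for_read(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return "analytics_db"
        return None

    def db_for_write(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return "analytics_db"
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label in self.route_app_labels:
            return db == "analytics_db"
        return None


# In settings.py this would be registered with something like:
# DATABASE_ROUTERS = ["myproject.routers.AnalyticsRouter"]
```

Django consults each router in `DATABASE_ROUTERS` in order, so project-wide defaults stay in `settings.DATABASES` while only the exceptional apps are routed explicitly.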



