


Company: PluginLive
About the company:
PluginLive Technology Pvt Ltd is a leading provider of innovative HR solutions. Our mission is to transform the hiring process through technology and make it easier for organizations to find, attract, and hire top talent. We are looking for a passionate and experienced Data Engineering Lead to guide the data strategy and engineering efforts for our Campus Hiring Digital Recruitment SaaS Platform.
Role Overview:
The Data Engineering Lead will be responsible for leading the data engineering team and driving the development of data infrastructure, pipelines, and analytics capabilities for our Campus Hiring Digital Recruitment SaaS Platform. This role requires a deep understanding of data engineering, big data technologies, and team leadership. The ideal candidate will have a strong technical background, excellent leadership skills, and a proven track record of building robust data systems.
Job Description
Position: Data Engineering Lead - Campus Hiring Digital Recruitment SaaS Platform
Location: Chennai
Minimum Qualification: Bachelor’s degree in Computer Science, Engineering, Data Science, or a related field. Master’s degree or equivalent is a plus.
Experience: 7+ years of experience in data engineering, with at least 3 years in a leadership role.
CTC: 20-30 LPA
Employment Type: Full Time
Key Responsibilities:
Data Strategy and Vision:
- Develop and communicate a clear data strategy and vision for the Campus Hiring Digital Recruitment SaaS Platform.
- Conduct market research and competitive analysis to identify trends, opportunities, and data needs.
- Define and prioritize the data roadmap, aligning it with business goals and customer requirements.
Data Infrastructure Development:
- Design, build, and maintain scalable data infrastructure and pipelines to support data collection, storage, processing, and analysis.
- Ensure the reliability, scalability, and performance of the data infrastructure.
- Implement best practices in data management, including data governance, data quality, and data security.
Data Pipeline Management:
- Oversee the development and maintenance of ETL (Extract, Transform, Load) processes.
- Ensure data is accurately and efficiently processed and available for analytics and reporting.
- Monitor and optimize data pipelines for performance and cost efficiency.
Data Analytics and Reporting:
- Collaborate with data analysts and data scientists to build and deploy advanced analytics and machine learning models.
- Develop and maintain data models, dashboards, and reports to provide insights and support decision-making.
- Ensure data is easily accessible and usable by stakeholders across the organization.
Team Leadership:
- Lead, mentor, and guide a team of data engineers, fostering a culture of collaboration, continuous improvement, and innovation.
- Conduct code reviews, provide constructive feedback, and ensure adherence to development standards.
- Collaborate with cross-functional teams including product, engineering, and marketing to ensure alignment and delivery of data goals.
Stakeholder Collaboration:
- Work closely with stakeholders to understand business requirements and translate them into technical specifications.
- Communicate effectively with non-technical stakeholders to explain data concepts and progress.
- Participate in strategic planning and decision-making processes.
Skills Required:
- Proven experience in designing and building scalable data infrastructures and pipelines.
- Strong proficiency in programming languages such as Python and R, and in data visualization tools like Power BI, Tableau, Qlik, and Google Analytics.
- Expertise in big data and workflow orchestration technologies such as Apache Airflow, Hadoop, Spark, and Kafka, and in cloud data platforms like AWS and Oracle Cloud (an illustrative pipeline sketch follows this list).
- Solid understanding of database technologies, both SQL and NoSQL.
- Experience with data modeling, data warehousing, and ETL processes.
- Strong analytical and problem-solving abilities.
- Excellent communication, collaboration, and leadership skills.
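Purely as an illustration of the pipeline and orchestration skills listed above, here is a minimal sketch of a daily ETL job in Apache Airflow with Python. The bucket, file paths, column names, and target table are hypothetical placeholders, not details from this posting.

    # Minimal Airflow DAG sketch (Airflow 2.x): daily extract -> transform -> load.
    from datetime import datetime

    import pandas as pd
    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        # Placeholder source: reading s3:// paths with pandas needs s3fs configured.
        df = pd.read_csv("s3://example-bucket/raw/applications.csv")
        df.to_parquet("/tmp/applications_raw.parquet")


    def transform():
        df = pd.read_parquet("/tmp/applications_raw.parquet")
        # Hypothetical cleanup: drop duplicates and rows missing a key column.
        df = df.drop_duplicates().dropna(subset=["candidate_id"])
        df.to_parquet("/tmp/applications_clean.parquet")


    def load():
        # Placeholder: a real pipeline would write into the warehouse, not print.
        df = pd.read_parquet("/tmp/applications_clean.parquet")
        print(f"would load {len(df)} rows into analytics.applications")


    with DAG(
        dag_id="campus_hiring_etl",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)
        t_extract >> t_transform >> t_load

In practice the load step would write into the warehouse and each task would be monitored for performance and cost, as the responsibilities above describe.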
Preferred Qualifications:
- Experience in HR technology or recruitment platforms.
- Familiarity with machine learning and AI technologies.
- Knowledge of data governance and data security best practices.
- Contributions to open-source projects or active participation in the tech community.


Mandatory
Strong Senior / Lead Software Engineer profile
Mandatory (Experience 1) - Must have a minimum of 6 years of experience in software development, including 1-2 years in a Senior or Lead role.
Mandatory (Experience 2) - Must have experience with Python and Django/Flask or a similar framework.
Mandatory (Experience 3) - Must have experience with relational databases (e.g., MySQL, PostgreSQL, Oracle).
Mandatory (Experience 4) - Must have good experience with microservices or distributed-system frameworks (e.g., Kafka, Google Pub/Sub, AWS SNS, Azure Service Bus) or message brokers (e.g., RabbitMQ); a minimal broker sketch follows this list.
Mandatory (Location) - Candidate must be from Bengaluru
Mandatory (Company) - Product / Start-up companies only
Mandatory (Stability) - Should have worked for at least 2 years at one company within the last 3 years.
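For illustration of the message-broker requirement above, here is a minimal sketch of publishing and consuming a message with RabbitMQ through the pika client. The queue name and payload are hypothetical and assume a local RabbitMQ instance.

    # Minimal RabbitMQ sketch with pika: publish one message, then consume it.
    import json

    import pika

    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="signup_events", durable=True)  # hypothetical queue

    # Producer side: publish a JSON payload marked persistent.
    channel.basic_publish(
        exchange="",
        routing_key="signup_events",
        body=json.dumps({"user_id": 42, "event": "signed_up"}),
        properties=pika.BasicProperties(delivery_mode=2),
    )

    # Consumer side: acknowledge each message after handling it.
    def handle(ch, method, properties, body):
        print("received:", json.loads(body))
        ch.basic_ack(delivery_tag=method.delivery_tag)

    channel.basic_consume(queue="signup_events", on_message_callback=handle)
    channel.start_consuming()  # blocks until interrupted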

Job Responsibilities:
● Design, test, and build scalable backend Python services.
● Collaborate closely with marketing and product teams to build innovative, robust, and easy-to-use features.
● Develop high-quality code based on detailed designs that cater to the product requirements.
● Troubleshoot, test, and maintain the core product software and databases to ensure strong optimization and functionality.
Required Skills:
● Degree in Computer Science, Software Engineering or equivalent.
● 3+ years of experience in software development.
● Expertise in Python 3.7, Django 2.2+, and REST APIs.
● Willingness to learn and ability to flourish in a dynamic, high-growth, entrepreneurial environment
● Hands-on, self-starter, capable of working independently
● True love for technology and what you do
● Maniacal attention to detail


Responsibilities:
● Designing and developing robust and scalable server-side applications using Python, Flask, Django, or other relevant frameworks and technologies.
● Collaborating with other developers, data scientists, and data engineers to design and implement RESTful APIs, web services, and microservices architectures.
● Writing clean, maintainable, and efficient code, and reviewing the code of other team members to ensure consistency and adherence to best practices.
● Participating in code reviews, testing, debugging, and troubleshooting to ensure the quality and reliability of applications.
● Optimising applications for performance, scalability, and security, and monitoring the production environment to ensure uptime and availability.
● Staying up-to-date with emerging trends and technologies in web development, and evaluating and recommending new tools and frameworks as needed.
● Mentoring and coaching junior developers to ensure they grow and develop their skills and knowledge in line with the needs of the team and the organisation.
● Communicating and collaborating effectively with other stakeholders, including product owners, project managers, and other development teams, to ensure projects are delivered on time and to specification.
You are a perfect match if you have these qualifications:
● Strong experience in Python and server-side development frameworks such as Flask or Django.
● Experience in building RESTful APIs, web services, and microservices architectures (a minimal example follows this list).
● Experience in using database technologies such as MySQL, PostgreSQL, or MongoDB.
● Familiarity with cloud-based platforms such as AWS, Azure, or Google Cloud Platform.
● Knowledge of software development best practices such as Agile methodologies, Test-Driven Development (TDD), and Continuous Integration/Continuous Deployment (CI/CD).
● Excellent problem-solving and debugging skills, and the ability to work independently as well as part of a team.
● Strong communication and collaboration skills, and the ability to work effectively with other stakeholders in a fast-paced environment.
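As a small illustration of the RESTful-service work described above, the sketch below exposes a tiny in-memory resource with Flask (one of the frameworks named); it assumes Flask 2.x, and the "candidates" resource and its fields are hypothetical.

    # Minimal Flask REST sketch: an in-memory "candidates" resource.
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    CANDIDATES = {}  # hypothetical in-memory store; a real service would use a database


    @app.get("/candidates")
    def list_candidates():
        # Return all stored candidates as JSON.
        return jsonify(list(CANDIDATES.values()))


    @app.post("/candidates")
    def create_candidate():
        # Create a candidate from the posted JSON body.
        payload = request.get_json(force=True)
        candidate_id = len(CANDIDATES) + 1
        candidate = {"id": candidate_id, "name": payload.get("name")}
        CANDIDATES[candidate_id] = candidate
        return jsonify(candidate), 201


    if __name__ == "__main__":
        app.run(debug=True)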

- Solid understanding of Data structures and Algorithms.
- Exceptional coding skills in an Object-Oriented programming language (Golang/Python)
- Must have a basic understanding of AWS (EC2, Lambda, Boto, CI/CD), Celery, RabbitMQ, and similar task queue management tools/libraries (a minimal Celery sketch follows this list)
- Experience with web technologies: Python, Linux, Apache, Solr, Memcached, Redis, gRPC
- Experience with high-performance services catering to millions of requests daily is a plus
- Strong understanding of Python and Django.
- Good knowledge of various Python Libraries, APIs, and tool kits.
- Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3.
- Proficient understanding of code versioning tools such as Git.
- Understanding of the threading limitations of Python (the GIL) and of multi-process architecture
- Understanding of databases and MySQL
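As referenced in the Celery/RabbitMQ bullet above, here is a minimal Celery task sketch; the broker URL, task name, and task body are placeholders and assume a RabbitMQ broker is reachable.

    # Minimal Celery sketch: a task queued over RabbitMQ and executed by a worker.
    from celery import Celery

    # Hypothetical broker URL; start a worker with: celery -A tasks worker
    app = Celery("tasks", broker="amqp://guest:guest@localhost//")


    @app.task(bind=True, max_retries=3)
    def send_welcome_email(self, user_id):
        # Placeholder for real work (e.g., rendering and sending an email).
        print(f"sending welcome email to user {user_id}")


    # Enqueue from application code:
    # send_welcome_email.delay(42)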
Responsibilities:
- Comply with coding standards and technical design.
- Adopt a structured coding style for easy review, testing, and maintainability of the code.
- Participate actively in troubleshooting and debugging.
- Prepare technical documentation for the code.


At nFerence Labs, the "Google of Biomedicine", we are building the world's first massive-scale platform for pharmaco-biomedical computing. Our platform is premised on using AI/Deep Learning (on clinical text, medical images, and other signals) and massive high-performance computing to help pharma companies perform faster and more efficient drug discovery, and also help early diagnosis of several key diseases.
We collaborate heavily with premier medical institutions such as the Mayo Clinic and build systems to get deep medical insights from patient information including patient notes and lab information, medical images, ECGs, etc. We are a well-funded company and are looking to grow on all fronts.
We are hiring an experienced backend staff engineer for our Pramana team. Our Digital Pathology-as-a-Service venture, Pramana, is an in-line quality assurance software suite that, for the first time in the industry, gives labs confidence in the accuracy and applicability of their digital assets while supporting industry-standard image formats.
Pramana’s whole slide imaging system is built upon the strong hardware expertise of the former Spectral Insights (which nference acquired in 2020) and the strong software expertise of nference. Modular systems with robotic automation have allowed Pramana to reduce its reliance on technical staff, which significantly reduces the total cost of ownership and makes for a more transparent model for Pramana’s clients.
Must have
- 5+ years of solid backend engineering experience in C++/Python.
- Knowledge of data structures and an eye for architecture.
- Solid CS fundamentals, fluent in multi-threaded and asynchronous programming, and a strong inclination for architecting at scale (a brief async sketch follows this list).
- Excellent technical design, problem-solving, debugging, and communication skills.
- Rapid prototyping skills; has worked on distributed systems at scale.
- Basic knowledge of SQL as well as NoSQL databases.
- Proficient in Golang/Python, design, and concurrency patterns.
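As a brief illustration of the asynchronous-programming fluency asked for above, here is a minimal asyncio sketch that runs several I/O-bound calls concurrently; the coroutine name and its inputs are hypothetical.

    # Minimal asyncio sketch: fan out I/O-bound work concurrently and gather results.
    import asyncio


    async def fetch_slide_metadata(slide_id: str) -> dict:
        # Placeholder for a real network or disk call (e.g., an HTTP or gRPC request).
        await asyncio.sleep(0.1)
        return {"slide_id": slide_id, "status": "ok"}


    async def main() -> None:
        slide_ids = ["s1", "s2", "s3"]  # hypothetical inputs
        results = await asyncio.gather(*(fetch_slide_metadata(s) for s in slide_ids))
        print(results)


    if __name__ == "__main__":
        asyncio.run(main())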
Good to have
- Proficient in writing unit tests and in profiling and benchmarking Golang applications
- Experience in maintaining protobuf contracts
- Experience in working with gRPC and grace
Benefits:
- Be a part of the “Google of biomedicine” as recognized by the Washington Post
- Work with some of the brilliant minds of the world solving exciting real-world problems through Artificial Intelligence, Machine Learning, and analytics, deriving insights by triangulating unstructured and structured information from the biomedical literature as well as from large-scale molecular and real-world datasets.
- Our benefits package includes the best of what leading organizations provide, such as stock options, paid time off, healthcare insurance, gym/broadband reimbursement.

- Strong Python development skills, with 7+ years of experience with SQL
- A bachelor’s or master’s degree in Computer Science or related areas
- 8+ years of experience in data integration and pipeline development
- Experience in implementing Databricks Delta Lake and data lakes
- Expertise in designing and implementing data pipelines using modern data engineering approaches and tools: SQL, Python, Delta Lake, Databricks, Snowflake, Spark (a brief PySpark sketch follows this list)
- Experience working with multiple file formats (Parquet, Avro, Delta Lake) and APIs
- Experience with AWS Cloud for data integration with S3
- Hands-on development experience with Python and/or Scala
- Experience with SQL and NoSQL databases
- Experience in using data modeling techniques and tools (focused on dimensional design)
- Experience with microservice architecture using Docker and Kubernetes
- Experience working with one or more of the public cloud providers, i.e., AWS, Azure, or GCP
- Experience in effectively presenting and summarizing complex data to diverse audiences through visualizations and other means
- Excellent verbal and written communication skills and strong leadership capabilities
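A minimal sketch of the pipeline pattern referenced above: read Parquet from S3 with PySpark and write a Delta table. It assumes a Spark session with Delta Lake and S3 access already configured (as on Databricks); the paths and column names are hypothetical.

    # Minimal PySpark sketch: Parquet in from S3, light cleanup, Delta table out.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("ingest_applications").getOrCreate()

    # Hypothetical source path; requires S3 credentials configured on the cluster.
    raw = spark.read.parquet("s3://example-bucket/raw/applications/")

    clean = (
        raw.dropDuplicates(["application_id"])
           .withColumn("ingested_at", F.current_timestamp())
    )

    # Requires the Delta Lake libraries (bundled on Databricks).
    clean.write.format("delta").mode("overwrite").save("s3://example-bucket/delta/applications/")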
Skills:
Python

Backend Engineer (Senior Role 7 Years) - 3 Positions
We are looking for an ambitious and self-driven Sr. Backend Engineer to join JiT Finco. As a member of the core technology team, you will be working with the existing product development team and owning a few modules that are critical to our customer journeys.
We are looking for an engineer with experience in Python (Django framework) to join our backend engineering team.
Skills and Requirements
- Bachelor’s degree in computer science, information technology, or a similar field.
- 6-8 years of previous experience working as a Python Developer
- Experience with owning and developing product modules in Django
- In-depth knowledge of Django Framework
- Experience building RESTful APIs and familiarity with Postman (a minimal Django sketch follows this list)
- Sound familiarity with GitHub
- Exposure to aspects related to product architecture
- Has experience in architecting and building for scale
- Knowledge of AWS, CI/CD pipelines, etc.
- Knowledge of performance testing frameworks
- Familiarity with Celery would be an added advantage
- Experience with Agile development
- You write clean, concise, self-documenting code that both you and your team can still understand a year later. You test everything.
- You already master a few scripting languages, or easily find your way around a new one. You enjoy picking up new things and incorporating those in your skillset.
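A minimal sketch of the kind of Django endpoint referenced above, using only Django itself; the "loans" resource, fields, and URL are hypothetical and stand in for real product modules (a real app would split this across views.py and urls.py and use ORM models).

    # Minimal Django sketch: a JSON endpoint wired into urls.py.
    import json

    from django.http import JsonResponse
    from django.urls import path
    from django.views.decorators.csrf import csrf_exempt
    from django.views.decorators.http import require_http_methods

    LOANS = []  # hypothetical in-memory store; a real app would use Django models


    @csrf_exempt
    @require_http_methods(["GET", "POST"])
    def loans(request):
        # POST creates a loan record from the JSON body; GET lists them.
        if request.method == "POST":
            payload = json.loads(request.body or "{}")
            loan = {"id": len(LOANS) + 1, "amount": payload.get("amount")}
            LOANS.append(loan)
            return JsonResponse(loan, status=201)
        return JsonResponse({"results": LOANS})


    urlpatterns = [
        path("api/loans/", loans),
    ]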
The Interview Process
- Introductory call (60 mins)
- Coding Assignments (150 mins)
- Follow-up call on the assignment (60 mins)
- Closing call and offer (30 mins)
The entire process should take a maximum of 3 days, subject to your availability.
JiT Finco Techstack
AWS, Python (Django), PostgreSQL, Celery, React






