
Sr. Software Engineer (Back End)
at an award-winning robotics company, supported by SINE IIT-

• Experienced in designing and integrating RESTful APIs
• Knowledge of Python
• Excellent debugging and optimization skills
SKILLS
• 3-5 years of experience building large-scale software applications and working with large
software teams.
• Bachelor’s degree in computer science, information technology, or engineering
• Experience designing and integrating RESTful APIs (see the brief sketch after this list)
• Knowledge of Python and Backend Development
• Experience building Web/Mobile applications
• Excellent debugging and optimization skills
• Unit and Integration testing experience
• Knowledgeable about engineering processes and good practices
• Passionate about learning new tools; able to continuously learn and acquire knowledge
• Able to adapt to tasks of varying complexity.
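For context, a minimal sketch of the kind of RESTful endpoint and unit test implied by the skills above, assuming FastAPI with an in-memory store; the /items resource and its fields are illustrative, not taken from the posting:

# Minimal sketch: one REST resource plus a unit test (FastAPI assumed).
from fastapi import FastAPI, HTTPException
from fastapi.testclient import TestClient
from pydantic import BaseModel

app = FastAPI()
_items: dict[int, dict] = {}  # in-memory store standing in for a real database


class Item(BaseModel):
    name: str
    price: float


@app.post("/items/{item_id}", status_code=201)
def create_item(item_id: int, item: Item):
    if item_id in _items:
        raise HTTPException(status_code=409, detail="Item already exists")
    _items[item_id] = item.model_dump()
    return _items[item_id]


@app.get("/items/{item_id}")
def read_item(item_id: int):
    if item_id not in _items:
        raise HTTPException(status_code=404, detail="Item not found")
    return _items[item_id]


def test_create_and_read_item():
    # Unit test exercising both endpoints through the test client.
    client = TestClient(app)
    assert client.post("/items/1", json={"name": "widget", "price": 9.99}).status_code == 201
    assert client.get("/items/1").json()["name"] == "widget"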

Similar jobs


Company: PluginLive
About the company:
PluginLive Technology Pvt Ltd is a leading provider of innovative HR solutions. Our mission is to transform the hiring process through technology and make it easier for organizations to find, attract, and hire top talent. We are looking for a passionate and experienced Data Engineering Lead to guide the data strategy and engineering efforts for our Campus Hiring Digital Recruitment SaaS Platform.
Role Overview:
The Data Engineering Lead will be responsible for leading the data engineering team and driving the development of data infrastructure, pipelines, and analytics capabilities for our Campus Hiring Digital Recruitment SaaS Platform. This role requires a deep understanding of data engineering, big data technologies, and team leadership. The ideal candidate will have a strong technical background, excellent leadership skills, and a proven track record of building robust data systems.
Job Description
Position: Data Engineering Lead - Campus Hiring Digital Recruitment SaaS Platform
Location: Chennai
Minimum Qualification: Bachelor's degree in Computer Science, Engineering, Data Science, or a related field. A Master's degree or equivalent is a plus.
Experience: 7+ years of experience in data engineering, with at least 3 years in a leadership role.
CTC: 20-30 LPA
Employment Type: Full Time
Key Responsibilities:
Data Strategy and Vision:
- Develop and communicate a clear data strategy and vision for the Campus Hiring Digital Recruitment SaaS Platform.
- Conduct market research and competitive analysis to identify trends, opportunities, and data needs.
- Define and prioritize the data roadmap, aligning it with business goals and customer requirements.
Data Infrastructure Development:
- Design, build, and maintain scalable data infrastructure and pipelines to support data collection, storage, processing, and analysis.
- Ensure the reliability, scalability, and performance of the data infrastructure.
- Implement best practices in data management, including data governance, data quality, and data security.
Data Pipeline Management:
- Oversee the development and maintenance of ETL (Extract, Transform, Load) processes.
- Ensure data is accurately and efficiently processed and available for analytics and reporting.
- Monitor and optimize data pipelines for performance and cost efficiency.
Data Analytics and Reporting:
- Collaborate with data analysts and data scientists to build and deploy advanced analytics and machine learning models.
- Develop and maintain data models, dashboards, and reports to provide insights and support decision-making.
- Ensure data is easily accessible and usable by stakeholders across the organization.
Team Leadership:
- Lead, mentor, and guide a team of data engineers, fostering a culture of collaboration, continuous improvement, and innovation.
- Conduct code reviews, provide constructive feedback, and ensure adherence to development standards.
- Collaborate with cross-functional teams including product, engineering, and marketing to ensure alignment and delivery of data goals.
Stakeholder Collaboration:
- Work closely with stakeholders to understand business requirements and translate them into technical specifications.
- Communicate effectively with non-technical stakeholders to explain data concepts and progress.
- Participate in strategic planning and decision-making processes.
Skills Required:
- Proven experience in designing and building scalable data infrastructures and pipelines.
- Strong proficiency in programming languages such as Python and R, and in data visualization tools like Power BI, Tableau, Qlik, and Google Analytics.
- Expertise in big data technologies such as Apache Airflow, Hadoop, Spark, and Kafka, and in cloud data platforms like AWS and Oracle Cloud.
- Solid understanding of database technologies, both SQL and NoSQL.
- Experience with data modeling, data warehousing, and ETL processes (a brief sketch follows this posting).
- Strong analytical and problem-solving abilities.
- Excellent communication, collaboration, and leadership skills.
Preferred Qualifications:
- Experience in HR technology or recruitment platforms.
- Familiarity with machine learning and AI technologies.
- Knowledge of data governance and data security best practices.
- Contributions to open-source projects or active participation in the tech community.
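For illustration, a minimal sketch of the extract-transform-load step described under Data Pipeline Management above. It assumes a CSV export named applications.csv with application_id and applied_at columns and uses a local SQLite file in place of a warehouse; a production pipeline would typically run under an orchestrator such as Apache Airflow against a cloud data platform:

# Minimal ETL sketch (file names and columns are illustrative assumptions).
import sqlite3
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    # Extract: read the raw applications export.
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Transform: normalise column names, drop duplicates, enforce basic quality rules.
    df = df.rename(columns=str.lower).drop_duplicates(subset=["application_id"])
    df["applied_at"] = pd.to_datetime(df["applied_at"], errors="coerce")
    return df.dropna(subset=["application_id", "applied_at"])

def load(df: pd.DataFrame, db_path: str) -> None:
    # Load: append the cleaned rows into the reporting table.
    with sqlite3.connect(db_path) as conn:
        df.to_sql("applications", conn, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract("applications.csv")), "warehouse.db")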



We're looking for AI/ML enthusiasts who build, not just study. If you've implemented transformers from scratch, fine-tuned LLMs, or created innovative ML solutions, we want to see your work!
Before applying, make sure (GitHub profile required):
1. Your GitHub must include:
- At least one substantial ML/DL project with documented results
- Code demonstrating PyTorch/TensorFlow implementation skills
- Clear documentation and experiment tracking
- Bonus: Contributions to ML open-source projects
2. Pin your best projects that showcase:
- LLM fine-tuning and evaluation
- Data preprocessing pipelines
- Model training and optimization
- Practical applications of AI/ML
Technical Requirements:
- Solid understanding of deep learning fundamentals
- Python + PyTorch/TensorFlow expertise
- Experience with Hugging Face transformers (a brief fine-tuning sketch follows this posting)
- Hands-on with large dataset processing
- NLP/Computer Vision project experience
Education:
- Completed/Pursuing Bachelor's in Computer Science or related field
- Strong foundation in ML theory and practice
Apply if:
- You have done projects using GenAI, machine learning, or deep learning.
- You have strong Python coding experience.
- You are available to start immediately in our Hyderabad office.
- You are always hungry to learn something new and aim to step up at a fast pace.
We value quality implementations and thorough documentation over quantity. Show us how you think through problems and implement solutions!
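As a reference point for the LLM fine-tuning work mentioned above, here is a minimal sketch using the Hugging Face transformers and datasets libraries. The distilbert-base-uncased checkpoint, the imdb dataset, and the tiny training subset are placeholder choices, not requirements of the posting:

# Minimal fine-tuning sketch (model, dataset, and sizes are illustrative).
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

dataset = load_dataset("imdb")  # example dataset; swap in your own task data

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=256)

tokenized = dataset.map(tokenize, batched=True)

args = TrainingArguments(output_dir="out", num_train_epochs=1,
                         per_device_train_batch_size=8, logging_steps=50)
trainer = Trainer(model=model, args=args,
                  train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
                  eval_dataset=tokenized["test"].select(range(500)))
trainer.train()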


Key Responsibilities
- Design and implement scalable, maintainable, and efficient Python applications
- Lead technical projects from conception to deployment
- Collaborate with cross-functional teams to define and implement new features
- Write clean, testable code with appropriate documentation
- Conduct code reviews and provide constructive feedback to team members
- Mentor junior developers and contribute to their professional growth
- Participate in technical architecture discussions and decision-making
- Troubleshoot and debug complex software issues
- Contribute to continuous improvement of development practices and processes
- Design and implement AI/ML solutions using LLMs and related technologies
- Optimize and maintain AI model deployment pipelines
Required Qualifications
- Bachelor's degree in Computer Science, Software Engineering, or related field
- 5+ years of professional software development experience
- Strong proficiency in Python and its ecosystem (Django/Flask, FastAPI)
- Experience with SQL and NoSQL databases
- Solid understanding of software design patterns and principles
- Experience with version control systems (Git)
- Strong knowledge of RESTful APIs and microservices architecture
- Proficiency in writing unit tests and understanding of TDD practices
- Experience with CI/CD pipelines and deployment automation
- Strong problem-solving and analytical skills
AI/ML Technical Skills
- Experience with LLM frameworks (LangChain, LlamaIndex)
- Knowledge of working with large language models (GPT, Claude, etc.)
- Understanding of prompt engineering and LLM fine-tuning concepts
- Experience with vector databases (Pinecone, Weaviate, or similar)
- Familiarity with AI model deployment and serving (BentoML, Ray Serve)
- Experience with machine learning libraries (PyTorch, TensorFlow, or similar)
- Knowledge of AI/ML observability and monitoring tools
- Understanding of AI safety practices and responsible AI development
- Experience with embedding models and semantic search implementations (see the sketch after this list)
- Familiarity with AI application development patterns and best practices
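To illustrate the embedding and semantic-search item above, a minimal sketch of cosine-similarity ranking over document vectors. The embed() function is a stand-in for a real embedding model, and in practice the vectors would live in a vector database such as Pinecone or Weaviate rather than an in-memory NumPy array:

# Minimal semantic-search sketch (embed() is a placeholder, not a real model).
import numpy as np

def embed(texts: list[str]) -> np.ndarray:
    # Placeholder: deterministic pseudo-embeddings so the sketch runs end to end.
    rng = np.random.default_rng(abs(hash(tuple(texts))) % (2**32))
    return rng.normal(size=(len(texts), 384))

def semantic_search(query: str, docs: list[str], top_k: int = 3) -> list[tuple[str, float]]:
    doc_vecs = embed(docs)
    query_vec = embed([query])[0]
    # Cosine similarity between the query vector and every document vector.
    sims = doc_vecs @ query_vec / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec))
    ranked = np.argsort(sims)[::-1][:top_k]
    return [(docs[i], float(sims[i])) for i in ranked]

print(semantic_search("refund policy", ["Shipping times", "How refunds work", "Careers"]))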


Roles:
- Developing core infrastructure in Python and Django.
- Developing models and business logic (e.g., transactions, payments, diet plans, search, etc.); see the sketch after this list.
- Architecting servers and services that enable new product features.
- Building out newly enabled product features
- Minimum 4 years of industry or open-source experience.
- Proficient in at least one OO language: Python(preferred)/Golang/Java.
- Writing high-performance, reliable and maintainable code.
- Good knowledge of database structures, theories, principles, and practices.
- Experience working with AWS components [EC2, S3, RDS, SQS, ECS, Lambda].
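As a rough illustration of the "models and business logic" item, a minimal Django sketch. The Transaction model, its fields, and the capture() helper are hypothetical examples, not part of the role description:

# Minimal Django sketch (model and helper are illustrative assumptions).
from django.db import models, transaction


class Transaction(models.Model):
    PENDING, CAPTURED, FAILED = "pending", "captured", "failed"
    STATUS_CHOICES = [(s, s) for s in (PENDING, CAPTURED, FAILED)]

    user_id = models.BigIntegerField(db_index=True)
    amount = models.DecimalField(max_digits=10, decimal_places=2)
    status = models.CharField(max_length=16, choices=STATUS_CHOICES, default=PENDING)
    created_at = models.DateTimeField(auto_now_add=True)


def capture(txn_id: int) -> Transaction:
    # Business logic: atomically move a pending transaction to captured.
    with transaction.atomic():
        txn = Transaction.objects.select_for_update().get(pk=txn_id)
        if txn.status != Transaction.PENDING:
            raise ValueError(f"Cannot capture a {txn.status} transaction")
        txn.status = Transaction.CAPTURED
        txn.save(update_fields=["status"])
    return txn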

- Minimum 4 years’ experience developing Node.js applications on top of RESTful APIs.
- Experience building single page applications using JavaScript frameworks and libraries (React)
- Experience with cross-browser, cross-platform and design constraints on the web.
- Experience with test automation: TDD, unit/integration/functional testing.
- Solid understanding of object-oriented design and programming concepts.
- Have a passion for quality and writing clean, solid, readable code that scales and performs well.
- Proficient in Git and familiarity with continuous integration.
- A team player who values collaboration, innovation, and inclusion
- Comfortable working in an Agile environment
- Strong verbal and written communication skills


Job definition
Working at CAST R&D means being an important part of a highly talented, fast-paced, multi-cultural and Agile team based in Paris (France) and Bangalore (India). The team builds a sophisticated source code analysis platform leveraging parsing, control flow, data flow and other mechanisms to fully understand the inner structure of the complex IT systems developed and used by Fortune 500 companies.
Working as an individual contributor within the team, you will contribute to the core part of our platform: source code analyzers. You will contribute to the evolution of our technology to support the latest evolutions of languages and frameworks. You will have the opportunity to work on the different parts of the analysis chain, from parsing to developing new quality rules, notably the ones related to the emerging CISQ standard. Python or C++ will be your main languages.
About your Team
We develop source code analyzers, which are plugins of our platform, CAST AIP.
Close to customer needs, we deliver features and fixes incrementally according to priorities, using a Lean approach and eXtreme Programming:
• We write user documentation
• We thank our testers for finding bugs, which enables us to add more unit test coverage.
Profile
The candidate should have a passion for technology and a flexible, creative approach to problem solving.
• Must have at least 4 years of experience
• Must have expertise in Python or C++ development
• Must be pragmatic
• Must have excellent written, oral and telephone communication skills in English.
• Must have strong analytical and logical skills
• Must be willing to follow a framework of rules on how to write and design static analyzers (see the sketch below)
• Hands-on TDD practitioner: writes unit tests as a matter of course
• Must be willing to do a significant amount of maintenance work
• Eager to learn new languages and frameworks at a high level
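To give a flavour of the analyzer and quality-rule work described above, a minimal sketch of a static-analysis rule built on Python's ast module; this is not CAST's framework, and the rule id and check are illustrative only:

# Minimal static-analysis rule sketch: flag bare "except:" clauses.
import ast

RULE_ID = "QR-BARE-EXCEPT"  # illustrative rule identifier

class BareExceptChecker(ast.NodeVisitor):
    def __init__(self) -> None:
        self.violations: list[tuple[int, str]] = []

    def visit_ExceptHandler(self, node: ast.ExceptHandler) -> None:
        # A handler with no exception type catches everything and hides real defects.
        if node.type is None:
            self.violations.append((node.lineno, f"{RULE_ID}: bare 'except:' clause"))
        self.generic_visit(node)

def check_source(source: str) -> list[tuple[int, str]]:
    checker = BareExceptChecker()
    checker.visit(ast.parse(source))
    return checker.violations

print(check_source("try:\n    risky()\nexcept:\n    pass\n"))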


Knowledge of Model-to-Code Generation
Ability to work independently, with minimal training and direct guidance
Ability to respond to customer inquiries quickly
Ability to quickly modify/setup routes
Familiarity with Rhapsody secure transmission protocols (e.g., Secure File Transfer (SFT) and Simple Object Access Protocol (SOAP) routes, etc.)
Prior experience with protocols like OSLC, SOAP, and REST APIs
Ability to identify and resolve exceptions in electronic data exchange between EMR data submitters and data recipients.
Knowledge of HL7/XML/FHIR/EDI standards
Strong in building JUnit tests during development


Working behind the scenes, the ideal candidate has a unique blend of technical expertise and insatiable curiosity, with a methodical, analytical mindset.
You should be comfortable working alongside a team as well as independently in the design and development of mission-critical websites, applications, and layers of the infrastructure.
Experience with JavaScript, Amazon Web Services (AWS), and Git.
The selected candidate must have in-depth knowledge of basic algorithms and data structures. The developer will be responsible for building and maintaining functional and stable web applications.
Develop server-side logic, and define and maintain the central database
Ensure high performance and responsiveness to requests from front-end applications, and integrate front-end elements
Participate in the entire application lifecycle, focusing on coding and debugging.
1. Write clean code to develop functional web applications
2. Gather and address technical and design requirements
3. Build reusable code and libraries for future use
4. Implement security and data protection (see the sketch below)
5. Conduct UI tests and optimize performance
6. Design robust APIs to support mobile and desktop users
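As a small illustration of item 4 (security and data protection), a standard-library sketch of salted password hashing with PBKDF2 and constant-time verification; the iteration count and storage format are example choices only:

# Minimal password-hashing sketch (standard library only; parameters illustrative).
import hashlib
import hmac
import os

def hash_password(password: str, *, iterations: int = 200_000) -> str:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return f"pbkdf2_sha256${iterations}${salt.hex()}${digest.hex()}"

def verify_password(password: str, stored: str) -> bool:
    _, iterations, salt_hex, digest_hex = stored.split("$")
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                    bytes.fromhex(salt_hex), int(iterations))
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(candidate.hex(), digest_hex)

token = hash_password("s3cret")
assert verify_password("s3cret", token) and not verify_password("wrong", token)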


