

Bharat Headhunters
http://www.bharatheadhunters.com
Jobs at Bharat Headhunters
Job description
We are looking for a Java Backend Developer with solid experience in Network Management Systems (NMS/EMS) to be part of a fast-paced product development environment. The ideal candidate should have hands-on experience with Core Java and Spring Boot, an understanding of telecom management protocols such as SNMP and Netconf, and familiarity with the FCAPS model.
You'll be involved in developing scalable backend solutions for managing and monitoring network infrastructure, particularly within the optical networking/OTN domain.
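By way of illustration, the snippet below is a minimal sketch of the kind of device polling such a backend performs: an SNMPv2c GET for sysDescr using the open-source SNMP4J library (written against its 2.x-style API). The posting does not name a specific SNMP stack, and the target address, community string, and OID here are placeholders.

    import org.snmp4j.CommunityTarget;
    import org.snmp4j.PDU;
    import org.snmp4j.Snmp;
    import org.snmp4j.event.ResponseEvent;
    import org.snmp4j.mp.SnmpConstants;
    import org.snmp4j.smi.OID;
    import org.snmp4j.smi.OctetString;
    import org.snmp4j.smi.UdpAddress;
    import org.snmp4j.smi.VariableBinding;
    import org.snmp4j.transport.DefaultUdpTransportMapping;

    public class SysDescrPoller {
        public static void main(String[] args) throws Exception {
            // Placeholder target: community "public", address in "host/port" form.
            CommunityTarget target = new CommunityTarget();
            target.setCommunity(new OctetString("public"));
            target.setAddress(new UdpAddress("192.0.2.10/161"));
            target.setVersion(SnmpConstants.version2c);
            target.setRetries(2);
            target.setTimeout(1500);

            // GET request for sysDescr.0 (1.3.6.1.2.1.1.1.0) from MIB-II.
            PDU pdu = new PDU();
            pdu.setType(PDU.GET);
            pdu.add(new VariableBinding(new OID("1.3.6.1.2.1.1.1.0")));

            Snmp snmp = new Snmp(new DefaultUdpTransportMapping());
            snmp.listen();
            ResponseEvent event = snmp.get(pdu, target);
            PDU response = event.getResponse();
            System.out.println(response == null
                    ? "Request timed out"
                    : response.getVariableBindings());
            snmp.close();
        }
    }

In a real NMS the poller would run on a schedule, walk larger MIB subtrees, and feed results into the performance and fault pipelines.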
Key Responsibilities
- Design, develop, and maintain backend modules using Core Java & Spring Boot (a minimal controller sketch follows this listing)
- Work on NMS/EMS platforms and implement network features aligned with FCAPS
- Develop or integrate with communication protocols like SNMP, Netconf
- Collaborate with front-end, QA, and network teams to deliver end-to-end solutions
- Debug production issues, optimize system performance, and ensure scalability
- Work in Agile/Scrum teams to deliver high-quality releases within tight timelines
- Follow best practices in coding, testing, and CI/CD pipelines
Required Skills
- Strong proficiency in Core Java (Java 8 or above) and Spring Boot
- 3+ years of experience working with NMS/EMS platforms
- Good understanding of network management protocols – SNMP, Netconf
- Familiarity with FCAPS architecture (Fault, Configuration, Accounting, Performance, Security)
- Strong knowledge of Linux environments and shell scripting
- Experience with REST APIs, microservices architecture, and debugging tools
Nice to Have
- Prior exposure to optical networking technologies (OTN/SONET/SDH)
- Understanding of L0/L1 transport network configurations
- Hands-on experience with telecom vendor equipment or network simulators
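As a rough illustration of the Spring Boot and REST API work described above, here is a minimal sketch of a controller exposing fault data (the "F" in FCAPS) for managed network elements. The Alarm record, the in-memory list, and the endpoint paths are hypothetical; a real NMS backend would delegate to its own fault-management service and persistence layer.

    import java.util.List;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RestController;

    @SpringBootApplication
    public class NmsApplication {
        public static void main(String[] args) {
            SpringApplication.run(NmsApplication.class, args);
        }
    }

    // Hypothetical alarm model (Java 16+ record used for brevity).
    record Alarm(String neId, String severity, String description) {}

    @RestController
    @RequestMapping("/api/alarms")
    class AlarmController {

        // Stand-in for a real fault-management service or database.
        private final List<Alarm> alarms = List.of(
                new Alarm("otn-node-01", "MAJOR", "Loss of signal on port 1/1/2"),
                new Alarm("otn-node-02", "MINOR", "Laser bias current above threshold"));

        // GET /api/alarms -> all active alarms
        @GetMapping
        public List<Alarm> all() {
            return alarms;
        }

        // GET /api/alarms/{neId} -> alarms raised by one network element
        @GetMapping("/{neId}")
        public List<Alarm> forElement(@PathVariable("neId") String neId) {
            return alarms.stream().filter(a -> a.neId().equals(neId)).toList();
        }
    }

A GET to /api/alarms/otn-node-01 would then return the active alarms for that element as JSON.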

Job Description:
- Very good Core Java programming skills, with 7+ years of experience, of which 3-4 years in core development programs
- Expertise in Agile practices and build tooling (Jenkins/Git/continuous integration)
- Rich experience across the software development lifecycle (SDLC): requirements, analysis, coding, review, build, and test
- Fair understanding of Linux fundamentals
- Solid understanding of object-oriented programming
- Good English comprehension and communication skills, both oral and written, with the ability to write routine business correspondence and technical notes
- Strong customer-service orientation; proactive about sharing updates and flagging blockers


Job Responsibilities
- Design, build & test ETL processes using Python & SQL for the corporate data warehouse
- Inform, influence, support, and execute our product decisions
- Maintain advertising data integrity by working closely with R&D to organize and store data in a format that keeps it accurate and allows the business to quickly identify issues
- Evaluate and prototype new technologies in the area of data processing
- Think quickly, communicate clearly and work collaboratively with product, data, engineering, QA and operations teams
- High energy level, strong team player and good work ethic
- Data analysis, understanding of business requirements and translation into logical pipelines & processes
- Identification, analysis & resolution of production & development bugs
- Support the release process including completing & reviewing documentation
- Configure data mappings & transformations to orchestrate data integration & validation
- Provide subject matter expertise
- Document solutions, tools & processes
- Create & support test plans with hands-on testing
- Peer reviews of work developed by other data engineers within the team
- Establish good working relationships & communication channels with relevant departments
Skills and Qualifications we look for
- University degree 2.1 or higher (or equivalent) in a relevant subject; a Master's degree in a data-related subject is a strong advantage.
- 4-6 years of experience in data engineering.
- Strong coding ability and software development experience in Python.
- Strong hands-on experience with SQL and Data Processing.
- Experience with Google Cloud Platform (Cloud Composer, Dataflow, Cloud Functions, BigQuery, Cloud Storage, Dataproc); a minimal pipeline sketch follows this list.
- Good working experience with at least one ETL/orchestration tool (Airflow preferred).
- Strong analytical and problem-solving skills.
- Good-to-have skills: Apache PySpark, CircleCI, Terraform.
- Motivated, self-directed, comfortable with ambiguity, and interested in emerging technologies and agile, collaborative ways of working.
- Understanding of and experience with Agile/Scrum delivery methodology.
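Since Dataflow appears in the stack above, the sketch below shows the shape of a simple batch pipeline it could run, written with the Apache Beam SDK (Java shown here; Beam's Python SDK would match the Python emphasis of this role). The bucket paths and transforms are purely illustrative.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Filter;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class EventCleanupPipeline {
        public static void main(String[] args) {
            // Runner, project, and region come from the command line;
            // with no args the pipeline uses the local DirectRunner.
            PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
            Pipeline pipeline = Pipeline.create(options);

            pipeline
                // Hypothetical input: raw CSV event exports in a Cloud Storage bucket.
                .apply("ReadRaw", TextIO.read().from("gs://example-bucket/raw/events-*.csv"))
                // Drop blank lines before further processing.
                .apply("DropBlank", Filter.by((String line) -> !line.trim().isEmpty()))
                // Normalise records; a real pipeline would parse and validate fields here.
                .apply("Normalise", MapElements.into(TypeDescriptors.strings())
                                               .via((String line) -> line.toLowerCase()))
                // Write the cleaned records back to Cloud Storage for warehouse loading.
                .apply("WriteClean", TextIO.write().to("gs://example-bucket/clean/events")
                                                   .withSuffix(".csv"));

            pipeline.run().waitUntilFinish();
        }
    }

Run locally with the DirectRunner, or pass --runner=DataflowRunner along with project, region, and temp-location options to submit the same pipeline to Dataflow.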

Similar companies
We created Technogise with a vision to solve the most critical and strategic problems for a business by delivering technology solutions that seamlessly adapt to its evolving needs. Over these 8 years, we have grown into a team of 75+ Technogisers - a mix of strategists, problem-solvers, pioneers, visionaries, and innovators. Using a value-driven, goal-oriented, problem-solving approach to leveraging technology, we craft new-age solutions.
India hosts 25M+ events every year, but ordering food for them is still a nightmare: unreliable vendors, inconsistent pricing, and no seamless way to book catering. We're fixing this. Craft My Plate is a vertically integrated food marketplace where customers can instantly customize menus, place orders, and have the food delivered conveniently, without dealing with vendors.