
About InCruiter
InCruiter is a next-generation technical screening and talent assessment platform. We are a SaaS-based organization with Interview-as-a-Service (IaaS) at its core. IaaS and video interview platforms have a combined market size of USD 5-6 billion; it is a niche, emerging segment of the B2B sector.
We aim to disrupt traditional candidate-screening techniques with completely automated hiring solutions. InCruiter helps companies save time on candidate evaluation and makes hiring 1.5x faster.
InCruiter is a market leader and one of India's top three players in this segment, serving a wide range of clients in both domestic and international markets. Our major clientele includes Blenheim Chalcot, Betsol, Wabtec, Dassault, UST Global, and Systematic Ventures LLC, among others.
Key Responsibilities:
● Onboarding & Induction: Facilitate smooth onboarding and orientation for new hires.
● Employee Documentation: Manage and maintain employee records, contracts, and HR databases.
● Payroll Support: Assist in payroll processing, salary disbursement, and related documentation.
● Compliance: Ensure adherence to labor laws and statutory compliances (PF, ESI, etc.).
● Employee Engagement: Plan and execute initiatives to enhance employee morale and workplace culture.
● Attendance & Leave Management: Monitor employee attendance, manage leave records, and coordinate with managers on related issues.
● Exit Formalities: Handle the complete exit process including documentation, final settlements, and feedback.
Must Have:
● Bachelor’s degree in Human Resources, Business Administration, or related field.
● Minimum 2 years of HR generalist experience.
● Good knowledge of HR systems, labor laws, and best practices.
● Strong interpersonal and communication skills.
● Detail-oriented with the ability to manage multiple tasks.
● Proficiency in MS Office and HR software/tools.

Job Description:
Position Name: Network Software Developer
Experience: 4 to 8 Years
Work Mode: Work from the Bangalore office (near Bellandur), five days a week.
Mandatory:
- 4+ years of work experience in the networking domain.
- Highly proficient in C and the Linux environment.
- Experience working on large Linux-based codebases.
- Hands-on experience working on OpenWrt-based router firmware.
- Hands-on experience with IP and lower-layer networking protocols.
Great to Have:
- Experience in working on Qualcomm and MediaTek chipsets.
- Experience in Linux device drivers and device trees.
- Experience in creating and applying Linux kernel patches.
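By way of illustration of the lower-layer protocol work this role involves, here is a sketch of parsing a fixed 20-byte IPv4 header (in Python for brevity, though the role itself calls for C; the sample packet below is hand-built purely for demonstration):

```python
import struct

def parse_ipv4_header(raw):
    # Unpack the fixed 20-byte IPv4 header (RFC 791), network byte order.
    (ver_ihl, tos, total_len, ident, flags_frag,
     ttl, proto, checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", raw[:20])
    return {
        "version": ver_ihl >> 4,
        "ihl": ver_ihl & 0x0F,   # header length in 32-bit words
        "total_length": total_len,
        "ttl": ttl,
        "protocol": proto,       # 6 = TCP, 17 = UDP
        "src": ".".join(str(b) for b in src),
        "dst": ".".join(str(b) for b in dst),
    }

# A hand-built sample header: version 4, IHL 5, TTL 64, protocol TCP,
# 10.0.0.1 -> 10.0.0.2 (checksum left as zero for the sketch).
sample = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 40, 1, 0, 64, 6, 0,
                     bytes([10, 0, 0, 1]), bytes([10, 0, 0, 2]))
hdr = parse_ipv4_header(sample)
```

In production firmware this parsing happens in kernel C code against `struct iphdr`, but the field layout is the same.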

Key Responsibilities:
- Develop and maintain both front-end and back-end components of web applications.
- Collaborate with product managers, designers, and other developers to build user-friendly features.
- Write clean, maintainable, and efficient code that adheres to coding standards and best practices.
- Build reusable code and libraries for future use.
- Optimize applications for maximum speed and scalability.
- Implement responsive design to ensure consistent user experience across all devices.
- Work with databases (SQL/NoSQL) and integrate with third-party services and APIs.
- Troubleshoot, debug, and optimize application performance.
- Participate in code reviews, ensuring code quality and consistency across the team.
- Stay updated on the latest industry trends and best practices in full-stack development.
- Contribute to an agile development process, participating in sprints, standups, and retrospectives.
Required Skills and Qualifications:
- Proven experience as a Full Stack Developer or similar role.
- Proficiency in front-end technologies such as HTML, CSS, JavaScript, and modern frameworks (React.js, Next.js, Angular, or Vue.js).
- Strong experience in back-end technologies such as Node.js, Python, Ruby, Java, or PHP.
- Familiarity with database technologies (e.g., MySQL, PostgreSQL, MongoDB).
- Experience with version control systems, particularly Git.
- Knowledge of RESTful API design and integration.
- Familiarity with cloud platforms like AWS, Azure, or Google Cloud is a plus.
- Experience in microservices based architecture is a plus.
- Strong problem-solving skills and attention to detail.
- Ability to work independently as well as part of a team.
- Excellent communication skills, both verbal and written.
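The RESTful API design skill listed above amounts to mapping HTTP verbs onto resources. A deliberately minimal sketch of that idea, framework-free and with made-up resource names (real postings here mention Express.js, Django, etc., which do this routing for you):

```python
# In-memory store standing in for a database (illustrative only).
ITEMS = {1: {"id": 1, "name": "sample"}}

def list_items():
    return 200, list(ITEMS.values())

def get_item(item_id):
    item = ITEMS.get(item_id)
    return (200, item) if item else (404, {"error": "not found"})

def create_item(body):
    new_id = max(ITEMS, default=0) + 1
    ITEMS[new_id] = {"id": new_id, **body}
    return 201, ITEMS[new_id]

# REST maps HTTP verbs onto resources: GET reads, POST creates.
ROUTES = {
    ("GET", "/items"): lambda req: list_items(),
    ("GET", "/items/<id>"): lambda req: get_item(req["id"]),
    ("POST", "/items"): lambda req: create_item(req["body"]),
}

def dispatch(method, path, req=None):
    handler = ROUTES.get((method, path))
    if handler is None:
        return 404, {"error": "not found"}
    return handler(req or {})
```

A framework adds URL parameter parsing, serialization, and middleware on top, but the verb-to-resource mapping is the core of the design.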
- Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components across ingestion, data modeling, querying, processing, storage, analysis, and data integration, implementing enterprise-level Big Data systems.
- A skilled developer with strong problem-solving, debugging, and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala, and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL and DataFrames, Kafka, Spark Streaming, and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of a Hadoop cluster, including the NameNode, DataNode, ResourceManager, NodeManager, and Job History Server.
- Worked with both the Cloudera and Hortonworks Hadoop distributions; experience managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration, and management of Big Data and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience coding MapReduce/YARN programs in Java, Scala, and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing of data; handled importing data from different data sources into HDFS using Sqoop and performing transformations using Hive and MapReduce before loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence and customer segmentation, and with maintaining Spark Streaming jobs.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Created data pipelines for ingestion and aggregation events, loading consumer response data into Hive external tables in HDFS to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience with ETL tools such as SSIS and Informatica, and with reporting tools such as SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 and S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark using Python on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience creating DataFrames using PySpark and performing operations on them using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established connections to multiple Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provisioned access for pulling the information needed for analysis.
- Generated various kinds of reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying, and utilizing most of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, mainframes, Oracle, and Netezza into HDFS, and performing transformations on it using Hive, Pig, and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra, and MongoDB, and in their integration with a Hadoop cluster.
- Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
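The MapReduce paradigm this profile leans on can be sketched in plain Python: a toy, single-process analogue of the map, shuffle, and reduce phases that Hadoop distributes across a cluster (word count is the canonical example; this is an illustration, not Hadoop API code):

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    # Map: emit a (word, 1) pair for every word in an input line.
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle: group all values by key, as the framework does between phases.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: sum the counts for each word.
    return {word: sum(counts) for word, counts in grouped.items()}

def word_count(lines):
    mapped = chain.from_iterable(map_phase(line) for line in lines)
    return reduce_phase(shuffle(mapped))
```

In real Hadoop or Spark jobs, the map and reduce functions run on separate nodes and the shuffle moves data over the network, but the data flow is exactly this.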
We are the country's leading private security agency and facility management company, with 75 branches and around 40,000 personnel across the country!
Candidate must have:
1. Good knowledge of various labour laws
2. Knowledge of the registers and documents to be maintained under the various Acts
3. Must have handled labour audits
4. Working knowledge of Simpliance, ComplyHR, TeamLease, or Ascent HR will be an added advantage
5. Should respect timelines and meet deadlines
6. Must have 5-6 years of relevant experience
7. Must be at least a graduate
- Proficiency in both front-end and back-end development technologies.
- Experience with languages such as JavaScript (for front-end), Python, Ruby, Java, or Node.js (for back-end).
- Knowledge of web development frameworks like React.js, Angular, or Vue.js for front-end development.
- Familiarity with server-side frameworks such as Express.js (for Node.js), Django (for Python), or Ruby on Rails.
- Understanding of databases and experience with both relational (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Firebase).
- Experience in using no-code platforms to develop applications and automate workflows without traditional programming.
- Experience with popular no-code tools such as Bubble, Zapier, Airtable, or Webflow.
- Ability to understand business requirements and translate them into solutions using no-code platforms.
- Knowledge of integrating various APIs and services within no-code platforms to extend functionality.
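API integration, whether written in code or wired up through a no-code tool such as Zapier, typically comes down to posting a JSON payload to a webhook URL. A minimal sketch using only the standard library (the URL, event name, and field names below are hypothetical; no request is actually sent):

```python
import json
import urllib.request

def build_webhook_request(url, event, data):
    # Assemble a JSON POST request; url, event, and data are caller-supplied.
    payload = json.dumps({"event": event, "data": data}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical endpoint -- sending would be urllib.request.urlopen(req).
req = build_webhook_request(
    "https://hooks.example.com/catch/123", "lead.created", {"name": "Asha"}
)
```

No-code platforms hide this behind a form, but understanding the underlying request makes it far easier to debug a failing integration.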

Job Description:
- Hands-on experience in developing web applications using ASP.NET Core MVC and Azure SQL Server.
- Hands on experience in Web API development.
- Should have experience in working with Azure containers to deploy applications.
- Should be able to independently coordinate with the client team and own delivery.
- Good analytical and problem-solving skills.
- Good oral and written communication skills.
- Ability to work independently / in a team with strong problem-solving skills.
- Estimate work and plan delivery schedule including tasks, milestones and dependencies.
- Ensure timely publishing of status report and quality metrics.
- Should be able to quickly adapt to and work on new technologies based on project needs.
Interview Mode: All interview rounds will be conducted on Google Meet; onboarding will be in Bangalore.
Company Overview:
Data Template is an IT company that designs and develops digital experiences and services used by people globally. We bring experiences from multiple domains to market, from strategy to execution, using our unique approach to turn deep insight into impact. We help our clients stay ahead of challenges and accelerate in their markets.


JOB DESCRIPTION
- Programming Language: Python.
- Familiarity with ORM (Object-Relational Mapper) libraries.
- Experience with a Python web framework such as Django, Flask, etc.
- Good knowledge of administering any SQL database.
- Understanding of the threading limitations of Python and of multi-process architecture.
- Experience in developing modular plugins for any open-source enterprise business web application.
- Knowledge of integrating third-party applications using any web service is also preferred.
- Strong background in object-oriented programming.
- Strong experience in PostgreSQL.
- Knowledge of HTML5, CSS, and XML/RML is a must.
Responsibilities and Duties:
- Knowledge of Python programming and the Odoo ERP.
- Knowledge of developing applications using Python.
- Knowledge of web technologies and web services.
- Sound knowledge of coding, server deployment, and troubleshooting.
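The ORM requirement above can be illustrated with a deliberately tiny sketch: mapping a Python class to a database table by hand, which is what libraries like the Django ORM or Odoo's ORM do at scale. The model and table names here are made up for illustration, loosely echoing an Odoo-style "partner" record:

```python
import sqlite3
from dataclasses import dataclass

@dataclass
class Partner:
    # A made-up model for illustration; not an actual Odoo class.
    id: int
    name: str

class PartnerMapper:
    """Maps Partner objects to rows -- the core idea behind an ORM."""

    def __init__(self, conn):
        self.conn = conn
        conn.execute(
            "CREATE TABLE IF NOT EXISTS partner "
            "(id INTEGER PRIMARY KEY, name TEXT)"
        )

    def save(self, name):
        cur = self.conn.execute("INSERT INTO partner (name) VALUES (?)", (name,))
        return Partner(cur.lastrowid, name)

    def get(self, partner_id):
        row = self.conn.execute(
            "SELECT id, name FROM partner WHERE id = ?", (partner_id,)
        ).fetchone()
        return Partner(*row) if row else None

conn = sqlite3.connect(":memory:")
mapper = PartnerMapper(conn)
```

A real ORM adds migrations, relations, and lazy loading on top, but the class-to-table mapping is the foundation the posting is asking about.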

- Application Developer -API
- Experience: 2-5 years
- Experience designing, building, and configuring applications to meet business process and application requirements using Azure API development.
- Design, build, test, and deploy APIs using Microsoft Azure.
- Support and maintain production APIs.
- Should have knowledge of design patterns.
• Experience with Core Java, J2EE, Spring, Hibernate, HTML/HTML5, JavaScript, jQuery, and Web Services.
• Understand and use messaging-based service techniques.
• Working knowledge with MySQL or similar SQL Databases.
• Strong knowledge in Object Oriented Concepts with Core Java.
• Good understanding and practical experience with Enterprise Java concepts and methodologies.
• Good understanding of software development process in an agile environment.
• Excellent problem solving and troubleshooting skills.
• Solid coding practices including documentation, code reviews and unit testing.
• Knowledge of Grails, ReactJS, MongoDB, and JUnit will be an added advantage.
• Proficiency with database management is a plus.
• Experience with AWS infrastructure, e.g. S3, EC2, database services (RDS).
• Contribute to production support and debugging across the platform while working with other software engineers.
• Strong experience with Application server development - Service layer patterns using Spring.
Specialized Skills:
• Excellent verbal and written communication in English.
• Ability to work in a fast-paced environment.
• Ability to multi-task on a regular basis.
• Ability to accept direction and complete work according to instruction.
• Strong analytical skills while maintaining attention to details.
• Innovative problem-solving techniques.
• Ability to organize, prioritize work, meet deadlines and work independently.
• Ability to handle multiple projects and activities in a timely manner.
• High Emotional Intelligence for an office environment.
• Experience working with office productivity tools; delivery assurance and/or industry standards on deliverables.
• Experience working with administrative and/or workflow systems.
• Technical leadership of an engineering team to build, deploy, and support a reliable, performant, and scalable RESTful platform.
Professional Experience:
5-8 years’ experience
Qualification:
BE(CSE) B.Tech, M.Tech, M.Sc, B.Sc (IT), ME, M.Sc(CS), MCA
Perks:
• Monday to Friday – 5 Days work schedule
• Health insurance benefits
• Accidental and disability insurance benefits
• Opportunity to grow and promote from within
• Staff development activities
Do not apply if:
• You lack experience in Java or MySQL.
• You are not fluent in speaking, reading, and writing English, as this position requires fluent English-language communication.

