
Technical Requirements:
- At least 5 years of experience with Java and J2EE technologies.
- Expertise in development with Java, Spring Boot, Spring modules (Spring Security, Spring MVC, Spring Data JPA, etc.), Hibernate, and web services.
- Experience building REST web services.
- Experience with the Tomcat application server.
- Experience with the Eclipse IDE.
- Knowledge of Swagger and Postman.
- Experience with databases such as MySQL.
- Experience with Git, Maven, and CI/CD pipelines.
- Good to have: knowledge of Docker and MongoDB.


Immediate Joiners Preferred. Notice Period - Immediate to 30 Days
Interested candidates are requested to email their resumes with the subject line "Application for [Job Title]".
Only applications received via email will be reviewed. Applications through other channels will not be considered.
About Us
adesso India is a dynamic and innovative IT Services and Consulting company based in Kochi. We are committed to delivering cutting-edge solutions that make a meaningful impact on our clients. As we continue to expand our development team, we are seeking a talented and motivated Backend Developer to join us in creating scalable and high-performance backend systems.
Job Description
We are looking for an experienced Backend and Data Developer with expertise in Java, SQL, and BigQuery development on public clouds, mainly GCP. As a Senior Data Developer, you will play a vital role in designing, building, and maintaining robust systems to support our data analytics. This position offers the opportunity to work on complex services, collaborating closely with cross-functional teams to drive successful project delivery.
Responsibilities
- Development and maintenance of data pipelines and automation scripts with Python
- Creation of data queries and optimization of database processes with SQL
- Use of bash scripts for system administration, automation and deployment processes
- Database and cloud technologies
- Managing, optimizing and querying large amounts of data in an Exasol database (prospectively Snowflake)
- Google Cloud Platform (GCP): Operation and scaling of cloud-based BI solutions, in particular
- Composer (Airflow): Orchestration of data pipelines for ETL processes
- Cloud Functions: Development of serverless functions for data processing and automation
- Cloud Scheduler: Planning and automation of recurring cloud jobs
- Cloud Secret Manager: Secure storage and management of sensitive access data and API keys
- BigQuery: Processing, analyzing and querying large amounts of data in the cloud
- Cloud Storage: Storage and management of structured and unstructured data
- Cloud Monitoring: Monitoring the performance and stability of cloud-based applications
- Data visualization and reporting
- Creation of interactive dashboards and reports for the analysis and visualization of business data with Power BI
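As a toy illustration of the Python pipeline and SQL-query responsibilities above, here is a minimal extract-load-transform sketch. It uses Python's built-in sqlite3 purely as a stand-in for a warehouse such as Exasol or BigQuery, and the `events` table and its columns are invented for the example:

```python
import sqlite3

def run_pipeline(rows):
    """Toy ETL step: load raw event rows, then aggregate them with SQL.

    sqlite3 stands in for a real warehouse (Exasol/BigQuery); the
    'events' table and its columns are made up for illustration.
    """
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (user_id TEXT, amount REAL)")
    # Extract/Load: insert the raw rows.
    conn.executemany("INSERT INTO events VALUES (?, ?)", rows)
    # Transform: aggregate per user in SQL.
    cur = conn.execute(
        "SELECT user_id, SUM(amount) FROM events "
        "GROUP BY user_id ORDER BY user_id"
    )
    return cur.fetchall()

totals = run_pipeline([("a", 10.0), ("b", 5.0), ("a", 2.5)])
print(totals)  # [('a', 12.5), ('b', 5.0)]
```

In a real pipeline the same extract/load/aggregate shape would be expressed as an Airflow (Composer) task writing to the warehouse rather than an in-memory database.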
Requirements
- Minimum of 4-6 years of experience in backend development, with strong expertise in BigQuery, Python, and MongoDB or SQL.
- Strong knowledge of database design, querying, and optimization with SQL and MongoDB, and of designing ETL and orchestrating data pipelines.
- Minimum of 2 years of experience with at least one hyperscaler, ideally GCP, combined with cloud storage technologies, cloud monitoring, and cloud secret management.
- Excellent communication skills to effectively collaborate with team members and stakeholders.
Nice-to-Have:
- Knowledge of agile methodologies and working in cross-functional, collaborative teams.
● Execute these via designs and high-quality implementations in Java and/or Python.
● Work with frontend engineers to clearly demarcate the division of responsibilities via REST-based interfaces.
● Focus on solutions that deliver non-functional requirements around performance, scalability, security, high availability, monitorability, debuggability, and other such concerns.
● Evaluate new technologies and build prototypes for continuous improvement.
● Very strong real-world experience with Java, Spring Boot, and microservices.
● Advocate best practices and standards.
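The REST-based division of responsibilities described above can be sketched in Python (one of the two languages the posting names) using only the standard library. This is a minimal stand-in for a real framework endpoint; the `/status` resource and its payload are invented for illustration:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StatusHandler(BaseHTTPRequestHandler):
    """Minimal REST-style resource: GET /status returns a JSON body.

    The frontend only depends on this interface contract, not on
    any backend implementation detail.
    """

    def do_GET(self):
        if self.path == "/status":
            body = json.dumps({"service": "backend", "status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

# Serve on an ephemeral port in the background, then call the API once.
server = HTTPServer(("127.0.0.1", 0), StatusHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/status"
payload = json.loads(urllib.request.urlopen(url).read())
server.shutdown()
print(payload)  # {'service': 'backend', 'status': 'ok'}
```

In practice the equivalent endpoint would be a Spring Boot `@RestController` (or a Flask/FastAPI route in Python); the point is the same: a JSON contract that demarcates frontend and backend responsibilities.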
- 5 years of experience as a Java/JEE developer, including Spring Boot
- Good knowledge of OOP concepts.
- Experience in Java 8, JSP, Spring Core, Spring MVC, Spring REST, and Spring JPA repositories
- Experience in Hibernate, relational databases, and SQL.
- Experience in REST API development.
- Experience in implementation of Jasper Reports.
- Familiar with Git & Maven.
- Bachelor's or master's degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field. Relevant experience of at least 3 years in lieu of the above if from a different stream of education.
- Well-versed in, with 3+ years of hands-on demonstrable experience in:
  ▪ Stream & batch big data pipeline processing using Apache Spark and/or Apache Flink
  ▪ Distributed cloud-native computing, including serverless functions
  ▪ Relational, object store, document, graph, etc. database design & implementation
  ▪ Microservices architecture and API modeling, design, & programming
- 3+ years of hands-on development experience in Apache Spark using Scala and/or Java.
- Ability to write executable code for services using Spark RDD, Spark SQL, Structured Streaming, Spark MLlib, etc., with a deep technical understanding of the Spark processing framework.
- In-depth knowledge of standard programming languages such as Scala and/or Java.
- 3+ years of hands-on development experience in one or more libraries & frameworks such as Apache Kafka, Akka, Apache Storm, Apache NiFi, ZooKeeper, the Hadoop ecosystem (i.e., HDFS, YARN, MapReduce, Oozie & Hive), etc.; extra points if you can demonstrate your knowledge with working examples.
- 3+ years of hands-on development experience in one or more relational and NoSQL datastores such as PostgreSQL, Cassandra, HBase, MongoDB, DynamoDB, Elasticsearch, Neo4j, etc.
- Practical knowledge of distributed systems involving partitioning, bucketing, the CAP theorem, replication, horizontal scaling, etc.
- Passion for distilling large volumes of data and for analyzing performance, scalability, and capacity issues in big data platforms.
- Ability to clearly distinguish system and Spark job performance, and to perform Spark performance tuning and resource optimization.
- Perform benchmarking/stress tests and document best practices for different applications.
- Proactively work with tenants on improving overall performance and ensuring the system is resilient and scalable.
- Good understanding of virtualization & containerization; must demonstrate experience in technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox, Vagrant, etc.
- Well-versed in, with demonstrable working experience in, API management, API gateways, service mesh, identity & access management, and data protection & encryption.
- Hands-on demonstrable working experience with DevOps tools and platforms, viz. Jira, Git, Jenkins, code quality & security plugins, Maven, Artifactory, Terraform, Ansible/Chef/Puppet, Spinnaker, etc.
- Well-versed in AWS and/or Azure and/or Google Cloud; must demonstrate experience in at least FIVE (5) services offered under AWS, Azure, or Google Cloud in any of these categories: Compute, Storage, Database, Networking & Content Delivery, Management & Governance, Analytics, or Security, Identity, & Compliance (or equivalent demonstrable cloud platform experience).
- Good understanding of storage, networks, and storage networking basics, which will enable you to work in a cloud environment.
- Good understanding of network, data, and application security basics, which will enable you to work in a cloud as well as a business applications / API services environment.
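The partitioning and bucketing concepts named in the requirements above can be illustrated with a minimal hash-partitioning sketch in plain Python. The bucket count and key scheme are arbitrary choices for the example, not any specific framework's implementation:

```python
import hashlib

def bucket_for(key: str, num_buckets: int) -> int:
    """Map a record key to a bucket via a stable hash.

    A stable hash (unlike Python's per-process randomized hash())
    sends the same key to the same bucket on every run, which is
    what lets distributed stores co-locate and look up records by key.
    """
    digest = hashlib.sha256(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_buckets

def partition(records, num_buckets):
    """Group (key, value) records into buckets by hashing the key."""
    buckets = {i: [] for i in range(num_buckets)}
    for key, value in records:
        buckets[bucket_for(key, num_buckets)].append((key, value))
    return buckets

parts = partition([("user1", 1), ("user2", 2), ("user1", 3)], 4)
# Both "user1" records land in the same bucket, by construction.
```

Systems like Spark, Kafka, and Cassandra apply the same idea (with their own hash functions and repartitioning machinery) to distribute data across executors, topic partitions, or nodes.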
- 4+ years of professional development experience in an agile software engineering role
- Strong development skills in Java or Groovy
- Experience with Jenkins, Bamboo, and CI/CD environments
- Familiarity with Guice or similar dependency injection frameworks
- Familiarity with relational database technology
- Familiarity with document store database technology such as Elasticsearch or Cassandra
- Experience with design and testing of RESTful APIs
- Familiarity with cross-platform development on Linux, macOS, and Windows
- Strong understanding of the Git version control system
- Knowledge of JavaScript (Node.js)
- Familiarity with Amazon Web Services (AWS)
- Familiarity with Docker/Kubernetes
- Experience with continuous integration and build systems such as Jenkins or Bamboo
- Experience with Jira or other project management, issue-tracking, or bug-tracking tools
- Should have knowledge of RDBMS
- Familiarity with Node.js, Kotlin, Kafka, and DynamoDB
- Previous experience with SaaS or cloud development and microservices

- 3+ years of experience with Ruby on Rails.
- Strong Project & Time Management Skills, along with the ability to apply these skills while working independently, or as part of a team.
- Knowledge of blockchain technology, smart contracts and cryptocurrency will be an added advantage
- Experience in fintech domain will be another added advantage
- Bachelor’s degree in computer programming, computer science, or a related field.
- Fluency in or understanding of specific languages, such as Java, PHP, or Python, and operating systems may be required.

Overview
At Netradyne, we are revolutionizing the conventional mapping paradigm with our dynamic mapping technology. Our unique approach leverages computer vision, edge computing, and crowdsourcing to deliver rich, highly accurate content in real time, critical to the successful development of maps.
What you will be doing
You will work in a fast-paced environment spanning multiple platforms, architectures, and technologies. You will be responsible for:
- Development of various cloud/web components to ingest, process, transform and visualize data at scale.
- Contribute to algorithm development and automated evaluation for measuring quality.
- Follow engineering best practices (unit testing, continuous delivery, etc.)
- Deployment and monitoring of production-ready infrastructure
Skills
- Strong analytical and problem-solving skills.
- 3-7 years of strong programming experience in Python/Java.
- Hands-on knowledge of at least one MVC and one ORM framework.
- Familiarity with at least one frontend framework, e.g., React/Angular.
- Working knowledge of at least one RDBMS or NoSQL database.
- Exposure to geospatial databases and tools like QGIS and OSM will be a plus.
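Since the role touches geospatial data, here is a small self-contained sketch of the standard haversine great-circle distance formula (mean Earth radius of 6371 km; the sample coordinates are arbitrary):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points,
    using the standard haversine formula with a mean Earth radius
    of 6371 km."""
    r = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# One degree of longitude along the equator is roughly 111.2 km.
d = haversine_km(0.0, 0.0, 0.0, 1.0)
```

Geospatial databases such as PostGIS expose the same kind of computation as built-in functions (e.g., distance over a spheroid), so in practice this logic usually lives in the database query rather than application code.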
Must Have
- 1 to 6 years of development experience in Java/J2EE development.
- 1+ years' experience in Spring and Hibernate.
- 1+ years' experience in developing REST APIs.
- 1+ years' experience in developing Spring Boot applications.
- Hands-on experience in unit testing.
- Hands-on experience in MVC frameworks – AngularJS/Angular 7/8.
- Understanding of microservices.
- Understanding of Agile methodologies.
- Working experience with DB technologies.
- Strong analytical and problem-solving skills.
- Aptitude for innovation, working independently, and thinking 'outside of the box'.



