Qualification: Tier 1 institutions / B.Tech., MCA, or BCA in Computer Science or equivalent
Experience: 4-8 years
Skills:
Must have: Java, Spring/Spring Boot; experience in a product company is preferred. Any 2 (must) of the following:
a. JPA/Hibernate
b. Messaging queues (Kafka/SQS/distributed messaging/JMS)
c. NoSQL (Aerospike/Redis/Cassandra/Elasticsearch)
d. Microservices
Purpose of the Job:
- The Senior Developer will be part of a cross-functional Scrum team responsible for designing, developing, and testing diverse features of the Airtel Payments Bank agile stream. Payments Bank values software engineers who are motivated, creative, passionate, and positive. You must be comfortable working closely with others and have solid communication skills. Software engineers who learn quickly and have strong technical skills will thrive at Airtel Payments Bank. You must have considerable experience with object-oriented programming and should be familiar with design patterns, data structures, databases, and other staples of practical software development.
Roles and Responsibilities:
- Technical design, implementation, deployment, and support.
- Partner with Business Analysts to review and implement business requirements.
- Perform development and unit testing, working closely with the business.
- Should be well versed in TDD (Test-Driven Development).
- Mentor and oversee the development of resources, including reviewing designs and performing code reviews.
- Ensure designs comply with specifications.
- Develop high-scale, low-latency applications for mission-critical systems, delivering high availability and performance.
- Should have experience with Core Java/J2EE and OOP concepts.
- Should be well versed in Spring and Spring Boot.
- Should have a good understanding of Hibernate and other ORMs.
- Should have an understanding of web services (SOAP/REST) and Maven.
- Build tools such as Jenkins.
- Caching techniques (Redis, Hazelcast, Aerospike).
- Database knowledge: Oracle, MySQL.
- Understands deployment-related tasks.
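The TDD practice called for above means writing a failing test first, then the minimal code to make it pass. The role is Java-centric, but the workflow is language-agnostic; this minimal sketch uses Python's stdlib `unittest` for brevity. The `FareCalculator` class and its rates are entirely hypothetical.

```python
import unittest

# Hypothetical unit under test: in TDD this class is written *after*
# the tests below, one small increment at a time.
class FareCalculator:
    BASE_FARE = 20  # assumed flat boarding charge
    PER_KM = 5      # assumed per-kilometre rate

    def fare(self, km: float) -> float:
        if km < 0:
            raise ValueError("distance cannot be negative")
        return self.BASE_FARE + self.PER_KM * km

# Step 1 of TDD: these tests existed (and failed) before FareCalculator did.
class TestFareCalculator(unittest.TestCase):
    def test_base_fare_for_zero_distance(self):
        self.assertEqual(FareCalculator().fare(0), 20)

    def test_per_km_rate_added_to_base(self):
        self.assertEqual(FareCalculator().fare(10), 70)

    def test_negative_distance_rejected(self):
        with self.assertRaises(ValueError):
            FareCalculator().fare(-1)

# Run with: python -m unittest this_file.py
```

The same red-green-refactor loop applies unchanged with JUnit in a Spring Boot project.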
Technical Skills:
- DB: Neo4j (GraphDB) / Oracle DB
- Programming language: Java/J2EE
- GUI skills: Angular 2+, D3.js (diagramming)
- Other skills: Apache Solr, Drools, Kafka integration, Spring, REST, web services, LDAP, SSO
- Domain: Telecom, OSS, Inventory (will be a bonus)
Roles and Responsibilities:
- Responsible for the design, development, testing, documentation, and analysis of modules or features of new or upgraded software systems and products.
- Develops and/or executes implementation according to project plans and priorities.
- Creating/extending GUI components such as grids and reports using the core product modules.
- Creating network, service-topology, and device GUI diagrams for Ethernet and SDN products such as EDI, EPL, EVPL, and SD-WAN.
- Creating device and service models for networking technologies such as DWDM/OTN, SONET, MPLS, GPON, and FTTH.
- Understanding the existing code and extending the core product components.
- Writing queries and procedures on databases, primarily Oracle and Neo4j.
- Writing Java/J2EE code.
Must-have language: Java
Nice to have: Python
Must-have frameworks and technologies: Spring Boot, Kafka, MQTT, Docker/Kubernetes, REST APIs
Persistence layer: MongoDB, Elasticsearch, any GraphDB (Neo4j/ArangoDB), SQL, HBase
Must have: exposure to large-scale architecture (concepts of queues, microservices, functional programming)
Must have: strong data structures and design principles
Expert in developing Node.js applications; strong understanding of NPM and modular application development; proficiency and hands-on experience with Node.js, Express, sockets, MongoDB/Elasticsearch/Redis/MySQL, and Apache Kafka/Google Pub/Sub. Experience working in the MEAN stack is a plus.
Job Description: The Data Engineering team is one of the core technology teams at Lumiq.ai and is responsible for creating all the data-related products and platforms, which scale to any amount of data, users, and processing. The team also interacts with our customers to work out solutions, create technical architectures, and deliver the products and solutions. If you are someone who is always pondering how to make things better, how technologies can interact, how various tools, technologies, and concepts can help a customer, or how a customer can use our products, then Lumiq is the place of opportunities.
Who are you?
- Enthusiast is your middle name. You know what's new in Big Data technologies and how things are moving.
- Apache is your toolbox, and you have contributed to open-source projects or discussed problems with the community on several occasions.
- You use the cloud for more than just provisioning a virtual machine.
- Vim is friendly to you, and you know how to exit Nano.
- You check the logs before screaming about an error.
- You are a solid engineer who writes modular code and commits in Git.
- You are a doer who doesn't say "no" without first understanding.
- You understand the value of documenting your work.
- You are familiar with the machine learning ecosystem and how you can help your fellow data scientists explore data and create production-ready ML pipelines.
Eligibility:
- At least 2 years of data engineering experience
- Have interacted with customers
Must-have skills:
- Amazon Web Services (AWS): EMR, Glue, S3, RDS, EC2, Lambda, SQS, SES
- Apache Spark
- Python
- Scala
- PostgreSQL
- Git
- Linux
Good-to-have skills:
- Apache NiFi
- Apache Kafka
- Apache Hive
- Docker
- Amazon certification
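The Spark experience asked for above ultimately rests on the map/reduce pattern: independent per-partition transforms merged by an associative reduction. A toy word count in plain Python (no Spark dependency, purely illustrative; the corpus is made up) shows the shape of the computation Spark distributes:

```python
from collections import Counter
from functools import reduce

# Toy corpus; in Spark this would be an RDD/DataFrame partitioned across nodes.
lines = [
    "big data big pipelines",
    "data pipelines scale",
]

# "Map" phase: each line independently yields its own word counts.
mapped = [Counter(line.split()) for line in lines]

# "Reduce" phase: partial counts are merged. Counter addition is associative,
# which is exactly what lets Spark merge partitions in any order.
totals = reduce(lambda a, b: a + b, mapped, Counter())

print(totals["data"])  # 2
print(totals["big"])   # 2
```

The same two-phase shape, written against Spark's API, would be `sc.textFile(...).flatMap(str.split).map(lambda w: (w, 1)).reduceByKey(add)`.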
Requirements:
- Minimum 4 years' work experience in building, managing, and maintaining analytics applications
- B.Tech/BE in CS/IT from Tier 1/2 institutes
- Strong fundamentals of data structures and algorithms
- Good analytical and problem-solving skills
- Strong hands-on experience in Python
- In-depth knowledge of queueing systems (Kafka/ActiveMQ/RabbitMQ)
- Experience in building data pipelines and real-time analytics systems
- Experience with SQL (MySQL) and NoSQL (Mongo/Cassandra) databases is a plus
- Understanding of service-oriented architecture
- Delivered high-quality work with significant contribution
- Expert in Git, unit tests, technical documentation, and other development best practices
- Experience in handling small teams
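The queueing systems listed above (Kafka/ActiveMQ/RabbitMQ) all provide the same producer/consumer decoupling that real-time pipelines are built on. A minimal in-process sketch using Python's stdlib `queue` and `threading` (no broker involved, purely illustrative; the event names are made up):

```python
import queue
import threading

SENTINEL = None  # signals the consumer that the stream has ended

def producer(q: queue.Queue, events) -> None:
    # In a real pipeline this would publish to a Kafka/RabbitMQ topic.
    for event in events:
        q.put(event)
    q.put(SENTINEL)

def consumer(q: queue.Queue, out: list) -> None:
    # Blocks on q.get(), much like a broker consumer polling a topic.
    while True:
        event = q.get()
        if event is SENTINEL:
            break
        out.append(event.upper())  # stand-in for real processing

q: queue.Queue = queue.Queue(maxsize=100)  # bounded queue applies backpressure
results: list = []
t = threading.Thread(target=consumer, args=(q, results))
t.start()
producer(q, ["signup", "click", "purchase"])
t.join()
print(results)  # ['SIGNUP', 'CLICK', 'PURCHASE']
```

The bounded `maxsize` is the key design point: when the consumer falls behind, `q.put` blocks the producer instead of letting memory grow without limit, which is the in-process analogue of broker-side retention and consumer lag.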
We are looking for a technically sound and excellent trainer on Big Data technologies. This is an opportunity to become well known in the industry and gain visibility. Host regular sessions on Big Data-related technologies and get paid to learn.
Our company is working on some really interesting projects in the Big Data domain across various fields (utility, retail, finance). We are working with some big corporates and MNCs around the world. While working here as a Big Data Engineer, you will deal with big data in structured and unstructured form, as well as streaming data from Industrial IoT infrastructure. You will work on cutting-edge technologies and explore many others while also contributing back to the open-source community. You will get to know and work on an end-to-end processing pipeline that covers all types of work: storing, processing, machine learning, visualization, etc.