We are seeking a Full Stack Engineer to join the Engineering team based out of Gurgaon. We provide users with the opportunity to invest in gold, government bonds, cryptocurrencies, and other investment products to grow their savings.
We are constantly looking to improve the investment experience and educate users about growth opportunities. In each release, we aim to make Pluang more useful for our users and add features to ensure state-of-the-art security & reliability. Our users trust us with their hard-earned money, and we take that very seriously. We consistently strive to deliver top quality.
You will be working with a highly motivated, young & dynamic team of engineers, reporting to the Engineering Lead.
Position Responsibilities
● Be honest, reliable & consistent
● Write efficient & clean code
● Have a strong sense of ownership
● Take part in the development & maintenance of the Pluang web app, the Operations dashboard, and other 3rd-party products we own
● Contribute to improving the quality of engineering process & engineering culture
Position Requirements
- Strong in data structures and algorithms
- Experience in Java, Express, API Design & DOM
- Understanding of component based design or other design patterns
- Experience with unit testing, integration testing & continuous integration
- RDBMS and NoSQL databases, preferably PostgreSQL and MongoDB
- A passion for investing is good to have
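Since the requirements call out unit testing and clean code, here is a dependency-free sketch of what a unit test looks like; in practice teams typically use JUnit or TestNG. The Wallet class below is invented purely for illustration, not Pluang's actual code.

```java
// A dependency-free sketch of a unit test; real projects typically use JUnit.
// The Wallet class is a made-up example, not Pluang's actual code.
class Wallet {
    private long milligrams; // gold balance in milligrams, avoiding floating point

    void deposit(long mg) {
        if (mg <= 0) throw new IllegalArgumentException("deposit must be positive");
        milligrams += mg;
    }

    long balance() { return milligrams; }
}

public class WalletTest {
    static void check(boolean condition, String message) {
        if (!condition) throw new AssertionError(message);
    }

    public static void main(String[] args) {
        Wallet w = new Wallet();
        w.deposit(1500);
        check(w.balance() == 1500, "balance should equal the single deposit");

        boolean rejected = false;
        try { w.deposit(-5); } catch (IllegalArgumentException e) { rejected = true; }
        check(rejected, "negative deposits must be rejected");
    }
}
```

The same checks migrate naturally to `@Test` methods once a test framework and CI pipeline are in place.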
We Offer
- Attractive compensation package - competitive salary, flexible bonus scheme.
- We are always looking for ways to promote and inspire innovation.
- Individual career path - management and technical career growth, enhanced by learning and development program, regular performance assessment, teams of multi-national IT professionals.
- Healthy work environment - company-sponsored medical program, food, and beverage program, open communication.
- Friendly policies to support Work-life balance, team building, and celebrations.
About the Company: a Series B wealth-tech startup
Technical Skills
- Understanding of WordPress development
- Ability to create custom WordPress themes.
- Ability to create custom WordPress plugins and WooCommerce add-ons.
- Knowledge of PHP, MySQL, JS, jQuery, CSS, and HTML
- Good knowledge of servers.
- Good written communication skills
- Good understanding of version control tools like Git.
- Knowledge of WordPress coding standards.
- Good knowledge of Gutenberg.
Good to Have
- Basic knowledge of the WP REST API and WP-CLI.
- Basic knowledge of VIP coding standards.
- Basic knowledge of React.
- Advanced working knowledge of Gutenberg, such as custom block creation.
About Company
Espressif Systems (688018) is a public multinational, fabless semiconductor company established in 2008, with headquarters in Shanghai and offices in Greater China, India, and Europe. We have a passionate team of engineers and scientists from all over the world, focused on developing cutting-edge, low-power Wi-Fi and Bluetooth IoT solutions. We have created the popular ESP8266 and ESP32 series of chips, modules, and development boards. By leveraging wireless computing, we provide green, versatile, and cost-effective chipsets. We have always been committed to offering IoT solutions that are secure, robust, and power-efficient. By open-sourcing our technology, we aim to enable developers worldwide to use Espressif’s technology and build smart connected devices. In July 2019, Espressif made its Initial Public Offering on the Sci-Tech Innovation Board (STAR) of the Shanghai Stock Exchange (SSE).
Espressif has a technology center in Pune. The focus is on embedded software engineering and IoT solutions for our growing customers.
About the Role
Espressif’s RainMaker (https://rainmaker.espressif.com/) is a paradigm-shifting IoT cloud platform that provides seamless connectivity between IoT devices and mobile apps, voice assistants, and other services. It is designed with scalability, security, reliability, and operational cost at its center. We are looking for senior cloud engineers who can contribute significantly to this platform through architecture, design, and implementation. It is highly desirable that the candidate has prior experience working on large-scale cloud product development and understands the responsibilities and challenges well. Strong hands-on experience in writing code in Go, Java, or Python is a must.
This is an individual contributor role.
Minimum Qualifications
- BE/B.Tech in Computer Science with 5-10 years of experience.
- Strong Computer Science fundamentals.
- Extensive programming experience in one of these programming languages (Java, Go, Python) is a must.
- Good working experience with any of the cloud platforms - AWS, Azure, Google Cloud Platform.
- Certification in any of these cloud platforms will be an added advantage.
- Good experience in the development of RESTful APIs, handling the security and performance aspects.
- Strong debugging and troubleshooting skills.
- Experience working with RDBMS or any NoSQL database, such as DynamoDB, MySQL, Oracle.
- Working knowledge of CI/CD tools - Maven/Gradle, Jenkins - and experience in a Linux (or Unix) based environment.
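The RESTful API bullet above can be made concrete with a minimal sketch using only the JDK's built-in `com.sun.net.httpserver` package; the `/health` route and JSON body are illustrative inventions, not Espressif's actual API.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Minimal REST endpoint sketch using only the JDK's built-in HTTP server.
// The /health route and response body are invented for illustration.
public class HealthApi {
    static HttpServer create(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/health", exchange -> {
            byte[] body = "{\"status\":\"ok\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        return server;
    }

    public static void main(String[] args) throws IOException {
        HttpServer server = create(0); // port 0 = let the OS pick a free port
        server.start();
        System.out.println("GET /health on port " + server.getAddress().getPort());
        server.stop(0);
    }
}
```

A production service would layer on authentication, input validation, and rate limiting - the security and performance aspects the bullet refers to.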
Desired Qualifications
- Exposure to serverless computing frameworks like AWS Lambda, Google Cloud Functions, Azure Functions.
- Some exposure to front-end development tools - HTML5, CSS, JavaScript, React.js/Angular.js.
- Working knowledge of Docker, Jenkins.
- Prior experience working in the IoT domain will be an added advantage.
What to expect from our interview process
- The first step is to email your resume or apply to the relevant open position, along with a sample of something you have worked on, such as a public GitHub repo or side project.
- Next, after shortlisting your profile, a recruiter will get in touch with you via a channel that works for you, e.g. email or phone. This will be a short chat to learn more about your background and interests, to share more about the job and Espressif, and to answer any initial questions you have.
- Successful candidates will then be invited to 2 to 3 rounds of technical interviews, as per the previous round's feedback.
- Finally, successful candidates will have interviews with HR.
What you offer us
- Ability to provide technical solutions and support that foster collaboration and innovation.
- Ability to balance a variety of technical needs and priorities according to Espressif’s growing needs.
What we offer
- An open-minded, collaborative culture of enthusiastic technologists.
- Competitive salary
- 100% company-paid medical/dental/vision/life coverage
- Frequent training by experienced colleagues and chances to take international trips, attend exhibitions, technical meetups, and seminars.
Technical Lead's Role:
- The availability, security, scalability and interoperability of our platform
- Delivering our product roadmap
- Planning for the future
- Generating enthusiasm and a sense of both technical and product pride
Essential for this position :
- At least 2 years of experience in leading a team of software developers
- At least 4 years of commercial experience with C# and .NET
- At least 2 years of commercial experience with HTML/CSS and one of the JavaScript frameworks
- At least 2 years of experience with Microsoft Azure
- Designing and developing APIs for both high availability and scalability
- Performance profiling and tuning of .NET code
- Writing automated tests (both unit and integration tests)
- Experience with CI/CD pipelines (ideally Azure DevOps)
- Applying problem-solving skills to technical issues
Mandatory requirements:
1. Educational background: B.Tech/B.E./MCA
2. Comfortable working from the Noida office.
3. Able to join within the next 30 days.
To apply, submit your details at https://pclhealth.talentlyft.com/jobs/net-tech-lead-me8 and our team will get in touch with you soon.
Are you a high-performing, collaborative, results-oriented and technologically savvy person who is keen on sales in the digital industry? Then this highly visible role is for you!
Our client is the health-tech initiative of India's largest business house. Started in 2015, it empowers healthcare providers and consumers in India. All healthcare monitoring services are made available through an app that helps connect doctors, hospitals, pharmacies, laboratories, and consumers, enabling preventive and predictive healthcare. It helps caregivers track the entire patient journey: from the initial appointment, through maintaining records and generating lab test reports, to providing virtual consultations and home-care solutions. This futuristic guide is expected to strengthen the doctor-patient relationship and enhance the in-clinic experience.
As the Sr. Java Developer, you will be responsible for developing cutting-edge health-tech applications that include high-scale transaction processing, intelligent bot-based programs, and data analytics.
What you will do:
- Building components for the company's advanced health-tech platform using Java, Solr, SpringBoot, and Dialogflow
- Communicating effectively in a cross-functional product development team and presenting ideas and solutions effectively
What you need to have:
- Education: B.Tech with min. 65% marks
- Expert at hands-on programming in Java and J2EE
- Proven expertise in Java interfaces with MongoDB (or similar NoSQL databases) as well as relational databases (MySQL, Postgres, etc.)
- Key contributor in at least one 6+ month development project involving SpringBoot and Hibernate
- Strong understanding of application server infrastructure
- Good working knowledge of Maven-based build systems
- Good understanding of build and deployment pipelines involving ANT and Jenkins
- Proficient understanding of code versioning tools, such as Git or SVN
- Good knowledge of working with REST APIs and web services
- Excellent problem-solving skills
Desired Attributes
- Hands-on experience with Lucene/Solr
- Familiarity with Dialogflow-based chatbot building
- Knowledge of NLP
- Experience with AI/ML systems
- Excellent interpersonal skills and the ability to build good working relationships.
- Must be self-motivated to prioritize and manage workload and meet critical project milestones and deadlines.
- Able to effectively collaborate with a team as well as take initiative and work independently to solve problems
The Cloudera Data Warehouse Hive team is looking for a passionate senior developer to join our growing engineering team. This group targets the biggest enterprises wanting to use Cloudera’s services in private and public cloud environments. Our product is built on open-source technologies like Hive, Impala, Hadoop, Kudu, Spark, and many more, providing unlimited learning opportunities.
A Day in the Life
Over the past 10+ years, Cloudera has experienced tremendous growth making us the leading contributor to Big Data platforms and ecosystems and a leading provider for enterprise solutions based on Apache Hadoop. You will work with some of the best engineers in the industry who are tackling challenges that will continue to shape the Big Data revolution. We foster an engaging, supportive, and productive work environment where you can do your best work. The team culture values engineering excellence, technical depth, grassroots innovation, teamwork, and collaboration.
You will manage product development for our CDP components, develop engineering tools and scalable services to enable efficient development, testing, and release operations. You will be immersed in many exciting, cutting-edge technologies and projects, including collaboration with developers, testers, product, field engineers, and our external partners, both software and hardware vendors.
Opportunity:
Cloudera is a leader in the fast-growing big data platforms market. This is a rare chance to make a name for yourself in the industry and in the open-source world. The candidate will be responsible for Apache Hive and CDW projects. We are looking for a candidate who would like to work on these projects both upstream and downstream. If you are curious about the project and its code quality, you can check both at the link below. You can even start development before you join - this is one of the beauties of the OSS world.
Apache Hive: https://hive.apache.org/
Responsibilities:
- Build robust and scalable data infrastructure software.
- Design and create services and system architecture for your projects.
- Improve code quality through writing unit tests, automation, and code reviews.
- Write Java code and/or build services in the Cloudera Data Warehouse.
- Work with a team of engineers who review each other's code/designs and hold each other to an extremely high bar for quality.
- Understand the basics of Kubernetes.
- Build out the production and test infrastructure.
- Develop automation frameworks to reproduce issues and prevent regressions.
- Work closely with other developers providing services to our system.
- Help analyze and understand how customers use the product, and improve it where necessary.
Qualifications:
- Deep familiarity with the Java programming language.
- Hands-on experience with distributed systems.
- Knowledge of database concepts and RDBMS internals.
- Knowledge of the Hadoop stack, containers, or Kubernetes is a strong plus.
- Experience working in a distributed team.
- 3+ years of experience in software development.
- Experience in Java and associated technologies: Core Java, JSP, Spring, Struts, RESTful services, SOAP, Tomcat, Hibernate, Maven
- Strong understanding of OOPS concepts
- Proficient understanding of RDBMS concepts
- Strong understanding of databases (Oracle / MSSQL / MySQL, etc.) and PL/SQL programming
- Bachelor’s or master’s degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field; at least 3 years of relevant experience in lieu of the above if from a different stream of education.
- Well-versed in, with 3+ years of hands-on, demonstrable experience in:
▪ Stream & batch Big Data pipeline processing using Apache Spark and/or Apache Flink
▪ Distributed cloud-native computing, including serverless functions
▪ Relational, object-store, document, graph, etc. database design & implementation
▪ Microservices architecture, API modeling, design, & programming
- 3+ years of hands-on development experience in Apache Spark using Scala and/or Java.
- Ability to write executable code for services using Spark RDD, Spark SQL, Structured Streaming, Spark MLlib, etc., with a deep technical understanding of the Spark processing framework.
- In-depth knowledge of standard programming languages such as Scala and/or Java.
- 3+ years of hands-on development experience in one or more libraries & frameworks such as Apache Kafka, Akka, Apache Storm, Apache NiFi, ZooKeeper, the Hadoop ecosystem (i.e., HDFS, YARN, MapReduce, Oozie & Hive), etc.; extra points if you can demonstrate your knowledge with working examples.
- 3+ years of hands-on development experience in one or more relational and NoSQL datastores such as PostgreSQL, Cassandra, HBase, MongoDB, DynamoDB, Elasticsearch, Neo4j, etc.
- Practical knowledge of distributed systems involving partitioning, bucketing, the CAP theorem, replication, horizontal scaling, etc.
- Passion for distilling large volumes of data and analyzing performance, scalability, and capacity issues in Big Data platforms.
- Ability to clearly distinguish between system and Spark job performance, and to perform Spark performance tuning and resource optimization.
- Ability to perform benchmarking/stress tests and document best practices for different applications.
- Proactively work with tenants on improving overall performance and ensuring the system is resilient and scalable.
- Good understanding of virtualization & containerization; must demonstrate experience in technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox, Vagrant, etc.
- Well-versed, with demonstrable working experience, in API management, API gateways, service mesh, identity & access management, and data protection & encryption.
- Hands-on, demonstrable working experience with DevOps tools and platforms, viz. Jira, Git, Jenkins, code quality & security plugins, Maven, Artifactory, Terraform, Ansible/Chef/Puppet, Spinnaker, etc.
- Well-versed in AWS, Azure, and/or Google Cloud; must demonstrate experience in at least FIVE (5) services offered under AWS, Azure, and/or Google Cloud in any of these categories: Compute or Storage, Database, Networking & Content Delivery, Management & Governance, Analytics, Security, Identity, & Compliance (or equivalent demonstrable cloud platform experience).
- Good understanding of storage, network, and storage networking basics, which will enable you to work in a cloud environment.
- Good understanding of network, data, and application security basics, which will enable you to work in a cloud as well as a Business Applications / API services environment.
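The partitioning and horizontal-scaling requirement above can be illustrated with a toy consistent-hash ring in plain Java. The node names and the hash-mixing step are invented for the sketch; real datastores such as Cassandra or DynamoDB use stronger hashes plus replication on top of this idea.

```java
import java.util.SortedMap;
import java.util.TreeMap;

// A toy consistent-hashing partitioner, the mechanism behind horizontal
// scaling in many distributed datastores. Illustrative sketch only.
public class ConsistentHashRing {
    private final SortedMap<Integer, String> ring = new TreeMap<>();
    private final int virtualNodes;

    public ConsistentHashRing(int virtualNodes) {
        this.virtualNodes = virtualNodes;
    }

    private int hash(String key) {
        // Spread Java's hashCode a little; real systems use MD5/Murmur.
        int h = key.hashCode();
        h ^= (h >>> 16);
        return h & 0x7fffffff;
    }

    // Each physical node gets several positions ("virtual nodes") on the ring
    // so keys spread more evenly.
    public void addNode(String node) {
        for (int i = 0; i < virtualNodes; i++) {
            ring.put(hash(node + "#" + i), node);
        }
    }

    public void removeNode(String node) {
        for (int i = 0; i < virtualNodes; i++) {
            ring.remove(hash(node + "#" + i));
        }
    }

    // Route a key to the first node clockwise from its hash position.
    public String nodeFor(String key) {
        if (ring.isEmpty()) throw new IllegalStateException("no nodes in the ring");
        SortedMap<Integer, String> tail = ring.tailMap(hash(key));
        return tail.isEmpty() ? ring.get(ring.firstKey()) : tail.get(tail.firstKey());
    }
}
```

The useful property: removing a node only remaps the keys that node owned, while keys on surviving nodes stay put, which is what makes scaling out (or losing a node) cheap.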
- Hands-on Java Engineers, with experience building consumer-facing or enterprise applications using Java stack – Spring, Hibernate, MySQL
- Strong problem solving and analytical skills
- Strong understanding of Object-Oriented Programming concepts and Design patterns.
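The design-patterns requirement can be made concrete with a small Strategy-pattern sketch; the fee-calculation domain here is invented purely for illustration.

```java
// Strategy pattern: the fee algorithm is chosen at runtime and injected,
// keeping Checkout closed to modification but open to new fee rules.
// The pricing domain is a made-up example.
interface FeeStrategy {
    double fee(double amount);
}

class FlatFee implements FeeStrategy {
    public double fee(double amount) { return 10.0; }
}

class PercentFee implements FeeStrategy {
    private final double rate;
    PercentFee(double rate) { this.rate = rate; }
    public double fee(double amount) { return amount * rate; }
}

class Checkout {
    private final FeeStrategy strategy;
    Checkout(FeeStrategy strategy) { this.strategy = strategy; }
    double total(double amount) { return amount + strategy.fee(amount); }
}
```

Swapping `new FlatFee()` for `new PercentFee(0.02)` changes pricing behavior without touching `Checkout` - the open/closed principle in miniature.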
Do You Know? (Skills good to have)
- Exposure to building service-oriented distributed systems
- Experience building systems that process big data in a distributed environment, either via real-time streaming or offline batching
- Experience with messaging systems like Kafka, RabbitMQ, Kinesis, etc.
- Experience with real-time computation tools like Storm/Spark or Hadoop-based tools
- Experience with data warehousing technologies like Redshift, BigQuery, etc.
- Ability to communicate across levels, with excellent verbal and written communication skills
- Ability to work in teams and collaborate with others to clarify requirements
- Ability to assist in documenting requirements, as well as resolve conflicts or ambiguities