
50+ Apache Kafka Jobs in India

Apply to 50+ Apache Kafka Jobs on CutShort.io. Find your next job, effortlessly. Browse Apache Kafka Jobs and apply today!

Inaza
Posted by Chandu from Cutshort
Remote only
3yrs+
Up to ₹18L / yr (varies)
Python
Flask
PostgreSQL
Apache Kafka

Lightning Job By Cutshort⚡

 

As part of this feature, you can expect status updates about your application and replies within 72 hours (once the screening questions are answered).

 

About us

Inaza is looking for a Solutions Deployment Engineer to join our remote team. This role is centered around developing and deploying innovative solutions for our insurance clients. As a vital member of our team, your work will have a significant impact on Inaza's growth. The position offers a competitive salary in a fully remote setting.

 

Responsibilities

  • Lead the deployment of customized solutions during the Proof of Concept (POC) phase for our insurance clients.
  • Work closely with clients to understand their needs and translate these into effective technical solutions.
  • Configure and tailor our platform to meet client requirements, ensuring a smooth deployment process.
  • Develop and maintain Python scripts for automation and integration.
  • Provide technical support and expertise during the deployment phase, including potential on-site visits to clients.

 

Must Haves

  • 3+ years of experience as a deployment engineer, solutions engineer, support engineer or similar role, preferably within the insurance industry.
  • Strong analytical and problem-solving skills to identify and resolve issues during the deployment process.
  • Proficiency in Python scripting and platform configuration.
  • Experience in client-facing roles, with strong communication and problem-solving skills.
  • Familiarity with data streaming technologies (Pulsar, Kafka) and their application in real-time data processing.
  • Proficiency in using version control systems such as Git.
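The streaming bullet above is easy to picture with a minimal sketch. This is an illustration only, not part of the posting: a `queue.Queue` stands in for a Kafka/Pulsar topic so the example runs without a broker, and the `claim_id`/`amount` fields are hypothetical.

```python
# Minimal sketch of the real-time stream-processing pattern the role calls for.
# A queue.Queue stands in for a Kafka/Pulsar topic; with a real cluster, the
# consumer loop would poll the client library instead of this queue.
import json
import queue

topic = queue.Queue()  # stand-in for a topic; name/fields are hypothetical

def produce(event: dict) -> None:
    """Serialize an event the way a producer would (bytes on the wire)."""
    topic.put(json.dumps(event).encode("utf-8"))

def consume_all() -> list:
    """Drain the topic, deserializing and validating events as they arrive."""
    processed = []
    while not topic.empty():
        event = json.loads(topic.get().decode("utf-8"))
        if event.get("amount", 0) > 0:  # toy validation step
            processed.append(event)
    return processed

produce({"claim_id": "c-1", "amount": 1200})
produce({"claim_id": "c-2", "amount": 0})      # dropped by validation
produce({"claim_id": "c-3", "amount": 450})

results = consume_all()
print([e["claim_id"] for e in results])  # ['c-1', 'c-3']
```

The serialize-then-deserialize round trip is the part that carries over to a real broker; the in-memory queue is only there to keep the sketch self-contained.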

 

Additional skills that would be advantageous:

  • Knowledge of database management, ideally PostgreSQL.
  • Understanding of containerization technologies like Kubernetes and Docker.
  • Experience with FastAPI or other Python frameworks for API development.
Databook
Posted by Nikhil Mohite
Mumbai
1 - 3 yrs
Up to ₹20L / yr (varies)
Data engineering
Python
Apache Kafka
Spark
Amazon Web Services (AWS)
+1 more

Lightning Job By Cutshort ⚡

 

 

About Databook

Great salespeople let their customers’ strategies do the talking.

 

Databook’s award-winning Strategic Relationship Management (SRM) platform uses advanced AI and NLP to empower the world’s largest B2B sales teams to create, manage, and maintain strategic relationships at scale. The platform ingests and interprets billions of financial and market data signals to generate actionable sales strategies that connect the seller’s solutions to a buyer’s financial pain and urgency.

 

The Opportunity

We're seeking Junior Engineers to support and develop Databook’s capabilities. Working closely with our seasoned engineers, you'll contribute to crafting new features and ensuring our platform's reliability. If you're eager to play a part in building the future of customer intelligence, with a keen eye towards quality, we'd love to meet you!

 

Specifically, you'll

- Participate in various stages of the engineering lifecycle alongside our experienced engineers.

- Assist in maintaining and enhancing features of the Databook platform.

- Collaborate with various teams to comprehend requirements and aid in implementing technology solutions.

 

Please note: As you progress and grow with us, you might be introduced to on-call rotations to handle any platform challenges.

 

Working Arrangements:

- This position offers a hybrid work mode, allowing employees to work both remotely and in-office as mutually agreed upon.

 

What we're looking for

- 1-2+ years of experience as a Data Engineer

- Bachelor's degree in Engineering

- Willingness to work across different time zones

- Ability to work independently

- Knowledge of cloud (AWS or Azure)

- Exposure to distributed systems such as Spark, Flink or Kafka

- Fundamental knowledge of data modeling and optimizations

- At least one year of experience using Python as a Software Engineer

- Knowledge of SQL (Postgres) databases would be beneficial

- Experience with building analytics dashboards

- Familiarity with RESTful APIs and/or GraphQL is welcomed

- Hands-on experience with NumPy, Pandas, spaCy would be a plus

- Exposure to or working experience with GenAI (LLMs in general) and LLMOps would be a plus

- Highly fluent in both spoken and written English
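As a rough illustration of the SQL and Python skills listed above (not part of the posting), here is a minimal sketch using the standard-library `sqlite3` module as a stand-in for Postgres; the schema and values are invented.

```python
# Toy example of the SQL-side skills the listing asks for: stdlib sqlite3
# stands in for Postgres, and the schema/values are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (account TEXT, metric TEXT, value REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("acme", "revenue", 120.0), ("acme", "revenue", 80.0), ("globex", "revenue", 50.0)],
)

# A typical aggregation a data engineer might expose to an analytics dashboard.
rows = conn.execute(
    "SELECT account, SUM(value) FROM events GROUP BY account ORDER BY account"
).fetchall()
print(rows)  # [('acme', 200.0), ('globex', 50.0)]
```

Against a real Postgres instance, only the connection line would change (e.g. to a driver such as psycopg); the SQL itself is portable.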

 

Ideal candidates will also have:

- Self-motivation and great organizational skills.

- An ability to focus on small and subtle details.

- Willingness to learn and adapt in a rapidly changing environment.

- Excellent written and oral communication skills.

 

Join us and enjoy these perks!

- Competitive salary with bonus

- Medical insurance coverage

- 5 weeks leave plus public holidays

- Employee referral bonus program

- Annual learning stipend to spend on books, courses or other training materials that help you develop skills relevant to your role or professional development

- Complimentary subscription to Masterclass

Opstech
Posted by Nikhil Mohite
Hyderabad
3 - 6 yrs
Up to ₹20L / yr (varies)
React.js
NodeJS (Node.js)
Apache Kafka

Lightning Job By Cutshort ⚡

 

About us:

Opstech is a startup founded to bring data-driven manufacturing to India's small and medium-scale sector. Given the lack of digital infrastructure in SMEs and the onset of the Fourth Industrial Revolution, we decided to take the problem head-on, starting with the most pressing issue in the manufacturing industry: the problem of quality for every SME in India.

 

Our products are designed not only to ensure quality for each product but also to help the workforce take a sense of responsibility for each product. They also comply with Six Sigma and lean manufacturing principles, making them very attractive to the manufacturing industry.

 

Position Overview:

As a Full Stack Developer at Opstech, you will play a crucial role in the end-to-end development of web applications and mobile applications. You will be responsible for designing, implementing, testing, and maintaining scalable, high-performance software solutions. Your expertise in React and Node.js will be instrumental in creating seamless and responsive user interfaces. Additionally, your experience with Kafka and AWS will be crucial in designing and implementing scalable, distributed, and cloud-native solutions.

 

Key Responsibilities:

1. Develop and maintain robust and scalable web applications using React for the frontend and Node.js for the backend.

2. Integrate Kafka for real-time event streaming, ensuring seamless communication between microservices.

3. Implement serverless and cloud-native solutions on AWS, leveraging services like Lambda, API Gateway, S3, and more.

4. Design and implement efficient, reusable, and reliable code following best practices.

5. Ensure the responsiveness and performance of applications across various devices and browsers.

6. Debug and resolve issues in a timely manner, ensuring the stability and reliability of the applications.

7. Stay updated on industry trends and emerging technologies, actively contributing to the continuous improvement of development processes.
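The event-streaming responsibility (item 2) is easy to sketch. The role's stack is Node.js, but the pattern is language-agnostic, so here is an illustrative Python sketch in which an in-memory broker stands in for a Kafka cluster; the topic and field names are invented for the example.

```python
# Sketch of the event-driven decoupling Kafka provides between microservices.
# An in-memory broker replaces the real cluster so this runs standalone.
import json
from collections import defaultdict

class InMemoryBroker:
    """Toy stand-in for Kafka: topics fan events out to every subscriber."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        payload = json.dumps(event)  # events cross the wire serialized
        for handler in self.subscribers[topic]:
            handler(json.loads(payload))

broker = InMemoryBroker()
audit_log, notifications = [], []

# Two independent "microservices" react to the same event without knowing
# about each other -- the producer only knows the topic name.
broker.subscribe("order-created", lambda e: audit_log.append(e["order_id"]))
broker.subscribe("order-created", lambda e: notifications.append(f"email for {e['order_id']}"))

broker.publish("order-created", {"order_id": "o-42"})
print(audit_log, notifications)  # ['o-42'] ['email for o-42']
```

With Kafka, each subscriber would be a consumer group on the topic, giving the same fan-out plus durability and replay, which the in-memory stand-in does not provide.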

 

Qualifications:

1. Bachelor's degree in Computer Science, Engineering, or a related field.

2. Proven 3-5 years of experience as a Full Stack Developer with a focus on React, Node.js, Kafka, and AWS.

3. Strong proficiency in JavaScript and its modern frameworks (React).

4. Experience with server-side development using Node.js and related frameworks (Express, etc.).

5. Knowledge of event-driven architecture and experience with Apache Kafka.

6. Hands-on experience with AWS services and infrastructure as code (CloudFormation, Terraform).

7. Familiarity with database systems such as MongoDB, MySQL, or PostgreSQL.

8. Understanding of RESTful APIs and microservices architecture.

9. Knowledge of version control systems, preferably Git and Bitbucket.

10. Excellent problem-solving and communication skills.

11. Ability to work independently and collaboratively in a team environment.

 

Additional good to have skills

● Experience with containerization technologies (Docker, Kubernetes).

● Knowledge of state management libraries (Redux, MobX).

● Familiarity with build tools and package managers (Webpack, npm).

● Understanding of CI/CD pipelines.

● Front-end technologies such as HTML5, CSS3, and responsive design principles.

JISA Softech Pvt
Posted by Aarti khatpe
Pune
3 - 5 yrs
₹14L - ₹18L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Design patterns
+3 more

Job Location: Pune

Experience: 4-5 years

Functional Area: IT Software - Application Programming, Maintenance

Role Category: Programming & Design

 

Requirement / Job Description:

 

Core Skills:

  • Strong experience with Core Java (1.7 or higher), OOP concepts, and the Spring framework (Core, AOP, Batch, JMS)
  • Demonstrated design experience using Web Services (SOAP and REST)
  • Demonstrated microservices API design experience using Spring and Spring Boot
  • Demonstrable experience with databases such as MySQL, PostgreSQL, and Oracle PL/SQL development
  • Strong coding skills, good analytical and problem-solving skills
  • Excellent understanding of authentication, identity management, REST APIs, security, and best practices
  • Good understanding of web servers such as Apache Tomcat, nginx, Vertex/Grizzly, and JBoss
  • Experience with OAuth principles
  • Strong understanding of various design patterns

 

Other Skills:

  • Familiarity with Java Cryptography Architecture (JCA)
  • Understanding of API gateways such as Zuul and Eureka Server
  • Familiarity with Apache Kafka, MQTT, etc.

 

Responsibilities:

  • Design, develop, test, and debug software modules for an enterprise security product
  • Find areas of optimization and produce high-quality code
  • Collaborate with product managers and other members of the project team on requirements specification and detailed engineering analysis
  • Collaborate with various stakeholders and help drive proactive closure of issues
  • Evaluate technology trends and bring in best practices
  • Innovate and come up with out-of-the-box solutions
  • Adapt, thrive, and deliver in a highly evolving and demanding product development team
  • Come up with ways to provide an improved customer experience


Astra Security
Posted by Human Resources
Remote only
2 - 4 yrs
₹8L - ₹13L / yr
Go Programming (Golang)
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
RESTful APIs
SaaS
+12 more

About us

Astra is a cybersecurity SaaS company that makes otherwise chaotic pentests a breeze with its one-of-a-kind Pentest Platform. Astra's continuous vulnerability scanner emulates hacker behavior to scan applications for 8300+ security tests. CTOs & CISOs love Astra because it helps them fix vulnerabilities in record time and move from DevOps to DevSecOps with Astra's CI/CD integrations.


Astra is loved by 650+ companies across the globe. In 2023 Astra uncovered 2 million+ vulnerabilities for its customers, saving customers $69M+ in potential losses due to security vulnerabilities. 


We've been awarded by the President of France Mr. François Hollande at the La French Tech program and Prime Minister of India Shri Narendra Modi at the Global Conference on Cyber Security. Loom, MamaEarth, Muthoot Finance, Canara Robeco, ScripBox etc. are a few of Astra’s customers.


Role Overview

As an SDE 2 Back-end Engineer at Astra, you will play a crucial role in the development of a new vulnerability scanner from scratch. You will be architecting & engineering a scalable technical solution from the ground up.

You will have the opportunity to work alongside talented individuals, collaborating to deliver innovative solutions and pushing the boundaries of what's possible in vulnerability scanning. The role requires deep collaboration with the founders, product, engineering & security teams.

Join our team and contribute to the development of a cutting-edge SaaS security platform, where high-quality engineering and continuous learning are at the core of everything we do.


Roles & Responsibilities:


  • You will be joining our Vulnerability Scanner team which builds a security engine to identify vulnerabilities in technical infrastructure.
  • You will be the technical product owner of the scanner, which would involve managing a lean team of backend engineers to ensure smooth implementation of the technical product roadmap.
  • Research security vulnerabilities, CVEs, and zero-days affecting cloud/web/API infrastructure.
  • Work in an agile environment of engineers to architect, design, develop, and build our microservice infrastructure.
  • You will research, design, code, troubleshoot and support (on-call). What you create is also what you own.
  • Writing secure, high-quality, modular, testable & well-documented code for features outlined in every sprint.
  • Design and implement APIs in support of other services with a highly scalable, flexible, and secure backend using GoLang.
  • Hands-on experience with creating production-ready code & optimizing it by identifying and correcting bottlenecks.
  • Driving strict code review standards among the team.
  • Ensuring timely delivery of the features/products
  • Working with product managers to ensure product delivery status is transparent & the end product always looks like how it was imagined
  • Work closely with Security & Product teams in writing vulnerability detection rules, APIs etc.


Required Qualifications & Skills: 


  • 3-5 years relevant development experience in GoLang
  • Experience in building a technical product from idea to production.
  • Design and build highly scalable and maintainable systems in Golang
  • Expertise in Goroutines and Channels to write efficient code that utilizes multi-core CPUs optimally
  • Must have hands-on experience with managing AWS/Google Cloud infrastructure
  • Hands-on experience in creating low-latency, high-throughput REST APIs
  • Write test suites and maintain code coverage above 80%
  • Working knowledge of PostgreSQL, Redis, Kafka
  • Good to have experience in Docker, Kubernetes, Kafka
  • Good understanding of Data Structures, Algorithms and Operating Systems.
  • Understanding of cloud/web security concepts would be an added advantage


What We Offer:


  • Adrenalin rush of being a part of a fast-growing company
  • Fully remote & agile working environment
  • A wholesome opportunity in a fast-paced environment where you get to build things from scratch, improve and influence product design decisions
  • Holistic understanding of SaaS and enterprise security business
  • Opportunity to engage and collaborate with developers globally
  • Experience with security side of things
  • Annual trips to beaches or mountains (last one was Chikmangaluru)
  • Open and supportive culture 
Technogise Private Limited
Posted by Parag Shinde
Pune
5 - 8 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Test driven development (TDD)
+1 more

How do Technogisers function?

Value: Exploring technologies and implementing them on projects, provided they make business sense and deliver value.

Engagement: Be it offshore or onshore, we engage with clients daily. This helps build a trustworthy relationship while collaborating on strategic solutions to business problems.

Solution: We provide hands-on contributions to backend and front-end design and development while nurturing our DevOps culture.

Thought Leadership: Attend or present at technical meet-ups/workshops/conferences to share knowledge and help build the Technogise brand.


How can you become a Technogiser?

 

Core Skills:

  • A thorough understanding of at least one technology stack; be the go-to person for any problems related to it
  • Experience: 5 to 8 years of Java experience
  • Should have knowledge of Kafka
  • Should have worked on Spring Boot and microservices
  • Should be able to write test cases
  • Influence technical decision-making and high-level design decisions - choice of frameworks and tech approach
  • Demonstrate the ability to understand different approaches for application and integration, and influence decisions by making appropriate trade-offs

 

Ways of working:

  • You communicate effectively with other roles in the project at the team and client levels.
  • You drive discussions effectively at the team and client levels. Encourage others to participate.

 

Going beyond

  • Establish credibility within the team as a result of technical and leadership skills
  • Mentoring fellow team members within the project team and providing technical guidance to others beyond project boundaries.
  • Build and own Growth framework of people in the project team.
  • Actively participate in organizational activities.

Tech stack: We are polyglots, so we focus on varied technologies:

Java, Node, MongoDB, microservices, Golang, Ruby, Ruby on Rails.

Technogise Private Limited
Posted by Phani Kumar
Bengaluru (Bangalore)
5 - 8 yrs
₹16L - ₹32L / yr
Java
Spring Boot
Test driven development (TDD)
Apache Kafka

How do Technogisers function?

Value: Exploring technologies and implementing them on projects, provided they make business sense and deliver value.

Engagement: Be it offshore or onshore, we engage with clients daily. This helps build a trustworthy relationship while collaborating on strategic solutions to business problems.

Solution: We provide hands-on contributions to backend and front-end design and development while nurturing our DevOps culture.

Thought Leadership: Attend or present at technical meet-ups/workshops/conferences to share knowledge and help build the Technogise brand.


How can you become a Technogiser?

 

Core Skills:

  • A thorough understanding of at least one technology stack; be the go-to person for any problems related to it
  • Experience: 5 to 8 years of Java experience
  • Should have knowledge of Kafka
  • Should have worked on Spring Boot and microservices
  • Should be able to write test cases
  • Influence technical decision-making and high-level design decisions - choice of frameworks and tech approach
  • Demonstrate the ability to understand different approaches for application and integration, and influence decisions by making appropriate trade-offs

 

Ways of working:

  • You communicate effectively with other roles in the project at the team and client levels.
  • You drive discussions effectively at the team and client levels. Encourage others to participate.

 

Going beyond

  • Establish credibility within the team as a result of technical and leadership skills
  • Mentoring fellow team members within the project team and providing technical guidance to others beyond project boundaries.
  • Build and own Growth framework of people in the project team.
  • Actively participate in organizational activities.

Tech stack: We are polyglots, so we focus on varied technologies:

Java, Node, MongoDB, microservices, Golang, Ruby, Ruby on Rails.

Egen Solutions
Remote only
4 - 7 yrs
₹12L - ₹24L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
CI/CD
+4 more

Egen is a data engineering and cloud modernization firm helping industry-leading companies achieve digital breakthroughs and deliver for the future, today. We are catalysts for change who create digital breakthroughs at warp speed. Our team of cloud and data engineering experts are trusted by top clients in pursuit of the extraordinary. A seven-time Inc. 5000 Fastest Growing Company, recently named to the Crain’s Chicago Business Fast 50 list, Egen has also been recognized as a great place to work three times.

 

You will join a team of insatiably curious data engineers, software architects, and product experts who never settle for "good enough". Our Java Platform team's tech stack is based on Java 8 (Spring Boot) and RESTful web services. We typically build and deploy applications as cloud-native Kubernetes microservices and integrate with scalable technologies such as Kafka in Docker container environments. Our developers work in an agile process to efficiently deliver high-value, data-driven applications and product packages.

 

Required Experience:

  • Minimum of Bachelor’s Degree or its equivalent in Computer Science, Computer Information Systems, Information Technology and Management, Electrical Engineering or a related field.
  • Experience working with, and a strong understanding of, object-oriented programming and cloud technologies
  • End-to-end experience delivering production-ready code with Java 8, Spring Boot, Spring Data, and API libraries
  • Strong experience with unit and integration testing of Spring Boot APIs.
  • Strong understanding of and production experience with RESTful APIs and microservice architecture.
  • Strong understanding of SQL and NoSQL databases, and experience writing abstraction layers to communicate with them.

Nice to haves (but not required):

  • Exposure to Kotlin or other JVM programming languages
  • Strong understanding and production experience working with Docker container environments
  • Strong understanding and production experience working with Kafka
  • Cloud Environments: AWS, GCP or Azure


Reflektive
Posted by Payal Banchare
Remote only
4 - 8 yrs
₹7L - ₹21L / yr
React.js
Angular (2+)
AngularJS (1.x)
Vue.js
Javascript
+7 more

About the job


About Reflektive's Engineering Team

We are seeking a Senior Software Engineer, Front End to help scale Reflektive into the market leader for employee performance management. The main question to be answered: can you help a company scale?


Reflektive has major initiatives to tackle in the next year. Initiatives range from internal scaling, security, engagement, new verticals, pervasive technologies, research and development, data and analytics, and customer tools. Reflektive’s Senior Software Engineer will contribute in their area of specialization. They will help us solve complex design challenges and mature our platform to handle increasing traffic and scale.


You'll join a lean, prolific team where everyone, including you, is active in the product-defining and development process (where deploying new features every 2 weeks is common). You'll know the customers we're talking to, and the needs of each one. As a result, you know where your initiative and drive can best make a difference (and be recognized!).


Our engineering team consists of developers from a wide array of backgrounds. Our team primarily focuses on Rails and Javascript, but is always ready to use the best tool for the job when it makes sense. Following Scrum practices, we work closely with the Product Management team to develop features that focus on empowering and developing employees. Our team is a tight-knit, friendly group of engineers dedicated to learning from and teaching each other. Team members regularly contribute to and optimize our engineering practices and processes. Our team wants to make software engineering fun, easy, and fulfilling, so we've come up with a set of values that we apply to our software every day: Simple, Flexible, Consistent, Predictable, Efficient, and Pragmatic.


Responsibilities

● Depending on your specialization, projects/initiatives may include: security, scaling distributed systems, working on our core services related to user management, building out new verticals, guiding new engagement features, scaling traffic/imports/exports, and managing APIs.

● Work extremely cross-functionally across Engineering and Product Management.

● Deliverable: (30 days) Own a feature, possibly paired with another engineer. (60 days) Own and drive a new initiative. (90 days) Bring that initiative to production.



Desired Skills and Experience


  • Expert proficiency in Java/Kotlin, Kafka/Pulsar, SQL, Docker, and Kubernetes.
  • Overall 4+ years of experience as a Java full-stack developer using any modern frameworks.
  • Strong knowledge of data structures and algorithms
  • Previous experience working in ReactJS/AngularJS (2+ years)
  • Knowledge of Analytics and LookerML would be an added plus.
  • Exposure to the AWS cloud environment.
  • Knowledge of unit testing frameworks, including JUnit.
  • Startup experience is strongly desired.
  • You learn quickly; you’re adaptable and versatile.
  • Experience in an Agile and Scrum environment.





About Reflektive

Forward-thinking organizations use Reflektive’s people management platform to drive employee performance and development with Real-Time Feedback, Recognition, Check-Ins, Goal Management, Performance Reviews, 1-on-1 Profiles, and Analytics. Reflektive’s more than 500 customers include Blue Origin, Comcast, Instacart, Dollar Shave Club, Healthgrades, Wavemaker Global, and Protective Life. Backed by Andreessen Horowitz, Lightspeed Venture Partners, and TPG Growth, Reflektive has raised more than $100 million to date, and was ranked the 13th Fastest Growing Company in North America on Deloitte’s 2018 Technology Fast 500™.

We are an equal opportunity employer and value diversity at our company. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

DeepIntent
Posted by Indrajeet Deshmukh
Pune
1 - 3 yrs
Best in industry
MongoDB
Big Data
Apache Kafka
Spring MVC
Spark
+1 more

With a core belief that advertising technology can measurably improve the lives of patients, DeepIntent is leading the healthcare advertising industry into the future. Built purposefully for the healthcare industry, the DeepIntent Healthcare Advertising Platform is proven to drive higher audience quality and script performance with patented technology and the industry’s most comprehensive health data. DeepIntent is trusted by 600+ pharmaceutical brands and all the leading healthcare agencies to reach the most relevant healthcare provider and patient audiences across all channels and devices. For more information, visit DeepIntent.com or find us on LinkedIn.


What You’ll Do:

  • Ensure timely and top-quality product delivery
  • Ensure that the end product is fully and correctly defined and documented
  • Ensure implementation/continuous improvement of formal processes to support product development activities
  • Drive the architecture/design decisions needed to achieve cost-effective and high-performance results
  • Conduct feasibility analysis, produce functional and design specifications of proposed new features.
  • Provide helpful and productive code reviews for peers and junior members of the team.
  • Troubleshoot complex issues discovered in-house as well as in customer environments.


Who You Are:

  • Strong computer science fundamentals in algorithms, data structures, databases, operating systems, etc.
  • Expertise in Java, Object Oriented Programming, Design Patterns
  • Experience in coding and implementing scalable solutions in a large-scale distributed environment
  • Working experience in a Linux/UNIX environment is good to have
  • Experience with relational databases and database concepts, preferably MySQL
  • Experience with SQL and Java optimization for real-time systems
  • Familiarity with version control systems Git and build tools like Maven
  • Excellent interpersonal, written, and verbal communication skills
  • BE/B.Tech./M.Sc./MCS/MCA in Computers or equivalent


The set of skills we are looking for:

  • MongoDB
  • Big Data
  • Apache Kafka 
  • Spring MVC 
  • Spark 
  • Java 


DeepIntent is committed to bringing together individuals from different backgrounds and perspectives. We strive to create an inclusive environment where everyone can thrive, feel a sense of belonging, and do great work together.

DeepIntent is an Equal Opportunity Employer, providing equal employment and advancement opportunities to all individuals. We recruit, hire and promote into all job levels the most qualified applicants without regard to race, color, creed, national origin, religion, sex (including pregnancy, childbirth and related medical conditions), parental status, age, disability, genetic information, citizenship status, veteran status, gender identity or expression, transgender status, sexual orientation, marital, family or partnership status, political affiliation or activities, military service, immigration status, or any other status protected under applicable federal, state and local laws. If you have a disability or special need that requires accommodation, please let us know in advance.

DeepIntent’s commitment to providing equal employment opportunities extends to all aspects of employment, including job assignment, compensation, discipline and access to benefits and training.

DeepIntent
Posted by Indrajeet Deshmukh
Pune
3 - 8 yrs
Best in industry
MongoDB
Big Data
Apache Kafka
Spring MVC
Spark
+1 more

With a core belief that advertising technology can measurably improve the lives of patients, DeepIntent is leading the healthcare advertising industry into the future. Built purposefully for the healthcare industry, the DeepIntent Healthcare Advertising Platform is proven to drive higher audience quality and script performance with patented technology and the industry’s most comprehensive health data. DeepIntent is trusted by 600+ pharmaceutical brands and all the leading healthcare agencies to reach the most relevant healthcare provider and patient audiences across all channels and devices. For more information, visit DeepIntent.com or find us on LinkedIn.


What You’ll Do:

  • Ensure timely and top-quality product delivery
  • Ensure that the end product is fully and correctly defined and documented
  • Ensure implementation/continuous improvement of formal processes to support product development activities
  • Drive the architecture/design decisions needed to achieve cost-effective and high-performance results
  • Conduct feasibility analysis, produce functional and design specifications of proposed new features.
  • Provide helpful and productive code reviews for peers and junior members of the team.
  • Troubleshoot complex issues discovered in-house as well as in customer environments.


Who You Are:

  • Strong computer science fundamentals in algorithms, data structures, databases, operating systems, etc.
  • Expertise in Java, Object Oriented Programming, Design Patterns
  • Experience in coding and implementing scalable solutions in a large-scale distributed environment
  • Working experience in a Linux/UNIX environment is good to have
  • Experience with relational databases and database concepts, preferably MySQL
  • Experience with SQL and Java optimization for real-time systems
  • Familiarity with version control systems Git and build tools like Maven
  • Excellent interpersonal, written, and verbal communication skills
  • BE/B.Tech./M.Sc./MCS/MCA in Computer Science or equivalent


The set of skills we are looking for:

  • MongoDB
  • Big Data
  • Apache Kafka 
  • Spring MVC 
  • Spark 
  • Java 


DeepIntent is committed to bringing together individuals from different backgrounds and perspectives. We strive to create an inclusive environment where everyone can thrive, feel a sense of belonging, and do great work together.

DeepIntent is an Equal Opportunity Employer, providing equal employment and advancement opportunities to all individuals. We recruit, hire and promote into all job levels the most qualified applicants without regard to race, color, creed, national origin, religion, sex (including pregnancy, childbirth and related medical conditions), parental status, age, disability, genetic information, citizenship status, veteran status, gender identity or expression, transgender status, sexual orientation, marital, family or partnership status, political affiliation or activities, military service, immigration status, or any other status protected under applicable federal, state and local laws. If you have a disability or special need that requires accommodation, please let us know in advance.

DeepIntent’s commitment to providing equal employment opportunities extends to all aspects of employment, including job assignment, compensation, discipline and access to benefits and training.


Wissen Technology

at Wissen Technology

4 recruiters
Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Pune, Mumbai, Bengaluru (Bangalore)
3 - 6 yrs
₹5L - ₹20L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Messaging
+5 more

Wissen Technology is hiring!


Java Developer with messaging experience (JMS/EMS/Kafka/RabbitMQ) and CI/CD.


Experience: 3 to 6 years

Location: Pune | Mumbai | Bangalore

Notice period: serving notice, or 15 days or less.

Requirement:

Core Java 8

Mandatory experience in any of the messaging technologies like JMS/EMS/Kafka/RabbitMQ

Extensive experience in developing enterprise-scale n-tier applications for the financial domain.

Should possess good architectural knowledge and be aware of enterprise application design patterns. 

Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.

Mandatory development experience on a CI/CD platform.

Good knowledge of multi-threading and high-volume server-side development

Experience in sales and trading platforms in investment banking/capital markets

Basic working knowledge of Unix/Linux. 

Strong written and oral communication skills, and the ability to express design ideas and thoughts clearly.

Egen Solutions
Anshul Saxena
Posted by Anshul Saxena
Remote, Hyderabad, Ahmedabad, Noida, Delhi, Gurugram, Ghaziabad, Faridabad, Kolkata, Indore, Bhopal, Kochi (Cochin), Chennai, Bengaluru (Bangalore), Pune
3 - 5 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Kotlin
+3 more

Egen is a data engineering and cloud modernization firm helping industry-leading companies achieve digital breakthroughs and deliver for the future, today. We are catalysts for change who create digital breakthroughs at warp speed. Our team of cloud and data engineering experts are trusted by top clients in pursuit of the extraordinary. An Inc. 5000 Fastest Growing Company 7 times, and recently recognized on the Crain’s Chicago Business Fast 50 list, Egen has also been recognized as a great place to work 3 times.


You will join a team of insatiably curious data engineers, software architects, and product experts who never settle for "good enough". Our Java Platform team's tech stack is based on Java 8 (Spring Boot) and RESTful web services. We typically build and deploy applications as cloud-native Kubernetes microservices and integrate with scalable technologies such as Kafka in Docker container environments. Our developers work in an agile process to efficiently deliver high-value, data-driven applications and product packages.


Required Experience:

  • Minimum of Bachelor’s Degree or its equivalent in Computer Science, Computer Information Systems, Information Technology and Management, Electrical Engineering or a related field.
  • Experience working with, and a strong understanding of, object-oriented programming and cloud technologies
  • End to end experience delivering production ready code with Java8, Spring Boot, Spring Data, and API libraries
  • Strong experience with unit and integration testing of the Spring Boot APIs.
  • Strong understanding and production experience of RESTful APIs and microservice architecture.
  • Strong understanding of SQL databases and NoSQL databases and experience with writing abstraction layers to communicate with the databases.

Nice to haves (but not required):

  • Exposure to Kotlin or other JVM programming languages
  • Strong understanding and production experience working with Docker container environments
  • Strong understanding and production experience working with Kafka
  • Cloud Environments: AWS, GCP or Azure


Thoughtworks

at Thoughtworks

1 video
27 recruiters
Ramya S
Posted by Ramya S
Pune, Hyderabad, Chennai, Gurugram
3 - 5 yrs
Best in industry
Spark
PySpark
Data engineering
Big Data
Hadoop
+6 more

DATA ENGINEER – CONSULTANT


Data Engineers develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions. You might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems. On other projects, you might act as the architect, leading the design of technical solutions, or perhaps overseeing a program inception to build a new product. It could also be a software delivery project where you're equally happy coding and tech-leading the team to implement the solution.


Job Responsibilities

• You will partner with teammates to create complex data processing pipelines to solve our clients' most complex challenges

• You will collaborate with Data Scientists to design scalable implementations of their models

• You will pair to write clean and iterative code based on TDD

• Leverage various continuous delivery practices to deploy, support and operate data pipelines

• Advise and educate clients on how to use different distributed storage and computing technologies from the plethora of options available

• Develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions

• Create data models and speak to the tradeoffs of different modelling approaches

• Seamlessly incorporate data quality into your day-to-day work as well as into the delivery process

• Assure effective collaboration between Thoughtworks and the client's teams, encouraging open communication and advocating for shared outcomes


Job Qualifications


Technical skills

• You have a good understanding of data modelling and experience with data engineering tools and platforms such as Kafka, Spark, and Hadoop

• You have built large-scale data pipelines and data-centric applications using any of the distributed storage platforms such as HDFS, S3, NoSQL databases (HBase, Cassandra, etc.) and any of the distributed processing platforms like Hadoop, Spark, Hive, Oozie, and Airflow in a production setting

• Hands-on experience with MapR, Cloudera, Hortonworks and/or cloud-based (AWS EMR, Azure HDInsight, Qubole, etc.) Hadoop distributions

• You are comfortable taking data-driven approaches and applying data security strategy to solve business problems

• Working with data excites you: you can build and operate data pipelines, and maintain data storage, all within distributed systems

• You're genuinely excited about data infrastructure and operations with a familiarity working in cloud environments


Professional skills

• You're resilient and flexible in ambiguous situations and enjoy solving problems from technical and business perspectives

• An interest in coaching, sharing your experience and knowledge with teammates

• You enjoy influencing others and always advocate for technical excellence while being open to change when needed

• Presence in the external tech community: you willingly share your expertise with others via speaking engagements, contributions to open source, blogs and more


Other things to know


Learning & Development


There is no one-size-fits-all career path at Thoughtworks: however you want to develop your career is entirely up to you. But we also balance autonomy with the strength of our cultivation culture. This means your career is supported by interactive tools, numerous development programs and teammates who want to help you grow. We see value in helping each other be our best, and that extends to empowering our employees in their career journeys.


About Thoughtworks

Thoughtworks is a global technology consultancy that integrates strategy, design and engineering to drive digital innovation. For over 30 years, our clients have trusted our autonomous teams to build solutions that look past the obvious. Here, computer science grads come together with seasoned technologists, self-taught developers, midlife career changers and more to learn from and challenge each other. Career journeys flourish with the strength of our cultivation culture, which has won numerous awards around the world.

A Customer Data Platform-led personalization and real-time marketing automation solution that delivers superior customer experiences, resulting in increased conversions, retention, and growth for enterprises. Bengaluru (Bangalore)
5 - 8 yrs
₹18L - ₹30L / yr
Docker
Kubernetes
DevOps
Jenkins
Apache Kafka
+6 more

Key Responsibilities:

● Set up a production-grade Apache Kafka in Kubernetes with Jenkins integration.

● Design a Jenkins-based CI/CD pipeline for multiple components, including Kafka Connect.

● Implement Horizontal Pod Autoscaler (HPA) for our 30-40 micro components.

● Configure comprehensive logging and monitoring, setting up Grafana dashboards with Prometheus integration.
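To make the HPA responsibility above concrete, here is a minimal sketch of one autoscaler manifest for a single micro component, scaling on CPU utilization. All names and thresholds (`orders-service`, the `platform` namespace, the 70% target) are illustrative assumptions, not details from the actual stack:

```yaml
# Hypothetical HPA for one of the micro components; names/thresholds are illustrative.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: orders-service-hpa
  namespace: platform
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: orders-service
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```

Applied with `kubectl apply -f`, one such manifest per component; in practice, across 30-40 components, these would likely be templated with Helm rather than hand-written.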


Key Requirements:

Must-Have Skills:

● Deep expertise in Git, Jenkins, Kubernetes, Helm, Docker, Kafka, Kafka Connect, Maven, Java, Prometheus, Grafana, EKS/AKS, Load Balancer, and Ingress.

● Experience working with AWS and Azure cloud environments.

● 5+ years of relevant work experience.

Nice to Have:

● Familiarity with managing On-prem Kubernetes is a plus.

● Familiarity with Secret Manager, Node.js, Python, Confluent Kafka, MongoDB, Elasticsearch, Fluent Bit, Fluentd, Aerospike, HBase, and MySQL.

● If you’re passionate about orchestrating large-scale system migrations and love to stay on the bleeding edge of technology, then you’re just the professional we’re looking for.


Arroz Technology

at Arroz Technology

2 candid answers
Amogh Saxena
Posted by Amogh Saxena
Bengaluru (Bangalore)
1 - 3 yrs
₹1L - ₹5L / yr
AngularJS (1.x)
Angular (2+)
React.js
NodeJS (Node.js)
MongoDB
+5 more

Job Description: Full Stack Developer
Company: Arroz Technology Private Limited
CTC: 5 LPA

Location : Bangalore (Onsite)

Responsibilities:

- Design and develop scalable and high-performance web applications using the MERN (MongoDB, Express.js, React.js, Node.js) stack.

- Collaborate with cross-functional teams to gather requirements and translate them into high-level designs.

- Write clean, reusable, and well-structured code following industry best practices and coding standards.

- Mentor and guide junior developers, providing technical expertise and promoting professional growth.

- Conduct code reviews and provide constructive feedback to ensure code quality and adherence to standards.

- Collaborate with frontend and backend developers to integrate components and ensure smooth data flow.

- Work with UI/UX designers to implement responsive and user-friendly interfaces.

- Stay updated with the latest trends and advancements in full-stack development technologies.

- Work in a 10 AM to 6 PM, six-day office role, maintaining regular attendance and punctuality.

Required Skills and Qualifications:

- Strong proficiency in MERN (MongoDB, Express.js, React.js, Node.js) stack development.

- Experience with Redux or similar state management libraries.

- Solid understanding of front-end technologies such as HTML, CSS, and JavaScript.

- Proficiency in RESTful API development and integration.

- Familiarity with version control systems like Git and agile development methodologies.

- Good problem-solving and debugging skills.

- Excellent communication and teamwork abilities.

- Bachelor's degree in Computer Science or a related field (preferred).

Join Arroz Technology Private Limited as a Full Stack Developer and contribute to the development of cutting-edge web applications. This role offers competitive compensation and growth opportunities within a dynamic work environment. 

Gurugram, Delhi, Noida, Ghaziabad, Faridabad
4 - 8 yrs
Best in industry
Docker
Kubernetes
DevOps
Amazon Web Services (AWS)
Go Programming (Golang)
+3 more

Role: Senior Engineer - Infrastructure


Key Responsibilities:


● Infrastructure Development and Management: Design, implement, and manage robust and scalable infrastructure solutions, ensuring optimal performance, security, and availability. Lead transition and migration projects, moving legacy systems to cloud-based solutions.

● Develop and maintain applications and services using Golang.

● Automation and Optimization: Implement automation tools and frameworks to optimize operational processes. Monitor system performance, optimizing and modifying systems as necessary.

● Security and Compliance: Ensure infrastructure security by implementing industry best practices and compliance requirements. Respond to and mitigate security incidents and vulnerabilities.



Qualifications:


● Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent practical experience).

● Good understanding of prominent backend languages like Golang, Python, Node.js, or others.

● In-depth knowledge of network architecture, system security, and infrastructure scalability.

● Proficiency with development tools, server management, and database systems.

● Strong experience with cloud services (AWS), including deployment, scaling, and management.

● Knowledge of Azure is a plus

● Familiarity with containers and orchestration services, such as Docker, Kubernetes, etc.

● Strong problem-solving skills and analytical thinking.

● Excellent verbal and written communication skills.

● Ability to thrive in a collaborative team environment.

● Genuine passion for backend development and keen interest in scalable systems. 




Prolifics Corporation Ltd.,

at Prolifics Corporation Ltd.,

1 video
1 recruiter
Sandhya Patel
Posted by Sandhya Patel
Bengaluru (Bangalore)
5 - 9 yrs
Best in industry
Java
Spring Boot
Apache Kafka
Azure

Job Description:


Organization - Prolifics Corporation


Skill - Java developer


Job type - Full time/Permanent


Location - Bangalore/Mumbai


Experience - 5 to 10 Years


Notice Period – Immediate to 30 Days



Required Skillset:


Spring framework concepts, Spring Boot (Mandatory)

Spring Batch and dashboard

Apache Kafka (Mandatory)

Azure (Mandatory)

Git / Maven / Gradle / CI/CD

MS SQL database

Cloud and Data Exposure


Docker, Orchestration using Kubernetes

Genesys PureCloud, or any cloud-based contact center platform used to manage customer interactions.

Technical Experience:


The candidate should have 5+ years of experience, preferably at a technology or financial firm.

Must have at least 2-3 years of experience in Spring Batch / Java / Kafka / SQL.

Must have hands-on experience with database tools and technologies.

Must have exposure to CI / CD and Cloud.

Work scope


Build the Spring Batch framework to pull the required data from Genesys Cloud to MS reporting data storage – on-prem / cloud.

Build MS WM Contact Center Data Hub (on Prem / Cloud)

Build dashboard to monitor and manage the data injection, fusion jobs.

Event bridge implementation for real time data ingestion and monitoring

MS Private Cloud

Neysa Networks Pvt Ltd

at Neysa Networks Pvt Ltd

2 candid answers
Swapna Uchil
Posted by Swapna Uchil
Mumbai
7 - 10 yrs
Best in industry
TensorFlow
PyTorch
Python
Hadoop
R Programming
+9 more

Day in the life...


As a machine learning engineer at Neysa, you would be required to

- Collaborate with network engineers and IT teams to identify network-related challenges and areas where ML can provide solutions. Understand the specific network remediation problems that need to be addressed.

- Develop ML-based models and algorithms specific to issues that affect computer networks, for example, congestion, security threats and human errors.

- Be comfortable handling multiple data types and sources, pre-processing and cleaning them for modelling purposes. 

- Develop machine learning models that analyse networking data to predict (and possibly prevent) issues, detect anomalies or optimise performance. Choose the right approach for each, such as deep learning, reinforcement learning, or traditional statistical methods. 

- Train and evaluate the efficacy of your models, create performance metrics to assess robustness and effectiveness, and use feedback loops to make course corrections.

- Design solutions that can scale to handle large network environments efficiently. This typically means optimising execution latency and resource usage.  

- Integrate your model to work with running and maintaining a network. These could be with other machines, human operators, or both. 

- Document your design, architecture and your thought process. Work with the technical writers to make sure your message gets through.

- Stay updated with all that is changing in AI and ML.
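To make the anomaly-detection responsibility above concrete, here is a deliberately tiny, framework-free sketch: flag network latency samples whose z-score exceeds a threshold. This is purely illustrative — a real solution at this scale would use the ML frameworks this posting lists (TensorFlow, PyTorch, scikit-learn), not a hand-rolled rule:

```python
# Toy illustration of network anomaly detection: flag latency samples whose
# z-score exceeds a threshold. A sketch only -- production models would be
# learned with the ML frameworks named in this posting.
from statistics import mean, stdev

def flag_anomalies(samples_ms, threshold=2.0):
    """Return indices of samples more than `threshold` std devs from the mean."""
    if len(samples_ms) < 2:
        return []  # not enough data to estimate spread
    mu, sigma = mean(samples_ms), stdev(samples_ms)
    if sigma == 0:
        return []  # all samples identical: nothing stands out
    return [i for i, x in enumerate(samples_ms) if abs(x - mu) / sigma > threshold]

latencies = [12.1, 11.8, 12.5, 12.0, 11.9, 250.0, 12.3, 12.2]
print(flag_anomalies(latencies))  # -> [5]: the 250 ms spike stands out
```

Note the classic caveat: a single extreme outlier inflates the standard deviation and can mask itself at stricter thresholds, which is one reason production systems prefer learned models or robust statistics (e.g. median absolute deviation) over a raw z-score.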

 

Must have skills

 

- You should have expertise in machine learning algorithms, data processing, and model development. It would be best to have proficiency in associated frameworks such as TensorFlow, PyTorch or scikit-learn.

- You should understand how computer networks work, how they are used, how they fail, and what happens if they fail.

- You should be proficient in programming languages like Python, R, Go, LISP, etc. It would also help to be comfortable with old-school shell scripting. 

- Experience with data processing tools like Hadoop, Spark or Kafka. 

- An above-average understanding of Linux and operating systems in general.

 

What separates the best from the rest

 

ML and AI are continuously evolving, and so is understanding how these technologies can be applied to solve real-world problems. To do your best, you may also need to

- Conceptualise ways to map new problems to existing methodologies or create new ones.

- Be prepared to iterate, reiterate and then iterate your approach.

- Be able to interact with subject matter experts in multiple fields to identify potential use cases for machine learning.

 

What you can expect

 

An environment where you can do your best work...

- The best equipment that complements your talents

- The best tools in the business for you to bring your creations to life

- A great environment

- Flexible work hours and flexible work locations

- The opportunity to make your mark and shape the future

- And have fun...

Hyderabad
6 - 15 yrs
₹11L - ₹15L / yr
Python
Spark
SQL Azure
Apache Kafka
MongoDB
+4 more

• 5+ years of experience designing, developing, validating, and automating ETL processes
• 3+ years of experience with traditional ETL tools such as Visual Studio, SQL Server Management Studio, SSIS, SSAS and SSRS
• 2+ years of experience with cloud technologies and platforms, such as Kubernetes, Spark, Kafka, Azure Data Factory, Snowflake, MLflow, Databricks, Airflow or similar
• Must have experience with designing and implementing data access layers
• Must be an expert with SQL/T-SQL and Python
• Must have experience with Kafka
• Define and implement data models with various database technologies like MongoDB, CosmosDB, Neo4j, MariaDB and SQL Server
• Ingest and publish data from sources and to destinations via an API
• Exposure to ETL/ELT using Kafka or Azure Event Hubs with Spark or Databricks is a plus
• Exposure to healthcare technologies and integrations for FHIR API, HL7 or other HIE protocols is a plus


Skills Required :


Designing, Developing, ETL, Visual Studio, Python, Spark, Kubernetes, Kafka, Azure Data Factory, SQL Server, Airflow, Databricks, T-SQL, MongoDB, CosmosDB, Snowflake, SSIS, SSAS, SSRS, FHIR API, HL7, HIE Protocols

Aurum Ventures

at Aurum Ventures

1 recruiter
Hariom Pathak
Posted by Hariom Pathak
Navi Mumbai
4 - 7 yrs
₹10L - ₹15L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Microservices
+2 more

Key roles:

·       Develop the backend application following clean-code practices

·       Collaborate with frontend developers to integrate user-facing elements with server-side logic

·       Troubleshoot and debug application

·       Gather and address technical and design requirements

·       Build reusable code and libraries for future use

·       Communicate with other third-party teams for collaboration


Skillset:

·       Core Java - hands-on experience with JDK 11 and above

·       Experience with code hosting and collaboration tools like Bitbucket/Github/GitLab

·       Microservice architecture - Rest API calls, inter-service exception handling

·       Non-relational DB - MongoDB basic DB commands to insert, update, delete, find records and Indexing

·       Spring Boot framework - Spring Data JPA, ORM

·       Event driven architecture - Kafka

·       Tools like Postman, Jenkins, Doppler, IDE, MongoDB atlas


Qualifications:

·       UG: B.Tech/B.E. in Any Specialization, B.Sc in Any Specialization, BCA in Any Specialization

·       PG: Any Postgraduate

Molecular Connections

at Molecular Connections

4 recruiters
Molecular Connections
Posted by Molecular Connections
Bengaluru (Bangalore)
8 - 10 yrs
₹15L - ₹20L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+4 more
  1. Big data developer with 8+ years of professional IT experience with expertise in Hadoop ecosystem components in ingestion, Data modeling, querying, processing, storage, analysis, Data Integration and Implementing enterprise level systems spanning Big Data.
  2. A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
  3. Expertise in Apache Hadoop ecosystem components like Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, Zookeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
  4. Hands on experience in creating real - time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark streaming and Apache Storm.
  5. Excellent knowledge of Hadoop architecture and daemons of Hadoop clusters, which include NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
  6. Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
  7. Well versed in installation, Configuration, Managing of Big Data and underlying infrastructure of Hadoop Cluster.
  8. Hands on experience in coding MapReduce/Yarn Programs using Java, Scala and Python for analyzing Big Data.
  9. Exposure to Cloudera development environment and management using Cloudera Manager.
  10. Extensively worked on Spark using Scala on cluster for computational (analytics), installed it on top of Hadoop, and performed advanced analytical applications by making use of Spark with Hive and SQL/Oracle.
  11. Implemented Spark using PYTHON and utilizing Data frames and Spark SQL API for faster processing of data and handled importing data from different data sources into HDFS using Sqoop and performing transformations using Hive, MapReduce and then loading data into HDFS.
  12. Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
  13. Hands on experience in MLlib from Spark which are used for predictive intelligence, customer segmentation and for smooth maintenance in Spark streaming.
  14. Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
  15. Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
  16. Working on creating data pipeline for different events of ingestion, aggregation, and load consumer response data into Hive external tables in HDFS location to serve as feed for tableau dashboards.
  17. Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
  18. In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
  19. Hands on expertise in real time analytics with Apache Spark.
  20. Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
  21. Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
  22. Experience in Microsoft cloud and setting up clusters in Amazon EC2 & S3, including the automation of setting up & extending the clusters in the AWS Amazon cloud.
  23. Extensively worked on Spark using Python on cluster for computational (analytics), installed it on top of Hadoop performed advanced analytical application by making use of Spark with Hive and SQL.
  24. Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
  25. Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
  26. Experienced in writing Ad Hoc queries using Cloudera Impala, also used Impala analytical functions.
  27. Experience in creating Data frames using PySpark and performing operation on the Data frames using Python.
  28. In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
  29. Establishing multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provide the access for pulling the information we need for analysis. 
  30. Generated various kinds of knowledge reports using Power BI based on Business specification. 
  31. Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
  32. Well Experience in projects using JIRA, Testing, Maven and Jenkins build tools.
  33. Experienced in designing, built, and deploying and utilizing almost all the AWS stack (Including EC2, S3,), focusing on high-availability, fault tolerance, and auto-scaling.
  34. Good experience with use-case development, with Software methodologies like Agile and Waterfall.
  35. Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
  36. Good working experience in importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, Mainframes, Oracle and Netezza to HDFS, and performing transformations on it using Hive, Pig and Spark.
  37. Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
  38. Proficient in NoSQL databases including HBase, Cassandra, MongoDB and its integration with Hadoop cluster.
  39. Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
Kapture CX

at Kapture CX

1 video
Arunashree JS
Posted by Arunashree JS
Bengaluru (Bangalore)
3 - 4 yrs
₹8L - ₹15L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Apache Kafka
+3 more

Roles and Responsibilities:


  • Proven experience in Java 8, Spring Boot, Microservices and API
  • Strong experience with Kafka, Kubernetes
  • Strong experience using RDBMS (MySQL) and NoSQL.
  • Experience working in Eclipse or Maven environments
  • Hands-on experience in Unix and Shell scripting
  • Hands-on experience in fine-tuning application response times and performance testing.
  • Experience in Web Services.
  • Strong analysis and problem-solving skills
  • Strong communication skills, both verbal and written
  • Ability to work independently with limited supervision
  • Proven ability to use own initiative to resolve issues
  • Full ownership of projects and tasks
  • Ability and willingness to work under pressure, on multiple concurrent tasks, and to deliver to agreed deadlines
  • Eagerness to learn
  • Strong team-working skills
EMAlpha
Sash Sarangi
Posted by Sash Sarangi
Remote only
2 - 5 yrs
₹6L - ₹12L / yr
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
Javascript
+19 more

We require a full-stack Senior SDE, with a focus on backend microservices / modular monoliths and 3-4+ years of experience in the following:

  • Bachelor’s or Master’s degree in Computer Science or equivalent industry technical skills
  • Mandatory In-depth knowledge and strong experience in Python programming language.
  • Expertise and significant work experience in Python with Fastapi and Async frameworks. 
  • Prior experience building Microservice and/or modular monolith.
  • Should be an expert in Object-Oriented Programming and Design Patterns.
  • Has knowledge and experience with SQLAlchemy/ORM, Celery, Flower, etc.
  • Has knowledge and experience with Kafka / RabbitMQ, Redis.
  • Experience in Postgres/CockroachDB.
  • Experience in MongoDB/DynamoDB and/or Cassandra are added advantages.
  • Strong experience with AWS services (e.g., EC2, ECS, Lambda, Step Functions, S3, SQS, Cognito) and/or equivalent Azure services preferred.
  • Experience working with Docker required.
  • Experience in socket.io added advantage
  • Experience with CI/CD, e.g. GitHub Actions, preferred. 
  • Experience in version control tools Git etc.


This is one of the early positions for scaling up the Technology team. So culture-fit is really important.

  • The role will require serious commitment, and someone with a similar mindset with the team would be a good fit. It's going to be a tremendous growth opportunity. There will be challenging tasks. A lot of these tasks would involve working closely with our AI & Data Science Team.
  • We are looking for someone who has considerable expertise and experience on a low latency highly scaled backend / fullstack engineering stack. The role is ideal for someone who's willing to take such challenges.
  • Coding Expectation – 70-80% of time.
  • Has worked with enterprise solution company / client or, worked with growth/scaled startup earlier.
  • Skills to work effectively in a distributed and remote team environment.
Quadratic Insights
Praveen Kondaveeti
Posted by Praveen Kondaveeti
Hyderabad
7 - 10 yrs
₹15L - ₹24L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+6 more

About Quadratyx:

We are a global, product-centric insight & automation services company. We help the world’s organizations make better & faster decisions using the power of insight & intelligent automation. We build and operationalize their next-gen strategy through Big Data, Artificial Intelligence, Machine Learning, Unstructured Data Processing and Advanced Analytics. Quadratyx can boast more extensive experience in data sciences & analytics than most other companies in India.

We firmly believe in Excellence Everywhere.


Job Description

Purpose of the Job/ Role:

• As a Technical Lead, your work is a combination of hands-on contribution, customer engagement and technical team management. Overall, you’ll design, architect, deploy and maintain big data solutions.


Key Requisites:

• Expertise in Data structures and algorithms.

• Technical management across the full life cycle of big data (Hadoop) projects from requirement gathering and analysis to platform selection, design of the architecture and deployment.

• Scaling of cloud-based infrastructure.

• Collaborating with business consultants, data scientists, engineers and developers to develop data solutions.

• Leading and mentoring a team of data engineers.

• Hands-on experience in test-driven development (TDD).

• Expertise in NoSQL databases like MongoDB (preferred) and Cassandra, and strong knowledge of relational databases.

• Good knowledge of Kafka and Spark Streaming internal architecture.

• Good knowledge of any Application Servers.

• Extensive knowledge of big data platforms like Hadoop, Hortonworks, etc.

• Knowledge of data ingestion and integration on cloud services such as AWS, Google Cloud, Azure, etc.
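One requisite above is good knowledge of Kafka's internal architecture. A core piece of that architecture is keyed partitioning: the producer routes a record to a partition by hashing the record key modulo the partition count, which is what gives Kafka its per-key ordering guarantee. A simplified, hedged sketch of that routing idea (the real Java client uses murmur2, not md5):

```python
import hashlib

def choose_partition(key: bytes, num_partitions: int) -> int:
    """Simplified sketch of Kafka-style keyed partitioning.

    md5 here is only a stand-in for the client's actual hash; the
    point is the invariant: same key -> same partition, so all events
    for one entity land on one partition and are consumed in order.
    """
    digest = hashlib.md5(key).digest()
    bucket = int.from_bytes(digest[:4], "big")
    return bucket % num_partitions

# A key always maps to the same partition across calls.
p1 = choose_partition(b"order-42", 6)
p2 = choose_partition(b"order-42", 6)
```

This is also why changing the partition count of an existing topic breaks key-to-partition stability: the modulus changes, so keys can land on different partitions afterwards.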


Skills/ Competencies Required

Technical Skills

• Strong expertise (9 or more out of 10) in at least one modern programming language, such as Python or Java.

• Clear end-to-end experience in designing, programming, and implementing large software systems.

• Passion and analytical abilities to solve complex problems.


Soft Skills

• Always speaking your mind freely.

• Communicating ideas clearly in speech and writing; integrity to never copy or plagiarize the intellectual property of others.

• Exercising discretion and independent judgment where needed in performing duties; not needing micro-management; maintaining high professional standards.


Academic Qualifications & Experience Required

Required Educational Qualification & Relevant Experience

• Bachelor’s or Master’s in Computer Science, Computer Engineering, or related discipline from a well-known institute.

• Minimum 7 - 10 years of work experience as a developer in an IT organization (preferably with an Analytics / Big Data / Data Science / AI background).

Read more
Zycus

at Zycus

10 recruiters
Nafis Kurne
Posted by Nafis Kurne
Pune, Mumbai, Bangalore
14 - 26 yrs
₹25L - ₹55L / yr
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
Javascript
+18 more

EXPERTISE AND QUALIFICATIONS

  • 14+ years of experience in Software Engineering with at least 6+ years as a Lead Enterprise Architect preferably in a software product company
  • High technical credibility - ability to lead technical brainstorming, take decisions and push for the best solution to a problem
  • Experience in architecting Microservices based E2E Enterprise Applications
  • Experience in UI technologies such as Angular, Node.js or Fullstack technology is desirable
  • Experience with NoSQL technologies (MongoDB, Neo4j etc.)
  • Experience with the ELK stack (Elasticsearch, Logstash, Kibana).
  • Good understanding of Kafka, Redis, ActiveMQ, RabbitMQ, Solr etc.
  • Exposure to SaaS cloud-based platforms.
  • Experience with Docker, Kubernetes etc.
  • Experience in planning, designing, developing and delivering Enterprise Software using Agile Methodology
  • Key Programming Skills: Java, J2EE with cutting edge technologies
  • Hands-on technical leadership with proven ability to recruit and mentor high performance talents including Architects, Technical Leads, Developers
  • Excellent team building, mentoring and coaching skills are a must-have
  • A proven track record of consistently setting and achieving high standards

Five Reasons Why You Should Join Zycus

1. Cloud Product Company: We are a Cloud SaaS Company, and our products are created by using the latest technologies like ML and AI. Our UI is in Angular JS and we are developing our mobile apps using React.

2. A Market Leader: Zycus is recognized by Gartner (world’s leading market research analyst) as a Leader in Procurement Software Suites.

3. Move between Roles: We believe that change leads to growth and therefore we allow our employees to shift careers and move to different roles and functions within the organization

4. Get a Global Exposure: You get to work and deal with our global customers.

5. Create an Impact: Zycus gives you the environment to create an impact on the product and transform your ideas into reality. Even our junior engineers get the opportunity to work on different product features.


About Us

Zycus is a pioneer in Cognitive Procurement software and has been a trusted partner of choice for large global enterprises for two decades. Zycus has been consistently recognized by Gartner, Forrester, and other analysts for its Source to Pay integrated suite. Zycus powers its S2P software with the revolutionary Merlin AI Suite. Merlin AI takes over the tactical tasks and empowers procurement and AP officers to focus on strategic projects; offers data-driven actionable insights for quicker and smarter decisions, and its conversational AI offers a B2C type user-experience to the end-users.


Zycus helps enterprises drive real savings, reduce risks, and boost compliance, and its seamless, intuitive, and easy-to-use user interface ensures high adoption and value across the organization.


Start your #CognitiveProcurement journey with us, as you are #MeantforMore

Read more
Markowate
Gauri Parashar
Posted by Gauri Parashar
Gurugram
9 - 15 yrs
Best in industry
Microservices
Architecture
NodeJS (Node.js)
Apache Kafka
Amazon Web Services (AWS)
+5 more

About Us :

Markowate is a digital product development company building digital products on AI, Blockchain, Mobile, and Web3 technologies. We work with tech startups as their technical partner and help them with their digital transformation.

Role Overview: As a Solution Architect, you will collaborate with stakeholders, including business executives, project managers, and software development teams, to understand the organization's objectives and requirements. You will then design scalable and efficient software solutions that align with these requirements. Your role involves assessing technologies, creating architecture designs, overseeing the development process, and ensuring the successful implementation of the solutions.


"Note: Should have 9+ years of relevant experience. Must have worked with Node.js technology."


Responsibilities:

  • Collaborate with stakeholders to understand and analyze business and technical requirements, and translate them into scalable and feasible solution designs.
  • Develop end-to-end solution architectures, considering factors such as system integration, scalability, performance, security, and reliability.
  • Research and evaluate new technologies, frameworks, and platforms to determine their suitability for the organization's needs.
  • Provide technical guidance and support to development teams throughout the software development life cycle (SDLC) to ensure adherence to the architectural vision and best practices.
  • Effectively communicate complex technical concepts to non-technical stakeholders, such as executives and project managers, and provide recommendations on technology-related decisions.
  • Identify and mitigate technical risks by proactively identifying potential issues and developing contingency plans.
  • Collaborate with quality assurance teams to define and implement testing strategies that validate the solution's functionality, performance, and security.
  • Create and maintain architectural documentation, including diagrams, technical specifications, and design guidelines, to facilitate efficient development and future enhancements.
  • Stay up-to-date with industry trends, best practices, and emerging technologies to drive innovation and continuous improvement within the organization.

Requirements:

  • Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
  • Should have 9+ years of experience and relevant experience as a solution architect.
  • Must have experience in Node.js and a deep understanding of AI/ML and data pipelines.
  • Proven experience as a Solution Architect or a similar role, with a strong background in software development.
  • In-depth knowledge of software architecture patterns, design principles, and development methodologies.
  • Proficiency in various programming languages and frameworks.
  • Strong problem-solving and analytical skills, with the ability to think strategically and propose innovative solutions.
  • Excellent communication and presentation skills, with the ability to convey complex technical concepts to both technical and non-technical stakeholders.
  • Experience in cloud computing platforms, such as AWS, Azure, or Google Cloud, and understanding of their services and deployment models.
  • ​Familiarity with DevOps practices, continuous integration/continuous deployment (CI/CD) pipelines, and containerization technologies like Docker and Kubernetes.


Read more
Mobile Programming LLC

at Mobile Programming LLC

1 video
34 recruiters
Sukhdeep Singh
Posted by Sukhdeep Singh
Bengaluru (Bangalore)
3 - 5 yrs
₹10L - ₹15L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Windows Azure
+3 more
Role: IoT Application Development (Java)

Skill Set:
  • Proficiency in Java 11.
  • Strong knowledge of Spring Boot framework.
  • Experience with Kubernetes.
  • Familiarity with Kafka.
  • Understanding of Azure Cloud services.

Experience: 3 to 5 years | Location: Bangalore | Notice Period: Immediate joiners

Job Description: We are seeking an experienced IoT Application Developer with expertise in Java to join our team in Bangalore. As a Java Developer, you will be responsible for designing, developing, and deploying IoT applications. You should have a solid understanding of Java 11 and the Spring Boot framework. Experience with Kubernetes and Kafka is also required. Familiarity with Azure Cloud services is essential. Your role will involve collaborating with the development team to build scalable and efficient IoT solutions using Java and related technologies.


Read more
Mumbai, Navi Mumbai
8 - 10 yrs
₹20L - ₹30L / yr
DevOps
Kubernetes
Docker
Amazon Web Services (AWS)
Linux/Unix
+14 more

Role : Principal Devops Engineer


About the Client


Our client is a product-based company building a platform using AI and ML technology for transportation and logistics. They also have a presence in the global market.


Responsibilities and Requirements


• Experience in designing and maintaining high volume and scalable micro-services architecture on cloud infrastructure

• Knowledge in Linux/Unix Administration and Python/Shell Scripting

• Experience working with cloud platforms like AWS (EC2, ELB, S3, Auto-scaling, VPC, Lambda), GCP, Azure

• Knowledge in deployment automation, Continuous Integration and Continuous Deployment (Jenkins, Maven, Puppet, Chef, GitLab) and monitoring tools like Zabbix, Cloud Watch Monitoring, Nagios

• Knowledge of Java Virtual Machines, Apache Tomcat, Nginx, Apache Kafka, Microservices architecture, Caching mechanisms

• Experience in enterprise application development, maintenance and operations

• Knowledge of best practices and IT operations in an always-up, always-available service

• Excellent written and oral communication skills, judgment and decision-making skills

Read more
Blue Yonder

at Blue Yonder

5 recruiters
GnanaMalleshwar Karri
Posted by GnanaMalleshwar Karri
Hyderabad, Bengaluru (Bangalore)
10 - 12 yrs
₹10L - ₹30L / yr
NodeJS (Node.js)
React.js
Angular (2+)
AngularJS (1.x)
MongoDB
+8 more

About Merchandise Operations (Merch Ops): Merchandise Operations (Merch Ops) is a merchandise management system. It is positioned as a host system in the retail solutions landscape, with the ability to maintain Master/Foundation data, create and manage Purchase Orders, create and manage Prices & Promotions, perform Replenishment, and provide effective inventory control and financial management. Merch Ops provides business users with consistent, accurate, and timely data across an enterprise by allowing them to get the:


Right Goods in the...

• Right Silhouettes, Sizes and Colors; at the...

• Right Price; at the...

• Right Location; for the...

• Right Consumer; at the...

• Right Time; at the...

• Right Quantity.


About Team:

• Proven, passionate bunch of disruptors providing solutions that solve real-time supply chain problems.

• Well mixed experienced team with young members and experienced in product, domain, and Industry knowledge.

• Gained Expertise in designing and deploying massively scalable cloud native SaaS products

• The team currently comprises of associates across the globe and is expected to grow rapidly.


Our current technical environment:

• Software: React JS, Node JS, Oracle PL/SQL, GIT, REST API, JavaScript.

• Application Architecture: Scalable three tier web application.

• Cloud Architecture: Private cloud, MS Azure (ARM templates, AKS, HDInsight, Application Gateway, Virtual Networks, Event Hub, Azure AD)

• Frameworks/Others: Apache Tomcat, RDBMS, Jenkins, Nginx, TypeORM, Express.


What you'll be doing:

• As a Staff Engineer you will be responsible for the design of the features in the product roadmap

• Creating and encouraging good software development practices engineering-wide, driving strategic technical improvements, and mentoring other engineers.

• You will write code as we expect our technical leadership to be in the trenches alongside junior engineers, understanding root causes and leading by example

• You will mentor engineers

• You will own relationships with other engineering teams and collaborate with other functions within Blue Yonder

• Drive architecture and designs to become simpler, more robust, and more efficient.


• Lead designs discussion and come up with robust and more efficient designs to achieve features in product roadmap

• Take complete responsibility of the features developed right from coding till deployments

• Introduce new technology and tools for the betterment of the product

• Guides fellow engineers to look beyond the surface and fix the root causes rather than symptoms.


What we are looking for:

• Bachelor’s degree (B.E/B.Tech/M.Tech in Computer Science or a related specialization) and a minimum of 7 to 10 years of experience in software development, having been an Architect within the last 1-2 years at minimum.

• Strong programming experience and background in Node JS and React JS.

• Hands-on development skills along with architecture/design experience.

• Hands-on experience in designing, building, deploying and maintaining enterprise cloud solutions.

• Demonstrable experience, thorough knowledge, and interests in Cloud native architecture, Distributed micro-services, Multi-tenant SaaS solution and Cloud Scalability, performance, and High availability

• Experience with API management platforms & providing / consuming RESTful APIs

• Experience with varied tools such as REST, Hibernate, RDBMS, Docker, Kubernetes, Kafka, React.

• Hands-on development experience on Oracle PL/SQL.

• Experience with DevOps and infrastructure automation.

Read more
Recro

at Recro

1 video
32 recruiters
Amrita Singh
Posted by Amrita Singh
Bengaluru (Bangalore)
2 - 6 yrs
₹5L - ₹20L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Microservices
+5 more
  • 2.5+ years of experience in development in Java technology.
  • Strong Java basics
  • Spring Boot or Spring MVC
  • Hands-on experience with relational databases (SQL queries or Hibernate) + Mongo (JSON parsing)
  • Proficient in REST API development
  • Messaging queues (RabbitMQ or Kafka)
  • Microservices
  • Any Caching Mechanism
  • Good at problem solving
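The list above asks for "any caching mechanism"; the classic interview-level instance is an LRU cache. A minimal, hedged sketch of the eviction policy in plain Python (a toy model, not a production cache such as Redis or Caffeine):

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: evicts the least-recently-used entry
    once capacity is exceeded. Both get and put count as 'use'."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)         # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")       # touch "a", so "b" becomes the eviction candidate
cache.put("c", 3)    # capacity exceeded: "b" is evicted
```

The same least-recently-used idea underlies the eviction policies of most in-memory caches, just implemented with hash maps plus doubly linked lists for O(1) updates.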


Good to Have Skills:


  • 3+ years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem solving skills.
  • Ability to work in a fast paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding of AI/ML algorithms is a plus.
Read more
BlueYonder
Bengaluru (Bangalore), Hyderabad
10 - 14 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Gradle
+13 more

·      Core responsibilities include analyzing business requirements and designs for accuracy and completeness, and developing and maintaining the relevant product.

·      BlueYonder is seeking a Senior/Principal Architect in the Data Services department (under the Luminate Platform) to act as one of the key technology leaders to build and manage BlueYonder’s technology assets in the Data Platform and Services.

·      This individual will act as a trusted technical advisor and strategic thought leader to the Data Services department. The successful candidate will have the opportunity to lead, participate, guide, and mentor other people in the team on architecture and design in a hands-on manner. You are responsible for technical direction of Data Platform. This position reports to the Global Head, Data Services and will be based in Bangalore, India.

·      Core responsibilities to include Architecting and designing (along with counterparts and distinguished Architects) a ground up cloud native (we use Azure) SaaS product in Order management and micro-fulfillment

·      The team currently comprises of 60+ global associates across US, India (COE) and UK and is expected to grow rapidly. The incumbent will need to have leadership qualities to also mentor junior and mid-level software associates in our team. This person will lead the Data platform architecture – Streaming, Bulk with Snowflake/Elastic Search/other tools

Our current technical environment:

·      Software: Java, Spring Boot, Gradle, GIT, Hibernate, REST API, OAuth, Snowflake

·      Application Architecture: Scalable, resilient, event-driven, secure multi-tenant microservices architecture

·      Cloud Architecture: MS Azure (ARM templates, AKS, HDInsight, Application Gateway, Virtual Networks, Event Hub, Azure AD)

·      Frameworks/Others: Kubernetes, Kafka, Elasticsearch, Spark, NoSQL, RDBMS, Spring Boot, Gradle, GIT, Ignite

Read more
Concinnity Media Technologies

at Concinnity Media Technologies

2 candid answers
Anirban Biswas
Posted by Anirban Biswas
Pune
7 - 12 yrs
₹12L - ₹21L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Amazon Web Services (AWS)
+3 more

Job Code: CSAW0004


Candidate Experience:

Having 7+ years of relevant experience


Skills and Qualifications.

● Experience in Design, Development of Java in IOT based Projects

● Exposure to AWS (or any other Cloud Platform).

● Assisting the software design team with application development and integration.

● Ability to solve complex software system issues.

● Experience in handling database queries and databases.

● Programming languages and frameworks - Java, JavaBeans, Spring Boot, Spring MVC.

● Experience in messaging or streaming frameworks such as Apache Kafka or ActiveMQ.

● Experience with Docker and Kubernetes

● Passionate and Enthusiastic about work.

● Technical team leadership experience.

● Proactive attitude.

● To be a bridge between the team and the counterparts with regards to technical aspects of the project.

● Good communication skills.

● Exposure to Microservices


Education:

Bachelor of Engineering/Technology - BE/BTech

Read more
Mobile Programming LLC

at Mobile Programming LLC

1 video
34 recruiters
Sukhdeep Singh
Posted by Sukhdeep Singh
Chennai
4 - 7 yrs
₹13L - ₹15L / yr
Data Analytics
Data Visualization
PowerBI
Tableau
Qlikview
+10 more

Title: Platform Engineer Location: Chennai Work Mode: Hybrid (Remote and Chennai Office) Experience: 4+ years Budget: 16 - 18 LPA

Responsibilities:

  • Parse data using Python, create dashboards in Tableau.
  • Utilize Jenkins for Airflow pipeline creation and CI/CD maintenance.
  • Migrate Datastage jobs to Snowflake, optimize performance.
  • Work with HDFS, Hive, Kafka, and basic Spark.
  • Develop Python scripts for data parsing, quality checks, and visualization.
  • Conduct unit testing and web application testing.
  • Implement Apache Airflow and handle production migration.
  • Apply data warehousing techniques for data cleansing and dimension modeling.
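The responsibilities above center on orchestrating pipelines with Apache Airflow. At its core an Airflow DAG is a dependency graph whose tasks run in topological order; a hedged, stdlib-only sketch of that scheduling idea (the task names below are hypothetical, not from any actual pipeline):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: extract -> parse -> quality_check -> load,
# with a dashboard refresh depending on the load step. In Airflow
# the same shape would be written with operators and `>>` chaining.
dag = {
    "parse": {"extract"},
    "quality_check": {"parse"},
    "load": {"quality_check"},
    "refresh_dashboard": {"load"},
}

# static_order() yields a valid execution order respecting all edges.
order = list(TopologicalSorter(dag).static_order())
```

A real scheduler does more (retries, backfills, parallel execution of independent tasks), but the ordering constraint it enforces is exactly this topological sort.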

Requirements:

  • 4+ years of experience as a Platform Engineer.
  • Strong Python skills, knowledge of Tableau.
  • Experience with Jenkins, Snowflake, HDFS, Hive, and Kafka.
  • Proficient in Unix Shell Scripting and SQL.
  • Familiarity with ETL tools like DataStage and DMExpress.
  • Understanding of Apache Airflow.
  • Strong problem-solving and communication skills.

Note: Only candidates willing to work in Chennai and available for immediate joining will be considered. Budget for this position is 16 - 18 LPA.

Read more
Series B product based company
Agency job
via Qrata by Blessy Fernandes
Mumbai, Navi Mumbai
1 - 3 yrs
₹5L - ₹8L / yr
Linux/Unix
Microservices
Python
Amazon Web Services (AWS)
Amazon EC2
+12 more

Roles & Responsibilities:

  • Bachelor’s degree in Computer Science, Information Technology or a related field


  • Experience in designing and maintaining high volume and scalable micro-services architecture on cloud infrastructure


  • Knowledge in Linux/Unix Administration and Python/Shell Scripting


  • Experience working with cloud platforms like AWS (EC2, ELB, S3, Auto-scaling, VPC, Lambda), GCP, Azure


  • Knowledge in deployment automation, Continuous Integration and Continuous Deployment (Jenkins, Maven, Puppet, Chef, GitLab) and monitoring tools like Zabbix, CloudWatch Monitoring, Nagios


  • Knowledge of Java Virtual Machines, Apache Tomcat, Nginx, Apache Kafka, Microservices architecture, Caching mechanisms


  • Experience in enterprise application development, maintenance and operations


  • Knowledge of best practices and IT operations in an always-up, always-available service


  • Excellent written and oral communication skills, judgment and decision-making skills
Read more
Conviva

at Conviva

1 recruiter
Adarsh Sikarwar
Posted by Adarsh Sikarwar
Bengaluru (Bangalore)
4 - 8 yrs
₹15L - ₹40L / yr
Apache Kafka
Redis
Systems design
Data Structures
Algorithms
+5 more

Have you streamed a program on Disney+, watched your favorite binge-worthy series on Peacock or cheered your favorite team on during the World Cup from one of the 20 top streaming platforms around the globe? If the answer is yes, you’ve already benefitted from Conviva technology, helping the world’s leading streaming publishers deliver exceptional streaming experiences and grow their businesses. 


Conviva is the only global streaming analytics platform for big data that collects, standardizes, and puts trillions of cross-screen, streaming data points in context, in real time. The Conviva platform provides comprehensive, continuous, census-level measurement through real-time, server side sessionization at unprecedented scale. If this sounds important, it is! We measure a global footprint of more than 500 million unique viewers in 180 countries watching 220 billion streams per year across 3 billion applications streaming on devices. With Conviva, customers get a unique level of actionability and scale from continuous streaming measurement insights and benchmarking across every stream, every screen, every second.
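The server-side sessionization described above boils down to grouping a viewer's events into sessions separated by an inactivity gap. A simplified, hedged sketch of that idea (the 30-minute timeout is an illustrative choice, not Conviva's actual parameter, and real systems do this incrementally over unbounded streams rather than over a sorted list):

```python
def sessionize(timestamps, gap=1800):
    """Group event timestamps (seconds) into sessions: a new session
    starts whenever the silence since the previous event exceeds
    `gap` seconds. A toy model of server-side sessionization."""
    sessions = []
    for ts in sorted(timestamps):
        if sessions and ts - sessions[-1][-1] <= gap:
            sessions[-1].append(ts)   # continue the current session
        else:
            sessions.append([ts])     # inactivity gap: start a new one
    return sessions

# Events at 0s, 100s, 200s, then a long pause, then 5000s, 5100s.
events = [0, 100, 200, 5000, 5100]
sessions = sessionize(events)
```

Doing this server-side, per viewer, in real time is what turns a raw firehose of heartbeats into session-level metrics like duration, rebuffering ratio, and exits before video start.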

 

What you get to do in this role:

Work on extremely high-scale Rust web services or backend systems.

Design and develop solutions for highly scalable web and backend systems.

Proactively identify and solve performance issues.

Maintain a high bar on code quality and unit testing.

 

What you bring to the role:

5+ years of hands-on software development experience.

At least 2+ years of Rust development experience.

Knowledge of Cargo crates for Kafka, Redis, etc.

Strong CS fundamentals, including system design, data structures and algorithms.

Expertise in backend and web services development.

Good analytical and troubleshooting skills.

 

What will help you stand out:

Experience working with large scale web services and applications.

Exposure to Golang, Scala or Java

Exposure to Big data systems like Kafka, Spark, Hadoop etc.

 

Underpinning the Conviva platform is a rich history of innovation. More than 60 patents represent award-winning technologies and standards, including first-of-its-kind innovations like time-state analytics and AI-automated data modeling that surface actionable insights. By understanding real-world human experiences and having the ability to act within seconds of observation, our customers can solve business-critical issues and focus on growing their business ahead of the competition. Examples of the brands Conviva has helped fuel streaming growth for include: DAZN, Disney+, HBO, Hulu, NBCUniversal, Paramount+, Peacock, Sky, Sling TV, Univision and Warner Bros Discovery.  


Privately held, Conviva is headquartered in Silicon Valley, California with offices and people around the globe. For more information, visit us at www.conviva.com. Join us to help extend our leadership position in big data streaming analytics to new audiences and markets! 

Read more
Conviva

at Conviva

1 recruiter
Anusha Bondada
Posted by Anusha Bondada
Bengaluru (Bangalore)
3 - 6 yrs
₹20L - ₹40L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+9 more

As Conviva is expanding, we are building products providing deep insights into end-user experience for our customers.

 

Platform and TLB Team

The vision for the TLB team is to build data processing software that works on terabytes of streaming data in real-time. Engineer the next-gen Spark-like system for in-memory computation of large time-series datasets – both Spark-like backend infra and library-based programming model. Build a horizontally and vertically scalable system that analyses trillions of events per day within sub-second latencies. Utilize the latest and greatest big data technologies to build solutions for use cases across multiple verticals. Lead technology innovation and advancement that will have a big business impact for years to come. Be part of a worldwide team building software using the latest technologies and the best of software development tools and processes.

 

What You’ll Do

This is an individual contributor position. Expectations will be on the below lines:

  • Design, build and maintain the stream processing, and time-series analysis system which is at the heart of Conviva’s products
  • Responsible for the architecture of the Conviva platform
  • Build features, enhancements, new services, and bug fixing in Scala and Java on a Jenkins-based pipeline to be deployed as Docker containers on Kubernetes
  • Own the entire lifecycle of your microservice including early specs, design, technology choice, development, unit-testing, integration-testing, documentation, deployment, troubleshooting, enhancements, etc.
  • Lead a team to develop a feature or parts of a product
  • Adhere to the Agile model of software development to plan, estimate, and ship per business priority

 

What you need to succeed

  • 5+ years of work experience in software development of data processing products.
  • Engineering degree in software or equivalent from a premier institute.
  • Excellent knowledge of fundamentals of Computer Science like algorithms and data structures. Hands-on with functional programming and know-how of its concepts
  • Excellent programming and debugging skills on the JVM. Proficient in writing code in Scala/Java/Rust/Haskell/Erlang that is reliable, maintainable, secure, and performant
  • Experience with big data technologies like Spark, Flink, Kafka, Druid, HDFS, etc.
  • Deep understanding of distributed systems concepts and scalability challenges including multi-threading, concurrency, sharding, partitioning, etc.
  • Experience/knowledge of Akka/Lagom framework and/or stream processing technologies like RxJava or Project Reactor will be a big plus. Knowledge of design patterns like event-streaming, CQRS and DDD to build large microservice architectures will be a big plus
  • Excellent communication skills. Willingness to work under pressure. Hunger to learn and succeed. Comfortable with ambiguity. Comfortable with complexity
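The stream-processing and time-series analysis work described above typically rests on windowed aggregation. As a minimal, hedged sketch of a tumbling-window count in plain Python (the 60-second window is an arbitrary example; engines like Spark or Flink do this incrementally, distributed, and with watermarks for late data):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size=60):
    """Count events per fixed (tumbling) window: an event at time t
    falls into the window starting at (t // window_size) * window_size.
    A toy stand-in for what streaming engines do at scale."""
    counts = defaultdict(int)
    for t in events:
        window_start = (t // window_size) * window_size
        counts[window_start] += 1
    return dict(counts)

# Events at 3s, 10s, 59s (first window), 61s, 130s (later windows).
counts = tumbling_window_counts([3, 10, 59, 61, 130])
```

Tumbling windows partition time into non-overlapping buckets; sliding and session windows are the other two common variants, and the hard scalability problems (partitioning keys across workers, handling out-of-order events) sit on top of this basic shape.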

 

Underpinning the Conviva platform is a rich history of innovation. More than 60 patents represent award-winning technologies and standards, including first-of-its-kind innovations like time-state analytics and AI-automated data modeling that surface actionable insights. By understanding real-world human experiences and having the ability to act within seconds of observation, our customers can solve business-critical issues and focus on growing their business ahead of the competition. Examples of the brands Conviva has helped fuel streaming growth for include: DAZN, Disney+, HBO, Hulu, NBCUniversal, Paramount+, Peacock, Sky, Sling TV, Univision and Warner Bros Discovery.  

Privately held, Conviva is headquartered in Silicon Valley, California with offices and people around the globe. For more information, visit us at www.conviva.com. Join us to help extend our leadership position in big data streaming analytics to new audiences and markets! 


Read more
Recro

at Recro

1 video
32 recruiters
Mohit Arora
Posted by Mohit Arora
Bengaluru (Bangalore), Delhi, Gurugram, Noida
3 - 8 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Spring MVC
+5 more

Required Skills:


  • 3+ years of experience in development in Java technology.
  • Strong Java basics
  • Spring Boot or Spring MVC
  • Hands-on experience with relational databases (SQL queries or Hibernate) + Mongo (JSON parsing)
  • Proficient in REST API development
  • Messaging queues (RabbitMQ or Kafka)
  • Microservices
  • Any Caching Mechanism
  • Good at problem solving


Good to Have Skills:


  • 4+ years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem solving skills.
  • Ability to work in a fast paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding of AI/ML algorithms is a plus.


Read more
Conviva

at Conviva

1 recruiter
Anusha Bondada
Posted by Anusha Bondada
Bengaluru (Bangalore)
3 - 15 yrs
₹25L - ₹70L / yr
Scala
Akka
Algorithms
Data Structures
Functional programming
+6 more

Have you streamed a program on Disney+, watched your favorite binge-worthy series on Peacock or cheered your favorite team on during the World Cup from one of the 20 top streaming platforms around the globe? If the answer is yes, you’ve already benefitted from Conviva technology, helping the world’s leading streaming publishers deliver exceptional streaming experiences and grow their businesses. 

 

Conviva is the only global streaming analytics platform for big data that collects, standardizes, and puts trillions of cross-screen, streaming data points in context, in real time. The Conviva platform provides comprehensive, continuous, census-level measurement through real-time, server side sessionization at unprecedented scale. If this sounds important, it is! We measure a global footprint of more than 500 million unique viewers in 180 countries watching 220 billion streams per year across 3 billion applications streaming on devices. With Conviva, customers get a unique level of actionability and scale from continuous streaming measurement insights and benchmarking across every stream, every screen, every second.

 

As Conviva is expanding, we are building products providing deep insights into end user experience for our customers.

 

Platform and TLB Team

The vision for the TLB team is to build data processing software that works on terabytes of streaming data in real time. Engineer the next-gen Spark-like system for in-memory computation of large time-series datasets – both Spark-like backend infra and a library-based programming model. Build a horizontally and vertically scalable system that analyses trillions of events per day within sub-second latencies. Utilize the latest and greatest big data technologies to build solutions for use cases across multiple verticals. Lead technology innovation and advancement that will have a big business impact for years to come. Be part of a worldwide team building software using the latest technologies and the best of software development tools and processes.

 

What You’ll Do

This is an individual contributor position. Expectations are along the following lines:

  • Design, build and maintain the stream processing, and time-series analysis system which is at the heart of Conviva's products
  • Responsible for the architecture of the Conviva platform
  • Build features, enhancements, new services, and bug fixing in Scala and Java on a Jenkins-based pipeline to be deployed as Docker containers on Kubernetes
  • Own the entire lifecycle of your microservice including early specs, design, technology choice, development, unit-testing, integration-testing, documentation, deployment, troubleshooting, enhancements etc.
  • Lead a team to develop a feature or parts of the product
  • Adhere to the Agile model of software development to plan, estimate, and ship per business priority

 

What you need to succeed

  • 9+ years of work experience in software development of data processing products.
  • Engineering degree in software or equivalent from a premier institute.
  • Excellent knowledge of Computer Science fundamentals like algorithms and data structures. Hands-on experience with functional programming and its concepts
  • Excellent programming and debugging skills on the JVM. Proficient in writing code in Scala/Java/Rust/Haskell/Erlang that is reliable, maintainable, secure, and performant
  • Experience with big data technologies like Spark, Flink, Kafka, Druid, HDFS, etc.
  • Deep understanding of distributed systems concepts and scalability challenges including multi-threading, concurrency, sharding, partitioning, etc.
  • Experience/knowledge of Akka/Lagom framework and/or stream processing technologies like RxJava or Project Reactor will be a big plus. Knowledge of design patterns like event-streaming, CQRS and DDD to build large microservice architectures will be a big plus
  • Excellent communication skills. Willingness to work under pressure. Hunger to learn and succeed. Comfortable with ambiguity. Comfortable with complexity

 

Underpinning the Conviva platform is a rich history of innovation. More than 60 patents represent award-winning technologies and standards, including first-of-its-kind innovations like time-state analytics and AI-automated data modeling, that surface actionable insights. By understanding real-world human experiences and having the ability to act within seconds of observation, our customers can solve business-critical issues and focus on growing their businesses ahead of the competition. Examples of the brands Conviva has helped fuel streaming growth for include DAZN, Disney+, HBO, Hulu, NBCUniversal, Paramount+, Peacock, Sky, Sling TV, Univision, and Warner Bros Discovery.

Privately held, Conviva is headquartered in Silicon Valley, California with offices and people around the globe. For more information, visit us at www.conviva.com. Join us to help extend our leadership position in big data streaming analytics to new audiences and markets! 



Read more
Accolite Digital
Nitesh Parab
Posted by Nitesh Parab
Bengaluru (Bangalore), Hyderabad, Gurugram, Delhi, Noida, Ghaziabad, Faridabad
4 - 8 yrs
₹5L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
SSIS
SQL Server Integration Services (SSIS)
+10 more

Job Title: Data Engineer

Job Summary: As a Data Engineer, you will be responsible for designing, building, and maintaining the infrastructure and tools necessary for data collection, storage, processing, and analysis. You will work closely with data scientists and analysts to ensure that data is available, accessible, and in a format that can be easily consumed for business insights.

Responsibilities:

  • Design, build, and maintain data pipelines to collect, store, and process data from various sources.
  • Create and manage data warehousing and data lake solutions.
  • Develop and maintain data processing and data integration tools.
  • Collaborate with data scientists and analysts to design and implement data models and algorithms for data analysis.
  • Optimize and scale existing data infrastructure to ensure it meets the needs of the business.
  • Ensure data quality and integrity across all data sources.
  • Develop and implement best practices for data governance, security, and privacy.
  • Monitor data pipeline performance and errors, and troubleshoot issues as needed.
  • Stay up-to-date with emerging data technologies and best practices.

Requirements:

Bachelor's degree in Computer Science, Information Systems, or a related field.

Experience with ETL tools like Matillion, SSIS, or Informatica

Experience with SQL and relational databases such as SQL Server, MySQL, PostgreSQL, or Oracle.

Experience in writing complex SQL queries

Strong programming skills in languages such as Python, Java, or Scala.

Experience with data modeling, data warehousing, and data integration.

Strong problem-solving skills and ability to work independently.

Excellent communication and collaboration skills.

Familiarity with big data technologies such as Hadoop, Spark, or Kafka.

Familiarity with data warehouse/Data lake technologies like Snowflake or Databricks

Familiarity with cloud computing platforms such as AWS, Azure, or GCP.

Familiarity with Reporting tools

Teamwork and growth contribution

  • Helping the team conduct interviews and identify the right candidates
  • Adhering to timelines
  • Timely status communication and upfront communication of any risks
  • Teaching, training, and sharing knowledge with peers
  • Good communication skills
  • Proven ability to take initiative and be innovative
  • Analytical mind with a problem-solving aptitude

Good to have :

Master's degree in Computer Science, Information Systems, or a related field.

Experience with NoSQL databases such as MongoDB or Cassandra.

Familiarity with data visualization and business intelligence tools such as Tableau or Power BI.

Knowledge of machine learning and statistical modeling techniques.

If you are passionate about data and want to work with a dynamic team of data scientists and analysts, we encourage you to apply for this position.

Read more
Recro

at Recro

1 video
32 recruiters
Mohit Arora
Posted by Mohit Arora
Bengaluru (Bangalore), Delhi, Gurugram, Noida
3 - 7 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Spring MVC
+5 more
  • 3+ years of experience in development in Java technology
  • Strong Java basics
  • Spring Boot or Spring MVC
  • Hands-on experience with relational databases (SQL queries or Hibernate) + Mongo (JSON parsing)
  • Proficient in REST API development
  • Messaging queues (RabbitMQ or Kafka)
  • Microservices
  • Any caching mechanism
  • Good at problem solving


Good to Have Skills:


  • 4+ years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem solving skills.
  • Ability to work in a fast paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding of AI/ML algorithms is a plus.


Read more
Recro

at Recro

1 video
32 recruiters
Mohit Arora
Posted by Mohit Arora
Bengaluru (Bangalore), Delhi, Gurugram, Noida
3 - 7 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Spring MVC
+4 more

Required Education:


B.Tech./ BE - Computer, IT, Electronics only

Required Skills:


  • 2+ years of experience in development in Java technology
  • Strong Java basics
  • Spring Boot or Spring MVC
  • Hands-on experience with relational databases (SQL queries or Hibernate) + Mongo (JSON parsing)
  • Proficient in REST API development
  • Messaging queues (RabbitMQ or Kafka)
  • Microservices
  • Any caching mechanism
  • Good at problem solving


Good to Have Skills:


  • 4+ years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem solving skills.
  • Ability to work in a fast paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding of AI/ML algorithms is a plus


Read more
Remote, Hyderabad, Bengaluru (Bangalore)
8 - 15 yrs
₹20L - ₹55L / yr
Python
Django
Flask
Data Analytics
Data Science
+11 more

CTC Budget: 35-55LPA

Location: Hyderabad (Remote after 3 months WFO)


Company Overview:


An 8-year-old IT Services and consulting company based in Hyderabad providing services in maximizing product value while delivering rapid incremental innovation, possessing extensive SaaS company M&A experience including 20+ closed transactions on both the buy and sell sides. They have over 100 employees and are looking to grow the team.


  • 6+ years of experience as a Python developer.
  • Experience in web development using Python and the Django framework.
  • Experience in data analysis and data science using Pandas, NumPy, and scikit-learn (good to have)
  • Experience in developing user interfaces using HTML, JavaScript, CSS.
  • Experience in server-side templating languages including Jinja2 and Mako
  • Knowledge of Kafka and RabbitMQ (good to have)
  • Experience with Docker, Git, and AWS
  • Ability to integrate multiple data sources into a single system.
  • Ability to collaborate on projects and work independently when required.
  • Databases (MySQL, PostgreSQL, SQL)


Selection Process: 2-3 Interview rounds (Tech, VP, Client)

Read more
Samsan Technologies

at Samsan Technologies

1 recruiter
HR Varsha
Posted by HR Varsha
Pune
3 - 7 yrs
₹1L - ₹10L / yr
NodeJS (Node.js)
React.js
Angular (2+)
AngularJS (1.x)
MongoDB
+11 more

Job Responsibilities

Responsibilities for this position include, but are not limited to, the following:

  • 3-6 years of development experience
  • Experience working with Azure cloud-hosted web applications and technologies
  • Design and develop back-end microservices and REST APIs for connected devices, web applications, and mobile applications
  • Stay up to date on relevant technologies, plug into user groups, and understand trends and opportunities that ensure we are using the best techniques and tools

  • Meeting with the software development team to define the scope and scale of software projects.
  • Designing software system architecture.
  • Completing data structures and design patterns.
  • Designing and implementing scalable web services, applications, and APIs.
  • Developing and maintaining internal software tools.
  • Writing low-level and high-level code.
  • Troubleshooting and bug fixing.
  • Identifying bottlenecks and improving software efficiency.
  • Collaborating with the design team on developing micro-services.
  • Writing technical documents.
  • Be an active professional in continuous learning resulting in enhancement in organizational objectives.
  • Provide technical support to all internal teams and customers as it relates to the product.

Requirements:

  • Bachelor’s degree in computer engineering or computer science.
  • Previous experience as a full stack engineer and IoT Products.
  • Advanced knowledge of front-end languages including HTML5, CSS, JavaScript, Angular, React.
  • Proficient in back-end languages including Node.js, with basic knowledge of Java and C#.
  • Experience with cloud computing APIs and Cloud Providers such as Azure or AWS.

  • Working knowledge of database systems (Cassandra, CosmosDB, Redis, PostgreSQL)
  • Messaging systems (RabbitMQ, MQTT, Kafka)
  • Cloud-based distributed application scaling & data processing in the cloud
  • Agile/Scrum methodology

  • Advanced troubleshooting skills.
  • Familiarity with JavaScript frameworks.
  • Good communication skills.

High-level project management skills.

Read more
Pune
7 - 11 yrs
₹25L - ₹33L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Microservices
+12 more

Hi,

We are hiring for the position of Java Tech Lead. Please find below the details for the same.


A passionate developer who has a strong working knowledge of OOPS and functional programming principles. Standard definitions and abbreviations don't entice us that much.

Key skills:

• Strong Java and J2EE background with 5-7 years of experience.

• Strong working experience in Multi-Threading, Exception Management and the Use of Collections.

• Sound knowledge of working with application aspects i.e., Caching, Asynchronous APIs, Logging etc.

• Experience with web application frameworks like Spring Boot or Dropwizard.

• Unit testing is an everyday affair and hence demands very good unit testing skills using tools like JUnit & TestNG.

• Understanding of relational databases, RESTful services, and build tools like Maven & Gradle

• Knows what and when to mock and has used frameworks like Mockito/PowerMock.

• Understanding of message queues such as ActiveMQ, Kafka, and RabbitMQ.

•  Version Control is treated as important as programming skills. Fluent with version control tools like Git and Bitbucket.

• Exposure to Agile/Scrum, TDD not in theory but in practice.

•  Experience with Continuous Integration, Continuous Deployment, Static Code Analysis, Jenkins and SonarQube.

•  Willingness to take ownership of the technical solution and ensure technical expectations of deliverables are met.

• Strong communication skills along with the ability to articulate technical designs and concepts.

• Exposure to cloud and containerization would be a plus.

• Hands-on experience in application development in an enterprise setup.

• Have a good understanding of Distributed Application Architecture.

Read more
Kapture CX

at Kapture CX

1 video
Deepika Dhanraj
Posted by Deepika Dhanraj
Bengaluru (Bangalore)
2 - 7 yrs
₹8L - ₹20L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Apache Kafka
+7 more

Kapture CRM is an enterprise-focused Service automation SaaS platform. We help 500+ enterprises in 14 countries to manage their customer service in a more intelligent, contextual way.


Roles & Responsibilities :


    * Proven experience in Java8, Spring Boot, Microservices/API

    * Strong experience with Kafka, Kubernetes

    * Strong experience in using RDBMS (MySQL) and NoSQL.

    * Experience in working in Eclipse / Maven environments.

    * Hands-on experience in Unix / Shell scripting.

    * Hands-on experience in fine-tuning application response/performance testing.

    * Experience in Web Services.

    * Strong analysis & problem-solving skills

    * Strong communication skills - both verbal and written

    * Ability to work independently with limited supervision

    * Proven ability to use own initiative to resolve issues

    * Full ownership of projects/tasks

    * Ability and willingness to work under pressure, on multiple concurrent tasks, and to deliver to agreed deadlines

    * Eagerness to learn

    * Strong team-working skills

Read more
Hyderabad, Bengaluru (Bangalore)
8 - 12 yrs
₹30L - ₹50L / yr
PHP
Javascript
React.js
Angular (2+)
AngularJS (1.x)
+17 more

CTC Budget: 35-50LPA

Location: Hyderabad/Bangalore

Experience: 8+ Years


Company Overview:


An 8-year-old IT Services and consulting company based in Hyderabad providing services in maximizing product value while delivering rapid incremental innovation, possessing extensive SaaS company M&A experience including 20+ closed transactions on both the buy and sell sides. They have over 100 employees and are looking to grow the team.


Work with, learn from, and contribute to a diverse, collaborative development team

● Use plenty of PHP, Go, JavaScript, MySQL, PostgreSQL, ElasticSearch, Redshift, AWS Services and other technologies

● Build efficient and reusable abstractions and systems

● Create robust cloud-based systems used by students globally at scale

● Experiment with cutting-edge technologies and contribute to the company’s product roadmap

● Deliver data at scale to bring value to clients

Requirements


You will need:

● Experience working with a server-side language in a full-stack environment

● Experience with various database technologies (relational, NoSQL, document-oriented, etc.) and query concepts in high-performance environments

● Experience in one of these areas: React, Backbone

● Understanding of ETL concepts and processes

● Great knowledge of design patterns and back-end architecture best practices

● Sound knowledge of front-end basics like JavaScript, HTML, CSS

● Experience with TDD, automated testing

● 12+ years’ experience as a developer

● Experience with Git or Mercurial

● Fluent written & spoken English

It would be great if you have:

● B.Sc or M.Sc degree in Software Engineering, Computer Science or similar

● Experience and/or interest in API Design

● Experience with Symfony and/or Doctrine

● Experience with Go and Microservices

● Experience with message queues e.g. SQS, Kafka, Kinesis, RabbitMQ

● Experience working with a modern Big Data stack

● Contributed to open source projects

● Experience working in an Agile environment

Read more
Recro

at Recro

1 video
32 recruiters
SD S
Posted by SD S
Bengaluru (Bangalore)
3 - 7 yrs
₹7L - ₹15L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Spring
+19 more

Job Summary:


We are looking for a skilled and experienced Java Developer to join our team. As a Java Developer, you will be responsible for developing and maintaining our applications using Java, Spring framework, and other related technologies. The ideal candidate should have a strong understanding of object-oriented programming principles, as well as experience with a variety of technologies such as SQL, NoSQL, and cloud computing.


Responsibilities:


  • Design, develop, and maintain our applications using Java, Spring framework, and other related technologies
  • Write clean, efficient, and optimized code for applications
  • Collaborate with cross-functional teams to understand user requirements and deliver high-quality solutions
  • Develop and maintain backend systems using Spring framework
  • Work with databases, including SQL and NoSQL
  • Ensure code quality and maintain documentation
  • Troubleshoot and debug applications
  • Stay updated with emerging trends and technologies in Java development
  • Work with other teams to deploy and maintain applications


Requirements:


  • 3-7 years of experience in Java development
  • Strong understanding of object-oriented programming principles
  • Experience with Java, Spring framework, and related technologies
  • Familiarity with databases, including SQL and NoSQL
  • Knowledge of cloud computing is a plus
  • Excellent problem-solving and debugging skills
  • Strong communication and collaboration skills
  • Ability to work independently and as part of a team
  • Bachelor's degree in computer science or a related field


Key Skills:


  • Strong proficiency in Java programming language
  • Experience with Spring framework, including Spring Boot and Spring MVC
  • Familiarity with cloud platforms such as AWS, GCP, and Azure
  • Experience building RESTful APIs
  • Knowledge of microservices architecture
  • Familiarity with SQL and relational databases such as MySQL and Postgres
  • Familiarity with NoSQL databases such as MongoDB and Redis
  • Experience with messaging systems such as Kafka and RabbitMQ
  • Experience with containerization tools such as Docker and Kubernetes
  • Understanding of software development principles and experience with SDLC methodologies
  • Experience with Git version control and build tools such as Maven and Gradle
  • Familiarity with front-end technologies such as Angular and React is a plus
  • Strong problem-solving and analytical skills
  • Good communication and interpersonal skills
  • Ability to work independently and take ownership of tasks
  • Experience with test-driven development and unit testing frameworks such as JUnit and Mockito
  • Familiarity with CI/CD tools such as Jenkins is a plus
  • Familiarity with caching technologies such as Redis is a plus
  • Working knowledge of design patterns and software architecture principles is a plus.


Read more
Quantela

at Quantela

3 recruiters
Sai N
Posted by Sai N
Remote only
4 - 10 yrs
₹20L - ₹35L / yr
NodeJS (Node.js)
MongoDB
Mongoose
Express
Elastic Search
+2 more

About Quantela


We are a technology company that offers outcomes business models. We empower our customers with the right digital infrastructure to deliver greater economic, social, and environmental outcomes for their constituents.


When the company was founded in 2015, we specialized in smart cities technology alone. Today, working with cities and towns, utilities, and public venues, our team of 280+ experts offers a vast array of outcomes business models through technologies like digital advertising, smart lighting, smart traffic, and digitized citizen services.


We pride ourselves on our agility, innovation, and passion to use technology for a higher purpose. Unlike other technology companies, we tailor our offerings (what we can digitize) and the business model (how we partner with our customer to deliver that digitization) to drive measurable impact where our customers need it most. Over the last several months alone, we have served customers to deliver outcomes like increased medical response times to save lives; reduced traffic congestion to keep cities moving and created new revenue streams to tackle societal issues like homelessness.


We are headquartered in Billerica, Massachusetts in the United States with offices across Europe, and Asia.


The company has been recognized with the World Economic Forum’s ‘Technology Pioneers’ award in 2019 and CRN’s IoT Innovation Award in 2020.


For latest news and updates please visit us at www.quantela.com


Overview of the Role


Collaborate with cross-functional teams to define, design, and build performant modern web applications and services. Build high-quality web applications and services by writing clean and modular code.


Must have Skills

  • 10+ years of experience leading teams and delivering products; write unit and integration tests to ensure the robustness and reliability of web applications and services.
  • Measure and improve the performance of microservices.
  • Catalyze growth within the team through code reviews and pair programming to maintain high development standards.
  • Investigate operational issues to find the root cause and propose improvements.
  • Design, build, and maintain APIs, services, and systems across the company's engineering teams.
  • Expert level of experience in design and development of Web Applications, highly scalable distributed systems.
  • Should have development experience using the latest technologies including NodeJS, microservices, Elasticsearch, TimescaleDB, Kafka, Redis, etc.
  • Should have exposure in NoSQL and SQL development.
  • Comprehensive knowledge of physical and logical data modelling, performance tuning.
  • Should possess excellent communication, presentation, and interpersonal skills.
  • Ability to work collaboratively and productively with globally dispersed teams
  • Build high-performance teams and coach team members for successful career growth
  • Experience working with relational and non-relational databases, query optimization, and designing schema

Desired Background

Bachelor's/Master's degree in Computer Science or Computer Applications

Read more
Vmultiply solutions
Remote only
5 - 10 yrs
₹8L - ₹10L / yr
Elastic Search
Apache Kafka
MongoDB
Jupyter Notebook
databricks
+2 more

1. Need to have an understanding of Elasticsearch, Kafka, MongoDB, etc.

2. Should have experience with Jupyter notebooks and Databricks

3. Java, Python

4. Senior level, 5-10 years of experience

5. It is important they have those skills so that they can take over current work. There is code written in both Java and Python (Java is legacy, but that is the main search engine code), so it will be counter-productive if the engineers hired do not have experience in both.

6. Excellent communication, analytical, research, and grasping skills

Read more