Apache Kafka Jobs in Hyderabad


Apply to 41+ Apache Kafka Jobs in Hyderabad on CutShort.io. Explore the latest Apache Kafka Job opportunities across top companies like Google, Amazon & Adobe.

Solix Technologies
Posted by Sumathi Arramraju
Hyderabad
3 - 7 yrs
₹6L - ₹12L / yr
Hadoop
Java
HDFS
Spring
Spark
+1 more
Primary Skills required: Java, J2EE, JSP, Servlets, JDBC, Tomcat, Hadoop (HDFS, MapReduce, Hive, HBase, Spark, Impala)
Secondary Skills: Streaming, Archiving, AWS / Azure / Cloud

Role:
·         Strong programming and support experience in Java and J2EE technologies
·         Good experience in Core Java, JSP, Servlets, JDBC
·         Good exposure to Hadoop development (HDFS, MapReduce, Hive, HBase, Spark)
·         2+ years of Java experience and 1+ years of experience in Hadoop
·         Good communication skills
·         Web Services or Elastic MapReduce
·         Familiarity with data-loading tools such as Sqoop
·         Good to know: Spark, Storm, Apache HBase
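The MapReduce model named above can be sketched in a few lines of plain Python (word count; the phases and inputs are illustrative only — real Hadoop distributes map, shuffle and reduce across a cluster):

```python
# Pure-Python sketch of the MapReduce model (word count): a map phase emits
# (key, 1) pairs, a shuffle groups pairs by key, and a reduce phase sums
# each group. Inputs are hard-coded for illustration.
from collections import defaultdict

def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(shuffle(map_phase(["Kafka and Hadoop", "kafka streams"])))
print(counts["kafka"])  # 2
```

The same three-phase shape underlies Hive and Spark jobs as well; only the execution engine changes.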
Aadvi tech
Posted by Sravan Kumar
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad
4 - 10 yrs
₹15L - ₹30L / yr
Kafka
Cucumber
Java
Test Automation (QA)
Selenium
+1 more


Job description:

  • Hands-on skills in the Java programming language
  • Experience in testing cloud-native applications, with exposure to Kafka.
  • Understanding of the concepts of K8s, caching, REST/gRPC and observability
  • Experience with good programming or scripting practices and tools: code review, ADO/Jenkins etc.
  • Apply expertise in Java, API testing, Cucumber or other test frameworks to design, develop and maintain automation test suites.
  • Intimate familiarity with QA concepts: white-/black-/grey-box testing, acceptance/regression tests, system integration tests, performance/stress tests, and security tests
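Black-box testing from the list above, in miniature: the test below treats a function purely as a contract of inputs and outputs, with no knowledge of its internals (the `normalize_price` function and its cases are invented for illustration):

```python
# Black-box test sketch: only observable behaviour is asserted; the test
# would stay valid even if the implementation were rewritten. The function
# under test is made up for illustration.

def normalize_price(raw):
    """Parse a price string like ' 1,299.50 ' into a float."""
    return float(raw.strip().replace(",", ""))

def test_normalize_price_black_box():
    assert normalize_price("1,299.50") == 1299.50
    assert normalize_price("  42 ") == 42.0

test_normalize_price_black_box()
```

A white-box test, by contrast, would also exercise internal branches (e.g. the comma-stripping path specifically), using knowledge of how the function is written.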
Hyderabad, Pune, Noida, Gurugram
6 - 7 yrs
₹22L - ₹24L / yr
Apache Kafka
Zookeeper
Implementation
Windows Azure
Docker
+5 more

·     IMMEDIATE JOINER

Professional experience of 5+ years in Confluent Kafka administration

·    Demonstrated design/development experience.

·    Must have proven knowledge and practical application of Confluent Kafka (Producers / Consumers / Kafka Connectors / Kafka Streams / ksqlDB / Schema Registry)

·    Experience in performance optimization of consumers and producers.

·    Good experience debugging issues related to offsets, consumer lag and partitions.

·    Experience with administrative tasks on Confluent Kafka.

·    Kafka admin experience including, but not limited to: setting up new Kafka clusters, creating topics, granting permissions, resetting offsets, purging data, setting up connectors, setting up replicator tasks, troubleshooting issues, monitoring Kafka cluster health and performance, and backup and recovery.

·    Experience in implementing security measures for Kafka clusters, including access controls and encryption, to protect sensitive data.

·    Kafka cluster install/upgrade techniques.

·    Good experience writing unit tests using JUnit and Mockito.

·    Experience working on client-facing projects.

·    Exposure to any cloud environment like Azure is an added advantage.

·    Experience in developing or working on REST microservices.

Experience in Java and Spring Boot is a plus
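A quick sketch of the consumer-lag arithmetic behind the debugging tasks listed above: for each partition, lag is the broker's log-end offset minus the group's last committed offset. The offsets below are made up; a real check would query the cluster (e.g. via `kafka-consumer-groups --describe`):

```python
# Illustrative consumer-lag computation. Offsets are hard-coded stand-ins
# for what a real admin tool would fetch from the brokers.

def consumer_lag(log_end_offsets, committed_offsets):
    """Return (per-partition lag, total lag) for a consumer group."""
    lag = {
        partition: log_end_offsets[partition] - committed_offsets.get(partition, 0)
        for partition in log_end_offsets
    }
    return lag, sum(lag.values())

per_partition, total = consumer_lag(
    log_end_offsets={0: 1200, 1: 980, 2: 1500},
    committed_offsets={0: 1200, 1: 950, 2: 900},
)
print(per_partition)  # {0: 0, 1: 30, 2: 600}
print(total)          # 630
```

A lag that grows steadily on one partition (like partition 2 here) typically points at a slow or stuck consumer instance rather than a broker problem.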

Egen Solutions
Posted by Anshul Saxena
Remote, Hyderabad, Ahmedabad, Noida, Delhi, Gurugram, Ghaziabad, Faridabad, Kolkata, Indore, Bhopal, Kochi (Cochin), Chennai, Bengaluru (Bangalore), Pune
3 - 5 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Kotlin
+3 more

Egen is a data engineering and cloud modernization firm helping industry-leading companies achieve digital breakthroughs and deliver for the future, today. We are catalysts for change who create digital breakthroughs at warp speed. Our team of cloud and data engineering experts are trusted by top clients in pursuit of the extraordinary. An Inc. 5000 Fastest Growing Company 7 times, and recently recognized on the Crain’s Chicago Business Fast 50 list, Egen has also been recognized as a great place to work 3 times.


You will join a team of insatiably curious data engineers, software architects, and product experts who never settle for "good enough". Our Java Platform team's tech stack is based on Java 8 (Spring Boot) and RESTful web services. We typically build and deploy applications as cloud-native Kubernetes microservices and integrate with scalable technologies such as Kafka in Docker container environments. Our developers work in an agile process to efficiently deliver high-value, data-driven applications and product packages.


Required Experience:

  • Minimum of a Bachelor’s degree or its equivalent in Computer Science, Computer Information Systems, Information Technology and Management, Electrical Engineering or a related field.
  • Experience working with, and a strong understanding of, object-oriented programming and cloud technologies
  • End-to-end experience delivering production-ready code with Java 8, Spring Boot, Spring Data, and API libraries
  • Strong experience with unit and integration testing of Spring Boot APIs.
  • Strong understanding and production experience of RESTful APIs and microservice architecture.
  • Strong understanding of SQL and NoSQL databases, and experience writing abstraction layers to communicate with the databases.
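The last point above, an abstraction layer over a database, can be sketched minimally: callers talk to a small repository class instead of issuing SQL directly, so the storage engine can be swapped without touching business code. SQLite stands in for the SQL store here; the `UserRepository` name and schema are invented for illustration:

```python
# Minimal repository-style database abstraction layer (illustrative).
import sqlite3

class UserRepository:
    def __init__(self, conn):
        self.conn = conn
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)"
        )

    def add(self, name):
        cur = self.conn.execute("INSERT INTO users (name) VALUES (?)", (name,))
        self.conn.commit()
        return cur.lastrowid

    def find(self, user_id):
        row = self.conn.execute(
            "SELECT id, name FROM users WHERE id = ?", (user_id,)
        ).fetchone()
        return {"id": row[0], "name": row[1]} if row else None

repo = UserRepository(sqlite3.connect(":memory:"))
uid = repo.add("Asha")
print(repo.find(uid))  # {'id': 1, 'name': 'Asha'}
```

In a Spring Boot codebase the same shape would typically be a Spring Data repository interface; the point is the boundary, not the language.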

Nice-to-haves (but not required):

  • Exposure to Kotlin or other JVM programming languages
  • Strong understanding and production experience working with Docker container environments
  • Strong understanding and production experience working with Kafka
  • Cloud Environments: AWS, GCP or Azure


is a software product company that provides

Agency job
via Dangi Digital Media LLP by Jaibir Dangi
Hyderabad
6 - 15 yrs
₹11L - ₹15L / yr
Python
Spark
SQL Azure
Apache Kafka
MongoDB
+4 more

  • 5+ years of experience designing, developing, validating, and automating ETL processes
  • 3+ years of experience with traditional ETL tools such as Visual Studio, SQL Server Management Studio, SSIS, SSAS and SSRS
  • 2+ years of experience with cloud technologies and platforms, such as Kubernetes, Spark, Kafka, Azure Data Factory, Snowflake, MLflow, Databricks, Airflow or similar
  • Must have experience with designing and implementing data access layers
  • Must be an expert with SQL/T-SQL and Python
  • Must have experience in Kafka
  • Define and implement data models with various database technologies like MongoDB, CosmosDB, Neo4j, MariaDB and SQL Server
  • Ingest and publish data from sources and to destinations via an API
  • Exposure to ETL/ELT using Kafka or Azure Event Hubs with Spark or Databricks is a plus
  • Exposure to healthcare technologies and integrations for FHIR API, HL7 or other HIE protocols is a plus


Skills Required :


Designing, Developing, ETL, Visual Studio, Python, Spark, Kubernetes, Kafka, Azure Data Factory, SQL Server, Airflow, Databricks, T-SQL, MongoDB, CosmosDB, Snowflake, SSIS, SSAS, SSRS, FHIR API, HL7, HIE Protocols
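As a toy illustration of the extract-transform-load shape behind the skills listed above (all data in memory and all names invented; a real pipeline would read from Kafka or SQL Server and write to a warehouse like Snowflake):

```python
# Toy extract-transform-load (ETL) step, illustrating the pattern only.

def extract():
    # Pretend source rows, e.g. from a staging table.
    return [
        {"id": 1, "amount": "120.50", "region": "south"},
        {"id": 2, "amount": "80.00", "region": "north"},
        {"id": 3, "amount": "bad", "region": "south"},
    ]

def transform(rows):
    # Cast types, drop rows that fail validation, normalise casing.
    clean = []
    for row in rows:
        try:
            clean.append({"id": row["id"],
                          "amount": float(row["amount"]),
                          "region": row["region"].upper()})
        except ValueError:
            pass  # a real pipeline would route these to a reject/audit table
    return clean

def load(rows, target):
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # 2 (the malformed row was rejected)
```

Tools like SSIS, Azure Data Factory or Airflow orchestrate exactly these three stages; the stages themselves stay conceptually this simple.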

Quadratic Insights
Posted by Praveen Kondaveeti
Hyderabad
7 - 10 yrs
₹15L - ₹24L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+6 more

About Quadratyx:

We are a global, product-centric insight & automation services company. We help the world’s organizations make better & faster decisions using the power of insight & intelligent automation. We build and operationalize their next-gen strategy through Big Data, Artificial Intelligence, Machine Learning, Unstructured Data Processing and Advanced Analytics. Quadratyx can boast more extensive experience in data sciences & analytics than most other companies in India.

We firmly believe in Excellence Everywhere.


Job Description

Purpose of the Job/ Role:

• As a Technical Lead, your work is a combination of hands-on contribution, customer engagement and technical team management. Overall, you’ll design, architect, deploy and maintain big data solutions.


Key Requisites:

• Expertise in Data structures and algorithms.

• Technical management across the full life cycle of big data (Hadoop) projects from requirement gathering and analysis to platform selection, design of the architecture and deployment.

• Scaling of cloud-based infrastructure.

• Collaborating with business consultants, data scientists, engineers and developers to develop data solutions.

• Leading and mentoring a team of data engineers.

• Hands-on experience in test-driven development (TDD).

• Expertise in NoSQL databases like MongoDB, Cassandra, etc. (MongoDB preferred) and strong knowledge of relational databases.

• Good knowledge of Kafka and Spark Streaming internal architecture.

• Good knowledge of any application server.

• Extensive knowledge of big data platforms like Hadoop, Hortonworks, etc.

• Knowledge of data ingestion and integration on cloud services such as AWS, Google Cloud, Azure, etc.


Skills/ Competencies Required

Technical Skills

• Strong expertise (9 or more out of 10) in at least one modern programming language, like Python or Java.

• Clear end-to-end experience in designing, programming, and implementing large software systems.

• Passion and analytical abilities to solve complex problems.

Soft Skills

• Always speaking your mind freely.

• Communicating ideas clearly in speech and writing; integrity to never copy or plagiarize the intellectual property of others.

• Exercising discretion and independent judgment where needed in performing duties; not needing micro-management, maintaining high professional standards.


Academic Qualifications & Experience Required

Required Educational Qualification & Relevant Experience

• Bachelor’s or Master’s in Computer Science, Computer Engineering, or related discipline from a well-known institute.

• Minimum 7-10 years of work experience as a developer in an IT organization (preferably with an Analytics / Big Data / Data Science / AI background).

Blue Yonder

Posted by GnanaMalleshwar Karri
Hyderabad, Bengaluru (Bangalore)
10 - 12 yrs
₹10L - ₹30L / yr
NodeJS (Node.js)
React.js
Angular (2+)
AngularJS (1.x)
MongoDB
+8 more

About Merchandise Operations (Merch Ops): Merchandise Operations is a merchandise management system positioned as a host system in the retail solutions suite. It can maintain master/foundation data, create and manage purchase orders, create and manage prices & promotions, perform replenishment, and provide effective inventory control and financial management. Merch Ops provides business users with consistent, accurate, and timely data across an enterprise by allowing them to get the:


Right Goods in the...

• Right Silhouettes, Sizes and Colors; at the...

• Right Price; at the...

• Right Location; for the...

• Right Consumer; at the...

• Right Time; at the...

• Right Quantity.


About Team:

• Proven, passionate bunch of disruptors providing solutions that solve real-time supply chain problems.

• A well-mixed team of young and experienced members, with product, domain, and industry knowledge.

• Expertise in designing and deploying massively scalable cloud-native SaaS products.

• The team currently comprises associates across the globe and is expected to grow rapidly.


Our current technical environment:

• Software: React JS, Node JS, Oracle PL/SQL, Git, REST APIs, JavaScript.

• Application Architecture: Scalable three-tier web application.

• Cloud Architecture: Private cloud, MS Azure (ARM templates, AKS, HDInsight, Application Gateway, Virtual Networks, Event Hub, Azure AD)

• Frameworks/Others: Apache Tomcat, RDBMS, Jenkins, Nginx, TypeORM, Express.


What you'll be doing:

• As a Staff Engineer you will be responsible for the design of the features in the product roadmap

• Creating and encouraging good software development practices engineering-wide, driving strategic technical improvements, and mentoring other engineers.

• You will write code as we expect our technical leadership to be in the trenches alongside junior engineers, understanding root causes and leading by example

• You will mentor engineers

• You will own relationships with other engineering teams and collaborate with other functions within Blue Yonder

• Drive architecture and designs to become simpler, more robust, and more efficient.


• Lead design discussions and come up with robust, more efficient designs to achieve features in the product roadmap

• Take complete responsibility for the features developed, right from coding till deployment

• Introduce new technology and tools for the betterment of the product

• Guide fellow engineers to look beyond the surface and fix the root causes rather than symptoms.


What we are looking for:

• Bachelor’s degree (B.E/B.Tech/M.Tech in Computer Science or a related specialization) and a minimum of 7 to 10 years of experience in software development, having been an architect within at least the last 1-2 years.

• Strong programming experience and background in Node JS and React JS.

• Hands-on development skills along with architecture/design experience.

• Hands-on experience in designing, building, deploying and maintaining enterprise cloud solutions.

• Demonstrable experience, thorough knowledge, and interests in Cloud native architecture, Distributed micro-services, Multi-tenant SaaS solution and Cloud Scalability, performance, and High availability

• Experience with API management platforms & providing / consuming RESTful APIs

• Experience with varied tools such as REST, Hibernate, RDBMS, Docker, Kubernetes, Kafka, React.

• Hands-on development experience on Oracle PL/SQL.

• Experience with DevOps and infrastructure automation.

BlueYonder
Bengaluru (Bangalore), Hyderabad
10 - 14 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Gradle
+13 more

·      Core responsibilities include analyzing business requirements and designs for accuracy and completeness, and developing and maintaining relevant products.

·      BlueYonder is seeking a Senior/Principal Architect in the Data Services department (under the Luminate Platform) to act as one of the key technology leaders who build and manage BlueYonder’s technology assets in the Data Platform and Services.

·      This individual will act as a trusted technical advisor and strategic thought leader to the Data Services department. The successful candidate will have the opportunity to lead, participate, guide, and mentor other people in the team on architecture and design in a hands-on manner. You are responsible for the technical direction of the Data Platform. This position reports to the Global Head, Data Services and will be based in Bangalore, India.

·      Core responsibilities include architecting and designing (along with counterparts and distinguished architects) a ground-up cloud-native (we use Azure) SaaS product in order management and micro-fulfillment.

·      The team currently comprises 60+ global associates across the US, India (COE) and the UK, and is expected to grow rapidly. The incumbent will need leadership qualities to also mentor junior and mid-level software associates in our team. This person will lead the Data Platform architecture: streaming and bulk, with Snowflake / Elasticsearch / other tools.

Our current technical environment:

·      Software: Java, Spring Boot, Gradle, Git, Hibernate, REST API, OAuth, Snowflake

·      Application Architecture: Scalable, resilient, event-driven, secure multi-tenant microservices architecture

·      Cloud Architecture: MS Azure (ARM templates, AKS, HDInsight, Application Gateway, Virtual Networks, Event Hub, Azure AD)

·      Frameworks/Others: Kubernetes, Kafka, Elasticsearch, Spark, NoSQL, RDBMS, Spring Boot, Gradle, Git, Ignite

Accolite Digital
Posted by Nitesh Parab
Bengaluru (Bangalore), Hyderabad, Gurugram, Delhi, Noida, Ghaziabad, Faridabad
4 - 8 yrs
₹5L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
SSIS
SQL Server Integration Services (SSIS)
+10 more

Job Title: Data Engineer

Job Summary: As a Data Engineer, you will be responsible for designing, building, and maintaining the infrastructure and tools necessary for data collection, storage, processing, and analysis. You will work closely with data scientists and analysts to ensure that data is available, accessible, and in a format that can be easily consumed for business insights.

Responsibilities:

  • Design, build, and maintain data pipelines to collect, store, and process data from various sources.
  • Create and manage data warehousing and data lake solutions.
  • Develop and maintain data processing and data integration tools.
  • Collaborate with data scientists and analysts to design and implement data models and algorithms for data analysis.
  • Optimize and scale existing data infrastructure to ensure it meets the needs of the business.
  • Ensure data quality and integrity across all data sources.
  • Develop and implement best practices for data governance, security, and privacy.
  • Monitor data pipeline performance and errors, and troubleshoot issues as needed.
  • Stay up-to-date with emerging data technologies and best practices.
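The data-quality responsibility above can be made concrete with a small validation gate that splits rows into passed and failed before loading downstream; the checks and field names here are invented for illustration:

```python
# Illustrative data-quality gate for a pipeline stage: each row must pass
# every named check before being loaded; failures carry their reasons so a
# real pipeline could route them to an audit/reject table.

CHECKS = {
    "id_present": lambda row: row.get("id") is not None,
    "amount_non_negative": lambda row: row.get("amount", 0) >= 0,
}

def quality_gate(rows):
    """Split rows into (passed, failed-with-reasons)."""
    passed, failed = [], []
    for row in rows:
        reasons = [name for name, check in CHECKS.items() if not check(row)]
        if reasons:
            failed.append((row, reasons))
        else:
            passed.append(row)
    return passed, failed

good, bad = quality_gate([
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": -5.0},
])
print(len(good), len(bad))  # 1 1
```

Frameworks add scheduling and alerting around this, but the gate itself is just a named set of predicates applied per row.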

Requirements:

Bachelor's degree in Computer Science, Information Systems, or a related field.

Experience with ETL tools like Matillion, SSIS, Informatica

Experience with SQL and relational databases such as SQL server, MySQL, PostgreSQL, or Oracle.

Experience in writing complex SQL queries

Strong programming skills in languages such as Python, Java, or Scala.

Experience with data modeling, data warehousing, and data integration.

Strong problem-solving skills and ability to work independently.

Excellent communication and collaboration skills.

Familiarity with big data technologies such as Hadoop, Spark, or Kafka.

Familiarity with data warehouse/Data lake technologies like Snowflake or Databricks

Familiarity with cloud computing platforms such as AWS, Azure, or GCP.

Familiarity with Reporting tools

Teamwork/ growth contribution

  • Helping the team take interviews and identify the right candidates
  • Adhering to timelines
  • On-time status communication and upfront communication of any risks
  • Teach, train, and share knowledge with peers
  • Good communication skills
  • Proven ability to take initiative and be innovative
  • Analytical mind with a problem-solving aptitude

Good to have :

Master's degree in Computer Science, Information Systems, or a related field.

Experience with NoSQL databases such as MongoDB or Cassandra.

Familiarity with data visualization and business intelligence tools such as Tableau or Power BI.

Knowledge of machine learning and statistical modeling techniques.

If you are passionate about data and want to work with a dynamic team of data scientists and analysts, we encourage you to apply for this position.

An 8-year-old IT Services and consulting company.

Agency job
via Startup Login by Shreya Sanchita
Remote, Hyderabad, Bengaluru (Bangalore)
8 - 15 yrs
₹20L - ₹55L / yr
Python
Django
Flask
Data Analytics
Data Science
+11 more

CTC Budget: 35-55LPA

Location: Hyderabad (Remote after 3 months WFO)


Company Overview:


An 8-year-old IT Services and consulting company based in Hyderabad, providing services in maximizing product value while delivering rapid incremental innovation, with extensive SaaS company M&A experience including 20+ closed transactions on both the buy and sell sides. They have over 100 employees and are looking to grow the team.


  • 6+ years of experience as a Python developer.
  • Experience in web development using Python and the Django framework.
  • Experience in data analysis and data science using Pandas, NumPy and scikit-learn (GTH)
  • Experience in developing user interfaces using HTML, JavaScript, CSS.
  • Experience in server-side templating languages including Jinja2 and Mako
  • Knowledge of Kafka and RabbitMQ (GTH)
  • Experience with Docker, Git and AWS
  • Ability to integrate multiple data sources into a single system.
  • Ability to collaborate on projects and work independently when required.
  • DB (MySQL, PostgreSQL, SQL)


Selection Process: 2-3 Interview rounds (Tech, VP, Client)

An 8-year-old IT Services and consulting company.

Agency job
via Startup Login by Shreya Sanchita
Hyderabad, Bengaluru (Bangalore)
8 - 12 yrs
₹30L - ₹50L / yr
PHP
Javascript
React.js
Angular (2+)
AngularJS (1.x)
+17 more

CTC Budget: 35-50LPA

Location: Hyderabad/Bangalore

Experience: 8+ Years


Company Overview:


An 8-year-old IT Services and consulting company based in Hyderabad, providing services in maximizing product value while delivering rapid incremental innovation, with extensive SaaS company M&A experience including 20+ closed transactions on both the buy and sell sides. They have over 100 employees and are looking to grow the team.


● Work with, learn from, and contribute to a diverse, collaborative development team

● Use plenty of PHP, Go, JavaScript, MySQL, PostgreSQL, ElasticSearch, Redshift, AWS services and other technologies

● Build efficient and reusable abstractions and systems

● Create robust cloud-based systems used by students globally at scale

● Experiment with cutting-edge technologies and contribute to the company’s product roadmap

● Deliver data at scale to bring value to clients


Requirements

You will need:

● Experience working with a server-side language in a full-stack environment

● Experience with various database technologies (relational, NoSQL, document-oriented, etc.) and query concepts in high-performance environments

● Experience in one of these areas: React, Backbone

● Understanding of ETL concepts and processes

● Great knowledge of design patterns and back-end architecture best practices

● Sound knowledge of front-end basics like JavaScript, HTML, CSS

● Experience with TDD and automated testing

● 12+ years’ experience as a developer

● Experience with Git or Mercurial

● Fluent written & spoken English

It would be great if you have:

● B.Sc or M.Sc degree in Software Engineering, Computer Science or similar

● Experience and/or interest in API design

● Experience with Symfony and/or Doctrine

● Experience with Go and microservices

● Experience with message queues, e.g. SQS, Kafka, Kinesis, RabbitMQ

● Experience working with a modern Big Data stack

● Contributed to open source projects

● Experience working in an Agile environment

Monarch Tractors India
Posted by Venkat Ramthirdh
Hyderabad
5 - 8 yrs
Best in industry
Python
Javascript
React.js
Angular (2+)
AngularJS (1.x)
+20 more

Job Description:

Responsibilities:

·      Participate in the entire application lifecycle, focusing on coding and debugging

·      Ability to design and document product features and code

·      Self-driven, with the ability to take ownership of things

·      Write clean code to develop functional applications, automation scripts and test cases

·      Troubleshoot and debug applications

·      Collaborate with developers and cross-functional teams to identify issues and new features, and come up with solutions

·      Gather and address technical and design requirements

·      Provide training and support to internal teams

·      Build reusable code and libraries for future use

·      Ability to sync with other developers and designers to identify issues, new features and improvements

Requirements and skills:

·      Solid understanding of Linux development environments and systems

·      Expert-level knowledge of Python, along with frameworks like Django and Flask

·      Proficient in writing unit test cases using the Pytest framework

·      Expert-level knowledge of SQL databases like MySQL and PostgreSQL

·      Good knowledge of design principles and databases

·      Knowledge of front-end technologies like TypeScript and ReactJS

·      Knowledge of caching techniques using Memcached and Redis

·      Experience using queueing services like Kafka

·      Solid understanding of microservices architecture

·      Knowledge of AWS cloud services

·      Expert in serverless technologies like Lambda, along with API Gateway

·      Knowledge of Git, Jira, CI/CD pipelines and containerization like Docker

·      Knowledge of logging and monitoring tools like Grafana or New Relic
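The caching techniques mentioned above usually mean a cache-aside pattern: read through the cache, fall back to the source of truth on a miss, then populate the cache for subsequent readers. A minimal sketch, with a plain dict standing in for Memcached/Redis and an invented "database" call:

```python
# Minimal cache-aside sketch. A dict plays the role of Memcached/Redis;
# the database function and call counter are illustrative only.

cache = {}
db_calls = 0

def fetch_user_from_db(user_id):
    global db_calls
    db_calls += 1                        # pretend this is an expensive query
    return {"id": user_id, "name": f"user-{user_id}"}

def get_user(user_id):
    key = f"user:{user_id}"
    if key in cache:                     # cache hit: no DB round-trip
        return cache[key]
    value = fetch_user_from_db(user_id)  # cache miss: hit the database
    cache[key] = value                   # populate for subsequent readers
    return value

get_user(7)
get_user(7)
print(db_calls)  # 1 — the second read was served from the cache
```

With a real cache the populate step would also set a TTL so stale entries expire, which this in-memory sketch omits.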

 

Telstra

Posted by Mahesh Balappa
Bengaluru (Bangalore), Hyderabad, Pune
3 - 7 yrs
Best in industry
Spark
Hadoop
NOSQL Databases
Apache Kafka

About Telstra

 

Telstra is Australia’s leading telecommunications and technology company, with operations in more than 20 countries, including in India, where we’re building a new Innovation and Capability Centre (ICC) in Bangalore.

 

We’re growing, fast, and for you that means many exciting opportunities to develop your career at Telstra. Join us on this exciting journey, and together, we’ll reimagine the future.

 

Why Telstra?

 

  • We're an iconic Australian company with a rich heritage that's been built over 100 years. Telstra is Australia's leading telecommunications and technology company. We've been operating internationally for more than 70 years.
  • International presence spanning over 20 countries.
  • We are one of the 20 largest telecommunications providers globally.
  • At Telstra, the work is complex and stimulating, but with that comes a great sense of achievement. We are shaping tomorrow's modes of communication with our innovation-driven teams.

 

Telstra offers an opportunity to make a difference to lives of millions of people by providing the choice of flexibility in work and a rewarding career that you will be proud of!

 

About the team

Being part of Networks & IT means you'll be part of a team that focuses on extending our network superiority to enable the continued execution of our digital strategy.

With us, you'll be working with world-leading technology and change the way we do IT to ensure business needs drive priorities, accelerating our digitisation programme.

 

Focus of the role

Any new engineer who joins the data chapter will mostly be developing reusable data processing and storage frameworks that can be used across the data platform.

 

About you

To be successful in the role, you'll bring skills and experience in:-

 

Essential 

  • Hands-on experience in Spark Core, Spark SQL, SQL/Hive/Impala, Git/SVN/any other VCS, and data warehousing
  • Skilled in the Hadoop ecosystem (HDP/Cloudera/MapR/EMR etc.)
  • Azure Data Factory / Airflow / Control-M / Luigi
  • PL/SQL
  • Exposure to NoSQL (HBase / Cassandra / GraphDB (Neo4j) / MongoDB)
  • File formats (Parquet/ORC/Avro/Delta/Hudi etc.)
  • Kafka/Kinesis/Event Hubs

 

Highly Desirable

Experience and knowledgeable on the following:

  • Spark Streaming
  • Cloud exposure (Azure/AWS/GCP)
  • Azure data offerings - ADF, ADLS2, Azure Databricks, Azure Synapse, Eventhubs, CosmosDB etc.
  • Presto/Athena
  • Azure DevOps
  • Jenkins/ Bamboo/Any similar build tools
  • Power BI
  • Prior experience in building, or working in a team that builds, reusable frameworks
  • Data modelling.
  • Data Architecture and design principles. (Delta/Kappa/Lambda architecture)
  • Exposure to CI/CD
  • Code Quality - Static and Dynamic code scans
  • Agile SDLC      

 

If you've got a passion to innovate, want to succeed as part of a great team, and are looking for the next step in your career, we'd welcome you to apply!

___________________________

 

We’re committed to building a diverse and inclusive workforce in all its forms. We encourage applicants from diverse gender, cultural and linguistic backgrounds and applicants who may be living with a disability. We also offer flexibility in all our roles, to ensure everyone can participate.

To learn more about how we support our people, including accessibility adjustments we can provide you through the recruitment process, visit tel.st/thrive.

Accion Labs

Posted by Jayasri Palanivelu
Bengaluru (Bangalore), Hyderabad, Pune, Mumbai
6 - 10 yrs
₹15L - ₹30L / yr
Java
Spring Boot
Hibernate (Java)
Microservices
NOSQL Databases
+3 more

Desired Candidate Profile


  • A team focus with strong collaboration and communication skills
  • Exceptional ability to quickly grasp high-level business goals, derive requirements, and translate them into effective technical solutions
  • Exceptional object-oriented thinking, design and programming skills (Java 8 or 11)
  • Expertise with the following technologies: Data Structures, Design Patterns, code versioning tools (GitHub/Bitbucket/etc.), XML, JSON, Spring Batch, RESTful services, Spring Cloud, Grafana (knowledge/experience), Kafka, Spring Boot, Microservices, DB/NoSQL, Docker, Kubernetes, AWS/GCP, architecture design (patterns), Agile, JIRA.
  • Penchant toward self-motivation and continuous improvement; these words should describe you: dedicated, energetic, curious, conscientious, and flexible.
Java developer based in Hyderabad.

Agency job
via Qrata by Rayal Rajan
Hyderabad
5 - 12 yrs
₹6L - ₹25L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Python
+7 more
• Excellent knowledge of Core Java (J2SE) and J2EE technologies.
• Hands-on experience with RESTful services and API design is a must.
• Knowledge of microservices architecture is a must.
• Knowledge of design patterns is a must.
• Strong knowledge of exception handling and logging mechanisms is a must.
• Agile scrum participation experience. Work experience with several agile teams on an application built with microservices and event-based architectures, to be deployed on hybrid (on-prem/cloud) environments.
• Good knowledge of the Spring framework (MVC, Cloud, Data, Security, etc.) and ORM frameworks like JPA/Hibernate.
• Experience in managing the source code base through version control tools like SVN, GitHub, Bitbucket, etc.
• Experience in using and configuring Continuous Integration tools such as Jenkins, Travis, GitLab, etc.
• Experience in the design and development of SaaS/PaaS-based architectures and tenancy models.
• Experience in SaaS/PaaS-based application development used by a high volume of subscribers/customers.
• Awareness and understanding of data security and privacy.
• Experience in performing Java code reviews using review tools like SonarQube, etc.
• Good understanding of the end-to-end software development lifecycle. Ability to read and understand requirements and design documents.
• Good analytical skills; should be self-driven.
• Good communication and interpersonal skills.
• Open to learning new technologies and domains.
• A good team player, ready to take up new challenges. Active communication and coordination with clients and internal stakeholders.

Requirements: Skills and Qualifications

• 6-8 years of experience in developing Java/J2EE-based enterprise web applications
• Languages: Java, J2EE, and Python
• Databases: MySQL, Oracle, SQL Server, PostgreSQL, Redshift, MongoDB
• DB Script: SQL and PL/SQL
• Frameworks: Spring, Spring Boot, Jersey, Hibernate and JPA
• OS: Windows, Linux/Unix
• Cloud Services: AWS and Azure
• Version Control / DevOps tools: Git, Bitbucket and Jenkins
• Message brokers: RabbitMQ and Kafka
• Deployment Servers: Tomcat, Docker, and Kubernetes
• Build Tools: Gradle/Maven
Bengaluru (Bangalore), Hyderabad, Pune, Chennai, Jaipur
10 - 14 yrs
₹1L - ₹15L / yr
Ant
Maven
CI/CD
Jenkins
GitHub
+16 more

DevOps Architect 

Experience:  10-12+ years of relevant DevOps experience
Locations : Bangalore, Chennai, Pune, Hyderabad, Jaipur.

Qualification:
• Bachelor's or advanced degree in Computer Science, Software Engineering, or equivalent is required.
• Certifications in specific areas are desired

Technical Skillset: Skills Proficiency level

  • Build tools (Ant or Maven) - Expert
  • CI/CD tool (Jenkins or Github CI/CD) - Expert
  • Cloud DevOps (AWS CodeBuild, CodeDeploy, CodePipeline, etc.) or Azure DevOps - Expert
  • Infrastructure As Code (Terraform, Helm charts etc.) - Expert
  • Containerization (Docker, Docker Registry) - Expert
  • Scripting (Linux) - Expert
  • Cluster deployment (Kubernetes) & maintenance - Expert
  • Programming (Java) - Intermediate
  • Application Types for DevOps (Streaming like Spark, Kafka, Big data like Hadoop etc) - Expert
  • Artifactory (JFrog) - Expert
  • Monitoring & Reporting (Prometheus, Grafana, PagerDuty etc.) - Expert
  • Ansible, MySQL, PostgreSQL - Intermediate


• Source Control (like Git, Bitbucket, Svn, VSTS etc)
• Continuous Integration (like Jenkins, Bamboo, VSTS )
• Infrastructure Automation (like Puppet, Chef, Ansible)
• Deployment Automation & Orchestration (like Jenkins, VSTS, Octopus Deploy)
• Container Concepts (Docker)
• Orchestration (Kubernetes, Mesos, Swarm)
• Cloud (like AWS, Azure, GoogleCloud, Openstack)

Roles and Responsibilities

• The DevOps architect should automate processes with the proper tools.
• Developing appropriate DevOps channels throughout the organization.
• Evaluating, implementing, and streamlining DevOps practices.
• Establishing a continuous-build environment to accelerate software deployment and development processes.
• Engineering general and effective processes.
• Helping operations and developer teams solve their problems.
• Supervising, examining, and handling technical operations.
• Providing DevOps processes and operations.
• Capacity to handle teams with leadership attitude.
• Must possess excellent automation skills and the ability to drive initiatives to automate processes.
• Building strong cross-functional leadership skills and working together with the operations and engineering teams to make sure that systems are scalable and secure.
• Excellent knowledge of software development and software testing methodologies along with configuration management practices in Unix and Linux-based environment.
• Possess sound knowledge of cloud-based environments.
• Experience in handling automated deployment CI/CD tools.
• Must possess excellent knowledge of infrastructure automation tools (Ansible, Chef, and Puppet).
• Hands-on experience in working with Amazon Web Services (AWS).
• Must have strong expertise in operating Linux/Unix environments and scripting languages like Python, Perl, and Shell.
• Ability to review deployment and delivery pipelines i.e., implement initiatives to minimize chances of failure, identify bottlenecks and troubleshoot issues.
• Previous experience in implementing continuous delivery and DevOps solutions.
• Experience in designing and building solutions to move data and process it.
• Must possess expertise in any of the coding languages depending on the nature of the job.
• Experience with containers and container orchestration tools (AKS, EKS, OpenShift, Kubernetes, etc)
• Experience with version control systems a must (GIT an advantage)
• Belief in "Infrastructure as Code" (IaC), including experience with open-source tools such as Terraform
• Treats best practices for security as a requirement, not an afterthought
• Extensive experience with version control systems like GitLab and their use in release management, branching, merging, and integration strategies
• Experience working with Agile software development methodologies
• Proven ability to work on cross-functional Agile teams
• Mentor other engineers in best practices to improve their skills
• Creating suitable DevOps channels across the organization.
• Designing efficient practices.
• Delivering comprehensive best practices.
• Managing and reviewing technical operations.
• Ability to work independently and as part of a team.
• Exceptional communication skills, be knowledgeable about the latest industry trends, and highly innovative
Service Pack
Posted by Alice Preetika
Hyderabad
3 - 6 yrs
₹10L - ₹12L / yr
Python
Django
Flask
NoSQL Databases
Apache Kafka

What are the Key Responsibilities:


  • Responsibilities include writing and testing code, debugging programs, and integrating applications with third-party web services. 
  • Write effective, scalable code
  • Develop back-end components to improve responsiveness and overall performance
  • Integrate user-facing elements into applications
  • Improve functionality of existing systems
  • Implement security and data protection solutions
  • Assess and prioritize feature requests
  • Creates customized applications for smaller tasks to enhance website capability based on business needs
  • Ensures web pages are functional across different browser types; conducts tests to verify user functionality
  • Verifies compliance with accessibility standards
  • Assists in resolving moderately complex production support problems
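Several of the responsibilities above, such as building back-end components and integrating user-facing elements, come down to request handling. A minimal WSGI sketch illustrates the idea (WSGI is the framework-free protocol that Django and Flask build on; the `/health` route and JSON payloads are illustrative assumptions):

```python
import json

def app(environ, start_response):
    """Minimal WSGI application: routes a request path to a JSON response."""
    path = environ.get("PATH_INFO", "/")
    if path == "/health":
        body = json.dumps({"status": "ok"}).encode()
        status = "200 OK"
    else:
        body = json.dumps({"error": "not found"}).encode()
        status = "404 Not Found"
    start_response(status, [("Content-Type", "application/json"),
                            ("Content-Length", str(len(body)))])
    return [body]

def call(path):
    # Tiny harness that invokes the app the way a WSGI server would.
    captured = {}
    def start_response(status, headers):
        captured["status"] = status
    body = b"".join(app({"PATH_INFO": path}, start_response))
    return captured["status"], json.loads(body)
```

In a real Django or Flask service the framework supplies routing, parsing, and middleware, but the request-in, structured-response-out shape is the same.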

What are we looking for:


  • 3+ Years of work experience as a Python Developer
  • Expertise in at least one popular Python framework: Django
  • Knowledge of NoSQL databases (Elasticsearch, MongoDB)
  • Familiarity with front-end technologies like JavaScript, HTML5, and CSS3
  • Familiarity with Apache Kafka will give you an edge over others
  • Good understanding of the operating system and networking concepts
  • Good analytical and troubleshooting skills
  • Graduation/Post Graduation in Computer Science / IT / Software Engineering
  • Decent verbal and written communication skills to communicate with customers, support personnel, and management
Codvoai

Posted by Akanksha kondagurla
Hyderabad
3 - 5 yrs
₹3L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
J2EE
Spring Boot
+2 more

At Codvo, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day. We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.

 

Key Responsibilities

  •   The Kafka Engineer is responsible for designing and recommending the best approach for data movement to/from different sources using Apache/Confluent Kafka.
  •   Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply best practices.
  •   Develop and ensure adherence to published system architectural decisions and development standards.
  •   Must be comfortable working with offshore/global teams to deliver projects.

 

 

Required Skills

 

  •  Good understanding of event-based architecture, messaging frameworks, and stream-processing solutions using the Kafka messaging framework.
  •  3+ years of hands-on experience with Kafka Connect and Schema Registry in a high-volume environment.
  •  Strong knowledge of and exposure to Kafka brokers, ZooKeeper, KSQL, KStreams, and Kafka Control Center.
  •  Good experience working with Kafka connectors such as MQ connectors, Elasticsearch connectors, JDBC connectors, FileStream connectors, and JMS source connectors, as well as tasks, workers, converters, and transforms.
  •  Hands-on experience with developing APIs and microservices.
  •  Solid expertise with Java.
  •  Working knowledge of the Kafka REST proxy and experience building custom connectors using the Kafka core concepts and API.
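Much of the Kafka work listed above hinges on how records map to partitions. A rough sketch of key-hash partitioning follows (Kafka's default partitioner uses murmur2 over the key bytes; md5 stands in here so the sketch is dependency-free, an illustrative assumption rather than the real client API):

```python
import hashlib

def partition_for(key: str, num_partitions: int) -> int:
    """Deterministically map a record key to a partition, Kafka-style.

    Kafka's default partitioner hashes the key bytes (murmur2) modulo the
    partition count; md5 stands in here so the sketch needs no dependencies.
    """
    digest = hashlib.md5(key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# All records with the same key land on the same partition, which is what
# gives Kafka its per-key ordering guarantee.
assignments = {k: partition_for(k, 6) for k in ["order-1", "order-2", "order-1"]}
```

This is why choosing a good key matters: a skewed key distribution concentrates load on a few partitions.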

 

Good to have: 

 

  •  Understanding of Data warehouse architecture and data modelling
  •  Good knowledge of big data ecosystem to design and develop capabilities to deliver solutions using CI/CD pipelines.
  •  Good understanding of other AWS services such as CloudWatch monitoring, scheduling, and automation services
  •  Strong skills in In-memory applications, Database Design, and Data Integration
  •  Ability to guide and mentor team members on using Kafka.

 

 

Experience: 3 to 8 Years 

Work timings: 2.30 PM - 11.30 PM
Location: Hyderabad

Jio Platforms Limited

Posted by Giri Korukonda
Mumbai, Hyderabad
4 - 8 yrs
Best in industry
NodeJS (Node.js)
Express
FeatherJS
API
MySQL
+2 more

Experience with Node.js, Express, Feathers JS
Third-party API integration knowledge
Database: MySQL or NoSQL

Kafka client integration with Node.js

Redis integration using Node.js

Banking domain product based company


Agency job
via New Era India by Poorti Punj
Bengaluru (Bangalore), Hyderabad, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
1 - 5 yrs
₹10L - ₹20L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Apache Kafka
+2 more

Skills Required:

  • Must-Have: Java, Spring/Spring Boot, data structures, algorithms
  • Any 2 (Must) out of the below-mentioned list 

a.     JPA/Hibernate

b.     Messaging queue/Kafka/SQS/distributed messaging/SNS/JMS

c.     NoSQL/Aerospike/Redis/Cassandra

d.     Microservices

 

Roles and Responsibilities:

  • Technical design, implementation, deployment, and support.
  • Partner with Business Analysts to review and implement business requirements.
  • Perform development and unit testing, working closely with Business.
  • Mentors and oversees the development of resources, including reviewing designs and performing code reviews.
  • Ensure designs are in compliance with specifications
  • Contribute to all phases of the development lifecycle
  • Developing high-volume, low-latency applications for mission-critical systems and delivering high availability and performance
  • Should have experience of working on Core Java/J2EE & OOPS concept
  • Should be well versed with Spring/Struts & Apache Camel (OSGI Framework)
  • Should have a good understanding of Hibernate and Other ORMs
  • Should have an understanding of working with web services (SOAP/REST) and Maven
  • CI/build tools such as Jenkins
  • Caching techniques (Redis, Hazelcast, Aerospike)
  • Database Knowledge - Oracle, MySQL
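The caching techniques named above (Redis, Hazelcast, Aerospike) share one core idea: key/value storage with expiry. A minimal in-process TTL-cache sketch follows (an in-memory stand-in for Redis-style `SET key value EX ttl`; the injectable clock is an illustrative assumption that makes expiry deterministic):

```python
import time

class TTLCache:
    """In-memory key/value cache with per-entry time-to-live."""
    def __init__(self, clock=time.monotonic):
        self._store = {}   # key -> (value, expires_at)
        self._clock = clock

    def set(self, key, value, ttl_seconds):
        self._store[key] = (value, self._clock() + ttl_seconds)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if self._clock() >= expires_at:   # lazily evict expired entries
            del self._store[key]
            return default
        return value

# A fake clock makes expiry deterministic for the demonstration.
now = [0.0]
cache = TTLCache(clock=lambda: now[0])
cache.set("session:42", {"user": "alice"}, ttl_seconds=30)
hit = cache.get("session:42")
now[0] = 31.0
miss = cache.get("session:42")
```

Distributed caches add replication, eviction policies, and network access on top, but the set-with-TTL / get-or-miss contract is the same.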

 

Product and Service based company


Agency job
via Jobdost by Sathish Kumar
Hyderabad, Ahmedabad
4 - 8 yrs
₹15L - ₹30L / yr
Amazon Web Services (AWS)
Apache
Snowflake schema
Python
Spark
+13 more

Job Description

 

Mandatory Requirements 

  • Experience in AWS Glue

  • Experience in Apache Parquet 

  • Proficient in AWS S3 and data lake 

  • Knowledge of Snowflake

  • Understanding of file-based ingestion best practices.

  • Scripting languages - Python & PySpark

CORE RESPONSIBILITIES

  • Create and manage cloud resources in AWS 

  • Data ingestion from different data sources that expose data using different technologies, such as RDBMS, flat files, streams, and time-series data based on various proprietary systems. Implement data ingestion and processing with the help of Big Data technologies 

  • Data processing/transformation using various technologies such as Spark and cloud services. You will need to understand your part of the business logic and implement it using the language supported by the base data platform 

  • Develop automated data quality check to make sure right data enters the platform and verifying the results of the calculations 

  • Develop an infrastructure to collect, transform, combine and publish/distribute customer data.

  • Define process improvement opportunities to optimize data collection, insights and displays.

  • Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible 

  • Identify and interpret trends and patterns from complex data sets 

  • Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders. 

  • Key participant in regular Scrum ceremonies with the agile teams  

  • Proficient at developing queries, writing reports and presenting findings 

  • Mentor junior members and bring best industry practices.
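The automated data-quality check mentioned in the responsibilities can be sketched as a rule-driven row validator (the field names and rules below are illustrative assumptions, not a prescribed schema):

```python
def check_rows(rows, rules):
    """Validate ingested rows against named rules; return the failures.

    rules: mapping of rule name -> predicate taking a row dict.
    """
    failures = []
    for i, row in enumerate(rows):
        for name, predicate in rules.items():
            if not predicate(row):
                failures.append({"row": i, "rule": name})
    return failures

rules = {
    "amount_positive": lambda r: r.get("amount", 0) > 0,
    "customer_present": lambda r: bool(r.get("customer_id")),
}
rows = [
    {"customer_id": "c1", "amount": 120.0},
    {"customer_id": "", "amount": -5.0},   # fails both rules
]
bad = check_rows(rows, rules)
```

In a pipeline, a non-empty failure list would typically quarantine the batch and raise an alert rather than let bad data enter the platform.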

 

QUALIFICATIONS

  • 5-7+ years’ experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales) 

  • Strong background in math, statistics, computer science, data science or related discipline

  • Advanced knowledge of one of: Java, Scala, Python, C# 

  • Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake  

  • Proficient with

  • Data mining/programming tools (e.g. SAS, SQL, R, Python)

  • Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)

  • Data visualization (e.g. Tableau, Looker, MicroStrategy)

  • Comfortable learning about and deploying new technologies and tools. 

  • Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines. 

  • Good written and oral communication skills and ability to present results to non-technical audiences 

  • Knowledge of business intelligence and analytical tools, technologies and techniques.

Familiarity and experience in the following is a plus: 

  • AWS certification

  • Spark Streaming 

  • Kafka Streaming / Kafka Connect 

  • ELK Stack 

  • Cassandra / MongoDB 

  • CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools

New age Product based company, developed solution platform


Agency job
via Jobdost by Sathish Kumar
Hyderabad
7 - 12 yrs
₹20L - ₹40L / yr
Spring Boot
NoSQL Databases
Java
Hibernate (Java)
Microservices
+3 more
MUST HAVE: 
  • Proven experience in a high-level programming language such as Java (preferred)
  • Strong knowledge of data structures, algorithms, and coding
  • Experience with messaging technologies like Kafka or RabbitMQ
  • Proven experience with database technologies like NoSQL
  • Hands-on experience with Spring, Spring Security, Spring Boot, and the Hibernate framework
  • Working knowledge of developing RESTful microservices
  • Strong analytical and problem-solving skills
  • Attend team meetings to discuss projects, brainstorm ideas, and put forward solutions to any issues
  • Ability to understand the platform/domain of the business in detail; ability to multi-task
  • Good communication and organizational skills
New Age Product based company, developed solution platform


Agency job
via Jobdost by Sathish Kumar
Hyderabad
2 - 6 yrs
₹7L - ₹20L / yr
Apache Kafka
Java
Spring Boot
Hibernate (Java)
Spring
+3 more
MUST HAVE: 
  • Proven experience in a high-level programming language such as Java (preferred)
  • Strong knowledge of data structures, algorithms, and coding
  • Experience with messaging technologies like Kafka or RabbitMQ
  • Proven experience with database technologies like NoSQL
  • Hands-on experience with Spring, Spring Security, Spring Boot, and the Hibernate framework
  • Working knowledge of developing RESTful microservices
  • Strong analytical and problem-solving skills
  • Attend team meetings to discuss projects, brainstorm ideas, and put forward solutions to any issues
  • Ability to understand the platform/domain of the business in detail; ability to multi-task
  • Good communication and organizational skills
Symblai

Posted by Vaishali M
anywhere, Pune, Hyderabad, Bengaluru (Bangalore)
8 - 15 yrs
₹15L - ₹35L / yr
Java
NodeJS (Node.js)
AngularJS (1.x)
Python
MongoDB
+8 more

Software Architect

Symbl is hiring a Software Architect who is passionate about leading cross-functional R&D teams. This role will serve as the Architect across the global organization driving product architecture, reducing information silos across the org to improve decision making, and coordinating with other engineering teams to ensure seamless integration with other Symbl services.

Symbl is seeking a leader with a demonstrated track record of leading cross-functional dev teams. You are a fit for the role if:

  • You have a track record of designing and building large-scale, cloud-based, highly available software platforms.
  • You have 8+ years of experience in software development with 2+ years in an architect role.
  • You have experience working on customer-facing machine learning implementations (predictions, recommendations, anomaly detection)
  • You are an API first developer who understands the power of platforms.
  • You are passionate about enabling other developers through your leadership and driving teams to efficient decisions.
  • You have the ability to balance long-term objectives with urgent short-term needs
  • You can successfully lead teams through very challenging engineering problems.
  • You have domain expertise in one or more of: data pipelines and workflows, telephony systems, real-time audio and video streaming, machine learning.
  • You have a bachelor's degree in a computer-science-related field (a minimum requirement)
  • You’ll bring your deep experience with distributed systems and platform engineering principles to the table.
  • You are passionate about operational excellence and know-how to build software that delivers it.
  • You are able to think at scale, and define and meet stringent availability and performance SLAs while ensuring quality and resiliency challenges across our diverse product and tech stacks are addressed; NodeJS is mandatory, plus Java, Python, JavaScript, and ReactJS, with intersection with the ML platform and open-source DBs.
  • You understand end-user use cases and are driven to design optimal software that meets business needs.

Your day would look like:

  • Work with your team, providing engineering leadership and ensuring your resources are solving the most critical engineering problems while ensuring your products are scalable, performant, and highly available.
  • Focus on delivering the highest quality of services, and support your team as they push production code that impacts hundreds of Symbl customers.
  • Spend time with engineering managers and developers to create and deliver critical new products and/or features that empower them to introduce change with quality and speed.
  • Make sure to connect with your team, both local and remote, to ensure they are delivering on engineering and operational excellence.

Job Location: Anywhere – currently WFH due to COVID

Compensation, Perks, and Differentiators:

  • Healthcare
  • Unlimited PTO
  • Paid sick days
  • Paid holidays
  • Flexi working
  • Continuing education
  • Equity and performance-based pay options
  • Rewards & Recognition
  • As our company evolves, so do our benefits. We’re actively innovating how we support our employees.
It's an OTT platform


Agency job
via Vmultiply solutions by HR Lakshmi
Hyderabad
6 - 8 yrs
₹8L - ₹15L / yr
Big Data
Apache Kafka
Kibana
Elastic Search
Logstash
Passionate data engineer with the ability to manage data coming from different sources.
Should design and operate data pipelines.
Build and manage an analytics platform using Elasticsearch, Redshift, and MongoDB.
Strong programming fundamentals in data structures and algorithms.
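Designing and operating a data pipeline as described is, at its core, composing extract, transform, and load stages. A generator-based sketch follows (the in-memory source and sink stand in for Kafka, Elasticsearch, or Redshift; both are illustrative assumptions):

```python
def extract(records):
    # Source stage: in a real pipeline this would read from Kafka or S3.
    yield from records

def transform(stream):
    # Normalize and filter; drop events that are missing a user id.
    for event in stream:
        if event.get("user"):
            yield {"user": event["user"].lower(),
                   "action": event.get("action", "view")}

def load(stream, sink):
    # Sink stage: stand-in for an Elasticsearch/Redshift bulk writer.
    for event in stream:
        sink.append(event)
    return sink

raw = [{"user": "Alice", "action": "click"}, {"user": None}, {"user": "BOB"}]
index = load(transform(extract(raw)), sink=[])
```

Because the stages are generators, records stream through one at a time, which is the same back-pressure-friendly shape a streaming pipeline needs at scale.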
Enterprise Artificial Intelligence


Agency job
via Purple Hirez by Aditya K
Hyderabad
5 - 12 yrs
₹10L - ₹35L / yr
Analytics
Kubernetes
Apache Kafka
Data Analytics
Python
+3 more
  • 3+ years of industry experience in administering (including setting up, managing, and monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka, the ELK Stack, and Fluentd, and streaming databases like Druid
  • Strong industry expertise with containerization technologies including Kubernetes and docker-compose
  • 2+ years of industry experience in developing scalable data ingestion processes and ETLs
  • Experience with cloud platform services such as AWS, Azure, or GCP, especially with EKS and Managed Kafka
  • Experience with scripting languages; Python experience highly desirable
  • 2+ years of industry experience in Python
  • Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
  • Demonstrated expertise of building cloud native applications
  • Experience in administering (including setting up, managing, monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka, ELK Stack, Fluentd
  • Experience in API development using Swagger
  • Strong expertise with containerization technologies including kubernetes, docker-compose
  • Experience with cloud platform services such as AWS, Azure or GCP.
  • Implementing automated testing platforms and unit tests
  • Proficient understanding of code versioning tools, such as Git
  • Familiarity with continuous integration, Jenkins
Responsibilities
  • Design and Implement Large scale data processing pipelines using Kafka, Fluentd and Druid
  • Assist in dev ops operations
  • Develop data ingestion processes and ETLs
  • Design and Implement APIs
  • Identify performance bottlenecks and bugs, and devise solutions to these problems
  • Help maintain code quality, organization, and documentation
  • Communicate with stakeholders regarding various aspects of solution.
  • Mentor team members on best practices
Remote, Bengaluru (Bangalore), Hyderabad, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 8 yrs
₹45L - ₹80L / yr
Engineering Management
Engineering manager
Engineering lead
VP of engineering
Engineering head
+8 more

We are looking for a bright and exceptional Engineering Lead to join one of our Fortune 500 clients in the Fintech industry, for its Bangalore / Gurgaon / Hyderabad based technology team. The role involves building a complex next-generation product used by the clients and architecting solutions to support new technical and business initiatives.

 

Responsibilities:

  • Active participation in design, development, delivery, and maintenance of software projects/products.
  • Help the team translate the business requirements into R&D tasks.
  • Work with product groups to outline project deliverables and help manage the roadmap of the R&D tasks.
  • Act as a point of contact for managing and driving production defects & stability improvements to resolution.
  • Tailor processes to help manage time-sensitive issues and bring them to appropriate closure.
  • Engage and lead a team of highly talented technologists.

 

Requirements:

  • A bachelor's degree in Computer Science with 6-9 years of experience; Fintech domain experience is a plus.
  • Demonstrated contribution to end-to-end delivery of enterprise-grade software.
  • Proficiency in Java or equivalent object-oriented languages, coupled with design and SOA.
  • Hands-on experience in enterprise application development, coupled with knowledge of frameworks & tools like/equivalent to Spring, MyBatis, Git, Gradle
  • Experience in at least one shell scripting language, SQL, and data modeling skills.
  • Experience in cloud technologies is a plus.
  • Strong technology acumen, knowledge of software engineering process, design knowledge and architecture intelligence.
  • Attention to detail and quality, and the ability to work well in and across teams.
  • Ability to advocate & influence multiple stakeholders.
  • Excellent analytical and reasoning skills.
  • Ability to learn new domains and deliver output.
  • Experience in leading a team of highly skilled engineers.
  • Strong communication skills.
  • The ideal candidate should have strong problem solving and analytical skills as well as a demonstrated passion for technology.
  • Excellent reasoning ability and good interpersonal and communication skills are essential for this role as it involves interaction with business users.
Fanatics, Inc.

Posted by Rakesh Akula
Hyderabad
9 - 12 yrs
₹30L - ₹40L / yr
Go Programming (Golang)
Python
Java
NodeJS (Node.js)
Kubernetes
+6 more
This group runs extremely high on continuous learning and shared education to avoid silos. To be most effective, you will want to have a solid grasp of engineering principles and a mature background in iterative product delivery.

On the team you will:
- Drive Site Reliability Engineering practices across the org.
- Design and develop new tools, dashboards and automations.
- Take ownership from building Proof of Concepts to Minimum Viable Products and continuously delivering enhancements thereafter.
- Create roadmaps and prioritize deliverables across initiatives.
- Technically lead a team of developers in the team.
- Collaborate with SROs, infra, cloud, platform, data and product teams.
- Have a keen eye on new practices/tools/technologies coming up in the industry that can help improve reliability of the infrastructure and services.

Requirements
- 9-13 years of design and development experience in one or more of Go, Python and Java.
- Experience in one or more of Kafka, RabbitMQ and Pulsar.
- Experience in one or more of Docker, Kubernetes and OpenShift.
- Experience in public cloud, preferably AWS.
- Experience in one or more of monitoring, observability, alerting and IFTTT tools.
- Experience with at least one automated unit testing framework.
- Experience in conducting research/experimentation/POCs on new technologies and tools.
- Knowledge of microservices and micro-frontend architectures.
- Knowledge on Site Reliability Engineering.
- Strong understanding of Agile/Scrum methodologies.
- Strong analytical and problem-solving skills.
- Strong written and verbal communication skills.

Nice to have
- Experience with open source tools/libraries in the areas of natural language understanding, forecasting and anomaly detection.
DataMetica

Posted by Nikita Aher
Pune, Hyderabad
3 - 12 yrs
₹5L - ₹25L / yr
Apache Kafka
Big Data
Hadoop
Apache Hive
Java
+1 more

Summary
Our Kafka developer has a combination of technical skills, communication skills, and business knowledge. The developer should be able to work on multiple medium to large projects. The successful candidate will have excellent technical skills in Apache/Confluent Kafka and enterprise data warehouses, preferably GCP BigQuery or an equivalent cloud EDW, and will also be able to take oral and written business requirements and develop efficient code to meet set deliverables.

 

Must Have Skills

  • Participate in the development, enhancement and maintenance of data applications both as an individual contributor and as a lead.
  • Leading in the identification, isolation, resolution and communication of problems within the production environment.
  • Leading developer applying technical skills in Apache/Confluent Kafka (preferred) or AWS Kinesis (optional), and a cloud enterprise data warehouse such as Google BigQuery (preferred), AWS Redshift, or Snowflake (optional)
  • Design and recommend the best approach for data movement from different sources to the cloud EDW using Apache/Confluent Kafka
  • Performs independent functional and technical analysis for major projects supporting several corporate initiatives.
  • Communicate and work with IT partners and the user community at various levels, from senior management to detailed developer to business SME, for project definition.
  • Works on multiple platforms and multiple projects concurrently.
  • Performs code and unit testing for complex scope modules, and projects
  • Provide expertise and hands on experience working on Kafka connect using schema registry in a very high volume environment (~900 Million messages)
  • Provide expertise in Kafka brokers, zookeepers, KSQL, KStream and Kafka Control center.
  • Provide expertise and hands on experience working on AvroConverters, JsonConverters, and StringConverters.
  • Provide expertise and hands on experience working on Kafka connectors such as MQ connectors, Elastic Search connectors, JDBC connectors, File stream connector,  JMS source connectors, Tasks, Workers, converters, Transforms.
  • Provide expertise and hands on experience on custom connectors using the Kafka core concepts and API.
  • Working knowledge on Kafka Rest proxy.
  • Ensure optimum performance, high availability and stability of solutions.
  • Create topics, setup redundancy cluster, deploy monitoring tools, alerts and has good knowledge of best practices.
  • Create stubs for producers, consumers and consumer groups for helping onboard applications from different languages/platforms.
  • Leverage Hadoop ecosystem knowledge to design and develop capabilities to deliver our solutions using Spark, Scala, Python, Hive, Kafka and other tools in the Hadoop ecosystem. 
  • Use automation tools like provisioning using Jenkins, Udeploy or relevant technologies
  • Ability to perform data related benchmarking, performance analysis and tuning.
  • Strong skills in In-memory applications, Database Design, Data Integration.
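The producer/consumer-group stubs mentioned above rely on one invariant: within a consumer group, each partition is owned by exactly one consumer. A minimal round-robin assignment sketch follows (an illustrative simplification of Kafka's rebalance protocol, not the real assignor API):

```python
def assign_partitions(partitions, consumers):
    """Round-robin assignment: spread partitions across group members so each
    partition has exactly one owner (Kafka's invariant within a group)."""
    assignment = {c: [] for c in consumers}
    for i, partition in enumerate(sorted(partitions)):
        owner = consumers[i % len(consumers)]
        assignment[owner].append(partition)
    return assignment

group = assign_partitions(partitions=[0, 1, 2, 3, 4, 5],
                          consumers=["c1", "c2", "c3"])
```

When a consumer joins or leaves, Kafka recomputes such an assignment during a rebalance, which is why onboarding stubs are usually written per consumer group.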
Mphasis
Agency job
via Apical Mind by Madhusudan Patade
Bengaluru (Bangalore), Hyderabad, Noida, NCR (Delhi | Gurgaon | Noida)
4 - 8 yrs
₹7L - ₹15L / yr
Python
SQL
MongoDB
Relational Database (RDBMS)
Jenkins
+5 more

Developer (SQL & Python)
Required Technical Skills

  • At least 5 years’ experience as a software engineer
  • Experience in Python is a must
  • Experience working with relational/non-relational databases and understanding of storage technologies (like MySQL, Sybase, MongoDB, InfluxDB, Cassandra or HBase)
  • Experience / Familiarity with Database Modelling, Normalization techniques
  • Experience / Familiarity with object-oriented design patterns
  • Ability to troubleshoot and fix performance issues across the codebase and database queries

Preferred Skills

  • BA/BS in Computer Science or equivalent practical experience
  • Experience with dev ops tools like Git, Maven, Jenkins, Gitlab CI, Azure DevOps
  • Experience with Agile development concepts and related tools
  • Experience with Python frameworks utilizing Asyncio
  • Experience working with large volumes of time series data and building services, APIs and applications based of it
  • Experience in designing multi-tier application architecture and distributed caching solutions
  • Experience with Perl
  • ETL background in any language or tools
  • Experience with cloud technologies like Kubernetes, Docker, OpenStack and Kafka
  • Experience with web technologies like Angular 2+ (or React/Vue), TypeScript, RxJS
  • Experience with Go
  • Experience in the finance industry and knowledge of financial products/markets
A Top Level 5 Services MNC


Agency job
Bengaluru (Bangalore), Hyderabad, Gurugram
4 - 10 yrs
₹4L - ₹25L / yr
Java
Apache Kafka
Kubernetes
React.js

Software Developer:

  • Strong experience in Java and Python
  • Experience with Kafka, Kubernetes, and React
  • Must have experience with at least one database
  • Location: Bangalore and Hyderabad
  • Experience: 5-10 years
digital india payments limited
Bhavani Pendyala
Posted by Bhavani Pendyala
Chennai, Hyderabad
3 - 7 yrs
₹4L - ₹12L / yr
skill iconJava
skill iconNodeJS (Node.js)
Fullstack Developer
skill iconReact.js
skill iconRedux/Flux
+13 more
Technology Requirements:
  1. Extensive experience in JavaScript / Node.js on the back end
  2. Front-end frameworks such as Bootstrap, Pug, jQuery
  3. Experience in web frameworks like ExpressJS, Webpack
  4. Experience in Nginx, Redis, Apache Kafka and MQTT
  5. Experience with MongoDB
  6. Experience with version control systems like Git / Mercurial
  7. Sound knowledge of software engineering best practices
  8. Sound knowledge of RESTful API design
  9. Working knowledge of automated testing tools
  10. Experience in maintaining production servers (optional)
  11. Experience with Azure DevOps (optional)
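RESTful API design (point 8) is language-agnostic even though this stack is Node.js/Express; the sketch below uses Python purely for illustration. The core idea: URLs name resources, HTTP verbs name actions. The payment routes and descriptions are invented for the example.

```python
# Resource-oriented route table: (HTTP verb, URL template) -> action.
ROUTES = {
    ("GET",    "/payments"):      "list all payments",
    ("POST",   "/payments"):      "create a payment",
    ("GET",    "/payments/{id}"): "fetch one payment",
    ("PUT",    "/payments/{id}"): "replace a payment",
    ("DELETE", "/payments/{id}"): "cancel a payment",
}

def describe(method: str, path_template: str) -> str:
    # Unknown (verb, resource) pairs map to a 404-style response.
    return ROUTES.get((method, path_template), "404: no such route")

print(describe("POST", "/payments"))  # create a payment
```

In Express the same table becomes `app.get('/payments', ...)`, `app.post('/payments', ...)`, and so on; the design discipline is identical.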
Soft Skills:
  1. Experience in the digital payments or financial services industry is a plus.
  2. Participate in strategic project-planning meetings.
  3. Be involved in the overall application lifecycle.
  4. Collaborate with external development teams.
  5. Define and communicate technical and design requirements, understand workflows, and write code to match.
  6. Develop functional and sustainable web applications with clean code.
  7. Focus on coding and debugging.
Backflipt software

Agency job
via StagInnovations Pvt Ltd by Rajasekhar Gupta
Hyderabad
3 - 6 yrs
₹6L - ₹15L / yr
skill iconJava
skill iconJavascript
Fullstack Developer
skill iconKotlin
Spring
+1 more
  Full Stack Developer (Java/JavaScript): 3 to 6 years of experience
  • Experience in designing multithreaded/concurrent/distributed systems.
  • Experience in Java/Kotlin or JavaScript is required (3+ years).
  • Experience in working on scalable, non-blocking server-side frameworks like the Spring stack (2+ years)
  • Experience with both SQL and NoSQL databases and message brokers, e.g. Postgres/MySQL, MongoDB/DynamoDB/Redis/Neo4j, Kafka (2+ years)
  • Experience with front-end languages and frameworks: JavaScript, with expertise in at least one front-end framework (e.g. React) (2+ years)
  • Understanding of cloud technologies, with applications or products deployed on popular cloud platforms like AWS, GCP, or MS Azure (2+ years).
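The message-broker requirement above (Kafka and friends) boils down to the publish/consume pattern, sketched here with an in-process stdlib queue; a real broker adds persistence, partitioning, and consumer groups on top of this idea. The event names are invented for the example.

```python
import queue
import threading

broker: "queue.Queue[str | None]" = queue.Queue()
received: list[str] = []

def consumer() -> None:
    # Consume until the producer signals completion with a None sentinel.
    while True:
        msg = broker.get()
        if msg is None:
            break
        received.append(msg.upper())   # stand-in for real message processing

t = threading.Thread(target=consumer)
t.start()
for event in ["user.signup", "order.placed"]:
    broker.put(event)                  # "publish"
broker.put(None)
t.join()
print(received)                        # ['USER.SIGNUP', 'ORDER.PLACED']
```

The producer and consumer are decoupled through the queue exactly as services are decoupled through a Kafka topic, just without durability or fan-out.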

Optional:

    • Experience with Big Data is a big plus
    • Knowledge of functional programming principles is a plus
    • Experience in HTML5 and CSS3
Enquero Global LLP

3 recruiters
Ankit Chaurasia
Posted by Ankit Chaurasia
Bengaluru (Bangalore), Hyderabad
8 - 11 yrs
₹10L - ₹40L / yr
Technical Architecture
skill iconJava
Microservices
SQL
skill iconElastic Search
+5 more

The primary responsibilities include:

  • Responsible for the overall software development lifecycle.
  • Management and execution against project plans and delivery commitments
  • Drive effective, mature Agile practices and continuous execution improvements within your teams.
  • Manage stakeholder planning and communications ensuring key outcomes
  • Recruit, coach, and mentor the best engineering and management talent
  • Build, coach, and manage a strong team of engineers that set the standard and can up-level the overall talent of the extended organization.
  • Provide a strong understanding of native mobile developer platforms and bring broad thought leadership to the next generation mobile developer experience.
  • Anticipate and aggressively remove obstacles that slow down or prevent products and programs from delivering on product and program objectives.
  • Balance urgent and effective action, commitment to excellence, and taking the initiative to resolve problems; hold internal stakeholders accountable where appropriate.

REQUIRED

  • 8+ years of industry experience with 2+ years of senior leadership experience.
  • Solid track record of over-achieving engineering and platform delivery and scaling targets in high-volume, innovative, fast-paced, high-pressure environments; proven results delivering platform products.
  • Deep understanding of one or more of Java/Python/Scala, with the ability to understand and critique core library/language constructs, and skilled knowledge of UI technologies such as Angular 2+, React, and D3.js
  • Working experience with Agile methodologies and durable team concepts.
  • Knowledge in DevOps practices and tools.
  • Knowledge in Cloud Technologies, CI/CD, Jenkins, Testing methodologies is preferred.
  • Experience in server-side services using ElasticSearch, Kafka
  • A strong track record of project delivery for large, cross-functional, projects and bringing in and growing engineering talent
  • Excellent written and verbal communication skills with the ability to present complex technical information clearly and concisely to a variety of audiences.
  • An entrepreneurial spirit combined with strong programming and product management skills
DelaPlex Software

2 recruiters
Sunil Kandukuri
Posted by Sunil Kandukuri
Pune, Nagpur, Bengaluru (Bangalore), Hyderabad
4 - 7 yrs
₹4L - ₹8L / yr
skill iconJava
Spring
skill iconSpring Boot
NOSQL Databases
DynamoDB
+4 more

Role: Java developer
Experience: 4+ years

Job description

○ Working experience with Java and Spring Boot, including building web services

○ NoSQL (DynamoDB) knowledge is a plus

○ Working experience in building microservices and distributed systems

○ Working experience with message queues (RabbitMQ/Kafka) is a plus

Online ENT Healthcare giant in India

Agency job
via The Hub by Sridevi Viswanathan
Remote, Bengaluru (Bangalore), Chennai, Hyderabad, Mumbai, Pune
3 - 8 yrs
₹5L - ₹17L / yr
skill iconJava
skill iconSpring Boot
Apache Kafka
MySQL
java
+1 more

Software Development Engineer:

Major Responsibilities:

  • Translate complex functional requirements into technical requirements, implementing and maintaining a coherent and progressive development strategy for our product line
  • Design, develop and maintain complex systems using best-of-breed development practices and technology.
  • Responsible for the overall software development life cycle.
  • Delivery of High Quality, Scalable and Extensible systems and applications on-time and on-budget.
  • Adoption and Evolution of the software engineering practices and tools within the organization
  • Keep in sync with the latest technology developments and open source offerings. Evaluate and adopt them for solving business problem of organization.
  • Collaborate with other technology and business teams within the organization to provide efficient robust solutions to the problems.
  • Drive and manage the bug triage process
  • Report on status of product delivery and quality to management, customer support and product teams.

Desired Skills

  • Strong programming, debugging, and problem-solving skills
  • Strong understanding of data structures and algorithms
  • Sound understanding of object-oriented programming and excellent software design skills.
  • Good experience with SOA/microservices/RESTful services and development of N-tier J2EE / Java Spring Boot applications (APIs).
  • Strong understanding of database design and SQL (MySQL/MariaDB) development
  • Good to have knowledge of NoSQL technologies like MongoDB, Solr, Redis, Cassandra or any other NoSQL database
  • Knowledge of design patterns and good to have experience of large-scale applications
  • Should have experience with Apache Kafka, RabbitMQ or other queueing systems.

Ideal Experience

  • 3 to 8 years of industry experience.
  • Bachelor’s or Master’s degree in Computer Science / IT
  • Drive discussions to create/improve product, process and technology
  • Provide end to end solution and design details
  • Lead development of formalized solution methodologies
  • Passion to work in a startup-like environment

Personal Characteristics

  • Passion and commitment
  • Strong and excellent software design intellect
  • High integrity
  • Self-starter
Hyderabad
4 - 8 yrs
₹12L - ₹24L / yr
skill iconNodeJS (Node.js)
Microservices
skill iconJavascript
skill iconReact.js
skill iconMongoDB
+2 more

The ideal candidate is a self-motivated multi-tasker and a demonstrated team player. You will be a lead developer responsible for the development of new software products and enhancements to existing products. You should excel in working with large-scale applications and frameworks and have outstanding communication and leadership skills.

 

Responsibilities

  • Writing clean, high-quality, high-performance, maintainable code
  • Develop and support software including applications, database integration, interfaces, and new functionality enhancements
  • Coordinate cross-functionally to ensure the project meets business objectives and compliance standards
  • Support test and deployment of new products and features
  • Participate in code reviews

 

Qualifications

  • 5+ years of relevant work experience
  • Mandatory experience in building scalable microservices on the Node.js platform
  • Expertise in Object Oriented Design, Database Design, Service architecture
  • Experience with Agile or Scrum software development methodologies
  • Ability to multi-task, organize, and prioritize work
Listed in fortune 100 fastest growing tech companies in 2019

Agency job
via Purple Hirez by Aditya K
Hyderabad
5 - 12 yrs
₹5L - ₹29L / yr
skill iconJava
Apache Kafka
Microservices
J2EE
skill iconSpring Boot
+1 more

Technical specifications/Skill Set:

  • Minimum of 5+ years of significant experience in application development.
  • Proficient with software development lifecycle (SDLC) methodologies like Agile and test-driven development
  • Knowledge of system architecture, object-oriented design, and design patterns.
  • Required technical skills: strong Core Java, J2EE, Spring Boot, Akka, API development & distributed application development experience.
  • Desirable technical skills: microservices patterns, Kafka, Knative Eventing, Camel-K, container technologies like Docker and Kubernetes, NoSQL (preferably Cassandra).
  • Experience working with high-volume data and computationally intensive systems.
  • Domain knowledge in Financial Industry and Capital Markets is a plus.
  • Excellent communication skills are essential, with strong verbal and writing proficiencies.
Arcesium

Agency job
Remote, Hyderabad, Bengaluru (Bangalore)
8 - 14 yrs
₹1L - ₹80L / yr
skill iconKubernetes
Apache Kafka
Distributed Systems

We are looking for a bright and exceptional Engineering Manager to join our Hyderabad-based technology team. The role involves building a complex next-generation product used by our clients and architecting solutions to support new technical and business initiatives.

What you’ll do:

  • Manage, design, execute and take complete responsibility for the delivery and maintenance of software projects/products
  • Help the team translate the business requirements into R&D tasks
  • Work with business groups to outline project deliverables and manage the roadmap of the R&D tasks
  • Work with Technical Relationship Managers to understand client-initiated R&D requests
  • Act as a point of contact for managing and driving production defects to resolution
  • Tailor processes to help manage time-sensitive issues and bring them to appropriate closure
  • Engage and manage a team of highly talented technologists, and help them grow professionally with regular mentoring

What you’ll need:

  • A bachelor’s degree in Computer Science with 8+ years of experience; fintech domain experience is a plus
  • Demonstrated track record of end-to-end delivery of enterprise-grade software
  • Strong technology acumen, knowledge of software engineering processes, design knowledge and architecture intelligence
  • Superior project management skills to ensure high-quality and timely solution delivery
  • Attention to detail and quality, and the ability to work well in and across teams
  • Ability to advocate for & influence multiple stakeholders
  • Excellent analytical and reasoning skills
  • Ability to learn new domains and deliver output
  • Experience leading a team of highly skilled engineers
  • Strong communication skills

Members of the Arcesium Company Group do not discriminate in employment matters on the basis of sex, race, colour, caste, creed, religion, pregnancy, national origin, age, military service eligibility, veteran status, sexual orientation, marital status, disability, or any other protected class.

 

Thinkdeeply

5 recruiters
Aditya Kanchiraju
Posted by Aditya Kanchiraju
Hyderabad
6 - 16 yrs
₹7L - ₹26L / yr
skill iconJava
Technical Architecture
Analytics
skill iconSpring Boot
Apache Kafka
+4 more

We are looking for an experienced engineer with superb technical skills. You will primarily be responsible for architecting and building large-scale data pipelines that deliver AI and analytical solutions to our customers. The right candidate will enthusiastically take ownership of developing and managing continuously improving, robust, scalable software solutions. The successful candidate will be curious, creative, ambitious, self-motivated, flexible, and have a bias towards taking action. As part of the early engineering team, you will have the chance to make a measurable impact on the future of Thinkdeeply, as well as a significant amount of responsibility.

 

Although your primary responsibilities will be around back-end work, we prize individuals who are willing to step in and contribute to other areas including automation, tooling, and management applications. Experience with or desire to learn Machine Learning a plus.

 

Experience

12+ Years

 

Location

Hyderabad

 

Skills

Bachelor’s/Master’s/PhD in CS or equivalent industry experience

10+ years of industry experience with Java-related frameworks such as Spring and/or Typesafe

Experience with scripting languages; Python highly desirable (5+ years of industry experience in Python)

Experience with popular modern web frameworks such as Spring boot, Play framework, or Django

Demonstrated expertise of building and shipping cloud native applications

Experience in administering (including setting up, managing, monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka, ELK Stack, Fluentd

Experience in API development using Swagger

Strong expertise with containerization technologies, including Kubernetes and Docker Compose

Experience with cloud platform services such as AWS, Azure or GCP.

Implementing automated testing platforms and unit tests

Proficient understanding of code versioning tools, such as Git

Familiarity with continuous integration tools such as Jenkins
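The streaming-vs-batch distinction in the pipeline bullet above can be sketched in miniature: when each stage is a Python generator, records flow through one at a time instead of being materialized as a whole batch, which is the single-process analogue of what Kafka-based streaming pipelines do across machines. The log format here is invented for the example.

```python
def parse(lines):
    # Stage 1: turn raw log lines into records, lazily.
    for line in lines:
        level, _, message = line.partition(" ")
        yield {"level": level, "message": message}

def only_errors(records):
    # Stage 2: filter, also lazily; nothing runs until consumed.
    return (r for r in records if r["level"] == "ERROR")

log = ["INFO started", "ERROR disk full", "INFO done", "ERROR timeout"]
errors = [r["message"] for r in only_errors(parse(log))]
print(errors)  # ['disk full', 'timeout']
```

A batch version of the same pipeline would build full intermediate lists between stages; the streaming form keeps memory constant regardless of log size.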

 

Responsibilities

 

Architect, Design and Implement Large scale data processing pipelines

Design and Implement APIs

Assist in dev ops operations

Identify performance bottlenecks and bugs, and devise solutions to these problems

Help maintain code quality, organization, and documentation

Communicate with stakeholders regarding various aspects of the solution.

Mentor team members on best practices

 

UpX Academy

2 recruiters
Suchit Majumdar
Posted by Suchit Majumdar
Noida, Hyderabad, NCR (Delhi | Gurgaon | Noida)
2 - 6 yrs
₹4L - ₹12L / yr
Spark
Hadoop
skill iconMongoDB
skill iconPython
skill iconScala
+3 more
Looking for a technically sound and excellent trainer in big data technologies. Get an opportunity to become well known in the industry and gain visibility. Host regular sessions on big data technologies and get paid to learn.