

Title: Senior Java Developer
Top Skills: Java, Spring Boot, Microservices, MySQL, PL/SQL, NoSQL
Work Mode: Full time
Location: Teynampet, Chennai, Tamil Nadu
Employer: Innovating digital experiences with cutting-edge technology, design, and user-focused solutions for global brands
We are seeking a highly skilled and experienced Sr. Java Developer with a proven track record in
delivering high-quality applications. If you're passionate about building modern, scalable, and
robust software solutions, we want to hear from you!
Required Qualifications
- 7+ years of software development experience with a focus on delivering robust and efficient solutions.
- Strong proficiency in Java, including its latest versions, and expertise in developing Java-based applications.
- Hands-on experience with relational databases and proficiency in writing complex SQL queries.
- Familiarity with Enterprise Java (J2EE / Java EE / Spring) application architectures.
- Demonstrated history of delivering modern, high-cadence applications using:
  - Agile methodologies and test-driven development (TDD).
  - CI/CD pipelines for streamlined deployment and updates.
  - Git version control for collaborative development.
Job Description
Java Developer - Transformational Product Experiences.
Preferred Skills
- Knowledge of microservices architectures and Domain-Driven Design (DDD).
- Working knowledge of Python and JavaScript/Node.js.
- Experience with Object-Relational Mappers (ORMs) such as Hibernate or JPA.
- Strong problem-solving skills.
Bonus Skills
- Exposure to emerging Generative AI technologies and tools like the OpenAI APIs and GPT large language models (LLMs).
- Practical experience with MongoDB or other NoSQL databases, as well as MySQL.
Role & Responsibilities:
- Handle multiple products/modules simultaneously, lead the team in development and support
- Lead development teams
- Provide daily updates to customers and Product Managers/Delivery Managers
- Participate in design discussions
- Design and architect small modules
- Provide effort estimates for modules, subsystems, or use cases
- Learn new skills and mentor new team members
- Champion best practices within the team, including code reviews
Work Experience & Skills:
- B.E./B.Tech. or a relevant master's degree from a reputed college.
- 5+ years of strong experience in Java, J2EE, Spring IoC, Spring annotations, Spring JDBC, jQuery, JavaScript, HTML5, and CSS
- Strong experience in Postgres or any SQL technology
- Experience working with Tomcat, Apache, JBoss, or any similar application server
- Should have 3+ years of experience in leading a team.
- Good experience with JSON / RESTful APIs / web services.
- Experience working with Agile methodology using Scrum and sprints
- Experience using tools such as Maven, Git, Bugzilla, or similar.
- Experience in cloud technologies and microservices, with platforms such as AWS or GCP and container technologies like Docker, is a plus.
- Ability to adapt to new development environments, changing business requirements and learning new systems highly desired
- Experience in Microservices, Spring Boot, and Angular is a plus.
- Experience with tuning deployed applications for scalability and performance.
- Good knowledge of deployment and scripting on Linux/Unix servers
- Strong technical documentation skills.
- Good oral and written communication skills.
Responsibilities:
- Provide technical leadership for critical MuleSoft integrations, primarily with a contact-center solution.
- Provide MuleSoft technical expertise and leadership when evaluating and designing integration solutions, ensuring all impacted components and subsystems are properly addressed during builds and deployments.
- Collaborate cross-functionally with teammates to implement integration solutions.
- Troubleshoot MuleSoft/API technical issues as needed.
Qualifications
- Bachelor's Degree required. In lieu of a degree, a comparable combination of education and experience may be considered.
- 3+ years of experience in building scalable, highly available, distributed solutions and services
- 1+ years of experience in middleware technologies, i.e., Enterprise Service Bus (ESB), preferably with MuleSoft CloudHub, and with orchestration, routing, and transformation
- 3+ years of experience working with Java
- Experience in RESTful API architectures, specifications and implementations
- Working knowledge of progressive development processes like Scrum, XP, Kanban, TDD, BDD, and continuous delivery
- Conceptual understanding of Google Cloud Platform is a major plus
- Experience working with SQL databases, query optimisation, and designing schemas
- High coding standards: understanding of test-coverage best practices and the test pyramid concept
- Design, analyze, code, test, and deploy applications to satisfy business requirements for large, complex projects
- Bachelor's or master's degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field. At least 3 years of relevant experience may be considered in lieu of the above if from a different stream of education.
- Well-versed in, with 3+ years of hands-on, demonstrable experience with:
  - Stream and batch Big Data pipeline processing using Apache Spark and/or Apache Flink.
  - Distributed cloud-native computing, including serverless functions.
  - Relational, object store, document, graph, etc. database design and implementation.
  - Microservices architecture, API modeling, design, and programming.
- 3+ years of hands-on development experience in Apache Spark using Scala and/or Java.
- Ability to write executable code for services using Spark RDD, Spark SQL, Structured Streaming, Spark MLlib, etc., with a deep technical understanding of the Spark processing framework.
- In-depth knowledge of standard programming languages such as Scala and/or Java.
- 3+ years of hands-on development experience in one or more libraries and frameworks such as Apache Kafka, Akka, Apache Storm, Apache NiFi, ZooKeeper, and the Hadoop ecosystem (i.e., HDFS, YARN, MapReduce, Oozie, and Hive); extra points if you can demonstrate your knowledge with working examples.
- 3+ years of hands-on development experience in one or more relational and NoSQL datastores such as PostgreSQL, Cassandra, HBase, MongoDB, DynamoDB, Elasticsearch, and Neo4j.
- Practical knowledge of distributed systems involving partitioning, bucketing, the CAP theorem, replication, horizontal scaling, etc.
- Passion for distilling large volumes of data and analyzing performance, scalability, and capacity issues in Big Data platforms.
- Ability to clearly distinguish between system and Spark job performance, and to perform Spark performance tuning and resource optimization.
- Perform benchmarking/stress tests and document best practices for different applications.
- Proactively work with tenants on improving overall performance and ensure the system is resilient and scalable.
- Good understanding of virtualization and containerization; must demonstrate experience in technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox, Vagrant, etc.
- Well-versed, with demonstrable working experience, in API management, API gateways, service mesh, identity and access management, and data protection and encryption.
- Hands-on, demonstrable working experience with DevOps tools and platforms, viz. Jira, Git, Jenkins, code quality and security plugins, Maven, Artifactory, Terraform, Ansible/Chef/Puppet, Spinnaker, etc.
- Well-versed in AWS and/or Azure and/or Google Cloud; must demonstrate experience in at least FIVE (5) services offered under AWS, Azure, or Google Cloud in any of these categories: Compute, Storage, Database, Networking & Content Delivery, Management & Governance, Analytics, or Security, Identity, & Compliance (or equivalent demonstrable cloud platform experience).
- Good understanding of storage, networking, and storage-networking basics, which will enable you to work in a cloud environment.
- Good understanding of network, data, and application security basics, which will enable you to work in a cloud as well as a business applications / API services environment.
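The distributed-systems concepts the list above names (partitioning, replication, horizontal scaling) can be shown in miniature. The sketch below is purely illustrative, a hypothetical hash-partitioning and replica-placement scheme of the kind Kafka or Cassandra use internally; the function names and node labels are invented for the example:

```python
import hashlib

def partition_for(key: str, num_partitions: int) -> int:
    """Map a record key to a partition deterministically, as
    partitioned stores such as Kafka or Cassandra do."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % num_partitions

def replicas_for(key: str, nodes: list[str], replication_factor: int) -> list[str]:
    """Pick the primary node plus (RF - 1) successor nodes, so each
    key survives the loss of up to RF - 1 nodes."""
    primary = partition_for(key, len(nodes))
    return [nodes[(primary + i) % len(nodes)] for i in range(replication_factor)]

nodes = ["node-a", "node-b", "node-c", "node-d"]
owners = replicas_for("sensor-42", nodes, replication_factor=3)
print(owners)  # three distinct nodes each hold a copy of this key
```

Adding a node changes `len(nodes)` and therefore remaps most keys; production systems avoid that with consistent hashing or virtual nodes, which this sketch deliberately omits.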

ThoughtWorks is a global software consultancy with an aim to create a positive impact on the world through technology. Our community of technologists thinks disruptively to deliver pragmatic solutions for our clients' most complex challenges. We are curious minds who come together as collaborative and inclusive teams to push boundaries, free to be ourselves and make our mark in tech.
Our developers have been contributing code to major organizations and open source projects for over 25 years. They’ve also been writing books, speaking at conferences and helping push software development forward, changing companies and even industries along the way. We passionately believe that software quality is driven by open communication, review and collaboration. That’s why we’re such vehement supporters of open source and have made significant contributions to open source tools for testing, continuous delivery (GoCD), continuous integration (CruiseControl), machine learning and healthcare.
As consultants, we work with our clients (https://www.thoughtworks.com/careers/hub/consultant-life) to ensure we’re evolving their technology and empowering adaptive mindsets to meet their business goals. You could influence the digital strategy of a retail giant, build a bold new mobile application for a bank or redesign platforms using event sourcing and intelligent data pipelines. You will use the latest Lean and Agile thinking, create pragmatic solutions to solve mission-critical problems and challenge yourself every day.
You’ll spend time on the following:
- You will champion best practices like writing clean and reusable code using practices like TDD, SOLID principles, OO design, and pair programming
- You will partner with other technologists from cross-functional teams advocating devops culture
- You will work in collaborative, product-focused teams to build innovative customer experiences
- Take ownership and accountability beyond individual deliverables, always pushing the envelope in order to deliver awesome results for our clients
- Learn, digest and subsequently apply the latest technology thinking from our tech radar (https://www.thoughtworks.com/radar) to solve client problems
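The TDD practice mentioned above can be illustrated with a minimal test-first sketch. The `slugify` helper is hypothetical, invented for this example, not code from any of these employers:

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a URL-friendly slug."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.strip().lower())
    return slug.strip("-")

# In TDD these checks are written first (red), then the function
# above is implemented until they pass (green), then refactored.
assert slugify("Hello World") == "hello-world"
assert slugify("  TDD: Red, Green, Refactor!  ") == "tdd-red-green-refactor"
```

In practice the assertions would live in a test framework such as JUnit or pytest, and each new behavior starts life as a failing test.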
Here’s what we’re looking for:
- You have 2+ years* of experience using two or more development languages (Java, JavaScript, Ruby, C#, etc.) with experience in Object-Oriented programming
- You’re willing and able to commit to traveling up to 100% (back home on the weekends) across the US and Canada to help our clients solve their business problems
- You can write clean, high-quality code in a variety of languages and are also able to spot (and improve) bad code
- You’re resilient in ambiguous situations and can approach challenges from multiple perspectives
- You have experience with Agile, Lean and/or Continuous Delivery approaches such as Continuous Integration, TDD, Infrastructure as Code, etc.
- Bonus points if you have knowledge of cloud technology such as AWS, Docker or Kubernetes
- You’re willing and able to commit to travel to client sites in order to solve their business problems
• Build data pipelines for structured/unstructured, real-time/batch, and event-driven synchronous/asynchronous data using MQ and Kafka, with stream processing in Java / Python
• Design data stores for Big Data systems with expertise in Cassandra and HBase
• Implement indexing and search using Elasticsearch
• Set up and deploy Cassandra and Elasticsearch clusters
Required Qualifications and Competencies:
• Strong hands-on experience with Cassandra: data modeling, data replication, clustering, and indexing for handling large data sets
• Experience with SQL, NoSQL, relational database design, and methods for efficiently retrieving data for time-series analytics
• Strong understanding of CQL and data modeling in order to achieve highly performant data access
• Strong experience in data modeling in Cassandra to design efficient storage models that meet a variety of business needs
• Strong Elasticsearch skills, with significant experience working with large Elasticsearch clusters: cluster performance optimisation, capacity planning, enhancing monitoring capabilities for early issue detection, and driving operational readiness and ongoing maintenance
• Strong hands-on programming experience with Java / Python
• Ability to troubleshoot and investigate stability and performance issues
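The Cassandra time-series data modeling mentioned above usually comes down to choosing a partition key that bounds partition size. A common pattern, shown here as a hypothetical stdlib sketch (the `sensor_id`/`day_bucket` names are invented for illustration), is to bucket rows by day:

```python
from datetime import datetime, timezone

def partition_key(sensor_id: str, ts: datetime) -> tuple[str, str]:
    """Compose a Cassandra-style composite partition key for
    time-series data.  Bucketing by day keeps any single partition
    bounded in size while a day's readings still live together,
    so one day's query reads one contiguous partition."""
    day_bucket = ts.strftime("%Y-%m-%d")
    return (sensor_id, day_bucket)

reading_time = datetime(2024, 5, 17, 9, 30, tzinfo=timezone.utc)
print(partition_key("sensor-42", reading_time))  # ('sensor-42', '2024-05-17')
```

In CQL this corresponds roughly to `PRIMARY KEY ((sensor_id, day_bucket), ts)`, with the timestamp as the clustering column ordering rows within each partition.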



Job Description:
Hiring a Staff Engineer (Backend) for a leading product-based company at DLF IT Park, Chennai.
Skill Set:
- Strong experience in any programming language (Ruby, Go, Java, or other high-performance languages), architecture, design (HLD/LLD), data structures, algorithms, hands-on coding, problem solving, etc.
- Experience in web technology is a must.
- Looking for candidates with good experience in product development.
- Candidates from product development companies will be preferred.
- Candidates willing to relocate to Chennai, or preferring it, can apply.
Responsibilities:
- Analyze and drive product requirements
- Architect and design product features for scale and maintainability
- Lead in the design, implementation, and deployment of successful systems and services
- Ensure the quality of architecture and design of systems
- Implement code with very high coverage of unit tests and component tests
- Perform design and code reviews
- Functionally decompose complex problems into simple, straightforward solutions
- Fully understand system interdependencies and limitations
- Possess expert knowledge in performance, security, scalability, architecture, and best practices
- Software development of high quality/availability core systems
- Cross-training peers and mentoring teammates
- Document HLD/LLD for easy knowledge sharing and future scaling
Must have:
- 8-12 years of experience designing, integrating and developing distributed applications in Ruby, Go, Java, or other high-performance languages
- Experience with cluster and container orchestration systems such as Docker, Mesos, Marathon, Salt or Kubernetes.
- Experience with Service design, systems engineering, API Design and versioning
- Understanding of Design Patterns, Serverless computing, cloud-first architecture, TDD, BDD, CI/CD, Integration Patterns
Good to have:
- Experience building distributed systems using Kafka. Strong grasp of fundamental concepts of Kafka, ZooKeeper and building producer and consumer applications using Kafka
- Familiarity writing and optimizing advanced SQL queries
- Good Linux/UNIX systems knowledge
- Experience with AWS compute and storage PaaS services; AWS Certified Solutions Architect certification is nice to have.
- Experience productionizing Machine Learning models
- Experience publishing technical papers at reputed conferences.
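The Kafka producer/consumer pattern named in the good-to-have list needs a running broker to demonstrate for real, but its core idea, producers and consumers decoupled through a topic, can be mimicked with the standard library. Everything here (the `topic` queue, the message names, the sentinel) is a toy stand-in, not the Kafka client API:

```python
from queue import Queue
from threading import Thread

# A toy stand-in for a Kafka topic: producers append, a consumer
# drains.  Real Kafka adds partitions, offsets, consumer groups,
# and durable, replicated storage on brokers.
topic: Queue = Queue()

def producer(messages):
    for m in messages:
        topic.put(m)
    topic.put(None)  # sentinel: no more messages

def consumer(out):
    while True:
        m = topic.get()
        if m is None:
            break
        out.append(m)

received = []
worker = Thread(target=consumer, args=(received,))
worker.start()
producer(["order-created", "order-paid", "order-shipped"])
worker.join()
print(received)  # messages arrive in the order produced
```

The point of the pattern, in Kafka as in this sketch, is that the producer never blocks on or knows about the consumer; the topic absorbs the difference in their speeds.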

