11+ HDFS Jobs in Pune | HDFS Job openings in Pune
Apply to 11+ HDFS Jobs in Pune on CutShort.io. Explore the latest HDFS Job opportunities across top companies like Google, Amazon & Adobe.
Key Responsibilities:
- Design, build, and enhance Salesforce applications using Apex, Lightning Web Components (LWC), Visualforce, and SOQL.
- Implement integrations with external systems using REST APIs and event-driven messaging (e.g., Kafka); a minimal client-side sketch follows this list.
- Collaborate with architects and business analysts to translate requirements into scalable, maintainable solutions.
- Establish and follow engineering best practices, including source control (Git), code reviews, branching strategies, CI/CD pipelines, automated testing, and environment management.
- Establish and maintain Azure DevOps-based workflows (repos, pipelines, automated testing) for Salesforce engineering.
- Ensure solutions follow Salesforce security, data modeling, and performance guidelines.
- Participate in Agile ceremonies, providing technical expertise and leadership within sprints and releases.
- Optimize workflows, automations, and data processes across Sales Cloud, Service Cloud, and custom Salesforce apps.
- Provide technical mentoring and knowledge sharing when required.
- Support production environments, troubleshoot issues, and drive root-cause analysis for long-term reliability.
- Stay current on Salesforce platform updates, releases, and new features, recommending adoption where beneficial.
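To make the REST-integration expectation above concrete, the sketch below shows the external-system side of such an integration: creating a record through Salesforce's standard sObject REST endpoint using Java's built-in HTTP client. The org URL, API version, and access token are hypothetical placeholders; a real integration would obtain the token via an OAuth flow and add error handling and retries.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SalesforceRestDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical org URL and OAuth token; obtain both via your own auth flow.
        String instanceUrl = "https://example.my.salesforce.com";
        String accessToken = System.getenv("SF_ACCESS_TOKEN");

        // Create an Account record via the standard sObject REST endpoint.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(instanceUrl + "/services/data/v59.0/sobjects/Account"))
                .header("Authorization", "Bearer " + accessToken)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"Name\":\"Acme Pune\"}"))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // A 201 response returns the new record id in the JSON body.
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```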
Required Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent experience).
- 6+ years of Salesforce development experience with strong knowledge of Apex, Lightning Web Components, and Salesforce APIs.
- Proven experience with Salesforce core clouds (Sales Cloud, Service Cloud, or equivalent).
- Strong hands-on experience with API integrations (REST/SOAP) and event-driven architectures (Kafka, Pub/Sub).
- Solid understanding of engineering practices: Git-based source control (Salesforce DX/metadata), branching strategies, CI/CD, automated testing, and deployment management.
- Familiarity with Azure DevOps repositories and pipelines.
- Strong knowledge of Salesforce data modeling, security, and sharing rules.
- Excellent problem-solving skills and ability to collaborate across teams.
Preferred Qualifications:
- Salesforce Platform Developer II certification (or equivalent advanced credentials).
- Experience with Health Cloud, Financial Services Cloud, or other industry-specific Salesforce products.
- Experience implementing logging, monitoring, and observability within Salesforce and integrated systems.
- Background in Agile/Scrum delivery with strong collaboration skills.
- Prior experience establishing or enforcing engineering standards across Salesforce teams.
Role & Responsibilities:
- Lead the design, analysis, and implementation of technical solutions.
- Take full ownership of product features.
- Participate in detailed discussions with the product management team regarding requirements.
- Work closely with the engineering team to design and implement scalable solutions.
- Create detailed functional and technical specifications.
- Follow Test-Driven Development (TDD) and deliver high-quality code.
- Communicate proactively with your manager regarding risks and progress.
- Mentor junior team members and provide technical guidance.
- Troubleshoot and resolve production issues with root-cause analysis (RCA) and long-term solutions.
Required Skills & Experience:
- Bachelor's/Master's degree in Computer Science or a related field with a solid academic track record.
- 6+ years of hands-on experience in backend development for large-scale enterprise products.
- Strong programming skills in Java; familiarity with Python is a plus.
- Deep understanding of data structures, algorithms, and problem-solving.
- Proficient in Spring Boot and RESTful APIs (a minimal controller sketch follows this list).
- Experience with distributed data and messaging technologies such as Elasticsearch, Kafka, MongoDB, Hazelcast, and Ceph.
- Strong experience in building scalable, concurrent applications.
- Exposure to Service-Oriented Architecture (SOA) and Test-Driven Development (TDD).
- Excellent communication and collaboration skills.
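As a point of reference for the Spring Boot/REST expectation above, here is a minimal sketch of a Spring Boot REST endpoint. The class name and route are hypothetical, and a production service would delegate to service and repository layers rather than return a static payload.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

import java.util.Map;

@SpringBootApplication
public class DemoApplication {
    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}

// Hypothetical resource; shown only to illustrate the controller pattern.
@RestController
class OrderController {
    @GetMapping("/orders/{id}")
    public Map<String, Object> getOrder(@PathVariable String id) {
        // Static payload for illustration; a real service would look this up.
        return Map.of("id", id, "status", "CONFIRMED");
    }
}
```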
Preferred Technologies:
- Java
- Spring Boot, J2EE
- ElasticSearch
- Kafka
- MongoDB, Ceph
- AWS
- Storm, Hazelcast
- TDD, SOA
BE/BTech/BS or equivalent
7+ years of experience in Java and Spring Boot
Strong fundamentals in data structures, algorithms, and object-oriented programming
4+ years of hands-on experience designing, developing, and delivering large-scale distributed system architectures with complex software design, high scalability, and high availability.
Extensive experience with technical leadership: defining visions and solutions, and collaborating with and driving teams to see them through to completion.
Excellent analytical and problem-solving skills
Experience with any RDBMS and strong SQL knowledge (a small JDBC query sketch follows this section)
Comfortable with Unix / Linux command line
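As a small illustration of the RDBMS/SQL expectation, the sketch below runs a parameterized query over JDBC. The connection URL, credentials, table, and column names are hypothetical; any RDBMS with a JDBC driver on the classpath follows the same pattern.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class JdbcDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details; swap the URL for your own RDBMS.
        String url = "jdbc:postgresql://localhost:5432/appdb";
        try (Connection conn = DriverManager.getConnection(url, "app_user", System.getenv("DB_PASSWORD"));
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT id, status FROM orders WHERE status = ?")) {
            ps.setString(1, "PENDING");   // bind parameters instead of concatenating SQL
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getLong("id") + " -> " + rs.getString("status"));
                }
            }
        }
    }
}
```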
Nice to have Skills
Experience with Big Data platforms like Hadoop / Hive / Presto
Experience with ML/AI frameworks like TensorFlow, H2O, etc.
Experience with key-value stores or NoSQL databases
Good understanding of Docker and container platforms like Mesos and Kubernetes
Security-first architecture approach
Application benchmarking and optimization
Job Role: We are seeking a skilled Java Developer to contribute to the development and enhancement of a renowned banking application that supports automatic reconciliation and unified data reporting for its clients. The role involves high-impact enhancements, data pipeline integration, and platform modernization. The ideal candidate is a quick learner, self-motivated, and able to ramp up quickly in a fast-paced environment.
Key Responsibilities:
Design, develop, and maintain Java-based applications using Java 17 and Spring Boot.
Implement and manage message routing using Apache Camel (a minimal route sketch follows this list).
Develop and monitor data pipelines using Kafka.
Support and enhance existing cloud-native applications.
Work with OpenShift Container Platform (OCP 4) for container orchestration and deployments.
Utilize Jenkins for CI/CD pipeline automation and management.
Collaborate with cross-functional teams to integrate multiple data sources into a unified reporting platform.
Participate in code reviews, unit testing, and performance tuning.
Troubleshoot and resolve production issues in collaboration with operations teams.
Document development processes and system configurations.
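To ground the Camel and Kafka responsibilities above, here is a minimal sketch of a Camel route that consumes from one Kafka topic and republishes to another. It assumes camel-main and camel-kafka (Camel 3.x) on the classpath; the topic names, broker address, and route id are hypothetical.

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;

public class ReconciliationRouteDemo {
    public static void main(String[] args) throws Exception {
        Main main = new Main();   // standalone Camel runtime
        main.configure().addRoutesBuilder(new RouteBuilder() {
            @Override
            public void configure() {
                // Consume raw transaction events from Kafka, log them, and republish
                // them onward; a real route would transform/normalize in between.
                from("kafka:transactions-raw?brokers=localhost:9092&groupId=recon-demo")
                    .routeId("transactions-to-recon")
                    .log("Received transaction event: ${body}")
                    .to("kafka:transactions-normalized?brokers=localhost:9092");
            }
        });
        main.run(args);
    }
}
```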
Required Skills:
Strong proficiency in Java 17 and Spring Boot frameworks.
Hands-on experience with Apache Camel for message routing and transformation.
Solid experience in Kafka development and monitoring tools.
Good understanding of cloud pipeline architectures and deployment strategies.
Experience working with OpenShift (OCP 4).
Familiarity with Jenkins for CI/CD and automated deployments.
Understanding of cloud deployment platforms (AWS, Azure, or GCP preferred).
Strong analytical and debugging skills.
Ability to learn quickly and adapt to evolving project requirements.
Nice to Have:
Experience in financial services or transaction reporting platforms.
Familiarity with microservices architecture and containerization best practices.
Knowledge of monitoring tools (e.g., Prometheus, Grafana).
Job Responsibilities
- Responsibilities for this position include, but are not limited to, the following.
- 3-6 years of development experience.
- Experience working with Azure cloud-hosted web applications and technologies.
- Design and develop back-end microservices and REST APIs for connected devices, web applications, and mobile applications.
- Stay up to date on relevant technologies, plug into user groups, and understand trends and opportunities to ensure we are using the best techniques and tools.
- Meeting with the software development team to define the scope and scale of software projects.
- Designing software system architecture.
- Selecting and implementing appropriate data structures and design patterns.
- Designing and implementing scalable web services, applications, and APIs.
- Developing and maintaining internal software tools.
- Writing low-level and high-level code.
- Troubleshooting and bug fixing.
- Identifying bottlenecks and improving software efficiency.
- Collaborating with the design team on developing micro-services.
- Writing technical documents.
- Engage in continuous professional learning that advances organizational objectives.
- Provide technical support to all internal teams and customers as it relates to the product.
Requirements:
- Bachelor's degree in Computer Engineering or Computer Science.
- Previous experience as a full-stack engineer, ideally on IoT products.
- Advanced knowledge of front-end technologies including HTML5, CSS, JavaScript, Angular, and React.
- Proficiency in back-end development with Node.js, plus basic knowledge of Java and C#.
- Experience with cloud computing APIs and cloud providers such as Azure or AWS.
- Working knowledge of database systems (Cassandra, CosmosDB, Redis, PostgreSQL).
- Experience with messaging systems (RabbitMQ, MQTT, Kafka); a minimal MQTT subscriber sketch follows this list.
- Experience with cloud-based distributed application scaling and data processing in the cloud.
- Familiarity with Agile/Scrum methodology.
- Advanced troubleshooting skills.
- Familiarity with JavaScript frameworks.
- Good communication skills.
- High-level project management skills.
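Since MQTT appears among the messaging systems above, here is a minimal subscriber sketch using the Eclipse Paho Java client (the posting's primary back end is Node.js, but the pattern is the same). The broker URL, client id, and topic are hypothetical placeholders.

```java
import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttConnectOptions;
import org.eclipse.paho.client.mqttv3.persistence.MemoryPersistence;

import java.nio.charset.StandardCharsets;

public class DeviceTelemetrySubscriber {
    public static void main(String[] args) throws Exception {
        // Hypothetical broker URL and client id.
        MqttClient client = new MqttClient("tcp://localhost:1883", "telemetry-demo", new MemoryPersistence());

        MqttConnectOptions options = new MqttConnectOptions();
        options.setCleanSession(true);
        client.connect(options);

        // Print each telemetry message published by connected devices.
        client.subscribe("devices/+/telemetry", (topic, message) ->
                System.out.println(topic + " -> " + new String(message.getPayload(), StandardCharsets.UTF_8)));

        // Keep the JVM alive so messages keep arriving.
        Thread.currentThread().join();
    }
}
```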
Requirements:
- Energetic self-starter, with a desire to work in a startup environment.
- Proficient in advanced Java programming.
- Expert in end-to-end application development, cloud and on-premise, across the middle and database layers.
- Nice to have: an understanding of message queues (MQ) and databases.
- Good hands-on experience with Complex Event Processing (CEP) systems.
- Has solved scale and performance issues on very large datasets, using pre-computation or data-aggregation frameworks to achieve good response times; a windowed-aggregation sketch follows this list.
- Real world experience working with large datasets and NoSQL database technologies
- Experience debugging applications running on Unix-like systems (e.g., Ubuntu, CentOS).
- Experience developing RESTful APIs for complex data sets.
- Knowledge of container-based development and deployment (e.g., Docker, rkt).
- Expertise in the software security domain is a plus.
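To illustrate the pre-computation/aggregation point above, the sketch below uses the Kafka Streams API (Kafka 3.x) to roll raw events up into per-key counts over one-minute windows, so reads can hit the compact aggregate instead of scanning raw data. Topic names and configuration values are hypothetical.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.TimeWindows;

import java.time.Duration;
import java.util.Properties;

public class EventCountAggregator {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "event-count-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        StreamsBuilder builder = new StreamsBuilder();
        // Pre-aggregate raw events into per-key counts over 1-minute windows.
        builder.stream("events-raw", Consumed.with(Serdes.String(), Serdes.String()))
               .groupByKey(Grouped.with(Serdes.String(), Serdes.String()))
               .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(1)))
               .count()
               .toStream()
               .foreach((windowedKey, count) ->
                       System.out.println(windowedKey + " -> " + count));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        streams.start();
    }
}
```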
Role: Java developer
Experience: 4+ years
Job description
○ Working experience with Java and Spring Boot, particularly building web services.
○ NoSQL (DynamoDB) knowledge is a plus; a minimal put-item sketch follows this list.
○ Working experience building microservices and distributed systems.
○ Working experience with messaging queues (RabbitMQ/Kafka) is a plus.
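For the DynamoDB item above, here is a minimal put-item sketch using the AWS SDK for Java v2. The table name, key attributes, and region are hypothetical; credentials are resolved from the SDK's default provider chain.

```java
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.dynamodb.DynamoDbClient;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;
import software.amazon.awssdk.services.dynamodb.model.PutItemRequest;

import java.util.Map;

public class DynamoDbDemo {
    public static void main(String[] args) {
        // Hypothetical table and region; credentials come from the default chain.
        try (DynamoDbClient dynamo = DynamoDbClient.builder().region(Region.AP_SOUTH_1).build()) {
            PutItemRequest request = PutItemRequest.builder()
                    .tableName("orders")
                    .item(Map.of(
                            "orderId", AttributeValue.builder().s("ORD-1001").build(),
                            "status", AttributeValue.builder().s("CONFIRMED").build()))
                    .build();
            dynamo.putItem(request);   // write one item to the (hypothetical) orders table
        }
    }
}
```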
Desired Candidate Profile
- A team focus with strong collaboration and communication skills
- Exceptional ability to quickly grasp high-level business goals, derive requirements, and translate them into effective technical solutions
- Exceptional object-oriented thinking, design and programming skills (Java 8 or 11)
- Expertise with the following technologies: data structures, design patterns, code versioning tools (GitHub/Bitbucket/…), XML, JSON, Spring Batch, RESTful services, Spring Cloud, Grafana (knowledge/experience), Kafka, Spring Boot, microservices, relational and NoSQL databases, Docker, Kubernetes, AWS/GCP, architecture design patterns, Agile, and JIRA.
- A penchant for self-motivation and continuous improvement; these words should describe you: dedicated, energetic, curious, conscientious, and flexible.





