11+ Clustering Jobs in Hyderabad
· IMMEDIATE JOINER
Professional experience of 5+ years in Confluent Kafka administration
· Demonstrated experience in design and development.
· Must have proven knowledge and practical application of Confluent Kafka (producers, consumers, Kafka Connect, Kafka Streams, ksqlDB, Schema Registry).
· Experience in performance optimization of producers and consumers.
· Good experience debugging issues related to offsets, consumer lag, and partitions.
· Experience with Administrative tasks on Confluent Kafka.
· Kafka administration experience including, but not limited to: setting up new Kafka clusters, creating topics, granting permissions, resetting offsets, purging data, setting up connectors and replicator tasks, troubleshooting issues, monitoring cluster health and performance, and backup and recovery (see the sketch after this list).
· Experience in implementing security measures for Kafka clusters, including access controls and encryption, to protect sensitive data.
· Experience installing and upgrading Kafka clusters.
· Good experience writing unit tests using JUnit and Mockito.
· Experience working on client-facing projects.
· Exposure to a cloud environment such as Azure is an added advantage.
· Experience in developing or working on REST microservices.
Experience in Java and Spring Boot is a plus.
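The administrative tasks listed above can be exercised programmatically as well as from the CLI. Below is a minimal, illustrative Java sketch using the Kafka AdminClient API; the bootstrap address, topic name, and partition/replication values are placeholders, not taken from any actual cluster.

```java
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class TopicSetup {
    public static void main(String[] args) throws Exception {
        // Placeholder bootstrap address; point this at the actual cluster.
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Create a topic with 6 partitions and replication factor 3 (illustrative values).
            NewTopic topic = new NewTopic("orders", 6, (short) 3);
            admin.createTopics(Collections.singletonList(topic)).all().get();

            // Inspect the cluster to confirm broker availability after the change.
            admin.describeCluster().nodes().get()
                 .forEach(node -> System.out.println("Broker: " + node.idString()));
        }
    }
}
```

In day-to-day administration, the same operations are often scripted with the standard CLI tools (kafka-topics.sh, kafka-consumer-groups.sh) rather than written in code.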
Roles and responsibilities:
- Should be okay working on Saturdays; Sundays and Mondays can be taken off.
- You do not need to have great typing skills.
- Entering customer and account data from source documents within time limits.
- Compiling, verifying the accuracy, and sorting information to prepare source data for computer entry.
- Reviewing data for deficiencies or errors, correcting any incompatibilities, and checking output.
- Research and obtain further information for incomplete documents.
- Respond to queries for information and access relevant files.
- Comply with data integrity and security policies.
Requirements:
- Bachelor’s degree in any related field.
- Proven data entry work experience as a Data Entry Operator.
- Experience with MS Office and data programs.
- Familiarity with administrative duties.
- Experience using office equipment, such as fax machines and scanners.
- Excellent knowledge of correct spelling, grammar, and punctuation.
- Attention to detail.
A product-based company specializing in architectural products.
3+ years of experience as an Oracle Cloud techno-functional consultant
(70% technical, 30% functional)
5-7 years of experience in the IT field.
Strong hands-on experience in Oracle SCM or Oracle Financials modules as a technical consultant.
Strong experience working with the Oracle Cloud ERP application, particularly reporting (BI and OTBI).
Advanced understanding of Oracle PaaS offerings and architecture.
Must have knowledge of:
1) Experience in creation and customization of reports/forms and XML Publisher. Should have worked in functional modules (SCM/Finance).
2) Experience in developing or customizing reports using BI templates.
3) Experience in building Oracle BIP reports, analysis reports, OTBI, and reports using data models.
4) Hands-on experience in BI Publisher (RTF design, eText, scheduling, parameter handling, bursting, backup and migration of reports to different pods).
5) Strong knowledge of writing SQL queries using Oracle Financial or Oracle SCM database tables (a minimal sketch follows this list).
6) Very strong SQL/PL/SQL skills to create custom queries in BI Publisher within the Oracle Fusion environment.
7) Proficiency in creating reports using templates such as RTF, Excel, pipe-delimited, stylesheet, eText, etc.
8) Expertise in working with cubes, including extracting data from cubes and joins.
9) Hands-on experience in converting reports to ESS jobs for scheduling.
10) Hands-on experience in migrating reports between environments.
11) Excellent communication skills.
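As an illustration of the SQL skills in item 5, here is a minimal JDBC sketch assuming Oracle SCM purchase-order tables (po_headers_all, po_lines_all) and placeholder connection details; in a real BI Publisher report the same SQL would typically sit inside a data model rather than in application code.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class PoReportQuery {
    // Illustrative query joining assumed Oracle SCM tables; column names are placeholders.
    private static final String SQL =
        "SELECT poh.segment1 AS po_number, pol.line_num, pol.unit_price " +
        "FROM po_headers_all poh " +
        "JOIN po_lines_all pol ON pol.po_header_id = poh.po_header_id " +
        "WHERE poh.creation_date >= TRUNC(SYSDATE) - 30";

    public static void main(String[] args) throws Exception {
        // Placeholder JDBC connection details; the Oracle JDBC driver must be on the classpath.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//host:1521/service", "user", "password");
             PreparedStatement stmt = conn.prepareStatement(SQL);
             ResultSet rs = stmt.executeQuery()) {
            while (rs.next()) {
                System.out.printf("%s line %d: %s%n",
                    rs.getString("po_number"), rs.getInt("line_num"),
                    rs.getBigDecimal("unit_price"));
            }
        }
    }
}
```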
Good to have knowledge of:
1) Expertise in conversions and integrations via FBDI and web services.
2) Experience in AP check printing, positive pay templates, PO printing, AP printing, automated AR invoicing, lockbox processing, and bank statement integration.
3) Solid understanding of performance tuning best practices and experience improving end-to-end processing times.
4) Additional reporting tools: Analysis Dashboard, Essbase, Financial Reporting Studio (FRS), and Smart View reports.
5) Knowledge of integrations and the OIC module.
▪ Experience in developing internet web applications using Java/J2EE technologies.
▪ Experience in different modules of the Spring framework such as dependency injection (IoC), Spring MVC, and Spring ORM along with JPA (see the sketch after this list).
▪ Experience in Java Version 7.0+. Good experience with JMS, Spring & Hibernate.
▪ Experience in identifying database solutions for a given problem, preferably MySQL (good to have)
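A minimal sketch of Spring constructor-based dependency injection with Spring Data JPA, assuming a Spring Boot 3 / Jakarta Persistence setup (older stacks would import javax.persistence instead); the Customer entity and repository are made up for illustration.

```java
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.Id;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Service;

// Hypothetical entity used only to illustrate the wiring.
@Entity
class Customer {
    @Id @GeneratedValue
    private Long id;
    private String name;

    protected Customer() {}                 // no-arg constructor required by JPA
    Customer(String name) { this.name = name; }
    Long getId() { return id; }
    String getName() { return name; }
}

// Spring Data JPA derives the implementation of this repository at runtime.
interface CustomerRepository extends JpaRepository<Customer, Long> {}

// Constructor injection: the Spring container (IoC) supplies the repository bean.
@Service
class CustomerService {
    private final CustomerRepository repository;

    CustomerService(CustomerRepository repository) {
        this.repository = repository;
    }

    Customer register(String name) {
        return repository.save(new Customer(name));
    }
}
```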
At Persistent Systems, we strongly believe in: Innovation, Perseverance, Compassion, Value Creation & Ownership.
We are hiring a Senior Data Architect for a reputed company.
Experience required: 10-19 years
Skills required: Hands-on experience with Kafka, stored procedures, and Snowflake.
Role: Backend Engineer
About Saras Analytics:
- You are a great teammate with proven capabilities working on a SaaS product and a passion for writing exceptional code.
- You are hungry for an opportunity to join a start-up and become a member of the core platform team driving the company forward.
- Have good analytical and problem-solving skills and are able to break down a solution into smaller units of work and produce a solution roadmap.
- Have written high-quality, well-tested shared components that can be leveraged by multiple systems. Can step into specific projects to supply additional management, coding, and engineering capacity as needed.
- Have expert knowledge of distributed systems and high-volume transactions.
- Understand the product mission, goals, and tasks and execute with the team to achieve them.
- Understand the design and architecture and build modules in accordance with it, or recommend and make the necessary changes that add more stability to our product.
- Improve the quality of our front-end code and our overall front-end user experience.
- Take requirements (business features, technical debt, and internal enhancements) and design resilient solutions.
Requirements
- 1+ years of hands-on experience in Java/Kotlin.
- Demonstrable understanding of Design Patterns.
- Experience in Contexts and Dependency Injection (CDI) or the Spring framework, and Hibernate/JPA.
- Experience in RESTful/SOAP web services for integrating with third-party APIs (a minimal sketch follows this list).
- Proficient in SQL (PostgreSQL) and NoSQL databases.
- Experience in multi-threading and concurrency is a plus.
- Experience in JMS messaging using Apache ZooKeeper and Kafka is a plus.
- Experience in Angular is a plus.
- Knowledge of modern CI/CD environments: Git, Gradle, GitLab.
- Familiarity with tools like Postman, SoapUI, and IntelliJ.
- Willing to learn our tech-stack (Kotlin | CDI | Angular).
- Develop API integrations using RESTful/SOAP web services.
- Significant technical academic course work or equivalent work experience
- Excellent communication and interpersonal skills.
- Knowledge of cloud infrastructure is a plus.
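A minimal sketch of the kind of third-party REST integration and unit testing described above, using Spring's RestTemplate with JUnit 5 and Mockito; the provider URL, endpoint path, and class names are hypothetical.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;
import org.springframework.web.client.RestTemplate;

// Hypothetical wrapper around a third-party REST API.
class ExchangeRateClient {
    private final RestTemplate restTemplate;
    private final String baseUrl;

    ExchangeRateClient(RestTemplate restTemplate, String baseUrl) {
        this.restTemplate = restTemplate;
        this.baseUrl = baseUrl;
    }

    double usdToInr() {
        // The endpoint path is a placeholder for whatever the provider exposes.
        Double rate = restTemplate.getForObject(baseUrl + "/rates/USD/INR", Double.class);
        return rate != null ? rate : 0.0;
    }
}

class ExchangeRateClientTest {
    @Test
    void returnsRateFromThirdPartyApi() {
        // Mock the HTTP layer so the test runs without network access.
        RestTemplate restTemplate = mock(RestTemplate.class);
        when(restTemplate.getForObject("https://api.example.com/rates/USD/INR", Double.class))
            .thenReturn(83.2);

        ExchangeRateClient client = new ExchangeRateClient(restTemplate, "https://api.example.com");
        assertEquals(83.2, client.usdToInr(), 0.001);
    }
}
```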
Advanced technology to solve business problems. (A1)
- Desire to explore new technology and break new ground.
- Are passionate about Open Source technology, continuous learning, and innovation.
- Have the problem-solving skills, grit, and commitment to complete challenging work assignments and meet deadlines.
Qualifications
- Engineer enterprise-class, large-scale deployments, and deliver Cloud-based Serverless solutions to our customers.
- You will work in a fast-paced environment with leading microservice and cloud technologies, and continue to develop your all-around technical skills.
- Participate in code reviews and provide meaningful feedback to other team members.
- Create technical documentation.
- Develop thorough Unit Tests to ensure code quality.
Skills and Experience
- Advanced skills in troubleshooting and tuning AWS Lambda functions developed with Java and/or Python (see the sketch after this list).
- Experience with event-driven architecture design patterns and practices
- Experience in database design and architecture principles and strong SQL abilities
- Experience with message brokers like Kafka and Kinesis
- Experience with Hadoop, Hive, and Spark (either PySpark or Scala)
- Demonstrated experience owning enterprise-class applications and delivering highly available distributed, fault-tolerant, globally accessible services at scale.
- Good understanding of distributed systems.
- Candidates will be self-motivated and display initiative, ownership, and flexibility.
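A minimal sketch of an event-driven AWS Lambda handler in Java using the aws-lambda-java-core and aws-lambda-java-events libraries; the handler name and processing logic are placeholders. Other triggers (API Gateway, Kinesis, SQS, cron via EventBridge) would use their corresponding event classes instead of SNSEvent.

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.SNSEvent;

// Hypothetical SNS-triggered Lambda entry point.
public class OrderEventHandler implements RequestHandler<SNSEvent, Void> {

    @Override
    public Void handleRequest(SNSEvent event, Context context) {
        for (SNSEvent.SNSRecord record : event.getRecords()) {
            String message = record.getSNS().getMessage();
            // Placeholder for real processing (e.g., writing to DynamoDB or S3).
            context.getLogger().log("Received message: " + message);
        }
        return null;
    }
}
```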
Preferred Qualifications
- AWS Lambda function development experience with Java and/or Python.
- Lambda triggers such as SNS, SES, or cron.
- Databricks
- Cloud development experience with AWS services, including:
- IAM
- S3
- EC2
- AWS CLI
- API Gateway
- ECR
- CloudWatch
- Glue
- Kinesis
- DynamoDB
- Java 8 or higher
- ETL data pipeline building
- Data Lake Experience
- Python
- Docker
- MongoDB or similar NoSQL DB.
- Relational Databases (e.g., MySQL, PostgreSQL, Oracle, etc.).
- Gradle and/or Maven.
- JUnit
- Git
- Scrum
- Experience with Unix and/or macOS.
- Immediate Joiners
Nice to have:
- AWS / GCP / Azure Certification.
- Cloud development experience with Google Cloud or Azure