11+ Adobe CQ Jobs in Hyderabad | Adobe CQ Job openings in Hyderabad
Apply to 11+ Adobe CQ Jobs in Hyderabad on CutShort.io. Explore the latest Adobe CQ Job opportunities across top companies like Google, Amazon & Adobe.

- Create and lead innovative programs, software, and analytics that drive improvements to the availability, scalability, efficiency, and latency of the application
- Work with Directors to define and execute technical strategy and arbitrate technical processes and decisions.
- Work with development teams to guide future technology choices and foster cross-team collaboration
- Work across all teams to help define and clarify requirements, explore technical feasibility, and help define product directions and plans.
- Work closely with product & engineering leaders and tech leads to turn requirements into actionable plans that teams can understand.
- Define, refine, and develop POCs with quick turnaround, and demonstrate them to all stakeholders
- Embed with teams from time to time to write code, provide technical guidance, etc.
- Very deep knowledge of the entire tech stack. Excellent understanding of the entire SDLC.
- Provides a multiplier effect in getting stuff done.
- Acts as SME for the team or product area within the Engineering organization and for cross-functional organizations as well.
- Teaches others why new features are important. Good understanding of customer use cases.
- A Bachelor's degree in any technical discipline
- Minimum 5 years of experience administering AEM applications
- Build management using Bamboo/Jenkins or relevant tech
- Configuration and Release management
- Design and build enablement
- Good communication skills
Job Summary:
We are looking for an experienced ETL Tester with 5 to 7 years of experience and expertise
in the banking domain. The candidate will be responsible for testing ETL processes,
ensuring data quality, and validating data flows in large-scale projects.
Key Responsibilities:
Design and execute ETL test cases, ensuring data integrity and accuracy.
Perform data validation using complex SQL queries.
Collaborate with business analysts to define testing requirements.
Track defects and work with developers to resolve issues.
Conduct performance testing for ETL processes.
Banking Domain Knowledge: Strong understanding of banking processes such as
payments, loans, credit, accounts, and regulatory reporting.
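Data validation of this kind often starts with cheap reconciliation queries that compare row counts and column aggregates between source and target. Below is a minimal sketch using SQLite as a stand-in for an actual banking warehouse (table and column names are invented for illustration):

```python
import sqlite3

def reconcile(conn, source, target, numeric_cols):
    """Compare row counts and per-column totals between two tables.

    Returns a list of human-readable mismatch descriptions (empty = pass).
    """
    cur = conn.cursor()
    mismatches = []
    # Row-count check: every extracted row should have been loaded.
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    if src_count != tgt_count:
        mismatches.append(f"row count: {src_count} vs {tgt_count}")
    # Column-aggregate check: a cheap proxy for value-level accuracy.
    for col in numeric_cols:
        s = cur.execute(f"SELECT TOTAL({col}) FROM {source}").fetchone()[0]
        t = cur.execute(f"SELECT TOTAL({col}) FROM {target}").fetchone()[0]
        if s != t:
            mismatches.append(f"{col}: TOTAL {s} vs {t}")
    return mismatches
```

In practice a tester would layer row-level checks (e.g., `EXCEPT`/`MINUS` queries on keys) on top of these aggregate checks; the sketch only shows the first-pass reconciliation.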
Required Skills:
5-7 years of ETL testing experience.
Strong SQL skills and experience with ETL tools (Informatica, SSIS, etc.).
Knowledge of banking domain processes.
Experience with test management tools (JIRA, HP ALM).
Familiarity with Agile methodologies.
Location – Hyderabad
Role Overview:
As an ABBYY Support Engineer, you will play a crucial role in designing, implementing, and supporting automation solutions, with a strong focus on ABBYY FlexiCapture and related OCR/Data Capture technologies. You will collaborate with cross-functional teams to drive automation efficiencies and ensure seamless integration with enterprise applications.
________________________________________
Key Responsibilities:
• Provide technical support for ABBYY FlexiCapture, troubleshooting and resolving issues related to OCR, data extraction, and integration.
• Work closely with business analysts and stakeholders to understand requirements and configure ABBYY solutions accordingly.
• Participate in design sessions to determine technical feasibility, scalability, and performance considerations.
• Develop, customize, and optimize ABBYY FlexiCapture templates, scripts, and workflows to enhance automation effectiveness.
• Ensure seamless integration of ABBYY with various enterprise applications through APIs and connectors.
• Collaborate with RPA developers using Blue Prism, UiPath, and Power Automate for end-to-end automation solutions.
• Maintain technical documentation, including solution designs, troubleshooting guides, and best practices.
• Conduct UAT sessions, deploy solutions, and support post-implementation hypercare.
________________________________________
Required Skills & Qualifications:
• 2-4 years of experience in software development and automation support.
• 1+ year of hands-on experience with ABBYY FlexiCapture (OCR, Data Capture, Template Creation, and Scripting).
• Strong knowledge of SQL and experience in database query optimization.
• Familiarity with .NET, Java, or Python for scripting and automation.
• Understanding of CRUD operations and relational database management.
• Knowledge of API integration and working with REST/SOAP services.
• Exposure to RPA tools (UiPath, Blue Prism) is a plus.
• Strong analytical and problem-solving skills with an eagerness to learn new technologies.
• Excellent communication skills and ability to collaborate with cross-functional teams.
• Bachelor's degree in Computer Science, Information Technology, or a related field.
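Support work on OCR pipelines often involves sanity-checking extracted fields before they flow downstream. As an illustration only (field names and formats are hypothetical, and this is not FlexiCapture's API — real projects define validation rules inside the document definition scripts), a minimal post-extraction validation sketch:

```python
import re
from datetime import datetime

def _is_iso_date(value):
    """True if value parses as YYYY-MM-DD."""
    try:
        datetime.strptime(value, "%Y-%m-%d")
        return True
    except ValueError:
        return False

# Hypothetical field rules for an invoice-capture project.
RULES = {
    "invoice_number": lambda v: bool(re.fullmatch(r"INV-\d{6}", v)),
    "amount": lambda v: bool(re.fullmatch(r"\d+\.\d{2}", v)),
    "date": _is_iso_date,
}

def validate_fields(fields):
    """Return the names of extracted fields that fail their rule."""
    return [name for name, rule in RULES.items()
            if not rule(fields.get(name, ""))]
```

Failed fields would typically be routed back to a verification station rather than loaded, which is the hand-off point between the capture layer and the RPA tools mentioned above.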
Key Responsibilities:
- Install, configure, deploy, and administer Camunda 7 & 8 for enterprise-level applications.
- Design and model BPMN workflows and DMN decisions using Camunda Modeler.
- Set up Single Sign-On (SSO), monitoring, and logging frameworks within the Camunda BPM environment.
- Develop, deploy, and monitor workflows in Camunda Cockpit, resolving any technical issues as they arise.
- Integrate Camunda workflows with Microservices and Spring Boot frameworks.
- Collaborate with database teams to design and optimize SQL queries and stored procedures for Camunda-based applications.
- Develop and consume RESTful and SOAP web services, ensuring seamless integration with external systems.
- Handle JSON and XML objects within workflows to ensure data integrity and accuracy.
- Utilize Docker and Kubernetes for containerization and orchestration of Camunda BPM services.
- Stay updated with the latest trends and best practices in Camunda BPM and related technologies.
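On the JSON and XML handling point above: Camunda 7's REST API represents process variables as `{"name": {"value": ..., "type": ...}}` objects. A minimal sketch (the XML payload shape is hypothetical) of flattening an inbound XML document into that variable format and serializing it for a task-completion call:

```python
import json
import xml.etree.ElementTree as ET

def xml_to_variables(xml_payload):
    """Flatten a simple one-level XML document into Camunda-style variables."""
    root = ET.fromstring(xml_payload)
    return {child.tag: {"value": child.text, "type": "String"}
            for child in root}

def completion_payload(variables):
    """Serialize variables as the JSON body of a task-completion REST call."""
    return json.dumps({"variables": variables})
```

A real worker would also carry typed variables (Integer, Json, etc.) and nested XML; this only shows the flat String case.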
Required Skills:
- 4-8 years of experience with Camunda BPM and Java J2EE development.
- Strong experience with BPMN workflow modeling and DMN decision modeling in Camunda Modeler.
- Proficiency in setting up and configuring SSO, monitoring, and logging frameworks in Camunda BPM.
- Experience with Microservices architecture and Spring Boot.
- Solid understanding of database concepts with practical knowledge of SQL and stored procedures.
- Expertise in developing and integrating RESTful and SOAP web services.
- Hands-on experience with JSON and XML object handling within workflows.
- Experience with Docker and Kubernetes for containerized deployments.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
- Relevant certifications in Camunda BPM, Java, or related technologies are a plus.
We are looking for a Spark developer who knows how to fully exploit the potential of our Spark cluster. You will clean, transform, and analyze vast amounts of raw data from various systems using Spark to provide ready-to-use data to our feature developers and business analysts. This involves both ad-hoc requests as well as data pipelines that are embedded in our production environment.
Requirements:
- The candidate should be well-versed in the Scala programming language
- Should have experience with Spark architecture and Spark internals
- Experience in AWS is preferable
- Should have experience in the full life cycle of at least one big data application
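As an illustration of the clean/transform/aggregate shape such pipelines take, here is a plain-Python stand-in (no Spark dependency; the function and column names are invented). In Spark, the same steps would be expressed as `filter`, `map`, and `reduceByKey` (or their DataFrame equivalents) over a distributed dataset:

```python
from collections import defaultdict

def clean_transform_aggregate(raw_rows):
    """Stand-in for a Spark pipeline over (system, value) tuples.

    Clean: drop rows whose value is not numeric.
    Transform: normalize the grouping key.
    Aggregate: sum values per source system.
    """
    totals = defaultdict(float)
    for system, value in raw_rows:
        try:
            v = float(value)          # clean: reject non-numeric values
        except (TypeError, ValueError):
            continue
        totals[system.strip().lower()] += v   # transform + aggregate
    return dict(totals)
```

The point of the sketch is the pipeline shape, not the execution model; Spark distributes each stage across the cluster instead of looping locally.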
- Working with banks in a 24x7 environment to identify application problems and advise on solutions.
- Analyzing application/OS logs to spot common trends and underlying problems.
- Analyzing underlying database using SQL queries (MS SQL /Oracle or any other relational database).
- Deploying, testing, and reporting bugs/issues in new modules and support clients in testing.
Skills Required:
- Good analytical and problem-solving skills.
- Knowledge of Linux commands and database queries.
- Knowledge of application servers, e.g., Apache Tomcat/JBoss.
- Knowledge of shell scripting.
- Knowledge of digital banking products, e.g., IMPS, UPI.
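Spotting common trends in application logs, as described above, usually starts with normalizing and counting error messages. A minimal sketch (the `LEVEL message` log layout is assumed, not taken from any specific banking product; digits are masked so transaction IDs don't split one recurring error into many):

```python
import re
from collections import Counter

def top_error_signatures(log_lines, n=3):
    """Count ERROR lines by masked message signature; return the n most common."""
    errors = Counter()
    for line in log_lines:
        m = re.search(r"\bERROR\b\s+(.*)", line)
        if m:
            # Replace digit runs so "txn 123" and "txn 456" count as one signature.
            signature = re.sub(r"\d+", "<n>", m.group(1))
            errors[signature] += 1
    return errors.most_common(n)
```

The same counting approach works on OS logs or database error tables once the message field is isolated.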


● Design and develop features using Core Java, Spring Boot, and Hibernate
● Ability to design database schema, develop views and stored procedures
● Participate in user story grooming, design discussions and proposal of solutions
● Maintain existing software systems by identifying and correcting software defects
● Practice standard development process leveraging agile methodologies such as
SCRUM and TDD
● Review and analyze business requirements and provide technical feasibility and
estimates
● Manage development and support functions
● Excellent in OOPS concepts, system design
● Strong knowledge of Core Java, Spring, Hibernate and Microservices
● Hands-on experience in DB design, SQL, UI Technologies like HTML/CSS,
JavaScript, jQuery, etc.
● Good knowledge of design patterns
● Excellent knowledge of JSP, Servlets, WebServices, JUnit
● Experience in Agile software development
● Familiarity with JIRA, GIT, Maven
● Experience in working directly with a client
● Good knowledge of requirement gathering, analysis, and designing
Sr. Technical IT Recruiter
Prefers someone local to Bangalore, or open to working from any Accion office (Hyderabad, Pune, Mumbai) post pandemic
Complete WFH is also possible for the right candidate
Presently operating virtually since Covid
Full-time permanent opportunity; okay with the CTH model too
Job Description:
- Minimum of 3 years of relevant end-to-end recruitment experience, from sourcing a candidate through onboarding, maintaining complete rapport
- Has worked in fast-paced environments; eager to learn new work and passionate about recruitment
- Individual contributor with good listening skills
- Smart working
- Thinks out of the box when hiring talent
- Experience hiring through social media platforms is good to have
- Fast learner with aspirations to grow to the next level soon
- Prior experience in hiring for emerging technologies is a plus
- A very good opportunity to recruit candidates mostly for cutting-edge emerging technologies
About the Role
The Dremio India team owns the development of the cloud infrastructure and services that power Dremio's Data Lake Engine. With a focus on query performance optimization and support for modern table formats like Iceberg, Delta Lake, and Nessie, this team provides endless opportunities to define the products for the next generation of data analytics.
In this role, you will get opportunities to impact high-performance system software and scalable SaaS services through the application of continuous performance management. You will plan, design, automate, and execute performance runs, followed by deep analysis and identification of key performance fixes in collaboration with developers. An open and flexible work culture, combined with ownership of the work you do, will help you develop as a leader. The inclusive culture of the company will provide you a platform to bring fresh ideas and innovate.
Responsibilities
- Deliver end-to-end performance testing independently using agile methodologies
- Prepare performance test plans, load simulators and test harnesses to thoroughly test the products against the approved specifications
- Translate deep insight of architecture, product & usage into an enhanced automated performance measurement & evaluation framework to support continuous performance management.
- Evaluate & apply the latest tools, techniques and research insights to drive improvements into a world-class data analytics engine
- Collaborate with other engineering and customer success functions to simulate customer data and usage patterns, workloads to execute performance runs, identify and fix customer issues and make sure that customers get highly performant, optimized and scalable Dremio experience
- Analyze performance bottlenecks, root cause issues, file defects, follow up with developers, documentation and other teams on the resolution.
- Publish performance benchmark report based on test runs in accordance with industry standards
- Regularly provide the leadership team with an assessment of the performance, scalability, reliability, and robustness of products before they are exposed to customers
- Analyze and debug performance issues in customer environments.
- Understand and reason about concurrency and parallelization to deliver scalability and performance in a multithreaded and distributed environment.
- Actively participate in code and design reviews to maintain exceptional quality and deepen your understanding of the system architecture and implementation
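A benchmark report of the kind described above typically reduces raw per-query latencies to standard percentiles. A minimal sketch using only the Python standard library (the field names are illustrative, not a Dremio reporting format):

```python
import statistics

def benchmark_summary(latencies_ms):
    """Summarize per-query latencies for a performance benchmark report.

    quantiles(n=100) yields 99 cut points; indexes 49, 94, and 98
    correspond to p50, p95, and p99.
    """
    q = statistics.quantiles(latencies_ms, n=100)
    return {
        "runs": len(latencies_ms),
        "p50_ms": round(q[49], 2),
        "p95_ms": round(q[94], 2),
        "p99_ms": round(q[98], 2),
        "max_ms": max(latencies_ms),
    }
```

Comparing these percentiles run-over-run (rather than averages alone) is what makes regressions in tail latency visible before customers see them.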
Basic Requirements
- B.Tech/M.Tech/Equivalent in Computer Science or a related technical field
- 8+ years of performance automation engineering experience on large scale distributed systems
- Proficiency in any of Java/C++/Python/Go and automation frameworks
- Hands-on experience integrating performance automation into CI/CD pipelines using tools like Jenkins
- Experience in planning and executing performance engineering tasks to completion and taking ownership of performance epics during a set of sprints.
- Experience in designing, implementing, executing and analyzing automated performance tests for complex, production system software.
- Experience in analyzing performance bottlenecks in system, performing root cause analysis, and following issue resolution workflow to tune the system to provide optimized performance
- Ability to derive meaningful insights from the collected performance data, articulate performance findings effectively with senior team members to evaluate design choices.
- Experience with database systems internals, query optimization, understanding and tuning query access plans, and query execution instrumentation.
- Hands-on experience working on projects on AWS, Azure, and Google Cloud Platform
- Understanding of distributed storage systems like S3, ADLS, or HDFS, and of Hive
- Ability to create reusable components to automate repeatable, manual activities
- Ability to write technical reports and summaries and present them to the leadership team
- Passion for learning and delivering using latest technologies
- Excellent communication skills and affinity for collaboration and teamwork
- Passion and ability to work in a fast paced and agile development environment.
Preferred Qualification
- Hands-on experience with multi-threaded and asynchronous programming models
- Hands-on experience in query processing or optimization, distributed systems, concurrency control, data replication, code generation, networking, or storage systems