
Jobs at Risk Resources LLP, Hyderabad
https://riskresourcesindia.com
Key Responsibilities:
- Design, develop, and implement websites and components within Adobe Experience Manager (AEM).
- Develop and customize AEM components, templates, dialogs, workflows, servlets, and OSGi services.
- Integrate AEM with external systems and third-party services (CRM, DAM, Analytics, etc.).
- Implement responsive front-end solutions using HTML5, CSS3, JavaScript, ReactJS/Angular, and Sightly/HTL.
- Work closely with AEM Architects to ensure technical alignment and best practices.
- Optimize performance of AEM pages and ensure adherence to SEO and accessibility standards.
- Manage AEM environments, deployments, and configurations.
- Provide technical guidance and mentorship to junior developers.
- Collaborate with QA and DevOps teams for CI/CD automation and deployment pipelines.
- Troubleshoot and resolve AEM production issues.
Job Title: PySpark/Scala Developer
Functional Skills: Experience in the Credit Risk/Regulatory Risk domain
Technical Skills: Spark, PySpark, Python, Hive, Scala, MapReduce, Unix shell scripting
Good-to-Have Skills: Exposure to machine learning techniques
Job Description:
5+ years of experience developing, fine-tuning, and implementing programs/applications using Python/PySpark/Scala on a Big Data/Hadoop platform.
Roles and Responsibilities:
a) Work with a leading bank's Risk Management team on specific projects/requirements pertaining to risk models in consumer and wholesale banking.
b) Enhance machine learning models using PySpark or Scala.
c) Work with data scientists to build ML models based on business requirements and follow the ML lifecycle to deploy them all the way to the production environment.
d) Participate in feature engineering, model training, scoring, and retraining (a minimal sketch follows this list).
e) Architect data pipelines and automate data ingestion and model jobs.
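Purely for illustration of item (d), the fragment below is a minimal PySpark training-and-scoring sketch. It assumes a hypothetical loans dataset with made-up columns (balance, utilization, default_flag) and uses logistic regression as a stand-in; it is not the bank's actual model or pipeline.

    # Minimal PySpark sketch: assemble features, train, and score.
    # Dataset path and column names are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.ml import Pipeline
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    spark = SparkSession.builder.appName("credit-risk-demo").getOrCreate()
    loans = spark.read.parquet("/data/loans")  # hypothetical input

    assembler = VectorAssembler(inputCols=["balance", "utilization"],
                                outputCol="features")
    lr = LogisticRegression(featuresCol="features", labelCol="default_flag")
    model = Pipeline(stages=[assembler, lr]).fit(loans)   # training

    scored = model.transform(loans)                        # scoring
    scored.select("default_flag", "probability").show(5)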
Skills and competencies:
Required:
· Strong analytical skills in conducting sophisticated statistical analysis using bureau/vendor data, customer performance data, and macro-economic data to solve business problems.
· Working experience in PySpark and Scala to develop, validate, and implement models and code in Credit Risk/Banking.
· Experience with distributed systems such as Hadoop/MapReduce, Spark, streaming data processing, and cloud architecture.
· Familiarity with machine learning frameworks and libraries (e.g., scikit-learn, Spark ML, TensorFlow, PyTorch).
· Experience in systems integration, web services, and batch processing.
· Experience migrating existing code to PySpark/Scala is a big plus (see the migration sketch after this list).
· Ability to act as a liaison, conveying the information needs of the business to IT and data constraints to the business, with equal fluency in business strategy and IT strategy, business processes, and workflow.
· Flexibility in approach and thought process.
· Attitude to learn and keep up with periodic changes in regulatory requirements, e.g., as mandated by the FED.
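Since migrating existing code to PySpark/Scala is called out above, the following is a small hedged illustration of such a migration: the same aggregation written first in single-node pandas and then in PySpark. The file name and column names are placeholders.

    # Legacy pandas aggregation and its PySpark equivalent (illustrative only).
    import pandas as pd
    from pyspark.sql import SparkSession, functions as F

    # Single-node legacy version
    pdf = pd.read_csv("exposures.csv")
    legacy = pdf.groupby("customer_id")["exposure"].sum()

    # Distributed PySpark version of the same logic
    spark = SparkSession.builder.getOrCreate()
    sdf = spark.read.csv("exposures.csv", header=True, inferSchema=True)
    migrated = sdf.groupBy("customer_id").agg(F.sum("exposure").alias("exposure"))
    migrated.show(5)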
Desired Competencies (Technical/Behavioral Competency)
Must-Have
· Strong understanding of Kafka concepts, including topics, partitions, consumers, producers, and security.
· Experience with testing Kafka Connect, Kafka Streams, and other Kafka ecosystem components (a produce/consume smoke-test sketch follows this list).
· API testing experience.
· Xray and test automation experience.
· Expertise with Postman/SOAP.
· Agile/JIRA/Confluence.
· Strong familiarity with data formats such as XML, JSON, CSV, Avro, etc.
· Strong hands-on experience with SQL and MongoDB.
· Continuous integration and automated testing.
· Working knowledge and experience of Git.
Good-to-Have
· Ability to troubleshoot Kafka-related issues; strong in Kafka client configuration and troubleshooting.
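As a rough sketch of the produce/consume checks implied by the testing skills above, the snippet below round-trips one message through a topic. It assumes the third-party kafka-python client and a broker reachable at localhost:9092; the topic name and payload are hypothetical.

    # Round-trip smoke test for a Kafka topic (kafka-python client assumed).
    import json
    from kafka import KafkaProducer, KafkaConsumer

    TOPIC = "orders-test"  # hypothetical topic name

    producer = KafkaProducer(bootstrap_servers="localhost:9092",
                             value_serializer=lambda v: json.dumps(v).encode())
    producer.send(TOPIC, {"order_id": 1, "status": "NEW"})
    producer.flush()

    consumer = KafkaConsumer(TOPIC,
                             bootstrap_servers="localhost:9092",
                             auto_offset_reset="earliest",
                             consumer_timeout_ms=5000,
                             value_deserializer=lambda v: json.loads(v.decode()))
    received = [msg.value for msg in consumer]
    assert any(m["order_id"] == 1 for m in received), "message was not consumed"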
Responsibility of / Expectations from the Role
1. Engage with the customer to understand requirements, provide technical solutions, and offer value-added suggestions.
2. Help build and manage a team of Kafka and Java developers in the near future.
• Technical expertise in developing Master Data Management (MDM), data extraction, transformation, and load (ETL), and big data applications using existing and emerging technology platforms and cloud architecture (a simplified matching/merge sketch follows this list).
• Functions as lead developer.
• Supports system analysis, technical/data design, development, and unit testing, and oversees the end-to-end data solution.
• Technical SME in Master Data Management applications, ETL, big data, and cloud technologies.
• Collaborate with IT teams to ensure technical designs and implementations account for requirements, standards, and best practices.
• Performance tuning of end-to-end MDM, database, ETL, and big data processes, or of the source/target database endpoints, as needed.
• Mentor and advise junior members of the team.
• Act in a technical lead and solution lead role for a team of onshore and offshore developers.
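To make the MDM responsibilities above a little more concrete, here is a deliberately simplified, hypothetical sketch of matching duplicate customer records on a normalized key and keeping one "golden" record per entity; production MDM platforms apply far richer match and survivorship rules.

    # Toy MDM-style merge: group records by a normalized match key and keep
    # the most recently updated record per group (hypothetical survivorship rule).
    from collections import defaultdict

    records = [
        {"name": "ACME Corp",  "email": "ops@acme.com",   "updated": "2024-01-10"},
        {"name": "Acme Corp.", "email": "sales@acme.com", "updated": "2024-03-02"},
    ]

    def match_key(rec):
        return rec["name"].lower().replace(".", "").replace(" ", "")

    groups = defaultdict(list)
    for rec in records:
        groups[match_key(rec)].append(rec)

    golden = [max(grp, key=lambda r: r["updated"]) for grp in groups.values()]
    print(golden)  # one surviving record per matched entity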
Responsibility of / Expectations from the Role
1. Assist in the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on Azure.
2. Profound experience in designing and developing data integration solutions using ETL tools such as DBT.
3. Hands-on experience in the implementation of cloud data warehouses using Snowflake and Azure Data Factory.
4. Solid MS SQL Server skills, including reporting experience.
5. Work closely with product managers and engineers to design, implement, test, and continually improve scalable data solutions and services running on the DBT and Snowflake cloud platforms.
6. Implement critical and non-critical system data integration and ingestion fixes for the data platform and environment.
7. Ensure root-cause resolution of identified problems.
8. Monitor and support the Data Solutions jobs and processes to meet the daily SLA.
9. Analyze the current analytics environment and make recommendations for appropriate data warehouse modernization and migration to the cloud.
10. Develop Snowflake deployment (using Azure DevOps or a similar CI/CD tool) and usage best practices.
11. Follow best practices and standards around data governance, security, and privacy.
12. Be comfortable working in a fast-paced team environment coordinating multiple projects.
13. Effective software development life cycle management skills and experience with GitHub.
14. Leverage tools like Fivetran, DBT, Snowflake, and GitHub to drive ETL, data modeling, and analytics (a minimal connectivity sketch follows this list).
15. Data transformation and data analytics documentation.
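For orientation only, a minimal connectivity sketch against Snowflake using the snowflake-connector-python package (referenced from item 14 above). Account, credentials, and the warehouse/database/schema names are placeholders that would normally come from a secrets store or CI/CD configuration.

    # Minimal Snowflake connectivity check (snowflake-connector-python assumed).
    import os
    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],   # placeholder account identifier
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",                  # hypothetical warehouse
        database="ANALYTICS_DB",                   # hypothetical database
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        cur.execute("SELECT CURRENT_VERSION()")    # simple health check
        print(cur.fetchone())
    finally:
        conn.close()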
Desired Competencies (Technical/Behavioral Competency)
Must-Have
Snowflake, DBT, PL/SQL, overall Azure/AWS knowledge, knowledge of DB modelling, knowledge of data warehouse concepts, well versed in Agile delivery, ETL tools (Informatica/ADF)
Good-to-Have
Azure certified (AZ-900/AZ-104/AZ-204), Informatica/SSIS/ADF
Job Location: Hyderabad, Indore and Ahmedabad.
Role:
We are looking for an experienced AEM Backend Developer to join our digital platforms team. The ideal candidate will have strong backend development skills in Java (primary) and/or Python, along with proven experience in designing and building scalable, maintainable Adobe Experience Manager (AEM) solutions. This role will focus on implementing backend logic, custom workflows, integrations, and supporting content management features.
Key Responsibilities:
· Design, develop, and maintain AEM-based backend solutions, components, and templates.
· Develop custom AEM Sling models, servlets, services, and OSGi components.
· Build and integrate RESTful services and APIs to support frontend and third-party systems.
· Work closely with frontend developers and AEM content authors to support dynamic content delivery.
· Develop automation scripts using Java/Python for data handling, deployment, and reporting needs (a minimal Python sketch follows this list).
· Implement AEM workflows, user permissions, and version control for content.
· Troubleshoot and resolve technical issues in AEM environments.
· Optimize AEM performance, scalability, and security configurations.
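As an example of the Python-based automation/integration scripting mentioned in the responsibilities above, the sketch below pulls a page's content through AEM's standard Sling JSON rendering. The host, credentials, and content path are placeholders for illustration only.

    # Fetch a content node from an AEM author instance via Sling's .json rendering.
    import requests

    AEM_HOST = "http://localhost:4502"        # placeholder author instance
    PAGE_PATH = "/content/my-site/en/home"    # hypothetical page path

    resp = requests.get(f"{AEM_HOST}{PAGE_PATH}.1.json",  # depth-1 JSON rendering
                        auth=("admin", "admin"),           # placeholder credentials
                        timeout=10)
    resp.raise_for_status()
    page = resp.json()
    print(page.get("jcr:content", {}).get("jcr:title"))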
What You’ll Bring:
· Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent work experience.
· Proven experience delivering high-quality web applications.
Mandatory Skills
· 3+ years of experience in AEM backend development.
· Strong hands-on experience with Java, OSGi, Sling, and JCR.
· Experience with Python for backend scripting, automation, or integration tasks.
· Knowledge of AEM architecture including dispatcher, replication agents, and workflows.
· Experience working with AEM APIs, servlets, event handlers, schedulers, and custom components.
· Understanding of REST APIs, JSON/XML data handling.
Key Responsibilities:
· Develop and maintain frontend components using AEM Sites.
· Integrate AEM with React/Angular for enhanced user interfaces.
· Work with AEM templates, components, dialogs, and client libraries.
· Collaborate with backend AEM developers and designers to implement dynamic and responsive web features.
· Ensure code quality through unit testing and best practices.
· Participate in code reviews, debugging, and performance tuning.
· Support AEM content authors with technical implementation and troubleshooting.
What You’ll Bring:
· Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent work experience.
· Proven experience delivering high-quality web applications.
Mandatory Skills:
· 3+ years of frontend development experience with React.js or Angular.
· 2+ years of experience working with Adobe Experience Manager (AEM).
· Strong knowledge of HTML5, CSS3, JavaScript, and TypeScript.
· Experience with AEM client libraries, Sightly (HTL), Sling models, and component development.
· Familiarity with RESTful APIs and JSON.
· Understanding of responsive design, cross-browser compatibility, and accessibility standards.
· Familiarity with Git, CI/CD pipelines, and Agile methodologies.