
Risk Resources LLP, Hyderabad
https://riskresourcesindia.com
Jobs at Risk Resources LLP, Hyderabad
Role - RSA Archer Technical Specialist
Preferred location - Bangalore and other key metros
Experience band - 10+ years
Job Description:
- Experience in application development on the Archer platform
- Proficiency in Archer configuration, including custom fields, rules, and workflows
- Strong understanding of GRC concepts and the business context of Archer solutions
- Experience with web technologies, including HTML, JavaScript, and CSS
- Familiarity with integration techniques and APIs (see the sketch after this list)
- Excellent problem-solving and analytical skills
- Ability to work independently and collaboratively in a fast-paced environment
- Strong communication skills for interacting effectively with stakeholders
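For context on the integration point above, here is a minimal, hypothetical sketch of calling Archer's REST API from Python. The host, instance, credentials, and record id are placeholders, and the endpoint paths should be verified against your Archer version's documentation.

```python
# Minimal sketch: authenticating against the Archer REST API and fetching a
# record. Host, instance, credentials, and record id are placeholders; verify
# endpoint paths against your platform version's REST API documentation.
import requests

BASE = "https://archer.example.com"  # hypothetical host

def archer_login(instance: str, user: str, password: str) -> str:
    """Create an Archer session and return its token."""
    resp = requests.post(
        f"{BASE}/api/core/security/login",
        json={"InstanceName": instance, "Username": user, "Password": password},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["RequestedObject"]["SessionToken"]

def get_record(token: str, record_id: int) -> dict:
    """Fetch a single content record by id."""
    resp = requests.get(
        f"{BASE}/api/core/content/{record_id}",
        headers={"Authorization": f"Archer session-id={token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    token = archer_login("Default", "api_user", "secret")  # placeholder credentials
    print(get_record(token, 12345))                        # placeholder record id
```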
Role - AEM Developer (Adobe Experience Manager)
Key Responsibilities:
- Design, develop, and implement websites and components within Adobe Experience Manager (AEM).
- Develop and customize AEM components, templates, dialogs, workflows, servlets, and OSGi services.
- Integrate AEM with external systems and third-party services (CRM, DAM, Analytics, etc.); see the sketch after this list.
- Implement responsive front-end solutions using HTML5, CSS3, JavaScript, ReactJS/Angular, and Sightly/HTL.
- Work closely with AEM Architects to ensure technical alignment and best practices.
- Optimize performance of AEM pages and ensure adherence to SEO and accessibility standards.
- Manage AEM environments, deployments, and configurations.
- Provide technical guidance and mentorship to junior developers.
- Collaborate with QA and DevOps teams for CI/CD automation and deployment pipelines.
- Troubleshoot and resolve AEM production issues.
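As a concrete illustration of the integration work above, here is a minimal sketch of querying AEM content from an external system via the stock QueryBuilder JSON servlet. The instance URL, credentials, and content path are placeholders.

```python
# Minimal sketch: querying AEM content from an external integration via the
# QueryBuilder JSON servlet (/bin/querybuilder.json). Instance URL and
# credentials are placeholders; parameter names follow the stock QueryBuilder
# predicates.
import requests

AEM = "http://localhost:4502"   # hypothetical author instance
AUTH = ("admin", "admin")       # placeholder credentials

def find_pages(root: str, limit: int = 10) -> list[str]:
    """Return paths of cq:Page nodes under `root`."""
    params = {
        "path": root,
        "type": "cq:Page",
        "p.limit": str(limit),
    }
    resp = requests.get(f"{AEM}/bin/querybuilder.json", params=params,
                        auth=AUTH, timeout=30)
    resp.raise_for_status()
    return [hit["path"] for hit in resp.json().get("hits", [])]

if __name__ == "__main__":
    for path in find_pages("/content/we-retail"):  # placeholder content root
        print(path)
```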
Job Title: PySpark/Scala Developer
Functional Skills: Experience in the Credit Risk/Regulatory Risk domain
Technical Skills: Spark, PySpark, Python, Hive, Scala, MapReduce, Unix shell scripting
Good-to-Have Skills: Exposure to machine learning techniques
Job Description:
5+ years of experience developing, fine-tuning, and implementing programs/applications using Python/PySpark/Scala on a Big Data/Hadoop platform.
Roles and Responsibilities:
a) Work with a leading bank's Risk Management team on specific projects/requirements pertaining to risk models in consumer and wholesale banking
b) Enhance machine learning models using PySpark or Scala
c) Work with data scientists to build ML models based on business requirements and follow the ML cycle to deploy them all the way to the production environment (a minimal sketch follows the skills list below)
d) Participate in feature engineering, model training, scoring, and retraining
e) Architect data pipelines and automate data ingestion and model jobs
Skills and competencies:
Required:
· Strong analytical skills in conducting sophisticated statistical analysis using bureau/vendor data, customer performance data, and macro-economic data to solve business problems.
· Working experience in PySpark and Scala, developing code to validate and implement models in Credit Risk/Banking.
· Experience with distributed systems such as Hadoop/MapReduce, Spark, streaming data processing, and cloud architecture.
· Familiarity with machine learning frameworks and libraries (e.g., scikit-learn, Spark ML, TensorFlow, PyTorch).
· Experience in systems integration, web services, and batch processing.
· Experience migrating code to PySpark/Scala is a big plus.
· Ability to act as a liaison, conveying the information needs of the business to IT and data constraints to the business, with equal fluency in business strategy and IT strategy, business processes, and workflow.
· Flexibility in approach and thought process.
· Willingness to learn and comprehend periodic changes in regulatory requirements (e.g., from the Fed).
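To make the build-and-score cycle above concrete, here is a minimal PySpark sketch using pyspark.ml. The input columns and toy data are invented for illustration; a real credit-risk model would use engineered bureau and performance features.

```python
# Minimal sketch of the train/score cycle described above, using pyspark.ml.
# The columns (util_rate, dpd_count, label) and the rows are invented for
# illustration only.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("risk-model-sketch").getOrCreate()

df = spark.createDataFrame(
    [(0.92, 3, 1.0), (0.15, 0, 0.0), (0.55, 1, 0.0), (0.88, 4, 1.0)],
    ["util_rate", "dpd_count", "label"],
)

# Feature engineering: assemble raw columns into a single feature vector.
assembler = VectorAssembler(inputCols=["util_rate", "dpd_count"],
                            outputCol="features")
lr = LogisticRegression(featuresCol="features", labelCol="label")
model = Pipeline(stages=[assembler, lr]).fit(df)

# Scoring: the fitted pipeline can be saved and reused for batch scoring.
model.transform(df).select("util_rate", "dpd_count", "prediction").show()
```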
Role - Kafka Testing / QA Engineer
Desired Competencies (Technical/Behavioral Competency)
Must-Have
· Strong understanding of Kafka concepts, including topics, partitions, consumers, producers, and security.
· Experience with testing Kafka Connect, Kafka Streams, and other Kafka ecosystem components (a minimal round-trip test sketch appears at the end of this posting).
· API Testing Experience
· X-RAY and Test Automation Experience
· Expertise with Postman/SOAP
· Agile/JIRA/Confluence
· Strong familiarity with data formats such as XML, JSON, CSV, Avro, etc.
· Strong hands-on experience with SQL and MongoDB.
· Continuous integration and automated testing.
· Working knowledge and experience of Git.
Good-to-Have
· Ability to troubleshoot Kafka-related issues; strong in Kafka client configuration and troubleshooting.
Responsibility of / Expectations from the Role:
1. Engage with the customer to understand the requirements, provide technical solutions, and provide value-added suggestions.
2. Help build and manage the team of Kafka and Java developers in the near future.
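As referenced in the Must-Have list, here is a minimal sketch of a Kafka produce/consume round-trip check using the confluent-kafka client. The broker address and topic are placeholders; a real suite would also exercise schemas, partitions, and security configurations.

```python
# Minimal sketch of a produce/consume round-trip test with confluent-kafka.
# Broker and topic are placeholders.
import uuid
from confluent_kafka import Producer, Consumer

BROKER = "localhost:9092"   # placeholder broker
TOPIC = "qa.smoke.topic"    # placeholder topic

def test_round_trip() -> None:
    payload = f"probe-{uuid.uuid4()}".encode()

    producer = Producer({"bootstrap.servers": BROKER})
    producer.produce(TOPIC, value=payload)
    producer.flush(10)

    consumer = Consumer({
        "bootstrap.servers": BROKER,
        "group.id": f"qa-{uuid.uuid4()}",   # fresh group so we read from start
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe([TOPIC])
    try:
        for _ in range(50):                 # poll up to ~5 seconds
            msg = consumer.poll(0.1)
            if msg is not None and not msg.error() and msg.value() == payload:
                print("round trip OK")
                return
        raise AssertionError("message was not consumed back")
    finally:
        consumer.close()

if __name__ == "__main__":
    test_round_trip()
```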
Role - MDM / ETL Lead Developer
• Technical expertise in the development of Master Data Management (MDM), data extraction, transformation, and load (ETL) applications, and big data solutions using existing and emerging technology platforms and cloud architecture
• Functions as lead developer
• Support system analysis, technical/data design, development, and unit testing, and oversee the end-to-end data solution
• Technical SME in Master Data Management applications, ETL, big data, and cloud technologies (a minimal matching/survivorship sketch follows this list)
• Collaborate with IT teams to ensure technical designs and implementations account for requirements, standards, and best practices
• Performance tuning of end-to-end MDM, database, ETL, and big data processes, or of the source/target database endpoints, as needed
• Mentor and advise junior team members
• Serve as technical lead and solution lead for a team of onshore and offshore developers
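As referenced above, here is a minimal PySpark sketch of one core MDM task: collapsing duplicate source records into a golden record. The columns and the survivorship rule are simplified assumptions for illustration.

```python
# Minimal sketch of MDM-style survivorship: keep one "golden record" per
# customer key. Column names and the rule (most recent update wins) are
# invented for illustration; real survivorship logic is usually far richer.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("mdm-golden-record-sketch").getOrCreate()

records = spark.createDataFrame(
    [
        ("C001", "ANN LEE", "CRM", "2024-01-05"),
        ("C001", "Ann Lee", "ERP", "2024-03-10"),
        ("C002", "Raj Iyer", "CRM", "2024-02-01"),
    ],
    ["customer_key", "name", "source_system", "updated_on"],
)

# Survivorship rule (simplified): most recently updated record per key wins.
w = Window.partitionBy("customer_key").orderBy(F.col("updated_on").desc())
golden = (
    records.withColumn("rn", F.row_number().over(w))
           .filter(F.col("rn") == 1)
           .drop("rn")
)
golden.show()
```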
Role - Snowflake / DBT Data Engineer
Desired Competencies (Technical/Behavioral Competency)
Must-Have
Snowflake, DBT, PL/SQL; overall knowledge of Azure/AWS; knowledge of DB modelling and data warehouse concepts; well versed in Agile delivery; ETL tools - Informatica/ADF
Good-to-Have
Azure certification (AZ-900/AZ-104/AZ-204); Informatica/SSIS/ADF
Responsibility of / Expectations from the Role:
1. Assist in the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on Azure (a minimal connection sketch follows this list).
2. Profound experience in designing and developing data integration solutions using ETL tools such as DBT.
3. Hands-on experience implementing cloud data warehouses using Snowflake and Azure Data Factory.
4. Solid MS SQL Server skills, including reporting experience.
5. Work closely with product managers and engineers to design, implement, test, and continually improve scalable data solutions and services running on DBT and Snowflake cloud platforms.
6. Implement critical and non-critical system data integration and ingestion fixes for the data platform and environment.
7. Ensure root-cause resolution of identified problems.
8. Monitor and support Data Solutions jobs and processes to meet the daily SLA.
9. Analyze the current analytics environment and make recommendations for data warehouse modernization and migration to the cloud.
10. Develop Snowflake deployment and usage best practices (using Azure DevOps or a similar CI/CD tool).
11. Follow best practices and standards around data governance, security, and privacy.
12. Work comfortably in a fast-paced team environment, coordinating multiple projects.
13. Manage the software development life cycle effectively, with experience using GitHub.
14. Leverage tools like Fivetran, DBT, Snowflake, and GitHub to drive ETL, data modeling, and analytics.
15. Document data transformation and data analytics work.
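As referenced in item 1, here is a minimal sketch of connecting to Snowflake from Python with snowflake-connector-python, e.g., for a daily SLA check. The account, credentials, and table name are placeholders.

```python
# Minimal sketch: running a query against Snowflake from Python with
# snowflake-connector-python. Account, credentials, and object names are
# placeholders; in practice credentials belong in a secrets store.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.west-europe.azure",   # placeholder account locator
    user="ETL_SVC",                        # placeholder service user
    password="***",                        # placeholder; use a secrets store
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)
try:
    cur = conn.cursor()
    # Example SLA check: rows loaded today into a (hypothetical) fact table.
    cur.execute("SELECT COUNT(*) FROM FCT_ORDERS WHERE LOAD_DATE = CURRENT_DATE()")
    (rows_loaded,) = cur.fetchone()
    print(f"rows loaded today: {rows_loaded}")
finally:
    conn.close()
```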
Job Location: Hyderabad, Indore and Ahmedabad.
Role:
We are looking for an experienced AEM Backend Developer to join our digital platforms team. The ideal candidate will have strong backend development skills in Java (primary) and/or Python, along with proven experience in designing and building scalable, maintainable Adobe Experience Manager (AEM) solutions. This role will focus on implementing backend logic, custom workflows, integrations, and supporting content management features.
Key Responsibilities:
· Design, develop, and maintain AEM-based backend solutions, components, and templates.
· Develop custom AEM Sling models, servlets, services, and OSGi components.
· Build and integrate RESTful services and APIs to support frontend and third-party systems.
· Work closely with frontend developers and AEM content authors to support dynamic content delivery.
· Develop automation scripts using Java/Python for data handling, deployment, and reporting needs.
· Implement AEM workflows, user permissions, and version control for content.
· Troubleshoot and resolve technical issues in AEM environments.
· Optimize AEM performance, scalability, and security configurations.
What You’ll Bring:
· Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent work experience.
· Proven experience delivering high-quality web applications.
Mandatory Skills
· 3+ years of experience in AEM backend development.
· Strong hands-on experience with Java, OSGi, Sling, and JCR.
· Experience with Python for backend scripting, automation, or integration tasks (see the sketch after this list).
· Knowledge of AEM architecture including dispatcher, replication agents, and workflows.
· Experience working with AEM APIs, servlets, event handlers, schedulers, and custom components.
· Understanding of REST APIs, JSON/XML data handling.
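To illustrate the Python automation skill above, here is a minimal, hypothetical sketch that requests activation of a content path through AEM's replication servlet. The instance URL, credentials, and path are placeholders; verify the endpoint and any CSRF-token requirements for your AEM version.

```python
# Minimal sketch of Python automation against AEM: activating (publishing) a
# content path via the replication servlet (/bin/replicate.json). Instance,
# credentials, and path are placeholders.
import requests

AEM = "http://localhost:4502"   # hypothetical author instance
AUTH = ("admin", "admin")       # placeholder credentials

def activate(path: str) -> None:
    """Request activation (publish) of a content path."""
    resp = requests.post(
        f"{AEM}/bin/replicate.json",
        data={"cmd": "Activate", "path": path},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    print(f"activation requested for {path}")

if __name__ == "__main__":
    activate("/content/we-retail/us/en")   # placeholder content path
```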