
- Excellent knowledge of, and proven implementation experience with:
- Axiom Controller View V9 / V10 (2+ years' experience).
- Building and configuring Axiom workflows to solve complex business problems and mapping them to Axiom data dictionaries.
- XBRL and template disclosure submission, including experience with taxonomies, data point models, and validations.
- EU regulatory reporting requirements (COREP/Statistical), with strong Capital knowledge.
- Packaging, deploying, and branching Axiom solutions, including upgrades, as part of the software development lifecycle.
- Oracle PL/SQL.

Job Purpose
We are seeking a highly skilled Data Architect to join our Financial Business Automation Products group. The ideal candidate will have a deep understanding of data architecture principles, extensive experience with data modeling, and the ability to design and implement scalable data solutions in financial domains. As a Data Architect, you will develop and maintain the data strategy, governance, security, and availability for the suite of products developed by Acuity and used by Acuity clients. The individual must be familiar with Gen AI technologies and adept at creating a data strategy that allows Gen AI-based LLMs to use the data infrastructure to provide relevant outcomes.
Key Responsibilities
• Platform Design and Architecture:
o Lead the design and development of the data platform architecture, ensuring scalability, performance, reliability, and security
o Define and implement standards for data modeling, data integration, and data lifecycle management
o Be well versed in the modern data platform stack, with end-to-end coverage for building large-scale data and AI solutions
o Create blueprints for data pipelines, data lakes, data warehouses, and analytical systems
o Provide technical leadership in choosing appropriate technologies for data processing, cloud compute, and storage solutions
• Technical Solutions and Roadmap:
o Influence enterprise architecture design conversations and deliver sophisticated data solutions
o Work closely with leaders, data engineers, data scientists, and analysts to define and refine data platform requirements
o Lead cross-functional teams to develop and integrate new data products and solutions
o Understand business needs and translate them into data solutions and an architecture roadmap that add value to the organization
• Cloud Usage and Governance:
o Design and implement cloud-based solutions for data processing and storage (e.g., Azure, Snowflake, Databricks, GCP)
o Optimize cloud resources for cost efficiency, performance, and availability
o Ensure the security and compliance of data platforms, addressing regulatory and privacy concerns
o Develop strategies to enforce data governance policies, ensuring data quality, consistency, and integrity across systems
o Design data security measures and control access to sensitive data through role-based access and encryption
• Innovation & Continuous Improvement:
o Stay up-to-date with emerging technologies and trends in data architecture, big data, cloud computing, and AI
o Recommend and lead initiatives to improve the performance, scalability, and efficiency of data processing and storage systems
o Act as the Data Architecture subject matter expert to drive the innovation for the company
• Documentation and Technical design:
o Produce detailed documentation for platform architecture, data models, and data workflows
o Well versed with technical design, diagrams, and documentation tools
• AI Integration:
o Collaborate with AI/ML teams to ensure data architectures support Gen AI initiatives.
o Enable real-time and batch data processing for AI model training and deployment.
Technology and Tools:
• 10+ years' experience designing and implementing end-to-end data platforms, including data lakes, data warehouses, and data integration pipelines.
• Experience designing and developing low-latency, high-throughput, enterprise-grade data architecture ecosystems
• Knowledge of relational and non-relational databases, and big data technologies (e.g., Hadoop, Spark, Kafka).
• Expertise in cloud and data platforms (Azure, Snowflake, Databricks) and supporting tooling (GitHub, Jenkins, etc.)
• Strong knowledge of ETL processes and tools for real-time data processing
• Proficiency in building data solutions using tools like Apache Kafka, Apache Airflow, and dbt (Data Build Tool) and Python
• Strong understanding of SQL and data querying best practices
• Proficiency in managing and deploying solutions on cloud platforms such as Azure, Snowflake, Databricks
• Experience with data encryption, privacy, and security best practices, including GDPR compliance
• Excellent problem-solving and communication skills
• Strong scripting skills in Python, Shell, or similar languages for automation and process optimization
• Familiarity with CI/CD pipelines, version control (Git), and deployment automation tools (Jenkins, Terraform)
• Familiarity with BI tools such as Tableau, Power BI, or Looker, as well as experience working with data scientists and analysts to support analytical workloads
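The ETL and Python-scripting requirements above can be illustrated with a minimal sketch. Assuming nothing beyond the Python standard library, a toy extract-transform-load step into an in-memory SQLite table (all column and table names here are illustrative) might look like:

```python
import csv
import io
import sqlite3

# Illustrative raw extract; in a real pipeline this would come from a file
# or an upstream system rather than an inline string.
RAW_CSV = """symbol,price,qty
AAPL,190.5,10
MSFT,410.0,5
"""

def run_etl(raw_csv: str) -> sqlite3.Connection:
    """Extract rows from CSV text, derive a notional value, load into SQLite."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE trades (symbol TEXT, notional REAL)")
    reader = csv.DictReader(io.StringIO(raw_csv))
    for row in reader:
        # Transform: notional = price * quantity.
        notional = float(row["price"]) * int(row["qty"])
        conn.execute("INSERT INTO trades VALUES (?, ?)", (row["symbol"], notional))
    conn.commit()
    return conn

conn = run_etl(RAW_CSV)
total = conn.execute("SELECT SUM(notional) FROM trades").fetchone()[0]
```

In production the same shape would typically be orchestrated by a scheduler such as Airflow and targeted at a warehouse rather than SQLite; the extract/transform/load separation is the point of the sketch.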


What we are looking for:
• 1+ years of experience.
• Good understanding of website integration and software creation.
• Excellent verbal and written communication skills.
• Able to follow clear instructions according to client demands.
• Strong analytical and problem-solving skills.
• Creative ability to produce effective solutions to client problems.
• Good team player who is enthusiastic about delivering results.
• Hands-on with HTML, CSS, JavaScript, PHP, C++, Java, and other relevant web development languages.
• Create and test applications for websites.
• Collaborate with team members.
• Troubleshoot website problems.
• Proficient understanding of code versioning tools.
• Familiarity with development-aiding tools.
• Knowledge of React JS & Angular JS is a plus point.
- Azure Data Factory, Azure Data Bricks, Talend, BODS, Jenkins
- Microsoft Office (mandatory)
- Strong knowledge of databases, Azure Synapse, data management, and SQL
- Knowledge of cloud platforms (Azure, AWS, etc.)
Must Have Skills:

The candidate must have:
1. Excellent problem-solving and logical skills.
2. High proficiency with Node.js and JavaScript design patterns.
3. Experience developing RESTful APIs that read and write JSON.
4. Experience with databases such as MongoDB, Redis, or other NoSQL databases.
5. Proficient understanding of code versioning tools, such as Git.
6. Experience with AWS and Elasticsearch would be an added advantage.
7. Ability to plan and execute projects to deliver on time and with quality.
8. High motivation to learn and to mentor project members.
9. Excellent communication and collaboration skills.
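As a hedged illustration of item 3 in the list above — sketched in Python with only the standard-library `json` module for brevity, though the role itself targets Node.js — a REST-style handler that reads a JSON request body and writes a JSON response reduces to parse, validate, serialize. All field names here are made up for the example:

```python
import json

def handle_create_user(request_body: str) -> str:
    """Toy REST-style handler: parse a JSON payload, validate it,
    and return a JSON response string. Field names are illustrative."""
    payload = json.loads(request_body)
    if "name" not in payload:
        # Missing required field: mimic an HTTP 400 response body.
        return json.dumps({"status": 400, "error": "name is required"})
    # A real service would persist the user; here we just echo a fake id.
    return json.dumps({"status": 201, "user": {"id": 1, "name": payload["name"]}})

ok = handle_create_user('{"name": "Asha"}')
bad = handle_create_user('{}')
```

The same read/validate/write shape carries over directly to an Express route handler in Node.js, with `req.body` and `res.json()` in place of the string parameters.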


About the Company:
It is a Data as a Service company that helps businesses harness the power of data. Our technology fuels some of the most interesting big data projects in the world. We are a small bunch of people working towards shaping the imminent data-driven future by solving some of its fundamental and toughest challenges.
Role: We are looking for an experienced team lead to drive data acquisition projects end to end. In this role, you will be working in the web scraping team with data engineers, helping them solve complex web problems and mentor them along the way. You’ll be adept at delivering large-scale web crawling projects, breaking down barriers for your team and planning at a higher level, and getting into the detail to make things happen when needed.
Responsibilities
- Interface with clients and sales team to translate functional requirements into technical requirements
- Plan and estimate tasks with your team, in collaboration with the delivery managers
- Engineer complex data acquisition projects
- Guide and mentor your team of engineers
- Anticipate issues that might arise and proactively factor them into the design
- Perform code reviews and suggest design changes
Prerequisites
- 5-8 years of relevant experience
- Fluent programming skills and well-versed with scripting languages like Python or Ruby
- Solid foundation in data structures and algorithms
- Excellent tech troubleshooting skills
- Good understanding of web data landscape
- Prior exposure to the DOM and XPath; hands-on experience with Selenium/automated testing is a plus
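The DOM/XPath bullet above can be sketched without a browser at all. Python's standard-library ElementTree supports a limited XPath subset for extracting structured data from markup (Selenium is only needed when JavaScript rendering is involved; real-world HTML is rarely well-formed XML and would first pass through an HTML parser). The snippet and field names below are invented for the example:

```python
import xml.etree.ElementTree as ET

# Illustrative product-listing snippet; in practice this markup would come
# from an HTTP fetch and would be cleaned into well-formed XML first.
SNIPPET = """
<html><body>
  <div class="product"><span class="name">Widget</span><span class="price">9.99</span></div>
  <div class="product"><span class="name">Gadget</span><span class="price">24.50</span></div>
</body></html>
"""

def extract_products(xml_text: str) -> list[tuple[str, float]]:
    root = ET.fromstring(xml_text)
    items = []
    # ElementTree's XPath subset: descendant search with attribute predicates.
    for product in root.findall(".//div[@class='product']"):
        name = product.find("span[@class='name']").text
        price = float(product.find("span[@class='price']").text)
        items.append((name, price))
    return items

products = extract_products(SNIPPET)
```

At crawling scale the same extraction logic is what gets reviewed for robustness: selectors pinned to stable attributes, and graceful handling of missing nodes.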
Skills and competencies
- Prior experience with team handling and people management is mandatory
- Work independently with little to no supervision
- Extremely high attention to detail
- Ability to juggle multiple projects

• Manage the implementation of MES projects, including the schedule, team coordination, etc.
• Work with the teams to deliver system solutions for customers and participate in the entire process.
• Facilitate customer workshops to discover and document customer requirements
• Lead customer design sessions to formulate and document customer solutions
• Be the SME for the development and deployment phases of the project
• Participate in the development of business test cases and solution testing.
• Provide onsite customer training.
Skills Expected:
• Excellent presentation and facilitation skills
• Able to work independently and efficiently to meet deadlines
• Able to multi-task multiple priorities in a fast-paced environment
• Self-motivated and detail-oriented, with excellent organizational and time management skills
• Team player with a ‘can-do’ positive attitude
• Fluent in English (both spoken and written), with strong communication, interpersonal, organizational, and presentation skills
• Able to travel frequently
• Have a minimum of eight years of related experience





