Data governance jobs

11+ Data governance Jobs in India

Apply to 11+ Data governance Jobs on CutShort.io. Find your next job, effortlessly. Browse Data governance Jobs and apply today!

Affine
Posted by Shibhani Shetty
Remote only
8 - 13 yrs
Best in industry
Data Warehouse (DWH)
Informatica
ETL
Snowpipe
SnowSQL

Snowflake Administrator – Acquired Skills:

  • Identify the various aspects of compute and storage management
  • Illustrate administrative tasks within the detailed architecture of Snowflake
  • Review Snowflake best practices & considerations for managing load operations and performance
  • Describe data governance in Snowflake, including column-level data security using secure views and dynamic data masking features (see the sketch after this list)
  • Manage multiple accounts across the Organization
  • Describe the DDL operations that work with fundamental database objects
  • Discuss transaction and concurrency models and DML considerations
  • Employ recovery methods and agile development with Time Travel & Cloning
  • Implement advanced techniques for Snowflake performance-tuning methodologies
  • Design and develop secure access to objects in Snowflake with Role-Based Access Control (RBAC)
  • Recommend the Snowflake best practices for management, monitoring, and optimization
  • Use data replication for data sharing across accounts and for failover scenarios
  • Share data securely both within and outside your organization
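For illustration, a minimal Python sketch (using the snowflake-connector-python client) of what the column-level security and RBAC items above can look like in practice. The account, role, database, table, and policy names are placeholders, not details from this posting.

    import snowflake.connector  # pip install snowflake-connector-python

    # Hypothetical connection details; replace with your own account and credentials.
    conn = snowflake.connector.connect(
        account="xy12345.ap-south-1",
        user="SNOWFLAKE_ADMIN",
        password="********",
        role="SECURITYADMIN",
        warehouse="ADMIN_WH",
    )
    cur = conn.cursor()

    # Role-Based Access Control: create a role and grant it read access to a schema.
    cur.execute("CREATE ROLE IF NOT EXISTS PII_READER")
    cur.execute("GRANT USAGE ON DATABASE CRM TO ROLE PII_READER")
    cur.execute("GRANT USAGE ON SCHEMA CRM.PUBLIC TO ROLE PII_READER")
    cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA CRM.PUBLIC TO ROLE PII_READER")

    # Dynamic data masking: hide the email column from every role except PII_READER.
    cur.execute("""
        CREATE MASKING POLICY IF NOT EXISTS CRM.PUBLIC.EMAIL_MASK AS (val STRING)
        RETURNS STRING ->
        CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val ELSE '*** MASKED ***' END
    """)
    cur.execute(
        "ALTER TABLE CRM.PUBLIC.CUSTOMERS MODIFY COLUMN EMAIL "
        "SET MASKING POLICY CRM.PUBLIC.EMAIL_MASK"
    )

    cur.close()
    conn.close()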


Detailed Snowflake Architecture and Overview

  • Snowflake Technical Overview
  • Overview of Three-Tiered Architecture


Compute Management

  • Scaling Virtual Warehouses Up & Out
  • Creating and Managing Virtual Warehouses
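A short, hedged sketch of the two items above using the same connector; the warehouse name and sizes are placeholders, and multi-cluster (scale-out) warehouses assume Enterprise edition or above.

    import snowflake.connector  # pip install snowflake-connector-python

    conn = snowflake.connector.connect(
        account="xy12345.ap-south-1", user="SNOWFLAKE_ADMIN",
        password="********", role="SYSADMIN",
    )
    cur = conn.cursor()

    # Scale out: a multi-cluster warehouse adds clusters under concurrent load.
    cur.execute("""
        CREATE WAREHOUSE IF NOT EXISTS ETL_WH
        WAREHOUSE_SIZE = 'MEDIUM'
        MIN_CLUSTER_COUNT = 1
        MAX_CLUSTER_COUNT = 3
        SCALING_POLICY = 'STANDARD'
        AUTO_SUSPEND = 300
        AUTO_RESUME = TRUE
    """)

    # Scale up: resize for a heavy batch window, then size back down afterwards.
    cur.execute("ALTER WAREHOUSE ETL_WH SET WAREHOUSE_SIZE = 'XLARGE'")
    cur.execute("ALTER WAREHOUSE ETL_WH SET WAREHOUSE_SIZE = 'MEDIUM'")

    cur.close()
    conn.close()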


Mandatory Skills:

Compute & Storage Management, Load Operations Management, Snowflake Architecture, Data Governance, Column-Level Data Security, Dynamic Data Masking, DDL/DML Operations, Transaction & Concurrency Models, Recovery Methods, Agile Development, Time Travel, Role-Based Access Control (RBAC), Data Replication, Secure Data Sharing


Good to Have Skills:

Talend Admin Experience, Cost Optimization

A leading Data & Analytics intelligence technology solutions provider to companies that value insights from information as a competitive advantage. We are the partner of choice for enterprises on their digital transformation journey. Our teams offer solutions and services at the intersection of Advanced Data, Analytics, and AI.


Agency job
via HyrHub by Shwetha Naik
Bengaluru (Bangalore), Mangalore
8 - 12 yrs
₹18L - ₹22L / yr
Microsoft Dynamics
ETL
Microsoft Windows Azure
Data governance

Requirements:

·  Experience in MS Customer Insights Data

·  Experience in MS Customer Insights Journeys

·  Experience in Azure, Dataverse, and Power Platform

·  At least one full project implementation experience on CDP; if not certified, at least 3 years' experience in CDP-related support or similar work.


Responsibilities:

·  Design, develop, and implement solutions using Microsoft Customer Data Platform (CDP) to manage and analyze customer data.

·  Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.

·  Integrate CDP with various data sources and ensure seamless data flow and accuracy (see the sketch after this list).

·  Develop and maintain data pipelines, ensuring data is collected, processed, and stored efficiently.

·  Create and manage customer profiles, segments, and audiences within the CDP.

·  Implement data governance and security best practices to protect customer data.

·  Monitor and optimize the performance of the CDP infrastructure.

·  Provide technical support and troubleshooting for CDP-related issues.

·  Stay updated with the latest trends and advancements in CDP technology and best practices.
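For illustration, a hedged Python sketch of the kind of integration referenced above: reading rows from a Dataverse table through the OData Web API. The environment URL, token acquisition, and selected columns are hypothetical placeholders, not details from this posting.

    import requests  # pip install requests

    # Hypothetical environment URL and a pre-acquired Azure AD bearer token.
    DATAVERSE_URL = "https://yourorg.crm.dynamics.com"
    ACCESS_TOKEN = "<OAuth2 bearer token from Azure AD>"

    # Query the standard `contacts` table via the Dataverse Web API (OData v4).
    resp = requests.get(
        f"{DATAVERSE_URL}/api/data/v9.2/contacts",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Accept": "application/json",
            "OData-MaxVersion": "4.0",
            "OData-Version": "4.0",
        },
        params={"$select": "fullname,emailaddress1", "$top": "10"},
        timeout=30,
    )
    resp.raise_for_status()
    for row in resp.json().get("value", []):
        print(row.get("fullname"), row.get("emailaddress1"))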

 

NASDAQ listed, Service Provider IT Company

Agency job
via CaptiveAide Advisory Pvt Ltd by Abhishek Dhuria
Bengaluru (Bangalore), Hyderabad
12 - 15 yrs
₹45L - ₹48L / yr
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
Microsoft Windows Azure
Cloud Computing
Microservices

Job Summary:

As a Cloud Architect at our organization, you will play a pivotal role in designing, implementing, and maintaining our multi-cloud infrastructure. You will work closely with various teams to ensure our cloud solutions are scalable, secure, and efficient across different cloud providers. Your expertise in multi-cloud strategies, database management, and microservices architecture will be essential to our success.


Key Responsibilities:

  • Design and implement scalable, secure, and high-performance cloud architectures across multiple cloud platforms (AWS, Azure, Google Cloud Platform).
  • Lead and manage cloud migration projects, ensuring seamless transitions between on-premises and cloud environments.
  • Develop and maintain cloud-native solutions leveraging services from various cloud providers.
  • Architect and deploy microservices using REST and GraphQL APIs to support our application development needs (see the GraphQL sketch after this list).
  • Collaborate with DevOps and development teams to ensure best practices in continuous integration and deployment (CI/CD).
  • Provide guidance on database architecture, including relational and NoSQL databases, ensuring optimal performance and security.
  • Implement robust security practices and policies to protect cloud environments and data.
  • Design and implement data management strategies, including data governance, data integration, and data security.
  • Stay up-to-date with the latest industry trends and emerging technologies to drive continuous improvement and innovation.
  • Troubleshoot and resolve cloud infrastructure issues, ensuring high availability and reliability.
  • Optimize cost and performance across different cloud environments.
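As a hedged illustration of the GraphQL item above, a minimal in-process schema using the graphene library; the type, field, and resolver below are illustrative assumptions, not part of this role's actual stack.

    import graphene  # pip install graphene

    # A tiny GraphQL type a customer-facing microservice might expose.
    class Order(graphene.ObjectType):
        id = graphene.ID()
        status = graphene.String()

    class Query(graphene.ObjectType):
        order = graphene.Field(Order, id=graphene.ID(required=True))

        def resolve_order(root, info, id):
            # Placeholder resolver; a real service would call a database or another API.
            return Order(id=id, status="SHIPPED")

    schema = graphene.Schema(query=Query)

    # Execute a query in-process; in production the schema sits behind an HTTP endpoint.
    result = schema.execute('{ order(id: "42") { id status } }')
    print(result.data)  # {'order': {'id': '42', 'status': 'SHIPPED'}}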


Qualifications/ Experience & Skills Required:

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • Experience: 10 - 15 Years
  • Proven experience as a Cloud Architect or in a similar role, with a strong focus on multi-cloud environments.
  • Expertise in cloud migration projects, both lift-and-shift and greenfield implementations.
  • Strong knowledge of cloud-native solutions and microservices architecture.
  • Proficiency in using GraphQL for designing and implementing APIs.
  • Solid understanding of database technologies, including SQL, NoSQL, and cloud-based database solutions.
  • Experience with DevOps practices and tools, including CI/CD pipelines.
  • Excellent problem-solving skills and ability to troubleshoot complex issues.
  • Strong communication and collaboration skills, with the ability to work effectively in a team environment.
  • Deep understanding of cloud security practices and data protection regulations (e.g., GDPR, HIPAA).
  • Experience with data management, including data governance, data integration, and data security.


Preferred Skills:

  • Certifications in multiple cloud platforms (e.g., AWS Certified Solutions Architect, Google Certified Professional Cloud Architect, Microsoft Certified: Azure Solutions Architect).
  • Experience with containerization technologies (Docker, Kubernetes).
  • Familiarity with cloud cost management and optimization tools.
Kanerika Software
Posted by Rahul Balothia
Riyadh (Saudi Arabia)
5 - 10 yrs
₹10L - ₹25L / yr
Data Warehouse (DWH)
Informatica
ETL
Informatica MDM
IDQ

Job Description

We are looking for you!

You are a team player and a get-it-done person: intellectually curious, customer-focused, self-motivated, and responsible, able to work under pressure with a positive attitude. You have the zeal to think differently, understand that a career is a journey, and make the right choices. You must have experience in creating visually compelling designs that effectively communicate our message and engage our target audience. The ideal candidate is creative, proactive, a go-getter, and motivated to look for ways to add value to job accomplishments.

You have excellent verbal and written communication skills, with the ability to craft compelling UI creatives and visually compelling designs that effectively communicate our message and engage our target audience. You are self-motivated with a strong work ethic, a positive attitude and demeanor, enthusiastic when embracing new challenges, able to multitask and prioritize (good time management skills), willing to learn new technologies and methodologies, and adaptable and flexible when new products are assigned. You prefer to work independently with little or no supervision. You are process-oriented, have a methodical approach, demonstrate a quality-first mindset, and have preferably worked in result-oriented product marketing teams.

Requirements

What you'll do

  • Solution Design: Collaborate with clients to understand their business requirements and design comprehensive Informatica MDM solutions tailored to their needs.
  • Implementation: Lead the implementation of Informatica MDM solutions, including configuration, customization, and integration with existing systems.
  • Data Modeling: Design and implement data models to support MDM initiatives, ensuring data integrity, accuracy, and consistency across systems.
  • Data Quality Management: Implement Informatica Data Quality (IDQ) processes and workflows to improve data quality and ensure compliance with organizational standards (a generic illustration follows this list).
  • Integration: Integrate Informatica MDM solutions with other enterprise systems, such as CRM, ERP, and BI platforms, to facilitate seamless data exchange and interoperability.
  • Customization: Develop custom components, workflows, and extensions to extend the functionality of Informatica MDM and meet specific business requirements.
  • Data Governance: Establish data governance policies, standards, and best practices to govern the use, management, and quality of master data within the organization.
  • Documentation: Prepare comprehensive documentation, including solution designs, technical specifications, user guides, and training materials, to support the implementation and maintenance of Informatica MDM solutions.
  • Testing and Quality Assurance: Conduct thorough testing and quality assurance activities to ensure the reliability, scalability, and performance of Informatica MDM solutions.
  • Training and Support: Provide training and support to end-users, administrators, and stakeholders to ensure successful adoption and utilization of Informatica MDM solutions.
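Informatica Data Quality itself is configured through its own tooling rather than hand-written code; purely as a generic, hedged illustration of the kind of profiling and cleansing rule such a workflow encodes (column names are hypothetical), here is a small pandas sketch:

    import pandas as pd

    # Hypothetical customer master extract.
    df = pd.DataFrame({
        "customer_id": [1, 2, 2, 3],
        "email": ["A@X.COM", None, "a@x.com", "b@y.com "],
        "country": ["IN", "in", "IN", "India"],
    })

    # Profiling: null rates and duplicate keys, the kind of metrics a DQ scorecard tracks.
    null_rates = df.isna().mean()
    duplicate_keys = int(df["customer_id"].duplicated().sum())

    # Cleansing / standardization rules.
    df["email"] = df["email"].str.strip().str.lower()
    df["country"] = df["country"].str.strip().str.upper().replace({"INDIA": "IN"})
    df = df.drop_duplicates(subset=["customer_id", "email"])

    print(null_rates.to_dict(), duplicate_keys)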

What you’ll bring

  • Bachelor's degree in Computer Science, Information Systems, or related field; Master's degree preferred.
  • Minimum of 4 years of experience in implementing and configuring Informatica MDM solutions.
  • Proficiency in Informatica MDM Hub, Informatica Data Director, Informatica Data Quality (IDQ), and related Informatica tools and technologies.
  • Strong understanding of data modelling principles, data integration concepts, and master data management best practices.
  • Hands-on experience with data profiling, cleansing, and transformation using Informatica IDQ.
  • Experience with data governance frameworks, data stewardship workflows, and data quality metrics.
  • Excellent communication skills and the ability to effectively collaborate with cross-functional teams and stakeholders.
  • Proven ability to work independently, manage multiple priorities, and deliver high-quality results within tight deadlines.
  • Informatica certification(s) in MDM and/or IDQ is a plus.

Benefits

1. Culture:

  • Open Door Policy: Encourages open communication and accessibility to management.
  • Open Office Floor Plan: Fosters a collaborative and interactive work environment.
  • Flexible Working Hours: Allows employees to have flexibility in their work schedules.
  • Employee Referral Bonus: Rewards employees for referring qualified candidates.
  • Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.

2. Inclusivity and Diversity:

  • Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
  • Mandatory POSH training: Promotes a safe and respectful work environment.

3. Health Insurance and Wellness Benefits:

  • GMC and Term Insurance: Offers medical coverage and financial protection.
  • Health Insurance: Provides coverage for medical expenses.
  • Disability Insurance: Offers financial support in case of disability.

4. Child Care & Parental Leave Benefits:

  • Company-sponsored family events: Creates opportunities for employees and their families to bond.
  • Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
  • Family Medical Leave: Offers leave for employees to take care of family members' medical needs.

5. Perks and Time-Off Benefits:

  • Company-sponsored outings: Organizes recreational activities for employees.
  • Gratuity: Provides a monetary benefit as a token of appreciation.
  • Provident Fund: Helps employees save for retirement.
  • Generous PTO: Offers more than the industry standard for paid time off.
  • Paid sick days: Allows employees to take paid time off when they are unwell.
  • Paid holidays: Gives employees paid time off for designated holidays.
  • Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.

6. Professional Development Benefits:

  • L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
  • Mentorship Program: Offers guidance and support from experienced professionals.
  • Job Training: Provides training to enhance job-related skills.
  • Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
  • Promote from Within: Encourages internal growth and advancement opportunities.


About company

Who we are


Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Co-founded by a Wharton Business School alumnus, we focus on empowering businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data. We leverage cutting-edge technologies and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.

Awards and Recognitions

Kanerika has won several awards over the years, including:

  • Best Place to Work 2022 by Great Place to Work
  • Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today
  • Frost & Sullivan India 2021 Technology Innovation Award for its Compass composable solution architecture
  • Kanerika has also been recognized for its commitment to customer privacy and data security, having achieved ISO 27701, SOC 2, and GDPR compliance.



Career Forge
Posted by Mohammad Faiz
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 7 yrs
₹12L - ₹15L / yr
Python
Apache Spark
PySpark
Data engineering
ETL

🚀 Exciting Opportunity: Data Engineer Position in Gurugram 🌐


Hello 


We are actively seeking a talented and experienced Data Engineer to join our dynamic team at Reality Motivational Venture in Gurugram (Gurgaon). If you're passionate about data, thrive in a collaborative environment, and possess the skills we're looking for, we want to hear from you!


Position: Data Engineer  

Location: Gurugram (Gurgaon)  

Experience: 5+ years 


Key Skills:

- Python

- Spark, Pyspark

- Data Governance

- Cloud (AWS/Azure/GCP)


Main Responsibilities:

- Define and set up analytics environments for "Big Data" applications in collaboration with domain experts.

- Implement ETL processes for telemetry-based and stationary test data (see the PySpark sketch after this list).

- Support in defining data governance, including data lifecycle management.

- Develop large-scale data processing engines and real-time search and analytics based on time series data.

- Ensure technical, methodological, and quality standards are met.

- Support CI/CD processes.

- Foster know-how development and transfer, continuous improvement of leading technologies within Data Engineering.

- Collaborate with solution architects on the development of complex on-premise, hybrid, and cloud solution architectures.
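A minimal, hedged PySpark sketch of the telemetry ETL responsibility above; the paths, schema, and window size are assumptions for illustration, not project details.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("telemetry-etl").getOrCreate()

    # Hypothetical raw landing zone: JSON events with device_id, event_ts, signal_value.
    raw = spark.read.json("s3://telemetry-bucket/raw/2024/*/*.json")

    cleaned = (
        raw
        .withColumn("event_ts", F.to_timestamp("event_ts"))
        .filter(F.col("signal_value").isNotNull())
        .dropDuplicates(["device_id", "event_ts"])
    )

    # Time-series aggregation: average signal per device over 5-minute windows.
    windowed = (
        cleaned
        .groupBy("device_id", F.window("event_ts", "5 minutes"))
        .agg(F.avg("signal_value").alias("avg_signal"))
    )

    # Write to the curated zone, partitioned for downstream analytics.
    windowed.write.mode("overwrite").partitionBy("device_id").parquet(
        "s3://telemetry-bucket/curated/signals/"
    )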


Qualification Requirements:

- BSc, MSc, MEng, or PhD in Computer Science, Informatics/Telematics, Mathematics/Statistics, or a comparable engineering degree.

- Proficiency in Python and the PyData stack (Pandas/Numpy).

- Experience in high-level programming languages (C#/C++/Java).

- Familiarity with scalable processing environments like Dask (or Spark).

- Proficient in Linux and scripting languages (Bash Scripts).

- Experience in containerization and orchestration of containerized services (Kubernetes).

- Education in database technologies (SQL/OLAP and NoSQL).

- Interest in Big Data storage technologies (Elastic, ClickHouse).

- Familiarity with Cloud technologies (Azure, AWS, GCP).

- Fluent English communication skills (speaking and writing).

- Ability to work constructively with a global team.

- Willingness to travel for business trips during development projects.


Preferable:

- Working knowledge of vehicle architectures, communication, and components.

- Experience in additional programming languages (C#/C++/Java, R, Scala, MATLAB).

- Experience in time-series processing.


How to Apply:

Interested candidates, please share your updated CV/resume with me.


Thank you for considering this exciting opportunity.

Leading Sales Platform

Agency job
via Qrata by Blessy Fernandes
Bengaluru (Bangalore)
5 - 10 yrs
₹30L - ₹45L / yr
Big Data
ETL
Spark
Data engineering
Data governance
  • Work with product managers and development leads to create testing strategies
  • Develop and scale an automated data validation framework (see the sketch after this list)
  • Build and monitor key metrics of data health across the entire Big Data pipelines
  • Establish early alerting and escalation processes to quickly identify and remedy quality issues before anything goes 'live' in front of the customer
  • Build/refine tools and processes for quick root-cause diagnostics
  • Contribute to the creation of quality assurance standards, policies, and procedures to influence the DQ mindset across the company
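For illustration, a hedged PySpark sketch of what one automated data validation check in such a framework can look like; the dataset path, column names, and thresholds are placeholders.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dq-checks").getOrCreate()
    df = spark.read.parquet("s3://warehouse/events/")  # hypothetical pipeline output

    # Key data-health metrics: row count, null rate on a critical column, duplicate rate on the key.
    total = df.count()
    null_rate = df.filter(F.col("account_id").isNull()).count() / max(total, 1)
    dup_rate = 1 - df.select("event_id").distinct().count() / max(total, 1)

    failures = []
    if total == 0:
        failures.append("no rows loaded")
    if null_rate > 0.01:
        failures.append(f"account_id null rate {null_rate:.2%} exceeds 1%")
    if dup_rate > 0.001:
        failures.append(f"event_id duplicate rate {dup_rate:.2%} exceeds 0.1%")

    # In a real framework these failures would feed alerting before anything goes live.
    if failures:
        raise ValueError("; ".join(failures))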
Required skills and experience:

  • Solid experience working in Big Data ETL environments with Spark and Java/Scala/Python
  • Strong experience with AWS cloud technologies (EC2, EMR, S3, Kinesis, etc.)
  • Experience building monitoring/alerting frameworks with tools like New Relic, and escalations with Slack/email/dashboard integrations, etc.
  • Executive-level communication, prioritization, and team leadership skills
A global business process management company

Agency job
via Jobdost by Saida Jabbar
Bengaluru (Bangalore)
2 - 8 yrs
₹4L - ₹10L / yr
Data governance
Data security
Data Analytics
Informatica
SQL

Job Description

We are looking for a senior resource with Analyst skills and knowledge of IT projects, to support delivery of risk mitigation activities and automation in Aviva’s Global Finance Data Office. The successful candidate will bring structure to this new role in a developing team, with excellent communication, organisational and analytical skills. The Candidate will play the primary role of supporting data governance project/change activities. Candidates should be comfortable with ambiguity in a fast-paced and ever-changing environment. Preferred skills include knowledge of Data Governance, Informatica Axon, SQL, AWS. In our team, success is measured by results and we encourage flexible working where possible.

Key Responsibilities

  • Engage with stakeholders to drive delivery of the Finance Data Strategy
  • Support data governance project/change activities in Aviva’s Finance function.
  • Identify opportunities and implement Automations for enhanced performance of the Team

Required profile

  • Relevant work experience in at least one of the following: business/project analyst, project/change management and data analytics.
  • Proven track record of successful communication of analytical outcomes, including an ability to effectively communicate with both business and technical teams.
  • Ability to manage multiple, competing priorities and hold the team and stakeholders to account on progress.
  • Contribute to, plan, and execute an end-to-end data governance framework.
  • Basic knowledge of IT systems/projects and the development lifecycle.
  • Experience gathering business requirements and reports.
  • Advanced experience of MS Excel data processing (VBA Macros).
  • Good communication skills

 

Additional Information

Degree in a quantitative or scientific field (e.g. Engineering, MBA Finance, Project Management) and/or experience in data governance/quality/privacy
Knowledge of Finance systems/processes
Experience in analysing large data sets using dedicated analytics tools

 

Designation – Assistant Manager TS

Location – Bangalore

Shift – 11 AM – 8 PM
They provide both wholesale and retail funding. PM1

Agency job
via Multi Recruit by Sapna Deb
Mumbai
5 - 7 yrs
₹20L - ₹25L / yr
ETL
Talend
OLAP
Data governance
SQL
  • Key responsibility is to design and develop a data pipeline, including the architecture, prototyping, and development of data extraction, transformation/processing, cleansing/standardizing, and loading into the Data Warehouse at real-time/near-real-time frequency. Source data can be in structured, semi-structured, and/or unstructured formats.
  • Provide technical expertise to design efficient data ingestion solutions to consolidate data from RDBMS, APIs, messaging queues, weblogs, images, audio, documents, etc. of enterprise applications, SAAS applications, external 3rd-party sites or APIs, etc. through ETL/ELT, API integrations, Change Data Capture, Robotic Process Automation, custom Python/Java coding, etc. (see the ingestion sketch after this list)
  • Development of complex data transformation using Talend (BigData edition), Python/Java transformation in Talend, SQL/Python/Java UDXs, AWS S3, etc to load in OLAP Data Warehouse in Structured/Semi-structured form
  • Development of data model and creating transformation logic to populate models for faster data consumption with simple SQL.
  • Implementing automated Audit & Quality assurance checks in Data Pipeline
  • Document & maintain data lineage to enable data governance
  • Coordination with BIU, IT, and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading into downstream systems, NoSQL databases, etc.
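The stack here is Talend BigData with Python/Java components; purely as a hedged, tool-agnostic sketch of the ingestion item above, the snippet below pulls records from a REST API and lands them in S3 as JSON Lines for the ELT layer. The endpoint, bucket, and key names are placeholders.

    import json
    import requests  # pip install requests
    import boto3     # pip install boto3

    # Hypothetical SaaS endpoint and landing bucket.
    API_URL = "https://api.example.com/v1/invoices"
    BUCKET = "dw-landing-zone"

    records = requests.get(API_URL, params={"updated_since": "2024-01-01"}, timeout=60).json()

    # Land the raw extract as JSON Lines, keyed by source and load date, for downstream ELT.
    body = "\n".join(json.dumps(r) for r in records)
    boto3.client("s3").put_object(
        Bucket=BUCKET,
        Key="invoices/load_date=2024-01-02/part-0001.jsonl",
        Body=body.encode("utf-8"),
    )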

Requirements

  • Programming experience using Python / Java, to create functions / UDX
  • Extensive technical experience with SQL on RDBMS (Oracle/MySQL/PostgreSQL, etc.) including code optimization techniques
  • Strong ETL/ELT skillset using Talend BigData Edition. Experience in Talend CDC & MDM functionality will be an advantage.
  • Experience & expertise in implementing complex data pipelines, including semi-structured & unstructured data processing
  • Expertise to design efficient data ingestion solutions to consolidate data from RDBMS, APIs, messaging queues, weblogs, images, audio, documents, etc. of enterprise applications, SAAS applications, external 3rd-party sites or APIs, etc. through ETL/ELT, API integrations, Change Data Capture, Robotic Process Automation, custom Python/Java coding, etc.
  • Good understanding & working experience in OLAP Data Warehousing solutions (Redshift, Synapse, Snowflake, Teradata, Vertica, etc) and cloud-native Data Lake (S3, ADLS, BigQuery, etc) solutions
  • Familiarity with AWS tool stack for Storage & Processing. Able to recommend the right tools/solutions available to address a technical problem
  • Good knowledge of database performance and tuning, troubleshooting, query optimization, and tuning
  • Good analytical skills with the ability to synthesize data to design and deliver meaningful information
  • Good knowledge of Design, Development & Performance tuning of 3NF/Flat/Hybrid Data Model
  • Know-how on any NoSQL DB (DynamoDB, MongoDB, CosmosDB, etc.) will be an advantage.
  • Ability to understand business functionality, processes, and flows
  • Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently

Functional knowledge

  • Data Governance & Quality Assurance
  • Distributed computing
  • Linux
  • Data structures and algorithm
  • Unstructured Data Processing
They provide both wholesale and retail funding. PM1

Agency job
via Multi Recruit by Sapna Deb
Mumbai
5 - 7 yrs
₹20L - ₹25L / yr
AWS Kinesis
Data engineering
AWS Lambda
DynamoDB
data pipeline
  • Key responsibility is to design & develop a data pipeline for real-time data integration, processing, executing models (if required), and exposing output via MQ / API / NoSQL DB for consumption (see the Lambda sketch after this list)
  • Provide technical expertise to design efficient data ingestion solutions to store & process unstructured data, such as Documents, audio, images, weblogs, etc
  • Developing API services to provide data as a service
  • Prototyping Solutions for complex data processing problems using AWS cloud-native solutions
  • Implementing automated Audit & Quality assurance Checks in Data Pipeline
  • Document & maintain data lineage from various sources to enable data governance
  • Coordination with BIU, IT, and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading into downstream systems, NoSQL databases, etc.
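For illustration, a hedged sketch of the real-time item above as an AWS Lambda handler that consumes Kinesis records and persists them to DynamoDB; the stream payload fields and table name are placeholders.

    import base64
    import json
    from decimal import Decimal

    import boto3  # available by default in the AWS Lambda Python runtime

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("events")  # hypothetical table with partition key event_id

    def handler(event, context):
        """Triggered by a Kinesis event source mapping; one invocation per batch of records."""
        for record in event["Records"]:
            payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
            table.put_item(Item={
                "event_id": payload["event_id"],
                "device_id": payload.get("device_id"),
                # DynamoDB needs Decimal rather than float for numeric attributes.
                "arrival_ts": Decimal(str(record["kinesis"]["approximateArrivalTimestamp"])),
                "body": json.dumps(payload),
            })
        return {"processed": len(event["Records"])}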

Skills

  • Programming experience using Python & SQL
  • Extensive working experience in Data Engineering projects, using AWS Kinesis, AWS S3, DynamoDB, EMR, Lambda, Athena, etc. for event processing
  • Experience & expertise in implementing complex data pipeline
  • Strong Familiarity with AWS Toolset for Storage & Processing. Able to recommend the right tools/solutions available to address specific data processing problems
  • Hands-on experience in Unstructured (Audio, Image, Documents, Weblogs, etc) Data processing.
  • Good analytical skills with the ability to synthesize data to design and deliver meaningful information
  • Know-how on any NoSQL DB (DynamoDB, MongoDB, CosmosDB, etc.) will be an advantage.
  • Ability to understand business functionality, processes, and flows
  • Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently

Functional knowledge

  • Real-time Event Processing
  • Data Governance & Quality assurance
  • Containerized deployment
  • Linux
  • Unstructured Data Processing
  • AWS Toolsets for Storage & Processing
  • Data Security

 

European Bank headquartered in Copenhagen, Denmark.

Agency job
via Apical Mind by Rajeev T
NCR (Delhi | Gurgaon | Noida)
2 - 12 yrs
₹25L - ₹40L / yr
Data governance
DevOps
Data integration
Data engineering
Python
Data Platforms (Data Integration) is responsible for envisioning, building and operating the Bank’s data integration platforms. The successful candidate will work out of Gurgaon as a part of a high performing team who is distributed across our two development centers – Copenhagen and Gurugram. The individual must be driven, passionate about technology and display a level of customer service that is second to none.

Roles & Responsibilities

  • Designing and delivering a best-in-class, highly scalable data governance platform
  • Improving processes and applying best practices
  • Contribute to all scrum ceremonies, assuming the role of 'scrum master' on a rotational basis
  •  Development, management and operation of our infrastructure to ensure it is easy to deploy, scalable, secure and fault-tolerant
  • Flexible on working hours as per business needs
Client of People First Consultants

Agency job
Bengaluru (Bangalore), Chennai
10 - 15 yrs
₹18L - ₹23L / yr
Data governance
Informatica
Informatica Data Quality

Qualifications & Skills

  • Proven track record in delivering Data Governance Solutions to a large enterprise
  • Knowledge of and experience in data governance frameworks, formulating data governance policies, standards, and processes
  • Experience in program management and managing cross-functional stakeholders from senior leadership to project manager level
  • Experience in leading a team of data governance business analysts
  • Experience in data governance tools like Informatica Data Quality, Enterprise Data Catalog, Axon, Collibra
  • Experience in metadata management, master and reference data management, data quality and data governance