11+ Informatica MDM Jobs in Pune

Company Overview:
Our client is a global leader in AI-assisted conversational messaging solutions, transforming the way professionals and institutions engage with customers. With cutting-edge technologies like Conversational AI and chatbots, they have amassed 5,000 customers across the SMB and mid-market sectors, integrating seamlessly with top CRMs like Salesforce, Zoho, and HubSpot.
Position Overview:
As a rapidly growing company, we are seeking a dynamic professional to join our leadership team as Director, Business Intelligence. In this role, you will oversee the organization's data and information systems operations, focusing on the data, tools, and analytics that support stakeholders across Product, Sales, Marketing, Operations, and Finance.
Key Responsibilities:
- Develop and implement data architecture and management strategies.
- Establish master data management (MDM) processes for a single source of truth.
- Lead system integration aligning data systems with business goals.
- Collaborate with senior management, defining KPIs and leading data initiatives.
- Identify data trends, patterns, and insights using visualization tools.
- Evaluate, select, and manage vendors for effective service delivery.
- Implement data governance practices, including quality assurance and lifecycle management.
- Build and manage a high-performing systems and information team.
Qualifications:
- Bachelor's degree in computer science, information systems, or related field.
- 10+ years of experience in systems management, preferably in the SaaS/D2C industry.
- Proficiency in data analysis methods, tools, and best practices.
- Experience with Salesforce and Power BI is essential.
- Strong problem-solving and decision-making skills.
- Knowledge of industry best practices and emerging trends.
- Demonstrated ability to lead cross-functional teams.
- Excellent communication and interpersonal skills.
Benefits:
- Competitive Salary
- Performance-Based Incentives
- Bonus Compensation
- Comprehensive Insurance Coverage
- Shift Allowance
- Emphasis on Work-Life Balance

Job Title : IBM Sterling Integrator Developer
Experience : 3 to 5 Years
Locations : Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, Pune
Employment Type : Full-Time
Job Description :
We are looking for a skilled IBM Sterling Integrator Developer with 3–5 years of experience to join our team across multiple locations.
The ideal candidate should have strong expertise in IBM Sterling Integrator and integration development, along with scripting and database proficiency.
Key Responsibilities :
- Develop, configure, and maintain IBM Sterling Integrator solutions.
- Design and implement integration solutions using IBM Sterling.
- Collaborate with cross-functional teams to gather requirements and provide solutions.
- Work with custom languages and scripting to enhance and automate integration processes.
- Ensure optimal performance and security of integration systems.
Must-Have Skills :
- Hands-on experience with IBM Sterling Integrator and associated integration tools.
- Proficiency in at least one custom scripting language.
- Strong command of Shell scripting, Python, and SQL (mandatory).
- Good understanding of EDI standards and protocols is a plus.
Interview Process :
- 2 Rounds of Technical Interviews.
Additional Information :
- Open to candidates from Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, and Pune.


Job Description:
Minimum 2 to 4 years of experience in C#, ASP.NET, and web application development.
Knowledge of cloud programming or migration to the cloud is preferred.
Mandatory skills:
- Proficient in web application development using ASP.NET and C# with .NET 4.0/4.5.
- Experience with SQL Server or an equivalent database, including knowing how to build efficient queries.
- Strong knowledge of jQuery, AJAX, JavaScript, HTML5, CSS3, and Bootstrap.
- Experience in debugging in multiple browsers.
- Strong understanding of object-oriented programming.
- Clear understanding of SVN or an equivalent VCS.
- Familiar with IIS and deploying code to a web server.
- Should have excellent analytical and communication skills.
Responsibilities:
- Strong hands-on skills in designing, coding, debugging, technical problem solving, and writing unit test cases.
- Translate use cases into functional applications
- Design, build, and maintain efficient, reusable, and reliable C# code
- Ensure the best possible performance, quality, and responsiveness of applications
- Help maintain code quality
- Able to work well in a team setting
Academic Qualifications Required:
- B.E. / B.Tech. / M.Sc. in Computer Science or IT, or M.C.A.
Experience: 4+ years.
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion (an illustrative sketch follows this list).
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profiling, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
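To illustrate the streaming-ingestion work described in the list above, here is a minimal Python sketch. Everything in it is a hypothetical placeholder rather than part of the role: the broker address, the `events` topic, the `EVENTS_RAW` table, and the credentials are made up, and a production setup would more likely use Kafka Connect or Snowpipe than a hand-rolled consumer.

```python
# Hypothetical sketch only: consume JSON events from a Kafka topic and land
# them in a Snowflake staging table. Broker, topic, table, and credentials
# are placeholders; a production pipeline would batch rows or use Snowpipe.
import json

import snowflake.connector              # pip install snowflake-connector-python
from confluent_kafka import Consumer    # pip install confluent-kafka

consumer = Consumer({
    "bootstrap.servers": "broker:9092",  # placeholder broker
    "group.id": "snowflake-loader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["events"])           # hypothetical topic name

conn = snowflake.connector.connect(
    account="my_account", user="loader", password="***",    # placeholders
    warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        payload = json.loads(msg.value())
        # A row-by-row INSERT keeps the sketch short; real code would stage
        # files and load them via COPY INTO or Snowpipe for throughput.
        cur.execute(
            "INSERT INTO EVENTS_RAW (payload) SELECT PARSE_JSON(%s)",
            (json.dumps(payload),),
        )
finally:
    consumer.close()
    cur.close()
    conn.close()
```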
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms (an illustrative DAG sketch follows this list).
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Pipeline Monitoring and Optimization: Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
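As a sketch of the orchestration side of these responsibilities, the hypothetical Airflow DAG below chains a raw load into Snowflake with dbt transformations and tests. The DAG id, loader script, and project path are illustrative assumptions, not an actual pipeline from this role.

```python
# Hypothetical sketch only: a daily ELT DAG that loads raw data into Snowflake
# and then runs dbt transformations and tests. DAG id, script, and project
# path are placeholders for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="snowflake_elt_daily",          # placeholder DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load newly staged files into Snowflake (e.g. a script wrapping COPY INTO).
    load_raw = BashOperator(
        task_id="load_raw",
        bash_command="python load_raw_to_snowflake.py",       # placeholder script
    )

    # Build modeled tables from the raw layer with dbt.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/analytics",  # placeholder path
    )

    # Validate the transformed models before downstream use.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/analytics",
    )

    load_raw >> dbt_run >> dbt_test
```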
Responsibilities:
• Research, design, and build highly reliable, available, and scalable solutions that can handle millions of API calls across systems.
• Own large technical deliverables and execute in a structured manner, taking complete ownership of the functional services that your team is responsible for.
• Take accountability for the overall health of the products you build and the predictability of your team's deliverables.
• Platformize components as libraries, utilities, and services, and promote reuse.
• Be able to conceptualize and develop prototypes quickly.
Requirements
• Ability to take ownership of projects under the guidance of your mentor.
• Must be an excellent problem solver.
• Must be familiar with Java (with Spring, Hibernate, microservices, etc.) or another object-oriented, high-level proprietary or open-source language with strong programming constructs.
• Solid understanding of DS and algorithms, OOP concepts, and MVC architecture.
- 4-5 years of experience in functional automation tools and frameworks (mandatory).
- Experience with the Playwright automation tool is desirable (a short illustrative sketch follows this list).
- Analyze logs generated from monitoring tools like Datadog, Splunk, Kibana for troubleshooting and debugging service failures.
- Perform database validation using appropriate queries to verify data integrity in SQL and NoSQL databases.
- Work effectively within an Agile development methodology and participate in ceremonies like sprint planning, daily stand-ups, backlog refinement etc.
- Identify and report bugs and defects using a bug tracking tool.
- Stay up-to-date with the latest testing tools and methodologies.
- Learn API automation and contribute to API automation deliveries.
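For the Playwright point above, here is a minimal Python sketch of the kind of functional UI check involved. The URL, selectors, and credentials are hypothetical placeholders for illustration only.

```python
# Hypothetical sketch only: a basic functional UI check with Playwright's
# Python API. URL, selectors, and credentials are placeholders.
from playwright.sync_api import sync_playwright, expect


def test_login_smoke() -> None:
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()

        page.goto("https://app.example.com/login")    # placeholder URL
        page.fill("#username", "qa_user")             # placeholder selector/data
        page.fill("#password", "not-a-real-password")
        page.click("button[type=submit]")

        # Assert that the post-login page shows the expected heading.
        expect(page.locator("h1")).to_contain_text("Dashboard")

        browser.close()


if __name__ == "__main__":
    test_login_smoke()
```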
We are looking for a dynamic and self-driven sales representative to join our team. As a SaaS Sales Representative, you will be responsible for driving sales of our recruitment software to new and existing customers. You will identify and qualify leads, create and deliver persuasive sales presentations, and close deals to meet and exceed sales targets.
Key responsibilities:
- Identify and qualify potential customers through a variety of methods, including outbound calls, emails, and social media outreach
- Understand customer needs and provide solutions through the use of our recruitment software
- Create and deliver compelling sales presentations to potential customers, highlighting the benefits and features of our software
- Negotiate and close deals with potential customers
- Meet and exceed sales targets on a monthly and quarterly basis
- Keep up-to-date with industry trends and competitive products
Qualifications:
- 1-4 years of experience in a SaaS sales role
- Proven track record of meeting or exceeding sales targets
- Excellent communication and presentation skills
- Ability to build and maintain strong relationships with customers
- Ability to work independently and as part of a team
- Experience with CRM software (e.g., Salesforce) is a plus
- Experience in the recruitment industry is a plus, but not required
If this sounds like a good fit for you, please apply with your resume and a cover letter explaining why you are interested in this role. We look forward to hearing from you!
Profile: DevOps Engineer
Experience: 5-8 Yrs
Notice Period: Immediate to 30 Days
Job Description:
Technical Experience (Must Have):
Cloud: Azure
DevOps Tools: Terraform, Ansible, GitHub, CI/CD pipelines, Docker, Kubernetes
Network: Cloud Networking
Scripting Language: Any/All - Shell Script, PowerShell, Python
OS: Linux (Ubuntu, RHEL, etc.)
Database: MongoDB
Professional Attributes: Excellent communication, written, presentation, and problem-solving skills.
Experience: Minimum of 5-8 years of experience in Cloud Automation and Application.
Additional Information (Good to have):
Microsoft Azure Fundamentals AZ-900
Terraform Associate
Docker
Certified Kubernetes Administrator
Role:
- Building and maintaining tools to automate application and infrastructure deployment, and to monitor operations.
- Designing and implementing cloud solutions that are secure, scalable, resilient, monitored, auditable, and cost-optimized.
- Implementing transformation from the as-is state to the future state.
- Coordinating with other members of the DevOps team, Development, Test, and other teams to enhance and optimize existing processes.
- Providing systems support and implementing monitoring, logging, and alerting solutions that enable production systems to be monitored (a minimal health-check sketch follows this list).
- Writing Infrastructure as Code (IaC) using industry-standard tools and services.
- Writing application deployment automation using industry-standard deployment and configuration tools.
- Designing and implementing continuous delivery pipelines that provision and operate client test as well as production environments.
- Implementing and staying abreast of Cloud and DevOps industry best practices and tooling.
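As a small illustration of the monitoring and alerting duty above, the following hypothetical Python health-check sketch shows the basic idea; the endpoints are placeholders, and a real setup would normally rely on Prometheus/Grafana or Azure Monitor rather than a standalone script.

```python
# Hypothetical sketch only: a simple HTTP health check that could sit behind a
# cron job or pipeline stage. Endpoints are placeholders; a real deployment
# would usually use Prometheus/Grafana or Azure Monitor instead.
import sys
import urllib.error
import urllib.request

ENDPOINTS = [
    "https://api.example.com/healthz",   # placeholder service endpoints
    "https://web.example.com/healthz",
]


def is_healthy(url: str, timeout: float = 5.0) -> bool:
    """Return True if the endpoint answers with a 2xx status within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except (urllib.error.URLError, OSError):
        return False


if __name__ == "__main__":
    failures = [url for url in ENDPOINTS if not is_healthy(url)]
    for url in failures:
        print(f"ALERT: {url} is unhealthy")   # in practice: page on-call or post to a channel
    sys.exit(1 if failures else 0)
```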




- Good hands-on experience in Ruby on Rails, NodeJS, and React/Angular/VueJS.
- Knowledge of a SQL database like MySQL or Postgres.
- Basic knowledge of data structures and algorithms.
- You should have good experience working with relational and non-relational databases. We use Postgres and Cassandra.
- Good knowledge of version management with Git.
- Awareness of TDD.
- CI/CD knowledge would be a huge advantage.
- Willingness to design and maintain large-scale distributed systems.

Maveric is a T24 company with a large base of consultants.
Securities:
- Should have knowledge of T24
- Should have good knowledge of Investment Products such as Equities, Bonds, Derivatives, Structured Products, etc.
- Should also be well versed with processes like Corporate Actions, Mandate Management, Portfolios, Positions, etc.
Finance:
- Should have knowledge of the T24 FI module
- Should have very good knowledge of processes like IFRS, US GAAP, the year-end process, and accrual bookings
- Should have good knowledge of configuring Reports, Products, etc.
TPH:
- Should have good knowledge of various payment systems like CHAPS, BACS, etc.
- Should have good knowledge on T24 configurations for TPH
- Should be well versed with the T24 TPH integration with other T24 modules
- Should be well versed with various SWIFT messages
- Should have development experience (for Developer profiles)
Tax & Regulatory Compliance
- Should have basic knowledge of T24 Modules
- Should have worked in the European regulatory space from an investment perspective
- Should have good knowledge of WHT, corporate tax, etc., with respect to investment portfolios