
32+ Informatica Jobs in India

Apply to 32+ Informatica Jobs on CutShort.io. Find your next job, effortlessly. Browse Informatica Jobs and apply today!

globe teleservices
deepshikha thapar
Posted by deepshikha thapar
Bengaluru (Bangalore)
5 - 10 yrs
₹20L - ₹25L / yr
ETL
Python
Informatica
Talend



Good experience in the Extraction, Transformation, and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager) and PowerConnect as ETL tools on Oracle and SQL Server databases.



• Knowledge of Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, ETL design, development, system testing, implementation, and production support.
• Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions.
• Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
• Developed mapping parameters and variables to support SQL overrides.
• Created mapplets to reuse them in different mappings.
• Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
• Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
• Modified existing mappings to accommodate new business requirements.
• Involved in performance tuning at the source, target, mapping, session, and system levels.
• Prepared migration documents to move mappings from development to testing and then to production repositories.
• Extensive experience in developing stored procedures, functions, views, and triggers, and in writing complex SQL queries using PL/SQL.
• Experience in resolving ongoing maintenance issues and bug fixes; monitoring Informatica/Talend sessions and performance-tuning mappings and sessions.
• Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
• Extensive experience in writing UNIX shell scripts and automating ETL processes with UNIX shell scripting.
• Experience using automation/scheduling tools like Control-M.
• Hands-on experience across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, systems integration, and user acceptance testing.
• Build, operate, monitor, and troubleshoot Hadoop infrastructure.
• Develop tools and libraries, and maintain processes for other engineers to access data and write MapReduce programs.
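The Type 1/Type 2 SCD mappings mentioned above boil down to a simple pattern: Type 1 overwrites in place, while Type 2 expires the current row and inserts a new version. A minimal sketch of the Type 2 case using Python's built-in sqlite3; the `dim_customer` table and its columns are invented for illustration, not taken from any actual mapping:

```python
import sqlite3

# Minimal Type 2 SCD sketch: expire the current dimension row and
# insert a new version when a tracked attribute changes.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        is_current  INTEGER,   -- 1 = active version, 0 = expired
        valid_from  TEXT
    )""")
conn.execute("INSERT INTO dim_customer VALUES (101, 'Pune', 1, '2020-01-01')")

def scd2_update(conn, customer_id, new_city, change_date):
    """Apply a Type 2 change: keep history instead of overwriting."""
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,))
    row = cur.fetchone()
    if row and row[0] != new_city:
        # Expire the old version...
        conn.execute(
            "UPDATE dim_customer SET is_current=0 "
            "WHERE customer_id=? AND is_current=1", (customer_id,))
        # ...and insert the new one as the active row.
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, 1, ?)",
            (customer_id, new_city, change_date))

scd2_update(conn, 101, 'Mumbai', '2023-06-01')
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from").fetchall()
print(rows)  # both versions survive: [('Pune', 0), ('Mumbai', 1)]
```

A Type 1 mapping would simply run the UPDATE without the insert, losing the history.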

globe teleservices
Bengaluru (Bangalore)
5 - 8 yrs
₹25L - ₹30L / yr
Talend
Informatica
ETL

Good experience in Extraction, Transformation, and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager) and PowerConnect as ETL tools on Oracle and SQL Server databases.

• Knowledge of Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, ETL design, development, system testing, implementation, and production support.
• Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions.

codersbrain

at codersbrain

1 recruiter
Aishwarya Hire
Posted by Aishwarya Hire
Remote only
4 - 6 yrs
₹6L - ₹10L / yr
ETL
Informatica
Data Warehouse (DWH)
  • Support and maintain the production Cognos web portal to manage OLAP cubes and folders.
  • Used various transformations such as Expression, Union, Source Qualifier, Aggregator, Router, Stored Procedure, and Lookup.
  • Work on transformations such as Source Qualifier, Joiner, Lookup, Rank, Expression, Aggregator, and Sequence Generator.
  • Work with relational databases like DB2.
  • Develop mappings with XML as the target, formatting the target data according to requirements.
  • Work with logical, physical, conceptual, star, and snowflake schema data models.
  • Extract data from flat files and DB2 and load it into the Salesforce database.
  • Work with various data sources: relational, flat file (fixed-width, delimited), and XML.
  • Involved in all phases of the SDLC: requirements gathering, architecture design, development, testing, data migration, and post-production support.
  • Involved in writing Perl scripts for file transfers and file renaming, and other database scripts to be executed from UNIX.
  • Create functional and technical mapping specifications and documentation for the QA team.
  • Support implementation and execution of MapReduce programs in a cluster environment.
  • Maintain data integrity during extraction, manipulation, processing, analysis, and storage.
  • Migrate repository objects, services, and scripts from the development environment to the production environment.
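Fixed-width flat files like those listed above are parsed by column offsets rather than delimiters. A minimal sketch; the field layout here is hypothetical:

```python
# Parse a fixed-width flat file record by column offsets.
# The layout below (id: 0-4, name: 4-14, amount: 14-22) is hypothetical.
LAYOUT = [("id", 0, 4), ("name", 4, 14), ("amount", 14, 22)]

def parse_fixed_width(line):
    """Slice one record into a dict according to LAYOUT."""
    return {name: line[start:end].strip() for name, start, end in LAYOUT}

record = parse_fixed_width("0042Jane Doe  00125.50")
print(record)  # {'id': '0042', 'name': 'Jane Doe', 'amount': '00125.50'}
```

A delimited file would instead go through the standard `csv` module, with no layout table needed.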
AArete Technosoft Pvt Ltd
Pune
7 - 12 yrs
₹25L - ₹30L / yr
Snowflake
Snowflake schema
ETL
Data Warehouse (DWH)
Python
+8 more
Help us modernize our data platforms, with a specific focus on Snowflake
• Work with various stakeholders, understand requirements, and build solutions/data pipelines that address the needs at scale
• Bring key workloads to the clients’ Snowflake environment using scalable, reusable data ingestion and processing frameworks to transform a variety of datasets
• Apply best practices for Snowflake architecture, ELT, and data models
Skills - 50% of below:
• A passion for all things data: understanding how to work with it at scale and, more importantly, knowing how to get the most out of it
• Good understanding of native Snowflake capabilities like data ingestion, data sharing, zero-copy cloning, tasks, Snowpipe, etc.
• Expertise in data modeling, with a good understanding of modeling approaches like Star schema and/or Data Vault
• Experience in automating deployments
• Experience writing code in Python, Scala, Java, or PHP
• Experience in ETL/ELT, either via a code-first approach or using low-code tools like AWS Glue, AppFlow, Informatica, Talend, Matillion, Fivetran, etc.
• Experience with one or more AWS services, especially in relation to integration with Snowflake
• Familiarity with data visualization tools like Tableau, Power BI, Domo, or similar
• Experience with data virtualization tools like Trino, Starburst, Denodo, Data Virtuality, Dremio, etc.
• SnowPro Advanced: Data Engineer certification is a must.
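A reusable ingestion framework of the kind described above typically makes each load step idempotent, so a replayed batch cannot duplicate rows. Snowflake itself would express this with MERGE INTO; the sketch below approximates the same merge semantics with SQLite's UPSERT syntax (table and column names are invented):

```python
import sqlite3

# Idempotent "merge" load step of the kind a reusable ingestion
# framework wraps: re-running the same batch must not duplicate rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT)")

def load_batch(conn, rows):
    """Insert new keys, update existing ones; safe to re-run."""
    conn.executemany(
        "INSERT INTO customers (id, email) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET email = excluded.email",
        rows)

batch = [(1, "a@example.com"), (2, "b@example.com")]
load_batch(conn, batch)
load_batch(conn, batch)                    # replay: no duplicates
load_batch(conn, [(2, "b2@example.com")])  # late-arriving update

rows = conn.execute("SELECT id, email FROM customers ORDER BY id").fetchall()
print(rows)  # [(1, 'a@example.com'), (2, 'b2@example.com')]
```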
Pune
5 - 8 yrs
₹1L - ₹15L / yr
Informatica
Informatica PowerCenter
Spark
Hadoop
Big Data
+6 more

Technical/Core skills

  1. Minimum 3 years of experience as an Informatica Big Data Management (BDM) developer in a Hadoop environment.
  2. Knowledge of Informatica PowerExchange (PWX).
  3. Minimum 3 years of experience with big data querying tools like Hive and Impala.
  4. Ability to design and develop complex mappings using Informatica BDM.
  5. Experience creating and managing Informatica PowerExchange and CDC real-time implementations.
  6. Strong Unix skills for writing shell scripts and troubleshooting existing scripts.
  7. Good knowledge of big data platforms and their frameworks.
  8. Good to have: experience with Cloudera Data Platform (CDP).
  9. Experience building stream-processing systems using Kafka and Spark.
  10. Excellent SQL knowledge.

 

Soft skills :

  1. Ability to work independently
  2. Strong analytical and problem-solving skills
  3. Willingness to learn new technologies
  4. Regular interaction with vendors, partners, and stakeholders
GradMener Technology Pvt. Ltd.
Pune
2 - 5 yrs
₹3L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
SQL
Oracle
Job Description : 
 
Roles and Responsibility : 
 
  • Designing and coding the data warehousing system to desired company specifications 
  • Conducting preliminary testing of the warehousing environment before data is extracted
  • Extracting company data and transferring it into the new warehousing environment
  • Testing the new storage system once all the data has been transferred
  • Troubleshooting any issues that may arise
  • Providing maintenance support
  • Consulting with data management teams to get a big-picture idea of the company’s data storage needs
  • Presenting the company with warehousing options based on their storage needs
Requirements :
  • Experience of 1-3 years in Informatica PowerCenter
  • Excellent knowledge of Oracle databases and PL/SQL, such as stored procedures, functions, user-defined functions, table partitioning, indexes, views, etc.
  • Knowledge of SQL Server databases
  • Hands-on experience in Informatica PowerCenter and database performance tuning and optimization, including complex query optimization techniques
  • Understanding of ETL control frameworks
  • Experience in UNIX shell/Perl scripting
  • Good communication skills, including the ability to write clearly
  • Able to function effectively as a member of a team
  • Proactive with respect to personal and technical development
InnovAccer

at InnovAccer

3 recruiters
Jyoti Kaushik
Posted by Jyoti Kaushik
Noida, Bengaluru (Bangalore), Pune, Hyderabad
4 - 7 yrs
₹4L - ₹16L / yr
ETL
SQL
Data Warehouse (DWH)
Informatica
Datawarehousing
+2 more

We are looking for a Senior Data Engineer to join the Customer Innovation team, who will be responsible for acquiring, transforming, and integrating customer data onto our Data Activation Platform from customers’ clinical, claims, and other data sources. You will work closely with customers to build data and analytics solutions to support their business needs, and be the engine that powers the partnership that we build with them by delivering high-fidelity data assets.

In this role, you will work closely with our Product Managers, Data Scientists, and Software Engineers to build the solution architecture that will support customer objectives. You'll work with some of the brightest minds in the industry, work with one of the richest healthcare data sets in the world, use cutting-edge technology, and see your efforts affect products and people on a regular basis. The ideal candidate is someone that

  • Has healthcare experience and is passionate about helping heal people,
  • Loves working with data,
  • Has an obsessive focus on data quality,
  • Is comfortable with ambiguity and making decisions based on available data and reasonable assumptions,
  • Has strong data interrogation and analysis skills,
  • Defaults to written communication and delivers clean documentation, and,
  • Enjoys working with customers and problem solving for them.

A day in the life at Innovaccer:

  • Define the end-to-end solution architecture for projects by mapping customers’ business and technical requirements against the suite of Innovaccer products and Solutions.
  • Measure and communicate impact to our customers.
  • Enable customers to activate data themselves, using SQL, BI tools, or APIs, to answer their questions at speed.

What You Need:

  • 4+ years of experience in a Data Engineering role and a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
  • 4+ years of experience working with relational databases like Snowflake, Redshift, or Postgres.
  • Intermediate to advanced level SQL programming skills.
  • Data analytics and visualization (using tools like Power BI)
  • The ability to engage with both the business and technical teams of a client - to document and explain technical problems or concepts in a clear and concise way.
  • Ability to work in a fast-paced and agile environment.
  • Easily adapt and learn new things whether it’s a new library, framework, process, or visual design concept.

What we offer:

  • Industry certifications: We want you to be a subject matter expert in what you do. So, whether it’s our product or our domain, we’ll help you dive in and get certified.
  • Quarterly rewards and recognition programs: We foster learning and encourage people to take risks. We recognize and reward your hard work.
  • Health benefits: We cover health insurance for you and your loved ones.
  • Sabbatical policy: We encourage people to take time off and rejuvenate, learn new skills, and pursue their interests so they can generate new ideas with Innovaccer.
  • Pet-friendly office and open floor plan: No boring cubicles.
Wissen Technology

at Wissen Technology

4 recruiters
Lokesh Manikappa
Posted by Lokesh Manikappa
Bengaluru (Bangalore)
5 - 12 yrs
₹15L - ₹35L / yr
ETL
Informatica
Data Warehouse (DWH)
Data modeling
Spark
+5 more

Job Description

The applicant must have a minimum of 5 years of hands-on IT experience, working on a full software lifecycle in Agile mode.

Good to have experience in data modeling and/or systems architecture.
Responsibilities will include technical analysis, design, development, and enhancements.

You will participate in all/most of the following activities:
- Working with business analysts and other project leads to understand requirements.
- Modeling and implementing database schemas in DB2 UDB or other relational databases.
- Designing, developing, and maintaining data processing using Python, DB2, Greenplum, Autosys, and other technologies

 

Skills/Expertise Required:

Work experience in developing large-volume databases (DB2/Greenplum/Oracle/Sybase).

Good experience in writing stored procedures, integrating database processing, and tuning and optimizing database queries.

Strong knowledge of table partitioning, high-performance loading, and data processing.
Good to have: hands-on experience working with Perl or Python.
Hands-on development using the Spark / KDB / Greenplum platforms will be a strong plus.
Designing, developing, maintaining, and supporting Data Extract, Transform, and Load (ETL) software using Informatica, shell scripts, DB2 UDB, and Autosys.
Coming up with system architecture/redesign proposals for greater efficiency and ease of maintenance, and developing software to turn proposals into implementations.

Need to work with business analysts and other project leads to understand requirements.
Strong collaboration and communication skills

Pune, Mumbai, Bengaluru (Bangalore), Nashik, Gurugram, Vizag
4 - 6 yrs
₹8L - ₹12L / yr
Linux administration
System Administration
OBIEE administration
Informatica
Essbase
+2 more
Linux administrator having 4+ years of experience
Good knowledge of managing Linux administration along with application hosting on the server. Excellent knowledge of Linux commands.
Worked on OEL Linux and other Linux versions. Manage VAPT requirements, server support, and backup strategy.
Knowledge of hosting OBIEE, Informatica, and other applications like Essbase will be an added advantage.
Good technical knowledge and readiness to always learn new technologies
Configuration of SSL and port-related checks
Technical documentation and debugging skills
Coordination with other technical teams
Qualifications
Master's or Bachelor's degree in Engineering/Computer Science/Information Technology
Additional information
Excellent verbal and written communication skills
Bengaluru (Bangalore), Pune, Delhi, Gurugram, Nashik, Vizag
3 - 5 yrs
₹8L - ₹12L / yr
Oracle
Business Intelligence (BI)
PowerBI
Oracle Warehouse Builder
Informatica
+3 more
Oracle BI developer with 6+ years of experience working on Oracle warehouse design, development, and testing
Good knowledge of Informatica ETL and Oracle Analytics Server
Analytical ability to design a warehouse as per user requirements, mainly in the Finance and HR domains
Good skills to analyze existing ETL and dashboards, understand the logic, and make enhancements as per requirements
Good verbal and written communication skills
Qualifications
Master's or Bachelor's degree in Engineering/Computer Science/Information Technology
Additional information
Excellent verbal and written communication skills
Amagi Media Labs

at Amagi Media Labs

3 recruiters
Rajesh C
Posted by Rajesh C
Chennai
5 - 7 yrs
₹20L - ₹30L / yr
Technical support
Tech Support
SQL
Informatica
PySpark
+1 more
Job Title: Support Engineer L3 Job Location: Chennai
We at Condé Nast are looking for a Support Engineer (Level 3) who will be responsible for
monitoring and maintaining the production systems to ensure business continuity is
maintained. Your responsibilities will also include prompt communication to business
and internal teams about process delays, stability issues, and resolutions.
Primary Responsibilities
● 5+ years of experience in production support
● The Support Data Engineer is responsible for monitoring the data pipelines that are in production.
● Level 3 support activities: analysing issues, debugging programs and jobs, and fixing bugs
● The position will contribute to the monitoring, rerunning or rescheduling, and code fixes of pipelines for a variety of projects on a daily basis.
● Escalate failures to the Data Team/DevOps in case of infrastructure failures or when unable to revive the data pipelines.
● Ensure accurate alerts are raised in case of pipeline failures and corresponding stakeholders (Business/Data Teams) are notified within the agreed-upon SLAs.
● Prepare and present success/failure metrics by accurately logging the monitoring stats.
● Able to work in shifts to provide overlap with US business teams
● Other duties as requested or assigned.
Desired Skills & Qualification
● Strong working knowledge of PySpark, Informatica, SQL (Presto), batch handling through schedulers (Databricks and Astronomer will be an advantage), AWS S3, Airflow, and Hive/Presto
● Basic knowledge of shell scripts and/or Bash commands.
● Able to execute queries in databases and produce outputs.
● Able to understand and execute the steps provided by the Data Team to revive data pipelines.
● Strong verbal and written communication skills and strong interpersonal skills.
● Graduate/Diploma in computer science or information technology.
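The monitor, rerun, and escalate duties above follow a generic loop; a sketch of that loop in Python, with all function names invented for illustration:

```python
# Generic monitor/rerun/escalate loop for a failing pipeline task.
# All names here are illustrative, not any team's actual tooling.

def run_with_reruns(task, max_attempts=3):
    """Rerun a failing task; escalate only after attempts are exhausted."""
    alerts = []
    for attempt in range(1, max_attempts + 1):
        try:
            return task(attempt), alerts
        except RuntimeError as exc:
            alerts.append(f"attempt {attempt} failed: {exc}")
    # Out of attempts: escalate to the infrastructure/data team.
    alerts.append("escalated to Data Team/DevOps")
    return None, alerts

def flaky_pipeline(attempt):
    # Simulated pipeline that succeeds only on the third rerun.
    if attempt < 3:
        raise RuntimeError("upstream file missing")
    return "loaded"

result, log = run_with_reruns(flaky_pipeline)
print(result)    # 'loaded'
print(len(log))  # 2 failed attempts were logged before success
```

The logged alerts are what would feed the success/failure metrics mentioned above.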
About Condé Nast
CONDÉ NAST GLOBAL
Condé Nast is a global media house with over a century of distinguished publishing
history. With a portfolio of iconic brands like Vogue, GQ, Vanity Fair, The New Yorker and
Bon Appétit, we at Condé Nast aim to tell powerful, compelling stories of communities,
culture and the contemporary world. Our operations are headquartered in New York and
London, with colleagues and collaborators in 32 markets across the world, including
France, Germany, India, China, Japan, Spain, Italy, Russia, Mexico, and Latin America.
Condé Nast has been raising the industry standards and setting records for excellence in
the publishing space. Today, our brands reach over 1 billion people in print, online, video,
and social media.
CONDÉ NAST INDIA (DATA)
Over the years, Condé Nast successfully expanded and diversified into digital, TV, and
social platforms - in the process generating a staggering amount of user data. Condé Nast
made the right move to invest heavily in understanding this data and formed a whole new
Data team entirely dedicated to data processing, engineering, analytics, and visualization. This
team helps drive engagement, fuel process innovation, further content enrichment, and
increase market revenue. The Data team aimed to create a company culture where data
was the common language and facilitate an environment where insights shared in
real-time could improve performance. The Global Data team operates out of Los Angeles,
New York, Chennai, and London. The team at Condé Nast Chennai works extensively with
data to amplify its brands' digital capabilities and boost online revenue. We are broadly
divided into four groups, Data Intelligence, Data Engineering, Data Science, and
Operations (including Product and Marketing Ops, Client Services) along with Data
Strategy and monetization. The teams built capabilities and products to create
data-driven solutions for better audience engagement.
What we look forward to:
We want to welcome bright, new minds into our midst and work together to create
diverse forms of self-expression. At Condé Nast, we encourage the imaginative and
celebrate the extraordinary. We are a media company for the future, with a remarkable
past. We are Condé Nast, and It Starts Here.
Infonex Technologies

at Infonex Technologies

1 recruiter
Vinay Ramesh
Posted by Vinay Ramesh
Bengaluru (Bangalore)
4 - 7 yrs
₹6L - ₹30L / yr
Informatica
ETL
SQL
Linux/Unix
Oracle
+1 more
  • Experience implementing large-scale ETL processes using Informatica PowerCenter.
  • Design high-level ETL process and data flow from the source system to target databases.
  • Strong experience with Oracle databases and strong SQL.
  • Develop & unit test Informatica ETL processes for optimal performance utilizing best practices.
  • Performance tune Informatica ETL mappings and report queries.
  • Develop database objects like Stored Procedures, Functions, Packages, and Triggers using SQL and PL/SQL.
  • Hands-on Experience in Unix.
  • Experience in Informatica Cloud (IICS).
  • Work with appropriate leads and review high-level ETL design, source to target data mapping document, and be the point of contact for any ETL-related questions.
  • Good understanding of project life cycle, especially tasks within the ETL phase.
  • Ability to work independently and multi-task to meet critical deadlines in a rapidly changing environment.
  • Excellent communication and presentation skills.
  • Experience working effectively in an onsite/offshore delivery model.
Software Company
Bengaluru (Bangalore)
4 - 12 yrs
₹8L - ₹15L / yr
Snap Logic
Informatica
MuleSoft
snaplogic
SQL
+2 more
About the position:

This is a role that combines technical expertise with customer management skills and requires
close interaction with the customer in understanding requirements/use cases, scheduling,
and proposing solutions. The Professional Services Engineer is responsible for system
implementation by developing and building pipelines (integrations) and providing product
demos to our customers. This person needs the ability to share and communicate ideas clearly,
both orally and in writing, to executive staff, business sponsors, and technical resources, in
clear, concise language that is the parlance of each group.

Requirements and Preferred Skills:

1. 5+ years’ experience with integration technologies like SnapLogic, Informatica,
MuleSoft, etc., and an in-depth understanding of Enterprise Integration Patterns
2. 5+ years of experience with SQL
3. Hands-on experience with REST architectures
4. Knowledge of SOAP/XML/JMS/JSON, and a basic understanding of REST principles and
REST and SOAP APIs
5. Deep understanding of HTTP protocols
6. Excellent customer-facing skills
7. Must be a self-starter and extremely organized with your space and time.
8. Ability to juggle working independently and as part of a team.
9. Accurate and fast decision-making
10. Be able to quickly debug complex Snap issues and figure out the root cause of
problems.
11. Cycle between projects in weeks rather than years, continually learning about new
technology and products
UAE Client
Remote only
5 - 10 yrs
₹10L - ₹18L / yr
Informatica
Informatica PowerCenter
SQL
PL/SQL

Informatica PowerCenter (9.x, 10.2): minimum 2+ years of experience

SQL/PL-SQL: understanding of SQL procedures; able to convert procedures into Informatica mappings.

Good to have: knowledge of Windows Batch scripting is an advantage.
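Converting a SQL procedure into an Informatica mapping usually means re-expressing its row-by-row logic as a chain of transformations (Expression, Filter, and so on). A toy illustration of that idea in Python; the discount rule and field names are invented:

```python
# Toy illustration: procedural row logic re-expressed as a pipeline of
# pure transformation steps, the way Expression/Filter transformations
# would in an Informatica mapping. The discount rule is invented.

def expression_step(row):
    """Derive a column, as an Expression transformation would."""
    discount = 0.10 if row["amount"] >= 100 else 0.0
    return {**row, "net": round(row["amount"] * (1 - discount), 2)}

def filter_step(rows):
    """Drop rows, as a Filter transformation would."""
    return [r for r in rows if r["net"] > 0]

source = [{"id": 1, "amount": 120.0}, {"id": 2, "amount": 40.0},
          {"id": 3, "amount": 0.0}]
target = filter_step([expression_step(r) for r in source])
print(target)
# [{'id': 1, 'amount': 120.0, 'net': 108.0}, {'id': 2, 'amount': 40.0, 'net': 40.0}]
```

The point of the exercise is that each procedural step becomes a stateless, reusable transformation rather than a cursor loop.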

Bengaluru (Bangalore)
2 - 8 yrs
₹4L - ₹10L / yr
Data governance
Data security
Data Analytics
Informatica
SQL
+4 more

Job Description

We are looking for a senior resource with analyst skills and knowledge of IT projects to support the delivery of risk mitigation activities and automation in Aviva’s Global Finance Data Office. The successful candidate will bring structure to this new role in a developing team, with excellent communication, organisational, and analytical skills. The candidate will play the primary role of supporting data governance project/change activities. Candidates should be comfortable with ambiguity in a fast-paced and ever-changing environment. Preferred skills include knowledge of Data Governance, Informatica Axon, SQL, and AWS. In our team, success is measured by results, and we encourage flexible working where possible.

Key Responsibilities

  • Engage with stakeholders to drive delivery of the Finance Data Strategy
  • Support data governance project/change activities in Aviva’s Finance function.
  • Identify opportunities and implement Automations for enhanced performance of the Team

Required profile

  • Relevant work experience in at least one of the following: business/project analysis, project/change management, and data analytics.
  • Proven track record of successful communication of analytical outcomes, including an ability to effectively communicate with both business and technical teams.
  • Ability to manage multiple, competing priorities and hold the team and stakeholders to account on progress.
  • Ability to contribute to, plan, and execute an end-to-end data governance framework.
  • Basic knowledge of IT systems/projects and the development lifecycle.
  • Experience gathering business requirements and reports.
  • Advanced experience of MS Excel data processing (VBA macros).
  • Good communication skills

 

Additional Information

Degree in a quantitative or scientific field (e.g. Engineering, MBA Finance, Project Management) and/or experience in data governance/quality/privacy
Knowledge of Finance systems/processes
Experience in analysing large data sets using dedicated analytics tools

 

Designation – Assistant Manager TS

Location – Bangalore

Shift – 11 AM – 8 PM
Abu Dhabi, Dubai
8 - 15 yrs
₹35L - ₹50L / yr
Informatica
Big Data
Spark
Hadoop
SQL
Skills: Informatica with Big Data Management

1. Minimum 6 to 8 years of experience in Informatica BDM development

2. Experience working on Spark/SQL

3. Develops Informatica mappings/SQL

4. Should have experience in Hadoop, Spark, etc.

Work Days-
 
Sunday to Thursday- Day shift
 
(Friday and Saturday would be weekly off.)
UAE Client
Remote, Bengaluru (Bangalore), Hyderabad
6 - 10 yrs
₹15L - ₹22L / yr
Informatica
Big Data
SQL
Hadoop
Apache Spark
+1 more

Skills: Informatica with Big Data Management

1. Minimum 6 to 8 years of experience in Informatica BDM development
2. Experience working on Spark/SQL
3. Develops Informatica mappings/SQL
4. Should have experience in Hadoop, Spark, etc.
UAE Client
Agency job
via Fragma Data Systems by Evelyn Charles
Remote, Bengaluru (Bangalore), Hyderabad
2 - 5 yrs
₹15L - ₹18L / yr
Informatica PowerCenter
SQL
Informatica

Informatica PowerCenter (9.x, 10.2): minimum 2+ years of experience

SQL/PL-SQL: understanding of SQL procedures; able to convert procedures into Informatica mappings.

Good to have: knowledge of Windows Batch scripting is an advantage.

ACT FIBERNET

at ACT FIBERNET

1 video
2 recruiters
Sumit Sindhwani
Posted by Sumit Sindhwani
Bengaluru (Bangalore)
9 - 14 yrs
₹20L - ₹36L / yr
Data engineering
Data Engineer
Hadoop
Informatica
Qlikview
+1 more

Key  Responsibilities :

  • Development of proprietary processes and procedures designed to process various data streams around critical databases in the org
  • Manage technical resources around data technologies, including relational databases, NoSQL DBs, business intelligence databases, scripting languages, big data tools and technologies, and visualization tools.
  • Creation of a project plan, including timelines and critical milestones for success, in support of the project
  • Identification of the vital skill sets/staff required to complete the project
  • Identification of the crucial sources of data needed to achieve the objective.

 

Skill Requirement :

  • Experience with data pipeline processes and tools
  • Well versed in the data domains (Data Warehousing, Data Governance, MDM, Data Quality, Data Catalog, Analytics, BI, Operational Data Store, Metadata, Unstructured Data, ETL, ESB)
  • Experience with an existing ETL tool, e.g. Informatica, Ab Initio, etc.
  • Deep understanding of big data systems like Hadoop, Spark, YARN, Hive, Ranger, and Ambari
  • Deep knowledge of the Qlik ecosystem: QlikView, Qlik Sense, and NPrinting
  • Python or a similar programming language
  • Exposure to data science and machine learning
  • Comfort working in a fast-paced environment

Soft attributes :

  • Independence: Must have the ability to work on his/her own without constant direction or supervision. He/she must be self-motivated and possess a strong work ethic to strive to put forth extra effort continually
  • Creativity: Must be able to generate imaginative, innovative solutions that meet the needs of the organization. You must be a strategic thinker/solution seller and should be able to think of integrated solutions (with field force apps, customer apps, CCT solutions etc.). Hence, it would be best to approach each unique situation/challenge in different ways using the same tools.
  • Resilience: Must remain effective in high-pressure situations, using both positive and negative outcomes as an incentive to move forward toward fulfilling commitments to achieving personal and team goals.
Fragma Data Systems

at Fragma Data Systems

8 recruiters
Priyanka U
Posted by Priyanka U
Remote only
4 - 10 yrs
₹12L - ₹23L / yr
Informatica
ETL
Big Data
Spark
SQL
Skill: Informatica with Big Data Management

1. Minimum 6 to 8 years of experience in Informatica BDM development
2. Experience working on Spark/SQL
3. Develops Informatica mappings/SQL
4. Should have experience in Hadoop, Spark, etc.

Work days- Sun-Thu
Day shift
 
 
 
Futurense Technologies

at Futurense Technologies

1 recruiter
Rajendra Dasigari
Posted by Rajendra Dasigari
Bengaluru (Bangalore)
2 - 7 yrs
₹6L - ₹12L / yr
ETL
Data Warehouse (DWH)
Apache Hive
Informatica
Data engineering
+5 more
1. Create and maintain optimal data pipeline architecture
2. Assemble large, complex data sets that meet business requirements
3. Identify, design, and implement internal process improvements
4. Optimize data delivery and re-design infrastructure for greater scalability
5. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies
6. Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
7. Work with internal and external stakeholders to assist with data-related technical issues and support data infrastructure needs
8. Create data tools for analytics and data scientist team members
 
Skills Required:
 
1. Working knowledge of ETL on any cloud (Azure / AWS / GCP)
2. Proficient in Python (programming/scripting)
3. Good understanding of data warehousing concepts (Snowflake / AWS Redshift / Azure Synapse Analytics / Google BigQuery / Hive)
4. In-depth understanding of the principles of database structure
5. Good understanding of ETL technologies (Informatica PowerCenter / AWS Glue / Data Factory / SSIS / Spark / Matillion / Talend / Azure)
6. Proficient in SQL (query solving)
7. Knowledge of change management / version control tools (VSS, Azure DevOps, TFS, GitHub, Bitbucket, CI/CD with Jenkins)
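The data pipeline work described above reduces to the classic extract-transform-load flow; a minimal self-contained sketch, with invented names and data:

```python
import csv
import io
import sqlite3

# Minimal ETL sketch: extract from a delimited source, transform,
# load into a warehouse table. File contents and names are invented.
SOURCE_CSV = "id,amount\n1,100\n2,250\n"   # stands in for a real extract

def extract(text):
    """Read delimited source records into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cast types and derive a column, as a mapping's transformations would."""
    return [(int(r["id"]), int(r["amount"]),
             "high" if int(r["amount"]) >= 200 else "low") for r in rows]

def load(conn, rows):
    """Load the transformed rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS fact (id INT, amount INT, band TEXT)")
    conn.executemany("INSERT INTO fact VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(SOURCE_CSV)))
total = conn.execute("SELECT SUM(amount) FROM fact").fetchone()[0]
print(total)  # 350
```

Real pipelines add scheduling, retries, and monitoring around these three stages, but the shape stays the same.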
Bengaluru (Bangalore), Hyderabad, Chennai
4 - 7 yrs
₹12L - ₹15L / yr
Informatica
Data integration
Salesforce
JIRA
Oracle
+1 more
Knowledge & Experience:
  • 4-7 years of industry experience in IT or consulting organizations
  • 3+ years of experience defining and delivering Informatica Cloud Data Integration & Application Integration enterprise applications in a lead developer role
  • Must have working knowledge of integrating with Salesforce, Oracle DB, and JIRA Cloud
  • Must have working scripting knowledge (Windows or Node.js)


Soft Skills 

  • Superb interpersonal skills, both written and verbal, in order to develop materials appropriate for a variety of audiences across business and technical teams
  • Strong presentation skills; able to present and defend a point of view to Business & IT audiences
  • Excellent analysis skills and the ability to rapidly learn and take advantage of new concepts, business models, and technologies
Curl


Agency job
via wrackle by Naveen Taalanki
Bengaluru (Bangalore)
5 - 10 yrs
₹10L - ₹25L / yr
Data Visualization
PowerBI
ETL
Business Intelligence (BI)
skill iconData Analytics
+6 more
Main Responsibilities:

  • Work closely with different Front Office and Support Function stakeholders, including but not restricted to Business Management, Accounts, Regulatory Reporting, Operations, Risk, Compliance, and HR, on all data collection and reporting use cases
  • Collaborate with Business and Technology teams to understand enterprise data and create an innovative narrative to explain, engage, and enlighten regular staff members as well as executive leadership with data-driven storytelling
  • Solve data consumption and visualization through a data-as-a-service distribution model
  • Articulate findings clearly and concisely for different target use cases, including through presentations, design solutions, and visualizations
  • Perform ad hoc / automated report generation tasks using Power BI, Oracle BI, Informatica
  • Perform data access/transfer and ETL automation tasks using Python, SQL, OLAP / OLTP, RESTful APIs, and IT tools (CFT, MQ-Series, Control-M, etc.)
  • Provide support and maintain the availability of BI applications irrespective of the hosting location
  • Resolve issues escalated from Business and Functional areas on data quality, accuracy, and availability; provide incident-related communications promptly
  • Work with strict deadlines on high-priority regulatory reports
  • Serve as a liaison between business and technology to ensure that data-related business requirements for protecting sensitive data are clearly defined, communicated, well understood, and considered as part of operational prioritization and planning
  • Work for the APAC Chief Data Office and coordinate with a fully decentralized team across different locations in APAC and global HQ (Paris)

General Skills:
  • Excellent knowledge of RDBMS and hands-on experience with complex SQL is a must; some experience in NoSQL and Big Data technologies like Hive and Spark would be a plus
  • Experience with industrialized reporting on BI tools like Power BI, Informatica
  • Knowledge of data-related industry best practices in the highly regulated CIB industry; experience with regulatory report generation for financial institutions
  • Knowledge of industry-leading data access, data security, Master Data and Reference Data Management, and establishing data lineage
  • 5+ years' experience in Data Visualization / Business Intelligence / ETL developer roles
  • Ability to multi-task and manage various projects simultaneously
  • Attention to detail
  • Ability to present to Senior Management and ExCo; excellent written and verbal communication skills
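The "complex SQL" called for above typically means multi-clause aggregation for exception reporting. A minimal sketch using Python's built-in sqlite3; the trades table and the notional threshold are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (desk TEXT, notional REAL, status TEXT)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("FX", 100.0, "settled"), ("FX", 250.0, "failed"),
     ("Rates", 500.0, "settled"), ("Rates", 75.0, "failed")],
)

# Exception report: desks whose failed notional exceeds a (hypothetical) threshold
report = conn.execute("""
    SELECT desk, SUM(notional) AS failed_notional
    FROM trades
    WHERE status = 'failed'
    GROUP BY desk
    HAVING SUM(notional) > 100
    ORDER BY failed_notional DESC
""").fetchall()
print(report)  # [('FX', 250.0)]
```

The same GROUP BY / HAVING pattern scales up to the regulatory aggregations the role describes, with the warehouse engine swapped in for sqlite3.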
Pune
2 - 5 yrs
₹10L - ₹14L / yr
ETL
Datawarehousing
Data Warehouse (DWH)
SQL
Informatica

  • Review all job requirements and specifications required for deploying the solution into the production environment

  • Perform various unit tests as per the checklist on deployment steps with the help of test cases, and maintain documents for the same

  • Work with the Lead to resolve all issues within the required timeframe and inform of any delays

  • Collaborate with the development team to review new programs for implementation activities, and manage communication (if required) with different functions to resolve issues and assist implementation leads in managing production deployments

  • Document all issues during the deployment phase, document all findings from logs/during actual deployment, and share the analysis

  • Review and maintain all technical and business documents

  • Conduct and monitor the software implementation lifecycle, and assist with/make appropriate customizations to all software for clients as per the deployment/implementation guide

  • Train new members on product deployment and issues; identify all issues in processes and provide solutions for the same

  • Ensure project tasks are appropriately updated in JIRA / the ticketing tool for in-progress/done, and raise issues

  • Take self-initiative to learn/understand the technologies, i.e. Vertica SQL, the internal data integration tool (Athena), the Pulse framework, and Tableau

  • Be flexible to work during non-business hours in some exceptional cases (for a few days) as required to meet client time zones

 

Experience on Tools and Technologies preferred: 

ETL Tools: Talend, Informatica, Ab Initio, or DataStage

BI Tools: Experience in Tableau, Jaspersoft, Pentaho, or QlikView

Database: Experience in Oracle or SS 

Methodology: Experience in SDLC and/or Agile Methodology

Marktine


Vishal Sharma
Posted by Vishal Sharma
Remote, Bengaluru (Bangalore)
3 - 10 yrs
₹5L - ₹15L / yr
Big Data
ETL
PySpark
SSIS
Microsoft Windows Azure
+4 more

Must Have Skills:

- Solid knowledge of DWH, ETL, and Big Data concepts

- Excellent SQL skills (with knowledge of SQL analytics functions)

- Working experience with any ETL tool, e.g. SSIS / Informatica

- Working experience with any Azure or AWS Big Data tools

- Experience implementing data jobs (batch / real-time streaming)

- Excellent written and verbal communication skills in English; self-motivated with a strong sense of ownership and ready to learn new tools and technologies
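The SQL analytics functions mentioned above are window functions. A minimal sketch with sqlite3 (requires SQLite 3.25 or newer; the sales data is invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("North", 10), ("North", 30), ("South", 20), ("South", 5)])

# Rank rows within each region by amount using a window function,
# then keep only the top row per region
top_per_region = conn.execute("""
    SELECT region, amount
    FROM (
        SELECT region, amount,
               ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
        FROM sales
    )
    WHERE rn = 1
    ORDER BY region
""").fetchall()
print(top_per_region)  # [('North', 30), ('South', 20)]
```

The same PARTITION BY / ORDER BY structure applies unchanged in Redshift, Synapse, BigQuery, Snowflake, or Hive.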

Preferred Skills:

- Experience on Py-Spark / Spark SQL

- AWS Data Tools (AWS Glue, AWS Athena)

- Azure Data Tools (Azure Databricks, Azure Data Factory)

Other Skills:

- Knowledge about Azure Blob, Azure File Storage, AWS S3, Elastic Search / Redis Search

- Knowledge on domain/function (across pricing, promotions and assortment).

- Implementation Experience on Schema and Data Validator framework (Python / Java / SQL),

- Knowledge on DQS and MDM.

Key Responsibilities:

- Independently work on ETL / DWH / Big data Projects

- Gather and process raw data at scale.

- Design and develop data applications using selected tools and frameworks as required and requested.

- Read, extract, transform, stage and load data to selected tools and frameworks as required and requested.

- Perform tasks such as writing scripts, web scraping, calling APIs, writing SQL queries, etc.

- Work closely with the engineering team to integrate your work into our production systems.

- Process unstructured data into a form suitable for analysis.

- Analyse processed data.

- Support business decisions with ad hoc analysis as needed.

- Monitor data performance and modify infrastructure as needed.

Responsibility: A smart resource with excellent communication skills
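"Process unstructured data into a form suitable for analysis" can be as simple as regex-parsing raw text into structured records. A hedged sketch; the log format below is invented, not any specific system's output:

```python
import re

RAW_LOGS = """\
2024-01-05 12:00:01 INFO  job=load_orders rows=120
2024-01-05 12:05:42 ERROR job=load_orders rows=0
not a log line
2024-01-05 12:10:10 INFO  job=load_items rows=87
"""

# Named groups give each parsed field a label
PATTERN = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<level>\w+)\s+job=(?P<job>\w+) rows=(?P<rows>\d+)"
)

def parse_logs(text):
    """Turn raw log text into structured dicts, skipping unparseable lines."""
    records = []
    for line in text.splitlines():
        m = PATTERN.match(line)
        if m:
            rec = m.groupdict()
            rec["rows"] = int(rec["rows"])  # cast numeric field for analysis
            records.append(rec)
    return records

records = parse_logs(RAW_LOGS)
print(len(records), sum(r["rows"] for r in records if r["level"] == "INFO"))  # 3 207
```

Once the records are structured like this, they can be staged into any of the warehouses or analysis tools the listing mentions.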

 

 
Bengaluru (Bangalore), Chennai
10 - 15 yrs
₹18L - ₹23L / yr
Data governance
Informatica
Informatica Data Quality

Qualifications & Skills

  • Proven track record in delivering Data Governance Solutions to a large enterprise
  • Knowledge and experience in data governance frameworks; formulating data governance policy, standards, and processes
  • Experience in program management and managing cross-functional stakeholders, from senior leadership to project-manager level
  • Experience in leading a team of data governance business analysts
  • Experience in data governance tools like Informatica Data Quality, Enterprise Data Catalog, Axon, Collibra
  • Experience in metadata management, master and reference data management, data quality and data governance
A MNC
Agency job
via HerSecondInnings by Rampriya K
Remote, Bengaluru (Bangalore), Pune, Mumbai
4 - 11 yrs
₹4L - ₹14L / yr
PL/SQL
Stored Procedures
Informatica
Shell Scripting
Linux/Unix
+1 more
Job Description - PL/SQL Developer (Women on a career break / Immediate Joiners)
1. Understand client business requirements and interpret them into technical solutions
2. Build and maintain database stored procedures
3. Build and maintain ETL workflows
4. Perform quality assurance and testing at the unit level
5. Write and maintain user and technical documentation
6. Integrate Merkle database solutions with web services and cloud-based platforms.
Must have: SQL Server stored procedures
Good/Nice to have: UNIX shell scripting, Talend/Tidal/Databricks/Informatica, Java/Python
Experience: 2 to 10 years of experience
MNC


Agency job
via Fragma Data Systems by Harpreet kour
Bengaluru (Bangalore)
12 - 18 yrs
₹15L - ₹25L / yr
Enterprise Data Warehouse (EDW)
Business Intelligence (BI)
Informatica
MicroStrategy
Azure
+1 more
  • EDW/BI experience of 15+ years, with at least 2-3 end-to-end EDW implementations as a Solution or Technical Program Manager
  • Must have at least ONE Azure Data Platform implementation experience as a Solution or Technical Project Manager (Azure, Databricks, ADF, PySpark)
  • Must have technology experience in any of the ETL tools, such as Informatica or DataStage
  • Excellent communication and presentation skills
  • Should be well-versed in project estimation, project planning, execution, tracking & monitoring
  • Should be well-versed in delivery metrics in Waterfall and/or Agile delivery models, and scrum management
  • Preferred to have technology experience with any of the BI tools, such as MicroStrategy, Tableau, or Power BI
TechChefs Software


Shilpa Yadav
Posted by Shilpa Yadav
Remote, Anywhere from India
5 - 10 yrs
₹1L - ₹15L / yr
ETL
Informatica
skill iconPython
SQL

Responsibilities

  • Installing and configuring Informatica components, including high availability; managing server activations and de-activations for all environments; ensuring that all systems and procedures adhere to organizational best practices
  • Day-to-day administration of the Informatica suite of services (PowerCenter, IDS, Metadata, Glossary, and Analyst).
  • Informatica capacity planning and on-going monitoring (e.g. CPU, Memory, etc.) to proactively increase capacity as needed.
  • Manage backup and security of Data Integration Infrastructure.
  • Design, develop, and maintain all data warehouses, data marts, and ETL functions for the organization as part of an infrastructure team.
  • Consult with users, management, vendors, and technicians to assess computing needs and system requirements.
  • Develop and interpret organizational goals, policies, and procedures.
  • Evaluate the organization's technology use and needs and recommend improvements, such as software upgrades.
  • Prepare and review operational reports or project progress reports.
  • Assist in the daily operations of the Architecture Team, analyzing workflow, establishing priorities, developing standards, and setting deadlines.
  • Work with vendors to manage support SLAs and influence the vendor product roadmap
  • Provide leadership and guidance in technical meetings, define standards, and assist with/provide status updates
  • Work with cross-functional operations teams such as systems, storage, and network to design technology stacks.

 

Preferred Qualifications

  • Minimum of 6+ years' experience in an Informatica Engineer/Developer role
  • Minimum of 5+ years' experience in an ETL environment as a developer
  • Minimum of 5+ years of experience in SQL coding and an understanding of databases
  • Proficiency in Python
  • Proficiency in command line troubleshooting
  • Proficiency in writing code in Perl/Shell scripting languages
  • Understanding of Java and concepts of Object-oriented programming
  • Good understanding of systems, networking, and storage
  • Strong knowledge of scalability and high availability
Chennai, Bengaluru (Bangalore)
5 - 12 yrs
₹8L - ₹16L / yr
Informatica MDM
Informatica PowerCenter
Informatica
MDM
skill iconJava
+3 more
  • Looking only for candidates who can join immediately or within 15 days
  • Looking for an experienced integration specialist with good expertise in ETL Informatica and a strong application integration background
  • Minimum of 3+ years of relevant experience in Informatica MDM required; PowerCenter is a core skill set
  • Experience in the broader Informatica toolset is strongly preferred
  • Should demonstrate very strong implementation experience in application integration, with expertise across multiple use cases
  • Passionate coders with a strong application development background; years of experience could range from 5+ to 15+
  • Should have application development experience outside of ETL (just learning ETL as a tool is not enough); experience in writing applications outside of ETL will bring in more value
  • Strong database skills with a strong understanding of data, data quality, and data governance, and of developing standalone and integrated database layers (SQL, packages, functions, performance tuning); i.e., an expert with a strong integration background, who has more application integration experience than just the ETL Informatica tool
  • Experience in integration with XML/JSON-based payloads, with heavy use of JMS MQ (read/write)
  • Experience in SOAP- and REST-based APIs exchanging both XML and JSON for request and response
  • Experience with Salesforce.com integration using the Informatica PowerExchange module is a plus but not needed
  • Experience with Informatica MDM as the technology stack used to integrate senior market members with Salesforce.com is a plus but not needed
  • Very strong scripting background (C / Bourne shell / Perl / Java)
  • Should be able to understand Java; we do have development around Java, i.e. the ability to work out a solution in a programming language like Java when implementation is not possible through ETL
  • Ability to communicate effectively via multiple channels (verbal, written, etc.) with technical and non-technical staff
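The XML/JSON request-response exchange described above can be sketched with the standard library alone; the payload shape here is a hypothetical stand-in, not any specific service's schema:

```python
import json
import xml.etree.ElementTree as ET

# A hypothetical flat XML request, as a SOAP-style service might receive it
xml_request = "<order><id>42</id><customer>ACME</customer></order>"

def xml_to_json(xml_text):
    """Parse a flat XML payload and re-emit it as a JSON response body."""
    root = ET.fromstring(xml_text)
    payload = {child.tag: child.text for child in root}
    return json.dumps({root.tag: payload})

json_response = xml_to_json(xml_request)
print(json_response)  # {"order": {"id": "42", "customer": "ACME"}}
```

In an actual integration the parsing and serialization would live behind the API endpoint or the MQ consumer, but the XML-to-dict-to-JSON shape is the same.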
Preludesys India Pvt Ltd
Remote, Chennai
5 - 10 yrs
₹8L - ₹19L / yr
MuleSoft
Informatica
Integration

Job Title: Senior Software Engineer - Mulesoft / Informatica

 

Job Description:

  • Senior integration developer with experience in any two of the integration tools: Dell Boomi, MuleSoft, Informatica
  • Understand, gather, and document requirements
  • Analyze and design the solution; if needed, help the PM/Architect decide the architecture
  • Develop the solution and guide the team to develop the solution
  • Follow coding guidelines
  • Test the developed solution
  • Follow the organizational / project processes
  • Provide necessary support with right attitude to the project manager and the project team

 

Job Responsibilities:

  • Gather requirements, analyze, document the design approach, develop and produce quality output
  • Follow the project processes
  • Mentor junior resources
  • Impart training where necessary
  • Be open to learn other integration tools

 

Primary Skills Required: Dell Boomi, Mulesoft, Informatica

Secondary Skills Required: Java, Salesforce

 

Technical Skills Required:

Experience in any one of the integration tools: Dell Boomi, MuleSoft, or Informatica.
At least one certification in any integration tool is required.

 

Experience: 5 years to 10 years

Preludesys India Pvt Ltd
Remote, Chennai
3 - 5 yrs
₹5L - ₹12L / yr
MuleSoft
Informatica
BOOMI
Integration

Job Description:

  • Integration developer with experience in any one of the integration tools: Dell Boomi, MuleSoft, Informatica
  • Understand requirements and participate in designing the solution
  • Develop the solution as per the design by following coding guidelines.
  • Test the developed solution.
  • Work on defect fixes
  • Follow project processes
  • Provide accurate status report on a daily basis

 

Job Responsibilities:

  • Participate in design, and develop as per the design and requirements
  • Follow coding guidelines and produce quality output
  • Follow the project processes and complete development within the set target
  • Analyze defects to identify the root cause and fix them
  • Be open to learning other integration tools

 

Primary Skills Required: Dell Boomi, Mulesoft, Informatica

Secondary Skills Required: Java, Salesforce

 

Technical Skills Required:

Experience in any one of the integration tools: Dell Boomi, MuleSoft, or Informatica.
At least one certification in any integration tool is required.

