21+ Informatica Jobs in Bangalore (Bengaluru) | Informatica Job openings in Bangalore (Bengaluru)
Apply to 21+ Informatica Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest Informatica Job opportunities across top companies like Google, Amazon & Adobe.
Job Title: Database Engineer
Location: Bangalore, Karnataka
Company: Wissen Technology
Experience: 5-7 Years
Joining Period: Immediate joiners or candidates currently serving notice (15-20 days remaining) only
About Us:
Wissen Technology is a global leader in technology consulting and development. We specialize in delivering top-notch solutions for Fortune 500 companies across various industries. Join us to work on cutting-edge projects in a collaborative and innovative environment.
Key Responsibilities:
- Develop and maintain ETL processes using the Informatica ETL tool, adhering to best practices.
- Write efficient and optimized SQL queries and manage database operations.
- Create and manage UNIX shell scripts to support automation and database activities.
- Analyze business requirements, propose robust solutions, and implement them effectively.
- Work collaboratively with global teams to deliver high-quality solutions within an Agile framework.
- Leverage JIRA or other ALM tools to maintain a productive development environment.
- Stay updated with new technologies and concepts to address business challenges innovatively.
Required Skills:
- Proficiency in Informatica ETL tools and ETL processes.
- Strong SQL database expertise.
- Advanced hands-on experience with UNIX shell scripting.
- Experience working in Agile methodologies.
- Familiarity with JIRA or similar ALM tools.
- Excellent problem-solving, verbal, and written communication skills.
- Proven ability to collaborate with global teams effectively.
Desired Skills:
- Knowledge of financial markets, payment solutions, and wealth management.
- Experience with Spark scripting (preferred but not mandatory).
Qualifications:
- B.Tech / M.Tech / MCA, or a related field
- Candidate needs to be an immediate joiner / serving notice period / 15-20 days
Why Join Wissen Technology?
- Opportunity to be part of a growing team focused on data-driven innovation and quality.
- Exposure to global clients and complex data management projects.
- Competitive benefits, including health coverage, paid time off, and a collaborative work environment.
At Wissen Technology, we value our team members and their contributions. We offer competitive compensation, opportunities for growth, and an environment where your ideas can make a difference!
We look forward to welcoming a detail-oriented and driven Database Engineer to our team!
Good experience in the Extraction, Transformation, and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager) and PowerConnect as ETL tools on Oracle and SQL Server databases.
Knowledge of Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, together with project scoping, analysis, requirements gathering, data modeling, ETL design, development, system testing, implementation, and production support.
Strong experience in Dimensional Modeling using Star and Snowflake schemas, and in identifying facts and dimensions.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy,
Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
Developed mapping parameters and variables to support SQL override.
Created mapplets for reuse across different mappings.
Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
Modified existing mappings for enhancements of new business requirements.
Involved in Performance tuning at source, target, mappings, sessions, and system levels.
Prepared migration documents to move mappings from development to testing and then to production repositories.
Extensive experience in developing Stored Procedures, Functions, Views, and Triggers, and writing complex SQL queries using PL/SQL.
Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica/Talend sessions, and performance tuning mappings and sessions.
Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
Extensive experience in writing UNIX shell scripts and automation of the ETL processes using
UNIX shell scripting.
Experience in using Automation Scheduling tools like Control-M.
Hands-on experience across all stages of Software Development Life Cycle (SDLC) including
business requirement analysis, data mapping, build, unit testing, systems integration, and user
acceptance testing.
Build, operate, monitor, and troubleshoot Hadoop infrastructure.
Develop tools and libraries, and maintain processes for other engineers to access data and write
MapReduce programs.
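The Slowly Changing Dimension handling mentioned above (Type 2 versioning in particular) can be sketched minimally in Python with sqlite3; the table and column names here are illustrative, not from any specific project.

```python
import sqlite3

# Minimal Type 2 SCD sketch: when a tracked attribute changes, expire
# the current dimension row and insert a new version. Illustrative names.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city TEXT,
        valid_from TEXT,
        valid_to TEXT,        -- NULL means the current version
        is_current INTEGER
    )
""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Pune', '2023-01-01', NULL, 1)")

def apply_scd2(cur, customer_id, new_city, load_date):
    cur.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,),
    )
    row = cur.fetchone()
    if row and row[0] != new_city:
        # Expire the old version...
        cur.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (load_date, customer_id),
        )
        # ...and insert the new one as current.
        cur.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, load_date),
        )

apply_scd2(cur, 1, "Bangalore", "2024-06-01")
cur.execute("SELECT COUNT(*) FROM dim_customer WHERE customer_id=1")
print(cur.fetchone()[0])  # → 2: both versions of the customer are kept
```

A Type 1 SCD would instead issue a plain UPDATE, overwriting the attribute and keeping no history.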
We are looking for a Senior Data Engineer to join the Customer Innovation team, who will be responsible for acquiring, transforming, and integrating customer data onto our Data Activation Platform from customers’ clinical, claims, and other data sources. You will work closely with customers to build data and analytics solutions to support their business needs, and be the engine that powers the partnership that we build with them by delivering high-fidelity data assets.
In this role, you will work closely with our Product Managers, Data Scientists, and Software Engineers to build the solution architecture that will support customer objectives. You'll work with some of the brightest minds in the industry, work with one of the richest healthcare data sets in the world, use cutting-edge technology, and see your efforts affect products and people on a regular basis. The ideal candidate is someone that
- Has healthcare experience and is passionate about helping heal people,
- Loves working with data,
- Has an obsessive focus on data quality,
- Is comfortable with ambiguity and making decisions based on available data and reasonable assumptions,
- Has strong data interrogation and analysis skills,
- Defaults to written communication and delivers clean documentation, and,
- Enjoys working with customers and problem solving for them.
A day in the life at Innovaccer:
- Define the end-to-end solution architecture for projects by mapping customers’ business and technical requirements against the suite of Innovaccer products and Solutions.
- Measure and communicate impact to our customers.
- Enable customers to activate data themselves using SQL, BI tools, or APIs to answer the questions they have at speed.
What You Need:
- 4+ years of experience in a Data Engineering role and a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
- 4+ years of experience working with relational databases like Snowflake, Redshift, or Postgres.
- Intermediate to advanced level SQL programming skills.
- Data Analytics and Visualization (using tools like PowerBI)
- The ability to engage with both the business and technical teams of a client - to document and explain technical problems or concepts in a clear and concise way.
- Ability to work in a fast-paced and agile environment.
- Easily adapt and learn new things whether it’s a new library, framework, process, or visual design concept.
What we offer:
- Industry certifications: We want you to be a subject matter expert in what you do. So, whether it’s our product or our domain, we’ll help you dive in and get certified.
- Quarterly rewards and recognition programs: We foster learning and encourage people to take risks. We recognize and reward your hard work.
- Health benefits: We cover health insurance for you and your loved ones.
- Sabbatical policy: We encourage people to take time off and rejuvenate, learn new skills, and pursue their interests so they can generate new ideas with Innovaccer.
- Pet-friendly office and open floor plan: No boring cubicles.
Job Description
The applicant must have a minimum of 5 years of hands-on IT experience, working on a full software lifecycle in Agile mode.
Good to have experience in data modeling and/or systems architecture.
Responsibilities will include technical analysis, design, development, and enhancements.
You will participate in all/most of the following activities:
- Working with business analysts and other project leads to understand requirements.
- Modeling and implementing database schemas in DB2 UDB or other relational databases.
- Designing, developing, and maintaining data processing pipelines using Python, DB2, Greenplum, Autosys, and other technologies
Skills/Expertise Required:
Work experience in developing large-volume databases (DB2/Greenplum/Oracle/Sybase).
Good experience in writing stored procedures, integration of database processing, tuning and optimizing database queries.
Strong knowledge of table partitions, high-performance loading and data processing.
Good to have hands-on experience working with Perl or Python.
Hands on development using Spark / KDB / Greenplum platform will be a strong plus.
Designing, developing, maintaining and supporting Data Extract, Transform and Load (ETL) software using Informatica, Shell Scripts, DB2 UDB and Autosys.
Coming up with system architecture/re-design proposals for greater efficiency and ease of maintenance and developing software to turn proposals into implementations.
Need to work with business analysts and other project leads to understand requirements.
Strong collaboration and communication skills
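The high-performance loading called out in the skills above typically comes down to batching: inserting rows in chunks inside bounded transactions rather than row-by-row autocommit. A minimal sketch, assuming sqlite3 as a stand-in for the target database (table name and batch size are illustrative):

```python
import sqlite3

# Batched loading sketch: chunked executemany with one commit per batch
# keeps transaction size bounded and avoids per-row commit overhead.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE trades (trade_id INTEGER, amount REAL)")

rows = [(i, i * 1.5) for i in range(10_000)]  # illustrative data
BATCH = 1_000
for start in range(0, len(rows), BATCH):
    cur.executemany(
        "INSERT INTO trades VALUES (?, ?)", rows[start:start + BATCH]
    )
    conn.commit()  # one commit per batch, not per row

cur.execute("SELECT COUNT(*) FROM trades")
print(cur.fetchone()[0])  # → 10000
```

On partitioned tables in DB2 or Greenplum, the same idea extends to loading each partition (or using the database's bulk-load utility) rather than the plain INSERTs shown here.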
This role is with a global provider of Business Process Management services.
Good knowledge of Linux administration and application hosting on servers; excellent knowledge of Linux commands.
Worked on OEL Linux and other Linux versions. Manage VAPT requirements, server support, and backup strategy.
Knowledge of hosting OBIEE, Informatica, and other applications like Essbase will be an added advantage.
Good technical knowledge and a readiness to learn new technologies.
Configuration of SSL and port-related checks.
Technical documentation and debugging skills.
Coordination with other technical teams.
Qualifications
Master's or Bachelor's degree in Engineering / Computer Science / Information Technology
Additional information
Excellent verbal and written communication skills
This role is with a global provider of Business Process Management services.
Good knowledge of Informatica ETL and Oracle Analytics Server
Analytical ability to design a warehouse per user requirements, mainly in the Finance and HR domains
Ability to analyze existing ETL jobs and dashboards, understand their logic, and enhance them per requirements
Good verbal and written communication skills
Qualifications
Master's or Bachelor's degree in Engineering / Computer Science / Information Technology
Additional information
Excellent verbal and written communication skills
- Experience implementing large-scale ETL processes using Informatica PowerCenter.
- Design high-level ETL process and data flow from the source system to target databases.
- Strong experience with Oracle databases and strong SQL.
- Develop & unit test Informatica ETL processes for optimal performance utilizing best practices.
- Performance tune Informatica ETL mappings and report queries.
- Develop database objects like Stored Procedures, Functions, Packages, and Triggers using SQL and PL/SQL.
- Hands-on Experience in Unix.
- Experience in Informatica Cloud (IICS).
- Work with appropriate leads and review high-level ETL design, source to target data mapping document, and be the point of contact for any ETL-related questions.
- Good understanding of project life cycle, especially tasks within the ETL phase.
- Ability to work independently and multi-task to meet critical deadlines in a rapidly changing environment.
- Excellent communication and presentation skills.
- Experience working effectively in an onsite/offshore delivery model.
This is a role that combines technical expertise with customer-management skills and requires close interaction with customers to understand requirements and use cases, schedule work, and propose solutions. The Professional Services Engineer is responsible for system implementation: developing and building pipelines (integrations) and providing product demos to our customers. This person must be able to communicate ideas clearly, both orally and in writing, to executive staff, business sponsors, and technical resources, in concise language that matches the parlance of each group.
Requirements and Preferred Skills:
1. 5+ years' experience with other integration technologies like SnapLogic, Informatica, MuleSoft, etc., and an in-depth understanding of Enterprise Integration Patterns
2. 5+ years of experience with SQL
3. Hands on experience with REST architectures
4. Knowledge of SOAP/XML/JMS/JSON, basic level understanding of REST principles, and
REST and SOAP APIs.
5. Deep understanding of HTTP protocols
6. Excellent customer facing skills
7. Must be a self-starter and extremely organized with your space and time.
8. Ability to juggle working independently and as part of a team.
9. Accurate and fast decision-making processes
10. Be able to quickly debug complex Snap issues and figure out the root cause of
problems.
11. Cycle between projects in weeks rather than years – continually learning about new
technology and products
This role is with a global business process management company.
Job Description
We are looking for a senior resource with Analyst skills and knowledge of IT projects, to support delivery of risk mitigation activities and automation in Aviva’s Global Finance Data Office. The successful candidate will bring structure to this new role in a developing team, with excellent communication, organisational and analytical skills. The Candidate will play the primary role of supporting data governance project/change activities. Candidates should be comfortable with ambiguity in a fast-paced and ever-changing environment. Preferred skills include knowledge of Data Governance, Informatica Axon, SQL, AWS. In our team, success is measured by results and we encourage flexible working where possible.
Key Responsibilities
- Engage with stakeholders to drive delivery of the Finance Data Strategy
- Support data governance project/change activities in Aviva’s Finance function.
- Identify opportunities and implement Automations for enhanced performance of the Team
Required profile
- Relevant work experience in at least one of the following: business/project analyst, project/change management and data analytics.
- Proven track record of successful communication of analytical outcomes, including an ability to effectively communicate with both business and technical teams.
- Ability to manage multiple, competing priorities and hold the team and stakeholders to account on progress.
- Contribute, plan and execute end to end data governance framework.
- Basic knowledge of IT systems/projects and the development lifecycle.
- Experience gathering business requirements and reports.
- Advanced experience with MS Excel data processing (VBA macros).
- Good communication skills
Additional Information
Degree in a quantitative or scientific field (e.g. Engineering, MBA Finance, Project Management) and/or experience in data governance/quality/privacy
Knowledge of Finance systems/processes
Experience in analysing large data sets using dedicated analytics tools
Designation – Assistant Manager TS
Location – Bangalore
Shift – 11 AM – 8 PM
Skills – Informatica with Big Data Management
1. Minimum 6 to 8 years of experience in Informatica BDM development
2. Experience working with Spark/SQL
3. Develops Informatica mappings/SQL
Informatica PowerCenter (9.x, 10.2): minimum 2+ years of experience
SQL / PL/SQL: understanding of SQL procedures; able to convert procedures into Informatica mappings.
Good to have: knowledge of Windows Batch Script is an advantage.
Key Responsibilities :
- Development of proprietary processes and procedures designed to process various data streams around critical databases in the org
- Manage technical resources around data technologies, including relational databases, NoSQL DBs, business intelligence databases, scripting languages, big data tools and technologies, and visualization tools.
- Creation of a project plan including timelines and critical milestones to success in support of the project
- Identification of the vital skill sets/staff required to complete the project
- Identification of crucial sources of the data needed to achieve the objective.
Skill Requirement :
- Experience with data pipeline processes and tools
- Well versed in the Data domains (Data Warehousing, Data Governance, MDM, Data Quality, Data Catalog, Analytics, BI, Operational Data Store, Metadata, Unstructured Data, ETL, ESB)
- Experience with an existing ETL tool, e.g., Informatica, Ab Initio, etc.
- Deep understanding of big data systems like Hadoop, Spark, YARN, Hive, Ranger, Ambari
- Deep knowledge of Qlik ecosystems: QlikView, Qlik Sense, and NPrinting
- Python, or a similar programming language
- Exposure to data science and machine learning
- Comfort working in a fast-paced environment
Soft attributes :
- Independence: Must have the ability to work on his/her own without constant direction or supervision. He/she must be self-motivated and possess a strong work ethic to strive to put forth extra effort continually
- Creativity: Must be able to generate imaginative, innovative solutions that meet the needs of the organization. You must be a strategic thinker/solution seller and should be able to think of integrated solutions (with field force apps, customer apps, CCT solutions etc.). Hence, it would be best to approach each unique situation/challenge in different ways using the same tools.
- Resilience: Must remain effective in high-pressure situations, using both positive and negative outcomes as an incentive to move forward toward fulfilling commitments to achieving personal and team goals.
2. Assemble large, complex data sets that meet business requirements
3. Identify, design, and implement internal process improvements
4. Optimize data delivery and re-design infrastructure for greater scalability
5. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies
6. Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
7. Work with internal and external stakeholders to assist with data-related technical issues and support data infrastructure needs
8. Create data tools for analytics and data scientist team members
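The extract-transform-load pipeline described in the responsibilities above can be sketched in miniature; this uses sqlite3 and in-memory CSV as stand-ins for the real sources and SQL/AWS targets, and all column names and sample data are illustrative.

```python
import csv
import io
import sqlite3

# Minimal ETL sketch: read raw CSV (extract), clean rows in Python
# (transform), and insert into a SQL staging table (load).
raw = io.StringIO("order_id,amount\n1, 100.5 \n2,200\n3,\n")

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")

for rec in csv.DictReader(raw):                  # extract
    amt = rec["amount"].strip()
    if not amt:                                  # transform: drop rows missing an amount
        continue
    cur.execute(
        "INSERT INTO stg_orders VALUES (?, ?)",  # load
        (int(rec["order_id"]), float(amt)),
    )
conn.commit()

cur.execute("SELECT COUNT(*), SUM(amount) FROM stg_orders")
print(cur.fetchone())  # → (2, 300.5): the empty-amount row was rejected
```

In a real pipeline, the rejected rows would usually be written to an error table for reconciliation rather than silently dropped.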
Skills Required:
1. Working knowledge of ETL on any cloud (Azure / AWS / GCP)
2. Proficient in Python (Programming / Scripting)
3. Good understanding of any of the data warehousing concepts (Snowflake / AWS Redshift / Azure Synapse Analytics / Google Big Query / Hive)
4. In-depth understanding of principles of database structure
5. Good understanding of any of the ETL technologies (Informatica PowerCenter / AWS Glue / Data Factory / SSIS / Spark / Matillion / Talend / Azure)
6. Proficient in SQL (query solving)
7. Knowledge of change management / version control (VSS / DevOps / TFS / GitHub, Bitbucket, CI/CD with Jenkins)
Several years of experience in designing web applications
- 4-7 years of Industry experience in IT or consulting organizations
- 3+ years of experience defining and delivering Informatica Cloud Data Integration & Application Integration enterprise applications in lead developer role
- Must have working knowledge on integrating with Salesforce, Oracle DB, JIRA Cloud
- Must have working scripting knowledge (Windows or Node.js)
Soft Skills
- Superb interpersonal skills, both written and verbal, in order to effectively develop materials that are appropriate for variety of audience in business & technical teams
- Strong presentation skills, successfully present and defend point of view to Business & IT audiences
- Excellent analysis skills and ability to rapidly learn and take advantage of new concepts, business models, and technologies
Work closely with different Front Office and Support Function stakeholders, including but not restricted to Business Management, Accounts, Regulatory Reporting, Operations, Risk, Compliance, and HR, on all data collection and reporting use cases.
Collaborate with Business and Technology teams to understand enterprise data, create an innovative narrative to explain, engage and enlighten regular staff members as well as executive leadership with data-driven storytelling
Solve data consumption and visualization through data as a service distribution model
Articulate findings clearly and concisely for different target use cases, including through presentations, design solutions, visualizations
Perform ad hoc / automated report generation tasks using Power BI, Oracle BI, and Informatica
Perform data access/transfer and ETL automation tasks using Python, SQL, OLAP / OLTP, RESTful APIs, and IT tools (CFT, MQ-Series, Control-M, etc.)
Provide support and maintain the availability of BI applications irrespective of the hosting location
Resolve issues escalated from Business and Functional areas on data quality, accuracy, and availability, provide incident-related communications promptly
Work with strict deadlines on high priority regulatory reports
Serve as a liaison between business and technology to ensure that data related business requirements for protecting sensitive data are clearly defined, communicated, and well understood, and considered as part of operational
prioritization and planning
Work for the APAC Chief Data Office and coordinate with a fully decentralized team across different locations in APAC and the global HQ (Paris).
General Skills:
Excellent knowledge of RDBMS and hands-on experience with complex SQL is a must, some experience in NoSQL and Big Data Technologies like Hive and Spark would be a plus
Experience with industrialized reporting on BI tools like PowerBI, Informatica
Knowledge of data related industry best practices in the highly regulated CIB industry, experience with regulatory report generation for financial institutions
Knowledge of industry-leading data access, data security, Master Data, and Reference Data Management, and establishing data lineage
5+ years experience on Data Visualization / Business Intelligence / ETL developer roles
Ability to multi-task and manage various projects simultaneously
Attention to detail
Ability to present to Senior Management, ExCo; excellent written and verbal communication skills
Must Have Skills:
- Solid Knowledge on DWH, ETL and Big Data Concepts
- Excellent SQL Skills (With knowledge of SQL Analytics Functions)
- Working experience with any ETL tool, e.g., SSIS / Informatica
- Working Experience on any Azure or AWS Big Data Tools.
- Experience on Implementing Data Jobs (Batch / Real time Streaming)
- Excellent written and verbal communication skills in English; self-motivated with a strong sense of ownership; ready to learn new tools and technologies
Preferred Skills:
- Experience on Py-Spark / Spark SQL
- AWS Data Tools (AWS Glue, AWS Athena)
- Azure Data Tools (Azure Databricks, Azure Data Factory)
Other Skills:
- Knowledge about Azure Blob, Azure File Storage, AWS S3, Elastic Search / Redis Search
- Knowledge on domain/function (across pricing, promotions and assortment).
- Implementation experience with a Schema and Data Validator framework (Python / Java / SQL)
- Knowledge of DQS and MDM.
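A schema-and-data validator of the kind mentioned above can be sketched simply: each column declares an expected type and a nullability rule, and records are checked against that. The schema and sample records below are illustrative, not from any real framework.

```python
# Minimal schema/data validator sketch. Each column rule names an
# expected Python type and whether nulls are allowed. Illustrative names.
SCHEMA = {
    "sku": {"type": str, "nullable": False},
    "price": {"type": float, "nullable": False},
    "category": {"type": str, "nullable": True},
}

def validate(record, schema):
    """Return a list of violations for one record (empty list = valid)."""
    errors = []
    for col, rule in schema.items():
        value = record.get(col)
        if value is None:
            if not rule["nullable"]:
                errors.append(f"{col}: null not allowed")
        elif not isinstance(value, rule["type"]):
            errors.append(f"{col}: expected {rule['type'].__name__}")
    return errors

good = {"sku": "A-1", "price": 9.99, "category": None}
bad = {"sku": "A-2", "price": "9.99", "category": "toys"}
print(validate(good, SCHEMA))  # → []
print(validate(bad, SCHEMA))   # → ['price: expected float']
```

A production framework would add range checks, referential checks against MDM reference data, and reporting of violation counts per batch, but the core per-record check looks like this.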
Key Responsibilities:
- Independently work on ETL / DWH / Big data Projects
- Gather and process raw data at scale.
- Design and develop data applications using selected tools and frameworks as required and requested.
- Read, extract, transform, stage and load data to selected tools and frameworks as required and requested.
- Perform tasks such as writing scripts, web scraping, calling APIs, and writing SQL queries.
- Work closely with the engineering team to integrate your work into our production systems.
- Process unstructured data into a form suitable for analysis.
- Analyse processed data.
- Support business decisions with ad hoc analysis as needed.
- Monitor data performance and modify infrastructure as needed.
Responsibility: a smart resource with excellent communication skills
Client of People First Consultants
Qualifications & Skills
- Proven track record in delivering Data Governance Solutions to a large enterprise
- Knowledge of and experience in data governance frameworks, and in formulating data governance policies, standards, and processes
- Experience in program management and managing cross functional stakeholders from senior leadership to project manager level
- Experience in leading a team of data governance business analysts
- Experience in data governance tools like Informatica Data Quality, Enterprise Data Catalog, Axon, Collibra
- Experience in metadata management, master and reference data management, data quality and data governance
1.Understand client business requirements and interpret into technical solutions
2. Build and maintain database stored procedures
3. Build and maintain ETL workflows
4. Perform quality assurance and testing at the unit level
5. Write and maintain user and technical documentation
6. Integrate Merkle database solutions with web services and cloud-based platforms.
Must have: SQL Server stored procedures
Good/Nice to have: UNIX shell scripting, Talend/Tidal/Databricks/Informatica, Java/Python
Experience: 2 to 10 years
- EDW/BI experience of 15+ years with at least 2-3 end to end EDW implementation experience as Solution or Technical program manager
- Must have at least ONE Azure Data Platform implementation experience as Solution or Technical Project manager (Azure, Databricks, ADF, PySpark)
- Must have technology experience in any of the ETL tools, like Informatica, DataStage, etc.
- Excellent communication and presentation skills
- Should be well versed with project estimation, project planning, execution, tracking & monitoring
- Should be well versed with delivery metrics in Waterfall and/or Agile delivery models, scrum management
- Preferred to have technology experience of any of the BI tools like MicroStrategy, Tableau, Power BI and etc.
- Looking only for candidates available immediately or within 15 days.
- Looking for an experienced Integration specialist with a good expertise in ETL Informatica and a strong Application integration background
- Minimum of 3+ years of relevant experience in Informatica MDM required; PowerCenter is a core skill set.
- Having experience in a broader Informatica toolset is strongly preferred
- Should demonstrate very strong implementation experience in application integration, with expertise across multiple use cases
- Passionate coders with a strong application development background; years of experience could range from 5+ to 15+
- Should have application development experience outside of ETL (just learning ETL as a tool is not enough); experience in writing applications outside of ETL brings more value
- Strong database skills with a solid understanding of data, data quality, and data governance, and of developing standalone and integrated database layers (SQL, packages, functions, performance tuning); i.e., an expert with a strong application integration background, not just an ETL Informatica tool background
- Experience in integration with XML/JSON-based systems, heavily involving JMS MQ (read/write)
- Experience with SOAP- and REST-based APIs exchanging both XML and JSON for request and response
- Experience with Salesforce.com integration using the Informatica PowerExchange module is a plus but not required
- Experience with Informatica MDM as the technology stack used to integrate senior-market members with Salesforce.com is a plus but not required
- Very strong scripting background (C / Bourne shell / Perl / Java)
- Should be able to understand Java; we do have development around Java, i.e., the ability to build a solution in a programming language like Java when implementation is not possible through ETL
- Ability to communicate effectively via multiple channels (verbal, written, etc.) with technical and non-technical staff.
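The XML/JSON request-and-response exchange described in the integration requirements above can be sketched with the standard library; the element names and payload here are illustrative.

```python
import json
import xml.etree.ElementTree as ET

# Sketch of carrying the same payload as XML (SOAP-style body) and JSON
# (REST-style body). Illustrative element names.
def xml_to_dict(xml_text):
    """Flatten a one-level XML document into a dict."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

def dict_to_xml(tag, payload):
    """Build a one-level XML document from a dict."""
    root = ET.Element(tag)
    for key, value in payload.items():
        ET.SubElement(root, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

request_xml = "<order><id>42</id><status>NEW</status></order>"
payload = xml_to_dict(request_xml)       # parse the incoming XML request
response_json = json.dumps(payload)      # respond as JSON
print(response_json)                     # → {"id": "42", "status": "NEW"}

round_trip = dict_to_xml("order", json.loads(response_json))
print(round_trip)                        # → <order><id>42</id><status>NEW</status></order>
```

Real SOAP payloads add namespaces and envelopes, and nested structures need recursive handling, but the translation between the two wire formats follows this shape.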