8+ Informatica Jobs in Pune | Informatica Job openings in Pune
• Work with various stakeholders, understand requirements, and build solutions/data pipelines that address their needs at scale
• Bring key workloads to clients’ Snowflake environments using scalable, reusable data ingestion and processing frameworks to transform a variety of datasets
• Apply best practices for Snowflake architecture, ELT, and data models
Skills (at least 50% of the following):
• A passion for all things data: understanding how to work with it at scale and, more importantly, knowing how to get the most out of it
• Good understanding of native Snowflake capabilities such as data ingestion, data sharing, zero-copy cloning, tasks, and Snowpipe
• Expertise in data modeling, with a good understanding of modeling approaches like Star schema and/or Data Vault
• Experience in automating deployments
• Experience writing code in Python, Scala, Java, or PHP
• Experience in ETL/ELT, either via a code-first approach or using low-code tools like AWS Glue, AppFlow, Informatica, Talend, Matillion, or Fivetran
• Experience with one or more AWS services, especially in relation to integration with Snowflake
• Familiarity with data visualization tools such as Tableau, Power BI, or Domo
• Experience with data virtualization tools like Trino, Starburst, Denodo, Data Virtuality, or Dremio
• SnowPro Advanced: Data Engineer certification is a must
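The star-schema modeling called out above can be sketched in a few lines. The sketch below uses Python's built-in sqlite3 as a stand-in for a real warehouse; the table and column names are hypothetical, purely for illustration.

```python
import sqlite3

# Minimal star schema: one fact table referencing two dimension tables.
# Names are illustrative, not from any real environment.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_product  (product_id  INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES dim_customer(customer_id),
        product_id  INTEGER REFERENCES dim_product(product_id),
        amount      REAL
    );
    INSERT INTO dim_customer VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO dim_product  VALUES (10, 'Hardware'), (20, 'Software');
    INSERT INTO fact_sales   VALUES (100, 1, 10, 250.0), (101, 2, 20, 400.0),
                                    (102, 1, 20, 150.0);
""")

# Typical star-schema query: join the fact table to a dimension and aggregate.
rows = conn.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
print(rows)  # [('Hardware', 250.0), ('Software', 550.0)]
```

The same shape (narrow fact table, descriptive dimensions) carries over directly to Snowflake.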
Technical/Core skills
- Minimum 3 years of experience as an Informatica Big Data (BDM) developer in a Hadoop environment.
- Knowledge of Informatica PowerExchange (PWX).
- Minimum 3 years of experience with big data querying tools like Hive and Impala.
- Ability to design and develop complex mappings using Informatica Big Data Developer.
- Experience creating and managing Informatica PowerExchange and CDC real-time implementations.
- Strong UNIX skills for writing shell scripts and troubleshooting existing scripts.
- Good knowledge of big data platforms and their frameworks.
- Experience with Cloudera Data Platform (CDP) is good to have.
- Experience building stream-processing systems using Kafka and Spark.
- Excellent SQL knowledge.
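Kafka and Spark themselves are beyond a short snippet, but the core stream-processing pattern behind the requirement above — consuming records one at a time and maintaining a keyed running aggregate — can be sketched in plain Python. The event shape is hypothetical.

```python
from collections import defaultdict

def process_stream(events):
    """Consume an iterable of (key, value) events and emit running sums —
    the basic pattern behind keyed aggregations in Kafka Streams or Spark."""
    totals = defaultdict(float)
    for key, value in events:
        totals[key] += value
        yield key, totals[key]  # emit the updated running total downstream

# Simulated source; in production this would be a Kafka consumer loop.
events = [("clicks", 1), ("views", 3), ("clicks", 2)]
results = list(process_stream(events))
print(results)  # [('clicks', 1.0), ('views', 3.0), ('clicks', 3.0)]
```

Real systems add partitioning, checkpointing, and fault tolerance on top of this same loop.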
Soft skills :
- Ability to work independently
- Strong analytical and problem-solving skills
- Eagerness to learn new technologies
- Regular interaction with vendors, partners and stakeholders
- Designing and coding the data warehousing system to desired company specifications
- Conducting preliminary testing of the warehousing environment before data is extracted
- Extracting company data and transferring it into the new warehousing environment
- Testing the new storage system once all the data has been transferred
- Troubleshooting any issues that may arise
- Providing maintenance support
- Consulting with data management teams to get a big-picture idea of the company’s data storage needs
- Presenting the company with warehousing options based on their storage needs
- 1-3 years of experience in Informatica PowerCenter
- Excellent knowledge of Oracle Database and PL/SQL, including stored procedures, functions, user-defined functions, table partitioning, indexes, views, etc.
- Knowledge of SQL Server databases
- Hands-on experience in Informatica PowerCenter and database performance tuning and optimization, including complex query optimization techniques; understanding of ETL control frameworks
- Experience in UNIX shell/Perl scripting
- Good communication skills, including the ability to write clearly
- Able to function effectively as a member of a team
- Proactive with respect to personal and technical development
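The user-defined functions mentioned above are Oracle PL/SQL constructs, but the underlying idea — registering custom logic that SQL can call — can be sketched in Python with SQLite's `create_function`. The function and table names are illustrative, not PL/SQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Register a Python function so SQL can call it — analogous in spirit to an
# Oracle user-defined function (SQLite stand-in, purely for illustration).
def net_amount(gross, tax_rate):
    """Strip a tax rate out of a gross amount."""
    return round(gross / (1 + tax_rate), 2)

conn.create_function("net_amount", 2, net_amount)

conn.execute("CREATE TABLE invoices (id INTEGER, gross REAL)")
conn.execute("INSERT INTO invoices VALUES (1, 118.0), (2, 59.0)")

rows = conn.execute(
    "SELECT id, net_amount(gross, 0.18) FROM invoices ORDER BY id"
).fetchall()
print(rows)  # [(1, 100.0), (2, 50.0)]
```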
We are looking for a Senior Data Engineer to join the Customer Innovation team, who will be responsible for acquiring, transforming, and integrating customer data onto our Data Activation Platform from customers’ clinical, claims, and other data sources. You will work closely with customers to build data and analytics solutions to support their business needs, and be the engine that powers the partnership that we build with them by delivering high-fidelity data assets.
In this role, you will work closely with our Product Managers, Data Scientists, and Software Engineers to build the solution architecture that will support customer objectives. You'll work with some of the brightest minds in the industry, work with one of the richest healthcare data sets in the world, use cutting-edge technology, and see your efforts affect products and people on a regular basis. The ideal candidate is someone who:
- Has healthcare experience and is passionate about helping heal people,
- Loves working with data,
- Has an obsessive focus on data quality,
- Is comfortable with ambiguity and making decisions based on available data and reasonable assumptions,
- Has strong data interrogation and analysis skills,
- Defaults to written communication and delivers clean documentation, and,
- Enjoys working with customers and problem solving for them.
A day in the life at Innovaccer:
- Define the end-to-end solution architecture for projects by mapping customers’ business and technical requirements against the suite of Innovaccer products and Solutions.
- Measure and communicate impact to our customers.
- Enable customers to activate data themselves using SQL, BI tools, or APIs to answer their questions at speed.
What You Need:
- 4+ years of experience in a Data Engineering role, plus a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
- 4+ years of experience working with relational databases like Snowflake, Redshift, or Postgres.
- Intermediate to advanced level SQL programming skills.
- Data analytics and visualization (using tools like Power BI).
- The ability to engage with both the business and technical teams of a client - to document and explain technical problems or concepts in a clear and concise way.
- Ability to work in a fast-paced and agile environment.
- Easily adapt and learn new things whether it’s a new library, framework, process, or visual design concept.
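"Intermediate to advanced SQL" in analytics roles typically includes window functions. A small sketch using Python's built-in sqlite3 (window functions need SQLite 3.25+); the table and data are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE visits (patient TEXT, visit_date TEXT)")
conn.executemany("INSERT INTO visits VALUES (?, ?)", [
    ("A", "2023-01-05"), ("A", "2023-03-10"), ("B", "2023-02-01"),
])

# Window function: latest visit per patient — a common analytics pattern.
rows = conn.execute("""
    SELECT patient, visit_date FROM (
        SELECT patient, visit_date,
               ROW_NUMBER() OVER (
                   PARTITION BY patient ORDER BY visit_date DESC
               ) AS rn
        FROM visits
    ) WHERE rn = 1
    ORDER BY patient
""").fetchall()
print(rows)  # [('A', '2023-03-10'), ('B', '2023-02-01')]
```

The same `ROW_NUMBER() OVER (PARTITION BY …)` idiom works unchanged in Snowflake, Redshift, and Postgres.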
What we offer:
- Industry certifications: We want you to be a subject matter expert in what you do. So, whether it’s our product or our domain, we’ll help you dive in and get certified.
- Quarterly rewards and recognition programs: We foster learning and encourage people to take risks. We recognize and reward your hard work.
- Health benefits: We cover health insurance for you and your loved ones.
- Sabbatical policy: We encourage people to take time off and rejuvenate, learn new skills, and pursue their interests so they can generate new ideas with Innovaccer.
- Pet-friendly office and open floor plan: No boring cubicles.
This role is with a global provider of Business Process Management services.
Good knowledge of Linux administration, along with application hosting on the server; excellent knowledge of Linux commands
Experience with OEL and other Linux versions; managing VAPT requirements, server support, and backup strategy
Knowledge of hosting OBIEE, Informatica, and other applications like Essbase will be an added advantage
Good technical knowledge and readiness to learn new technologies
Configuration of SSL and port-related checks
Technical documentation and debug skills
Coordination with other technical teams
Qualifications
Master's or Bachelor's degree in Engineering/Computer Science/Information Technology
Additional information
Excellent verbal and written communication skills
This role is with a global provider of Business Process Management services.
Good knowledge of Informatica ETL and Oracle Analytics Server
Analytical ability to design warehouses per user requirements, mainly in the Finance and HR domains
Good skills in analyzing existing ETL jobs and dashboards to understand the logic and make enhancements as per requirements
Good verbal and written communication skills
Qualifications
Master's or Bachelor's degree in Engineering/Computer Science/Information Technology
Additional information
Excellent verbal and written communication skills
Review all job requirements and specifications required for deploying the solution into the production environment.
Perform various unit/tests as per the checklist on deployment steps with help of test cases and maintain documents for the same.
Work with the Lead to resolve all issues within the required timeframe and communicate any delays.
Collaborate with the development team to review new programs for implementation activities and manage communication (if required) with different functions to resolve issues and assist implementation leads to manage production deployments.
Document all issues during the deployment phase and document all findings from logs/during actual deployment and share the analysis.
Review and maintain all technical and business documents. Conduct and monitor the software implementation lifecycle, and make appropriate customizations to the software for clients as per the deployment/implementation guide.
Train new members on product deployment and known issues; identify issues in processes and provide solutions for them.
Ensure project tasks are appropriately updated in JIRA / the ticketing tool (in progress/done) and raise issues as needed.
Take the initiative to learn and understand the relevant technologies, i.e., Vertica SQL, the internal data integration tool (Athena), the Pulse framework, and Tableau.
Flexibility to work during non-business hours in some exceptional cases (for a few days) to meet client time zones.
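The checklist-driven deployment verification described above can be automated with a small runner that executes each check and records pass/fail. The checks below are hypothetical placeholders; real ones would hit databases, services, or file systems.

```python
def run_checklist(checks):
    """Run named pre-deployment checks and collect pass/fail results."""
    results = {}
    for name, check in checks:
        try:
            results[name] = bool(check())
        except Exception:
            results[name] = False  # a crashing check counts as a failure
    return results

# Hypothetical checks, purely for illustration.
checks = [
    ("config present", lambda: True),
    ("db reachable", lambda: True),
    ("disk space ok", lambda: 1 / 0),  # simulated crashing check
]
results = run_checklist(checks)
print(results)
# {'config present': True, 'db reachable': True, 'disk space ok': False}
```

The resulting dict doubles as the deployment-phase documentation the role calls for.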
Experience on Tools and Technologies preferred:
ETL Tools: Talend, Informatica, Ab Initio, or DataStage
BI Tools: Tableau, Jaspersoft, Pentaho, or QlikView
Database: Experience in Oracle or SS
Methodology: Experience in SDLC and/or Agile Methodology
1. Understand client business requirements and translate them into technical solutions
2. Build and maintain database stored procedures
3. Build and maintain ETL workflows
4. Perform quality assurance and testing at the unit level
5. Write and maintain user and technical documentation
6. Integrate Merkle database solutions with web services and cloud-based platforms.
Must have: SQL Server stored procedures
Good/nice to have: UNIX shell scripting, Talend/Tidal/Databricks/Informatica, Java/Python
Experience: 2 to 10 years
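Steps 3 and 4 above — building ETL workflows and testing them at the unit level — can be sketched as a pure transformation function with a plain assertion-style unit test. The record shape and quality rules are hypothetical.

```python
def transform(records):
    """Normalize raw records: drop rows without an id, trim and title-case
    names, and cast amounts to float — a typical ETL transform step."""
    out = []
    for rec in records:
        if not rec.get("id"):
            continue  # reject rows that fail the data-quality rule
        out.append({
            "id": rec["id"],
            "name": rec.get("name", "").strip().title(),
            "amount": float(rec.get("amount", 0)),
        })
    return out

# Unit-level test of the transform, independent of any database or ETL tool.
raw = [{"id": 1, "name": " alice ", "amount": "10.5"},
       {"id": None, "name": "bad row"}]
clean = transform(raw)
assert clean == [{"id": 1, "name": "Alice", "amount": 10.5}]
print("unit test passed")
```

Keeping transforms pure like this is what makes unit-level QA possible before the workflow is wired into a scheduler.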