11+ Informatica PowerCenter Jobs in Mumbai
Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake
Job Description
- Minimum of 15 years of experience with Informatica ETL and database technologies
- Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake
- Exposure to Change Data Capture technology
- Lead and guide development of an Informatica-based ETL architecture
- Develop solutions in a highly demanding environment and provide hands-on guidance to other team members
- Head complex ETL requirements and design
- Implement an Informatica-based ETL solution that fulfills stringent performance requirements
- Collaborate with product development teams and senior designers to develop architectural requirements
- Assess requirements for completeness and accuracy, and determine whether they are actionable for the ETL team
- Conduct impact assessments and determine the size of effort based on requirements
- Develop full SDLC project plans to implement the ETL solution and identify resource requirements
- Play an active, leading role in shaping and enhancing the overall Informatica ETL architecture; identify, recommend, and implement ETL process and architecture improvements
- Assist with and verify the design of the solution and the production of all design-phase deliverables
- Manage the build phase and quality-assure code to ensure it fulfills requirements and adheres to the ETL architecture
Position: Assistant Manager – BD (Mall Media Barter Deals)
Location: Mumbai
CTC: 4.5 - 6 LPA + Incentives
Key Responsibilities
- Drive mall media barter sales across Nexus and 25+ malls Pan India.
- Leverage a strong network of brands across India to identify and close barter opportunities.
- Manage media inventory, create proposals, and present to brands/agencies.
- Pitch, negotiate, and finalize barter deals across Lifestyle, Electronics, Office Supplies, and Gifting categories.
- Ensure seamless execution of deals with timely delivery, performance tracking, and MIS/reporting.
- Achieve sales and barter revenue targets while generating agency fees.
- Build and maintain strong relationships with mall stakeholders, vendors, and clients.
Requirements
- MBA preferred, with 3–6 years’ experience in media sales, barter, mall advertising, or OOH.
- Strong brand and agency network across India.
- Proven record in closing barter/partnership deals and delivering revenue.
- Excellent negotiation, communication, and presentation skills.
- Independent, target-driven, with strong analytical and reporting ability.
We are looking for a Full Stack Developer responsible for managing back-end services and the interchange of data between the server and the users.
Your primary focus will be the development of all server-side logic, definition and maintenance of the central database, and ensuring high performance and responsiveness to requests from the front-end. You will also be responsible for integrating the front-end elements built by your co-workers into the application, so a basic understanding of front-end technologies is necessary as well.
The role will require you to:
- Create and consume RESTful APIs (see the sketch after this list)
- Design, develop, and maintain internal and external applications
- Build efficient, testable, and reusable modules
- Write high-quality, structured application/interface code and documentation
- Identify solutions through research and collaboration that resolve the root cause of problems as they arise
- Define functional and technical requirements for application software while continuing to develop your skills and knowledge
- Troubleshoot, test and maintain the core product software and databases to ensure strong optimization and functionality
- Contribute to all phases of the development lifecycle
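For illustration, here is a minimal sketch of the "create and consume RESTful APIs" responsibility in a Node.js stack, written in TypeScript with Express; the /api/items resource, its fields, and the in-memory store are hypothetical placeholders rather than anything from this role's codebase.

```typescript
// Minimal Express REST sketch (hypothetical "items" resource; in-memory store for illustration only)
import express, { Request, Response } from "express";

interface Item {
  id: number;
  name: string;
}

const app = express();
app.use(express.json()); // parse JSON request bodies

const items: Item[] = []; // stand-in for a real database layer

// Read the collection
app.get("/api/items", (_req: Request, res: Response) => {
  res.json(items);
});

// Create a new resource from the request body
app.post("/api/items", (req: Request, res: Response) => {
  const item: Item = { id: items.length + 1, name: req.body.name };
  items.push(item);
  res.status(201).json(item);
});

app.listen(3000, () => console.log("API listening on port 3000"));
```

In a real application the in-memory array would be replaced by a persistence layer such as the MySQL/NoSQL databases listed in the requirements below.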
Requirement
∙ Proficient in the Node.js and React.js development stack
∙ 2+ years’ experience designing, querying, and updating databases in MySQL/NoSQL
∙ Basic understanding of web technologies including HTML, CSS, JavaScript, AJAX, etc.
∙ Passion for best design and coding practices and a desire to develop bold new ideas
∙ Good to have: knowledge of AWS, Redis, Elasticsearch
Education: Min. Graduate in related discipline
Work experience: 2 years relevant experience
Compensation: Based on industry standards
Interview Mode: Face to Face
Ability to take references from existing clients
Ability to drive team performance and build good relationships with clients
Proven track record of target achievement
Ability to set targets and understand product costing with projections
Location: Mumbai
Experience Required: 2+ Years
Job Type: Full-Time
Notice Period: Immediate Joiners
About the Role
We are seeking a skilled and experienced Full Stack (MERN) Developer to join our product engineering team. The ideal candidate must have experience working in a product-based software development company and should have hands-on involvement in e-commerce platform projects.
Mandatory Criteria
- Candidate must be currently working or have prior experience in a product-based software development company
- Candidate must have worked on at least one e-commerce platform project (end-to-end involvement)
Key Responsibilities
- Develop and maintain scalable web applications using MongoDB, Express.js, React.js, and Node.js (a minimal sketch of the data layer follows this list)
- Translate UI/UX designs into high-quality front-end code
- Build and integrate RESTful APIs
- Ensure performance, quality, and responsiveness of applications
- Participate in design discussions, code reviews, and contribute to technical documentation
- Work in a collaborative, agile development environment
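As a rough illustration of the MongoDB/Mongoose side of such a MERN stack, the sketch below defines a hypothetical Product model and a simple query; the schema fields, database name, and connection string are assumptions made for the example only.

```typescript
// Hypothetical Product model for the MongoDB layer of a MERN-style application
import mongoose, { Schema, model } from "mongoose";

// Illustrative schema; real field names would come from the product requirements
const productSchema = new Schema(
  {
    name: { type: String, required: true },
    price: { type: Number, required: true },
    inStock: { type: Boolean, default: true },
  },
  { timestamps: true } // adds createdAt / updatedAt automatically
);

export const Product = model("Product", productSchema);

// Example usage: connect, insert, and query (connection string is a placeholder)
async function demo() {
  await mongoose.connect("mongodb://localhost:27017/shop");
  await Product.create({ name: "Sample", price: 499 });
  const available = await Product.find({ inStock: true }).lean();
  console.log(available);
  await mongoose.disconnect();
}

demo();
```

The Express and React layers would sit on top of a model like this, exposing it through the RESTful routes described above.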
Required Skills
- Strong proficiency in JavaScript, React.js, Node.js, and Express.js
- Hands-on experience with MongoDB or other NoSQL databases
- Solid understanding of RESTful APIs and architectural patterns
- Knowledge of version control systems like Git
- Understanding of CI/CD and cloud-based deployment practices
- Good grasp of performance optimization, caching, and security practices
Job Description: Data Engineer
Position Overview
We are seeking a skilled Python Data Engineer with expertise in designing and implementing data solutions using the AWS cloud platform. The ideal candidate will be responsible for building and maintaining scalable, efficient, and secure data pipelines while leveraging Python and AWS services to enable robust data analytics and decision-making processes.
Key Responsibilities
· Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis.
· Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses).
· Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.
· Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.
· Ensure data quality and consistency by implementing validation and governance practices.
· Work on data security best practices in compliance with organizational policies and regulations.
· Automate repetitive data engineering tasks using Python scripts and frameworks.
· Leverage CI/CD pipelines for deployment of data workflows on AWS.
Required Skills and Qualifications
· Professional Experience: 5+ years of experience in data engineering or a related field.
· Programming: Strong proficiency in Python, with experience in libraries like pandas, PySpark, or boto3.
· AWS Expertise: Hands-on experience with core AWS services for data engineering, such as:
· AWS Glue for ETL/ELT.
· S3 for storage.
· Redshift or Athena for data warehousing and querying.
· Lambda for serverless compute.
· Kinesis or SNS/SQS for data streaming.
· IAM Roles for security.
· Databases: Proficiency in SQL and experience with relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., DynamoDB) databases.
· Data Processing: Knowledge of big data frameworks (e.g., Hadoop, Spark) is a plus.
· DevOps: Familiarity with CI/CD pipelines and tools like Jenkins, Git, and CodePipeline.
· Version Control: Proficient with Git-based workflows.
· Problem Solving: Excellent analytical and debugging skills.
Optional Skills
· Knowledge of data modeling and data warehouse design principles.
· Experience with data visualization tools (e.g., Tableau, Power BI).
· Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).
· Exposure to other programming languages like Scala or Java.
Education
· Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Why Join Us?
· Opportunity to work on cutting-edge AWS technologies.
· Collaborative and innovative work environment.
Responsibilities
Work on React.js, MongoDB, Express.js, AngularJS, and Node.js.
Care deeply about clean and functional code.
Passionate about learning and staying up to date with the latest technologies.
Strong proficiency in JavaScript, the object model, DOM manipulation and event handlers, data structures, algorithms, JSX, and Babel.
Complete understanding of ReactJS and its main fundamentals like JSX, Virtual DOM, component lifecycle, etc. (see the sketch after this list).
Prior experience with ReactJS workflows like Flux, Redux, Create React App, and data structure libraries.
Understanding of REST APIs/GraphQL; HTML/CSS; ES6 (variables and scoping, array methods); code versioning tools such as Git and SVN; popular front-end development tools; CI/CD tools and DevOps; testing frameworks such as Mocha; and Node + npm.
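For illustration, a minimal React function component in TypeScript (TSX) showing JSX, state, and the hook-based lifecycle; the /api/users endpoint and the User shape are hypothetical stand-ins for a real data source.

```tsx
// Minimal React component illustrating JSX, state, and the effect-based lifecycle
import React, { useEffect, useState } from "react";

interface User {
  id: number;
  name: string;
}

export function UserList() {
  const [users, setUsers] = useState<User[]>([]);
  const [loading, setLoading] = useState(true);

  // Runs after the first render, roughly where componentDidMount fit in class components
  useEffect(() => {
    fetch("/api/users")
      .then((res) => res.json())
      .then((data: User[]) => setUsers(data))
      .finally(() => setLoading(false));
  }, []);

  if (loading) return <p>Loading…</p>;

  return (
    <ul>
      {users.map((u) => (
        <li key={u.id}>{u.name}</li>
      ))}
    </ul>
  );
}
```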
● You’ve been building scalable backend solutions for web applications.
● You have experience with any of these backend programming languages: Python, NodeJS, or Java.
● You write understandable, very high-quality, testable code with an eye towards maintainability.
● You are a strong communicator. Explaining complex technical concepts to designers, support, and other engineers is no problem for you.
● You possess strong computer science fundamentals: data structures, algorithms, programming languages, distributed systems, and information retrieval.
● You have completed a bachelor's degree in Computer Science, Engineering, or a related field, or have equivalent training, fellowship, or work experience.
Very good knowledge of Material Design components
Angular CLI and CDK
Creating new components
String interpolation and property binding
In-depth knowledge of the Angular core package
Two-way binding concept
Use cases for Angular lifecycle methods
Built-in directives and custom directives
Creation of custom events and event binding between components (see the sketch after this list)
Communication between two custom components
Local references, ViewChild, and ContentChild options
Services and dependency injection, and their implementation in various use cases
Angular Router and routing
Observables and their use and implementation
Pipes to transform output
Creating HTTP requests (API handling)
Dynamic components
Good knowledge of HTML5, CSS3, jQuery, and TypeScript
Proficient understanding of code versioning tools, such as Git and SVN
Adhering to best design practices
Good written and verbal communication skills
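As a small illustration of several items above (interpolation, property binding, an @Input, a custom @Output event, and a lifecycle hook), here is a hypothetical Angular component in TypeScript; the selector, template, and member names are made up for the example.

```typescript
// Minimal Angular child component: interpolation, property binding, @Input,
// a custom @Output event, and a lifecycle hook (names are illustrative)
import { Component, EventEmitter, Input, OnInit, Output } from "@angular/core";

@Component({
  selector: "app-greeting",
  template: `
    <h3>Hello, {{ name }}!</h3>
    <button [disabled]="!name" (click)="notify()">Greet</button>
  `,
})
export class GreetingComponent implements OnInit {
  @Input() name = "";                             // bound by the parent: <app-greeting [name]="user">
  @Output() greeted = new EventEmitter<string>(); // custom event: (greeted)="onGreeted($event)"

  ngOnInit(): void {
    console.log("GreetingComponent initialised"); // lifecycle hook
  }

  notify(): void {
    this.greeted.emit(`Greeted ${this.name}`);    // emit the custom event to the parent
  }
}
```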
Job brief:
We are looking for a professional Instructional Designer to act as a bridge between the customer and the development team, translating information between them and bringing a solution-oriented approach to the content development process. The ID will be part of the course design process and will also be involved in storyboarding. The objective is to help end learners acquire knowledge, skills, and competencies in an effective and appealing manner.
Responsibilities Include:
- Analyse content and identify the target audience’s training needs
- Work closely with SMEs/customers to understand concepts
- Apply instructional design theories, practices, and methods
- Analyse and research strategies where necessary
- Develop creative scripts for videos and engaging storyboards for e-learning modules (courses, videos, simulations, gamified courses, scenarios, etc.) and ILTs
- Design interactive learning objects and develop assessments and exercises
- Provide visual guidance and suitable references to developers where necessary
- Ensure that the outputs are developed in accordance with the storyboards
- Project documentation and vendor coordination
- Brief travel to customer locations as necessary
Requirements:
- Proven working experience in instructional design and storyboarding
- Excellent knowledge of learning theories and instructional design models
- Lesson and curriculum planning skills
- Advanced User of Microsoft Office – Word, PowerPoint, Excel – for storyboarding
- Understanding of how HTML5, Articulate, Flash, the Adobe Suite, and any LMS work
- Very good communication skills
Your mission is to help lead the team towards creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing, and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, internally and externally. Your drive to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer, you will be responsible for the development of data pipelines for numerous applications handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially in Spark and Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum of 5 years' relevant experience working on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big Data technologies Hadoop, Spark, HIVE, Kafka
- Writing decently optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions
- Knowledge of system development lifecycle methodologies, such as Waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- Data integration tools like Pentaho, NiFi, SSIS, etc. (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow