
- Minimum of 12 years of experience with Informatica ETL and database technologies.
- Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake.
- Exposure to change data capture (CDC) technology.
- Lead and guide development of an Informatica-based ETL architecture.
- Develop solutions in a highly demanding environment and provide hands-on guidance to other team members.
- Lead complex ETL requirements gathering and design.
- Implement an Informatica-based ETL solution that fulfills stringent performance requirements.
- Collaborate with product development teams and senior designers to translate product requirements into architectural requirements.
- Assess requirements for completeness and accuracy.
- Determine if requirements are actionable for ETL team.
- Conduct impact assessment and determine size of effort based on requirements.
- Develop full SDLC project plans to implement ETL solution and identify resource requirements.
- Take an active, leading role in shaping and enhancing the overall Informatica ETL architecture; identify, recommend, and implement ETL process and architecture improvements.
- Assist with and verify the solution design and the production of all design-phase deliverables.
- Manage the build phase and quality-assure code to ensure it fulfills requirements and adheres to the ETL architecture.

Qualification: Chartered Accountant (CA)
Experience: Minimum 2–3 years of post-qualification experience
Industry: Manufacturing (Preferred)
About the Role:
We are looking for a qualified and experienced Chartered Accountant (CA) to join our finance team as a Finance Manager. The ideal candidate should have 2–3 years of post-qualification experience, strong knowledge of finance, tax laws, and compliance, and a proactive approach to team management and financial leadership.
Key Responsibilities:
Prepare and review monthly, quarterly, and yearly financial reports
Ensure compliance with accounting standards and applicable tax laws
Review all statutory compliances, including TDS, GST, MSME, and other regulatory requirements
Manage and coordinate internal and external audits
Assist in budgeting, forecasting, and financial planning
Monitor company expenses and cost control measures
Perform account reconciliations and maintain accuracy in financial records
Review plant-wise and product-wise costing, identify cost-saving opportunities and improve efficiency
Provide insights and financial analysis to support management decisions
Lead the finance team, assign tasks, provide guidance, and ensure timely deliverables
Coordinate with auditors, consultants, plant teams, and legal advisors
Stay updated with the latest changes in financial, tax, and compliance regulations
Key Skills Required:
Strong knowledge of financial reporting, tax laws, and compliance
Experience in cost analysis and understanding of manufacturing operations (preferred)
Proficiency in accounting software like Tally, SAP, or other ERP systems
Excellent team management and leadership skills
Attention to detail, accuracy, and ability to meet deadlines
Strong communication, coordination, and analytical skills
Requirements:
Must be a qualified Chartered Accountant (CA)
Minimum 2–3 years of post-qualification experience
Must be based in Mumbai
Should be able to work independently and lead a team effectively
Prior experience in the manufacturing industry is preferred
- Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis, and data integration, implementing enterprise-level Big Data systems.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components like Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands on experience in creating real - time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including the NameNode, DataNode, ResourceManager, NodeManager and JobHistory Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration, and management of Big Data tooling and the underlying infrastructure of a Hadoop cluster.
- Hands on experience in coding MapReduce/Yarn Programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster data processing; handled importing data from different data sources into HDFS using Sqoop and performed transformations using Hive and MapReduce before loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence and customer segmentation, and with maintaining Spark Streaming jobs.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Created data pipelines for ingestion and aggregation events, loading consumer response data into Hive external tables in HDFS to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience in the Microsoft cloud and in setting up clusters on Amazon EC2 & S3, including automating cluster setup and scaling in the AWS cloud.
- Extensively worked on Spark using Python on clusters for analytics, installed on top of Hadoop, building advanced analytical applications with Spark, Hive, and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad-hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established connections to multiple Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided the access needed to pull information for analysis.
- Generated various kinds of knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying, and utilizing much of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, Mainframes, Oracle, and Netezza into HDFS, and performing transformations on it using Hive, Pig, and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra, and MongoDB, and their integration with Hadoop clusters.
- Hands-on experience in Hadoop Big Data technology, working with MapReduce, Pig, and Hive as analysis tools, and Sqoop and Flume as data import/export tools.
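The bullets above mention converting Hive/SQL queries into RDD transformations with Spark. As a rough sketch of what that conversion looks like (Spark itself is not invoked here, and the table name and data are hypothetical), the aggregation `SELECT region, SUM(amount) FROM sales GROUP BY region` can be expressed in plain Python in the same map + reduceByKey shape Spark uses:

```python
def reduce_by_key(pairs, fn):
    """Pure-Python stand-in for Spark's RDD.reduceByKey:
    combine all values sharing a key with a binary function."""
    out = {}
    for key, value in pairs:
        out[key] = fn(out[key], value) if key in out else value
    return out

# Rows standing in for the hypothetical 'sales' Hive table.
sales = [("east", 100), ("west", 250), ("east", 50), ("west", 75)]

# Equivalent of: SELECT region, SUM(amount) FROM sales GROUP BY region
totals = reduce_by_key(sales, lambda a, b: a + b)
print(totals)  # {'east': 150, 'west': 325}
```

In actual PySpark the same step would read `rdd.reduceByKey(lambda a, b: a + b)`; the sketch only illustrates the transformation logic, not Spark's distributed execution.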
- Writing, editing and proofreading website content.
- Write articles, blog posts, social media and forum content, and apply other SEO best practices.
- Develop content to support SEO strategies to drive traffic to the site and improve website Ranking.
- Understanding of Web publishing requirements.
- Experience in creating content for the web and growing a social audience.
- Editorial mindset with an ability to predict audience preferences.
- Expertise in social media platforms.
- Collaborate with multiple teams to develop compelling content plans.
- Ability to deliver high-quality documentation paying attention to details.
- Basic Knowledge of technical terminologies.
- Excellent communication skills in English both written and spoken.
(Marketing Agency Experience required)
1. Develop high-quality written and visual content, including blog posts, articles, infographics, videos, and social media posts.
Mentor and provide guidance to junior content creators to maintain consistent content quality.
2. Lead content strategy development, collaborating with the marketing team to align with marketing objectives and target audience needs.
Utilize your experience to refine and optimize our content strategy continually.
3. Develop and implement advanced content promotion strategies, including influencer outreach and partnerships.
Monitor industry trends and emerging channels for innovative content promotion opportunities.
4. Take ownership of SEO efforts, conducting in-depth keyword research, and overseeing on-page and off-page optimization efforts.
Monitor and analyze SEO performance, making data-driven improvements.
5. Provide in-depth analysis and insights into content performance, identifying areas for improvement and growth.
Create customized reports for key stakeholders to showcase the impact of content marketing efforts.
6. Manage the content calendar, ensuring alignment with marketing campaigns, industry events, and product launches.
Streamline content production processes for increased efficiency.
Qualifications:
Bachelor's degree in Marketing, Communications, English, Journalism, or a related field.
Minimum of three years of experience in content marketing, with a portfolio of successful campaigns.

Requirement:
- Must have 5+ years Drupal programming experience.
- Good experience in Drupal 7 and 8 implementations.
- Good understanding of MySQL and relational databases.
- Should have a good understanding of OOP concepts and the latest language features.
- Should have an understanding of design patterns like Singleton, Factory, etc.
- Should have worked on at least one MVC framework such as CodeIgniter, CakePHP, or Laravel.
- Good understanding of web technologies
- Good understanding of front-end development: HTML5, CSS3, JavaScript, and jQuery skills.
- Good Git version control knowledge.
- Contribution to open source community.
- 5+ years of experience in Drupal 7.
- Good with RDBMS & writing custom SQL
- Drupal API experience
- Must have PHP, JavaScript, AJAX, HTML and CSS experience
- Should be able to code as per Drupal coding standards
- Should be well versed with implementation and configuration of most commonly used modules
- Experience with CSS frameworks such as Twitter Bootstrap.
Communication Responsibilities:
- Deliver engaging, informative and well-organized presentations.
- Strong command of English language (both verbal and written).
- Resolves and/or escalates issues in a timely fashion.
Other Skills
- Disseminate technology best practices.
- Work with senior developers in adoption of new technologies within our Technology practice
- Good team player with ability to lead multiple technical teams.
- Excellent relationship building skills.
- Ability to work under pressure with a solid sense for setting priorities.
Knowledge of MySQL
Strong debugging and root cause analysis skills
L2/L3 support


Job Description For PHP/Laravel Developer
We are looking for a PHP Developer responsible for managing back-end services and the interchange of data between the server and the users. Your primary focus will be the development of all server-side logic, definition and maintenance of the central database, and ensuring high performance and responsiveness to requests from the front-end. You will also be responsible for integrating the front-end elements built by your co-workers into the application. Therefore, a basic understanding of front-end technologies is necessary as well.
Desired Technical Skills :
- Excellent skills in PHP, MySQL and JavaScript
- Knowledge of an MVC framework such as Laravel or CodeIgniter
- Strong object-oriented programming skills
- Good understanding of both front-end and back-end web development.
- Knowledge of eCommerce is a plus point.
Professional Skills:
- A team player with good communications
- Ability to solve problems quickly and efficiently
- Great aptitude and attitude towards learning
- Take full responsibility for task/project execution
- Ability to prioritize own work and respect deadlines
Benefits:
Work From Office Only
Flexible day shift
Alternate Saturday Working
Paid Leave
Salary: no bar for deserving candidates
Location: Mohali
Experience: 3 years minimum
Immediate joiners or candidates with a 15-day notice period preferred.
*No Work From Home

Job Description
We are looking for a great Go developer who possesses a strong understanding of how best to leverage and exploit the language's unique paradigms, idioms, and syntax. Your primary focus will be on developing Go packages and programs that are scalable and maintainable. You will ensure that these Go packages and programs are well documented and have reasonable test coverage. You will coordinate with the rest of the team working on different layers of the infrastructure. A commitment to collaborative problem solving, sophisticated design, and a quality product is essential.
Responsibilities
- Writing scalable, robust, testable, efficient, and easily maintainable code
- Translating software requirements into stable, working, high-performance software
- Playing a key role in architectural and design decisions, building toward an efficient microservices distributed architecture



Headquarters : Gurgaon, Haryana
Year founded : 2015
Work Location : Gurugram
Job Description –
Job Role - Software Developer Frontend
Key Capabilities
Responsibilities
- Developing new user-facing features using React.js
- Building reusable components and front-end libraries for future use
- Translating designs and wireframes into high quality code
- Optimizing components for maximum performance across a vast array of web-capable devices and browsers
- Knowledge of isomorphic React is a plus
If this sounds like you, please reply with your interest and we will take your candidature forward to the next level of evaluation. The following information would be appreciated:
Current CTC:
Expected CTC:
Notice Period:
Updated Resume: please attach
Looking forward to taking you to the next level of technology.


