Good knowledge of Informatica ETL and Oracle Analytics Server
Analytical ability to design data warehouses per user requirements, mainly in the Finance and HR domains
Good skills in analyzing existing ETL and dashboards to understand the logic and implement enhancements as required
Good verbal and written communication skills
Master's or Bachelor's degree in Engineering/Computer Science/Information Technology
Excellent verbal and written communication skills
Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake
- Minimum of 15 years of experience with Informatica ETL and database technologies
- Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake
- Exposure to change data capture (CDC) technology
- Lead and guide development of an Informatica-based ETL architecture
- Develop solutions in a highly demanding environment and provide hands-on guidance to other team members
- Handle complex ETL requirements and design
- Implement an Informatica-based ETL solution fulfilling stringent performance requirements
- Collaborate with product development teams and senior designers to develop architectural requirements
- Assess requirements for completeness and accuracy, and determine whether they are actionable for the ETL team
- Conduct impact assessments and size the effort based on requirements
- Develop full SDLC project plans to implement the ETL solution and identify resource requirements
- Play an active, leading role in shaping and enhancing the overall Informatica ETL architecture; identify, recommend, and implement ETL process and architecture improvements
- Assist with and verify solution design and the production of all design-phase deliverables
- Manage the build phase and quality-assure code to ensure it fulfills requirements and adheres to the ETL architecture
- Proficiency in data warehousing concepts, fundamentals, and architecture techniques, including MOLAP, ROLAP, ODS, DM, and EDW.
- Extracting company data and transferring it into the new warehousing environment.
- Testing the new storage system once all the data has been transferred.
- Document all technical and system specifications for all ETL processes, perform unit tests on all processes, and prepare required programs and scripts
- Understanding of standard business functions like Sales, Finance, Inventory, Production reporting etc.
- Understanding of Enterprise applications like Salesforce, Microsoft Navision.
- Hands-on experience with Azure Databricks, Data Pipelines, Data Factory, Azure Data Lake, and Azure SQL Database
- Must have been part of 2-3 projects through their entire project lifecycle.
- Should be involved in direct customer interaction.
- Must have performed data validation, support, issue resolution, and bug fixing.
- Must have worked on standard business functions like Sales & Inventory.
- Ability to integrate analytics tools like Power BI
- Knowledge of Power BI DAX functions and Power Query to prepare effective business metrics and KPIs that indicate business performance.
- Develop and implement reports using effective data modelling techniques
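To illustrate the kind of KPI such reports surface, here is a minimal sketch in Python rather than DAX (the sales figures are hypothetical; in Power BI the same metric would typically be written as a DAX measure):

```python
# Hypothetical monthly sales figures, a stand-in for a real data model.
sales = {"Jan": 100.0, "Feb": 120.0, "Mar": 90.0}

def month_over_month_growth(data):
    """Percent change versus the previous month, a common business KPI."""
    months = list(data)
    growth = {}
    for prev, cur in zip(months, months[1:]):
        growth[cur] = round((data[cur] - data[prev]) / data[prev] * 100, 1)
    return growth

print(month_over_month_growth(sales))  # → {'Feb': 20.0, 'Mar': -25.0}
```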
This is a role that combines technical expertise with customer management skills and requires
close interaction with the customer to understand requirements/use cases, schedule work,
and propose solutions. The Professional Services Engineer is responsible for system
implementation: developing, building pipelines (integrations), and providing product
demos to our customers. This person needs the ability to share and communicate ideas clearly,
both orally and in writing, to executive staff, business sponsors, and technical resources, in
clear, concise language that is the parlance of each group.
Requirements and Preferred Skills:
1. 5+ years' experience with integration technologies such as SnapLogic, Informatica,
MuleSoft, etc., and an in-depth understanding of Enterprise Integration Patterns
2. 5+ years of experience with SQL
3. Hands on experience with REST architectures
4. Knowledge of SOAP/XML/JMS/JSON, a basic understanding of REST principles, and
experience with REST and SOAP APIs.
5. Deep understanding of HTTP protocols
6. Excellent customer facing skills
7. Must be a self-starter and extremely organized with your space and time.
8. Ability to juggle working independently and as part of a team.
9. Accurate and fast decision-making
10. Be able to quickly debug complex Snap issues and identify the root cause of problems
11. Cycle between projects in weeks rather than years, continually learning about new
technologies and products
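The REST/JSON skills in the list above can be sketched minimally in Python. The endpoint URL and payload below are hypothetical placeholders, not a real service; the request is constructed but never sent:

```python
import json
import urllib.request

# Hypothetical endpoint, a placeholder rather than a real service.
BASE_URL = "https://api.example.com/v1/orders"

def build_create_order_request(order):
    """Build a REST-style POST request with a JSON body (constructed, not sent)."""
    body = json.dumps(order).encode("utf-8")
    return urllib.request.Request(
        BASE_URL,
        data=body,
        headers={"Content-Type": "application/json", "Accept": "application/json"},
        method="POST",
    )

req = build_create_order_request({"id": 42, "item": "widget"})
print(req.method, req.full_url)  # → POST https://api.example.com/v1/orders
```

Sending it would be a single `urllib.request.urlopen(req)` call; the same pattern applies with any HTTP client library.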
This requirement is to service our client, a leading big data technology company that measures what viewers consume across platforms to enable marketers to make better advertising decisions. We are seeking a Senior Data Operations Analyst to mine large-scale datasets for our client. Their work will have a direct impact on driving business strategies for prominent industry leaders. Self-motivation and strong communication skills are both must-haves. The ability to work in a fast-paced environment is desired.
Problems being solved by our client:
Measure consumer usage of devices linked to the internet and home networks, including computers, mobile phones, tablets, streaming sticks, smart TVs, thermostats, and other appliances. There are more screens and other connected devices in homes than ever before, yet there have been major gaps in understanding how consumers interact with this technology. Our client uses a measurement technology to unravel the dynamics of consumers’ interactions with multiple devices.
Duties and responsibilities:
- The successful candidate will contribute to the development of novel audience measurement and demographic inference solutions.
- Develop, implement, and support statistical or machine learning methodologies and processes.
- Build and test new features and concepts, and integrate them into the production process
- Participate in ongoing research and evaluation of new technologies
- Exercise your experience in the development lifecycle through analysis, design, development, testing and deployment of this system
- Collaborate with teams in Software Engineering, Operations, and Product Management to deliver timely and quality data. You will be the knowledge expert, delivering quality data to our clients
- 3-5 years of relevant work experience in the areas outlined below
- Experience in extracting data using SQL from large databases
- Experience in writing complex ETL processes and frameworks for analytics and data management. Must have hands-on experience with ETL tools.
- Master’s degree or PhD in Statistics, Data Science, Economics, Operations Research, Computer Science, or a similar degree with a focus on statistical methods. A Bachelor’s degree in the same fields with significant, demonstrated professional research experience will also be considered.
- Programming experience in scientific computing language (R, Python, Julia) and the ability to interact with relational data (SQL, Apache Pig, SparkSQL). General purpose programming (Python, Scala, Java) and familiarity with Hadoop is a plus.
- Excellent verbal and written communication skills.
- Experience with TV or digital audience measurement or market research data is a plus.
- Familiarity with systems analysis or systems thinking is a plus.
- Must be comfortable with analyzing complex, high-volume and high-dimension data from varying sources
- Excellent verbal, written and computer communication skills
- Ability to engage with Senior Leaders across all functional departments
- Ability to take on new responsibilities and adapt to changes
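The SQL-extraction and ETL skills listed above can be sketched with a small, hypothetical example. An in-memory SQLite table stands in for a large production database, and the schema and thresholds are illustrative only:

```python
import sqlite3

# In-memory SQLite as a stand-in for a large production database (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE views (device TEXT, minutes INTEGER)")
conn.executemany(
    "INSERT INTO views VALUES (?, ?)",
    [("tv", 120), ("mobile", 45), ("tv", 30), ("tablet", 15)],
)

# Extract: aggregate usage per device, a typical measurement-style query.
rows = conn.execute(
    "SELECT device, SUM(minutes) AS total FROM views "
    "GROUP BY device ORDER BY total DESC"
).fetchall()

# Transform: keep only devices with meaningful usage before loading downstream.
significant = [(device, total) for device, total in rows if total >= 30]
print(significant)  # → [('tv', 150), ('mobile', 45)]
```

In practice the extract step would run against a production warehouse and the transformed rows would be loaded into a downstream table, but the extract/transform/load shape is the same.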
Mandatory Skills:
A Sisense Developer with 5+ years of experience should have the following skills:
- Design and build BI dashboards in short time frames via rapid prototyping and agile development
- Apply advanced visualization techniques to reveal business insights
- Perform rigorous data analysis to proactively identify any inconsistencies or data quality issues. Provide recommendations for improvements
- Develop a strong understanding of different data sources, and strategically implement data flows for robustness and scalability
- Demonstrate ability to translate business questions into structured analysis and data analytics into business insights
- Work to improve the day-to-day activities of the analytics team with technology and automation
- Clear demonstrable experience working to develop and improve reporting systems
- Advanced SQL scripting skills
- Highly skilled in Sisense, with proven experience creating reports and dashboards from scratch.
- Confidence showing your work and delivering reports/dashboards to stakeholders
The developer is expected to be involved in the following:
- Gather stakeholder requirements and produce multiple sophisticated reports and dashboards in Sisense
- Add to and alter existing reports as the needs of the business change
- Offer suggestions as to how the reporting system can be improved or functionality added
- Deliver these improvements and develop additional features for the system
Optional Skills (Good to have): Sisense certification
* Formulates and recommends standards for achieving maximum performance
and efficiency of the DW ecosystem.
* Participates in pre-sales activities for various customer solutions.
* Develops business cases and ROI for customers/clients.
* Interviews stakeholders and develops a BI roadmap for success on a given project.
* Evangelizes self-service BI and visual discovery while helping to automate any
manual processes at the client site.
* Works closely with the Engineering Manager to ensure prioritization of work.
* Champions data quality, integrity, and reliability throughout the organization by
designing and promoting best practices.
* Helps DW/DE team members with issues requiring technical expertise or
knowledge of complex systems and/or programming.
* Provides on-the-job training for new or less experienced team members.
* Develops a technical excellence team.
- experience designing business intelligence solutions
- experience with ETL Process, Data warehouse architecture
- experience with Azure Data services, e.g., ADF, ADLS Gen 2, Azure SQL DB,
Synapse, Azure Databricks, and Power BI
- Good analytical and problem-solving skills
- Fluent in relational database concepts and flat file processing concepts
- Must be knowledgeable in software development lifecycles/methodologies
Informatica PowerCenter (9.x, 10.2): minimum 2+ years of experience
SQL/PLSQL: Understanding of SQL procedures; able to convert procedures into Informatica mappings.
Good to have: knowledge of Windows batch scripting.
We are looking for a Talend developer for a reputed company, for a permanent role in Coimbatore.
Skills: ETL tools, any DB, supporting tools
- 4+ years of extensive experience in TIBCO Spotfire dashboard development is a must
- Design and create data visualizations in TIBCO Spotfire
- Proven experience in delivering Spotfire solutions to advance business goals and needs.
- Detailed knowledge of TIBCO Spotfire report developer configuration
- Experience creating all chart types that exist in Spotfire (scatter, line, bar, combo, pie, etc.) and manipulating every property associated with a visualization (trellis, color, shape, size, etc.)
- Experience in writing efficient SQL queries and views in relational databases such as Oracle, SQL Server, Postgres, and BigQuery (optional).
- Ability to incorporate multiple data sources into one Spotfire DXP and have that information linked via data table relations.
- Experience with Spotfire administrative tasks, load balancing, installation/configuration of servers and clients, and upgrades and patches would be a plus.
- Strong background in analytical visualizations and building executive dashboards.
- In-depth knowledge and understanding of BI and data warehouse concepts