Business Intelligence Consultant – Qlik
Role
· Working through customer specifications and developing solutions in line with the defined requirements
· Strategizing and ideating the solution design (creating prototypes and/or wireframes) before building the application or solution.
· Creating load scripts and QVDs to support dashboards.
· Creating data models in Qlik Sense to support dashboards.
· Leading data discovery, assessment, analysis, modeling and mapping efforts for Qlik dashboards.
· Developing visual reports, dashboards and KPI scorecards using Qlik
· Connecting to data sources (MS SQL Server, Oracle, SAP), then importing and transforming data for Business Intelligence (see the SQL sketch after this list).
· Translating data into informative visuals and reports.
· Developing, publishing and scheduling reports as per the business requirements.
· Implementing application security layer models in Qlik
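A minimal sketch of the kind of source query such a load script might wrap when pulling from one of these databases; the tables, columns and daily grain are assumptions for illustration, not taken from the role.

-- Hypothetical source query feeding a Qlik data model: joins a fact
-- table to a dimension and pre-aggregates to a daily grain before load.
SELECT c.region,
       CAST(s.sold_at AS DATE)    AS sale_date,
       SUM(s.amount)              AS total_sales,
       COUNT(DISTINCT s.order_id) AS order_count
FROM   sales s
JOIN   customers c ON c.customer_id = s.customer_id
GROUP  BY c.region, CAST(s.sold_at AS DATE);

In practice the Qlik load script would wrap a query like this and store the result to a QVD, so downstream dashboards reload it without hitting the source again.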
Skills Required
· Knowledge of data visualization and data analytics principles and skills, including good user experience/UI design
· Hands-on experience in Qlik Sense development
· Knowledge of writing SQL queries
· Exceptional analytical, problem-solving and communication skills
Qualifications
1. Degree in Computer Science/Engineering disciplines or MCA
2. 2-4 years of hands-on Qlik experience
3. Qlik Sense certification would be preferred

Similar jobs
Job Description
Location- Noida (WFO)
Experience- 2+ years
Relevant experience- 2 years is required
Notice period- Immediate to 30 days (max)
Budget- up to 5 LPA
Skills required- Communication skills, Oracle APEX, PL/SQL, SQL, JavaScript
Interview process- 2 technical rounds (virtual). The last round can be F2F if needed.
Working days and timings- 5 days a week, 9:30 am to 6 pm.
About Exponentia.ai:
Exponentia.ai started its journey in 2014, when our co-founders, Rohit Mathur and Ramendra Shukla, felt that organizations needed to be technology-ready and leverage the power of data. Within a short span of 7 years, we have made a name for ourselves in the field of data and analytics and established ourselves as a leading AI tech firm in India. We have already served more than 60 enterprises across India, the UK, Singapore, the UAE and the US by establishing and implementing analytics and AI solutions.
We believe that rapid technological advancements should be the ladder to organizational growth and development and not an impediment. Our aim is to empower businesses by making new technology accessible, affordable and scalable.
Check here to know more about Exponentia.ai: https://exponentia.ai/
Data Engineer
Requirements:
1. Candidate with hands-on experience with relational SQL/NoSQL databases, preferably on large volumes of data, ideally in the BFSI domain.
2. Advanced working knowledge of SQL/PL-SQL.
3. Hands-on experience building and optimizing data pipeline architectures and data sets.
4. Hands-on experience with data warehouses and multi-dimensional data modelling techniques (an illustrative schema follows this list).
5. Hands-on experience with any of the data integration (ETL) tools, such as SSIS, Informatica or Talend.
6. Hands-on experience with processes supporting data transformation, data structures and metadata.
7. Working experience with data quality, error handling, data reconciliation, anomaly identification and error logging.
8. Working with stakeholders to understand data requirements and documenting the work.
9. Strong analytical, problem-solving and organizational skills.
10. Good to have: hands-on experience with Hadoop, Spark, Kafka, etc.
11. Good to have: hands-on experience with Snowflake, Databricks, data lakes, data lakehouses, etc.
12. Good to have: experience with AWS, Azure or Google Cloud.
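As a hedged illustration of the dimensional-modelling item above, a minimal star schema in SQL; every table and column name here is invented for the example.

-- One BFSI-flavoured fact table keyed to two dimensions (names hypothetical).
CREATE TABLE dim_date (
    date_key    INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240131
    full_date   DATE NOT NULL,
    month_name  VARCHAR(10) NOT NULL,
    year_num    INTEGER NOT NULL
);

CREATE TABLE dim_account (
    account_key INTEGER PRIMARY KEY,
    account_no  VARCHAR(20) NOT NULL,
    branch_code VARCHAR(10) NOT NULL
);

CREATE TABLE fact_transactions (
    date_key    INTEGER NOT NULL REFERENCES dim_date (date_key),
    account_key INTEGER NOT NULL REFERENCES dim_account (account_key),
    txn_amount  DECIMAL(18, 2) NOT NULL,
    txn_count   INTEGER NOT NULL DEFAULT 1
);

Queries then aggregate the fact table and slice by dimension attributes, which is the shape most BI and warehousing tools expect.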
• Development/Testing/Processes
• Developing effective QlikView/Qlik Sense data models
• Developing front end applications using Qlik technology
• Utilizing scripting language to meet complex business requirements
• Utilizing Qlik Publisher/NPrinting capabilities
• Extract, transform and load (ETL) data from multiple data sources into the Qlik application
• Design, build, test and debug Qlik solutions based upon specified requirements
• Follow implementation standards
• Utilize source control tools
• Follow deployment process
• Experience creating extract/transform/load routines from data sources including SAP BW, SAP R/3, MS SQL Server, DB2 and Oracle, as well as other data sources (a sketch of such an extract follows this list)
• Solid experience developing complex Qlik data models
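One hedged sketch of such an extract routine: an incremental pull fetching only rows changed since the last successful load; the table, columns and the :last_load_ts placeholder are invented for illustration.

-- Incremental extract against a hypothetical orders table (MS SQL Server
-- flavour); only rows modified since the previous load are pulled, so the
-- Qlik load script can merge them into an existing QVD.
SELECT order_id,
       customer_id,
       order_status,
       amount,
       last_modified
FROM   dbo.orders
WHERE  last_modified > :last_load_ts   -- placeholder filled in by the load job
ORDER  BY last_modified;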
Specific Responsibilities:
• Participating in business requirements and design review sessions
• Providing input on proposing, evaluating, and selecting appropriate design alternatives which meet requirements and are consistent with our current standards and processes
• Extracting, transforming and loading data into Qlik applications
• Developing, testing, debugging Qlik applications
• Migrating code across development and testing landscapes
• Creating publisher jobs
• Developing documentation
• Transferring knowledge and handing the application over to the BI Support team
• Good communication skills and ability to interact with the customer
• Willingness to travel is mandatory
• Experience with Qlik Sense and GeoAnalytics is an added advantage
Concepts of RDBMS, Normalization techniques
Entity Relationship diagram/ ER-Model
Transaction, commit, rollback, ACID properties
Transaction log
Difference in behavior of the column if it is nullable
SQL Statements
Join Operations
DDL, DML, Data Modelling
Optimal query writing with aggregate functions, GROUP BY, HAVING, ORDER BY, etc.; should be hands-on with scenario-based query writing (a worked example follows this list)
Query optimization techniques, indexing in depth
Understanding query plans
Batching
Locking schemes
Isolation levels
Concepts of stored procedures, cursors, triggers, views
Beginner level: PL/SQL procedure and function writing skills
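To make the scenario-based query writing and beginner PL/SQL items concrete, a small hedged example; the employees table and the 50,000 threshold are invented.

-- Scenario: departments whose average salary exceeds 50,000, highest first.
-- Exercises aggregate functions, GROUP BY, HAVING and ORDER BY together.
SELECT   department_id,
         COUNT(*)    AS headcount,
         AVG(salary) AS avg_salary
FROM     employees
GROUP BY department_id
HAVING   AVG(salary) > 50000
ORDER BY avg_salary DESC;

-- Beginner-level PL/SQL: a function wrapping the same lookup for one department.
CREATE OR REPLACE FUNCTION get_avg_salary (p_dept_id IN NUMBER)
RETURN NUMBER
IS
    v_avg NUMBER;
BEGIN
    SELECT AVG(salary) INTO v_avg
    FROM   employees
    WHERE  department_id = p_dept_id;
    RETURN v_avg;
END get_avg_salary;
/

An index on employees (department_id, salary) would let the optimizer answer both from the index alone, which is the kind of trade-off the indexing and query-plan topics above cover.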
Spring JPA and Spring Data basics
Hibernate mappings
UNIX
Basic Concepts on Unix
Commonly used Unix Commands with their options
Combining Unix commands using pipes, filters, etc.
Vi Editor & its different modes
Basic level Scripting and basic knowledge on how to execute jar files from host
Files and directory permissions
Application based scenarios.
Key Responsibilities:
- Development of proprietary processes and procedures designed to process various data streams around critical databases in the organization
- Manage technical resources around data technologies, including relational databases, NoSQL DBs, business intelligence databases, scripting languages, big data tools and technologies, and visualization tools.
- Creation of a project plan, including timelines and critical milestones, in support of the project
- Identification of the vital skill sets/staff required to complete the project
- Identification of crucial sources of the data needed to achieve the objective.
Skill Requirement:
- Experience with data pipeline processes and tools
- Well versed in the Data domains (Data Warehousing, Data Governance, MDM, Data Quality, Data Catalog, Analytics, BI, Operational Data Store, Metadata, Unstructured Data, ETL, ESB)
- Experience with an established ETL tool, e.g. Informatica or Ab Initio
- Deep understanding of big data systems like Hadoop, Spark, YARN, Hive, Ranger, Ambari
- Deep knowledge of the Qlik ecosystem: QlikView, Qlik Sense and NPrinting
- Python, or a similar programming language
- Exposure to data science and machine learning
- Comfort working in a fast-paced environment
Soft attributes:
- Independence: Must have the ability to work on his/her own without constant direction or supervision. He/she must be self-motivated and possess a strong work ethic to strive to put forth extra effort continually
- Creativity: Must be able to generate imaginative, innovative solutions that meet the needs of the organization. You must be a strategic thinker/solution seller and should be able to think of integrated solutions (with field force apps, customer apps, CCT solutions etc.), approaching each unique situation/challenge in different ways using the same tools.
- Resilience: Must remain effective in high-pressure situations, using both positive and negative outcomes as an incentive to move forward toward fulfilling commitments to achieving personal and team goals.
Job Title: Oracle PL/SQL Developer
Qualification: (B.E./B.Tech/ Masters in Computer or IT)
Years of Experience: 3 – 7 Years
No. of Open Positions – 3
Job Location: Jaipur
- Proven hands-on Database Development experience
- Develop, design, test and implement complex database programs
- Strong experience with Oracle functions, procedures, triggers, packages and performance tuning (a sketch follows this list)
- Ensure that database programs are in compliance with V3 standards.
- Hands-on development using Oracle PL/SQL.
- Performance-tune SQL, application programs and instances.
- Evaluation of new and upcoming technologies.
- Providing technical assistance, problem resolution and troubleshooting support.
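A minimal hedged sketch of the kind of PL/SQL development the role describes; the accounts table, its columns and the error code are hypothetical.

-- Stored procedure with basic error handling (all names hypothetical).
CREATE OR REPLACE PROCEDURE adjust_balance (
    p_account_id  IN  NUMBER,
    p_amount      IN  NUMBER,
    p_new_balance OUT NUMBER
)
IS
BEGIN
    UPDATE accounts
    SET    balance = balance + p_amount
    WHERE  account_id = p_account_id
    RETURNING balance INTO p_new_balance;

    -- An UPDATE touching zero rows raises no error, so check explicitly.
    IF SQL%ROWCOUNT = 0 THEN
        RAISE_APPLICATION_ERROR(-20001, 'Unknown account ' || p_account_id);
    END IF;
END adjust_balance;
/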
As a Power BI and QlikView Developer, we expect the candidate to be a key contributor to the implementation of data analytics dashboards, from data preparation to dashboard development, unit testing and deployment. The candidate's primary work focus is as follows:
- Understanding the database design.
- Developing efficient SQL queries, from simple to complex, and testing the data output as part of data preparation (see the sketch after this list)
- Development and unit testing of dashboards and data visualizations using Power BI
- Troubleshooting/debugging and rectifying issues
- Review, feedback and mentoring the team
- Adherence to standards and best practices as defined by the company, at the individual and team level
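A hedged example of the data-preparation step: a SQL Server query computing monthly revenue with a running total and a prior-month comparison before the data reaches Power BI; dbo.orders and its columns are invented.

-- Monthly revenue with a running total and the previous month's value,
-- computed in SQL before import into the dashboard (names hypothetical).
SELECT order_month,
       revenue,
       SUM(revenue) OVER (ORDER BY order_month) AS running_revenue,
       LAG(revenue) OVER (ORDER BY order_month) AS prev_month_revenue
FROM (
    SELECT DATEFROMPARTS(YEAR(order_date), MONTH(order_date), 1) AS order_month,
           SUM(amount) AS revenue
    FROM   dbo.orders
    GROUP  BY DATEFROMPARTS(YEAR(order_date), MONTH(order_date), 1)
) AS monthly;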
QUALIFICATIONS AND EXPERIENCE
- Degree in BE/ BTech with at least 3 to 5 years of overall experience
- Experience working on multiple databases such as MS SQL Server (mandatory), with PostgreSQL, MongoDB, MySQL a plus, and on data analytics projects (a minimum of 2 to 3).
- Experience writing queries on SQL Server (1 to 2 years) is mandatory.
- Expertise in Power BI and knowledge of QlikView (1-2 years).
- Understanding other tools like Tableau, Domo etc. would be a great plus
- Experience in working on different types of visualizations, in addition to the generic ones – Scatter plots, Heat Maps, Geo maps, Gantt, Bubbles, Tree Maps etc.
- Experience working on trends, forecasting, etc. would be a plus
- Experience in working with projects teams and ensuring the successful delivery of the solution
Position: Big Data Engineer
What You'll Do
Punchh is seeking to hire a Big Data Engineer at either a senior or tech-lead level. Reporting to the Director of Big Data, he/she will play a critical role in leading Punchh's big data innovations. By leveraging prior industry experience in big data, he/she will help create cutting-edge data and analytics products for Punchh's business partners.
This role requires close collaboration with the data, engineering, and product organizations. His/her job functions include:
- Work with large data sets and implement sophisticated data pipelines for both structured and unstructured data.
- Collaborate with stakeholders to design scalable solutions.
- Manage and optimize our internal data pipeline that supports marketing, customer success and data science to name a few.
- Act as a technical leader of Punchh's big data platform that supports AI and BI products.
- Work with the infra and operations teams to monitor and optimize existing infrastructure
- Occasional business travel is required.
What You'll Need
- 5+ years of experience as a Big Data engineering professional, developing scalable big data solutions.
- Advanced degree in computer science, engineering or other related fields.
- Demonstrated strength in data modeling, data warehousing and SQL.
- Extensive knowledge of cloud technologies, e.g. AWS and Azure.
- Excellent software engineering background. High familiarity with software development life cycle. Familiarity with GitHub/Airflow.
- Advanced knowledge of big data technologies, such as programming languages (Python, Java), relational databases (Postgres, MySQL), NoSQL (MongoDB), Hadoop (EMR) and streaming (Kafka, Spark).
- Strong problem solving skills with demonstrated rigor in building and maintaining a complex data pipeline.
- Exceptional communication skills and ability to articulate a complex concept with thoughtful, actionable recommendations.
Job Role : Associate Manager (Database Development)
Key Responsibilities:
- Optimizing the performance of stored procedures and SQL queries to deliver large amounts of data in under a few seconds.
- Designing and developing numerous complex queries, views, functions and stored procedures to work seamlessly with the Application/Development team's data needs.
- Responsible for providing solutions to all data-related needs to support existing and new applications.
- Creating scalable structures to cater to large user bases and manage high workloads
- Responsible in every step from the beginning stages of the projects from requirement gathering to implementation and maintenance.
- Developing custom stored procedures and packages to support new enhancement needs.
- Working with multiple teams to design, develop and deliver early warning systems.
- Reviewing query performance and optimizing code (see the sketch after this list)
- Writing queries used for front-end applications
- Designing and coding database tables to store the application data
- Data modelling to visualize database structure
- Working with application developers to create optimized queries
- Maintaining database performance by troubleshooting problems.
- Accomplishing platform upgrades and improvements by supervising system programming.
- Securing database by developing policies, procedures, and controls.
- Designing and managing deep statistical systems.
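Since PostgreSQL is called out as a must below, a minimal sketch of the review-and-optimize loop referenced above; the payments table, its columns and the index are hypothetical.

-- Step 1: inspect the actual plan and timings for a slow aggregate.
EXPLAIN ANALYZE
SELECT user_id, SUM(amount)
FROM   payments
WHERE  created_at >= now() - interval '7 days'
GROUP  BY user_id;

-- Step 2: if the plan shows a sequential scan on payments, a covering
-- index on the filter column avoids extra heap visits (PostgreSQL 11+).
CREATE INDEX idx_payments_created_at
    ON payments (created_at) INCLUDE (user_id, amount);

-- Step 3: re-run EXPLAIN ANALYZE and compare plans and timings.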
Desired Skills and Experience:
- 7+ years of experience in database development
- Minimum 4+ years of experience in PostgreSQL is a must
- Experience and in-depth knowledge in PL/SQL
- Ability to come up with multiple possible ways of solving a problem and to decide on the optimal approach for implementation that best suits the use case
- Knowledge of database administration, plus the ability and experience to use CLI tools for administration
- Experience in Big Data technologies is an added advantage
- Secondary platforms: MS SQL 2005/2008, Oracle, MySQL
- Ability to take ownership of tasks and flexibility to work individually or in a team
- Ability to communicate with teams and clients across time zones and global regions
- Good communication skills and self-motivation
- Should have the ability to work under pressure
- Knowledge of NoSQL and Cloud Architecture will be an advantage

