11+ QMC Jobs in India
Business Intelligence Consultant – Qlik
Role
· Working through customer specifications and developing solutions in line with defined requirements.
· Strategizing and ideating the solution design (creating prototypes and/or wireframes) before building the application or solution.
· Creating load scripts and QVDs to support dashboards.
· Creating data models in Qlik Sense to support dashboards.
· Leading data discovery, assessment, analysis, modeling and mapping efforts for Qlik dashboards.
· Developing visual reports, dashboards and KPI scorecards using Qlik.
· Connecting to data sources (MS SQL Server, Oracle, SAP), then importing and transforming data for business intelligence (a sample extraction query follows this list).
· Translating data into informative visuals and reports.
· Developing, publishing and scheduling reports as per the business requirements.
· Implementing application security layer models in Qlik.
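To give a flavor of the source extraction and transformation work above, a minimal SQL sketch against a hypothetical MS SQL Server sales schema; the table and column names are illustrative, not from the posting:

```sql
-- Minimal sketch of a source query that could feed a Qlik load script;
-- dbo.Orders and dbo.Customers are hypothetical tables.
SELECT  o.OrderID,
        c.Region,
        CAST(o.OrderDate AS date)  AS OrderDate,
        o.Quantity * o.UnitPrice   AS LineAmount
FROM    dbo.Orders AS o
JOIN    dbo.Customers AS c
        ON c.CustomerID = o.CustomerID
WHERE   o.OrderDate >= '2024-01-01';  -- incremental window; adjust per load strategy
```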
Skills Required
· Knowledge of data visualization and data analytics principles and skills, including good user experience/UI design
· Hands-on experience in Qlik Sense development
· Knowledge of writing SQL queries
· Exceptional analytical skills, problem-solving skills and excellent communication skills
Qualifications
1. Degree in Computer Science/Engineering disciplines or MCA
2. 2-4 years of hands-on Qlik experience
3. Qlik Sense certification would be preferred
Power BI Developer
Key Responsibilities:
- Collaborate with business stakeholders and data analysts to understand reporting requirements and translate them into effective Power BI solutions.
- Design and develop interactive and visually compelling dashboards, reports, and visualizations using Microsoft Power BI.
- Ensure data accuracy and consistency in the reports by working closely with data engineers and data architects.
- Optimize and streamline existing Power BI reports and dashboards for better performance and user experience.
- Develop and maintain data models and data connections to various data sources, ensuring seamless data integration.
- Implement security measures and data access controls to protect sensitive information in Power BI reports.
- Troubleshoot and resolve issues related to Power BI reports, data refresh, and connectivity problems.
- Stay updated with the latest Power BI features and capabilities, and evaluate their potential use in improving existing solutions.
- Conduct training sessions and workshops for end-users to promote self-service BI capabilities and enable them to create their own reports.
- Collaborate with the wider data and analytics team to identify opportunities for using Power BI to enhance business processes and decision-making.
Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Proven experience as a Power BI Developer or similar role, with a strong portfolio showcasing previous Power BI projects.
- Proficient in Microsoft Power BI, DAX (Data Analysis Expressions), and M (Power Query) to manipulate and analyze data effectively.
- Solid understanding of data visualization best practices and design principles to create engaging and intuitive dashboards.
- Strong SQL skills and experience with data modeling and database design concepts (a brief star-schema sketch follows this list).
- Knowledge of data warehousing concepts and ETL (Extract, Transform, Load) processes.
- Ability to work with various data sources, including relational databases, APIs, and cloud-based platforms.
- Excellent problem-solving skills and a proactive approach to identifying and addressing issues in Power BI reports.
- Familiarity with data security and governance practices in the context of Power BI development.
- Strong communication and interpersonal skills to collaborate effectively with cross-functional teams and business stakeholders.
- Experience with other BI tools (e.g., Tableau, QlikView) is a plus.
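To make the data-modeling expectation above concrete, a minimal star-schema sketch; all table and column names are hypothetical, and a Power BI model would typically sit on top of structures like these:

```sql
-- Minimal star-schema sketch: one fact table and two dimensions.
-- All names are hypothetical.
CREATE TABLE dim_date (
    date_key    INT PRIMARY KEY,   -- e.g. 20240131
    full_date   DATE NOT NULL,
    month_name  VARCHAR(10),
    year_number INT
);

CREATE TABLE dim_product (
    product_key  INT PRIMARY KEY,
    product_name VARCHAR(100),
    category     VARCHAR(50)
);

CREATE TABLE fact_sales (
    date_key     INT REFERENCES dim_date(date_key),
    product_key  INT REFERENCES dim_product(product_key),
    quantity     INT,
    sales_amount DECIMAL(12, 2)
);
```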
The role of a Power BI Developer is critical in enabling data-driven decision-making and empowering business users to gain valuable insights from data. The successful candidate will have a passion for data visualization and analytics, along with the ability to adapt to new technologies and drive continuous improvement in BI solutions. If you are enthusiastic about leveraging the power of data through Power BI, we encourage you to apply and join our dynamic team.
Key Responsibilities:
- Development of proprietary processes and procedures designed to process various data streams around critical databases in the organization
- Manage technical resources around data technologies, including relational databases, NoSQL databases, business intelligence databases, scripting languages, big data tools and technologies, and visualization tools.
- Creation of a project plan, including timelines and critical milestones, in support of the project
- Identification of the vital skill sets/staff required to complete the project
- Identification of crucial sources of the data needed to achieve the objective.
Skill Requirements:
- Experience with data pipeline processes and tools
- Well versed in the Data domains (Data Warehousing, Data Governance, MDM, Data Quality, Data Catalog, Analytics, BI, Operational Data Store, Metadata, Unstructured Data, ETL, ESB)
- Experience with an established ETL tool, e.g. Informatica or Ab Initio
- Deep understanding of big data systems like Hadoop, Spark, YARN, Hive, Ranger, Ambari
- Deep knowledge of the Qlik ecosystem, including QlikView, Qlik Sense, and NPrinting
- Proficiency in Python or a similar programming language
- Exposure to data science and machine learning
- Comfort working in a fast-paced environment
Soft Attributes:
- Independence: Must be able to work without constant direction or supervision; self-motivated, with a strong work ethic and a drive to continually put forth extra effort.
- Creativity: Must be able to generate imaginative, innovative solutions that meet the organization's needs. The candidate should be a strategic thinker/solution seller, able to conceive integrated solutions (with field force apps, customer apps, CCT solutions, etc.) and to approach each unique situation or challenge in different ways using the same tools.
- Resilience: Must remain effective in high-pressure situations, using both positive and negative outcomes as an incentive to move forward toward fulfilling commitments and achieving personal and team goals.
Position Profile
Qlik Sense Developer
Location: Mumbai/Gurgaon
Job Description
Role Summary:
Lead and drive development in the BI domain using the Qlik Sense ecosystem, with deep technical and BI ecosystem knowledge. The resource will be responsible for dashboard design, development, and delivery of BI services using the Qlik Sense ecosystem.
Key functions & responsibilities:
- QlikView/Qlik Sense Data Architect (possibly certified) with extensive knowledge of QlikView and Qlik Sense including best practices for data modelling, application design and development.
- Familiarity with the use of GeoAnalytics, NPrinting, extensions, widgets, mashups, ODAG and various other advanced features used in Qlik Sense development.
- Good working knowledge of Set Analysis.
- Experience working with Qlik Sense sites and the Qlik Management Console, creating rules, and managing the streams, as well as user and application security.
- Knowledge of Active Directory, proxies, load balancers, etc.
- Experience in troubleshooting connectivity, configuration, performance, etc.
- Strong communication and presentation skills.
Candidate’s Profile
Academics:
- Bachelor’s degree, preferably in Computer Science.
- A Master’s degree is an added advantage.
Experience:
2-6 years of experience in Qlik Sense design and development.
Job Title: Oracle PL/SQL Developer
Qualification: B.E. / B.Tech / Master’s in Computer Science or IT
Years of Experience: 3 – 7 Years
No. of Open Positions – 3
Job Location: Jaipur
- Proven hands-on Database Development experience
- Develop, design, test and implement complex database programs
- Strong experience with Oracle functions, procedures, triggers, packages and performance tuning
- Ensure that database programs are in compliance with V3 standards.
- Hands-on development using Oracle PL/SQL (a brief package sketch follows this list).
- Performance-tune SQL, application programs and instances.
- Evaluation of new and upcoming technologies.
- Providing technical assistance, problem resolution and troubleshooting support.
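To give a flavor of the hands-on PL/SQL work above, a minimal package sketch with one function and one procedure; the orders and order_lines tables are hypothetical:

```sql
-- Minimal PL/SQL package sketch; orders and order_lines are hypothetical tables.
CREATE OR REPLACE PACKAGE order_pkg AS
  FUNCTION order_total(p_order_id IN NUMBER) RETURN NUMBER;
  PROCEDURE close_order(p_order_id IN NUMBER);
END order_pkg;
/

CREATE OR REPLACE PACKAGE BODY order_pkg AS
  -- Sum the line amounts for one order.
  FUNCTION order_total(p_order_id IN NUMBER) RETURN NUMBER IS
    v_total NUMBER;
  BEGIN
    SELECT SUM(quantity * unit_price)
      INTO v_total
      FROM order_lines
     WHERE order_id = p_order_id;
    RETURN NVL(v_total, 0);
  END order_total;

  -- Mark an order as closed.
  PROCEDURE close_order(p_order_id IN NUMBER) IS
  BEGIN
    UPDATE orders
       SET status = 'CLOSED'
     WHERE order_id = p_order_id;
  END close_order;
END order_pkg;
/
```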
Senior Software Engineer
Must have: Power BI with PL/SQL
Experience: 5+ years
Compensation: 18 LPA
Work mode: Hybrid (WFH)
• Working knowledge of XML, JSON, shell and other DBMS scripts
• Hands-on experience with Oracle 11g and 12c; working knowledge of Oracle 18c and 19c
• Analysis, design, coding, testing, debugging and documentation; complete knowledge of the Software Development Life Cycle (SDLC)
• Writing complex queries, stored procedures, functions and packages
• Knowledge of REST services, UTL functions, DBMS functions and data integration is required
• Good knowledge of table-level partitions and row locks, and experience in OLTP
• Should be aware of ETL tools, data migration and data mapping functionality
• Understand business requirements and translate them into business solutions; perform data modelling and implement business rules using Oracle database objects
• Define source-to-target data mapping and data transformation logic per business need
• Should have worked on materialized view creation and maintenance; experience in performance tuning and impact analysis is required (a brief materialized-view sketch follows this list)
• Monitoring and optimizing database performance; planning for backup and recovery of database information; maintaining archived data; backing up and restoring databases
• Hands-on experience with SQL Developer
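To illustrate the materialized-view work above, a minimal Oracle sketch, assuming a hypothetical sales table; the refresh strategy shown is one common choice, not the posting's requirement:

```sql
-- Minimal materialized-view sketch; sales is a hypothetical table.
CREATE MATERIALIZED VIEW sales_daily_mv
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND
AS
SELECT TRUNC(sale_date) AS sale_day,
       region,
       SUM(amount)      AS total_amount,
       COUNT(*)         AS txn_count
  FROM sales
 GROUP BY TRUNC(sale_date), region;

-- Refresh on a schedule (e.g. nightly) instead of on commit:
BEGIN
  DBMS_MVIEW.REFRESH('SALES_DAILY_MV', method => 'C');  -- 'C' = complete refresh
END;
/
```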
Position: Big Data Engineer
What You'll Do
Punchh is seeking to hire a Big Data Engineer at either a senior or tech lead level. Reporting to the Director of Big Data, this engineer will play a critical role in leading Punchh’s big data innovations, leveraging prior industry experience in big data to help create cutting-edge data and analytics products for Punchh’s business partners.
This role requires close collaboration with the data, engineering, and product organizations. Job functions include:
- Work with large data sets and implement sophisticated data pipelines with both structured and unstructured data.
- Collaborate with stakeholders to design scalable solutions.
- Manage and optimize our internal data pipeline that supports marketing, customer success and data science to name a few.
- Act as a technical leader of Punchh’s big data platform that supports AI and BI products.
- Work with the infra and operations teams to monitor and optimize existing infrastructure
- Occasional business travel is required.
What You'll Need
- 5+ years of experience as a Big Data engineering professional, developing scalable big data solutions.
- Advanced degree in computer science, engineering or other related fields.
- Demonstrated strength in data modeling, data warehousing and SQL.
- Extensive knowledge of cloud technologies, e.g. AWS and Azure.
- Excellent software engineering background; high familiarity with the software development life cycle and with GitHub/Airflow.
- Advanced knowledge of big data technologies, such as programming languages (Python, Java), relational databases (Postgres, MySQL), NoSQL (MongoDB), Hadoop (EMR) and streaming (Kafka, Spark).
- Strong problem-solving skills with demonstrated rigor in building and maintaining complex data pipelines.
- Exceptional communication skills and the ability to articulate complex concepts with thoughtful, actionable recommendations.
at Velocity Services
We are an early stage start-up, building new fintech products for small businesses. Founders are IIT-IIM alumni, with prior experience across management consulting, venture capital and fintech startups. We are driven by the vision to empower small business owners with technology and dramatically improve their access to financial services. To start with, we are building a simple, yet powerful solution to address a deep pain point for these owners: cash flow management. Over time, we will also add digital banking and 1-click financing to our suite of offerings.
We have developed an MVP which is being tested in the market. We have closed our seed funding from marquee global investors and are now actively building a world class tech team. We are a young, passionate team with a strong grip on this space and are looking to on-board enthusiastic, entrepreneurial individuals to partner with us in this exciting journey. We offer a high degree of autonomy, a collaborative fast-paced work environment and most importantly, a chance to create unparalleled impact using technology.
Reach out if you want to get in on the ground floor of something which can turbocharge SME banking in India!
The technology stack at Velocity comprises a wide variety of cutting-edge technologies like NodeJS, Ruby on Rails, Reactive Programming, Kubernetes, AWS, Python, ReactJS, Redux (Saga), Redis, Lambda, etc.
Key Responsibilities
- Responsible for building data and analytical engineering pipelines with standard ELT patterns, implementing data compaction pipelines, data modelling and overseeing overall data quality
- Work with the Office of the CTO as an active member of our architecture guild
- Writing pipelines to consume data from multiple sources
- Writing a data transformation layer using DBT to transform millions of records into the data warehouse (a minimal model sketch follows this list)
- Implement data warehouse entities with common re-usable data model designs with automation and data quality capabilities
- Identify downstream implications of data loads/migration (e.g., data quality, regulatory)
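For a sense of the DBT transformation layer above, a minimal model sketch; stg_payments and the column names are hypothetical:

```sql
-- models/marts/daily_payment_totals.sql
-- Minimal DBT model sketch; stg_payments is a hypothetical staging model.
{{ config(materialized='table') }}

SELECT
    payment_date,
    merchant_id,
    SUM(amount) AS total_amount,
    COUNT(*)    AS payment_count
FROM {{ ref('stg_payments') }}
GROUP BY payment_date, merchant_id
```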
What To Bring
- 3+ years of software development experience; startup experience is a plus.
- Past experience working with Airflow and DBT is preferred
- 2+ years of experience working in any backend programming language.
- Strong first-hand experience with data pipelines and relational databases such as Oracle, Postgres, SQL Server or MySQL
- Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test-Driven Development)
- Experience with formulating ideas, building proof-of-concepts (POCs) and converting them to production-ready projects
- Experience building and deploying applications on on-premise and cloud-based (AWS or Google Cloud) infrastructure
- A basic understanding of Kubernetes and Docker is a must.
- Experience in data processing (ETL, ELT) and/or cloud-based platforms
- Working proficiency and communication skills in verbal and written English.
at Home Credit
ETL Developer – Talend
Job Duties:
- The ETL Developer is responsible for the design and development of ETL jobs which follow standards and best practices and are maintainable, modular and reusable.
- The ETL Developer will analyze and review complex object and data models and the metadata repository in order to structure the processes and data for better management and more efficient access.
- Working on multiple projects and delegating work to Junior Analysts to deliver projects on time.
- Training and mentoring Junior Analysts and building their proficiency in the ETL process.
- Preparing mapping documents to extract, transform, and load data, ensuring compatibility with all tables and requirement specifications.
- Experience in ETL system design and development with Talend / Pentaho PDI is essential.
- Create data quality rules in Talend.
- Tune Talend / Pentaho jobs for performance optimization.
- Write relational (SQL) and multidimensional (MDX) database queries.
- Functional knowledge of Talend Administration Center / Pentaho Data Integrator, job servers and load-balancing setup, and all their administrative functions.
- Develop, maintain, and enhance unit test suites to verify the accuracy of ETL processes, dimensional data, OLAP cubes and various forms of BI content including reports, dashboards, and analytical models (a sample reconciliation query follows this list).
- Exposure to MapReduce components of Talend / Pentaho PDI.
- Comprehensive understanding of, and working knowledge in, data warehouse loading, tuning, and maintenance.
- Working knowledge of relational database theory and dimensional database models.
- Creating and deploying Talend / Pentaho custom components is an added advantage.
- Java knowledge is nice to have.
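As one example of the ETL verification work above, a minimal reconciliation sketch comparing a staging table against a warehouse fact table; stg_orders and dw_fact_orders are hypothetical (on Oracle, add FROM dual to each scalar SELECT):

```sql
-- Minimal ETL reconciliation sketch; stg_orders and dw_fact_orders
-- are hypothetical staging and warehouse tables. A mismatch between
-- source_value and target_value flags a load problem.
SELECT 'row_count' AS check_name,
       (SELECT COUNT(*) FROM stg_orders)     AS source_value,
       (SELECT COUNT(*) FROM dw_fact_orders) AS target_value
UNION ALL
SELECT 'total_amount',
       (SELECT SUM(amount) FROM stg_orders),
       (SELECT SUM(amount) FROM dw_fact_orders);
```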
Skills and Qualifications:
- BE / B.Tech / MS degree in Computer Science, Engineering or a related subject.
- 3+ years of experience.
- Proficiency with Talend or Pentaho Data Integration / Kettle.
- Ability to work independently.
- Ability to handle a team.
- Good written and oral communication skills.