- Hands-on development and maintenance experience in Tableau: developing, maintaining, and managing advanced reporting, analytics, dashboards, and other BI solutions
- Review and improve existing Tableau dashboards, data models, and systems, and collaborate with teams to integrate new systems
- Provide support and expertise to the business community to help it make better use of Tableau
- Understand business requirements, conduct analysis, and recommend solution options for intelligent dashboards in Tableau
- Experience with data Extraction, Transformation, and Load (ETL)
- Execute SQL queries across multiple data sources in support of business intelligence reporting needs, and format query results and reports in various ways (see the sketch after this list)
- Participate in QA testing, liaise with other project team members, and respond to client needs, with attention to detail in a fast-paced environment
- Perform and document data analysis, data validation, and data mapping/design
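To make the multi-source SQL reporting task concrete, here is a minimal sketch, assuming two hypothetical SQLite databases (sales.db, crm.db) and illustrative table names; the posting does not prescribe any particular stack.

```python
# A minimal sketch of querying two data sources and combining the results
# for a BI report. Database files, table names, and columns (orders,
# customers) are illustrative assumptions, not from the posting.
import sqlite3
import pandas as pd

sales_conn = sqlite3.connect("sales.db")   # hypothetical sales database
crm_conn = sqlite3.connect("crm.db")       # hypothetical CRM database

# Pull aggregates from each source with plain SQL.
orders = pd.read_sql_query(
    "SELECT customer_id, SUM(amount) AS total_spend "
    "FROM orders GROUP BY customer_id",
    sales_conn,
)
customers = pd.read_sql_query(
    "SELECT customer_id, region FROM customers",
    crm_conn,
)

# Join across sources and shape the result, e.g. as a Tableau input file.
report = orders.merge(customers, on="customer_id", how="left")
report.to_csv("spend_by_customer.csv", index=False)
```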
Key Performance Indicators (how performance will be measured: indicators, activities…)
KPIs will be outlined in detail in the goal sheet
Ideal Background (minimum and desirable education and experience level)
Education
Minimum: Graduation, preferably in Science
Experience requirement:
- Minimum: 2-3 years’ relevant work experience in the field of reporting and data analytics using Tableau
- Tableau certifications would be preferred
- Work experience in the regulated medical device/pharmaceutical industry would be an added advantage, but not mandatory
Languages:
Minimum: English (written and spoken)
Specific Professional Competencies: any other soft/technical/professional knowledge and skills requirements
SQL Lead
at Datametica Solutions Private Limited
We at Datametica Solutions Private Limited are looking for an SQL Lead / Architect who has a passion for the cloud and knowledge of on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description :
Experience: 6+ Years
Work Location: Pune / Hyderabad
Technical Skills :
- Good programming experience as an Oracle PL/SQL, MySQL, and SQL Server developer
- Knowledge of database performance tuning techniques
- Rich experience in database development
- Experience in designing and implementing business applications using the Oracle relational database management system
- Experience in developing complex database objects such as stored procedures, functions, packages, and triggers using SQL and PL/SQL (see the sketch after this list)
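As a concrete illustration of the stored-procedure work above, here is a minimal sketch using the python-oracledb driver; the credentials, DSN, and the `log_order` procedure itself are illustrative assumptions, and it presumes a reachable Oracle instance with an `order_audit` table.

```python
# A minimal sketch of creating and calling a PL/SQL stored procedure via
# python-oracledb. Connection details and object names are hypothetical.
import oracledb

conn = oracledb.connect(user="scott", password="tiger", dsn="localhost/XEPDB1")
cur = conn.cursor()

# Create a simple procedure that inserts an audit row.
cur.execute("""
    CREATE OR REPLACE PROCEDURE log_order (
        p_order_id IN NUMBER,
        p_status   IN VARCHAR2
    ) AS
    BEGIN
        INSERT INTO order_audit (order_id, status, logged_at)
        VALUES (p_order_id, p_status, SYSTIMESTAMP);
    END;
""")

cur.callproc("log_order", [1001, "SHIPPED"])  # invoke the procedure
conn.commit()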
Required Candidate Profile :
- Excellent communication, interpersonal, and analytical skills, and a strong ability to drive teams
- Analyze data requirements and data dictionaries for moderate to complex projects
- Lead data-model analysis discussions while collaborating with application development teams, business analysts, and data analysts during joint requirements analysis sessions
- Translate business requirements into technical specifications, with an emphasis on highly available and scalable global solutions
- Stakeholder management and client engagement skills
- Strong communication skills (written and verbal)
About Us!
A global leader in the Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.
We have expertise in transforming legacy platforms such as Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum, along with ETL tools like Informatica, DataStage, Ab Initio, and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes, and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
- Eagle: Data Warehouse Assessment & Migration Planning product
- Raven: Automated Workload Conversion product
- Pelican: Automated Data Validation product, which helps automate and accelerate data migration to the cloud
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in our success.
Benefits we Provide!
Working with Highly Technical and Passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
Who Are We?
Vahak (https://www.vahak.in) is India’s largest and most trusted online transport marketplace and directory for road transport businesses and individual commercial vehicle owners (trucks, trailers, containers, Hyva, LCVs), offering online truck and load booking, transport business branding, and transport business network expansion. Lorry owners can find intercity and intracity loads from all over India and connect with other businesses to find trusted transporters and the best deals in the Indian logistics services market. With the Vahak app, users can book loads and lorries from a live transport marketplace with 7 lakh+ transporters and lorry owners across 10,000+ locations for daily transport requirements.
Vahak has raised a capital of $5+ Million in a Pre-Series A round from RTP Global along with participation from Luxor Capital and Leo Capital. The other marquee angel investors include Kunal Shah, Founder and CEO, CRED; Jitendra Gupta, Founder and CEO, Jupiter; Vidit Aatrey and Sanjeev Barnwal, Co-founders, Meesho; Mohd Farid, Co-founder, Sharechat; Amrish Rau, CEO, Pine Labs; Harsimarbir Singh, Co-founder, Pristyn Care; Rohit and Kunal Bahl, Co-founders, Snapdeal; and Ravish Naresh, Co-founder and CEO, Khatabook.
Manager Data Science:
We at Vahak are looking for an enthusiastic and passionate Manager of Data Science to join our young and diverse team. You will play a key role in the data science group, working with different teams and identifying use cases that can be solved by applying data science techniques.
Our goal as a group is to drive powerful big data analytics products with scalable results. We love people who are humble and collaborative, with a hunger for excellence.
Responsibilities:
- Mine and analyze end-to-end business data and generate actionable insights; the work involves analyzing customer transaction data, marketing campaign performance, process bottlenecks, overall business performance, etc. (see the sketch after this list)
- Identify data-driven opportunities to optimize and improve product development, marketing techniques, and business strategies
- Collaborate with product and growth teams to test and learn at an unprecedented pace, and help the team achieve substantial upside in key metrics
- Actively participate in the OKR process and help the team democratize the key KPIs and metrics that drive various objectives
- Be comfortable with digital marketing campaign concepts and marketing campaign platforms such as Google Adwords and Facebook Ads
- Design algorithms that require different advanced analytics techniques and heuristics to work together
- Create dashboards and visualizations from scratch and present data logically to all stakeholders
- Collaborate with internal teams to create actionable items based on analysis; work with datasets to conduct complex quantitative analysis and help drive innovation for our customers
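As an illustration of the campaign performance analysis mentioned in the first responsibility, here is a minimal sketch in pandas; the input file and column names are illustrative assumptions.

```python
# A minimal sketch: compute CTR and conversion rate per campaign from raw
# event counts. "campaign_events.csv" and its columns are hypothetical.
import pandas as pd

events = pd.read_csv("campaign_events.csv")  # columns: campaign, impressions, clicks, signups

perf = events.groupby("campaign").sum(numeric_only=True)
perf["ctr"] = perf["clicks"] / perf["impressions"]
perf["conversion_rate"] = perf["signups"] / perf["clicks"]

# Rank campaigns by conversion rate for the growth team's review.
print(perf.sort_values("conversion_rate", ascending=False))
```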
Requirements:
- Bachelor’s or Master’s degree in Engineering, Science, Maths, Economics, or another quantitative field; an MBA is a plus but not required
- 5+ years of proven experience in the data science field, preferably in e-commerce, web-based, or consumer technology companies
- Thorough understanding of the implementation and analysis of product and marketing metrics at scale
- Strong problem-solving skills with an emphasis on product development
- Fluency in statistical computing languages such as SQL, Python, and R, as well as a deep understanding of statistical analysis, experiment design, and the common pitfalls of data analysis (see the sketch after this list)
- Experience with a relational database such as Oracle or MySQL; experience with Big Data systems such as BigQuery or Redshift is a definite plus
- Experience with business intelligence tools, e.g. Tableau or Power BI, would be an added advantage (not mandatory)
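To ground the statistical-analysis and experiment-design requirement, here is a minimal sketch of an A/B test readout using a two-sample proportions z-test; the conversion counts are made-up illustrative numbers.

```python
# A minimal sketch of experiment analysis: test whether a variant's
# conversion rate differs from control. Counts below are hypothetical.
from statsmodels.stats.proportion import proportions_ztest

conversions = [320, 365]   # conversions in control and variant arms
exposures = [5000, 5000]   # users exposed to each arm

stat, p_value = proportions_ztest(count=conversions, nobs=exposures)
print(f"z = {stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Variant lift is statistically significant at the 5% level.")
else:
    print("No significant difference detected; keep collecting data.")
```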
Senior Manager - Data Intelligence
at Data Intelligence Platform For Decision-Makers
Supporting today’s data-driven business world, our client acts as a full-stack data intelligence platform that leverages granular, deep data from various sources, helping decision-makers at the executive level. Their solutions include supply chain optimization, building footprints, construction hotspot tracking, real estate, and lots more.
Their work embeds geospatial analytics, location intelligence, and predictive modelling in the foundations of economic modelling and evaluation theory to build data intelligence layers for their clients, which include governments, multilateral institutions, and private organizations.
Headquartered in New Delhi, our client works as a team of economists, data scientists, geospatial analysts, and others. Their decision-support system includes Big Data, predictive modelling, forecasting, socio-economic datasets, and many more.
As a Senior Manager - Data Intelligence, you will be responsible for contributing to all stages of projects: conceptualizing, preparing work plans, overseeing analytical work, driving teams to meet targets, and ensuring quality standards.
What you will do:
- Thoroughly understanding the data processing pipeline and troubleshooting/problem-solving with the technical team
- Acting as the SPOC for client communications across the portfolio of projects undertaken by the organization
- Contributing to different aspects of organizational growth: team building, process building, strategy, and business development
Desired Candidate Profile
What you need to have:
- Post-graduate degree in a relevant subject such as Economics, Engineering, Quantitative Social Sciences, Management, etc.
- At least 5 years of relevant work experience
- Vast experience in managing multiple client projects
- Strong data/quantitative analysis skills are a must
- Prior experience working in data analytics teams
- Demonstrated prior experience with data platforms and programming languages
2-4 years of experience developing ETL solutions for Azure: big data, relational databases, and data warehouse solutions.
Extensive hands-on experience implementing data migration and data processing using Azure services: ADLS, Azure Data Factory, Azure Functions, Synapse/DW, Azure SQL DB, Azure Analysis Services, Azure Databricks, Azure Data Catalog, ML Studio, AI/ML, Snowflake, etc. (a sketch of one such step follows this list)
Well versed in DevOps and CI/CD deployments
Experience with cloud migration methodologies and processes, including tools such as Azure Data Factory, Data Migration Service, SSIS, etc.
Minimum of 2 years of RDBMS experience
Experience with private and public cloud architectures, their pros and cons, and migration considerations.
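As one small, concrete step of the Azure data-processing work above, here is a minimal sketch that lands a local extract in ADLS Gen2; the account name, container, and path are illustrative assumptions, and DefaultAzureCredential is assumed to pick up local Azure authentication.

```python
# A minimal sketch of uploading a file to ADLS Gen2 with the Azure SDK.
# Account, container, and file paths below are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://examplelake.dfs.core.windows.net",  # hypothetical account
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("raw")            # hypothetical container
file_client = fs.get_file_client("sales/2024/orders.csv")

with open("orders.csv", "rb") as src:                 # local extract to land
    file_client.upload_data(src, overwrite=True)
```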
Nice-to-Have Skills/Qualifications:
- DevOps on an Azure platform
- Experience developing and deploying ETL solutions on Azure
- IoT, event-driven, microservices, Containers/Kubernetes in the cloud
- Familiarity with the technology stack available in the industry for metadata management: Data Governance, Data Quality, MDM, Lineage, Data Catalog etc.
- Multi-cloud experience a plus - Azure, AWS, Google
Professional Skill Requirements
Proven ability to build, manage and foster a team-oriented environment
Proven ability to work creatively and analytically in a problem-solving environment
Desire to work in an information systems environment
Excellent communication (written and oral) and interpersonal skills
Excellent leadership and management skills
Excellent organizational, multi-tasking, and time-management skills
Responsibilities:
- Apply scripting/programming skills to assemble various types of source data (unstructured, semi-structured, and structured) into well-prepared datasets with multiple levels of granularity (e.g., demographics, customers, products, transactions); see the sketch after this list
- Lead the development of standard and customized reporting, dashboards, and analysis of information
- Lead the development of tools, methodologies, and statistical models
- Provide hands-on development and support in creating and launching various tools and reporting
- Develop analytical solutions and make recommendations based on an understanding of the business strategy and stakeholder needs
- Work with various data owners to discover and select available data from internal sources to fulfill analytical needs
- Summarize statistical findings, draw conclusions, and present actionable business recommendations in a simple, clear way that drives action
- Use the appropriate algorithms to discover patterns in the data
- Work independently on a range of complex tasks, which may include unique assignments
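To illustrate the dataset-assembly responsibility above, here is a minimal sketch that rolls raw transactions up to two granularities; the input file and column names are illustrative assumptions.

```python
# A minimal sketch: aggregate a hypothetical transactions file to
# customer-level and product-level granularities.
import pandas as pd

tx = pd.read_csv("transactions.csv")  # columns: customer_id, product_id, amount, ts

# Customer-level granularity: spend and order counts per customer.
by_customer = tx.groupby("customer_id").agg(
    total_spend=("amount", "sum"),
    orders=("amount", "count"),
).reset_index()

# Product-level granularity: revenue per product.
by_product = tx.groupby("product_id").agg(revenue=("amount", "sum")).reset_index()

by_customer.to_parquet("customer_features.parquet")
by_product.to_parquet("product_features.parquet")
```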
Qualifications, Skills & Competencies:
- Post-secondary degree in Computer Science, Information Technology, or another relevant field with curriculum related to data structures and analysis
- Minimum 5 years of experience as an analyst
- Minimum 5 years of experience with business intelligence tools and programming languages
- Advanced skills in data analysis and profiling, data mapping, data modeling, data lakes, and analytics
- Data analytics: AWS QuickSight and Redshift
- Data migration: solid SQL and ETL skills
- Scripting and integration: REST APIs, GraphQL, Node.js, AWS Lambda/API Gateway (a minimal example follows this list)
- Experience working with data mining and performing quantitative analysis
- Experience with Machine Learning algorithms and associated data sets
- Business acumen; results-oriented
- Proactive/takes initiative/self-starter
- Excellent written and oral communication skills
- Ability to create, coordinate and facilitate presentations
- Time management, highly organized
- Collaboration and Team Engagement
- Analytical and Problem Solving
- Data-driven/Metrics Driven
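As a concrete example of the REST/Lambda integration skills listed above, here is a minimal sketch of an AWS Lambda handler behind an API Gateway proxy integration; the payload shape is an illustrative assumption.

```python
# A minimal sketch of a REST endpoint as an AWS Lambda handler.
import json

def lambda_handler(event, context):
    # API Gateway (proxy integration) passes the query string here.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```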
Minimum of 4 years’ experience working on DW/ETL projects and expert hands-on working knowledge of ETL tools.
Experience with Data Management & data warehouse development
Star schemas, Data Vaults, RDBMS, and ODS
Change Data capture
Slowly changing dimensions (see the sketch after this list)
Data governance
Data quality
Partitioning and tuning
Data Stewardship
Survivorship
Fuzzy Matching
Concurrency
Vertical and horizontal scaling
ELT, ETL
Spark, Hadoop, MPP, RDBMS
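To make the slowly-changing-dimension item above concrete, here is a minimal sketch of a Type 2 SCD update, run here against an in-memory SQLite database; the table and column names are illustrative assumptions.

```python
# A minimal sketch of a Type 2 SCD: expire the current row, insert the
# new version, and keep full history. All names below are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,
        is_current  INTEGER
    );
    INSERT INTO dim_customer VALUES (1, 'Pune', '2023-01-01', '9999-12-31', 1);
""")

# A change arrives from CDC: customer 1 moved to Hyderabad on 2024-06-01.
conn.execute("""
    UPDATE dim_customer
    SET valid_to = '2024-05-31', is_current = 0
    WHERE customer_id = 1 AND is_current = 1
""")
conn.execute("""
    INSERT INTO dim_customer VALUES (1, 'Hyderabad', '2024-06-01', '9999-12-31', 1)
""")
conn.commit()

for row in conn.execute("SELECT * FROM dim_customer ORDER BY valid_from"):
    print(row)  # both the historical and current versions are preserved
```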
Experience with DevOps architecture, implementation, and operation
Hands-on working knowledge of Unix/Linux
Building complex SQL queries; expert SQL and data analysis skills, with the ability to debug and fix data issues
Complex ETL program design and coding
Experience in shell scripting and batch scripting.
Good communication (oral & written) and inter-personal skills
Expert SQL and data analysis skills, with the ability to debug and fix data issues. Work closely with business teams to understand their business needs, participate in requirements gathering, create artifacts, and seek business approval.
Help the business define new requirements; participate in end-user meetings to derive and define business requirements; propose cost-effective solutions for data analytics; and familiarize the team with customer needs, specifications, design targets, and techniques to support task performance and delivery.
Propose sound designs and solutions, and ensure adherence to design and standards best practices.
Review and propose industry-best tools and technologies for ever-changing business rules and datasets. Conduct proofs of concept (POCs) with new tools and technologies to derive convincing benchmarks.
Prepare the plan; design and document the architecture, high-level topology design, and functional design; review these with customer IT managers; and provide detailed knowledge to the development team to familiarize them with customer requirements, specifications, design standards, and techniques.
Review code developed by other programmers; mentor, guide, and monitor their work, ensuring adherence to programming and documentation policies.
Work with functional business analysts to ensure that application programs function as defined.
Capture user feedback/comments on the delivered systems and document them for the client’s and project manager’s review. Review all deliverables before final delivery to the client for quality adherence.
Technologies (Select based on requirement)
Databases - Oracle, Teradata, Postgres, SQL Server, Big Data, Snowflake, or Redshift
Tools – Talend, Informatica, SSIS, Matillion, Glue, or Azure Data Factory
Utilities for bulk loading and extracting
Languages – SQL, PL-SQL, T-SQL, Python, Java, or Scala
JDBC/ODBC, JSON (see the sketch after this list)
Data virtualization and data services development
Service Delivery - REST, Web Services
Data Virtualization Delivery – Denodo
ELT, ETL
Cloud certification Azure
Complex SQL Queries
Data ingestion, data modeling (domain), consumption (RDBMS)
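To illustrate the ODBC-plus-JSON combination in the list above, here is a minimal sketch that queries a database over ODBC and emits rows as a JSON payload for a data service; the DSN, table, and columns are illustrative assumptions.

```python
# A minimal sketch: ODBC query -> JSON payload. All names are hypothetical.
import json
import pyodbc

conn = pyodbc.connect("DSN=warehouse")        # hypothetical ODBC data source
cursor = conn.cursor()
cursor.execute("SELECT order_id, status FROM orders WHERE status = ?", "OPEN")

rows = [{"order_id": r.order_id, "status": r.status} for r in cursor.fetchall()]
print(json.dumps(rows, indent=2))             # JSON body for a REST data service
```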
Work at the client location as a Tableau Squad BA / Tech Lead, gathering visualization requirements and building business insights and dashboards.
Roles And Responsibilities
- Work with Business stakeholders to gather Dashboarding/reporting requirements
- Document user stories
- High-level and detailed design for the Tableau application
- Suggest visualization options and collaborate on prototyping of Dashboards
- Actively collaborate with customers and colleagues to ensure delivery excellence
- Display a growth mindset by proactively seeking feedback
- Act as technical lead for developers, guiding them through complex set analysis expressions and dashboard development
Skills /Competencies:
- Experience in reporting requirement analysis
- Strong hands-on experience in Tableau design and development
- Experience in set analysis, storytelling, data load scripting and security setup of Tableau reports
- Experience in working with large-scale RDBMS (Oracle/Teradata preferred)
- Working knowledge of QMC
- Good SQL knowledge
- Ability to present analysis and technical information to a non-technical audience
- Ability to work independently and collaboratively, as part of a team
- Excellent communication skills
- Ability to create documented standards and procedures for others to follow
- Insurance P&C and Specialty domain experience a plus
- Experience in a cloud-based architecture preferred, such as Databricks, Azure Data Lake, Azure Data Factory, etc.
- Strong understanding of ETL fundamentals and solutions; proficiency in writing advanced/complex SQL, with expertise in performance tuning and optimization of SQL queries, is required
- Strong experience in Python/PySpark and Spark SQL
- Experience in troubleshooting data issues, analyzing end-to-end data pipelines, and working with various teams to resolve issues and solve complex problems
- Strong experience developing Spark applications using PySpark and SQL for data extraction, transformation, and aggregation across multiple formats, to uncover insights and actionable intelligence for internal and external use (as sketched below)
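As an illustration of the Spark requirement above, here is a minimal PySpark sketch that extracts from two formats, joins, and aggregates; the file paths and column names are illustrative assumptions.

```python
# A minimal sketch of a PySpark extract-transform-aggregate job.
# Paths, schemas, and columns below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

orders = spark.read.csv("s3://bucket/orders.csv", header=True, inferSchema=True)
customers = spark.read.json("s3://bucket/customers.json")

# Join the two sources, then aggregate revenue by region for reporting.
summary = (
    orders.join(customers, on="customer_id", how="left")
          .groupBy("region")
          .agg(F.sum("amount").alias("revenue"),
               F.countDistinct("customer_id").alias("customers"))
)
summary.write.mode("overwrite").parquet("s3://bucket/revenue_by_region/")
```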
Data Engineer
at Rorko India Private Limited