Job Title: Power BI Developer (Onsite)
Location: Park Centra, Sec 30, Gurgaon
CTC: 8 LPA
Time: 1:00 PM - 10:00 PM
Must Have Skills:
- Power BI Desktop
- DAX queries
- Data modeling
- Row-level security
- Visualizations
- Data transformations and filtering
- SSAS and SQL
Job description:
We are looking for a Power BI Analytics Lead responsible for efficient data visualization, DAX queries, and data modeling. The candidate will create complex Power BI reports, write complex M and DAX queries, and work on data modeling, row-level security, visualizations, data transformations, and filtering. The candidate will work closely with the client team to provide Power BI solutions and suggestions.
Roles and Responsibilities:
- Accurate, intuitive, and aesthetic Visual Display of Quantitative Information: We generate data, information, and insights through our business, product, brand, research, and talent teams. You would assist in transforming this data into visualizations that represent easy-to-consume visual summaries, dashboards and storyboards. Every graph tells a story.
- Understanding Data: You would perform and document data analysis, data validation, and data mapping/design. You would mine large datasets to determine their characteristics and select appropriate visualizations.
- Project Owner: You would develop, maintain, and manage advanced reporting, analytics, dashboards and other BI solutions, and would be continuously reviewing and improving existing systems and collaborating with teams to integrate new systems. You would also contribute to the overall data analytics strategy by knowledge sharing and mentoring end users.
- Perform ongoing maintenance & production of Management dashboards, data flows, and automated reporting.
- Manage upstream and downstream impact of all changes on automated reporting/dashboards
- Independently apply problem-solving ability to identify meaningful insights to business
- Identify automation opportunities and work with a wide range of stakeholders to implement the same.
- Work independently and with self-confidence to increase the scope of the service line
Requirements:
- 3+ years of work experience as an Analytics Lead / Senior Analyst / Sr. PBI Developer.
- Sound understanding and knowledge of PBI Visualization and Data Modeling with DAX queries
- Experience in leading and mentoring a small team.
About Saviance Technologies
We are a fast-growing digital, cloud, and mobility services provider whose principal market is North America. We are looking for talented database/SQL experts for the management and analytics of large data in various enterprise projects.
Responsibilities
Translate business needs to technical specifications
Manage and maintain various database servers (backup, replicas, shards, jobs)
Develop and execute database queries and conduct analyses
Occasionally write scripts for ETL jobs.
Create tools to store data (e.g. OLAP cubes)
Develop and update technical documentation
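The occasional ETL scripting mentioned above could look something like the following minimal sketch; it uses only Python's standard library, and the table, columns, and data are invented for illustration, not taken from the posting:

```python
import csv
import io
import sqlite3

# Hypothetical ETL job: extract rows from a CSV export, transform types,
# and load them into a reporting table. All names are illustrative only.

RAW_CSV = """order_id,amount,country
1001,49.90,US
1002,15.00,DE
1003,99.99,US
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw CSV into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: convert string fields to proper types, normalise country codes."""
    return [(int(r["order_id"]), float(r["amount"]), r["country"].upper())
            for r in rows]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: insert the cleaned rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders "
                 "(order_id INTEGER PRIMARY KEY, amount REAL, country TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total_us = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE country = 'US'").fetchone()[0]
```

A real job would read from files or APIs and load into a production database, but the extract/transform/load split stays the same.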
Requirements
Proven experience as a database programmer and administrator
Background in data warehouse design (e.g. dimensional modeling) and data mining
Good understanding of SQL and NoSQL databases, online analytical processing (OLAP), and ETL (extract, transform, load) frameworks
Advanced knowledge of SQL queries, SQL Server Reporting Services (SSRS), and SQL Server Integration Services (SSIS)
Familiarity with BI technologies (strong Tableau hands-on experience) is a plus
Analytical mind with a problem-solving aptitude
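As a rough illustration of the dimensional modeling named in the requirements above, here is a hypothetical star-schema sketch in Python/SQLite: a fact table keyed to a date dimension, then an OLAP-style rollup. The schema and figures are invented for illustration:

```python
import sqlite3

# Hypothetical star schema: a sales fact table joined to a date dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240115
    year INTEGER, month INTEGER
);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product TEXT,
    revenue REAL
);
""")
conn.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                 [(20240115, 2024, 1), (20240220, 2024, 2)])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(20240115, "widget", 120.0),
                  (20240115, "gadget", 80.0),
                  (20240220, "widget", 200.0)])

# A typical OLAP-style rollup: revenue by month via the dimension table.
rows = conn.execute("""
    SELECT d.year, d.month, SUM(f.revenue)
    FROM fact_sales f JOIN dim_date d USING (date_key)
    GROUP BY d.year, d.month ORDER BY d.year, d.month
""").fetchall()
```

Keeping descriptive attributes in the dimension and additive measures in the fact table is what makes rollups like this cheap to express and fast to run.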
What is the role?
You will be responsible for building and maintaining highly scalable data infrastructure for our cloud-hosted SaaS product. You will work closely with the Product Managers and technical team to define and implement data pipelines for customer-facing and internal reports.
Key Responsibilities
- Design and develop resilient data pipelines.
- Write efficient queries to fetch data from the report database.
- Work closely with application backend engineers on data requirements for their stories.
- Designing and developing report APIs for the front end to consume.
- Focus on building highly available, fault-tolerant report systems.
- Constantly improve the architecture of the application by clearing the technical backlog.
- Adopt a culture of learning and development to constantly keep pace with and adopt new technologies.
What are we looking for?
An enthusiastic individual with the following skills. Please do not hesitate to apply if you do not match all of them; we are open to promising candidates who are passionate about their work and are team players.
- Education - BE/MCA or equivalent
- Overall 8+ years of experience
- Expert level understanding of database concepts and BI.
- Well versed in databases such as MySQL and MongoDB, with hands-on experience in creating data models.
- Must have designed and implemented low latency data warehouse systems.
- Must have strong understanding of Kafka and related systems.
- Experience with the ClickHouse database is preferred.
- Must have good knowledge of APIs and should be able to build interfaces for frontend engineers.
- Should be innovative and communicative in approach
- Will be responsible for functional/technical track of a project
Whom will you work with?
You will work with a top-notch tech team, working closely with the CTO and product team.
What can you look for?
A wholesome opportunity in a fast-paced environment that will let you juggle multiple concepts while maintaining quality, interact and share your ideas, and learn a great deal at work. Work with a team of highly talented young professionals and enjoy the benefits of being at Xoxoday.
We are
Xoxoday is a rapidly growing fintech SaaS firm that propels business growth while focusing on human motivation. Backed by Giift and Apis Partners Growth Fund II, Xoxoday offers a suite of three products - Plum, Empuls, and Compass. Xoxoday works with more than 2000 clients across 10+ countries and over 2.5 million users. Headquartered in Bengaluru, Xoxoday is a 300+ strong team with four global offices in San Francisco, Dublin, Singapore, and New Delhi.
Way forward
We look forward to connecting with you. Since you may take time to review this opportunity, we will wait around 3-5 days before screening the collected applications and lining up job discussions with the hiring manager. We will aim to close this requirement within a reasonable window, and candidates will be kept informed and updated on their feedback and application status.
- Understand the business drivers and analytical use-cases.
- Translate use cases into data models and descriptive, analytical, predictive, and engineering outcomes.
- Explore new technologies and learn new techniques to solve business problems creatively
- Think big and drive the strategy for better data quality for customers.
- Be the voice of the business within engineering, and the voice of engineering with the business and customers.
- Collaborate with many teams - engineering and business, to build better data products and services
- Deliver the projects along with the team collaboratively and manage updates to customers on time
What we're looking for:
- Hands-on experience in data modeling, data visualization, and pipeline design and development
- Hands-on exposure to machine learning concepts such as supervised learning, unsupervised learning, RNNs, and DNNs.
- Prior experience working with business stakeholders, in an enterprise space is a plus
- Great communication skills. You should be able to directly communicate with senior business leaders, embed yourself with business teams, and present solutions to business stakeholders
- Experience working independently and driving projects end to end; strong analytical skills.
Location: Bengaluru
Department: Engineering
Bidgely is looking for an extraordinary and dynamic Senior Data Analyst to be part of its core team in Bangalore. You must have delivered exceptionally high-quality, robust products dealing with large data. Be part of a highly energetic and innovative team that believes nothing is impossible with some creativity and hard work.
● Design and implement a high-volume data analytics pipeline in Looker for Bidgely's flagship product.
● Implement data pipeline in Bidgely Data Lake
● Collaborate with product management and engineering teams to elicit and understand their requirements and challenges, and develop potential solutions
● Stay current with the latest tools, technology ideas and methodologies; share knowledge by clearly articulating results and ideas to key decision makers.
● 3-5 years of strong experience in data analytics and in developing data pipelines.
● Very good expertise in Looker
● Strong in data modeling, developing SQL queries and optimizing queries.
● Good knowledge of data warehouse (Amazon Redshift, BigQuery, Snowflake, Hive).
● Good understanding of Big data applications (Hadoop, Spark, Hive, Airflow, S3, Cloudera)
● Attention to detail. Strong communication and collaboration skills.
● BS/MS in Computer Science or equivalent from premier institutes.
● Working on an awesome AI product for the eCommerce domain.
● Build the next-generation information extraction and computer vision product powered by state-of-the-art AI and deep learning techniques.
● Work with an international top-notch engineering team with full commitment to machine learning development.
Desired Candidate Profile
● Passionate about search & AI technologies. Open to collaborating with colleagues & external contributors.
● Good understanding of mainstream deep learning models from multiple domains: computer vision, NLP, reinforcement learning, model optimization, etc.
● Hands-on experience with deep learning frameworks, e.g. TensorFlow, PyTorch, MXNet, BERT. Able to implement the latest DL models using existing APIs and open-source libraries in a short time.
● Hands-on experience with cloud-native techniques. Good understanding of web services and modern software technologies.
● Maintained/contributed to machine learning projects; familiar with the agile software development process, CI/CD workflow, ticket management, code review, version control, etc.
● Skilled in the following programming languages: Python 3.
● Good English skills, especially for writing and reading documentation.
Must Have Skills:
- Should have good hands-on experience in Informatica MDM Customer 360, Data Integration (ETL) using PowerCenter, and Data Quality.
- Must have strong skills in Data Analysis, Data Mapping for ETL processes, and Data Modeling.
- Experience with the SIF framework including real-time integration
- Should have experience in building C360 Insights using Informatica
- Should have good experience in creating performant designs using Mapplets, Mappings, and Workflows for Data Quality (cleansing) and ETL.
- Should have experience in building different data warehouse architectures, such as Enterprise, Federated, and Multi-Tier architecture.
- Should have experience in configuring Informatica Data Director for the data governance of users, IT Managers, and Data Stewards.
- Should have good knowledge in developing complex PL/SQL queries.
- Should have working experience on UNIX and shell scripting to run the Informatica workflows and to control the ETL flow.
- Should know about Informatica server installation and have knowledge of the Administration console.
- Working experience with both the Developer and Administration tools is added knowledge.
- Working experience with Amazon Web Services (AWS) is an added advantage, particularly AWS S3, Data Pipeline, Lambda, Kinesis, DynamoDB, and EMR.
- Should be responsible for the creation of automated BI solutions, including requirements, design, development, testing, and deployment
- We are looking for a Data Engineer with 3-5 years of experience in Python, SQL, AWS (EC2, S3, Elastic Beanstalk, API Gateway), and Java.
- The applicant must be able to perform Data Mapping (data type conversion, schema harmonization) using Python, SQL, and Java.
- The applicant must be familiar with and have programmed ETL interfaces (OAuth, REST APIs, ODBC) using the same languages.
- The company is looking for someone who shows an eagerness to learn and who asks concise questions when communicating with teammates.
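The data mapping requirement above (type conversion, schema harmonization) might be sketched as follows in Python; the source fields, target fields, and sample record are all hypothetical, invented purely to illustrate the idea:

```python
from datetime import date

# Hypothetical schema harmonization: map a source record with string-typed
# fields onto a target schema with renamed fields and proper types.
FIELD_MAP = {              # source field -> target field (illustrative names)
    "cust_id": "customer_id",
    "signup_dt": "signup_date",
    "ltv": "lifetime_value",
}

def harmonize(record: dict) -> dict:
    """Rename fields per FIELD_MAP and convert string values to target types."""
    out = {FIELD_MAP[k]: v for k, v in record.items() if k in FIELD_MAP}
    out["customer_id"] = int(out["customer_id"])
    out["signup_date"] = date.fromisoformat(out["signup_date"])
    out["lifetime_value"] = float(out["lifetime_value"])
    return out

clean = harmonize({"cust_id": "42", "signup_dt": "2023-06-01", "ltv": "199.5"})
```

In practice the mapping and type rules would be driven by a schema registry or mapping document rather than hard-coded, but the rename-then-convert shape is the same.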
Job Responsibilities
- Drive growth strategies for our online eCommerce shop across all channels (own store, Amazon, eBay, etc.).
- Optimize customer journey with the help of product owner and project manager
- Optimize conversion funnel across all channels and sources. Create detailed maps of customer journey and analyze the funnels to increase conversions.
- Be responsible for overall online marketing reporting and channel attribution, analyze marketing effectiveness (CPL, CAC, CLV/CAC, ROI, CLTV), and assist in creating presentations for key stakeholders, leadership, and investors.
- Develop and implement acquisition and customer retention strategies (think customer first) in collaboration with Creative, CRM, Social Media, SEO, etc.
- Partner closely with all teams to create a customer-first approach while communicating the brand message clearly.
- Minimum 3 years of experience in a high-growth eCommerce company or agency
- A proven marketer who understands and has worked in a high-functioning, high-revenue eCommerce brand.
- Deep understanding of A/B testing, experimentation, and eCommerce analysis
- An expert in analyzing data and creating reports
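The acquisition metrics named in this posting (CAC, CLV, CLV/CAC) come down to simple ratios; here is a hedged sketch with entirely made-up figures to show how they relate:

```python
# Illustrative marketing-metric calculations; every figure below is invented.
marketing_spend = 50_000.0      # total acquisition spend in a period
new_customers = 400             # customers acquired in that period
avg_order_value = 75.0
orders_per_year = 4
retention_years = 3
gross_margin = 0.6

# Customer Acquisition Cost: spend divided by customers acquired.
cac = marketing_spend / new_customers

# Customer Lifetime Value: margin on expected lifetime revenue.
clv = avg_order_value * orders_per_year * retention_years * gross_margin

# CLV/CAC ratio: a common health check for acquisition efficiency.
clv_cac_ratio = clv / cac
```

Real models refine CLV with discounting and cohort-level retention curves, but the ratio logic is the same.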
Responsibilities
- Installing and configuring Informatica components, including high availability; managing server activations and de-activations for all environments; ensuring that all systems and procedures adhere to organizational best practices
- Day-to-day administration of the Informatica suite of services (PowerCenter, IDS, Metadata, Glossary, and Analyst).
- Informatica capacity planning and ongoing monitoring (e.g. CPU, memory) to proactively increase capacity as needed.
- Manage backup and security of Data Integration Infrastructure.
- Design, develop, and maintain all data warehouse, data marts, and ETL functions for the organization as a part of an infrastructure team.
- Consult with users, management, vendors, and technicians to assess computing needs and system requirements.
- Develop and interpret organizational goals, policies, and procedures.
- Evaluate the organization's technology use and needs and recommend improvements, such as software upgrades.
- Prepare and review operational reports or project progress reports.
- Assist in the daily operations of the Architecture Team, analyzing workflow, establishing priorities, developing standards, and setting deadlines.
- Work with vendors to manage support SLAs and influence vendor product roadmaps
- Provide leadership and guidance in technical meetings, define standards, and provide status updates
- Work with cross-functional operations teams such as systems, storage, and network to design technology stacks.
Preferred Qualifications
- 6+ years' experience in Informatica Engineer and Developer roles
- 5+ years' experience in an ETL environment as a developer
- 5+ years of experience in SQL coding and understanding of databases
- Proficiency in Python
- Proficiency in command line troubleshooting
- Proficiency in writing code in Perl/Shell scripting languages
- Understanding of Java and concepts of Object-oriented programming
- Good understanding of systems, networking, and storage
- Strong knowledge of scalability and high availability