
Business Intelligence Engineer (Looker Certified LookML Developer)

Posted by Abdul Shaik
Remote, Pune, Bengaluru (Bangalore), Mumbai, NCR (Delhi | Gurgaon | Noida), Hyderabad, Rajkot
2 - 12 yrs
₹4L - ₹20L / yr
Skills
Looker
Business Intelligence (BI)
SQL
Data engineering
Google Cloud Platform (GCP)
Job description

Must be a Looker Certified LookML Developer (this is a strict requirement)

As a Looker Specialist at Searce, you will:

  • Work with the business team to identify the best technical solution for a given problem or scenario in the Looker Platform
  • Design and develop LookML models, explores, dashboards, workflows in the Looker platform
  • Utilize established processes and best practices in designing solutions or blaze new trails and drive organization-wide data quality methodologies
  • Help create and maintain coding standards (style guides, etc.) and perform code reviews and technical analysis
  • Write SQL, build derived tables, and work with data engineering to ensure queries are performant and maintainable
  • Ensure project:model:view structure is optimized and re-architect as needed for growing business use cases
  • Build and maintain monitoring dashboards for the Looker platform
  • Optimize performance through re-architecting as needed
  • Partner with data engineering on larger projects to optimize the data warehouse layer for Looker performance
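The SQL responsibilities above center on derived tables: precomputing an aggregate once so that every dashboard query can join against it instead of re-running the aggregation. A minimal sketch of that idea in plain SQL, run here through Python's built-in sqlite3 (the table and column names are hypothetical, and real LookML derived tables are declared in model files rather than as database views):

```python
import sqlite3

# Illustrative only: a "derived table" is an aggregated subquery that
# dashboards can treat as a reusable view, rather than repeating the
# GROUP BY inside every dashboard query.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 101, 50.0), (2, 101, 70.0), (3, 102, 20.0);
""")

derived = """
    SELECT customer_id, COUNT(*) AS order_count, SUM(amount) AS lifetime_value
    FROM orders
    GROUP BY customer_id
"""
conn.execute(f"CREATE VIEW customer_facts AS {derived}")

# Dashboards now query the precomputed shape directly.
rows = conn.execute(
    "SELECT customer_id, order_count, lifetime_value "
    "FROM customer_facts ORDER BY customer_id"
).fetchall()
print(rows)  # [(101, 2, 120.0), (102, 1, 20.0)]
```

In Looker the same pattern would live in a LookML `derived_table` block, with the data warehouse (not SQLite) doing the work; the performance concern in the bullets is making sure that SQL stays cheap as usage grows.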

Sounds like you? What we are looking for:

  • Looker certification is a must-have (this is a strict requirement)
  • Minimum 1 - 2 years of experience using Looker and developing LookML
  • Experience building dashboards and developing LookML from scratch
  • Excellent SQL skills and advanced proficiency in LookML
  • Previous experience building scalable BI platforms, including the use of code review processes, automated testing, staging and production environments, and release schedules is preferred
  • Experience with cloud database platforms preferred
  • Working knowledge of scripting languages to allow scheduling of BI dashboards
  • Knowledge of software development processes and best practices
What You Can Expect From Us

You’ll join an entrepreneurial, inclusive culture. One where we succeed together – across the desk and around the globe. Where like-minded people work naturally together to achieve great things.

Our Total Rewards program reflects our commitment to helping you achieve your ambitions in career, recognition, well-being, benefits and pay.

Join us to develop your strengths and enjoy a fulfilling career full of varied experiences. Keep those ambitions in sight and imagine where Searce can take you.

Apply today! 
About Searce Inc

Searce is a cloud, automation & analytics led process improvement company helping futurify businesses. Searce is a premier Google Cloud partner for all products and services, and the largest cloud systems integrator for enterprises, with the largest number of enterprise Google Cloud clients in India.

 

Searce specializes in helping businesses move to the cloud, build on the next-generation cloud, and adopt SaaS - helping reimagine the ‘why’ & redefining ‘what’s next’ for workflows, automation, machine learning & related futuristic use cases. Searce was recognized by Google as one of its top partners for 2015 and 2016.

 

Searce's organizational culture encourages making mistakes and questioning the status quo, which allows us to specialize in simplifying complex business processes and to take a technology-agnostic approach to create, improve and deliver.

 

Founded
2004
Type
Products & Services
Size
100-1000 employees
Stage
Profitable
Similar jobs
Founded 2019  •  Products & Services  •  100-1000 employees  •  Profitable
Data Science
Machine Learning (ML)
Artificial Intelligence (AI)
Python
Tableau
PowerBI
R Language
SQL
Remote only
5 - 8 yrs
₹6L - ₹10L / yr
  • 5 to 8 years of overall experience, including 3 to 4 years in Data Science / Machine Learning / Artificial Intelligence
  • Good knowledge of Machine Learning, statistics, optimization, or a related field
  • Experience building Artificial Intelligence and Machine Learning solutions
  • 2+ years of experience with Python/R and SQL, required for the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive business decisions
  • Knowledge of visualization tools such as Tableau and Power BI
  • Knowledge, experience and ability to convert functional requirements into data problems and data requirements into functional queries.
  • Good communication skills to present business cases, modelling outcomes and project deliverables to stakeholders
  • Key Skills: R, Python, SQL
  • Knowledge or experience on Tableau and Power BI is added advantage.
Job posted by Sowmya K
at Curl Analytics
Agency job
Python
Data engineering
ETL
Pipeline management
Spark
Apache Hive
PySpark
Docker
Kubernetes
MongoDB
Apache Kafka
SQL server
Oracle
Machine Learning (ML)
BigQuery
Bengaluru (Bangalore)
5 - 10 yrs
₹15L - ₹30L / yr
What you will do
  • Bring in industry best practices around creating and maintaining robust data pipelines for complex data projects with/without AI component
    • programmatically ingesting data from several static and real-time sources (incl. web scraping)
    • rendering results through dynamic interfaces incl. web / mobile / dashboard with the ability to log usage and granular user feedbacks
    • performance tuning and optimal implementation of complex Python scripts (using Spark), SQL (using stored procedures, Hive), and NoSQL queries in a production environment
  • Industrialize ML / DL solutions and deploy and manage production services; proactively handle data issues arising on live apps
  • Perform ETL on large and complex datasets for AI applications - work closely with data scientists on performance optimization of large-scale ML/DL model training
  • Build data tools to facilitate fast data cleaning and statistical analysis
  • Ensure data architecture is secure and compliant
  • Resolve issues escalated from Business and Functional areas on data quality, accuracy, and availability
  • Work closely with APAC CDO and coordinate with a fully decentralized team across different locations in APAC and global HQ (Paris).
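The ETL bullets above (ingest, clean, aggregate) can be sketched in miniature. This stdlib-only Python sketch is illustrative of the shape of the work, not of the team's actual Spark/Hive stack; the sensor data and field names are hypothetical:

```python
import csv
import io

# Toy extract-transform-load pass: ingest raw rows, drop incomplete
# records during cleaning, then aggregate for downstream consumers.
raw = io.StringIO(
    "sensor,reading\n"
    "a,10\n"
    "a,\n"    # missing reading: dropped by the transform step
    "b,7\n"
    "a,5\n"
)

totals: dict[str, int] = {}
for row in csv.DictReader(raw):          # extract
    if not row["reading"]:               # transform: filter bad records
        continue
    totals[row["sensor"]] = totals.get(row["sensor"], 0) + int(row["reading"])

print(totals)  # {'a': 15, 'b': 7}       # load: aggregated result
```

At the scale this role describes, the same three phases run distributed (Spark partitions the extract and the reduce), but the data-quality decision points are the same.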

You should be

  • Expert in structured and unstructured data in traditional and Big Data environments – Oracle / SQL Server, MongoDB, Hive / Pig, BigQuery, and Spark
  • Excellent knowledge of Python programming in both traditional and distributed models (PySpark)
  • Expert in shell scripting and writing schedulers
  • Hands-on experience with cloud – deploying complex data solutions in hybrid cloud / on-premise environments, both for data extraction/storage and computation
  • Hands-on experience deploying production apps using large volumes of data with state-of-the-art technologies like Docker, Kubernetes, and Kafka
  • Strong knowledge of data security best practices
  • 5+ years experience in a data engineering role
  • Science / Engineering graduate from a Tier-1 university in the country
  • And most importantly, you must be a passionate coder who really cares about building apps that can help people do things better, smarter, and faster even when they sleep
Job posted by Naveen Taalanki
at Hiring for MNC company
Agency job
Google Cloud Platform (GCP)
Big Data
Data engineering
Hyderabad, Bengaluru (Bangalore), Chennai, Pune
5 - 7 yrs
₹5L - ₹25L / yr

Job roles and responsibilities:

  • Minimum 3 to 4 years of hands-on experience designing, building and operationalizing large-scale enterprise data solutions and applications using GCP data and analytics services such as Cloud Dataproc, Cloud Dataflow, BigQuery, Cloud Pub/Sub, and Cloud Functions.
  • Hands-on experience analyzing, re-architecting and re-platforming on-premise data warehouses to data platforms on GCP using GCP and third-party services.
  • Experience designing and building data pipelines within a hybrid big data architecture using Java, Python, Scala & GCP-native tools.
  • Hands-on experience orchestrating and scheduling data pipelines using Cloud Composer / Apache Airflow.
  • Experience performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP.

Technical Skills Required:

  • Strong experience with GCP data and analytics services
  • Working knowledge of the Big Data ecosystem: Hadoop, Spark, HBase, Hive, Scala, etc.
  • Experience building and optimizing data pipelines in Spark
  • Strong skills in orchestrating workflows with Cloud Composer / Apache Airflow
  • Good knowledge of object-oriented scripting languages: Python (must have) and Java or C++
  • Good to have: knowledge of building CI/CD pipelines with Google Cloud Build and native GCP services
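The orchestration skill above boils down to declaring pipelines as DAGs of dependent tasks. A hedged, stdlib-only sketch of the core idea (the task names are hypothetical and no Airflow API is used — Composer/Airflow add scheduling, retries, and operators on top of exactly this dependency model):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on, mirroring how an
# Airflow DAG wires upstream >> downstream relationships.
dag = {
    "load_bq": {"transform"},        # load_bq runs after transform
    "transform": {"extract_gcs"},    # transform runs after extract_gcs
    "extract_gcs": set(),            # no upstream dependencies
}

# Derive a valid execution order that respects every dependency.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract_gcs', 'transform', 'load_bq']
```

A scheduler then executes tasks in (a parallelized version of) this order, which is why cycles in a pipeline definition are rejected outright.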
Job posted by Swagatika Sahoo
at Company is into Ecommerce & Retail
Agency job
Informatica
SQL
PL/SQL
Python
Amazon Redshift
Chennai
4 - 12 yrs
₹12L - ₹18L / yr
Job Title: Data Engineer

Work Location: Chennai, India

Experience Level: 4 to 12 years

Package: Up to 20 LPA

Notice Period: Immediate to 20 days

It's a full-time opportunity with our client.

Job Description:

Data Engineering Role

· Oracle/Redshift/SQL/PLSQL (This is core to the role)

· Python (essential)

· Informatica Power center/Cloud

· Cloud development experience

· Tableau (would be a bonus but not a requirement).

· Basic development tools experience (listed or comparable industry standard S/W)
 
  JIRA

  Source Control (GIT)

  CI/CD (Jenkins/UC etc.)
Job posted by Venkat B
at Artificial Intelligence (AI) focused product engineering
Agency job
Hadoop
Big Data
Spark
Scala
Java
SQL
Hyderabad
7 - 10 yrs
₹10L - ₹15L / yr
Job Title: Hadoop Developer

Work Location: Hyderabad

Experience: 7+ years

Package: Up to 15 LPA

Notice Period: Immediate to 15 days

It's a full-time opportunity with our client.

Mandatory Skills: Big Data, Hadoop & Spark/Scala/Hive/Pig/Sqoop/Oozie

Job Description:

-- Overall 7 years of experience, with at least 5 years in the Big Data space.
-- Hadoop developer with Spark concepts, Scala programming and Hive; should be able to explain tuples, DataFrames, etc.
-- Strong Hadoop – Spark/Scala/Hive/Pig/Sqoop/Oozie – MUST
-- Good exposure to Kafka (preferred)
-- Good exposure to Java (preferred)
-- End-to-end delivery experience on complex, high-volume, high-velocity projects
-- Good experience with at least one scripting language such as Scala or Python
-- Good exposure to Big Data architectures
-- Experience building frameworks on Hadoop
-- Very good understanding of the Big Data ecosystem
-- Experience sizing and estimating large-scale Big Data projects
-- Good database knowledge with SQL tuning experience
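The Spark/Hadoop concepts the role asks candidates to explain all derive from the map-reduce pattern. A hedged local sketch (the input lines are made up, and a real cluster would run the map and reduce phases across partitioned data rather than one process):

```python
from collections import Counter
from itertools import chain

# Classic word count, the canonical map-reduce example.
lines = ["big data big", "data pipelines"]

# Map phase: each line emits (word, 1) pairs, as a mapper task would.
mapped = chain.from_iterable(
    ((word, 1) for word in line.split()) for line in lines
)

# Reduce phase: sum the counts emitted for each word.
counts = Counter()
for word, n in mapped:
    counts[word] += n

print(dict(counts))  # {'big': 2, 'data': 2, 'pipelines': 1}
```

Being able to point at which part of a Spark job is the map (narrow transformation) and which is the reduce (shuffle + aggregation) is exactly the kind of understanding the "Tuples, Data frames" interview question probes.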
Job posted by Venkat B
Founded 1999  •  Product  •  100-500 employees  •  Profitable
Business Intelligence (BI)
SQL
PowerBI
Tableau
Python
Dashboard
Apache Hive
Big Data
Sisense
Azure DataBricks
Visual Story Telling
Pune
4 - 7 yrs
Best in industry

Want to shape the future of energy through data analysis? We all know that without a good representation of the data there is no data analysis. We go a little further and call it visual storytelling: you have an objective in mind for the data visualization problem a dashboard is going to solve, with steps within the dashboard and sometimes with navigation among a few dashboards together. Especially with Big Data, it becomes very important to extract meaning from so much data while keeping latency low enough to render the dashboards. The job of the Data Analyst is also to create an efficient data cube that makes it easy to build dashboards on top of it. We work with our internal and external customers to understand requirements, develop data cubes & dashboards, and enable customers with the technical knowledge to develop their own custom dashboards. These critical pieces of work complement the Data Analyst, with a continuous feedback loop based on how a dashboard is performing and what fine-tuning is needed in the data cube and the dashboards.

 

The Energy Exemplar (EE) data team is looking for an experienced Data Analyst to join our Pune office. As a dedicated Data Analyst on our Research team, you will apply data analysis expertise and work very closely with the data pipeline team to have the data structured suitably for the data cube. The dashboard developers will then incrementally pull the data from data cubes and create robust dashboards, which in turn provide tremendous value to hundreds of EE customers making critical business decisions in billion-dollar energy markets.

 

At EE, you’ll have access to vast amounts of energy-related data from our sources. Our data pipelines are curated and supported by engineering teams. We also offer many company-sponsored classes and conferences that focus on data analysis and visual storytelling. There is great growth opportunity for data analysis at EE.

 

Responsibilities

  • Develop, test and maintain architectures, such as databases and large-scale dashboard systems backed by high-performance data cubes.
  • Recommend and implement ways to improve data visualization, and data quality.
  • Identify dashboards with highest impact and make them accessible to our internal and external customers.
  • Work together with data engineers and data scientists to wrangle the data and provide quality dashboards and insights to make business critical decisions.
  • Take end-to-end responsibility for the development, quality, testing, and production readiness of the dashboards and data cubes you build.
  • Define and evangelize Data Analysis best standards and practices to ensure engineering excellence at every stage of development cycle.
  • Act as a resident expert for data analysis, data visualization, data warehousing.

 

Qualifications

  • 3+ years of professional experience developing data cubes and dashboards for large-scale, complex datasets from a variety of data sources.
  • Building quality dashboards in terms of visual aesthetics and data quality, with a clear objective for each dashboard and an understanding of how a single dashboard, or a suite of dashboards together, helps solve a business problem.
  • Data Analysis expertise with strong experience working with Big data technologies such as Hadoop, Hive, Spark, Scala, Python etc.
  • Experience working with Cloud based data technologies such as Azure Data lake, Azure Data factory, Azure Data Bricks highly desirable.
  • Knowledge and experience working with database systems such as Apache Hive, SQL Server, Cosmos, etc.
  • Advanced knowledge on SQL, performance tuning, writing moderate to advanced SQL queries strongly preferred.
  • Outstanding communication and collaboration skills. You can learn from and teach others.
  • Strong drive for results. You have a proven record of shepherding experiments to create successful shipping products/services.
  • Experience with prediction in adversarial (energy) environments highly desirable.
  • A Bachelor's or Master's degree in Computer Science or Engineering with coursework in Data Analysis, Experimentation Design, and Visual Storytelling is highly desirable.
Job posted by Subhasis Khatua
Tableau
SQL
Problem solving
Bengaluru (Bangalore)
5 - 8 yrs
₹8L - ₹12L / yr
  • Hands-on development/maintenance experience in Tableau: Developing, maintaining, and managing advanced reporting, analytics, dashboards and other BI solutions using Tableau
  • Reviewing and improving existing Tableau dashboards and data models/ systems and collaborating with teams to integrate new systems
  • Provide support and expertise to the business community to assist with better utilization of Tableau
  • Understand business requirements, conduct analysis and recommend solution options for intelligent dashboards in Tableau
  • Experience with Data Extraction, Transformation and Load (ETL) – knowledge of how to extract, transform and load data
  • Execute SQL data queries across multiple data sources in support of business intelligence reporting needs. Format query results / reports in various ways
  • Participate in QA testing, liaising with other project team members and being responsive to the client's needs, all with an eye for detail in a fast-paced environment
  • Performing and documenting data analysis, data validation, and data mapping/design

 

  • Extensive experience in developing, maintaining and managing Tableau driven dashboards & analytics and working knowledge of Tableau administration /architecture.
  • A solid understanding of SQL, relational databases, and normalization
  • Proficiency in use of query and reporting analysis tools
  • Competency in Excel (macros, pivot tables, etc.)
  • Degree in Mathematics, Computer Science, Information Systems, or related field.
Job posted by Jerrin Thomas
Founded 2017  •  Product  •  20-100 employees  •  Raised funding
Data Analytics
Data Science
Data cleansing
Data Visualization
Data engineering
Remote, Bengaluru (Bangalore), anywhere
0 - 1 yrs
₹1.8L - ₹2.4L / yr

We are looking for multiple Data Analyst Interns to be part of our Delivery Team. The candidate will work on data preparation, configuration and execution of Clootrack's deep learning platforms, Sun Tiger & Beagle. The candidate needs to take an active role in interacting with the core engineering team and business teams to benchmark platform outputs and improve value to customers.

To apply, please fill out the details here- https://docs.google.com/forms/d/16yehtN7oJ4i4X8ytrArK5nMl5nuRG7P9k9RgpMT9aA0/edit

If the performance is satisfactory, the candidates will be offered a permanent role with the team within 6 months or earlier. 

 

Responsibilities

  • Configure and run our proprietary AI & Deep Learning platforms
  • Understand business requirements and map to deliverables
  • Collaborate with core engineering team and business teams
  • Data preparation
  • Data labelling
  • Insight extraction & benchmarking
  • Handle client communication to evaluate requirements, plan and act as client touch point for projects

Skills And Qualifications

  • Yrs of experience: 0+ yrs
  • Very good problem solving capabilities and aptitude
  • Very good team player, with drive to work with complete ownership
  • Fire to drive results
  • Passion to understand data and derive insights
  • Very good communication and presentation skills
  • Good to have: Knowledge on SQL, Google Data Prep, Python Scripting
  • Please do not apply if you are currently enrolled in an undergraduate, graduate or postgraduate course that you have yet to complete

Duration of  Internship:

The applicant should be available for a minimum of 6 months for internship. Placement offers will be provided based on performance.


Job posted by Shameel Abdulla
Founded 2012  •  Services  •  100-1000 employees  •  Profitable
Analytics
Business Intelligence (BI)
Oracle SQL Developer
PowerBI
Business Analysis
Tableau
Data Analytics
SQL
PL/SQL
NCR (Delhi | Gurgaon | Noida)
- yrs
₹6L - ₹11L / yr
Role Summary

The position holder will be responsible for supporting various aspects of the organization's analytical & BI activities. As a member of the team, the candidate will collaborate with a multi-disciplinary team of experts and the SMT group on a wide range of problems, which will give them opportunities to solve critical business problems using analytical & statistical techniques.

Essential/Key Responsibilities

  • Analyze data/reports to identify early warning signals (unusual trends, patterns, process gaps, etc.) and proactively provide feedback so corrective actions can be taken, driving continuous process improvement (improvement in performance, reducing cost, technological improvement, etc.)
  • Create/define BI (Business Intelligence) and AI (Analytical Intelligence) standards for Home Credit
  • As part of the BICC team, provide a high level of Business Intelligence support (regular reports, weekly presentations, etc.) to top management
  • Ensure automation & centralization of BI activities for better utilization of resources
  • Support data-driven ad hoc and critical requirements

Qualifications/Requirements

  • MBA / M.Tech / B.Tech or a Bachelor's in a quantitative discipline such as Computer Science, Engineering, Mathematics, Statistics, Operations Research or Economics from premier / Tier 1 colleges, with a minimum of 3 years of experience in Analytics / Business Intelligence
  • Highly numerate with strong statistical knowledge – able to work with numbers and understand data trends
  • Ability to work with both business and technical communities
  • Good to have: financial analysis / modeling to support the various teams on specific analysis projects

Skills / Desired Characteristics

  • Able to think analytically, using a systematic and logical approach to analyze data, problems and situations
  • Good database skills with exposure to Oracle (11g) systems and tools
  • Highly skilled in Excel, SQL, R/Python or Power BI / Tableau or VBA
  • Ability to manage multiple deliverables with minimum guidance and proactively set up communication processes with stakeholders
  • Willing to work in an IC (Individual Contributor) role
  • Excellent communication skills in English, written and verbal
  • Good knowledge of project management and program management

Who should join us

  • If you are willing to face new challenges and want to apply your data knowledge to the growth and future of the company, Home Credit can give you this opportunity, and a platform to show your skills and suggest valuable ideas to the company.
  • You will get the opportunity to work at a company-wide level and be part of the company's BI platform.
  • Opportunity to work in a team of enthusiastic professionals.
Job posted by Garima Singh
at It is India’s biggest vernacular e-sports gaming platform.
Machine Learning (ML)
Data Structures
Data engineering
Big Data
Neural networks
NCR (Delhi | Gurgaon | Noida)
- yrs
₹12L - ₹34L / yr
  • Experience with Big Data, neural networks (deep learning), and reinforcement learning
  • Ability to design machine learning systems
  • Research and implement appropriate ML algorithms and tools
  • Develop machine learning applications according to requirements
  • Select appropriate datasets and data representation methods
  • Run machine learning tests and experiments
  • Perform statistical analysis and fine-tuning using test results
  • Extend existing ML libraries and frameworks
  • Keep abreast of developments in the field
  • Understanding of data structures, data modeling and software architecture
  • Deep knowledge of math, probability, statistics and algorithms
  • Ability to write robust code in Python, Java and R
  • Familiarity with machine learning frameworks (like Keras or PyTorch) and libraries (like scikit-learn)
Job posted by Silky Malik