Data Engineer (GCP/BigQuery)

at Marktine

Posted by Vishal Sharma
Remote, Bengaluru (Bangalore)
3 - 6 yrs
₹10L - ₹25L / yr
Full time
Skills
Cloud
Google Cloud Platform (GCP)
BigQuery
Python
SQL
Tableau
PowerBI

Specific Responsibilities

  • Minimum of 2 years' experience with Google BigQuery and Google Cloud Platform.
  • Design and develop the ETL framework using BigQuery.
  • Expertise in BigQuery concepts like nested queries, clustering, partitioning, etc.
  • Working experience with clickstream databases and Google Analytics/Adobe Analytics.
  • Should be able to automate data loads from BigQuery using APIs or a scripting language.
  • Good experience with advanced SQL concepts.
  • Good experience with Adobe Launch web, mobile & e-commerce tag implementation.
  • Identify complex, fuzzy problems, break them down into smaller parts, and implement creative, data-driven solutions.
  • Responsible for defining, analyzing, and communicating key metrics and business trends to management teams.
  • Identify opportunities to improve conversion & user experience through data; influence product & feature roadmaps.
  • Must have a passion for data quality and constantly look to improve the system; drive data-driven decision making among stakeholders and drive change management.
  • Understand requirements in order to translate business and technical problems into analytics problems.
  • Effective storyboarding and presentation of the solution to the client and leadership.
  • Client engagement & management.
  • Ability to interface effectively with multiple levels of management and functional disciplines.
  • Assist in developing/coaching individuals technically as well as on soft skills during the project and as part of the client project's training program.
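To illustrate the partitioning and clustering concepts listed above, here is a minimal sketch that assembles a BigQuery `CREATE TABLE` DDL statement. The dataset, table, and column names are hypothetical; in a real pipeline the statement would be submitted through the BigQuery API or a client library rather than printed.

```python
def bigquery_ddl(table, schema, partition_col, cluster_cols):
    """Build a CREATE TABLE statement with date partitioning and clustering."""
    cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in schema)
    return (
        f"CREATE TABLE `{table}` (\n  {cols}\n)\n"
        f"PARTITION BY DATE({partition_col})\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

# Hypothetical clickstream table: partition by event date, cluster on the
# columns most often used in filters.
ddl = bigquery_ddl(
    table="analytics.clickstream_events",
    schema=[("event_ts", "TIMESTAMP"), ("user_id", "STRING"),
            ("page", "STRING"), ("revenue", "NUMERIC")],
    partition_col="event_ts",
    cluster_cols=["user_id", "page"],
)
print(ddl)
```

Partitioning by event date and clustering on frequently filtered columns keeps clickstream queries scanning only the relevant slices of data, which is the main cost and performance lever in BigQuery.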

 

Work Experience
  • 2 to 3 years of working experience with Google BigQuery & Google Cloud Platform
  • Relevant experience in the Consumer Tech/CPG/Retail industries
  • Bachelor's degree in Engineering, Computer Science, Math, Statistics, or a related discipline
  • Strong problem-solving and web analytics skills; acute attention to detail
  • Experience analyzing large, complex, multi-dimensional data sets
  • Experience in one or more roles in an online e-commerce or online support environment
 
Skills
  • Expertise in Google BigQuery & Google Cloud Platform
  • Experience with advanced SQL and a scripting language (Python/R)
  • Hands-on experience with BI tools (Tableau, Power BI)
  • Working experience with and understanding of Adobe Analytics or Google Analytics
  • Experience creating and debugging website & app tracking (Omnibug, Dataslayer, GA Debugger, etc.)
  • Excellent analytical thinking, analysis, and problem-solving skills
  • Knowledge of other GCP services is a plus
 

About Marktine

Founded
2014
Type
Products & Services
Size
20-100 employees
Stage
Bootstrapped

Similar jobs

Senior Customer Scientist

at Crayon Data

Founded 2012  •  Product  •  100-500 employees  •  Raised funding
SQL
Python
Analytical Skills
Data modeling
Data Visualization
Statistical Modeling
Chennai
5 - 8 yrs
₹15L - ₹25L / yr

Role : Senior Customer Scientist 

Experience : 6-8 Years 

Location : Chennai (Hybrid) 
 
 

Who are we? 
 
 

A young, fast-growing AI and big data company, with an ambitious vision to simplify the world’s choices. Our clients are top-tier enterprises in the banking, e-commerce and travel spaces. They use our core AI-based choice engine maya.ai, to deliver personal digital experiences centered around taste. The maya.ai platform now touches over 125M customers globally. You’ll find Crayon Boxes in Chennai and Singapore. But you’ll find Crayons in every corner of the world. Especially where our client projects are – UAE, India, SE Asia and pretty soon the US. 
 
 

Life in the Crayon Box is a little chaotic, largely dynamic and keeps us on our toes! Crayons are a diverse and passionate bunch. Challenges excite us. Our mission drives us. And good food, caffeine (for the most part) and youthful energy fuel us. Over the last year alone, Crayon has seen a growth rate of 3x, and we believe this is just the start. 
 

 
We’re looking for young and young-at-heart professionals with a relentless drive to help Crayon double its growth. Leaders, doers, innovators, dreamers, implementers and eccentric visionaries, we have a place for you all. 
 

 
 

Can you say “Yes, I have!” to the below? 
 
 

  1. Experience with exploratory analysis, statistical analysis, and model development
  2. Knowledge of advanced analytics techniques, including predictive modelling (logistic regression), segmentation, forecasting, data mining, and optimization
  3. Knowledge of software packages such as SAS, R, and RapidMiner for analytical modelling and data management
  4. Strong experience in SQL/Python/R, working efficiently at scale with large data sets
  5. Experience using Business Intelligence tools such as Power BI, Tableau, and Metabase for business applications

 
 

 

Can you say “Yes, I will!” to the below? 
 
 

  1. Drive clarity and solve ambiguous, challenging business problems using data-driven approaches. Propose and own data analysis (including modelling, coding, analytics) to drive business insight and facilitate decisions.
  2. Develop creative solutions and build prototypes to business problems using algorithms based on machine learning, statistics, and optimisation, and work with engineering to deploy those algorithms and create impact in production.
  3. Perform time-series analyses, hypothesis testing, and causal analyses to statistically assess relative impact and extract trends.
  4. Coordinate individual teams to fulfil client requirements and manage deliverables.
  5. Communicate and present complex concepts to business audiences.
  6. Travel to client locations when necessary.

 

 

Crayon is an equal opportunity employer. Employment is based on a person's merit and qualifications and professional competences. Crayon does not discriminate against any employee or applicant because of race, creed, color, religion, gender, sexual orientation, gender identity/expression, national origin, disability, age, genetic information, marital status, pregnancy or related.  
 
 

More about Crayon: https://www.crayondata.com/  
 

More about maya.ai: https://maya.ai/  

 

 

Job posted by
Varnisha Sethupathi

Data Quality Engineer

at Envoy Global

Founded 1998  •  Product  •  100-500 employees  •  Profitable
Data Analytics
Data Visualization
PowerBI
Tableau
Qlikview
Spotfire
Informatica Data Quality
Hyderabad
3 - 5 yrs
₹4L - ₹8L / yr
The Data Quality Engineer will be responsible for designing, developing, documenting and performing data quality checks across all data assets developed at Envoy. That includes ETL jobs, reports, dashboards and data pipelines. The primary goal for this role is to ensure high quality of data delivered to internal stakeholders and customers. Validation of data in data repositories (DW, Data Marts) against data from source systems and validation of metrics and data in reports/dashboards against data in the repositories is a key responsibility. Essentially, making data assets consistently accurate for users.

As the successful candidate, you will be required to:

 

  • Design, develop, and maintain a data quality assurance framework
  • Work in conjunction with BI and Data Engineers to ensure high-quality data deliverables
  • Design and develop testing frameworks to test ETL jobs, BI reports, dashboards, and other data pipelines
  • Write SQL scripts to validate data in the data repositories against the data in the source systems
  • Write SQL scripts to validate data surfacing in BI assets against the data sources
  • Ensure data quality by checking against our ODS and the front-end application
  • Track, monitor, and document testing results
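The source-vs-repository validation scripts described above typically boil down to comparing row counts and aggregates between two tables. A minimal sketch, using Python's stdlib sqlite3 as a stand-in for real source/DW connections (the table and column names are made up for illustration):

```python
import sqlite3

def reconcile(conn, source, target, amount_col):
    """Compare row counts and a column sum between a source and target table."""
    checks = {}
    for name, table in (("source", source), ("target", target)):
        row = conn.execute(
            f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}"
        ).fetchone()
        checks[name] = {"rows": row[0], "total": row[1]}
    checks["match"] = checks["source"] == checks["target"]
    return checks

# Demo with an in-memory database standing in for a real source/DW pair.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (amount REAL);
    CREATE TABLE dw_orders  (amount REAL);
    INSERT INTO src_orders VALUES (10.0), (20.0);
    INSERT INTO dw_orders  VALUES (10.0), (20.0);
""")
result = reconcile(conn, "src_orders", "dw_orders", "amount")
print(result["match"])  # True when counts and totals agree
```

In practice the same pattern runs against two different connections (e.g. the source system and the warehouse), and the checks extend to per-partition counts, min/max dates, and checksums of key columns.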

 

To be eligible for this role, you should possess the following:

  • Demonstrated ability to write complex SQL/T-SQL queries to retrieve and modify data
  • Ability to work in an Agile environment
  • Ability to learn new tools and technologies and adapt to an evolving technology landscape

 

 

Envoy Global is an equal opportunity employer and will recruit, hire, train and promote into all job levels the most qualified applicants without regard to race, colour, religion, sex, national origin, age, disability, ancestry, sexual orientation, gender identification, veteran status, pregnancy, or any other protected classification.

Job posted by
Swetha Akkala

Data Analyst

at Vahak

Founded 2016  •  Product  •  20-100 employees  •  Raised funding
Data Analytics
Data Analyst
MS-Excel
SQL
Tableau
Python
PowerBI
Bengaluru (Bangalore)
4 - 10 yrs
₹10L - ₹15L / yr

Who Are We?

Vahak (https://www.vahak.in) is India's largest & most trusted online transport marketplace & directory for road transport businesses and individual commercial vehicle owners (trucks, trailers, containers, Hyva, LCVs), offering online truck and load booking, transport business branding, and transport business network expansion. Lorry owners can find intercity and intracity loads from all over India and connect with other businesses to find trusted transporters and the best deals in the Indian logistics services market. With the Vahak app, users can book loads and lorries from a live transport marketplace with 5 lakh+ transporters and lorry owners across 10,000+ locations for daily transport requirements.

 

Vahak has raised a capital of $5 Million in a “Pre Series A” round from RTP Global along with participation from Luxor Capital and Leo Capital. The other marquee angel investors include Kunal Shah, Founder and CEO, CRED; Jitendra Gupta, Founder and CEO, Jupiter; Vidit Aatrey and Sanjeev Barnwal, Co-founders, Meesho; Mohd Farid, Co-founder, Sharechat; Amrish Rau, CEO, Pine Labs; Harsimarbir Singh, Co-founder, Pristyn Care; Rohit and Kunal Bahl, Co-founders, Snapdeal; and Ravish Naresh, Co-founder and CEO, Khatabook.


Responsibilities for Data Analyst:

  • Undertake preprocessing of structured and unstructured data
  • Propose solutions and strategies to business challenges
  • Present information using data visualization techniques
  • Identify valuable data sources and automate collection processes
  • Analyze large amounts of information to discover trends and patterns
  • Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques and business strategies.
  • Proactively analyze data to answer key questions from stakeholders or out of self-initiated curiosity with an eye for what drives business performance.

Qualifications for Data Analyst:

  • Experience using business intelligence tools (e.g. Tableau, Power BI – not mandatory)
  • Strong SQL or Excel skills, with the ability to learn other analytics tools
  • Conceptual understanding of various modelling techniques and the pros and cons of each
  • Strong problem-solving skills with an emphasis on product development
  • Experience in programming, advanced computing, developing algorithms, and predictive modeling
  • Experience using statistical computing languages (R, Python, SQL, etc.) to manipulate data and draw insights from large data sets
  • Advantage: knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks
  • Demonstrated experience applying data analysis methods to real-world data problems
Job posted by
Vahak Talent

Tableau Developer

at CIEL HR Services

Founded 2015  •  Services  •  employees  •  Profitable
Tableau
SQL
PowerBI
Bengaluru (Bangalore)
5 - 7 yrs
₹10L - ₹20L / yr
Skill: Tableau Developer
Experience: 5-7 Years

Responsibilities

  • Deliver full-cycle Tableau development projects, from business needs assessment and data discovery, through solution design, to delivery to client.

  • Enable our clients and ourselves to answer questions and develop data-driven insights through Tableau.

  • Provide technical leadership and support across all aspects of Tableau development and use, from data specification development, through DataMart development, to supporting end-user dashboards and reports.

  • Administer Tableau Server: create sites, add and remove users, and provide appropriate access levels for users.

  • Strategize and ideate the solution design. Develop UI mock-ups, storyboards, flow diagrams, conceptual diagrams, wireframes, visual mockups, and interactive prototypes.

  • Develop best practices guidelines for Tableau data processing and visualization. Use these best practices to quickly deliver functionality across the client base and internal users.

Qualifications

  • Degree in a highly relevant analytical or technical field, such as statistics, data science, or business analytics.

  • 5+ years as a Tableau developer and administrator.

  • Extensive experience with large data sets, statistical analyses, and visualization, as well as hands-on experience with tools (SQL, Tableau, Power BI).

  • Ability to quickly learn and take responsibility to deliver.

Job posted by
Swati M

Sr Product Analyst

at High-Growth Fintech Startup

Agency job
via Unnati
Business Intelligence (BI)
Tableau
PowerBI
SQL
Data Analytics
Google Analytics
Analytical Skills
Product Strategy
Process automation
Web Analytics
Data modeling
key metrics
Mumbai
3 - 5 yrs
₹7L - ₹11L / yr
Want to join the trailblazing Fintech company which is leveraging software and technology to change the face of short-term financing in India!

Our client is an innovative Fintech company that is revolutionizing the business of short term finance. The company is an online lending startup that is driven by an app-enabled technology platform to solve the funding challenges of SMEs by offering quick-turnaround, paperless business loans without collateral. It counts over 2 million small businesses across 18 cities and towns as its customers.
 
Its founders are IIT and ISB alumni with deep experience in the fin-tech industry, from earlier working with organizations like Axis Bank, Aditya Birla Group, Fractal Analytics, and Housing.com. It has raised funds of Rs. 100 Crore from finance industry stalwarts and is growing by leaps and bounds.
 
As a Sr Product Analyst, you will partner with business & product teams to define goals, identify specific insights/ anomalies and monitor key metrics on a day-to-day basis.
 
What you will do:
  • Performing extensive analysis using SQL, Google Analytics & Excel from a product standpoint to provide quick recommendations to management
  • Establishing scalable, efficient, and automated processes to deploy data analytics on large data sets across platforms

 

 

What you need to have:

  • B.Tech /B.E.; Any Graduation
  • Strong background in statistical concepts & calculations to perform analysis/ modeling
  • Proficient in SQL and other BI tools like Tableau, Power BI etc.
  • Good knowledge of Google Analytics and any other web analytics platforms (preferred)
  • Strong analytical and problem-solving skills to analyze large volumes of data
  • Ability to work independently and bring innovative solutions to the team
  • Experience of working with a start-up or a product organization (preferred)
Job posted by
Sarika Tamhane
ETL
Data Warehouse (DWH)
ETL Developer
Relational Database (RDBMS)
Spark
Hadoop
SQL server
SSIS
ADF
Python
Java
talend
Azure Data Factory
Bengaluru (Bangalore)
5 - 8 yrs
₹8L - ₹13L / yr

Minimum of 4 years' experience working on DW/ETL projects and expert hands-on working knowledge of ETL tools.

Experience with data management & data warehouse development:

  • Star schemas, Data Vaults, RDBMS, and ODS
  • Change data capture
  • Slowly changing dimensions
  • Data governance
  • Data quality
  • Partitioning and tuning
  • Data stewardship
  • Survivorship
  • Fuzzy matching
  • Concurrency
  • Vertical and horizontal scaling
  • ELT, ETL
  • Spark, Hadoop, MPP, RDBMS

Experience with DevOps architecture, implementation, and operation.

Hands-on working knowledge of Unix/Linux.
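Fuzzy matching and survivorship, listed above, often come down to scoring string similarity and keeping one record per duplicate group. A toy sketch using Python's stdlib difflib — the 0.85 threshold and the sample names are assumptions, not a production matcher:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1] between two normalized strings."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def fuzzy_dedupe(records, threshold=0.85):
    """Naive survivorship: keep the first record of each fuzzy-duplicate group."""
    survivors = []
    for rec in records:
        if not any(similarity(rec, kept) >= threshold for kept in survivors):
            survivors.append(rec)
    return survivors

names = ["Acme Corp", "ACME Corp.", "Beta Industries", "acme corp"]
print(fuzzy_dedupe(names))  # the two Acme variants collapse into the first record
```

Real MDM-style survivorship adds richer scoring (token-based matchers, phonetic keys) and picks the surviving record by recency or source priority rather than order of arrival.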

Building complex SQL queries; expert SQL and data analysis skills, with the ability to debug and fix data issues.

Complex ETL program design coding

Experience in Shell Scripting, Batch Scripting.

Good communication (oral & written) and inter-personal skills

Work closely with business teams to understand their business needs and participate in requirements gathering, creating artifacts and seeking business approval.

Help the business define new requirements. Participate in end-user meetings to derive and define business requirements, propose cost-effective solutions for data analytics, and familiarize the team with customer needs, specifications, design targets, and the techniques needed to support task performance and delivery.

Propose good designs & solutions and adhere to best design and standards practices.

Review & propose industry-best tools & technologies for ever-changing business rules and data sets. Conduct proofs of concept (POC) with new tools & technologies to derive convincing benchmarks.

Prepare the plan; design and document the architecture, high-level topology design, and functional design; review these with customer IT managers; and provide detailed knowledge to the development team to familiarize them with customer requirements, specifications, design standards, and techniques.

Review code developed by other programmers; mentor, guide, and monitor their work, ensuring adherence to programming and documentation policies.

Work with functional business analysts to ensure that application programs are functioning as defined. 

Capture user feedback/comments on the delivered systems and document them for the client's and project manager's review. Review all deliverables before final delivery to the client for quality adherence.

Technologies (Select based on requirement)

Databases - Oracle, Teradata, Postgres, SQL Server, Big Data, Snowflake, or Redshift

Tools – Talend, Informatica, SSIS, Matillion, Glue, or Azure Data Factory

Utilities for bulk loading and extracting

Languages – SQL, PL-SQL, T-SQL, Python, Java, or Scala

JDBC/ODBC, JSON

Data Virtualization and data services development

Service Delivery - REST, Web Services

Data Virtualization Delivery – Denodo

 

ELT, ETL

Cloud certification: Azure

Complex SQL Queries

 

Data Ingestion, Data Modeling (Domain), Consumption (RDBMS)
Job posted by
Jerrin Thomas

Data Scientist

at Accolite Software

Founded 2007  •  Products & Services  •  100-1000 employees  •  Profitable
Data Science
R Programming
Python
Deep Learning
Neural networks
OpenCV
Machine Learning (ML)
Image Processing
Remote, Bengaluru (Bangalore)
3 - 10 yrs
₹5L - ₹24L / yr
  • Adept at machine learning techniques and algorithms
  • Feature selection, dimensionality reduction, and building and optimizing classifiers using machine learning techniques
  • Data mining using state-of-the-art methods
  • Doing ad-hoc analysis and presenting results
  • Proficiency in query languages such as N1QL and SQL
  • Experience with data visualization tools, such as D3.js, ggplot, Plotly, PyPlot, etc.
  • Creating automated anomaly detection systems and constantly tracking their performance
  • Strong Python skills are a must
  • Strong data analysis and mining skills are a must
  • Deep learning, neural networks, CNNs, image processing (must)
  • Building analytics systems: data collection, cleansing, and integration
  • Experience with NoSQL databases, such as Couchbase, MongoDB, Cassandra, and HBase
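As a toy illustration of the automated anomaly detection mentioned above, a z-score check over a metric series can be sketched with stdlib Python — the threshold and the sample counts are assumptions; production systems usually use rolling windows or seasonal baselines instead of a global mean:

```python
from statistics import mean, stdev

def zscore_anomalies(values, threshold=3.0):
    """Return indices of points more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # constant series: nothing can be anomalous
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

# Hypothetical daily event counts with one obvious spike at index 5.
daily_counts = [100, 102, 98, 101, 99, 500, 100, 97]
print(zscore_anomalies(daily_counts, threshold=2.0))
```

Tracking the detector's own performance (false-positive rate as data drifts) is the "constant tracking" half of the bullet above.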

Job posted by
Nikita Sadarangani

Data Engineer

at Prescience Decision Solutions

Founded 2017  •  Products & Services  •  20-100 employees  •  Profitable
Big Data
ETL
Spark
Apache Kafka
Apache Spark
Python
SQL
Java
Databricks
Bengaluru (Bangalore)
3 - 7 yrs
₹10L - ₹20L / yr

The Data Engineer would be responsible for selecting and integrating the required Big Data tools and frameworks, and would implement data ingestion and ETL/ELT processes.

Required Experience, Skills and Qualifications:

  • Hands-on experience with Big Data tools/technologies like Spark, Databricks, MapReduce, Hive, and HDFS
  • Expertise and excellent understanding of the big data toolset, such as Sqoop, Spark Streaming, Kafka, and NiFi
  • Proficiency in any of the programming languages Python/Scala/Java, with 4+ years' experience
  • Experience with cloud infrastructures like MS Azure, Data Lake, etc.
  • Good working knowledge of NoSQL DBs (Mongo, HBase, Cassandra)
Job posted by
Shivakumar K

Data Scientist

at Simplilearn Solutions

Founded 2009  •  Product  •  500-1000 employees  •  Profitable
Data Science
R Programming
Python
Scala
Tableau
SQL server
Bengaluru (Bangalore)
2 - 5 yrs
₹6L - ₹10L / yr
Simplilearn.com is the world's largest professional certifications company and an Onalytica Top 20 influential brand. With a library of 400+ courses, we've helped 500,000+ professionals advance their careers, delivering $5 billion in pay raises. Simplilearn has over 6,500 employees worldwide, and our customers include Fortune 1000 companies, top universities, leading agencies, and hundreds of thousands of working professionals. We are growing over 200% year on year and having fun doing it.

Description

We are looking for candidates with strong technical skills and a proven track record in building predictive solutions for enterprises. This is a very challenging role and provides an opportunity to work on developing insights-based Ed-Tech software products used by a large set of customers across the globe. It provides an exciting opportunity to work on various advanced analytics & data science problem statements using cutting-edge modern technologies, collaborating with product, marketing & sales teams.

Responsibilities

  • Work on enterprise-level advanced reporting requirements & data analysis.
  • Solve various data science problems: customer engagement, dynamic pricing, lead scoring, NPS improvement, optimization, chatbots, etc.
  • Work on data engineering problems utilizing our tech stack: S3 data lake, Spark, Redshift, Presto, Druid, Airflow, etc.
  • Collect relevant data from source systems / use crawling and parsing infrastructure to put together data sets.
  • Craft, conduct, and analyse A/B experiments to evaluate machine learning models/algorithms.
  • Communicate findings and take algorithms/models to production with ownership.

Desired Skills

  • BE/BTech/MSc/MS in Computer Science or a related technical field.
  • 2-5 years of experience in the advanced analytics discipline, with solid data engineering & visualization skills.
  • Strong SQL skills and BI skills using Tableau, and the ability to perform various complex analytics on data.
  • Ability to propose hypotheses and design experiments in the context of specific problems using statistics & ML algorithms.
  • Good overlap with modern data processing frameworks such as AWS Lambda and Spark using Scala or Python.
  • Dedication and diligence in understanding the application domain, collecting/cleaning data, and conducting various A/B experiments.
  • A Bachelor's degree in Statistics or prior experience with Ed-Tech is a plus.
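A/B experiment analysis of the kind described above often reduces to a two-proportion z-test on conversion rates. A minimal stdlib sketch — the conversion numbers are made up for illustration:

```python
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant B converts 120/1000 vs control A 100/1000.
z, p = two_proportion_ztest(100, 1000, 120, 1000)
print(round(z, 2), round(p, 3))
```

Here the p-value is well above 0.05, so this made-up sample would not justify shipping variant B; crafting experiments is largely about sizing `n` so that a meaningful lift clears the significance bar.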
Job posted by
Aniket Manhar Nanjee

Data Engineer - Google Cloud Platform

at Datalicious Pty Ltd

Founded 2007  •  Products & Services  •  20-100 employees  •  Raised funding
Python
Amazon Web Services (AWS)
Google Cloud Storage
Big Data
Data Analytics
Datawarehousing
Software Development
Data Science
Bengaluru (Bangalore)
2 - 7 yrs
₹7L - ₹20L / yr
DESCRIPTION:

We're looking for an experienced Data Engineer with strong cloud technology experience to help our big data team take our products to the next level. This is a hands-on role: you will be required to code and develop the product in addition to your leadership role. You need to have a strong software development background and love working with cutting-edge big data platforms. You are expected to bring extensive hands-on experience with Amazon Web Services (Kinesis streams, EMR, Redshift), Spark and other Big Data processing frameworks and technologies, as well as advanced knowledge of RDBMS and Data Warehousing solutions.

REQUIREMENTS:

  • Strong background working on large-scale Data Warehousing and data processing solutions.
  • Strong Python and Spark programming experience.
  • Strong experience in building big data pipelines.
  • Very strong SQL skills are an absolute must.
  • Good knowledge of OO, functional, and procedural programming paradigms.
  • Strong understanding of various design patterns.
  • Strong understanding of data structures and algorithms.
  • Strong experience with Linux operating systems.
  • At least 2+ years of experience working as a software developer or in a data-driven environment.
  • Experience working in an agile environment.
  • Lots of passion, motivation, and drive to succeed!

Highly desirable:

  • Understanding of agile principles, specifically Scrum.
  • Exposure to Google Cloud Platform services such as BigQuery, Compute Engine, etc.
  • Docker, Puppet, Ansible, etc.
  • Understanding of the digital marketing and digital advertising space would be advantageous.

BENEFITS:

Datalicious is a global data technology company that helps marketers improve customer journeys through the implementation of smart data-driven marketing strategies. Our team of marketing data specialists offers a wide range of skills suitable for any challenge, covering everything from web analytics to data engineering, data science, and software development.

  • Experience: Join us at any level and we promise you'll feel up-levelled in no time, thanks to the fast-paced, transparent, and aggressive growth of Datalicious.
  • Exposure: Work with ONLY the best clients in the Australian and SEA markets; every problem you solve would directly impact millions of real people at a large scale across industries.
  • Work culture: Voted one of the Top 10 Tech Companies in Australia. Never a boring day at work, and we walk the talk. The CEO organises nerf-gun bouts in the middle of a hectic day.
  • Money: We'd love to have a long-term relationship, because long-term benefits are exponential. We encourage people to get technical certifications via online courses or digital schools.

So if you are looking for the chance to work for an innovative, fast-growing business that will give you exposure across a diverse range of the world's best clients, products, and industry-leading technologies, then Datalicious is the company for you!
Job posted by
Ramjee Ganti