Specific Responsibilities
- Minimum of 2 years' experience with Google BigQuery and Google Cloud Platform.
- Design and develop the ETL framework using BigQuery.
- Expertise in BigQuery concepts such as nested queries, clustering, and partitioning.
- Working experience with clickstream databases and Google Analytics/Adobe Analytics.
- Should be able to automate data loads from BigQuery using APIs or a scripting language (a minimal sketch follows this list).
- Good experience with advanced SQL concepts.
- Good experience with Adobe Launch Web, Mobile & e-commerce tag implementation.
- Identify complex, fuzzy problems, break them down into smaller parts, and implement creative, data-driven solutions
- Responsible for defining, analyzing, and communicating key metrics and business trends to the management teams
- Identify opportunities to improve conversion & user experience through data. Influence product & feature roadmaps.
- Must have a passion for data quality and constantly look to improve the system. Drive data-driven decision-making with stakeholders and drive change management
- Understand requirements to translate business problems & technical problems into analytics problems.
- Effective storyboarding and presentation of the solution to the client and leadership.
- Client engagement & management
- Ability to interface effectively with multiple levels of management and functional disciplines.
- Assist in developing/coaching individuals technically as well as on soft skills, both during the project and as part of the client project's training program.
- 2 to 3 years of working experience with Google BigQuery & Google Cloud Platform
- Relevant experience in Consumer Tech/CPG/Retail industries
- Bachelor’s in engineering, Computer Science, Math, Statistics or related discipline
- Strong problem solving and web analytical skills. Acute attention to detail.
- Experience in analyzing large, complex, multi-dimensional data sets.
- Experience in one or more roles in an online eCommerce or online support environment.
- Expertise in Google BigQuery & Google Cloud Platform
- Experience in Advanced SQL, Scripting language (Python/R)
- Hands-on experience in BI tools (Tableau, Power BI)
- Working Experience & understanding of Adobe Analytics or Google Analytics
- Experience in creating and debugging website & app tracking (Omnibug, Dataslayer, GA Debugger, etc.)
- Excellent analytical thinking, analysis, and problem-solving skills.
- Knowledge of other GCP services is a plus
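As an illustrative sketch of the API-driven load automation and the partitioning/clustering concepts above, the following Python snippet loads a CSV extract into a date-partitioned, clustered BigQuery table. It assumes the google-cloud-bigquery client library; the project, dataset, table, column and bucket names are all hypothetical.

```python
# Sketch: automated load of a daily CSV extract into a date-partitioned,
# clustered BigQuery table. All resource names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

table_id = "my-project.analytics.clickstream_events"  # hypothetical table
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    time_partitioning=bigquery.TimePartitioning(field="event_date"),
    clustering_fields=["user_id", "page_path"],
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/clickstream/2024-01-01.csv",  # hypothetical source file
    table_id,
    job_config=job_config,
)
load_job.result()  # block until the load completes; raises on failure
print(f"Table now has {client.get_table(table_id).num_rows} rows.")
```

Wrapped in a scheduler (e.g. Cloud Composer or cron), a script along these lines is one plausible way to automate such loads.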
Role: Senior Customer Scientist
Experience: 6-8 Years
Location: Chennai (Hybrid)
Who are we?
A young, fast-growing AI and big data company, with an ambitious vision to simplify the world’s choices. Our clients are top-tier enterprises in the banking, e-commerce and travel spaces. They use our core AI-based choice engine maya.ai, to deliver personal digital experiences centered around taste. The maya.ai platform now touches over 125M customers globally. You’ll find Crayon Boxes in Chennai and Singapore. But you’ll find Crayons in every corner of the world. Especially where our client projects are – UAE, India, SE Asia and pretty soon the US.
Life in the Crayon Box is a little chaotic, largely dynamic and keeps us on our toes! Crayons are a diverse and passionate bunch. Challenges excite us. Our mission drives us. And good food, caffeine (for the most part) and youthful energy fuel us. Over the last year alone, Crayon has seen a growth rate of 3x, and we believe this is just the start.
We’re looking for young and young-at-heart professionals with a relentless drive to help Crayon double its growth. Leaders, doers, innovators, dreamers, implementers and eccentric visionaries, we have a place for you all.
Can you say “Yes, I have!” to the below?
- Experience with exploratory analysis, statistical analysis, and model development
- Knowledge of advanced analytics techniques, including predictive modelling (e.g. logistic regression), segmentation, forecasting, data mining, and optimization (a logistic-regression sketch follows this list)
- Knowledge of software packages such as SAS, R, and RapidMiner for analytical modelling and data management.
- Strong experience in SQL/Python/R, working efficiently at scale with large data sets
- Experience in using Business Intelligence tools such as PowerBI, Tableau, Metabase for business applications
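As a minimal illustration of the predictive modelling item above, here is a logistic-regression sketch on synthetic data, assuming scikit-learn; the data and parameters are arbitrary:

```python
# Sketch: fit a logistic-regression classifier on synthetic data and
# evaluate it with AUC. Real features would come from customer data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1_000).fit(X_train, y_train)
probs = model.predict_proba(X_test)[:, 1]  # propensity scores
print(f"Test AUC: {roc_auc_score(y_test, probs):.3f}")
```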
Can you say “Yes, I will!” to the below?
- Drive clarity and solve ambiguous, challenging business problems using data-driven approaches. Propose and own data analysis (including modelling, coding, analytics) to drive business insight and facilitate decisions.
- Develop creative solutions and build prototypes to business problems using algorithms based on machine learning, statistics, and optimisation, and work with engineering to deploy those algorithms and create impact in production.
- Perform time-series analyses, hypothesis testing, and causal analyses to statistically assess relative impact and extract trends (a hypothesis-testing sketch follows this list)
- Coordinate individual teams to fulfil client requirements and manage deliverables
- Communicate and present complex concepts to business audiences
- Travel to client locations when necessary
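A minimal sketch of the hypothesis-testing item above, assuming SciPy; the two groups here are simulated conversion-style metrics:

```python
# Sketch: Welch's two-sample t-test comparing a metric between two groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(loc=0.12, scale=0.03, size=500)  # simulated metric, group A
group_b = rng.normal(loc=0.13, scale=0.03, size=500)  # simulated metric, group B

t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the observed difference is unlikely to be
# due to sampling noise alone.
```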
Crayon is an equal opportunity employer. Employment is based on a person's merit, qualifications and professional competence. Crayon does not discriminate against any employee or applicant because of race, creed, color, religion, gender, sexual orientation, gender identity/expression, national origin, disability, age, genetic information, marital status, pregnancy or related conditions.
More about Crayon: https://www.crayondata.com/
More about maya.ai: https://maya.ai/
As the successful candidate, you will be required to:
- Design, develop and maintain a data quality assurance framework
- Work in conjunction with BI and Data Engineers to ensure high-quality data deliverables
- Design and develop testing frameworks to test ETL jobs, BI reports, dashboards and other data pipelines
- Write SQL scripts to validate data in the data repositories against the data in the source systems
- Write SQL scripts to validate data surfacing in BI assets against the data sources (a validation sketch follows this list)
- Ensure data quality by checking against our ODS and the front-end application
- Track, monitor and document testing results
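A minimal sketch of the validation scripts described above; SQLite stands in for the real source system and data repository, and the table names are hypothetical:

```python
# Sketch: reconcile row counts and sums between a source table and its
# warehouse copy. In practice each query would run against its own system.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_orders (id INTEGER, amount REAL);
    CREATE TABLE target_orders (id INTEGER, amount REAL);
    INSERT INTO source_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO target_orders VALUES (1, 10.0), (2, 20.0);
""")

checks = {
    "row_count": "SELECT (SELECT COUNT(*) FROM source_orders)"
                 " - (SELECT COUNT(*) FROM target_orders)",
    "amount_sum": "SELECT (SELECT SUM(amount) FROM source_orders)"
                  " - (SELECT SUM(amount) FROM target_orders)",
}
for name, sql in checks.items():
    diff = conn.execute(sql).fetchone()[0]
    print(f"{name}: {'PASS' if diff == 0 else f'FAIL (diff={diff})'}")
```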
To be eligible for this role, you should possess the following:
- Demonstrated ability to write complex SQL/TSQL queries to retrieve/modify data
- Ability to work in an Agile environment
- Ability to learn new tools and technologies and adapt to an evolving tech-scape
Envoy Global is an equal opportunity employer and will recruit, hire, train and promote into all job levels the most qualified applicants without regard to race, colour, religion, sex, national origin, age, disability, ancestry, sexual orientation, gender identification, veteran status, pregnancy, or any other protected classification.
Who Are We?
Vahak (https://www.vahak.in) is India's largest and most trusted online transport marketplace and directory for road transport businesses and individual commercial vehicle (trucks, trailers, containers, Hyva, LCVs) owners, offering online truck and load booking, transport business branding and transport business network expansion. Lorry owners can find intercity and intracity loads from all over India and connect with other businesses to find trusted transporters and the best deals in the Indian logistics services market. With the Vahak app, users can book loads and lorries in a live transport marketplace with more than 5 lakh transporters and lorry owners across 10,000+ locations for daily transport requirements.
Vahak has raised a capital of $5 Million in a “Pre Series A” round from RTP Global along with participation from Luxor Capital and Leo Capital. The other marquee angel investors include Kunal Shah, Founder and CEO, CRED; Jitendra Gupta, Founder and CEO, Jupiter; Vidit Aatrey and Sanjeev Barnwal, Co-founders, Meesho; Mohd Farid, Co-founder, Sharechat; Amrish Rau, CEO, Pine Labs; Harsimarbir Singh, Co-founder, Pristyn Care; Rohit and Kunal Bahl, Co-founders, Snapdeal; and Ravish Naresh, Co-founder and CEO, Khatabook.
Responsibilities for Data Analyst:
- Undertake preprocessing of structured and unstructured data
- Propose solutions and strategies to business challenges
- Present information using data visualization techniques
- Identify valuable data sources and automate collection processes
- Analyze large amounts of information to discover trends and patterns
- Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques and business strategies.
- Proactively analyze data to answer key questions from stakeholders or out of self-initiated curiosity with an eye for what drives business performance.
Qualifications for Data Analyst
- Experience using business intelligence tools (e.g. Tableau, Power BI; not mandatory)
- Strong SQL or Excel skills with the ability to learn other analytic tools
- Conceptual understanding of various modelling techniques, pros and cons of each technique
- Strong problem solving skills with an emphasis on product development.
- Experience in programming, advanced computing, algorithm development and predictive modeling
- Experience using statistical computer languages (R, Python, SQL, etc.) to manipulate data and draw insights from large data sets.
- Advantage: knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks (a clustering sketch follows this list)
- Demonstrated experience applying data analysis methods to real-world data problems
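As one illustration of the machine learning techniques mentioned above, a minimal clustering sketch assuming scikit-learn; the customer features and cluster count are arbitrary assumptions:

```python
# Sketch: segment synthetic "customers" into three clusters with k-means.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Hypothetical features: [monthly_spend, orders_per_month]
X = rng.normal(size=(300, 2)) * [50, 3] + [200, 10]

X_scaled = StandardScaler().fit_transform(X)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=1).fit(X_scaled)
print("Cluster sizes:", np.bincount(kmeans.labels_))
```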
Responsibilities
- Deliver full-cycle Tableau development projects, from business needs assessment and data discovery, through solution design, to delivery to the client.
- Enable our clients and ourselves to answer questions and develop data-driven insights through Tableau.
- Provide technical leadership and support across all aspects of Tableau development and use, from data specification development, through DataMart development, to supporting end-user dashboards and reports.
- Administer Tableau Server by creating sites, adding/removing users, and providing the appropriate level of access for users (an administration sketch follows this list).
- Strategize and ideate the solution design. Develop UI mock-ups, storyboards, flow diagrams, conceptual diagrams, wireframes, visual mockups, and interactive prototypes.
- Develop best-practice guidelines for Tableau data processing and visualization. Use these best practices to quickly deliver functionality across the client base and to internal users.
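A minimal sketch of the Tableau Server administration task above, assuming the tableauserverclient Python library; the server URL, credentials, site and user names are placeholders:

```python
# Sketch: sign in to Tableau Server and add a user with the Explorer role.
import tableauserverclient as TSC

tableau_auth = TSC.TableauAuth("admin_user", "admin_password", site_id="analytics")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(tableau_auth):
    new_user = TSC.UserItem("new.analyst", TSC.UserItem.Roles.Explorer)
    new_user = server.users.add(new_user)
    print(f"Added {new_user.name} with role {new_user.site_role}")
```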
Qualifications
- Degree in a highly relevant analytical or technical field, such as statistics, data science, or business analytics.
- 5+ years as a Tableau developer and administrator.
- Extensive experience with large data sets, statistical analyses, and visualization, as well as hands-on experience with tools (SQL, Tableau, Power BI).
- Ability to quickly learn and take responsibility for delivery.
Our client is an innovative Fintech company that is revolutionizing the business of short term finance. The company is an online lending startup that is driven by an app-enabled technology platform to solve the funding challenges of SMEs by offering quick-turnaround, paperless business loans without collateral. It counts over 2 million small businesses across 18 cities and towns as its customers.
- Performing extensive analysis using SQL, Google Analytics & Excel from a product standpoint to provide quick recommendations to management
- Establishing scalable, efficient and automated processes to deploy data analytics on large data sets across platforms
What you need to have:
- B.Tech/B.E. or any graduate degree
- Strong background in statistical concepts & calculations to perform analysis/ modeling
- Proficient in SQL and other BI tools like Tableau, Power BI etc.
- Good knowledge of Google Analytics and any other web analytics platforms (preferred)
- Strong analytical and problem-solving skills to analyze large volumes of data
- Ability to work independently and bring innovative solutions to the team
- Experience of working with a start-up or a product organization (preferred)
- Minimum of 4 years' experience working on DW/ETL projects and expert hands-on working knowledge of ETL tools.
- Experience with data management and data warehouse development, including:
  - Star schemas, Data Vaults, RDBMS, and ODS
  - Change data capture
  - Slowly changing dimensions (a Type 2 sketch follows this list)
  - Data governance
  - Data quality
  - Partitioning and tuning
  - Data stewardship
  - Survivorship
  - Fuzzy matching (a matching sketch follows the technologies list below)
  - Concurrency
  - Vertical and horizontal scaling
  - ELT, ETL
  - Spark, Hadoop, MPP, RDBMS
- Experience with DevOps architecture, implementation and operation
- Hands-on working knowledge of Unix/Linux
- Building complex SQL queries; expert SQL and data analysis skills, with the ability to debug and fix data issues
- Complex ETL program design and coding
- Experience in shell scripting and batch scripting
- Good communication (oral & written) and interpersonal skills
- Work closely with business teams to understand their business needs, participate in requirements gathering, create artifacts, and seek business approval.
- Help the business define new requirements: participate in end-user meetings to derive and define business requirements, propose cost-effective solutions for data analytics, and familiarize the team with customer needs, specifications, design targets and techniques to support task performance and delivery.
- Propose sound designs and solutions, and adhere to best design and standards practices.
- Review and propose industry-best tools and technologies for ever-changing business rules and data sets. Conduct proofs of concept (POCs) with new tools and technologies to derive convincing benchmarks.
- Prepare the plan; design and document the architecture, high-level topology design and functional design; review these with customer IT managers; and provide detailed knowledge to the development team to familiarize them with customer requirements, specifications, design standards and techniques.
- Review code developed by other programmers; mentor, guide and monitor their work, ensuring adherence to programming and documentation policies.
- Work with functional business analysts to ensure that application programs are functioning as defined.
- Capture user feedback/comments on the delivered systems and document them for the client and project manager's review. Review all deliverables before final delivery to the client for quality adherence.
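A minimal sketch of a Type 2 slowly changing dimension update, one of the warehouse concepts listed above; this pandas version is purely illustrative, and the column names are assumptions:

```python
# Sketch: SCD Type 2 in pandas. Changed rows are expired (valid_to set)
# and re-inserted as new current versions.
import pandas as pd

dim = pd.DataFrame({
    "customer_id": [1, 2],
    "city": ["Pune", "Delhi"],
    "valid_from": ["2023-01-01", "2023-01-01"],
    "valid_to": [None, None],  # None marks the current row
})
incoming = pd.DataFrame({"customer_id": [1, 2], "city": ["Pune", "Mumbai"]})
today = "2024-01-01"

current = dim[dim["valid_to"].isna()]
merged = current.merge(incoming, on="customer_id", suffixes=("_old", "_new"))
changed = merged[merged["city_old"] != merged["city_new"]]

# Expire the current versions of changed rows...
expire = dim["customer_id"].isin(changed["customer_id"]) & dim["valid_to"].isna()
dim.loc[expire, "valid_to"] = today

# ...then append the new versions as current rows.
new_rows = changed[["customer_id", "city_new"]].rename(columns={"city_new": "city"})
new_rows["valid_from"] = today
new_rows["valid_to"] = None
dim = pd.concat([dim, new_rows], ignore_index=True)
print(dim)
```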
Technologies (select based on requirement):
- Databases: Oracle, Teradata, Postgres, SQL Server, Big Data, Snowflake, or Redshift
- Tools: Talend, Informatica, SSIS, Matillion, Glue, or Azure Data Factory
- Utilities for bulk loading and extracting
- Languages: SQL, PL/SQL, T-SQL, Python, Java, or Scala
- JDBC/ODBC, JSON
- Data virtualization and data services development
- Service delivery: REST, web services
- Data virtualization delivery: Denodo
- ELT, ETL
- Cloud certification: Azure
- Complex SQL queries
- Data ingestion, data modeling (domain), consumption (RDBMS)
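A minimal fuzzy-matching sketch, as referenced in the list above; it uses only the Python standard library, and the 0.6 similarity threshold is an arbitrary assumption:

```python
# Sketch: match incoming vendor names against a master list using a
# simple character-similarity ratio.
from difflib import SequenceMatcher

master = ["Acme Corporation", "Global Traders Pvt Ltd"]
incoming = ["ACME Corp.", "Global Traders Private Limited", "New Vendor LLC"]

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

for record in incoming:
    best = max(master, key=lambda m: similarity(record, m))
    score = similarity(record, best)
    verdict = f"matches {best!r}" if score >= 0.6 else "no confident match"
    print(f"{record!r}: {verdict} (score={score:.2f})")
```

Production matching would typically use stronger normalization or a dedicated library, but the thresholding idea is the same.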
Data Scientist
- Adept at machine learning techniques and algorithms: feature selection, dimensionality reduction, and building and optimizing classifiers
- Data mining using state-of-the-art methods
- Doing ad-hoc analysis and presenting results
- Proficiency in using query languages such as N1QL and SQL
- Experience with data visualization tools such as D3.js, ggplot, Plotly, PyPlot, etc.
- Creating automated anomaly detection systems and constantly tracking their performance (a simple detector sketch follows this list)
- Strong Python skills (a must)
- Strong data analysis and mining skills (a must)
- Deep learning, neural networks, CNNs, image processing (a must)
- Building analytics systems: data collection, cleansing and integration
- Experience with NoSQL databases such as Couchbase, MongoDB, Cassandra, and HBase
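A minimal sketch of the automated anomaly detection item above, using a simple z-score rule on simulated data; the 3-sigma threshold is a common but arbitrary choice:

```python
# Sketch: flag points in a metric stream whose z-score exceeds 3.
import numpy as np

rng = np.random.default_rng(7)
metric = rng.normal(loc=100, scale=5, size=200)
metric[50] = 160  # inject an anomaly for demonstration

z_scores = (metric - metric.mean()) / metric.std()
anomalies = np.where(np.abs(z_scores) > 3)[0]
print("Anomalous indices:", anomalies)
```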
Data Engineer
at Prescience Decision Solutions
The Data Engineer will be responsible for selecting and integrating the required Big Data tools and frameworks, and will implement data ingestion and ETL/ELT processes.
Required Experience, Skills and Qualifications:
- Hands-on experience with Big Data tools/technologies such as Spark, Databricks, MapReduce, Hive, and HDFS (an ingestion sketch follows this list)
- Expertise in and excellent understanding of big data toolsets such as Sqoop, Spark Streaming, Kafka, and NiFi
- Proficiency in at least one programming language (Python, Scala, or Java), with 4+ years' experience
- Experience with cloud infrastructure such as MS Azure, Data Lake, etc.
- Good working knowledge of NoSQL databases (MongoDB, HBase, Cassandra)
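A minimal ingestion sketch for the Spark experience listed above, assuming PySpark is installed; the input and output paths, and the event_date column, are hypothetical:

```python
# Sketch: read raw JSON events, stamp a load time, and write partitioned
# Parquet. Paths and the partition column are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingestion-sketch").getOrCreate()

events = spark.read.json("/data/raw/events/")  # hypothetical input path
events = events.withColumn("load_ts", F.current_timestamp())

(events.write
    .mode("append")
    .partitionBy("event_date")  # assumes the raw events carry this column
    .parquet("/data/curated/events/"))  # hypothetical output path

spark.stop()
```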