Data Analytics - Datorama
The Client is the world’s largest media investment company. Our team of experts supports clients in programmatic, social, paid search, analytics, technology, organic search, affiliate marketing, e-commerce and across traditional channels.
Responsibilities of the role:
· Manage extraction of data sets from multiple marketing/database platforms and perform hygiene and quality control steps, either via Datorama or in partnership with the Neo Technology team. Data sources will include: web analytics tools, media analytics, customer databases, social listening tools, search tools, syndicated data, research & survey tools, etc.
· Implement and manage data system architecture
· Audit and manage data taxonomy/classifications from multiple systems and partners
· Manage the extraction, loading, and transformation of multiple data sets
· Cleanse all data and metrics; perform override updates where necessary
· Execute all business rules according to requirements
· Identify and implement opportunities for efficiency throughout the process
· Manage and execute thorough QA process and ensure quality and accuracy of all data and reporting deliverables
· Manipulate and analyze “big” data sets synthesized from a variety of sources, including media platforms and marketing automation tools
· Generate and manage all data visualizations and ensure data is presented accurately and is visually pleasing
· Assist analytics team in running numerous insights reports as needed
· Help maintain a performance platform and provide insights and ongoing recommendations around it.
Requirements:
· 5+ years’ experience in an analytics position working with large amounts of data
· Hands-on experience working with data visualization tools such as Datorama, Tableau, or Power BI
· Additional desirable skills include tag management experience, application coding experience, and a statistics background
· Digital media experience background preferred, including knowledge of DoubleClick and web analytics tools
· Excellent communication skills
· Experience with HTML, CSS, JavaScript a plus
- Overall, 4-5 years of experience
- At least 3 years of HTML/CSS development experience is required
- Must have solid working experience with HTML5, CSS3, XML
- Should be an expert in XPath/regex expressions for complex website navigation
- Working knowledge of core JavaScript
- Experience mentoring junior team members is desirable
- Must have good communication skills
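The XPath/regex expectation above can be illustrated with a minimal sketch. The product-page fragment and field names here are hypothetical, and the standard library's ElementTree supports only a limited XPath subset:

```python
import re
import xml.etree.ElementTree as ET

# Hypothetical product-page fragment, used only for illustration.
html = """
<div>
  <span class="name">Widget</span>
  <span class="price">Price: $19.99</span>
</div>
"""

root = ET.fromstring(html)
# XPath-style lookup (ElementTree implements a limited XPath subset).
price_text = root.find(".//span[@class='price']").text
# A regex pulls the numeric value out of the surrounding label text.
price = float(re.search(r"\$(\d+\.\d{2})", price_text).group(1))
print(price)  # → 19.99
```

Real scraping targets are rarely well-formed XML, so a production version would typically use an HTML-tolerant parser, but the XPath-plus-regex extraction pattern is the same.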
Company Overview:
At Codvo, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day.
We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.
Job Description:
- Candidate should have strong technical and analytical skills, particularly in SQL Server and reporting tools such as Tableau, Power BI, SSRS, and .NET.
- Candidate should have the experience needed to properly understand the project deliverables.
- Candidate will be responsible for the tasks assigned to them in the project.
- Candidate will be responsible for deliverables of proper quality, delivered within the planned time and cost and adhering to the industry standards defined for the project.
- Candidate should be involved in client interaction.
- Candidate should possess excellent communication skills.
Required Skills: BI Gateway, MS SQL Server, Tableau, Power BI, .NET, OLAP, UI/UX, Dashboard Building
Experience: 5+ years
Job Location: Remote/Saudi Arabia
Work Timings: 2:30 pm to 11:30 pm
We are looking for a passionate and experienced Data Analyst to join our team! As a Data Analyst at Oneistox, you will play an extremely important role as your insights and findings will be crucial for our growth and success.
Job Responsibilities
- Execution of data validation, profiling, auditing and data cleansing activities
- Collaboration with internal and external stakeholders
- Development, production and management of data quality reports
- Development of key metrics, rules and notifications to identify critical gaps
- Identify opportunities for business process improvements
- Develop and maintain KPI dashboards
- Support all Marketing and Sales data requests
- Standardization and automation of data collection and processing
Job Requirements
- BS in Computer Science, Mathematics or a similar field
- 2-3 years of experience as a Data Analyst or a similar role
- Experience with analytics platforms like Google Analytics, HubSpot and Amplitude is a must
- Familiarity with Javascript is preferred
- Advanced Excel is a must; pivot tables and macros preferred
- Experience with using a range of data analysis tools
- Advanced analytics capability is a preferred skill
- Understanding of multiple regression analyses
- Experience with performing analysis in a database environment is preferred
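The multiple-regression requirement above can be sketched with a toy ordinary-least-squares fit. The data and coefficients below are invented for illustration:

```python
import numpy as np

# Toy dataset: y depends on two predictors plus an intercept.
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = 1.5 + 2.0 * X[:, 0] - 0.5 * X[:, 1]

# Add an intercept column and solve the least-squares problem.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef.round(2))  # intercept, then one slope per predictor
```

Because the toy response is exactly linear, the fit recovers the generating coefficients; on real data an analyst would also inspect residuals, standard errors, and goodness-of-fit.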
Python + Data Scientist:
• Build data-driven models to understand the characteristics of engineering systems
• Train, tune, validate, and monitor predictive models
• Sound knowledge of statistics
• Experience developing data processing tasks using PySpark, such as reading, merging, enrichment, and loading of data from external systems to target data destinations
• Working knowledge of Big Data and/or Hadoop environments
• Experience creating CI/CD pipelines using Jenkins or similar tools
• Practiced in eXtreme Programming (XP) disciplines
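The read-merge-enrich-load flow described above can be sketched in plain Python, since PySpark itself needs a Spark runtime. The data is hypothetical and sqlite3 stands in for the target destination; in PySpark the same steps would use `spark.read.csv` and `DataFrame.join`:

```python
import csv
import io
import sqlite3

# Hypothetical source extracts from two external systems.
users_csv = "id,name\n1,Ada\n2,Grace\n"
orders_csv = "id,user_id,amount\n10,1,25.0\n11,2,40.0\n"

users = {r["id"]: r for r in csv.DictReader(io.StringIO(users_csv))}
orders = list(csv.DictReader(io.StringIO(orders_csv)))

# Merge + enrich: attach the user name to each order.
enriched = [
    {"order_id": o["id"], "name": users[o["user_id"]]["name"],
     "amount": float(o["amount"])}
    for o in orders
]

# Load into a target store (an in-memory SQLite database here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id TEXT, name TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (:order_id, :name, :amount)", enriched)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # → 65.0
```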
- Minimum 3-4 years of experience with ETL tools, SQL, SSAS & SSIS
- Good understanding of Data Governance, including Master Data Management (MDM) and Data Quality tools and processes
- Knowledge of programming languages, e.g. JSON, Python, R
- Hands-on experience of SQL database design
- Experience working with REST API
- Influencing and supporting project delivery through involvement in project/sprint planning and QA
- Working experience with Azure
- Stakeholder management
- Good communication skills
You will build Data Warehouse and Analytics solutions that aggregate data across diverse sources and data types
including text, video and audio through to live stream and IoT in an agile project delivery
environment with a focus on DataOps and Data Observability. You will work with Azure SQL
Databases, Synapse Analytics, Azure Data Factory, Azure Datalake Gen2, Azure Databricks, Azure
Machine Learning, Azure Service Bus, Azure Serverless (LogicApps, FunctionApps), Azure Data
Catalogue and Purview among other tools, gaining opportunities to learn some of the most
advanced and innovative techniques in the cloud data space.
You will be building Power BI based analytics solutions to provide actionable insights into customer
data, and to measure operational efficiencies and other key business performance metrics.
You will be involved in the development, build, deployment, and testing of customer solutions, with
responsibility for the design, implementation and documentation of the technical aspects, including
integration to ensure the solution meets customer requirements. You will be working closely with
fellow architects, engineers, analysts, and team leads and project managers to plan, build and roll
out data driven solutions
Expertise:
Proven expertise in developing data solutions with Azure SQL Server and Azure SQL Data Warehouse (now
Synapse Analytics)
Demonstrated expertise of data modelling and data warehouse methodologies and best practices.
Ability to write efficient data pipelines for ETL using Azure Data Factory or equivalent tools.
Integration of data feeds utilising both structured (e.g. XML/JSON) and flat schemas (e.g. CSV, TXT, XLSX)
across a wide range of electronic delivery mechanisms (API/SFTP/etc.)
Azure DevOps knowledge essential for CI/CD of data ingestion pipelines and integrations.
Experience with object-oriented/object function scripting languages such as Python, Java, JavaScript, C#,
Scala, etc is required.
Expertise in creating technical and architecture documentation (e.g. HLD/LLD) is a must.
Proven ability to rapidly analyse and design solution architecture in client proposals is an added advantage.
Expertise with big data tools: Hadoop, Spark, Kafka, NoSQL databases, stream-processing systems is a plus.
Essential Experience:
5 or more years of hands-on experience in a data architect role with the development of ingestion,
integration, data auditing, reporting, and testing with Azure SQL tech stack.
Full data and analytics project lifecycle experience (including costing and cost management of data
solutions) in Azure PaaS environment is essential.
Microsoft Azure and Data Certifications, at least fundamentals, are a must.
Experience using agile development methodologies, version control systems and repositories is a must.
A good, applied understanding of the end-to-end data process development life cycle.
A good working knowledge of data warehouse methodology using Azure SQL.
A good working knowledge of the Azure platform, its components, and the ability to leverage its
resources to implement solutions is a must.
Experience working in the Public sector or in an organisation servicing the Public sector is a must.
Ability to work to demanding deadlines, keep momentum and deal with conflicting priorities in an
environment undergoing a programme of transformational change.
The ability to contribute and adhere to standards, have excellent attention to detail and be strongly driven
by quality.
Desirables:
Experience with AWS or Google Cloud Platform will be an added advantage.
Experience with Azure ML services will be an added advantage.
Personal Attributes:
Articulate and clear in communications to mixed audiences: in writing, through presentations and one-to-one.
Ability to present highly technical concepts and ideas in a business-friendly language.
Ability to effectively prioritise and execute tasks in a high-pressure environment.
Calm and adaptable in the face of ambiguity and in a fast-paced, quick-changing environment
Extensive experience working in a team-oriented, collaborative environment as well as working
independently.
Comfortable with the multi-project, multi-tasking lifestyle of a consulting Data Architect
Excellent interpersonal skills with teams and building trust with clients
Ability to support and work with cross-functional teams in a dynamic environment.
A passion for achieving business transformation; the ability to energise and excite those you work with
Initiative; the ability to work flexibly in a team, working comfortably without direct supervision.
- BE in Computer Science, MCA, or equivalent
- JavaScript AI library experience (there are many!)
- A team player who can collaborate with engineers, designers, and other cross-functional
teams
- Ability to initiate and drive projects to completion with minimal guidance
- Fluent in and passionate about JavaScript
- Troubleshooting/debugging experience
- Strong communication skills
Experience:
- Minimum 1 year of experience
- No more than 7 years of experience
- Startup experience is a must.
Location: Remote, anywhere in India
Timings: 40 hours a week, with 4 hours a day overlapping with the client’s timezone. Clients are typically in the California (PST) timezone.
Position: Full time/Direct
Other Benefits
- We have great benefits such as PF, medical insurance, 12 annual company holidays, 12 PTO leaves per year, annual increments, Diwali bonus, spot bonuses, and other incentives.
- We don’t believe in locking people in with long notice periods. You will stay here because you love the company. We have only a 15-day notice period.
Responsibilities for Data Engineer
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
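One common shape for the extraction, transformation, and loading responsibilities above is a pipeline of composable streaming stages. A minimal sketch (stage names and data are hypothetical):

```python
# Each stage is a generator, so records stream through the pipeline
# one at a time instead of being held in memory all at once.
def extract(rows):
    """Parse raw comma-separated rows into (name, value) pairs."""
    for raw in rows:
        yield raw.strip().split(",")

def transform(records):
    """Normalize names and cast values to the expected types."""
    for name, value in records:
        yield {"name": name.lower(), "value": int(value)}

def load(records, sink):
    """Write the transformed records to the target store."""
    for rec in records:
        sink.append(rec)

raw_rows = ["Alpha,10", "Beta,20", "Gamma,30"]
sink = []
load(transform(extract(raw_rows)), sink)
print(sink[0])  # → {'name': 'alpha', 'value': 10}
```

At scale the same extract/transform/load decomposition maps onto SQL plus AWS big-data services, with each stage becoming an independently testable and monitorable step.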
Qualifications for Data Engineer
- Advanced working SQL knowledge, including query authoring and experience with relational databases, as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
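Workflow-management tools such as Airflow, Luigi, and Azkaban all model a pipeline as a DAG of tasks with dependencies. The core idea can be sketched with the standard library's topological sorter (the task names here are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task lists its upstream dependencies,
# much as an Airflow DAG wires tasks together.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
    "report": {"load"},
}

# A topological order is any execution order that respects dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # → ['extract', 'transform', 'validate', 'load', 'report']
```

Real schedulers add retries, backfills, and parallel execution of independent branches on top of this ordering, but dependency resolution is the foundation.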