
Role - Senior Analytics Executive
Experience - 1-2 years
Location - Open (Remote working option available)
About the Company:
Our team is made up of best-in-class digital, offline, and integrated media experts who work together to enhance media's contribution to Google's business. Our team operates in a seamlessly integrated way across strategy, planning, investment, creative, business sciences and analytics, data and technology. The people who work here are world class, committed to establishing a new high-water mark in the media industry.
About the role/ Some of the things we'd like you to do:
- Support the Analytics team and other stakeholders with analytics agendas impacting campaigns, measurement frameworks, and campaign optimization.
- Conduct thorough data analysis using various tools and software within MFG to provide insights and recommendations that align with client needs.
- Collaborate with internal stakeholders across disciplines and countries to understand client objectives, providing support and expertise in data and analytics while identifying opportunities for advanced analytics solutions.
- Formulate strategic recommendations with the team based on gathered insights, and support the delivery of reporting for clients.
- Continuously improve project performance and processes by collaborating with the team to develop new tools and solutions.
About yourself/ Requirements:
- Bachelor's degree in a related quantitative field (e.g. Statistics, Business Analytics, Economics, Computer Science, etc.)
- 1-2 years of relevant work experience in data analysis; digital media experience desired
- Strong knowledge of various data analysis tools and software (e.g., Excel, SQL, R, Python, Tableau).
- Proficient in statistical principles and their application to day-to-day work.
- Excellent problem-solving skills and the ability to analyze complex data sets.
- Strong communication and interpersonal skills, with the ability to present data-driven insights to both technical and non-technical audiences.
- Ability to work independently and as part of a team, with strong collaboration skills.
- Demonstrated ability to manage multiple projects and prioritize tasks effectively.
- Passion for continuous learning and staying current with industry trends and best practices in analytics.

Responsibilities
- Deliver full-cycle Tableau development projects, from business needs assessment and data discovery, through solution design, to delivery to the client.
- Enable our clients and ourselves to answer questions and develop data-driven insights through Tableau.
- Provide technical leadership and support across all aspects of Tableau development and use, from data specification development, through DataMart development, to supporting end-user dashboards and reports.
- Administer Tableau Server: create sites, add and remove users, and provision the appropriate level of access for each user (a minimal sketch follows this list).
- Strategize and ideate the solution design. Develop UI mock-ups, storyboards, flow diagrams, conceptual diagrams, wireframes, visual mockups, and interactive prototypes.
- Develop best-practice guidelines for Tableau data processing and visualization, and use them to quickly deliver functionality across the client base and internal users.
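For the Tableau Server administration item above, here is a minimal sketch of scripted user management, assuming the tableauserverclient (TSC) Python package and a personal access token; the server URL, token name, site, and user details are illustrative placeholders, not details from this posting.

import tableauserverclient as TSC

# Placeholder credentials; a real deployment would read these from a secret store.
auth = TSC.PersonalAccessTokenAuth(
    token_name="admin-token",
    personal_access_token="<secret>",
    site_id="analytics",
)
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Add a user with a given site role.
    new_user = TSC.UserItem(name="jdoe@example.com", site_role="Explorer")
    new_user = server.users.add(new_user)

    # List current users on the site to verify the change.
    all_users, _ = server.users.get()
    for user in all_users:
        print(user.name, user.site_role)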
Qualifications
- Degree in a highly relevant analytical or technical field, such as statistics, data science, or business analytics.
- 5+ years as a Tableau developer and administrator.
- Extensive experience with large data sets, statistical analysis, and visualization, as well as hands-on experience with tools (SQL, Tableau, Power BI).
- Ability to learn quickly and take responsibility for delivery.
Title: Data Engineer – Snowflake
Location: Mysore (Hybrid model)
Experience: 2-8 yrs
Type: Full Time
Walk-in date: 25th Jan 2023 @Mysore
Job Role: We are looking for an experienced Snowflake developer to join our team as a Data Engineer, working as part of a team to design and develop data-driven solutions that deliver insights to the business. The ideal candidate is a data pipeline builder and data wrangler who enjoys building data-driven systems from the ground up to drive analytical solutions. You will be responsible for building and optimizing our data pipelines and for automating production jobs, and you will support our software developers, database architects, data analysts, and data scientists on data initiatives.
Key Roles & Responsibilities:
- Use advanced Snowflake, Python, and SQL to extract data from source systems for ingestion into a data pipeline (see the sketch after this list).
- Design, develop and deploy scalable and efficient data pipelines.
- Analyze and assemble large, complex datasets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements. For example: automating manual processes, optimizing data delivery, re-designing data platform infrastructure for greater scalability.
- Build required infrastructure for optimal extraction, loading, and transformation (ELT) of data from various data sources using AWS and Snowflake leveraging Python or SQL technologies.
- Monitor cloud-based systems and components for availability, performance, reliability, security and efficiency
- Create and configure appropriate cloud resources to meet the needs of the end users.
- As needed, document topology, processes, and solution architecture.
- Share your passion for staying on top of tech trends, experimenting with and learning new technologies
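As a hedged illustration of the extraction item above, the sketch below pulls recent rows from a source table into a pandas DataFrame. It assumes the snowflake-connector-python package with its pandas extra; the account, credentials, warehouse, and table names are placeholders.

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder account identifier
    user="etl_user",           # placeholder user
    password="<secret>",       # placeholder secret
    warehouse="ETL_WH",
    database="RAW",
    schema="SALES",
)
try:
    cur = conn.cursor()
    # Pull the last day of orders for downstream transformation.
    cur.execute(
        "SELECT order_id, customer_id, amount, order_ts "
        "FROM orders "
        "WHERE order_ts >= DATEADD(day, -1, CURRENT_TIMESTAMP())"
    )
    df = cur.fetch_pandas_all()  # requires snowflake-connector-python[pandas]
    print(df.shape)
finally:
    conn.close()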
Qualifications & Experience Requirements:
- Bachelor's degree in computer science, computer engineering, or a related field.
- 2-8 years of experience working with Snowflake
- 2+ years of experience with the AWS services.
- Candidate should be able to write stored procedures and functions in Snowflake (a minimal sketch follows this list).
- At least 2 years' experience as a Snowflake developer.
- Strong SQL knowledge.
- Data ingestion into Snowflake using Snowflake procedures.
- ETL experience is a must (any tool).
- Candidate should be familiar with Snowflake architecture.
- Experience working on migration projects.
- Data warehousing concepts (optional).
- Experience with cloud data storage and compute components, including Lambda functions, EC2 instances, and containers.
- Experience with data pipeline and workflow management tools: Airflow, etc.
- Experience cleaning, testing, and evaluating data quality from a wide variety of ingestible data sources
- Experience working with Linux and UNIX environments.
- Experience with profiling data, with and without data definition documentation
- Familiar with Git
- Familiar with issue tracking systems like JIRA (Project Management Tool) or Trello.
- Experience working in an agile environment.
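For the stored-procedure qualification above, here is a minimal sketch of creating and calling a Snowflake Scripting procedure through the Python connector; conn is an open connection as in the earlier sketch, and the schema and table names are placeholders.

create_proc = """
CREATE OR REPLACE PROCEDURE load_daily_orders()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
    INSERT INTO curated.orders
        SELECT * FROM raw.orders
        WHERE order_ts >= DATEADD(day, -1, CURRENT_TIMESTAMP());
    RETURN 'loaded';
END;
$$
"""

cur = conn.cursor()
cur.execute(create_proc)                  # register the procedure
cur.execute("CALL load_daily_orders()")   # run it
print(cur.fetchone())                     # -> ('loaded',)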
Desired Skills:
- Experience in Snowflake. Must be willing to be Snowflake certified in the first 3 months of employment.
- Experience with a stream-processing system: Snowpipe
- Working knowledge of AWS or Azure
- Experience in migrating from on-prem to cloud systems
Position: ETL Developer
Location: Mumbai
Exp.Level: 4+ Yrs
Required Skills:
* Strong scripting knowledge in Python and shell (Unix / K-shell)
* Strong relational database skills, especially with DB2/Sybase
* Ability to create high-quality, optimized stored procedures and queries (a hedged sketch follows this list)
* Strong knowledge of relational database performance and tuning, such as proper use of indices, database statistics/reorgs, and de-normalization concepts
* Familiarity with the lifecycle of a trade and the flows of data in an investment banking operation is a plus
* Experienced in Agile development process
* Java Knowledge is a big plus but not essential
* Experience in delivery of metrics / reporting in an enterprise environment (e.g. demonstrated experience in BI tools such as Business Objects, Tableau, report design & delivery) is a plus
* Experience with ETL processes and tools such as Informatica is a plus; real-time message processing experience is a big plus
* Good team player; Integrity & ownership
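As a hedged sketch of the stored-procedure and query-optimization skills above, the snippet below shows a parameterized, index-friendly query in generic DB-API 2.0 style. get_connection is a hypothetical factory standing in for a DB2/Sybase driver connection, and the table and column names are illustrative.

def fetch_trades_by_date(get_connection, trade_date):
    conn = get_connection()  # hypothetical helper returning a DB-API connection
    try:
        cur = conn.cursor()
        # A sargable predicate (no function wrapped around trade_date) lets the
        # database use an index on trade_date instead of scanning the table.
        cur.execute(
            "SELECT trade_id, instrument, quantity, price "
            "FROM trades WHERE trade_date = ?",
            (trade_date,),
        )
        return cur.fetchall()
    finally:
        conn.close()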
Senior Data Scientist
Your goal: To improve the education process and the student experience through data.
The organization: Data Science for Learning Services. Data Science and Machine Learning are core to Chegg. As a Student Hub, we want to ensure that students discover the full breadth of learning solutions we have to offer to get full value on their learning time with us. To create the most relevant and engaging interactions, we are solving a multitude of machine learning problems so that we can better model student behavior, link various types of content, optimize workflows, and provide a personalized experience.
The Role: Senior Data Scientist
As a Senior Data Scientist, you will focus on conducting research and development in NLP and ML. You will be responsible for writing production-quality code for data product solutions at Chegg. You will lead the identification and implementation of key projects for data processing and knowledge discovery.
Responsibilities:
• Translate product requirements into AIML/NLP solutions
• Be able to think out of the box and be able to design novel solutions for the problem at hand
• Write production-quality code
• Be able to design data and annotation collection strategies
• Identify key evaluation metrics and release requirements for data products
• Integrate new data and design workflows
• Innovate, share, and educate team members and community
Requirements:
• Working experience in machine learning, NLP, recommendation systems, experimentation, or related fields, with a specialization in NLP
• Working experience with large language models that cater to multiple tasks such as text generation, Q&A, summarization, translation, etc. is highly preferred
• Knowledge of MLOps and deployment pipelines is a must
• Expertise in supervised, unsupervised, and reinforcement ML algorithms
• Strong programming skills in Python
• Strong data wrangling skills using SQL or NoSQL queries
• Experience using containers to deploy real-time prediction services (see the sketch after this list)
• Passion for using technology to help students
• Excellent communication skills
• Good team player and a self-starter
• Outstanding analytical and problem-solving skills
• Experience working with ML pipeline products such as AWS Sagemaker, Google ML, or Databricks a plus.
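For the containerized real-time prediction item above, here is a minimal sketch of a prediction service, assuming FastAPI, uvicorn, and a scikit-learn model serialized with joblib; the model path and request schema are placeholders.

import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # placeholder path baked into the container image

class PredictRequest(BaseModel):
    features: list[float]

@app.post("/predict")
def predict(req: PredictRequest):
    # Single-row prediction; sklearn expects a 2-D array of samples.
    label = model.predict([req.features])[0]
    return {"prediction": str(label)}

# Typical container entry point: uvicorn main:app --host 0.0.0.0 --port 8000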
Why do we exist?
Students are working harder than ever before to stabilize their future. Our recent research study, State of the Student, shows that nearly 3 out of 4 students are working to support themselves through college and 1 in 3 students feel pressure to spend more than they can afford. We founded our business on providing affordable textbook rental options to address these issues. Since then, we've expanded our offerings to supplement many facets of higher educational learning through Chegg Study, Chegg Math, Chegg Writing, Chegg Internships, Thinkful Online Learning, and more, to support students beyond their college experience. These offerings lower financial concerns for students by modernizing their learning experience. We exist so students everywhere have a smarter, faster, more affordable way to student.
Video Shorts
Life at Chegg: https://jobs.chegg.com/Video-Shorts-Chegg-Services
Certified Great Place to Work!: http://reviews.greatplacetowork.com/chegg
Chegg India: http://www.cheggindia.com/
Chegg Israel: http://insider.geektime.co.il/organizations/chegg
Thinkful (a Chegg Online Learning Service): https://www.thinkful.com/about/#careers
Chegg out our culture and benefits!
http://www.chegg.com/jobs/benefits
https://www.youtube.com/watch?v=YYHnkwiD7Oo
Chegg is an equal-opportunity employer
JOB SUMMARY: The Senior Associate supports the Data Analytics Manager by proposing relevant analytics procedures/tools, executing the analytics and also developing visualization outputs for audits, continuous monitoring/auditing and IA initiatives. The individual’s responsibilities include -
Understanding audit and/or project objectives and assisting the manager in preparing the plan and timelines.
Working with the Process/BU/IA teams for gathering requirements for continuous monitoring/auditing projects.
Working with Internal audit project teams to understand the analytics requirements for audit engagements.
Independently building pilots/prototypes, determining the appropriate visual tools, and designing views to meet project objectives.
Demonstrating proficiency in data management and data mining.
Applying strong skills in visualization tools like QlikView, Qlik Sense, Power BI, Tableau, Alteryx, etc.
Working with Data Analytics Manager to develop analytics program aligned to the overall audit plan.
Showcasing analytics capability to Process management teams to increase adoption of continuous monitoring.
Establishing and maintaining relationships with all key stakeholders of internal audit.
Coaching other data analysts on analytics procedures, coding and tools.
Taking a significant and active role in developing and driving Internal Audit Data Analytics quality and knowledge sharing to enhance the value provided to Internal Audit stakeholders.
Ensuring timely and accurate time tracking.
Continuously focusing on self-development by attending training sessions and seminars and acquiring relevant certifications.
Job Title: Analyst / Sr. Analyst – Data Science Developer - Python
Experience: 2 to 5 yrs
Location: Bangalore / Hyderabad / Chennai
Notice Period: Candidate should join within 2 months (max); immediate joiners preferred.
About the role:
We are looking for an Analyst / Senior Analyst who works in the analytics domain with a strong python background.
Desired Skills, Competencies & Experience:
• 2-4 years of experience working in the analytics domain with a strong Python background.
• Visualization skills in Python with plotly, matplotlib, seaborn, etc., and the ability to create customized plots using such tools (a short example follows).
• Ability to write effective, scalable, and modular code; should be able to understand, test, and debug existing Python project modules quickly and contribute to them.
• Should be familiar with Git workflows.
Good to Have:
• Familiarity with cloud platforms like AWS, AzureML, Databricks, GCP, etc.
• Understanding of shell scripting and Python package development.
• Experience with Python data science packages like pandas, numpy, sklearn, etc.
• ML model building and evaluation experience using sklearn.
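As a short example of the customized-plot skill above, the sketch below builds a styled matplotlib chart from a pandas DataFrame; the data is synthetic and the styling choices are illustrative.

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "month": pd.date_range("2023-01-01", periods=12, freq="MS"),
    "revenue": rng.normal(100, 10, 12).cumsum(),
})

fig, ax = plt.subplots(figsize=(8, 4))
ax.plot(df["month"], df["revenue"], marker="o", color="#2a6f97")
ax.set_title("Cumulative revenue (synthetic data)")
ax.set_xlabel("Month")
ax.set_ylabel("Revenue")
ax.grid(alpha=0.3)
fig.autofmt_xdate()  # tilt date labels for readability
plt.tight_layout()
plt.show()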
Datametica is Hiring for Datastage Developer
- Must have 3 to 8 years of experience in ETL Design and Development using IBM Datastage Components.
- Should have extensive knowledge in Unix shell scripting.
- Understanding of DW principles (Fact, Dimension tables, Dimensional Modelling and Data warehousing concepts).
- Research, develop, document, and modify ETL processes per data architecture and modeling requirements.
- Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
- Should be strong in writing complex SQL queries (a hedged example follows this list).
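As a hedged example of the complex-SQL item above, the snippet below runs a typical fact/dimension (star-schema) aggregation from Python; cur is any open DB-API cursor, and the table names are placeholders.

query = """
SELECT d.calendar_month,
       p.product_category,
       SUM(f.sales_amount) AS total_sales
FROM   fact_sales  f
JOIN   dim_date    d ON f.date_key    = d.date_key
JOIN   dim_product p ON f.product_key = p.product_key
GROUP BY d.calendar_month, p.product_category
ORDER BY d.calendar_month
"""
cur.execute(query)
for row in cur.fetchall():
    print(row)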
About Us!
A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data/workloads/ETL/analytics to the cloud, leveraging automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum systems, along with ETL tools like Informatica, DataStage, Ab Initio, and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes, and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in achieving our success.
Benefits we Provide!
Working with highly technical, passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
In this role, you will be responsible for developing Tableau reports, writing effective and scalable code, and improving the functionality of existing reports/systems.
· Design stable, scalable code.
· Identify potential improvements to the current design/processes.
· Participate in multiple project discussions as a senior member of the team.
· Serve as a coach/mentor for junior developers.
Minimum Qualifications
· 3 - 8 Years of experience
· Excellent written and verbal communication skills
Must have skills
· Meaningful work experience
· Extensive work with the BI reporting tool Tableau, developing reports to fulfill end-user requirements.
· Experienced in interacting with business users to analyze business processes and requirements and translate them into visualizations and reports.
· Must have knowledge of selecting appropriate data visualization strategies (e.g., chart types) for specific use cases. Ability to showcase complete dashboard implementations that demonstrate visual best practices (e.g., color themes, visualization layout, interactivity, drill-down capabilities, filtering, etc.).
· Should be able to work independently and have experience working with senior leaders.
· Able to explore options and suggest new solutions and visualization techniques to the customer.
· Experience crafting joins and custom SQL, and blending data from different data sources, using Tableau Desktop (a hedged pandas illustration follows this list).
· Building sophisticated calculations in Tableau Desktop (Aggregate, Date, Logical, String, Table, and LOD expressions).
· Working with relational data sources (like Oracle / SQL Server / DB2) and flat files.
· Optimizing user queries and dashboard performance.
· Knowledge in SQL, PL/SQL.
· Knowledge in crafting DB views and materialized views.
· Excellent verbal and written communication skills and interpersonal skills are required.
· Excellent documentation and presentation skills; should be able to build business process mapping documents and functional solution documents, and own the acceptance/sign-off process end to end.
· Ability to make the right graph choices, use the data blending feature, and connect to several DB technologies.
· Must stay up to date on new and upcoming visualization technologies.
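Tableau's blending and calculations live inside Tableau Desktop, so as a hedged stand-in the sketch below shows an equivalent blend of a relational source and a flat file on a shared key using pandas and SQLAlchemy; the connection string, file path, and column names are placeholders.

import pandas as pd
from sqlalchemy import create_engine

# Placeholder Oracle connection string; any relational source works the same way.
engine = create_engine("oracle+oracledb://user:pwd@host:1521/?service_name=ORCL")
orders = pd.read_sql("SELECT order_id, region_id, amount FROM orders", engine)
regions = pd.read_csv("regions.csv")  # flat file with region_id, region_name

# A left join mirrors blending a primary source with a secondary source on region_id.
blended = orders.merge(regions, on="region_id", how="left")
print(blended.groupby("region_name")["amount"].sum())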
Pref location: Chennai (priority)/ Bengaluru
The candidate,
1. Must have very good hands-on technical experience of 3+ years with Java or Python
2. Working experience and a good understanding of AWS Cloud; advanced experience with IAM policy and role management (see the sketch after this list)
3. Infrastructure Operations: 5+ years supporting systems infrastructure operations, upgrades, deployments using Terraform, and monitoring
4. Hadoop: Experience with Hadoop (Hive, Spark, Sqoop) and / or AWS EMR
5. Knowledge of PostgreSQL/MySQL/DynamoDB backend operations
6. DevOps: Experience with DevOps automation - Orchestration/Configuration Management and CI/CD tools (Jenkins)
7. Version Control: Working experience with one or more version control platforms like GitHub or GitLab
8. Knowledge of AWS QuickSight reporting
9. Monitoring: Hands-on experience with monitoring tools such as AWS CloudWatch, AWS CloudTrail, Datadog, and Elasticsearch
10. Networking: Working knowledge of TCP/IP networking, SMTP, HTTP, load-balancers (ELB) and high availability architecture
11. Security: Experience implementing role-based security, including AD integration, security policies, and auditing in a Linux/Hadoop/AWS environment. Familiar with penetration testing and scan tools for remediation of security vulnerabilities.
12. Demonstrated successful experience learning new technologies quickly
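For the IAM policy and role management item above, here is a minimal boto3 sketch that creates a policy, creates a role with a trust relationship, and attaches the policy to the role; the policy content, names, and bucket are placeholders.

import json
import boto3

iam = boto3.client("iam")

# Placeholder permissions policy: read-only access to one bucket.
policy_doc = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject"],
        "Resource": ["arn:aws:s3:::example-bucket/*"],
    }],
}
policy = iam.create_policy(
    PolicyName="ReadExampleBucket",
    PolicyDocument=json.dumps(policy_doc),
)

# Trust policy allowing EC2 instances to assume the role.
trust_doc = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "ec2.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}
iam.create_role(
    RoleName="ExampleReaderRole",
    AssumeRolePolicyDocument=json.dumps(trust_doc),
)
iam.attach_role_policy(
    RoleName="ExampleReaderRole",
    PolicyArn=policy["Policy"]["Arn"],
)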
WHAT WILL BE THE ROLES AND RESPONSIBILITIES?
1. Create procedures/run books for operational and security aspects of AWS platform
2. Improve AWS infrastructure by developing and enhancing automation methods
3. Provide advanced business and engineering support services to end users
4. Lead other admins and platform engineers through design and implementation decisions to achieve balance between strategic design and tactical needs
5. Research and deploy new tools and frameworks to build a sustainable big data platform
6. Assist with creating programs for training and onboarding for new end users
7. Lead Agile/Kanban workflows and team process work
8. Troubleshoot issues to resolve problems
9. Provide status updates to Operations product owner and stakeholders
10. Track all details in the issue tracking system (JIRA)
11. Provide issue review and triage problems for new service/support requests
12. Use DevOps automation tools, including Jenkins build jobs
13. Fulfill ad-hoc data and report requests from different functional groups

