
Senior Executive - Analytics (Business Intelligence)
at World's largest Media Investment Company
Role - Senior Executive Analytics (BI)
Overview of job -
Our client is the world’s largest media investment company and is a part of WPP. In fact, we are responsible for one in every three ads you see globally. We are currently looking for a Senior Executive - Analytics to join us. In this role, you will have a massive opportunity to build and be a part of the largest performance marketing setup in APAC. We are committed to fostering a culture of diversity and inclusion. Our people are our strength, so we respect and nurture their individual talent and potential.
Reporting of the role - This role reports to the Director - Analytics.
3 best things about the job:
1. Responsible for data and analytics projects and for developing data strategies: diving into data, extrapolating insights, and providing guidance to clients
2. Build and be a part of a dynamic team
3. Being part of a global organisation with rapid growth opportunities
Responsibilities of the role:
• Design and build visualizations, dashboards, and reports for both internal and external clients using Tableau, Power BI, Datorama, or R Shiny/Python, SQL, and Alteryx.
• Work with large data sets via hands-on data processing to produce structured data sets for analysis.
• Good to have - Build Marketing-Mix and Multi-Touch Attribution models using a range of tools, both free and paid.
What you will need:
• Degree in Mathematics, Statistics, Economics, Engineering, Data Science, Computer Science, or another quantitative field.
• Proficiency in one or more coding languages – preferred languages: Python, R
• Proficiency in one or more Visualization Tools – Tableau, Datorama, Power BI
• Proficiency in using SQL.
• Minimum 3 years of experience in Marketing/Data Analytics or a related field; hands-on experience in building Marketing-Mix and Attribution models is a plus (a minimal sketch follows this list).
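For illustration only, a minimal sketch of the kind of marketing-mix modelling mentioned above, written in Python with pandas and scikit-learn. The input file and column names (weekly_performance.csv, tv_spend, search_spend, social_spend, sales) are hypothetical placeholders, not details from the role.

```python
# Minimal marketing-mix sketch: regress weekly sales on channel spend.
# File and column names are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("weekly_performance.csv")          # hypothetical input file
channels = ["tv_spend", "search_spend", "social_spend"]

model = LinearRegression()
model.fit(df[channels], df["sales"])

# Rough contribution per channel = coefficient * total spend in that channel.
contributions = {
    ch: coef * df[ch].sum()
    for ch, coef in zip(channels, model.coef_)
}
print(contributions)
```

A production model would typically add adstock/saturation transforms and holdout validation; this sketch only shows the basic shape of the workflow.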

Responsibilities:
- Involved in detailing and implementing user stories.
- Understand the technical specifications and design the solutions.
- Validate and implement the integration components of the third-party applications.
- Build scalable and fault-tolerant software solutions adhering to the organization's secured coding standards.
- Strive for 100% unit test code coverage.
- Do code quality checks and code reviews regularly to ensure safe and efficient code.
- Verify and deploy software solutions for development needs.
- Work closely with the team to deliver the sprint objectives.
- Continuously look to improve the organization's standards.
Requirements:
- A Bachelor’s / Master’s Degree in Engineering or Information Technology.
- 4-7 years of software development experience, with 2+ years of experience with the Python programming language.
- A thorough understanding of computer architecture, operating systems, and data structures.
- An in-depth understanding of the Internet, Cloud Computing & Services, and REST APIs.
- Must have experience with at least one Python framework such as Flask, FastAPI, or Django REST.
- Must know Git and Python virtual environments.
- Should have experience with the Python requests module (a minimal Flask + requests sketch follows this list).
- Must know how to use third-party libraries in Python.
- Knowledge of Python module/library creation will be an added advantage.
- Familiarity with SIEM tools such as QRadar apps, Splunk apps, and Splunk add-ons will be an advantage.
- Experience working with Linux/Unix and shell scripts.
- A meticulous and organized approach to work.
- A logical, analytical, and creative approach to problem-solving.
- A thorough, detail-oriented work style.
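As a rough illustration of two of the requirements above (a Python web framework plus the requests module), here is a minimal Flask sketch that exposes one endpoint and relays the status of an upstream call. The upstream URL is a placeholder, not a real system from this posting.

```python
# Minimal sketch: a small Flask REST endpoint wrapping an outbound call made
# with the requests module. The upstream URL is a placeholder.
import requests
from flask import Flask, jsonify

app = Flask(__name__)

@app.get("/status/<service>")
def service_status(service: str):
    # Call a hypothetical upstream health endpoint and relay the result.
    resp = requests.get(f"https://example.internal/{service}/health", timeout=5)
    return jsonify({"service": service, "upstream_status": resp.status_code})

if __name__ == "__main__":
    app.run(debug=True)
```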
Senior Data Analyst
Experience: 8+ Years
Work Mode: Remote Full Time
Responsibilities:
• Analyze large datasets to uncover trends, patterns, and insights to support business goals.
• Design, develop, and manage interactive dashboards and reports using Power BI.
• Utilize DAX and SQL for advanced data querying and data modeling.
• Create and manage complex SQL queries for data extraction, transformation, and loading processes (a minimal sketch follows at the end of this listing).
• Collaborate with cross-functional teams to understand data requirements and translate them into actionable solutions.
• Maintain data accuracy and integrity across projects, ensuring reliable data-driven insights.
• Present findings to stakeholders, translating complex data insights into simple, actionable business recommendations.
Skills:
Power BI, DAX (Data Analysis Expressions), SQL, Data Modeling, Python
Preferred Skills:
• Machine Learning: Exposure to machine learning models and their integration within analytical solutions.
• Microsoft Fabric: Familiarity with Microsoft Fabric for enhanced data integration and management.
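To illustrate the SQL-to-dashboard preparation step described in the responsibilities, here is a minimal Python sketch that runs an aggregation query and exports the result as a dataset a Power BI report could consume. The database, table, and column names (sales.db, orders, order_date, region, revenue) are assumptions for the example only.

```python
# Minimal sketch of the SQL-based data prep step described above.
# Database, table, and column names are hypothetical.
import sqlite3
import pandas as pd

conn = sqlite3.connect("sales.db")   # stand-in for the real warehouse connection

query = """
    SELECT region,
           strftime('%Y-%m', order_date) AS month,
           SUM(revenue)                  AS total_revenue
    FROM orders
    GROUP BY region, month
    ORDER BY month, region
"""

# The resulting frame could be exported as a dataset for a Power BI report.
monthly_revenue = pd.read_sql(query, conn)
monthly_revenue.to_csv("monthly_revenue_by_region.csv", index=False)
```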
Their services are available across the globe, with over 65% of their client base coming from the US, UK, and Canada. The company's primary focus is on Ayurveda: taking this ancient knowledge to anyone who wishes to bring balance back to their health and apply its tools in everyday life.
What you will do:
- Developing, coding and assisting the Email Marketing team with email campaigns
- Configuring multifaceted content elements to build personalized emails at scale
- Ensuring quality and consistency are enforced at every step and across all assets, devices, and inboxes
- Staying up to date with trends in email development and continuing to study and improve skills
- Thinking and creating for conversion optimization (training will be given in-house)
- Building landing pages using HTML/CSS/JavaScript (front end)
What you need to have:
- Demonstrable HTML design skills and an interest or love for email marketing
- Willingness to broaden your expertise, and go the extra mile
- A deep understanding of not just creating a beautiful email but also thinking from a conversion point of view: which aspects of the email will encourage a reader to click, admire its design, and want to open more emails from our brand
- Experience working with email code
- Advanced with HTML, CSS and responsive design
- Comfortable with HTML for email
- Have demonstrable coding, proofing, testing and troubleshooting experience
- Proficiency in Adobe Creative Suite and/or Sketch
- Comfortable translating design file into functional HTML
- An eye for design and a passion for building UX best practices into every project is a plus
- Ability to professionally interact with key stakeholders throughout the development process
- A knack for the English language and comfort working in an English-speaking organization
- Deploy the company application on customers' public clouds and on-premise data centers
- Building Kubernetes-based workflows for a wide variety of use cases
- Document and automate the deployment process for internal and external deployments (a minimal scripted-deployment sketch follows this section)
- Interacting with customers over calls for deployment and debugging
- Deployment and Product Support
Desired Skills and Experience
- 4-6 years of experience in infrastructure development, or development and operations.
- At least 2 years of experience with Docker and Kubernetes.
- Experience working with Docker and Kubernetes, including awareness of Kubernetes internals and networking. Experience with Linux infrastructure tools.
- Good interpersonal skills and communication with all levels of management.
- Extensive experience in setting up Kubernetes on AWS, Azure, etc.
Good to Have
- Familiarity with Big Data Tools like Hadoop, Spark.
- Experience with Java Application Debugging.
- Experience with monitoring tools like Prometheus, Grafana, etc.
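As a hedged illustration of the scripted-deployment work described above, the sketch below uses the official Kubernetes Python client to create a small Deployment. The image name, labels, and namespace are placeholders, and a real setup would add error handling, configuration, and idempotency.

```python
# Minimal sketch of a scripted deployment step using the official Kubernetes
# Python client. Image, namespace, and names are placeholders.
from kubernetes import client, config

config.load_kube_config()                     # or load_incluster_config() inside a pod
apps_v1 = client.AppsV1Api()

container = client.V1Container(
    name="app",
    image="registry.example.com/app:1.0.0",   # placeholder image
    ports=[client.V1ContainerPort(container_port=8080)],
)

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="app"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "app"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "app"}),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)

apps_v1.create_namespaced_deployment(namespace="default", body=deployment)
```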
SpringML is looking to hire a top-notch Data Engineer who is passionate about working with data and using the latest distributed frameworks to process large datasets. As a Data Engineer, your primary role will be to design and build data pipelines. You will be focused on helping client projects with data integration, data prep, and implementing machine learning on datasets. In this role, you will work on some of the latest technologies, collaborate with partners on early wins, take a consultative approach with clients, interact daily with executive leadership, and help build a great company. Chosen team members will be part of the core team and play a critical role in scaling up our emerging practice.
Title - Data Engineers
Location - Hyderabad
Work Timings - 10:00 AM to 6:00 PM
RESPONSIBILITIES:
Ability to work as a member of a team assigned to design and implement data integration solutions.
Build data pipelines using standard frameworks such as Hadoop, Apache Beam, and other open-source solutions (a minimal Beam sketch follows this section).
Learn quickly – ability to understand and rapidly comprehend new areas – functional and technical – and apply detailed and critical thinking to customer solutions.
Propose design solutions and recommend best practices for large-scale data analysis.
SKILLS:
B.Tech degree in Computer Science, Mathematics, or another relevant field.
Good Programming skills – experience and expertise in one of the following: Java, Python, Scala, C.
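For illustration, a minimal Apache Beam (Python SDK) batch pipeline of the kind referenced in the responsibilities. The input and output paths, the CSV layout, and the "purchase" event filter are assumptions made up for this sketch; it runs locally on the DirectRunner.

```python
# Minimal Apache Beam sketch: count purchase events per user from a CSV file.
# Paths and CSV layout (user_id,event_type,...) are hypothetical.
import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Read"   >> beam.io.ReadFromText("input/events.csv")
        | "Parse"  >> beam.Map(lambda line: line.split(","))
        | "Filter" >> beam.Filter(lambda fields: len(fields) > 1 and fields[1] == "purchase")
        | "Key"    >> beam.Map(lambda fields: (fields[0], 1))
        | "Count"  >> beam.CombinePerKey(sum)
        | "Format" >> beam.Map(lambda kv: f"{kv[0]},{kv[1]}")
        | "Write"  >> beam.io.WriteToText("output/purchases_per_user")
    )
```

The same pipeline could be pointed at a distributed runner (for example Dataflow or Spark) without changing the transform logic.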

- Expert knowledge of IBM DB2 V11.5 installations, configurations & administration in Linux systems.
- Expert level knowledge in Database restores including redirected restore & backup concepts.
- Excellent understanding of database performance monitoring techniques and fine-tuning; able to perform performance checks and query optimization.
- Good knowledge of utilities like import, load & export under high volume conditions.
- Ability to tune SQLs using db2advisor & db2explain.
- Ability to troubleshoot database issues using db2diag, db2pd, db2dart, db2top, etc.
- Administration of database objects.
- Capability to review & assess features or upgrades to existing components.
- Experience in validating security aspects on a confidential database.
- Hands-on experience in SSL communication setup, strong access control, and database hardening.
- Experience in performing productive DB recovery and validating crash recovery.
- Experience in handling incidents & opening DB2 support tickets.
- Experience in deploying a special build DB2 version from DB2 support.
- Experience working in environments such as 12x5 support of production database services.
- Excellent problem-solving and analytical skills.
Good to have:
- Experience in handling application servers (WebSphere, WebLogic, Jboss, etc.) in highly available Production environments.
- Experience in maintenance, patching, and installing updates on WebSphere Servers in the Production environment.
- Able to handle installation/deployment of the product (JAR/EAR/WAR) independently.
- Knowledge of ITIL concepts (Service operation & transition)
Soft skills:
- Ability to work with the global team (co-located staffing).
- A learning attitude; should be able to work as an individual contributor and must have excellent communication skills.
- Support: 12/5 support is required (on a rotational basis).
Looking for a Data Engineer for our own organization.
Notice Period - 15-30 days
CTC - up to 15 LPA
Preferred Technical Expertise
- Expertise in Python programming.
- Proficient in Pandas/Numpy Libraries.
- Experience with Django framework and API Development.
- Proficient in writing complex queries using SQL.
- Hands-on experience with Apache Airflow (a minimal DAG sketch follows this section).
- Experience with source code versioning tools such as Git, Bitbucket, etc.
Good to have Skills:
- Experience creating and maintaining optimal data pipeline architectures.
- Experienced in handling large structured data.
- Demonstrated ability to deliver solutions covering data ingestion, data cleansing, ETL, data mart creation, and exposing data to consumers.
- Experience with any cloud platform (GCP is a plus)
- Experience with jQuery, HTML, JavaScript, and CSS is a plus.
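As a small illustration of the Airflow experience listed above, here is a minimal DAG sketch with two placeholder tasks. The DAG id, schedule, and the extract/load functions are hypothetical and would be replaced by real pipeline logic.

```python
# Minimal Apache Airflow sketch: a daily two-step pipeline.
# DAG id, schedule, and task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source system")    # placeholder step

def load():
    print("write cleaned data to the data mart")  # placeholder step

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```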

- 5+ years of experience
- Previously managed a team
- Built & managed backend systems from scratch previously.
- Lead the core backend engineering team, including mentoring, coaching, architecture, devops, code review and software infrastructure
- Should be more proficient with Django.
- Work with the executive team to shape and execute the product roadmap
- Direct our team environment to achieve sprint deliverables and acceptance criteria
- Design and monitor metrics to assess product and infrastructure performance
- Hire, build and shape the team







