Professional experience in Python – Mandatory experience
Basic knowledge of any BI tool (Microsoft Power BI, Tableau, etc.) and experience in R will be an added advantage
Proficient in Excel
Good verbal and written communication skills
Key Responsibilities:
Analyze data trends and provide intelligent business insights; monitor operational and business metrics
Complete ownership of the business excellence dashboard and preparation of reports for senior management stating trends, patterns, and predictions using relevant data
Review, validate, and analyze data points, and implement new data analysis methodologies
Perform data profiling to identify and understand anomalies
Perform analysis to assess the quality and meaning of data
Develop policies and procedures for the collection and analysis of data
Analyze existing processes with the help of data, and propose process changes and/or lead process re-engineering initiatives
Use BI Tools (Microsoft Power BI/Tableau) and develop and manage BI solutions
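The data-profiling responsibility above can be sketched as a simple z-score check. This is a minimal illustration, not a prescribed method; the order counts and the 2.5 threshold are made-up values, and only the standard library is used:

```python
from statistics import mean, stdev

def find_anomalies(values, threshold=2.5):
    """Flag points whose z-score exceeds the threshold."""
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Hypothetical daily order counts with one obvious outlier.
orders = [120, 118, 125, 122, 119, 121, 900, 117, 123, 120]
print(find_anomalies(orders))  # → [900]
```

In practice a profiling pass would also check null rates, type mismatches, and distribution drift, but the same "compute a statistic, flag the deviants" pattern applies.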
About Broadcast Media Production and Distribution Company
Senior Executive - Analytics
Overview of the job:
Our client is the world’s largest media investment company and part of WPP. They are a global digital transformation agency with 1,200 employees across 21 nations. Our team of experts supports clients in programmatic, social, paid search, analytics, technology, organic search, affiliate marketing, e-commerce, and traditional channels.
We are currently looking for a Sr Executive – Analytics to join us. In this role, you will have a massive opportunity to build and be a part of the largest performance marketing setup in APAC. We are committed to fostering a culture of diversity and inclusion. Our people are our strength, so we respect and nurture their individual talent and potential.
Reporting of the role - This role reports to the Director - Analytics.
3 best things about the job:
1. Responsible for data & analytics projects and developing data strategies by diving into data and extrapolating insights and providing guidance to clients
2. Build and be a part of a dynamic team
3. Being part of a global organisation with rapid growth opportunities
Responsibilities of the role:
Build Marketing-Mix and Multi-Touch Attribution models using a range of free and paid tools.
Work with large data sets via hands-on data processing to produce structured data sets for analysis.
Design and build Visualization, Dashboard and reports for both Internal and external clients using Tableau, Power BI, Datorama or R Shiny/Python.
What you will need:
Degree in Mathematics, Statistics, Economics, Engineering, Data Science, Computer Science or quantitative field.
2-3 years’ experience in Marketing/Data Analytics or related field with hands-on experience in building Marketing-Mix and Attribution models. Proficiency in one or more coding languages – preferred languages: Python, R
Proficiency in one or more Visualization Tools – Tableau, Datorama, Power BI
Proficiency in using SQL.
Proficiency with one or more statistical tools is a plus – Example: SPSS, SAS, MATLAB, Mathcad.
Working experience using big data technologies (Hive/Hadoop) is a plus
Proficiency in Linux.
Must have SQL knowledge and experience working with relational databases and query authoring (SQL), as well as familiarity with databases including MySQL, MongoDB, Cassandra, and Athena.
Must have experience with Python/Scala.
Must have experience with Big Data technologies like Apache Spark.
Must have experience with Apache Airflow.
Experience with data pipeline and ETL tools like AWS Glue.
Experience working with AWS cloud services: EC2, S3, RDS, Redshift.
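The query-authoring requirement above can be practiced locally with SQLite, which ships with Python; the `orders` schema and the rows here are invented for illustration, but the GROUP BY/ORDER BY pattern carries over to MySQL or Redshift:

```python
import sqlite3

# In-memory database with a hypothetical orders table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [("acme", 120.0), ("acme", 80.0), ("globex", 45.5)],
)

# Aggregate revenue per customer, largest first.
rows = conn.execute(
    "SELECT customer, SUM(amount) AS total FROM orders "
    "GROUP BY customer ORDER BY total DESC"
).fetchall()
print(rows)  # [('acme', 200.0), ('globex', 45.5)]
```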
Company Description
What We Do
Miratech helps visionaries to change the world. We are a global IT services and consulting company that brings together global enterprise innovation and start-up innovation. Today we support digital transformation for the largest enterprises on the planet.
By partnering with both large and small players, we stay at the leading edge of technology, remain nimble even as a global leader, and create technology that helps our clients further enhance their business. Our culture of Relentless Performance has enabled over 99% of Miratech’s engagements since our inception in 1989 to succeed by meeting or exceeding scope, schedule, and/or budget objectives.
Job Description
We are looking for a Senior Perl Developer to join our team, who will help us work on solutions and implement technologies that will improve user experience.
Responsibilities:
- Developing and maintaining large-scale Perl applications.
- Perform application modifications and enhancements based on business needs.
- Develop clean, high-quality, and reusable code based on programming standards.
- Coordinate with the Project Manager to clearly understand business requirements and expectations.
- Stay abreast of the latest trends in application development techniques and technologies.
- Suggest optimal application development solutions to meet or exceed business objectives.
- Develop best practices to ensure coding efficiency and quality.
- Analyze and resolve coding issues in a timely and accurate manner.
- Prepare and maintain coding documentation for reference purposes.
- Prioritize, plan, and handle multiple tasks effectively.
- Ensure assigned development tasks are completed within deadlines.
- Report project status to the Manager on a regular basis.
Qualifications
- 7+ years of experience in Perl development.
- Experience developing and maintaining shell scripts.
- Strong working knowledge of Perl and Unix-based systems.
- Experience developing database-driven web services/applications against SQL databases such as Oracle or MySQL.
- Detail-focused, with experience reviewing technical documentation, diagrams, and plans in order to help meet and/or define requirements.
- Able to communicate technical information verbally in a clear manner to both technical and non-technical stakeholders.
Description
If you are passionate about game development and about building logic to create immersive and engaging content, here is a great opportunity for you to be a part of a leading EdTech organization with 3000+ professionals.
Extramarks Education India is made up of a diverse talent pool of instructional designers, educational psychologists, software engineers, 2D/3D Animators, UX designers, artists and creative professionals that create engaging and immersive learning solutions in digital media.
Designation – Game Developer
Key Responsibilities/Skills :
Mandatory Skills:
- Create/update web modules (games/engines/templates/responsive modules) using HTML5/CSS3/JavaScript/jQuery.
- Translate design specifications into interactive live HTML5 modules.
- Design, build, and maintain efficient, reusable, and reliable code.
- Should be able to build logic based on academic concepts using analytical and problem-solving skills.
- Good understanding of Bootstrap, JSON/XML, CreateJS/Phaser/PixiJS/GDevelop, OOP, and design patterns.
- Knowledge of different browsers on desktop and mobile platforms is a must.
- Ability to work in an agile environment.
Desired Skills: (Any few of the following)
- Exposure to Cocos2d Creator and immersive technologies like Augmented Reality.
- Use of 2D/3D development tools/libraries.
- Graphics code optimization skills.
- Experience implementing automated testing platforms and unit tests.
- Proficient knowledge of code versioning tools such as Git and SVN.
- Understanding of the backend technology stack.
- Educational content development is a plus.
Qualifications and Experience
- Tech/MCA (Computer Science).
- 2 years’ experience in game programming and scripting is desired.
- Exceptional candidates with less experience will also be considered.
Work Timings: 4:00 PM to 11:30 PM
Full-time WFH
6+ years in Data Science
Strong experience in ML: regression, classification, anomaly detection, NLP, deep learning, predictive analytics, and predictive maintenance, with Python. Data visualization skills are an added advantage.
- Design and build web crawlers to scrape data and URLs.
- Integrate the crawled and scraped data into our databases
- Create more/better ways to crawl relevant information
- Strong knowledge of web technologies (HTML, CSS, JavaScript, XPath, regex)
- Understanding of data privacy policies (esp. GDPR) and personally identifiable information
- Develop automated and reusable routines for extracting information from various data sources
- Prepare a requirement summary and re-confirm it with the Operations team
- Translate business requirements into specific solutions
- Ability to relay technical information to non-technical users
- Demonstrate effective problem-solving and analytical skills
- Attention to detail, proactivity, critical thinking, and accuracy are essential
- Ability to work to deadlines and give realistic estimates
Skills & Expertise
- 2+ years of web scraping experience
- Experience with two or more of the following web scraping frameworks and tools: Selenium, Scrapy, Import.io, Webhose.io, ScrapingHub, ParseHub, Phantombuster, Octoparse, Puppeteer, etc.
- Basic knowledge of data engineering (database ingestion, ETL, etc.)
- Solution orientation and "can do" attitude - with a desire to tackle complex problems.
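The extraction step the frameworks above automate can be shown with the standard library alone. This sketch pulls `href` values out of a made-up HTML snippet using `html.parser`; tools like Scrapy layer crawling, scheduling, and politeness controls on top of this kind of parsing:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href attributes from anchor tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# Hypothetical listing-page fragment.
html = '<ul><li><a href="/jobs/1">Analyst</a></li><li><a href="/jobs/2">Engineer</a></li></ul>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # ['/jobs/1', '/jobs/2']
```

On real sites, the same extractor would be fed pages fetched over HTTP, with rate limiting and GDPR-aware handling of any personal data encountered.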
Come Join Qualifyde’s Interviewing world and be part of the next generation of technical interviews.
QUALIFYDE Tech nerds are a network of software professionals, including software development managers, software engineers, and freelancers covering the full technology stack.
Technical interviews with QUALIFYDE occur 24/7, with candidates located throughout the world.
Conduct interviews based on your own availability — whether that be five times a week, 10 times a week, mornings, nights, or weekends.
Utilize the QUALIFYDE Interviewing infrastructure and process to conduct fair, equitable technical interviews in a wide variety of roles.
Who can apply:
- Software developers/engineers with hands-on coding experience and experience conducting technical interviews
- Strong communication skills in English (oral & written)
- Stable, consistent internet connection
- Equipped with a working webcam
Job Description for:
Role: Data/Integration Architect
Experience – 8-10 Years
Notice Period: Under 30 days
Key Responsibilities: Designing and developing frameworks for batch and real-time jobs on Talend; leading the migration of these jobs from MuleSoft to Talend; maintaining best practices for the team; and conducting code reviews and demos.
Core Skillsets:
Talend Data Fabric - Application & API Integration, Data Integration. Knowledge of Talend Management Cloud and of the deployment and scheduling of jobs using TMC or AutoSys.
Programming Languages - Python/Java
Databases: SQL Server, Other Databases, Hadoop
Should have worked in an Agile environment
Sound communication skills
Should be open to learning new technologies based on business needs on the job
Additional Skills:
Awareness of other data/integration platforms like MuleSoft and Camel
Awareness of Hadoop, Snowflake, and S3
- Owns the end-to-end implementation of the assigned data processing components/product features, i.e. design, development, deployment, and testing of the data processing components and associated flows, conforming to best coding practices
- Creation and optimization of data engineering pipelines for analytics projects
- Support data and cloud transformation initiatives
- Contribute to our cloud strategy based on prior experience
- Independently work with all stakeholders across the organization to deliver enhanced functionalities
- Create and maintain automated ETL processes with a special focus on data flow, error recovery, and exception handling and reporting
- Gather and understand data requirements, work in the team to achieve high-quality data ingestion, and build systems that can process and transform the data
- Be able to comprehend the application of database indexes and transactions
- Be involved in the design and development of a Big Data predictive analytics SaaS-based customer data platform using object-oriented analysis, design and programming skills, and design patterns
- Implement ETL workflows for data matching, data cleansing, data integration, and management
- Maintain existing data pipelines, and develop new data pipelines using big data technologies
- Responsible for leading the effort of continuously improving the reliability, scalability, and stability of microservices and the platform
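The data matching and cleansing workflow mentioned above can be sketched as two small steps: normalize the fields used for matching, then deduplicate on the resulting key. The record shape, field names, and sample rows here are hypothetical; production flows would add fuzzy matching and survivorship rules:

```python
def cleanse(record):
    """Trim whitespace and lowercase the email used for matching."""
    return {
        "name": record["name"].strip(),
        "email": record["email"].strip().lower(),
    }

def deduplicate(records):
    """Keep the first record seen for each email match key."""
    seen, unique = set(), []
    for rec in map(cleanse, records):
        if rec["email"] not in seen:
            seen.add(rec["email"])
            unique.append(rec)
    return unique

# Hypothetical raw input with two variants of the same contact.
raw = [
    {"name": "Ada Lovelace ", "email": "ADA@example.com"},
    {"name": "Ada Lovelace", "email": "ada@example.com "},
    {"name": "Alan Turing", "email": "alan@example.com"},
]
print(deduplicate(raw))  # two unique records remain
```

In a pipeline, the cleanse step would run during transformation and the match key would typically be persisted so that incremental loads can detect previously seen entities.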