WHAT WILL YOU DO

GOALS
- Partner with the client in their desire to create best-in-class data & analytics work to support the decisions they make and the work they do.
- Partner with and support global/regional teams on understanding opportunities to use data & analytics in their deliverables for clients.
- Create a data- and insight-led culture across the team.

KEY TASKS
- Gather requirements and evaluate clients' business situations in order to implement appropriate analytic solutions.
- Design, generate, and manage reporting frameworks that provide insight into the performance of clients' marketing activities across multiple channels.
- Be the single point of contact for anything data & analytics related on the account.
- Create and review QA plans for quality assurance of deliverables and responses provided by stakeholders.
- Work with the development team in an Agile development environment, as required.
- Prioritize tasks and proactively manage workload to ensure all deadlines are met (or exceeded) with accuracy.
- Keep abreast of developments in, and answer questions on, data visualization and presentation, media/research/reporting tools and systems; educate the team on the same.
- Contribute actively to project planning and scheduling.
- Create and maintain project-specific documents such as process, quality, and learning documents.
- Lead project delivery, manage clients proficiently, and communicate organizational goals to the team.
- Mentor the team on delivery and career path, and engage with clients to understand their needs and delight them.

MUST HAVE SKILLS
- 6 to 8 years of experience in data management, ETL/BI, or marketing analytics.
- Data extraction and data manipulation: work with large databases to meet the data needs of the project and play an SME role, ensuring understanding of end-to-end data processes, systems, and architecture.
- Hands-on experience in SQL is mandatory.
- Reporting and data analysis: able to code, test, and execute data extraction and loading for different sources.
- Design reports from scratch and use BI tools like Tableau, Datorama, Alteryx, or Qlikview for efficient report generation.
- Support the business on ad-hoc queries, ensuring quality within timelines.
- Experience leading a team of 4-7 members, with a focus on coaching the team on domain and technology, managing their performance, and providing guidance for their careers.
- Maintain positive client and vendor relationships.
- Excellent project and resource management skills.
- Strong written and verbal communication skills.
- Strong problem-solving and troubleshooting skills.
- Enthusiasm for the industry, knowledge of industry issues, and eagerness to learn.
- Able to work successfully with teams, handling multiple projects and meeting tight deadlines.
- Strong analytical skills with superior attention to detail.
- Strong presentation skills.

NICE TO HAVE SKILLS
- Good understanding of the media/advertising domain.
- Knowledge of a scripting language like Python, SQL, or R is preferred.
- Familiarity with data platforms like DoubleClick Campaign Manager, DV360, SA360, MOAT, IAS, Facebook Business Manager, Twitter, Innovid, Sizmek, Kenshoo, Nielsen, Kantar, MediaMath, Prisma, AppNexus.
- Strong project management and administrative skills.

WHAT YOU CAN EXPECT FROM US
At , we work with clients to develop data-driven marketing strategies, powered by a connected system of technology, tools, consultants, and digital activation. Drawing from the vast resources of our parent company Omnicom and spanning the globe in a variety of industry verticals, we are the data and analytics experts for dozens of Fortune 500 companies.
Omnicom is an equal rights employer, and in the India team we are proud of our diversity. We actively look for and welcome new team members from different backgrounds.
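As a small illustration of the data extraction and loading work described above, the sketch below moves raw campaign metrics from a source database into a reporting table. The table names, columns, and the CTR metric are hypothetical examples, and sqlite3 stands in for whichever database the account actually uses; treat this as a minimal sketch, not a prescribed implementation.

```python
import sqlite3

def extract_transform_load(src_conn, dst_conn):
    """Minimal ETL step: read raw campaign rows, derive CTR, load a reporting table."""
    # Extract: pull the raw metrics from the source connection.
    rows = src_conn.execute(
        "SELECT campaign, impressions, clicks FROM raw_metrics"
    ).fetchall()
    # Transform: compute click-through rate, guarding against divide-by-zero.
    report = [
        (campaign, impressions, clicks,
         round(clicks / impressions, 4) if impressions else 0.0)
        for campaign, impressions, clicks in rows
    ]
    # Load: create the reporting table if needed and bulk-insert the rows.
    dst_conn.execute(
        "CREATE TABLE IF NOT EXISTS campaign_report "
        "(campaign TEXT, impressions INTEGER, clicks INTEGER, ctr REAL)"
    )
    dst_conn.executemany(
        "INSERT INTO campaign_report VALUES (?, ?, ?, ?)", report
    )
    dst_conn.commit()
    return len(report)
```

In practice the same extract/transform/load shape holds whether the target is a warehouse table feeding Tableau or an ad-hoc report; only the connection objects and SQL dialect change.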
Key skills: Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake

Job Description
- Minimum of 15 years of experience with Informatica ETL and database technologies.
- Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake.
- Exposure to change data capture technology.
- Lead and guide development of an Informatica-based ETL architecture.
- Develop solutions in a highly demanding environment and provide hands-on guidance to other team members.
- Handle complex ETL requirements and design.
- Implement an Informatica-based ETL solution fulfilling stringent performance requirements.
- Collaborate with product development teams and senior designers to develop architectural requirements for the solution.
- Assess requirements for completeness and accuracy, and determine whether they are actionable for the ETL team.
- Conduct impact assessments and determine the size of effort based on requirements.
- Develop full SDLC project plans to implement the ETL solution and identify resource requirements.
- Play an active, leading role in shaping and enhancing the overall Informatica ETL architecture; identify, recommend, and implement ETL process and architecture improvements.
- Assist with and verify the design of the solution and the production of all design-phase deliverables.
- Manage the build phase and quality-assure code to ensure it fulfills requirements and adheres to the ETL architecture.
· Strong analytical skills, with the ability to collect, organize, analyse, and disseminate significant amounts of information with attention to detail and accuracy.
· Prior experience with statistical modelling techniques and AI/ML models will be a value add.
· Working knowledge of reporting packages (Business Objects, Qlik, Power BI, etc.) and ETL frameworks will be an advantage.
· Knowledge of statistics and experience using statistical packages for analysing datasets (MS Excel, SPSS, SAS, etc.).
· Experience with Python, R, and other scripting languages is desirable but not a must.
We’re looking for an experienced Solution Architect to lead our data analytics and visualization team in Gurugram, helping to generate valuable insights for our customers as well as the product team. This is a very hands-on role: you will be responsible for all aspects of the architecture, security, and scalability of the analytics and visualization application.

Key Responsibilities:
- Own the application architecture, performance, security, scalability, and availability.
- Write and review code every day, pair with team members on functional and non-functional requirements, and spread design philosophy, goals, and code-quality improvements across the team.
- Translate objectives into iterative MVPs, then evaluate and refactor them into highly scalable, highly available, reliable, secure, and fault-tolerant systems.
- Build and manage automated build/test/deployment environments.
- Identify and confirm technical design risks and develop mitigating approaches; judge the trade-offs of technology and feasibility, and make choices that fit the constraints of the project.
- Exercise comprehensive decision-making authority over technical issues, project policies, standards, and strategies.
- Deep-dive into the data to help our customers understand how their users are using the app and how they can improve adoption.
- Understand how users engage with the product by analyzing users' digital footprints step by step to see what leads them to engage, return, or churn. The role enables product owners, designers, and developers to use the right data to guide their decisions.
- Provide data-driven insights and deliver recommendations that address opportunities for product improvements.
- Help improve the accuracy and efficiency of the data capture strategy.
- A/B testing: analyze, consolidate, and present test results.
- Uncover friction points to continuously iterate on and improve products or features.

Desired Skills and Experience:
- 7+ years of overall experience, with a minimum of 2+ years of hands-on data analysis/visualization work.
- Experience in at least two of NodeJS/Python/Java and a willingness to learn others.
- Data modelling experience in both relational and NoSQL databases.
- Experience working with SQL and NoSQL databases, data visualization technologies, and cloud infrastructure platforms.
- Proven experience with large-scale data analysis, data mining, business intelligence systems, and other advanced business reporting tools, including SQL or similar tools.
- Knowledge of data conversion strategy, data capture, creating source-to-target definitions for ETL/data flow processes, data quality, and database management.
- Develop and implement databases, data collection systems, data analytics, and other strategies.
- Identify, analyze, and interpret trends or patterns in complex data sets.
- Work with large-scale data sets: extract data from sources across multiple platforms and synthesize it into dashboards or insightful information.
- Proven deep technical expertise and hands-on involvement in designing, architecting, and executing large-scale cloud solutions for data analytics and complementary services like compute, workflow, and serverless.
- Strong experience with Agile methodology and the ability to deliver in a global team environment with members working remotely across time zones.
- Excellent communication skills across roles at Simpplr.
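To make the "analyze and consolidate A/B test results" responsibility above concrete, here is a hedged sketch of one common way such results are analyzed: a two-proportion z-test on conversion counts between two variants. The variant names and counts are invented for illustration; real analyses would also consider test design, sample ratio checks, and multiple-comparison corrections.

```python
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference in conversion rates of variants A and B.

    conv_a/conv_b: number of conversions; n_a/n_b: number of users per variant.
    Returns (rate_a, rate_b, z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, Phi(x) = (1 + erf(x/sqrt(2)))/2.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value
```

For example, 200 conversions out of 1,000 users on A versus 260 out of 1,000 on B yields z ≈ 3.19, small enough a p-value to call the lift significant at conventional thresholds.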
- Should have experience in data engineering, with knowledge of SQL, ETL, Spark, and data pipelines.
- Should have worked in a product-based company.
Responsible for planning, connecting, designing, scheduling, and deploying data warehouse systems. Develops, monitors, and maintains ETL processes, reporting applications, and data warehouse design.

Role and Responsibility
· Plan, create, coordinate, and deploy data warehouses.
· Design the end-user interface.
· Create best practices for data loading and extraction.
· Develop data architecture, data modeling, and ETL mapping solutions within a structured data warehouse environment.
· Develop reporting applications and maintain data warehouse consistency.
· Facilitate requirements gathering using expert listening skills, and develop unique, simple solutions to meet the immediate and long-term needs of business customers.
· Supervise design throughout the implementation process.
· Design and build cubes while writing custom scripts.
· Develop and implement ETL routines according to the DWH design and architecture.
· Support the development and validation required through the lifecycle of the DWH and business intelligence systems, maintain user connectivity, and provide adequate security for the data warehouse.
· Monitor the DWH and BI systems' performance and integrity; provide corrective and preventative maintenance as required.
· Manage multiple projects at once.
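One recurring building block of the ETL routines described above is loading facts against dimension tables via surrogate keys. The sketch below shows that pattern under assumptions of my own: the `dim_customer`/`fact_sales` tables and column names are hypothetical, and sqlite3 stands in for the actual warehouse platform.

```python
import sqlite3

def get_or_create_dim_key(conn, customer_name):
    """Look up a customer's surrogate key in dim_customer, inserting a row if absent."""
    row = conn.execute(
        "SELECT customer_key FROM dim_customer WHERE customer_name = ?",
        (customer_name,),
    ).fetchone()
    if row:
        return row[0]
    cur = conn.execute(
        "INSERT INTO dim_customer (customer_name) VALUES (?)", (customer_name,)
    )
    # INTEGER PRIMARY KEY auto-assigns the new surrogate key in SQLite.
    return cur.lastrowid

def load_fact_sales(conn, staged_rows):
    """Load staged (customer_name, amount) rows into fact_sales with surrogate keys."""
    for customer_name, amount in staged_rows:
        key = get_or_create_dim_key(conn, customer_name)
        conn.execute(
            "INSERT INTO fact_sales (customer_key, amount) VALUES (?, ?)",
            (key, amount),
        )
    conn.commit()
```

Production routines would batch the lookups and handle slowly changing dimensions, but the key-resolution step stays the same shape.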
DESIRABLE SKILL SET
· Experience with technologies such as MySQL, MongoDB, and SQL Server 2008, as well as newer ones like SSIS and stored procedures.
· Exceptional experience developing code, testing for quality assurance, administering RDBMSs, and monitoring databases.
· High proficiency in dimensional modeling techniques and their applications.
· Strong analytical, consultative, and communication skills, as well as the ability to exercise good judgment and work with both technical and business personnel.
· Several years of working experience with Tableau, MicroStrategy, Information Builders, and other reporting and analytical tools.
· Working knowledge of SAS and R code used in data processing and modeling tasks.
· Strong experience with Hadoop, Impala, Pig, Hive, YARN, and other "big data" technologies such as AWS Redshift or Google BigQuery.
Job Responsibilities:
- Develop new data pipelines and ETL jobs for processing millions of records; they should scale as data grows. Pipelines should be optimised to handle real-time data, batch-update data, and historical data.
- Establish scalable, efficient, automated processes for complex, large-scale data analysis.
- Write high-quality code to gather and manage large data sets (both real-time and batch) from multiple sources, perform ETL, and store the results in a data warehouse.
- Manipulate and analyse complex, high-volume, high-dimensional data from varying sources using a variety of tools and data analysis techniques.
- Participate in data pipeline health monitoring and performance optimisation, as well as quality documentation.
- Interact with end users/clients and translate business language into technical requirements.
- Act independently to expose and resolve problems.

Job Requirements:
- 2+ years of experience in software development and data pipeline development for enterprise analytics.
- 2+ years of working with Python, with exposure to various warehousing tools.
- In-depth experience with any of the commercial tools like AWS Glue, Talend, Informatica, DataStage, etc.
- Experience with various relational databases like MySQL, MS SQL Server, Oracle, etc. is a must.
- Experience with analytics and reporting tools (Tableau, Power BI, SSRS, SSAS).
- Experience with various DevOps practices, helping the client deploy and scale systems as required.
- Strong verbal and written communication skills with other developers and business clients.
- Knowledge of the logistics and/or transportation domain is a plus.
- Hands-on experience with traditional databases and ERP systems like Sybase and PeopleSoft.
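A core technique behind pipelines that "process millions of records" without exhausting memory, as the responsibilities above call for, is streaming the input in fixed-size batches. The sketch below is a generic, library-free illustration of that pattern; the batch size and record shape are placeholders.

```python
def batch_records(records, batch_size):
    """Yield fixed-size batches from any iterable of records.

    Because this is a generator, only one batch is held in memory at a time,
    so the source can be a file, a cursor, or a stream of millions of rows.
    """
    batch = []
    for rec in records:
        batch.append(rec)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        # Flush the final partial batch so no records are dropped.
        yield batch
```

Each yielded batch would then be handed to a transform-and-load step (e.g. a bulk insert into the warehouse), keeping memory use flat regardless of total volume.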
We are looking for a Big Data Engineer with at least 3-5 years of experience as a Big Data Developer/Engineer.
- Experience with Big Data technologies and tools like Hadoop, Hive, MapR, Kafka, Spark, etc.
- Experience architecting data ingestion, storage, and consumption models.
- Experience with NoSQL databases like MongoDB, HBase, Cassandra, etc.
- Knowledge of various ETL tools and techniques.
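The Hadoop and Spark experience asked for above rests on the map/shuffle/reduce model, which can be sketched in plain Python as a toy word count (the classic introductory example; real Hadoop or Spark jobs distribute these phases across a cluster rather than running them in one process):

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in every input line."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    """Shuffle: group all values by key, as the framework does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate each key's values into a single count."""
    return {key: sum(values) for key, values in groups.items()}

def word_count(lines):
    return reduce_phase(shuffle_phase(map_phase(lines)))
```

In Spark the same computation collapses to a `map` followed by `reduceByKey`; the value of the mental model is knowing which phase (especially the shuffle) dominates cost at cluster scale.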