20+ ETL Jobs in Mumbai | ETL Job openings in Mumbai
- Engage independently with client business managers and leaders to understand their requirements, help them translate those requirements into data needs, prepare functional and technical specifications for execution, and ensure delivery from the data team. This can involve a combination of ETL processes, reporting tools, and analytics tools such as SAS and R.
- Lead and manage the Business Analytics team, ensuring effective execution of projects and initiatives.
- Develop and implement analytics strategies to support business objectives and drive data-driven decision-making.
- Analyze complex data sets to provide actionable insights that improve business performance.
- Collaborate with other departments to identify opportunities for process improvements and implement data-driven solutions.
- Oversee the development, maintenance, and enhancement of dashboards, reports, and analytical tools.
- Stay updated with the latest industry trends and technologies in analytics and data science.
Technical Skills:
- Ability to understand and translate business requirements into design.
- Proficient in AWS infrastructure components such as S3, IAM, VPC, EC2, and Redshift.
- Experience in creating ETL jobs using Python/PySpark (a minimal sketch follows this list).
- Proficiency in creating AWS Lambda functions for event-based jobs.
- Knowledge of automating ETL processes using AWS Step Functions.
- Competence in building data warehouses and loading data into them.
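The skills above reference building ETL jobs with Python/PySpark on AWS. A minimal sketch of such a job is shown below; the bucket paths, column names and transformation rules are illustrative assumptions, not details from this posting, and a real job would add schema enforcement, logging and error handling.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV files landed in a hypothetical S3 "raw" zone
raw = spark.read.csv("s3://example-raw-bucket/orders/", header=True, inferSchema=True)

# Transform: de-duplicate, fix types, derive a column, drop bad rows
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .withColumn("net_amount", F.col("gross_amount") - F.col("discount"))
       .filter(F.col("net_amount") >= 0)
)

# Load: partitioned Parquet in a curated zone; Redshift could then ingest it via COPY
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/orders/"
)
```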
Responsibilities:
- Understand business requirements and translate them into design.
- Assess AWS infrastructure needs for development work.
- Develop ETL jobs using Python/PySpark to meet requirements.
- Implement AWS Lambda for event-based tasks (see the trigger sketch after this list).
- Automate ETL processes using AWS Step Functions.
- Build data warehouses and manage data loading.
- Engage with customers and stakeholders to articulate the benefits of proposed solutions and frameworks.
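For the event-based and Step Functions items above, a minimal trigger sketch is shown below: a Lambda handler that starts a Step Functions execution whenever a new object lands in S3. The state machine ARN, environment variable and event layout are assumptions for illustration; retries and error handling are omitted.

```python
import json
import os
import boto3

sfn = boto3.client("stepfunctions")

def handler(event, context):
    """Start the ETL state machine for each new S3 object in the event."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        sfn.start_execution(
            stateMachineArn=os.environ["STATE_MACHINE_ARN"],  # hypothetical env var
            input=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"statusCode": 200}
```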
Java/Scala + Data Engineer
Experience: 5-10 years
Location: Mumbai
Notice: Immediate to 30 days
Required Skills:
· 5+ years of software development experience.
· Excellent skills in Java and/or Scala programming, with expertise in backend architectures, messaging technologies, and related frameworks.
· Developing data pipelines (batch/streaming), complex data transformations, ETL orchestration and data migration; developing and maintaining data warehouses / data lakes.
· Extensive experience in complex SQL queries, database development, and data engineering, including the development of procedures, packages, functions, and handling exceptions.
· Knowledgeable in issue tracking tools (e.g., JIRA), code collaboration tools (e.g., Git/GitLab), and team collaboration tools (e.g., Confluence/Wiki).
· Proficient in Linux/Unix, including shell scripting.
· Ability to translate business and architectural features into quality, consistent software design.
· Solid understanding of programming practices, emphasizing reusable, flexible, and reliable code.
DATA ENGINEER
Overview
They started with a singular belief - what is beautiful cannot and should not be defined in marketing meetings. It's defined by the regular people like us, our sisters, our next-door neighbours, and the friends we make on the playground and in lecture halls. That's why we stand for people-proving everything we do. From the inception of a product idea to testing the final formulations before launch, our consumers are a part of each and every process. They guide and inspire us by sharing their stories with us. They tell us not only about the product they need and the skincare issues they face but also the tales of their struggles, dreams and triumphs. Skincare goes deeper than skin. It's a form of self-care for many. Wherever someone is on this journey, we want to cheer them on through the products we make, the content we create and the conversations we have. What we wish to build is more than a brand. We want to build a community that grows and glows together - cheering each other on, sharing knowledge, and ensuring people always have access to skincare that really works.
Job Description:
We are seeking a skilled and motivated Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, developing, and maintaining the data infrastructure and systems that enable efficient data collection, storage, processing, and analysis. You will collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to implement data pipelines and ensure the availability, reliability, and scalability of our data platform.
Responsibilities:
Design and implement scalable and robust data pipelines to collect, process, and store data from various sources.
Develop and maintain data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation.
Optimize and tune the performance of data systems to ensure efficient data processing and analysis.
Collaborate with data scientists and analysts to understand data requirements and implement solutions for data modeling and analysis.
Identify and resolve data quality issues, ensuring data accuracy, consistency, and completeness.
Implement and maintain data governance and security measures to protect sensitive data.
Monitor and troubleshoot data infrastructure, perform root cause analysis, and implement necessary fixes.
Stay up-to-date with emerging technologies and industry trends in data engineering and recommend their adoption when appropriate.
Qualifications:
Bachelor’s or higher degree in Computer Science, Information Systems, or a related field.
Proven experience as a Data Engineer or similar role, working with large-scale data processing and storage systems.
Strong programming skills in languages such as Python, Java, or Scala.
Experience with big data technologies and frameworks like Hadoop, Spark, or Kafka.
Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle).
Familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery).
Solid understanding of data modeling, data warehousing, and ETL principles.
Knowledge of data integration techniques and tools (e.g., Apache Nifi, Talend, or Informatica).
Strong problem-solving and analytical skills, with the ability to handle complex data challenges.
Excellent communication and collaboration skills to work effectively in a team environment.
Preferred Qualifications:
Advanced knowledge of distributed computing and parallel processing.
Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink); a consumer sketch follows this list.
Familiarity with machine learning concepts and frameworks (e.g., TensorFlow, PyTorch).
Knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).
Experience with data visualization and reporting tools (e.g., Tableau, Power BI).
Certification in relevant technologies or data engineering disciplines.
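The preferred qualifications mention real-time streaming with Apache Kafka. A minimal consumer sketch is shown below, assuming the kafka-python client; the topic, broker address and message fields are placeholders, and a production consumer would manage offsets, batching and failures explicitly.

```python
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "clickstream-events",                     # hypothetical topic
    bootstrap_servers=["localhost:9092"],     # placeholder broker
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
)

for message in consumer:
    event = message.value
    # Downstream processing (e.g. writing to a warehouse or data lake) would go here
    print(event.get("user_id"), event.get("event_type"))
```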
Role: Project Manager
Experience: 8-10 Years
Location: Mumbai
Company Profile:
Exponentia.ai is an AI tech organization with a presence across India, Singapore, the Middle East, and the UK. We are an innovative and disruptive organization, working on cutting-edge technology to help our clients transform into the enterprises of the future. We provide artificial intelligence-based products/platforms capable of automated cognitive decision-making to improve productivity, quality, and economics of the underlying business processes. Currently, we are rapidly expanding across machine learning, Data Engineering and Analytics functions. Exponentia.ai has developed long-term relationships with world-class clients such as PayPal, PayU, SBI Group, HDFC Life, Kotak Securities, Wockhardt and Adani Group amongst others.
One of the top partners of Databricks, Azure, Cloudera (a leading analytics player) and Qlik (a leader in BI technologies), Exponentia.ai has recently been awarded the ‘Innovation Partner Award’ by Qlik and the "Excellence in Business Process Automation Award" (IMEA) by Automation Anywhere.
Get to know more about us at http://www.exponentia.ai and https://in.linkedin.com/company/exponentiaai
Role Overview:
· The project manager will be responsible for overseeing and ensuring the successful delivery of a range of projects in Business Intelligence, Data Warehousing, and Analytics/AI-ML.
· The project manager is expected to manage the project and lead teams of BI engineers, data engineers, data scientists and application developers.
Job Responsibilities:
· Effort estimation, creating the project plan, planning milestones and activities, and tracking progress.
· Identify risks and issues. Come up with a mitigation plan.
· Status reporting to both internal and external stakeholders.
· Communicate with all stakeholders.
· Manage end-to-end project lifecycle - requirements gathering, design, development, testing and go-live.
· Manage end-to-end BI or data warehouse projects.
· Must have experience in running Agile-based project development.
Technical skills
· Experience in Business Intelligence, Data Warehousing or Analytics projects.
· Understand data lake and data warehouse solutions including ETL pipelines.
· Good to have - Knowledge of Azure Blob Storage, Azure Data Factory and Synapse Analytics.
· Good to have - Knowledge of Qlik Sense or Power BI.
· Good to have - Certified in PMP/Prince 2 / Agile Project management.
· Excellent written and verbal communication skills.
Education:
MBA, B.E. or B. Tech. or MCA degree
Responsibilities :
- Be involved in planning, design, development and maintenance of large-scale data repositories, pipelines, analytical solutions and knowledge management strategy
- Build and maintain optimal data pipeline architecture to ensure scalability, connect operational systems data for analytics and business intelligence (BI) systems
- Build data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader
- Reporting and obtaining insights from large data chunks on import/export and communicating relevant pointers for helping in decision-making
- Preparation, analysis, and presentation of reports to the management for further developmental activities
- Anticipate, identify and solve issues concerning data management to improve data quality
Requirements :
- Ability to build and maintain ETL pipelines (a PostgreSQL load-step sketch follows this list)
- Technical Business Analysis experience and hands-on experience developing functional spec
- Good understanding of Data Engineering principles including data modeling methodologies
- Sound understanding of PostgreSQL
- Strong analytical and interpersonal skills as well as reporting capabilities
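As a companion to the PostgreSQL and ETL requirements above, here is a minimal sketch of a load step using the psycopg2 driver. The connection details, table and columns are illustrative assumptions only.

```python
import psycopg2

rows = [
    ("2024-01-01", "IN", 120),   # illustrative (ship_date, country, shipments)
    ("2024-01-01", "SG", 45),
]

conn = psycopg2.connect(host="localhost", dbname="analytics", user="etl", password="secret")
try:
    with conn, conn.cursor() as cur:
        cur.execute("""
            CREATE TABLE IF NOT EXISTS shipments_daily (
                ship_date date,
                country   text,
                shipments integer
            )
        """)
        cur.executemany(
            "INSERT INTO shipments_daily (ship_date, country, shipments) VALUES (%s, %s, %s)",
            rows,
        )
finally:
    conn.close()
```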
at TSG Global Services Private Limited
Experience: Min 10 years
Location: Mumbai
Salary: Negotiable
Power BI, Tableau, QlikView
Solution Architect/Technology Lead – Data Analytics
Role
Looking for a Business Intelligence Lead (BI Lead) with hands-on experience in BI tools (Tableau, SAP Business Objects, financial and accounting modules, Power BI), SAP integration, and database knowledge including one or more of Azure Synapse/Data Factory, SQL Server, Oracle, and cloud-based databases such as Snowflake. Good knowledge of AI/ML and Python is also expected.
- You will be expected to work closely with our business users. The development will be performed using an Agile methodology which is based on scrum (time boxing, daily scrum meetings, retrospectives, etc.) and XP (continuous integration, refactoring, unit testing, etc) best practices. Candidates must therefore be able to work collaboratively, demonstrate good ownership, leadership and be able to work well in teams.
- Responsibilities :
- Design, development and support of multiple/hybrid data sources and data visualization frameworks using Power BI, Tableau, SAP Business Objects, etc., together with ETL tools and scripting (including Python scripting).
- Implementing DevOps techniques and practices like Continuous Integration, Continuous Deployment, Test Automation, Build Automation and Test-Driven Development to enable the rapid delivery of working code, utilizing tools like Git.
Primary Skills / Requirements
- 10+ years working as a hands-on developer in Information Technology across Database, ETL and BI (SAP Business Objects, integration with SAP Financial and Accounting modules, Tableau, Power BI) & prior team management experience
- Tableau/PowerBI integration with SAP and knowledge of SAP modules related to finance is a must
- 3+ years of hands-on development experience in Data Warehousing and Data Processing
- 3+ years of Database development experience with a solid understanding of core database concepts and relational database design, SQL, Performance tuning
- 3+ years of hands-on development experience with Tableau
- 3+ years of Power BI experience, including parameterized reports and publishing them on Power BI Service
- Excellent understanding and practical experience delivering under an Agile methodology
- Ability to work with business users to provide technical support
- Ability to get involved in all stages of the project lifecycle, including analysis, design, development and testing.
Good to have skills:
- Experience with other Visualization tools and reporting tools like SAP Business Objects.
Your key responsibilities
- Create and maintain optimal data pipeline architecture. Should have experience in building batch/real-time ETL Data Pipelines. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- The individual will be responsible for solution design, integration, data sourcing, transformation, database design and implementation of complex data warehousing solutions.
- Responsible for development, support, maintenance, and implementation of a complex project module
- Provide expertise in area and advanced knowledge of applications programming and ensure application design adheres to the overall architecture blueprint
- Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
- Resolve variety of high impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for complete reporting solutions.
- Preparation of the HLD (high-level design) describing the overall architecture of the application.
- Preparation of the LLD (low-level design) covering job design, job descriptions and detailed information about the jobs.
- Preparation of Unit Test cases and execution of the same.
- Provide technical guidance and mentoring to application development teams throughout all the phases of the software development life cycle
Skills and attributes for success
- Strong experience in SQL: proficient in writing performant SQL against large data volumes and in writing and debugging complex SQL.
- Strong experience with Microsoft Azure database systems; experienced in Azure Data Factory.
- Strong in Data Warehousing concepts. Experience with large-scale data warehousing architecture and data modelling.
- Should have sufficient experience with PowerShell scripting
- Able to guide the team through the development, testing and implementation stages and review the completed work effectively
- Able to make quick decisions and solve technical problems to provide an efficient environment for project implementation
- Primary owner of delivery and timelines; review code written by other engineers.
- Maintain highest levels of development practices including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, writing clean, modular and self-sustaining code, with repeatable quality and predictability
- Must have an understanding of business intelligence development in the IT industry
- Outstanding written and verbal communication skills
- Should be adept in SDLC process - requirement analysis, time estimation, design, development, testing and maintenance
- Hands-on experience in installing, configuring, operating, and monitoring CI/CD pipeline tools
- Should be able to orchestrate and automate pipelines
- Good to have : Knowledge of distributed systems such as Hadoop, Hive, Spark
To qualify for the role, you must have
- Bachelor's Degree in Computer Science, Economics, Engineering, IT, Mathematics, or related field preferred
- More than 6 years of experience in ETL development projects
- Proven experience in delivering effective technical ETL strategies
- Microsoft Azure project experience
- Technologies: ETL (ADF), SQL, Azure components (must have), Python (nice to have)
Ideally, you’ll also have
- Implementing Environment solutions for projects in a dynamic corporate environment
- Communicating and collaborating with project and technical teams on Environment requirements, delivery and support
- Delivering and Maintaining Environment Management Plans, Bookings, Access Details and Schedules for Projects
- Working with Environment Team on Technical Environment Delivery Solutions
- Troubleshooting, managing and tracking Environment Incidents & Service Requests in conjunction with technical teams and external partners via the service management tool
- Leadership support in the North Sydney office
- Mentoring, guiding and leading other team members
- Creation of new test environments
- Provisioning infrastructure and platform
- Test data provisioning (privatization, traceability, ETL, segregation)
- Endpoint integration
- Monitoring the test environment
- Updating/deleting outdated test-environments and their details
- Investigation of test environment issues and, at times, coordination until resolution
- Collaborate with the business teams to understand the data environment in the organization; develop and lead the Data Scientists team to test and scale new algorithms through pilots and subsequent scaling up of the solutions
- Influence, build and maintain the large-scale data infrastructure required for the AI projects, and integrate with external IT infrastructure/service
- Act as the single point source for all data related queries; strong understanding of internal and external data sources; provide inputs in deciding data-schemas
- Design, develop and maintain the framework for the analytics solutions pipeline
- Provide inputs to the organization’s initiatives on data quality and help implement frameworks and tools for the various related initiatives
- Work in cross-functional teams of software/machine learning engineers, data scientists, product managers, and others to build the AI ecosystem
- Collaborate with the external organizations including vendors, where required, in respect of all data-related queries as well as implementation initiatives
Broadcast Media Production and Distribution Company
Professional experience in Python – Mandatory experience
Basic knowledge of any BI tool (Microsoft Power BI, Tableau, etc.) and experience in R will be an added advantage
Proficient in Excel
Good verbal and written communication skills
Key Responsibilities:
Analyze data trends and provide intelligent business insights, monitor operational and business metrics
Complete ownership of business excellence dashboard and preparation of reports for senior management stating trends, patterns, and predictions using relevant data
Review, validate and analyse data points and implement new data analysis methodologies
Perform data profiling to identify and understand anomalies
Perform analysis to assess quality and meaning of data
Develop policies and procedures for the collection and analysis of data
Analyse existing process with the help of data and propose process change and/or lead process re-engineering initiatives
Use BI Tools (Microsoft Power BI/Tableau) and develop and manage BI solutions
Responsibilities
- Manage and drive a team of Data Analysts and Sr. Data Analysts to provide logistics and supply chain solutions.
- Conduct meetings with Clients to gather the requirements and understand the scope.
- Conduct meetings with internal stakeholders to walk them through the solution and hand over the analysis.
- Define business problems, identify solutions, provide analysis and insights from the client's data.
- Conduct scheduled progress reviews on all projects and interact with the onsite team daily.
- Ensure solutions are delivered error free and submitted on time.
- Implement ETL processes using Pentaho Data Integration (Pentaho ETL); design and implement data models in Hadoop.
- Provide end-user training and technical assistance to maximize utilization of tools.
- Deliver technical guidance to team, including hands-on development as necessary; oversee standards, change controls and documentation library for training and reuse.
Requirements
- Bachelor's degree in Engineering.
- 16+ years of experience in supply chain and logistics or a related industry, including analytics experience.
- 3 years of experience in team handling (8+ people) and interacting with executive leadership teams.
- Strong project and time management skills with ability to multitask and prioritize workload.
- Solid expertise with MS Excel, SQL, any visualization tools like Tableau/Power BI, any ETL tools.
- Proficiency in Hadoop / Hive.
- Experience with Pentaho ETL, Pentaho Visualization API and Tableau.
- Hands on experience of working with Big data sets (Data sets with millions of records).
- Strong technical and Management experience.
Desired Skills and Experience
- .NET, ASP.NET
is an agile and innovative, global analytics company driven
Job Description – Developer (ETL + Database)
Develop, document & Support ETL mappings, Database structures & BI reports.
Perform unit testing of developments done by him/her.
Participate in UAT process and ensure quick resolution of any UAT issue.
Manage different environments and be responsible for proper deployment of code in all client environments.
Prepare release documents.
Prepare and Maintain project documents as advised by Team Leads.
Skill-sets:
3+ years of hands-on experience with ETL tools (Pentaho Spoon, Talend) and MS SQL Server, Oracle & Sybase database tools.
Ability to write complex SQL and database procedures.
Good knowledge and understanding of data warehouse concepts, ETL concepts, ETL loading strategies, data archiving, data reconciliation, ETL error handling, etc.
Problem Solving.
Good communication skills – written and verbal.
Self-motivated, team player, action and result oriented.
Ability to successfully work under tight project schedules and ensure quality is delivered.
• You will utilize your configuration management and software release experience, as well as change management concepts, to drive the success of the projects.
• You will partner with senior leaders to understand and communicate the business needs and translate them into IT requirements. Consult with the customer's Business Analysts on their data warehouse requirements.
• You will assist the technical team in identification and resolution of Data Quality issues.
• You will manage small to medium-sized projects relating to the delivery of applications or application changes.
• You will use Managed Services or 3rd party resources to meet application support requirements.
• You will interface daily with multi-functional team members within the EDW team and across the enterprise to resolve issues.
• Recommend and advocate different approaches and designs to the requirements
• Write technical design docs
• Execute Data modelling
• Solution inputs for the presentation layer
• You will craft and generate summary, statistical, and presentation reports; as well as provide reporting and metrics for strategic initiatives.
• Performs miscellaneous job-related duties as assigned
Preferred Qualifications
• Strong interpersonal, teamwork, organizational and workload planning skills
• Strong analytical, evaluative, and problem-solving abilities as well as exceptional customer service orientation
• Ability to drive clarity of purpose and goals during release and planning activities
• Excellent organizational skills including ability to prioritize tasks efficiently with high level of attention to detail
• Excited by the opportunity to continually improve processes within a large company
• Healthcare or automobile background.
• Familiarity with major big data solutions and products available in the market.
• Proven ability to drive continuous improvement.
They provide both wholesale and retail funding.
- Key responsibility is to design and develop a data pipeline, including the architecture, prototyping, and development of data extraction, transformation/processing, cleansing/standardizing (a standardization sketch follows this list), and loading into the Data Warehouse at real-time or near-real-time frequency. Source data can be in structured, semi-structured, and/or unstructured formats.
- Provide technical expertise to design efficient data ingestion solutions to consolidate data from RDBMS, APIs, messaging queues, weblogs, images, audio, documents, etc. of enterprise applications, SaaS applications, external third-party sites or APIs, etc. through ETL/ELT, API integrations, Change Data Capture, Robotic Process Automation, custom Python/Java coding, etc.
- Development of complex data transformations using Talend (Big Data edition), Python/Java transformations in Talend, SQL/Python/Java UDXs, AWS S3, etc., to load into the OLAP Data Warehouse in structured/semi-structured form.
- Development of data model and creating transformation logic to populate models for faster data consumption with simple SQL.
- Implementing automated Audit & Quality assurance checks in Data Pipeline
- Document & maintain data lineage to enable data governance
- Coordination with BIU, IT, and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading into downstream systems, NoSQL databases, etc.
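A minimal sketch of the cleansing/standardizing step referenced above, written as a plain Python function; the same logic could equally sit inside a Talend Python/Java component or a SQL/Python UDX. The field names and formats are illustrative assumptions.

```python
import re
from datetime import datetime

def standardize_record(raw: dict) -> dict:
    """Normalize one semi-structured source record before loading to the warehouse."""
    phone_digits = re.sub(r"\D", "", raw.get("phone", ""))
    signup = raw.get("signup_date")
    return {
        "customer_id": str(raw.get("customer_id", "")).strip(),
        "name": " ".join(raw.get("name", "").split()).title(),
        "phone": phone_digits[-10:] if phone_digits else None,
        "signup_date": datetime.strptime(signup, "%d/%m/%Y").date().isoformat() if signup else None,
    }

# Example usage
print(standardize_record({"customer_id": " 42 ", "name": "  ravi   KUMAR ",
                          "phone": "+91-98200-12345", "signup_date": "05/07/2023"}))
```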
Requirements
- Programming experience using Python / Java, to create functions / UDX
- Extensive technical experience with SQL on RDBMS (Oracle/MySQL/Postgresql etc) including code optimization techniques
- Strong ETL/ELT skillset using Talend BigData Edition. Experience in Talend CDC & MDM functionality will be an advantage.
- Experience & expertise in implementing complex data pipelines, including semi-structured & unstructured data processing
- Expertise to design efficient data ingestion solutions to consolidate data from RDBMS, APIs, messaging queues, weblogs, images, audio, documents, etc. of enterprise applications, SaaS applications, external third-party sites or APIs, etc. through ETL/ELT, API integrations, Change Data Capture, Robotic Process Automation, custom Python/Java coding, etc.
- Good understanding & working experience in OLAP Data Warehousing solutions (Redshift, Synapse, Snowflake, Teradata, Vertica, etc) and cloud-native Data Lake (S3, ADLS, BigQuery, etc) solutions
- Familiarity with AWS tool stack for Storage & Processing. Able to recommend the right tools/solutions available to address a technical problem
- Good knowledge of database performance and tuning, troubleshooting, query optimization, and tuning
- Good analytical skills with the ability to synthesize data to design and deliver meaningful information
- Good knowledge of Design, Development & Performance tuning of 3NF/Flat/Hybrid Data Model
- Know-how on any No-SQL DB (DynamoDB, MongoDB, CosmosDB, etc) will be an advantage.
- Ability to understand business functionality, processes, and flows
- Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently
Functional knowledge
- Data Governance & Quality Assurance
- Distributed computing
- Linux
- Data structures and algorithms
- Unstructured Data Processing
- Expertise in designing and implementing enterprise scale database (OLTP) and Data warehouse solutions.
- Hands-on experience in implementing Azure SQL Database, Azure SQL Data Warehouse (Azure Synapse Analytics) and big data processing using Azure Databricks and Azure HDInsight.
- Expert in writing T-SQL programming for complex stored procedures, functions, views and query optimization.
- Should be aware of database development for both on-premises and SaaS applications using SQL Server and PostgreSQL.
- Experience in ETL and ELT implementations using Azure Data Factory V2 and SSIS.
- Experience and expertise in building machine learning models using Logistic and linear regression, Decision tree and Random forest Algorithms.
- PolyBase queries for exporting and importing data into Azure Data Lake.
- Building data models both tabular and multidimensional using SQL Server data tools.
- Writing data preparation, cleaning and processing steps using Python, Scala, and R.
- Programming experience using the Python libraries NumPy, Pandas and Matplotlib (a brief data-preparation sketch follows this list).
- Implementing NoSQL databases and writing queries using Cypher.
- Designing end user visualizations using Power BI, QlikView and Tableau.
- Experience working with all versions of SQL Server 2005/2008/2008R2/2012/2014/2016/2017/2019
- Experience using the expression languages MDX and DAX.
- Experience in migrating on-premise SQL server database to Microsoft Azure.
- Hands on experience in using Azure blob storage, Azure Data Lake Storage Gen1 and Azure Data Lake Storage Gen2.
- Performance tuning complex SQL queries, hands on experience using SQL Extended events.
- Data modeling using Power BI for ad hoc reporting.
- Raw data load automation using T-SQL and SSIS
- Expert in migrating existing on-premise database to SQL Azure.
- Experience in using U-SQL for Azure Data Lake Analytics.
- Hands on experience in generating SSRS reports using MDX.
- Experience in designing predictive models using Python and SQL Server.
- Developing machine learning models using Azure Databricks and SQL Server
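A minimal data-preparation sketch using the NumPy/pandas stack mentioned above; the input file, columns and cleaning rules are illustrative assumptions.

```python
import numpy as np
import pandas as pd

# Load a raw extract (hypothetical file)
df = pd.read_csv("sales_raw.csv")

# Basic cleaning: trim text, coerce types, handle missing values
df["region"] = df["region"].str.strip().str.upper()
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
df["amount"] = pd.to_numeric(df["amount"], errors="coerce").fillna(0)

# Simple derived flag and an aggregation suitable for ad hoc reporting
df["high_value"] = np.where(df["amount"] > 10000, 1, 0)
df["quarter"] = df["order_date"].dt.to_period("Q").astype(str)
summary = df.groupby(["region", "quarter"], as_index=False)[["amount", "high_value"]].sum()

print(summary.head())
```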
Should be able to use transformation components to transform the data
Should possess knowledge of incremental load, full load, etc. (an incremental-load sketch follows this list)
Should Design, build and deploy effective packages
Should be able to schedule these packages through task schedulers
Implement stored procedures and effectively query a database
Translate requirements from the business and analyst into technical code
Identify and test for bugs and bottlenecks in the ETL solution
Ensure the best possible performance and quality in the packages
Provide support and fix issues in the packages
Writes advanced SQL including some query tuning
Experience in the identification of data quality issues
Some database design experience is helpful
Experience designing and building complete ETL/SSIS processes moving and transforming data for ODS, Staging, and Data Warehousing
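A minimal sketch of the incremental-load idea referenced above: load only the rows newer than a stored high-watermark rather than performing a full load. sqlite3 is used here purely so the sketch is self-contained; an SSIS package or SQL Server job would follow the same pattern.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Source, target and watermark tables (illustrative schema and data)
cur.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL, updated_at TEXT);
    CREATE TABLE tgt_orders (id INTEGER, amount REAL, updated_at TEXT);
    CREATE TABLE etl_watermark (table_name TEXT PRIMARY KEY, last_loaded TEXT);
    INSERT INTO etl_watermark VALUES ('tgt_orders', '2024-01-01T00:00:00');
    INSERT INTO src_orders VALUES (1, 100.0, '2023-12-31T10:00:00'),
                                  (2, 250.0, '2024-01-02T09:30:00');
""")

# Incremental load: pull only rows changed after the stored watermark
last = cur.execute("SELECT last_loaded FROM etl_watermark WHERE table_name = 'tgt_orders'").fetchone()[0]
new_rows = cur.execute("SELECT id, amount, updated_at FROM src_orders WHERE updated_at > ?", (last,)).fetchall()
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?, ?)", new_rows)

# Advance the watermark so the next run skips rows already loaded
if new_rows:
    cur.execute("UPDATE etl_watermark SET last_loaded = ? WHERE table_name = 'tgt_orders'",
                (max(r[2] for r in new_rows),))
conn.commit()
print(f"Loaded {len(new_rows)} new row(s)")
```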
at Magic9 Media and Consumer Knowledge Pvt. Ltd.
Job Description
This requirement is to service our client, a leading big data technology company that measures what viewers consume across platforms to enable marketers to make better advertising decisions. We are seeking a Senior Data Operations Analyst to mine large-scale datasets for our client. Your work will have a direct impact on driving business strategies for prominent industry leaders. Self-motivation and strong communication skills are both must-haves. The ability to work in a fast-paced environment is desired.
Problems being solved by our client:
Measure consumer usage of devices linked to the internet and home networks including computers, mobile phones, tablets, streaming sticks, smart TVs, thermostats and other appliances. There are more screens and other connected devices in homes than ever before, yet there have been major gaps in understanding how consumers interact with this technology. Our client uses a measurement technology to unravel dynamics of consumers’ interactions with multiple devices.
Duties and responsibilities:
- The successful candidate will contribute to the development of novel audience measurement and demographic inference solutions.
- Develop, implement, and support statistical or machine learning methodologies and processes (a brief model-fitting sketch follows this list).
- Build and test new features and concepts and integrate them into the production process
- Participate in ongoing research and evaluation of new technologies
- Exercise your experience in the development lifecycle through analysis, design, development, testing and deployment of this system
- Collaborate with teams in Software Engineering, Operations, and Product Management to deliver timely and quality data. You will be the knowledge expert, delivering quality data to our clients
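A brief sketch of fitting and evaluating a simple statistical model of the kind mentioned in the duties above, assuming scikit-learn (the posting does not prescribe a library); the data is synthetic purely to keep the example self-contained.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 3))                                   # e.g. device/usage features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out AUC: {auc:.3f}")
```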
Qualifications:
- 3-5 years relevant work experience in areas as outlined below
- Experience in extracting data using SQL from large databases
- Experience in writing complex ETL processes and frameworks for analytics and data management. Must have experience in working on ETL tools.
- Master’s degree or PhD in Statistics, Data Science, Economics, Operations Research, Computer Science, or a similar degree with a focus on statistical methods. A Bachelor’s degree in the same fields with significant, demonstrated professional research experience will also be considered.
- Programming experience in a scientific computing language (R, Python, Julia) and the ability to interact with relational data (SQL, Apache Pig, SparkSQL); a short SparkSQL sketch follows this list. General-purpose programming (Python, Scala, Java) and familiarity with Hadoop is a plus.
- Excellent verbal and written communication skills.
- Experience with TV or digital audience measurement or market research data is a plus.
- Familiarity with systems analysis or systems thinking is a plus.
- Must be comfortable with analyzing complex, high-volume and high-dimension data from varying sources
- Excellent verbal, written and computer communication skills
- Ability to engage with Senior Leaders across all functional departments
- Ability to take on new responsibilities and adapt to changes
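A short SparkSQL sketch, as referenced in the qualifications above: an existing Parquet dataset is registered as a view and queried with plain SQL. The paths, table and columns are illustrative assumptions.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("audience-extract").getOrCreate()

# Register an existing dataset (e.g. Parquet in a data lake) as a temporary view
spark.read.parquet("s3://example-bucket/viewing_events/").createOrReplaceTempView("viewing_events")

# Extract device-level usage with plain SQL
daily_usage = spark.sql("""
    SELECT device_type,
           to_date(event_ts)                AS event_date,
           COUNT(DISTINCT household_id)     AS households,
           SUM(duration_seconds) / 3600.0   AS hours_viewed
    FROM viewing_events
    GROUP BY device_type, to_date(event_ts)
""")

daily_usage.show(10)
```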
Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake
Job Description
- Minimum of 15 years of experience with Informatica ETL and database technologies.
- Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake; exposure to change data capture technology.
- Lead and guide development of an Informatica-based ETL architecture.
- Develop solutions in a highly demanding environment and provide hands-on guidance to other team members.
- Head complex ETL requirements and design; implement an Informatica-based ETL solution fulfilling stringent performance requirements.
- Collaborate with product development teams and senior designers to develop architectural requirements; assess requirements for completeness and accuracy and determine whether they are actionable for the ETL team.
- Conduct impact assessments and determine the size of effort based on requirements.
- Develop full SDLC project plans to implement the ETL solution and identify resource requirements.
- Perform an active, leading role in shaping and enhancing the overall ETL Informatica architecture; identify, recommend and implement ETL process and architecture improvements.
- Assist with and verify the design of the solution and the production of all design-phase deliverables.
- Manage the build phase and quality-assure code to ensure it fulfils requirements and adheres to the ETL architecture.
Company is into Product Development.
What's the role?
Your role as a Principal Engineer will involve working with various teams. As a principal engineer, you will need full knowledge of the software development lifecycle and Agile methodologies. You will demonstrate multi-tasking skills under tight deadlines and constraints. You will regularly contribute to the development of work products (including analyzing, designing, programming, debugging, and documenting software) and may work with customers to resolve challenges and respond to suggestions for improvements and enhancements. You will set the standards and principles for the products you drive.
- Setup coding practice, guidelines & quality of the software delivered.
- Determines operational feasibility by evaluating analysis, problem definition, requirements, solution development, and proposed solutions.
- Documents and demonstrates solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code.
- Prepares and installs solutions by determining and designing system specifications, standards, and programming.
- Improves operations by conducting systems analysis; recommending changes in policies and procedures.
- Updates job knowledge by studying state-of-the-art development tools, programming techniques, and computing equipment; participating in educational opportunities; reading professional publications; maintaining personal networks; participating in professional organizations.
- Protects operations by keeping information confidential.
- Develops software solutions by studying information needs; conferring with users; studying systems flow, data usage, and work processes; investigating problem areas; following the software development lifecycle.
Who are you? You are a go-getter with an eye for detail, strong problem-solving and debugging skills, and a BE/MCA/M.E./M.Tech or equivalent degree from a reputed college/university.
Essential Skills / Experience:
- 10+ years of engineering experience
- Experience in designing and developing high-volume web services using API protocols and data formats (a minimal endpoint sketch appears at the end of this listing)
- Proficient in API modelling languages and annotation
- Proficient in Java programming
- Experience with Scala programming
- Experience with ETL systems
- Experience with Agile methodologies
- Experience with Cloud service & storage
- Proficient in Unix/Linux operating systems
- Excellent oral and written communication skills
Preferred:
- Functional programming languages (Scala, etc)
- Scripting languages (bash, Perl, Python, etc)
- Amazon Web Services (Redshift, ECS etc)
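For the web-services item above, here is a minimal JSON endpoint sketched with the Python standard library so that no particular framework is implied (the role itself emphasizes Java/Scala); the route and payload are illustrative.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # Serve on a placeholder port; in practice this would sit behind a load balancer
    HTTPServer(("0.0.0.0", 8080), HealthHandler).serve_forever()
```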