Data Analyst

at T500

Agency job
Bengaluru (Bangalore)
3 - 9 yrs
₹10L - ₹30L / yr
Full time
Skills
Informatica MDM
Data modeling
IDQ

Primary Duties and Responsibilities 

  • Experience with the Informatica Multidomain MDM 10.4 tool suite is preferred.
  • Partner with data architects and engineers to ensure an optimal data model design and implementation for each MDM domain, in accordance with industry and MDM best practices.
  • Work with data governance and business stewards to design, develop, and configure business rules for data validation, standardization, matching, and merging (a simplified sketch follows this list).
  • Implement data quality policies, procedures, and standards with the Data Governance team to maintain customer, location, product, and other data domains; experience with the Informatica IDQ tool suite is preferred.
  • Perform data analysis and source-to-target mapping for the ingest and egress of data.
  • Maintain compliance with change control, SDLC, and development standards.
  • Champion and contribute to the creation of technical documentation and diagrams.
  • Establish a technical vision and strategy with the team, and work with the team to turn it into reality.
  • Coach and train team members to cultivate skill development within the department.
  • Keep up with industry best practices and trends.
  • Monitor, troubleshoot, maintain, and continuously improve the MDM ecosystem.
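
For illustration only, here is a minimal Python sketch of the standardize/match/merge pattern that such business rules express. In practice these rules are configured declaratively in the Informatica MDM Hub rather than hand-coded, and every field name, threshold, and survivorship rule below is an assumption invented for the example.

```python
# Hypothetical sketch of standardize -> match -> merge, the pattern that
# MDM business rules express. All names and thresholds are invented.
from difflib import SequenceMatcher


def standardize(record: dict) -> dict:
    """Cleansing rule: trim, collapse whitespace, and uppercase values."""
    return {k: " ".join(str(v).split()).upper() for k, v in record.items()}


def is_match(a: dict, b: dict, threshold: float = 0.8) -> bool:
    """Match rule: fuzzy-compare two records on name + city."""
    key_a = f"{a['name']} {a['city']}"
    key_b = f"{b['name']} {b['city']}"
    return SequenceMatcher(None, key_a, key_b).ratio() >= threshold


def merge(a: dict, b: dict) -> dict:
    """Merge (survivorship) rule: prefer non-empty values from the newer record."""
    newer, older = (a, b) if a["updated"] >= b["updated"] else (b, a)
    return {k: newer.get(k) or older.get(k) for k in set(a) | set(b)}


r1 = standardize({"name": "Acme Corp ", "city": "austin", "updated": "2023-01-05"})
r2 = standardize({"name": "ACME Corporation", "city": "Austin", "updated": "2023-03-10"})
if is_match(r1, r2):
    print(merge(r1, r2))  # one golden record survives
```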

Secondary Duties and Responsibilities

  • May participate in off-hours on-call rotation.
  • Attends and is prepared to participate in team, department and company meetings.
  • Performs other job-related duties and special projects as assigned.

Supervisory Responsibilities

This is a non-management role

Education and Experience

  • Bachelor's degree in MIS, Computer Science, Business Administration, or a related field; or a high school diploma/GED and 4 years of relevant experience in lieu of a Bachelor's degree.
  • 5+ years of experience in implementing MDM solutions using Informatica MDM.
  • 2+ years of experience in data stewardship, data governance, and data management concepts.
  • Professional working knowledge of the Customer 360 solution.
  • Professional working knowledge of multidomain MDM data modeling.
  • Strong understanding of company master data sets and their application in complex business processes; able to support data profiling, extraction, and cleansing activities using Informatica Data Quality (IDQ).
  • Strong knowledge in the installation and configuration of the Informatica MDM Hub.
  • Familiarity with real-time, near real-time and batch data integration.
  • Strong experience with and understanding of Informatica toolsets, including Informatica MDM Hub, Informatica Data Quality (IDQ), Informatica Customer 360, Informatica EDC, Hierarchy Manager (HM), the Business Entity Service Model, Address Doctor, customizations, and composite services.
  • Experience with event-driven architectures (e.g. Kafka, Google Pub/Sub, Azure Event Hub, etc.).
  • Professional working knowledge of CI/CD technologies such as Concourse, TeamCity, Octopus, Jenkins, and CircleCI.
  • Team player who exhibits high energy, strategic thinking, collaboration, direct communication, and results orientation.

Physical Requirements

  • Visual requirements include the ability to see detail at near range, with or without correction. Must be physically able to perform sedentary work: occasionally lifting or carrying objects of no more than 10 pounds, and occasionally standing or walking, reaching, handling, grasping, feeling, talking, hearing, and performing repetitive motions.

Working Conditions

  • The duties of this position are performed through a combination of an open office setting and remote work options. Full remote work options are available for employees who reside outside of the Des Moines metro area. There is frequent pressure to meet deadlines and handle multiple projects in a day.

Equipment Used to Perform Job

  • Windows or Mac computer and various software solutions.

Financial Responsibility

  • Responsible for company assets including maintenance of software solutions.

Contacts

  • Has frequent contact with office personnel in other departments related to the position, as well as occasional contact with users and customers. Engages stakeholders from other areas of the business.

Confidentiality

  • Has access to confidential information including trade secrets, intellectual property, various financials, and customer data.

Similar jobs

A leading digital transformation services provider that delivers design-led, complex digital transformation initiatives to Fortune 500 clients. Our differentiated services span digital product engineering, cloud and DevOps, data and AI, customer experience, cybersecurity, and design services.
Agency job
via HyrHub by Shwetha Naik
Mumbai
4 - 6 yrs
₹12L - ₹20L / yr
ETL
Informatica
Data Warehouse (DWH)
SQL
Databases
+2 more
Required Skills:

Database: 4-6 years of experience in database development (SQL, data modelling, stored procedures, indexing, and performance tuning, e.g. query plan analysis; see the sketch after this list). Teradata experience preferred.
ETL programming: 2+ years of experience (tools like Informatica or Ab Initio are not necessary).
DevOps tooling, including unit/functional testing in a developer capacity.
UNIX: a basic understanding of UNIX and shell commands is required; scripting experience (e.g. Python/Shell/Perl) is a plus.
Job scheduler tools: Autosys experience preferred, but any scheduling tool experience will suffice.
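
To make the query-plan-analysis item above concrete, here is a tiny, self-contained illustration using SQLite's EXPLAIN QUERY PLAN from Python. The schema and data are invented, and Teradata's EXPLAIN output looks different, but the workflow is the same: inspect the plan, add an index, and confirm the full scan becomes an index search.

```python
# Toy query-plan analysis with SQLite's EXPLAIN QUERY PLAN (the workflow
# carries over to Teradata's EXPLAIN, though the output format differs).
# Schema and data are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

# Without an index, the plan reports a full table scan.
for row in conn.execute("EXPLAIN QUERY PLAN " + query):
    print(row)  # e.g. (..., 'SCAN orders')

# After adding an index, the plan switches to an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
for row in conn.execute("EXPLAIN QUERY PLAN " + query):
    print(row)  # e.g. (..., 'SEARCH orders USING INDEX idx_orders_customer...')
```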


Optional Skills:

Experience with Java/Scala/Python and the Spark framework preferred
Exposure to the BFSI/finance industry preferred
Hadoop/Big Data/cloud experience is not necessary

Notice period: immediate, 15-25 days, max 30 days
at Klubworks
Posted by Anupam Arya
Bengaluru (Bangalore)
3 - 6 yrs
₹12L - ₹18L / yr
Data Analytics
MS-Excel
MySQL
Python
Business Analysis
+9 more
We are looking to hire a Senior Data Analyst to join our data team. You will take responsibility for managing our master data set, developing reports, and troubleshooting data issues. To do well in this role, you need a fine eye for detail, experience as a data analyst, and a deep understanding of popular data analysis tools and databases.

Responsibilities
  • Interpret data, analyze results using statistical techniques and provide ongoing reports
  • Develop and implement databases, data collection systems, data analytics, and other strategies that optimize statistical efficiency and quality
  • Acquire data from primary or secondary data sources and maintain databases/data systems
  • Identify, analyze, and interpret trends or patterns in complex data sets
  • Filter and clean data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems
  • Work with the teams to prioritize business and information needs
  • Locate and define new process improvement opportunities

Requirements- 
  • A minimum of 3 years of working experience as a Data Analyst or Business Data Analyst
  • Technical expertise with data models, database design and development, data mining, and segmentation techniques
  • Strong knowledge of and experience with reporting packages (Business Objects, etc.), databases (SQL, etc.), and programming (XML, JavaScript, or ETL frameworks)
  • Knowledge of statistics and experience using statistical packages for analyzing datasets (Excel, SPSS, SAS, etc.)
  • Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
  • Excellent written and verbal communication skills for coordinating across teams.
  • A drive to learn and master new technologies and techniques.
at Crayon Data
Posted by Varnisha Sethupathi
Chennai
5 - 8 yrs
₹15L - ₹25L / yr
SQL
Python
Analytical Skills
Data modeling
Data Visualization
+1 more

Role: Senior Customer Scientist

Experience: 6-8 years

Location: Chennai (Hybrid)

Who are we?

A young, fast-growing AI and big data company with an ambitious vision to simplify the world’s choices. Our clients are top-tier enterprises in the banking, e-commerce, and travel spaces. They use our core AI-based choice engine, maya.ai, to deliver personal digital experiences centered around taste. The maya.ai platform now touches over 125M customers globally. You’ll find Crayon Boxes in Chennai and Singapore. But you’ll find Crayons in every corner of the world. Especially where our client projects are – UAE, India, SE Asia, and pretty soon the US.

Life in the Crayon Box is a little chaotic, largely dynamic, and keeps us on our toes! Crayons are a diverse and passionate bunch. Challenges excite us. Our mission drives us. And good food, caffeine (for the most part), and youthful energy fuel us. Over the last year alone, Crayon has seen a growth rate of 3x, and we believe this is just the start.

We’re looking for young and young-at-heart professionals with a relentless drive to help Crayon double its growth. Leaders, doers, innovators, dreamers, implementers, and eccentric visionaries, we have a place for you all.

Can you say “Yes, I have!” to the below?
  1. Experience with exploratory analysis, statistical analysis, and model development
  2. Knowledge of advanced analytics techniques, including predictive modelling (logistic regression), segmentation, forecasting, data mining, and optimization
  3. Knowledge of software packages such as SAS, R, and RapidMiner for analytical modelling and data management
  4. Strong experience in SQL/Python/R, working efficiently at scale with large data sets
  5. Experience using business intelligence tools such as Power BI, Tableau, and Metabase for business applications

Can you say “Yes, I will!” to the below? 
 
 

  1. Drive clarity and solve ambiguous, challenging business problems using data-driven approaches. Propose and own data analysis (including modelling, coding, and analytics) to drive business insight and facilitate decisions.
  2. Develop creative solutions and build prototypes for business problems using algorithms based on machine learning, statistics, and optimisation, and work with engineering to deploy those algorithms and create impact in production.
  3. Perform time-series analyses, hypothesis testing, and causal analyses to statistically assess relative impact and extract trends (a small illustration follows this list).
  4. Coordinate individual teams to fulfil client requirements and manage deliverables.
  5. Communicate and present complex concepts to business audiences.
  6. Travel to client locations when necessary.
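
As a small, purely illustrative example of the hypothesis testing mentioned in item 3, the following sketch runs Welch's t-test on simulated data; real work would use actual experiment data, and the effect size and sample sizes here are arbitrary.

```python
# Illustrative hypothesis test: did a (simulated) feature lift average spend?
# Data is synthetic; all numbers are arbitrary for the example.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
control = rng.normal(loc=100.0, scale=15.0, size=500)  # spend without the feature
treated = rng.normal(loc=103.0, scale=15.0, size=500)  # spend with the feature

# Welch's t-test: does not assume equal variances between the groups.
t_stat, p_value = stats.ttest_ind(treated, control, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject the null: the difference in mean spend is significant.")
else:
    print("Fail to reject the null: no significant difference detected.")
```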

 

 

Crayon is an equal opportunity employer. Employment is based on a person's merit, qualifications, and professional competence. Crayon does not discriminate against any employee or applicant because of race, creed, color, religion, gender, sexual orientation, gender identity/expression, national origin, disability, age, genetic information, marital status, pregnancy, or related conditions.
 
 

More about Crayon: https://www.crayondata.com/

More about maya.ai: https://maya.ai/

 

 

at Tide
Posted by Anushka Jain
Remote only
3 - 7 yrs
₹30L - ₹32L / yr
Snowflake schema
Data Warehouse (DWH)
Informatica
ETL
Data modeling
+3 more

About You

As a Senior Data Engineer on the data team, you will be responsible for running the data systems and services that monitor and report on the end-to-end data infrastructure. We are heavily dependent on Snowflake, Airflow, Fivetran, and Looker for our business intelligence, and we embrace AWS as a key partner across our engineering teams. You will report directly to the Head of Data Engineering and work closely with our ML Engineering and Data Science teams.

Some of the things you’ll be doing: 

  • Integration of additional data sources into our Snowflake Data Warehouse using Fivetran or custom code
  • Building infrastructure that helps our analysts to move faster, such as adding tests to our CI/CD systems
  • Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis
  • Maintaining an accurate log of the technical documentation for the warehouse
  • Troubleshooting and resolving technical issues as they arise
  • Ensuring all servers and applications are patched and upgraded in a timely manner
  • Looking for ways of improving both what and how services are delivered by the department
  • Building data loading services for the purpose of importing data from numerous, disparate data sources, inclusive of APIs, logs, relational, and non-relational databases
  • Working with the BI Developer to ensure that all data feeds are optimised and available at the required times. This can include change data capture, change data control, and other “delta loading” approaches (see the sketch after this list)
  • Discovering, transforming, testing, deploying and documenting data sources
  • Applying, helping to define, and championing data warehouse governance: data quality, testing, coding best practices, and peer review
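
As a rough sketch of the “delta loading” idea mentioned above (not Tide's actual implementation), the following Python shows the core watermark pattern: read the last high-water mark, extract only newer rows, upsert them, and advance the mark. All table, column, and connection names are hypothetical, and a production Snowflake/Fivetran setup would handle this differently.

```python
# Minimal watermark-based "delta load" sketch (hypothetical schema/names).
# Idea: pull only rows changed since the last successful run, rather than
# reloading the whole source table. Assumes both tables already exist.
import sqlite3  # stand-in for real source/warehouse connections


def load_delta(src: sqlite3.Connection, dst: sqlite3.Connection) -> None:
    # 1. Read the high-water mark recorded by the previous run.
    row = dst.execute(
        "SELECT last_loaded_at FROM etl_watermarks WHERE table_name = 'orders'"
    ).fetchone()
    watermark = row[0] if row else "1970-01-01T00:00:00"

    # 2. Extract only rows updated after the watermark.
    changed = src.execute(
        "SELECT id, amount, updated_at FROM orders WHERE updated_at > ?",
        (watermark,),
    ).fetchall()
    if not changed:
        return

    # 3. Upsert the delta: delete any stale copies, then insert fresh rows.
    dst.executemany("DELETE FROM orders WHERE id = ?", [(r[0],) for r in changed])
    dst.executemany("INSERT INTO orders VALUES (?, ?, ?)", changed)

    # 4. Advance the watermark so the next run starts where this one ended.
    new_mark = max(r[2] for r in changed)
    dst.execute(
        "UPDATE etl_watermarks SET last_loaded_at = ? WHERE table_name = 'orders'",
        (new_mark,),
    )
    dst.commit()
```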

What you’ll get in return:

  • Competitive Salary
  • Family & Self Health Insurance
  • Life & Accidental Insurance
  • 25 days of annual leave
  • We invest in your development with a professional L&D budget (a fixed amount of 40,000 per year)
  • Flexible working options
  • Share Options

You’ll be a great fit if:

  • You have 4+ years of experience in data engineering
  • You have extensive experience building ELT pipelines (Snowflake is an added advantage)
  • You have experience in building data solutions, both batch processes and streaming applications
  • You have extensive experience in designing, architecting, and implementing data engineering best practices
  • You have good experience in data modelling
  • You have extensive experience in writing SQL statements and performance-tuning them
  • You have experience in data mining, data warehouse solutions, and ETL, and in using databases in a business environment with large-scale, complex datasets
  • You have experience architecting analytical databases
  • You have experience working in a data engineering or data warehousing team
  • You have high development standards, especially for code quality, code reviews, unit testing, continuous integration, and deployment
  • You have strong technical documentation skills and the ability to be clear and precise with business users
  • You have a business level of English and good communication skills
  • You have knowledge of various systems across the AWS platform and the roles they play, e.g. Lambda, DynamoDB, CloudFormation, Glue
  • You have experience with Git and Docker
  • You have experience with Snowflake, dbt, Apache Airflow, Python, Fivetran, AWS, Git, and Looker

Who are Tide?

We’re the UK’s leading provider of smart current accounts for sole traders and small companies. We’re also on a mission to save business owners time and money on their banking and finance admin so they can get back to doing what they love - for too long, these customers have been under-served by the big banks.

Our offices are in London, UK, Sofia, Bulgaria and Hyderabad, India, where our teams are dedicated to our small business members, revolutionising business banking for SMEs. We are also the leading provider of UK SME business accounts and one of the fastest-growing fintechs in the UK.

We’re scaling at speed with a focus on hiring talented individuals with a growth mindset and ownership mentality who are able to juggle multiple and sometimes changing priorities. Our values show our commitment to working as one team: working collaboratively to take action and deliver results. Member first: we are passionate about our members and put them first. Data-driven: we make decisions using insight created from data.

We’re also one of LinkedIn’s top 10 hottest UK companies to work for.

Here’s what we think about diversity and inclusion…

We build our services for all types of small business owners. We aim to be as diverse as our members so we hire people from a variety of backgrounds. We’re proud that our diversity not only reflects our multicultural society but that this breadth of experience makes us awesome at solving problems. Everyone here has a voice and you’ll be able to make a difference. If you share our values and want to help small businesses, you’ll make an amazing Tidean.

A note on the future of work at Tide:

Tide’s offices are beginning to open for Tideans to return on a voluntary basis. Timelines for reopening will be unique for each region and will be based on region-specific guidelines. The health and well-being of Tideans and candidates is our primary concern, therefore, for the foreseeable future, we have transitioned all interviews and onboarding to be conducted via Zoom.

Once offices are fully open, Tideans will be able to choose to work from the office or remotely, with the requirement that they visit the office or participate in face-to-face team activities several times per month.

at Srijan Technologies
Posted by PriyaSaini
Remote only
3 - 8 yrs
₹5L - ₹12L / yr
Data Analytics
Data modeling
Python
PySpark
ETL
+3 more

Role Description:

  • You will be part of the data delivery team and will have the opportunity to develop a deep understanding of the domain/function.
  • You will design and drive the work plan for the optimization/automation and standardization of the processes incorporating best practices to achieve efficiency gains.
  • You will run data engineering pipelines, link raw client data with the data model, conduct data assessments, perform data quality checks, and transform data using ETL tools.
  • You will perform data transformation, modeling, and validation activities, as well as configure applications to the client context. You will also develop scripts to validate, transform, and load raw data using programming languages such as Python and/or PySpark.
  • In this role, you will determine database structural requirements by analyzing client operations, applications, and programming.
  • You will develop cross-site relationships to enhance idea generation, and manage stakeholders.
  • Lastly, you will collaborate with the team to support ongoing business processes by delivering high-quality end products on time, and you will perform quality checks wherever required.

Job Requirement:

  • Bachelor’s degree in Engineering or Computer Science; a Master’s degree is a plus
  • 3+ years of professional work experience with a reputed analytics firm
  • Expertise in handling large amounts of data through Python or PySpark
  • Ability to conduct data assessments, perform data quality checks, and transform data using SQL and ETL tools (see the sketch after this list)
  • Experience deploying ETL/data pipelines and workflows in cloud technologies and architectures such as Azure and Amazon Web Services will be valued
  • Comfort with data modelling principles (e.g. database structure, entity relationships, UIDs, etc.) and software development principles (e.g. modularization, testing, refactoring, etc.)
  • A thoughtful and comfortable communicator (verbal and written) with the ability to facilitate discussions and conduct training
  • Strong problem-solving, requirements-gathering, and leadership skills
  • A track record of completing projects successfully on time, within budget, and as per scope
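
Purely as an illustration of the data quality checks mentioned in the list above, here is a minimal PySpark sketch; the input path, column names, and rules are assumptions invented for the example.

```python
# Hypothetical PySpark data quality checks: count nulls, duplicates, and
# range violations before transforming the data onward. The path and
# column names are invented for the example.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("s3://example-bucket/raw/customers/")  # assumed input

checks = {
    # Rule: customer_id must be present and unique.
    "null_customer_id": df.filter(F.col("customer_id").isNull()).count(),
    "duplicate_customer_id": df.count() - df.dropDuplicates(["customer_id"]).count(),
    # Rule: signup_date must not lie in the future.
    "future_signup_date": df.filter(F.col("signup_date") > F.current_date()).count(),
}

failed = {name: n for name, n in checks.items() if n > 0}
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
```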

A Pre-series A funded FinTech Company
Agency job
via GoHyre by Avik Majumder
Bengaluru (Bangalore)
3 - 6 yrs
₹15L - ₹30L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+6 more

Responsibilities:

  • Ensure and own Data integrity across distributed systems.
  • Extract, Transform and Load data from multiple systems for reporting into BI platform.
  • Create Data Sets and Data models to build intelligence upon.
  • Develop and own various integration tools and data points.
  • Hands-on development and/or design within the project in order to maintain timelines.
  • Work closely with the Project manager to deliver on business requirements OTIF (on time in full)
  • Understand the cross-functional business data points thoroughly and be SPOC for all data-related queries.
  • Work with both Web Analytics and Backend Data analytics.
  • Support the rest of the BI team in generating reports and analysis
  • Quickly learn and use bespoke and third-party SaaS reporting tools with little documentation.
  • Assist in presenting demos and preparing materials for Leadership.

 Requirements:

  • Strong experience in data warehouse modeling techniques and SQL queries
  • A good understanding of designing, developing, deploying, and maintaining Power BI report solutions
  • Ability to create KPIs, visualizations, reports, and dashboards based on business requirements
  • Knowledge and experience in prototyping, design, and requirements analysis
  • Ability to implement row-level security on data and an understanding of application security layer models in Power BI
  • Proficiency in writing DAX queries in Power BI Desktop
  • Expertise in using advanced-level calculations on data sets
  • Experience in the fintech domain and stakeholder management
at CES IT
Posted by Yash Rathod
Hyderabad
7 - 12 yrs
₹5L - ₹15L / yr
Machine Learning (ML)
Deep Learning
Python
Data modeling
  • A critical-thinking mind that likes to solve complex problems, loves programming, and cherishes working in a fast-paced environment
  • Strong Python development skills, with 7+ years of experience with SQL
  • A bachelor's or master's degree in Computer Science or a related area
  • 5+ years of experience in data integration and pipeline development
  • Experience implementing Databricks Delta Lake and data lakes
  • Expertise in designing and implementing data pipelines using modern data engineering approaches and tools: SQL, Python, Delta Lake, Databricks, Snowflake, Spark
  • Experience working with multiple file formats (Parquet, Avro, Delta Lake) and APIs
  • Experience with AWS Cloud for data integration with S3
  • Hands-on development experience with Python and/or Scala
  • Experience with SQL and NoSQL databases
  • Experience using data modeling techniques and tools (focused on dimensional design)
  • Experience with microservice architecture using Docker and Kubernetes
  • Experience working with one or more of the public cloud providers, i.e. AWS, Azure, or GCP
  • Experience in effectively presenting and summarizing complex data to diverse audiences through visualizations and other means
  • Excellent verbal and written communication skills and strong leadership capabilities

Skills:
ML
Modelling
Python
SQL
Azure Data Lake, Data Factory, Databricks, Delta Lake
at Servian
Posted by sakshi nigam
Bengaluru (Bangalore)
2 - 8 yrs
₹10L - ₹25L / yr
Data engineering
ETL
Data Warehouse (DWH)
Powershell
DA
+7 more
Who we are
 
We are a consultant-led organisation. We invest heavily in our consultants to ensure they have the technical skills and commercial acumen to be successful in their work.

Our consultants have a passion for data and solving complex problems. They are curious, ambitious, and experts in their fields. We have developed a first-rate team, so you will be supported and learn from the best.

About the role

  • Collaborating with a team of like-minded and experienced engineers for Tier 1 customers, you will focus on data engineering on large, complex data projects. Your work will have an impact on platforms that handle crores of customers and millions of transactions daily.

  • As an engineer, you will use the latest cloud services to design and develop reusable core components and frameworks to modernise data integrations in a cloud-first world, and you will own those integrations end to end, working closely with business units. You will design and build for efficiency, reliability, security, and scalability. As a consultant, you will help drive a data engineering culture and advocate best practices.

Mandatory experience

    • 1-6 years of relevant experience
    • Strong SQL skills and data literacy
    • Hands-on experience designing and developing data integrations, either in ETL tools, cloud native tools or in custom software
    • Proficiency in scripting and automation (e.g. PowerShell, Bash, Python)
    • Experience in an enterprise data environment
    • Strong communication skills

Desirable experience

    • Ability to work on data architecture, data models, data migration, integration and pipelines
    • Ability to work on data platform modernisation from on-premise to cloud-native
    • Proficiency in data security best practices
    • Stakeholder management experience
    • Positive attitude with the flexibility and ability to adapt to an ever-changing technology landscape
    • Desire to gain breadth and depth of technologies to support customer's vision and project objectives

What to expect if you join Servian?

    • Learning & development: We invest heavily in our consultants, offer internal training weekly (both technical and non-technical alike!), and abide by a “You Pass, We Pay” policy.
    • Career progression: We take a longer-term view of every hire. We have a flat org structure and promote from within. Every hire is developed as a future leader and client adviser.
    • Variety of projects: As a consultant, you will have the opportunity to work across multiple projects across our client base, significantly increasing your skills and exposure in the industry.
    • Great culture: Working on the latest Apple MacBook Pro in our custom-designed offices in the heart of leafy Jayanagar, we provide a peaceful and productive work environment close to shops, parks, and a metro station.
    • Professional development: We invest heavily in professional development, both technically, through training and guided certification pathways, and in consulting, through workshops on client engagement and communication. Growth in our organisation happens from the growth of our people.
at The other Fruit
Posted by Dipendra SIngh
Pune
1 - 5 yrs
₹3L - ₹15L / yr
Machine Learning (ML)
Artificial Intelligence (AI)
Python
Data Structures
Algorithms
+17 more
 
SD (ML and AI) job description:

  • An advanced degree in computer science, math, statistics, or a related discipline (a master's degree is a must)
  • Extensive data modeling and data architecture skills
  • Programming experience in Python and R
  • Background in machine learning frameworks such as TensorFlow or Keras
  • Knowledge of Hadoop or other distributed computing systems
  • Experience working in an Agile environment
  • Advanced math skills (important): linear algebra, discrete math, differential equations (ODEs and numerical), theory of statistics 1, numerical analysis 1 (numerical linear algebra) and 2 (quadrature), abstract algebra, number theory, real analysis, complex analysis, and intermediate analysis (point-set topology)
  • Strong written and verbal communication skills
  • Hands-on experience with NLP and NLG
  • Experience with advanced statistical techniques and concepts (GLM/regression, random forests, boosting, trees, text mining) and their application
 
A 15-year-old, US-based product company
Agency job
Chennai, Bengaluru (Bangalore), Hyderabad
4 - 10 yrs
₹9L - ₹20L / yr
Informatica
informatica developer
Informatica MDM
Data integration
Informatica Data Quality
+7 more
  • Should have good hands-on experience with Informatica MDM Customer 360, data integration (ETL) using PowerCenter, and Data Quality.
  • Must have strong skills in data analysis, data mapping for ETL processes, and data modeling.
  • Experience with the SIF framework, including real-time integration.
  • Should have experience in building C360 Insights using Informatica.
  • Should have good experience in creating performant designs using Mapplets, Mappings, and Workflows for Data Quality (cleansing) and ETL.
  • Should have experience in building different data warehouse architectures, such as Enterprise, Federated, and Multi-Tier architectures.
  • Should have experience in configuring Informatica Data Director with reference to the data governance of users, IT managers, and data stewards.
  • Should have good knowledge of developing complex PL/SQL queries.
  • Should have working experience with UNIX and shell scripting to run Informatica workflows and control the ETL flow.
  • Should know about Informatica server installation and have knowledge of the Administration console.
  • Working experience as a Developer with Administration knowledge is a plus.
  • Working experience with Amazon Web Services (AWS) is an added advantage, particularly AWS S3, Data Pipeline, Lambda, Kinesis, DynamoDB, and EMR.
  • Should be responsible for the creation of automated BI solutions, including requirements, design, development, testing, and deployment.