Data Analyst - Health Care domain

at MNC
Agency job
3 - 8 yrs
₹15L - ₹18L / yr
Bengaluru (Bangalore)
Skills
Data Analytics
SQL server
SQL
Data Analyst

1. Work independently and set priorities while managing several projects simultaneously; strong attention to detail is essential.
2. Collaborate with Business Systems Analysts and/or directly with key business users to ensure business requirements and report specifications are documented accurately and completely.
3. Develop data field mapping documentation.
4. Document data sources and processing flow.
5. Design, refine, and enhance existing reports from source systems or the data warehouse.
6. Analyze and optimize data, including the data deduplication required for reports.
7. Analyze and rationalize reports.
8. Support QA and UAT teams in defining test scenarios and clarifying requirements.
9. Effectively communicate the results of data analysis to internal and external customers to support decision making.
10. Follow established SDLC, change control, release management, and incident management processes.
11. Perform source data analysis and assessment.
12. Perform data profiling to capture business and technical rules.
13. Track and help remediate issues and defects caused by data quality exceptions.
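To make the deduplication and profiling duties above concrete, here is a minimal sketch using Python's built-in sqlite3. The table and all values are invented for illustration; a real engagement would run the same pattern against the source system or warehouse.

```python
import sqlite3

# Hypothetical patient-visits table, used only for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (mrn TEXT, name TEXT, visit_date TEXT)")
conn.executemany(
    "INSERT INTO patients VALUES (?, ?, ?)",
    [
        ("M001", "A. Rao", "2023-01-05"),
        ("M001", "A. Rao", "2023-01-05"),   # exact duplicate row
        ("M002", "B. Iyer", "2023-02-10"),
    ],
)

# Profiling: comparing total vs. distinct counts exposes duplication.
total = conn.execute("SELECT COUNT(*) FROM patients").fetchone()[0]
distinct = conn.execute(
    "SELECT COUNT(*) FROM (SELECT DISTINCT mrn, name, visit_date FROM patients)"
).fetchone()[0]
print(total, distinct)  # 3 2

# Deduplication: keep the first physical row of each duplicate group.
conn.execute("""
    DELETE FROM patients
    WHERE rowid NOT IN (
        SELECT MIN(rowid) FROM patients GROUP BY mrn, name, visit_date
    )
""")
remaining = conn.execute("SELECT COUNT(*) FROM patients").fetchone()[0]
print(remaining)  # 2
```

The same total-vs-distinct comparison is a cheap first data-profiling check before writing field mappings or report specs.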



Similar jobs

Bengaluru (Bangalore), Gurugram
2 - 6 yrs
₹12L - ₹14L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+6 more

Role - Senior Executive Analytics (BI)


Overview of job -


Our Client is the world's largest media investment company and a part of WPP; it is responsible for one in every three ads you see globally. We are currently looking for a Senior Executive Analytics to join us. In this role, you will have a massive opportunity to build, and be part of, the largest performance marketing setup in APAC. We are committed to fostering a culture of diversity and inclusion. Our people are our strength, so we respect and nurture their individual talent and potential.


Reporting of the role - This role reports to the Director - Analytics.


3 best things about the job:


1. Responsible for data & analytics projects and for developing data strategies by diving into data, extracting insights, and providing guidance to clients


2. Build and be a part of a dynamic team


3. Being part of a global organisation with rapid growth opportunities


Responsibilities of the role:


• Design and build visualizations, dashboards, and reports for both internal and external clients using Tableau, Power BI, Datorama, or R Shiny/Python, SQL, and Alteryx.

• Work with large data sets via hands-on data processing to produce structured data sets for analysis.

• Good to have - Experience building Marketing-Mix and Multi-Touch Attribution models using a range of tools, both free and paid.


What you will need:


• Degree in Mathematics, Statistics, Economics, Engineering, Data Science, Computer Science, or another quantitative field.

• Proficiency in one or more coding languages – preferred languages: Python, R

• Proficiency in one or more Visualization Tools – Tableau, Datorama, Power BI

• Proficiency in using SQL.

• Minimum 3 years of experience in Marketing/Data Analytics or related field with hands-on experience in building Marketing-Mix and Attribution models is a plus.
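The attribution modeling mentioned above can be sketched in a few lines. This is a minimal illustration of two common credit-assignment rules, linear and last-touch; the channel names and conversion path are invented, and production models would of course be fit on real path data.

```python
def linear_attribution(path, conversion_value=1.0):
    """Split conversion credit equally across every touchpoint in the path."""
    share = conversion_value / len(path)
    credit = {}
    for channel in path:
        credit[channel] = credit.get(channel, 0.0) + share
    return credit

def last_touch_attribution(path, conversion_value=1.0):
    """Give all credit to the final touchpoint before conversion."""
    return {path[-1]: conversion_value}

# Hypothetical conversion path: channels a user touched before converting.
path = ["search", "social", "email", "search"]
print(linear_attribution(path))      # {'search': 0.5, 'social': 0.25, 'email': 0.25}
print(last_touch_attribution(path))  # {'search': 1.0}
```

Comparing the two rules on the same paths is a quick way to show a client how much their channel rankings depend on the attribution assumption.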

Remote, Chennai
3 - 6 yrs
Best in industry
Machine Learning (ML)
Data Science
Deep Learning
Artificial Intelligence (AI)
Python
+1 more

Skills: Machine Learning, Deep Learning, Artificial Intelligence, Python

Location: Chennai


Domain knowledge:
Data cleaning, modelling, analytics, statistics, machine learning, AI

Requirements:

· To be part of Digital Manufacturing and Industrie 4.0 projects across the Saint-Gobain group of companies

· Design and develop AI/ML models to be deployed across SG factories

· Knowledge of Hadoop, Apache Spark, MapReduce, Scala, Python programming, and SQL and NoSQL databases is required

·         Should be strong in statistics, data analysis, data modelling, machine learning techniques and Neural Networks

·         Prior experience in developing AI and ML models is required

·         Experience with data from the Manufacturing Industry would be a plus

Roles and Responsibilities:

·         Develop AI and ML models for the Manufacturing Industry with a focus on Energy, Asset Performance Optimization and Logistics

· Multitasking ability and good communication are necessary

·         Entrepreneurial attitude.
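As a flavor of the asset-performance work described above, here is a minimal anomaly-detection sketch using z-scores over sensor readings. The readings and the 2-sigma threshold are invented for illustration; real factory models would be far richer.

```python
import statistics

def zscore_anomalies(readings, threshold=2.0):
    """Flag readings whose z-score (distance from the mean, in
    sample standard deviations) exceeds the threshold."""
    mean = statistics.mean(readings)
    stdev = statistics.stdev(readings)
    return [x for x in readings if abs(x - mean) / stdev > threshold]

# Hypothetical hourly energy-consumption readings (kWh) from one machine.
readings = [100, 102, 98, 101, 99, 100, 150]  # 150 is a consumption spike
print(zscore_anomalies(readings))
```

A simple statistical baseline like this is often the first pass before training a neural network on the same signal.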

 
Gurugram, Bengaluru (Bangalore), Chennai
2 - 9 yrs
₹9L - ₹27L / yr
DevOps
Microsoft Windows Azure
gitlab
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
+15 more
Greetings!

We are looking for a technically driven "ML Ops Engineer" for one of our premium clients.

COMPANY DESCRIPTION:
This Company is a global management consulting firm. We are the trusted advisor to the world's leading businesses, governments, and institutions. We work with leading organizations across the private, public and social sectors. Our scale, scope, and knowledge allow us to address


Key Skills
• Excellent hands-on expert knowledge of cloud platform infrastructure and administration
(Azure/AWS/GCP) with strong knowledge of cloud services integration, and cloud security
• Expertise setting up CI/CD processes, building and maintaining secure DevOps pipelines with at
least 2 major DevOps stacks (e.g., Azure DevOps, Gitlab, Argo)
• Experience with modern development methods and tooling: Containers (e.g., docker) and
container orchestration (K8s), CI/CD tools (e.g., Circle CI, Jenkins, GitHub actions, Azure
DevOps), version control (Git, GitHub, GitLab), orchestration/DAGs tools (e.g., Argo, Airflow,
Kubeflow)
• Hands-on coding skills in Python 3 (e.g., APIs, including automated testing frameworks and libraries such as pytest), Infrastructure as Code (e.g., Terraform), and Kubernetes artifacts (e.g., deployments, operators, Helm charts)
• Experience setting up at least one contemporary MLOps tooling (e.g., experiment tracking,
model governance, packaging, deployment, feature store)
• Practical knowledge delivering and maintaining production software such as APIs and cloud
infrastructure
• Knowledge of SQL (intermediate level or more preferred) and familiarity working with at least
one common RDBMS (MySQL, Postgres, SQL Server, Oracle)
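One of the MLOps capabilities listed above, experiment tracking, can be sketched very simply: log each run's parameters and metrics as a JSON line, then query for the best run. The file layout and field names here are invented stand-ins for what a tool like MLflow would manage.

```python
import json
import os
import tempfile
from datetime import datetime, timezone

def log_run(path, params, metrics):
    """Append one experiment run as a JSON line with a UTC timestamp."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "params": params,
        "metrics": metrics,
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

def best_run(path, metric):
    """Return the logged run with the highest value of the given metric."""
    with open(path) as f:
        runs = [json.loads(line) for line in f]
    return max(runs, key=lambda r: r["metrics"][metric])

log_path = os.path.join(tempfile.mkdtemp(), "runs.jsonl")
log_run(log_path, {"lr": 0.1}, {"accuracy": 0.81})
log_run(log_path, {"lr": 0.01}, {"accuracy": 0.88})
print(best_run(log_path, "accuracy")["params"])  # {'lr': 0.01}
```

The append-only, queryable-log pattern is the core idea that dedicated experiment trackers build on with UIs, artifact stores, and model registries.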
DFCS Technologies
Agency job
via DFCS Technologies by SheikDawood Ali
Remote, Chennai, Anywhere India
1 - 5 yrs
₹9L - ₹14L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+5 more
  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.

  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
    • Experience with big data tools: Hadoop, Spark, Kafka, etc.
    • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
    • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
    • Experience with AWS cloud services: EC2, EMR, RDS, Redshift
    • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
    • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
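The extract-transform-load responsibilities above can be sketched as a chain of Python generators, which is how many lightweight pipelines are structured before graduating to Airflow or Spark. The records and the single quality rule are invented for illustration.

```python
# Minimal extract -> transform -> load pipeline sketched with generators.
def extract(rows):
    """Source stage: yield raw records (an in-memory stand-in for a real source)."""
    yield from rows

def transform(records):
    """Normalize fields and drop records that fail a basic quality rule."""
    for r in records:
        if r.get("amount") is None:
            continue  # data-quality exception; a real pipeline would route this to remediation
        yield {"customer": r["customer"].strip().lower(), "amount": float(r["amount"])}

def load(records):
    """Sink stage: collect into a list standing in for a warehouse table."""
    return list(records)

raw = [
    {"customer": " Acme ", "amount": "120.50"},
    {"customer": "Globex", "amount": None},   # rejected by the quality rule
]
loaded = load(transform(extract(raw)))
print(loaded)  # [{'customer': 'acme', 'amount': 120.5}]
```

Because each stage is a generator, records stream through one at a time, the same shape that scales up to the message-queue and stream-processing systems the listing mentions.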
Leena AI
at Leena AI
13 recruiters
Preethi Gothandam
Posted by Preethi Gothandam
Remote only
2 - 8 yrs
₹25L - ₹40L / yr
Natural Language Processing (NLP)
Machine Learning (ML)
Data Science
Data Analytics

Responsibilities: 

  • Improve the robustness of Leena AI's current NLP stack
  • Increase the zero-shot learning capability of Leena AI's current NLP stack
  • Opportunity to add/build new NLP architectures based on requirements
  • Manage the end-to-end lifecycle of data in the system until it achieves more than 90% accuracy
  • Manage an NLP team
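The 90% accuracy target above implies a concrete evaluation loop. Here is a minimal sketch of that measurement; the intent labels and the tiny evaluation set are invented for illustration.

```python
def accuracy(predicted, expected):
    """Fraction of utterances where the predicted intent matches the label."""
    matches = sum(p == e for p, e in zip(predicted, expected))
    return matches / len(expected)

# Hypothetical intent-classification results on a tiny evaluation set.
expected  = ["leave_balance", "payslip", "it_ticket", "leave_balance"]
predicted = ["leave_balance", "payslip", "it_ticket", "payslip"]
score = accuracy(predicted, expected)
print(score)  # 0.75
```

In practice the same loop would run over a held-out set after each retraining cycle, gating deployment on the accuracy threshold.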

Requirements: 

  • Strong understanding of linear algebra, optimisation, probability, statistics 
  • Experience in the data science methodology from exploratory data analysis, feature engineering, model selection, deployment of the model at scale and model evaluation 
  • Experience in deploying NLP architectures in production 
  • Understanding of the latest NLP architectures, such as transformers, is good to have 
  • Experience in adversarial attacks/robustness of DNNs is good to have 
  • Experience with a Python web framework (Django) and analytics and machine learning frameworks like TensorFlow/Keras/PyTorch. 
Mumbai
5 - 7 yrs
₹20L - ₹25L / yr
ETL
Talend
OLAP
Data governance
SQL
+8 more
  • Key responsibility is to design and develop a data pipeline, including the architecture, prototyping, and development of data extraction, transformation/processing, cleansing/standardizing, and loading into the Data Warehouse at real-time or near-real-time frequency. Source data can be in structured, semi-structured, and/or unstructured formats.
  • Provide technical expertise to design efficient data ingestion solutions to consolidate data from RDBMSs, APIs, messaging queues, weblogs, images, audio, documents, etc. of enterprise applications, SaaS applications, and external third-party sites or APIs through ETL/ELT, API integrations, Change Data Capture, Robotic Process Automation, custom Python/Java coding, etc.
  • Development of complex data transformation using Talend (BigData edition), Python/Java transformation in Talend, SQL/Python/Java UDXs, AWS S3, etc to load in OLAP Data Warehouse in Structured/Semi-structured form
  • Development of data model and creating transformation logic to populate models for faster data consumption with simple SQL.
  • Implementing automated Audit & Quality assurance checks in Data Pipeline
  • Document & maintain data lineage to enable data governance
  • Coordination with BIU, IT, and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading into downstream systems, NoSQL databases, etc.
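One ingestion technique named above, Change Data Capture, can be illustrated with a snapshot diff: compare two keyed snapshots of a table and emit insert/update/delete events. The table contents are invented; real CDC would typically tail the database's transaction log instead of diffing snapshots.

```python
def capture_changes(old_snapshot, new_snapshot):
    """Diff two keyed snapshots into (operation, key, row) change events."""
    changes = []
    for key, row in new_snapshot.items():
        if key not in old_snapshot:
            changes.append(("insert", key, row))
        elif old_snapshot[key] != row:
            changes.append(("update", key, row))
    for key in old_snapshot:
        if key not in new_snapshot:
            changes.append(("delete", key, old_snapshot[key]))
    return changes

# Hypothetical before/after snapshots of a source table keyed by id.
old = {1: {"status": "open"}, 2: {"status": "closed"}}
new = {1: {"status": "closed"}, 3: {"status": "open"}}
print(capture_changes(old, new))
```

Feeding only these change events downstream, instead of full reloads, is what keeps a near-real-time pipeline cheap.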

Requirements

  • Programming experience using Python / Java, to create functions / UDX
  • Extensive technical experience with SQL on RDBMS (Oracle/MySQL/Postgresql etc) including code optimization techniques
  • Strong ETL/ELT skillset using Talend BigData Edition. Experience in Talend CDC & MDM functionality will be an advantage.
  • Experience & expertise in implementing complex data pipelines, including semi-structured & unstructured data processing
  • Expertise in designing efficient data ingestion solutions to consolidate data from RDBMSs, APIs, messaging queues, weblogs, images, audio, documents, etc. of enterprise applications, SaaS applications, and external third-party sites or APIs through ETL/ELT, API integrations, Change Data Capture, Robotic Process Automation, custom Python/Java coding, etc.
  • Good understanding & working experience in OLAP Data Warehousing solutions (Redshift, Synapse, Snowflake, Teradata, Vertica, etc) and cloud-native Data Lake (S3, ADLS, BigQuery, etc) solutions
  • Familiarity with AWS tool stack for Storage & Processing. Able to recommend the right tools/solutions available to address a technical problem
  • Good knowledge of database performance and tuning, troubleshooting, query optimization, and tuning
  • Good analytical skills with the ability to synthesize data to design and deliver meaningful information
  • Good knowledge of Design, Development & Performance tuning of 3NF/Flat/Hybrid Data Model
  • Know-how on any No-SQL DB (DynamoDB, MongoDB, CosmosDB, etc) will be an advantage.
  • Ability to understand business functionality, processes, and flows
  • Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently

Functional knowledge

  • Data Governance & Quality Assurance
  • Distributed computing
  • Linux
  • Data structures and algorithm
  • Unstructured Data Processing
Namaste Credit
at Namaste Credit
9 recruiters
Abhishek J Shiri
Posted by Abhishek J Shiri
Remote, Bengaluru (Bangalore)
3 - 6 yrs
₹5L - ₹10L / yr
analyst
analysis
DA
SQL
Joint application design
+3 more

About the Role

The following Job Specification is intended to reflect the nature, range and context of the work. It identifies the main requirements of the job but is not an exhaustive list of duties.

 

The Data Analyst is a vital role: the sole point of contact working closely with various teams internally and with the CTO. The role requires analysing a vast number of datasets and presenting their meaningful impact, alongside the growth of the company, to the management. It carries significant responsibility for close communication with other departments, so it needs someone who genuinely loves the work.

 

Job Description

  • Interpret data, analyze results using statistical techniques and provide ongoing reports
  • Develop and implement data analytics and other strategies that optimize statistical efficiency and quality
  • Strong experience at working on databases and good at analytical reasoning
  • Acquire data from Finance, Technology, Marketing and Sales data sources and maintain data systems
  • Identify, analyze, and interpret trends or patterns in complex data sets
  • Should be able to highlight Incorrect data and record the pattern of Drawbacks relating to the cleansing of data modules
  • Must be involved in reconciliation of data sets and components across verticals depending on the need
  • Filter and “clean” data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems
  • Work with the management to prioritize business and information needs
  • Locate and define new process improvement opportunities
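The data-cleansing duties above, highlighting incorrect data and recording the pattern of drawbacks, can be sketched as a small validation pass that separates clean rows from bad ones and records which rule each bad row broke. The records and the two rules are invented for illustration.

```python
def flag_bad_records(records):
    """Split records into clean rows and an index of validation failures,
    recording which rule each bad row broke."""
    clean, issues = [], {}
    for i, r in enumerate(records):
        if not r.get("id"):
            issues[i] = "missing id"
        elif not str(r.get("revenue", "")).replace(".", "", 1).isdigit():
            issues[i] = "non-numeric revenue"
        else:
            clean.append(r)
    return clean, issues

# Hypothetical rows pulled from a finance data source.
records = [
    {"id": "A1", "revenue": "120.5"},
    {"id": "",   "revenue": "99"},      # missing id
    {"id": "A3", "revenue": "n/a"},     # non-numeric revenue
]
clean, issues = flag_bad_records(records)
print(len(clean), issues)  # 1 {1: 'missing id', 2: 'non-numeric revenue'}
```

Tallying the `issues` values over a full extract gives exactly the "pattern of drawbacks" the role is asked to record.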

 

 

Required Skills:

 

  • Proven working experience as a data analyst or business data analyst
  • Sound experience in MySQL, Postgres, Oracle databases and fluent in SQL scripts
  • Technical expertise regarding data models, database design and development, data mining, and segmentation techniques
  • Strong knowledge of and experience with reporting packages (Business Objects etc), databases (SQL etc), programming (XML, Javascript, or ETL frameworks)
  • Knowledge of statistics and experience using statistical packages for analyzing datasets (Excel, SPSS, SAS etc)
  • Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
  • Adept at queries, report writing and presenting findings
  • BS in Mathematics, Economics, Computer Science, Information Management or Statistics

 

DataMetica
at DataMetica
1 video
7 recruiters
Sumangali Desai
Posted by Sumangali Desai
Pune
3 - 8 yrs
₹5L - ₹20L / yr
ETL
Data Warehouse (DWH)
IBM InfoSphere DataStage
DataStage
SQL
+1 more

Datametica is hiring a DataStage Developer

  • Must have 3 to 8 years of experience in ETL Design and Development using IBM Datastage Components.
  • Should have extensive knowledge in Unix shell scripting.
  • Understanding of DW principles (Fact, Dimension tables, Dimensional Modelling and Data warehousing concepts).
  • Research, develop, document, and modify ETL processes as per data architecture and modeling requirements.
  • Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
  • Should be good in writing complex SQL queries.

About Us!

A global Leader in the Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.

 

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

 

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

 

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

 

Why join us!

Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in achieving our success.

 

Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy

 

Check out more about us on our website below!

www.datametica.com

 

Dataweave Pvt Ltd
at Dataweave Pvt Ltd
32 recruiters
Megha M
Posted by Megha M
Bengaluru (Bangalore)
7 - 10 yrs
Best in industry
Delivery Management
Data Analytics
Agile/Scrum
Project delivery

Job Description: Project Manager

Internal Role: Delivery Manager     

Type: Full time

Location: Bangalore  

About Dataweave:

DataWeave provides “Competitive Intelligence as a Service” to Retailers and Consumer brands helping them optimize their offerings through effective pricing, assortment and promotional recommendations.

It's hard to tell what we love more, problems or solutions! Every day, we choose to address some of the hardest data problems that there are. We are in the business of making sense of messy public data on the Web. At serious scale!

Read more at Become a DataWeaver: http://dataweave.com/about/become-dataweaver

 

Job Description:

  • Develop a detailed project plan (Ensuring resource availability and allocation) to track progress
  • Developing project scopes and objectives, involving all relevant stakeholders and ensuring technical feasibility
  • Ensure that all projects are delivered on-time, within scope and within budget
  • Coordinate internal resources and third parties/vendors for the flawless execution of projects
  • Use appropriate verification techniques to manage risks, changes in project scope, schedule and costs
  • Measure project performance using appropriate systems, tools and techniques
  • Report and escalate to management as needed
  • Establish and maintain relationships with Clients, internal stakeholders, third parties / vendors
  • Create and maintain comprehensive project documentation
  • Definition of service level agreements (SLAs) in relation to services, ensuring the SLAs are achieved and that data sanity and hygiene and client expectations are exceeded
  • Effectively monitor, control and support delivery, ensuring standard operating procedures and methodologies are followed
  • Create KPIs for monitoring and review and publish health stats on a recurring basis

 

 

Key Skills / Knowledge required:

  • Proven ability to switch context, manage multiple short projects
  • Relevant experience of 3-5 years and overall experience of 8+ yrs
  • Exposure to Analytics delivery process, ability to troubleshoot using insights within the data
  • Basic Database query skills, with knowledge of Web-crawling concepts is preferred
  • Excellent communication skills
  • Team management and ability to deliver while working with cross functional teams
  • Exposure to Agile Development (Scrum / Kanban) methodology is a plus

 

Mumbai, Pune
0 - 3 yrs
₹0L / yr
Business Development
Data Analytics
Client Servicing
Sales
Presales
+1 more
  • Help the team source good startups that are looking to raise funds.
  • Carry out a preliminary, basic understanding and evaluation of the business startups.
  • Coordinate and schedule communication slots with founders of startups.
  • Follow up on information sought from startups.
  • Maintain records of the startups being evaluated, selected, and rejected.
  • Work in sync with the analysts' team and provide them with back-office support.
  • Any other tasks that may come up and need to be done by the team from time to time.