Data steward Jobs in Pune


Apply to 11+ Data steward Jobs in Pune on CutShort.io. Explore the latest Data steward Job opportunities across top companies like Google, Amazon & Adobe.

Infogain
Agency job
via Technogen India Pvt. Ltd. by Rahul Batta
NCR (Delhi | Gurgaon | Noida), Bengaluru (Bangalore), Mumbai, Pune
7 - 8 yrs
₹15L - ₹16L / yr
Data steward
MDM
Tamr
Reltio
Data engineering
+7 more
1. Data Steward:

The Data Steward will collaborate and work closely with the group's software engineering and business divisions. The Data Steward has overall accountability for the group's/division's data and reporting posture by responsibly managing data assets, data lineage, and data access, and by supporting sound data analysis. This role focuses on data strategy, execution, and support for projects, programs, application enhancements, and production data fixes. The Data Steward makes well-thought-out decisions on complex or ambiguous data issues and establishes the data stewardship and information management strategy and direction for the group, communicating effectively with individuals at various levels of the technical and business communities. This individual will become part of the corporate Data Quality and Data Management/entity resolution team supporting various systems across the board.

 

Primary Responsibilities:

 

  • Responsible for data quality and data accuracy across all group/division delivery initiatives.
  • Responsible for data analysis, data profiling, data modeling, and data mapping capabilities.
  • Responsible for reviewing and governing data queries and DML.
  • Accountable for the assessment, delivery, quality, accuracy, and tracking of any production data fixes.
  • Accountable for the performance, quality, and alignment to requirements for all data query design and development.
  • Responsible for defining standards and best practices for data analysis, modeling, and queries.
  • Responsible for understanding end-to-end data flows and identifying data dependencies in support of delivery, release, and change management.
  • Responsible for the development and maintenance of an enterprise data dictionary that is aligned to data assets and the business glossary for the group.
  • Responsible for the definition and maintenance of the group's data landscape, including overlays with the technology landscape, end-to-end data flows/transformations, and data lineage.
  • Responsible for rationalizing the group's reporting posture through the definition and maintenance of a reporting strategy and roadmap.
  • Partners with the data governance team to ensure data solutions adhere to the organization’s data principles and guidelines.
  • Owns group's data assets including reports, data warehouse, etc.
  • Understand customer business use cases and translate them into technical specifications and a vision for how to implement a solution.
  • Accountable for defining the performance tuning needs for all group data assets and managing the implementation of those requirements within the context of group initiatives as well as steady-state production.
  • Partners with others in test data management and masking strategies and the creation of a reusable test data repository.
  • Responsible for solving data-related issues and communicating resolutions with other solution domains.
  • Actively and consistently support all efforts to simplify and enhance the Clinical Trial Prediction use cases.
  • Apply knowledge in analytic and statistical algorithms to help customers explore methods to improve their business.
  • Contribute toward analytical research projects through all stages including concept formulation, determination of appropriate statistical methodology, data manipulation, research evaluation, and final research report.
  • Visualize and report data findings creatively in a variety of visual formats that appropriately provide insight to the stakeholders.
  • Achieve defined project goals within customer deadlines; proactively communicate status and escalate issues as needed.

 

Additional Responsibilities:

 

  • Strong understanding of the Software Development Life Cycle (SDLC) with Agile Methodologies
  • Knowledge and understanding of industry-standard/best practices requirements gathering methodologies.
  • Knowledge and understanding of Information Technology systems and software development.
  • Experience with data modeling and test data management tools.
  • Experience in data integration projects.
  • Good problem-solving and decision-making skills.
  • Good communication skills within the team, across sites, and with the customer.

 

Knowledge, Skills and Abilities

 

  • Technical expertise in data architecture principles and design aspects of various DBMS and reporting concepts.
  • Solid understanding of key DBMS platforms like SQL Server, Azure SQL
  • Results-oriented, diligent, and works with a sense of urgency. Assertive, self-directed, and responsible for his/her own work, with a strong affinity for defining work in deliverables and a willingness to commit to deadlines.
  • Experience with MDM tools like MS DQ, SAS DM Studio, Tamr, Profisee, Reltio, etc.
  • Experience in report and dashboard development
  • Statistical and Machine Learning models
  • Python (scikit-learn, NumPy, pandas, gensim); a minimal sketch follows this list
  • Nice to Have:
  • 1 year of ETL experience
  • Natural Language Processing
  • Neural networks and Deep learning
  • Experience with the Keras, TensorFlow, spaCy, NLTK, and LightGBM Python libraries
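
For context on the Python stack listed above, here is a minimal sketch (not from the job post) of the kind of pandas/scikit-learn workflow implied. It uses synthetic data so it runs as-is; the column names are hypothetical placeholders.

```python
# Minimal pandas + scikit-learn sketch on synthetic data (illustrative only).
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real dataset; feature names are hypothetical.
X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
df = pd.DataFrame(X, columns=[f"feature_{i}" for i in range(10)])
df["target"] = y

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns=["target"]), df["target"], test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

print("Hold-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```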

 

Interaction: Frequently interacts with subordinate supervisors.

Education: Bachelor's degree, preferably in Computer Science, B.E., or another quantitative field related to the area of assignment. Professional certification related to the area of assignment may be required.

Experience: 7 years of pharmaceutical/biotech/life sciences experience; 5 years of clinical trials experience and knowledge; excellent documentation, communication, and presentation skills, including PowerPoint.

 

DeepIntent

2 candid answers
17 recruiters
Indrajeet Deshmukh
Posted by Indrajeet Deshmukh
Pune
2 - 5 yrs
Best in industry
Data Warehouse (DWH)
Informatica
ETL
SQL
Java
+1 more

Who You Are:


- In-depth and strong knowledge of SQL.

- Basic knowledge of Java.

- Basic scripting knowledge.

- Strong analytical skills.

- Excellent debugging and problem-solving skills.


What You’ll Do:


- Comfortable working across EST and IST time zones.

- Troubleshoot complex issues discovered in-house as well as in customer environments.

- Replicate customer environments/issues on the platform and data, and work to identify the root cause or provide an interim workaround as needed.

- Debug SQL queries associated with data pipelines.

- Monitor and debug ETL jobs on a daily basis (a minimal reconciliation-check sketch follows this list).

- Provide technical action plans to take a customer/product issue from start to resolution.

- Capture and document any data incidents identified on the platform and maintain the history of such issues along with their resolution.

- Identify product bugs and improvements based on customer environments and work to close them.

- Ensure implementation/continuous improvement of formal processes to support product development activities.

- Communicate effectively with internal and external stakeholders.
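
As referenced in the ETL monitoring item above, a common first debugging step is a source-to-target reconciliation check. The sketch below is illustrative only: it uses an in-memory SQLite database and hypothetical table names rather than the actual platform.

```python
# Minimal row-count reconciliation check of the kind used when debugging ETL jobs.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stand-in source and target tables (hypothetical names).
cur.execute("CREATE TABLE src_events (id INTEGER, payload TEXT)")
cur.execute("CREATE TABLE tgt_events (id INTEGER, payload TEXT)")
cur.executemany("INSERT INTO src_events VALUES (?, ?)", [(1, "a"), (2, "b"), (3, "c")])
cur.executemany("INSERT INTO tgt_events VALUES (?, ?)", [(1, "a"), (2, "b")])

src_count = cur.execute("SELECT COUNT(*) FROM src_events").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_events").fetchone()[0]

if src_count != tgt_count:
    # In practice this would be logged against the data incident record.
    print(f"Row count mismatch: source={src_count}, target={tgt_count}")
else:
    print("Counts reconcile")

conn.close()
```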

Concinnity Media Technologies

2 candid answers
Anirban Biswas
Posted by Anirban Biswas
Pune
6 - 10 yrs
₹18L - ₹27L / yr
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Computer Vision
recommendation algorithm
+9 more
  • Develop, train, and optimize machine learning models using Python, ML algorithms, deep learning frameworks (e.g., TensorFlow, PyTorch), and other relevant technologies.
  • Implement MLOps best practices, including model deployment, monitoring, and versioning.
  • Utilize Vertex AI, MLflow, Kubeflow, TFX, and other relevant MLOps tools and frameworks to streamline the machine learning lifecycle (a minimal MLflow tracking sketch follows this list).
  • Collaborate with cross-functional teams to design and implement CI/CD pipelines for continuous integration and deployment using tools such as GitHub Actions, TeamCity, and similar platforms.
  • Conduct research and stay up-to-date with the latest advancements in machine learning, deep learning, and MLOps technologies.
  • Provide guidance and support to data scientists and software engineers on best practices for machine learning development and deployment.
  • Assist in developing tooling strategies by evaluating various options, vendors, and product roadmaps to enhance the efficiency and effectiveness of our AI and data science initiatives.
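
As noted in the MLOps tooling item above, experiment tracking with MLflow might look like the following minimal sketch. It assumes the mlflow package is installed; the experiment, parameter, and metric names are illustrative, not part of the role description.

```python
# Minimal MLflow experiment-tracking sketch (illustrative names only).
import mlflow

mlflow.set_experiment("demo-experiment")

with mlflow.start_run():
    mlflow.log_param("learning_rate", 0.01)
    mlflow.log_param("epochs", 10)
    # In a real pipeline the metric would come from model evaluation.
    mlflow.log_metric("val_accuracy", 0.87)
```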


Virtusa

2 recruiters
Agency job
via Response Informatics by Anupama Lavanya Uppala
Chennai, Bengaluru (Bangalore), Mumbai, Hyderabad, Pune
3 - 10 yrs
₹10L - ₹25L / yr
PySpark
Python
  • Minimum 1 year of relevant experience in PySpark (mandatory)
  • Hands-on experience developing, testing, deploying, maintaining, and improving data integration pipelines in an AWS cloud environment is an added plus
  • Ability to play a lead role and independently manage a 3-5 member PySpark development team
  • EMR, Python, and PySpark are mandatory
  • Knowledge of and familiarity with AWS cloud technologies like Apache Spark, Glue, Kafka, Kinesis, and Lambda, alongside S3, Redshift, and RDS (a minimal PySpark sketch follows this list)
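
As referenced in the last item above, here is a minimal PySpark sketch of a simple aggregation step. It uses in-memory sample rows so it runs locally; a real pipeline on EMR would instead read from S3 (for example spark.read.json("s3://...")), and the event records shown are hypothetical.

```python
# Minimal PySpark aggregation sketch on in-memory sample data (illustrative only).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sample-aggregation").getOrCreate()

# Hypothetical event records standing in for data read from S3/Glue.
events = spark.createDataFrame(
    [("2024-01-01", "click"), ("2024-01-01", "view"), ("2024-01-02", "click")],
    ["event_date", "event_type"],
)

daily_counts = (
    events.groupBy("event_date", "event_type")
          .agg(F.count("*").alias("events"))
)
daily_counts.show()

spark.stop()
```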
Tredence
Bengaluru (Bangalore), Pune, Gurugram, Chennai
8 - 12 yrs
₹12L - ₹30L / yr
Snowflake schema
Snowflake
SQL
Data modeling
Data engineering
+1 more

JOB DESCRIPTION: THE IDEAL CANDIDATE WILL:

• Ensure new features and subject areas are modelled to integrate with existing structures and provide a consistent view. Develop and maintain documentation of the data architecture, data flow and data models of the data warehouse appropriate for various audiences. Provide direction on adoption of Cloud technologies (Snowflake) and industry best practices in the field of data warehouse architecture and modelling.

• Provide technical leadership to large enterprise-scale projects. You will also be responsible for preparing estimates and defining technical solutions for proposals (RFPs). This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project.

ELIGIBILITY CRITERIA: Desired Experience/Skills:
• Must have 5+ years total in IT, with 2+ years' experience working as a Snowflake Data Architect and 4+ years in data warehouse, ETL, and BI projects.
• Must have experience with at least two end-to-end implementations of the Snowflake cloud data warehouse and three end-to-end on-premise data warehouse implementations, preferably on Oracle.

• Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts
• Expertise in Snowflake advanced concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and time travel, and an understanding of how to use these features (a minimal sketch follows this list)
• Expertise in deploying Snowflake features such as data sharing, events and lake-house patterns
• Hands-on experience with Snowflake utilities, SnowSQL, SnowPipe, Big Data model techniques using Python
• Experience in Data Migration from RDBMS to Snowflake cloud data warehouse
• Deep understanding of relational as well as NoSQL data stores, methods and approaches (star and snowflake, dimensional modelling)
• Experience with data security and data access controls and design
• Experience with AWS or Azure data storage and management technologies such as S3 and ADLS
• Build processes supporting data transformation, data structures, metadata, dependency and workload management
• Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
• Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface
• Must have expertise in AWS or Azure Platform as a Service (PAAS)
• Certified Snowflake cloud data warehouse Architect (Desirable)
• Should be able to troubleshoot problems across infrastructure, platform and application domains.
• Must have experience with Agile development methodologies
• Strong written communication skills; effective and persuasive in both written and oral communication
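
As referenced in the Snowflake advanced-concepts item above, the sketch below illustrates zero-copy clone and time travel driven from Python via the snowflake-connector-python package. All connection parameters and object names are placeholders, and it assumes access to a Snowflake account.

```python
# Minimal sketch of Snowflake zero-copy clone and time travel (placeholder values).
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="my_password",    # placeholder
    warehouse="ANALYTICS_WH",  # placeholder
    database="SALES_DB",       # placeholder
    schema="PUBLIC",
)
cur = conn.cursor()

# Zero-copy clone: the new table shares the original's micro-partitions.
cur.execute("CREATE OR REPLACE TABLE orders_backup CLONE orders")

# Time travel: query the table as it looked one hour ago.
cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
print(cur.fetchone())

cur.close()
conn.close()
```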

Nice-to-have Skills/Qualifications:
• Bachelor's and/or master's degree in Computer Science or equivalent experience.
• Strong communication, analytical and problem-solving skills with a high attention to detail.

 

About you:
• You are self-motivated, collaborative, eager to learn, and hands on
• You love trying out new apps, and find yourself coming up with ideas to improve them
• You stay ahead with all the latest trends and technologies
• You are particular about following industry best practices and have high standards regarding quality

HCL Technologies

3 recruiters
Agency job
via Saiva System by Sunny Kumar
Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Bengaluru (Bangalore), Hyderabad, Chennai, Pune, Mumbai, Kolkata
5 - 10 yrs
₹5L - ₹20L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+2 more
Experience: 5+ years
Skills: Spark and Scala, along with Azure
Location: Pan India

Looking for someone with Big Data skills along with Azure.
Edubridge Learning

6 recruiters
Hemal Thakker
Posted by Hemal Thakker
Mumbai, Pune, Hyderabad, Gurugram
2 - 6 yrs
₹4L - ₹7L / yr
Data Analytics
Python
R Programming
SAS
Machine Learning (ML)
+1 more

JOB DESCRIPTION

  • 2 to 6 years of experience in imparting technical training/mentoring
  • Must have very strong concepts of Data Analytics
  • Must have hands-on and training experience with Python, Advanced Python, R programming, SAS, and machine learning
  • Must have good knowledge of SQL and Advanced SQL
  • Should have basic knowledge of Statistics
  • Should be good with operating systems (GNU/Linux) and network fundamentals
  • Must have knowledge of MS Office (Excel/Word/PowerPoint)
  • Self-motivated and passionate about technology
  • Excellent analytical and logical skills; a team player
  • Must have exceptional communication and presentation skills
  • Good aptitude skills are preferred

Responsibilities:                                                                                         

  • Ability to quickly learn any new technology and impart the same to other employees
  • Ability to resolve all technical queries of students
  • Conduct training sessions and drive placement-driven quality in the training
  • Must be able to work independently without the supervision of a senior person
  • Participate in reviews/ meetings                                                                                                          

Qualification:                                                                               

  • UG: Any Graduate in IT/Computer Science, B.Tech/B.E. – IT/ Computers
  • PG: MCA/MS/MSC – Computer Science
  • Any Graduate/ Post graduate, provided they are certified in similar courses

ABOUT EDUBRIDGE

EduBridge is an Equal Opportunity employer and we believe in building a meritorious culture where everyone is recognized for their skills and contribution.

Launched in 2009, EduBridge Learning is a workforce development and skilling organization with 50+ training academies in 18 states pan India. The organization has been providing skilled manpower to corporates for over 10 years and is a leader in its space. We have trained over a lakh semi-urban and economically underprivileged youth on relevant life skills and industry-specific skills, and provided placements in over 500 companies. Our latest product, E-ON, is committed to complementing our training delivery with an online training platform, enabling students to learn anywhere and anytime.

To know more about EduBridge, please visit: http://www.edubridgeindia.com/

You can also visit us on Facebook (https://www.facebook.com/Edubridgelearning/) and LinkedIn (https://www.linkedin.com/company/edubridgelearning/) for our latest initiatives and products.

first principle labs

1 recruiter
Ankit Goenka
Posted by Ankit Goenka
Pune
3 - 7 yrs
₹12L - ₹18L / yr
Data Science
Python
R Programming
Big Data
Hadoop
The selected candidate would be part of the in-house Data Labs team and would be responsible for creating an insights-driven decision structure.

This will include:

Scorecards
Strategies
MIS

The verticals included are:

Risk
Marketing
Product
Techknomatic Services Pvt. Ltd.
Techknomatic Services
Posted by Techknomatic Services
Pune, Mumbai
2 - 6 yrs
₹4L - ₹9L / yr
Tableau
SQL
Business Intelligence (BI)
Role Summary:
Lead and drive development in the BI domain using the Tableau ecosystem, with deep technical and BI ecosystem knowledge. The resource will be responsible for dashboard design, development, and delivery of BI services using the Tableau ecosystem.

Key functions & responsibilities:
• Communication and interaction with the Project Manager to understand the requirements
• Dashboard design, development, and deployment using the Tableau ecosystem
• Ensure delivery within a given time frame while maintaining quality
• Stay up to date with current tech and bring relevant ideas to the table
• Proactively work with the Management team to identify and resolve issues
• Perform other related duties as assigned or advised
• He/she should be a leader who sets the standard and expectations through example in his/her conduct, work ethic, integrity, and character
• Contribute to dashboard design, R&D, and project delivery using Tableau

Candidate’s Profile
Academics:
• Bachelor's degree, preferably in Computer Science.
• A Master's degree would be an added advantage.

Experience:
• Overall 2-5 years of experience in DWBI development projects, having worked on BI and visualization technologies (Tableau, QlikView) for at least 2 years.
• At least 2 years of experience covering the Tableau implementation lifecycle, including hands-on development/programming, managing security, data modelling, data blending, etc.

Technology & Skills:
• Hands-on expertise in Tableau administration and maintenance
• Strong working knowledge and development experience with Tableau Server and Desktop
• Strong knowledge of SQL, PL/SQL, and data modelling
• Knowledge of databases like Microsoft SQL Server, Oracle, etc.
• Exposure to alternate visualization technologies like QlikView, Spotfire, Pentaho, etc.
• Good communication and analytical skills with excellent creative and conceptual thinking abilities
• Superior organizational skills, attention to detail/level of quality, and strong communication skills, both verbal and written
Saama Technologies

6 recruiters
Sandeep Chaudhary
Posted by Sandeep Chaudhary
Pune
4 - 8 yrs
₹1L - ₹16L / yr
Data Science
Python
Machine Learning (ML)
Natural Language Processing (NLP)
Big Data
+2 more
Description:
• Must have direct hands-on experience (4 years) building complex Data Science solutions
• Must have fundamental knowledge of inferential statistics
• Should have worked on predictive modelling using Python/R
• Experience should include the following: file I/O, data harmonization, data exploration, machine learning techniques (supervised, unsupervised), multi-dimensional array processing, deep learning, NLP, and image processing
• Prior experience in the healthcare domain is a plus
• Experience using Big Data is a plus
• Should have excellent analytical and problem-solving ability
• Should be able to grasp new concepts quickly
• Should be well familiar with Agile project management methodology
• Should have excellent written and verbal communication skills
• Should be a team player with an open mind
Atyeti Inc

3 recruiters
Yash G
Posted by Yash G
Pune
5 - 8 yrs
₹8L - ₹16L / yr
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
Python
R Programming
+3 more
  • Exposure to Deep Learning, Neural Networks, or related fields and a strong interest and desire to pursue them.
  • Experience in Natural Language Processing, Computer Vision, Machine Learning, or Machine Intelligence (Artificial Intelligence).
  • Programming experience in Python.
  • Knowledge of machine learning frameworks like TensorFlow (a minimal Keras sketch follows this list).
  • Experience with software version control systems like GitHub.
  • Understands Big Data concepts and tools like Hadoop, MongoDB, and Apache Spark.
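
As referenced in the TensorFlow item above, a minimal Keras sketch might look like the following. The architecture and synthetic data are illustrative only and assume the tensorflow package is installed.

```python
# Minimal TensorFlow/Keras sketch on synthetic binary-classification data.
import numpy as np
import tensorflow as tf

# Synthetic data (illustrative only).
X = np.random.rand(500, 20).astype("float32")
y = (X.sum(axis=1) > 10).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(20,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

print("Training accuracy:", model.evaluate(X, y, verbose=0)[1])
```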