
11+ Change Management Jobs in Hyderabad | Change Management Job openings in Hyderabad

Apply to 11+ Change Management Jobs in Hyderabad on CutShort.io. Explore the latest Change Management Job opportunities across top companies like Google, Amazon & Adobe.

OSBIndia Private Limited
Bengaluru (Bangalore), Hyderabad
5 - 12 yrs
₹10L - ₹18L / yr
Data Warehouse (DWH)
Informatica
ETL
SQL
Stored Procedures

1.      Core Responsibilities

·        Leading solutions for data engineering

·        Maintain the integrity of both the design and the data that is held within the architecture

·        Champion and educate people in the development and use of data engineering best practices

·        Support the Head of Data Engineering and lead by example

·        Contribute to the development of database management services and associated processes relating to the delivery of data solutions

·        Provide requirements analysis, documentation, development, delivery and maintenance of data platforms.

·        Develop database requirements in a structured and logical manner, ensuring delivery is aligned with business prioritisation and best practice

·        Design and deliver performance enhancements, application migration processes and version upgrades across a pipeline of BI environments.

·        Provide support for the scoping and delivery of BI capability to internal users.

·        Identify risks and issues and escalate to Line / Project manager. 

·        Work with clients, existing asset owners & their service providers and non-BI development staff to clarify and deliver work-stream objectives within timescales that meet overall project expectations.

·        Develop and maintain documentation in support of all BI processes.

·        Proactively identify cost-justifiable improvements to data manipulation processes.

·        Research and promote relevant BI tools and processes that contribute to increased efficiency and capability in support of corporate objectives.

·        Promote a culture that embraces change, continuous improvement and a ‘can do’ attitude.

·        Demonstrate enthusiasm and self-motivation at all times.

·        Establish effective working relationships with other internal teams to drive improved efficiency and effective processes.

·        Be a champion for high-quality data and the use of strategic data repositories, the associated relational model, and the Data Warehouse for optimising the delivery of accurate, consistent and reliable business intelligence

·        Ensure that you fully understand and comply with the organisation’s Risk Management Policies as they relate to your area of responsibility and demonstrate in your day to day work that you put customers at the heart of everything you do.

·        Ensure that you fully understand and comply with the organisation’s Data Governance Policies as they relate to your area of responsibility and demonstrate in your day to day work that you treat data as an important corporate asset which must be protected and managed.

·        Maintain the company’s compliance standards and ensure timely completion of all mandatory on-line training modules and attestations.

 

2.      Experience Requirements

·        5 years' Data Engineering / ETL development experience is essential

·        5 years' data design experience in an MI / BI / Analytics environment (Kimball, lakehouse, data lake) is essential

·        5 years' experience of working in a structured Change Management project lifecycle is essential

·        Experience of working in a financial services environment is desirable

·        Experience of dealing with senior management within a large organisation is desirable

·        5 years' experience of developing within large, complex projects and programmes is desirable

·        Experience mentoring other members of the team on best practice and internal standards is essential

·        Experience with cloud data platforms (Microsoft Azure) is desirable

 

3.      Knowledge Requirements

·        A strong knowledge of business intelligence solutions and an ability to translate this into data solutions for the broader business is essential

·        Strong demonstrable knowledge of data warehouse methodologies

·        Robust understanding of high level business processes is essential

·        Understanding of data migration, including reconciliation, data cleanse and cutover is desirable

Technogen India PvtLtd

Posted by Mounika G
Hyderabad
11 - 16 yrs
₹24L - ₹27L / yr
Data Warehouse (DWH)
Informatica
ETL
Amazon Web Services (AWS)
SQL

Daily and monthly responsibilities

  • Review and coordinate with business application teams on data delivery requirements.
  • Develop estimation and proposed delivery schedules in coordination with development team.
  • Develop sourcing and data delivery designs.
  • Review data model, metadata and delivery criteria for solution.
  • Review and coordinate with team on test criteria and performance of testing.
  • Contribute to the design, development and completion of project deliverables.
  • Complete in-depth data analysis and contribute to strategic efforts.
  • Develop a complete understanding of how we manage data, with a focus on improving how data is sourced and managed across multiple business areas.

 

Basic Qualifications

  • Bachelor’s degree.
  • 5+ years of data analysis working with business data initiatives.
  • Knowledge of Structured Query Language (SQL) and use in data access and analysis.
  • Proficient in data management including data analytical capability.
  • Excellent verbal and written communication skills and high attention to detail.
  • Experience with Python.
  • Presentation skills in demonstrating system design and data analysis solutions.


Wallero technologies
Hyderabad
7 - 15 yrs
₹20L - ₹28L / yr
SQL
Data modeling
ADF
PowerBI
  1. Strong communication skills are essential, as the selected candidate will be responsible for leading a team of two in the future.
  2. Proficiency in SQL.
  3. Expertise in Data Modelling.
  4. Experience with Azure Data Factory (ADF).
  5. Competence in Power BI.
  6. SQL – Should be strong in data modeling, table design and SQL queries.
  7. ADF – Must have hands-on experience with ADF pipelines and their end-to-end set-up in Azure, including subscriptions, integration runtime (IR) and resource group creation.
  8. Power BI – Hands-on knowledge of Power BI reports, including documentation and following existing standards.
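
For context, the end-to-end ADF set-up mentioned in item 7 typically culminates in pipeline definitions like the following minimal sketch of a single copy activity; the pipeline, dataset and type names here are illustrative placeholders, not taken from this posting.

```json
{
  "name": "CopySalesPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyBlobToSql",
        "type": "Copy",
        "inputs": [ { "referenceName": "SourceBlobDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "SinkSqlDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}
```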


Blend360

Posted by VasimAkram Shaik
Hyderabad
5 - 13 yrs
Best in industry
Tableau
SQL
Business Intelligence (BI)
Spotfire
Qlikview

Key Responsibilities:


•Design, develop, support and maintain automated business intelligence products in Tableau.


•Rapidly design, develop and implement reporting applications that insert KPI metrics and actionable insights into the operational, tactical and strategic activities of key business functions.


•Demonstrate strong communication skills, with proven success communicating with users and other tech teams.


•Identify business requirements, design processes that leverage/adapt the business logic and regularly communicate with business stakeholders to ensure delivery meets business needs.


•Design, code and review business intelligence projects developed in Tableau & Power BI.


•Work as a member and lead teams to implement BI solutions for our customers.


•Develop dashboards and data sources that meet and exceed customer requirements.


•Partner with business information architects to understand the business use cases that support and fulfill business and data strategy.


•Partner with Product Owners and cross functional teams in a collaborative and agile environment


•Provide best practices for data visualization and Tableau implementations.


•Work along with solution architect in RFI / RFP response solution design, customer presentations, demonstrations, POCs etc. for growth.



Desired Candidate Profile:


•6-10 years of programming experience and demonstrated proficiency with Tableau; certifications in Tableau are highly preferred.


•Ability to architect and scope complex projects.


•Strong understanding of SQL and basic understanding of programming languages; experience with SAQL, SOQL, Python, or R a plus.


•Applied experience in Agile development processes (SCRUM)


•Ability to independently learn new technologies.


•Ability to show initiative and work independently with minimal direction.


•Presentation skills – demonstrated ability to simplify complex situations and ideas and distill them into compelling and effective written and oral presentations.


•Learn quickly – ability to understand and rapidly comprehend new areas, functional and technical, and apply detailed and critical thinking to customer solutions.



Education:


•Bachelor's or master's degree in Computer Science, Computer Engineering, or quantitative studies such as Statistics, Math, Operations Research, Economics or Advanced Analytics

A fast-growing Big Data company
Noida, Bengaluru (Bangalore), Chennai, Hyderabad
6 - 8 yrs
₹10L - ₹15L / yr
AWS Glue
SQL
Python
PySpark
Data engineering

AWS Glue Developer 

Work Experience: 6 to 8 Years

Work Location:  Noida, Bangalore, Chennai & Hyderabad

Must Have Skills: AWS Glue, DMS, SQL, Python, PySpark, Data Integration and DataOps

Job Reference ID:BT/F21/IND


Job Description:

Design, build and configure applications to meet business process and application requirements.


Responsibilities:

7 years of work experience with ETL, data modelling, and data architecture. Proficient in ETL optimization, designing, coding, and tuning big data processes using PySpark. Extensive experience building data platforms on AWS using core AWS services (Step Functions, EMR, Lambda, Glue, Athena, Redshift, Postgres, RDS, etc.) and designing/developing data engineering solutions. Orchestration using Airflow.


Technical Experience:

Hands-on experience developing a data platform and its components: data lake, cloud data warehouse, APIs, and batch and streaming data pipelines. Experience building data pipelines and applications to stream and process large datasets at low latency.


➢ Enhancements, new development, defect resolution and production support of Big data ETL development using AWS native services.

➢ Create data pipeline architecture by designing and implementing data ingestion solutions.

➢ Integrate data sets using AWS services such as Glue, Lambda functions/ Airflow.

➢ Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, RDS, S3, Athena.

➢ Author ETL processes using Python, PySpark.

➢ Build Redshift Spectrum direct transformations and data modelling using data in S3.

➢ ETL process monitoring using CloudWatch events.

➢ You will be working in collaboration with other teams. Good communication is a must.

➢ Must have experience using AWS service APIs, the AWS CLI and SDKs.
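
The pipeline duties listed above reduce to an extract–transform–load shape. As a library-free sketch (in the actual role this would be a Glue/PySpark job reading from S3; the sample CSV and column names are illustrative assumptions):

```python
import csv
import io

# Illustrative raw input; in practice this would come from S3 / DMS.
RAW = """order_id,region,amount
1,North,120.50
2,South,80.00
3,North,40.25
4,,15.00
"""

def run_etl(raw_csv: str) -> dict:
    """Extract rows, drop records with a missing region, aggregate per region."""
    totals: dict = {}
    for row in csv.DictReader(io.StringIO(raw_csv)):  # extract
        if not row["region"]:                         # transform: basic data-quality rule
            continue
        totals[row["region"]] = totals.get(row["region"], 0.0) + float(row["amount"])
    return totals                                     # load: stand-in for a Redshift sink

print(run_etl(RAW))  # {'North': 160.75, 'South': 80.0}
```

The same filter-then-aggregate step is what the Glue job would express with DataFrame operations.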


Professional Attributes:

➢ Experience operating very large data warehouses or data lakes. Expert-level skills in writing and optimizing SQL. Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technology.

➢ Must have 6+ years of big data ETL experience using Python, S3, Lambda, Dynamo DB, Athena, Glue in AWS environment.

➢ Expertise in S3, RDS, Redshift, Kinesis, and EC2 clusters is highly desired.


Qualification:

➢ Degree in Computer Science, Computer Engineering or equivalent.


Salary: Commensurate with experience and demonstrated competence

Accolite Digital
Posted by Nitesh Parab
Bengaluru (Bangalore), Hyderabad, Gurugram, Delhi, Noida, Ghaziabad, Faridabad
4 - 8 yrs
₹5L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
SSIS
SQL Server Integration Services (SSIS)

Job Title: Data Engineer

Job Summary: As a Data Engineer, you will be responsible for designing, building, and maintaining the infrastructure and tools necessary for data collection, storage, processing, and analysis. You will work closely with data scientists and analysts to ensure that data is available, accessible, and in a format that can be easily consumed for business insights.

Responsibilities:

  • Design, build, and maintain data pipelines to collect, store, and process data from various sources.
  • Create and manage data warehousing and data lake solutions.
  • Develop and maintain data processing and data integration tools.
  • Collaborate with data scientists and analysts to design and implement data models and algorithms for data analysis.
  • Optimize and scale existing data infrastructure to ensure it meets the needs of the business.
  • Ensure data quality and integrity across all data sources.
  • Develop and implement best practices for data governance, security, and privacy.
  • Monitor data pipeline performance and errors, and troubleshoot issues as needed.
  • Stay up-to-date with emerging data technologies and best practices.

Requirements:

Bachelor's degree in Computer Science, Information Systems, or a related field.

Experience with ETL tools like Matillion, SSIS, Informatica

Experience with SQL and relational databases such as SQL Server, MySQL, PostgreSQL, or Oracle.

Experience in writing complex SQL queries

Strong programming skills in languages such as Python, Java, or Scala.

Experience with data modeling, data warehousing, and data integration.

Strong problem-solving skills and ability to work independently.

Excellent communication and collaboration skills.

Familiarity with big data technologies such as Hadoop, Spark, or Kafka.

Familiarity with data warehouse/Data lake technologies like Snowflake or Databricks

Familiarity with cloud computing platforms such as AWS, Azure, or GCP.

Familiarity with Reporting tools

Teamwork/ growth contribution

  • Helping the team conduct interviews and identify the right candidates
  • Adhering to timelines
  • Timely status communication and upfront communication of any risks
  • Teach, train, and share knowledge with peers.
  • Good Communication skills
  • Proven abilities to take initiative and be innovative
  • Analytical mind with a problem-solving aptitude

Good to have :

Master's degree in Computer Science, Information Systems, or a related field.

Experience with NoSQL databases such as MongoDB or Cassandra.

Familiarity with data visualization and business intelligence tools such as Tableau or Power BI.

Knowledge of machine learning and statistical modeling techniques.

If you are passionate about data and want to work with a dynamic team of data scientists and analysts, we encourage you to apply for this position.

Softobiz Technologies Private limited

Posted by Swati Sharma
Hyderabad
5 - 13 yrs
₹10L - ₹25L / yr
azure data factory
SQL server
SSIS
SQL Server Integration Services (SSIS)
Data Warehouse (DWH)

Responsibilities


  • Design and implement Azure BI infrastructure, ensure overall quality of delivered solution 
  • Develop analytical & reporting tools, promote and drive adoption of developed BI solutions 
  • Actively participate in BI community 
  • Establish and enforce technical standards and documentation 
  • Participate in daily scrums  
  • Record progress daily in assigned DevOps items


Ideal Candidates should have


  • 5+ years of experience in a similar senior business intelligence development position
  • To be successful in the role you will require a high level of expertise across all facets of the Microsoft BI stack and prior experience in designing and developing well-performing data warehouse solutions 
  • Demonstrated experience using development tools such as Azure SQL database, Azure Data Factory, Azure Data Lake, Azure Synapse, and Azure DevOps. 
  • Experience with development methodologies including Agile, DevOps, and CICD patterns 
  • Strong oral and written communication skills in English 
  • Ability and willingness to learn quickly and continuously 
  • Bachelor's Degree in computer science 


Indium Software

Posted by Swaathipriya P
Bengaluru (Bangalore), Hyderabad
2 - 5 yrs
₹1L - ₹15L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
2+ years of Analytics with predominant experience in SQL, SAS, Statistics, R, Python, Visualization
Experienced in writing complex SQL select queries (window functions & CTEs) with advanced SQL experience
Should be an individual contributor for the initial few months; based on project movement, a team will be aligned
Strong in querying logic and data interpretation
Solid communication and articulation skills
Able to handle stakeholders independently with minimal intervention from the reporting manager
Develop strategies to solve problems in logical yet creative ways
Create custom reports and presentations accompanied by strong data visualization and storytelling
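The window-function-and-CTE requirement above can be sketched with Python's built-in sqlite3 module (table name, columns and data are illustrative; SQLite supports window functions from version 3.25):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('North', 100), ('North', 300), ('South', 200), ('South', 50);
""")

query = """
WITH region_totals AS (                          -- CTE: pre-aggregate per region
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
)
SELECT region, total,
       RANK() OVER (ORDER BY total DESC) AS rnk  -- window function over the CTE
FROM region_totals
ORDER BY rnk;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('North', 400, 1), ('South', 250, 2)]
```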
Accion Labs

Posted by Anjali Mohandas
Remote, Bengaluru (Bangalore), Pune, Hyderabad, Mumbai
4 - 8 yrs
₹15L - ₹28L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization

4-6 years of total experience in data warehousing and business intelligence

3+ years of solid Power BI experience (Power Query, M-Query, DAX, Aggregates)

2 years’ experience building Power BI using cloud data (Snowflake, Azure Synapse, SQL DB, data lake)

Strong experience building visually appealing UI/UX in Power BI

Understand how to design Power BI solutions for performance (composite models, incremental refresh, analysis services)

Experience building Power BI using large data in direct query mode

Expert SQL background (query building, stored procedure, optimizing performance)

Remote, Bengaluru (Bangalore), Hyderabad
0 - 1 yrs
₹2.5L - ₹4L / yr
SQL
Data engineering
Big Data
Python
● Hands-on work experience as a Python Developer
● Hands-on work experience in SQL/PLSQL
● Expertise in at least one popular Python framework (like Django, Flask or Pyramid)
● Knowledge of object-relational mapping (ORM)
● Familiarity with front-end technologies (like JavaScript and HTML5)
● Willingness to learn & upgrade to Big Data and cloud technologies like PySpark, Azure etc.
● Team spirit
● Good problem-solving skills
● Write effective, scalable code
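
Frameworks like Django and Flask ultimately sit on the WSGI contract; as a standard-library-only sketch of what a view boils down to (route and response body are made up for illustration):

```python
from wsgiref.util import setup_testing_defaults

def app(environ, start_response):
    """A minimal WSGI application: the callable every Python web framework wraps."""
    path = environ.get("PATH_INFO", "/")
    body = f"hello from {path}".encode()
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [body]

# Exercise the app without running a server, via a synthetic WSGI environ.
environ = {}
setup_testing_defaults(environ)
environ["PATH_INFO"] = "/ping"
captured = {}

def start_response(status, headers):
    captured["status"] = status

result = b"".join(app(environ, start_response))
print(captured["status"], result)  # 200 OK b'hello from /ping'
```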
Indium Software

Posted by Mohamed Aslam
Hyderabad
3 - 7 yrs
₹7L - ₹13L / yr
Python
Spark
SQL
PySpark
HiveQL

Indium Software is a niche technology solutions company with deep expertise in Digital, QA and Gaming. Indium helps customers in their Digital Transformation journey through a gamut of solutions that enhance business value.

With over 1,000 associates globally, Indium operates through offices in the US, UK and India.

Visit www.indiumsoftware.com to know more.

Job Title: Analytics Data Engineer

What will you do:
The Data Engineer must be an expert in SQL development, further providing support to the Data and Analytics team in database design, data flow and analysis activities. The position of the Data Engineer also plays a key role in the development and deployment of innovative big data platforms for advanced analytics and data processing. The Data Engineer defines and builds the data pipelines that will enable faster, better, data-informed decision-making within the business.

We ask:

Extensive experience with SQL and a strong ability to process and analyse complex data

The candidate should also have the ability to design, build, and maintain the business's ETL pipeline and data warehouse. The candidate will also demonstrate expertise in data modelling and query performance tuning on SQL Server.
Proficiency in analytics, especially funnel analysis, with experience on analytical tools like Mixpanel, Amplitude, Thoughtspot, Google Analytics, and similar tools.

Should work on tools and frameworks required for building efficient and scalable data pipelines
Excellent at communicating and articulating ideas, with an ability to influence others and continuously drive towards a better solution.
Experience working in Python, Hive queries, Spark, PySpark, Spark SQL and Presto

  • Relate Metrics to product
  • Programmatic Thinking
  • Edge cases
  • Good Communication
  • Product functionality understanding

Perks & Benefits:
A dynamic, creative & intelligent team that will make you love being at work.
An autonomous and hands-on role where you can make an impact; you will be joining at an exciting time of growth!

Flexible work hours and Attractive pay package and perks
An inclusive work environment that lets you work in the way that works best for you!
