Microsoft SSIS Jobs in Hyderabad

11+ Microsoft SSIS Jobs in Hyderabad | Microsoft SSIS Job openings in Hyderabad

Apply to 11+ Microsoft SSIS Jobs in Hyderabad on CutShort.io. Explore the latest Microsoft SSIS Job opportunities across top companies like Google, Amazon & Adobe.

Hyderabad
6 - 9 yrs
₹10L - ₹15L / yr
SQL
Databases
SQL Server Reporting Services (SSRS)
SQL Server Integration Services (SSIS)
SQL Server Analysis Services (SSAS)
+11 more

Designation: Senior - DBA

Experience: 6-9 years

CTC: INR 17-20 LPA

Night Allowance: INR 800/Night

Location: Hyderabad, Hybrid

Notice Period: NA

Shift Timing: 6:30 pm to 3:30 am

Openings: 3

Roles and Responsibilities:

As a Senior Database Administrator, the candidate is responsible for the physical design, development, administration, and optimization of properly engineered database systems to meet agreed business and technical requirements. The candidate will work as part of (but not limited to) the onsite/offsite DBA group.

  • Administration and management of databases in Dev, Stage, and Production environments.
  • Performance tuning of database schemas, stored procedures, etc.
  • Providing technical input on the setup and configuration of database servers and the SAN disk subsystem on all database servers.
  • Troubleshooting and handling all database-related issues and tracking them through to resolution.
  • Proactive monitoring of databases from both a performance and a capacity-management perspective.
  • Performing database maintenance activities such as backup/recovery and rebuilding and reorganizing indexes.
  • Ensuring that all database releases are properly assessed and measured from a functionality and performance perspective.
  • Ensuring that all databases are up to date with the latest service packs, patches, and security fixes.
  • Taking ownership and ensuring high-quality, timely delivery of projects on hand.
  • Collaborating with application/database developers, quality assurance, and operations/support staff.
  • Helping manage large, high-transaction-rate SQL Server production environments.
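The performance-tuning and index-maintenance duties above can be illustrated with a small, self-contained sketch. SQLite is used here purely for portability; on SQL Server the same investigation would use T-SQL and DMVs such as sys.dm_db_index_physical_stats. All table and column names are invented for the example.

```python
import sqlite3

# Create a throwaway database and a table large enough to show a plan change.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

# Before indexing: the planner has to scan the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE customer_id = 42"
).fetchall()

# Add a covering index on (customer_id, amount), then re-check the plan.
conn.execute("CREATE INDEX ix_orders_customer ON orders (customer_id, amount)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE customer_id = 42"
).fetchall()

print(plan_before[0][-1])  # a table scan
print(plan_after[0][-1])   # an index search
```

The same before/after plan comparison is the core loop of schema and stored-procedure tuning, whatever the engine.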

Eligibility:

  • Bachelor's/Master's degree (BE/BTech/MCA/MTech/MS).
  • 6-8 years of solid experience in SQL Server 2016/2019 database administration and maintenance on Azure and AWS cloud.
  • Experience handling and managing large SQL Server databases (200+ GB) in a real-time production environment.
  • Experience troubleshooting and resolving database integrity issues, performance issues, blocking/deadlocking issues, connectivity issues, data replication issues, etc.
  • Experience configuring and troubleshooting SQL Server HA.
  • Ability to detect and troubleshoot database-related CPU, memory, I/O, disk space, and other resource contention issues.
  • Experience with database maintenance activities such as backup/recovery, capacity monitoring/management, and Azure Backup Services.
  • Experience with HA/failover technologies such as clustering, SAN replication, log shipping, and mirroring.
  • Experience collaborating with development teams on physical database design activities and performance tuning.
  • Experience managing and making software deployments/changes in real-time production environments.
  • Ability to work on multiple projects at one time with minimal supervision while ensuring high-quality, timely delivery.
  • Knowledge of tools like SQL LiteSpeed, SQL Diagnostic Manager, and AppDynamics.
  • Strong understanding of data warehousing concepts and SQL Server architecture.
  • Certified DBA; proficient in T-SQL and in storage technologies such as ASM, SAN, NAS, RAID, and multipathing.
  • Strong analytical and problem-solving skills; proactive, independent, with a proven ability to work under tight targets and pressure.
  • Experience working in a highly regulated environment such as a financial services institution.
  • Expertise in SSIS and SSRS.

Skills:

SSIS

SSRS


Blend360

Posted by VasimAkram Shaik
Hyderabad
5 - 13 yrs
Best in industry
Tableau
SQL
Business Intelligence (BI)
Spotfire
Qlikview
+3 more

Key Responsibilities:


• Design, develop, support, and maintain automated business intelligence products in Tableau.

• Rapidly design, develop, and implement reporting applications that embed KPI metrics and actionable insights into the operational, tactical, and strategic activities of key business functions.

• Demonstrate strong communication skills, with proven success communicating with users and other tech teams.

• Identify business requirements, design processes that leverage/adapt the business logic, and regularly communicate with business stakeholders to ensure delivery meets business needs.

• Design, code, and review business intelligence projects developed in tools such as Tableau and Power BI.

• Work as a team member and lead teams to implement BI solutions for our customers.

• Develop dashboards and data sources that meet and exceed customer requirements.

• Partner with business information architects to understand the business use cases that support and fulfill the business and data strategy.

• Partner with product owners and cross-functional teams in a collaborative and agile environment.

• Provide best practices for data visualization and Tableau implementations.

• Work along with the solution architect on RFI/RFP response solution design, customer presentations, demonstrations, POCs, etc. for growth.



Desired Candidate Profile:


• 6-10 years of programming experience and demonstrated proficiency in Tableau; certifications in Tableau are highly preferred.

• Ability to architect and scope complex projects.

• Strong understanding of SQL and basic understanding of programming languages; experience with SAQL, SOQL, Python, or R is a plus.

• Applied experience with Agile development processes (Scrum).

• Ability to independently learn new technologies.

• Ability to show initiative and work independently with minimal direction.

• Presentation skills: demonstrated ability to simplify complex situations and ideas and distill them into compelling and effective written and oral presentations.

• Quick learner: ability to rapidly comprehend new functional and technical areas and apply detailed, critical thinking to customer solutions.



Education:


• Bachelor's or master's degree in Computer Science, Computer Engineering, or quantitative studies such as Statistics, Math, Operations Research, Economics, or Advanced Analytics.

MNC Company - Product Based
Bengaluru (Bangalore), Chennai, Hyderabad, Pune, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 9 yrs
₹10L - ₹15L / yr
Data Warehouse (DWH)
Informatica
ETL
Python
Google Cloud Platform (GCP)
+2 more

Job Responsibilities

  • Design, build & test ETL processes using Python & SQL for the corporate data warehouse
  • Inform, influence, support, and execute our product decisions
  • Maintain advertising data integrity by working closely with R&D to organize and store data in a format that provides accurate data and allows the business to quickly identify issues.
  • Evaluate and prototype new technologies in the area of data processing
  • Think quickly, communicate clearly and work collaboratively with product, data, engineering, QA and operations teams
  • High energy level, strong team player and good work ethic
  • Data analysis, understanding of business requirements and translation into logical pipelines & processes
  • Identification, analysis & resolution of production & development bugs
  • Support the release process including completing & reviewing documentation
  • Configure data mappings & transformations to orchestrate data integration & validation
  • Provide subject matter expertise
  • Document solutions, tools & processes
  • Create & support test plans with hands-on testing
  • Peer reviews of work developed by other data engineers within the team
  • Establish good working relationships & communication channels with relevant departments
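The first responsibility above, designing, building, and testing ETL processes in Python and SQL, can be sketched minimally as an extract-transform-load step with a validation check. SQLite stands in for the corporate data warehouse, and all table and column names are invented for the example.

```python
import sqlite3

def run_etl(conn):
    """One hypothetical ETL step: source_ads -> dw_ads with a derived column."""
    # Extract
    rows = conn.execute("SELECT ad_id, clicks, cost FROM source_ads").fetchall()
    # Transform: derive cost-per-click, guarding against divide-by-zero
    transformed = [
        (ad_id, clicks, cost, (cost / clicks) if clicks else None)
        for ad_id, clicks, cost in rows
    ]
    # Load
    conn.executemany(
        "INSERT INTO dw_ads (ad_id, clicks, cost, cpc) VALUES (?, ?, ?, ?)",
        transformed,
    )
    # Validate: loaded count must match the extracted count
    loaded = conn.execute("SELECT COUNT(*) FROM dw_ads").fetchone()[0]
    assert loaded == len(rows), "row count mismatch between source and warehouse"
    return loaded

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source_ads (ad_id INTEGER, clicks INTEGER, cost REAL)")
conn.execute("CREATE TABLE dw_ads (ad_id INTEGER, clicks INTEGER, cost REAL, cpc REAL)")
conn.executemany("INSERT INTO source_ads VALUES (?, ?, ?)",
                 [(1, 100, 25.0), (2, 0, 5.0), (3, 40, 10.0)])
loaded = run_etl(conn)
print(loaded)
```

The count check at the end is the kind of data-integrity guard the advertising-data responsibility above calls for; a production pipeline would add schema and range validations as well.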

 

Skills and Qualifications we look for

  • University degree 2.1 or higher (or equivalent) in a relevant subject; a master's degree in any data subject is a strong advantage.
  • 4-6 years of experience in data engineering.
  • Strong coding ability and software development experience in Python.
  • Strong hands-on experience with SQL and data processing.
  • Google Cloud Platform (Cloud Composer, Dataflow, Cloud Functions, BigQuery, Cloud Storage, Dataproc).
  • Good working experience with at least one ETL tool (Airflow preferred).
  • Strong analytical and problem-solving skills.
  • Good-to-have skills: Apache PySpark, CircleCI, Terraform.
  • Motivated, self-directed, able to work with ambiguity, and interested in emerging technologies and agile, collaborative processes.
  • Understanding of and experience with agile/Scrum delivery methodology.

 

Product and Service based company
Hyderabad, Ahmedabad
8 - 12 yrs
₹15L - ₹30L / yr
SQL server
Relational Database (RDBMS)
NOSQL Databases
Oracle
Database Design
+3 more

Job Description

Job Responsibilities

  • Design and implement robust database solutions including

    • Security, backup and recovery

    • Performance, scalability, monitoring and tuning,

    • Data management and capacity planning,

    • Planning, and implementing failover between database instances.

  • Create data architecture strategies for each subject area of the enterprise data model.

  • Communicate plans, status and issues to higher management levels.

  • Collaborate with the business, architects, and other IT organizations to plan a data strategy, sharing important information related to database concerns and constraints.

  • Produce all project data architecture deliverables.

  • Create and maintain a corporate repository of all data architecture artifacts.

 

Skills Required:

  • Understanding of data analysis, business principles, and operations

  • Software architecture and design; network design and implementation

  • Data visualization, data migration and data modelling

  • Relational database management systems

  • DBMS software, including SQL Server  

  • Database and cloud computing design, architectures and data lakes

  • Information management and data processing on multiple platforms 

  • Agile methodologies and enterprise resource planning implementation

  • Demonstrated database technical functionality, such as performance tuning, backup and recovery, and monitoring.

  • Excellent skills with advanced features such as database encryption, replication, partitioning, etc.

  • Strong problem-solving, organizational, and communication skills.

Hyderabad
5 - 10 yrs
₹19L - ₹25L / yr
ETL
Informatica
Data Warehouse (DWH)
Windows Azure
Microsoft Windows Azure
+4 more

A business transformation organization that partners with businesses to co-create customer-centric, hyper-personalized solutions to achieve exponential growth. Invente offers platforms and services that enable businesses to provide human-free customer experience and business process automation.


Location: Hyderabad (WFO)

Budget: Open

Position: Azure Data Engineer

Experience: 5+ years of commercial experience


Responsibilities

●     Design and implement Azure data solutions using ADLS Gen 2.0, Azure Data Factory, Synapse, Databricks, SQL, and Power BI

●     Build and maintain data pipelines and ETL processes to ensure efficient data ingestion and processing

●     Develop and manage data warehouses and data lakes

●     Ensure data quality, integrity, and security

●     Implement existing use cases required by the AI and analytics teams.

●     Collaborate with other teams to integrate data solutions with other systems and applications

●     Stay up-to-date with emerging data technologies and recommend new solutions to improve our data infrastructure


IT MNC
Agency job
via Apical Mind by Madhusudan Patade
Bengaluru (Bangalore), Hyderabad, Noida, Chennai, NCR (Delhi | Gurgaon | Noida)
3 - 12 yrs
₹15L - ₹40L / yr
Presto
Hadoop
SQL

Experience – 3 – 12 yrs

Budget - Open

Location - PAN India (Noida/Bengaluru/Hyderabad/Chennai)


Presto Developer (4)

 

  • Understanding of distributed SQL query engines running on Hadoop
  • Design and develop core components for Presto
  • Contribute to ongoing Presto development by implementing new features, bug fixes, and other improvements
  • Develop new and extend existing Presto connectors to various data sources
  • Lead complex and technically challenging projects from concept to completion
  • Write tests and contribute to ongoing automation infrastructure development
  • Run and analyze software performance metrics
  • Collaborate with teams globally across multiple time zones and operate in an Agile development environment
  • Hands-on experience with and interest in Hadoop

ZF India

Posted by Sagar Sthawarmath
Hyderabad, Chennai
4 - 9 yrs
₹3L - ₹15L / yr
SAS
Data Analytics
Data Visualization
Data integration
Data Warehouse (DWH)

In this role, you will: 

As part of a team focused on preserving the customer experience across the organization, the Analytic Consultant will be responsible for: 

    • Understand business objectives and provide credible challenge to analysis requirements. 
    • Verify sound analysis practices and data decisions were leveraged throughout planning and data sourcing phases. 
    • Conduct in-depth research within complex data environments to identify data integrity issues and propose solutions to improve analysis accuracy. 
    • Apply critical evaluation to challenge assumptions, formulate a defendable hypothesis, and ensure high-quality analysis results. 
    • Ensure adherence to data management/ data governance regulations and policies. 
    • Performing and testing highly complex data analytics for customer remediation. 
    • Design analysis project flows and documentation that are structured for consistency, easy to understand, and suitable for multiple levels of reviewers, partners, and regulatory agents, demonstrating the research and analysis completed. 
    • Investigate and ensure data integrity from multiple sources. 
    • Ensure data recommended and used is the best “source of truth”. 
    • Applies knowledge of business, customers, and products to synthesize data to 'form a story' and align information to contrast/compare to industry perspective. Data involved typically very large, structured or unstructured, and from multiple sources. 
    • Must have a strong attention to detail and be able to meet high quality standards consistently. 
    • Other duties as assigned by manager. 
    • Willing to assist on high priority work outside of regular business hours or weekend as needed. 


Essential Qualifications: 
 

    • Around 5+ years in similar analytics roles 
    • Bachelor's, M.A./M.Sc., or higher degree in applied mathematics, statistics, engineering, physics, accounting, finance, economics, econometrics, computer science, or business/social and behavioral sciences with a quantitative emphasis. 
    • Preferred programming knowledge SQL/SAS  
    • Knowledge of PVSI, Non-Lending, Student Loans, Small Business and Personal Lines and Loans is a plus. 
    • Strong experience with data integration, database structures and data warehouses. 
    • Persuasive written and verbal communication skills. 


Desired Qualifications: 

    • Certifications in Data Science, or BI Reporting tools. 
    • Ability to prioritize work, meet deadlines, achieve goals and work under pressure in a dynamic and complex environment – Soft Skills. 
    • Detail oriented, results driven, and has the ability to navigate in a quickly changing and high demand environment while balancing multiple priorities. 
    • Ability to research and report on a variety of issues using problem solving skills. 
    • Ability to act with integrity and a high level of professionalism with all levels of team members and management. 
    • Ability to make timely and independent judgment decisions while working in a fast-paced and results-driven environment. 
    • Ability to learn the business aspects quickly, multitask and prioritize between projects. 
    • Exhibits appropriate sense of urgency in managing responsibilities. 
    • Ability to accurately process high volumes of work within established deadlines. 
    • Available to flex schedule periodically based on business need. 
    • Demonstrate strong negotiation, communication & presentation skills. 
    • Demonstrates a high degree of reliability, integrity and trustworthiness. 
    • Takes ownership of assignments and helps drive assignments of the team. 
    • Dedicated, enthusiastic, driven and performance-oriented; possesses a strong work ethic and good team player. 
    • Be proactive and get engaged in organizational initiatives.
DataMetica

Posted by Sayali Kachi
Pune, Hyderabad
4 - 10 yrs
₹5L - ₹20L / yr
ETL
SQL
Data engineering
Analytics
PL/SQL
+3 more

We at Datametica Solutions Private Limited are looking for SQL Engineers who have a passion for cloud, with knowledge of different on-premise and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.

Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.

Job Description

Experience : 4-10 years

Location : Pune

 


Mandatory Skills - 

  • Strong in ETL/SQL development
  • Strong Data Warehousing skills
  • Hands-on experience working with Unix/Linux
  • Development experience in Enterprise Data warehouse projects
  • Good to have experience working with Python, shell scripting

Opportunities -

  • Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps Tools, Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
  • Would get chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
  • Will play an active role in setting up the Modern data platform based on Cloud and Big Data
  • Would be part of teams with rich experience in various aspects of distributed systems and computing


 

About Us!

A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data/workload/ETL/analytics to the cloud, leveraging automation.

 

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

 

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

 

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

 

Why join us!

Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in achieving our success.

 

 

Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy

 

Check out more about us on our website below!

www.datametica.com

Fragma Data Systems

Posted by Evelyn Charles
Remote, Bengaluru (Bangalore), Hyderabad
0 - 1 yrs
₹3L - ₹3.5L / yr
SQL
Data engineering
Data Engineer
Python
Big Data
+1 more
Strong Programmer with expertise in Python and SQL
 
● Hands-on Work experience in SQL/PLSQL
● Expertise in at least one popular Python framework (like Django,
Flask or Pyramid)
● Knowledge of object-relational mapping (ORM)
● Familiarity with front-end technologies (like JavaScript and HTML5)
● Willingness to learn and upgrade to big data and cloud technologies
like PySpark, Azure, etc.
● Team spirit
● Good problem-solving skills
● Write effective, scalable code
Fragma Data Systems

Posted by Evelyn Charles
Remote, Bengaluru (Bangalore), Hyderabad
3 - 9 yrs
₹8L - ₹20L / yr
PySpark
Data engineering
Data Engineer
Windows Azure
ADF
+2 more
Must-Have Skills:
• Good experience in PySpark, including DataFrame core functions and Spark SQL
• Good experience in SQL databases; able to write queries of fair complexity
• Excellent experience in Big Data programming for data transformations and aggregations
• Good understanding of ELT architecture: business-rule processing and data extraction from the data lake into data streams for business consumption
• Good customer communication
• Good analytical skills
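The DataFrame transformations and aggregations called out above reduce to group-by logic. As a plain-Python illustration of that logic (in PySpark this would be roughly `df.groupBy("region").agg(F.sum("sales"), F.count("*"))`; the column names here are hypothetical):

```python
from collections import defaultdict

def aggregate_sales(rows):
    """Group rows by region and compute total sales and row count per group."""
    totals = defaultdict(lambda: {"sales": 0.0, "count": 0})
    for row in rows:
        bucket = totals[row["region"]]
        bucket["sales"] += row["sales"]
        bucket["count"] += 1
    return dict(totals)

rows = [
    {"region": "south", "sales": 120.0},
    {"region": "north", "sales": 75.5},
    {"region": "south", "sales": 30.0},
]
result = aggregate_sales(rows)
print(result["south"])  # {'sales': 150.0, 'count': 2}
```

Spark distributes exactly this shuffle-and-reduce across executors; the business-rule processing mentioned above typically happens as a filter or derived-column step before the aggregation.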
 
 
Technology Skills (Good to Have):
  • Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub.
  • Experience in migrating on-premise data warehouses to data platforms on AZURE cloud. 
  • Designing and implementing data engineering, ingestion, and transformation functions
  • Azure Synapse or Azure SQL data warehouse
  • Spark on Azure, available in HDInsight and Databricks
 
Good to Have: 
  • Experience with Azure Analysis Services
  • Experience in Power BI
  • Experience with third-party solutions like Attunity/StreamSets, Informatica
  • Experience with PreSales activities (Responding to RFPs, Executing Quick POCs)
  • Capacity Planning and Performance Tuning on Azure Stack and Spark.
MNC

Agency job
via Fragma Data Systems by Geeti Gaurav Mohanty
Bengaluru (Bangalore), Hyderabad, Pune
3 - 8 yrs
₹8L - ₹16L / yr
ADF
SSIS
Job Responsibilities/KRAs:

Responsibilities
  • Understand business requirements and actively provide inputs from a data perspective.
  • Experience in SSIS development.
  • Experience migrating SSIS packages to the Azure-SSIS Integration Runtime.
  • Experience in data warehouse / data mart development and migration.
  • Good knowledge of and experience with Azure Data Factory.
  • Expert-level knowledge of SQL DB and data warehousing.
  • Should know at least one programming language (Python or PowerShell).
  • Should be able to analyse and understand complex data flows in SSIS.
  • Knowledge of Control-M.
  • Knowledge of Azure Data Lake is required.
  • Excellent interpersonal/communication skills (both oral and written), with the ability to communicate at various levels with clarity and precision.
  • Build simple to complex pipelines and dataflows.
  • Work with other Azure stack modules like Azure Data Lake, SQL DW, etc.

Requirements
  • Bachelor's degree in Computer Science, Computer Engineering, or a relevant field.
  • A minimum of 5 years' experience in a similar role.
  • Strong knowledge of database structure systems and data mining.
  • Excellent organizational and analytical abilities.
  • Outstanding problem solver.
  • Good written and verbal communication skills.
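The "simple to complex pipelines & dataflows" responsibility above follows a staging pattern: extract, clean, load, with bad rows filtered out along the way. Real SSIS/ADF pipelines are defined as packages or JSON activities; the toy Python chain below (all names invented) only illustrates that pattern.

```python
def extract():
    # Hypothetical source rows; "value" arrives as untrimmed text.
    return [{"id": 1, "value": " 10 "}, {"id": 2, "value": "x"}, {"id": 3, "value": "7"}]

def clean(rows):
    # Drop rows whose value is not a valid integer after trimming,
    # the equivalent of a conditional-split/derived-column step.
    out = []
    for r in rows:
        v = r["value"].strip()
        if v.isdigit():
            out.append({"id": r["id"], "value": int(v)})
    return out

def load(rows, sink):
    # Stand-in for the destination component; returns rows written.
    sink.extend(rows)
    return len(rows)

sink = []
loaded = load(clean(extract()), sink)
print(loaded, sink)
```

In SSIS the same three stages map to a source component, a transformation, and a destination; in ADF, to a copy activity wrapped around a data flow.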