39+ SSIS Jobs in India
at Delivery Solutions
About UPS:
Moving our world forward by delivering what matters! UPS is a company with a proud past and an even brighter future. Our values define us. Our culture differentiates us. Our strategy drives us. At UPS we are customer first, people led and innovation driven. UPS’s India based Technology Development Centers will bring UPS one step closer to creating a global technology workforce that will help accelerate our digital journey and help us engineer technology solutions that drastically improve our competitive advantage in the field of Logistics.
Job Summary:
- Applies the principles of software engineering to design, develop, maintain, test, and evaluate computer software that provides business capabilities, solutions, and/or product suites. Provides systems life cycle management (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.) to ensure delivery of technical solutions is on time and within budget.
- Researches and supports the integration of emerging technologies.
- Provides knowledge and support for applications’ development, integration, and maintenance.
- Develops program logic for new applications or analyzes and modifies logic in existing applications.
- Analyzes requirements, tests, and integrates application components.
- Ensures that system improvements are successfully implemented. May focus on web/internet applications specifically, using a variety of languages and platforms.
REQUIREMENTS
- Experience with Azure Databricks, SQL, ETL – SSIS packages – very critical.
- Azure Data Factory, Function Apps, DevOps – A must
- Experience with Azure and other cloud technologies
- Database – Oracle, SQL Server and Cosmos DB experience needed.
- Azure Services (Key Vault, App Configuration, Blob Storage, Redis Cache, Service Bus, Event Grid, ADLS, Application Insights, etc.)
- Knowledge of Striim
Preferred skills: Microservices experience preferred. Experience with Angular and .NET Core is not critical.
Additional Information: This role will be in-office 3 days a week in Chennai, India.
Job Summary:
We are looking for an experienced ETL Tester with 5 to 7 years of experience and expertise in the banking domain. The candidate will be responsible for testing ETL processes, ensuring data quality, and validating data flows in large-scale projects.
Key Responsibilities:
Design and execute ETL test cases, ensuring data integrity and accuracy.
Perform data validation using complex SQL queries (a sample reconciliation query is sketched after this list).
Collaborate with business analysts to define testing requirements.
Track defects and work with developers to resolve issues.
Conduct performance testing for ETL processes.
Banking Domain Knowledge: Strong understanding of banking processes such as payments, loans, credit, accounts, and regulatory reporting.
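For illustration only, a minimal T-SQL sketch of the kind of data validation referenced above, assuming hypothetical staging and warehouse tables (stg.Payments and dw.FactPayments); the table and column names are placeholders, not part of this posting:

    -- Reconcile row counts and amount totals between the staging source and the target fact table
    SELECT
        src.row_count    AS source_rows,
        tgt.row_count    AS target_rows,
        src.total_amount AS source_amount,
        tgt.total_amount AS target_amount
    FROM (SELECT COUNT(*) AS row_count, SUM(amount) AS total_amount FROM stg.Payments) AS src
    CROSS JOIN (SELECT COUNT(*) AS row_count, SUM(amount) AS total_amount FROM dw.FactPayments) AS tgt;

    -- Flag records present in the source but missing from the target
    SELECT s.payment_id
    FROM stg.Payments AS s
    LEFT JOIN dw.FactPayments AS f ON f.payment_id = s.payment_id
    WHERE f.payment_id IS NULL;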
Required Skills:
5-7 years of ETL testing experience.
Strong SQL skills and experience with ETL tools (Informatica, SSIS, etc.).
Knowledge of banking domain processes.
Experience with test management tools (JIRA, HP ALM).
Familiarity with Agile methodologies.
Location – Hyderabad
is a software product company that provides
- 5+ years of experience designing, developing, validating, and automating ETL processes
- 3+ years of experience with traditional ETL tools such as Visual Studio, SQL Server Management Studio, SSIS, SSAS and SSRS
- 2+ years of experience with cloud technologies and platforms, such as: Kubernetes, Spark, Kafka, Azure Data Factory, Snowflake, MLflow, Databricks, Airflow or similar
- Must have experience with designing and implementing data access layers
- Must be an expert with SQL/T-SQL and Python
- Must have experience in Kafka
- Define and implement data models with various database technologies like MongoDB, CosmosDB, Neo4j, MariaDB and SQL Server
- Ingest and publish data from sources and to destinations via an API
- Exposure to ETL/ELT using Kafka or Azure Event Hubs with Spark or Databricks is a plus
- Exposure to healthcare technologies and integrations for FHIR API, HL7 or other HIE protocols is a plus
Skills Required :
Designing, Developing, ETL, Visual Studio, Python, Spark, Kubernetes, Kafka, Azure Data Factory, SQL Server, Airflow, Databricks, T-SQL, MongoDB, CosmosDB, Snowflake, SSIS, SSAS, SSRS, FHIR API, HL7, HIE Protocols
at Gipfel & Schnell Consultings Pvt Ltd
Must have Expert level (5-7 years of development and maintenance experience) in MS SQL Server
· Excellent skills in writing Stored Procedures, functions, triggers, queries, scripts, etc.
· Excellent in troubleshooting, performance tuning, debugging, and query optimization
· Excellent in indexing, and in finding and resolving potential database deadlocks (a diagnostic query is sketched after this list)
· Strong in database migrations and upgrades.
· Has experience with analytics and reporting.
· 3+ years of experience with SSIS, SSRS.
· Has experience with Access DB.
· Have an MS SQL Server certification OR be willing to obtain one in a short time frame.
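As a small illustration of the troubleshooting skills listed above (not a prescribed procedure for this role), a query against SQL Server's dynamic management views can surface currently blocked sessions, which is often the first step before digging into deadlocks:

    -- List blocked sessions, the session blocking them, and the statement being run
    SELECT
        r.session_id,
        r.blocking_session_id,
        r.wait_type,
        r.wait_time,
        t.text AS running_sql
    FROM sys.dm_exec_requests AS r
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
    WHERE r.blocking_session_id <> 0;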
1. Core Responsibilities
· Build, maintain and manage a team capable of delivering the data operations needs of the bank's data teams and other stakeholders, ensuring the team is right-sized, motivated and focused on key goals and SLAs.
· Maintain, develop and enhance the tools and environments used by the data teams to ensure availability of Development, Test and production environments
· Manage all data operations database changes observing industry standard software development life cycle approaches with development, testing and deployment supported by comprehensive documentation
· Manage releases of code from the engineering team to ensure separation of duty
· As subject matter expert take a key contributing role in data initiatives such as infrastructure development, software tool evaluation, intra company data integration
· Assess system performance and process efficiency making recommendations for change
· Identify gaps and technical issues affecting data flows and lead the development of solutions to these
· Work with internal and external stakeholders to ensure that data solutions are reliably populated on time
· Ensure that you fully understand and comply with the organisation’s Risk Management Policies as they relate to your area of responsibility and demonstrate in your day to day work that you put customers at the heart of everything you do.
· Ensure that you fully understand and comply with the organisation’s Data Governance Policies as they relate to your area of responsibility and demonstrate in your day to day work that you treat data as an important corporate asset which must be protected and managed.
· Maintain the company’s compliance standards and ensure timely completion of all mandatory on-line training modules and attestations.
2. Experience Requirements
· 5 years' previous experience supporting data warehousing and data lake solutions is essential
· 5 years' previous experience with Microsoft SQL Server SSIS is desirable
· Experience with monitoring and incident management is essential
· Experience of managing a technical team is desirable
· Experience working in an IT environment
· Experience working within banking or other financial institutions is desirable
3. Knowledge Requirements
· Strong background of working in data teams
· Robust knowledge of RDBMS principles and how best to manage environments
· Strong knowledge of a standardised SDLC is desirable
Company - Tekclan Software Solutions
Position – SQL Developer
Experience – Minimum 4+ years of experience in MS SQL server, SQL Programming, ETL development.
Location - Chennai
We are seeking a highly skilled SQL Developer with expertise in MS SQL Server, SSRS, SQL programming, writing stored procedures, and proficiency in ETL using SSIS. The ideal candidate will have a strong understanding of database concepts, query optimization, and data modeling.
Responsibilities:
1. Develop, optimize, and maintain SQL queries, stored procedures, and functions for efficient data retrieval and manipulation (a stored-procedure sketch follows this list).
2. Design and implement ETL processes using SSIS for data extraction, transformation, and loading from various sources.
3. Collaborate with cross-functional teams to gather business requirements and translate them into technical specifications.
4. Create and maintain data models, ensuring data integrity, normalization, and performance.
5. Generate insightful reports and dashboards using SSRS to facilitate data-driven decision making.
6. Troubleshoot and resolve database performance issues, bottlenecks, and data inconsistencies.
7. Conduct thorough testing and debugging of SQL code to ensure accuracy and reliability.
8. Stay up-to-date with emerging trends and advancements in SQL technologies and provide recommendations for improvement.
9. Should be an independent and individual contributor.
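As a hedged sketch of the stored-procedure work described in responsibility 1 above (the table, procedure and parameter names are hypothetical, not taken from this posting):

    -- Hypothetical example: return orders for a customer within a date range, with basic error handling
    CREATE PROCEDURE dbo.usp_GetCustomerOrders
        @CustomerId INT,
        @FromDate   DATE,
        @ToDate     DATE
    AS
    BEGIN
        SET NOCOUNT ON;
        BEGIN TRY
            SELECT o.OrderId, o.OrderDate, o.TotalAmount
            FROM dbo.Orders AS o
            WHERE o.CustomerId = @CustomerId
              AND o.OrderDate BETWEEN @FromDate AND @ToDate
            ORDER BY o.OrderDate;
        END TRY
        BEGIN CATCH
            -- Re-raise the error so the calling application or SSIS package can handle it
            THROW;
        END CATCH
    END;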
Requirements:
1. Minimum of 4+ years of experience in MS SQL server, SQL Programming, ETL development.
2. Proven experience as a SQL Developer with a strong focus on MS SQL Server.
3. Proficiency in SQL programming, including writing complex queries, stored procedures, and functions.
4. In-depth knowledge of ETL processes and hands-on experience with SSIS.
5. Strong expertise in creating reports and dashboards using SSRS.
6. Familiarity with database design principles, query optimization, and data modeling.
7. Experience with performance tuning and troubleshooting SQL-related issues.
8. Excellent problem-solving skills and attention to detail.
9. Strong communication and collaboration abilities.
10. Ability to work independently and handle multiple tasks simultaneously.
Preferred Skills:
1. Certification in MS SQL Server or related technologies.
2. Knowledge of other database systems such as Oracle or MySQL.
3. Familiarity with data warehousing concepts and tools.
4. Experience with version control systems.
Job Title: Data Engineer
Job Summary: As a Data Engineer, you will be responsible for designing, building, and maintaining the infrastructure and tools necessary for data collection, storage, processing, and analysis. You will work closely with data scientists and analysts to ensure that data is available, accessible, and in a format that can be easily consumed for business insights.
Responsibilities:
- Design, build, and maintain data pipelines to collect, store, and process data from various sources.
- Create and manage data warehousing and data lake solutions.
- Develop and maintain data processing and data integration tools.
- Collaborate with data scientists and analysts to design and implement data models and algorithms for data analysis.
- Optimize and scale existing data infrastructure to ensure it meets the needs of the business.
- Ensure data quality and integrity across all data sources.
- Develop and implement best practices for data governance, security, and privacy.
- Monitor data pipeline performance and errors, and troubleshoot issues as needed.
- Stay up-to-date with emerging data technologies and best practices.
Requirements:
Bachelor's degree in Computer Science, Information Systems, or a related field.
Experience with ETL tools like Matillion, SSIS, Informatica
Experience with SQL and relational databases such as SQL server, MySQL, PostgreSQL, or Oracle.
Experience in writing complex SQL queries
Strong programming skills in languages such as Python, Java, or Scala.
Experience with data modeling, data warehousing, and data integration.
Strong problem-solving skills and ability to work independently.
Excellent communication and collaboration skills.
Familiarity with big data technologies such as Hadoop, Spark, or Kafka.
Familiarity with data warehouse/Data lake technologies like Snowflake or Databricks
Familiarity with cloud computing platforms such as AWS, Azure, or GCP.
Familiarity with Reporting tools
Teamwork/ growth contribution
- Helping the team conduct interviews and identify the right candidates
- Adhering to timelines
- Timely status communication and upfront communication of any risks
- Teach, train and share knowledge with peers.
- Good Communication skills
- Proven abilities to take initiative and be innovative
- Analytical mind with a problem-solving aptitude
Good to have :
Master's degree in Computer Science, Information Systems, or a related field.
Experience with NoSQL databases such as MongoDB or Cassandra.
Familiarity with data visualization and business intelligence tools such as Tableau or Power BI.
Knowledge of machine learning and statistical modeling techniques.
If you are passionate about data and want to work with a dynamic team of data scientists and analysts, we encourage you to apply for this position.
Designation: Senior - DBA
Experience: 6-9 years
CTC: INR 17-20 LPA
Night Allowance: INR 800/Night
Location: Hyderabad,Hybrid
Notice Period: NA
Shift Timing : 6:30 pm to 3:30 am
Openings: 3
Roles and Responsibilities:
The Senior Database Administrator is responsible for the physical design, development, administration and optimization of properly engineered database systems to meet agreed business and technical requirements.
The candidate will work as part of (but not limited to) the onsite/offsite DBA group, covering administration and management of databases in Dev, Stage and Production environments.
Performance tuning of database schemas, stored procedures, etc.
Providing technical input on the setup and configuration of database servers and the SAN disk subsystem on all database servers.
Troubleshooting and handling all database-related issues and tracking them through to resolution.
Proactive monitoring of databases from both a performance and capacity-management perspective.
Performing database maintenance activities such as backup/recovery and rebuilding and reorganizing indexes (a T-SQL sketch follows this list).
Ensuring that all database releases are properly assessed and measured from a functionality and performance perspective.
Ensuring that all databases are up to date with the latest service packs, patches & security fixes.
Take ownership and ensure high-quality, timely delivery of projects on hand.
Collaborate with application/database developers, quality assurance and operations/support staff.
Will help manage large, high-transaction-rate SQL Server production environments.
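By way of illustration only (the database and table names are placeholders, not this employer's systems), routine maintenance of the kind listed above is typically scripted in T-SQL along these lines:

    -- Rebuild all indexes on a heavily updated table to remove fragmentation
    ALTER INDEX ALL ON dbo.Transactions REBUILD;

    -- Take a compressed, verified full backup as part of the backup/recovery routine
    BACKUP DATABASE SalesDB
    TO DISK = N'D:\Backups\SalesDB_full.bak'
    WITH COMPRESSION, CHECKSUM, INIT;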
Eligibility:
Bachelor's/Master's degree (BE/BTech/MCA/MTech/MS)
6 - 8 years of solid experience in SQL Server 2016/2019 database administration and maintenance on Azure and AWS cloud.
Experience handling and managing large SQL Server databases in a real-time production environment with sizes greater than 200+ GB.
Experience in troubleshooting and resolving database integrity issues, performance issues, blocking/deadlocking issues, connectivity issues, data replication issues, etc.
Experience in configuration and troubleshooting of SQL Server HA.
Ability to detect and troubleshoot database-related CPU, memory, I/O, disk space and other resource contention issues.
Experience with database maintenance activities such as backup/recovery & capacity monitoring/management and Azure Backup Services.
Experience with HA/failover technologies such as clustering, SAN replication, log shipping & mirroring.
Experience collaborating with development teams on physical database design activities and performance tuning.
Experience in managing and making software deployments/changes in real-time production environments.
Ability to work on multiple projects at one time with minimal supervision and ensure high-quality, timely delivery.
Knowledge of tools like SQL LiteSpeed, SQL Diagnostic Manager and AppDynamics.
Strong understanding of data warehousing concepts and SQL Server architecture.
Certified DBA, proficient in T-SQL, and proficient in various storage technologies such as ASM, SAN, NAS, RAID and multipathing.
Strong analytical and problem-solving skills; proactive, independent and proven ability to work under tight targets and pressure.
Experience working in a highly regulated environment such as financial services institutions.
Expertise in SSIS and SSRS.
Skills:
SSIS
SSRS
- Manage different versions of complicated code and distribute them to different teams in the organization utilizing TFS.
- Develop an ASP.Net application to input and manage a production schedule, production statistical analysis and trend reporting.
- Create routines for importing data utilizing XML, CSV and comma-delimited files (a BULK INSERT sketch follows this list).
- Filter and cleanse OLTP data with complex stored procedures and SSIS packages in the staging area.
- Develop a dimensional database and OLAP cube using SSAS for analysis, maintenance and good customer service.
- Involve in development and implementation of SSIS, SSRS and SSAS application solutions for various business units across the organization.
- Extract the data from XML and load it to dimensional model.
- Maintain SQL scripts, indexes, complex queries for data analysis and extraction.
- Create advanced reports like dashboard and scoreboard using SharePoint and power pivot for better presentation of data.
- Create cubes in SSAS reports which require complex calculations such as calculation of the premium for a particular policy.
- Create SharePoint sub sites, lists, libraries, folders and apply site permissions according to the given requirement.
- Work with business analysts, subject matter experts, and other team members to determine data extraction and transformation requirements.
- Develop T-SQL functions and stored procedures to support complex business requirements.
- Create and configure an OLTP replication database for disaster recovery purpose.
- Design, implement and maintain OLAP servers and processes to replicate production data to the server.
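A minimal sketch of the file-import routines mentioned above; the file path, schema and table are hypothetical and only illustrate the general approach on SQL Server:

    -- Load a comma-delimited file into a staging table before SSIS/stored-procedure cleansing
    BULK INSERT staging.ProductionSchedule
    FROM 'C:\imports\production_schedule.csv'
    WITH (
        FIRSTROW = 2,            -- skip the header row
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n',
        TABLOCK
    );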
Job Description:
Candidate should ideally be a B.E/B.Tech graduate with around 3 to 5 years of experience, with core T-SQL scripting, performance tuning, SSIS and NoSQL skills. The current role requires a core T-SQL, NoSQL and SSIS developer. The SQL Developer will be responsible for:
- Creating & designing databases and all database objects such as tables, views, stored procedures, functions etc.
- Writing and troubleshooting simple to complex SQL scripts, stored procedures, views, functions, etc. as per the requirement.
- Troubleshooting and fixing all performance issues with respect to databases (objects & scripts).
- SQL Query tuning and performance tuning.
- Develop/migrate existing scripts to SSIS.
- Maintaining referential integrity, domain integrity and column integrity by using the available options such as constraints, etc.
- Maintaining database objects (like tables, views,Stored Procedures) performance by following the tuning tips like normalization, indexes, etc.
- Importing and exporting the data from different sources using tools and writing required scripts (see the MERGE sketch after this list).
- Works closely with other members of the team to accomplish the projects.
- Assists project teams to design and develop core business components utilizing industry-accepted analysis and design standards.
- Provides and/or assists with, project management and project leadership.
- Create and maintain documentation for all the projects.
- Working with NOSQL databases (Mostly on MongoDB) .
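A minimal sketch of the import/scripting work referenced above, assuming hypothetical staging and target tables; it is not the project's actual code:

    -- Upsert imported rows from a staging table into the target table
    MERGE dbo.Customers AS tgt
    USING staging.Customers AS src
        ON tgt.CustomerId = src.CustomerId
    WHEN MATCHED THEN
        UPDATE SET tgt.Name  = src.Name,
                   tgt.Email = src.Email
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerId, Name, Email)
        VALUES (src.CustomerId, src.Name, src.Email);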
PRIMARY SKILLS:
TSQL, SQL, NoSQL, Performance Tuning, SSIS, Import & Export of data from different data sources and from different formats.
SECONDARY SKILLS:
Database Administration
REQUIRED SKILLS:
- 3+ years of professional experience in working with MSSQL Server 2008 and above.
- 3+ years of professional experience in writing TSQL scripts.
- Thorough experience in SQL query tuning and performance tuning.
- Experience in developing and maintaining SSIS packages.
- Experience in working with NoSQL databases. MongoDB would be preferable.
- Fluent knowledge of MSSQL server DBA.
- Should be a good team player and should possess good attitude and work ethics.
- Should possess excellent communication and written skills.
PREFERRED SKILLS:
- MSSQL DBA.
- SSIS.
- NOSQL, MongoDB
Company Overview:
At Codvo, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day.
We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.
- Design and solution new integrations/APIs and update existing API functionalities (enhancements or bugfixes) on Integration Services like Logic Apps, BizTalk, SSIS, Stream Sets etc.
- Identify opportunities, propose & implement solutions to improve integrations/middleware solution delivery process efficiency, KPI performance, customer experience and deliver value.
- Ability to guide and train developers on Integration/API tools as required and provide coaching and mentoring.
- Ability to lead design, development, testing, deployment of integrations using Azure Services.
- Ability to lead the Integration solutions for various SaaS/Market standard programs and complex project delivery with the help of Integration developers.
- Ability to design & develop integration frameworks considering all relevant artefacts including, security, governance, error handling, requirement traceability and access management.
- Strong analysis and communication skills. Able to manage conflict and misalignment within the team and with the business.
- Dedicated to project or product work for the majority of the year, working in business and functional natural teams, and establishing cross-org performance management, including effective participation in natural teams
- Presenting the changes to the Integration CoE and landscape managers during the Design review, Code review, Delivery Review and Pre-CAB/CAB meetings to receive required approvals.
- Strong engagement skills, working with stakeholders at all levels on regular basis; Ability to pro-actively engender a strong sense of community and team working.
- Expected to play a pivotal role in onboarding and building new cloud-based integration/API solutions, streamline and standardize the interfaces.
- Should be able to take ownership of deliverables, conduct review-board reviews and seek sign-offs
- Desired Skills and Experience.
- University degree or college diploma in Computer Science or a related major. Excellent communication skills – both verbal and written.
- 8+ years of programming and software development experience; 6+ years of .NET programming experience; 4+ years of BizTalk development experience
- Strong understanding of EAI, SOA, and ESB architecture. Strong understanding of and experience with object-oriented design and development. Solid understanding of enterprise integration patterns
- Solid understanding of software development patterns.
Experience: 4+ Years
Job Location: Remote
Work timings : 2.30 pm -11.30 pm
Technical & Business Expertise:
- Hands-on integration experience in SSIS/MuleSoft
- Hands-on experience with Azure Synapse
- Proven advanced-level experience writing database code in SQL Server
- Proven advanced-level understanding of Data Lake concepts
- Proven intermediate-level proficiency in writing Python or a similar programming language
- Intermediate understanding of Cloud Platforms (GCP)
- Intermediate understanding of Data Warehousing
- Advanced understanding of Source Control (GitHub)
About OJ Commerce:
OJ Commerce is a fast-growing, profitable online retailer based in Florida, USA with a full-fledged India office based in Chennai driven by a sophisticated, data-driven system to run the operations with virtually no human intervention. We strive to be the best-in-class ecommerce company delivering exceptional value to customers by leveraging technology, innovation and brand-partnerships to provide a seamless & enjoyable shopping of high-quality products at the best prices. Role: Individual Contributor Role.
Skill Sets:
• Exposure to data modelling and ability to develop complex queries & procedures
• Manage and monitor performance, capacity and security of the database
• Continuously improve the performance of the database in alliance with the Database Administrator by optimizing jobs, procedures & queries
Desirable Skills:
➢ Candidates with experience in ecommerce / online retail bring added value.
➢ Communication Skills – Excellent verbal and written communication skills with ability to explain ideas very clearly.
➢ Self–motivated and flexible, with an ability to work independently.
A proficient, independent contributor that assists in technical design, development, implementation, and support of data pipelines; beginning to invest in less-experienced engineers.
Responsibilities:
- Design, Create and maintain on premise and cloud based data integration pipelines.
- Assemble large, complex data sets that meet functional/non functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
- Build analytics tools that utilize the data pipeline to provide actionable insights into key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Create data pipelines to enable BI, Analytics and Data Science teams that assist them in building and optimizing their systems
- Assists in the onboarding, training and development of team members.
- Reviews code changes and pull requests for standardization and best practices
- Evolve existing development to be automated, scalable, resilient, self-serve platforms
- Assist the team in the design and requirements gathering for technical and non technical work to drive the direction of projects
Technical & Business Expertise:
- Hands-on integration experience in SSIS/MuleSoft
- Hands-on experience with Azure Synapse
- Proven advanced-level experience writing database code in SQL Server
- Proven advanced-level understanding of Data Lake concepts
- Proven intermediate-level proficiency in writing Python or a similar programming language
- Intermediate understanding of Cloud Platforms (GCP)
- Intermediate understanding of Data Warehousing
- Advanced understanding of Source Control (GitHub)
4 - 8 years of overall experience.
- 1-2 years’ experience in Azure Data Factory - scheduling jobs in Flows and ADF Pipelines, performance tuning, error logging, etc.
- 1+ years of experience with Power BI - designing and developing reports, dashboards, metrics and visualizations in Power BI.
- (Required) Participate in video conferencing calls - daily stand-up meetings and all day working with team members on cloud migration planning, development, and support.
- Proficiency in relational database concepts & design using star schema, Azure Data Warehouse, and Data Vault.
- Requires 2-3 years of experience with SQL scripting (merge, joins, and stored procedures) and best practices.
- Knowledge of deploying and running SSIS packages in Azure (a catalog execution sketch follows this list).
- Knowledge of Azure Databricks.
- Ability to write and execute complex SQL queries and stored procedures.
- Proficient with SQL Server/T-SQL programming in creation and optimization of complex Stored Procedures, UDF, CTE and Triggers
- Overall Experience should be between 4 to 7 years
- Experience working in a data warehouse environment and a strong understanding of dimensional data modeling concepts. Experience in SQL server, DW principles and SSIS.
- Should have strong experience in building data transformations with SSIS including importing data from files, and moving data from source to destination.
- Creating new SSIS packages or modifying existing SSIS packages using SQL server
- Debug and fine-tune SSIS processes to ensure accurate and efficient movement of data. Experience with ETL testing & data validation.
- 1+ years of experience with Azure services like Azure Data Factory, Data flow, Azure blob Storage, etc.
- 1+ years of experience with developing Azure Data Factory Objects - ADF pipeline, configuration, parameters, variables, Integration services runtime.
- Must be able to build Business Intelligence solutions in a collaborative, agile development environment.
- Reporting experience with Power BI or SSRS is a plus.
- Experience working on an Agile/Scrum team preferred.
- Proven strong problem-solving skills, troubleshooting, and root cause analysis.
- Excellent written and verbal communication skills.
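For context on running deployed SSIS packages from T-SQL (on-premises or via the Azure-SSIS integration runtime), the SSIS catalog exposes stored procedures such as the following; the folder, project and package names here are placeholders, not this role's environment:

    -- Start a deployed SSIS package through the SSIS catalog
    DECLARE @execution_id BIGINT;

    EXEC SSISDB.catalog.create_execution
         @folder_name     = N'ETL',
         @project_name    = N'DataWarehouse',
         @package_name    = N'LoadSales.dtsx',
         @use32bitruntime = 0,
         @reference_id    = NULL,
         @execution_id    = @execution_id OUTPUT;

    EXEC SSISDB.catalog.start_execution @execution_id;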
Job Profile
Purpose of Role: To support and maintain the existing and future applications managed by ABD (Agile Business Development) Team.
Main Responsibilities:
- To work on existing and future applications as required
- Work with ABD India Application Support and Business Support team to address production issues and/or to do development work
- Work with US ABD team to coordinate the on-going support and monthly release of C-Hub (Customer-Hub) application and other ABD Applications
- Report status to management on project progress
- Identify and escalate related issues and risks
- Coordinate with others in Development and Business Support team to implement new features of SQL Server/ MS Web based technologies
- Assess current modules and offer performance improvement techniques
- Make sure Database Design Standards/coding standards established by ABD are followed
- Constant communication with ABD leadership team in US.
- Problem Solving ability with “Can-do” attitude
- IT experience in Banking industry is preferred
Candidate Profile:
- Minimum 10 years of experience, with at least 3-5 years in the Corporate & Investment Banking area, and a Bachelor's degree in Engineering, Accounting, Finance, or equivalent
- Expertise with MS SQL Server development including SSIS and SSRS; proficient in development of stored procedures, views, functions, etc. in MS SQL Server
- Experienced with DBA-type tasks such as replication, linked servers, query analyzer, DBCC commands, data transfer from one DB server to another DB server, etc.
- Experienced in MS Excel VBA and macro scripting
- Knowledge of data Encryption in database
We are looking for a Senior Data Engineer to join the Customer Innovation team, who will be responsible for acquiring, transforming, and integrating customer data onto our Data Activation Platform from customers’ clinical, claims, and other data sources. You will work closely with customers to build data and analytics solutions to support their business needs, and be the engine that powers the partnership that we build with them by delivering high-fidelity data assets.
In this role, you will work closely with our Product Managers, Data Scientists, and Software Engineers to build the solution architecture that will support customer objectives. You'll work with some of the brightest minds in the industry, work with one of the richest healthcare data sets in the world, use cutting-edge technology, and see your efforts affect products and people on a regular basis. The ideal candidate is someone that
- Has healthcare experience and is passionate about helping heal people,
- Loves working with data,
- Has an obsessive focus on data quality,
- Is comfortable with ambiguity and making decisions based on available data and reasonable assumptions,
- Has strong data interrogation and analysis skills,
- Defaults to written communication and delivers clean documentation, and,
- Enjoys working with customers and problem solving for them.
A day in the life at Innovaccer:
- Define the end-to-end solution architecture for projects by mapping customers’ business and technical requirements against the suite of Innovaccer products and Solutions.
- Measure and communicate impact to our customers.
- Enabling customers on how to activate data themselves using SQL, BI tools, or APIs to solve questions they have at speed.
What You Need:
- 4+ years of experience in a Data Engineering role, a Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
- 4+ years of experience working with relational databases like Snowflake, Redshift, or Postgres.
- Intermediate to advanced level SQL programming skills.
- Data Analytics and Visualization (using tools like PowerBI)
- The ability to engage with both the business and technical teams of a client - to document and explain technical problems or concepts in a clear and concise way.
- Ability to work in a fast-paced and agile environment.
- Easily adapt and learn new things whether it’s a new library, framework, process, or visual design concept.
What we offer:
- Industry certifications: We want you to be a subject matter expert in what you do. So, whether it’s our product or our domain, we’ll help you dive in and get certified.
- Quarterly rewards and recognition programs: We foster learning and encourage people to take risks. We recognize and reward your hard work.
- Health benefits: We cover health insurance for you and your loved ones.
- Sabbatical policy: We encourage people to take time off and rejuvenate, learn new skills, and pursue their interests so they can generate new ideas with Innovaccer.
- Pet-friendly office and open floor plan: No boring cubicles.
About Company:
Working with a multitude of clients populating the FTSE and Fortune 500s, Audit Partnership is a people focused organization with a strong belief in our employees. We hire the best people to provide the best services to our clients.
APL offers profit recovery services to organizations of all sizes across a number of sectors. APL was born out of a desire to offer an alternative to the stagnant service provision on offer in the profit recovery industry.
Every year we recover millions of pounds for our clients and also work closely with them, sharing our audit findings to minimize future losses. Our dedicated and highly experienced audit teams utilize progressive & dynamic financial service solutions & industry-leading technology to achieve maximum success.
We provide dynamic work environments focused on delivering data-driven solutions at a rapidly increased pace over traditional development. Be a part of our passionate and motivated team who are excited to use the latest in software technologies within financial services.
Headquartered in the UK, we have expanded from a small team in 2002 to a market leading organization serving clients across the globe while keeping our clients at the heart of all decisions we make.
Job description:
We are looking for a high-potential, enthusiastic SQL Data Engineer with a strong desire to build a career in data analysis, database design and application solutions. Reporting directly to our UK based Technology team, you will provide support to our global operation in the delivery of data analysis, conversion, and application development to our core audit functions.
Duties will include assisting with data loads, using T-SQL to analyse data, front-end code changes, data housekeeping, data administration, and supporting the Data Services team as a whole. Your contribution will grow in line with your experience and skills, becoming increasingly involved in the core service functions and client delivery. A self-starter with a deep commitment to the highest standards of quality and customer service. We are offering a fantastic career opportunity, working for a leading international financial services organisation, serving the world’s largest organisations.
What we are looking for:
- 1-2 years of previous experience in a similar role
- Data analysis and conversion skills using Microsoft SQL Server is essential
- An understanding of relational database design and build
- Schema design, normalising data, indexing, query performance analysis (an indexing sketch follows this list)
- Ability to analyse complex data to identify patterns and detect anomalies
- Assisting with ETL design and implementation projects
- Knowledge or experience in one or more of the key technologies below would be preferable:
- Microsoft SQL Server (SQL Server Management Studio, Stored Procedure writing etc)
- T-SQL
- Programming languages (C#, VB, Python etc)
- Use of Python to manipulate and import data
- Experience of ETL/automation advantageous but not essential (SSIS/Prefect/Azure)
- A self-starter who can drive projects with minimal guidance
- Meeting stakeholders to agree system requirements
- Someone who is enthusiastic and eager to learn
- Very good command of English and excellent communication skills
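As a hedged example of the indexing and query-performance analysis mentioned above (the table and column names are hypothetical):

    -- Measure logical reads before and after adding an index
    SET STATISTICS IO ON;

    SELECT InvoiceId, SupplierId, InvoiceDate, Amount
    FROM dbo.Invoices
    WHERE SupplierId = 42
      AND InvoiceDate >= '2024-01-01';

    -- Covering index for the query above
    CREATE NONCLUSTERED INDEX IX_Invoices_Supplier_Date
    ON dbo.Invoices (SupplierId, InvoiceDate)
    INCLUDE (InvoiceId, Amount);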
Perks & Benefits:
- A fantastic work life balance
- Competitive compensation and benefits
- Exposure of working with Fortune 500 organization
- Expert guidance and nurture from global leaders
- Opportunities for career and personal advancement with our continued global growth strategy
- Industry leading training programs
- A working environment that is exciting, fun and engaging
- Creating, designing and developing data models (a star-schema sketch follows this list)
- Prepare plans for all ETL (Extract/Transformation/Load) procedures and architectures
- Validating results and creating business reports
- Monitoring and tuning data loads and queries
- Develop and prepare a schedule for a new data warehouse
- Analyze large databases and recommend appropriate optimization for the same
- Administer all requirements and design various functional specifications for data
- Provide support to the Software Development Life cycle
- Prepare various code designs and ensure efficient implementation of the same
- Evaluate all codes and ensure the quality of all project deliverables
- Monitor data warehouse work and provide subject matter expertise
- Hands-on BI practices, data structures, data modeling, SQL skills
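To make the data-modelling duties above concrete, a minimal star-schema sketch follows; the table and column names are illustrative only, not a prescribed design:

    -- Dimension table
    CREATE TABLE dbo.DimCustomer (
        CustomerKey  INT IDENTITY(1,1) PRIMARY KEY,
        CustomerId   INT           NOT NULL,
        CustomerName NVARCHAR(200) NOT NULL,
        Region       NVARCHAR(100) NULL
    );

    -- Fact table referencing the dimension
    CREATE TABLE dbo.FactSales (
        SalesKey     BIGINT IDENTITY(1,1) PRIMARY KEY,
        CustomerKey  INT           NOT NULL REFERENCES dbo.DimCustomer (CustomerKey),
        OrderDateKey INT           NOT NULL,
        Quantity     INT           NOT NULL,
        SalesAmount  DECIMAL(18,2) NOT NULL
    );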
Experience Range: 5 - 10 Years
Function: Information Technology
Must-have Skills: SQL
A global provider of Business Process Management services
Power BI Developer
Senior visualization engineer with 5 years’ experience in Power BI to develop and deliver solutions that enable delivery of information to audiences in support of key business processes. In addition, hands-on experience with Azure data services like ADF and Databricks is a must.
Ensure code and design quality through execution of test plans and assist in development of standards & guidelines working closely with internal and external design, business, and technical counterparts.
Candidates should have worked in agile development environments.
Desired Competencies:
- Should have minimum of 3 years project experience using Power BI on Azure stack.
- Should have good understanding and working knowledge of Data Warehouse and Data Modelling.
- Good hands-on experience of Power BI
- Hands-on experience with T-SQL / DAX / MDX / SSIS
- Data Warehousing on SQL Server (preferably 2016)
- Experience in Azure Data Services – ADF, DataBricks & PySpark
- Manage own workload with minimum supervision.
- Take responsibility of projects or issues assigned to them
- Be personable, flexible and a team player
- Good written and verbal communications
- Have a strong personality who will be able to operate directly with users
The candidate will be responsible for working on .NET based projects for our clients in the USA. The candidate must be self-motivated and a quick learner. The candidate should also be able to communicate with the client on a regular basis, gather requirements from the client, and provide updates. The work would be in ASP.NET / VB.NET and C# with SQL Server databases. The candidate will be responsible for maintaining existing web applications as well as implementing new applications based on client requirements.
Required Experience, Skills and Qualifications
- Must have 3+ years experience in ASP.Net with C# and VB.NET
- Must have at least 2 years experience in SQL Server
- Must have experience working with SOAP and REST Web Services
- Must have experience with SSRS and SSIS
- Must have some experience with MVC framework and Angular JS
- Must be able to work with basic CSS and HTML
- Must be able to work with jQuery and Javascript
- Experience with Reporting Services, WCF, etc would be a strong positive
Experience - Minimum 3 years
Location - Ahmedabad
Primary Responsibilities:
- We need strong SQL database development skills using the MS SQL server.
- Strong skills in SQL Server Integration Services (SSIS) for ETL development.
- Strong Experience in full life cycle database development project using SQL server.
- Experience in designing and implementing complex logical and physical data models.
- Exposure to web services & web technologies (Javascript, Jquery, CSS, HTML).
- Knowledge of other high-level languages (PERL, Python) will be an added advantage
- Nice to have SQL certification.
Good to have:
• Bachelor’s degree or a minimum of 3+ years of formal industry/professional experience in software development – Healthcare background preferred.
MSBI Developer-
We have the following opening in our organization:
Years of Experience: Experience of 4-8 years.
Location- Mumbai ( Thane)/BKC/Andheri
Notice period: Max 15 days or Immediate
Educational Qualification: MCA/ME/Msc-IT/BE/B-Tech/BCA/BSC IT in Computer Science/B.Tech
Requirements:
- 3- 8 years of consulting or relevant work experience
- Should be good in SQL Server 2008 R2 and above.
- Should be excellent at SQL, SSRS & SSIS, SSAS
- Data modeling, fact & dimension design, work on a data warehouse or DW architecture design.
- Implementing new technology like Power BI, Power BI modeling.
- Knowledge of Azure or R-programming is an added advantage.
- Experiences in BI and Visualization Technology (Tableau, Power BI).
- Advanced T-SQL programming skill
- Can scope out a simple or semi-complex project based on business requirements and achievable benefits
- Evaluate, design, and implement enterprise IT-based business solutions, often working on-site to help customers deploy their solutions.
A sports-focused digital media agency
Office Location: Goregaon Mumbai
Position description:
4+ years of experience in database development.
Primary Responsibilities:
- Understand requirements from front end applications developers
- Write advanced queries, stored procedures, cursors, functions & triggers
- Conduct code reviews
- Work with high-traffic application servers
- Complete understanding of various DBs like PostgreSQL, MongoDB, MySQL
- Ability to identify root cause and mitigate issues
- Ability to lead a team of DB developers
Required Skills:
Mandatory: Passionate about sports, Problem solving, Team player, Target & Result oriented, Strong communication skills
Functional: Oracle SQL, PostgreSQL Development - TSQL, Advanced Queries, Query Optimization, Stored Procedures, Triggers & Cursors, Database Design, Indexes, Joins, ETL tools like SSAS, SSIS, High Data Volume Processing, DB Backup/Recovery.
2. Assemble large, complex data sets that meet business requirements
3. Identify, design, and implement internal process improvements
4. Optimize data delivery and re-design infrastructure for greater scalability
5. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies
6. Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
7. Work with internal and external stakeholders to assist with data-related technical issues and support data infrastructure needs
8. Create data tools for analytics and data scientist team members
Skills Required:
1. Working knowledge of ETL on any cloud (Azure / AWS / GCP)
2. Proficient in Python (Programming / Scripting)
3. Good understanding of any of the data warehousing concepts (Snowflake / AWS Redshift / Azure Synapse Analytics / Google Big Query / Hive)
4. In-depth understanding of principles of database structure
5. Good understanding of any of the ETL technologies (Informatica PowerCenter / AWS Glue / Data Factory / SSIS / Spark / Matillion / Talend / Azure)
6. Proficient in SQL (query solving)
7. Knowledge of Change/Case Management and Version Control – (VSS / DevOps / TFS / GitHub, Bitbucket, CI/CD with Jenkins)
- Expertise in designing and implementing enterprise scale database (OLTP) and Data warehouse solutions.
- Hands on experience in implementing Azure SQL Database, Azure SQL Data Warehouse (Azure Synapse Analytics) and big data processing using Azure Databricks and Azure HDInsight.
- Expert in writing T-SQL programming for complex stored procedures, functions, views and query optimization.
- Should be aware of Database development for both on-premise and SAAS Applications using SQL Server and PostgreSQL.
- Experience in ETL and ELT implementations using Azure Data Factory V2 and SSIS.
- Experience and expertise in building machine learning models using Logistic and linear regression, Decision tree and Random forest Algorithms.
- PolyBase queries for exporting and importing data into Azure Data Lake.
- Building data models both tabular and multidimensional using SQL Server data tools.
- Writing data preparation, cleaning and processing steps using Python, SCALA, and R.
- Programming experience using python libraries NumPy, Pandas and Matplotlib.
- Implementing NOSQL databases and writing queries using cypher.
- Designing end user visualizations using Power BI, QlikView and Tableau.
- Experience working with all versions of SQL Server 2005/2008/2008R2/2012/2014/2016/2017/2019
- Experience using the expression languages MDX and DAX.
- Experience in migrating on-premise SQL server database to Microsoft Azure.
- Hands on experience in using Azure blob storage, Azure Data Lake Storage Gen1 and Azure Data Lake Storage Gen2.
- Performance tuning complex SQL queries, hands on experience using SQL Extended events.
- Data modeling using Power BI for Adhoc reporting.
- Raw data load automation using T-SQL and SSIS
- Expert in migrating existing on-premise database to SQL Azure.
- Experience in using U-SQL for Azure Data Lake Analytics.
- Hands on experience in generating SSRS reports using MDX.
- Experience in designing predictive models using Python and SQL Server.
- Developing machine learning models using Azure Databricks and SQL Server
Required Experience: 5 - 7 Years
Skills : ADF, Azure, SSIS, python
Job Description
Azure Data Engineer with hands on SSIS migrations and ADF expertise.
Roles & Responsibilities
• Overall, 6+ years’ experience in Cloud Data Engineering, with hands-on experience in ADF (Azure Data Factory), is required.
Hands-on experience with SSIS to ADF migration is preferred.
Migrating SQL Server Integration Services (SSIS) workloads to SSIS in ADF (must have done at least one migration).
Hands on experience implementing Azure Data Factory frameworks, scheduling, and performance tuning.
Hands on experience in migrating SSIS solutions to ADF
Hands on experience in ADF coding side.
Hands on experience with MPP Database architecture
Hands on experience in python
Responsibilities
Understand business requirements and actively provide inputs from a data perspective.
Experience of SSIS development.
Experience in Migrating SSIS packages to Azure SSIS Integrated Runtime
Experience in Data Warehouse / Data mart development and migration
Good knowledge and Experience on Azure Data Factory
Expert level knowledge of SQL DB & Datawarehouse
Should know at least one programming language (python or PowerShell)
Should be able to analyse and understand complex data flows in SSIS.
Knowledge on Control-M
Knowledge of Azure data lake is required.
Excellent interpersonal/communication skills (both oral/written) with the ability to communicate at various levels with clarity & precision.
Build simple to complex pipelines & dataflows.
Work with other Azure stack modules like Azure Data Lakes, SQL DW, etc.
Requirements
Bachelor’s degree in Computer Science, Computer Engineering, or relevant field.
A minimum of 5 years’ experience in a similar role.
Strong knowledge of database structure systems and data mining.
Excellent organizational and analytical abilities.
Outstanding problem solver.
Good written and verbal communication skills.
Must Have Skills:
- Solid Knowledge on DWH, ETL and Big Data Concepts
- Excellent SQL skills (with knowledge of SQL analytics functions; see the analytic-function query after this list)
- Working experience on any ETL tool, e.g. SSIS / Informatica
- Working Experience on any Azure or AWS Big Data Tools.
- Experience on Implementing Data Jobs (Batch / Real time Streaming)
- Excellent written and verbal communication skills in English, Self-motivated with strong sense of ownership and Ready to learn new tools and technologies
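A short sketch of the SQL analytics functions mentioned above, using a hypothetical sales table; names are placeholders only:

    -- Rank each product within its category by revenue and show a running total per category
    SELECT
        Category,
        ProductName,
        Revenue,
        ROW_NUMBER() OVER (PARTITION BY Category ORDER BY Revenue DESC) AS rank_in_category,
        SUM(Revenue)  OVER (PARTITION BY Category ORDER BY Revenue DESC
                            ROWS UNBOUNDED PRECEDING)                   AS running_total
    FROM dbo.ProductSales;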
Preferred Skills:
- Experience on Py-Spark / Spark SQL
- AWS Data Tools (AWS Glue, AWS Athena)
- Azure Data Tools (Azure Databricks, Azure Data Factory)
Other Skills:
- Knowledge about Azure Blob, Azure File Storage, AWS S3, Elastic Search / Redis Search
- Knowledge on domain/function (across pricing, promotions and assortment).
- Implementation Experience on Schema and Data Validator framework (Python / Java / SQL),
- Knowledge on DQS and MDM.
Key Responsibilities:
- Independently work on ETL / DWH / Big data Projects
- Gather and process raw data at scale.
- Design and develop data applications using selected tools and frameworks as required and requested.
- Read, extract, transform, stage and load data to selected tools and frameworks as required and requested.
- Perform tasks such as writing scripts, web scraping, calling APIs, write SQL queries, etc.
- Work closely with the engineering team to integrate your work into our production systems.
- Process unstructured data into a form suitable for analysis.
- Analyse processed data.
- Support business decisions with ad hoc analysis as needed.
- Monitoring data performance and modifying infrastructure as needed.
Responsibility: A smart resource with excellent communication skills
Roles & Responsibilities
Must Have
- Ability to design and develop database architectures
- Expert working knowledge and experience of DBMS
- Ability to write complex T-SQL queries
- Expert knowledge of Azure Data Factory (ADF)
- Ability to create and manage SSIS packages while managing a full ETL lifecycle
- Proficiency in data cleansing and reconciliation
- Ability to assist others in topics related to data management
- Ability to quickly investigate and troubleshoot any data/database issues
- Expert knowledge in MS SQL Database Server administration, performance tuning and maintenance experience.
Should Have
- Ability to work with end-users and project teams to analyze, document and create workflow processes
- Knowledge of SSRS (SQL Server Reporting Services)
- Attention to detail, critical thinking and problem solving skills.
- Excellent verbal/written communication skills and be a good team player.
- Azure SQL knowledge (SQL-as-a-service)
- Understanding of Agile Methodology.
- Working knowledge of Git.
Could Have
- DevOps knowledge (Infrastructure-as-a-code)
- Powershell scripting language
- Monitoring tools like Nagios, Azure tools.
Job Description
Problem Formulation: Identifies possible options to address the business problems and must possess a good understanding of dimensional modelling
Must have worked on at least one end-to-end project using any cloud data warehouse (Azure Synapse, AWS Redshift, Google BigQuery)
Good to have an understanding of Power BI and integration with any cloud services like Azure or GCP
Experience of working with SQL Server, SSIS (preferred)
Applied Business Acumen: Supports the development of business cases and recommendations. Owns delivery of project activity and tasks assigned by others. Supports process updates and changes. Solves business issues.
Data Transformation/Integration/Optimization:
The ETL developer is responsible for designing and creating the data warehouse and all related data extraction, transformation and load functions in the company.
The developer should provide oversight and planning of data models, database structural design and deployment, and work closely with the data architect and business analyst.
Duties include working in cross-functional software development teams (business analysts, testers, developers) following agile ceremonies and development practices.
The developer plays a key role in contributing to the design, evaluation, selection, implementation and support of databases solution.
Development and Testing: Develops codes for the required solution by determining the appropriate approach and leveraging business, technical, and data requirements.
Creates test cases to review and validate the proposed solution design. Work on POCs and deploy the software to production servers.
Good to Have (Preferred Skills):
- Minimum 4-8 Years of experience in Data warehouse design and development for large scale application
- Minimum 3 years of experience with star schema, dimensional modelling and extract transform load (ETL) Design and development
- Expertise working with various databases (SQL Server, Oracle)
- Experience developing Packages, Procedures, Views and triggers
- Nice to have Big data technologies
- The individual must have good written and oral communication skills.
- Nice to have SSIS
Education and Experience
- Minimum 4-8 years of software development experience
- Bachelor's and/or Master’s degree in computer science
Please reply with the below details:
Total Experience:
Relevant Experience:
Current CTC:
Expected CTC:
Any offers: Y/N
Notice Period:
Qualification:
DOB:
Present Company Name:
Designation:
Domain
Reason for job change:
Current Location:
NeoQuant Solutions Pvt Ltd
NeoQuant Solutions is a Software Application Development and Software Services company that uses cutting edge technologies in developing software. We provide software solutions and custom products in the domain of Insurance, Banking and Media sector that are known for the use of the latest technology. We believe in innovations and simplicity to solve the business problems of our clients and partners.
Website: https://neoquant.com/
MSBI Developer-
We have the following opening in our organization:
Years of Experience: Experience of 3 - 8 years
Location- Mumbai (thane, BKC, Andheri)
Educational Qualification: MCA/ME/Msc-IT/BE/B-Tech/BCA/BSC IT in Computer Science/B.Tech
Requirements:
- 3 - 8 years of consulting or relevant work experience
- Should be good in SQL Server 2008 R2 and above.
- Should be excellent at SQL, SSRS & SSIS, SSAS
- Data modelling, fact & dimension design, work on a data warehouse or DW architecture design.
- Implementing new technology like Power BI, Power BI modelling.
- Knowledge of Azure or R-programming is an added advantage.
- Can scope out a simple or semi-complex project based on business requirements and achievable benefits
- Evaluate, design, and implement enterprise IT-based business solutions, often working on-site to help customers deploy their solutions