11+ SQL Azure Jobs in Chennai | SQL Azure Job openings in Chennai
Apply to 11+ SQL Azure Jobs in Chennai on CutShort.io. Explore the latest SQL Azure Job opportunities across top companies like Google, Amazon & Adobe.
at Delivery Solutions
About UPS:
Moving our world forward by delivering what matters! UPS is a company with a proud past and an even brighter future. Our values define us. Our culture differentiates us. Our strategy drives us. At UPS we are customer-first, people-led, and innovation-driven. UPS’s India-based Technology Development Centers will bring UPS one step closer to creating a global technology workforce that will help accelerate our digital journey and help us engineer technology solutions that drastically improve our competitive advantage in the field of logistics.
Job Summary:
- Applies the principles of software engineering to design, develop, maintain, test, and evaluate computer software that provides business capabilities, solutions, and/or product suites. Provides systems life-cycle management (e.g., analysis, technical requirements, design, coding, testing, implementation of systems and applications software) to ensure delivery of technical solutions on time and within budget.
- Researches and supports the integration of emerging technologies.
- Provides knowledge and support for applications’ development, integration, and maintenance.
- Develops program logic for new applications or analyzes and modifies logic in existing applications.
- Analyzes requirements, tests, and integrates application components.
- Ensures that system improvements are successfully implemented. May focus on web/internet applications specifically, using a variety of languages and platforms.
REQUIREMENTS
- Experience with Azure Databricks, SQL, and ETL (SSIS packages) – very critical.
- Azure Data Factory, Function Apps, DevOps – a must.
- Experience with Azure and other cloud technologies.
- Databases – Oracle, SQL Server, and Cosmos DB experience needed.
- Azure services (Key Vault, App Configuration, Blob Storage, Redis Cache, Service Bus, Event Grid, ADLS, Application Insights, etc.)
- Knowledge of Striim.
Preferred skills: Microservices experience, preferred. Experience with Angular, .NET Core – not critical.
Additional information: This role will be in-office 3 days a week in Chennai, India.
EXPERIENCE: 5 – 12 years
LEVEL: Senior & Lead Software Engineers
JOB LOCATION: EPAM India Locations
Must Have Skills :
1. .NET Full Stack Developer (.NET, C#, JavaScript, Angular 4 & above, PostgreSQL)
2. Experience in Unit testing
3. Hands on experience with building RESTful Web APIs, micro services
4. Experience in Asynchronous programming
5. Good understanding of Authentication and Authorization models in Web APIs
6. Good at data structures and algorithms.
7. Experience with Entity Framework.
8. Added advantage: Experience in Azure
at Altimetrik
DevOps Architect
Experience: 10–12+ years of relevant experience in DevOps
Locations : Bangalore, Chennai, Pune, Hyderabad, Jaipur.
Qualification:
• Bachelor's or advanced degree in Computer Science, Software Engineering, or equivalent is required.
• Certifications in specific areas are desired
Technical skill set (skill – proficiency level):
- Build tools (Ant or Maven) - Expert
- CI/CD tools (Jenkins or GitHub CI/CD) - Expert
- Cloud DevOps (AWS CodeBuild, CodeDeploy, CodePipeline, etc.) or Azure DevOps - Expert
- Infrastructure As Code (Terraform, Helm charts etc.) - Expert
- Containerization (Docker, Docker Registry) - Expert
- Scripting (Linux) - Expert
- Cluster deployment (Kubernetes) & maintenance - Expert
- Programming (Java) - Intermediate
- Application types for DevOps (streaming such as Spark and Kafka; big data such as Hadoop, etc.) - Expert
- Artifactory (JFrog) - Expert
- Monitoring & Reporting (Prometheus, Grafana, PagerDuty etc.) - Expert
- Ansible, MySQL, PostgreSQL - Intermediate
• Source control (e.g., Git, Bitbucket, SVN, VSTS)
• Continuous integration (e.g., Jenkins, Bamboo, VSTS)
• Infrastructure automation (e.g., Puppet, Chef, Ansible)
• Deployment automation & orchestration (e.g., Jenkins, VSTS, Octopus Deploy)
• Container concepts (Docker)
• Orchestration (Kubernetes, Mesos, Swarm)
• Cloud (e.g., AWS, Azure, Google Cloud, OpenStack)
Roles and Responsibilities
• A DevOps architect should automate processes with the proper tools.
• Developing appropriate DevOps channels throughout the organization.
• Evaluating, implementing, and streamlining DevOps practices.
• Establishing a continuous-build environment to accelerate software development and deployment.
• Engineering effective, broadly applicable processes.
• Helping operations and development teams solve their problems.
• Supervising, examining, and handling technical operations.
• Defining DevOps processes and operations.
• Capacity to lead teams with a leadership attitude.
• Must possess excellent automation skills and the ability to drive initiatives to automate processes.
• Building strong cross-functional leadership skills and working together with the operations and engineering teams to make sure that systems are scalable and secure.
• Excellent knowledge of software development and software testing methodologies, along with configuration management practices in Unix- and Linux-based environments.
• Possess sound knowledge of cloud-based environments.
• Experience in handling automated deployment CI/CD tools.
• Must possess excellent knowledge of infrastructure automation tools (Ansible, Chef, and Puppet).
• Hands-on experience working with Amazon Web Services (AWS).
• Must have strong expertise in operating Linux/Unix environments and scripting languages like Python, Perl, and Shell.
• Ability to review deployment and delivery pipelines i.e., implement initiatives to minimize chances of failure, identify bottlenecks and troubleshoot issues.
• Previous experience in implementing continuous delivery and DevOps solutions.
• Experience in designing and building solutions to move data and process it.
• Must possess expertise in any of the coding languages depending on the nature of the job.
• Experience with containers and container orchestration tools (AKS, EKS, OpenShift, Kubernetes, etc)
• Experience with version control systems a must (GIT an advantage)
• Belief in "Infrastructure as Code" (IaC), including experience with open-source tools such as Terraform.
• Treats best practices for security as a requirement, not an afterthought
• Extensive experience with version control systems like GitLab and their use in release management, branching, merging, and integration strategies
• Experience working with Agile software development methodologies
• Proven ability to work on cross-functional Agile teams
• Mentor other engineers in best practices to improve their skills
• Creating suitable DevOps channels across the organization.
• Designing efficient practices.
• Delivering comprehensive best practices.
• Managing and reviewing technical operations.
• Ability to work independently and as part of a team.
• Exceptional communication skills; knowledgeable about the latest industry trends and highly innovative.
- 5+ years of experience in a Data Engineering role on cloud environment
- Must have good experience in Scala/PySpark (preferably in a Databricks environment)
- Extensive experience with Transact-SQL.
- Experience in Databricks/Spark.
- Strong experience in data warehouse projects
- Expertise in database development projects with ETL processes.
- Manage and maintain data engineering pipelines
- Develop batch processing, streaming and integration solutions
- Experienced in building and operationalizing large-scale enterprise data solutions and applications
- Using one or more of Azure data and analytics services in combination with custom solutions
- Azure Data Lake, Azure SQL DW (Synapse), and SQL Database products or equivalent products from other cloud services providers
- In-depth understanding of data management (e.g., permissions, security, and monitoring).
- Cloud repositories, e.g., Azure Repos, GitHub, Git
- Experience in an agile environment (Prefer Azure DevOps).
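The batch-processing work listed above largely comes down to keyed aggregation over large record sets. A minimal, illustrative sketch in plain Python of that core step (a PySpark/Databricks pipeline would express the same thing with `groupBy().sum()`; the field names and data here are invented for the example):

```python
from collections import defaultdict

# Toy batch-transform step: total order amounts per customer.
# Field names ("customer_id", "amount") and data are illustrative only.
def aggregate_orders(rows):
    totals = defaultdict(float)
    for row in rows:
        totals[row["customer_id"]] += row["amount"]
    return dict(totals)

orders = [
    {"customer_id": "c1", "amount": 10.0},
    {"customer_id": "c2", "amount": 5.5},
    {"customer_id": "c1", "amount": 4.0},
]
print(aggregate_orders(orders))  # {'c1': 14.0, 'c2': 5.5}
```

In a real pipeline the input would be partitioned files in a data lake rather than an in-memory list, but the shape of the transformation is the same.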
Good to have
- Manage source data access security
- Automate Azure Data Factory pipelines
- Continuous Integration/Continuous Deployment (CI/CD) pipelines, source repositories
- Experience in implementing and maintaining CI/CD pipelines
- Understanding of Power BI; Delta Lakehouse architecture
- Knowledge of software development best practices.
- Excellent analytical and organization skills.
- Effective working in a team as well as working independently.
- Strong written and verbal communication skills.
- Expertise in database development projects and ETL processes.
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark Streaming, etc.
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
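The stream-processing systems named above share one core idea: incremental aggregation over an unbounded input, rather than loading the full dataset. A minimal pure-Python sketch of that idea (illustrative only, not Storm or Spark code):

```python
# Streaming-style word count: consume records one at a time and emit an
# updated count per key downstream after each record, mimicking the
# canonical Storm / Spark Streaming word-count topology.
def streaming_word_count(events):
    counts = {}
    for line in events:                   # records arrive one at a time
        for word in line.split():
            counts[word] = counts.get(word, 0) + 1
            yield word, counts[word]      # emit the updated count

stream = iter(["to be", "or not to be"])  # stand-in for an unbounded source
updates = list(streaming_word_count(stream))
print(updates[-1])  # ('be', 2)
```

A real system adds partitioning, fault-tolerant state, and windowing on top, but the per-record update-and-emit loop is the essence.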
An IT Services Major, hiring for a leading insurance player.
- Develop the front-end user interface of our UKLB Experience Analysis (Biometrics) tool using SharePoint, PowerApps, Power Automate, and Logic Apps
- Connect the UI with the back-end SQL database using Logic Apps; advise on solution design as a Logic Apps expert (specifically around the UI)
- Responsible for managing technology in projects and providing technical guidance or solutions for work completion
Skills and Experience
- Experience with Azure services like Azure App Services, Azure Active Directory, Azure SQL, Azure PostgreSQL, Key Vault, Azure DevOps, Application Insights, Azure Storage, Redis Cache
- Microsoft Azure Developer Certification
- Experience with .Net SDK, integration tools, Application and Security frameworks
- C#, ASP.NET, Application development using .Net Framework
- Preferred: Insurance or BFSI domain experience
- .NET
- Azure
Location: Chennai
Salary: 15–20 LPA
- Extensive experience in JavaScript/Node.js on the back end
- Front-end frameworks such as Bootstrap, Pug, jQuery
- Experience in web frameworks like Express.js, Webpack
- Experience in Nginx, Redis, Apache Kafka and MQTT
- Experience with MongoDB
- Experience with Version Control Systems like Git / Mercurial
- Sound knowledge of software engineering best practices
- Sound knowledge of RESTful API design
- Working knowledge of automated testing tools
- Experience in maintaining production servers (Optional)
- Experience with Azure DevOps (Optional)
- Experience in digital payments or financial services industry is a plus.
- Participate in strategic project-planning meetings.
- Be involved and participate in the overall application lifecycle.
- Collaborate with External Development Teams.
- Define and communicate technical and design requirements, understand workflows, and write code as per requirements.
- Develop functional and sustainable web applications with clean code.
- Focus on coding and debugging.
Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake
Job Description
- Minimum of 12 years of experience with Informatica ETL and database technologies.
- Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake; exposure to change data capture technology.
- Lead and guide development of an Informatica-based ETL architecture.
- Develop solution in highly demanding environment and provide hands on guidance to other team members.
- Head complex ETL requirements and design.
- Implement an Informatica based ETL solution fulfilling stringent performance requirements.
- Collaborate with product development teams and senior designers to develop architectural requirements.
- Assess requirements for completeness and accuracy.
- Determine if requirements are actionable for ETL team.
- Conduct impact assessment and determine size of effort based on requirements.
- Develop full SDLC project plans to implement ETL solution and identify resource requirements.
- Play an active, leading role in shaping and enhancing the overall ETL Informatica architecture; identify, recommend, and implement ETL process and architecture improvements.
- Assist and verify design of solution and production of all design phase deliverables.
- Manage the build phase and quality-assure code to ensure requirements are fulfilled and the ETL architecture is adhered to.
- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions – Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub.
- Experience in migrating on-premises data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions
- Experience with Azure Analysis Services
- Experience in Power BI
- Experience with third-party solutions like Attunity/StreamSets, Informatica
- Experience with pre-sales activities (responding to RFPs, executing quick POCs)
- Capacity Planning and Performance Tuning on Azure Stack and Spark.