SQL Azure Jobs in Bangalore (Bengaluru)


Apply to 23+ SQL Azure Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest SQL Azure Job opportunities across top companies like Google, Amazon & Adobe.

Gipfel & Schnell Consultings Pvt Ltd
Posted by TanmayaKumar Pattanaik
Bengaluru (Bangalore)
3 - 9 yrs
Best in industry
Data engineering
ADF
data factory
SQL Azure
databricks
+4 more

Data Engineer

 

Brief Posting Description:

This person will work independently or with a team of data engineers on cloud technology products, projects, and initiatives. They will work with all customers, both internal and external, to make sure all data-related features are implemented in each solution, and will collaborate with business partners and other technical teams across the organization as required to deliver the proposed solutions.

 

Detailed Description:

• Works with Scrum masters, product owners, and others to identify new features for digital products.
• Works with IT leadership and business partners to design features for the cloud data platform.
• Troubleshoots production issues of all levels and severities, and tracks progress from identification through resolution.
• Maintains a culture of open communication, collaboration, mutual respect and productive behaviors; participates in the hiring, training, and retention of top-tier talent and mentors team members toward new and fulfilling career experiences.
• Identifies risks, barriers, efficiencies and opportunities when thinking through the development approach; presents possible platform-wide architectural solutions based on facts, data, and best practices.
• Explores all technical options when considering a solution, including homegrown coding, third-party sub-systems, enterprise platforms, and existing technology components.
• Actively participates in collaborative effort through all phases of the software development life cycle (SDLC), including requirements analysis, technical design, coding, testing, release, and customer technical support.
• Develops technical documentation, such as system context diagrams, design documents, release procedures, and other pertinent artifacts.
• Understands the lifecycle of the various technology sub-systems that comprise the enterprise data platform (i.e., version, release, roadmap), including current capabilities, compatibilities, limitations, and dependencies; understands and advises on optimal upgrade paths.
• Establishes relationships with key IT, QA, and other corporate partners, and regularly communicates and collaborates accordingly while working on cross-functional projects or production issues.

Job Requirements:

 

EXPERIENCE:

2 years of experience required, 3 - 5 years preferred, in a data engineering role.

2 years of experience required, 3 - 5 years preferred, in Azure data services (Data Factory, Databricks, ADLS, Synapse, SQL DB, etc.).

 

EDUCATION:

Bachelor's degree in information technology, computer science, or a data-related field preferred

 

SKILLS/REQUIREMENTS:

Expertise working with databases and SQL.

Strong working knowledge of Azure Data Factory and Databricks

Strong working knowledge of code management and continuous integration systems (Azure DevOps or GitHub preferred)

Strong working knowledge of cloud relational databases (Azure Synapse and Azure SQL preferred)

Familiarity with Agile delivery methodologies

Familiarity with NoSQL databases (such as CosmosDB) preferred.

Any experience with Python, DAX, Azure Logic Apps, Azure Functions, IoT technologies, PowerBI, Power Apps, SSIS, Informatica, Teradata, Oracle DB, and Snowflake preferred but not required.

Ability to multi-task and reprioritize in a dynamic environment.

Outstanding written and verbal communication skills

 

Working Environment:

General Office – Work is generally performed within an office environment, with standard office equipment. Lighting and temperature are adequate and there are no hazardous or unpleasant conditions caused by noise, dust, etc. 

 

Physical Requirements:

Work is generally sedentary in nature but may require standing and walking for up to 10% of the time. 

 

Mental Requirements:

Employee required to organize and coordinate schedules.

Employee required to analyze and interpret complex data.

Employee required to problem-solve. 

Employee required to communicate with the public.

This opening is with an MNC

Agency job
via LK Consultants by Namita Agate
Bengaluru (Bangalore)
1 - 6 yrs
₹2L - ₹8L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+9 more

ROLE AND RESPONSIBILITIES

Should be able to work as an individual contributor and maintain good relationships with stakeholders. Should be proactive in learning new skills per business requirements. Familiar with extracting relevant data and cleansing and transforming it into insights that drive business value, through the use of data analytics, data visualization and data modeling techniques.


QUALIFICATIONS AND EDUCATION REQUIREMENTS

Technical Bachelor’s Degree.

Non-Technical Degree holders should have 1+ years of relevant experience.

Epik Solutions
Posted by Sakshi Sarraf
Bengaluru (Bangalore), Noida
5 - 10 yrs
₹7L - ₹28L / yr
Python
SQL
databricks
Scala
Spark
+2 more

Job Description:


As an Azure Data Engineer, your role will involve designing, developing, and maintaining data solutions on the Azure platform. You will be responsible for building and optimizing data pipelines, ensuring data quality and reliability, and implementing data processing and transformation logic. Your expertise in Azure Databricks, Python, SQL, Azure Data Factory (ADF), PySpark, and Scala will be essential for performing the following key responsibilities:


Designing and developing data pipelines: You will design and implement scalable and efficient data pipelines using Azure Databricks, PySpark, and Scala. This includes data ingestion, data transformation, and data loading processes.
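For illustration only, a minimal sketch of such a pipeline is shown below. It ingests a raw CSV extract, applies a simple cleansing transformation, and loads the result as Parquet; the paths, column names, and rules are placeholders invented for the example rather than details of this role.

```python
# Minimal PySpark pipeline sketch: ingest -> transform -> load.
# All paths and column names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_pipeline_sketch").getOrCreate()

# Ingest: read a raw CSV extract (e.g. landed in a data lake folder).
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("/mnt/raw/orders/"))

# Transform: basic cleansing and derivation.
clean = (raw
         .dropDuplicates(["order_id"])
         .filter(F.col("order_amount") > 0)
         .withColumn("order_date", F.to_date("order_timestamp")))

# Load: write curated data as Parquet, partitioned for downstream queries.
(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("/mnt/curated/orders/"))

spark.stop()
```

In practice, logic like this would typically run as a Databricks job or notebook orchestrated by Azure Data Factory.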


Data modeling and database design: You will design and implement data models to support efficient data storage, retrieval, and analysis. This may involve working with relational databases, data lakes, or other storage solutions on the Azure platform.


Data integration and orchestration: You will leverage Azure Data Factory (ADF) to orchestrate data integration workflows and manage data movement across various data sources and targets. This includes scheduling and monitoring data pipelines.
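Pipelines are normally authored and scheduled in the Data Factory itself, but runs can also be triggered and monitored programmatically. The hedged sketch below uses the azure-identity and azure-mgmt-datafactory packages; the subscription, resource group, factory, pipeline name, and parameter are placeholders, and this is only one possible way to drive ADF, not a prescribed approach for the role.

```python
# Trigger an existing ADF pipeline run and poll its status.
# Subscription and resource names below are illustrative placeholders.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-data-platform"      # placeholder
FACTORY_NAME = "adf-data-platform"       # placeholder
PIPELINE_NAME = "pl_ingest_orders"       # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off the pipeline, optionally passing runtime parameters.
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
    parameters={"load_date": "2024-01-01"},
)

# Poll until the run reaches a terminal state.
while True:
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"Pipeline {PIPELINE_NAME} finished with status: {status}")
```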


Data quality and governance: You will implement data quality checks, validation rules, and data governance processes to ensure data accuracy, consistency, and compliance with relevant regulations and standards.
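As a simple illustration of such validation (with invented column names and rules, not requirements of this role), the sketch below gates a curated DataFrame on null keys, duplicate keys, and negative amounts before it is published.

```python
# Illustrative data-quality gate for a curated DataFrame.
# Column names, path, and thresholds are example assumptions only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_checks_sketch").getOrCreate()
df = spark.read.parquet("/mnt/curated/orders/")   # placeholder path

checks = {
    # Every record must carry a business key.
    "null_order_id": df.filter(F.col("order_id").isNull()).count(),
    # Business keys must be unique.
    "duplicate_order_id": df.count() - df.dropDuplicates(["order_id"]).count(),
    # Amounts must be non-negative.
    "negative_amount": df.filter(F.col("order_amount") < 0).count(),
}

failed = {name: n for name, n in checks.items() if n > 0}
if failed:
    # In a real pipeline this could raise, quarantine rows, or log to a DQ store.
    raise ValueError(f"Data quality checks failed: {failed}")
print("All data quality checks passed.")
```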


Performance optimization: You will optimize data pipelines and queries to improve overall system performance and reduce processing time. This may involve tuning SQL queries, optimizing data transformation logic, and leveraging caching techniques.


Monitoring and troubleshooting: You will monitor data pipelines, identify performance bottlenecks, and troubleshoot issues related to data ingestion, processing, and transformation. You will work closely with cross-functional teams to resolve data-related problems.


Documentation and collaboration: You will document data pipelines, data flows, and data transformation processes. You will collaborate with data scientists, analysts, and other stakeholders to understand their data requirements and provide data engineering support.


Skills and Qualifications:


Strong experience with Azure Databricks, Python, SQL, ADF, PySpark, and Scala.

Proficiency in designing and developing data pipelines and ETL processes.

Solid understanding of data modeling concepts and database design principles.

Familiarity with data integration and orchestration using Azure Data Factory.

Knowledge of data quality management and data governance practices.

Experience with performance tuning and optimization of data pipelines.

Strong problem-solving and troubleshooting skills related to data engineering.

Excellent collaboration and communication skills to work effectively in cross-functional teams.

Understanding of cloud computing principles and experience with Azure services.

EPAM
Posted by Kranti Rao
Hyderabad, Bengaluru (Bangalore), Gurugram, Pune, Chennai
3 - 11 yrs
₹6L - ₹39L / yr
HTML/CSS
Javascript
Angular (2+)
AngularJS (1.x)
ASP.NET
+4 more

EXPERIENCE: 5 – 12 years

LEVEL: Senior & Lead Software Engineers

JOB LOCATION: EPAM India Locations

 

Must Have Skills:

1. .NET full-stack development (.NET, C#, JavaScript, Angular 4 & above, PostgreSQL)
2. Experience in unit testing
3. Hands-on experience building RESTful Web APIs and microservices
4. Experience in asynchronous programming
5. Good understanding of authentication and authorization models in Web APIs
6. Good at data structures and algorithms
7. Experience with Entity Framework
8. Added advantage: experience in Azure


Think n Solutions
Posted by TnS HR
Bengaluru (Bangalore)
2 - 12 yrs
Best in industry
Microsoft SQL Server
SQL Server Integration Services (SSIS)
SQL Server Reporting Services (SSRS)
Amazon Web Services (AWS)
SQL Azure
+9 more

Criteria:

  • BE/MTech/MCA/MSc
  • 3+ years of hands-on experience in T-SQL / PL SQL / PG SQL or NoSQL
  • Immediate joiners preferred*
  • Candidates will be selected based on logical/technical and scenario-based testing

 

Note: Candidates who have attended the interview process with TnS in the last 6 months will not be eligible.

 

Job Description:

 

  1. Technical Skills Desired:
    1. Experience in MS SQL Server and one of these relational DBs (PostgreSQL / AWS Aurora DB / MySQL / Oracle) or NoSQL DBs (MongoDB / DynamoDB / DocumentDB) in an application development environment, and eagerness to switch
    2. Design database tables, views, indexes
    3. Write functions and procedures for the middle-tier development team
    4. Work with front-end developers to complete the database modules end to end (hands-on experience in parsing JSON & XML in stored procedures would be an added advantage; see the sketch after this list)
    5. Query optimization for performance improvement
    6. Design & develop SSIS packages or any other transformation tools for ETL
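To illustrate the JSON-parsing note in point 4 above, here is a hedged sketch that executes a T-SQL batch using OPENJSON (SQL Server 2016+) from Python via pyodbc. The connection string, payload shape, and column names are placeholders, and in a real system the same OPENJSON logic would typically live inside a stored procedure.

```python
# Illustrative only: shred a JSON payload with T-SQL OPENJSON via pyodbc.
# Connection details and the JSON shape are placeholder assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;"   # placeholders
    "UID=myuser;PWD=mypassword"
)

tsql = """
DECLARE @payload NVARCHAR(MAX) =
    N'[{"order_id": 1, "amount": 250.0}, {"order_id": 2, "amount": 99.5}]';

SELECT j.order_id, j.amount
FROM OPENJSON(@payload)
     WITH (order_id INT '$.order_id',
           amount   DECIMAL(10, 2) '$.amount') AS j;
"""

cursor = conn.cursor()
cursor.execute(tsql)
for row in cursor.fetchall():
    print(row.order_id, row.amount)
conn.close()
```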

 

  2. Functional Skills Desired:
    1. Banking / Insurance / Retail domain experience would be an added advantage
    2. Client interaction experience would be an added advantage

  3. Good to Have Skills:
    1. Knowledge of a cloud platform (AWS / Azure)
    2. Knowledge of a version control system (SVN / Git)
    3. Exposure to quality and process management
    4. Knowledge of Agile methodology

 

  4. Soft Skills (additional):
    1. Team building (attitude to train, work alongside, and mentor juniors)
    2. Communication skills (all kinds)
    3. Quality consciousness
    4. Analytical acumen applied to all business requirements
    5. Thinking out of the box for business solutions
HL
Bengaluru (Bangalore)
6 - 15 yrs
₹1L - ₹15L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+3 more
• 8+ years of experience in developing Big Data applications
• Strong experience working with Big Data technologies like Spark (Scala/Java), Apache Solr, HIVE, HBase, Elasticsearch, MongoDB, Airflow, Oozie, etc.
• Experience working with relational databases like MySQL, SQL Server, Oracle, etc.
• Good understanding of large system architecture and design
• Experience working in an AWS/Azure cloud environment is a plus
• Experience using version control tools such as Bitbucket/Git code repositories
• Experience using tools like Maven/Jenkins, JIRA
• Experience working in an Agile software delivery environment, with exposure to continuous integration and continuous delivery tools
• Passionate about technology and delivering solutions to solve complex business problems
• Great collaboration and interpersonal skills
• Ability to work with team members and lead by example in code, feature development, and knowledge sharing
Bengaluru (Bangalore), Hyderabad, Pune, Chennai, Jaipur
10 - 14 yrs
₹1L - ₹15L / yr
Ant
Maven
CI/CD
Jenkins
GitHub
+16 more

DevOps Architect

Experience: 10 - 12+ years of relevant experience in DevOps
Locations: Bangalore, Chennai, Pune, Hyderabad, Jaipur.

Qualification:
• Bachelor's or advanced degree in computer science, software engineering or equivalent is required.
• Certifications in specific areas are desired

Technical Skillset (skill - proficiency level):

  • Build tools (Ant or Maven) - Expert
  • CI/CD tool (Jenkins or Github CI/CD) - Expert
  • Cloud DevOps (AWS CodeBuild, CodeDeploy, Code Pipeline etc) or Azure DevOps. - Expert
  • Infrastructure As Code (Terraform, Helm charts etc.) - Expert
  • Containerization (Docker, Docker Registry) - Expert
  • Scripting (linux) - Expert
  • Cluster deployment (Kubernetes) & maintenance - Expert
  • Programming (Java) - Intermediate
  • Application Types for DevOps (Streaming like Spark, Kafka, Big data like Hadoop etc) - Expert
  • Artifactory (JFrog) - Expert
  • Monitoring & Reporting (Prometheus, Grafana, PagerDuty etc.) - Expert
  • Ansible, MySQL, PostgreSQL - Intermediate


• Source Control (like Git, Bitbucket, Svn, VSTS etc)
• Continuous Integration (like Jenkins, Bamboo, VSTS )
• Infrastructure Automation (like Puppet, Chef, Ansible)
• Deployment Automation & Orchestration (like Jenkins, VSTS, Octopus Deploy)
• Container Concepts (Docker); a brief image build-and-push sketch follows this list
• Orchestration (Kubernetes, Mesos, Swarm)
• Cloud (like AWS, Azure, GoogleCloud, Openstack)
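As a small, hedged illustration of the container skills listed above, the script below builds and pushes a Docker image by shelling out to the Docker CLI. The registry and image names are placeholders, and in practice this step would usually run inside a Jenkins or Azure DevOps pipeline rather than as an ad-hoc script.

```python
# Illustrative CI helper: build and push a Docker image via the Docker CLI.
# Registry, repository, and tag values are placeholder assumptions.
import subprocess
import sys

REGISTRY = "myregistry.azurecr.io"   # placeholder container registry
IMAGE = f"{REGISTRY}/payments-service"
TAG = sys.argv[1] if len(sys.argv) > 1 else "latest"

def run(cmd):
    """Run a command, echoing it, and fail fast on a non-zero exit code."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Build the image from the Dockerfile in the current directory.
run(["docker", "build", "-t", f"{IMAGE}:{TAG}", "."])

# Push it to the registry (assumes 'docker login' has already been done).
run(["docker", "push", f"{IMAGE}:{TAG}"])

print(f"Published {IMAGE}:{TAG}")
```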

Roles and Responsibilities

• The DevOps architect should automate processes with the proper tools.
• Developing appropriate DevOps channels throughout the organization.
• Evaluating, implementing and streamlining DevOps practices.
• Establishing a continuous build environment to accelerate software deployment and development processes.
• Engineering general and effective processes.
• Helping operations and development teams solve their problems.
• Supervising, examining and handling technical operations.
• Providing DevOps processes and operations.
• Capacity to handle teams with leadership attitude.
• Must possess excellent automation skills and the ability to drive initiatives to automate processes.
• Building strong cross-functional leadership skills and working together with the operations and engineering teams to make sure that systems are scalable and secure.
• Excellent knowledge of software development and software testing methodologies along with configuration management practices in Unix and Linux-based environment.
• Possess sound knowledge of cloud-based environments.
• Experience in handling automated deployment CI/CD tools.
• Must possess excellent knowledge of infrastructure automation tools (Ansible, Chef, and Puppet).
• Hands-on experience in working with Amazon Web Services (AWS).
• Must have strong expertise in operating Linux/Unix environments and scripting languages like Python, Perl, and Shell.
• Ability to review deployment and delivery pipelines i.e., implement initiatives to minimize chances of failure, identify bottlenecks and troubleshoot issues.
• Previous experience in implementing continuous delivery and DevOps solutions.
• Experience in designing and building solutions to move data and process it.
• Must possess expertise in any of the coding languages depending on the nature of the job.
• Experience with containers and container orchestration tools (AKS, EKS, OpenShift, Kubernetes, etc)
• Experience with version control systems a must (GIT an advantage)
• Belief in "Infrastructure as Code" (IaC), including experience with open-source tools such as Terraform
• Treats best practices for security as a requirement, not an afterthought
• Extensive experience with version control systems like GitLab and their use in release management, branching, merging, and integration strategies
• Experience working with Agile software development methodologies
• Proven ability to work on cross-functional Agile teams
• Mentor other engineers in best practices to improve their skills
• Creating suitable DevOps channels across the organization.
• Designing efficient practices.
• Delivering comprehensive best practices.
• Managing and reviewing technical operations.
• Ability to work independently and as part of a team.
• Exceptional communication skills, be knowledgeable about the latest industry trends, and highly innovative
XpressBees
Posted by Alfiya Khan
Pune, Bengaluru (Bangalore)
6 - 8 yrs
₹15L - ₹25L / yr
Big Data
Data Warehouse (DWH)
Data modeling
Apache Spark
Data integration
+10 more
Company Profile
XpressBees, a logistics company started in 2015, is amongst the fastest growing companies in its sector. While we started off rather humbly in the space of ecommerce B2C logistics, the last 5 years have seen us steadily progress towards expanding our presence. Our vision to evolve into a strong full-service logistics organization reflects itself in our new lines of business like 3PL, B2B Xpress and cross-border operations. Our strong domain expertise and constant focus on meaningful innovation have helped us rapidly evolve as the most trusted logistics partner of India. We have progressively carved our way towards best-in-class technology platforms, an extensive network reach, and a seamless last-mile management system. While on this aggressive growth path, we seek to become the one-stop shop for end-to-end logistics solutions. Our big focus areas for the very near future include strengthening our presence as service providers of choice and leveraging the power of technology to improve efficiencies for our clients.

Job Profile
As a Lead Data Engineer in the Data Platform Team at XpressBees, you will build the data platform and infrastructure to support high quality and agile decision-making in our supply chain and logistics workflows. You will define the way we collect and operationalize data (structured / unstructured), and build production pipelines for our machine learning models and (RT, NRT, batch) reporting & dashboarding requirements. As a Senior Data Engineer in the XB Data Platform Team, you will use your experience with modern cloud and data frameworks to build products (with storage and serving systems) that drive optimisation and resilience in the supply chain via data visibility, intelligent decision making, insights, anomaly detection and prediction.

What You Will Do
• Design and develop the data platform and data pipelines for reporting, dashboarding and machine learning models. These pipelines would productionize machine learning models and integrate with agent review tools.
• Meet the data completeness, correctness and freshness requirements.
• Evaluate and identify the data store and data streaming technology choices.
• Lead the design of the logical model and implement the physical model to support business needs. Come up with logical and physical database designs across platforms (MPP, MR, Hive/PIG) which are optimal physical designs for different use cases (structured/semi-structured). Envision & implement the optimal data modelling, physical design and performance optimization technique/approach required for the problem.
• Support your colleagues by reviewing code and designs.
• Diagnose and solve issues in our existing data pipelines and envision and build their successors.

Qualifications & Experience relevant for the role

• A bachelor's degree in Computer Science or a related field with 6 to 9 years of technology experience.
• Knowledge of relational and NoSQL data stores, stream processing and micro-batching to make technology & design choices.
• Strong experience in system integration, application development, ETL and data-platform projects. Talented across technologies used in the enterprise space.
• Software development experience using:
  • Expertise in relational and dimensional modelling
  • Exposure across all of the SDLC process
  • Experience in cloud architecture (AWS)
• Proven track record in keeping existing technical skills current and developing new ones, so that you can make strong contributions to deep architecture discussions around systems and applications in the cloud (AWS).
• Characteristics of a forward thinker and self-starter who flourishes with new challenges and adapts quickly to learning new knowledge.
• Ability to work with cross-functional teams of consulting professionals across multiple projects.
• Knack for helping an organization to understand application architectures and integration approaches, to architect advanced cloud-based solutions, and to help launch the build-out of those systems.
• Passion for educating, training, designing, and building end-to-end systems.
Zyvka Global Services
Posted by Ridhima Sharma
Remote, Bengaluru (Bangalore)
5 - 12 yrs
₹1L - ₹30L / yr
Internet of Things (IOT)
Java
Spring Boot
SQL server
NOSQL Databases
+5 more
Lead Developer (IOT, Java, Azure)

Responsibilities

  • Design, plan and control the implementation of business solutions requests/demands.
  • Execution of best practices, design, and codification, guiding the rest of the team in accordance with it.
  • Gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements
  • Drive complex technical projects from planning through execution
  • Perform code review and manage technical debt
  • Handling release deployments and production issues
  • Coordinate stress tests, stability evaluations, and support for the concurrent processing of specific solutions
  • Participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews

Skills

  • Degree in Informatics Engineering, Computer Science, or in similar areas
  • Minimum of 5+ years’ work experience in the similar roles
  • Expert knowledge in developing cloud-based applications with Java, Spring Boot, Spring Rest, SpringJPA, and Spring Cloud
  • Strong understanding of Azure Data Services
  • Strong working knowledge of SQL Server, SQL Azure Database, NoSQL, Data Modeling, Azure AD, ADFS, Identity & Access Management.
  • Hands-on experience in ThingWorx platform (Application development, Mashups creation, Installation of ThingWorx and ThingWorx components)
  • Strong knowledge of IoT Platform
  • Development experience in Microservices Architectures best practices and, Docker, Kubernetes
  • Experience designing /maintaining/tuning high-performance code to ensure optimal performance
  • Strong knowledge of web security practice
  • Experience working in Agile Development
  • Knowledge of Google Cloud Platform and Kubernetes
  • Good understanding of Git, source control procedures, and feature branching
  • Fluent in English - written and spoken (mandatory)
Renowned
Bengaluru (Bangalore)
3 - 7 yrs
₹7L - ₹15L / yr
Data Structures
SQL Azure
ADF
  • Azure Data Factory, Azure Databricks, Talend, BODS, Jenkins
  • Microsoft Office (mandatory)
  • Strong knowledge of databases, Azure Synapse, data management, SQL
  • Knowledge of any cloud platform (Azure, AWS, etc.)
  • Advanced Excel
With a leading Business Process Management (BPM) company

Agency job
via Jobdost by Saida Jabbar
Pune, Mumbai, Bengaluru (Bangalore)
2 - 7 yrs
₹5L - ₹16L / yr
DNN
DotNetNuke
ASP.NET MVC
GitHub
DevOps
+13 more

Job Summary

  • Candidate will be responsible for providing full life-cycle development (design, coding, and testing) and maintenance of web-based system on Azure
  • Candidate should have experience in GitHub, knowledge of DevOps is a plus
  • Experienced in designing and implementing web portals, experience with DNN is must
  • Ability to work with multiple languages and technologies including C#, ASP.NET, MVC, JavaScript and related libraries, HTML, complex SQL queries, CSS, Bootstrap and JSON.
  • Experience in Agile project management methodology
  • Developing and Delivering Excellent Web based solutions/portals/sites based on customer’s requirement within the stipulated timeline
  • The candidate should be flexible to learn new technology and platform and should be creative, innovative for improvement ideas, detail oriented, diligent, and eager to learn and grow

Duties and Responsibilities

  • Understand business requirements to apply logic to integrate functionalities
  • Identify and understand any technical bugs on the server, site, log files or modules and work on resolving the bugs
  • Understand how FTP server is setup for the site
  • Understand system/site technical requirements and suggest enhancements if applicable
  • Designing, coding, unit Testing, and integration with Database
  • Handle site deployment
  • Designing, coding, debugging, technical problem solving, and writing Unit Test cases, etc.

Qualifications

Education / Certification

  • B.E. / B.Tech. /MSC in Computer Science or IT.
  • MCAD/MCSD/MSITP/MCPD

Technical Expertise

  • ASP/ASP.NET/VB.NET/MVC/C#/SQL Server 2012+
  • HTML, JavaScript, jQuery, CSS, Bootstrap
  • GitHub/DevOps, Azure
  • Web API/ Web Services, Email Services

Skills and Abilities

  • Be able to work with diverse global teams and in an individual contributor role as needed
  • Excellent English written and verbal communication skills (for local team and global stakeholders/team members)
  • Strong task management skills including time management, and ability to manage multiple projects simultaneously
  • Flexibility required to attend late evening meetings with global team members
  • Attention to detail and delivering quality projects and knowledge assets

 

A Reputed Company

Agency job
via Jobdost by Saida Jabbar
Bengaluru (Bangalore), Pune
2 - 7 yrs
₹8L - ₹10L / yr
.NET
ASP.NET
SQL Azure
C#
Python
+4 more

Job Description

Build and maintain bots on the Azure platform, including integration with Active Directory and Web API based integration with external systems. Train and integrate bots as per users' requirements. Work in line with the design guidelines, best practices and standards for bot deliverables. Bring a creative approach to conversation flow design and to the human aspects and sentiment of bot responses.
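As a rough illustration of the kind of component involved (not this team's actual codebase), the sketch below shows a minimal message handler built with the Python Bot Framework SDK (botbuilder-core). A production bot would add LUIS intent recognition, Azure AD integration, and adaptive cards on top of this, and the keyword routing here is an invented placeholder for a real conversation flow.

```python
# Minimal Bot Framework message handler (Python SDK, botbuilder-core).
# A real Azure bot would plug this into an adapter-backed web endpoint and
# add LUIS intent recognition; this is only an illustrative skeleton.
from botbuilder.core import ActivityHandler, MessageFactory, TurnContext

class GreetingBot(ActivityHandler):
    async def on_message_activity(self, turn_context: TurnContext):
        text = (turn_context.activity.text or "").strip().lower()
        # Placeholder "conversation flow": route on a couple of keywords.
        if text in ("hi", "hello"):
            reply = "Hello! How can I help you today?"
        elif "order" in text:
            reply = "Sure, let me look up your order details."
        else:
            reply = f"You said: {turn_context.activity.text}"
        await turn_context.send_activity(MessageFactory.text(reply))
```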

Qualifications

  a) 5 years of experience in software development with a clear understanding of the project life cycle
  b) Minimum 2-3 years of hands-on experience in the Microsoft Azure Bot Framework, LUIS and other Cognitive Services offered by Azure
  c) Hands-on experience with machine-learning-based chat bots
  d) Experience with Azure bot-related services such as Text Analytics
  e) Strong database skills and hands-on experience with databases like SQL Server/Oracle
  f) Strong experience with Azure Active Directory and adaptive-cards integration in chat bots
  g) Strong experience designing and working with service-oriented architectures (SOA) and Web APIs
  h) Strong experience with Microsoft Azure, ASP.NET / MVC and programming languages such as C#/VB.NET
  i) Knowledge of Python and NodeJS is a plus
  j) Ability to design and optimize SQL Server 2008 stored procedures
  k) Experience with jQuery, CSS3, HTML5 or similar technologies
  l) Ability to adapt quickly to an existing, complex environment
A Top level 5 Services Company

Agency job
Bengaluru (Bangalore)
3 - 7 yrs
₹3L - ₹20L / yr
DevOps
Windows Azure
Linux/Unix
Microsoft Windows Azure
SQL Azure
+5 more
  • 4+ years of experience in IT and infrastructure

  • 2+ years of experience in Azure Devops

  • Experience with Azure DevOps using both as CI / CD tool and Agile framework

  • Practical experience building and maintaining automated operational infrastructure

  • Experience in building React or Angular applications; .NET is a must.

  • Practical experience using version control systems with Azure Repo

  • Experience developing and maintaining scripts using PowerShell and ARM templates / Terraform for Infrastructure as Code.

  • Experience in Linux shell scripting (Ubuntu) is a must

  • Hands on experience with release automation, configuration and debugging.

  • Should have good knowledge of branching and merging

  • Integration of static code analysis tools such as SonarQube and Snyk is a must.

"A Product Startup"

Agency job
Bengaluru (Bangalore)
5 - 8 yrs
₹5L - ₹20L / yr
Windows Azure
Microsoft Windows Azure
DevOps
Terraform
Solution architecture
+5 more

Senior Cloud Engineer / Jr. Cloud Solutions Architect

 

Roles and Responsibilities

  • Define, implement, deploy and maintain development, QA & production environments for cloud-based Azure architecture.

  • Create a strategy for establishing a secure and well-managed enterprise environment in Azure

  • Define and implement security architecture for production, ensure data security at all levels.

  • Provision infrastructure as code using Azure CLI, PowerShell, ARM templates and/or Terraform, with Ansible or other tools (a brief deployment sketch follows this list).

  • Develop scripts to automate the deployment of resource stacks and associated configurations

  • Extend MLP standard systems management processes into the cloud including change, incident, and problem management

  • Establish and implement monitoring and management infrastructure for both availability and performance management

  • Implement observability patterns using Azure Monitor, Azure Application Insights and a Log Analytics workspace.

  • Provide internal training to the team.
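One possible shape for such deployment automation, offered only as an assumption rather than the team's actual tooling, is sketched below: a small Python wrapper that provisions a resource group and deploys an ARM template through the Azure CLI. The resource group, location, and template paths are placeholders, and the same flow could equally be expressed with PowerShell or Terraform.

```python
# Illustrative IaC deployment wrapper around the Azure CLI.
# Resource group, location, and template paths are placeholder assumptions;
# assumes 'az login' has already been performed.
import subprocess

RESOURCE_GROUP = "rg-app-prod"              # placeholder
LOCATION = "eastus"                         # placeholder
TEMPLATE_FILE = "main.json"                 # placeholder ARM template
PARAMETERS_FILE = "main.parameters.json"    # placeholder parameter file

def az(*args):
    cmd = ["az", *args]
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Ensure the resource group exists.
az("group", "create", "--name", RESOURCE_GROUP, "--location", LOCATION)

# Deploy the ARM template with its parameter file.
az("deployment", "group", "create",
   "--resource-group", RESOURCE_GROUP,
   "--template-file", TEMPLATE_FILE,
   "--parameters", f"@{PARAMETERS_FILE}")
```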

 

Primary Skills/Requirements

  • 5+ years of experience in IT and infrastructure

  • 3+ years of experience in Azure design, support and management for a large-scale organization

  • Experience in design and implementation of high availability architecture.

  • Strong experience in Azure CLI, PowerShell, ARM templates and Terraform.

  • Strong understanding of IT Security and related audits

  • Experience with deploying applications on Linux - Ubuntu

  • Should know Azure offerings (Storage, OS instances, Availability Zones, DR, Load Balancers, VPN tunnels, Application Gateway, etc.) and have cloud monitoring experience with Azure Log Analytics and Azure Monitor.

  • Experience with log collection tools and analysis, as well as infrastructure performance monitoring tools and optimization practices

  • Microsoft Azure Certification MCSE: Cloud Platform and Infrastructure or equivalent certification would be an added advantage

  • Experience with Postgres SQL Database

Behavioural

  • Positive work ethics

  • Ability to adapt to dynamic environment

  • Time Management

  • Team Player

  • Communication skills

  • Ability to work independently

Dailyhunt
Agency job
via zyoin by RAKESH RANJAN
Bengaluru (Bangalore)
6 - 8 yrs
₹25L - ₹50L / yr
DBA
Amazon Web Services (AWS)
SQL Azure
Duties & Responsibilities
1. Implement high-quality cloud architectures that meet customer requirements and are consistent with enterprise architectural standards.
2. Deep understanding of cloud computing technologies, business drivers and emerging computing trends.
3. The ideal resource will have experience across enterprise-grade hybrid cloud or data centre transformation.
4. Install, configure and upgrade MySQL/Postgres cluster database software.
5. Experience in setting up DR for RDBMS DBs in Linux environments.
6. Create, configure, manage and migrate NoSQL databases (Redis, Cassandra and MongoDB).
7. Manage day-to-day operations from development to production databases.
8. Monitor the health of cloud services and databases (a brief health-check sketch follows this list).
9. Good understanding of NoSQL/relational databases.
10. Troubleshoot general and performance issues in NoSQL/RDBMS databases.
11. Experience in Linux OS and scripting.
12. Hands-on experience with Azure, GCP and AWS clouds and their services.
13. Knowledge of Python/Ansible is an added advantage.
14. Leveraging open-source technologies and cloud-native hosting platforms.
15. Design and recommend suitable, secure, performance-optimised database offerings based on business requirements.
16. Ensure security considerations are at the forefront when designing and managing database solutions.
17. Maintenance work to be planned meticulously to minimise/eradicate self-inflicted P1 outages.
18. Ability to provide technical system solutions, determine overall design direction and provide hardware recommendations for complex technical issues.
19. Provisioning, deployment and monitoring of the cloud environment using automation tools like Terraform.
20. Ensure all key databases have deep-insight monitoring enabled to enable improved capabilities around fault detection.
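As a small illustration of the routine health monitoring mentioned in item 8 above, the sketch below pings a PostgreSQL instance, a Redis cache, and a MongoDB cluster. Hostnames and credentials are placeholders, and real monitoring would normally run from a scheduler or an observability agent rather than a one-off script.

```python
# Illustrative health probe for PostgreSQL, Redis, and MongoDB.
# All hosts and credentials below are placeholder assumptions.
import psycopg2
import redis
from pymongo import MongoClient

def check_postgres() -> bool:
    conn = psycopg2.connect(host="pg.example.internal", dbname="appdb",
                            user="monitor", password="secret")   # placeholders
    with conn.cursor() as cur:
        cur.execute("SELECT 1;")
        ok = cur.fetchone() == (1,)
    conn.close()
    return ok

def check_redis() -> bool:
    return redis.Redis(host="redis.example.internal", port=6379).ping()

def check_mongo() -> bool:
    client = MongoClient("mongodb://monitor:secret@mongo.example.internal:27017/")
    return client.admin.command("ping").get("ok") == 1.0

for name, probe in [("postgres", check_postgres),
                    ("redis", check_redis),
                    ("mongodb", check_mongo)]:
    try:
        print(f"{name}: {'healthy' if probe() else 'unhealthy'}")
    except Exception as exc:   # a real monitor would raise an alert here
        print(f"{name}: DOWN ({exc})")
```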

Required Qualifications:
• Minimum 6-8 years of experience as a Database Administrator, preferably in a cloud environment.
• Application migration experience that involves migrating large-scale infrastructure between clouds.
• Experience in executing migration cutover activities; support UAT and troubleshoot issues during and post migration.
• Ability to work independently on multiple tasks with minimal guidance, and the ability and desire to work in a fast-paced environment.
• Contribute to overall capacity planning and configuration management of the supporting infrastructure.
• Review recommendations around security, availability and performance from the cloud platform.
• Ability to remain flexible in a demanding work environment and adapt to rapidly changing priorities.
We will disclose the client name after the initial screening.

Agency job
via S3B Global by Rattan Saini
Bengaluru (Bangalore), NCR (Delhi | Gurgaon | Noida), Pune, Mumbai
9 - 20 yrs
₹10L - ₹40L / yr
Windows Azure
Azure Synapse
Data Structures
SQL Azure
QA DB
+1 more

Job title: Azure Architect

Locations: Noida, Pune, Bangalore and Mumbai

 

Responsibilities:

  • Develop and maintain scalable architecture, database design and data pipelines and build out new Data Source integrations to support continuing increases in data volume and complexity
  • Design and Develop the Data lake, Data warehouse using Azure Cloud Services
  • Assist in designing end to end data and Analytics solution architecture and perform POCs within Azure
  • Drive the design, sizing, POC setup, etc. of Azure environments and related services for the use cases and the solutions
  • Review the solution requirements and supporting architecture design to ensure the selection of appropriate technology, efficient use of resources and integration of multiple systems and technologies.
  • Must possess good client-facing experience with the ability to facilitate requirements sessions and lead teams
  • Support internal presentations to technical and business teams
  • Provide technical guidance, mentoring and code review, design level technical best practices

 

Experience Needed:

  • 12-15 years of industry experience is required, with at least 3 years in an architect role and at least 3 to 4 years' experience designing and building analytics solutions in Azure.
  • Experience in architecting data ingestion/integration frameworks capable of processing structured, semi-structured & unstructured data sets in batch & real-time
  • Hands-on experience in the design of reporting schemas, data marts and development of reporting solutions
  • Develop batch processing, streaming and integration solutions and process Structured and Non-Structured Data
  • Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, and Azure Analysis Services and other ETL technologies.
  • Experience performing design, development & deployment using Azure services (Azure Synapse, Data Factory, Azure Data Lake Storage, Databricks, Python and SSIS)
  • Worked with transactional, temporal, time series, and structured and unstructured data.
  • Deep understanding of the operational dependencies of applications, networks, systems, security, and policy (both on-premise and in the cloud; VMs, Networking, VPN (Express Route), Active Directory, Storage (Blob, etc.), Windows/Linux).

 

 

Mandatory Skills: Azure Synapse, Data Factory, Azure Data Lake Storage, Azure DW, Databricks, Python

IT Giant

Agency job
Remote, Chennai, Bengaluru (Bangalore), Hyderabad, Pune, Mumbai, NCR (Delhi | Gurgaon | Noida), Kolkata
10 - 18 yrs
₹15L - ₹30L / yr
ETL
Informatica
Informatica PowerCenter
Windows Azure
SQL Azure
+2 more
Key skills:
Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake

Job Description
Minimum of 15 years of experience with Informatica ETL and database technologies. Experience with Azure database technologies including Azure SQL Server and Azure Data Lake. Exposure to change data capture technology. Lead and guide the development of an Informatica-based ETL architecture. Develop solutions in a highly demanding environment and provide hands-on guidance to other team members. Head complex ETL requirements and design. Implement an Informatica-based ETL solution fulfilling stringent performance requirements. Collaborate with product development teams and senior designers to develop architectural requirements. Assess requirements for completeness and accuracy, and determine whether requirements are actionable for the ETL team. Conduct impact assessments and determine the size of effort based on requirements. Develop full SDLC project plans to implement the ETL solution and identify resource requirements. Perform an active, leading role in shaping and enhancing the overall ETL Informatica architecture, and identify, recommend and implement ETL process and architecture improvements. Assist with and verify the design of the solution and the production of all design-phase deliverables. Manage the build phase and quality-assure code to ensure it fulfils requirements and adheres to the ETL architecture.
A Chemical & Purifier Company headquartered in the US.

Agency job
via Multi Recruit by Fiona RKS
Bengaluru (Bangalore)
4 - 9 yrs
₹15L - ₹18L / yr
Azure data factory
Azure Data factory
Azure Data Engineer
SQL
SQL Azure
+2 more
  • Create and maintain optimal data pipeline architecture,
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Author data services using a variety of programming languages
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Azure ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centres and Azure regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
  • Work in an Agile environment with Scrum teams.
  • Ensure data quality and help in achieving data governance.


Basic Qualifications
  • 2+ years of experience in a Data Engineer role
  • Undergraduate degree required (Graduate degree preferred) in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
  • Experience using the following software/tools:
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases
  • Experience with data pipeline and workflow management tools
  • Experience with Azure cloud services: ADLS, ADF, ADLA, AAS
  • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
  • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
  • Understanding of ELT and ETL patterns and when to use each. Understanding of data models and transforming data into the models
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and data sets
  • Strong analytic skills related to working with unstructured datasets
  • Build processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
  • Experience supporting and working with cross-functional teams in a dynamic environment
Fragma Data Systems
Posted by Evelyn Charles
Remote, Bengaluru (Bangalore), Hyderabad, Chennai, Mumbai, Pune
8 - 15 yrs
₹16L - ₹28L / yr
PySpark
SQL Azure
azure synapse
Windows Azure
Azure Data Engineer
+3 more
Technology Skills:
  • Building and operationalizing large scale enterprise data solutions and applications using one or more of AZURE data and analytics services in combination with custom solutions - Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsights, Databricks, CosmosDB, EventHub/IOTHub.
  • Experience in migrating on-premises data warehouses to data platforms on the Azure cloud (a simplified migration sketch follows this list).
  • Designing and implementing data engineering, ingestion, and transformation functions
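To give a flavour of the migration work mentioned above (a simplified assumption, not a prescribed pattern), the sketch below copies a single table from an on-premises SQL Server warehouse into Azure Data Lake Storage Gen2 as Parquet using Spark's JDBC reader. The server, storage account, container, table, and credentials are placeholders, and at scale this would typically be driven table-by-table through ADF or Databricks jobs.

```python
# Illustrative one-table copy: on-premises SQL Server -> ADLS Gen2 (Parquet) via Spark.
# Server, storage account, container, table, and credentials are placeholder assumptions;
# the SQL Server JDBC driver and ADLS credentials are assumed to be configured on the cluster.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dw_migration_sketch").getOrCreate()

# Read the source table over JDBC.
source = (spark.read.format("jdbc")
          .option("url", "jdbc:sqlserver://onprem-dw.example.internal:1433;databaseName=EDW")
          .option("dbtable", "dbo.FactSales")
          .option("user", "etl_user")
          .option("password", "secret")
          .load())

# Land the data in the lake as Parquet for downstream Synapse/Databricks use.
target_path = "abfss://curated@mydatalake.dfs.core.windows.net/edw/fact_sales"
source.write.mode("overwrite").parquet(target_path)
```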
Good to Have: 
  • Experience with Azure Analysis Services
  • Experience in Power BI
  • Experience with third-party solutions like Attunity/Stream sets, Informatica
  • Experience with PreSales activities (Responding to RFPs, Executing Quick POCs)
  • Capacity Planning and Performance Tuning on Azure Stack and Spark.
Fragma Data Systems
Posted by Priyanka U
Bengaluru (Bangalore)
8 - 10 yrs
₹16L - ₹28L / yr
SQL Azure
Azure synapse
Azure
Azure Data Architect
Spark
+4 more
Technology Skills:
 
  • Building and operationalizing large scale enterprise data solutions and applications using one or more of AZURE data and analytics services in combination with custom solutions - Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsights, Databricks, CosmosDB, EventHub/IOTHub.
  • Experience in migrating on-premise data warehouses to data platforms on AZURE cloud. 
  • Designing and implementing data engineering, ingestion, and transformation functions
 
 
Good to Have: 
  • Experience with Azure Analysis Services
  • Experience in Power BI
  • Experience with third-party solutions like Attunity/Stream sets, Informatica
  • Experience with PreSales activities (Responding to RFPs, Executing Quick POCs)
  • Capacity Planning and Performance Tuning on Azure Stack and Spark.
PAGO Analytics India Pvt Ltd
Posted by Vijay Cheripally
Remote, Bengaluru (Bangalore), Mumbai, NCR (Delhi | Gurgaon | Noida)
2 - 8 yrs
₹8L - ₹15L / yr
Python
PySpark
Microsoft Windows Azure
SQL Azure
Data Analytics
+6 more
Be an integral part of large scale client business development and delivery engagements
Develop the software and systems needed for end-to-end execution on large projects
Work across all phases of SDLC, and use Software Engineering principles to build scaled solutions
Build the knowledge base required to deliver increasingly complex technology projects


Object-oriented languages (e.g. Python, PySpark, Java, C#, C++ ) and frameworks (e.g. J2EE or .NET)
Database programming using any flavours of SQL
Expertise in relational and dimensional modelling, including big data technologies
Exposure across all the SDLC process, including testing and deployment
Expertise in Microsoft Azure is mandatory including components like Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure DataBricks, HD Insights, ML Service etc.
Good knowledge of Python and Spark are required
Good understanding of how to enable analytics using cloud technology and ML Ops
Experience in Azure infrastructure and Azure DevOps will be a strong plus
Pramati Technologies
Posted by Kishore Reddy
Bengaluru (Bangalore)
4 - 8 yrs
₹7L - ₹15L / yr
DBA
Microsoft SQL Server DBA
SQL Azure
MongoDB
SQL DBA
+1 more
  1. Experience: 3-5 years
  2. Scripting: PowerShell and either of (JavaScript, Python)
  3. Hands-on with Kubernetes and Docker
  4. Good to have: either Azure or AWS
  5. Hands-on with any of these DB technologies: NoSQL administration, Cosmos DB, MongoDB, MariaDB administration
  6. Good to have: analytics knowledge