15+ SQL Azure Jobs in Pune | SQL Azure Job openings in Pune
Apply to 15+ SQL Azure Jobs in Pune on CutShort.io. Explore the latest SQL Azure Job opportunities across top companies like Google, Amazon & Adobe.
TVARIT GmbH develops and delivers artificial intelligence (AI) solutions for the manufacturing, automotive, and process industries. With its software products, TVARIT enables its customers to make intelligent, well-founded decisions, e.g., in predictive maintenance, OEE improvement, and predictive quality. Renowned reference customers, proven technology, a strong research team from renowned universities, and a prestigious AI award (e.g., under EU Horizon 2020) make TVARIT one of the most innovative AI companies in Germany and Europe.
We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English.
We are seeking a skilled and motivated Data Engineer from the manufacturing industry with over two years of experience to join our team. As a Data Engineer, you will be responsible for designing, building, and maintaining the infrastructure required for the collection, storage, processing, and analysis of large and complex data sets. The ideal candidate will have a strong foundation in ETL pipelines and Python; experience with Azure and Terraform is a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives.
Skills Required
- Experience in the manufacturing industry (metal industry is a plus)
- 2+ years of experience as a Data Engineer
- Experience in data cleaning & structuring and data manipulation
- ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines.
- Python: Strong proficiency in Python programming for data manipulation, transformation, and automation.
- Experience in SQL and data structures
- Knowledge of big data technologies such as Apache Spark, Flink, and Hadoop, and of NoSQL databases.
- Knowledge of at least one cloud platform, such as AWS, Azure, or Google Cloud Platform.
- Proficiency in data management and data governance
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
Nice To Have
- Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database).
- Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud resources.
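The ETL-pipeline requirement above can be illustrated with a minimal sketch. The sensor-data schema (`machine_id`, `temperature`) is hypothetical, and SQLite stands in for whatever warehouse the real role targets:

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: coerce types and drop incomplete records."""
    cleaned = []
    for row in rows:
        if not row.get("machine_id") or not row.get("temperature"):
            continue  # skip incomplete sensor readings
        cleaned.append({
            "machine_id": row["machine_id"].strip(),
            "temperature": float(row["temperature"]),
        })
    return cleaned

def load(rows, conn):
    """Load: write cleaned rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings (machine_id TEXT, temperature REAL)"
    )
    conn.executemany(
        "INSERT INTO readings VALUES (:machine_id, :temperature)", rows
    )
    conn.commit()

# One row is deliberately incomplete and gets filtered out.
raw = "machine_id,temperature\npress_01,72.5\n,68.0\npress_02,75.1\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
count = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
```

The same extract/transform/load split scales up directly: each stage stays independently testable, which is what makes pipelines like this maintainable.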
KEY RESPONSIBILITIES
· Develop high-quality database solutions.
· Use T-SQL to develop and implement procedures and functions.
· Review and interpret ongoing business report requirements.
· Research required data.
· Build appropriate and useful reporting deliverables.
· Analyze existing SQL queries for performance improvements.
· Suggest new queries.
· Develop procedures and scripts for data migration.
· Provide timely scheduled management reporting.
· Investigate exceptions with regard to asset movements.
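The query-performance responsibility above usually starts with reading execution plans. As a self-contained sketch, SQLite's `EXPLAIN QUERY PLAN` stands in for SQL Server's execution plans (the `assets` table and index names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE assets (id INTEGER, account TEXT, moved_on TEXT)")
conn.executemany(
    "INSERT INTO assets VALUES (?, ?, ?)",
    [(i, f"acct_{i % 10}", "2024-01-01") for i in range(1000)],
)

def plan(query):
    """Return the query-plan detail strings for a SELECT statement."""
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail) tuples.
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + query)]

# Before indexing: a full table scan.
before = plan("SELECT * FROM assets WHERE account = 'acct_3'")

# After adding an index on the filtered column, the plan switches to
# an index search instead of a scan.
conn.execute("CREATE INDEX idx_assets_account ON assets(account)")
after = plan("SELECT * FROM assets WHERE account = 'acct_3'")
```

On SQL Server the analogous tools would be the graphical execution plan or `SET STATISTICS IO/TIME ON`, but the workflow (inspect plan, add or adjust an index, re-check) is the same.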
MUST-HAVES FOR THIS GIG
T-SQL, stored procedures, functions, triggers, XML operations, JSON support on SQL Server 2016 and above, SSIS, SSRS, CTEs, EAV data structures, integration with NoSQL (MongoDB), SQL Server indexes, bulk insert, BCP, CMD shell, memory optimization, performance tuning, query optimization, database design, table joins, SQL Server Agent jobs, backup and maintenance plans, data migration, good communication
NICE-TO-HAVES FOR THIS GIG:
- Working knowledge of mobile development activity.
- Working knowledge of web hosting solution on IIS7.
- Experience working with an offshore–onsite development process
Microsoft Azure Data Integration Engineer
Job Description:
We are looking for a Microsoft Azure Data Integration Developer to design and build cutting-edge user experiences for our client's consumer-facing desktop application. This senior developer will work closely with product owners, UX designers, and front-end and back-end developers to help build the next-generation platform.
Key Skills:
- 3+ years of experience in an enterprise or consumer software development environment using C# and designing and supporting Azure environments
- Expert-level programming skills in MVC and microservices architectures
- Experience with modern frameworks and design patterns, e.g., .NET Core
- Strong knowledge of the C# language
- Hands-on experience using the Azure administration portal and iPaaS
- Demonstrable experience deploying enterprise workloads to Azure
- Hands-on experience with Azure Function Apps, App Service, Logic Apps, storage, Azure Key Vault integration, and Azure SQL Database.
- Solid understanding of object-oriented programming.
- Experience developing user interfaces and customizing UI controls/components
- Microsoft Azure certification, business continuity, or disaster recovery planning experience is a plus
Responsibilities:
- Architect, design & build a modern web application for consumers
- Explore configuring hybrid connectivity between on-premises environments and Azure, and how to monitor network performance to comply with service-level agreements.
- Collaborate with UI/UX teams to deliver high-performing and easy-to-use applications
- Participate in code reviews with staff as necessary to ensure a high-quality, performant product
- Develop a deep understanding of client goals and objectives, and articulate how your solutions address their needs
- Unit testing/test-driven development
- Integration testing
- Deploying Azure VMs (Windows Server) in a highly available environment
- Regularly reviewing existing systems and making recommendations for improvements.
- Maintenance
- Post-deployment production support and troubleshooting
Technical Expertise and Familiarity:
- Cloud Technologies: Azure, iPaaS
- Microsoft: .NET Core, ASP.NET, MVC, C#
- Frameworks/Technologies: MVC, Microservices, Web Services, REST API, JavaScript, jQuery, CSS, Testing Frameworks
- IDEs: Visual Studio, VS Code
- Databases: MS SQL Server, Azure SQL
- Familiarity with Agile software development methodologies.
- Advanced knowledge of the Git source control system.
- Azure, AWS, and GCP certifications preferred.
EXPERIENCE: 5 – 12 years
LEVEL: Senior & Lead Software Engineers
JOB LOCATION: EPAM India Locations
Must Have Skills :
1. .NET full-stack developer (.NET, C#, JavaScript, Angular 4 and above, PostgreSQL)
2. Experience in Unit testing
3. Hands-on experience building RESTful Web APIs and microservices
4. Experience in asynchronous programming
5. Good understanding of authentication and authorization models in Web APIs
6. Good grasp of data structures and algorithms
7. Experience with Entity Framework
8. Added advantage: Experience in Azure
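The asynchronous-programming requirement in the list above boils down to one pattern: fan out I/O-bound calls concurrently instead of awaiting them one by one. This posting is .NET-centric (where it would be `async`/`await` with `Task.WhenAll`); the same idea sketched in Python with hypothetical record IDs:

```python
import asyncio

async def fetch_record(record_id: int) -> dict:
    """Simulated I/O-bound call (e.g., a downstream REST API)."""
    await asyncio.sleep(0.01)  # stand-in for network latency
    return {"id": record_id, "status": "ok"}

async def fetch_all(ids):
    """Fan out the calls concurrently; gather preserves input order."""
    return await asyncio.gather(*(fetch_record(i) for i in ids))

# Three calls overlap, so total wall time is ~one latency, not three.
results = asyncio.run(fetch_all([1, 2, 3]))
```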
at Altimetrik
DevOps Architect
Experience: 10–12+ years of relevant DevOps experience
Locations : Bangalore, Chennai, Pune, Hyderabad, Jaipur.
Qualification:
• Bachelor's or advanced degree in computer science, software engineering, or equivalent is required.
• Certifications in specific areas are desired
Technical Skillset (skill - proficiency level):
- Build tools (Ant or Maven) - Expert
- CI/CD tool (Jenkins or Github CI/CD) - Expert
- Cloud DevOps (AWS CodeBuild, CodeDeploy, CodePipeline, etc.) or Azure DevOps - Expert
- Infrastructure As Code (Terraform, Helm charts etc.) - Expert
- Containerization (Docker, Docker Registry) - Expert
- Scripting (Linux) - Expert
- Cluster deployment (Kubernetes) & maintenance - Expert
- Programming (Java) - Intermediate
- Application Types for DevOps (Streaming like Spark, Kafka, Big data like Hadoop etc) - Expert
- Artifactory (JFrog) - Expert
- Monitoring & Reporting (Prometheus, Grafana, PagerDuty etc.) - Expert
- Ansible, MySQL, PostgreSQL - Intermediate
• Source Control (like Git, Bitbucket, Svn, VSTS etc)
• Continuous Integration (like Jenkins, Bamboo, VSTS )
• Infrastructure Automation (like Puppet, Chef, Ansible)
• Deployment Automation & Orchestration (like Jenkins, VSTS, Octopus Deploy)
• Container Concepts (Docker)
• Orchestration (Kubernetes, Mesos, Swarm)
• Cloud (like AWS, Azure, Google Cloud, OpenStack)
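The CI/CD expertise listed above centers on one behavior every tool (Jenkins, GitHub Actions, Azure DevOps) shares: run stages in order and fail fast. A tool-agnostic sketch, with hypothetical stage names standing in for a real pipeline definition:

```python
def run_pipeline(stages):
    """Run CI/CD stages in order, failing fast like a Jenkins pipeline.

    Each stage is a (name, callable) pair; the callable returns True on
    success. Returns the ordered log of stage results.
    """
    log = []
    for name, step in stages:
        ok = step()
        log.append((name, "success" if ok else "failure"))
        if not ok:
            break  # fail fast: later stages are skipped
    return log

stages = [
    ("build", lambda: True),
    ("unit-test", lambda: True),
    ("integration-test", lambda: False),  # simulated failing stage
    ("deploy", lambda: True),             # never reached
]
log = run_pipeline(stages)
```

In a real Jenkinsfile or GitHub Actions workflow the stages are declarative, but the fail-fast ordering and the audit log are exactly what a DevOps architect is designing for.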
Roles and Responsibilities
• Automate processes with appropriate tools.
• Develop appropriate DevOps channels throughout the organization.
• Evaluate, implement, and streamline DevOps practices.
• Establish a continuous build environment to accelerate software development and deployment.
• Engineer general and effective processes.
• Help operations and development teams solve their problems.
• Supervise, examine, and handle technical operations.
• Own the DevOps process and operations.
• Capacity to lead teams with a strong leadership attitude.
• Must possess excellent automation skills and the ability to drive initiatives to automate processes.
• Building strong cross-functional leadership skills and working together with the operations and engineering teams to make sure that systems are scalable and secure.
• Excellent knowledge of software development and software testing methodologies along with configuration management practices in Unix and Linux-based environment.
• Possess sound knowledge of cloud-based environments.
• Experience in handling automated deployment CI/CD tools.
• Must possess excellent knowledge of infrastructure automation tools (Ansible, Chef, and Puppet).
• Hands-on experience working with Amazon Web Services (AWS).
• Must have strong expertise in operating Linux/Unix environments and scripting languages like Python, Perl, and Shell.
• Ability to review deployment and delivery pipelines i.e., implement initiatives to minimize chances of failure, identify bottlenecks and troubleshoot issues.
• Previous experience in implementing continuous delivery and DevOps solutions.
• Experience in designing and building solutions to move data and process it.
• Must possess expertise in any of the coding languages depending on the nature of the job.
• Experience with containers and container orchestration tools (AKS, EKS, OpenShift, Kubernetes, etc)
• Experience with version control systems is a must (Git an advantage)
• Belief in "Infrastructure as Code" (IaC), including experience with open-source tools such as Terraform
• Treats best practices for security as a requirement, not an afterthought
• Extensive experience with version control systems like GitLab and their use in release management, branching, merging, and integration strategies
• Experience working with Agile software development methodologies
• Proven ability to work on cross-functional Agile teams
• Mentor other engineers in best practices to improve their skills
• Creating suitable DevOps channels across the organization.
• Designing efficient practices.
• Delivering comprehensive best practices.
• Managing and reviewing technical operations.
• Ability to work independently and as part of a team.
• Exceptional communication skills, be knowledgeable about the latest industry trends, and highly innovative
Consulting & implementation services in the Oil & Gas, Mining, and Manufacturing industries
Job Responsibilities:
- Technically sound in .NET technology, with good working knowledge of and experience in Web API and SQL Server
- Able to carry out requirement analysis, design, coding, and unit testing, and to fix defects reported during the QA, UAT, and go-live phases.
- Able to work alone or as part of a team with minimal or no supervision from delivery leads.
- Good experience required in the Azure integration stack: Logic Apps, Azure Functions, APIM, and Application Insights.
Must have skill
- Strong Web API development using ASP.NET Core, Logic Apps, Azure Functions, APIM
- Azure Functions
- Azure Logic App
- Azure APIM
- Azure ServiceBus
Desirable Skills
- Azure Event Grid/Hub
- Azure KeyVault
- Azure SQL – Knowledge on SQL query
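A common pattern behind the Azure Functions skill listed above is keeping the handler's core logic separate from the hosting runtime so it stays unit-testable. This sketch uses a hypothetical order payload; in a real Azure Function the function would be wrapped by an HTTP trigger using `azure.functions.HttpRequest`/`HttpResponse` bindings:

```python
import json

def handle_request(req_body: bytes) -> dict:
    """Core handler logic for a hypothetical order-intake endpoint.

    Kept free of Azure SDK types so it can be exercised directly in
    unit tests; the HTTP-trigger wrapper would translate the returned
    dict into an HttpResponse.
    """
    payload = json.loads(req_body)
    if "order_id" not in payload:
        return {"status": 400, "body": {"error": "order_id is required"}}
    return {
        "status": 200,
        "body": {"order_id": payload["order_id"], "accepted": True},
    }

ok = handle_request(b'{"order_id": "A-100"}')
bad = handle_request(b'{}')
```

The same separation applies to Logic Apps and APIM work: keep business logic in small testable units and let the platform handle triggers, retries, and routing.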
XpressBees – a logistics company started in 2015 – is amongst the fastest growing companies of its sector. While we started off rather humbly in the space of ecommerce B2C logistics, the last 5 years have seen us steadily progress towards expanding our presence. Our vision to evolve into a strong full-service logistics organization reflects itself in our new lines of business like 3PL, B2B Xpress and cross-border operations. Our strong domain expertise and constant focus on meaningful innovation have helped us rapidly evolve as the most trusted logistics partner of India. We have progressively carved our way towards best-in-class technology platforms, an extensive network reach, and a seamless last-mile management system. While on this aggressive growth path, we seek to become the one-stop shop for end-to-end logistics solutions. Our big focus areas for the very near future include strengthening our presence as service providers of choice and leveraging the power of technology to improve efficiencies for our clients.
Job Profile
As a Lead Data Engineer in the Data Platform Team at XpressBees, you will build the data platform and infrastructure to support high-quality and agile decision-making in our supply chain and logistics workflows. You will define the way we collect and operationalize data (structured/unstructured), and build production pipelines for our machine learning models and (RT, NRT, batch) reporting and dashboarding requirements. You will use your experience with modern cloud and data frameworks to build products (with storage and serving systems) that drive optimisation and resilience in the supply chain via data visibility, intelligent decision-making, insights, anomaly detection, and prediction.
What You Will Do
• Design and develop the data platform and data pipelines for reporting, dashboarding, and machine learning models. These pipelines would productionize machine learning models and integrate with agent review tools.
• Meet data completeness, correctness, and freshness requirements.
• Evaluate and identify data store and data streaming technology choices.
• Lead the design of the logical model and implement the physical model to support business needs. Come up with logical and physical database designs across platforms (MPP, MR, Hive/Pig) that are optimal for different use cases (structured/semi-structured). Envision and implement the optimal data modelling, physical design, and performance-optimization approach required for the problem.
• Support your colleagues by reviewing code and designs.
• Diagnose and solve issues in our existing data pipelines, and envision and build their successors.
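The completeness and freshness requirements named above are typically enforced as explicit checks at the end of each batch. A minimal sketch with a hypothetical shipment-event schema:

```python
from datetime import datetime, timedelta, timezone

def check_batch(rows, required_fields, max_age):
    """Validate a pipeline batch for completeness and freshness.

    Completeness: every row carries the required fields.
    Freshness: the newest event is no older than `max_age`.
    """
    now = datetime.now(timezone.utc)
    missing = [
        r for r in rows
        if any(r.get(f) in (None, "") for f in required_fields)
    ]
    newest = max(r["event_time"] for r in rows)
    return {
        "complete": not missing,
        "incomplete_rows": len(missing),
        "fresh": (now - newest) <= max_age,
    }

now = datetime.now(timezone.utc)
batch = [
    {"shipment_id": "S1", "hub": "PNQ", "event_time": now - timedelta(minutes=5)},
    {"shipment_id": "S2", "hub": "", "event_time": now - timedelta(minutes=9)},
]
report = check_batch(batch, ["shipment_id", "hub"], timedelta(hours=1))
```

In production these checks would gate downstream loads and feed monitoring dashboards, rather than return a dict, but the two invariants are the same.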
Qualifications & Experience relevant for the role
• A bachelor's degree in Computer Science or a related field with 6 to 9 years of technology experience.
• Knowledge of relational and NoSQL data stores, stream processing, and micro-batching to make technology and design choices.
• Strong experience in system integration, application development, ETL, and data-platform projects. Talented across technologies used in the enterprise space.
• Software development experience using:
• Expertise in relational and dimensional modelling
• Exposure across all SDLC processes
• Experience in cloud architecture (AWS)
• Proven track record of keeping existing technical skills current and developing new ones, so that you can make strong contributions to deep architecture discussions around systems and applications in the cloud (AWS).
• Characteristics of a forward thinker and self-starter who flourishes with new challenges and adapts quickly to new knowledge.
• Ability to work with cross-functional teams of consulting professionals across multiple projects.
• Knack for helping an organization to understand application architectures and integration approaches, to architect advanced cloud-based solutions, and to help launch the build-out of those systems.
• Passion for educating, training, designing, and building end-to-end systems.
- 5+ years of experience in a Data Engineering role on cloud environment
- Must have good experience in Scala/PySpark (preferably in a Databricks environment)
- Extensive experience with Transact-SQL.
- Experience in Databricks/Spark.
- Strong experience in data warehouse projects
- Expertise in database development projects with ETL processes.
- Manage and maintain data engineering pipelines
- Develop batch processing, streaming and integration solutions
- Experienced in building and operationalizing large-scale enterprise data solutions and applications
- Using one or more of Azure data and analytics services in combination with custom solutions
- Azure Data Lake, Azure SQL DW (Synapse), and SQL Database products or equivalent products from other cloud services providers
- In-depth understanding of data management (e.g., permissions, security, and monitoring).
- Cloud repositories (e.g., GitHub, Azure Repos, Git)
- Experience in an agile environment (Azure DevOps preferred).
Good to have
- Manage source data access security
- Automate Azure Data Factory pipelines
- Experience implementing and maintaining Continuous Integration/Continuous Deployment (CI/CD) pipelines and source repositories
- Power BI understanding, Delta Lake house architecture
- Knowledge of software development best practices.
- Excellent analytical and organization skills.
- Effective working in a team as well as working independently.
- Strong written and verbal communication skills.
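The Databricks/PySpark work described above is mostly grouped aggregation over large tables. As a stand-in that runs anywhere, here is the same shape in plain Python with a hypothetical orders dataset; in PySpark this would be `df.groupBy("region").agg(F.sum("amount"))`:

```python
from collections import defaultdict

def group_sum(rows, key, value):
    """Plain-Python stand-in for df.groupBy(key).agg(sum(value))."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)

orders = [
    {"region": "west", "amount": 120.0},
    {"region": "east", "amount": 80.0},
    {"region": "west", "amount": 30.0},
]
totals = group_sum(orders, "region", "amount")
```

The point of Spark is that the same logical operation is distributed across partitions; the aggregation semantics are identical.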
With a leading Business Process Management (BPM) company
Job Summary
- Candidate will be responsible for providing full life-cycle development (design, coding, and testing) and maintenance of web-based system on Azure
- Candidate should have experience in GitHub, knowledge of DevOps is a plus
- Experienced in designing and implementing web portals; experience with DNN is a must
- Ability to work with multiple languages and technologies including C#, ASP.NET, MVC, JavaScript and related libraries, HTML, complex SQL queries, CSS, Bootstrap, and JSON.
- Experience in Agile project management methodology
- Developing and Delivering Excellent Web based solutions/portals/sites based on customer’s requirement within the stipulated timeline
- The candidate should be flexible to learn new technology and platform and should be creative, innovative for improvement ideas, detail oriented, diligent, and eager to learn and grow
Duties and Responsibilities
- Understand business requirements to apply logic to integrate functionalities
- Identify and understand any technical bugs on the server, site, log files or modules and work on resolving the bugs
- Understand how the FTP server is set up for the site
- Understand system/site technical requirements and suggest enhancements if applicable
- Designing, coding, unit Testing, and integration with Database
- Handle site deployment
- Designing, coding, debugging, technical problem solving, and writing Unit Test cases, etc.
Qualifications
Education / Certification
- B.E. / B.Tech. /MSC in Computer Science or IT.
- MCAD/MCSD/MSITP/MCPD
Technical Expertise
- ASP/ASP.NET/VB.NET/MVC/C#/SQL Server 2012+
- HTML, JavaScript, jQuery, CSS, Bootstrap
- GitHub/DevOps, Azure
- Web API/ Web Services, Email Services
Skills and Abilities
- Be able to work with diverse global teams and in an individual contributor role as needed
- Excellent English written and verbal communication skills (for local team and global stakeholders/team members)
- Strong task management skills including time management, and ability to manage multiple projects simultaneously
- Flexibility required to attend late evening meetings with global team members
- Attention to detail and delivering quality projects and knowledge assets
Job Description
Build and maintain bots on the Azure platform. Integrate with Active Directory and, via Web API, with external systems. Train and integrate bots per user requirements. Work in line with design guidelines, best practices, and standards for bot deliverables. Bring a creative approach to conversation flow design and human aspects to bot responses and sentiment.
Qualifications
- a) 5 years of experience in software development with a clear understanding of the project life cycle
- b) Minimum 2-3 years of hands-on experience with the Microsoft Azure Bot Framework, LUIS, and other Cognitive Services offered by Azure
- c) Hands-on experience with machine-learning-based chat bots
- d) Experience with Azure bot services like Text Analytics, etc.
- e) Strong database skills and hands-on experience with databases like SQL Server/Oracle
- f) Strong experience with Azure Active Directory and adaptive card integration in chat bots
- g) Strong experience designing and working with service-oriented architectures (SOA) and Web APIs
- h) Strong experience with Microsoft Azure, ASP.NET / MVC, and programming languages such as C#/VB.NET
- i) Knowledge of Python and NodeJS is a plus
- j) Ability to design and optimize SQL Server 2008 stored procedures
- k) Experience with jQuery, CSS3, HTML5, or similar technologies
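The conversation-flow design mentioned above is often modeled as a state machine: each (state, intent) pair maps to a next state and a reply. A minimal sketch with hypothetical intents; in the Azure Bot Framework, LUIS would supply the intent and a dialog manager the state:

```python
# Transition table: (state, intent) -> (next_state, bot_reply).
FLOW = {
    ("start", "greet"): ("ask_need", "Hi! What can I help you with?"),
    ("ask_need", "reset_password"): ("confirm", "I can reset your password. Confirm?"),
    ("confirm", "yes"): ("done", "Done! Anything else?"),
    ("confirm", "no"): ("ask_need", "Okay, what else can I do?"),
}

def step(state, intent):
    """Advance the conversation one turn; unknown intents re-prompt."""
    if (state, intent) in FLOW:
        return FLOW[(state, intent)]
    return (state, "Sorry, I didn't get that. Could you rephrase?")

state = "start"
replies = []
for intent in ["greet", "reset_password", "yes"]:
    state, reply = step(state, intent)
    replies.append(reply)
```

Keeping the flow in a data table like this makes the "human aspects" tunable: copywriters can adjust replies without touching dispatch logic.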
We will disclose the client name after the initial screening.
Job title: Azure Architect
Locations: Noida, Pune, Bangalore and Mumbai
Responsibilities:
- Develop and maintain scalable architecture, database design and data pipelines and build out new Data Source integrations to support continuing increases in data volume and complexity
- Design and Develop the Data lake, Data warehouse using Azure Cloud Services
- Assist in designing end to end data and Analytics solution architecture and perform POCs within Azure
- Drive the design, sizing, POC setup, etc. of Azure environments and related services for the use cases and the solutions
- Review solution requirements and support architecture design to ensure the selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies.
- Must possess good client-facing experience with the ability to facilitate requirements sessions and lead teams
- Support internal presentations to technical and business teams
- Provide technical guidance, mentoring and code review, design level technical best practices
Experience Needed:
- 12-15 years of industry experience, including at least 3 years in an architect role, along with at least 3 to 4 years' experience designing and building analytics solutions in Azure.
- Experience in architecting data ingestion/integration frameworks capable of processing structured, semi-structured & unstructured data sets in batch & real-time
- Hands-on experience in the design of reporting schemas, data marts and development of reporting solutions
- Develop batch processing, streaming and integration solutions and process Structured and Non-Structured Data
- Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, and Azure Analysis Services and other ETL technologies.
- Experience performing design, development, and deployment using Azure services (Azure Synapse, Data Factory, Azure Data Lake Storage, Databricks, Python, and SSIS)
- Worked with transactional, temporal, time series, and structured and unstructured data.
- Deep understanding of the operational dependencies of applications, networks, systems, security, and policy (both on-premise and in the cloud; VMs, Networking, VPN (Express Route), Active Directory, Storage (Blob, etc.), Windows/Linux).
Mandatory Skills: Azure Synapse, Data Factory, Azure Data Lake Storage, Azure DW, Databricks, Python
Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake
Job Description
Minimum of 15 years of experience with Informatica ETL and database technologies. Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake. Exposure to change data capture technology. Lead and guide development of an Informatica-based ETL architecture. Develop solutions in a highly demanding environment and provide hands-on guidance to other team members. Head complex ETL requirements and design. Implement an Informatica-based ETL solution fulfilling stringent performance requirements. Collaborate with product development teams and senior designers to develop architectural requirements. Assess requirements for completeness and accuracy, and determine whether they are actionable for the ETL team. Conduct impact assessments and determine the size of effort based on requirements. Develop full SDLC project plans to implement the ETL solution and identify resource requirements. Play an active, leading role in shaping and enhancing the overall Informatica ETL architecture; identify, recommend, and implement ETL process and architecture improvements. Assist with and verify the design of the solution and the production of all design-phase deliverables. Manage the build phase and quality-assure code to ensure it fulfils requirements and adheres to the ETL architecture.
- Building and operationalizing large scale enterprise data solutions and applications using one or more of AZURE data and analytics services in combination with custom solutions - Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsights, Databricks, CosmosDB, EventHub/IOTHub.
- Experience in migrating on-premise data warehouses to data platforms on AZURE cloud.
- Designing and implementing data engineering, ingestion, and transformation functions
- Experience with Azure Analysis Services
- Experience in Power BI
- Experience with third-party solutions like Attunity/StreamSets, Informatica
- Experience with PreSales activities (Responding to RFPs, Executing Quick POCs)
- Capacity Planning and Performance Tuning on Azure Stack and Spark.
* Take full ownership of product features to implement, provide bug fixes and write tests and tooling for those features to ensure they work well at cloud scale.
* Take pride of ownership in features used by top 100 global enterprises.
Required skills:
* A C#/ASP.NET/JavaScript/MVC/Microsoft Azure guru with the ability to understand complex domain problems and churn out a PoC quickly.
* A great communicator of ideas and solutions and a lateral thinker when faced with complex performance or production issues.
* Hands-on on Microsoft Azure and have demonstrated skills in building complex applications on Azure will be added advantage
- Hands-on experience with various Microsoft technologies like .NET, ASP.NET, WCF, Web API, MVC, SharePoint and/or Azure.
- Effective project and work-stream management.
- Strong understanding of Agile methodology and processes.
- Managing, mentoring and developing individuals into strong leads.
- Managing project effectively by defining milestones and schedules.
- Critically evaluating information gathered from multiple sources and reconciling conflicts.
- Interacting effectively with both technical and non-technical members on the team.
- Managing risks related to people, technology and processes.
- Ensuring adherence to standards and best practices.
Desired Profile:
- Minimum 10+ years of experience of shipping high quality software.
- Minimum 2 years' experience of people management with a proven track record of building and growing strong teams.
- Strong technical skills with computer science background or equivalent experience.
- Quick learner, ability to multi-task and work under pressure, and work with geographically distributed teams.
- Ability to influence management through presentations, written and verbal communication.
- Great leadership skills and the ability to work effectively with others.
- Preferred candidates with valid US B1 visa and available to travel onsite at a very short notice.