15+ Windows Azure Jobs in Ahmedabad | Windows Azure Job openings in Ahmedabad
Job Description: Data Engineer
Location: Ahmedabad
Experience: 5 to 6 years
Employment Type: Full-Time
We are looking for a highly motivated and experienced Data Engineer to join our team. As a Data Engineer, you will play a critical role in designing, building, and optimizing data pipelines that ensure the availability, reliability, and performance of our data infrastructure. You will collaborate closely with data scientists, analysts, and cross-functional teams to provide timely and efficient data solutions.
Responsibilities
● Design and optimize data pipelines for various data sources
● Design and implement efficient data storage and retrieval mechanisms
● Develop data modelling solutions and data validation mechanisms
● Troubleshoot data-related issues and recommend process improvements
● Collaborate with data scientists and stakeholders to provide data-driven insights and solutions
● Coach and mentor junior data engineers in the team
Skills Required:
● Minimum 4 years of experience in data engineering or related field
● Proficient in designing and optimizing data pipelines and data modeling
● Strong programming expertise in Python
● Hands-on experience with big data technologies such as Hadoop, Spark, and Hive
● Extensive experience with cloud data services such as AWS, Azure, and GCP
● Advanced knowledge of database technologies like SQL, NoSQL, and data warehousing
● Knowledge of distributed computing and storage systems
● Familiarity with DevOps practices, Power Automate, and Microsoft Fabric will be an added advantage
● Strong analytical and problem-solving skills with outstanding communication and collaboration abilities
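As a rough, hedged illustration of the Python and Spark pipeline skills listed above, the sketch below reads raw CSV files, applies a simple validation rule, and writes partitioned Parquet; the paths and column names are hypothetical.

```python
# Minimal PySpark batch-pipeline sketch; the paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-batch").getOrCreate()

# Extract: read raw CSV files with a header row and an inferred schema
raw = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("/data/raw/orders/")
)

# Transform/validate: drop rows missing the key and stamp a load date
clean = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("load_date", F.current_date())
)

# Load: write partitioned Parquet for downstream consumers
clean.write.mode("overwrite").partitionBy("load_date").parquet("/data/curated/orders/")

spark.stop()
```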
Qualifications
- Bachelor's degree in Computer Science, Data Science, or a computer-related field
About Kanerika:
Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI.
We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/ LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.
Awards and Recognitions:
Kanerika has won several awards over the years, including:
1. Best Place to Work 2023 by Great Place to Work®
2. Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today
3. NASSCOM Emerge 50 Award in 2014
4. Frost & Sullivan India 2021 Technology Innovation Award for its Kompass composable solution architecture
5. Kanerika has also been recognized for its commitment to customer privacy and data security, having achieved ISO 27701, SOC2, and GDPR compliances.
Working for us:
Kanerika is rated 4.6/5 on Glassdoor, for many good reasons. We truly value our employees' growth, well-being, and diversity, and people’s experiences bear this out. At Kanerika, we offer a host of enticing benefits that create an environment where you can thrive both personally and professionally. From our inclusive hiring practices and mandatory training on creating a safe work environment to our flexible working hours and generous parental leave, we prioritize the well-being and success of our employees.
Our commitment to professional development is evident through our mentorship programs, job training initiatives, and support for professional certifications. Additionally, our company-sponsored outings and various time-off benefits ensure a healthy work-life balance. Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you’ll get while working for Kanerika.
Role Responsibilities:
The following are high-level responsibilities you will take on, though your role will not be limited to these:
- Design, development, and implementation of modern data pipelines, data models, and ETL/ELT processes.
- Architect and optimize data lake and warehouse solutions using Microsoft Fabric, Databricks, or Snowflake.
- Enable business analytics and self-service reporting through Power BI and other visualization tools.
- Collaborate with data scientists, analysts, and business users to deliver reliable and high-performance data solutions.
- Implement and enforce best practices for data governance, data quality, and security.
- Mentor and guide junior data engineers; establish coding and design standards.
- Evaluate emerging technologies and tools to continuously improve the data ecosystem.
Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
- 5-7 years of experience in data engineering or data platform development, with at least 2–3 years in a lead or architect role.
- Strong hands-on experience in one or more of the following:
- Microsoft Fabric (Data Factory, Lakehouse, Data Warehouse)
- Databricks (Spark, Delta Lake, PySpark, MLflow)
- Snowflake (Data Warehousing, Snowpipe, Performance Optimization)
- Power BI (Data Modeling, DAX, Report Development)
- Proficiency in SQL and programming languages like Python or Scala.
- Experience with Azure, AWS, or GCP cloud data services.
- Solid understanding of data modeling, data governance, security, and CI/CD practices.
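One of the hands-on areas listed above is Databricks with Delta Lake; purely as a hedged sketch, the snippet below shows a typical upsert (MERGE) into a Delta table using the delta-spark Python API. The table path, staging location, and join key are assumptions for illustration only.

```python
# Hypothetical Delta Lake upsert on Databricks; the paths and key column are assumptions.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # on Databricks a session already exists

# Incoming batch of changed records, assumed to be staged as Parquet
updates = spark.read.parquet("/mnt/staging/customers_changes/")

target = DeltaTable.forPath(spark, "/mnt/lakehouse/silver/customers")

# MERGE: update matching rows, insert new ones
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```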
Preferred Qualifications:
- Familiarity with data modeling techniques and practices for Power BI.
- Knowledge of Azure Databricks or other data processing frameworks.
- Knowledge of Microsoft Fabric or other Cloud Platforms.
What we need?
- B.Tech in Computer Science or equivalent.
Why join us?
- Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
- Gain hands-on experience in content marketing with exposure to real-world projects.
- Opportunity to learn from experienced professionals and enhance your marketing skills.
- Contribute to exciting initiatives and make an impact from day one.
- Competitive stipend and potential for growth within the company.
- Recognized for excellence in data and AI solutions with industry awards and accolades.
Employee Benefits:
1. Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: Creates opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits:
- L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
- Promote from Within: Encourages internal growth and advancement opportunities.
Who we are:
Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI. We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/ LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.
What You Will Do:
As a Data Governance Developer at Kanerika, you will be responsible for building and managing metadata, lineage, and compliance frameworks across the organization's data ecosystem.
Required Qualifications:
- 4 to 6 years of experience in data governance or data management.
- Strong experience in Microsoft Purview and Informatica governance tools.
- Proficient in tracking and visualizing data lineage across systems.
- Familiar with Azure Data Factory, Talend, dbt, and other integration tools.
- Understanding of data regulations: GDPR, CCPA, SOX, HIPAA.
- Ability to translate technical data governance concepts for business stakeholders.
Tools & Technologies:
- Microsoft Purview, Collibra, Atlan, Informatica Axon, IBM IG Catalog
- Experience in Microsoft Purview areas:
1. Label creation and policy management
2. Publish/Auto-labeling
3. Data Loss Prevention & Compliance handling
4. Compliance Manager, Communication Compliance, Insider Risk Management
5. Records Management, Unified Catalog, Information Barriers
6. eDiscovery, Data Map, Lifecycle Management, Compliance Alerts, Audit
7. DSPM, Data Policy
Key Responsibilities:
- Set up and manage Microsoft Purview accounts, collections, and access controls (RBAC).
- Integrate Purview with data sources: Azure Data Lake, Synapse, SQL DB, Power BI, Snowflake.
- Schedule and monitor metadata scanning and classification jobs.
- Implement and maintain collection hierarchies aligned with data ownership.
- Design metadata ingestion workflows for technical, business, and operational metadata.
- Enrich data assets with business context: descriptions, glossary terms, tags.
- Synchronize metadata across tools using REST APIs, PowerShell, or ADF.
- Validate end-to-end lineage for datasets and reports (ADF → Synapse → Power BI).
- Resolve lineage gaps or failures using mapping corrections or scripts.
- Perform impact analysis to support downstream data consumers.
- Create custom classification rules for sensitive data (PII, PCI, PHI).
- Apply and manage Microsoft Purview sensitivity labels and policies.
- Integrate with Microsoft Information Protection (MIP) for DLP.
- Manage business glossary in collaboration with domain owners and stewards.
- Implement approval workflows and term governance.
- Conduct audits for glossary and metadata quality and consistency.
- Automate Purview operations using:
- PowerShell, Azure Functions, Logic Apps, REST APIs
- Build pipelines for dynamic source registration and scanning.
- Automate tagging, lineage, and glossary term mapping.
- Enable operational insights using Power BI, Synapse Link, Azure Monitor, and governance APIs.
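Several of the responsibilities above involve driving Purview over its REST APIs. Below is a hedged Python sketch that acquires a token with azure-identity and runs a simple keyword search against the Purview Data Map; the account name, endpoint path, and api-version are assumptions to verify against the current Purview REST reference.

```python
# Hedged sketch: keyword search against the Microsoft Purview Data Map REST API.
# The account name, endpoint path, and api-version are assumptions -- verify them
# against the current Purview REST reference before use.
import requests
from azure.identity import DefaultAzureCredential

ACCOUNT = "contoso-purview"  # hypothetical Purview account name
ENDPOINT = f"https://{ACCOUNT}.purview.azure.com"

credential = DefaultAzureCredential()
token = credential.get_token("https://purview.azure.net/.default").token

resp = requests.post(
    f"{ENDPOINT}/datamap/api/search/query",
    params={"api-version": "2023-09-01"},        # assumed api-version
    headers={"Authorization": f"Bearer {token}"},
    json={"keywords": "customer", "limit": 10},  # simple keyword search payload
    timeout=30,
)
resp.raise_for_status()

for asset in resp.json().get("value", []):
    print(asset.get("qualifiedName"), asset.get("entityType"))
```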
Job Position: Lead II - Software Engineering
Domain: Information technology (IT)
Location: India - Thiruvananthapuram
Salary: Best in Industry
Job Positions: 1
Experience: 8 - 12 Years
Skills: .NET, SQL Azure, REST API, Vue.js
Notice Period: Immediate – 30 Days
Job Summary:
We are looking for a highly skilled Senior .NET Developer with a minimum of 7 years of experience across the full software development lifecycle, including post-live support. The ideal candidate will have a strong background in .NET backend API development, Agile methodologies, and Cloud infrastructure (preferably Azure). You will play a key role in solution design, development, DevOps pipeline enhancement, and mentoring junior engineers.
Key Responsibilities:
- Design, develop, and maintain scalable and secure .NET backend APIs.
- Collaborate with product owners and stakeholders to understand requirements and translate them into technical solutions.
- Lead and contribute to Agile software delivery processes (Scrum, Kanban).
- Develop and improve CI/CD pipelines and support release cadence targets, using Infrastructure as Code tools (e.g., Terraform).
- Provide post-live support, troubleshooting, and issue resolution as part of full lifecycle responsibilities.
- Implement unit and integration testing to ensure code quality and system stability.
- Work closely with DevOps and cloud engineering teams to manage deployments on Azure (Web Apps, Container Apps, Functions, SQL).
- Contribute to front-end components when necessary, leveraging HTML, CSS, and JavaScript UI frameworks.
- Mentor and coach engineers within a co-located or distributed team environment.
- Maintain best practices in code versioning, testing, and documentation.
Mandatory Skills:
- 7+ years of .NET development experience, including API design and development
- Strong experience with Azure Cloud services, including:
- Web/Container Apps
- Azure Functions
- Azure SQL Server
- Solid understanding of Agile development methodologies (Scrum/Kanban)
- Experience in CI/CD pipeline design and implementation
- Proficient in Infrastructure as Code (IaC) – preferably Terraform
- Strong knowledge of RESTful services and JSON-based APIs
- Experience with unit and integration testing techniques
- Source control using Git
- Strong understanding of HTML, CSS, and cross-browser compatibility
Good-to-Have Skills:
- Experience with Kubernetes and Docker
- Knowledge of JavaScript UI frameworks, ideally Vue.js
- Familiarity with JIRA and Agile project tracking tools
- Exposure to Database as a Service (DBaaS) and Platform as a Service (PaaS) concepts
- Experience mentoring or coaching junior developers
- Strong problem-solving and communication skills
Job Role: Sr. Data Engineer
Location: Navrangpura, Ahmedabad
WORK FROM OFFICE - 5 DAYS A WEEK (UK Shift)
Job Description:
• 5+ years of core experience in Python and data engineering.
• Must have experience with Azure Data Factory and Databricks.
• Exposure to Python libraries such as NumPy, pandas, Beautiful Soup, Selenium, pdfplumber, Requests, etc. (see the sketch below).
• Proficient in SQL programming.
• Knowledge of DevOps practices and tools such as CI/CD, Jenkins, and Git.
• Experience working with Azure Databricks.
• Able to coordinate with teams across multiple locations and time zones.
• Strong interpersonal and communication skills, with an ability to lead a team and keep them motivated.
Mandatory Skills: Data Engineer – Azure Data Factory, Databricks, Python, SQL/MySQL/PostgreSQL
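As a hedged illustration of the Python libraries named above (Requests, Beautiful Soup, pandas), the sketch below scrapes a small HTML table and saves it as CSV; the URL, table layout, and output path are hypothetical.

```python
# Small extract-and-load sketch with the libraries named above; the URL,
# table structure, and output path are hypothetical.
import pandas as pd
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/prices"  # hypothetical source page

html = requests.get(URL, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

# Collect rows from the first HTML table on the page
rows = []
for tr in soup.select("table tr"):
    cells = [td.get_text(strip=True) for td in tr.find_all("td")]
    if cells:
        rows.append(cells)

df = pd.DataFrame(rows, columns=["item", "price"])    # assumes a two-column table
df["price"] = pd.to_numeric(df["price"], errors="coerce")

df.to_csv("prices_snapshot.csv", index=False)
print(df.head())
```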
About Apexon:
Apexon is a digital-first technology services firm specializing in accelerating business transformation and delivering human-centric digital experiences. For over 17 years, Apexon has been meeting customers wherever they are in the digital lifecycle and helping them outperform their competition through speed and innovation. Our reputation is built on a comprehensive suite of engineering services, a dedication to solving our clients’ toughest technology problems, and a commitment to continuous improvement. We focus on three broad areas of digital services: User Experience (UI/UX, Commerce); Engineering (QE/Automation, Cloud, Product/Platform); and Data (Foundation, Analytics, and AI/ML), and have deep expertise in BFSI, healthcare, and life sciences.
Apexon is backed by Goldman Sachs Asset Management and Everstone Capital.
To know more about us, please visit: https://www.apexon.com/
Responsibilities:
- We are looking for a C# Automation Engineer with 4-6 years of experience to join our engineering team and help develop and maintain various software/utility products.
- Good object-oriented programming concepts and practical knowledge.
- Strong programming skills in C# are required.
- Good knowledge of C# Automation is preferred.
- Good to have experience with the Robot framework.
- Must have knowledge of API (REST APIs), and database (SQL) with the ability to write efficient queries.
- Good to have knowledge of Azure cloud.
- Take end-to-end ownership of test automation development, execution and delivery.
Good to have:
- Experience in tools like SharePoint and Azure DevOps.
Other skills:
- Strong analytical & logical thinking skills. Ability to think and act rationally when faced with challenges.
• The ideal candidate should have knowledge of new automation tools and be able to guide others on them.
• Experience in developing BDD and TDD automation framework.
• Hands-on experience in test automation, performance testing and API automation testing.
• Proficiency in at least one programming language, preferably Java, JavaScript/TypeScript.
• Experience with cloud (AWS/Azure) and Continuous Integration CI/CD using Jenkins.
• Knowledge of Agile methodologies and coordination with onsite/offshore teams.
• Strong verbal and written communication skills.
• Expertise in test automation frameworks like TestNG or JUnit.
• Experience with tools, languages, and databases such as Jenkins, SQL, and Selenium.
• Ability to think outside the box and come up with creative solutions.
• Quick learning capability for new tools and technologies.
• Designing the initial test automation architecture for applications that cross multiple platforms and technologies.
• Analysis of test results and reporting on the quality and effectiveness of the automation testing process.
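The posting's automation stack is C#/Java-centric; purely to illustrate the shape of a UI automation check, here is a minimal sketch in Python with Selenium and pytest. The application URL, locator, and assertions are placeholders.

```python
# Illustrative UI check with Python, Selenium, and pytest (the posting's stack is
# C#/Java-centric; this only sketches the idea). URL and assertions are placeholders.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By


@pytest.fixture
def driver():
    drv = webdriver.Chrome()   # assumes a local Chrome + Selenium Manager setup
    yield drv
    drv.quit()


def test_landing_page_has_heading(driver):
    driver.get("https://example.com")        # hypothetical application URL
    heading = driver.find_element(By.TAG_NAME, "h1")
    assert heading.text != ""
    assert "Example" in driver.title
```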
Company Name: Petpooja!
Location: Ahmedabad
Designation: DevOps Engineer
Experience: Between 2 to 7 Years
Candidates from Ahmedabad will be preferred
Job Location: Ahmedabad
Job Responsibilities:
- Plan, implement, and maintain the software development infrastructure.
- Introduce and oversee software development automation across cloud providers like AWS and Azure
- Help develop, manage, and monitor continuous integration and delivery systems
- Collaborate with software developers, QA specialists, and other team members to ensure the timely and successful delivery of new software releases
- Contribute to software design and development, including code review and feedback
- Assist with troubleshooting and problem-solving when issues arise
- Keep up with the latest industry trends and best practices while ensuring the company meets configuration requirements
- Participate in team improvement initiatives
- Help create and maintain internal documentation using Git or other similar applications
- Provide on-call support as needed
Qualification Required:
1. You should have experience handling various services on the AWS cloud.
2. Previous experience as a Site Reliability Engineer would be an advantage.
3. You should be well versed in the command line and hands-on with Linux/Ubuntu administration and other aspects of supporting a software development team.
4. At least 2 to 7 years of experience managing AWS services such as Auto Scaling, Route 53, and various other internal networking services (see the sketch below).
5. An AWS certification is recommended.
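As a small, read-only illustration of working with the AWS services named above, the boto3 sketch below lists Auto Scaling groups and Route 53 hosted zones; it assumes credentials are provided via the environment, and the printed fields are examples only.

```python
# Read-only boto3 sketch for the AWS services named above; credentials are assumed
# to come from the environment, and the printed fields are examples only.
import boto3

autoscaling = boto3.client("autoscaling")
route53 = boto3.client("route53")

# List Auto Scaling groups and their desired capacity
for asg in autoscaling.describe_auto_scaling_groups()["AutoScalingGroups"]:
    print(asg["AutoScalingGroupName"], asg["DesiredCapacity"])

# List Route 53 hosted zones
for zone in route53.list_hosted_zones()["HostedZones"]:
    print(zone["Name"], zone["Id"])
```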

A consulting & implementation services firm serving the Oil & Gas, Mining, and Manufacturing industries.
Job Responsibilities:
- Technically sound in .NET technology, with good working knowledge of and experience in Web API and SQL Server.
- Should be able to carry out requirement analysis, design, coding, unit testing, and support to fix defects reported during the QA and UAT phases and at go-live.
- Able to work alone or as part of a team with minimal or no supervision from delivery leads.
- Good experience required in the Azure integration stack: Logic Apps, Azure Functions, APIM, and Application Insights.
Must-have skills:
- Strong Web API development using ASP.NET Core, Logic Apps, Azure Functions, APIM
- Azure Functions
- Azure Logic App
- Azure APIM
- Azure ServiceBus
Desirable Skills
- Azure Event Grid/Hub
- Azure KeyVault
- Azure SQL – Knowledge on SQL query
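The must-have list above includes Azure Service Bus; although the role is .NET-focused, the hedged sketch below uses the Python azure-servicebus SDK to show the basic send-to-queue flow. The connection string variable and queue name are placeholders.

```python
# Minimal Service Bus send, sketched with the Python azure-servicebus SDK
# (the role itself is .NET-focused); connection string and queue name are placeholders.
import json
import os

from azure.servicebus import ServiceBusClient, ServiceBusMessage

conn_str = os.environ["SERVICEBUS_CONNECTION_STRING"]  # assumed to be set

with ServiceBusClient.from_connection_string(conn_str) as client:
    with client.get_queue_sender(queue_name="orders") as sender:
        payload = json.dumps({"orderId": 123, "status": "created"})
        sender.send_messages(ServiceBusMessage(payload))
        print("message sent")
```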
● Improve CI/CD tooling using GitLab.
● Implement and own the CI pipeline; manage CD tooling.
● Implement, improve, and maintain monitoring and alerting stacks.
● Build and maintain highly available production systems.
● Lead and guide the team in identifying and implementing new technologies.
Skills
● Configuration management experience with tools such as Kubernetes, Ansible, or similar.
● Managing production infrastructure with Terraform, CloudFormation, etc.
● Strong Linux, system administration background.
● Ability to present and communicate the architecture in a visual form.
● Strong knowledge of AWS, Azure, and GCP.
Skills We Require: DevOps, AWS administration, Terraform, Infrastructure as Code
Summary:
- Implement integrations requested by customers
- Deploy updates and fixes
- Provide Level 2 technical support
- Build tools to reduce occurrences of errors and improve customer experience
- Develop software to integrate with internal back-end systems
- Perform root cause analysis for production errors
- Investigate and resolve technical issues
- Develop scripts to automate visualization
- Design procedures for system troubleshooting and maintenance
Have good hands-on experience with DevOps, AWS administration, Terraform, and Infrastructure as Code.
Have knowledge of EC2, Lambda, S3, ELB, VPC, IAM, CloudWatch, CentOS, and server hardening.
Ability to understand business requirements and translate them into technical requirements.
A knack for benchmarking and optimization.
Intuitive is the fastest-growing top-tier Cloud Solutions and Services company, supporting global enterprise customers across the Americas, Europe, and the Middle East.
Intuitive is looking for highly talented hands-on Cloud Infrastructure Architects to help accelerate our growing Professional Services consulting Cloud & DevOps practice. This is an excellent opportunity to join Intuitive’s global world class technology teams, working with some of the best and brightest engineers while also developing your skills and furthering your career working with some of the largest customers.
Job Description :
- Extensive experience with K8s (EKS/GKE) and K8s ecosystem tooling, e.g., Prometheus, ArgoCD, Grafana, Istio, etc.
- Extensive AWS/GCP Core Infrastructure skills
- Infrastructure/ IAC Automation, Integration - Terraform
- Kubernetes resources engineering and management
- Experience with DevOps tools, CICD pipelines and release management
- Good at creating documentation (runbooks, design documents, implementation plans)
Linux Experience:
- Namespace
- Virtualization
- Containers
Networking Experience:
- Virtual networking
- Overlay networks
- VXLANs, GRE
Kubernetes Experience:
Should have experience in bringing up a Kubernetes cluster manually, without using the kubeadm tool.
Observability
Experience in observability is a plus
Cloud automation:
Familiarity with cloud platforms, primarily AWS, and DevOps tools such as Jenkins, Terraform, etc.
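As a minimal illustration of working with a Kubernetes cluster programmatically, the sketch below uses the official Kubernetes Python client to list pods; it assumes a valid kubeconfig (for example, for an EKS or GKE cluster) is already in place.

```python
# Read-only sketch with the official Kubernetes Python client; assumes a valid
# kubeconfig (e.g. for an EKS/GKE cluster) is already in place.
from kubernetes import client, config

config.load_kube_config()            # picks up ~/.kube/config
v1 = client.CoreV1Api()

# List pods across all namespaces with their phase
for pod in v1.list_pod_for_all_namespaces().items:
    print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)
```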
Role Description
- Office 365 Migration Specialist
- The position requires prior Office 365 mailbox migration experience, including scheduling migration batches and process management.
- Under the general direction of the customer's IT/Security/Okta teams, this role is responsible for assisting in the planning and support of the migration from multiple on-premises Exchange environments and third-party mail services into a single Azure-hosted Office 365 environment.
- Candidates must be flexible to work Monday through Friday
Experience and educational requirements
- Bachelor's degree or any equivalent postgraduate qualification.
Experience: 4 to 8 years
1. Communicate with the clients and understand their business requirements.
2. Build, train, and manage your own team of junior data engineers.
3. Assemble large, complex data sets that meet the client’s business requirements.
4. Identify, design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
5. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources, including the cloud.
6. Assist clients with data-related technical issues and support their data infrastructure requirements.
7. Work with data scientists and analytics experts to strive for greater functionality.
Skills required (experience with most of these):
1. Experience with Big Data tools: Hadoop, Spark, Apache Beam, Kafka, etc.
2. Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
3. Experience in ETL and Data Warehousing.
4. Experience and firm understanding of relational and non-relational databases like MySQL, MS SQL Server, Postgres, MongoDB, Cassandra etc.
5. Experience with cloud platforms like AWS, GCP and Azure.
6. Experience with workflow management using tools like Apache Airflow.
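Point 6 above mentions workflow management with Apache Airflow; the hedged sketch below defines a two-task DAG in the Airflow 2.4+ style. The DAG id, schedule, and task bodies are placeholders.

```python
# Hypothetical two-task DAG (Airflow 2.4+ style); the DAG id, schedule,
# and task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from the source system")   # placeholder extract step


def load():
    print("load data into the warehouse")        # placeholder load step


with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```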
Roles & Responsibilities:
Technical Skills:
- .NET – .NET Framework, C#, .NET Core, MVC, Web API, Web Services, Microservices, and SQL
- Azure – Azure Cloud, SaaS, PaaS, IaaS, Azure relational and NoSQL databases, Big Data services
Responsibilities
- Good understanding of and experience working on Microsoft Azure (IaaS/PaaS/SaaS)
- Ability to architect, design, and implement cloud-based solutions
- Proven track record of designing and implementing IoT-based solutions, Big Data solutions, and applications on the Azure cloud platform.
- Experience in building .NET-based enterprise distributed solutions on Windows and Linux.
- Experience using CI/CD tools such as Jenkins, Azure Pipelines, and Terraform, and other tooling such as Ansible, CloudFormation, etc.
- Good understanding of HA/DR setups in the cloud
- Experience and working knowledge of Virtualization, Networking, Data Center, and Security
- Deep hands-on experience in the design, development, and deployment of business software at scale.
- Strong hands-on experience in Azure Cloud Platform
- Experience with Kubernetes, Docker, and other cloud deployment and container technologies
- Experience with or knowledge of other cloud offerings (e.g., AWS, GCP) will be an added advantage
- Experience with monitoring tools like Prometheus, Grafana, Datadog, etc.
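The last point mentions monitoring tools such as Prometheus; as a minimal hedged illustration, the sketch below uses the Python prometheus_client library to expose a custom counter that a Prometheus server could scrape. The metric name and port are arbitrary.

```python
# Minimal custom-metric exporter using the Python prometheus_client library;
# the metric name and port are arbitrary placeholders.
import random
import time

from prometheus_client import Counter, start_http_server

ORDERS_PROCESSED = Counter("orders_processed_total", "Orders processed by the worker")

if __name__ == "__main__":
    start_http_server(8000)      # Prometheus scrapes http://localhost:8000/metrics
    while True:
        ORDERS_PROCESSED.inc()   # simulate one unit of work
        time.sleep(random.uniform(0.5, 2.0))
```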



