SQL Azure Jobs in Delhi, NCR and Gurgaon


Apply to 14+ SQL Azure Jobs in Delhi, NCR and Gurgaon on CutShort.io. Explore the latest SQL Azure Job opportunities across top companies like Google, Amazon & Adobe.

VoerEir India


2 recruiters
Pooja Jaiswal
Posted by Pooja Jaiswal
Noida
3 - 5 yrs
₹13L - ₹15L / yr
Python
Django
Flask
Linux/Unix
Computer Networking
+3 more

Roles and Responsibilities

• Create solution prototypes and conduct proofs of concept for new tools.

• Research and build an understanding of new tools and areas.

• Clearly articulate the pros and cons of various technologies/platforms and perform detailed analysis of business problems and technical environments to derive a solution.

• Optimise the application for maximum speed and scalability.

• Work on feature development and bug fixing.

Technical skills

• Must have knowledge of networking in Linux and the basics of computer networks in general.

• Must have intermediate/advanced knowledge of one programming language, preferably Python.

• Must have experience writing shell scripts and configuration files.

• Should be proficient in Bash.

• Should have excellent Linux administration capabilities.

• Working experience with SCM; Git is preferred.

• Knowledge of build and CI/CD tools like Jenkins, Bamboo etc. is a plus.

• Understanding of the architecture of OpenStack/Kubernetes is a plus.

• Code contributed to the OpenStack/Kubernetes community is a plus.

• Data center network troubleshooting experience is a plus.

• Understanding of the NFV and SDN domains is a plus.
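As a purely illustrative sketch of the Linux networking plus Python fluency this list asks for (the function name and payload are invented, not from the posting), a pair of connected sockets can emulate a client/server exchange on one machine:

```python
import socket

def echo_once(payload: bytes) -> bytes:
    """Push payload through one end of a connected socket pair and
    echo it back: a loopback stand-in for a client/server link."""
    left, right = socket.socketpair()  # two pre-connected sockets
    try:
        left.sendall(payload)        # "client" sends the request
        data = right.recv(1024)      # "server" reads it...
        right.sendall(data)          # ...and echoes it back
        return left.recv(1024)       # "client" reads the reply
    finally:
        left.close()
        right.close()

if __name__ == "__main__":
    assert echo_once(b"ping") == b"ping"
```

On Linux, `socketpair()` returns two connected AF_UNIX sockets, which keeps the sketch self-contained with no network setup.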

Soft skills

• Excellent verbal and written communication skills.

• Highly driven, with a positive attitude; a team player who is self-learning, self-motivated and flexible.

• Strong customer focus, with decent networking and relationship management.

• Flair for creativity and innovation.

• Strategic thinking.

This is an individual contributor role and will require client interaction on the technical side.


Must-have skills: Linux, Networking, Python, Cloud

Additional skills: OpenStack, Kubernetes, Shell, Java, Development


Career Forge


2 candid answers
Mohammad Faiz
Posted by Mohammad Faiz
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 7 yrs
₹12L - ₹15L / yr
Python
Apache Spark
PySpark
Data engineering
ETL
+10 more

🚀 Exciting Opportunity: Data Engineer Position in Gurugram 🌐


Hello 


We are actively seeking a talented and experienced Data Engineer to join our dynamic team at Reality Motivational Venture in Gurugram (Gurgaon). If you're passionate about data, thrive in a collaborative environment, and possess the skills we're looking for, we want to hear from you!


Position: Data Engineer  

Location: Gurugram (Gurgaon)  

Experience: 5+ years 


Key Skills:

- Python

- Spark, Pyspark

- Data Governance

- Cloud (AWS/Azure/GCP)


Main Responsibilities:

- Define and set up analytics environments for "Big Data" applications in collaboration with domain experts.

- Implement ETL processes for telemetry-based and stationary test data.

- Support in defining data governance, including data lifecycle management.

- Develop large-scale data processing engines and real-time search and analytics based on time series data.

- Ensure technical, methodological, and quality aspects.

- Support CI/CD processes.

- Foster know-how development and transfer, continuous improvement of leading technologies within Data Engineering.

- Collaborate with solution architects on the development of complex on-premise, hybrid, and cloud solution architectures.
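One hypothetical slice of the telemetry ETL and time-series work described above (the function name, data, and window size are invented for illustration): drop failed readings, then smooth the surviving series with a trailing rolling mean.

```python
from statistics import mean

def etl_rolling_mean(raw, window=3):
    """Clean a raw telemetry series (None marks a failed sample) and
    return a trailing rolling mean over the surviving readings."""
    clean = [x for x in raw if x is not None]            # extract/clean
    return [mean(clean[max(0, i + 1 - window):i + 1])    # transform
            for i in range(len(clean))]

if __name__ == "__main__":
    print(etl_rolling_mean([1, None, 2, 3], window=2))  # [1, 1.5, 2.5]
```

In production this step would sit inside a Spark/PySpark job rather than a list comprehension, but the clean-then-transform shape is the same.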


Qualification Requirements:

- BSc, MSc, MEng, or PhD in Computer Science, Informatics/Telematics, Mathematics/Statistics, or a comparable engineering degree.

- Proficiency in Python and the PyData stack (Pandas/Numpy).

- Experience in high-level programming languages (C#/C++/Java).

- Familiarity with scalable processing environments like Dask (or Spark).

- Proficient in Linux and scripting languages (Bash Scripts).

- Experience in containerization and orchestration of containerized services (Kubernetes).

- Education in database technologies (SQL/OLAP and NoSQL).

- Interest in Big Data storage technologies (Elastic, ClickHouse).

- Familiarity with Cloud technologies (Azure, AWS, GCP).

- Fluent English communication skills (speaking and writing).

- Ability to work constructively with a global team.

- Willingness to travel for business trips during development projects.


Preferable:

- Working knowledge of vehicle architectures, communication, and components.

- Experience in additional programming languages (C#/C++/Java, R, Scala, MATLAB).

- Experience in time-series processing.


How to Apply:

Interested candidates, please share your updated CV/resume with me.


Thank you for considering this exciting opportunity.

Epik Solutions
Sakshi Sarraf
Posted by Sakshi Sarraf
Bengaluru (Bangalore), Noida
4 - 13 yrs
₹7L - ₹18L / yr
Python
SQL
Databricks
Scala
Spark
+2 more

Job Description:


As an Azure Data Engineer, your role will involve designing, developing, and maintaining data solutions on the Azure platform. You will be responsible for building and optimizing data pipelines, ensuring data quality and reliability, and implementing data processing and transformation logic. Your expertise in Azure Databricks, Python, SQL, Azure Data Factory (ADF), PySpark, and Scala will be essential for performing the following key responsibilities:


Designing and developing data pipelines: You will design and implement scalable and efficient data pipelines using Azure Databricks, PySpark, and Scala. This includes data ingestion, data transformation, and data loading processes.
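In plain Python rather than PySpark (a toy stand-in; the record shapes are invented, not from the posting), the three stages named here can be sketched as:

```python
def ingest(rows):
    """Ingestion: yield raw records from a source (here, an in-memory list)."""
    yield from rows

def transform(records):
    """Transformation: normalise names and drop malformed records."""
    for rec in records:
        if rec.get("id") is not None:
            yield {"id": rec["id"], "name": rec.get("name", "").strip().lower()}

def load(records, sink):
    """Loading: write transformed records into the target store."""
    sink.extend(records)
    return sink

if __name__ == "__main__":
    sink = []
    load(transform(ingest([{"id": 1, "name": " Ada "}, {"name": "x"}])), sink)
    print(sink)  # [{'id': 1, 'name': 'ada'}]
```

The generator chain mirrors how a Databricks job streams data between stages instead of materialising every intermediate result.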


Data modeling and database design: You will design and implement data models to support efficient data storage, retrieval, and analysis. This may involve working with relational databases, data lakes, or other storage solutions on the Azure platform.


Data integration and orchestration: You will leverage Azure Data Factory (ADF) to orchestrate data integration workflows and manage data movement across various data sources and targets. This includes scheduling and monitoring data pipelines.


Data quality and governance: You will implement data quality checks, validation rules, and data governance processes to ensure data accuracy, consistency, and compliance with relevant regulations and standards.
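The validation rules mentioned here might look like the following sketch (the rule set and field names are hypothetical, chosen only to show the pattern):

```python
# Each rule maps a required field to a predicate the value must satisfy.
RULES = {
    "id": lambda v: isinstance(v, int) and v > 0,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(record):
    """Return the names of violated rules (an empty list means the
    record passes and may enter the pipeline)."""
    return [field for field, ok in RULES.items()
            if field not in record or not ok(record[field])]
```

Running such checks at ingestion time keeps bad records out of downstream tables instead of repairing them later.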


Performance optimization: You will optimize data pipelines and queries to improve overall system performance and reduce processing time. This may involve tuning SQL queries, optimizing data transformation logic, and leveraging caching techniques.


Monitoring and troubleshooting: You will monitor data pipelines, identify performance bottlenecks, and troubleshoot issues related to data ingestion, processing, and transformation. You will work closely with cross-functional teams to resolve data-related problems.


Documentation and collaboration: You will document data pipelines, data flows, and data transformation processes. You will collaborate with data scientists, analysts, and other stakeholders to understand their data requirements and provide data engineering support.


Skills and Qualifications:


Strong experience with Azure Databricks, Python, SQL, ADF, PySpark, and Scala.

Proficiency in designing and developing data pipelines and ETL processes.

Solid understanding of data modeling concepts and database design principles.

Familiarity with data integration and orchestration using Azure Data Factory.

Knowledge of data quality management and data governance practices.

Experience with performance tuning and optimization of data pipelines.

Strong problem-solving and troubleshooting skills related to data engineering.

Excellent collaboration and communication skills to work effectively in cross-functional teams.

Understanding of cloud computing principles and experience with Azure services.


CyntraLabs
Akanksha Pal
Posted by Akanksha Pal
Noida
8 - 12 yrs
₹14L - ₹20L / yr
SQL Azure
API
Integration
RESTful APIs
HTTP
+5 more

CyntraLabs is focused on providing organizations with unified solutions to maximize business value by utilizing the capabilities offered by emerging technologies. We take pride in providing state-of-the-art solutions in integration and retail that guarantee success in business evolution. Businesses must transform to stay relevant, so we provide our existing and new partners with the foresight to become agile, realize new growth opportunities, and adapt to new technologies.


Key Responsibilities:

1.     Design and develop MuleSoft and Azure architecture solutions based on client requirements

2.     Work with clients to understand their business needs and provide technical guidance on MuleSoft and Azure solutions

3.     Collaborate with other architects, developers, and project managers to ensure seamless integration of MuleSoft and Azure solutions with existing systems

4.     Create and maintain technical documentation for MuleSoft and Azure architecture solutions

5.     Perform architecture assessments and provide recommendations for MuleSoft and Azure solutions

6.     Implement MuleSoft and Azure best practices and standards for architecture, design, and development

7.     Develop custom MuleSoft and Azure components, connectors, and templates to support project requirements

8.     Participate in code reviews and provide feedback to other developers

9.     Provide technical leadership and mentorship to other MuleSoft and Azure developers

10.  Stay up-to-date with MuleSoft and Azure technology trends and best practices

11.  Assist with project scoping, planning, and estimation

12.  Communicate with clients to understand their business requirements and provide technical guidance

13.  Work on Azure-specific projects, including infrastructure architecture, security, and monitoring

14.  Design and implement Azure solutions such as Azure Logic Apps, Functions, Event Grids, and API Management

15.  Work with Azure services such as Azure Blob Storage, Azure SQL, and Cosmos DB

16.  Integrate MuleSoft and Azure services using APIs and connectors
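A minimal, stdlib-only sketch of the request/response glue behind integrating services over APIs (the endpoint shape and field names are invented, not MuleSoft or Azure specifics):

```python
import json

def build_order_payload(order_id, items):
    """Serialise an outbound message for a hypothetical orders API."""
    return json.dumps({"orderId": order_id, "items": items})

def parse_reply(body):
    """Validate and extract the status field from a JSON reply,
    failing loudly on a malformed response."""
    reply = json.loads(body)
    if "status" not in reply:
        raise ValueError("malformed reply: missing 'status'")
    return reply["status"]
```

In a real integration flow, a connector would carry these payloads over HTTP; the serialise/validate discipline is the transferable part.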

 

Eligibility:

 

1.     A Bachelor's degree in Computer Science or related field is preferred

2.     8-10 years of experience in software development, with at least 3 years of experience in MuleSoft and Azure architecture and design

3.     Strong understanding of integration patterns, SOA, and API design principles

4.     Experience with Anypoint Studio and the MuleSoft Anypoint Platform

5.     Hands-on experience with MuleSoft development including creating APIs, connectors, and integration flows

6.     Understanding of RESTful web services, HTTP, JSON, and XML

7.     Familiarity with software development best practices such as Agile and DevOps

8.     Excellent problem-solving skills and ability to troubleshoot complex technical issues

9.     Strong communication and interpersonal skills

10.  Ability to work independently as well as in a team-oriented, collaborative environment

11.  MuleSoft certification as a Solution Architect and Azure certification as an Architect are preferred



Celebal Technologies Pvt Ltd


2 candid answers
Anjani Upadhyay
Posted by Anjani Upadhyay
Jaipur, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune
3 - 7 yrs
₹10L - ₹20L / yr
Platform as a Service (PaaS)
.NET
Microsoft Windows Azure
ASP.NET
ASP.NET MVC
+9 more

Microsoft Azure Data Integration Engineer



Job Description: 

We are looking for a Microsoft Azure Data Integration Developer who will design and build cutting-edge user experiences for our client's consumer-facing desktop application. The Senior Software Developer will work closely with product owners, UX designers, and front-end and back-end developers to help build the next-generation platform.

 

Key Skills:

  • 3+ years of experience in an enterprise or consumer software development environment using C# and designing and supporting Azure environments
  • Expert level programming skills in creating MVC and Microservices architecture
  • Experience with modern frameworks and design patterns like .Net core
  • Strong knowledge of C# language
  • Hands-on experience using the Azure administration portal and iPaaS
  • Demonstrable experience deploying enterprise workloads to Azure
  • Hands-on experience in Azure Function Apps, App Service, Logic Apps and Storage, Azure Key Vault integration, and Azure SQL Database.
  • Solid understanding of object-oriented programming.
  • Experience developing user interfaces and customizing UI controls/components

  • Microsoft Azure Certification, Business Continuity, or Disaster Recovery planning experience is a plus


Responsibilities:

  • Architect, design & build a modern web application for consumers
  • Explore configuring hybrid connectivity between on-premises environments and Azure, and how to monitor network performance to comply with service-level agreements.
  • Collaborate with UI/UX teams to deliver high-performing and easy-to-use applications
  • Participate in code reviews with staff as necessary to ensure a high-quality, performant product
  • Develop a deep understanding of client goals and objectives, and articulate how your solutions address their needs
  • Unit testing/test-driven development
  • Integration testing
  • Deploying Azure VMs (Windows Server) in a highly available environment
  • Regularly reviewing existing systems and making recommendations for improvements.
  • Maintenance
  • Post-deployment production support and troubleshooting


Technical Expertise and Familiarity:

  • Cloud Technologies: Azure, iPaaS
  • Microsoft: .NET Core, ASP.NET, MVC, C#
  • Frameworks/Technologies: MVC, Microservices, Web Services, REST API, Java Script, JQuery, CSS, Testing Frameworks
  • IDEs: Visual Studio, VS Code
  • Databases: MS SQL Server, Azure SQL
  • Familiarity with Agile software development methodologies.
  • Advanced knowledge of using Git source control system.
  • Azure, AWS, and GCP certifications preferred.

 

 

Ahmedabad, Hyderabad, Pune, Delhi
5 - 7 yrs
₹12L - ₹18L / yr
ASP.NET
SQL Azure
Asp.net core
Azure functions
Azure logic app
+4 more

Job Responsibilities: 

  • Technically sound in .NET technology, with good working knowledge of and experience in Web API and SQL Server.
  • Should be able to carry out requirement analysis, design, coding, unit testing, and support to fix defects reported during QA, UAT, and go-live phases.
  • Able to work alone or as part of a team, with minimal or no supervision from delivery leads.
  • Good experience required in the Azure integration stack: Logic Apps, Azure Functions, APIM, and Application Insights.

 

Must have skill 

  • Strong Web API development using ASP.NET Core, Logic Apps, Azure Functions, APIM
  • Azure Functions
  • Azure Logic App
  • Azure APIM
  • Azure ServiceBus

Desirable Skills 

  • Azure Event Grid/Hub
  • Azure KeyVault
  • Azure SQL - knowledge of SQL queries
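As an illustration of the SQL-query knowledge listed above (sqlite3 stands in for Azure SQL here; the table and data are invented), a parameterized query keeps user input out of the SQL text:

```python
import sqlite3

def top_orders(conn, min_total):
    """Return (id, total) rows at or above a threshold, largest first.
    The ? placeholder parameterizes the query safely."""
    cur = conn.execute(
        "SELECT id, total FROM orders WHERE total >= ? ORDER BY total DESC",
        (min_total,),
    )
    return cur.fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [(1, 10.0), (2, 99.5), (3, 42.0)])
    print(top_orders(conn, 40))  # [(2, 99.5), (3, 42.0)]
```

The same query shape runs against Azure SQL through any DB-API driver; only the connection setup differs.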
Hanu


Agency job
Gurgaon/Gurugram, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
7 - 15 yrs
₹25L - ₹45L / yr
Data Warehouse (DWH)
ETL
ADF
Business Intelligence (BI)
Data architecture
+2 more
Responsibilities

* Formulate and recommend standards for achieving maximum performance and efficiency of the DW ecosystem.

* Participate in pre-sales activities for solutions to various customer problem statements/situations.

* Develop business cases and ROI for customers/clients.

* Interview stakeholders and develop a BI roadmap for success, given project prioritization.

* Evangelize self-service BI and visual discovery while helping to automate any manual process at the client site.

* Work closely with the Engineering Manager to ensure prioritization of customer deliverables.

* Champion data quality, integrity, and reliability throughout the organization by designing and promoting best practices.

Implementation (20%)

* Help DW/DE team members with issues needing technical expertise or complex systems and/or programming knowledge.

* Provide on-the-job training for new or less experienced team members.

* Develop a technical excellence team.

Requirements

- Experience designing business intelligence solutions

- Experience with ETL processes and data warehouse architecture

- Experience with Azure Data services, i.e. ADF, ADLS Gen 2, Azure SQL DB, Synapse, Azure Databricks, and Power BI

- Good analytical and problem-solving skills

- Fluent in relational database concepts and flat-file processing concepts

- Must be knowledgeable in software development lifecycles/methodologies
Noida, Delhi, Gurugram, Ghaziabad, Faridabad
1 - 10 yrs
₹5L - ₹30L / yr
Docker
Kubernetes
DevOps
Linux/Unix
SQL Azure
+9 more

Mandatory:
● A minimum of 1 year of development, system design or engineering experience
● Excellent social, communication, and technical skills
● In-depth knowledge of Linux systems
● Development experience in at least two of the following languages: PHP, Go, Python, JavaScript, C/C++, Bash
● In-depth knowledge of web servers (Apache, Nginx preferred)
● Strong in using DevOps tools - Ansible, Jenkins, Docker, ELK
● Knowledge of APM tools; New Relic is preferred
● Ability to learn quickly, master our existing systems and identify areas of improvement
● Self-starter who enjoys and takes pride in the engineering work of their team
● Tried-and-tested real-world cloud computing experience - AWS/GCP/Azure
● Strong understanding of resilient systems design
● Experience in network design and management
pinBox
Agency job
via InvokHR by Sandeepa Kasala
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
11 - 12 yrs
₹17L - ₹25L / yr
Python
Django
PostgreSQL
HTML/CSS
jQuery
+6 more
Position: Senior Software Engineer
Location: Delhi NCR
Opening: Immediate
Search Context
Over 1.8 billion non-salaried informal sector workers globally, and roughly 700m Indians are
not eligible for pension or other social protection benefits. Without an urgent and effective
response to pension exclusion, they face the grim prospect of extreme poverty for over 20
years once they are too old to work.
pinBox is the only global pensionTech committed exclusively to mass-scale digital micropension inclusion among self-employed women and youth. We deploy our white-labelled,
API-enabled pension administration and delivery platform, our unique deployment model and
a simple and intuitive UI/UX to make access to regulated pension, savings and insurance
products easy and simple for non-salaried informal sector workers. We're working actively
with governments, regulators, multilateral aid agencies and leading financial inclusion
stakeholders in Asia and Africa. The pinBox model is already operating in Rwanda, Kenya
and India. We will expand to Bangladesh, Uganda, Chile, Indonesia and Nigeria by 2023.
Governments and pension regulators use our pensionTech to jumpstart digital micropension and insurance inclusion among informal sector workers. Pension funds and insurers
use our pensionTech to build a mass market for their products beyond their traditional agent-led customer base. Banks, MNOs, cooperatives, MFIs, fintech firms and gig-platforms use
our plug-and-play pensionTech to instantly offer an integrated social protection solution to
their clients, members and employees without any new investments in IT or capacity
enhancement.
We’ve recently completed our first equity fundraise to enhance our engineering, business
and delivery capacity and embark on the next stage of pinBox pensionTech development
and expansion. By 2025, we aim to enable and assist 100 million excluded individuals to
start saving for their old age in a secure, affordable and well-regulated environment.
pinBox is looking for senior software engineers who are deeply passionate about using IT to
solve difficult, real-life problems at scale across multiple countries.
The Senior Software Engineer will be expected to
1. Design, code, test, deploy and maintain applications to satisfy business requirements,
2. Plan and implement technical efforts through all phases of the software development
process,
3. Collaborate cross-functionally to make continuous improvements to the pinBox
pensionTech platform,
4. Help drive engineering plans through a broad approach to engineering quality
(consistent and thoughtful patterns, improved observability, unit and integration testing,
etc.),
5. Adhere to national and global architecture standards, risk management and security
policies,
6. Monitor the performance of applications and work with developers to continuously
improve and optimize performance.
The ideal candidate possesses
1. An undergraduate degree in engineering,
2. At least 6 years’ experience as a software engineer or in a similar role,
3. Experience working with distributed version control systems such as Git / Mercurial
4. Frontend: Experience with HTML, CSS, bootstrap, Javascript, Jquery is necessary.
Experience with React / Angular will be an advantage,
5. Backend: Experience with Django/Python, PostgreSQL or any other RDBMS is
mandatory. Experience with Redis will be an advantage,
6. Experience in working with AWS / Azure / Google Cloud,
7. As our applications use a number of third-party micro-services, experience with REST
APIs is necessary; experience with the Indian digital finance ecosystem (UPI, e-KYC) will
be an advantage,
8. Critical thinking and problem-solving skills, and
9. Excellent teamwork and interpersonal skills, a keen eye for detail and the ability to
function effectively and proactively under tight deadlines.
BRAVVURA DIGITAL


1 recruiter
Tina Singh
Posted by Tina Singh
Remote, NCR (Delhi | Gurgaon | Noida)
4 - 6 yrs
₹7L - ₹10L / yr
PHP
Laravel
MySQL
HTML/CSS
Javascript
+8 more
Responsible for coding, testing, deploying and scaling SaaS-based products using PHP and MySQL with the Laravel and Lumen MVC frameworks.
Bengaluru (Bangalore), NCR (Delhi | Gurgaon | Noida), Pune, Mumbai
9 - 20 yrs
₹10L - ₹40L / yr
Windows Azure
Azure Synapse
Data Structures
SQL Azure
QA DB
+1 more

Job title: Azure Architect

Locations: Noida, Pune, Bangalore and Mumbai

 

Responsibilities:

  • Develop and maintain scalable architecture, database design and data pipelines and build out new Data Source integrations to support continuing increases in data volume and complexity
  • Design and Develop the Data lake, Data warehouse using Azure Cloud Services
  • Assist in designing end to end data and Analytics solution architecture and perform POCs within Azure
  • Drive the design, sizing, POC setup, etc. of Azure environments and related services for the use cases and the solutions
  • Review solution requirements and support architecture design to ensure the selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies.
  • Must possess good client-facing experience with the ability to facilitate requirements sessions and lead teams
  • Support internal presentations to technical and business teams
  • Provide technical guidance, mentoring and code review, design level technical best practices

 

Experience Needed:

  • 12-15 years of industry experience, with at least 3 years in an architect role, along with at least 3 to 4 years' experience designing and building analytics solutions in Azure.
  • Experience in architecting data ingestion/integration frameworks capable of processing structured, semi-structured & unstructured data sets in batch & real-time
  • Hands-on experience in the design of reporting schemas, data marts and development of reporting solutions
  • Develop batch processing, streaming and integration solutions and process Structured and Non-Structured Data
  • Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, and Azure Analysis Services and other ETL technologies.
  • Experience performing design, development and deployment using Azure services (Azure Synapse, Data Factory, Azure Data Lake Storage, Databricks, Python and SSIS).
  • Worked with transactional, temporal, time series, and structured and unstructured data.
  • Deep understanding of the operational dependencies of applications, networks, systems, security, and policy (both on-premise and in the cloud; VMs, Networking, VPN (Express Route), Active Directory, Storage (Blob, etc.), Windows/Linux).

 

 

Mandatory Skills: Azure Synapse, Data Factory, Azure Data Lake Storage, Azure DW, Databricks, Python

IT Giant
Remote, Chennai, Bengaluru (Bangalore), Hyderabad, Pune, Mumbai, NCR (Delhi | Gurgaon | Noida), Kolkata
10 - 18 yrs
₹15L - ₹30L / yr
ETL
Informatica
Informatica PowerCenter
Windows Azure
SQL Azure
+2 more
Key skills:
Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake

Job Description
Minimum of 15 years of experience with Informatica ETL and database technologies. Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake. Exposure to change data capture technology.

Lead and guide development of an Informatica-based ETL architecture. Develop solutions in a highly demanding environment and provide hands-on guidance to other team members. Head complex ETL requirements and design, and implement an Informatica-based ETL solution fulfilling stringent performance requirements. Collaborate with product development teams and senior designers to develop architectural requirements. Assess requirements for completeness and accuracy, and determine whether they are actionable for the ETL team. Conduct impact assessments and determine the size of effort based on requirements. Develop full SDLC project plans to implement the ETL solution and identify resource requirements. Play an active, leading role in shaping and enhancing the overall Informatica ETL architecture; identify, recommend and implement ETL process and architecture improvements. Assist with and verify the design of the solution and production of all design-phase deliverables. Manage the build phase and quality-assure code to ensure it fulfils requirements and adheres to the ETL architecture.
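The change-data-capture exposure mentioned above can be pictured with a toy snapshot diff (the keyed-table shape and change labels are invented for illustration; real CDC tools read the database log instead of comparing snapshots):

```python
def cdc_diff(before, after):
    """Compare two snapshots of a keyed table and classify each
    changed key as an insert, update, or delete."""
    changes = []
    for key in after:
        if key not in before:
            changes.append(("insert", key))
        elif before[key] != after[key]:
            changes.append(("update", key))
    # Keys present before but gone now were deleted.
    changes.extend(("delete", key) for key in before if key not in after)
    return changes
```

Downstream ETL then applies only these deltas instead of reloading the whole table.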
Virtual Engineering Services Pvt Ltd
Abha Sachdeva
Posted by Abha Sachdeva
Delhi, NCR (Delhi | Gurgaon | Noida)
2 - 5 yrs
₹4L - ₹6L / yr
.NET
Javascript
RESTful APIs
HTML/CSS
Jasmine (Javascript Testing Framework)
+5 more

 

The Company:

VEST, Inc. was founded in 1997 in Troy, Michigan, USA. We are a niche, leading worldwide commercial provider of innovative design-automation and sales-automation software, data, and engineering design services for the Fluid Power industry. Our products help engineers and hydraulic corporations do their jobs faster, better and at lower cost. VEST has developed a portfolio of software solutions to improve the schematic design, 3-D geometric configuration design, and accurate manufacturing of hydraulic power units and manifold assemblies. VEST has more recently extended these capabilities as cloud-based services for OEMs, retail salespeople, field service technicians, and engineers.

Virtual Engineering Services Pvt. Ltd. (VES) is the exclusive development center for VEST, Inc. USA. We engineer desktop and cloud apps, and enterprise platforms for VEST, Inc. to help create high-quality, scalable, cost-effective solutions and services for major global OEMs. For more details, please visit our website: https://www.vestusa.com

The Opportunity:

 

As a Full Stack Developer, you'll be at the heart of our company: the Engineering Team. You will be the brain behind crafting, developing, testing, shipping and maintaining the products, and will help the Quality Assurance team by squashing those annoying bugs!

Your day-to-day tasks include:

Involvement in SDLC of Microsoft .Net based web applications including analysis, design, development, coordination, implementation and maintenance of the products.
Required to work in a fast-paced agile environment and with a team of dynamic Software Engineers to develop end-to-end applications/products.
You will develop applications using cutting edge technologies like, Blazor, Microsoft .Net Core, RESTful web services, HTML5, CSS3, JavaScript frameworks, SQL Server etc.
You will use .NET Core using Visual Studio Code as a primary technology to develop applications.
You are always willing to work in a constantly changing world and have the ability to adapt to new technologies quickly.

 

What You should have:

You have a bachelor’s degree in computer science or related field with excellent academic records.
You come with a minimum of 2 to 3 years of extensive first-hand work experience in C#, .NET Core, SQL Server, MVC, Razor Pages, JavaScript, CSS and HTML, and you must have a sound knowledge of OOPS concepts.
Proven experience with WCF, REST APIs, LINQ and the Entity Framework.
Experience in Blazor WebAssembly is a plus.
You can understand customers' business requirements, translate those requirements into technology terms, and develop applications.
Detail-oriented and critical-thinking, with experience in software design in a test-driven environment.
You have a good understanding of Azure DevOps and Agile methodologies.
Excellent communication skills, both written and oral; you are a fast learner with the ability to adapt quickly.

 

Why you should work for VES:

We have a strong culture of valuing our employees and promoting an autonomous, transparent, and ethical work environment.
Talented and supportive peers who value your contributions and drive your learning.
Challenging opportunities: learning happens outside the comfort zone, and that is where our team likes to be - always pushing the boundaries and growing personally and professionally.
Transparency: open, honest and direct communication with co-workers and seniors; even our senior management is approachable.

 

PAGO Analytics India Pvt Ltd
Vijay Cheripally
Posted by Vijay Cheripally
Remote, Bengaluru (Bangalore), Mumbai, NCR (Delhi | Gurgaon | Noida)
2 - 8 yrs
₹8L - ₹15L / yr
Python
PySpark
Microsoft Windows Azure
SQL Azure
Data Analytics
+6 more
Be an integral part of large scale client business development and delivery engagements
Develop the software and systems needed for end-to-end execution on large projects
Work across all phases of SDLC, and use Software Engineering principles to build scaled solutions
Build the knowledge base required to deliver increasingly complex technology projects


Object-oriented languages (e.g. Python, PySpark, Java, C#, C++ ) and frameworks (e.g. J2EE or .NET)
Database programming using any flavours of SQL
Expertise in relational and dimensional modelling, including big data technologies
Exposure across all the SDLC process, including testing and deployment
Expertise in Microsoft Azure is mandatory, including components like Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure Databricks, HD Insights, ML Service etc.
Good knowledge of Python and Spark is required
Good understanding of how to enable analytics using cloud technology and ML Ops
Experience in Azure Infrastructure and Azure Dev Ops will be a strong plus