
Windows Azure Jobs in Bangalore (Bengaluru)

Explore top Windows Azure job opportunities in Bangalore (Bengaluru) at top companies and startups. All jobs are added by verified employees, who can be contacted directly below.

Ansible Developer

Location: Remote, Bengaluru (Bangalore)
Experience: 3 - 6 years
Salary: Best in industry (₹4L – ₹9L)

• At least 3 years of hands-on experience with the Red Hat Linux family.
• Virtualization and cloud computing: senior-level virtualization skills; can perform most virtualization tasks with minimal assistance.
• At least 12 months of experience building container-based solutions using one or more of OpenShift, Kubernetes, Docker, Helm.
• At least 6 months of hands-on experience with Linux KVM (libvirt, libguestfs, virsh, qemu, qemu-img, virtio) or equivalent experience with Xen or Oracle VM.
• Understanding of cloud service models (IaaS/PaaS/SaaS) considered a plus.

Ansible - Technical Experience
• Install Ansible / Red Hat Ansible Engine on control nodes.
• Create and update inventories of managed hosts and manage connections to them.
• Automate administration tasks with Ansible playbooks and ad-hoc commands.
• Write effective Ansible playbooks at scale.
• Protect sensitive data used by Ansible with Ansible Vault.
• Reuse code and simplify playbook development with Ansible roles.
• Configure Ansible managed nodes.
• Create and distribute SSH keys to managed nodes.
• Configure privilege escalation on managed nodes.
• Create Ansible plays and playbooks.
• Know how to work with commonly used Ansible modules.
• Use variables to retrieve the results of running commands.

Deep understanding of core Ansible components:
• Inventories
• Modules
• Variables
• Facts
• Plays
• Playbooks
• Configuration files

• Automation development: any kind of automation development in the SDLC to improve effectiveness, preferably using Ruby, Python, bash, etc., that can return JSON for Ansible.
• Working knowledge of software repositories such as GitHub or Bitbucket.
• Cloud: knowledge of OpenStack, VMware, AWS, Azure, SoftLayer, Google Cloud, etc. is helpful.
• Knowledge of and experience with various cloud service and deployment models (IaaS, PaaS, XaaS, on-premise, off-premise, etc.).
• Sysadmin background, with both implementation and support experience across infrastructure, servers, operating systems, middleware and databases.
• Understanding of middleware and databases such as Oracle, SAP, DB2, MySQL, Apache, IIS, etc.
• Solid background in operating system deployment and administration (various Linux/Unix flavours such as RHEL/SLES and AIX, as well as Windows Server).

Nice to have:
• Experience with Chef.
• Experience with VMware technologies.
• Experience in software development projects using agile development methodologies (Scrum).
• Experience with the Azure, AWS and Google public clouds.

Soft skills (100% coverage required):
• Strong spoken and written communication skills.
• Demonstrated ability to work in large, geographically distributed teams.
• Ability to lead small teams technically.
• Client-facing experience and skills.
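The posting above asks for automation that "can return JSON for Ansible". Purely as an illustration (not part of the posting), here is a minimal sketch of a Python dynamic-inventory script in the JSON shape Ansible expects; the group name, host addresses and variables are hypothetical placeholders.

```python
#!/usr/bin/env python3
"""Minimal Ansible dynamic-inventory sketch (hypothetical hosts and groups)."""
import json
import sys


def build_inventory():
    # Static data for the sketch; a real script would query a CMDB or cloud API.
    return {
        "web": {
            "hosts": ["10.0.0.11", "10.0.0.12"],        # placeholder addresses
            "vars": {"ansible_user": "deploy"},          # placeholder group var
        },
        "_meta": {
            "hostvars": {
                "10.0.0.11": {"http_port": 8080},        # placeholder host var
            }
        },
    }


if __name__ == "__main__":
    # Ansible calls an inventory script with --list (whole inventory)
    # or --host <name> (per-host vars; empty here because _meta is supplied).
    if len(sys.argv) > 1 and sys.argv[1] == "--host":
        json.dump({}, sys.stdout)
    else:
        json.dump(build_inventory(), sys.stdout, indent=2)
```

Made executable and passed as `-i inventory.py`, Ansible runs the script with `--list` and reads the group and host data from the JSON on stdout.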

Job posted by Shivani C

Tech Lead

Founded 2007
Location: Remote, Bengaluru (Bangalore), Hyderabad
Experience: 8 - 12 years
Salary: Best in industry (₹16L – ₹25L)

Primary skills (must have):
• .NET, Web API, NUnit/xUnit, Dapper
• Angular 6 – 8
• Microservices
• AWS/Azure experience
• Any messaging queue – MSMQ, RabbitMQ, Kafka
• Database – MSSQL, Postgres
• General architecture and design experience: screen mocks, DB design, workflow design, participation in design discussions

Secondary skills (good to have):
• .NET Core
• Kafka
• Workflow orchestrator – Camunda or any orchestrator
• Linux/Unix experience – reading logs, starting agents
• AWS Lambda, EC2, CloudWatch logs

Job/role description:
• 8+ years' engineering experience across the SDLC
• 2+ years' experience with a microservices architecture
• 1+ years' experience in software design with various messaging systems / service buses, such as Kafka or RabbitMQ
• Understanding of industry best practices and processes associated with software development
• 4+ years' experience in C# and the .NET Framework
• 1+ years' experience in cloud technologies (AWS, Azure)
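As an illustration of the message-queue experience this posting lists (MSMQ, RabbitMQ, Kafka), and not tied to its .NET stack, here is a minimal RabbitMQ publish sketch in Python using the pika library; the broker host, queue name and payload are hypothetical.

```python
# Minimal RabbitMQ sketch with pika: declare a durable queue and publish one message.
import json

import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

channel.queue_declare(queue="orders", durable=True)     # hypothetical queue name

message = {"order_id": 42, "status": "created"}         # hypothetical payload
channel.basic_publish(
    exchange="",                                        # default exchange routes by queue name
    routing_key="orders",
    body=json.dumps(message),
    properties=pika.BasicProperties(delivery_mode=2),   # mark the message persistent
)

connection.close()
```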

Job posted by Gangadhar S

Selenium Testers

Founded 2018
via Nu-Pie
Location: Remote, Bengaluru (Bangalore), Mumbai
Experience: 2 - 4 years
Salary: Best in industry (₹3L – ₹12L)

Job description:
• 3+ years' experience in software testing
• Knowledge/experience of C# and .NET
• Define testing objectives based on acceptance criteria
• Write comprehensive test cases to verify project functionality
• Validate functionality according to requirements/stories
• Demonstrate a strong understanding of software testing practices, strategies and techniques
• Can write clear, sufficiently detailed bug reports; able to track and verify bugs
• Familiar with Azure DevOps and the Scrum process
• Good communication skills, willing to learn and share
• Experience with test automation (Selenium/Protractor or similar)
• Undergraduate degree in Computer Science or a similar discipline
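For illustration only (not part of the posting), a minimal Selenium test sketch in Python; the URL, page title and element locator are hypothetical placeholders.

```python
# Minimal Selenium sketch: open a page, check the title and a login button.
from selenium import webdriver
from selenium.webdriver.common.by import By


def test_login_button_visible():
    driver = webdriver.Chrome()                            # assumes a local Chrome setup
    try:
        driver.get("https://example.com/login")            # placeholder URL
        assert "Login" in driver.title                     # placeholder expectation
        button = driver.find_element(By.ID, "login-btn")   # placeholder locator
        assert button.is_displayed()
    finally:
        driver.quit()
```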

Job posted by Sanjay Biswakarma

Lead DevOps Engineer

Founded 2013
Location: Bengaluru (Bangalore)
Experience: 4 - 6 years
Salary: Best in industry (₹30L – ₹40L)

4+ years of experience with development practices and awareness of leading cloud technologies and trends, in order to formulate a new DevOps product catalog, devise deployment workflows and strategies, and integrate developer tools for static and dynamic code analysis.
• Experience in any cloud, but at least 1 year in Azure is mandatory.
• 2+ years of experience writing scripts for Azure or AWS deployments.
• 1+ years of experience using Kubernetes.
• Infrastructure provisioning expertise in a few tools such as Docker, Chef, Puppet, Ansible, Packer, CloudFormation, Terraform.
• Experience with application servers, web servers, and databases (Nginx, PostgreSQL, MongoDB, HAProxy, Tomcat, Flash Media Server/Red5, Redis, Elasticsearch, etc.).
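As a rough sketch of the "scripts for Azure or AWS deployments" mentioned above (not taken from the posting), here is a minimal Python example using the azure-identity and azure-mgmt-resource packages to create a resource group; the subscription ID, group name and region are placeholders.

```python
# Minimal Azure deployment-script sketch: ensure a resource group exists.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"   # placeholder subscription


def ensure_resource_group():
    credential = DefaultAzureCredential()                  # env vars, managed identity or CLI login
    client = ResourceManagementClient(credential, SUBSCRIPTION_ID)
    group = client.resource_groups.create_or_update(
        "rg-devops-demo",                                  # placeholder resource group name
        {"location": "centralindia"},                      # placeholder region
    )
    print(f"Resource group {group.name} is in {group.location}")


if __name__ == "__main__":
    ensure_resource_group()
```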

Job posted by Ashwini Miniyar

Azure Infrastructure Architect

Location: Remote, Bengaluru (Bangalore)
Experience: 8 - 20 years
Salary: Best in industry (₹25L – ₹50L)

Job description:
Intuitive is the fastest-growing top-tier cloud solutions and services company, supporting global enterprise customers across the Americas, Europe and the Middle East. Intuitive is looking for highly talented, hands-on cloud infrastructure architects to help accelerate our growing Professional Services consulting Cloud & DevOps practice. This is an excellent opportunity to join Intuitive's world-class global technology teams, working with some of the best and brightest engineers while developing your skills and furthering your career working with some of the largest customers.

Key responsibilities and must-have skills:
• Lead the pre-sales (25%) to post-sales (75%) effort building public/hybrid cloud solutions, working collaboratively with Intuitive and client technical and business stakeholders.
• Experience analyzing customers' business and technical requirements, assessing existing environments for cloud enablement, and advising on cloud models, technologies and risk-management strategies.
• Apply creative thinking to determine technical solutions that further business goals and align with corporate technology strategies.
• Extensive knowledge of big-data architectures and patterns; understanding of ingestion patterns for structured, semi-structured and unstructured data; expertise in data movement and security.
• Be a customer advocate with an obsession for excellence, delivering measurable success for Intuitive's customers with secure, scalable, highly available cloud architectures that leverage Azure cloud services.
• Extensive experience building well-architected solutions in line with the Azure Cloud Adoption Framework (DevOps/DevSecOps, database/data warehouse/data lake, app modernization/containers, security, governance, risk, compliance, cost management and operational excellence).
• Experience with application discovery, preferably with tools like Cloudscape/CloudChomp/Device42, to discover application configurations, databases, filesystems and application dependencies.
• Experience with optimal cloud reference architecture, the Well-Architected Framework, cloud readiness assessments, cloud migration assessment and planning, and defining migration patterns and wave planning for application migration (e.g. re-host, re-platform, re-architect).
• Experience architecting and designing Azure cloud services to address scalability, performance, HA, security, availability, compliance, backup and DR, automation, alerting and monitoring, and cost.
• Hands-on experience architecting and deploying landing-zone architecture with CI/CD pipelines, code build and testing technologies, and CI/CD concepts and implementations, ideally including Jenkins, Git, Ansible, CloudFormation, Maven, PCF (Concourse, BOSH) and Azure core services.
• Expertise in building container applications using Docker, Kubernetes, microservices and APIs.
• Hands-on experience migrating applications to Azure using proven tools and processes, including migration, implementation, cutover and rollback plans and execution.
• Hands-on experience writing cloud automation scripts/code such as Ansible, Terraform, etc.
• Hands-on experience with application build/release processes and CI/CD pipelines.
• Deep understanding of agile processes (planning, stand-ups, retros, etc.) and ability to work with cross-functional teams: development, infrastructure, security, performance engineering and QA.
• Experience with serverless platforms is a big plus.
• Experience with data engineering (database, data warehouse, data lake, data modeling, data integration and BI) is a big plus.
• Experience with multi-cloud (AWS, GCP, OCI) is a big plus.
• Experience with VMware Cloud Foundation as well as advanced Windows and Linux engineering is a big plus.
• Experience with on-prem data engineering (database, data warehouse, data lake) is a big plus.

Additional requirements:
• Good consultative, communication, team-player and analytical skills are a must, as you will regularly interact with teams distributed across the globe.
• Good technical leadership skills and relevant experience guiding other highly technical resources through adopting new technologies while maintaining highly available, mission-critical systems.
• Deep technical knowledge and extensive experience with complex IT environments, including applications, middleware, storage, networks, information security and operations, as well as a solid understanding of cloud services, cloud-native architectures and application delivery models.
• In-depth knowledge and understanding of container and microservices technologies and related architectures.
• Ability to clearly communicate, verbally and in writing, with business and technology leaders, architects, developers and team members.
• Must be able to collaborate effectively with a group of high-performing, technical individuals.
• Hands-on experience with agile, DevOps and continuous integration / continuous delivery.
• Demonstrated ability to support cloud architecture recommendations and decisions with research and reasoning.
• Strong hands-on scripting/development skills in Python, Ruby, Go, Java, JavaScript, etc. in a corporate environment.
• Hands-on experience with Terraform, Kubernetes, Jenkins, Kafka, GitHub, and configuration management tools such as Puppet, Chef or Ansible.
• Experience architecting, implementing and maintaining highly available, mission-critical environments for 24x7 availability.
• Experience working in an environment with a defined production change-control process and a mature security-controls posture.
• Relevant experience with configuration and implementation of IaaS, infrastructure as code, AWS, Azure, etc.
• Knowledge of cloud architecture and implementation features (OS, multi-tenancy, virtualization, orchestration, elastic scalability) and DevOps toolchains and processes.
• Strong understanding of SOA, object-oriented analysis and design, and client/server systems.
• Demonstrated history of working within deadlines and ability to work well under pressure.
• Experience working in financial services or another highly regulated environment preferred.
• Bachelor's degree, preferably in a technical discipline (Computer Science, Mathematics, etc.), or an equivalent combination of education and experience required.
• 7 or more years of experience in IT systems installation, operations, design and architecture of virtualized servers / cloud systems.
• Azure Architect Expert certification or higher strongly desired; relevant industry certifications such as AWS, Microsoft Azure or Google Cloud are a plus.

Job posted by Aakriti Gupta

IBM Developer

Founded 2018
via Nu-Pie
Location: Bengaluru (Bangalore)
Experience: 5 - 10 years
Salary: Best in industry (₹16L – ₹25L)

• Strong hands-on experience with IBM ISAM and MFP products
• Strong experience configuring MFP backend services
• Creation of integration adapters, SQL adapters and HTTP adapters
• Multiple types of authentication
• Working knowledge of IBM products
• Develop application architecture and user-interface design
• Experience with IBM WAS and DB2/MySQL

Job posted by Suman Surekha

Data Engineer- SQL+PySpark

Founded 2015
Location: Remote, Bengaluru (Bangalore)
Experience: 1 - 5 years
Salary: Best in industry (₹5L – ₹15L)

Must-have skills:
• Good experience in PySpark, including DataFrame core functions and Spark SQL
• Good experience with SQL databases; able to write queries of fair complexity
• Excellent experience in big-data programming for data transformation and aggregation
• Good at ELT architecture: business-rules processing and data extraction from a data lake into data streams for business consumption
• Good customer communication
• Good analytical skills

Technology skills (good to have):
• Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse / Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub / IoT Hub
• Experience migrating on-premise data warehouses to data platforms on the Azure cloud
• Designing and implementing data engineering, ingestion and transformation functions using Azure Synapse or Azure SQL Data Warehouse
• Spark on Azure, as available in HDInsight and Databricks
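For illustration only (not taken from the posting), a minimal PySpark sketch showing the DataFrame API and the equivalent Spark SQL side by side; the file path, column names and threshold are hypothetical.

```python
# Minimal PySpark sketch: the same aggregation via the DataFrame API and Spark SQL.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-demo").getOrCreate()

# Hypothetical input: a CSV of orders with customer_id and amount columns.
orders = spark.read.csv("/data/orders.csv", header=True, inferSchema=True)

# DataFrame API: total amount per customer, keeping totals above 1000.
totals = (
    orders.groupBy("customer_id")
          .agg(F.sum("amount").alias("total_amount"))
          .filter(F.col("total_amount") > 1000)
)

# Equivalent Spark SQL over a temporary view.
orders.createOrReplaceTempView("orders")
totals_sql = spark.sql("""
    SELECT customer_id, SUM(amount) AS total_amount
    FROM orders
    GROUP BY customer_id
    HAVING SUM(amount) > 1000
""")

totals.show()
totals_sql.show()
spark.stop()
```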

Job posted by Evelyn Charles

Backend service developer

Founded 2018
via Nu-Pie
Location: Remote, Bengaluru (Bangalore), Mumbai, Pune
Experience: 3 - 6 years
Salary: Best in industry (₹4L – ₹18L)

Desired skills:
• Azure Kubernetes
• Writing services in Node.js / Express.js to be deployed in Docker containers
• Experience in Docker configuration and builds
• Microservices development experience
• CI/CD experience as a developer

Job posted by Sanjay Biswakarma

Sr. SharePoint Developer

Founded 2018
via Nu-Pie
Location: Remote, Mumbai, Pune, Bengaluru (Bangalore), NCR (Delhi | Gurgaon | Noida), Hyderabad
Experience: 4 - 6 years
Salary: Best in industry (₹5L – ₹16L)

Sr. SharePoint Developer

Responsible for developing custom solutions in SharePoint based on customer requirements, following the software development life cycle. Should be able to develop the solution independently by following the design. Years of experience: between 4 and 6.

Responsibilities:
• Development of custom solutions based on different requirements.
• Extensive experience with SharePoint OOTB features; able to apply the required branding using JavaScript, jQuery and CSS.
• Take complete ownership of solution development, including detailed design and adherence to coding standards and development guidelines.

Technical skills required:
• Extensive knowledge of developing custom solutions in SharePoint Online/2013 using the client-side (JSOM, .NET and REST API) and server-side object models.
• In-depth knowledge of developing custom web parts and branding using master pages, page layouts and style sheets.
• Deep understanding of C#, ASP.NET and OOP concepts.
• Good knowledge of the search components.
• Good understanding of the different authentication options in SharePoint.
• Good understanding of concepts related to security, permissions management, content types and other OOTB features.
• Good understanding of HTML5, CSS3, jQuery and ReactJS.
• Knowledge of SharePoint Framework development; should have worked on client-side web part development.
• Knowledge of modern site and modern page customization.
• Knowledge of Azure App Service development and AAD authentication.

Additional skills:
• Knowledge of Team Foundation Server is required.
• Knowledge of developing custom solutions using Azure and Office 365 components: App Services, Flow, Teams, Planner.
• Experience with JavaScript libraries (Angular) is an added advantage.

Job posted by Sanjay Biswakarma

Azure Architect

Founded 2018
via Nu-Pie
Location: Bengaluru (Bangalore)
Experience: 6 - 9 years
Salary: Best in industry (₹10L – ₹18L)

1) 6-9 years of industry experience, with at least 4 years in an architect role, along with at least 3-5 years' experience designing and building analytics/data solutions in Azure.
2) Demonstrated in-depth skills with Azure Data Factory (ADF), Azure SQL Server, Azure Synapse and ADLS, with the ability to configure and administer all aspects of Azure SQL Server.
3) Demonstrated experience delivering multiple data solutions as an architect.
4) Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, and related Microsoft and other ETL technologies.
5) DP-200 and DP-201 certifications preferred.
6) Good to have hands-on experience in Power BI and Azure Databricks.
7) Should have good communication and presentation skills.

Job posted by Jerrin Thomas

Software Engineer

Founded 2015
via Dremio
Location: Remote, Bengaluru (Bangalore), Hyderabad
Experience: 3 - 10 years
Salary: Best in industry (₹15L – ₹65L)

Be part of building the future.
Dremio is the Data Lake Engine company. Our mission is to reshape the world of analytics to deliver on the promise of data, with a fundamentally new architecture purpose-built for the exploding trend toward cloud data lake storage such as AWS S3 and Microsoft ADLS. We dramatically reduce and even eliminate the need for the complex and expensive workarounds that have been in use for decades, such as data warehouses (whether on-premise or cloud-native), structural data prep, ETL, cubes, and extracts. We do this by enabling lightning-fast queries directly against data lake storage, combined with full self-service for data users and full governance and control for IT. The results for enterprises are extremely compelling: 100x faster time to insight, 10x greater efficiency, zero data copies, and game-changing simplicity. And equally compelling is the market opportunity for Dremio, as we are well on our way to disrupting a $25BN+ market.

About the role:
The Dremio India team owns the Data Lake Engine along with the cloud infrastructure and services that power it. With a focus on next-generation data analytics supporting modern table formats like Iceberg and Delta Lake, open-source initiatives such as Apache Arrow and Project Nessie, and hybrid-cloud infrastructure, this team provides many opportunities to learn, deliver, and grow in your career. We are looking for innovative minds with experience in leading and building high-quality distributed systems at massive scale and solving complex problems.

Responsibilities and ownership:
• Lead, build, deliver and ensure customer success of next-generation features related to the scalability, reliability, robustness, usability, security, and performance of the product.
• Work on distributed systems for data processing with efficient protocols and communication, locking and consensus, schedulers, resource management, low-latency access to distributed storage, auto-scaling, and self-healing.
• Understand and reason about concurrency and parallelization to deliver scalability and performance in a multithreaded and distributed environment.
• Lead the team to solve complex and unknown problems.
• Solve technical problems and customer issues with technical expertise.
• Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure.
• Mentor other team members for high-quality work and design.
• Collaborate with Product Management to deliver on customer requirements and innovation.
• Collaborate with Support and field teams to ensure that customers are successful with Dremio.

Requirements:
• B.S./M.S./equivalent in Computer Science or a related technical field, or equivalent experience.
• Fluency in Java/C++ with 8+ years of experience developing production-level software.
• Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models, and their use in developing distributed and scalable systems.
• 5+ years of experience developing complex and scalable distributed systems and delivering, deploying, and managing microservices successfully.
• Hands-on experience in query processing or optimization, distributed systems, concurrency control, data replication, code generation, networking, and storage systems.
• Passion for quality, zero-downtime upgrades, availability, resiliency, and uptime of the platform.
• Passion for learning and delivering using the latest technologies.
• Ability to solve ambiguous, unexplored, and cross-team problems effectively.
• Hands-on experience working on projects on AWS, Azure, and Google Cloud Platform.
• Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and Google Cloud).
• Understanding of distributed file systems such as S3, ADLS, or HDFS.
• Excellent communication skills and an affinity for collaboration and teamwork.
• Ability to work individually and collaboratively with other team members.
• Ability to scope and plan solutions for big problems and mentor others on the same.
• Interested and motivated to be part of a fast-moving startup with a fun and accomplished team.
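The role above mentions Apache Arrow; purely as an illustration (not part of the posting), here is a minimal PyArrow sketch that builds a columnar in-memory table and writes it to Parquet; the column names and file path are hypothetical.

```python
# Minimal Apache Arrow sketch: build a columnar table and persist it as Parquet.
import pyarrow as pa
import pyarrow.parquet as pq

# Hypothetical columns of query telemetry.
table = pa.table({
    "query_id": pa.array([1, 2, 3], type=pa.int64()),
    "latency_ms": pa.array([12.5, 48.0, 7.3], type=pa.float64()),
    "engine": pa.array(["dremio", "dremio", "dremio"]),
})

pq.write_table(table, "/tmp/queries.parquet")   # columnar file readable by data lake engines
print(table.schema)
```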

Job posted by Maharaja Subramanian (CW)

DevOps Engineer

Founded 2018
via Nu-Pie
Location: Remote, Mumbai, Bengaluru (Bangalore), NCR (Delhi | Gurgaon | Noida), Pune, Hyderabad
Experience: 2 - 6 years
Salary: Best in industry (₹3L – ₹12L)

Job description:
Must have:
• DevOps, Jenkins, Terraform, shell scripts
• Azure cloud computing
• Knowledge of code-versioning tools such as Git, Bitbucket, etc.
• Good communication, both verbal and non-verbal
• Should be a team player

Good to have:
• Proficiency in the Azure cloud platform
• Knowledge of build and deployment tools such as Jenkins, Ansible, etc.

4 to 6 years of development experience in the areas above. Band: B2.
1) Must have (top 3 skills): DevOps, Jenkins and Terraform
2) Good to have: Azure cloud computing, Git and shell scripts

Job posted by Sanjay Biswakarma

Senior DevOps Engineer

Founded 2018
Location: Bengaluru (Bangalore)
Experience: 7 - 10 years
Salary: Best in industry (₹25L – ₹35L)

Hypersonix.ai is disrupting the analytics space with AI, ML and NLP capabilities to drive real-time business insights through a conversational user experience, enabling decisioning at the speed of thought. Hypersonix.ai has been built from the ground up with new-age technology to simplify the consumption of data for our customers in various industry verticals. Hypersonix.ai is seeking a well-rounded, hands-on DevOps engineer for:
● Managing deployments seamlessly on multi-cloud architectures, including AWS, Azure and GCP
● Enabling growth and scalability of the company's infrastructure based on industry best practices and cutting-edge technologies

Roles and responsibilities:
● Manage systems on AWS infrastructure, including application servers and database servers
● Proficiency with EC2, Redshift, RDS, Elasticsearch, MongoDB and other AWS services
● Proficiency with managing a distributed service architecture with multiple microservices, including maintaining dev, QA, staging and production environments, managing zero-downtime releases, ensuring failure rollbacks with zero downtime, and scaling on demand
● Containerization of workloads and rapid deployment
● Driving cost optimization while balancing performance
● Manage high availability of existing systems and proactively address system maintenance issues
● Manage AWS infrastructure configurations along with application load balancers, HTTPS configurations, and network configurations (VPCs)
● Work with the software engineering team to automate code deployment
● Build and maintain tools for deployment, monitoring and operations, and troubleshoot and resolve issues in our dev, test and production environments
● Familiarity with managing Spark-based ETL pipelines is a plus
● Experience in managing a team of DevOps engineers

Required qualifications:
● Bachelor's or Master's degree in a quantitative field
● Cloud computing experience with Amazon Web Services (AWS); a bonus if you've worked on Azure, GCP and on cost optimization
● Prior experience working on a distributed microservices architecture and containerization
● Strong background in Linux/Windows administration and scripting
● Experience with CI/CD pipelines, Git, deployment configuration and monitoring tools
● Working understanding of the various components of web architecture
● A working understanding of code and scripts (JavaScript, Angular, Python)
● Excellent communication, problem-solving and troubleshooting skills

Job type: Full-time
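For illustration only (not part of the posting), a minimal boto3 sketch of the kind of AWS management scripting described above; the region and tag filter are hypothetical.

```python
# Minimal AWS scripting sketch with boto3: list running EC2 instances by tag.
import boto3


def list_running_instances(region="ap-south-1"):            # placeholder region
    ec2 = boto3.client("ec2", region_name=region)
    response = ec2.describe_instances(
        Filters=[
            {"Name": "instance-state-name", "Values": ["running"]},
            {"Name": "tag:Environment", "Values": ["production"]},   # hypothetical tag
        ]
    )
    for reservation in response["Reservations"]:
        for instance in reservation["Instances"]:
            print(instance["InstanceId"], instance["InstanceType"])


if __name__ == "__main__":
    list_running_instances()
```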

Job posted by Gowshini Maheswaran

Cloud DevOps Engineer

Founded 2018
via Nu-Pie
Location: Bengaluru (Bangalore)
Experience: 3 - 5 years
Salary: Best in industry (₹6L – ₹9L)

• Minimum 4 years of experience; Ansible with 3+ years hands-on
• Any one container platform, preferably Kubernetes
• Any one cloud, preferably Azure
• Overall 3-5 years of experience

Job posted by Suman Surekha

DevOps Engineer

Founded 2018
via Nu-Pie
Location: Remote, Bengaluru (Bangalore)
Experience: 3 - 5 years
Salary: Best in industry (₹4L – ₹12L)

• Minimum 4 years of experience; Ansible with 3+ years hands-on
• Any one container platform, preferably Kubernetes
• Any one cloud, preferably Azure
• Overall, 3 to 5 years' experience

Job posted by Sanjay Biswakarma

Azure Architect

Founded 2018
Location: Bengaluru (Bangalore), NCR (Delhi | Gurgaon | Noida), Pune, Mumbai
Experience: 9 - 20 years
Salary: Best in industry (₹10L – ₹40L)

Job title: Azure Architect
Locations: Noida, Pune, Bangalore and Mumbai

Responsibilities:
• Develop and maintain scalable architecture, database design and data pipelines, and build out new data-source integrations to support continuing increases in data volume and complexity.
• Design and develop the data lake and data warehouse using Azure cloud services.
• Assist in designing end-to-end data and analytics solution architecture and perform POCs within Azure.
• Drive the design, sizing, POC setup, etc. of Azure environments and related services for the use cases and solutions.
• Review solution requirements and support architecture design to ensure the selection of appropriate technology, efficient use of resources and integration of multiple systems and technologies.
• Must possess good client-facing experience with the ability to facilitate requirements sessions and lead teams.
• Support internal presentations to technical and business teams.
• Provide technical guidance, mentoring, code review, and design-level technical best practices.

Experience needed:
• 12-15 years of industry experience, with at least 3 years in an architect role, along with at least 3 to 4 years' experience designing and building analytics solutions in Azure.
• Experience architecting data ingestion/integration frameworks capable of processing structured, semi-structured and unstructured data sets in batch and real time.
• Hands-on experience in the design of reporting schemas and data marts and the development of reporting solutions.
• Develop batch-processing, streaming and integration solutions, and process structured and non-structured data.
• Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, Azure Analysis Services and other ETL technologies.
• Experience performing design, development and deployment using Azure services (Azure Synapse, Data Factory, Azure Data Lake Storage, Databricks, Python and SSIS).
• Worked with transactional, temporal, time-series, structured and unstructured data.
• Deep understanding of the operational dependencies of applications, networks, systems, security and policy, both on-premise and in the cloud: VMs, networking, VPN (ExpressRoute), Active Directory, storage (Blob, etc.), Windows/Linux.

Mandatory skills: Azure Synapse, Data Factory, Azure Data Lake Storage, Azure DW, Databricks, Python

Job posted by Rattan Saini

ETL Architect

Founded 1995
Location: Remote, Chennai, Bengaluru (Bangalore), Hyderabad, Pune, Mumbai, NCR (Delhi | Gurgaon | Noida), Kolkata
Experience: 10 - 18 years
Salary: Best in industry (₹15L – ₹30L)

Key skills: Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake

Job description:
• Minimum of 15 years of experience with Informatica ETL and database technologies.
• Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake.
• Exposure to change data capture technology.
• Lead and guide development of an Informatica-based ETL architecture.
• Develop solutions in a highly demanding environment and provide hands-on guidance to other team members.
• Head complex ETL requirements and design.
• Implement an Informatica-based ETL solution fulfilling stringent performance requirements.
• Collaborate with product development teams and senior designers to develop architectural requirements.
• Assess requirements for completeness and accuracy, and determine whether they are actionable for the ETL team.
• Conduct impact assessments and determine the size of effort based on requirements.
• Develop full SDLC project plans to implement the ETL solution and identify resource requirements.
• Take an active, leading role in shaping and enhancing the overall Informatica ETL architecture; identify, recommend and implement ETL process and architecture improvements.
• Assist with and verify the design of the solution and the production of all design-phase deliverables.
• Manage the build phase and quality-assure code to ensure it fulfils requirements and adheres to the ETL architecture.

Job posted by Jayaraj E

Azure Developer – HDInsight

Founded 2015
Location: Bengaluru (Bangalore)
Experience: 3 - 9 years
Salary: Best in industry (₹8L – ₹16L)

• Working knowledge of setting up and running HDInsight applications
• Hands-on experience in Spark, Scala and Hive
• Hands-on experience in ADF (Azure Data Factory)
• Hands-on experience in Big Data and Hadoop ecosystems
• Exposure to Azure service categories such as PaaS components and IaaS subscriptions
• Ability to design and develop ingestion and processing frameworks for ETL applications
• Hands-on experience in PowerShell scripting and deployment on Azure
• Experience in performance tuning and memory configuration
• Should be adaptable to learning and working on new technologies
• Good written and spoken communication

Job posted by Harpreet Kour