11+ Visual Studio Jobs in Hyderabad | Visual Studio Job openings in Hyderabad
Apply to 11+ Visual Studio Jobs in Hyderabad on CutShort.io. Explore the latest Visual Studio Job opportunities across top companies like Google, Amazon & Adobe.
- 5+ years of experience designing, developing, validating, and automating ETL processes
- 3+ years of experience with traditional ETL tools such as Visual Studio, SQL Server Management Studio, SSIS, SSAS and SSRS
- 2+ years of experience with cloud technologies and platforms such as Kubernetes, Spark, Kafka, Azure Data Factory, Snowflake, MLflow, Databricks, Airflow or similar
- Must have experience designing and implementing data access layers
- Must be an expert in SQL/T-SQL and Python
- Must have experience with Kafka
- Define and implement data models with various database technologies like MongoDB, CosmosDB, Neo4j, MariaDB and SQL Server
- Ingest and publish data from sources and to destinations via an API
- Exposure to ETL/ELT using Kafka or Azure Event Hubs with Spark or Databricks is a plus
- Exposure to healthcare technologies and integrations such as the FHIR API, HL7 or other HIE protocols is a plus
Skills Required :
Designing, Developing, ETL, Visual Studio, Python, Spark, Kubernetes, Kafka, Azure Data Factory, SQL Server, Airflow, Databricks, T-SQL, MongoDB, CosmosDB, Snowflake, SSIS, SSAS, SSRS, FHIR API, HL7, HIE Protocols
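To make the ETL expectations above concrete, here is a minimal, illustrative PySpark sketch of an extract-transform-load step. The landing path, curated path and column names are hypothetical placeholders, not part of the actual role; a comparable flow could just as well be built with SSIS or Azure Data Factory.

```python
# A minimal ETL sketch, assuming a hypothetical CSV landing zone and a
# Parquet curated zone; paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw records from the landing path
raw = spark.read.option("header", "true").csv("/data/landing/orders.csv")

# Transform: basic typing and cleansing
cleaned = (
    raw.withColumn("order_total", F.col("order_total").cast("double"))
       .filter(F.col("order_id").isNotNull())
)

# Load: write to the curated zone, partitioned by order date
cleaned.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders")

spark.stop()
```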
- Experience with Infrastructure-as-Code (IaC) tools like Terraform and CloudFormation.
- Proficiency in cloud-native technologies and architectures (Docker/Kubernetes) and CI/CD pipelines.
- Good experience in JavaScript.
- Expertise in Linux/Windows environments.
- Good experience in scripting languages like PowerShell, Bash and Python.
- Proficiency with revision control tools like Git and with DevOps best practices.
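As a small illustration of combining the scripting and Infrastructure-as-Code skills listed above, the sketch below drives Terraform from Python. It assumes Terraform is on the PATH and that the hypothetical ./infra directory contains an already initialised configuration; this is an illustrative workflow, not a prescribed one.

```python
# A minimal sketch, assuming Terraform is installed and `terraform init`
# has already been run in the working directory; the path is hypothetical.
import subprocess

def terraform_plan(workdir: str) -> None:
    # Validate the configuration, then produce a plan file; check=True makes
    # a CI job fail fast if either step errors out.
    subprocess.run(["terraform", "validate"], cwd=workdir, check=True)
    subprocess.run(["terraform", "plan", "-out=tfplan"], cwd=workdir, check=True)

if __name__ == "__main__":
    terraform_plan("./infra")
```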
Key Skills Required for Lead DevOps Engineer
Containerization Technologies: Docker, Kubernetes, OpenShift
Cloud Technologies: AWS/Azure, GCP
CI/CD Pipeline Tools: Jenkins, Azure DevOps
Configuration Management Tools: Ansible, Chef
SCM Tools: Git, GitHub, Bitbucket
Monitoring Tools: New Relic, Nagios, Prometheus
Cloud Infra Automation: Terraform
Scripting Languages: Python, Shell, Groovy
· Ability to decide the architecture and tools for the project as per availability
· Sound knowledge of deployment strategies and the ability to define timelines
· Team handling skills are a must
· Debugging skills are an advantage
· Good to have knowledge of databases like MySQL, PostgreSQL
· Good to have familiarity with Kafka and RabbitMQ
· Good to have knowledge of web servers to deploy web applications
· Good to have knowledge of code quality checking tools like SonarQube and vulnerability scanning
· Experience in DevSecOps is an advantage
Note: Some of the tools listed above are a must; the others are an added advantage
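By way of illustration of the containerization and monitoring skills listed above, here is a minimal sketch that uses the official kubernetes Python client to report deployment health. It assumes cluster access via a local kubeconfig; the namespace is a placeholder.

```python
# A minimal sketch, assuming the `kubernetes` Python client and a kubeconfig
# with access to the cluster; the namespace below is hypothetical.
from kubernetes import client, config

config.load_kube_config()  # use config.load_incluster_config() inside a pod
apps = client.AppsV1Api()

# Report ready vs. desired replicas for each deployment in the namespace
for dep in apps.list_namespaced_deployment(namespace="default").items:
    ready = dep.status.ready_replicas or 0
    desired = dep.spec.replicas or 0
    print(f"{dep.metadata.name}: {ready}/{desired} replicas ready")
```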
- Extensive experience in designing & supporting Azure Managed Services Operations.
- Maintaining Azure Active Directory and Azure AD authentication.
- Azure Update Management: handling updates/patching.
- Good understanding of Azure services (Azure App Service, Azure SQL, Azure Storage Accounts, etc.).
- Understanding of load balancers, DNS, virtual networks, NSGs and firewalls in a cloud environment.
- Writing ARM templates and setting up automation for resource provisioning.
- Knowledge of Azure Automation and Desired State Configuration (DSC).
- Good understanding of High Availability and Auto Scaling.
- Azure Backups and ASR (Azure Site Recovery)
- Azure monitoring and configuration monitoring (performance metrics, OMS)
- Cloud migration experience (on-premise to cloud).
- PowerShell scripting for automating custom tasks.
- Strong experience in configuring, maintaining, and troubleshooting Microsoft-based production systems.
Certification:
Azure Administrator (AZ-103) & Azure Architect (AZ-300 & AZ-301)
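The listing above emphasises PowerShell; as an illustrative alternative in Python, the sketch below uses the Azure SDK to inventory virtual machines, the kind of starting point that update-management or patch reporting automation might build on. The subscription ID is a placeholder, and credentials are assumed to be available to DefaultAzureCredential (environment variables, CLI login, or managed identity).

```python
# A minimal sketch, assuming azure-identity and azure-mgmt-compute are
# installed; the subscription ID is a placeholder.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

credential = DefaultAzureCredential()
compute = ComputeManagementClient(credential, subscription_id="<subscription-id>")

# Inventory VMs as a starting point for patch/update reporting
for vm in compute.virtual_machines.list_all():
    print(vm.name, vm.location)
```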
at Altimetrik
DevOps Architect
Experience: 10-12+ years of relevant experience in DevOps
Locations: Bangalore, Chennai, Pune, Hyderabad, Jaipur.
Qualification:
• Bachelor's or advanced degree in Computer Science, Software Engineering or equivalent is required.
• Certifications in specific areas are desired
Technical Skillset (skill - proficiency level):
- Build tools (Ant or Maven) - Expert
- CI/CD tool (Jenkins or Github CI/CD) - Expert
- Cloud DevOps (AWS CodeBuild, CodeDeploy, Code Pipeline etc) or Azure DevOps. - Expert
- Infrastructure As Code (Terraform, Helm charts etc.) - Expert
- Containerization (Docker, Docker Registry) - Expert
- Scripting (Linux) - Expert
- Cluster deployment (Kubernetes) & maintenance - Expert
- Programming (Java) - Intermediate
- Application Types for DevOps (Streaming like Spark, Kafka, Big data like Hadoop etc) - Expert
- Artifactory (JFrog) - Expert
- Monitoring & Reporting (Prometheus, Grafana, PagerDuty etc.) - Expert
- Ansible, MySQL, PostgreSQL - Intermediate
• Source Control (like Git, Bitbucket, Svn, VSTS etc)
• Continuous Integration (like Jenkins, Bamboo, VSTS)
• Infrastructure Automation (like Puppet, Chef, Ansible)
• Deployment Automation & Orchestration (like Jenkins, VSTS, Octopus Deploy)
• Container Concepts (Docker)
• Orchestration (Kubernetes, Mesos, Swarm)
• Cloud (like AWS, Azure, Google Cloud, OpenStack)
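As a brief illustration of the container concepts listed above, here is a minimal sketch using the docker Python SDK (docker-py). It assumes a local Docker daemon is running; the image and command are illustrative only.

```python
# A minimal sketch, assuming Docker is running locally and the `docker`
# Python SDK is installed; the image/command are illustrative.
import docker

client = docker.from_env()

# Run a short-lived container and capture its output
output = client.containers.run("alpine:3.19", "echo hello-from-container", remove=True)
print(output.decode().strip())

# List currently running containers
for c in client.containers.list():
    print(c.name, c.status)
```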
Roles and Responsibilities
• As DevOps architect, automate processes with the proper tools.
• Developing appropriate DevOps channels throughout the organization.
• Evaluating, implementing and streamlining DevOps practices.
• Establishing a continuous build environment to accelerate software deployment and development processes.
• Engineering general and effective processes.
• Helping operations and development teams solve their problems.
• Supervising, examining and handling technical operations.
• Providing DevOps processes and operations.
• Capacity to lead and manage teams.
• Must possess excellent automation skills and the ability to drive initiatives to automate processes.
• Building strong cross-functional leadership skills and working together with the operations and engineering teams to make sure that systems are scalable and secure.
• Excellent knowledge of software development and software testing methodologies along with configuration management practices in Unix and Linux-based environment.
• Possess sound knowledge of cloud-based environments.
• Experience in handling automated deployment CI/CD tools.
• Must possess excellent knowledge of infrastructure automation tools (Ansible, Chef, and Puppet).
• Hands-on experience in working with Amazon Web Services (AWS).
• Must have strong expertise in operating Linux/Unix environments and scripting languages like Python, Perl, and Shell.
• Ability to review deployment and delivery pipelines i.e., implement initiatives to minimize chances of failure, identify bottlenecks and troubleshoot issues.
• Previous experience in implementing continuous delivery and DevOps solutions.
• Experience in designing and building solutions to move data and process it.
• Must possess expertise in any of the coding languages depending on the nature of the job.
• Experience with containers and container orchestration tools (AKS, EKS, OpenShift, Kubernetes, etc.)
• Experience with version control systems is a must (Git an advantage)
• Belief in "Infrastructure as Code" (IaC), including experience with open-source tools such as Terraform
• Treats best practices for security as a requirement, not an afterthought
• Extensive experience with version control systems like GitLab and their use in release management, branching, merging, and integration strategies
• Experience working with Agile software development methodologies
• Proven ability to work on cross-functional Agile teams
• Mentor other engineers in best practices to improve their skills
• Creating suitable DevOps channels across the organization.
• Designing efficient practices.
• Delivering comprehensive best practices.
• Managing and reviewing technical operations.
• Ability to work independently and as part of a team.
• Exceptional communication skills, be knowledgeable about the latest industry trends, and highly innovative
- Collaborate with Dev, QA and Data Science teams on environment maintenance, monitoring (ELK, Prometheus or equivalent), deployments and diagnostics
- Administer a hybrid datacenter, including AWS and EC2 cloud assets
- Administer, automate and troubleshoot container based solutions deployed on AWS ECS
- Be able to troubleshoot problems and provide feedback to engineering on issues
- Automate deployment (Ansible, Python), build (Git, Maven, Make, or equivalent) and integration (Jenkins, Nexus) processes
- Learn and administer technologies such as ELK, Hadoop etc.
- Be a self-starter with the enthusiasm to learn and pick up new technologies in a fast-paced environment.
Need to have
- Hands-on Experience in Cloud based DevOps
- Experience working in AWS (EC2, S3, CloudFront, ECR, ECS etc)
- Experience with any programming language.
- Experience using Ansible, Docker, Jenkins, Kubernetes
- Experience in Python.
- Should be very comfortable working in Linux/Unix environment.
- Exposure to Shell Scripting.
- Solid troubleshooting skills
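To illustrate the kind of container administration on AWS ECS described above, here is a minimal boto3 sketch that inspects a service and forces a new deployment. The cluster and service names are placeholders, and AWS credentials are assumed to be configured in the environment.

```python
# A minimal sketch, assuming configured AWS credentials; cluster and
# service names are placeholders.
import boto3

ecs = boto3.client("ecs")

# Inspect a service's running vs. desired task counts
desc = ecs.describe_services(cluster="app-cluster", services=["web-service"])
for svc in desc["services"]:
    print(svc["serviceName"], svc["runningCount"], "/", svc["desiredCount"])

# Roll the service onto fresh tasks (e.g., after pushing a new image to ECR)
ecs.update_service(cluster="app-cluster", service="web-service", forceNewDeployment=True)
```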
MTX Group Inc. is seeking a motivated Lead DevOps Engineer to join our team. MTX Group Inc. is a global implementation partner enabling organizations to become fit enterprises. MTX provides expertise across various platforms and technologies, including Google Cloud, Salesforce, artificial intelligence/machine learning, data integration, data governance, data quality, analytics, visualization and mobile technology. MTX's very own Artificial Intelligence platform, Maverick, enables clients to accelerate processes and critical decisions by leveraging a Cognitive Decision Engine, a collection of purpose-built Artificial Neural Networks designed to leverage the power of Machine Learning. The Maverick Platform includes Smart Asset Detection and Monitoring, Chatbot Services, and Document Verification, to name a few.
Responsibilities:
- Be responsible for software releases, configuration, monitoring and support of production system components and infrastructure.
- Troubleshoot technical or functional issues across various global applications and platforms in a complex environment to provide timely resolution.
- Bring experience with Google Cloud Platform.
- Write scripts and automation tools in languages such as Bash/Python/Ruby/Golang.
- Configure and manage data sources like PostgreSQL, MySQL, Mongo, Elasticsearch, Redis, Cassandra, Hadoop, etc
- Build automation and tooling around Google Cloud Platform using technologies such as Anthos, Kubernetes, Terraform, Google Deployment Manager, Helm, Cloud Build etc.
- Bring a passion to stay on top of DevOps trends, experiment with and learn new CI/CD technologies.
- Work with users to understand and gather their needs for our catalogue, then participate in the required development
- Manage several streams of work concurrently
- Understand how various systems work
- Understand how IT operations are managed
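As one small example of the Google Cloud Platform tooling mentioned in the responsibilities above, the sketch below uses the google-cloud-storage client to list buckets and upload a build artifact. The bucket and file names are hypothetical, and Application Default Credentials are assumed.

```python
# A minimal sketch, assuming google-cloud-storage is installed and
# Application Default Credentials are available; names are placeholders.
from google.cloud import storage

client = storage.Client()

# Quick inventory of buckets, e.g., as part of environment tooling
for bucket in client.list_buckets():
    print(bucket.name)

# Upload a build artifact to a hypothetical artifacts bucket
bucket = client.bucket("my-artifacts-bucket")
blob = bucket.blob("releases/app-1.2.3.tgz")
blob.upload_from_filename("dist/app-1.2.3.tgz")
```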
What you will bring:
- 5 years of work experience as a DevOps Engineer.
- Must possess ample knowledge and experience in system automation, deployment, and implementation.
- Must possess experience in using Linux, Jenkins, and ample experience in configuring and automating the monitoring tools.
- Experience with the software development process and with tools and languages like SaaS, Python, Java, MongoDB, shell scripting, MySQL, and Git.
- Knowledge of handling distributed data systems, for example Elasticsearch, Cassandra, Hadoop, and others.
What we offer:
- Group Medical Insurance (Family Floater Plan - Self + Spouse + 2 Dependent Children)
- Sum Insured: INR 5,00,000/-
- Maternity cover up to two children
- Inclusive of COVID-19 Coverage
- Cashless & Reimbursement facility
- Access to free online doctor consultation
- Personal Accident Policy (Disability Insurance)
- Sum Insured: INR 25,00,000/- Per Employee
- Accidental Death and Permanent Total Disability is covered up to 100% of Sum Insured
- Permanent Partial Disability is covered as per the scale of benefits decided by the Insurer
- Temporary Total Disability is covered
- An option of Paytm Food Wallet (up to Rs. 2500) as a tax saver benefit
- Monthly Internet Reimbursement of up to Rs. 1,000
- Opportunity to pursue Executive Programs/ courses at top universities globally
- Professional Development opportunities through various MTX sponsored certifications on multiple technology stacks including Salesforce, Google Cloud, Amazon & others
*******************
Are you the one? Quick self-discovery test:
- Love for the cloud: When was the last time your dinner entailed an act on “How would ‘Jerry Seinfeld’ pitch Cloud platform & products to this prospect” and your friend did the ‘Sheldon’ version of the same thing.
- Passion: When was the last time you went to a remote gas station while on vacation and ended up helping the gas station owner saasify his 7 gas stations across other geographies.
- Compassion for customers: You listen more than you speak. When you do speak, people feel the need to listen.
- Humor for life: When was the last time you told a concerned CEO, 'If Elon Musk can attempt to take humanity to Mars, why can't we take your business to run on the cloud?'
Your bucket of undertakings:
This position will be responsible for consulting with clients and proposing architectural solutions to help move and improve infra from on-premise to the cloud, or to help optimize cloud spend by moving from one public cloud to another.
- Be the first one to experiment on new-age cloud offerings, help define the best practice as a thought leader for cloud, automation & Dev-Ops, be a solution visionary and technology expert across multiple channels.
- Continually augment skills and learn new tech as the technology and client needs evolve
- Use your experience in the Google cloud platform, AWS, or Microsoft Azure to build hybrid-cloud solutions for customers.
- Provide leadership to project teams, and facilitate the definition of project deliverables around core Cloud-based technology and methods.
- Define tracking mechanisms and ensure IT standards and methodology are met; deliver quality results.
- Participate in technical reviews of requirements, designs, code, and other artifacts
- Identify and keep abreast of new technical concepts in the Google Cloud Platform
- Security, Risk, and Compliance - Advise customers on best practices around access management, network setup, regulatory compliance, and related areas.
Accomplishment Set
- Passionate, persuasive, articulate Cloud professional capable of quickly establishing interest and credibility
- Good business judgment, a comfortable, open communication style, and a willingness and ability to work with customers and teams.
- Strong service attitude and a commitment to quality.
- Highly organised and efficient.
- Confident working with others to inspire a high-quality standard.
Experience :
- 4-8 years experience in Cloud Infrastructure and Operations domains
- Experience with Linux systems and/or Windows servers
- Specialize in one or two cloud deployment platforms: AWS, GCP
- Hands on experience with AWS services (EKS, ECS, EC2, VPC, RDS, Lambda, GKE, Compute Engine, API Gateway, AppSync and ServiceMesh)
- Experience in one or more scripting language-Python, Bash
- Good understanding of Apache Web Server, Nginx, MySQL, MongoDB, Nagios
- Logging and Monitoring tools (ELK, Stackdriver, CloudWatch)
- DevOps Technologies (AWS DevOps, Jenkins, Git, Maven)
- Knowledge on Configuration Management tools such as Ansible, Terraform, Puppet, Chef, Packer
- Experience working with deployment and orchestration technologies (such as Docker, Kubernetes, Mesos)
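To give a flavour of the logging and monitoring experience listed above, here is a minimal sketch that probes a hypothetical endpoint and publishes the result as a CloudWatch custom metric via boto3. The namespace, metric name and URL are all placeholders.

```python
# A minimal sketch, assuming boto3 with CloudWatch permissions; the
# namespace, metric name and endpoint are hypothetical.
import boto3
import requests

cloudwatch = boto3.client("cloudwatch")

# Probe an internal endpoint and publish the result as a custom metric
resp = requests.get("https://internal.example.com/healthz", timeout=5)
healthy = 1.0 if resp.status_code == 200 else 0.0

cloudwatch.put_metric_data(
    Namespace="Custom/Infra",
    MetricData=[{"MetricName": "EndpointHealthy", "Value": healthy, "Unit": "Count"}],
)
```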
Education :
- Is Education overrated? Yes. We believe so. However, there is no way to locate you otherwise. So unfortunately we might have to look for a Bachelor's or Master's degree in engineering from a reputed institute, or you should have been programming since the age of 12. And the latter is better. We will find you faster if you specify the latter in some manner. Not just degrees; we are not too thrilled by tech certifications either ... :)
- To reiterate: passion for tech-awesomeness, an insatiable desire to learn the latest new-age cloud tech, a highly analytical aptitude and a strong 'desire to deliver' outlive those fancy degrees!
- 3-8 years of experience with hands-on experience in Cloud Computing (AWS/GCP) and IT operational experience in a global enterprise environment.
- Good analytical, communication, problem solving, and learning skills.
- Knowledge of programming against cloud platforms such as Google Cloud Platform and of lean development methodologies.
at Ojas Innovative Technologies
A strong background in Azure OR Amazon Web Services (AWS) or a similar cloud platform is a must-have, certification is a plus.
Excellent technical skills and knowledge, including but not limited to: cloud methodologies like PaaS and SaaS; programming languages such as Python, Java, .NET; orchestration systems such as Chef, Ansible, Terraform; Azure IaaS servers; PowerShell scripting.
Fully aware of the DevOps cycle, with hands-on experience in cloud deployment models.
Experience working with Docker and related containerization technologies.
Experience working on orchestration platforms such as AKS, RedHat OpenShift etc.
Extensive knowledge of logging and monitoring tools such as EFK, and visualization tools such as Grafana and Prometheus.
Exposure to security alert monitoring tools.
Experience in building both microservices and public-facing APIs.
Experience in working with any of the API gateways.
● Responsible for design, development, and implementation of Cloud solutions.
● Responsible for achieving automation & orchestration of tools (Puppet/Chef)
● Monitoring the product's security & health (Datadog/New Relic)
● Managing and maintaining databases (Mongo & Postgres)
● Automating Infrastructure using AWS services like CloudFormation
● Participating in Infrastructure Security Audits
● Migrating to Container technologies (Docker/Kubernetes)
● Should be able to work on serverless concepts (AWS Lambda)
● Should be able to work with AWS services like EC2, S3, CloudFormation, EKS, IAM, RDS, etc.
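As an illustration of the serverless concepts mentioned above, here is a minimal sketch of an AWS Lambda handler reacting to S3 object-created events. The event shape follows the standard S3 notification format; the logging behaviour and bucket contents are illustrative only.

```python
# A minimal serverless sketch, assuming deployment as an AWS Lambda function
# triggered by S3 event notifications; the handling shown is illustrative.
import json
import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # React to S3 object-created events and log basic object metadata
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        head = s3.head_object(Bucket=bucket, Key=key)
        print(json.dumps({"bucket": bucket, "key": key, "size": head["ContentLength"]}))
    return {"statusCode": 200}
```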
What you bring:
● Problem-solving skills that enable you to identify the best solutions.
● Team collaboration and flexibility at work.
● Strong verbal and written communication skills that will help in presenting complex ideas in an accessible and engaging way.
● Ability to choose the tools and technologies that best fit the business needs.
Aviso offers:
● Dynamic, diverse, inclusive startup environment driven by transparency and velocity
● Bright, open, sunny working environment and collaborative office space
● Convenient office locations in Redwood City, Hyderabad and Bangalore tech hubs
● Competitive salaries and company equity, and a focus on developing world class talent operations
● Comprehensive health insurance available (medical) for you and your family
● Unlimited leaves with manager approval and a 3 month paid sabbatical after 3 years of service
● CEO moonshots projects with cash awards every quarter
● Upskilling and learning support including via paid conferences, online courses, and certifications
● Every month, Rs. 2,500 will be credited to your Sodexo meal card
Below is the Job details:
Role: DevOps Architect
Experience Level: 8-12 Years
Job Location: Hyderabad
Key Responsibilities :
Evaluate the various DevOps tools/technologies, identify their strengths and provide direction to the DevOps automation team
Bring an out-of-the-box thought process to the DevOps automation platform implementation
Explore various tools and technologies and do POCs on integrating these tools
Evaluate backend APIs for various DevOps tools
Perform code reviews, keeping RASUI in context
Mentor the team on the various E2E integrations
Act as a liaison in evangelizing the automation solution currently implemented
Bring in various DevOps best practices/principles and participate in their adoption with various app teams
Must have:
Should possess a Bachelor's/Master's in Computer Science with a minimum of 8+ years of experience
Should possess a minimum of 3 years of strong experience in DevOps
Should possess expertise in using various DevOps tools, libraries and APIs (Jenkins/JIRA/AWX/Nexus/GitHub/BitBucket/SonarQube)
Should possess expertise in optimizing the DevOps stack (Containers/Kubernetes/Monitoring)
2+ years of experience in creating solutions and translating them to the development team
Should have a strong understanding of OOP and SDLC (Agile, SAFe standards)
Proficient in Python, with good knowledge of its ecosystem (IDEs and frameworks)
Proficient in various cloud platforms (Azure/AWS/Google Cloud Platform)
Proficient in various DevOps offerings (Pivotal/OpenStack/Azure DevOps)
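To illustrate working against DevOps tool APIs as described above, here is a minimal sketch that queries the GitHub REST API for open pull requests using requests. The organisation and repository names are placeholders, and a personal access token is assumed in the environment.

```python
# A minimal sketch, assuming a GitHub personal access token in GITHUB_TOKEN;
# the organisation/repository below are placeholders.
import os
import requests

token = os.environ["GITHUB_TOKEN"]
headers = {"Authorization": f"Bearer {token}", "Accept": "application/vnd.github+json"}

# List open pull requests, e.g., to feed a DevOps dashboard or a gating check
url = "https://api.github.com/repos/example-org/example-repo/pulls"
resp = requests.get(url, headers=headers, params={"state": "open"}, timeout=10)
resp.raise_for_status()

for pr in resp.json():
    print(pr["number"], pr["title"], pr["user"]["login"])
```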
Regards,
Talent acquisition team
Tetrasoft India
Stay home and Stay safe