• At least 3 years' hands-on experience with the Red Hat Linux family
• Virtualization and Cloud Computing: senior-level virtualization skills; can perform most virtualization tasks with minimal assistance.
• At least 12 months' experience building container-based solutions using one or more of OpenShift, Kubernetes, Docker, Helm.
• At least 6 months' hands-on experience with Linux KVM (libvirt, libguestfs, virsh, qemu, qemu-img, virtio) or equivalent experience using Xen or Oracle VM.
• Understanding of cloud service models (IaaS/PaaS/SaaS) considered a plus.
Ansible - Technical Experience
• Install Ansible/Red Hat Ansible Engine on control nodes.
• Create and update inventories of managed hosts and manage connections to them.
• Automate administration tasks with Ansible playbooks and ad hoc commands.
• Write effective Ansible playbooks at scale.
• Protect sensitive data used by Ansible with Ansible Vault.
• Reuse code and simplify playbook development with Ansible roles.
• Configure Ansible managed nodes.
• Create and distribute SSH keys to managed nodes.
• Configure privilege escalation on managed nodes.
• Create Ansible plays and playbooks.
• Know how to work with commonly used Ansible modules.
• Use variables to retrieve the results of running commands.
Deep understanding of the core components of Ansible:
• Inventories
• Modules
• Variables
• Facts
• Plays
• Playbooks
• Configuration files
• Automation Development - any kind of automation development in the SDLC to improve effectiveness, preferably using Ruby, Python, Bash, etc., which can return JSON for Ansible.
• Working knowledge of software repositories such as GitHub or Bitbucket.
• Cloud - knowledge of OpenStack, VMware, AWS, Azure, SoftLayer, Google Cloud, etc. will be further helpful.
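The "return JSON for Ansible" point above refers to dynamic inventory scripts, which Ansible invokes with `--list` and expects to print a JSON map of groups to hosts. A minimal sketch follows; the host and group names are hypothetical examples, not from this listing.

```python
#!/usr/bin/env python3
"""Minimal sketch of an Ansible dynamic inventory script.
Hosts, groups, and variables here are hypothetical examples."""
import json
import sys

def build_inventory():
    # Ansible expects a JSON object mapping group names to host lists,
    # plus an optional "_meta" section carrying per-host variables.
    return {
        "webservers": {"hosts": ["web01.example.com", "web02.example.com"]},
        "dbservers": {"hosts": ["db01.example.com"]},
        "_meta": {
            "hostvars": {
                "db01.example.com": {"ansible_user": "dbadmin"}
            }
        },
    }

if __name__ == "__main__":
    # Ansible calls the script with --list (full inventory)
    # or --host <name> (per-host variables).
    if len(sys.argv) > 1 and sys.argv[1] == "--list":
        print(json.dumps(build_inventory()))
    else:
        print(json.dumps({}))
```

Such a script can be passed directly to Ansible, e.g. `ansible -i inventory.py webservers -m ping`.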
• Knowledge and experience with various cloud service and deployment models (i.e. IaaS, PaaS, XaaS, on-premise, off-premise, etc.)
• Sysadmin background (with both implementation and support experience in terms of infrastructure, servers, OSes, middleware, and databases)
• Understanding of several middleware products such as Oracle, SAP, DB2, MySQL, Apache, IIS, etc.
• Solid background with operating systems deployment and administration (various Linux/Unix flavours - RHEL/SLES, AIX - as well as Windows Servers)
Nice to Have:
• Experience with Chef.
• Experience with VMware technologies.
• Experience in software development projects using agile development methodologies (Scrum).
• Experience with the Azure, AWS, and Google public clouds.
Soft Skills (100% coverage required)
• Strong spoken and written communication skills
• Demonstrated ability to work in large, geographically spread teams
• Ability to lead small teams technically
• Client-facing experience and skills
Primary Skills (Must have)
.NET, Web API, NUnit/xUnit, Dapper
Angular 6-8
Microservices
AWS/Azure experience
Any messaging queue - MSMQ, RabbitMQ, Kafka
Database - MSSQL, PostgreSQL
General architecture design experience: screen mocks, DB design, workflow design, participating in design discussions
Secondary Skills (Good to have)
.NET Core
Kafka
Workflow orchestrator - Camunda/any orchestrator
Linux/Unix experience - looking at logs, starting agents
AWS Lambda, EC2, CloudWatch logs
Job/Role Description
8+ years' experience in SDLC lifecycle engineering
2+ years' experience with a microservices architecture
1+ years' experience in software design with various messaging systems/service buses, such as Kafka or RabbitMQ
Understanding of industry best practices and processes associated with software development
4+ years' experience in C# and the .NET Framework
1+ years' experience in cloud technologies (AWS, Azure)
Job Description:
3+ years' experience in software testing
Knowledge/experience of C# .NET
Define testing objectives based on acceptance criteria
Write comprehensive test cases to verify project functionality
Validate functionality according to requirements/stories
Demonstrate strong understanding of software testing practices, strategies, and techniques
Can write clear, sufficiently detailed bugs; able to track and verify bugs
Familiar with Azure DevOps and the Scrum process
Good communication skills, willing to learn/share
Experience with test automation (Selenium/Protractor or similar)
Undergraduate degree in Computer Science or a similar discipline
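The "define testing objectives based on acceptance criteria" and "write comprehensive test cases" points above can be illustrated with a small automated test sketch. The function under test and its discount rules are hypothetical examples, not from this listing; the same pattern applies to any acceptance criterion.

```python
"""Sketch of turning acceptance criteria into automated test cases.
The apply_discount function and its rules are hypothetical examples."""
import unittest

def apply_discount(total: float, code: str) -> float:
    """Hypothetical system under test: code 'SAVE10' gives 10% off."""
    if code == "SAVE10":
        return round(total * 0.9, 2)
    return total

class TestApplyDiscount(unittest.TestCase):
    # Acceptance criterion: a valid code reduces the total by 10%.
    def test_valid_code_applies_ten_percent(self):
        self.assertEqual(apply_discount(100.0, "SAVE10"), 90.0)

    # Acceptance criterion: an unknown code leaves the total unchanged.
    def test_unknown_code_is_ignored(self):
        self.assertEqual(apply_discount(100.0, "BOGUS"), 100.0)

if __name__ == "__main__":
    # exit=False so the runner can be embedded without terminating the process.
    unittest.main(argv=["apply_discount_tests"], exit=False)
```

Each test method maps one acceptance criterion to a verifiable assertion, which is the core of the test-case writing this role asks for.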
4+ years of experience in understanding development practices and awareness of leading cloud technologies/trends to formulate a new DevOps product catalog, devise deployment workflows and strategies, and integrate dev tools for static and dynamic code analysis. Experience in any cloud, but 1+ years in Azure is mandatory. 2+ years of experience writing scripts for Azure or AWS deployments. 1+ years of experience using Kubernetes. Infrastructure provisioning expertise in a few tools such as Docker, Chef, Puppet, Ansible, Packer, CloudFormation, Terraform. Experience with application servers, web servers, and databases (Nginx, PostgreSQL, MongoDB, HAProxy, Tomcat, Flash Media Server/Red5, Redis, Elasticsearch, etc.).
Strong hands-on experience with the IBM ISAM and MFP products
Strong experience in configuring MFP backend services
Creation of integration adapters, SQL adapters, HTTP adapters
Multiple types of authentication
Working knowledge of IBM products
Experience developing application architecture and user interface design
Experience with IBM WAS, DB2/MySQL
Must-Have Skills:
• Good experience in PySpark, including DataFrame core functions and Spark SQL
• Good experience in SQL DBs; able to write queries of fair complexity
• Excellent experience in Big Data programming for data transformations and aggregations
• Good at ELT architecture: business rules processing and data extraction from a Data Lake into data streams for business consumption
• Good customer communication
• Good analytical skills
Technology Skills (Good to Have):
Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions - Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub. Experience in migrating on-premises data warehouses to data platforms on the Azure cloud. Designing and implementing data engineering, ingestion, and transformation functions in Azure Synapse or Azure SQL Data Warehouse. Spark on Azure is available in HDInsight and Databricks.
JD - Desired skills:
Azure Kubernetes
Writing services in Node.js/Express.js to be deployed in Docker containers
Experience in Docker configuration and builds
Microservices development experience
CI/CD experience as a developer
1) 6-9 years of industry experience and at least 4 years of experience in an architect role is required, along with at least 3-5 years' experience in designing and building analytics/data solutions in Azure.
2) Demonstrated in-depth skills with Azure Data Factory (ADF), Azure SQL Server, Azure Synapse, and ADLS, with the ability to configure and administer all aspects of Azure SQL Server.
3) Demonstrated experience delivering multiple data solutions as an architect.
4) Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, and related Microsoft and other ETL technologies.
5) DP-200 and DP-201 certifications preferred.
6) Good to have hands-on experience in Power BI and Azure Databricks.
7) Should have good communication and presentation skills.
Be Part Of Building The Future
Dremio is the Data Lake Engine company. Our mission is to reshape the world of analytics to deliver on the promise of data with a fundamentally new architecture, purpose-built for the exploding trend towards cloud data lake storage such as AWS S3 and Microsoft ADLS. We dramatically reduce and even eliminate the need for the complex and expensive workarounds that have been in use for decades, such as data warehouses (whether on-premise or cloud-native), structural data prep, ETL, cubes, and extracts. We do this by enabling lightning-fast queries directly against data lake storage, combined with full self-service for data users and full governance and control for IT. The results for enterprises are extremely compelling: 100X faster time to insight; 10X greater efficiency; zero data copies; and game-changing simplicity. And equally compelling is the market opportunity for Dremio, as we are well on our way to disrupting a $25BN+ market.
About the Role
The Dremio India team owns the Data Lake Engine along with the cloud infrastructure and services that power it. With a focus on next-generation data analytics supporting modern table formats like Iceberg and Delta Lake, open source initiatives such as Apache Arrow and Project Nessie, and hybrid-cloud infrastructure, this team provides many opportunities to learn, deliver, and grow in your career. We are looking for innovative minds with experience in leading and building high-quality distributed systems at massive scale and solving complex problems.
Responsibilities & Ownership
Lead, build, deliver, and ensure customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product.
Work on distributed systems for data processing with efficient protocols and communication, locking and consensus, schedulers, resource management, low-latency access to distributed storage, auto-scaling, and self-healing.
Understand and reason about concurrency and parallelization to deliver scalability and performance in a multithreaded and distributed environment.
Lead the team to solve complex and unknown problems.
Solve technical problems and customer issues with technical expertise.
Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure.
Mentor other team members on high quality and design.
Collaborate with Product Management to deliver on customer requirements and innovation.
Collaborate with Support and field teams to ensure that customers are successful with Dremio.
Requirements
B.S./M.S./equivalent in Computer Science or a related technical field, or equivalent experience
Fluency in Java/C++ with 8+ years of experience developing production-level software
Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models, and their use in developing distributed and scalable systems
5+ years' experience in developing complex and scalable distributed systems and delivering, deploying, and managing microservices successfully
Hands-on experience in query processing or optimization, distributed systems, concurrency control, data replication, code generation, networking, and storage systems
Passion for quality, zero-downtime upgrades, availability, resiliency, and uptime of the platform
Passion for learning and delivering using the latest technologies
Ability to solve ambiguous, unexplored, and cross-team problems effectively
Hands-on experience working on projects on AWS, Azure, and Google Cloud Platform
Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and Google Cloud)
Understanding of distributed file systems such as S3, ADLS, or HDFS
Excellent communication skills and an affinity for collaboration and teamwork
Ability to work individually and collaboratively with other team members
Ability to scope and plan solutions for big problems, and to mentor others on the same
Interested and motivated to be part of a fast-moving startup with a fun and accomplished team
Job Description:
Must have: DevOps, Jenkins, Terraform, shell scripts, Azure cloud computing
Knowledge of code versioning tools such as Git, Bitbucket, etc.
Good communication, both verbal and non-verbal
Should be a team player
Good to have:
• Proficiency in Azure cloud platforms
• Knowledge of build and deployment tools such as Jenkins, Ansible, etc.
Having 4 to 6 years' development experience in the above job description
Band: B2
1) Must Have (Top 3 skills): DevOps, Jenkins, and Terraform
2) Good To Have: Azure cloud computing, Git, and shell scripts
Minimum 4 years of experience with Ansible, including 3+ years hands-on
Any one container platform, preferably Kubernetes
Any one cloud, preferably Azure
Overall, 3 to 5 years' experience
Job title: Azure Architect
Locations: Noida, Pune, Bangalore and Mumbai
Responsibilities:
Develop and maintain scalable architecture, database designs, and data pipelines, and build out new data source integrations to support continuing increases in data volume and complexity
Design and develop the data lake and data warehouse using Azure cloud services
Assist in designing end-to-end data and analytics solution architecture and perform POCs within Azure
Drive the design, sizing, POC setup, etc. of Azure environments and related services for the use cases and solutions
Review solution requirements and support architecture design to ensure the selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies
Must possess good client-facing experience with the ability to facilitate requirements sessions and lead teams
Support internal presentations to technical and business teams
Provide technical guidance, mentoring, code review, and design-level technical best practices
Experience Needed:
12-15 years of industry experience and at least 3 years of experience in an architect role, along with at least 3 to 4 years' experience designing and building analytics solutions in Azure
Experience in architecting data ingestion/integration frameworks capable of processing structured, semi-structured, and unstructured data sets in batch and real time
Hands-on experience in the design of reporting schemas and data marts and the development of reporting solutions
Develop batch processing, streaming, and integration solutions, and process structured and non-structured data
Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, Azure Analysis Services, and other ETL technologies.
Experience performing design, development, and deployment using Azure services (Azure Synapse, Data Factory, Azure Data Lake Storage, Databricks, Python, and SSIS)
Experience working with transactional, temporal, time-series, structured, and unstructured data
Deep understanding of the operational dependencies of applications, networks, systems, security, and policy, both on-premise and in the cloud: VMs, networking, VPN (ExpressRoute), Active Directory, storage (Blob, etc.), Windows/Linux
Mandatory Skills: Azure Synapse, Data Factory, Azure Data Lake Storage, Azure DW, Databricks, Python
Key skills: Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake
Job Description
Minimum of 15 years of experience with Informatica ETL and database technologies
Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake
Exposure to change data capture technology
Lead and guide development of an Informatica-based ETL architecture
Develop solutions in a highly demanding environment and provide hands-on guidance to other team members
Lead complex ETL requirements and design
Implement an Informatica-based ETL solution fulfilling stringent performance requirements
Collaborate with product development teams and senior designers to develop architectural requirements for the solution
Assess requirements for completeness and accuracy, and determine whether requirements are actionable for the ETL team
Conduct impact assessments and determine the size of effort based on requirements
Develop full SDLC project plans to implement the ETL solution and identify resource requirements
Perform an active, leading role in shaping and enhancing the overall Informatica ETL architecture; identify, recommend, and implement ETL process and architecture improvements
Assist with and verify the design of the solution and the production of all design-phase deliverables
Manage the build phase and quality-assure code to ensure it fulfils requirements and adheres to the ETL architecture
Working knowledge of setting up and running HDInsight applications
Hands-on experience in Spark, Scala, and Hive
Hands-on experience in ADF (Azure Data Factory)
Hands-on experience in Big Data and Hadoop ecosystems
Exposure to Azure service categories such as PaaS components and IaaS subscriptions
Ability to design and develop ingestion and processing frameworks for ETL applications
Hands-on experience in PowerShell scripting and deployment on Azure
Experience in performance tuning and memory configuration
Should be adaptable to learn and work on new technologies
Good written and spoken communication skills