Requirements and Qualifications
• Create and maintain optimal data pipeline architecture
• Assemble large, complex data sets that meet functional and non-functional business requirements
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using MySQL and AWS technologies
• Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs
• Create data analytical tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader
• Bachelor's/Master's degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field
• 5+ years of experience in Data Engineering or building backend systems or platforms
• Advanced working SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of databases
• Experience building and optimizing data pipelines, architectures, and data sets
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
• Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management
• A successful history of manipulating, processing, and extracting value from large, disconnected datasets
• Experience supporting and working with cross-functional teams in a dynamic environment
• Experience with real-time data pipelines from ingestion to delivery (e.g., Kafka, Kinesis)
• Familiarity with cloud-based solutions in AWS/Azure/GCP
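The extraction/transformation/loading work described above can be sketched as a single minimal pipeline step. This is an illustrative example only, not part of the posting: it uses the standard-library sqlite3 module as a stand-in for MySQL, and the table and column names (`raw_orders`, `order_totals`) are hypothetical.

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract raw order rows, transform them, and load a summary table.

    A toy stand-in for a MySQL/AWS pipeline step; in production the
    source and target would typically be separate systems.
    """
    cur = conn.cursor()
    # Extract: pull raw rows from the source table.
    rows = cur.execute("SELECT customer, amount FROM raw_orders").fetchall()
    # Transform: aggregate order amounts per customer in memory.
    totals: dict = {}
    for customer, amount in rows:
        totals[customer] = totals.get(customer, 0.0) + amount
    # Load: write the summary into the target table.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS order_totals "
        "(customer TEXT PRIMARY KEY, total REAL)"
    )
    cur.executemany(
        "INSERT OR REPLACE INTO order_totals VALUES (?, ?)", totals.items()
    )
    conn.commit()
    return len(totals)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO raw_orders VALUES (?, ?)",
        [("acme", 10.0), ("acme", 5.0), ("globex", 7.5)],
    )
    print(run_etl(conn))  # prints 2: two customers loaded
```

A real pipeline would add incremental loads and error handling, but the extract/transform/load split is the same.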
# Responsibilities
- Work with development teams and product managers to ideate software solutions
- Design server-side architecture
- Develop and manage well-functioning databases and applications
- Write effective APIs
- Test software to ensure responsiveness and efficiency
- Troubleshoot, debug and upgrade software
- Write technical documentation

# Qualifications
- 2 to 5 years' experience in building REST APIs
- Ability to meet tight deadlines
- Attention to detail
- Capable of prioritizing multiple projects in order to meet goals
- Experience in building APIs using NodeJS, Express, and serverless technologies like AWS Lambda
- Comfortable with MySQL/PostgreSQL, Mongo, and Firestore
The DevOps Engineer is responsible for crafting and delivering secure and highly available solutions. You will be a critical part of a team passionate about ensuring our critical services are ready and stress tested. You should be comfortable taking on new challenges, defining potential solutions, and implementing designs in a team environment.

Responsibilities
During your employment your responsibilities will include the following. The DevOps Engineer partners closely with Product, Engineering, Support, and Ops. We are responsible for the design, deployment, and continuous operation of the Personalization platform. You will take our existing platform to the next level with CI/CD, automated diagnostics/scaling/healing, and more. You will work on a team responsible for a blend of architecture, automation, development, and application administration.
- Craft and deploy solutions from the infrastructure to the network and application layers, on public cloud platforms
- Ensure our SaaS platform is available and performing, and that we notice problems before our customers do
- Build the tools to improve the speed, confidence, and visibility of our SaaS deployments
- Help build security into every step of the software and infrastructure life cycle
- Collaborate with Support and Engineering on customer issues, as needed
- Work with distributed data infrastructure, including containerization and virtualization tools, to enable unified engineering and production environments
- Develop dashboards, monitors, and alerts to increase situational awareness of the state of our production issues, SLAs, and security incidents
- Independently conceive and implement ways to improve development and operational efficiency, code reliability, and test fidelity
Requirements
- Hands-on experience with a modern programming language (Python, Go, shell scripting, etc.)
- Hands-on Unix/Linux knowledge
- Writing scripts using Python and Unix shells (bash, sh)
- Familiarity with common configuration management and orchestration tools (Ansible, Puppet, Chef, etc.)
- Experience with modern application monitoring tools, and the ability to define SLIs/SLOs
- Continuous delivery/integration tools (Jenkins, Argo CD, etc.)
- A DevOps mentality
- Must have experience with containerization technologies (Kubernetes, Docker, etc.)
Technical Specialist – Associate - UNIX and DB
Bengaluru, India
Full-time

Company Description
Commvault is the world's most powerful backup and recovery software in the cloud and on any infrastructure, helping companies transform their data into a powerful strategic asset. Commvault data protection and information management solutions enable companies and organizations of all sizes, in all industries, to protect, access, and share all of their data, anywhere and anytime. As an organization, we are committed to a great work culture that embraces our values and promotes professional growth. Our Vaulters are passionate innovators who work together to uncover new challenges that can be solved. We are proud that the focus of every Vaulter is to drive our customers' businesses forward. We're all about getting the job done, and having FUN doing it. As Vaulters we pride ourselves on transparency, integrity, and respect in everything that we do. NOW is the time to join a growing company with strong roots, where you can take on your next challenge.

Job Description
The Customer Support Associate - UNIX and DB works within a team of technical support professionals aimed at delivering technical solutions for Commvault's customer base. This customer base has a wide range of technical ability, and this role has no ceiling: you will own an issue from initial pick-up all the way to working with the development team on identifying a possible solution. This position is part of our UNIX and DB group in our Customer Support Center. This is an excellent opportunity for innovative and collaborative technologists to join a global team that has led the industry in satisfaction seven years in a row. We offer innovative training, interesting colleagues, and the opportunity to grow with us.
Our UNIX and DB Group is focused on, but not limited to, backups of Unix and Linux platforms and databases within the Commvault suite; this includes networking, troubleshooting, connectivity, name resolution, and performance-based issues at the OS and/or hardware level. We are a source of expertise for not just our customer base but also our partners and consultants on site.

Why Commvault? You get the "Freedom to make an impact, together". We thrive upon collaboration by designing our offices and cafes to spark conversation about work and fun. Although we hail from all walks of life and speak dozens of languages, we're passionate about equality and integrity. We "go beyond" each and every day. We are looking for a Customer Support Associate with a genuine passion for all things tech to join our expert and super-friendly UNIX and DB team.

Position Responsibilities include:
- Troubleshooting and resolving complex support problems
- Troubleshooting customer issues using remote desktop software
- Successfully interacting through phone and email with customers as you solve their problems
- Dedication to the success and satisfaction of our customers
- Recreating problems in house
- Root cause analysis and/or provision of examples of software bugs
- Working independently and as a team to come up with the best solutions to a customer problem
- Providing best-in-class phone-based support for a variety of complex, time-critical issues
- Using and sharing your knowledge of a wide range of technologies
- Working remotely on enterprise-level customers and dark sites
- Having the opportunity to build labs and simulators
- The ability to be involved in product BETA testing
- Contributing to our Solutions Engine and online forums

Position Requirements include:
- At least 5+ years of technical/customer support experience
- Expert-level knowledge of Unix-flavoured operating systems and their components
- Strong understanding of Oracle, HANA, and SAP databases
- Good understanding of Unix/Linux clusters, with experience installing and configuring software
- Solid understanding of Unix device management (tape and disk)
- Networking and troubleshooting connectivity, name resolution, and performance-based issues with the Linux OS and/or hardware
- Desirable to have exposure to storage arrays like NetApp, HDS
- Understanding of backup theory and design; backup and data management fundamentals
- Previous experience with Commvault technologies is highly desirable; experience with other backup software such as Veritas, Symantec Backup Exec, ShadowProtect, NetBackup, NetWorker, Avamar, VEEAM, or TSM is a plus
- Previous experience troubleshooting enterprise environments
- Strong customer relations skills
- Strong multi-tasking and prioritization skills
- Strong written and verbal communication skills
- Excellent team player

Qualifications
We value knowledge and curiosity, no matter whether you learnt it at university, through courses, or by learning from others.
- Ideally a Bachelor's degree
- Preferred: OCP/OCA/OCM, Red Hat RHCSA, RHCE

Additional Information
Recognized as a leader in the Gartner Magic Quadrant for Data Center Backup and Recovery Software, our industry's definitive independent ranking. For the seventh straight year, Commvault has been named a leader, and this year we're furthest on the "completeness of vision" and highest on the "ability to execute." Commvault offers its products through a broad array of distribution partners globally, while building upon its strong portfolio of strategic partnerships with leading technology companies including Microsoft, Amazon Web Services, Cisco, Oracle, SAP, Nutanix, Pure, HP, Hitachi, NetApp, and many others. Commvault's global headquarters is located in Tinton Falls, NJ, with additional offices that support customers globally across the Americas, EMEA, and APAC.
Senior AWS Security Engineer

Job Scope
- Build role-based policies in IAM, establish relevant SOPs to achieve solid access control, and work with security & compliance personnel, data engineers, and data scientists on access provisioning/deprovisioning and account and access reviews
- Manage AWS keys (including but not limited to key management, key encryption, key expiry, and rotation)
- Collaborate with network security specialists, configuring AWS Security Groups to achieve server hardening
- Collaborate with the DevOps team, monitoring user access in DBs and other AWS instances using enterprise tools (such as ArcSight SIEM, Imperva, etc.)
- Work with data engineers and solution architects on the design and implementation of data encryption in AWS infrastructure, validating data encryption against standard guidelines
- Work with privacy compliance personnel on the design and implementation of user access and data classification in DBs, following standard guidelines (e.g., HIPAA, GDPR, PDPA)
- Work with project managers, data engineers, and data scientists on handling requests for secure data transfer
- Work with the DevOps, security, and compliance teams, participating in the execution and validation of the Disaster Recovery (DR) Testing Plan in the AWS environment
- Work closely with technology, security, risk & compliance, internal audit, and development teams to build and launch new policies, processes, and controls, and provide internal trainings as necessary
Mandatory Requirements
- Bachelor's degree or above in Information Security, Computer Science, or a related technical field
- Min 5 years' experience as a cloud engineer, of which at least 2 years as an AWS cloud security engineer and 2 years doing database management
- Sound knowledge of security frameworks for managed cloud services; proficiency with the CIS AWS Foundations Benchmark controls
- Hands-on working knowledge of current Identity Access Management and Privileged Access Management technologies and solutions implementation
- In-depth knowledge of AWS IAM, Security Groups, KMS, Secrets Manager
- In-depth knowledge of standardizing the SSH login process to servers by different teams based on public/private keys
- Good understanding of AWS VPC, CloudTrail, CloudWatch; knowledge of tracing changes in AWS configurations
- Good knowledge of AWS EC2, S3, S3 Glacier, and container services (EKS)
- In-depth knowledge of MongoDB & DynamoDB; independently set up and managed both types of DBs for enterprise-level client data / data in a data lake
- In-depth knowledge of secure data transfer; independently handled data transfer requests using SFTP and/or AWS DataSync (or other enterprise-level tools)
- Good knowledge of TLS (1.2 and 1.3) and AES/AES-GCM
- Hands-on experience drafting corporate SOPs (related to cloud security, access control, data transfer, etc.) and/or reviewing and revising existing SOPs

Preferred Certificates/Qualifications
- AWS Certified SysOps Administrator (Associate)
- AWS Certified Security – Specialty
- AWS Certified Database – Specialty
- CIAM, CIST, or any other relevant certification is a plus
- Working experience in the healthcare industry is a plus
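The role-based IAM policy work above can be illustrated with a minimal least-privilege policy document. A hedged sketch, not part of the posting: the bucket name is hypothetical, and the document is built as a plain dict so its JSON shape can be checked without an AWS account (in practice it would be attached to a role via the IAM API or infrastructure-as-code).

```python
import json

def s3_read_only_policy(bucket: str) -> str:
    """Return a least-privilege IAM policy JSON granting read-only
    access to a single, hypothetical S3 bucket."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                # Listing applies to the bucket ARN itself.
                "Sid": "ListBucket",
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": ["arn:aws:s3:::{}".format(bucket)],
            },
            {
                # Object reads apply to the objects under the bucket.
                "Sid": "ReadObjects",
                "Effect": "Allow",
                "Action": ["s3:GetObject"],
                "Resource": ["arn:aws:s3:::{}/*".format(bucket)],
            },
        ],
    }
    return json.dumps(policy, indent=2)

if __name__ == "__main__":
    print(s3_read_only_policy("example-data-lake"))
```

Scoping `s3:ListBucket` to the bucket ARN and `s3:GetObject` to `bucket/*` is the standard split, since the two actions evaluate against different resource types.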
We are looking for a great Go developer who possesses a strong understanding of how best to leverage and exploit the language's unique paradigms, idioms, and syntax. Your primary focus will be on developing Go packages and programs that are scalable and maintainable. You will ensure that these Go packages and programs are well documented and have reasonable test coverage. You will coordinate with the rest of the team working on different layers of the infrastructure. A commitment to collaborative problem solving, sophisticated design, and quality product is essential. Interested candidates can apply directly on https://cloudfeathergames.com/positions for a faster approach. Number of positions - 2

Responsibilities
Primary responsibilities and skills include the following areas:
- Writing scalable, robust, testable, efficient, and easily maintainable code
- Translating software requirements into stable, working, high-performance software
- Playing a key role in architectural and design decisions, building toward an efficient microservices distributed architecture
- Building highly scalable, highly available web services to handle millions of transactions per day in a cloud-native environment
- Working on an agile team using CI/CD best practices to deliver the highest quality software possible, quickly
- A strong desire to learn new things and continually improve yourself and those around you, with a "can do anything" mentality
- You live to experiment, test, fail fast, and learn as you go; we are not looking for a cookie-cutter solution to the complex problems we solve

Requirements
- Strong knowledge of the Go programming language: paradigms, constructs, and idioms
- Knowledge of common goroutine and channel patterns
- Ability to write clean and effective Godoc comments
- Familiarity with code versioning tools such as Git

Nice To Have
- Good understanding of SQL and data modeling
- Scripting ability (Bash/shell, Python, etc.)
- Any one of these data stores: MongoDB, Postgres, CockroachDB, or other NoSQL servers
- Automated testing of applications and continuous integration
- Experience writing and building API systems using REST/JSON/gRPC
- Experience with Kubernetes, Docker, Kafka, NATS, etc.
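The goroutine and channel patterns mentioned above include fan-out/fan-in: distribute work items to a pool of workers over one channel and collect results on another. As a language-neutral illustration (an assumption of this write-up, not part of the posting), the same pattern can be sketched in Python, with two thread-safe queues standing in for channels.

```python
import queue
import threading

def fan_out_fan_in(items, worker, n_workers=4):
    """Distribute items to n_workers threads and collect their results.

    The two queues play the role of Go channels; a None sentinel per
    worker takes the place of closing the task channel.
    """
    tasks = queue.Queue()
    results = queue.Queue()

    def run():
        while True:
            item = tasks.get()
            if item is None:  # sentinel: no more work for this worker
                break
            results.put(worker(item))

    threads = [threading.Thread(target=run) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for item in items:
        tasks.put(item)
    for _ in threads:   # one sentinel per worker so every thread exits
        tasks.put(None)
    for t in threads:
        t.join()
    # Drain results; sort because completion order is nondeterministic.
    return sorted(results.queue)

if __name__ == "__main__":
    print(fan_out_fan_in(range(5), lambda x: x * x))  # [0, 1, 4, 9, 16]
```

In Go the sentinels would be replaced by `close(tasks)` and a `for item := range tasks` loop in each goroutine.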
EDUCATION AND YEARS OF EXPERIENCE REQUIREMENTS:
- Bachelor's Degree in Computer Science or IT preferred
- 7+ years of experience as a SAP Business Objects Developer building reports and universes, with experience as a SAP Business Objects Administrator installing, configuring, and setting up SAP Business Objects systems at the enterprise level

KNOWLEDGE AND SKILLS REQUIREMENTS:
- Experience in analysis, design, and implementation of applications using SAP Business Objects
- Experience configuring, developing, and maintaining the SAP Business Objects system at the enterprise level as a support administrator
- Experience as a SAP Business Objects Developer
- Experience with SAP Business Objects query troubleshooting and performance tuning using universes
- Exposure to SAP Business Objects universe development, integration, and implementation
- Experience in technical implementations of business intelligence applications that deliver business decision-making capabilities
- Strong SQL knowledge, with the ability to create ad hoc SQL reports
- Exposure to full-phase ETL implementation using tools such as DataStage, Matillion, or other ETL tools is a big plus for assisting with Marketplace data views
- Exposure to tools like Qlik and Business Objects is desirable
- Hawk Redeem requires work hours to be US Central Standard Time Zone, with a 4:30 AM start time

JOB RESPONSIBILITIES:
- Acting as a SAP Business Objects expert to advise customers on best practices in deploying SAP Business Objects reports
- Creating functional and technical requirements as an input to application design, business solution components, and prototypes
- Designing and developing high-value SAP Business Objects reports and, when possible, Qlik dashboards
- Delivering high-quality business intelligence solutions to our internal customers
- Developing business intelligence applications for data analysis, optimized for the best performance and scalability requirements, using SAP Business Objects
- Interacting with business leaders to understand business strategy and conditions, and being able to frame problems
- Providing on-call support for the reporting platform
Looking for talented and passionate people to be part of the team for an upcoming project at a client location.

QUALIFICATION AND EXPERIENCE
- Preferably 4 years or more of working experience on production PostgreSQL DBs
- Experience working in a production support environment
- Engineering or equivalent degree
- Passion for open-source technologies is desired

ADDITIONAL SKILLS
- Install and configure PostgreSQL, EnterpriseDB
- Technical capabilities with PostgreSQL 9.x, 10.x, 11.x
- Server tuning
- Troubleshooting of database issues
- Linux shell scripting
- Install, configure, and maintain failover mechanisms
- Backup and restoration, point-in-time database recovery
- A demonstrable ability to articulate and sell the benefits of modern platforms, software, and technologies
- A real passion for being curious and a continuous learner; you are someone that invests in yourself as much as you invest in your professional relationships

RESPONSIBILITIES
- Monitoring database performance
- Optimizing queries and handling escalations
- Analysing and assessing the impact and risk of low- to medium-risk changes on high-profile production databases
- Implementing security features
- DR implementation and switchover

WHAT IS IN IT FOR YOU?
You would be adding the great experience of working with a leading open-source solutions company in the South East Asia region to your career. You would get to learn from the leaders and grow in the industry. This would be a great opportunity for you to grow in your career through continuous learning, adding depth and breadth of technologies. Since our client works with leading open-source technologies and engages with large enterprises, it creates enormous possibilities for career growth for our team.
Roles and Responsibilities
- Very good experience in Java development using Java Spring, Hibernate, IoT application development, HTML5, JavaScript, REST services, Kafka, database handling, etc.
- Exposure to control systems will be an added advantage
- Very good communication and interpersonal skills
- Capability to mentor a group of 3-5 engineers
- Engineering degree in EC or CS
- 10+ years of experience and expertise in database and systems architecture design, development, and implementation
- Expertise and experience in data structures, indexing, query, and retrieval
- Excellent data management skills, including ensuring data integrity, data access, security, and archiving procedures
- Knowledge of the language technology and video extraction research domains, to be able to converse fluently with the research communities and transform research requirements into concrete formalisms
- Experience in cross-platform development for multiple variants of Unix and Linux, including 32- and 64-bit
- Experience with NoSQL and SQL databases, statistics, and algorithms
- Strong oral and written communication skills
- Design, implement, and test novel database architecture designs to accommodate the multimodal data types used in Client managed technology evaluations
- Administer and monitor databases and address data security solutions as applicable
- Design and develop efficient techniques for fast, optimized data access and transfer in distributed or networked database systems
- Design, implement, and test novel data structures and data search, query, and retrieval solutions to enable access to and processing of the multimodal data types used in Client managed technology evaluations
- Design, implement, and test new, and update existing, data indexing mechanisms and architecture designs to allow access to and processing of the data types used in Client managed technology evaluations
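The indexing and fast-retrieval requirements above can be illustrated with a small, self-contained sketch (not part of the posting): it uses the standard-library sqlite3 module and a hypothetical `docs` table, and checks via EXPLAIN QUERY PLAN that a selective query is served by the index rather than a full table scan.

```python
import sqlite3

# Illustrative only: a hypothetical document table with an index on the
# column we filter by, so lookups avoid scanning every row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, lang TEXT, body TEXT)")
conn.execute("CREATE INDEX idx_docs_lang ON docs (lang)")
conn.executemany(
    "INSERT INTO docs (lang, body) VALUES (?, ?)",
    [("en", "hello"), ("fr", "bonjour"), ("en", "world")],
)

# Ask the query planner how it would execute the filtered query; the
# last column of each plan row is a human-readable detail string.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT body FROM docs WHERE lang = ?", ("en",)
).fetchall()
detail = " ".join(row[-1] for row in plan)
print(detail)  # mentions idx_docs_lang, i.e. an index search, not a scan
```

The same habit of inspecting the query plan before and after adding an index carries over to production engines such as PostgreSQL (`EXPLAIN ANALYZE`).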
AWS DevOps Engineer

Goodera is looking for an experienced and motivated DevOps professional to be an integral part of its core infrastructure team. As a DevOps Engineer, you must be able to troubleshoot production issues; design, implement, and deploy monitoring tools; collaborate with team members to improve existing engineering tools and develop new ones; optimize the company's computing architecture; and design and conduct security, performance, and availability tests.

Responsibilities: This is a highly accountable role and the candidate must meet the following professional expectations:
• Owning and improving the scalability and reliability of our products.
• Working directly with product engineering and infrastructure teams.
• Designing and developing various monitoring system tools.
• Developing deployment strategies and building configuration management.
• Deploying and updating system and application software.
• Ensuring regular, effective communication with team members and cross-functional resources.
• Maintaining a positive and supportive work culture.
• Acting as the first point of contact for handling customer (possibly internal stakeholder) issues, providing guidance and recommendations to increase efficiency and reduce customer incidents.
• Developing tooling and processes to drive and improve customer experience; creating playbooks.
• Eliminating manual tasks via configuration management.
• Intelligently migrating services from one AWS region to other AWS regions.
• Creating, implementing, and maintaining security policies to ensure ISO/GDPR/SOC/PCI compliance.
• Verifying that infrastructure automation meets compliance goals and is current with the disaster recovery plan.
• Evangelizing configuration management and automation to other product developers.
• Keeping yourself updated on upcoming technologies to maintain state-of-the-art infrastructure.

Required Candidate profile:
• 3+ years of proven experience working in a DevOps environment.
• 3+ years of proven experience working in AWS cloud environments.
• Solid understanding of networking and security best practices.
• Experience with infrastructure-as-code frameworks such as Ansible, Terraform, Chef, Puppet, CFEngine, etc.
• Experience in scripting or programming languages (Bash, Python, PHP, Node.js, Perl, etc.).
• Experience designing and building web application environments on AWS, including services such as ECS, ECR, Fargate, Lambda, SNS/SQS, CloudFront, CodeBuild, CodePipeline, CloudWatch, WAF, Active Directory, Kubernetes (EKS), EC2, S3, ELB, RDS, Redshift, etc.
• Hands-on experience with Docker is a big plus.
• Experience working in an Agile, fast-paced DevOps environment.
• Strong knowledge of DBs such as MongoDB/MySQL/DynamoDB/Redis/Cassandra.
• Experience with open-source tools such as HAProxy, Apache, Nginx, Nagios, etc.
• Fluency with version control systems, with a preference for Git.
• Strong Linux-based infrastructure and Linux administration experience.
• Experience with installing and configuring application servers such as WebLogic, JBoss, and Tomcat.
• Hands-on experience with logging, monitoring, and alerting tools like ELK, Grafana, Metabase, Monit, Zabbix, etc.
• A team player capable of high performance and flexibility in a dynamic working environment, with the ability to lead and train others on technical and procedural topics.
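Deployment and monitoring tooling like that described above usually needs transient-failure handling. A minimal, hedged sketch (the helper name and behaviour are illustrative, not a specific Goodera tool): retry a call with exponential backoff, doubling the delay after each failed attempt and re-raising once the attempts run out.

```python
import time

def retry(fn, attempts=3, base_delay=0.01, exceptions=(Exception,), sleep=time.sleep):
    """Call fn, retrying with exponential backoff on failure.

    The sleep function is injectable so the backoff schedule can be
    tested without real delays.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except exceptions:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the last error
            sleep(base_delay * (2 ** attempt))  # 1x, 2x, 4x, ...

if __name__ == "__main__":
    calls = {"n": 0}
    def flaky():
        calls["n"] += 1
        if calls["n"] < 3:
            raise ConnectionError("transient")
        return "ok"
    print(retry(flaky))  # prints "ok" after two transient failures
```

Production variants typically add jitter to the delay so many clients retrying at once do not synchronize their load spikes.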
Why are we building UrbanCompany? UrbanCompany's vision is to empower 1 million+ service professionals to become micro-entrepreneurs. Prior to joining UrbanCompany, most of these professionals, be it plumbers, beauticians, carpenters etc., would typically earn INR 10-15k per month, working for a local shop, aggregator or as a freelancer. The UrbanCompany platform enables these ISPs to become micro-entrepreneurs by helping them in 5 key areas:
1. Unlocking market access: working as an individual franchisee of UC
2. Financing access: bank accounts, access to loans, insurance, etc.
3. Tech-led service standardization: fixed pricing, clear deliverables, SOPs, delivery tracking, payment systems, reviews, etc.
4. Training: soft and core skills training, via training centers and the app
5. Consumables supply chain: bulk procurement of service consumables

This helps service professionals become more organized and multiplies their earnings. For example, our beauticians typically earn INR 40-50k per month with no upper ceiling (P95 earn > INR 100k per month), while in the local salons, they would earn between INR 8-15k per month.

Job Description: UrbanCompany gets 2.5 million customers every month, for a variety of their home services needs. It is present in all major metropolitan cities of India and in the UAE, Australia, and Singapore. Over 20,000 service partners rely on UrbanCompany for their earnings and livelihood, with the platform driving either all, or >80%, of their business. The platform is growing rapidly, scaling at 3x YoY, supported by a strong balance sheet and a clear path to profitability. The Growth Team is the customer-facing team and owns the set of services which cater to all our customer traffic.
Day-to-day challenges include high traffic, traffic bursts, always-on availability, the ability to experiment, collecting user behaviour insights, performance of APIs, pricing and catalog, requirement gathering, checkout and payment, international expansion, etc.

Job Responsibilities: As a part of this team, you would be expected to:
● Bring strong design fundamentals and experience designing complex software systems.
● Own at least one service end to end in the growth ecosystem, along with a small team of 2.
● Build systems for high availability and scalability.
● Define new features and the new technology stack.
● Set team best practices.

Who can apply?
- Bachelor's/Master's in Computer Science from a top-tier engineering school
- 4-8 years of prior engineering experience in building distributed systems
- Proven ability to work in a fast-paced environment
- Fanatic about building scalable, opinionated, high-quality, secure, and reliable data products
- Experience with databases like Redis, Kafka/Kinesis, Mongo, MySQL, Elasticsearch
- Experience with the programming languages Node.js, Python, Scala, or Java will be a plus

What can you expect?
- Work closely with the founding and leadership team on key projects
- Work in full-stack teams (PM + Engg full stack + Design)
- Execute highly scalable applications and implement best practices
- A phenomenal work environment, with massive ownership and growth opportunities
- Quick iterations and deployments - fail-fast attitude