Google Cloud Storage Jobs in Bangalore (Bengaluru)

Explore top Google Cloud Storage job opportunities in Bangalore (Bengaluru) from top companies and startups. All jobs are added by verified employees who can be contacted directly below.
Posted by Shridhar Nayak
Bengaluru (Bangalore)
4 - 8 yrs
Best in industry
Scala
Python
Java
Cloud Computing
Google Cloud Storage
+3 more

InViz is a Bangalore-based startup helping enterprises simplify the search and discovery experience for both their end customers and their internal users. We use state-of-the-art techniques in Computer Vision, Natural Language Processing, Text Mining, and other ML areas to extract information and concepts from data in different formats (text, images, videos) and make them easily discoverable through simple, human-friendly touchpoints.

 

TSDE - Data 

Data Engineer:

 

  • 3-6 years of total experience in Data Engineering.
  • Experience coding data pipelines on GCP.
  • Prior experience on Hadoop systems is ideal, as the candidate may not have end-to-end GCP experience.
  • Strong in programming languages like Scala, Python, and Java.
  • Good understanding of various data storage formats and their advantages.
  • Exposure to GCP tools for developing end-to-end data pipelines for various scenarios (including ingesting data from traditional databases as well as integrating API-based data sources).
  • A business mindset to understand the data and how it will be used for BI and analytics purposes.
  • Data Engineer certification preferred.

 

Experience working with GCP tools like:

 
 

Store: Cloud SQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore

 

Ingest: Stackdriver, Pub/Sub, App Engine, Kubernetes Engine, Kafka, Dataprep, microservices

 

Schedule : Cloud Composer

 

Processing: Cloud Dataproc, Cloud Dataflow, Cloud Dataprep

 

CI/CD: Bitbucket + Jenkins / GitLab

 

Atlassian Suite
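The listing above groups separate GCP services by pipeline stage (ingest, process, store). As a rough illustration of that staged structure, and not of any specific GCP API, here is a minimal pure-Python sketch where each stage is a plain function; the sample records and stage names are invented for illustration, and in a real deployment the source might be Pub/Sub and the sink BigQuery:

```python
# Minimal sketch of a staged data pipeline: ingest -> transform -> store.
# Stages are plain functions here; on GCP they would map to services such
# as Pub/Sub (ingest), Dataflow (processing), and BigQuery (storage).

def ingest():
    # Stand-in for reading from an API or message queue.
    return [
        {"user": "a", "amount": "10.5"},
        {"user": "b", "amount": "3"},
    ]

def transform(records):
    # Normalize types so the storage layer sees a consistent schema.
    return [{"user": r["user"], "amount": float(r["amount"])} for r in records]

def store(records, sink):
    # Stand-in for a BigQuery load job; here we just append to a list.
    sink.extend(records)
    return len(records)

def run_pipeline(sink):
    return store(transform(ingest()), sink)
```

Keeping each stage behind a single-responsibility function is what makes it straightforward to swap a local stub for a managed GCP service later.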

 

 

 


at Benison Technologies

3 recruiters
Posted by Kiranpreet Kaur
Bengaluru (Bangalore), Pune
3 - 10 yrs
₹10L - ₹20L / yr
Go Programming (Golang)
Python
Java
Ruby
Ruby on Rails (ROR)
+4 more

Want to work with an established and growing IT company? Join team Benison to take on the right challenges that will help you accelerate your career growth to the next level, faster!

Benison Technologies was started in 2011 with a mission to revolutionize the silicon industry in India. With a host of amazing big clients like Google, Cisco, McAfee, and Intel, you get to experience the best of both worlds. If you consider yourself an engineer capable of joining our ever-growing team, then this is the right opportunity for you:

  

Why Benison Tech?   

We have a partial acquisition from one of the biggest names in the world (well, we can't name them, thanks to confidentiality). It's one of the FAANG companies, and you can "Google" it if you like.

Oh, and one more thing: this did not happen by accident; our team put in a ton of effort to turn this gigantic dream into a reality.

Benison Tech has a consistent history of demonstrating growth through innovation time and again.   

We don't stop there: we then re-invest our profits back into initiatives for the growth of our people, our culture, and the company. Now, enough about us; let's talk about the job roles and responsibilities:

  

What you will be working on: 

  • Be a key contributor to developing product strategies and features.
  • Software development for an industry-leading SaaS platform.
  • You will be closely involved in planning, designing, and integrating client requirements.
  • You will be working with one of the leaders in data resiliency and data protection.

  

Here are the technical skills required:

  • Independently own features and create feature test plans/strategies based on development and feature completion milestones.   
  • Identify quality assurance process bottlenecks and suggest actions for improvement.   
  • Design automation framework for automating feature tests.   
  • Participate in test case, test plan, and code reviews.
  • Resolve functional queries coming from other business units such as support, escalation, product management, etc.   
  • Participate in bug triaging and tracking quality assurance metrics.
  • Hands-on experience with Python-Selenium or Cypress will be preferred.
  • Familiarity with Test Management systems like XRay and bug tracker like JIRA tools.  
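Since the role calls for hands-on pytest-style automation, here is a small hedged sketch of what a feature test might look like. The `apply_discount` function and its rules are invented purely for illustration and are not part of the job posting:

```python
# Hypothetical feature under test: a discount rule (invented for this example).
def apply_discount(price, percent):
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Test functions written in the style pytest collects automatically
# (any top-level function named test_*).
def test_apply_discount_basic():
    assert apply_discount(100.0, 25) == 75.0

def test_apply_discount_zero():
    assert apply_discount(50.0, 0) == 50.0

def test_apply_discount_rejects_bad_percent():
    try:
        apply_discount(10.0, 150)
    except ValueError:
        pass  # expected: invalid percentages are rejected
    else:
        raise AssertionError("expected ValueError")
```

A feature test plan of the kind described above would group such cases by milestone: happy path, boundary values, and failure modes.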

  

What we expect from you:    

  • 3-10 Years of relevant experience in QA Automation.   
  • Expert at test automation, creating test plans and test strategies for testing multiple product modules.
  • Should be able to quickly analyze failures and trace them back to issues in the product or the automation suite.
  • As a Software Development Engineer in Test, you should be an expert at test automation for APIs as well as UI, creating test plans and test strategies for testing product features.
  • You will guide and mentor junior team members by reviewing their automation code and test cases to ensure good coverage and quality of a feature.
  • Resolve functional queries coming from other business units such as support, escalation, product management, etc.  
  • Be a quick learner and be open to working on new technologies if needed.    
  • Excellent team player with strong verbal & written communication skills.    
  • Be able to step up when the situation demands such as meeting deadlines and critical production issues.    
  • Propose changes or enhancements to the framework for enabling new feature tests.  

  

A few skills that will add brownie points to your role:

  • Working knowledge of Docker and Kubernetes will be an advantage.
  • Awareness of general manual and automation testing concepts and all types of testing methods.
  • Knowledge of the Backup or Storage domain will be an advantage.

  

If the above fits your skill-set and tickles your interest, then read below about the additional benefits that our company offers to talented folks like you:

  

Work Culture and Benefits    

  • Competitive salary and benefits package (H1-B, which means a chance to work onsite outside India).
  • A culture focused on talent development, where you get promoted within the quarterly cycle of your work anniversary.
  • Opportunity to work with cutting-edge and challenging technologies, including legacy tech.
  • An open cafeteria to grab some munchies while you work; we make sure the space feels like your second home, and you can even wear pyjamas if you like.
  • Employee engagement initiatives such as project parties, flexible work hours, long service awards, team bonding activities within the company, and extra learning and personal development trainings, because why stop your learning at one thing!
  • Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and your parents (with some of the best insurance partners in India).
  • Enjoy collaborative innovation (each member gets to innovate and think out of the box), along with highly experienced team managers who maintain diversity and work-life well-being.
  • And of course, you get to work on projects for some of the most recognised brands in the global networking and security space, unlocking opportunities to learn, grow, and contribute in a way that is truly impactful and purposeful at the same time.

  

Still not satisfied, and want more proof?  

Head to our website https://benisontech.com to learn more. 


Reputed client of People First

Agency job
Chennai, Bengaluru (Bangalore), Gurugram
2 - 9 yrs
₹10L - ₹30L / yr
DevOps
Kubernetes
Docker
Jenkins
CI/CD
+4 more
Job Description:
As an MLOps Engineer at QuantumBlack, you will:

Develop and deploy technology that enables data scientists and data engineers to build, productionize, and deploy machine learning models following best practices. Work to set the standards for SWE and DevOps practices within multi-disciplinary delivery teams.

Choose and use the right cloud services, DevOps tooling, and ML tooling for the team to be able to produce high-quality code that allows your team to release to production.

Build modern, scalable, and secure CI/CD pipelines to automate the development and deployment workflows used by data scientists (ML pipelines) and data engineers (data pipelines).

Shape and support next-generation technology that enables scaling ML products and platforms. Bring expertise in cloud to enable ML use case development, including MLOps.

Our Tech Stack-

We leverage AWS, Google Cloud, Azure, Databricks, Docker, Kubernetes, Argo, Airflow, Kedro, Python, Terraform, GitHub Actions, MLflow, Node.js, React, and TypeScript, amongst others, in our projects.

Key Skills:

• Excellent hands-on expert knowledge of cloud platform infrastructure and administration
(Azure/AWS/GCP) with strong knowledge of cloud services integration, and cloud security

• Expertise setting up CI/CD processes, building and maintaining secure DevOps pipelines with at
least 2 major DevOps stacks (e.g., Azure DevOps, Gitlab, Argo)

• Experience with modern development methods and tooling: Containers (e.g., docker) and
container orchestration (K8s), CI/CD tools (e.g., Circle CI, Jenkins, GitHub actions, Azure
DevOps), version control (Git, GitHub, GitLab), orchestration/DAGs tools (e.g., Argo, Airflow,
Kubeflow)

• Hands-on coding skills in Python 3 (e.g., APIs), including automated testing frameworks and libraries (e.g., pytest), Infrastructure as Code (e.g., Terraform), and Kubernetes artifacts (e.g., deployments, operators, Helm charts)

• Experience setting up at least one contemporary MLOps tool (e.g., experiment tracking, model governance, packaging, deployment, feature store)

• Practical knowledge delivering and maintaining production software such as APIs and cloud
infrastructure

• Knowledge of SQL (intermediate level or more preferred) and familiarity working with at least
one common RDBMS (MySQL, Postgres, SQL Server, Oracle)
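The last bullet asks for intermediate-level SQL against a common RDBMS. As a quick self-contained illustration (using SQLite purely so the example runs anywhere; the table and data are made up), an "intermediate" query typically combines aggregation, grouping, and a filter on the aggregate:

```python
import sqlite3

# In-memory database so the example is fully self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE runs (model TEXT, accuracy REAL)")
conn.executemany(
    "INSERT INTO runs VALUES (?, ?)",
    [("xgb", 0.91), ("xgb", 0.89), ("linear", 0.80)],
)

# Aggregate per model, keep only models with at least two runs,
# and order by average accuracy.
rows = conn.execute(
    """
    SELECT model, AVG(accuracy) AS avg_acc, COUNT(*) AS n
    FROM runs
    GROUP BY model
    HAVING COUNT(*) >= 2
    ORDER BY avg_acc DESC
    """
).fetchall()
```

The same query runs unchanged (or nearly so) on MySQL, Postgres, SQL Server, or Oracle, which is the point of asking for familiarity with at least one of them.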


at Whatdigital

1 recruiter
Posted by Dee Rao
Bengaluru (Bangalore)
3 - 5 yrs
₹6L - ₹12L / yr
Linux/Unix
Linux administration
Apache
MySQL
VPN
+1 more
Job Summary
  • Good experience with and exposure to cloud-native architecture, development, and deployment on public clouds such as AWS and Google Cloud
  • Responsible for Linux server installation, maintenance, monitoring, data backup and recovery, security, and administration
  • Understanding of clusters, distributed architecture, and container environments
  • Experience in networking, including Linux, software-defined networking, network virtualization, open protocols, application acceleration and load balancing, DNS, and virtual private networks
  • Knowledge of middleware such as MySQL and Apache
  • Responsible for managing network storage
  • Disaster recovery and incident response planning
  • Configuring/monitoring firewalls, routers, switches, and other network devices

Responsibilities and Duties
  • Support the globally distributed cloud development teams by maintaining the cloud infrastructure labs hosted in a hybrid cloud environment
  • Contribute towards optimization of the performance and cost of running the labs

at Modistabox

2 recruiters
Posted by Aarushi Mahajan
Bengaluru (Bangalore)
5 - 8 yrs
₹6L - ₹8L / yr
DevOps
Google Cloud Storage
Continuous Integration
Ansible
Amazon Web Services (AWS)
+3 more
Qualifications:
• Bachelor's or Master's degree in Computer Science or Software Engineering from a reputed university.
• 5-8 years of experience in building scalable, secure, and compliant systems.
• More than 2 years of experience working with GCP deployments serving millions of daily visitors.
• 5+ years of hosting experience in a large, heavy-traffic environment.
• 5+ years of production application support experience in a high-uptime environment.
• Software development and monitoring knowledge with automated builds.
• Technology:
o Cloud: AWS or Google Cloud
o Source Control: GitLab, Bitbucket, or GitHub
o Container Concepts: Docker, microservices
o Continuous Integration: Jenkins, Bamboo
o Infrastructure Automation: Puppet, Chef, or Ansible
o Deployment Automation: Jenkins, VSTS, or Octopus Deploy
o Orchestration: Kubernetes, Mesos, Swarm
o Automation: Node.js or Python
o Linux environment network administration, DNS, firewall, and security management
• Ability to adapt to the startup culture, handle multiple competing priorities, meet deadlines, and troubleshoot problems.

at Datalicious Pty Ltd

2 recruiters
Posted by Ramjee Ganti
Bengaluru (Bangalore)
2 - 7 yrs
₹7L - ₹20L / yr
Python
Amazon Web Services (AWS)
Google Cloud Storage
Big Data
Data Analytics
+3 more
DESCRIPTION:
We're looking for an experienced Data Engineer with strong cloud technology experience to help our big data team take our products to the next level. This is a hands-on role: you will be required to code and develop the product in addition to your leadership responsibilities. You need a strong software development background and should love working with cutting-edge big data platforms. You are expected to bring extensive hands-on experience with Amazon Web Services (Kinesis streams, EMR, Redshift), Spark, and other big data processing frameworks and technologies, as well as advanced knowledge of RDBMS and Data Warehousing solutions.

REQUIREMENTS:
  • Strong background working on large-scale Data Warehousing and data processing solutions.
  • Strong Python and Spark programming experience.
  • Strong experience in building big data pipelines.
  • Very strong SQL skills are an absolute must.
  • Good knowledge of OO, functional, and procedural programming paradigms.
  • Strong understanding of various design patterns.
  • Strong understanding of data structures and algorithms.
  • Strong experience with Linux operating systems.
  • At least 2+ years of experience working as a software developer in a data-driven environment.
  • Experience working in an agile environment.
  • Lots of passion, motivation, and drive to succeed!

Highly desirable:
  • Understanding of agile principles, specifically Scrum.
  • Exposure to Google Cloud Platform services such as BigQuery, Compute Engine, etc.
  • Docker, Puppet, Ansible, etc.
  • Understanding of the digital marketing and digital advertising space would be advantageous.

BENEFITS:
Datalicious is a global data technology company that helps marketers improve customer journeys through the implementation of smart data-driven marketing strategies. Our team of marketing data specialists offers a wide range of skills suitable for any challenge and covers everything from web analytics to data engineering, data science, and software development.

  • Experience: Join us at any level and we promise you'll feel up-levelled in no time, thanks to the fast-paced, transparent, and aggressive growth of Datalicious.
  • Exposure: Work with ONLY the best clients in the Australian and SEA markets; every problem you solve directly impacts millions of real people at a large scale across industries.
  • Work culture: Voted one of the Top 10 Tech Companies in Australia. Never a boring day at work, and we walk the talk. The CEO organises nerf-gun bouts in the middle of a hectic day.
  • Money: We'd love to have a long-term relationship, because long-term benefits are exponential. We encourage people to get technical certifications via online courses or digital schools.

So if you are looking for the chance to work for an innovative, fast-growing business that will give you exposure to a diverse range of the world's best clients, products, and industry-leading technologies, then Datalicious is the company for you!
Posted by Sindhu Narayan
Bengaluru (Bangalore)
3 - 9 yrs
₹6L - ₹18L / yr
MySQL
Python
Big Data
Google Cloud Storage
API
+3 more
Data Engineer: Pluto7 is a services and solutions company focused on building ML, AI, and analytics solutions to accelerate business transformation. We are a Premier Google Cloud Partner, servicing the Retail, Manufacturing, Healthcare, and Hi-Tech industries. We're seeking passionate people to work with us to change the way data is captured, accessed, and processed, to make data-driven, insightful decisions.

Must-have skills:
  • Hands-on experience in database systems (structured and unstructured).
  • Programming in Python, R, or SAS.
  • Overall knowledge of and exposure to architecting solutions on cloud platforms like GCP, AWS, and Microsoft Azure.
  • Develop and maintain scalable data pipelines, with a focus on writing clean, fault-tolerant code.
  • Hands-on experience in data model design and developing BigQuery/SQL (any variant) stored procedures.
  • Optimize data structures for efficient querying of those systems.
  • Collaborate with internal and external data sources to ensure integrations are accurate, scalable, and maintainable.
  • Collaborate with business intelligence/analytics teams on data mart optimizations, query tuning, and database design.
  • Execute proofs of concept to assess strategic opportunities and future data extraction and integration capabilities.
  • At least 2 years of experience in building applications, solutions, and products based on analytics.
  • Data extraction, data cleansing, and transformation.
  • Strong knowledge of REST APIs, HTTP servers, and MVC architecture.
  • Knowledge of continuous integration/continuous deployment.

Preferred but not required:
  • Machine learning and deep learning experience.
  • Certification on any cloud platform.
  • Experience with data migration from on-prem to cloud environments.
  • Exceptional analytical, quantitative, problem-solving, and critical thinking skills.
  • Excellent verbal and written communication skills.

Work location: Bangalore
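The "data cleansing and transformation" requirement can be pictured with a small sketch. The field names and validation rules below are invented for illustration and are not from the posting:

```python
# Toy cleansing step: trim strings, coerce numeric fields, drop bad rows.
def cleanse(raw_rows):
    clean = []
    for row in raw_rows:
        name = (row.get("name") or "").strip()
        try:
            qty = int(str(row.get("qty", "")).strip())
        except ValueError:
            continue  # unparseable quantity: drop the row
        if not name or qty < 0:
            continue  # missing name or negative quantity: drop the row
        clean.append({"name": name, "qty": qty})
    return clean
```

In a real pipeline the same rules would run inside a scalable framework (e.g., a Dataflow transform) rather than a plain loop, but the cleansing logic itself looks much the same.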

at Pramata

1 recruiter
Posted by Seena Narayanan
Bengaluru (Bangalore)
3 - 7 yrs
₹8L - ₹16L / yr
DevOps
Automation
Programming
Linux/Unix
Software deployment
+7 more
Job Title: DevOps Engineer
Work Experience: 3-7 years
Qualification: B.E / M.Tech
Location: Bangalore, India

About Pramata
Pramata's unique, industry-proven offering combines the digitization of critical customer data currently locked in unstructured and obscure sources, then converts that data into high-quality, actionable information accessible through one or multiple applications via the Pramata cloud-based customer digitization platform. Pramata's customers are some of the largest companies in the world, including CenturyLink, Comcast Business, FICO, HPE, Microsoft, NCR, Novelis, and Truven Health IBM. Pramata has helped these companies and more find millions of dollars in revenue, ensure regulatory and pricing compliance, and enable risk identification and management across their customer, partner, and even supplier bases. Pramata is headquartered near San Francisco, California and has its Product Engineering and Solutions Delivery Center in Bangalore, India.

How Pramata Works
Pramata extracts essential intelligence about customer relationships from complex, negotiated contracts, simplifies it from legalese into plain English, synthesizes it with data from CRM, CLM, billing, and other systems, and delivers it in the context of a particular user's role and responsibilities. This is done through Pramata's unique Digitization-as-a-Service (DaaS) process, which transforms unstructured and diverse data into accurate, timely, and meaningful digital information stored in the Pramata Digital Intelligence Hub. The Hub keeps the information centralized as one single, shared source of truth while ensuring that this data remains consistent, accessible, and highly secure.

The opportunity - What you get to do
You will be instrumental in bringing automation to development and testing pipelines, release management, configuration management, environment and application management, and day-to-day support of development teams. You will manage the development of capabilities to achieve higher automation, quality, and performance in automated build and deployment management, release management, on-demand environment configuration and automation, and configuration and change management, as well as in production environment support:
  • Application monitoring, performance management, and production support of mission-critical applications, including application and system uptime and remote diagnostics.
  • Security: ensure that the highly sensitive data from our customers is secure at all times.
  • Instrument applications for performance baselines and to aid rapid diagnostics and resolution in case of system issues.
  • High availability and disaster recovery: build and maintain systems that are designed to provide 99.9% uptime and ensure that disaster recovery mechanisms are in place.
  • Automate provisioning and integration tasks as required to deploy new code.
  • Monitoring: take proactive steps to monitor complex, interdependent systems to ensure that issues are identified and addressed in real time.

Skills required:
  • Excellent communicator with great interpersonal skills, driving clarity about intricate systems.
  • Hands-on experience with application infrastructure technologies like Linux (RHEL), MySQL, Apache, Nginx, Phusion Passenger, Redis, etc.
  • Good understanding of software application builds, configuration management, and deployments.
  • Strong scripting skills in Shell, Ruby, Python, Perl, etc., with a passion for automation.
  • Comfortable with collaboration, open communication, and reaching across functional borders.
  • Advanced problem-solving and task break-down ability.

Additional skills (good to have but not mandatory):
  • In-depth understanding of and experience working with any cloud platform (e.g., AWS, Azure, Google Cloud).
  • Experience using configuration management tools like Chef, Puppet, Capistrano, Ansible, etc.
  • Ability to work under pressure and solve problems using an analytical approach; decisive, fast-moving, and with a positive attitude.

Minimum qualifications:
  • Bachelor's degree in Computer Science or a related field.
  • Background in technology operations for Linux-based applications, with 2-4 years of experience in enterprise software.
  • Strong programming skills in Python, Shell, or Java.
  • Experience with one or more of the following configuration management tools: Ansible, Chef, Salt, Puppet.
  • Experience with one or more of the following databases: PostgreSQL, MySQL, Oracle, RDS.
Posted by Sindhu Narayan
Bengaluru (Bangalore)
2 - 7 yrs
₹4L - ₹20L / yr
Statistical Modeling
Data Science
TensorFlow
Python
Machine Learning (ML)
+5 more
Data Scientist: Pluto7 is a services and solutions company focused on building ML, AI, analytics, and IoT tailored solutions to accelerate business transformation. We are a Premier Google Cloud Partner, servicing the Retail, Manufacturing, Healthcare, and Hi-Tech industries. We are a Google premium partner in AI & ML, which means you'll have the opportunity to work and collaborate with folks from Google. Are you an innovator? Do you have a passion for working with data and finding insights, and an inquisitive mind with a constant yearning to learn new ideas? Then we are looking for you. As a Pluto7 Data Scientist, you will be one of the key members of our innovative artificial intelligence and machine learning team. You are expected to be unfazed by large volumes of data, love to apply various models, and use technology to process and filter data for analysis.

Responsibilities:
  • Build and optimize machine learning models.
  • Work with large/complex datasets to solve difficult and non-routine analysis problems, applying advanced analytical methods as needed.
  • Build and prototype data pipelines for analysis at scale.
  • Work cross-functionally with business analysts and data engineers to help develop cutting-edge and innovative artificial intelligence and machine learning models.
  • Make recommendations for selecting machine learning models.
  • Drive accuracy levels to the next stage for the given ML models.
  • Experience in developing visualisations.
  • Good exposure to exploratory data analysis.
  • Strong experience in statistics and ML algorithms.

Minimum qualifications:
  • 2+ years of relevant work experience in ML and advanced data analytics (e.g., as a Machine Learning Specialist / Data Scientist).
  • Strong experience using machine learning and artificial intelligence frameworks such as TensorFlow, scikit-learn, and Keras with Python.
  • Good Python/R/SAS programming.
  • Understanding of cloud platforms like GCP, AWS, or others.

Preferred qualifications:
  • Work experience in building data pipelines to ingest, cleanse, and transform data.
  • Applied experience with machine learning on large datasets and experience translating analysis results into business recommendations.
  • Demonstrated skill in selecting the right statistical tools for a given data analysis problem.
  • Demonstrated effective written and verbal communication skills.
  • Demonstrated willingness both to teach others and to learn new techniques.

Work location: Bangalore
Posted by Harry SH
Bengaluru (Bangalore)
4 - 8 yrs
₹8L - ₹15L / yr
Google Cloud Storage
Docker
Kubernetes
DevOps
Required skills:
  • Strong experience in AWS / Google Cloud.
  • Strong development experience in Perl, Python, Docker, and Postgres.
  • Strong experience in build/release management.
  • Working experience on Linux; excellent knowledge of shell scripts.
  • Knowledge of virtualization platforms (VMware).
  • Working experience with configuration management tools.
  • Working experience with test and build systems (Jenkins/Maven).
  • Strong communication skills, a passion to learn, and an ability to work well with people at all levels of an organization.

Roles and responsibilities:
  • Create deployment units consisting of build, documentation, and installation artifacts.
  • Prepare delivery definition, release note, and production turn-over note documents.
  • Establish DevOps policies.
  • Communicate with developers, product managers, and technical support specialists on product issues.
  • Assist in creating and maintaining the configuration and change management plan for the project.
  • Choose suitable DevOps tools.
  • Set up the configuration management environment.
  • Assist in routine back-up and archival of the project repository.