GCP Data Engineer, WFH
Agency job
5 - 15 yrs
₹27L - ₹30L / yr
Remote only
Skills
Data engineering
Google Cloud Platform (GCP)
Python

• Hands-on experience in data engineering and GCP data technologies.

• Work with client teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from the Google Cloud Platform.

• Work with Agile and DevOps techniques and implementation approaches in delivery.

• Showcase your GCP data engineering experience when discussing requirements with clients, and turn those requirements into technical data solutions.

• Build and deliver data solutions using GCP products and offerings.
• Hands-on experience with Python.
• Experience with SQL or MySQL; experience with Looker is an added advantage.
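
To give a concrete flavour of the GCP data work described above, here is a minimal sketch of loading a CSV file from Cloud Storage into BigQuery with the Python client library. The project, dataset, bucket and table names are illustrative placeholders, not part of the actual role.

```python
# Minimal sketch: load a CSV from Cloud Storage into BigQuery.
# Assumes google-cloud-bigquery is installed and credentials are configured;
# all resource names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

table_id = "my-project.analytics.daily_events"     # placeholder table
source_uri = "gs://my-bucket/exports/events.csv"   # placeholder bucket/object

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the header row
    autodetect=True,       # infer the schema from the file
)

load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()          # block until the load job completes

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")
```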



Similar jobs

TEKsystems
Posted by Priyanka Kanwar
Gurugram
5 - 10 yrs
₹15L - ₹25L / yr
Apache Spark
Amazon Web Services (AWS)
Python
Airflow
Algorithms

TOP SKILLS

Python (language)

Spark framework

Spark Streaming

Docker/Jenkins/Spinnaker

AWS

Hive queries

The candidate should be a good coder.

Preferred: Airflow

Must-have experience:

Python

Spark framework and streaming

Exposure to the Machine Learning lifecycle is mandatory.

Project:

This is a search-domain project: for any search activity that happens on the website, this team builds the models, creating sorting/scoring models for search results together with the data scientists. The team works mostly on the streaming side of the data, so the candidate would work extensively on Spark Streaming, and there will be a lot of Machine Learning work.
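
To give a flavour of the streaming work this project involves, below is a minimal Spark Structured Streaming sketch that counts search queries per time window. The Kafka topic, bootstrap servers and windowed count are illustrative assumptions, not the team's actual pipeline, and the job assumes the spark-sql-kafka package is available on the cluster.

```python
# Minimal sketch: windowed counts over a stream of search events from Kafka.
# Topic name, servers, and the "popularity" aggregation are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, count, window

spark = SparkSession.builder.appName("search-events-stream").getOrCreate()

# Read raw search events from Kafka.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")   # placeholder
    .option("subscribe", "search-events")                   # placeholder topic
    .load()
)

# Kafka delivers the value as binary; cast it to a string query term.
queries = events.selectExpr("CAST(value AS STRING) AS query", "timestamp")

# Count queries per 5-minute window as a toy popularity signal.
popularity = (
    queries
    .withWatermark("timestamp", "10 minutes")
    .groupBy(window(col("timestamp"), "5 minutes"), col("query"))
    .agg(count("*").alias("hits"))
)

# Write the running aggregates to the console for inspection.
stream = (
    popularity.writeStream
    .outputMode("update")
    .format("console")
    .start()
)
stream.awaitTermination()
```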


INTERVIEW INFORMATION

3-4 rounds.

1st round: data engineering, batch-processing experience.

2nd round: data engineering, streaming experience.

3rd round: ML lifecycle (this round may become a techno-functional round based on earlier feedback; otherwise a 4th, functional round will be held if required).

Remote only
8 - 16 yrs
₹20L - ₹50L / yr
Data Science
Machine Learning (ML)
Python
SageMaker
Go Programming (Golang)
Data Scientist Lead / Manager
Job Description:
We are looking for an exceptional Data Scientist Lead / Manager who is passionate about data and motivated to build large-scale machine learning solutions that power our data products. This person will contribute to data analytics for insight discovery and to the development of machine learning pipelines that support modelling terabytes of daily data for various use cases.

Location: Pune (initially remote due to COVID-19)

Note: Looking for someone who can start immediately or within a month. Hands-on experience in Python programming (minimum 5 years) is a must.

About the Organisation:

- It provides a dynamic, fun workplace filled with passionate individuals. We are at the cutting edge of advertising technology and there is never a dull moment at work.

- We have a truly global footprint, with our headquarters in Singapore and offices in Australia, United States, Germany, United Kingdom and India.

- You will gain work experience in a global environment. We speak over 20 different languages, come from more than 16 different nationalities, and over 42% of our staff are multilingual.


Qualifications:
• 8+ years of relevant working experience
• Master's or Bachelor's degree in computer science or engineering
• Working knowledge of Python and SQL
• Experience with time series data, data manipulation, analytics, and visualization
• Experience working with large-scale data
• Proficiency with various ML algorithms for supervised and unsupervised learning
• Experience working in an Agile/Lean model
• Experience with Java and Golang is a plus
• Experience with BI toolkits such as Tableau, Superset, QuickSight, etc. is a plus
• Exposure to building large-scale ML models using one or more modern tools and libraries such as AWS SageMaker, Spark MLlib, Dask, TensorFlow, PyTorch, Keras, the GCP ML stack
• Exposure to modern Big Data tech such as Cassandra/Scylla, Kafka, Ceph, Hadoop, Spark
• Exposure to IaaS platforms such as AWS, GCP, Azure

Typical persona: Data Science Manager/Architect
Experience: 8+ years programming/engineering experience (with at least last 4 years in Data science in a Product development company)
Type: Hands-on candidate only

Must:
a. Hands-on Python: pandas, scikit-learn (see the sketch after the Desired list below)
b. Working knowledge of Kafka
c. Able to carry out own tasks and help the team resolve problems, logical or technical (25% of the job)
d. Good analytical and debugging skills
e. Strong communication skills

Desired (in order of priority):
a. Go (strong advantage)
b. Airflow (strong advantage)
c. Familiarity and working experience with more than one type of database: relational, object, columnar, graph and other unstructured databases
d. Data structures, algorithms
e. Experience with multi-threading and thread-synchronization concepts
f. AWS SageMaker
g. Keras
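
As referenced in point (a) of the Must list, here is a minimal pandas/scikit-learn sketch of the expected hands-on style. The CSV file, column names and choice of model are illustrative assumptions, not part of the actual role.

```python
# Minimal sketch: basic pandas data handling plus a scikit-learn classifier.
# The dataset, target column, and model choice are illustrative placeholders;
# the remaining columns are assumed to be numeric features.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("events.csv")                 # placeholder dataset
df = df.dropna(subset=["target"])              # basic cleaning with pandas

X = df.drop(columns=["target"])
y = df["target"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))
```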
RandomTrees
Posted by Amareswarreddt Yaddula
Remote only
5 - 10 yrs
₹1L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark

Job Title: Senior Data Engineer

Experience: 8 to 11 years

Location: Remote

Notice period: Immediate or max 1 month

Role: Permanent


Skill set: Google Cloud Platform, BigQuery, Java, Python, Airflow, Dataflow, Apache Beam.


Experience required:

5 years of experience in software design and development, with 4 years in the data engineering field, is preferred.

2 years of hands-on experience with GCP cloud data implementation suites such as BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, Cloud Storage, etc.

Strong experience with and understanding of very large-scale data architecture, solutions, and operationalization of data warehouses, data lakes, and analytics platforms.

Mandatory 1 year of software development using Java or Python.

Extensive hands-on experience working with data using SQL and Python.


Must have: GCP, BigQuery, Airflow, Dataflow, Python, Java.


GCP knowledge is a must.

Java as the programming language (preferred).

BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, Cloud Storage.

Python.

Good communication skills.
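
As an illustration of the GCP, BigQuery and Airflow stack listed above, here is a minimal Airflow DAG sketch that loads a daily file from Cloud Storage into BigQuery and then runs an aggregation query. The bucket, project, dataset and table names and the query itself are illustrative assumptions; the DAG assumes the apache-airflow-providers-google package is installed.

```python
# Minimal Airflow DAG sketch: GCS -> BigQuery load, then a SQL aggregation.
# All resource names and the query are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_events_to_bigquery",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:

    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_events",
        bucket="my-bucket",                                        # placeholder bucket
        source_objects=["exports/events_{{ ds }}.csv"],            # daily file per run date
        destination_project_dataset_table="my-project.analytics.raw_events",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

    aggregate = BigQueryInsertJobOperator(
        task_id="aggregate_daily_events",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.analytics.daily_summary` AS
                    SELECT event_type, COUNT(*) AS events
                    FROM `my-project.analytics.raw_events`
                    GROUP BY event_type
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> aggregate   # load first, then aggregate
```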


Kloud9 Technologies
Bengaluru (Bangalore)
3 - 6 yrs
₹5L - ₹20L / yr
Amazon Web Services (AWS)
Amazon EMR
EMR
Spark
PySpark

About Kloud9:

Kloud9 exists with the sole purpose of providing cloud expertise to the retail industry. Our team of cloud architects, engineers and developers helps retailers launch a successful cloud initiative, so you can quickly realise the benefits of cloud technology. Our standardised, proven cloud adoption methodologies reduce cloud adoption time and effort, so you can directly benefit from lower migration costs.

Kloud9 was founded with the vision of bridging the gap between e-commerce and the cloud. E-commerce in any industry is constrained by, and faces a huge challenge in, the finances spent on physical data infrastructure.

At Kloud9, we know migrating to the cloud is the single most significant technology shift your company faces today. We are your trusted advisors in transformation and are determined to build a deep partnership along the way. Our cloud and retail experts will ease your transition to the cloud.

Our sole focus is to provide cloud expertise to the retail industry, giving our clients the empowerment that will take their business to the next level. Our team of proficient architects, engineers and developers has been designing, building and implementing solutions for retailers for an average of more than 20 years.

We are a cloud vendor that is both platform and technology independent. Our vendor independence not only gives us a unique perspective on the cloud market but also ensures that we deliver the cloud solutions that best meet our clients' requirements.


What we are looking for:

● 3+ years' experience developing data & analytics solutions

● Experience building data lake solutions leveraging one or more of the following: AWS, EMR, S3, Hive & Spark (a minimal sketch follows this list)

● Experience with relational SQL

● Experience with scripting languages such as Shell and Python

● Experience with source control tools such as GitHub and the related development process

● Experience with workflow scheduling tools such as Airflow

● In-depth knowledge of scalable cloud architectures

● A passion for data solutions

● Strong understanding of data structures and algorithms

● Strong understanding of solution and technical design

● A strong problem-solving and analytical mindset

● Experience working with Agile teams

● Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders

● Able to quickly pick up new programming languages, technologies, and frameworks

● Bachelor's degree in computer science
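
As referenced in the data lake point above, here is a minimal PySpark sketch of the kind of work involved: read raw order events from S3, aggregate them, and write partitioned Parquet back. Bucket paths and column names are illustrative assumptions, and the cluster (for example EMR) is assumed to already have S3 access configured.

```python
# Minimal PySpark sketch: raw JSON from S3 -> daily revenue -> partitioned Parquet.
# Paths and columns are placeholders; S3 credentials are assumed configured on the cluster.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-datalake").getOrCreate()

orders = spark.read.json("s3://my-raw-bucket/orders/")        # placeholder path

daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))          # derive a date column
    .groupBy("order_date", "store_id")
    .agg(F.sum("amount").alias("revenue"))
)

(daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://my-curated-bucket/daily_revenue/"))         # placeholder path
```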


Why Explore a Career at Kloud9:

With job opportunities in prime locations in the US, London, Poland and Bengaluru, we help you build a career path in cutting-edge technologies such as AI, Machine Learning and Data Science. Be part of an inclusive and diverse workforce that is changing the face of retail technology with its creativity and innovative solutions. Our vested interest in our employees translates into delivering the best products and solutions to our customers.

Remote only
3 - 6 yrs
₹12L - ₹23L / yr
Deep Learning
Computer Vision
PyTorch
TensorFlow
Python
This person MUST have:
- B.E. in Computer Science or equivalent.
- In-depth knowledge of machine learning algorithms and their applications, including practical experience with, and theoretical understanding of, algorithms for classification, regression and clustering.
- Hands-on experience in computer vision and deep learning projects solving real-world vision tasks such as object detection, object tracking, instance segmentation, activity detection, depth estimation, optical flow, multi-view geometry, domain adaptation, etc.
- Strong understanding of modern and traditional computer vision algorithms.
- Experience with one of the deep learning frameworks / model families: PyTorch, TensorFlow, Darknet (YOLOv4/v5), U-Net, Mask R-CNN, EfficientDet, BERT, etc.
- Proficiency with CNN architectures such as ResNet, VGG, U-Net, MobileNet, pix2pix, and CycleGAN.
- Experienced user of libraries such as OpenCV, scikit-learn, matplotlib and pandas.
- Ability to turn research articles into working solutions to real-world problems.
- High proficiency in Python programming.
- Familiarity with software development practices and pipelines (DevOps: Kubernetes, Docker containers, CI/CD tools).
- Strong communication skills.
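
To illustrate the deep learning tooling listed above, here is a minimal PyTorch/torchvision sketch that runs a pretrained ResNet-50 on a single image. It assumes torchvision 0.13 or newer; the image path is a placeholder, and a real project would involve training or fine-tuning rather than plain inference.

```python
# Minimal sketch: classify one image with a pretrained ResNet-50.
# The image path is a placeholder; torchvision >= 0.13 is assumed.
import torch
from PIL import Image
from torchvision import models

weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights)
model.eval()

preprocess = weights.transforms()                   # the preprocessing the model expects

image = Image.open("example.jpg").convert("RGB")    # placeholder image
batch = preprocess(image).unsqueeze(0)              # add a batch dimension

with torch.no_grad():
    logits = model(batch)

class_id = int(logits.argmax(dim=1))
print("predicted class:", weights.meta["categories"][class_id])
```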
Aikon Labs Private Limited
Posted by Shankar K
Pune
0 - 5 yrs
₹1L - ₹8L / yr
Natural Language Processing (NLP)
Machine Learning (ML)
Data Structures
Algorithms
Deep Learning
About us
Aikon Labs Pvt Ltd is a start-up focused on realizing ideas. One such idea is iEngage.io, our Intelligent Engagement Platform. We leverage Augmented Intelligence, a combination of machine-driven insights and human understanding, to serve a timely response to every interaction from the people you care about.
Get in touch if you are interested.

Do you have a passion to be a part of an innovative startup? Here’s an opportunity for you - become an active member of our core platform development team.

Main Duties
● Quickly research the latest innovations in Machine Learning, especially in Natural Language Understanding, and implement them where useful
● Train models to provide different insights, mainly from text but also from other media such as audio and video
● Validate the trained models; fine-tune and optimise as necessary
● Deploy validated models wrapped in a Flask server as a REST API, or containerised in Docker containers (see the sketch after this list)
● Build preprocessing pipelines for the models that are being served as a REST API
● Periodically test and validate the models in use; update where necessary
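
As referenced in the deployment duty above, here is a minimal sketch of wrapping a validated model in a Flask server as a REST API. The model file, feature handling and endpoint name are illustrative assumptions; a real deployment would add input validation, logging and authentication.

```python
# Minimal sketch: serve a previously trained model behind a Flask REST endpoint.
# The model file and payload format are placeholders.
import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("model.joblib")   # placeholder: a validated, pickled model


@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json(force=True)
    features = payload["features"]                 # e.g. a list of numeric features
    prediction = model.predict([features])[0]
    return jsonify({"prediction": str(prediction)})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```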

Role & Relationships
We consider ourselves a team, and you will be a valuable part of it. You could be reporting to a senior member or directly to our Founder & CEO.

Educational Qualifications
We don’t discriminate, as long as you have the required skill set and the right attitude.

Experience
Up to two years of experience, preferably working on ML. Freshers are welcome too!

Skills
Good
● Strong understanding of Java / Python
● Clarity on the concepts of Data Science
● A strong grounding in core Machine Learning
● Ability to wrangle and manipulate data into a processable form
● Knowledge of web technologies such as web servers (Flask, Django, etc.) and REST APIs
Even better
● Experience with deep learning
● Experience with frameworks like scikit-learn, TensorFlow, PyTorch, Keras
Competencies
● Knowledge of NLP libraries such as NLTK, spaCy, Gensim
● Knowledge of NLP models such as Word2vec, GloVe, ELMo, fastText (see the sketch after this list)
● An aptitude for solving problems and learning new things
● Highly self-motivated
● An analytical frame of mind
● Ability to work in a fast-paced, dynamic environment
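
As referenced in the Word2vec competency above, here is a minimal Gensim (4.x) sketch that trains a tiny Word2Vec model on toy sentences and queries similar words. The corpus is an illustrative placeholder, not real project data.

```python
# Minimal sketch: train a toy Word2Vec model and query word similarity.
# The sentences are placeholder data; gensim 4.x API is assumed.
from gensim.models import Word2Vec

sentences = [
    ["customer", "asked", "about", "pricing"],
    ["customer", "raised", "a", "support", "ticket"],
    ["agent", "resolved", "the", "support", "ticket"],
]

model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, epochs=50)

print(model.wv.most_similar("support", topn=3))   # nearest words by cosine similarity
```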

Location
Pune

Remuneration
Once we meet, we shall make an offer depending on how good a fit you are & the experience you already have
Simplilearn Solutions
Posted by Aniket Manhar Nanjee
Bengaluru (Bangalore)
2 - 5 yrs
₹6L - ₹10L / yr
Data Science
R Programming
Python
Scala
Tableau
Simplilearn.com is the world’s largest professional certifications company and an Onalytica Top 20 influential brand. With a library of 400+ courses, we've helped 500,000+ professionals advance their careers, delivering $5 billion in pay raises. Simplilearn has over 6500 employees worldwide and our customers include Fortune 1000 companies, top universities, leading agencies and hundreds of thousands of working professionals. We are growing over 200% year on year and having fun doing it.

Description
We are looking for candidates with strong technical skills and a proven track record of building predictive solutions for enterprises. This is a very challenging role and provides an opportunity to work on developing insight-driven Ed-Tech software products used by a large set of customers across the globe. It provides an exciting opportunity to work on various advanced analytics and data science problem statements using cutting-edge modern technologies, collaborating with product, marketing and sales teams.

Responsibilities
• Work on enterprise-level advanced reporting requirements and data analysis.
• Solve various data science problems: customer engagement, dynamic pricing, lead scoring, NPS improvement, optimization, chatbots, etc.
• Work on data engineering problems utilizing our tech stack: S3 data lake, Spark, Redshift, Presto, Druid, Airflow, etc.
• Collect relevant data from source systems, and use crawling and parsing infrastructure to put together data sets.
• Craft, conduct and analyse A/B experiments to evaluate machine learning models/algorithms (a minimal sketch follows this listing).
• Communicate findings and take algorithms/models to production with ownership.

Desired Skills
• BE/BTech/MSc/MS in Computer Science or a related technical field.
• 2-5 years of experience in the advanced analytics discipline with solid data engineering and visualization skills.
• Strong SQL skills and BI skills using Tableau, and the ability to perform various complex analytics on data.
• Ability to propose hypotheses and design experiments in the context of specific problems using statistics and ML algorithms.
• Good overlap with modern data processing frameworks such as AWS Lambda and Spark using Scala or Python.
• Dedication and diligence in understanding the application domain, collecting/cleaning data and conducting various A/B experiments.
• A Bachelor's degree in Statistics, or prior experience with Ed-Tech, is a plus.
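
As referenced in the A/B experiments responsibility above, here is a minimal sketch of analysing one such experiment with a two-sample t-test on per-user conversion outcomes. The conversion rates and sample sizes are made up for illustration; a real analysis would also consider effect size, sample-size planning and multiple comparisons.

```python
# Minimal sketch: compare control vs. variant conversion with a t-test.
# The data is synthetic and purely illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

control = rng.binomial(1, 0.10, size=5000)    # baseline model, ~10% conversion
variant = rng.binomial(1, 0.11, size=5000)    # candidate model, ~11% conversion

t_stat, p_value = stats.ttest_ind(variant, control)

print(f"control rate={control.mean():.3f}, variant rate={variant.mean():.3f}")
print(f"t={t_stat:.2f}, p={p_value:.4f}")     # a small p-value suggests a real difference
```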
IQVIA
Posted by Nishigandha Wagh
Pune
3 - 6 yrs
₹5L - ₹15L / yr
Data Warehouse (DWH)
Business Intelligence (BI)
Amazon Web Services (AWS)
SQL
MDM
Consultants will have the opportunity to:
- Build a team with skills in ETL, reporting, MDM and ad-hoc analytics support
- Build technical solutions using the latest open-source and cloud-based technologies
- Work closely with the offshore senior consultant, the onshore team and the client's business and IT teams to gather project requirements
- Assist overall project execution from India: project planning, team formation, system design and development, testing, UAT and deployment
- Build demos and POCs in support of business development for new and existing clients
- Prepare project documents and PowerPoint presentations for client communication
- Conduct training sessions to train associates and help shape their growth
Bengaluru (Bangalore)
4 - 9 yrs
₹15L - ₹30L / yr
Big Data
Hadoop
Data processing
Python
Data engineering

REQUIREMENT:

  •  Previous experience of working in large-scale data engineering.
  •  4+ years of experience working in data engineering and/or backend technologies; cloud experience (any) is mandatory.
  •  Previous experience architecting and designing backends for large-scale data processing.
  •  Familiarity and experience with different technologies related to data engineering: different database technologies, Hadoop, Spark, Storm, Hive, etc.
  •  Hands-on, with the ability to contribute a key portion of the data engineering backend.
  •  Self-driven and motivated to deliver exceptional results.
  •  Familiarity and experience with the different stages of data engineering: data acquisition, data refining, large-scale data processing, and efficient data storage for business analysis.
  •  Familiarity and experience with different DB technologies and how to scale them.

RESPONSIBILITY:

  •  End-to-end responsibility for the data engineering architecture, design, development and implementation.
  •  Build data engineering workflows for large-scale data processing.
  •  Discover opportunities in data acquisition.
  •  Bring industry best practices to the data engineering workflow.
  •  Develop data set processes for data modelling, mining and production.
  •  Take on additional technical responsibilities to drive initiatives to completion.
  •  Recommend ways to improve data reliability, efficiency and quality.
  •  Goes out of their way to reduce complexity.
  •  Humble and outgoing - an engineering cheerleader.
One Labs
Posted by Rahul Gupta
NCR (Delhi | Gurgaon | Noida)
1 - 3 yrs
₹3L - ₹6L / yr
Data Science
Deep Learning
Python
Keras
TensorFlow

Job Description

We are looking for a data scientist who will help us discover the information hidden in vast amounts of data, and help us make smarter decisions to deliver even better products. Your primary focus will be applying data mining techniques, doing statistical analysis, and building high-quality prediction systems integrated with our products.

Responsibilities

  • Selecting features, and building and optimizing classifiers using machine learning techniques
  • Data mining using state-of-the-art methods
  • Extending the company’s data with third-party sources of information when needed
  • Enhancing data collection procedures to include information that is relevant for building analytic systems
  • Processing, cleansing, and verifying the integrity of data used for analysis
  • Doing ad-hoc analysis and presenting results in a clear manner
  • Creating automated anomaly detection systems and constantly tracking their performance (a minimal sketch follows this list)
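
As referenced in the anomaly detection responsibility above, here is a minimal scikit-learn sketch using IsolationForest. The synthetic data and contamination rate are illustrative assumptions rather than a production configuration.

```python
# Minimal sketch: flag anomalous rows with an IsolationForest.
# The data is synthetic; a real system would score live data and track drift.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

normal = rng.normal(loc=0.0, scale=1.0, size=(1000, 3))    # typical observations
outliers = rng.normal(loc=6.0, scale=1.0, size=(10, 3))    # injected anomalies
X = np.vstack([normal, outliers])

detector = IsolationForest(contamination=0.01, random_state=0)
labels = detector.fit_predict(X)        # -1 marks anomalies, 1 marks normal rows

print("flagged anomalies:", int((labels == -1).sum()))
```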

Skills and Qualifications

  • Excellent understanding of machine learning techniques and algorithms, such as linear regression, SVMs, decision forests, LSTMs, CNNs, etc.
  • Experience with deep learning preferred.
  • Experience with common data science toolkits, such as R, NumPy, MATLAB, etc.; excellence in at least one of these is highly desirable
  • Great communication skills
  • Proficiency with query languages such as SQL, Hive, Pig
  • Good applied statistics skills, such as statistical testing, regression, etc.
  • Good scripting and programming skills
  • A data-oriented personality