
11+ PuTTY Jobs in Mumbai | PuTTY Job openings in Mumbai

Apply to 11+ PuTTY Jobs in Mumbai on CutShort.io. Explore the latest PuTTY Job opportunities across top companies like Google, Amazon & Adobe.

Magic9 Media and Consumer Knowledge Pvt. Ltd.
Mumbai
3 - 5 yrs
₹7L - ₹12L / yr
ETL
SQL
Python
Statistical Analysis
Machine Learning (ML)

Job Description

This requirement is to service our client, a leading big data technology company that measures what viewers consume across platforms to enable marketers to make better advertising decisions. We are seeking a Senior Data Operations Analyst to mine large-scale datasets for our client. Your work will have a direct impact on driving business strategies for prominent industry leaders. Self-motivation and strong communication skills are both must-haves, and the ability to work in a fast-paced environment is desired.


Problems being solved by our client: 

Measure consumer usage of devices linked to the internet and home networks, including computers, mobile phones, tablets, streaming sticks, smart TVs, thermostats and other appliances. There are more screens and other connected devices in homes than ever before, yet there have been major gaps in understanding how consumers interact with this technology. Our client uses a measurement technology to unravel the dynamics of consumers’ interactions with multiple devices.


Duties and responsibilities:

  • Contribute to the development of novel audience measurement and demographic inference solutions.
  • Develop, implement, and support statistical and machine learning methodologies and processes.
  • Build and test new features and concepts, and integrate them into the production process.
  • Participate in ongoing research and evaluation of new technologies.
  • Apply your experience across the development lifecycle: analysis, design, development, testing and deployment.
  • Collaborate with teams in Software Engineering, Operations, and Product Management to deliver timely, quality data. You will be the knowledge expert, delivering quality data to our clients.

Qualifications:

  • 3-5 years of relevant work experience in the areas outlined below.
  • Experience extracting data from large databases using SQL.
  • Experience writing complex ETL processes and frameworks for analytics and data management; hands-on experience with ETL tools is a must.
  • Master’s degree or PhD in Statistics, Data Science, Economics, Operations Research, Computer Science, or a similar field with a focus on statistical methods. A Bachelor’s degree in the same fields with significant, demonstrated professional research experience will also be considered.
  • Programming experience in a scientific computing language (R, Python, Julia) and the ability to interact with relational data (SQL, Apache Pig, SparkSQL). General-purpose programming (Python, Scala, Java) and familiarity with Hadoop are a plus.
  • Excellent verbal, written and computer communication skills.
  • Experience with TV or digital audience measurement or market research data is a plus.
  • Familiarity with systems analysis or systems thinking is a plus.
  • Must be comfortable analyzing complex, high-volume, high-dimension data from varying sources.
  • Ability to engage with senior leaders across all functional departments.
  • Ability to take on new responsibilities and adapt to changes.
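As a small hedged illustration of the SQL extraction skill listed above, the sketch below uses Python's standard sqlite3 module against an in-memory table; the schema, data, and chunk size are invented for the example, and a production database would be far larger.

```python
import sqlite3

def extract_in_chunks(conn, query, chunk_size=2):
    """Yield result rows in fixed-size chunks to avoid loading everything at once."""
    cur = conn.execute(query)
    while True:
        rows = cur.fetchmany(chunk_size)
        if not rows:
            break
        yield from rows

# Small in-memory stand-in for a large production database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (device TEXT, minutes INTEGER)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [("tv", 120), ("phone", 45), ("tablet", 30), ("tv", 60)])

total = sum(m for _, m in extract_in_chunks(conn, "SELECT device, minutes FROM events"))
print(total)  # 255
```

The same chunked-iteration pattern carries over to real client libraries (server-side cursors in PostgreSQL, fetch sizes in JDBC) when result sets are too big to hold in memory.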

 

Scremer
Posted by Sathish Dhawan
Pune, Mumbai
6 - 11 yrs
₹15L - ₹15L / yr
Amazon Web Services (AWS)
Python
Java
Spark


Primary Skills

DynamoDB, Java, Kafka, Spark, Amazon Redshift, AWS Lake Formation, AWS Glue, Python


Skills:

Demonstrated growth and solid work experience as a Data Engineer.

Hands-on programming experience.

Implementation experience with Kafka, Kinesis, Spark, AWS Glue, and AWS Lake Formation.

Excellent knowledge of Python, Scala/Java, Spark, AWS (Lambda, Step Functions, DynamoDB, EMR), Terraform, UI (Angular), Git, and Maven.

Experience with performance optimization in batch and real-time processing applications.

Expertise in data governance and data security implementation.

Good hands-on design and programming skills for building reusable tools and products. Experience developing on AWS or similar cloud platforms; preferred: ECS, EKS, S3, EMR, DynamoDB, Aurora, Redshift, QuickSight or similar.

Familiarity with systems handling a very high volume of transactions, microservice design, or data processing pipelines (Spark).

Knowledge of and hands-on experience with serverless technologies such as Lambda, MSK, MWAA, and Kinesis Analytics is a plus.

Expertise in practices like Agile, peer reviews, and continuous integration.


Roles and responsibilities:

Determining project requirements and developing work schedules for the team.

Delegating tasks and achieving daily, weekly, and monthly goals.

Responsible for designing, building, testing, and deploying the software releases.


Salary: 25LPA-40LPA

UpSolve Solutions LLP
Posted by Shaurya Kuchhal
Mumbai, Navi Mumbai
2 - 6 yrs
₹4L - ₹8L / yr
Data Warehouse (DWH)
Informatica
ETL
SQL
MS-PowerPoint

Company Description

UpSolve is a Gen AI and Vision AI startup that helps businesses solve their problems by building custom solutions that drive strategic business decisions. Whether your business is facing time constraints or a lack of resources, UpSolve can help. We build enterprise-grade AI solutions with a focus on increasing ROI.


Role Description

This is a full-time hybrid role for a Business Analyst located in Mumbai.


Please note: This is an onsite role and good communication skills are expected (oral + written)


Responsibilities

1. Understand existing system integrations for the client.

2. Map and identify gaps in existing systems.

3. Ideate, advise on, and implement AI solutions to optimize business processes.

4. Collaborate with multiple teams and stakeholders.


Qualifications

  • MBA with a focus on Business Analytics, or a Bachelor's degree in Computer Science or IT
  • Minimum 4 years of experience
  • Strong written, verbal and collaboration skills
  • Immediate joiner (less than 5 days)


Work Location: Mumbai, Work from Office

A global Business Process Management company


Agency job
via Jobdost by Saida Jabbar
Bengaluru (Bangalore), Mumbai, Pune, Gurugram, Chennai
7 - 15 yrs
₹16L - ₹30L / yr
Agile/Scrum
JIRA
MPP
PPMC

IT Project Manager – Team Lead 

About the position:- 

Responsibilities as a Project Manager:

 

  • Stakeholder management and budget management.
  • Merck-specific processes for project execution (risk, info classification, IT security, etc.).
  • Client communication and updates.
  • Daily monitoring and follow-up with development, testing, and implementation teams for status updates.
  • Project planning, estimation and scheduling of tasks/assignments for the team.
  • Single point of contact for any other teams.
  • Close monitoring of allocation and utilization.
  • Status reporting to LT and stakeholders.
  • Responsible for testing and releases; coding standards and documentation for the project.
  • Responsible for change and risk management for the project.
codersbrain
Hyderabad, Delhi, Gurugram, Noida, Bengaluru (Bangalore), Mumbai, Kolkata
8 - 15 yrs
₹5L - ₹16L / yr
Informatica
Informatica Data Quality
informatica cloud data quality
SQL
Digital

JD:

Location: Pan India

Experience: 8 to 15 years


Must-have skills

  • "Senior" is defined as 8+ years of IT experience, with a minimum of 5+ years of digital experience
  • Minimum 3 years of hands-on experience with the Informatica Cloud Data Quality (CDQ) toolset
  • Strong SQL skills

Nice-to-have skills

  • Knowledge of Informatica AXON Data Governance
  • Familiarity with Enterprise Data Catalog
  • Familiarity with Cloud Data Governance and Catalog (CDGC)


Egnyte

Posted by Prasanth Mulleti
Remote, Mumbai
4 - 10 yrs
Best in industry
Data Science
data scientist
Machine Learning (ML)
Time series
QoS

Job Description

We are looking for an experienced engineer to join our data science team to help us design, develop, and deploy machine learning models in production. You will develop robust models and prepare their deployment into production in a controlled manner, while providing appropriate means to monitor their performance and stability after deployment.

 

What you’ll do will include (but is not limited to):

  • Prepare the datasets needed to train and validate our machine learning models
  • Anticipate and build solutions for problems that interrupt availability, performance, and stability in our systems, services, and products at scale
  • Define and implement metrics to evaluate the performance of the models, both computing performance (such as CPU and memory usage) and ML performance (such as precision, recall, and F1)
  • Support the deployment of machine learning models on our infrastructure, including containerization, instrumentation, and versioning
  • Support the whole lifecycle of our machine learning models, including gathering data for retraining, A/B testing, and redeployments
  • Develop, test, and evaluate tools for machine learning model deployment, monitoring, and retraining
  • Work closely within a distributed team to analyze and apply innovative solutions over billions of documents
  • Support solutions ranging from rule-based and classical ML techniques to the latest deep learning systems
  • Partner with cross-functional team members to bring large-scale data engineering solutions to production
  • Communicate your approach and results to a wider audience through presentations
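The ML-performance metrics named above (precision, recall, F1) can be sketched in plain Python; the labels here are invented toy data, and a real evaluation would use a library such as scikit-learn over a held-out set.

```python
def precision_recall_f1(y_true, y_pred, positive=1):
    """Compute binary precision, recall, and F1 from parallel label lists."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Toy example: 2 true positives, 1 false positive, 1 false negative.
p, r, f = precision_recall_f1([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
print(round(p, 2), round(r, 2), round(f, 2))  # 0.67 0.67 0.67
```

In production monitoring, the same three numbers would typically be computed per deployment and tracked over time alongside CPU and memory metrics.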

Your Qualifications:

  • Demonstrated success with machine learning in a SaaS or Cloud environment, with hands-on knowledge of model creation and deployments in production at scale
  • Good knowledge of traditional machine learning methods and neural networks
  • Experience with practical machine learning modeling, especially on time-series forecasting, analysis, and causal inference.
  • Experience with data mining algorithms and statistical modeling techniques for anomaly detection in time series such as clustering, classification, ARIMA, and decision trees is preferred.
  • Ability to implement data import, cleansing and transformation functions at scale
  • Fluency in Docker, Kubernetes
  • Working knowledge of relational and dimensional data models with appropriate visualization techniques such as PCA.
  • Solid English skills to effectively communicate with other team members

 

Due to the nature of the role, it would be nice if you have also:

  • Experience with large datasets and distributed computing, especially with the Google Cloud Platform
  • Fluency in at least one deep learning framework: PyTorch, TensorFlow / Keras
  • Experience with NoSQL and graph databases
  • Experience working in a Colab, Jupyter, or Python notebook environment
  • Some experience with monitoring, analysis, and alerting tools like New Relic, Prometheus, and the ELK stack
  • Knowledge of the Java, Scala or Go programming languages
  • Familiarity with KubeFlow
  • Experience with transformers, for example the Hugging Face libraries
  • Experience with OpenCV

 

About Egnyte

In a content-critical age, Egnyte fuels business growth by enabling content-rich business processes, while also providing organizations with visibility and control over their content assets. Egnyte’s cloud-native content services platform leverages the industry’s leading content intelligence engine to deliver a simple, secure, and vendor-neutral foundation for managing enterprise content across business applications and storage repositories. More than 16,000 customers trust Egnyte to enhance employee productivity, automate data management, and reduce file-sharing cost and complexity. Investors include Google Ventures, Kleiner Perkins Caufield & Byers, and Goldman Sachs. For more information, visit www.egnyte.com.

 

#LI-Remote

Ness Technologies

Posted by Kiran Kaginkar
Bengaluru (Bangalore), Hyderabad, Pune, Navi Mumbai
8 - 13 yrs
₹25L - ₹35L / yr
UiPath
Test management
Test suites
Test Manager
Automation

Requirements:

·  UiPath certification

·  Proficient in the UiPath Platform, Test Manager, and Test Suite, with 6+ years of experience in test automation

·  Hands-on experience building automated scripts using a low-code/no-code platform (UiPath)

·  Experience testing SOAP or REST APIs

·  Experience building data-driven tests and frameworks for web, Windows, and microservices

·  Understanding of test methodologies (regression, functional, unit, integration, code coverage, performance, etc.)

·  Designing and developing test automation frameworks and understanding of test automation design patterns and software testing principles.

·  Familiarity with Relational Databases and SQL

·  Bachelor's degree in computer science, engineering or related field

·  Minimum of 7 years of experience in software testing and test automation

·  Minimum of 5 years of experience in UIPath test automation

·  Strong knowledge of test automation frameworks and tools

·  Experience with continuous integration and continuous delivery (CI/CD) pipelines

·  Ability to analyze and debug complex issues

·  Excellent problem-solving skills

·  Strong communication skills and ability to work collaboratively in a team environment

·  Knowledge of agile methodologies
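The data-driven testing requirement above can be illustrated outside the UiPath toolchain with a plain Python sketch; `status_for` is a hypothetical stand-in for a real REST API call, and the case table is invented.

```python
def status_for(code):
    """Hypothetical stand-in for calling a REST endpoint and classifying its HTTP status."""
    return {"ok": code < 400}

# Data-driven cases: each row is (input, expected) — the table drives the test,
# so adding coverage means adding rows, not writing new test code.
CASES = [(200, True), (201, True), (404, False), (500, False)]

def run_cases():
    """Return the list of failing (input, expected) rows; empty means all passed."""
    return [(code, exp) for code, exp in CASES if status_for(code)["ok"] != exp]

print(run_cases())  # [] means every case passed
```

In UiPath Test Suite the same idea appears as test cases bound to data sources (e.g. Excel tables), with the framework iterating the rows for you.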

 

 Flexibility

Need to be flexible with respect to working times, providing a two-hour overlap with IST (India Standard Time) and UK time on an ongoing basis.

 

If you are passionate about test automation and have experience with UIPath, we encourage you to apply for this exciting opportunity. We offer a competitive salary, excellent benefits, and opportunities for growth and development.

They provide both wholesale and retail funding. PM1


Agency job
via Multi Recruit by Sapna Deb
Mumbai
5 - 7 yrs
₹20L - ₹25L / yr
AWS Kinesis
Data engineering
AWS Lambda
DynamoDB
data pipeline
  • Key responsibility is to design and develop a data pipeline for real-time data integration and processing, executing the model (if required), and exposing output via MQ/API/NoSQL DB for consumption
  • Provide technical expertise to design efficient data-ingestion solutions to store and process unstructured data such as documents, audio, images, weblogs, etc.
  • Develop API services to provide data as a service
  • Prototype solutions for complex data-processing problems using AWS cloud-native services
  • Implement automated audit and quality-assurance checks in the data pipeline
  • Document and maintain data lineage from various sources to enable data governance
  • Coordinate with BIU, IT, and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading into downstream systems, NoSQL databases, etc.
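A minimal sketch of the real-time ingestion step described above, assuming an AWS Lambda function consuming a Kinesis event batch (Kinesis delivers each record's payload base64-encoded); the payload fields are invented, and the downstream DynamoDB/MQ write is left out.

```python
import base64
import json

def handler(event, context=None):
    """Sketch of an AWS Lambda handler for a Kinesis event batch.

    Decodes each base64-encoded record and returns the parsed payloads;
    a real pipeline would write them to DynamoDB or publish via MQ/API.
    """
    out = []
    for record in event.get("Records", []):
        raw = base64.b64decode(record["kinesis"]["data"])
        out.append(json.loads(raw))
    return out

# Simulated Kinesis event, mirroring the structure AWS delivers to Lambda.
payload = base64.b64encode(json.dumps({"device": "tv", "minutes": 30}).encode()).decode()
event = {"Records": [{"kinesis": {"data": payload}}]}
print(handler(event))  # [{'device': 'tv', 'minutes': 30}]
```

Audit and quality-assurance checks would slot naturally into the loop body, e.g. counting malformed records before they reach the downstream store.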

Skills

  • Programming experience using Python & SQL
  • Extensive working experience in Data Engineering projects, using AWS Kinesis, AWS S3, DynamoDB, EMR, Lambda, Athena, etc. for event processing
  • Experience and expertise in implementing complex data pipelines
  • Strong Familiarity with AWS Toolset for Storage & Processing. Able to recommend the right tools/solutions available to address specific data processing problems
  • Hands-on experience in Unstructured (Audio, Image, Documents, Weblogs, etc) Data processing.
  • Good analytical skills with the ability to synthesize data to design and deliver meaningful information
  • Know-how on any No-SQL DB (DynamoDB, MongoDB, CosmosDB, etc) will be an advantage.
  • Ability to understand business functionality, processes, and flows
  • Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently

Functional knowledge

  • Real-time Event Processing
  • Data Governance & Quality assurance
  • Containerized deployment
  • Linux
  • Unstructured Data Processing
  • AWS Toolsets for Storage & Processing
  • Data Security

 

Sattva Consulting

Posted by Shahana S
Bengaluru (Bangalore), Mumbai, Gurugram
1 - 4 yrs
₹5L - ₹8L / yr
Python
Automation
SaaS
JavaScript
API
The Opportunity

We are looking for an automation specialist who will play a key role in Sattva's digitisation initiatives. Our rapid growth in the last year has underscored the importance of technology-driven solutions to manage business processes at scale.

Currently our tech landscape is a collection of best-of-breed SaaS solutions that need to be integrated/extended based on business needs. This role involves identifying automation opportunities and realising them through low/no-code platforms like AppSheet, Zapier, etc. It is a technical role that also involves interfacing with people across different Business Units within Sattva. It offers the opportunity to work with best-in-class SaaS solutions like Google Workspace, FreshTeams, ClickUp, and QuickBooks.

Responsibilities
● Analyse the existing landscape of SaaS solutions to identify automation gaps in key business processes
● Integrate best-of-breed SaaS solutions using APIs and low/no-code tools
● Build apps to extend existing SaaS solutions like FreshTeams, QuickBooks, ClickUp, etc. using available APIs and SDKs
● Configure SaaS solutions to meet the needs of a specific Business Unit or of a defined security policy
● Build Slack apps to integrate with SaaS solutions in the landscape
● Troubleshoot technical issues with the configured solutions in the landscape
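As a small sketch of the Slack-integration responsibility above, assuming Slack's incoming-webhook JSON format (a POST body with a `text` field); the message text and any webhook URL are placeholders, and the network call is shown but not executed here.

```python
import json
import urllib.request

def build_slack_message(text, channel=None):
    """Build the JSON body for a Slack incoming-webhook notification."""
    body = {"text": text}
    if channel:
        body["channel"] = channel
    return json.dumps(body)

def post_to_slack(webhook_url, text):
    """POST the message to a Slack incoming webhook (network call; not run here)."""
    req = urllib.request.Request(
        webhook_url,
        data=build_slack_message(text).encode(),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)

print(build_slack_message("Ticket SAT-42 moved to Done"))
```

The same pattern generalizes to the other SaaS integrations mentioned (FreshTeams, ClickUp, QuickBooks): build a JSON payload per the vendor's API and POST it with an authenticated request.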

Ideal Candidate Profile
● 1+ years of experience integrating/extending SaaS solutions
● Solid expertise in developing automation scripts and applications using JavaScript or Python
● Strong problem-solving ability
● Excellent communication skills
● Proven ability to interface with multiple stakeholders across business verticals
Leading Multinational Co


Agency job
via SIlverPeople Consulting by Richa Awasthi
Mumbai
2 - 9 yrs
₹8L - ₹27L / yr
Machine Learning (ML)

ML Engineer-Analyst/ Senior Analyst

Job purpose:

To design and develop machine learning and deep learning systems; run machine learning tests and experiments and implement appropriate ML algorithms. Works cross-functionally with Data Scientists, software application developers and business groups on the development of innovative ML models. Uses Agile experience to work collaboratively with other Managers/Owners in geographically distributed teams.

Accountabilities:

  • Work with Data Scientists and Business Analysts to frame problems in a business context. Assist with all processes from data collection, cleaning, and preprocessing to training models and deploying them to production.
  • Understand business objectives and develop models that help to achieve them, along with metrics to track their progress.
  • Explore and visualize data to gain an understanding of it, then identify differences in data distribution that could affect performance when deploying the model in the real world.
  • Define validation strategies, the preprocessing or feature engineering to be done on a given dataset, and data augmentation pipelines.
  • Analyze the errors of the model and design strategies to overcome them.
  • Collaborate with data engineers to build data and model pipelines, manage the infrastructure and data pipelines needed to bring code to production, and demonstrate end-to-end understanding of the applications (including, but not limited to, the machine learning algorithms) being created.

Qualifications & Specifications

  • Bachelor's degree in Engineering, Computer Science, Math, Statistics or equivalent; a Master's degree in a relevant specialization is the first preference
  • Experience with machine learning algorithms and libraries
  • Understanding of data structures, data modeling and software architecture
  • Deep knowledge of math, probability, statistics and algorithms
  • Experience with machine learning platforms such as Microsoft Azure, Google Cloud, IBM Watson, and Amazon
  • Big data environment: Hadoop, Spark
  • Programming languages: Python, R, PySpark
  • Supervised and unsupervised machine learning: linear regression, logistic regression, k-means clustering, ensemble models, random forest, SVM, gradient boosting
  • Sampling data: bagging and boosting, bootstrapping
  • Neural networks: ANN, CNN, RNN and related topics
  • Deep learning: Keras, TensorFlow
  • Experience with AWS SageMaker deployment and agile methodology
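The k-means clustering named in the qualifications above can be sketched in plain Python for one-dimensional data; this is illustrative only (deterministic initialization, k ≥ 2 assumed, toy data), not a production implementation such as scikit-learn's.

```python
def kmeans_1d(points, k, iters=20):
    """Minimal 1-D k-means sketch: assign each point to the nearest centroid,
    then move each centroid to the mean of its cluster, and repeat."""
    pts = sorted(points)
    # Deterministic init: k evenly spaced values from the sorted data (assumes k >= 2).
    centroids = [pts[i * (len(pts) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in pts:
            idx = min(range(k), key=lambda c: abs(x - centroids[c]))
            clusters[idx].append(x)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

print(kmeans_1d([1, 2, 3, 10, 11, 12], 2))  # [2.0, 11.0]
```

Real workloads would use a vectorized library implementation with smarter initialization (e.g. k-means++) and convergence checks rather than a fixed iteration count.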

 

 

Techknomatic Services Pvt. Ltd.
Posted by Techknomatic Services
Pune, Mumbai
2 - 6 yrs
₹4L - ₹9L / yr
Tableau
SQL
Business Intelligence (BI)
Role Summary:
Lead and drive development in the BI domain using the Tableau ecosystem, with deep technical and BI-ecosystem knowledge. The resource will be responsible for dashboard design, development, and delivery of BI services using the Tableau ecosystem.

Key functions & responsibilities:
  • Communicate and interact with the Project Manager to understand requirements
  • Design, develop, and deploy dashboards using the Tableau ecosystem
  • Ensure delivery within a given time frame while maintaining quality
  • Stay up to date with current tech and bring relevant ideas to the table
  • Proactively work with the Management team to identify and resolve issues
  • Perform other related duties as assigned or advised
  • Set the standard and expectations through example in conduct, work ethic, integrity and character
  • Contribute to dashboard design, R&D and project delivery using Tableau

Candidate’s Profile

Academics:
  • Bachelor's degree, preferably in Computer Science.
  • A Master's degree would be an added advantage.

Experience:
  • Overall 2-5 years of experience in DWBI development projects, having worked on BI and visualization technologies (Tableau, QlikView) for at least 2 years.
  • At least 2 years of experience covering the Tableau implementation lifecycle, including hands-on development/programming, managing security, data modelling, data blending, etc.

Technology & Skills:
  • Hands-on expertise in Tableau administration and maintenance
  • Strong working knowledge and development experience with Tableau Server and Desktop
  • Strong knowledge of SQL, PL/SQL and data modelling
  • Knowledge of databases like Microsoft SQL Server, Oracle, etc.
  • Exposure to alternate visualization technologies like QlikView, Spotfire, Pentaho, etc.
  • Good communication and analytical skills with excellent creative and conceptual thinking abilities
  • Superior organizational skills, attention to detail/level of quality, and strong communication skills, both verbal and written