PuTTY Jobs in Mumbai

Apply to 11+ PuTTY Jobs in Mumbai on CutShort.io. Explore the latest PuTTY Job opportunities across top companies like Google, Amazon & Adobe.

Magic9 Media and Consumer Knowledge Pvt. Ltd.
Mumbai
3 - 5 yrs
₹7L - ₹12L / yr
ETL
SQL
Python
Statistical Analysis
Machine Learning (ML)
+4 more

Job Description

This requirement is to service our client, a leading big data technology company that measures what viewers consume across platforms to enable marketers to make better advertising decisions. We are seeking a Senior Data Operations Analyst to mine large-scale datasets for our client. Your work will have a direct impact on the business strategies of prominent industry leaders. Self-motivation and strong communication skills are both must-haves, and the ability to work in a fast-paced environment is desired.


Problems being solved by our client: 

Measure consumer usage of devices linked to the internet and home networks, including computers, mobile phones, tablets, streaming sticks, smart TVs, thermostats and other appliances. There are more screens and other connected devices in homes than ever before, yet there have been major gaps in understanding how consumers interact with this technology. Our client uses a measurement technology to unravel the dynamics of consumers’ interactions with multiple devices.


Duties and responsibilities:

  • The successful candidate will contribute to the development of novel audience measurement and demographic inference solutions. 
  • Develop, implement, and support statistical or machine learning methodologies and processes. 
  • Build and test new features and concepts and integrate them into the production process
  • Participate in ongoing research and evaluation of new technologies
  • Apply your experience across the development lifecycle: analysis, design, development, testing and deployment of this system
  • Collaborate with teams in Software Engineering, Operations, and Product Management to deliver timely and quality data. You will be the knowledge expert, delivering quality data to our clients

Qualifications:

  • 3-5 years of relevant work experience in the areas outlined below
  • Experience extracting data from large databases using SQL
  • Experience writing complex ETL processes and frameworks for analytics and data management; hands-on experience with ETL tools is required (a minimal sketch follows this list)
  • Master’s degree or PhD in Statistics, Data Science, Economics, Operations Research, Computer Science, or a similar degree with a focus on statistical methods. A Bachelor’s degree in the same fields with significant, demonstrated professional research experience will also be considered. 
  • Programming experience in a scientific computing language (R, Python, Julia) and the ability to interact with relational data (SQL, Apache Pig, SparkSQL). General-purpose programming (Python, Scala, Java) and familiarity with Hadoop are a plus.
  • Excellent verbal and written communication skills. 
  • Experience with TV or digital audience measurement or market research data is a plus. 
  • Familiarity with systems analysis or systems thinking is a plus. 
  • Comfortable analyzing complex, high-volume, high-dimensional data from varied sources
  • Ability to engage with Senior Leaders across all functional departments
  • Ability to take on new responsibilities and adapt to changes
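
As a rough, hedged illustration of the SQL extraction and ETL work called out above (not the client's actual pipeline), the sketch below pulls rows with SQL, aggregates them with pandas, and loads the result back; the database, table and column names are hypothetical.

```python
# Minimal ETL sketch (illustrative only): extract rows with SQL, apply a simple
# transformation, and load the aggregate into another table.
# "measurement.db", "raw_events" and "daily_usage" are hypothetical names.
import sqlite3
import pandas as pd

conn = sqlite3.connect("measurement.db")  # stand-in for a production database

# Extract: pull device-usage rows with SQL
raw = pd.read_sql_query(
    "SELECT device_id, event_date, minutes_viewed FROM raw_events", conn
)

# Transform: aggregate viewing minutes per device per day
daily = raw.groupby(["device_id", "event_date"], as_index=False)["minutes_viewed"].sum()

# Load: write the aggregate back for downstream analytics
daily.to_sql("daily_usage", conn, if_exists="replace", index=False)
conn.close()
```

In a production ETL framework the same extract-transform-load steps would typically be scheduled, parameterized and monitored rather than run as a one-off script.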

 

UpSolve Solutions LLP
Posted by Shaurya Kuchhal
Mumbai
2 - 4 yrs
₹7L - ₹11L / yr
Machine Learning (ML)
Data Science
Microsoft Windows Azure
Google Cloud Platform (GCP)
Python
+3 more

About UpSolve

Work on a cutting-edge tech stack and build innovative solutions across Computer Vision, NLP, Video Analytics and IoT.


Job Role

  • Ideate use cases that incorporate recent technology releases.
  • Discuss business plans and assist teams in aligning with dynamic KPIs.
  • Design the solution architecture end to end, from inputs and the infrastructure and services used through to the data store.


Job Requirements

  • Working knowledge of Azure Cognitive Services.
  • Project experience building AI solutions such as chatbots, sentiment analysis and image classification (a hedged example follows this list).
  • Quick learner and problem solver.
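
As a hedged example of the kind of work listed above (see the sentiment-analysis item), the sketch below calls Azure's Text Analytics sentiment API through the azure-ai-textanalytics Python SDK; the endpoint and key are placeholders, and the specific service configuration is an assumption, not UpSolve's actual setup.

```python
# Hedged sketch: sentiment analysis with Azure AI Language (Text Analytics).
# Requires `pip install azure-ai-textanalytics`; endpoint and key are placeholders.
from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

client = TextAnalyticsClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    credential=AzureKeyCredential("<your-key>"),                      # placeholder
)

documents = ["The new feature is fantastic!", "Support response was too slow."]
results = client.analyze_sentiment(documents=documents)

for doc in results:
    if not doc.is_error:
        print(doc.sentiment, doc.confidence_scores)
```

A chatbot or image-classification flow would follow the same general pattern: instantiate a client for the relevant Cognitive Service and send it documents or images.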


Job Qualifications

  • Work experience: 2+ years
  • Education: Computer Science/IT Engineer
  • Location: Mumbai
UpSolve Solutions LLP
Posted by Shaurya Kuchhal
Mumbai, Navi Mumbai
2 - 6 yrs
₹4L - ₹8L / yr
Data Warehouse (DWH)
Informatica
ETL
SQL
MS-PowerPoint

Company Description

UpSolve is a Gen AI and Vision AI startup that helps businesses solve their problems by building custom solutions that drive strategic business decisions. Whether your business is facing time constraints or a lack of resources, UpSolve can help. We build enterprise-grade AI solutions with a focus on increasing ROI.


Role Description

This is a full-time hybrid role for a Business Analyst located in Mumbai.


Please note: this is an onsite role, and good communication skills (oral and written) are expected.


Responsibilities

1. Understand existing system integrations for the client.

2. Map and identify gaps in existing systems.

3. Ideate, advise on, and implement AI solutions to optimize business processes.

4. Collaborate with multiple teams and stakeholders.


Qualifications

  • MBA with a focus on Business Analytics, or a Bachelor's degree in Computer Science or IT
  • Minimum 4 years of experience
  • Strong written, verbal and collaboration skills
  • Immediate joiner (less than 5 days)


Work Location: Mumbai, Work from Office

Episource
Posted by Ahamed Riaz
Mumbai
5 - 12 yrs
₹18L - ₹30L / yr
Big Data
Python
Amazon Web Services (AWS)
Serverless
DevOps
+4 more

ABOUT EPISOURCE:


Episource has devoted more than a decade to building solutions for risk adjustment to measure healthcare outcomes. As one of the leading companies in healthcare, we have helped numerous clients optimize their medical records, data and analytics to enable better documentation of care for patients with chronic diseases.


The backbone of our consistent success has been our obsession with data and technology. At Episource, all of our strategic initiatives start with the question: how can data be “deployed”? Our analytics platforms and data lakes ingest huge quantities of data daily to help our clients deliver services. We have also built our own machine learning and NLP platform to infuse added productivity and efficiency into our workflow. Combined, these build a foundation of tools and practices used by quantitative staff across the company.


What’s our poison you ask? We work with most of the popular frameworks and technologies like Spark, Airflow, Ansible, Terraform, Docker, ELK. For machine learning and NLP, we are big fans of keras, spacy, scikit-learn, pandas and numpy. AWS and serverless platforms help us stitch these together to stay ahead of the curve.


ABOUT THE ROLE:


We’re looking to hire someone to help scale the Machine Learning and NLP efforts at Episource. You’ll work with the team that develops the models powering Episource’s product focused on NLP-driven medical coding. Some of the problems include improving our ICD code recommendations, clinical named entity recognition, improving patient health, clinical suspecting, and information extraction from clinical notes.
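
As a hedged illustration of the clinical named entity recognition mentioned above (not Episource's actual pipeline), the sketch below runs spaCy's general-purpose en_core_web_sm model over a clinical-style note; a real medical-coding system would more likely use a domain-specific clinical model and custom entity labels.

```python
# Hedged sketch: named entity recognition over a clinical-style note with spaCy.
# Uses the general-purpose en_core_web_sm model as a stand-in; a production
# medical-coding pipeline would rely on a clinical/biomedical model and labels.
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the model has been downloaded

note = (
    "Patient reports shortness of breath since Tuesday. "
    "History of type 2 diabetes; prescribed metformin 500 mg daily."
)

doc = nlp(note)
for ent in doc.ents:
    print(ent.text, ent.label_)  # candidate entities for downstream ICD mapping
```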


This is a role for highly technical data engineers who combine outstanding oral and written communication skills with the ability to code up prototypes and productionize them using a wide range of tools, algorithms, and languages. Most importantly, they need the ability to autonomously plan and organize their work assignments based on high-level team goals.


You will be responsible for setting an agenda to develop and ship data-driven architectures that positively impact the business, working with partners across the company including operations and engineering. You will use research results to shape strategy for the company and help build a foundation of tools and practices used by quantitative staff across the company.


During the course of a typical day with our team, expect to work on one or more projects around the following:


1. Create and maintain optimal data pipeline architectures for ML


2. Develop a strong API ecosystem for ML pipelines


3. Building CI/CD pipelines for ML deployments using GitHub Actions, Travis, Terraform and Ansible


4. Designing and developing distributed, high-volume, high-velocity, multi-threaded event-processing systems


5. Applying software engineering best practices across the development lifecycle: coding standards, code reviews, source management, build processes, testing, and operations


6. Deploying data pipelines in production using Infrastructure-as-Code platforms

 

7. Designing scalable implementations of the models developed by our Data Science teams  


8. Big data and distributed ML with PySpark on AWS EMR, and more (a minimal sketch follows this list)
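
To make item 8 concrete, here is a minimal, hedged PySpark sketch of the kind of distributed aggregation such pipelines run on EMR; the S3 paths and column names are placeholders rather than Episource's actual data.

```python
# Hedged sketch: a distributed aggregation with PySpark, as might run on EMR.
# The S3 paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("chart-volume-by-client").getOrCreate()

charts = spark.read.parquet("s3://example-bucket/charts/")  # placeholder path

# Count charts received per client per month
volume = (
    charts.groupBy("client_id", "chart_month")
    .agg(F.count("*").alias("charts_received"))
)

volume.write.mode("overwrite").parquet("s3://example-bucket/reports/chart_volume/")
spark.stop()
```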



BASIC REQUIREMENTS 


  1.  Bachelor’s degree or greater in Computer Science, IT or related fields

  2.  Minimum of 5 years of experience in cloud, DevOps, MLOps & data projects

  3. Strong experience with Bash scripting, Unix environments, and building scalable/distributed systems

  4. Experience with automation/configuration management using Ansible, Terraform, or equivalent

  5. Very strong experience with AWS and Python

  6. Experience building CI/CD systems

  7. Experience with containerization technologies like Docker, Kubernetes, ECS, EKS or equivalent

  8. Ability to build and manage application and performance monitoring processes

codersbrain
Hyderabad, Delhi, Gurugram, Noida, Bengaluru (Bangalore), Mumbai, Kolkata
8 - 15 yrs
₹5L - ₹16L / yr
Informatica
Informatica Data Quality
informatica cloud data quality
SQL
Digital

JD:

Location: Pan India

Experience: 8 to 15 years


Must-have skills

  • "Senior" is defined as 8+ years of IT experience, and a minimum of 5+ years in Digital experience
  • Minimum 3 years of hands-on experience with Informatica Cloud Data Quality (CDQ) toolset, 
  • Strong SQL Skills.

Nice-to-have skills

  • Knowledge of Informatica AXON Data Governance
  • Familiarity with Enterprise Data Catalog
  • Familiarity with Cloud Data Governance and Catalog (CDGC)


CarWale
Posted by Vanita Acharya
Navi Mumbai, Mumbai
3 - 5 yrs
₹10L - ₹15L / yr
Data Science
Data Scientist
R Programming
Python
Machine Learning (ML)
+1 more

About CarWale: CarWale's mission is to bring delight to car buying. We offer a bouquet of reliable tools and services to help car consumers decide on buying the right car, at the right price and from the right partner. CarWale has always strived to serve car buyers and owners in the most comprehensive and convenient way possible. We provide a platform where car buyers and owners can research, buy, sell and come together to discuss and talk about their cars. We aim to empower Indian consumers to make informed car buying and ownership decisions with exhaustive and unbiased information on cars through our expert reviews, owner reviews, detailed specifications and comparisons. We understand that a car is by and large the second-most expensive asset a consumer associates his lifestyle with! Together with CarTrade & BikeWale, we are the market leaders in the personal mobility media space.

About the Team: We are a bunch of enthusiastic analysts assisting all business functions with their data needs. We deal with huge but diverse datasets to find relationships, patterns and meaningful insights. Our goal is to help drive growth across the organization by creating a data-driven culture.

We are looking for an experienced Data Scientist who likes to explore opportunities, knows their way around data, and can build world-class solutions that make a real impact on the business.

 

Skills / Requirements –

  • 3-5 years of experience working on Data Science projects
  • Experience doing statistical modelling of big data sets
  • Expert in Python and R, with deep knowledge of ML packages (a brief illustration follows this list)
  • Expert at fetching data with SQL
  • Ability to present and explain data to management
  • Knowledge of AWS would be beneficial
  • Demonstrates structured and analytical thinking
  • Ability to structure and execute a data science project end to end
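
As a brief, hedged illustration of the modelling work described above (not CarWale's actual models), the sketch below trains a simple scikit-learn classifier on synthetic data.

```python
# Hedged sketch: a simple classification model with scikit-learn on synthetic
# data, standing in for the statistical modelling described above.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1_000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```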

 

Education –

Bachelor’s degree in a quantitative field (Maths, Statistics, Computer Science). A Master’s degree is preferred.

 

Mumbai
5 - 7 yrs
₹20L - ₹25L / yr
Teradata
Vertica
Python
DBA
Redshift
+8 more
  • Design, develop and maintain efficient data models for the organization to ensure optimal query performance in the consumption layer.
  • Develop, deploy and maintain a repository of UDXs written in Java/Python.
  • Develop optimal data model designs, analyzing complex distributed data deployments and making recommendations to optimize performance based on data consumption patterns, performance expectations, the queries executed on the tables/databases, etc.
  • Perform periodic database health checks and maintenance.
  • Design collections in a NoSQL database for efficient performance.
  • Document and maintain a data dictionary from various sources to enable data governance.
  • Coordinate with business teams, IT and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading into downstream systems, NoSQL databases, etc.
  • Implement the data governance process and ensure data security.

Requirements

  • Extensive working experience in Designing & Implementing Data models in OLAP Data Warehousing solutions (Redshift, Synapse, Snowflake, Teradata, Vertica, etc).
  • Programming experience using Python / Java.
  • Working knowledge in developing & deploying User-defined Functions (UDXs) using Java / Python.
  • Strong understanding & extensive working experience in OLAP Data Warehousing (Redshift, Synapse, Snowflake, Teradata, Vertica, etc) architecture and cloud-native Data Lake (S3, ADLS, BigQuery, etc) Architecture.
  • Strong knowledge of the design, development and performance tuning of 3NF/flat/hybrid data models.
  • Extensive technical experience in SQL including code optimization techniques.
  • Strong knowledge of database performance, troubleshooting, and tuning.
  • Knowledge of collection design in any No-SQL DB (DynamoDB, MongoDB, CosmosDB, etc), along with implementation of best practices.
  • Ability to understand business functionality, processes, and flows.
  • Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently.
  • Any OLAP DWH DBA experience and user management will be an added advantage.
  • Knowledge of financial industry-specific data models such as FSLDM, the IBM Financial Data Model, etc. will be an added advantage.
  • Experience with Snowflake will be an added advantage.
  • Working experience in BFSI/NBFC and a data understanding of loan/mortgage data will be an added advantage.

Functional knowledge

  • Data Governance & Quality Assurance
  • Modern OLAP Database Architecture & Design
  • Linux
  • Data structures, algorithm & data modeling techniques
  • No-SQL database architecture
  • Data Security

 

upGrad
Posted by Priyanka Muralidharan
Bengaluru (Bangalore), Mumbai
4 - 6 yrs
₹19L - ₹24L / yr
SQL
Python
Tableau
Team Management
Statistical Analysis

Role Summary

We are looking for an analytically inclined, insights-driven Product Analyst to make our organisation more data-driven. In this role you will be responsible for creating dashboards to drive insights for product and business teams. From day-to-day decisions to long-term impact assessment, and from measuring the efficacy of different products to that of particular teams, you'll be empowering each of them. The growing nature of the team will require you to be in touch with all of the teams at upGrad. If you are the "go-to" person everyone looks to for data, then this role is for you.

 

Roles & Responsibilities

  • Lead and own the analysis of highly complex data sources, identifying trends and patterns in data and provide insights/recommendations based on analysis results
  • Build, maintain, own and communicate detailed reports to assist Marketing, Growth/Learning Experience and other business/executive teams
  • Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions.
  • Analyze data and generate insights in the form of user analysis, user segmentation, performance reports, etc.
  • Facilitate review sessions with management, business users and other team members
  • Design and create visualizations to present actionable insights related to data sets and business questions at hand
  • Develop intelligent models around channel performance, user profiling, and personalization

Skills Required

  • 4-6 years of hands-on experience with product-related analytics and reporting
  • Experience building dashboards in Tableau or other data visualization tools such as D3
  • Strong data, statistics, and analytical skills with a good grasp of SQL
  • Programming experience in Python is a must
  • Comfortable managing large data sets
  • Good Excel/data management skills
High-Growth Fintech Startup
Agency job
via Unnati by Ramya Senthilnathan
Remote, Mumbai
3 - 5 yrs
₹7L - ₹10L / yr
Business Intelligence (BI)
PowerBI
Analytics
Reporting
Data management
+5 more
Want to join the trailblazing fintech company that is leveraging software and technology to change the face of short-term financing in India?

Our client is an innovative Fintech company that is revolutionizing the business of short term finance. The company is an online lending startup that is driven by an app-enabled technology platform to solve the funding challenges of SMEs by offering quick-turnaround, paperless business loans without collateral. It counts over 2 million small businesses across 18 cities and towns as its customers. Its founders are IIT and ISB alumni with deep experience in the fin-tech industry, from earlier working with organizations like Axis Bank, Aditya Birla Group, Fractal Analytics, and Housing.com. It has raised funds of Rs. 100 Crore from finance industry stalwarts and is growing by leaps and bounds.
 
As a Data Analyst - SQL, you will be working on projects in the Analytics function to generate insights for business as well as manage reporting for the management for all things related to Lending.
 
You will be part of a rapidly growing tech-driven organization and will be responsible for generating insights that will drive business impact and productivity improvements.
 
What you will do:
  • Ensuring ease of data availability, with relevant dimensions, using Business Intelligence tools.
  • Providing strong reporting and analytical information support to the management team.
  • Transforming raw data into essential metrics based on the needs of relevant stakeholders.
  • Performing data analysis for generating reports on a periodic basis.
  • Converting essential data into easy to reference visuals using Data Visualization tools (PowerBI, Metabase).
  • Providing recommendations to update current MIS to improve reporting efficiency and consistency.
  • Bringing fresh ideas to the table and keeping a keen eye on trends in the analytics and financial services industry.

 

 

What you need to have:
  • MBA/BE/graduate, with 3+ years of work experience.
  • B.Tech/B.E.; MBA/PGDM
  • Experience in reporting, data management (SQL, MongoDB) and visualization (PowerBI, Metabase, Data Studio)
  • Work experience in financial services (Indian banks'/NBFCs' in-house analytics units or fintech/analytics start-ups) would be a plus.
Skills:
  • Skilled at writing and optimizing large, complicated SQL queries and MongoDB scripts (a small illustration follows this list).
  • Strong knowledge of Banking/ Financial Services domain
  • Experience with some of the modern relational databases
  • Ability to work on multiple projects of a different nature; self-driven.
  • Liaise with cross-functional teams to resolve data issues and build strong reports
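
As a small, hedged illustration of the SQL/MongoDB reporting work mentioned above, the sketch below builds a monthly disbursal report with a MongoDB aggregation pipeline via pymongo; the connection string, collection and field names are hypothetical.

```python
# Hedged sketch: a monthly disbursal report via a MongoDB aggregation pipeline.
# The connection string, collection, and field names are hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")  # placeholder connection
loans = client["lending"]["loans"]                  # hypothetical collection

pipeline = [
    {"$match": {"status": "disbursed"}},
    {"$group": {
        "_id": {"$dateToString": {"format": "%Y-%m", "date": "$disbursed_at"}},
        "total_amount": {"$sum": "$amount"},
        "loan_count": {"$sum": 1},
    }},
    {"$sort": {"_id": 1}},
]

for row in loans.aggregate(pipeline):
    print(row["_id"], row["loan_count"], row["total_amount"])
```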

 

Techknomatic Services Pvt. Ltd.
Posted by Techknomatic Services
Pune, Mumbai
2 - 6 yrs
₹4L - ₹9L / yr
Tableau
SQL
Business Intelligence (BI)

Role Summary:
Lead and drive development in the BI domain using the Tableau ecosystem, with deep technical and BI-ecosystem knowledge. The resource will be responsible for dashboard design, development and delivery of BI services using the Tableau ecosystem.

Key functions & responsibilities:
 Communication & interaction with the Project Manager to understand the requirement
 Dashboard designing, development and deployment using Tableau eco-system
 Ensure delivery within a given time frame while maintaining quality
 Stay up to date with current tech and bring relevant ideas to the table
 Proactively work with the Management team to identify and resolve issues
 Performs other related duties as assigned or advised
 He/she should be a leader that sets the standard and expectations through an example in
his/her conduct, work ethic, integrity and character
 Contribute in dashboard designing, R&D and project delivery using Tableau

Candidate’s Profile
Academics:
  • Bachelor's degree, preferably in Computer Science.
  • A Master's degree would be an added advantage.

Experience:
  • Overall 2-5 years of experience in DWBI development projects, having worked on BI and visualization technologies (Tableau, QlikView) for at least 2 years.
  • At least 2 years of experience covering the Tableau implementation lifecycle, including hands-on development/programming, managing security, data modelling, data blending, etc.

Technology & Skills:
  • Hands-on expertise in Tableau administration and maintenance
  • Strong working knowledge and development experience with Tableau Server and Desktop
  • Strong knowledge of SQL, PL/SQL and data modelling
  • Knowledge of databases like Microsoft SQL Server, Oracle, etc.
  • Exposure to alternative visualization technologies like QlikView, Spotfire, Pentaho, etc.
  • Good communication and analytical skills with excellent creative and conceptual thinking abilities
  • Superior organizational skills, attention to detail/level of quality, and strong communication skills, both verbal and written
AthenasOwl
Posted by Ericsson Fernandes
Mumbai
3 - 7 yrs
₹10L - ₹20L / yr
Deep Learning
Natural Language Processing (NLP)
Machine Learning (ML)
Computer vision
Python
+1 more

Company Profile and Job Description  

About us:  

AthenasOwl (AO) is our “AI for Media” solution that helps content creators and broadcasters create and curate smarter content. We launched the product in 2017 as an AI-powered suite meant for the media and entertainment industry. Clients use AthenasOwl's context-adapted technology for redesigning content, making better targeting decisions, automating hours of post-production work and monetizing massive content libraries.

For more details visit: www.athenasowl.tv   

  

Role:   

Senior Machine Learning Engineer  

Experience Level:   

4-6 years of experience

Work location:   

Mumbai (Malad W)   

  

Responsibilities:   

  • Develop cutting edge machine learning solutions at scale to solve computer vision problems in the domain of media, entertainment and sports
  • Collaborate with media houses and broadcasters across the globe to solve niche problems in the field of post-production, archiving and viewership
  • Manage a team of highly motivated engineers to deliver high-impact solutions quickly and at scale

 

 

The ideal candidate should have:   

  • Strong programming skills in one or more programming languages such as Python and C/C++
  • Sound fundamentals of data structures, algorithms and object-oriented programming
  • Hands-on experience with a popular deep learning framework like TensorFlow, PyTorch, etc. (a minimal sketch follows this list)
  • Experience implementing deep learning solutions (Computer Vision, NLP, etc.)
  • Ability to quickly learn and communicate the latest findings in AI research
  • Creative thinking for leveraging machine learning to build end-to-end intelligent software systems
  • A pleasantly forceful personality and charismatic communication style
  • Someone who will raise the average effectiveness of the team and has demonstrated exceptional abilities in some area of their life. In short, we are looking for a “Difference Maker”
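
As a minimal, hedged sketch of the deep learning experience described above (not AthenasOwl's actual models), the code below defines a tiny PyTorch image classifier and runs a forward pass on random frames.

```python
# Hedged sketch: a tiny PyTorch image classifier and a single forward pass on
# random data, standing in for the computer-vision work described above.
import torch
import torch.nn as nn

class TinyClassifier(nn.Module):
    def __init__(self, num_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x).flatten(1)
        return self.head(x)

model = TinyClassifier()
frames = torch.randn(4, 3, 224, 224)  # a batch of 4 RGB frames
logits = model(frames)
print(logits.shape)                   # torch.Size([4, 5])
```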

 
