
8+ Jupyter Notebook Jobs in India

Apply to 8+ Jupyter Notebook Jobs on CutShort.io. Find your next job, effortlessly. Browse Jupyter Notebook Jobs and apply today!

Client based at Pune location.

Agency job
Pune
5 - 9 yrs
₹18L - ₹30L / yr
Data Engineer
Python
Data Warehousing
Snowflake schema
Data modeling
+7 more

Skills & Experience:

❖ At least 5+ years of experience as a Data Engineer

❖ Hands-on and in-depth experience with star/snowflake schema design, data modeling, data pipelining, and MLOps

❖ Experience with data warehouse technologies (e.g., Snowflake, AWS Redshift)

❖ Experience with AWS data pipelines (Lambda, AWS Glue, Step Functions, etc.)

❖ Proficient in SQL

❖ At least one major programming language (Python / Java)

❖ Experience with Data Analysis Tools such as Looker or Tableau

❖ Experience with Pandas, NumPy, scikit-learn, and Jupyter notebooks preferred

❖ Familiarity with Git, GitHub, and JIRA.

❖ Ability to locate & resolve data quality issues

❖ Ability to demonstrate end-to-end data platform support experience

Other Skills:

❖ Individual contributor

❖ Hands-on with programming

❖ Strong analytical and problem-solving skills with meticulous attention to detail

❖ A positive mindset and can-do attitude

❖ To be a great team player

❖ To have an eye for detail

❖ Looks for opportunities to simplify and automate tasks and build reusable components

❖ Ability to judge the suitability of new technologies for solving business problems

❖ Build strong relationships with analysts, business, and engineering stakeholders

❖ Task Prioritization

❖ Familiar with agile methodologies.

❖ Fintech or Financial services industry experience

❖ Eagerness to learn about the Private Equity/Venture Capital ecosystem and the associated secondary market

Responsibilities:

o Design, develop and maintain a data platform that is accurate, secure, available, and fast.

o Engineer efficient, adaptable, and scalable data pipelines to process data.

o Integrate and maintain a variety of data sources: different databases, APIs, SaaS products, files, logs, events, etc.

o Create standardized datasets to service a wide variety of use cases.

o Develop subject-matter expertise in tables, systems, and processes.

o Partner with product and engineering to ensure product changes integrate well with the data platform.

o Partner with diverse stakeholder teams, understand their challenges, and empower them with data solutions to meet their goals.

o Perform data quality checks on data sources, and automate and maintain a quality-control capability.
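The quality-control responsibility above can be sketched with a minimal pandas check. All column names, values, and rules here are hypothetical illustrations, not taken from the posting:

```python
import pandas as pd

# Hypothetical source extract with common quality problems:
# a duplicated key, a null, and an out-of-range value.
df = pd.DataFrame({
    "account_id": [1, 2, 2, 3, 4],
    "balance": [100.0, None, 250.0, -50.0, 300.0],
})

# Automated checks, returned as named counts so they could
# feed a monitoring/alerting step.
checks = {
    "duplicate_ids": int(df["account_id"].duplicated().sum()),
    "null_balances": int(df["balance"].isna().sum()),
    "negative_balances": int((df["balance"] < 0).sum()),
}

# Keep only rows that are unique, non-null, and non-negative.
clean = df.drop_duplicates("account_id").dropna(subset=["balance"])
clean = clean[clean["balance"] >= 0]
```

In a real pipeline, the `checks` dictionary would be emitted to a monitoring system rather than inspected by hand.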

Solar Secure
Posted by Saurabh Singh
Remote only
0 - 1 yrs
₹8000 - ₹10000 / mo
Data Science
Python
Jupyter Notebook

About the Company:

Nextgen Ai Technologies is at the forefront of innovation in artificial intelligence, specializing in developing cutting-edge AI solutions that transform industries. We are committed to pushing the boundaries of AI technology to solve complex challenges and drive business success.


Currently offering a "Data Science Internship" for 2 months.


Data science projects the interns will work on:

Project 01 : Image Caption Generator Project in Python

Project 02 : Credit Card Fraud Detection Project

Project 03 : Movie Recommendation System

Project 04 : Customer Segmentation

Project 05 : Brain Tumor Detection with Data Science
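As a flavor of Project 02, here is a deliberately simple fraud-detection sketch that flags outlier transaction amounts by z-score. The data is synthetic and the threshold is illustrative; a real project would train a proper model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic transaction amounts: mostly ordinary spends around 50,
# plus five large outliers standing in for fraudulent charges.
amounts = np.concatenate([rng.normal(50, 10, 995),
                          [900.0, 950.0, 1000.0, 1100.0, 1200.0]])

# Flag any transaction more than 3 standard deviations above the mean.
z_scores = (amounts - amounts.mean()) / amounts.std()
flagged = amounts[z_scores > 3]
```

The five injected outliers end up flagged while the ordinary spends do not, which is the intuition a supervised fraud model builds on.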


Eligibility


A PC or Laptop with decent internet speed.

Good understanding of the English language.

Any graduate with a desire to become a web developer. Freshers are welcome.

Knowledge of HTML, CSS and JavaScript is a plus but NOT mandatory.

You will also get proper training, so don't hesitate to apply if you don't have a coding background.


Please note that THIS IS AN INTERNSHIP, NOT A JOB.


We recruit permanent employees from among our interns only (if needed).


Duration: 2 months

MODE: Work From Home (Online)


Responsibilities


Manage reports and sales leads in Salesforce.com CRM.

Develop content, manage design, and user access to SharePoint sites for customers and employees.

Build data-driven reports and stored procedures and optimize queries using SQL and PL/SQL.

Learn the essentials of C++ and Java to refine code and build the front-end layer of web pages.

Configure and load XML data for the BVT tests.

Set up a GitHub page.

Develop Spark scripts using the Scala shell as per requirements.

Develop and A/B test improvements to business survey questions on iOS.

Deploy statistical models to various company data streams using Linux shells.

Create monthly performance-based client billing reports using MySQL and NoSQL databases.

Utilize Hadoop and MapReduce to generate dynamic queries and extract data from HDFS.

Create source code in JavaScript and PHP to make web pages functional.

Excellent problem-solving skills and the ability to work independently or as part of a team.

Effective communication skills to convey complex technical concepts.


Benefits


Internship Certificate

Letter of recommendation

Stipend: performance-based

Part-time work from home (2-3 hrs per day)

5 days a week, Fully Flexible Shift


CK-12 Foundation

Posted by Amit Gupta
Bengaluru (Bangalore)
3 - 6 yrs
Best in industry
PyTorch
Natural Language Processing (NLP)
Machine Learning (ML)
Scikit-Learn
Large Language Model
+2 more

We are seeking a talented and motivated AI Verification Engineer to join our team. The ideal candidate will be responsible for the validation of our AI and Machine Learning systems, ensuring that they meet all necessary quality assurance requirements and work reliably and optimally in real-world scenarios. The role requires strong analytical skills, a good understanding of AI and ML technologies, and a dedication to achieving excellence in the production of state-of-the-art systems.


Key Responsibilities:

  1. Develop and execute validation strategies and test plans for AI and ML systems, during development and on production environments.
  2. Work closely with AI/ML engineers and data scientists in understanding system requirements and capabilities and coming up with key metrics for system efficacy.
  3. Evaluate the system performance under various operating conditions, data variety, and scenarios.
  4. Perform functional, stress, system, and other testing types to ensure our systems' reliability and robustness.
  5. Create automated test procedures and systems for regular verification and validation processes, and detect anomalies in usage.
  6. Report and track defects, providing detailed information to facilitate problem resolution.
  7. Lead the continuous review and improvement of validation and testing methodologies, procedures, and tools.
  8. Provide detailed reports and documentation on system performance, issues, and validation results.


Required Skills & Qualifications:

  1. Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  2. Proven experience in the testing and validation of AI/ML systems or equivalent complex systems.
  3. Good knowledge and understanding of AI and ML concepts, tools, and frameworks.
  4. Proficient in scripting and programming languages such as Python, shell scripts etc.
  5. Experience with AI/ML platforms and libraries such as TensorFlow, PyTorch, Keras, or Scikit-Learn.
  6. Excellent problem-solving abilities and attention to detail.
  7. Strong communication skills, with the ability to document and explain complex technical concepts clearly.
  8. Ability to work in a fast-paced, collaborative environment.


Preferred Skills & Qualifications:

  1.  A good understanding of various large language models, image models, and their comparative strengths and weaknesses.
  2. Knowledge of CI/CD pipelines and experience with tools such as Jenkins, Git, Docker.
  3. Experience with cloud platforms like AWS, Google Cloud, or Azure.
  4. Understanding of Data Analysis and Visualization tools and techniques.



[x]cube LABS

Posted by Krishna kandregula
Hyderabad
2 - 6 yrs
₹8L - ₹20L / yr
ETL
Informatica
Data Warehouse (DWH)
PowerBI
DAX
+12 more
  • Creating and managing ETL/ELT pipelines based on requirements
  • Build PowerBI dashboards and manage the datasets needed.
  • Work with stakeholders to identify data structures needed for future and perform any transformations including aggregations.
  • Build data cubes for real-time visualisation needs and CXO dashboards.
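The ETL/ELT duty above, reduced to a toy pandas sketch. Table and column names are made up; in a real pipeline the extract and load steps would read from and write to the warehouse rather than an in-memory frame:

```python
import pandas as pd

# Hypothetical raw sales extract ("E" is stubbed out here).
raw = pd.DataFrame({
    "region": ["North", "North", "South", "South", "South"],
    "month": ["2024-01", "2024-02", "2024-01", "2024-01", "2024-02"],
    "revenue": [120.0, 130.0, 80.0, 70.0, 90.0],
})

# "T": aggregate to the grain a dashboard would consume --
# one row per region per month.
dashboard_ds = (
    raw.groupby(["region", "month"], as_index=False)["revenue"]
       .sum()
       .sort_values(["region", "month"], ignore_index=True)
)
```

A dataset shaped like `dashboard_ds` is what a PowerBI report would then bind to.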


Required Tech Skills


  • Microsoft PowerBI & DAX
  • Python, Pandas, PyArrow, Jupyter Notebooks, Apache Spark
  • Azure Synapse, Azure DataBricks, Azure HDInsight, Azure Data Factory



Vmultiply solutions


Agency job
via Vmultiply solutions by Maimuna fatima
Remote only
5 - 10 yrs
₹8L - ₹10L / yr
Elasticsearch
Apache Kafka
MongoDB
Jupyter Notebook
Databricks
+2 more

1. Need to have an understanding of Elasticsearch, Kafka, MongoDB, etc.

2. Should have experience with Jupyter notebooks and Databricks

3. Java, Python

4. Senior level, 5-10 years of experience

5. It is important that they have these skills so that they can take over the current work. There is code written in both Java and Python (Java is legacy, but it is the main search-engine code), so it would be counter-productive if the engineers hired lacked experience in either.

6. Excellent communication, analytical, research, and quick-grasping skills

Institutional-grade tools to understand digital assets


Agency job
via Qrata by Blessy Fernandes
Bengaluru (Bangalore), Coimbatore
3 - 7 yrs
₹20L - ₹30L / yr
Business Analysis
Python
SQL
Web3js
Tableau
+1 more

Qualifications

● Bachelor's degree in Mathematics, Statistics, a relevant technical field, or equivalent practical experience, or a degree in an analytical field (e.g., Computer Science, Engineering, Mathematics, Statistics, Operations Research, Management Science)

● 3+ years of experience with data analysis and metrics development

● 3+ years of experience analyzing and interpreting data, drawing conclusions, defining recommended actions, and reporting results across stakeholders

● 2+ years of experience writing SQL queries

● 2+ years of experience scripting in Python

● Demonstrated curiosity in and excitement for Web3/blockchain technologies

● Interested in learning new technologies to solve customer needs with lots of creative freedom

● Strong communication skills and business acumen

● Self-starter, motivated by an interest in developing the best possible solutions to problems

● Experience with Google Cloud (BigQuery), the Databricks stack, dbt, Tableau, and Jupyter is a plus

Angel One

Posted by Andleeb Mujeeb
Remote only
1 - 5 yrs
₹6L - ₹9L / yr
Python
pandas
NumPy
Jupyter Notebook
PyCharm

Position description:

  • Architect and design systems for predictive analysis and write algorithms to deal with financial data
  • Must have experience with web services and APIs (REST, JSON, and similar), including creating and consuming RESTful APIs
  • Proficient in writing algorithms with Python/Pandas/NumPy in Jupyter/PyCharm
  • Experience with relational and NoSQL databases (e.g., MSSQL, MongoDB, Redshift, PostgreSQL, Redis)
  • Implementing machine learning models using Python/R for best performance
  • Working with time-series data and analyzing large data sets
  • Implementing financial strategies in Python and generating reports to analyze the strategy results
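For illustration, the time-series work described above typically starts from building blocks like these. The prices are hypothetical and the metrics are generic, not the company's actual strategy code:

```python
import pandas as pd

# Hypothetical daily closing prices for one instrument.
prices = pd.Series(
    [100.0, 102.0, 101.0, 105.0, 107.0],
    index=pd.date_range("2024-01-01", periods=5, freq="D"),
)

# Daily returns and a short moving average -- the kind of
# building blocks a strategy report would aggregate.
returns = prices.pct_change()
sma3 = prices.rolling(window=3).mean()
```

A backtest would combine series like `returns` and `sma3` into entry/exit signals and then summarize the results in a report.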

 

Primary Responsibilities:

  • Write algorithms to deal with financial data, implement financial strategies (in Python and SQL), and generate reports to analyze the strategy results.

 

Preferred educational qualification: Bachelor's degree

Required Knowledge:

  • Highly skilled in SQL, Python, Pandas, NumPy, machine learning, predictive modelling, algorithm design, and OOP concepts
  • 2-7 years of full-time working experience in a core SQL/Python role (non-support)
  • Bachelor’s Degree in Engineering, equivalent or higher education.
Numantra Technologies

Posted by nisha mattas
Remote, Mumbai, Powai
2 - 12 yrs
₹8L - ₹18L / yr
ADF
PySpark
Jupyter Notebook
Big Data
Windows Azure
+3 more
      • Data pre-processing, data transformation, data analysis, and feature engineering
      • Performance optimization of scripts (code) and Productionizing of code (SQL, Pandas, Python or PySpark, etc.)
    • Required skills:
      • Bachelor's in Computer Science, Data Science, Computer Engineering, IT, or equivalent
      • Fluency in Python (Pandas), PySpark, SQL, or similar
      • Azure data factory experience (min 12 months)
      • Able to write efficient code using traditional and OO concepts and modular programming, following the SDLC process
      • Experience in production optimization and end-to-end performance tracing (technical root cause analysis)
      • Ability to work independently with demonstrated experience in project or program management
      • Azure experience; ability to translate data scientists' Python code and make it efficient for cloud (production) deployment
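A minimal sketch of the pre-processing and feature-engineering step named above, using pandas. All table, column, and feature names are illustrative:

```python
import pandas as pd

# Hypothetical raw events needing pre-processing before modeling.
events = pd.DataFrame({
    "user": ["a", "a", "b", "b", "b"],
    "amount": [10.0, 30.0, 5.0, 15.0, 25.0],
})

# Feature engineering: per-user aggregates for a model's feature table...
features = events.groupby("user")["amount"].agg(total="sum", mean="mean")

# ...and a row-level feature: each amount normalized by its user's maximum.
events["amount_norm"] = (
    events["amount"] / events.groupby("user")["amount"].transform("max")
)
```

The same groupby/aggregate/transform pattern carries over to PySpark almost line for line, which is one reason pandas fluency is listed alongside it.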
 