Jupyter Notebook Jobs in Hyderabad


Apply to 11+ Jupyter Notebook Jobs in Hyderabad on CutShort.io. Explore the latest Jupyter Notebook Job opportunities across top companies like Google, Amazon & Adobe.

[x]cube LABS

Posted by Krishna Kandregula
Hyderabad
2 - 6 yrs
₹8L - ₹20L / yr
ETL
Informatica
Data Warehouse (DWH)
PowerBI
DAX
  • Create and manage ETL/ELT pipelines based on requirements
  • Build Power BI dashboards and manage the required datasets
  • Work with stakeholders to identify data structures needed for future use and perform transformations, including aggregations
  • Build data cubes for real-time visualisation needs and CXO dashboards


Required Tech Skills


  • Microsoft Power BI & DAX
  • Python, Pandas, PyArrow, Jupyter Notebooks, Apache Spark
  • Azure Synapse, Azure Databricks, Azure HDInsight, Azure Data Factory
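The ETL-and-dashboard work described above can be sketched in miniature with pandas (the data and column names below are invented for illustration, not taken from the actual role):

```python
import pandas as pd

# Hypothetical sales records standing in for an extracted source table
sales = pd.DataFrame({
    "region": ["North", "South", "North", "South"],
    "revenue": [120.0, 80.0, 150.0, 95.0],
})

# Transform step: aggregate revenue per region, the kind of dataset
# a Power BI dashboard would then visualise
summary = (
    sales.groupby("region", as_index=False)["revenue"]
    .sum()
    .rename(columns={"revenue": "total_revenue"})
)
```

In a real pipeline the extract would come from a source system, and the load step would publish the aggregated table to a warehouse or a Power BI dataset.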



master works
Posted by Spandana Bomma
Hyderabad
3 - 7 yrs
₹6L - ₹15L / yr
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Computer Vision
recommendation algorithm

Job Description:

Responsibilities:

* Work on real-world computer vision problems

* Write robust industry-grade algorithms

* Leverage OpenCV, Python and deep learning frameworks to train models.

* Use Deep Learning technologies such as Keras, Tensorflow, PyTorch etc.

* Develop integrations with various in-house or external microservices.

* Must have experience with deployment practices (Kubernetes, Docker, containerization, etc.) and model compression techniques

* Research latest technologies and develop proof of concepts (POCs).

* Build and train state-of-the-art deep learning models to solve Computer Vision related problems, including, but not limited to:

* Segmentation

* Object Detection

* Classification

* Object Tracking

* Visual Style Transfer

* Generative Adversarial Networks

* Work alongside other researchers and engineers to develop and deploy solutions for challenging real-world problems in the area of Computer Vision

* Develop and plan Computer Vision research projects, including defining the scope of work, research objectives, and expected outcomes

* Provide specialized technical/scientific research to support the organization on projects involving existing and new technologies
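As the simplest possible illustration of the segmentation task listed above, here is plain intensity thresholding with NumPy; the pixel values are made up, and production work would use the deep-learning frameworks named in this posting (Keras, TensorFlow, PyTorch) rather than a fixed threshold:

```python
import numpy as np

# A toy 4x4 grayscale "image"; real pipelines would load frames with OpenCV
image = np.array([
    [10, 200, 30, 220],
    [15, 210, 25, 230],
    [12, 40, 35, 240],
    [11, 45, 30, 250],
], dtype=np.uint8)

# Naive segmentation: mark every pixel brighter than 128 as foreground
mask = image > 128
foreground_pixels = int(mask.sum())
```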

Skills:

* Object Detection

* Computer Science

* Image Processing

* Computer Vision

* Deep Learning

* Artificial Intelligence (AI)

* Pattern Recognition

* Machine Learning

* Data Science

* Generative Adversarial Networks (GANs)

* Flask

* SQL

Digitalshakha
Posted by Saurabh Deshmukh
Remote, Bengaluru (Bangalore), Mumbai, Hyderabad, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
2 - 5 yrs
₹3L - ₹10L / yr
PowerBI
Spotfire
Qlikview
Tableau
Data Visualization

Key Responsibilities:

  1. Collaborate with business stakeholders and data analysts to understand reporting requirements and translate them into effective Power BI solutions.
  2. Design and develop interactive and visually compelling dashboards, reports, and visualizations using Microsoft Power BI.
  3. Ensure data accuracy and consistency in the reports by working closely with data engineers and data architects.
  4. Optimize and streamline existing Power BI reports and dashboards for better performance and user experience.
  5. Develop and maintain data models and data connections to various data sources, ensuring seamless data integration.
  6. Implement security measures and data access controls to protect sensitive information in Power BI reports.
  7. Troubleshoot and resolve issues related to Power BI reports, data refresh, and connectivity problems.
  8. Stay updated with the latest Power BI features and capabilities, and evaluate their potential use in improving existing solutions.
  9. Conduct training sessions and workshops for end-users to promote self-service BI capabilities and enable them to create their own reports.
  10. Collaborate with the wider data and analytics team to identify opportunities for using Power BI to enhance business processes and decision-making.

Requirements:

  1. Bachelor's degree in Computer Science, Information Systems, or a related field.
  2. Proven experience as a Power BI Developer or similar role, with a strong portfolio showcasing previous Power BI projects.
  3. Proficient in Microsoft Power BI, DAX (Data Analysis Expressions), and M (Power Query) to manipulate and analyze data effectively.
  4. Solid understanding of data visualization best practices and design principles to create engaging and intuitive dashboards.
  5. Strong SQL skills and experience with data modeling and database design concepts.
  6. Knowledge of data warehousing concepts and ETL (Extract, Transform, Load) processes.
  7. Ability to work with various data sources, including relational databases, APIs, and cloud-based platforms.
  8. Excellent problem-solving skills and a proactive approach to identifying and addressing issues in Power BI reports.
  9. Familiarity with data security and governance practices in the context of Power BI development.
  10. Strong communication and interpersonal skills to collaborate effectively with cross-functional teams and business stakeholders.
  11. Experience with other BI tools (e.g., Tableau, QlikView) is a plus.

The role of a Power BI Developer is critical in enabling data-driven decision-making and empowering business users to gain valuable insights from data. The successful candidate will have a passion for data visualization and analytics, along with the ability to adapt to new technologies and drive continuous improvement in BI solutions. If you are enthusiastic about leveraging the power of data through Power BI, we encourage you to apply and join our dynamic team.
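The SQL and data-modelling skills in requirement 5 can be illustrated with a minimal star-schema query using Python's built-in sqlite3 module (the table and column names are invented; a Power BI report would sit on top of a model like this):

```python
import sqlite3

# One fact table joined to one dimension: the smallest possible star schema
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO fact_sales VALUES (1, 10.0), (1, 15.0), (2, 7.5);
""")

# Aggregate the fact table by the dimension attribute, as a report would
rows = con.execute("""
    SELECT p.name, SUM(s.amount) AS total
    FROM fact_sales s JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY total DESC
""").fetchall()
```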

Wallero technologies
Hyderabad
7 - 15 yrs
₹20L - ₹28L / yr
SQL
Data modeling
ADF
PowerBI
  1. Strong communication skills are essential, as the selected candidate will be responsible for leading a team of two in the future.
  2. Proficiency in SQL.
  3. Expertise in Data Modelling.
  4. Experience with Azure Data Factory (ADF).
  5. Competence in Power BI.
  6. SQL – strong in data modeling, table design, and SQL queries.
  7. ADF – hands-on experience setting up ADF pipelines end-to-end in Azure, including subscriptions, integration runtime (IR), and resource group creation.
  8. Power BI – hands-on experience building Power BI reports, including documentation and adherence to existing standards.


Ahmedabad, Hyderabad, Pune, Delhi
5 - 7 yrs
₹18L - ₹25L / yr
AWS Lambda
AWS Simple Notification Service (SNS)
AWS Simple Queuing Service (SQS)
skill iconPython
PySpark
Data Engineer

Required skill set: AWS Glue, AWS Lambda, AWS SNS/SQS, AWS Athena, Spark, Snowflake, Python

Mandatory Requirements  

  • Experience in AWS Glue
  • Experience in Apache Parquet 
  • Proficient in AWS S3 and data lake 
  • Knowledge of Snowflake
  • Understanding of file-based ingestion best practices.
  • Scripting languages – Python and PySpark

CORE RESPONSIBILITIES 

  • Create and manage cloud resources in AWS 
  • Ingest data from sources that expose it through different technologies (RDBMS, REST/HTTP APIs, flat files, streams, and time-series data from various proprietary systems); implement ingestion and processing with Big Data technologies
  • Process and transform data using technologies such as Spark and cloud services; understand the relevant business logic and implement it in a language supported by the base data platform
  • Develop automated data quality checks to ensure the right data enters the platform, and verify the results of calculations
  • Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
  • Define process improvement opportunities to optimize data collection, insights and displays.
  • Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible 
  • Identify and interpret trends and patterns from complex data sets 
  • Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders. 
  • Key participant in regular Scrum ceremonies with the agile teams  
  • Proficient at developing queries, writing reports and presenting findings 
  • Mentor junior members and bring best industry practices 
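The automated data-quality check mentioned above can be sketched with pandas (the column names and the two rules applied are illustrative assumptions, not the actual platform's rules):

```python
import pandas as pd

# Hypothetical ingested batch; column names are made up for illustration
batch = pd.DataFrame({
    "customer_id": [101, 102, None, 104],
    "amount": [25.0, -5.0, 30.0, 40.0],
})

# Quality checks: count rows with missing keys or negative amounts,
# then keep only rows that pass both rules
issues = {
    "missing_customer_id": int(batch["customer_id"].isna().sum()),
    "negative_amount": int((batch["amount"] < 0).sum()),
}
clean = batch.dropna(subset=["customer_id"]).query("amount >= 0")
```

A production version would log the `issues` counts and quarantine the failing rows instead of silently dropping them.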

QUALIFICATIONS 

  • 5-7+ years’ experience as data engineer in consumer finance or equivalent industry (consumer loans, collections, servicing, optional product, and insurance sales) 
  • Strong background in math, statistics, computer science, data science or related discipline
  • Advanced knowledge of one of the following languages: Java, Scala, Python, C#
  • Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake  
  • Proficient with:
      • Data mining/programming tools (e.g. SAS, SQL, R, Python)
      • Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
      • Data visualization tools (e.g. Tableau, Looker, MicroStrategy)
  • Comfortable learning about and deploying new technologies and tools. 
  • Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines. 
  • Good written and oral communication skills and ability to present results to non-technical audiences 
  • Knowledge of business intelligence and analytical tools, technologies and techniques.

  

Familiarity and experience in the following is a plus:  

  • AWS certification
  • Spark Streaming 
  • Kafka Streaming / Kafka Connect 
  • ELK Stack 
  • Cassandra / MongoDB 
  • CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
OSBIndia Private Limited
Bengaluru (Bangalore), Hyderabad
5 - 12 yrs
₹10L - ₹18L / yr
Data Warehouse (DWH)
Informatica
ETL
SQL
Stored Procedures

1.      Core Responsibilities

·        Leading solutions for data engineering

·        Maintain the integrity of both the design and the data that is held within the architecture

·        Champion and educate people in the development and use of data engineering best practices

·        Support the Head of Data Engineering and lead by example

·        Contribute to the development of database management services and associated processes relating to the delivery of data solutions

·        Provide requirements analysis, documentation, development, delivery and maintenance of data platforms.

·        Develop database requirements in a structured and logical manner, ensuring delivery is aligned with business prioritisation and best practice

·        Design and deliver performance enhancements, application migration processes and version upgrades across a pipeline of BI environments.

·        Provide support for the scoping and delivery of BI capability to internal users.

·        Identify risks and issues and escalate to Line / Project manager. 

·        Work with clients, existing asset owners and their service providers, and non-BI development staff to clarify and deliver work-stream objectives within timescales that meet overall project expectations.

·        Develop and maintain documentation in support of all BI processes.

·        Proactively identify cost-justifiable improvements to data manipulation processes.

·        Research and promote relevant BI tools and processes that contribute to increased efficiency and capability in support of corporate objectives.

·        Promote a culture that embraces change, continuous improvement and a ‘can do’ attitude.

·        Demonstrate enthusiasm and self-motivation at all times.

·        Establish effective working relationships with other internal teams to drive improved efficiency and effective processes.

·        Be a champion for high quality data and use of strategic data repositories, associated relational model, and Data Warehouse for optimising the delivery of accurate, consistent and reliable business intelligence

·        Ensure that you fully understand and comply with the organisation’s Risk Management Policies as they relate to your area of responsibility and demonstrate in your day to day work that you put customers at the heart of everything you do.

·        Ensure that you fully understand and comply with the organisation’s Data Governance Policies as they relate to your area of responsibility and demonstrate in your day to day work that you treat data as an important corporate asset which must be protected and managed.

·        Maintain the company’s compliance standards and ensure timely completion of all mandatory on-line training modules and attestations.

 

2.      Experience Requirements

·        5 years' Data Engineering / ETL development experience is essential

·        5 years' data design experience in an MI / BI / Analytics environment (Kimball, lakehouse, data lake) is essential

·        5 years' experience of working in a structured Change Management project lifecycle is essential

·        Experience of working in a financial services environment is desirable

·        Experience of dealing with senior management within a large organisation is desirable

·        5 years' experience of developing on large, complex projects and programmes is desirable

·        Experience mentoring other members of the team on best practice and internal standards is essential

·        Experience with cloud data platforms (Microsoft Azure) is desirable

 

3.      Knowledge Requirements

·        A strong knowledge of business intelligence solutions and an ability to translate this into data solutions for the broader business is essential

·        Strong demonstrable knowledge of data warehouse methodologies

·        Robust understanding of high level business processes is essential

·        Understanding of data migration, including reconciliation, data cleanse and cutover is desirable

Synechron

Posted by Ranjini N
Bengaluru (Bangalore), Hyderabad
6 - 10 yrs
₹2L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
skill iconPython
Shell Scripting

Position: ETL Developer

Location: Mumbai

Exp.Level: 4+ Yrs

Required Skills:

* Strong scripting knowledge, such as Python and shell

* Strong relational database skills especially with DB2/Sybase

* Create high quality and optimized stored procedures and queries

* Strong with scripting language such as Python and Unix / K-Shell

* Strong knowledge base of relational database performance and tuning such as: proper use of indices, database statistics/reorgs, de-normalization concepts.

* Familiarity with the lifecycle of a trade and the flows of data in an investment banking operation is a plus.

* Experienced in Agile development process

* Java Knowledge is a big plus but not essential

* Experience in delivery of metrics / reporting in an enterprise environment (e.g. demonstrated experience in BI tools such as Business Objects, Tableau, report design & delivery) is a plus

* Experience on ETL processes and tools such as Informatica is a plus. Real time message processing experience is a big plus.

* Good team player; Integrity & ownership
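The point above about proper use of indices can be demonstrated with sqlite3 from the Python standard library; the trades table is a made-up stand-in, but DB2 and Sybase query planners make the same scan-versus-index choice:

```python
import sqlite3

# Toy table of trades; the schema is invented for illustration
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE trades (trade_id INTEGER, symbol TEXT, qty INTEGER)")
con.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [(i, "SYM%d" % (i % 100), i) for i in range(1000)],
)
con.execute("CREATE INDEX idx_trades_symbol ON trades(symbol)")

# EXPLAIN QUERY PLAN reveals whether the equality lookup uses the index
plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM trades WHERE symbol = 'SYM7'"
).fetchall()
uses_index = any("idx_trades_symbol" in row[-1] for row in plan)
```

Without the index, the same query plan would show a full table scan, which is exactly the kind of tuning issue the role describes.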

Persistent Systems

Agency job via Milestone Hr Consultancy, posted by Haina Khan
Pune, Bengaluru (Bangalore), Hyderabad, Nagpur
4 - 9 yrs
₹4L - ₹15L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
Greetings,

We have an urgent requirement for Big Data Developer profiles at our reputed MNC client.

Location: Pune/Bangalore/Hyderabad/Nagpur
Experience: 4-9 yrs

Skills: PySpark and AWS; or Spark, Scala, and AWS; or Python and AWS
Softobiz Technologies Private limited

Posted by Swati Sharma
Hyderabad
5 - 13 yrs
₹10L - ₹25L / yr
Azure Data Factory
SQL Server
SQL Server Integration Services (SSIS)
Data Warehouse (DWH)

Responsibilities


  • Design and implement Azure BI infrastructure, ensure overall quality of delivered solution 
  • Develop analytical & reporting tools, promote and drive adoption of developed BI solutions 
  • Actively participate in BI community 
  • Establish and enforce technical standards and documentation 
  • Participate in daily scrums  
  • Record progress daily in assigned DevOps items


Ideal Candidates should have


  • 5 + years of experience in a similar senior business intelligence development position 
  • To be successful in the role you will require a high level of expertise across all facets of the Microsoft BI stack and prior experience in designing and developing well-performing data warehouse solutions 
  • Demonstrated experience using development tools such as Azure SQL database, Azure Data Factory, Azure Data Lake, Azure Synapse, and Azure DevOps. 
  • Experience with development methodologies including Agile, DevOps, and CI/CD patterns
  • Strong oral and written communication skills in English 
  • Ability and willingness to learn quickly and continuously 
  • Bachelor's degree in Computer Science


Hammoq

Posted by Nikitha Muthuswamy
Remote, Indore, Ujjain, Hyderabad, Bengaluru (Bangalore)
5 - 8 yrs
₹5L - ₹15L / yr
pandas
NumPy
Data engineering
Data Engineer
Apache Spark
  • Performs analytics to extract insights from the organization's raw historical data.
  • Generates usable training datasets for any/all MV projects with the help of annotators, if needed.
  • Analyses user trends and identifies their biggest bottlenecks in the Hammoq workflow.
  • Tests the short/long-term impact of productized MV models on those trends.
  • Skills – NumPy, Pandas, Apache Spark (PySpark), and ETL are mandatory.
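The bottleneck analysis described above can be sketched with pandas (the workflow stage names and timings are invented for illustration):

```python
import pandas as pd

# Hypothetical workflow-event log standing in for raw historical data
log = pd.DataFrame({
    "stage": ["listing", "photo", "photo", "listing", "photo", "ship"],
    "seconds": [30, 120, 90, 25, 150, 40],
})

# The stage with the highest mean time is a simple proxy for the
# biggest bottleneck in the workflow
avg = log.groupby("stage")["seconds"].mean()
bottleneck = avg.idxmax()
```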
Carbynetech

Posted by Sahithi Kandlakunta
Hyderabad
3 - 9 yrs
₹6L - ₹8L / yr
PowerBI
DAX
MDX
Windows Azure
Databricks
Should have experience in building and delivering dashboard and analytics solutions using Microsoft's Power BI and related Azure data services.
Should have business intelligence experience in a data warehouse environment.
Should have good experience writing Power Query, DAX, and MDX for complex data projects.
Should be good with REST services, including API documentation.

Should have experience authoring, diagnosing, and altering SQL Server objects and T-SQL queries.

Should have worked on Tabular models in Azure Analysis Services or SSAS.

Should have experience with the Microsoft Azure platform.