
11+ RDF Jobs in Pune | RDF Job openings in Pune

Apply to 11+ RDF Jobs in Pune on CutShort.io. Explore the latest RDF Job opportunities across top companies like Google, Amazon & Adobe.

Bangalore, Hyderabad, Gurgaon, Kolkata, Mumbai, Pune, Chennai
8 - 12 yrs
₹25L - ₹40L / yr
GraphQL
Java
RDF

The Knowledge Graph Architect is responsible for designing, developing, and implementing knowledge graph technologies to enhance organizational data understanding and decision-making capabilities. This role involves collaborating with data scientists, engineers, and business stakeholders to integrate complex data into accessible and insightful knowledge graphs.

Work you’ll do


1. Design and develop scalable and efficient knowledge graph architectures.

2. Implement knowledge graph integration with existing data systems and business processes.

3. Lead the ontology design, data modeling, and schema development for knowledge representation.

4. Collaborate with IT and business units to understand data needs and deliver comprehensive knowledge graph solutions.

5. Manage the lifecycle of knowledge graph data, including quality, consistency, and updates.

6. Provide expertise in semantic technologies and machine learning to enhance data interconnectivity and retrieval.

7. Develop and maintain documentation and specifications for system architectures and designs.

8. Stay updated with the latest industry trends in knowledge graph technologies and data management.

The Team


Innovation & Technology anticipates how technology will shape the future and begins building future capabilities and practices today. I&T drives the ideation, incubation, and scaling of hybrid businesses and tech-enabled offerings across a prioritized offering portfolio and industry interactions.

It drives cultural and capability transformation from solely services-based businesses to hybrid businesses. While others bet on the future, I&T builds it with you.

I&T encompasses many teams—dreamers, designers, builders—and partners with the business to bring a unique POV to deliver services and products for clients.

Qualifications and Experience


Required:


1. Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.

2. 6-10 years of professional experience in data engineering, with proven experience in designing and implementing knowledge graph systems.

3. Strong understanding of semantic web technologies (RDF, SPARQL, GraphQL, OWL, etc.).

4. Experience with graph databases such as Neo4j, Amazon Neptune, or others.

5. Proficiency in programming languages relevant to data management (e.g., Python, Java, JavaScript).

6. Excellent analytical and problem-solving abilities.

7. Strong communication and collaboration skills to work effectively across teams.

Preferred:


1. Experience with machine learning and natural language processing.

2. Experience with Industry 4.0 technologies and principles.

3. Prior exposure to cloud platforms and services like AWS, Azure, or Google Cloud.

4. Experience with containerization technologies like Docker and Kubernetes.
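The triple-pattern matching at the heart of the RDF/SPARQL skills above can be illustrated in a few lines. This is a minimal pure-Python sketch, not a real triple store (production work would use something like rdflib, Neo4j, or Amazon Neptune); all names and data are illustrative only.

```python
# Minimal sketch of RDF-style triple-pattern matching, the core idea
# behind SPARQL queries. Triples are (subject, predicate, object);
# variables in a pattern start with "?".

triples = {
    ("alice", "worksFor", "acme"),
    ("bob", "worksFor", "acme"),
    ("acme", "locatedIn", "pune"),
}

def match(pattern, store):
    """Return one variable-binding dict per triple that fits the pattern."""
    results = []
    for triple in store:
        binding = {}
        for pat, val in zip(pattern, triple):
            if pat.startswith("?"):
                binding[pat] = val       # bind the variable to this value
            elif pat != val:
                break                    # constant mismatch: reject triple
        else:
            results.append(binding)
    return results

# SPARQL analogue: SELECT ?who WHERE { ?who worksFor acme }
employees = match(("?who", "worksFor", "acme"), triples)
```

A real SPARQL engine adds joins across multiple patterns, but each pattern is resolved exactly as above.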

Deqode

at Deqode

1 recruiter
Posted by Purvisha Bhavsar
Pune
5 - 6 yrs
₹4L - ₹10L / yr
Windows Azure
Python
PySpark
ADF
Databricks
+2 more

🚀 Hiring: Data Engineer (Azure) at Deqode

⭐ Experience: 5+ Years

📍 Location: Pune, Bhopal, Jaipur, Gurgaon, Delhi, Bangalore

⭐ Work Mode: Hybrid

⏱️ Notice Period: Immediate Joiners

(Only immediate joiners & candidates serving notice period)


⭐ Hiring: Databricks Data Engineer – Lakeflow | Streaming | DBSQL | Data Intelligence

We are looking for a Databricks Data Engineer (Azure) to build reliable, scalable, and governed data pipelines powering analytics, operational reporting, and the Data Intelligence Layer.


🔹 Key Responsibilities

✅ Build optimized batch pipelines using Delta Lake (partitioning, OPTIMIZE, Z-ORDER, VACUUM)

✅ Implement incremental ingestion using Databricks Autoloader with schema evolution & checkpointing

✅ Develop Structured Streaming pipelines with watermarking, late data handling & restart safety

✅ Implement declarative pipelines using Lakeflow

✅ Design idempotent, replayable pipelines with safe backfills

✅ Optimize Spark workloads (AQE, skew handling, shuffle & join tuning)

✅ Build curated datasets for Databricks SQL (DBSQL), dashboards & downstream applications

✅ Package and deploy using Databricks Repos & Asset Bundles (CI/CD)

✅ Ensure governance using Unity Catalog and embedded data quality checks
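The "idempotent, replayable pipelines with safe backfills" responsibility above boils down to a checkpoint-plus-upsert pattern. Here is a hedged pure-Python sketch of that pattern; in Databricks this role is played by Autoloader checkpointing and a Delta `MERGE`, and the function and field names below are invented for illustration.

```python
# Sketch of an idempotent incremental-ingestion step. Records carry an
# id (the merge key) and a ts (event timestamp). A high-water-mark
# checkpoint plus keyed upserts makes reruns and backfills safe.

def ingest(batch, target, checkpoint):
    """Upsert records newer than the checkpoint; replays add no duplicates."""
    high_water = checkpoint.get("last_ts", 0)
    for rec in batch:
        if rec["ts"] <= high_water:
            continue                    # already ingested on a prior run
        target[rec["id"]] = rec         # MERGE-style upsert keyed on id
    if batch:
        checkpoint["last_ts"] = max(high_water,
                                    max(r["ts"] for r in batch))

target, checkpoint = {}, {}
batch = [{"id": 1, "ts": 10}, {"id": 2, "ts": 20}]
ingest(batch, target, checkpoint)
ingest(batch, target, checkpoint)   # replay: no effect, the step is idempotent
```

Because the upsert is keyed and the checkpoint only moves forward, re-running a failed batch (or a deliberate backfill below the watermark with the checkpoint reset) cannot double-count rows.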


✅ Mandatory Skills (Must Have)

👉 Databricks & Delta Lake (Advanced Optimization & Performance Tuning)

👉 Structured Streaming & Autoloader Implementation

👉 Databricks SQL (DBSQL) & Data Modeling for Analytics

NeoGenCode Technologies Pvt Ltd
Posted by Ritika Verma
Pune
6 - 8 yrs
₹5L - ₹25L / yr
Artificial Intelligence (AI)
Natural Language Processing (NLP)
TensorFlow
Scikit-Learn

Job Title: AI Architect

Location: Pune (Onsite)

Experience: 6+ Years

Employment Type: Full-Time


Working requirements:

Engagement Commitment: an initial 3 months, to ensure the engagement is progressing well.

Deliverable: a Minimum Viable Product already exists; enhancement is required.

Working Hours Preferable: California time (roughly 12.5-13.5 hours behind IST, depending on daylight saving). We request this because it makes it easier to huddle, and the US and India teams can work simultaneously.


About the Role:

We are seeking an experienced and visionary AI Architect to lead the design, development, and deployment of cutting-edge AI/ML solutions. The ideal candidate will have a strong foundation in artificial intelligence, machine learning, data engineering, and cloud technologies, with the ability to architect scalable and high-performing systems.

Key Responsibilities:

  • Design and implement end-to-end AI/ML architectures for large-scale enterprise applications.
  • Lead AI strategy, frameworks, and roadmaps in collaboration with product and engineering teams.
  • Guide the development of machine learning models (e.g., NLP, Computer Vision, Predictive Analytics).
  • Define best practices for model training, validation, deployment, monitoring, and versioning.
  • Collaborate with data engineers to build and maintain robust data pipelines.
  • Choose the right AI tools, libraries, and platforms based on project needs (e.g., TensorFlow, PyTorch, Hugging Face).
  • Work with cloud platforms (AWS, Azure, GCP) to deploy and manage AI models and services.
  • Ensure AI/ML solutions comply with data privacy, governance, and ethical standards.
  • Mentor junior AI engineers and data scientists.

Required Skills & Qualifications:

  • Bachelor's or Master’s degree in Computer Science, Data Science, AI/ML, or a related field.
  • 6+ years of experience in AI/ML, with at least 2+ years in an architecture or lead role.
  • Strong experience with AI/ML frameworks: TensorFlow, PyTorch, Scikit-learn, etc.
  • Deep understanding of LLMs, transformers, GPT models, and fine-tuning techniques.
  • Proficiency in Python and data processing libraries (Pandas, NumPy, etc.).
  • Experience with cloud-based AI services (AWS SageMaker, Azure ML, Vertex AI, etc.).
  • Knowledge of MLOps practices, CI/CD for models, and model monitoring.
  • Familiarity with data lakehouse architecture, real-time inference, and APIs.
  • Strong communication and leadership skills.
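The MLOps requirement above (CI/CD for models, versioning, monitoring) centers on a model registry with promotion gates. This is a minimal pure-Python sketch of that idea, loosely modeled on registries like MLflow's; the class, method names, and threshold are illustrative assumptions, not any specific product's API.

```python
# Sketch of a model registry with a quality gate: versions are
# append-only, and a "production" pointer controls which version
# serves traffic. Promotion fails if metrics miss the threshold.

class ModelRegistry:
    def __init__(self):
        self.versions = []          # append-only version history
        self.production = None      # version number serving traffic

    def register(self, name, metrics):
        self.versions.append({"name": name, "metrics": metrics})
        return len(self.versions)   # 1-based version number

    def promote(self, version, min_accuracy=0.9):
        entry = self.versions[version - 1]
        if entry["metrics"]["accuracy"] < min_accuracy:
            raise ValueError("model below promotion threshold")
        self.production = version

registry = ModelRegistry()
v1 = registry.register("classifier", {"accuracy": 0.87})
v2 = registry.register("classifier", {"accuracy": 0.93})
registry.promote(v2)   # v1 would fail the quality gate
```

In a real CI/CD-for-models pipeline, the `promote` step would run automatically after validation, and monitoring would trigger a rollback by moving the production pointer back.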

Preferred Qualifications:

  • Experience with generative AI applications and prompt engineering.
  • Knowledge of reinforcement learning or federated learning.
  • Publications or contributions to open-source AI projects.
  • AI certifications from cloud providers (AWS, Azure, GCP).

Why Join Us?

  • Work on transformative AI projects across industries.
  • Collaborate with a passionate and innovative team.
  • Flexible work environment with remote/hybrid options.
  • Continuous learning, upskilling, and growth opportunities.


global business process management company

Agency job
via Jobdost by Saida Pathan
Bengaluru (Bangalore), Pune, Mumbai, Hyderabad
2 - 10 yrs
₹6L - ₹20L / yr
Appian
BPM
Business process management
MySQL

Job Description:

  • Extensive experience in Appian BPM application development
  • Knowledge of Appian architecture and its objects best practices
  • Participate in analysis, design, and new development of Appian based applications
  • Provide team leadership and technical leadership to Scrum teams
  • Must be able to multi-task, work in a fast-paced environment, and resolve problems faced by the team
  • Build applications: interfaces, process flows, expressions, data types, sites, and integrations
  • Proficient with SQL queries and with accessing data present in DB tables and views
  • Experience in Analysis, Designing process models, Records, Reports, SAIL, forms, gateways, smart services, integration services and web services
  • Experience working with different Appian Object types, query rules, constant rules and expression rules

 Qualifications 

At least 6 years of experience implementing BPM solutions using Appian 19.x or higher

  • Over 8 years implementing IT solutions using BPM or integration technologies
  • Certification mandatory: Appian L1 and L2
  • Experience in Scrum/Agile methodologies with Enterprise level application development projects
  • Good understanding of database concepts and strong working knowledge of any one of the major databases (e.g., Oracle, SQL Server, MySQL)
Additional Information: Skills Required
  • Appian BPM application development on version 19.x or higher
  • Experience with integrations using web services (e.g., XML, REST, WSDL, SOAP APIs, JDBC, JMS)
  • Good leadership skills and the ability to technically lead a team of software engineers
  • Experience working in Agile Scrum teams
  • Good Communication skills
MindCrew Technologies

at MindCrew Technologies

3 recruiters
Agency job
Pune
8 - 12 yrs
₹10L - ₹15L / yr
Data engineering
Data modeling
Snowflake schema
ETL
ETL architecture
+3 more

Job Title: Lead Data Engineer

📍 Location: Pune

🧾 Experience: 10+ Years

💰 Budget: Up to 1.7 LPM


Responsibilities

  • Collaborate with Data & ETL teams to review, optimize, and scale data architectures within Snowflake.
  • Design, develop, and maintain efficient ETL/ELT pipelines and robust data models.
  • Optimize SQL queries for performance and cost efficiency.
  • Ensure data quality, reliability, and security across pipelines and datasets.
  • Implement Snowflake best practices for performance, scaling, and governance.
  • Participate in code reviews, knowledge sharing, and mentoring within the data engineering team.
  • Support BI and analytics initiatives by enabling high-quality, well-modeled datasets.
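The data-modeling responsibilities above revolve around splitting flat source records into facts and dimensions, the structure behind a star or snowflake schema. Here is a hedged pure-Python sketch of that normalization; in Snowflake itself this would be done with SQL ELT, and the column names below are invented for illustration.

```python
# Sketch of star/snowflake-schema normalization: each flat sales row
# is split into a fact row (the measure, keyed to a dimension) and a
# deduplicated dimension row (the descriptive attributes).

def load(flat_rows):
    dims, facts = {}, []
    for row in flat_rows:
        key = row["product"]
        # dimension rows are deduplicated on the natural key
        dims.setdefault(key, {"product": key, "category": row["category"]})
        # fact rows keep only the measure plus the dimension key
        facts.append({"product_key": key, "amount": row["amount"]})
    return dims, facts

rows = [
    {"product": "p1", "category": "c1", "amount": 100},
    {"product": "p1", "category": "c1", "amount": 50},
]
dims, facts = load(rows)   # one dimension row, two fact rows
```

Keeping measures in narrow fact rows and attributes in small dimension tables is what lets the warehouse prune scans and keep queries cheap, which is the point of the cost-optimization responsibility above.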


Magic EdTech

at Magic EdTech

3 recruiters
Posted by Jitendra Singh
Noida, Pune
6 - 9 yrs
₹10L - ₹15L / yr
React.js
JavaScript
Redux/Flux
HTML/CSS

Role - Principal Engineer | React.js

Skill Set - Programming

  • React.js with Redux
  • HTML5/CSS3
  • JavaScript
  • MobX (optional)
  • Storybook (optional)

Unit Test Cases: Jest & Enzyme

Code Quality:

  • SonarQube Knowledge
  • Strong grasp of unit-testing frameworks and writing effective unit tests

Other Important Skills:

  • Engineering & Design Skills
  • Good analytical & problem solving skills
  • Ability to ship features end to end without much guidance
  • Experienced with Agile methodologies

Soft Skills

  • Good communication skills
  • Zeal to learn new technologies and methodologies


Propelius Technologies

at Propelius Technologies

2 recruiters
Posted by Taniya P
Remote, Surat, Pune, Mumbai, Bengaluru (Bangalore), Hyderabad, Ahmedabad, Vadodara, Indore
2 - 4 yrs
₹5L - ₹10L / yr
Ruby on Rails (RoR)
Ruby
JavaScript
MySQL
React.js
+1 more
We are hiring for a Silicon Valley start-up, and you will be working directly with the team in San Francisco, CA, so good communication is a must.
We are looking for a highly skilled computer programmer who is comfortable with both front and back-end programming.
Full-stack developers are responsible for developing and designing front-end web architecture, ensuring the responsiveness of applications, and working alongside graphic designers for web design features, among other duties.
Full-stack developers will be required to see out a project from conception to the final product, requiring good organizational skills and attention to detail.
Full Stack Developer Responsibilities:
  • Developing front-end website architecture.
  • Designing user interactions on web pages.
  • Developing back-end website applications.
  • Creating servers and databases for functionality.
  • Ensuring cross-platform optimization for mobile phones.
  • Ensuring responsiveness of applications.
  • Working alongside graphic designers for web design features.
  • Seeing through a project from conception to finished product.
  • Designing and developing APIs.
  • Meeting both technical and consumer needs.
  • Staying abreast of developments in web applications and programming languages.
Full Stack Developer Requirements:
  • Degree in computer science.
  • Strong organizational and project management skills.
  • Proficiency with fundamental front-end languages such as HTML, CSS, and JavaScript.
  • Familiarity with JavaScript frameworks such as React.js.
  • Proficiency with Ruby on Rails.
  • Familiarity with database technology such as MySQL, Oracle, or PostgreSQL.
  • Excellent verbal communication skills.
  • Good problem-solving skills.
  • Attention to detail.
Benefits:
  • 5 working days
  • Flexible timings
  • Work from home granted
Horizontal Integration
Remote, Bengaluru (Bangalore), Hyderabad, Vadodara, Pune, Jaipur, Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
6 - 15 yrs
₹10L - ₹25L / yr
Amazon Web Services (AWS)
Microsoft Azure
Google Cloud Platform (GCP)
Docker
+2 more

Position Summary

DevOps is a Department of Horizontal Digital, within which we have 3 different practices.

  1. Cloud Engineering
  2. Build and Release
  3. Managed Services

This opportunity is for a Cloud Engineering role for candidates who also have experience with infrastructure migrations. It is a completely hands-on job focused on migrating client workloads to the cloud, reporting to the Solution Architect/Team Lead. Alongside that, you are also expected to work on projects building out Sitecore infrastructure from scratch.

We are a Sitecore Platinum Partner, and the majority of the infrastructure work we do is for Sitecore.

Sitecore is a .NET-based, enterprise-level web CMS that can be deployed on-premises or on IaaS, PaaS, and containers.

So, most of our DevOps work currently is planning, architecting, and deploying infrastructure for Sitecore.
 

Key Responsibilities:

  • This role includes ownership of technical, commercial and service elements related to cloud migration and Infrastructure deployments.
  • The selected candidate will ensure high customer satisfaction while delivering infrastructure and migration projects.
  • Candidates should expect to work across multiple projects in parallel and must have a fully flexible approach to working hours.
  • Candidates should keep themselves updated with the rapid technological advancements and developments taking place in the industry.
  • Candidates should also have know-how of Infrastructure as Code, Kubernetes, AKS/EKS, Terraform, Azure DevOps, and CI/CD pipelines.

Requirements:

  • Bachelor’s degree in computer science or equivalent qualification.
  • Total work experience of 6 to 8 Years.
  • Total migration experience of 4 to 6 Years.
  • Multiple Cloud Background (Azure/AWS/GCP)
  • Implementation knowledge of VMs and VNets.
  • Know-how of Cloud Readiness and Assessment
  • Good Understanding of 6 R's of Migration.
  • Detailed understanding of the cloud offerings
  • Ability to Assess and perform discovery independently for any cloud migration.
  • Working Exp. on Containers and Kubernetes.
  • Good knowledge of Azure Site Recovery/Azure Migrate/CloudEndure
  • Understanding of vSphere and Hyper-V virtualization.
  • Working experience with Active Directory.
  • Working experience with AWS CloudFormation/Terraform templates.
  • Working experience with VPN/ExpressRoute/peering/Network Security Groups/Route Tables/NAT Gateway, etc.
  • Experience working with CI/CD tools like Octopus Deploy, TeamCity, CodeBuild, CodeDeploy, Azure DevOps, and GitHub Actions.
  • High Availability and Disaster Recovery implementations, taking RTO and RPO aspects into consideration.
  • Candidates with AWS/Azure/GCP Certifications will be preferred.
Azoi

at Azoi

1 video
2 recruiters
Posted by Ayesha Aga
Pune
3 - 7 yrs
₹3L - ₹12L / yr
.NET
Web API
MVC
AngularJS
Microsoft Azure
Software Engineer / Sr. Engineer / Team Lead - BE/MCA with 3-7 years of experience in development of web applications/products.

  • Expertise in the MS stack (.NET 3.5/4/4.5, ASP.NET, MVC, Web API, WCF, SQL Server)
  • Good scripting skills - JavaScript, jQuery, Knockout.js, AngularJS, Durandal, etc.
  • Expertise in Test-Driven Development is a big advantage.
  • Experience with Azure Storage is a must.
  • Exposure to working in agile environments is preferred.

Job Responsibilities:

  1. Develop and code high-quality framework components.
  2. Develop and enhance cutting-edge applications/products.
  3. Develop technical documentation.
  4. Create automated unit test case suites.

If interested, please share the following details (all required): total experience, relevant experience, Web API or WCF experience, AngularJS or Azure experience, MVC experience, current CTC, expected CTC, and notice period.
CarOK

at CarOK

1 video
3 recruiters
Posted by Gurmohit Ahluwalia
Pune
2 - 4 yrs
₹2L - ₹4L / yr
Customer Success
Logistics
Operations
CarOK is an online platform for car servicing & repairs, currently based out of Pune with more than 3000 happy customers. CarOK service advisors inspect your car & tell you the minimum work required. The CarOK team sends you multiple quotes for your car's work, and then the advisors pick up, supervise, and drop off the car at your doorstep.
InfoVision Labs India Pvt. Ltd. Pune
Posted by Shekhar Singh Kshatri
Pune
5 - 10 yrs
₹5L - ₹5L / yr
Hadoop
Scala
Spark
We at InfoVision Labs are passionate about technology and what our clients would like to get accomplished. We continuously strive to understand business challenges, the changing competitive landscape, and how cutting-edge technology can help position our clients at the forefront of the competition. We are a fun-loving team of usability experts and software engineers focused on mobile technology, responsive web solutions, and cloud-based solutions.

Job Responsibilities:

  • Minimum 3 years of experience in Big Data skills required.
  • Complete life cycle experience with Big Data is highly preferred.
  • Skills - Hadoop, Spark, R, Hive, Pig, HBase, and Scala.
  • Excellent communication skills.
  • Ability to work independently with no supervision.