11+ RDF Jobs in Chennai | RDF Job openings in Chennai
The Knowledge Graph Architect is responsible for designing, developing, and implementing knowledge graph technologies to enhance organizational data understanding and decision-making capabilities. This role involves collaborating with data scientists, engineers, and business stakeholders to integrate complex data into accessible and insightful knowledge graphs.
Work you’ll do
1. Design and develop scalable and efficient knowledge graph architectures.
2. Implement knowledge graph integration with existing data systems and business processes.
3. Lead the ontology design, data modeling, and schema development for knowledge representation.
4. Collaborate with IT and business units to understand data needs and deliver comprehensive knowledge graph solutions.
5. Manage the lifecycle of knowledge graph data, including quality, consistency, and updates.
6. Provide expertise in semantic technologies and machine learning to enhance data interconnectivity and retrieval.
7. Develop and maintain documentation and specifications for system architectures and designs.
8. Stay updated with the latest industry trends in knowledge graph technologies and data management.
The Team
Innovation & Technology anticipates how technology will shape the future and begins building future capabilities and practices today. I&T drives the ideation, incubation, and scaling of hybrid businesses and tech-enabled offerings across a prioritized offering portfolio and industry interactions.
It drives cultural and capability transformation from solely services-based businesses to hybrid businesses. While others bet on the future, I&T builds it with you.
I&T encompasses many teams—dreamers, designers, builders—and partners with the business to bring a unique POV to deliver services and products for clients.
Qualifications and Experience
Required:
1. Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
2. 6-10 years of professional experience in data engineering, with proven experience in designing and implementing knowledge graph systems.
3. Strong understanding of semantic web technologies (RDF, SPARQL, GraphQL, OWL, etc.).
4. Experience with graph databases such as Neo4j, Amazon Neptune, or others.
5. Proficiency in programming languages relevant to data management (e.g., Python, Java, JavaScript).
6. Excellent analytical and problem-solving abilities.
7. Strong communication and collaboration skills to work effectively across teams.
Preferred:
1. Experience with machine learning and natural language processing.
2. Experience with Industry 4.0 technologies and principles.
3. Prior exposure to cloud platforms and services like AWS, Azure, or Google Cloud.
4. Experience with containerization technologies like Docker and Kubernetes.
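As a minimal illustration of the RDF/SPARQL skills this role lists, the sketch below stores triples as Python tuples and answers a simple basic-graph-pattern query. The subjects, predicates, and data are hypothetical, and a real system would use a triple store or graph database (e.g. via rdflib or Neo4j) rather than this in-memory stand-in.

```python
# Minimal in-memory triple store illustrating RDF-style data and
# SPARQL-like basic graph pattern matching (hypothetical example data).

TRIPLES = {
    ("ex:alice", "rdf:type", "ex:Engineer"),
    ("ex:bob", "rdf:type", "ex:Engineer"),
    ("ex:alice", "ex:worksOn", "ex:KnowledgeGraph"),
    ("ex:bob", "ex:worksOn", "ex:Payroll"),
}

def match(pattern, triples=TRIPLES):
    """Return variable bindings (terms starting with '?') for each matching triple."""
    results = []
    for s, p, o in triples:
        binding = {}
        ok = True
        for term, value in zip(pattern, (s, p, o)):
            if term.startswith("?"):
                binding[term] = value
            elif term != value:
                ok = False
                break
        if ok:
            results.append(binding)
    return results

# Analogous to: SELECT ?who WHERE { ?who rdf:type ex:Engineer }
engineers = {b["?who"] for b in match(("?who", "rdf:type", "ex:Engineer"))}
```

The same pattern-matching idea underlies real SPARQL engines, which add joins across multiple patterns, filters, and indexes.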
Oracle EBS HCM Analyst
Position Description
We are seeking an experienced Techno-Functional Oracle EBS HCM Analyst with deep expertise in HR, Payroll, and OTL (Oracle Time & Labor) modules. The ideal candidate will have a strong understanding of both business processes and technical configurations, ensuring seamless integration, support, and optimization of the Oracle EBS HCM system across multiple GREs and payrolls.
Key Responsibilities:
Configure, maintain, and support Oracle EBS HCM modules including Core HR, Payroll, and OTL.
Manage and support multiple payrolls across several GREs, ensuring accuracy and compliance with organizational and statutory requirements.
Develop, troubleshoot, and optimize Payroll Fast Formulas and related functions.
Ensure compliance with federal and state tax withholding rules and minimum wage regulations.
Perform data validation, extracts, and reporting using SQL and other tools.
Create and maintain end-user documentation, training materials, and functional specifications.
Handle Service Requests (SRs) — creation, tracking, and resolution through Oracle Support or internal teams.
Collaborate with cross-functional teams to support IT and regression testing for patches, upgrades, and other system changes.
Analyze and resolve technical and functional issues impacting HR, Payroll, and OTL operations.
Partner with HR and Payroll business users to identify improvement opportunities and implement system enhancements.
Required Skills & Qualifications:
15+ years of techno-functional experience in Oracle EBS HCM (R12 or higher).
In-depth understanding of HR, Payroll, and OTL configurations and schemas.
Strong experience with Payroll Fast Formulas and core payroll functions.
Expertise in SQL for queries, data validation, and reporting.
Sound knowledge of tax rules, wage compliance, and statutory requirements.
Proven ability to support multiple GREs and payrolls simultaneously.
Experience with testing processes — IT, UAT, and regression testing.
Strong documentation, training, and communication skills.
Proven track record of handling SR lifecycle management effectively.
Nice to Have:
Experience with Oracle EBS R12.2 or higher versions.
Familiarity with Oracle Cloud HCM (for potential migration or hybrid environments).
Exposure to integration tools (e.g., WebADI, BI Publisher, or OTBI).
We are seeking a highly skilled and motivated Python Developer with hands-on experience in AWS cloud services (Lambda, API Gateway, EC2), microservices architecture, PostgreSQL, and Docker. The ideal candidate will be responsible for designing, developing, deploying, and maintaining scalable backend services and APIs, with a strong emphasis on cloud-native solutions and containerized environments.
Key Responsibilities:
- Develop and maintain scalable backend services using Python (Flask, FastAPI, or Django).
- Design and deploy serverless applications using AWS Lambda and API Gateway.
- Build and manage RESTful APIs and microservices.
- Implement CI/CD pipelines for efficient and secure deployments.
- Work with Docker to containerize applications and manage container lifecycles.
- Develop and manage infrastructure on AWS (including EC2, IAM, S3, and other related services).
- Design efficient database schemas and write optimized SQL queries for PostgreSQL.
- Collaborate with DevOps, front-end developers, and product managers for end-to-end delivery.
- Write unit, integration, and performance tests to ensure code reliability and robustness.
- Monitor, troubleshoot, and optimize application performance in production environments.
Required Skills:
- Strong proficiency in Python and Python-based web frameworks.
- Experience with AWS services: Lambda, API Gateway, EC2, S3, CloudWatch.
- Sound knowledge of microservices architecture and asynchronous programming.
- Proficiency with PostgreSQL, including schema design and query optimization.
- Hands-on experience with Docker and containerized deployments.
- Understanding of CI/CD practices and tools like GitHub Actions, Jenkins, or CodePipeline.
- Familiarity with API documentation tools (Swagger/OpenAPI).
- Version control with Git.
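A minimal sketch of the Lambda/API Gateway pattern this role centers on: a handler that parses an API Gateway proxy-integration event and returns a JSON response. The handler needs no AWS SDK, so it can be unit-tested locally; the event fields follow the proxy integration format, and the greeting logic is purely illustrative.

```python
import json

def lambda_handler(event, context):
    """Handle an API Gateway proxy event: echo a greeting for ?name=..."""
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation against a fake event (no AWS account required).
response = lambda_handler({"queryStringParameters": {"name": "dev"}}, None)
```

Keeping handlers free of SDK calls at the edges like this is what makes the unit and integration testing mentioned above practical.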
Who are we?
Kriyadocs is a leading document workflow SaaS platform focused on the publishing industry. Technology is at the core of our evolution – we’ve consciously striven to always stay ahead of the curve in its adoption to provide best-in-class capabilities for our clients and our employees. This ethos is reflected in our vision and mission.
Our Vision: To make publishing all content as simple as clicking a button and become the partner of choice for individuals and organizations looking to share knowledge.
Our Mission: Provide a fantastic experience to authors, content publishers and our own employees through technology and innovation, by publishing high-quality content seamlessly and quickly. We deliver Happy Authors and Happy Employees.
What is it really like to work here?
At Kriyadocs, every Kriyator is driven by our core culture to
- Deliver Excellence - Deliver Delight
- Stay Curious - Stay Driven
- Dream Big - Rise Together
You could also be a Kriyator, if you are
- Fearless in taking on challenges
- Focused on learning, demonstrating new skills and working towards successful outcomes
- Fanatical in taking pride and responsibility in all your work
Why should you join us?
- Industry Leading Product - We are the leading platform in our space and have several large global brands as our customers.
- Create an impact - We give you the environment to transform your ideas into reality and create fantastic experiences for our customers.
- Budding & Agile team - We are a growing team with a love for learning, a constant quest for quality, and an outspoken sense of ownership.
As a Technical & Style Editor at Kriyadocs, you will play a crucial role in refining and enhancing our written content to meet the highest editorial standards. You will work closely with our authors and writers to ensure accuracy, clarity, and adherence to our style guidelines. This is a fantastic opportunity to work in a creative and collaborative environment, contributing to the success of our publishing projects.
Key Responsibilities:
- Review and edit manuscripts for grammar, punctuation, and syntax, ensuring adherence to our in-house style guide.
- Verify and correct technical content, ensuring it is accurate, consistent, and understandable for the target audience.
- Collaborate with authors and writers to maintain the integrity of their work while enhancing readability and coherence.
- Provide feedback and suggestions to improve content structure and flow.
- Proofread and format documents to meet publishing standards.
- Assist in the development and maintenance of editorial guidelines.
Qualifications:
- Bachelor's degree in English, Journalism, or a related field.
- 1-2 years of experience in technical and style editing, preferably in the publishing industry.
- Attention to detail and an eye for consistency.
- Familiarity with editorial and publishing software is a plus.
- Excellent communication skills and the ability to work collaboratively.
If you're a detail-oriented editor with a passion for refining technical content and ensuring style consistency, we'd love to hear from you. Apply now to be a part of our dynamic team at Kriyadocs.
Job Types: Full-time, Permanent
Pay: Up to ₹20,000.00 per month
Benefits:
- Cell phone reimbursement
- Health insurance
- Provident Fund
Schedule:
- Day shift
Supplemental Pay:
- Yearly bonus
Ability to commute/relocate:
- Chennai, Tamil Nadu: Reliably commute or planning to relocate before starting work (Required)
Experience:
- total work: 1 year (Preferred)
5-7 years of experience in Data Engineering, with solid experience in the design, development, and implementation of end-to-end data ingestion and data processing systems on the AWS platform.
2-3 years of experience in AWS Glue, Lambda, Appflow, EventBridge, Python, PySpark, Lake House, S3, Redshift, Postgres, API Gateway, CloudFormation, Kinesis, Athena, KMS, IAM.
Experience in modern data architecture, Lake House, Enterprise Data Lake, Data Warehouse, API interfaces, solution patterns, standards and optimizing data ingestion.
Experience building data pipelines from source systems such as SAP Concur, Veeva Vault, Azure Cost, various social media platforms, or similar source systems.
Expertise in analyzing source data and designing a robust and scalable data ingestion framework and pipelines adhering to client Enterprise Data Architecture guidelines.
Proficient in design and development of solutions for real-time (or near real time) stream data processing as well as batch processing on the AWS platform.
Work closely with business analysts, data architects, data engineers, and data analysts to ensure that the data ingestion solutions meet the needs of the business.
Troubleshoot and provide support for issues related to data quality and data ingestion solutions. This may involve debugging data pipeline processes, optimizing queries, or troubleshooting application performance issues.
Experience in working in Agile/Scrum methodologies, CI/CD tools and practices, coding standards, code reviews, source management (GITHUB), JIRA, JIRA Xray and Confluence.
Experience or exposure to design and development using Full Stack tools.
Strong analytical and problem-solving skills, excellent communication (written and oral), and interpersonal skills.
Bachelor's or master's degree in computer science or related field.
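The ingestion-and-validation responsibilities above can be sketched as a small batch step: read raw records, separate rows that fail validation, and emit clean output. This is a stdlib-only illustration with hypothetical field names; a real pipeline on AWS would typically implement the same logic in Glue/PySpark jobs reading from S3.

```python
import csv
import io

# Hypothetical raw extract; row 2 is invalid (missing amount).
RAW = """id,amount,source
1,120.50,SAP Concur
2,,Veeva Vault
3,99.00,SAP Concur
"""

def validate(row):
    """A row is valid if it has an id and a parseable, non-negative amount."""
    try:
        return bool(row["id"]) and float(row["amount"]) >= 0
    except (ValueError, TypeError, KeyError):
        return False

def ingest(raw_text):
    """Split raw CSV records into clean and rejected sets."""
    reader = csv.DictReader(io.StringIO(raw_text))
    clean, rejected = [], []
    for row in reader:
        (clean if validate(row) else rejected).append(row)
    return clean, rejected

clean, rejected = ingest(RAW)
```

Routing rejects to a quarantine location instead of dropping them is what makes the downstream data-quality troubleshooting described above possible.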
• Server installation, configuration, and maintenance.
• Recommending, upgrading, and maintaining current Linux systems.
• Database administration (MySQL)
• Network configuration and system security
• Linux mail server.
• Ability to work on weekends and holidays.
Required Skills
• Good English language skills.
• Strong interpersonal communication skills; interacting positively with upper management.
• Independent problem-solving, self-direction.
• Comfortable with most aspects of operating system administration; for example, configuration of mail systems, system installation and configuration, printer systems, fundamentals of security, and installing third-party software.
• Has a solid understanding of a Linux-based operating system; understands paging and swapping, inter-process communication, devices and what device drivers do, and filesystem concepts (inodes, clustering, logical partitions).
• Familiarity with fundamental networking/distributed computing environment concepts; understands routing concepts.
• Ability to write scripts in some administrative language (Shell, Perl, Python).
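Administrative scripting of the kind listed above is often a small monitoring check. Below is a sketch in Python (one of the languages named) that flags filesystems above a usage threshold; the threshold and paths are illustrative, and a real check would feed an alerting system rather than return a list.

```python
import shutil

def usage_percent(path):
    """Return used-space percentage for the filesystem containing path."""
    total, used, free = shutil.disk_usage(path)
    return 100.0 * used / total

def check(paths, threshold=90.0):
    """Return the subset of paths whose filesystems exceed the threshold."""
    return [p for p in paths if usage_percent(p) > threshold]

# Example: flag anything over 90% full on the root filesystem.
over = check(["/"])
```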
Technical Skills Required – Mandatory:
- Cloud Technologies – AWS or Big Data (Basic Level)
- Python (Highly Experienced)
- Databases – SQL (Highly Experienced)
- Operating Systems – Unix (Highly Experienced)
Strong knowledge of writing complex SQL queries and of performance tuning.
Excellent working knowledge of the production support role and strong debugging/troubleshooting skills.
Provide 24×7 operational support to all production processes, including holidays and weekends.
Hands-on experience in production support for data engineering processes, root cause analysis, and identifying opportunities to improve existing processes.
Roles and Responsibilities:
Set up and optimize the existing integration, monitoring and alerting processes.
Working knowledge of ETL/ELT in Hadoop, Spark and MPP databases is required.
Effectively collaborate with the partners (SMEs, DBA, and Business users) to ensure the reliability of the Data systems.
Should have experience in Big Data and Hadoop.
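A small, self-contained illustration of the performance-tuning point above, using sqlite3 as a stand-in for a production warehouse: adding an index turns a full table scan into an index search, which EXPLAIN QUERY PLAN makes visible. Table and column names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, user_id INTEGER, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i, i % 100, "2024-01-01") for i in range(1000)],
)

QUERY = "SELECT COUNT(*) FROM events WHERE user_id = 42"

def plan(sql):
    """Return sqlite's query plan description for a statement."""
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchone()[3]

before = plan(QUERY)  # without an index: a full scan of events
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
after = plan(QUERY)   # with the index: an index-assisted search
```

The same scan-versus-seek distinction drives tuning in Hadoop/Spark and MPP databases, where the levers are partitioning and file layout rather than B-tree indexes.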
Currently providing WFH.
Immediate joiners or candidates with a notice period of up to 30 days.
Requirements
- Use prospecting strategies, such as Email Campaigns, Account Based Prospecting, LinkedIn Prospecting, Cold calling, Prospect Nurturing, to lead initial outreach to prospects
- Conduct high volume prospecting to generate qualified leads
- Prospecting to USA & SEA markets.
- Create & update email templates, message templates, and call scripts from time to time
- Generate new leads: Identify and contact decision-makers, screen potential business opportunities.
- Research and identify new business opportunities - including new markets, growth areas, trends, customers, partnerships, products and services - or new ways of reaching existing markets
- Identify the needs and challenges of the prospective customer
- Identify key influencers and decision makers at target companies and convince them on the benefits of our solutions
- Schedule discovery meetings between prospects and sales manager
- Continuous follow-ups with prospects and clients
- Build relationships by nurturing warm prospects and finding new business opportunities
- Conduct needs assessments calls with specific prospects as assigned
- Qualify leads from marketing campaigns for sales opportunities
- Develop an overall account strategy that will lead to a well-executed, team based selling effort
- Build sales pipeline in accordance with targets
- Submit reports to Business Development Manager on weekly, monthly, and quarterly results
- 2+ years of experience in the IT industry
- Candidates should be willing to work evenings & nights to cold call the USA market, as and when required.
- Prior experience as a business development rep with IT Consultancies.
- Self-directing and agile, with the ability to pivot strategies quickly, as needed.
- Passionate, creative thinker with exceptional analytical skills.
- Team player with superb collaboration and communication skills.
- Outstanding communication and interpersonal skills.
- Keep current with business development strategies, best practices, and associated resources
Your skills and experience should cover:
- 5+ years of experience with developing, deploying, and debugging solutions on the AWS platform using AWS services such as S3, IAM, Lambda, API Gateway, RDS, Cognito, CloudTrail, CodePipeline, CloudFormation, CloudWatch, and WAF (Web Application Firewall).
- Amazon Web Services (AWS) Certified Developer: Associate is required; Amazon Web Services (AWS) DevOps Engineer: Professional is preferred.
- 5+ years of experience using one or more modern programming languages (Python, Node.js).
- Hands-on experience migrating data to the AWS cloud platform.
- Experience with Scrum/Agile methodology.
- Good understanding of core AWS services, their uses, and basic AWS architecture best practices (including security and scalability).
- Experience with AWS data storage tools.
- Experience configuring and implementing AWS monitoring tools such as CloudWatch, CloudTrail, and direct system logs.
- Experience working with Git or similar tools.
- Ability to communicate and represent AWS recommendations and standards.
The following areas are highly advantageous:
- Experience with Docker
- Experience with PostgreSQL databases
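One common pattern behind the CloudWatch monitoring point above is emitting one JSON object per log line so that CloudWatch Logs Insights can filter on individual fields. A stdlib-only sketch (the field names are illustrative, and a real Lambda would log to stdout rather than an in-memory stream):

```python
import io
import json
import logging

class JsonFormatter(logging.Formatter):
    """Format each record as a single JSON line (CloudWatch-friendly)."""
    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "message": record.getMessage(),
            "logger": record.name,
        })

# Capture output in a StringIO here purely so the example is self-contained.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(JsonFormatter())
log = logging.getLogger("app")
log.addHandler(handler)
log.setLevel(logging.INFO)

log.info("order processed")
line = stream.getvalue().strip()
```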