

Vdart Software Services Pvt Ltd
https://vdartinc.com
Jobs at Vdart Software Services Pvt Ltd
Scrum Master
Bangalore (Marathahalli)
Key Responsibilities
- Lead digital transformation initiatives for global clients within the data engineering domain, aligning technical solutions with business objectives
- Act as a bridge between business and technical teams to gather, analyze, and translate requirements into user stories and acceptance criteria
- Own and manage product backlogs, ensuring continuous grooming and prioritization aligned with client goals
- Facilitate Agile/Scrum ceremonies including daily stand-ups, sprint planning, and retrospectives
- Collaborate with cross-functional teams (Data Engineering, QA, DevOps, and stakeholders) to ensure seamless delivery
- Drive Agile best practices and continuous improvement in processes, timelines, and product quality
- Track and report team performance metrics, sprint progress, and release planning
Required Qualifications
- 8+ years of experience in IT services with a mix of business analysis and Agile delivery roles
- Strong experience working with data engineering teams / data platforms
- Proven experience in delivering client-facing digital products and managing complex stakeholder environments
- Hands-on experience with Agile tools such as JIRA, Confluence, and backlog management platforms
- Strong analytical, communication, and facilitation skills with the ability to balance business and technical priorities
Role: Oracle ERP Support Service Delivery Manager
Location: Marathahalli, Bangalore
Job Description:
Required Skills & Qualifications
- Experience: Oracle ERP support (Cloud/EBS) and IT service delivery management.
- Technical Knowledge: Understanding of Oracle ERP modules (e.g., Financials, HCM, SCM).
- Leadership: Ability to manage and develop small-to-medium teams (15-40 members).
- Communication: Strong client-facing, negotiation, and communication skills.
Key Responsibilities:
- Service Delivery Management: Manage end-to-end support operations, ensuring compliance with Service Level Agreements (SLAs) and Key Performance Indicators (KPIs).
- Lead incident, problem, and change management processes, including support for patches and new releases. Develop service dashboards, monitor performance metrics, and ensure adherence to compliance and risk management requirements.
- Identify opportunities for automation and process optimization within ERP operations.
Company Description
VDart is a global leader in digital solutions, product development, and professional services. Headquartered in Atlanta, GA, USA, the company has a robust global presence across North America, Europe, the Middle East, and Asia. VDart Digital specializes in delivering cutting-edge digital transformation solutions, leveraging technologies like AI/ML, blockchain, cloud computing, IoT, and data analytics. Its innovative product portfolio includes offerings such as TestSamurAI, LendSmartAI, IDocLens, and more, which are designed to optimize operations and drive business growth.
Role Description
We are looking for a seasoned data leader to design, build, and own enterprise-scale data platforms on Azure. This role goes beyond development — it requires end-to-end accountability for architecture, data pipelines, transformation frameworks, and production readiness.
You will act as the critical link between business stakeholders, data engineering teams, and analytics functions, ensuring scalable and high-performance data solutions are delivered and maintained.
Key Responsibilities:
- Design and implement robust data pipelines using Azure Data Factory (ADF), including integration with REST APIs and external data sources
- Build scalable data transformation workflows using Databricks (PySpark), handling complex and nested JSON datasets
- Architect and implement Delta Lake-based data platforms, including fact and dimension models (star schema)
- Define and enforce best practices for data modeling, performance optimization, and cost efficiency
- Own end-to-end data platform lifecycle — from architecture and deployment to monitoring and operational support
- Establish production readiness frameworks, including logging, alerting, and data quality checks
- Collaborate closely with business and analytics teams to translate requirements into scalable technical solutions
- Mentor engineering teams and drive architectural governance across projects
Required Experience & Skills:
• Experience building pipelines with Azure Data Factory
• Experience connecting to REST API sources using Azure Data Factory
• Experience building transformations with Databricks using PySpark
• Experience handling complex nested JSON files using PySpark
• Experience designing dimensional models/star schema
• Experience implementing facts and dimension tables in Databricks Delta Lake
• Around 15-20 years of solid experience in building, managing, and optimizing enterprise data platforms with at least 5 years in Azure cloud data services
• Act as a bridge between business, data engineering, and analytics teams to ensure requirements are clearly understood and implemented correctly
• Own end-to-end production readiness of the data platform, including architectural design, deployment patterns, monitoring strategy and operational support.
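The nested-JSON requirement above usually comes down to flattening deeply nested objects into tabular columns before they land in Delta tables. A minimal plain-Python sketch of that flattening step (the function and field names are illustrative, not from the posting; a PySpark job would achieve the same with struct column access and `explode`):

```python
import json

def flatten(record, parent_key="", sep="_"):
    """Recursively flatten a nested JSON object into a flat dict,
    joining nested keys with `sep`."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

raw = json.loads('{"order": {"id": 7, "customer": {"name": "A"}}, "total": 9.5}')
flat = flatten(raw)
# flat == {"order_id": 7, "order_customer_name": "A", "total": 9.5}
```

Each flattened dict then maps directly onto a row in a fact or dimension table.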

✔️ 8+ years of overall software development experience
✔️ 5+ years of relevant experience in Azure Integration using Logic Apps, Azure Functions, and .NET C#
✔️ Work closely with the Integration Architect to understand and review requirements for BizTalk/Foresight/Azure interface integrations.
✔️ Implement and modify integration interfaces (coding in Azure Logic Apps, C#, and Azure Functions) per the interface specification.
✔️ Perform and document unit testing for implemented interfaces.
✔️ Assess and manage risk with leads and discuss possible recommendations.
✔️ Follow best practices, standards, and policies in implementation.
✔️ Participate in code review sessions and follow the coding guidelines.
✔️ Perform and support Integration & UAT testing; fix issues and defects.

Job Description:
We are hiring freshers for our International Chat Support Team to assist customers via chat, resolve queries, and provide exceptional service.
Requirements:
- Any Graduate (Freshers welcome).
- Excellent written English skills.
- Strong problem-solving ability.
- Basic computer knowledge.
- Willingness to work in rotational shifts.
Perks:
- Competitive salary + incentives.
- Career growth opportunities.
- Friendly work environment.
Job Descriptions:
1. Job Requirement
We are seeking a highly skilled Senior Generative AI Developer to join our AI/ML team. This role focuses on leveraging cutting-edge Generative AI (GenAI) technologies to drive innovation and efficiency in software development and business processes. The ideal candidate will work on advanced techniques such as Retrieval-Augmented Generation (RAG), document chunking, and data embedding across multiple vector databases. They will collaborate closely with cross-functional teams to design, implement, and optimize AI-driven solutions, with exposure to cloud-native AI platforms like Amazon Bedrock and Microsoft Azure OpenAI considered a plus.
Responsibilities:
- Develop and deploy GenAI-based applications to solve complex business problems.
- Implement RAG frameworks to enhance information retrieval and response accuracy.
- Design and optimize document chunking strategies tailored to specific data types and use cases.
- Build and manage data embeddings using various vector databases for high-performance similarity searches.
- Collaborate with data engineers and scientists to integrate AI solutions seamlessly into existing pipelines.
- Explore and implement best practices for leveraging Amazon Bedrock and Azure OpenAI services.
- Stay updated on emerging trends and technologies in the GenAI and AI/ML landscape.
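The RAG responsibility above boils down to one core loop: embed the query, retrieve the most similar documents, and assemble them into the prompt sent to the LLM. A toy self-contained sketch (the bag-of-words "embedding" stands in for a real embedding model, and the final LLM call is stubbed out):

```python
from collections import Counter
import math

def embed(text):
    """Toy bag-of-words 'embedding' standing in for a real model."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Core RAG retrieval step: rank documents by similarity to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = [
    "Invoices are archived after 90 days.",
    "Password resets require manager approval.",
]
context = retrieve("how long are invoices kept", docs)[0]
prompt = f"Answer using this context:\n{context}\n\nQuestion: how long are invoices kept?"
# `prompt` would then go to the LLM; retrieval has grounded it in the right document.
```

In production the Counter is replaced by a model embedding and the linear scan by a vector database query, but the shape of the pipeline is the same.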
2. Required Technical Skillsets
Mandatory Skills:
- Proficiency in Generative AI frameworks and techniques such as:
  - Prompt Engineering: crafting targeted prompts to guide LLM outputs
  - Fine-tuning and few-shot learning: adapting LLMs for domain-specific tasks
  - Retrieval-Augmented Generation (RAG): combining LLMs with external data retrieval systems
  - Text-to-Text Transfer Transformer (T5): a framework that casts NLP tasks as text-to-text problems
- Expertise in designing and implementing chunking strategies for diverse datasets.
- Strong knowledge of data embedding techniques and working experience with vector databases like Pinecone, Weaviate, or Milvus.
- Solid programming skills in Python, with experience in AI/ML libraries such as TensorFlow, PyTorch, or Hugging Face Transformers.
- Familiarity with cloud platforms and services for AI/ML workloads (AWS or Azure).
- Experience with API integration for AI services and building scalable applications.
Desirable Skills:
- Hands-on experience with Amazon Bedrock and/or Microsoft Azure OpenAI services.
- Knowledge of natural language processing (NLP) techniques and large language model (LLM) fine-tuning.
- Exposure to different chunking strategies, including overlap and semantic segmentation.
- Understanding of DevOps practices for deploying and managing AI models in production.
- Strong problem-solving and analytical skills with a focus on delivering value-driven solutions.
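Of the chunking strategies named above, overlap chunking is the simplest to illustrate: each chunk repeats the tail of the previous one, so content cut at a chunk boundary still appears whole in at least one chunk. A minimal sketch (sizes are in characters here for clarity; real pipelines usually chunk by tokens):

```python
def chunk_with_overlap(text, size, overlap):
    """Split `text` into chunks of `size` characters, where each chunk
    re-includes the last `overlap` characters of the previous chunk."""
    if overlap >= size:
        raise ValueError("overlap must be smaller than chunk size")
    step = size - overlap
    chunks = []
    for start in range(0, len(text), step):
        chunk = text[start:start + size]
        if chunk:
            chunks.append(chunk)
        if start + size >= len(text):
            break
    return chunks

chunks = chunk_with_overlap("abcdefghij", size=4, overlap=2)
# chunks == ["abcd", "cdef", "efgh", "ghij"]
```

Semantic segmentation, by contrast, would place boundaries at sentence or topic breaks instead of fixed character offsets.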
Similar companies
About the company
We are a fast-growing virtual and hybrid events and engagement platform. Gevme has powered hundreds of thousands of events around the world for clients such as Facebook, Netflix, Starbucks, Forbes, MasterCard, Citibank, Google, and the Singapore Government.
We are a SaaS product company with a strong engineering and family culture; we are always looking for new ways to enhance the event experience and empower efficient event management. We're on a mission to groom the next generation of event technology thought leaders as we grow.
Join us if you want to become part of a vibrant and fast-moving product company that's on a mission to connect people around the world through events.
About Pendo
Pendo is a leading product experience and software analytics platform that helps companies understand how users interact with their software and improve those experiences. It operates in the product analytics and digital adoption space, enabling organizations to combine analytics, in-app guidance, and user feedback in one unified platform.
Pendo – Key Highlights
- Founded in 2013, headquartered in Raleigh, North Carolina
- Serves 14,000+ companies globally
- Processes 20B+ daily events and supports 1B+ users
- 850+ employees across global offices
- Raised $350M+ total funding from investors like General Atlantic, Tiger Global, and Sapphire Ventures
Chisel was acquired by Pendo in 2026, marking a key milestone in its journey. The acquisition strengthens Pendo’s push into AI-driven product experience, with Chisel’s agentic capabilities becoming a core part of Pendo’s broader platform vision.
Chisel Labs is an AI-powered product management platform built to help product teams move faster and make better decisions. It operates in the product management and AI SaaS space, bringing feedback, roadmapping, and documentation into a unified system of record.
At its core, Chisel functions as an AI PM Agent, automating workflows like PRDs, research, and feedback analysis - allowing teams to focus on strategy, prioritization, and product outcomes.
About Chisel
Chisel is a lean, globally distributed team with presence across the US and India. The team operates at the intersection of AI, product management, and enterprise SaaS, with a strong emphasis on ownership, speed, and building for real-world product teams at scale. Post-acquisition, the team is now part of Pendo’s broader organization.
🏆 Milestones
- Founded in the early 2020s as a next-gen product management platform
- Built one of the early AI-native PM agents for automating product workflows
- Grew adoption across global teams with integrations like Jira, Salesforce, and Zendesk
- Achieved strong product recognition across PM tooling ecosystems
- Acquired by Pendo (2026) to accelerate AI innovation in product experience
About the company
Your Go-To AI Consultancy For AI Research, AI Products, AI Solutions, AI MVP Design, Idea Validation
About the company
At Ampera Technologies, we empower businesses with cutting-edge data analytics, quality assurance, and data engineering solutions.
About the company
Improving is a leading IT professional services firm committed to helping companies achieve lasting success through modern technology. With core expertise in AI, Data, and Applications, we specialize in transforming legacy systems, building cloud-native platforms, and delivering intelligent, future-ready solutions for today’s complex business needs. Improving’s leaders are equally committed to fostering a great place to work that is inclusive and purpose-centered, empowering Improvers to bring their whole selves to work. Our team is known for its collaborative approach and long-term partnerships that prioritize measurable outcomes. By combining technical excellence with strategic insight, Improving enables all stakeholders to grow, adapt, and lead in an ever-evolving digital landscape.






