📅 Date: September 3, 2025
📍 Location: Co-located with SEMANTiCS 2025 in Vienna
📝 Submission Deadline: July 15, 2025
🧑 On-site participation required
📬 Submission Form: https://forms.gle/6KNBMuRsyXs8RiD89
How can Large Language Models (LLMs) benefit from structured knowledge like DBpedia? And how can we improve DBpedia to better serve the next generation of AI systems?
This session invites talks on the intersection of LLMs and Knowledge Graphs, with a special emphasis on DBpedia. Our goal is to understand how to make Linked Data more useful, accessible, and trustworthy for LLM-based applications—and how to evolve DBpedia in this new AI-dominated landscape.
Topics of Interest
- Retrieval-Augmented Generation (RAG) with DBpedia (see the sketch after this list)
- Prompt engineering for KG-aware LLMs
- Query translation: from natural language to SPARQL using LLMs
- Using LLMs to summarize or explain DBpedia data
- LLMs as interfaces for Linked Data consumption
- Automatic ontology alignment and entity linking with LLMs
- Improving LLM factual accuracy with DBpedia as a trusted source
- Challenges in grounding LLM output in structured knowledge
- Scaling and performance considerations for hybrid KG–LLM systems
- Bias, hallucination, and verification in LLMs using DBpedia
- Use cases such as chatbots, semantic search, and Q&A systems powered by DBpedia + LLMs
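To make a topic like RAG with DBpedia concrete, here is a minimal, illustrative sketch (ours, not a prescribed setup from the organizers) of retrieving a fact from the public DBpedia SPARQL endpoint and placing it in an LLM prompt as trusted context. The `fetch_abstract` helper, the prompt template, and the use of the `requests` library are assumptions made purely for illustration.

```python
# Illustrative sketch: grounding an LLM answer in DBpedia, RAG-style.
# Assumes the `requests` library; the endpoint is DBpedia's public SPARQL service.
import requests

SPARQL_ENDPOINT = "https://dbpedia.org/sparql"

def fetch_abstract(resource_uri: str, lang: str = "en") -> str:
    """Retrieve the dbo:abstract of a DBpedia resource to use as grounding context."""
    query = f"""
    PREFIX dbo: <http://dbpedia.org/ontology/>
    SELECT ?abstract WHERE {{
      <{resource_uri}> dbo:abstract ?abstract .
      FILTER (lang(?abstract) = "{lang}")
    }}
    """
    response = requests.get(
        SPARQL_ENDPOINT,
        params={"query": query},
        headers={"Accept": "application/sparql-results+json"},
        timeout=30,
    )
    response.raise_for_status()
    bindings = response.json()["results"]["bindings"]
    return bindings[0]["abstract"]["value"] if bindings else ""

if __name__ == "__main__":
    # The retrieved abstract is then placed into the LLM prompt as trusted context
    # (hypothetical prompt template; the LLM call itself is omitted).
    context = fetch_abstract("http://dbpedia.org/resource/Vienna")
    prompt = f"Answer using only the DBpedia context below.\n\nContext: {context}\n\nQuestion: ..."
    print(prompt[:500])
```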
We welcome researchers, developers, and industry practitioners working on concrete tools, early-stage ideas, or critical perspectives.
Submission Guidelines
Please submit your proposal by July 15, 2025 (AoE) via: https://forms.gle/6KNBMuRsyXs8RiD89
Your proposal should include:
- Title
- Abstract (max. 300 words)
- Short biography of the speaker(s)
We are open to a wide range of talk formats: demos, position papers, success stories, lessons learned, or short idea pitches.
📩 Questions? Reach out to us at dbpedia@infai.org
🌐 Event page: https://www.dbpedia.org/blog/dbpedia-day-2025/
Join us to shape how LLMs and DBpedia can empower each other!