Sources & Citations

Reference and source display

Display source citations and reference cards in AI chat responses using the Vercel AI SDK with web search tools. When the AI cites web sources, users see inline reference numbers that expand to full source cards, the citation UX pattern popularized by Perplexity and Bing Copilot.
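Rendering inline reference numbers means splitting the assistant's text around markers like `[1]` so each marker can become a clickable superscript. A minimal sketch of that parsing step (the `Segment` type and `parseCitations` helper are illustrative, not part of the AI SDK):

```typescript
// Split assistant text into plain-text runs and citation markers
// such as "[1]", so the UI can render markers as expandable refs.
type Segment =
  | { kind: "text"; value: string }
  | { kind: "citation"; index: number };

function parseCitations(text: string): Segment[] {
  const segments: Segment[] = [];
  const re = /\[(\d+)\]/g;
  let last = 0;
  for (const m of text.matchAll(re)) {
    if (m.index! > last) {
      segments.push({ kind: "text", value: text.slice(last, m.index) });
    }
    segments.push({ kind: "citation", index: Number(m[1]) });
    last = m.index! + m[0].length;
  }
  if (last < text.length) {
    segments.push({ kind: "text", value: text.slice(last) });
  }
  return segments;
}
```

Each `citation` segment's `index` keys into the numbered source list shown under the response.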

Use cases

  • Research and Q&A tools where source credibility matters
  • Fact-checking assistants with verifiable reference links
  • Knowledge management tools with authoritative source attribution
  • News and current events assistants with clickable source cards

Tech stack

Vercel AI SDK · Next.js · React · TypeScript · Web Search Tool · useChat
npx shadcn@latest add https://shadcnagents.com/r/ai-elements-sources-chat
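With `useChat`, an assistant message arrives as a list of typed parts; source metadata from a web search tool can be filtered out of the text stream to build the footer list. A sketch of that split, assuming `"source-url"` parts with `url` and `title` fields (verify the exact part shape against the AI SDK version you install):

```typescript
// Separate a message's text parts from its source parts.
// The part shape here is an assumption about the SDK's output.
interface MessagePart {
  type: string;
  text?: string;
  url?: string;
  title?: string;
}

function splitMessage(parts: MessagePart[]) {
  const text = parts
    .filter((p) => p.type === "text")
    .map((p) => p.text ?? "")
    .join("");
  const sources = parts
    .filter((p) => p.type === "source-url")
    .map((p) => ({ url: p.url ?? "", title: p.title ?? p.url ?? "" }));
  return { text, sources };
}
```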
Prompt: Explain the transformer architecture

Response: The Transformer architecture, introduced in 2017, revolutionized NLP by replacing recurrence with self-attention mechanisms[1]. The key innovation is the multi-head attention mechanism that allows the model to attend to different positions simultaneously[2]. This approach was later extended by models like BERT[3].
Sources

  1. Transformer Architecture (arxiv.org · 2017)
  2. Attention Is All You Need (papers.nips.cc · 2017)
  3. BERT: Pre-training (arxiv.org · 2018)
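The numbered cards above imply deduplicating sources by URL and assigning stable 1-based numbers. A framework-independent sketch of that step (the `Source` shape and `numberSources` helper are illustrative):

```typescript
// Deduplicate sources by URL and assign the 1-based numbers
// shown on the cards; first occurrence wins.
interface Source {
  url: string;
  title: string;
}

function numberSources(sources: Source[]): (Source & { n: number })[] {
  const seen = new Map<string, Source & { n: number }>();
  for (const s of sources) {
    if (!seen.has(s.url)) {
      seen.set(s.url, { ...s, n: seen.size + 1 });
    }
  }
  return [...seen.values()];
}
```

The resulting `n` values are what the inline `[n]` markers resolve to, so the same helper should feed both the markers and the footer cards.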