In the vast digital expanse of the Akashic Archives, ancient knowledge fragments are scattered across countless web domains, hidden in articles, papers, and digital archives. As a Knowledge Cartographer, you have been chosen to discover, extract, and weave these fragments into an interconnected knowledge graph that reveals the hidden connections between concepts, ideas, and wisdom.
The Great Library’s Council has tasked you with building the Omniscient Codex - a living knowledge system that can traverse the web, capture insights, organize them into a searchable knowledge base, and reveal the invisible threads that connect all human understanding.
Your mission is to create a comprehensive knowledge discovery and mapping system that combines the power of web exploration with intelligent local knowledge organization. You’ll build an autonomous research assistant that can gather information from multiple sources, identify relationships between concepts, and create a navigable knowledge graph.
In this adventure, you’ll learn to use GitHub Copilot Agent Mode with MCP (Model Context Protocol) tools - extending AI capabilities with external tools for web search and file system operations!
Before starting this adventure, you’ll need to perform the following steps:
1. Install the MCP FireCrawl server for advanced web scraping from the MCP gallery.
2. Install the MCP File System server for managing files and directories.

By completing this adventure with Agent Mode + MCP, you’ll learn how to extend GitHub Copilot with external tools for web search and local file system operations.
Configure MCP Servers using one of these methods:
Method A: Direct Installation from Curated List
- Install Firecrawl directly from the curated MCP server list (MCP gallery).
- See the MCP Tools Required and Sample MCP Configuration sections below to install the File System MCP Server.
Method B: Workspace Configuration
- Create a .vscode/mcp.json file in your workspace root (see the MCP Tools Required and Sample MCP Configuration sections below).

Method C: User Configuration
- Open the Command Palette (Ctrl+Shift+P / Cmd+Shift+P) and add both MCP servers to your user configuration.
- Verify that the Firecrawl and File System tools are selected in Agent Mode's tools list.

This adventure integrates two essential MCP servers:
- 🔍 FireCrawl MCP Server: firecrawl-mcp (actively maintained by Mendable AI)
- 📁 File System MCP Server: @modelcontextprotocol/server-filesystem
Get a Firecrawl API key from https://www.firecrawl.dev/app/api-keys, then add a sample MCP configuration like the following to your .vscode/mcp.json file:
{
"servers": {
"firecrawl": {
"command": "npx",
"args": [
"-y",
"firecrawl-mcp"
],
"env": {
"FIRECRAWL_API_KEY": "${input:fireCrawlApiKey}"
}
},
"filesystem": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-filesystem",
"${workspaceFolder}"
]
}
},
"inputs": [
{
"type": "promptString",
"id": "fireCrawlApiKey",
"description": "Firecrawl API Key",
"password": true
}
]
}
Security Note: Use input placeholders (${input:variableName}) for sensitive data like API keys rather than hardcoding them. VS Code will prompt for these values when needed.
Now let’s define the requirements for the Knowledge Cartographer system:
In the Chat panel with “Agent” mode selected, provide a comprehensive prompt such as the following. While this prompt uses JavaScript and Node.js, feel free to adapt it to your preferred language or framework. This prompt assumes you’ve already configured the MCP servers as described above.
Create a complete Knowledge Cartographer system using GitHub Copilot Agent Mode with MCP tools. This should be a two-phase process:
PHASE 1: Use MCP Tools to Gather and Organize Knowledge
First, use Agent Mode with the configured Firecrawl and server-filesystem MCP servers to collect and organize knowledge data:
1. Use FireCrawl MCP Server tools to:
- Scrape web content about topics like "quantum computing" and "artificial intelligence".
- Extract key entities, concepts, and relationships from the scraped content
- Analyze content quality and source credibility
2. Use File System MCP Server tools to:
- Create a structured knowledge base directory ./akashic-archives-demo. Place the directory in the root of your workspace.
- Organize data into topics/ and indexes/ subdirectories
- Save entities, relationships, and sources as JSON files (entities.json, relationships.json, sources.json)
- Create metadata and index files for each knowledge domain
PHASE 2: Create Application to Explore the Knowledge Base
Create a Node.js/JavaScript application that reads and analyzes the organized data. Only use the fs, path, and readline modules from Node.js. Do not use any external libraries or frameworks.
3. Build a Knowledge Base Reader:
- Read the structured JSON files created by MCP tools
- Load entities, relationships, and source information from ./akashic-archives-demo
- Support multiple knowledge domains/topics
4. Implement Knowledge Graph Analysis:
- Analyze relationships between entities
- Identify connection patterns and strengths
- Explore concept clusters and associations
5. Create Interactive Exploration Interface:
- Command-line interface for browsing knowledge domains
- Commands to load topics, find entity connections, explore relationships
- Show original source materials and metadata
- Beautiful mystical-themed console output
6. Add comprehensive error handling and documentation
The key architecture: MCP Tools → Structured Files → Your Application
- MCP tools handle external operations (web scraping, file organization)
- Your application focuses on reading, analyzing, and exploring the data
Please implement this complete system with sample knowledge domains and demonstrate the exploration workflow.
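For reference, here is one plausible shape for the knowledge base that PHASE 1 asks the MCP tools to produce. The directory layout and all field names beyond the file names listed in the prompt are illustrative assumptions; Agent Mode may organize the data differently.

akashic-archives-demo/
├── topics/
│   └── quantum-computing/
│       ├── entities.json
│       ├── relationships.json
│       ├── sources.json
│       └── metadata.json
└── indexes/
    └── quantum-computing-index.json

A matching entities.json entry and relationships.json entry might look like:

[
  {
    "name": "quantum bit",
    "frequency": 3,
    "concepts": ["quantum mechanics", "computation"]
  }
]

[
  {
    "source": "quantum bit",
    "target": "superposition",
    "type": "depends-on",
    "strength": 0.8
  }
]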
Agent Mode will autonomously invoke the MCP tools to scrape and organize the knowledge data, then generate the exploration application code.
You’ll see each step in the UI, including MCP tool invocations and results.
As Agent Mode works, you can review each proposed action, approve or reject MCP tool invocations, and refine the prompt if the results drift from your goal.
Once your Knowledge Cartographer system works, try asking Agent Mode to:
Enhance the Knowledge Cartographer with these advanced MCP integrations:
1. Implement knowledge graph merge capabilities for multiple sources
2. Create automated knowledge update workflows
3. Add support for multimedia content discovery and organization
4. Build knowledge sharing and export pipelines
5. Implement advanced graph analytics and insights
6. Create knowledge recommendation systems
7. Add integration with external knowledge bases and APIs
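As one illustration of the first enhancement (knowledge graph merging), here is a minimal Node.js sketch that combines two entity lists by name. It assumes the illustrative entities.json shape sketched after the main prompt ({ name, frequency, concepts }); it is not part of the prompt itself.

// merge-entities.js - illustrative sketch: merge entity lists from two sources,
// summing frequencies and unioning concept tags.
'use strict';

function mergeEntities(listA, listB) {
  const byName = new Map();
  for (const entity of [...listA, ...listB]) {
    const existing = byName.get(entity.name);
    if (!existing) {
      // First occurrence of this entity: copy it so the inputs stay untouched.
      byName.set(entity.name, {
        name: entity.name,
        frequency: entity.frequency || 0,
        concepts: [...(entity.concepts || [])],
      });
    } else {
      existing.frequency += entity.frequency || 0;
      for (const concept of entity.concepts || []) {
        if (!existing.concepts.includes(concept)) existing.concepts.push(concept);
      }
    }
  }
  return [...byName.values()];
}

module.exports = { mergeEntities };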
When your Agent Mode implementation is complete, running the application should produce output similar to the following. AI is non-deterministic, so your results may vary slightly, but the structure should be similar.
🗺️ Welcome to the Knowledge Cartographer! 🗺️
You are exploring the Akashic Archives - a mystical knowledge base
that has been discovered and organized by the MCP spirits of the web.
The FireCrawl spirits have gathered knowledge from across the digital realm,
while the File System spirits have organized it into sacred scrolls.
Your task: Navigate this treasure trove of interconnected wisdom.
🔮 Initializing archive exploration systems...
✅ Akashic Archives detected at: ./akashic-archives-demo
📚 Available knowledge domains: 2 topics discovered
🌟 Archives ready for exploration!
📖 Loading knowledge domain: quantum-computing
✅ Loaded 5 entities
✅ Loaded 4 relationships
✅ Loaded 3 sources
🔮 Knowledge domain ready for exploration!
🌟 Knowledge Domain: QUANTUM COMPUTING
📊 Overview:
• Entities: 5
• Relationships: 4
• Sources: 3
• Created: 7/15/2025
🔍 Top Entities by Frequency:
├── quantum bit (appears 3 times)
│ └── concepts: quantum mechanics, computation
├── superposition (appears 2 times)
│ └── concepts: quantum mechanics
├── entanglement (appears 2 times)
│ └── concepts: quantum mechanics
├── Shor's algorithm (appears 2 times)
│ └── concepts: cryptography, algorithms
└── quantum gate (appears 1 times)
└── concepts: computation, quantum mechanics
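Output like the excerpt above could come from a reader along the lines of the following minimal sketch. It uses only the fs and path modules, as the prompt requires, and assumes the illustrative topics/<domain>/entities.json layout sketched earlier; the application Agent Mode generates for you will differ in detail.

// explore.js - minimal sketch of the PHASE 2 knowledge base reader.
// Paths and JSON field names are illustrative assumptions.
'use strict';
const fs = require('fs');
const path = require('path');

const ARCHIVE_ROOT = path.join(__dirname, 'akashic-archives-demo');

// Read one JSON file from a domain directory, returning [] if it is missing.
function loadJson(domain, fileName) {
  const filePath = path.join(ARCHIVE_ROOT, 'topics', domain, fileName);
  if (!fs.existsSync(filePath)) return [];
  return JSON.parse(fs.readFileSync(filePath, 'utf8'));
}

function exploreDomain(domain) {
  const entities = loadJson(domain, 'entities.json');
  const relationships = loadJson(domain, 'relationships.json');
  const sources = loadJson(domain, 'sources.json');

  console.log(`🌟 Knowledge Domain: ${domain.replace(/-/g, ' ').toUpperCase()}`);
  console.log(`  • Entities: ${entities.length}`);
  console.log(`  • Relationships: ${relationships.length}`);
  console.log(`  • Sources: ${sources.length}`);
  console.log('🔍 Top Entities by Frequency:');

  const top = [...entities].sort((a, b) => (b.frequency || 0) - (a.frequency || 0));
  for (const entity of top.slice(0, 5)) {
    console.log(`  ${entity.name} (appears ${entity.frequency} times)`);
    console.log(`    concepts: ${(entity.concepts || []).join(', ')}`);
  }
}

exploreDomain(process.argv[2] || 'quantum-computing');

Running node explore.js quantum-computing against a populated archive prints a summary in the spirit of the sample above.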
Security and Privacy: keep your Firecrawl API key out of source control by using the input placeholder shown in the sample configuration, and review any scraped content before committing it to your repository.
If MCP tools aren’t working, double-check your .vscode/mcp.json configuration and Firecrawl API key, confirm that both servers start successfully, and restart VS Code before retrying the prompt.
After using Agent Mode with MCP through this adventure, you’ll have hands-on experience extending GitHub Copilot with external tools for web scraping, file organization, and knowledge graph exploration.
🗺️ Happy knowledge cartography! May your explorations reveal the hidden connections that bind all knowledge together! 🗺️