# PostgreSQL Analyzer Skill for Nanobot
## Correction: Nanobot DOES Support Skills!
From the [Nanobot GitHub repo](https://github.com/HKUDS/nanobot):
```
nanobot/
├── agent/
│   ├── skills.py        # 🎯 Skills loader
│   └── ...
├── skills/              # 🎯 Bundled skills (github, weather, tmux...)
│   └── ...
```
Nanobot has its own skill system that's **different from Kimi CLI's SKILL.md format**:
- **Kimi CLI**: Markdown-based (`SKILL.md`)
- **Nanobot**: Python-based skills + ClawHub integration
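
For comparison, a Kimi-CLI-style skill is a single markdown file with YAML frontmatter. The fields and wording below are an illustrative sketch of that format, not copied from the Kimi CLI docs:

```markdown
---
name: postgres-analyzer
description: Analyze PostgreSQL databases and answer questions about their contents
---

# PostgreSQL Analyzer

When the user asks about database contents, use the `postgres` MCP tools
to inspect the schema and run read-only SELECT queries.
```

The key difference: here the *LLM* reads the description and instructions and decides when to act, whereas a Nanobot skill routes through explicit Python code.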
---
## Nanobot Skill System Overview
### 1. Built-in Skills
Nanobot comes with bundled skills in the `skills/` directory:
- `github` - GitHub operations
- `weather` - Weather queries
- `tmux` - Terminal multiplexer
- And more...
### 2. ClawHub Skills
Nanobot can search and install skills from **ClawHub** (similar to OpenClaw's skill ecosystem):
```bash
# Search for skills
nanobot skill search postgres
# Install a skill
nanobot skill install postgres-analyzer
```
### 3. Custom Python Skills
You can create custom skills using Python (more powerful than markdown).
---
## Option 1: MCP Server + System Prompt (Simplest)
This is the approach shown earlier: point Nanobot at the MCP server and steer the agent with a system prompt.
```json
// ~/.nanobot/config.json
{
  "agents": {
    "defaults": {
      "model": "anthropic/claude-sonnet-4",
      "systemPrompt": "You are a PostgreSQL analyst. Use MCP tools..."
    }
  },
  "tools": {
    "mcpServers": {
      "postgres": {
        "command": "python3",
        "args": ["/path/to/pg_mcp_server/server.py"],
        "env": {
          "PG_CONNECTION_STRING": "postgresql://..."
        }
      }
    }
  }
}
```
---
## Option 2: Create a Proper Nanobot Skill (Recommended)
Create a Python skill file that Nanobot can load:
### File: `~/.nanobot/skills/postgres_analyzer/skill.py`
```python
"""
PostgreSQL Analyzer Skill for Nanobot
Provides intelligent database analysis capabilities
"""
from nanobot.skills import Skill, intent, tool
from nanobot.agent.context import Context
class PostgresAnalyzerSkill(Skill):
"""
Analyze PostgreSQL databases and generate insights.
"""
name = "postgres-analyzer"
description = "PostgreSQL database analysis and querying"
@intent("explore database")
@intent("show tables")
@intent("what's in my database")
async def explore_database(self, ctx: Context):
"""
When user wants to explore database structure.
Triggered by: 'explore database', 'show tables', etc.
"""
# Call MCP tool via ctx.tools.mcp.postgres
schema = await ctx.tools.mcp.postgres.get_schema()
return {
"type": "text",
"content": f"📊 Database Schema:\n\n{schema}"
}
@intent("analyze table")
@intent("tell me about table")
async def analyze_table(self, ctx: Context, table_name: str = None):
"""
When user wants to analyze a specific table.
Triggered by: 'analyze the orders table'
"""
if not table_name:
# Try to extract from context or ask
return "Which table would you like me to analyze?"
stats = await ctx.tools.mcp.postgres.get_table_stats(
table_name=table_name,
sample_size=5
)
return {
"type": "text",
"content": f"📈 Analysis of '{table_name}':\n\n{stats}"
}
@intent("expensive book")
@intent("cheapest book")
@intent("best seller")
async def book_analysis(self, ctx: Context):
"""
When user asks about book prices or sales.
"""
result = await ctx.tools.mcp.postgres.execute_query(
query="""
SELECT
p.name,
p.price,
SUM(oi.quantity) as sold,
SUM(oi.quantity * oi.unit_price) as revenue
FROM products p
LEFT JOIN order_items oi ON p.id = oi.product_id
WHERE p.category = 'Books'
GROUP BY p.id, p.name, p.price
ORDER BY p.price DESC
""",
limit=10
)
return {
"type": "text",
"content": f"📚 Book Analysis:\n\n{result}"
}
@intent("run query")
@intent("execute sql")
async def custom_query(self, ctx: Context, query: str = None):
"""
When user wants to run a custom SQL query.
"""
if not query:
return "What SQL query would you like to run?"
# Safety check - ensure it's read-only
query_lower = query.lower()
forbidden = ['insert', 'update', 'delete', 'drop', 'create', 'alter']
if any(word in query_lower for word in forbidden):
return "⚠️ For safety, only SELECT queries are allowed."
result = await ctx.tools.mcp.postgres.execute_query(
query=query,
limit=100
)
return {
"type": "text",
"content": f"📝 Query Results:\n\n{result}"
}
@intent("price analysis")
@intent("pricing strategy")
async def pricing_analysis(self, ctx: Context):
"""
When user wants pricing insights.
"""
analysis = await ctx.tools.mcp.postgres.execute_query(
query="""
SELECT
p.category,
COUNT(*) as products,
MIN(p.price) as min_price,
MAX(p.price) as max_price,
AVG(p.price)::numeric(10,2) as avg_price,
SUM(oi.quantity) as units_sold
FROM products p
LEFT JOIN order_items oi ON p.id = oi.product_id
GROUP BY p.category
ORDER BY units_sold DESC
""",
limit=20
)
return {
"type": "text",
"content": f"💰 Pricing Analysis by Category:\n\n{analysis}"
}
# Export the skill
skill = PostgresAnalyzerSkill()
```
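Note that the substring-based safety check in `custom_query` will also reject harmless queries whose *identifiers* happen to contain a forbidden word (for example, a `created_at` column contains `create`). A slightly more robust standalone sketch using whole-word matching follows; `is_read_only` is an illustrative helper, not part of any Nanobot API, and a production guard should still rely on a read-only database role rather than string inspection:

```python
import re

# Statement keywords that indicate a write. Matched as whole words only,
# so identifiers like "created_at" no longer trigger a false positive.
FORBIDDEN = ("insert", "update", "delete", "drop", "create", "alter", "truncate", "grant")

def is_read_only(query: str) -> bool:
    """Return True if the query looks like a read-only SELECT."""
    stripped = query.strip().lower()
    # Must actually start with SELECT (or WITH, for CTEs)
    if not re.match(r"^(select|with)\b", stripped):
        return False
    # Reject whole-word occurrences of write keywords anywhere in the text
    return not any(re.search(rf"\b{kw}\b", stripped) for kw in FORBIDDEN)

print(is_read_only("SELECT name, created_at FROM users"))  # True
print(is_read_only("DROP TABLE users"))                    # False
print(is_read_only("SELECT 1; DELETE FROM users"))         # False
```

Because `_` counts as a word character, `\bcreate\b` does not match inside `created_at`, while a stacked statement like `SELECT 1; DELETE ...` is still caught.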
### File: `~/.nanobot/skills/postgres_analyzer/skill.json`
```json
{
  "name": "postgres-analyzer",
  "version": "1.0.0",
  "description": "PostgreSQL database analysis and querying",
  "author": "Your Name",
  "entry": "skill.py",
  "intents": [
    "explore database",
    "analyze table",
    "expensive book",
    "run query",
    "price analysis"
  ],
  "mcpServers": ["postgres"]
}
```
### Register in Config
```json
// ~/.nanobot/config.json
{
  "agents": {
    "defaults": {
      "skills": ["postgres_analyzer"]
    }
  },
  "tools": {
    "mcpServers": {
      "postgres": {
        "command": "python3",
        "args": ["/path/to/pg_mcp_server/server.py"],
        "env": {
          "PG_CONNECTION_STRING": "postgresql://..."
        }
      }
    }
  }
}
```
---
## Option 3: ClawHub-Compatible Skill (For Sharing)
To make your skill installable via `nanobot skill install`, create this structure:
```
postgres-analyzer-skill/
├── skill.md # Skill documentation
├── skill.py # Main skill code
├── config.json # Default config
└── README.md
```
### skill.md
```markdown
---
name: postgres-analyzer
version: 1.0.0
description: Analyze PostgreSQL databases and extract insights
author: your-github-username
tags: [database, postgres, sql, analytics]
---
# PostgreSQL Analyzer
This skill helps you analyze PostgreSQL databases and generate insights.
## Requirements
- PostgreSQL database
- PG_CONNECTION_STRING environment variable
## Features
- Explore database schema
- Run SQL queries
- Analyze table statistics
- Generate pricing insights
## Usage
Simply ask:
- "Show me all tables"
- "What's the most expensive book?"
- "Analyze the orders table"
- "Run: SELECT * FROM users LIMIT 10"
```
---
## How Nanobot Decides to Use Skills
```
User: "What's the most expensive book?"
              │
              ▼
┌─────────────────────────────┐
│ 1. Intent Recognition       │
│    ─────────────────        │
│  Matches against skill      │
│  @intent decorators:        │
│   • "expensive book" ✅     │
│   • "analyze table"         │
│   • "run query"             │
└─────────────┬───────────────┘
              │
              ▼
┌─────────────────────────────┐
│ 2. Skill Method Called      │
│    ─────────────────────    │
│  PostgresAnalyzerSkill      │
│    .book_analysis()         │
└─────────────┬───────────────┘
              │
              ▼
┌─────────────────────────────┐
│ 3. MCP Tool Execution       │
│    ─────────────────────    │
│  ctx.tools.mcp.postgres     │
│    .execute_query(...)      │
└─────────────┬───────────────┘
              │
              ▼
┌─────────────────────────────┐
│ 4. Result Synthesis         │
│    ─────────────────        │
│  Return formatted response  │
└─────────────────────────────┘
```
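The routing step above can be sketched in plain Python. The decorator and registry below are an illustrative approximation of how an `@intent`-style dispatcher might work, not Nanobot's actual internals (a real matcher would use fuzzy or embedding-based matching rather than substring containment):

```python
# Minimal sketch of @intent-style routing: each handler registers trigger
# phrases, and a dispatcher picks the first handler whose phrase appears
# in the user's message.
REGISTRY: dict[str, str] = {}  # trigger phrase -> handler name

def intent(phrase):
    def decorator(fn):
        REGISTRY[phrase] = fn.__name__
        return fn
    return decorator

@intent("expensive book")
@intent("cheapest book")
def book_analysis():
    return "book_analysis"

@intent("analyze table")
def analyze_table():
    return "analyze_table"

def route(message: str):
    """Return the name of the first handler whose trigger appears in message."""
    msg = message.lower()
    for phrase, handler in REGISTRY.items():
        if phrase in msg:
            return handler
    return None  # no skill matched; fall through to the plain LLM

print(route("What's the most expensive book?"))  # book_analysis
print(route("Please analyze table orders"))      # analyze_table
```

Stacking multiple `@intent` decorators simply registers the same handler under several trigger phrases, which is why `book_analysis` above answers both price questions.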
---
## Comparison: Kimi CLI vs Nanobot Skills
| Aspect | Kimi CLI | Nanobot |
|--------|----------|---------|
| **Format** | `SKILL.md` (markdown) | Python code |
| **Trigger** | Frontmatter description matching | `@intent()` decorators |
| **Logic** | LLM decides based on instructions | Python code + LLM hybrid |
| **MCP Usage** | Via tool descriptions | Via `ctx.tools.mcp` |
| **Flexibility** | Text-based guidance | Code-based, programmatic |
| **Installation** | Copy to `~/.config/agents/skills/` | `nanobot skill install` or copy to `~/.nanobot/skills/` |
---
## Recommended Approach
For your PostgreSQL analyzer:
1. **Keep the MCP server** (`pg_mcp_server/server.py`): it is **portable** across Kimi CLI, Nanobot, Claude Desktop, etc.
2. **Choose skill approach based on your needs**:
- **Quick setup**: Use system prompt (Option 1)
- **More control**: Create Python skill (Option 2)
- **Share with community**: ClawHub format (Option 3)
3. **The MCP server is the reusable part** - skills are the interface layer that differs between platforms.