Merge PR #145: fix Zhipu AI API key env var
.gitignore (vendored): 6 lines changed
@@ -12,4 +12,8 @@ docs/
*.pyw
*.pyz
*.pywz
*.pyzz
*.pyzz
.venv/
__pycache__/
poetry.lock
.pytest_cache/

README.md: 126 lines changed
@@ -16,19 +16,25 @@
|
||||
|
||||
⚡️ Delivers core agent functionality in just **~4,000** lines of code — **99% smaller** than Clawdbot's 430k+ lines.
|
||||
|
||||
📏 Real-time line count: **3,428 lines** (run `bash core_agent_lines.sh` to verify anytime)
|
||||
|
||||
## 📢 News
|
||||
|
||||
- **2026-02-01** 🎉 nanobot launched! Welcome to try 🐈 nanobot!
|
||||
- **2026-02-06** ✨ Added Moonshot/Kimi provider, Discord channel, and enhanced security hardening!
|
||||
- **2026-02-05** ✨ Added Feishu channel, DeepSeek provider, and enhanced scheduled tasks support!
|
||||
- **2026-02-04** 🚀 Released v0.1.3.post4 with multi-provider & Docker support! Check [release notes](https://github.com/HKUDS/nanobot/releases/tag/v0.1.3.post4) for details.
|
||||
- **2026-02-03** ⚡ Integrated vLLM for local LLM support and improved natural language task scheduling!
|
||||
- **2026-02-02** 🎉 nanobot officially launched! Welcome to try 🐈 nanobot!
|
||||
|
||||
## Key Features of nanobot:
|
||||
|
||||
🪶 **Ultra-Lightweight**: Just ~4,000 lines of code — 99% smaller than Clawdbot - core functionality.
|
||||
🪶 **Ultra-Lightweight**: Just ~4,000 lines of core agent code — 99% smaller than Clawdbot.
|
||||
|
||||
🔬 **Research-Ready**: Clean, readable code that's easy to understand, modify, and extend for research.
|
||||
|
||||
⚡️ **Lightning Fast**: Minimal footprint means faster startup, lower resource usage, and quicker iterations.
|
||||
|
||||
💎 **Easy-to-Use**: One-click to depoly and you're ready to go.
|
||||
💎 **Easy-to-Use**: One-click to deploy and you're ready to go.
|
||||
|
||||
## 🏗️ Architecture
|
||||
|
||||
@@ -85,8 +91,7 @@ pip install nanobot-ai
|
||||
|
||||
> [!TIP]
|
||||
> Set your API key in `~/.nanobot/config.json`.
|
||||
> Get API keys: [OpenRouter](https://openrouter.ai/keys) (LLM) · [Brave Search](https://brave.com/search/api/) (optional, for web search)
|
||||
> You can also change the model to `minimax/minimax-m2` for lower cost.
|
||||
> Get API keys: [OpenRouter](https://openrouter.ai/keys) (Global) · [DashScope](https://dashscope.console.aliyun.com) (Qwen) · [Brave Search](https://brave.com/search/api/) (optional, for web search)
|
||||
|
||||
**1. Initialize**
|
||||
|
||||
@@ -96,6 +101,7 @@ nanobot onboard
|
||||
|
||||
**2. Configure** (`~/.nanobot/config.json`)
|
||||
|
||||
For OpenRouter - recommended for global users:
|
||||
```json
|
||||
{
|
||||
"providers": {
|
||||
@@ -107,18 +113,10 @@ nanobot onboard
|
||||
"defaults": {
|
||||
"model": "anthropic/claude-opus-4-5"
|
||||
}
|
||||
},
|
||||
"tools": {
|
||||
"web": {
|
||||
"search": {
|
||||
"apiKey": "BSA-xxx"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
|
||||
**3. Chat**
|
||||
|
||||
```bash
|
||||
@@ -166,11 +164,12 @@ nanobot agent -m "Hello from my local LLM!"
|
||||
|
||||
## 💬 Chat Apps
|
||||
|
||||
Talk to your nanobot through Telegram, WhatsApp, or Feishu — anytime, anywhere.
|
||||
Talk to your nanobot through Telegram, Discord, WhatsApp, or Feishu — anytime, anywhere.
|
||||
|
||||
| Channel | Setup |
|
||||
|---------|-------|
|
||||
| **Telegram** | Easy (just a token) |
|
||||
| **Discord** | Easy (bot token + intents) |
|
||||
| **WhatsApp** | Medium (scan QR) |
|
||||
| **Feishu** | Medium (app credentials) |
|
||||
|
||||
@@ -206,6 +205,50 @@ nanobot gateway
|
||||
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><b>Discord</b></summary>
|
||||
|
||||
**1. Create a bot**
|
||||
- Go to https://discord.com/developers/applications
|
||||
- Create an application → Bot → Add Bot
|
||||
- Copy the bot token
|
||||
|
||||
**2. Enable intents**
|
||||
- In the Bot settings, enable **MESSAGE CONTENT INTENT**
|
||||
- (Optional) Enable **SERVER MEMBERS INTENT** if you plan to use allow lists based on member data
|
||||
|
||||
**3. Get your User ID**
|
||||
- Discord Settings → Advanced → enable **Developer Mode**
|
||||
- Right-click your avatar → **Copy User ID**
|
||||
|
||||
**4. Configure**
|
||||
|
||||
```json
|
||||
{
|
||||
"channels": {
|
||||
"discord": {
|
||||
"enabled": true,
|
||||
"token": "YOUR_BOT_TOKEN",
|
||||
"allowFrom": ["YOUR_USER_ID"]
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**5. Invite the bot**
|
||||
- OAuth2 → URL Generator
|
||||
- Scopes: `bot`
|
||||
- Bot Permissions: `Send Messages`, `Read Message History`
|
||||
- Open the generated invite URL and add the bot to your server
|
||||
|
||||
**6. Run**
|
||||
|
||||
```bash
|
||||
nanobot gateway
|
||||
```
|
||||
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><b>WhatsApp</b></summary>
|
||||
|
||||
@@ -306,57 +349,22 @@ Config file: `~/.nanobot/config.json`
|
||||
| `openrouter` | LLM (recommended, access to all models) | [openrouter.ai](https://openrouter.ai) |
|
||||
| `anthropic` | LLM (Claude direct) | [console.anthropic.com](https://console.anthropic.com) |
|
||||
| `openai` | LLM (GPT direct) | [platform.openai.com](https://platform.openai.com) |
|
||||
| `deepseek` | LLM (DeepSeek direct) | [platform.deepseek.com](https://platform.deepseek.com) |
|
||||
| `groq` | LLM + **Voice transcription** (Whisper) | [console.groq.com](https://console.groq.com) |
|
||||
| `gemini` | LLM (Gemini direct) | [aistudio.google.com](https://aistudio.google.com) |
|
||||
| `dashscope` | LLM (Qwen) | [dashscope.console.aliyun.com](https://dashscope.console.aliyun.com) |
|
||||
|
||||
|
||||
<details>
|
||||
<summary><b>Full config example</b></summary>
|
||||
### Security
|
||||
|
||||
```json
|
||||
{
|
||||
"agents": {
|
||||
"defaults": {
|
||||
"model": "anthropic/claude-opus-4-5"
|
||||
}
|
||||
},
|
||||
"providers": {
|
||||
"openrouter": {
|
||||
"apiKey": "sk-or-v1-xxx"
|
||||
},
|
||||
"groq": {
|
||||
"apiKey": "gsk_xxx"
|
||||
}
|
||||
},
|
||||
"channels": {
|
||||
"telegram": {
|
||||
"enabled": true,
|
||||
"token": "123456:ABC...",
|
||||
"allowFrom": ["123456789"]
|
||||
},
|
||||
"whatsapp": {
|
||||
"enabled": false
|
||||
},
|
||||
"feishu": {
|
||||
"enabled": false,
|
||||
"appId": "cli_xxx",
|
||||
"appSecret": "xxx",
|
||||
"encryptKey": "",
|
||||
"verificationToken": "",
|
||||
"allowFrom": []
|
||||
}
|
||||
},
|
||||
"tools": {
|
||||
"web": {
|
||||
"search": {
|
||||
"apiKey": "BSA..."
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
> [!TIP]
|
||||
> For production deployments, set `"restrictToWorkspace": true` in your config to sandbox the agent.
|
||||
|
||||
| Option | Default | Description |
|
||||
|--------|---------|-------------|
|
||||
| `tools.restrictToWorkspace` | `false` | When `true`, restricts **all** agent tools (shell, file read/write/edit, list) to the workspace directory. Prevents path traversal and out-of-scope access. |
|
||||
| `channels.*.allowFrom` | `[]` (allow all) | Whitelist of user IDs. Empty = allow everyone; non-empty = only listed users can interact. |
|
||||
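
A minimal sketch of where these two options live in `~/.nanobot/config.json` (values are placeholders):

```json
{
  "tools": {
    "restrictToWorkspace": true
  },
  "channels": {
    "telegram": {
      "enabled": true,
      "token": "123456:ABC...",
      "allowFrom": ["123456789"]
    }
  }
}
```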
|
||||
</details>
|
||||
|
||||
## CLI Reference
|
||||
|
||||
|
||||
SECURITY.md (new file): 264 lines
@@ -0,0 +1,264 @@
|
||||
# Security Policy
|
||||
|
||||
## Reporting a Vulnerability
|
||||
|
||||
If you discover a security vulnerability in nanobot, please report it by:
|
||||
|
||||
1. **DO NOT** open a public GitHub issue
|
||||
2. Create a private security advisory on GitHub or contact the repository maintainers
|
||||
3. Include:
|
||||
- Description of the vulnerability
|
||||
- Steps to reproduce
|
||||
- Potential impact
|
||||
- Suggested fix (if any)
|
||||
|
||||
We aim to respond to security reports within 48 hours.
|
||||
|
||||
## Security Best Practices
|
||||
|
||||
### 1. API Key Management
|
||||
|
||||
**CRITICAL**: Never commit API keys to version control.
|
||||
|
||||
```bash
|
||||
# ✅ Good: Store in config file with restricted permissions
|
||||
chmod 600 ~/.nanobot/config.json
|
||||
|
||||
# ❌ Bad: Hardcoding keys in code or committing them
|
||||
```
|
||||
|
||||
**Recommendations:**
|
||||
- Store API keys in `~/.nanobot/config.json` with file permissions set to `0600`
|
||||
- Consider using environment variables for sensitive keys (see the sketch after this list)
|
||||
- Use OS keyring/credential manager for production deployments
|
||||
- Rotate API keys regularly
|
||||
- Use separate API keys for development and production
|
||||
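
A hedged sketch of supplying keys via the environment instead of plain text in the config file. Whether a particular provider honors a pre-set variable depends on the provider wiring (most provider variables are populated with `os.environ.setdefault`, so exported values win for those); treat this as a pattern to adapt, not a documented nanobot feature:

```bash
# Hedged sketch: supply provider keys via the environment for this session only.
# Variable names match the ones the LiteLLM provider layer populates.
export ZHIPUAI_API_KEY="your-zhipu-key"     # Zhipu/GLM models
export DEEPSEEK_API_KEY="your-deepseek-key" # DeepSeek models
chmod 600 ~/.nanobot/config.json            # still restrict the config file
nanobot gateway
```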
|
||||
### 2. Channel Access Control
|
||||
|
||||
**IMPORTANT**: Always configure `allowFrom` lists for production use.
|
||||
|
||||
```json
|
||||
{
|
||||
"channels": {
|
||||
"telegram": {
|
||||
"enabled": true,
|
||||
"token": "YOUR_BOT_TOKEN",
|
||||
"allowFrom": ["123456789", "987654321"]
|
||||
},
|
||||
"whatsapp": {
|
||||
"enabled": true,
|
||||
"allowFrom": ["+1234567890"]
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**Security Notes:**
|
||||
- Empty `allowFrom` list will **ALLOW ALL** users (open by default for personal use)
|
||||
- Get your Telegram user ID from `@userinfobot`
|
||||
- Use full phone numbers with country code for WhatsApp
|
||||
- Review access logs regularly for unauthorized access attempts
|
||||
|
||||
### 3. Shell Command Execution
|
||||
|
||||
The `exec` tool can execute shell commands. While dangerous command patterns are blocked, you should:
|
||||
|
||||
- ✅ Review all tool usage in agent logs
|
||||
- ✅ Understand what commands the agent is running
|
||||
- ✅ Use a dedicated user account with limited privileges
|
||||
- ✅ Never run nanobot as root
|
||||
- ❌ Don't disable security checks
|
||||
- ❌ Don't run on systems with sensitive data without careful review
|
||||
|
||||
**Blocked patterns:**
|
||||
- `rm -rf /` - Root filesystem deletion
|
||||
- Fork bombs
|
||||
- Filesystem formatting (`mkfs.*`)
|
||||
- Raw disk writes
|
||||
- Other destructive operations
|
||||
|
||||
### 4. File System Access
|
||||
|
||||
File operations have path traversal protection, but:
|
||||
|
||||
- ✅ Run nanobot with a dedicated user account
|
||||
- ✅ Use filesystem permissions to protect sensitive directories
|
||||
- ✅ Regularly audit file operations in logs
|
||||
- ❌ Don't give unrestricted access to sensitive files
|
||||
|
||||
### 5. Network Security
|
||||
|
||||
**API Calls:**
|
||||
- All external API calls use HTTPS by default
|
||||
- Timeouts are configured to prevent hanging requests
|
||||
- Consider using a firewall to restrict outbound connections if needed
|
||||
|
||||
**WhatsApp Bridge:**
|
||||
- The bridge runs on `localhost:3001` by default
|
||||
- If exposing to network, use proper authentication and TLS
|
||||
- Keep authentication data in `~/.nanobot/whatsapp-auth` secure (mode 0700)
|
||||
|
||||
### 6. Dependency Security
|
||||
|
||||
**Critical**: Keep dependencies updated!
|
||||
|
||||
```bash
|
||||
# Check for vulnerable dependencies
|
||||
pip install pip-audit
|
||||
pip-audit
|
||||
|
||||
# Update to latest secure versions
|
||||
pip install --upgrade nanobot-ai
|
||||
```
|
||||
|
||||
For Node.js dependencies (WhatsApp bridge):
|
||||
```bash
|
||||
cd bridge
|
||||
npm audit
|
||||
npm audit fix
|
||||
```
|
||||
|
||||
**Important Notes:**
|
||||
- Keep `litellm` updated to the latest version for security fixes
|
||||
- We've updated `ws` to `>=8.17.1` to fix DoS vulnerability
|
||||
- Run `pip-audit` or `npm audit` regularly
|
||||
- Subscribe to security advisories for nanobot and its dependencies
|
||||
|
||||
### 7. Production Deployment
|
||||
|
||||
For production use:
|
||||
|
||||
1. **Isolate the Environment**
|
||||
```bash
|
||||
# Run in a container or VM
|
||||
docker run --rm -it python:3.11
|
||||
pip install nanobot-ai
|
||||
```
|
||||
|
||||
2. **Use a Dedicated User**
|
||||
```bash
|
||||
sudo useradd -m -s /bin/bash nanobot
|
||||
sudo -u nanobot nanobot gateway
|
||||
```
|
||||
|
||||
3. **Set Proper Permissions**
|
||||
```bash
|
||||
chmod 700 ~/.nanobot
|
||||
chmod 600 ~/.nanobot/config.json
|
||||
chmod 700 ~/.nanobot/whatsapp-auth
|
||||
```
|
||||
|
||||
4. **Enable Logging**
|
||||
```bash
|
||||
# Configure log monitoring
|
||||
tail -f ~/.nanobot/logs/nanobot.log
|
||||
```
|
||||
|
||||
5. **Use Rate Limiting**
|
||||
- Configure rate limits on your API providers
|
||||
- Monitor usage for anomalies
|
||||
- Set spending limits on LLM APIs
|
||||
|
||||
6. **Regular Updates**
|
||||
```bash
|
||||
# Check for updates weekly
|
||||
pip install --upgrade nanobot-ai
|
||||
```
|
||||
|
||||
### 8. Development vs Production
|
||||
|
||||
**Development:**
|
||||
- Use separate API keys
|
||||
- Test with non-sensitive data
|
||||
- Enable verbose logging
|
||||
- Use a test Telegram bot
|
||||
|
||||
**Production:**
|
||||
- Use dedicated API keys with spending limits
|
||||
- Restrict file system access
|
||||
- Enable audit logging
|
||||
- Regular security reviews
|
||||
- Monitor for unusual activity
|
||||
|
||||
### 9. Data Privacy
|
||||
|
||||
- **Logs may contain sensitive information** - secure log files appropriately
|
||||
- **LLM providers see your prompts** - review their privacy policies
|
||||
- **Chat history is stored locally** - protect the `~/.nanobot` directory
|
||||
- **API keys are in plain text** - use OS keyring for production (a keyring sketch follows this list)
|
||||
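
A minimal sketch of the keyring recommendation above. nanobot itself reads keys from `config.json`, so this is a manual wrapper you would run before starting the gateway; the `nanobot`/`openrouter` service and account names are arbitrary placeholders, and it assumes the third-party `keyring` package plus a provider setup that honors the exported variable:

```python
# Hedged sketch: load a provider key from the OS keyring and expose it via an
# environment variable before launching nanobot. Names below are placeholders.
import os
import subprocess

import keyring  # third-party: pip install keyring

api_key = keyring.get_password("nanobot", "openrouter")  # service, account
if not api_key:
    raise SystemExit("No key in keyring; store one with: keyring set nanobot openrouter")

env = dict(os.environ, OPENROUTER_API_KEY=api_key)
subprocess.run(["nanobot", "gateway"], env=env, check=True)
```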
|
||||
### 10. Incident Response
|
||||
|
||||
If you suspect a security breach:
|
||||
|
||||
1. **Immediately revoke compromised API keys**
|
||||
2. **Review logs for unauthorized access**
|
||||
```bash
|
||||
grep "Access denied" ~/.nanobot/logs/nanobot.log
|
||||
```
|
||||
3. **Check for unexpected file modifications**
|
||||
4. **Rotate all credentials**
|
||||
5. **Update to latest version**
|
||||
6. **Report the incident** to maintainers
|
||||
|
||||
## Security Features
|
||||
|
||||
### Built-in Security Controls
|
||||
|
||||
✅ **Input Validation**
|
||||
- Path traversal protection on file operations
|
||||
- Dangerous command pattern detection
|
||||
- Input length limits on HTTP requests
|
||||
|
||||
✅ **Authentication**
|
||||
- Allow-list based access control
|
||||
- Failed authentication attempt logging
|
||||
- Open by default (configure allowFrom for production use)
|
||||
|
||||
✅ **Resource Protection**
|
||||
- Command execution timeouts (60s default)
|
||||
- Output truncation (10KB limit)
|
||||
- HTTP request timeouts (10-30s)
|
||||
|
||||
✅ **Secure Communication**
|
||||
- HTTPS for all external API calls
|
||||
- TLS for Telegram API
|
||||
- WebSocket security for WhatsApp bridge
|
||||
|
||||
## Known Limitations
|
||||
|
||||
⚠️ **Current Security Limitations:**
|
||||
|
||||
1. **No Rate Limiting** - Users can send unlimited messages (add your own if needed; see the sketch after this list)
|
||||
2. **Plain Text Config** - API keys stored in plain text (use keyring for production)
|
||||
3. **No Session Management** - No automatic session expiry
|
||||
4. **Limited Command Filtering** - Only blocks obvious dangerous patterns
|
||||
5. **No Audit Trail** - Limited security event logging (enhance as needed)
|
||||
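
If you need rate limiting today, a small per-sender limiter is easy to bolt on. This is a hedged sketch, not part of nanobot: where you call `allow()` depends on how you integrate it (for example, before a channel forwards a message to the agent):

```python
# Hedged sketch of a per-sender sliding-window rate limiter.
import time
from collections import defaultdict, deque


class RateLimiter:
    def __init__(self, max_messages: int = 10, window_seconds: float = 60.0):
        self.max_messages = max_messages
        self.window_seconds = window_seconds
        self._events: dict[str, deque[float]] = defaultdict(deque)

    def allow(self, sender_id: str) -> bool:
        now = time.monotonic()
        events = self._events[sender_id]
        # Drop timestamps that have fallen out of the window.
        while events and now - events[0] > self.window_seconds:
            events.popleft()
        if len(events) >= self.max_messages:
            return False
        events.append(now)
        return True


# Example: limiter = RateLimiter(); limiter.allow("123456789") -> True or False
```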
|
||||
## Security Checklist
|
||||
|
||||
Before deploying nanobot:
|
||||
|
||||
- [ ] API keys stored securely (not in code)
|
||||
- [ ] Config file permissions set to 0600
|
||||
- [ ] `allowFrom` lists configured for all channels
|
||||
- [ ] Running as non-root user
|
||||
- [ ] File system permissions properly restricted
|
||||
- [ ] Dependencies updated to latest secure versions
|
||||
- [ ] Logs monitored for security events
|
||||
- [ ] Rate limits configured on API providers
|
||||
- [ ] Backup and disaster recovery plan in place
|
||||
- [ ] Security review of custom skills/tools
|
||||
|
||||
## Updates
|
||||
|
||||
**Last Updated**: 2026-02-03
|
||||
|
||||
For the latest security updates and announcements, check:
|
||||
- GitHub Security Advisories: https://github.com/HKUDS/nanobot/security/advisories
|
||||
- Release Notes: https://github.com/HKUDS/nanobot/releases
|
||||
|
||||
## License
|
||||
|
||||
See LICENSE file for details.
|
||||
@@ -11,7 +11,7 @@
},
"dependencies": {
"@whiskeysockets/baileys": "7.0.0-rc.9",
"ws": "^8.17.0",
"ws": "^8.17.1",
"qrcode-terminal": "^0.12.0",
"pino": "^9.0.0"
},

core_agent_lines.sh (new executable file): 21 lines
@@ -0,0 +1,21 @@
#!/bin/bash
# Count core agent lines (excluding channels/, cli/, providers/ adapters)
cd "$(dirname "$0")" || exit 1

echo "nanobot core agent line count"
echo "================================"
echo ""

for dir in agent agent/tools bus config cron heartbeat session utils; do
count=$(find "nanobot/$dir" -maxdepth 1 -name "*.py" -exec cat {} + | wc -l)
printf " %-16s %5s lines\n" "$dir/" "$count"
done

root=$(cat nanobot/__init__.py nanobot/__main__.py | wc -l)
printf " %-16s %5s lines\n" "(root)" "$root"

echo ""
total=$(find nanobot -name "*.py" ! -path "*/channels/*" ! -path "*/cli/*" ! -path "*/providers/*" | xargs cat | wc -l)
echo " Core total: $total lines"
echo ""
echo " (excludes: channels/, cli/, providers/)"

@@ -2,6 +2,7 @@
|
||||
|
||||
import base64
|
||||
import mimetypes
|
||||
import platform
|
||||
from pathlib import Path
|
||||
from typing import Any
|
||||
|
||||
@@ -74,6 +75,8 @@ Skills with available="false" need dependencies installed first - you can try in
|
||||
from datetime import datetime
|
||||
now = datetime.now().strftime("%Y-%m-%d %H:%M (%A)")
|
||||
workspace_path = str(self.workspace.expanduser().resolve())
|
||||
system = platform.system()
|
||||
runtime = f"{'macOS' if system == 'Darwin' else system} {platform.machine()}, Python {platform.python_version()}"
|
||||
|
||||
return f"""# nanobot 🐈
|
||||
|
||||
@@ -87,6 +90,9 @@ You are nanobot, a helpful AI assistant. You have access to tools that allow you
|
||||
## Current Time
|
||||
{now}
|
||||
|
||||
## Runtime
|
||||
{runtime}
|
||||
|
||||
## Workspace
|
||||
Your workspace is at: {workspace_path}
|
||||
- Memory files: {workspace_path}/memory/MEMORY.md
|
||||
@@ -118,6 +124,8 @@ When remembering something, write to {workspace_path}/memory/MEMORY.md"""
|
||||
current_message: str,
|
||||
skill_names: list[str] | None = None,
|
||||
media: list[str] | None = None,
|
||||
channel: str | None = None,
|
||||
chat_id: str | None = None,
|
||||
) -> list[dict[str, Any]]:
|
||||
"""
|
||||
Build the complete message list for an LLM call.
|
||||
@@ -127,6 +135,8 @@ When remembering something, write to {workspace_path}/memory/MEMORY.md"""
|
||||
current_message: The new user message.
|
||||
skill_names: Optional skills to include.
|
||||
media: Optional list of local file paths for images/media.
|
||||
channel: Current channel (telegram, feishu, etc.).
|
||||
chat_id: Current chat/user ID.
|
||||
|
||||
Returns:
|
||||
List of messages including system prompt.
|
||||
@@ -135,6 +145,8 @@ When remembering something, write to {workspace_path}/memory/MEMORY.md"""
|
||||
|
||||
# System prompt
|
||||
system_prompt = self.build_system_prompt(skill_names)
|
||||
if channel and chat_id:
|
||||
system_prompt += f"\n\n## Current Session\nChannel: {channel}\nChat ID: {chat_id}"
|
||||
messages.append({"role": "system", "content": system_prompt})
|
||||
|
||||
# History
|
||||
|
||||
@@ -17,6 +17,7 @@ from nanobot.agent.tools.shell import ExecTool
|
||||
from nanobot.agent.tools.web import WebSearchTool, WebFetchTool
|
||||
from nanobot.agent.tools.message import MessageTool
|
||||
from nanobot.agent.tools.spawn import SpawnTool
|
||||
from nanobot.agent.tools.cron import CronTool
|
||||
from nanobot.agent.subagent import SubagentManager
|
||||
from nanobot.session.manager import SessionManager
|
||||
|
||||
@@ -42,8 +43,11 @@ class AgentLoop:
|
||||
max_iterations: int = 20,
|
||||
brave_api_key: str | None = None,
|
||||
exec_config: "ExecToolConfig | None" = None,
|
||||
cron_service: "CronService | None" = None,
|
||||
restrict_to_workspace: bool = False,
|
||||
):
|
||||
from nanobot.config.schema import ExecToolConfig
|
||||
from nanobot.cron.service import CronService
|
||||
self.bus = bus
|
||||
self.provider = provider
|
||||
self.workspace = workspace
|
||||
@@ -51,6 +55,8 @@ class AgentLoop:
|
||||
self.max_iterations = max_iterations
|
||||
self.brave_api_key = brave_api_key
|
||||
self.exec_config = exec_config or ExecToolConfig()
|
||||
self.cron_service = cron_service
|
||||
self.restrict_to_workspace = restrict_to_workspace
|
||||
|
||||
self.context = ContextBuilder(workspace)
|
||||
self.sessions = SessionManager(workspace)
|
||||
@@ -62,6 +68,7 @@ class AgentLoop:
|
||||
model=self.model,
|
||||
brave_api_key=brave_api_key,
|
||||
exec_config=self.exec_config,
|
||||
restrict_to_workspace=restrict_to_workspace,
|
||||
)
|
||||
|
||||
self._running = False
|
||||
@@ -69,17 +76,18 @@ class AgentLoop:
|
||||
|
||||
def _register_default_tools(self) -> None:
|
||||
"""Register the default set of tools."""
|
||||
# File tools
|
||||
self.tools.register(ReadFileTool())
|
||||
self.tools.register(WriteFileTool())
|
||||
self.tools.register(EditFileTool())
|
||||
self.tools.register(ListDirTool())
|
||||
# File tools (restrict to workspace if configured)
|
||||
allowed_dir = self.workspace if self.restrict_to_workspace else None
|
||||
self.tools.register(ReadFileTool(allowed_dir=allowed_dir))
|
||||
self.tools.register(WriteFileTool(allowed_dir=allowed_dir))
|
||||
self.tools.register(EditFileTool(allowed_dir=allowed_dir))
|
||||
self.tools.register(ListDirTool(allowed_dir=allowed_dir))
|
||||
|
||||
# Shell tool
|
||||
self.tools.register(ExecTool(
|
||||
working_dir=str(self.workspace),
|
||||
timeout=self.exec_config.timeout,
|
||||
restrict_to_workspace=self.exec_config.restrict_to_workspace,
|
||||
restrict_to_workspace=self.restrict_to_workspace,
|
||||
))
|
||||
|
||||
# Web tools
|
||||
@@ -93,6 +101,10 @@ class AgentLoop:
|
||||
# Spawn tool (for subagents)
|
||||
spawn_tool = SpawnTool(manager=self.subagents)
|
||||
self.tools.register(spawn_tool)
|
||||
|
||||
# Cron tool (for scheduling)
|
||||
if self.cron_service:
|
||||
self.tools.register(CronTool(self.cron_service))
|
||||
|
||||
async def run(self) -> None:
|
||||
"""Run the agent loop, processing messages from the bus."""
|
||||
@@ -157,11 +169,17 @@ class AgentLoop:
|
||||
if isinstance(spawn_tool, SpawnTool):
|
||||
spawn_tool.set_context(msg.channel, msg.chat_id)
|
||||
|
||||
cron_tool = self.tools.get("cron")
|
||||
if isinstance(cron_tool, CronTool):
|
||||
cron_tool.set_context(msg.channel, msg.chat_id)
|
||||
|
||||
# Build initial messages (use get_history for LLM-formatted messages)
|
||||
messages = self.context.build_messages(
|
||||
history=session.get_history(),
|
||||
current_message=msg.content,
|
||||
media=msg.media if msg.media else None,
|
||||
channel=msg.channel,
|
||||
chat_id=msg.chat_id,
|
||||
)
|
||||
|
||||
# Agent loop
|
||||
@@ -255,10 +273,16 @@ class AgentLoop:
|
||||
if isinstance(spawn_tool, SpawnTool):
|
||||
spawn_tool.set_context(origin_channel, origin_chat_id)
|
||||
|
||||
cron_tool = self.tools.get("cron")
|
||||
if isinstance(cron_tool, CronTool):
|
||||
cron_tool.set_context(origin_channel, origin_chat_id)
|
||||
|
||||
# Build messages with the announce content
|
||||
messages = self.context.build_messages(
|
||||
history=session.get_history(),
|
||||
current_message=msg.content
|
||||
current_message=msg.content,
|
||||
channel=origin_channel,
|
||||
chat_id=origin_chat_id,
|
||||
)
|
||||
|
||||
# Agent loop (limited for announce handling)
|
||||
@@ -315,21 +339,29 @@ class AgentLoop:
|
||||
content=final_content
|
||||
)
|
||||
|
||||
async def process_direct(self, content: str, session_key: str = "cli:direct") -> str:
|
||||
async def process_direct(
|
||||
self,
|
||||
content: str,
|
||||
session_key: str = "cli:direct",
|
||||
channel: str = "cli",
|
||||
chat_id: str = "direct",
|
||||
) -> str:
|
||||
"""
|
||||
Process a message directly (for CLI usage).
|
||||
Process a message directly (for CLI or cron usage).
|
||||
|
||||
Args:
|
||||
content: The message content.
|
||||
session_key: Session identifier.
|
||||
channel: Source channel (for context).
|
||||
chat_id: Source chat ID (for context).
|
||||
|
||||
Returns:
|
||||
The agent's response.
|
||||
"""
|
||||
msg = InboundMessage(
|
||||
channel="cli",
|
||||
channel=channel,
|
||||
sender_id="user",
|
||||
chat_id="direct",
|
||||
chat_id=chat_id,
|
||||
content=content
|
||||
)
|
||||
|
||||
|
||||
@@ -34,6 +34,7 @@ class SubagentManager:
|
||||
model: str | None = None,
|
||||
brave_api_key: str | None = None,
|
||||
exec_config: "ExecToolConfig | None" = None,
|
||||
restrict_to_workspace: bool = False,
|
||||
):
|
||||
from nanobot.config.schema import ExecToolConfig
|
||||
self.provider = provider
|
||||
@@ -42,6 +43,7 @@ class SubagentManager:
|
||||
self.model = model or provider.get_default_model()
|
||||
self.brave_api_key = brave_api_key
|
||||
self.exec_config = exec_config or ExecToolConfig()
|
||||
self.restrict_to_workspace = restrict_to_workspace
|
||||
self._running_tasks: dict[str, asyncio.Task[None]] = {}
|
||||
|
||||
async def spawn(
|
||||
@@ -96,13 +98,14 @@ class SubagentManager:
|
||||
try:
|
||||
# Build subagent tools (no message tool, no spawn tool)
|
||||
tools = ToolRegistry()
|
||||
tools.register(ReadFileTool())
|
||||
tools.register(WriteFileTool())
|
||||
tools.register(ListDirTool())
|
||||
allowed_dir = self.workspace if self.restrict_to_workspace else None
|
||||
tools.register(ReadFileTool(allowed_dir=allowed_dir))
|
||||
tools.register(WriteFileTool(allowed_dir=allowed_dir))
|
||||
tools.register(ListDirTool(allowed_dir=allowed_dir))
|
||||
tools.register(ExecTool(
|
||||
working_dir=str(self.workspace),
|
||||
timeout=self.exec_config.timeout,
|
||||
restrict_to_workspace=self.exec_config.restrict_to_workspace,
|
||||
restrict_to_workspace=self.restrict_to_workspace,
|
||||
))
|
||||
tools.register(WebSearchTool(api_key=self.brave_api_key))
|
||||
tools.register(WebFetchTool())
|
||||
@@ -149,7 +152,8 @@ class SubagentManager:
|
||||
|
||||
# Execute tools
|
||||
for tool_call in response.tool_calls:
|
||||
logger.debug(f"Subagent [{task_id}] executing: {tool_call.name}")
|
||||
args_str = json.dumps(tool_call.arguments)
|
||||
logger.debug(f"Subagent [{task_id}] executing: {tool_call.name} with arguments: {args_str}")
|
||||
result = await tools.execute(tool_call.name, tool_call.arguments)
|
||||
messages.append({
|
||||
"role": "tool",
|
||||
|
||||
nanobot/agent/tools/cron.py (new file): 114 lines
@@ -0,0 +1,114 @@
|
||||
"""Cron tool for scheduling reminders and tasks."""
|
||||
|
||||
from typing import Any
|
||||
|
||||
from nanobot.agent.tools.base import Tool
|
||||
from nanobot.cron.service import CronService
|
||||
from nanobot.cron.types import CronSchedule
|
||||
|
||||
|
||||
class CronTool(Tool):
|
||||
"""Tool to schedule reminders and recurring tasks."""
|
||||
|
||||
def __init__(self, cron_service: CronService):
|
||||
self._cron = cron_service
|
||||
self._channel = ""
|
||||
self._chat_id = ""
|
||||
|
||||
def set_context(self, channel: str, chat_id: str) -> None:
|
||||
"""Set the current session context for delivery."""
|
||||
self._channel = channel
|
||||
self._chat_id = chat_id
|
||||
|
||||
@property
|
||||
def name(self) -> str:
|
||||
return "cron"
|
||||
|
||||
@property
|
||||
def description(self) -> str:
|
||||
return "Schedule reminders and recurring tasks. Actions: add, list, remove."
|
||||
|
||||
@property
|
||||
def parameters(self) -> dict[str, Any]:
|
||||
return {
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"action": {
|
||||
"type": "string",
|
||||
"enum": ["add", "list", "remove"],
|
||||
"description": "Action to perform"
|
||||
},
|
||||
"message": {
|
||||
"type": "string",
|
||||
"description": "Reminder message (for add)"
|
||||
},
|
||||
"every_seconds": {
|
||||
"type": "integer",
|
||||
"description": "Interval in seconds (for recurring tasks)"
|
||||
},
|
||||
"cron_expr": {
|
||||
"type": "string",
|
||||
"description": "Cron expression like '0 9 * * *' (for scheduled tasks)"
|
||||
},
|
||||
"job_id": {
|
||||
"type": "string",
|
||||
"description": "Job ID (for remove)"
|
||||
}
|
||||
},
|
||||
"required": ["action"]
|
||||
}
|
||||
|
||||
async def execute(
|
||||
self,
|
||||
action: str,
|
||||
message: str = "",
|
||||
every_seconds: int | None = None,
|
||||
cron_expr: str | None = None,
|
||||
job_id: str | None = None,
|
||||
**kwargs: Any
|
||||
) -> str:
|
||||
if action == "add":
|
||||
return self._add_job(message, every_seconds, cron_expr)
|
||||
elif action == "list":
|
||||
return self._list_jobs()
|
||||
elif action == "remove":
|
||||
return self._remove_job(job_id)
|
||||
return f"Unknown action: {action}"
|
||||
|
||||
def _add_job(self, message: str, every_seconds: int | None, cron_expr: str | None) -> str:
|
||||
if not message:
|
||||
return "Error: message is required for add"
|
||||
if not self._channel or not self._chat_id:
|
||||
return "Error: no session context (channel/chat_id)"
|
||||
|
||||
# Build schedule
|
||||
if every_seconds:
|
||||
schedule = CronSchedule(kind="every", every_ms=every_seconds * 1000)
|
||||
elif cron_expr:
|
||||
schedule = CronSchedule(kind="cron", expr=cron_expr)
|
||||
else:
|
||||
return "Error: either every_seconds or cron_expr is required"
|
||||
|
||||
job = self._cron.add_job(
|
||||
name=message[:30],
|
||||
schedule=schedule,
|
||||
message=message,
|
||||
deliver=True,
|
||||
channel=self._channel,
|
||||
to=self._chat_id,
|
||||
)
|
||||
return f"Created job '{job.name}' (id: {job.id})"
|
||||
|
||||
def _list_jobs(self) -> str:
|
||||
jobs = self._cron.list_jobs()
|
||||
if not jobs:
|
||||
return "No scheduled jobs."
|
||||
lines = [f"- {j.name} (id: {j.id}, {j.schedule.kind})" for j in jobs]
|
||||
return "Scheduled jobs:\n" + "\n".join(lines)
|
||||
|
||||
def _remove_job(self, job_id: str | None) -> str:
|
||||
if not job_id:
|
||||
return "Error: job_id is required for remove"
|
||||
if self._cron.remove_job(job_id):
|
||||
return f"Removed job {job_id}"
|
||||
return f"Job {job_id} not found"
|
||||
@@ -6,9 +6,20 @@ from typing import Any
|
||||
from nanobot.agent.tools.base import Tool
|
||||
|
||||
|
||||
def _resolve_path(path: str, allowed_dir: Path | None = None) -> Path:
|
||||
"""Resolve path and optionally enforce directory restriction."""
|
||||
resolved = Path(path).expanduser().resolve()
|
||||
if allowed_dir and not str(resolved).startswith(str(allowed_dir.resolve())):
|
||||
raise PermissionError(f"Path {path} is outside allowed directory {allowed_dir}")
|
||||
return resolved
|
||||
|
||||
|
||||
class ReadFileTool(Tool):
|
||||
"""Tool to read file contents."""
|
||||
|
||||
def __init__(self, allowed_dir: Path | None = None):
|
||||
self._allowed_dir = allowed_dir
|
||||
|
||||
@property
|
||||
def name(self) -> str:
|
||||
return "read_file"
|
||||
@@ -32,7 +43,7 @@ class ReadFileTool(Tool):
|
||||
|
||||
async def execute(self, path: str, **kwargs: Any) -> str:
|
||||
try:
|
||||
file_path = Path(path).expanduser()
|
||||
file_path = _resolve_path(path, self._allowed_dir)
|
||||
if not file_path.exists():
|
||||
return f"Error: File not found: {path}"
|
||||
if not file_path.is_file():
|
||||
@@ -40,8 +51,8 @@ class ReadFileTool(Tool):
|
||||
|
||||
content = file_path.read_text(encoding="utf-8")
|
||||
return content
|
||||
except PermissionError:
|
||||
return f"Error: Permission denied: {path}"
|
||||
except PermissionError as e:
|
||||
return f"Error: {e}"
|
||||
except Exception as e:
|
||||
return f"Error reading file: {str(e)}"
|
||||
|
||||
@@ -49,6 +60,9 @@ class ReadFileTool(Tool):
|
||||
class WriteFileTool(Tool):
|
||||
"""Tool to write content to a file."""
|
||||
|
||||
def __init__(self, allowed_dir: Path | None = None):
|
||||
self._allowed_dir = allowed_dir
|
||||
|
||||
@property
|
||||
def name(self) -> str:
|
||||
return "write_file"
|
||||
@@ -76,12 +90,12 @@ class WriteFileTool(Tool):
|
||||
|
||||
async def execute(self, path: str, content: str, **kwargs: Any) -> str:
|
||||
try:
|
||||
file_path = Path(path).expanduser()
|
||||
file_path = _resolve_path(path, self._allowed_dir)
|
||||
file_path.parent.mkdir(parents=True, exist_ok=True)
|
||||
file_path.write_text(content, encoding="utf-8")
|
||||
return f"Successfully wrote {len(content)} bytes to {path}"
|
||||
except PermissionError:
|
||||
return f"Error: Permission denied: {path}"
|
||||
except PermissionError as e:
|
||||
return f"Error: {e}"
|
||||
except Exception as e:
|
||||
return f"Error writing file: {str(e)}"
|
||||
|
||||
@@ -89,6 +103,9 @@ class WriteFileTool(Tool):
|
||||
class EditFileTool(Tool):
|
||||
"""Tool to edit a file by replacing text."""
|
||||
|
||||
def __init__(self, allowed_dir: Path | None = None):
|
||||
self._allowed_dir = allowed_dir
|
||||
|
||||
@property
|
||||
def name(self) -> str:
|
||||
return "edit_file"
|
||||
@@ -120,7 +137,7 @@ class EditFileTool(Tool):
|
||||
|
||||
async def execute(self, path: str, old_text: str, new_text: str, **kwargs: Any) -> str:
|
||||
try:
|
||||
file_path = Path(path).expanduser()
|
||||
file_path = _resolve_path(path, self._allowed_dir)
|
||||
if not file_path.exists():
|
||||
return f"Error: File not found: {path}"
|
||||
|
||||
@@ -138,8 +155,8 @@ class EditFileTool(Tool):
|
||||
file_path.write_text(new_content, encoding="utf-8")
|
||||
|
||||
return f"Successfully edited {path}"
|
||||
except PermissionError:
|
||||
return f"Error: Permission denied: {path}"
|
||||
except PermissionError as e:
|
||||
return f"Error: {e}"
|
||||
except Exception as e:
|
||||
return f"Error editing file: {str(e)}"
|
||||
|
||||
@@ -147,6 +164,9 @@ class EditFileTool(Tool):
|
||||
class ListDirTool(Tool):
|
||||
"""Tool to list directory contents."""
|
||||
|
||||
def __init__(self, allowed_dir: Path | None = None):
|
||||
self._allowed_dir = allowed_dir
|
||||
|
||||
@property
|
||||
def name(self) -> str:
|
||||
return "list_dir"
|
||||
@@ -170,7 +190,7 @@ class ListDirTool(Tool):
|
||||
|
||||
async def execute(self, path: str, **kwargs: Any) -> str:
|
||||
try:
|
||||
dir_path = Path(path).expanduser()
|
||||
dir_path = _resolve_path(path, self._allowed_dir)
|
||||
if not dir_path.exists():
|
||||
return f"Error: Directory not found: {path}"
|
||||
if not dir_path.is_dir():
|
||||
@@ -185,7 +205,7 @@ class ListDirTool(Tool):
|
||||
return f"Directory {path} is empty"
|
||||
|
||||
return "\n".join(items)
|
||||
except PermissionError:
|
||||
return f"Error: Permission denied: {path}"
|
||||
except PermissionError as e:
|
||||
return f"Error: {e}"
|
||||
except Exception as e:
|
||||
return f"Error listing directory: {str(e)}"
|
||||
|
||||
@@ -3,6 +3,8 @@
|
||||
from abc import ABC, abstractmethod
|
||||
from typing import Any
|
||||
|
||||
from loguru import logger
|
||||
|
||||
from nanobot.bus.events import InboundMessage, OutboundMessage
|
||||
from nanobot.bus.queue import MessageBus
|
||||
|
||||
@@ -102,6 +104,10 @@ class BaseChannel(ABC):
|
||||
metadata: Optional channel-specific metadata.
|
||||
"""
|
||||
if not self.is_allowed(sender_id):
|
||||
logger.warning(
|
||||
f"Access denied for sender {sender_id} on channel {self.name}. "
|
||||
f"Add them to allowFrom list in config to grant access."
|
||||
)
|
||||
return
|
||||
|
||||
msg = InboundMessage(
|
||||
|
||||
nanobot/channels/discord.py (new file): 261 lines
@@ -0,0 +1,261 @@
|
||||
"""Discord channel implementation using Discord Gateway websocket."""
|
||||
|
||||
import asyncio
|
||||
import json
|
||||
from pathlib import Path
|
||||
from typing import Any
|
||||
|
||||
import httpx
|
||||
import websockets
|
||||
from loguru import logger
|
||||
|
||||
from nanobot.bus.events import OutboundMessage
|
||||
from nanobot.bus.queue import MessageBus
|
||||
from nanobot.channels.base import BaseChannel
|
||||
from nanobot.config.schema import DiscordConfig
|
||||
|
||||
|
||||
DISCORD_API_BASE = "https://discord.com/api/v10"
|
||||
MAX_ATTACHMENT_BYTES = 20 * 1024 * 1024 # 20MB
|
||||
|
||||
|
||||
class DiscordChannel(BaseChannel):
|
||||
"""Discord channel using Gateway websocket."""
|
||||
|
||||
name = "discord"
|
||||
|
||||
def __init__(self, config: DiscordConfig, bus: MessageBus):
|
||||
super().__init__(config, bus)
|
||||
self.config: DiscordConfig = config
|
||||
self._ws: websockets.WebSocketClientProtocol | None = None
|
||||
self._seq: int | None = None
|
||||
self._heartbeat_task: asyncio.Task | None = None
|
||||
self._typing_tasks: dict[str, asyncio.Task] = {}
|
||||
self._http: httpx.AsyncClient | None = None
|
||||
|
||||
async def start(self) -> None:
|
||||
"""Start the Discord gateway connection."""
|
||||
if not self.config.token:
|
||||
logger.error("Discord bot token not configured")
|
||||
return
|
||||
|
||||
self._running = True
|
||||
self._http = httpx.AsyncClient(timeout=30.0)
|
||||
|
||||
while self._running:
|
||||
try:
|
||||
logger.info("Connecting to Discord gateway...")
|
||||
async with websockets.connect(self.config.gateway_url) as ws:
|
||||
self._ws = ws
|
||||
await self._gateway_loop()
|
||||
except asyncio.CancelledError:
|
||||
break
|
||||
except Exception as e:
|
||||
logger.warning(f"Discord gateway error: {e}")
|
||||
if self._running:
|
||||
logger.info("Reconnecting to Discord gateway in 5 seconds...")
|
||||
await asyncio.sleep(5)
|
||||
|
||||
async def stop(self) -> None:
|
||||
"""Stop the Discord channel."""
|
||||
self._running = False
|
||||
if self._heartbeat_task:
|
||||
self._heartbeat_task.cancel()
|
||||
self._heartbeat_task = None
|
||||
for task in self._typing_tasks.values():
|
||||
task.cancel()
|
||||
self._typing_tasks.clear()
|
||||
if self._ws:
|
||||
await self._ws.close()
|
||||
self._ws = None
|
||||
if self._http:
|
||||
await self._http.aclose()
|
||||
self._http = None
|
||||
|
||||
async def send(self, msg: OutboundMessage) -> None:
|
||||
"""Send a message through Discord REST API."""
|
||||
if not self._http:
|
||||
logger.warning("Discord HTTP client not initialized")
|
||||
return
|
||||
|
||||
url = f"{DISCORD_API_BASE}/channels/{msg.chat_id}/messages"
|
||||
payload: dict[str, Any] = {"content": msg.content}
|
||||
|
||||
if msg.reply_to:
|
||||
payload["message_reference"] = {"message_id": msg.reply_to}
|
||||
payload["allowed_mentions"] = {"replied_user": False}
|
||||
|
||||
headers = {"Authorization": f"Bot {self.config.token}"}
|
||||
|
||||
try:
|
||||
for attempt in range(3):
|
||||
try:
|
||||
response = await self._http.post(url, headers=headers, json=payload)
|
||||
if response.status_code == 429:
|
||||
data = response.json()
|
||||
retry_after = float(data.get("retry_after", 1.0))
|
||||
logger.warning(f"Discord rate limited, retrying in {retry_after}s")
|
||||
await asyncio.sleep(retry_after)
|
||||
continue
|
||||
response.raise_for_status()
|
||||
return
|
||||
except Exception as e:
|
||||
if attempt == 2:
|
||||
logger.error(f"Error sending Discord message: {e}")
|
||||
else:
|
||||
await asyncio.sleep(1)
|
||||
finally:
|
||||
await self._stop_typing(msg.chat_id)
|
||||
|
||||
async def _gateway_loop(self) -> None:
|
||||
"""Main gateway loop: identify, heartbeat, dispatch events."""
|
||||
if not self._ws:
|
||||
return
|
||||
|
||||
async for raw in self._ws:
|
||||
try:
|
||||
data = json.loads(raw)
|
||||
except json.JSONDecodeError:
|
||||
logger.warning(f"Invalid JSON from Discord gateway: {raw[:100]}")
|
||||
continue
|
||||
|
||||
op = data.get("op")
|
||||
event_type = data.get("t")
|
||||
seq = data.get("s")
|
||||
payload = data.get("d")
|
||||
|
||||
if seq is not None:
|
||||
self._seq = seq
|
||||
|
||||
if op == 10:
|
||||
# HELLO: start heartbeat and identify
|
||||
interval_ms = payload.get("heartbeat_interval", 45000)
|
||||
await self._start_heartbeat(interval_ms / 1000)
|
||||
await self._identify()
|
||||
elif op == 0 and event_type == "READY":
|
||||
logger.info("Discord gateway READY")
|
||||
elif op == 0 and event_type == "MESSAGE_CREATE":
|
||||
await self._handle_message_create(payload)
|
||||
elif op == 7:
|
||||
# RECONNECT: exit loop to reconnect
|
||||
logger.info("Discord gateway requested reconnect")
|
||||
break
|
||||
elif op == 9:
|
||||
# INVALID_SESSION: reconnect
|
||||
logger.warning("Discord gateway invalid session")
|
||||
break
|
||||
|
||||
async def _identify(self) -> None:
|
||||
"""Send IDENTIFY payload."""
|
||||
if not self._ws:
|
||||
return
|
||||
|
||||
identify = {
|
||||
"op": 2,
|
||||
"d": {
|
||||
"token": self.config.token,
|
||||
"intents": self.config.intents,
|
||||
"properties": {
|
||||
"os": "nanobot",
|
||||
"browser": "nanobot",
|
||||
"device": "nanobot",
|
||||
},
|
||||
},
|
||||
}
|
||||
await self._ws.send(json.dumps(identify))
|
||||
|
||||
async def _start_heartbeat(self, interval_s: float) -> None:
|
||||
"""Start or restart the heartbeat loop."""
|
||||
if self._heartbeat_task:
|
||||
self._heartbeat_task.cancel()
|
||||
|
||||
async def heartbeat_loop() -> None:
|
||||
while self._running and self._ws:
|
||||
payload = {"op": 1, "d": self._seq}
|
||||
try:
|
||||
await self._ws.send(json.dumps(payload))
|
||||
except Exception as e:
|
||||
logger.warning(f"Discord heartbeat failed: {e}")
|
||||
break
|
||||
await asyncio.sleep(interval_s)
|
||||
|
||||
self._heartbeat_task = asyncio.create_task(heartbeat_loop())
|
||||
|
||||
async def _handle_message_create(self, payload: dict[str, Any]) -> None:
|
||||
"""Handle incoming Discord messages."""
|
||||
author = payload.get("author") or {}
|
||||
if author.get("bot"):
|
||||
return
|
||||
|
||||
sender_id = str(author.get("id", ""))
|
||||
channel_id = str(payload.get("channel_id", ""))
|
||||
content = payload.get("content") or ""
|
||||
|
||||
if not sender_id or not channel_id:
|
||||
return
|
||||
|
||||
if not self.is_allowed(sender_id):
|
||||
return
|
||||
|
||||
content_parts = [content] if content else []
|
||||
media_paths: list[str] = []
|
||||
media_dir = Path.home() / ".nanobot" / "media"
|
||||
|
||||
for attachment in payload.get("attachments") or []:
|
||||
url = attachment.get("url")
|
||||
filename = attachment.get("filename") or "attachment"
|
||||
size = attachment.get("size") or 0
|
||||
if not url or not self._http:
|
||||
continue
|
||||
if size and size > MAX_ATTACHMENT_BYTES:
|
||||
content_parts.append(f"[attachment: {filename} - too large]")
|
||||
continue
|
||||
try:
|
||||
media_dir.mkdir(parents=True, exist_ok=True)
|
||||
file_path = media_dir / f"{attachment.get('id', 'file')}_{filename.replace('/', '_')}"
|
||||
resp = await self._http.get(url)
|
||||
resp.raise_for_status()
|
||||
file_path.write_bytes(resp.content)
|
||||
media_paths.append(str(file_path))
|
||||
content_parts.append(f"[attachment: {file_path}]")
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to download Discord attachment: {e}")
|
||||
content_parts.append(f"[attachment: {filename} - download failed]")
|
||||
|
||||
reply_to = (payload.get("referenced_message") or {}).get("id")
|
||||
|
||||
await self._start_typing(channel_id)
|
||||
|
||||
await self._handle_message(
|
||||
sender_id=sender_id,
|
||||
chat_id=channel_id,
|
||||
content="\n".join(p for p in content_parts if p) or "[empty message]",
|
||||
media=media_paths,
|
||||
metadata={
|
||||
"message_id": str(payload.get("id", "")),
|
||||
"guild_id": payload.get("guild_id"),
|
||||
"reply_to": reply_to,
|
||||
},
|
||||
)
|
||||
|
||||
async def _start_typing(self, channel_id: str) -> None:
|
||||
"""Start periodic typing indicator for a channel."""
|
||||
await self._stop_typing(channel_id)
|
||||
|
||||
async def typing_loop() -> None:
|
||||
url = f"{DISCORD_API_BASE}/channels/{channel_id}/typing"
|
||||
headers = {"Authorization": f"Bot {self.config.token}"}
|
||||
while self._running:
|
||||
try:
|
||||
await self._http.post(url, headers=headers)
|
||||
except Exception:
|
||||
pass
|
||||
await asyncio.sleep(8)
|
||||
|
||||
self._typing_tasks[channel_id] = asyncio.create_task(typing_loop())
|
||||
|
||||
async def _stop_typing(self, channel_id: str) -> None:
|
||||
"""Stop typing indicator for a channel."""
|
||||
task = self._typing_tasks.pop(channel_id, None)
|
||||
if task:
|
||||
task.cancel()
|
||||
@@ -55,6 +55,17 @@ class ChannelManager:
|
||||
logger.info("WhatsApp channel enabled")
|
||||
except ImportError as e:
|
||||
logger.warning(f"WhatsApp channel not available: {e}")
|
||||
|
||||
# Discord channel
|
||||
if self.config.channels.discord.enabled:
|
||||
try:
|
||||
from nanobot.channels.discord import DiscordChannel
|
||||
self.channels["discord"] = DiscordChannel(
|
||||
self.config.channels.discord, self.bus
|
||||
)
|
||||
logger.info("Discord channel enabled")
|
||||
except ImportError as e:
|
||||
logger.warning(f"Discord channel not available: {e}")
|
||||
|
||||
# Feishu channel
|
||||
if self.config.channels.feishu.enabled:
|
||||
|
||||
@@ -195,7 +195,11 @@ def gateway(
|
||||
default_model=config.agents.defaults.model
|
||||
)
|
||||
|
||||
# Create agent
|
||||
# Create cron service first (callback set after agent creation)
|
||||
cron_store_path = get_data_dir() / "cron" / "jobs.json"
|
||||
cron = CronService(cron_store_path)
|
||||
|
||||
# Create agent with cron service
|
||||
agent = AgentLoop(
|
||||
bus=bus,
|
||||
provider=provider,
|
||||
@@ -204,27 +208,28 @@ def gateway(
|
||||
max_iterations=config.agents.defaults.max_tool_iterations,
|
||||
brave_api_key=config.tools.web.search.api_key or None,
|
||||
exec_config=config.tools.exec,
|
||||
cron_service=cron,
|
||||
restrict_to_workspace=config.tools.restrict_to_workspace,
|
||||
)
|
||||
|
||||
# Create cron service
|
||||
# Set cron callback (needs agent)
|
||||
async def on_cron_job(job: CronJob) -> str | None:
|
||||
"""Execute a cron job through the agent."""
|
||||
response = await agent.process_direct(
|
||||
job.payload.message,
|
||||
session_key=f"cron:{job.id}"
|
||||
session_key=f"cron:{job.id}",
|
||||
channel=job.payload.channel or "cli",
|
||||
chat_id=job.payload.to or "direct",
|
||||
)
|
||||
# Optionally deliver to channel
|
||||
if job.payload.deliver and job.payload.to:
|
||||
from nanobot.bus.events import OutboundMessage
|
||||
await bus.publish_outbound(OutboundMessage(
|
||||
channel=job.payload.channel or "whatsapp",
|
||||
channel=job.payload.channel or "cli",
|
||||
chat_id=job.payload.to,
|
||||
content=response or ""
|
||||
))
|
||||
return response
|
||||
|
||||
cron_store_path = get_data_dir() / "cron" / "jobs.json"
|
||||
cron = CronService(cron_store_path, on_job=on_cron_job)
|
||||
cron.on_job = on_cron_job
|
||||
|
||||
# Create heartbeat service
|
||||
async def on_heartbeat(prompt: str) -> str:
|
||||
@@ -312,6 +317,7 @@ def agent(
|
||||
workspace=config.workspace_path,
|
||||
brave_api_key=config.tools.web.search.api_key or None,
|
||||
exec_config=config.tools.exec,
|
||||
restrict_to_workspace=config.tools.restrict_to_workspace,
|
||||
)
|
||||
|
||||
if message:
|
||||
@@ -370,6 +376,13 @@ def channels_status():
|
||||
wa.bridge_url
|
||||
)
|
||||
|
||||
dc = config.channels.discord
|
||||
table.add_row(
|
||||
"Discord",
|
||||
"✓" if dc.enabled else "✗",
|
||||
dc.gateway_url
|
||||
)
|
||||
|
||||
# Telegram
|
||||
tg = config.channels.telegram
|
||||
tg_config = f"token: {tg.token[:10]}..." if tg.token else "[dim]not configured[/dim]"
|
||||
|
||||
@@ -34,6 +34,7 @@ def load_config(config_path: Path | None = None) -> Config:
|
||||
try:
|
||||
with open(path) as f:
|
||||
data = json.load(f)
|
||||
data = _migrate_config(data)
|
||||
return Config.model_validate(convert_keys(data))
|
||||
except (json.JSONDecodeError, ValueError) as e:
|
||||
print(f"Warning: Failed to load config from {path}: {e}")
|
||||
@@ -61,6 +62,16 @@ def save_config(config: Config, config_path: Path | None = None) -> None:
|
||||
json.dump(data, f, indent=2)
|
||||
|
||||
|
||||
def _migrate_config(data: dict) -> dict:
|
||||
"""Migrate old config formats to current."""
|
||||
# Move tools.exec.restrictToWorkspace → tools.restrictToWorkspace
|
||||
tools = data.get("tools", {})
|
||||
exec_cfg = tools.get("exec", {})
|
||||
if "restrictToWorkspace" in exec_cfg and "restrictToWorkspace" not in tools:
|
||||
tools["restrictToWorkspace"] = exec_cfg.pop("restrictToWorkspace")
|
||||
return data
|
||||
|
||||
|
||||
def convert_keys(data: Any) -> Any:
|
||||
"""Convert camelCase keys to snake_case for Pydantic."""
|
||||
if isinstance(data, dict):
|
||||
|
||||
@@ -30,10 +30,20 @@ class FeishuConfig(BaseModel):
|
||||
allow_from: list[str] = Field(default_factory=list) # Allowed user open_ids
|
||||
|
||||
|
||||
class DiscordConfig(BaseModel):
|
||||
"""Discord channel configuration."""
|
||||
enabled: bool = False
|
||||
token: str = "" # Bot token from Discord Developer Portal
|
||||
allow_from: list[str] = Field(default_factory=list) # Allowed user IDs
|
||||
gateway_url: str = "wss://gateway.discord.gg/?v=10&encoding=json"
|
||||
intents: int = 37377 # GUILDS + GUILD_MESSAGES + DIRECT_MESSAGES + MESSAGE_CONTENT
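# 1 (GUILDS) + 512 (GUILD_MESSAGES) + 4096 (DIRECT_MESSAGES) + 32768 (MESSAGE_CONTENT) = 37377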
|
||||
|
||||
|
||||
class ChannelsConfig(BaseModel):
|
||||
"""Configuration for chat channels."""
|
||||
whatsapp: WhatsAppConfig = Field(default_factory=WhatsAppConfig)
|
||||
telegram: TelegramConfig = Field(default_factory=TelegramConfig)
|
||||
discord: DiscordConfig = Field(default_factory=DiscordConfig)
|
||||
feishu: FeishuConfig = Field(default_factory=FeishuConfig)
|
||||
|
||||
|
||||
@@ -62,10 +72,13 @@ class ProvidersConfig(BaseModel):
|
||||
anthropic: ProviderConfig = Field(default_factory=ProviderConfig)
|
||||
openai: ProviderConfig = Field(default_factory=ProviderConfig)
|
||||
openrouter: ProviderConfig = Field(default_factory=ProviderConfig)
|
||||
deepseek: ProviderConfig = Field(default_factory=ProviderConfig)
|
||||
groq: ProviderConfig = Field(default_factory=ProviderConfig)
|
||||
zhipu: ProviderConfig = Field(default_factory=ProviderConfig)
|
||||
dashscope: ProviderConfig = Field(default_factory=ProviderConfig) # 阿里云通义千问
|
||||
vllm: ProviderConfig = Field(default_factory=ProviderConfig)
|
||||
gemini: ProviderConfig = Field(default_factory=ProviderConfig)
|
||||
moonshot: ProviderConfig = Field(default_factory=ProviderConfig)
|
||||
|
||||
|
||||
class GatewayConfig(BaseModel):
|
||||
@@ -88,13 +101,13 @@ class WebToolsConfig(BaseModel):
|
||||
class ExecToolConfig(BaseModel):
|
||||
"""Shell exec tool configuration."""
|
||||
timeout: int = 60
|
||||
restrict_to_workspace: bool = False # If true, block commands accessing paths outside workspace
|
||||
|
||||
|
||||
class ToolsConfig(BaseModel):
|
||||
"""Tools configuration."""
|
||||
web: WebToolsConfig = Field(default_factory=WebToolsConfig)
|
||||
exec: ExecToolConfig = Field(default_factory=ExecToolConfig)
|
||||
restrict_to_workspace: bool = False # If true, restrict all tool access to workspace directory
|
||||
|
||||
|
||||
class Config(BaseSettings):
|
||||
@@ -110,26 +123,59 @@ class Config(BaseSettings):
|
||||
"""Get expanded workspace path."""
|
||||
return Path(self.agents.defaults.workspace).expanduser()
|
||||
|
||||
def get_api_key(self) -> str | None:
|
||||
"""Get API key in priority order: OpenRouter > Anthropic > OpenAI > Gemini > Zhipu > Groq > vLLM."""
|
||||
return (
|
||||
self.providers.openrouter.api_key or
|
||||
self.providers.anthropic.api_key or
|
||||
self.providers.openai.api_key or
|
||||
self.providers.gemini.api_key or
|
||||
self.providers.zhipu.api_key or
|
||||
self.providers.groq.api_key or
|
||||
self.providers.vllm.api_key or
|
||||
None
|
||||
)
|
||||
def _match_provider(self, model: str | None = None) -> ProviderConfig | None:
|
||||
"""Match a provider based on model name."""
|
||||
model = (model or self.agents.defaults.model).lower()
|
||||
# Map of keywords to provider configs
|
||||
providers = {
|
||||
"openrouter": self.providers.openrouter,
|
||||
"deepseek": self.providers.deepseek,
|
||||
"anthropic": self.providers.anthropic,
|
||||
"claude": self.providers.anthropic,
|
||||
"openai": self.providers.openai,
|
||||
"gpt": self.providers.openai,
|
||||
"gemini": self.providers.gemini,
|
||||
"zhipu": self.providers.zhipu,
|
||||
"glm": self.providers.zhipu,
|
||||
"zai": self.providers.zhipu,
|
||||
"dashscope": self.providers.dashscope,
|
||||
"qwen": self.providers.dashscope,
|
||||
"groq": self.providers.groq,
|
||||
"moonshot": self.providers.moonshot,
|
||||
"kimi": self.providers.moonshot,
|
||||
"vllm": self.providers.vllm,
|
||||
}
|
||||
for keyword, provider in providers.items():
|
||||
if keyword in model and provider.api_key:
|
||||
return provider
|
||||
return None
|
||||
|
||||
def get_api_key(self, model: str | None = None) -> str | None:
|
||||
"""Get API key for the given model (or default model). Falls back to first available key."""
|
||||
# Try matching by model name first
|
||||
matched = self._match_provider(model)
|
||||
if matched:
|
||||
return matched.api_key
|
||||
# Fallback: return first available key
|
||||
for provider in [
|
||||
self.providers.openrouter, self.providers.deepseek,
|
||||
self.providers.anthropic, self.providers.openai,
|
||||
self.providers.gemini, self.providers.zhipu,
|
||||
self.providers.dashscope, self.providers.moonshot,
|
||||
self.providers.vllm, self.providers.groq,
|
||||
]:
|
||||
if provider.api_key:
|
||||
return provider.api_key
|
||||
return None
|
||||
|
||||
def get_api_base(self) -> str | None:
|
||||
"""Get API base URL if using OpenRouter, Zhipu or vLLM."""
|
||||
if self.providers.openrouter.api_key:
|
||||
def get_api_base(self, model: str | None = None) -> str | None:
|
||||
"""Get API base URL based on model name."""
|
||||
model = (model or self.agents.defaults.model).lower()
|
||||
if "openrouter" in model:
|
||||
return self.providers.openrouter.api_base or "https://openrouter.ai/api/v1"
|
||||
if self.providers.zhipu.api_key:
|
||||
if any(k in model for k in ("zhipu", "glm", "zai")):
|
||||
return self.providers.zhipu.api_base
|
||||
if self.providers.vllm.api_base:
|
||||
if "vllm" in model:
|
||||
return self.providers.vllm.api_base
|
||||
return None
|
||||
|
||||
|
||||
@@ -42,7 +42,9 @@ class LiteLLMProvider(LLMProvider):
|
||||
os.environ["OPENROUTER_API_KEY"] = api_key
|
||||
elif self.is_vllm:
|
||||
# vLLM/custom endpoint - uses OpenAI-compatible API
|
||||
os.environ["OPENAI_API_KEY"] = api_key
|
||||
os.environ["HOSTED_VLLM_API_KEY"] = api_key
|
||||
elif "deepseek" in default_model:
|
||||
os.environ.setdefault("DEEPSEEK_API_KEY", api_key)
|
||||
elif "anthropic" in default_model:
|
||||
os.environ.setdefault("ANTHROPIC_API_KEY", api_key)
|
||||
elif "openai" in default_model or "gpt" in default_model:
|
||||
@@ -52,8 +54,13 @@ class LiteLLMProvider(LLMProvider):
|
||||
elif "zhipu" in default_model or "glm" in default_model or "zai" in default_model:
|
||||
os.environ.setdefault("ZAI_API_KEY", api_key)
|
||||
os.environ.setdefault("ZHIPUAI_API_KEY", api_key)
|
||||
elif "dashscope" in default_model or "qwen" in default_model.lower():
|
||||
os.environ.setdefault("DASHSCOPE_API_KEY", api_key)
|
||||
elif "groq" in default_model:
|
||||
os.environ.setdefault("GROQ_API_KEY", api_key)
|
||||
elif "moonshot" in default_model or "kimi" in default_model:
|
||||
os.environ.setdefault("MOONSHOT_API_KEY", api_key)
|
||||
os.environ.setdefault("MOONSHOT_API_BASE", api_base or "https://api.moonshot.cn/v1")
|
||||
|
||||
if api_base:
|
||||
litellm.api_base = api_base
|
||||
@@ -97,16 +104,34 @@ class LiteLLMProvider(LLMProvider):
|
||||
model.startswith("hosted_vllm/")
|
||||
):
|
||||
model = f"zai/{model}"
|
||||
|
||||
|
||||
# For DashScope/Qwen, ensure dashscope/ prefix
|
||||
if ("qwen" in model.lower() or "dashscope" in model.lower()) and not (
|
||||
model.startswith("dashscope/") or
|
||||
model.startswith("openrouter/")
|
||||
):
|
||||
model = f"dashscope/{model}"
|
||||
|
||||
# For Moonshot/Kimi, ensure moonshot/ prefix (before vLLM check)
|
||||
if ("moonshot" in model.lower() or "kimi" in model.lower()) and not (
|
||||
model.startswith("moonshot/") or model.startswith("openrouter/")
|
||||
):
|
||||
model = f"moonshot/{model}"
|
||||
|
||||
# For Gemini, ensure gemini/ prefix if not already present
|
||||
if "gemini" in model.lower() and not model.startswith("gemini/"):
|
||||
model = f"gemini/{model}"
|
||||
|
||||
|
||||
# For vLLM, use hosted_vllm/ prefix per LiteLLM docs
|
||||
# Convert openai/ prefix to hosted_vllm/ if user specified it
|
||||
if self.is_vllm:
|
||||
model = f"hosted_vllm/{model}"
|
||||
|
||||
# For Gemini, ensure gemini/ prefix if not already present
|
||||
if "gemini" in model.lower() and not model.startswith("gemini/"):
|
||||
model = f"gemini/{model}"
|
||||
|
||||
# kimi-k2.5 only supports temperature=1.0
|
||||
if "kimi-k2.5" in model.lower():
|
||||
temperature = 1.0
|
||||
|
||||
kwargs: dict[str, Any] = {
|
||||
"model": model,
|
||||
"messages": messages,
|
||||
|
||||
nanobot/skills/cron/SKILL.md (new file): 40 lines
@@ -0,0 +1,40 @@
---
name: cron
description: Schedule reminders and recurring tasks.
---

# Cron

Use the `cron` tool to schedule reminders or recurring tasks.

## Two Modes

1. **Reminder** - message is sent directly to user
2. **Task** - message is a task description, agent executes and sends result

## Examples

Fixed reminder:
```
cron(action="add", message="Time to take a break!", every_seconds=1200)
```

Dynamic task (agent executes each time):
```
cron(action="add", message="Check HKUDS/nanobot GitHub stars and report", every_seconds=600)
```

List/remove:
```
cron(action="list")
cron(action="remove", job_id="abc123")
```

## Time Expressions

| User says | Parameters |
|-----------|------------|
| every 20 minutes | every_seconds: 1200 |
| every hour | every_seconds: 3600 |
| every day at 8am | cron_expr: "0 8 * * *" |
| weekdays at 5pm | cron_expr: "0 17 * * 1-5" |

@@ -29,12 +29,10 @@ dependencies = [
"rich>=13.0.0",
"croniter>=2.0.0",
"python-telegram-bot>=21.0",
"lark-oapi>=1.0.0",
]

[project.optional-dependencies]
feishu = [
"lark-oapi>=1.0.0",
]
dev = [
"pytest>=7.0.0",
"pytest-asyncio>=0.21.0",

test_docker.sh → tests/test_docker.sh (executable file → normal file): 1 line changed
@@ -1,5 +1,6 @@
#!/usr/bin/env bash
set -euo pipefail
cd "$(dirname "$0")/.." || exit 1

IMAGE_NAME="nanobot-test"