feat: remove JavaScript files replaced by TypeScript equivalents

- Remove 11 JavaScript source files that have been migrated to TypeScript
- Update package.json scripts to reference TypeScript files
- Update documentation and scripts to reference .ts instead of .js
- Keep JavaScript files without TypeScript equivalents (chatbot-related)

This completes the TypeScript migration for core application files while
maintaining backward compatibility for components not yet migrated.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
Jonathan Flatt
2025-05-28 15:01:03 +00:00
parent 3128a83b7a
commit fdf255cbec
19 changed files with 25 additions and 3786 deletions
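As context for the removals below, the migration follows the usual CommonJS-to-typed-ES-modules conversion. A minimal, hypothetical sketch (file and function names are illustrative, not taken from this repository):

```typescript
// Before (src/utils/greeter.js, CommonJS) — hypothetical:
//   const { createLogger } = require('./logger');
//   function greet(name) { ... }
//   module.exports = { greet };

// After (src/utils/greeter.ts): the same module with types and ES exports.
interface Logger {
  info(msg: string): void;
}

// Stand-in for the project's createLogger; included so the sketch runs alone.
function createLogger(name: string): Logger {
  return { info: (msg: string) => console.log(`[${name}] ${msg}`) };
}

export function greet(name: string): string {
  createLogger('greeter').info(`greeting ${name}`);
  return `Hello, ${name}!`;
}

console.log(greet('TypeScript'));
```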

View File

@@ -124,20 +124,20 @@ The system automatically triggers comprehensive PR reviews when all checks pass:
## Architecture Overview
### Core Components
-1. **Express Server** (`src/index.js`): Main application entry point that sets up middleware, routes, and error handling
+1. **Express Server** (`src/index.ts`): Main application entry point that sets up middleware, routes, and error handling
2. **Routes**:
- GitHub Webhook: `/api/webhooks/github` - Processes GitHub webhook events
- Claude API: `/api/claude` - Direct API access to Claude
- Health Check: `/health` - Service status monitoring
3. **Controllers**:
-   - `githubController.js` - Handles webhook verification and processing
+   - `githubController.ts` - Handles webhook verification and processing
4. **Services**:
-   - `claudeService.js` - Interfaces with Claude Code CLI
-   - `githubService.js` - Handles GitHub API interactions
+   - `claudeService.ts` - Interfaces with Claude Code CLI
+   - `githubService.ts` - Handles GitHub API interactions
5. **Utilities**:
-   - `logger.js` - Logging functionality with redaction capability
-   - `awsCredentialProvider.js` - Secure AWS credential management
-   - `sanitize.js` - Input sanitization and security
+   - `logger.ts` - Logging functionality with redaction capability
+   - `awsCredentialProvider.ts` - Secure AWS credential management
+   - `sanitize.ts` - Input sanitization and security
### Execution Modes & Security Architecture
The system uses different execution modes based on operation type:
@@ -179,7 +179,7 @@ The service supports multiple AWS authentication methods, with a focus on securi
- **Task Roles** (ECS): Automatically uses container credentials
- **Direct credentials**: Not recommended, but supported for backward compatibility
-The `awsCredentialProvider.js` utility handles credential retrieval and rotation.
+The `awsCredentialProvider.ts` utility handles credential retrieval and rotation.
## Security Features
- Webhook signature verification using HMAC
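HMAC webhook signature verification of the kind listed above typically looks like the following — a self-contained sketch of the general technique, not the service's actual implementation (which lives in the controller):

```typescript
import * as crypto from 'crypto';

// Verify a GitHub-style signature header ("sha256=<hex hmac>") against the
// raw request body. Illustrative only; function name is not from the repo.
function verifySignature(rawBody: Buffer, signature: string, secret: string): boolean {
  const expected =
    'sha256=' + crypto.createHmac('sha256', secret).update(rawBody).digest('hex');
  const a = Buffer.from(expected);
  const b = Buffer.from(signature);
  // timingSafeEqual avoids leaking information through comparison timing;
  // it throws on unequal lengths, so check that first.
  return a.length === b.length && crypto.timingSafeEqual(a, b);
}

// Example: compute a valid signature and check both outcomes.
const body = Buffer.from(JSON.stringify({ action: 'opened' }));
const secret = 'test-secret';
const sig = 'sha256=' + crypto.createHmac('sha256', secret).update(body).digest('hex');
console.log(verifySignature(body, sig, secret)); // true
console.log(verifySignature(body, 'sha256=deadbeef', secret)); // false
```

This is why the Express middleware captures the raw body: the HMAC must be computed over the exact bytes GitHub signed, not a re-serialized JSON object.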

View File

@@ -89,7 +89,7 @@ To add a new chatbot provider in the future:
1. **Create Provider Class**
```javascript
-// src/providers/NewProvider.js
+// src/providers/NewProvider.ts
const ChatbotProvider = require('./ChatbotProvider');
class NewProvider extends ChatbotProvider {
@@ -113,7 +113,7 @@ To add a new chatbot provider in the future:
2. **Register Provider**
```javascript
-// src/providers/ProviderFactory.js
+// src/providers/ProviderFactory.ts
const NewProvider = require('./NewProvider');
// In constructor:
@@ -122,7 +122,7 @@ To add a new chatbot provider in the future:
3. **Add Route Handler**
```javascript
-// src/controllers/chatbotController.js
+// src/controllers/chatbotController.ts
async function handleNewProviderWebhook(req, res) {
return await handleChatbotWebhook(req, res, 'newprovider');
}
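The three steps above can be sketched end-to-end. This is a minimal, self-contained TypeScript illustration of the provider pattern; the class and method names are hypothetical, not the project's actual API:

```typescript
// Base class each provider extends (step 1).
abstract class ChatbotProvider {
  abstract sendMessage(channelId: string, text: string): string;
}

class NewProvider extends ChatbotProvider {
  sendMessage(channelId: string, text: string): string {
    return `[newprovider:${channelId}] ${text}`;
  }
}

// Factory that maps provider names to instances (step 2); the route
// handler (step 3) would call factory.get('newprovider') per request.
class ProviderFactory {
  private providers = new Map<string, ChatbotProvider>();

  register(name: string, provider: ChatbotProvider): void {
    this.providers.set(name, provider);
  }

  get(name: string): ChatbotProvider {
    const provider = this.providers.get(name);
    if (!provider) throw new Error(`Unknown provider: ${name}`);
    return provider;
  }
}

const factory = new ProviderFactory();
factory.register('newprovider', new NewProvider());
console.log(factory.get('newprovider').sendMessage('general', 'hello'));
```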

View File

@@ -15,7 +15,7 @@ GitHub → Webhook Service → Docker Container → Claude API
### 1. GitHub Webhook Reception
**Endpoint**: `POST /api/webhooks/github`
-**Handler**: `src/index.js:38`
+**Handler**: `src/index.ts:38`
1. GitHub sends webhook event to the service
2. Express middleware captures raw body for signature verification
@@ -23,7 +23,7 @@ GitHub → Webhook Service → Docker Container → Claude API
### 2. Webhook Verification & Processing
-**Controller**: `src/controllers/githubController.js`
+**Controller**: `src/controllers/githubController.ts`
**Method**: `handleWebhook()`
1. Verifies webhook signature using `GITHUB_WEBHOOK_SECRET`
@@ -45,7 +45,7 @@ GitHub → Webhook Service → Docker Container → Claude API
### 4. Claude Container Preparation
-**Service**: `src/services/claudeService.js`
+**Service**: `src/services/claudeService.ts`
**Method**: `processCommand()`
1. Builds Docker image if not exists: `claude-code-runner:latest`
@@ -79,7 +79,7 @@ GitHub → Webhook Service → Docker Container → Claude API
### 6. Response Handling
-**Controller**: `src/controllers/githubController.js`
+**Controller**: `src/controllers/githubController.ts`
**Method**: `handleWebhook()`
1. Read response from container

View File

@@ -58,8 +58,8 @@ Instead of complex pooled execution, consider:
## Code Locations
-- Container pool service: `src/services/containerPoolService.js`
-- Execution logic: `src/services/claudeService.js:170-210`
+- Container pool service: `src/services/containerPoolService.ts`
+- Execution logic: `src/services/claudeService.ts:170-210`
- Container creation: Modified Docker command in pool service
## Performance Gains Observed

View File

@@ -12,7 +12,7 @@ The webhook service handles sensitive credentials including:
## Security Measures Implemented
### 1. Docker Command Sanitization
-In `src/services/claudeService.js`:
+In `src/services/claudeService.ts`:
- Docker commands are sanitized before logging
- Sensitive environment variables are replaced with `[REDACTED]`
- Sanitized commands are used in all error messages
@@ -34,13 +34,13 @@ const sanitizedCommand = dockerCommand.replace(/-e [A-Z_]+=\"[^\"]*\"/g, (match)
- Sanitized output is used in error messages and logs
### 3. Logger Redaction
-In `src/utils/logger.js`:
+In `src/utils/logger.ts`:
- Pino logger configured with comprehensive redaction paths
- Automatically redacts sensitive fields in log output
- Covers nested objects and various field patterns
### 4. Error Response Sanitization
-In `src/controllers/githubController.js`:
+In `src/controllers/githubController.ts`:
- Only error messages (not full stack traces) are sent to GitHub
- No raw stderr/stdout is exposed in webhook responses
- Generic error messages for internal server errors

View File

@@ -258,7 +258,7 @@ The logger automatically redacts these environment variables when they appear in
### If credentials appear in logs:
1. Identify the specific pattern that wasn't caught
-2. Add the new pattern to the redaction paths in `src/utils/logger.js`
+2. Add the new pattern to the redaction paths in `src/utils/logger.ts`
3. Add a test case in the test files
4. Run tests to verify the fix
5. Deploy the updated configuration
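Step 2 above — extending the redaction patterns — can be sketched as a pure function, assuming a simple pattern-list design (the service's actual logger uses pino redaction paths; the patterns below are drawn from the credential patterns elsewhere in this commit, not the full set):

```typescript
// Illustrative pattern-based redaction. Adding a newly discovered credential
// format means appending one RegExp here, then covering it with a test.
const SENSITIVE_PATTERNS: RegExp[] = [
  /ghp_[a-zA-Z0-9]{36}/g, // GitHub personal access token
  /AKIA[0-9A-Z]{16}/g, // AWS access key ID
  /sk-[a-zA-Z0-9]{32,}/g // generic API key prefix
];

function redact(text: string): string {
  return SENSITIVE_PATTERNS.reduce(
    (out, pattern) => out.replace(pattern, '[REDACTED]'),
    text
  );
}

console.log(redact('token=ghp_' + 'a'.repeat(36))); // prints "token=[REDACTED]"
```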

View File

@@ -7,9 +7,9 @@
"build": "tsc",
"build:watch": "tsc --watch",
"start": "node dist/index.js",
"start:dev": "node src/index.js",
"dev": "ts-node src/index.js",
"dev:watch": "nodemon --exec ts-node src/index.js",
"start:dev": "node dist/index.js",
"dev": "ts-node src/index.ts",
"dev:watch": "nodemon --exec ts-node src/index.ts",
"clean": "rm -rf dist",
"typecheck": "tsc --noEmit",
"test": "jest",

View File

@@ -13,4 +13,4 @@ fi
# Start the server with the specified port
echo "Starting server on port $DEFAULT_PORT..."
-PORT=$DEFAULT_PORT node src/index.js
+PORT=$DEFAULT_PORT node dist/index.js

File diff suppressed because it is too large

View File

@@ -1,144 +0,0 @@
require('dotenv').config();
const express = require('express');
const bodyParser = require('body-parser');
const { createLogger } = require('./utils/logger');
const { StartupMetrics } = require('./utils/startup-metrics');
const githubRoutes = require('./routes/github');
const claudeRoutes = require('./routes/claude');
const chatbotRoutes = require('./routes/chatbot');
const app = express();
const PORT = process.env.PORT || 3003;
const appLogger = createLogger('app');
const startupMetrics = new StartupMetrics();
// Record initial milestones
startupMetrics.recordMilestone('env_loaded', 'Environment variables loaded');
startupMetrics.recordMilestone('express_initialized', 'Express app initialized');
// Request logging middleware
app.use((req, res, next) => {
const startTime = Date.now();
res.on('finish', () => {
const responseTime = Date.now() - startTime;
appLogger.info(
{
method: req.method,
url: req.url,
statusCode: res.statusCode,
responseTime: `${responseTime}ms`
},
`${req.method} ${req.url}`
);
});
next();
});
// Middleware
app.use(startupMetrics.metricsMiddleware());
app.use(
bodyParser.json({
verify: (req, res, buf) => {
// Store the raw body buffer for webhook signature verification
req.rawBody = buf;
}
})
);
startupMetrics.recordMilestone('middleware_configured', 'Express middleware configured');
// Routes
app.use('/api/webhooks/github', githubRoutes);
app.use('/api/claude', claudeRoutes);
app.use('/api/webhooks/chatbot', chatbotRoutes);
startupMetrics.recordMilestone('routes_configured', 'API routes configured');
// Health check endpoint
app.get('/health', async (req, res) => {
const healthCheckStart = Date.now();
const checks = {
status: 'ok',
timestamp: new Date().toISOString(),
startup: req.startupMetrics,
docker: {
available: false,
error: null,
checkTime: null
},
claudeCodeImage: {
available: false,
error: null,
checkTime: null
}
};
// Check Docker availability
const dockerCheckStart = Date.now();
try {
const { execSync } = require('child_process');
execSync('docker ps', { stdio: 'ignore' });
checks.docker.available = true;
} catch (error) {
checks.docker.error = error.message;
}
checks.docker.checkTime = Date.now() - dockerCheckStart;
// Check Claude Code runner image
const imageCheckStart = Date.now();
try {
const { execSync } = require('child_process');
execSync('docker image inspect claude-code-runner:latest', { stdio: 'ignore' });
checks.claudeCodeImage.available = true;
} catch {
checks.claudeCodeImage.error = 'Image not found';
}
checks.claudeCodeImage.checkTime = Date.now() - imageCheckStart;
// Set overall status
if (!checks.docker.available || !checks.claudeCodeImage.available) {
checks.status = 'degraded';
}
checks.healthCheckDuration = Date.now() - healthCheckStart;
res.status(200).json(checks);
});
// Test endpoint for CF tunnel
app.get('/api/test-tunnel', (req, res) => {
appLogger.info('Test tunnel endpoint hit');
res.status(200).json({
status: 'success',
message: 'CF tunnel is working!',
timestamp: new Date().toISOString(),
headers: req.headers,
ip: req.ip || req.connection.remoteAddress
});
});
// Error handling middleware
app.use((err, req, res, _next) => {
appLogger.error(
{
err: {
message: err.message,
stack: err.stack
},
method: req.method,
url: req.url
},
'Request error'
);
res.status(500).json({ error: 'Internal server error' });
});
app.listen(PORT, () => {
startupMetrics.recordMilestone('server_listening', `Server listening on port ${PORT}`);
const totalStartupTime = startupMetrics.markReady();
appLogger.info(`Server running on port ${PORT} (startup took ${totalStartupTime}ms)`);
});

View File

@@ -1,106 +0,0 @@
const express = require('express');
const router = express.Router();
const claudeService = require('../services/claudeService');
const { createLogger } = require('../utils/logger');
const logger = createLogger('claudeRoutes');
/**
* Direct endpoint for Claude processing
* Allows calling Claude without GitHub webhook integration
*/
router.post('/', async (req, res) => {
logger.info({ request: req.body }, 'Received direct Claude request');
try {
const { repoFullName, repository, command, authToken, useContainer = false } = req.body;
// Handle both repoFullName and repository parameters
const repoName = repoFullName || repository;
// Validate required parameters
if (!repoName) {
logger.warn('Missing repository name in request');
return res.status(400).json({ error: 'Repository name is required' });
}
if (!command) {
logger.warn('Missing command in request');
return res.status(400).json({ error: 'Command is required' });
}
// Validate authentication if enabled
if (process.env.CLAUDE_API_AUTH_REQUIRED === '1') {
if (!authToken || authToken !== process.env.CLAUDE_API_AUTH_TOKEN) {
logger.warn('Invalid authentication token');
return res.status(401).json({ error: 'Invalid authentication token' });
}
}
logger.info(
{
repo: repoName,
commandLength: command.length,
useContainer
},
'Processing direct Claude command'
);
// Process the command with Claude
let claudeResponse;
try {
claudeResponse = await claudeService.processCommand({
repoFullName: repoName,
issueNumber: null, // No issue number for direct calls
command,
isPullRequest: false,
branchName: null
});
logger.debug(
{
responseType: typeof claudeResponse,
responseLength: claudeResponse ? claudeResponse.length : 0
},
'Raw Claude response received'
);
// Force a default response if empty
if (!claudeResponse || claudeResponse.trim() === '') {
claudeResponse =
'No output received from Claude container. This is a placeholder response.';
}
} catch (processingError) {
logger.error({ error: processingError }, 'Error during Claude processing');
claudeResponse = `Error: ${processingError.message}`;
}
logger.info(
{
responseLength: claudeResponse ? claudeResponse.length : 0
},
'Successfully processed Claude command'
);
return res.status(200).json({
message: 'Command processed successfully',
response: claudeResponse
});
} catch (error) {
logger.error(
{
err: {
message: error.message,
stack: error.stack
}
},
'Error processing direct Claude command'
);
return res.status(500).json({
error: 'Failed to process command',
message: error.message
});
}
});
module.exports = router;

View File

@@ -1,8 +0,0 @@
const express = require('express');
const router = express.Router();
const githubController = require('../controllers/githubController');
// GitHub webhook endpoint
router.post('/', githubController.handleWebhook);
module.exports = router;

View File

@@ -1,603 +0,0 @@
const { execFileSync } = require('child_process');
const path = require('path');
// const os = require('os');
const { createLogger } = require('../utils/logger');
// const awsCredentialProvider = require('../utils/awsCredentialProvider');
const { sanitizeBotMentions } = require('../utils/sanitize');
const secureCredentials = require('../utils/secureCredentials');
const logger = createLogger('claudeService');
// Get bot username from environment variables - required
const BOT_USERNAME = process.env.BOT_USERNAME;
// Validate bot username is set
if (!BOT_USERNAME) {
logger.error(
'BOT_USERNAME environment variable is not set in claudeService. This is required to prevent infinite loops.'
);
throw new Error('BOT_USERNAME environment variable is required');
}
// Using the shared sanitization utility from utils/sanitize.js
/**
* Processes a command using Claude Code CLI
*
* @param {Object} options - The options for processing the command
* @param {string} options.repoFullName - The full name of the repository (owner/repo)
* @param {number|null} options.issueNumber - The issue number (can be null for direct API calls)
* @param {string} options.command - The command to process with Claude
* @param {boolean} [options.isPullRequest=false] - Whether this is a pull request
* @param {string} [options.branchName] - The branch name for pull requests
* @param {string} [options.operationType='default'] - Operation type: 'auto-tagging', 'pr-review', or 'default'
* @param {Object} [options.chatbotContext] - Chatbot context for non-repository commands
* @returns {Promise<string>} - Claude's response
*/
async function processCommand({
repoFullName,
issueNumber,
command,
isPullRequest = false,
branchName = null,
operationType = 'default',
chatbotContext = null
}) {
try {
logger.info(
{
repo: repoFullName,
issue: issueNumber,
isPullRequest,
branchName,
commandLength: command.length,
chatbotProvider: chatbotContext?.provider,
chatbotUser: chatbotContext?.userId
},
'Processing command with Claude'
);
const githubToken = secureCredentials.get('GITHUB_TOKEN');
// In test mode, skip execution and return a mock response
if (process.env.NODE_ENV === 'test' || !githubToken || !githubToken.includes('ghp_')) {
logger.info(
{
repo: repoFullName,
issue: issueNumber
},
'TEST MODE: Skipping Claude execution'
);
// Create a test response and sanitize it
const testResponse = `Hello! I'm Claude responding to your request.
Since this is a test environment, I'm providing a simulated response. In production, I would:
1. Clone the repository ${repoFullName}
2. ${isPullRequest ? `Checkout PR branch: ${branchName}` : 'Use the main branch'}
3. Analyze the codebase and execute: "${command}"
4. Use GitHub CLI to interact with issues, PRs, and comments
For real functionality, please configure valid GitHub and Claude API tokens.`;
// Always sanitize responses, even in test mode
return sanitizeBotMentions(testResponse);
}
// Build Docker image if it doesn't exist
const dockerImageName = process.env.CLAUDE_CONTAINER_IMAGE || 'claude-code-runner:latest';
try {
execFileSync('docker', ['inspect', dockerImageName], { stdio: 'ignore' });
logger.info({ dockerImageName }, 'Docker image already exists');
} catch (_e) {
logger.info({ dockerImageName }, 'Building Docker image for Claude Code runner');
execFileSync('docker', ['build', '-f', 'Dockerfile.claudecode', '-t', dockerImageName, '.'], {
cwd: path.join(__dirname, '../..'),
stdio: 'pipe'
});
}
// Select appropriate entrypoint script based on operation type
let entrypointScript;
switch (operationType) {
case 'auto-tagging':
entrypointScript = '/scripts/runtime/claudecode-tagging-entrypoint.sh';
logger.info({ operationType }, 'Using minimal tools for auto-tagging operation');
break;
case 'pr-review':
case 'default':
default:
entrypointScript = '/scripts/runtime/claudecode-entrypoint.sh';
logger.info({ operationType }, 'Using full tool set for standard operation');
break;
}
// Create unique container name (sanitized to prevent command injection)
const sanitizedIdentifier = chatbotContext
? `chatbot-${chatbotContext.provider}-${chatbotContext.userId}`.replace(/[^a-zA-Z0-9\-_]/g, '-')
: repoFullName.replace(/[^a-zA-Z0-9\-_]/g, '-');
const containerName = `claude-${sanitizedIdentifier}-${Date.now()}`;
// Create the full prompt with context and instructions based on operation type
let fullPrompt;
if (chatbotContext) {
// Handle chatbot-specific commands (Discord, Slack, etc.)
fullPrompt = `You are Claude, an AI assistant responding to a user via ${chatbotContext.provider} chatbot.
**Context:**
- Platform: ${chatbotContext.provider}
- User: ${chatbotContext.username} (ID: ${chatbotContext.userId})
- Channel: ${chatbotContext.channelId || 'Direct message'}
- Running in: Standalone chatbot mode
**Important Instructions:**
1. This is a general chatbot interaction, not repository-specific
2. You can help with coding questions, explanations, debugging, and general assistance
3. If the user asks about repository operations, let them know they need to mention you in a GitHub issue/PR
4. Be helpful, concise, and friendly
5. Format your response appropriately for ${chatbotContext.provider}
6. You have access to general tools but not repository-specific operations
**User Request:**
${command}
Please respond helpfully to this ${chatbotContext.provider} user.`;
} else if (operationType === 'auto-tagging') {
fullPrompt = `You are Claude, an AI assistant analyzing a GitHub issue for automatic label assignment.
**Context:**
- Repository: ${repoFullName}
- Issue Number: #${issueNumber}
- Operation: Auto-tagging (Read-only + Label assignment)
**Available Tools:**
- Read: Access repository files and issue content
- GitHub: Use 'gh' CLI for label operations only
**Task:**
Analyze the issue and apply appropriate labels using GitHub CLI commands. Use these categories:
- Priority: critical, high, medium, low
- Type: bug, feature, enhancement, documentation, question, security
- Complexity: trivial, simple, moderate, complex
- Component: api, frontend, backend, database, auth, webhook, docker
**Process:**
1. First run 'gh label list' to see available labels
2. Analyze the issue content
3. Use 'gh issue edit ${issueNumber} --add-label "label1,label2,label3"' to apply labels
4. Do NOT comment on the issue - only apply labels
**User Request:**
${command}
Complete the auto-tagging task using only the minimal required tools.`;
} else {
fullPrompt = `You are Claude, an AI assistant responding to a GitHub ${isPullRequest ? 'pull request' : 'issue'} via the ${BOT_USERNAME} webhook.
**Context:**
- Repository: ${repoFullName}
- ${isPullRequest ? 'Pull Request' : 'Issue'} Number: #${issueNumber}
- Current Branch: ${branchName || 'main'}
- Running in: Unattended mode
**Important Instructions:**
1. You have full GitHub CLI access via the 'gh' command
2. When writing code:
- Always create a feature branch for new work
- Make commits with descriptive messages
- Push your work to the remote repository
- Run all tests and ensure they pass
- Fix any linting or type errors
- Create a pull request if appropriate
3. Iterate until the task is complete - don't stop at partial solutions
4. Always check in your work by pushing to the remote before finishing
5. Use 'gh issue comment' or 'gh pr comment' to provide updates on your progress
6. If you encounter errors, debug and fix them before completing
7. **IMPORTANT - Markdown Formatting:**
- When your response contains markdown (like headers, lists, code blocks), return it as properly formatted markdown
- Do NOT escape or encode special characters like newlines (\\n) or quotes
- Return clean, human-readable markdown that GitHub will render correctly
- Your response should look like normal markdown text, not escaped strings
8. **Request Acknowledgment:**
- For larger or complex tasks that will take significant time, first acknowledge the request
- Post a brief comment like "I understand. Working on [task description]..." before starting
- Use 'gh issue comment' or 'gh pr comment' to post this acknowledgment immediately
- This lets the user know their request was received and is being processed
**User Request:**
${command}
Please complete this task fully and autonomously.`;
}
// Prepare environment variables for the container
const envVars = {
REPO_FULL_NAME: repoFullName || '',
ISSUE_NUMBER: issueNumber || '',
IS_PULL_REQUEST: isPullRequest ? 'true' : 'false',
BRANCH_NAME: branchName || '',
OPERATION_TYPE: operationType,
COMMAND: fullPrompt,
GITHUB_TOKEN: githubToken,
ANTHROPIC_API_KEY: secureCredentials.get('ANTHROPIC_API_KEY'),
CHATBOT_PROVIDER: chatbotContext?.provider || '',
CHATBOT_USER_ID: chatbotContext?.userId || '',
CHATBOT_USERNAME: chatbotContext?.username || ''
};
// Note: Environment variables will be added as separate arguments to docker command
// This is safer than building a shell command string
// Run the container
logger.info(
{
containerName,
repo: repoFullName,
isPullRequest,
branch: branchName
},
'Starting Claude Code container'
);
// Build docker run command as an array to prevent command injection
const dockerArgs = ['run', '--rm'];
// Apply container security constraints based on environment variables
if (process.env.CLAUDE_CONTAINER_PRIVILEGED === 'true') {
dockerArgs.push('--privileged');
} else {
// Apply only necessary capabilities instead of privileged mode
const requiredCapabilities = [
'NET_ADMIN', // Required for firewall setup
'SYS_ADMIN' // Required for certain filesystem operations
];
// Add optional capabilities
const optionalCapabilities = {
NET_RAW: process.env.CLAUDE_CONTAINER_CAP_NET_RAW === 'true',
SYS_TIME: process.env.CLAUDE_CONTAINER_CAP_SYS_TIME === 'true',
DAC_OVERRIDE: process.env.CLAUDE_CONTAINER_CAP_DAC_OVERRIDE === 'true',
AUDIT_WRITE: process.env.CLAUDE_CONTAINER_CAP_AUDIT_WRITE === 'true'
};
// Add required capabilities
requiredCapabilities.forEach(cap => {
dockerArgs.push(`--cap-add=${cap}`);
});
// Add optional capabilities if enabled
Object.entries(optionalCapabilities).forEach(([cap, enabled]) => {
if (enabled) {
dockerArgs.push(`--cap-add=${cap}`);
}
});
// Add resource limits
dockerArgs.push(
'--memory',
process.env.CLAUDE_CONTAINER_MEMORY_LIMIT || '2g',
'--cpu-shares',
process.env.CLAUDE_CONTAINER_CPU_SHARES || '1024',
'--pids-limit',
process.env.CLAUDE_CONTAINER_PIDS_LIMIT || '256'
);
}
// Add container name
dockerArgs.push('--name', containerName);
// Add environment variables as separate arguments
Object.entries(envVars)
.filter(([_, value]) => value !== undefined && value !== '')
.forEach(([key, value]) => {
// For long commands, we need to pass them differently
// Docker doesn't support reading env values from files with @ syntax
if (key === 'COMMAND' && String(value).length > 500) {
// We'll pass the command via stdin or mount it as a volume
// For now, let's just pass it directly but properly escaped
dockerArgs.push('-e', `${key}=${String(value)}`);
} else {
dockerArgs.push('-e', `${key}=${String(value)}`);
}
});
// Add the image name and custom entrypoint
dockerArgs.push('--entrypoint', entrypointScript, dockerImageName);
// Create sanitized version for logging (remove sensitive values)
const sanitizedArgs = dockerArgs.map(arg => {
if (typeof arg !== 'string') return arg;
// Check if this is an environment variable assignment
const envMatch = arg.match(/^([A-Z_]+)=(.*)$/);
if (envMatch) {
const envKey = envMatch[1];
const sensitiveKeys = [
'GITHUB_TOKEN',
'ANTHROPIC_API_KEY',
'AWS_ACCESS_KEY_ID',
'AWS_SECRET_ACCESS_KEY',
'AWS_SESSION_TOKEN'
];
if (sensitiveKeys.includes(envKey)) {
return `${envKey}=[REDACTED]`;
}
// For the command, also redact to avoid logging the full command
if (envKey === 'COMMAND') {
return `${envKey}=[COMMAND_CONTENT]`;
}
}
return arg;
});
try {
logger.info({ dockerArgs: sanitizedArgs }, 'Executing Docker command');
// No longer using temp files for commands
// Get container lifetime from environment variable or use default (2 hours)
const containerLifetimeMs = parseInt(process.env.CONTAINER_LIFETIME_MS, 10) || 7200000; // 2 hours in milliseconds
logger.info({ containerLifetimeMs }, 'Setting container lifetime');
// Use promisified version of child_process.execFile (safer than exec)
const { promisify } = require('util');
const execFileAsync = promisify(require('child_process').execFile);
const result = await execFileAsync('docker', dockerArgs, {
maxBuffer: 10 * 1024 * 1024, // 10MB buffer
timeout: containerLifetimeMs // Container lifetime in milliseconds
});
// No cleanup needed anymore
let responseText = result.stdout.trim();
// Check for empty response
if (!responseText) {
logger.warn(
{
containerName,
repo: repoFullName,
issue: issueNumber
},
'Empty response from Claude Code container'
);
// Try to get container logs as the response instead
try {
responseText = execFileSync('docker', ['logs', containerName], {
encoding: 'utf8',
maxBuffer: 1024 * 1024,
stdio: ['pipe', 'pipe', 'pipe']
});
logger.info('Retrieved response from container logs');
} catch (e) {
logger.error(
{
error: e.message,
containerName
},
'Failed to get container logs as fallback'
);
}
}
// Sanitize response to prevent infinite loops by removing bot mentions
responseText = sanitizeBotMentions(responseText);
logger.info(
{
repo: repoFullName,
issue: issueNumber,
responseLength: responseText.length,
containerName,
stdout: responseText.substring(0, 500) // Log first 500 chars
},
'Claude Code execution completed successfully'
);
return responseText;
} catch (error) {
// No cleanup needed - we're not using temp files anymore
// Sanitize stderr and stdout to remove any potential credentials
const sanitizeOutput = output => {
if (!output) return output;
// Import the sanitization utility
let sanitized = output.toString();
// Sensitive values to redact
const sensitiveValues = [
githubToken,
secureCredentials.get('ANTHROPIC_API_KEY'),
envVars.AWS_ACCESS_KEY_ID,
envVars.AWS_SECRET_ACCESS_KEY,
envVars.AWS_SESSION_TOKEN
].filter(val => val && val.length > 0);
// Redact specific sensitive values first
sensitiveValues.forEach(value => {
if (value) {
// Convert to string and escape regex special characters
const stringValue = String(value);
// Escape regex special characters
const escapedValue = stringValue.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
sanitized = sanitized.replace(new RegExp(escapedValue, 'g'), '[REDACTED]');
}
});
// Then apply pattern-based redaction for any missed credentials
const sensitivePatterns = [
/AKIA[0-9A-Z]{16}/g, // AWS Access Key pattern
/[a-zA-Z0-9/+=]{40}/g, // AWS Secret Key pattern
/sk-[a-zA-Z0-9]{32,}/g, // API key pattern
/github_pat_[a-zA-Z0-9_]{82}/g, // GitHub fine-grained token pattern
/ghp_[a-zA-Z0-9]{36}/g // GitHub personal access token pattern
];
sensitivePatterns.forEach(pattern => {
sanitized = sanitized.replace(pattern, '[REDACTED]');
});
return sanitized;
};
// Check for specific error types
const errorMsg = error.message || '';
const errorOutput = error.stderr ? error.stderr.toString() : '';
// Check if this is a docker image not found error
if (
errorOutput.includes('Unable to find image') ||
errorMsg.includes('Unable to find image')
) {
logger.error('Docker image not found. Attempting to rebuild...');
try {
execFileSync(
'docker',
['build', '-f', 'Dockerfile.claudecode', '-t', dockerImageName, '.'],
{
cwd: path.join(__dirname, '../..'),
stdio: 'pipe'
}
);
logger.info('Successfully rebuilt Docker image');
} catch (rebuildError) {
logger.error(
{
error: rebuildError.message
},
'Failed to rebuild Docker image'
);
}
}
logger.error(
{
error: error.message,
stderr: sanitizeOutput(error.stderr),
stdout: sanitizeOutput(error.stdout),
containerName,
dockerArgs: sanitizedArgs
},
'Error running Claude Code container'
);
// Try to get container logs for debugging
try {
const logs = execFileSync('docker', ['logs', containerName], {
encoding: 'utf8',
maxBuffer: 1024 * 1024,
stdio: ['pipe', 'pipe', 'pipe']
});
logger.error({ containerLogs: logs }, 'Container logs');
} catch (e) {
logger.error({ error: e.message }, 'Failed to get container logs');
}
// Try to clean up the container if it's still running
try {
execFileSync('docker', ['kill', containerName], { stdio: 'ignore' });
} catch {
// Container might already be stopped
}
// Generate an error ID for log correlation
const timestamp = new Date().toISOString();
const errorId = `err-${Math.random().toString(36).substring(2, 10)}`;
// Log the detailed error with full context
const sanitizedStderr = sanitizeOutput(error.stderr);
const sanitizedStdout = sanitizeOutput(error.stdout);
logger.error(
{
errorId,
timestamp,
error: error.message,
stderr: sanitizedStderr,
stdout: sanitizedStdout,
containerName,
dockerArgs: sanitizedArgs,
repo: repoFullName,
issue: issueNumber
},
'Claude Code container execution failed (with error reference)'
);
// Throw a generic error with reference ID, but without sensitive details
const errorMessage = sanitizeBotMentions(
`Error executing Claude command (Reference: ${errorId}, Time: ${timestamp})`
);
throw new Error(errorMessage);
}
} catch (error) {
// Sanitize the error message to remove any credentials
const sanitizeMessage = message => {
if (!message) return message;
let sanitized = message;
const sensitivePatterns = [
/AWS_ACCESS_KEY_ID="[^"]+"/g,
/AWS_SECRET_ACCESS_KEY="[^"]+"/g,
/AWS_SESSION_TOKEN="[^"]+"/g,
/GITHUB_TOKEN="[^"]+"/g,
/ANTHROPIC_API_KEY="[^"]+"/g,
/AKIA[0-9A-Z]{16}/g, // AWS Access Key pattern
/[a-zA-Z0-9/+=]{40}/g, // AWS Secret Key pattern
/sk-[a-zA-Z0-9]{32,}/g, // API key pattern
/github_pat_[a-zA-Z0-9_]{82}/g, // GitHub fine-grained token pattern
/ghp_[a-zA-Z0-9]{36}/g // GitHub personal access token pattern
];
sensitivePatterns.forEach(pattern => {
sanitized = sanitized.replace(pattern, '[REDACTED]');
});
return sanitized;
};
logger.error(
{
err: {
message: sanitizeMessage(error.message),
stack: sanitizeMessage(error.stack)
},
repo: repoFullName,
issue: issueNumber
},
'Error processing command with Claude'
);
// Generate an error ID for log correlation
const timestamp = new Date().toISOString();
const errorId = `err-${Math.random().toString(36).substring(2, 10)}`;
// Log the sanitized error with its ID for correlation
const sanitizedErrorMessage = sanitizeMessage(error.message);
const sanitizedErrorStack = error.stack ? sanitizeMessage(error.stack) : null;
logger.error(
{
errorId,
timestamp,
error: sanitizedErrorMessage,
stack: sanitizedErrorStack,
repo: repoFullName,
issue: issueNumber
},
'General error in Claude service (with error reference)'
);
// Throw a generic error with reference ID, but without sensitive details
const errorMessage = sanitizeBotMentions(
`Error processing Claude command (Reference: ${errorId}, Time: ${timestamp})`
);
throw new Error(errorMessage);
}
}
module.exports = {
processCommand
};
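The redaction patterns in the catch block above can be exercised on their own. A minimal stand-alone sketch (the `redactSecrets` helper and the sample string are illustrative, not part of the service):

```javascript
// Hypothetical stand-alone version of the redaction logic used above.
const SENSITIVE_PATTERNS = [
  /AWS_ACCESS_KEY_ID="[^"]+"/g,
  /AKIA[0-9A-Z]{16}/g, // AWS access key IDs
  /ghp_[a-zA-Z0-9]{36}/g // GitHub classic personal access tokens
];

function redactSecrets(message) {
  if (!message) return message;
  // Apply each pattern in turn, replacing matches with a placeholder
  return SENSITIVE_PATTERNS.reduce(
    (sanitized, pattern) => sanitized.replace(pattern, '[REDACTED]'),
    message
  );
}

// An AKIA-style key is replaced; ordinary text is untouched.
console.log(redactSecrets('key AKIAIOSFODNN7EXAMPLE leaked'));
// -> "key [REDACTED] leaked"
```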


@@ -1,657 +0,0 @@
const { Octokit } = require('@octokit/rest');
const { createLogger } = require('../utils/logger');
const secureCredentials = require('../utils/secureCredentials');
const logger = createLogger('githubService');
// Create Octokit instance (lazy initialization)
let octokit = null;
function getOctokit() {
if (!octokit) {
const githubToken = secureCredentials.get('GITHUB_TOKEN');
if (githubToken && githubToken.includes('ghp_')) {
octokit = new Octokit({
auth: githubToken,
userAgent: 'Claude-GitHub-Webhook'
});
}
}
return octokit;
}
/**
* Posts a comment to a GitHub issue or pull request
*/
async function postComment({ repoOwner, repoName, issueNumber, body }) {
try {
// Validate parameters to prevent SSRF
const validated = validateGitHubParams(repoOwner, repoName, issueNumber);
logger.info(
{
repo: `${repoOwner}/${repoName}`,
issue: issueNumber,
bodyLength: body.length
},
'Posting comment to GitHub'
);
// In test mode, just log the comment instead of posting to GitHub
const client = getOctokit();
if (process.env.NODE_ENV === 'test' || !client) {
logger.info(
{
repo: `${repoOwner}/${repoName}`,
issue: issueNumber,
bodyPreview: body.substring(0, 100) + (body.length > 100 ? '...' : '')
},
'TEST MODE: Would post comment to GitHub'
);
return {
id: 'test-comment-id',
body: body,
created_at: new Date().toISOString()
};
}
// Use Octokit to create comment
const { data } = await client.issues.createComment({
owner: validated.repoOwner,
repo: validated.repoName,
issue_number: validated.issueNumber,
body: body
});
logger.info(
{
repo: `${repoOwner}/${repoName}`,
issue: issueNumber,
commentId: data.id
},
'Comment posted successfully'
);
return data;
} catch (error) {
logger.error(
{
err: {
message: error.message,
responseData: error.response?.data
},
repo: `${repoOwner}/${repoName}`,
issue: issueNumber
},
'Error posting comment to GitHub'
);
throw new Error(`Failed to post comment: ${error.message}`);
}
}
/**
* Validates GitHub repository and issue parameters to prevent SSRF
*/
function validateGitHubParams(repoOwner, repoName, issueNumber) {
// Validate repoOwner and repoName contain only safe characters
const repoPattern = /^[a-zA-Z0-9._-]+$/;
if (!repoPattern.test(repoOwner) || !repoPattern.test(repoName)) {
throw new Error('Invalid repository owner or name - contains unsafe characters');
}
// Validate issueNumber is a positive integer
const issueNum = parseInt(issueNumber, 10);
if (!Number.isInteger(issueNum) || issueNum <= 0) {
throw new Error('Invalid issue number - must be a positive integer');
}
return { repoOwner, repoName, issueNumber: issueNum };
}
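The guard above can be exercised in isolation. A stand-alone sketch (sample values are hypothetical):

```javascript
// Stand-alone copy of the SSRF guard above, for illustration.
function validateGitHubParams(repoOwner, repoName, issueNumber) {
  // Owner and repo may only contain alphanumerics, dots, underscores, hyphens
  const repoPattern = /^[a-zA-Z0-9._-]+$/;
  if (!repoPattern.test(repoOwner) || !repoPattern.test(repoName)) {
    throw new Error('Invalid repository owner or name - contains unsafe characters');
  }
  // Issue number must parse to a positive integer
  const issueNum = parseInt(issueNumber, 10);
  if (!Number.isInteger(issueNum) || issueNum <= 0) {
    throw new Error('Invalid issue number - must be a positive integer');
  }
  return { repoOwner, repoName, issueNumber: issueNum };
}

console.log(validateGitHubParams('octocat', 'hello-world', '42'));
// -> { repoOwner: 'octocat', repoName: 'hello-world', issueNumber: 42 }
```

A path-traversal attempt such as `'a/b'` for the owner, or a non-positive issue number, is rejected before any GitHub API call is constructed.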
/**
* Adds labels to a GitHub issue
*/
async function addLabelsToIssue({ repoOwner, repoName, issueNumber, labels }) {
try {
// Validate parameters to prevent SSRF
const validated = validateGitHubParams(repoOwner, repoName, issueNumber);
logger.info(
{
repo: `${repoOwner}/${repoName}`,
issue: issueNumber,
labelCount: labels.length
},
'Adding labels to GitHub issue'
);
// In test mode, just log the labels instead of applying to GitHub
const client = getOctokit();
if (process.env.NODE_ENV === 'test' || !client) {
logger.info(
{
repo: `${repoOwner}/${repoName}`,
issue: issueNumber,
labelCount: labels.length
},
'TEST MODE: Would add labels to GitHub issue'
);
return {
added_labels: labels,
timestamp: new Date().toISOString()
};
}
// Use Octokit to add labels
const { data } = await client.issues.addLabels({
owner: validated.repoOwner,
repo: validated.repoName,
issue_number: validated.issueNumber,
labels: labels
});
logger.info(
{
repo: `${repoOwner}/${repoName}`,
issue: issueNumber,
appliedLabels: data.map(label => label.name)
},
'Labels added successfully'
);
return data;
} catch (error) {
logger.error(
{
err: {
message: error.message,
responseData: error.response?.data
},
repo: `${repoOwner}/${repoName}`,
issue: issueNumber,
labelCount: labels.length
},
'Error adding labels to GitHub issue'
);
throw new Error(`Failed to add labels: ${error.message}`);
}
}
/**
* Creates repository labels if they don't exist
*/
async function createRepositoryLabels({ repoOwner, repoName, labels }) {
try {
// Validate repository parameters to prevent SSRF
const repoPattern = /^[a-zA-Z0-9._-]+$/;
if (!repoPattern.test(repoOwner) || !repoPattern.test(repoName)) {
throw new Error('Invalid repository owner or name - contains unsafe characters');
}
logger.info(
{
repo: `${repoOwner}/${repoName}`,
labelCount: labels.length
},
'Creating repository labels'
);
// In test mode, just log the operation
const client = getOctokit();
if (process.env.NODE_ENV === 'test' || !client) {
logger.info(
{
repo: `${repoOwner}/${repoName}`,
labels: labels
},
'TEST MODE: Would create repository labels'
);
return labels;
}
const createdLabels = [];
for (const label of labels) {
try {
// Use Octokit to create label
const { data } = await client.issues.createLabel({
owner: repoOwner,
repo: repoName,
name: label.name,
color: label.color,
description: label.description
});
createdLabels.push(data);
logger.debug({ labelName: label.name }, 'Label created successfully');
} catch (error) {
// Label might already exist - check if it's a 422 (Unprocessable Entity)
if (error.status === 422) {
logger.debug({ labelName: label.name }, 'Label already exists, skipping');
} else {
logger.warn(
{
err: error.message,
labelName: label.name
},
'Failed to create label'
);
}
}
}
return createdLabels;
} catch (error) {
logger.error(
{
err: error.message,
repo: `${repoOwner}/${repoName}`
},
'Error creating repository labels'
);
throw new Error(`Failed to create labels: ${error.message}`);
}
}
/**
* Provides fallback labels based on simple keyword matching
*/
async function getFallbackLabels(title, body) {
const content = `${title} ${body || ''}`.toLowerCase();
const labels = [];
// Type detection - check documentation first for specificity
if (
content.includes(' doc ') ||
content.includes('docs') ||
content.includes('readme') ||
content.includes('documentation')
) {
labels.push('type:documentation');
} else if (
content.includes('bug') ||
content.includes('error') ||
content.includes('issue') ||
content.includes('problem')
) {
labels.push('type:bug');
} else if (content.includes('feature') || content.includes('add') || content.includes('new')) {
labels.push('type:feature');
} else if (
content.includes('improve') ||
content.includes('enhance') ||
content.includes('better')
) {
labels.push('type:enhancement');
} else if (content.includes('question') || content.includes('help') || content.includes('how')) {
labels.push('type:question');
}
// Priority detection
if (
content.includes('critical') ||
content.includes('urgent') ||
content.includes('security') ||
content.includes('down')
) {
labels.push('priority:critical');
} else if (content.includes('important') || content.includes('high')) {
labels.push('priority:high');
} else {
labels.push('priority:medium');
}
// Component detection
if (content.includes('api') || content.includes('endpoint')) {
labels.push('component:api');
} else if (
content.includes('ui') ||
content.includes('frontend') ||
content.includes('interface')
) {
labels.push('component:frontend');
} else if (content.includes('backend') || content.includes('server')) {
labels.push('component:backend');
} else if (content.includes('database') || content.includes('db')) {
labels.push('component:database');
} else if (
content.includes('auth') ||
content.includes('login') ||
content.includes('permission')
) {
labels.push('component:auth');
} else if (content.includes('webhook') || content.includes('github')) {
labels.push('component:webhook');
} else if (content.includes('docker') || content.includes('container')) {
labels.push('component:docker');
}
return labels;
}
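The keyword fallback above can be condensed into a table-driven sketch. This is an illustrative miniature with only a few of the rules, not the full rule set, and `fallbackTypeLabel` is a hypothetical name:

```javascript
// Hypothetical condensed form of the first-match keyword fallback above.
function fallbackTypeLabel(title, body = '') {
  const content = `${title} ${body}`.toLowerCase();
  // Ordered rules: documentation is checked first for specificity,
  // mirroring the ordering in the full implementation.
  const rules = [
    ['type:documentation', ['docs', 'readme', 'documentation']],
    ['type:bug', ['bug', 'error', 'problem']],
    ['type:feature', ['feature', 'add', 'new']]
  ];
  for (const [label, keywords] of rules) {
    if (keywords.some(k => content.includes(k))) return label;
  }
  return null;
}

console.log(fallbackTypeLabel('Fix crash: error on startup')); // -> "type:bug"
```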
/**
* Gets the combined status for a specific commit/ref
* Used to verify all required status checks have passed
*/
async function getCombinedStatus({ repoOwner, repoName, ref }) {
try {
// Validate parameters to prevent SSRF
const repoPattern = /^[a-zA-Z0-9._-]+$/;
if (!repoPattern.test(repoOwner) || !repoPattern.test(repoName)) {
throw new Error('Invalid repository owner or name - contains unsafe characters');
}
// Validate ref (commit SHA, branch, or tag)
const refPattern = /^[a-zA-Z0-9._/-]+$/;
if (!refPattern.test(ref)) {
throw new Error('Invalid ref - contains unsafe characters');
}
logger.info(
{
repo: `${repoOwner}/${repoName}`,
ref: ref
},
'Getting combined status from GitHub'
);
// In test mode, return a mock successful status
const client = getOctokit();
if (process.env.NODE_ENV === 'test' || !client) {
logger.info(
{
repo: `${repoOwner}/${repoName}`,
ref: ref
},
'TEST MODE: Returning mock successful combined status'
);
return {
state: 'success',
total_count: 2,
statuses: [
{ state: 'success', context: 'ci/test' },
{ state: 'success', context: 'ci/build' }
]
};
}
// Use Octokit to get combined status
const { data } = await client.repos.getCombinedStatusForRef({
owner: repoOwner,
repo: repoName,
ref: ref
});
logger.info(
{
repo: `${repoOwner}/${repoName}`,
ref: ref,
state: data.state,
totalCount: data.total_count
},
'Combined status retrieved successfully'
);
return data;
} catch (error) {
logger.error(
{
err: {
message: error.message,
status: error.response?.status,
responseData: error.response?.data
},
repo: `${repoOwner}/${repoName}`,
ref: ref
},
'Error getting combined status from GitHub'
);
throw new Error(`Failed to get combined status: ${error.message}`);
}
}
/**
* Check if we've already reviewed this PR at the given commit SHA
* @param {Object} params
* @param {string} params.repoOwner - Repository owner
* @param {string} params.repoName - Repository name
* @param {number} params.prNumber - Pull request number
* @param {string} params.commitSha - Commit SHA to check
* @returns {Promise<boolean>} True if already reviewed at this SHA
*/
async function hasReviewedPRAtCommit({ repoOwner, repoName, prNumber, commitSha }) {
try {
// Validate parameters
const repoPattern = /^[a-zA-Z0-9._-]+$/;
if (!repoPattern.test(repoOwner) || !repoPattern.test(repoName)) {
throw new Error('Invalid repository owner or name - contains unsafe characters');
}
logger.info(
{
repo: `${repoOwner}/${repoName}`,
pr: prNumber,
commitSha: commitSha
},
'Checking if PR has been reviewed at commit'
);
// In test mode, return false to allow review
const client = getOctokit();
if (process.env.NODE_ENV === 'test' || !client) {
return false;
}
// Get review comments for this PR using Octokit
const { data: reviews } = await client.pulls.listReviews({
owner: repoOwner,
repo: repoName,
pull_number: prNumber
});
// Check if any review mentions this specific commit SHA
const botUsername = process.env.BOT_USERNAME || 'ClaudeBot';
const existingReview = reviews.find(review => {
return (
review.user.login === botUsername &&
review.body &&
review.body.includes(`commit: ${commitSha}`)
);
});
return !!existingReview;
} catch (error) {
logger.error(
{
err: error.message,
repo: `${repoOwner}/${repoName}`,
pr: prNumber
},
'Failed to check for existing reviews'
);
// On error, assume not reviewed to avoid blocking reviews
return false;
}
}
/**
* Gets check suites for a specific commit
* @param {Object} params
* @param {string} params.repoOwner - Repository owner
* @param {string} params.repoName - Repository name
* @param {string} params.ref - Commit SHA or ref
* @returns {Promise<Object>} The check suites response
*/
async function getCheckSuitesForRef({ repoOwner, repoName, ref }) {
try {
// Validate parameters to prevent SSRF
const repoPattern = /^[a-zA-Z0-9._-]+$/;
if (!repoPattern.test(repoOwner) || !repoPattern.test(repoName)) {
throw new Error('Invalid repository owner or name - contains unsafe characters');
}
// Validate ref (commit SHA, branch, or tag)
const refPattern = /^[a-zA-Z0-9._/-]+$/;
if (!refPattern.test(ref)) {
throw new Error('Invalid ref - contains unsafe characters');
}
logger.info(
{
repo: `${repoOwner}/${repoName}`,
ref
},
'Getting check suites for ref'
);
// In test mode, return mock data
const client = getOctokit();
if (process.env.NODE_ENV === 'test' || !client) {
return {
total_count: 1,
check_suites: [
{
id: 12345,
app: { slug: 'github-actions', name: 'GitHub Actions' },
status: 'completed',
conclusion: 'success'
}
]
};
}
// Use Octokit's built-in method
const { data } = await client.checks.listSuitesForRef({
owner: repoOwner,
repo: repoName,
ref: ref
});
return data;
} catch (error) {
logger.error(
{
err: error.message,
repo: `${repoOwner}/${repoName}`,
ref
},
'Failed to get check suites'
);
throw error;
}
}
/**
* Add or remove labels on a pull request
* @param {Object} params
* @param {string} params.repoOwner - Repository owner
* @param {string} params.repoName - Repository name
* @param {number} params.prNumber - Pull request number
* @param {string[]} params.labelsToAdd - Labels to add
* @param {string[]} params.labelsToRemove - Labels to remove
*/
async function managePRLabels({
repoOwner,
repoName,
prNumber,
labelsToAdd = [],
labelsToRemove = []
}) {
try {
// Validate parameters
const repoPattern = /^[a-zA-Z0-9._-]+$/;
if (!repoPattern.test(repoOwner) || !repoPattern.test(repoName)) {
throw new Error('Invalid repository owner or name - contains unsafe characters');
}
// In test mode, just log
const client = getOctokit();
if (process.env.NODE_ENV === 'test' || !client) {
logger.info(
{
repo: `${repoOwner}/${repoName}`,
pr: prNumber,
labelsToAdd,
labelsToRemove
},
'TEST MODE: Would manage PR labels'
);
return;
}
// Remove labels first using Octokit
for (const label of labelsToRemove) {
try {
await client.issues.removeLabel({
owner: repoOwner,
repo: repoName,
issue_number: prNumber,
name: label
});
logger.info(
{
repo: `${repoOwner}/${repoName}`,
pr: prNumber,
label
},
'Removed label from PR'
);
} catch (error) {
// Ignore 404 errors (label not present)
if (error.status !== 404) {
logger.error(
{
err: error.message,
label
},
'Failed to remove label'
);
}
}
}
// Add new labels using Octokit
if (labelsToAdd.length > 0) {
await client.issues.addLabels({
owner: repoOwner,
repo: repoName,
issue_number: prNumber,
labels: labelsToAdd
});
logger.info(
{
repo: `${repoOwner}/${repoName}`,
pr: prNumber,
labels: labelsToAdd
},
'Added labels to PR'
);
}
} catch (error) {
logger.error(
{
err: error.message,
repo: `${repoOwner}/${repoName}`,
pr: prNumber
},
'Failed to manage PR labels'
);
throw error;
}
}
module.exports = {
postComment,
addLabelsToIssue,
createRepositoryLabels,
getFallbackLabels,
getCombinedStatus,
hasReviewedPRAtCommit,
managePRLabels,
getCheckSuitesForRef
};


@@ -1,231 +0,0 @@
const { createLogger } = require('./logger');
const logger = createLogger('awsCredentialProvider');
/**
* AWS Credential Provider for secure credential management
* Implements best practices for AWS authentication
*/
class AWSCredentialProvider {
constructor() {
this.credentials = null;
this.expirationTime = null;
this.credentialSource = null;
}
/**
* Get AWS credentials - PROFILES ONLY
*
* This method implements a caching mechanism to avoid repeatedly reading
* credential files. It checks for cached credentials first, and only reads
* from the filesystem if necessary.
*
* The cached credentials are cleared when:
* 1. clearCache() is called explicitly
* 2. When credentials expire (for temporary credentials)
*
* Static credentials from profiles don't expire, so they remain cached
* until the process ends or cache is explicitly cleared.
*
* @returns {Promise<Object>} Credential object with accessKeyId, secretAccessKey, and region
* @throws {Error} If AWS_PROFILE is not set or credential retrieval fails
*/
async getCredentials() {
if (!process.env.AWS_PROFILE) {
throw new Error('AWS_PROFILE must be set. Direct credential passing is not supported.');
}
// Return cached credentials if available and not expired
if (this.credentials && !this.isExpired()) {
logger.info('Using cached credentials');
return this.credentials;
}
logger.info('Using AWS profile authentication only');
try {
this.credentials = await this.getProfileCredentials(process.env.AWS_PROFILE);
this.credentialSource = `AWS Profile (${process.env.AWS_PROFILE})`;
return this.credentials;
} catch (error) {
logger.error({ error: error.message }, 'Failed to get AWS credentials from profile');
throw error;
}
}
/**
* Check if credentials have expired
*/
isExpired() {
if (!this.expirationTime) {
return false; // Static credentials don't expire
}
return Date.now() > this.expirationTime;
}
/**
* Check if running on EC2 instance
*/
async isEC2Instance() {
try {
const response = await fetch('http://169.254.169.254/latest/meta-data/', {
signal: AbortSignal.timeout(1000) // fetch has no `timeout` option; abort via AbortSignal instead
});
return response.ok;
} catch {
return false;
}
}
/**
* Get credentials from EC2 instance metadata
*/
async getInstanceMetadataCredentials() {
const tokenResponse = await fetch('http://169.254.169.254/latest/api/token', {
method: 'PUT',
headers: {
'X-aws-ec2-metadata-token-ttl-seconds': '21600'
},
signal: AbortSignal.timeout(1000)
});
const token = await tokenResponse.text();
const roleResponse = await fetch(
'http://169.254.169.254/latest/meta-data/iam/security-credentials/',
{
headers: {
'X-aws-ec2-metadata-token': token
},
signal: AbortSignal.timeout(1000)
}
);
const roleName = await roleResponse.text();
const credentialsResponse = await fetch(
`http://169.254.169.254/latest/meta-data/iam/security-credentials/${roleName}`,
{
headers: {
'X-aws-ec2-metadata-token': token
},
signal: AbortSignal.timeout(1000)
}
);
const credentials = await credentialsResponse.json();
this.expirationTime = new Date(credentials.Expiration).getTime();
return {
accessKeyId: credentials.AccessKeyId,
secretAccessKey: credentials.SecretAccessKey,
sessionToken: credentials.Token,
region: process.env.AWS_REGION
};
}
/**
* Get credentials from ECS container metadata
*/
async getECSCredentials() {
const uri = process.env.AWS_CONTAINER_CREDENTIALS_RELATIVE_URI;
const response = await fetch(`http://169.254.170.2${uri}`, {
signal: AbortSignal.timeout(1000)
});
const credentials = await response.json();
this.expirationTime = new Date(credentials.Expiration).getTime();
return {
accessKeyId: credentials.AccessKeyId,
secretAccessKey: credentials.SecretAccessKey,
sessionToken: credentials.Token,
region: process.env.AWS_REGION
};
}
/**
* Get credentials from AWS profile
*/
async getProfileCredentials(profileName) {
const { promises: fs } = require('fs');
const path = require('path');
const os = require('os');
const credentialsPath = path.join(os.homedir(), '.aws', 'credentials');
const configPath = path.join(os.homedir(), '.aws', 'config');
try {
// Read credentials file
const credentialsContent = await fs.readFile(credentialsPath, 'utf8');
const configContent = await fs.readFile(configPath, 'utf8');
// Parse credentials for the specific profile (escape regex metacharacters in the profile name)
const escapedProfile = profileName.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
const profileRegex = new RegExp(`\\[${escapedProfile}\\]([^\\[]*)`);
const credentialsMatch = credentialsContent.match(profileRegex);
const configMatch = configContent.match(new RegExp(`\\[profile ${escapedProfile}\\]([^\\[]*)`));
if (!credentialsMatch && !configMatch) {
throw new Error(`Profile '${profileName}' not found`);
}
const credentialsSection = credentialsMatch ? credentialsMatch[1] : '';
const configSection = configMatch ? configMatch[1] : '';
// Extract credentials
const accessKeyMatch = credentialsSection.match(/aws_access_key_id\s*=\s*(.+)/);
const secretKeyMatch = credentialsSection.match(/aws_secret_access_key\s*=\s*(.+)/);
const regionMatch = configSection.match(/region\s*=\s*(.+)/);
if (!accessKeyMatch || !secretKeyMatch) {
throw new Error(`Incomplete credentials for profile '${profileName}'`);
}
return {
accessKeyId: accessKeyMatch[1].trim(),
secretAccessKey: secretKeyMatch[1].trim(),
region: regionMatch ? regionMatch[1].trim() : process.env.AWS_REGION
};
} catch (error) {
logger.error({ error: error.message, profile: profileName }, 'Failed to read AWS profile');
throw error;
}
}
/**
* Get environment variables for Docker container
* PROFILES ONLY - No credential passing through environment variables
*/
async getDockerEnvVars() {
if (!process.env.AWS_PROFILE) {
throw new Error('AWS_PROFILE must be set. Direct credential passing is not supported.');
}
logger.info(
{
profile: process.env.AWS_PROFILE
},
'Using AWS profile authentication only'
);
return {
AWS_PROFILE: process.env.AWS_PROFILE,
AWS_REGION: process.env.AWS_REGION
};
}
/**
* Clear cached credentials (useful for testing or rotation)
*/
clearCache() {
this.credentials = null;
this.expirationTime = null;
this.credentialSource = null;
logger.info('Cleared credential cache');
}
}
// Export singleton instance
module.exports = new AWSCredentialProvider();
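The profile-file parsing in `getProfileCredentials` can be exercised without touching `~/.aws`. A stand-alone sketch (the `parseProfile` helper and the sample credentials content are hypothetical):

```javascript
// Hypothetical stand-alone version of the profile parsing above,
// operating on an in-memory credentials string.
function parseProfile(credentialsContent, profileName) {
  // Escape regex metacharacters so arbitrary profile names are safe
  const escaped = profileName.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
  const match = credentialsContent.match(new RegExp(`\\[${escaped}\\]([^\\[]*)`));
  if (!match) throw new Error(`Profile '${profileName}' not found`);
  const section = match[1];
  const accessKey = section.match(/aws_access_key_id\s*=\s*(.+)/);
  const secretKey = section.match(/aws_secret_access_key\s*=\s*(.+)/);
  if (!accessKey || !secretKey) {
    throw new Error(`Incomplete credentials for profile '${profileName}'`);
  }
  return {
    accessKeyId: accessKey[1].trim(),
    secretAccessKey: secretKey[1].trim()
  };
}

const sample = [
  '[default]',
  'aws_access_key_id = AKIAEXAMPLEKEY',
  'aws_secret_access_key = exampleSecret'
].join('\n');
console.log(parseProfile(sample, 'default').accessKeyId); // -> "AKIAEXAMPLEKEY"
```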

View File

@@ -1,423 +0,0 @@
const pino = require('pino');
const fs = require('fs');
const path = require('path');
// Create logs directory if it doesn't exist
// Use home directory for logs to avoid permission issues
const homeDir = process.env.HOME || '/tmp';
const logsDir = path.join(homeDir, '.claude-webhook', 'logs');
// eslint-disable-next-line no-sync
if (!fs.existsSync(logsDir)) {
// eslint-disable-next-line no-sync
fs.mkdirSync(logsDir, { recursive: true });
}
// Determine if we should use file transport in production
const isProduction = process.env.NODE_ENV === 'production';
const logFileName = path.join(logsDir, 'app.log');
// Configure different transports based on environment
const transport = isProduction
? {
targets: [
// File transport for production
{
target: 'pino/file',
options: { destination: logFileName, mkdir: true }
},
// Console pretty transport
{
target: 'pino-pretty',
options: {
colorize: true,
levelFirst: true,
translateTime: 'SYS:standard'
},
level: 'info'
}
]
}
: {
// Just use pretty logs in development
target: 'pino-pretty',
options: {
colorize: true,
levelFirst: true,
translateTime: 'SYS:standard'
}
};
// Configure the logger
const logger = pino({
transport,
timestamp: pino.stdTimeFunctions.isoTime,
// Include the hostname and pid in the log data
base: {
pid: process.pid,
hostname: process.env.HOSTNAME || 'unknown',
env: process.env.NODE_ENV || 'development'
},
level: process.env.LOG_LEVEL || 'info',
// Define custom log levels if needed
customLevels: {
http: 35 // Between info (30) and debug (20)
},
redact: {
paths: [
// HTTP headers that might contain credentials
'headers.authorization',
'headers["x-api-key"]',
'headers["x-auth-token"]',
'headers["x-github-token"]',
'headers.bearer',
'*.headers.authorization',
'*.headers["x-api-key"]',
'*.headers["x-auth-token"]',
'*.headers["x-github-token"]',
'*.headers.bearer',
// Generic sensitive field patterns (top-level)
'password',
'passwd',
'pass',
'token',
'secret',
'secretKey',
'secret_key',
'apiKey',
'api_key',
'credential',
'credentials',
'key',
'private',
'privateKey',
'private_key',
'auth',
'authentication',
// Generic sensitive field patterns (nested)
'*.password',
'*.passwd',
'*.pass',
'*.token',
'*.secret',
'*.secretKey',
'*.secret_key',
'*.apiKey',
'*.api_key',
'*.credential',
'*.credentials',
'*.key',
'*.private',
'*.privateKey',
'*.private_key',
'*.auth',
'*.authentication',
// Specific environment variables (top-level)
'AWS_SECRET_ACCESS_KEY',
'AWS_ACCESS_KEY_ID',
'AWS_SESSION_TOKEN',
'AWS_SECURITY_TOKEN',
'GITHUB_TOKEN',
'GH_TOKEN',
'ANTHROPIC_API_KEY',
'GITHUB_WEBHOOK_SECRET',
'WEBHOOK_SECRET',
'BOT_TOKEN',
'API_KEY',
'SECRET_KEY',
'ACCESS_TOKEN',
'REFRESH_TOKEN',
'JWT_SECRET',
'DATABASE_URL',
'DB_PASSWORD',
'REDIS_PASSWORD',
// Nested in any object (*)
'*.AWS_SECRET_ACCESS_KEY',
'*.AWS_ACCESS_KEY_ID',
'*.AWS_SESSION_TOKEN',
'*.AWS_SECURITY_TOKEN',
'*.GITHUB_TOKEN',
'*.GH_TOKEN',
'*.ANTHROPIC_API_KEY',
'*.GITHUB_WEBHOOK_SECRET',
'*.WEBHOOK_SECRET',
'*.BOT_TOKEN',
'*.API_KEY',
'*.SECRET_KEY',
'*.ACCESS_TOKEN',
'*.REFRESH_TOKEN',
'*.JWT_SECRET',
'*.DATABASE_URL',
'*.DB_PASSWORD',
'*.REDIS_PASSWORD',
// Docker-related sensitive content
'dockerCommand',
'*.dockerCommand',
'dockerArgs',
'*.dockerArgs',
'command',
'*.command',
// Environment variable containers
'envVars.AWS_SECRET_ACCESS_KEY',
'envVars.AWS_ACCESS_KEY_ID',
'envVars.AWS_SESSION_TOKEN',
'envVars.AWS_SECURITY_TOKEN',
'envVars.GITHUB_TOKEN',
'envVars.GH_TOKEN',
'envVars.ANTHROPIC_API_KEY',
'envVars.GITHUB_WEBHOOK_SECRET',
'envVars.WEBHOOK_SECRET',
'envVars.BOT_TOKEN',
'envVars.API_KEY',
'envVars.SECRET_KEY',
'envVars.ACCESS_TOKEN',
'envVars.REFRESH_TOKEN',
'envVars.JWT_SECRET',
'envVars.DATABASE_URL',
'envVars.DB_PASSWORD',
'envVars.REDIS_PASSWORD',
'env.AWS_SECRET_ACCESS_KEY',
'env.AWS_ACCESS_KEY_ID',
'env.AWS_SESSION_TOKEN',
'env.AWS_SECURITY_TOKEN',
'env.GITHUB_TOKEN',
'env.GH_TOKEN',
'env.ANTHROPIC_API_KEY',
'env.GITHUB_WEBHOOK_SECRET',
'env.WEBHOOK_SECRET',
'env.BOT_TOKEN',
'env.API_KEY',
'env.SECRET_KEY',
'env.ACCESS_TOKEN',
'env.REFRESH_TOKEN',
'env.JWT_SECRET',
'env.DATABASE_URL',
'env.DB_PASSWORD',
'env.REDIS_PASSWORD',
// Process environment variables (using bracket notation for nested objects)
'process["env"]["AWS_SECRET_ACCESS_KEY"]',
'process["env"]["AWS_ACCESS_KEY_ID"]',
'process["env"]["AWS_SESSION_TOKEN"]',
'process["env"]["AWS_SECURITY_TOKEN"]',
'process["env"]["GITHUB_TOKEN"]',
'process["env"]["GH_TOKEN"]',
'process["env"]["ANTHROPIC_API_KEY"]',
'process["env"]["GITHUB_WEBHOOK_SECRET"]',
'process["env"]["WEBHOOK_SECRET"]',
'process["env"]["BOT_TOKEN"]',
'process["env"]["API_KEY"]',
'process["env"]["SECRET_KEY"]',
'process["env"]["ACCESS_TOKEN"]',
'process["env"]["REFRESH_TOKEN"]',
'process["env"]["JWT_SECRET"]',
'process["env"]["DATABASE_URL"]',
'process["env"]["DB_PASSWORD"]',
'process["env"]["REDIS_PASSWORD"]',
// Process environment variables (as top-level bracket notation keys)
'["process.env.AWS_SECRET_ACCESS_KEY"]',
'["process.env.AWS_ACCESS_KEY_ID"]',
'["process.env.AWS_SESSION_TOKEN"]',
'["process.env.AWS_SECURITY_TOKEN"]',
'["process.env.GITHUB_TOKEN"]',
'["process.env.GH_TOKEN"]',
'["process.env.ANTHROPIC_API_KEY"]',
'["process.env.GITHUB_WEBHOOK_SECRET"]',
'["process.env.WEBHOOK_SECRET"]',
'["process.env.BOT_TOKEN"]',
'["process.env.API_KEY"]',
'["process.env.SECRET_KEY"]',
'["process.env.ACCESS_TOKEN"]',
'["process.env.REFRESH_TOKEN"]',
'["process.env.JWT_SECRET"]',
'["process.env.DATABASE_URL"]',
'["process.env.DB_PASSWORD"]',
'["process.env.REDIS_PASSWORD"]',
// Output streams that might contain leaked credentials
'stderr',
'*.stderr',
'stdout',
'*.stdout',
'output',
'*.output',
'logs',
'*.logs',
'message',
'*.message',
'data',
'*.data',
// Error objects that might contain sensitive information
'error.dockerCommand',
'error.stderr',
'error.stdout',
'error.output',
'error.message',
'error.data',
'err.dockerCommand',
'err.stderr',
'err.stdout',
'err.output',
'err.message',
'err.data',
// HTTP request/response objects
'request.headers.authorization',
'response.headers.authorization',
'req.headers.authorization',
'res.headers.authorization',
'*.request.headers.authorization',
'*.response.headers.authorization',
'*.req.headers.authorization',
'*.res.headers.authorization',
// File paths that might contain credentials
'credentialsPath',
'*.credentialsPath',
'keyPath',
'*.keyPath',
'secretPath',
'*.secretPath',
// Database connection strings and configurations
'connectionString',
'*.connectionString',
'dbUrl',
'*.dbUrl',
'mongoUrl',
'*.mongoUrl',
'redisUrl',
'*.redisUrl',
// Authentication objects
'auth.token',
'auth.secret',
'auth.key',
'auth.password',
'*.auth.token',
'*.auth.secret',
'*.auth.key',
'*.auth.password',
'authentication.token',
'authentication.secret',
'authentication.key',
'authentication.password',
'*.authentication.token',
'*.authentication.secret',
'*.authentication.key',
'*.authentication.password',
// Deep nested patterns (up to 4 levels deep)
'*.*.password',
'*.*.secret',
'*.*.token',
'*.*.apiKey',
'*.*.api_key',
'*.*.credential',
'*.*.key',
'*.*.privateKey',
'*.*.private_key',
'*.*.AWS_SECRET_ACCESS_KEY',
'*.*.AWS_ACCESS_KEY_ID',
'*.*.GITHUB_TOKEN',
'*.*.ANTHROPIC_API_KEY',
'*.*.connectionString',
'*.*.DATABASE_URL',
'*.*.*.password',
'*.*.*.secret',
'*.*.*.token',
'*.*.*.apiKey',
'*.*.*.api_key',
'*.*.*.credential',
'*.*.*.key',
'*.*.*.privateKey',
'*.*.*.private_key',
'*.*.*.AWS_SECRET_ACCESS_KEY',
'*.*.*.AWS_ACCESS_KEY_ID',
'*.*.*.GITHUB_TOKEN',
'*.*.*.ANTHROPIC_API_KEY',
'*.*.*.connectionString',
'*.*.*.DATABASE_URL',
'*.*.*.*.password',
'*.*.*.*.secret',
'*.*.*.*.token',
'*.*.*.*.apiKey',
'*.*.*.*.api_key',
'*.*.*.*.credential',
'*.*.*.*.key',
'*.*.*.*.privateKey',
'*.*.*.*.private_key',
'*.*.*.*.AWS_SECRET_ACCESS_KEY',
'*.*.*.*.AWS_ACCESS_KEY_ID',
'*.*.*.*.GITHUB_TOKEN',
'*.*.*.*.ANTHROPIC_API_KEY',
'*.*.*.*.connectionString',
'*.*.*.*.DATABASE_URL'
],
censor: '[REDACTED]'
}
});
// Add simple file rotation (will be replaced with pino-roll in production)
if (isProduction) {
// Check log file size and rotate if necessary
try {
const maxSize = 10 * 1024 * 1024; // 10MB
// eslint-disable-next-line no-sync
if (fs.existsSync(logFileName)) {
// eslint-disable-next-line no-sync
const stats = fs.statSync(logFileName);
if (stats.size > maxSize) {
// Simple rotation - keep up to 5 backup files
for (let i = 4; i >= 0; i--) {
const oldFile = `${logFileName}.${i}`;
const newFile = `${logFileName}.${i + 1}`;
// eslint-disable-next-line no-sync
if (fs.existsSync(oldFile)) {
// eslint-disable-next-line no-sync
fs.renameSync(oldFile, newFile);
}
}
// eslint-disable-next-line no-sync
fs.renameSync(logFileName, `${logFileName}.0`);
logger.info('Log file rotated');
}
}
} catch (error) {
logger.error({ err: error }, 'Error rotating log file');
}
}
// Log startup message
logger.info(
{
app: 'claude-github-webhook',
startTime: new Date().toISOString(),
nodeVersion: process.version,
env: process.env.NODE_ENV || 'development',
logLevel: logger.level
},
'Application starting'
);
// Create a child logger for specific components
const createLogger = component => {
return logger.child({ component });
};
// Export the logger factory
module.exports = {
logger,
createLogger
};
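The `redact` configuration above relies on pino replacing values at the listed dot paths before a line is written. A dependency-free miniature of that mechanism (the `redactPaths` helper is illustrative, not pino's implementation):

```javascript
// Hypothetical miniature of pino-style path redaction: walk each listed
// dot path and replace the matching leaf value with a censor string.
function redactPaths(obj, paths, censor = '[REDACTED]') {
  const clone = JSON.parse(JSON.stringify(obj)); // leave the input untouched
  for (const path of paths) {
    const keys = path.split('.');
    let node = clone;
    // Walk down to the parent of the leaf key
    for (let i = 0; i < keys.length - 1 && node; i++) {
      node = node[keys[i]];
    }
    const leaf = keys[keys.length - 1];
    if (node && Object.prototype.hasOwnProperty.call(node, leaf)) {
      node[leaf] = censor;
    }
  }
  return clone;
}

const safe = redactPaths(
  { headers: { authorization: 'Bearer abc' }, user: 'octocat' },
  ['headers.authorization']
);
console.log(safe);
// -> { headers: { authorization: '[REDACTED]' }, user: 'octocat' }
```

Note that pino's real redaction also supports wildcard segments like `*.token`, which this miniature does not.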


@@ -1,54 +0,0 @@
/**
* Utilities for sanitizing text to prevent infinite loops and other issues
*/
const { createLogger } = require('./logger');
const logger = createLogger('sanitize');
/**
* Sanitizes text to prevent infinite loops by removing bot username mentions
* @param {string} text - The text to sanitize
* @returns {string} - Sanitized text
*/
function sanitizeBotMentions(text) {
if (!text) return text;
// Get bot username from environment variables - required
const BOT_USERNAME = process.env.BOT_USERNAME;
if (!BOT_USERNAME) {
logger.warn('BOT_USERNAME environment variable is not set. Cannot sanitize properly.');
return text;
}
// Create a regex to find all bot username mentions
// First escape any special regex characters
const escapedUsername = BOT_USERNAME.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
// Look for the username with @ symbol anywhere in the text
const botMentionRegex = new RegExp(escapedUsername, 'gi');
// Replace mentions with a sanitized version (remove @ symbol if present)
const sanitizedName = BOT_USERNAME.startsWith('@') ? BOT_USERNAME.substring(1) : BOT_USERNAME;
const sanitized = text.replace(botMentionRegex, sanitizedName);
// If sanitization occurred, log it
if (sanitized !== text) {
logger.warn('Sanitized bot mentions from text to prevent infinite loops');
}
return sanitized;
}
/**
* Sanitizes an array of labels to remove potentially sensitive or invalid characters.
* @param {string[]} labels - The array of labels to sanitize.
* @returns {string[]} - The sanitized array of labels.
*/
function sanitizeLabels(labels) {
return labels.map(label => label.replace(/[^a-zA-Z0-9:_-]/g, ''));
}
module.exports = {
sanitizeBotMentions,
sanitizeLabels
};
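The mention-stripping logic above translates naturally to TypeScript. A standalone sketch follows; unlike the real module, it takes `botUsername` as a parameter instead of reading `process.env.BOT_USERNAME`, and it omits the logging:

```typescript
// Standalone sketch of the sanitizeBotMentions logic above.
// botUsername is passed in rather than read from the environment.
function sanitizeBotMentions(text: string, botUsername: string): string {
  if (!text || !botUsername) return text;
  // Escape regex metacharacters in the username before building the pattern.
  const escaped = botUsername.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
  const mentionRegex = new RegExp(escaped, "gi");
  // Replace each mention with the username minus its leading @,
  // so the bot no longer re-triggers on its own replies.
  const plainName = botUsername.startsWith("@")
    ? botUsername.slice(1)
    : botUsername;
  return text.replace(mentionRegex, plainName);
}

sanitizeBotMentions("ping @review-bot please", "@review-bot");
// → "ping review-bot please"
```

The escaping step matters: usernames may contain characters like `.` or `+` that would otherwise change the regex's meaning.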


@@ -1,101 +0,0 @@
const fs = require('fs');
const { logger } = require('./logger');
/**
* Secure credential loader - reads from files instead of env vars
* Files are mounted as Docker secrets or regular files
*/
class SecureCredentials {
constructor() {
this.credentials = new Map();
this.loadCredentials();
}
/**
* Load credentials from files or fallback to env vars
*/
loadCredentials() {
const credentialMappings = {
GITHUB_TOKEN: {
file: process.env.GITHUB_TOKEN_FILE || '/run/secrets/github_token',
env: 'GITHUB_TOKEN'
},
ANTHROPIC_API_KEY: {
file: process.env.ANTHROPIC_API_KEY_FILE || '/run/secrets/anthropic_api_key',
env: 'ANTHROPIC_API_KEY'
},
GITHUB_WEBHOOK_SECRET: {
file: process.env.GITHUB_WEBHOOK_SECRET_FILE || '/run/secrets/webhook_secret',
env: 'GITHUB_WEBHOOK_SECRET'
}
};
for (const [key, config] of Object.entries(credentialMappings)) {
let value = null;
// Try to read from file first (most secure)
try {
// eslint-disable-next-line no-sync
if (fs.existsSync(config.file)) {
// eslint-disable-next-line no-sync
value = fs.readFileSync(config.file, 'utf8').trim();
logger.info(`Loaded ${key} from secure file: ${config.file}`);
}
} catch (error) {
logger.warn(`Failed to read ${key} from file ${config.file}: ${error.message}`);
}
// Fallback to environment variable (less secure)
if (!value && process.env[config.env]) {
value = process.env[config.env];
logger.warn(`Using ${key} from environment variable (less secure)`);
}
if (value) {
this.credentials.set(key, value);
} else {
logger.error(`No credential found for ${key}`);
}
}
}
/**
* Get credential value
* @param {string} key - Credential key
* @returns {string|null} - Credential value or null if not found
*/
get(key) {
return this.credentials.get(key) || null;
}
/**
* Check if credential exists
* @param {string} key - Credential key
* @returns {boolean}
*/
has(key) {
return this.credentials.has(key);
}
/**
* Get all available credential keys (for debugging)
* @returns {string[]}
*/
getAvailableKeys() {
return Array.from(this.credentials.keys());
}
/**
* Reload credentials (useful for credential rotation)
*/
reload() {
this.credentials.clear();
this.loadCredentials();
logger.info('Credentials reloaded');
}
}
// Create singleton instance
const secureCredentials = new SecureCredentials();
module.exports = secureCredentials;
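The file-first, env-fallback lookup at the heart of `SecureCredentials` can be reduced to a small TypeScript helper. This is a simplified model for illustration: the real class builds a `Map` over a fixed set of credential mappings, logs which source each value came from, and supports `reload()` for rotation:

```typescript
import * as fs from "node:fs";

// Simplified model of the credential lookup in SecureCredentials:
// prefer a mounted secret file, fall back to an environment variable.
function loadCredential(filePath: string, envName: string): string | null {
  try {
    // Docker secrets are mounted as plain files; read and trim them.
    if (fs.existsSync(filePath)) {
      return fs.readFileSync(filePath, "utf8").trim();
    }
  } catch {
    // Unreadable file: fall through to the environment variable.
  }
  return process.env[envName] ?? null;
}

// Same default path convention as the GITHUB_TOKEN mapping above.
const token = loadCredential(
  process.env.GITHUB_TOKEN_FILE ?? "/run/secrets/github_token",
  "GITHUB_TOKEN"
);
```

Keeping the file path itself configurable (`GITHUB_TOKEN_FILE`) lets deployments relocate secrets without code changes while preserving the env-var escape hatch.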


@@ -1,66 +0,0 @@
const { createLogger } = require('./logger');
class StartupMetrics {
constructor() {
this.logger = createLogger('startup-metrics');
this.startTime = Date.now();
this.milestones = {};
this.isReady = false;
}
recordMilestone(name, description = '') {
const timestamp = Date.now();
const elapsed = timestamp - this.startTime;
this.milestones[name] = {
timestamp,
elapsed,
description
};
this.logger.info(
{
milestone: name,
elapsed: `${elapsed}ms`,
description
},
`Startup milestone: ${name}`
);
return elapsed;
}
markReady() {
const totalTime = this.recordMilestone('service_ready', 'Service is ready to accept requests');
this.isReady = true;
this.logger.info(
{
totalStartupTime: `${totalTime}ms`,
milestones: this.milestones
},
'Service startup completed'
);
return totalTime;
}
getMetrics() {
return {
isReady: this.isReady,
totalElapsed: Date.now() - this.startTime,
milestones: this.milestones,
startTime: this.startTime
};
}
// Middleware to attach startup metrics to incoming requests
metricsMiddleware() {
return (req, res, next) => {
req.startupMetrics = this.getMetrics();
next();
};
}
}
module.exports = { StartupMetrics };
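Typical boot-time wiring for the class above can be sketched in TypeScript. The milestone names below are illustrative, not taken from the application, and the sketch drops the logging to stay self-contained:

```typescript
// Sketch of the StartupMetrics milestone-tracking pattern above,
// minus the pino logging. Milestone names here are hypothetical.
class StartupMetrics {
  private startTime = Date.now();
  private milestones: Record<string, { elapsed: number; description: string }> = {};
  isReady = false;

  recordMilestone(name: string, description = ""): number {
    const elapsed = Date.now() - this.startTime;
    this.milestones[name] = { elapsed, description };
    return elapsed;
  }

  markReady(): number {
    // Mirrors markReady() above: record a final milestone, then flip the flag.
    const total = this.recordMilestone("service_ready", "Ready to accept requests");
    this.isReady = true;
    return total;
  }

  getMetrics() {
    return { isReady: this.isReady, milestones: this.milestones };
  }
}

const metrics = new StartupMetrics();
metrics.recordMilestone("config_loaded");   // hypothetical milestone
metrics.recordMilestone("routes_mounted");  // hypothetical milestone
metrics.markReady();
```

Recording elapsed-since-start rather than wall-clock deltas keeps the milestones directly comparable in a single "startup completed" log entry.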