Compare commits

...

6 Commits

Author SHA1 Message Date
Jonathan Flatt
803410d3b2 fix: Update SessionManager tests for new implementation
- Update test to expect docker volume create instead of docker create
- Add unref() method to mock process objects to fix test environment error
- Update spawn expectations to match new docker run implementation
- Fix tests for both startSession and queueSession methods

Tests now pass in CI environment.
2025-06-04 00:51:32 +00:00
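The unref() fix called out above can be pictured with a minimal mock sketch (hypothetical test setup, mirroring the child_process mocks in the removed integration tests further down): spawn() must now return an object exposing unref(), since SessionManager detaches the Docker process.

// Hypothetical Jest setup; without the unref stub, dockerProcess.unref() throws in tests
jest.mock('child_process', () => ({
  execSync: jest.fn(() => ''),
  spawn: jest.fn(() => ({
    stdout: { on: jest.fn() },
    stderr: { on: jest.fn() },
    on: jest.fn(),
    unref: jest.fn()
  }))
}));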
Jonathan Flatt
c1d8845b21 feat: Implement Claude orchestration with session management
- Add CLAUDE_WEBHOOK_SECRET for webhook authentication
- Fix Docker volume mounting for Claude credentials
- Capture Claude's internal session ID from stream-json output
- Update entrypoint script to support OUTPUT_FORMAT=stream-json
- Fix environment variable naming (REPOSITORY -> REPO_FULL_NAME)
- Enable parallel session execution with proper authentication
- Successfully tested creating PRs via orchestrated sessions

This enables the webhook to create and manage Claude Code sessions that can:
- Clone repositories
- Create feature branches
- Implement code changes
- Commit and push changes
- Create pull requests

All while capturing Claude's internal session ID for potential resumption.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-06-04 00:38:01 +00:00
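The session-ID capture described above reads the first line Claude emits in stream-json mode; a minimal sketch of that step (field names taken from the SessionManager change shown later in this diff, the value is illustrative):

// The first stream-json line is an init record; its session_id is Claude's internal session ID
const firstLine = '{"type":"system","subtype":"init","session_id":"<claude-session-id>"}';
const initData = JSON.parse(firstLine) as { type: string; subtype: string; session_id?: string };
let claudeSessionId: string | undefined;
if (initData.type === 'system' && initData.subtype === 'init' && initData.session_id) {
  claudeSessionId = initData.session_id; // stored on the session for potential resumption
}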
Cheffromspace
ea812f5b8f fix: Fix failing tests for SessionManager and IssueHandler (#174)
- Update SessionManager tests to handle synchronous error throwing
- Fix IssueHandler tests to match actual handler implementation
- Update mock expectations to include all required parameters
- Change operationType from 'tagging' to 'auto-tagging'
- Fix return value expectations to match handler responses
- Remove unused imports and variables
2025-06-03 18:04:33 -05:00
Jonathan Flatt
346199ebbd feat: Implement combined test coverage for main project and CLI
- Add combined coverage script to merge lcov reports
- Update GitHub workflows to generate and upload combined coverage
- Install missing CLI dependencies (ora, yaml, cli-table3, mock-fs)
- Add initial tests for SessionManager and IssueHandler
- Exclude type-only files from coverage metrics
- Update jest config to exclude type files from coverage

This ensures Codecov receives coverage data from both the main project
and CLI subdirectory, providing accurate overall project coverage metrics.
2025-06-03 22:43:20 +00:00
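Conceptually the combination step is simple because lcov reports are plain text; a condensed sketch of what the new scripts/combine-coverage.js (shown in full further down) does, assuming both sub-projects have already written their lcov reports:

import { readFileSync, writeFileSync } from 'fs';

// Rewrite CLI source paths so they are relative to the repo root, then concatenate the reports
const main = readFileSync('coverage/lcov.info', 'utf8');
const cli = readFileSync('cli/coverage/lcov.info', 'utf8').replace(/SF:src\//g, 'SF:cli/src/');
writeFileSync('coverage-combined/lcov.info', main + '\n' + cli);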
Jonathan Flatt
8da021bb00 Update README 2025-06-03 21:44:43 +00:00
Cheffromspace
8926d0026d fix: Add comprehensive test suite to PR checks (#173)
* fix: Fix Claude integration tests by ensuring provider registration

The Claude webhook integration tests were failing because the provider
wasn't being registered before the routes were imported. This was due
to the conditional check that skips provider initialization in test mode.

Changes:
- Move environment variable setup before any imports
- Import Claude provider before importing webhook routes
- Remove duplicate provider registration from beforeAll hook

This ensures the Claude provider is properly registered with the webhook
registry before the tests run.

* fix: Add comprehensive test suite to PR checks

- Replace test:unit with test:ci to run full test suite (unit + integration)
- Add format:check for Prettier validation
- Add typecheck for TypeScript compilation checks
- Add codecov upload for PR coverage reporting
- Add TruffleHog secret scanning for PR changes

This ensures PRs catch all issues that would fail on main branch,
preventing post-merge failures.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>

* test: Remove obsolete Claude integration tests

These tests were for the deprecated /api/webhooks/claude endpoint
that was removed in commit dd5e6e6. The functionality is now covered
by unit tests for the new webhook provider architecture:
- ClaudeWebhookProvider.test.ts
- SessionHandler.test.ts
- OrchestrationHandler.test.ts

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-06-03 15:14:17 -05:00
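The registration-order fix described in the first bullet list comes down to load order in the test setup; a minimal sketch of the corrected ordering (hypothetical file, paths mirror the removed integration tests at the end of this diff):

// Environment first, so nothing reads unset secrets at import time
process.env.CLAUDE_WEBHOOK_SECRET = 'test-secret';
process.env.SKIP_WEBHOOK_VERIFICATION = '1';

// Loading the provider registers it with the webhook registry...
require('../../../src/providers/claude');
// ...and only then are the routes, which look handlers up in the registry, loaded
const webhookRoutes = require('../../../src/routes/webhooks').default;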
26 changed files with 5059 additions and 835 deletions


@@ -22,12 +22,18 @@ jobs:
cache: npm
- run: npm ci
- run: npm run lint:check
- run: npm run test:ci
- name: Install CLI dependencies
working-directory: ./cli
run: npm ci
- name: Generate combined coverage
run: ./scripts/combine-coverage.js
env:
NODE_ENV: test
- uses: codecov/codecov-action@v5
with:
token: ${{ secrets.CODECOV_TOKEN }}
directory: ./coverage-combined
fail_ci_if_error: true
security:
runs-on: ubuntu-latest


@@ -17,16 +17,33 @@ jobs:
node-version: ${{ env.NODE_VERSION }}
cache: npm
- run: npm ci
- run: npm run format:check
- run: npm run lint:check
- run: npm run test:unit
- run: npm run typecheck
- name: Install CLI dependencies
working-directory: ./cli
run: npm ci
- name: Generate combined coverage
run: ./scripts/combine-coverage.js
env:
NODE_ENV: test
- uses: codecov/codecov-action@v5
with:
token: ${{ secrets.CODECOV_TOKEN }}
directory: ./coverage-combined
fail_ci_if_error: true
security:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- run: ./scripts/security/credential-audit.sh
- uses: trufflesecurity/trufflehog@main
with:
path: ./
base: ${{ github.event.pull_request.base.sha }}
head: ${{ github.event.pull_request.head.sha }}
extra_args: --debug --only-verified
docker:
runs-on: ubuntu-latest
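The PR workflow above now mirrors the gates that would fail on main; a small sketch for reproducing the same checks locally (assumes the npm scripts named in the workflow and commit message exist as shown):

import { execSync } from 'child_process';

// Run the same checks the PR workflow runs, in order; any failure throws and stops the loop
for (const cmd of ['npm run format:check', 'npm run lint:check', 'npm run typecheck', 'npm run test:ci']) {
  execSync(cmd, { stdio: 'inherit' });
}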


@@ -374,14 +374,6 @@ npm run dev
- ESLint + Prettier for code formatting
- Conventional commits for version management
### Security Checklist
- [ ] No hardcoded credentials
- [ ] All inputs sanitized
- [ ] Webhook signatures verified
- [ ] Container permissions minimal
- [ ] Logs redact sensitive data
## Troubleshooting
### Common Issues
@@ -408,4 +400,4 @@ npm run dev
## License
MIT - See the [LICENSE file](LICENSE) for details.
MIT - See the [LICENSE file](LICENSE) for details.

analyze-combined-coverage.js (executable file, +83 lines)

@@ -0,0 +1,83 @@
#!/usr/bin/env node
const fs = require('fs');
const path = require('path');
// Read combined lcov.info
const lcovPath = path.join(__dirname, 'coverage-combined', 'lcov.info');
if (!fs.existsSync(lcovPath)) {
console.error('No coverage-combined/lcov.info file found. Run npm run test:combined-coverage first.');
process.exit(1);
}
const lcovContent = fs.readFileSync(lcovPath, 'utf8');
const lines = lcovContent.split('\n');
let currentFile = null;
const fileStats = {};
let totalLines = 0;
let coveredLines = 0;
for (const line of lines) {
if (line.startsWith('SF:')) {
currentFile = line.substring(3);
if (!fileStats[currentFile]) {
fileStats[currentFile] = { lines: 0, covered: 0, functions: 0, functionsHit: 0 };
}
} else if (line.startsWith('DA:')) {
const [lineNum, hits] = line.substring(3).split(',').map(Number);
if (currentFile) {
fileStats[currentFile].lines++;
totalLines++;
if (hits > 0) {
fileStats[currentFile].covered++;
coveredLines++;
}
}
}
}
const overallCoverage = (coveredLines / totalLines) * 100;
console.log('\n=== Combined Coverage Analysis ===\n');
console.log(`Total Lines: ${totalLines}`);
console.log(`Covered Lines: ${coveredLines}`);
console.log(`Overall Coverage: ${overallCoverage.toFixed(2)}%`);
console.log(`Target: 80%`);
console.log(`Status: ${overallCoverage >= 80 ? '✅ PASSED' : '❌ FAILED'}\n`);
// Break down by directory
const srcFiles = Object.entries(fileStats).filter(([file]) => file.startsWith('src/'));
const cliFiles = Object.entries(fileStats).filter(([file]) => file.startsWith('cli/'));
const srcStats = srcFiles.reduce((acc, [, stats]) => ({
lines: acc.lines + stats.lines,
covered: acc.covered + stats.covered
}), { lines: 0, covered: 0 });
const cliStats = cliFiles.reduce((acc, [, stats]) => ({
lines: acc.lines + stats.lines,
covered: acc.covered + stats.covered
}), { lines: 0, covered: 0 });
console.log('=== Coverage by Component ===');
console.log(`Main src/: ${((srcStats.covered / srcStats.lines) * 100).toFixed(2)}% (${srcStats.covered}/${srcStats.lines} lines)`);
console.log(`CLI: ${((cliStats.covered / cliStats.lines) * 100).toFixed(2)}% (${cliStats.covered}/${cliStats.lines} lines)`);
// Show files with lowest coverage
console.log('\n=== Files with Lowest Coverage ===');
const sorted = Object.entries(fileStats)
.map(([file, stats]) => ({
file,
coverage: (stats.covered / stats.lines) * 100,
lines: stats.lines,
covered: stats.covered
}))
.sort((a, b) => a.coverage - b.coverage)
.slice(0, 10);
sorted.forEach(({ file, coverage, covered, lines }) => {
console.log(`${file.padEnd(60)} ${coverage.toFixed(2).padStart(6)}% (${covered}/${lines})`);
});
process.exit(overallCoverage >= 80 ? 0 : 1);
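For reference, the lcov records these analyzer scripts read look like the following (an illustrative sample, not actual repository output):

// SF: source file, DA:<line>,<hits>, FNF/FNH: functions found/hit, end_of_record closes a file
const sampleLcov = [
  'SF:src/utils/sanitize.ts',
  'FNF:2',
  'FNH:2',
  'DA:1,1', // executed once  -> counted and covered
  'DA:2,0', // never executed -> counted, not covered
  'end_of_record'
].join('\n');
// For this record the scripts above would report 2 lines, 1 covered (50.00%).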

analyze-coverage.js (new file, +83 lines)

@@ -0,0 +1,83 @@
#!/usr/bin/env node
const fs = require('fs');
const path = require('path');
// Read lcov.info
const lcovPath = path.join(__dirname, 'coverage', 'lcov.info');
if (!fs.existsSync(lcovPath)) {
console.error('No coverage/lcov.info file found. Run npm test:coverage first.');
process.exit(1);
}
const lcovContent = fs.readFileSync(lcovPath, 'utf8');
const lines = lcovContent.split('\n');
let currentFile = null;
const fileStats = {};
let totalLines = 0;
let coveredLines = 0;
for (const line of lines) {
if (line.startsWith('SF:')) {
currentFile = line.substring(3);
if (!fileStats[currentFile]) {
fileStats[currentFile] = { lines: 0, covered: 0, functions: 0, functionsHit: 0 };
}
} else if (line.startsWith('DA:')) {
const [lineNum, hits] = line.substring(3).split(',').map(Number);
if (currentFile) {
fileStats[currentFile].lines++;
totalLines++;
if (hits > 0) {
fileStats[currentFile].covered++;
coveredLines++;
}
}
} else if (line.startsWith('FNF:')) {
if (currentFile) {
fileStats[currentFile].functions = parseInt(line.substring(4));
}
} else if (line.startsWith('FNH:')) {
if (currentFile) {
fileStats[currentFile].functionsHit = parseInt(line.substring(4));
}
}
}
console.log('\n=== Coverage Analysis ===\n');
console.log(`Total Lines: ${totalLines}`);
console.log(`Covered Lines: ${coveredLines}`);
console.log(`Overall Coverage: ${((coveredLines / totalLines) * 100).toFixed(2)}%\n`);
console.log('=== File Breakdown ===\n');
const sortedFiles = Object.entries(fileStats).sort((a, b) => {
const coverageA = (a[1].covered / a[1].lines) * 100;
const coverageB = (b[1].covered / b[1].lines) * 100;
return coverageA - coverageB;
});
for (const [file, stats] of sortedFiles) {
const coverage = ((stats.covered / stats.lines) * 100).toFixed(2);
console.log(`${file.padEnd(60)} ${coverage.padStart(6)}% (${stats.covered}/${stats.lines} lines)`);
}
// Check if CLI coverage is included
console.log('\n=== Coverage Scope Analysis ===\n');
const cliFiles = sortedFiles.filter(([file]) => file.includes('cli/'));
const srcFiles = sortedFiles.filter(([file]) => file.startsWith('src/'));
console.log(`Main src/ files: ${srcFiles.length}`);
console.log(`CLI files: ${cliFiles.length}`);
if (cliFiles.length > 0) {
console.log('\nCLI files found in coverage:');
cliFiles.forEach(([file]) => console.log(` - ${file}`));
}
// Check for any unexpected files
const otherFiles = sortedFiles.filter(([file]) => !file.startsWith('src/') && !file.includes('cli/'));
if (otherFiles.length > 0) {
console.log('\nOther files in coverage:');
otherFiles.forEach(([file]) => console.log(` - ${file}`));
}


@@ -0,0 +1,99 @@
#!/usr/bin/env node
const fs = require('fs');
const path = require('path');
// Coverage data from the test output
const coverageData = {
'src/index.ts': { statements: 92.64, branches: 78.94, functions: 85.71, lines: 92.64 },
'src/controllers/githubController.ts': { statements: 69.65, branches: 64.47, functions: 84.61, lines: 69.2 },
'src/core/webhook/WebhookProcessor.ts': { statements: 100, branches: 92.3, functions: 100, lines: 100 },
'src/core/webhook/WebhookRegistry.ts': { statements: 97.77, branches: 100, functions: 100, lines: 97.67 },
'src/core/webhook/constants.ts': { statements: 100, branches: 100, functions: 100, lines: 100 },
'src/core/webhook/index.ts': { statements: 0, branches: 100, functions: 0, lines: 0 },
'src/providers/claude/ClaudeWebhookProvider.ts': { statements: 77.41, branches: 46.66, functions: 100, lines: 77.41 },
'src/providers/claude/index.ts': { statements: 100, branches: 100, functions: 0, lines: 100 },
'src/providers/claude/handlers/OrchestrationHandler.ts': { statements: 95.65, branches: 75, functions: 100, lines: 95.65 },
'src/providers/claude/handlers/SessionHandler.ts': { statements: 96.66, branches: 89.28, functions: 100, lines: 96.66 },
'src/providers/claude/services/SessionManager.ts': { statements: 6.06, branches: 0, functions: 0, lines: 6.06 },
'src/providers/claude/services/TaskDecomposer.ts': { statements: 96.87, branches: 93.75, functions: 100, lines: 96.66 },
'src/providers/github/GitHubWebhookProvider.ts': { statements: 95.45, branches: 90.62, functions: 100, lines: 95.45 },
'src/providers/github/index.ts': { statements: 100, branches: 100, functions: 100, lines: 100 },
'src/providers/github/handlers/IssueHandler.ts': { statements: 30.43, branches: 0, functions: 0, lines: 30.43 },
'src/routes/github.ts': { statements: 100, branches: 100, functions: 100, lines: 100 },
'src/routes/webhooks.ts': { statements: 92.1, branches: 100, functions: 57.14, lines: 91.66 },
'src/services/claudeService.ts': { statements: 85.62, branches: 66.17, functions: 100, lines: 86.66 },
'src/services/githubService.ts': { statements: 72.22, branches: 78.57, functions: 75, lines: 71.93 },
'src/types/claude.ts': { statements: 0, branches: 100, functions: 100, lines: 0 },
'src/types/environment.ts': { statements: 0, branches: 0, functions: 0, lines: 0 },
'src/types/index.ts': { statements: 0, branches: 0, functions: 0, lines: 0 },
'src/utils/awsCredentialProvider.ts': { statements: 65.68, branches: 59.25, functions: 54.54, lines: 65.68 },
'src/utils/logger.ts': { statements: 51.61, branches: 47.36, functions: 100, lines: 51.72 },
'src/utils/sanitize.ts': { statements: 100, branches: 100, functions: 100, lines: 100 },
'src/utils/secureCredentials.ts': { statements: 54.28, branches: 70.58, functions: 33.33, lines: 54.28 },
'src/utils/startup-metrics.ts': { statements: 100, branches: 100, functions: 100, lines: 100 }
};
// Calculate different scenarios
console.log('\n=== Coverage Analysis - Matching Codecov ===\n');
// Scenario 1: Exclude type definition files
const withoutTypes = Object.entries(coverageData)
.filter(([file]) => !file.includes('/types/'))
.reduce((acc, [file, data]) => {
acc[file] = data;
return acc;
}, {});
const avgWithoutTypes = calculateAverage(withoutTypes);
console.log(`1. Without type files: ${avgWithoutTypes.toFixed(2)}%`);
// Scenario 2: Exclude files with 0% coverage
const withoutZeroCoverage = Object.entries(coverageData)
.filter(([file, data]) => data.lines > 0)
.reduce((acc, [file, data]) => {
acc[file] = data;
return acc;
}, {});
const avgWithoutZero = calculateAverage(withoutZeroCoverage);
console.log(`2. Without 0% coverage files: ${avgWithoutZero.toFixed(2)}%`);
// Scenario 3: Exclude specific low coverage files
const excludeLowCoverage = Object.entries(coverageData)
.filter(([file]) => {
return !file.includes('/types/') &&
!file.includes('SessionManager.ts') &&
!file.includes('IssueHandler.ts');
})
.reduce((acc, [file, data]) => {
acc[file] = data;
return acc;
}, {});
const avgExcludeLow = calculateAverage(excludeLowCoverage);
console.log(`3. Without types, SessionManager, IssueHandler: ${avgExcludeLow.toFixed(2)}%`);
// Scenario 4: Statement coverage only (what codecov might be reporting)
const statementOnly = calculateStatementAverage(coverageData);
console.log(`4. Statement coverage only: ${statementOnly.toFixed(2)}%`);
// Show which files have the biggest impact
console.log('\n=== Files with lowest coverage ===');
const sorted = Object.entries(coverageData)
.sort((a, b) => a[1].lines - b[1].lines)
.slice(0, 10);
sorted.forEach(([file, data]) => {
console.log(`${file.padEnd(60)} ${data.lines.toFixed(2)}%`);
});
function calculateAverage(data) {
const values = Object.values(data).map(d => d.lines);
return values.reduce((sum, val) => sum + val, 0) / values.length;
}
function calculateStatementAverage(data) {
const values = Object.values(data).map(d => d.statements);
return values.reduce((sum, val) => sum + val, 0) / values.length;
}

cli/package-lock.json (generated, 439 lines changed)

@@ -10,12 +10,9 @@
"dependencies": {
"axios": "^1.6.2",
"chalk": "^4.1.2",
"cli-table3": "^0.6.3",
"commander": "^14.0.0",
"dotenv": "^16.3.1",
"ora": "^5.4.1",
"uuid": "^9.0.0",
"yaml": "^2.3.4"
"uuid": "^9.0.0"
},
"bin": {
"claude-hub": "claude-hub"
@@ -24,12 +21,16 @@
"@types/jest": "^29.5.0",
"@types/mock-fs": "^4.13.4",
"@types/node": "^20.10.0",
"@types/ora": "^3.1.0",
"@types/uuid": "^9.0.8",
"cli-table3": "^0.6.5",
"jest": "^29.5.0",
"mock-fs": "^5.5.0",
"ora": "^8.2.0",
"ts-jest": "^29.1.0",
"ts-node": "^10.9.2",
"typescript": "^5.3.2"
"typescript": "^5.3.2",
"yaml": "^2.8.0"
}
},
"node_modules/@ampproject/remapping": {
@@ -501,6 +502,7 @@
"version": "1.5.0",
"resolved": "https://registry.npmjs.org/@colors/colors/-/colors-1.5.0.tgz",
"integrity": "sha512-ooWCrlZP11i8GImSjTHYHLkvFDP48nS4+204nGb1RiX/WXYHmJA2III9/e2DWVabCESdW7hBAEzHRqUn9OUVvQ==",
"dev": true,
"optional": true,
"engines": {
"node": ">=0.1.90"
@@ -1016,6 +1018,7 @@
"resolved": "https://registry.npmjs.org/@types/mock-fs/-/mock-fs-4.13.4.tgz",
"integrity": "sha512-mXmM0o6lULPI8z3XNnQCpL0BGxPwx1Ul1wXYEPBGl4efShyxW2Rln0JOPEWGyZaYZMM6OVXM/15zUuFMY52ljg==",
"dev": true,
"license": "MIT",
"dependencies": {
"@types/node": "*"
}
@@ -1029,6 +1032,16 @@
"undici-types": "~6.19.2"
}
},
"node_modules/@types/ora": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/@types/ora/-/ora-3.1.0.tgz",
"integrity": "sha512-4e15N42qhHRlxyP5SpX9fK3q4tXvEkdmGdof2DZ0mqPu7glrNT8cs9bbI73NhwEGApq1TSXhs2aFmn19VCTwCQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"@types/node": "*"
}
},
"node_modules/@types/stack-utils": {
"version": "2.0.3",
"resolved": "https://registry.npmjs.org/@types/stack-utils/-/stack-utils-2.0.3.tgz",
@@ -1099,6 +1112,7 @@
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.1.tgz",
"integrity": "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==",
"dev": true,
"engines": {
"node": ">=8"
}
@@ -1282,35 +1296,6 @@
"integrity": "sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw==",
"dev": true
},
"node_modules/base64-js": {
"version": "1.5.1",
"resolved": "https://registry.npmjs.org/base64-js/-/base64-js-1.5.1.tgz",
"integrity": "sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
]
},
"node_modules/bl": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/bl/-/bl-4.1.0.tgz",
"integrity": "sha512-1W07cM9gS6DcLperZfFSj+bWLtaPGSOHWhPiGzXmvVJbRLdG82sH/Kn8EtW1VqWVA54AKf2h5k5BbnIbwF3h6w==",
"dependencies": {
"buffer": "^5.5.0",
"inherits": "^2.0.4",
"readable-stream": "^3.4.0"
}
},
"node_modules/brace-expansion": {
"version": "1.1.11",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.11.tgz",
@@ -1386,29 +1371,6 @@
"node-int64": "^0.4.0"
}
},
"node_modules/buffer": {
"version": "5.7.1",
"resolved": "https://registry.npmjs.org/buffer/-/buffer-5.7.1.tgz",
"integrity": "sha512-EHcyIPBQ4BSGlvjB16k5KgAJ27CIsHY/2JBmCRReo48y9rQ3MaUzWX3KVlBa4U7MyX02HdVj0K7C3WaB3ju7FQ==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
],
"dependencies": {
"base64-js": "^1.3.1",
"ieee754": "^1.1.13"
}
},
"node_modules/buffer-from": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/buffer-from/-/buffer-from-1.1.2.tgz",
@@ -1511,20 +1473,26 @@
"dev": true
},
"node_modules/cli-cursor": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/cli-cursor/-/cli-cursor-3.1.0.tgz",
"integrity": "sha512-I/zHAwsKf9FqGoXM4WWRACob9+SNukZTd94DWF57E4toouRulbCxcUh6RKUEOQlYTHJnzkPMySvPNaaSLNfLZw==",
"version": "5.0.0",
"resolved": "https://registry.npmjs.org/cli-cursor/-/cli-cursor-5.0.0.tgz",
"integrity": "sha512-aCj4O5wKyszjMmDT4tZj93kxyydN/K5zPWSCe6/0AV/AA1pqe5ZBIw0a2ZfPQV7lL5/yb5HsUreJ6UFAF1tEQw==",
"dev": true,
"license": "MIT",
"dependencies": {
"restore-cursor": "^3.1.0"
"restore-cursor": "^5.0.0"
},
"engines": {
"node": ">=8"
"node": ">=18"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/cli-spinners": {
"version": "2.9.2",
"resolved": "https://registry.npmjs.org/cli-spinners/-/cli-spinners-2.9.2.tgz",
"integrity": "sha512-ywqV+5MmyL4E7ybXgKys4DugZbX0FC6LnwrhjuykIjnK9k8OQacQ7axGKnjDXWNhns0xot3bZI5h55H8yo9cJg==",
"dev": true,
"engines": {
"node": ">=6"
},
@@ -1536,6 +1504,8 @@
"version": "0.6.5",
"resolved": "https://registry.npmjs.org/cli-table3/-/cli-table3-0.6.5.tgz",
"integrity": "sha512-+W/5efTR7y5HRD7gACw9yQjqMVvEMLBHmboM/kPWam+H+Hmyrgjh6YncVKK122YZkXrLudzTuAukUw9FnMf7IQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"string-width": "^4.2.0"
},
@@ -1560,14 +1530,6 @@
"node": ">=12"
}
},
"node_modules/clone": {
"version": "1.0.4",
"resolved": "https://registry.npmjs.org/clone/-/clone-1.0.4.tgz",
"integrity": "sha512-JQHZ2QMW6l3aH/j6xCqQThY/9OH4D/9ls34cgkUBiEeocRTU04tHfKPBsUK1PqZCUQM7GiA0IIXJSuXHI64Kbg==",
"engines": {
"node": ">=0.8"
}
},
"node_modules/co": {
"version": "4.6.0",
"resolved": "https://registry.npmjs.org/co/-/co-4.6.0.tgz",
@@ -1712,17 +1674,6 @@
"node": ">=0.10.0"
}
},
"node_modules/defaults": {
"version": "1.0.4",
"resolved": "https://registry.npmjs.org/defaults/-/defaults-1.0.4.tgz",
"integrity": "sha512-eFuaLoy/Rxalv2kr+lqMlUnrDWV+3j4pljOIJgLIhI058IQfWJ7vXhyEIHu+HtC738klGALYxOKDO0bQP3tg8A==",
"dependencies": {
"clone": "^1.0.2"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/delayed-stream": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
@@ -1818,7 +1769,8 @@
"node_modules/emoji-regex": {
"version": "8.0.0",
"resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
"integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="
"integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==",
"dev": true
},
"node_modules/error-ex": {
"version": "1.3.2",
@@ -2098,6 +2050,19 @@
"node": "6.* || 8.* || >= 10.*"
}
},
"node_modules/get-east-asian-width": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/get-east-asian-width/-/get-east-asian-width-1.3.0.tgz",
"integrity": "sha512-vpeMIQKxczTD/0s2CdEWHcb0eeJe6TFjxb+J5xgX7hScxqrGuyjmv4c1D4A/gelKfyox0gJJwIHF+fLjeaM8kQ==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=18"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/get-intrinsic": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.3.0.tgz",
@@ -2260,25 +2225,6 @@
"node": ">=10.17.0"
}
},
"node_modules/ieee754": {
"version": "1.2.1",
"resolved": "https://registry.npmjs.org/ieee754/-/ieee754-1.2.1.tgz",
"integrity": "sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
]
},
"node_modules/import-local": {
"version": "3.2.0",
"resolved": "https://registry.npmjs.org/import-local/-/import-local-3.2.0.tgz",
@@ -2321,7 +2267,8 @@
"node_modules/inherits": {
"version": "2.0.4",
"resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.4.tgz",
"integrity": "sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ=="
"integrity": "sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==",
"dev": true
},
"node_modules/is-arrayish": {
"version": "0.2.1",
@@ -2348,6 +2295,7 @@
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz",
"integrity": "sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg==",
"dev": true,
"engines": {
"node": ">=8"
}
@@ -2362,11 +2310,16 @@
}
},
"node_modules/is-interactive": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/is-interactive/-/is-interactive-1.0.0.tgz",
"integrity": "sha512-2HvIEKRoqS62guEC+qBjpvRubdX910WCMuJTZ+I9yvqKU2/12eSL549HMwtabb4oupdj2sMP50k+XJfB/8JE6w==",
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/is-interactive/-/is-interactive-2.0.0.tgz",
"integrity": "sha512-qP1vozQRI+BMOPcjFzrjXuQvdak2pHNUMZoeG2eRbiSqyvbEf/wQtEOTOX1guk6E3t36RkaqiSt8A/6YElNxLQ==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=8"
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/is-number": {
@@ -2391,11 +2344,13 @@
}
},
"node_modules/is-unicode-supported": {
"version": "0.1.0",
"resolved": "https://registry.npmjs.org/is-unicode-supported/-/is-unicode-supported-0.1.0.tgz",
"integrity": "sha512-knxG2q4UC3u8stRGyAVJCOdxFmv5DZiRcdlIaAQXAbSfJya+OhopNotLQrstBhququ4ZpuKbDc/8S6mgXgPFPw==",
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/is-unicode-supported/-/is-unicode-supported-2.1.0.tgz",
"integrity": "sha512-mE00Gnza5EEB3Ds0HfMyllZzbBrmLOX3vfWoj9A9PEnTfratQ/BcaJOuMhnkhjXvb2+FkY3VuHqtAGpTPmglFQ==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=10"
"node": ">=18"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
@@ -3162,15 +3117,43 @@
"dev": true
},
"node_modules/log-symbols": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/log-symbols/-/log-symbols-4.1.0.tgz",
"integrity": "sha512-8XPvpAA8uyhfteu8pIvQxpJZ7SYYdpUivZpGy6sFsBuKRY/7rQGavedeB8aK+Zkyq6upMFVL/9AW6vOYzfRyLg==",
"version": "6.0.0",
"resolved": "https://registry.npmjs.org/log-symbols/-/log-symbols-6.0.0.tgz",
"integrity": "sha512-i24m8rpwhmPIS4zscNzK6MSEhk0DUWa/8iYQWxhffV8jkI4Phvs3F+quL5xvS0gdQR0FyTCMMH33Y78dDTzzIw==",
"dev": true,
"license": "MIT",
"dependencies": {
"chalk": "^4.1.0",
"is-unicode-supported": "^0.1.0"
"chalk": "^5.3.0",
"is-unicode-supported": "^1.3.0"
},
"engines": {
"node": ">=10"
"node": ">=18"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/log-symbols/node_modules/chalk": {
"version": "5.4.1",
"resolved": "https://registry.npmjs.org/chalk/-/chalk-5.4.1.tgz",
"integrity": "sha512-zgVZuo2WcZgfUEmsn6eO3kINexW8RAE4maiQ8QNs8CtpPCSyMiYsULR3HQYkm3w8FIA3SberyMJMSldGsW+U3w==",
"dev": true,
"license": "MIT",
"engines": {
"node": "^12.17.0 || ^14.13 || >=16.0.0"
},
"funding": {
"url": "https://github.com/chalk/chalk?sponsor=1"
}
},
"node_modules/log-symbols/node_modules/is-unicode-supported": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/is-unicode-supported/-/is-unicode-supported-1.3.0.tgz",
"integrity": "sha512-43r2mRvz+8JRIKnWJ+3j8JtjRKZ6GmjzfaE/qiBJnikNnYv/6bagRJ1kUhNk8R5EX/GkobD+r+sfxCPJsiKBLQ==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
@@ -3277,10 +3260,24 @@
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/mimic-fn/-/mimic-fn-2.1.0.tgz",
"integrity": "sha512-OqbOk5oEQeAZ8WXWydlu9HJjz9WVdEIvamMCcXmuqUYjTknH/sqsWvhQ3vgwKFRR1HpjvNBKQ37nbJgYzGqGcg==",
"dev": true,
"engines": {
"node": ">=6"
}
},
"node_modules/mimic-function": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/mimic-function/-/mimic-function-5.0.1.tgz",
"integrity": "sha512-VP79XUPxV2CigYP3jWwAUFSku2aKqBH7uTAapFWCBqutsbmDo96KY5o8uh6U+/YSIn5OxJnXp73beVkpqMIGhA==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=18"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/minimatch": {
"version": "3.1.2",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz",
@@ -3298,6 +3295,7 @@
"resolved": "https://registry.npmjs.org/mock-fs/-/mock-fs-5.5.0.tgz",
"integrity": "sha512-d/P1M/RacgM3dB0sJ8rjeRNXxtapkPCUnMGmIN0ixJ16F/E4GUZCvWcSGfWGz8eaXYvn1s9baUwNjI4LOPEjiA==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=12.0.0"
}
@@ -3360,6 +3358,7 @@
"version": "5.1.2",
"resolved": "https://registry.npmjs.org/onetime/-/onetime-5.1.2.tgz",
"integrity": "sha512-kbpaSSGJTWdAY5KPVeMOKXSrPtr8C8C7wodJbcsd51jRnmD+GZu8Y0VoU6Dm5Z4vWr0Ig/1NKuWRKf7j5aaYSg==",
"dev": true,
"dependencies": {
"mimic-fn": "^2.1.0"
},
@@ -3371,27 +3370,96 @@
}
},
"node_modules/ora": {
"version": "5.4.1",
"resolved": "https://registry.npmjs.org/ora/-/ora-5.4.1.tgz",
"integrity": "sha512-5b6Y85tPxZZ7QytO+BQzysW31HJku27cRIlkbAXaNx+BdcVi+LlRFmVXzeF6a7JCwJpyw5c4b+YSVImQIrBpuQ==",
"version": "8.2.0",
"resolved": "https://registry.npmjs.org/ora/-/ora-8.2.0.tgz",
"integrity": "sha512-weP+BZ8MVNnlCm8c0Qdc1WSWq4Qn7I+9CJGm7Qali6g44e/PUzbjNqJX5NJ9ljlNMosfJvg1fKEGILklK9cwnw==",
"dev": true,
"license": "MIT",
"dependencies": {
"bl": "^4.1.0",
"chalk": "^4.1.0",
"cli-cursor": "^3.1.0",
"cli-spinners": "^2.5.0",
"is-interactive": "^1.0.0",
"is-unicode-supported": "^0.1.0",
"log-symbols": "^4.1.0",
"strip-ansi": "^6.0.0",
"wcwidth": "^1.0.1"
"chalk": "^5.3.0",
"cli-cursor": "^5.0.0",
"cli-spinners": "^2.9.2",
"is-interactive": "^2.0.0",
"is-unicode-supported": "^2.0.0",
"log-symbols": "^6.0.0",
"stdin-discarder": "^0.2.2",
"string-width": "^7.2.0",
"strip-ansi": "^7.1.0"
},
"engines": {
"node": ">=10"
"node": ">=18"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/ora/node_modules/ansi-regex": {
"version": "6.1.0",
"resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-6.1.0.tgz",
"integrity": "sha512-7HSX4QQb4CspciLpVFwyRe79O3xsIZDDLER21kERQ71oaPodF8jL725AgJMFAYbooIqolJoRLuM81SpeUkpkvA==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/chalk/ansi-regex?sponsor=1"
}
},
"node_modules/ora/node_modules/chalk": {
"version": "5.4.1",
"resolved": "https://registry.npmjs.org/chalk/-/chalk-5.4.1.tgz",
"integrity": "sha512-zgVZuo2WcZgfUEmsn6eO3kINexW8RAE4maiQ8QNs8CtpPCSyMiYsULR3HQYkm3w8FIA3SberyMJMSldGsW+U3w==",
"dev": true,
"license": "MIT",
"engines": {
"node": "^12.17.0 || ^14.13 || >=16.0.0"
},
"funding": {
"url": "https://github.com/chalk/chalk?sponsor=1"
}
},
"node_modules/ora/node_modules/emoji-regex": {
"version": "10.4.0",
"resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-10.4.0.tgz",
"integrity": "sha512-EC+0oUMY1Rqm4O6LLrgjtYDvcVYTy7chDnM4Q7030tP4Kwj3u/pR6gP9ygnp2CJMK5Gq+9Q2oqmrFJAz01DXjw==",
"dev": true,
"license": "MIT"
},
"node_modules/ora/node_modules/string-width": {
"version": "7.2.0",
"resolved": "https://registry.npmjs.org/string-width/-/string-width-7.2.0.tgz",
"integrity": "sha512-tsaTIkKW9b4N+AEj+SVA+WhJzV7/zMhcSu78mLKWSk7cXMOSHsBKFWUs0fWwq8QyK3MgJBQRX6Gbi4kYbdvGkQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"emoji-regex": "^10.3.0",
"get-east-asian-width": "^1.0.0",
"strip-ansi": "^7.1.0"
},
"engines": {
"node": ">=18"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/ora/node_modules/strip-ansi": {
"version": "7.1.0",
"resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-7.1.0.tgz",
"integrity": "sha512-iq6eVVI64nQQTRYq2KtEg2d2uU7LElhTJwsH4YzIHZshxlgZms/wIc4VoDQTlG/IvVIrBKG06CrZnp0qv7hkcQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"ansi-regex": "^6.0.1"
},
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/chalk/strip-ansi?sponsor=1"
}
},
"node_modules/p-limit": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/p-limit/-/p-limit-3.1.0.tgz",
@@ -3599,19 +3667,6 @@
"integrity": "sha512-/LLMVyas0ljjAtoYiPqYiL8VWXzUUdThrmU5+n20DZv+a+ClRoevUzw5JxU+Ieh5/c87ytoTBV9G1FiKfNJdmg==",
"dev": true
},
"node_modules/readable-stream": {
"version": "3.6.2",
"resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-3.6.2.tgz",
"integrity": "sha512-9u/sniCrY3D5WdsERHzHE4G2YCXqoG5FTHUiCC4SIbr6XcLZBY05ya9EKjYek9O5xOAwjGq+1JdGBAS7Q9ScoA==",
"dependencies": {
"inherits": "^2.0.3",
"string_decoder": "^1.1.1",
"util-deprecate": "^1.0.1"
},
"engines": {
"node": ">= 6"
}
},
"node_modules/require-directory": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/require-directory/-/require-directory-2.1.1.tgz",
@@ -3672,35 +3727,50 @@
}
},
"node_modules/restore-cursor": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/restore-cursor/-/restore-cursor-3.1.0.tgz",
"integrity": "sha512-l+sSefzHpj5qimhFSE5a8nufZYAM3sBSVMAPtYkmC+4EH2anSGaEMXSD0izRQbu9nfyQ9y5JrVmp7E8oZrUjvA==",
"version": "5.1.0",
"resolved": "https://registry.npmjs.org/restore-cursor/-/restore-cursor-5.1.0.tgz",
"integrity": "sha512-oMA2dcrw6u0YfxJQXm342bFKX/E4sG9rbTzO9ptUcR/e8A33cHuvStiYOwH7fszkZlZ1z/ta9AAoPk2F4qIOHA==",
"dev": true,
"license": "MIT",
"dependencies": {
"onetime": "^5.1.0",
"signal-exit": "^3.0.2"
"onetime": "^7.0.0",
"signal-exit": "^4.1.0"
},
"engines": {
"node": ">=8"
"node": ">=18"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/safe-buffer": {
"version": "5.2.1",
"resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.1.tgz",
"integrity": "sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
]
"node_modules/restore-cursor/node_modules/onetime": {
"version": "7.0.0",
"resolved": "https://registry.npmjs.org/onetime/-/onetime-7.0.0.tgz",
"integrity": "sha512-VXJjc87FScF88uafS3JllDgvAm+c/Slfz06lorj2uAY34rlUu0Nt+v8wreiImcrgAjjIHp1rXpTDlLOGw29WwQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"mimic-function": "^5.0.0"
},
"engines": {
"node": ">=18"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/restore-cursor/node_modules/signal-exit": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/signal-exit/-/signal-exit-4.1.0.tgz",
"integrity": "sha512-bzyZ1e88w9O1iNJbKnOlvYTrWPDl46O1bG0D3XInv+9tkPrxrN8jUUTiFlDkkmKWgn1M6CfIA13SuGqOa9Korw==",
"dev": true,
"license": "ISC",
"engines": {
"node": ">=14"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
}
},
"node_modules/semver": {
"version": "6.3.1",
@@ -3735,7 +3805,8 @@
"node_modules/signal-exit": {
"version": "3.0.7",
"resolved": "https://registry.npmjs.org/signal-exit/-/signal-exit-3.0.7.tgz",
"integrity": "sha512-wnD2ZE+l+SPC/uoS0vXeE9L1+0wuaMqKlfz9AMUo38JsyLSBWSFcHR1Rri62LZc12vLr1gb3jl7iwQhgwpAbGQ=="
"integrity": "sha512-wnD2ZE+l+SPC/uoS0vXeE9L1+0wuaMqKlfz9AMUo38JsyLSBWSFcHR1Rri62LZc12vLr1gb3jl7iwQhgwpAbGQ==",
"dev": true
},
"node_modules/sisteransi": {
"version": "1.0.5",
@@ -3789,12 +3860,17 @@
"node": ">=10"
}
},
"node_modules/string_decoder": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-1.3.0.tgz",
"integrity": "sha512-hkRX8U1WjJFd8LsDJ2yQ/wWWxaopEsABU1XfkM8A+j0+85JAGppt16cr1Whg6KIbb4okU6Mql6BOj+uup/wKeA==",
"dependencies": {
"safe-buffer": "~5.2.0"
"node_modules/stdin-discarder": {
"version": "0.2.2",
"resolved": "https://registry.npmjs.org/stdin-discarder/-/stdin-discarder-0.2.2.tgz",
"integrity": "sha512-UhDfHmA92YAlNnCfhmq0VeNL5bDbiZGg7sZ2IvPsXubGkiNa9EC+tUTsjBRsYUAz87btI6/1wf4XoVvQ3uRnmQ==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=18"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/string-length": {
@@ -3814,6 +3890,7 @@
"version": "4.2.3",
"resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
"integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
"dev": true,
"dependencies": {
"emoji-regex": "^8.0.0",
"is-fullwidth-code-point": "^3.0.0",
@@ -3827,6 +3904,7 @@
"version": "6.0.1",
"resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz",
"integrity": "sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==",
"dev": true,
"dependencies": {
"ansi-regex": "^5.0.1"
},
@@ -4105,11 +4183,6 @@
"browserslist": ">= 4.21.0"
}
},
"node_modules/util-deprecate": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz",
"integrity": "sha512-EPD5q1uXyFxJpCrLnCc1nHnq3gOa6DZBocAIiI2TaSCA7VCJ1UJDMagCzIkXNsUYfD1daK//LTEQ8xiIbrHtcw=="
},
"node_modules/uuid": {
"version": "9.0.1",
"resolved": "https://registry.npmjs.org/uuid/-/uuid-9.0.1.tgz",
@@ -4151,14 +4224,6 @@
"makeerror": "1.0.12"
}
},
"node_modules/wcwidth": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/wcwidth/-/wcwidth-1.0.1.tgz",
"integrity": "sha512-XHPEwS0q6TaxcvG85+8EYkbiCux2XtWG2mkc47Ng2A77BQu9+DqIOJldST4HgPkuea7dvKSj5VgX3P1d4rW8Tg==",
"dependencies": {
"defaults": "^1.0.3"
}
},
"node_modules/which": {
"version": "2.0.2",
"resolved": "https://registry.npmjs.org/which/-/which-2.0.2.tgz",
@@ -4229,6 +4294,8 @@
"version": "2.8.0",
"resolved": "https://registry.npmjs.org/yaml/-/yaml-2.8.0.tgz",
"integrity": "sha512-4lLa/EcQCB0cJkyts+FpIRx5G/llPxfP6VQU5KByHEhLxY3IJCH0f0Hy1MHI8sClTvsIb8qwRJ6R/ZdlDJ/leQ==",
"dev": true,
"license": "ISC",
"bin": {
"yaml": "bin.mjs"
},


@@ -20,22 +20,23 @@
"dependencies": {
"axios": "^1.6.2",
"chalk": "^4.1.2",
"cli-table3": "^0.6.3",
"commander": "^14.0.0",
"dotenv": "^16.3.1",
"ora": "^5.4.1",
"uuid": "^9.0.0",
"yaml": "^2.3.4"
"uuid": "^9.0.0"
},
"devDependencies": {
"@types/jest": "^29.5.0",
"@types/mock-fs": "^4.13.4",
"@types/node": "^20.10.0",
"@types/ora": "^3.1.0",
"@types/uuid": "^9.0.8",
"cli-table3": "^0.6.5",
"jest": "^29.5.0",
"mock-fs": "^5.5.0",
"ora": "^8.2.0",
"ts-jest": "^29.1.0",
"ts-node": "^10.9.2",
"typescript": "^5.3.2"
"typescript": "^5.3.2",
"yaml": "^2.8.0"
}
}

coverage-combined/lcov.info (new file, +3819 lines)
File diff suppressed because it is too large.


@@ -34,6 +34,7 @@ services:
- GITHUB_TOKEN=${GITHUB_TOKEN}
- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
- GITHUB_WEBHOOK_SECRET=${GITHUB_WEBHOOK_SECRET}
- CLAUDE_WEBHOOK_SECRET=${CLAUDE_WEBHOOK_SECRET}
restart: unless-stopped
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:${PORT:-3002}/health"]

get-session.json (new file, +4 lines)

@@ -0,0 +1,4 @@
{
"type": "session.get",
"sessionId": "d4ac40bf-1290-4237-83fe-53a4a6197dc5"
}


@@ -26,6 +26,7 @@ module.exports = {
collectCoverageFrom: [
'src/**/*.{js,ts}',
'!src/**/*.d.ts',
'!src/types/**/*.ts',
'!**/node_modules/**',
'!**/dist/**'
],


@@ -19,6 +19,7 @@
"test:coverage": "jest --coverage",
"test:watch": "jest --watch",
"test:ci": "jest --ci --coverage --testPathPattern='test/(unit|integration).*\\.test\\.(js|ts)$'",
"test:combined-coverage": "./scripts/combine-coverage.js",
"test:docker": "docker-compose -f docker-compose.test.yml run --rm test",
"test:docker:integration": "docker-compose -f docker-compose.test.yml run --rm integration-test",
"test:docker:e2e": "docker-compose -f docker-compose.test.yml run --rm e2e-test",

scripts/combine-coverage.js (executable file, +88 lines)

@@ -0,0 +1,88 @@
#!/usr/bin/env node
const { execSync } = require('child_process');
const fs = require('fs');
const path = require('path');
/**
* Combine coverage reports from main project and CLI
*/
// Ensure coverage directories exist
const mainCoverageDir = path.join(__dirname, '..', 'coverage');
const cliCoverageDir = path.join(__dirname, '..', 'cli', 'coverage');
const combinedCoverageDir = path.join(__dirname, '..', 'coverage-combined');
// Create combined coverage directory
if (!fs.existsSync(combinedCoverageDir)) {
fs.mkdirSync(combinedCoverageDir, { recursive: true });
}
console.log('Generating main project coverage...');
try {
execSync('npm run test:ci', { stdio: 'inherit', cwd: path.join(__dirname, '..') });
} catch (error) {
console.error('Failed to generate main project coverage');
process.exit(1);
}
console.log('\nGenerating CLI coverage...');
try {
execSync('npm run test:coverage', { stdio: 'inherit', cwd: path.join(__dirname, '..', 'cli') });
} catch (error) {
console.error('Failed to generate CLI coverage');
process.exit(1);
}
// Check if both coverage files exist
const mainLcov = path.join(mainCoverageDir, 'lcov.info');
const cliLcov = path.join(cliCoverageDir, 'lcov.info');
if (!fs.existsSync(mainLcov)) {
console.error('Main project lcov.info not found');
process.exit(1);
}
if (!fs.existsSync(cliLcov)) {
console.error('CLI lcov.info not found');
process.exit(1);
}
// Read both lcov files
const mainLcovContent = fs.readFileSync(mainLcov, 'utf8');
const cliLcovContent = fs.readFileSync(cliLcov, 'utf8');
// Adjust CLI paths to be relative to project root
const adjustedCliLcov = cliLcovContent.replace(/SF:src\//g, 'SF:cli/src/');
// Combine lcov files
const combinedLcov = mainLcovContent + '\n' + adjustedCliLcov;
// Write combined lcov file
const combinedLcovPath = path.join(combinedCoverageDir, 'lcov.info');
fs.writeFileSync(combinedLcovPath, combinedLcov);
console.log('\nCombined coverage report written to:', combinedLcovPath);
// Copy coverage-final.json files as well for better reporting
if (fs.existsSync(path.join(mainCoverageDir, 'coverage-final.json'))) {
const mainJson = JSON.parse(fs.readFileSync(path.join(mainCoverageDir, 'coverage-final.json'), 'utf8'));
const cliJson = JSON.parse(fs.readFileSync(path.join(cliCoverageDir, 'coverage-final.json'), 'utf8'));
// Adjust CLI paths in JSON
const adjustedCliJson = {};
for (const [key, value] of Object.entries(cliJson)) {
const adjustedKey = key.replace(/^src\//, 'cli/src/');
adjustedCliJson[adjustedKey] = value;
}
// Combine JSON coverage
const combinedJson = { ...mainJson, ...adjustedCliJson };
fs.writeFileSync(
path.join(combinedCoverageDir, 'coverage-final.json'),
JSON.stringify(combinedJson, null, 2)
);
}
console.log('\nCoverage combination complete!');
console.log('Upload coverage-combined/lcov.info to Codecov for full project coverage.');


@@ -149,19 +149,37 @@ else
echo "DEBUG: Using $CLAUDE_USER_HOME as HOME for Claude CLI (fallback)" >&2
fi
sudo -u node -E env \
HOME="$CLAUDE_USER_HOME" \
PATH="/usr/local/bin:/usr/local/share/npm-global/bin:$PATH" \
ANTHROPIC_API_KEY="${ANTHROPIC_API_KEY}" \
GH_TOKEN="${GITHUB_TOKEN}" \
GITHUB_TOKEN="${GITHUB_TOKEN}" \
BASH_DEFAULT_TIMEOUT_MS="${BASH_DEFAULT_TIMEOUT_MS}" \
BASH_MAX_TIMEOUT_MS="${BASH_MAX_TIMEOUT_MS}" \
/usr/local/share/npm-global/bin/claude \
--allowedTools "${ALLOWED_TOOLS}" \
--verbose \
--print "${COMMAND}" \
> "${RESPONSE_FILE}" 2>&1
if [ "${OUTPUT_FORMAT}" = "stream-json" ]; then
# For stream-json, output directly to stdout for real-time processing
exec sudo -u node -E env \
HOME="$CLAUDE_USER_HOME" \
PATH="/usr/local/bin:/usr/local/share/npm-global/bin:$PATH" \
ANTHROPIC_API_KEY="${ANTHROPIC_API_KEY}" \
GH_TOKEN="${GITHUB_TOKEN}" \
GITHUB_TOKEN="${GITHUB_TOKEN}" \
BASH_DEFAULT_TIMEOUT_MS="${BASH_DEFAULT_TIMEOUT_MS}" \
BASH_MAX_TIMEOUT_MS="${BASH_MAX_TIMEOUT_MS}" \
/usr/local/share/npm-global/bin/claude \
--allowedTools "${ALLOWED_TOOLS}" \
--output-format stream-json \
--verbose \
--print "${COMMAND}"
else
# Default behavior - write to file
sudo -u node -E env \
HOME="$CLAUDE_USER_HOME" \
PATH="/usr/local/bin:/usr/local/share/npm-global/bin:$PATH" \
ANTHROPIC_API_KEY="${ANTHROPIC_API_KEY}" \
GH_TOKEN="${GITHUB_TOKEN}" \
GITHUB_TOKEN="${GITHUB_TOKEN}" \
BASH_DEFAULT_TIMEOUT_MS="${BASH_DEFAULT_TIMEOUT_MS}" \
BASH_MAX_TIMEOUT_MS="${BASH_MAX_TIMEOUT_MS}" \
/usr/local/share/npm-global/bin/claude \
--allowedTools "${ALLOWED_TOOLS}" \
--verbose \
--print "${COMMAND}" \
> "${RESPONSE_FILE}" 2>&1
fi
# Check for errors
if [ $? -ne 0 ]; then

session-request.json (new file, +9 lines)

@@ -0,0 +1,9 @@
{
"type": "session.create",
"session": {
"project": {
"repository": "Cheffromspace/demo-repository",
"requirements": "Implement a hello world program in Python that prints 'Hello, World!' to the console. Create the file as hello_world.py in the root directory. After implementing, create a pull request with the changes."
}
}
}


@@ -48,7 +48,7 @@ type SessionPayload =
* Provides CRUD operations for MCP integration
*/
export class SessionHandler implements WebhookEventHandler<ClaudeWebhookPayload> {
event = 'session';
event = 'session*';
private sessionManager: SessionManager;
constructor() {
@@ -145,6 +145,9 @@ export class SessionHandler implements WebhookEventHandler<ClaudeWebhookPayload>
status: 'initializing' as const
};
// Update the session in SessionManager with containerId
this.sessionManager.updateSession(createdSession);
logger.info('Session created', {
sessionId: createdSession.id,
type: createdSession.type,
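Note the handler's event changed from 'session' to 'session*', presumably so a single registration covers session.create, session.start, session.get, session.list and session.output. A sketch of that kind of prefix-wildcard matching (an assumption about the registry's behavior, not code from this diff):

// Hypothetical matcher; the real WebhookRegistry implementation is not part of this diff
function matchesEvent(pattern: string, event: string): boolean {
  return pattern.endsWith('*') ? event.startsWith(pattern.slice(0, -1)) : event === pattern;
}

matchesEvent('session*', 'session.create'); // true
matchesEvent('session*', 'orchestrate');    // false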


@@ -23,51 +23,22 @@ export class SessionManager {
// Generate container name
const containerName = `claude-${session.type}-${session.id.substring(0, 8)}`;
// Get Docker image from environment
const dockerImage = process.env.CLAUDE_CONTAINER_IMAGE ?? 'claudecode:latest';
// Set up volume mounts for persistent storage
const volumeName = `claude-session-${session.id.substring(0, 8)}`;
const volumeName = `${containerName}-volume`;
// Create container without starting it
const createCmd = [
'docker',
'create',
'--name',
containerName,
'--rm',
'-v',
`${volumeName}:/home/user/project`,
'-v',
`${volumeName}-claude:/home/user/.claude`,
'-e',
`SESSION_ID=${session.id}`,
'-e',
`SESSION_TYPE=${session.type}`,
'-e',
`GITHUB_TOKEN=${process.env.GITHUB_TOKEN ?? ''}`,
'-e',
`ANTHROPIC_API_KEY=${process.env.ANTHROPIC_API_KEY ?? ''}`,
'-e',
`REPOSITORY=${session.project.repository}`,
'-e',
`OPERATION_TYPE=session`,
'--workdir',
'/home/user/project',
dockerImage,
'/scripts/runtime/claudecode-entrypoint.sh'
];
logger.info('Creating container resources', { sessionId: session.id, containerName });
execSync(createCmd.join(' '), { stdio: 'pipe' });
// Create volume for workspace
execSync(`docker volume create ${volumeName}`, { stdio: 'pipe' });
logger.info('Container created', { sessionId: session.id, containerName });
logger.info('Container resources created', { sessionId: session.id, containerName });
// Store session
this.sessions.set(session.id, session);
return Promise.resolve(containerName);
} catch (error) {
logger.error('Failed to create container', { sessionId: session.id, error });
logger.error('Failed to create container resources', { sessionId: session.id, error });
throw error;
}
}
@@ -91,34 +62,84 @@ export class SessionManager {
// Prepare the command based on session type
const command = this.buildSessionCommand(session);
// Start the container and execute Claude
// Get Docker image from environment
const dockerImage = process.env.CLAUDE_CONTAINER_IMAGE ?? 'claudecode:latest';
// Start the container and execute Claude with stream-json output
const execCmd = [
'docker',
'exec',
'-i',
'run',
'--rm',
'--name',
session.containerId,
'claude',
'chat',
'--no-prompt',
'-m',
command
'-v',
`${session.containerId}-volume:/home/user/project`,
'-v',
`${process.env.CLAUDE_AUTH_HOST_DIR ?? process.env.HOME + '/.claude-hub'}:/home/node/.claude`,
'-e',
`SESSION_ID=${session.id}`,
'-e',
`SESSION_TYPE=${session.type}`,
'-e',
`GITHUB_TOKEN=${process.env.GITHUB_TOKEN ?? ''}`,
'-e',
`REPO_FULL_NAME=${session.project.repository}`,
'-e',
`COMMAND=${command}`,
'-e',
`OPERATION_TYPE=session`,
'-e',
`OUTPUT_FORMAT=stream-json`,
dockerImage
];
// First start the container
execSync(`docker start ${session.containerId}`, { stdio: 'pipe' });
// Then execute Claude command
// Start the container with Claude command
const dockerProcess = spawn(execCmd[0], execCmd.slice(1), {
env: process.env
env: process.env,
detached: true
});
// Collect output
const logs: string[] = [];
let firstLineProcessed = false;
dockerProcess.stdout.on('data', data => {
const line = data.toString();
logs.push(line);
logger.debug('Session output', { sessionId: session.id, line });
const lines = data
.toString()
.split('\n')
.filter((line: string) => line.trim());
for (const line of lines) {
logs.push(line);
// Process first line to get Claude session ID
if (!firstLineProcessed && line.trim()) {
firstLineProcessed = true;
try {
const initData = JSON.parse(line);
if (
initData.type === 'system' &&
initData.subtype === 'init' &&
initData.session_id
) {
session.claudeSessionId = initData.session_id;
this.sessions.set(session.id, session);
logger.info('Captured Claude session ID', {
sessionId: session.id,
claudeSessionId: session.claudeSessionId
});
}
} catch (err) {
logger.error('Failed to parse first line as JSON', {
sessionId: session.id,
line,
err
});
}
}
logger.debug('Session output', { sessionId: session.id, line });
}
});
dockerProcess.stderr.on('data', data => {
@@ -143,6 +164,9 @@ export class SessionManager {
this.notifyWaitingSessions(session.id);
});
// Unref the process so it can run independently
dockerProcess.unref();
return Promise.resolve();
} catch (error) {
logger.error('Failed to start session', { sessionId: session.id, error });
@@ -183,6 +207,13 @@ export class SessionManager {
return this.sessions.get(sessionId);
}
/**
* Update session
*/
updateSession(session: ClaudeSession): void {
this.sessions.set(session.id, session);
}
/**
* Get all sessions for an orchestration
*/
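For clarity, the resource-naming scheme introduced at the top of this diff ties the volume to the container name rather than to a separate claude-session-* prefix; an illustrative example using the session ID from get-session.json above:

const sessionId = 'd4ac40bf-1290-4237-83fe-53a4a6197dc5';
const sessionType = 'implementation';
const containerName = `claude-${sessionType}-${sessionId.substring(0, 8)}`; // claude-implementation-d4ac40bf
const volumeName = `${containerName}-volume`;                               // claude-implementation-d4ac40bf-volume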


@@ -90,6 +90,7 @@ export interface ClaudeSession {
type: SessionType;
status: SessionStatus;
containerId?: string;
claudeSessionId?: string; // Claude's internal session ID
project: ProjectInfo;
dependencies: string[];
startedAt?: Date;


@@ -38,6 +38,10 @@ class SecureCredentials {
GITHUB_WEBHOOK_SECRET: {
file: process.env['GITHUB_WEBHOOK_SECRET_FILE'] ?? '/run/secrets/webhook_secret',
env: 'GITHUB_WEBHOOK_SECRET'
},
CLAUDE_WEBHOOK_SECRET: {
file: process.env['CLAUDE_WEBHOOK_SECRET_FILE'] ?? '/run/secrets/claude_webhook_secret',
env: 'CLAUDE_WEBHOOK_SECRET'
}
};
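The SecureCredentials entry above maps the new secret to either a mounted secret file or an environment variable; a minimal sketch of a file-first, env-fallback resolution (an assumption about the class's behavior inferred from the mapping, not shown in this diff):

import { existsSync, readFileSync } from 'fs';

// Assumed resolution order: secret file if present, otherwise the environment variable
function resolveCredential(filePath: string, envName: string): string | undefined {
  if (existsSync(filePath)) {
    return readFileSync(filePath, 'utf8').trim();
  }
  return process.env[envName];
}

const claudeWebhookSecret = resolveCredential(
  process.env['CLAUDE_WEBHOOK_SECRET_FILE'] ?? '/run/secrets/claude_webhook_secret',
  'CLAUDE_WEBHOOK_SECRET'
);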

start-session-new.json (new file, +4 lines)

@@ -0,0 +1,4 @@
{
"type": "session.start",
"sessionId": "d4ac40bf-1290-4237-83fe-53a4a6197dc5"
}

start-session.json (new file, +4 lines)

@@ -0,0 +1,4 @@
{
"type": "session.start",
"sessionId": "aa592787-6451-45fd-8413-229260a18b45"
}


@@ -1,394 +0,0 @@
import request from 'supertest';
import express from 'express';
// Mock child_process to prevent Docker commands
jest.mock('child_process', () => ({
execSync: jest.fn(() => ''),
spawn: jest.fn(() => ({
stdout: { on: jest.fn() },
stderr: { on: jest.fn() },
on: jest.fn((event, callback) => {
if (event === 'close') {
setTimeout(() => callback(0), 100);
}
})
}))
}));
// Mock SessionManager to avoid Docker calls in CI
jest.mock('../../../src/providers/claude/services/SessionManager', () => {
return {
SessionManager: jest.fn().mockImplementation(() => ({
createContainer: jest.fn().mockResolvedValue('mock-container-id'),
startSession: jest.fn().mockResolvedValue(undefined),
getSession: jest.fn().mockImplementation(id => ({
id,
status: 'running',
type: 'implementation',
project: { repository: 'test/repo', requirements: 'test' },
dependencies: []
})),
listSessions: jest.fn().mockResolvedValue([]),
getSessionOutput: jest.fn().mockResolvedValue({ output: 'test output' }),
canStartSession: jest.fn().mockResolvedValue(true),
updateSessionStatus: jest.fn().mockResolvedValue(undefined)
}))
};
});
// Now we can import the routes
import webhookRoutes from '../../../src/routes/webhooks';
// Mock environment variables
process.env.CLAUDE_WEBHOOK_SECRET = 'test-secret';
process.env.SKIP_WEBHOOK_VERIFICATION = '1';
describe('Claude Session Integration Tests', () => {
let app: express.Application;
beforeAll(() => {
// Import provider to register handlers
require('../../../src/providers/claude');
});
beforeEach(() => {
app = express();
app.use(express.json());
app.use('/api/webhooks', webhookRoutes);
});
afterEach(() => {
jest.clearAllMocks();
});
describe('POST /api/webhooks/claude - Session Management', () => {
it('should create a new session', async () => {
const payload = {
data: {
type: 'session.create',
session: {
project: {
repository: 'owner/repo',
requirements: 'Test requirements'
}
}
}
};
const response = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer test-secret')
.send(payload);
expect(response.status).toBe(200);
expect(response.body.success).toBe(true);
expect(response.body.data.session).toMatchObject({
type: 'implementation',
status: 'initializing',
project: {
repository: 'owner/repo',
requirements: 'Test requirements'
}
});
expect(response.body.data.session.id).toBeDefined();
expect(response.body.data.session.containerId).toBeDefined();
});
it('should create session with custom type', async () => {
const payload = {
data: {
type: 'session.create',
session: {
type: 'analysis',
project: {
repository: 'owner/repo',
requirements: 'Test requirements'
}
}
}
};
const response = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer test-secret')
.send(payload);
expect(response.status).toBe(200);
expect(response.body.success).toBe(true);
expect(response.body.data.session.type).toBe('analysis');
});
it('should reject session creation without repository', async () => {
const payload = {
data: {
type: 'session.create',
session: {
project: {
requirements: 'Test requirements'
}
}
}
};
const response = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer test-secret')
.send(payload);
expect(response.status).toBe(200);
expect(response.body.success).toBe(false);
expect(response.body.error).toBe('Repository is required for session creation');
});
it('should reject session creation without requirements', async () => {
const payload = {
data: {
type: 'session.create',
session: {
project: {
repository: 'owner/repo'
}
}
}
};
const response = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer test-secret')
.send(payload);
expect(response.status).toBe(200);
expect(response.body.success).toBe(false);
expect(response.body.error).toBe('Requirements are required for session creation');
});
it('should handle session.get request', async () => {
// First create a session
const createPayload = {
data: {
type: 'session.create',
session: {
project: {
repository: 'owner/repo',
requirements: 'Test requirements'
}
}
}
};
const createResponse = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer test-secret')
.send(createPayload);
const sessionId = createResponse.body.data.session.id;
// Then get the session
const getPayload = {
data: {
type: 'session.get',
sessionId
}
};
const getResponse = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer test-secret')
.send(getPayload);
expect(getResponse.status).toBe(200);
expect(getResponse.body.success).toBe(true);
expect(getResponse.body.data.session.id).toBe(sessionId);
});
it('should handle session.list request', async () => {
const payload = {
data: {
type: 'session.list'
}
};
const response = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer test-secret')
.send(payload);
expect(response.status).toBe(200);
expect(response.body.success).toBe(true);
expect(response.body.data.sessions).toBeDefined();
expect(Array.isArray(response.body.data.sessions)).toBe(true);
});
it('should handle session.start request', async () => {
// Create a session first
const createPayload = {
data: {
type: 'session.create',
session: {
project: {
repository: 'owner/repo',
requirements: 'Test requirements'
}
}
}
};
const createResponse = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer test-secret')
.send(createPayload);
const sessionId = createResponse.body.data.session.id;
// Start the session
const startPayload = {
data: {
type: 'session.start',
sessionId
}
};
const startResponse = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer test-secret')
.send(startPayload);
expect(startResponse.status).toBe(200);
expect(startResponse.body.success).toBe(true);
expect(startResponse.body.message).toBe('Session started');
});
it('should handle session.output request', async () => {
// Create a session first
const createPayload = {
data: {
type: 'session.create',
session: {
project: {
repository: 'owner/repo',
requirements: 'Test requirements'
}
}
}
};
const createResponse = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer test-secret')
.send(createPayload);
const sessionId = createResponse.body.data.session.id;
// Get session output
const outputPayload = {
data: {
type: 'session.output',
sessionId
}
};
const outputResponse = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer test-secret')
.send(outputPayload);
expect(outputResponse.status).toBe(200);
expect(outputResponse.body.success).toBe(true);
expect(outputResponse.body.data.sessionId).toBe(sessionId);
expect(outputResponse.body.data.output).toBeNull(); // No output yet
});
it('should reject requests without authentication', async () => {
const payload = {
data: {
type: 'session.create',
session: {
project: {
repository: 'owner/repo',
requirements: 'Test'
}
}
}
};
const response = await request(app).post('/api/webhooks/claude').send(payload);
expect(response.status).toBe(401);
expect(response.body.error).toBe('Unauthorized');
});
it('should reject requests with invalid authentication', async () => {
const payload = {
data: {
type: 'session.create',
session: {
project: {
repository: 'owner/repo',
requirements: 'Test'
}
}
}
};
const response = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer wrong-secret')
.send(payload);
expect(response.status).toBe(401);
expect(response.body.error).toBe('Unauthorized');
});
});
describe('POST /api/webhooks/claude - Orchestration', () => {
it('should create orchestration session', async () => {
const payload = {
data: {
type: 'orchestrate',
project: {
repository: 'owner/repo',
requirements: 'Build a complete e-commerce platform'
}
}
};
const response = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer test-secret')
.send(payload);
expect(response.status).toBe(200);
expect(response.body.success).toBe(true);
expect(response.body.message).toBe('Orchestration session created');
expect(response.body.data).toMatchObject({
status: 'initiated',
summary: 'Created orchestration session for owner/repo'
});
expect(response.body.data.orchestrationId).toBeDefined();
expect(response.body.data.sessions).toHaveLength(1);
expect(response.body.data.sessions[0].type).toBe('coordination');
});
it('should create orchestration session without auto-start', async () => {
const payload = {
data: {
type: 'orchestrate',
autoStart: false,
project: {
repository: 'owner/repo',
requirements: 'Analyze and plan implementation'
}
}
};
const response = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer test-secret')
.send(payload);
expect(response.status).toBe(200);
expect(response.body.success).toBe(true);
expect(response.body.data.sessions[0].status).toBe('initializing');
});
});
});
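Editor's note: the two authentication tests above fix the contract for this route — a missing or mismatched Bearer token yields 401 with { error: 'Unauthorized' }, while any authorized request flows through to the handlers. A minimal Express middleware sketch consistent with that contract, assuming the expected token is read from CLAUDE_WEBHOOK_SECRET as elsewhere in this changeset (the function name and wiring are illustrative, not the repository's actual code):

import type { Request, Response, NextFunction } from 'express';

// Sketch only: a Bearer-token guard matching the 401 assertions in the tests above.
export function claudeWebhookAuth(req: Request, res: Response, next: NextFunction): void {
  const expected = process.env.CLAUDE_WEBHOOK_SECRET;
  const header = req.headers.authorization ?? '';
  const token = header.startsWith('Bearer ') ? header.slice('Bearer '.length) : '';
  if (!expected || token !== expected) {
    // Missing or wrong token: reject exactly as the tests assert.
    res.status(401).json({ error: 'Unauthorized' });
    return;
  }
  next();
}

Mounted ahead of the route handler (e.g. router.post('/claude', claudeWebhookAuth, handler)), this would let the 'Bearer test-secret' requests pass and reject the unauthenticated ones as asserted.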


@@ -1,174 +0,0 @@
import request from 'supertest';
import express from 'express';
// Mock child_process to prevent Docker commands
jest.mock('child_process', () => ({
execSync: jest.fn(() => ''),
spawn: jest.fn(() => ({
stdout: { on: jest.fn() },
stderr: { on: jest.fn() },
on: jest.fn((event, callback) => {
if (event === 'close') {
setTimeout(() => callback(0), 100);
}
})
}))
}));
// Mock SessionManager to avoid Docker calls in CI
jest.mock('../../../src/providers/claude/services/SessionManager', () => {
return {
SessionManager: jest.fn().mockImplementation(() => ({
createContainer: jest.fn().mockResolvedValue('mock-container-id'),
startSession: jest.fn().mockResolvedValue(undefined),
getSession: jest.fn().mockImplementation(id => ({
id,
status: 'running',
type: 'implementation',
project: { repository: 'test/repo', requirements: 'test' },
dependencies: []
})),
listSessions: jest.fn().mockResolvedValue([]),
getSessionOutput: jest.fn().mockResolvedValue({ output: 'test output' }),
canStartSession: jest.fn().mockResolvedValue(true),
updateSessionStatus: jest.fn().mockResolvedValue(undefined)
}))
};
});
// Now we can import the routes
import webhookRoutes from '../../../src/routes/webhooks';
// Set environment variables for testing
process.env.CLAUDE_WEBHOOK_SECRET = 'test-claude-secret';
process.env.SKIP_WEBHOOK_VERIFICATION = '1';
describe('Claude Webhook Integration', () => {
let app: express.Application;
beforeAll(() => {
// Import provider to register handlers
require('../../../src/providers/claude');
});
beforeEach(() => {
app = express();
app.use(express.json());
app.use('/api/webhooks', webhookRoutes);
});
afterEach(() => {
jest.clearAllMocks();
});
describe('POST /api/webhooks/claude', () => {
it('should accept valid orchestration request', async () => {
const payload = {
data: {
type: 'orchestrate',
project: {
repository: 'test-owner/test-repo',
requirements: 'Build a simple REST API with authentication'
},
strategy: {
parallelSessions: 3,
phases: ['analysis', 'implementation', 'testing']
}
}
};
const response = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer test-claude-secret')
.send(payload)
.expect(200);
expect(response.body).toMatchObject({
message: 'Webhook processed',
event: 'orchestrate'
});
expect(response.body.results).toBeDefined();
expect(response.body.results[0].success).toBe(true);
});
it('should reject request without authorization', async () => {
const payload = {
data: {
type: 'orchestrate',
project: {
repository: 'test-owner/test-repo',
requirements: 'Build API'
}
}
};
// Remove skip verification for this test
const originalSkip = process.env.SKIP_WEBHOOK_VERIFICATION;
delete process.env.SKIP_WEBHOOK_VERIFICATION;
const response = await request(app).post('/api/webhooks/claude').send(payload).expect(401);
expect(response.body).toMatchObject({
error: 'Unauthorized'
});
// Restore skip verification
process.env.SKIP_WEBHOOK_VERIFICATION = originalSkip;
});
it('should handle session management request', async () => {
const payload = {
data: {
type: 'session',
sessionId: 'test-session-123',
project: {
repository: 'test-owner/test-repo',
requirements: 'Manage session'
}
}
};
const response = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer test-claude-secret')
.send(payload)
.expect(200);
expect(response.body).toMatchObject({
message: 'Webhook processed',
event: 'session'
});
});
it('should reject invalid payload', async () => {
const payload = {
data: {
// Missing type field
invalid: 'data'
}
};
const response = await request(app)
.post('/api/webhooks/claude')
.set('Authorization', 'Bearer test-claude-secret')
.send(payload)
.expect(500);
expect(response.body.error).toBeDefined();
});
});
describe('GET /api/webhooks/health', () => {
it('should show Claude provider in health check', async () => {
const response = await request(app).get('/api/webhooks/health').expect(200);
expect(response.body.status).toBe('healthy');
expect(response.body.providers).toBeDefined();
const claudeProvider = response.body.providers.find((p: any) => p.name === 'claude');
expect(claudeProvider).toBeDefined();
expect(claudeProvider.handlerCount).toBeGreaterThan(0);
});
});
});


@@ -0,0 +1,345 @@
import { SessionManager } from '../../../../../src/providers/claude/services/SessionManager';
import { execSync, spawn } from 'child_process';
import type { ClaudeSession } from '../../../../../src/types/claude-orchestration';
// Mock child_process
jest.mock('child_process', () => ({
execSync: jest.fn(),
spawn: jest.fn()
}));
// Mock logger
jest.mock('../../../../../src/utils/logger', () => ({
createLogger: () => ({
info: jest.fn(),
error: jest.fn(),
debug: jest.fn()
})
}));
describe('SessionManager', () => {
let sessionManager: SessionManager;
const mockExecSync = execSync as jest.MockedFunction<typeof execSync>;
const mockSpawn = spawn as jest.MockedFunction<typeof spawn>;
beforeEach(() => {
jest.clearAllMocks();
sessionManager = new SessionManager();
// Set up default mocks
mockExecSync.mockReturnValue(Buffer.from(''));
mockSpawn.mockReturnValue({
stdout: { on: jest.fn() },
stderr: { on: jest.fn() },
on: jest.fn()
} as any);
});
describe('createContainer', () => {
it('should create a container for a session', async () => {
const session: ClaudeSession = {
id: 'test-session-123',
type: 'analysis',
status: 'pending',
project: {
repository: 'owner/repo',
requirements: 'Test requirements',
constraints: []
},
dependencies: [],
createdAt: new Date()
};
const containerName = await sessionManager.createContainer(session);
expect(containerName).toBe('claude-analysis-test-ses');
expect(mockExecSync).toHaveBeenCalledWith(expect.stringContaining('docker volume create'), {
stdio: 'pipe'
});
});
it('should handle errors when creating container', () => {
const session: ClaudeSession = {
id: 'test-session-123',
type: 'analysis',
status: 'pending',
project: {
repository: 'owner/repo',
requirements: 'Test requirements',
constraints: []
},
dependencies: [],
createdAt: new Date()
};
mockExecSync.mockImplementation(() => {
throw new Error('Docker error');
});
expect(() => sessionManager.createContainer(session)).toThrow('Docker error');
});
});
describe('startSession', () => {
it('should start a session with a container', async () => {
const session: ClaudeSession = {
id: 'test-session-123',
type: 'implementation',
status: 'pending',
containerId: 'container-123',
project: {
repository: 'owner/repo',
requirements: 'Implement feature X',
constraints: []
},
dependencies: [],
createdAt: new Date()
};
// Mock spawn to simulate successful execution
const mockProcess = {
stdout: {
on: jest.fn((event, cb) => {
if (event === 'data') {
// Simulate stream-json output with Claude session ID
cb(
Buffer.from(
'{"type":"system","subtype":"init","session_id":"claude-session-123"}\n'
)
);
}
})
},
stderr: { on: jest.fn() },
on: jest.fn((event, cb) => {
if (event === 'close') cb(0);
}),
unref: jest.fn()
};
mockSpawn.mockReturnValue(mockProcess as any);
await sessionManager.startSession(session);
expect(mockSpawn).toHaveBeenCalledWith(
'docker',
expect.arrayContaining(['run', '--rm', '--name', 'container-123']),
expect.any(Object)
);
expect(mockProcess.unref).toHaveBeenCalled();
});
it('should throw error if session has no container ID', () => {
const session: ClaudeSession = {
id: 'test-session-123',
type: 'testing',
status: 'pending',
project: {
repository: 'owner/repo',
requirements: 'Test requirements',
constraints: []
},
dependencies: [],
createdAt: new Date()
};
expect(() => sessionManager.startSession(session)).toThrow('Session has no container ID');
});
});
describe('getSession', () => {
it('should return a session by ID', async () => {
const session: ClaudeSession = {
id: 'test-session-123',
type: 'review',
status: 'pending',
project: {
repository: 'owner/repo',
requirements: 'Review code',
constraints: []
},
dependencies: [],
createdAt: new Date()
};
await sessionManager.createContainer(session);
const retrieved = sessionManager.getSession('test-session-123');
expect(retrieved).toBeDefined();
expect(retrieved?.id).toBe('test-session-123');
});
it('should return undefined for non-existent session', () => {
const retrieved = sessionManager.getSession('non-existent');
expect(retrieved).toBeUndefined();
});
});
describe('getAllSessions', () => {
it('should return all sessions', async () => {
const session1: ClaudeSession = {
id: 'session-1',
type: 'analysis',
status: 'pending',
project: {
repository: 'owner/repo1',
requirements: 'Analyze',
constraints: []
},
dependencies: [],
createdAt: new Date()
};
const session2: ClaudeSession = {
id: 'session-2',
type: 'implementation',
status: 'pending',
project: {
repository: 'owner/repo2',
requirements: 'Implement',
constraints: []
},
dependencies: [],
createdAt: new Date()
};
await sessionManager.createContainer(session1);
await sessionManager.createContainer(session2);
const allSessions = sessionManager.getAllSessions();
expect(allSessions).toHaveLength(2);
expect(allSessions.map(s => s.id)).toEqual(['session-1', 'session-2']);
});
});
describe('getOrchestrationSessions', () => {
it('should return sessions for a specific orchestration', async () => {
const session1: ClaudeSession = {
id: 'orch-123-session-1',
type: 'analysis',
status: 'pending',
project: {
repository: 'owner/repo',
requirements: 'Analyze',
constraints: []
},
dependencies: [],
createdAt: new Date()
};
const session2: ClaudeSession = {
id: 'orch-123-session-2',
type: 'implementation',
status: 'pending',
project: {
repository: 'owner/repo',
requirements: 'Implement',
constraints: []
},
dependencies: [],
createdAt: new Date()
};
const otherSession: ClaudeSession = {
id: 'orch-456-session-1',
type: 'testing',
status: 'pending',
project: {
repository: 'owner/repo',
requirements: 'Test',
constraints: []
},
dependencies: [],
createdAt: new Date()
};
await sessionManager.createContainer(session1);
await sessionManager.createContainer(session2);
await sessionManager.createContainer(otherSession);
const orchSessions = sessionManager.getOrchestrationSessions('orch-123');
expect(orchSessions).toHaveLength(2);
expect(orchSessions.map(s => s.id)).toEqual(['orch-123-session-1', 'orch-123-session-2']);
});
});
describe('queueSession', () => {
it('should start session immediately if no dependencies', async () => {
const session: ClaudeSession = {
id: 'test-session',
type: 'analysis',
status: 'pending',
containerId: 'container-123',
project: {
repository: 'owner/repo',
requirements: 'Analyze',
constraints: []
},
dependencies: [],
createdAt: new Date()
};
const mockProcess = {
stdout: {
on: jest.fn((event, cb) => {
if (event === 'data') {
cb(
Buffer.from(
'{"type":"system","subtype":"init","session_id":"claude-session-123"}\n'
)
);
}
})
},
stderr: { on: jest.fn() },
on: jest.fn((event, cb) => {
if (event === 'close') cb(0);
}),
unref: jest.fn()
};
mockSpawn.mockReturnValue(mockProcess as any);
await sessionManager.queueSession(session);
expect(mockSpawn).toHaveBeenCalledWith(
'docker',
expect.arrayContaining(['run', '--rm', '--name', 'container-123']),
expect.any(Object)
);
});
it('should queue session if dependencies not met', async () => {
const depSession: ClaudeSession = {
id: 'dep-session',
type: 'analysis',
status: 'running',
project: {
repository: 'owner/repo',
requirements: 'Analyze',
constraints: []
},
dependencies: [],
createdAt: new Date()
};
const session: ClaudeSession = {
id: 'test-session',
type: 'implementation',
status: 'pending',
containerId: 'container-123',
project: {
repository: 'owner/repo',
requirements: 'Implement',
constraints: []
},
dependencies: ['dep-session'],
createdAt: new Date()
};
await sessionManager.createContainer(depSession);
await sessionManager.queueSession(session);
// Should not start immediately
expect(mockSpawn).not.toHaveBeenCalled();
});
});
});
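Editor's note: read together, these tests pin down several SessionManager behaviours — the container name is claude-<session type>-<first 8 characters of the session id> (hence 'claude-analysis-test-ses'), the workspace volume is created synchronously via execSync with stdio: 'pipe', the session itself runs through spawn('docker', ['run', '--rm', '--name', <containerId>, ...]) followed by unref(), and Claude's internal session id is captured from the stream-json init event on stdout. A rough sketch under those assumptions follows; the volume name, image name, and spawn options are guesses, not the actual implementation.

import { execSync, spawn } from 'child_process';

// Sketch only: the call shapes the tests above assert, not the real SessionManager.
function createContainerSketch(session: { id: string; type: string }): string {
  const containerName = `claude-${session.type}-${session.id.slice(0, 8)}`;
  // Synchronous Docker call, as the execSync expectation requires (volume name is assumed).
  execSync(`docker volume create ${containerName}-workspace`, { stdio: 'pipe' });
  return containerName;
}

function startSessionSketch(session: { containerId?: string }): void {
  if (!session.containerId) {
    // Thrown synchronously, matching the "no container ID" test.
    throw new Error('Session has no container ID');
  }
  // Image name and environment wiring are placeholders.
  const child = spawn('docker', ['run', '--rm', '--name', session.containerId, 'claude-runner:latest'], {
    detached: true
  });
  child.stdout.on('data', (chunk: Buffer) => {
    // With OUTPUT_FORMAT=stream-json, the init event carries Claude's internal session id.
    for (const line of chunk.toString().split('\n')) {
      if (!line.trim()) continue;
      try {
        const event = JSON.parse(line);
        if (event.type === 'system' && event.subtype === 'init') {
          // e.g. record event.session_id for later resumption
        }
      } catch {
        // ignore non-JSON output lines
      }
    }
  });
  // Detach so the caller does not block on the container's lifetime.
  child.unref();
}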


@@ -0,0 +1,111 @@
import { IssueOpenedHandler } from '../../../../../src/providers/github/handlers/IssueHandler';
// Mock dependencies
jest.mock('../../../../../src/utils/logger', () => ({
createLogger: () => ({
info: jest.fn(),
error: jest.fn(),
debug: jest.fn(),
warn: jest.fn()
})
}));
jest.mock('../../../../../src/utils/secureCredentials', () => ({
SecureCredentials: jest.fn().mockImplementation(() => ({
loadCredentials: jest.fn(),
getCredential: jest.fn().mockReturnValue('mock-value')
})),
secureCredentials: {
loadCredentials: jest.fn(),
getCredential: jest.fn().mockReturnValue('mock-value')
}
}));
jest.mock('../../../../../src/services/claudeService');
jest.mock('../../../../../src/services/githubService');
const claudeService = require('../../../../../src/services/claudeService');
describe('IssueOpenedHandler', () => {
let handler: IssueOpenedHandler;
beforeEach(() => {
jest.clearAllMocks();
handler = new IssueOpenedHandler();
});
describe('handle', () => {
const mockPayload = {
event: 'issues.opened',
data: {
action: 'opened',
issue: {
id: 123,
number: 1,
title: 'Test Issue',
body: 'This is a test issue about authentication and API integration',
labels: [],
state: 'open',
user: {
login: 'testuser',
id: 1
},
created_at: new Date().toISOString(),
updated_at: new Date().toISOString()
},
repository: {
id: 456,
name: 'test-repo',
full_name: 'owner/test-repo',
owner: {
login: 'owner',
id: 2
},
private: false
},
sender: {
login: 'testuser',
id: 1
}
}
};
const mockContext = {
timestamp: new Date(),
requestId: 'test-request-id'
};
it('should analyze and label new issues', async () => {
claudeService.processCommand = jest.fn().mockResolvedValue('Labels applied successfully');
const result = await handler.handle(mockPayload as any, mockContext);
expect(claudeService.processCommand).toHaveBeenCalledWith({
repoFullName: 'owner/test-repo',
issueNumber: 1,
command: expect.stringContaining('Analyze this GitHub issue'),
isPullRequest: false,
branchName: null,
operationType: 'auto-tagging'
});
expect(result).toEqual({
success: true,
message: 'Issue auto-tagged successfully',
data: {
repo: 'owner/test-repo',
issue: 1
}
});
});
it('should handle errors gracefully', async () => {
claudeService.processCommand = jest.fn().mockRejectedValue(new Error('Analysis failed'));
const result = await handler.handle(mockPayload as any, mockContext);
expect(result.success).toBe(false);
expect(result.error).toBe('Analysis failed');
});
});
});
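Editor's note: these assertions describe the handler's contract with claudeService.processCommand — one call per opened issue, tagged with operationType 'auto-tagging', with the outcome folded into a { success, message, data } result and failures reported rather than rethrown. A condensed sketch of a handle() method consistent with that contract (the prompt beyond the asserted 'Analyze this GitHub issue' prefix, the payload typing, and the claudeService declaration are assumptions):

// Sketch only: the call shape asserted above, not the actual IssueOpenedHandler.
declare const claudeService: {
  processCommand(args: Record<string, unknown>): Promise<string>;
};

async function handleSketch(
  payload: { data: { issue: { number: number; title: string; body: string }; repository: { full_name: string } } },
  _context: { timestamp: Date; requestId: string }
): Promise<{ success: boolean; message?: string; data?: unknown; error?: string }> {
  const repo = payload.data.repository.full_name;
  const issue = payload.data.issue.number;
  try {
    await claudeService.processCommand({
      repoFullName: repo,
      issueNumber: issue,
      command: `Analyze this GitHub issue and apply appropriate labels:\n\n${payload.data.issue.title}\n${payload.data.issue.body}`,
      isPullRequest: false,
      branchName: null,
      operationType: 'auto-tagging'
    });
    return { success: true, message: 'Issue auto-tagged successfully', data: { repo, issue } };
  } catch (error) {
    // Matches the error-path test: the failure is surfaced in the result, not rethrown.
    return { success: false, error: error instanceof Error ? error.message : String(error) };
  }
}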