Feat: General agents + presentations #252
base: nightly
Conversation
…ic coding agent implemented - Abstracted behaviors and objectives - Behavior and Objectives are both AgentComponent - CodeGeneratorAgent (Agent DO) houses common business logic - Implemented agentic coding agent and assistant
- Implemented AI-powered project type prediction (app/workflow/presentation) with confidence scoring and auto-detection when projectType is 'auto' - Enhanced template selection to filter by project type and skip AI selection for single-template scenarios in workflow/presentation types - Added GitHub token caching in CodeGeneratorAgent for persistent OAuth sessions across exports - Updated commitlint config to allow longer commit messages
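The auto-detection described above can be sketched as a small resolver: use the AI prediction only when the requested type is 'auto' and the confidence clears a threshold. The function name, threshold, and fallback to 'app' are illustrative assumptions, not the PR's actual API.

```typescript
type ProjectType = 'app' | 'workflow' | 'presentation';

interface TypePrediction {
    type: ProjectType;
    confidence: number; // 0..1 score from the AI predictor
}

// Hypothetical resolver: explicit types pass through untouched; 'auto'
// defers to the predictor, falling back to 'app' on low confidence.
function resolveProjectType(
    requested: ProjectType | 'auto',
    predict: () => TypePrediction,
    minConfidence = 0.7,
): ProjectType {
    if (requested !== 'auto') return requested;
    const prediction = predict();
    return prediction.confidence >= minConfidence ? prediction.type : 'app';
}
```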
- Initialize template cache during agent setup to avoid redundant fetches - Remove redundant project name prompt from template selection - Clean up default projectType fallback logic
- Added concurrency control to prevent duplicate workflow runs on the same PR - Replaced Claude-based comment cleanup with direct GitHub API deletion for better reliability - Enhanced code debugger instructions to handle Vite dev server restarts and config file restrictions
- Replaced unsafe type assertions with proper type guards for legacy state detection - Added explicit type definitions for deprecated state fields and legacy file formats - Eliminated all 'any' types while maintaining backward compatibility with legacy states
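Replacing `as any` assertions with a proper type guard for legacy state detection could look like the sketch below. This is one plausible shape of a guard like the PR's `isStateWithAgentMode`; the legacy field layout here is an assumption.

```typescript
// Assumed legacy shape: old states carried an `agentMode` string field.
interface LegacyStateWithAgentMode {
    agentMode: string;
}

// Type guard: narrows an unknown state object without any `as any` casts.
function isStateWithAgentMode(state: unknown): state is LegacyStateWithAgentMode {
    return (
        typeof state === 'object' &&
        state !== null &&
        'agentMode' in state &&
        typeof (state as Record<string, unknown>).agentMode === 'string'
    );
}
```

Inside the guarded branch, TypeScript knows `state.agentMode` exists, so the migration code can read it without assertions.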
…ess design - Sandbox layer does not rely on templates now, instead expects raw files list - Tools to init/list templates, files - Templates can be chosen by agentic mode after creation - Restructured system prompt with detailed architecture explanations covering virtual filesystem, sandbox environment, and deployment flow - Better tool descriptions - Improved communication guidelines and workflow steps for better agent reasoning and execution
- Replaced agent mode toggle with project mode selector (App/Slides/Chat) that determines behavior type - Implemented agentic behavior detection for static content (docs, markdown) with automatic editor view - Conditionally render PhaseTimeline and deployment controls based on behavior type (phasic vs agentic)
- Replaced manual template_manager tool with init_suitable_template that uses the original template selector ai - Updated system prompts to emphasize template-first workflow for interactive projects with AI selector as mandatory first step - Simplified template selection process by removing manual list/select commands in favor of intelligent matching ```
- Added conversation history support to AgenticProjectBuilder with message preparation and context tracking - Implemented tool call completion callbacks to sync messages and trigger periodic compactification - Modified AgenticCodingBehavior to queue user inputs during builds and inject them between tool call chains using abort mechanism
- Fix importTemplate to actually work - Fixed template filtering logic to respect 'general' project type - Added behaviorType to logger context for better debugging - fixed not saving behaviorType to state
…ructor - Moved behaviorType and projectType initialization from hardcoded values to constructor-based setup - Changed initial state values to 'unknown' to ensure proper initialization through behavior constructor - Cleared template details cache when importing new templates to prevent stale data
- Moved user input idle check from PhasicCodingBehavior to CodeGeneratorAgent for consistent behavior across all modes - Fixed message order in agenticProjectBuilder to place history after user message instead of before - Added replaceExisting parameter to addConversationMessage for better control over message updates - Enhanced initial state restoration to include queued user messages and behaviorType - Added status and queuePosition fields
- Single convo id needs to be broadcasted but messages need to be saved with unique ids. - Fix message deduplication to use composite key (conversationId + role + tool_call_id) - Improved tool message filtering to validate against parent assistant tool_calls - Removed unused CodingAgentInterface stub file - Simplified addConversationMessage interface by removing replaceExisting parameter
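The composite-key deduplication described above can be sketched as follows. The message shape is simplified and the key format is an assumption; the real implementation lives in the PR's conversation handling.

```typescript
// Simplified message shape for illustration only.
interface ConversationMessage {
    conversationId: string;
    role: 'user' | 'assistant' | 'tool';
    tool_call_id?: string;
}

// Composite key: two messages are duplicates only if conversation id,
// role, AND tool_call_id all match.
function dedupeKey(m: ConversationMessage): string {
    return `${m.conversationId}:${m.role}:${m.tool_call_id ?? ''}`;
}

function dedupeMessages(messages: ConversationMessage[]): ConversationMessage[] {
    const seen = new Set<string>();
    return messages.filter((m) => {
        const key = dedupeKey(m);
        if (seen.has(key)) return false;
        seen.add(key);
        return true;
    });
}
```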
- Added CompletionDetector interface and CompletionConfig for detecting task completion signals - Implemented dependency-aware parallel tool execution engine with resource conflict detection - Added LoopDetector to prevent infinite tool call loops with contextual warnings - Enhanced ToolCallContext with completion signal tracking and warning injection state - Modified tool execution to respect dependencies and execute in parallel groups
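Dependency-aware parallel grouping with resource conflict detection can be sketched as below: tool calls that touch the same resource are split into sequential groups, while independent calls share a group and can run in parallel. The `ToolCall` shape and greedy grouping strategy are assumptions for illustration.

```typescript
interface ToolCall {
    id: string;
    resources: string[]; // resources (e.g. file paths) the call reads/writes
}

// Greedy grouping: a call joins the current parallel group unless it
// conflicts with a resource already claimed by that group, in which case
// a new group is started.
function groupForParallelExecution(calls: ToolCall[]): ToolCall[][] {
    const groups: ToolCall[][] = [];
    let current: ToolCall[] = [];
    let claimed = new Set<string>();
    for (const call of calls) {
        const conflicts = call.resources.some((r) => claimed.has(r));
        if (conflicts && current.length > 0) {
            groups.push(current);
            current = [];
            claimed = new Set<string>();
        }
        current.push(call);
        call.resources.forEach((r) => claimed.add(r));
    }
    if (current.length > 0) groups.push(current);
    return groups;
}
```

Each inner array could then be executed with `Promise.all`, with groups awaited in order.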
… and debugger - Added CompletionDetector to track completion signals via dedicated tools (mark_generation_complete, mark_debugging_complete) - Implemented LoopDetector to prevent infinite tool call loops with contextual warnings - Created wrapToolsWithLoopDetection utility to inject loop detection into tool execution flow - Enhanced system prompts to emphasize efficient parallel tool usage and completion discipline
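A loop detector of the kind described above might flag a tool call repeated with identical arguments inside a sliding window. The class name matches the PR's `LoopDetector`, but the window size, threshold, and signature format here are illustrative assumptions.

```typescript
// Flags likely infinite loops: the same tool called with the same args
// `threshold` times within the last `windowSize` calls.
class LoopDetector {
    private history: string[] = [];

    constructor(private windowSize = 6, private threshold = 3) {}

    // Returns true when the call looks like part of a loop, so the caller
    // can inject a contextual warning into the model's context.
    record(toolName: string, args: unknown): boolean {
        const signature = `${toolName}:${JSON.stringify(args)}`;
        this.history.push(signature);
        if (this.history.length > this.windowSize) this.history.shift();
        const repeats = this.history.filter((s) => s === signature).length;
        return repeats >= this.threshold;
    }
}
```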
Feat: Better tooling system
…at/general-agents
Feat: General agent creation, support for app, workflow and slides/docs creation
…at/refactor-agents-base
…at/refactor-agents-base
…at/refactor-agents-base
…nge-builds into feat/refactor-agents-base
force-pushed from b0e7319 to f6f4207
…at/refactor-agents-base
- Remove duplicate "Blueprint generation complete" message from chat hook - Fix file saving logic to track saved files during generation callbacks - Ensure deployment to sandbox after template setup completion - Import FileState type for proper file tracking
```typescript
let migratedBehaviorType = state.behaviorType;
if (isStateWithAgentMode(state)) {
    migratedBehaviorType = state.agentMode === 'smart' ? 'agentic' : 'phasic';
    needsMigration = true;
    logger.info('Migrating agentMode to behaviorType', {
        oldMode: state.agentMode,
        newType: migratedBehaviorType
    });
}

return newState;
```
CRITICAL: State Migration Bug - behaviorType Not Applied
The migration logic computes migratedBehaviorType (lines 236-244) but NEVER assigns it to the newState object. This means:
- The migration is logged but not actually applied
- Existing sessions with `agentMode` will lose their behavior preference
- They will default to whatever `state.behaviorType` was (likely undefined/unknown)
Fix Required:
```typescript
// Before: migratedBehaviorType is computed but never applied
let migratedBehaviorType = state.behaviorType;
if (isStateWithAgentMode(state)) {
    migratedBehaviorType = state.agentMode === 'smart' ? 'agentic' : 'phasic';
    needsMigration = true;
    logger.info('Migrating agentMode to behaviorType', {
        oldMode: state.agentMode,
        newType: migratedBehaviorType
    });
}
return newState;
```

```typescript
// After: apply the migrated value and drop the deprecated field
let migratedBehaviorType = state.behaviorType;
if (isStateWithAgentMode(state)) {
    migratedBehaviorType = state.agentMode === 'smart' ? 'agentic' : 'phasic';
    needsMigration = true;
    logger.info('Migrating agentMode to behaviorType', {
        oldMode: state.agentMode,
        newType: migratedBehaviorType
    });
    // Apply the migrated value
    newState.behaviorType = migratedBehaviorType;
    delete stateWithDeprecated.agentMode;
}
return newState;
```
This is a data integrity issue that could break existing user sessions.
```typescript
const isAgenticBlueprint = (bp?: Blueprint): bp is AgenticBlueprint => {
    if (!bp) return false;
    return 'plan' in bp && Array.isArray(bp.plan);
};
```
Low Priority: Inline Type Guard Can Be Simplified
While this is functional, the type guard doesn't need to be this verbose. TypeScript can infer from the property check alone:
```typescript
// Current
const isAgenticBlueprint = (bp?: Blueprint): bp is AgenticBlueprint => {
    if (!bp) return false;
    return 'plan' in bp && Array.isArray(bp.plan);
};

// Suggested simplification
const isAgenticBlueprint = (bp?: Blueprint): bp is AgenticBlueprint =>
    !!bp && 'plan' in bp && Array.isArray(bp.plan);
```
Note: A previous review comment suggested avoiding `(bp as any).plan`, which has been correctly fixed here. This is just a minor simplification suggestion.
…at/refactor-agents-base
…at/refactor-agents-base
```typescript
let migratedBehaviorType = state.behaviorType;
if (isStateWithAgentMode(state)) {
    migratedBehaviorType = state.agentMode === 'smart' ? 'agentic' : 'phasic';
    needsMigration = true;
    logger.info('Migrating agentMode to behaviorType', {
        oldMode: state.agentMode,
        newType: migratedBehaviorType
    });
}

return newState;
```
CRITICAL: State Migration Bug - behaviorType Not Applied
The migration logic computes migratedBehaviorType (lines 236-244) but NEVER assigns it to the newState object. This causes:
- Data Loss: Existing Durable Objects with the `agentMode` field lose their behavior preference
- Incorrect Defaults: Sessions fall back to undefined/unknown behavior instead of preserving user intent
- Breaking Change: The PR description acknowledges this breaking change but the migration doesn't actually preserve the data
Fix Required:
```typescript
let migratedBehaviorType = state.behaviorType;
if (isStateWithAgentMode(state)) {
    migratedBehaviorType = state.agentMode === 'smart' ? 'agentic' : 'phasic';
    needsMigration = true;
    logger.info('Migrating agentMode to behaviorType', {
        oldMode: state.agentMode,
        newType: migratedBehaviorType
    });
    // Apply the migrated value to newState
    newState.behaviorType = migratedBehaviorType;
    // Delete the old field
    delete stateWithDeprecated.agentMode;
}
return newState;
```

Note from PR author comment: "Smart mode was never introduced" - if this is true, then this migration may be unnecessary. However, if `agentMode: 'deterministic'` existed in production, it should still be migrated to `behaviorType: 'phasic'` to prevent data loss.
…at/refactor-agents-base
… strategy pattern - Remove specialized objective classes (AppObjective, WorkflowObjective, PresentationObjective, GeneralObjective) - Implement strategy pattern for project-type-specific export behavior - Move deployment and export logic from AppObjective into ProjectObjective base class - Simplify objective creation to single factory method with type parameter - Remove unused GitHub token cache from BaseCodingBehavior - Fix conversation
Claude finished @AshishKumar4's task — View job

Code Quality & Security Review - Complete

Review Summary

Recommendation: REQUEST CHANGES

Found 1 critical bug in state migration that must be fixed before merging. See detailed review below.

What Needs to Be Fixed

The critical issue is in the state migration logic. Once this is fixed, the PR will be ready to merge.
```typescript
    });
}

return newState;
```
CRITICAL: State Migration Bug - behaviorType Not Applied
The migration logic computes migratedBehaviorType (lines 154-162) but NEVER applies it to the returned newState object (line 164). This causes:
- Data Loss: Existing Durable Objects with the `agentMode` field lose their behavior preference
- Incorrect Defaults: Sessions fall back to undefined/unknown behavior instead of preserving the intended mode
- Breaking Change: The PR acknowledges this breaking change but the migration doesn't preserve the data
Fix Required:
```typescript
let migratedBehaviorType = state.behaviorType;
if (isStateWithAgentMode(state)) {
    migratedBehaviorType = state.agentMode === 'smart' ? 'agentic' : 'phasic';
    needsMigration = true;
    logger.info('Migrating agentMode to behaviorType', {
        oldMode: state.agentMode,
        newType: migratedBehaviorType
    });
    // CRITICAL: Apply the migrated value to newState
    newState.behaviorType = migratedBehaviorType;
    // Delete the old deprecated field
    delete stateWithDeprecated.agentMode;
}
return newState;
```

Note from PR author: The author commented that "Smart mode was never introduced" - if true, this migration may be unnecessary. However, if any Durable Objects have `agentMode: 'deterministic'` in production, they should still be migrated to `behaviorType: 'phasic'` to prevent data loss.
Code Quality & Security Review

Recommendation: REQUEST CHANGES

This PR represents a substantial architectural refactoring to separate agent behavior from infrastructure. The overall design is sound, but there is 1 critical bug that must be fixed before merging.

Critical Issues

1. State Migration Bug - behaviorType Not Applied (CRITICAL)

Code Quality

Previous Review Issues - Resolved:
Architecture Review:
Type Safety:
Code Organization:

Security Analysis

SQL Injection Risk: LOW
Authentication & Authorization: NO CHANGES
Secrets Handling: SECURE
Data Exposure: NO ISSUES
Input Validation: EXISTING PATTERNS MAINTAINED
Workflow Changes: ACCEPTABLE

Summary

This is a well-architected refactoring that successfully decouples agent behavior from infrastructure. The code quality is high and follows project conventions from CLAUDE.md. However, the critical state migration bug MUST be fixed before merging. Existing Durable Object sessions could lose their behavior configuration, causing runtime issues.

Approval Status: REQUEST CHANGES due to critical migration bug

Once the migration bug is fixed, this PR will be ready to merge.

Additional Notes

Breaking Changes:
Testing Recommendations:
```diff
 case 'agent_connected': {
-    const { state, templateDetails } = message;
+    const { state, templateDetails, previewUrl } = message;
     console.log('Agent connected', state, templateDetails);
```
🟡 MEDIUM - Console statement in production code
Category: quality
Description:
console.log left in production code - indicates debug code not cleaned up
Suggestion:
Remove debug console.log or replace with proper logging using logger utility
Confidence: 90%
Rule: fe_console_in_production
```typescript
    });
}

if (templateDetails.allFiles && bootstrapFiles.length === 0) {
    console.log('Template details, important files:', templateDetails.importantFiles, templateDetails)
```
🟡 MEDIUM - Console statement in production code
Category: quality
Description:
console.log left in production code - indicates debug code not cleaned up
Suggestion:
Remove debug console.log or replace with proper logging using logger utility
Confidence: 90%
Rule: fe_console_in_production
```typescript
// Sync projectType from backend if it changed
if (state.projectType) {
    console.log('🎯 [WS] Backend projectType in cf_agent_state:', state.projectType);
```
🟡 MEDIUM - Console statement in production code
Category: quality
Description:
console.log with emoji left in production code - indicates debug code not cleaned up
Suggestion:
Remove debug console.log or replace with proper logging using logger utility
Confidence: 90%
Rule: fe_console_in_production
| import { Connection } from 'agents'; | ||
| import { | ||
| FileConceptType, | ||
| FileOutputType, | ||
| Blueprint, | ||
| AgenticBlueprint, | ||
| PhasicBlueprint, | ||
| } from '../../schemas'; | ||
| import { ExecuteCommandsResponse, PreviewType, RuntimeError, StaticAnalysisResponse, TemplateDetails, TemplateFile } from '../../../services/sandbox/sandboxTypes'; | ||
| import { BaseProjectState, AgenticState, FileState } from '../state'; | ||
| import { AllIssues, AgentSummary, AgentInitArgs, BehaviorType, DeploymentTarget, ProjectType } from '../types'; | ||
| import { WebSocketMessageResponses } from '../../constants'; | ||
| import { ProjectSetupAssistant } from '../../assistants/projectsetup'; | ||
| import { UserConversationProcessor, RenderToolCall } from '../../operations/UserConversationProcessor'; | ||
| import { FileRegenerationOperation } from '../../operations/FileRegeneration'; | ||
| // Database schema imports removed - using zero-storage OAuth flow | ||
| import { BaseSandboxService } from '../../../services/sandbox/BaseSandboxService'; | ||
| import { getTemplateImportantFiles } from '../../../services/sandbox/utils'; | ||
| import { createScratchTemplateDetails } from '../../utils/templates'; | ||
| import { WebSocketMessageData, WebSocketMessageType } from '../../../api/websocketTypes'; | ||
| import { InferenceContext } from '../../inferutils/config.types'; | ||
| import { ModelConfigService } from '../../../database/services/ModelConfigService'; | ||
| import { fixProjectIssues } from '../../../services/code-fixer'; | ||
| import { FastCodeFixerOperation } from '../../operations/PostPhaseCodeFixer'; | ||
| import { looksLikeCommand, validateAndCleanBootstrapCommands } from '../../utils/common'; | ||
| import { customizeTemplateFiles, generateBootstrapScript } from '../../utils/templateCustomizer'; | ||
| import { AppService } from '../../../database'; | ||
| import { RateLimitExceededError } from 'shared/types/errors'; | ||
| import { ImageAttachment, type ProcessedImageAttachment } from '../../../types/image-attachment'; | ||
| import { OperationOptions } from '../../operations/common'; | ||
| import { ImageType, uploadImage } from 'worker/utils/images'; | ||
| import { DeepDebugResult } from '../types'; | ||
| import { updatePackageJson } from '../../utils/packageSyncer'; | ||
| import { ICodingAgent } from '../../services/interfaces/ICodingAgent'; | ||
| import { SimpleCodeGenerationOperation } from '../../operations/SimpleCodeGeneration'; | ||
| import { AgentComponent } from '../AgentComponent'; | ||
| import type { AgentInfrastructure } from '../AgentCore'; | ||
| import { GitVersionControl } from '../../git'; | ||
| import { DeepDebuggerOperation } from '../../operations/DeepDebugger'; | ||
| import type { DeepDebuggerInputs } from '../../operations/DeepDebugger'; | ||
| import { generatePortToken } from 'worker/utils/cryptoUtils'; | ||
| import { getPreviewDomain, getProtocolForHost } from 'worker/utils/urls'; | ||
| import { isDev } from 'worker/utils/envs'; | ||
|
|
||
| export interface BaseCodingOperations { | ||
| regenerateFile: FileRegenerationOperation; | ||
| fastCodeFixer: FastCodeFixerOperation; | ||
| processUserMessage: UserConversationProcessor; | ||
| simpleGenerateFiles: SimpleCodeGenerationOperation; | ||
| } | ||
|
|
||
| /** | ||
| * Base class for all coding behaviors | ||
| */ | ||
| export abstract class BaseCodingBehavior<TState extends BaseProjectState> | ||
| extends AgentComponent<TState> implements ICodingAgent { | ||
| protected static readonly MAX_COMMANDS_HISTORY = 10; | ||
|
|
||
| protected projectSetupAssistant: ProjectSetupAssistant | undefined; | ||
|
|
||
| protected templateDetailsCache: TemplateDetails | null = null; | ||
|
|
||
| // In-memory storage for user-uploaded images (not persisted in DO state) | ||
| protected pendingUserImages: ProcessedImageAttachment[] = [] | ||
| protected generationPromise: Promise<void> | null = null; | ||
| protected currentAbortController?: AbortController; | ||
| protected deepDebugPromise: Promise<{ transcript: string } | { error: string }> | null = null; | ||
| protected deepDebugConversationId: string | null = null; | ||
|
|
||
| protected staticAnalysisCache: StaticAnalysisResponse | null = null; | ||
|
|
||
| protected operations: BaseCodingOperations = { | ||
| regenerateFile: new FileRegenerationOperation(), | ||
| fastCodeFixer: new FastCodeFixerOperation(), | ||
| processUserMessage: new UserConversationProcessor(), | ||
| simpleGenerateFiles: new SimpleCodeGenerationOperation(), | ||
| }; | ||
|
|
||
| getBehavior(): BehaviorType { | ||
| return this.state.behaviorType; | ||
| } | ||
|
|
||
| protected isAgenticState(state: BaseProjectState): state is AgenticState { | ||
| return state.behaviorType === 'agentic'; | ||
| } | ||
|
|
||
| constructor(infrastructure: AgentInfrastructure<TState>, protected projectType: ProjectType) { | ||
| super(infrastructure); | ||
|
|
||
| this.setState({ | ||
| ...this.state, | ||
| behaviorType: this.getBehavior(), | ||
| projectType: this.projectType, | ||
| }); | ||
| } | ||
|
|
||
| public async initialize( | ||
| initArgs: AgentInitArgs, | ||
| ..._args: unknown[] | ||
| ): Promise<TState> { | ||
| this.logger.info("Initializing agent"); | ||
| const { templateInfo } = initArgs; | ||
| if (templateInfo) { | ||
| this.templateDetailsCache = templateInfo.templateDetails; | ||
|
|
||
| await this.ensureTemplateDetails(); | ||
| } | ||
|
|
||
| // Reset the logg | ||
| return this.state; | ||
| } | ||
|
|
||
| onStart(_props?: Record<string, unknown> | undefined): Promise<void> { | ||
| return Promise.resolve(); | ||
| } | ||
|
|
||
| protected async initializeAsync(): Promise<void> { | ||
| try { | ||
| const [, setupCommands] = await Promise.all([ | ||
| this.deployToSandbox(), | ||
| this.getProjectSetupAssistant().generateSetupCommands(), | ||
| this.generateReadme() | ||
| ]); | ||
| this.logger.info("Deployment to sandbox service and initial commands predictions completed successfully"); | ||
| await this.executeCommands(setupCommands.commands); | ||
| this.logger.info("Initial commands executed successfully"); | ||
| } catch (error) { | ||
| this.logger.error("Error during async initialization:", error); | ||
| // throw error; | ||
| } | ||
| } | ||
| onStateUpdate(_state: TState, _source: "server" | Connection) {} | ||
|
|
||
| async ensureTemplateDetails() { | ||
| // Skip fetching details for "scratch" baseline | ||
| if (!this.templateDetailsCache) { | ||
| if (this.state.templateName === 'scratch') { | ||
| this.logger.info('Skipping template details fetch for scratch baseline'); | ||
| return; | ||
| } | ||
| this.logger.info(`Loading template details for: ${this.state.templateName}`); | ||
| const results = await BaseSandboxService.getTemplateDetails(this.state.templateName); | ||
| if (!results.success || !results.templateDetails) { | ||
| throw new Error(`Failed to get template details for: ${this.state.templateName}`); | ||
| } | ||
|
|
||
| const templateDetails = results.templateDetails; | ||
|
|
||
| const customizedAllFiles = { ...templateDetails.allFiles }; | ||
|
|
||
| this.logger.info('Customizing template files for older app'); | ||
| const customizedFiles = customizeTemplateFiles( | ||
| templateDetails.allFiles, | ||
| { | ||
| projectName: this.state.projectName, | ||
| commandsHistory: this.getBootstrapCommands() | ||
| } | ||
| ); | ||
| Object.assign(customizedAllFiles, customizedFiles); | ||
|
|
||
| this.templateDetailsCache = { | ||
| ...templateDetails, | ||
| allFiles: customizedAllFiles | ||
| }; | ||
| this.logger.info('Template details loaded and customized'); | ||
|
|
||
| // If renderMode == 'browser', we can deploy right away | ||
| if (templateDetails.renderMode === 'browser') { | ||
| await this.deployToSandbox(); | ||
| } | ||
| } | ||
| return this.templateDetailsCache; | ||
| } | ||
|
|
||
| public getTemplateDetails(): TemplateDetails { | ||
| if (!this.templateDetailsCache) { | ||
| // Synthesize a minimal scratch template when starting from scratch | ||
| if (this.state.templateName === 'scratch') { | ||
| this.templateDetailsCache = createScratchTemplateDetails(); | ||
| return this.templateDetailsCache; | ||
| } | ||
| this.ensureTemplateDetails(); | ||
| throw new Error('Template details not loaded. Call ensureTemplateDetails() first.'); | ||
| } | ||
| return this.templateDetailsCache; | ||
| } | ||
|
|
||
| protected isPreviewable(): boolean { | ||
| // If there are 'package.json', and 'wrangler.jsonc' files, then it is previewable | ||
| return this.fileManager.fileExists('package.json') && (this.fileManager.fileExists('wrangler.jsonc') || this.fileManager.fileExists('wrangler.toml')); | ||
| } | ||
|
|
||
| /** | ||
| * Update bootstrap script when commands history changes | ||
| * Called after significant command executions | ||
| */ | ||
| private async updateBootstrapScript(commandsHistory: string[]): Promise<void> { | ||
| if (!commandsHistory || commandsHistory.length === 0) { | ||
| return; | ||
| } | ||
|
|
||
| // Use only validated commands | ||
| const bootstrapScript = generateBootstrapScript( | ||
| this.state.projectName, | ||
| commandsHistory | ||
| ); | ||
|
|
||
| await this.fileManager.saveGeneratedFile( | ||
| { | ||
| filePath: '.bootstrap.js', | ||
| fileContents: bootstrapScript, | ||
| filePurpose: 'Updated bootstrap script for first-time clone setup' | ||
| }, | ||
| 'chore: Update bootstrap script with latest commands', | ||
| true | ||
| ); | ||
|
|
||
| this.logger.info('Updated bootstrap script with commands', { | ||
| commandCount: commandsHistory.length, | ||
| commands: commandsHistory | ||
| }); | ||
| } | ||
|
|
||
| getProjectSetupAssistant(): ProjectSetupAssistant { | ||
| if (this.projectSetupAssistant === undefined) { | ||
| this.projectSetupAssistant = new ProjectSetupAssistant({ | ||
| env: this.env, | ||
| agentId: this.getAgentId(), | ||
| query: this.state.query, | ||
| blueprint: this.state.blueprint, | ||
| template: this.getTemplateDetails(), | ||
| inferenceContext: this.state.inferenceContext | ||
| }); | ||
| } | ||
| return this.projectSetupAssistant; | ||
| } | ||
|
|
||
| getSessionId() { | ||
| return this.deploymentManager.getSessionId(); | ||
| } | ||
|
|
||
| getSandboxServiceClient(): BaseSandboxService { | ||
| return this.deploymentManager.getClient(); | ||
| } | ||
|
|
||
| isCodeGenerating(): boolean { | ||
| return this.generationPromise !== null; | ||
| } | ||
|
|
||
| abstract getOperationOptions(): OperationOptions; | ||
|
|
||
| /** | ||
| * Gets or creates an abort controller for the current operation | ||
| * Reuses existing controller for nested operations (e.g., tool calling) | ||
| */ | ||
| protected getOrCreateAbortController(): AbortController { | ||
| // Don't reuse aborted controllers | ||
| if (this.currentAbortController && !this.currentAbortController.signal.aborted) { | ||
| return this.currentAbortController; | ||
| } | ||
|
|
||
| // Create new controller in memory for new operation | ||
| this.currentAbortController = new AbortController(); | ||
|
|
||
| return this.currentAbortController; | ||
| } | ||
|
|
||
| /** | ||
| * Cancels the current inference operation if any | ||
| */ | ||
| public cancelCurrentInference(): boolean { | ||
| if (this.currentAbortController) { | ||
| this.logger.info('Cancelling current inference operation'); | ||
| this.currentAbortController.abort(); | ||
| this.currentAbortController = undefined; | ||
| return true; | ||
| } | ||
| return false; | ||
| } | ||
|
|
||
| /** | ||
| * Clears abort controller after successful completion | ||
| */ | ||
| protected clearAbortController(): void { | ||
| this.currentAbortController = undefined; | ||
| } | ||
|
|
||
| /** | ||
| * Gets inference context with abort signal | ||
| * Reuses existing abort controller for nested operations | ||
| */ | ||
| protected getInferenceContext(): InferenceContext { | ||
| const controller = this.getOrCreateAbortController(); | ||
|
|
||
| return { | ||
| ...this.state.inferenceContext, | ||
| abortSignal: controller.signal, | ||
| }; | ||
| } | ||
|
|
||
| async generateReadme() { | ||
| this.logger.info('Generating README.md'); | ||
| this.broadcast(WebSocketMessageResponses.FILE_GENERATING, { | ||
| message: 'Generating README.md', | ||
| filePath: 'README.md', | ||
| filePurpose: 'Project documentation and setup instructions' | ||
| }); | ||
|
|
||
| const readme = await this.operations.simpleGenerateFiles.generateReadme(this.getOperationOptions()); | ||
|
|
||
| await this.fileManager.saveGeneratedFile(readme, "feat: README.md"); | ||
|
|
||
| this.broadcast(WebSocketMessageResponses.FILE_GENERATED, { | ||
| message: 'README.md generated successfully', | ||
| file: readme | ||
| }); | ||
| this.logger.info('README.md generated successfully'); | ||
| } | ||
|
|
||
| async setBlueprint(blueprint: Blueprint): Promise<void> { | ||
| this.setState({ | ||
| ...this.state, | ||
| blueprint: blueprint as AgenticBlueprint | PhasicBlueprint, | ||
| }); | ||
| this.broadcast(WebSocketMessageResponses.BLUEPRINT_UPDATED, { | ||
| message: 'Blueprint updated', | ||
| updatedKeys: Object.keys(blueprint || {}) | ||
| }); | ||
| } | ||
|
|
||
| getProjectType() { | ||
| return this.state.projectType; | ||
| } | ||
|
|
||
| async queueUserRequest(request: string, images?: ProcessedImageAttachment[]): Promise<void> { | ||
| this.setState({ | ||
| ...this.state, | ||
| pendingUserInputs: [...this.state.pendingUserInputs, request] | ||
| }); | ||
| if (images && images.length > 0) { | ||
| this.logger.info('Storing user images in-memory for phase generation', { | ||
| imageCount: images.length, | ||
| }); | ||
| this.pendingUserImages = [...this.pendingUserImages, ...images]; | ||
| } | ||
| } | ||
|
|
||
| protected fetchPendingUserRequests(): string[] { | ||
| const inputs = this.state.pendingUserInputs; | ||
| if (inputs.length > 0) { | ||
| this.setState({ | ||
| ...this.state, | ||
| pendingUserInputs: [] | ||
| }); | ||
| } | ||
| return inputs; | ||
| } | ||
|
|
||
| clearConversation(): void { | ||
| this.infrastructure.clearConversation(); | ||
| } | ||
|
|
||
| getGit(): GitVersionControl { | ||
| return this.git; | ||
| } | ||
|
|
||
|
|
||
    /**
     * State machine controller for code generation with user interaction support.
     * Executes phases sequentially with review cycles and proper state transitions.
     */
    async generateAllFiles(): Promise<void> {
        if (this.state.mvpGenerated && this.state.pendingUserInputs.length === 0) {
            this.logger.info("Code generation already completed and no user inputs pending");
            return;
        }
        if (this.isCodeGenerating()) {
            this.logger.info("Code generation already in progress");
            return;
        }
        this.generationPromise = this.buildWrapper();
        await this.generationPromise;
    }

    setMVPGenerated(): boolean {
        if (!this.state.mvpGenerated) {
            this.setState({ ...this.state, mvpGenerated: true });
            this.logger.info('MVP generated');
            return true;
        }
        return false;
    }

    isMVPGenerated(): boolean {
        return this.state.mvpGenerated;
    }

    private async buildWrapper() {
        this.broadcast(WebSocketMessageResponses.GENERATION_STARTED, {
            message: 'Starting code generation',
            totalFiles: this.getTotalFiles()
        });
        this.logger.info('Starting code generation', {
            totalFiles: this.getTotalFiles()
        });
        await this.ensureTemplateDetails();
        try {
            await this.build();
        } catch (error) {
            if (error instanceof RateLimitExceededError) {
                this.logger.error("Error in state machine:", error);
                this.broadcast(WebSocketMessageResponses.RATE_LIMIT_ERROR, { error });
            } else {
                this.broadcastError("Error during generation", error);
            }
        } finally {
            // Clear abort controller after generation completes
            this.clearAbortController();

            const appService = new AppService(this.env);
            await appService.updateApp(
                this.getAgentId(),
                {
                    status: 'completed',
                }
            );
            this.generationPromise = null;
            this.broadcast(WebSocketMessageResponses.GENERATION_COMPLETE, {
                message: "Code generation and review process completed.",
                instanceId: this.state.sandboxInstanceId,
            });
        }
    }

    /**
     * Abstract method to be implemented by subclasses.
     * Contains the main logic for the code generation and review process.
     */
    abstract build(): Promise<void>;

    async executeDeepDebug(
        issue: string,
        toolRenderer: RenderToolCall,
        streamCb: (chunk: string) => void,
        focusPaths?: string[],
    ): Promise<DeepDebugResult> {
        const debugPromise = (async () => {
            try {
                const previousTranscript = this.state.lastDeepDebugTranscript ?? undefined;
                const operationOptions = this.getOperationOptions();
                const filesIndex = operationOptions.context.allFiles
                    .filter((f) =>
                        !focusPaths?.length ||
                        focusPaths.some((p) => f.filePath.includes(p)),
                    );

                const runtimeErrors = await this.fetchRuntimeErrors(false);

                const inputs: DeepDebuggerInputs = {
                    issue,
                    previousTranscript,
                    filesIndex,
                    runtimeErrors,
                    streamCb,
                    toolRenderer,
                };

                const operation = new DeepDebuggerOperation();
                const result = await operation.execute(inputs, operationOptions);
                const transcript = result.transcript;

                // Save transcript for the next session
                this.setState({
                    ...this.state,
                    lastDeepDebugTranscript: transcript,
                });

                return { success: true as const, transcript };
            } catch (e) {
                this.logger.error('Deep debugger failed', e);
                return { success: false as const, error: `Deep debugger failed: ${String(e)}` };
            } finally {
                this.deepDebugPromise = null;
                this.deepDebugConversationId = null;
            }
        })();

        // Store the promise before awaiting so concurrent callers can observe it
        this.deepDebugPromise = debugPromise;

        return await debugPromise;
    }

    getModelConfigsInfo() {
        const modelService = new ModelConfigService(this.env);
        return modelService.getModelConfigsInfo(this.state.inferenceContext.userId);
    }

    getTotalFiles(): number {
        return this.fileManager.getGeneratedFilePaths().length;
    }

    getSummary(): Promise<AgentSummary> {
        const summaryData = {
            query: this.state.query,
            generatedCode: this.fileManager.getGeneratedFiles(),
        };
        return Promise.resolve(summaryData);
    }

    async getFullState(): Promise<TState> {
        return this.state;
    }

    migrateStateIfNeeded(): void {
        // No-op: only older phasic agents need this, for now.
    }

    getFileGenerated(filePath: string) {
        return this.fileManager.getGeneratedFile(filePath) || null;
    }

    async fetchRuntimeErrors(clear: boolean = true, shouldWait: boolean = true): Promise<RuntimeError[]> {
        if (shouldWait) {
            await this.deploymentManager.waitForPreview();
        }

        try {
            const errors = await this.deploymentManager.fetchRuntimeErrors(clear);

            if (errors.length > 0) {
                this.broadcast(WebSocketMessageResponses.RUNTIME_ERROR_FOUND, {
                    errors,
                    message: "Runtime errors found",
                    count: errors.length
                });
            }

            return errors;
        } catch (error) {
            this.logger.error("Exception fetching runtime errors:", error);
            // If the fetch fails, initiate a redeploy (intentionally fire-and-forget)
            void this.deployToSandbox();
            const message = "<runtime errors not available at the moment as preview is not deployed>";
            return [{ message, timestamp: new Date().toISOString(), level: 0, rawOutput: message }];
        }
    }

    /**
     * Perform static code analysis on the generated files.
     * This helps catch potential issues early in the development process.
     */
    async runStaticAnalysisCode(files?: string[]): Promise<StaticAnalysisResponse> {
        try {
            // Return cached static analysis when available
            if (this.staticAnalysisCache) {
                return this.staticAnalysisCache;
            }

            const analysisResponse = await this.deploymentManager.runStaticAnalysis(files);
            this.staticAnalysisCache = analysisResponse;

            const { lint, typecheck } = analysisResponse;
            this.broadcast(WebSocketMessageResponses.STATIC_ANALYSIS_RESULTS, {
                lint: { issues: lint.issues, summary: lint.summary },
                typecheck: { issues: typecheck.issues, summary: typecheck.summary }
            });

            return analysisResponse;
        } catch (error) {
            this.broadcastError("Failed to lint code", error);
            return { success: false, lint: { issues: [] }, typecheck: { issues: [] } };
        }
    }

    /**
     * Apply deterministic code fixes for common TypeScript errors
     */
    protected async applyDeterministicCodeFixes(): Promise<StaticAnalysisResponse | undefined> {
        try {
            // Run static analysis, then apply deterministic fixes
            const staticAnalysis = await this.runStaticAnalysisCode();
            if (staticAnalysis.typecheck.issues.length === 0) {
                this.logger.info("No typecheck issues found, skipping deterministic fixes");
                return staticAnalysis; // Return so the caller does not repeat static analysis
            }
            const typeCheckIssues = staticAnalysis.typecheck.issues;
            this.broadcast(WebSocketMessageResponses.DETERMINISTIC_CODE_FIX_STARTED, {
                message: `Attempting to fix ${typeCheckIssues.length} TypeScript issues using deterministic code fixer`,
                issues: typeCheckIssues
            });

            this.logger.info(`Attempting to fix ${typeCheckIssues.length} TypeScript issues using deterministic code fixer`);
            const allFiles = this.fileManager.getAllFiles();

            const fixResult = fixProjectIssues(
                allFiles.map(file => ({
                    filePath: file.filePath,
                    fileContents: file.fileContents,
                    filePurpose: ''
                })),
                typeCheckIssues
            );

            this.broadcast(WebSocketMessageResponses.DETERMINISTIC_CODE_FIX_COMPLETED, {
                message: `Deterministic code fixer processed ${typeCheckIssues.length} TypeScript issues`,
                issues: typeCheckIssues,
                fixResult
            });

            if (fixResult) {
                // If there are unfixable issues of type TS2307, extract the external module names and install them
                if (fixResult.unfixableIssues.length > 0) {
                    const modulesNotFound = fixResult.unfixableIssues.filter(issue => issue.issueCode === 'TS2307');
                    // Reason is of the form: External package "xyz" should be handled by package manager
                    const moduleNames = modulesNotFound.flatMap(issue => {
                        const match = issue.reason.match(/External package ["'](.+?)["']/);
                        const name = match?.[1];
                        return (typeof name === 'string' && name.trim().length > 0 && !name.startsWith('@shared')) ? [name] : [];
                    }).filter((name) => !name.includes('cloudflare:'));
                    if (moduleNames.length > 0) {
                        const installCommands = moduleNames.map(moduleName => `bun install ${moduleName}`);
                        await this.executeCommands(installCommands, false);

                        this.logger.info(`Deterministic code fixer installed missing modules: ${moduleNames.join(', ')}`);
                    } else {
                        this.logger.info(`Deterministic code fixer detected no external modules to install from unfixable TS2307 issues`);
                    }
                }
                if (fixResult.modifiedFiles.length > 0) {
                    this.logger.info("Applying deterministic fixes to files, Fixes: ", JSON.stringify(fixResult, null, 2));
                    const fixedFiles = fixResult.modifiedFiles.map(file => ({
                        filePath: file.filePath,
                        filePurpose: allFiles.find(f => f.filePath === file.filePath)?.filePurpose || '',
                        fileContents: file.fileContents
                    }));
                    await this.fileManager.saveGeneratedFiles(fixedFiles, "fix: applied deterministic fixes");

                    await this.deployToSandbox(fixedFiles, false, "fix: applied deterministic fixes");
                    this.logger.info("Deployed deterministic fixes to sandbox");
                }
            }
            this.logger.info(`Applied deterministic code fixes: ${JSON.stringify(fixResult, null, 2)}`);
        } catch (error) {
            this.broadcastError('Deterministic code fixer failed', error);
        }
        return undefined;
    }

    async fetchAllIssues(resetIssues: boolean = false): Promise<AllIssues> {
        if (!this.state.sandboxInstanceId) {
            return { runtimeErrors: [], staticAnalysis: { success: false, lint: { issues: [] }, typecheck: { issues: [] } } };
        }
        const [runtimeErrors, staticAnalysis] = await Promise.all([
            this.fetchRuntimeErrors(resetIssues),
            this.runStaticAnalysisCode()
        ]);
        this.logger.info("Fetched all issues:", JSON.stringify({ runtimeErrors, staticAnalysis }));

        return { runtimeErrors, staticAnalysis };
    }

    async updateProjectName(newName: string): Promise<boolean> {
        try {
            const valid = /^[a-z0-9-_]{3,50}$/.test(newName);
            if (!valid) return false;
            const updatedBlueprint = { ...this.state.blueprint, projectName: newName };
            this.setState({
                ...this.state,
                blueprint: updatedBlueprint
            });
            let ok = true;
            if (this.state.sandboxInstanceId) {
                try {
                    ok = await this.getSandboxServiceClient().updateProjectName(this.state.sandboxInstanceId, newName);
                } catch (_) {
                    ok = false;
                }
            }
            try {
                const appService = new AppService(this.env);
                const dbOk = await appService.updateApp(this.getAgentId(), { title: newName });
                ok = ok && dbOk;
            } catch (error) {
                this.logger.error('Error updating project name in database:', error);
                ok = false;
            }
            this.broadcast(WebSocketMessageResponses.PROJECT_NAME_UPDATED, {
                message: 'Project name updated',
                projectName: newName
            });
            return ok;
        } catch (error) {
            this.logger.error('Error updating project name:', error);
            return false;
        }
    }

    /**
     * Update user-facing blueprint fields.
     * Only allows updating safe, cosmetic fields - not internal generation state.
     */
    async updateBlueprint(patch: Partial<Blueprint>): Promise<Blueprint> {
        // Fields that are safe to update after generation starts.
        // Excludes: initialPhase (breaks phasic generation).
        const safeUpdatableFields = new Set([
            'title',
            'description',
            'detailedDescription',
            'colorPalette',
            'views',
            'userFlow',
            'dataFlow',
            'architecture',
            'pitfalls',
            'frameworks',
            'implementationRoadmap'
        ]);

        // Filter to only safe fields
        const filtered: Record<string, unknown> = {};
        for (const [key, value] of Object.entries(patch)) {
            if (safeUpdatableFields.has(key) && value !== undefined) {
                filtered[key] = value;
            }
        }

        // Agentic: allow initializing the plan if not set yet (first-time plan initialization only)
        if (this.isAgenticState(this.state)) {
            const currentPlan = this.state.blueprint?.plan;
            const patchPlan = 'plan' in patch ? patch.plan : undefined;
            if (Array.isArray(patchPlan) && (!Array.isArray(currentPlan) || currentPlan.length === 0)) {
                filtered['plan'] = patchPlan;
            }
        }

        // projectName requires a sandbox update, so handle it separately
        if ('projectName' in patch && typeof patch.projectName === 'string') {
            await this.updateProjectName(patch.projectName);
        }

        // Merge and update state
        const updated = { ...this.state.blueprint, ...filtered } as Blueprint;
        this.setState({
            ...this.state,
            blueprint: updated
        });

        this.broadcast(WebSocketMessageResponses.BLUEPRINT_UPDATED, {
            message: 'Blueprint updated',
            updatedKeys: Object.keys(filtered)
        });

        return updated;
    }

    // ===== Debugging helpers for assistants =====
    listFiles(): FileOutputType[] {
        return this.fileManager.getAllRelevantFiles();
    }

    async readFiles(paths: string[]): Promise<{ files: { path: string; content: string }[] }> {
        const results: { path: string; content: string }[] = [];
        const notFoundInFileManager: string[] = [];

        // First, try to read from the FileManager (template + generated files)
        for (const path of paths) {
            const file = this.fileManager.getFile(path);
            if (file) {
                results.push({ path, content: file.fileContents });
            } else {
                notFoundInFileManager.push(path);
            }
        }

        // If some files were not found in the FileManager and a sandbox exists, try the sandbox
        if (notFoundInFileManager.length > 0 && this.state.sandboxInstanceId) {
            const resp = await this.getSandboxServiceClient().getFiles(
                this.state.sandboxInstanceId,
                notFoundInFileManager
            );
            if (resp.success) {
                results.push(...resp.files.map(f => ({
                    path: f.filePath,
                    content: f.fileContents
                })));
            }
        }

        return { files: results };
    }

    async execCommands(commands: string[], shouldSave: boolean, timeout?: number): Promise<ExecuteCommandsResponse> {
        const { sandboxInstanceId } = this.state;
        if (!sandboxInstanceId) {
            return { success: false, results: [], error: 'No sandbox instance' };
        }
        const result = await this.getSandboxServiceClient().executeCommands(sandboxInstanceId, commands, timeout);
        if (shouldSave) {
            await this.saveExecutedCommands(commands);
        }
        return result;
    }

    updateSlideManifest(file: FileOutputType) {
        // Only applies to presentation projects: keep manifest.json in sync with slide files
        if (this.projectType !== 'presentation') {
            return;
        }
        const templateDetails = this.getTemplateDetails();
        if (!templateDetails) {
            return;
        }
        const slidesDirectory = templateDetails.slideDirectory ?? '/public/slides';
        if (!file.filePath.startsWith(slidesDirectory) || !file.filePath.endsWith('.json')) {
            return;
        }
        const manifestPath = `${slidesDirectory}/manifest.json`;
        // Skip the manifest itself so it never lists itself as a slide
        if (file.filePath === manifestPath) {
            return;
        }
        const existingManifest = this.fileManager.getFile(manifestPath);

        // Parse the existing manifest or start a new one
        let manifestData: { slides: string[] } = { slides: [] };
        if (existingManifest) {
            try {
                const parsed = JSON.parse(existingManifest.fileContents);
                manifestData = {
                    slides: Array.isArray(parsed.slides) ? parsed.slides : []
                };
            } catch (error) {
                this.logger.error('Failed to parse existing manifest.json', error);
            }
        }

        // Add the slide path to the slides array if not already present
        const relativeSlidePath = file.filePath.replace(slidesDirectory + '/', '');
        if (!manifestData.slides.includes(relativeSlidePath)) {
            manifestData.slides.push(relativeSlidePath);

            // Save updated manifest
            const updatedManifest: FileOutputType = {
                filePath: manifestPath,
                fileContents: JSON.stringify(manifestData, null, 2),
                filePurpose: 'Presentation slides manifest'
            };
            this.fileManager.recordFileChanges([updatedManifest]);

            this.logger.info('Updated manifest.json with new slide', {
                slidePath: relativeSlidePath,
                totalSlides: manifestData.slides.length
            });
        }
    }

    /**
     * Regenerate a file to fix identified issues.
     * Retries up to 3 times before giving up.
     */
    async regenerateFile(file: FileOutputType, issues: string[], retryIndex: number = 0) {
        this.broadcast(WebSocketMessageResponses.FILE_REGENERATING, {
            message: `Regenerating file: ${file.filePath}`,
            filePath: file.filePath,
            original_issues: issues,
        });

        const result = await this.operations.regenerateFile.execute(
            { file, issues, retryIndex },
            this.getOperationOptions()
        );

        this.updateSlideManifest(result);
        const fileState = await this.fileManager.saveGeneratedFile(result);

        this.broadcast(WebSocketMessageResponses.FILE_REGENERATED, {
            message: `Regenerated file: ${file.filePath}`,
            file: fileState,
            original_issues: issues,
        });

        return fileState;
    }

    async regenerateFileByPath(path: string, issues: string[]): Promise<{ path: string; diff: string }> {
        const templateDetails = this.getTemplateDetails();
        if (templateDetails && templateDetails.dontTouchFiles && templateDetails.dontTouchFiles.includes(path)) {
            return {
                path,
                diff: '<WRITE PROTECTED - TEMPLATE FILE, CANNOT MODIFY - SKIPPED - NO CHANGES MADE>'
            };
        }
        // Prefer the local file manager; fall back to the sandbox
        let fileContents = '';
        let filePurpose = '';
        try {
            const fmFile = this.fileManager.getFile(path);
            if (fmFile) {
                fileContents = fmFile.fileContents;
                filePurpose = fmFile.filePurpose || '';
            } else {
                const { sandboxInstanceId } = this.state;
                if (!sandboxInstanceId) {
                    throw new Error('No sandbox instance available');
                }
                const resp = await this.getSandboxServiceClient().getFiles(sandboxInstanceId, [path]);
                const f = resp.success ? resp.files.find(f => f.filePath === path) : undefined;
                if (!f) throw new Error(resp.error || `File not found: ${path}`);
                fileContents = f.fileContents;
            }
        } catch (e) {
            throw new Error(`Failed to read file for regeneration: ${String(e)}`);
        }

        const regenerated = await this.regenerateFile({ filePath: path, fileContents, filePurpose }, issues, 0);
        // Invalidate the static analysis cache
        this.staticAnalysisCache = null;
        // Persist to the sandbox instance
        await this.deploymentManager.deployToSandbox([regenerated]);
        return { path, diff: regenerated.lastDiff };
    }

    async generateFiles(
        phaseName: string,
        phaseDescription: string,
        requirements: string[],
        files: FileConceptType[]
    ): Promise<{ files: Array<{ path: string; purpose: string; diff: string }> }> {
        this.logger.info('Generating files for deep debugger', {
            phaseName,
            requirementsCount: requirements.length,
            filesCount: files.length
        });

        // Broadcast that file generation has started
        this.broadcast(WebSocketMessageResponses.PHASE_IMPLEMENTING, {
            message: `Generating files: ${phaseName}`,
            phaseName
        });

        const skippedFiles: { path: string; purpose: string; diff: string }[] = [];

        // Enforce the template's dont-touch constraints
        const templateDetails = this.getTemplateDetails();
        if (templateDetails && templateDetails.dontTouchFiles) {
            const dontTouchFiles = new Set<string>(templateDetails.dontTouchFiles);
            files = files.filter(file => {
                if (dontTouchFiles.has(file.path)) {
                    this.logger.info('Skipping dont-touch file', { filePath: file.path });
                    skippedFiles.push({ path: file.path, purpose: `WRITE-PROTECTED FILE, CANNOT MODIFY`, diff: "<WRITE PROTECTED - TEMPLATE FILE, CANNOT MODIFY - SKIPPED - NO CHANGES MADE>" });
                    return false;
                }
                return true;
            });
        }

        const savedFiles: FileState[] = [];

        const operation = new SimpleCodeGenerationOperation();
        const result = await operation.execute(
            {
                phaseName,
                phaseDescription,
                requirements,
                files,
                fileGeneratingCallback: (filePath: string, filePurpose: string) => {
                    this.broadcast(WebSocketMessageResponses.FILE_GENERATING, {
                        message: `Generating file: ${filePath}`,
                        filePath,
                        filePurpose
                    });
                },
                fileChunkGeneratedCallback: (filePath: string, chunk: string, format: 'full_content' | 'unified_diff') => {
                    this.broadcast(WebSocketMessageResponses.FILE_CHUNK_GENERATED, {
                        message: `Generating file: ${filePath}`,
                        filePath,
                        chunk,
                        format
                    });
                },
                fileClosedCallback: (file, message) => {
                    // Record the file to state (sync)
                    const saved = this.fileManager.recordFileChanges([file]);
                    savedFiles.push(...saved);
                    this.updateSlideManifest(file);
                    this.broadcast(WebSocketMessageResponses.FILE_GENERATED, {
                        message,
                        file
                    });
                }
            },
            this.getOperationOptions()
        );

        await this.fileManager.saveGeneratedFiles(
            [],
            `feat: ${phaseName}\n\n${phaseDescription}`
        );

        this.logger.info('Files generated and saved', {
            fileCount: result.files.length
        });

        await this.deployToSandbox(savedFiles, false);

        return {
            files: [
                ...skippedFiles,
                ...savedFiles.map(f => ({
                    path: f.filePath,
                    purpose: f.filePurpose || '',
                    diff: f.lastDiff || ''
                }))
            ]
        };
    }

    /**
     * Get or create the file serving token (lazy generation)
     */
    private getOrCreateFileServingToken(): string {
        if (!this.state.fileServingToken) {
            const token = generatePortToken();
            this.setState({
                ...this.state,
                fileServingToken: {
                    token,
                    createdAt: Date.now()
                }
            });
        }
        return this.state.fileServingToken!.token;
    }

    /**
     * Get the browser preview URL for file serving
     */
    public getBrowserPreviewURL(): string {
        const token = this.getOrCreateFileServingToken();
        const agentId = this.getAgentId();
        const previewDomain = isDev(this.env) ? 'localhost:5173' : getPreviewDomain(this.env);

        // Format: b-{agentid}-{token}.{previewDomain}
        return `${getProtocolForHost(previewDomain)}://b-${agentId}-${token}.${previewDomain}`;
    }

    // A wrapper for the LLM tool to deploy to the sandbox
    async deployPreview(clearLogs: boolean = true, forceRedeploy: boolean = false): Promise<string> {
        const response = await this.deployToSandbox([], forceRedeploy, undefined, clearLogs);
        if (response && response.previewURL) {
            this.broadcast(WebSocketMessageResponses.PREVIEW_FORCE_REFRESH, {});
            return `Deployment successful: ${response.previewURL}`;
        }
        return `Failed to deploy: ${response?.tunnelURL}`;
    }

    async deployToSandbox(files: FileOutputType[] = [], redeploy: boolean = false, commitMessage?: string, clearLogs: boolean = false): Promise<PreviewType | null> {
        // Only deploy if the project is previewable
        if (!this.isPreviewable()) {
            throw new Error('Project is not previewable');
        }
        this.logger.info('[AGENT] Deploying to sandbox', { files: files.length, redeploy, commitMessage, renderMode: this.getTemplateDetails()?.renderMode, templateDetails: this.getTemplateDetails() });

        if (this.getTemplateDetails()?.renderMode === 'browser') {
            this.logger.info('Deploying to browser native sandbox');
            this.broadcast(WebSocketMessageResponses.DEPLOYMENT_STARTED, {});
            const result: PreviewType = {
                previewURL: this.getBrowserPreviewURL()
            };
            this.logger.info('Deployed to browser native sandbox');
            this.broadcast(WebSocketMessageResponses.DEPLOYMENT_COMPLETED, result);
            return result;
        }

        // Invalidate the static analysis cache
        this.staticAnalysisCache = null;

        // Call the deployment manager with callbacks for broadcasting at the right times
        const result = await this.deploymentManager.deployToSandbox(
            files,
            redeploy,
            commitMessage,
            clearLogs,
            {
                onStarted: (data) => {
                    this.broadcast(WebSocketMessageResponses.DEPLOYMENT_STARTED, data);
                },
                onCompleted: (data) => {
                    this.broadcast(WebSocketMessageResponses.DEPLOYMENT_COMPLETED, data);
                },
                onError: (data) => {
                    this.broadcast(WebSocketMessageResponses.DEPLOYMENT_FAILED, data);
                },
                onAfterSetupCommands: async () => {
                    // Sync package.json after setup commands (includes dependency installs)
                    await this.syncPackageJsonFromSandbox();
                }
            }
        );

        return result;
    }

    /**
     * Deploy the generated code to Cloudflare Workers
     */
    async deployToCloudflare(target: DeploymentTarget = 'platform'): Promise<{ deploymentUrl?: string; workersUrl?: string } | null> {
        try {
            // Ensure a sandbox instance exists first
            if (!this.state.sandboxInstanceId) {
                this.logger.info('No sandbox instance, deploying to sandbox first');
                await this.deployToSandbox();

                if (!this.state.sandboxInstanceId) {
                    this.logger.error('Failed to deploy to sandbox service');
                    this.broadcast(WebSocketMessageResponses.CLOUDFLARE_DEPLOYMENT_ERROR, {
                        message: 'Deployment failed: Failed to deploy to sandbox service',
                        error: 'Sandbox service unavailable'
                    });
                    return null;
                }
            }

            // Call the service - it handles orchestration, with callbacks for broadcasting
            const result = await this.deploymentManager.deployToCloudflare({
                target,
                callbacks: {
                    onStarted: (data) => {
                        this.broadcast(WebSocketMessageResponses.CLOUDFLARE_DEPLOYMENT_STARTED, data);
                    },
                    onCompleted: (data) => {
                        this.broadcast(WebSocketMessageResponses.CLOUDFLARE_DEPLOYMENT_COMPLETED, data);
                    },
                    onError: (data) => {
                        this.broadcast(WebSocketMessageResponses.CLOUDFLARE_DEPLOYMENT_ERROR, data);
                    },
                }
            });

            // Update the database with the deployment ID if successful
            if (result.deploymentUrl && result.deploymentId) {
                const appService = new AppService(this.env);
                await appService.updateDeploymentId(
                    this.getAgentId(),
                    result.deploymentId
                );
            }

            return result.deploymentUrl ? { deploymentUrl: result.deploymentUrl } : null;
        } catch (error) {
            this.logger.error('Cloudflare deployment error:', error);
            this.broadcast(WebSocketMessageResponses.CLOUDFLARE_DEPLOYMENT_ERROR, {
                message: 'Deployment failed',
                error: error instanceof Error ? error.message : String(error)
            });
            return null;
        }
    }

    async importTemplate(templateName: string): Promise<{ templateName: string; filesImported: number; files: TemplateFile[] }> {
        this.logger.info(`Importing template into project: ${templateName}`);

        if (this.state.templateName !== templateName) {
            // Get template catalog info to sync projectType
            const catalogResponse = await BaseSandboxService.listTemplates();
            const catalogInfo = catalogResponse.success
                ? catalogResponse.templates.find(t => t.name === templateName)
                : null;

            // Update state with the template name and, if available, the projectType
            this.setState({
                ...this.state,
                templateName: templateName,
                ...(catalogInfo?.projectType ? { projectType: catalogInfo.projectType } : {}),
            });

            this.templateDetailsCache = null; // Clear the template details cache
        }
        const templateDetails = await this.ensureTemplateDetails();
        if (!templateDetails) {
            throw new Error(`Failed to get template details for: ${templateName}`);
        }

        this.setState({
            ...this.state,
            lastPackageJson: templateDetails.allFiles['package.json'] || this.state.lastPackageJson,
        });

        // Get the important files for the return value
        const importantFiles = getTemplateImportantFiles(templateDetails);

        // Ensure deployment to the sandbox
        await this.deployToSandbox();

        // Notify the frontend about the template metadata update
        this.broadcast(WebSocketMessageResponses.TEMPLATE_UPDATED, {
            templateDetails
        });

        return {
            templateName: templateDetails.name,
            filesImported: Object.keys(templateDetails.allFiles).length,
            files: importantFiles
        };
    }

    async waitForGeneration(): Promise<void> {
        if (this.generationPromise) {
            try {
                await this.generationPromise;
                this.logger.info("Code generation completed successfully");
            } catch (error) {
                this.logger.error("Error during code generation:", error);
            }
        } else {
            this.logger.error("No generation process found");
        }
    }

    isDeepDebugging(): boolean {
        return this.deepDebugPromise !== null;
    }

    getDeepDebugSessionState(): { conversationId: string } | null {
        if (this.deepDebugConversationId && this.deepDebugPromise) {
            return { conversationId: this.deepDebugConversationId };
        }
        return null;
    }

    async waitForDeepDebug(): Promise<void> {
        if (this.deepDebugPromise) {
            try {
                await this.deepDebugPromise;
                this.logger.info("Deep debug session completed successfully");
            } catch (error) {
                this.logger.error("Error during deep debug session:", error);
            } finally {
                // Clear the promise after waiting completes
                this.deepDebugPromise = null;
            }
        }
    }

    protected async onProjectUpdate(message: string): Promise<void> {
        this.setState({
            ...this.state,
            projectUpdatesAccumulator: [...this.state.projectUpdatesAccumulator, message]
        });
    }

    protected async getAndResetProjectUpdates() {
        const projectUpdates = this.state.projectUpdatesAccumulator || [];
        this.setState({
            ...this.state,
            projectUpdatesAccumulator: []
        });
        return projectUpdates;
    }

    public broadcast<T extends WebSocketMessageType>(msg: T, data?: WebSocketMessageData<T>): void {
        if (this.operations.processUserMessage.isProjectUpdateType(msg)) {
            let message = msg as string;
            if (data && 'message' in data) {
                message = (data as { message: string }).message;
            }
            this.onProjectUpdate(message);
        }
        super.broadcast(msg, data);
    }

    protected getBootstrapCommands() {
        const bootstrapCommands = this.state.commandsHistory || [];
        // Validate, deduplicate, and clean
        const { validCommands } = validateAndCleanBootstrapCommands(bootstrapCommands);
        return validCommands;
    }

| protected async saveExecutedCommands(commands: string[]) { | ||
| this.logger.info('Saving executed commands', { commands }); | ||
|
|
||
| // Merge with existing history | ||
| const mergedCommands = [...(this.state.commandsHistory || []), ...commands]; | ||
|
|
||
| // Validate, deduplicate, and clean | ||
| const { validCommands, invalidCommands, deduplicated } = validateAndCleanBootstrapCommands(mergedCommands); | ||
|
|
||
| // Log what was filtered out | ||
| if (invalidCommands.length > 0 || deduplicated > 0) { | ||
| this.logger.warn('[commands] Bootstrap commands cleaned', { | ||
| invalidCommands, | ||
| invalidCount: invalidCommands.length, | ||
| deduplicatedCount: deduplicated, | ||
| finalCount: validCommands.length | ||
| }); | ||
| } | ||
|
|
||
| // Update state with cleaned commands | ||
| this.setState({ | ||
| ...this.state, | ||
| commandsHistory: validCommands | ||
| }); | ||
|
|
||
| // Update bootstrap script with validated commands | ||
| await this.updateBootstrapScript(validCommands); | ||
|
|
||
| // Sync package.json if any dependency-modifying commands were executed | ||
| const hasDependencyCommands = commands.some(cmd => | ||
| cmd.includes('install') || | ||
| cmd.includes(' add ') || | ||
| cmd.includes('remove') || | ||
| cmd.includes('uninstall') | ||
| ); | ||
|
|
||
| if (hasDependencyCommands) { | ||
| this.logger.info('Dependency commands executed, syncing package.json from sandbox'); | ||
| await this.syncPackageJsonFromSandbox(); | ||
| } | ||
| } | ||
|
|
||
| /** | ||
| * Execute commands with retry logic | ||
| * Chunks commands and retries failed ones with AI assistance | ||
| */ | ||
| protected async executeCommands(commands: string[], shouldRetry: boolean = true, chunkSize: number = 5): Promise<void> { | ||
| const state = this.state; | ||
| if (!state.sandboxInstanceId) { | ||
| this.logger.warn('No sandbox instance available for executing commands'); | ||
| return; | ||
| } | ||
|
|
||
| // Sanitize and prepare commands | ||
| commands = commands.join('\n').split('\n').filter(cmd => cmd.trim() !== '').filter(cmd => looksLikeCommand(cmd) && !cmd.includes(' undefined')); | ||
| if (commands.length === 0) { | ||
| this.logger.warn("No commands to execute"); | ||
| return; | ||
| } | ||
|
|
||
| commands = commands.map(cmd => cmd.trim().replace(/^\s*-\s*/, '').replace(/^npm/, 'bun')); | ||
| this.logger.info(`AI suggested ${commands.length} commands to run: ${commands.join(", ")}`); | ||
|
|
||
| // Remove duplicate commands | ||
| commands = Array.from(new Set(commands)); | ||
|
|
||
| // Execute in chunks | ||
| const commandChunks = []; | ||
| for (let i = 0; i < commands.length; i += chunkSize) { | ||
| commandChunks.push(commands.slice(i, i + chunkSize)); | ||
| } | ||
|
|
||
| const successfulCommands: string[] = []; | ||
|
|
||
| for (const chunk of commandChunks) { | ||
| // Retry failed commands up to 3 times | ||
| let currentChunk = chunk; | ||
| let retryCount = 0; | ||
| const maxRetries = shouldRetry ? 3 : 1; | ||
|
|
||
| while (currentChunk.length > 0 && retryCount < maxRetries) { | ||
| try { | ||
| this.broadcast(WebSocketMessageResponses.COMMAND_EXECUTING, { | ||
| message: retryCount > 0 ? `Retrying commands (attempt ${retryCount + 1}/${maxRetries})` : "Executing commands", | ||
| commands: currentChunk | ||
| }); | ||
|
|
||
| const resp = await this.getSandboxServiceClient().executeCommands( | ||
| state.sandboxInstanceId, | ||
| currentChunk | ||
| ); | ||
| if (!resp.results || !resp.success) { | ||
| this.logger.error('Failed to execute commands', { response: resp }); | ||
| // Check if instance is still running | ||
| const status = await this.getSandboxServiceClient().getInstanceStatus(state.sandboxInstanceId); | ||
| if (!status.success || !status.isHealthy) { | ||
| this.logger.error(`Instance ${state.sandboxInstanceId} is no longer running`); | ||
| return; | ||
| } | ||
| break; | ||
| } | ||
|
|
||
| // Process results | ||
| const successful = resp.results.filter(r => r.success); | ||
| const failures = resp.results.filter(r => !r.success); | ||
|
|
||
| // Track successful commands | ||
| if (successful.length > 0) { | ||
| const successfulCmds = successful.map(r => r.command); | ||
| this.logger.info(`Successfully executed ${successful.length} commands: ${successfulCmds.join(", ")}`); | ||
| successfulCommands.push(...successfulCmds); | ||
| } | ||
|
|
||
| // If all succeeded, move to next chunk | ||
| if (failures.length === 0) { | ||
| this.logger.info(`All commands in chunk executed successfully`); | ||
| break; | ||
| } | ||
|
|
||
| // Handle failures | ||
| const failedCommands = failures.map(r => r.command); | ||
| this.logger.warn(`${failures.length} commands failed: ${failedCommands.join(", ")}`); | ||
|
|
||
| // Only retry if shouldRetry is true | ||
| if (!shouldRetry) { | ||
| break; | ||
| } | ||
|
|
||
| retryCount++; | ||
|
|
||
| // For install commands, try AI regeneration | ||
| const failedInstallCommands = failedCommands.filter(cmd => | ||
| cmd.startsWith("bun") || cmd.startsWith("npm") || cmd.includes("install") | ||
| ); | ||
|
|
||
| if (failedInstallCommands.length > 0 && retryCount < maxRetries) { | ||
| // Use AI to suggest alternative commands | ||
| const newCommands = await this.getProjectSetupAssistant().generateSetupCommands( | ||
| `The following install commands failed: ${JSON.stringify(failures, null, 2)}. Please suggest alternative commands.` | ||
| ); | ||
|
|
||
| if (newCommands?.commands && newCommands.commands.length > 0) { | ||
| this.logger.info(`AI suggested ${newCommands.commands.length} alternative commands`); | ||
| this.broadcast(WebSocketMessageResponses.COMMAND_EXECUTING, { | ||
| message: "Executing regenerated commands", | ||
| commands: newCommands.commands | ||
| }); | ||
| currentChunk = newCommands.commands.filter(looksLikeCommand); | ||
| } else { | ||
| this.logger.warn('AI could not generate alternative commands'); | ||
| currentChunk = []; | ||
| } | ||
| } else { | ||
| // No retry needed for non-install commands | ||
| currentChunk = []; | ||
| } | ||
| } catch (error) { | ||
| this.logger.error('Error executing commands:', error); | ||
| // Stop retrying on error | ||
| break; | ||
| } | ||
| } | ||
| } | ||
|
|
||
| // Record command execution history | ||
| const failedCommands = commands.filter(cmd => !successfulCommands.includes(cmd)); | ||
|
|
||
| if (failedCommands.length > 0) { | ||
| this.broadcastError('Failed to execute commands', new Error(failedCommands.join(", "))); | ||
| } else { | ||
| this.logger.info(`All commands executed successfully: ${successfulCommands.join(", ")}`); | ||
| } | ||
|
|
||
| this.saveExecutedCommands(successfulCommands); | ||
| } | ||
    /**
     * Sync package.json from the sandbox to the agent's git repository.
     * Called after install/add/remove commands to keep dependencies in sync.
     */
    protected async syncPackageJsonFromSandbox(): Promise<void> {
        try {
            this.logger.info('Fetching current package.json from sandbox');
            const results = await this.readFiles(['package.json']);
            if (!results || !results.files || results.files.length === 0) {
                this.logger.warn('Failed to fetch package.json from sandbox', { results });
                return;
            }
            const packageJsonContent = results.files[0].content;

            const { updated, packageJson } = updatePackageJson(this.state.lastPackageJson, packageJsonContent);
            if (!updated) {
                this.logger.info('package.json has not changed, skipping sync');
                return;
            }
            // Update state with the latest package.json
            this.setState({
                ...this.state,
                lastPackageJson: packageJson
            });

            // Commit to git repository
            const fileState = await this.fileManager.saveGeneratedFile(
                {
                    filePath: 'package.json',
                    fileContents: packageJson,
                    filePurpose: 'Project dependencies and configuration'
                },
                'chore: sync package.json dependencies from sandbox',
                true
            );

            this.logger.info('Successfully synced package.json to git', {
                filePath: fileState.filePath,
            });

            // Broadcast update to clients
            this.broadcast(WebSocketMessageResponses.FILE_GENERATED, {
                message: 'Synced package.json from sandbox',
                file: fileState
            });
        } catch (error) {
            this.logger.error('Failed to sync package.json from sandbox', error);
            // Non-critical error - don't throw, just log
        }
    }

    async getLogs(_reset?: boolean, durationSeconds?: number): Promise<string> {
        if (!this.state.sandboxInstanceId) {
            throw new Error('Cannot get logs: No sandbox instance available');
        }

        const response = await this.getSandboxServiceClient().getLogs(this.state.sandboxInstanceId, _reset, durationSeconds);
        if (response.success) {
            return `STDOUT: ${response.logs.stdout}\nSTDERR: ${response.logs.stderr}`;
        }
        return `Failed to get logs, ${response.error}`;
    }

    /**
     * Delete files from the file manager and the sandbox.
     */
    async deleteFiles(filePaths: string[]): Promise<{ success: boolean, error?: string }> {
        const deleteCommands: string[] = filePaths.map(filePath => `rm -rf ${filePath}`);
        // Remove the files from the file manager
        this.fileManager.deleteFiles(filePaths);
        try {
            await this.executeCommands(deleteCommands, false);
            this.logger.info(`Deleted ${filePaths.length} files: ${filePaths.join(", ")}`);
            return { success: true };
        } catch (error) {
            this.logger.error('Error deleting files:', error);
            return { success: false, error: error instanceof Error ? error.message : String(error) };
        }
    }

    /**
     * Handle user input during conversational code generation.
     * Processes user messages and updates pendingUserInputs state.
     */
    async handleUserInput(userMessage: string, images?: ImageAttachment[]): Promise<void> {
        try {
            this.logger.info('Processing user input message', {
                messageLength: userMessage.length,
                pendingInputsCount: this.state.pendingUserInputs.length,
                hasImages: !!images && images.length > 0,
                imageCount: images?.length || 0
            });

            // Ensure template details are loaded before processing
            await this.ensureTemplateDetails();

            // Just fetch runtime errors
            const errors = await this.fetchRuntimeErrors(false, false);
            const projectUpdates = await this.getAndResetProjectUpdates();
            this.logger.info('Passing context to user conversation processor', { errors, projectUpdates });

            const conversationState = this.infrastructure.getConversationState();
            // If there are images, upload them and pass the URLs to the conversation processor
            let uploadedImages: ProcessedImageAttachment[] = [];
            if (images) {
                uploadedImages = await Promise.all(images.map(async (image) => {
                    return await uploadImage(this.env, image, ImageType.UPLOADS);
                }));

                this.logger.info('Uploaded images', { uploadedImages });
            }

            // Process the user message using the conversational assistant
            const conversationalResponse = await this.operations.processUserMessage.execute(
                {
                    userMessage,
                    conversationState,
                    conversationResponseCallback: (
                        message: string,
                        conversationId: string,
                        isStreaming: boolean,
                        tool?: { name: string; status: 'start' | 'success' | 'error'; args?: Record<string, unknown> }
                    ) => {
                        // Track conversationId when deep_debug starts
                        if (tool?.name === 'deep_debug' && tool.status === 'start') {
                            this.deepDebugConversationId = conversationId;
                        }

                        this.broadcast(WebSocketMessageResponses.CONVERSATION_RESPONSE, {
                            message,
                            conversationId,
                            isStreaming,
                            tool,
                        });
                    },
                    errors,
                    projectUpdates,
                    images: uploadedImages
                },
                this.getOperationOptions()
            );

            const { conversationResponse, conversationState: newConversationState } = conversationalResponse;
            this.logger.info('User input processed successfully', {
                responseLength: conversationResponse.userResponse.length,
            });

            this.infrastructure.setConversationState(newConversationState);
        } catch (error) {
            this.logger.error('Error processing user input', error);
            throw error;
        }
    }

    /**
     * Capture a screenshot of the given URL using the Cloudflare Browser Rendering REST API.
     */
    public async captureScreenshot(
        url: string,
        viewport: { width: number; height: number } = { width: 1280, height: 720 }
    ): Promise<string> {
        if (!this.env.DB || !this.getAgentId()) {
            const error = 'Cannot capture screenshot: DB or agentId not available';
            this.logger.warn(error);
            this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_ERROR, {
                error,
                configurationError: true
            });
            throw new Error(error);
        }

        if (!url) {
            const error = 'URL is required for screenshot capture';
            this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_ERROR, {
                error,
                url,
                viewport
            });
            throw new Error(error);
        }

        this.logger.info('Capturing screenshot via REST API', { url, viewport });

        // Notify start of screenshot capture
        this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_STARTED, {
            message: `Capturing screenshot of ${url}`,
            url,
            viewport
        });

        try {
            // Use the Cloudflare Browser Rendering REST API
            const apiUrl = `https://api.cloudflare.com/client/v4/accounts/${this.env.CLOUDFLARE_ACCOUNT_ID}/browser-rendering/snapshot`;

            const response = await fetch(apiUrl, {
                method: 'POST',
                headers: {
                    'Authorization': `Bearer ${this.env.CLOUDFLARE_API_TOKEN}`,
                    'Content-Type': 'application/json',
                },
                body: JSON.stringify({
                    url: url,
                    viewport: viewport,
                    gotoOptions: {
                        waitUntil: 'networkidle0',
                        timeout: 10000
                    },
                    screenshotOptions: {
                        fullPage: false,
                        type: 'png'
                    }
                }),
            });

            if (!response.ok) {
                const errorText = await response.text();
                const error = `Browser Rendering API failed: ${response.status} - ${errorText}`;
                this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_ERROR, {
                    error,
                    url,
                    viewport,
                    statusCode: response.status,
                    statusText: response.statusText
                });
                throw new Error(error);
            }

            const result = await response.json() as {
                success: boolean;
                result: {
                    screenshot: string; // base64 encoded
                    content: string; // HTML content
                };
            };

            if (!result.success || !result.result.screenshot) {
                const error = 'Browser Rendering API succeeded but no screenshot returned';
                this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_ERROR, {
                    error,
                    url,
                    viewport,
                    apiResponse: result
                });
                throw new Error(error);
            }

            // Get base64 screenshot data
            const base64Screenshot = result.result.screenshot;
            const screenshot: ImageAttachment = {
                id: this.getAgentId(),
                filename: 'latest.png',
                mimeType: 'image/png',
                base64Data: base64Screenshot
            };
            const uploadedImage = await uploadImage(this.env, screenshot, ImageType.SCREENSHOTS);

            // Persist in database
            try {
                const appService = new AppService(this.env);
                await appService.updateAppScreenshot(this.getAgentId(), uploadedImage.publicUrl);
            } catch (dbError) {
                const error = `Database update failed: ${dbError instanceof Error ? dbError.message : 'Unknown database error'}`;
                this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_ERROR, {
                    error,
                    url,
                    viewport,
                    screenshotCaptured: true,
                    databaseError: true
                });
                throw new Error(error);
            }

            this.logger.info('Screenshot captured and stored successfully', {
                url,
                storage: uploadedImage.publicUrl.startsWith('data:') ? 'database' : (uploadedImage.publicUrl.includes('/api/screenshots/') ? 'r2' : 'images'),
                length: base64Screenshot.length
            });

            // Notify successful screenshot capture
            this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_SUCCESS, {
                message: `Successfully captured screenshot of ${url}`,
                url,
                viewport,
                screenshotSize: base64Screenshot.length,
                timestamp: new Date().toISOString()
            });

            return uploadedImage.publicUrl;
        } catch (error) {
            this.logger.error('Failed to capture screenshot via REST API:', error);

            // Only broadcast if the error wasn't already broadcast above
            const errorMessage = error instanceof Error ? error.message : 'Unknown error';
            if (!errorMessage.includes('Browser Rendering API') && !errorMessage.includes('Database update failed')) {
                this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_ERROR, {
                    error: errorMessage,
                    url,
                    viewport
                });
            }

            throw new Error(`Screenshot capture failed: ${errorMessage}`);
        }
    }
}
🟡 MEDIUM - File exceeds 300 lines (1771 lines)
Category: quality
Description:
File is too large and difficult to maintain. Should be split into focused modules.
Suggestion:
Split into smaller modules: FileManagement, DeploymentManagement, CommandExecution, DebugOperations, UserInteraction modules
Confidence: 95%
Rule: ts_large_file
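The reviewer's suggested decomposition could be approached with composition, where the agent becomes a thin facade over focused modules. A minimal sketch under that assumption (module and method names here are illustrative, not the PR's actual structure):

```typescript
// Sketch: splitting a large agent class into focused modules via
// composition. All names below are illustrative assumptions.

interface Logger {
    info(msg: string): void;
}

// One concern per module, e.g. command normalization...
class CommandExecution {
    constructor(private logger: Logger) {}
    dedupe(commands: string[]): string[] {
        // Same trim + Set-based dedup the agent performs inline today.
        const cleaned = Array.from(new Set(commands.map(c => c.trim()).filter(Boolean)));
        this.logger.info(`deduped to ${cleaned.length} commands`);
        return cleaned;
    }
}

// ...and file tracking.
class FileManagement {
    private files = new Map<string, string>();
    save(path: string, contents: string): void {
        this.files.set(path, contents);
    }
    count(): number {
        return this.files.size;
    }
}

// The agent facade delegates instead of implementing everything itself,
// keeping each file well under the 300-line threshold the rule flags.
class AgentFacade {
    readonly commands: CommandExecution;
    readonly filesStore: FileManagement;
    constructor(logger: Logger) {
        this.commands = new CommandExecution(logger);
        this.filesStore = new FileManagement();
    }
}

const agent = new AgentFacade({ info: () => {} });
agent.filesStore.save('package.json', '{}');
const deduped = agent.commands.dedupe(['bun install', 'bun install', ' bun run build ']);
```

Each module can then be unit-tested without constructing a Durable Object, which is the main maintainability win the review comment is after.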
| import { Agent, AgentContext, ConnectionContext } from "agents"; | ||
| import { AgentInitArgs, AgentSummary, DeployOptions, DeployResult, ExportOptions, ExportResult, DeploymentTarget, BehaviorType } from "./types"; | ||
| import { AgenticState, AgentState, BaseProjectState, CurrentDevState, MAX_PHASES, PhasicState } from "./state"; | ||
| import { Blueprint } from "../schemas"; | ||
| import { BaseCodingBehavior } from "./behaviors/base"; | ||
| import { createObjectLogger, StructuredLogger } from '../../logger'; | ||
| import { InferenceContext } from "../inferutils/config.types"; | ||
| import { getMimeType } from 'hono/utils/mime'; | ||
| import { normalizePath, isPathSafe } from '../../utils/pathUtils'; | ||
| import { FileManager } from '../services/implementations/FileManager'; | ||
| import { DeploymentManager } from '../services/implementations/DeploymentManager'; | ||
| import { GitVersionControl } from '../git'; | ||
| import { StateManager } from '../services/implementations/StateManager'; | ||
| import { PhasicCodingBehavior } from './behaviors/phasic'; | ||
| import { AgenticCodingBehavior } from './behaviors/agentic'; | ||
| import { SqlExecutor } from '../git'; | ||
| import { AgentInfrastructure } from "./AgentCore"; | ||
| import { ProjectType } from './types'; | ||
| import { Connection } from 'agents'; | ||
| import { handleWebSocketMessage, handleWebSocketClose, broadcastToConnections, sendToConnection } from './websocket'; | ||
| import { WebSocketMessageData, WebSocketMessageType } from "worker/api/websocketTypes"; | ||
| import { PreviewType, TemplateDetails } from "worker/services/sandbox/sandboxTypes"; | ||
| import { WebSocketMessageResponses } from "../constants"; | ||
| import { AppService, ModelConfigService } from "worker/database"; | ||
| import { ConversationMessage, ConversationState } from "../inferutils/common"; | ||
| import { ImageAttachment } from "worker/types/image-attachment"; | ||
| import { RateLimitExceededError } from "shared/types/errors"; | ||
| import { ProjectObjective } from "./objectives/base"; | ||
| import { FileOutputType } from "../schemas"; | ||
|
|
||
| const DEFAULT_CONVERSATION_SESSION_ID = 'default'; | ||
|
|
||
| interface AgentBootstrapProps { | ||
| behaviorType?: BehaviorType; | ||
| projectType?: ProjectType; | ||
| } | ||
|
|
||
| export class CodeGeneratorAgent extends Agent<Env, AgentState> implements AgentInfrastructure<AgentState> { | ||
| public _logger: StructuredLogger | undefined; | ||
| private behavior!: BaseCodingBehavior<AgentState>; | ||
| private objective!: ProjectObjective<BaseProjectState>; | ||
| protected static readonly PROJECT_NAME_PREFIX_MAX_LENGTH = 20; | ||
| // Services | ||
| readonly fileManager: FileManager; | ||
| readonly deploymentManager: DeploymentManager; | ||
| readonly git: GitVersionControl; | ||
|
|
||
| // Redeclare as public to satisfy AgentInfrastructure interface | ||
| declare public readonly env: Env; | ||
| declare public readonly sql: SqlExecutor; | ||
|
|
||
| // ========================================== | ||
| // Initialization | ||
| // ========================================== | ||
|
|
||
| initialState = { | ||
| behaviorType: 'unknown' as BehaviorType, | ||
| projectType: 'unknown' as ProjectType, | ||
| projectName: "", | ||
| query: "", | ||
| sessionId: '', | ||
| hostname: '', | ||
| blueprint: {} as unknown as Blueprint, | ||
| templateName: '', | ||
| generatedFilesMap: {}, | ||
| conversationMessages: [], | ||
| inferenceContext: {} as InferenceContext, | ||
| shouldBeGenerating: false, | ||
| sandboxInstanceId: undefined, | ||
| commandsHistory: [], | ||
| lastPackageJson: '', | ||
| pendingUserInputs: [], | ||
| projectUpdatesAccumulator: [], | ||
| lastDeepDebugTranscript: null, | ||
| mvpGenerated: false, | ||
| reviewingInitiated: false, | ||
| generatedPhases: [], | ||
| currentDevState: CurrentDevState.IDLE, | ||
| phasesCounter: MAX_PHASES, | ||
| } as AgentState; | ||
|
|
||
| constructor(ctx: AgentContext, env: Env) { | ||
| super(ctx, env); | ||
|
|
||
| this.sql`CREATE TABLE IF NOT EXISTS full_conversations (id TEXT PRIMARY KEY, messages TEXT)`; | ||
| this.sql`CREATE TABLE IF NOT EXISTS compact_conversations (id TEXT PRIMARY KEY, messages TEXT)`; | ||
|
|
||
| // Create StateManager | ||
| const stateManager = new StateManager( | ||
| () => this.state, | ||
| (s) => this.setState(s) | ||
| ); | ||
|
|
||
| this.git = new GitVersionControl(this.sql.bind(this)); | ||
| this.fileManager = new FileManager( | ||
| stateManager, | ||
| () => this.behavior?.getTemplateDetails?.() || null, | ||
| this.git | ||
| ); | ||
| this.deploymentManager = new DeploymentManager( | ||
| { | ||
| stateManager, | ||
| fileManager: this.fileManager, | ||
| getLogger: () => this.logger(), | ||
| env: this.env | ||
| }, | ||
| 10, // MAX_COMMANDS_HISTORY | ||
| ); | ||
| } | ||
| private createObjective(projectType: ProjectType): ProjectObjective<BaseProjectState> { | ||
| return new ProjectObjective(this as AgentInfrastructure<BaseProjectState>, projectType); | ||
| } | ||
|
|
||
| /** | ||
| * Initialize the agent with project blueprint and template | ||
| * Only called once in an app's lifecycle | ||
| */ | ||
| async initialize( | ||
| initArgs: AgentInitArgs<AgentState>, | ||
| ..._args: unknown[] | ||
| ): Promise<AgentState> { | ||
| const { inferenceContext } = initArgs; | ||
| const sandboxSessionId = DeploymentManager.generateNewSessionId(); | ||
| this.initLogger(inferenceContext.agentId, inferenceContext.userId, sandboxSessionId); | ||
|
|
||
| // Infrastructure setup | ||
| await this.gitInit(); | ||
|
|
||
| // Let behavior handle all state initialization (blueprint, projectName, etc.) | ||
| await this.behavior.initialize({ | ||
| ...initArgs, | ||
| sandboxSessionId // Pass generated session ID to behavior | ||
| }); | ||
|
|
||
| await this.saveToDatabase(); | ||
|
|
||
| return this.state; | ||
| } | ||
|
|
||
| async isInitialized() { | ||
| return this.getAgentId() ? true : false | ||
| } | ||
|
|
||
| /** | ||
| * Called evertime when agent is started or re-started | ||
| * @param props - Optional props | ||
| */ | ||
| async onStart(props?: Record<string, unknown> | undefined): Promise<void> { | ||
| this.logger().info(`Agent ${this.getAgentId()} session: ${this.state.sessionId} onStart`, { props }); | ||
|
|
||
| this.logger().info('Bootstrapping CodeGeneratorAgent', { props }); | ||
| const agentProps = props as AgentBootstrapProps; | ||
| const behaviorType = agentProps?.behaviorType ?? this.state.behaviorType ?? 'phasic'; | ||
| const projectType = agentProps?.projectType ?? this.state.projectType ?? 'app'; | ||
|
|
||
| if (behaviorType === 'phasic') { | ||
| this.behavior = new PhasicCodingBehavior(this as AgentInfrastructure<PhasicState>, projectType); | ||
| } else { | ||
| this.behavior = new AgenticCodingBehavior(this as AgentInfrastructure<AgenticState>, projectType); | ||
| } | ||
|
|
||
| // Create objective based on project type | ||
| this.objective = this.createObjective(projectType); | ||
|
|
||
| this.behavior.onStart(props); | ||
|
|
||
| // Ignore if agent not initialized | ||
| if (!this.state.query) { | ||
| this.logger().warn(`Agent ${this.getAgentId()} session: ${this.state.sessionId} onStart ignored, agent not initialized`); | ||
| return; | ||
| } | ||
|
|
||
| // Ensure state is migrated for any previous versions | ||
| this.behavior.migrateStateIfNeeded(); | ||
|
|
||
| // Check if this is a read-only operation | ||
| const readOnlyMode = props?.readOnlyMode === true; | ||
|
|
||
| if (readOnlyMode) { | ||
| this.logger().info(`Agent ${this.getAgentId()} starting in READ-ONLY mode - skipping expensive initialization`); | ||
| return; | ||
| } | ||
|
|
||
| // Just in case | ||
| await this.gitInit(); | ||
|
|
||
| await this.behavior.ensureTemplateDetails(); | ||
| this.logger().info(`Agent ${this.getAgentId()} session: ${this.state.sessionId} onStart processed successfully`); | ||
|
|
||
| // Load the latest user configs | ||
| const modelConfigService = new ModelConfigService(this.env); | ||
| const userConfigsRecord = await modelConfigService.getUserModelConfigs(this.state.inferenceContext.userId); | ||
| this.setState({ | ||
| ...this.state, | ||
| inferenceContext: { | ||
| ...this.state.inferenceContext, | ||
| userModelConfigs: userConfigsRecord, | ||
| }, | ||
| }); | ||
| this.logger().info(`Agent ${this.getAgentId()} session: ${this.state.sessionId} onStart: User configs loaded successfully`, {userConfigsRecord}); | ||
| } | ||
|
|
||
| onConnect(connection: Connection, ctx: ConnectionContext) { | ||
| this.logger().info(`Agent connected for agent ${this.getAgentId()}`, { connection, ctx }); | ||
| let previewUrl = ''; | ||
| try { | ||
| if (this.behavior.getTemplateDetails().renderMode === 'browser') { | ||
| previewUrl = this.behavior.getBrowserPreviewURL(); | ||
| } | ||
| } catch (error) { | ||
| this.logger().error('Error getting preview URL:', error); | ||
| } | ||
| sendToConnection(connection, WebSocketMessageResponses.AGENT_CONNECTED, { | ||
| state: this.state, | ||
| templateDetails: this.behavior.getTemplateDetails(), | ||
| previewUrl: previewUrl | ||
| }); | ||
| } | ||
|
|
||
| private initLogger(agentId: string, userId: string, sessionId?: string) { | ||
| this._logger = createObjectLogger(this, 'CodeGeneratorAgent'); | ||
| this._logger.setObjectId(agentId); | ||
| this._logger.setFields({ | ||
| agentId, | ||
| userId, | ||
| projectType: this.state.projectType, | ||
| behaviorType: this.state.behaviorType | ||
| }); | ||
| if (sessionId) { | ||
| this._logger.setField('sessionId', sessionId); | ||
| } | ||
| return this._logger; | ||
| } | ||
|
|
||
| // ========================================== | ||
| // Utilities | ||
| // ========================================== | ||
|
|
||
| logger(): StructuredLogger { | ||
| if (!this._logger) { | ||
| this._logger = this.initLogger(this.getAgentId(), this.state.inferenceContext.userId, this.state.sessionId); | ||
| } | ||
| return this._logger; | ||
| } | ||
|
|
||
| getAgentId() { | ||
| return this.state.inferenceContext.agentId; | ||
| } | ||
|
|
||
| getWebSockets(): WebSocket[] { | ||
| return this.ctx.getWebSockets(); | ||
| } | ||
|
|
||
| /** | ||
| * Get the project objective (defines what is being built) | ||
| */ | ||
| getObjective(): ProjectObjective<BaseProjectState> { | ||
| return this.objective; | ||
| } | ||
|
|
||
| /** | ||
| * Get the behavior (defines how code is generated) | ||
| */ | ||
| getBehavior(): BaseCodingBehavior<AgentState> { | ||
| return this.behavior; | ||
| } | ||
|
|
||
| async getFullState(): Promise<AgentState> { | ||
| return await this.behavior.getFullState(); | ||
| } | ||
|
|
||
| async getSummary(): Promise<AgentSummary> { | ||
| return this.behavior.getSummary(); | ||
| } | ||
|
|
||
| getPreviewUrlCache(): string { | ||
| return ''; // Unimplemented | ||
| } | ||
|
|
||
| deployToSandbox( | ||
| files: FileOutputType[] = [], | ||
| redeploy: boolean = false, | ||
| commitMessage?: string, | ||
| clearLogs: boolean = false | ||
| ): Promise<PreviewType | null> { | ||
| return this.behavior.deployToSandbox(files, redeploy, commitMessage, clearLogs); | ||
| } | ||
|
|
||
| deployToCloudflare(target?: DeploymentTarget): Promise<{ deploymentUrl?: string; workersUrl?: string } | null> { | ||
| return this.behavior.deployToCloudflare(target); | ||
| } | ||
|
|
||
| deployProject(options?: DeployOptions): Promise<DeployResult> { | ||
| return this.objective.deploy(options); | ||
| } | ||
|
|
||
| exportProject(options: ExportOptions): Promise<ExportResult> { | ||
| return this.objective.export(options); | ||
| } | ||
|
|
||
| importTemplate(templateName: string): Promise<{ templateName: string; filesImported: number }> { | ||
| return this.behavior.importTemplate(templateName); | ||
| } | ||
|
|
||
| protected async saveToDatabase() { | ||
| this.logger().info(`Saving agent ${this.getAgentId()} to database`); | ||
| // Save the app to database (authenticated users only) | ||
| const appService = new AppService(this.env); | ||
| await appService.createApp({ | ||
| id: this.state.inferenceContext.agentId, | ||
| userId: this.state.inferenceContext.userId, | ||
| sessionToken: null, | ||
| title: this.state.blueprint.title || this.state.query.substring(0, 100), | ||
| description: this.state.blueprint.description, | ||
| originalPrompt: this.state.query, | ||
| finalPrompt: this.state.query, | ||
| framework: this.state.blueprint.frameworks.join(','), | ||
| visibility: 'private', | ||
| status: 'generating', | ||
| createdAt: new Date(), | ||
| updatedAt: new Date() | ||
| }); | ||
| this.logger().info(`App saved successfully to database for agent ${this.state.inferenceContext.agentId}`, { | ||
| agentId: this.state.inferenceContext.agentId, | ||
| userId: this.state.inferenceContext.userId, | ||
| visibility: 'private' | ||
| }); | ||
| this.logger().info(`Agent initialized successfully for agent ${this.state.inferenceContext.agentId}`); | ||
| } | ||
|
|
||
| // ========================================== | ||
| // Conversation Management | ||
| // ========================================== | ||
|
|
||
| /* | ||
| * Each DO has 10 GB of SQLite storage. However, the Agents SDK stores the agent's 'state' object as a single row | ||
| * in the cf_agents_state table, and SQLite imposes a much smaller limit on row size. Thus, we keep only the current compacted | ||
| * conversation in the agent's core state and store the full conversation in a separate DO table. | ||
| */ | ||
| getConversationState(id: string = DEFAULT_CONVERSATION_SESSION_ID): ConversationState { | ||
| const rows = this.sql<{ messages: string, id: string }>`SELECT * FROM full_conversations WHERE id = ${id}`; | ||
| let fullHistory: ConversationMessage[] = []; | ||
| if (rows.length > 0 && rows[0].messages) { | ||
| try { | ||
| const parsed = JSON.parse(rows[0].messages); | ||
| if (Array.isArray(parsed)) { | ||
| fullHistory = parsed as ConversationMessage[]; | ||
| } | ||
| } catch (_e) { | ||
| this.logger().warn('Failed to parse full conversation history', _e); | ||
| } | ||
| } | ||
|
|
||
| // Load compact (running) history from sqlite with fallback to in-memory state for migration | ||
| const compactRows = this.sql<{ messages: string, id: string }>`SELECT * FROM compact_conversations WHERE id = ${id}`; | ||
| let runningHistory: ConversationMessage[] = []; | ||
| if (compactRows.length > 0 && compactRows[0].messages) { | ||
| try { | ||
| const parsed = JSON.parse(compactRows[0].messages); | ||
| if (Array.isArray(parsed)) { | ||
| runningHistory = parsed as ConversationMessage[]; | ||
| } | ||
| } catch (_e) { | ||
| this.logger().warn('Failed to parse compact conversation history', _e); | ||
| } | ||
| } | ||
| if (runningHistory.length === 0) { | ||
| runningHistory = fullHistory; | ||
| } | ||
|
|
||
| // Remove duplicates | ||
| const deduplicateMessages = (messages: ConversationMessage[]): ConversationMessage[] => { | ||
| const seen = new Set<string>(); | ||
| return messages.filter(msg => { | ||
| const key = `${msg.conversationId}-${msg.role}-${msg.tool_call_id || ''}`; | ||
| if (seen.has(key)) { | ||
| return false; | ||
| } | ||
| seen.add(key); | ||
| return true; | ||
| }); | ||
| }; | ||
|
|
||
| runningHistory = deduplicateMessages(runningHistory); | ||
| fullHistory = deduplicateMessages(fullHistory); | ||
|
|
||
| this.logger().info(`Loaded conversation state ${id}, full_length: ${fullHistory.length}, compact_length: ${runningHistory.length}`); | ||
|
|
||
| return { | ||
| id: id, | ||
| runningHistory, | ||
| fullHistory, | ||
| }; | ||
| } | ||
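The deduplication closure above keys each message on its conversation id, role, and optional tool call id. A minimal standalone sketch of that pattern (the message shape and function name here are illustrative assumptions, not the PR's exports):

```typescript
// Dedupe-by-composite-key sketch: keeps the first occurrence of each
// (conversationId, role, tool_call_id) combination, preserving order.
interface Msg {
  conversationId: string;
  role: string;
  tool_call_id?: string;
}

function deduplicateMessages(messages: Msg[]): Msg[] {
  const seen = new Set<string>();
  return messages.filter(msg => {
    // Same composite key as the agent uses above
    const key = `${msg.conversationId}-${msg.role}-${msg.tool_call_id || ''}`;
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}
```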
|
|
||
| setConversationState(conversations: ConversationState) { | ||
| const serializedFull = JSON.stringify(conversations.fullHistory); | ||
| const serializedCompact = JSON.stringify(conversations.runningHistory); | ||
| try { | ||
| this.logger().info(`Saving conversation state ${conversations.id}, full_length: ${serializedFull.length}, compact_length: ${serializedCompact.length}`); | ||
| this.sql`INSERT OR REPLACE INTO compact_conversations (id, messages) VALUES (${conversations.id}, ${serializedCompact})`; | ||
| this.sql`INSERT OR REPLACE INTO full_conversations (id, messages) VALUES (${conversations.id}, ${serializedFull})`; | ||
| } catch (error) { | ||
| this.logger().error(`Failed to save conversation state ${conversations.id}`, error); | ||
| } | ||
| } | ||
|
|
||
| addConversationMessage(message: ConversationMessage) { | ||
| const conversationState = this.getConversationState(); | ||
| if (!conversationState.runningHistory.find(msg => msg.conversationId === message.conversationId)) { | ||
| this.logger().info('Adding conversation message', { | ||
| message, | ||
| conversationId: message.conversationId, | ||
| runningHistoryLength: conversationState.runningHistory.length, | ||
| fullHistoryLength: conversationState.fullHistory.length | ||
| }); | ||
| conversationState.runningHistory.push(message); | ||
| } else { | ||
| conversationState.runningHistory = conversationState.runningHistory.map(msg => { | ||
| if (msg.conversationId === message.conversationId) { | ||
| return message; | ||
| } | ||
| return msg; | ||
| }); | ||
| } | ||
| if (!conversationState.fullHistory.find(msg => msg.conversationId === message.conversationId)) { | ||
| conversationState.fullHistory.push(message); | ||
| } else { | ||
| conversationState.fullHistory = conversationState.fullHistory.map(msg => { | ||
| if (msg.conversationId === message.conversationId) { | ||
| return message; | ||
| } | ||
| return msg; | ||
| }); | ||
| } | ||
| this.setConversationState(conversationState); | ||
| } | ||
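`addConversationMessage` applies the same upsert-by-`conversationId` rule to both the running and full histories: replace in place when the id already exists, append otherwise. The pattern in isolation, as a sketch with an assumed minimal message type:

```typescript
// Upsert sketch: replace an existing message with the same conversationId,
// otherwise append. Types and names are illustrative, not the PR's API.
interface Message {
  conversationId: string;
  content: string;
}

function upsertMessage(history: Message[], message: Message): Message[] {
  const exists = history.some(m => m.conversationId === message.conversationId);
  return exists
    ? history.map(m => (m.conversationId === message.conversationId ? message : m))
    : [...history, message];
}
```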
|
|
||
| /** | ||
| * Clear conversation history | ||
| */ | ||
| public clearConversation(): void { | ||
| try { | ||
| this.logger().info('Clearing conversation history'); | ||
|
|
||
| // Clear SQL tables for default conversation session | ||
| this.sql`DELETE FROM full_conversations WHERE id = ${DEFAULT_CONVERSATION_SESSION_ID}`; | ||
| this.sql`DELETE FROM compact_conversations WHERE id = ${DEFAULT_CONVERSATION_SESSION_ID}`; | ||
|
|
||
| this.logger().info('Conversation history cleared successfully'); | ||
|
|
||
| this.broadcast(WebSocketMessageResponses.CONVERSATION_CLEARED, { | ||
| message: 'Conversation history cleared', | ||
| }); | ||
| } catch (error) { | ||
| this.logger().error('Error clearing conversation history:', error); | ||
| this.broadcastError('Failed to clear conversation history', error); | ||
| } | ||
| } | ||
|
|
||
| /** | ||
| * Handle user input during conversational code generation | ||
| * Processes user messages and updates pendingUserInputs state | ||
| */ | ||
| async handleUserInput(userMessage: string, images?: ImageAttachment[]): Promise<void> { | ||
| try { | ||
| this.logger().info('Processing user input message', { | ||
| messageLength: userMessage.length, | ||
| pendingInputsCount: this.state.pendingUserInputs.length, | ||
| hasImages: !!images && images.length > 0, | ||
| imageCount: images?.length || 0 | ||
| }); | ||
|
|
||
| await this.behavior.handleUserInput(userMessage, images); | ||
| if (!this.behavior.isCodeGenerating()) { | ||
| // If idle, start generation process | ||
| this.logger().info('User input during IDLE state, starting generation'); | ||
| this.behavior.generateAllFiles().catch(error => { | ||
| this.logger().error('Error starting generation from user input:', error); | ||
| }); | ||
| } | ||
|
|
||
| } catch (error) { | ||
| if (error instanceof RateLimitExceededError) { | ||
| this.logger().error('Rate limit exceeded:', error); | ||
| this.broadcast(WebSocketMessageResponses.RATE_LIMIT_ERROR, { | ||
| error | ||
| }); | ||
| return; | ||
| } | ||
| this.broadcastError('Error processing user input', error); | ||
| } | ||
| } | ||
| // ========================================== | ||
| // WebSocket Management | ||
| // ========================================== | ||
|
|
||
| /** | ||
| * Handle WebSocket message - Agent owns WebSocket lifecycle | ||
| * Delegates to centralized handler which can access both behavior and objective | ||
| */ | ||
| async onMessage(connection: Connection, message: string): Promise<void> { | ||
| handleWebSocketMessage(this, connection, message); | ||
| } | ||
|
|
||
| /** | ||
| * Handle WebSocket close - Agent owns WebSocket lifecycle | ||
| */ | ||
| async onClose(connection: Connection): Promise<void> { | ||
| handleWebSocketClose(connection); | ||
| } | ||
|
|
||
| /** | ||
| * Broadcast message to all connected WebSocket clients | ||
| * Type-safe version using proper WebSocket message types | ||
| */ | ||
| public broadcast<T extends WebSocketMessageType>( | ||
| type: T, | ||
| data?: WebSocketMessageData<T> | ||
| ): void { | ||
| broadcastToConnections(this, type, data || {} as WebSocketMessageData<T>); | ||
| } | ||
|
|
||
| protected broadcastError(context: string, error: unknown): void { | ||
| const errorMessage = error instanceof Error ? error.message : String(error); | ||
| this.logger().error(`${context}:`, error); | ||
| this.broadcast(WebSocketMessageResponses.ERROR, { | ||
| error: `${context}: ${errorMessage}` | ||
| }); | ||
| } | ||
| // ========================================== | ||
| // Git Management | ||
| // ========================================== | ||
|
|
||
| protected async gitInit() { | ||
| try { | ||
| await this.git.init(); | ||
| this.logger().info("Git initialized successfully"); | ||
| // Check if there is any commit | ||
| const head = await this.git.getHead(); | ||
|
|
||
| if (!head) { | ||
| this.logger().info("No commits found, creating initial commit"); | ||
| // get all generated files and commit them | ||
| const generatedFiles = this.fileManager.getGeneratedFiles(); | ||
| if (generatedFiles.length === 0) { | ||
| this.logger().info("No generated files found, skipping initial commit"); | ||
| return; | ||
| } | ||
| await this.git.commit(generatedFiles, "Initial commit"); | ||
| this.logger().info("Initial commit created successfully"); | ||
| } | ||
| } catch (error) { | ||
| this.logger().error("Error during git init:", error); | ||
| } | ||
| } | ||
|
|
||
| /** | ||
| * Export git objects | ||
| * The route handler will build the repo with template rebasing | ||
| */ | ||
| async exportGitObjects(): Promise<{ | ||
| gitObjects: Array<{ path: string; data: Uint8Array }>; | ||
| query: string; | ||
| hasCommits: boolean; | ||
| templateDetails: TemplateDetails | null; | ||
| }> { | ||
| try { | ||
| // Ensure the repo is initialized (and an initial commit exists) before exporting | ||
| await this.gitInit(); | ||

| // Export git objects efficiently (minimal DO memory usage) | ||
| const gitObjects = this.git.fs.exportGitObjects(); | ||
|
|
||
| // Ensure template details are available | ||
| await this.behavior.ensureTemplateDetails(); | ||
|
|
||
| const templateDetails = this.behavior.getTemplateDetails(); | ||
|
|
||
| return { | ||
| gitObjects, | ||
| query: this.state.query || 'N/A', | ||
| hasCommits: gitObjects.length > 0, | ||
| templateDetails | ||
| }; | ||
| } catch (error) { | ||
| this.logger().error('exportGitObjects failed', error); | ||
| throw error; | ||
| } | ||
| } | ||
|
|
||
| /** | ||
| * Handle browser file serving requests | ||
| */ | ||
| async handleBrowserFileServing(request: Request): Promise<Response> { | ||
| const url = new URL(request.url); | ||
|
|
||
| this.logger().info('[BROWSER SERVING] Request received', { | ||
| hostname: url.hostname, | ||
| pathname: url.pathname, | ||
| method: request.method | ||
| }); | ||
|
|
||
| // Handle CORS preflight | ||
| if (request.method === 'OPTIONS') { | ||
| return new Response(null, { | ||
| status: 204, | ||
| headers: { | ||
| 'Access-Control-Allow-Origin': '*', | ||
| 'Access-Control-Allow-Methods': 'GET, OPTIONS', | ||
| 'Access-Control-Allow-Headers': '*', | ||
| 'Access-Control-Max-Age': '86400' | ||
| } | ||
| }); | ||
| } | ||
|
|
||
| // Extract token from hostname | ||
| // Pattern: b-{agentid}-{token}.{previewDomain}/{filepath} | ||
| // Token is always 16 characters after the LAST hyphen (after removing 'b-' prefix) | ||
| const subdomain = url.hostname.split('.')[0]; | ||
|
|
||
| if (!subdomain.startsWith('b-')) { | ||
| this.logger().warn('[BROWSER SERVING] Invalid hostname pattern - missing b- prefix', { hostname: url.hostname }); | ||
| return new Response('Invalid request', { | ||
| status: 400, | ||
| headers: { 'Content-Type': 'text/plain' } | ||
| }); | ||
| } | ||
|
|
||
| const withoutPrefix = subdomain.substring(2); // Remove 'b-' | ||
| const lastHyphenIndex = withoutPrefix.lastIndexOf('-'); | ||
|
|
||
| if (lastHyphenIndex === -1) { | ||
| this.logger().warn('[BROWSER SERVING] Invalid hostname pattern - no hyphen after prefix', { hostname: url.hostname }); | ||
| return new Response('Invalid request', { | ||
| status: 400, | ||
| headers: { 'Content-Type': 'text/plain' } | ||
| }); | ||
| } | ||
|
|
||
| const providedToken = withoutPrefix.substring(lastHyphenIndex + 1); | ||
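The hostname parsing above can be read as one small pure function: strip the `b-` prefix, then split on the last hyphen, since agent ids may themselves contain hyphens while the token never does. A sketch under that `b-{agentid}-{token}` assumption (the function name is hypothetical):

```typescript
// Parse a browser-serving subdomain of the form b-{agentid}-{token}.
// Returns null when the pattern does not match. Name is illustrative.
function parseBrowserSubdomain(
  subdomain: string
): { agentId: string; token: string } | null {
  if (!subdomain.startsWith('b-')) return null;
  const withoutPrefix = subdomain.substring(2);
  // The token is everything after the LAST hyphen, because agent ids
  // can contain hyphens of their own.
  const lastHyphen = withoutPrefix.lastIndexOf('-');
  if (lastHyphen === -1) return null;
  return {
    agentId: withoutPrefix.substring(0, lastHyphen),
    token: withoutPrefix.substring(lastHyphen + 1),
  };
}
```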
|
|
||
| // Extract file path from pathname | ||
| let filePath = url.pathname === '/' || url.pathname === '' | ||
| ? 'public/index.html' | ||
| : url.pathname.replace(/^\//, ''); // Remove leading slash | ||
|
|
||
| this.logger().info('[BROWSER SERVING] Extracted', { providedToken, filePath }); | ||
|
|
||
| // Validate token | ||
| const storedToken = this.state.fileServingToken?.token; | ||
| if (!storedToken || providedToken !== storedToken.toLowerCase()) { | ||
| this.logger().warn('[BROWSER SERVING] Token mismatch', { providedToken, storedToken }); | ||
| return new Response('Unauthorized', { | ||
| status: 403, | ||
| headers: { 'Content-Type': 'text/plain' } | ||
| }); | ||
| } | ||
|
|
||
| if (!isPathSafe(filePath)) { | ||
| return new Response('Invalid path', { | ||
| status: 400, | ||
| headers: { 'Content-Type': 'text/plain' } | ||
| }); | ||
| } | ||
| const normalized = normalizePath(filePath); | ||
| let file = this.fileManager.getFile(normalized); | ||
|
|
||
| // Try with public/ prefix if not found | ||
| if (!file && !normalized.startsWith('public/')) { | ||
| file = this.fileManager.getFile(`public/${normalized}`); | ||
| } | ||
|
|
||
| if (!file) { | ||
| this.logger().warn('[BROWSER SERVING] File not found', { normalized }); | ||
| return new Response('File not found', { | ||
| status: 404, | ||
| headers: { 'Content-Type': 'text/plain' } | ||
| }); | ||
| } | ||
|
|
||
| // Serve file with correct Content-Type | ||
| const contentType = getMimeType(normalized) || 'application/octet-stream'; | ||
|
|
||
| this.logger().info('[BROWSER SERVING] Serving file', { | ||
| path: normalized, | ||
| contentType | ||
| }); | ||
|
|
||
| let content = file.fileContents; | ||
|
|
||
| // For HTML files, inject base tag | ||
| if (normalized.endsWith('.html') || contentType.includes('text/html')) { | ||
| const baseTag = `<base href="/">`; | ||
|
|
||
| // Inject base tag after <head> tag if present (case-insensitive, matching the replace below) | ||
| if (/<head>/i.test(content)) { | ||
| content = content.replace(/<head>/i, `<head>\n ${baseTag}`); | ||
| } else { | ||
| // Fallback: inject at the beginning | ||
| content = baseTag + '\n' + content; | ||
| } | ||
|
|
||
| this.logger().info('[BROWSER SERVING] Injected base tag'); | ||
| } | ||
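The HTML rewrite above is a simple string transform; isolated as a sketch (the helper name is assumed, not part of the PR):

```typescript
// Inject a <base href="/"> tag so relative asset URLs resolve from the
// site root. Falls back to prepending when no <head> tag is present.
function injectBaseTag(html: string): string {
  const baseTag = `<base href="/">`;
  // Case-insensitive check so <HEAD> is handled consistently with the replace
  if (/<head>/i.test(html)) {
    return html.replace(/<head>/i, `<head>\n  ${baseTag}`);
  }
  // Fallback: prepend when there is no <head>
  return baseTag + '\n' + html;
}
```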
|
|
||
| return new Response(content, { | ||
| status: 200, | ||
| headers: { | ||
| 'Content-Type': contentType, | ||
| 'Cache-Control': 'no-cache, no-store, must-revalidate', | ||
| 'Access-Control-Allow-Origin': '*', | ||
| 'Access-Control-Allow-Methods': 'GET, OPTIONS', | ||
| 'Access-Control-Allow-Headers': '*', | ||
| 'X-Sandbox-Type': 'browser-native' | ||
| } | ||
| }); | ||
| } | ||
|
|
||
| /** | ||
| * Cache GitHub OAuth token in memory for subsequent exports | ||
| * Token is ephemeral - lost on DO eviction | ||
| */ | ||
| setGitHubToken(token: string, username: string, ttl: number = 3600000): void { | ||
| this.objective.setGitHubToken(token, username, ttl); | ||
| } | ||
|
|
||
| /** | ||
| * Get cached GitHub token if available and not expired | ||
| */ | ||
| getGitHubToken(): { token: string; username: string } | null { | ||
| return this.objective.getGitHubToken(); | ||
| } | ||
|
|
||
| /** | ||
| * Clear cached GitHub token | ||
| */ | ||
| clearGitHubToken(): void { | ||
| this.objective.clearGitHubToken(); | ||
| } | ||
| } |
🟡 MEDIUM - File exceeds 300 lines (740 lines)
Category: quality
Description:
Agent file is too large with mixed concerns.
Suggestion:
Extract WebSocket handling, conversation management, and git operations into separate modules
Confidence: 90%
Rule: ts_large_file
| import { | ||
| PhaseConceptGenerationSchemaType, | ||
| PhaseConceptType, | ||
| FileOutputType, | ||
| PhaseImplementationSchemaType, | ||
| } from '../../schemas'; | ||
| import { StaticAnalysisResponse } from '../../../services/sandbox/sandboxTypes'; | ||
| import { CurrentDevState, MAX_PHASES, PhasicState } from '../state'; | ||
| import { AllIssues, AgentInitArgs, PhaseExecutionResult, UserContext } from '../types'; | ||
| import { WebSocketMessageResponses } from '../../constants'; | ||
| import { UserConversationProcessor } from '../../operations/UserConversationProcessor'; | ||
| import { GenerationContext, PhasicGenerationContext } from '../../domain/values/GenerationContext'; | ||
| import { IssueReport } from '../../domain/values/IssueReport'; | ||
| import { PhaseImplementationOperation } from '../../operations/PhaseImplementation'; | ||
| import { FileRegenerationOperation } from '../../operations/FileRegeneration'; | ||
| import { PhaseGenerationOperation } from '../../operations/PhaseGeneration'; | ||
| import { FastCodeFixerOperation } from '../../operations/PostPhaseCodeFixer'; | ||
| import { customizePackageJson, customizeTemplateFiles, generateProjectName } from '../../utils/templateCustomizer'; | ||
| import { generateBlueprint } from '../../planning/blueprint'; | ||
| import { RateLimitExceededError } from 'shared/types/errors'; | ||
| import { ImageAttachment, type ProcessedImageAttachment } from '../../../types/image-attachment'; | ||
| import { OperationOptions } from '../../operations/common'; | ||
| import { ConversationMessage } from '../../inferutils/common'; | ||
| import { generateNanoId } from 'worker/utils/idGenerator'; | ||
| import { IdGenerator } from '../../utils/idGenerator'; | ||
| import { BaseCodingBehavior, BaseCodingOperations } from './base'; | ||
| import { ICodingAgent } from '../../services/interfaces/ICodingAgent'; | ||
| import { SimpleCodeGenerationOperation } from '../../operations/SimpleCodeGeneration'; | ||
| import { StateMigration } from '../stateMigration'; | ||
|
|
||
| interface PhasicOperations extends BaseCodingOperations { | ||
| generateNextPhase: PhaseGenerationOperation; | ||
| implementPhase: PhaseImplementationOperation; | ||
| } | ||
|
|
||
| /** | ||
| * PhasicCodingBehavior - Deterministically orchestrated agent | ||
| * | ||
| * Manages the lifecycle of code generation including: | ||
| * - Orchestration of blueprint creation, phase generation, phase implementation, and review cycles | ||
| * - File streaming with WebSocket updates | ||
| * - Code validation and error correction | ||
| * - Deployment to sandbox service | ||
| */ | ||
| export class PhasicCodingBehavior extends BaseCodingBehavior<PhasicState> implements ICodingAgent { | ||
| protected static readonly PROJECT_NAME_PREFIX_MAX_LENGTH = 20; | ||
|
|
||
| protected operations: PhasicOperations = { | ||
| regenerateFile: new FileRegenerationOperation(), | ||
| fastCodeFixer: new FastCodeFixerOperation(), | ||
| processUserMessage: new UserConversationProcessor(), | ||
| simpleGenerateFiles: new SimpleCodeGenerationOperation(), | ||
| generateNextPhase: new PhaseGenerationOperation(), | ||
| implementPhase: new PhaseImplementationOperation(), | ||
| }; | ||
|
|
||
| /** | ||
| * Initialize the code generator with project blueprint and template | ||
| * Sets up services and begins deployment process | ||
| */ | ||
| async initialize( | ||
| initArgs: AgentInitArgs<PhasicState>, | ||
| ..._args: unknown[] | ||
| ): Promise<PhasicState> { | ||
| await super.initialize(initArgs); | ||
| const { templateInfo } = initArgs; | ||
| if (!templateInfo || !templateInfo.templateDetails) { | ||
| throw new Error('Phasic initialization requires templateInfo.templateDetails'); | ||
| } | ||
| const { query, language, frameworks, hostname, inferenceContext, sandboxSessionId } = initArgs; | ||
|
|
||
| // Generate a blueprint | ||
| this.logger.info('Generating blueprint', { query, queryLength: query.length, imagesCount: initArgs.images?.length || 0 }); | ||
| this.logger.info(`Using language: ${language}, frameworks: ${frameworks ? frameworks.join(", ") : "none"}`); | ||
|
|
||
| const blueprint = await generateBlueprint({ | ||
| env: this.env, | ||
| inferenceContext, | ||
| query, | ||
| language: language!, | ||
| frameworks: frameworks!, | ||
| templateDetails: templateInfo?.templateDetails, | ||
| templateMetaInfo: templateInfo?.selection, | ||
| images: initArgs.images, | ||
| projectType: this.projectType, | ||
| stream: { | ||
| chunk_size: 256, | ||
| onChunk: (chunk) => { | ||
| initArgs.onBlueprintChunk(chunk); | ||
| } | ||
| } | ||
| }) | ||
|
|
||
| const packageJson = templateInfo.templateDetails.allFiles['package.json']; | ||
|
|
||
| const projectName = generateProjectName( | ||
| blueprint?.projectName || templateInfo?.templateDetails.name || '', | ||
| generateNanoId(), | ||
| PhasicCodingBehavior.PROJECT_NAME_PREFIX_MAX_LENGTH | ||
| ); | ||
|
|
||
| this.logger.info('Generated project name', { projectName }); | ||
|
|
||
| const nextState: PhasicState = { | ||
| ...this.state, | ||
| projectName, | ||
| query, | ||
| blueprint, | ||
| templateName: templateInfo.templateDetails.name, | ||
| sandboxInstanceId: undefined, | ||
| generatedPhases: [], | ||
| commandsHistory: [], | ||
| lastPackageJson: packageJson, | ||
| sessionId: sandboxSessionId!, | ||
| hostname, | ||
| inferenceContext, | ||
| projectType: this.projectType, | ||
| behaviorType: 'phasic' | ||
| }; | ||
| this.setState(nextState); | ||
| // Customize template files (package.json, wrangler.jsonc, .bootstrap.js, .gitignore) | ||
| const customizedFiles = customizeTemplateFiles( | ||
| templateInfo.templateDetails.allFiles, | ||
| { | ||
| projectName, | ||
| commandsHistory: [] | ||
| } | ||
| ); | ||
|
|
||
| this.logger.info('Customized template files', { | ||
| files: Object.keys(customizedFiles) | ||
| }); | ||
|
|
||
| // Save customized files to git | ||
| const filesToSave = Object.entries(customizedFiles).map(([filePath, content]) => ({ | ||
| filePath, | ||
| fileContents: content, | ||
| filePurpose: 'Project configuration file' | ||
| })); | ||
|
|
||
| await this.fileManager.saveGeneratedFiles( | ||
| filesToSave, | ||
| 'Initialize project configuration files', | ||
| true | ||
| ); | ||
|
|
||
| this.logger.info('Committed customized template files to git'); | ||
|
|
||
| this.initializeAsync().catch((error: unknown) => { | ||
| this.broadcastError("Initialization failed", error); | ||
| }); | ||
| this.logger.info(`Agent ${this.getAgentId()} session: ${this.state.sessionId} initialized successfully`); | ||
| return this.state; | ||
| } | ||
|
|
||
| async onStart(props?: Record<string, unknown> | undefined): Promise<void> { | ||
| await super.onStart(props); | ||
| } | ||
|
|
||
| migrateStateIfNeeded(): void { | ||
| const migratedState = StateMigration.migrateIfNeeded(this.state, this.logger) as PhasicState | null; | ||
| if (migratedState) { | ||
| this.setState(migratedState); | ||
| } | ||
|
|
||
| // migrate overwritten package.jsons | ||
| const oldPackageJson = this.fileManager.getFile('package.json')?.fileContents || this.state.lastPackageJson; | ||
| if (oldPackageJson) { | ||
| const packageJson = customizePackageJson(oldPackageJson, this.state.projectName); | ||
| this.fileManager.saveGeneratedFiles([ | ||
| { | ||
| filePath: 'package.json', | ||
| fileContents: packageJson, | ||
| filePurpose: 'Project configuration file' | ||
| } | ||
| ], 'chore: fix overwritten package.json', true); | ||
| } | ||
| } | ||
|
|
||
| rechargePhasesCounter(max_phases: number = MAX_PHASES): void { | ||
| if (this.getPhasesCounter() <= max_phases) { | ||
| this.setState({ | ||
| ...this.state, | ||
| phasesCounter: max_phases | ||
| }); | ||
| } | ||
| } | ||
|
|
||
| decrementPhasesCounter(): number { | ||
| const counter = this.getPhasesCounter() - 1; | ||
| this.setState({ | ||
| ...this.state, | ||
| phasesCounter: counter | ||
| }); | ||
| return counter; | ||
| } | ||
|
|
||
| getPhasesCounter(): number { | ||
| return this.state.phasesCounter; | ||
| } | ||
|
|
||
| getOperationOptions(): OperationOptions<PhasicGenerationContext> { | ||
| const context = GenerationContext.from(this.state, this.getTemplateDetails(), this.logger); | ||
| if (!GenerationContext.isPhasic(context)) { | ||
| throw new Error('Expected PhasicGenerationContext'); | ||
| } | ||
| return { | ||
| env: this.env, | ||
| agentId: this.getAgentId(), | ||
| context, | ||
| logger: this.logger, | ||
| inferenceContext: this.getInferenceContext(), | ||
| agent: this | ||
| }; | ||
| } | ||
|
|
||
| private createNewIncompletePhase(phaseConcept: PhaseConceptType) { | ||
| this.setState({ | ||
| ...this.state, | ||
| generatedPhases: [...this.state.generatedPhases, { | ||
| ...phaseConcept, | ||
| completed: false | ||
| }] | ||
| }) | ||
|
|
||
| this.logger.info("Created new incomplete phase:", JSON.stringify(this.state.generatedPhases, null, 2)); | ||
| } | ||
|
|
||
| private markPhaseComplete(phaseName: string) { | ||
| // First find the phase | ||
| const phases = this.state.generatedPhases; | ||
| if (!phases.some(p => p.name === phaseName)) { | ||
| this.logger.warn(`Phase ${phaseName} not found in generatedPhases array, skipping save`); | ||
| return; | ||
| } | ||
|
|
||
| // Update the phase | ||
| this.setState({ | ||
| ...this.state, | ||
| generatedPhases: phases.map(p => p.name === phaseName ? { ...p, completed: true } : p) | ||
| }); | ||
|
|
||
| this.logger.info("Completed phases:", JSON.stringify(phases, null, 2)); | ||
| } | ||
|
|
||
| async queueUserRequest(request: string, images?: ProcessedImageAttachment[]): Promise<void> { | ||
| this.rechargePhasesCounter(3); | ||
| await super.queueUserRequest(request, images); | ||
| } | ||
|
|
||
| async build(): Promise<void> { | ||
| await this.launchStateMachine(); | ||
| } | ||
|
|
||
| private async launchStateMachine() { | ||
| this.logger.info("Launching state machine"); | ||
|
|
||
| let currentDevState = CurrentDevState.PHASE_IMPLEMENTING; | ||
| const generatedPhases = this.state.generatedPhases; | ||
| const incompletePhases = generatedPhases.filter(phase => !phase.completed); | ||
| let phaseConcept: PhaseConceptType | undefined; | ||
| if (incompletePhases.length > 0) { | ||
| phaseConcept = incompletePhases[incompletePhases.length - 1]; | ||
| this.logger.info('Resuming code generation from incomplete phase', { | ||
| phase: phaseConcept | ||
| }); | ||
| } else if (generatedPhases.length > 0) { | ||
| currentDevState = CurrentDevState.PHASE_GENERATING; | ||
| this.logger.info('Resuming code generation after generating all phases', { | ||
| phase: generatedPhases[generatedPhases.length - 1] | ||
| }); | ||
| } else { | ||
| phaseConcept = this.state.blueprint.initialPhase; | ||
| this.logger.info('Starting code generation from initial phase', { | ||
| phase: phaseConcept | ||
| }); | ||
| this.createNewIncompletePhase(phaseConcept); | ||
| } | ||
|
|
||
| let userContext: UserContext | undefined; | ||
|
|
||
| try { | ||
| let executionResults: PhaseExecutionResult; | ||
| // State machine loop - continues until IDLE state | ||
| while (currentDevState !== CurrentDevState.IDLE) { | ||
| this.logger.info(`[generateAllFiles] Executing state: ${currentDevState}`); | ||
| switch (currentDevState) { | ||
| case CurrentDevState.PHASE_GENERATING: | ||
| executionResults = await this.executePhaseGeneration(); | ||
| currentDevState = executionResults.currentDevState; | ||
| phaseConcept = executionResults.result; | ||
| userContext = executionResults.userContext; | ||
| break; | ||
| case CurrentDevState.PHASE_IMPLEMENTING: | ||
| executionResults = await this.executePhaseImplementation(phaseConcept, userContext); | ||
| currentDevState = executionResults.currentDevState; | ||
| userContext = undefined; | ||
| break; | ||
| case CurrentDevState.REVIEWING: | ||
| currentDevState = await this.executeReviewCycle(); | ||
| break; | ||
| case CurrentDevState.FINALIZING: | ||
| currentDevState = await this.executeFinalizing(); | ||
| break; | ||
| default: | ||
| break; | ||
| } | ||
| } | ||
|
|
||
| this.logger.info("State machine completed successfully"); | ||
| } catch (error) { | ||
| this.logger.error("Error in state machine:", error); | ||
| } | ||
| } | ||
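The loop above is a classic explicit state machine: each handler returns the next state, and the loop runs until the terminal IDLE state. A minimal standalone sketch of the pattern (states and handler names are illustrative, mirroring the PHASE_GENERATING → PHASE_IMPLEMENTING → FINALIZING → IDLE flow in spirit only):

```typescript
// Explicit state-machine sketch: handlers return the next state; the loop
// records each visited state and stops at the terminal 'IDLE' state.
type DevState = 'GENERATING' | 'IMPLEMENTING' | 'FINALIZING' | 'IDLE';

function runStateMachine(
  handlers: Record<Exclude<DevState, 'IDLE'>, () => DevState>,
  initial: DevState = 'GENERATING'
): DevState[] {
  const visited: DevState[] = [];
  let state = initial;
  while (state !== 'IDLE') {
    visited.push(state);
    // TypeScript narrows `state` to the non-IDLE union inside the loop
    state = handlers[state]();
  }
  return visited;
}
```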
|
|
||
| /** | ||
| * Execute phase generation state - generate next phase with user suggestions | ||
| */ | ||
| async executePhaseGeneration(isFinal?: boolean): Promise<PhaseExecutionResult> { | ||
| this.logger.info("Executing PHASE_GENERATING state"); | ||
| try { | ||
| const currentIssues = await this.fetchAllIssues(); | ||
|
|
||
| // Generate next phase with user suggestions if available | ||
|
|
||
| // Get stored images if user suggestions are present | ||
| const pendingUserInputs = this.fetchPendingUserRequests(); | ||
| const userContext = (pendingUserInputs.length > 0) | ||
| ? { | ||
| suggestions: pendingUserInputs, | ||
| images: this.pendingUserImages | ||
| } as UserContext | ||
| : undefined; | ||
|
|
||
| if (userContext?.suggestions && userContext.suggestions.length > 0) { | ||
| // Only reset pending user inputs if user suggestions were read | ||
| this.logger.info("Resetting pending user inputs", { | ||
| userSuggestions: userContext.suggestions, | ||
| hasImages: !!userContext.images, | ||
| imageCount: userContext.images?.length || 0 | ||
| }); | ||
|
|
||
| // Clear images after they're passed to phase generation | ||
| if (userContext.images && userContext.images.length > 0) { | ||
| this.logger.info('Clearing stored user images after passing to phase generation'); | ||
| this.pendingUserImages = []; | ||
| } | ||
| } | ||
|
|
||
| const nextPhase = await this.generateNextPhase(currentIssues, userContext, isFinal); | ||
|
|
||
| if (!nextPhase) { | ||
| this.logger.info("No more phases to implement, transitioning to FINALIZING"); | ||
| return { | ||
| currentDevState: CurrentDevState.FINALIZING, | ||
| }; | ||
| } | ||
|
|
||
| // Store current phase and transition to implementation | ||
| this.setState({ | ||
| ...this.state, | ||
| currentPhase: nextPhase | ||
| }); | ||
|
|
||
| return { | ||
| currentDevState: CurrentDevState.PHASE_IMPLEMENTING, | ||
| result: nextPhase, | ||
| userContext: userContext, | ||
| }; | ||
| } catch (error) { | ||
| if (error instanceof RateLimitExceededError) { | ||
| throw error; | ||
| } | ||
| this.broadcastError("Error generating phase", error); | ||
| return { | ||
| currentDevState: CurrentDevState.IDLE, | ||
| }; | ||
| } | ||
| } | ||
|
|
||
| /** | ||
| * Execute phase implementation state - implement current phase | ||
| */ | ||
| async executePhaseImplementation(phaseConcept?: PhaseConceptType, userContext?: UserContext): Promise<{currentDevState: CurrentDevState, staticAnalysis?: StaticAnalysisResponse}> { | ||
| try { | ||
| this.logger.info("Executing PHASE_IMPLEMENTING state"); | ||
|
|
||
| if (phaseConcept === undefined) { | ||
| phaseConcept = this.state.currentPhase; | ||
| if (phaseConcept === undefined) { | ||
| this.logger.error("No phase concept provided to implement; falling back to phase generation"); | ||
| const results = await this.executePhaseGeneration(); | ||
| phaseConcept = results.result; | ||
| if (phaseConcept === undefined) { | ||
| this.logger.error("Phase generation produced no phase to implement; transitioning to FINALIZING"); | ||
| return {currentDevState: CurrentDevState.FINALIZING}; | ||
| } | ||
| } | ||
| } | ||
|
|
||
| this.setState({ | ||
| ...this.state, | ||
| currentPhase: undefined // reset current phase | ||
| }); | ||
|
|
||
| // Prepare issues for implementation | ||
| const currentIssues = await this.fetchAllIssues(true); | ||
|
|
||
| // Implement the phase with user context (suggestions and images) | ||
| await this.implementPhase(phaseConcept, currentIssues, userContext); | ||
|
|
||
| this.logger.info(`Phase ${phaseConcept.name} completed, generating next phase`); | ||
|
|
||
| const phasesCounter = this.decrementPhasesCounter(); | ||
|
|
||
| if ((phaseConcept.lastPhase || phasesCounter <= 0) && this.state.pendingUserInputs.length === 0) return {currentDevState: CurrentDevState.FINALIZING}; | ||
| return {currentDevState: CurrentDevState.PHASE_GENERATING}; | ||
| } catch (error) { | ||
| this.logger.error("Error implementing phase", error); | ||
| if (error instanceof RateLimitExceededError) { | ||
| throw error; | ||
| } | ||
| return {currentDevState: CurrentDevState.IDLE}; | ||
| } | ||
| } | ||
|
|
||
| /** | ||
| * Execute review cycle state - review and cleanup | ||
| */ | ||
| async executeReviewCycle(): Promise<CurrentDevState> { | ||
| this.logger.info("Executing REVIEWING state - review and cleanup"); | ||
| if (this.state.reviewingInitiated) { | ||
| this.logger.info("Reviewing already initiated, skipping"); | ||
| return CurrentDevState.IDLE; | ||
| } | ||
| this.setState({ | ||
| ...this.state, | ||
| reviewingInitiated: true | ||
| }); | ||
|
|
||
| // If issues/errors found, prompt user if they want to review and cleanup | ||
| const issues = await this.fetchAllIssues(false); | ||
| if (issues.runtimeErrors.length > 0 || issues.staticAnalysis.typecheck.issues.length > 0) { | ||
| this.logger.info("Reviewing stage - issues found, prompting user to review and cleanup"); | ||
| const message : ConversationMessage = { | ||
| role: "assistant", | ||
| content: `<system_context>If the user responds with yes, launch the 'deep_debug' tool with the prompt to fix all the issues in the app</system_context>\nThere might be some bugs in the app. Do you want me to try to fix them?`, | ||
| conversationId: IdGenerator.generateConversationId(), | ||
| } | ||
| // Store the message in the conversation history so user's response can trigger the deep debug tool | ||
| this.infrastructure.addConversationMessage(message); | ||
|
|
||
| this.broadcast(WebSocketMessageResponses.CONVERSATION_RESPONSE, { | ||
| message: message.content, | ||
| conversationId: message.conversationId, | ||
| isStreaming: false, | ||
| }); | ||
| } | ||
|
|
||
| return CurrentDevState.IDLE; | ||
| } | ||
|
|
||
| /** | ||
| * Execute finalizing state - final review and cleanup (runs only once) | ||
| */ | ||
| async executeFinalizing(): Promise<CurrentDevState> { | ||
| this.logger.info("Executing FINALIZING state - final review and cleanup"); | ||
|
|
||
| if (this.setMVPGenerated()) { | ||
| this.logger.info("Finalizing stage already done"); | ||
| return CurrentDevState.REVIEWING; | ||
| } | ||
|
|
||
| const { result: phaseConcept, userContext } = await this.executePhaseGeneration(true); | ||
| if (!phaseConcept) { | ||
| this.logger.warn("Phase concept not generated, skipping final review"); | ||
| return CurrentDevState.REVIEWING; | ||
| } | ||
|
|
||
| await this.executePhaseImplementation(phaseConcept, userContext); | ||
|
|
||
| const numFilesGenerated = this.fileManager.getGeneratedFilePaths().length; | ||
| this.logger.info(`Finalization complete. Generated ${numFilesGenerated}/${this.getTotalFiles()} files.`); | ||
|
|
||
| // Transition to IDLE - generation complete | ||
| return CurrentDevState.REVIEWING; | ||
| } | ||
|
|
||
| /** | ||
| * Generate next phase with user context (suggestions and images) | ||
| */ | ||
| async generateNextPhase(currentIssues: AllIssues, userContext?: UserContext, isFinal?: boolean): Promise<PhaseConceptGenerationSchemaType | undefined> { | ||
| const issues = IssueReport.from(currentIssues); | ||
|
|
||
| // Build notification message | ||
| let notificationMsg = "Generating next phase"; | ||
| if (isFinal) { | ||
| notificationMsg = "Generating final phase"; | ||
| } | ||
| if (userContext?.suggestions && userContext.suggestions.length > 0) { | ||
| notificationMsg = `Generating next phase incorporating ${userContext.suggestions.length} user suggestion(s)`; | ||
| } | ||
| if (userContext?.images && userContext.images.length > 0) { | ||
| notificationMsg += ` with ${userContext.images.length} image(s)`; | ||
| } | ||
|
|
||
| // Notify phase generation start | ||
| this.broadcast(WebSocketMessageResponses.PHASE_GENERATING, { | ||
| message: notificationMsg, | ||
| issues: issues, | ||
| userSuggestions: userContext?.suggestions, | ||
| }); | ||
|
|
||
| const result = await this.operations.generateNextPhase.execute( | ||
| { | ||
| issues, | ||
| userContext, | ||
| isUserSuggestedPhase: userContext?.suggestions && userContext.suggestions.length > 0 && this.state.mvpGenerated, | ||
| isFinal: isFinal ?? false, | ||
| }, | ||
| this.getOperationOptions() | ||
| ) | ||
| // Execute install commands if any | ||
| if (result.installCommands && result.installCommands.length > 0) { | ||
| this.executeCommands(result.installCommands); | ||
| } | ||
|
|
||
| // Execute delete commands if any | ||
| const filesToDelete = result.files.filter(f => f.changes?.toLowerCase().trim() === 'delete'); | ||
| if (filesToDelete.length > 0) { | ||
| this.logger.info(`Deleting ${filesToDelete.length} files: ${filesToDelete.map(f => f.path).join(", ")}`); | ||
| this.deleteFiles(filesToDelete.map(f => f.path)); | ||
| } | ||
|
|
||
| if (result.files.length === 0) { | ||
| this.logger.info("No files generated for next phase"); | ||
| // Notify phase generation complete | ||
| this.broadcast(WebSocketMessageResponses.PHASE_GENERATED, { | ||
| message: `No files generated for next phase`, | ||
| phase: undefined | ||
| }); | ||
| return undefined; | ||
| } | ||
|
|
||
| this.createNewIncompletePhase(result); | ||
| // Notify phase generation complete | ||
| this.broadcast(WebSocketMessageResponses.PHASE_GENERATED, { | ||
| message: `Generated next phase: ${result.name}`, | ||
| phase: result | ||
| }); | ||
|
|
||
| return result; | ||
| } | ||
|
|
||
| /** | ||
| * Implement a single phase of code generation | ||
| * Streams file generation with real-time updates and incorporates technical instructions | ||
| */ | ||
| async implementPhase(phase: PhaseConceptType, currentIssues: AllIssues, userContext?: UserContext, streamChunks: boolean = true, postPhaseFixing: boolean = true): Promise<PhaseImplementationSchemaType> { | ||
| const issues = IssueReport.from(currentIssues); | ||
|
|
||
| const implementationMsg = userContext?.suggestions && userContext.suggestions.length > 0 | ||
| ? `Implementing phase: ${phase.name} with ${userContext.suggestions.length} user suggestion(s)` | ||
| : `Implementing phase: ${phase.name}`; | ||
| const msgWithImages = userContext?.images && userContext.images.length > 0 | ||
| ? `${implementationMsg} and ${userContext.images.length} image(s)` | ||
| : implementationMsg; | ||
|
|
||
| this.broadcast(WebSocketMessageResponses.PHASE_IMPLEMENTING, { | ||
| message: msgWithImages, | ||
| phase: phase, | ||
| issues: issues, | ||
| }); | ||
|
|
||
|
|
||
| const result = await this.operations.implementPhase.execute( | ||
| { | ||
| phase, | ||
| issues, | ||
| isFirstPhase: this.state.generatedPhases.filter(p => p.completed).length === 0, | ||
| fileGeneratingCallback: (filePath: string, filePurpose: string) => { | ||
| this.broadcast(WebSocketMessageResponses.FILE_GENERATING, { | ||
| message: `Generating file: ${filePath}`, | ||
| filePath: filePath, | ||
| filePurpose: filePurpose | ||
| }); | ||
| }, | ||
| userContext, | ||
| shouldAutoFix: this.state.inferenceContext.enableRealtimeCodeFix, | ||
| fileChunkGeneratedCallback: streamChunks ? (filePath: string, chunk: string, format: 'full_content' | 'unified_diff') => { | ||
| this.broadcast(WebSocketMessageResponses.FILE_CHUNK_GENERATED, { | ||
| message: `Generating file: ${filePath}`, | ||
| filePath: filePath, | ||
| chunk, | ||
| format, | ||
| }); | ||
| } : (_filePath: string, _chunk: string, _format: 'full_content' | 'unified_diff') => {}, | ||
| fileClosedCallback: (file: FileOutputType, message: string) => { | ||
| this.broadcast(WebSocketMessageResponses.FILE_GENERATED, { | ||
| message, | ||
| file, | ||
| }); | ||
| } | ||
| }, | ||
| this.getOperationOptions() | ||
| ); | ||
|
|
||
| this.broadcast(WebSocketMessageResponses.PHASE_VALIDATING, { | ||
| message: `Validating files for phase: ${phase.name}`, | ||
| phase: phase, | ||
| }); | ||
|
|
||
| // Await the already-created realtime code fixer promises | ||
| const finalFiles = await Promise.allSettled(result.fixedFilePromises).then((results: PromiseSettledResult<FileOutputType>[]) => { | ||
| return results.map((result) => { | ||
| if (result.status === 'fulfilled') { | ||
| return result.value; | ||
| } else { | ||
| return null; | ||
| } | ||
| }).filter((f): f is FileOutputType => f !== null); | ||
| }); | ||
|
|
||
| // Update state with completed phase | ||
| await this.fileManager.saveGeneratedFiles(finalFiles, `feat: ${phase.name}\n\n${phase.description}`); | ||
|
|
||
| this.logger.info("Files generated for phase:", phase.name, finalFiles.map(f => f.filePath)); | ||
|
|
||
| // Execute commands if provided | ||
| if (result.commands && result.commands.length > 0) { | ||
| this.logger.info("Phase implementation suggested install commands:", result.commands); | ||
| await this.executeCommands(result.commands, false); | ||
| } | ||
|
|
||
| // Deploy generated files | ||
| if (finalFiles.length > 0) { | ||
| await this.deployToSandbox(finalFiles, false, phase.name, true); | ||
| if (postPhaseFixing) { | ||
| await this.applyDeterministicCodeFixes(); | ||
| if (this.state.inferenceContext.enableFastSmartCodeFix) { | ||
| await this.applyFastSmartCodeFixes(); | ||
| } | ||
| } | ||
| } | ||
|
|
||
| // Validation complete | ||
| this.broadcast(WebSocketMessageResponses.PHASE_VALIDATED, { | ||
| message: `Files validated for phase: ${phase.name}`, | ||
| phase: phase | ||
| }); | ||
|
|
||
| this.logger.info(`Validation complete for phase: ${phase.name}`); | ||
|
|
||
| // Notify phase completion | ||
| this.broadcast(WebSocketMessageResponses.PHASE_IMPLEMENTED, { | ||
| phase: { | ||
| name: phase.name, | ||
| files: finalFiles.map(f => ({ | ||
| path: f.filePath, | ||
| purpose: f.filePurpose, | ||
| contents: f.fileContents | ||
| })), | ||
| description: phase.description | ||
| }, | ||
| message: "Files generated successfully for phase" | ||
| }); | ||
|
|
||
| this.markPhaseComplete(phase.name); | ||
|
|
||
| return { | ||
| files: finalFiles, | ||
| deploymentNeeded: result.deploymentNeeded, | ||
| commands: result.commands | ||
| }; | ||
| } | ||
|
|
||
| getTotalFiles(): number { | ||
| return this.fileManager.getGeneratedFilePaths().length + ((this.state.currentPhase || this.state.blueprint.initialPhase)?.files?.length || 0); | ||
| } | ||
|
|
||
| private async applyFastSmartCodeFixes() : Promise<void> { | ||
| try { | ||
| const startTime = Date.now(); | ||
| this.logger.info("Applying fast smart code fixes"); | ||
| // Get static analysis and do deterministic fixes | ||
| const staticAnalysis = await this.runStaticAnalysisCode(); | ||
| if (staticAnalysis.typecheck.issues.length + staticAnalysis.lint.issues.length == 0) { | ||
| this.logger.info("No issues found, skipping fast smart code fixes"); | ||
| return; | ||
| } | ||
| const issues = staticAnalysis.typecheck.issues.concat(staticAnalysis.lint.issues); | ||
| const allFiles = this.fileManager.getAllRelevantFiles(); | ||
|
|
||
| const fastCodeFixer = await this.operations.fastCodeFixer.execute({ | ||
| query: this.state.query, | ||
| issues, | ||
| allFiles, | ||
| }, this.getOperationOptions()); | ||
|
|
||
| if (fastCodeFixer.length > 0) { | ||
| await this.fileManager.saveGeneratedFiles(fastCodeFixer, "fix: Fast smart code fixes"); | ||
| await this.deployToSandbox(fastCodeFixer); | ||
| this.logger.info("Fast smart code fixes applied successfully"); | ||
| } | ||
| this.logger.info(`Fast smart code fixes applied in ${Date.now() - startTime}ms`); | ||
| } catch (error) { | ||
| this.broadcastError("Failed to apply fast smart code fixes", error); | ||
| return; | ||
| } | ||
| } | ||
|
|
||
| async handleUserInput(userMessage: string, images?: ImageAttachment[]): Promise<void> { | ||
| return super.handleUserInput(userMessage, images); | ||
| } | ||
| } |
🟡 MEDIUM - File exceeds 300 lines (718 lines)
Category: quality
Description:
Behavior class file too large for maintainability.
Suggestion:
Extract phase generation, implementation, and review logic into separate operation classes
Confidence: 90%
Rule: ts_large_file
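A hedged sketch of the suggested extraction, assuming nothing about the project's real interfaces (`PhaseOperation`, `PhaseContext`, and `GeneratePhaseOperation` are illustrative names, not the project's actual API): each state-machine step becomes a small operation object that the behavior class merely dispatches to, so the 700-line file shrinks to orchestration.

```typescript
// Illustrative only: names and shapes are assumptions, not the project's API.
interface PhaseContext {
  completedPhases: string[];
  pendingUserInputs: string[];
}

interface PhaseOperation<TIn, TOut> {
  execute(input: TIn, ctx: PhaseContext): Promise<TOut>;
}

// One extracted step: phase-name generation. The real operation would call
// the inference layer; here we only derive a deterministic name.
class GeneratePhaseOperation implements PhaseOperation<{ isFinal: boolean }, string> {
  async execute(input: { isFinal: boolean }, ctx: PhaseContext): Promise<string> {
    const n = ctx.completedPhases.length + 1;
    return input.isFinal ? "final-phase" : `phase-${n}`;
  }
}

// The behavior class would reduce to dispatch logic like this.
async function runNextPhase(ctx: PhaseContext): Promise<string> {
  const op = new GeneratePhaseOperation();
  const isFinal = ctx.pendingUserInputs.length === 0 && ctx.completedPhases.length >= 3;
  return op.execute({ isFinal }, ctx);
}
```

Each operation can then be unit-tested in isolation, which is the main payoff the review finding is pointing at.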
| protected operations: BaseCodingOperations = { | ||
| regenerateFile: new FileRegenerationOperation(), | ||
| fastCodeFixer: new FastCodeFixerOperation(), | ||
| processUserMessage: new UserConversationProcessor(), | ||
| simpleGenerateFiles: new SimpleCodeGenerationOperation(), | ||
| }; |
🟡 MEDIUM - Tight coupling - Direct instantiation of operations
Category: quality
Description:
Operations are directly instantiated in constructor, making testing and swapping implementations difficult.
Suggestion:
Pass operations via constructor (dependency injection) or use a factory pattern
Confidence: 75%
Rule: arch_tight_coupling
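One way to act on this suggestion, sketched with illustrative names (`Operation`, `CodingOperations`, and `CodingAgent` are not the project's actual types): accept the operations bundle through the constructor, keeping direct instantiation only as a default, so tests can swap in stubs without touching production call sites.

```typescript
// Illustrative sketch of constructor injection; all names are assumptions.
interface Operation<TIn, TOut> {
  execute(input: TIn): TOut;
}

interface CodingOperations {
  regenerateFile: Operation<string, string>;
  fastCodeFixer: Operation<string[], string[]>;
}

// Direct instantiation survives, but only as the default wiring.
const defaultOperations: CodingOperations = {
  regenerateFile: { execute: (path) => `regenerated:${path}` },
  fastCodeFixer: { execute: (issues) => issues.map(i => `fixed:${i}`) },
};

class CodingAgent {
  // Tests pass a stub bundle; production code passes nothing.
  constructor(private readonly operations: CodingOperations = defaultOperations) {}

  fixAll(issues: string[]): string[] {
    return this.operations.fastCodeFixer.execute(issues);
  }
}
```

Production behavior is unchanged (`new CodingAgent()` uses the defaults), while a test can construct `new CodingAgent(stubOperations)` and observe exactly which operations were invoked.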
| const appService = new AppService(this.env); | ||
| await appService.updateApp( | ||
| this.getAgentId(), | ||
| { | ||
| status: 'completed', | ||
| } | ||
| ); | ||
| this.generationPromise = null; | ||
| this.broadcast(WebSocketMessageResponses.GENERATION_COMPLETE, { | ||
| message: "Code generation and review process completed.", | ||
| instanceId: this.state.sandboxInstanceId, | ||
| }); | ||
| } | ||
| } | ||
|
|
||
| /** | ||
| * Abstract method to be implemented by subclasses | ||
| * Contains the main logic for code generation and review process | ||
| */ | ||
| abstract build(): Promise<void> | ||
|
|
||
| async executeDeepDebug( | ||
| issue: string, | ||
| toolRenderer: RenderToolCall, | ||
| streamCb: (chunk: string) => void, | ||
| focusPaths?: string[], | ||
| ): Promise<DeepDebugResult> { | ||
| const debugPromise = (async () => { | ||
| try { | ||
| const previousTranscript = this.state.lastDeepDebugTranscript ?? undefined; | ||
| const operationOptions = this.getOperationOptions(); | ||
| const filesIndex = operationOptions.context.allFiles | ||
| .filter((f) => | ||
| !focusPaths?.length || | ||
| focusPaths.some((p) => f.filePath.includes(p)), | ||
| ); | ||
|
|
||
| const runtimeErrors = await this.fetchRuntimeErrors(false); | ||
|
|
||
| const inputs: DeepDebuggerInputs = { | ||
| issue, | ||
| previousTranscript, | ||
| filesIndex, | ||
| runtimeErrors, | ||
| streamCb, | ||
| toolRenderer, | ||
| }; | ||
|
|
||
| const operation = new DeepDebuggerOperation(); | ||
|
|
||
| const result = await operation.execute(inputs, operationOptions); | ||
|
|
||
| const transcript = result.transcript; | ||
|
|
||
| // Save transcript for next session | ||
| this.setState({ | ||
| ...this.state, | ||
| lastDeepDebugTranscript: transcript, | ||
| }); | ||
|
|
||
| return { success: true as const, transcript }; | ||
| } catch (e) { | ||
| this.logger.error('Deep debugger failed', e); | ||
| return { success: false as const, error: `Deep debugger failed: ${String(e)}` }; | ||
| } finally { | ||
| this.deepDebugPromise = null; | ||
| this.deepDebugConversationId = null; | ||
| } | ||
| })(); | ||
|
|
||
| // Store promise before awaiting | ||
| this.deepDebugPromise = debugPromise; | ||
|
|
||
| return await debugPromise; | ||
| } | ||
|
|
||
|
|
||
| getModelConfigsInfo() { | ||
| const modelService = new ModelConfigService(this.env); | ||
| return modelService.getModelConfigsInfo(this.state.inferenceContext.userId); | ||
| } | ||
|
|
||
| getTotalFiles(): number { | ||
| return this.fileManager.getGeneratedFilePaths().length | ||
| } | ||
|
|
||
| getSummary(): Promise<AgentSummary> { | ||
| const summaryData = { | ||
| query: this.state.query, | ||
| generatedCode: this.fileManager.getGeneratedFiles(), | ||
| }; | ||
| return Promise.resolve(summaryData); | ||
| } | ||
|
|
||
| async getFullState(): Promise<TState> { | ||
| return this.state; | ||
| } | ||
|
|
||
| migrateStateIfNeeded(): void { | ||
| // no-op, only older phasic agents need this, for now. | ||
| } | ||
|
|
||
| getFileGenerated(filePath: string) { | ||
| return this.fileManager!.getGeneratedFile(filePath) || null; | ||
| } | ||
|
|
||
| async fetchRuntimeErrors(clear: boolean = true, shouldWait: boolean = true): Promise<RuntimeError[]> { | ||
| if (shouldWait) { | ||
| await this.deploymentManager.waitForPreview(); | ||
| } | ||
|
|
||
| try { | ||
| const errors = await this.deploymentManager.fetchRuntimeErrors(clear); | ||
|
|
||
| if (errors.length > 0) { | ||
| this.broadcast(WebSocketMessageResponses.RUNTIME_ERROR_FOUND, { | ||
| errors, | ||
| message: "Runtime errors found", | ||
| count: errors.length | ||
| }); | ||
| } | ||
|
|
||
| return errors; | ||
| } catch (error) { | ||
| this.logger.error("Exception fetching runtime errors:", error); | ||
| // If fetch fails, initiate redeploy | ||
| this.deployToSandbox(); | ||
| const message = "<runtime errors not available at the moment as preview is not deployed>"; | ||
| return [{ message, timestamp: new Date().toISOString(), level: 0, rawOutput: message }]; | ||
| } | ||
| } | ||
|
|
||
| /** | ||
| * Perform static code analysis on the generated files | ||
| * This helps catch potential issues early in the development process | ||
| */ | ||
| async runStaticAnalysisCode(files?: string[]): Promise<StaticAnalysisResponse> { | ||
| try { | ||
| // Check if we have cached static analysis | ||
| if (this.staticAnalysisCache) { | ||
| return this.staticAnalysisCache; | ||
| } | ||
|
|
||
| const analysisResponse = await this.deploymentManager.runStaticAnalysis(files); | ||
| this.staticAnalysisCache = analysisResponse; | ||
|
|
||
| const { lint, typecheck } = analysisResponse; | ||
| this.broadcast(WebSocketMessageResponses.STATIC_ANALYSIS_RESULTS, { | ||
| lint: { issues: lint.issues, summary: lint.summary }, | ||
| typecheck: { issues: typecheck.issues, summary: typecheck.summary } | ||
| }); | ||
|
|
||
| return analysisResponse; | ||
| } catch (error) { | ||
| this.broadcastError("Failed to lint code", error); | ||
| return { success: false, lint: { issues: [], }, typecheck: { issues: [], } }; | ||
| } | ||
| } | ||
|
|
||
| /** | ||
| * Apply deterministic code fixes for common TypeScript errors | ||
| */ | ||
| protected async applyDeterministicCodeFixes() : Promise<StaticAnalysisResponse | undefined> { | ||
| try { | ||
| // Get static analysis and do deterministic fixes | ||
| const staticAnalysis = await this.runStaticAnalysisCode(); | ||
| if (staticAnalysis.typecheck.issues.length == 0) { | ||
| this.logger.info("No typecheck issues found, skipping deterministic fixes"); | ||
| return staticAnalysis; // So that static analysis is not repeated again | ||
| } | ||
| const typeCheckIssues = staticAnalysis.typecheck.issues; | ||
| this.broadcast(WebSocketMessageResponses.DETERMINISTIC_CODE_FIX_STARTED, { | ||
| message: `Attempting to fix ${typeCheckIssues.length} TypeScript issues using deterministic code fixer`, | ||
| issues: typeCheckIssues | ||
| }); | ||
|
|
||
| this.logger.info(`Attempting to fix ${typeCheckIssues.length} TypeScript issues using deterministic code fixer`); | ||
| const allFiles = this.fileManager.getAllFiles(); | ||
|
|
||
| const fixResult = fixProjectIssues( | ||
| allFiles.map(file => ({ | ||
| filePath: file.filePath, | ||
| fileContents: file.fileContents, | ||
| filePurpose: '' | ||
| })), | ||
| typeCheckIssues | ||
| ); | ||
|
|
||
| this.broadcast(WebSocketMessageResponses.DETERMINISTIC_CODE_FIX_COMPLETED, { | ||
| message: `Fixed ${typeCheckIssues.length} TypeScript issues using deterministic code fixer`, | ||
| issues: typeCheckIssues, | ||
| fixResult | ||
| }); | ||
|
|
||
| if (fixResult) { | ||
| // If there are unfixable issues but of type TS2307, extract external module names and install them | ||
| if (fixResult.unfixableIssues.length > 0) { | ||
| const modulesNotFound = fixResult.unfixableIssues.filter(issue => issue.issueCode === 'TS2307'); | ||
| // Reason is of the form: External package "xyz" should be handled by package manager | ||
| const moduleNames = modulesNotFound.flatMap(issue => { | ||
| const match = issue.reason.match(/External package ["'](.+?)["']/); | ||
| const name = match?.[1]; | ||
| return (typeof name === 'string' && name.trim().length > 0 && !name.startsWith('@shared')) ? [name] : []; | ||
| }).filter((name) => !name.includes('cloudflare:')); | ||
| if (moduleNames.length > 0) { | ||
| const installCommands = moduleNames.map(moduleName => `bun install ${moduleName}`); | ||
| await this.executeCommands(installCommands, false); | ||
|
|
||
| this.logger.info(`Deterministic code fixer installed missing modules: ${moduleNames.join(', ')}`); | ||
| } else { | ||
| this.logger.info(`Deterministic code fixer detected no external modules to install from unfixable TS2307 issues`); | ||
| } | ||
| } | ||
| if (fixResult.modifiedFiles.length > 0) { | ||
| this.logger.info("Applying deterministic fixes to files, Fixes: ", JSON.stringify(fixResult, null, 2)); | ||
| const fixedFiles = fixResult.modifiedFiles.map(file => ({ | ||
| filePath: file.filePath, | ||
| filePurpose: allFiles.find(f => f.filePath === file.filePath)?.filePurpose || '', | ||
| fileContents: file.fileContents | ||
| })); | ||
| await this.fileManager.saveGeneratedFiles(fixedFiles, "fix: applied deterministic fixes"); | ||
|
|
||
| await this.deployToSandbox(fixedFiles, false, "fix: applied deterministic fixes"); | ||
| this.logger.info("Deployed deterministic fixes to sandbox"); | ||
| } | ||
| } | ||
| this.logger.info(`Applied deterministic code fixes: ${JSON.stringify(fixResult, null, 2)}`); | ||
| } catch (error) { | ||
| this.broadcastError('Deterministic code fixer failed', error); | ||
| } | ||
| } | ||
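The TS2307 recovery path above extracts package names from fixer messages with a regex. Isolated as a pure function (message format and exclusion rules are taken from the code above; the function name is illustrative), the extraction looks like:

```typescript
// Sketch of the TS2307 ("cannot find module") package-name extraction.
// Expected message form, per the code above:
//   External package "xyz" should be handled by package manager
interface UnfixableIssue {
  issueCode: string;
  reason: string;
}

function modulesToInstall(issues: UnfixableIssue[]): string[] {
  return issues
    .filter(i => i.issueCode === "TS2307")
    .flatMap(i => {
      const match = i.reason.match(/External package ["'](.+?)["']/);
      const name = match?.[1];
      // Skip internal @shared aliases, as the agent does.
      return typeof name === "string" && name.trim().length > 0 && !name.startsWith("@shared")
        ? [name]
        : [];
    })
    // cloudflare: builtins are runtime-provided, never installed.
    .filter(name => !name.includes("cloudflare:"));
}
```

Each surviving name is then turned into a `bun install <name>` command, mirroring the loop in the method above.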
|
|
||
| async fetchAllIssues(resetIssues: boolean = false): Promise<AllIssues> { | ||
| if (!this.state.sandboxInstanceId) { | ||
| return { runtimeErrors: [], staticAnalysis: { success: false, lint: { issues: [], }, typecheck: { issues: [], } } }; | ||
| } | ||
| const [runtimeErrors, staticAnalysis] = await Promise.all([ | ||
| this.fetchRuntimeErrors(resetIssues), | ||
| this.runStaticAnalysisCode() | ||
| ]); | ||
| this.logger.info("Fetched all issues:", JSON.stringify({ runtimeErrors, staticAnalysis })); | ||
|
|
||
| return { runtimeErrors, staticAnalysis }; | ||
| } | ||
|
|
||
| async updateProjectName(newName: string): Promise<boolean> { | ||
| try { | ||
| const valid = /^[a-z0-9-_]{3,50}$/.test(newName); | ||
| if (!valid) return false; | ||
| const updatedBlueprint = { ...this.state.blueprint, projectName: newName }; | ||
| this.setState({ | ||
| ...this.state, | ||
| blueprint: updatedBlueprint | ||
| }); | ||
| let ok = true; | ||
| if (this.state.sandboxInstanceId) { | ||
| try { | ||
| ok = await this.getSandboxServiceClient().updateProjectName(this.state.sandboxInstanceId, newName); | ||
| } catch (_) { | ||
| ok = false; | ||
| } | ||
| } | ||
| try { | ||
| const appService = new AppService(this.env); | ||
| const dbOk = await appService.updateApp(this.getAgentId(), { title: newName }); | ||
| ok = ok && dbOk; | ||
| } catch (error) { | ||
| this.logger.error('Error updating project name in database:', error); | ||
| ok = false; | ||
| } | ||
| this.broadcast(WebSocketMessageResponses.PROJECT_NAME_UPDATED, { | ||
| message: 'Project name updated', | ||
| projectName: newName | ||
| }); | ||
| return ok; | ||
| } catch (error) { | ||
| this.logger.error('Error updating project name:', error); | ||
| return false; | ||
| } | ||
| } | ||
|
|
||
| /** | ||
| * Update user-facing blueprint fields | ||
| * Only allows updating safe, cosmetic fields - not internal generation state | ||
| */ | ||
| async updateBlueprint(patch: Partial<Blueprint>): Promise<Blueprint> { | ||
| // Fields that are safe to update after generation starts | ||
| // Excludes: initialPhase (breaks phasic generation) | ||
| const safeUpdatableFields = new Set([ | ||
| 'title', | ||
| 'description', | ||
| 'detailedDescription', | ||
| 'colorPalette', | ||
| 'views', | ||
| 'userFlow', | ||
| 'dataFlow', | ||
| 'architecture', | ||
| 'pitfalls', | ||
| 'frameworks', | ||
| 'implementationRoadmap' | ||
| ]); | ||
|
|
||
| // Filter to only safe fields | ||
| const filtered: Record<string, unknown> = {}; | ||
| for (const [key, value] of Object.entries(patch)) { | ||
| if (safeUpdatableFields.has(key) && value !== undefined) { | ||
| filtered[key] = value; | ||
| } | ||
| } | ||
|
|
||
| // Agentic: allow initializing plan if not set yet (first-time plan initialization only) | ||
| if (this.isAgenticState(this.state)) { | ||
| const currentPlan = this.state.blueprint?.plan; | ||
| const patchPlan = 'plan' in patch ? patch.plan : undefined; | ||
| if (Array.isArray(patchPlan) && (!Array.isArray(currentPlan) || currentPlan.length === 0)) { | ||
| filtered['plan'] = patchPlan; | ||
| } | ||
| } | ||
|
|
||
| // projectName requires sandbox update, handle separately | ||
| if ('projectName' in patch && typeof patch.projectName === 'string') { | ||
| await this.updateProjectName(patch.projectName); | ||
| } | ||
|
|
||
| // Merge and update state | ||
| const updated = { ...this.state.blueprint, ...filtered } as Blueprint; | ||
| this.setState({ | ||
| ...this.state, | ||
| blueprint: updated | ||
| }); | ||
|
|
||
| this.broadcast(WebSocketMessageResponses.BLUEPRINT_UPDATED, { | ||
| message: 'Blueprint updated', | ||
| updatedKeys: Object.keys(filtered) | ||
| }); | ||
|
|
||
| return updated; | ||
| } | ||
|
|
||
| // ===== Debugging helpers for assistants ===== | ||
| listFiles(): FileOutputType[] { | ||
| return this.fileManager.getAllRelevantFiles(); | ||
| } | ||
|
|
||
| async readFiles(paths: string[]): Promise<{ files: { path: string; content: string }[] }> { | ||
| const results: { path: string; content: string }[] = []; | ||
| const notFoundInFileManager: string[] = []; | ||
|
|
||
| // First, try to read from FileManager (template + generated files) | ||
| for (const path of paths) { | ||
| const file = this.fileManager.getFile(path); | ||
| if (file) { | ||
| results.push({ path, content: file.fileContents }); | ||
| } else { | ||
| notFoundInFileManager.push(path); | ||
| } | ||
| } | ||
|
|
||
| // If some files not found in FileManager and sandbox exists, try sandbox | ||
| if (notFoundInFileManager.length > 0 && this.state.sandboxInstanceId) { | ||
| const resp = await this.getSandboxServiceClient().getFiles( | ||
| this.state.sandboxInstanceId, | ||
| notFoundInFileManager | ||
| ); | ||
| if (resp.success) { | ||
| results.push(...resp.files.map(f => ({ | ||
| path: f.filePath, | ||
| content: f.fileContents | ||
| }))); | ||
| } | ||
| } | ||
|
|
||
| return { files: results }; | ||
| } | ||
|
|
||
| async execCommands(commands: string[], shouldSave: boolean, timeout?: number): Promise<ExecuteCommandsResponse> { | ||
| const { sandboxInstanceId } = this.state; | ||
| if (!sandboxInstanceId) { | ||
| return { success: false, results: [], error: 'No sandbox instance' }; | ||
| } | ||
| const result = await this.getSandboxServiceClient().executeCommands(sandboxInstanceId, commands, timeout); | ||
| if (shouldSave) { | ||
| this.saveExecutedCommands(commands); | ||
| } | ||
| return result; | ||
| } | ||
|
|
||
| updateSlideManifest(file: FileOutputType) { | ||
| // If the project type is presentation and this is a slide file, update the manifest | ||
| if (this.projectType === 'presentation') { | ||
| const templateDetails = this.getTemplateDetails() | ||
| if (!templateDetails) { | ||
| return; | ||
| } | ||
| const slidesDirectory = templateDetails.slideDirectory ?? '/public/slides'; | ||
| if (file.filePath.startsWith(slidesDirectory) && file.filePath.endsWith('.json')) { | ||
| const manifestPath = `${slidesDirectory}/manifest.json` | ||
| const existingManifest = this.fileManager.getFile(manifestPath) | ||
|
|
||
| // Parse existing manifest or create new one | ||
| let manifestData: { slides: string[] } = { slides: [] }; | ||
| if (existingManifest) { | ||
| try { | ||
| const parsed = JSON.parse(existingManifest.fileContents); | ||
| manifestData = { | ||
| slides: Array.isArray(parsed.slides) ? parsed.slides : [] | ||
| }; | ||
| } catch (error) { | ||
| this.logger.error('Failed to parse existing manifest.json', error); | ||
| manifestData = { slides: [] }; | ||
| } | ||
| } else { | ||
| manifestData = { slides: [] }; | ||
| } | ||
|
|
||
| // Add slide path to slides array if not already present | ||
| const relativeSlidePath = file.filePath.replace(slidesDirectory + '/', ''); | ||
| if (!manifestData.slides.includes(relativeSlidePath)) { | ||
| manifestData.slides.push(relativeSlidePath); | ||
|
|
||
| // Save updated manifest | ||
| const updatedManifest: FileOutputType = { | ||
| filePath: manifestPath, | ||
| fileContents: JSON.stringify(manifestData, null, 2), | ||
| filePurpose: 'Presentation slides manifest' | ||
| }; | ||
| this.fileManager.recordFileChanges([updatedManifest]); | ||
|
|
||
| this.logger.info('Updated manifest.json with new slide', { | ||
| slidePath: relativeSlidePath, | ||
| totalSlides: manifestData.slides.length | ||
| }); | ||
| } | ||
| } | ||
| } | ||
| } | ||
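The manifest bookkeeping in `updateSlideManifest` reduces to a pure transformation. A sketch under the same assumptions as the code above (default `/public/slides` directory, corrupt manifests reset to an empty list; the standalone function name is illustrative):

```typescript
// Given the current manifest contents (or null) and a newly written slide
// file, produce the updated manifest JSON, or null if the file is not a slide.
function updatedSlideManifest(
  existingManifest: string | null,
  slideFilePath: string,
  slidesDirectory = "/public/slides",
): string | null {
  if (!slideFilePath.startsWith(slidesDirectory) || !slideFilePath.endsWith(".json")) {
    return null; // not a slide file; nothing to do
  }
  let slides: string[] = [];
  if (existingManifest) {
    try {
      const parsed = JSON.parse(existingManifest);
      slides = Array.isArray(parsed.slides) ? parsed.slides : [];
    } catch {
      slides = []; // corrupt manifest: start fresh, as the agent does
    }
  }
  // Store paths relative to the slides directory, deduplicated.
  const relative = slideFilePath.replace(slidesDirectory + "/", "");
  if (!slides.includes(relative)) slides.push(relative);
  return JSON.stringify({ slides }, null, 2);
}
```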
|
|
||
| /** | ||
| * Regenerate a file to fix identified issues | ||
| * Retries up to 3 times before giving up | ||
| */ | ||
| async regenerateFile(file: FileOutputType, issues: string[], retryIndex: number = 0) { | ||
| this.broadcast(WebSocketMessageResponses.FILE_REGENERATING, { | ||
| message: `Regenerating file: ${file.filePath}`, | ||
        filePath: file.filePath,
        original_issues: issues,
    });

    const result = await this.operations.regenerateFile.execute(
        { file, issues, retryIndex },
        this.getOperationOptions()
    );

    this.updateSlideManifest(result);
    const fileState = await this.fileManager.saveGeneratedFile(result);

    this.broadcast(WebSocketMessageResponses.FILE_REGENERATED, {
        message: `Regenerated file: ${file.filePath}`,
        file: fileState,
        original_issues: issues,
    });

    return fileState;
}

async regenerateFileByPath(path: string, issues: string[]): Promise<{ path: string; diff: string }> {
    const templateDetails = this.getTemplateDetails();
    if (templateDetails && templateDetails.dontTouchFiles && templateDetails.dontTouchFiles.includes(path)) {
        return {
            path,
            diff: '<WRITE PROTECTED - TEMPLATE FILE, CANNOT MODIFY - SKIPPED - NO CHANGES MADE>'
        };
    }
    // Prefer the local file manager; fall back to the sandbox
    let fileContents = '';
    let filePurpose = '';
    try {
        const fmFile = this.fileManager.getFile(path);
        if (fmFile) {
            fileContents = fmFile.fileContents;
            filePurpose = fmFile.filePurpose || '';
        } else {
            const { sandboxInstanceId } = this.state;
            if (!sandboxInstanceId) {
                throw new Error('No sandbox instance available');
            }
            const resp = await this.getSandboxServiceClient().getFiles(sandboxInstanceId, [path]);
            const f = resp.success ? resp.files.find(f => f.filePath === path) : undefined;
            if (!f) throw new Error(resp.error || `File not found: ${path}`);
            fileContents = f.fileContents;
        }
    } catch (e) {
        throw new Error(`Failed to read file for regeneration: ${String(e)}`);
    }

    const regenerated = await this.regenerateFile({ filePath: path, fileContents, filePurpose }, issues, 0);
    // Invalidate the static analysis cache
    this.staticAnalysisCache = null;
    // Persist to the sandbox instance
    await this.deploymentManager.deployToSandbox([regenerated]);
    return { path, diff: regenerated.lastDiff };
}

async generateFiles(
    phaseName: string,
    phaseDescription: string,
    requirements: string[],
    files: FileConceptType[]
): Promise<{ files: Array<{ path: string; purpose: string; diff: string }> }> {
    this.logger.info('Generating files for deep debugger', {
        phaseName,
        requirementsCount: requirements.length,
        filesCount: files.length
    });

    // Broadcast that file generation has started
    this.broadcast(WebSocketMessageResponses.PHASE_IMPLEMENTING, {
        message: `Generating files: ${phaseName}`,
        phaseName
    });

    const skippedFiles: { path: string; purpose: string; diff: string }[] = [];

    // Enforce the template's don't-touch constraints
    const templateDetails = this.getTemplateDetails();
    if (templateDetails && templateDetails.dontTouchFiles) {
        const dontTouchFiles = new Set<string>(templateDetails.dontTouchFiles);
        files = files.filter(file => {
            if (dontTouchFiles.has(file.path)) {
                this.logger.info('Skipping dont-touch file', { filePath: file.path });
                skippedFiles.push({ path: file.path, purpose: 'WRITE-PROTECTED FILE, CANNOT MODIFY', diff: '<WRITE PROTECTED - TEMPLATE FILE, CANNOT MODIFY - SKIPPED - NO CHANGES MADE>' });
                return false;
            }
            return true;
        });
    }

    const savedFiles: FileState[] = [];

    const operation = new SimpleCodeGenerationOperation();
    const result = await operation.execute(
        {
            phaseName,
            phaseDescription,
            requirements,
            files,
            fileGeneratingCallback: (filePath: string, filePurpose: string) => {
                this.broadcast(WebSocketMessageResponses.FILE_GENERATING, {
                    message: `Generating file: ${filePath}`,
                    filePath,
                    filePurpose
                });
            },
            fileChunkGeneratedCallback: (filePath: string, chunk: string, format: 'full_content' | 'unified_diff') => {
                this.broadcast(WebSocketMessageResponses.FILE_CHUNK_GENERATED, {
                    message: `Generating file: ${filePath}`,
                    filePath,
                    chunk,
                    format
                });
            },
            fileClosedCallback: (file, message) => {
                // Record the file to state (synchronous)
                const saved = this.fileManager.recordFileChanges([file]);
                savedFiles.push(...saved);
                this.updateSlideManifest(file);
                this.broadcast(WebSocketMessageResponses.FILE_GENERATED, {
                    message,
                    file
                });
            }
        },
        this.getOperationOptions()
    );

    await this.fileManager.saveGeneratedFiles(
        [],
        `feat: ${phaseName}\n\n${phaseDescription}`
    );

    this.logger.info('Files generated and saved', {
        fileCount: result.files.length
    });

    await this.deployToSandbox(savedFiles, false);

    return {
        files: [
            ...skippedFiles,
            ...savedFiles.map(f => ({
                path: f.filePath,
                purpose: f.filePurpose || '',
                diff: f.lastDiff || ''
            }))
        ]
    };
}

/**
 * Get or create the file serving token (lazy generation)
 */
private getOrCreateFileServingToken(): string {
    if (!this.state.fileServingToken) {
        const token = generatePortToken();
        this.setState({
            ...this.state,
            fileServingToken: {
                token,
                createdAt: Date.now()
            }
        });
    }
    return this.state.fileServingToken!.token;
}

/**
 * Get the browser preview URL for file serving
 */
public getBrowserPreviewURL(): string {
    const token = this.getOrCreateFileServingToken();
    const agentId = this.getAgentId();
    const previewDomain = isDev(this.env) ? 'localhost:5173' : getPreviewDomain(this.env);

    // Format: b-{agentid}-{token}.{previewDomain}
    return `${getProtocolForHost(previewDomain)}://b-${agentId}-${token}.${previewDomain}`;
}

// A wrapper for the LLM tool to deploy to the sandbox
async deployPreview(clearLogs: boolean = true, forceRedeploy: boolean = false): Promise<string> {
    const response = await this.deployToSandbox([], forceRedeploy, undefined, clearLogs);
    if (response && response.previewURL) {
        this.broadcast(WebSocketMessageResponses.PREVIEW_FORCE_REFRESH, {});
        return `Deployment successful: ${response.previewURL}`;
    }
    return `Failed to deploy: ${response?.tunnelURL}`;
}

async deployToSandbox(files: FileOutputType[] = [], redeploy: boolean = false, commitMessage?: string, clearLogs: boolean = false): Promise<PreviewType | null> {
    // Only deploy if the project is previewable
    if (!this.isPreviewable()) {
        throw new Error('Project is not previewable');
    }
    this.logger.info('[AGENT] Deploying to sandbox', { files: files.length, redeploy, commitMessage, renderMode: this.getTemplateDetails()?.renderMode, templateDetails: this.getTemplateDetails() });

    if (this.getTemplateDetails()?.renderMode === 'browser') {
        this.logger.info('Deploying to browser-native sandbox');
        this.broadcast(WebSocketMessageResponses.DEPLOYMENT_STARTED, {});
        const result: PreviewType = {
            previewURL: this.getBrowserPreviewURL()
        };
        this.logger.info('Deployed to browser-native sandbox');
        this.broadcast(WebSocketMessageResponses.DEPLOYMENT_COMPLETED, result);
        return result;
    }

    // Invalidate the static analysis cache
    this.staticAnalysisCache = null;

    // Call the deployment manager with callbacks so broadcasts fire at the right times
    const result = await this.deploymentManager.deployToSandbox(
        files,
        redeploy,
        commitMessage,
        clearLogs,
        {
            onStarted: (data) => {
                this.broadcast(WebSocketMessageResponses.DEPLOYMENT_STARTED, data);
            },
            onCompleted: (data) => {
                this.broadcast(WebSocketMessageResponses.DEPLOYMENT_COMPLETED, data);
            },
            onError: (data) => {
                this.broadcast(WebSocketMessageResponses.DEPLOYMENT_FAILED, data);
            },
            onAfterSetupCommands: async () => {
                // Sync package.json after setup commands (which include dependency installs)
                await this.syncPackageJsonFromSandbox();
            }
        }
    );

    return result;
}

/**
 * Deploy the generated code to Cloudflare Workers
 */
async deployToCloudflare(target: DeploymentTarget = 'platform'): Promise<{ deploymentUrl?: string; workersUrl?: string } | null> {
    try {
        // Ensure a sandbox instance exists first
        if (!this.state.sandboxInstanceId) {
            this.logger.info('No sandbox instance, deploying to sandbox first');
            await this.deployToSandbox();

            if (!this.state.sandboxInstanceId) {
                this.logger.error('Failed to deploy to sandbox service');
                this.broadcast(WebSocketMessageResponses.CLOUDFLARE_DEPLOYMENT_ERROR, {
                    message: 'Deployment failed: Failed to deploy to sandbox service',
                    error: 'Sandbox service unavailable'
                });
                return null;
            }
        }

        // Call the service - it handles orchestration, with callbacks for broadcasting
        const result = await this.deploymentManager.deployToCloudflare({
            target,
            callbacks: {
                onStarted: (data) => {
                    this.broadcast(WebSocketMessageResponses.CLOUDFLARE_DEPLOYMENT_STARTED, data);
                },
                onCompleted: (data) => {
                    this.broadcast(WebSocketMessageResponses.CLOUDFLARE_DEPLOYMENT_COMPLETED, data);
                },
                onError: (data) => {
                    this.broadcast(WebSocketMessageResponses.CLOUDFLARE_DEPLOYMENT_ERROR, data);
                },
            }
        });

        // Update the database with the deployment ID if successful
        if (result.deploymentUrl && result.deploymentId) {
            const appService = new AppService(this.env);
            await appService.updateDeploymentId(
                this.getAgentId(),
                result.deploymentId
            );
        }

        return result.deploymentUrl ? { deploymentUrl: result.deploymentUrl } : null;
    } catch (error) {
        this.logger.error('Cloudflare deployment error:', error);
        this.broadcast(WebSocketMessageResponses.CLOUDFLARE_DEPLOYMENT_ERROR, {
            message: 'Deployment failed',
            error: error instanceof Error ? error.message : String(error)
        });
        return null;
    }
}

async importTemplate(templateName: string): Promise<{ templateName: string; filesImported: number; files: TemplateFile[] }> {
    this.logger.info(`Importing template into project: ${templateName}`);

    if (this.state.templateName !== templateName) {
        // Get template catalog info to sync projectType
        const catalogResponse = await BaseSandboxService.listTemplates();
        const catalogInfo = catalogResponse.success
            ? catalogResponse.templates.find(t => t.name === templateName)
            : null;

        // Update state with the template name and, if available, the projectType
        this.setState({
            ...this.state,
            templateName: templateName,
            ...(catalogInfo?.projectType ? { projectType: catalogInfo.projectType } : {}),
        });

        this.templateDetailsCache = null; // Clear the template details cache
    }
    const templateDetails = await this.ensureTemplateDetails();
    if (!templateDetails) {
        throw new Error(`Failed to get template details for: ${templateName}`);
    }

    this.setState({
        ...this.state,
        lastPackageJson: templateDetails.allFiles['package.json'] || this.state.lastPackageJson,
    });

    // Get the important files for the return value
    const importantFiles = getTemplateImportantFiles(templateDetails);

    // Ensure deployment to the sandbox
    await this.deployToSandbox();

    // Notify the frontend about the template metadata update
    this.broadcast(WebSocketMessageResponses.TEMPLATE_UPDATED, {
        templateDetails
    });

    return {
        templateName: templateDetails.name,
        filesImported: Object.keys(templateDetails.allFiles).length,
        files: importantFiles
    };
}

async waitForGeneration(): Promise<void> {
    if (this.generationPromise) {
        try {
            await this.generationPromise;
            this.logger.info("Code generation completed successfully");
        } catch (error) {
            this.logger.error("Error during code generation:", error);
        }
    } else {
        this.logger.error("No generation process found");
    }
}

isDeepDebugging(): boolean {
    return this.deepDebugPromise !== null;
}

getDeepDebugSessionState(): { conversationId: string } | null {
    if (this.deepDebugConversationId && this.deepDebugPromise) {
        return { conversationId: this.deepDebugConversationId };
    }
    return null;
}

async waitForDeepDebug(): Promise<void> {
    if (this.deepDebugPromise) {
        try {
            await this.deepDebugPromise;
            this.logger.info("Deep debug session completed successfully");
        } catch (error) {
            this.logger.error("Error during deep debug session:", error);
        } finally {
            // Clear the promise once waiting completes
            this.deepDebugPromise = null;
        }
    }
}

protected async onProjectUpdate(message: string): Promise<void> {
    this.setState({
        ...this.state,
        projectUpdatesAccumulator: [...this.state.projectUpdatesAccumulator, message]
    });
}

protected async getAndResetProjectUpdates() {
    const projectUpdates = this.state.projectUpdatesAccumulator || [];
    this.setState({
        ...this.state,
        projectUpdatesAccumulator: []
    });
    return projectUpdates;
}

public broadcast<T extends WebSocketMessageType>(msg: T, data?: WebSocketMessageData<T>): void {
    if (this.operations.processUserMessage.isProjectUpdateType(msg)) {
        let message = msg as string;
        if (data && 'message' in data) {
            message = (data as { message: string }).message;
        }
        this.onProjectUpdate(message);
    }
    super.broadcast(msg, data);
}

protected getBootstrapCommands() {
    const bootstrapCommands = this.state.commandsHistory || [];
    // Validate, deduplicate, and clean
    const { validCommands } = validateAndCleanBootstrapCommands(bootstrapCommands);
    return validCommands;
}

protected async saveExecutedCommands(commands: string[]) {
    this.logger.info('Saving executed commands', { commands });

    // Merge with the existing history
    const mergedCommands = [...(this.state.commandsHistory || []), ...commands];

    // Validate, deduplicate, and clean
    const { validCommands, invalidCommands, deduplicated } = validateAndCleanBootstrapCommands(mergedCommands);

    // Log what was filtered out
    if (invalidCommands.length > 0 || deduplicated > 0) {
        this.logger.warn('[commands] Bootstrap commands cleaned', {
            invalidCommands,
            invalidCount: invalidCommands.length,
            deduplicatedCount: deduplicated,
            finalCount: validCommands.length
        });
    }

    // Update state with the cleaned commands
    this.setState({
        ...this.state,
        commandsHistory: validCommands
    });

    // Update the bootstrap script with the validated commands
    await this.updateBootstrapScript(validCommands);

    // Sync package.json if any dependency-modifying commands were executed
    const hasDependencyCommands = commands.some(cmd =>
        cmd.includes('install') ||
        cmd.includes(' add ') ||
        cmd.includes('remove') ||
        cmd.includes('uninstall')
    );

    if (hasDependencyCommands) {
        this.logger.info('Dependency commands executed, syncing package.json from sandbox');
        await this.syncPackageJsonFromSandbox();
    }
}

/**
 * Execute commands with retry logic
 * Chunks commands and retries failed ones with AI assistance
 */
protected async executeCommands(commands: string[], shouldRetry: boolean = true, chunkSize: number = 5): Promise<void> {
    const state = this.state;
    if (!state.sandboxInstanceId) {
        this.logger.warn('No sandbox instance available for executing commands');
        return;
    }

    // Sanitize and prepare commands
    commands = commands.join('\n').split('\n').filter(cmd => cmd.trim() !== '').filter(cmd => looksLikeCommand(cmd) && !cmd.includes(' undefined'));
    if (commands.length === 0) {
        this.logger.warn("No commands to execute");
        return;
    }

    commands = commands.map(cmd => cmd.trim().replace(/^\s*-\s*/, '').replace(/^npm/, 'bun'));
    this.logger.info(`AI suggested ${commands.length} commands to run: ${commands.join(", ")}`);

    // Remove duplicate commands
    commands = Array.from(new Set(commands));

    // Execute in chunks
    const commandChunks = [];
    for (let i = 0; i < commands.length; i += chunkSize) {
        commandChunks.push(commands.slice(i, i + chunkSize));
    }

    const successfulCommands: string[] = [];

    for (const chunk of commandChunks) {
        // Retry failed commands up to 3 times
        let currentChunk = chunk;
        let retryCount = 0;
        const maxRetries = shouldRetry ? 3 : 1;

        while (currentChunk.length > 0 && retryCount < maxRetries) {
            try {
                this.broadcast(WebSocketMessageResponses.COMMAND_EXECUTING, {
                    message: retryCount > 0 ? `Retrying commands (attempt ${retryCount + 1}/${maxRetries})` : "Executing commands",
                    commands: currentChunk
                });

                const resp = await this.getSandboxServiceClient().executeCommands(
                    state.sandboxInstanceId,
                    currentChunk
                );
                if (!resp.results || !resp.success) {
                    this.logger.error('Failed to execute commands', { response: resp });
                    // Check whether the instance is still running
                    const status = await this.getSandboxServiceClient().getInstanceStatus(state.sandboxInstanceId);
                    if (!status.success || !status.isHealthy) {
                        this.logger.error(`Instance ${state.sandboxInstanceId} is no longer running`);
                        return;
                    }
                    break;
                }

                // Process results
                const successful = resp.results.filter(r => r.success);
                const failures = resp.results.filter(r => !r.success);

                // Track successful commands
                if (successful.length > 0) {
                    const successfulCmds = successful.map(r => r.command);
                    this.logger.info(`Successfully executed ${successful.length} commands: ${successfulCmds.join(", ")}`);
                    successfulCommands.push(...successfulCmds);
                }

                // If all succeeded, move to the next chunk
                if (failures.length === 0) {
                    this.logger.info(`All commands in chunk executed successfully`);
                    break;
                }

                // Handle failures
                const failedCommands = failures.map(r => r.command);
                this.logger.warn(`${failures.length} commands failed: ${failedCommands.join(", ")}`);

                // Only retry if shouldRetry is true
                if (!shouldRetry) {
                    break;
                }

                retryCount++;

                // For install commands, try AI regeneration
                const failedInstallCommands = failedCommands.filter(cmd =>
                    cmd.startsWith("bun") || cmd.startsWith("npm") || cmd.includes("install")
                );

                if (failedInstallCommands.length > 0 && retryCount < maxRetries) {
                    // Use AI to suggest alternative commands
                    const newCommands = await this.getProjectSetupAssistant().generateSetupCommands(
                        `The following install commands failed: ${JSON.stringify(failures, null, 2)}. Please suggest alternative commands.`
                    );

                    if (newCommands?.commands && newCommands.commands.length > 0) {
                        this.logger.info(`AI suggested ${newCommands.commands.length} alternative commands`);
                        this.broadcast(WebSocketMessageResponses.COMMAND_EXECUTING, {
                            message: "Executing regenerated commands",
                            commands: newCommands.commands
                        });
                        currentChunk = newCommands.commands.filter(looksLikeCommand);
                    } else {
                        this.logger.warn('AI could not generate alternative commands');
                        currentChunk = [];
                    }
                } else {
                    // No retry needed for non-install commands
                    currentChunk = [];
                }
            } catch (error) {
                this.logger.error('Error executing commands:', error);
                // Stop retrying on error
                break;
            }
        }
    }

    // Record command execution history
    const failedCommands = commands.filter(cmd => !successfulCommands.includes(cmd));

    if (failedCommands.length > 0) {
        this.broadcastError('Failed to execute commands', new Error(failedCommands.join(", ")));
    } else {
        this.logger.info(`All commands executed successfully: ${successfulCommands.join(", ")}`);
    }

    // Await so the history is persisted before returning
    await this.saveExecutedCommands(successfulCommands);
}

/**
 * Sync package.json from the sandbox to the agent's git repository
 * Called after install/add/remove commands to keep dependencies in sync
 */
protected async syncPackageJsonFromSandbox(): Promise<void> {
    try {
        this.logger.info('Fetching current package.json from sandbox');
        const results = await this.readFiles(['package.json']);
        if (!results || !results.files || results.files.length === 0) {
            this.logger.warn('Failed to fetch package.json from sandbox', { results });
            return;
        }
        const packageJsonContent = results.files[0].content;

        const { updated, packageJson } = updatePackageJson(this.state.lastPackageJson, packageJsonContent);
        if (!updated) {
            this.logger.info('package.json has not changed, skipping sync');
            return;
        }
        // Update state with the latest package.json
        this.setState({
            ...this.state,
            lastPackageJson: packageJson
        });

        // Commit to the git repository
        const fileState = await this.fileManager.saveGeneratedFile(
            {
                filePath: 'package.json',
                fileContents: packageJson,
                filePurpose: 'Project dependencies and configuration'
            },
            'chore: sync package.json dependencies from sandbox',
            true
        );

        this.logger.info('Successfully synced package.json to git', {
            filePath: fileState.filePath,
        });

        // Broadcast the update to clients
        this.broadcast(WebSocketMessageResponses.FILE_GENERATED, {
            message: 'Synced package.json from sandbox',
            file: fileState
        });
    } catch (error) {
        this.logger.error('Failed to sync package.json from sandbox', error);
        // Non-critical error - don't throw, just log
    }
}

async getLogs(_reset?: boolean, durationSeconds?: number): Promise<string> {
    if (!this.state.sandboxInstanceId) {
        throw new Error('Cannot get logs: No sandbox instance available');
    }

    const response = await this.getSandboxServiceClient().getLogs(this.state.sandboxInstanceId, _reset, durationSeconds);
    if (response.success) {
        return `STDOUT: ${response.logs.stdout}\nSTDERR: ${response.logs.stderr}`;
    } else {
        return `Failed to get logs, ${response.error}`;
    }
}

/**
 * Delete files from the file manager and the sandbox
 */
async deleteFiles(filePaths: string[]): Promise<{ success: boolean, error?: string }> {
    const deleteCommands: string[] = filePaths.map(filePath => `rm -rf ${filePath}`);
    // Remove the files from the file manager
    this.fileManager.deleteFiles(filePaths);
    try {
        await this.executeCommands(deleteCommands, false);
        this.logger.info(`Deleted ${filePaths.length} files: ${filePaths.join(", ")}`);
        return { success: true };
    } catch (error) {
        this.logger.error('Error deleting files:', error);
        return { success: false, error: error instanceof Error ? error.message : String(error) };
    }
}

/**
 * Handle user input during conversational code generation
 * Processes user messages and updates the pendingUserInputs state
 */
async handleUserInput(userMessage: string, images?: ImageAttachment[]): Promise<void> {
    try {
        this.logger.info('Processing user input message', {
            messageLength: userMessage.length,
            pendingInputsCount: this.state.pendingUserInputs.length,
            hasImages: !!images && images.length > 0,
            imageCount: images?.length || 0
        });

        // Ensure template details are loaded before processing
        await this.ensureTemplateDetails();

        // Just fetch runtime errors
        const errors = await this.fetchRuntimeErrors(false, false);
        const projectUpdates = await this.getAndResetProjectUpdates();
        this.logger.info('Passing context to user conversation processor', { errors, projectUpdates });

        const conversationState = this.infrastructure.getConversationState();
        // If there are images, upload them and pass the URLs to the conversation processor
        let uploadedImages: ProcessedImageAttachment[] = [];
        if (images) {
            uploadedImages = await Promise.all(images.map(async (image) => {
                return await uploadImage(this.env, image, ImageType.UPLOADS);
            }));

            this.logger.info('Uploaded images', { uploadedImages });
        }

        // Process the user message using the conversational assistant
        const conversationalResponse = await this.operations.processUserMessage.execute(
            {
                userMessage,
                conversationState,
                conversationResponseCallback: (
                    message: string,
                    conversationId: string,
                    isStreaming: boolean,
                    tool?: { name: string; status: 'start' | 'success' | 'error'; args?: Record<string, unknown> }
                ) => {
                    // Track the conversationId when deep_debug starts
                    if (tool?.name === 'deep_debug' && tool.status === 'start') {
                        this.deepDebugConversationId = conversationId;
                    }

                    this.broadcast(WebSocketMessageResponses.CONVERSATION_RESPONSE, {
                        message,
                        conversationId,
                        isStreaming,
                        tool,
                    });
                },
                errors,
                projectUpdates,
                images: uploadedImages
            },
            this.getOperationOptions()
        );

        const { conversationResponse, conversationState: newConversationState } = conversationalResponse;
        this.logger.info('User input processed successfully', {
            responseLength: conversationResponse.userResponse.length,
        });

        this.infrastructure.setConversationState(newConversationState);
    } catch (error) {
        this.logger.error('Error processing user input', error);
        throw error;
    }
}

/**
 * Capture a screenshot of the given URL using the Cloudflare Browser Rendering REST API
 */
public async captureScreenshot(
    url: string,
    viewport: { width: number; height: number } = { width: 1280, height: 720 }
): Promise<string> {
    if (!this.env.DB || !this.getAgentId()) {
        const error = 'Cannot capture screenshot: DB or agentId not available';
        this.logger.warn(error);
        this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_ERROR, {
            error,
            configurationError: true
        });
        throw new Error(error);
    }

    if (!url) {
        const error = 'URL is required for screenshot capture';
        this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_ERROR, {
            error,
            url,
            viewport
        });
        throw new Error(error);
    }

    this.logger.info('Capturing screenshot via REST API', { url, viewport });

    // Notify that screenshot capture has started
    this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_STARTED, {
        message: `Capturing screenshot of ${url}`,
        url,
        viewport
    });

    try {
        // Use the Cloudflare Browser Rendering REST API
        const apiUrl = `https://api.cloudflare.com/client/v4/accounts/${this.env.CLOUDFLARE_ACCOUNT_ID}/browser-rendering/snapshot`;

        const response = await fetch(apiUrl, {
            method: 'POST',
            headers: {
                'Authorization': `Bearer ${this.env.CLOUDFLARE_API_TOKEN}`,
                'Content-Type': 'application/json',
            },
            body: JSON.stringify({
                url: url,
                viewport: viewport,
                gotoOptions: {
                    waitUntil: 'networkidle0',
                    timeout: 10000
                },
                screenshotOptions: {
                    fullPage: false,
                    type: 'png'
                }
            }),
        });

        if (!response.ok) {
            const errorText = await response.text();
            const error = `Browser Rendering API failed: ${response.status} - ${errorText}`;
            this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_ERROR, {
                error,
                url,
                viewport,
                statusCode: response.status,
                statusText: response.statusText
            });
            throw new Error(error);
        }

        const result = await response.json() as {
            success: boolean;
            result: {
                screenshot: string; // base64 encoded
                content: string; // HTML content
            };
        };

        if (!result.success || !result.result.screenshot) {
            const error = 'Browser Rendering API succeeded but no screenshot was returned';
            this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_ERROR, {
                error,
                url,
                viewport,
                apiResponse: result
            });
            throw new Error(error);
        }

        // Get the base64 screenshot data
        const base64Screenshot = result.result.screenshot;
        const screenshot: ImageAttachment = {
            id: this.getAgentId(),
            filename: 'latest.png',
            mimeType: 'image/png',
            base64Data: base64Screenshot
        };
        const uploadedImage = await uploadImage(this.env, screenshot, ImageType.SCREENSHOTS);

        // Persist in the database
        try {
            const appService = new AppService(this.env);
🟡 MEDIUM - Tight coupling - Multiple direct AppService instantiations
Category: quality
Description:
AppService is instantiated directly 4 times throughout the class, which hinders testability.
Suggestion:
Inject AppService via the constructor, or create a single instance as a class property.
Confidence: 85%
Rule: arch_tight_coupling
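A minimal sketch of the suggested constructor injection. The names `AppServiceLike`, `InMemoryAppService`, and `recordDeployment` are illustrative only, not the repo's actual API; the point is that the agent depends on an interface it receives once, so tests can pass a stub instead of a real database-backed service.

```typescript
// Hypothetical slice of the AppService surface the agent actually uses.
interface AppServiceLike {
    updateDeploymentId(agentId: string, deploymentId: string): Promise<void>;
}

// Test double: records calls instead of touching a database.
class InMemoryAppService implements AppServiceLike {
    public calls: Array<[string, string]> = [];
    async updateDeploymentId(agentId: string, deploymentId: string): Promise<void> {
        this.calls.push([agentId, deploymentId]);
    }
}

class CodeGeneratorAgent {
    // Single injected instance replaces the four `new AppService(this.env)` call sites.
    constructor(private readonly appService: AppServiceLike) {}

    async recordDeployment(agentId: string, deploymentId: string): Promise<void> {
        await this.appService.updateDeploymentId(agentId, deploymentId);
    }
}

// Usage: production code injects the real service once; tests inject the stub.
const svc = new InMemoryAppService();
const agent = new CodeGeneratorAgent(svc);
agent.recordDeployment('agent-1', 'deploy-1');
```

In a Durable Object, the same effect can be had by assigning `this.appService = new AppService(env)` once in the constructor rather than at each call site.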
| currentChunk = []; | ||
| } | ||
| } catch (error) { | ||
| this.logger.error('Error executing commands:', error); | ||
| // Stop retrying on error | ||
| break; | ||
| } | ||
| } | ||
| } | ||
|
|
||
| // Record command execution history | ||
| const failedCommands = commands.filter(cmd => !successfulCommands.includes(cmd)); | ||
|
|
||
| if (failedCommands.length > 0) { | ||
| this.broadcastError('Failed to execute commands', new Error(failedCommands.join(", "))); | ||
| } else { | ||
| this.logger.info(`All commands executed successfully: ${successfulCommands.join(", ")}`); | ||
| } | ||
|
|
||
| this.saveExecutedCommands(successfulCommands); | ||
| } |
🟡 MEDIUM - SRP violation - executeCommands handles too many concerns
Category: quality
Description:
Single method handles sanitization, chunking, retrying, AI regeneration, error handling, and history tracking.
Suggestion:
Extract into CommandExecutor class with separate methods for validation, execution, retry logic, and history management
Confidence: 85%
Rule: arch_srp_violation
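The extraction the reviewer suggests could start by pulling the pure, easily testable concerns (sanitization and chunking) out of the agent. The sketch below is hypothetical — `CommandExecutor` does not exist in the PR, and `looksLikeCommand` is re-implemented here with an assumed heuristic — but it mirrors the transformations visible in the diff (trim, strip list dashes, rewrite `npm` to `bun`, drop ` undefined` fragments, dedupe, chunk):

```typescript
// Hypothetical sketch of the suggested CommandExecutor extraction.
// Only the transformation steps come from the diff; the class shape,
// and this looksLikeCommand heuristic, are illustrative assumptions.

function looksLikeCommand(cmd: string): boolean {
  // Assumption: a shell command starts with a bare word ("bun", "npm", ...).
  return /^[a-zA-Z][\w.-]*(\s|$)/.test(cmd.trim());
}

class CommandExecutor {
  /** Split raw AI output into trimmed, deduplicated shell commands. */
  static sanitize(raw: string[]): string[] {
    const cmds = raw
      .join('\n')
      .split('\n')
      // After trim(), a leading "- " list marker is all that can remain.
      .map(c => c.trim().replace(/^-\s*/, '').replace(/^npm/, 'bun'))
      .filter(c => c !== '' && looksLikeCommand(c) && !c.includes(' undefined'));
    return Array.from(new Set(cmds));
  }

  /** Group commands into fixed-size chunks for batched execution. */
  static chunk(commands: string[], size: number): string[][] {
    const chunks: string[][] = [];
    for (let i = 0; i < commands.length; i += size) {
      chunks.push(commands.slice(i, i + size));
    }
    return chunks;
  }
}
```

With these as static, side-effect-free methods, the retry loop and history tracking can live in separate members that each do one thing, and the sanitization rules gain direct unit-test coverage.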
| public async captureScreenshot( | ||
| url: string, | ||
| viewport: { width: number; height: number } = { width: 1280, height: 720 } | ||
| ): Promise<string> { | ||
| if (!this.env.DB || !this.getAgentId()) { | ||
| const error = 'Cannot capture screenshot: DB or agentId not available'; | ||
| this.logger.warn(error); | ||
| this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_ERROR, { | ||
| error, | ||
| configurationError: true | ||
| }); | ||
| throw new Error(error); | ||
| } | ||
|
|
||
| if (!url) { | ||
| const error = 'URL is required for screenshot capture'; | ||
| this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_ERROR, { | ||
| error, | ||
| url, | ||
| viewport | ||
| }); | ||
| throw new Error(error); | ||
| } | ||
|
|
||
| this.logger.info('Capturing screenshot via REST API', { url, viewport }); | ||
|
|
||
| // Notify start of screenshot capture | ||
| this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_STARTED, { | ||
| message: `Capturing screenshot of ${url}`, | ||
| url, | ||
| viewport | ||
| }); | ||
|
|
||
| try { | ||
| // Use Cloudflare Browser Rendering REST API | ||
| const apiUrl = `https://api.cloudflare.com/client/v4/accounts/${this.env.CLOUDFLARE_ACCOUNT_ID}/browser-rendering/snapshot`; | ||
|
|
||
| const response = await fetch(apiUrl, { | ||
| method: 'POST', | ||
| headers: { | ||
| 'Authorization': `Bearer ${this.env.CLOUDFLARE_API_TOKEN}`, | ||
| 'Content-Type': 'application/json', | ||
| }, | ||
| body: JSON.stringify({ | ||
| url: url, | ||
| viewport: viewport, | ||
| gotoOptions: { | ||
| waitUntil: 'networkidle0', | ||
| timeout: 10000 | ||
| }, | ||
| screenshotOptions: { | ||
| fullPage: false, | ||
| type: 'png' | ||
| } | ||
| }), | ||
| }); | ||
|
|
||
| if (!response.ok) { | ||
| const errorText = await response.text(); | ||
| const error = `Browser Rendering API failed: ${response.status} - ${errorText}`; | ||
| this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_ERROR, { | ||
| error, | ||
| url, | ||
| viewport, | ||
| statusCode: response.status, | ||
| statusText: response.statusText | ||
| }); | ||
| throw new Error(error); | ||
| } | ||
|
|
||
| const result = await response.json() as { | ||
| success: boolean; | ||
| result: { | ||
| screenshot: string; // base64 encoded | ||
| content: string; // HTML content | ||
| }; | ||
| }; | ||
|
|
||
| if (!result.success || !result.result.screenshot) { | ||
| const error = 'Browser Rendering API succeeded but no screenshot returned'; | ||
| this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_ERROR, { | ||
| error, | ||
| url, | ||
| viewport, | ||
| apiResponse: result | ||
| }); | ||
| throw new Error(error); | ||
| } | ||
|
|
||
| // Get base64 screenshot data | ||
| const base64Screenshot = result.result.screenshot; | ||
| const screenshot: ImageAttachment = { | ||
| id: this.getAgentId(), | ||
| filename: 'latest.png', | ||
| mimeType: 'image/png', | ||
| base64Data: base64Screenshot | ||
| }; | ||
| const uploadedImage = await uploadImage(this.env, screenshot, ImageType.SCREENSHOTS); | ||
|
|
||
| // Persist in database | ||
| try { | ||
| const appService = new AppService(this.env); | ||
| await appService.updateAppScreenshot(this.getAgentId(), uploadedImage.publicUrl); | ||
| } catch (dbError) { | ||
| const error = `Database update failed: ${dbError instanceof Error ? dbError.message : 'Unknown database error'}`; | ||
| this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_ERROR, { | ||
| error, | ||
| url, | ||
| viewport, | ||
| screenshotCaptured: true, | ||
| databaseError: true | ||
| }); | ||
| throw new Error(error); | ||
| } | ||
|
|
||
| this.logger.info('Screenshot captured and stored successfully', { | ||
| url, | ||
| storage: uploadedImage.publicUrl.startsWith('data:') ? 'database' : (uploadedImage.publicUrl.includes('/api/screenshots/') ? 'r2' : 'images'), | ||
| length: base64Screenshot.length | ||
| }); | ||
|
|
||
| // Notify successful screenshot capture | ||
| this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_SUCCESS, { | ||
| message: `Successfully captured screenshot of ${url}`, | ||
| url, | ||
| viewport, | ||
| screenshotSize: base64Screenshot.length, | ||
| timestamp: new Date().toISOString() | ||
| }); | ||
|
|
||
| return uploadedImage.publicUrl; | ||
|
|
||
| } catch (error) { | ||
| this.logger.error('Failed to capture screenshot via REST API:', error); | ||
|
|
||
| // Only broadcast if error wasn't already broadcast above | ||
| const errorMessage = error instanceof Error ? error.message : 'Unknown error'; | ||
| if (!errorMessage.includes('Browser Rendering API') && !errorMessage.includes('Database update failed')) { | ||
| this.broadcast(WebSocketMessageResponses.SCREENSHOT_CAPTURE_ERROR, { | ||
| error: errorMessage, | ||
| url, | ||
| viewport | ||
| }); | ||
| } | ||
|
|
||
| throw new Error(`Screenshot capture failed: ${error instanceof Error ? error.message : 'Unknown error'}`); | ||
| } |
🟡 MEDIUM - SRP violation - captureScreenshot handles API calls and storage
Category: quality
Description:
Method mixes screenshot capture API logic with image upload and database persistence.
Suggestion:
Extract into ScreenshotService with separate methods for capture, upload, and persistence
Confidence: 80%
Rule: arch_srp_violation
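One way to realize the suggested `ScreenshotService` split is to inject each concern behind a small interface. Everything below is an illustrative assumption — the interface names and method signatures are not from the PR; only the orchestration order (capture, then upload, then persist) reflects what `captureScreenshot` does in the diff:

```typescript
// Hypothetical shape for the suggested ScreenshotService extraction.
// Collaborator interfaces are assumptions for illustration.

interface BrowserRenderer {
  // Returns a base64-encoded PNG, e.g. via the Browser Rendering REST API.
  snapshot(url: string, viewport: { width: number; height: number }): Promise<string>;
}
interface ImageStore {
  // Uploads the image and returns its public URL.
  upload(base64Data: string): Promise<string>;
}
interface AppRepository {
  updateScreenshot(agentId: string, publicUrl: string): Promise<void>;
}

class ScreenshotService {
  constructor(
    private renderer: BrowserRenderer,
    private store: ImageStore,
    private repo: AppRepository,
  ) {}

  // Each step is injectable, so capture, upload, and persistence can be
  // tested (and their errors broadcast) independently of the agent class.
  async captureAndPersist(
    agentId: string,
    url: string,
    viewport = { width: 1280, height: 720 },
  ): Promise<string> {
    const base64 = await this.renderer.snapshot(url, viewport);
    const publicUrl = await this.store.upload(base64);
    await this.repo.updateScreenshot(agentId, publicUrl);
    return publicUrl;
  }
}
```

The agent would then keep only the WebSocket broadcasting around a single `captureAndPersist` call, and each collaborator can be stubbed in unit tests.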
|
Summary
This PR refactors the agent architecture to separate behavior logic from infrastructure, enabling support for multiple agent behaviors (phasic and agentic modes).
Changes
Core Architecture
- `simpleGeneratorAgent.ts` → `baseAgent.ts` with extracted base behavior class
- `AgentInfrastructure<TState>` interface to decouple behavior from Durable Objects
- `BaseAgentBehavior<TState>` abstract class with common agent functionality
- `worker/agents/core/phasic/behavior.ts` (852 lines) - phasic behavior implementation
- `smartGeneratorAgent.ts` - thin wrapper implementing `AgentInfrastructure`

State Management
- `state.ts` - split state into `BaseProjectState`, `PhasicState`, `AgenticState`
- `types.ts` - added `BehaviorType`, generic `AgentInitArgs<TState>`
- Replaced `agentMode: 'deterministic' | 'smart'` with `behaviorType: 'phasic' | 'agentic'`

Operations & Services
- Updated operations under `worker/agents/` to use the `ICodingAgent` interface
- `ScreenshotAnalysisOperation.ts` - screenshot handling moved/removed
- Operations depend on `ICodingAgent` instead of the concrete class
- `PhaseImplementation.ts` - simplified, moved logic to behavior
- `GenerationContext.ts` - added `PhasicGenerationContext` variant

Interface Changes
- `ICodingAgent.ts` - formalized interface for all agent behaviors

Motivation
The previous architecture tightly coupled agent behavior to Durable Objects infrastructure, making it difficult to:
- Support multiple agent behaviors (phasic and agentic modes)
- Test behavior logic independently of the infrastructure

This refactoring enables:
- Mocking `AgentInfrastructure` for unit tests

Testing
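The "`AgentInfrastructure` for unit tests" point can be sketched as follows. The interface members shown here are a minimal assumption — the PR does not list `AgentInfrastructure`'s actual methods — but they illustrate how a behavior could run against an in-memory stub instead of a Durable Object:

```typescript
// Illustrative sketch only: the real AgentInfrastructure<TState> surface
// is defined in the PR and is assumed here to include state access and
// broadcasting, since both appear throughout the diff.

interface AgentInfrastructure<TState> {
  getState(): TState;
  setState(s: TState): void;
  broadcast(msg: string): void;
}

// In-memory stub for unit tests: records broadcasts, holds state locally.
class StubInfra<TState> implements AgentInfrastructure<TState> {
  public messages: string[] = [];
  constructor(private state: TState) {}
  getState(): TState { return this.state; }
  setState(s: TState): void { this.state = s; }
  broadcast(msg: string): void { this.messages.push(msg); }
}
```

A behavior under test would receive a `StubInfra` instance, and assertions can then inspect the recorded state transitions and broadcast messages without any Durable Object runtime.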
Manual Testing:
Areas Requiring Extra Attention:
- State migration (old `agentMode` → new `behaviorType`)
State Schema:
agentMode: 'deterministic' | 'smart'behaviorType: 'phasic' | 'agentic'Impact: Existing Durable Object sessions may need migration logic or will default to phasic mode.
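The migration logic the impact note mentions could look like the sketch below. The field and union names come from the PR description; the mapping of `'smart'` to `'agentic'` and the function itself are assumptions for illustration, with the phasic default the description specifies:

```typescript
// Hypothetical migration helper: maps the legacy agentMode field onto the
// new behaviorType discriminant, defaulting to 'phasic' as the PR states.

type LegacyState = { agentMode?: 'deterministic' | 'smart' };
type BehaviorType = 'phasic' | 'agentic';

function migrateBehaviorType(
  state: LegacyState & { behaviorType?: BehaviorType },
): BehaviorType {
  if (state.behaviorType) return state.behaviorType; // already migrated
  if (state.agentMode === 'smart') return 'agentic'; // assumption: smart -> agentic
  return 'phasic';                                   // deterministic/missing -> default
}
```

Running this once when a Durable Object session is rehydrated would let existing sessions continue without a schema error.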
Related Issues
- `ScreenshotAnalysisOperation` removal

This PR description was automatically generated by Claude Code