Notifications
Project Overview And Goals
Shubham edited this page Aug 25, 2024
This project was developed during GSoC 2024 and involved the creation of a Rocket.Chat app called LLM Prompt Editor. The app integrates Golem, a prompt editor, with Rocket.Chat's in-house AI models to provide advanced prompt editing and AI inference capabilities directly within Rocket.Chat. The project's main goals were:
- Create a Prompt Editor: Develop an LLM (Large Language Model) prompt editor using Golem.
- Decouple Inference Functionality: Replace Golem's existing OpenAI API inference with Rocket.Chat’s in-house AI models.
- Support Streaming: Implement concurrent streaming of responses across multiple devices.
- Integrate with Rocket.Chat: Bundle the Golem editor and serve it through a Rocket.Chat app.
- API Endpoints: Develop API endpoints in the app to serve the Golem build.
- Data Persistence: Store chat history in Rocket.Chat's MongoDB for future inference.
- Conversation History API: Provide an endpoint for Golem's frontend to request conversation history.
- App Commands: Create Rocket.Chat app commands to initiate workflows from the main Rocket.Chat instance.
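The data-persistence and conversation-history goals above can be sketched together. This is an illustrative, in-memory stand-in for the MongoDB-backed storage; the names `ChatMessage` and `HistoryStore` are hypothetical, not the app's actual API.

```typescript
// Hypothetical sketch: an in-memory stand-in for the chat history
// that the app persists in Rocket.Chat's MongoDB. Illustrative only.
interface ChatMessage {
  roomId: string;
  role: "user" | "assistant";
  text: string;
  sentAt: number;
}

class HistoryStore {
  private messages: ChatMessage[] = [];

  // Persist one message (the real app writes to MongoDB instead).
  save(msg: ChatMessage): void {
    this.messages.push(msg);
  }

  // Back the conversation-history endpoint: return a room's messages
  // in chronological order so Golem's frontend can re-render the chat.
  history(roomId: string): ChatMessage[] {
    return this.messages
      .filter((m) => m.roomId === roomId)
      .sort((a, b) => a.sentAt - b.sentAt);
  }
}

const store = new HistoryStore();
store.save({ roomId: "r1", role: "user", text: "Hello", sentAt: 1 });
store.save({ roomId: "r1", role: "assistant", text: "Hi!", sentAt: 2 });
console.log(store.history("r1").map((m) => m.text).join(","));
```

In the real app, the endpoint handler would query MongoDB by room ID and return the same chronological shape to Golem's frontend.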
Decoupling Golem's behavior from its OpenAI API inference was complex because of the many frontend hooks tied to it; fetching and rendering conversation history ran into similar hook-related issues.
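One way to picture the decoupling described above: the call to the external API is replaced by a pluggable inference function, so the in-house models can be swapped in without touching the UI, and streamed tokens can be fanned out to several devices at once. All names here (`InferenceFn`, `broadcast`, the echo model) are illustrative assumptions, not the project's actual code.

```typescript
// Hypothetical sketch of decoupled, streamable inference.
type OnToken = (token: string) => void;

// The only contract the frontend depends on after decoupling.
type InferenceFn = (prompt: string, onToken: OnToken) => Promise<string>;

// Stand-in for an in-house model that streams its answer token by token.
const inHouseInference: InferenceFn = async (prompt, onToken) => {
  const tokens = ["Echo:", " ", prompt];
  for (const t of tokens) {
    onToken(t); // each listener receives the chunk as it is produced
  }
  return tokens.join("");
};

// Fan one token stream out to several listeners, mirroring the
// concurrent-streaming-across-devices goal above.
function broadcast(listeners: OnToken[]): OnToken {
  return (token) => listeners.forEach((l) => l(token));
}

const deviceA: string[] = [];
const deviceB: string[] = [];
inHouseInference(
  "hi",
  broadcast([(t) => deviceA.push(t), (t) => deviceB.push(t)]),
);
```

Because the UI only knows about `InferenceFn`, swapping OpenAI for an in-house model is a one-line change at the call site rather than a rewrite of every hook.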
Golem uses Vite for bundling, which by default splits the JavaScript output into multiple chunks. Because the build had to be served through the app's API endpoints, it was crucial to minimize the number of output files, which required significant build-configuration changes.
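A minimal sketch of how such a single-bundle build can be configured; the project's actual `vite.config.ts` may differ, but these are standard Vite/Rollup options for suppressing chunk splitting.

```typescript
// Hypothetical vite.config.ts forcing the build into as few files
// as possible, so the Rocket.Chat app can serve it easily.
import { defineConfig } from "vite";

export default defineConfig({
  build: {
    cssCodeSplit: false, // emit all CSS into a single file
    rollupOptions: {
      output: {
        // Inline dynamic imports so code-splitting does not create
        // extra chunks; the JS collapses into one bundle.
        inlineDynamicImports: true,
      },
    },
  },
});
```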