
Project Overview And Goals

Shubham edited this page Aug 25, 2024 · 1 revision

Overview

This project was developed during GSoC 2024 and involved the creation of a Rocket.Chat app called LLM Prompt Editor. The app integrates Golem, a prompt editor, with Rocket.Chat's in-house AI models to provide advanced prompt editing and AI inference capabilities directly within Rocket.Chat.

Goals

  1. Create a Prompt Editor: Develop an LLM (Large Language Model) prompt editor using Golem.
  2. Decouple Inference Functionality: Replace Golem's existing OpenAI inference with Rocket.Chat’s in-house AI models.
  3. Support Streaming: Implement concurrent streaming of responses across multiple devices.
  4. Integrate with Rocket.Chat: Bundle the Golem editor and serve it through a Rocket.Chat app.
  5. API Endpoints: Develop API endpoints in the app to serve the Golem build.
  6. Data Persistence: Store chat history in Rocket.Chat's MongoDB for future inference.
  7. Conversation History API: Provide an endpoint for Golem's frontend to request conversation history.
  8. App Commands: Create Rocket.Chat app commands to initiate workflows from the main Rocket.Chat instance.
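Goals 6 and 7 center on persisting chat history and serving it back to Golem's frontend. A minimal sketch of that idea is below; in the real app the data would live in Rocket.Chat's MongoDB via the Apps-Engine persistence API, so the `ConversationStore` class and its in-memory `Map` here are purely illustrative stand-ins:

```typescript
// Illustrative sketch only: the actual app persists history in
// Rocket.Chat's MongoDB; this in-memory Map stands in for it.
export interface ChatMessage {
  role: "user" | "assistant";
  content: string;
}

export class ConversationStore {
  private conversations = new Map<string, ChatMessage[]>();

  // Persist one message under a conversation id (goal 6).
  append(conversationId: string, message: ChatMessage): void {
    const history = this.conversations.get(conversationId) ?? [];
    history.push(message);
    this.conversations.set(conversationId, history);
  }

  // Return prior messages so the frontend can render them and the
  // model can be given conversational context (goal 7).
  history(conversationId: string): ChatMessage[] {
    return this.conversations.get(conversationId) ?? [];
  }
}
```

The conversation-history endpoint (goal 7) would simply look up the requesting room's id in storage and return the message list as JSON.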

Challenges Faced

1. Decoupling Inference

Decoupling Golem's behavior from its OpenAI-based inference was complex because the inference logic was spread across numerous interdependent hooks rather than a single integration point.
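One way to picture the decoupling is to hide the model behind a small provider interface, so the hooks call an abstraction rather than the OpenAI client directly. The sketch below is hedged: `InferenceProvider`, `InHouseProvider`, and `collect` are illustrative names, not the project's actual identifiers, and the fake chunked output merely stands in for a streamed model response:

```typescript
// Hypothetical abstraction over the inference backend. Streaming is
// modelled as an async iterable of response chunks, which also maps
// onto the project's streaming goal.
export interface InferenceProvider {
  complete(prompt: string): AsyncIterable<string>;
}

// Stand-in for Rocket.Chat's in-house model endpoint. A real
// implementation would POST to the model server and stream the body;
// here we fake two chunks for illustration.
export class InHouseProvider implements InferenceProvider {
  async *complete(prompt: string): AsyncIterable<string> {
    yield "Echo: ";
    yield prompt;
  }
}

// Helper that drains a streamed response into a single string.
export async function collect(stream: AsyncIterable<string>): Promise<string> {
  let out = "";
  for await (const chunk of stream) {
    out += chunk;
  }
  return out;
}
```

With an interface like this in place, swapping OpenAI for the in-house models becomes a matter of providing a different implementation, rather than rewriting each hook.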

2. Fetching and Rendering Data

Fetching and rendering conversation data was similarly challenging, since the data flow ran through the same web of hooks.

3. Vite Packaging Issues

Golem used Vite for bundling, which by default code-splits JavaScript into multiple chunks. Serving the build through the Rocket.Chat app made it crucial to minimize the number of output files, which required significant configuration adjustments.
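A minimal sketch of how Vite (via Rollup's output options) can be told to emit a single JavaScript bundle is shown below. This is not the project's actual configuration, just an illustration of the general approach for a single-entry build:

```typescript
// vite.config.ts — illustrative sketch, not the project's real config.
import { defineConfig } from "vite";

export default defineConfig({
  build: {
    rollupOptions: {
      output: {
        // Inline dynamic imports so Rollup emits one JS chunk
        // instead of code-splitting into multiple files.
        inlineDynamicImports: true,
        // Fixed, hash-free file names are easier to serve from
        // static app API endpoints.
        entryFileNames: "assets/[name].js",
        assetFileNames: "assets/[name][extname]",
      },
    },
  },
});
```

Note that `inlineDynamicImports` only works with a single entry point, and CSS is still emitted as a separate asset unless inlined by other means.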