Merge pull request #1 from llm-agents-php/feature/laravel-support
Adds Laravel support
butschster authored Sep 1, 2024
2 parents b846992 + a1aa99a commit fbc3e44
Showing 4 changed files with 91 additions and 28 deletions.
51 changes: 41 additions & 10 deletions README.md
@@ -6,7 +6,7 @@

This package is your go-to solution for integrating OpenAI's powerful API into your LLM Agents projects.

## What's in the box?

- Easy setup with Spiral framework
- Smooth integration with OpenAI's API
@@ -22,13 +22,15 @@ composer require llm-agents/openai-client

2. That's it! You're ready to roll.

-## Setting it up 🔧
+### Setting it up in Spiral

-To get the OpenAI client up and running in your Spiral app, you need to register the bootloader. Here's how:
+To get the OpenAI client up and running in your Spiral app, you need to register the bootloader.
+
+**Here's how:**

1. Open up your `app/src/Application/Kernel.php` file.

-2. In your `Kernel` class, find or create the `defineBootloaders()` method and add the `OpenAIClientBootloader`:
+2. In your `Kernel` class, add the `LLM\Agents\OpenAI\Client\Integration\Spiral\OpenAIClientBootloader` bootloader:

```php
class Kernel extends \Spiral\Framework\Kernel
@@ -37,22 +39,51 @@
{
return [
// ... other bootloaders ...
-            \LLM\Agents\OpenAI\Client\Bootloader\OpenAIClientBootloader::class,
+            \LLM\Agents\OpenAI\Client\Integration\Spiral\OpenAIClientBootloader::class,
];
}
}
```

## Configuration ⚙️

-The package uses your OpenAI API key and organization (if you have one) to authenticate. Set these up in your `.env`
-file:
+The package uses your OpenAI API key and organization (if you have one) to authenticate.
+
+Set these up in your `.env` file:

```
OPENAI_KEY=your_api_key_here
```
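With the bootloader registered and `OPENAI_KEY` set, Spiral's container resolves `LLMInterface` to this package's OpenAI-backed `LLM` class, so the client can be constructor-injected. A minimal sketch of a consumer — the `generate()` call and its arguments here are illustrative assumptions, not the package's documented API; consult `llm-agents/agents` for the actual `LLMInterface` contract:

```php
<?php

declare(strict_types=1);

namespace App\Service;

use LLM\Agents\LLM\LLMInterface;

// Hypothetical consumer class: Spiral's container injects the
// LLMInterface binding provided by OpenAIClientBootloader.
final class SummaryService
{
    public function __construct(
        private readonly LLMInterface $llm,
    ) {}

    // NOTE: generate() and its signature are assumptions for
    // illustration only — check the llm-agents/agents package
    // for the real LLMInterface methods.
    public function summarize(string $text): string
    {
        return $this->llm->generate(prompt: 'Summarize: ' . $text);
    }
}
```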

### Setting it up in Laravel

If you're using the Laravel framework, you'll need to install the `openai-php/laravel` package and register the service
provider.

**Here's how:**

1. Install the `openai-php/laravel` package:

```bash
composer require openai-php/laravel
```

2. Next, execute the install command:

```bash
php artisan openai:install
```

3. Add your OpenAI API key and (optionally) your organization to your `.env` file:

```
OPENAI_API_KEY=sk-...
OPENAI_ORGANIZATION=org-...
```

4. Finally, register the `LLM\Agents\OpenAI\Client\Integration\Laravel\OpenAIClientServiceProvider` service provider in your application.

And that's it! The service provider will take care of registering the `LLMInterface` for you.
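The `extra.laravel.providers` entry in this commit's `composer.json` should let Laravel's package auto-discovery register the provider automatically; if you've disabled discovery, register it by hand. A sketch for Laravel 11's `bootstrap/providers.php` (on Laravel 10 and earlier, add the class to the `providers` array in `config/app.php` instead; the `AppServiceProvider` line is just the default scaffold):

```php
<?php

// bootstrap/providers.php (Laravel 11+)
return [
    App\Providers\AppServiceProvider::class,
    LLM\Agents\OpenAI\Client\Integration\Laravel\OpenAIClientServiceProvider::class,
];
```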

## Contributing

We're always happy to get help making this package even better! Here's how you can chip in:

@@ -62,7 +93,7 @@

Please make sure your code follows PSR-12 coding standards and include tests for any new features.

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

12 changes: 10 additions & 2 deletions composer.json
@@ -6,11 +6,12 @@
    "php": "^8.3",
    "openai-php/client": "^0.10.1",
    "llm-agents/agents": "^1.0",
-   "spiral/boot": "^3.13",
    "guzzlehttp/guzzle": "^7.0"
},
"require-dev": {
-   "phpunit/phpunit": "^11.3"
+   "phpunit/phpunit": "^11.3",
+   "spiral/boot": "^3.13",
+   "illuminate/support": "^11.0"
},
"autoload": {
"psr-4": {
@@ -25,6 +26,13 @@
"config": {
"sort-packages": true
},
"extra": {
"laravel": {
"providers": [
"LLM\\Agents\\OpenAI\\Client\\Integration\\Laravel\\OpenAIClientServiceProvider"
]
}
},
"minimum-stability": "dev",
"prefer-stable": true
}
39 changes: 39 additions & 0 deletions src/Integration/Laravel/OpenAIClientServiceProvider.php
@@ -0,0 +1,39 @@
<?php

declare(strict_types=1);

namespace LLM\Agents\OpenAI\Client\Integration\Laravel;

use Illuminate\Contracts\Foundation\Application;
use Illuminate\Support\ServiceProvider;
use LLM\Agents\LLM\LLMInterface;
use LLM\Agents\OpenAI\Client\LLM;
use LLM\Agents\OpenAI\Client\Parsers\ChatResponseParser;
use LLM\Agents\OpenAI\Client\StreamResponseParser;
use OpenAI\Responses\Chat\CreateStreamedResponse;

final class OpenAIClientServiceProvider extends ServiceProvider
{
public function register(): void
{
$this->app->singleton(
LLMInterface::class,
LLM::class,
);

$this->app->singleton(
StreamResponseParser::class,
static function (Application $app): StreamResponseParser {
$parser = new StreamResponseParser();

// Register parsers here
$parser->registerParser(
CreateStreamedResponse::class,
$app->make(ChatResponseParser::class),
);

return $parser;
},
);
}
}
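Both bindings above are container singletons, so resolving `StreamResponseParser` anywhere in the app returns the same instance with the chat parser already registered. A hedged usage sketch (assumes the provider is registered and the container is booted):

```php
<?php

use LLM\Agents\LLM\LLMInterface;
use LLM\Agents\OpenAI\Client\StreamResponseParser;

// Resolve the shared instances from Laravel's container.
$llm = app(LLMInterface::class);            // OpenAI-backed LLM implementation
$parser = app(StreamResponseParser::class); // parser with ChatResponseParser registered

// Singleton bindings mean repeated resolution yields the same object.
assert(app(StreamResponseParser::class) === $parser);
```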
@@ -2,17 +2,14 @@

declare(strict_types=1);

-namespace LLM\Agents\OpenAI\Client\Bootloader;
+namespace LLM\Agents\OpenAI\Client\Integration\Spiral;

-use GuzzleHttp\Client as HttpClient;
use LLM\Agents\LLM\LLMInterface;
use LLM\Agents\OpenAI\Client\LLM;
use LLM\Agents\OpenAI\Client\Parsers\ChatResponseParser;
use LLM\Agents\OpenAI\Client\StreamResponseParser;
-use OpenAI\Contracts\ClientContract;
use OpenAI\Responses\Chat\CreateStreamedResponse;
use Spiral\Boot\Bootloader\Bootloader;
-use Spiral\Boot\EnvironmentInterface;

final class OpenAIClientBootloader extends Bootloader
{
@@ -21,18 +18,6 @@ public function defineSingletons(): array
return [
LLMInterface::class => LLM::class,

-ClientContract::class => static fn(
-    EnvironmentInterface $env,
-): ClientContract => \OpenAI::factory()
-    ->withApiKey($env->get('OPENAI_KEY'))
-    ->withHttpHeader('OpenAI-Beta', 'assistants=v1')
-    ->withHttpClient(
-        new HttpClient([
-            'timeout' => (int) $env->get('OPENAI_HTTP_CLIENT_TIMEOUT', 2 * 60),
-        ]),
-    )
-    ->make(),
-
StreamResponseParser::class => static function (
ChatResponseParser $chatResponseParser,
): StreamResponseParser {
