<h1 align="center">
- Transformers PHP
+ TransformersPHP
</h1>

<h3 align="center">
<p>State-of-the-art Machine Learning for PHP</p>
</h3>

- Transformers PHP is designed to be functionally equivalent to the Python library, while still maintaining the same level
+ TransformersPHP is designed to be functionally equivalent to the Python library, while still maintaining the same level
of performance and ease of use. This library is built on top of Hugging Face's Transformers library, which provides
thousands of pre-trained models in 100+ languages. It is designed to be a simple and easy-to-use library for PHP
developers, using a similar API to the Python library. These models can be used for a variety of tasks, including text
generation, summarization, translation, and more.

- Transformers PHP uses [ONNX Runtime](https://onnxruntime.ai/) to run the models, which is a high-performance scoring
+ TransformersPHP uses [ONNX Runtime](https://onnxruntime.ai/) to run the models, which is a high-performance scoring
engine for Open Neural Network Exchange (ONNX) models. You can easily convert any PyTorch or TensorFlow model to ONNX
- and use it with Transformers PHP using [🤗 Optimum](https://github.com/huggingface/optimum#onnx--onnx-runtime).
+ and use it with TransformersPHP using [🤗 Optimum](https://github.com/huggingface/optimum#onnx--onnx-runtime).

To learn more about the library and how it works, head over to
our [extensive documentation](https://codewithkyrian.github.io/transformers-php/introduction).

## Quick tour

- Because Transformers PHP is designed to be functionally equivalent to the Python library, it's super easy to learn from
+ Because TransformersPHP is designed to be functionally equivalent to the Python library, it's super easy to learn from
existing Python or JavaScript code. We provide the `pipeline` API, which is a high-level, easy-to-use API that groups
together a model with its necessary preprocessing and postprocessing steps.
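As a minimal sketch of what the `pipeline` API looks like in PHP (assuming the package is installed via Composer; the input text is illustrative, and the task's default model is used when none is specified):

```php
<?php

require 'vendor/autoload.php';

use function Codewithkyrian\Transformers\Pipelines\pipeline;

// Allocate a pipeline for sentiment analysis. The model is downloaded
// on first use, then cached locally for subsequent calls.
$classifier = pipeline('sentiment-analysis');

// Run inference on a piece of text.
$result = $classifier('TransformersPHP is amazing!');
// e.g. ['label' => 'POSITIVE', 'score' => 0.99...]
```

As in the Python library, the first argument selects the task, and preprocessing (tokenization) and postprocessing happen inside the pipeline.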
@@ -109,7 +109,7 @@ Next, you must run the installation/initialize command to download the shared li

## PHP FFI Extension

- Transformers PHP uses the PHP FFI extension to interact with the ONNX runtime. The FFI extension is included by default
+ TransformersPHP uses the PHP FFI extension to interact with the ONNX runtime. The FFI extension is included by default
in PHP 7.4 and later, but it may not be enabled by default. If the FFI extension is not enabled, you can enable it by
uncommenting (removing the `;` from the beginning of the line) the
following line in your `php.ini` file:
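For reference, the standard FFI directive as shipped (commented out) in most `php.ini` distributions looks like this:

```ini
; Remove the leading semicolon to enable the FFI extension
extension=ffi
```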
@@ -133,13 +133,13 @@ documentation : [https://codewithkyrian.github.io/transformers-php](https://code

## Usage

- By default, Transformers PHP uses hosted pretrained ONNX models. For supported tasks, models that have been converted to
+ By default, TransformersPHP uses hosted pretrained ONNX models. For supported tasks, models that have been converted to
work with [Xenova's Transformers.js](https://huggingface.co/models?library=transformers.js) on HuggingFace should work
- out of the box with Transformers PHP.
+ out of the box with TransformersPHP.

## Configuration

- You can configure the behaviour of the Transformers PHP library as follows:
+ You can configure the behaviour of the TransformersPHP library as follows:

```php
use Codewithkyrian\Transformers\Transformers;
@@ -159,16 +159,16 @@ the [documentation](https://codewithkyrian.github.io/transformers-php/configurat

## Convert your models to ONNX

- Transformers PHP only works with ONNX models; therefore, you must convert your PyTorch, TensorFlow or JAX models to
+ TransformersPHP only works with ONNX models; therefore, you must convert your PyTorch, TensorFlow or JAX models to
ONNX. It is recommended to use [🤗 Optimum](https://huggingface.co/docs/optimum) to perform the conversion and
quantization of your model.
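As a sketch, the conversion can be done from the command line with Optimum's ONNX exporter (this assumes a working Python environment; the model name is just an example):

```shell
# Install Optimum with ONNX export support
pip install "optimum[exporters]"

# Export a Hugging Face model to ONNX into the onnx_output/ directory
optimum-cli export onnx --model distilbert-base-uncased-finetuned-sst-2-english onnx_output/
```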

## Pre-Download Models

- By default, Transformers PHP automatically retrieves model weights (ONNX format) from the Hugging Face model hub when
+ By default, TransformersPHP automatically retrieves model weights (ONNX format) from the Hugging Face model hub when
you first use a pipeline or pretrained model. This can lead to a slight delay during the initial use. To improve the
user experience, it's recommended to pre-download the models you intend to use before running them in your PHP
- application, especially for larger models. One way to do that is to run the request once manually, but Transformers PHP
+ application, especially for larger models. One way to do that is to run the request once manually, but TransformersPHP
also comes with a command line tool to help you do just that:

```bash
@@ -195,7 +195,7 @@ Explanation of Arguments:

## Supported tasks/models

- This package is a WIP, but here's a list of tasks and architectures currently tested and supported by Transformers PHP.
+ This package is a WIP, but here's a list of tasks and architectures currently tested and supported by TransformersPHP.

### Tasks