
Releases: CodeWithKyrian/transformers-php

v0.3.0

13 Apr 23:58
Pre-release

What's Changed

Breaking Changes

  • The install command no longer exists, as the required libraries are downloaded automatically on composer install.
  • A new image driver configuration setting has been added; it requires either GD, Imagick, or Vips (see the sketch below).
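
Because the available imaging extension differs between environments, the driver now has to be chosen explicitly. The snippet below is a minimal sketch of what that configuration might look like; the Transformers::setup(), setImageDriver(), and ImageDriver names are assumptions about the library's setup API, so refer to the documentation for the definitive syntax.

<?php

use Codewithkyrian\Transformers\Transformers;
use Codewithkyrian\Transformers\Utils\ImageDriver;

// Assumed configuration API (names are illustrative): select whichever imaging
// extension is installed locally, i.e. GD, Imagick, or Vips.
Transformers::setup()
    ->setImageDriver(ImageDriver::GD)
    ->apply();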

Full Changelog: 0.2.2...0.3.0

v0.2.2

25 Mar 20:03
Pre-release

What's new

  • bugfix: Fix the wrong argument being passed in AutoTokenizer by @CodeWithKyrian in 05e5588
  • feat: Cache tokenizer output to improve speed in repetitive tasks, yielding a ~75% speed improvement (11.7687s down to 2.9687s) by @CodeWithKyrian in b115c28

Full Changelog: 0.2.1...0.2.2

v0.2.1

22 Mar 06:16
Pre-release

What's Changed

  • bugfix: Add symfony/console explicitly as a dependency by @CodeWithKyrian in #7
  • bugfix: Autoload errors for WordPieceTokenizer on case-sensitive operating systems in 0f1fc8b

Full Changelog: 0.2.0...0.2.1

v0.2.0

21 Mar 11:51
Pre-release

What's Changed

  • feat: Add ability to use chat templates in Text Generation by @CodeWithKyrian in #1 (see the sketch after this list)
  • bugfix: Autoload errors for PretrainedModel on case-sensitive operating systems by @CodeWithKyrian in #4
  • feat: Bump OnnxRuntime PHP to 0.2.0 in b333162
  • feat: Improve download and install command interfaces to show progress bar in b333162
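
As a rough illustration of the chat template feature, the sketch below assumes the text-generation pipeline accepts an array of role/content messages and applies the model's chat template, mirroring Transformers.js; the model id and the maxNewTokens option name are illustrative, not confirmed API.

<?php

use function Codewithkyrian\Transformers\Pipelines\pipeline;

// Hypothetical chat-style generation: the model id and option name are examples only.
$generator = pipeline('text-generation', 'Xenova/Qwen1.5-0.5B-Chat');

$messages = [
    ['role' => 'system', 'content' => 'You are a helpful assistant.'],
    ['role' => 'user', 'content' => 'Write a one-line summary of ONNX Runtime.'],
];

$output = $generator($messages, maxNewTokens: 128);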

Full Changelog: 0.1.0...0.2.0

v0.1.0

15 Mar 08:13
Pre-release

Initial Release 🎉

We are thrilled to announce the launch of Transformers PHP, a groundbreaking library that brings the power of state-of-the-art machine learning to the PHP community. Inspired by Hugging Face Transformers and Xenova's Transformers.js, Transformers PHP aims to provide an easy-to-use, high-performance toolset for developers looking to integrate advanced NLP capabilities (and, in future updates, potentially more) into their PHP applications.

Key Features:

  • Seamless Integration: Designed to be functionally equivalent to its Python counterpart, making the transition and usage straightforward for developers familiar with the original Transformers library.
  • Performance Optimized: Utilizes ONNX Runtime for efficient model inference, ensuring high performance even in demanding scenarios.
  • Comprehensive Model Support: Access to thousands of pre-trained models across 100+ languages, covering a wide range of tasks including text generation, summarization, translation, sentiment analysis, and more.
  • Easy Model Conversion: With 🤗 Optimum, easily convert PyTorch or TensorFlow models to ONNX format for use with Transformers PHP (see the example after this list).
  • Developer Friendly: From installation to deployment, every aspect of Transformers PHP is designed with ease of use in mind, featuring extensive documentation and a streamlined API.
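
For instance, a conversion with the Optimum CLI might look like the command below; the model id and output directory are placeholders, and the exact flags depend on your Optimum version:

optimum-cli export onnx --model distilbert-base-uncased onnx_output/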

Getting Started:

Installation is a breeze with Composer:

composer require codewithkyrian/transformers

Then initialize the library to download the necessary ONNX Runtime libraries:

./vendor/bin/transformers install

Check out the Documentation

For a comprehensive guide on how to use Transformers PHP, including detailed examples and configuration options, visit our documentation.
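
As a quick taste of the API, a minimal sentiment-analysis pipeline might look like the sketch below; the pipeline() helper and its namespace follow the documentation, while the input text and the output shape shown in the comment are illustrative.

<?php

use function Codewithkyrian\Transformers\Pipelines\pipeline;

// Minimal pipeline sketch: the default sentiment-analysis model is fetched on first use.
$classifier = pipeline('sentiment-analysis');

$result = $classifier('Transformers PHP brings state-of-the-art ML to PHP!');
// Illustrative output: ['label' => 'POSITIVE', 'score' => 0.99...]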

Pre-Download Models:

To ensure a smooth user experience, especially with larger models, we recommend pre-downloading models before deployment. Transformers PHP includes a handy CLI tool for this purpose:

./vendor/bin/transformers download <model_identifier>
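
For example, to fetch a specific model ahead of deployment (the model identifier below is just an illustration):

./vendor/bin/transformers download Xenova/bert-base-uncased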

What's Next?

This initial release lays the groundwork for a versatile machine learning toolkit within the PHP ecosystem. We are committed to continuous improvement and expansion of Transformers PHP, with future updates aimed at increasing supported tasks, enhancing functionality, and broadening the scope of models.

Get Involved!

We encourage feedback, contributions, and discussions from the community. Whether you're reporting bugs, requesting features, or contributing code, your input is invaluable in making Transformers PHP better for everyone.

Acknowledgments:

A huge thank you to Hugging Face for their incredible work on the Transformers library, to Xenova for inspiring this package, and to the broader machine learning community for their ongoing research and contributions. Transformers PHP stands on the shoulders of giants, and we are excited to see how it will empower PHP developers to push the boundaries of what's possible.