From 295ec16cb396f79132d40a2748c9f0697c33dc66 Mon Sep 17 00:00:00 2001 From: Kaito Sugimoto Date: Sun, 8 Dec 2024 17:34:11 +0900 Subject: [PATCH] add LLM-jp-3 VILA 14B --- README.md | 3 ++- en/README.md | 3 ++- fr/README.md | 5 +++-- 3 files changed, 7 insertions(+), 4 deletions(-) diff --git a/README.md b/README.md index 1046397..1d63731 100644 --- a/README.md +++ b/README.md @@ -322,9 +322,10 @@ **汎用** -| | アーキテクチャ | 学習画像/テキスト | 開発元 | ライセンス | +| | アーキテクチャ | 学習画像/テキスト | 開発元 | ライセンス / 利用規約 | |:---|:---:|:---:|:---:|:---:| | [llava-calm2-siglip](https://www.cyberagent.co.jp/news/detail/id=30344)
([llava-calm2-siglip](https://huggingface.co/cyberagent/llava-calm2-siglip)) | LLaVA-1.5 | MS-COCO と VisualGenome から生成された対話データ | サイバーエージェント | Apache 2.0 | +| [LLM-jp-3 VILA 14B](https://llmc.nii.ac.jp/topics/llm-jp-3-vila-14b/)
([14b](https://huggingface.co/llm-jp/llm-jp-3-vila-14b)) | LLaVA-1.5 | [Japanese image text pairs](https://gitlab.llm-jp.nii.ac.jp/datasets/llm-jp-japanese-image-text-pairs), LLaVA-Pretrain, [Japanese interleaved data](https://gitlab.llm-jp.nii.ac.jp/datasets/llm-jp-japanese-interleaved-data), coyo (subset), mmc4-core (subset), [llava-instruct-ja](https://huggingface.co/datasets/llm-jp/llava-instruct-ja), [japanese-photos-conv](https://huggingface.co/datasets/llm-jp/japanese-photos-conversation), ja-vg-vqa, synthdog-ja, LLaVA-1.5 instruction data (subset) | 大規模言語モデル研究開発センター (LLMC) | Apache 2.0 & OpenAI Terms of Use | | [Heron](https://github.com/turingmotors/heron/blob/main/docs/README_JP.md)
([blip-ja-stablelm-base-7b-v0](https://huggingface.co/turing-motors/heron-chat-blip-ja-stablelm-base-7b-v0), [blip-ja-stablelm-base-7b-v1](https://huggingface.co/turing-motors/heron-chat-blip-ja-stablelm-base-7b-v1), [blip-ja-stablelm-base-7b-v1-llava-620k](https://huggingface.co/turing-motors/heron-chat-blip-ja-stablelm-base-7b-v1-llava-620k), [git-ja-stablelm-base-7b-v0](https://huggingface.co/turing-motors/heron-chat-git-ja-stablelm-base-7b-v0), [git-ELYZA-fast-7b-v0](https://huggingface.co/turing-motors/heron-chat-git-ELYZA-fast-7b-v0), [git-ja-stablelm-base-7b-v1](https://huggingface.co/turing-motors/heron-chat-git-ja-stablelm-base-7b-v1)) | BLIP-2 または GIT | v1: LLaVA-Instruct-150K-JA または LLaVA-Instruct-620K-JA
v0: LLaVA-Instruct-150K-JA, Japanese STAIR Captions, Japanese Visual Genome VQA dataset | Turing | CC BY-NC 4.0 | | [Japanese Stable VLM](https://ja.stability.ai/blog/japanese-stable-vlm)
([japanese-stable-vlm](https://huggingface.co/stabilityai/japanese-stable-vlm)) | LLaVA-1.5 | Japanese CC12M, STAIR Captions, Japanese Visual Genome VQA dataset | Stability AI | STABILITY AI JAPANESE STABLE VLM COMMUNITY LICENSE | | [Japanese InstructBLIP Alpha](https://ja.stability.ai/blog/japanese-instructblip-alpha)
([japanese-instructblip-alpha](https://huggingface.co/stabilityai/japanese-instructblip-alpha)) | InstructBLIP | Japanese CC12M, STAIR Captions, Japanese Visual Genome VQA dataset | Stability AI | JAPANESE STABLELM RESEARCH LICENSE | diff --git a/en/README.md b/en/README.md index 5b0d518..9473696 100644 --- a/en/README.md +++ b/en/README.md @@ -320,9 +320,10 @@ Please point out any errors on the [issues page](https://github.com/llm-jp/aweso **General purpose** -| | Architecture | Training Data | Developer | License | +| | Architecture | Training Data | Developer | License / Terms of Use | |:---|:---:|:---:|:---:|:---:| | [llava-calm2-siglip](https://www.cyberagent.co.jp/news/detail/id=30344)
([llava-calm2-siglip](https://huggingface.co/cyberagent/llava-calm2-siglip)) | LLaVA-1.5 | coversational data generated from MS-COCO and VisualGenome | CyberAgent | Apache 2.0 | +| [LLM-jp-3 VILA 14B](https://llmc.nii.ac.jp/en/topics/llm-jp-3-vila-14b/)
([14b](https://huggingface.co/llm-jp/llm-jp-3-vila-14b)) | LLaVA-1.5 | [Japanese image text pairs](https://gitlab.llm-jp.nii.ac.jp/datasets/llm-jp-japanese-image-text-pairs), LLaVA-Pretrain, [Japanese interleaved data](https://gitlab.llm-jp.nii.ac.jp/datasets/llm-jp-japanese-interleaved-data), coyo (subset), mmc4-core (subset), [llava-instruct-ja](https://huggingface.co/datasets/llm-jp/llava-instruct-ja), [japanese-photos-conv](https://huggingface.co/datasets/llm-jp/japanese-photos-conversation), ja-vg-vqa, synthdog-ja, LLaVA-1.5 instruction data (subset) | Research and Development Center for Large Language Models (LLMC) | Apache 2.0 & OpenAI Terms of Use | | [Heron](https://github.com/turingmotors/heron)
([blip-ja-stablelm-base-7b-v0](https://huggingface.co/turing-motors/heron-chat-blip-ja-stablelm-base-7b-v0), [blip-ja-stablelm-base-7b-v1](https://huggingface.co/turing-motors/heron-chat-blip-ja-stablelm-base-7b-v1), [blip-ja-stablelm-base-7b-v1-llava-620k](https://huggingface.co/turing-motors/heron-chat-blip-ja-stablelm-base-7b-v1-llava-620k), [git-ja-stablelm-base-7b-v0](https://huggingface.co/turing-motors/heron-chat-git-ja-stablelm-base-7b-v0), [git-ELYZA-fast-7b-v0](https://huggingface.co/turing-motors/heron-chat-git-ELYZA-fast-7b-v0), [git-ja-stablelm-base-7b-v1](https://huggingface.co/turing-motors/heron-chat-git-ja-stablelm-base-7b-v1)) | BLIP-2 / GIT | v1: LLaVA-Instruct-150K-JA or LLaVA-Instruct-620K-JA
v0: LLaVA-Instruct-150K-JA, Japanese STAIR Captions, Japanese Visual Genome VQA dataset | Turing | CC BY-NC 4.0 | | [Japanese Stable VLM](https://ja.stability.ai/blog/japanese-stable-vlm)
([japanese-stable-vlm](https://huggingface.co/stabilityai/japanese-stable-vlm)) | LLaVA-1.5 | Japanese CC12M, STAIR Captions, Japanese Visual Genome VQA dataset | Stability AI | STABILITY AI JAPANESE STABLE VLM COMMUNITY LICENSE | | [Japanese InstructBLIP Alpha](https://ja.stability.ai/blog/japanese-instructblip-alpha)
([japanese-instructblip-alpha](https://huggingface.co/stabilityai/japanese-instructblip-alpha)) | InstructBLIP | Japanese CC12M, STAIR Captions, Japanese Visual Genome VQA dataset | Stability AI | JAPANESE STABLELM RESEARCH LICENSE | diff --git a/fr/README.md b/fr/README.md index fa3dcd9..84f55ed 100644 --- a/fr/README.md +++ b/fr/README.md @@ -320,9 +320,10 @@ N'hésitez pas à signaler les erreurs sur la page [issues](https://github.com/l **D'usage général** -| | Architecture | Données d'entraînement | Développeur | Licence | +| | Architecture | Données d'entraînement | Développeur | License / Terms of Use | |:---|:---:|:---:|:---:|:---:| -| [llava-calm2-siglip](https://www.cyberagent.co.jp/news/detail/id=30344)
([llava-calm2-siglip](https://huggingface.co/cyberagent/llava-calm2-siglip)) | LLaVA-1.5 | coversational data generated from MS-COCO and VisualGenome | CyberAgent | Apache 2.0 | +| [llava-calm2-siglip](https://www.cyberagent.co.jp/news/detail/id=30344)
([llava-calm2-siglip](https://huggingface.co/cyberagent/llava-calm2-siglip)) | LLaVA-1.5 | conversational data generated from MS-COCO and VisualGenome | CyberAgent | Apache 2.0 | +| [LLM-jp-3 VILA 14B](https://llmc.nii.ac.jp/en/topics/llm-jp-3-vila-14b/)<br>
([14b](https://huggingface.co/llm-jp/llm-jp-3-vila-14b)) | LLaVA-1.5 | [Japanese image text pairs](https://gitlab.llm-jp.nii.ac.jp/datasets/llm-jp-japanese-image-text-pairs), LLaVA-Pretrain, [Japanese interleaved data](https://gitlab.llm-jp.nii.ac.jp/datasets/llm-jp-japanese-interleaved-data), coyo (subset), mmc4-core (subset), [llava-instruct-ja](https://huggingface.co/datasets/llm-jp/llava-instruct-ja), [japanese-photos-conv](https://huggingface.co/datasets/llm-jp/japanese-photos-conversation), ja-vg-vqa, synthdog-ja, LLaVA-1.5 instruction data (subset) | Research and Development Center for Large Language Models (LLMC) | Apache 2.0 & OpenAI Terms of Use | | [Heron](https://github.com/turingmotors/heron)<br>
([blip-ja-stablelm-base-7b-v0](https://huggingface.co/turing-motors/heron-chat-blip-ja-stablelm-base-7b-v0), [blip-ja-stablelm-base-7b-v1](https://huggingface.co/turing-motors/heron-chat-blip-ja-stablelm-base-7b-v1), [blip-ja-stablelm-base-7b-v1-llava-620k](https://huggingface.co/turing-motors/heron-chat-blip-ja-stablelm-base-7b-v1-llava-620k), [git-ja-stablelm-base-7b-v0](https://huggingface.co/turing-motors/heron-chat-git-ja-stablelm-base-7b-v0), [git-ELYZA-fast-7b-v0](https://huggingface.co/turing-motors/heron-chat-git-ELYZA-fast-7b-v0), [git-ja-stablelm-base-7b-v1](https://huggingface.co/turing-motors/heron-chat-git-ja-stablelm-base-7b-v1)) | BLIP-2 / GIT | v1: LLaVA-Instruct-150K-JA or LLaVA-Instruct-620K-JA
v0: LLaVA-Instruct-150K-JA, Japanese STAIR Captions, Japanese Visual Genome VQA dataset | Turing | CC BY-NC 4.0 | | [Japanese Stable VLM](https://ja.stability.ai/blog/japanese-stable-vlm)
([japanese-stable-vlm](https://huggingface.co/stabilityai/japanese-stable-vlm)) | LLaVA-1.5 | Japanese CC12M, STAIR Captions, jeu de données Japanese Visual Genome VQA | Stability AI | STABILITY AI JAPANESE STABLE VLM COMMUNITY LICENSE | | [Japanese InstructBLIP Alpha](https://ja.stability.ai/blog/japanese-instructblip-alpha)
([japanese-instructblip-alpha](https://huggingface.co/stabilityai/japanese-instructblip-alpha)) | InstructBLIP | Japanese CC12M, STAIR Captions, jeu de données Japanese Visual Genome VQA | Stability AI | JAPANESE STABLELM RESEARCH LICENSE |