created_by: ahmed-azraq
version: 3
domain: large-language-model
document_outline: Knowledge contribution about the IBM Granite model
seed_examples:
  - context: >-
      IBM Granite is a series of decoder-only AI foundation models created by
      IBM. It was announced on September 7, 2023, and an initial paper was
      published 4 days later.
    questions_and_answers:
      - question: What is IBM Granite?
        answer: >-
          IBM Granite is a series of decoder-only AI foundation models created
          by IBM.
      - question: When was IBM Granite announced?
        answer: September 7, 2023
      - question: Which series of decoder-only AI foundation models was created by IBM?
        answer: IBM Granite
  - context: >-
      Initially intended for use in IBM's cloud-based data and generative AI
      platform Watsonx along with other models, IBM later opened the source
      code of some of the code models. Granite models are trained on datasets
      curated from the Internet, academic publications, code datasets, and
      legal and finance documents.
    questions_and_answers:
      - question: What was the original intention for IBM Granite?
        answer: >-
          It was initially intended for use in IBM's cloud-based data and
          generative AI platform Watsonx.
      - question: What are Granite models trained on?
        answer: >-
          Datasets curated from the Internet, academic publications, code
          datasets, and legal and finance documents.
      - question: Are Granite models open source?
        answer: 'Yes'
  - context: >-
      IBM Granite is a series of decoder-only AI foundation models created by
      IBM. A foundation model is an AI model trained on broad data at scale.
    questions_and_answers:
      - question: What is a foundation model?
        answer: An AI model trained on broad data at scale
      - question: What is an example of a foundation model from IBM?
        answer: IBM Granite
      - question: Which IBM model series consists of AI models trained on broad data at scale?
        answer: IBM Granite
  - context: >-
      Granite's first foundation models were Granite.13b.instruct and
      Granite.13b.chat. The "13b" in their names refers to 13 billion, the
      number of parameters the models have, fewer than most of the larger
      models of the time. Later models range from 3 to 34 billion parameters.
    questions_and_answers:
      - question: What were Granite's first foundation models?
        answer: Granite.13b.instruct and Granite.13b.chat
      - question: What does 13b in Granite.13b indicate?
        answer: The "13b" in the name refers to 13 billion parameters.
      - question: How many parameters do later Granite models have?
        answer: Later models range from 3 to 34 billion parameters.
  - context: >-
      On May 6, 2024, IBM released the source code of four variations of
      Granite Code Models under Apache 2, an open-source permissive license.
    questions_and_answers:
      - question: When did IBM release Granite models as open source?
        answer: May 6, 2024
      - question: What is the open-source license for IBM Granite models?
        answer: Apache 2
      - question: >-
          How many variations of Granite Code Models did IBM release as open
          source on May 6, 2024?
        answer: Four
document:
  repo: https://github.com/ahmed-azraq/taxonomy-knowledge-docs
  commit: f82016ee5187852adac9e917f83c24861801db64
  patterns:
    - IBM-Granite-20241011T123439034.md