@@ -0,0 +1,5 @@
Title of work: IBM Granite
Link to work: https://en.wikipedia.org/wiki/IBM_Granite
Revision: https://en.wikipedia.org/wiki/IBM_Granite?utm_source=ibm_developer&utm_content=in_content_link&utm_id=tutorials_awb-contributing-llm-granite-instructlab-ui
License of the work: IBM-AJ-1.0
Creator names: Wikipedia Authors
82 changes: 82 additions & 0 deletions knowledge/technology/large_language_model/granite/qna.yaml
@@ -0,0 +1,82 @@
created_by: amnajamal-ibm
version: 3
domain: large-language-model
document_outline: Knowledge contribution about the IBM Granite model
seed_examples:
  - context: >-
      IBM Granite is a series of decoder-only AI foundation models created by
      IBM. It was announced on September 7, 2023, and an initial paper was
      published 4 days later
    questions_and_answers:
      - question: What is IBM Granite?
        answer: >-
          IBM Granite is a series of decoder-only AI foundation models created
          by IBM
      - question: When was IBM Granite announced?
        answer: September 7, 2023
      - question: What's a series of IBM decoder-only AI foundation models?
        answer: IBM Granite
  - context: >-
      Granite's first foundation models were Granite.13b.instruct and
      Granite.13b.chat. The "13b" in their name comes from 13 billion, the
      amount of parameters they have as models, lesser than most of the larger
      models of the time. Later models vary from 3 to 34 billion parameters
    questions_and_answers:
      - question: What were Granite's first foundation models?
        answer: Granite.13b.instruct and Granite.13b.chat
      - question: What does 13b in the name mean?
        answer: 13 billion, the number of parameters the models have
      - question: What is the range of parameters in the later models?
        answer: 3 to 34 billion parameters
  - context: >-
      On May 6, 2024, IBM released the source code of four variations of Granite
      Code Models under Apache 2, an open source permissive license that allows
      completely free use, modification and sharing of the software, and put
      them on Hugging Face for public use. According to IBM's own report,
      Granite 8b outperforms Llama 3 on several coding related tasks within
      similar range of parameters
    questions_and_answers:
      - question: >-
          When did IBM release the source code of four variations of Granite
          Code Models under Apache 2?
        answer: May 6, 2024
      - question: What is the Apache 2 license?
        answer: >-
          an open source permissive license that allows completely free use,
          modification and sharing of the software
      - question: >-
          Which Granite model outperforms Llama 3 on several coding related
          tasks?
        answer: Granite 8b
  - context: >-
      A foundation model is an AI model trained on broad data at scale such that
      it can be adapted to a wide range of downstream tasks
    questions_and_answers:
      - question: What is a foundation model?
        answer: A foundation model is an AI model trained on broad data at scale
      - question: Can foundation models adapt to a wide range of downstream tasks?
        answer: 'Yes'
      - question: What is trained on broad data at scale?
        answer: Foundation models
  - context: >-
      Initially intended for use in the IBM's cloud-based data and generative AI
      platform Watsonx along with other models, IBM opened the source code of
      some code models. Granite models are trained on datasets curated from
      Internet, academic publishings, code datasets, legal and finance
      documents.
    questions_and_answers:
      - question: What data are Granite models trained on?
        answer: >-
          Granite models are trained on datasets curated from Internet,
          academic publishings, code datasets, legal and finance documents.
      - question: Where was the Granite series initially intended for use?
        answer: >-
          Initially intended for use in the IBM's cloud-based data and
          generative AI platform Watsonx
      - question: Did IBM open the source code of some code models?
        answer: 'Yes'
document:
  repo: https://github.com/amnajamal-ibm/taxonomy-knowledge-docs
  commit: 2d5313a1a3db04f68a23fe0f6da45466b1b4cf03
  patterns:
    - IBM-Granite-20241206T034852310.md
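
As a quick local sanity check of the file above, a minimal sketch in Python (assuming PyYAML is installed and the contribution is saved at the path shown in the diff) could load the YAML and confirm the shape it uses: five seed examples, each with a context and three question/answer pairs, plus a document section pointing at the markdown source. This is not the taxonomy repository's official validation, only an illustrative check against what this particular file contains.

```python
# Illustrative structural check of the qna.yaml above.
# Assumptions: PyYAML is available (`pip install pyyaml`) and the file
# lives at the path from the diff header; adjust the path as needed.
import yaml

path = "knowledge/technology/large_language_model/granite/qna.yaml"
with open(path) as f:
    qna = yaml.safe_load(f)

# Top-level keys present in the contribution shown above.
for key in ("created_by", "version", "domain", "document_outline",
            "seed_examples", "document"):
    assert key in qna, f"missing top-level key: {key}"

examples = qna["seed_examples"]
print(f"{len(examples)} seed examples")  # this file has 5

for i, ex in enumerate(examples, start=1):
    assert ex.get("context"), f"seed example {i} has no context"
    pairs = ex.get("questions_and_answers", [])
    assert len(pairs) >= 3, f"seed example {i} has only {len(pairs)} Q&A pairs"
    for pair in pairs:
        assert pair.get("question") and pair.get("answer")

# The document section references the markdown source of the knowledge.
doc = qna["document"]
print(doc["repo"], doc["commit"], doc["patterns"])
```

Running a check like this before opening the pull request catches missing fields or short seed examples early, before the repository's own CI review.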