Update README.md to be more detailed
jshuadvd committed Jul 16, 2024
1 parent d64cc51 commit 7a803d1
Showing 1 changed file with 3 additions and 0 deletions: README.md
@@ -70,6 +70,7 @@ The **LongRoPE** model architecture is designed to extend the context window of
 The LongRoPE model extends the context window of large language models beyond 2 million tokens. Key components include:

 1. Rotary Position Encoding (RoPE):
+
 ```python
 class RoPEPositionalEncoding(nn.Module):
     def __init__(self, d_model, max_len=1000000, base=10000):
@@ -85,6 +86,7 @@ The LongRoPE model extends the context window of large language models beyond 2
         return sin_cos.view(*sin_cos.shape[:-2], -1)

 2. Non-uniform Interpolation:
+
 ```python
 def non_uniform_interpolation(pos_embed, extension_ratio, lambda_factors, n_hat):
     d_model = pos_embed.shape[-1]
@@ -97,6 +99,7 @@ The LongRoPE model extends the context window of large language models beyond 2
         interpolated_pos[..., 2 * i + 1] *= scale
     return interpolated_pos

+
 ### Progressive Extension Strategy

 The architecture begins with a pre-trained LLM and extends its context window incrementally. Initially, the model is fine-tuned to handle a context length of 256k tokens. This progressive approach avoids the need for direct fine-tuning on extremely long texts, which are rare and computationally expensive to process. By gradually increasing the context length, the model can adapt more effectively to longer sequences.
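
Both code blocks in the diff are shown only in part: the view collapses the lines between hunks. For reference, here is a minimal, self-contained sketch of what a `RoPEPositionalEncoding` with this signature typically computes. Everything between the visible `def __init__` and the final `return` is an assumption based on the standard RoPE formulation, not the repository's actual code.

```python
import torch
import torch.nn as nn

class RoPEPositionalEncoding(nn.Module):
    """Standard rotary position encoding: one rotation angle per position
    and per pair of embedding dimensions (sketch, not the repo's code)."""

    def __init__(self, d_model, max_len=1000000, base=10000):
        super().__init__()
        # Inverse frequencies theta_i = base^(-2i / d_model), one per dimension pair.
        inv_freq = 1.0 / (base ** (torch.arange(0, d_model, 2).float() / d_model))
        positions = torch.arange(max_len).float()
        angles = torch.outer(positions, inv_freq)        # (max_len, d_model / 2)
        # Pair each angle's sin and cos so adjacent channels share one rotation.
        sin_cos = torch.stack([angles.sin(), angles.cos()], dim=-1)
        self.register_buffer("sin_cos", sin_cos)         # (max_len, d_model / 2, 2)

    def forward(self, positions):
        # positions: integer tensor of token positions, shape (..., seq_len).
        sin_cos = self.sin_cos[positions]                # (..., seq_len, d_model / 2, 2)
        # Flatten the last two axes; this line matches the return visible in the diff.
        return sin_cos.view(*sin_cos.shape[:-2], -1)

# Quick check (a small max_len keeps the precomputed table cheap):
# RoPEPositionalEncoding(64, max_len=4096)(torch.arange(16)).shape == (16, 64)
```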
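
The same caveat applies to `non_uniform_interpolation`: only its first and last lines survive the collapse. A plausible reconstruction is sketched below, assuming `lambda_factors` holds one rescaling factor per dimension pair and that positions below `n_hat` keep their original scale, as in LongRoPE's non-uniform scheme.

```python
import torch

def non_uniform_interpolation(pos_embed, extension_ratio, lambda_factors, n_hat):
    """Rescale a rotary position table non-uniformly (sketch): the first
    n_hat positions are left untouched, later ones are scaled down per
    dimension pair."""
    d_model = pos_embed.shape[-1]
    seq_len = pos_embed.shape[-2]
    positions = torch.arange(seq_len, device=pos_embed.device)
    interpolated_pos = pos_embed.clone()
    for i in range(d_model // 2):
        # One factor per (sin, cos) pair; scale stays 1.0 inside the n_hat window.
        scale = torch.ones(seq_len, dtype=pos_embed.dtype, device=pos_embed.device)
        scale[positions >= n_hat] = 1.0 / (lambda_factors[i] * extension_ratio)
        interpolated_pos[..., 2 * i] *= scale
        interpolated_pos[..., 2 * i + 1] *= scale
    return interpolated_pos
```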

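The progressive extension strategy itself reduces to a short loop. The skeleton below is purely illustrative: `search_lambda_factors` and `fine_tune` are hypothetical placeholders for the repository's search and training routines, and the 4096-token starting window is an assumption; only the extend, search, fine-tune cycle is taken from the paragraph above.

```python
def search_lambda_factors(model, extension_ratio):
    # Placeholder: the real project would search per-dimension rescaling
    # factors for the new ratio (e.g. via evolutionary search).
    return [1.0] * 64

def fine_tune(model, context_len, lambda_factors):
    # Placeholder for a short fine-tuning run at the extended length.
    return model

def progressive_extension(model, base_len=4096, targets=(256_000, 2_048_000)):
    """Extend the context window in stages (256k first, then beyond 2M)
    instead of fine-tuning directly on extremely long texts."""
    for target in targets:
        ratio = target / base_len
        factors = search_lambda_factors(model, ratio)
        model = fine_tune(model, context_len=target, lambda_factors=factors)
    return model
```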