We present a series of long-context LLMs that support effective context windows of up to 32,768 tokens. Our model series is built through continual...