[OC] Exponential Growth of Context Length in Language Models

    by porkbellyqueen111

    2 Comments

    1. porkbellyqueen111

      Methodology:
      Had to track down each model's release blog (if there was one) and cross-reference it with the API docs (if they existed) or a paper (if there was one). This field changes so fast, and it's not uncommon for a company to release a model with an X-token context window, then update the API docs a month later and announce that the context length is now Y.

      Sharing the raw data here, since I spent so much time painstakingly collecting it. Open to spot checks in case I missed something.

      [https://docs.google.com/spreadsheets/d/1xaU5Aj16mejjNvReQof0quwBJEXPOtN8nLsdBZZmepU/edit?gid=0#gid=0](https://docs.google.com/spreadsheets/d/1xaU5Aj16mejjNvReQof0quwBJEXPOtN8nLsdBZZmepU/edit?gid=0#gid=0)
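
      If anyone wants to spot check programmatically, here is a minimal sketch for loading the sheet. It assumes the sheet stays link-viewable; the `export?format=csv` endpoint and `gid=0` are standard Google Sheets conventions, and the column names are whatever the sheet actually uses:

      ```python
      # Hedged sketch, not the author's pipeline: pull the shared sheet
      # as CSV through Google's export endpoint (works while the sheet
      # remains link-viewable).
      import pandas as pd

      SHEET_ID = "1xaU5Aj16mejjNvReQof0quwBJEXPOtN8nLsdBZZmepU"
      csv_url = (
          f"https://docs.google.com/spreadsheets/d/{SHEET_ID}"
          "/export?format=csv&gid=0"
      )

      df = pd.read_csv(csv_url)   # columns are whatever the sheet defines
      print(df.head())            # eyeball a few rows for spot checking
      ```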

    2. How long before the context length is enough to hold just about everything one person has ever said to an AI in their lifetime?

      Some quick calculations suggest we are only about 3 orders of magnitude from that possibility: frontier context windows sit around 10^6 tokens today, while a lifetime of conversation with an AI plausibly runs to 10^8 or 10^9 tokens. Closing that gap at something akin to Moore's law (a doubling every ~2 years) takes roughly ten doublings, so in 20 years or so that will be reality.
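
      A back-of-envelope sketch of that arithmetic; every input here is an assumption (roughly 1M-token frontier window today, tokens exchanged per day, years of use, a two-year doubling period), not data from the post:

      ```python
      import math

      # Assumed inputs -- none of these come from the post itself.
      context_now = 1e6        # ~1M-token frontier context window today
      tokens_per_day = 2e4     # tokens a heavy user exchanges with an AI daily
      years_of_use = 60        # years of talking to AIs over a lifetime
      doubling_years = 2       # Moore's-law-style doubling period

      lifetime_tokens = tokens_per_day * 365 * years_of_use   # ~4.4e8 tokens
      gap = lifetime_tokens / context_now                     # ~440x

      doublings = math.log2(gap)                # ~9 doublings to close the gap
      years_needed = doublings * doubling_years

      print(f"lifetime tokens ~ {lifetime_tokens:.1e}")
      print(f"gap ~ {gap:.0f}x ({math.log10(gap):.1f} orders of magnitude)")
      print(f"~{doublings:.0f} doublings -> ~{years_needed:.0f} years")
      ```

      With these inputs the gap comes out closer to 440x (about 2.6 orders of magnitude) and ~18 years; nudging tokens per day or years of use upward pushes it toward the full 3 orders and ~20 years cited above.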
