
    GPT-2

    GPT-2 is a widely used deep learning language model developed by OpenAI for text generation. It is open source and has roughly 1.5 billion parameters, which it uses to predict the next sequence of text for a given sentence.

    The model is capable of learning language tasks such as reading comprehension, summarising and translating directly from raw text; no domain-specific training data is required. Compared to its predecessor GPT, GPT-2 has roughly ten times as many parameters and was trained on roughly ten times as much data.
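    The "predict the next piece of text from what came before" loop described above can be illustrated in miniature. The following sketch is not GPT-2 (which uses a Transformer neural network over tokens); it is a hypothetical toy that learns bigram counts from a small corpus and then extends a prompt word by word, purely to show the autoregressive generation principle.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    # Count, for each word, how often each following word occurs.
    counts = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def generate(counts, start, max_tokens=10):
    # Autoregressive loop: repeatedly append the most likely next word,
    # mirroring (very loosely) how GPT-2 extends a prompt step by step.
    out = [start]
    for _ in range(max_tokens):
        followers = counts.get(out[-1])
        if not followers:
            break  # no continuation seen in training data
        out.append(followers.most_common(1)[0][0])
    return " ".join(out)

corpus = "the cat sat on the cat sat on the mat"
counts = train_bigrams(corpus)
print(generate(counts, "the", max_tokens=4))  # → the cat sat on the
```

    GPT-2 replaces the bigram table with a learned probability distribution over its entire vocabulary at every step, which is why it can continue text it has never seen verbatim.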


    Sources:

    https://www.kdnuggets.com/2021/02/gpt2-gpt3-openai-showdown.html