    Lexicon

    GPT-2

    GPT-2 is a widely used deep learning language model developed by OpenAI for text generation. It is open source, and its largest version has about 1.5 billion parameters. Given an input sentence, it predicts the next sequence of text one token at a time.
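The core task GPT-2 learns is next-token prediction: given the text so far, output the most probable continuation. The sketch below illustrates that idea with a toy bigram model built from word counts; it is not GPT-2 (which uses a deep Transformer network), only a minimal stand-in for the learned next-token distribution.

```python
from collections import Counter, defaultdict

# Tiny training corpus; a real model is trained on billions of tokens.
corpus = "the cat sat on the mat the cat slept on the sofa".split()

# Count which token follows each token in the corpus.
successors = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    successors[current][nxt] += 1

def predict_next(token):
    """Return the most frequent follower of `token` in the corpus."""
    return successors[token].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" most often here
```

Repeating this step, feeding each predicted token back in as input, is how an autoregressive model like GPT-2 generates whole passages of text.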


    GPT-3

    A further development of GPT-2 (see above). Both models share essentially the same Transformer architecture, but GPT-3 has more layers (see neural network), far more parameters (up to 175 billion), and was trained on considerably more data.