Lexicon


GPT-2

    GPT-2 is a well-known deep learning language model developed by OpenAI for text generation. It is open source, has about 1.5 billion parameters, and was trained on a large text corpus to predict the next piece of text that follows a given prompt.
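The core idea behind this kind of text generation is autoregressive sampling: predict a next token from the sequence so far, append it, and repeat. The following toy sketch illustrates the loop with a hand-made bigram table standing in for the neural network; the table and function names are illustrative only, not part of GPT-2.

```python
import random

# Toy "language model": for each token, the tokens that may follow it.
# GPT-2 does the same job at scale, predicting the next token from the
# full context with a neural network instead of a lookup table.
bigrams = {
    "the": ["cat", "dog"],
    "cat": ["sat", "ran"],
    "dog": ["ran"],
    "sat": ["down"],
    "ran": ["away"],
}

def generate(prompt, max_tokens=4, seed=0):
    """Repeatedly sample a next token and append it to the sequence."""
    random.seed(seed)
    tokens = prompt.split()
    for _ in range(max_tokens):
        choices = bigrams.get(tokens[-1])
        if not choices:  # no known continuation: stop early
            break
        tokens.append(random.choice(choices))
    return " ".join(tokens)

print(generate("the"))
```

Each generated token is chosen only from the continuations the model knows for the previous token, which is the same constraint a trained language model applies probabilistically.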



GPT-3

    A further development of GPT-2 (see above). Both models use roughly the same architecture, but GPT-3 has more layers (see neural network) and was trained on far more data; its largest version has about 175 billion parameters.
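The scale difference between the two models can be checked with a common rule of thumb: a transformer block holds roughly 12 · d_model² weights (attention projections plus feed-forward layers), so total parameters are about 12 · layers · d_model². The layer counts and widths below are the published sizes of GPT-2 XL (48 layers, width 1600) and the largest GPT-3 (96 layers, width 12288); the formula is an approximation that ignores embeddings.

```python
def approx_params(n_layers, d_model):
    # Rule of thumb: ~12 * d_model^2 weights per transformer block
    # (attention projections + feed-forward), embeddings ignored.
    return 12 * n_layers * d_model ** 2

gpt2_xl = approx_params(48, 1600)    # comes out near 1.5 billion
gpt3    = approx_params(96, 12288)   # comes out near 175 billion

print(f"GPT-2 XL ~ {gpt2_xl / 1e9:.1f}B parameters")
print(f"GPT-3    ~ {gpt3 / 1e9:.0f}B parameters")
```

The estimate reproduces both published parameter counts, showing that GPT-3's size comes from being both deeper and much wider than GPT-2.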


GPU

    Graphics Processing Unit / graphics card. Graphics cards offer very high parallel computing power and are therefore used for training deep learning models.
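A back-of-envelope calculation shows why that computing power matters. The throughput figures below are assumed rough orders of magnitude, not measurements of any specific hardware.

```python
# Why GPUs matter for deep learning: rough, assumed throughput figures.
cpu_flops = 1e11   # ~0.1 TFLOP/s, a typical CPU
gpu_flops = 1e14   # ~100 TFLOP/s, a modern data-center GPU

# Multiplying two n x n matrices (the core operation of neural-network
# training) costs about 2 * n^3 floating-point operations.
n = 8192
matmul_flops = 2 * n ** 3

print(f"CPU: {matmul_flops / cpu_flops:.1f} s per matmul")
print(f"GPU: {matmul_flops / gpu_flops * 1000:.1f} ms per matmul")
```

Under these assumptions a single large matrix multiplication drops from seconds on a CPU to milliseconds on a GPU, and training runs chain millions of such multiplications.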