"...GPT-2 is a large transformer-based language model with 1.5 billion parameters, trained on a dataset of 8 million web pages. GPT-2 is trained with a simple objective: predict the next word, given all of the previous words within some text."
And? What for? To mimic Trump?
AI has been a purely commercial project from the very beginning! AI looks for groups of patterns that are contextually and subtextually aimed at a practical purpose: for instance, commands for a driverless car, or information for a financial broker.
AI is trained using a very good indexed dictionary, about 25 MB in size.