Re: Constant change is here to stay
Good question. Learning about cutting-edge material comes with obvious shortcomings; for example, in NLP:
* more than ten years ago you would learn about basic RNNs
* five to ten years ago it would have been LSTMs
* more recently, the Transformer architecture
However, in all of these cases there is one constant: you would be reading recent research papers and constantly catching up on the development of new models. So perhaps that's where the value of this teaching lies - giving students the tools to understand current research, as long as they keep following it.