We develop transformer models for multistep prediction of vehicle states, i.e., the prediction of a sequence of future states from an initial state and a series of control inputs. We evaluate several modifications of the transformer architecture on the example of predicting a ship simulation. Research in NLP suggests that the transformer architecture offers advantages over a state-of-the-art LSTM model with respect to training time and prediction accuracy. We also investigate whether positional encodings are useful in this scenario, or whether a transformer model can learn the order of its inputs without them.
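To make the setting concrete, the following minimal sketch (in PyTorch, with hypothetical class names, dimensions, and hyperparameters not taken from the project itself) shows one way a transformer encoder could map an initial state and a sequence of control inputs to a sequence of predicted states, with a flag to disable the positional encoding so its effect can be studied.

import math
import torch
import torch.nn as nn


class SinusoidalPositionalEncoding(nn.Module):
    """Standard sinusoidal positional encoding, added to the input embeddings."""

    def __init__(self, d_model: int, max_len: int = 500):
        super().__init__()
        position = torch.arange(max_len).unsqueeze(1)
        div_term = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(position * div_term)
        pe[:, 1::2] = torch.cos(position * div_term)
        self.register_buffer("pe", pe)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        return x + self.pe[: x.size(1)]


class MultistepTransformer(nn.Module):
    """Predicts a sequence of vehicle states from an initial state and control inputs."""

    def __init__(self, state_dim: int, control_dim: int, d_model: int = 64,
                 nhead: int = 4, num_layers: int = 2, use_pos_encoding: bool = True):
        super().__init__()
        # Each time step is embedded from the initial state concatenated with the control input.
        self.input_proj = nn.Linear(state_dim + control_dim, d_model)
        self.pos_encoding = SinusoidalPositionalEncoding(d_model) if use_pos_encoding else None
        encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                                   batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.output_proj = nn.Linear(d_model, state_dim)

    def forward(self, initial_state: torch.Tensor, controls: torch.Tensor) -> torch.Tensor:
        # initial_state: (batch, state_dim); controls: (batch, horizon, control_dim)
        horizon = controls.size(1)
        state_rep = initial_state.unsqueeze(1).expand(-1, horizon, -1)
        x = self.input_proj(torch.cat([state_rep, controls], dim=-1))
        if self.pos_encoding is not None:
            x = self.pos_encoding(x)
        # Causal mask so the prediction at step t only attends to controls up to step t.
        mask = torch.triu(torch.full((horizon, horizon), float("-inf")), diagonal=1)
        h = self.encoder(x, mask=mask)
        return self.output_proj(h)  # predicted states, shape (batch, horizon, state_dim)


if __name__ == "__main__":
    model = MultistepTransformer(state_dim=6, control_dim=2, use_pos_encoding=False)
    x0 = torch.randn(8, 6)          # batch of initial ship states (dimensions assumed)
    u = torch.randn(8, 50, 2)       # 50-step control input sequences
    print(model(x0, u).shape)       # torch.Size([8, 50, 6])

Setting use_pos_encoding=False yields a model that must infer input order purely from the control sequence, which is the variant relevant to the positional-encoding question above.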