NCLTP: Non-Contrastive Learning for Trajectory Prediction
The ability to predict the trajectories of pedestrians and cars is important for tasks such as autonomous driving and robot navigation. Many current state-of-the-art methods are trained using contrastive methods that require human-labelled data about the pedestrians, for instance their current action, e.g., walking or standing. Human-labelled data is both expensive and time-consuming to produce. In this study, I present a non-contrastive method that produces competitive results without the need for human-labelled data. Instead of comparing the action labels of pedestrians, the model uses different augmentations of the data to learn similar representations. Experiments for the proposed method are conducted on both first-person-view (FPV) and bird's-eye-view (BEV) datasets. The method achieves results competitive with existing state-of-the-art methods, including methods that make use of human-labelled annotations. The results of this paper should provide a base for further research to build upon and expand this topic.
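To make the core idea concrete, the following is a minimal sketch of a non-contrastive objective on trajectory data, assuming a SimSiam-style setup (two augmented views, a predictor head, and a stop-gradient). The encoder architecture, augmentation, and all names here (TrajEncoder, augment, non_contrastive_loss) are illustrative assumptions, not the paper's actual implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TrajEncoder(nn.Module):
    """Encodes an observed trajectory (T timesteps of x/y) into an embedding."""
    def __init__(self, dim=64):
        super().__init__()
        self.rnn = nn.GRU(input_size=2, hidden_size=dim, batch_first=True)

    def forward(self, traj):                  # traj: (B, T, 2)
        _, h = self.rnn(traj)                 # h: (1, B, dim)
        return h.squeeze(0)                   # (B, dim)

def augment(traj):
    """Hypothetical augmentation: small additive noise on the coordinates."""
    return traj + 0.02 * torch.randn_like(traj)

def non_contrastive_loss(p, z):
    """Negative cosine similarity; z is detached (stop-gradient)."""
    return -F.cosine_similarity(p, z.detach(), dim=-1).mean()

encoder = TrajEncoder(dim=64)
predictor = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 64))

traj = torch.randn(8, 12, 2)                  # batch of 8 observed trajectories
z1, z2 = encoder(augment(traj)), encoder(augment(traj))
loss = 0.5 * (non_contrastive_loss(predictor(z1), z2)
              + non_contrastive_loss(predictor(z2), z1))
loss.backward()                               # no negative pairs, no action labels

The key point the sketch illustrates is that the supervisory signal comes entirely from agreement between two augmented views of the same trajectory; no negative pairs or human action labels are involved, which is what distinguishes non-contrastive from contrastive training.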