Transfer learning in Brain-Computer Interfaces: Language-Pretrained Transformers for Classifying Electroencephalography
This project was my master’s thesis for the degree of MSc in Applied Sciences and Engineering: Computer Science at the Vrije Universiteit Brussel. I worked on transfer learning in Brain-Computer Interfaces (BCI). An article published by a collaboration between Facebook and Google found that a language-pretrained GPT-2 could transfer to tasks unrelated to language, sometimes even without finetuning. The goal of my thesis was to evaluate how well such a model could classify electroencephalography (EEG, brain waves). While the results were not spectacular, I did find some knowledge transfer from the language domain to the EEG domain.
I tried my very best to make the work as reproducible as possible.
If you are interested, take a look at the GitHub repository:
The thesis itself can be found there as well.
Additionally, I published a summary of the results on Weights & Biases.
An extended abstract of the thesis was accepted at BNAIC, where I presented the work in a short talk.
Supervisors: Prof. Dr. Geraint Wiggins & Prof. Dr. Kevin De Pauw