## Population Transformer

Weights for the pretrained Population Transformer ([paper](https://arxiv.org/abs/2406.03044), [website](https://glchau.github.io/population-transformer/), [code](https://github.com/czlwang/PopulationTransformer)), built on the pretrained BrainBERT stft model ([paper](https://arxiv.org/abs/2302.14367), [code](https://github.com/czlwang/BrainBERT)).

Trained on the Brain TreeBank dataset ([paper](https://arxiv.org/pdf/2411.08343), [dataset](https://braintreebank.dev/)).
### Cite:
```
@misc{chau2024populationtransformer,
      title={Population Transformer: Learning Population-level Representations of Neural Activity},
      author={Geeling Chau and Christopher Wang and Sabera Talukder and Vighnesh Subramaniam and Saraswati Soedarmadji and Yisong Yue and Boris Katz and Andrei Barbu},
      year={2024},
      eprint={2406.03044},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2406.03044},
}
```