Code developed for FakeDes 2021. Here I reused the Attention Model from my Master's Thesis, but I also tried Random Forest, XGBoost, and SVM. The Attention Model gave the best results during training, so that is the only submission I uploaded. A sketch of the classical baselines is shown below.
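The following is a minimal sketch of how the classical baselines (Random Forest, XGBoost, SVM) could be probed. The TF-IDF features, hyperparameters, and `probe_baselines` helper are illustrative assumptions, not the exact pipeline used for the submission.

```python
# Illustrative baseline comparison over TF-IDF features (assumed setup,
# not the original feature pipeline).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

def probe_baselines(texts, labels):
    """Cross-validate each baseline classifier and print its mean accuracy."""
    features = TfidfVectorizer(max_features=20000).fit_transform(texts)
    baselines = {
        "RandomForest": RandomForestClassifier(n_estimators=300, random_state=0),
        "XGBoost": XGBClassifier(eval_metric="logloss", random_state=0),
        "SVM": SVC(kernel="linear", random_state=0),
    }
    for name, clf in baselines.items():
        scores = cross_val_score(clf, features, labels, cv=5)
        print(f"{name}: mean accuracy = {scores.mean():.3f}")
```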
I ran out of time while participating in the competition, so I did not get to use a Hugging Face model from the transformers package. The idea was to use a pre-trained model such as BERT or DistilBERT.
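For reference, a sketch of that unimplemented idea is given below: loading a pre-trained checkpoint with transformers and classifying a single text. The checkpoint name, `max_length`, and the `predict_fake` helper are assumptions for illustration; the classification head would still need fine-tuning on the corpus before its predictions are meaningful.

```python
# Sketch of the intended (but never implemented) transformers approach.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "distilbert-base-multilingual-cased"  # assumed multilingual checkpoint that covers Spanish
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

def predict_fake(text: str) -> int:
    """Return the predicted class index (e.g. 0 = real, 1 = fake) for one article."""
    inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return int(logits.argmax(dim=-1).item())
```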
The Spanish fake news corpus was used for the Fake News Detection Task of the MEX-A3T competition at the IberLEF 2020 congress. Further details can be found on the competition's main page.