Abstract:
Transformer models such as BERT and the Robustly Optimized BERT approach (RoBERTa) are
popular choices for text processing in Natural Language Processing (NLP). However,
these models cannot process full-length documents because their input is limited
to 512 tokens. This research proposes two Fuzzy-based Optimized fake-News Detection
(FOND) models: Fuzzy Optimized Big Bird (FOBB) and Fuzzy Optimized
Longformer (FOLF). Big Bird and Longformer were chosen to classify the authenticity
of full-length content, such as complete news articles, due to their input size of 4096
tokens, eight times the 512-token limit of BERT. In addition, Fuzzy Logic was
used to quantify the probability of news authenticity, and the LookAhead optimization
technique was applied to further improve the generalization of the proposed models and prevent
overfitting. This research also aims to raise awareness of the benefits of using Big
Bird and Longformer with Fuzzy Logic in Fake News Detection. Two datasets were
employed: Multifake and COVID-19 Fake News.