Abstract:
This thesis provides an in-depth analysis of LoRaWAN technology, covering its latest trends, practical applications, and security aspects. A novel technique, SmartLoRaML, is introduced that leverages machine learning to optimize spreading factor (SF) allocation in LoRaWAN networks. The approach dynamically adjusts the SF based on network conditions, improving collision detection. The technique is developed using a dataset generated by a LoRaWAN simulator, and the simulator is then modified to incorporate SmartLoRaML. Performance is assessed across various scenarios using metrics such as accuracy, packet delivery ratio (PDR), energy consumption, recall, precision, F1 score, and Matthews Correlation Coefficient (MCC). The evaluations show that performance varies with node density and communication radius, and that SmartLoRaML reduces transmit energy consumption compared to the simulator's default random SF allocation method. This analysis offers valuable insights for optimizing LoRa resource allocation and enhancing anomaly detection in IoT networks. Furthermore, the thesis surveys a selection of commonly used open-source LoRa/LoRaWAN simulation tools, providing a comparative summary based on programming language, target domain, operating system support, and the presence of a graphical user interface (GUI).