Paper
23 May 2023 Malicious language identification based on neural network
Mingyang Ma, Hanyi Luo
Author Affiliations +
Proceedings Volume 12645, International Conference on Computer, Artificial Intelligence, and Control Engineering (CAICE 2023); 1264532 (2023) https://doi.org/10.1117/12.2680805
Event: International Conference on Computer, Artificial Intelligence, and Control Engineering (CAICE 2023), 2023, Hangzhou, China
Abstract
Malicious language identification is a critical issue in the current network environment, and automatically detecting malicious sentences has drawn growing attention in the security side of natural language processing. However, most existing research on malicious-content detection is based on keyword matching: such systems cannot detect covert information in networks, require humans to label the malicious keywords, and incur high computational costs. In this paper, we describe the use of a neural network for identifying malicious language. The neural network model was trained on a dataset of malicious and non-malicious text samples and then evaluated on a set of unseen malicious text samples. From our extensive experimental results, we conclude that the proposed model precisely identifies malicious language, achieving a score of approximately 0.874. The results indicate that neural network models can be a useful tool for detecting malicious language.
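The pipeline the abstract describes — train a neural network on labeled malicious and non-malicious text, then classify unseen samples — can be sketched roughly as below. The toy corpus, feature choice (TF-IDF), and model configuration are illustrative assumptions, not the authors' actual data or architecture.

```python
# Hypothetical minimal sketch of a neural-network malicious-text classifier.
# The tiny labeled corpus and all hyperparameters are assumptions for
# illustration; the paper's real dataset and model are not shown here.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier

# Toy labeled corpus: 1 = malicious, 0 = benign (illustrative only)
texts = [
    "send me your password now",
    "click this link to claim your prize",
    "meeting rescheduled to friday",
    "please review the attached report",
]
labels = [1, 1, 0, 0]

# Turn raw text into TF-IDF feature vectors
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(texts)

# A small feed-forward neural network classifier
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X, labels)

# Classify unseen sentences
unseen = ["your password expired, click here", "see you at the friday meeting"]
preds = model.predict(vectorizer.transform(unseen))
```

In practice, the keyword-matching baselines the abstract criticizes would fail on paraphrased or obfuscated wording, whereas a learned classifier generalizes from feature patterns rather than exact token matches.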
© (2023) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Mingyang Ma and Hanyi Luo "Malicious language identification based on neural network", Proc. SPIE 12645, International Conference on Computer, Artificial Intelligence, and Control Engineering (CAICE 2023), 1264532 (23 May 2023); https://doi.org/10.1117/12.2680805
KEYWORDS
Neural networks
Education and training
Data modeling
Machine learning
Computing systems
Tumor growth modeling
Neurons