Following a peer-review process, the MDPI journal Sensors has accepted for publication a scientific manuscript co-authored by UBITECH’s Theodora Anastasiou, Sophia Karagiorgou, Petros Petrou (members of the Privacy-preserving Distributed Machine Learning Research Group), Dimitris Papamartzivanos and Thanassis Giannetsos (members of the Digital Security and Trusted Computing Research Group), entitled “Towards Robustifying Image Classifiers against the Perils of Adversarial Attacks on Artificial Intelligence Systems”. UBITECH’s team members and their co-authors (from the Hellenic Army Information Technology Support Center and Philips Netherlands) introduce an AI architecture augmented with adversarial examples and defense algorithms that safeguards and secures AI systems and makes them more reliable.
This is achieved by robustifying deep neural network (DNN) classifiers, focusing explicitly on convolutional neural networks (CNNs) used in non-trivial manufacturing environments that are prone to noise, vibrations, and errors when data are captured and transferred. The proposed architecture emulates the interplay between an attacker and a defender through the deployment and cross-evaluation of adversarial and defense strategies.
The AI architecture enables (i) the creation and use of adversarial examples in the training process, which improves the robustness of the CNNs, (ii) the evaluation of defense algorithms to recover the classifiers’ accuracy, and (iii) the provision of a multiclass discriminator that distinguishes and reports on non-attacked and attacked data. The experiments yield promising results for a hybrid solution that combines the defense algorithms with the multiclass discriminator to revitalize the attacked base models and robustify the DNN classifiers. The proposed architecture is validated in a real manufacturing environment using datasets drawn from actual production lines.
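To give a flavor of the adversarial-training idea described above, the following minimal sketch mixes FGSM-style adversarial examples into the training of a toy logistic-regression classifier. This is an illustrative assumption, not the paper's actual architecture or code: the FGSM perturbation, the toy model, and all names and parameters here are hypothetical.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): adversarial training
# of a tiny logistic-regression classifier with FGSM-style perturbations.

rng = np.random.default_rng(0)

# Toy two-class data: class means at -1 and +1 in 2-D.
X = np.vstack([rng.normal(-1, 0.4, (100, 2)), rng.normal(1, 0.4, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

w = np.zeros(2)
b = 0.0
lr, eps = 0.1, 0.2  # learning rate and FGSM perturbation budget

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(200):
    p = sigmoid(X @ w + b)
    # FGSM: perturb each input in the direction that increases the loss.
    grad_x = np.outer(p - y, w)          # d(loss)/d(x) for logistic loss
    X_adv = X + eps * np.sign(grad_x)    # adversarial examples
    # Train on the union of clean and adversarial data.
    X_all = np.vstack([X, X_adv])
    y_all = np.concatenate([y, y])
    p_all = sigmoid(X_all @ w + b)
    w -= lr * (X_all.T @ (p_all - y_all)) / len(y_all)
    b -= lr * np.mean(p_all - y_all)

# Accuracy on clean data and on fresh FGSM examples against the final model.
acc_clean = np.mean((sigmoid(X @ w + b) > 0.5) == y)
grad_x = np.outer(sigmoid(X @ w + b) - y, w)
X_test_adv = X + eps * np.sign(grad_x)
acc_adv = np.mean((sigmoid(X_test_adv @ w + b) > 0.5) == y)
print(acc_clean, acc_adv)
```

The key step is training on the union of clean and perturbed inputs, which corresponds to item (i) above: the attacker's perturbations are folded into training so the classifier stays accurate on attacked data.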
You may access the full text of the paper at https://www.mdpi.com/1424-8220/22/18/6905/htm