Explaining Deep Learning Outcomes of Cultural Heritage Image Analytics (Master's Thesis, Finished)


Cristina Illi


The thesis focuses on two aspects of Explainable Artificial Intelligence (XAI) applied to historic images. First, different deep learning methods that extract object and context information from images are examined. The challenge lies in the format of the historic images from the PIA project, which differs from that of contemporary machine learning datasets. Second, different explainability methods are studied and implemented. This requires determining which explanations should be given to which audiences, in which format, and at what point in the workflow. Finally, the machine learning and explainability approaches are evaluated by means of a user study.
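The abstract does not name the specific explainability methods studied. As an illustration of the kind of technique commonly used in XAI for image models, the following is a minimal occlusion-sensitivity sketch in NumPy: a patch is slid over the image, and the drop in the model's score when each region is hidden yields a saliency heatmap. The `toy_score` function here is a hypothetical stand-in for a trained classifier's class score.

```python
import numpy as np

def occlusion_map(image, score_fn, patch=4, stride=4, fill=0.0):
    """Occlusion sensitivity: slide a patch over the image and record
    how much the model's score drops when that region is hidden."""
    h, w = image.shape
    base = score_fn(image)
    heat = np.zeros((h // stride, w // stride))
    for i in range(0, h - patch + 1, stride):
        for j in range(0, w - patch + 1, stride):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch] = fill
            heat[i // stride, j // stride] = base - score_fn(occluded)
    return heat

# Hypothetical stand-in for a trained classifier's class score:
# it simply responds to brightness in the image centre.
def toy_score(img):
    return float(img[8:16, 8:16].mean())

img = np.zeros((24, 24))
img[8:16, 8:16] = 1.0  # bright "object" in the centre
heat = occlusion_map(img, toy_score)
# High heatmap values mark regions whose occlusion hurts the score most,
# i.e. the parts of the image the model relies on.
```

Gradient-based methods (e.g. saliency maps or Grad-CAM) serve the same purpose for deep networks but require access to the model's internals, whereas occlusion treats the model as a black box.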

Start / End Dates

2021/05/31 - 2021/11/30

