Nested State Clouds: Distilling Knowledge Graphs from Contextual Embeddings

Bricman, Paul (2022) Nested State Clouds: Distilling Knowledge Graphs from Contextual Embeddings. Bachelor's Thesis, Artificial Intelligence.

Abstract

Interpretability techniques help ensure the safe deployment of deep learning (DL) models into production by providing practitioners with diverse debugging tools, yet the inner workings of large models remain elusive. In this work, we propose a novel interpretability technique that can be used to distill sparse knowledge graphs from a model's high-dimensional embeddings using conceptors. This technique, termed Nested State Clouds (NSC), takes advantage of the way state clouds of contextual embeddings are positioned relative to each other in latent space. For instance, "fruit" contextual embeddings appear to engulf "apple" ones, as the former cover not only the senses of the latter but additional ones as well. We successfully apply NSC to a pretrained masked language model and recover an ontology of concepts grounded in the model's latent space.
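As an illustrative aid only (the thesis itself details the actual method), the sketch below shows the standard conceptor construction from Jaeger's conceptor theory, C = R(R + aperture^-2 I)^-1 with R the correlation matrix of a state cloud, together with a Loewner-order test of whether one conceptor is nested inside another. The aperture value, the synthetic stand-in clouds, and the helper names (conceptor, is_nested) are assumptions for illustration, not taken from the thesis.

import numpy as np

def conceptor(states, aperture=10.0):
    # Conceptor of a state cloud given as an (n_samples, dim) array:
    # C = R (R + aperture^-2 I)^-1, with R the cloud's correlation matrix.
    X = np.asarray(states, dtype=float)
    R = X.T @ X / X.shape[0]
    return R @ np.linalg.inv(R + aperture ** -2 * np.eye(R.shape[0]))

def is_nested(c_specific, c_general, tol=1e-6):
    # Loewner-order test: the specific cloud counts as nested inside the
    # general one if c_general - c_specific is positive semidefinite
    # (up to a small numerical tolerance).
    diff = c_general - c_specific
    return bool(np.linalg.eigvalsh((diff + diff.T) / 2).min() >= -tol)

# Toy stand-ins for contextual embeddings: a broad "fruit" cloud and a
# tighter "apple" sub-cloud scaled into its interior.
rng = np.random.default_rng(0)
fruit_cloud = rng.normal(size=(500, 16))
apple_cloud = 0.3 * fruit_cloud[:250]
print(is_nested(conceptor(apple_cloud), conceptor(fruit_cloud)))  # expected: True

In this toy setup the "apple" cloud lies strictly inside the "fruit" cloud, so its conceptor ellipsoid is dominated by the "fruit" conceptor in the Loewner order; chaining such pairwise nesting tests over many concept clouds is one way a sparse is-a graph could be read off, under the assumptions stated above.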

Item Type: Thesis (Bachelor's Thesis)
Supervisor name: Jaeger, H. and Rij-Tange, J.C. van
Degree programme: Artificial Intelligence
Thesis type: Bachelor's Thesis
Language: English
Date Deposited: 14 Jul 2022 07:49
Last Modified: 14 Jul 2022 07:49
URI: https://fse.studenttheses.ub.rug.nl/id/eprint/27840
