We tackle three disparate topics in this thesis: graph embeddings, disentanglement, and algorithm maps.
In Chapters 2 and 3, we cover graph embeddings: methods which embed graphs as vectors in a structure-preserving way. We start from the principle of superposition used by vector-symbolic architectures and derive the tensor product as the canonical binding operation, along with a natural random code. Together, they form a novel embedding method, and we list the common graph operations our embeddings support. We give a precise characterization of the method's statistical behavior, showing that it achieves a packing upper bound, and go on to establish a link to adjacency matrices, with applications toward their drastic compression. Then, we compare our method to other bind-and-sum methods and showcase a fundamental memory vs. capacity tradeoff.
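To make the bind-and-sum idea concrete, here is a minimal sketch of a tensor-product edge embedding with random codes. The dimension, code distribution, and edge list are illustrative choices, not the parameters used in the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, d = 50, 1000  # illustrative sizes; d is the code dimension

# Random code: one i.i.d. Gaussian vector per node, scaled so ||x||^2 ≈ 1
codes = rng.normal(0.0, 1.0 / np.sqrt(d), size=(n_nodes, d))

edges = [(0, 1), (2, 3), (4, 5)]

# Bind each edge with the tensor (outer) product of its endpoint codes,
# then superpose the edges by summing
G = sum(np.outer(codes[u], codes[v]) for u, v in edges)

def edge_score(G, u, v):
    # Query edge membership: inner product of G with the query edge's
    # tensor product, i.e. x_u^T G x_v
    return codes[u] @ G @ codes[v]

edge_score(G, 0, 1)  # ≈ 1 (edge present)
edge_score(G, 0, 2)  # ≈ 0 (edge absent, up to crosstalk noise)
```

Note that G equals X^T A X for the (random) code matrix X and adjacency matrix A, which is one way to see the link to adjacency matrices mentioned above.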
In Chapter 4, we cover disentanglement: the discovery of semantically meaningful latent factors. We assume that data lives on a low-dimensional manifold and define disentanglement in terms of local charts. Exploring the consequences of this definition, we find that commutativity of the latent factors is an equivalent condition for disentanglement. Under our framework, we also show that sufficiently rich generative models can always be disentangled, and we apply the commutativity condition to learning a dictionary of Lie group operators. We conclude with a discussion of how our definition relates to previous work, suggesting that it is no accident that commutativity, which has been used as a computational or conceptual convenience, is so prevalent in these methods.
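A toy illustration of the commutativity condition, not the thesis's construction: acting on 2D points, translations along different axes commute with each other and so can serve as independent factors, whereas rotation and translation do not commute:

```python
import numpy as np

# Hypothetical one-parameter operators acting on 2D points
def translate_x(p, t):
    return p + np.array([t, 0.0])

def translate_y(p, s):
    return p + np.array([0.0, s])

def rotate(p, theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]]) @ p

p = np.array([1.0, 2.0])

# Translations along different axes commute...
a = translate_y(translate_x(p, 0.5), 0.3)
b = translate_x(translate_y(p, 0.3), 0.5)
np.allclose(a, b)  # True

# ...but rotation and translation do not
c = rotate(translate_x(p, 0.5), 0.7)
d = translate_x(rotate(p, 0.7), 0.5)
np.allclose(c, d)  # False
```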
In Chapter 5, we give some preliminary thoughts on the transfer of algorithms between different contexts. Our key insight is that, rather than focusing on the transfer of a single algorithm, it is more profitable to consider the conditions under which the states and actions of one setting can be mapped to those of another. This in turn induces a natural map of algorithms between contexts, and we give some basic consequences of these definitions.
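One hypothetical way to picture the induced map: given a state map between the two settings and an action map back, any algorithm in the second setting pulls back to one in the first by conjugation. The contexts and function names below are illustrative, not taken from the chapter:

```python
def induced_algorithm(alg, phi, psi_inv):
    """Given alg : S2 -> A2, a state map phi : S1 -> S2, and an action
    map psi_inv : A2 -> A1, return the pulled-back algorithm S1 -> A1."""
    return lambda s: psi_inv(alg(phi(s)))

# Toy contexts: states are temperatures in Celsius vs Fahrenheit;
# actions are "heat"/"cool" strings vs +1/-1 flags.
alg_f = lambda temp_f: "heat" if temp_f < 68 else "cool"  # algorithm in context 2
phi = lambda temp_c: temp_c * 9 / 5 + 32                  # state map, C -> F
psi_inv = lambda a: +1 if a == "heat" else -1             # action map back

alg_c = induced_algorithm(alg_f, phi, psi_inv)
alg_c(10)  # +1: 10 °C maps to 50 °F, below the 68 °F threshold
alg_c(30)  # -1: 30 °C maps to 86 °F, above the threshold
```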