Overview
Explore the concept of compositionality in vector space models of meaning in this comprehensive lecture. Delve into deep neural networks, simple tasks, and the role of grammar in language processing. Examine the differences between "apple" and "I" in linguistic contexts, and understand role-filler bindings. Analyze experimental results and gain insight into how meaning is constructed and represented in vector spaces. Enhance your understanding of computational linguistics and natural language processing through this in-depth presentation.
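As a rough orientation to the ideas named above, the sketch below (not from the lecture itself) shows two standard ways word vectors can be combined in a vector space model: simple additive composition and role-filler binding via a tensor (outer) product. All vectors, words, and role names here are made-up toy examples for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 4-dimensional "embeddings" for a couple of words (illustrative only).
apple = rng.normal(size=4)
red = rng.normal(size=4)

# 1. Additive composition: represent the phrase "red apple" as the sum of its parts.
red_apple_additive = red + apple

# 2. Role-filler binding via an outer (tensor) product: bind the filler "apple"
#    to a hypothetical AGENT role vector, keeping role and filler distinguishable.
agent_role = rng.normal(size=4)
agent_apple_binding = np.outer(agent_role, apple)

# Approximate unbinding: project the binding back onto the role vector to
# recover the filler (exact here because only one pair was bound).
recovered_filler = agent_role @ agent_apple_binding / (agent_role @ agent_role)
print(np.allclose(recovered_filler, apple))  # True
```

The tensor-product scheme is one of several binding mechanisms discussed in the compositionality literature; the lecture may use a different formalism.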
Syllabus
Introduction
Deep Neural Networks
Simple Tasks
Compositionality
Outline
Grammar
Apple vs I
Role-filler bindings
Recap
Results
Summary
Taught by
Santa Fe Institute