Cengiz is co-organizing a NeurIPS Workshop on Associative Memory & Hopfield Networks. See workshop page here: https://amhn.vizhub.ai/
---------------------------------------------------------------------------------------------------
Call for Papers
We invite submissions on novel research results (theoretical and empirical), software frameworks and abstractions, benchmarks, demos, visualizations, and work-in-progress research. Submissions should be 4-page papers (excluding references), submitted via OpenReview. Reviews will not be shared publicly.
Scope
Associative memory is defined as a network that can link a set of features into high-dimensional vectors, called memories. Prompted by a large enough subset of features taken from one memory, an animal or an AI network with an associative memory can retrieve the rest of the features belonging to that memory. Many diverse human cognitive abilities that involve making appropriate responses to stimulus patterns can be understood as the operation of an associative memory, with the memories often being distillations and consolidations of multiple experiences rather than merely corresponding to a single event.
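To make the retrieval-from-partial-cue idea concrete, here is a minimal sketch of a classical Hopfield network (Hopfield, 1982) in Python with NumPy. It stores bipolar patterns with a Hebbian outer-product rule and recovers a full memory from a corrupted cue; all function names and parameter choices below are illustrative, not taken from the workshop page.

```python
import numpy as np

def store(patterns: np.ndarray) -> np.ndarray:
    """Build the Hebbian weight matrix from bipolar (+/-1) patterns."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def retrieve(W: np.ndarray, cue: np.ndarray, steps: int = 10) -> np.ndarray:
    """Iterate the sign-of-local-field update until the state settles."""
    s = cue.copy()
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1.0  # break ties toward +1
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

rng = np.random.default_rng(0)
memories = rng.choice([-1.0, 1.0], size=(3, 64))  # 3 memories, 64 features
W = store(memories)

# Corrupt 10 of the 64 features of the first memory, then retrieve.
cue = memories[0].copy()
flipped = rng.choice(64, size=10, replace=False)
cue[flipped] *= -1
recovered = retrieve(W, cue)
print("fraction of features recovered:", (recovered == memories[0]).mean())
```

With only 3 random patterns in 64 dimensions the network is well below its storage capacity, so the corrupted cue typically converges back to the stored memory; the modern variants discussed below replace this Hebbian rule with trained energy functions that store far more memories.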
In the world of artificial neural networks, a canonical mathematical model of this phenomenon is the Hopfield network (Hopfield, 1982). Although often narrowly viewed as a model that can store and retrieve predefined verbatim memories of past events, its contemporary variants make it possible to store consolidated memories, turning individual experiences into useful representations of the training data. Such modern variants are often trained using the backpropagation algorithm and often benefit from superior memory storage properties (Krotov & Hopfield, 2016). Contemporary Hopfield networks can be used as submodules in larger AI networks solving a diverse set of tasks. The goal of this workshop is to discuss the existing and emerging developments of these ideas. The research topics of interest at this workshop include (but are not limited to):
- Novel architectures for associative memory, Hopfield Networks, Dense Associative Memories, and related models [Krotov & Hopfield (2016), Demircigil et al. (2017), Ramsauer et al. (2020), Millidge et al. (2022), Krotov (2021), Burns & Fukai (2023), Bricken & Pehlevan (2021)].
- Hybrid memory-augmented architectures, e.g., memory-augmented Transformers and RNNs, and networks with fast weight updates [Rae et al. (2019), Wu et al. (2022), Wang et al. (2023), Schlag et al. (2021)].
- Energy-based models and their applications [Hoover et al. (2023a), Hoover et al. (2023b), Ota & Taki (2023)].
- Training algorithms for energy-based or memory-based architectures [Du & Mordatch (2019)].
- Connections between associative memory and neuroscience, including both insights from neuroscience for better AI and AI-inspired neurobiological work [Krotov & Hopfield (2020), Whittington et al. (2021), Sharma et al. (2022), Tyulmankov et al. (2021)].
- Kernel methods and associative memories [Choromanski et al. (2020)].
- Theoretical properties of associative memories, with insights from statistical physics, contraction analysis, control theory, and related areas [Lucibello & Mezard (2023), Agliari et al. (2022)].
- Multimodal architectures with associative memories.
- Sequential Hopfield networks for temporal sequences [Karuvally et al. (2022), Chaudhry et al. (2023)].
- Other machine learning tasks (such as clustering and dimensionality reduction) with associative memories [Saha et al. (2023)].
- Energy-based Transformers [Hoover et al. (2023a)].
- Applications of associative memories and energy-based models to various data domains, such as language, images, sound, graphs, temporal sequences, and computational chemistry and biology [Widrich et al. (2020), Liang et al. (2022), Furst et al. (2022), Bricken et al. (2023), Tang & Kopp (2021), Sandler et al. (2022), Xu et al. (2022)].