
Emergence: A Library of Hopfield Networks

1 minute read

Welcome to Emergence

This is a research library to study the emergent properties of undirected neural networks, including Hopfield networks and Boltzmann machines.

Access the library here.

To read more about Hopfield networks, see the primer I wrote here.

Setup

Set up a local virtual environment and install the dependencies from requirements.txt.

python3 -m venv env
source env/bin/activate
pip3 install -r requirements.txt

See /tests for sample use cases.

Hopfield Networks

Hopfield networks are fascinating one-shot data denoisers. We train the network to “remember” the top row of Fashion-MNIST images using Hebbian learning. The network does not store the actual images; instead, it encodes information about them in its weights.
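
Concretely, the Hebbian rule builds the weight matrix by summing the outer products of the stored patterns and zeroing the diagonal. The sketch below illustrates the idea under a ±1 pixel convention; it is not the library's internal implementation, and hebbian_weights is a hypothetical name.

import numpy as np

def hebbian_weights(patterns):
    # patterns: list of flattened ±1 vectors to be "remembered"
    n = patterns[0].size
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)   # Hebbian outer-product update
    np.fill_diagonal(W, 0)    # no self-connections
    return W / len(patterns)  # scale by the number of stored patterns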

Then, we add noise to each image by randomly flipping 30% of its pixels to the opposite value. When we feed each noisy image into the pre-trained Hopfield network, we get the original image back (bottom row)!
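
As a rough sketch of that recovery step (again assuming ±1 pixels and synchronous updates, which may differ from what Hopfield.run does internally): flip a random 30% of the entries, then repeatedly replace the state with the sign of its weighted input until it stops changing. flip_pixels and recall are illustrative names, not part of the library.

import numpy as np

def flip_pixels(pattern, frac=0.3, seed=0):
    # Flip a random fraction of ±1 entries to the opposite value
    rng = np.random.default_rng(seed)
    noisy = pattern.copy()
    idx = rng.choice(noisy.size, int(frac * noisy.size), replace=False)
    noisy[idx] *= -1
    return noisy

def recall(W, state, max_iters=20):
    # Synchronous Hopfield updates until a fixed point is reached
    for _ in range(max_iters):
        new_state = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(new_state, state):
            break
        state = new_state
    return state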

Example on Fashion-MNIST: original (top row), noisy (middle row), and reconstructed (bottom row) images.

Use

from emergence.hopfield import Hopfield
from emergence.preprocess.preprocess_image import *
from tensorflow import keras

# Preprocess data: binarize a few Fashion-MNIST images and flatten them
fashion_mnist = keras.datasets.fashion_mnist
(train_images, train_labels), (test_images, test_labels) = fashion_mnist.load_data()
row, col = train_images[0].shape
data = train_images[5:9]
data = [normalize_binarize(i) for i in data]
data = [i.flatten() for i in data]

# Train network
hn = Hopfield()
hn.train(data)

# Get noisy data
noise_data = [noise_image(i, .3) for i in data]

# Run network on noisy data
data_hat = [hn.run(i, 1) for i in noise_data]
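
To eyeball the results, the flattened vectors can be reshaped back to row × col images and plotted. This is a minimal sketch assuming matplotlib is installed and that the entries of data, noise_data, and data_hat are NumPy arrays of length row * col.

import matplotlib.pyplot as plt

# Plot original (top), noisy (middle), and reconstructed (bottom) images
fig, axes = plt.subplots(3, len(data), figsize=(2 * len(data), 6))
for j, (orig, noisy, recon) in enumerate(zip(data, noise_data, data_hat)):
    for i, img in enumerate((orig, noisy, recon)):
        axes[i, j].imshow(img.reshape(row, col), cmap="gray")
        axes[i, j].axis("off")
plt.show()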

See /tests for more examples.
