# Discrete neural nets

This repository contains code and examples implementing the notion of a discrete neural net and polymorphic learning in Python. For more information on these notions, see [the corresponding preprint](https://arxiv.org/abs/2308.00677) on the arXiv.

### Git workflow

1) Fork the main repo (http://git.aten.cool/caten/DiscreteNeuralNets). This is different from just cloning; there is a button on Forgejo for this.
2) Clone your fork to your machine.
3) Make a branch on your machine.
4) At any time, you can push your changes to your personal fork of the repo.
5) On your machine, pull from the main repo's main branch.
6) On your machine, merge the main branch into your new branch.
7) Resolve any conflicts that arise, then merge your new branch into the main branch.
8) Push your work to your personal fork of the repo.
9) Make a pull request to pull your fork's main into the original repo's main.

### Project structure

The scripts that define the basic components of the system are in the `src` folder. These are:

* `arithmetic_operations.py`: Definitions of arithmetic operations modulo some positive integer. These are used to test the basic functionality of the `NeuralNet` class.
* `dominion.py`: Tools for creating dominions, a combinatorial object used in the definition of the dominion polymorphisms in `polymorphisms.py`.
* `dominion_setup.py`: Utilities for creating files of trees, dominions with those trees as constraint graphs, and the data for the corresponding polymorphisms.
* `graphs.py`: Utilities for creating and storing simple graphs, including randomly-generated trees.
* `mnist_training_binary.py`: Tools for manufacturing binary relations from the MNIST dataset which can be passed as arguments to the polymorphisms in `polymorphisms.py`.
* `neural_net.py`: Definition of the `NeuralNet` class, including feeding forward and learning. (A standalone sketch of this idea appears after the test listing below.)
* `operations.py`: Definitions pertaining to the `Operation` class, whose objects are to be thought of as operations in the sense of universal algebra/model theory.
* `polymorphisms.py`: Definitions of polymorphisms of the Hamming graph, as well as a neighbor function for the learning algorithm implemented in `neural_net.py`.
* `random_neural_net.py`: Tools for making `NeuralNet` objects with randomly-chosen architectures and activation functions.
* `relations.py`: Definitions pertaining to the `Relation` class, whose objects are relations in the sense of model theory.

The scripts that run various tests and example applications of the system are in the `tests` folder. These are:

* `src.py`: Allows horizontal imports from the sibling `src` folder (that is, it adds that folder to Python's module search path).
* `test_binary_relation_polymorphisms.py`: Examples of the basic functionality of the polymorphisms defined in `polymorphisms.py` when applied to binary relations.
* `test_dominion.py`: Examples of constructing and displaying dominions as defined in `dominion.py`.
* `test_dominion_setup.py`: Creates trees and dominions for use with dominion polymorphisms.
* `test_graphs.py`: Examples of creating graphs (including random trees) as defined in `graphs.py`.
* `test_mnist_training_binary.py`: Verification that MNIST training data is being loaded correctly from the training dataset.
* `test_neural_net.py`: Examples of creating `NeuralNet`s using activation functions from `arithmetic_operations.py` and the `RandomOperation` from `random_neural_net.py`.
* `test_relations.py`: Examples of the basic functionality of the `Relation`s defined in `relations.py`.
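To make the idea concrete, here is a minimal, self-contained sketch of a discrete neural net whose activation functions are arithmetic operations modulo a positive integer, in the spirit of `arithmetic_operations.py` and `neural_net.py`. The names (`add_mod`, `mul_mod`, `feed_forward`) and the layer encoding are illustrative assumptions and do not reflect the repository's actual `NeuralNet` API.

```python
# A minimal sketch (not the repository's actual API) of the idea behind a
# discrete neural net: every "activation function" is an operation on a finite
# set, here the integers modulo n, and feeding forward composes those
# operations layer by layer.

def add_mod(n):
    """Binary addition modulo n."""
    return lambda x, y: (x + y) % n

def mul_mod(n):
    """Binary multiplication modulo n."""
    return lambda x, y: (x * y) % n

def feed_forward(layers, inputs):
    """Apply each layer in turn.

    A layer is a list of (operation, input_indices) pairs; each node applies
    its operation to the chosen outputs of the previous layer.
    """
    values = list(inputs)
    for layer in layers:
        values = [op(*(values[i] for i in indices)) for op, indices in layer]
    return values

if __name__ == "__main__":
    n = 5
    # Two inputs -> a hidden layer of two nodes -> a single output node.
    layers = [
        [(add_mod(n), (0, 1)), (mul_mod(n), (0, 1))],
        [(add_mod(n), (0, 1))],
    ]
    print(feed_forward(layers, (3, 4)))  # [4], since ((3 + 4) + 3 * 4) % 5 == 4
```

Roughly speaking, learning in this discrete setting proceeds not by gradient descent but by replacing a node's operation with a "neighboring" operation, which is where the neighbor function in `polymorphisms.py` comes in.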
### Environment

This project should run in any Python 3 environment without configuration. It assumes that there is a project folder which contains these subdirectories: `src` (for source code), `tests` (for tests of basic functionality and examples), and `output` (for output JSON, image files, etc.). The `output` folder is listed in `.gitignore`, so it will not be present when the repository is cloned; it is created when a script that needs it is run.

### TODO

* Reincorporate the polymorphisms for the higher-arity analogues of the Hamming graph which Lillian coded.

### Thanks

Thanks to all the contributors to the original incarnation of this repository:

* Rachel Dennis
* Hussein Khalil
* Lillian Stolberg
* Kevin Xue
* Andrey Yao

Thanks also to the University of Rochester and the University of Colorado Boulder for supporting this project.