Neural networks
Neural Networks: Zero to Hero is a great intro. Building micrograd is fascinating too.
Transformer Neural Networks are worth understanding. RASPy is a nice tool.
Transformers from Scratch & Toy Models of Superposition are great reads.
I am trying to go through this list to truly understand transformer models.
Notes
- Neural Networks are great at identifying patterns in data. As a classic example, if you wanted to predict housing prices, you could build a dataset that maps features about houses (square feet, location, proximity to Caltrain, etc.) onto their actual prices, and then train a network to learn the complex relationship between features and pricing. Training happens by feeding the network features, letting it make a guess about the price, and then correcting the guess via backpropagation (see the first sketch after these notes).
- Convolutional Neural Networks work similarly, but with images. Instead of giving a CNN discrete features, you usually feed it the pixels of the image itself. Through a series of layers, the CNN builds up features on its own (traditionally things like edges and corners) and learns patterns in image data. For example, a CNN might be trained on a dataset that maps images onto labels, and then learn how to label new images on its own (see the second sketch after these notes).
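A minimal sketch of the housing-price example above: a tiny two-layer network trained with plain backpropagation in NumPy. The features, prices, layer sizes, and learning rate are invented here purely for illustration, not taken from any of the linked resources.

```python
# Minimal illustrative sketch: two-layer network + backpropagation on made-up housing data.
import numpy as np

rng = np.random.default_rng(0)

# Each row: [square feet (thousands), distance to Caltrain (km)] -- invented values
X = np.array([[1.2, 0.5], [2.5, 3.0], [0.8, 1.0], [3.0, 0.2]])
y = np.array([[0.9], [1.4], [0.6], [2.1]])          # price in $M, also invented

W1 = rng.normal(scale=0.5, size=(2, 8))             # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1))             # hidden -> output weights
b2 = np.zeros(1)
lr = 0.01

for step in range(5000):
    # Forward pass: the network makes a guess at the price.
    h = np.maximum(0, X @ W1 + b1)                   # ReLU hidden layer
    pred = h @ W2 + b2
    loss = np.mean((pred - y) ** 2)                  # how wrong the guess is

    # Backward pass: push the error back through the layers (backpropagation).
    d_pred = 2 * (pred - y) / len(X)
    dW2 = h.T @ d_pred
    db2 = d_pred.sum(axis=0)
    d_h = d_pred @ W2.T
    d_h[h <= 0] = 0                                  # ReLU gradient
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)

    # Correct the guess by nudging the weights against the gradient.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final loss:", loss)
```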
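And a minimal sketch of the CNN idea, assuming PyTorch is available: raw pixels go in, class labels come out, and the convolutional layers learn their own features. The layer sizes, image shape, and fake batch are illustrative assumptions, not from any linked project.

```python
# Minimal illustrative sketch: a tiny CNN mapping raw pixels to labels.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),    # early layers tend to pick up edges/corners
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 28x28 -> 14x14
    nn.Conv2d(8, 16, kernel_size=3, padding=1),   # later layers combine them into larger patterns
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 14x14 -> 7x7
    nn.Flatten(),
    nn.Linear(16 * 7 * 7, 10),                    # 10 labels, e.g. digits 0-9
)

images = torch.randn(32, 1, 28, 28)               # fake batch of grayscale images
labels = torch.randint(0, 10, (32,))              # fake labels, for illustration only

loss = nn.functional.cross_entropy(model(images), labels)
loss.backward()                                   # same backpropagation idea as the sketch above
print(loss.item())
```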
Links
- Neural Network from Scratch (Interactive) (HN)
- But what is a Neural Network? | Deep learning, chapter 1 (2017)
- A Neural Network Playground
- A Beginner's Guide To Understanding Convolutional Neural Networks
- Capsule Networks (CapsNets) – Tutorial
- Chris Olah explains neural nets
- How I Shipped a Neural Network on iOS with CoreML, PyTorch, and React Native - Detailed and awesome article.
- Generative Adversarial Networks (GANs) in 50 lines of code (PyTorch)
- Recurrent Neural Networks lecture by Yoshua Bengio
- Practical Advice for Building Deep Neural Networks
- Differentiable Architecture Search - Code for DARTS: Differentiable Architecture Search paper.
- TensorSpace.js - Neural network 3D visualization framework for building interactive and intuitive models in the browser; supports pre-trained deep learning models from TensorFlow, Keras, and TensorFlow.js.
- UIS-RNN - Library for the Unbounded Interleaved-State Recurrent Neural Network (UIS-RNN) algorithm, corresponding to the paper Fully Supervised Speaker Diarization.
- ONNX - Open Neural Network Exchange. (Scoring ONNX ML Models with Scala)
- DyNet - Dynamic Neural Network Toolkit.
- gonn - Building a simple neural network in Go.
- Neural Ordinary Differential Equations (2018) - Video explanation | Notes
- Neural Network framework in 25 LOC
- Learning and Processing over Networks (2019) - Workshop presented by Michaël Defferrard and Rodrigo Pena at the Applied Machine Learning Days in January 2019.
- The Next Generation of Neural Networks by Geoffrey Hinton (2007)
- Who Invented Backpropagation? (2014)
- Deep Learning in Neural Networks: An Overview (2015)
- Neural Networks (E01: introduction) (2018) - This series is intended as a light introduction to neural networks, with a focus on the task of classifying handwritten digits.
- Machine Learning for Beginners: An Introduction to Neural Networks (2019)
- A Recipe for Training Neural Networks (2019)
- Exploring Neural Networks with Activation Atlases (2019)
- Curated list of neural architecture search and related resources
- Weight Agnostic Neural Networks (2019) (HN)
- Geoffrey Hinton explains the evolution of neural networks (2019)
- Evolved Turing neural networks
- Intelligent Machinery paper by Alan Turing
- SRU - Training RNNs as Fast as CNNs.
- ODIN - Out-of-Distribution Detector for Neural Networks.
- Ask HN: What Neural Networks/Deep Learning Books Should I Read? (2019)
- Python vs Rust for Neural Networks (2019) (HN)
- Exploring Weight Agnostic Neural Networks (2019) (HN)
- Neural Networks, Types, and Functional Programming (2015)
- Glow - Compiler for Neural Network hardware accelerators.
- Go Neural Net Sandbox - Sandbox for personal experimentation in Go neural net training and Go AI.
- layer - Neural network inference the Unix way.
- XNNPACK - Highly optimized library of floating-point neural network inference operators for ARM, WebAssembly, and x86 (SSE2 level) platforms.
- LSTM implementation explained (2015)
- The Neural Process Family - Contains notebook implementations of the following Neural Process variants: Conditional Neural Processes (CNPs), Neural Processes (NPs), Attentive Neural Processes (ANPs).
- Notes on Neural Nets
- RNNoise - Recurrent neural network for audio noise reduction.
- Hacking Neural Networks - Short introduction on methods that use neural networks in an offensive manner.
- Distilling knowledge from Neural Networks to build smaller and faster models (2019)
- Neural Network Processing Neural Networks: An efficient way to learn higher order functions (2019)
- Building a neural net from scratch in Go (2017)
- SparseConvNet - Spatially-sparse convolutional neural network.
- Norse - Library to do deep learning with spiking neural networks.
- Tensor Programs I: Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes (2019) (Code)
- Single Headed Attention RNN: Stop Thinking With Your Head (2019) (HN)
- Visualizing the Loss Landscape of Neural Nets
- primitiv - Neural Network Toolkit.
- On the Relationship between Self-Attention and Convolutional Layers (2019) (Code) (Reddit)
- Implementation of a deep learning library in Futhark
- Using neural networks to solve advanced mathematics equations (2020)
- AlphaFold - Provides an implementation of the contact prediction network, associated model weights and CASP13 dataset as published in Nature. (Paper)
- Go Perceptron - Single / multi layer / recurrent neural network written in Golang.
- Temperature Scaling - Simple way to calibrate your neural network.
- Recurrent Geometric Networks for end-to-end differentiable learning of protein structure
- FixMatch: Simplifying Semi-Supervised Learning with Consistency and Confidence (2020) (Tweet)
- kapre - Keras Audio Preprocessors.
- Putting An End to End-to-End: Gradient-Isolated Learning of Representations (2019)
- Memory-Augmented Neural Networks for Machine Translation (2019)
- Have there been any important developments on content addressable memory since hopfield network? (neural networkish) (2020)
- G-Bert - Pre-training of Graph Augmented Transformers for Medication Recommendation.
- Two strange useless things to do with neural nets: a demonstration
- Understanding the Neural Tangent Kernel (2019)
- Cutting out the Middle-Man: Training and Evaluating Energy-Based Models without Sampling (Tweet)
- Haiku - JAX-based neural network library.
- Convolution in one dimension for neural networks (2020)
- Lucid - Collection of infrastructure and tools for research in neural network interpretability.
- Minkowski Engine - Auto-diff neural network library for high-dimensional sparse tensors.
- Generating MIDI melody from lyrics using LSTM-GANs (HN)
- Zoom In: An Introduction to Circuits (2020)
- Lagrangian Neural Networks (2020) (HN)
- Neural Tangents - Fast and Easy Infinite Neural Networks in Python.
- A Survey of Long-Term Context in Transformers (2020)
- OpenNMT-py - Open Source Neural Machine Translation in PyTorch.
- Deep Learning for Symbolic Mathematics (2019) (Paper)
- Google Brain AutoML
- Physics Informed Neural Networks - Data-driven Solutions and Discovery of Nonlinear Partial Differential Equations.
- An introduction to Bayesian neural networks (2020)
- PyTorch Neural Turing Machine
- PyTorch Neural Turing Machine 2
- Learning DAGs with Continuous Optimization (2020)
- Early Vision (2020) - Guided tour of the first five layers of InceptionV1, taxonomized into “neuron groups”.
- micrograd - Tiny autograd engine and a neural net library on top of it, potentially useful for educational purposes.
- MiniGrad - Minimal implementation of reverse-mode automatic differentiation (a.k.a. autograd / backpropagation) in pure Python.
- Learning from Small Neural Networks (2020)
- Neural Game Engine - Code to reproduce Neural Game Engine experiments and pre-trained models.
- Graph Convolutional Neural Network Approach to Antibiotic Discovery (2020) (HN)
- KPNNs - Knowledge-primed neural networks.
- ResNeSt - Split-Attention Network.
- Shortcut Learning in Deep Neural Networks (2020)
- Discourse-Aware Attention Model for Abstractive Summarization of Long Documents
- Perovskite neural trees (2020) (HN)
- RigNet: Neural Rigging for Articulated Characters (2020) (Reddit)
- Convolutional neural networks for artistic style transfer (2017)
- Certifiable Robustness to adversarial Attacks; What is the Point? | Nick Frosst (2020)
- LAG: Latent Adversarial Generator
- Towards improved generalization in few-shot classification (2019)
- Convolutional Neural Networks in One Dimension
- Neural Network Pruning (2020)
- Hyperbolic RNN in PyTorch
- deeplearn-rs - Deep learning in Rust.
- Neural networks trained to communicate with each other without any training data
- Classical Piano Composer - Allows you to train a neural network to generate MIDI music files that make use of a single instrument.
- Weight Standardization - Normalization method to accelerate micro-batch training.
- Teaching Machines to Draw (2017) (In action)
- Benchmarking Neural Network Robustness to Common Corruptions and Perturbations
- pix2code - Generating Code from a Graphical User Interface Screenshot.
- Gated Linear Networks (2019) (HN)
- Curve Detectors (2020)
- Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains
- Neural Networks and Deep Learning - What they are and how they work.
- Teaching physics to neural networks removes 'chaos blindness' (2020) (HN)
- Understanding Convolutional Neural Networks (Code) (HN)
- Business Card Neural Network (2020)
- Functional Neural Networks (2020)
- Attention Is All You Need (2017)
- Getting Artificial Neural Networks Closer to Animal Brains (2020)
- Foolbox Native - Python toolbox to create adversarial examples that fool neural networks in PyTorch, TensorFlow, and JAX.
- Genann - Minimal, well-tested library for training and using feedforward artificial neural networks (ANN) in C.
- High-Frequency Component Helps Explain the Generalization of Convolutional Neural Networks (2020)
- Awesome Pruning - Curated list of neural network pruning resources.
- Hopfield Networks is All You Need (2020) (Code) (Article) (HN)
- Sparse Networks from Scratch: Faster Training without Losing Performance (2019)
- Jigsaw Labs - Learn Neural Nets
- Implementing a Neural Network in C (Code)
- Web Neural Network API - Dedicated low-level API for neural network inference hardware acceleration. (Polyfill)
- Clarifying exceptions and visualizing tensor operations in deep learning code (2020)
- Tensor Sensor - Generate more helpful exception messages for numpy/pytorch matrix algebra expressions. (Tweet)
- Explaining RNNs without neural networks (2020)
- A visual explanation for regularization of linear models (2020)
- A Guide to Deep Learning and Neural Networks (2020)
- Handwriting Synthesis - Handwriting Synthesis with RNNs.
- How DeepMind learns physics simulators with Graph Networks (w/ author interview) (2020)
- Build Your Own Artificial Neural Network. It’s Easy! (2020)
- Neural Circuit Policies Enabling Auditable Autonomy
- FermiNet: Fermionic Neural Networks (Quantum Physics and Chemistry from First Principles (2020)) (Tweet)
- What is the Role of a Neuron?
- Marabou - SMT-based tool that can answer queries about a network’s properties by transforming these queries into constraint satisfaction problems.
- Demonstration of the attention mechanism with some toy experiments and explanations
- Augerino - Codebase for Learning Invariances in Neural Networks.
- ELI5 - Python package which helps to debug machine learning classifiers and explain their predictions.
- Cream of the Crop: Distilling Prioritized Paths For One-Shot Neural Architecture Search
- Brian2 - Free, open source simulator for spiking neural networks. (Web)
- Neural Networks from Scratch in Python
- Naszilla - Python library for neural architecture search (NAS).
- The Unreasonable Syntactic Expressivity of RNNs (2020)
- DeepMath Conference - Conference on the Mathematical Theory of DNNs. (HN)
- Coding a Neural Network: A Beginner's Guide (2020)
- Eiffel2 - Neural Network architecture Visualization tool.
- Graph Convolutional Neural Networks (GCNN) models
- Elegy - Neural Networks framework based on Jax inspired by Keras and Haiku.
- SpinalNet - Deep Neural Network with Gradual Input.
- Deeply-supervised Nets
- diagNNose - Python library that facilitates a broad set of tools for analysing hidden activations of neural models.
- MiniSom - Minimalistic implementation of the Self Organizing Maps.
- Basics of Convolution (2020)
- DeepGCNs: Can GCNs Go as Deep as CNNs?
- Tinn - 200 line dependency free neural network library written in C99.
- musicnn - Set of pre-trained musically motivated convolutional neural networks for music audio tagging.
- Convolution Is Fancy Multiplication (HN)
- Tools to Design or Visualize Architecture of Neural Network
- ENNUI - Elegant Neural Network User Interface. (Code)
- Dynamic Graph CNN for Learning on Point Clouds
- robustness - Library for experimenting with, training and evaluating neural networks, with a focus on adversarial robustness.
- JAX, M.D. - Accelerated, Differentiable, Molecular Dynamics. (Paper)
- Neural Reverse Engineering of Stripped Binaries using Augmented Control Flow Graphs
- NN SVG - Generate publication-ready NN-architecture schematics. (HN)
- e3nn - Modular framework for neural networks with Euclidean symmetry.
- Delve - Python package for visualizing deep learning model training.
- Graph Mining @ NeurIPS 2020 (Talks)
- jax_verify - Neural network verification in JAX.
- Self-supervised learning through the eyes of a child (2020) (Code)
- Object-based attention for spatio-temporal reasoning: Outperforming neuro-symbolic models with flexible distributed architectures (2020)
- Soft Threshold Weight Reparameterization for Learnable Sparsity (2020) (Code)
- Understanding the Difficulty of Training Transformers (2020) (Code)
- Awesome Implicit Neural Models
- Edit-distance as objective function papers - Curated list of papers dedicated to edit-distance as objective function.
- SuPar - Collection of state-of-the-art models for Dependency Parsing, Constituency Parsing and Semantic Dependency Parsing.
- Drawing early-bird tickets: Towards more efficient training of deep networks (2020) (Code)
- CountNet: Speaker Count Estimation using Deep Neural Networks
- Applications of Deep Neural Networks Course (2021) (Code)
- DDSL: Deep Differential Simplex Layer for Neural Networks
- Making sense of sensory input (2021)
- char-rnn - Multi-layer Recurrent Neural Networks (LSTM, GRU, RNN) for character-level language models in Torch.
- Awesome Equivariant Networks
- Named Tensor Notation (Code)
- Applications of Deep Neural Networks v2 (2020) (HN)
- Make Your Own Neural Network Blog
- Make Your Own Neural Network Book (Code)
- Representation Learning for Attributed Multiplex Heterogeneous Network (2019) (Code)
- Awesome VAEs - Curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
- Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting (2020) (Code)
- Training Neural Networks is ∃R-complete (2021)
- Geoff Hinton 2021 – How to represent part-whole hierarchies in a neural network (HN)
- Neural Network Matrix Visualization (2021)
- Multimodal Neurons in Artificial Neural Networks (2021) (HN) (Code)
- OpenAI Microscope - Collection of visualizations of every significant layer and neuron of several common “model organisms” which are often studied in interpretability.
- Real time Interactive Visualization of Convolutional Neural Networks in Unity
- Techniques for Reducing Overfitting (2021) (Tweet)
- Accelerating Neural Networks on Mobile and Web with Sparse Inference (2021)
- Quantization for Neural Networks (2020)
- Introduction to Automatic Hyperparameter Tuning
- Neural Networks Block Movement Pruning
- Torch-Dreams - Making neural networks more interpretable, for research and art.
- Are Deep Neural Networks Dramatically Overfitted? (2019) (HN)
- NASLib - Neural Architecture Search (NAS) library for facilitating NAS research for the community by providing interfaces to several state-of-the-art NAS search spaces and optimizers.
- Neural Network Visualization - Visualization of neural network architectures and parameters.
- Restricted Boltzmann Machine in Haskell
- X-Transformers - Simple but complete full-attention transformer with a set of promising experimental features from various papers. (HN)
- Introduction to Attention Mechanism (2021)
- Understanding Positional Encoding in Transformers (2021)
- Measuring XAI methods with Infidelity and Sensitivity (2021)
- Quasi-Recurrent Neural Networks (2016) (Code)
- deepdream.c - Experiment trying to implement Convolutional Neural Network inference and back-propagation using a minimal subset of C89 language and standard library features.
- Adapting Neural Networks for the Estimation of Treatment Effects (2018) (Code)
- Constructions in combinatorics via neural networks (2021) (Code)
- Artificial Neural Nets Finally Yield Clues to How Brains Learn (2021)
- Neural Additive Models: Interpretable Machine Learning with Neural Nets (2020) (Code)
- Neural-Backed Decision Trees (Code)
- Introduction to Neural Network Verification Book
- ERAN - ETH Robustness Analyzer for Deep Neural Networks.
- What are Transformer Neural Networks? (2021)
- Thinking Like Transformers (2021) (HN)
- TIL: Convolutional Filters Are Weights (2017)
- Solving Mixed Integer Programs Using Neural Networks (2020) (Tweet)
- What Are Convolutional Neural Networks? (2021)
- Fooling Neural Networks (HN)
- Introduction to Neural Network Verification (2021)
- Explainable neural networks that simulate reasoning (2021)
- Evolving Neural Networks through Augmenting Topologies (Code)
- Minimal, clean example of LSTM neural network training in Python, for learning purposes
- What nice mathematical results there are about neural networks? (2021)
- Temporal and Object Quantification Networks (2021) (Code)
- Encoding Events for Neural Networks (2021)
- Telestrations Neural Networks (2020)
- NN-SVG - Publication-ready NN-architecture schematics. (Web)
- Scientists built deep neural networks that can map between infinite dimensional spaces (2021)
- CNN Explainer - Interpreting Convolutional Neural Networks (2021)
- Transformers from Scratch (2019)
- Transformers from Scratch (2021) (HN) (HN)
- General and Scalable Parallelization for Neural Networks (2021)
- 8 Types of Activation Functions in Neural Networks (2021)
- Building a Neural Network in Go (2021)
- Echo State Networks in Python
- Neural Networks for Inference, Inference for Neural Networks (2019)
- Let's Play Distill: Building Blocks (2019) (Tweet)
- Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (2017) (Code)
- Neural Network From Scratch (2022) (HN)
- Noether’s Theorem, Symmetries, and Invariant Neural Networks
- CNNs and Equivariance - Part 1/2
- Building a Neural Network in Pure Lisp without Built-in Numbers using only Atoms and Lists (2022)
- Neural Methods in Simulation-Based Inference (2022)
- Avoiding Catastrophe: Active Dendrites Enable Multi-Task Learning in Dynamic Environments (2021)
- Computer scientists prove why bigger neural networks do better (2022) (HN)
- On the Difficulty of Extrapolation with NN Scaling (2022)
- Depth Estimation by Convolutional Neural Networks (2016) (Code)
- Generative Flow Networks (2022)
- Feature Learning in Infinite-Width Neural Networks (2021) (Code)
- µTransfer: A technique for hyperparameter tuning of enormous neural networks (2022)
- Restoring and attributing ancient texts using deep neural networks
- Reproducing Yann LeCun 1989 paper "Backpropagation Applied to Handwritten Zip Code Recognition"
- Deep Neural Nets: 33 years ago and 33 years from now (2022) (Reddit) (HN)
- ONNC - Open Neural Network Compiler. (Web)
- Provably robust neural networks - Method for training neural networks that are provably robust to adversarial attacks.
- Rust + WebAssembly + Neural Network
- Neural Networks are not the only universal approximators, so why are they so uniquely effective? (2022)
- Awesome Spiking Neural Networks
- Transformers for software engineers (2022) (HN)
- Exploring Neural Networks Visually in the Browser (2022)
- Neural Network Visualization in the Browser - Neural network library written from scratch in Rust along with a web-based application for building + training neural networks + visualizing their outputs. (Code)
- Sharpened Cosine Similarity - Alternative to convolution for neural networks.
- Google AI Blog: Controlling Neural Networks with Rule Representations (2022)
- Epistemic Neural Networks - Library for uncertainty representation and training in neural networks.
- Transformer in Triton - Implementation of a Transformer, but completely in Triton.
- This AI Does Not Exist - Generate realistic descriptions of made-up machine learning models. (Code)
- Perplexity - Notational language for documenting neural networks through diagrams.
- Meta-AF: Meta-Learning for Adaptive Filters
- Sequence Transduction with Recurrent Neural Networks (2021) (Code)
- Papers and Codes for the deep learning in hyperbolic space
- Friends don’t let friends train small diffusion models (2022) (HN)
- Physicists are building neural networks out of vibrations, voltages and lasers (2022) (HN)
- Techniques for Training Large Neural Networks (2022) (HN)
- How fast can we perform a forward pass? (2022) - How fast can you run a transformer model? (Tweet)
- Neural Network Loss Landscapes: What do we know? (2021) (HN)
- Logic Through the Lens of Neural Networks
- The spelled-out intro to neural networks and backpropagation: building micrograd (2022)
- What is the SOTA explanation for why deep learning works? (2022)
- Normalization effects on deep neural networks (2022)
- Delphi - Python, C++, and Rust library for Secure Deep Neural Network Inference.
- Game Emulation via Neural Network (2022)
- Neural Networks: Zero to Hero
- How are memories stored in neural networks? (2022)
- Building a neural network from scratch (just numpy/math) (2020)
- Polysemanticity in neural networks
- How does the Deep & Cross Network v2 find good feature interactions? (2022) (Tweet)
- 10 Days Of Grad: Deep Learning From The First Principles (Code)
- Tesla AI Day 2022 Neural Net
- Building makemore Part 2: MLP (2022) (Tweet)
- Mega: Moving Average Equipped Gated Attention (2022)
- Attention Is All You Need Paper Explained (Tweet)
- Neural Networks are Decision Trees (2022) (Tweet)
- What Makes Convolutional Models Great on Long Sequence Modeling?
- A Step by Step Backpropagation Example (2015)
- DecoMon - Automatic Certified Perturbation Analysis of Neural Networks.
- Directional Message Passing Neural Network (DimeNet and DimeNet++)
- Reverse Engineering a Neural Network's Clever Solution to Binary Addition (HN)
- Neural Networks and the Chomsky Hierarchy (2022) (HN)
- Sizing Up Neural Nets
- ManimML - Neural Network Animations with Python.
- C++ Neural Network in a Weekend (2020) (HN)
- Understanding and coding the self-attention mechanism of large language models (2023) (HN)
- Stitchable Neural Networks (2023) (Code)
- Transformer models: an introduction and catalog (2023) (HN) (Code)
- Scalable training for dense retrieval models
- Transformer Architecture from scratch in PyTorch
- Reimplementation of the Forward-Forward Algorithm
- Optical Transformers (2023) (HN)
- Toy neural network in C/Lua
- CHGNet: Pretrained universal neural network potential for charge-informed atomistic modeling (2023) (Code)
- Tuned Lens - Tools for understanding how transformer predictions are built layer-by-layer.
- Neural Networks: Zero To Hero (HN)
- Understanding LSTM Networks (2015)
- Neural Network library from scratch in Rust (Reddit)
- Scaling Transformer to 1M tokens and beyond with RMT (2023) (HN)
- Chaos Networks (2023)
- Architectures of Topological Deep Learning: A Survey on Topological Neural Networks (2023) (HN)
- Training Full Spike Neural Networks via Auxiliary Accumulation Pathway (2023) (Code)
- Language models can explain neurons in language models (2023) (HN)
- Ask HN: Can someone ELI5 Transformers and the “Attention is all we need” paper (2023)
- Structural Probes - Codebase for testing whether hidden states of neural networks encode discrete structures.
- A Gentle Introduction to 8-bit Matrix Multiplication for transformers at scale using transformers, accelerate and bitsandbytes (2022)
- Stanford Seminar - Transformers United: Introduction to Transformers (2023)
- Multitrack Music Transformer (2023) (Code)
- Attention is all you need (Transformer) - Model explanation (including math), Inference and Training (2023)
- The long story of how neural nets got to where they are (2023)
- Hello, Perceptron: An introduction to artificial neural networks
- What is a transformer model? (2022) (HN)
- Training Transformers with 4-bit Integers (2023)
- Faster Neural Networks Straight from JPEG (2018) (HN)
- How to Create a Neural Network (and Train it to Identify Doodles) (2022)
- Visualizing Attention in Transformers (2023)
- EfficientAT - Efficient Large-Scale Audio Tagging.
- Transformer Neural Networks, ChatGPT's foundation, Clearly Explained (2023)
- Rastermap - Visualization method for neural data.
- Growing Bonsai Networks with RNNs (2023)
- Watching Neural Networks Learn (2023)
- Visualizing Neural Networks (2023) (HN)
- Recursive Least Squares by predicting errors