Topology and Weight Evolving Artificial Neural Networks for Erlang
Evolutionary neural networks that evolve both topology and weights, now with Liquid Time-Constant (LTC) neurons for adaptive temporal processing. Based on DXNN2 by Gene Sher.
- First TWEANN library with LTC neurons in Erlang/OTP
- CfC closed-form approximation - ~100x faster than ODE-based LTC
- Rust NIF acceleration - Optional 30-200x speedup for fitness statistics, novelty search, selection
- Pure Erlang fallback - Works on any system without Rust toolchain
- Hybrid networks - Mix standard and LTC neurons in the same network
- Production ready - Comprehensive logging, error handling, and process safety
```erlang
%% Add to rebar.config
{deps, [{macula_tweann, "~> 0.17.0"}]}.
```
```erlang
%% Create and evolve a standard network
genotype:init_db(),
Constraint = #constraint{morphology = xor_mimic},
{ok, AgentId} = genotype:construct_agent(Constraint),
genome_mutator:mutate(AgentId).
```
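The quick-start calls can be chained. The sketch below reuses only the calls shown above and simply applies several successive mutations to the same agent; in a real run, fitness evaluation between mutations is handled by your evolution loop or an engine such as macula_neuroevolution.

```erlang
%% Sketch only: repeatedly mutate the agent created in the quick start.
%% A real run would evaluate fitness between mutations and keep or
%% discard each change accordingly.
evolve_n(_AgentId, 0) -> ok;
evolve_n(AgentId, N) when N > 0 ->
    genome_mutator:mutate(AgentId),
    evolve_n(AgentId, N - 1).
```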
```erlang
%% Use LTC dynamics directly
{NewState, Output} = ltc_dynamics:evaluate_cfc(Input, State, Tau, Bound).
```

Liquid Time-Constant neurons enable adaptive temporal processing with input-dependent time constants:
```erlang
%% CfC evaluation (fast, closed-form)
{State1, _} = ltc_dynamics:evaluate_cfc(1.0, 0.0, 1.0, 1.0),
{State2, _} = ltc_dynamics:evaluate_cfc(1.0, State1, 1.0, 1.0).
%% State persists between evaluations - temporal memory!
```

Key equations:
- LTC ODE: dx/dt = -[1/τ + f(x, I, θ)]·x + f(x, I, θ)·A
- CfC: x(t+Δt) = σ(-f)·x(t) + (1 - σ(-f))·h  (~100x faster, no ODE solver)
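As a plain-Erlang illustration of the CfC update above (a sketch of the formula, not the library's internal ltc_dynamics code; F stands in for f(x, I, θ) and H for the head h):

```erlang
%% Sketch of the CfC closed-form update: X' = sigma(-F)*X + (1 - sigma(-F))*H.
sigmoid(Z) -> 1.0 / (1.0 + math:exp(-Z)).

cfc_step(X, F, H) ->
    Gate = sigmoid(-F),
    Gate * X + (1.0 - Gate) * H.
```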
See the LTC Neurons Guide for details.
- Installation - Add to your project
- Quick Start - Basic usage
- LTC Neurons - Temporal dynamics
- LTC Usage Guide - Practical examples
- Architecture - System design
- Full Documentation - All guides and module docs
- Topology Evolution: Networks add/remove neurons and connections
- Weight Evolution: Synaptic weights optimized through selection
- Speciation: Behavioral diversity preservation (NEAT-style)
- Multi-objective: Pareto dominance optimization
- Temporal Memory: Neurons maintain persistent internal state (see the sketch after this list)
- Adaptive Dynamics: Input-dependent time constants
- CfC Mode: ~100x faster than ODE-based evaluation
- Hybrid Networks: Mix standard and LTC neurons
- Process Safety: Timeouts and crash handling
- Comprehensive Logging: Structured logging throughout
- Rust NIF (optional): High-performance network evaluation
- Mnesia Storage: Persistent genotype storage
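Building on the Temporal Memory and CfC Mode items above, here is a hedged sketch that carries CfC state across an input sequence using only ltc_dynamics:evaluate_cfc/4 from the quick start (Tau and Bound are fixed at 1.0 purely for illustration):

```erlang
%% Sketch: fold CfC state over a list of inputs, returning the final state.
%% Each call threads the previous state back in, giving temporal memory.
run_sequence(Inputs) ->
    lists:foldl(
        fun(Input, State) ->
            {NewState, _Output} = ltc_dynamics:evaluate_cfc(Input, State, 1.0, 1.0),
            NewState
        end,
        0.0, Inputs).
```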
This library is available in two editions:
| Feature | Community (hex.pm) | Enterprise |
|---|---|---|
| TWEANN topology evolution | Yes | Yes |
| LTC/CfC neurons | Yes | Yes |
| Weight evolution | Yes | Yes |
| Speciation | Yes | Yes |
| Rust NIF acceleration | No (pure Erlang) | Yes (30-200x faster) |
| Source code | Hex package only | Full repository |
The hex.pm package uses pure Erlang implementations for all algorithms. This is fully functional and suitable for:
- Learning and experimentation
- Small to medium populations (< 1000 individuals)
- Development and prototyping
```erlang
%% Check if NIFs are available
tweann_nif:is_loaded(). %% Returns false on Community Edition
```

Enterprise users with full source access can enable Rust NIF acceleration by:
- Installing the Rust toolchain (rustup.rs)
- Uncommenting the NIF hooks in rebar.config
- Building from source
NIF-accelerated functions include:
- fitness_stats/1 - Population statistics (30x faster)
- tournament_select/2 - Tournament selection (50x faster)
- roulette_select/3 - Roulette selection (40x faster)
- knn_novelty/4 - k-NN novelty search (200x faster)
- evaluate/2 - Network forward pass (100x faster)
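A common way to consume these is a dispatch that prefers the NIF when loaded and falls back to pure Erlang otherwise. The sketch below assumes the accelerated function is exposed as tweann_nif:fitness_stats/1; that placement and the fallback name fitness_stats_erl/1 are illustrative assumptions, not documented API.

```erlang
%% Hypothetical dispatch sketch: use the NIF when loaded, otherwise fall back
%% to a pure-Erlang implementation (names here are assumptions).
fitness_stats(Fitnesses) ->
    case tweann_nif:is_loaded() of
        true  -> tweann_nif:fitness_stats(Fitnesses);
        false -> fitness_stats_erl(Fitnesses)
    end.

%% Minimal pure-Erlang fallback computing {Min, Max, Mean}.
fitness_stats_erl(Fitnesses) ->
    {lists:min(Fitnesses), lists:max(Fitnesses),
     lists:sum(Fitnesses) / length(Fitnesses)}.
```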
Contact Macula.io for enterprise licensing.
Process-based neural networks with evolutionary operators. See Architecture Guide for details.
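For illustration only (not the library's actual neuron implementation), a process-based neuron can be pictured as an Erlang process that collects a vector of inputs, applies its weights and activation, and forwards the result to downstream processes:

```erlang
%% Illustrative neuron process: receives a full input vector, applies weights,
%% a bias, and tanh activation, then forwards the output to downstream pids.
neuron_loop(Weights, Bias, OutputPids) ->
    receive
        {inputs, Values} ->
            Sum = lists:sum([W * V || {W, V} <- lists:zip(Weights, Values)]) + Bias,
            Out = math:tanh(Sum),
            [Pid ! {signal, self(), Out} || Pid <- OutputPids],
            neuron_loop(Weights, Bias, OutputPids);
        stop ->
            ok
    end.
```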
```bash
rebar3 eunit     # Unit tests (858 tests)
rebar3 dialyzer  # Static analysis
rebar3 ex_doc    # Generate documentation
```

References:
- Sher, G.I. (2013). Handbook of Neuroevolution Through Erlang. Springer.
  - Primary reference for DXNN2 architecture and Erlang implementation patterns.
- Stanley, K.O. & Miikkulainen, R. (2002). Evolving Neural Networks through Augmenting Topologies. Evolutionary Computation, 10(2), 99-127.
  - Foundational NEAT paper introducing speciation and structural innovation protection.
- Stanley, K.O. & Miikkulainen, R. (2002). Efficient Evolution of Neural Network Topologies. Proceedings of the 2002 Congress on Evolutionary Computation (CEC).
  - Complexity analysis and efficiency improvements for topology evolution.
- Hasani, R., Lechner, M., et al. (2021). Liquid Time-constant Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 35(9), 7657-7666.
  - Introduces adaptive time-constant neurons with continuous-time dynamics.
- Hasani, R., Lechner, M., et al. (2022). Closed-form Continuous-time Neural Networks. Nature Machine Intelligence, 4, 992-1003.
  - CfC closed-form approximation enabling ~100x speedup over ODE-based LTC.
- Glorot, X. & Bengio, Y. (2010). Understanding the difficulty of training deep feedforward neural networks. Proceedings of AISTATS.
  - Xavier initialization theory used for network weight initialization.
- Holland, J.H. (1975). Adaptation in Natural and Artificial Systems. MIT Press.
  - Foundational text on genetic algorithms.
- Yao, X. (1999). Evolving Artificial Neural Networks. Proceedings of the IEEE, 87(9), 1423-1447.
  - Comprehensive survey of neuroevolution approaches.
- ONNX Consortium (2017-present). Open Neural Network Exchange.
  - Open standard for neural network interoperability enabling cross-platform inference.
Related projects:

- macula - HTTP/3 mesh networking platform with NAT traversal, Pub/Sub, and async RPC. Enables distributed neuroevolution across edge devices.
- macula_neuroevolution - Population-based evolutionary training engine that orchestrates neural network evolution using this library.
- DXNN2 - Gene Sher's original TWEANN implementation in Erlang and the foundation for this library.
- NEAT-Python - Popular Python implementation of NEAT.
- SharpNEAT - High-performance C# NEAT implementation.
- PyTorch-NEAT - Uber's PyTorch-based NEAT implementation.
- LTC/CfC Reference Implementation - MIT/ISTA reference implementation of LTC networks.
Apache License 2.0 - See LICENSE
Based on DXNN2 by Gene Sher. Adapted with LTC extensions by Macula.io.