# NEAT-Python Network Export and Analysis Tools

This directory contains tools for exporting, analyzing, and converting NEAT-Python neural networks to other frameworks.

NOTE: These conversion tools are a work in progress. If you find they do not work correctly, please open a GitHub issue.

## Overview

The export functionality allows you to:
1. Train a NEAT network and export it to JSON format
2. Analyze the exported network structure and properties
3. Convert the network to PyTorch, TensorFlow, or ONNX formats

## Usage Workflow

1. First, run `export_example.py` to train a simple XOR network and export it to JSON:
   ```bash
   python export_example.py
   ```
   This will create a `xor_winner.json` file containing the exported network.

2. Next, use `neat_analyzer.py` to analyze the exported network:
   ```bash
   python neat_analyzer.py xor_winner.json
   ```
   This will provide detailed statistics about the network structure and generate a Graphviz visualization.

3. Finally, use `neat_to_frameworks.py` to convert the network to other formats:
   ```bash
   python neat_to_frameworks.py xor_winner.json --format all
   ```
   This will create PyTorch, TensorFlow, and ONNX versions of the network.

## Converter Features

Convert NEAT-Python exported neural networks to PyTorch, TensorFlow, and ONNX formats:

- **Complete topology preservation**: Handles arbitrary network structures, not just layered architectures
- **All activation functions**: Supports sigmoid, tanh, relu, identity, sin, cos, abs, square, gauss, hat
- **Multiple targets**: Export to PyTorch, TensorFlow, or ONNX
- **Verified correctness**: NumPy reference implementation for validation
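
The NumPy reference implementation can be used to cross-check any converted model. Below is a minimal sketch comparing it against a PyTorch conversion, using the `NEATNetwork` API shown under "Python API" below (it assumes PyTorch is installed and that `xor_winner.json` has already been exported):

```python
import numpy as np
import torch
from neat_to_frameworks import NEATNetwork

net = NEATNetwork('xor_winner.json')
x = np.array([[0.0, 1.0]])

# Reference output from the pure-NumPy evaluator
reference = np.asarray(net.evaluate(x))

# Output from the converted PyTorch model
converted = net.to_pytorch()(torch.tensor(x, dtype=torch.float32)).detach().numpy()

# The two should agree to within floating-point tolerance
print(np.allclose(reference, converted, atol=1e-5))
```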

## Installing Dependencies

```bash
# For PyTorch support
pip install torch

# For TensorFlow support
pip install tensorflow

# For ONNX (requires PyTorch + onnxscript)
pip install torch onnxscript

# Or install everything at once
pip install torch tensorflow onnxscript
```

## Usage

### Command Line

```bash
# Convert to all formats
python neat_to_frameworks.py xor_winner.json --format all --output-dir ./models

# Convert to specific format
python neat_to_frameworks.py xor_winner.json --format pytorch
python neat_to_frameworks.py xor_winner.json --format tensorflow
python neat_to_frameworks.py xor_winner.json --format onnx

# Test the network
python neat_to_frameworks.py xor_winner.json --test
```

### Python API

```python
from neat_to_frameworks import NEATNetwork

# Load network (created by export_example.py)
net = NEATNetwork('xor_winner.json')

# Test with numpy (always available)
import numpy as np
inputs = np.array([[0.0, 1.0]])
outputs = net.evaluate(inputs)
print(outputs)

# Convert to PyTorch
pytorch_model = net.to_pytorch()
import torch
output = pytorch_model(torch.tensor([[0.0, 1.0]], dtype=torch.float32))

# Convert to TensorFlow
tf_model = net.to_tensorflow()
import tensorflow as tf
output = tf_model(tf.constant([[0.0, 1.0]], dtype=tf.float32))

# Export to ONNX
net.to_onnx('model.onnx')  # Uses PyTorch's default opset (cleanest, no warnings)

# Or specify opset version for specific compatibility needs:
# net.to_onnx('model.onnx', opset_version=17)  # ONNX Runtime 1.13+
# net.to_onnx('model.onnx', opset_version=13)  # ONNX Runtime 1.10+

# Save models
net.save_pytorch('model.pth')
net.save_tensorflow('model_tf')  # SavedModel format (default)
# Or: net.save_tensorflow('model', format='keras')  # Keras format
```

## ONNX Opset Versions

The converter defaults to `None` (letting PyTorch choose the opset), which produces the cleanest export with no version-conversion warnings.

| Opset | Use Case |
|-------|----------|
| None (default) | **Recommended** - PyTorch chooses the best version, no warnings |
| 18-19 | Modern ONNX Runtime (1.14+); most runtimes support this |
| 17 | ONNX Runtime 1.13+ |
| 13-15 | Older ONNX Runtime (1.10+) |
| 11 | Legacy systems only |

**Note:** Most modern ONNX runtimes (2022+) support opset 18+. Only specify an older opset if deploying to legacy systems.
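
Once exported, the model can be run with ONNX Runtime. A minimal sketch (ONNX Runtime is an extra dependency, `pip install onnxruntime`, and the file name follows the example above):

```python
import numpy as np
import onnxruntime as ort

# Load the exported model
session = ort.InferenceSession('model.onnx')

# Look up the input name rather than hard-coding it
input_name = session.get_inputs()[0].name
outputs = session.run(None, {input_name: np.array([[0.0, 1.0]], dtype=np.float32)})
print(outputs[0])
```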

## Network Computation Model

NEAT networks compute each node as follows:

```
For each node (in topological order):
    1. Aggregate: weighted_sum = Σ(weight_i × input_i)
    2. Scale: scaled = response × weighted_sum
    3. Bias: biased = scaled + bias
    4. Activate: output = activation(biased)
```

This differs from standard neural networks in a few ways:
- **Arbitrary topology**: Not restricted to layers
- **Response multiplier**: Additional scaling parameter (usually 1.0)
- **Topological evaluation**: Nodes computed in dependency order, not layer order
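
A minimal NumPy sketch of this per-node rule for a hypothetical tiny network (two inputs, one hidden node, one output; the node keys, weights, biases, and the plain logistic sigmoid are illustrative, not taken from a real export):

```python
import numpy as np

# Hypothetical topology: inputs -1 and -2, hidden node 1, output node 0
nodes = {
    1: {'bias': 0.5, 'response': 1.0},
    0: {'bias': -0.3, 'response': 1.0},
}
connections = {
    (-1, 1): 0.8, (-2, 1): -1.2,  # inputs -> hidden
    (-1, 0): 0.4, (1, 0): 2.0,    # input and hidden -> output
}

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def evaluate(x1, x2):
    values = {-1: x1, -2: x2}
    for node_id in (1, 0):  # topological order: hidden node before output node
        params = nodes[node_id]
        # 1. Aggregate incoming weighted inputs
        agg = sum(w * values[src] for (src, dst), w in connections.items() if dst == node_id)
        # 2. Scale by response, 3. add bias, 4. apply activation
        values[node_id] = sigmoid(params['response'] * agg + params['bias'])
    return values[0]

print(evaluate(0.0, 1.0))
```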

## TensorFlow Model Formats

The converter supports two TensorFlow save formats:

1. **SavedModel format** (default): `net.save_tensorflow('model_tf')`
   - Best for deployment (TF Serving, TFLite, TensorFlow.js)
   - Creates a directory with the saved model
   - Load with: `tf.saved_model.load('model_tf')`

2. **Keras format**: `net.save_tensorflow('model', format='keras')`
   - Native Keras format (.keras file)
   - More compact single file
   - Load with: `tf.keras.models.load_model('model.keras')`
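
Loading the saved models back uses the standard TensorFlow APIs. A short sketch (file and directory names follow the examples above; if the converter defines a custom model class, `load_model` may additionally need that class importable):

```python
import tensorflow as tf

# SavedModel format: load the exported directory
loaded = tf.saved_model.load('model_tf')
print(list(loaded.signatures.keys()))  # serving signatures, if any were exported

# Keras format: load the single .keras file and run it on one input
model = tf.keras.models.load_model('model.keras')
print(model(tf.constant([[0.0, 1.0]], dtype=tf.float32)))
```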

## Key Differences from Standard Networks

| Aspect | Standard NN | NEAT |
|--------|-------------|------|
| Structure | Layered | Arbitrary DAG |
| Weights | Dense matrices | Sparse connections |
| Evaluation | Layer-by-layer | Topological order |
| Node params | Bias only | Bias + response |

## Example: XOR Network

The `export_example.py` script will generate an XOR network with:
- 2 input nodes (keys: -1, -2)
- 2 hidden nodes (keys: variable)
- 1 output node (key: 0)
- Several enabled connections

The exact structure will vary depending on the NEAT evolution, but the evaluation order will always follow the topological sorting of the network nodes.
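
To sanity-check a trained winner, you can evaluate it on the full XOR truth table with the NumPy reference. A sketch (it assumes `evaluate` accepts a batch of input rows, as in the Python API example above):

```python
import numpy as np
from neat_to_frameworks import NEATNetwork

net = NEATNetwork('xor_winner.json')
truth_table = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])

# A well-trained XOR network should output values close to 0, 1, 1, 0
for row, out in zip(truth_table, net.evaluate(truth_table)):
    print(row, '->', out)
```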

## Network Analysis

The `neat_analyzer.py` script provides detailed analysis of exported networks, including:
- Node and connection statistics
- Network depth and critical path analysis
- Complexity metrics
- Graphviz visualization generation

To analyze a network:
```bash
python neat_analyzer.py xor_winner.json
```

This will generate a detailed report and a Graphviz DOT file that can be converted to an image:
```bash
dot -Tpng xor_winner.dot -o network.png
```

## Extending the Converter

To add custom activation functions:

1. Add the new function to the `_numpy_activation()` method
2. Add the corresponding PyTorch operation in `to_pytorch()`
3. Add the corresponding TensorFlow operation in `to_tensorflow()`

Example:
```python
# In _numpy_activation()
'my_activation': lambda x: np.custom_function(x)

# In PyTorch forward()
elif act_name == 'my_activation':
    node_values[node_id] = custom_torch_function(agg)

# In TensorFlow call()
elif act_name == 'my_activation':
    node_values[node_id] = custom_tf_function(agg)
```

## Limitations

1. **Custom activations**: If your NEAT config uses custom activation functions, you'll need to implement them in PyTorch/TensorFlow
2. **Network size**: Very large networks may be slow to evaluate due to sequential node computation
3. **Recurrent networks**: This converter handles feedforward networks only (no recurrent connections)

## Troubleshooting

**Q: Getting "Cycle detected" error?**
A: Your network has recurrent connections. NEAT-Python's feedforward mode should prevent this, but check your configuration.

**Q: Different outputs from NEAT-Python vs converted model?**
A: Verify:
- Same input preprocessing
- Same activation functions
- No custom aggregation functions
- Response values are all 1.0 (or correctly handled)
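
One way to localize such a discrepancy is to compare the original NEAT-Python network against the NumPy reference on identical inputs. A sketch (it assumes you still have the winning genome and config objects from training, referred to here as `winner` and `config`):

```python
import numpy as np
import neat
from neat_to_frameworks import NEATNetwork

# `winner` (the best genome) and `config` are assumed to come from the training run
neat_net = neat.nn.FeedForwardNetwork.create(winner, config)
exported = NEATNetwork('xor_winner.json')

for xi in [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]:
    original = neat_net.activate(xi)
    converted = exported.evaluate(np.array([xi]))
    print(xi, original, converted)
```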

**Q: ImportError for torch/tensorflow?**
A: Install the required framework:
```bash
pip install torch       # for PyTorch/ONNX
pip install tensorflow  # for TensorFlow
```

**Q: FileNotFoundError for xor_winner.json?**
A: Make sure you've run `python export_example.py` first to generate the exported network file.