architecture/network

architecture/network/network.activate.ts

activateBatch

(inputs: number[][], training: boolean) => number[][]

Activate the network over a mini‑batch (array) of input vectors, returning a 2‑D array of outputs.

This helper simply loops, invoking {@link Network.activate} (or its bound variant) for each sample. It is intentionally naive: no attempt is made to fuse operations across the batch. For very large batch sizes or performance‑critical paths, consider implementing a custom vectorized backend that exploits SIMD, GPU kernels, or parallel workers.

Input validation occurs per row to surface the earliest mismatch with a descriptive index.

Parameters:

Returns: 2‑D array: outputs[i] is the activation result for inputs[i].
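
Example (illustrative; `net` stands for a constructed Network instance exposing this helper as a method):

```ts
// Forward a mini-batch through the network without recording training traces.
const batch: number[][] = [
  [0, 0],
  [0, 1],
  [1, 0],
];
const outputs: number[][] = net.activateBatch(batch, false);
// outputs[i] is the activation result for batch[i].
```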

activateRaw

(input: number[], training: boolean, maxActivationDepth: number) => number[]

Thin semantic alias to the network's main activation path.

At present this simply forwards to {@link Network.activate}. The indirection is useful for:

Parameters:

Returns: Implementation-defined result of Network.activate (typically an output vector).

NetworkInternals

Runtime interface for accessing Network internal properties during activation.

noTraceActivate

(input: number[]) => number[]

Network activation helpers (forward pass utilities).

This module provides progressively lower‑overhead entry points for performing forward propagation through a {@link Network}. The emphasis is on:

  1. Educational clarity – each step is documented so newcomers can follow the life‑cycle of a forward pass in a neural network graph.
  2. Performance – fast paths avoid unnecessary allocation and bookkeeping when gradients / evolution traces are not needed.
  3. Safety – pooled buffers are never exposed directly to the public API.
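
For instance, an inference-only caller can skip trace bookkeeping entirely (sketch; `net` is a constructed Network exposing these helpers as methods):

```ts
// Trace-free forward pass: cheaper, no gradient / eligibility bookkeeping.
const prediction = net.noTraceActivate([0.5, 0.25]);

// Traced forward pass: records the state needed for learning.
const traced = net.activate([0.5, 0.25], true);
```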

Exported functions:

Design terminology used below:

architecture/network/network.connect.ts

connect

(from: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/node").default, to: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/node").default, weight: number | undefined) => import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/connection").default[]

Network structural mutation helpers (connect / disconnect).

This module centralizes the logic for adding and removing edges (connections) between nodes in a {@link Network}. By isolating the book‑keeping here we keep the primary Network class lean and ensure consistent handling of:

Exported functions:

Key terminology:

disconnect

(from: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/node").default, to: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/node").default) => void

Remove (at most) one directed connection from source 'from' to target 'to'.

Only a single direct edge is removed because typical graph configurations maintain at most one logical connection between a given pair of nodes (excluding potential future multi‑edge semantics). If the target edge is gated we first call {@link Network.ungate} to maintain gating invariants (ensuring the gater node's internal gate list remains consistent).

Algorithm outline:

  1. Choose the correct list (selfconns vs connections) based on whether from === to.
  2. Linear scan to find the first edge with matching endpoints.
  3. If gated, ungate to detach gater bookkeeping.
  4. Splice the edge out; exit loop (only one expected).
  5. Delegate per‑node cleanup via from.disconnect(to) (clears reverse references, traces, etc.).
  6. Mark structural caches dirty for lazy recomputation.

Complexity:

Idempotence: If no such edge exists we still perform node-level disconnect and flag caches dirty – this conservative approach simplifies callers (they need not pre‑check existence).
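
A condensed sketch of the outline above (field names such as _topoDirty are illustrative; the real implementation lives in this module):

```ts
// Simplified disconnect following steps 1-6 (types loosened for brevity).
function disconnectSketch(net: any, from: any, to: any): void {
  const list = from === to ? net.selfconns : net.connections; // 1. pick list
  for (let i = 0; i < list.length; i++) {
    const edge = list[i];
    if (edge.from === from && edge.to === to) { // 2. linear scan
      if (edge.gater != null) net.ungate(edge); // 3. detach gater bookkeeping
      list.splice(i, 1);                        // 4. splice edge out
      break;                                    //    only one edge expected
    }
  }
  from.disconnect(to);   // 5. per-node cleanup (reverse refs, traces)
  net._topoDirty = true; // 6. flag structural caches dirty (illustrative flag)
}
```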

Parameters:

NetworkInternals

Runtime interface for accessing Network internal properties during connection operations.

architecture/network/network.deterministic.ts

getRandomFn

() => (() => number) | undefined

Retrieve the active random function reference (for testing, instrumentation, or swapping).

Mutating the returned function's closure variables (if any) is not recommended; prefer using higher-level APIs (setSeed / restoreRNG) to manage state.

Parameters:

Returns: Function producing numbers in [0,1). May be undefined if never seeded (call setSeed first).

getRNGState

() => number | undefined

Get the current internal 32‑bit RNG state value.

Parameters:

Returns: Unsigned 32‑bit state integer or undefined if generator not yet seeded or was reset.

network.deterministic

Default export bundle for convenient named imports.

NetworkInternals

Runtime properties accessed for deterministic PRNG operations. Local interface to avoid circular dependencies.

restoreRNG

(fn: () => number) => void

Restore a previously captured RNG function implementation (advanced usage).

This does NOT rehydrate _rngState (it explicitly sets it to undefined). Intended for scenarios where a caller has serialized a full RNG closure by custom means or wants to inject a deterministic stub. If you only need to restore the raw state word produced by {@link snapshotRNG}, prefer {@link setRNGState} instead.

Parameters:

RNGSnapshot

Deterministic pseudo‑random number generation (PRNG) utilities for {@link Network}.

Why this module exists:

Implementation notes:

Public surface:

Design rationale:

setRNGState

(state: number) => void

Explicitly set (override) the internal 32‑bit RNG state without changing the generator function.

This is a low‑level operation; typical clients should call {@link setSeed}. Provided for advanced replay functionality where the same PRNG algorithm is assumed but you want to resume exactly at a known state word.

Parameters:

setSeed

(seed: number) => void

Seed the internal PRNG and install a deterministic random() implementation on the Network instance.

Process:

  1. Coerce the provided seed to an unsigned 32‑bit integer (>>> 0) for predictable wraparound behavior.
  2. Define an inline closure that advances an internal 32‑bit state (sketched below) using:
    a. A Weyl increment (adding the constant 0x6D2B79F5 each call), ensuring full‑period traversal of the 32‑bit space when combined with mixing.
    b. Two rounds of xorshift / integer mixing (xor, shifts, multiplications) to decorrelate bits.
    c. Normalization to [0,1) by dividing the final 32‑bit unsigned integer by 2^32.
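
A self-contained sketch of a generator matching this description (a mulberry32-style mixer; the library's exact mixing rounds may differ):

```ts
// Weyl increment + two integer-mixing rounds, normalized to [0,1).
function makeRandom(seed: number): () => number {
  let state = seed >>> 0; // step 1: coerce to unsigned 32-bit
  return () => {
    state = (state + 0x6d2b79f5) >>> 0;           // 2a. Weyl increment
    let t = state;
    t = Math.imul(t ^ (t >>> 15), t | 1);         // 2b. mixing round 1
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);    //     mixing round 2
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296; // 2c. normalize by 2^32
  };
}
```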

Bit-mixing explanation (rough intuition):

Parameters:

snapshotRNG

() => import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network/network.deterministic").RNGSnapshot

Capture a snapshot of the RNG state together with the network's training step.

Useful for implementing speculative evolutionary mutations where you may revert both the structural change and the randomness timeline when rejecting a candidate.

Parameters:

Returns: Object containing current training step & 32‑bit RNG state (both possibly undefined if unseeded).
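
Usage sketch pairing snapshotRNG with setRNGState for speculative changes (the snapshot field name rngState and the helpers around it are assumptions):

```ts
// Capture the randomness timeline before trying a candidate mutation.
const snap = net.snapshotRNG();

net.mutate(mutation.ADD_NODE); // illustrative structural change

if (!acceptCandidate(net)) {
  // Rejected: rewind the PRNG so subsequent draws replay identically.
  if (snap.rngState !== undefined) net.setRNGState(snap.rngState);
}
```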

architecture/network/network.evolve.ts

CostFunction

(target: number[], output: number[]) => number

Cost function type: takes target and output arrays, returns error scalar.

CostFunctionOrRef

Cost function OR a serializable reference with just the name (for worker threads).

EvolutionConfig

Internal evolution configuration summary (for potential logging / debugging) capturing normalized option values used by the local evolutionary loop.

evolveNetwork

(set: TrainingSample[], options: EvolveOptions) => Promise<{ error: number; iterations: number; time: number; }>

Evolve (optimize) the current network's topology and weights using a NEAT-like evolutionary loop until a stopping criterion (target error or max iterations) is met.

High-level process:

  1. Validate dataset shape (input/output vector sizes must match network I/O counts).
  2. Normalize / default option values and construct an internal configuration summary.
  3. Build appropriate fitness evaluation function (single or multi-thread).
  4. Initialize a Neat population (optionally with speciation) seeded by this network.
  5. Iteratively call neat.evolve():
    • Retrieve fittest genome + its fitness.
    • Derive an error metric from fitness (inverse relationship considering complexity penalty).
    • Track best genome overall (elitism) and perform logging/scheduling callbacks.
    • Break if error criterion satisfied or iterations exceeded.
  6. Replace this network's internal structural arrays with the best discovered genome's (in-place upgrade).
  7. Cleanup any worker threads and report final statistics.

Fitness / Error relationship: fitness = -error - complexityPenalty, hence error = -(fitness + complexityPenalty). We recompute error from the stored fitness plus the penalty to ensure consistent reporting.

Resilience strategies:

Parameters:

Returns: Summary object { error, iterations, time(ms) }.
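
Usage sketch (assuming a public net.evolve wrapper delegating to evolveNetwork; the option names shown are assumptions matching the stopping criteria described above):

```ts
const xorSet = [
  { input: [0, 0], output: [0] },
  { input: [0, 1], output: [1] },
  { input: [1, 0], output: [1] },
  { input: [1, 1], output: [0] },
];

// Evolve until error <= 0.03 or 1000 iterations elapse, whichever comes first.
const result = await net.evolve(xorSet, { iterations: 1000, error: 0.03 });
console.log(result.error, result.iterations, result.time);
```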

EvolveOptions

Evolution options for the evolveNetwork method. Includes core evolution parameters plus NEAT-specific options passed through.

TrainingSample

A single supervised training example used to evaluate fitness.

architecture/network/network.gating.ts

gate

(node: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/node").default, connection: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/connection").default) => void

Gating & node removal utilities for {@link Network}.

Gating concept:

Removal strategy (removeNode):

Mutation interplay:

Determinism note:

Exported functions:

NetworkGatingProps

Internal Network properties accessed during gating operations.

removeNode

(node: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/node").default) => void

Remove a hidden node from the network while attempting to preserve functional connectivity.

Algorithm outline:

  1. Reject removal if node is input/output (structural invariants) or absent (error).
  2. Optionally collect gating nodes (if keep_gates flag) from inbound & outbound connections.
  3. Remove self-loop (if present) to simplify subsequent edge handling.
  4. Disconnect all inbound edges (record their source nodes) and all outbound edges (record targets).
  5. For every (input predecessor, output successor) pair create a new connection unless:
    a. input === output (avoid trivial self loops), OR
    b. an existing projection already connects them.
  6. Reassign preserved gater nodes randomly onto newly created bridging connections.
  7. Ungate any connections that were gated BY this node (where node acted as gater).
  8. Remove node from network node list and flag node index cache as dirty.

Complexity summary:

Preservation rationale:

Parameters:

ungate

(connection: Connection) => void

Remove gating from a connection, restoring its static weight contribution.

Idempotent: If the connection is not currently gated, the call performs no structural changes (and optionally logs a warning). After ungating, the connection's weight will be used directly without modulation by a gater activation.

Complexity: O(n) where n = number of gated connections (indexOf lookup) – typically small.
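
Usage sketch (nodeA, nodeB, and gaterNode are illustrative node references; Network method wrappers for this module's helpers are assumed):

```ts
// Gate an edge so its effective weight is modulated by gaterNode's activation.
const [edge] = net.connect(nodeA, nodeB); // connect returns an array of edges
net.gate(gaterNode, edge);

// Later: restore the static weight contribution.
net.ungate(edge);
```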

Parameters:

architecture/network/network.genetic.ts

ConnectionGene

Connection gene descriptor used during crossover.

ConnectionGeneticProps

Extended Connection properties for genetic operations.

crossOver

(network1: Network, network2: Network, equal: boolean) => Network

NetworkGeneticProps

Internal runtime properties used during genetic operations. These properties are accessed dynamically and not part of the formal Network interface.

architecture/network/network.mutate.ts

_addBackConn

() => void

ADD_BACK_CONN: Add a backward (recurrent) connection (acyclic mode must be off).

_addConn

() => void

ADD_CONN: Add a new forward (acyclic) connection between two previously unconnected nodes. Recurrent edges are handled separately by ADD_BACK_CONN.

_addGate

() => void

ADD_GATE: Assign a random (hidden/output) node to gate a random ungated connection.

_addGRUNode

() => void

ADD_GRU_NODE: Replace a random connection with a minimal 1‑unit GRU block.

_addLSTMNode

() => void

ADD_LSTM_NODE: Replace a random connection with a minimal 1‑unit LSTM block (macro mutation).

_addNode

() => void

ADD_NODE: Insert a new hidden node by splitting an existing connection.

Deterministic test mode (config.deterministicChainMode):

Standard evolutionary mode:

Core algorithm (stochastic variant):

  1. Pick connection (random).
  2. Disconnect it (preserve any gater reference).
  3. Create hidden node (random activation mutation).
  4. Insert before output tail to preserve ordering invariants.
  5. Connect source→hidden and hidden→target.
  6. Reassign gater uniformly to one of the new edges.

_addSelfConn

() => void

ADD_SELF_CONN: Add a self loop to a random eligible node (only when cycles allowed).

_batchNorm

() => void

BATCH_NORM: Placeholder mutation – marks a random hidden node with a flag for potential future batch normalization integration. Currently a no-op beyond tagging.

_modActivation

(method: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network/network.mutate").MutationMethod | undefined) => void

MOD_ACTIVATION: Swap activation (squash) of a random eligible node; may exclude outputs.

_modBias

(method: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network/network.mutate").MutationMethod | undefined) => void

MOD_BIAS: Delegate to node.mutate to adjust bias of a random non‑input node.

_modWeight

(method: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network/network.mutate").MutationMethod | undefined) => void

MOD_WEIGHT: Perturb a single (possibly self) connection weight by uniform delta in [min,max].

_reinitWeight

(method: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network/network.mutate").MutationMethod | undefined) => void

REINIT_WEIGHT: Reinitialize all incoming/outgoing/self connection weights for a random node. Useful as a heavy mutation to escape local minima. Falls back silently if no eligible node.

_subBackConn

() => void

SUB_BACK_CONN: Remove a backward connection meeting redundancy heuristics.

_subConn

() => void

SUB_CONN: Remove a forward connection chosen under redundancy heuristics to avoid disconnects.

_subGate

() => void

SUB_GATE: Remove gating from a random previously gated connection.

_subNode

() => void

SUB_NODE: Remove a random hidden node (if any remain). After removal, a tiny deterministic weight nudge is applied to encourage an observable phenotype change in tests.

_subSelfConn

() => void

SUB_SELF_CONN: Remove a random existing self loop.

_swapNodes

(method: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network/network.mutate").MutationMethod | undefined) => void

SWAP_NODES: Exchange bias & activation function between two random eligible nodes.

mutateImpl

(method: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network/network.mutate").MutationMethod | undefined) => void

Public entry point: apply a single mutation operator to the network.

Steps:

  1. Validate the supplied method (enum value or descriptor object).
  2. Resolve helper implementation from the dispatch map (supports objects exposing name/type/identity).
  3. Invoke helper (passing through method for parameterized operators).
  4. Flag topology caches dirty so ordering / slabs rebuild lazily.

Accepts either the raw enum value (e.g. mutation.ADD_NODE) or an object carrying an identifying name | type | identity field, allowing future parameterization without breaking call sites.
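
Both accepted call forms (sketch; a public net.mutate wrapper delegating to mutateImpl is assumed, and the descriptor fields beyond the identifying name are illustrative):

```ts
// 1. Raw enum value.
net.mutate(mutation.ADD_NODE);

// 2. Descriptor object resolved via its identifying field.
net.mutate({ name: 'MOD_WEIGHT', min: -0.1, max: 0.1 }); // min/max illustrative
```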

Parameters:

MutationMethod

Mutation method descriptor (can be string enum value or object with identity).

NetworkMutationProps

Internal Network properties accessed during mutations.

architecture/network/network.onnx.ts

Conv2DMapping

Mapping declaration for treating a fully-connected layer as a 2D convolution during export. This assumes the dense layer was originally synthesized from a convolution with weight sharing; we reconstitute spatial metadata. Each mapping references an export-layer index (1-based across hidden layers, output layer would be hiddenCount+1) and supplies spatial/kernel hyperparameters. Validation ensures that input spatial * channels product equals the previous layer width and that output channels * output spatial equals the current layer width.

exportToONNX

(network: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network").default, options: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network/network.onnx").OnnxExportOptions) => import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network/network.onnx").OnnxModel

Export a minimal multilayer perceptron Network to a lightweight ONNX JSON object.

Steps:

  1. Rebuild connection cache ensuring up-to-date adjacency.
  2. Index nodes for error messaging.
  3. Infer strict layer ordering (throws if structure unsupported).
  4. Validate homogeneity & full connectivity layer-to-layer.
  5. Build initializer tensors (weights + biases) and node list (Gemm + activation pairs).

Constraints: See module doc. Throws descriptive errors when assumptions violated.
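
Round-trip sketch (the import path and option defaults are assumptions):

```ts
import { exportToONNX, importFromONNX } from './network.onnx'; // path illustrative

const model = exportToONNX(net, {}); // empty options: defaults assumed
const clone = importFromONNX(model); // fresh Network rebuilt from the model
```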

importFromONNX

(onnx: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network/network.onnx").OnnxModel) => import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network").default

Import a model previously produced by {@link exportToONNX} into a fresh Network instance.

Core Steps:

  1. Parse input/output tensor shapes (supports optional symbolic batch dim).
  2. Derive hidden layer sizes (prefer layer_sizes metadata; fallback to weight tensor grouping heuristic).
  3. Instantiate matching layered MLP (inputs -> hidden[] -> outputs); remove placeholder hidden nodes for single layer perceptrons.
  4. Assign weights & biases (aggregated or per-neuron) from W/B initializers.
  5. Reconstruct activation functions from Activation node op_types (layer or per-neuron).
  6. Restore recurrent self connections from recorded diagonal Rk matrices if recurrent_single_step metadata present.
  7. Experimental: Reconstruct LSTM / GRU layers when fused initializers & metadata (lstm_emitted_layers, gru_emitted_layers) are detected, by replacing the corresponding hidden node block with a freshly constructed Layer.lstm / Layer.gru instance and remapping weights.
  8. Rebuild flat connection array for downstream invariants.

Experimental Behavior:

Limitations:

NodeInternals

Runtime interface for accessing node internal properties. Nodes have runtime properties for connections, bias, and squash that aren't in the public interface.

OnnxAttribute

ONNX node attribute.

OnnxDimension

ONNX tensor type shape dimension.

OnnxExportOptions

Options controlling ONNX export behavior (Phase 1).

OnnxModel

OnnxShape

ONNX tensor type shape.

OnnxTensorType

ONNX tensor type.

OnnxValueInfo

ONNX value info (input/output description).

Pool2DMapping

Mapping describing a pooling operation inserted after a given export-layer index.

architecture/network/network.prune.ts

getCurrentSparsity

() => number

Current sparsity fraction relative to the training-time pruning baseline.

maybePrune

(iteration: number) => void

Opportunistically perform scheduled pruning during gradient-based training.

Scheduling model:

SNIP heuristic:

NetworkPruningProps

Internal Network properties accessed during pruning operations.

pruneToSparsity

(targetSparsity: number, method: "magnitude" | "snip") => void

Evolutionary (generation-based) pruning toward a target sparsity baseline. Unlike maybePrune this operates immediately relative to the first invocation's connection count (stored separately as _evoInitialConnCount) and does not implement scheduling or regrowth.
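
Usage sketch (assuming these helpers are exposed as Network methods):

```ts
// Immediately prune toward 80% sparsity relative to the first-call baseline.
net.pruneToSparsity(0.8, 'magnitude');

// Inspect progress against the training-time baseline.
console.log(net.getCurrentSparsity());
```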

architecture/network/network.remove.ts

NetworkRemoveProps

Internal Network properties accessed during node removal operations.

removeNode

(node: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/node").default) => void

Node removal utilities.

This module provides a focused implementation for removing a single hidden node from a network while attempting to preserve overall functional connectivity. The removal procedure mirrors the legacy Neataptic logic but augments it with clearer documentation and explicit invariants.

High‑level algorithm (removeNode):

  1. Guard: ensure the node exists and is not an input or output (those are structural anchors).
  2. Ungate: detach any connections gated BY the node (we don't currently reassign gater roles).
  3. Snapshot inbound / outbound connections (before mutation of adjacency lists).
  4. Disconnect all inbound, outbound, and self connections.
  5. Physically remove the node from the network's node array.
  6. Simple path repair heuristic: for every former inbound source and outbound target, add a direct connection if (a) both endpoints still exist, (b) they are distinct, and (c) no direct connection already exists. This keeps forward information flow possibilities.
  7. Mark topology / caches dirty so that subsequent activation / ordering passes rebuild state.

Notes / Limitations:

architecture/network/network.serialize.ts

deserialize

(data: [number[], number[], string[], SerializedConnection[], number, number], inputSize: number | undefined, outputSize: number | undefined) => Network

fromJSONImpl

(json: NetworkJSON) => Network

network.serialize

Default export bundle for convenient named imports.

NetworkInternals

Runtime interface for accessing Network internal properties.

NetworkJSON

Verbose JSON format for network serialization.

NodeInternals

Runtime interface for accessing Node internal properties.

serialize

() => [number[], number[], string[], SerializedConnection[], number, number]

Serialization & deserialization helpers for Network instances.

Provides two independent formats:

  1. Compact tuple (serialize/deserialize): optimized for fast structured clone / worker transfer.
  2. Verbose JSON (toJSONImpl/fromJSONImpl): stable, versioned representation retaining structural genes.

Compact tuple format layout:

  [ activations: number[],
    states: number[],
    squashes: string[],
    connections: { from: number; to: number; weight: number; gater: number | null }[],
    inputSize: number,
    outputSize: number ]
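
Round-trip sketch over the compact tuple (the method attachment and the size arguments passed to deserialize are assumptions):

```ts
// Compact tuple: cheap to structured-clone across a worker boundary.
const tuple = net.serialize();
// worker.postMessage(tuple); ... then, on the receiving side:
const restored = Network.deserialize(tuple, net.input, net.output);
```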

Design Principles:

Verbose JSON (formatVersion = 2) adds:

Future Ideas:

SerializedConnection

Serialized connection representation.

toJSONImpl

() => NetworkJSON

Verbose JSON export (stable formatVersion). Omits transient runtime fields but keeps structural genetics. formatVersion=2 adds: enabled flags, stable geneId (if present), dropout value.

default

The Connection class (default export): a weighted directed edge between two nodes, carrying the gene metadata, gating / plasticity state, and optimizer accumulators documented below.

_flags

Packed state flags (private, to future‑proof the hidden class layout):

  bit0 => enabled gene expression (1 = active)
  bit1 => DropConnect active mask (1 = not dropped this forward pass)
  bit2 => hasGater (1 = symbol field present)
  bit3 => plastic (plasticityRate > 0)
  bits4+ => reserved

acquire

(from: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/node").default, to: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/node").default, weight: number | undefined) => import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/connection").default

Acquire a Connection from the pool (or construct new). Fields are fully reset & given a fresh sequential innovation id. Prefer this in evolutionary algorithms that mutate topology frequently to reduce GC pressure.

Parameters:

Returns: Reinitialized connection instance.
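
Pooling sketch (the static attachment Connection.acquire / Connection.release is an assumption based on this module's exports):

```ts
// Prefer pooled acquisition in hot topology-mutation loops to reduce GC churn.
const edge = Connection.acquire(nodeA, nodeB, 0.5); // fields reset, fresh innovation id
// ... use edge during this generation ...
Connection.release(edge); // surrendered: do not touch unless re-acquired
```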

dcMask

DropConnect active mask: 1 = not dropped (active), 0 = dropped for this stochastic pass.

dropConnectActiveMask

Convenience alias for DropConnect mask with clearer naming.

eligibility

Standard eligibility trace (e.g., for RTRL / policy gradient credit assignment).

enabled

Whether the gene (connection) is currently expressed (participates in forward pass).

firstMoment

First moment estimate (Adam / AdamW) (was opt_m).

from

The source (pre-synaptic) node supplying activation.

gain

Multiplicative modulation applied after weight. Default is 1 (neutral). We only store an internal symbol-keyed property when the gain is non-neutral, reducing memory usage across large populations where most connections are ungated.

gater

Optional gating node whose activation can modulate effective weight (symbol-backed).

gradientAccumulator

Generic gradient accumulator (RMSProp / AdaGrad) (was opt_cache).

hasGater

Whether a gater node is assigned (modulates gain); true if the gater symbol field is present.

infinityNorm

Adamax: Exponential moving infinity norm (was opt_u).

innovation

Unique historical marking (auto-increment) for evolutionary alignment.

innovationID

(sourceNodeId: number, targetNodeId: number) => number

Deterministic Cantor pairing function for a (sourceNodeId, targetNodeId) pair. Useful when you want a stable innovation id without relying on global mutable counters (e.g., for hashing or reproducible experiments).

NOTE: For large indices this can overflow 53-bit safe integer space; keep node indices reasonable.

Parameters:

Returns: Unique non-negative integer derived from the ordered pair.
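
The underlying formula is the standard Cantor pairing (a sketch; the exported function is the source of truth):

```ts
// Maps each ordered (sourceNodeId, targetNodeId) pair to a unique integer.
function cantorPair(sourceNodeId: number, targetNodeId: number): number {
  const sum = sourceNodeId + targetNodeId;
  return (sum * (sum + 1)) / 2 + targetNodeId;
}

cantorPair(2, 3); // 18 (order matters: cantorPair(3, 2) === 17)
```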

lookaheadShadowWeight

Lookahead: shadow (slow) weight parameter (was _la_shadowWeight).

maxSecondMoment

AMSGrad: Maximum of past second moment (was opt_vhat).

plastic

Whether this connection participates in plastic adaptation (rate > 0).

plasticityRate

Per-connection plasticity / learning rate (0 means non-plastic). Setting >0 marks plastic flag.

previousDeltaWeight

Last applied delta weight (used by classic momentum).

release

(conn: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/connection").default) => void

Return a Connection to the internal pool for later reuse. Do NOT use the instance again afterward unless re-acquired (treat as surrendered). Optimizer / trace fields are not scrubbed here (they're overwritten during acquire).

Parameters:

resetInnovationCounter

(value: number) => void

Reset the monotonic auto-increment innovation counter (used for newly constructed / pooled instances). You normally only call this at the start of an experiment or when deserializing a full population.

Parameters:

secondMoment

Second raw moment estimate (Adam family) (was opt_v).

secondMomentum

Secondary momentum (Lion variant) (was opt_m2).

to

The target (post-synaptic) node receiving activation.

toJSON

() => { from: number | undefined; to: number | undefined; weight: number; gain: number; innovation: number; enabled: boolean; gater?: number | undefined; }

Serialize to a minimal JSON-friendly shape (used for saving genomes / networks). Undefined indices are preserved as undefined to allow later resolution / remapping.

Returns: Object with node indices, weight, gain, gater index (if any), innovation id & enabled flag.

totalDeltaWeight

Accumulated (batched) delta weight awaiting an apply step.

weight

Scalar multiplier applied to the source activation (prior to gain modulation).

xtrace

Extended trace structure for modulatory / eligibility propagation algorithms. Parallel arrays for cache-friendly iteration.

architecture/network/network.slab.ts

_buildAdjacency

() => void

Build / refresh CSR‑style adjacency (outStart + outOrder) enabling fast fan‑out traversal. Only rebuilds when marked dirty. Stores arrays on internal network instance.
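
A sketch of the fan-out traversal this layout enables (array roles follow the description above; buffer names and the surrounding plumbing are illustrative):

```ts
// For node n, its outgoing connection indices occupy
// outOrder[outStart[n] .. outStart[n + 1]).
function propagateSketch(
  nodeCount: number,
  outStart: Int32Array,  // nodeCount + 1 prefix offsets
  outOrder: Int32Array,  // connection indices grouped by source node
  to: Int32Array,        // packed slab: target node per connection
  weights: Float64Array, // packed slab: weight per connection
  act: Float64Array,     // current activation per node
  sums: Float64Array     // accumulated weighted input per node
): void {
  for (let n = 0; n < nodeCount; n++) {
    for (let k = outStart[n]; k < outStart[n + 1]; k++) {
      const c = outOrder[k];
      sums[to[c]] += act[n] * weights[c];
    }
  }
}
```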

_canUseFastSlab

(training: boolean) => boolean

Predicate gating usage of high‑performance slab forward pass. Disallows training / stochastic / dynamic edge behaviours (gating, dropout, noise, self‑connections).

Parameters:

Returns: True if fast path can be safely used for deterministic forward activation.

_reindexNodes

() => void

Assign sequential indices to each node (stable ordering prerequisite for slab packing). Clears _nodeIndexDirty flag.

canUseFastSlab

(training: boolean) => boolean

Public convenience wrapper exposing fast path eligibility. Mirrors _canUseFastSlab internal predicate.

Parameters:

Returns: True when slab fast path predicates hold.

ConnectionInternals

Internal Connection properties accessed during slab operations.

ConnectionSlabView

Shape returned by getConnectionSlab() describing the packed SoA view. Note: The arrays SHOULD NOT be mutated by callers; treat as read‑only.

fastSlabActivate

(input: number[]) => number[]

High‑performance forward pass using packed slabs + CSR adjacency.

Fallback Conditions (auto‑detected):

Implementation Notes:

Parameters:

Returns: Output activations (detached plain array) of length network.output.

getConnectionSlab

() => import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network/network.slab").ConnectionSlabView

Obtain (and lazily rebuild if dirty) the current packed SoA view of connections.

Gain omission: if the internal gain slab is absent (all gains are neutral), a synthetic neutral array is created and returned (NOT retained). This keeps external educational tooling branch‑free while preserving the memory savings of omission internally.

Returns: Read‑only style view (do not mutate) containing typed arrays + metadata.

getSlabAllocationStats

() => { pool: Record<string, PoolKeyMetrics>; fresh: number; pooled: number; }

Snapshot of slab allocation metrics: per‑pool‑key counters (see {@link PoolKeyMetrics}) plus global fresh vs. pooled allocation totals.

getSlabVersion

() => number

Retrieve current monotonic slab version (increments on each successful rebuild).

Returns: Non‑negative integer (0 if slab never built yet).

NetworkSlabProps

Internal Network properties for slab operations.

PoolKeyMetrics

Per-pool-key allocation & reuse counters (educational / diagnostics). Tracks how many slabs were freshly created vs reused plus the high‑water mark (maxRetained) of simultaneously retained arrays for the key. Exposed indirectly via getSlabAllocationStats() so users can introspect the effectiveness of pooling under their workload.

rebuildConnectionSlab

(force: boolean) => void

Build (or refresh) the packed connection slabs for the network synchronously.

ACTIONS

  1. Optionally reindex nodes if structural mutations invalidated indices.
  2. Grow (geometric) or reuse existing typed arrays to ensure capacity >= active connections.
  3. Populate the logical slice [0, connectionCount) with weight/from/to/flag data.
  4. Lazily allocate gain & plastic slabs only on first non‑neutral / plastic encounter; omit otherwise.
  5. Release previously allocated optional slabs when they revert to neutral / unused (omission optimization).
  6. Update internal bookkeeping: logical count, dirty flags, version counter.

PERFORMANCE

O(C) over active connections with amortized allocation cost due to geometric growth.

Parameters:

rebuildConnectionSlabAsync

(chunkSize: number) => Promise<void>

Cooperative asynchronous slab rebuild (Browser only).

Strategy:

Metrics: Increments _slabAsyncBuilds for observability.

Fallback: On Node (no window), defers to the synchronous rebuild for simplicity.

Parameters:

Returns: Promise resolving once rebuild completes.

TypedArray

Union of slab typed array element container types. We purposefully restrict to the specific constructors actually used by this module so TypeScript can narrow accurately and editors display concise hover info.

TypedArrayConstructor

Constructor type for typed arrays used in slabs.

architecture/network/network.standalone.ts

generateStandalone

(net: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network").default) => string

NetworkStandaloneProps

Internal Network properties accessed during standalone code generation.

NodeWithIndex

Internal Node properties with assigned indices for code generation.

architecture/network/network.stats.ts

getRegularizationStats

() => Record<string, unknown> | null

Obtain the last recorded regularization / stochastic statistics snapshot.

Returns a defensive deep copy so callers can inspect metrics without risking mutation of the internal _lastStats object maintained by the training loop (e.g., during pruning, dropout, or noise scheduling updates).

Returns: A deep-cloned stats object or null if no stats have been recorded yet.
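
A minimal sketch of such a defensive copy, consistent with the GlobalThisWithStructuredClone hint below (the exact fallback used by the library is an assumption):

```ts
// Deep-copy stats via structuredClone when available, else a JSON round-trip.
function deepCopyStats<T>(stats: T): T {
  const g = globalThis as { structuredClone?: <U>(value: U) => U };
  return g.structuredClone
    ? g.structuredClone(stats)
    : (JSON.parse(JSON.stringify(stats)) as T);
}
```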

GlobalThisWithStructuredClone

GlobalThis interface with optional structuredClone.

NetworkStatsProps

Internal Network properties accessed during stats operations.

architecture/network/network.topology.ts

computeTopoOrder

() => void

Topology utilities.

Provides:

Design Notes:

hasPath

(from: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/node").default, to: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/node").default) => boolean

Depth-first reachability test (avoids infinite loops via visited set).
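
An illustrative implementation (the node adjacency shape is an assumption):

```ts
interface NodeLike {
  outgoing: { to: NodeLike }[]; // illustrative adjacency shape
}

// Iterative DFS; the visited set prevents infinite loops on cycles.
function hasPathSketch(from: NodeLike, to: NodeLike): boolean {
  const visited = new Set<NodeLike>();
  const stack: NodeLike[] = [from];
  while (stack.length > 0) {
    const node = stack.pop()!;
    if (node === to) return true;
    if (visited.has(node)) continue;
    visited.add(node);
    for (const conn of node.outgoing) stack.push(conn.to);
  }
  return false;
}
```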

NetworkTopologyProps

Internal Network properties accessed during topology operations.

architecture/network/network.training.ts

__trainingInternals

applyGradientClippingImpl

(net: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network").default, cfg: { mode: "norm" | "percentile" | "layerwiseNorm" | "layerwisePercentile"; maxNorm?: number | undefined; percentile?: number | undefined; }) => void

CheckpointConfig

Checkpoint callback spec.

ConnectionInternals

Runtime interface for accessing Connection internal properties during training.

CostFunction

(target: number[], output: number[]) => number


Internal Type Definitions (documentation only; optional for callers)

CostFunctionOrObject

Cost function or cost object interface.

GradientClipConfig

Gradient clipping configuration accepted by options.gradientClip.

MetricsHook

(m: { iteration: number; error: number; plateauError?: number | undefined; gradNorm: number; }) => void

Metrics hook signature.

MixedPrecisionConfig

MixedPrecisionDynamicConfig

Mixed precision configuration.

MonitoredSmoothingConfig

Configuration passed to monitored (primary) smoothing computation.

MovingAverageType

Moving average strategy identifiers.

NetworkInternals

Runtime interface for accessing Network internal properties during training.

NodeInternals

Runtime interface for accessing Node internal properties during training.

OptimizerConfigBase

Optimizer configuration (subset – delegated to node.applyBatchUpdatesWithOptimizer).

PlateauSmoothingConfig

Configuration for plateau smoothing computation.

PlateauSmoothingState

State container for plateau EMA smoothing.

PrimarySmoothingState


Internal Helper Utilities (non-exported)

These functions encapsulate cohesive sub-steps of the training pipeline so the main exported functions remain readable while preserving original behavior. Each helper is intentionally pure where reasonable or documents its side-effects.

RegularizationConfig

Regularization configuration (used internally during training).

ScheduleConfig

Schedule hook executed every N iterations.

SerializedNetwork

Serialized network structure (used in checkpoint callbacks).

trainImpl

(net: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network").default, set: { input: number[]; output: number[]; }[], options: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network/network.training").TrainingOptions) => { error: number; iterations: number; time: number; }

TrainingOptions

Primary training options object (public shape).

trainSetImpl

(net: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network").default, set: { input: number[]; output: number[]; }[], batchSize: number, accumulationSteps: number, currentRate: number, momentum: number, regularization: RegularizationConfig, costFunction: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network/network.training").CostFunction | CostFunctionOrObject, optimizer: import("/home/runner/work/NeatapticTS/NeatapticTS/src/architecture/network/network.training").OptimizerConfigBase | undefined) => number

Generated from source JSDoc • GitHub