architecture/network/runtime

Runtime control utilities for advanced network inference features.

Provides:

architecture/network/runtime/network.runtime.controls.utils.ts

clearStochasticDepthSchedule

clearStochasticDepthSchedule(): void

Clear the stochastic-depth schedule function.

Parameters: None.

Returns: Nothing.

clearWeightNoiseSchedule

clearWeightNoiseSchedule(): void

Clear the dynamic global weight-noise schedule.

Parameters: None.

Returns: Nothing.

configurePruning

configurePruning(
  configuration: PruningConfiguration,
): void

Configure scheduled pruning during training.

This stores the pruning window and target policy on the network so the training loop can opportunistically apply structured sparsification later.

Parameters: configuration — Pruning window and target-sparsity policy to store on the network.

Returns: Nothing.
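As an illustration of the kind of policy this stores (the names and shape below are illustrative sketches, not the library's actual `PruningConfiguration`), a training loop might ramp sparsity across the configured window like this:

```typescript
// Illustrative sketch of a pruning window policy; the real
// PruningConfiguration shape may differ.
interface PruningConfigurationSketch {
  startStep: number;      // first training step pruning may run
  endStep: number;        // last training step pruning may run
  targetSparsity: number; // fraction of weights to remove, in (0, 1)
}

// Linearly ramp the sparsity target across the schedule window.
function sparsityAtStep(cfg: PruningConfigurationSketch, step: number): number {
  if (step < cfg.startStep) return 0;
  if (step >= cfg.endStep) return cfg.targetSparsity;
  const progress = (step - cfg.startStep) / (cfg.endStep - cfg.startStep);
  return cfg.targetSparsity * progress;
}
```

For example, with a window of steps 100 to 200 and a target sparsity of 0.5, step 150 would prune toward 0.25 sparsity.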

disableStochasticDepth

disableStochasticDepth(): void

Disable stochastic depth.

Parameters: None.

Returns: Nothing.

disableWeightNoise

disableWeightNoise(): void

Disable all configured weight noise.

Parameters: None.

Returns: Nothing.

enableWeightNoise

enableWeightNoise(
  configuration: WeightNoiseConfiguration,
): void

Enable weight noise using either one global standard deviation or per-hidden-layer values.

A single global value is useful for quick experiments, while the per-hidden schedule keeps layered models explicit about which hidden stage receives how much perturbation.

Parameters: configuration — Either a single global standard deviation or one value per hidden layer.

Returns: Nothing.
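A minimal sketch of the two configuration shapes and the validation the documented error classes suggest (the type and function names here are illustrative, not the library's actual `WeightNoiseConfiguration`):

```typescript
// Illustrative configuration shapes; the library's actual
// WeightNoiseConfiguration may differ.
type WeightNoiseConfigSketch =
  | { stdDev: number }            // one global standard deviation
  | { perHiddenLayer: number[] }; // one value per hidden layer

// Mirror the documented constraints: standard deviations must be
// non-negative, and per-layer entries must match the hidden-layer count.
function validateWeightNoise(
  cfg: WeightNoiseConfigSketch,
  hiddenLayers: number,
): void {
  if ("stdDev" in cfg) {
    if (cfg.stdDev < 0) throw new RangeError("stdDev must be >= 0");
    return;
  }
  if (cfg.perHiddenLayer.length !== hiddenLayers) {
    throw new RangeError("need one entry per hidden layer");
  }
  if (cfg.perHiddenLayer.some((v) => v < 0)) {
    throw new RangeError("per-layer values must be >= 0");
  }
}
```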

getLastSkippedLayers

getLastSkippedLayers(): number[]

Read the last hidden-layer indices skipped by stochastic depth.

Parameters: None.

Returns: Snapshot of the last skipped hidden-layer indices.

getRuntimeRegularizationStats

getRuntimeRegularizationStats(): Record<string, unknown> | null

Read regularization statistics collected during training.

Parameters: None.

Returns: Last regularization stats payload or null when none exists yet.

getTrainingStep

getTrainingStep(): number

Read the current training-step counter.

Parameters: None.

Returns: Current training step.

setRandom

setRandom(
  randomFunction: () => number,
): void

Replace the network random number generator.

This lets advanced callers share one deterministic source across mutation, stochastic depth, DropConnect, and other runtime randomness.

Parameters: randomFunction — Replacement random number generator.

Returns: Nothing.
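A seedable PRNG such as mulberry32 is one possible deterministic source to pass here; two generators built from the same seed replay identical streams, so runs that share one generator across mutation, stochastic depth, and DropConnect become reproducible:

```typescript
// mulberry32: a small seedable PRNG returning values in [0, 1).
// Shown only as an example of a deterministic source a caller
// might supply; the library does not mandate any particular PRNG.
function mulberry32(seed: number): () => number {
  let state = seed >>> 0;
  return () => {
    state = (state + 0x6d2b79f5) >>> 0;
    let t = state;
    t = Math.imul(t ^ (t >>> 15), t | 1);
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}
```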

setStochasticDepth

setStochasticDepth(
  survivalProbabilities: number[],
): void

Configure stochastic depth with one survival probability per hidden layer.

Matching survival values to hidden layers keeps the runtime contract explicit and avoids silently applying one layer's policy to another.

Parameters: survivalProbabilities — One survival probability per hidden layer, each in (0, 1].

Returns: Nothing.
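The per-layer skip decision can be sketched as follows (an illustration of the stochastic-depth sampling idea, not the library's internal implementation):

```typescript
// Sketch: decide which hidden layers a forward pass skips, given one
// survival probability per hidden layer (each in (0, 1]) and an RNG
// returning values in [0, 1).
function sampleSkippedLayers(
  survival: number[],
  random: () => number,
): number[] {
  const skipped: number[] = [];
  survival.forEach((p, layerIndex) => {
    if (random() >= p) skipped.push(layerIndex); // layer did not survive
  });
  return skipped;
}
```

A survival probability of 1 means the layer is never skipped, which is why values must stay in (0, 1].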

setStochasticDepthSchedule

setStochasticDepthSchedule(
  schedule: StochasticDepthSchedule,
): void

Set the stochastic-depth schedule function.

Parameters: schedule — Stochastic-depth schedule function to install.

Returns: Nothing.

setWeightNoiseSchedule

setWeightNoiseSchedule(
  schedule: (step: number) => number,
): void

Set a dynamic scheduler for global weight noise.

Parameters: schedule — Function mapping the current training step to a global noise standard deviation.

Returns: Nothing.
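The schedule's `(step: number) => number` shape matches the documented signature; the constants below are illustrative. For example, a schedule that starts at 0.05 and decays the global noise standard deviation by 1% per step:

```typescript
// Example weight-noise schedule: exponential decay from an
// illustrative starting value of 0.05.
const decayingNoise = (step: number): number => 0.05 * Math.pow(0.99, step);
```

Passing `decayingNoise` to `setWeightNoiseSchedule` would anneal the perturbation strength as training progresses.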

testForceOverflow

testForceOverflow(): void

Force the next mixed-precision overflow path.

This is a test-oriented hook used to exercise loss-scale recovery logic without waiting for a real floating-point overflow.

Parameters: None.

Returns: Nothing.

architecture/network/runtime/network.runtime.diagnostics.utils.ts

Runtime diagnostics and safety helpers for the public Network class.

This chapter owns the public readers and small runtime controls that expose training-health state, DropConnect policy, and dropout-mask cleanup without changing the network topology itself.

disableDropConnect

disableDropConnect(): void

Disable DropConnect.

Parameters: None.

Returns: Nothing.

enableDropConnect

enableDropConnect(
  probability: number,
): void

Enable DropConnect with a probability in [0, 1).

Parameters: probability — Per-connection drop probability in [0, 1).

Returns: Nothing.
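The DropConnect idea itself can be sketched as follows (an illustration of the technique, not the library's internal implementation — this variant rescales survivors so the expected weight magnitude is unchanged):

```typescript
// Sketch of DropConnect: each connection weight is independently
// zeroed with the given probability during training, and survivors
// are rescaled by 1 / (1 - probability).
function dropConnect(
  weights: number[],
  probability: number, // must lie in [0, 1)
  random: () => number,
): number[] {
  const keep = 1 - probability;
  return weights.map((w) => (random() < probability ? 0 : w / keep));
}
```

A probability of 1 is excluded because it would drop every connection and make the rescaling factor undefined, which is why the valid range is half-open.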

getLastGradClipGroupCount

getLastGradClipGroupCount(): number

Read the last recorded gradient-clipping group count.

Parameters: None.

Returns: Last gradient-clipping group count.

getLossScale

getLossScale(): number

Read the active mixed-precision loss scale.

Parameters: None.

Returns: Current loss scale.

getRawGradientNorm

getRawGradientNorm(): number

Read the last recorded raw gradient norm.

Parameters: None.

Returns: Last raw gradient norm.

getTrainingStats

getTrainingStats(): TrainingStatsSnapshot

Read a consolidated training-health snapshot.

Parameters: None.

Returns: Training statistics snapshot.

resetDropoutMasks

resetDropoutMasks(): void

Reset every dropout mask to 1.

This is useful after training so later inference does not inherit transient node-level dropout state from a previous activation pass.

Parameters: None.

Returns: Nothing.

architecture/network/runtime/network.runtime.errors.ts

NetworkRuntimeDropConnectProbabilityRangeError

Raised when DropConnect probability is outside [0, 1).

NetworkRuntimeLayeredWeightNoiseRequiredError

Raised when per-hidden-layer weight noise is requested on a non-layered network.

NetworkRuntimePruningScheduleWindowError

Raised when a pruning schedule window is invalid.

NetworkRuntimeStochasticDepthEntryCountError

Raised when stochastic-depth survival entries do not match hidden-layer count.

NetworkRuntimeStochasticDepthLayeredNetworkRequiredError

Raised when stochastic depth is requested on a non-layered network.

NetworkRuntimeStochasticDepthSurvivalArrayError

Raised when stochastic-depth survival input is not an array.

NetworkRuntimeStochasticDepthSurvivalRangeError

Raised when a stochastic-depth survival probability falls outside (0, 1].

NetworkRuntimeTargetSparsityRangeError

Raised when pruning target sparsity is outside the open interval (0, 1).

NetworkRuntimeWeightNoiseConfigurationError

Raised when weight-noise configuration shape is invalid.

NetworkRuntimeWeightNoiseEntryCountError

Raised when hidden-layer weight-noise entries do not match hidden-layer count.

NetworkRuntimeWeightNoisePerLayerRangeError

Raised when a per-hidden-layer weight-noise value is negative.

NetworkRuntimeWeightNoiseStdDevRangeError

Raised when weight-noise standard deviation is negative.

Generated from source JSDoc