architecture

Preconfigured neural networks!

Ready to be built with simple one-line functions.

Methods

(static) Construct (list) → {Network}

Constructs a network from a given array of connected nodes

Behind the scenes, Construct expects the nodes to already have their connections and gates in place; it uses these to infer the structure of the network and assemble it.

This makes it a generic way to produce a network from a custom architecture.

Parameters:
Name Type Description
list Array.<Group> | Array.<Layer> | Array.<Node>

A list of Groups, Layers, and Nodes to combine into a Network

Examples

A Network built with Nodes

let { architect, Node } = require("@liquid-carrot/carrot");

var A = new Node();
var B = new Node();
var C = new Node();
var D = new Node();

// Create connections
A.connect(B);
A.connect(C);
B.connect(D);
C.connect(D);

// Construct a network
var network = architect.Construct([A, B, C, D]);
    

A Network built with Groups

let { architect, Group } = require("@liquid-carrot/carrot");

var A = new Group(4);
var B = new Group(2);
var C = new Group(6);

// Create connections between the groups
A.connect(B);
A.connect(C);
B.connect(C);

// Construct a square-looking network
var network = architect.Construct([A, B, C]);
    
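To illustrate the inference step described above, here is a simplified, self-contained sketch (not carrot's actual source): nodes with no incoming connections are treated as inputs, and nodes with no outgoing connections as outputs.

```javascript
// Simplified sketch of how a Construct-style function can infer network
// structure from a list of connected nodes (illustration only).
function inferRoles(nodes) {
  const inputs = [];
  const outputs = [];
  const hidden = [];
  for (const node of nodes) {
    if (node.incoming.length === 0) inputs.push(node);       // nothing feeds it
    else if (node.outgoing.length === 0) outputs.push(node); // it feeds nothing
    else hidden.push(node);
  }
  return { inputs, outputs, hidden };
}

// Minimal node objects mirroring the A→B, A→C, B→D, C→D example above
const A = { name: "A", incoming: [], outgoing: ["B", "C"] };
const B = { name: "B", incoming: ["A"], outgoing: ["D"] };
const C = { name: "C", incoming: ["A"], outgoing: ["D"] };
const D = { name: "D", incoming: ["B", "C"], outgoing: [] };

const roles = inferRoles([A, B, C, D]);
// roles.inputs → [A], roles.hidden → [B, C], roles.outputs → [D]
```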

(static) GRU (input, …units, output) → {Network}

Creates a gated recurrent unit network

Parameters:
Name Type Description
input number

Number of input nodes

units number <repeatable>

Number of gated recurrent units per layer

output number

Number of output nodes

Examples

GRU is being tested, and may not always work for your dataset.

let { architect } = require("@liquid-carrot/carrot");

// Input, gated recurrent unit layer, output
let my_GRU = new architect.GRU(2, 6, 1);

// with multiple layers of gated recurrent units
let my_deep_GRU = new architect.GRU(2, 4, 4, 4, 1);
    

Training a sequence

let { architect } = require("@liquid-carrot/carrot");

var training_set = [
  { input: [0], output: [0]},
  { input: [1], output: [1]},
  { input: [1], output: [0]},
  { input: [0], output: [1]},
  { input: [0], output: [0]}
];

var network = new architect.GRU(1,1,1);

// Train a sequence: 00100100..
network.train(training_set, {
  log: 1,
  rate: 0.1, // lower rates work best
  error: 0.005,
  iterations: 3000,
  clear: true // set to true while training
});
    
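For intuition, the computation a single gated recurrent unit performs each timestep can be sketched with scalar weights. This is a toy illustration with made-up weight names, not carrot's internals:

```javascript
// Toy scalar GRU cell: one input, one hidden unit (illustration only).
const sigmoid = (x) => 1 / (1 + Math.exp(-x));

function gruStep(x, h, w) {
  const z = sigmoid(w.wz * x + w.uz * h);            // update gate
  const r = sigmoid(w.wr * x + w.ur * h);            // reset gate
  const hHat = Math.tanh(w.wh * x + w.uh * (r * h)); // candidate state
  return (1 - z) * h + z * hHat;                     // blend old state and candidate
}

// With all weights at zero, both gates sit at sigmoid(0) = 0.5 and the
// candidate is 0, so the state simply decays by half each step: 1 → 0.5 → 0.25
const zeroWeights = { wz: 0, uz: 0, wr: 0, ur: 0, wh: 0, uh: 0 };
let h = 1;
h = gruStep(0, h, zeroWeights); // 0.5
h = gruStep(0, h, zeroWeights); // 0.25
```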

(static) Hopfield (size) → {Network}

Creates a Hopfield network of the given size

Parameters:
Name Type Description
size number

Number of input nodes and output nodes (the same number is used for both)

Example

Output will always be binary due to the `Activation.STEP` function.

let { architect } = require("@liquid-carrot/carrot");

var network = architect.Hopfield(10);
var training_set = [
  { input: [0, 1, 0, 1, 0, 1, 0, 1, 0, 1], output: [0, 1, 0, 1, 0, 1, 0, 1, 0, 1] },
  { input: [1, 1, 1, 1, 1, 0, 0, 0, 0, 0], output: [1, 1, 1, 1, 1, 0, 0, 0, 0, 0] }
];

network.train(training_set);

network.activate([0,1,0,1,0,1,0,1,1,1]); // [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
network.activate([1,1,1,1,1,0,0,1,0,0]); // [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
    
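To show the idea behind the architecture, here is a self-contained textbook-style Hopfield sketch (not carrot's implementation) that stores the two patterns from the example above using a Hebbian rule and recalls them with one synchronous step-activation update:

```javascript
// Self-contained Hopfield sketch: Hebbian storage + one synchronous
// STEP-activation update (illustration only).
const toBipolar = (p) => p.map((v) => (v === 1 ? 1 : -1));

function storePatterns(patterns) {
  const n = patterns[0].length;
  const W = Array.from({ length: n }, () => new Array(n).fill(0));
  for (const p of patterns.map(toBipolar)) {
    for (let i = 0; i < n; i++) {
      for (let j = 0; j < n; j++) {
        if (i !== j) W[i][j] += p[i] * p[j]; // Hebbian outer product, zero diagonal
      }
    }
  }
  return W;
}

function recall(W, pattern) {
  const x = toBipolar(pattern);
  // One synchronous update with a STEP activation (output is binary)
  return W.map((row) => (row.reduce((s, w, j) => s + w * x[j], 0) >= 0 ? 1 : 0));
}

const W = storePatterns([
  [0, 1, 0, 1, 0, 1, 0, 1, 0, 1],
  [1, 1, 1, 1, 1, 0, 0, 0, 0, 0],
]);

recall(W, [0, 1, 0, 1, 0, 1, 0, 1, 1, 1]); // → [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
recall(W, [1, 1, 1, 1, 1, 0, 0, 1, 0, 0]); // → [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
```

Each noisy probe differs from a stored pattern by one bit, and a single update is enough here to snap it back to the stored memory.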

(static) Liquid ()

To Do:
  • Build Liquid network constructor

(static) LSTM (input, …memory, output, options?) → {Network}

Creates a long short-term memory network

Parameters:
Name Type Description
input number

Number of input nodes

memory number <repeatable>

Number of memory block assemblies (input gate, memory cell, forget gate, and output gate) per layer

output number

Number of output nodes

options Object <optional>

Configuration options

Properties
Name Type Default Description
memory_to_memory boolean <optional> false

Form internal connections between memory blocks

output_to_memory boolean <optional> false

Form output to memory layer connections and gate them

output_to_gates boolean <optional> false

Form output to gate connections (connects to all gates)

input_to_output boolean <optional> true

Form direct input to output connections

input_to_deep boolean <optional> true

Form input to memory layer connections and gate them

Example

When training on sequences or for time-series prediction, set the `clear` option to `true` during training.

let { architect } = require("@liquid-carrot/carrot");

// Input, memory block layer, output
let my_LSTM = new architect.LSTM(2, 6, 1);

// with multiple memory block layers
let my_deep_LSTM = new architect.LSTM(2, 4, 4, 4, 1);

// with options
var options = {
  memory_to_memory: false,    // default
  output_to_memory: false,    // default
  output_to_gates: false,     // default
  input_to_output: true,      // default
  input_to_deep: true         // default
};

let my_custom_LSTM = new architect.LSTM(2, 4, 4, 4, 1, options);
    
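For intuition, the computation inside a single memory block assembly can be sketched with scalar weights. This is a toy illustration with made-up weight names, not carrot's internals:

```javascript
// Toy scalar LSTM memory block: input, forget, and output gates around a
// single memory cell (illustration only).
const sigmoid = (x) => 1 / (1 + Math.exp(-x));

function lstmStep(x, state, w) {
  const i = sigmoid(w.wi * x + w.ui * state.h);      // input gate
  const f = sigmoid(w.wf * x + w.uf * state.h);      // forget gate
  const o = sigmoid(w.wo * x + w.uo * state.h);      // output gate
  const cHat = Math.tanh(w.wc * x + w.uc * state.h); // candidate cell value
  const c = f * state.c + i * cHat;                  // gated cell update
  return { c, h: o * Math.tanh(c) };                 // gated output
}

// With all weights at zero every gate sits at sigmoid(0) = 0.5 and the
// candidate is 0, so the cell value is halved each step.
const zeroWeights = { wi: 0, ui: 0, wf: 0, uf: 0, wo: 0, uo: 0, wc: 0, uc: 0 };
let state = { c: 1, h: 0 };
state = lstmStep(0, state, zeroWeights); // state.c = 0.5
```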

(static) NARX (input, hidden, output, input_memory, output_memory) → {Network} alpha

Creates a NARX network (remember previous inputs/outputs)

Parameters:
Name Type Description
input number

Number of input nodes

hidden Array.<number> | number

Array of hidden layer sizes, e.g. [10, 20, 10]. If there is only one hidden layer, this can be a single number (of nodes)

output number

Number of output nodes

input_memory number

Number of previous inputs to remember

output_memory number

Number of previous outputs to remember

Example
let { architect } = require("@liquid-carrot/carrot");

let narx = new architect.NARX(1, 5, 1, 3, 3);

// Training a sequence
let training_data = [
  { input: [0], output: [0] },
  { input: [0], output: [0] },
  { input: [0], output: [1] },
  { input: [1], output: [0] },
  { input: [0], output: [0] },
  { input: [0], output: [0] },
  { input: [0], output: [1] },
];
narx.train(training_data, {
  log: 1,
  iterations: 3000,
  error: 0.03,
  rate: 0.05
});
    
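The defining feature of NARX is its tapped delay lines: at each timestep the network sees the current input alongside the last `input_memory` inputs and the last `output_memory` outputs. A minimal self-contained sketch of that bookkeeping (illustration only, not carrot's implementation):

```javascript
// Sketch of NARX-style tapped delay lines (illustration only). The underlying
// feed-forward network would receive the concatenation of the current input
// with the remembered past inputs and past outputs.
function makeDelayLine(inputMemory, outputMemory) {
  const pastInputs = new Array(inputMemory).fill(0);
  const pastOutputs = new Array(outputMemory).fill(0);
  return {
    // Build the full input vector for one timestep
    vector(currentInput) {
      return [currentInput, ...pastInputs, ...pastOutputs];
    },
    // Shift the new values in after the network has produced an output
    push(input, output) {
      pastInputs.unshift(input); pastInputs.pop();
      pastOutputs.unshift(output); pastOutputs.pop();
    },
  };
}

const delay = makeDelayLine(3, 3); // matches architect.NARX(1, 5, 1, 3, 3)
delay.push(1, 0);
delay.push(0, 1);
delay.vector(1); // → [1, 0, 1, 0, 1, 0, 0]
```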

(static) Perceptron (…layer_neurons) → {Network}

Creates a multilayer perceptron (MLP)

Parameters:
Name Type Description
layer_neurons number <repeatable>

Number of neurons in input layer, hidden layer(s), and output layer as a series of numbers (min 3 arguments)

Returns:

Feed forward neural network

Example
let { architect } = require("@liquid-carrot/carrot");

// Input: 2 neurons, Hidden layer: 3 neurons, Output: 1 neuron
let my_perceptron = new architect.Perceptron(2, 3, 1);

// Input: 2 neurons, 4 hidden layers: 10 neurons each, Output: 1 neuron
let my_deep_perceptron = new architect.Perceptron(2, 10, 10, 10, 10, 1);
    
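As an aside, the forward pass such a feed-forward network performs can be sketched in a few lines of plain JavaScript (a toy illustration with a sigmoid activation, not carrot's internals):

```javascript
// Toy forward pass of a 2-3-1 multilayer perceptron (illustration only).
const sigmoid = (x) => 1 / (1 + Math.exp(-x));

// layers[k] is an array of neurons; each neuron holds a bias and one weight
// per neuron in the previous layer.
function forward(layers, input) {
  return layers.reduce(
    (activations, layer) =>
      layer.map((neuron) =>
        sigmoid(neuron.weights.reduce((s, w, i) => s + w * activations[i], neuron.bias))
      ),
    input
  );
}

// With zero weights and biases, every neuron outputs sigmoid(0) = 0.5
const layers = [
  Array.from({ length: 3 }, () => ({ bias: 0, weights: [0, 0] })),    // hidden
  Array.from({ length: 1 }, () => ({ bias: 0, weights: [0, 0, 0] })), // output
];
forward(layers, [1, 0]); // → [0.5]
```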

(static) Random (input, hidden?, output, options?) → {Network}

Creates a randomly connected network

Parameters:
Name Type Description
input number

Number of input nodes

hidden number <optional>

Number of nodes between the input and output layers

output number

Number of output nodes

options Object <optional>

Configuration options

Properties
Name Type Default Description
connections number <optional> hidden*2

Number of connections (must be larger than hidden)

backconnections number <optional> 0

Number of recurrent connections

selfconnections number <optional> 0

Number of self connections

gates number <optional> 0

Number of gates

Example
let { architect } = require("@liquid-carrot/carrot");

let network = architect.Random(1, 20, 2, {
  connections: 40,
  gates: 4,
  selfconnections: 4
});
    