Layer

new Layer()

Layers are pre-built architectures that allow you to combine different network architectures into one network.

Always start your network with a Dense layer and always end it with a Dense layer. You can connect layers to each other just as you can connect Nodes and Groups.

Example

Custom architecture built with layers

let { Layer, architect } = require("@liquid-carrot/carrot");

let input = new Layer.Dense(1);
let hidden1 = new Layer.LSTM(5);
let hidden2 = new Layer.GRU(1);
let output = new Layer.Dense(1);

// connect however you want
input.connect(hidden1);
hidden1.connect(hidden2);
hidden2.connect(output);

let network = architect.Construct([input, hidden1, hidden2, output]);
Properties:

output (Array.<Node>) - Output nodes
nodes (Array.<Node>) - Nodes within the layer
connections.in (Array.<Group> | Array.<Node>) - Incoming connections
connections.out (Array.<Group> | Array.<Node>) - Outgoing connections
connections.self (Array.<Group> | Array.<Node>) - Self connections


Methods

(static) activate(value) → {Array.<number>}

Activates all the nodes in the layer

Parameters:

value (Array.<object>) - Array of input values, with length equal to the amount of nodes in the layer

Returns:

Layer output values

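For illustration, a minimal sketch of activating a layer directly (the layer size and input values here are arbitrary):

let { Layer } = require("@liquid-carrot/carrot");

let layer = new Layer.Dense(2);

// One input value per node in the layer; returns the activation of each node
let output = layer.activate([0.5, 1.0]);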

(static) clear()

Clears the context of the nodes in this layer

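As a sketch, clearing state between independent sequences (useful mainly for recurrent layers such as LSTM or GRU):

let { Layer } = require("@liquid-carrot/carrot");

let layer = new Layer.LSTM(5);

// ... activate the layer on one sequence, then reset its context
layer.clear();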

(static) connect(target, method, weight) → {Array.<Connection>}

Connects the nodes in this layer to the nodes in another group or layer, or to a single node

Parameters:

target (Group | Node | Layer) - Node(s) to form connections with
method (connection) - Connection method (see Connection Methods)
weight (number) - An initial weight to build the connections with

Returns:

An array of connections between the nodes in this layer and target

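As a sketch, connecting two layers with an explicit connection method and initial weight (methods.connection.ALL_TO_ALL is assumed to be available from the library's connection methods):

let { Layer, methods } = require("@liquid-carrot/carrot");

let input = new Layer.Dense(2);
let output = new Layer.Dense(1);

// Connect every node in `input` to every node in `output` with weight 0.5
let connections = input.connect(output, methods.connection.ALL_TO_ALL, 0.5);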

(static) Dense(size) → {Layer}

Creates a regular (dense) layer.

Parameters:

size (number) - Amount of nodes to build the layer with

Returns:

Plain layer

Example
let { Layer } = require("@liquid-carrot/carrot");

let layer = new Layer.Dense(size);

(static) disconnect(target, twosided)

Disconnects all the nodes in this layer from another given group, layer, or node

Parameters:

target (Group | Node | Layer) - A Group, Node, or Layer to disconnect from
twosided (boolean, optional, default: false) - If true, incoming connections from the target are removed as well

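A short sketch of a two-sided disconnect (layer sizes are illustrative):

let { Layer } = require("@liquid-carrot/carrot");

let a = new Layer.Dense(2);
let b = new Layer.Dense(2);
a.connect(b);

// Remove the connections between the two layers in both directions
a.disconnect(b, true);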

(static) gate(connections, method)

Makes the nodes in this layer gate the given connection(s)

Parameters:

connections (Array.<Connection>) - Connections to gate
method (gating_method) - Gating method (see Gating Methods)

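As a sketch of gating, the connections returned by connect() can be gated by another layer (methods.gating.OUTPUT is assumed to be one of the library's gating methods):

let { Layer, methods } = require("@liquid-carrot/carrot");

let input = new Layer.Dense(2);
let hidden = new Layer.Dense(2);
let output = new Layer.Dense(1);

let connections = input.connect(hidden);
hidden.connect(output);

// Let the output layer's nodes gate the input-to-hidden connections
output.gate(connections, methods.gating.OUTPUT);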

(static) GRU(size) → {Layer}

Creates a GRU layer.

The GRU layer is similar to the LSTM layer, but it has no memory cell and only two gates. Like the LSTM, it is a recurrent layer well suited to timeseries prediction.

Parameters:

size (number) - Amount of nodes to build the layer with

Returns:

GRU layer

Example
let { Layer } = require("@liquid-carrot/carrot");

let layer = new Layer.GRU(size);

(static) LSTM(size) → {Layer}

Creates an LSTM layer.

LSTM layers are useful for detecting and predicting patterns over long time lags. This is a recurrent layer.

Note: architect.LSTM currently performs better than an equivalent network built with LSTM Layers.

Parameters:

size (number) - Amount of nodes to build the layer with

Returns:

LSTM layer

Example
let { Layer } = require("@liquid-carrot/carrot");

let layer = new Layer.LSTM(size);

(static) Memory(size, memory) → {Layer}

Creates a Memory layer.

The Memory layer makes networks remember a number of previous inputs in an absolute way. For example, if you set the memory option to 3, it will remember the last 3 inputs exactly as they were given.

Parameters:

size (number) - Amount of nodes to build the layer with
memory (number) - Number of previous inputs to remember

Returns:

Layer with nodes that store previous inputs

Example
let { Layer } = require("@liquid-carrot/carrot");

let layer = new Layer.Memory(size, memory);

(static) propagate(target, options)

Propagates all the nodes in the layer

Parameters:

target (Array.<number>, optional) - Ideal/target values - required for output layers
options (object, optional) - Options with the following properties:
  rate (number, optional, default: 0.3) - Sets the learning rate of the backpropagation process
  momentum (number, optional, default: 0) - Momentum. Adds a fraction of the previous weight update to the current one.
  update (boolean, optional, default: true)
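A minimal training-step sketch at the layer level (it assumes, as with Groups, that non-input layers can be activated without an input array; the rate and momentum values are illustrative):

let { Layer } = require("@liquid-carrot/carrot");

let input = new Layer.Dense(1);
let output = new Layer.Dense(1);
input.connect(output);

// Forward pass for the training pair (0.5 -> 1)
input.activate([0.5]);
output.activate();

// Backward pass: output layer first, with the target value
output.propagate([1], { rate: 0.3, momentum: 0 });
input.propagate();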

(static) set(values)

Sets the value of a property for every node

Parameters:

values (Array.<object>) - An object with (all optional) bias, squash, and type properties to overwrite in each node

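An illustrative sketch (methods.activation.RELU is assumed to be one of the library's activation functions):

let { Layer, methods } = require("@liquid-carrot/carrot");

let layer = new Layer.Dense(4);

// Overwrite the bias and squash (activation) function of every node in the layer
layer.set({
  bias: 0,
  squash: methods.activation.RELU
});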