Layer

Layers are pre-built network structures that allow you to combine different architectures into one network.

Always start your network with a Dense layer and always end it with a Dense layer. You can connect layers with each other just like you can connect Nodes and Groups with each other.

Constructor

new Layer()

Accepts the same arguments as Group, since it inherits from Group

Example

Custom architecture built with layers

let { architect, Layer } = require("@liquid-carrot/carrot");

let input = new Layer.Dense(1);
let hidden1 = new Layer.LSTM(5);
let hidden2 = new Layer.GRU(1);
let output = new Layer.Dense(1);

// connect however you want
input.connect(hidden1);
hidden1.connect(hidden2);
hidden2.connect(output);

let network = architect.Construct([input, hidden1, hidden2, output]);
    
Properties:

  output            Array.<Node>                   Output nodes
  nodes             Array.<Node>                   Nodes within the layer
  incoming          Array.<Group> | Array.<Node>   Incoming connections
  outgoing          Array.<Group> | Array.<Node>   Outgoing connections
  connections_self  Array.<Group> | Array.<Node>   Self connections


Methods

(static) Dense (size) → {Layer}

Creates a regular (dense) layer.

Parameters:

  size  number  Amount of nodes to build the layer with

Returns:

Plain layer

Example
let { Layer } = require("@liquid-carrot/carrot");

// a dense layer of 10 nodes
let layer = new Layer.Dense(10);

(static) GRU (size) → {Layer}

Creates a GRU layer.

The GRU layer is similar to the LSTM layer; however, it has no memory cell and only two gates. It is also a recurrent layer, well suited to time-series prediction.
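As a rough illustration of those two gates, here is a minimal scalar GRU step in plain JavaScript. The weights are hypothetical and this sketch is not carrot's implementation; it only shows that the hidden state h is the sole value carried between time steps:

```javascript
// Minimal scalar GRU cell sketch (illustrative only, not carrot's code).
// A GRU has just two gates: an update gate (z) and a reset gate (r);
// unlike an LSTM there is no separate memory cell.
const sigmoid = x => 1 / (1 + Math.exp(-x));

function gruStep(x, h, w) {
  const z = sigmoid(w.wz * x + w.uz * h);             // update gate
  const r = sigmoid(w.wr * x + w.ur * h);             // reset gate
  const hCand = Math.tanh(w.wh * x + w.uh * (r * h)); // candidate state
  return (1 - z) * h + z * hCand;                     // blend old and new state
}

// Feed a short sequence through the cell with hypothetical weights.
const w = { wz: 0.5, uz: 0.5, wr: 0.5, ur: 0.5, wh: 1.0, uh: 1.0 };
let h = 0;
for (const x of [1, 0, 1]) h = gruStep(x, h, w);
console.log(h); // final hidden state, a number in (-1, 1)
```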

Parameters:

  size  number  Amount of nodes to build the layer with

Returns:

GRU layer

Example
let { Layer } = require("@liquid-carrot/carrot");

// a GRU layer of 5 nodes
let layer = new Layer.GRU(5);

(static) LSTM (size) → {Layer}

Creates an LSTM layer.

LSTM layers are useful for detecting and predicting patterns over long time lags. This is a recurrent layer.

Note: architect.LSTM currently performs better than an equivalent network built with LSTM Layers.
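For intuition, here is a minimal scalar LSTM step in plain JavaScript, showing the memory cell c and the three gates that the GRU lacks. The weights are hypothetical; this is an illustrative sketch, not carrot's implementation:

```javascript
// Minimal scalar LSTM cell sketch (illustrative only, not carrot's code).
// An LSTM keeps a separate memory cell c alongside the hidden state h,
// guarded by three gates: forget (f), input (i) and output (o).
const sigmoid = x => 1 / (1 + Math.exp(-x));

function lstmStep(x, h, c, w) {
  const f = sigmoid(w.wf * x + w.uf * h);   // forget gate: how much of c to keep
  const i = sigmoid(w.wi * x + w.ui * h);   // input gate: how much new info to write
  const o = sigmoid(w.wo * x + w.uo * h);   // output gate: how much of c to expose
  const g = Math.tanh(w.wg * x + w.ug * h); // candidate cell content
  const cNext = f * c + i * g;              // the memory cell carries over time
  return { h: o * Math.tanh(cNext), c: cNext };
}

// Feed a short sequence through the cell with hypothetical weights.
const w = { wf: 0.5, uf: 0.5, wi: 0.5, ui: 0.5, wo: 0.5, uo: 0.5, wg: 1, ug: 1 };
let state = { h: 0, c: 0 };
for (const x of [1, 0, 1]) state = lstmStep(x, state.h, state.c, w);
console.log(state.h); // final hidden state, a number in (-1, 1)
```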

Parameters:

  size  number  Amount of nodes to build the layer with

Returns:

LSTM layer

Example
let { Layer } = require("@liquid-carrot/carrot");

// an LSTM layer of 5 nodes
let layer = new Layer.LSTM(5);

(static) Memory (size, memory_size) → {Layer}

Creates a Memory layer.

The Memory layer makes a network remember a number of previous inputs verbatim. For example, if you set memory_size to 3, it will remember the last 3 inputs exactly as they were presented.
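Conceptually, a Memory layer with memory_size of 3 acts like a shift register. A plain-JavaScript sketch of the idea (not carrot's implementation):

```javascript
// Sketch of what a Memory layer does conceptually: with a memory size of 3
// it exposes the last 3 inputs exactly as they were received.
function makeMemory(memorySize) {
  const buffer = new Array(memorySize).fill(0);
  return function remember(input) {
    buffer.push(input);    // newest input in
    buffer.shift();        // oldest input out
    return buffer.slice(); // current window of remembered inputs
  };
}

const remember = makeMemory(3);
remember(1); // [0, 0, 1]
remember(2); // [0, 1, 2]
console.log(remember(3)); // [1, 2, 3] -- the last 3 inputs, unchanged
```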

Parameters:

  size         number  Amount of nodes to build the layer with
  memory_size  number  Number of previous inputs to remember

Returns:

Layer with nodes that store previous inputs

Example
let { Layer } = require("@liquid-carrot/carrot");

// a memory layer of 10 nodes that remembers the last 3 inputs
let layer = new Layer.Memory(10, 3);