Network

new Network (input_size, output_size)

Create a neural network

Networks are easy to create; all you need to specify is an input and an output size.

Parameters:
Name Type Description
input_size number

Size of the input layer, i.e. the number of neurons in the input layer

output_size number

Size of the output layer, i.e. the number of neurons in the output layer

Example
      let { Network, architect } = require("@liquid-carrot/carrot");

// Network with 2 input neurons and 1 output neuron
let myNetwork = new Network(2, 1);

// and a multi-layered network
let myPerceptron = new architect.Perceptron(5, 20, 10, 5, 1);
    
Properties:
Name Type Description
input_size number

Size of the input layer, i.e. the number of neurons in the input layer

output_size number

Size of the output layer, i.e. the number of neurons in the output layer

nodes Array.<Node>

Nodes currently within the network

gates Array.<Node>

Gates within the network

connections Array.<Connection>

Connections within the network


Namespaces

architecture

Methods

(static) activate (input [, options]) → {Array.<number>}

Activates the network

It will activate all the nodes in activation order and produce an output.

Parameters:
Name Type Default Description
input Array.<number> <optional>

Input values to activate nodes with

options.dropout_rate number <optional> 0

The dropout rate to apply during this activation.

options.trace boolean <optional> true

Controls whether traces are created when activation happens (a trace is meta information left behind for different uses, e.g. backpropagation).

Returns:

Squashed output values

Example
      let { Network } = require("@liquid-carrot/carrot");

// Create a network
let myNetwork = new Network(3, 2);

myNetwork.activate([0.8, 1, 0.21]); // gives: [0.49, 0.51]
    
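The options parameter is not exercised above; a minimal sketch of the two documented options (trace and dropout_rate), assuming the same (3, 2) network:

let { Network } = require("@liquid-carrot/carrot");

let myNetwork = new Network(3, 2);

// activate without leaving traces behind (equivalent to the deprecated noTraceActivate)
myNetwork.activate([0.8, 1, 0.21], { trace: false });

// activate while dropping out nodes at a 10% rate, e.g. inside a manual training loop
myNetwork.activate([0.8, 1, 0.21], { dropout_rate: 0.1 });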

(static) clear ()

Clear the context of the network

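No example is provided in the source; a minimal sketch, assuming a recurrent network built with architect.LSTM, of clearing state between unrelated sequences:

let { architect } = require("@liquid-carrot/carrot");

let lstm = new architect.LSTM(1, 6, 1);

// feed one sequence
lstm.activate([0]);
lstm.activate([1]);

// reset the accumulated state/traces before starting an unrelated sequence
lstm.clear();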

(static) clone () → {Network} beta

Returns a deep copy of Network.

Returns:

Returns an identical network

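No example is provided; a minimal, illustrative sketch of cloning a network and using the copy independently:

let { Network } = require("@liquid-carrot/carrot");

let original = new Network(2, 1);
let copy = original.clone();

// the copy behaves identically but is a separate instance
copy.activate([0, 1]); // same output as original.activate([0, 1])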

(static) connect (from, to [, weight]) → {Array.<Connection>}

Connects a Node to another Node or Group in the network

Parameters:
Name Type Description
from Node

The source Node

to Node | Group

The destination Node or Group

weight number <optional>

An initial weight for the connections to be formed

Returns:

An array of the formed connections

Example
      let { Network } = require("@liquid-carrot/carrot");

myNetwork.connect(myNetwork.nodes[4], myNetwork.nodes[5]); // connects network node 4 to network node 5
    

(static) crossOver (network1, network2 [, equal]) → {Network}

Create an offspring from two parent networks.

Networks are not required to be the same size; however, their input and output sizes must be the same.

Parameters:
Name Type Description
network1 Network

First parent network

network2 Network

Second parent network

equal boolean <optional>

Flag to indicate equally fit Networks

Returns:

New network created from mixing parent networks

Example
      let { Network, architect } = require("@liquid-carrot/carrot");

// Initialise two parent networks
let network1 = new architect.Perceptron(2, 4, 3);
let network2 = new architect.Perceptron(2, 4, 5, 3);

// Produce an offspring
let network3 = Network.crossOver(network1, network2);
    
To Do:
  • Add the ability to customize the crossover method

(static) disconnect (from, to)

Removes the connection going from the from node to the to node

Parameters:
Name Type Description
from Node

Source node

to Node

Destination node

Example
      myNetwork.disconnect(myNetwork.nodes[4], myNetwork.nodes[5]);
// now node 4 does not have an effect on the output of node 5 anymore
    

(static) evolve (dataset [, options]) → {Object}

Evolves the network to reach a lower error on a dataset using the NEAT algorithm

If both iterations and error options are unset, evolve will default to iterations as an end condition.

Parameters:
Name Type Description
dataset Array.<{input: Array.<number>, output: Array.<number>}>

A set of input values and ideal output values to train the network with

options

Configuration options

Properties
Name Type Default Description
iterations number <optional> 1000

Set the maximum amount of iterations/generations for the algorithm to run.

error number <optional> 0.05

Set the target error. The algorithm will stop once this target error has been reached.

growth number <optional> 0.0001

Set the penalty for large networks. Penalty calculation: penalty = (genome.nodes.length + genome.connections.length + genome.gates.length) * growth. This penalty is added on top of the error, so growth should be a very small number.

cost cost <optional> cost.MSE

Specify the cost function for the evolution, this tells a genome in the population how well it's performing. Default: methods.cost.MSE (recommended).

amount number <optional> 1

Set the number of times to test the training set on a genome each generation. Useful for time series; do not use for regular feedforward problems.

threads number <optional>

Specify the amount of threads to use. Default value is the amount of cores in your CPU.

network Network <optional>

log number | boolean <optional> false

If set to n, outputs training status every n iterations. Setting log to 1 will log the status every iteration

schedule.iterations number <optional>

You can schedule tasks to happen every n iterations. Paired with options.schedule.function

schedule.function schedule <optional>

A function to run every n iterations as set by options.schedule.iterations. Passed as an object with a "function" property that contains the function to run.

clear boolean <optional> false

If set to true, will clear the network after every activation. This is useful for evolving recurrent networks, and more importantly for time series prediction.

equal boolean <optional> true

If set to true when Network.crossOver runs it will assume both genomes are equally fit.

population_size number <optional> 50

Population size of each generation.

elitism number <optional> 1

Number of fittest genomes carried over unchanged to the next generation in every evolution loop (elitism in genetic algorithms).

provenance number <optional> 0

Number of genomes inserted into the original network template (Network(input,output)) per evolution.

mutation_rate number <optional> 0.4

Sets the mutation rate. If set to 0.3, 30% of the new population will be mutated.

mutation_amount number <optional> 1

If mutation occurs (randomNumber < mutation_rate), sets amount of times a mutation method will be applied to the network.

fitness_population boolean <optional> true

Flag to return the fitness of a population of genomes. false => evaluate each genome individually. true => evaluate entire population. Adjust fitness function accordingly

fitness function <optional>

A fitness function to evaluate the networks. Takes a genome, i.e. a network, and a dataset and sets the genome's score property

selection string <optional> FITNESS_PROPORTIONATE

Selection method for evolution (e.g. methods.Selection.FITNESS_PROPORTIONATE).

crossover Array <optional>

Sets allowed crossover methods for evolution.

mutation Array <optional>

Sets allowed mutation methods for evolution, a random mutation method will be chosen from the array when mutation occurs. Optional, but default methods are non-recurrent.

max_nodes number <optional> Infinity

Maximum nodes for a potential network

maxConns number <optional> Infinity

Maximum connections for a potential network

maxGates number <optional> Infinity

Maximum gates for a potential network

mutationSelection function <optional> random

Custom mutation selection function if given

efficientMutation boolean <optional> false

Test & reduce mutation methods to avoid failed mutation attempts

Returns:

A summary object of the network's performance.
Properties include: error - error of the best genome, iterations - generations used to evolve networks, time - clock time elapsed while evolving

Example
      let { Network, methods } = require("@liquid-carrot/carrot");

async function execute () {
   var network = new Network(2,1);

   // XOR dataset
   var trainingSet = [
       { input: [0,0], output: [0] },
       { input: [0,1], output: [1] },
       { input: [1,0], output: [1] },
       { input: [1,1], output: [0] }
   ];

   await network.evolve(trainingSet, {
       mutation: methods.mutation.FFW,
       equal: true,
       error: 0.05,
       elitism: 5,
       mutation_rate: 0.5
   });

   // another option
   // await network.evolve(trainingSet, {
   //     mutation: methods.mutation.FFW,
   //     equal: true,
   //     error: 0.05,
   //     elitism: 5,
   //     mutation_rate: 0.5,
   //     cost: (targets, outputs) => {
   //       const error = outputs.reduce(function(total, value, index) {
   //         return total += Math.pow(targets[index] - outputs[index], 2);
   //       }, 0);
   //
   //       return error / outputs.length;
   //     }
   // });


   network.activate([0,0]); // 0.2413
   network.activate([0,1]); // 1.0000
   network.activate([1,0]); // 0.7663
   network.activate([1,1]); // -0.008
}

execute();
    
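The schedule option documented above is not exercised in the example; an illustrative sketch of running a task every 100 generations (the arguments passed to the scheduled function are not documented here):

let { Network } = require("@liquid-carrot/carrot");

let network = new Network(2, 1);
let trainingSet = [
   { input: [0,0], output: [0] },
   { input: [0,1], output: [1] },
   { input: [1,0], output: [1] },
   { input: [1,1], output: [0] }
];

network.evolve(trainingSet, {
   iterations: 500,
   schedule: {
       iterations: 100,
       function: function () { console.log("another 100 generations evolved"); }
   }
});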

(static) fromJSON (json) → {Network}

Convert a json object to a network

Parameters:
Name Type Description
json Object

A network represented as a json object

Returns:

A reconstructed network

Example
      let { Network } = require("@liquid-carrot/carrot");

let exported = myNetwork.toJSON();
let imported = Network.fromJSON(exported) // imported will be a new instance of Network that is an exact clone of myNetwork
    

(static) gate (node, connection)

Makes a network node gate a connection

Parameters:
Name Type Description
node Node

Gating node

connection Connection

Connection to gate with node

Example
      let { Network } = require("@liquid-carrot/carrot");

myNetwork.gate(myNetwork.nodes[1], myNetwork.connections[5])
// now: connection 5's weight is multiplied with node 1's activation
    
To Do:
  • Add ability to gate several network connections at once

(static) merge (network1, network2) → {Network}

Merge two networks into one.

The merge function takes two networks; the output size of network1 must match the input size of network2. Merging is always one-to-one to preserve the purpose of each network.

Parameters:
Name Type Description
network1 Network

Network to merge

network2 Network

Network to merge

Returns:

The merged network

Example
      let { Network, architect } = require("@liquid-carrot/carrot");

let XOR = new architect.Perceptron(2,4,1); // assume this is a trained XOR
let NOT = new architect.Perceptron(1,2,1); // assume this is a trained NOT

// combining these will create an XNOR
let XNOR = Network.merge(XOR, NOT);
    

(static) mutate (method, options) → {Network}

Mutates the network with the given method.

Parameters:
Name Type Description
method mutation

Mutation method

options
Properties
Name Type Default Description
maxNodes number <optional> Infinity

Maximum amount of Nodes a network can grow to

maxConns number <optional> Infinity

Maximum amount of Connections a network can grow to

maxGates number <optional> Infinity

Maximum amount of Gates a network can grow to

Returns:

A mutated version of this network

Example
      let { Network, architect, methods } = require("@liquid-carrot/carrot");

let myNetwork = new architect.Perceptron(2, 4, 1)

myNetwork = myNetwork.mutate(methods.mutation.ADD_GATE) // returns a mutated network with an added gate
    

(static) mutateRandom ([allowedMethods], options) → {Network} alpha

Selects a random mutation method and applies it to the network. Warning: this mutates the network directly rather than returning a separate copy.

Parameters:
Name Type Default Description
allowedMethods Array.<mutation> <optional> methods.mutation.ALL

An array of Mutation methods to automatically pick from

options
Properties
Name Type Default Description
maxNodes number <optional> Infinity

Maximum amount of Nodes a network can grow to

maxConns number <optional> Infinity

Maximum amount of Connections a network can grow to

maxGates number <optional> Infinity

Maximum amount of Gates a network can grow to

Returns:

A mutated version of this network

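No example is provided; a minimal sketch restricted to the feedforward-safe mutation methods (methods.mutation.FFW, as used in the evolve example above), with a node cap:

let { Network, methods } = require("@liquid-carrot/carrot");

let myNetwork = new Network(2, 1);

// applies one randomly picked feedforward mutation, never growing past 20 nodes
myNetwork.mutateRandom(methods.mutation.FFW, { maxNodes: 20 });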

(static) noTraceActivate (input) → {Array.<number>} deprecated

Deprecated, here for backwards compatibility only! Simply calls .activate() with option trace: false

Activates the network without creating traces

Activates the network without calculating eligibility traces for the nodes within it.

Because no traces are calculated, the result cannot be used for backpropagation afterwards; this also makes it quite a bit faster than a regular activate.

Parameters:
Name Type Description
input Array.<number>

An array of input values equal in size to the input layer

Returns:

An array of output values equal in size to the output layer

Example
      let { Network } = require("@liquid-carrot/carrot");

// Create a network
let myNetwork = new Network(3, 2);

myNetwork.noTraceActivate([0.8, 1, 0.21]); // gives: [0.49, 0.51]
    

(static) possible (method) → {false|Array.<object>}

Checks whether a given mutation is possible, returns an array of candidates to use for a mutation when it is.

Parameters:
Name Type Description
method mutation

Mutation method

Returns:

Candidates to use for a mutation. Entries may be arrays containing pairs / tuples when appropriate.

Example
      const { architect, methods } = require("@liquid-carrot/carrot");

const network = new architect.Perceptron(2,3,1)

network.possible(methods.mutation.SUB_NODE) // returns an array of nodes that can be removed
    

(static) propagate (rate, momentum, update, target)

Backpropagate the network

This function allows you to teach the network. If you want to do more complex training, use the network.train() function.

Parameters:
Name Type Default Description
rate number 0.3

Sets the learning rate of the backpropagation process

momentum number 0

Momentum. Adds a fraction of the previous weight update to the current one.

update boolean false

When set to false, weight deltas are accumulated but not applied; the next propagation with update set to true will also apply the deltas accumulated during the preceding "update: false" propagations.

target Array.<number>

Ideal output values for the previous activation; the difference is used to adjust the weights

Example
      let { Network } = require("@liquid-carrot/carrot");

let myNetwork = new Network(1,1);

// This trains the network to function as a NOT gate
for (var i = 0; i < 1000; i++) {
  myNetwork.activate([0]);
  myNetwork.propagate(0.2, 0, true, [1]);
  myNetwork.activate([1]);
  myNetwork.propagate(0.3, 0, true, [0]);
}
    
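As a follow-up to the update parameter above, an illustrative sketch of accumulating weight deltas over several samples and applying them all on the final propagation:

let { Network } = require("@liquid-carrot/carrot");

let net = new Network(2, 1);

// accumulate deltas without applying them...
net.activate([0, 0]);
net.propagate(0.3, 0, false, [0]);
net.activate([0, 1]);
net.propagate(0.3, 0, false, [1]);

// ...then apply everything accumulated so far on the last call
net.activate([1, 1]);
net.propagate(0.3, 0, true, [0]);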

(static) remove (node)

Removes a node from the network; all of its connections will be redirected. If it gates a connection, the gate will be removed.

Parameters:
Name Type Description
node Node

Node to remove from the network

Example
      let { Network, architect } = require("@liquid-carrot/carrot");

let myNetwork = new architect.Perceptron(1,4,1);

// Remove a node
myNetwork.remove(myNetwork.nodes[2]);
    

(static) set (values)

Sets the value of a property for every node in this network

Parameters:
Name Type Description
values.bias number

Bias to set for all network nodes

values.squash activation

Activation function to set for all network nodes

Example
      let { Network, architect } = require("@liquid-carrot/carrot");

var network = new architect.Random(4, 4, 1);

network.set({bias: 1});
// All nodes in 'network' now have a bias of 1
    
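The values.squash property works the same way; a minimal sketch, assuming methods.activation.TANH is available, that sets the activation function of every node:

let { Network, methods } = require("@liquid-carrot/carrot");

let network = new Network(4, 1);

// all nodes in 'network' now use the TANH activation function
network.set({ squash: methods.activation.TANH });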

(static) standalone () → {string}

Creates a standalone function of the network which can be run without the need of a library

Returns:

Function as a string that can be eval'ed

Example
      let { Network, architect } = require("@liquid-carrot/carrot");

var myNetwork = new architect.Perceptron(2,4,1);
myNetwork.activate([0,1]); // [0.24775789809]

// a string
var standalone = myNetwork.standalone();

// turns your network into an 'activate' function
eval(standalone);

// calls the standalone function
activate([0,1]);// [0.24775789809]
    

(static) test (set [, cost]) → {Object}

Tests a set and returns the error and elapsed time

Parameters:
Name Type Default Description
set Array.<{input: Array.<number>, output: Array.<number>}>

A set of input values and ideal output values to test the network against

cost cost <optional> methods.cost.MSE

The cost function used to determine network error

Returns:

A summary object of the network's performance

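No example is provided; a minimal sketch testing an (untrained, hypothetical) network against the XOR set with the documented cost option:

let { Network, methods } = require("@liquid-carrot/carrot");

let network = new Network(2, 1);

let testSet = [
   { input: [0,0], output: [0] },
   { input: [0,1], output: [1] },
   { input: [1,0], output: [1] },
   { input: [1,1], output: [0] }
];

let results = network.test(testSet, methods.cost.MSE); // e.g. { error: ..., time: ... }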

(static) toJSON () → {Object}

Convert the network to a json object

Returns:

The network represented as a json object

Example
      let { Network } = require("@liquid-carrot/carrot");

let exported = myNetwork.toJSON();
let imported = Network.fromJSON(exported) // imported will be a new instance of Network that is an exact clone of myNetwork
    

(static) train (data, options) → {Object}

Trains the network on the given dataset

Parameters:
Name Type Description
data Array.<{input: Array.<number>, output: Array.<number>}>

A dataset of input values and ideal output values to train the network with

options

Options used to train network

Properties
Name Type Default Description
cost cost <optional> methods.cost.MSE

The cost function used to determine network error

rate_policy rate <optional> rate.FIXED

A learning rate policy, i.e. how to change the learning rate during training to get better network performance

rate number <optional> 0.3

Sets the learning rate of the backpropagation process

iterations number <optional> 1000

Sets the maximum number of training cycles the process will run, even when the target error has not been reached.

error number <optional>

The target error to train for, once the network falls below this error, the process is stopped. Lower error rates require more training cycles.

dropout number <optional> 0

Dropout rate: the likelihood for any given neuron to be ignored during network training. Must be between zero and one; numbers closer to one result in more neurons being ignored.

momentum number <optional> 0

Momentum. Adds a fraction of the previous weight update to the current one.

batch_size number <optional> 1

Sets the (mini-) batch size of your training. Default: 1 (online training)

cross_validate.testSize number <optional>

Sets the fraction of the data that should be assigned to cross validation. If set to 0.4, 40% of the given data will be used for cross validation.

cross_validate.test_error number <optional>

Sets the target error of the validation data.

clear boolean <optional> false

If set to true, will clear the network after every activation. This is useful for training LSTMs, and more importantly for time series prediction.

shuffle boolean <optional> false

When set to true, will shuffle the training data every iteration. A good option to use if the network is performing worse in cross validation than on the training data.

log number | boolean <optional> false

If set to n, outputs training status every n iterations. Setting log to 1 will log the status every iteration

schedule.iterations number <optional>

You can schedule tasks to happen every n iterations. Paired with options.schedule.function

schedule.function schedule <optional>

A function to run every n iterations as set by options.schedule.iterations. Passed as an object with a "function" property that contains the function to run.

Returns:

A summary object of the network's performance

Examples

Training with Defaults

      let { Network, architect } = require("@liquid-carrot/carrot");

let network = new architect.Perceptron(2,4,1);

// Train the XOR gate
network.train([{ input: [0,0], output: [0] },
               { input: [0,1], output: [1] },
               { input: [1,0], output: [1] },
               { input: [1,1], output: [0] }]);

network.activate([0,1]); // 0.9824...
    

Training with Options

      let { Network, architect } = require("@liquid-carrot/carrot");

let network = new architect.Perceptron(2,4,1);

let trainingSet = [
   { input: [0,0], output: [0] },
   { input: [0,1], output: [1] },
   { input: [1,0], output: [1] },
   { input: [1,1], output: [0] }
];

// Train the XOR gate
network.train(trainingSet, {
   log: 1,
   iterations: 1000,
   error: 0.0001,
   rate: 0.2
});
    

Cross Validation Example

      let { Network, architect } = require("@liquid-carrot/carrot");

let network = new architect.Perceptron(2,4,1);

let trainingSet = [ // PS: don't use cross validation for small sets, this is just an example
 { input: [0,0], output: [1] },
 { input: [0,1], output: [0] },
 { input: [1,0], output: [0] },
 { input: [1,1], output: [1] }
];

// Train the XNOR gate
network.train(trainingSet, {
 crossValidate:
   {
     testSize: 0.4,
     test_error: 0.02
   }
});
    

(static) ungate (connection)

Remove the gate of a connection.

Parameters:
Name Type Description
connection Connection

Connection to remove gate from

Example
      let { Network, architect } = require("@liquid-carrot/carrot");

let myNetwork = new architect.Perceptron(1, 4, 2);

// Gate a connection
myNetwork.gate(myNetwork.nodes[2], myNetwork.connections[5]);

// Remove the gate from the connection
myNetwork.ungate(myNetwork.connections[5]);
    