
Activation functions

Activation functions determine the activation value a neuron produces from its input. Choosing an activation function suited to your network's problem can noticeably improve its ability to learn.


Example

      let { methods, Node } = require("@liquid-carrot/carrot");

// Changing a neuron's activation function
let A = new Node();
A.squash = methods.activation.<ACTIVATION_FUNCTION>;

// eg.
A.squash = methods.activation.LOGISTIC;
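Every activation method below shares the same `(x, derivate)` signature: called with `derivate` omitted or `false` it returns the activation value, and with `derivate = true` it returns the derivative at `x` (used during backpropagation). A minimal hand-rolled sketch of that contract, using the standard logistic sigmoid (an illustration of the pattern, not Carrot's actual source):

```javascript
// Sketch of the shared (x, derivate) activation signature.
// derivate=false -> activation value; derivate=true -> derivative at x.
function logistic(x, derivate = false) {
  const fx = 1 / (1 + Math.exp(-x)); // standard logistic sigmoid
  return derivate ? fx * (1 - fx) : fx;
}

console.log(logistic(0));       // 0.5
console.log(logistic(0, true)); // 0.25
```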
    

Methods

(static) ABSOLUTE (x, derivate)

Absolute function: f(x) = |x|.

Avoid using this activation function on a node with a self-connection.

Parameters:
Name Type Description
x number | Array.<number>

Input value(s) to activation function

derivate boolean <optional>

Flag to select derivative function

Example
      let { methods, Node } = require("@liquid-carrot/carrot");

// Changing a neuron's activation function
let A = new Node();
A.squash = methods.activation.ABSOLUTE;
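As a sketch of how the `derivate` flag behaves for this method (an illustration of the standard absolute-value math, not Carrot's exact source):

```javascript
// Absolute-value activation in the shared (x, derivate) form.
function absolute(x, derivate = false) {
  // Slope of |x| is -1 for x < 0, +1 for x > 0; the value at exactly
  // x = 0 is a convention, since |x| is not differentiable there.
  if (derivate) return x < 0 ? -1 : 1;
  return Math.abs(x);
}

console.log(absolute(-3));       // 3
console.log(absolute(-3, true)); // -1
```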
    

(static) BENT_IDENTITY (x, derivate)

Bent identity function: f(x) = (sqrt(x^2 + 1) - 1) / 2 + x.

Parameters:
Name Type Description
x number | Array.<number>

Input value(s) to activation function

derivate boolean <optional>

Flag to select derivative function

Example
      let { methods, Node } = require("@liquid-carrot/carrot");

// Changing a neuron's activation function
let A = new Node();
A.squash = methods.activation.BENT_IDENTITY;
    

(static) BIPOLAR (x, derivate)

Bipolar function: returns 1 if x > 0, otherwise -1.

Parameters:
Name Type Description
x number | Array.<number>

Input value to activation function

derivate boolean <optional>

Flag to select derivative function

Example
      let { methods, Node } = require("@liquid-carrot/carrot");

// Changing a neuron's activation function
let A = new Node();
A.squash = methods.activation.BIPOLAR;
    

(static) BIPOLAR_SIGMOID (x, derivate)

Bipolar sigmoid function: f(x) = 2 / (1 + e^(-x)) - 1, squashing input into the range (-1, 1).

Parameters:
Name Type Description
x number | Array.<number>

Input value(s) to activation function

derivate boolean <optional>

Flag to select derivative function

Example
      let { methods, Node } = require("@liquid-carrot/carrot");

// Changing a neuron's activation function
let A = new Node();
A.squash = methods.activation.BIPOLAR_SIGMOID;
    

(static) GAUSSIAN (x, derivate)

Gaussian function: f(x) = e^(-x^2).

Parameters:
Name Type Description
x number | Array.<number>

Input value(s) to activation function

derivate boolean <optional>

Flag to select derivative function

Example
      let { methods, Node } = require("@liquid-carrot/carrot");

// Changing a neuron's activation function
let A = new Node();
A.squash = methods.activation.GAUSSIAN;
    

(static) HARD_TANH (x, derivate)

Hard tanh function: f(x) = max(-1, min(1, x)).

Parameters:
Name Type Description
x number | Array.<number>

Input value(s) to activation function

derivate boolean <optional>

Flag to select derivative function

Example
      let { methods, Node } = require("@liquid-carrot/carrot");

// Changing a neuron's activation function
let A = new Node();
A.squash = methods.activation.HARD_TANH;
    

(static) IDENTITY (x, derivate)

Identity function: f(x) = x.

Returns its input unchanged; used for memory neurons.

Parameters:
Name Type Description
x number | Array.<number>

Input value(s) to activation function

derivate boolean <optional>

Flag to select derivative function

Example
      let { methods, Node } = require("@liquid-carrot/carrot");

// Changing a neuron's activation function
let A = new Node();
A.squash = methods.activation.IDENTITY;
    

(static) INVERSE (x, derivate)

Parameters:
Name Type Description
x number | Array.<number>

Input value(s) to activation function

derivate boolean <optional>

Flag to select derivative function

Example
      let { methods, Node } = require("@liquid-carrot/carrot");

// Changing a neuron's activation function
let A = new Node();
A.squash = methods.activation.INVERSE;
    

(static) LOGISTIC (x, derivate)

Logistic (sigmoid) function: f(x) = 1 / (1 + e^(-x)).

Parameters:
Name Type Description
x number | Array.<number>

Input value(s) to activation function

derivate boolean <optional>

Flag to select derivative function

Example
      let { methods, Node } = require("@liquid-carrot/carrot");

// Changing a neuron's activation function
let A = new Node();
A.squash = methods.activation.LOGISTIC;
    

(static) RELU (x, derivate)

Rectified linear unit: f(x) = max(0, x).

Parameters:
Name Type Description
x number | Array.<number>

Input value(s) to activation function

derivate boolean <optional>

Flag to select derivative function

Example
      let { methods, Node } = require("@liquid-carrot/carrot");

// Changing a neuron's activation function
let A = new Node();
A.squash = methods.activation.RELU;
    

(static) SELU (x, derivate)

Scaled exponential linear unit (SELU).

Exponential linear units push mean activations closer to zero, which speeds up learning, and ELUs have been shown to obtain higher classification accuracy than ReLUs. In a generic ELU, α is a hyper-parameter to be tuned under the constraint α ≥ 0; SELU instead fixes α (and a scale factor λ) to specific constants.

Parameters:
Name Type Description
x number | Array.<number>

Input value to activation function

derivate boolean <optional>

Flag to select derivative function

Example
      let { methods, Node } = require("@liquid-carrot/carrot");

// Changing a neuron's activation function
let A = new Node();
A.squash = methods.activation.SELU;
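The piecewise definition above can be sketched as follows. The constants are the standard SELU values from Klambauer et al. (2017); this is an illustration of the math, not necessarily Carrot's exact source:

```javascript
// Standard SELU constants (assumed; Klambauer et al., 2017).
const ALPHA = 1.6732632423543772;
const LAMBDA = 1.0507009873554805;

// SELU in the shared (x, derivate) form:
// f(x) = lambda * x               for x > 0
// f(x) = lambda * alpha*(e^x - 1) for x <= 0
function selu(x, derivate = false) {
  const core = x > 0 ? x : ALPHA * (Math.exp(x) - 1); // unscaled ELU
  if (derivate) {
    // d/dx: lambda for x > 0, lambda * alpha * e^x otherwise
    return x > 0 ? LAMBDA : LAMBDA * (core + ALPHA);
  }
  return LAMBDA * core;
}

console.log(selu(0)); // 0
```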
    

(static) SINUSOID (x, derivate)

Sinusoid function: f(x) = sin(x).

Parameters:
Name Type Description
x number | Array.<number>

Input value(s) to activation function

derivate boolean <optional>

Flag to select derivative function

Example
      let { methods, Node } = require("@liquid-carrot/carrot");

// Changing a neuron's activation function
let A = new Node();
A.squash = methods.activation.SINUSOID;
    

(static) SOFTSIGN (x, derivate)

Softsign function: f(x) = x / (1 + |x|).

Parameters:
Name Type Description
x number | Array.<number>

Input value(s) to activation function

derivate boolean <optional>

Flag to select derivative function

Example
      let { methods, Node } = require("@liquid-carrot/carrot");

// Changing a neuron's activation function
let A = new Node();
A.squash = methods.activation.SOFTSIGN;
    

(static) STEP (x, derivate)

Binary step function: returns 1 if x > 0, otherwise 0.

Parameters:
Name Type Description
x number | Array.<number>

Input value(s) to activation function

derivate boolean <optional>

Flag to select derivative function

Example
      let { methods, Node } = require("@liquid-carrot/carrot");

// Changing a neuron's activation function
let A = new Node();
A.squash = methods.activation.STEP;
    

(static) TANH (x, derivate)

Hyperbolic tangent function: f(x) = tanh(x).

Parameters:
Name Type Description
x number | Array.<number>

Input value(s) to activation function

derivate boolean <optional>

Flag to select derivative function

Example
      Copy
      let { methods, Node } = require("@liquid-carrot/carrot");

// Changing a neuron's activation function
let A = new Node();
A.squash = methods.activation.TANH;
    