softplus(value) Softplus function. Parameters: value: float, value to process. Returns: float
softsign(value) Softsign function. Parameters: value: float, value to process. Returns: float
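The two functions above follow the standard definitions softplus(x) = ln(1 + e^x) and softsign(x) = x / (1 + |x|). A minimal Python sketch of that math (an illustration, not the library's own implementation):

```python
import math

def softplus(value: float) -> float:
    # softplus(x) = ln(1 + e^x), a smooth approximation of max(0, x)
    return math.log1p(math.exp(value))

def softsign(value: float) -> float:
    # softsign(x) = x / (1 + |x|), output bounded in (-1, 1)
    return value / (1.0 + abs(value))
```

For example, softplus(0) = ln 2 ≈ 0.693 and softsign(1) = 0.5.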
elu(value, alpha) Exponential Linear Unit (ELU) function. Parameters: value: float, value to process. alpha: float, default=1.0, predefined constant; controls the value to which the ELU saturates for negative net inputs. Returns: float
selu(value, alpha, scale) Scaled Exponential Linear Unit (SELU) function. Parameters: value: float, value to process. alpha: float, default=1.67326324, predefined constant; controls the value to which the SELU saturates for negative net inputs. scale: float, default=1.05070098, predefined constant; output scaling factor. Returns: float
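ELU is the identity for positive inputs and alpha·(e^x − 1) for negative inputs, so it saturates to −alpha; SELU multiplies the same expression by scale, with the defaults above chosen for self-normalization. A Python sketch of those definitions (assumed standard forms, not the library source):

```python
import math

def elu(value: float, alpha: float = 1.0) -> float:
    # identity for positive inputs; saturates to -alpha as value -> -inf
    return value if value > 0 else alpha * (math.exp(value) - 1.0)

def selu(value: float, alpha: float = 1.67326324, scale: float = 1.05070098) -> float:
    # scaled ELU; default constants give the self-normalizing property
    return scale * (value if value > 0 else alpha * (math.exp(value) - 1.0))
```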
exponential(value) Exponential function; a thin wrapper around math.exp(). Parameters: value: float, value to process. Returns: float
function(name, value, alpha, scale) Applies the activation function selected by name. Parameters: name: string, name of the activation function. value: float, value to process. alpha: float, default=na, used if the selected function requires it. scale: float, default=na, used if the selected function requires it. Returns: float
derivative(name, value, alpha, scale) Derivative of the activation function selected by name. Parameters: name: string, name of the activation function. value: float, value to process. alpha: float, default=na, used if the selected function requires it. scale: float, default=na, used if the selected function requires it. Returns: float
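function() and derivative() dispatch on a name string, with alpha and scale falling back to each function's own defaults when left as na. The accepted name strings below are assumptions for illustration; a Python sketch of the dispatch logic and the analytic first derivatives might look like:

```python
import math

def activation(name: str, value: float, alpha: float = None, scale: float = None) -> float:
    # hypothetical dispatcher; None stands in for Pine's na defaults
    if name == "softplus":
        return math.log1p(math.exp(value))
    if name == "softsign":
        return value / (1.0 + abs(value))
    if name == "elu":
        a = 1.0 if alpha is None else alpha
        return value if value > 0 else a * (math.exp(value) - 1.0)
    if name == "selu":
        a = 1.67326324 if alpha is None else alpha
        s = 1.05070098 if scale is None else scale
        return s * (value if value > 0 else a * (math.exp(value) - 1.0))
    if name == "exponential":
        return math.exp(value)
    raise ValueError(f"unknown activation: {name}")

def derivative(name: str, value: float, alpha: float = None, scale: float = None) -> float:
    # analytic first derivative of each activation above
    if name == "softplus":
        return 1.0 / (1.0 + math.exp(-value))  # the sigmoid function
    if name == "softsign":
        return 1.0 / (1.0 + abs(value)) ** 2
    if name == "elu":
        a = 1.0 if alpha is None else alpha
        return 1.0 if value > 0 else a * math.exp(value)
    if name == "selu":
        a = 1.67326324 if alpha is None else alpha
        s = 1.05070098 if scale is None else scale
        return s * (1.0 if value > 0 else a * math.exp(value))
    if name == "exponential":
        return math.exp(value)
    raise ValueError(f"unknown activation: {name}")
```

Keeping the pair name-compatible lets callers evaluate a function and its gradient through one string parameter, which is what makes these usable in backpropagation-style updates.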