MLActivationFunctions
by RicardoSantos

Library "MLActivationFunctions"
Activation functions for neural networks.

binary_step(value) Basic threshold output classifier used to activate/deactivate a neuron.
  Parameters:
    value: float, value to process.
  Returns: float
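  Note: for reference, the textbook step rule returns 1 when the input is at or above the threshold (commonly 0) and 0 otherwise; the exact threshold convention is implementation-defined.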

linear(value) Identity function: the output equals the input.
  Parameters:
    value: float, value to process.
  Returns: float

sigmoid(value) Sigmoid or logistic function.
  Parameters:
    value: float, value to process.
  Returns: float
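  Note: for reference, the standard logistic form is sigmoid(x) = 1 / (1 + exp(-x)), mapping any real input into the (0, 1) range.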

sigmoid_derivative(value) Derivative of sigmoid function.
  Parameters:
    value: float, value to process.
  Returns: float
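  Note: conventionally the sigmoid derivative is sigmoid(x) * (1 - sigmoid(x)); whether the raw input or the already-computed sigmoid output is expected here should be checked against the source.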

tanh(value) Hyperbolic tangent function.
  Parameters:
    value: float, value to process.
  Returns: float
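  Note: for reference, tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x)), with output in the (-1, 1) range.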

tanh_derivative(value) Hyperbolic tangent function derivative.
  Parameters:
    value: float, value to process.
  Returns: float
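  Note: conventionally the tanh derivative is 1 - tanh(x)^2.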

relu(value) Rectified linear unit (RELU) function.
  Parameters:
    value: float, value to process.
  Returns: float
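  Note: for reference, relu(x) = max(0, x).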

relu_derivative(value) RELU function derivative.
  Parameters:
    value: float, value to process.
  Returns: float
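  Note: conventionally 1 for positive inputs and 0 otherwise.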

leaky_relu(value) Leaky RELU function.
  Parameters:
    value: float, value to process.
  Returns: float
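  Note: for reference, the common form is x for positive inputs and a small multiple of x otherwise (a slope of 0.01 is typical; the slope used here is implementation-defined).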

leaky_relu_derivative(value) Leaky RELU function derivative.
  Parameters:
    value: float, value to process.
  Returns: float
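  Note: conventionally 1 for positive inputs and the leak slope otherwise.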

relu6(value) RELU-6 function.
  Parameters:
    value: float, value to process.
  Returns: float
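  Note: for reference, relu6(x) = min(max(0, x), 6), i.e. a ReLU clipped at 6.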

softmax(value) Softmax function.
  Parameters:
    value: float array, values to process.
  Returns: float
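  Note: for reference, the textbook softmax turns the array into a probability distribution: each element becomes exp(x_i) divided by the sum of exp(x_j) over the whole array.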

softplus(value) Softplus function.
  Parameters:
    value: float, value to process.
  Returns: float
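  Note: for reference, softplus(x) = log(1 + exp(x)), a smooth approximation of ReLU.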

softsign(value) Softsign function.
  Parameters:
    value: float, value to process.
  Returns: float
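  Note: for reference, softsign(x) = x / (1 + abs(x)), with output in the (-1, 1) range.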

elu(value, alpha) Exponential Linear Unit (ELU) function.
  Parameters:
    value: float, value to process.
    alpha: float, default=1.0, predefined constant, controls the value to which an ELU saturates for negative net inputs.
  Returns: float
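  Note: for reference, the common form is x for positive inputs and alpha * (exp(x) - 1) otherwise.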

selu(value, alpha, scale) Scaled Exponential Linear Unit (SELU) function.
  Parameters:
    value: float, value to process.
    alpha: float, default=1.67326324, predefined constant, controls the value to which an SELU saturates for negative net inputs.
    scale: float, default=1.05070098, predefined constant.
  Returns: float
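  Note: for reference, the common form is scale * x for positive inputs and scale * alpha * (exp(x) - 1) otherwise; the listed defaults match the usual SELU constants.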

exponential(value) Wrapper around the math.exp() function.
  Parameters:
    value: float, value to process.
  Returns: float

function(name, value, alpha, scale) Applies the activation function selected by name.
  Parameters:
    name: string, name of activation function.
    value: float, value to process.
    alpha: float, default=na, if required.
    scale: float, default=na, if required.
  Returns: float
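
A rough usage sketch in Pine v5 follows; the import path is inferred from the author, library name, and version above, and the accepted name strings (e.g. "sigmoid") are assumptions that should be checked against the library source:

//@version=5
indicator("MLActivationFunctions demo")
import RicardoSantos/MLActivationFunctions/2 as act
// Toy input: price change normalized by recent volatility.
float x = ta.change(close) / ta.stdev(close, 20)
// Route the value through an activation selected by name.
plot(act.function("sigmoid", x))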

derivative(name, value, alpha, scale) Derivative of the activation function selected by name.
  Parameters:
    name: string, name of activation function.
    value: float, value to process.
    alpha: float, default=na, if required.
    scale: float, default=na, if required.
  Returns: float
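  Note: derivative() is expected to mirror function(): passing the same name string (e.g. "sigmoid", again an assumption) with a value returns the corresponding derivative.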
Release Notes:
v2

Added:
softmax_derivative(value) Softmax derivative function.
  Parameters:
    value: float array, values to process.
  Returns: float
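  Note: for reference, the common element-wise form is s * (1 - s), where s is the softmax output; the full Jacobian also contains cross terms -s_i * s_j, so how the library reduces it to an array is implementation-defined.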
Pine library

In true TradingView spirit, the author has published this Pine code as an open-source library so that other Pine programmers from our community can reuse it. Cheers to the author! You may use this library privately or in other open-source publications, but reuse of this code in a publication is governed by House Rules.

Disclaimer

The information and publications are not meant to be, and do not constitute, financial, investment, trading, or other types of advice or recommendations supplied or endorsed by TradingView. Read more in the Terms of Use.

Want to use this library?

Copy this line and paste it in your script.
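
Based on the author, library name, and version above, the import statement is presumably of the form:

import RicardoSantos/MLActivationFunctions/2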