Package smile.base.mlp
Class OutputLayer
java.lang.Object
smile.base.mlp.Layer
smile.base.mlp.OutputLayer
All Implemented Interfaces:
    Serializable
The output layer in the neural network.
Field Summary
Fields inherited from class smile.base.mlp.Layer
bias, biasGradient, biasGradientMoment1, biasGradientMoment2, biasUpdate, dropout, mask, n, output, outputGradient, p, weight, weightGradient, weightGradientMoment1, weightGradientMoment2, weightUpdate
Constructor Summary
OutputLayer(int n, int p, OutputFunction activation, Cost cost)
    Constructor.
Method Summary
void backpropagate(double[] lowerLayerGradient)
    Propagates the errors back to a lower layer.
void computeOutputGradient(double[] target, double weight)
    Computes the network output gradient.
Cost cost()
    Returns the cost function of the neural network.
String toString()
void transform(double[] x)
    The activation or output function.
Methods inherited from class smile.base.mlp.Layer
    backpopagateDropout, builder, computeGradient, computeGradientUpdate, getInputSize, getOutputSize, gradient, input, input, leaky, leaky, leaky, linear, linear, mle, mse, of, output, propagate, propagateDropout, rectifier, rectifier, sigmoid, sigmoid, tanh, tanh, update
Constructor Details

OutputLayer
public OutputLayer(int n, int p, OutputFunction activation, Cost cost)
Constructor.
Parameters:
    n - the number of neurons.
    p - the number of input variables (not including bias value).
    activation - the output activation function.
    cost - the cost function.
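For illustration, a minimal sketch of constructing an output layer directly. The enum constants OutputFunction.SOFTMAX and Cost.LIKELIHOOD, the class name OutputLayerExample, and the layer sizes are assumptions for the example, not taken from this page; check the OutputFunction and Cost documentation for the constants available in your version.

    import smile.base.mlp.Cost;
    import smile.base.mlp.OutputFunction;
    import smile.base.mlp.OutputLayer;

    public class OutputLayerExample {
        public static void main(String[] args) {
            // Hypothetical sizes: a 3-class output layer fed by 16 units of the previous layer.
            int n = 3;   // number of neurons
            int p = 16;  // number of input variables (not including the bias value)

            // Softmax output with likelihood cost; the enum constants are assumed names.
            OutputLayer layer = new OutputLayer(n, p, OutputFunction.SOFTMAX, Cost.LIKELIHOOD);
            System.out.println(layer);
        }
    }

The mle and mse methods inherited from Layer appear to offer builder-style alternatives; the sketch above calls the documented constructor directly.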
Method Details
toString
public String toString()
cost
public Cost cost()
Returns the cost function of the neural network.
Returns:
    the cost function.
transform
public void transform(double[] x)
Description copied from class: Layer
The activation or output function.
backpropagate
public void backpropagate(double[] lowerLayerGradient)
Description copied from class: Layer
Propagates the errors back to a lower layer.
Specified by:
    backpropagate in class Layer
Parameters:
    lowerLayerGradient - the gradient vector of the lower layer.
computeOutputGradient
public void computeOutputGradient(double[] target, double weight)
Computes the network output gradient.
Parameters:
    target - the desired output.
    weight - a positive weight value associated with the training instance.
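A rough sketch of how the methods documented here combine in a single manual training step. Only the OutputLayer methods on this page are taken as given; the signature of the inherited propagate method, the enum constants, and the dimensions are assumptions for illustration.

    import smile.base.mlp.Cost;
    import smile.base.mlp.OutputFunction;
    import smile.base.mlp.OutputLayer;

    public class OutputGradientSketch {
        public static void main(String[] args) {
            int n = 3, p = 16;  // hypothetical layer sizes
            OutputLayer layer = new OutputLayer(n, p, OutputFunction.SOFTMAX, Cost.LIKELIHOOD);

            double[] hidden = new double[p];    // activations of the lower layer (dummy values)
            double[] target = {1.0, 0.0, 0.0};  // desired output, e.g. a one-hot class label

            layer.propagate(hidden);                  // forward pass through the output layer
            layer.computeOutputGradient(target, 1.0); // weight 1.0: an unweighted training instance
            double[] lowerGradient = new double[p];
            layer.backpropagate(lowerGradient);       // fill in the gradient for the lower layer
        }
    }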