For hᵢ = σ(Wᵢx + b), the pre-activation is the weighted sum

    z = Σᵢ₌₁ⁿ wᵢxᵢ + b

[Figure: interactive diagram of an artificial neuron with biological analogies. Inputs x₁, x₂, x₃, x₄ (dendrites / input vector) are scaled by weights w₁, w₂, w₃, w₄ (synapses), summed together with a bias b (activation threshold) in the soma (summation Σ), and passed through an activation function f at the axon hillock; the axon carries the output h.]

Common activation functions:

    Sigmoid:  h = σ(z) = 1 / (1 + e⁻ᶻ)
    ReLU:     ReLU(z) = max(0, z)
    Tanh:     tanh(z) = (eᶻ − e⁻ᶻ) / (eᶻ + e⁻ᶻ)
    Softmax:  softmax(zᵢ) = e^(zᵢ) / Σⱼ e^(zⱼ),  for j = 1 … n
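The weighted sum and the activation functions above can be sketched in a few lines of code. This is a minimal illustration, not from the original text; the function names (`neuron`, `sigmoid`, `relu`, `softmax`) are chosen here for clarity, and the max-subtraction in `softmax` is a standard numerical-stability trick assumed rather than stated in the source.

```python
import math

def neuron(x, w, b, activation):
    # Pre-activation: z = sum(w_i * x_i) + b, the weighted sum plus bias
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    # Output: h = f(z), the activation applied to the pre-activation
    return activation(z)

def sigmoid(z):
    # sigma(z) = 1 / (1 + e^(-z)), squashes z into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def relu(z):
    # ReLU(z) = max(0, z), zero for negative inputs, identity otherwise
    return max(0.0, z)

def tanh(z):
    # tanh(z) = (e^z - e^(-z)) / (e^z + e^(-z)), squashes z into (-1, 1)
    return math.tanh(z)

def softmax(zs):
    # softmax(z_i) = e^(z_i) / sum_j e^(z_j); operates on a vector, not a scalar.
    # Subtracting the max before exponentiating avoids overflow (assumed detail).
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

# Example: one neuron with four inputs, matching the x1..x4 / w1..w4 diagram
h = neuron([1.0, 2.0, 3.0, 4.0], [0.1, -0.2, 0.3, 0.05], b=0.5, activation=sigmoid)
```

Note that sigmoid, ReLU, and tanh act element-wise on a single pre-activation z, while softmax normalizes an entire vector of pre-activations so the outputs sum to 1.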