Implementing an Activation Layer in NumPy
[Abstract] This post implements an Activation layer in NumPy: a lookup table maps activation-function names (ReLU, Sigmoid, SELU, ELU, Softmax, LeakyReLU, TanH, SoftPlus) to their implementing classes, and the layer dispatches the forward and backward passes to the selected function.
```python
activation_functions = {
    'relu': ReLU,
    'sigmoid': Sigmoid,
    'selu': SELU,
    'elu': ELU,
    'softmax': Softmax,
    'leaky_relu': LeakyReLU,
    'tanh': TanH,
    'softplus': SoftPlus
}
```
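The classes referenced in this table (ReLU, Sigmoid, and the rest) are not shown in this excerpt. Each is expected to be callable for the forward value and to expose a `gradient` method for the derivative. As an illustrative sketch of that interface (the original post's implementations may differ), two of them could look like this:

```python
import numpy as np

class Sigmoid:
    """Logistic sigmoid, squashing inputs into (0, 1)."""
    def __call__(self, x):
        return 1 / (1 + np.exp(-x))

    def gradient(self, x):
        # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x))
        s = self.__call__(x)
        return s * (1 - s)


class ReLU:
    """Rectified linear unit: passes positive inputs, zeroes the rest."""
    def __call__(self, x):
        return np.where(x >= 0, x, 0)

    def gradient(self, x):
        # Subgradient: 1 where x >= 0, 0 elsewhere
        return np.where(x >= 0, 1, 0)
```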
```python
class Activation(Layer):
    """A layer that applies an activation operation to the input.

    Parameters:
    -----------
    name: string
        The name of the activation function that will be used.
    """

    def __init__(self, name):
        self.activation_name = name
        self.activation_func = activation_functions[name]()
        self.trainable = True

    def layer_name(self):
        return "Activation (%s)" % (self.activation_func.__class__.__name__)

    def forward_pass(self, X, training=True):
        # Cache the input so backward_pass can evaluate the gradient at it
        self.layer_input = X
        return self.activation_func(X)

    def backward_pass(self, accum_grad):
        # Chain rule: scale the incoming gradient by the activation's derivative
        return accum_grad * self.activation_func.gradient(self.layer_input)

    def output_shape(self):
        # Activations are elementwise, so the output shape equals the input shape
        return self.input_shape
```
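The class inherits from a `Layer` base class defined elsewhere in the original project. A minimal usage sketch, assuming a bare-bones `Layer` stub and the `ReLU` class sketched above:

```python
import numpy as np

# Hypothetical stand-in for the project's Layer base class,
# which is not shown in this excerpt; define it before Activation.
class Layer:
    input_shape = None

layer = Activation('relu')
layer.input_shape = (4,)

X = np.array([[-1.0, 0.5, 2.0, -3.0]])
out = layer.forward_pass(X)             # [[0.  0.5 2.  0. ]]

accum_grad = np.ones_like(X)            # gradient from the next layer
grad = layer.backward_pass(accum_grad)  # [[0. 1. 1. 0.]]

print(layer.layer_name())               # Activation (ReLU)
print(layer.output_shape())             # (4,)
```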
Source: wanghao.blog.csdn.net, by AI浩. Copyright belongs to the original author; please contact the author for reprint permission.
Original link: wanghao.blog.csdn.net/article/details/120322469