[Image Classification] Hand-Coding ResNet: Reproducing ResNet (Keras, TensorFlow 2.x)

AI浩, posted on 2021/12/22 23:36:49

Contents

Abstract

Implementing the Residual Modules

ResNet18, ResNet34

ResNet50, ResNet101, ResNet152


Abstract

ResNet (Residual Neural Network) was proposed by Kaiming He and three colleagues at Microsoft Research. By stacking residual units they successfully trained a 152-layer network and won the ILSVRC 2015 classification task with a top-5 error rate of 3.57%, while using fewer parameters than VGGNet, a remarkable result.

The model's key innovation is the idea of residual learning: shortcut connections are added to the network so that the original input is passed directly to later layers, as shown in the figure below:


In a traditional convolutional or fully connected network, information is inevitably lost or degraded as it passes from layer to layer, and gradients may vanish or explode, which makes very deep networks hard to train. ResNet alleviates this problem: by routing the input directly to the output through a shortcut, the information is preserved and the network only needs to learn the residual, i.e. the difference between input and output, which simplifies the learning objective. The figure below compares VGGNet and ResNet. ResNet's most distinctive feature is the many bypass paths that connect the input directly to later layers; this structure is also known as a shortcut or skip connection.
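As a minimal sketch of this idea (my addition, not the article's module implementation, which follows below), the snippet adds a shortcut around two stacked convolutions with the Keras functional API; the layer sizes are arbitrary example values.

import tensorflow as tf
from tensorflow.keras import layers

# A toy residual connection: the stacked convolutions only need to learn the
# residual F(x); the shortcut carries x through unchanged, so y = relu(F(x) + x).
x = layers.Input(shape=(32, 32, 64))                      # arbitrary feature-map size
fx = layers.Conv2D(64, 3, padding='same', activation='relu')(x)
fx = layers.Conv2D(64, 3, padding='same')(fx)
y = layers.Activation('relu')(layers.add([fx, x]))
toy_model = tf.keras.Model(x, y)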

The ResNet architecture uses two kinds of residual modules: one stacks two 3x3 convolutions, and the other stacks a 1x1, a 3x3 and another 1x1 convolution (the bottleneck design), as shown below:

ResNet comes in several depths; the most common configurations have 18, 34, 50, 101 and 152 layers. All of them are built by stacking the residual modules described above. The figure below lists the different ResNet configurations.
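For reference, the per-stage block counts used by the code later in this article can be written as a small lookup table; the dictionary below simply restates the standard configurations that the constructors further down implement.

# Number of residual blocks in each of the four stages (conv2_x .. conv5_x)
RESNET_LAYER_DIMS = {
    'resnet18':  [2, 2, 2, 2],   # basic blocks
    'resnet34':  [3, 4, 6, 3],   # basic blocks
    'resnet50':  [3, 4, 6, 3],   # bottleneck blocks
    'resnet101': [3, 4, 23, 3],  # bottleneck blocks
    'resnet152': [3, 8, 36, 3],  # bottleneck blocks
}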

Implementing the Residual Modules

The first residual module


  
# The first residual module (basic block: two 3x3 convolutions)
import tensorflow as tf
from tensorflow.keras import layers, Sequential

class BasicBlock(layers.Layer):
    def __init__(self, filter_num, stride=1):
        super(BasicBlock, self).__init__()
        self.conv1 = layers.Conv2D(filter_num, (3, 3), strides=stride, padding='same')
        self.bn1 = layers.BatchNormalization()
        self.relu = layers.Activation('relu')
        self.conv2 = layers.Conv2D(filter_num, (3, 3), strides=1, padding='same')
        self.bn2 = layers.BatchNormalization()
        # When the stride is not 1, the shortcut uses a 1x1 convolution to match the spatial size
        if stride != 1:
            self.downsample = Sequential()
            self.downsample.add(layers.Conv2D(filter_num, (1, 1), strides=stride))
        else:
            self.downsample = lambda x: x

    def call(self, inputs, training=None):
        out = self.conv1(inputs)
        out = self.bn1(out, training=training)
        out = self.relu(out)
        out = self.conv2(out)
        out = self.bn2(out, training=training)
        # Shortcut branch: identity mapping or downsampled input
        identity = self.downsample(inputs)
        output = layers.add([out, identity])
        output = tf.nn.relu(output)
        return output
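As a quick sanity check (my addition, not from the original post), the block defined above can be run on a random tensor to confirm the output shape; the input size here is an arbitrary example.

import tensorflow as tf

block = BasicBlock(filter_num=64, stride=2)   # a downsampling basic block
x = tf.random.normal((1, 56, 56, 64))          # dummy feature map
y = block(x, training=False)
print(y.shape)                                 # expected: (1, 28, 28, 64)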

The second residual module

 


  
# The second residual module (bottleneck block: 1x1, 3x3, 1x1 convolutions)
import tensorflow as tf
from tensorflow.keras import layers, Sequential

class Block(layers.Layer):
    def __init__(self, filters, downsample=False, stride=1):
        super(Block, self).__init__()
        self.downsample = downsample
        self.conv1 = layers.Conv2D(filters, (1, 1), strides=stride, padding='same')
        self.bn1 = layers.BatchNormalization()
        self.relu = layers.Activation('relu')
        self.conv2 = layers.Conv2D(filters, (3, 3), strides=1, padding='same')
        self.bn2 = layers.BatchNormalization()
        self.conv3 = layers.Conv2D(4 * filters, (1, 1), strides=1, padding='same')
        self.bn3 = layers.BatchNormalization()
        # The shortcut uses a 1x1 convolution to match the channel count (4 * filters) and the spatial size
        if self.downsample:
            self.shortcut = Sequential()
            self.shortcut.add(layers.Conv2D(4 * filters, (1, 1), strides=stride))
            self.shortcut.add(layers.BatchNormalization(axis=3))

    def call(self, inputs, training=None):
        out = self.conv1(inputs)
        out = self.bn1(out, training=training)
        out = self.relu(out)
        out = self.conv2(out)
        out = self.bn2(out, training=training)
        out = self.relu(out)
        out = self.conv3(out)
        out = self.bn3(out, training=training)
        if self.downsample:
            shortcut = self.shortcut(inputs, training=training)
        else:
            shortcut = inputs
        output = layers.add([out, shortcut])
        output = tf.nn.relu(output)
        return output
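Similarly, a quick check of the bottleneck block (my addition); with downsample=True the shortcut projects the input to 4 * filters channels so that the addition works.

import tensorflow as tf

block = Block(filters=64, downsample=True, stride=1)   # first bottleneck of a stage
x = tf.random.normal((1, 56, 56, 64))                   # dummy feature map
y = block(x, training=False)
print(y.shape)                                          # expected: (1, 56, 56, 256)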

ResNet18, ResNet34


  
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers, Sequential

# The first residual module (basic block)
class BasicBlock(layers.Layer):
    def __init__(self, filter_num, stride=1):
        super(BasicBlock, self).__init__()
        self.conv1 = layers.Conv2D(filter_num, (3, 3), strides=stride, padding='same')
        self.bn1 = layers.BatchNormalization()
        self.relu = layers.Activation('relu')
        self.conv2 = layers.Conv2D(filter_num, (3, 3), strides=1, padding='same')
        self.bn2 = layers.BatchNormalization()
        if stride != 1:
            self.downsample = Sequential()
            self.downsample.add(layers.Conv2D(filter_num, (1, 1), strides=stride))
        else:
            self.downsample = lambda x: x

    def call(self, inputs, training=None):
        out = self.conv1(inputs)
        out = self.bn1(out, training=training)
        out = self.relu(out)
        out = self.conv2(out)
        out = self.bn2(out, training=training)
        identity = self.downsample(inputs)
        output = layers.add([out, identity])
        output = tf.nn.relu(output)
        return output

class ResNet(keras.Model):
    def __init__(self, layer_dims, num_classes=10):
        super(ResNet, self).__init__()
        # Pre-processing (stem) layers
        self.padding = keras.layers.ZeroPadding2D((3, 3))
        self.stem = Sequential([
            layers.Conv2D(64, (7, 7), strides=(2, 2)),
            layers.BatchNormalization(),
            layers.Activation('relu'),
            layers.MaxPool2D(pool_size=(3, 3), strides=(2, 2), padding='same')
        ])
        # Residual blocks
        self.layer1 = self.build_resblock(64, layer_dims[0])
        self.layer2 = self.build_resblock(128, layer_dims[1], stride=2)
        self.layer3 = self.build_resblock(256, layer_dims[2], stride=2)
        self.layer4 = self.build_resblock(512, layer_dims[3], stride=2)
        # Global average pooling
        self.avgpool = layers.GlobalAveragePooling2D()
        # Fully connected classification layer
        self.fc = layers.Dense(num_classes, activation=tf.keras.activations.softmax)

    def call(self, inputs, training=None):
        x = self.padding(inputs)
        x = self.stem(x, training=training)
        x = self.layer1(x, training=training)
        x = self.layer2(x, training=training)
        x = self.layer3(x, training=training)
        x = self.layer4(x, training=training)
        # [b, c]
        x = self.avgpool(x)
        x = self.fc(x)
        return x

    def build_resblock(self, filter_num, blocks, stride=1):
        res_blocks = Sequential()
        # The first block of each stage may downsample; the rest keep stride 1
        res_blocks.add(BasicBlock(filter_num, stride))
        for _ in range(1, blocks):
            res_blocks.add(BasicBlock(filter_num, stride=1))
        return res_blocks

def ResNet18(num_classes=10):
    return ResNet([2, 2, 2, 2], num_classes=num_classes)

def ResNet34(num_classes=10):
    return ResNet([3, 4, 6, 3], num_classes=num_classes)

model = ResNet34(num_classes=1000)
model.build(input_shape=(1, 224, 224, 3))
model.summary()  # print the layer list and parameter counts
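To actually train one of these models, a minimal sketch could look like the following. This is my addition, assuming CIFAR-10 as the dataset; the optimizer, learning rate, batch size and epoch count are placeholder choices. Because the Dense layer already applies softmax, the loss uses from_logits=False.

import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

model = ResNet18(num_classes=10)                     # ResNet18 as defined above
model.build(input_shape=(None, 32, 32, 3))
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False),
              metrics=['accuracy'])
model.fit(x_train, y_train, batch_size=64, epochs=5,
          validation_data=(x_test, y_test))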

ResNet50, ResNet101, ResNet152


  
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers, Sequential

# The second residual module (bottleneck block)
class Block(layers.Layer):
    def __init__(self, filters, downsample=False, stride=1):
        super(Block, self).__init__()
        self.downsample = downsample
        self.conv1 = layers.Conv2D(filters, (1, 1), strides=stride, padding='same')
        self.bn1 = layers.BatchNormalization()
        self.relu = layers.Activation('relu')
        self.conv2 = layers.Conv2D(filters, (3, 3), strides=1, padding='same')
        self.bn2 = layers.BatchNormalization()
        self.conv3 = layers.Conv2D(4 * filters, (1, 1), strides=1, padding='same')
        self.bn3 = layers.BatchNormalization()
        if self.downsample:
            self.shortcut = Sequential()
            self.shortcut.add(layers.Conv2D(4 * filters, (1, 1), strides=stride))
            self.shortcut.add(layers.BatchNormalization(axis=3))

    def call(self, inputs, training=None):
        out = self.conv1(inputs)
        out = self.bn1(out, training=training)
        out = self.relu(out)
        out = self.conv2(out)
        out = self.bn2(out, training=training)
        out = self.relu(out)
        out = self.conv3(out)
        out = self.bn3(out, training=training)
        if self.downsample:
            shortcut = self.shortcut(inputs, training=training)
        else:
            shortcut = inputs
        output = layers.add([out, shortcut])
        output = tf.nn.relu(output)
        return output

class ResNet(keras.Model):
    def __init__(self, layer_dims, num_classes=10):
        super(ResNet, self).__init__()
        # Pre-processing (stem) layers
        self.padding = keras.layers.ZeroPadding2D((3, 3))
        self.stem = Sequential([
            layers.Conv2D(64, (7, 7), strides=(2, 2)),
            layers.BatchNormalization(),
            layers.Activation('relu'),
            layers.MaxPool2D(pool_size=(3, 3), strides=(2, 2), padding='same')
        ])
        # Residual blocks
        self.layer1 = self.build_resblock(64, layer_dims[0], stride=1)
        self.layer2 = self.build_resblock(128, layer_dims[1], stride=2)
        self.layer3 = self.build_resblock(256, layer_dims[2], stride=2)
        self.layer4 = self.build_resblock(512, layer_dims[3], stride=2)
        # Global average pooling
        self.avgpool = layers.GlobalAveragePooling2D()
        # Fully connected classification layer
        self.fc = layers.Dense(num_classes, activation=tf.keras.activations.softmax)

    def call(self, inputs, training=None):
        x = self.padding(inputs)
        x = self.stem(x, training=training)
        x = self.layer1(x, training=training)
        x = self.layer2(x, training=training)
        x = self.layer3(x, training=training)
        x = self.layer4(x, training=training)
        # [b, c]
        x = self.avgpool(x)
        x = self.fc(x)
        return x

    def build_resblock(self, filter_num, blocks, stride=1):
        res_blocks = Sequential()
        # The first block of each stage projects the shortcut to 4 * filter_num channels
        # (and downsamples when stride != 1); the remaining blocks keep the shape unchanged
        res_blocks.add(Block(filter_num, downsample=True, stride=stride))
        for _ in range(1, blocks):
            res_blocks.add(Block(filter_num, stride=1))
        return res_blocks

def ResNet50(num_classes=10):
    return ResNet([3, 4, 6, 3], num_classes=num_classes)

def ResNet101(num_classes=10):
    return ResNet([3, 4, 23, 3], num_classes=num_classes)

def ResNet152(num_classes=10):
    return ResNet([3, 8, 36, 3], num_classes=num_classes)

model = ResNet50(num_classes=1000)
model.build(input_shape=(1, 224, 224, 3))
model.summary()  # print the layer list and parameter counts

Running result: model.summary() prints the layer-by-layer output shapes and the total parameter count.
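Beyond the single summary above, a quick way (my addition, not from the original post) to compare the three bottleneck variants is to build each one and print its parameter count:

# Assumes the ResNet50/ResNet101/ResNet152 constructors defined above
for name, builder in [('ResNet50', ResNet50),
                      ('ResNet101', ResNet101),
                      ('ResNet152', ResNet152)]:
    m = builder(num_classes=1000)
    m.build(input_shape=(1, 224, 224, 3))
    print(name, '{:,} parameters'.format(m.count_params()))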

Article source: wanghao.blog.csdn.net, author: AI浩. Copyright belongs to the original author; please contact the author before reposting.

Original link: wanghao.blog.csdn.net/article/details/117420186
