Network Construction
A neural network model is built from neural network layers and Tensor operations.
# MindSpore 2.2.14 is preinstalled in the lab environment; to switch MindSpore versions, change the version number below
!pip uninstall mindspore -y
!pip install -i https://pypi.mirrors.ustc.edu.cn/simple mindspore==2.2.14

Defining the Model Class
import mindspore
from mindspore import nn, ops

class Network(nn.Cell):
    def __init__(self):
        super().__init__()
        self.flatten = nn.Flatten()
        self.dense_relu_sequential = nn.SequentialCell(
            nn.Dense(28*28, 512, weight_init="normal", bias_init="zeros"),
            nn.ReLU(),
            nn.Dense(512, 512, weight_init="normal", bias_init="zeros"),
            nn.ReLU(),
            nn.Dense(512, 10, weight_init="normal", bias_init="zeros")
        )

    def construct(self, x):
        x = self.flatten(x)
        logits = self.dense_relu_sequential(x)
        return logits

model = Network()

# Print the model structure
print(model)
#Output:
#Network<
#  (flatten): Flatten<>
#  (dense_relu_sequential): SequentialCell<
#    (0): Dense<input_channels=784, output_channels=512, has_bias=True>
#    (1): ReLU<>
#    (2): Dense<input_channels=512, output_channels=512, has_bias=True>
#    (3): ReLU<>
#    (4): Dense<input_channels=512, output_channels=10, has_bias=True>
#    >
#  >
X = ops.ones((1, 28, 28), mindspore.float32)
logits = model(X)
# print logits
logits
#Output
#Tensor(shape=[1, 10], dtype=Float32, value= [[-2.40761833e-03, 2.76332069e-03, 4.36006673e-03 ... -2.03372864e-03, 2.23693671e-04, 5.74092008e-03]])
# Pass the logits through Softmax to obtain the predicted probabilities
pred_prob = nn.Softmax(axis=1)(logits)
y_pred = pred_prob.argmax(1)  # the predicted class
print(f"Predicted class: {y_pred}")

Model Layers
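To make the final `argmax(1)` step concrete, here is a minimal pure-Python sketch (not the MindSpore API) of picking the class with the highest probability from each row; the probability values are made up for illustration:

```python
# Pure-Python sketch of per-row argmax (illustrative, not MindSpore code)
def argmax(row):
    # Index of the largest value; ties resolve to the first occurrence
    return max(range(len(row)), key=lambda i: row[i])

pred_prob = [[0.05, 0.10, 0.40, 0.05, 0.05, 0.05, 0.10, 0.05, 0.10, 0.05]]
y_pred = [argmax(row) for row in pred_prob]
print(y_pred)  # [2]
```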
# Construct a batch of three 28x28 images
input_image = ops.ones((3, 28, 28), mindspore.float32)
print(input_image.shape)
#Output: (3, 28, 28)

nn.Flatten
flatten = nn.Flatten()
flat_image = flatten(input_image)
print(flat_image.shape)
#Output: (3, 784)
# Flattening turns each image into a 1-D vector, matching the model's input requirement

nn.Dense
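The key point is that `nn.Flatten` keeps the leading batch axis and collapses the remaining axes, so (3, 28, 28) becomes (3, 784). A pure-Python sketch of that idea (illustrative, not MindSpore's implementation):

```python
# Illustrative flatten: keep the leading (batch) axis, merge the rest
def flatten_batch(batch):
    # batch: list of 2-D images (lists of rows); each image becomes one flat list
    return [[px for row in img for px in row] for img in batch]

images = [[[1.0] * 28 for _ in range(28)] for _ in range(3)]  # 3 images of 28x28
flat = flatten_batch(images)
print(len(flat), len(flat[0]))  # 3 784
```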
layer1 = nn.Dense(in_channels=28 * 28, out_channels=20)
hidden1 = layer1(flat_image)
print(hidden1.shape)
#Output: (3, 20)

nn.ReLU
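A fully connected layer computes an affine map: each output is a weighted sum of the inputs plus a bias, with the weight matrix shaped (out_channels, in_channels) as the parameter printout later in this post shows. A tiny pure-Python sketch of that computation (toy sizes, not the MindSpore implementation):

```python
# Illustrative dense layer: out[j] = sum_k x[k] * W[j][k] + b[j]
def dense(x, W, b):
    return [sum(xk * wk for xk, wk in zip(x, Wj)) + bj for Wj, bj in zip(W, b)]

x = [1.0, 2.0, 3.0]                       # in_channels = 3
W = [[0.1, 0.2, 0.3], [1.0, 0.0, -1.0]]  # out_channels = 2, W shaped (2, 3)
b = [0.5, 0.0]
print(dense(x, W, b))  # approximately [1.9, -2.0]
```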
# Add a nonlinear activation function to increase the model's expressive power
hidden1 = nn.ReLU()(hidden1)

nn.SequentialCell
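ReLU itself is just max(0, x) applied elementwise, which is what makes it nonlinear. A one-line pure-Python sketch:

```python
# Illustrative ReLU: negatives clamp to 0, positives pass through unchanged
def relu(values):
    return [max(0.0, v) for v in values]

print(relu([-1.5, 0.0, 2.3]))  # [0.0, 0.0, 2.3]
```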
seq_modules = nn.SequentialCell(
    flatten,
    layer1,
    nn.ReLU(),
    nn.Dense(20, 10)
)
logits = seq_modules(input_image)
print(logits.shape)

nn.Softmax
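`nn.SequentialCell` simply applies its cells in order, feeding each output into the next cell. The composition idea can be sketched in pure Python with a hypothetical `sequential` helper (not a MindSpore API):

```python
# Illustrative sequential composition: apply each function in order
def sequential(*fns):
    def run(x):
        for fn in fns:
            x = fn(x)
        return x
    return run

pipeline = sequential(lambda x: x + 1, lambda x: x * 2)  # computes (x + 1) * 2
print(pipeline(3))  # 8
```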
# Softmax rescales the logits returned by the network's last fully connected layer into [0, 1],
# representing the predicted probability of each class; values along the dimension given by axis sum to 1
softmax = nn.Softmax(axis=1)
pred_probab = softmax(logits)

Model Parameters
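The underlying formula is softmax(x_i) = exp(x_i) / Σ_j exp(x_j); implementations usually subtract the row maximum first so the exponentials cannot overflow. A pure-Python sketch showing that the outputs sum to 1 along the chosen axis:

```python
import math

# Illustrative softmax with the usual max-subtraction for numerical stability
def softmax_row(row):
    m = max(row)
    exps = [math.exp(v - m) for v in row]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax_row([1.0, 2.0, 3.0])
print(sum(probs))  # ~1.0, and larger logits get larger probabilities
```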
The parameters can be obtained via model.parameters_and_names():
for name, param in model.parameters_and_names():
    print(f"Layer: {name}\nSize: {param.shape}\nValues : {param[:2]} \n")

#Layer: dense_relu_sequential.0.weight
#Size: (512, 784)
#Values : [[ 0.01388695 -0.00604919 -0.00993734 ...  0.00366266  0.00065028  0.00334988]
# [ 0.01483851 -0.00953137  0.01422684 ...  0.01928892  0.00024049 -0.00365605]]

#Layer: dense_relu_sequential.0.bias
#Size: (512,)
#Values : [0. 0.]

#Layer: dense_relu_sequential.2.weight
#Size: (512, 512)
#Values : [[ 0.00495729 -0.01029267 -0.00672846 ...  0.02216997 -0.00423945  0.00603404]
# [-0.02003012  0.00643059  0.0076612  ... -0.0097923  -0.01475079  0.00485153]]

#Layer: dense_relu_sequential.2.bias
#Size: (512,)
#Values : [0. 0.]

#Layer: dense_relu_sequential.4.weight
#Size: (10, 512)
#Values : [[-0.00212924  0.0067424   0.00244794 ... -0.00193389 -0.01660973 -0.00875264]
# [-0.01889533  0.01057486 -0.0233639  ... -0.00306869 -0.007126   -0.00609088]]

#Layer: dense_relu_sequential.4.bias
#Size: (10,)
#Values : [0. 0.]
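The shapes printed above also let us tally the model's total trainable parameter count by hand: each Dense layer contributes in_channels × out_channels weights plus out_channels biases. A quick arithmetic check:

```python
# Parameter count from the printed shapes: weight (out, in) plus bias (out,)
layers = [(784, 512), (512, 512), (512, 10)]  # (in_channels, out_channels) per Dense
total = sum(i * o + o for i, o in layers)
print(total)  # 669706
```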