[CNN Time-Series Forecasting] Time-Series Prediction with a Hybrid Convolutional-Recurrent Neural Network (CNN-RNN), with MATLAB Code

✅ About the author: a MATLAB simulation developer passionate about research, advancing personal growth and technical skill in parallel. Message me for MATLAB project collaboration.

🍎 Personal homepage: Matlab科研工作室

🍊 Personal credo: seek knowledge by investigating things (格物致知).

For more MATLAB simulation content, click below 👇

Intelligent optimization algorithms       Neural network prediction       Radar and communications      Wireless sensor networks        Power systems

Signal processing              Image processing               Path planning       Cellular automata        UAV

⛄ Introduction

A condition early-warning method for sucker-rod pumps based on a convolutional recurrent neural network works as follows. A set of condition diagrams showing gradually developing pump faults is preprocessed, and the preprocessed diagram set is fed into a convolutional neural network (CNN) for training; the CNN outputs the feature sequence corresponding to the target diagram set. A recurrent neural network (RNN) is then trained to extract deep features from this sequence and to build feature templates of the gradually developing conditions, against which pump faults are judged. By combining convolution with recurrence, the method adds a temporal factor to the traditional dynamometer-card approach to diagnosing pump conditions, so that information tied to the time sequence can be discriminated. For wells developing gradual faults, a warning is raised early, letting field staff intervene in time, which saves resources and supports economical, efficient production. Moreover, as the convolutional recurrent network continues to learn and be updated, it grows more capable with use.
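Before the full network definition in the next section, here is a minimal sketch of how windowed time-series samples can be shaped for the sequence input layer that the network uses. All names and sizes here (XTrain, YTrain, inputSize = 24, window length T = 10, numSamples = 500) are illustrative assumptions, not values from the original post:

inputSize = 24;    % assumed number of features per time step
numResponses = 1;  % one predicted value per sequence
T = 10;            % assumed window length (time steps per sample)
numSamples = 500;  % assumed number of training windows

% sequenceInputLayer([inputSize 1 1]) expects each sample as an
% inputSize-by-1-by-1-by-T array (one "image" per time step), gathered
% in a cell array, with one response row per sample.
XTrain = cell(numSamples,1);
YTrain = zeros(numSamples,numResponses);
for i = 1:numSamples
    XTrain{i} = randn(inputSize,1,1,T);  % placeholder for a real window
    YTrain(i,:) = randn(1,numResponses); % placeholder for the real target
end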

⛄ Partial Code

function lgraph = resnet50(inputSize,numResponses)
%% RESNET50 Create Deep Learning Network Architecture
% Creates the layers of a hybrid deep learning network: a ResNet-50 style
% convolutional backbone wrapped in sequence folding/unfolding layers,
% followed by a recurrent (GRU/LSTM) head for sequence-to-one regression.
%
% Skeleton exported via Deep Network Designer > Generate MATLAB Code,
% auto-generated by MATLAB on 27-Apr-2021 09:05:58, then edited by hand.

%% Create Layer Graph
% Create the layer graph variable to contain the network layers.
lgraph = layerGraph();

%% Add Layer Branches
% Add the branches of the network to the layer graph. Each branch is a
% linear array of layers.
tempLayers = [
    % Replace the first (image) input layer with a sequence input and
    % proceed to the folding layer.
    sequenceInputLayer([inputSize 1 1],"Name","input")
    sequenceFoldingLayer("Name","fold")];
lgraph = addLayers(lgraph,tempLayers);
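% Note: sequenceFoldingLayer temporarily merges the time dimension into
% the batch dimension so that the 2-D convolutional backbone below runs
% on every time step independently; the matching sequenceUnfoldingLayer
% ("sequnfold" further down) restores the time dimension through the
% fold/miniBatchSize connection made in the Connect Layer Branches section.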

% Generic ResNet-50 backbone
tempLayers = [
    convolution2dLayer([5 5],64,"Name","conv1","Padding","same","Stride",[2 2])
    batchNormalizationLayer("Name","bn_conv1","Epsilon",0.001)
    reluLayer("Name","activation_1_relu")
    maxPooling2dLayer([3 3],"Name","max_pooling2d_1","Padding",[1 1 1 1],"Stride",[2 2])];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],256,"Name","res2a_branch1","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn2a_branch1","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],64,"Name","res2a_branch2a","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn2a_branch2a","Epsilon",0.001)
    reluLayer("Name","activation_2_relu")
    convolution2dLayer([3 3],64,"Name","res2a_branch2b","BiasLearnRateFactor",0,"Padding","same")
    batchNormalizationLayer("Name","bn2a_branch2b","Epsilon",0.001)
    reluLayer("Name","activation_3_relu")
    convolution2dLayer([1 1],256,"Name","res2a_branch2c","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn2a_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","add_1")
    reluLayer("Name","activation_4_relu")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],64,"Name","res2b_branch2a","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn2b_branch2a","Epsilon",0.001)
    reluLayer("Name","activation_5_relu")
    convolution2dLayer([3 3],64,"Name","res2b_branch2b","BiasLearnRateFactor",0,"Padding","same")
    batchNormalizationLayer("Name","bn2b_branch2b","Epsilon",0.001)
    reluLayer("Name","activation_6_relu")
    convolution2dLayer([1 1],256,"Name","res2b_branch2c","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn2b_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","add_2")
    reluLayer("Name","activation_7_relu")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],64,"Name","res2c_branch2a","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn2c_branch2a","Epsilon",0.001)
    reluLayer("Name","activation_8_relu")
    convolution2dLayer([3 3],64,"Name","res2c_branch2b","BiasLearnRateFactor",0,"Padding","same")
    batchNormalizationLayer("Name","bn2c_branch2b","Epsilon",0.001)
    reluLayer("Name","activation_9_relu")
    convolution2dLayer([1 1],256,"Name","res2c_branch2c","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn2c_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","add_3")
    reluLayer("Name","activation_10_relu")];
lgraph = addLayers(lgraph,tempLayers);
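% Stages 3-5 below repeat the same bottleneck pattern (1x1 reduce, 3x3
% conv, 1x1 expand) with the channel count doubling at each stage and a
% strided 1x1 projection shortcut at each stage entrance.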

tempLayers = [
    convolution2dLayer([1 1],128,"Name","res3a_branch2a","BiasLearnRateFactor",0,"Stride",[2 2])
    batchNormalizationLayer("Name","bn3a_branch2a","Epsilon",0.001)
    reluLayer("Name","activation_11_relu")
    convolution2dLayer([3 3],128,"Name","res3a_branch2b","BiasLearnRateFactor",0,"Padding","same")
    batchNormalizationLayer("Name","bn3a_branch2b","Epsilon",0.001)
    reluLayer("Name","activation_12_relu")
    convolution2dLayer([1 1],512,"Name","res3a_branch2c","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn3a_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],512,"Name","res3a_branch1","BiasLearnRateFactor",0,"Stride",[2 2])
    batchNormalizationLayer("Name","bn3a_branch1","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","add_4")
    reluLayer("Name","activation_13_relu")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],128,"Name","res3b_branch2a","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn3b_branch2a","Epsilon",0.001)
    reluLayer("Name","activation_14_relu")
    convolution2dLayer([3 3],128,"Name","res3b_branch2b","BiasLearnRateFactor",0,"Padding","same")
    batchNormalizationLayer("Name","bn3b_branch2b","Epsilon",0.001)
    reluLayer("Name","activation_15_relu")
    convolution2dLayer([1 1],512,"Name","res3b_branch2c","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn3b_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","add_5")
    reluLayer("Name","activation_16_relu")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],128,"Name","res3c_branch2a","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn3c_branch2a","Epsilon",0.001)
    reluLayer("Name","activation_17_relu")
    convolution2dLayer([3 3],128,"Name","res3c_branch2b","BiasLearnRateFactor",0,"Padding","same")
    batchNormalizationLayer("Name","bn3c_branch2b","Epsilon",0.001)
    reluLayer("Name","activation_18_relu")
    convolution2dLayer([1 1],512,"Name","res3c_branch2c","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn3c_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","add_6")
    reluLayer("Name","activation_19_relu")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],128,"Name","res3d_branch2a","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn3d_branch2a","Epsilon",0.001)
    reluLayer("Name","activation_20_relu")
    convolution2dLayer([3 3],128,"Name","res3d_branch2b","BiasLearnRateFactor",0,"Padding","same")
    batchNormalizationLayer("Name","bn3d_branch2b","Epsilon",0.001)
    reluLayer("Name","activation_21_relu")
    convolution2dLayer([1 1],512,"Name","res3d_branch2c","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn3d_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","add_7")
    reluLayer("Name","activation_22_relu")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],1024,"Name","res4a_branch1","BiasLearnRateFactor",0,"Stride",[2 2])
    batchNormalizationLayer("Name","bn4a_branch1","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],256,"Name","res4a_branch2a","BiasLearnRateFactor",0,"Stride",[2 2])
    batchNormalizationLayer("Name","bn4a_branch2a","Epsilon",0.001)
    reluLayer("Name","activation_23_relu")
    convolution2dLayer([3 3],256,"Name","res4a_branch2b","BiasLearnRateFactor",0,"Padding","same")
    batchNormalizationLayer("Name","bn4a_branch2b","Epsilon",0.001)
    reluLayer("Name","activation_24_relu")
    convolution2dLayer([1 1],1024,"Name","res4a_branch2c","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn4a_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","add_8")
    reluLayer("Name","activation_25_relu")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],256,"Name","res4b_branch2a","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn4b_branch2a","Epsilon",0.001)
    reluLayer("Name","activation_26_relu")
    convolution2dLayer([3 3],256,"Name","res4b_branch2b","BiasLearnRateFactor",0,"Padding","same")
    batchNormalizationLayer("Name","bn4b_branch2b","Epsilon",0.001)
    reluLayer("Name","activation_27_relu")
    convolution2dLayer([1 1],1024,"Name","res4b_branch2c","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn4b_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","add_9")
    reluLayer("Name","activation_28_relu")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],256,"Name","res4c_branch2a","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn4c_branch2a","Epsilon",0.001)
    reluLayer("Name","activation_29_relu")
    convolution2dLayer([3 3],256,"Name","res4c_branch2b","BiasLearnRateFactor",0,"Padding","same")
    batchNormalizationLayer("Name","bn4c_branch2b","Epsilon",0.001)
    reluLayer("Name","activation_30_relu")
    convolution2dLayer([1 1],1024,"Name","res4c_branch2c","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn4c_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","add_10")
    reluLayer("Name","activation_31_relu")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],256,"Name","res4d_branch2a","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn4d_branch2a","Epsilon",0.001)
    reluLayer("Name","activation_32_relu")
    convolution2dLayer([3 3],256,"Name","res4d_branch2b","BiasLearnRateFactor",0,"Padding","same")
    batchNormalizationLayer("Name","bn4d_branch2b","Epsilon",0.001)
    reluLayer("Name","activation_33_relu")
    convolution2dLayer([1 1],1024,"Name","res4d_branch2c","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn4d_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","add_11")
    reluLayer("Name","activation_34_relu")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],256,"Name","res4e_branch2a","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn4e_branch2a","Epsilon",0.001)
    reluLayer("Name","activation_35_relu")
    convolution2dLayer([3 3],256,"Name","res4e_branch2b","BiasLearnRateFactor",0,"Padding","same")
    batchNormalizationLayer("Name","bn4e_branch2b","Epsilon",0.001)
    reluLayer("Name","activation_36_relu")
    convolution2dLayer([1 1],1024,"Name","res4e_branch2c","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn4e_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","add_12")
    reluLayer("Name","activation_37_relu")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],256,"Name","res4f_branch2a","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn4f_branch2a","Epsilon",0.001)
    reluLayer("Name","activation_38_relu")
    convolution2dLayer([3 3],256,"Name","res4f_branch2b","BiasLearnRateFactor",0,"Padding","same")
    batchNormalizationLayer("Name","bn4f_branch2b","Epsilon",0.001)
    reluLayer("Name","activation_39_relu")
    convolution2dLayer([1 1],1024,"Name","res4f_branch2c","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn4f_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","add_13")
    reluLayer("Name","activation_40_relu")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],2048,"Name","res5a_branch1","BiasLearnRateFactor",0,"Stride",[2 2])
    batchNormalizationLayer("Name","bn5a_branch1","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],512,"Name","res5a_branch2a","BiasLearnRateFactor",0,"Stride",[2 2])
    batchNormalizationLayer("Name","bn5a_branch2a","Epsilon",0.001)
    reluLayer("Name","activation_41_relu")
    convolution2dLayer([3 3],512,"Name","res5a_branch2b","BiasLearnRateFactor",0,"Padding","same")
    batchNormalizationLayer("Name","bn5a_branch2b","Epsilon",0.001)
    reluLayer("Name","activation_42_relu")
    convolution2dLayer([1 1],2048,"Name","res5a_branch2c","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn5a_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","add_14")
    reluLayer("Name","activation_43_relu")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],512,"Name","res5b_branch2a","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn5b_branch2a","Epsilon",0.001)
    reluLayer("Name","activation_44_relu")
    convolution2dLayer([3 3],512,"Name","res5b_branch2b","BiasLearnRateFactor",0,"Padding","same")
    batchNormalizationLayer("Name","bn5b_branch2b","Epsilon",0.001)
    reluLayer("Name","activation_45_relu")
    convolution2dLayer([1 1],2048,"Name","res5b_branch2c","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn5b_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","add_15")
    reluLayer("Name","activation_46_relu")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],512,"Name","res5c_branch2a","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn5c_branch2a","Epsilon",0.001)
    reluLayer("Name","activation_47_relu")
    convolution2dLayer([3 3],512,"Name","res5c_branch2b","BiasLearnRateFactor",0,"Padding","same")
    batchNormalizationLayer("Name","bn5c_branch2b","Epsilon",0.001)
    reluLayer("Name","activation_48_relu")
    convolution2dLayer([1 1],2048,"Name","res5c_branch2c","BiasLearnRateFactor",0)
    batchNormalizationLayer("Name","bn5c_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","add_16")
    reluLayer("Name","activation_49_relu")
    globalAveragePooling2dLayer("Name","avg_pool")];
lgraph = addLayers(lgraph,tempLayers);
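% avg_pool reduces each time step's feature map to 1x1x2048, so after
% unfolding and flattening the recurrent head below receives a
% 2048-element feature vector per time step.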

tempLayers = [
    % Unfold the sequence here, then flatten each time step's features.
    sequenceUnfoldingLayer("Name","sequnfold")
    flattenLayer("Name","flatten")

    % From here on is the RNN head; feel free to add or remove layers.
    % (Both 64- and 32-unit layers are LSTMs, so they are named lstm1/lstm2.)
    gruLayer(128,'Name','gru1','RecurrentWeightsInitializer','He','InputWeightsInitializer','He')
    lstmLayer(64,'Name','lstm1','RecurrentWeightsInitializer','He','InputWeightsInitializer','He')
    dropoutLayer(0.25,'Name','drop2')
    % The last recurrent layer must use 'OutputMode','last' so that only
    % the final time step is passed on to the regression layers.
    lstmLayer(32,'OutputMode',"last",'Name','lstm2','RecurrentWeightsInitializer','He','InputWeightsInitializer','He')
    dropoutLayer(0.25,'Name','drop3')
    % End of the RNN head.

    fullyConnectedLayer(numResponses,"Name","fc")
    regressionLayer("Name","regressionoutput")];
lgraph = addLayers(lgraph,tempLayers);

% Clean up the helper variable.
clear tempLayers;

%% Connect Layer Branches
% Connect all the branches of the network to create the network graph.
lgraph = connectLayers(lgraph,"fold/out","conv1");
lgraph = connectLayers(lgraph,"fold/miniBatchSize","sequnfold/miniBatchSize");
lgraph = connectLayers(lgraph,"max_pooling2d_1","res2a_branch1");
lgraph = connectLayers(lgraph,"max_pooling2d_1","res2a_branch2a");
lgraph = connectLayers(lgraph,"bn2a_branch1","add_1/in2");
lgraph = connectLayers(lgraph,"bn2a_branch2c","add_1/in1");
lgraph = connectLayers(lgraph,"activation_4_relu","res2b_branch2a");
lgraph = connectLayers(lgraph,"activation_4_relu","add_2/in2");
lgraph = connectLayers(lgraph,"bn2b_branch2c","add_2/in1");
lgraph = connectLayers(lgraph,"activation_7_relu","res2c_branch2a");
lgraph = connectLayers(lgraph,"activation_7_relu","add_3/in2");
lgraph = connectLayers(lgraph,"bn2c_branch2c","add_3/in1");
lgraph = connectLayers(lgraph,"activation_10_relu","res3a_branch2a");
lgraph = connectLayers(lgraph,"activation_10_relu","res3a_branch1");
lgraph = connectLayers(lgraph,"bn3a_branch2c","add_4/in1");
lgraph = connectLayers(lgraph,"bn3a_branch1","add_4/in2");
lgraph = connectLayers(lgraph,"activation_13_relu","res3b_branch2a");
lgraph = connectLayers(lgraph,"activation_13_relu","add_5/in2");
lgraph = connectLayers(lgraph,"bn3b_branch2c","add_5/in1");
lgraph = connectLayers(lgraph,"activation_16_relu","res3c_branch2a");
lgraph = connectLayers(lgraph,"activation_16_relu","add_6/in2");
lgraph = connectLayers(lgraph,"bn3c_branch2c","add_6/in1");
lgraph = connectLayers(lgraph,"activation_19_relu","res3d_branch2a");
lgraph = connectLayers(lgraph,"activation_19_relu","add_7/in2");
lgraph = connectLayers(lgraph,"bn3d_branch2c","add_7/in1");
lgraph = connectLayers(lgraph,"activation_22_relu","res4a_branch1");
lgraph = connectLayers(lgraph,"activation_22_relu","res4a_branch2a");
lgraph = connectLayers(lgraph,"bn4a_branch1","add_8/in2");
lgraph = connectLayers(lgraph,"bn4a_branch2c","add_8/in1");
lgraph = connectLayers(lgraph,"activation_25_relu","res4b_branch2a");
lgraph = connectLayers(lgraph,"activation_25_relu","add_9/in2");
lgraph = connectLayers(lgraph,"bn4b_branch2c","add_9/in1");
lgraph = connectLayers(lgraph,"activation_28_relu","res4c_branch2a");
lgraph = connectLayers(lgraph,"activation_28_relu","add_10/in2");
lgraph = connectLayers(lgraph,"bn4c_branch2c","add_10/in1");
lgraph = connectLayers(lgraph,"activation_31_relu","res4d_branch2a");
lgraph = connectLayers(lgraph,"activation_31_relu","add_11/in2");
lgraph = connectLayers(lgraph,"bn4d_branch2c","add_11/in1");
lgraph = connectLayers(lgraph,"activation_34_relu","res4e_branch2a");
lgraph = connectLayers(lgraph,"activation_34_relu","add_12/in2");
lgraph = connectLayers(lgraph,"bn4e_branch2c","add_12/in1");
lgraph = connectLayers(lgraph,"activation_37_relu","res4f_branch2a");
lgraph = connectLayers(lgraph,"activation_37_relu","add_13/in2");
lgraph = connectLayers(lgraph,"bn4f_branch2c","add_13/in1");
lgraph = connectLayers(lgraph,"activation_40_relu","res5a_branch1");
lgraph = connectLayers(lgraph,"activation_40_relu","res5a_branch2a");
lgraph = connectLayers(lgraph,"bn5a_branch2c","add_14/in1");
lgraph = connectLayers(lgraph,"bn5a_branch1","add_14/in2");
lgraph = connectLayers(lgraph,"activation_43_relu","res5b_branch2a");
lgraph = connectLayers(lgraph,"activation_43_relu","add_15/in2");
lgraph = connectLayers(lgraph,"bn5b_branch2c","add_15/in1");
lgraph = connectLayers(lgraph,"activation_46_relu","res5c_branch2a");
lgraph = connectLayers(lgraph,"activation_46_relu","add_16/in2");
lgraph = connectLayers(lgraph,"bn5c_branch2c","add_16/in1");
lgraph = connectLayers(lgraph,"avg_pool","sequnfold/in");

%% Plot Layers
% plot(lgraph);
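The sketch below shows one way to build and train the network defined above. The training options are illustrative assumptions (optimizer, epochs, batch size, and learning rate are not given in this post); XTrain and YTrain are shaped as in the data-preparation sketch near the top, and XTest is a hypothetical test set shaped the same way.

% Build the hybrid CNN-RNN graph and train it (illustrative settings).
lgraph = resnet50(inputSize,numResponses);
analyzeNetwork(lgraph);                  % optional: inspect the graph

options = trainingOptions("adam", ...
    "MaxEpochs",60, ...                  % assumed
    "MiniBatchSize",32, ...              % assumed
    "InitialLearnRate",1e-3, ...         % assumed
    "Shuffle","every-epoch", ...
    "Plots","training-progress", ...
    "Verbose",false);

net   = trainNetwork(XTrain,YTrain,lgraph,options);
YPred = predict(net,XTest);              % XTest shaped like XTrain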

⛄ Results

⛄ References

[1] 周克强, 王浩竣. 基于卷积神经网络和循环神经网络的源代码漏洞挖掘研究[J]. 数码设计(下), 2019.

[2] 谢子凡, 陈志, 岳文静, 等. 基于卷积神经网络和循环神经网络的人体行为识别方法: CN110321833A[P]. 2019.

[3] 赫晓慧, 罗浩田, 乔梦佳, 田智慧, 周广胜. 基于CNN-RNN网络的中国冬小麦估产[J]. 农业工程学报, 2021, 37(17): 124-132.

⛳️ Full Code

❤️ Parts of the theory cite online literature; if there is any infringement, contact the blogger for removal.
❤️ Follow me to receive a large collection of MATLAB e-books and mathematical modeling materials.

