【CNN Time-Series Forecasting】Time-series prediction with a hybrid convolutional and recurrent neural network (CNN-RNN), with Matlab code


✅ About the author: a Matlab simulation developer who loves research, refining mind and craft in step; DM me for Matlab project collaboration.

🍎 Personal homepage: Matlab科研工作室

🍊 Personal motto: investigate things to attain knowledge (格物致知).

For more Matlab simulation content, click below 👇

Intelligent optimization algorithms       Neural network prediction       Radar and communications      Wireless sensor networks        Power systems

Signal processing              Image processing               Path planning       Cellular automata        UAVs

⛄ Introduction

This post builds on a condition early-warning method for sucker-rod pumps based on a convolutional recurrent neural network, with the following steps: a set of gradually degrading pump-condition diagrams is preprocessed, and the preprocessed set is fed into a convolutional neural network (CNN) for training; the CNN outputs the feature sequences corresponding to the target condition diagrams; a recurrent neural network (RNN) is then trained to extract deep features from these sequences and to build feature templates for the gradually degrading condition diagrams, from which sucker-rod pump faults are judged. By adopting a convolutional recurrent network, the method adds a time dimension to the traditional practice of judging pump conditions from dynamometer cards, so that information tied to the time series can be discriminated. For wells developing gradual faults, an early warning is raised in advance, notifying field staff to intervene in time; this saves resources and enables economical, efficient production. Moreover, through continual learning and updating, the convolutional recurrent network grows smarter and performs better the longer it is used.
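As a hedged illustration of the data shaping this pipeline implies (the names data, inputSize, and seqLen are assumptions, not from the original post), a univariate series can be cut into overlapping lag windows, each window treated as an inputSize-by-1-by-1 "image" and stacked along the time dimension to form the image sequences expected by the sequenceInputLayer in the code below:

% Minimal data-preparation sketch under stated assumptions:
% data is a numeric column vector, inputSize is the lag-window length,
% seqLen is the number of time steps fed to the recurrent head.
inputSize = 24;  seqLen = 8;
numObs = numel(data) - inputSize - seqLen + 1;
XTrain = cell(numObs,1);
YTrain = zeros(numObs,1);
for i = 1:numObs
    seq = zeros(inputSize,1,1,seqLen);            % h-by-w-by-c-by-time
    for t = 1:seqLen
        seq(:,:,:,t) = reshape(data(i+t-1 : i+t+inputSize-2),[inputSize 1 1]);
    end
    XTrain{i} = seq;                              % one image sequence
    YTrain(i) = data(i + seqLen + inputSize - 1); % next value after the last window
end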

⛄ Partial Code

function lgraph = resnet50(inputSize,numResponses)

%% RESNET50 Create Deep Learning Network Architecture

% Script for creating the layers of a deep learning network, exported via

% Deep Network Designer > Generate MATLAB Code.

%

% Auto-generated by MATLAB on 27-Apr-2021 09:05:58

%% Create Layer Graph

% Create the layer graph variable to contain the network layers.

lgraph = layerGraph();

%% Add Layer Branches

% Add the branches of the network to the layer graph. Each branch is a linear

% array of layers.

tempLayers = [

% replace the first (image) input layer with a sequence input, then fold so each time step passes through the CNN

   sequenceInputLayer([inputSize 1 1],"Name","input")

   sequenceFoldingLayer("Name","fold")];

lgraph = addLayers(lgraph,tempLayers);

% generic resnet50

tempLayers = [

   convolution2dLayer([5 5],64,"Name","conv1","Padding","same","Stride",[2 2])

   batchNormalizationLayer("Name","bn_conv1","Epsilon",0.001)

   reluLayer("Name","activation_1_relu")

   maxPooling2dLayer([3 3],"Name","max_pooling2d_1","Padding",[1 1 1 1],"Stride",[2 2])];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],256,"Name","res2a_branch1","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn2a_branch1","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],64,"Name","res2a_branch2a","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn2a_branch2a","Epsilon",0.001)

   reluLayer("Name","activation_2_relu")

   convolution2dLayer([3 3],64,"Name","res2a_branch2b","BiasLearnRateFactor",0,"Padding","same")

   batchNormalizationLayer("Name","bn2a_branch2b","Epsilon",0.001)

   reluLayer("Name","activation_3_relu")

   convolution2dLayer([1 1],256,"Name","res2a_branch2c","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn2a_branch2c","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   additionLayer(2,"Name","add_1")

   reluLayer("Name","activation_4_relu")];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],64,"Name","res2b_branch2a","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn2b_branch2a","Epsilon",0.001)

   reluLayer("Name","activation_5_relu")

   convolution2dLayer([3 3],64,"Name","res2b_branch2b","BiasLearnRateFactor",0,"Padding","same")

   batchNormalizationLayer("Name","bn2b_branch2b","Epsilon",0.001)

   reluLayer("Name","activation_6_relu")

   convolution2dLayer([1 1],256,"Name","res2b_branch2c","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn2b_branch2c","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   additionLayer(2,"Name","add_2")

   reluLayer("Name","activation_7_relu")];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],64,"Name","res2c_branch2a","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn2c_branch2a","Epsilon",0.001)

   reluLayer("Name","activation_8_relu")

   convolution2dLayer([3 3],64,"Name","res2c_branch2b","BiasLearnRateFactor",0,"Padding","same")

   batchNormalizationLayer("Name","bn2c_branch2b","Epsilon",0.001)

   reluLayer("Name","activation_9_relu")

   convolution2dLayer([1 1],256,"Name","res2c_branch2c","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn2c_branch2c","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   additionLayer(2,"Name","add_3")

   reluLayer("Name","activation_10_relu")];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],128,"Name","res3a_branch2a","BiasLearnRateFactor",0,"Stride",[2 2])

   batchNormalizationLayer("Name","bn3a_branch2a","Epsilon",0.001)

   reluLayer("Name","activation_11_relu")

   convolution2dLayer([3 3],128,"Name","res3a_branch2b","BiasLearnRateFactor",0,"Padding","same")

   batchNormalizationLayer("Name","bn3a_branch2b","Epsilon",0.001)

   reluLayer("Name","activation_12_relu")

   convolution2dLayer([1 1],512,"Name","res3a_branch2c","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn3a_branch2c","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],512,"Name","res3a_branch1","BiasLearnRateFactor",0,"Stride",[2 2])

   batchNormalizationLayer("Name","bn3a_branch1","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   additionLayer(2,"Name","add_4")

   reluLayer("Name","activation_13_relu")];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],128,"Name","res3b_branch2a","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn3b_branch2a","Epsilon",0.001)

   reluLayer("Name","activation_14_relu")

   convolution2dLayer([3 3],128,"Name","res3b_branch2b","BiasLearnRateFactor",0,"Padding","same")

   batchNormalizationLayer("Name","bn3b_branch2b","Epsilon",0.001)

   reluLayer("Name","activation_15_relu")

   convolution2dLayer([1 1],512,"Name","res3b_branch2c","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn3b_branch2c","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   additionLayer(2,"Name","add_5")

   reluLayer("Name","activation_16_relu")];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],128,"Name","res3c_branch2a","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn3c_branch2a","Epsilon",0.001)

   reluLayer("Name","activation_17_relu")

   convolution2dLayer([3 3],128,"Name","res3c_branch2b","BiasLearnRateFactor",0,"Padding","same")

   batchNormalizationLayer("Name","bn3c_branch2b","Epsilon",0.001)

   reluLayer("Name","activation_18_relu")

   convolution2dLayer([1 1],512,"Name","res3c_branch2c","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn3c_branch2c","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   additionLayer(2,"Name","add_6")

   reluLayer("Name","activation_19_relu")];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],128,"Name","res3d_branch2a","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn3d_branch2a","Epsilon",0.001)

   reluLayer("Name","activation_20_relu")

   convolution2dLayer([3 3],128,"Name","res3d_branch2b","BiasLearnRateFactor",0,"Padding","same")

   batchNormalizationLayer("Name","bn3d_branch2b","Epsilon",0.001)

   reluLayer("Name","activation_21_relu")

   convolution2dLayer([1 1],512,"Name","res3d_branch2c","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn3d_branch2c","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   additionLayer(2,"Name","add_7")

   reluLayer("Name","activation_22_relu")];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],1024,"Name","res4a_branch1","BiasLearnRateFactor",0,"Stride",[2 2])

   batchNormalizationLayer("Name","bn4a_branch1","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],256,"Name","res4a_branch2a","BiasLearnRateFactor",0,"Stride",[2 2])

   batchNormalizationLayer("Name","bn4a_branch2a","Epsilon",0.001)

   reluLayer("Name","activation_23_relu")

   convolution2dLayer([3 3],256,"Name","res4a_branch2b","BiasLearnRateFactor",0,"Padding","same")

   batchNormalizationLayer("Name","bn4a_branch2b","Epsilon",0.001)

   reluLayer("Name","activation_24_relu")

   convolution2dLayer([1 1],1024,"Name","res4a_branch2c","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn4a_branch2c","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   additionLayer(2,"Name","add_8")

   reluLayer("Name","activation_25_relu")];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],256,"Name","res4b_branch2a","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn4b_branch2a","Epsilon",0.001)

   reluLayer("Name","activation_26_relu")

   convolution2dLayer([3 3],256,"Name","res4b_branch2b","BiasLearnRateFactor",0,"Padding","same")

   batchNormalizationLayer("Name","bn4b_branch2b","Epsilon",0.001)

   reluLayer("Name","activation_27_relu")

   convolution2dLayer([1 1],1024,"Name","res4b_branch2c","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn4b_branch2c","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   additionLayer(2,"Name","add_9")

   reluLayer("Name","activation_28_relu")];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],256,"Name","res4c_branch2a","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn4c_branch2a","Epsilon",0.001)

   reluLayer("Name","activation_29_relu")

   convolution2dLayer([3 3],256,"Name","res4c_branch2b","BiasLearnRateFactor",0,"Padding","same")

   batchNormalizationLayer("Name","bn4c_branch2b","Epsilon",0.001)

   reluLayer("Name","activation_30_relu")

   convolution2dLayer([1 1],1024,"Name","res4c_branch2c","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn4c_branch2c","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   additionLayer(2,"Name","add_10")

   reluLayer("Name","activation_31_relu")];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],256,"Name","res4d_branch2a","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn4d_branch2a","Epsilon",0.001)

   reluLayer("Name","activation_32_relu")

   convolution2dLayer([3 3],256,"Name","res4d_branch2b","BiasLearnRateFactor",0,"Padding","same")

   batchNormalizationLayer("Name","bn4d_branch2b","Epsilon",0.001)

   reluLayer("Name","activation_33_relu")

   convolution2dLayer([1 1],1024,"Name","res4d_branch2c","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn4d_branch2c","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   additionLayer(2,"Name","add_11")

   reluLayer("Name","activation_34_relu")];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],256,"Name","res4e_branch2a","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn4e_branch2a","Epsilon",0.001)

   reluLayer("Name","activation_35_relu")

   convolution2dLayer([3 3],256,"Name","res4e_branch2b","BiasLearnRateFactor",0,"Padding","same")

   batchNormalizationLayer("Name","bn4e_branch2b","Epsilon",0.001)

   reluLayer("Name","activation_36_relu")

   convolution2dLayer([1 1],1024,"Name","res4e_branch2c","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn4e_branch2c","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   additionLayer(2,"Name","add_12")

   reluLayer("Name","activation_37_relu")];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],256,"Name","res4f_branch2a","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn4f_branch2a","Epsilon",0.001)

   reluLayer("Name","activation_38_relu")

   convolution2dLayer([3 3],256,"Name","res4f_branch2b","BiasLearnRateFactor",0,"Padding","same")

   batchNormalizationLayer("Name","bn4f_branch2b","Epsilon",0.001)

   reluLayer("Name","activation_39_relu")

   convolution2dLayer([1 1],1024,"Name","res4f_branch2c","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn4f_branch2c","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   additionLayer(2,"Name","add_13")

   reluLayer("Name","activation_40_relu")];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],2048,"Name","res5a_branch1","BiasLearnRateFactor",0,"Stride",[2 2])

   batchNormalizationLayer("Name","bn5a_branch1","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],512,"Name","res5a_branch2a","BiasLearnRateFactor",0,"Stride",[2 2])

   batchNormalizationLayer("Name","bn5a_branch2a","Epsilon",0.001)

   reluLayer("Name","activation_41_relu")

   convolution2dLayer([3 3],512,"Name","res5a_branch2b","BiasLearnRateFactor",0,"Padding","same")

   batchNormalizationLayer("Name","bn5a_branch2b","Epsilon",0.001)

   reluLayer("Name","activation_42_relu")

   convolution2dLayer([1 1],2048,"Name","res5a_branch2c","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn5a_branch2c","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   additionLayer(2,"Name","add_14")

   reluLayer("Name","activation_43_relu")];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],512,"Name","res5b_branch2a","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn5b_branch2a","Epsilon",0.001)

   reluLayer("Name","activation_44_relu")

   convolution2dLayer([3 3],512,"Name","res5b_branch2b","BiasLearnRateFactor",0,"Padding","same")

   batchNormalizationLayer("Name","bn5b_branch2b","Epsilon",0.001)

   reluLayer("Name","activation_45_relu")

   convolution2dLayer([1 1],2048,"Name","res5b_branch2c","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn5b_branch2c","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   additionLayer(2,"Name","add_15")

   reluLayer("Name","activation_46_relu")];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   convolution2dLayer([1 1],512,"Name","res5c_branch2a","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn5c_branch2a","Epsilon",0.001)

   reluLayer("Name","activation_47_relu")

   convolution2dLayer([3 3],512,"Name","res5c_branch2b","BiasLearnRateFactor",0,"Padding","same")

   batchNormalizationLayer("Name","bn5c_branch2b","Epsilon",0.001)

   reluLayer("Name","activation_48_relu")

   convolution2dLayer([1 1],2048,"Name","res5c_branch2c","BiasLearnRateFactor",0)

   batchNormalizationLayer("Name","bn5c_branch2c","Epsilon",0.001)];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

   additionLayer(2,"Name","add_16")

   reluLayer("Name","activation_49_relu")

   globalAveragePooling2dLayer("Name","avg_pool")];

lgraph = addLayers(lgraph,tempLayers);

tempLayers = [

% unfold the sequence here, then flatten the per-time-step CNN features

   sequenceUnfoldingLayer("Name","sequnfold")

   flattenLayer("Name","flatten")

   

   

% the RNN head starts here; feel free to add or remove layers

   gruLayer(128,'Name','gru1','RecurrentWeightsInitializer','He','InputWeightsInitializer','He')

   lstmLayer(64,'Name','lstm2','RecurrentWeightsInitializer','He','InputWeightsInitializer','He')

   dropoutLayer(0.25,'Name','drop2')

   % the final recurrent layer must set 'OutputMode' to 'last' so a single feature vector reaches the regression head

   lstmLayer(32,'OutputMode',"last",'Name','lstm3','RecurrentWeightsInitializer','He','InputWeightsInitializer','He')

   dropoutLayer(0.25,'Name','drop3')

   % here finish the RNN design

   

   

   fullyConnectedLayer(numResponses,"Name","fc")

   regressionLayer("Name","regressionoutput")];

lgraph = addLayers(lgraph,tempLayers);

% clean up helper variable

clear tempLayers;

%% Connect Layer Branches

% Connect all the branches of the network to create the network graph.

lgraph = connectLayers(lgraph,"fold/out","conv1");

lgraph = connectLayers(lgraph,"fold/miniBatchSize","sequnfold/miniBatchSize");

lgraph = connectLayers(lgraph,"max_pooling2d_1","res2a_branch1");

lgraph = connectLayers(lgraph,"max_pooling2d_1","res2a_branch2a");

lgraph = connectLayers(lgraph,"bn2a_branch1","add_1/in2");

lgraph = connectLayers(lgraph,"bn2a_branch2c","add_1/in1");

lgraph = connectLayers(lgraph,"activation_4_relu","res2b_branch2a");

lgraph = connectLayers(lgraph,"activation_4_relu","add_2/in2");

lgraph = connectLayers(lgraph,"bn2b_branch2c","add_2/in1");

lgraph = connectLayers(lgraph,"activation_7_relu","res2c_branch2a");

lgraph = connectLayers(lgraph,"activation_7_relu","add_3/in2");

lgraph = connectLayers(lgraph,"bn2c_branch2c","add_3/in1");

lgraph = connectLayers(lgraph,"activation_10_relu","res3a_branch2a");

lgraph = connectLayers(lgraph,"activation_10_relu","res3a_branch1");

lgraph = connectLayers(lgraph,"bn3a_branch2c","add_4/in1");

lgraph = connectLayers(lgraph,"bn3a_branch1","add_4/in2");

lgraph = connectLayers(lgraph,"activation_13_relu","res3b_branch2a");

lgraph = connectLayers(lgraph,"activation_13_relu","add_5/in2");

lgraph = connectLayers(lgraph,"bn3b_branch2c","add_5/in1");

lgraph = connectLayers(lgraph,"activation_16_relu","res3c_branch2a");

lgraph = connectLayers(lgraph,"activation_16_relu","add_6/in2");

lgraph = connectLayers(lgraph,"bn3c_branch2c","add_6/in1");

lgraph = connectLayers(lgraph,"activation_19_relu","res3d_branch2a");

lgraph = connectLayers(lgraph,"activation_19_relu","add_7/in2");

lgraph = connectLayers(lgraph,"bn3d_branch2c","add_7/in1");

lgraph = connectLayers(lgraph,"activation_22_relu","res4a_branch1");

lgraph = connectLayers(lgraph,"activation_22_relu","res4a_branch2a");

lgraph = connectLayers(lgraph,"bn4a_branch1","add_8/in2");

lgraph = connectLayers(lgraph,"bn4a_branch2c","add_8/in1");

lgraph = connectLayers(lgraph,"activation_25_relu","res4b_branch2a");

lgraph = connectLayers(lgraph,"activation_25_relu","add_9/in2");

lgraph = connectLayers(lgraph,"bn4b_branch2c","add_9/in1");

lgraph = connectLayers(lgraph,"activation_28_relu","res4c_branch2a");

lgraph = connectLayers(lgraph,"activation_28_relu","add_10/in2");

lgraph = connectLayers(lgraph,"bn4c_branch2c","add_10/in1");

lgraph = connectLayers(lgraph,"activation_31_relu","res4d_branch2a");

lgraph = connectLayers(lgraph,"activation_31_relu","add_11/in2");

lgraph = connectLayers(lgraph,"bn4d_branch2c","add_11/in1");

lgraph = connectLayers(lgraph,"activation_34_relu","res4e_branch2a");

lgraph = connectLayers(lgraph,"activation_34_relu","add_12/in2");

lgraph = connectLayers(lgraph,"bn4e_branch2c","add_12/in1");

lgraph = connectLayers(lgraph,"activation_37_relu","res4f_branch2a");

lgraph = connectLayers(lgraph,"activation_37_relu","add_13/in2");

lgraph = connectLayers(lgraph,"bn4f_branch2c","add_13/in1");

lgraph = connectLayers(lgraph,"activation_40_relu","res5a_branch1");

lgraph = connectLayers(lgraph,"activation_40_relu","res5a_branch2a");

lgraph = connectLayers(lgraph,"bn5a_branch2c","add_14/in1");

lgraph = connectLayers(lgraph,"bn5a_branch1","add_14/in2");

lgraph = connectLayers(lgraph,"activation_43_relu","res5b_branch2a");

lgraph = connectLayers(lgraph,"activation_43_relu","add_15/in2");

lgraph = connectLayers(lgraph,"bn5b_branch2c","add_15/in1");

lgraph = connectLayers(lgraph,"activation_46_relu","res5c_branch2a");

lgraph = connectLayers(lgraph,"activation_46_relu","add_16/in2");

lgraph = connectLayers(lgraph,"bn5c_branch2c","add_16/in1");

lgraph = connectLayers(lgraph,"avg_pool","sequnfold/in");

%% Plot Layers

% plot(lgraph);
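The original post stops at the network definition. As a minimal training sketch, not the author's script (XTrain and YTrain as prepared in the Introduction; all option values are assumptions), the layer graph can be trained for single-step-ahead regression with trainNetwork:

% Hedged usage sketch: assemble and train the hybrid CNN-RNN.
inputSize    = 24;                 % must match the window length used above
numResponses = 1;                  % single-step-ahead regression
lgraph  = resnet50(inputSize,numResponses);
options = trainingOptions("adam", ...
    "MaxEpochs",30, ...
    "MiniBatchSize",16, ...
    "InitialLearnRate",1e-3, ...
    "Shuffle","every-epoch", ...
    "Plots","training-progress", ...
    "Verbose",false);
net   = trainNetwork(XTrain,YTrain,lgraph,options);
YPred = predict(net,XTrain);       % in-sample check; use held-out windows in practice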

⛄ Results


⛳️ Full Code

❤️ Some of the theory draws on online literature; contact the blogger for removal in case of infringement.
❤️ Follow me to receive a wealth of Matlab e-books and mathematical modeling materials.

