✅ Author profile: a MATLAB simulation developer passionate about research, advancing mindset and technique together. For MATLAB project collaboration, send a private message.
🍎 Homepage: Matlab科研工作室
🍊 Personal motto: 格物致知 (seek knowledge by investigating things).
⛄ Introduction
A rod-pump working-condition early-warning method based on a convolutional recurrent neural network proceeds as follows: a set of dynamometer-card images depicting gradually developing working conditions is preprocessed, and the preprocessed image set is fed to a convolutional neural network (CNN) for training; the CNN outputs the feature sequence corresponding to the target image set; a recurrent neural network (RNN) is then trained to extract deep features from this sequence and to build feature templates for the gradual working conditions, against which rod-pump faults are judged. By using a convolutional recurrent network, the method adds a time dimension to the traditional dynamometer-card diagnosis of rod-pump conditions, so that information tied to the time series can be recognized. For wells developing gradual faults, a warning is raised at an early stage so field staff can intervene promptly, saving resources and enabling economical, efficient production. Moreover, as the network keeps learning and being updated in service, it becomes progressively more capable and accurate.
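To make the pipeline concrete, here is a minimal training sketch for the CNN-RNN network built by the resnet50 function shown in the code section below. The data layout, input size, and training options are illustrative assumptions only, not part of the original method; adapt them to the actual preprocessed dynamometer-card dataset.

% Minimal training sketch (assumptions: Deep Learning Toolbox; XTrain is a
% cell array with one [inputSize x 1 x 1 x T] image sequence per well, and
% YTrain holds one numeric regression response per sequence).
inputSize = 64;                               % assumed feature length per card
numResponses = 1;                             % single fault-severity output
lgraph = resnet50(inputSize,numResponses);    % network defined below

options = trainingOptions('adam', ...
    'MaxEpochs',30, ...
    'MiniBatchSize',16, ...
    'InitialLearnRate',1e-3, ...
    'Shuffle','every-epoch', ...
    'Verbose',false);

net = trainNetwork(XTrain,YTrain,lgraph,options);   % train the CNN-RNN model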
⛄ Code Excerpt
function lgraph = resnet50(inputSize,numResponses)
%% RESNET50 Create Deep Learning Network Architecture
% lgraph = resnet50(inputSize,numResponses) returns a layerGraph that
% applies a ResNet-50-style CNN to each time step of an input sequence and
% feeds the resulting features to a recurrent (GRU/LSTM) head with
% numResponses regression outputs.
%
% Exported via "Generate MATLAB Code" from Deep Network Designer.
% Auto-generated by MATLAB on 27-Apr-2021 09:05:58
%% Create Layer Graph
% Create the layer graph variable to contain the network layers.
lgraph = layerGraph();
%% Add Layer Branches
% Add the branches of the network to the layer graph. Each branch is a linear
% array of layers.
tempLayers = [
% the sequence input replaces the stock image input; the folding layer
% lets the 2-D layers that follow process each time step independently
sequenceInputLayer([inputSize 1 1],"Name","input")
sequenceFoldingLayer("Name","fold")];
lgraph = addLayers(lgraph,tempLayers);
% ResNet-50 backbone (note: the first convolution is 5x5 here rather than
% the canonical 7x7)
tempLayers = [
convolution2dLayer([5 5],64,"Name","conv1","Padding","same","Stride",[2 2])
batchNormalizationLayer("Name","bn_conv1","Epsilon",0.001)
reluLayer("Name","activation_1_relu")
maxPooling2dLayer([3 3],"Name","max_pooling2d_1","Padding",[1 1 1 1],"Stride",[2 2])];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],256,"Name","res2a_branch1","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn2a_branch1","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],64,"Name","res2a_branch2a","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn2a_branch2a","Epsilon",0.001)
reluLayer("Name","activation_2_relu")
convolution2dLayer([3 3],64,"Name","res2a_branch2b","BiasLearnRateFactor",0,"Padding","same")
batchNormalizationLayer("Name","bn2a_branch2b","Epsilon",0.001)
reluLayer("Name","activation_3_relu")
convolution2dLayer([1 1],256,"Name","res2a_branch2c","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn2a_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","add_1")
reluLayer("Name","activation_4_relu")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],64,"Name","res2b_branch2a","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn2b_branch2a","Epsilon",0.001)
reluLayer("Name","activation_5_relu")
convolution2dLayer([3 3],64,"Name","res2b_branch2b","BiasLearnRateFactor",0,"Padding","same")
batchNormalizationLayer("Name","bn2b_branch2b","Epsilon",0.001)
reluLayer("Name","activation_6_relu")
convolution2dLayer([1 1],256,"Name","res2b_branch2c","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn2b_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","add_2")
reluLayer("Name","activation_7_relu")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],64,"Name","res2c_branch2a","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn2c_branch2a","Epsilon",0.001)
reluLayer("Name","activation_8_relu")
convolution2dLayer([3 3],64,"Name","res2c_branch2b","BiasLearnRateFactor",0,"Padding","same")
batchNormalizationLayer("Name","bn2c_branch2b","Epsilon",0.001)
reluLayer("Name","activation_9_relu")
convolution2dLayer([1 1],256,"Name","res2c_branch2c","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn2c_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","add_3")
reluLayer("Name","activation_10_relu")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],128,"Name","res3a_branch2a","BiasLearnRateFactor",0,"Stride",[2 2])
batchNormalizationLayer("Name","bn3a_branch2a","Epsilon",0.001)
reluLayer("Name","activation_11_relu")
convolution2dLayer([3 3],128,"Name","res3a_branch2b","BiasLearnRateFactor",0,"Padding","same")
batchNormalizationLayer("Name","bn3a_branch2b","Epsilon",0.001)
reluLayer("Name","activation_12_relu")
convolution2dLayer([1 1],512,"Name","res3a_branch2c","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn3a_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],512,"Name","res3a_branch1","BiasLearnRateFactor",0,"Stride",[2 2])
batchNormalizationLayer("Name","bn3a_branch1","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","add_4")
reluLayer("Name","activation_13_relu")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],128,"Name","res3b_branch2a","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn3b_branch2a","Epsilon",0.001)
reluLayer("Name","activation_14_relu")
convolution2dLayer([3 3],128,"Name","res3b_branch2b","BiasLearnRateFactor",0,"Padding","same")
batchNormalizationLayer("Name","bn3b_branch2b","Epsilon",0.001)
reluLayer("Name","activation_15_relu")
convolution2dLayer([1 1],512,"Name","res3b_branch2c","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn3b_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","add_5")
reluLayer("Name","activation_16_relu")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],128,"Name","res3c_branch2a","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn3c_branch2a","Epsilon",0.001)
reluLayer("Name","activation_17_relu")
convolution2dLayer([3 3],128,"Name","res3c_branch2b","BiasLearnRateFactor",0,"Padding","same")
batchNormalizationLayer("Name","bn3c_branch2b","Epsilon",0.001)
reluLayer("Name","activation_18_relu")
convolution2dLayer([1 1],512,"Name","res3c_branch2c","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn3c_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","add_6")
reluLayer("Name","activation_19_relu")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],128,"Name","res3d_branch2a","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn3d_branch2a","Epsilon",0.001)
reluLayer("Name","activation_20_relu")
convolution2dLayer([3 3],128,"Name","res3d_branch2b","BiasLearnRateFactor",0,"Padding","same")
batchNormalizationLayer("Name","bn3d_branch2b","Epsilon",0.001)
reluLayer("Name","activation_21_relu")
convolution2dLayer([1 1],512,"Name","res3d_branch2c","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn3d_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","add_7")
reluLayer("Name","activation_22_relu")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],1024,"Name","res4a_branch1","BiasLearnRateFactor",0,"Stride",[2 2])
batchNormalizationLayer("Name","bn4a_branch1","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],256,"Name","res4a_branch2a","BiasLearnRateFactor",0,"Stride",[2 2])
batchNormalizationLayer("Name","bn4a_branch2a","Epsilon",0.001)
reluLayer("Name","activation_23_relu")
convolution2dLayer([3 3],256,"Name","res4a_branch2b","BiasLearnRateFactor",0,"Padding","same")
batchNormalizationLayer("Name","bn4a_branch2b","Epsilon",0.001)
reluLayer("Name","activation_24_relu")
convolution2dLayer([1 1],1024,"Name","res4a_branch2c","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn4a_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","add_8")
reluLayer("Name","activation_25_relu")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],256,"Name","res4b_branch2a","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn4b_branch2a","Epsilon",0.001)
reluLayer("Name","activation_26_relu")
convolution2dLayer([3 3],256,"Name","res4b_branch2b","BiasLearnRateFactor",0,"Padding","same")
batchNormalizationLayer("Name","bn4b_branch2b","Epsilon",0.001)
reluLayer("Name","activation_27_relu")
convolution2dLayer([1 1],1024,"Name","res4b_branch2c","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn4b_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","add_9")
reluLayer("Name","activation_28_relu")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],256,"Name","res4c_branch2a","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn4c_branch2a","Epsilon",0.001)
reluLayer("Name","activation_29_relu")
convolution2dLayer([3 3],256,"Name","res4c_branch2b","BiasLearnRateFactor",0,"Padding","same")
batchNormalizationLayer("Name","bn4c_branch2b","Epsilon",0.001)
reluLayer("Name","activation_30_relu")
convolution2dLayer([1 1],1024,"Name","res4c_branch2c","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn4c_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","add_10")
reluLayer("Name","activation_31_relu")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],256,"Name","res4d_branch2a","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn4d_branch2a","Epsilon",0.001)
reluLayer("Name","activation_32_relu")
convolution2dLayer([3 3],256,"Name","res4d_branch2b","BiasLearnRateFactor",0,"Padding","same")
batchNormalizationLayer("Name","bn4d_branch2b","Epsilon",0.001)
reluLayer("Name","activation_33_relu")
convolution2dLayer([1 1],1024,"Name","res4d_branch2c","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn4d_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","add_11")
reluLayer("Name","activation_34_relu")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],256,"Name","res4e_branch2a","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn4e_branch2a","Epsilon",0.001)
reluLayer("Name","activation_35_relu")
convolution2dLayer([3 3],256,"Name","res4e_branch2b","BiasLearnRateFactor",0,"Padding","same")
batchNormalizationLayer("Name","bn4e_branch2b","Epsilon",0.001)
reluLayer("Name","activation_36_relu")
convolution2dLayer([1 1],1024,"Name","res4e_branch2c","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn4e_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","add_12")
reluLayer("Name","activation_37_relu")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],256,"Name","res4f_branch2a","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn4f_branch2a","Epsilon",0.001)
reluLayer("Name","activation_38_relu")
convolution2dLayer([3 3],256,"Name","res4f_branch2b","BiasLearnRateFactor",0,"Padding","same")
batchNormalizationLayer("Name","bn4f_branch2b","Epsilon",0.001)
reluLayer("Name","activation_39_relu")
convolution2dLayer([1 1],1024,"Name","res4f_branch2c","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn4f_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","add_13")
reluLayer("Name","activation_40_relu")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],2048,"Name","res5a_branch1","BiasLearnRateFactor",0,"Stride",[2 2])
batchNormalizationLayer("Name","bn5a_branch1","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],512,"Name","res5a_branch2a","BiasLearnRateFactor",0,"Stride",[2 2])
batchNormalizationLayer("Name","bn5a_branch2a","Epsilon",0.001)
reluLayer("Name","activation_41_relu")
convolution2dLayer([3 3],512,"Name","res5a_branch2b","BiasLearnRateFactor",0,"Padding","same")
batchNormalizationLayer("Name","bn5a_branch2b","Epsilon",0.001)
reluLayer("Name","activation_42_relu")
convolution2dLayer([1 1],2048,"Name","res5a_branch2c","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn5a_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","add_14")
reluLayer("Name","activation_43_relu")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],512,"Name","res5b_branch2a","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn5b_branch2a","Epsilon",0.001)
reluLayer("Name","activation_44_relu")
convolution2dLayer([3 3],512,"Name","res5b_branch2b","BiasLearnRateFactor",0,"Padding","same")
batchNormalizationLayer("Name","bn5b_branch2b","Epsilon",0.001)
reluLayer("Name","activation_45_relu")
convolution2dLayer([1 1],2048,"Name","res5b_branch2c","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn5b_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","add_15")
reluLayer("Name","activation_46_relu")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
convolution2dLayer([1 1],512,"Name","res5c_branch2a","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn5c_branch2a","Epsilon",0.001)
reluLayer("Name","activation_47_relu")
convolution2dLayer([3 3],512,"Name","res5c_branch2b","BiasLearnRateFactor",0,"Padding","same")
batchNormalizationLayer("Name","bn5c_branch2b","Epsilon",0.001)
reluLayer("Name","activation_48_relu")
convolution2dLayer([1 1],2048,"Name","res5c_branch2c","BiasLearnRateFactor",0)
batchNormalizationLayer("Name","bn5c_branch2c","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
additionLayer(2,"Name","add_16")
reluLayer("Name","activation_49_relu")
globalAveragePooling2dLayer("Name","avg_pool")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
% unfold the sequence and flatten the per-step CNN features
sequenceUnfoldingLayer("Name","sequnfold")
flattenLayer("Name","flatten")
% RNN head starts here; layers can be added or removed as needed
gruLayer(128,'Name','gru1','RecurrentWeightsInitializer','He','InputWeightsInitializer','He')
lstmLayer(64,'Name','lstm2','RecurrentWeightsInitializer','He','InputWeightsInitializer','He')
dropoutLayer(0.25,'Name','drop2')
% the final recurrent layer must use OutputMode 'last' so that a single
% feature vector per sequence reaches the regression head
lstmLayer(32,'OutputMode','last','Name','lstm3','RecurrentWeightsInitializer','He','InputWeightsInitializer','He')
dropoutLayer(0.25,'Name','drop3')
% end of the RNN head
fullyConnectedLayer(numResponses,"Name","fc")
regressionLayer("Name","regressionoutput")];
lgraph = addLayers(lgraph,tempLayers);
% clean up helper variable
clear tempLayers;
%% Connect Layer Branches
% Connect all the branches of the network to create the network graph.
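% Note: the fold/unfold pair is what lets the 2-D ResNet backbone run on
% every time step independently. "fold/out" feeds the per-step images into
% conv1, and the "miniBatchSize" link below tells sequnfold how to restore
% the sequence dimension after the CNN, so the recurrent head receives one
% feature vector per time step.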
lgraph = connectLayers(lgraph,"fold/out","conv1");
lgraph = connectLayers(lgraph,"fold/miniBatchSize","sequnfold/miniBatchSize");
lgraph = connectLayers(lgraph,"max_pooling2d_1","res2a_branch1");
lgraph = connectLayers(lgraph,"max_pooling2d_1","res2a_branch2a");
lgraph = connectLayers(lgraph,"bn2a_branch1","add_1/in2");
lgraph = connectLayers(lgraph,"bn2a_branch2c","add_1/in1");
lgraph = connectLayers(lgraph,"activation_4_relu","res2b_branch2a");
lgraph = connectLayers(lgraph,"activation_4_relu","add_2/in2");
lgraph = connectLayers(lgraph,"bn2b_branch2c","add_2/in1");
lgraph = connectLayers(lgraph,"activation_7_relu","res2c_branch2a");
lgraph = connectLayers(lgraph,"activation_7_relu","add_3/in2");
lgraph = connectLayers(lgraph,"bn2c_branch2c","add_3/in1");
lgraph = connectLayers(lgraph,"activation_10_relu","res3a_branch2a");
lgraph = connectLayers(lgraph,"activation_10_relu","res3a_branch1");
lgraph = connectLayers(lgraph,"bn3a_branch2c","add_4/in1");
lgraph = connectLayers(lgraph,"bn3a_branch1","add_4/in2");
lgraph = connectLayers(lgraph,"activation_13_relu","res3b_branch2a");
lgraph = connectLayers(lgraph,"activation_13_relu","add_5/in2");
lgraph = connectLayers(lgraph,"bn3b_branch2c","add_5/in1");
lgraph = connectLayers(lgraph,"activation_16_relu","res3c_branch2a");
lgraph = connectLayers(lgraph,"activation_16_relu","add_6/in2");
lgraph = connectLayers(lgraph,"bn3c_branch2c","add_6/in1");
lgraph = connectLayers(lgraph,"activation_19_relu","res3d_branch2a");
lgraph = connectLayers(lgraph,"activation_19_relu","add_7/in2");
lgraph = connectLayers(lgraph,"bn3d_branch2c","add_7/in1");
lgraph = connectLayers(lgraph,"activation_22_relu","res4a_branch1");
lgraph = connectLayers(lgraph,"activation_22_relu","res4a_branch2a");
lgraph = connectLayers(lgraph,"bn4a_branch1","add_8/in2");
lgraph = connectLayers(lgraph,"bn4a_branch2c","add_8/in1");
lgraph = connectLayers(lgraph,"activation_25_relu","res4b_branch2a");
lgraph = connectLayers(lgraph,"activation_25_relu","add_9/in2");
lgraph = connectLayers(lgraph,"bn4b_branch2c","add_9/in1");
lgraph = connectLayers(lgraph,"activation_28_relu","res4c_branch2a");
lgraph = connectLayers(lgraph,"activation_28_relu","add_10/in2");
lgraph = connectLayers(lgraph,"bn4c_branch2c","add_10/in1");
lgraph = connectLayers(lgraph,"activation_31_relu","res4d_branch2a");
lgraph = connectLayers(lgraph,"activation_31_relu","add_11/in2");
lgraph = connectLayers(lgraph,"bn4d_branch2c","add_11/in1");
lgraph = connectLayers(lgraph,"activation_34_relu","res4e_branch2a");
lgraph = connectLayers(lgraph,"activation_34_relu","add_12/in2");
lgraph = connectLayers(lgraph,"bn4e_branch2c","add_12/in1");
lgraph = connectLayers(lgraph,"activation_37_relu","res4f_branch2a");
lgraph = connectLayers(lgraph,"activation_37_relu","add_13/in2");
lgraph = connectLayers(lgraph,"bn4f_branch2c","add_13/in1");
lgraph = connectLayers(lgraph,"activation_40_relu","res5a_branch1");
lgraph = connectLayers(lgraph,"activation_40_relu","res5a_branch2a");
lgraph = connectLayers(lgraph,"bn5a_branch2c","add_14/in1");
lgraph = connectLayers(lgraph,"bn5a_branch1","add_14/in2");
lgraph = connectLayers(lgraph,"activation_43_relu","res5b_branch2a");
lgraph = connectLayers(lgraph,"activation_43_relu","add_15/in2");
lgraph = connectLayers(lgraph,"bn5b_branch2c","add_15/in1");
lgraph = connectLayers(lgraph,"activation_46_relu","res5c_branch2a");
lgraph = connectLayers(lgraph,"activation_46_relu","add_16/in2");
lgraph = connectLayers(lgraph,"bn5c_branch2c","add_16/in1");
lgraph = connectLayers(lgraph,"avg_pool","sequnfold/in");
%% Plot Layers
% plot(lgraph);
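After defining the function, the assembled graph can be sanity-checked before training. A short sketch (the input size here is only an example, and analyzeNetwork requires Deep Learning Toolbox):

inputSize = 64;                              % example input height
numResponses = 1;
lgraph = resnet50(inputSize,numResponses);
analyzeNetwork(lgraph)                       % inspect sizes, catch wiring errors
% plot(lgraph)                               % or view the graph topology only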
⛄ Results