A MATLAB Modeling Experiment on Refined Forecasting of the Masson Pine Caterpillar Based on Neural Networks

Zhang Guoqing (Forestry Bureau of Qianshan County, Anhui Province)

1. Data sources

Occurrence and occurrence-period data for the Masson pine caterpillar come from the monitoring records of Qianshan County; meteorological data come from the National Climate Center.

2. Data preprocessing

To keep each Masson pine caterpillar generation intact in time, the overwintering-generation data were merged with the previous year's second-generation data during preprocessing. This preserves the temporal completeness of a generation and makes modeling and prediction more convenient.

(1) Meteorological data

Based on academic references such as Integrated Management of Pine Caterpillars (《松毛虫综合管理》) and Pine Caterpillars of China (《中国松毛虫》), together with recent papers on monitoring and forecasting of the Masson pine caterpillar, 16 meteorological factors with some correlation to occurrence and occurrence period were initially selected: egg-stage minimum temperature, egg-stage mean temperature, egg-stage accumulated temperature (degree-days), and egg-stage rainfall; 1st-2nd instar minimum temperature, mean temperature, accumulated temperature (degree-days), and rainfall; larval-stage minimum temperature, mean temperature, accumulated temperature (degree-days), and rainfall; and whole-generation minimum temperature, mean temperature, accumulated temperature (degree-days), and rainfall. The raw meteorological data from the National Climate Center were converted, by year and by generation, into series for these 16 variables.

(2) Occurrence data

So that occurrence intensity could be analyzed during modeling, the raw monitoring data of Qianshan County for 1983-2014 were classified into three intensity levels ("light", "moderate", "heavy") and summarized by generation and year.

(3) Occurrence-period data

The raw occurrence-period monitoring data of Qianshan County for 1983-2014 were first summarized by generation and year; the dates were then converted to day-of-year values so that they become quantitative and suitable for modeling.

3. Selection of factor variables

Through correlation analysis and comparative modeling trials, the factor variables were chosen as follows. First-generation occurrence: 1st-2nd instar minimum temperature, egg-stage minimum temperature, previous-generation control efficacy, and previous-generation control area. Second-generation occurrence: 1st-2nd instar minimum temperature, egg-stage minimum temperature, previous-generation control efficacy, previous-generation control area, 1st-2nd instar rainfall, and egg-stage rainfall. First-generation larval peak period: 1st-2nd instar mean temperature, 1st-2nd instar accumulated temperature (degree-days), 1st-2nd instar minimum temperature, and egg-stage minimum temperature. Second-generation larval peak period: first adult appearance date, egg-stage mean temperature, egg-stage accumulated temperature (degree-days), and 1st-2nd instar minimum temperature.

The first-generation occurrence target variable is named s1y and its factor (input) variables s1x; the second-generation occurrence target is s2y with inputs s2x; the first-generation larval-peak target is t1y with inputs t1x; and the second-generation larval-peak target is t2y with inputs t2x.

4. First-generation occurrence modeling experiment

4.1 Program code

The Simple Script is:

% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 19:28:48 CST 2015
%
% This script assumes these variables are defined:
%
%   s1x - input data.
%   s1y - target data.

x = s1x';
t = s1y';

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.
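The preprocessing in section 2 converts occurrence-period dates into day-of-year values and accumulates stage temperatures as degree-days. A minimal sketch of both transforms (Python, illustrative only; the 10 °C developmental base temperature is an assumption, not a value given in the text):

```python
from datetime import date

def day_of_year(d: date) -> int:
    # Convert a calendar date to its ordinal day within the year,
    # as done when quantifying the occurrence-period dates.
    return d.timetuple().tm_yday

def degree_days(daily_mean_temps, base=10.0):
    # Accumulated temperature (degree-days) over a stage: the sum of
    # daily mean temperatures above an assumed base of 10 degrees C.
    return sum(max(t - base, 0.0) for t in daily_mean_temps)

print(day_of_year(date(2014, 5, 20)))                      # 140
print(degree_days([12.0, 15.5, 9.0, 20.0], base=10.0))     # 17.5
```

The same two helpers would be applied per generation and per stage (egg, 1st-2nd instar, larval, whole generation) to build the 16 variable series.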
trainFcn = 'trainlm';  % Levenberg-Marquardt

% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize,trainFcn);

% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;

% Train the Network
[net,tr] = train(net,x,t);

% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotfit(net,x,t)
%figure, plotregression(t,y)
%figure, ploterrhist(e)

The Advanced Script is:

% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 19:29:03 CST 2015
%
% This script assumes these variables are defined:
%
%   s1x - input data.
%   s1y - target data.

x = s1x';
t = s1y';

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.
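The Advanced Script applies 'removeconstantrows' and 'mapminmax' to inputs and outputs. mapminmax linearly rescales each data row to [-1, 1]; a sketch of the forward transform (Python, illustrative; it assumes constant rows have already been dropped by removeconstantrows):

```python
def mapminmax_apply(row, ymin=-1.0, ymax=1.0):
    # Rescale values linearly from [min(row), max(row)] to [ymin, ymax],
    # as MATLAB's mapminmax does for each row of the data matrix.
    lo, hi = min(row), max(row)
    return [ymin + (v - lo) * (ymax - ymin) / (hi - lo) for v in row]

print(mapminmax_apply([0.0, 5.0, 10.0]))  # [-1.0, 0.0, 1.0]
```

Normalizing each factor to a common range keeps variables with large magnitudes (e.g. degree-days) from dominating variables with small magnitudes (e.g. minimum temperatures) during training.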
trainFcn = 'trainlm';  % Levenberg-Marquardt

% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize,trainFcn);

% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows','mapminmax'};
net.output.processFcns = {'removeconstantrows','mapminmax'};

% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'sample';     % Divide up every sample
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;

% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean squared error

% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
  'plotregression','plotfit'};

% Train the Network
[net,tr] = train(net,x,t);

% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)

% Recalculate Training, Validation and Test Performance
trainTargets = t .* tr.trainMask{1};
valTargets = t .* tr.valMask{1};
testTargets = t .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotfit(net,x,t)
%figure, plotregression(t,y)
%figure, ploterrhist(e)

% Deployment
% Change the (false) values to (true) to enable the following code blocks.
if (false)
  % Generate MATLAB function for neural network for application deployment
  % in MATLAB scripts or with MATLAB Compiler and Builder tools, or simply
  % to examine the calculations your trained neural network performs.
  genFunction(net,'myNeuralNetworkFunction');
  y = myNeuralNetworkFunction(x);
end
if (false)
  % Generate a matrix-only MATLAB function for neural network code
  % generation with MATLAB Coder tools.
  genFunction(net,'myNeuralNetworkFunction','MatrixOnly','yes');
  y = myNeuralNetworkFunction(x);
end
if (false)
  % Generate a Simulink diagram for simulation or deployment with
  % Simulink Coder tools.
  gensim(net);
end

4.2 Network training process

(Training-progress window figure omitted.)

4.3 Training results

(Training-results figure omitted.) The R values for the training, validation, and test samples were 0.875337, -1, and 1, respectively.

(Error histogram figure omitted.)

(Regression plots for the training, validation, test, and all-data samples omitted.) The validation-sample and test-sample R values both have magnitude 1.

5. Second-generation occurrence modeling experiment

5.1 Program code

The Simple Script is:

% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 20:04:18 CST 2015
%
% This script assumes these variables are defined:
%
%   s2x - input data.
%   s2y - target data.

x = s2x';
t = s2y';

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.

trainFcn = 'trainlm';  % Levenberg-Marquardt

% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize,trainFcn);

% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;

% Train the Network
[net,tr] = train(net,x,t);

% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
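The R values reported in the training results are the correlation coefficients between network outputs and targets, as shown by plotregression. A minimal sketch of that statistic (Python; the data values are purely illustrative):

```python
import math

def pearson_r(t, y):
    # Correlation coefficient R between targets t and network outputs y,
    # the quantity reported by MATLAB's regression plots.
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    cov = sum((a - mt) * (b - my) for a, b in zip(t, y))
    st = math.sqrt(sum((a - mt) ** 2 for a in t))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (st * sy)

print(round(pearson_r([1, 2, 3, 4], [1.1, 1.9, 3.2, 3.8]), 3))  # 0.991
```

Note that R measures linear agreement only; with very small validation and test subsets it is forced toward plus or minus 1 regardless of model quality.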
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotfit(net,x,t)
%figure, plotregression(t,y)
%figure, ploterrhist(e)

The Advanced Script is:

% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 20:04:31 CST 2015
%
% This script assumes these variables are defined:
%
%   s2x - input data.
%   s2y - target data.

x = s2x';
t = s2y';

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.

trainFcn = 'trainlm';  % Levenberg-Marquardt

% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize,trainFcn);

% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows','mapminmax'};
net.output.processFcns = {'removeconstantrows','mapminmax'};

% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'sample';     % Divide up every sample
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;

% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean squared error

% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
  'plotregression','plotfit'};

% Train the Network
[net,tr] = train(net,x,t);

% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)

% Recalculate Training, Validation and Test Performance
trainTargets = t .* tr.trainMask{1};
valTargets = t .* tr.valMask{1};
testTargets = t .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotfit(net,x,t)
%figure, plotregression(t,y)
%figure, ploterrhist(e)

% Deployment
% Change the (false) values to (true) to enable the following code blocks.
if (false)
  % Generate MATLAB function for neural network for application deployment
  % in MATLAB scripts or with MATLAB Compiler and Builder tools, or simply
  % to examine the calculations your trained neural network performs.
  genFunction(net,'myNeuralNetworkFunction');
  y = myNeuralNetworkFunction(x);
end
if (false)
  % Generate a matrix-only MATLAB function for neural network code
  % generation with MATLAB Coder tools.
  genFunction(net,'myNeuralNetworkFunction','MatrixOnly','yes');
  y = myNeuralNetworkFunction(x);
end
if (false)
  % Generate a Simulink diagram for simulation or deployment with
  % Simulink Coder tools.
  gensim(net);
end

5.2 Network training process

(Training-progress window figure omitted.)

5.3 Training results

(Training-results figure omitted.) The R values for the training, validation, and test samples were 0.942388, 0.999999, and 1, respectively.

(Error histogram figure omitted.)

(Regression plots for the training, validation, test, and all-data samples omitted.) The validation and test R values are both approximately 1; the training-sample R = 0.94239 and the all-data R = 0.89479.

6. First-generation larval peak period modeling experiment

6.1 Program code

The Simple Script is:

% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 20:16:32 CST 2015
%
% This script assumes these variables are defined:
%
%   t1x - input data.
%   t1y - target data.

x = t1x';
t = t1y';

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.

trainFcn = 'trainlm';  % Levenberg-Marquardt

% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize,trainFcn);

% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;

% Train the Network
[net,tr] = train(net,x,t);

% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotfit(net,x,t)
%figure, plotregression(t,y)
%figure, ploterrhist(e)

The Advanced Script is:

% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 20:17:08 CST 2015
%
% This script assumes these variables are defined:
%
%   t1x - input data.
%   t1y - target data.

x = t1x';
t = t1y';

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt

% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize,trainFcn);

% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows','mapminmax'};
net.output.processFcns = {'removeconstantrows','mapminmax'};

% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'sample';     % Divide up every sample
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;

% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean squared error

% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
  'plotregression','plotfit'};

% Train the Network
[net,tr] = train(net,x,t);

% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)

% Recalculate Training, Validation and Test Performance
trainTargets = t .* tr.trainMask{1};
valTargets = t .* tr.valMask{1};
testTargets = t .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotfit(net,x,t)
%figure, plotregression(t,y)
%figure, ploterrhist(e)

% Deployment
% Change the (false) values to (true) to enable the following code blocks.
if (false)
  % Generate MATLAB function for neural network for application deployment
  % in MATLAB scripts or with MATLAB Compiler and Builder tools, or simply
  % to examine the calculations your trained neural network performs.
  genFunction(net,'myNeuralNetworkFunction');
  y = myNeuralNetworkFunction(x);
end
if (false)
  % Generate a matrix-only MATLAB function for neural network code
  % generation with MATLAB Coder tools.
  genFunction(net,'myNeuralNetworkFunction','MatrixOnly','yes');
  y = myNeuralNetworkFunction(x);
end
if (false)
  % Generate a Simulink diagram for simulation or deployment with
  % Simulink Coder tools.
  gensim(net);
end

6.2 Network training process

(Training-progress window figure omitted.)

6.3 Training results

(Training-results figure omitted.) The R values for the training, validation, and test samples were 0.875337, -1, and 1, respectively.

(Error histogram figure omitted.)

(Regression plots for the training, validation, test, and all-data samples omitted.) The validation-sample and test-sample R values both have magnitude 1.

7. Second-generation larval peak period modeling experiment

7.1 Program code

The Simple Script is:

% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 20:22:04 CST 2015
%
% This script assumes these variables are defined:
%
%   t2x - input data.
%   t2y - target data.

x = t2x';
t = t2y';

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.

trainFcn = 'trainlm';  % Levenberg-Marquardt

% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize,trainFcn);

% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;

% Train the Network
[net,tr] = train(net,x,t);

% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
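All of the scripts divide the samples randomly in a 90/5/5 ratio via 'dividerand'. With roughly 32 yearly generations of data (1983-2014), this leaves only one or two validation and test samples, which may explain why their R values degenerate to exactly plus or minus 1. A sketch of such a split (Python, illustrative):

```python
import random

def divide_rand(n, train=0.90, val=0.05, test=0.05, seed=0):
    # Shuffle sample indices, then split by ratio, mimicking
    # net.divideFcn = 'dividerand' with trainRatio/valRatio/testRatio.
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    n_train = round(train * n)
    n_val = round(val * n)
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]

tr_idx, val_idx, test_idx = divide_rand(32)
print(len(tr_idx), len(val_idx), len(test_idx))  # 29 2 1
```

With so few held-out samples, leave-one-out cross-validation would give a more reliable picture of generalization than a single random split.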
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotfit(net,x,t)
%figure, plotregression(t,y)
%figure, ploterrhist(e)

The Advanced Script is:

% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 20:22:29 CST 2015
%
% This script assumes these variables are defined:
%
%   t2x - input data.
%   t2y - target data.

x = t2x';
t = t2y';

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.

trainFcn = 'trainlm';  % Levenberg-Marquardt

% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize,trainFcn);

% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows','mapminmax'};
net.output.processFcns = {'removeconstantrows','mapminmax'};

% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'sample';     % Divide up every sample
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;

% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean squared error

% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
  'plotregression','plotfit'};

% Train the Network
[net,tr] = train(net,x,t);

% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)

% Recalculate Training, Validation and Test Performance
trainTargets = t .* tr.trainMask{1};
valTargets = t .* tr.valMask{1};
testTargets = t .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotfit(net,x,t)
%figure, plotregression(t,y)
%figure, ploterrhist(e)

% Deployment
% Change the (false) values to (true) to enable the following code blocks.
if (false)
  % Generate MATLAB function for neural network for application deployment
  % in MATLAB scripts or with MATLAB Compiler and Builder tools, or simply
  % to examine the calculations your trained neural network performs.
  genFunction(net,'myNeuralNetworkFunction');
  y = myNeuralNetworkFunction(x);
end
if (false)
  % Generate a matrix-only MATLAB function for neural network code
  % generation with MATLAB Coder tools.
  genFunction(net,'myNeuralNetworkFunction','MatrixOnly','yes');
  y = myNeuralNetworkFunction(x);
end
if (false)
  % Generate a Simulink diagram for simulation or deployment with
  % Simulink Coder tools.
  gensim(net);
end

7.2 Network training process

(Training-progress window figure omitted.)

7.3 Training results

(Training-results figure omitted.) The R values for the training, validation, and test samples were 0.402150, 1, and 1, respectively.

(Error histogram figure omitted.)

(Regression plots for the training, validation, test, and all-data samples omitted.)
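The Advanced Scripts recompute per-subset performance by masking the targets with tr.trainMask{1}, tr.valMask{1}, and tr.testMask{1}; entries outside the subset become NaN, which the 'mse' performance function ignores. The same idea in a minimal Python sketch (the data values are hypothetical):

```python
def masked_mse(t, y, mask):
    # Mean squared error over only the samples selected by the boolean
    # mask, equivalent to computing trainPerformance, valPerformance,
    # or testPerformance from NaN-masked targets in the scripts.
    errs = [(a - b) ** 2 for a, b, m in zip(t, y, mask) if m]
    return sum(errs) / len(errs)

t = [1.0, 2.0, 3.0, 4.0]
y = [1.5, 2.0, 2.0, 4.0]
train_mask = [True, True, False, False]
print(masked_mse(t, y, train_mask))  # (0.25 + 0.0) / 2 = 0.125
```

Reporting the three subset errors separately, as the scripts do, makes it easy to spot overfitting: a low training MSE paired with a high test MSE.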