【FNN Regression Prediction】Data regression prediction with a Jaya-optimized feed-forward neural network (FNN), with Matlab code

This post shows how a feed-forward neural network trained with the JAYA optimization algorithm can be applied to breast-cancer detection. Twenty-five features are used (covering patient history, physical examination, and mammographic findings), and backpropagation is used to adjust the network's weights. In the code example, the data are preprocessed, the network is trained and tested, and the predictive performance of a standard FNN is compared with that of the JAYA-optimized FNN.


✅ About the author: a Matlab simulation developer with a passion for research, refining both mindset and technique; open to Matlab project collaboration via private message.

🍎 Homepage: Matlab科研工作室 (Matlab Research Studio)

🍊 Motto: investigate things to attain knowledge (格物致知).


⛄ Introduction

This post presents an artificial neural network technique in which a feed-forward neural network trained by the backpropagation algorithm estimates the likelihood of breast cancer. During training, backpropagation is applied as an independent optimization step to evolve the ANN's interconnection weights. The network uses 25 features (12 from patient history, 13 from physical findings, and 4 from mammography), with biopsy results serving as the ground truth for the presence of breast cancer.
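The Jaya update rule used to train the network is simple and parameter-free: each candidate solution moves toward the current best solution and away from the current worst one, and a move is kept only if it improves the cost. The post's implementation is in Matlab; for illustration, here is a minimal NumPy sketch (the function name and the sphere test function are illustrative, not from the original code):

```python
import numpy as np

rng = np.random.default_rng(0)

def jaya_minimize(f, lb, ub, pop=30, max_gen=100):
    """Minimize f over the box [lb, ub] with the Jaya update rule."""
    dim = len(lb)
    P = lb + rng.random((pop, dim)) * (ub - lb)   # random initial population
    cost = np.apply_along_axis(f, 1, P)
    for _ in range(max_gen):
        best, worst = P[cost.argmin()], P[cost.argmax()]
        for i in range(pop):
            r1, r2 = rng.random(dim), rng.random(dim)
            # move toward the best and away from the worst solution
            cand = P[i] + r1 * (best - np.abs(P[i])) - r2 * (worst - np.abs(P[i]))
            cand = np.clip(cand, lb, ub)           # keep within bounds
            c = f(cand)
            if c < cost[i]:                        # greedy acceptance
                P[i], cost[i] = cand, c
    return P[cost.argmin()], cost.min()

# demo: minimize the sphere function over [-1, 1]^3
x, fx = jaya_minimize(lambda v: np.sum(v**2), np.full(3, -1.0), np.full(3, 1.0))
```

In the FNN setting, each candidate vector is one complete set of network weights and biases, and `f` is the network's normalized mean square error on the training data.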

⛄ Partial Code

% *************************************************************************************************************

%                  Source Code of JAYA Optimization based Feed-Forward

%                  Neural Network 

% Cite: Wang S, Rao RV, Chen P, Zhang Y, Liu A, Wei L. Abnormal breast detection in 

% mammogram images by feed-forward neural network trained by Jaya algorithm. 

% *************************************************************************************************************

% Enjoy JAYA-ANN! 

clc;

close all

% Generating random correlated data 

mu = 50;

sigma = 5;

Z = randn(300, 2);

R = [1, 0.75; 0.75, 1];

L = chol(R);

% correlate first, then shift and scale, so the mean is not distorted
M = mu + sigma * (Z*L);

x = M(:,1);  % Example Inputs, Replace by your data inputs for your own experiments

y = M(:,2); % Example labels, Replace by your data labels for your own experiments

%% JAYA algorithms

%% Problem Definition

pop = 30;               % Population size

% Min-max normalization of data

m = max(x); mn = min(x); mm = m-mn;

X = ((x-mn)/mm); Y = ((y-mn)/mm);

% 90%:10% splitting of data for training and testing 

sz = ceil(size(X,1)*0.9);

inputs = (X(1:sz))';

targets = (Y(1:sz))';

XTest = (X(sz+1:end))';

YTest = Y(sz+1:end)';

% number of neurons

n = 4;

tic;

% create a neural network

net = feedforwardnet(n);

% configure the neural network for this dataset

net = configure(net, inputs, targets);

% train the baseline FNN with backpropagation so the comparison is meaningful

net = train(net, inputs, targets);

% Denormalization and Prediction by FNN

FNN_Pred = ((net(XTest))' * mm) + mn;

sz = length(getwb(net)); % Number of design variables, i.e. all weights and biases in the FNN

maxGen = 30;            % Maximum number of iterations

mini = repmat(-1,1,sz); % Lower Bound of Variables

maxi = ones(1,sz);      % Upper Bound of Variables  

objective = @(x) NMSE(x, net, inputs, targets);      % Cost Function

% JAYA optimization loop (reconstructed here; omitted in the original excerpt)
P = repmat(mini,pop,1) + rand(pop,sz).*repmat(maxi-mini,pop,1); % initial population
cost = zeros(pop,1);
for i = 1:pop, cost(i) = objective(P(i,:)); end
fopt = zeros(maxGen,1);
for gen = 1:maxGen
    [~,b] = min(cost); [~,w] = max(cost);
    Best = P(b,:); Worst = P(w,:);
    for i = 1:pop
        % move toward the best and away from the worst solution
        cand = P(i,:) + rand(1,sz).*(Best - abs(P(i,:))) ...
                      - rand(1,sz).*(Worst - abs(P(i,:)));
        cand = min(max(cand, mini), maxi);  % clamp to bounds
        c = objective(cand);
        if c < cost(i), P(i,:) = cand; cost(i) = c; end
    end
    fopt(gen) = min(cost);
end
[val,b] = min(cost); Best = P(b,:);

disp(['Optimum value = ',num2str(val,10)])

 figure;

 plot(fopt,'LineWidth', 2);

 xlabel('Iteration');

 ylabel('Best Cost');

 legend('JAYA');

 disp(' ' );

% Setting optimized weights and bias in network

net = setwb(net, Best');

% Denormalization and Prediction by JAYA_FNN

JAYA_FNN_Pred = ((net(XTest))' * mm) + mn;

YTest = (YTest * mm) + mn;

JAYA_FNN_Execution_Time_Seconds = toc 

% Plotting prediction results

figure;

plot(YTest,'LineWidth',2, 'Marker','diamond', 'MarkerSize',8);

hold on;

plot(FNN_Pred, 'LineWidth',2, 'Marker','x', 'MarkerSize',8);

plot(JAYA_FNN_Pred, 'LineWidth',2, 'Marker','pentagram', 'MarkerSize',8);

title('JAYA Optimization based Feed-Forward Neural Network');

xlabel('Time Interval');

ylabel('Values');

legend('Actual Values', 'FNN Predictions', 'JAYA-FNN Predictions');

hold off;

% Performance Evaluation of FNN and JAYA-FNN

fprintf('Performance Evaluation of FNN and JAYA-FNN using Normalized Root Mean Square Error \n');

NRMSE_FNN = (abs( sqrt( mean(mean((FNN_Pred - YTest).^2) )) )) / (max(YTest)-min(YTest))

NRMSE_JAYA_FNN = (abs( sqrt( mean(mean((JAYA_FNN_Pred - YTest).^2) ) ) )) / (max(YTest)-min(YTest))

% Objective function for minimizing the normalized mean square error of the FNN

% by updating the network's weights and biases

function [f] = NMSE(wb, net, input, target)

% wb is the weights-and-biases row vector proposed by the JAYA optimizer.

% It must be transposed when transferring the weights and biases to the network net.

 net = setwb(net, wb');

% The net output matrix is given by net(input). The corresponding error matrix is given by

 error = target - net(input);

% The mean squared error normalized by the mean target variance is

 f = (mean(error.^2)/mean(var(target',1)));

% It is independent of the scale of the target components and related to the Rsquare statistic via

% Rsquare = 1 - NMSE (see Wikipedia)

 end
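The cost function's comment notes that this NMSE equals 1 − R², since both quantities normalize the squared error by the target variance. A quick numeric check (Python/NumPy here for convenience; the arrays are illustrative, not from the post's data):

```python
import numpy as np

target = np.array([3.0, 5.0, 7.0, 9.0, 11.0])
pred   = np.array([2.8, 5.1, 7.3, 8.7, 11.2])

err = target - pred
nmse = np.mean(err**2) / np.var(target)   # MSE normalized by the target variance

# R^2 = 1 - SS_res / SS_tot; the 1/n factors cancel, so 1 - R^2 equals NMSE
r2 = 1.0 - np.sum(err**2) / np.sum((target - target.mean())**2)
assert np.isclose(nmse, 1.0 - r2)
```

So driving the NMSE cost toward zero is equivalent to driving the fit's R² toward one, independent of the scale of the targets.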

⛄ Results

⛄ References

[1] Babu G A, Bhukya S N, Kumar R S. Feed forward network with back propagation algorithm for detection of breast cancer[C]// International Conference on Computer Science & Education. IEEE, 2013.

[2] Zhu Z, Liu Q, Zhang H. Temperature prediction for dish solar collectors based on an optimized general regression neural network[J]. Journal of Nanjing Institute of Technology (Natural Science Edition), 2020, 18(1): 5.

⛳️ Complete code

❤️ Some of the theory cites online literature; contact the blogger for removal in case of infringement.

❤️ Follow me for a large collection of Matlab e-books and mathematical-modeling materials.
