QX Project in Practice - 14. Struts2 + Hibernate Integration

Using user login as an example, this post walks through integrating Struts2 and Hibernate. First, create a new web project in MyEclipse 8.0, taking care to select J2EE 5.0 as the specification level.


We first add Struts capabilities: right-click the project name and choose MyEclipse > Add Struts Capabilities, selecting Struts 2.1; the default installation options are fine. Then add Hibernate capabilities in the same way, selecting Hibernate 3.2. For the database-connection step of that wizard, a connection must already have been configured in the database perspective beforehand. Also create a new package to hold the generated Hibernate support code; here we create hibernate.example.

Run the system to confirm everything works: start the server, deploy the project to Tomcat, then open the project URL in a browser, e.g. http://192.168.195.54:8080/QX4/, which should show the default welcome page.

Now for the Hibernate side. Switch to the database perspective, locate the table to be mapped, in this case the users table in the userinfo database, right-click it, and choose Hibernate Reverse Engineering to generate the POJO class and its mapping file.
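The generated class itself is not reproduced in the original post. As a minimal sketch of what reverse engineering typically produces, assuming the users table has id, name, and password columns (the exact schema is an assumption):

    public class Users implements java.io.Serializable {
        private Integer id;       // assumed primary-key column
        private String name;      // assumed login-name column
        private String password;  // assumed password column

        public Integer getId() { return id; }
        public void setId(Integer id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
        public String getPassword() { return password; }
        public void setPassword(String password) { this.password = password; }
    }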


Next comes the business-logic part. Create the class oper.example.userview, whose showUsers method is:

public String showUsers() {
    Session session = HibernateSessionFactory.getSession();
    Query query = session.createQuery("from Users");
    String ret = doPrint(query); // doPrint renders the query results as a string
    return ret;
}

This code uses Hibernate to read every record of the users table and returns the result.
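The doPrint helper is referenced but never shown in the original. A minimal sketch, assuming it simply concatenates each Users record into one string:

    private String doPrint(Query query) {
        StringBuilder sb = new StringBuilder();
        for (Object obj : query.list()) {   // query.list() returns the raw result list
            Users u = (Users) obj;
            sb.append(u.getName()).append("<br/>");
        }
        return sb.toString();
    }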

Now for the Struts part, which implements the login flow. Create the class struts.example.LoginAction with name and password properties plus their getter and setter accessors, then define an execute method. (In a full version this method would call the Users Hibernate class to query the database.) It returns either the string "success" or "error", and these two strings map to the result pages configured in struts.xml. The method is defined as:

public String execute() {
    if (getName().equals("admin") && getPassword().equals("123")) {
        return "success";
    } else {
        return "error";
    }
}
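The property declarations themselves are not shown; as described above, they are just two String fields with standard accessors:

    private String name;
    private String password;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public String getPassword() { return password; }
    public void setPassword(String password) { this.password = password; }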

And struts.xml is:

<struts>
    <package name="logincode" extends="struts-default">
        <action name="login" class="struts.example.LoginAction">
            <result name="success">/loginSuccess.jsp</result>
            <result name="error">/loginFail.jsp</result>
        </action>
    </package>
</struts>
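Note that a deployable struts.xml also starts with the XML declaration and the Struts 2.1 DTD reference:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE struts PUBLIC
        "-//Apache Software Foundation//DTD Struts Configuration 2.1//EN"
        "http://struts.apache.org/dtds/struts-2.1.dtd">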

This is only a simple example showing how a login request flows through Struts. In the loginSuccess.jsp page we import the userview class and print the output of its showUsers method:

<%@ page import="oper.example.userview" %>
<%
    userview uv = new userview();
    out.println("aaaaa" + uv.showUsers()); // "aaaaa" is just a visible debug marker
%>
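The login form page itself is not shown in the original. A minimal sketch using the Struts 2 tag library (the file name login.jsp and the field labels are assumptions):

    <%@ taglib prefix="s" uri="/struts-tags" %>
    <s:form action="login">
        <s:textfield name="name" label="Username"/>
        <s:password name="password" label="Password"/>
        <s:submit value="Login"/>
    </s:form>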

With the steps above, Struts and Hibernate are wired together. The login check here is simply hard-coded; in a real application we would query the database to perform the check. This example is meant only as a starting point from which more complex programs can be developed.
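As a sketch of what that database-backed check could look like inside execute(), reusing the Users entity mapped earlier (the HQL property names assume the schema sketched above):

    // A sketch only: replace the hard-coded comparison with an HQL lookup
    Session session = HibernateSessionFactory.getSession();
    Query query = session.createQuery(
            "from Users u where u.name = :name and u.password = :password");
    query.setParameter("name", getName());
    query.setParameter("password", getPassword());
    return query.list().isEmpty() ? "error" : "success";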

PS: to deploy Struts 2.1 on Tomcat, the xerces.jar file must be deleted, and this applies to every project.

