Moore vs. Mealy Machine

slides from https://inst.eecs.berkeley.edu/~cs150/fa05/Lectures/07-SeqLogicIIIx2.pdf

[GPT 3.5]

A Mealy machine can have fewer states than a Moore machine because, in a Mealy machine, the output depends on both the current state and the input, whereas in a Moore machine the output depends only on the current state. Because a Mealy output is attached to a transition rather than to a state, two states can be merged if they produce the same output for the same input and their next states are equivalent. In a Moore machine, states that must emit different outputs can never be merged, since each state carries exactly one output value.

Here's an example to illustrate this:

Let's say we want to design a machine that detects a sequence of two consecutive 1's in a binary input. We can implement this using both a Mealy machine and a Moore machine.

**Mealy machine:**

In this Mealy machine, the state `q3` outputs a 1 when the input is 1, indicating that the sequence `11` has been detected. The output of `q3` depends on both the current state and the input. Moreover, `q2` and `q3` produce the same outputs for the same inputs, so we can merge them into a single state.
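
A minimal Python sketch of a two-state Mealy "11" detector (the state names and table coding below are my own illustration, not taken from the GPT answer or the slides):

```python
# Mealy "11" detector: the output is a function of (current state, input).
# S0 = no 1 seen yet, S1 = the previous bit was a 1.

MEALY_NEXT = {("S0", 0): "S0", ("S0", 1): "S1",   # next-state table
              ("S1", 0): "S0", ("S1", 1): "S1"}
MEALY_OUT  = {("S0", 0): 0, ("S0", 1): 0,         # output table: asserted only on
              ("S1", 0): 0, ("S1", 1): 1}         # the S1 --1--> S1 transition

def mealy_run(bits):
    state, outputs = "S0", []
    for b in bits:
        outputs.append(MEALY_OUT[(state, b)])     # output in the same cycle as the input
        state = MEALY_NEXT[(state, b)]
    return outputs

print(mealy_run([0, 1, 1, 1, 0, 1, 1]))  # [0, 0, 1, 1, 0, 0, 1]
```

Two states suffice because the "detected" information lives on the transition, not in a state.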

**Moore machine:**

In this Moore machine, the state `q1/q0` outputs a 1 when the sequence `11` has been detected. The output of `q1/q0` depends only on the current state, not the input. However, we cannot merge `q0/q0` and `q0/q1` because they produce different outputs.
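
For comparison, a Moore version of the same detector needs a third state just to hold the output (again, names and coding are my own sketch):

```python
# Moore "11" detector: the output is a function of the state alone, so a
# dedicated state (TWO) is needed to remember that 11 has just been seen.

MOORE_NEXT = {("IDLE", 0): "IDLE", ("IDLE", 1): "ONE",
              ("ONE",  0): "IDLE", ("ONE",  1): "TWO",
              ("TWO",  0): "IDLE", ("TWO",  1): "TWO"}
MOORE_OUT  = {"IDLE": 0, "ONE": 0, "TWO": 1}      # state -> output

def moore_run(bits):
    state, outputs = "IDLE", []
    for b in bits:
        outputs.append(MOORE_OUT[state])          # output depends on the state only
        state = MOORE_NEXT[(state, b)]
    return outputs

print(moore_run([0, 1, 1, 1, 0, 1, 1]))  # [0, 0, 0, 1, 1, 0, 0]
```

Note that the Moore output is asserted one cycle later than the Mealy output above, which is the usual timing difference between the two styles.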

As we can see, the Mealy machine has fewer states than the Moore machine because we were able to merge `q2` and `q3` in the Mealy machine, but we couldn't do so in the Moore machine.

==> for a Moore machine, the output does not depend on the input, but the next state still does

applying the merging rule stated at the beginning (a small minimization sketch follows these notes):

(starting from the Moore diagram, convert it to a Mealy machine first, then)

E is merged into B

D is merged into C

both outputs depend on the state only
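
The merging rule can also be checked mechanically. Below is a rough partition-refinement sketch (my own coding, not from the slides) that groups mergeable Mealy states; applied to the three-state Moore detector above, rewritten as a Mealy machine, it collapses `ONE` and `TWO` into one state:

```python
def minimize_mealy(states, inputs, nxt, out):
    """Group states that can be merged: same outputs for every input,
    and next states that fall into the same group (partition refinement)."""
    groups = {}
    for s in states:                                   # initial split by output row
        groups.setdefault(tuple(out[(s, a)] for a in inputs), set()).add(s)
    blocks = list(groups.values())
    while True:
        block_of = {s: i for i, b in enumerate(blocks) for s in b}
        new_blocks = []
        for b in blocks:                               # split blocks whose members
            sub = {}                                   # disagree on next-state blocks
            for s in b:
                sub.setdefault(tuple(block_of[nxt[(s, a)]] for a in inputs), set()).add(s)
            new_blocks.extend(sub.values())
        if len(new_blocks) == len(blocks):             # no block was split: done
            return new_blocks
        blocks = new_blocks

# The Moore "11" detector above, with each edge relabeled with the output of the
# state it enters (the usual Moore-to-Mealy conversion):
states, inputs = ["IDLE", "ONE", "TWO"], [0, 1]
nxt = {("IDLE", 0): "IDLE", ("IDLE", 1): "ONE",
       ("ONE",  0): "IDLE", ("ONE",  1): "TWO",
       ("TWO",  0): "IDLE", ("TWO",  1): "TWO"}
out = {(s, a): (1 if nxt[(s, a)] == "TWO" else 0) for s in states for a in inputs}
print(minimize_mealy(states, inputs, nxt, out))   # [{'IDLE'}, {'ONE', 'TWO'}] (ordering may vary)
```

Here the Mealy form gets by with two states, matching the discussion above.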

use a registered Mealy machine (capture the outputs in flip-flops) so the outputs are glitch-free and change only on clock edges

which effectively makes it a Moore machine, without needing a combinational decoder for the output
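
A rough sketch of what registering the output does, reusing the two-state detector coding from above (my own illustration, not the slides' circuit):

```python
# Registered-Mealy sketch: the raw Mealy output is captured in a flip-flop, so
# downstream logic sees a value that changes only on clock edges. The pair
# (state, out_reg) is then the state of an equivalent Moore machine, with no
# combinational decoder after the register.

NEXT = {("S0", 0): "S0", ("S0", 1): "S1", ("S1", 0): "S0", ("S1", 1): "S1"}
OUT  = {("S0", 0): 0,    ("S0", 1): 0,    ("S1", 0): 0,    ("S1", 1): 1}

def registered_mealy_run(bits):
    state, out_reg, outputs = "S0", 0, []
    for b in bits:
        outputs.append(out_reg)        # downstream only ever sees the registered output
        out_reg = OUT[(state, b)]      # Mealy output captured at the clock edge
        state = NEXT[(state, b)]       # state register updated at the same edge
    return outputs

print(registered_mealy_run([0, 1, 1, 1, 0, 1, 1]))  # [0, 0, 0, 1, 1, 0, 0]
```

The registered output matches the Moore detector's timing above, one cycle after the raw Mealy output.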

see the slides for extended examples.
