Weak Classifiers with Strong Logic


This article is reposted from http://www.ricoh.com/about/company/technology/voice/column/021.html

Classification of Non-rigid Objectives

The previous column (No. 20, Weak Classifier and Strong Classifier) told how multiple weak classifiers can be integrated into a single strong and flexible classifier.

Traditionally, engineers focused on specific targets that were easy to describe with numbers or that could be observed repeatedly. If, however, the target is unobservable, or difficult to observe, how do we define it? This time we focus on a conceptual or historical target whose description is often incomplete, or which is observable only once. Even with such a fuzzy target, you may notice that the concept of weak classifiers is a powerful framework.

We will first try to select a person with a good personality using weak classifiers. Traditional methodology requires that "good personality" be defined, which is not easy. Without defining the object, it is difficult to design the classifier. However, if we can divide the feature into partial sub-features, the classification process becomes easier to manage. The features of a good personality could be expressed as a person who (1) has no biased opinions, (2) regards himself/herself as equal to others, (3) does not impose on others, (4) is a good listener, (5) maintains temperate behavior, and (6) is not offensive to others. None of these features is strong enough to describe a good personality by itself, but taken together, the extracted personality is expected to be close. For stability, each classifier should be independent of the others. By integrating these weak classifiers, a high-performance classifier system can be obtained. Each individual weak classifier can be used with the simple logic "if A, then B". Figure 1 shows the relationships between the individual statements and the resulting statement, where the central common area represents the logical statement satisfied by all the weak conditions. Figure 2 is another diagram of the common region satisfied by all the independent conditions α, β, and γ. A person who passes all the tests from (1) to (6) should be a good person.


Fig. 1. Areas A-E depict the logical regions on which many weak pieces
of evidence stand. The center is the common area.


Fig. 2. The cube represents an area that has common but independent
properties, α, β and γ.
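The integration described above — many independent weak tests that must all hold — can be sketched in a few lines of code. The predicates and the candidate record below are hypothetical illustrations, not part of the original column:

```python
# A minimal sketch of combining the six weak "good personality" conditions.
# Each weak classifier alone is unreliable; requiring all of them to hold
# corresponds to the common central area of Fig. 1.

def has_good_personality(person: dict) -> bool:
    weak_classifiers = [
        lambda p: not p["biased"],       # (1) has no biased opinions
        lambda p: p["treats_equally"],   # (2) regards self as equal to others
        lambda p: not p["imposing"],     # (3) does not impose on others
        lambda p: p["good_listener"],    # (4) is a good listener
        lambda p: p["temperate"],        # (5) maintains temperate behavior
        lambda p: not p["offensive"],    # (6) is not offensive to others
    ]
    # Integration: the candidate must satisfy every independent weak test.
    return all(test(person) for test in weak_classifiers)

candidate = {
    "biased": False, "treats_equally": True, "imposing": False,
    "good_listener": True, "temperate": True, "offensive": False,
}
print(has_good_personality(candidate))  # True: passes all six weak tests
```

Note that a single failing condition rejects the candidate, which is exactly why each weak test on its own only needs to be slightly informative.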

According to a theorem of logic, the statement "if not B, then not A" is always correct whenever "if A, then B" is correct. This theorem is called contraposition, and it holds even when the classifier is weak. To check the validity of a weak classifier, the contraposition theorem is helpful. For example, before accepting condition (1), it is a good idea to test its contraposition: "a biased person does not have a good personality."
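The contraposition theorem can be checked mechanically by enumerating every truth assignment, a small sanity check one might run before trusting the logic:

```python
from itertools import product

def implies(a: bool, b: bool) -> bool:
    # Material implication: "if a, then b" is false only when a holds and b fails.
    return (not a) or b

# "if A, then B" must agree with its contraposition "if not B, then not A"
# for every possible combination of truth values.
for a, b in product([False, True], repeat=2):
    assert implies(a, b) == implies(not b, not a)

print("contraposition verified for all truth assignments")
```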

Application to Historical Interpretation

Deduction based on many weak pieces of evidence has been used for many years in the fields of history and anthropology. For example, mythology is often thought to be man-made fiction with no basis in reality, because it starts from gods. However, it is unrealistic to believe that a story can be created without any background. If a story is shared by groups of people, it is more reasonable to guess that it may have some common historical background. Here, I present one example: a folk tale handed down by the local people of the area where I was born.

I was born in the area called Kibi, located in the western part of Japan. Soja is the current dominant city in this part of Okayama prefecture. The ruins of a large fortress, a few kilometers across, called Kino Jo (devil's fortress), carry the following folk tale:

    • 1. The group was led by a prince of Kudara, an ancient state in the south of the Korean Peninsula. His wife was princess Azo.
    • 2. The Prince lived in Kino Jo and annoyed the nearby people with criminal acts; they gave him the name Devil Prince.
    • 3. The government decided to send troops, led by general Kibitsuhiko, to conquer the Devil Prince.
    • 4. After many difficulties, Kibitsuhiko finally won the battle.

Here is my hypothesis: "The Devil Prince was the leader of a group that immigrated from Kudara."

Following is a list of historical facts and findings.

    • 1. The shrine called Kibitsu Jinja enshrines Kibitsuhiko and keeps the tradition of prophecy using the Devil's kettle. This job is done by women from the nearby Azo village, located at the foot of Kino Jo.
    • 2. The state of Kudara existed from the 4th century to the 7th century in the southern part of the Korean Peninsula. It was crushed just after the allied forces of Kudara and Japan were defeated by the allied forces of Tang and Shilla in 660 AD.
    • 3. When Kudara was crushed, many people, including royal families, immigrated to Japan.
    • 4. Kudara's main business is thought to have been iron related.
    • 5. Around the 7th century, the iron ore around Kudara was depleted.
    • 6. Large-scale iron production in Japan started in the Chugoku region; as a result, much waste was dumped into the Seto Inland Sea, where the white shoreline became a symbol of Seto sea scenery. Later, the iron production area moved to the north, where much sand flooded the shoreline of the Japan Sea, creating the unique scenery of sand dunes and sandbars.
    • 7. Today, the remains of many iron furnaces dating back to the 6th and 7th centuries can be found around Azo, as well as many 5th-century Korean-style ceramics.
    • 8. The ruins of the Kino Jo fortress are a constant reminder of ancient Korean-style design.
    • 9. Ancient Japanese poems often have the prologue magane fuku, meaning iron furnacing, followed by the region name Kibi.

The large-scale Kino Jo, which never appears in the historical literature, is consistent with many of the facts listed above. Combined with historical tales of that era, here is my story of Kino Jo:

The ancient state of Kudara was born with its innovative iron-furnace technology. After several hundred years, the iron ore ran out. Carrying their superior iron-furnace technology, the population immigrated to western Japan, where they built a strong and wealthy community that sometimes competed with the Japanese government. Kino Jo was the headquarters of that community.
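The reasoning behind this story — many weak, independent clues that individually prove little but together become persuasive — can be sketched numerically as an accumulation of likelihood ratios, in the spirit of naive Bayes. The ratios below are invented purely for illustration; they are not measurements and not the author's method:

```python
# Illustrative sketch: each independent weak piece of evidence multiplies
# the odds of the hypothesis "the Devil Prince led immigrants from Kudara".
# All numbers are hypothetical, chosen only to show the mechanism.
likelihood_ratios = {
    "Azo village serves the Kibitsu shrine":   2.0,
    "Kudara fell in 660 AD":                   1.5,
    "royal refugees fled Kudara to Japan":     2.0,
    "Kudara's business was iron related":      1.8,
    "iron furnaces found around Azo":          3.0,
    "Korean-style fortress design at Kino Jo": 3.0,
}

prior_odds = 0.1  # start with the hypothesis considered unlikely
posterior_odds = prior_odds
for finding, ratio in likelihood_ratios.items():
    posterior_odds *= ratio  # independent evidence multiplies the odds

posterior_prob = posterior_odds / (1 + posterior_odds)
print(f"posterior probability ≈ {posterior_prob:.2f}")
```

No single ratio above exceeds 3, so each clue is genuinely weak, yet their product lifts an initially unlikely hypothesis above 90 percent. This mirrors the common-area argument of Figures 1 and 2.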

I once asked an acquaintance, "Mr. Kudara, are you a descendant of the Kudara royal family?" His answer was, "Yes, I am."

Epilogue

I was deeply impressed by the book The Seven Daughters of Eve by Bryan Sykes. My wife sent our tissue samples to Oxford Ancestors to have our mitochondrial DNA analyzed. Here is one result of the analysis, which tells a very interesting story of my wife's ancestor Sachi.

Sachi: the great great godmother
Your DNA sequence, which is very rare among native Europeans, is found predominantly in East Eurasia and the Americas. You are a direct maternal descendant of Sachi, whose clan is the most frequently encountered in Japan.
Sachi's descendants are found throughout Japan today, from Hokkaido in the north to Okinawa in the south. They probably first arrived early in the history of Japan, crossing from the Asian mainland about 12,000 years ago, but later migrations, for example those associated with the Yayoi culture, would also have included other descendants of Sachi.
Sachi's descendants have also been found in mainland Asia (Siberia, China, Taiwan, Korea, Vietnam and Thailand) and in both North and South America. Very occasionally her clan can be found in Europe. Her descendants reached the islands of the Philippines and Indonesia, from where they set sail to Madagascar. Further out into the Pacific, Sachi's descendants are found in Fiji and the islands of Micronesia.
From the features of this geographical distribution of her descendants we believe that Sachi herself lived in Central Asia, probably at least 20,000 years ago. We will become more certain of this as more research is done in the countries of Asia, including Japan.
Thank you for your custom and congratulations on finding your place on the world's largest family tree.
--- Oxford Ancestors ---

As the story reveals, most Japanese today may have an ancestral connection to the mainland if we look back 2000 years. During its long history, several waves of immigrant groups with different cultural backgrounds arrived on Japan's shores. One of these groups may have been the Kudara group, who settled around Kino Jo.

What do you think?

(Ej, 2007.02)

 
