
Machine Learning
xlliu0226
Student
Rank
PageRank: http://en.wikipedia.org/wiki/Page_rank ; http://download.youkuaiyun.com/source/376367 BM25: http://download.youkuaiyun.com/source/376362 RankBoost: http://download.youkuaiyun.com/source/376396 PRanking: htt… [Original] 2008-03-10 16:56:00 · 469 reads · 0 comments
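The links above only point at references; as a minimal sketch of the idea behind PageRank, here is a power-iteration toy in Python. The damping factor 0.85 and the three-node link graph are my own illustrative assumptions, not taken from the linked post.

```python
def pagerank(links, damping=0.85, iters=100):
    """Power iteration on an out-link dictionary {node: [targets]}."""
    nodes = list(links)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        # Each node keeps the teleport mass, plus shares from in-links.
        new = {u: (1.0 - damping) / n for u in nodes}
        for u in nodes:
            out = links[u]
            for v in out:
                new[v] += damping * rank[u] / len(out)
        rank = new
    return rank

# Hypothetical toy graph: A links to B and C, B to C, C back to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
# C receives links from both A and B, so it ends up ranked highest.
```

Since every node here has at least one out-link, the ranks stay a probability distribution (they sum to 1), which is the usual sanity check on an implementation.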
MiniMax
Wiki ref: http://en.wikipedia.org/wiki/Minimax Minimax (sometimes minmax) is a method in decision theory for minimizing the maximum possible loss. Alternatively, it can be thought of as maximizing th… [Reposted] 2008-03-24 18:05:00 · 2060 reads · 0 comments
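The minimizing-the-maximum-loss idea can be shown on an explicit game tree; the payoff values below are invented for illustration, not from the post.

```python
def minimax(node, maximizing):
    """Return the minimax value of a game tree given as nested lists;
    leaves are numeric payoffs for the maximizing player."""
    if isinstance(node, (int, float)):
        return node
    values = [minimax(child, not maximizing) for child in node]
    return max(values) if maximizing else min(values)

# Two moves for the maximizer, each answered by two opponent replies.
tree = [[3, 5], [2, 9]]
best = minimax(tree, maximizing=True)
# Opponent minimizes each branch: min(3,5)=3, min(2,9)=2; maximizer takes 3.
```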
Genetic Programming
One of the central challenges of computer science is to get a computer to do what needs to be done, without telling it how to do it. Genetic programming addresses this challenge by providing a… [Reposted] 2008-03-17 11:53:00 · 2452 reads · 0 comments
Simulated Annealing
Wiki ref: http://en.wikipedia.org/wiki/Simulated_annealing Simulated annealing (SA) is a generic probabilistic meta-algorithm for the global optimization problem, namely locating a good approximation… [Reposted] 2008-03-18 11:00:00 · 1402 reads · 0 comments
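A minimal sketch of the accept-worse-moves-with-cooling idea, minimizing a toy objective; the schedule, step size, and objective are arbitrary choices of mine, not from the linked post.

```python
import math
import random

def anneal(f, x0, temp=10.0, cooling=0.95, steps=500, seed=0):
    """Simulated annealing on a 1-D function; returns best point seen."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    for _ in range(steps):
        cand = x + rng.uniform(-1.0, 1.0)
        fc = f(cand)
        # Always accept improvements; accept worse moves with
        # Boltzmann probability exp(-delta/T), which shrinks as T cools.
        if fc < fx or rng.random() < math.exp((fx - fc) / temp):
            x, fx = cand, fc
        if fx < best_f:
            best_x, best_f = x, fx
        temp *= cooling
    return best_x

best = anneal(lambda x: (x - 2) ** 2, x0=-5.0)
# Starting far from the minimum at x = 2, the search should end much closer.
```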
VC Dimension && Hill Climbing
Quoted from Wiki. Vapnik-Chervonenkis Dimension: In computational learning theory, the VC dimension (for Vapnik-Chervonenkis dimension) is a measure of the capacity of a statistical classification al… [Reposted] 2008-03-13 15:27:00 · 1708 reads · 0 comments
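The hill-climbing half of this post's title can be sketched in a few lines: repeatedly move to the best neighbour until no neighbour improves. The integer objective and the two-neighbour move set are my own toy example.

```python
def hill_climb(f, x):
    """Greedy discrete hill climbing over the integers, maximizing f."""
    while True:
        best = max((x - 1, x + 1), key=f)
        if f(best) <= f(x):
            return x          # no neighbour improves: a local maximum
        x = best

# A concave objective, so the local maximum found is also global.
peak = hill_climb(lambda x: -(x - 7) ** 2, x=0)
```

On multimodal objectives the same loop gets stuck on whichever local peak is nearest, which is exactly the weakness simulated annealing (above) is designed to soften.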
K-Means
The K-MEANS algorithm (essentially a form of the EM algorithm): k-means takes an input k and partitions n data objects into k clusters so that similarity within a cluster is high while similarity between clusters is low. Cluster similarity is computed against a "centre object" (centroid) obtained as the mean of the objects in each cluster. The k-means procedure works as follows: first, arbitrarily choose k of the n data objects as the initial cluster cen… [Original] 2008-03-10 17:21:00 · 748 reads · 0 comments
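The steps in the excerpt (pick k centres, assign each point to the nearest centre, recompute the means) can be sketched on 1-D data; the data set below is made up for illustration.

```python
def kmeans(points, k, iters=20):
    """Plain k-means on a list of 1-D points; returns sorted centres."""
    # Use the first k points as initial centres for reproducibility
    # (the classic algorithm chooses them arbitrarily).
    centres = points[:k]
    for _ in range(iters):
        # Assignment step: each point joins its nearest centre's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: (p - centres[i]) ** 2)
            clusters[nearest].append(p)
        # Update step: each centre becomes the mean of its cluster.
        centres = [sum(c) / len(c) if c else centres[i]
                   for i, c in enumerate(clusters)]
    return sorted(centres)

data = [1.0, 1.2, 0.8, 10.0, 10.4, 9.6]
centres = kmeans(data, k=2)
# Two well-separated groups, so the centres settle near 1.0 and 10.0.
```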
SVM Tutorial
Wiki Ref: http://en.wikipedia.org/wiki/Support_vector_machine Source Code and Help Docs: http://download.youkuaiyun.com/source/377387 SMO Algorithm: http://d.download.youkuaiyun.com/down/377392/xlliu0226 SV… [Original] 2008-03-10 16:27:00 · 1712 reads · 0 comments
Instance-based Learning: K-Nearest Neighbour Algorithm && Radial Basis Function
K-Nearest Neighbour Algorithm (Wiki ref): http://en.wikipedia.org/wiki/K-nearest_neighbor_algorithm Radial Basis Function (Wiki ref): http://en.wikipedia.org/wiki/Radial_basis_function Instance-Based… [Original] 2008-03-14 10:33:00 · 694 reads · 0 comments
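A bare-bones k-nearest-neighbour classifier fits in a few lines; the 2-D training points and labels below are invented for illustration.

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Majority vote among the k training points nearest to query.
    train is a list of ((x, y), label) pairs."""
    dist = lambda a, b: (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    nearest = sorted(train, key=lambda t: dist(t[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [((0, 0), "red"), ((0, 1), "red"), ((1, 0), "red"),
         ((5, 5), "blue"), ((5, 6), "blue"), ((6, 5), "blue")]
label = knn_predict(train, (1, 1))
# (1, 1) sits inside the "red" cluster, so all 3 nearest neighbours vote red.
```

A radial basis function network softens this idea: instead of a hard k-vote, every training point contributes with a weight that decays with distance (e.g. a Gaussian kernel).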
Naive Bayes
Wiki: http://en.wikipedia.org/wiki/Naive_bayes Algorithm from Christopher D. Manning's Information Retrieval: TRAINBERNOULLINB(C, D) — 1 V ← EXTRACTVOCABULARY(D); 2 N ← COUNTDOCS(D); 3 for each… [Original] 2008-03-12 10:11:00 · 657 reads · 0 comments
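In the spirit of the TRAINBERNOULLINB pseudocode, here is a small Bernoulli naive Bayes sketch: estimate P(class) and P(term present | class) with add-one smoothing, then classify by summed log-probabilities. The toy documents are my own, not Manning's worked example.

```python
import math

def train(docs):
    """docs: list of (set_of_terms, class_label) pairs."""
    vocab = set().union(*(terms for terms, _ in docs))
    classes = {c for _, c in docs}
    prior, condprob = {}, {}
    for c in classes:
        in_c = [terms for terms, label in docs if label == c]
        prior[c] = len(in_c) / len(docs)
        for t in vocab:
            # Document frequency of t in class c, with add-one smoothing.
            df = sum(1 for terms in in_c if t in terms)
            condprob[t, c] = (df + 1) / (len(in_c) + 2)
    return vocab, prior, condprob

def classify(vocab, prior, condprob, terms):
    def score(c):
        s = math.log(prior[c])
        for t in vocab:   # Bernoulli model: absent terms also contribute
            p = condprob[t, c]
            s += math.log(p if t in terms else 1 - p)
        return s
    return max(prior, key=score)

docs = [({"chinese", "beijing"}, "cn"), ({"chinese", "shanghai"}, "cn"),
        ({"tokyo", "japan"}, "jp")]
model = train(docs)
guess = classify(*model, {"chinese", "tokyo"})
```

Note the Bernoulli-specific detail: the classifier loops over the whole vocabulary, so terms *missing* from the document also pull the score, unlike the multinomial variant.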
PCA Programming (Matlab)
The method of extracting the principal components (the highest-energy components) from a mixed signal is called principal component analysis (PCA). Minor components (MCs), in contrast to principal components (PCs), are the lowest-energy components of the mixed signal, usually regarded as unimportant or noise-related; the method of determining them is called minor component analysis (MCA). PCA can be used to reduce the dimensionality of the feature space, determine linear combinations of variables, select the most useful variables, perform variable identification, ide… [Reposted] 2008-03-12 10:01:00 · 2313 reads · 2 comments
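The original post's Matlab code did not survive the copy, so here is a tiny 2-D PCA sketch in Python instead: centre the data, build the covariance matrix, and take its leading eigenvector as the first principal component. The 2x2 eigenvalue is computed in closed form; larger problems would use a proper eigensolver. The data points are invented.

```python
import math

def first_component(points):
    """First principal component (unit vector) of 2-D points."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    # Entries of the 2x2 covariance matrix [[cxx, cxy], [cxy, cyy]].
    cxx = sum((x - mx) ** 2 for x, _ in points) / n
    cyy = sum((y - my) ** 2 for _, y in points) / n
    cxy = sum((x - mx) * (y - my) for x, y in points) / n
    # Largest eigenvalue of a symmetric 2x2 matrix, in closed form.
    lam = (cxx + cyy) / 2 + math.sqrt(((cxx - cyy) / 2) ** 2 + cxy ** 2)
    # Corresponding eigenvector (valid when cxy != 0), normalized.
    vx, vy = cxy, lam - cxx
    norm = math.hypot(vx, vy)
    return vx / norm, vy / norm

pts = [(0, 0), (1, 1), (2, 2), (3, 3.2)]
direction = first_component(pts)
# Data lie nearly on the diagonal, so direction is close to (1,1)/sqrt(2).
```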
Singleton Pattern (singleton)
Singleton pattern (singleton), originally posted 2008-02-22 09:47:59. Original link: http://user.qzone.qq.com/592433424/blog/1203644879 [Original] 2008-03-13 16:35:00 · 447 reads · 0 comments
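The body of this post was lost in the copy, so as a stand-in, here is a minimal lazy singleton in Python; this is an illustration of the pattern, not the author's original code.

```python
class Singleton:
    """Lazily creates one shared instance and returns it thereafter."""
    _instance = None

    def __new__(cls):
        if cls._instance is None:
            # First call: create and cache the single instance.
            cls._instance = super().__new__(cls)
        return cls._instance

a = Singleton()
b = Singleton()
# Every construction returns the same cached object, so a is b.
```

In a multithreaded program this naive version needs a lock around the creation check; many languages instead get thread safety for free from static initialization.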
Lagrange Multiplier && KKT Condition
Wiki Ref: http://en.wikipedia.org/wiki/Lagrange_multipliers General formulation: the weak Lagrangian principle. Denote the objective function by f and let the constraints be given by gk = 0, perhaps by movi… [Reposted] 2008-03-12 11:42:00 · 1681 reads · 0 comments
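As a worked illustration of the method (my own toy example, not from the post): maximize $f(x, y) = xy$ subject to the constraint $g(x, y) = x + y - 1 = 0$. Form the Lagrangian

$$\Lambda(x, y, \lambda) = xy + \lambda\,(x + y - 1)$$

and set its partial derivatives to zero:

$$\frac{\partial \Lambda}{\partial x} = y + \lambda = 0, \qquad \frac{\partial \Lambda}{\partial y} = x + \lambda = 0, \qquad \frac{\partial \Lambda}{\partial \lambda} = x + y - 1 = 0.$$

The first two equations give $x = y = -\lambda$, and the constraint then forces $x = y = \tfrac{1}{2}$, so the constrained maximum is $f = \tfrac{1}{4}$. The KKT conditions generalize exactly this stationarity system to inequality constraints.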
Unsupervised, Semi-Supervised, Supervised Learning
Semi-Supervised: In computer science, semi-supervised learning is a class of machine learning techniques that make use of both labeled and unlabeled data for training - typically a small amount of la… [Reposted] 2008-03-11 17:54:00 · 2050 reads · 0 comments
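One simple way the labeled-plus-unlabeled idea plays out is self-training: label the unlabeled points that sit close to already-labeled ones, and repeat. The 1-D data, labels, and distance threshold below are all invented for illustration.

```python
def self_train(labelled, unlabelled, threshold=2.0):
    """Iteratively adopt the label of the nearest labelled neighbour
    for any unlabelled point within threshold; returns (x, label) pairs."""
    labelled = list(labelled)
    changed = True
    while changed:
        changed = False
        for x in list(unlabelled):
            nx, nlabel = min(labelled, key=lambda p: abs(p[0] - x))
            if abs(nx - x) <= threshold:
                labelled.append((x, nlabel))   # propagate the label
                unlabelled.remove(x)
                changed = True
    return labelled

labelled = [(0.0, "a"), (10.0, "b")]      # the small labelled set
unlabelled = [1.5, 3.0, 8.5, 7.0]          # the larger unlabelled set
result = self_train(labelled, unlabelled)
# Labels chain outward: 1.5 takes "a" from 0.0, then 3.0 takes "a" from 1.5, etc.
```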
PCA Tutorial
Original: http://en.wikipedia.org/wiki/Principal_component_analysis Example: http://www.cs.otago.ac.nz/cosc453/student_tutorials/principal_components.pdf ; http://download.csdn.ne… [Reposted] 2008-03-11 09:56:00 · 1836 reads · 0 comments
Spring Model
Exploring Spring Models, by Gustavo Oliveira, Gamasutra, October 5, 2001. URL: http://www.gamasutra.com/20011005/oliveira_01.htm The first time I implemented a spring model, it fascinated me for hours… [Reposted] 2008-04-03 14:24:00 · 1403 reads · 0 comments
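The core of a spring model is just Hooke's law stepped through time; a minimal damped-spring integrator is sketched below. The constants and the semi-implicit Euler step are my own choices, not taken from the Gamasutra article.

```python
def step(x, v, k=10.0, damping=0.5, mass=1.0, dt=0.01):
    """One semi-implicit Euler step of a damped spring anchored at x = 0."""
    force = -k * x - damping * v        # Hooke's law plus viscous damping
    v += force / mass * dt              # update velocity first...
    x += v * dt                         # ...then position (more stable)
    return x, v

# Release the mass from x = 1 at rest and simulate 20 seconds.
x, v = 1.0, 0.0
for _ in range(2000):
    x, v = step(x, v)
# The damping term bleeds off energy, so the oscillation decays toward x = 0.
```

Updating velocity before position (semi-implicit rather than explicit Euler) is the usual trick that keeps toy spring simulations from blowing up at game-sized time steps.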