Reposted: Installing Oracle Instant Client on Linux and Windows
The Oracle database software is huge; the database engine alone runs to several GB. The usual way of working is to install an Oracle database on a server machine and operate on its data from a client through tools such as PL/SQL Developer or sqlplus. Since Oracle 10g, OEM (Enterprise Manager) has been web-based, so the database can also be managed from a browser. Still, on Windows the client tool PL/SQL Developer is by far the most widely used, while on Lin…
2018-06-01 01:57:14
982
Original: Change Jupyter Notebook Default Directory
Change Jupyter Notebook Default Directory. There are three ways to change the default (i.e., start-up) directory of Jupyter Notebook. Solution #1: 1. Use the command line (aka cmd) and run the followin… (a hedged config sketch follows this entry)
2018-01-29 04:54:12
1007
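The excerpt above is cut off before the actual commands, so here is a minimal sketch of one common way to set the start-up directory, assuming the jupyter_notebook_config.py route and a made-up Windows path; it is not necessarily one of the three solutions the post goes on to list.

# First generate the config file from cmd:
#     jupyter notebook --generate-config
# Then edit ~/.jupyter/jupyter_notebook_config.py (the c object is predefined by Jupyter)
# and point the start-up directory at your own folder (D:\notebooks is only an example):
c.NotebookApp.notebook_dir = r'D:\notebooks'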
Original: How to use MathJax in Markdown
How to use MathJax in Markdown. When to Use MathJax? When writing a blog in Markdown, especially with GitHub Pages, you may have trouble displaying formulas. There are several ways[1] to do that…
2017-12-12 01:38:11
766
Original: How to Render a Hyperlink with Braces
How to Render a Hyperlink with Braces. Here is the code: ![Reorganized Dichotomies of B(4,3) - 1][11]  [11]: https://raw.githubusercontent.com/zhichengML/MarkdownPhoto/master/MachineLearning/Machine%20Learn…
2017-12-12 01:37:51
735
Original: Mini-Batch Gradient Descent
Mini-Batch Gradient Descent. 1. What is Mini-Batch Gradient Descent? Mini-Batch Gradient Descent is an algorithm between Batch Gradient Descent and Stochastic Gradient Descent. Concretely, it uses som… (a sketch follows this entry)
2017-12-12 01:37:16
871
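The excerpt breaks off mid-sentence; as a hedged illustration of the idea (updating on small random subsets rather than on the whole set or a single example), here is a minimal NumPy sketch on linear regression with made-up data. It is not the post's own code.

import numpy as np

def minibatch_gd(X, y, lr=0.1, batch_size=32, epochs=200):
    """Mini-batch gradient descent for linear regression (squared error)."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(epochs):
        idx = np.random.permutation(m)                     # reshuffle every epoch
        for start in range(0, m, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            grad = Xb.T @ (Xb @ theta - yb) / len(batch)   # gradient over the mini-batch only
            theta -= lr * grad
    return theta

# toy usage: recover the weights of y = 2*x0 + 3*x1
X = np.random.rand(200, 2)
y = X @ np.array([2.0, 3.0])
print(minibatch_gd(X, y))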
Original: Stochastic Gradient Descent
Stochastic Gradient Descent. 1. What is Stochastic Gradient Descent? Stochastic Gradient Descent (SGD) is similar to Batch Gradient Descent, but it uses only one example for each iteration, so that it makes… (a sketch follows this entry)
2017-12-12 01:36:48
1436
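For contrast with the mini-batch sketch above, here is the same toy linear-regression problem updated one example at a time; again a hedged illustration rather than the post's own code.

import numpy as np

def sgd(X, y, lr=0.05, epochs=100):
    """Stochastic gradient descent: update theta from ONE example per step."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(epochs):
        for i in np.random.permutation(m):
            error = X[i] @ theta - y[i]        # residual on a single example
            theta -= lr * error * X[i]         # gradient of (1/2) * error^2
    return theta

X = np.random.rand(200, 2)
y = X @ np.array([2.0, 3.0])
print(sgd(X, y))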
Original: Common LaTeX Errors
Common LaTeX Errors. 1. Differentiation: \sideset{^*}{'}\sum_{1\le i\le 100} A(i) \qquad \sum_{1\le i\le 100}\vphantom{\sum}^{'} A(i) \qquad \mathop{{\sum}'}_{1\le i\le 100} A(i) …
2017-12-05 13:26:36
3146
Original: Concave and Convex Function
Concave and Convex Function. What is a Concave Function? A concave function is a function where the line segment between any two points of the function lies below or on the graph.[1] Mathematically, as for co… (the defining inequality is written out below)
2017-11-30 21:33:47
3432
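Since the excerpt stops right before the mathematical statement, here is the standard defining inequality in its usual textbook form (not necessarily the exact notation the post uses):

f \text{ is concave on an interval } I \iff
f\bigl(\lambda x + (1-\lambda)y\bigr) \;\ge\; \lambda f(x) + (1-\lambda) f(y)
\quad \text{for all } x, y \in I,\ \lambda \in [0,1]

Reversing the inequality (\le) gives the definition of a convex function.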
Original: Batch Gradient Descent
Batch Gradient Descent. We use linear regression as an example to explain this optimization algorithm. 1. Formula. 1.1. Cost Function: we prefer the residual sum of squares to evaluate linear regression. J(\theta)… (a standard form of the cost and update rule is written out below)
2017-11-30 21:32:34
687
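The excerpt cuts off at J(\theta); for reference, the usual squared-error cost and the batch update it leads to are shown below (standard constants; the post's own scaling may differ):

J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr)^2,
\qquad
\theta_j := \theta_j - \alpha \, \frac{1}{m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr)\, x_j^{(i)}

where the sum runs over the entire training set on every update, which is what makes it "batch" gradient descent.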
Original: Get More Data
Get More Data. 1. Why Do We Need More Data? In many situations (a low-bias learning model), more data usually means better performance of the model. 2. When Do We Need More Data? Usually, we should plot the learnin…
2017-11-30 21:29:33
609
Original: Taylor Expansion Example
Taylor Expansion Example. Taylor expansion is a powerful tool for dealing with limits; some examples are shown below. Prerequisite: see more about how to calculate derivatives at this link and differential ru… (a worked example of this kind follows this entry)
2017-11-06 15:57:51
681
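As a hedged illustration of the kind of limit the post refers to (not necessarily one of its own examples), expanding \sin x to third order settles a limit that direct substitution cannot:

\lim_{x \to 0} \frac{\sin x - x}{x^{3}}
= \lim_{x \to 0} \frac{\bigl(x - \tfrac{x^{3}}{6} + o(x^{3})\bigr) - x}{x^{3}}
= \lim_{x \to 0} \Bigl( -\tfrac{1}{6} + \tfrac{o(x^{3})}{x^{3}} \Bigr)
= -\frac{1}{6}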
Original: Practical Derivatives
Practical Derivatives. 1. Power Function. Given: f\left(x\right) = x^a \ (a \in \mathbb{Q}). Proof: f'\left(x\right) = a \cdot x^{a-1}. Deduction: f'(x) = \lim_{\Delta x \to 0} \frac{f(x+\Delta x) - f(x)}{\Delta x} = \lim_{\Delta x \to…
2017-11-06 15:57:18
622
Original: What is a one-sided limit?
What is a one-sided limit? Introduction: One-sided Limits and Two-sided Limits. People are familiar with two-sided limits, shown below: \lim_{x \to a} f(x) = L \tag{1}. But here we ar… (a concrete one-sided example follows this entry)
2017-11-06 15:56:34
1010
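A small concrete example of the distinction, in case the truncated excerpt leaves it abstract (my own example, not necessarily the post's):

\lim_{x \to 0^{+}} \frac{1}{x} = +\infty,
\qquad
\lim_{x \to 0^{-}} \frac{1}{x} = -\infty

The two one-sided limits disagree, so the two-sided limit \lim_{x \to 0} \frac{1}{x} does not exist.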
Original: Function and Limit
Function and Limit. 1. Function. 1) What is a Function? In mathematics, a function is a relation between a set of inputs and a set of permissible outputs with the property that each input is related to exac…
2017-11-06 15:55:53
895
Original: Differentiation Rules
Differentiation Rules. 1. The Sum Rule. In calculus, the sum rule in differentiation is a method of finding the derivative of a function that is the sum of two other functions for which derivatives exist.[… (the rule is written out below)
2017-11-06 15:55:16
1169
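For reference, the rule the excerpt describes in words, in its usual symbolic form (standard statement, not copied from the post):

\frac{d}{dx}\bigl[f(x) + g(x)\bigr] = f'(x) + g'(x),
\qquad \text{e.g.}\quad \frac{d}{dx}\bigl(x^{2} + \sin x\bigr) = 2x + \cos x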
Original: tanh Function
tanh Function. 1. Introduction: to limit all data within the range of -1 to 1, compared with the sigmoid function, whose output range is [0, 1]. 2. Formula: the formula and derivative of tanh are f(z) = \tanh(… (a small NumPy sketch follows this entry)
2017-10-30 23:15:11
1060
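The formula is cut off above; as a hedged sketch (mirroring the Octave-style implementation the sigmoid post below uses, not the post's own code), tanh and its derivative in NumPy:

import numpy as np

def tanh(z):
    """f(z) = (e^z - e^-z) / (e^z + e^-z); equivalent to np.tanh, output in (-1, 1)."""
    return (np.exp(z) - np.exp(-z)) / (np.exp(z) + np.exp(-z))

def tanh_derivative(z):
    """f'(z) = 1 - tanh(z)^2."""
    return 1.0 - tanh(z) ** 2

z = np.linspace(-10, 10, 5)
print(tanh(z))               # end points are close to -1 and 1
print(tanh_derivative(z))    # largest at z = 0, vanishing at the ends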
Original: Mathematics - Matrix and Vector Transformation
Mathematics - Matrix and Vector Transformation. 1. When to Transform? 2. How to Transform? 1) Octave Implementation; 2) Python Implementation. 1. When to Transform…
2017-10-30 23:14:52
787
Original: Sigmoid Function
Sigmoid Function. 1. Introduction: to limit all data within the range of 0 to 1. 2. Formula: y = \frac{1}{1+e^{-x}}. 3. Implementation. 3.1 Octave: x = linspace(-10, 10, 10000); y = zeros(size(x, 1… (a Python counterpart follows this entry)
2017-10-30 23:14:13
821
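The Octave snippet is truncated above; a minimal Python counterpart on the same grid (my own sketch, not taken from the post):

import numpy as np

def sigmoid(x):
    """y = 1 / (1 + e^{-x}); squashes every input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-10, 10, 10000)   # same grid as the Octave line in the excerpt
y = sigmoid(x)
print(y.min(), y.max())           # close to 0 and close to 1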
Original: How to use MathJax in Markdown
How to use MathJax in Markdown. When to Use MathJax? When writing a blog in Markdown, especially with GitHub Pages, you may have trouble displaying formulas. There are several ways[1] to do that…
2017-10-30 23:10:30
603
Original: 0. Machine Learning Foundations - Table of Contents
Machine Learning Foundations - Table of Contents. This series of posts is my own set of study notes based on Professor Hsuan-Tien Lin's Machine Learning Foundations course at National Taiwan University. Why this course? Most people online recommend Professor Andrew Ng's machine learning course, but after comparing the two I think each has its strengths: Ng's course suits students with no background at all and does not go very deep into the algorithms, whereas Hsuan-Tien Lin's course follows…
2017-10-13 10:41:02
775
Original: 15. Machine Learning Foundations - Summary - Power of Three
Summary - Power of Three. 1. Three Related Fields: 1) Machine Learning vs. Data Mining; 2) Machine Learning vs. Artificial Intelligence; 3) Machine Learning vs. Statistics. 2. T…
2017-10-13 10:35:30
949
Original: 14. Machine Learning Foundations - How Can Machines Learn Better? - Three Learning Principles
How Can Machines Learn Better? - Three Learning Principles. 1. Occam's Razor. 2. Sampling Bias. 3. Data Snooping. Summary. Reference. This lecture mainly introduces principles for improving machine…
2017-10-13 10:34:47
717
Original: 11. How Can Machines Learn Better? - Overfitting and Solutions
How Can Machines Learn Better? - Overfitting and Solutions. 1. What is Overfitting? 2. Dealing with Overfitting: 1) Start from a Simple Model; 2) Dat…
2017-10-13 10:34:07
836
Original: 13. Machine Learning Foundations - How Can Machines Learn Better? - Validation
How Can Machines Learn Better? - Validation. 1. Model Selection Problem. 2. Validation. 3. Leave-One-Out Cross Validation. 4. V-Fold Cross Validation. Summary. Refe…
2017-10-13 10:30:32
727
Original: 12. Machine Learning Foundations - How Can Machines Learn Better? - Regularization
How Can Machines Learn Better? - Regularization. 1. Regularized Hypothesis Set. The main idea of regularization: reduce the hypothesis from a high-degree polynomial to a low-degree one, i.e., turn a complex model into a simple one. The high-degree polynomial function shown in Figure 1 clearly overfits, while the figure on the left shows the low-degree function obtained after regularization; and from the Hypothesis Set circles at the bottom of the figure one can see that the high-degree polynomial will con… (a small weight-decay sketch follows this entry)
2017-10-13 10:29:29
630
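As a hedged companion to the entry above: regularized linear regression of the kind discussed in the course amounts to adding a weight-decay term \lambda \|w\|^2 to the squared error, which has the closed-form solution sketched below (my own toy code, not the post's):

import numpy as np

def ridge_regression(X, y, lam=0.1):
    """Regularized (weight-decay / L2) linear regression:
    w_reg = (X^T X + lambda * I)^(-1) X^T y."""
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)

# toy usage: noisy linear data; a larger lam shrinks the weights toward zero
X = np.random.rand(100, 3)
y = X @ np.array([1.0, -2.0, 0.5]) + 0.01 * np.random.randn(100)
print(ridge_regression(X, y, lam=0.1))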
Original: 10. Machine Learning Foundations - How Can Machines Learn? - Nonlinear Transformation
How Can Machines Learn? - Nonlinear Transformation. 1. Quadratic Hypotheses. 2. Nonlinear Transform. 3. Price of Nonlinear Transform. 4. Structured Hypothe…
2017-10-13 10:27:26
770
Original: 9. Machine Learning Foundations - How Can Machines Learn? - Linear Models for Classification
How Can Machines Learn? - Linear Models for Classification. 1. Linear Models for Binary Classification: 1) Analysis of Three Linear Models; 2) Err…
2017-10-13 10:19:28
752
Original: 8. Machine Learning Foundations - How Can Machines Learn? - Logistic Regression
How Can Machines Learn? - Logistic Regression. 1. Introduction to Logistic Regression. 2. Comparison of Linear Regression, Logistic Classification and Logisti…
2017-10-13 09:13:33
846
Original: 7. Machine Learning Foundations - How Can Machines Learn? - Linear Regression
How Can Machines Learn? - Linear Regression. 1. What is Regression? 2. Linear Regression: 1) Introduction to Linear Regression; 2) Error Measurement of Linear Re…
2017-10-13 08:58:35
752
Original: 6. Machine Learning Foundations - Why Can Machines Learn? - Noise and Error
Why Can Machines Learn? - Noise and Error. 1. Noise: 1) Introduction to Noise; 2) The Influence of Noise. 2. Error: 1) Pointwise Error Measure; 2) Two Important Poi…
2017-10-12 12:37:07
632
Original: 5. Machine Learning Foundations - Why Can Machines Learn?
Why Can Machines Learn? 1. Preview of the Last Chapter. 2. VC Bound (Vapnik-Chervonenkis Bound) - An Upper Bound for the Hoeffding Inequality: 1) Introduction; 2) Growth Function; 3)…
2017-10-11 12:59:30
876
Original: LaTeX Math Symbols and Formula Templates
LaTeX Math Symbols and Formula Templates. Revision history: 2017-09-30, added the Greek alphabet table. This post collects all of LaTeX's math symbols and formula templates for easy lookup; if you find mistakes, please point them out! 1. Greek alphabet (sorted alphabetically), listed as letter/command pairs: α \alpha, β \beta, χ \chi, δ \d…
2017-09-30 19:51:20
6613
Original: 4. Machine Learning Foundations - When Can Machines Learn? - Feasibility of Learning
When Can Machines Learn? - Feasibility of Learning. 1. Learning is Impossible? 2. Probability to the Rescue: 1) Hoeffding Inequality; 2) Connection Between Hoeff…
2017-09-30 19:13:37
948
Original: 3. Machine Learning Foundations - When Can Machines Learn? - Types of Learning
When Can Machines Learn? - Types of Learning. 1. Learning with Different Output Spaces: 1) Binary Classification; 2) Multiclass Classification; 3) Regression; 4)…
2017-09-30 19:12:18
662
Original: 2. Machine Learning Foundations - When Can Machines Learn? - Learning to Answer Yes or No
When Can Machines Learn? - Learning to Answer Yes or No. 1. Perceptron Hypothesis Set: 1) Perceptron Hypothesis Set; 2) Formalizing the Perceptron Hypothesis Set…
2017-09-29 14:35:33
829
Original: 1. Machine Learning Foundations - When Can Machines Learn? - The Learning Problem
When Can Machines Learn? - The Learning Problem. 1. The Learning Problem: 1) Human Learning and Machine Learning: ① Human Learning; ② Machine Learning; ③ S…
2017-09-29 00:33:02
959
Original: 3. Linear Model Performance Analysis - Confusion Matrix
1. What is a confusion matrix? In artificial intelligence, a confusion matrix is a visualization tool used especially in supervised learning; in unsupervised learning the analogue is usually called a matching matrix. In image-accuracy assessment it is mainly used to compare classification results with measured ground-truth values, so that the accuracy of a classification can be displayed inside a single matrix. The confusion matrix is computed by comparing the position and class of every ground-truth pixel with the corresponding position and class in the classified image[1]. By analysing the confusion matrix we can obtain: * TPR (True Posi… (a small computation sketch follows this entry)
2017-09-14 17:15:20
4246
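The list of derived rates is truncated above; here is a minimal sketch of how the four cells and the TPR mentioned there are computed for binary labels (my own toy example, not the post's data):

import numpy as np

def confusion_matrix_binary(y_true, y_pred):
    """Return (TP, FP, FN, TN) for binary labels in {0, 1}."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    return tp, fp, fn, tn

y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
tp, fp, fn, tn = confusion_matrix_binary(y_true, y_pred)
print("TPR (recall):", tp / (tp + fn))   # true positive rate, as in the excerpt
print("FPR:", fp / (fp + tn))            # false positive rate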
Original: 2. Data Splitting
1. Why split the data? By splitting the contents of a dataset into a train set and a test set, we train the model on the train set and then measure its performance on the test set; only if it passes that test do we consider putting it into real use. 2. Things to watch when splitting data. 2.1. Keep the split random: if the data were split according to some fixed rule, the trained model would itself become "patterned" and would fail as soon as it met an unusual value. 2.2. Make sure the train set… (a random-split sketch follows this entry)
2017-09-14 16:42:12
1236
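As a hedged illustration of the random split the entry above insists on (made-up function name and toy data, not the post's code):

import numpy as np

def train_test_split(X, y, test_ratio=0.2, seed=None):
    """Shuffle the indices randomly, then cut off the last test_ratio part as the test set."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))             # randomness, as the post stresses
    cut = int(len(X) * (1 - test_ratio))
    train_idx, test_idx = idx[:cut], idx[cut:]
    return X[train_idx], X[test_idx], y[train_idx], y[test_idx]

X = np.arange(20).reshape(10, 2)
y = np.arange(10)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_ratio=0.3, seed=0)
print(X_train.shape, X_test.shape)            # (7, 2) and (3, 2)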
Original: 1. Data Preprocessing - Data Normalization and Standardization
Data Preprocessing - Data Normalization and Standardization. 1. Normalization. 1.1. Purpose: map the data into the interval [0, 1] and turn dimensional quantities into dimensionless ones. 1.2. Algorithms. 1.2.1. Min-max normalization: Y = \frac{X - X_{min}}{X_{max} - X_{min}}, which maps the values of X into [0, 1], because X < Xmax always holds, so the numerator (X - Xmin) < the denominator (Xmax - Xmi… (a NumPy sketch follows this entry)
2017-09-11 19:43:25
8577
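A minimal NumPy version of the min-max formula above (my own sketch; note it assumes Xmax > Xmin so the denominator is non-zero):

import numpy as np

def min_max_normalize(x):
    """Y = (X - Xmin) / (Xmax - Xmin), mapping the values into [0, 1]."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

data = np.array([3.0, 8.0, 5.0, 10.0, 3.0])
print(min_max_normalize(data))   # 3 -> 0.0, 10 -> 1.0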
Original: 0. Outline
Machine Learning - Mathematical Foundations. This page mainly collects links and arranges them into an outline, to help me organize the material and to make it easier for others to look things up; it will be updated from time to time. If any image or formula used in this column is not original, its source is noted below it; in case of any infringement, contact me and I will remove it.
2017-09-11 19:05:28
637
《机器学习实战》 (Machine Learning in Action) - Chinese edition + English edition + source code
2017-09-25
图解机器学习 (Illustrated Machine Learning)
2017-09-25
Several Raspberry Pi (RPi) tutorial e-books
2015-07-30
Qt5开发及实例 (Qt5 Development and Examples) - book PPT, PDF, and companion disc materials
2015-07-23
Qt5开发及实例 (Qt5 Development and Examples) - book PPT, PDF, companion disc materials
2015-07-23