
XGBoost Learning
HawardScut
I have started working, so I am sometimes too busy to reply promptly; thanks for your understanding.
(1) Getting Started with XGBoost
import xgboost as xgb
# read the data
dtrain = xgb.DMatrix('xxx/xgboost/demo/data/agaricus.txt.train')
dtest = xgb.DMatrix('xxx/xgboost/demo/data/agaricus.txt.test')
[18:30:02] 6513x127 matrix with 143286 e...
Original post · 2018-07-23 18:33:02
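
A minimal end-to-end sketch of what this first post builds toward, assuming local copies of the agaricus demo data that ships with the XGBoost repository (the 'xxx/' prefix above is a placeholder path); the parameter values are illustrative, not taken from the original post.

import xgboost as xgb

# assumed local copies of the agaricus demo files (LIBSVM text format)
dtrain = xgb.DMatrix('demo/data/agaricus.txt.train')
dtest = xgb.DMatrix('demo/data/agaricus.txt.test')

# a minimal binary-classification configuration
param = {'max_depth': 2, 'eta': 1, 'objective': 'binary:logistic'}
num_round = 2

bst = xgb.train(param, dtrain, num_round)   # train the booster
preds = bst.predict(dtest)                  # predicted probabilities in [0, 1]
bst.save_model('model.bin')                 # persist the trained model to disk

The DMatrix is XGBoost's internal data container; it parses the agaricus LIBSVM text files directly, which is why the load step above prints the matrix dimensions.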
(2) XGBoost's DART Booster
XGBoost usually combines a large number of regression trees with a small learning rate. In that setting, the trees added early on matter a great deal, while the trees added later matter little. Vinayak and Gilad-Bachrach proposed a new method that applies the dropout technique from the deep neural network community to boosted trees and reported better results in some cases. This is the DART booster, aimed at reducing overfitting.
import xgboost as xgb
# read the data
dtrain = xgb...
Original post · 2018-07-23 18:58:53
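
A sketch of switching the booster to DART, assuming the same agaricus demo data; the dropout-related values (rate_drop, skip_drop, sample_type, normalize_type) are illustrative choices, not the ones from the original post.

import xgboost as xgb

# assumed local copies of the agaricus demo files
dtrain = xgb.DMatrix('demo/data/agaricus.txt.train')
dtest = xgb.DMatrix('demo/data/agaricus.txt.test')

param = {
    'booster': 'dart',             # use DART instead of the default gbtree
    'max_depth': 5,
    'learning_rate': 0.1,
    'objective': 'binary:logistic',
    'sample_type': 'uniform',      # how trees are selected for dropping
    'normalize_type': 'tree',      # how dropped trees are weighted back in
    'rate_drop': 0.1,              # fraction of trees dropped each boosting round
    'skip_drop': 0.5,              # probability of skipping dropout in a round
}
num_round = 50
bst = xgb.train(param, dtrain, num_round)

# with DART, pass ntree_limit so prediction uses all trees deterministically
preds = bst.predict(dtest, ntree_limit=num_round)

Because trees are randomly dropped while new ones are fitted, later trees carry more of the signal than in plain gradient boosting, which is what gives DART its regularizing effect.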
(3) XGBoost Data Interfaces
import xgboost as xgb
# Data interfaces: (1) comma-separated value (CSV) files, (2) NumPy 2D arrays,
# (3) XGBoost binary buffer files
# 1. Load a CSV file into a DMatrix (train.csv is the file name, column 0 is the label)
# label_column specifies the index of the column containing the true la...
Original post · 2018-07-23 19:44:37
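
A sketch of the three input routes listed above; 'train.csv' and 'train.buffer' are hypothetical file names used only for illustration, and the NumPy data is random.

import numpy as np
import xgboost as xgb

# 1. CSV file: the URI options tell XGBoost the format and which column holds the label
dtrain_csv = xgb.DMatrix('train.csv?format=csv&label_column=0')

# 2. NumPy 2D array: pass the feature matrix and the labels separately
data = np.random.rand(5, 10)            # 5 rows, 10 features
label = np.random.randint(2, size=5)    # binary labels
dtrain_np = xgb.DMatrix(data, label=label)

# 3. XGBoost binary buffer: save once, then reload much faster than re-parsing text
dtrain_np.save_binary('train.buffer')
dtrain_buf = xgb.DMatrix('train.buffer')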
(4) XGBoost: Predicting with One Tree vs. All Trees
# predict using 1 tree vs. all trees
import numpy as np
import xgboost as xgb
### load data and do training
dtrain = xgb.DMatrix('/home/b8204/HawardData/xgboost/demo/data/agaricus.txt.train')
dtest = xgb.DMatrix('/home/b820...
Original post · 2018-07-23 20:30:54
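
A sketch of comparing a prediction made with only the first tree against one made with the full ensemble, assuming the agaricus demo data; ntree_limit is the argument used in 2018-era XGBoost (newer releases use iteration_range instead), and the parameter values are illustrative.

import numpy as np
import xgboost as xgb

# assumed local copies of the agaricus demo files
dtrain = xgb.DMatrix('demo/data/agaricus.txt.train')
dtest = xgb.DMatrix('demo/data/agaricus.txt.test')

param = {'max_depth': 2, 'eta': 1, 'objective': 'binary:logistic'}
bst = xgb.train(param, dtrain, num_boost_round=3)

pred_one = bst.predict(dtest, ntree_limit=1)   # use only the first tree
pred_all = bst.predict(dtest)                  # use all trees (the default)
print(np.abs(pred_one - pred_all).max())       # how much the later trees change the output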
(5) Fitting a Generalized Linear Model in XGBoost
import xgboost as xgb
### this script demonstrates how to fit a generalized linear model in xgboost
# basically, we are using a linear model instead of a tree for our boosters
##
dtrain = xgb.DMatrix(bas...
Original post · 2018-07-23 20:49:57
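
A sketch of the gblinear setup, assuming the agaricus demo data; alpha and lambda are the L1/L2 regularization weights on the linear coefficients, and the values shown are illustrative.

import xgboost as xgb

# assumed local copies of the agaricus demo files
dtrain = xgb.DMatrix('demo/data/agaricus.txt.train')
dtest = xgb.DMatrix('demo/data/agaricus.txt.test')

# booster='gblinear' replaces the trees with a regularized linear model
param = {
    'booster': 'gblinear',
    'objective': 'binary:logistic',
    'alpha': 0.0001,   # L1 regularization
    'lambda': 1,       # L2 regularization
}
bst = xgb.train(param, dtrain, num_boost_round=4)
preds = bst.predict(dtest)

Each boosting round adds another linear update, so the final model is still linear in the features, just fitted by gradient boosting rather than a single least-squares or logistic solve.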
(6) Cross-Validation with XGBoost
import numpy as np
import xgboost as xgb
### load data and do training
dtrain = xgb.DMatrix(basePath + 'data/agaricus.txt.train')
param = {'max_depth': 2, 'eta': 1, 'silent': 1, 'objective': 'binary:logisti...
Original post · 2018-07-23 21:35:55
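
A sketch of running built-in cross-validation with xgb.cv instead of a manual train/test split, assuming the agaricus demo training file; the fold count and metric choice are illustrative.

import xgboost as xgb

# assumed local copy of the agaricus demo training file
dtrain = xgb.DMatrix('demo/data/agaricus.txt.train')

param = {'max_depth': 2, 'eta': 1, 'objective': 'binary:logistic'}
num_round = 10

# 5-fold CV; returns per-round train/test metrics (a pandas DataFrame if pandas is installed)
cv_results = xgb.cv(param, dtrain, num_round, nfold=5, metrics={'error'}, seed=0)
print(cv_results)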
(7) Combining XGBoost with scikit-learn
Scikit-Learn API (Scikit-Learn wrapper interface for XGBoost)
import pickle
import xgboost as xgb
import numpy as np
from sklearn.model_selection import KFold, train_test_split, GridSearchCV
fr...
Original post · 2018-07-23 22:37:31
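
A sketch of the scikit-learn wrapper in use; load_digits stands in for whatever dataset the original post used, and the hyperparameter grid is illustrative.

import xgboost as xgb
from sklearn.datasets import load_digits
from sklearn.model_selection import KFold, GridSearchCV, cross_val_score

# a small toy dataset in place of the data from the original post
X, y = load_digits(n_class=2, return_X_y=True)

# XGBClassifier behaves like any other sklearn estimator...
clf = xgb.XGBClassifier(n_estimators=50, max_depth=3, learning_rate=0.1)

# ...so the usual sklearn tooling applies: cross-validation
kf = KFold(n_splits=3, shuffle=True, random_state=0)
print(cross_val_score(clf, X, y, cv=kf))

# and hyperparameter search
grid = GridSearchCV(xgb.XGBClassifier(),
                    {'max_depth': [2, 4], 'n_estimators': [50, 100]}, cv=3)
grid.fit(X, y)
print(grid.best_params_)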