Implementing BayesianRegression with NumPy


```python
import numpy as np
from scipy.stats import chi2, multivariate_normal
from mlfromscratch.utils import mean_squared_error, train_test_split, polynomial_features


class BayesianRegression(object):
    """Bayesian regression model. If poly_degree is specified the features will
    be transformed with a polynomial basis function, which allows for polynomial
    regression. Assumes a Normal prior and likelihood for the weights and a scaled
    inverse chi-squared prior and likelihood for the variance of the weights.

    Parameters:
    n_draws: float
        The number of simulated draws from the posterior of the parameters.
    mu0: array
        The mean values of the prior Normal distribution of the parameters.
    omega0: array
        The precision matrix of the prior Normal distribution of the parameters.
    nu0: float
        The degrees of freedom of the prior scaled inverse chi squared distribution.
    sigma_sq0: float
        The scale parameter of the prior scaled inverse chi squared distribution.
    poly_degree: int
        The polynomial degree that the features should be transformed to. Allows
        for polynomial regression.
    cred_int: float
        The credible interval (ETI in this impl.). 95 => 95% credible interval of the
        posterior of the parameters.

    Reference:
        https://github.com/mattiasvillani/BayesLearnCourse/raw/master/Slides/BayesLearnL5.pdf
    """
    def __init__(self, n_draws, mu0, omega0, nu0, sigma_sq0, poly_degree=0, cred_int=95):
        self.w = None
        self.n_draws = n_draws
        self.poly_degree = poly_degree
        self.cred_int = cred_int

        # Prior parameters
        self.mu0 = mu0
        self.omega0 = omega0
        self.nu0 = nu0
        self.sigma_sq0 = sigma_sq0

    # Allows for simulation from the scaled inverse chi squared
    # distribution. Assumes the variance is distributed according to
    # this distribution.
    # Reference:
    #   https://en.wikipedia.org/wiki/Scaled_inverse_chi-squared_distribution
    def _draw_scaled_inv_chi_sq(self, n, df, scale):
        X = chi2.rvs(size=n, df=df)
        sigma_sq = df * scale / X
        return sigma_sq

    def fit(self, X, y):
        # If polynomial transformation
        if self.poly_degree:
            X = polynomial_features(X, degree=self.poly_degree)

        n_samples, n_features = np.shape(X)
        X_X = X.T.dot(X)

        # Least squares approximation of beta
        beta_hat = np.linalg.pinv(X_X).dot(X.T).dot(y)

        # The posterior parameters can be determined analytically since we assume
        # conjugate priors for the likelihoods.

        # Normal prior / likelihood => Normal posterior
        mu_n = np.linalg.pinv(X_X + self.omega0).dot(X_X.dot(beta_hat) + self.omega0.dot(self.mu0))
        omega_n = X_X + self.omega0

        # Scaled inverse chi-squared prior / likelihood => Scaled inverse chi-squared posterior
        nu_n = self.nu0 + n_samples
        sigma_sq_n = (1.0 / nu_n) * (self.nu0 * self.sigma_sq0 + \
            (y.T.dot(y) + self.mu0.T.dot(self.omega0).dot(self.mu0) - mu_n.T.dot(omega_n.dot(mu_n))))

        # Simulate parameter values for n_draws
        beta_draws = np.empty((self.n_draws, n_features))
        for i in range(self.n_draws):
            sigma_sq = self._draw_scaled_inv_chi_sq(n=1, df=nu_n, scale=sigma_sq_n)
            beta = multivariate_normal.rvs(size=1, mean=mu_n[:, 0], cov=sigma_sq * np.linalg.pinv(omega_n))
            # Save parameter draws
            beta_draws[i, :] = beta

        # Select the mean of the simulated variables as the ones used to make predictions
        self.w = np.mean(beta_draws, axis=0)
```
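The helper `_draw_scaled_inv_chi_sq` relies on the fact that if `X ~ χ²(ν)`, then `ν·s²/X` follows a scaled inverse chi-squared distribution with ν degrees of freedom and scale `s²`, whose mean is `ν·s²/(ν−2)` for ν > 2. A standalone sanity check of that sampling trick (the function and variable names here are illustrative, not part of the original class):

```python
import numpy as np
from scipy.stats import chi2


def draw_scaled_inv_chi_sq(n, df, scale, random_state=None):
    # If X ~ chi2(df), then df * scale / X ~ Scale-inv-chi2(df, scale)
    X = chi2.rvs(size=n, df=df, random_state=random_state)
    return df * scale / X


draws = draw_scaled_inv_chi_sq(n=200_000, df=10, scale=2.0, random_state=0)
# Theoretical mean: df * scale / (df - 2) = 10 * 2 / 8 = 2.5
print(draws.mean())
```

With 200,000 draws the sample mean should land very close to the theoretical value of 2.5, and every draw is strictly positive, as a variance must be.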
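The conjugate Normal update inside `fit` can also be checked in isolation: with a nearly flat prior (Ω₀ ≈ 0), the posterior mean `mu_n = (XᵀX + Ω₀)⁻¹(XᵀX β̂ + Ω₀ μ₀)` should collapse to the least-squares estimate β̂. A minimal self-contained sketch, where the synthetic data and variable names are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 2
X = rng.normal(size=(n, d))
true_beta = np.array([2.0, -1.0])
y = X @ true_beta + 0.1 * rng.normal(size=n)

X_X = X.T @ X
beta_hat = np.linalg.pinv(X_X) @ X.T @ y      # least-squares estimate

mu0 = np.zeros(d)
omega0 = 1e-6 * np.eye(d)                     # nearly flat (uninformative) prior
mu_n = np.linalg.pinv(X_X + omega0) @ (X_X @ beta_hat + omega0 @ mu0)

# Under a nearly flat prior, the posterior mean matches the OLS solution,
# which in turn is close to the true coefficients for this low-noise data.
print(mu_n, beta_hat)
```

With an informative prior (larger Ω₀), `mu_n` would instead be pulled from β̂ toward μ₀, which is exactly the regularizing effect the precision matrix encodes.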
