Implementing multiple linear regression with gradient descent in Python


Posted in Python on March 24, 2020

This article shows how to implement multiple linear regression with gradient descent in Python, for your reference. The details are as follows.
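For reference, the model being fitted is the linear hypothesis h(x) = theta0 + theta1*x1 + theta2*x2 + theta3*x3 + theta4*x4, where x1..x4 are the standardized AT, V, AP and RH columns and y is the standardized PE column, and the quantity minimized is the mean squared error J = (1 / (2m)) * sum_i (h(x_i) - y_i)^2 over the m observations; this is exactly what the cost() function below computes.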


import pandas as pd
import matplotlib.pyplot as plt
import numpy as np
# Read the data from csv (a raw string keeps the backslashes in the Windows path from being treated as escapes)
pga = pd.read_csv(r"D:\python3\data\Test.csv")
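# The CSV above is the author's local dataset with columns AT, V, AP, RH and PE.
# If that file is not at hand, a small synthetic frame with the same column names
# can stand in so the rest of the script still runs (purely an illustrative
# assumption, not the original data):
# rng = np.random.default_rng(0)
# pga = pd.DataFrame({c: rng.normal(size=200) for c in ["AT", "V", "AP", "RH"]})
# pga["PE"] = 2 * pga.AT - 1.5 * pga.V + 0.5 * pga.AP + 0.1 * pga.RH + rng.normal(scale=0.1, size=200)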
# Normalize (standardize) each column: (x - mean) / std
pga.AT = (pga.AT - pga.AT.mean()) / pga.AT.std()
pga.V = (pga.V - pga.V.mean()) / pga.V.std()
pga.AP = (pga.AP - pga.AP.mean()) / pga.AP.std()
pga.RH = (pga.RH - pga.RH.mean()) / pga.RH.std()
pga.PE = (pga.PE - pga.PE.mean()) / pga.PE.std()


def cost(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y):
    # Initialize the cost
    J = 0
    # The number of observations
    m = len(x1)
    # Loop over each observation
    for i in range(m):
        # Compute the hypothesis for this observation
        h = theta0 + x1[i] * theta1 + x2[i] * theta2 + x3[i] * theta3 + x4[i] * theta4
        # Accumulate the squared error
        J += (h - y[i]) ** 2
    # Average and normalize the cost
    J /= (2 * m)
    return J
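# A vectorized sketch equivalent to cost(): because x1..x4 and y are pandas Series,
# the per-observation loop can be replaced by array arithmetic and yields the same J.
def cost_vectorized(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y):
    h = theta0 + x1 * theta1 + x2 * theta2 + x3 * theta3 + x4 * theta4
    return ((h - y) ** 2).sum() / (2 * len(y))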


def partial_cost_theta4(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y):
    # Partial derivative of the cost with respect to theta4
    h = theta0 + x1 * theta1 + x2 * theta2 + x3 * theta3 + x4 * theta4
    diff = (h - y) * x4
    partial = diff.sum() / (x2.shape[0])
    return partial


def partial_cost_theta3(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y):
    # Partial derivative of the cost with respect to theta3
    h = theta0 + x1 * theta1 + x2 * theta2 + x3 * theta3 + x4 * theta4
    diff = (h - y) * x3
    partial = diff.sum() / (x2.shape[0])
    return partial


def partial_cost_theta2(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y):
    # Partial derivative of the cost with respect to theta2
    h = theta0 + x1 * theta1 + x2 * theta2 + x3 * theta3 + x4 * theta4
    diff = (h - y) * x2
    partial = diff.sum() / (x2.shape[0])
    return partial


def partial_cost_theta1(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y):
    # Partial derivative of the cost with respect to theta1
    h = theta0 + x1 * theta1 + x2 * theta2 + x3 * theta3 + x4 * theta4
    diff = (h - y) * x1
    partial = diff.sum() / (x2.shape[0])
    return partial

# Partial derivative of the cost with respect to theta0


def partial_cost_theta0(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y):
    h = theta0 + x1 * theta1 + x2 * theta2 + x3 * theta3 + x4 * theta4
    diff = (h - y)
    partial = diff.sum() / (x2.shape[0])
    return partial
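# All five partial derivatives share the same residual (h - y), so a combined
# sketch can compute them in one pass; here theta is assumed to be an
# array-like [theta0, theta1, theta2, theta3, theta4].
def all_partials(theta, x1, x2, x3, x4, y):
    h = theta[0] + x1 * theta[1] + x2 * theta[2] + x3 * theta[3] + x4 * theta[4]
    diff = h - y
    m = len(y)
    return np.array([diff.sum(),
                     (diff * x1).sum(),
                     (diff * x2).sum(),
                     (diff * x3).sum(),
                     (diff * x4).sum()]) / m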


def gradient_descent(x1, x2, x3, x4, y, alpha=0.1, theta0=0, theta1=0, theta2=0, theta3=0, theta4=0):
    max_epochs = 1000  # Maximum number of iterations
    counter = 0        # Iteration counter
    c = cost(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y)  # Initial cost
    costs = [c]        # Store the cost after every update
    # Set a convergence threshold to find where the cost function is minimized:
    # when the difference between the previous cost and the current cost is
    # smaller than this value, we say the parameters have converged and stop.
    convergence_thres = 0.000001
    cprev = c + 10
    theta0s = [theta0]
    theta1s = [theta1]
    theta2s = [theta2]
    theta3s = [theta3]
    theta4s = [theta4]
    # Keep updating until the costs converge or we hit the iteration limit
    while (np.abs(cprev - c) > convergence_thres) and (counter < max_epochs):
        cprev = c
        # Alpha times the partial derivative is the update step
        update0 = alpha * partial_cost_theta0(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y)
        update1 = alpha * partial_cost_theta1(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y)
        update2 = alpha * partial_cost_theta2(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y)
        update3 = alpha * partial_cost_theta3(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y)
        update4 = alpha * partial_cost_theta4(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y)
        # Update all thetas at the same time: the slopes must be computed at the
        # same set of hypothesised parameters, so the updates are applied only
        # after all partial derivatives have been evaluated.
        # "-=" performs gradient descent; "+=" would be gradient ascent.
        theta0 -= update0
        theta1 -= update1
        theta2 -= update2
        theta3 -= update3
        theta4 -= update4

        # Store the thetas
        theta0s.append(theta0)
        theta1s.append(theta1)
        theta2s.append(theta2)
        theta3s.append(theta3)
        theta4s.append(theta4)

        # Compute the new cost with the updated parameters
        c = cost(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y)

        # Store the current cost
        costs.append(c)
        counter += 1  # Count the iteration
    # Return the recorded costs (the commented-out line would also return the fitted thetas)
    # return {'theta0': theta0, 'theta1': theta1, 'theta2': theta2, 'theta3': theta3, 'theta4': theta4, "costs": costs}
    return {'costs': costs}
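# For comparison, the same loop written compactly with the helpers sketched
# above (cost_vectorized and all_partials); theta is a NumPy array holding the
# five parameters, everything else mirrors gradient_descent():
def gradient_descent_vec(x1, x2, x3, x4, y, alpha=0.1, max_epochs=1000, tol=1e-6):
    theta = np.zeros(5)
    costs = [cost_vectorized(*theta, x1, x2, x3, x4, y)]
    for _ in range(max_epochs):
        theta = theta - alpha * all_partials(theta, x1, x2, x3, x4, y)
        costs.append(cost_vectorized(*theta, x1, x2, x3, x4, y))
        # Stop once two successive costs differ by less than the tolerance
        if abs(costs[-2] - costs[-1]) < tol:
            break
    return theta, costs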

print("costs =", gradient_descent(pga.AT, pga.V,pga.AP,pga.RH,pga.PE)['costs'])
descend = gradient_descent(pga.AT, pga.V,pga.AP,pga.RH,pga.PE, alpha=.01)
plt.scatter(range(len(descend["costs"])), descend["costs"])
plt.show()

Plot of the cost (loss) against the number of iterations:

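As a sanity check, because the columns are standardized, the parameters found by gradient descent should approach the ordinary least-squares solution on the same data. A minimal sketch using NumPy's lstsq (it assumes the thetas are returned, for example via the commented-out return statement in gradient_descent):

X = np.column_stack([np.ones(len(pga)), pga.AT, pga.V, pga.AP, pga.RH])
theta_ls, *_ = np.linalg.lstsq(X, pga.PE, rcond=None)
print("closed-form thetas:", theta_ls)  # compare with theta0..theta4 from gradient_descent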

That is all for this article. I hope it helps with your studies, and I also hope everyone will continue to support 三水点靠木.
