Machine Learning Notes Part 1: Coefficient Optimization with SGD (Stochastic Gradient Descent) and a Code Implementation


Start by understanding gradient descent: https://blog.csdn.net/CSDNXXCQ/article/details/113871648

1. epochs (learning cycles): the model keeps running (learning), continually updating the coefficients so that they fit the data better, i.e. b = b - learning_rate * error * x
2. Loop over all the epochs (note: each epoch makes a full pass over the training set).
3. Every pass through the data tunes each coefficient.
Here, error = prediction - expected.
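The `make_prediction` used below is imported from a separate `Ay_hat` module that isn't shown in this post. A minimal sketch of what it presumably computes (an assumption: a simple linear model where `coefficients[0]` is the intercept and `row[-1]` is the target value) looks like this:

```python
def make_prediction(row, coefficients):
    # Assumed linear model: y_hat = b0 + b1*x1 + ... + bn*xn
    # row[-1] is the expected value y, so only row[:-1] are inputs.
    y_hat = coefficients[0]
    for i in range(len(row) - 1):
        y_hat += coefficients[i + 1] * row[i]
    return y_hat

print(make_prediction([1, 1], [0.4, 0.8]))  # 0.4 + 0.8 * 1, approximately 1.2
```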

from Ay_hat import make_prediction  # modular programming keeps this file lightweight


def using_sgd_method_to_calculate_coefficient(training_dataset, learning_rate, n_times_epoch):
	coefficients = [0.0 for i in range(len(training_dataset[0]))]
	for epoch in range(n_times_epoch):
		the_sum_of_error = 0  # accumulates the squared error over this epoch
		for row in training_dataset:
			y_hat = make_prediction(row, coefficients)
			error = y_hat - row[-1]  # error = prediction - expected
			the_sum_of_error += error ** 2  # squaring keeps errors from cancelling out
			coefficients[0] = coefficients[0] - learning_rate * error  # intercept update: x0 = 1
			for i in range(len(row) - 1):
				coefficients[i + 1] = coefficients[i + 1] - learning_rate * error * row[i]  # b = b - learning_rate * error * x

		print("This is epoch:", epoch, "the learning_rate we are using is:", learning_rate, "the error is:", the_sum_of_error)

	return coefficients




your_training_dataset = [[1, 1], [2, 3], [4, 3], [3, 2], [5, 5]]
test_coefficients = [0.4, 0.8]

your_model_learning_rate = 0.01
your_n_epoch = 50
your_coefficient = using_sgd_method_to_calculate_coefficient(your_training_dataset, your_model_learning_rate, your_n_epoch)
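To check the update rule by hand, here is the very first step the function takes on this dataset: the row is [1, 1] (input x = 1, expected y = 1), the coefficients start at 0.0, and the learning rate is 0.01, all taken from the code above.

```python
# One SGD step on the first row [1, 1], starting from coefficients [0.0, 0.0].
row = [1, 1]
b0, b1 = 0.0, 0.0
learning_rate = 0.01

y_hat = b0 + b1 * row[0]                  # prediction: 0.0
error = y_hat - row[-1]                   # 0.0 - 1 = -1.0
b0 = b0 - learning_rate * error           # 0.0 - 0.01 * (-1.0) = 0.01
b1 = b1 - learning_rate * error * row[0]  # 0.0 - 0.01 * (-1.0) * 1 = 0.01
print([b0, b1])  # [0.01, 0.01]
```

Both coefficients move up, because the prediction (0.0) was below the expected value (1), which matches the intuition behind b = b - learning_rate * error * x.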

