Step 1: Model
Step 2: Goodness of Function
Step 3: Gradient Descent
y = b + w1x1 + w2x2 + …
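A minimal NumPy sketch of this linear model (the names x, w, b and their values are only illustrative):

import numpy as np

# Linear model y = b + w1*x1 + w2*x2 + ... written as a dot product
def linear_model(x, w, b):
    return b + np.dot(w, x)

x = np.array([1.0, 2.0])       # input features x1, x2
w = np.array([0.5, -0.3])      # weights w1, w2
b = 0.1                        # bias
print(linear_model(x, w, b))   # 0.1 + 0.5*1.0 - 0.3*2.0 = 0.0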
The total loss is the sum of the cross entropy over all training examples.
Use gradient descent to minimize it.
Use backpropagation to compute the partial derivatives.
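A sketch of these three pieces together in NumPy: the total loss is the sum of the per-example cross entropy, the gradient at the logits (what backpropagation yields for softmax + cross entropy) is p - y, and gradient descent updates the parameters. All names (X, Y, W, b, lr) are illustrative.

import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)    # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

X = np.random.randn(4, 3)                   # 4 examples, 3 features
Y = np.eye(2)[[0, 1, 1, 0]]                 # one-hot labels for 2 classes
W = np.zeros((3, 2)); b = np.zeros(2); lr = 0.1

P = softmax(X @ W + b)                      # predicted class probabilities
total_loss = -np.sum(Y * np.log(P))         # total loss = sum of per-example cross entropy

dZ = P - Y                                  # gradient at the logits (what backprop computes here)
W -= lr * (X.T @ dZ)                        # one gradient descent step
b -= lr * dZ.sum(axis=0)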
Use dropout to reduce overfitting.
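A sketch of (inverted) dropout, with an illustrative keep probability keep_prob: during training each unit is kept with probability keep_prob and the surviving activations are rescaled, so nothing has to change at test time.

import numpy as np

def dropout(a, keep_prob=0.8, training=True):
    if not training:
        return a                                        # no dropout at test time
    mask = (np.random.rand(*a.shape) < keep_prob) / keep_prob
    return a * mask                                     # drop some units and rescale the rest

print(dropout(np.ones((2, 5)), keep_prob=0.8))          # kept entries become 1/0.8 = 1.25, the rest 0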
When a neural network is very deep, the training result is not necessarily better, because of vanishing gradients.
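A quick worked example of the effect: the sigmoid's derivative is at most 0.25, so backpropagating through many sigmoid layers multiplies many small factors together and the gradient shrinks rapidly with depth.

for depth in (2, 5, 10, 20):
    print(depth, 0.25 ** depth)   # 0.0625, ~9.8e-4, ~9.5e-7, ~9.1e-13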
Maxout networks: the activation function is learnable rather than fixed.
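A sketch of one Maxout layer: each output unit takes the max over k linear "pieces", so the shape of the activation is learned from the data instead of being fixed in advance. The names W, b, k are illustrative.

import numpy as np

def maxout(x, W, b):
    # x: (d,), W: (k, m, d), b: (k, m) -> output: (m,)
    z = np.einsum('kmd,d->km', W, x) + b   # k linear pieces per output unit
    return z.max(axis=0)                   # element-wise max over the pieces

d, m, k = 3, 4, 2                          # input dim, output dim, pieces per unit
x, W, b = np.random.randn(d), np.random.randn(k, m, d), np.random.randn(k, m)
print(maxout(x, W, b).shape)               # (4,)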
stride: the number of pixels the filter moves at each step
strides = [1, x_move, y_move, 1]
padding = 'SAME': the output keeps the input's size (for stride 1); the parts of the filter that fall outside the image are filled with zeros
padding = 'VALID': no zero padding; the filter only covers positions fully inside the image, so the output is smaller than the input
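A small helper (following the TensorFlow convention these arguments come from) showing how 'SAME' vs 'VALID' padding and the stride determine the output size:

import math

def conv_output_size(in_size, filter_size, stride, padding):
    if padding == 'SAME':     # zero-pad so the output size depends only on the stride
        return math.ceil(in_size / stride)
    if padding == 'VALID':    # no padding: the filter stays fully inside the image
        return math.ceil((in_size - filter_size + 1) / stride)
    raise ValueError(padding)

print(conv_output_size(28, 5, 1, 'SAME'))   # 28
print(conv_output_size(28, 5, 1, 'VALID'))  # 24
print(conv_output_size(28, 5, 2, 'SAME'))   # 14
print(conv_output_size(28, 5, 2, 'VALID'))  # 12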
Convolution in practice (OpenCV): Gaussian blur, Sobel gradients, and Hough circle detection
import numpy as np
import cv2 as cv

# Read the image, resize it, and smooth it to suppress noise
img = cv.imread("tupian/171921-005.png")
img = cv.resize(img, (1024, 512))
img = cv.GaussianBlur(img, (3, 3), 0)
gray = cv.cvtColor(img, cv.COLOR_BGR2GRAY)   # HoughCircles needs an 8-bit single-channel image

# Sobel gradients in x and y, converted back to 8-bit and blended into an edge image
grad_x = cv.Sobel(gray, ddepth=cv.CV_16S, dx=1, dy=0)
grad_y = cv.Sobel(gray, ddepth=cv.CV_16S, dx=0, dy=1)
grad_x_8u = cv.convertScaleAbs(grad_x)
grad_y_8u = cv.convertScaleAbs(grad_y)
img_sobel = cv.addWeighted(grad_x_8u, 0.6, grad_y_8u, 0.4, 0)

# Hough circle detection on the grayscale image
circles = cv.HoughCircles(gray, cv.HOUGH_GRADIENT, 1, 100,
                          param1=100, param2=30, minRadius=5, maxRadius=300)
if circles is not None:
    print(circles.shape)   # (1, number_of_circles, 3): x, y, radius

cv.imshow("sobel", img_sobel)
cv.waitKey(0)
cv.destroyAllWindows()