
Uni-Layer Neural Network | Linear Algebra using Python

A neural network is a powerful tool widely used in Machine Learning, and neural networks are fundamentally mathematical objects. We will use our basics of Linear Algebra and NumPy to understand the foundations of Machine Learning with Neural Networks.

This article is a showcase of an application of Linear Algebra, and Python's wide set of supporting libraries is a large part of our motivation for using Python for machine learning.

The figure below shows a neural network with multiple inputs and one output node.

[Figure 1: A uni-layer neural network with multiple inputs and a single output node]

The inputs to the neural network are X₁, X₂, X₃, …, Xₙ and their corresponding weights are w₁, w₂, w₃, …, wₙ respectively. The output z is produced by applying a hyperbolic tangent function, used for decision making, to the sum of the products of the inputs and their weights.

Mathematically, z = tanh(∑ Xᵢwᵢ)

Here tanh() is the hyperbolic tangent function (refer to the article Linear Algebra | Tangent Hyperbolic Function); it is one of the most widely used decision-making functions.

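As a quick illustration of why tanh() works well as a decision function, the minimal sketch below (separate from the program later in this article) shows that np.tanh maps any real-valued input into the open interval (-1, 1):

#Minimal sketch: np.tanh squashes any real input into (-1, 1)
import numpy as np

z = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(np.tanh(z))   #approximately [-0.9999 -0.7616  0.      0.7616  0.9999]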

We can express this mathematical network in Python code by defining a function neural_network(X, W). Note: the hyperbolic tangent function accepts any real-valued input and returns a value in the range -1 to 1; with the non-negative inputs and weights used below, the output lies between 0 and 1.

Input parameters: vectors X and W

Return: A value between -1 and 1 (here between 0 and 1), the prediction of the neural network for the given inputs.

Applications:

  1. Machine Learning
  2. Computer Vision
  3. Data Analysis
  4. Fintech

Python program for Uni-Layer Neural Network

#Linear Algebra and Neural Network
#Linear Algebra Learning Sequence


import numpy as np

#Use of np.array() to define an Input Vector
inp = np.array([0.323, 0.432, 0.546, 0.44, 0.123, 0.78, 0.123])
print("The Vector A : ",inp)

#defining Weight Vector
weigh = np.array([0.3, 0.63, 0.99, 0.89, 0.50, 0.66, 0.123])
print("\nThe Vector B : ",weigh)

#defining a neural network that predicts an output value
def neural_network(inputs, weights):
    #weighted sum of inputs and weights (np.transpose is a no-op on a 1-D array)
    wT = np.transpose(weights)
    elpro = wT.dot(inputs)

    #Tangent Hyperbolic Function for Decision Making
    out = np.tanh(elpro)
    return out

outputi = neural_network(inp,weigh)

#printing the expected output
print("\nExpected Output of the given Input data and their respective Weight : ", outputi)

Output:

The Vector A :  [0.323 0.432 0.546 0.44  0.123 0.78  0.123]

The Vector B :  [0.3   0.63  0.99  0.89  0.5   0.66  0.123]

Expected Output of the given Input data and their respective Weight :  0.9556019596251646
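
As a sanity check, the weighted sum of the inputs is ∑ Xᵢwᵢ = 0.323·0.3 + 0.432·0.63 + 0.546·0.99 + 0.44·0.89 + 0.123·0.5 + 0.78·0.66 + 0.123·0.123 ≈ 1.8926, and tanh(1.8926) ≈ 0.9556, which agrees with the value printed by the program.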


Translated from: https://www.includehelp.com/python/uni-layer-neural-network.aspx

