Neural Networks and Deep Learning (II): Getting Started with PyTorch Tensors

This post follows the Neural Networks and Machine Learning: Case Studies and Practice tutorial on PaddlePaddle AI Studio, an AI learning and practice community.

Contents

  • I. Concepts: tensors and operators
  • II. Tensor operations with PyTorch
    • 1.2.1 Creating tensors
      • 1.2.1.1 Creating a tensor from given data
      • 1.2.1.2 Creating a tensor with a given shape
      • 1.2.1.3 Creating a tensor over a given interval
    • 1.2.2 Tensor attributes
      • 1.2.2.1 The shape of a tensor
      • 1.2.2.2 Changing the shape
      • 1.2.2.3 Tensor data types
      • 1.2.2.4 Tensor device placement
    • 1.2.3 Converting between tensors and NumPy arrays
    • 1.2.4 Accessing tensors
      • 1.2.4.1 Indexing and slicing
      • 1.2.4.2 Reading tensors
      • 1.2.4.3 Modifying tensors
    • 1.2.5 Tensor operations
      • 1.2.5.1 Mathematical operations
      • 1.2.5.2 Logical operations
      • 1.2.5.3 Matrix operations
      • 1.2.5.4 Broadcasting
  • III. Data preprocessing with PyTorch
      • 1. Processing the house_tiny.csv dataset
      • 2. Processing the boston_house_prices.csv dataset
      • 3. Processing the Iris.csv dataset

I. Concepts: tensors and operators

Tensor: A tensor is the extension and generalization of a matrix: it gives us a uniform way to describe n-dimensional arrays with an arbitrary number of axes. A tensor is similar in concept to NumPy's multidimensional array (ndarray) and can have any number of dimensions.
Operator: Operators are the basic building blocks of complex machine learning models; each one bundles the forward function of some f(x) with its backward function. A deep learning algorithm is composed of such computational units, and we call these units operators.
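To make the forward/backward pairing concrete, here is a minimal hand-rolled sketch, not from the tutorial; the class name SquareOp and its methods are made up for illustration, for the operator computing f(x) = x ** 2:

import torch

class SquareOp:
    """Toy operator for f(x) = x ** 2, for illustration only."""
    def forward(self, x):
        self.x = x  # cache the input for the backward pass
        return x * x

    def backward(self, grad_out):
        return grad_out * 2 * self.x  # chain rule: df/dx = 2x

op = SquareOp()
print(op.forward(torch.tensor(3.0)))   # tensor(9.)
print(op.backward(torch.tensor(1.0)))  # tensor(6.)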

II. Tensor operations with PyTorch

1.2.1 Creating tensors

There are several ways to create a tensor; below we introduce three basic ones: creating from given data, creating with a given shape, and creating over a given interval.

1.2.1.1 Creating a tensor from given data

First, import the torch library:

import torch as tc

A tensor of any dimensionality can be created from a given (nested) list of data.
(1) From a one-dimensional list, create a one-dimensional tensor:

ndim_1_Tensor = tc.tensor([2.0, 3.0, 4.0])
print(ndim_1_Tensor)

The output is as follows:
tensor([2., 3., 4.])
(2) From a two-dimensional list, create a two-dimensional tensor:

ndim_2_Tensor = tc.tensor([[1.0, 2.0, 3.0],
                          [4.0, 5.0, 6.0]])
print(ndim_2_Tensor)

The output is as follows:
tensor([[1., 2., 3.],
        [4., 5., 6.]])
(3) From a three-dimensional list, create a three-dimensional tensor:

ndim_3_Tensor = tc.tensor([[[1, 2, 3],
                            [4, 5, 6]],
                           [[7, 8, 9],
                            [10, 11, 12]]])
print(ndim_3_Tensor)

The output is as follows:
tensor([[[ 1,  2,  3],
         [ 4,  5,  6]],

        [[ 7,  8,  9],
         [10, 11, 12]]])
Note that the number of elements must match along every dimension of a tensor. A tensor whose rows have unequal lengths along the same dimension raises an exception (readers can try this for themselves).
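For example, the following deliberately ragged input fails (a small sketch; the exact error message may vary across PyTorch versions):

try:
    tc.tensor([[1.0, 2.0], [3.0]])  # second row is shorter than the first
except ValueError as e:
    print('Creation failed:', e)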

1.2.1.2 Creating a tensor with a given shape

This section introduces torch.zeros, torch.ones, and torch.full for creating tensors of a specified shape.

m, n = 2, 3  # create tensors of shape [m, n]
zeros_tensor = tc.zeros([m, n])  # all-zeros matrix
ones_tensor = tc.ones([m, n])  # all-ones matrix
full_tensor = tc.full([m, n], 10)  # matrix filled with 10
print(zeros_tensor)
print(ones_tensor)
print(full_tensor)

The output is as follows:
tensor([[0., 0., 0.],
        [0., 0., 0.]])
tensor([[1., 1., 1.],
        [1., 1., 1.]])
tensor([[10, 10, 10],
        [10, 10, 10]])

1.2.1.3 Creating a tensor over a given interval

This section introduces torch.arange and torch.linspace for creating tensors over a specified interval.

arange_tensor = tc.arange(start=1, end=5, step=1)  # from 1 to 5 (5 excluded), step 1
linspace_tensor = tc.linspace(start=1, end=5, steps=5)  # from 1 to 5 (5 included), evenly spaced over 5 points
print(arange_tensor)
print(linspace_tensor)

The output is as follows:
tensor([1, 2, 3, 4])
tensor([1., 2., 3., 4., 5.])

1.2.2 Tensor attributes

1.2.2.1 The shape of a tensor

A tensor has the following attributes:

  • Tensor.ndim: the number of dimensions of the tensor.
  • Tensor.shape: the number of elements along each dimension.
  • Tensor.shape[n]: the size of the n-th dimension; a dimension is also called an axis.
  • Tensor.numel(): the total number of elements in the tensor.

The code below prints each attribute in turn:
ndim_4_Tensor = tc.ones([2, 3, 4, 5])

print("Number of dimensions:", ndim_4_Tensor.ndim)
print("Shape of Tensor:", ndim_4_Tensor.shape)
print("Elements number along axis 0 of Tensor:", ndim_4_Tensor.shape[0])
print("Elements number along the last axis of Tensor:", ndim_4_Tensor.shape[-1])
print('Number of elements in Tensor: ', ndim_4_Tensor.numel())

The output is as follows:
Number of dimensions: 4
Shape of Tensor: torch.Size([2, 3, 4, 5])
Elements number along axis 0 of Tensor: 2
Elements number along the last axis of Tensor: 5
Number of elements in Tensor:  120

1.2.2.2 Changing the shape

Besides inspecting a tensor's shape, resetting the shape is also important in practical programming.

# Define a three-dimensional Tensor with shape [3, 2, 5]
ndim_3_Tensor = tc.tensor([[[1, 2, 3, 4, 5],
                            [6, 7, 8, 9, 10]],
                           [[11, 12, 13, 14, 15],
                            [16, 17, 18, 19, 20]],
                           [[21, 22, 23, 24, 25],
                            [26, 27, 28, 29, 30]]])
print("the shape of ndim_3_Tensor:", ndim_3_Tensor.shape)
# torch.reshape changes the shape while keeping the underlying data unchanged. Here we reshape to [2, 5, 3]
reshape_Tensor = tc.reshape(ndim_3_Tensor, [2, 5, 3])
print("After reshape:", reshape_Tensor)

Running this shows that reshaping the tensor from [3, 2, 5] to [2, 5, 3] does not change the data or the element order; only the shape changes.

Now reshape the ndim_3_Tensor defined above in two ways, to [-1] and to [3, 5, 2], and observe the resulting shapes.

new_Tensor1 = ndim_3_Tensor.reshape([-1])
print('new Tensor 1 shape: ', new_Tensor1.shape)
new_Tensor2 = ndim_3_Tensor.reshape([3, 5, 2])
print('new Tensor 2 shape: ', new_Tensor2.shape)

The output is as follows:
new Tensor 1 shape:  torch.Size([30])
new Tensor 2 shape:  torch.Size([3, 5, 2])
The first reshape flattens the tensor into a one-dimensional vector of 30 elements (-1 tells reshape to infer that dimension's size); in the second reshape, the leading 3 matches the number of elements the original tensor has along that dimension.

Besides torch.reshape, torch.unsqueeze can be used to insert a dimension of size 1 at a chosen position in the tensor.

ones_Tensor = tc.ones([5, 10])
new_Tensor1 = tc.unsqueeze(ones_Tensor, dim=0)  # in PyTorch, dim is the index at which the new dimension is inserted
print('new Tensor 1 shape: ', new_Tensor1.shape)
new_Tensor2 = tc.unsqueeze(ones_Tensor, dim=1)
print('new Tensor 2 shape: ', new_Tensor2.shape)

The output is as follows:
new Tensor 1 shape:  torch.Size([1, 5, 10])
new Tensor 2 shape:  torch.Size([5, 1, 10])
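Going the other way (not covered by the tutorial itself), torch.squeeze removes size-1 dimensions; a quick sketch using the tensor created above:

new_Tensor3 = tc.squeeze(new_Tensor1, dim=0)  # drop the size-1 dimension inserted above
print('after squeeze: ', new_Tensor3.shape)   # torch.Size([5, 10])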

1.2.2.3 Tensor data types

In PyTorch the data type of a tensor can be inspected through Tensor.dtype. The most basic Tensor data types are:

  • 32-bit float: torch.float32 (the most common)
  • 64-bit float: torch.float64
  • 16-bit integer: torch.int16
  • 32-bit integer: torch.int32
  • 64-bit integer: torch.int64

1) For a tensor created from Python values with no dtype specified:

  • Python integers produce an int64 tensor.
  • Python floats produce a float32 tensor by default.
print('Tensor dtype from python integers:', tc.tensor(1).dtype)
print('Tensor dtype from python floating point:', tc.tensor(1.0).dtype)

The output is as follows:
Tensor dtype from python integers: torch.int64
Tensor dtype from python floating point: torch.float32

2) The tensor.type() method converts a tensor to another data type.

float32_tensor = tc.tensor(1.0)
int64_tensor = float32_tensor.type(tc.int64)
print('Tensor after cast to int64:', int64_tensor.dtype)

The output is as follows:
Tensor after cast to int64: torch.int64
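Tensor.to() performs the same conversion and is the more common idiom in recent PyTorch code; a brief sketch:

int64_tensor2 = float32_tensor.to(tc.int64)
print('Tensor after .to(int64):', int64_tensor2.dtype)  # torch.int64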

1.2.2.4 Tensor device placement

Pinned memory, also known as page-locked or non-pageable memory, offers higher read/write throughput to and from the GPU and supports asynchronous transfers, which can further improve overall network performance. Its drawback is that allocating too much of it may degrade host performance, because it reduces the pageable memory available for virtual-memory data. When no device is specified, PyTorch places a tensor on the CPU by default; to put it on the GPU, pass device='cuda' at creation or move the tensor afterwards.

The code below creates a tensor on the CPU and inspects its placement via Tensor.device.

# Create a Tensor on the CPU
cpu_Tensor = tc.tensor(1, device='cpu')
# Inspect the tensor's device via Tensor.device
print('cpu Tensor: ', cpu_Tensor.device)

The output is as follows:
cpu Tensor:  cpu
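When a CUDA GPU is available, a tensor can be created on it directly or moved there; a small sketch that falls back to the CPU when CUDA is absent:

device = 'cuda' if tc.cuda.is_available() else 'cpu'
gpu_Tensor = tc.tensor(1, device=device)  # created directly on the target device
moved_Tensor = cpu_Tensor.to(device)      # or move an existing tensor
if tc.cuda.is_available():
    pinned_Tensor = tc.ones(3).pin_memory()  # page-locked memory for faster, async host-to-GPU copies
print('Tensor device:', gpu_Tensor.device)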

1.2.3 Converting between tensors and NumPy arrays

Tensors and NumPy arrays can be converted into each other: torch.tensor() turns a NumPy array into a tensor, and Tensor.numpy() turns a tensor into a NumPy array.

ndim_1_Tensor = tc.tensor([1., 2.])
# Convert the Tensor into a numpy.ndarray
print('Tensor to convert:', ndim_1_Tensor.numpy())

The output is as follows:
Tensor to convert: [1. 2.]
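In the other direction, torch.from_numpy builds a tensor that shares memory with the source array, so changes to one are visible in the other; a quick sketch:

import numpy as np

nd_array = np.array([1., 2.])
shared_Tensor = tc.from_numpy(nd_array)  # shares memory with nd_array
nd_array[0] = 9.
print('Tensor sees the change:', shared_Tensor)  # tensor([9., 2.], dtype=torch.float64)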

1.2.4 Accessing tensors

1.2.4.1 Indexing and slicing

Tensors can be conveniently read or modified through indexing and slicing. PyTorch follows the standard Python and NumPy indexing rules, which have these characteristics:

  • Indexing is zero-based; a negative index counts from the end.
  • Slicing separates its parameters with colons, start:stop:step, selecting the elements from start up to (but not including) stop with stride step; all three parameters may be omitted.

1.2.4.2 Reading tensors

For a one-dimensional tensor, indexing and slicing act on its single axis.

# Define a one-dimensional Tensor
ndim_1_Tensor = tc.tensor([0, 1, 2, 3, 4, 5, 6, 7, 8])
print("Origin Tensor:", ndim_1_Tensor)
print("First element:", ndim_1_Tensor[0])
print("Last element:", ndim_1_Tensor[-1])
print("All element:", ndim_1_Tensor[:])
print("Before 3:", ndim_1_Tensor[:3])
print("Interval of 3:", ndim_1_Tensor[::3])

The output is as follows:
Origin Tensor: tensor([0, 1, 2, 3, 4, 5, 6, 7, 8])
First element: tensor(0)
Last element: tensor(8)
All element: tensor([0, 1, 2, 3, 4, 5, 6, 7, 8])
Before 3: tensor([0, 1, 2])
Interval of 3: tensor([0, 3, 6])
PyTorch does not support a negative slice step: an operation like print("Reverse:", ndim_1_Tensor[::-1]) raises an error.
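To reverse a tensor, torch.flip is the supported route; a one-line sketch:

print('Reverse:', tc.flip(ndim_1_Tensor, dims=[0]))  # tensor([8, 7, 6, 5, 4, 3, 2, 1, 0])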

For tensors of two or more dimensions, indexing and slicing can be applied on several axes: the first index addresses dimension 0, the second dimension 1, and so on; any dimension left unspecified defaults to ":".

# Define a two-dimensional Tensor
ndim_2_Tensor = tc.tensor([[0, 1, 2, 3],
                           [4, 5, 6, 7],
                           [8, 9, 10, 11]])
print("Origin Tensor:", ndim_2_Tensor)
print("First row:", ndim_2_Tensor[0])
print("First row:", ndim_2_Tensor[0, :])
print("First column:", ndim_2_Tensor[:, 0])
print("Last column:", ndim_2_Tensor[:, -1])
print("All element:", ndim_2_Tensor[:])
print("First row and second column:", ndim_2_Tensor[0, 1])

The output is as follows:
Origin Tensor: tensor([[ 0,  1,  2,  3],
        [ 4,  5,  6,  7],
        [ 8,  9, 10, 11]])
First row: tensor([0, 1, 2, 3])
First row: tensor([0, 1, 2, 3])
First column: tensor([0, 4, 8])
Last column: tensor([ 3,  7, 11])
All element: tensor([[ 0,  1,  2,  3],
        [ 4,  5,  6,  7],
        [ 8,  9, 10, 11]])
First row and second column: tensor(1)

1.2.4.3 Modifying tensors

As with reading, tensors can be modified on one or more axes through indexing or slicing.
Caution:
Modify tensors through indexing or slicing with care: the operation changes the tensor's values in place, and the original values are not preserved. If the modified tensor participates in gradient computation, only the post-modification values are used, which can introduce risk into the gradient computation.
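As a concrete, illustrative case (not from the original tutorial), PyTorch refuses outright an in-place edit of a leaf tensor that requires gradients; the ordinary case without autograd, shown in the snippet after this one, works fine:

a = tc.ones(3, requires_grad=True)
try:
    a[0] = 0.  # in-place modification of a leaf tensor that requires grad
except RuntimeError as e:
    print('In-place edit refused:', e)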

ndim_2_Tensor = tc.ones([2, 3], dtype=tc.float32)
print('Origin Tensor:\n ', ndim_2_Tensor)
# set the first row to 0
ndim_2_Tensor[0] = 0
print('change Tensor:\n ', ndim_2_Tensor)
# set the first row (via a slice) to 2.1
ndim_2_Tensor[0:1] = 2.1
print('change Tensor: \n', ndim_2_Tensor)
# set the entire Tensor to 3
ndim_2_Tensor[...] = 3
print('change Tensor:\n ', ndim_2_Tensor)


1.2.5 Tensor operations

Tensors support more than one hundred operations, including basic mathematics, logical comparison, and matrix operations. Taking addition as an example, it can be called in two ways:
1) the torch function torch.add(x, y);
2) the tensor member function x.add(y).

# Define two Tensors
x = tc.tensor([[1.1, 2.2], [3.3, 4.4]], dtype=tc.float64)
y = tc.tensor([[5.5, 6.6], [7.7, 8.8]], dtype=tc.float64)
# First way: torch.add performs element-wise addition and returns the result
print('Method 1: ', tc.add(x, y))
# Second way: the member function
print('Method 2: ', x.add(y))

Both calls produce the same result: the member function x.add(y) and torch.add(x, y) are equivalent.

1.2.5.1 Mathematical operations

The basic math functions on tensors are:
x.abs()                       # element-wise absolute value
x.ceil()                      # element-wise ceiling
x.floor()                     # element-wise floor
x.round()                     # element-wise rounding
x.exp()                       # element-wise exponential (base e)
x.log()                       # element-wise natural logarithm
x.reciprocal()                # element-wise reciprocal
x.square()                    # element-wise square
x.sqrt()                      # element-wise square root
x.sin()                       # element-wise sine
x.cos()                       # element-wise cosine
x.add(y)                      # element-wise addition
x.subtract(y)                 # element-wise subtraction
x.multiply(y)                 # element-wise multiplication
x.divide(y)                   # element-wise division
x.remainder(y)                # element-wise remainder (modulo)
x.pow(y)                      # element-wise power
x.max()                       # maximum over the given dimension (default: all)
x.min()                       # minimum over the given dimension (default: all)
x.prod()                      # product over the given dimension (default: all)
x.sum()                       # sum over the given dimension (default: all)

In addition, for convenience PyTorch overrides the corresponding Python magic methods, so the following operators behave the same as the functions above.

x + y  -> x.add(y)            # element-wise addition
x - y  -> x.subtract(y)       # element-wise subtraction
x * y  -> x.multiply(y)       # element-wise multiplication
x / y  -> x.divide(y)         # element-wise division
x % y  -> x.remainder(y)      # element-wise remainder (modulo)
x ** y -> x.pow(y)            # element-wise power
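A few of these in action on the x and y defined above (a quick sketch):

print(x.abs())        # element-wise absolute value
print(x + y)          # identical to x.add(y)
print((x * y).sum())  # element-wise product, then sum over all elements
print(x.max())        # maximum over all elements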

1.2.5.2 Logical operations

The logical and comparison functions on tensors are:

x.isfinite()                  # test element-wise whether values are finite (neither inf nor nan)
x.equal(y)                    # True if the two Tensors have the same size and elements (returns a Python bool)
x.eq(y)                       # element-wise equality, returns a boolean Tensor of the same shape
x.ne(y)                       # element-wise inequality
x.lt(y)                       # element-wise: is x less than y?
x.le(y)                       # element-wise: is x less than or equal to y?
x.gt(y)                       # element-wise: is x greater than y?
x.ge(y)                       # element-wise: is x greater than or equal to y?
x.allclose(y)                 # True if all elements of the two Tensors are close (returns a Python bool)

Likewise, PyTorch overrides the corresponding Python comparison operators (==, !=, <, <=, >, >=), which we will not repeat here.
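For instance, element-wise comparisons return boolean tensors while equal and allclose return a single Python bool; a quick sketch with the x and y defined earlier:

print(x.eq(y))        # element-wise equality: tensor of bools
print(x < y)          # same as x.lt(y)
print(x.allclose(y))  # False: the elements are not close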

1.2.5.3 Matrix operations

Tensors also provide matrix-related functions such as transpose, norms, and matrix multiplication.

x.t()                         # matrix transpose
x.transpose(0, 1)             # swap dimension 0 with dimension 1
x.norm('fro')                 # Frobenius norm of the matrix
x.dist(y, p=2)                # 2-norm of (x - y)
x.matmul(y)                   # matrix multiplication

Some matrix operations also accept tensors with more than two dimensions. matmul, for example, multiplies over the last two dimensions: if x has shape [j, k, n, m] and y has shape [j, k, m, p], then x.matmul(y) has shape [j, k, n, p].
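A small sketch of these on a 2x3 matrix:

mat = tc.ones([2, 3])
print(mat.t().shape)              # torch.Size([3, 2])
print(mat.norm('fro'))            # Frobenius norm: sqrt(6), about tensor(2.4495)
print(mat.matmul(mat.t()).shape)  # torch.Size([2, 2])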

1.2.5.4 Broadcasting

Some PyTorch APIs support broadcasting, which allows operations between tensors of different shapes. Intuitively, when one tensor is smaller than the other and we want to apply an operation with it repeatedly across the larger one, the smaller tensor behaves as if it were first expanded to the larger tensor's shape, after which the operation is applied.

Conditions for broadcasting

PyTorch's broadcasting follows these rules (matching NumPy's broadcasting semantics):

1) Each tensor must have at least one dimension.
2) Comparing the shapes from the trailing dimension backwards, each pair of dimension sizes must either be equal, or one of them is 1, or one of them does not exist.

x = tc.ones((2, 3, 4))
y = tc.ones((2, 3, 4))
z = x + y
print('broadcasting with two same shape tensor: ', z.shape)

x = tc.ones((2, 3, 1, 5))
y = tc.ones((3, 4, 1))
# Compare the shapes from the trailing dimension backwards:
# 1st comparison: y's size is 1
# 2nd comparison: x's size is 1
# 3rd comparison: x and y have equal sizes, both 3
# 4th comparison: y's dimension does not exist
# So x and y are broadcastable
z = x + y
print('broadcasting with two different shape tensor:', z.shape)

The output is as follows:
broadcasting with two same shape tensor:  torch.Size([2, 3, 4])
broadcasting with two different shape tensor: torch.Size([2, 3, 4, 5])
In both cases x and y satisfy the broadcasting rules, so the addition succeeds. Now define two tensors with shapes [2, 3, 4] and [2, 3, 6] and check whether they can be added via broadcasting.

x = tc.ones((2, 3, 4))
y = tc.ones((2, 3, 6))
z = x + y

Running this throws an error: x and y cannot be broadcast, because the first trailing-dimension comparison pits 4 against 6, which are unequal and neither is 1, violating the broadcasting rules.

How the broadcast result shape is computed

Knowing when two tensors are broadcastable, the shape of the result is computed as follows:

1) If the two shapes have different lengths, prepend 1s to the shorter shape until both have the same length.

2) With the lengths equal, each result dimension is the larger of the two sizes in that dimension.

Take x with shape [2, 3, 1, 5] and y with shape [3, 4, 1] as an example. y's shape is shorter, so it is first padded to [1, 3, 4, 1]; then the shapes are compared dimension by dimension. In the first dimension x has size 2 and y has size 1, so the result's first dimension is 2. Proceeding the same way for every dimension yields the result shape [2, 3, 4, 5].

Since matrix multiplication (torch.matmul) is used heavily in deep learning, its broadcasting rules deserve special mention:

1) If both tensors are one-dimensional, the result is the dot product.

2) If both tensors are two-dimensional, the result is the matrix-matrix product.

3) If x is one-dimensional and y is two-dimensional, x's shape is promoted to [1, D], the matrix product is taken, and the leading dimension is then dropped.

4) If x is two-dimensional and y is one-dimensional, the result is the matrix-vector product.

5) If both tensors are N-dimensional (N > 2), the non-matrix dimensions (all but the last two) are broadcast. For example, if x has shape [j, 1, n, m] and y has shape [k, m, p], the output has shape [j, k, n, p].

x = tc.ones([10, 1, 5, 2])
y = tc.ones([3, 2, 5])
z = tc.matmul(x, y)
print('After matmul:', z.shape)

The output is as follows:
After matmul: torch.Size([10, 3, 5, 5])
From the output we can see that broadcasting is applied when computing the matrix product.
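The lower-dimensional rules can be checked the same way (a quick sketch of rules 1 and 4):

v = tc.ones(3)
m2 = tc.ones(2, 3)
print(tc.matmul(v, v))         # rule 1: dot product, tensor(3.)
print(tc.matmul(m2, v).shape)  # rule 4: matrix @ vector, torch.Size([2])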

III. Data preprocessing with PyTorch

Below we take the three datasets house_tiny.csv, boston_house_prices.csv, and Iris.csv and apply three steps to each: read the data, handle missing values, and convert to tensor format.

1. Processing the house_tiny.csv dataset

(1) Read the dataset house_tiny.csv

import pandas as pd
import numpy as np
import torch

data = pd.read_csv('house_tiny.csv')
print(data)


(2) Handle missing values

# handle missing values and categorical values
x = data
# fill missing numeric entries with the column mean
x = x.fillna(value=x.mean(numeric_only=True))  # numeric_only is required by recent pandas
print(x)
# one-hot encode the categorical values
x = pd.get_dummies(x)
print(x)

(3) Convert to tensor format

# convert to tensor form
x = np.array(x, dtype=np.float32)  # get_dummies can produce bool/uint8 columns; cast to float
x = torch.tensor(x)
print(x)


2. Processing the boston_house_prices.csv dataset

(1) Read the dataset boston_house_prices.csv

import pandas as pd
import numpy as np
import torch

data = pd.read_csv('boston_house_prices.csv')
print(data)

(2) There are no missing values in this dataset.

(3) Convert to tensor format

# convert to tensor form
x = np.array(data)  # all columns are numeric, so no further preprocessing is needed
x = torch.tensor(x)
print(x)


3. Processing the Iris.csv dataset

(1) Read the dataset Iris.csv

import pandas as pd
import numpy as np
import torch

data = pd.read_csv('Iris.csv')
print(data)

(2) There are no missing values in this dataset.

(3) Convert to tensor format

x = pd.get_dummies(data)  # one-hot encode the categorical Species column first
x = np.array(x, dtype=np.float32)
x = torch.tensor(x)
print(x)

The three datasets, house_tiny.csv, boston_house_prices.csv, and Iris.csv, are reproduced below so you can use them directly.

house_tiny.csv:
NumRooms,Alley,Price
NA,Pave,127500
2,NA,106000
4,NA,178100
NA,NA,140000

boston_house_prices.csv:
CRIM,ZN,INDUS,CHAS,NOX,RM,AGE,DIS,RAD,TAX,PTRATIO,LSTAT,MEDV
0.00632,18,2.31,0,0.538,6.575,65.2,4.09,1,296,15.3,4.98,24
0.02731,0,7.07,0,0.469,6.421,78.9,4.9671,2,242,17.8,9.14,21.6
0.02729,0,7.07,0,0.469,7.185,61.1,4.9671,2,242,17.8,4.03,34.7
0.03237,0,2.18,0,0.458,6.998,45.8,6.0622,3,222,18.7,2.94,33.4
0.06905,0,2.18,0,0.458,7.147,54.2,6.0622,3,222,18.7,5.33,36.2
0.02985,0,2.18,0,0.458,6.43,58.7,6.0622,3,222,18.7,5.21,28.7
0.08829,12.5,7.87,0,0.524,6.012,66.6,5.5605,5,311,15.2,12.43,22.9
0.14455,12.5,7.87,0,0.524,6.172,96.1,5.9505,5,311,15.2,19.15,27.1
0.21124,12.5,7.87,0,0.524,5.631,100,6.0821,5,311,15.2,29.93,16.5
0.17004,12.5,7.87,0,0.524,6.004,85.9,6.5921,5,311,15.2,17.1,18.9
0.22489,12.5,7.87,0,0.524,6.377,94.3,6.3467,5,311,15.2,20.45,15
0.11747,12.5,7.87,0,0.524,6.009,82.9,6.2267,5,311,15.2,13.27,18.9
0.09378,12.5,7.87,0,0.524,5.889,39,5.4509,5,311,15.2,15.71,21.7
0.62976,0,8.14,0,0.538,5.949,61.8,4.7075,4,307,21,8.26,20.4
0.63796,0,8.14,0,0.538,6.096,84.5,4.4619,4,307,21,10.26,18.2
0.62739,0,8.14,0,0.538,5.834,56.5,4.4986,4,307,21,8.47,19.9
1.05393,0,8.14,0,0.538,5.935,29.3,4.4986,4,307,21,6.58,23.1
0.7842,0,8.14,0,0.538,5.99,81.7,4.2579,4,307,21,14.67,17.5
0.80271,0,8.14,0,0.538,5.456,36.6,3.7965,4,307,21,11.69,20.2
0.7258,0,8.14,0,0.538,5.727,69.5,3.7965,4,307,21,11.28,18.2
1.25179,0,8.14,0,0.538,5.57,98.1,3.7979,4,307,21,21.02,13.6
0.85204,0,8.14,0,0.538,5.965,89.2,4.0123,4,307,21,13.83,19.6
1.23247,0,8.14,0,0.538,6.142,91.7,3.9769,4,307,21,18.72,15.2
0.98843,0,8.14,0,0.538,5.813,100,4.0952,4,307,21,19.88,14.5
0.75026,0,8.14,0,0.538,5.924,94.1,4.3996,4,307,21,16.3,15.6
0.84054,0,8.14,0,0.538,5.599,85.7,4.4546,4,307,21,16.51,13.9
0.67191,0,8.14,0,0.538,5.813,90.3,4.682,4,307,21,14.81,16.6
0.95577,0,8.14,0,0.538,6.047,88.8,4.4534,4,307,21,17.28,14.8
0.77299,0,8.14,0,0.538,6.495,94.4,4.4547,4,307,21,12.8,18.4
1.00245,0,8.14,0,0.538,6.674,87.3,4.239,4,307,21,11.98,21
1.13081,0,8.14,0,0.538,5.713,94.1,4.233,4,307,21,22.6,12.7
1.35472,0,8.14,0,0.538,6.072,100,4.175,4,307,21,13.04,14.5
1.38799,0,8.14,0,0.538,5.95,82,3.99,4,307,21,27.71,13.2
1.15172,0,8.14,0,0.538,5.701,95,3.7872,4,307,21,18.35,13.1
1.61282,0,8.14,0,0.538,6.096,96.9,3.7598,4,307,21,20.34,13.5
0.06417,0,5.96,0,0.499,5.933,68.2,3.3603,5,279,19.2,9.68,18.9
0.09744,0,5.96,0,0.499,5.841,61.4,3.3779,5,279,19.2,11.41,20
0.08014,0,5.96,0,0.499,5.85,41.5,3.9342,5,279,19.2,8.77,21
0.17505,0,5.96,0,0.499,5.966,30.2,3.8473,5,279,19.2,10.13,24.7
0.02763,75,2.95,0,0.428,6.595,21.8,5.4011,3,252,18.3,4.32,30.8
0.03359,75,2.95,0,0.428,7.024,15.8,5.4011,3,252,18.3,1.98,34.9
0.12744,0,6.91,0,0.448,6.77,2.9,5.7209,3,233,17.9,4.84,26.6
0.1415,0,6.91,0,0.448,6.169,6.6,5.7209,3,233,17.9,5.81,25.3
0.15936,0,6.91,0,0.448,6.211,6.5,5.7209,3,233,17.9,7.44,24.7
0.12269,0,6.91,0,0.448,6.069,40,5.7209,3,233,17.9,9.55,21.2
0.17142,0,6.91,0,0.448,5.682,33.8,5.1004,3,233,17.9,10.21,19.3
0.18836,0,6.91,0,0.448,5.786,33.3,5.1004,3,233,17.9,14.15,20
0.22927,0,6.91,0,0.448,6.03,85.5,5.6894,3,233,17.9,18.8,16.6
0.25387,0,6.91,0,0.448,5.399,95.3,5.87,3,233,17.9,30.81,14.4
0.21977,0,6.91,0,0.448,5.602,62,6.0877,3,233,17.9,16.2,19.4
0.08873,21,5.64,0,0.439,5.963,45.7,6.8147,4,243,16.8,13.45,19.7
0.04337,21,5.64,0,0.439,6.115,63,6.8147,4,243,16.8,9.43,20.5
0.0536,21,5.64,0,0.439,6.511,21.1,6.8147,4,243,16.8,5.28,25
0.04981,21,5.64,0,0.439,5.998,21.4,6.8147,4,243,16.8,8.43,23.4
0.0136,75,4,0,0.41,5.888,47.6,7.3197,3,469,21.1,14.8,18.9
0.01311,90,1.22,0,0.403,7.249,21.9,8.6966,5,226,17.9,4.81,35.4
0.02055,85,0.74,0,0.41,6.383,35.7,9.1876,2,313,17.3,5.77,24.7
0.01432,100,1.32,0,0.411,6.816,40.5,8.3248,5,256,15.1,3.95,31.6
0.15445,25,5.13,0,0.453,6.145,29.2,7.8148,8,284,19.7,6.86,23.3
0.10328,25,5.13,0,0.453,5.927,47.2,6.932,8,284,19.7,9.22,19.6
0.14932,25,5.13,0,0.453,5.741,66.2,7.2254,8,284,19.7,13.15,18.7
0.17171,25,5.13,0,0.453,5.966,93.4,6.8185,8,284,19.7,14.44,16
0.11027,25,5.13,0,0.453,6.456,67.8,7.2255,8,284,19.7,6.73,22.2
0.1265,25,5.13,0,0.453,6.762,43.4,7.9809,8,284,19.7,9.5,25
0.01951,17.5,1.38,0,0.4161,7.104,59.5,9.2229,3,216,18.6,8.05,33
0.03584,80,3.37,0,0.398,6.29,17.8,6.6115,4,337,16.1,4.67,23.5
0.04379,80,3.37,0,0.398,5.787,31.1,6.6115,4,337,16.1,10.24,19.4
0.05789,12.5,6.07,0,0.409,5.878,21.4,6.498,4,345,18.9,8.1,22
0.13554,12.5,6.07,0,0.409,5.594,36.8,6.498,4,345,18.9,13.09,17.4
0.12816,12.5,6.07,0,0.409,5.885,33,6.498,4,345,18.9,8.79,20.9
0.08826,0,10.81,0,0.413,6.417,6.6,5.2873,4,305,19.2,6.72,24.2
0.15876,0,10.81,0,0.413,5.961,17.5,5.2873,4,305,19.2,9.88,21.7
0.09164,0,10.81,0,0.413,6.065,7.8,5.2873,4,305,19.2,5.52,22.8
0.19539,0,10.81,0,0.413,6.245,6.2,5.2873,4,305,19.2,7.54,23.4
0.07896,0,12.83,0,0.437,6.273,6,4.2515,5,398,18.7,6.78,24.1
0.09512,0,12.83,0,0.437,6.286,45,4.5026,5,398,18.7,8.94,21.4
0.10153,0,12.83,0,0.437,6.279,74.5,4.0522,5,398,18.7,11.97,20
0.08707,0,12.83,0,0.437,6.14,45.8,4.0905,5,398,18.7,10.27,20.8
0.05646,0,12.83,0,0.437,6.232,53.7,5.0141,5,398,18.7,12.34,21.2
0.08387,0,12.83,0,0.437,5.874,36.6,4.5026,5,398,18.7,9.1,20.3
0.04113,25,4.86,0,0.426,6.727,33.5,5.4007,4,281,19,5.29,28
0.04462,25,4.86,0,0.426,6.619,70.4,5.4007,4,281,19,7.22,23.9
0.03659,25,4.86,0,0.426,6.302,32.2,5.4007,4,281,19,6.72,24.8
0.03551,25,4.86,0,0.426,6.167,46.7,5.4007,4,281,19,7.51,22.9
0.05059,0,4.49,0,0.449,6.389,48,4.7794,3,247,18.5,9.62,23.9
0.05735,0,4.49,0,0.449,6.63,56.1,4.4377,3,247,18.5,6.53,26.6
0.05188,0,4.49,0,0.449,6.015,45.1,4.4272,3,247,18.5,12.86,22.5
0.07151,0,4.49,0,0.449,6.121,56.8,3.7476,3,247,18.5,8.44,22.2
0.0566,0,3.41,0,0.489,7.007,86.3,3.4217,2,270,17.8,5.5,23.6
0.05302,0,3.41,0,0.489,7.079,63.1,3.4145,2,270,17.8,5.7,28.7
0.04684,0,3.41,0,0.489,6.417,66.1,3.0923,2,270,17.8,8.81,22.6
0.03932,0,3.41,0,0.489,6.405,73.9,3.0921,2,270,17.8,8.2,22
0.04203,28,15.04,0,0.464,6.442,53.6,3.6659,4,270,18.2,8.16,22.9
0.02875,28,15.04,0,0.464,6.211,28.9,3.6659,4,270,18.2,6.21,25
0.04294,28,15.04,0,0.464,6.249,77.3,3.615,4,270,18.2,10.59,20.6
0.12204,0,2.89,0,0.445,6.625,57.8,3.4952,2,276,18,6.65,28.4
0.11504,0,2.89,0,0.445,6.163,69.6,3.4952,2,276,18,11.34,21.4
0.12083,0,2.89,0,0.445,8.069,76,3.4952,2,276,18,4.21,38.7
0.08187,0,2.89,0,0.445,7.82,36.9,3.4952,2,276,18,3.57,43.8
0.0686,0,2.89,0,0.445,7.416,62.5,3.4952,2,276,18,6.19,33.2
0.14866,0,8.56,0,0.52,6.727,79.9,2.7778,5,384,20.9,9.42,27.5
0.11432,0,8.56,0,0.52,6.781,71.3,2.8561,5,384,20.9,7.67,26.5
0.22876,0,8.56,0,0.52,6.405,85.4,2.7147,5,384,20.9,10.63,18.6
0.21161,0,8.56,0,0.52,6.137,87.4,2.7147,5,384,20.9,13.44,19.3
0.1396,0,8.56,0,0.52,6.167,90,2.421,5,384,20.9,12.33,20.1
0.13262,0,8.56,0,0.52,5.851,96.7,2.1069,5,384,20.9,16.47,19.5
0.1712,0,8.56,0,0.52,5.836,91.9,2.211,5,384,20.9,18.66,19.5
0.13117,0,8.56,0,0.52,6.127,85.2,2.1224,5,384,20.9,14.09,20.4
0.12802,0,8.56,0,0.52,6.474,97.1,2.4329,5,384,20.9,12.27,19.8
0.26363,0,8.56,0,0.52,6.229,91.2,2.5451,5,384,20.9,15.55,19.4
0.10793,0,8.56,0,0.52,6.195,54.4,2.7778,5,384,20.9,13,21.7
0.10084,0,10.01,0,0.547,6.715,81.6,2.6775,6,432,17.8,10.16,22.8
0.12329,0,10.01,0,0.547,5.913,92.9,2.3534,6,432,17.8,16.21,18.8
0.22212,0,10.01,0,0.547,6.092,95.4,2.548,6,432,17.8,17.09,18.7
0.14231,0,10.01,0,0.547,6.254,84.2,2.2565,6,432,17.8,10.45,18.5
0.17134,0,10.01,0,0.547,5.928,88.2,2.4631,6,432,17.8,15.76,18.3
0.13158,0,10.01,0,0.547,6.176,72.5,2.7301,6,432,17.8,12.04,21.2
0.15098,0,10.01,0,0.547,6.021,82.6,2.7474,6,432,17.8,10.3,19.2
0.13058,0,10.01,0,0.547,5.872,73.1,2.4775,6,432,17.8,15.37,20.4
0.14476,0,10.01,0,0.547,5.731,65.2,2.7592,6,432,17.8,13.61,19.3
0.06899,0,25.65,0,0.581,5.87,69.7,2.2577,2,188,19.1,14.37,22
0.07165,0,25.65,0,0.581,6.004,84.1,2.1974,2,188,19.1,14.27,20.3
0.09299,0,25.65,0,0.581,5.961,92.9,2.0869,2,188,19.1,17.93,20.5
0.15038,0,25.65,0,0.581,5.856,97,1.9444,2,188,19.1,25.41,17.3
0.09849,0,25.65,0,0.581,5.879,95.8,2.0063,2,188,19.1,17.58,18.8
0.16902,0,25.65,0,0.581,5.986,88.4,1.9929,2,188,19.1,14.81,21.4
0.38735,0,25.65,0,0.581,5.613,95.6,1.7572,2,188,19.1,27.26,15.7
0.25915,0,21.89,0,0.624,5.693,96,1.7883,4,437,21.2,17.19,16.2
0.32543,0,21.89,0,0.624,6.431,98.8,1.8125,4,437,21.2,15.39,18
0.88125,0,21.89,0,0.624,5.637,94.7,1.9799,4,437,21.2,18.34,14.3
0.34006,0,21.89,0,0.624,6.458,98.9,2.1185,4,437,21.2,12.6,19.2
1.19294,0,21.89,0,0.624,6.326,97.7,2.271,4,437,21.2,12.26,19.6
0.59005,0,21.89,0,0.624,6.372,97.9,2.3274,4,437,21.2,11.12,23
0.32982,0,21.89,0,0.624,5.822,95.4,2.4699,4,437,21.2,15.03,18.4
0.97617,0,21.89,0,0.624,5.757,98.4,2.346,4,437,21.2,17.31,15.6
0.55778,0,21.89,0,0.624,6.335,98.2,2.1107,4,437,21.2,16.96,18.1
0.32264,0,21.89,0,0.624,5.942,93.5,1.9669,4,437,21.2,16.9,17.4
0.35233,0,21.89,0,0.624,6.454,98.4,1.8498,4,437,21.2,14.59,17.1
0.2498,0,21.89,0,0.624,5.857,98.2,1.6686,4,437,21.2,21.32,13.3
0.54452,0,21.89,0,0.624,6.151,97.9,1.6687,4,437,21.2,18.46,17.8
0.2909,0,21.89,0,0.624,6.174,93.6,1.6119,4,437,21.2,24.16,14
1.62864,0,21.89,0,0.624,5.019,100,1.4394,4,437,21.2,34.41,14.4
3.32105,0,19.58,1,0.871,5.403,100,1.3216,5,403,14.7,26.82,13.4
4.0974,0,19.58,0,0.871,5.468,100,1.4118,5,403,14.7,26.42,15.6
2.77974,0,19.58,0,0.871,4.903,97.8,1.3459,5,403,14.7,29.29,11.8
2.37934,0,19.58,0,0.871,6.13,100,1.4191,5,403,14.7,27.8,13.8
2.15505,0,19.58,0,0.871,5.628,100,1.5166,5,403,14.7,16.65,15.6
2.36862,0,19.58,0,0.871,4.926,95.7,1.4608,5,403,14.7,29.53,14.6
2.33099,0,19.58,0,0.871,5.186,93.8,1.5296,5,403,14.7,28.32,17.8
2.73397,0,19.58,0,0.871,5.597,94.9,1.5257,5,403,14.7,21.45,15.4
1.6566,0,19.58,0,0.871,6.122,97.3,1.618,5,403,14.7,14.1,21.5
1.49632,0,19.58,0,0.871,5.404,100,1.5916,5,403,14.7,13.28,19.6
1.12658,0,19.58,1,0.871,5.012,88,1.6102,5,403,14.7,12.12,15.3
2.14918,0,19.58,0,0.871,5.709,98.5,1.6232,5,403,14.7,15.79,19.4
1.41385,0,19.58,1,0.871,6.129,96,1.7494,5,403,14.7,15.12,17
3.53501,0,19.58,1,0.871,6.152,82.6,1.7455,5,403,14.7,15.02,15.6
2.44668,0,19.58,0,0.871,5.272,94,1.7364,5,403,14.7,16.14,13.1
1.22358,0,19.58,0,0.605,6.943,97.4,1.8773,5,403,14.7,4.59,41.3
1.34284,0,19.58,0,0.605,6.066,100,1.7573,5,403,14.7,6.43,24.3
1.42502,0,19.58,0,0.871,6.51,100,1.7659,5,403,14.7,7.39,23.3
1.27346,0,19.58,1,0.605,6.25,92.6,1.7984,5,403,14.7,5.5,27
1.46336,0,19.58,0,0.605,7.489,90.8,1.9709,5,403,14.7,1.73,50
1.83377,0,19.58,1,0.605,7.802,98.2,2.0407,5,403,14.7,1.92,50
1.51902,0,19.58,1,0.605,8.375,93.9,2.162,5,403,14.7,3.32,50
2.24236,0,19.58,0,0.605,5.854,91.8,2.422,5,403,14.7,11.64,22.7
2.924,0,19.58,0,0.605,6.101,93,2.2834,5,403,14.7,9.81,25
2.01019,0,19.58,0,0.605,7.929,96.2,2.0459,5,403,14.7,3.7,50
1.80028,0,19.58,0,0.605,5.877,79.2,2.4259,5,403,14.7,12.14,23.8
2.3004,0,19.58,0,0.605,6.319,96.1,2.1,5,403,14.7,11.1,23.8
2.44953,0,19.58,0,0.605,6.402,95.2,2.2625,5,403,14.7,11.32,22.3
1.20742,0,19.58,0,0.605,5.875,94.6,2.4259,5,403,14.7,14.43,17.4
2.3139,0,19.58,0,0.605,5.88,97.3,2.3887,5,403,14.7,12.03,19.1
0.13914,0,4.05,0,0.51,5.572,88.5,2.5961,5,296,16.6,14.69,23.1
0.09178,0,4.05,0,0.51,6.416,84.1,2.6463,5,296,16.6,9.04,23.6
0.08447,0,4.05,0,0.51,5.859,68.7,2.7019,5,296,16.6,9.64,22.6
0.06664,0,4.05,0,0.51,6.546,33.1,3.1323,5,296,16.6,5.33,29.4
0.07022,0,4.05,0,0.51,6.02,47.2,3.5549,5,296,16.6,10.11,23.2
0.05425,0,4.05,0,0.51,6.315,73.4,3.3175,5,296,16.6,6.29,24.6
0.06642,0,4.05,0,0.51,6.86,74.4,2.9153,5,296,16.6,6.92,29.9
0.0578,0,2.46,0,0.488,6.98,58.4,2.829,3,193,17.8,5.04,37.2
0.06588,0,2.46,0,0.488,7.765,83.3,2.741,3,193,17.8,7.56,39.8
0.06888,0,2.46,0,0.488,6.144,62.2,2.5979,3,193,17.8,9.45,36.2
0.09103,0,2.46,0,0.488,7.155,92.2,2.7006,3,193,17.8,4.82,37.9
0.10008,0,2.46,0,0.488,6.563,95.6,2.847,3,193,17.8,5.68,32.5
0.08308,0,2.46,0,0.488,5.604,89.8,2.9879,3,193,17.8,13.98,26.4
0.06047,0,2.46,0,0.488,6.153,68.8,3.2797,3,193,17.8,13.15,29.6
0.05602,0,2.46,0,0.488,7.831,53.6,3.1992,3,193,17.8,4.45,50
0.07875,45,3.44,0,0.437,6.782,41.1,3.7886,5,398,15.2,6.68,32
0.12579,45,3.44,0,0.437,6.556,29.1,4.5667,5,398,15.2,4.56,29.8
0.0837,45,3.44,0,0.437,7.185,38.9,4.5667,5,398,15.2,5.39,34.9
0.09068,45,3.44,0,0.437,6.951,21.5,6.4798,5,398,15.2,5.1,37
0.06911,45,3.44,0,0.437,6.739,30.8,6.4798,5,398,15.2,4.69,30.5
0.08664,45,3.44,0,0.437,7.178,26.3,6.4798,5,398,15.2,2.87,36.4
0.02187,60,2.93,0,0.401,6.8,9.9,6.2196,1,265,15.6,5.03,31.1
0.01439,60,2.93,0,0.401,6.604,18.8,6.2196,1,265,15.6,4.38,29.1
0.01381,80,0.46,0,0.422,7.875,32,5.6484,4,255,14.4,2.97,50
0.04011,80,1.52,0,0.404,7.287,34.1,7.309,2,329,12.6,4.08,33.3
0.04666,80,1.52,0,0.404,7.107,36.6,7.309,2,329,12.6,8.61,30.3
0.03768,80,1.52,0,0.404,7.274,38.3,7.309,2,329,12.6,6.62,34.6
0.0315,95,1.47,0,0.403,6.975,15.3,7.6534,3,402,17,4.56,34.9
0.01778,95,1.47,0,0.403,7.135,13.9,7.6534,3,402,17,4.45,32.9
0.03445,82.5,2.03,0,0.415,6.162,38.4,6.27,2,348,14.7,7.43,24.1
0.02177,82.5,2.03,0,0.415,7.61,15.7,6.27,2,348,14.7,3.11,42.3
0.0351,95,2.68,0,0.4161,7.853,33.2,5.118,4,224,14.7,3.81,48.5
0.02009,95,2.68,0,0.4161,8.034,31.9,5.118,4,224,14.7,2.88,50
0.13642,0,10.59,0,0.489,5.891,22.3,3.9454,4,277,18.6,10.87,22.6
0.22969,0,10.59,0,0.489,6.326,52.5,4.3549,4,277,18.6,10.97,24.4
0.25199,0,10.59,0,0.489,5.783,72.7,4.3549,4,277,18.6,18.06,22.5
0.13587,0,10.59,1,0.489,6.064,59.1,4.2392,4,277,18.6,14.66,24.4
0.43571,0,10.59,1,0.489,5.344,100,3.875,4,277,18.6,23.09,20
0.17446,0,10.59,1,0.489,5.96,92.1,3.8771,4,277,18.6,17.27,21.7
0.37578,0,10.59,1,0.489,5.404,88.6,3.665,4,277,18.6,23.98,19.3
0.21719,0,10.59,1,0.489,5.807,53.8,3.6526,4,277,18.6,16.03,22.4
0.14052,0,10.59,0,0.489,6.375,32.3,3.9454,4,277,18.6,9.38,28.1
0.28955,0,10.59,0,0.489,5.412,9.8,3.5875,4,277,18.6,29.55,23.7
0.19802,0,10.59,0,0.489,6.182,42.4,3.9454,4,277,18.6,9.47,25
0.0456,0,13.89,1,0.55,5.888,56,3.1121,5,276,16.4,13.51,23.3
0.07013,0,13.89,0,0.55,6.642,85.1,3.4211,5,276,16.4,9.69,28.7
0.11069,0,13.89,1,0.55,5.951,93.8,2.8893,5,276,16.4,17.92,21.5
0.11425,0,13.89,1,0.55,6.373,92.4,3.3633,5,276,16.4,10.5,23
0.35809,0,6.2,1,0.507,6.951,88.5,2.8617,8,307,17.4,9.71,26.7
0.40771,0,6.2,1,0.507,6.164,91.3,3.048,8,307,17.4,21.46,21.7
0.62356,0,6.2,1,0.507,6.879,77.7,3.2721,8,307,17.4,9.93,27.5
0.6147,0,6.2,0,0.507,6.618,80.8,3.2721,8,307,17.4,7.6,30.1
0.31533,0,6.2,0,0.504,8.266,78.3,2.8944,8,307,17.4,4.14,44.8
0.52693,0,6.2,0,0.504,8.725,83,2.8944,8,307,17.4,4.63,50
0.38214,0,6.2,0,0.504,8.04,86.5,3.2157,8,307,17.4,3.13,37.6
0.41238,0,6.2,0,0.504,7.163,79.9,3.2157,8,307,17.4,6.36,31.6
0.29819,0,6.2,0,0.504,7.686,17,3.3751,8,307,17.4,3.92,46.7
0.44178,0,6.2,0,0.504,6.552,21.4,3.3751,8,307,17.4,3.76,31.5
0.537,0,6.2,0,0.504,5.981,68.1,3.6715,8,307,17.4,11.65,24.3
0.46296,0,6.2,0,0.504,7.412,76.9,3.6715,8,307,17.4,5.25,31.7
0.57529,0,6.2,0,0.507,8.337,73.3,3.8384,8,307,17.4,2.47,41.7
0.33147,0,6.2,0,0.507,8.247,70.4,3.6519,8,307,17.4,3.95,48.3
0.44791,0,6.2,1,0.507,6.726,66.5,3.6519,8,307,17.4,8.05,29
0.33045,0,6.2,0,0.507,6.086,61.5,3.6519,8,307,17.4,10.88,24
0.52058,0,6.2,1,0.507,6.631,76.5,4.148,8,307,17.4,9.54,25.1
0.51183,0,6.2,0,0.507,7.358,71.6,4.148,8,307,17.4,4.73,31.5
0.08244,30,4.93,0,0.428,6.481,18.5,6.1899,6,300,16.6,6.36,23.7
0.09252,30,4.93,0,0.428,6.606,42.2,6.1899,6,300,16.6,7.37,23.3
0.11329,30,4.93,0,0.428,6.897,54.3,6.3361,6,300,16.6,11.38,22
0.10612,30,4.93,0,0.428,6.095,65.1,6.3361,6,300,16.6,12.4,20.1
0.1029,30,4.93,0,0.428,6.358,52.9,7.0355,6,300,16.6,11.22,22.2
0.12757,30,4.93,0,0.428,6.393,7.8,7.0355,6,300,16.6,5.19,23.7
0.20608,22,5.86,0,0.431,5.593,76.5,7.9549,7,330,19.1,12.5,17.6
0.19133,22,5.86,0,0.431,5.605,70.2,7.9549,7,330,19.1,18.46,18.5
0.33983,22,5.86,0,0.431,6.108,34.9,8.0555,7,330,19.1,9.16,24.3
0.19657,22,5.86,0,0.431,6.226,79.2,8.0555,7,330,19.1,10.15,20.5
0.16439,22,5.86,0,0.431,6.433,49.1,7.8265,7,330,19.1,9.52,24.5
0.19073,22,5.86,0,0.431,6.718,17.5,7.8265,7,330,19.1,6.56,26.2
0.1403,22,5.86,0,0.431,6.487,13,7.3967,7,330,19.1,5.9,24.4
0.21409,22,5.86,0,0.431,6.438,8.9,7.3967,7,330,19.1,3.59,24.8
0.08221,22,5.86,0,0.431,6.957,6.8,8.9067,7,330,19.1,3.53,29.6
0.36894,22,5.86,0,0.431,8.259,8.4,8.9067,7,330,19.1,3.54,42.8
0.04819,80,3.64,0,0.392,6.108,32,9.2203,1,315,16.4,6.57,21.9
0.03548,80,3.64,0,0.392,5.876,19.1,9.2203,1,315,16.4,9.25,20.9
0.01538,90,3.75,0,0.394,7.454,34.2,6.3361,3,244,15.9,3.11,44
0.61154,20,3.97,0,0.647,8.704,86.9,1.801,5,264,13,5.12,50
0.66351,20,3.97,0,0.647,7.333,100,1.8946,5,264,13,7.79,36
0.65665,20,3.97,0,0.647,6.842,100,2.0107,5,264,13,6.9,30.1
0.54011,20,3.97,0,0.647,7.203,81.8,2.1121,5,264,13,9.59,33.8
0.53412,20,3.97,0,0.647,7.52,89.4,2.1398,5,264,13,7.26,43.1
0.52014,20,3.97,0,0.647,8.398,91.5,2.2885,5,264,13,5.91,48.8
0.82526,20,3.97,0,0.647,7.327,94.5,2.0788,5,264,13,11.25,31
0.55007,20,3.97,0,0.647,7.206,91.6,1.9301,5,264,13,8.1,36.5
0.76162,20,3.97,0,0.647,5.56,62.8,1.9865,5,264,13,10.45,22.8
0.7857,20,3.97,0,0.647,7.014,84.6,2.1329,5,264,13,14.79,30.7
0.57834,20,3.97,0,0.575,8.297,67,2.4216,5,264,13,7.44,50
0.5405,20,3.97,0,0.575,7.47,52.6,2.872,5,264,13,3.16,43.5
0.09065,20,6.96,1,0.464,5.92,61.5,3.9175,3,223,18.6,13.65,20.7
0.29916,20,6.96,0,0.464,5.856,42.1,4.429,3,223,18.6,13,21.1
0.16211,20,6.96,0,0.464,6.24,16.3,4.429,3,223,18.6,6.59,25.2
0.1146,20,6.96,0,0.464,6.538,58.7,3.9175,3,223,18.6,7.73,24.4
0.22188,20,6.96,1,0.464,7.691,51.8,4.3665,3,223,18.6,6.58,35.2
0.05644,40,6.41,1,0.447,6.758,32.9,4.0776,4,254,17.6,3.53,32.4
0.09604,40,6.41,0,0.447,6.854,42.8,4.2673,4,254,17.6,2.98,32
0.10469,40,6.41,1,0.447,7.267,49,4.7872,4,254,17.6,6.05,33.2
0.06127,40,6.41,1,0.447,6.826,27.6,4.8628,4,254,17.6,4.16,33.1
0.07978,40,6.41,0,0.447,6.482,32.1,4.1403,4,254,17.6,7.19,29.1
0.21038,20,3.33,0,0.4429,6.812,32.2,4.1007,5,216,14.9,4.85,35.1
0.03578,20,3.33,0,0.4429,7.82,64.5,4.6947,5,216,14.9,3.76,45.4
0.03705,20,3.33,0,0.4429,6.968,37.2,5.2447,5,216,14.9,4.59,35.4
0.06129,20,3.33,1,0.4429,7.645,49.7,5.2119,5,216,14.9,3.01,46
0.01501,90,1.21,1,0.401,7.923,24.8,5.885,1,198,13.6,3.16,50
0.00906,90,2.97,0,0.4,7.088,20.8,7.3073,1,285,15.3,7.85,32.2
0.01096,55,2.25,0,0.389,6.453,31.9,7.3073,1,300,15.3,8.23,22
0.01965,80,1.76,0,0.385,6.23,31.5,9.0892,1,241,18.2,12.93,20.1
0.03871,52.5,5.32,0,0.405,6.209,31.3,7.3172,6,293,16.6,7.14,23.2
0.0459,52.5,5.32,0,0.405,6.315,45.6,7.3172,6,293,16.6,7.6,22.3
0.04297,52.5,5.32,0,0.405,6.565,22.9,7.3172,6,293,16.6,9.51,24.8
0.03502,80,4.95,0,0.411,6.861,27.9,5.1167,4,245,19.2,3.33,28.5
0.07886,80,4.95,0,0.411,7.148,27.7,5.1167,4,245,19.2,3.56,37.3
0.03615,80,4.95,0,0.411,6.63,23.4,5.1167,4,245,19.2,4.7,27.9
0.08265,0,13.92,0,0.437,6.127,18.4,5.5027,4,289,16,8.58,23.9
0.08199,0,13.92,0,0.437,6.009,42.3,5.5027,4,289,16,10.4,21.7
0.12932,0,13.92,0,0.437,6.678,31.1,5.9604,4,289,16,6.27,28.6
0.05372,0,13.92,0,0.437,6.549,51,5.9604,4,289,16,7.39,27.1
0.14103,0,13.92,0,0.437,5.79,58,6.32,4,289,16,15.84,20.3
0.06466,70,2.24,0,0.4,6.345,20.1,7.8278,5,358,14.8,4.97,22.5
0.05561,70,2.24,0,0.4,7.041,10,7.8278,5,358,14.8,4.74,29
0.04417,70,2.24,0,0.4,6.871,47.4,7.8278,5,358,14.8,6.07,24.8
0.03537,34,6.09,0,0.433,6.59,40.4,5.4917,7,329,16.1,9.5,22
0.09266,34,6.09,0,0.433,6.495,18.4,5.4917,7,329,16.1,8.67,26.4
0.1,34,6.09,0,0.433,6.982,17.7,5.4917,7,329,16.1,4.86,33.1
0.05515,33,2.18,0,0.472,7.236,41.1,4.022,7,222,18.4,6.93,36.1
0.05479,33,2.18,0,0.472,6.616,58.1,3.37,7,222,18.4,8.93,28.4
0.07503,33,2.18,0,0.472,7.42,71.9,3.0992,7,222,18.4,6.47,33.4
0.04932,33,2.18,0,0.472,6.849,70.3,3.1827,7,222,18.4,7.53,28.2
0.49298,0,9.9,0,0.544,6.635,82.5,3.3175,4,304,18.4,4.54,22.8
0.3494,0,9.9,0,0.544,5.972,76.7,3.1025,4,304,18.4,9.97,20.3
2.63548,0,9.9,0,0.544,4.973,37.8,2.5194,4,304,18.4,12.64,16.1
0.79041,0,9.9,0,0.544,6.122,52.8,2.6403,4,304,18.4,5.98,22.1
0.26169,0,9.9,0,0.544,6.023,90.4,2.834,4,304,18.4,11.72,19.4
0.26938,0,9.9,0,0.544,6.266,82.8,3.2628,4,304,18.4,7.9,21.6
0.3692,0,9.9,0,0.544,6.567,87.3,3.6023,4,304,18.4,9.28,23.8
0.25356,0,9.9,0,0.544,5.705,77.7,3.945,4,304,18.4,11.5,16.2
0.31827,0,9.9,0,0.544,5.914,83.2,3.9986,4,304,18.4,18.33,17.8
0.24522,0,9.9,0,0.544,5.782,71.7,4.0317,4,304,18.4,15.94,19.8
0.40202,0,9.9,0,0.544,6.382,67.2,3.5325,4,304,18.4,10.36,23.1
0.47547,0,9.9,0,0.544,6.113,58.8,4.0019,4,304,18.4,12.73,21
0.1676,0,7.38,0,0.493,6.426,52.3,4.5404,5,287,19.6,7.2,23.8
0.18159,0,7.38,0,0.493,6.376,54.3,4.5404,5,287,19.6,6.87,23.1
0.35114,0,7.38,0,0.493,6.041,49.9,4.7211,5,287,19.6,7.7,20.4
0.28392,0,7.38,0,0.493,5.708,74.3,4.7211,5,287,19.6,11.74,18.5
0.34109,0,7.38,0,0.493,6.415,40.1,4.7211,5,287,19.6,6.12,25
0.19186,0,7.38,0,0.493,6.431,14.7,5.4159,5,287,19.6,5.08,24.6
0.30347,0,7.38,0,0.493,6.312,28.9,5.4159,5,287,19.6,6.15,23
0.24103,0,7.38,0,0.493,6.083,43.7,5.4159,5,287,19.6,12.79,22.2
0.06617,0,3.24,0,0.46,5.868,25.8,5.2146,4,430,16.9,9.97,19.3
0.06724,0,3.24,0,0.46,6.333,17.2,5.2146,4,430,16.9,7.34,22.6
0.04544,0,3.24,0,0.46,6.144,32.2,5.8736,4,430,16.9,9.09,19.8
0.05023,35,6.06,0,0.4379,5.706,28.4,6.6407,1,304,16.9,12.43,17.1
0.03466,35,6.06,0,0.4379,6.031,23.3,6.6407,1,304,16.9,7.83,19.4
0.05083,0,5.19,0,0.515,6.316,38.1,6.4584,5,224,20.2,5.68,22.2
0.03738,0,5.19,0,0.515,6.31,38.5,6.4584,5,224,20.2,6.75,20.7
0.03961,0,5.19,0,0.515,6.037,34.5,5.9853,5,224,20.2,8.01,21.1
0.03427,0,5.19,0,0.515,5.869,46.3,5.2311,5,224,20.2,9.8,19.5
0.03041,0,5.19,0,0.515,5.895,59.6,5.615,5,224,20.2,10.56,18.5
0.03306,0,5.19,0,0.515,6.059,37.3,4.8122,5,224,20.2,8.51,20.6
0.05497,0,5.19,0,0.515,5.985,45.4,4.8122,5,224,20.2,9.74,19
0.06151,0,5.19,0,0.515,5.968,58.5,4.8122,5,224,20.2,9.29,18.7
0.01301,35,1.52,0,0.442,7.241,49.3,7.0379,1,284,15.5,5.49,32.7
0.02498,0,1.89,0,0.518,6.54,59.7,6.2669,1,422,15.9,8.65,16.5
0.02543,55,3.78,0,0.484,6.696,56.4,5.7321,5,370,17.6,7.18,23.9
0.03049,55,3.78,0,0.484,6.874,28.1,6.4654,5,370,17.6,4.61,31.2
0.03113,0,4.39,0,0.442,6.014,48.5,8.0136,3,352,18.8,10.53,17.5
0.06162,0,4.39,0,0.442,5.898,52.3,8.0136,3,352,18.8,12.67,17.2
0.0187,85,4.15,0,0.429,6.516,27.7,8.5353,4,351,17.9,6.36,23.1
0.01501,80,2.01,0,0.435,6.635,29.7,8.344,4,280,17,5.99,24.5
0.02899,40,1.25,0,0.429,6.939,34.5,8.7921,1,335,19.7,5.89,26.6
0.06211,40,1.25,0,0.429,6.49,44.4,8.7921,1,335,19.7,5.98,22.9
0.0795,60,1.69,0,0.411,6.579,35.9,10.7103,4,411,18.3,5.49,24.1
0.07244,60,1.69,0,0.411,5.884,18.5,10.7103,4,411,18.3,7.79,18.6
0.01709,90,2.02,0,0.41,6.728,36.1,12.1265,5,187,17,4.5,30.1
0.04301,80,1.91,0,0.413,5.663,21.9,10.5857,4,334,22,8.05,18.2
0.10659,80,1.91,0,0.413,5.936,19.5,10.5857,4,334,22,5.57,20.6
8.98296,0,18.1,1,0.77,6.212,97.4,2.1222,24,666,20.2,17.6,17.8
3.8497,0,18.1,1,0.77,6.395,91,2.5052,24,666,20.2,13.27,21.7
5.20177,0,18.1,1,0.77,6.127,83.4,2.7227,24,666,20.2,11.48,22.7
4.26131,0,18.1,0,0.77,6.112,81.3,2.5091,24,666,20.2,12.67,22.6
4.54192,0,18.1,0,0.77,6.398,88,2.5182,24,666,20.2,7.79,25
3.83684,0,18.1,0,0.77,6.251,91.1,2.2955,24,666,20.2,14.19,19.9
3.67822,0,18.1,0,0.77,5.362,96.2,2.1036,24,666,20.2,10.19,20.8
4.22239,0,18.1,1,0.77,5.803,89,1.9047,24,666,20.2,14.64,16.8
3.47428,0,18.1,1,0.718,8.78,82.9,1.9047,24,666,20.2,5.29,21.9
4.55587,0,18.1,0,0.718,3.561,87.9,1.6132,24,666,20.2,7.12,27.5
3.69695,0,18.1,0,0.718,4.963,91.4,1.7523,24,666,20.2,14,21.9
13.5222,0,18.1,0,0.631,3.863,100,1.5106,24,666,20.2,13.33,23.1
4.89822,0,18.1,0,0.631,4.97,100,1.3325,24,666,20.2,3.26,50
5.66998,0,18.1,1,0.631,6.683,96.8,1.3567,24,666,20.2,3.73,50
6.53876,0,18.1,1,0.631,7.016,97.5,1.2024,24,666,20.2,2.96,50
9.2323,0,18.1,0,0.631,6.216,100,1.1691,24,666,20.2,9.53,50
8.26725,0,18.1,1,0.668,5.875,89.6,1.1296,24,666,20.2,8.88,50
11.1081,0,18.1,0,0.668,4.906,100,1.1742,24,666,20.2,34.77,13.8
18.4982,0,18.1,0,0.668,4.138,100,1.137,24,666,20.2,37.97,13.8
19.6091,0,18.1,0,0.671,7.313,97.9,1.3163,24,666,20.2,13.44,15
15.288,0,18.1,0,0.671,6.649,93.3,1.3449,24,666,20.2,23.24,13.9
9.82349,0,18.1,0,0.671,6.794,98.8,1.358,24,666,20.2,21.24,13.3
23.6482,0,18.1,0,0.671,6.38,96.2,1.3861,24,666,20.2,23.69,13.1
17.8667,0,18.1,0,0.671,6.223,100,1.3861,24,666,20.2,21.78,10.2
88.9762,0,18.1,0,0.671,6.968,91.9,1.4165,24,666,20.2,17.21,10.4
15.8744,0,18.1,0,0.671,6.545,99.1,1.5192,24,666,20.2,21.08,10.9
9.18702,0,18.1,0,0.7,5.536,100,1.5804,24,666,20.2,23.6,11.3
7.99248,0,18.1,0,0.7,5.52,100,1.5331,24,666,20.2,24.56,12.3
20.0849,0,18.1,0,0.7,4.368,91.2,1.4395,24,666,20.2,30.63,8.8
16.8118,0,18.1,0,0.7,5.277,98.1,1.4261,24,666,20.2,30.81,7.2
24.3938,0,18.1,0,0.7,4.652,100,1.4672,24,666,20.2,28.28,10.5
22.5971,0,18.1,0,0.7,5,89.5,1.5184,24,666,20.2,31.99,7.4
14.3337,0,18.1,0,0.7,4.88,100,1.5895,24,666,20.2,30.62,10.2
8.15174,0,18.1,0,0.7,5.39,98.9,1.7281,24,666,20.2,20.85,11.5
6.96215,0,18.1,0,0.7,5.713,97,1.9265,24,666,20.2,17.11,15.1
5.29305,0,18.1,0,0.7,6.051,82.5,2.1678,24,666,20.2,18.76,23.2
11.5779,0,18.1,0,0.7,5.036,97,1.77,24,666,20.2,25.68,9.7
8.64476,0,18.1,0,0.693,6.193,92.6,1.7912,24,666,20.2,15.17,13.8
13.3598,0,18.1,0,0.693,5.887,94.7,1.7821,24,666,20.2,16.35,12.7
8.71675,0,18.1,0,0.693,6.471,98.8,1.7257,24,666,20.2,17.12,13.1
5.87205,0,18.1,0,0.693,6.405,96,1.6768,24,666,20.2,19.37,12.5
7.67202,0,18.1,0,0.693,5.747,98.9,1.6334,24,666,20.2,19.92,8.5
38.3518,0,18.1,0,0.693,5.453,100,1.4896,24,666,20.2,30.59,5
9.91655,0,18.1,0,0.693,5.852,77.8,1.5004,24,666,20.2,29.97,6.3
25.0461,0,18.1,0,0.693,5.987,100,1.5888,24,666,20.2,26.77,5.6
14.2362,0,18.1,0,0.693,6.343,100,1.5741,24,666,20.2,20.32,7.2
9.59571,0,18.1,0,0.693,6.404,100,1.639,24,666,20.2,20.31,12.1
24.8017,0,18.1,0,0.693,5.349,96,1.7028,24,666,20.2,19.77,8.3
41.5292,0,18.1,0,0.693,5.531,85.4,1.6074,24,666,20.2,27.38,8.5
67.9208,0,18.1,0,0.693,5.683,100,1.4254,24,666,20.2,22.98,5
20.7162,0,18.1,0,0.659,4.138,100,1.1781,24,666,20.2,23.34,11.9
11.9511,0,18.1,0,0.659,5.608,100,1.2852,24,666,20.2,12.13,27.9
7.40389,0,18.1,0,0.597,5.617,97.9,1.4547,24,666,20.2,26.4,17.2
14.4383,0,18.1,0,0.597,6.852,100,1.4655,24,666,20.2,19.78,27.5
51.1358,0,18.1,0,0.597,5.757,100,1.413,24,666,20.2,10.11,15
14.0507,0,18.1,0,0.597,6.657,100,1.5275,24,666,20.2,21.22,17.2
18.811,0,18.1,0,0.597,4.628,100,1.5539,24,666,20.2,34.37,17.9
28.6558,0,18.1,0,0.597,5.155,100,1.5894,24,666,20.2,20.08,16.3
45.7461,0,18.1,0,0.693,4.519,100,1.6582,24,666,20.2,36.98,7
18.0846,0,18.1,0,0.679,6.434,100,1.8347,24,666,20.2,29.05,7.2
10.8342,0,18.1,0,0.679,6.782,90.8,1.8195,24,666,20.2,25.79,7.5
25.9406,0,18.1,0,0.679,5.304,89.1,1.6475,24,666,20.2,26.64,10.4
73.5341,0,18.1,0,0.679,5.957,100,1.8026,24,666,20.2,20.62,8.8
11.8123,0,18.1,0,0.718,6.824,76.5,1.794,24,666,20.2,22.74,8.4
11.0874,0,18.1,0,0.718,6.411,100,1.8589,24,666,20.2,15.02,16.7
7.02259,0,18.1,0,0.718,6.006,95.3,1.8746,24,666,20.2,15.7,14.2
12.0482,0,18.1,0,0.614,5.648,87.6,1.9512,24,666,20.2,14.1,20.8
7.05042,0,18.1,0,0.614,6.103,85.1,2.0218,24,666,20.2,23.29,13.4
8.79212,0,18.1,0,0.584,5.565,70.6,2.0635,24,666,20.2,17.16,11.7
15.8603,0,18.1,0,0.679,5.896,95.4,1.9096,24,666,20.2,24.39,8.3
12.2472,0,18.1,0,0.584,5.837,59.7,1.9976,24,666,20.2,15.69,10.2
37.6619,0,18.1,0,0.679,6.202,78.7,1.8629,24,666,20.2,14.52,10.9
7.36711,0,18.1,0,0.679,6.193,78.1,1.9356,24,666,20.2,21.52,11
9.33889,0,18.1,0,0.679,6.38,95.6,1.9682,24,666,20.2,24.08,9.5
8.49213,0,18.1,0,0.584,6.348,86.1,2.0527,24,666,20.2,17.64,14.5
10.0623,0,18.1,0,0.584,6.833,94.3,2.0882,24,666,20.2,19.69,14.1
6.44405,0,18.1,0,0.584,6.425,74.8,2.2004,24,666,20.2,12.03,16.1
5.58107,0,18.1,0,0.713,6.436,87.9,2.3158,24,666,20.2,16.22,14.3
13.9134,0,18.1,0,0.713,6.208,95,2.2222,24,666,20.2,15.17,11.7
11.1604,0,18.1,0,0.74,6.629,94.6,2.1247,24,666,20.2,23.27,13.4
14.4208,0,18.1,0,0.74,6.461,93.3,2.0026,24,666,20.2,18.05,9.6
15.1772,0,18.1,0,0.74,6.152,100,1.9142,24,666,20.2,26.45,8.7
13.6781,0,18.1,0,0.74,5.935,87.9,1.8206,24,666,20.2,34.02,8.4
9.39063,0,18.1,0,0.74,5.627,93.9,1.8172,24,666,20.2,22.88,12.8
22.0511,0,18.1,0,0.74,5.818,92.4,1.8662,24,666,20.2,22.11,10.5
9.72418,0,18.1,0,0.74,6.406,97.2,2.0651,24,666,20.2,19.52,17.1
5.66637,0,18.1,0,0.74,6.219,100,2.0048,24,666,20.2,16.59,18.4
9.96654,0,18.1,0,0.74,6.485,100,1.9784,24,666,20.2,18.85,15.4
12.8023,0,18.1,0,0.74,5.854,96.6,1.8956,24,666,20.2,23.79,10.8
10.6718,0,18.1,0,0.74,6.459,94.8,1.9879,24,666,20.2,23.98,11.8
6.28807,0,18.1,0,0.74,6.341,96.4,2.072,24,666,20.2,17.79,14.9
9.92485,0,18.1,0,0.74,6.251,96.6,2.198,24,666,20.2,16.44,12.6
9.32909,0,18.1,0,0.713,6.185,98.7,2.2616,24,666,20.2,18.13,14.1
7.52601,0,18.1,0,0.713,6.417,98.3,2.185,24,666,20.2,19.31,13
6.71772,0,18.1,0,0.713,6.749,92.6,2.3236,24,666,20.2,17.44,13.4
5.44114,0,18.1,0,0.713,6.655,98.2,2.3552,24,666,20.2,17.73,15.2
5.09017,0,18.1,0,0.713,6.297,91.8,2.3682,24,666,20.2,17.27,16.1
8.24809,0,18.1,0,0.713,7.393,99.3,2.4527,24,666,20.2,16.74,17.8
9.51363,0,18.1,0,0.713,6.728,94.1,2.4961,24,666,20.2,18.71,14.9
4.75237,0,18.1,0,0.713,6.525,86.5,2.4358,24,666,20.2,18.13,14.1
4.66883,0,18.1,0,0.713,5.976,87.9,2.5806,24,666,20.2,19.01,12.7
8.20058,0,18.1,0,0.713,5.936,80.3,2.7792,24,666,20.2,16.94,13.5
7.75223,0,18.1,0,0.713,6.301,83.7,2.7831,24,666,20.2,16.23,14.9
6.80117,0,18.1,0,0.713,6.081,84.4,2.7175,24,666,20.2,14.7,20
4.81213,0,18.1,0,0.713,6.701,90,2.5975,24,666,20.2,16.42,16.4
3.69311,0,18.1,0,0.713,6.376,88.4,2.5671,24,666,20.2,14.65,17.7
6.65492,0,18.1,0,0.713,6.317,83,2.7344,24,666,20.2,13.99,19.5
5.82115,0,18.1,0,0.713,6.513,89.9,2.8016,24,666,20.2,10.29,20.2
7.83932,0,18.1,0,0.655,6.209,65.4,2.9634,24,666,20.2,13.22,21.4
3.1636,0,18.1,0,0.655,5.759,48.2,3.0665,24,666,20.2,14.13,19.9
3.77498,0,18.1,0,0.655,5.952,84.7,2.8715,24,666,20.2,17.15,19
4.42228,0,18.1,0,0.584,6.003,94.5,2.5403,24,666,20.2,21.32,19.1
15.5757,0,18.1,0,0.58,5.926,71,2.9084,24,666,20.2,18.13,19.1
13.0751,0,18.1,0,0.58,5.713,56.7,2.8237,24,666,20.2,14.76,20.1
4.34879,0,18.1,0,0.58,6.167,84,3.0334,24,666,20.2,16.29,19.9
4.03841,0,18.1,0,0.532,6.229,90.7,3.0993,24,666,20.2,12.87,19.6
3.56868,0,18.1,0,0.58,6.437,75,2.8965,24,666,20.2,14.36,23.2
4.64689,0,18.1,0,0.614,6.98,67.6,2.5329,24,666,20.2,11.66,29.8
8.05579,0,18.1,0,0.584,5.427,95.4,2.4298,24,666,20.2,18.14,13.8
6.39312,0,18.1,0,0.584,6.162,97.4,2.206,24,666,20.2,24.1,13.3
4.87141,0,18.1,0,0.614,6.484,93.6,2.3053,24,666,20.2,18.68,16.7
15.0234,0,18.1,0,0.614,5.304,97.3,2.1007,24,666,20.2,24.91,12
10.233,0,18.1,0,0.614,6.185,96.7,2.1705,24,666,20.2,18.03,14.6
14.3337,0,18.1,0,0.614,6.229,88,1.9512,24,666,20.2,13.11,21.4
5.82401,0,18.1,0,0.532,6.242,64.7,3.4242,24,666,20.2,10.74,23
5.70818,0,18.1,0,0.532,6.75,74.9,3.3317,24,666,20.2,7.74,23.7
5.73116,0,18.1,0,0.532,7.061,77,3.4106,24,666,20.2,7.01,25
2.81838,0,18.1,0,0.532,5.762,40.3,4.0983,24,666,20.2,10.42,21.8
2.37857,0,18.1,0,0.583,5.871,41.9,3.724,24,666,20.2,13.34,20.6
3.67367,0,18.1,0,0.583,6.312,51.9,3.9917,24,666,20.2,10.58,21.2
5.69175,0,18.1,0,0.583,6.114,79.8,3.5459,24,666,20.2,14.98,19.1
4.83567,0,18.1,0,0.583,5.905,53.2,3.1523,24,666,20.2,11.45,20.6
0.15086,0,27.74,0,0.609,5.454,92.7,1.8209,4,711,20.1,18.06,15.2
0.18337,0,27.74,0,0.609,5.414,98.3,1.7554,4,711,20.1,23.97,7
0.20746,0,27.74,0,0.609,5.093,98,1.8226,4,711,20.1,29.68,8.1
0.10574,0,27.74,0,0.609,5.983,98.8,1.8681,4,711,20.1,18.07,13.6
0.11132,0,27.74,0,0.609,5.983,83.5,2.1099,4,711,20.1,13.35,20.1
0.17331,0,9.69,0,0.585,5.707,54,2.3817,6,391,19.2,12.01,21.8
0.27957,0,9.69,0,0.585,5.926,42.6,2.3817,6,391,19.2,13.59,24.5
0.17899,0,9.69,0,0.585,5.67,28.8,2.7986,6,391,19.2,17.6,23.1
0.2896,0,9.69,0,0.585,5.39,72.9,2.7986,6,391,19.2,21.14,19.7
0.26838,0,9.69,0,0.585,5.794,70.6,2.8927,6,391,19.2,14.1,18.3
0.23912,0,9.69,0,0.585,6.019,65.3,2.4091,6,391,19.2,12.92,21.2
0.17783,0,9.69,0,0.585,5.569,73.5,2.3999,6,391,19.2,15.1,17.5
0.22438,0,9.69,0,0.585,6.027,79.7,2.4982,6,391,19.2,14.33,16.8
0.06263,0,11.93,0,0.573,6.593,69.1,2.4786,1,273,21,9.67,22.4
0.04527,0,11.93,0,0.573,6.12,76.7,2.2875,1,273,21,9.08,20.6
0.06076,0,11.93,0,0.573,6.976,91,2.1675,1,273,21,5.64,23.9
0.10959,0,11.93,0,0.573,6.794,89.3,2.3889,1,273,21,6.48,22
0.04741,0,11.93,0,0.573,6.03,80.8,2.505,1,273,21,7.88,11.9

Iris.csv:
Id,SepalLengthCm,SepalWidthCm,PetalLengthCm,PetalWidthCm,Species
1,5.1,3.5,1.4,0.2,Iris-setosa
2,4.9,3,1.4,0.2,Iris-setosa
3,4.7,3.2,1.3,0.2,Iris-setosa
4,4.6,3.1,1.5,0.2,Iris-setosa
5,5,3.6,1.4,0.2,Iris-setosa
6,5.4,3.9,1.7,0.4,Iris-setosa
7,4.6,3.4,1.4,0.3,Iris-setosa
8,5,3.4,1.5,0.2,Iris-setosa
9,4.4,2.9,1.4,0.2,Iris-setosa
10,4.9,3.1,1.5,0.1,Iris-setosa
11,5.4,3.7,1.5,0.2,Iris-setosa
12,4.8,3.4,1.6,0.2,Iris-setosa
13,4.8,3,1.4,0.1,Iris-setosa
14,4.3,3,1.1,0.1,Iris-setosa
15,5.8,4,1.2,0.2,Iris-setosa
16,5.7,4.4,1.5,0.4,Iris-setosa
17,5.4,3.9,1.3,0.4,Iris-setosa
18,5.1,3.5,1.4,0.3,Iris-setosa
19,5.7,3.8,1.7,0.3,Iris-setosa
20,5.1,3.8,1.5,0.3,Iris-setosa
21,5.4,3.4,1.7,0.2,Iris-setosa
22,5.1,3.7,1.5,0.4,Iris-setosa
23,4.6,3.6,1,0.2,Iris-setosa
24,5.1,3.3,1.7,0.5,Iris-setosa
25,4.8,3.4,1.9,0.2,Iris-setosa
26,5,3,1.6,0.2,Iris-setosa
27,5,3.4,1.6,0.4,Iris-setosa
28,5.2,3.5,1.5,0.2,Iris-setosa
29,5.2,3.4,1.4,0.2,Iris-setosa
30,4.7,3.2,1.6,0.2,Iris-setosa
31,4.8,3.1,1.6,0.2,Iris-setosa
32,5.4,3.4,1.5,0.4,Iris-setosa
33,5.2,4.1,1.5,0.1,Iris-setosa
34,5.5,4.2,1.4,0.2,Iris-setosa
35,4.9,3.1,1.5,0.1,Iris-setosa
36,5,3.2,1.2,0.2,Iris-setosa
37,5.5,3.5,1.3,0.2,Iris-setosa
38,4.9,3.1,1.5,0.1,Iris-setosa
39,4.4,3,1.3,0.2,Iris-setosa
40,5.1,3.4,1.5,0.2,Iris-setosa
41,5,3.5,1.3,0.3,Iris-setosa
42,4.5,2.3,1.3,0.3,Iris-setosa
43,4.4,3.2,1.3,0.2,Iris-setosa
44,5,3.5,1.6,0.6,Iris-setosa
45,5.1,3.8,1.9,0.4,Iris-setosa
46,4.8,3,1.4,0.3,Iris-setosa
47,5.1,3.8,1.6,0.2,Iris-setosa
48,4.6,3.2,1.4,0.2,Iris-setosa
49,5.3,3.7,1.5,0.2,Iris-setosa
50,5,3.3,1.4,0.2,Iris-setosa
51,7,3.2,4.7,1.4,Iris-versicolor
52,6.4,3.2,4.5,1.5,Iris-versicolor
53,6.9,3.1,4.9,1.5,Iris-versicolor
54,5.5,2.3,4,1.3,Iris-versicolor
55,6.5,2.8,4.6,1.5,Iris-versicolor
56,5.7,2.8,4.5,1.3,Iris-versicolor
57,6.3,3.3,4.7,1.6,Iris-versicolor
58,4.9,2.4,3.3,1,Iris-versicolor
59,6.6,2.9,4.6,1.3,Iris-versicolor
60,5.2,2.7,3.9,1.4,Iris-versicolor
61,5,2,3.5,1,Iris-versicolor
62,5.9,3,4.2,1.5,Iris-versicolor
63,6,2.2,4,1,Iris-versicolor
64,6.1,2.9,4.7,1.4,Iris-versicolor
65,5.6,2.9,3.6,1.3,Iris-versicolor
66,6.7,3.1,4.4,1.4,Iris-versicolor
67,5.6,3,4.5,1.5,Iris-versicolor
68,5.8,2.7,4.1,1,Iris-versicolor
69,6.2,2.2,4.5,1.5,Iris-versicolor
70,5.6,2.5,3.9,1.1,Iris-versicolor
71,5.9,3.2,4.8,1.8,Iris-versicolor
72,6.1,2.8,4,1.3,Iris-versicolor
73,6.3,2.5,4.9,1.5,Iris-versicolor
74,6.1,2.8,4.7,1.2,Iris-versicolor
75,6.4,2.9,4.3,1.3,Iris-versicolor
76,6.6,3,4.4,1.4,Iris-versicolor
77,6.8,2.8,4.8,1.4,Iris-versicolor
78,6.7,3,5,1.7,Iris-versicolor
79,6,2.9,4.5,1.5,Iris-versicolor
80,5.7,2.6,3.5,1,Iris-versicolor
81,5.5,2.4,3.8,1.1,Iris-versicolor
82,5.5,2.4,3.7,1,Iris-versicolor
83,5.8,2.7,3.9,1.2,Iris-versicolor
84,6,2.7,5.1,1.6,Iris-versicolor
85,5.4,3,4.5,1.5,Iris-versicolor
86,6,3.4,4.5,1.6,Iris-versicolor
87,6.7,3.1,4.7,1.5,Iris-versicolor
88,6.3,2.3,4.4,1.3,Iris-versicolor
89,5.6,3,4.1,1.3,Iris-versicolor
90,5.5,2.5,4,1.3,Iris-versicolor
91,5.5,2.6,4.4,1.2,Iris-versicolor
92,6.1,3,4.6,1.4,Iris-versicolor
93,5.8,2.6,4,1.2,Iris-versicolor
94,5,2.3,3.3,1,Iris-versicolor
95,5.6,2.7,4.2,1.3,Iris-versicolor
96,5.7,3,4.2,1.2,Iris-versicolor
97,5.7,2.9,4.2,1.3,Iris-versicolor
98,6.2,2.9,4.3,1.3,Iris-versicolor
99,5.1,2.5,3,1.1,Iris-versicolor
100,5.7,2.8,4.1,1.3,Iris-versicolor
101,6.3,3.3,6,2.5,Iris-virginica
102,5.8,2.7,5.1,1.9,Iris-virginica
103,7.1,3,5.9,2.1,Iris-virginica
104,6.3,2.9,5.6,1.8,Iris-virginica
105,6.5,3,5.8,2.2,Iris-virginica
106,7.6,3,6.6,2.1,Iris-virginica
107,4.9,2.5,4.5,1.7,Iris-virginica
108,7.3,2.9,6.3,1.8,Iris-virginica
109,6.7,2.5,5.8,1.8,Iris-virginica
110,7.2,3.6,6.1,2.5,Iris-virginica
111,6.5,3.2,5.1,2,Iris-virginica
112,6.4,2.7,5.3,1.9,Iris-virginica
113,6.8,3,5.5,2.1,Iris-virginica
114,5.7,2.5,5,2,Iris-virginica
115,5.8,2.8,5.1,2.4,Iris-virginica
116,6.4,3.2,5.3,2.3,Iris-virginica
117,6.5,3,5.5,1.8,Iris-virginica
118,7.7,3.8,6.7,2.2,Iris-virginica
119,7.7,2.6,6.9,2.3,Iris-virginica
120,6,2.2,5,1.5,Iris-virginica
121,6.9,3.2,5.7,2.3,Iris-virginica
122,5.6,2.8,4.9,2,Iris-virginica
123,7.7,2.8,6.7,2,Iris-virginica
124,6.3,2.7,4.9,1.8,Iris-virginica
125,6.7,3.3,5.7,2.1,Iris-virginica
126,7.2,3.2,6,1.8,Iris-virginica
127,6.2,2.8,4.8,1.8,Iris-virginica
128,6.1,3,4.9,1.8,Iris-virginica
129,6.4,2.8,5.6,2.1,Iris-virginica
130,7.2,3,5.8,1.6,Iris-virginica
131,7.4,2.8,6.1,1.9,Iris-virginica
132,7.9,3.8,6.4,2,Iris-virginica
133,6.4,2.8,5.6,2.2,Iris-virginica
134,6.3,2.8,5.1,1.5,Iris-virginica
135,6.1,2.6,5.6,1.4,Iris-virginica
136,7.7,3,6.1,2.3,Iris-virginica
137,6.3,3.4,5.6,2.4,Iris-virginica
138,6.4,3.1,5.5,1.8,Iris-virginica
139,6,3,4.8,1.8,Iris-virginica
140,6.9,3.1,5.4,2.1,Iris-virginica
141,6.7,3.1,5.6,2.4,Iris-virginica
142,6.9,3.1,5.1,2.3,Iris-virginica
143,5.8,2.7,5.1,1.9,Iris-virginica
144,6.8,3.2,5.9,2.3,Iris-virginica
145,6.7,3.3,5.7,2.5,Iris-virginica
146,6.7,3,5.2,2.3,Iris-virginica
147,6.3,2.5,5,1.9,Iris-virginica
148,6.5,3,5.2,2,Iris-virginica
149,6.2,3.4,5.4,2.3,Iris-virginica
150,5.9,3,5.1,1.8,Iris-virginica

This installment is a good deal harder than the last one. I was learning as I wrote, so errors and gaps are inevitable; if it helps fellow students working through the same material, I am honored. Everyone is very welcome to exchange ideas so we can improve together.
To close, a small reflection:
In this chapter we studied one of the most important topics in NNDL, the basic use of tensors, together with a few related concepts, and we got a first taste of reading, preprocessing, and converting data. A rich harvest. Things will only get tougher from here, and I will do my best to keep up in class. NNDL is a genuinely hard field, but I believe that taking it one solid step at a time, anyone can learn it well. Keep at it!
Here, finally, is my teacher's blog; if anything is unclear you can ask him directly: NNDL专家
