Learning the parameters of a prediction function and testing it on the same data is a methodological mistake: a model that simply repeats the labels of the samples it has just seen would score perfectly, yet it would fail to predict anything useful on data it has not yet seen. This situation is called overfitting. To avoid it, a common practice in a (supervised) machine learning experiment is to hold out part of the available data as a test set, X_test and y_test. Note that the word "experiment" does not imply academic use only; even in commercial settings, machine learning usually begins with experimentation. A typical model-training workflow combines cross-validation with a grid search to determine the best parameters.
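A minimal sketch of holding out a test set with scikit-learn's train_test_split; the data here is synthetic and purely illustrative:

```python
# Minimal sketch of holding out a test set (synthetic data, illustrative only).
import numpy as np
from sklearn.model_selection import train_test_split

X = np.arange(100).reshape(50, 2)   # 50 samples, 2 features
y = np.array([0, 1] * 25)           # balanced binary labels

# Hold out 40% of the samples as the test set; the model never sees them
# during training. random_state makes the shuffle reproducible.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.4, random_state=0, stratify=y)

print(X_train.shape, X_test.shape)  # (30, 2) (20, 2)
```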
Basic idea
The basic idea of cross-validation is to partition the original dataset into groups: one part serves as the training set (train set) and another as the validation set (validation set or test set). The classifier is first trained on the training set, and the trained model is then evaluated on the validation set; this evaluation serves as the performance metric of the classifier.
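For instance, in k-fold cross-validation (sketched here with k = 5 on toy data; the sizes are illustrative), every sample lands in the validation set exactly once:

```python
# 5-fold split: 10 samples -> each fold validates on 2 and trains on 8.
import numpy as np
from sklearn.model_selection import KFold

X = np.arange(20).reshape(10, 2)  # 10 samples, 2 features

kf = KFold(n_splits=5)
folds = list(kf.split(X))  # each element is (train_indices, val_indices)
for i, (train_idx, val_idx) in enumerate(folds):
    print(f"fold {i}: train={len(train_idx)} val={len(val_idx)}")
```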
Purpose
Common forms
Parameter and method details: https://blog.csdn.net/weixin_44883371/article/details/98978046
In practice there are usually many hyperparameters to tune, and doing so by hand is tedious. Instead, several candidate combinations of hyperparameters are predefined for the model, each combination is evaluated with cross-validation, and the best-scoring combination is then used to build the final model.
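As a sketch of this idea (the dataset and candidate values here are illustrative assumptions, not the ones used below), GridSearchCV scores every combination in the grid by cross-validation and keeps the best one:

```python
# Illustrative sketch: grid search over one hyperparameter with 5-fold CV.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)
param_grid = {'C': [0.1, 1.0, 10.0]}  # candidate inverse-regularization strengths

# Every combination in param_grid is scored by 5-fold cross-validation.
search = GridSearchCV(LogisticRegression(max_iter=1000), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)  # the best-scoring combination
print(search.best_score_)   # its mean cross-validated score
```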
Grid search is one way to find the optimal parameters.
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.linear_model import LogisticRegression  # logistic regression
from sklearn.tree import DecisionTreeClassifier  # decision tree classifier
from sklearn.model_selection import cross_val_score  # cross-validation
from sklearn.model_selection import GridSearchCV  # grid search
from sklearn.model_selection import train_test_split  # train/test split
from sklearn.metrics import confusion_matrix,classification_report  # confusion matrix, classification report
from sklearn.metrics import precision_score,recall_score,f1_score,roc_auc_score,roc_curve
# precision, recall, F1 score, AUC, ROC curve
# suppress warnings
import warnings
warnings.filterwarnings('ignore')
# render Chinese characters and the minus sign correctly in matplotlib
plt.rcParams['font.sans-serif'] = ['SimHei']
plt.rcParams['axes.unicode_minus'] = False
# (1) Read the aviation dataset, set MEMBER_NO as the index column, and convert '?' to NaN. (4 pts)
data=pd.read_excel("../datas/aviation.xls",index_col="MEMBER_NO",na_values='?')
# (2) Drop duplicate rows and rows with missing values. (4 pts)
data.drop_duplicates(inplace=True)
data.dropna(inplace=True)
# (3) Randomly sample 500 rows; slice out the features X and the label Y. (4 pts)
# Hyperparameter tuning is slow, so it is done on this small subset.
data = data.sample(n=500)
x = data.iloc[:,:-1]
y = data.iloc[:,-1]
# (4) Use 10-fold cross-validation to compare the performance of
# logistic regression and a decision tree, scored by F1. (5 pts)
lr = LogisticRegression()
dt = DecisionTreeClassifier()
score_lr = cross_val_score(lr,x,y,cv=10,scoring='f1')  # 10-fold F1 for logistic regression
score_dt = cross_val_score(dt,x,y,cv=10,scoring='f1')  # 10-fold F1 for the decision tree
print('Logistic regression mean F1:',score_lr.mean())
print('Decision tree mean F1:',score_dt.mean())
# (5) Use grid search to tune the hyperparameters of the algorithm with the higher F1 score in (4). (4 pts)
parameters = {
    'C':[0.1,0.5,10,50,100],
    'max_iter':[1,5,50,200,500]
}
g = GridSearchCV(lr,param_grid=parameters,cv=2,scoring='f1')
g.fit(x,y)
print(g.best_params_)
print('Logistic regression F1 score',(g.best_score_)*100,'%')
print()
parameters1 = {
    'max_depth':[1,5,10,50,100],  # max_depth must be a positive integer
    'min_samples_leaf':[1,5,50,200,500]
}
g1 = GridSearchCV(dt,param_grid=parameters1,cv=2,scoring='f1')
g1.fit(x,y)
print(g1.best_params_)
print('Decision tree F1 score',(g1.best_score_)*100,'%')
# (6) Build a model with the best algorithm and best parameters found in (4) and (5). (4 pts)
lr1 = LogisticRegression(C=g.best_params_.get('C'),max_iter=g.best_params_.get('max_iter'))
# (7) Split the whole dataset (the full sample) 6:4. (4 pts)
train_x,test_x,train_y,test_y = train_test_split(x,y,test_size=0.4,random_state=7)
# (8) Train the model on the training set, predict the test set, and print the confusion matrix. (4 pts)
lr1.fit(train_x,train_y)
h = lr1.predict(test_x)
h1 = lr1.predict_proba(test_x)
print('\nPredicted values')
print(h)
print('\nConfusion matrix')
print(confusion_matrix(y_true=test_y,y_pred=h))
# (9) Print the precision, recall, F1 score, and AUC, and plot the ROC curve. (5 pts)
print('\nPrecision',precision_score(y_true=test_y,y_pred=h))
print('Recall',recall_score(y_true=test_y,y_pred=h))
print('F1 score',f1_score(y_true=test_y,y_pred=h))
print('AUC',roc_auc_score(y_true=test_y,y_score=h1[:,-1]))
fpr,tpr,thresholds = roc_curve(y_true=test_y,y_score=h1[:,-1])
plt.plot(fpr,tpr)
plt.show()
Output
{'C': 10, 'max_iter': 200}
Logistic regression F1 score 96.8 %
{'max_depth': 50, 'min_samples_leaf': 1}
Decision tree F1 score 92.4 %
Predicted values
[0 0 0 1 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 1 1 1 0 1 0 0 0 0 1 0 0 1 0 1 1 0 0
0 1 1 0 0 0 0 0 0 0 0 1 1 1 0 0 1 0 1 0 0 1 0 1 1 1 1 1 1 1 1 1 0 1 0 0 0
0 1 1 1 0 1 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 1 1 0 1 1 0 0 1 1 1 0 0 0 1 1 0
0 0 0 1 1 0 0 0 0 0 0 0 0 1 0 0 0 1 1 0 1 1 1 1 1 1 1 0 0 0 1 1 0 0 1 1 0
0 0]
Confusion matrix
[[86 3]
[ 3 58]]
Precision 0.9508196721311475
Recall 0.9508196721311475
F1 score 0.9508196721311475
AUC 0.9952109044022841
A sample of the dataset
MEMBER_NO DAYS_FROM_LAST_TO_END DAYS_FROM_BEGIN_TO_FIRST FFP_TIER age FLIGHT_COUNT FLIGHT_COUNT_QTR_1 FLIGHT_COUNT_QTR_2 FLIGHT_COUNT_QTR_3 FLIGHT_COUNT_QTR_4 FLIGHT_COUNT_QTR_5 FLIGHT_COUNT_QTR_6 FLIGHT_COUNT_QTR_7 FLIGHT_COUNT_QTR_8 BASE_POINTS_SUM BASE_POINTS_SUM_QTR_1 BASE_POINTS_SUM_QTR_2 BASE_POINTS_SUM_QTR_3 BASE_POINTS_SUM_QTR_4 BASE_POINTS_SUM_QTR_5 BASE_POINTS_SUM_QTR_6 BASE_POINTS_SUM_QTR_7 BASE_POINTS_SUM_QTR_8 ELITE_POINTS_SUM_YR_1 ELITE_POINTS_SUM_YR_2 EXPENSE_SUM_YR_1 EXPENSE_SUM_YR_2 SEG_KM_SUM WEIGHTED_SEG_KM AVG_FLIGHT_COUNT AVG_BASE_POINTS_SUM AVG_FLIGHT_INTERVAL MAX_FLIGHT_INTERVAL MILEAGE_IN_COUNT ADD_POINTS_SUM_YR_1 ADD_POINTS_SUM_YR_2 EXCHANGE_COUNT Avg_Discount P1Y_Flight_Count L1Y_Flight_Count P1Y_BASE_POINTS_SUM L1Y_BASE_POINTS_SUM ELITE_POINTS_SUM ADD_POINTS_SUM Eli_Add_Point_Sum L1Y_ELi_Add_Points Points_Sum L1Y_Points_Sum Ration_L1Y_Flight_Count Ration_P1Y_Flight_Count Ration_P1Y_BPS Ration_L1Y_BPS Point_Chg_NotFlight FFP_DAYS runoff_flag
‘00024549 301 19 4 27 3 1 1 0 0 1 0 0 0 2187 646 895 0 0 646 0 0 0 0 0 1570 621 2360 2291 0.375 273.375 179 322 0 0 0 0 0.970762712 2 1 1541 646 0 0 0 0 2187 646 0.333333333 0.666666667 0.704296161 0.295246801 0 678 0
‘00048301 6 199 5 31 42 0 0 1 8 8 5 13 7 36325 0 0 500 7440 5575 4117 10963 7730 0 3312 8835 31502 49407 40804.44 5.25 4540.625 12.82926829 81 0 0 0 0 0.825883782 9 33 7940 28385 3312 0 3312 3312 39637 31697 0.785714286 0.214285714 0.218576226 0.781396245 0 1847 0
‘00033467 135 257 4 22 5 0 0 0 0 1 2 2 0 3616 0 0 0 0 866 1212 1538 0 0 0 0 2378 7689 5133.25 0.714285714 516.5714286 51.75 104 0 0 0 0 0.667609572 0 5 0 3616 0 0 0 0 3616 3616 1 0 0 0.999723528 0 599 1
‘00040195 78 16 4 70 14 4 2 0 0 0 3 3 2 8292 2236 1124 0 0 0 353 1908 2671 0 0 3955 4194 13194 8220.75 1.75 1036.5 49 352 1 0 3000 4 0.623067303 6 8 3360 4932 0 3000 3000 3000 11292 7932 0.571428571 0.428571429 0.405160979 0.594718437 5 3363 0
‘00048412 283 166 4 37 6 0 2 1 1 2 0 0 0 4273 0 500 563 1784 1426 0 0 0 0 0 3751 2284 9209 6652.18 0.75 534.125 56.4 132 0 0 0 2 0.72235639 4 2 2847 1426 0 0 0 0 4273 1426 0.333333333 0.666666667 0.66612073 0.333645297 2 1713 1
‘00028811 66 0 5 34 10 0 2 1 0 2 3 1 1 16233 0 4393 1198 0 2599 3542 2950 1551 0 0 5900 8082 14299 16789.6 1.428571429 2319 58.55555556 185 0 0 0 0 1.174180013 3 7 5591 10642 0 0 0 0 16233 10642 0.7 0.3 0.344400641 0.65553776 0 593 0
‘00010615 368 26 4 39 3 1 0 0 2 0 0 0 0 2423 515 0 0 1908 0 0 0 0 0 0 2445 0 4501 2632.35 0.375 302.875 168.5 279 0 0 0 0 0.584836703 3 0 2423 0 0 0 0 0 2423 0 0 1 0.999587459 0 0 882 1
‘00042913 5 0 4 33 40 0 5 13 3 4 2 7 6 26606 0 3230 8398 1938 2584 1292 5918 3246 0 0 14490 11176 29191 27516.11 5.714285714 3800.857143 15.05128205 70 0 0 0 0 0.942623069 21 19 13566 13040 0 0 0 0 26606 13040 0.475 0.525 0.509865825 0.490096591 0 592 0
‘00023616 510 58 4 45 6 1 0 5 0 0 0 0 0 7550 484 0 7066 0 0 0 0 0 0 0 4674 0 16991 8831.6 0.75 943.75 32.6 126 0 0 0 0 0.519781061 6 0 7550 0 0 0 0 0 7550 0 0 1 0.999867567 0 0 1052 1
‘00010878 136 1 4 38 5 0 0 2 2 0 0 1 0 3514 0 0 2025 1489 0 0 0 0 0 0 3300 486 11390 4452.68 0.833333333 585.6666667 84 268 0 0 0 0 0.390928885 4 1 3514 0 0 0 0 0 3514 0 0.2 0.8 0.999715505 0 0 473 0
‘00017973 9 1 6 54 33 9 4 5 2 1 3 2 7 34261 6103 4324 3290 1527 1034 2547 4407 11029 0 0 15610 16168 46736 36444.82 4.125 4282.625 22.53125 95 0 0 0 2 0.779801866 20 13 15244 19017 0 0 0 0 34261 19017 0.393939394 0.606060606 0.444924406 0.555046407 2 2287 0
‘00047039 94 108 4 25 2 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 986 3877 1123.76 0.4 0 181 181 0 0 0 0 0.289852979 0 2 0 0 0 0 0 0 0 0 1 0 0 0 0 383 0
‘00029827 214 193 4 40 6 0 0 3 2 0 1 0 0 2727 0 0 1331 968 0 428 0 0 0 0 2369 617 7772 4628.1 0.75 340.875 64.8 175 0 0 0 0 0.595483788 5 1 2299 428 0 0 0 0 2727 428 0.166666667 0.833333333 0.842741935 0.156891496 0 2284 1
‘00034132 28 0 4 41 10 0 0 2 0 2 1 2 3 7371 0 0 1161 0 1415 1291 1257 2247 0 0 1385 6165 9954 7693.39 1.666666667 1228.5 51.33333333 141 0 0 0 0 0.772894314 2 8 1161 6210 0 0 0 0 7371 6210 0.8 0.2 0.157487792 0.84237656 0 490 0
‘00007419 227 0 4 44 2 0 0 0 1 0 1 0 0 2356 0 0 0 448 0 1908 0 0 0 0 649 1564 3046 2403.09 0.4 471.2 173 173 0 0 0 0 0.788933027 1 1 448 1908 0 0 0 0 2356 1908 0.5 0.5 0.190072126 0.809503606 0 400 1
‘00045772 1 210 6 57 57 0 0 1 8 10 10 12 16 111196 0 0 434 16363 26644 15878 24296 27581 0 15247 14001 94940 90553 123762.79 7.125 13899.5 9.285714286 88 40 5135 16603 0 1.366744227 9 48 16797 94399 15247 21738 36985 31850 148181 126249 0.842105263 0.157894737 0.151056234 0.848934773 40 2857 0
‘00062840 209 228 4 33 2 0 0 1 0 0 1 0 0 2373 0 0 1114 0 0 1259 0 0 0 0 1651 1240 6613 3739.41 0.25 296.625 294 294 0 0 0 0 0.565463481 1 1 1114 1259 0 0 0 0 2373 1259 0.5 0.5 0.469250211 0.530328559 0 2149 0
‘00018822 8 24 6 46 39 4 2 3 7 13 2 2 6 37340 1352 976 3295 8862 15710 1124 1646 4375 0 983 14763 23287 46760 39694.19 4.875 4667.5 18.39473684 77 1 0 0 0 0.848892002 16 23 14485 22855 983 0 983 983 38323 23838 0.58974359 0.41025641 0.387911411 0.612061809 1 2878 0
‘00046151 21 20 5 61 47 4 6 5 7 5 9 5 6 31648 2298 4027 2502 6953 2655 7137 2655 3421 0 2276 19253 22081 69599 43709.4 5.875 3956 15 51 0 0 0 1 0.628017644 22 25 15780 15868 2276 0 2276 2276 33924 18144 0.531914894 0.468085106 0.498593952 0.501374451 1 2575 0
‘00013626 173 498 4 39 3 0 0 0 0 0 2 1 0 3069 0 0 0 0 0 1614 1455 0 0 0 0 2056 4708 3557.41 0.375 383.625 30 58 0 0 0 0 0.755609601 0 3 0 3069 0 0 0 0 3069 3069 1 0 0 0.999674267 0 1431 1
‘00052414 247 0 4 33 4 0 0 0 2 1 1 0 0 1751 0 0 0 988 0 763 0 0 0 0 1560 1840 7718 3848.71 0.8 350.2 63.33333333 107 0 0 0 0 0.498666753 2 2 988 763 0 0 0 0 1751 763 0.5 0.5 0.563926941 0.435502283 0 437 1
‘00062283 34 2 4 34 15 4 2 0 1 1 0 6 1 13065 3064 1185 0 1138 0 0 7478 200 0 0 4447 7124 27024 15243.93 1.875 1633.125 49.64285714 271 2 0 3411 0 0.564088588 7 8 5387 7678 0 3411 3411 3411 16476 11089 0.533333333 0.466666667 0.412291443 0.587632022 2 1177 0
‘00021436 99 501 4 43 3 0 0 0 0 0 2 1 0 1000 0 0 0 0 0 1000 0 0 0 0 0 1651 2358 1425.36 0.375 125 65.5 111 0 0 0 0 0.604478372 0 3 0 1000 0 0 0 0 1000 1000 1 0 0 0.999000999 0 2125 0
‘00009253 268 62 4 39 9 1 2 4 0 0 2 0 0 14237 646 1000 1572 0 0 11019 0 0 0 0 2810 7120 25238 15730.25 1.125 1779.625 50.125 234 0 0 0 0 0.623276409 7 2 3218 11019 0 0 0 0 14237 11019 0.222222222 0.777777778 0.22601489 0.773914876 0 930 1
‘00004314 23 32 4 52 21 1 8 4 1 1 3 0 3 21040 0 10454 2110 1056 562 5573 0 1285 0 0 14359 7870 34487 23999.01 2.625 2630 33.8 161 0 0 0 1 0.695885696 14 7 13620 7420 0 0 0 0 21040 7420 0.333333333 0.666666667 0.647307637 0.352644836 1 1662 0
‘00001973 49 53 4 30 6 0 0 0 0 3 0 2 1 2677 0 0 0 0 1329 0 786 562 0 0 0 3101 3972 2793.77 1.2 535.4 56.2 120 2 0 11000 0 0.703366062 0 6 0 2677 0 11000 11000 11000 13677 13677 1 0 0 0.999626587 2 383 0
‘00019129 18 53 5 40 44 5 4 0 4 4 8 9 10 39015 5362 4407 0 3102 3706 5251 7429 9758 0 2238 12664 26366 50974 40248.1 5.5 4876.875 15.34883721 109 2 0 2000 1 0.789580963 13 31 12871 26144 2238 2000 4238 4238 43253 30382 0.704545455 0.295454545 0.329890301 0.670084068 3 2672 0
‘00037605 36 695 4 31 2 0 0 0 0 0 0 0 2 1292 0 0 0 0 0 0 0 1292 0 0 0 1131 1380 1131.6 0.25 161.5 0 0 0 0 0 0 0.82 0 2 0 1292 0 0 0 0 1292 1292 1 0 0 0.999226605 0 896 1
‘00006884 22 0 4 26 12 0 6 0 2 2 1 0 1 7703 0 4578 0 1924 0 739 0 462 0 0 8387 2569 21346 12147.22 1.714285714 1100.428571 54.36363636 184 0 0 0 0 0.569063056 8 4 6502 1201 0 0 0 0 7703 1201 0.333333333 0.666666667 0.843977155 0.155893043 0 620 0
‘00006500 201 0 4 48 3 0 1 0 0 0 2 0 0 3725 0 815 0 0 0 2910 0 0 0 0 640 0 4272 3789.84 0.428571429 532.1428571 206.5 411 0 0 0 0 0.887134831 1 2 815 2910 0 0 0 0 3725 2910 0.666666667 0.333333333 0.218733226 0.78099839 0 614 0
‘00018939 705 10 4 73 2 2 0 0 0 0 0 0 0 2592 2592 0 0 0 0 0 0 0 0 0 2662 0 4137 2859.6 0.25 324 16 16 0 0 0 0 0.691225526 2 0 2592 0 0 0 0 0 2592 0 0 1 0.999614346 0 0 2795 1
‘00036790 16 1 4 42 11 2 4 0 1 0 0 2 2 8293 1242 5268 0 1020 0 0 763 0 0 0 7274 3373 21191 12983.39 1.375 1036.625 71.4 221 0 0 0 3 0.612684158 7 4 7530 763 0 0 0 0 8293 763 0.363636364 0.636363636 0.907885218 0.091994213 3 2853 0
‘00018989 65 12 4 35 30 8 6 4 3 2 6 0 1 39720 12150 8620 4332 2289 2671 9158 0 500 0 0 26033 10671 51007 41664.61 3.75 4965 22.55172414 138 28 8840 6650 5 0.816841022 21 9 27391 12329 0 15490 15490 6650 55210 18979 0.3 0.7 0.689584854 0.31038997 33 2739 0
‘00001941 248 127 4 48 5 0 0 0 0 1 4 0 0 1705 0 0 0 0 213 1492 0 0 0 0 0 2098 2655 1768.23 1 341 11.25 19 0 0 0 0 0.666 0 5 0 1705 0 0 0 0 1705 1705 1 0 0 0.999413834 0 420 1
‘00014373 2 467 4 41 3 0 0 0 0 0 1 0 2 2456 0 0 0 0 0 1178 0 1278 0 0 0 2784 6863 3743.09 0.375 307 131 260 0 0 0 0 0.545401428 0 3 0 2456 0 0 0 0 2456 2456 1 0 0 0.999593 0 1023 0
‘00054320 492 47 4 48 5 3 0 2 0 0 0 0 0 3049 1757 0 1292 0 0 0 0 0 0 0 3400 0 4458 3470.65 0.625 381.125 48 150 0 0 0 0 0.778521759 5 0 3049 0 0 0 0 0 3049 0 0 1 0.999672131 0 0 974 1
‘00026549 189 100 4 24 3 0 2 0 0 0 1 0 0 2658 0 2658 0 0 0 0 0 0 0 0 2272 1079 11508 4526.48 0.375 332.25 221 389 1 1500 0 0 0.393333333 2 1 2658 0 0 1500 1500 0 4158 0 0.333333333 0.666666667 0.999623919 0 1 980 0
‘00051124 10 0 4 35 8 0 0 1 2 0 0 1 4 12609 0 0 618 3816 0 0 1908 6267 0 0 4030 7036 13394 12575.76 1.333333333 2101.5 67.85714286 318 0 0 0 0 0.93890996 3 5 4434 8175 0 0 0 0 12609 8175 0.625 0.375 0.351625694 0.648295004 0 485 0
‘00024557 57 112 4 42 7 0 0 0 2 2 0 1 2 4300 0 0 0 1292 1292 0 200 1516 0 0 1380 3147 5329 4597.61 1.166666667 716.6666667 61.33333333 143 0 0 0 0 0.862752862 2 5 1292 3008 0 0 0 0 4300 3008 0.714285714 0.285714286 0.300395257 0.699372239 0 537 0
‘00001711 72 22 4 29 22 1 7 3 2 4 2 1 2 9865 270 4067 1769 946 1486 787 270 270 0 0 8341 818 18281 13132.26 2.75 1233.125 27.19047619 113 0 0 0 0 0.71835567 13 9 7052 2813 0 0 0 0 9865 2813 0.409090909 0.590909091 0.714778026 0.285120616 0 665 0
‘00019751 18 71 5 33 13 2 2 0 0 3 0 4 2 150850 30242 30242 0 0 45363 0 30664 14339 0 18224 25200 43093 126495 128334.98 1.625 18856.25 53.5 262 2 0 500 3 1.014545871 4 9 60484 90366 18224 500 18724 18724 169574 109090 0.692307692 0.307692308 0.400951933 0.599041438 5 1055 0
‘00022884 208 12 4 36 4 1 1 1 0 0 1 0 0 5250 1198 874 1211 0 0 1967 0 0 0 0 2170 1610 5570 5403.84 0.5 656.25 170.3333333 332 0 0 0 0 0.970168761 3 1 3283 1967 0 0 0 0 5250 1967 0.25 0.75 0.625214245 0.374595315 0 1133 0
‘00050048 1 50 6 37 110 4 15 13 12 14 20 20 12 59275 1544 6892 7269 8133 7458 12878 10036 5065 0 11174 29292 41770 94212 71728.71 13.75 7409.375 6.23853211 37 5 0 5021 5 0.761354286 44 66 23838 35437 11174 5021 16195 16195 75470 51632 0.6 0.4 0.402152642 0.597830488 10 1119 0
‘00055062 45 8 6 54 10 2 0 2 0 2 1 0 3 11699 3510 0 1572 0 3300 284 0 3033 0 0 5365 6166 17182 12989.16 1.25 1462.375 75.33333333 225 0 0 0 0 0.755974857 4 6 5082 6617 0 0 0 0 11699 6617 0.6 0.4 0.434358974 0.565555556 0 2637 0
‘00033453 13 271 4 30 21 0 0 0 4 5 3 6 3 19623 0 0 0 5053 3710 2739 6872 1249 0 0 5033 19708 49712 29992.63 2.625 2452.875 21.45 57 0 0 0 0 0.603327768 4 17 5053 14570 0 0 0 0 19623 14570 0.80952381 0.19047619 0.257490828 0.742458214 0 713 0
‘00033841 240 0 4 40 9 0 2 0 2 0 5 0 0 12487 0 1748 0 2509 0 8230 0 0 0 0 4410 4771 15181 11924 1.285714286 1783.857143 42.25 203 0 0 0 0 0.785455504 4 5 4257 8230 0 0 0 0 12487 8230 0.555555556 0.444444444 0.340887252 0.659032671 0 578 1
‘00054682 20 569 4 42 3 0 0 0 0 0 0 2 1 2296 0 0 0 0 0 0 2296 0 0 0 0 2465 5122 2765.49 0.375 287 71 86 0 0 0 0 0.539923858 0 3 0 2296 0 0 0 0 2296 2296 1 0 0 0.99956465 0 2946 0
‘00020766 52 88 4 74 9 1 0 1 0 3 2 1 1 5432 786 0 0 0 1537 1537 786 786 0 0 1226 6196 17025 8451.76 1.125 679 73.875 167 1 1500 0 0 0.496432305 2 7 786 4646 0 1500 1500 0 6932 4646 0.777777778 0.222222222 0.144671452 0.855144487 1 2383 0
‘00041811 142 22 4 51 17 3 6 3 1 2 0 2 0 10799 2671 4015 1526 0 0 0 2587 0 0 0 10430 3247 31073 15558.74 2.125 1349.875 35.4375 212 0 0 0 0 0.500715734 13 4 8212 2587 0 0 0 0 10799 2587 0.235294118 0.764705882 0.76037037 0.239537037 0 813 0