Running scGAC

Files produced by a run

1. Preprocessing: process.py generates data.tsv from ori_data.tsv
2. Main script: running scGAC.py produces, for example:
hidden_Biase.tsv: the hidden-layer embedding of the graph attentional autoencoder; this matrix is the input to the clustering step
NE_Biase.csv: the matrix after network enhancement (NE) denoising
pred_Biase.txt: the predicted cluster assignments
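As a hedged illustration of how these outputs might be consumed (the exact file layouts are assumptions based on the names above: a tab-separated cells-by-dimensions matrix in hidden_*.tsv, one predicted label per cell in pred_*.txt), tiny synthetic stand-ins are written first so the snippet is self-contained:

```python
import numpy as np

# Synthetic stand-ins for the real scGAC outputs (shapes are illustrative only).
rng = np.random.default_rng(0)
np.savetxt("hidden_demo.tsv", rng.random((49, 64)), delimiter="\t")
np.savetxt("pred_demo.txt", np.array([0, 0, 1, 2, 1]), fmt="%d")

# The embedding feeds downstream analysis; the labels are the clustering result.
hidden = np.loadtxt("hidden_demo.tsv", delimiter="\t")  # cells x hidden dims
labels = np.loadtxt("pred_demo.txt", dtype=int)         # one cluster id per cell

print(hidden.shape)        # (49, 64)
print(np.unique(labels))   # [0 1 2]
```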

Running

  1. View the available options
    Run scGAC.py with the -h flag:
usage: scGAC.py [-h] [--subtype_path SUBTYPE_PATH] [--k K] [--is_NE IS_NE]
                [--PCA_dim PCA_DIM] [--F1 F1] [--F2 F2]
                [--n_attn_heads N_ATTN_HEADS] [--dropout_rate DROPOUT_RATE]
                [--l2_reg L2_REG] [--learning_rate LEARNING_RATE]
                [--pre_lr PRE_LR] [--pre_epochs PRE_EPOCHS] [--epochs EPOCHS]
                [--c1 C1] [--c2 C2]
                dataset_str n_clusters

positional arguments:
  dataset_str           name of dataset
  n_clusters            expected number of clusters

optional arguments:
  -h, --help            show this help message and exit
  --subtype_path SUBTYPE_PATH
                        path of true labels for evaluation of ARI and NMI
  --k K                 number of neighbors to construct the cell graph
  --is_NE IS_NE         use NE denoise the cell graph or not
  --PCA_dim PCA_DIM     dimensionality of input feature matrix that
                        transformed by PCA
  --F1 F1               number of neurons in the 1-st layer of encoder
  --F2 F2               number of neurons in the 2-nd layer of encoder
  --n_attn_heads N_ATTN_HEADS
                        number of heads for attention
  --dropout_rate DROPOUT_RATE
                        dropout rate of neurons in autoencoder
  --l2_reg L2_REG       coefficient for L2 regularizition
  --learning_rate LEARNING_RATE
                        learning rate for training
  --pre_lr PRE_LR       learning rate for pre-training
  --pre_epochs PRE_EPOCHS
                        number of epochs for pre-training
  --epochs EPOCHS       number of epochs for training
  --c1 C1               weight of reconstruction loss
  --c2 C2               weight of clustering loss

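The interface above can be reproduced with a short argparse sketch; the defaults below are placeholders rather than scGAC's actual defaults, and only a subset of the options is shown:

```python
import argparse

def build_parser():
    # Mirrors the help text above; default values here are placeholders.
    p = argparse.ArgumentParser(prog="scGAC.py")
    p.add_argument("dataset_str", help="name of dataset")
    p.add_argument("n_clusters", type=int, help="expected number of clusters")
    p.add_argument("--subtype_path", default=None,
                   help="path of true labels for evaluation of ARI and NMI")
    p.add_argument("--k", type=int, default=None,
                   help="number of neighbors to construct the cell graph")
    p.add_argument("--PCA_dim", type=int, default=512,
                   help="dimensionality of the PCA-transformed input")
    return p

# Same invocation as in the Biase walkthrough below.
args = build_parser().parse_args(["Biase", "3", "--k", "4"])
print(args.dataset_str, args.n_clusters, args.k)  # Biase 3 4
```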
  2. Biase dataset
    (1) Run process.py
    (screenshot: process.py output)
    (2) Run scGAC.py
    Input arguments: Biase 3 (n_clusters)
data/Biase/data.tsv Biase 512 True 3 None
NE
Shape after transformation: (49, 512)
Pre-process: run time is 0.00  minutes

Pre-train: run time is 0.08  minutes
--------------------------------
Kmeans start, with data shape of (49, 64)
Kmeans end
--------------------------------
Iter: 0 , sil_hid: 0.917 , delta_label 0.0 , loss: 0
Iter: 2 , sil_hid: 0.923 , delta_label 0.0 , loss: [1.17 0.5  0.67 0.09]
Iter: 4 , sil_hid: 0.93 , delta_label 0.0 , loss: [ 1.12  0.5   0.61 -0.04]
Iter: 6 , sil_hid: 0.934 , delta_label 0.0 , loss: [1.11 0.51 0.59 0.09]
Iter: 8 , sil_hid: 0.938 , delta_label 0.0 , loss: [1.13 0.51 0.62 0.11]
Iter: 10 , sil_hid: 0.94 , delta_label 0.0 , loss: [ 1.09  0.51  0.58 -0.04]
Iter: 12 , sil_hid: 0.94 , delta_label 0.0 , loss: [ 1.05  0.5   0.56 -0.01]
Iter: 14 , sil_hid: 0.94 , delta_label 0.0 , loss: [ 1.04  0.5   0.54 -0.02]
Iter: 16 , sil_hid: 0.939 , delta_label 0.0 , loss: [1.08 0.51 0.57 0.02]
Iter: 18 , sil_hid: 0.937 , delta_label 0.0 , loss: [ 1.05  0.51  0.54 -0.07]
Iter: 20 , sil_hid: 0.938 , delta_label 0.0 , loss: [ 1.05  0.5   0.55 -0.2 ]
Iter: 22 , sil_hid: 0.94 , delta_label 0.0 , loss: [ 1.02  0.48  0.54 -0.1 ]
Iter: 24 , sil_hid: 0.941 , delta_label 0.0 , loss: [ 1.03  0.49  0.54 -0.1 ]
Iter: 26 , sil_hid: 0.942 , delta_label 0.0 , loss: [ 1.04  0.49  0.55 -0.08]
Iter: 28 , sil_hid: 0.943 , delta_label 0.0 , loss: [ 1.04  0.5   0.54 -0.17]
Iter: 30 , sil_hid: 0.945 , delta_label 0.0 , loss: [ 1.05  0.49  0.55 -0.15]
Iter: 32 , sil_hid: 0.947 , delta_label 0.0 , loss: [ 1.    0.47  0.53 -0.16]
Iter: 34 , sil_hid: 0.948 , delta_label 0.0 , loss: [ 1.03  0.48  0.55 -0.12]
Iter: 36 , sil_hid: 0.948 , delta_label 0.0 , loss: [ 1.03  0.49  0.54 -0.08]
Iter: 38 , sil_hid: 0.948 , delta_label 0.0 , loss: [ 1.01  0.48  0.53 -0.07]
Iter: 40 , sil_hid: 0.947 , delta_label 0.0 , loss: [ 1.04  0.48  0.55 -0.02]
Iter: 42 , sil_hid: 0.947 , delta_label 0.0 , loss: [ 1.04  0.48  0.56 -0.06]
Iter: 44 , sil_hid: 0.946 , delta_label 0.0 , loss: [ 1.04  0.47  0.57 -0.15]
Iter: 46 , sil_hid: 0.943 , delta_label 0.0 , loss: [ 1.02  0.49  0.53 -0.12]
Iter: 48 , sil_hid: 0.941 , delta_label 0.0 , loss: [ 0.97  0.47  0.5  -0.19]
Iter: 50 , sil_hid: 0.94 , delta_label 0.0 , loss: [ 0.98  0.47  0.51 -0.09]
Iter: 52 , sil_hid: 0.941 , delta_label 0.0 , loss: [ 1.06  0.48  0.58 -0.18]
Iter: 54 , sil_hid: 0.942 , delta_label 0.0 , loss: [ 1.01  0.47  0.54 -0.09]
Iter: 56 , sil_hid: 0.944 , delta_label 0.0 , loss: [ 1.01  0.47  0.54 -0.13]
Iter: 58 , sil_hid: 0.946 , delta_label 0.0 , loss: [ 1.    0.47  0.53 -0.11]
Iter: 60 , sil_hid: 0.947 , delta_label 0.0 , loss: [ 1.04  0.47  0.56 -0.1 ]
Iter: 62 , sil_hid: 0.948 , delta_label 0.0 , loss: [ 1.03  0.48  0.55 -0.05]
Iter: 64 , sil_hid: 0.948 , delta_label 0.0 , loss: [ 1.01  0.48  0.53 -0.09]
Iter: 66 , sil_hid: 0.948 , delta_label 0.0 , loss: [ 1.03  0.49  0.54 -0.07]
Iter: 68 , sil_hid: 0.948 , delta_label 0.0 , loss: [ 0.99  0.46  0.53 -0.07]
Iter: 70 , sil_hid: 0.947 , delta_label 0.0 , loss: [ 1.    0.46  0.55 -0.1 ]
Iter: 72 , sil_hid: 0.948 , delta_label 0.0 , loss: [ 1.01  0.47  0.55 -0.07]
Iter: 74 , sil_hid: 0.949 , delta_label 0.0 , loss: [ 0.99  0.46  0.53 -0.11]
Iter: 76 , sil_hid: 0.949 , delta_label 0.0 , loss: [ 1.    0.47  0.53 -0.16]
Iter: 78 , sil_hid: 0.95 , delta_label 0.0 , loss: [ 1.    0.46  0.53 -0.17]
Iter: 80 , sil_hid: 0.95 , delta_label 0.0 , loss: [ 1.    0.46  0.54 -0.14]
Iter: 82 , sil_hid: 0.951 , delta_label 0.0 , loss: [ 0.99  0.47  0.52 -0.14]
Iter: 84 , sil_hid: 0.951 , delta_label 0.0 , loss: [ 1.03  0.48  0.55 -0.19]
Iter: 86 , sil_hid: 0.951 , delta_label 0.0 , loss: [ 0.99  0.46  0.53 -0.11]
Iter: 88 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 1.01  0.48  0.53 -0.08]
Iter: 90 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 1.02  0.47  0.55 -0.11]
Iter: 92 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 1.03  0.47  0.56 -0.11]
Iter: 94 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 1.    0.46  0.54 -0.14]
Iter: 96 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 0.98  0.46  0.51 -0.1 ]
Iter: 98 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 1.    0.48  0.52 -0.11]
Iter: 100 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 0.99  0.47  0.52 -0.06]
Iter: 102 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 0.97  0.46  0.51 -0.12]
Iter: 104 , sil_hid: 0.951 , delta_label 0.0 , loss: [ 0.95  0.47  0.49 -0.18]
Iter: 106 , sil_hid: 0.951 , delta_label 0.0 , loss: [ 0.98  0.47  0.51 -0.16]
Iter: 108 , sil_hid: 0.951 , delta_label 0.0 , loss: [ 0.98  0.47  0.51 -0.1 ]
Iter: 110 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 0.99  0.46  0.54 -0.2 ]
Iter: 112 , sil_hid: 0.953 , delta_label 0.0 , loss: [ 0.98  0.47  0.5  -0.13]
Iter: 114 , sil_hid: 0.953 , delta_label 0.0 , loss: [ 1.    0.46  0.54 -0.16]
Iter: 116 , sil_hid: 0.954 , delta_label 0.0 , loss: [ 0.98  0.45  0.52 -0.09]
Iter: 118 , sil_hid: 0.954 , delta_label 0.0 , loss: [ 0.99  0.45  0.53 -0.15]
Stop early at 118 epoch
Train: run time is 0.07  minutes
Done.
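In the logs, delta_label is the fraction of cells whose cluster assignment changed since the previous check, and training stops early once the labels stabilize. A minimal sketch of that quantity (the exact stopping rule scGAC applies is an assumption):

```python
import numpy as np

def delta_label(new_labels, old_labels):
    # Fraction of cells whose cluster assignment changed between two checks.
    new_labels = np.asarray(new_labels)
    old_labels = np.asarray(old_labels)
    return float(np.mean(new_labels != old_labels))

print(delta_label([0, 0, 1, 2], [0, 0, 1, 2]))  # 0.0  -> labels are stable
print(delta_label([0, 1, 1, 2], [0, 0, 1, 2]))  # 0.25 -> one of four cells moved
```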

Input arguments: Biase 4 (n_clusters)

data/Biase/data.tsv Biase 512 True 4 None
NE
Shape after transformation: (49, 512)
Pre-process: run time is 0.00  minutes

Pre-train: run time is 0.07  minutes
--------------------------------
Kmeans start, with data shape of (49, 64)
Kmeans end
--------------------------------
Iter: 0 , sil_hid: 0.726 , delta_label 0.0 , loss: 0
Iter: 2 , sil_hid: 0.725 , delta_label 0.163 , loss: [1.26 0.52 0.74 0.12]
Iter: 4 , sil_hid: 0.719 , delta_label 0.0 , loss: [1.18 0.52 0.67 0.21]
Iter: 6 , sil_hid: 0.747 , delta_label 0.082 , loss: [1.16 0.51 0.66 0.22]
Iter: 8 , sil_hid: 0.731 , delta_label 0.02 , loss: [1.11 0.51 0.61 0.18]
Iter: 10 , sil_hid: 0.736 , delta_label 0.041 , loss: [1.07 0.52 0.55 0.03]
Iter: 12 , sil_hid: 0.734 , delta_label 0.02 , loss: [ 1.07  0.51  0.55 -0.02]
Iter: 14 , sil_hid: 0.756 , delta_label 0.082 , loss: [ 1.02  0.5   0.52 -0.01]
Iter: 16 , sil_hid: 0.759 , delta_label 0.0 , loss: [1.04 0.5  0.53 0.02]
Iter: 18 , sil_hid: 0.758 , delta_label 0.0 , loss: [1.07 0.5  0.56 0.07]
Iter: 20 , sil_hid: 0.761 , delta_label 0.0 , loss: [ 1.05  0.49  0.56 -0.05]
Iter: 22 , sil_hid: 0.738 , delta_label 0.061 , loss: [ 1.06  0.51  0.55 -0.09]
Iter: 24 , sil_hid: 0.731 , delta_label 0.0 , loss: [1.03 0.5  0.54 0.03]
Iter: 26 , sil_hid: 0.76 , delta_label 0.061 , loss: [ 1.1   0.51  0.59 -0.14]
Iter: 28 , sil_hid: 0.765 , delta_label 0.0 , loss: [ 1.05  0.51  0.54 -0.06]
Iter: 30 , sil_hid: 0.764 , delta_label 0.0 , loss: [ 1.    0.49  0.52 -0.03]
Iter: 32 , sil_hid: 0.758 , delta_label 0.0 , loss: [ 1.02  0.47  0.54 -0.04]
Iter: 34 , sil_hid: 0.752 , delta_label 0.0 , loss: [1.02 0.5  0.52 0.02]
Iter: 36 , sil_hid: 0.723 , delta_label 0.061 , loss: [1.03 0.52 0.51 0.01]
Iter: 38 , sil_hid: 0.735 , delta_label 0.041 , loss: [ 0.99  0.48  0.51 -0.02]
Iter: 40 , sil_hid: 0.739 , delta_label 0.02 , loss: [ 0.97  0.49  0.47 -0.  ]
Iter: 42 , sil_hid: 0.721 , delta_label 0.041 , loss: [ 1.01  0.49  0.53 -0.14]
Iter: 44 , sil_hid: 0.72 , delta_label 0.0 , loss: [ 1.    0.49  0.51 -0.08]
Iter: 46 , sil_hid: 0.737 , delta_label 0.041 , loss: [ 0.96  0.48  0.47 -0.06]
Iter: 48 , sil_hid: 0.735 , delta_label 0.02 , loss: [0.98 0.49 0.49 0.01]
Iter: 50 , sil_hid: 0.729 , delta_label 0.02 , loss: [ 0.98  0.48  0.5  -0.1 ]
Iter: 52 , sil_hid: 0.729 , delta_label 0.0 , loss: [ 0.96  0.49  0.47 -0.03]
Iter: 54 , sil_hid: 0.729 , delta_label 0.0 , loss: [ 0.94  0.48  0.46 -0.09]
Iter: 56 , sil_hid: 0.73 , delta_label 0.02 , loss: [ 0.95  0.47  0.48 -0.19]
Iter: 58 , sil_hid: 0.768 , delta_label 0.061 , loss: [ 0.95  0.48  0.48 -0.12]
Iter: 60 , sil_hid: 0.773 , delta_label 0.0 , loss: [ 0.95  0.48  0.48 -0.13]
Iter: 62 , sil_hid: 0.771 , delta_label 0.082 , loss: [ 0.95  0.47  0.48 -0.06]
Iter: 64 , sil_hid: 0.769 , delta_label 0.0 , loss: [0.98 0.48 0.51 0.  ]
Iter: 66 , sil_hid: 0.765 , delta_label 0.0 , loss: [ 1.03  0.51  0.53 -0.01]
Iter: 68 , sil_hid: 0.761 , delta_label 0.0 , loss: [1.02 0.48 0.55 0.  ]
Iter: 70 , sil_hid: 0.76 , delta_label 0.0 , loss: [ 1.07  0.5   0.57 -0.05]
Iter: 72 , sil_hid: 0.76 , delta_label 0.0 , loss: [ 1.01  0.47  0.53 -0.02]
Iter: 74 , sil_hid: 0.766 , delta_label 0.082 , loss: [ 1.02  0.47  0.55 -0.03]
Iter: 76 , sil_hid: 0.763 , delta_label 0.082 , loss: [ 1.01  0.47  0.54 -0.04]
Iter: 78 , sil_hid: 0.764 , delta_label 0.0 , loss: [ 1.01  0.47  0.54 -0.07]
Iter: 80 , sil_hid: 0.763 , delta_label 0.0 , loss: [ 0.98  0.47  0.51 -0.06]
Iter: 82 , sil_hid: 0.769 , delta_label 0.082 , loss: [ 0.95  0.47  0.48 -0.04]
Iter: 84 , sil_hid: 0.76 , delta_label 0.082 , loss: [ 0.94  0.46  0.48 -0.07]
Iter: 86 , sil_hid: 0.764 , delta_label 0.082 , loss: [ 0.96  0.47  0.49 -0.06]
Iter: 88 , sil_hid: 0.761 , delta_label 0.0 , loss: [ 0.94  0.46  0.48 -0.06]
Iter: 90 , sil_hid: 0.761 , delta_label 0.0 , loss: [ 0.96  0.46  0.5  -0.01]
Iter: 92 , sil_hid: 0.76 , delta_label 0.0 , loss: [ 0.97  0.46  0.51 -0.09]
Iter: 94 , sil_hid: 0.756 , delta_label 0.082 , loss: [ 0.96  0.45  0.5  -0.12]
Iter: 96 , sil_hid: 0.773 , delta_label 0.02 , loss: [ 0.99  0.47  0.52 -0.08]
Iter: 98 , sil_hid: 0.756 , delta_label 0.02 , loss: [ 0.99  0.48  0.51 -0.02]
Iter: 100 , sil_hid: 0.754 , delta_label 0.0 , loss: [ 0.99  0.47  0.51 -0.03]
Iter: 102 , sil_hid: 0.765 , delta_label 0.082 , loss: [ 0.98  0.46  0.52 -0.  ]
Iter: 104 , sil_hid: 0.742 , delta_label 0.102 , loss: [1.   0.48 0.53 0.02]
Iter: 106 , sil_hid: 0.739 , delta_label 0.02 , loss: [ 0.97  0.47  0.51 -0.09]
Iter: 108 , sil_hid: 0.738 , delta_label 0.0 , loss: [ 0.96  0.45  0.51 -0.08]
Iter: 110 , sil_hid: 0.739 , delta_label 0.02 , loss: [ 0.96  0.46  0.5  -0.01]
Iter: 112 , sil_hid: 0.733 , delta_label 0.0 , loss: [0.99 0.46 0.53 0.01]
Iter: 114 , sil_hid: 0.735 , delta_label 0.02 , loss: [ 0.96  0.47  0.5  -0.03]
Iter: 116 , sil_hid: 0.736 , delta_label 0.0 , loss: [ 0.99  0.48  0.52 -0.07]
Iter: 118 , sil_hid: 0.739 , delta_label 0.02 , loss: [ 0.97  0.47  0.5  -0.05]
Stop early at 118 epoch
Train: run time is 0.09  minutes
Done.
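Comparing the two runs, sil_hid plateaus around 0.95 with n_clusters=3 but only around 0.72-0.77 with n_clusters=4, so the silhouette score of the hidden embedding is one way to choose the cluster number. A sketch with scikit-learn on toy data (not the actual scGAC embedding):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Toy stand-in for a hidden embedding: three well-separated blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.1, size=(20, 2)) for c in (0.0, 5.0, 10.0)])

# Score candidate cluster numbers by silhouette, as sil_hid does in the logs.
scores = {}
for k in (2, 3, 4):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    scores[k] = silhouette_score(X, labels)

best_k = max(scores, key=scores.get)
print(best_k)  # 3: the true number of blobs scores highest
```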

Input arguments: Biase 3 --subtype_path data/Biase/subtype.ann --k 4

data/Biase/data.tsv Biase 512 True 3 4
NE
Shape after transformation: (49, 512)
Pre-process: run time is 0.00  minutes

Pre-train: run time is 0.07  minutes
--------------------------------
Kmeans start, with data shape of (49, 64)
Kmeans end
--------------------------------
Iter: 0 , sil_hid: 0.922 , delta_label 0.0 , loss: 0
Iter: 2 , sil_hid: 0.924 , delta_label 0.0 , loss: [1.19 0.52 0.67 0.12]
Iter: 4 , sil_hid: 0.928 , delta_label 0.0 , loss: [1.15 0.52 0.63 0.02]
Iter: 6 , sil_hid: 0.932 , delta_label 0.0 , loss: [ 1.13  0.54  0.59 -0.05]
Iter: 8 , sil_hid: 0.936 , delta_label 0.0 , loss: [1.14 0.54 0.6  0.04]
Iter: 10 , sil_hid: 0.939 , delta_label 0.0 , loss: [ 1.08  0.51  0.57 -0.02]
Iter: 12 , sil_hid: 0.94 , delta_label 0.0 , loss: [ 1.11  0.51  0.6  -0.  ]
Iter: 14 , sil_hid: 0.942 , delta_label 0.0 , loss: [1.06 0.49 0.56 0.05]
Iter: 16 , sil_hid: 0.943 , delta_label 0.0 , loss: [1.11 0.52 0.59 0.04]
Iter: 18 , sil_hid: 0.943 , delta_label 0.0 , loss: [1.08 0.5  0.58 0.03]
Iter: 20 , sil_hid: 0.943 , delta_label 0.0 , loss: [1.09 0.5  0.59 0.01]
Iter: 22 , sil_hid: 0.942 , delta_label 0.0 , loss: [ 1.05  0.5   0.55 -0.01]
Iter: 24 , sil_hid: 0.941 , delta_label 0.0 , loss: [ 1.06  0.49  0.57 -0.02]
Iter: 26 , sil_hid: 0.941 , delta_label 0.0 , loss: [ 1.04  0.5   0.54 -0.13]
Iter: 28 , sil_hid: 0.941 , delta_label 0.0 , loss: [ 1.01  0.48  0.53 -0.04]
Iter: 30 , sil_hid: 0.94 , delta_label 0.0 , loss: [ 1.08  0.52  0.56 -0.05]
Iter: 32 , sil_hid: 0.94 , delta_label 0.0 , loss: [ 1.04  0.5   0.54 -0.07]
Iter: 34 , sil_hid: 0.941 , delta_label 0.0 , loss: [ 1.02  0.48  0.54 -0.11]
Iter: 36 , sil_hid: 0.941 , delta_label 0.0 , loss: [ 1.03  0.49  0.54 -0.06]
Iter: 38 , sil_hid: 0.941 , delta_label 0.0 , loss: [ 1.02  0.49  0.53 -0.04]
Iter: 40 , sil_hid: 0.942 , delta_label 0.0 , loss: [ 1.02  0.49  0.53 -0.1 ]
Iter: 42 , sil_hid: 0.942 , delta_label 0.0 , loss: [ 1.02  0.48  0.54 -0.03]
Iter: 44 , sil_hid: 0.943 , delta_label 0.0 , loss: [ 1.03  0.5   0.54 -0.14]
Iter: 46 , sil_hid: 0.943 , delta_label 0.0 , loss: [ 0.97  0.47  0.5  -0.06]
Iter: 48 , sil_hid: 0.944 , delta_label 0.0 , loss: [ 1.04  0.5   0.55 -0.13]
Iter: 50 , sil_hid: 0.946 , delta_label 0.0 , loss: [ 1.01  0.49  0.52 -0.09]
Iter: 52 , sil_hid: 0.948 , delta_label 0.0 , loss: [ 0.99  0.47  0.52 -0.12]
Iter: 54 , sil_hid: 0.949 , delta_label 0.0 , loss: [ 0.99  0.47  0.52 -0.14]
Iter: 56 , sil_hid: 0.95 , delta_label 0.0 , loss: [ 1.03  0.48  0.55 -0.1 ]
Iter: 58 , sil_hid: 0.95 , delta_label 0.0 , loss: [ 1.01  0.47  0.54 -0.13]
Iter: 60 , sil_hid: 0.949 , delta_label 0.0 , loss: [ 1.02  0.49  0.53 -0.11]
Iter: 62 , sil_hid: 0.949 , delta_label 0.0 , loss: [ 1.01  0.47  0.54 -0.21]
Iter: 64 , sil_hid: 0.949 , delta_label 0.0 , loss: [ 1.02  0.47  0.55 -0.16]
Iter: 66 , sil_hid: 0.948 , delta_label 0.0 , loss: [ 0.96  0.47  0.5  -0.11]
Iter: 68 , sil_hid: 0.947 , delta_label 0.0 , loss: [ 0.97  0.46  0.5  -0.18]
Iter: 70 , sil_hid: 0.947 , delta_label 0.0 , loss: [ 0.96  0.47  0.49 -0.2 ]
Iter: 72 , sil_hid: 0.947 , delta_label 0.0 , loss: [ 0.99  0.48  0.51 -0.15]
Iter: 74 , sil_hid: 0.948 , delta_label 0.0 , loss: [ 0.98  0.48  0.5  -0.09]
Iter: 76 , sil_hid: 0.948 , delta_label 0.0 , loss: [ 0.96  0.47  0.49 -0.15]
Iter: 78 , sil_hid: 0.949 , delta_label 0.0 , loss: [ 0.97  0.46  0.51 -0.15]
Iter: 80 , sil_hid: 0.951 , delta_label 0.0 , loss: [ 1.01  0.48  0.52 -0.11]
Iter: 82 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 0.95  0.47  0.48 -0.14]
Iter: 84 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 1.01  0.48  0.53 -0.13]
Iter: 86 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 0.98  0.48  0.51 -0.05]
Iter: 88 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 0.99  0.47  0.51 -0.07]
Iter: 90 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 0.98  0.46  0.52 -0.13]
Iter: 92 , sil_hid: 0.951 , delta_label 0.0 , loss: [ 0.97  0.46  0.51 -0.13]
Iter: 94 , sil_hid: 0.951 , delta_label 0.0 , loss: [ 1.03  0.47  0.55 -0.04]
Iter: 96 , sil_hid: 0.951 , delta_label 0.0 , loss: [ 0.98  0.47  0.51 -0.12]
Iter: 98 , sil_hid: 0.951 , delta_label 0.0 , loss: [ 1.    0.49  0.51 -0.15]
Iter: 100 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 0.99  0.48  0.51 -0.2 ]
Iter: 102 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 1.    0.47  0.53 -0.09]
Iter: 104 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 0.99  0.47  0.52 -0.09]
Iter: 106 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 1.01  0.48  0.53 -0.11]
Iter: 108 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 0.98  0.46  0.52 -0.11]
Iter: 110 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 0.97  0.47  0.51 -0.1 ]
Iter: 112 , sil_hid: 0.952 , delta_label 0.0 , loss: [ 0.98  0.47  0.52 -0.07]
Iter: 114 , sil_hid: 0.953 , delta_label 0.0 , loss: [ 0.98  0.47  0.51 -0.06]
Iter: 116 , sil_hid: 0.953 , delta_label 0.0 , loss: [ 0.98  0.46  0.52 -0.12]
Iter: 118 , sil_hid: 0.954 , delta_label 0.0 , loss: [ 1.    0.46  0.54 -0.12]
Stop early at 118 epoch
Train: run time is 0.08  minutes
#######################
ARI 1.0
NMI 1.0
Done.
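When --subtype_path points at the true labels, scGAC reports ARI and NMI; 1.0 for both means the predicted clustering matches the subtypes exactly, up to a permutation of label ids. The same metrics can be computed with scikit-learn (the label vectors below are stand-ins, not the real Biase annotations):

```python
from sklearn.metrics import adjusted_rand_score, normalized_mutual_info_score

true_labels = [0, 0, 1, 1, 2, 2]  # stand-in for data/Biase/subtype.ann
pred_labels = [2, 2, 0, 0, 1, 1]  # stand-in for pred_Biase.txt

# Both metrics are invariant to relabeling, so a perfect permutation scores 1.0.
print(adjusted_rand_score(true_labels, pred_labels))           # 1.0
print(normalized_mutual_info_score(true_labels, pred_labels))  # 1.0
```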
  3. Björklund dataset
    Gene names in the first column, for example:
    (screenshot: ori_data.tsv with gene names in the first column)

(1) Run process.py
(screenshot: process.py output)
(2) Run scGAC.py
Input arguments: Björklund 3

data/Björklund/data.tsv Björklund 512 True 3 None
NE
Shape after transformation: (648, 512)
Pre-process: run time is 0.10  minutes

Pre-train: run time is 0.60  minutes
--------------------------------
Kmeans start, with data shape of (648, 64)
Kmeans end
--------------------------------
Iter: 0 , sil_hid: 0.609 , delta_label 0.0 , loss: 0
Iter: 2 , sil_hid: 0.639 , delta_label 0.014 , loss: [ 1.36  0.77  0.59 -0.08]
Iter: 4 , sil_hid: 0.66 , delta_label 0.005 , loss: [ 1.39  0.77  0.62 -0.04]
Iter: 6 , sil_hid: 0.674 , delta_label 0.006 , loss: [ 1.32  0.77  0.55 -0.05]
Iter: 8 , sil_hid: 0.682 , delta_label 0.008 , loss: [ 1.3   0.77  0.53 -0.09]
Iter: 10 , sil_hid: 0.688 , delta_label 0.005 , loss: [ 1.25  0.77  0.48 -0.02]
Iter: 12 , sil_hid: 0.695 , delta_label 0.0 , loss: [ 1.24  0.77  0.47 -0.05]
Iter: 14 , sil_hid: 0.702 , delta_label 0.002 , loss: [ 1.21  0.77  0.44 -0.01]
Iter: 16 , sil_hid: 0.708 , delta_label 0.002 , loss: [ 1.18  0.77  0.41 -0.06]
Iter: 18 , sil_hid: 0.712 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.05]
Iter: 20 , sil_hid: 0.716 , delta_label 0.0 , loss: [ 1.16  0.77  0.39 -0.05]
Iter: 22 , sil_hid: 0.719 , delta_label 0.0 , loss: [ 1.14  0.77  0.38 -0.12]
Iter: 24 , sil_hid: 0.721 , delta_label 0.002 , loss: [ 1.15  0.77  0.38 -0.1 ]
Iter: 26 , sil_hid: 0.724 , delta_label 0.002 , loss: [ 1.15  0.77  0.38 -0.09]
Iter: 28 , sil_hid: 0.727 , delta_label 0.0 , loss: [ 1.15  0.77  0.38 -0.16]
Iter: 30 , sil_hid: 0.73 , delta_label 0.002 , loss: [ 1.14  0.77  0.38 -0.09]
Iter: 32 , sil_hid: 0.734 , delta_label 0.002 , loss: [ 1.14  0.77  0.38 -0.1 ]
Iter: 34 , sil_hid: 0.737 , delta_label 0.002 , loss: [ 1.16  0.77  0.39 -0.09]
Iter: 36 , sil_hid: 0.738 , delta_label 0.003 , loss: [ 1.15  0.77  0.38 -0.12]
Iter: 38 , sil_hid: 0.74 , delta_label 0.002 , loss: [ 1.16  0.77  0.39 -0.13]
Iter: 40 , sil_hid: 0.742 , delta_label 0.0 , loss: [ 1.15  0.77  0.38 -0.12]
Iter: 42 , sil_hid: 0.744 , delta_label 0.0 , loss: [ 1.16  0.77  0.39 -0.08]
Iter: 44 , sil_hid: 0.745 , delta_label 0.003 , loss: [ 1.16  0.77  0.39 -0.1 ]
Iter: 46 , sil_hid: 0.746 , delta_label 0.0 , loss: [ 1.16  0.77  0.39 -0.09]
Iter: 48 , sil_hid: 0.749 , delta_label 0.002 , loss: [ 1.16  0.77  0.39 -0.09]
Iter: 50 , sil_hid: 0.75 , delta_label 0.003 , loss: [ 1.16  0.77  0.39 -0.11]
Iter: 52 , sil_hid: 0.751 , delta_label 0.002 , loss: [ 1.16  0.77  0.39 -0.12]
Iter: 54 , sil_hid: 0.753 , delta_label 0.0 , loss: [ 1.16  0.77  0.39 -0.09]
Iter: 56 , sil_hid: 0.755 , delta_label 0.002 , loss: [ 1.16  0.77  0.39 -0.14]
Iter: 58 , sil_hid: 0.757 , delta_label 0.0 , loss: [ 1.16  0.77  0.39 -0.11]
Iter: 60 , sil_hid: 0.758 , delta_label 0.0 , loss: [ 1.16  0.77  0.4  -0.09]
Iter: 62 , sil_hid: 0.759 , delta_label 0.0 , loss: [ 1.16  0.77  0.39 -0.1 ]
Iter: 64 , sil_hid: 0.761 , delta_label 0.0 , loss: [ 1.15  0.77  0.39 -0.13]
Iter: 66 , sil_hid: 0.762 , delta_label 0.002 , loss: [ 1.17  0.77  0.4  -0.12]
Iter: 68 , sil_hid: 0.761 , delta_label 0.002 , loss: [ 1.17  0.77  0.4  -0.12]
Iter: 70 , sil_hid: 0.762 , delta_label 0.002 , loss: [ 1.16  0.77  0.39 -0.14]
Iter: 72 , sil_hid: 0.763 , delta_label 0.003 , loss: [ 1.17  0.77  0.4  -0.08]
Iter: 74 , sil_hid: 0.763 , delta_label 0.0 , loss: [ 1.16  0.77  0.39 -0.07]
Iter: 76 , sil_hid: 0.764 , delta_label 0.0 , loss: [ 1.16  0.77  0.39 -0.08]
Iter: 78 , sil_hid: 0.764 , delta_label 0.002 , loss: [ 1.17  0.77  0.4  -0.1 ]
Iter: 80 , sil_hid: 0.764 , delta_label 0.0 , loss: [ 1.15  0.77  0.38 -0.07]
Iter: 82 , sil_hid: 0.766 , delta_label 0.002 , loss: [ 1.16  0.77  0.39 -0.08]
Iter: 84 , sil_hid: 0.766 , delta_label 0.002 , loss: [ 1.16  0.77  0.39 -0.15]
Iter: 86 , sil_hid: 0.766 , delta_label 0.0 , loss: [ 1.16  0.77  0.39 -0.08]
Iter: 88 , sil_hid: 0.766 , delta_label 0.0 , loss: [ 1.16  0.77  0.39 -0.11]
Iter: 90 , sil_hid: 0.767 , delta_label 0.002 , loss: [ 1.16  0.77  0.39 -0.17]
Iter: 92 , sil_hid: 0.768 , delta_label 0.002 , loss: [ 1.16  0.77  0.39 -0.07]
Iter: 94 , sil_hid: 0.768 , delta_label 0.0 , loss: [ 1.17  0.77  0.4  -0.08]
Iter: 96 , sil_hid: 0.767 , delta_label 0.002 , loss: [ 1.17  0.77  0.4  -0.1 ]
Iter: 98 , sil_hid: 0.768 , delta_label 0.0 , loss: [ 1.17  0.77  0.41 -0.09]
Iter: 100 , sil_hid: 0.768 , delta_label 0.0 , loss: [ 1.17  0.77  0.4  -0.08]
Iter: 102 , sil_hid: 0.77 , delta_label 0.003 , loss: [ 1.17  0.77  0.4  -0.09]
Iter: 104 , sil_hid: 0.77 , delta_label 0.0 , loss: [ 1.17  0.77  0.4  -0.09]
Iter: 106 , sil_hid: 0.771 , delta_label 0.0 , loss: [ 1.17  0.77  0.4  -0.11]
Iter: 108 , sil_hid: 0.772 , delta_label 0.002 , loss: [ 1.16  0.77  0.4  -0.14]
Iter: 110 , sil_hid: 0.771 , delta_label 0.002 , loss: [ 1.17  0.77  0.4  -0.09]
Iter: 112 , sil_hid: 0.771 , delta_label 0.0 , loss: [ 1.17  0.77  0.4  -0.12]
Iter: 114 , sil_hid: 0.771 , delta_label 0.0 , loss: [ 1.17  0.77  0.4  -0.1 ]
Iter: 116 , sil_hid: 0.772 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.08]
Iter: 118 , sil_hid: 0.772 , delta_label 0.0 , loss: [ 1.16  0.77  0.39 -0.1 ]
Iter: 120 , sil_hid: 0.773 , delta_label 0.0 , loss: [ 1.17  0.77  0.4  -0.12]
Iter: 122 , sil_hid: 0.773 , delta_label 0.002 , loss: [ 1.17  0.77  0.4  -0.14]
Iter: 124 , sil_hid: 0.774 , delta_label 0.0 , loss: [ 1.17  0.77  0.4  -0.12]
Iter: 126 , sil_hid: 0.774 , delta_label 0.0 , loss: [ 1.16  0.77  0.4  -0.11]
Iter: 128 , sil_hid: 0.775 , delta_label 0.0 , loss: [ 1.17  0.77  0.4  -0.08]
Iter: 130 , sil_hid: 0.775 , delta_label 0.002 , loss: [ 1.17  0.77  0.4  -0.07]
Iter: 132 , sil_hid: 0.776 , delta_label 0.002 , loss: [ 1.16  0.77  0.39 -0.07]
Iter: 134 , sil_hid: 0.778 , delta_label 0.003 , loss: [ 1.16  0.77  0.39 -0.1 ]
Iter: 136 , sil_hid: 0.781 , delta_label 0.002 , loss: [ 1.17  0.77  0.4  -0.1 ]
Iter: 138 , sil_hid: 0.781 , delta_label 0.0 , loss: [ 1.17  0.77  0.41 -0.11]
Iter: 140 , sil_hid: 0.781 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.1 ]
Iter: 142 , sil_hid: 0.782 , delta_label 0.0 , loss: [ 1.18  0.77  0.42 -0.11]
Iter: 144 , sil_hid: 0.783 , delta_label 0.002 , loss: [ 1.18  0.77  0.41 -0.09]
Iter: 146 , sil_hid: 0.784 , delta_label 0.006 , loss: [ 1.18  0.77  0.41 -0.07]
Iter: 148 , sil_hid: 0.785 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.1 ]
Iter: 150 , sil_hid: 0.786 , delta_label 0.0 , loss: [ 1.17  0.77  0.41 -0.11]
Stop early at 150 epoch
Train: run time is 0.79  minutes
Done.
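The --k option controls how many neighbors each cell keeps when the cell graph is built. A hedged sketch using Pearson correlation between cells as the similarity (scGAC's actual similarity measure and graph construction may differ):

```python
import numpy as np

def knn_graph(X, k):
    # Binary k-nearest-neighbor adjacency over the rows (cells) of X,
    # ranked by Pearson correlation.
    sim = np.corrcoef(X)            # cells x cells similarity matrix
    np.fill_diagonal(sim, -np.inf)  # never pick a cell as its own neighbor
    adj = np.zeros(sim.shape)
    nearest = np.argsort(-sim, axis=1)[:, :k]  # k most similar cells per row
    rows = np.repeat(np.arange(X.shape[0]), k)
    adj[rows, nearest.ravel()] = 1.0
    return adj

rng = np.random.default_rng(0)
A = knn_graph(rng.normal(size=(10, 50)), k=4)
print(A.sum(axis=1))  # each cell keeps exactly 4 neighbors
```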

Input arguments: Björklund 3 --k 4

data/Björklund/data.tsv Björklund 512 True 3 4
NE
Shape after transformation: (648, 512)
Pre-process: run time is 0.11  minutes

Pre-train: run time is 0.50  minutes
--------------------------------
Kmeans start, with data shape of (648, 64)
Kmeans end
--------------------------------
Iter: 0 , sil_hid: 0.67 , delta_label 0.0 , loss: 0
Iter: 2 , sil_hid: 0.686 , delta_label 0.002 , loss: [ 1.39  0.77  0.62 -0.13]
Iter: 4 , sil_hid: 0.701 , delta_label 0.003 , loss: [ 1.35  0.77  0.59 -0.1 ]
Iter: 6 , sil_hid: 0.711 , delta_label 0.008 , loss: [ 1.35  0.77  0.58 -0.09]
Iter: 8 , sil_hid: 0.72 , delta_label 0.005 , loss: [ 1.32  0.77  0.55 -0.15]
Iter: 10 , sil_hid: 0.726 , delta_label 0.002 , loss: [ 1.29  0.77  0.52 -0.14]
Iter: 12 , sil_hid: 0.731 , delta_label 0.002 , loss: [ 1.25  0.77  0.48 -0.11]
Iter: 14 , sil_hid: 0.736 , delta_label 0.0 , loss: [ 1.23  0.77  0.46 -0.12]
Iter: 16 , sil_hid: 0.741 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.16]
Iter: 18 , sil_hid: 0.745 , delta_label 0.0 , loss: [ 1.19  0.77  0.43 -0.09]
Iter: 20 , sil_hid: 0.748 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.1 ]
Iter: 22 , sil_hid: 0.753 , delta_label 0.002 , loss: [ 1.19  0.77  0.42 -0.15]
Iter: 24 , sil_hid: 0.759 , delta_label 0.002 , loss: [ 1.21  0.77  0.44 -0.09]
Iter: 26 , sil_hid: 0.764 , delta_label 0.002 , loss: [ 1.2   0.77  0.43 -0.18]
Iter: 28 , sil_hid: 0.768 , delta_label 0.002 , loss: [ 1.2   0.77  0.43 -0.14]
Iter: 30 , sil_hid: 0.771 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.13]
Iter: 32 , sil_hid: 0.775 , delta_label 0.0 , loss: [ 1.2   0.77  0.43 -0.15]
Iter: 34 , sil_hid: 0.778 , delta_label 0.002 , loss: [ 1.22  0.77  0.45 -0.16]
Iter: 36 , sil_hid: 0.783 , delta_label 0.002 , loss: [ 1.21  0.77  0.45 -0.14]
Iter: 38 , sil_hid: 0.786 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.17]
Iter: 40 , sil_hid: 0.789 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.12]
Iter: 42 , sil_hid: 0.791 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.14]
Iter: 44 , sil_hid: 0.793 , delta_label 0.0 , loss: [ 1.2   0.77  0.43 -0.15]
Iter: 46 , sil_hid: 0.796 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.13]
Iter: 48 , sil_hid: 0.797 , delta_label 0.0 , loss: [ 1.21  0.77  0.45 -0.14]
Iter: 50 , sil_hid: 0.798 , delta_label 0.003 , loss: [ 1.21  0.77  0.45 -0.18]
Iter: 52 , sil_hid: 0.799 , delta_label 0.002 , loss: [ 1.22  0.77  0.45 -0.14]
Iter: 54 , sil_hid: 0.799 , delta_label 0.002 , loss: [ 1.21  0.77  0.44 -0.15]
Iter: 56 , sil_hid: 0.8 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.13]
Iter: 58 , sil_hid: 0.8 , delta_label 0.0 , loss: [ 1.21  0.77  0.45 -0.17]
Iter: 60 , sil_hid: 0.801 , delta_label 0.002 , loss: [ 1.22  0.77  0.45 -0.12]
Iter: 62 , sil_hid: 0.802 , delta_label 0.0 , loss: [ 1.22  0.77  0.45 -0.2 ]
Iter: 64 , sil_hid: 0.802 , delta_label 0.0 , loss: [ 1.22  0.77  0.45 -0.16]
Iter: 66 , sil_hid: 0.803 , delta_label 0.002 , loss: [ 1.22  0.77  0.45 -0.17]
Iter: 68 , sil_hid: 0.803 , delta_label 0.0 , loss: [ 1.21  0.77  0.45 -0.19]
Iter: 70 , sil_hid: 0.802 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.13]
Iter: 72 , sil_hid: 0.802 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.12]
Iter: 74 , sil_hid: 0.802 , delta_label 0.0 , loss: [ 1.22  0.77  0.45 -0.14]
Iter: 76 , sil_hid: 0.802 , delta_label 0.0 , loss: [ 1.2   0.77  0.44 -0.17]
Iter: 78 , sil_hid: 0.802 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.15]
Iter: 80 , sil_hid: 0.802 , delta_label 0.0 , loss: [ 1.2   0.77  0.44 -0.16]
Iter: 82 , sil_hid: 0.803 , delta_label 0.0 , loss: [ 1.2   0.77  0.43 -0.14]
Iter: 84 , sil_hid: 0.804 , delta_label 0.002 , loss: [ 1.22  0.77  0.45 -0.13]
Iter: 86 , sil_hid: 0.805 , delta_label 0.0 , loss: [ 1.22  0.77  0.45 -0.15]
Iter: 88 , sil_hid: 0.805 , delta_label 0.0 , loss: [ 1.21  0.77  0.45 -0.14]
Iter: 90 , sil_hid: 0.806 , delta_label 0.002 , loss: [ 1.23  0.77  0.46 -0.2 ]
Iter: 92 , sil_hid: 0.806 , delta_label 0.0 , loss: [ 1.22  0.77  0.45 -0.15]
Iter: 94 , sil_hid: 0.806 , delta_label 0.002 , loss: [ 1.22  0.77  0.45 -0.13]
Iter: 96 , sil_hid: 0.806 , delta_label 0.003 , loss: [ 1.23  0.77  0.46 -0.13]
Iter: 98 , sil_hid: 0.807 , delta_label 0.0 , loss: [ 1.22  0.77  0.46 -0.17]
Iter: 100 , sil_hid: 0.807 , delta_label 0.0 , loss: [ 1.23  0.77  0.46 -0.2 ]
Iter: 102 , sil_hid: 0.807 , delta_label 0.002 , loss: [ 1.23  0.77  0.46 -0.16]
Iter: 104 , sil_hid: 0.808 , delta_label 0.0 , loss: [ 1.22  0.77  0.45 -0.14]
Iter: 106 , sil_hid: 0.808 , delta_label 0.0 , loss: [ 1.22  0.77  0.46 -0.14]
Iter: 108 , sil_hid: 0.81 , delta_label 0.002 , loss: [ 1.21  0.77  0.44 -0.13]
Iter: 110 , sil_hid: 0.811 , delta_label 0.0 , loss: [ 1.23  0.77  0.46 -0.15]
Iter: 112 , sil_hid: 0.811 , delta_label 0.0 , loss: [ 1.22  0.77  0.46 -0.15]
Iter: 114 , sil_hid: 0.812 , delta_label 0.0 , loss: [ 1.23  0.77  0.46 -0.16]
Iter: 116 , sil_hid: 0.812 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.11]
Iter: 118 , sil_hid: 0.812 , delta_label 0.0 , loss: [ 1.22  0.77  0.45 -0.14]
Iter: 120 , sil_hid: 0.813 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.16]
Iter: 122 , sil_hid: 0.815 , delta_label 0.002 , loss: [ 1.22  0.77  0.45 -0.16]
Iter: 124 , sil_hid: 0.816 , delta_label 0.0 , loss: [ 1.22  0.77  0.46 -0.1 ]
Iter: 126 , sil_hid: 0.817 , delta_label 0.0 , loss: [ 1.22  0.77  0.46 -0.14]
Iter: 128 , sil_hid: 0.817 , delta_label 0.002 , loss: [ 1.22  0.77  0.45 -0.12]
Iter: 130 , sil_hid: 0.817 , delta_label 0.0 , loss: [ 1.21  0.77  0.45 -0.13]
Iter: 132 , sil_hid: 0.817 , delta_label 0.003 , loss: [ 1.22  0.77  0.46 -0.16]
Iter: 134 , sil_hid: 0.817 , delta_label 0.0 , loss: [ 1.21  0.77  0.45 -0.12]
Iter: 136 , sil_hid: 0.817 , delta_label 0.0 , loss: [ 1.23  0.77  0.46 -0.18]
Iter: 138 , sil_hid: 0.817 , delta_label 0.0 , loss: [ 1.22  0.77  0.45 -0.18]
Iter: 140 , sil_hid: 0.817 , delta_label 0.0 , loss: [ 1.21  0.77  0.45 -0.15]
Iter: 142 , sil_hid: 0.817 , delta_label 0.0 , loss: [ 1.21  0.77  0.45 -0.17]
Stop early at 142 epoch
Train: run time is 0.41  minutes
Done.
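The walkthrough covers ori_data.tsv files with gene names in either the first or the second column. A hedged pandas sketch of loading the second-column case (the synthetic table below is an illustration; process.py's actual handling may differ):

```python
import io
import pandas as pd

# Synthetic ori_data.tsv with gene names in the second column.
tsv = "id\tgene\tcell1\tcell2\n1\tSox2\t5\t0\n2\tNanog\t2\t7\n"

# index_col selects the gene-name column; drop leftover non-expression columns.
expr = pd.read_csv(io.StringIO(tsv), sep="\t", index_col=1).drop(columns="id")
print(expr.index.tolist())  # ['Sox2', 'Nanog']
print(expr.shape)           # (2, 2): genes x cells
```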

Gene names in the second column, for example:
(screenshot: ori_data.tsv with gene names in the second column)

Input arguments: Björklund 3

data/Björklund/data.tsv Björklund 512 True 3 None
NE
Shape after transformation: (648, 512)
Pre-process: run time is 0.22  minutes


Pre-train: run time is 0.44  minutes
--------------------------------
Kmeans start, with data shape of (648, 64)
Kmeans end
--------------------------------
Iter: 0 , sil_hid: 0.646 , delta_label 0.0 , loss: 0
Iter: 2 , sil_hid: 0.671 , delta_label 0.012 , loss: [ 1.4   0.77  0.64 -0.09]
Iter: 4 , sil_hid: 0.69 , delta_label 0.008 , loss: [ 1.39  0.77  0.62 -0.1 ]
Iter: 6 , sil_hid: 0.705 , delta_label 0.005 , loss: [ 1.34  0.77  0.57 -0.1 ]
Iter: 8 , sil_hid: 0.713 , delta_label 0.005 , loss: [ 1.28  0.77  0.51 -0.12]
Iter: 10 , sil_hid: 0.72 , delta_label 0.003 , loss: [ 1.25  0.77  0.48 -0.09]
Iter: 12 , sil_hid: 0.726 , delta_label 0.0 , loss: [ 1.23  0.77  0.46 -0.07]
Iter: 14 , sil_hid: 0.731 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.06]
Iter: 16 , sil_hid: 0.734 , delta_label 0.0 , loss: [ 1.19  0.77  0.42 -0.1 ]
Iter: 18 , sil_hid: 0.736 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.1 ]
Iter: 20 , sil_hid: 0.74 , delta_label 0.002 , loss: [ 1.17  0.77  0.4  -0.08]
Iter: 22 , sil_hid: 0.742 , delta_label 0.0 , loss: [ 1.17  0.77  0.4  -0.13]
Iter: 24 , sil_hid: 0.744 , delta_label 0.0 , loss: [ 1.17  0.77  0.4  -0.12]
Iter: 26 , sil_hid: 0.747 , delta_label 0.0 , loss: [ 1.16  0.77  0.39 -0.12]
Iter: 28 , sil_hid: 0.748 , delta_label 0.0 , loss: [ 1.16  0.77  0.39 -0.13]
Iter: 30 , sil_hid: 0.749 , delta_label 0.0 , loss: [ 1.16  0.77  0.39 -0.1 ]
Iter: 32 , sil_hid: 0.751 , delta_label 0.002 , loss: [ 1.17  0.77  0.4  -0.15]
Iter: 34 , sil_hid: 0.751 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.15]
Iter: 36 , sil_hid: 0.752 , delta_label 0.0 , loss: [ 1.17  0.77  0.4  -0.15]
Iter: 38 , sil_hid: 0.752 , delta_label 0.0 , loss: [ 1.19  0.77  0.42 -0.18]
Iter: 40 , sil_hid: 0.754 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.13]
Iter: 42 , sil_hid: 0.755 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.11]
Iter: 44 , sil_hid: 0.757 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.16]
Iter: 46 , sil_hid: 0.76 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.13]
Iter: 48 , sil_hid: 0.763 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.15]
Iter: 50 , sil_hid: 0.765 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.2 ]
Iter: 52 , sil_hid: 0.766 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.15]
Iter: 54 , sil_hid: 0.767 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.15]
Iter: 56 , sil_hid: 0.768 , delta_label 0.0 , loss: [ 1.17  0.77  0.4  -0.13]
Iter: 58 , sil_hid: 0.77 , delta_label 0.002 , loss: [ 1.18  0.77  0.41 -0.15]
Iter: 60 , sil_hid: 0.771 , delta_label 0.0 , loss: [ 1.17  0.77  0.4  -0.17]
Iter: 62 , sil_hid: 0.772 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.12]
Iter: 64 , sil_hid: 0.773 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.2 ]
Iter: 66 , sil_hid: 0.775 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.15]
Iter: 68 , sil_hid: 0.776 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.19]
Iter: 70 , sil_hid: 0.777 , delta_label 0.0 , loss: [ 1.19  0.77  0.43 -0.22]
Iter: 72 , sil_hid: 0.778 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.15]
Iter: 74 , sil_hid: 0.779 , delta_label 0.002 , loss: [ 1.18  0.77  0.41 -0.19]
Iter: 76 , sil_hid: 0.78 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.18]
Iter: 78 , sil_hid: 0.782 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.14]
Iter: 80 , sil_hid: 0.784 , delta_label 0.002 , loss: [ 1.18  0.77  0.42 -0.18]
Iter: 82 , sil_hid: 0.785 , delta_label 0.0 , loss: [ 1.19  0.77  0.42 -0.11]
Iter: 84 , sil_hid: 0.785 , delta_label 0.0 , loss: [ 1.19  0.77  0.42 -0.1 ]
Iter: 86 , sil_hid: 0.785 , delta_label 0.0 , loss: [ 1.18  0.77  0.42 -0.16]
Iter: 88 , sil_hid: 0.786 , delta_label 0.0 , loss: [ 1.19  0.77  0.42 -0.18]
Iter: 90 , sil_hid: 0.787 , delta_label 0.002 , loss: [ 1.19  0.77  0.42 -0.17]
Iter: 92 , sil_hid: 0.788 , delta_label 0.0 , loss: [ 1.19  0.77  0.42 -0.15]
Iter: 94 , sil_hid: 0.789 , delta_label 0.0 , loss: [ 1.19  0.77  0.42 -0.16]
Iter: 96 , sil_hid: 0.79 , delta_label 0.0 , loss: [ 1.2   0.77  0.43 -0.13]
Iter: 98 , sil_hid: 0.792 , delta_label 0.0 , loss: [ 1.19  0.77  0.43 -0.13]
Iter: 100 , sil_hid: 0.793 , delta_label 0.0 , loss: [ 1.2   0.77  0.43 -0.17]
Iter: 102 , sil_hid: 0.792 , delta_label 0.002 , loss: [ 1.19  0.77  0.43 -0.18]
Iter: 104 , sil_hid: 0.793 , delta_label 0.0 , loss: [ 1.2   0.77  0.43 -0.16]
Iter: 106 , sil_hid: 0.792 , delta_label 0.002 , loss: [ 1.19  0.77  0.42 -0.14]
Iter: 108 , sil_hid: 0.792 , delta_label 0.0 , loss: [ 1.19  0.77  0.43 -0.12]
Iter: 110 , sil_hid: 0.793 , delta_label 0.0 , loss: [ 1.18  0.77  0.42 -0.15]
Iter: 112 , sil_hid: 0.793 , delta_label 0.0 , loss: [ 1.18  0.77  0.41 -0.18]
Iter: 114 , sil_hid: 0.794 , delta_label 0.0 , loss: [ 1.17  0.77  0.41 -0.17]
Iter: 116 , sil_hid: 0.794 , delta_label 0.0 , loss: [ 1.17  0.77  0.4  -0.18]
Iter: 118 , sil_hid: 0.794 , delta_label 0.0 , loss: [ 1.17  0.77  0.41 -0.14]
Iter: 120 , sil_hid: 0.796 , delta_label 0.002 , loss: [ 1.17  0.77  0.4  -0.17]
Iter: 122 , sil_hid: 0.796 , delta_label 0.0 , loss: [ 1.17  0.77  0.41 -0.14]
Iter: 124 , sil_hid: 0.797 , delta_label 0.0 , loss: [ 1.17  0.77  0.41 -0.15]
Iter: 126 , sil_hid: 0.797 , delta_label 0.0 , loss: [ 1.17  0.77  0.41 -0.15]
Iter: 128 , sil_hid: 0.797 , delta_label 0.0 , loss: [ 1.18  0.77  0.42 -0.17]
Iter: 130 , sil_hid: 0.799 , delta_label 0.0 , loss: [ 1.18  0.77  0.42 -0.15]
Iter: 132 , sil_hid: 0.8 , delta_label 0.0 , loss: [ 1.19  0.77  0.43 -0.17]
Iter: 134 , sil_hid: 0.802 , delta_label 0.002 , loss: [ 1.19  0.77  0.42 -0.15]
Iter: 136 , sil_hid: 0.803 , delta_label 0.0 , loss: [ 1.2   0.77  0.43 -0.21]
Iter: 138 , sil_hid: 0.804 , delta_label 0.0 , loss: [ 1.19  0.77  0.43 -0.13]
Iter: 140 , sil_hid: 0.804 , delta_label 0.0 , loss: [ 1.19  0.77  0.42 -0.13]
Iter: 142 , sil_hid: 0.805 , delta_label 0.0 , loss: [ 1.19  0.77  0.43 -0.13]
Iter: 144 , sil_hid: 0.805 , delta_label 0.002 , loss: [ 1.21  0.77  0.44 -0.16]
Iter: 146 , sil_hid: 0.806 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.18]
Iter: 148 , sil_hid: 0.806 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.17]
Iter: 150 , sil_hid: 0.807 , delta_label 0.002 , loss: [ 1.21  0.77  0.44 -0.18]
Iter: 152 , sil_hid: 0.808 , delta_label 0.0 , loss: [ 1.21  0.77  0.45 -0.2 ]
Iter: 154 , sil_hid: 0.809 , delta_label 0.0 , loss: [ 1.2   0.77  0.44 -0.12]
Iter: 156 , sil_hid: 0.809 , delta_label 0.0 , loss: [ 1.2   0.77  0.44 -0.16]
Iter: 158 , sil_hid: 0.809 , delta_label 0.0 , loss: [ 1.2   0.77  0.43 -0.11]
Iter: 160 , sil_hid: 0.808 , delta_label 0.0 , loss: [ 1.2   0.77  0.43 -0.14]
Iter: 162 , sil_hid: 0.808 , delta_label 0.0 , loss: [ 1.2   0.77  0.44 -0.15]
Iter: 164 , sil_hid: 0.808 , delta_label 0.0 , loss: [ 1.2   0.77  0.43 -0.1 ]
Iter: 166 , sil_hid: 0.808 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.15]
Iter: 168 , sil_hid: 0.809 , delta_label 0.0 , loss: [ 1.2   0.77  0.43 -0.16]
Iter: 170 , sil_hid: 0.809 , delta_label 0.0 , loss: [ 1.21  0.77  0.44 -0.13]
Iter: 172 , sil_hid: 0.809 , delta_label 0.0 , loss: [ 1.2   0.77  0.44 -0.12]
Iter: 174 , sil_hid: 0.808 , delta_label 0.003 , loss: [ 1.19  0.77  0.42 -0.15]
Iter: 176 , sil_hid: 0.807 , delta_label 0.002 , loss: [ 1.21  0.77  0.45 -0.13]
Stop early at 176 epoch
Train: run time is 0.84  minutes
Done.

Process finished with exit code 0
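The `delta_label` column in the log above is the fraction of cells whose cluster assignment changed since the previous check; once it stays near zero, training stops early ("Stop early at 176 epoch"). A minimal sketch of how such a quantity can be computed (the function name and the toy labels are illustrative, not taken from scGAC's source):

```python
import numpy as np

def delta_label(prev, curr):
    """Fraction of cells whose cluster assignment changed between checks."""
    prev, curr = np.asarray(prev), np.asarray(curr)
    return float(np.mean(prev != curr))

# Hypothetical assignments for 10 cells: exactly one cell switched cluster.
prev = [0, 0, 1, 1, 2, 2, 0, 1, 2, 0]
curr = [0, 0, 1, 1, 2, 2, 0, 1, 2, 1]
print(delta_label(prev, curr))  # prints 0.1
```

An early-stopping loop would then break once this value falls below some tolerance for several consecutive checks; the exact threshold scGAC uses is not shown in this log.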

3. Björklund dataset
Run the main script with the dataset name and the expected number of clusters (here 4) as positional arguments, e.g. `python scGAC.py Björklund 4`; the arguments are echoed at the top of the log below.

data/Björklund/data.tsv Björklund 512 True 4 None
NE
Shape after transformation: (648, 512)
Pre-process: run time is 0.22  minutes

Pre-train: run time is 0.75  minutes
--------------------------------
Kmeans start, with data shape of (648, 64)
Kmeans end
--------------------------------
Iter: 0 , sil_hid: 0.578 , delta_label 0.0 , loss: 0
Iter: 2 , sil_hid: 0.592 , delta_label 0.011 , loss: [ 1.54  0.77  0.77 -0.12]
Iter: 4 , sil_hid: 0.602 , delta_label 0.0 , loss: [ 1.52  0.77  0.75 -0.09]
Iter: 6 , sil_hid: 0.602 , delta_label 0.019 , loss: [ 1.46  0.77  0.69 -0.1 ]
Iter: 8 , sil_hid: 0.603 , delta_label 0.011 , loss: [ 1.4   0.77  0.63 -0.12]
Iter: 10 , sil_hid: 0.603 , delta_label 0.008 , loss: [ 1.34  0.77  0.57 -0.13]
Iter: 12 , sil_hid: 0.593 , delta_label 0.015 , loss: [ 1.3   0.77  0.53 -0.13]
Iter: 14 , sil_hid: 0.585 , delta_label 0.012 , loss: [ 1.27  0.77  0.5  -0.12]
Iter: 16 , sil_hid: 0.581 , delta_label 0.008 , loss: [ 1.23  0.77  0.46 -0.14]
Iter: 18 , sil_hid: 0.568 , delta_label 0.015 , loss: [ 1.2   0.77  0.43 -0.12]
Iter: 20 , sil_hid: 0.564 , delta_label 0.009 , loss: [ 1.19  0.77  0.41 -0.12]
Iter: 22 , sil_hid: 0.558 , delta_label 0.011 , loss: [ 1.17  0.77  0.4  -0.1 ]
Iter: 24 , sil_hid: 0.556 , delta_label 0.005 , loss: [ 1.16  0.77  0.39 -0.14]
Iter: 26 , sil_hid: 0.555 , delta_label 0.005 , loss: [ 1.15  0.77  0.38 -0.14]
Iter: 28 , sil_hid: 0.551 , delta_label 0.006 , loss: [ 1.15  0.77  0.38 -0.14]
Iter: 30 , sil_hid: 0.548 , delta_label 0.005 , loss: [ 1.14  0.77  0.37 -0.17]
Iter: 32 , sil_hid: 0.536 , delta_label 0.017 , loss: [ 1.15  0.77  0.38 -0.12]
Iter: 34 , sil_hid: 0.532 , delta_label 0.008 , loss: [ 1.14  0.77  0.37 -0.11]
Iter: 36 , sil_hid: 0.525 , delta_label 0.011 , loss: [ 1.14  0.77  0.37 -0.18]
Iter: 38 , sil_hid: 0.523 , delta_label 0.006 , loss: [ 1.14  0.77  0.37 -0.15]
Iter: 40 , sil_hid: 0.521 , delta_label 0.015 , loss: [ 1.14  0.77  0.37 -0.18]
Iter: 42 , sil_hid: 0.522 , delta_label 0.008 , loss: [ 1.15  0.77  0.38 -0.19]
Iter: 44 , sil_hid: 0.524 , delta_label 0.005 , loss: [ 1.14  0.77  0.37 -0.19]
Iter: 46 , sil_hid: 0.531 , delta_label 0.008 , loss: [ 1.14  0.77  0.38 -0.16]
Iter: 48 , sil_hid: 0.535 , delta_label 0.002 , loss: [ 1.14  0.77  0.37 -0.18]
Iter: 50 , sil_hid: 0.537 , delta_label 0.0 , loss: [ 1.15  0.77  0.38 -0.21]
Iter: 52 , sil_hid: 0.538 , delta_label 0.0 , loss: [ 1.15  0.77  0.38 -0.15]
Iter: 54 , sil_hid: 0.544 , delta_label 0.003 , loss: [ 1.15  0.77  0.38 -0.17]
Iter: 56 , sil_hid: 0.547 , delta_label 0.002 , loss: [ 1.15  0.77  0.38 -0.18]
Iter: 58 , sil_hid: 0.539 , delta_label 0.003 , loss: [ 1.15  0.77  0.38 -0.1 ]
Iter: 60 , sil_hid: 0.55 , delta_label 0.006 , loss: [ 1.15  0.77  0.38 -0.16]
Iter: 62 , sil_hid: 0.559 , delta_label 0.006 , loss: [ 1.15  0.77  0.38 -0.18]
Iter: 64 , sil_hid: 0.566 , delta_label 0.002 , loss: [ 1.14  0.77  0.37 -0.13]
Iter: 66 , sil_hid: 0.589 , delta_label 0.005 , loss: [ 1.16  0.77  0.39 -0.21]
Iter: 68 , sil_hid: 0.612 , delta_label 0.002 , loss: [ 1.15  0.77  0.38 -0.18]
Iter: 70 , sil_hid: 0.617 , delta_label 0.002 , loss: [ 1.15  0.77  0.38 -0.16]
Iter: 72 , sil_hid: 0.621 , delta_label 0.0 , loss: [ 1.14  0.77  0.37 -0.17]
Iter: 74 , sil_hid: 0.625 , delta_label 0.002 , loss: [ 1.15  0.77  0.38 -0.18]
Iter: 76 , sil_hid: 0.568 , delta_label 0.002 , loss: [ 1.15  0.77  0.38 -0.14]
Stop early at 76 epoch
Train: run time is 0.46  minutes
Done.

Process finished with exit code 0
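After a run, the predicted labels in `pred_<dataset>.txt` can be compared against true subtypes with ARI and NMI, the two metrics the `--subtype_path` option reports. A hedged sketch, assuming both files hold one integer label per line (the file layout is an assumption about scGAC's output, not confirmed by this log):

```python
import numpy as np
from sklearn.metrics import adjusted_rand_score, normalized_mutual_info_score

def evaluate(pred_path, true_path):
    """Return (ARI, NMI) for predicted vs. true cluster labels.

    Assumes each file contains one integer label per line.
    """
    pred = np.loadtxt(pred_path, dtype=int)
    true = np.loadtxt(true_path, dtype=int)
    return (adjusted_rand_score(true, pred),
            normalized_mutual_info_score(true, pred))
```

Both metrics are permutation-invariant, so a prediction that matches the true partition under a relabeling of cluster IDs still scores 1.0.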
