Quick notes on learning NNI (the neural-network hyperparameter tuning tool): a record of how parameters are passed around

A hyperparameter dictionary is passed into the main function:

import logging

import nni
from nni.utils import merge_parameter

# logger, get_params() and main() are defined earlier in mnist.py

if __name__ == '__main__':
    try:
        # get parameters from the tuner
        tuner_params = nni.get_next_parameter()
        logger.debug(tuner_params)
        params = vars(merge_parameter(get_params(), tuner_params))
        print(params)
        main(params)
    except Exception as exception:
        logger.exception(exception)
        raise
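For reference, get_params() in this example builds the base parameters with argparse, roughly along the lines below (a simplified sketch, not the exact example code; the argument names are illustrative):

import argparse

def get_params():
    # Base (default) hyperparameters; merge_parameter() later overrides
    # these with whatever the NNI tuner has sampled.
    parser = argparse.ArgumentParser(description='PyTorch MNIST example')
    parser.add_argument('--batch_size', type=int, default=64)
    parser.add_argument('--lr', type=float, default=0.01)
    parser.add_argument('--momentum', type=float, default=0.5)
    parser.add_argument('--epochs', type=int, default=10)
    args, _ = parser.parse_known_args()
    return args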

For this simple mnist.py example, the source code above already shows the whole NNI training flow:

1 nni.get_next_parameter() returns the hyperparameter dictionary picked by the NNI tuner, for main to consume.

2 logger.debug logs the dictionary from step 1.

3 get_params() lets the user supply hyperparameters from the command line.

My understanding here: vars(merge_parameter(get_params(), tuner_params)) combines the user's command-line input with the hyperparameters supplied by NNI into the final hyperparameter dictionary.

In practice, however, this line raised an error:

  File "D:\my_codeworkspace\bishe_new\jiaoben\train_KINN_NNIParameter.py", line 178, in <module>
    params = vars(merge_parameter(get_params(), tuner_params))
  File "C:\Users\asus\.conda\envs\pytorch\lib\site-packages\nni\utils.py", line 238, in merge_parameter
    raise ValueError('Key \'%s\' not found in base parameters.' % k)
ValueError: Key 'embedding_dim' not found in base parameters.

If anyone understands this, please point me in the right direction (a likely cause and a workaround are sketched after this step list).

4 Print the merged dictionary and pass it into main.
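About the ValueError above: judging from the traceback, merge_parameter() walks over every key returned by the tuner and requires that the same key already exist in the base parameters from get_params(). The argparse parser apparently defines no embedding_dim argument (it only appears in the search space, and its sampled value is a nested dict anyway), so the merge fails. One possible workaround, assuming the goal is simply to combine both sources into one plain dict, is to skip merge_parameter and merge by hand. flatten_nested_choice() below is a hypothetical helper, not part of NNI:

def flatten_nested_choice(tuner_params):
    """Flatten NNI nested-choice values such as
    {'embedding_dim': {'_name': 12, 'num_heads': 4}}
    into {'embedding_dim': 12, 'num_heads': 4}."""
    flat = {}
    for key, value in tuner_params.items():
        if isinstance(value, dict) and '_name' in value:
            flat[key] = value['_name']
            for sub_key, sub_value in value.items():
                if sub_key != '_name':
                    flat[sub_key] = sub_value
        else:
            flat[key] = value
    return flat

params = vars(get_params())                          # base parameters from argparse
params.update(flatten_nested_choice(tuner_params))   # tuner values take priority
main(params)

The alternative would be to add a matching --embedding_dim argument (plus defaults for its sub-keys such as num_heads) to the parser in get_params(), so that merge_parameter() finds every key it expects.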

Note that the hyperparameter dictionary NNI produces is generated from the search_space.json file the user designs. For example, the search-space file PINN_search.json used in my graduation project:

{

	"momentum": {
		"_type": "uniform",
		"_value": [0.65, 0.99]
	},
	"lr": {
		"_type": "uniform",
		"_value": [1e-7, 5e-4]
	},
	"depth": {
		"_type": "choice",
		"_value": [1, 2]
	},
	"embedding_dim": {
		"_type": "choice",
		"_value": [{
				"_name": 4,
				"num_heads": {
					"_type": "choice",
					"_value": [1, 2, 4]
				}
			}, {
				"_name": 6,
				"num_heads": {
					"_type": "choice",
					"_value": [2, 3]
				}
			},
			{
				"_name": 12,
				"num_heads": {
					"_type": "choice",
					"_value": [2, 3, 4, 6]
				}
			}, {
				"_name": 24,
				"num_heads": {
					"_type": "choice",
					"_value": [2, 3, 4, 6]
				}
			},
			{
				"_name": 64,
				"num_heads": {
					"_type": "choice",
					"_value": [2, 4, 6, 8, 16]
				}
			}
		]
	},
	"ifRes": {
		"_type": "choice",
		"_value": [0, 1, 2]
	},
	"attnscore": {
		"_type": "choice",
		"_value": ["softmax", "tanh", "linear", "relu"]
	},
	"drop": {
		"_type": "uniform",
		"_value": [0.4, 0.98]
	},
	"attn_drop": {
		"_type": "uniform",
		"_value": [0.25, 0.98]
	}

}
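To make this concrete, a single call to nni.get_next_parameter() against this search space returns something along these lines (the values below are made up for illustration; the nested choice comes back as a dict whose "_name" field carries the chosen embedding_dim):

# Illustrative only -- the actual values are sampled by the tuner.
tuner_params = {
    "momentum": 0.87,
    "lr": 3.2e-05,
    "depth": 2,
    "embedding_dim": {"_name": 12, "num_heads": 4},
    "ifRes": 1,
    "attnscore": "softmax",
    "drop": 0.55,
    "attn_drop": 0.40,
}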

The dictionary NNI produces from this file therefore has the keys momentum, lr, …, attn_drop.
NNI also allows conditional (nested) search spaces, as with the embedding_dim key above, whose sampled value is itself a dictionary. If I want to use num_heads inside main, the access pattern is:

arg["embedding_dim"]["num_heads"]

or, with fallbacks in case the key is absent (the fallback for embedding_dim has to be a dict, otherwise the chained .get() call would fail):

arg.get("embedding_dim", {}).get("num_heads", 2)
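Putting this together, a minimal sketch of unpacking the nested parameters inside main, assuming the nested-choice format shown above, where the selected branch carries its own value under "_name" (the defaults here are illustrative):

def unpack_params(params):
    # For a nested choice, the chosen branch's value sits under "_name"
    # and its sub-parameters (here num_heads) sit beside it.
    emb_cfg = params.get("embedding_dim", {"_name": 4, "num_heads": 2})
    return {
        "embedding_dim": emb_cfg["_name"],    # e.g. 12
        "num_heads": emb_cfg["num_heads"],    # e.g. 4
        "lr": params["lr"],
        "depth": params["depth"],
        "momentum": params["momentum"],
        "ifRes": params["ifRes"],
        "attnscore": params["attnscore"],
        "drop": params["drop"],
        "attn_drop": params["attn_drop"],
    }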
