Unreal Engine NeuralNetworkInference Translation

/**
 * NeuralNetworkInference (NNI) is Unreal Engine's framework for running deep learning and neural network inference. It is focused on:
 * - Efficiency: Built on state-of-the-art accelerators (DirectML, AVX, CoreML, etc.).
 * - Ease-of-use: Simple but powerful API.
 * - Completeness: All the functionality of any state-of-the-art deep learning framework.
 *
 * UNeuralNetwork is the key class of NNI, and the main one users should interact with. It represents the deep neural model itself. It is capable of
 * loading and running inference (i.e., a forward pass) on any ONNX (Open Neural Network eXchange) model. ONNX is the industry standard for ML
 * interoperability, and all major frameworks (PyTorch, TensorFlow, MXNet, Caffe2, etc.) provide converters to ONNX.
 *
 * The following code snippets show the UNeuralNetwork basics (reading an ONNX model and running inference on it). For more detailed examples, see
 * {UE5}/Sa

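The snippet referenced above is cut off, so here is a minimal sketch of the usage pattern the comment describes, based on the UNeuralNetwork API exposed by the NeuralNetworkInference plugin around UE 4.26–5.0. Treat it as an illustration under stated assumptions rather than the official sample: the RunNNIExample function, the .onnx file path, and the zero-filled input are placeholders, and method names such as Load, IsGPUSupported, SetDeviceType, SetInputFromArrayCopy, Run, and GetOutputTensor may differ between engine versions.

```cpp
// Minimal sketch (assumption: the NeuralNetworkInference plugin is enabled and the
// module lists "NeuralNetworkInference" as a dependency in its Build.cs).
#include "CoreMinimal.h"
#include "UObject/Package.h"
#include "NeuralNetwork.h"

void RunNNIExample() // hypothetical helper, not part of the engine API
{
	// Create the UNeuralNetwork object that will own the model.
	UNeuralNetwork* Network = NewObject<UNeuralNetwork>(
		(UObject*)GetTransientPackage(), UNeuralNetwork::StaticClass());

	// Load the ONNX model from disk (placeholder path).
	const FString ONNXModelFilePath = TEXT("SOME_PARENT_FOLDER/SOME_ONNX_FILE_NAME.onnx");
	if (!Network->Load(ONNXModelFilePath))
	{
		UE_LOG(LogTemp, Warning, TEXT("Failed to load ONNX model from %s."), *ONNXModelFilePath);
		return;
	}

	// Prefer GPU inference when the platform supports it, otherwise stay on CPU.
	if (Network->IsGPUSupported())
	{
		Network->SetDeviceType(ENeuralDeviceType::GPU);
	}
	else
	{
		Network->SetDeviceType(ENeuralDeviceType::CPU);
	}

	// Fill the input tensor; the array size must match the model's expected input size.
	TArray<float> InputData;
	InputData.Init(0.f, static_cast<int32>(Network->GetInputTensor().Num()));
	Network->SetInputFromArrayCopy(InputData);

	// Run inference (a forward pass).
	Network->Run();

	// Read back the output tensor.
	const FNeuralTensor& OutputTensor = Network->GetOutputTensor();
	UE_LOG(LogTemp, Display, TEXT("Inference done, output tensor has %lld elements."),
		(int64)OutputTensor.Num());
}
```

Run() as sketched here is the synchronous path; check the headers of your specific engine version, since the NNI API changed across releases and was later superseded by a newer inference framework.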