An Introduction to Tencent's TNN Inference Framework

GitHub - Tencent/TNN: TNN, developed by Tencent Youtu Lab and Guangying Lab, is a uniform deep learning inference framework for mobile, desktop, and server. TNN is distinguished by several outstanding features, including its cross-platform capability, high performance, model compression, and code pruning. Based on ncnn and Rapidnet, TNN further strengthens support and performance optimization for mobile devices, and also draws on the good extensibility and high performance of existing open-source efforts. TNN has been deployed in multiple Tencent apps, such as Mobile QQ, Weishi, and Pitu. Contributions are welcome to work in collaboration with us and make TNN a better framework. https://github.com/Tencent/TNN

Overview

TNN is Tencent's open-source, next-generation cross-platform deep learning inference framework, and the first fully open-source release to support mobile, desktop, and server platforms at the same time. The new version of TNN brings further improvements in generality, usability, and performance.

TNN repository: https://github.com/Tencent/TNN

Generality

While keeping the model format and the interface unified, TNN relies on the basic operator support of hardware vendors' acceleration frameworks as well as hand-written kernel optimization to offer a range of acceleration options for mobile, desktop, and server, with optimized adaptation for common CV and NLP models. A usage sketch follows below.

Hardware platform support

By integrating OpenVINO and TensorRT, TNN adds support for server-side X86 and NVIDIA hardware, which makes it possible both to pick up hardware vendors' latest optimizations quickly and to add custom implementations tailored to a model's structure to push performance to the limit. At the same time, taking into account the package-size constraints of desktop applications, TNN implements a lightweight X86 backend through JIT compilation and hand-tuned code, with a total library size of only about 5 MB.

Model and operator support

The new version of TNN extends its CV model coverage with support for model structures such as 3D CNN, LSTM, and BERT, and the total operator count has increased from 88.
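To make the "unified model, unified interface" idea concrete, the sketch below loads a converted .tnnproto/.tnnmodel pair and runs inference through TNN's C++ API, with the backend chosen solely through NetworkConfig. This is a minimal sketch based on my reading of the public headers (tnn/core/tnn.h, tnn/core/instance.h); the enum and method names, status handling, and file paths are assumptions that should be verified against the TNN version you build.

```cpp
// Minimal sketch: load a converted TNN model and run it through the unified
// C++ interface. Names follow the public headers as I understand them.
#include <fstream>
#include <memory>
#include <sstream>
#include <string>

#include "tnn/core/tnn.h"
#include "tnn/core/instance.h"

// Read a whole file into a string (proto and model contents are passed as strings).
static std::string ReadFile(const std::string& path) {
    std::ifstream file(path, std::ios::binary);
    std::ostringstream content;
    content << file.rdbuf();
    return content.str();
}

int main() {
    // The same converted model files are used on every platform;
    // only the NetworkConfig below changes per target.
    TNN_NS::ModelConfig model_config;
    model_config.model_type = TNN_NS::MODEL_TYPE_TNN;
    model_config.params = {ReadFile("model.tnnproto"),   // placeholder paths
                           ReadFile("model.tnnmodel")};

    TNN_NS::TNN net;
    TNN_NS::Status status = net.Init(model_config);
    if (status != TNN_NS::TNN_OK) return -1;

    // Backend selection: DEVICE_ARM / DEVICE_OPENCL / DEVICE_METAL on mobile,
    // DEVICE_X86 on desktop, DEVICE_CUDA on NVIDIA servers.
    TNN_NS::NetworkConfig network_config;
    network_config.device_type = TNN_NS::DEVICE_X86;

    auto instance = net.CreateInst(network_config, status);
    if (!instance || status != TNN_NS::TNN_OK) return -1;

    // Inputs are fed via Instance::SetInputMat and results read back via
    // Instance::GetOutputMat (omitted here for brevity).
    instance->Forward();
    return 0;
}
```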

TNN on the Server

By integrating OpenVINO and TensorRT, TNN adds support for server-side X86 and NVIDIA hardware: it can quickly pick up hardware vendors' latest optimizations, while still allowing custom implementations tailored to a business model's structure to push performance to the limit. Compared with the best-performing configuration of onnxruntime, the industry's unified server-side framework, TNN currently has an edge on CV models, while onnxruntime has an edge on NLP models. TNN has only recently begun supporting NLP models and will keep optimizing in this area.
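The server-side integration surfaces in the API as a network-type choice on top of the device type. The sketch below shows the two server configurations described above; the enum values (NETWORK_TYPE_OPENVINO, NETWORK_TYPE_TENSORRT, DEVICE_X86, DEVICE_CUDA) reflect my reading of tnn/core/common.h and assume TNN was compiled with the corresponding OpenVINO/TensorRT support enabled.

```cpp
// Sketch of server-side backend selection in TNN (names assumed from the
// public headers; verify against your build).
#include "tnn/core/common.h"

// X86 server: route execution through the integrated OpenVINO backend.
TNN_NS::NetworkConfig MakeX86ServerConfig() {
    TNN_NS::NetworkConfig config;
    config.device_type  = TNN_NS::DEVICE_X86;
    config.network_type = TNN_NS::NETWORK_TYPE_OPENVINO;
    return config;
}

// NVIDIA server: route execution through the integrated TensorRT backend.
TNN_NS::NetworkConfig MakeCudaServerConfig() {
    TNN_NS::NetworkConfig config;
    config.device_type  = TNN_NS::DEVICE_CUDA;
    config.network_type = TNN_NS::NETWORK_TYPE_TENSORRT;
    return config;
}
```

The resulting config is passed to TNN::CreateInst exactly as in the general sketch above, so switching between a CUDA/TensorRT deployment and an X86/OpenVINO one is a configuration change rather than a code change.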

TNN on the Desktop

To balance high performance with hardware compatibility, and mindful of the package-size limits of desktop applications, TNN implements a lightweight X86 backend through JIT compilation and hand-tuned kernels, supporting the SSE4.1, SSE4.2, AVX, AVX2, and FMA instruction sets. Compared with onnxruntime's roughly 80 MB server library, TNN's desktop library totals only about 5 MB, while the performance gap stays within 20%.
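To illustrate the design principle behind such a lightweight multi-ISA backend (this is not TNN's actual code), the sketch below detects the widest instruction set the host CPU supports and dispatches to the matching kernel variant, using the GCC/Clang builtin __builtin_cpu_supports; the enum and function names are hypothetical.

```cpp
// Illustration of runtime instruction-set dispatch for an X86 backend:
// detect the best ISA once, then route kernels to the widest supported variant.
#include <cstdio>

enum class X86Isa { kSse41, kSse42, kAvx, kAvx2Fma };

static X86Isa DetectIsa() {
    if (__builtin_cpu_supports("avx2") && __builtin_cpu_supports("fma"))
        return X86Isa::kAvx2Fma;
    if (__builtin_cpu_supports("avx"))    return X86Isa::kAvx;
    if (__builtin_cpu_supports("sse4.2")) return X86Isa::kSse42;
    return X86Isa::kSse41;  // treat SSE4.1 as the assumed baseline
}

int main() {
    switch (DetectIsa()) {
        case X86Isa::kAvx2Fma: std::printf("dispatching AVX2+FMA kernels\n"); break;
        case X86Isa::kAvx:     std::printf("dispatching AVX kernels\n");      break;
        case X86Isa::kSse42:   std::printf("dispatching SSE4.2 kernels\n");   break;
        case X86Isa::kSse41:   std::printf("dispatching SSE4.1 kernels\n");   break;
    }
    return 0;
}
```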
