ncnn is a high-performance neural network inference framework optimized for the mobile platform
https://github.com/Tencent/ncnn
ncnn is a high-performance neural network inference computing framework optimized for mobile platforms. It has been designed with deployment and use on mobile phones in mind from the beginning, and it has no third-party dependencies. It is cross-platform and runs faster than all known open-source frameworks on mobile phone CPUs. Developers can easily deploy deep learning algorithm models to mobile platforms using efficient ncnn implementations, create intelligent apps, and bring artificial intelligence to your fingertips. ncnn is currently used in many Tencent applications, such as QQ, Qzone, WeChat, and Pitu.
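For illustration, below is a minimal sketch of running inference with the ncnn C++ API. The model file names ("model.param"/"model.bin"), the blob names ("data"/"output"), and the input dimensions are placeholders that depend on the exported network; they are not defined by this package.

```cpp
// Minimal ncnn inference sketch. File names and blob names below are
// placeholders and must match the actual exported model.
#include <ncnn/net.h>   // header location may differ depending on how ncnn is installed

int main()
{
    ncnn::Net net;
    if (net.load_param("model.param") != 0 || net.load_model("model.bin") != 0)
        return -1;  // failed to load the model files

    // Dummy input blob shaped like a 224x224 RGB image (w, h, c).
    ncnn::Mat in(224, 224, 3);
    in.fill(0.f);

    ncnn::Extractor ex = net.create_extractor();
    ex.input("data", in);      // "data" is an assumed input blob name

    ncnn::Mat out;
    ex.extract("output", out); // "output" is an assumed output blob name

    return 0;
}
```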
- Links to X11:Deepin:Factory / ncnn
Checkout Package
osc -A https://api.opensuse.org checkout home:hillwood:branches:X11:Deepin:Factory/ncnn && cd $_
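After checkout, the package can typically be built locally with osc build REPOSITORY ARCH, using a repository and architecture enabled for the project.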
Source Files (merged sources derived from the linked package)
Filename | Size | Changed
---|---|---
_link | 126 Bytes |
ncnn-20250503.tar.gz | 12.6 MB |
ncnn.changes | 1.59 KB |
ncnn.spec | 3 KB |