Package: llama.cpp
Architecture: amd64
Version: 2025.02.11-1
Priority: optional
Section: utils
Maintainer: gfdgd_xi <3025613752@qq.com>
Installed-Size: 56771
Depends: libc6 (>= 2.34), libgcc-s1 (>= 3.0), libgomp1 (>= 4.9), libstdc++6 (>= 12)
Filename: ./l/llama.cpp/llama.cpp_2025.02.11-1_amd64.deb
Size: 8289304
MD5sum: 4476caf59724917a3d23ccb43e7331e3
SHA1: b48e3ccf5226524f032b61c13266bbb25423f418
SHA256: 0f7d9f42d549ef2c6af4b0a76575110d20310c1a643337a2ba8f4a5f46d0654a
SHA512: 723bf66cfa0935359ea5e0c9c0c0941961f4dbd4ea784a323003688e60c250e777c6ff88f9e3b349fdf67cd46096c3153b993d12051e4c9b7ef537d4684bcaef
Homepage: https://gitee.com/GXDE-OS/llama.cpp
Description: Inference of Meta's LLaMA model (and others) in pure C/C++