Package: llama.cpp
Architecture: amd64
Version: 2025.02.11-1
Priority: optional
Section: utils
Maintainer: gfdgd_xi <3025613752@qq.com>
Installed-Size: 60738
Depends: libc6 (>= 2.38), libgcc-s1 (>= 3.0), libgomp1 (>= 4.9), libstdc++6 (>= 14)
Filename: ./l/llama.cpp/llama.cpp_2025.02.11-1_amd64.deb
Size: 8565592
MD5sum: 885ed5ccee9d89f08fd349f6037a2dc8
SHA1: ea73974e818741bf78210ec363e5a827fb2fb302
SHA256: e7a03680d4fb4b3c3d57fa06cd3b004a10a0b299e80320856fc9275d42823739
SHA512: b6e25cd6558deba316d62b47a2f1dbb5c607e13cc41de03019a2f7d1b67a597cfbe50008ba86f63440a062983613f09d944dd03dd438c610729705fe172cee62
Homepage: https://gitee.com/GXDE-OS/llama.cpp
Description: Inference of Meta's LLaMA model (and others) in pure C/C++
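The MD5sum/SHA1/SHA256/SHA512 fields exist so a downloaded .deb can be verified before installation. A minimal sketch of that check, using a locally created stand-in file (`demo.deb` and its digest are illustrative; in practice you would compare the downloaded llama.cpp_2025.02.11-1_amd64.deb against the SHA256 value published in the entry above):

```shell
# Stand-in for a downloaded package file (hypothetical, for demonstration).
printf 'hello' > demo.deb

# Expected digest, as a repository index would publish it.
# (This is the well-known SHA-256 of the string "hello".)
expected="2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824"

# Compute the actual digest of the file on disk and compare.
actual="$(sha256sum demo.deb | awk '{print $1}')"
if [ "$actual" = "$expected" ]; then
    echo "checksum OK"
else
    echo "checksum MISMATCH: got $actual" >&2
    exit 1
fi
```

The same pattern applies to the other digest fields by swapping in `md5sum`, `sha1sum`, or `sha512sum`; package managers such as apt perform this comparison automatically when fetching from a signed index.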