Package: llama.cpp
Architecture: i386
Version: 2025.02.11-1
Priority: optional
Section: utils
Maintainer: gfdgd_xi <3025613752@qq.com>
Installed-Size: 61564
Depends: libc6 (>= 2.34), libgcc-s1 (>= 7), libgomp1 (>= 4.9), libstdc++6 (>= 12)
Filename: ./l/llama.cpp/llama.cpp_2025.02.11-1_i386.deb
Size: 8624688
MD5sum: 214bba3b4c87b670699ef6cb41d66832
SHA1: c21597f0cdb1194caefa40f877273b9f68e83820
SHA256: 2638138d005e6c3cc40d2688117df7e09fb05ef5b9c72acf615f640fbad32876
SHA512: 58acc71cced24e3b5bd33710eb1f2c5a955d012a6a1af778985ea1996d17c767bf022125b669123a751ed35e67dc43cef08464760ef47b871118218ed8b5f5bb
Homepage: https://gitee.com/GXDE-OS/llama.cpp
Description: Inference of Meta's LLaMA model (and others) in pure C/C++