Enables R users to run large language models locally using 'GGUF' model files and the 'llama.cpp' inference engine. Provides a complete R interface for loading models, generating text completions, and streaming responses in real time. Supports local inference without requiring cloud APIs or internet connectivity, ensuring complete data privacy and control. References: Gerganov et al. (2023) <https://github.com/ggml-org/llama.cpp>.
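The workflow described above (load a local GGUF model, generate a completion, no network access) can be sketched as below. The function names (`edge_load_model`, `edge_completion`, `edge_free_model`), their arguments, and the model path are illustrative assumptions, not the verified API; consult the reference manual for the actual interface.

```r
## Minimal sketch of local inference with a GGUF model.
## All function names and arguments here are assumed for
## illustration -- check the edgemodelr reference manual.
library(edgemodelr)

## Path to a locally downloaded GGUF model file (hypothetical path)
model_path <- "models/llama-3.2-1b-q4_k_m.gguf"

## Load the model via the bundled llama.cpp backend
ctx <- edge_load_model(model_path, n_ctx = 2048)

## Generate a text completion entirely on the local machine;
## no cloud API or internet connection is involved
out <- edge_completion(ctx,
                       prompt = "Explain GGUF in one sentence.",
                       n_predict = 64)
cat(out)

## Release the model's memory when done
edge_free_model(ctx)
```

Because inference runs in-process through llama.cpp, prompts and outputs never leave the machine, which is the basis of the privacy claim in the Description.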
Version: 0.1.0
Depends: R (≥ 4.0)
Imports: Rcpp (≥ 1.0.0), utils, tools
LinkingTo: Rcpp
Suggests: testthat (≥ 3.0.0), knitr, rmarkdown
Published: 2025-09-22
Author: Pawan Rama Mali [aut, cre]
Maintainer: Pawan Rama Mali <prm at outlook.in>
BugReports: https://github.com/PawanRamaMali/edgemodelr/issues
License: MIT + file LICENSE
URL: https://github.com/PawanRamaMali/edgemodelr
NeedsCompilation: yes
SystemRequirements: C++17, GNU make or equivalent for building
CRAN checks: edgemodelr results [issues need fixing before 2025-10-06]
Reference manual: edgemodelr.html, edgemodelr.pdf
Package source: edgemodelr_0.1.0.tar.gz
Windows binaries: r-devel: not available, r-release: edgemodelr_0.1.0.zip, r-oldrel: not available
macOS binaries: r-release (arm64): not available, r-oldrel (arm64): not available, r-release (x86_64): not available, r-oldrel (x86_64): not available
Please use the canonical form https://CRAN.R-project.org/package=edgemodelr to link to this page.