Support for MLX LM and llama.cpp
#2 opened by Narutoouz
thanks
There is actually a draft PR in the llama.cpp repo. It was paused due to lack of interest, but maybe with this release development will resume:
https://github.com/ggml-org/llama.cpp/pull/17454