I quantized GLM-4.7-PRISM using the same method as Unsloth Dynamic's GLM 4.7 quants: I copied each tensor type from their quantization and used the same imatrix.
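For reference, the general imatrix-quantization recipe looks roughly like the sketch below, assuming llama.cpp's standard tools. The file names are placeholders, and exact flags depend on your llama.cpp build; check each tool's `--help`.

```shell
# Placeholder paths; substitute the real F16 GGUF, calibration text, and output names.

# 1. Compute (or reuse) an importance matrix from a calibration corpus.
#    Here I reused the same imatrix as the Unsloth Dynamic quants instead of computing one.
./llama-imatrix -m GLM-4.7-PRISM-F16.gguf -f calibration.txt -o imatrix.dat

# 2. Quantize with that imatrix so the most important weights keep more precision.
./llama-quantize --imatrix imatrix.dat \
    GLM-4.7-PRISM-F16.gguf GLM-4.7-PRISM-Q4_K_M.gguf Q4_K_M
```

Matching the per-tensor type choices additionally requires overriding individual tensor types during quantization; recent llama.cpp builds expose options for this (see `llama-quantize --help` in your build).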
If you would like a quantization type that I haven't uploaded yet, feel free to ask (the model is quite big, so I have only done a few quantizations so far).
If you would like to buy me a coffee: https://ko-fi.com/alicesynthesisthirty. Thank you, it's very much appreciated!
Quantizations available so far: 1-bit, 2-bit, 4-bit, 5-bit, 6-bit, 8-bit.