These are quantized ggml binary files for the MiniGPT-4 13B model.
These files can be used in conjunction with the [vicuna v0 ggml models](https://huggingface.co/datasets/maknee/ggml-vicuna-v0-quantized) to run MiniGPT-4.
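As a minimal sketch of fetching both sets of files, the `huggingface-cli download` command from the `huggingface_hub` package can be used. The repo id and filenames below are placeholders, not actual file names; check each repository's file list for the exact quantized variants available.

```shell
# Download a quantized MiniGPT-4 13B ggml file from this dataset repo
# (<this-repo-id> and <minigpt4-ggml-file> are placeholders; substitute
# the actual repo id and a filename from the repository's file list).
huggingface-cli download --repo-type dataset <this-repo-id> <minigpt4-ggml-file>

# Download the matching Vicuna v0 ggml model it pairs with
# (<vicuna-ggml-file> is a placeholder for a file in that repo).
huggingface-cli download --repo-type dataset maknee/ggml-vicuna-v0-quantized <vicuna-ggml-file>
```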
Not all quantized variants have been tested. If you run into issues with a quantized file, use the f16 version instead.