mradermacher committed
Commit 8316b0a · verified · 1 Parent(s): aad17d3

auto-patch README.md

Files changed (1)
  1. README.md +6 -1
README.md CHANGED
@@ -3,6 +3,8 @@ base_model: zelk12/MT1-Gen2-BIMMMU-gemma-2-9B
  language:
  - en
  library_name: transformers
+ mradermacher:
+   readme_rev: 1
  quantized_by: mradermacher
  tags:
  - mergekit
@@ -18,6 +20,9 @@ tags:
  static quants of https://huggingface.co/zelk12/MT1-Gen2-BIMMMU-gemma-2-9B

  <!-- provided-files -->
+
+ ***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#MT1-Gen2-BIMMMU-gemma-2-9B-GGUF).***
+
  weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
  ## Usage

@@ -62,6 +67,6 @@ questions you might have and/or if you want some other model quantized.

  I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
  me use its servers and providing upgrades to my workstation to enable
- this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
+ this work in my free time.

  <!-- end -->