Sakalti committed on
Commit 4ceceb0 · verified · 1 Parent(s): 448a984

Update README.md

Files changed (1): README.md +9 -10
README.md CHANGED
@@ -10,26 +10,26 @@ library_name: transformers
  ---
 
 
- # Model Card for Model ID
+ # Model Card for SakaMoe-3x14B
 
  <!-- Provide a quick summary of what the model is/does. -->
 
- This modelcard aims to be a base template for new models. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/modelcard_template.md?plain=1).
 
- ## Model Details
 
+ ## Model Details
+ This model rebuilds [Saka-14B](https://huggingface.co/Sakalti/Saka-14B) as a Mixture of Experts (MoE) and has approximately 45 billion parameters.
  ### Model Description
 
  <!-- Provide a longer summary of what this model is. -->
 
 
 
- - **Developed by:** [More Information Needed]
+ - **Developed by:** Sakalti
  - **Funded by [optional]:** [More Information Needed]
  - **Shared by [optional]:** [More Information Needed]
- - **Model type:** [More Information Needed]
- - **Language(s) (NLP):** [More Information Needed]
- - **License:** [More Information Needed]
+ - **Model type:** Mixture of Experts (MoE)
+ - **Language(s) (NLP):** Japanese, English
+ - **License:** apache-2.0
  - **Finetuned from model [optional]:** [More Information Needed]
 
  ### Model Sources [optional]
@@ -63,9 +63,8 @@ This modelcard aims to be a base template for new models. It has been generated
  [More Information Needed]
 
  ## Bias, Risks, and Limitations
-
- <!-- This section is meant to convey both technical and sociotechnical limitations. -->
-
+ <!-- This section is meant to convey both technical and sociotechnical limitations. -->
+ 1. This model supports Japanese and English, so unexpected responses may occur in other languages.
  [More Information Needed]
 
  ### Recommendations
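
For readers of the updated card, a minimal loading sketch for the model described in the new "Model Details" text follows. It assumes the checkpoint is published as a `transformers`-compatible causal language model under the repo id `Sakalti/SakaMoe-3x14B` (inferred from the card title and author; the diff itself states neither the repo id nor the pipeline type).

```python
# Hypothetical usage sketch: the repo id and causal-LM pipeline are assumptions,
# not confirmed by the diff above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Sakalti/SakaMoe-3x14B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # shard the ~45B-parameter MoE across available GPUs (requires accelerate)
    torch_dtype="auto",  # keep the checkpoint's native precision
)

# The card lists Japanese and English as supported languages.
prompt = "こんにちは。簡単に自己紹介してください。"  # "Hello. Please introduce yourself briefly."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```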