Update README.md
README.md (CHANGED)
@@ -284,17 +284,4 @@ print(outputs[0]["generated_text"])

 Output:

-> A Mixture of Experts (ME) is a machine learning technique that combines multiple expert models to make predictions or decisions. Each expert model is specialized in a different aspect of the problem, and their outputs are combined to produce a more accurate and robust solution. This approach allows the model to leverage the strengths of individual experts and compensate for their weaknesses, improving overall performance.
-# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
-Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__Beyonder-4x7B-v2)
-
-| Metric                          |Value|
-|---------------------------------|----:|
-|Avg.                             |72.33|
-|AI2 Reasoning Challenge (25-Shot)|68.77|
-|HellaSwag (10-Shot)              |86.80|
-|MMLU (5-Shot)                    |65.10|
-|TruthfulQA (0-shot)              |60.68|
-|Winogrande (5-shot)              |80.90|
-|GSM8k (5-shot)                   |71.72|
-
+> A Mixture of Experts (ME) is a machine learning technique that combines multiple expert models to make predictions or decisions. Each expert model is specialized in a different aspect of the problem, and their outputs are combined to produce a more accurate and robust solution. This approach allows the model to leverage the strengths of individual experts and compensate for their weaknesses, improving overall performance.
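The quoted output gives a high-level description of mixture of experts: a gate scores several specialized expert networks and their outputs are combined according to those scores. As a rough, hypothetical sketch of that idea (not Beyonder-4x7B-v2's actual routing code), a dense softmax-gated MoE layer might look like this in PyTorch; the class name `SimpleMoE` and all sizes are illustrative only:

```python
import torch
import torch.nn as nn


class SimpleMoE(nn.Module):
    """Dense mixture-of-experts layer: a softmax gate weights every expert's output."""

    def __init__(self, dim: int, num_experts: int = 4):
        super().__init__()
        # Each "expert" is an ordinary feed-forward block; specialization emerges from training.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        # The gate (router) scores the experts for every input token.
        self.gate = nn.Linear(dim, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = torch.softmax(self.gate(x), dim=-1)                    # (..., num_experts)
        expert_outs = torch.stack([e(x) for e in self.experts], dim=-1)  # (..., dim, num_experts)
        # Combine the experts' outputs using the gate's weights, as the quote describes.
        return (expert_outs * weights.unsqueeze(-2)).sum(dim=-1)


x = torch.randn(2, 8, 64)        # (batch, sequence, hidden dim) -- toy sizes
print(SimpleMoE(64)(x).shape)    # torch.Size([2, 8, 64])
```

A production MoE typically routes each token to only the top-scoring experts rather than averaging over all of them; the dense version above is just the simplest form of the combination the quote describes.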