
μž…λ ₯ ν…μŠ€νŠΈ μ˜ˆμ‹œ

μ•„λž˜λŠ” 저희 νŒ€μ—μ„œ κ΅¬μΆ•ν•œ RAG νŒŒμ΄ν”„λΌμΈμœΌλ‘œ μ™„μ„±λ˜λŠ” λͺ¨λΈ μž…λ ₯ ν…μŠ€νŠΈμ˜ μ˜ˆμ‹œμž…λ‹ˆλ‹€.

"""λ‹€μŒ 글은 μ–΄λ¬Έ κ·œλ²”μ— λ§žλŠ” ν‘œν˜„μ„ κ³ λ₯΄κ±°λ‚˜ μ–΄λ¬Έ κ·œλ²”μ— 따라 λ¬Έμž₯을 κ΅μ •ν•˜κ³  κ·Έ 이유λ₯Ό μ„€λͺ…ν•  수 μžˆλŠ” μžλ£Œμ΄λ‹€

2. When [w] comes after a consonant, it is written split into two syllables, except that [gw], [hw], and [kw] are written together as one syllable.
- swing[swiŋ] 스윙, twist[twist] 트위스트, penguin[peŋgwin] 펭귄, whistle[hwisl] 휘슬, quarter[kwɔːtə] 쿼터

λ‹€μŒμ€ 질문 μœ ν˜•κ³Ό λ‹΅λ³€ ν˜•μ‹μ„ μ΄ν•΄ν•˜λŠ” 데 도움이 λ˜λŠ” μ—¬λŸ¬ μ˜ˆμ‹œμž…λ‹ˆλ‹€. 이 μ˜ˆμ‹œλ“€μ˜ λ‚΄μš©μ— 얽맀이지 μ•Šκ³ , μ£Όμ–΄μ§„ μ§ˆλ¬Έμ— λŒ€ν•΄ κ°€μž₯ μ μ ˆν•œ 닡변을 μƒμ„±ν•˜μ„Έμš”.

question:""λ„€λœλž€λ“œμ˜ {헀이그/ν•˜ν}에 κ°€ λ΄€λ‹€."" κ°€μš΄λ° μ˜¬λ°”λ₯Έ 것을 μ„ νƒν•˜κ³ , κ·Έ 이유λ₯Ό μ„€λͺ…ν•˜μ„Έμš”.
answer:""λ„€λœλž€λ“œμ˜ 헀이그에 κ°€ λ΄€λ‹€.""κ°€ μ˜³λ‹€. μ›μ§€μŒμ΄ μ•„λ‹Œ 제3ꡭ의 발음으둜 ν†΅μš©λ˜κ³  μžˆλŠ” 것은 κ΄€μš©μ„ λ”°λ₯Έλ‹€. λ”°λΌμ„œ '헀이그'둜 μ“°λŠ” 것이 μ μ ˆν•˜λ‹€.

**Strictly adhere to the output format after 'answer:' above when generating your answer.**

이제 μ§ˆλ¬Έμ„ μ‹œμž‘ν•©λ‹ˆλ‹€.
question:""{νžˆμΉ˜ν•˜μ΄ν¬/νž›μΉ˜ν•˜μ΄ν¬}λ₯Ό ν•˜λ‹€."" κ°€μš΄λ° μ˜¬λ°”λ₯Έ 것을 μ„ νƒν•˜κ³ , κ·Έ 이유λ₯Ό μ„€λͺ…ν•˜μ„Έμš”."""

base_model: []
library_name: transformers
tags:
- mergekit
- merge

0725_5-merge

This is a merge of pre-trained language models created using mergekit.
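Assuming mergekit is installed, a merge like this is typically produced by saving the configuration shown below to a YAML file and invoking the `mergekit-yaml` CLI (a usage sketch; the file and output paths are illustrative):

```shell
# Install mergekit, then run the merge described by the YAML config.
pip install mergekit
mergekit-yaml merge_config.yaml ./0725_5-merge --cuda
```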

Merge Details

Merge Method

This model was merged using the SLERP merge method.
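SLERP (spherical linear interpolation) blends each pair of corresponding weight tensors along the arc between them rather than along a straight line, which preserves the interpolation geometry when the tensors point in different directions. A minimal NumPy sketch of the per-tensor operation (a simplification, not mergekit's actual implementation):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    # Compute the angle between the two tensors from their normalized dot product.
    v0n = v0 / (np.linalg.norm(v0) + eps)
    v1n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.sum(v0n * v1n), -1.0, 1.0)
    omega = np.arccos(dot)
    so = np.sin(omega)
    if abs(so) < eps:
        # Nearly parallel (or antipodal) tensors: fall back to linear interpolation.
        return (1.0 - t) * v0 + t * v1
    # Spherical interpolation: t=0 returns v0, t=1 returns v1.
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1
```

At `t=0` this returns the first model's tensor and at `t=1` the second's; intermediate `t` values trace the arc between them.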

Models Merged

The following models were included in the merge:

  • /home/infidea/rebirth-hjun/KRAG_2025/modle_merge/kanana-1.5-8b-instruct-2505-lora-20250715-1532
  • /home/infidea/rebirth-hjun/KRAG_2025/modle_merge/0725_4-merge

Configuration

The following YAML configuration was used to produce this model:

base_model: /home/infidea/rebirth-hjun/KRAG_2025/modle_merge/0725_4-merge
dtype: bfloat16
merge_method: slerp
modules:
  default:
    slices:
    - sources:
      - layer_range: [0, 32]
        model: /home/infidea/rebirth-hjun/KRAG_2025/modle_merge/0725_4-merge
      - layer_range: [0, 32]
        model: /home/infidea/rebirth-hjun/KRAG_2025/modle_merge/kanana-1.5-8b-instruct-2505-lora-20250715-1532
parameters:
  t:
  - filter: self_attn
    value: [0.0, 0.5, 0.7, 0.9, 1.0]
  - filter: mlp
    value: [1.0, 0.5, 0.7, 0.5, 0.0]
  - value: 0.8
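The `t` lists above are layer gradients: each list of anchor values is spread across the layers of the slice so that every layer gets its own interpolation weight (here, attention layers shift from the base model toward the LoRA-tuned model with depth, while MLP layers do roughly the reverse; unmatched tensors use the flat `t=0.8`). Assuming evenly spaced anchors and linear interpolation between them (a simplification of mergekit's gradient handling), the per-layer schedule can be computed like this:

```python
import numpy as np

def expand_gradient(values, num_layers):
    # Evenly space the anchor values over the layer index range and
    # linearly interpolate one t per layer (assumed mergekit-style gradient).
    anchors = np.linspace(0, num_layers - 1, num=len(values))
    layers = np.arange(num_layers)
    return np.interp(layers, anchors, values)

t_self_attn = expand_gradient([0.0, 0.5, 0.7, 0.9, 1.0], 32)
t_mlp = expand_gradient([1.0, 0.5, 0.7, 0.5, 0.0], 32)
```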
Model size: 8B params (Safetensors, BF16)

Model tree for overfit-brothers/KRAG-SOTA
