---
language:
- en
library_name: transformers
license: apache-2.0
pipeline_tag: text-generation
---

# Model Card for Pixie Zehir Nano

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6320e992beec1969845be447/25pTrbjySoblu8cuiHASu.png)

Introducing Pixie Zehir Nano.

Excelling in writing.

A fine-tune of Llama 3.2 3B on HQ DATA™ from Pixie Zehir.

## Model Details

- **Developed by:** Maani x BLNKBLK
- **Language(s) (NLP):** English
- **License:** Apache 2.0
- **Finetuned from model:** [meta-llama/Llama-3.2-3B-Instruct](https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct)


## Agreements

This model is created for research purposes.

## Usage

```python
# Install dependencies first: pip install -qqq transformers accelerate
import torch
from transformers import pipeline

# Load the model; device_map="auto" requires the accelerate package.
pipe = pipeline(
    "text-generation",
    model="Maani/PXZNNO",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Write a haiku."
messages = [
    {"role": "user", "content": prompt},
]

# Build the chat-formatted prompt string from the message list.
formatted_prompt = pipe.tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
    enable_thinking=True,  # Switches between thinking and non-thinking modes. Default is True.
)

res = pipe(
    formatted_prompt,
    max_new_tokens=4096,
    temperature=0.8,
    top_p=0.95,
    top_k=20,
    min_p=0.1,
    do_sample=True,
)

print(res[0]["generated_text"])
```
```markdown
# Sample of output
<|begin_of_text|><|start_header_id|>system<|end_header_id|>

Enable deep thinking subroutine.<|eot_id|><|start_header_id|>user<|end_header_id|>

Write a haiku.<|eot_id|><|start_header_id|>assistant<|end_header_id|>

<think>
Okay, I need to write a haiku. Let me remember the structure. A haiku is a traditional Japanese poem with three lines. The syllable count is 5-7-5. The first line has 5 syllables, the second 7, and the third 5.

Hmm, what's a common haiku theme? Nature is often popular. Maybe something about seasons. Let me think of a season. Autumn? Leaves falling, that's a classic image.

First line: "Falling leaves dance" – that's 5 syllables. Leaves, falling, dance. Nice and simple.

Second line: "Whispers of autumn's sigh" – that's 7 syllables. Whispers, autumn's, sigh. It captures the gentle sound of falling leaves.

Third line: "Time's fleeting breath" – 5 syllables. Time's fleeting, breath. It's a metaphor for the passage of time.

Does that work? Let me check the syllable count again. Yes, 5-7-5. The imagery is cohesive, and it evokes a sense of time passing and the beauty of autumn. I think this works. Let me see if there's a better option, but I think this one is solid.
</think>

Falling leaves dance
Whispers of autumn's sigh
Time's fleeting breath
```
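
The sample output above includes a system message ("Enable deep thinking subroutine.") that the pipeline snippet does not set explicitly. Below is a minimal sketch of the same generation using `AutoModelForCausalLM` and `AutoTokenizer` directly; the model id and sampling parameters come from the snippet above, while the system prompt is an assumption inferred from the sample output rather than documented behavior.

```python
# Minimal sketch using AutoModelForCausalLM directly.
# The "Enable deep thinking subroutine." system prompt is an assumption
# inferred from the sample output above, not a documented requirement.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Maani/PXZNNO"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "system", "content": "Enable deep thinking subroutine."},
    {"role": "user", "content": "Write a haiku."},
]

# Tokenize the chat-formatted prompt and move it to the model's device.
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(
    inputs,
    max_new_tokens=4096,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
    top_k=20,
    min_p=0.1,
)

# Decode only the newly generated tokens (skip the prompt).
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```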

Thanks to mradermacher, you can find GGUF-quantized versions of the earlier 1.8B Zehir Nano at: mradermacher/PixieZehirNano-GGUF
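
For those GGUF files (note they correspond to the earlier 1.8B model, not this 3B fine-tune), a llama-cpp-python sketch like the following may work; the quantization filename pattern is an assumption, so check the repository's file list and adjust it to a file that actually exists.

```python
# Hedged sketch: load a GGUF quant of the earlier 1.8B Zehir Nano with llama-cpp-python.
# The filename pattern below is a guess; pick an actual file from the repo.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="mradermacher/PixieZehirNano-GGUF",
    filename="*Q4_K_M.gguf",  # assumed quant level; adjust as needed
    n_ctx=4096,
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write a haiku."}],
    temperature=0.8,
    top_p=0.95,
    max_tokens=512,
)
print(out["choices"][0]["message"]["content"])
```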