bhogan committed · Commit 168d8fc · verified · 1 parent: 593259c

Update README.md

Files changed (1): README.md (+13 −3)
README.md CHANGED
````diff
@@ -12,7 +12,7 @@ base_model:
 **qqWen-1.5B-SFT** is a 1.5-billion parameter language model specifically designed for advanced reasoning and code generation in the Q programming language. Built upon the robust Qwen 2.5 architecture, this model has undergone a comprehensive two-stage training process: pretraining and supervised fine-tuning (SFT), for the Q programming language.
 
 
-**Associated Technical Report**: [Link to paper will be added here]
+**Associated Technical Report**: [Report](https://arxiv.org/abs/2508.06813)
 
 ## 🔤 About Q Programming Language
 
@@ -32,5 +32,15 @@ Q is a high-performance, vector-oriented programming language developed by Kx Sy
 
 
 ## 📝 Citation
-
-If you use this model in your research or applications, please cite our technical report.
+```
+If you use this model in your research or applications, please cite our technical report.
+@misc{hogan2025technicalreportfullstackfinetuning,
+      title={Technical Report: Full-Stack Fine-Tuning for the Q Programming Language},
+      author={Brendan R. Hogan and Will Brown and Adel Boyarsky and Anderson Schneider and Yuriy Nevmyvaka},
+      year={2025},
+      eprint={2508.06813},
+      archivePrefix={arXiv},
+      primaryClass={cs.LG},
+      url={https://arxiv.org/abs/2508.06813},
+}
+```
````