---
language: [en, vi]
license: apache-2.0
base_model: Qwen/Qwen2.5-1.5B-Instruct
tags: [qwen, lora, movie, domain-adaptation, sft, t4x2, kaggle, research]
pipeline_tag: text-generation
library_name: transformers
datasets: [nhonhoccode/movie-llm-artifacts]
model-index:
- name: Movie-Qwen1.5B (Domain SFT)
  results: []
---

# Movie-Qwen1.5B — Domain SFT (IMDb × Wikipedia)

**Date**: 2025-10-14

Single-process, model-parallel SFT with a LoRA adapter in FP16 on two T4 GPUs (T4×2). The loss is computed on answer tokens only (prompt tokens are masked), and training is time-capped. See the usage snippet in the repo.
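The "answer-only loss" above typically means prompt tokens are excluded from the cross-entropy loss by setting their labels to `-100` (the index PyTorch's `CrossEntropyLoss` ignores by default). A minimal sketch of that masking, with illustrative token ids and a hypothetical `build_labels` helper that is not the card's actual training code:

```python
IGNORE_INDEX = -100  # ignored by PyTorch CrossEntropyLoss by default

def build_labels(prompt_ids, answer_ids):
    """Concatenate prompt and answer token ids; mask prompt positions in labels.

    Only the answer positions keep their token ids, so the loss is computed
    on answer tokens alone.
    """
    input_ids = list(prompt_ids) + list(answer_ids)
    labels = [IGNORE_INDEX] * len(prompt_ids) + list(answer_ids)
    return input_ids, labels

# Example: a 3-token prompt followed by a 2-token answer (ids are made up).
inp, lab = build_labels([101, 7592, 102], [2023, 2003])
print(inp)  # [101, 7592, 102, 2023, 2003]
print(lab)  # [-100, -100, -100, 2023, 2003]
```

The same label list is what a collator would pad (with `-100`) before batching, so padding is also excluded from the loss.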