Powell-Phi3-Mini is a fine-tuned language model that replicates Federal Reserve Chair Jerome Powell's distinctive communication style, tone, and strategic hedging patterns. The project showcases modern LLM fine-tuning techniques, parameter-efficient training methods, and responsible AI development, demonstrating industry-ready machine learning engineering skills.
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model (device_map="auto" places weights on available devices)
tokenizer = AutoTokenizer.from_pretrained("BoostedJonP/powell-phi3-mini")
model = AutoModelForCausalLM.from_pretrained("BoostedJonP/powell-phi3-mini", device_map="auto")

# Economic analysis prompt
prompt = "How is the current labor market affecting your inflation outlook?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
response = model.generate(**inputs, max_new_tokens=200, do_sample=True)
print(tokenizer.decode(response[0], skip_special_tokens=True))
```
| Component | Specification |
|---|---|
| Base Model | microsoft/Phi-3-mini-4k-instruct (3.8B parameters) |
| License | MIT License (Commercial Use Approved) |
| Fine-tuning Method | QLoRA with PEFT integration |
| Context Length | 4,096 tokens |
| Training Hardware | NVIDIA Tesla P100 (16 GB VRAM) |
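As a quick sanity check on these specs (my arithmetic, not stated on the card): 3.8B parameters stored in fp16 take roughly 7.6 GB, which lines up with the 7.4GB full-model upload listed below and shows why mixed precision matters on a 16 GB P100.

```python
# Back-of-envelope weight memory for a 3.8B-parameter model
params = 3.8e9

fp16_gb = params * 2 / 1e9  # 2 bytes per parameter in fp16
fp32_gb = params * 4 / 1e9  # 4 bytes per parameter in fp32

print(f"fp16 weights: ~{fp16_gb:.1f} GB")  # ~7.6 GB
print(f"fp32 weights: ~{fp32_gb:.1f} GB")  # ~15.2 GB, nearly the whole 16 GB card before activations
```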
| Hyperparameter | Value | Rationale |
|---|---|---|
| LoRA Rank (r) | 16 | Optimal parameter/performance balance |
| LoRA Alpha | 32 | 2x rank for stable training |
| Dropout Rate | 0.05 | Light regularization to curb overfitting |
| Learning Rate | 1.5e-4 | Conservative rate for stable convergence |
| Scheduler | Cosine decay | Smooth learning rate reduction |
| Training Epochs | 3 | Prevents overfitting on specialized domain |
| Sequence Length | 1,536 tokens | Matched to the dataset's example lengths |
| Precision | Mixed fp16 | ~2x memory savings with accuracy maintained |
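The hyperparameters above translate into a PEFT configuration roughly like the following. This is a sketch, not the exact training script; in particular, the `target_modules` list is an assumption typical for Phi-3, since the card does not name the adapted layers.

```python
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,                    # LoRA rank
    lora_alpha=32,           # 2x rank for stable training
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    # Assumed attention/MLP projection names for Phi-3; not confirmed by the card
    target_modules=["qkv_proj", "o_proj", "gate_up_proj", "down_proj"],
)
```

With QLoRA, this config would be applied on top of a 4-bit quantized base model so only the low-rank adapter weights are trained.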
| Metric | Baseline (Phi-3) | Powell-Phi3-Mini | Improvement |
|---|---|---|---|
| Powell-style Classification | NA | NA | NA |
| Economic Domain Accuracy | NA | NA | NA |
| Response Coherence (BLEU) | NA | NA | NA |
Try the Powell-Phi3-Mini Interactive Demo
- BoostedJonP/powell-phi3-mini-adapter (LoRA adapter)
- BoostedJonP/powell-phi3-mini (Full Model - 7.4GB)
- Base model: microsoft/Phi-3-mini-4k-instruct
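If you already have the base model cached, you can attach just the lightweight adapter repo instead of downloading the full merged weights. A minimal sketch, assuming `peft` is installed (repo names as listed above):

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load the original base model, then layer the Powell LoRA adapter on top
base = AutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3-mini-4k-instruct", device_map="auto"
)
model = PeftModel.from_pretrained(base, "BoostedJonP/powell-phi3-mini-adapter")
```

This keeps the download to the adapter weights only, at the cost of a small inference overhead unless you call `model.merge_and_unload()`.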