---
license: apache-2.0
tags:
  - generated_from_trainer
datasets:
  - glue
metrics:
  - matthews_correlation
model-index:
  - name: distilbert-base-uncased-finetuned-lora-cola
    results:
      - task:
          name: Text Classification
          type: text-classification
        dataset:
          name: glue
          type: glue
          args: cola
        metrics:
          - name: Matthews Correlation
            type: matthews_correlation
            value: 0.5049093009936784
---
# distilbert-base-uncased-finetuned-lora-cola
This model is a LoRA fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the GLUE CoLA dataset. It achieves the following results on the evaluation set:
- Matthews Correlation: 0.5049
- Trainable model parameters: 1,181,954
- Total model parameters: 68,136,964
- Percentage of trainable parameters: 1.73%
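The percentage above follows directly from the two parameter counts; a quick sanity check:

```python
# Recompute the parameter-efficiency figure quoted in this card.
trainable_params = 1_181_954   # trainable (LoRA adapter + classifier) parameters
total_params = 68_136_964      # all model parameters
pct_trainable = 100 * trainable_params / total_params
print(f"trainable: {pct_trainable:.2f}% of all parameters")  # ≈ 1.73%
```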
 
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-04
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- weight_decay: 0.01
- rank: 32
- lora_alpha: 16
- lora_dropout: 0.05
- num_epochs: 5
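A minimal sketch of how the LoRA hyperparameters above map to a `peft` `LoraConfig`. Note the `target_modules` choice is an assumption (the card does not state which DistilBERT modules were adapted), and the training loop itself is omitted:

```python
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, TaskType, get_peft_model

# Base model: DistilBERT with a 2-way classification head for CoLA
# (acceptable vs. unacceptable sentences).
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

# LoRA settings taken from the hyperparameter list above.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=32,                               # rank
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_lin", "v_lin"],  # assumption: attention query/value projections
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # reports trainable vs. total parameter counts
```

With this config only the injected low-rank adapters (plus the classifier head) receive gradients, which is what keeps the trainable fraction near the 1.73% reported above.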