---
license: apache-2.0
datasets:
  - glue
---

# bert-base-mnlimm

## Model Details

`bert-base-uncased` fine-tuned on MNLI and evaluated on the mismatched validation split (MNLI-mm) of GLUE.
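MNLI is a three-way sentence-pair classification task (entailment / neutral / contradiction), so the model emits three logits per premise–hypothesis pair. A minimal sketch of turning those logits into a label, assuming the conventional MNLI label order (the actual order is set by the model's `id2label` config, which this card does not specify):

```python
import math

# Assumed label order; check the model config's id2label mapping before use.
LABELS = ["contradiction", "neutral", "entailment"]

def softmax(logits):
    """Numerically stable softmax over a list of raw scores."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict(logits):
    """Map the model's three output logits to (label, probability)."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[best], probs[best]
```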

## Parameter settings

- batch size: 32
- learning rate: 2e-5

## Metrics

accuracy (MNLI-mm validation): 0.8486
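The score above is plain classification accuracy, i.e. the fraction of pairs whose predicted label matches the gold label. A minimal sketch of that computation:

```python
def accuracy(predictions, references):
    """Fraction of positions where the prediction equals the gold label."""
    if len(predictions) != len(references):
        raise ValueError("predictions and references must have equal length")
    correct = sum(p == r for p, r in zip(predictions, references))
    return correct / len(references)
```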