---
license: apache-2.0
datasets:
- glue
---
This model is bert-base-uncased fine-tuned on MNLI-mm, the mismatched split of the MultiNLI task from the GLUE benchmark.

Base model: bert-base-uncased
Fine-tuning task: MNLI-mm
Fine-tuning used a batch size of 32 and a learning rate of 2e-5.
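
A minimal fine-tuning sketch with the Hugging Face Transformers Trainer is shown below. The batch size and learning rate come from this card; the epoch count, maximum sequence length, and output directory name are illustrative assumptions.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Load MNLI from GLUE and tokenize premise/hypothesis pairs.
raw = load_dataset("glue", "mnli")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    # max_length=128 is an assumption; the card does not state it.
    return tokenizer(batch["premise"], batch["hypothesis"],
                     truncation=True, max_length=128)

encoded = raw.map(tokenize, batched=True)

# MNLI has three labels: entailment, neutral, contradiction.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=3)

# Batch size 32 and learning rate 2e-5 come from the card;
# the epoch count is an assumption.
args = TrainingArguments(
    output_dir="bert-base-uncased-mnli",
    per_device_train_batch_size=32,
    learning_rate=2e-5,
    num_train_epochs=3,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation_mismatched"],
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
)
trainer.train()
```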
Accuracy on the mismatched validation set: 0.8486
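
A usage sketch for the resulting NLI classifier follows. "bert-base-uncased-mnli" is the hypothetical output directory from the fine-tuning sketch above, not a published model id, and the label order assumes the GLUE/MNLI mapping used by the datasets library.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hypothetical local path from the fine-tuning sketch above;
# replace it with the actual model id or checkpoint directory.
model_dir = "bert-base-uncased-mnli"
tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModelForSequenceClassification.from_pretrained(model_dir)
model.eval()

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1).squeeze()

# Assumed GLUE/MNLI label order: 0 = entailment, 1 = neutral, 2 = contradiction.
for name, p in zip(["entailment", "neutral", "contradiction"], probs.tolist()):
    print(f"{name}: {p:.3f}")
```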