CrossEncoder based on jhu-clsp/ettin-encoder-17m

This is a Cross Encoder model finetuned from jhu-clsp/ettin-encoder-17m on the msmarco dataset using the sentence-transformers library. It computes scores for pairs of texts, which can be used for text reranking and semantic search.

Model Details

Model Description

  • Model Type: Cross Encoder
  • Base model: jhu-clsp/ettin-encoder-17m
  • Maximum Sequence Length: 512 tokens
  • Number of Output Labels: 1 label
  • Training Dataset: msmarco
  • Language: en

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import CrossEncoder

# Download from the 🤗 Hub
model = CrossEncoder("tomaarsen/ms-marco-ettin-17m-reranker")
# Get scores for pairs of texts
pairs = [
    ['the name olmec means', 'The name Olmec comes from the Nahuatl word for the Olmecs: Ōlmēcatl [oːlˈmeːkat͡ɬ] (singular) or Ōlmēcah [oːlˈmeːkaʔ] (plural). This word is composed of the two words ōlli [ˈoːlːi] , meaning rubber, and mēcatl [ˈmeːkat͡ɬ] , meaning people, so the word means rubber people.hey lived in the tropical lowlands of south-central Mexico, in the present-day states of Veracruz and Tabasco. It has been speculated that Olmec derive in part from neighboring Mokaya and/or Mixe–Zoque.'],
    ['what causes hissing in the ears', 'The following medical conditions are some of the possible causes of Hissing in ears. There are likely to be other possible causes, so ask your doctor about your symptoms. 1  Acute ear infection.  Chronic ear infection.'],
    ['multiple birth sibling definition', 'multiple birth(Noun) a birth in which more than one baby are born. Multiple birth. A multiple birth occurs when more than one fetus is carried to term in a single pregnancy. Different names for multiple births are used, depending on the number of offspring.'],
    ['what cause fluid build up on the knee', 'No: The joint should not have any appreciable fluid in it normally. Fluid can be there from many reasons. Arthritis can cause fluid. Also surgery can cause some bleeding in the knee which can cause the fluid to build up. The fluid makes the joint stiff and makes it hard to get a good quad function. ...Read more.'],
    ['how long is the wait list for kidney transplant', 'At the center of this is the simple fact that organ transplantation is built upon altruism and public trust. Gift of Life works hard to ensure that this trust is maintained, through its commitment both to the donor family as well as to those on the waiting list. Average Median Wait Time to Transplant. Kidney – 5 years.'],
]
scores = model.predict(pairs)
print(scores.shape)
# (5,)

# Or rank different texts based on similarity to a single text
ranks = model.rank(
    'the name olmec means',
    [
        'The name Olmec comes from the Nahuatl word for the Olmecs: Ōlmēcatl [oːlˈmeːkat͡ɬ] (singular) or Ōlmēcah [oːlˈmeːkaʔ] (plural). This word is composed of the two words ōlli [ˈoːlːi] , meaning rubber, and mēcatl [ˈmeːkat͡ɬ] , meaning people, so the word means rubber people.hey lived in the tropical lowlands of south-central Mexico, in the present-day states of Veracruz and Tabasco. It has been speculated that Olmec derive in part from neighboring Mokaya and/or Mixe–Zoque.',
        'The following medical conditions are some of the possible causes of Hissing in ears. There are likely to be other possible causes, so ask your doctor about your symptoms. 1  Acute ear infection.  Chronic ear infection.',
        'multiple birth(Noun) a birth in which more than one baby are born. Multiple birth. A multiple birth occurs when more than one fetus is carried to term in a single pregnancy. Different names for multiple births are used, depending on the number of offspring.',
        'No: The joint should not have any appreciable fluid in it normally. Fluid can be there from many reasons. Arthritis can cause fluid. Also surgery can cause some bleeding in the knee which can cause the fluid to build up. The fluid makes the joint stiff and makes it hard to get a good quad function. ...Read more.',
        'At the center of this is the simple fact that organ transplantation is built upon altruism and public trust. Gift of Life works hard to ensure that this trust is maintained, through its commitment both to the donor family as well as to those on the waiting list. Average Median Wait Time to Transplant. Kidney – 5 years.',
    ]
)
# [{'corpus_id': ..., 'score': ...}, {'corpus_id': ..., 'score': ...}, ...]
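
The reranking pattern behind model.rank can also be sketched framework-free. In the toy sketch below, rerank and overlap_score are hypothetical stand-ins: overlap_score (plain token overlap) plays the role of the cross-encoder's predict purely for illustration, and the output mirrors the [{'corpus_id': ..., 'score': ...}] shape shown above.

```python
def rerank(query, candidates, score_fn, top_k=None):
    """Score each (query, candidate) pair and sort by descending score,
    mirroring the [{'corpus_id': ..., 'score': ...}] output of model.rank."""
    scored = [
        {"corpus_id": i, "score": score_fn(query, doc)}
        for i, doc in enumerate(candidates)
    ]
    scored.sort(key=lambda item: item["score"], reverse=True)
    return scored[:top_k] if top_k else scored

def overlap_score(query, doc):
    # Toy stand-in for the cross-encoder: fraction of query tokens in the doc.
    q = set(query.lower().split())
    d = set(doc.lower().split())
    return len(q & d) / len(q)

ranks = rerank("hissing in the ears", [
    "Possible causes of hissing in ears include ear infections.",
    "A multiple birth occurs when more than one fetus is carried to term.",
], overlap_score)
print(ranks[0]["corpus_id"])  # 0
```

In practice the scoring function is the trained model's predict, which jointly encodes each (query, passage) pair; only the sorting logic is shown here.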

Evaluation

Metrics

Cross Encoder Reranking

  • Datasets: NanoMSMARCO_R100, NanoNFCorpus_R100 and NanoNQ_R100
  • Evaluated with CrossEncoderRerankingEvaluator with these parameters:
    {
        "at_k": 10,
        "always_rerank_positives": true
    }
    
| Metric  | NanoMSMARCO_R100 | NanoNFCorpus_R100 | NanoNQ_R100      |
| map     | 0.6239 (+0.1343) | 0.3478 (+0.0868)  | 0.6263 (+0.2067) |
| mrr@10  | 0.6129 (+0.1354) | 0.5462 (+0.0464)  | 0.6337 (+0.2070) |
| ndcg@10 | 0.6576 (+0.1172) | 0.3996 (+0.0745)  | 0.6720 (+0.1714) |
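
These are standard rank metrics. As a minimal self-contained sketch (not the CrossEncoderRerankingEvaluator implementation), mrr@10 and ndcg@10 over a binary relevance ranking can be computed as:

```python
import math

def mrr_at_k(relevance, k=10):
    """Reciprocal rank of the first relevant hit in the top-k (0 if none)."""
    for rank, rel in enumerate(relevance[:k], start=1):
        if rel:
            return 1.0 / rank
    return 0.0

def ndcg_at_k(relevance, k=10):
    """Normalized discounted cumulative gain over the top-k results."""
    dcg = sum(rel / math.log2(rank + 1)
              for rank, rel in enumerate(relevance[:k], start=1))
    ideal = sorted(relevance, reverse=True)[:k]
    idcg = sum(rel / math.log2(rank + 1)
               for rank, rel in enumerate(ideal, start=1))
    return dcg / idcg if idcg > 0 else 0.0

# Only relevant document ranked second:
print(mrr_at_k([0, 1, 0, 0]))  # 0.5
```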

Cross Encoder Nano BEIR

  • Dataset: NanoBEIR_R100_mean
  • Evaluated with CrossEncoderNanoBEIREvaluator with these parameters:
    {
        "dataset_names": [
            "msmarco",
            "nfcorpus",
            "nq"
        ],
        "rerank_k": 100,
        "at_k": 10,
        "always_rerank_positives": true
    }
    
| Metric  | Value            |
| map     | 0.5327 (+0.1426) |
| mrr@10  | 0.5976 (+0.1296) |
| ndcg@10 | 0.5764 (+0.1210) |
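
The NanoBEIR_R100_mean figures match the arithmetic mean of the three per-dataset values reported in the previous section; for ndcg@10:

```python
# Per-dataset ndcg@10 values from the reranking table above.
ndcg_at_10 = {"msmarco": 0.6576, "nfcorpus": 0.3996, "nq": 0.6720}
mean_ndcg = sum(ndcg_at_10.values()) / len(ndcg_at_10)
print(round(mean_ndcg, 4))  # 0.5764
```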

Training Details

Training Dataset

msmarco

  • Dataset: msmarco at 9e329ed
  • Size: 39,770,704 training samples
  • Columns: query_id, positive_id, negative_id, and score
  • Approximate statistics based on the first 1000 samples:
    • query_id: string; min: 11 characters, mean: 33.5 characters, max: 98 characters
    • positive_id: string; min: 68 characters, mean: 358.26 characters, max: 988 characters
    • negative_id: string; min: 50 characters, mean: 346.46 characters, max: 969 characters
    • score: float; min: -3.69, mean: 13.4, max: 22.15
  • Samples:
    • query_id: how do you attach a sanding screen to a floor buffer
      positive_id: Lay the buffer on the floor so you can get underneath the housing and hook the screening brush to the central spindle. Center a sanding screen on the brush and push on it so it stays in place, then right the buffer and allow the housing to rest on the floor. Plug in the machine and grasp the handles with both hands before you turn it on.
      negative_id: The random orbital sander used an offset drive bearing that causes the pad to also move in an elliptical pattern. These pad movements are used in combination, randomly of course to help reduce the swirls that a non-random sheet sander might leave behind. Mechanics aside, a mojor difference is in the sanding medium. Most random orbital sanders use sanding disks, typically in a 5-inch diameter. They are attached to the sander's pad using hook & loop connections. The sanding disks have holes in them that match-up with the dust collection holes in the sander's pad.
      score: 15.936323801676433
    • query_id: define method overriding
      positive_id: Method overriding is when a child class redefines the same method as a parent class, with the same parameters. For example, the standard Java class java.util.LinkedHashSet e…xtends java.util.HashSet. The method add() is overridden in LinkedHashSet.
      negative_id: In spite of its limitations, the MET concept provides a convenient method to describe the functional capacity or exercise tolerance of an individual as determined from progressive exercise testing and to define a repertoire of physical activities in which a person may participate safely, without exceeding a prescribed intensity level. PMID: 2204507
      score: 19.40437730153402
    • query_id: toxic effects of constipation
      positive_id: by Jane Kriese. The colon is the part of the large intestine extending from the cecum to the rectum. And the most common sign of having a toxic colon is a condition called constipation. An accumulation of waste food byproducts in the colon can lead to constipation, toxic build up, weight gain and low energy.
      negative_id: In Summary. Commonly reported side effects of tizanidine include: bradycardia, dizziness, drowsiness, hypotension, asthenia, fatigue, and xerostomia. Other side effects include: constipation, and increased liver enzymes. See below for a comprehensive list of adverse effects.
      score: 7.669035658240318
  • Loss: MarginMSELoss with these parameters:
    {
        "activation_fn": "torch.nn.modules.linear.Identity"
    }
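
MarginMSELoss regresses the student's positive-minus-negative score margin onto a teacher-provided margin, which is typically what the score column of the samples above stores. A minimal sketch of the quantity being minimized (the sentence-transformers implementation differs in details; with the Identity activation the raw logits are used directly):

```python
def margin_mse(student_pos, student_neg, teacher_margins):
    """Mean squared error between the student's (positive - negative)
    score margins and the teacher-provided margins."""
    diffs = [
        ((sp - sn) - t) ** 2
        for sp, sn, t in zip(student_pos, student_neg, teacher_margins)
    ]
    return sum(diffs) / len(diffs)

# Student margin already matches the teacher margin -> zero loss:
print(margin_mse([2.0], [0.5], [1.5]))  # 0.0
```

Because only the margin is supervised, the student is free to shift its absolute score scale, which is why the training-loss values in the logs below start large and shrink as margins align.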
    

Evaluation Dataset

msmarco

  • Dataset: msmarco at 9e329ed
  • Size: 10,000 evaluation samples
  • Columns: query_id, positive_id, negative_id, and score
  • Approximate statistics based on the first 1000 samples:
    • query_id: string; min: 10 characters, mean: 34.37 characters, max: 116 characters
    • positive_id: string; min: 61 characters, mean: 341.63 characters, max: 952 characters
    • negative_id: string; min: 73 characters, mean: 346.27 characters, max: 1350 characters
    • score: float; min: -0.85, mean: 13.13, max: 22.52
  • Samples:
    • query_id: the name olmec means
      positive_id: The name Olmec comes from the Nahuatl word for the Olmecs: Ōlmēcatl [oːlˈmeːkat͡ɬ] (singular) or Ōlmēcah [oːlˈmeːkaʔ] (plural). This word is composed of the two words ōlli [ˈoːlːi] , meaning rubber, and mēcatl [ˈmeːkat͡ɬ] , meaning people, so the word means rubber people.hey lived in the tropical lowlands of south-central Mexico, in the present-day states of Veracruz and Tabasco. It has been speculated that Olmec derive in part from neighboring Mokaya and/or Mixe–Zoque.
      negative_id: Jai: Glory to, Joy, Hail. Guru; Gu - means darkness, and Ru - means a remover. So, Guru means a remover of darkness. Deva; means a shining one and is the source of the English word, Divine. So when we say Jai Guru Deva we are giving thanks to whoever we feel is the source of knowledge to us.
      score: 19.567786693572998
    • query_id: what causes hissing in the ears
      positive_id: The following medical conditions are some of the possible causes of Hissing in ears. There are likely to be other possible causes, so ask your doctor about your symptoms. 1 Acute ear infection. Chronic ear infection.
      negative_id: There are many possible causes of itchy ears. Although the annoyance of itchy ears is not usually found to be serious, the condition is very real no matter the reason for the uncomfortable nuisance. Poking objects like cotton swabs, pencils, and paperclips in the ears to relieve the irritation is a dangerous and non-effective method for relief.
      score: 10.332231203715006
    • query_id: multiple birth sibling definition
      positive_id: multiple birth(Noun) a birth in which more than one baby are born. Multiple birth. A multiple birth occurs when more than one fetus is carried to term in a single pregnancy. Different names for multiple births are used, depending on the number of offspring.
      negative_id: From Wikipedia, the free encyclopedia. Not to be confused with Life unworthy of life or Wrongful birth. Wrongful life is the name given to a legal action in which someone is sued by a severely disabled child (through the child's legal guardian) for failing to prevent the child's birth.1 1 Definition.rongful life is the name given to a legal action in which someone is sued by a severely disabled child (through the child's legal guardian) for failing to prevent the child's birth. Contents.
      score: 16.912671327590942
  • Loss: MarginMSELoss with these parameters:
    {
        "activation_fn": "torch.nn.modules.linear.Identity"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 256
  • per_device_eval_batch_size: 256
  • learning_rate: 2e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • seed: 12
  • bf16: True
  • load_best_model_at_end: True
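
With lr_scheduler_type: linear and warmup_ratio: 0.1, the learning rate ramps linearly from 0 to 2e-05 over the first 10% of training steps and then decays linearly back to 0. A sketch of that schedule's shape (a hypothetical helper, not the Transformers scheduler itself, which differs slightly at the boundaries):

```python
def linear_warmup_decay_lr(step, total_steps, base_lr=2e-5, warmup_ratio=0.1):
    """Learning rate at a given step: linear warmup, then linear decay to 0."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = total_steps - step
    return base_lr * max(0.0, remaining / max(1, total_steps - warmup_steps))

print(linear_warmup_decay_lr(1000, 10000))  # 2e-05 (peak, at the end of warmup)
```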

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 256
  • per_device_eval_batch_size: 256
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 12
  • data_seed: None
  • jit_mode_eval: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: True
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • project: huggingface
  • trackio_space_id: trackio
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: no
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: True
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss Validation Loss NanoMSMARCO_R100_ndcg@10 NanoNFCorpus_R100_ndcg@10 NanoNQ_R100_ndcg@10 NanoBEIR_R100_mean_ndcg@10
-1 -1 - - 0.0277 (-0.5127) 0.3408 (+0.0157) 0.0189 (-0.4817) 0.1291 (-0.3262)
0.0001 1 203.5639 - - - - -
0.0020 39 206.4834 - - - - -
0.0040 78 202.3608 - - - - -
0.0060 117 197.2066 - - - - -
0.0080 156 189.4029 - - - - -
0.0100 195 180.9973 174.2619 0.0536 (-0.4868) 0.2671 (-0.0580) 0.0676 (-0.4330) 0.1294 (-0.3259)
0.0121 234 172.5803 - - - - -
0.0141 273 165.2367 - - - - -
0.0161 312 157.6653 - - - - -
0.0181 351 147.8596 - - - - -
0.0201 390 129.2817 113.0435 0.1823 (-0.3581) 0.2122 (-0.1128) 0.2045 (-0.2961) 0.1997 (-0.2557)
0.0221 429 97.5978 - - - - -
0.0241 468 67.4161 - - - - -
0.0261 507 52.7053 - - - - -
0.0281 546 45.7155 - - - - -
0.0301 585 41.9731 41.3247 0.4712 (-0.0692) 0.2975 (-0.0275) 0.4695 (-0.0311) 0.4127 (-0.0426)
0.0321 624 39.3177 - - - - -
0.0341 663 36.7606 - - - - -
0.0362 702 34.6954 - - - - -
0.0382 741 33.4444 - - - - -
0.0402 780 31.8238 31.8207 0.5084 (-0.0320) 0.3346 (+0.0096) 0.5137 (+0.0131) 0.4522 (-0.0031)
0.0422 819 30.6819 - - - - -
0.0442 858 29.4152 - - - - -
0.0462 897 28.4563 - - - - -
0.0482 936 27.38 - - - - -
0.0502 975 26.8008 27.3043 0.4982 (-0.0422) 0.3449 (+0.0198) 0.5200 (+0.0194) 0.4544 (-0.0010)
0.0522 1014 26.0336 - - - - -
0.0542 1053 25.3375 - - - - -
0.0562 1092 24.9918 - - - - -
0.0582 1131 24.1657 - - - - -
0.0603 1170 24.4028 24.2684 0.5504 (+0.0100) 0.3672 (+0.0422) 0.5478 (+0.0471) 0.4885 (+0.0331)
0.0623 1209 23.3062 - - - - -
0.0643 1248 22.7702 - - - - -
0.0663 1287 22.4086 - - - - -
0.0683 1326 21.9251 - - - - -
0.0703 1365 21.9175 21.9635 0.5852 (+0.0448) 0.3549 (+0.0299) 0.5656 (+0.0649) 0.5019 (+0.0465)
0.0723 1404 20.9865 - - - - -
0.0743 1443 20.8581 - - - - -
0.0763 1482 20.584 - - - - -
0.0783 1521 19.8788 - - - - -
0.0803 1560 19.6365 19.9889 0.5988 (+0.0583) 0.3533 (+0.0283) 0.5369 (+0.0362) 0.4963 (+0.0409)
0.0823 1599 19.2009 - - - - -
0.0844 1638 18.9352 - - - - -
0.0864 1677 18.7044 - - - - -
0.0884 1716 18.3696 - - - - -
0.0904 1755 18.223 18.6559 0.6300 (+0.0896) 0.3561 (+0.0311) 0.5659 (+0.0652) 0.5173 (+0.0620)
0.0924 1794 18.1149 - - - - -
0.0944 1833 17.5771 - - - - -
0.0964 1872 17.2442 - - - - -
0.0984 1911 16.8317 - - - - -
0.1004 1950 16.8462 17.0786 0.6294 (+0.0890) 0.3552 (+0.0302) 0.5643 (+0.0637) 0.5163 (+0.0609)
0.1024 1989 16.5298 - - - - -
0.1044 2028 16.4289 - - - - -
0.1064 2067 16.0292 - - - - -
0.1085 2106 15.8428 - - - - -
0.1105 2145 15.7437 16.0163 0.6269 (+0.0865) 0.3538 (+0.0287) 0.5752 (+0.0746) 0.5186 (+0.0633)
0.1125 2184 15.7076 - - - - -
0.1145 2223 15.3865 - - - - -
0.1165 2262 15.174 - - - - -
0.1185 2301 15.0104 - - - - -
0.1205 2340 14.8128 14.9314 0.6112 (+0.0708) 0.3625 (+0.0375) 0.5936 (+0.0929) 0.5225 (+0.0671)
0.1225 2379 14.407 - - - - -
0.1245 2418 14.3549 - - - - -
0.1265 2457 14.3104 - - - - -
0.1285 2496 14.0911 - - - - -
0.1305 2535 13.8692 14.5249 0.6153 (+0.0749) 0.3644 (+0.0393) 0.5976 (+0.0969) 0.5258 (+0.0704)
0.1326 2574 13.9886 - - - - -
0.1346 2613 13.5694 - - - - -
0.1366 2652 13.6723 - - - - -
0.1386 2691 13.5047 - - - - -
0.1406 2730 13.3759 13.6388 0.6318 (+0.0914) 0.3578 (+0.0327) 0.5819 (+0.0812) 0.5238 (+0.0685)
0.1426 2769 13.042 - - - - -
0.1446 2808 13.2039 - - - - -
0.1466 2847 12.9865 - - - - -
0.1486 2886 13.0293 - - - - -
0.1506 2925 12.7206 12.6907 0.6392 (+0.0988) 0.3649 (+0.0399) 0.5919 (+0.0912) 0.5320 (+0.0766)
0.1526 2964 12.5802 - - - - -
0.1546 3003 12.7734 - - - - -
0.1567 3042 12.4192 - - - - -
0.1587 3081 12.3157 - - - - -
0.1607 3120 12.384 12.5061 0.6382 (+0.0978) 0.3784 (+0.0533) 0.5907 (+0.0901) 0.5358 (+0.0804)
0.1627 3159 12.0562 - - - - -
0.1647 3198 12.1539 - - - - -
0.1667 3237 12.0754 - - - - -
0.1687 3276 11.8814 - - - - -
0.1707 3315 11.7829 11.9039 0.6371 (+0.0967) 0.3818 (+0.0568) 0.5998 (+0.0992) 0.5396 (+0.0842)
0.1727 3354 11.8772 - - - - -
0.1747 3393 11.8422 - - - - -
0.1767 3432 11.6353 - - - - -
0.1787 3471 11.584 - - - - -
0.1808 3510 11.443 11.8674 0.6317 (+0.0912) 0.3828 (+0.0578) 0.6080 (+0.1073) 0.5408 (+0.0854)
0.1828 3549 11.475 - - - - -
0.1848 3588 11.1514 - - - - -
0.1868 3627 11.3053 - - - - -
0.1888 3666 11.0035 - - - - -
0.1908 3705 11.1834 11.4955 0.6333 (+0.0928) 0.3642 (+0.0392) 0.6094 (+0.1087) 0.5356 (+0.0803)
0.1928 3744 10.8977 - - - - -
0.1948 3783 10.9302 - - - - -
0.1968 3822 10.8946 - - - - -
0.1988 3861 10.821 - - - - -
0.2008 3900 10.8044 11.0295 0.6262 (+0.0858) 0.3784 (+0.0533) 0.6098 (+0.1091) 0.5381 (+0.0828)
0.2028 3939 10.6379 - - - - -
0.2049 3978 10.6515 - - - - -
0.2069 4017 10.4407 - - - - -
0.2089 4056 10.4732 - - - - -
0.2109 4095 10.3832 10.6019 0.6286 (+0.0882) 0.3631 (+0.0381) 0.6167 (+0.1161) 0.5362 (+0.0808)
0.2129 4134 10.4307 - - - - -
0.2149 4173 10.3511 - - - - -
0.2169 4212 10.2797 - - - - -
0.2189 4251 10.2157 - - - - -
0.2209 4290 10.2122 10.3425 0.6281 (+0.0877) 0.3868 (+0.0617) 0.5901 (+0.0895) 0.5350 (+0.0796)
0.2229 4329 10.185 - - - - -
0.2249 4368 10.163 - - - - -
0.2269 4407 10.1477 - - - - -
0.2290 4446 9.9438 - - - - -
0.2310 4485 9.9041 10.1282 0.6452 (+0.1048) 0.3772 (+0.0522) 0.6136 (+0.1130) 0.5454 (+0.0900)
0.2330 4524 9.8034 - - - - -
0.2350 4563 9.7994 - - - - -
0.2370 4602 9.9711 - - - - -
0.2390 4641 9.7652 - - - - -
0.2410 4680 9.6757 9.7409 0.6364 (+0.0960) 0.3800 (+0.0550) 0.6095 (+0.1088) 0.5420 (+0.0866)
0.2430 4719 9.558 - - - - -
0.2450 4758 9.6791 - - - - -
0.2470 4797 9.5759 - - - - -
0.2490 4836 9.4958 - - - - -
0.2510 4875 9.487 9.5897 0.6479 (+0.1075) 0.3734 (+0.0484) 0.6261 (+0.1255) 0.5492 (+0.0938)
0.2531 4914 9.4796 - - - - -
0.2551 4953 9.3878 - - - - -
0.2571 4992 9.3888 - - - - -
0.2591 5031 9.2657 - - - - -
0.2611 5070 9.1936 9.2371 0.6296 (+0.0892) 0.3857 (+0.0607) 0.6220 (+0.1213) 0.5458 (+0.0904)
0.2631 5109 9.2403 - - - - -
0.2651 5148 9.1963 - - - - -
0.2671 5187 9.066 - - - - -
0.2691 5226 9.1684 - - - - -
0.2711 5265 9.1036 9.1932 0.6188 (+0.0784) 0.3803 (+0.0552) 0.6280 (+0.1274) 0.5424 (+0.0870)
0.2731 5304 9.0206 - - - - -
0.2751 5343 9.2051 - - - - -
0.2772 5382 9.1335 - - - - -
0.2792 5421 8.9798 - - - - -
0.2812 5460 8.9199 9.0195 0.6210 (+0.0805) 0.3939 (+0.0688) 0.6178 (+0.1172) 0.5442 (+0.0888)
0.2832 5499 8.8599 - - - - -
0.2852 5538 8.7977 - - - - -
0.2872 5577 8.7378 - - - - -
0.2892 5616 8.7338 - - - - -
0.2912 5655 8.6566 8.8711 0.6308 (+0.0904) 0.3885 (+0.0635) 0.6194 (+0.1188) 0.5463 (+0.0909)
0.2932 5694 8.8228 - - - - -
0.2952 5733 8.6837 - - - - -
0.2972 5772 8.6967 - - - - -
0.2992 5811 8.6721 - - - - -
0.3013 5850 8.5856 8.4996 0.6424 (+0.1020) 0.3820 (+0.0569) 0.6268 (+0.1261) 0.5504 (+0.0950)
0.3033 5889 8.5552 - - - - -
0.3053 5928 8.4364 - - - - -
0.3073 5967 8.4726 - - - - -
0.3093 6006 8.6169 - - - - -
0.3113 6045 8.4303 8.4561 0.6142 (+0.0738) 0.3720 (+0.0470) 0.6226 (+0.1220) 0.5363 (+0.0809)
0.3133 6084 8.5336 - - - - -
0.3153 6123 8.3723 - - - - -
0.3173 6162 8.358 - - - - -
0.3193 6201 8.3435 - - - - -
0.3213 6240 8.2638 8.3670 0.6346 (+0.0941) 0.3838 (+0.0587) 0.6229 (+0.1222) 0.5471 (+0.0917)
0.3233 6279 8.2994 - - - - -
0.3254 6318 8.1733 - - - - -
0.3274 6357 8.2564 - - - - -
0.3294 6396 8.212 - - - - -
0.3314 6435 8.1111 8.3150 0.6385 (+0.0981) 0.3828 (+0.0578) 0.6353 (+0.1346) 0.5522 (+0.0968)
0.3334 6474 8.1655 - - - - -
0.3354 6513 8.0391 - - - - -
0.3374 6552 8.0784 - - - - -
0.3394 6591 8.0369 - - - - -
0.3414 6630 8.012 8.0672 0.6191 (+0.0787) 0.3899 (+0.0648) 0.6482 (+0.1475) 0.5524 (+0.0970)
0.3434 6669 7.9809 - - - - -
0.3454 6708 8.0001 - - - - -
0.3474 6747 8.009 - - - - -
0.3495 6786 7.9692 - - - - -
0.3515 6825 7.8565 7.9072 0.6220 (+0.0815) 0.3866 (+0.0616) 0.6154 (+0.1147) 0.5413 (+0.0859)
0.3535 6864 7.9108 - - - - -
0.3555 6903 7.7341 - - - - -
0.3575 6942 7.8442 - - - - -
0.3595 6981 7.912 - - - - -
0.3615 7020 7.7133 8.0526 0.6389 (+0.0985) 0.3991 (+0.0740) 0.6408 (+0.1402) 0.5596 (+0.1042)
0.3635 7059 7.8985 - - - - -
0.3655 7098 7.6834 - - - - -
0.3675 7137 7.7494 - - - - -
0.3695 7176 7.7146 - - - - -
0.3715 7215 7.6553 7.8366 0.6338 (+0.0934) 0.3992 (+0.0742) 0.6253 (+0.1247) 0.5528 (+0.0974)
0.3736 7254 7.612 - - - - -
0.3756 7293 7.6707 - - - - -
0.3776 7332 7.716 - - - - -
0.3796 7371 7.7436 - - - - -
0.3816 7410 7.6003 7.6881 0.6307 (+0.0902) 0.4029 (+0.0779) 0.6334 (+0.1327) 0.5556 (+0.1003)
0.3836 7449 7.5153 - - - - -
0.3856 7488 7.5351 - - - - -
0.3876 7527 7.5687 - - - - -
0.3896 7566 7.5155 - - - - -
0.3916 7605 7.4451 7.6049 0.6321 (+0.0916) 0.3914 (+0.0664) 0.6358 (+0.1352) 0.5531 (+0.0977)
0.3936 7644 7.5113 - - - - -
0.3956 7683 7.5135 - - - - -
0.3977 7722 7.4258 - - - - -
0.3997 7761 7.458 - - - - -
0.4017 7800 7.3602 7.4536 0.6472 (+0.1067) 0.4034 (+0.0783) 0.6372 (+0.1365) 0.5626 (+0.1072)
0.4037 7839 7.4779 - - - - -
0.4057 7878 7.4154 - - - - -
0.4077 7917 7.2897 - - - - -
0.4097 7956 7.3614 - - - - -
0.4117 7995 7.2537 7.4876 0.6298 (+0.0893) 0.3999 (+0.0749) 0.6318 (+0.1312) 0.5538 (+0.0985)
0.4137 8034 7.2474 - - - - -
0.4157 8073 7.2166 - - - - -
0.4177 8112 7.2344 - - - - -
0.4197 8151 7.2647 - - - - -
0.4218 8190 7.2618 7.4345 0.6347 (+0.0943) 0.4057 (+0.0806) 0.6289 (+0.1282) 0.5564 (+0.1010)
0.4238 8229 7.2227 - - - - -
0.4258 8268 7.2384 - - - - -
0.4278 8307 7.2133 - - - - -
0.4298 8346 7.1558 - - - - -
0.4318 8385 7.0712 7.2899 0.6523 (+0.1119) 0.4027 (+0.0777) 0.6341 (+0.1335) 0.5631 (+0.1077)
0.4338 8424 7.1063 - - - - -
0.4358 8463 7.0908 - - - - -
0.4378 8502 7.1122 - - - - -
0.4398 8541 7.1549 - - - - -
0.4418 8580 7.0516 7.2487 0.6553 (+0.1148) 0.3955 (+0.0705) 0.6399 (+0.1392) 0.5635 (+0.1082)
0.4438 8619 7.0792 - - - - -
0.4459 8658 7.0351 - - - - -
0.4479 8697 7.0315 - - - - -
0.4499 8736 6.9912 - - - - -
0.4519 8775 7.0102 7.0829 0.6569 (+0.1165) 0.3949 (+0.0698) 0.6534 (+0.1527) 0.5684 (+0.1130)
0.4539 8814 6.9478 - - - - -
0.4559 8853 6.9741 - - - - -
0.4579 8892 7.0344 - - - - -
0.4599 8931 6.907 - - - - -
0.4619 8970 6.9089 7.0948 0.6369 (+0.0964) 0.3976 (+0.0725) 0.6367 (+0.1360) 0.5570 (+0.1017)
0.4639 9009 6.9295 - - - - -
0.4659 9048 6.863 - - - - -
0.4679 9087 6.9167 - - - - -
0.4700 9126 6.9123 - - - - -
0.4720 9165 6.8659 6.9749 0.6478 (+0.1073) 0.3977 (+0.0727) 0.6420 (+0.1414) 0.5625 (+0.1071)
0.4740 9204 6.8238 - - - - -
0.4760 9243 6.7847 - - - - -
0.4780 9282 6.7687 - - - - -
0.4800 9321 6.8748 - - - - -
0.4820 9360 6.7392 6.9672 0.6469 (+0.1064) 0.4037 (+0.0787) 0.6375 (+0.1369) 0.5627 (+0.1073)
0.4840 9399 6.6911 - - - - -
0.4860 9438 6.6688 - - - - -
0.4880 9477 6.7891 - - - - -
0.4900 9516 6.7704 - - - - -
0.4920 9555 6.7022 6.8938 0.6519 (+0.1115) 0.4040 (+0.0789) 0.6484 (+0.1477) 0.5681 (+0.1127)
0.4941 9594 6.7143 - - - - -
0.4961 9633 6.6628 - - - - -
0.4981 9672 6.7368 - - - - -
0.5001 9711 6.5205 - - - - -
0.5021 9750 6.6955 6.8142 0.6485 (+0.1081) 0.3968 (+0.0718) 0.6302 (+0.1296) 0.5585 (+0.1032)
0.5041 9789 6.6244 - - - - -
0.5061 9828 6.6949 - - - - -
0.5081 9867 6.5489 - - - - -
0.5101 9906 6.6067 - - - - -
0.5121 9945 6.5962 6.7924 0.6574 (+0.1170) 0.3910 (+0.0660) 0.6459 (+0.1452) 0.5648 (+0.1094)
0.5141 9984 6.5248 - - - - -
0.5161 10023 6.5204 - - - - -
0.5182 10062 6.5291 - - - - -
0.5202 10101 6.5512 - - - - -
0.5222 10140 6.4731 6.7636 0.6638 (+0.1234) 0.3973 (+0.0722) 0.6433 (+0.1426) 0.5681 (+0.1128)
0.5242 10179 6.5327 - - - - -
0.5262 10218 6.5019 - - - - -
0.5282 10257 6.5113 - - - - -
0.5302 10296 6.5181 - - - - -
0.5322 10335 6.4987 6.6856 0.6656 (+0.1252) 0.3946 (+0.0695) 0.6501 (+0.1494) 0.5701 (+0.1147)
0.5342 10374 6.4575 - - - - -
0.5362 10413 6.4239 - - - - -
0.5382 10452 6.4181 - - - - -
0.5402 10491 6.3535 - - - - -
0.5423 10530 6.4066 6.6634 0.6482 (+0.1078) 0.3993 (+0.0743) 0.6449 (+0.1442) 0.5641 (+0.1088)
0.5443 10569 6.4005 - - - - -
0.5463 10608 6.4521 - - - - -
0.5483 10647 6.4178 - - - - -
0.5503 10686 6.3495 - - - - -
0.5523 10725 6.3246 6.6161 0.6434 (+0.1030) 0.3915 (+0.0664) 0.6426 (+0.1420) 0.5592 (+0.1038)
0.5543 10764 6.4175 - - - - -
0.5563 10803 6.3035 - - - - -
0.5583 10842 6.2432 - - - - -
0.5603 10881 6.3024 - - - - -
0.5623 10920 6.4134 6.4813 0.6441 (+0.1037) 0.3936 (+0.0685) 0.6392 (+0.1386) 0.5590 (+0.1036)
0.5643 10959 6.2943 - - - - -
0.5664 10998 6.2934 - - - - -
0.5684 11037 6.3379 - - - - -
0.5704 11076 6.2481 - - - - -
0.5724 11115 6.256 6.4519 0.6371 (+0.0967) 0.3985 (+0.0735) 0.6343 (+0.1336) 0.5566 (+0.1013)
0.5744 11154 6.2639 - - - - -
0.5764 11193 6.2727 - - - - -
0.5784 11232 6.2347 - - - - -
0.5804 11271 6.3073 - - - - -
0.5824 11310 6.2281 6.3956 0.6336 (+0.0932) 0.3973 (+0.0723) 0.6454 (+0.1448) 0.5588 (+0.1034)
0.5844 11349 6.0973 - - - - -
0.5864 11388 6.282 - - - - -
0.5884 11427 6.1308 - - - - -
0.5905 11466 6.0991 - - - - -
0.5925 11505 6.2648 6.3232 0.6362 (+0.0958) 0.3943 (+0.0693) 0.6527 (+0.1520) 0.5611 (+0.1057)
0.5945 11544 6.1303 - - - - -
0.5965 11583 6.2142 - - - - -
0.5985 11622 6.115 - - - - -
0.6005 11661 6.1109 - - - - -
0.6025 11700 6.2147 6.3124 0.6190 (+0.0785) 0.3905 (+0.0654) 0.6544 (+0.1538) 0.5546 (+0.0993)
0.6045 11739 6.1101 - - - - -
0.6065 11778 6.1246 - - - - -
0.6085 11817 6.0777 - - - - -
0.6105 11856 6.1565 - - - - -
0.6125 11895 5.9654 6.2800 0.6371 (+0.0967) 0.3990 (+0.0740) 0.6534 (+0.1528) 0.5632 (+0.1078)
0.6146 11934 6.0115 - - - - -
0.6166 11973 6.0402 - - - - -
0.6186 12012 6.1312 - - - - -
0.6206 12051 6.0977 - - - - -
0.6226 12090 6.1147 6.2629 0.6438 (+0.1033) 0.3997 (+0.0746) 0.6595 (+0.1589) 0.5676 (+0.1123)
0.6246 12129 6.0727 - - - - -
0.6266 12168 6.0468 - - - - -
0.6286 12207 6.022 - - - - -
0.6306 12246 5.9995 - - - - -
0.6326 12285 6.0553 6.2714 0.6511 (+0.1107) 0.3965 (+0.0714) 0.6490 (+0.1483) 0.5655 (+0.1102)
0.6346 12324 6.0713 - - - - -
0.6366 12363 5.9443 - - - - -
0.6387 12402 6.0045 - - - - -
0.6407 12441 5.9835 - - - - -
0.6427 12480 5.9936 6.1741 0.6330 (+0.0926) 0.3993 (+0.0743) 0.6452 (+0.1446) 0.5592 (+0.1038)
0.6447 12519 5.9637 - - - - -
0.6467 12558 5.9407 - - - - -
0.6487 12597 5.8556 - - - - -
0.6507 12636 6.0084 - - - - -
0.6527 12675 6.0038 6.1379 0.6555 (+0.1150) 0.3981 (+0.0731) 0.6501 (+0.1495) 0.5679 (+0.1125)
0.6547 12714 5.8648 - - - - -
0.6567 12753 5.9154 - - - - -
0.6587 12792 5.9591 - - - - -
0.6607 12831 5.9369 - - - - -
0.6628 12870 5.8238 6.1443 0.6430 (+0.1026) 0.3945 (+0.0694) 0.6621 (+0.1614) 0.5665 (+0.1111)
0.6648 12909 5.8622 - - - - -
0.6668 12948 5.9296 - - - - -
0.6688 12987 5.8676 - - - - -
0.6708 13026 5.8811 - - - - -
0.6728 13065 5.9176 6.0863 0.6422 (+0.1018) 0.3931 (+0.0681) 0.6651 (+0.1644) 0.5668 (+0.1114)
0.6748 13104 5.8536 - - - - -
0.6768 13143 5.8865 - - - - -
0.6788 13182 5.7893 - - - - -
0.6808 13221 5.8791 - - - - -
0.6828 13260 5.8686 6.0588 0.6384 (+0.0979) 0.3980 (+0.0730) 0.6555 (+0.1548) 0.5640 (+0.1086)
0.6848 13299 5.8052 - - - - -
0.6869 13338 5.8452 - - - - -
0.6889 13377 5.8033 - - - - -
0.6909 13416 5.7734 - - - - -
0.6929 13455 5.7619 5.9937 0.6524 (+0.1120) 0.3965 (+0.0714) 0.6505 (+0.1498) 0.5664 (+0.1111)
0.6949 13494 5.7637 - - - - -
0.6969 13533 5.7513 - - - - -
0.6989 13572 5.8172 - - - - -
0.7009 13611 5.8323 - - - - -
0.7029 13650 5.76 5.9823 0.6451 (+0.1047) 0.4061 (+0.0810) 0.6428 (+0.1421) 0.5646 (+0.1093)
0.7049 13689 5.7541 - - - - -
0.7069 13728 5.7465 - - - - -
0.7089 13767 5.8207 - - - - -
0.7110 13806 5.7531 - - - - -
0.7130 13845 5.7436 5.9584 0.6417 (+0.1012) 0.4021 (+0.0771) 0.6469 (+0.1462) 0.5636 (+0.1082)
0.7150 13884 5.6921 - - - - -
0.7170 13923 5.6492 - - - - -
0.7190 13962 5.7005 - - - - -
0.7210 14001 5.7808 - - - - -
0.7230 14040 5.7058 5.9436 0.6403 (+0.0998) 0.3904 (+0.0654) 0.6602 (+0.1595) 0.5636 (+0.1083)
0.7250 14079 5.7288 - - - - -
0.7270 14118 5.6235 - - - - -
0.7290 14157 5.6438 - - - - -
0.7310 14196 5.6733 - - - - -
0.7330 14235 5.7588 5.9466 0.6583 (+0.1179) 0.3944 (+0.0693) 0.6479 (+0.1473) 0.5669 (+0.1115)
0.7351 14274 5.6172 - - - - -
0.7371 14313 5.649 - - - - -
0.7391 14352 5.7198 - - - - -
0.7411 14391 5.627 - - - - -
0.7431 14430 5.6958 5.8615 0.6463 (+0.1058) 0.3905 (+0.0654) 0.6619 (+0.1613) 0.5662 (+0.1109)
0.7451 14469 5.6216 - - - - -
0.7471 14508 5.5954 - - - - -
0.7491 14547 5.5794 - - - - -
0.7511 14586 5.6821 - - - - -
0.7531 14625 5.5987 5.8702 0.6436 (+0.1032) 0.3936 (+0.0685) 0.6563 (+0.1557) 0.5645 (+0.1091)
0.7551 14664 5.6134 - - - - -
0.7571 14703 5.6476 - - - - -
0.7592 14742 5.679 - - - - -
0.7612 14781 5.6292 - - - - -
0.7632 14820 5.6129 5.8149 0.6454 (+0.1049) 0.3949 (+0.0699) 0.6553 (+0.1546) 0.5652 (+0.1098)
0.7652 14859 5.5883 - - - - -
0.7672 14898 5.6431 - - - - -
0.7692 14937 5.5464 - - - - -
0.7712 14976 5.6273 - - - - -
0.7732 15015 5.6428 5.8067 0.6576 (+0.1172) 0.3996 (+0.0745) 0.6720 (+0.1714) 0.5764 (+0.1210)
0.7752 15054 5.4733 - - - - -
0.7772 15093 5.5618 - - - - -
0.7792 15132 5.5804 - - - - -
0.7812 15171 5.5858 - - - - -
0.7833 15210 5.4994 5.7438 0.6424 (+0.1019) 0.3973 (+0.0722) 0.6573 (+0.1566) 0.5656 (+0.1103)
0.7853 15249 5.5405 - - - - -
0.7873 15288 5.5032 - - - - -
0.7893 15327 5.5755 - - - - -
0.7913 15366 5.4816 - - - - -
0.7933 15405 5.4922 5.7130 0.6523 (+0.1118) 0.3851 (+0.0601) 0.6633 (+0.1626) 0.5669 (+0.1115)
0.7953 15444 5.5049 - - - - -
0.7973 15483 5.5061 - - - - -
0.7993 15522 5.5243 - - - - -
0.8013 15561 5.4995 - - - - -
0.8033 15600 5.5222 5.7124 0.6440 (+0.1035) 0.3981 (+0.0731) 0.6695 (+0.1689) 0.5705 (+0.1152)
0.8053 15639 5.5646 - - - - -
0.8074 15678 5.5963 - - - - -
0.8094 15717 5.5167 - - - - -
0.8114 15756 5.5645 - - - - -
0.8134 15795 5.4805 5.7093 0.6571 (+0.1167) 0.3948 (+0.0697) 0.6566 (+0.1560) 0.5695 (+0.1141)
0.8154 15834 5.5332 - - - - -
0.8174 15873 5.4952 - - - - -
0.8194 15912 5.5312 - - - - -
0.8214 15951 5.5023 - - - - -
0.8234 15990 5.3999 5.6825 0.6641 (+0.1237) 0.3984 (+0.0734) 0.6575 (+0.1569) 0.5733 (+0.1180)
0.8254 16029 5.4961 - - - - -
0.8274 16068 5.5271 - - - - -
0.8294 16107 5.4806 - - - - -
0.8315 16146 5.4955 - - - - -
0.8335 16185 5.4969 5.6419 0.6576 (+0.1172) 0.4014 (+0.0763) 0.6578 (+0.1572) 0.5723 (+0.1169)
0.8355 16224 5.4633 - - - - -
0.8375 16263 5.4822 - - - - -
0.8395 16302 5.4476 - - - - -
0.8415 16341 5.4727 - - - - -
0.8435 16380 5.4001 5.5943 0.6513 (+0.1109) 0.4057 (+0.0806) 0.6477 (+0.1471) 0.5682 (+0.1129)
0.8455 16419 5.4684 - - - - -
0.8475 16458 5.4463 - - - - -
0.8495 16497 5.4672 - - - - -
0.8515 16536 5.4415 - - - - -
0.8535 16575 5.4633 5.6234 0.6645 (+0.1241) 0.3943 (+0.0693) 0.6620 (+0.1614) 0.5736 (+0.1183)
0.8556 16614 5.3845 - - - - -
0.8576 16653 5.4805 - - - - -
0.8596 16692 5.4391 - - - - -
0.8616 16731 5.382 - - - - -
0.8636 16770 5.4113 5.5914 0.6645 (+0.1241) 0.3938 (+0.0688) 0.6598 (+0.1591) 0.5727 (+0.1173)
0.8656 16809 5.4142 - - - - -
0.8676 16848 5.3805 - - - - -
0.8696 16887 5.4068 - - - - -
0.8716 16926 5.392 - - - - -
0.8736 16965 5.4121 5.5925 0.6592 (+0.1188) 0.4029 (+0.0779) 0.6473 (+0.1466) 0.5698 (+0.1144)
0.8756 17004 5.3969 - - - - -
0.8776 17043 5.4349 - - - - -
0.8797 17082 5.3231 - - - - -
0.8817 17121 5.3217 - - - - -
0.8837 17160 5.3942 5.5690 0.6619 (+0.1215) 0.3941 (+0.0690) 0.6603 (+0.1597) 0.5721 (+0.1167)
0.8857 17199 5.3824 - - - - -
0.8877 17238 5.3817 - - - - -
0.8897 17277 5.3159 - - - - -
0.8917 17316 5.3866 - - - - -
0.8937 17355 5.3396 5.5496 0.6645 (+0.1241) 0.3996 (+0.0745) 0.6525 (+0.1518) 0.5722 (+0.1168)
0.8957 17394 5.4085 - - - - -
0.8977 17433 5.3788 - - - - -
0.8997 17472 5.3739 - - - - -
0.9017 17511 5.3322 - - - - -
0.9038 17550 5.3472 5.5212 0.6513 (+0.1109) 0.4041 (+0.0791) 0.6607 (+0.1601) 0.5721 (+0.1167)
0.9058 17589 5.3451 - - - - -
0.9078 17628 5.3297 - - - - -
0.9098 17667 5.3158 - - - - -
0.9118 17706 5.363 - - - - -
0.9138 17745 5.3346 5.5014 0.6650 (+0.1245) 0.3989 (+0.0739) 0.6602 (+0.1595) 0.5747 (+0.1193)
0.9158 17784 5.3702 - - - - -
0.9178 17823 5.4226 - - - - -
0.9198 17862 5.3099 - - - - -
0.9218 17901 5.3468 - - - - -
0.9238 17940 5.3903 5.5093 0.6523 (+0.1118) 0.4000 (+0.0750) 0.6624 (+0.1618) 0.5716 (+0.1162)
0.9258 17979 5.3288 - - - - -
0.9279 18018 5.3464 - - - - -
0.9299 18057 5.2696 - - - - -
0.9319 18096 5.3532 - - - - -
0.9339 18135 5.3093 5.4815 0.6518 (+0.1114) 0.4021 (+0.0771) 0.6689 (+0.1683) 0.5743 (+0.1189)
0.9359 18174 5.3378 - - - - -
0.9379 18213 5.3576 - - - - -
0.9399 18252 5.3102 - - - - -
0.9419 18291 5.3322 - - - - -
0.9439 18330 5.3015 5.4963 0.6645 (+0.1241) 0.3980 (+0.0729) 0.6573 (+0.1566) 0.5732 (+0.1179)
0.9459 18369 5.2332 - - - - -
0.9479 18408 5.2934 - - - - -
0.9499 18447 5.3187 - - - - -
0.9520 18486 5.2961 - - - - -
0.9540 18525 5.2884 5.4628 0.6654 (+0.1250) 0.3982 (+0.0731) 0.6628 (+0.1621) 0.5754 (+0.1201)
0.9560 18564 5.277 - - - - -
0.9580 18603 5.3364 - - - - -
0.9600 18642 5.2839 - - - - -
0.9620 18681 5.2503 - - - - -
0.9640 18720 5.2547 5.4463 0.6596 (+0.1192) 0.3981 (+0.0730) 0.6642 (+0.1635) 0.5740 (+0.1186)
0.9660 18759 5.2803 - - - - -
0.9680 18798 5.3167 - - - - -
0.9700 18837 5.2781 - - - - -
0.9720 18876 5.2809 - - - - -
0.9740 18915 5.2917 5.4579 0.6596 (+0.1192) 0.4000 (+0.0749) 0.6689 (+0.1683) 0.5762 (+0.1208)
0.9761 18954 5.2893 - - - - -
0.9781 18993 5.3301 - - - - -
0.9801 19032 5.2753 - - - - -
0.9821 19071 5.2079 - - - - -
0.9841 19110 5.2672 5.4520 0.6592 (+0.1188) 0.3989 (+0.0738) 0.6689 (+0.1683) 0.5757 (+0.1203)
0.9861 19149 5.268 - - - - -
0.9881 19188 5.224 - - - - -
0.9901 19227 5.3047 - - - - -
0.9921 19266 5.2771 - - - - -
0.9941 19305 5.233 5.4545 0.6592 (+0.1188) 0.4003 (+0.0752) 0.6689 (+0.1683) 0.5761 (+0.1208)
0.9961 19344 5.2459 - - - - -
0.9981 19383 5.3069 - - - - -
-1 -1 - - 0.6576 (+0.1172) 0.3996 (+0.0745) 0.6720 (+0.1714) 0.5764 (+0.1210)
  • The bold row denotes the saved checkpoint.
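The ndcg@10 columns above come from reranking evaluation: the model scores each query–passage pair, passages are sorted by descending score, and NDCG@10 compares that ordering against the relevance labels. A minimal self-contained sketch of the metric itself (the relevance labels below are illustrative, not taken from the actual evaluation):

```python
import math

def ndcg_at_10(relevances):
    """NDCG@10 for relevance labels listed in model-ranked order.

    DCG = sum(rel_i / log2(i + 2)) over the top 10; NDCG divides by the
    DCG of the ideal (descending-relevance) ordering.
    """
    def dcg(rels):
        return sum(rel / math.log2(i + 2) for i, rel in enumerate(rels[:10]))

    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0

# Hypothetical ranking: relevant passages at positions 1 and 3.
print(round(ndcg_at_10([1, 0, 1, 0, 0, 0, 0, 0, 0, 0]), 4))  # → 0.9197
```

In the table, each metric cell also reports the improvement over the unranked baseline in parentheses.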

Environmental Impact

Carbon emissions were measured using CodeCarbon.

  • Energy Consumed: 4.348 kWh
  • Carbon Emitted: 1.605 kg of CO2
  • Hours Used: 1.576 hours
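CodeCarbon estimates emissions as energy consumed multiplied by the carbon intensity of the local power grid. From the figures above, the implied intensity is roughly 1.605 / 4.348 ≈ 0.37 kg CO2 per kWh (derived here, not reported by CodeCarbon directly):

```python
energy_kwh = 4.348   # Energy Consumed (from the card)
emitted_kg = 1.605   # Carbon Emitted (from the card)

# Implied grid carbon intensity in kg CO2 per kWh.
intensity = emitted_kg / energy_kwh
print(round(intensity, 3))  # → 0.369
```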

Training Hardware

  • On Cloud: No
  • GPU Model: 8 x NVIDIA H100 80GB HBM3
  • CPU Model: AMD EPYC 7R13 Processor
  • RAM Size: 1999.99 GB

Framework Versions

  • Python: 3.10.14
  • Sentence Transformers: 5.1.2
  • Transformers: 4.57.1
  • PyTorch: 2.9.1+cu126
  • Accelerate: 1.12.0
  • Datasets: 4.4.1
  • Tokenizers: 0.22.1
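To approximately reproduce this environment, the versions above can be pinned at install time (a sketch; the exact CUDA-enabled PyTorch wheel, `2.9.1+cu126` here, depends on your platform and may need the PyTorch index URL):

```shell
pip install \
  "sentence-transformers==5.1.2" \
  "transformers==4.57.1" \
  "torch==2.9.1" \
  "accelerate==1.12.0" \
  "datasets==4.4.1" \
  "tokenizers==0.22.1"
```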

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MarginMSELoss

@misc{hofstätter2021improving,
    title={Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation},
    author={Sebastian Hofstätter and Sophia Althammer and Michael Schröder and Mete Sertkan and Allan Hanbury},
    year={2021},
    eprint={2010.02666},
    archivePrefix={arXiv},
    primaryClass={cs.IR}
}