---
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:45
  - loss:DenoisingAutoEncoderLoss
base_model: nlpaueb/legal-bert-base-uncased
widget:
  - source_sentence: >-
      The not apply to the prove to part of public at time it without violation
      any non-disclosure in; or already Receiving before disclosure Party
      evidenced its written records revealed Receiving by a third without
      non-disclosure favour of the Party received the Party
    sentences:
      - >-
        The receiving party will segregate Confidential Information from the
        confidential materials of third parties to prevent commingling.
      - |-
        NON-DISCLOSURE AGREEMENT (NDA)

        1.
      - >-
        The non-disclosure undertaking under this Agreement shall not apply to
        information which the Receiving Party can prove to

        have been part of public knowledge at the time the Receiving Party
        received it or became public knowledge thereafter without violation of
        any non-disclosure undertaking in favour of the Disclosing Party; or

        have been already known to the Receiving Party before disclosure by the
        Disclosing Party as evidenced by its written records or has been
        revealed to the Receiving Party by a third party without violation of a
        non-disclosure undertaking in favour of [Name disclosing party]; or

        have been developed by the Receiving Party independently of the
        information received by the Disclosing Party.
  - source_sentence: Disclosing receives Confidential Information Receiving for,,, 6,, Party
    sentences:
      - >-
        THE PARTIES AGREE AS FOLLOWS:


        The non-disclosure undertaking of the Receiving Party covers all
        information provided by the Disclosing Party, or any third party on
        behalf of the Disclosing Party, to the Receiving Party (the Confidential
        Information), whether transferred on paper, verbally, electronically, or
        by any other means or on any other media.
      - >-
        In case the Disclosing Party receives any Confidential Information from
        the Receiving Party for the purposes described in Section 2, the
        sections 3, 4, 5, 6, 7, 8 and 9 shall apply also to the Disclosing
        Party.
      - >-
        The term “person” as used in this Agreement shall be interpreted to
        include, without limitation, any other corporation, company, group,
        partnership or individual.
  - source_sentence: >-
      return to disclosing party all forms such Information in control,
      including to drawings, documents, models or any material copies thereof,
      delete would be any but hardware
    sentences:
      - >-

        PREAMBLE

        The Disclosing Party will disclose to the Receiving Party certain
        information which is non-public at the time of disclosure and considered
        confidential by the Disclosing Party.
      - >-
        Upon request, the receiving party will return or deliver to the
        disclosing party all tangible forms of such Confidential Information in
        its possession or control, including but not limited to drawings,
        specifications, documents, records, devices, models or any other
        material and copies or reproductions thereof, respectively delete
        Confidential Information that would be contained on any intangible
        format such as but not exclusively hardware.
      - >-
        Governing Law and Jurisdiction

        This Agreement is governed by and shall be construed in accordance with
        the laws of Switzerland.
  - source_sentence:  Receiving the below  Receiving
    sentences:
      - >-
        The “Receiving Party” is specified by the information below (see
        “Receiving Party”).
      - The Parties submit to the exclusive jurisdiction of Zug.
      - >-
        The courts at the domicile of [Name of disclosing party] shall have
        exclusive jurisdiction.
  - source_sentence: >-
      that term Agreement it the solely the purposes Section 2 Confidential
      Information at its costs will appropriate confidentiality
    sentences:
      - >-
        The “Disclosing Party” in this Agreement refers to [Name Disclosing
        Party].
      - >-
        The Parties hereto undertake to replace such invalid or ineffective
        provision by an effective/valid provision.
      - >-
        The Receiving Party hereby agrees that for the term of this Agreement it
        shall use the Confidential Information solely for the purposes described
        in Section 2 and shall keep the Confidential Information strictly
        confidential, at all times and at its own costs and will take
        appropriate steps to protect the confidentiality thereof.
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---

SentenceTransformer based on nlpaueb/legal-bert-base-uncased

This is a sentence-transformers model finetuned from nlpaueb/legal-bert-base-uncased. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: nlpaueb/legal-bert-base-uncased
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
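
For illustration, the two modules above can be reproduced with plain transformers: the BertModel produces token embeddings, and the Pooling module averages them over the attention mask (mean pooling). A minimal sketch of that equivalence, assuming the model's default tokenizer settings:

import torch
from transformers import AutoModel, AutoTokenizer

# Load the underlying BertModel and tokenizer directly; the SentenceTransformer
# wrapper does this internally.
tokenizer = AutoTokenizer.from_pretrained("lucagafner/NDA_finetuned_V1")
bert = AutoModel.from_pretrained("lucagafner/NDA_finetuned_V1")

inputs = tokenizer(
    "The Parties submit to the exclusive jurisdiction of Zug.",
    return_tensors="pt", truncation=True, max_length=512,
)
with torch.no_grad():
    token_embeddings = bert(**inputs).last_hidden_state  # shape [1, seq_len, 768]

# Mean pooling: average the token embeddings, using the attention mask so
# padding tokens do not contribute.
mask = inputs["attention_mask"].unsqueeze(-1).float()
embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)
print(embedding.shape)  # torch.Size([1, 768])

Since the architecture lists only a Transformer and a Pooling module (no normalization layer), this mean-pooled vector corresponds to what model.encode(...) returns.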

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("lucagafner/NDA_finetuned_V1")

# Run inference (example sentences taken from the widget examples above)
sentences = [
    "The Parties submit to the exclusive jurisdiction of Zug.",
    "The courts at the domicile of [Name of disclosing party] shall have exclusive jurisdiction.",
    "The “Disclosing Party” in this Agreement refers to [Name Disclosing Party].",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]

Training Details

Training Dataset

Unnamed Dataset

  • Size: 45 training samples
  • Columns: damaged_sentence and original_sentence
  • Approximate statistics based on the first 45 samples:
    • damaged_sentence (string): min 3 tokens, mean 18.18 tokens, max 81 tokens
    • original_sentence (string): min 4 tokens, mean 41.53 tokens, max 183 tokens
  • Loss: DenoisingAutoEncoderLoss
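
The training script itself is not included in this card; the following is a hedged sketch of how a dataset with these two columns is typically paired with DenoisingAutoEncoderLoss in Sentence Transformers (the example pair is taken from the widget examples above):

from datasets import Dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import DenoisingAutoEncoderLoss

model = SentenceTransformer("nlpaueb/legal-bert-base-uncased")

# Columns must be ordered (noisy input, clean target); the loss trains the
# encoder so that a decoder tied to it can reconstruct the original sentence
# from the pooled sentence embedding.
train_dataset = Dataset.from_dict({
    "damaged_sentence": ["Receiving the below Receiving"],
    "original_sentence": [
        "The “Receiving Party” is specified by the information below (see “Receiving Party”)."
    ],
})

loss = DenoisingAutoEncoderLoss(model, tie_encoder_decoder=True)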

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • num_train_epochs: 5
  • fp16: True
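
A minimal sketch of how these non-default values map onto SentenceTransformerTrainingArguments, reusing model, train_dataset, and loss from the dataset sketch above (output_dir is a placeholder, not the author's path):

from sentence_transformers import SentenceTransformerTrainer, SentenceTransformerTrainingArguments

# `model`, `train_dataset`, and `loss` as defined in the dataset sketch above.
args = SentenceTransformerTrainingArguments(
    output_dir="nda-finetuned-v1",   # placeholder output path
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=5,
    fp16=True,                       # mixed precision; requires a CUDA GPU
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()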

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 5
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • tp_size: 0
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch  | Step | Training Loss
0.3333 |    1 | 3.3557
0.6667 |    2 | 4.3068
1.0    |    3 | 3.9160
1.3333 |    4 | 3.1851
1.6667 |    5 | 3.2759
2.0    |    6 | 3.1665
2.3333 |    7 | 3.1407
2.6667 |    8 | 2.6952
3.0    |    9 | 2.4053
3.3333 |   10 | 2.5579
3.6667 |   11 | 2.0525
4.0    |   12 | 2.2234
4.3333 |   13 | 1.8476
4.6667 |   14 | 2.0873
5.0    |   15 | 1.8472

Framework Versions

  • Python: 3.12.9
  • Sentence Transformers: 4.0.1
  • Transformers: 4.50.2
  • PyTorch: 2.6.0+cu124
  • Accelerate: 1.5.2
  • Datasets: 3.5.0
  • Tokenizers: 0.21.1
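
To reproduce this environment, the Python package versions above can be pinned at install time (a convenience command, not an official requirements file; the card's PyTorch is the cu124 build, shown here as plain torch==2.6.0):

pip install sentence-transformers==4.0.1 transformers==4.50.2 torch==2.6.0 accelerate==1.5.2 datasets==3.5.0 tokenizers==0.21.1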

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

DenoisingAutoEncoderLoss

@inproceedings{wang-2021-TSDAE,
    title = "TSDAE: Using Transformer-based Sequential Denoising Auto-Encoderfor Unsupervised Sentence Embedding Learning",
    author = "Wang, Kexin and Reimers, Nils and Gurevych, Iryna",
    booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2021",
    month = nov,
    year = "2021",
    address = "Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    pages = "671--688",
    url = "https://arxiv.org/abs/2104.06979",
}