• Crosscoder
    • model_A: google/gemma-2-2b-it
    • model_B: google/gemma-2-2b-jpn-it
    • d_sae: 2**15
    • hook-point: blocks.14.hook_resid_pre
    • Total training tokens (1B):
      • monology/pile-uncopyrighted: 0.5B tokens (tokenizer: model_A)
      • statmt/cc100 (ja): 0.5B tokens (tokenizer: model_B)
    • Training details: (training plot embedded as an image; not rendered here)
  • Implementation based on https://github.com/ckkissane/crosscoder-model-diff-replication/tree/main
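The configuration above can be made concrete with a minimal sketch of a crosscoder module: a single shared dictionary of `d_sae` latents encoding and reconstructing paired residual-stream activations from both models. This is an illustrative PyTorch sketch, not the repository's actual API; the class and parameter names are assumptions, and toy dimensions are used in place of the card's `d_model = 2304` and `d_sae = 2**15`.

```python
import torch

# Toy dimensions for illustration; the card's crosscoder uses
# d_model = 2304 (gemma-2-2b residual stream) and d_sae = 2**15.
n_models, d_model, d_sae = 2, 16, 64

class CrossCoder(torch.nn.Module):
    """Shared sparse dictionary over paired activations from two models (sketch)."""
    def __init__(self):
        super().__init__()
        # Per-model encoder/decoder weights, one shared latent space.
        self.W_enc = torch.nn.Parameter(torch.randn(n_models, d_model, d_sae) * 0.02)
        self.b_enc = torch.nn.Parameter(torch.zeros(d_sae))
        self.W_dec = torch.nn.Parameter(torch.randn(d_sae, n_models, d_model) * 0.02)
        self.b_dec = torch.nn.Parameter(torch.zeros(n_models, d_model))

    def forward(self, x):
        # x: (batch, n_models, d_model) -- activations from model_A and model_B
        # at blocks.14.hook_resid_pre, stacked along the model axis.
        acts = torch.relu(torch.einsum("bnd,ndh->bh", x, self.W_enc) + self.b_enc)
        recon = torch.einsum("bh,hnd->bnd", acts, self.W_dec) + self.b_dec
        return recon, acts

cc = CrossCoder()
x = torch.randn(4, n_models, d_model)
recon, acts = cc(x)
print(recon.shape, acts.shape)  # torch.Size([4, 2, 16]) torch.Size([4, 64])
```

Because the latents are shared while encoder/decoder weights are per-model, comparing a latent's decoder norms across the two model slices indicates whether a feature is shared between `model_A` and `model_B` or exclusive to one of them.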
Model tree for haggingfacehyz/crosscoder-gemma-2-2b-it-gemma-2-2b-jpn-it-L13
  • Base model: google/gemma-2-2b (fine-tuned into this model's two variants)