Nomic Embed Collection: Open Source Long Context Text Embedders (8 items)
nomic-embed-text-v1-unsupervised is an 8192-context-length text encoder. It is the checkpoint produced after the contrastive pretraining stage of the multi-stage contrastive training pipeline that yields the final model. We release this checkpoint to open-source training artifacts from our Nomic Embed Text technical report.

If you want a model for extracting embeddings, we suggest using nomic-embed-text-v1 instead.
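Text encoders of this kind produce one vector per token; a single text embedding is then typically obtained by mean pooling over the non-padding tokens and normalizing. The sketch below illustrates that pooling step with toy NumPy tensors standing in for the transformer's last hidden state (the arrays and shapes are illustrative assumptions, not real model outputs):

```python
import numpy as np

# Toy stand-ins for model outputs: 1 sequence, 4 tokens, hidden size 3.
# In practice these would come from the encoder's last hidden state.
token_embeddings = np.array([[[1.0, 2.0, 3.0],
                              [3.0, 2.0, 1.0],
                              [0.0, 0.0, 0.0],   # padding token
                              [0.0, 0.0, 0.0]]]) # padding token
attention_mask = np.array([[1, 1, 0, 0]])        # 1 = real token, 0 = padding

# Masked mean pooling: average only over real (non-padding) tokens.
mask = attention_mask[..., None].astype(token_embeddings.dtype)
summed = (token_embeddings * mask).sum(axis=1)
counts = np.clip(mask.sum(axis=1), 1e-9, None)   # avoid division by zero
embedding = summed / counts                      # shape (1, 3)

# L2-normalize, as is typical when embeddings are compared by cosine similarity.
embedding = embedding / np.linalg.norm(embedding, axis=1, keepdims=True)
```

When loading the released model itself (e.g. via sentence-transformers), the model card's own usage instructions apply, including any required task prefixes; the snippet above only illustrates the pooling arithmetic.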