Tags: Transformers · GGUF · English · reasoning · thinking · uncensored · gated · Mixture of Experts · 8x3B · Llama 3.2 MOE · 128k context · creative · creative writing · fiction writing · plot generation · sub-plot generation · story generation · scene continue · storytelling · fiction story · science fiction · romance · all genres · story · writing · vivid prosing · vivid writing · fiction · roleplaying · float32 · swearing · rp · horror · mergekit · llama-3 · llama-3.2 · conversational
cannot pull? #1
opened by tebal91901

What am I doing wrong?
```
ollama pull hf.co/mradermacher/Llama-3.2-8X3B-GATED-MOE-Reasoning-Dark-Champion-Instruct-uncensored-abliterated-18.4B-GGUF:Q8_K
Error: 400 Bad Request: invalid model name
```
I have no idea; you should probably ask in an ollama forum :) Maybe it's because of Q8_K?
There are no Q8_K quants in this repository. That quant type does not exist in llama.cpp. Specify Q6_K or Q8_0 instead.
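For anyone hitting the same error: a corrected pull command would look like the sketch below, swapping the nonexistent Q8_K tag for a quant type that llama.cpp actually produces (Q8_0 here; Q6_K works the same way). This assumes a current ollama install that supports pulling GGUF models directly from Hugging Face via the hf.co/ prefix.

```shell
# The :TAG suffix must name a quant file that exists in the GGUF repo.
# Q8_K is not a llama.cpp quant type, so ollama rejects it with
# "400 Bad Request: invalid model name". Q8_0 (or Q6_K) is valid:
ollama pull hf.co/mradermacher/Llama-3.2-8X3B-GATED-MOE-Reasoning-Dark-Champion-Instruct-uncensored-abliterated-18.4B-GGUF:Q8_0
```

If unsure which tags are available, the file list on the model page shows the exact quant suffixes (e.g. Q4_K_M, Q6_K, Q8_0) that can be used after the colon.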