| model_id | organization | model_type | schema_version | analyzed_at | config | techniques | provenance | intervention_map | contribution_metadata | access_level | quality_score |
|---|---|---|---|---|---|---|---|---|---|---|---|
Qwen/Qwen3-8B | Qwen | qwen3 | 3 | 2026-03-29T15:32:25.900493+00:00 | {"hidden_size": 4096, "intermediate_size": 12288, "vocab_size": 151936, "max_position_embeddings": 40960, "num_hidden_layers": 36, "rms_norm_eps": 1e-06, "model_type": "qwen3"} | {"rope_type": null, "rope_scaling": {"rope_theta": 1000000, "rope_type": "default"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddings": 40960, "positional_encoding": "rope", "tie_word_embeddings": false, "norm_epsilon": 1... | {"organization": "Qwen", "first_publish_date": "2025-04-27", "paper_urls": ["https://arxiv.org/abs/2309.00071", "https://arxiv.org/abs/2505.09388"], "base_model": "Qwen/Qwen3-8B-Base", "license": "apache-2.0", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["given inputs.", "switch", ... | {"residual_stream_dim": 4096, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.861598", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
Qwen/Qwen2.5-7B | Qwen | qwen2 | 3 | 2026-03-29T15:32:29.752121+00:00 | {"hidden_size": 3584, "intermediate_size": 18944, "vocab_size": 152064, "max_position_embeddings": 131072, "num_hidden_layers": 28, "rms_norm_eps": 1e-06, "model_type": "qwen2"} | {"rope_type": null, "rope_scaling": {"rope_theta": 1000000.0, "rope_type": "default"}, "attention_bias": null, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddings": 131072, "positional_encoding": "rope", "tie_word_embeddings": false, "norm_epsilon":... | {"organization": "Qwen", "first_publish_date": "2024-09-15", "paper_urls": [], "base_model": null, "license": "apache-2.0", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["qwen 2.5", "qwen 2", "qwen"], "hub_metadata": {"downloads": 1024285, "likes": 266, "created_at": "2024-09-15T12:... | {"residual_stream_dim": 3584, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.861726", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
Qwen/Qwen3-30B-A3B | Qwen | qwen3_moe | 3 | 2026-03-29T15:32:33.576356+00:00 | {"hidden_size": 2048, "intermediate_size": 6144, "vocab_size": 151936, "max_position_embeddings": 40960, "num_hidden_layers": 48, "rms_norm_eps": 1e-06, "model_type": "qwen3_moe"} | {"rope_type": null, "rope_scaling": {"rope_theta": 1000000.0, "rope_type": "default"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddings": 40960, "positional_encoding": "rope", "tie_word_embeddings": false, "norm_epsilon":... | {"organization": "Qwen", "first_publish_date": "2025-04-27", "paper_urls": ["https://arxiv.org/abs/2309.00071", "https://arxiv.org/abs/2505.09388"], "base_model": "Qwen/Qwen3-30B-A3B-Base", "license": "apache-2.0", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["switch", "qwen 2.5", ... | {"residual_stream_dim": 2048, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.861856", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
Qwen/Qwen3-235B-A22B | Qwen | qwen3_moe | 3 | 2026-03-29T15:32:37.467930+00:00 | {"hidden_size": 4096, "intermediate_size": 12288, "vocab_size": 151936, "max_position_embeddings": 40960, "num_hidden_layers": 94, "rms_norm_eps": 1e-06, "model_type": "qwen3_moe"} | {"rope_type": null, "rope_scaling": {"rope_theta": 1000000.0, "rope_type": "default"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddings": 40960, "positional_encoding": "rope", "tie_word_embeddings": false, "norm_epsilon":... | {"organization": "Qwen", "first_publish_date": "2025-04-27", "paper_urls": ["https://arxiv.org/abs/2309.00071", "https://arxiv.org/abs/2505.09388"], "base_model": null, "license": "apache-2.0", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["llama", "qwen 3", "switch", "qwen 3t", "qw... | {"residual_stream_dim": 4096, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.862043", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
deepseek-ai/DeepSeek-R1-Distill-Qwen-7B | deepseek-ai | qwen2 | 3 | 2026-03-29T15:32:46.354569+00:00 | {"hidden_size": 3584, "intermediate_size": 18944, "vocab_size": 152064, "max_position_embeddings": 131072, "num_hidden_layers": 28, "rms_norm_eps": 1e-06, "model_type": "qwen2"} | {"rope_type": null, "rope_scaling": {"rope_theta": 10000, "rope_type": "default"}, "attention_bias": null, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddings": 131072, "positional_encoding": "rope", "tie_word_embeddings": false, "norm_epsilon": 1e-... | {"organization": "deepseek-ai", "first_publish_date": "2025-01-20", "paper_urls": ["https://arxiv.org/abs/2501.12948"], "base_model": null, "license": "mit", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["switch", "gpt 4", "llama 70b", "gpt", "qwen 7b", "claude", "gpt 4o", "llama 3"... | {"residual_stream_dim": 3584, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.862159", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
deepseek-ai/DeepSeek-R1-Distill-Qwen-32B | deepseek-ai | qwen2 | 3 | 2026-03-29T15:32:50.200935+00:00 | {"hidden_size": 5120, "intermediate_size": 27648, "vocab_size": 152064, "max_position_embeddings": 131072, "num_hidden_layers": 64, "rms_norm_eps": 1e-05, "model_type": "qwen2"} | {"rope_type": null, "rope_scaling": {"rope_theta": 1000000.0, "rope_type": "default"}, "attention_bias": null, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddings": 131072, "positional_encoding": "rope", "tie_word_embeddings": false, "norm_epsilon":... | {"organization": "deepseek-ai", "first_publish_date": "2025-01-20", "paper_urls": ["https://arxiv.org/abs/2501.12948"], "base_model": null, "license": "mit", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["qwen2.5 and llama3 series to the community.", "llama 70b", "gpt", "claude 3.5"... | {"residual_stream_dim": 5120, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.862302", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
deepseek-ai/DeepSeek-V2.5 | deepseek-ai | deepseek_v2 | 3 | 2026-03-29T15:32:56.833668+00:00 | {"hidden_size": 5120, "intermediate_size": 12288, "vocab_size": 102400, "max_position_embeddings": 163840, "num_hidden_layers": 60, "rope_theta": 10000, "rms_norm_eps": 1e-06, "model_type": "deepseek_v2"} | {"rope_type": null, "rope_scaling": {"beta_fast": 32, "beta_slow": 1, "factor": 40, "mscale": 1.0, "mscale_all_dim": 1.0, "original_max_position_embeddings": 4096, "type": "yarn", "rope_theta": 10000, "rope_type": "yarn"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "eager", "atten... | {"organization": "deepseek-ai", "first_publish_date": "2024-09-05", "paper_urls": ["https://arxiv.org/abs/2405.04434"], "base_model": null, "license": "other", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["your devices\nmax"], "hub_metadata": {"downloads": 5837, "likes": 733, "crea... | {"residual_stream_dim": 5120, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.mlp.", "mlp_gate_proj": "layers.0.mlp.gate_proj", "mlp_up_proj": "layers.0.mlp.up_proj", "mlp_down_proj": "layers.0.mlp.down_proj", "... | {"contributed_at": "2026-04-01T19:06:09.862432", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
deepseek-ai/DeepSeek-V3-Base | deepseek-ai | deepseek_v3 | 3 | 2026-03-29T15:33:04.667941+00:00 | {"hidden_size": 7168, "intermediate_size": 18432, "vocab_size": 129280, "max_position_embeddings": 163840, "num_hidden_layers": 61, "rope_theta": 10000, "rms_norm_eps": 1e-06, "model_type": "deepseek_v3"} | {"rope_type": null, "rope_scaling": {"beta_fast": 32, "beta_slow": 1, "factor": 40, "mscale": 1.0, "mscale_all_dim": 1.0, "original_max_position_embeddings": 4096, "type": "yarn", "rope_theta": 10000, "rope_type": "yarn"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "eager", "atten... | {"organization": "deepseek-ai", "first_publish_date": "2024-12-25", "paper_urls": ["https://arxiv.org/abs/2412.19437"], "base_model": null, "license": null, "architecture_tags": [], "model_card_lineage": ["llama", "gpt", "claude 3.5", "gpt 4o", "internlm", "qwen 2.5", "llama 3.1", "claude"], "hub_metadata": {"downloads... | {"residual_stream_dim": 7168, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.mlp.", "mlp_gate_proj": "layers.0.mlp.gate_proj", "mlp_up_proj": "layers.0.mlp.up_proj", "mlp_down_proj": "layers.0.mlp.down_proj", "... | {"contributed_at": "2026-04-01T19:06:09.862572", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
XiaomiMiMo/MiMo-7B-RL | XiaomiMiMo | mimo | 3 | 2026-03-29T15:33:14.094468+00:00 | {"hidden_size": 4096, "intermediate_size": 11008, "vocab_size": 151680, "max_position_embeddings": 32768, "num_hidden_layers": 36, "rms_norm_eps": 1e-05, "model_type": "mimo"} | {"rope_type": null, "rope_scaling": {"rope_theta": 640000, "rope_type": "default"}, "attention_bias": true, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddings": 32768, "positional_encoding": "rope", "tie_word_embeddings": false, "norm_epsilon": 1e-... | {"organization": "XiaomiMiMo", "first_publish_date": "2025-04-29", "paper_urls": ["https://arxiv.org/abs/2505.07608"], "base_model": null, "license": "mit", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["claude 3.5", "qwen", "gpt 4.1", "claude", "qwen 14b", "qwen 7b", "gpt", "the mo... | {"residual_stream_dim": 4096, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.862690", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
THUDM/glm-4-9b-chat-hf | THUDM | glm | 3 | 2026-03-29T15:33:18.543209+00:00 | {"hidden_size": 4096, "intermediate_size": 13696, "vocab_size": 151552, "max_position_embeddings": 131072, "num_hidden_layers": 40, "rms_norm_eps": 1.5625e-07, "model_type": "glm"} | {"rope_type": null, "rope_scaling": {"rope_theta": 10000.0, "partial_rotary_factor": 0.5, "rope_type": "default"}, "attention_bias": true, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddings": 131072, "positional_encoding": "rope", "tie_word_embeddi... | {"organization": "THUDM", "first_publish_date": "2024-10-23", "paper_urls": [], "base_model": "THUDM/glm-4-9b-chat", "license": "other", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["chatglm 3", "glm-4-9b.", "llama", "claude", "qwen", "gpt 4", "glm 2024c", "chatglm", "claude 3", "g... | {"residual_stream_dim": 4096, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.862794", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
THUDM/glm-4-9b-hf | THUDM | glm | 3 | 2026-03-29T15:33:22.734695+00:00 | {"hidden_size": 4096, "intermediate_size": 13696, "vocab_size": 151552, "max_position_embeddings": 8192, "num_hidden_layers": 40, "rms_norm_eps": 1.5625e-07, "model_type": "glm"} | {"rope_type": null, "rope_scaling": {"rope_theta": 10000.0, "partial_rotary_factor": 0.5, "rope_type": "default"}, "attention_bias": true, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddings": 8192, "positional_encoding": "rope", "tie_word_embedding... | {"organization": "THUDM", "first_publish_date": "2025-01-16", "paper_urls": [], "base_model": null, "license": "other", "architecture_tags": ["text-generation"], "model_card_lineage": ["gpt 4", "glm 4v", "gpt", "glm 2024c", "glm-4-9b.", "chatglm 3", "llama", "glm 4", "glm", "qwen", "claude 3", "glm 130b", "chatglm", "g... | {"residual_stream_dim": 4096, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.862891", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
zai-org/GLM-4-32B-0414 | zai-org | glm4 | 3 | 2026-03-29T15:33:26.619721+00:00 | {"hidden_size": 6144, "intermediate_size": 23040, "vocab_size": 151552, "max_position_embeddings": 32768, "num_hidden_layers": 61, "rms_norm_eps": 1e-05, "model_type": "glm4"} | {"rope_type": null, "rope_scaling": {"rope_theta": 10000.0, "partial_rotary_factor": 0.5, "rope_type": "default"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddings": 32768, "positional_encoding": "rope", "tie_word_embeddi... | {"organization": "zai-org", "first_publish_date": "2025-04-07", "paper_urls": [], "base_model": null, "license": "mit", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["glm", "switch", "gpt 4o", "qwen 2.5", "search results", "glm 4", "pairwise ranking feedback", "the list of available... | {"residual_stream_dim": 6144, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.863014", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
01-ai/Yi-1.5-9B-Chat | 01-ai | llama | 3 | 2026-03-29T15:33:30.543328+00:00 | {"hidden_size": 4096, "intermediate_size": 11008, "vocab_size": 64000, "max_position_embeddings": 4096, "num_hidden_layers": 48, "rms_norm_eps": 1e-06, "model_type": "llama"} | {"rope_type": null, "rope_scaling": {"rope_theta": 5000000.0, "rope_type": "default"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddings": 4096, "positional_encoding": "rope", "tie_word_embeddings": false, "norm_epsilon": ... | {"organization": "01-ai", "first_publish_date": "2024-05-10", "paper_urls": ["https://arxiv.org/abs/2403.04652"], "base_model": null, "license": "apache-2.0", "architecture_tags": ["llama", "text-generation", "conversational"], "model_card_lineage": ["yi 1.5", "yi 15", "yi"], "hub_metadata": {"downloads": 18917, "likes... | {"residual_stream_dim": 4096, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.863130", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
baichuan-inc/Baichuan2-13B-Chat | baichuan-inc | baichuan | 3 | 2026-03-29T15:33:34.877086+00:00 | {"hidden_size": 5120, "intermediate_size": 13696, "vocab_size": 125696, "num_hidden_layers": 40, "rms_norm_eps": 1e-06, "model_type": "baichuan"} | {"rope_type": null, "attention_bias": null, "sliding_window_size": null, "attention_implementation": "eager", "position_embedding_type": null, "max_position_embeddings": null, "tie_word_embeddings": false, "norm_epsilon": 1e-06, "activation_function": "silu", "residual_dropout": null, "attention_dropout": null, "actual... | {"organization": "baichuan-inc", "first_publish_date": "2023-08-29", "paper_urls": [], "base_model": null, "license": "other", "architecture_tags": ["text-generation"], "model_card_lineage": ["llama", "baichuan 13b", "glm 2", "llama 13b", "gpt 4", "chatglm 2", "baichuan 7b", "baichuan 2", "gpt", "baichuan", "baichuan 2... | {"residual_stream_dim": 5120, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.mlp.", "mlp_gate_proj": "layers.0.mlp.gate_proj", "mlp_up_proj": "layers.0.mlp.up_proj", "mlp_down_proj": "layers.0.mlp.down_proj", "... | {"contributed_at": "2026-04-01T19:06:09.863225", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
ByteDance-Seed/Seed-Coder-8B-Instruct | ByteDance-Seed | llama | 3 | 2026-03-29T15:33:38.656812+00:00 | {"hidden_size": 4096, "intermediate_size": 14336, "vocab_size": 155136, "max_position_embeddings": 32768, "num_hidden_layers": 32, "rms_norm_eps": 1e-06, "model_type": "llama"} | {"rope_type": null, "rope_scaling": {"rope_theta": 500000.0, "rope_type": "default"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddings": 32768, "positional_encoding": "rope", "tie_word_embeddings": false, "norm_epsilon": ... | {"organization": "ByteDance-Seed", "first_publish_date": "2025-04-27", "paper_urls": ["https://arxiv.org/abs/2506.03524"], "base_model": "ByteDance-Seed/Seed-Coder-8B-Base", "license": "mit", "architecture_tags": ["llama", "text-generation", "conversational"], "model_card_lineage": ["yi", "llama", "qwen 1.5", "codellam... | {"residual_stream_dim": 4096, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.863320", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
moonshotai/Kimi-K2-Instruct | moonshotai | kimi_k2 | 3 | 2026-03-29T15:57:11.098748+00:00 | {"hidden_size": 7168, "intermediate_size": 18432, "vocab_size": 163840, "max_position_embeddings": 131072, "num_hidden_layers": 61, "rope_theta": 50000.0, "rms_norm_eps": 1e-06, "model_type": "kimi_k2"} | {"rope_type": null, "rope_scaling": {"beta_fast": 1.0, "beta_slow": 1.0, "factor": 32.0, "mscale": 1.0, "mscale_all_dim": 1.0, "original_max_position_embeddings": 4096, "type": "yarn", "rope_theta": 50000.0, "rope_type": "yarn"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "eager",... | {"organization": "moonshotai", "first_publish_date": "2025-07-11", "paper_urls": [], "base_model": null, "license": "other", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["qwen 3", "llama 4", "claude", "gpt", "llama", "qwen 2.5", "gpt 4.1"], "hub_metadata": {"downloads": 113628, "li... | {"residual_stream_dim": 7168, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.mlp.", "mlp_gate_proj": "layers.0.mlp.gate_proj", "mlp_up_proj": "layers.0.mlp.up_proj", "mlp_down_proj": "layers.0.mlp.down_proj", "... | {"contributed_at": "2026-04-01T19:06:09.863446", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
MiniMaxAI/MiniMax-M2.1 | MiniMaxAI | minimax_m2 | 3 | 2026-03-29T15:33:55.295821+00:00 | {"hidden_size": 3072, "intermediate_size": 1536, "vocab_size": 200064, "max_position_embeddings": 196608, "num_hidden_layers": 62, "rope_theta": 5000000, "rms_norm_eps": 1e-06, "model_type": "minimax_m2"} | {"rope_type": null, "attention_bias": null, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddings": 196608, "tie_word_embeddings": false, "norm_epsilon": 1e-06, "activation_function": "silu", "residual_dropout": null, "attention_dropout": 0.0, "actual... | {"organization": "MiniMaxAI", "first_publish_date": "2025-12-20", "paper_urls": ["https://arxiv.org/abs/2509.06501"], "base_model": null, "license": "other", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["gpt 5.2", "minimax-m2.1", "gpt", "claude"], "hub_metadata": {"downloads": 4178... | {"residual_stream_dim": 3072, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "attn_pre_norm": "layers... | {"contributed_at": "2026-04-01T19:06:09.863582", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
Qwen/Qwen3.5-9B | Qwen | qwen3_5_text | 3 | 2026-03-29T15:43:11.356013+00:00 | {"hidden_size": 4096, "intermediate_size": 12288, "vocab_size": 248320, "max_position_embeddings": 262144, "num_hidden_layers": 32, "rms_norm_eps": 1e-06, "model_type": "qwen3_5_text"} | {"rope_type": null, "rope_scaling": {"mrope_interleaved": true, "mrope_section": [11, 11, 10], "rope_type": "default", "rope_theta": 10000000, "partial_rotary_factor": 0.25}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddin... | {"organization": "Qwen", "first_publish_date": "2026-02-27", "paper_urls": [], "base_model": "Qwen/Qwen3.5-9B-Base", "license": "apache-2.0", "architecture_tags": ["conversational"], "model_card_lineage": ["qwen 3", "claude", "gpt", "qwen 3.5", "qwen", "switch", "gpt 5"], "hub_metadata": {"downloads": 4286464, "likes":... | {"residual_stream_dim": 4096, "layers": [{"index": 0, "intervention_points": {"mlp_input": "layers.0.mlp.", "mlp_gate_proj": "layers.0.mlp.gate_proj", "mlp_up_proj": "layers.0.mlp.up_proj", "mlp_down_proj": "layers.0.mlp.down_proj", "attn_pre_norm": "layers.0.input_layernorm", "attn_post_norm": "layers.0.post_attention... | {"contributed_at": "2026-04-01T19:06:09.863670", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
Qwen/Qwen3.5-35B-A3B | Qwen | qwen3_5_moe_text | 3 | 2026-03-29T15:43:22.406730+00:00 | {"hidden_size": 2048, "vocab_size": 248320, "max_position_embeddings": 262144, "num_hidden_layers": 40, "rms_norm_eps": 1e-06, "model_type": "qwen3_5_moe_text"} | {"rope_type": null, "rope_scaling": {"mrope_interleaved": true, "mrope_section": [11, 11, 10], "rope_type": "default", "rope_theta": 10000000, "partial_rotary_factor": 0.25}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddin... | {"organization": "Qwen", "first_publish_date": "2026-02-24", "paper_urls": [], "base_model": "Qwen/Qwen3.5-35B-A3B-Base", "license": "apache-2.0", "architecture_tags": ["conversational"], "model_card_lineage": ["qwen", "gpt 5", "claude", "gpt", "our model adopt a simple context-folding strategy", "qwen 3", "qwen 3.5", ... | {"residual_stream_dim": 2048, "layers": [{"index": 0, "intervention_points": {"mlp_input": "layers.0.mlp.", "mlp_gate_proj": "layers.0.mlp.gate", "moe_gate": "layers.0.mlp.gate", "moe_experts": "layers.0.mlp.experts", "moe_shared_expert": "layers.0.mlp.shared_expert", "attn_pre_norm": "layers.0.input_layernorm", "attn_... | {"contributed_at": "2026-04-01T19:06:09.863769", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
Qwen/Qwen3.5-397B-A17B | Qwen | qwen3_5_moe_text | 3 | 2026-03-29T15:43:32.479492+00:00 | {"hidden_size": 4096, "vocab_size": 248320, "max_position_embeddings": 262144, "num_hidden_layers": 60, "rms_norm_eps": 1e-06, "model_type": "qwen3_5_moe_text"} | {"rope_type": null, "rope_scaling": {"mrope_interleaved": true, "mrope_section": [11, 11, 10], "rope_type": "default", "rope_theta": 10000000, "partial_rotary_factor": 0.25}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddin... | {"organization": "Qwen", "first_publish_date": "2026-02-16", "paper_urls": [], "base_model": null, "license": "apache-2.0", "architecture_tags": ["conversational"], "model_card_lineage": ["qwen 3", "switch", "qwen", "gpt 5.2", "claude 4.5", "our model adopt a simple context-folding strategy", "claude", "qwen 3.5"], "hu... | {"residual_stream_dim": 4096, "layers": [{"index": 0, "intervention_points": {"mlp_input": "layers.0.mlp.", "mlp_gate_proj": "layers.0.mlp.gate", "moe_gate": "layers.0.mlp.gate", "moe_experts": "layers.0.mlp.experts", "moe_shared_expert": "layers.0.mlp.shared_expert", "attn_pre_norm": "layers.0.input_layernorm", "attn_... | {"contributed_at": "2026-04-01T19:06:09.863891", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
Qwen/Qwen3.5-27B | Qwen | qwen3_5_text | 3 | 2026-03-29T15:43:42.293743+00:00 | {"hidden_size": 5120, "intermediate_size": 17408, "vocab_size": 248320, "max_position_embeddings": 262144, "num_hidden_layers": 64, "rms_norm_eps": 1e-06, "model_type": "qwen3_5_text"} | {"rope_type": null, "rope_scaling": {"mrope_interleaved": true, "mrope_section": [11, 11, 10], "rope_type": "default", "rope_theta": 10000000, "partial_rotary_factor": 0.25}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddin... | {"organization": "Qwen", "first_publish_date": "2026-02-24", "paper_urls": [], "base_model": null, "license": "apache-2.0", "architecture_tags": ["conversational"], "model_card_lineage": ["claude", "qwen 3.5", "gpt", "qwen", "switch", "our model adopt a simple context-folding strategy", "qwen 3", "gpt 5"], "hub_metadat... | {"residual_stream_dim": 5120, "layers": [{"index": 0, "intervention_points": {"mlp_input": "layers.0.mlp.", "mlp_gate_proj": "layers.0.mlp.gate_proj", "mlp_up_proj": "layers.0.mlp.up_proj", "mlp_down_proj": "layers.0.mlp.down_proj", "attn_pre_norm": "layers.0.input_layernorm", "attn_post_norm": "layers.0.post_attention... | {"contributed_at": "2026-04-01T19:06:09.864010", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
deepseek-ai/DeepSeek-V3.1 | deepseek-ai | deepseek_v3 | 3 | 2026-03-29T15:43:57.325330+00:00 | {"hidden_size": 7168, "intermediate_size": 18432, "vocab_size": 129280, "max_position_embeddings": 163840, "num_hidden_layers": 61, "rope_theta": 10000, "rms_norm_eps": 1e-06, "model_type": "deepseek_v3"} | {"rope_type": null, "rope_scaling": {"beta_fast": 32, "beta_slow": 1, "factor": 40, "mscale": 1.0, "mscale_all_dim": 1.0, "original_max_position_embeddings": 4096, "type": "yarn", "rope_theta": 10000, "rope_type": "yarn"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "eager", "atten... | {"organization": "deepseek-ai", "first_publish_date": "2025-08-21", "paper_urls": ["https://arxiv.org/abs/2412.19437"], "base_model": "deepseek-ai/DeepSeek-V3.1-Base", "license": "mit", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["deepseek-r1."], "hub_metadata": {"downloads": 1533... | {"residual_stream_dim": 7168, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.mlp.", "mlp_gate_proj": "layers.0.mlp.gate_proj", "mlp_up_proj": "layers.0.mlp.up_proj", "mlp_down_proj": "layers.0.mlp.down_proj", "... | {"contributed_at": "2026-04-01T19:06:09.864137", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
deepseek-ai/DeepSeek-R1-0528-Qwen3-8B | deepseek-ai | qwen3 | 3 | 2026-03-29T15:44:03.764306+00:00 | {"hidden_size": 4096, "intermediate_size": 12288, "vocab_size": 151936, "max_position_embeddings": 131072, "num_hidden_layers": 36, "rms_norm_eps": 1e-06, "model_type": "qwen3"} | {"rope_type": null, "rope_scaling": {"rope_type": "yarn", "factor": 4.0, "original_max_position_embeddings": 32768, "attn_factor": 0.8782488562869419, "rope_theta": 1000000}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddin... | {"organization": "deepseek-ai", "first_publish_date": "2025-05-29", "paper_urls": ["https://arxiv.org/abs/2501.12948"], "base_model": null, "license": "mit", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["phi", "the user", "gpt 4.1", "qwen 3", "gpt", "the question.\r\n- for listing-... | {"residual_stream_dim": 4096, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.864254", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
zai-org/GLM-4.7 | zai-org | glm4_moe | 3 | 2026-03-29T15:44:49.559148+00:00 | {"hidden_size": 5120, "intermediate_size": 12288, "vocab_size": 151552, "max_position_embeddings": 202752, "num_hidden_layers": 92, "rms_norm_eps": 1e-05, "model_type": "glm4_moe"} | {"rope_type": null, "rope_scaling": {"rope_theta": 1000000, "partial_rotary_factor": 0.5, "rope_type": "default"}, "attention_bias": true, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddings": 202752, "positional_encoding": "rope", "tie_word_embeddi... | {"organization": "zai-org", "first_publish_date": "2025-12-22", "paper_urls": ["https://arxiv.org/abs/2508.06471"], "base_model": null, "license": "mit", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["gpt 5", "switch", "claude", "glm", "gpt", "glm 4.6", "glm 45a", "yi", "glm 4.5", "... | {"residual_stream_dim": 5120, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.864459", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
zai-org/GLM-4.7-Flash | zai-org | glm4_moe_lite | 3 | 2026-03-29T15:44:53.562389+00:00 | {"hidden_size": 2048, "intermediate_size": 10240, "vocab_size": 154880, "max_position_embeddings": 202752, "num_hidden_layers": 47, "rms_norm_eps": 1e-05, "model_type": "glm4_moe_lite"} | {"rope_type": null, "rope_scaling": {"rope_theta": 1000000, "partial_rotary_factor": 1.0, "rope_type": "default"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "sdpa", "attention_architecture": "mla", "mla_q_lora_rank": 768, "mla_kv_lora_rank": 512, "mla_qk_nope_head_dim": 192, "mla... | {"organization": "zai-org", "first_publish_date": "2026-01-19", "paper_urls": ["https://arxiv.org/abs/2508.06471"], "base_model": null, "license": "mit", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["glm 4.5", "glm", "glm 47", "qwen 3", "glm 45a", "claude", "glm 45", "yi", "gpt", "... | {"residual_stream_dim": 2048, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.mlp.", "mlp_gate_proj": "layers.0.mlp.gate_proj", "mlp_up_proj": "layers.0.mlp.up_proj", "mlp_down_proj": "layers.0.mlp.down_proj", "... | {"contributed_at": "2026-04-01T19:06:09.864569", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
XiaomiMiMo/MiMo-V2-Flash | XiaomiMiMo | mimo_v2_flash | 3 | 2026-03-29T15:45:04.990961+00:00 | {"hidden_size": 4096, "intermediate_size": 16384, "vocab_size": 152576, "max_position_embeddings": 262144, "num_hidden_layers": 48, "rope_theta": 10000, "model_type": "mimo_v2_flash"} | {"rope_type": null, "rope_scaling": {"rope_type": "default", "rope_theta": 5000000, "partial_rotary_factor": 0.334}, "attention_bias": false, "sliding_window_size": 128, "attention_implementation": "eager", "flash_attention_capable": true, "position_embedding_type": null, "max_position_embeddings": 262144, "positional_... | {"organization": "XiaomiMiMo", "first_publish_date": "2025-12-16", "paper_urls": [], "base_model": null, "license": "mit", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["top of sglang. with", "gpt 5", "gpt", "qwen 3", "claude"], "hub_metadata": {"downloads": 91678, "likes": 677, "cr... | {"residual_stream_dim": 4096, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.864690", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
Qwen/Qwen3-Coder-30B-A3B-Instruct | Qwen | qwen3_moe | 3 | 2026-03-29T15:45:38.648112+00:00 | {"hidden_size": 2048, "intermediate_size": 6144, "vocab_size": 151936, "max_position_embeddings": 262144, "num_hidden_layers": 48, "rms_norm_eps": 1e-06, "model_type": "qwen3_moe"} | {"rope_type": null, "rope_scaling": {"rope_theta": 10000000, "rope_type": "default"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "sdpa", "position_embedding_type": null, "max_position_embeddings": 262144, "positional_encoding": "rope", "tie_word_embeddings": false, "norm_epsilon":... | {"organization": "Qwen", "first_publish_date": "2025-07-31", "paper_urls": ["https://arxiv.org/abs/2505.09388"], "base_model": null, "license": "apache-2.0", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["qwen 3t", "qwen 3", "given inputs.", "qwen", "llama"], "hub_metadata": {"downl... | {"residual_stream_dim": 2048, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.864808", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
Skywork/MindLink-72B-0801 | Skywork | qwen2 | 3 | 2026-03-29T16:18:40.753724+00:00 | {"hidden_size": 8192, "intermediate_size": 29568, "vocab_size": 152064, "max_position_embeddings": 32768, "num_hidden_layers": 80, "rms_norm_eps": 1e-06, "model_type": "qwen2"} | {"rope_type": null, "rope_scaling": {"rope_theta": 1000000.0, "rope_type": "default"}, "attention_bias": null, "sliding_window_size": null, "attention_implementation": "eager", "position_embedding_type": null, "max_position_embeddings": 32768, "positional_encoding": "rope", "tie_word_embeddings": false, "norm_epsilon":... | {"organization": "Skywork", "first_publish_date": "2025-08-01", "paper_urls": [], "base_model": "Qwen/Qwen2.5-72B-Instruct", "license": "apache-2.0", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["qwen", "task complexity", "improvements from", "qwen 2.5"], "hub_metadata": {"download... | {"residual_stream_dim": 8192, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.864995", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
tencent/Hunyuan-A13B-Instruct | tencent | hunyuan_v1_moe | 3 | 2026-03-29T16:20:27.238698+00:00 | {"hidden_size": 4096, "intermediate_size": 3072, "vocab_size": 128167, "max_position_embeddings": 32768, "num_hidden_layers": 32, "rope_theta": 10000.0, "rms_norm_eps": 1e-05, "model_type": "hunyuan_v1_moe"} | {"rope_type": null, "rope_scaling": {"alpha": 1000.0, "beta_fast": 32, "beta_slow": 1, "factor": 1.0, "mscale": 1.0, "mscale_all_dim": 1.0, "type": "dynamic", "rope_theta": 10000.0, "rope_type": "dynamic"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "eager", "flash_attention_capab... | {"organization": "tencent", "first_publish_date": "2025-06-25", "paper_urls": [], "base_model": null, "license": "other", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["qwen 2.5", "qwen 3", "switch", "a fine-grained mixture-of-experts", "opt", "the latest version of sglang.\n\nto ge... | {"residual_stream_dim": 4096, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:06:09.865095", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
zai-org/GLM-5 | zai-org | glm_moe_dsa | 3 | 2026-03-29T16:30:24.700922+00:00 | {"hidden_size": 6144, "intermediate_size": 12288, "vocab_size": 154880, "max_position_embeddings": 202752, "num_hidden_layers": 78, "rms_norm_eps": 1e-05, "model_type": "glm_moe_dsa"} | {"rope_type": null, "rope_scaling": {"rope_theta": 1000000, "rope_type": "default"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "eager", "attention_architecture": "mla", "mla_q_lora_rank": 2048, "mla_kv_lora_rank": 512, "mla_qk_nope_head_dim": 192, "mla_qk_rope_head_dim": 64, "pos... | {"organization": "zai-org", "first_publish_date": "2026-02-11", "paper_urls": [], "base_model": null, "license": "mit", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["glm 47", "glm", "glm 5t", "glm 45", "gpt", "glm 5", "glm 4.7", "glm 4.5", "claude", "gpt 5.2"], "hub_metadata": {"do... | {"residual_stream_dim": 6144, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.mlp.", "mlp_gate_proj": "layers.0.mlp.gate_proj", "mlp_up_proj": "layers.0.mlp.up_proj", "mlp_down_proj": "layers.0.mlp.down_proj", "... | {"contributed_at": "2026-04-01T19:06:09.865240", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
IQuestLab/IQuest-Coder-V1-40B-Instruct | IQuestLab | iquestcoder | 3 | 2026-04-01T23:40:19.538892+00:00 | {"hidden_size": 5120, "intermediate_size": 27648, "vocab_size": 76800, "max_position_embeddings": 131072, "num_hidden_layers": 80, "rope_theta": 500000.0, "rms_norm_eps": 1e-05, "model_type": "iquestcoder"} | {"rope_type": null, "rope_scaling": {"rope_theta": 500000.0, "rope_type": "default"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "eager", "flash_attention_capable": true, "position_embedding_type": null, "max_position_embeddings": 131072, "positional_encoding": "rope", "tie_word_e... | {"organization": "IQuestLab", "first_publish_date": "2025-12-30", "paper_urls": [], "base_model": null, "license": "other", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["qwen 3"], "hub_metadata": {"downloads": 12216, "likes": 289, "created_at": "2025-12-30T15:30:10.000Z", "last_mod... | {"residual_stream_dim": 5120, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:41:14.509541", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
trohrbaugh/IQuest-Coder-V1-40B-Thinking-heretic | trohrbaugh | iquestcoder | 3 | 2026-04-01T23:40:47.306445+00:00 | {"hidden_size": 5120, "intermediate_size": 27648, "vocab_size": 76800, "max_position_embeddings": 131072, "num_hidden_layers": 80, "rope_theta": 500000.0, "rms_norm_eps": 1e-05, "model_type": "iquestcoder"} | {"rope_type": null, "rope_scaling": {"rope_theta": 500000.0, "rope_type": "default"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "eager", "flash_attention_capable": true, "position_embedding_type": null, "max_position_embeddings": 131072, "positional_encoding": "rope", "tie_word_e... | {"organization": "trohrbaugh", "first_publish_date": "2026-04-01", "paper_urls": [], "base_model": null, "license": "other", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["qwen 3", "claude"], "hub_metadata": {"downloads": 0, "likes": 0, "created_at": "2026-04-01T18:23:14.000Z", "las... | {"residual_stream_dim": 5120, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:41:14.509889", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |
IQuestLab/IQuest-Coder-V1-40B-Thinking | IQuestLab | iquestcoder | 3 | 2026-04-01T23:41:05.000325+00:00 | {"hidden_size": 5120, "intermediate_size": 27648, "vocab_size": 76800, "max_position_embeddings": 131072, "num_hidden_layers": 80, "rope_theta": 500000.0, "rms_norm_eps": 1e-05, "model_type": "iquestcoder"} | {"rope_type": null, "rope_scaling": {"rope_theta": 500000.0, "rope_type": "default"}, "attention_bias": false, "sliding_window_size": null, "attention_implementation": "eager", "flash_attention_capable": true, "position_embedding_type": null, "max_position_embeddings": 131072, "positional_encoding": "rope", "tie_word_e... | {"organization": "IQuestLab", "first_publish_date": "2025-12-30", "paper_urls": [], "base_model": null, "license": "other", "architecture_tags": ["text-generation", "conversational"], "model_card_lineage": ["qwen 3", "claude"], "hub_metadata": {"downloads": 330, "likes": 16, "created_at": "2025-12-30T15:32:25.000Z", "l... | {"residual_stream_dim": 5120, "layers": [{"index": 0, "intervention_points": {"attn_input": "layers.0.self_attn.", "attn_q_proj": "layers.0.self_attn.q_proj", "attn_k_proj": "layers.0.self_attn.k_proj", "attn_v_proj": "layers.0.self_attn.v_proj", "attn_output_proj": "layers.0.self_attn.o_proj", "mlp_input": "layers.0.m... | {"contributed_at": "2026-04-01T19:41:14.510113", "contributor": "trohrbaugh", "attribution_level": "full", "schema_version": 1} | public | 0.95 |