Dataset Preview
The full dataset viewer is not available; only a preview of the rows is shown.
The dataset generation failed because of a cast error
Error code:   DatasetGenerationCastError
Exception:    DatasetGenerationCastError
Message:      An error occurred while generating the dataset

All the data files must have the same columns, but at some point there are 49 new columns ({'time_stats.attn_post_proj.median', 'time_stats.attn_pre_proj.mean', 'n_expanded_embd', 'time_stats.mlp_up_proj.median', 'time_stats.mlp_up_proj.std', 'time_stats.mlp_up_proj.min', 'time_stats.attn_post_proj.std', 'time_stats.input_norm_fused.median', 'time_stats.mlp_down_proj.median', 'time_stats.input_norm_fused.std', 'time_stats.mlp_down_proj.min', 'time_stats.attn_pre_proj.std', 'time_stats.attn_post_proj.max', 'time_stats.mlp_up_proj.max', 'time_stats.post_attention_norm_fused.max', 'time_stats.mlp_down_proj.std', 'time_stats.mlp_act.max', 'time_stats.mlp_act.median', 'time_stats.emb.median', 'time_stats.attn_rope.median', 'time_stats.attn_rope.std', 'time_stats.attn_pre_proj.max', 'time_stats.mlp_up_proj.mean', 'time_stats.attn_rope.max', 'time_stats.mlp_act.std', 'time_stats.post_attention_norm_fused.median', 'time_stats.mlp_act.mean', 'num_tokens', 'time_stats.attn_post_proj.mean', 'time_stats.emb.std', 'time_stats.mlp_act.min', 'time_stats.attn_pre_proj.min', 'time_stats.attn_pre_proj.median', 'time_stats.attn_post_proj.min', 'time_stats.input_norm_fused.max', 'time_stats.input_norm_fused.min', 'vocab_size', 'time_stats.post_attention_norm_fused.mean', 'time_stats.input_norm_fused.mean', 'time_stats.mlp_down_proj.max', 'time_stats.attn_rope.min', 'time_stats.attn_rope.mean', 'time_stats.emb.min', 'time_stats.emb.max', 'n_head', 'time_stats.post_attention_norm_fused.min', 'time_stats.mlp_down_proj.mean', 'time_stats.post_attention_norm_fused.std', 'time_stats.emb.mean'}) and 15 missing columns ({'attention_backend', 'max_model_len', 'prefill_chunk_size', 'time_stats.AttentionForward.max', 'time_stats.AttentionForward.std', 'time_stats.AttentionForward.mean', 'dtype', 'is_prefill', 'causal', 'time_stats.AttentionForward.median', 'batch_size', 'block_size', 'time_stats.AttentionForward.min', 'kv_cache_size', 'n_q_head'}).

This happened while the csv dataset builder was generating data using

/tmp/hf-datasets-cache/medium/datasets/12645371123672-config-parquet-and-info-project-vajra-dev-staging-e1f8aea7/hub/datasets--project-vajra--dev-staging-meta-llama-meta-llama-3-70b-instruct-h200-nvl/snapshots/edcd99ca727d6cabcc7fa7fef6e51a2ed2ccbe6c/mlp.csv.xz

Please either edit the data files to have matching columns, or separate them into different configurations (see docs at https://hf.co/docs/hub/datasets-manual-configuration#multiple-configurations)
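The linked docs describe mapping each data file to its own configuration in the dataset README's YAML header, which is the usual fix when files in one repo have deliberately different schemas. A sketch of such a header (the `attention` config name and its filename are assumptions for illustration; only `mlp.csv.xz` appears in the error above):

```yaml
configs:
  - config_name: mlp
    data_files: "mlp.csv.xz"
  - config_name: attention
    data_files: "attention.csv.xz"  # assumed filename for the AttentionForward schema
```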
Traceback:    Traceback (most recent call last):
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1831, in _prepare_split_single
                  writer.write_table(table)
                File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 714, in write_table
                  pa_table = table_cast(pa_table, self._schema)
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/usr/local/lib/python3.12/site-packages/datasets/table.py", line 2272, in table_cast
                  return cast_table_to_schema(table, schema)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/usr/local/lib/python3.12/site-packages/datasets/table.py", line 2218, in cast_table_to_schema
                  raise CastError(
              datasets.table.CastError: Couldn't cast
              time_stats.mlp_up_proj.min: double
              time_stats.mlp_up_proj.max: double
              time_stats.mlp_up_proj.mean: double
              time_stats.mlp_up_proj.median: double
              time_stats.mlp_up_proj.std: double
              time_stats.post_attention_norm_fused.min: double
              time_stats.post_attention_norm_fused.max: double
              time_stats.post_attention_norm_fused.mean: double
              time_stats.post_attention_norm_fused.median: double
              time_stats.post_attention_norm_fused.std: double
              time_stats.attn_post_proj.min: double
              time_stats.attn_post_proj.max: double
              time_stats.attn_post_proj.mean: double
              time_stats.attn_post_proj.median: double
              time_stats.attn_post_proj.std: double
              time_stats.mlp_down_proj.min: double
              time_stats.mlp_down_proj.max: double
              time_stats.mlp_down_proj.mean: double
              time_stats.mlp_down_proj.median: double
              time_stats.mlp_down_proj.std: double
              time_stats.attn_rope.min: double
              time_stats.attn_rope.max: double
              time_stats.attn_rope.mean: double
              time_stats.attn_rope.median: double
              time_stats.attn_rope.std: double
              time_stats.attn_pre_proj.min: double
              time_stats.attn_pre_proj.max: double
              time_stats.attn_pre_proj.mean: double
              time_stats.attn_pre_proj.median: double
              time_stats.attn_pre_proj.std: double
              time_stats.input_norm_fused.min: double
              time_stats.input_norm_fused.max: double
              time_stats.input_norm_fused.mean: double
              time_stats.input_norm_fused.median: double
              time_stats.input_norm_fused.std: double
              time_stats.mlp_act.min: double
              time_stats.mlp_act.max: double
              time_stats.mlp_act.mean: double
              time_stats.mlp_act.median: double
              time_stats.mlp_act.std: double
              time_stats.emb.min: double
              time_stats.emb.max: double
              time_stats.emb.mean: double
              time_stats.emb.median: double
              time_stats.emb.std: double
              n_head: int64
              n_kv_head: int64
              n_embd: int64
              n_expanded_embd: int64
              vocab_size: int64
              num_tokens: int64
              num_tensor_parallel_workers: int64
              -- schema metadata --
              pandas: '{"index_columns": [{"kind": "range", "name": null, "start": 0, "' + 8184
              to
              {'time_stats.AttentionForward.min': Value('float64'), 'time_stats.AttentionForward.max': Value('float64'), 'time_stats.AttentionForward.mean': Value('float64'), 'time_stats.AttentionForward.median': Value('float64'), 'time_stats.AttentionForward.std': Value('float64'), 'n_embd': Value('int64'), 'n_q_head': Value('int64'), 'n_kv_head': Value('int64'), 'block_size': Value('int64'), 'num_tensor_parallel_workers': Value('int64'), 'max_model_len': Value('int64'), 'batch_size': Value('int64'), 'prefill_chunk_size': Value('int64'), 'kv_cache_size': Value('int64'), 'is_prefill': Value('bool'), 'attention_backend': Value('string'), 'dtype': Value('string'), 'causal': Value('bool')}
              because column names don't match
              
              During handling of the above exception, another exception occurred:
              
              Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1339, in compute_config_parquet_and_info_response
                  parquet_operations = convert_to_parquet(builder)
                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 972, in convert_to_parquet
                  builder.download_and_prepare(
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 894, in download_and_prepare
                  self._download_and_prepare(
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 970, in _download_and_prepare
                  self._prepare_split(split_generator, **prepare_split_kwargs)
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1702, in _prepare_split
                  for job_id, done, content in self._prepare_split_single(
                                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1833, in _prepare_split_single
                  raise DatasetGenerationCastError.from_cast_error(
              datasets.exceptions.DatasetGenerationCastError: An error occurred while generating the dataset
              
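The check that produced this error is essentially a set difference over column names. A minimal illustration (column lists abbreviated from the error message above; this is not the `datasets` internals, just the shape of the comparison):

```python
# Schema inferred from the first data file (abbreviated from the error above).
expected = {
    "time_stats.AttentionForward.min", "time_stats.AttentionForward.std",
    "n_embd", "n_q_head", "n_kv_head", "kv_cache_size", "attention_backend",
    "dtype", "causal",
}
# Columns actually found in mlp.csv.xz (abbreviated).
found = {
    "time_stats.mlp_up_proj.min", "time_stats.mlp_up_proj.std",
    "n_embd", "n_head", "n_kv_head", "vocab_size", "num_tokens",
}

new_columns = found - expected      # present in the file, unknown to the schema
missing_columns = expected - found  # required by the schema, absent from the file

print(sorted(new_columns))
print(sorted(missing_columns))
```

With the full column lists, the same comparison yields the 49 new and 15 missing columns the builder reports.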


Preview schema:

| column | dtype |
| --- | --- |
| time_stats.AttentionForward.min | float64 |
| time_stats.AttentionForward.max | float64 |
| time_stats.AttentionForward.mean | float64 |
| time_stats.AttentionForward.median | float64 |
| time_stats.AttentionForward.std | float64 |
| n_embd | int64 |
| n_q_head | int64 |
| n_kv_head | int64 |
| block_size | int64 |
| num_tensor_parallel_workers | int64 |
| max_model_len | int64 |
| batch_size | int64 |
| prefill_chunk_size | int64 |
| kv_cache_size | int64 |
| is_prefill | bool |
| attention_backend | string |
| dtype | string |
| causal | bool |

All 100 preview rows share the same configuration: n_embd = 8192, n_q_head = 64, n_kv_head = 8, block_size = 16, num_tensor_parallel_workers = 1, max_model_len = 8192, batch_size = 1, prefill_chunk_size = 32, is_prefill = true, attention_backend = flashinfer_auto, dtype = torch.bfloat16, causal = true. In every row the five time_stats.AttentionForward statistics are identical and std is 0 (a single timed sample), so only kv_cache_size and the measured time vary:

| kv_cache_size | time_stats.AttentionForward (min = max = mean = median) |
| --- | --- |
| 0 | 0.085763 |
| 32 | 0.067331 |
| 64 | 0.10202 |
| 96 | 0.078466 |
| 128 | 0.07725 |
| 160 | 0.073539 |
| 192 | 0.077699 |
| 224 | 0.083651 |
| 256 | 0.070979 |
| 288 | 0.075267 |
| 320 | 0.071875 |
| 352 | 0.086051 |
| 384 | 0.079267 |
| 416 | 0.077348 |
| 448 | 0.079107 |
| 480 | 0.084931 |
| 512 | 0.079107 |
| 544 | 0.089443 |
| 576 | 0.080227 |
| 608 | 0.088291 |
| 640 | 0.082979 |
| 672 | 0.084995 |
| 704 | 0.080451 |
| 736 | 0.086819 |
| 768 | 0.083139 |
| 800 | 0.088163 |
| 832 | 0.08522 |
| 864 | 0.09242 |
| 896 | 0.087682 |
| 928 | 0.095107 |
| 960 | 0.084995 |
| 992 | 0.093602 |
| 1024 | 0.103171 |
| 1056 | 0.088162 |
| 1088 | 0.089923 |
| 1120 | 0.092707 |
| 1152 | 0.109091 |
| 1184 | 0.101475 |
| 1216 | 0.093891 |
| 1248 | 0.097476 |
| 1280 | 0.115139 |
| 1312 | 0.09533 |
| 1344 | 0.094466 |
| 1376 | 0.098467 |
| 1408 | 0.107396 |
| 1440 | 0.098179 |
| 1472 | 0.096803 |
| 1504 | 0.100867 |
| 1536 | 0.088131 |
| 1568 | 0.099557 |
| 1600 | 0.088068 |
| 1632 | 0.091363 |
| 1664 | 0.091939 |
| 1696 | 0.090115 |
| 1728 | 0.091431 |
| 1760 | 0.091875 |
| 1792 | 0.091907 |
| 1824 | 0.096898 |
| 1856 | 0.058689 |
| 1888 | 0.092834 |
| 1920 | 0.096547 |
| 1952 | 0.093572 |
| 1984 | 0.060737 |
| 2016 | 0.098052 |
| 2048 | 0.097187 |
| 2080 | 0.095587 |
| 2112 | 0.073475 |
| 2144 | 0.113987 |
| 2176 | 0.097509 |
| 2208 | 0.100066 |
| 2240 | 0.064547 |
| 2272 | 0.101859 |
| 2304 | 0.102243 |
| 2336 | 0.100164 |
| 2368 | 0.066308 |
| 2400 | 0.105093 |
| 2432 | 0.102818 |
| 2464 | 0.101156 |
| 2496 | 0.069026 |
| 2528 | 0.105284 |
| 2560 | 0.111331 |
| 2592 | 0.128292 |
| 2624 | 0.073186 |
| 2656 | 0.115298 |
| 2688 | 0.116292 |
| 2720 | 0.113828 |
| 2752 | 0.076995 |
| 2784 | 0.122051 |
| 2816 | 0.109507 |
| 2848 | 0.111075 |
| 2880 | 0.074689 |
| 2912 | 0.109571 |
| 2944 | 0.108388 |
| 2976 | 0.110883 |
| 3008 | 0.074851 |
| 3040 | 0.111779 |
| 3072 | 0.112161 |
| 3104 | 0.111427 |
| 3136 | 0.078627 |
| 3168 | 0.116132 |
End of preview.
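For anyone consuming this preview programmatically, the single-sample structure of the timing columns can be verified directly. A minimal sketch using the first three preview rows (values copied from the preview above; time units are not stated on the page):

```python
# AttentionForward stats from the first three preview rows:
# (kv_cache_size, min, max, mean, median, std).
rows = [
    (0,  0.085763, 0.085763, 0.085763, 0.085763, 0.0),
    (32, 0.067331, 0.067331, 0.067331, 0.067331, 0.0),
    (64, 0.10202,  0.10202,  0.10202,  0.10202,  0.0),
]

# Each row appears to come from a single timed sample: std is 0 and
# min, max, mean, and median all coincide.
single_sample = all(
    std == 0.0 and mn == mx == mean == median
    for _, mn, mx, mean, median, std in rows
)
print(single_sample)  # True
```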

No dataset card yet. Downloads last month: 3.