Dataset Preview
The full dataset viewer is not available; only a preview of the rows is shown.
The dataset generation failed
Error code:   DatasetGenerationError
Exception:    ArrowNotImplementedError
Message:      Cannot write struct type 'attributes' with no child field to Parquet. Consider adding a dummy child field.
Traceback:    Traceback (most recent call last):
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1831, in _prepare_split_single
                  writer.write_table(table)
                File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 712, in write_table
                  self._build_writer(inferred_schema=pa_table.schema)
                File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 757, in _build_writer
                  self.pa_writer = pq.ParquetWriter(
                                   ^^^^^^^^^^^^^^^^^
                File "/usr/local/lib/python3.12/site-packages/pyarrow/parquet/core.py", line 1070, in __init__
                  self.writer = _parquet.ParquetWriter(
                                ^^^^^^^^^^^^^^^^^^^^^^^
                File "pyarrow/_parquet.pyx", line 2363, in pyarrow._parquet.ParquetWriter.__cinit__
                File "pyarrow/error.pxi", line 155, in pyarrow.lib.pyarrow_internal_check_status
                File "pyarrow/error.pxi", line 92, in pyarrow.lib.check_status
              pyarrow.lib.ArrowNotImplementedError: Cannot write struct type 'attributes' with no child field to Parquet. Consider adding a dummy child field.
              
              During handling of the above exception, another exception occurred:
              
              Traceback (most recent call last):
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1847, in _prepare_split_single
                  num_examples, num_bytes = writer.finalize()
                                            ^^^^^^^^^^^^^^^^^
                File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 731, in finalize
                  self._build_writer(self.schema)
                File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 757, in _build_writer
                  self.pa_writer = pq.ParquetWriter(
                                   ^^^^^^^^^^^^^^^^^
                File "/usr/local/lib/python3.12/site-packages/pyarrow/parquet/core.py", line 1070, in __init__
                  self.writer = _parquet.ParquetWriter(
                                ^^^^^^^^^^^^^^^^^^^^^^^
                File "pyarrow/_parquet.pyx", line 2363, in pyarrow._parquet.ParquetWriter.__cinit__
                File "pyarrow/error.pxi", line 155, in pyarrow.lib.pyarrow_internal_check_status
                File "pyarrow/error.pxi", line 92, in pyarrow.lib.check_status
              pyarrow.lib.ArrowNotImplementedError: Cannot write struct type 'attributes' with no child field to Parquet. Consider adding a dummy child field.
              
              The above exception was the direct cause of the following exception:
              
              Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1455, in compute_config_parquet_and_info_response
                  parquet_operations = convert_to_parquet(builder)
                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1054, in convert_to_parquet
                  builder.download_and_prepare(
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 894, in download_and_prepare
                  self._download_and_prepare(
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 970, in _download_and_prepare
                  self._prepare_split(split_generator, **prepare_split_kwargs)
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1702, in _prepare_split
                  for job_id, done, content in self._prepare_split_single(
                                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1858, in _prepare_split_single
                  raise DatasetGenerationError("An error occurred while generating the dataset") from e
              datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset
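
The root cause is visible in the rows below: `stateful_callbacks.TrainerControl.attributes` is an empty JSON object (`{}`). Arrow infers an empty object as a struct type with no child fields, and the Parquet writer cannot serialize such a struct, exactly as the error message says. One workaround, following the error's own suggestion ("Consider adding a dummy child field"), is to preprocess the JSON before loading and give every empty object a placeholder key. The sketch below is illustrative, not part of the `datasets` library; the function name and the dummy key `"_"` are my own choices:

```python
import json

def add_dummy_to_empty_structs(obj, dummy_key="_", dummy_value=None):
    """Recursively replace every empty dict with {dummy_key: dummy_value}.

    Arrow infers an empty dict as a struct with no child fields, which
    Parquet cannot store; giving each empty dict one placeholder key
    avoids the ArrowNotImplementedError seen in the traceback above.
    """
    if isinstance(obj, dict):
        if not obj:
            return {dummy_key: dummy_value}
        return {k: add_dummy_to_empty_structs(v, dummy_key, dummy_value)
                for k, v in obj.items()}
    if isinstance(obj, list):
        return [add_dummy_to_empty_structs(v, dummy_key, dummy_value) for v in obj]
    return obj

# Example shaped like the trainer_state.json rows in this dataset:
state = json.loads(
    '{"stateful_callbacks": {"TrainerControl": '
    '{"args": {"should_save": true}, "attributes": {}}}}'
)
patched = add_dummy_to_empty_structs(state)
print(patched["stateful_callbacks"]["TrainerControl"]["attributes"])  # {'_': None}
```

Writing the patched objects back out (e.g. with `json.dump`) before loading them with `datasets` should let the Parquet conversion succeed, since every struct column then has at least one child field.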


Columns and inferred types:

  best_metric            null
  best_model_checkpoint  null
  epoch                  float64
  eval_steps             int64
  global_step            int64
  is_hyper_param_search  bool
  is_local_process_zero  bool
  is_world_process_zero  bool
  log_history            list
  logging_steps          int64
  max_steps              int64
  num_input_tokens_seen  int64
  num_train_epochs       int64
  save_steps             int64
  stateful_callbacks     dict
  total_flos             float64
  train_batch_size       int64
  trial_name             null
  trial_params           null
Preview rows (9 rows). Values shared by all rows: best_metric = null · best_model_checkpoint = null · eval_steps = 500 · is_hyper_param_search = false · is_local_process_zero = true · is_world_process_zero = true · logging_steps = 1 · num_input_tokens_seen = 0 · save_steps = 10 · train_batch_size = 1 · trial_name = null · trial_params = null

  Row | epoch     | global_step | max_steps | num_train_epochs | total_flos
    1 | 115.85906 |         232 |       232 |              116 | 209,921,899,528,978,430
    2 | 115.85906 |         232 |       232 |              116 | 243,388,770,567,585,800
    3 | 115.85906 |         232 |       232 |              116 | 217,714,658,528,722,940
    4 | 77.430976 |         232 |       232 |              116 | 488,582,934,243,360,800
    5 | 77.430976 |         232 |       232 |              116 | 567,702,596,938,014,700
    6 | 77.430976 |         232 |       232 |              116 | 485,317,661,597,368,300
    7 | 115.85906 |         232 |       232 |              116 | 133,227,408,936,402,940
    8 | 109.85906 |         220 |       220 |              110 | 146,568,203,953,242,100
    9 | 115.85906 |         232 |       232 |              116 | 138,321,463,118,659,580

log_history (truncated in the preview):

  Row 1: [ { "epoch": 0.42953020134228187, "grad_norm": 10.125, "learning_rate": 4.166666666666666e-8, "loss": 0.6557, "step": 1 }, { "epoch": 0.8590604026845637, "grad_norm": 10.3125, "learning_rate": 8.333333333333333e-8, "loss": 0.6669, "step": 2 }, { "epoch": 1.429530201...
  Row 2: [ { "epoch": 0.42953020134228187, "grad_norm": 8.8125, "learning_rate": 4.166666666666666e-8, "loss": 0.712, "step": 1 }, { "epoch": 0.8590604026845637, "grad_norm": 8.9375, "learning_rate": 8.333333333333333e-8, "loss": 0.7198, "step": 2 }, { "epoch": 1.42953020134...
  Row 3: [ { "epoch": 0.42953020134228187, "grad_norm": 17.375, "learning_rate": 4.166666666666666e-8, "loss": 0.9944, "step": 1 }, { "epoch": 0.8590604026845637, "grad_norm": 17.125, "learning_rate": 8.333333333333333e-8, "loss": 1.0053, "step": 2 }, { "epoch": 1.4295302013...
  Row 4: [ { "epoch": 0.43097643097643096, "grad_norm": 11.25, "learning_rate": 4.166666666666666e-8, "loss": 0.7093, "step": 1 }, { "epoch": 0.8619528619528619, "grad_norm": 11.625, "learning_rate": 8.333333333333333e-8, "loss": 0.7181, "step": 2 }, { "epoch": 1, "grad_...
  Row 5: [ { "epoch": 0.43097643097643096, "grad_norm": 9.5625, "learning_rate": 4.166666666666666e-8, "loss": 0.7483, "step": 1 }, { "epoch": 0.8619528619528619, "grad_norm": 10, "learning_rate": 8.333333333333333e-8, "loss": 0.7577, "step": 2 }, { "epoch": 1, "grad_nor...
  Row 6: [ { "epoch": 0.43097643097643096, "grad_norm": 12.9375, "learning_rate": 4.166666666666666e-8, "loss": 1.133, "step": 1 }, { "epoch": 0.8619528619528619, "grad_norm": 13.6875, "learning_rate": 8.333333333333333e-8, "loss": 1.1669, "step": 2 }, { "epoch": 1, "gra...
  Row 7: [ { "epoch": 0.42953020134228187, "grad_norm": 5.40625, "learning_rate": 4.166666666666666e-8, "loss": 0.4544, "step": 1 }, { "epoch": 0.8590604026845637, "grad_norm": 5.40625, "learning_rate": 8.333333333333333e-8, "loss": 0.4636, "step": 2 }, { "epoch": 1.42953020...
  Row 8: [{"epoch":0.42953020134228187,"grad_norm":4.4375,"learning_rate":4.545454545454545e-8,"loss":0.5185,(...TRUNCATED)
  Row 9: [{"epoch":0.42953020134228187,"grad_norm":6.875,"learning_rate":4.166666666666666e-8,"loss":0.5747,"(...TRUNCATED)

stateful_callbacks:

  Rows 1–7: { "TrainerControl": { "args": { "should_epoch_stop": false, "should_evaluate": false, "should_log": false, "should_save": true, "should_training_stop": true }, "attributes": {} } }
  Rows 8–9 (truncated in the preview): {"TrainerControl":{"args":{"should_epoch_stop":false,"should_evaluate":false,"should_log":false,"sho(...TRUNCATED)

No dataset card yet

Downloads last month: 5