update text checkpoints (commit bc8d05d)
Di4C_ReDi1.ckpt (2.72 GB), Detected Pickle imports (19):
- "__builtin__.long",
- "tokenizers.Tokenizer",
- "tokenizers.models.Model",
- "typing.Any",
- "torch.FloatStorage",
- "omegaconf.listconfig.ListConfig",
- "torch._utils._rebuild_tensor_v2",
- "torch.DoubleStorage",
- "omegaconf.nodes.AnyNode",
- "collections.OrderedDict",
- "__builtin__.list",
- "omegaconf.base.ContainerMetadata",
- "omegaconf.base.Metadata",
- "omegaconf.dictconfig.DictConfig",
- "__builtin__.dict",
- "tokenizers.AddedToken",
- "collections.defaultdict",
- "transformers.models.gpt2.tokenization_gpt2_fast.GPT2TokenizerFast",
- "_codecs.encode"
How to fix it?
ReDi1.ckpt (2.72 GB): same 19 Pickle imports as above.
ReDi2.ckpt (2.72 GB): same 19 Pickle imports as above.
ReDi3.ckpt (2.72 GB): same 19 Pickle imports as above.
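The "Detected Pickle imports" list itself can be reproduced locally without executing the checkpoint: PyTorch .ckpt/.pt files are zip archives whose object tree lives in a `data.pkl` member, and `pickletools.genops` parses pickle opcodes without deserializing anything. The sketch below is simplified (it does not resolve memoized strings for repeated STACK_GLOBAL references); the function names are ours.

```python
import pickletools
import zipfile

def pickle_imports(data: bytes) -> set:
    """Collect module.name pairs referenced by GLOBAL/STACK_GLOBAL opcodes."""
    found = set()
    recent = []  # trailing string arguments, consumed by STACK_GLOBAL
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # GLOBAL's argument is "module name", space-separated
            module, name = arg.split(" ", 1)
            found.add(f"{module}.{name}")
        elif opcode.name == "STACK_GLOBAL" and len(recent) >= 2:
            # protocol 4+: module and name were pushed as the two
            # preceding string opcodes
            found.add(f"{recent[-2]}.{recent[-1]}")
        if isinstance(arg, str):
            recent.append(arg)
    return found

def ckpt_pickle_imports(ckpt_path: str) -> set:
    """Scan the data.pkl member of a zip-based PyTorch checkpoint."""
    with zipfile.ZipFile(ckpt_path) as zf:
        member = next(n for n in zf.namelist() if n.endswith("data.pkl"))
        return pickle_imports(zf.read(member))
```

Running this on one of the files above should surface the same entries the scanner reports, e.g. `collections.OrderedDict` and `torch._utils._rebuild_tensor_v2`.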