Files in the repository (size, last commit message):

- 1.53 kB, "new_tokenizer"
- 28 Bytes, "initial commit"
- 1.43 kB, "1122"
- 1.78 GB, "1122"
- 892 MB, "1122"
- rng_state.pth: 14.5 kB, "1122"
  Detected Pickle imports (7):
  - "collections.OrderedDict"
  - "torch.ByteStorage"
  - "numpy.ndarray"
  - "_codecs.encode"
  - "torch._utils._rebuild_tensor_v2"
  - "numpy.dtype"
  - "numpy.core.multiarray._reconstruct"
- 623 Bytes, "1122"
- 1.79 kB, "tokenizer"
- 1.82 MB, "tokenizer3"
- 1.96 kB, "tokenizer3"
- 2.86 kB, "1122"
- training_args.bin: 2.99 kB, "1122"
  Detected Pickle imports (6):
  - "transformers.training_args.TrainingArguments"
  - "transformers.trainer_utils.SchedulerType"
  - "transformers.training_args.OptimizerNames"
  - "torch.device"
  - "transformers.trainer_utils.HubStrategy"
  - "transformers.trainer_utils.IntervalStrategy"
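The two pickle scans above only enumerate the global imports referenced by the serialized object graphs; a similar check can be reproduced locally without unpickling (and therefore without executing) anything. The sketch below is an illustration under two assumptions: that rng_state.pth and training_args.bin are torch.save() zip archives (the default format since PyTorch 1.6), and that the embedded pickles use protocol 2 (PyTorch's default), where imports appear as GLOBAL opcodes. The function name list_pickle_imports is hypothetical.

```python
import zipfile
import pickletools

def list_pickle_imports(path):
    """List the module.attribute globals referenced by the pickle stream
    inside a torch.save() zip archive, without executing any of it."""
    with zipfile.ZipFile(path) as zf:
        # torch.save() stores the pickled object graph as <archive_name>/data.pkl
        pkl_member = next(n for n in zf.namelist() if n.endswith("data.pkl"))
        stream = zf.read(pkl_member)
    imports = set()
    for opcode, arg, _pos in pickletools.genops(stream):
        # With pickle protocol 2, each imported global is a GLOBAL opcode
        # whose argument is a "module attribute" pair.
        if opcode.name == "GLOBAL":
            module, attr = arg.split(" ", 1)
            imports.add(f"{module}.{attr}")
    return sorted(imports)

if __name__ == "__main__":
    for fname in ("rng_state.pth", "training_args.bin"):
        print(fname, list_pickle_imports(fname))
```

For actually loading the files, torch.load(path, weights_only=True) restricts unpickling to an allowlist of tensor-related types, which should cover rng_state.pth; training_args.bin holds a full TrainingArguments object, so it can only be loaded from a source you trust (or after allowlisting the relevant classes).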