Latest commit: commit files to HF hub
7a7852f

Files:
- 1.38 kB, initial commit
- 753 Bytes, commit files to HF hub
- 1.12 kB, commit files to HF hub
- 360 kB, commit files to HF hub
- 467 MB, commit files to HF hub
- 5.07 MB, commit files to HF hub
- 279 Bytes, commit files to HF hub
- 480 Bytes, commit files to HF hub
vi_deberta_base_checkpoint_1.pt
Detected Pickle imports (27)
- "transformers.models.deberta_v2.modeling_deberta_v2.DebertaV2SelfOutput",
- "transformers.models.deberta_v2.modeling_deberta_v2.StableDropout",
- "collections.OrderedDict",
- "transformers.models.deberta_v2.modeling_deberta_v2.DebertaV2Intermediate",
- "torch._utils._rebuild_tensor_v2",
- "transformers.models.deberta_v2.configuration_deberta_v2.DebertaV2Config",
- "transformers.models.deberta_v2.modeling_deberta_v2.DebertaV2Model",
- "transformers.models.deberta_v2.modeling_deberta_v2.DebertaV2Attention",
- "transformers.models.deberta_v2.modeling_deberta_v2.DebertaV2Layer",
- "transformers.models.deberta_v2.modeling_deberta_v2.DebertaV2OnlyMLMHead",
- "torch._C._nn.gelu",
- "transformers.models.deberta_v2.modeling_deberta_v2.DebertaV2Output",
- "torch.LongStorage",
- "transformers.models.deberta_v2.modeling_deberta_v2.DebertaV2PredictionHeadTransform",
- "__builtin__.set",
- "torch.nn.modules.linear.Linear",
- "transformers.models.deberta_v2.modeling_deberta_v2.DebertaV2Encoder",
- "transformers.models.deberta_v2.modeling_deberta_v2.DisentangledSelfAttention",
- "torch.nn.modules.normalization.LayerNorm",
- "torch.FloatStorage",
- "transformers.models.deberta_v2.modeling_deberta_v2.DebertaV2Embeddings",
- "torch.nn.modules.container.ModuleList",
- "transformers.activations.GELUActivation",
- "torch.nn.modules.sparse.Embedding",
- "transformers.models.deberta_v2.modeling_deberta_v2.DebertaV2ForMaskedLM",
- "transformers.models.deberta_v2.modeling_deberta_v2.DebertaV2LMPredictionHead",
- "torch._utils._rebuild_parameter"
467 MB
commit files to HF hub
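The "Detected Pickle imports" list above comes from statically scanning the checkpoint's pickle opcode stream for `GLOBAL`/`STACK_GLOBAL` references, without executing the file. A minimal standard-library sketch of such a scanner (the helper name `detect_pickle_imports` is ours, and the Hub's actual scanner differs in detail):

```python
import pickle
import pickletools

def detect_pickle_imports(data: bytes) -> set[str]:
    """Collect the module.name globals a pickle stream references,
    mimicking the Hub's 'Detected Pickle imports' scan."""
    imports = set()
    ops = list(pickletools.genops(data))
    for i, (op, arg, pos) in enumerate(ops):
        if op.name == "GLOBAL":
            # Older protocols: arg is "module name" on one line.
            module, name = arg.split(" ", 1)
            imports.add(f"{module}.{name}")
        elif op.name == "STACK_GLOBAL":
            # Newer protocols push module and qualname as strings first;
            # walk back over MEMOIZE/FRAME opcodes to find them.
            strings = []
            j = i - 1
            while j >= 0 and len(strings) < 2:
                if isinstance(ops[j][1], str):
                    strings.append(ops[j][1])
                j -= 1
            name, module = strings[0], strings[1]
            imports.add(f"{module}.{name}")
    return imports

# A pickled OrderedDict references collections.OrderedDict, just like
# the checkpoint listings above.
from collections import OrderedDict
data = pickle.dumps(OrderedDict(a=1))
print(sorted(detect_pickle_imports(data)))  # ['collections.OrderedDict']
```

Scanning a real `vi_deberta_base_checkpoint_*.pt` the same way would surface the `torch` and `transformers.models.deberta_v2` entries shown above, since those checkpoints pickle the whole `DebertaV2ForMaskedLM` module rather than a plain state dict.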
vi_deberta_base_checkpoint_2.pt
Detected Pickle imports (27): the same 27 imports as vi_deberta_base_checkpoint_1.pt.
467 MB
commit files to HF hub
vi_deberta_base_checkpoint_3.pt
Detected Pickle imports (27): the same 27 imports as vi_deberta_base_checkpoint_1.pt.
467 MB
commit files to HF hub
vi_deberta_base_checkpoint_4.pt
Detected Pickle imports (27): the same 27 imports as vi_deberta_base_checkpoint_1.pt.
467 MB
commit files to HF hub
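Because these .pt files are full pickled modules, unpickling them must import and instantiate every class in the lists above, which is why the Hub flags them. One way to load such files more defensively is to restrict unpickling to an explicit allowlist of globals. A minimal standard-library sketch (the class name `AllowlistUnpickler` is ours; loading a real torch checkpoint would additionally need the torch/transformers entries from the listings on the allowlist, plus torch's own persistent-storage handling, so this only illustrates the allowlisting idea):

```python
import io
import pickle
from collections import OrderedDict

class AllowlistUnpickler(pickle.Unpickler):
    """Unpickler that refuses any global not explicitly allowed.

    For a real checkpoint, the allowlist would hold the torch and
    transformers classes from the 'Detected Pickle imports' lists.
    """
    ALLOWED = {
        ("collections", "OrderedDict"),
    }

    def find_class(self, module, name):
        if (module, name) not in self.ALLOWED:
            raise pickle.UnpicklingError(
                f"blocked pickle global: {module}.{name}")
        return super().find_class(module, name)

# An allowed global loads normally...
data = pickle.dumps(OrderedDict(a=1))
print(AllowlistUnpickler(io.BytesIO(data)).load())

# ...while anything off the list raises UnpicklingError instead of
# importing arbitrary code.
try:
    AllowlistUnpickler(io.BytesIO(pickle.dumps(len))).load()
except pickle.UnpicklingError as e:
    print(e)
```

Re-exporting the checkpoints as a plain state dict, or in the safetensors format, would avoid the pickle warning entirely, since neither stores executable class references.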