Repository file listing (latest commit e733852, "Update app.py"):

- (filename not captured) 1.52 kB, "initial commit"
- (filename not captured) 1.71 kB, "Upload 17 files"
- (filename not captured) 1.38 kB, "Upload 17 files"
- (filename not captured) 232 Bytes, "Upload 17 files"
- (filename not captured) 5.23 kB, "Update app.py"
- bp_model.pkl, 4.3 kB, "Upload 17 files"
  Detected Pickle imports (5): "BackPropogation.BackPropogation", "numpy.core.multiarray._reconstruct", "numpy.dtype", "numpy.ndarray", "numpy.core.multiarray.scalar"
- bp_tokeniser.pkl, 4.99 MB, "Upload 17 files"
  Detected Pickle imports (4): "collections.defaultdict", "builtins.int", "collections.OrderedDict", "keras.src.preprocessing.text.Tokenizer"
- (filename not captured) 392 MB, "Upload 17 files"
- (filename not captured) 80.9 kB, "Upload 17 files"
- dnn_tokeniser.pkl, 287 kB, "Upload 17 files"
  Detected Pickle imports (4): "collections.defaultdict", "builtins.int", "collections.OrderedDict", "keras.src.preprocessing.text.Tokenizer"
- (filename not captured) 41.2 MB, "Upload 17 files"
- lstm_tokeniser.pkl, 4.53 MB, "Upload 17 files"
  Detected Pickle imports (4): "collections.defaultdict", "builtins.int", "collections.OrderedDict", "keras.src.preprocessing.text.Tokenizer"
- ppn_model.pkl, 2.27 kB, "Upload 17 files"
  Detected Pickle imports (4): "numpy.ndarray", "numpy.core.multiarray._reconstruct", "numpy.dtype", "Perceptron.Perceptron"
- ppn_tokeniser.pkl, 4.85 MB, "Upload 17 files"
  Detected Pickle imports (4): "collections.defaultdict", "builtins.int", "collections.OrderedDict", "keras.src.preprocessing.text.Tokenizer"
- (filename not captured) 102 Bytes, "Upload 17 files"
- (filename not captured) 2.24 MB, "Upload 17 files"
- rnn_tokeniser.pkl, 287 kB, "Upload 17 files"
  Detected Pickle imports (4): "collections.defaultdict", "builtins.int", "collections.OrderedDict", "keras.src.preprocessing.text.Tokenizer"

Each "Detected Pickle imports" warning links to the Hub's "How to fix it?" guidance on pickle safety.
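The warnings above exist because unpickling is code execution: loading a `.pkl` file imports and calls whatever `module.attribute` references it contains, so the Hub statically lists those references before anyone loads the file. The same idea can be sketched with only the standard library by scanning the pickle's opcodes without ever unpickling it; `detect_pickle_imports` is a name chosen here for illustration, not the Hub's actual scanner.

```python
import pickle
import pickletools
from collections import OrderedDict

def detect_pickle_imports(data: bytes) -> set[str]:
    """List the module.attribute references a pickle stream would import,
    by scanning its opcodes statically (the stream is never unpickled)."""
    found = set()
    strings = []  # string opcode arguments seen so far, for STACK_GLOBAL
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name in ("GLOBAL", "INST"):
            # protocols <= 3: argument is "module attribute" in one string
            module, _, attr = arg.partition(" ")
            found.add(f"{module}.{attr}")
        elif opcode.name == "STACK_GLOBAL" and len(strings) >= 2:
            # protocols >= 4: module and attribute were pushed as the
            # two most recent string opcodes
            found.add(f"{strings[-2]}.{strings[-1]}")
        if isinstance(arg, str):
            strings.append(arg)
    return found

# Even a harmless pickle reveals what it would import on load:
payload = pickle.dumps(OrderedDict(a=1))
print(sorted(detect_pickle_imports(payload)))  # ['collections.OrderedDict']
```

For the files in this listing, the scan means that loading them requires (and trusts) custom modules such as `BackPropogation` and `Perceptron`, plus `keras.src.preprocessing.text.Tokenizer` for the tokenisers. A common remedy for the tokeniser files, suggested by the Hub's pickle guidance, is to ship data rather than objects, for example serialising the Keras tokenizer with `Tokenizer.to_json()` and restoring it with `tokenizer_from_json()` instead of pickling it.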