Dataset Viewer (auto-converted to Parquet)

The preview exposes four columns:

| Column | Type | Observed range |
|---|---|---|
| `repo_id` | string | lengths 15–89 |
| `file_path` | string | lengths 27–180 |
| `content` | string | lengths 1–2.23M |
| `__index_level_0__` | int64 | 0–0 |
Sample rows (file contents truncated by the viewer):

| repo_id | file_path | content (preview) | `__index_level_0__` |
|---|---|---|---|
| `hf_public_repos` | `hf_public_repos/tokenizers/LICENSE` | Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ ... | 0 |
| `hf_public_repos` | `hf_public_repos/tokenizers/CITATION.cff` | # This CITATION.cff file was generated with cffinit. # Visit https://bit.ly/cffinit to generate yours today! ... | 0 |
| `hf_public_repos` | `hf_public_repos/tokenizers/README.md` | <p align="center"> <br> <img src="https://huggingface.co/landing/assets/tokenizers/tokenizers-logo.png" width="600"/> ... | 0 |
| `hf_public_repos` | `hf_public_repos/tokenizers/RELEASE.md` | ## How to release # Before the release Simple checklist on how to make releases for `tokenizers`. ... | 0 |
| `hf_public_repos/tokenizers` | `hf_public_repos/tokenizers/tokenizers/Cargo.toml` | [package] authors = ["Anthony MOI <m.anthony.moi@gmail.com>", "Nicolas Patry <patry.nicolas@protonmail.com>"] edition = "2018" name = "tokenizers" version = "0.15.1-dev.0" ... | 0 |
| `hf_public_repos/tokenizers` | `hf_public_repos/tokenizers/tokenizers/rust-toolchain` | stable | 0 |
| `hf_public_repos/tokenizers` | `hf_public_repos/tokenizers/tokenizers/LICENSE` | Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ ... | 0 |
| `hf_public_repos/tokenizers` | `hf_public_repos/tokenizers/tokenizers/CHANGELOG.md` | # Changelog All notable changes to this project will be documented in this file. ... | 0 |
| `hf_public_repos/tokenizers` | `hf_public_repos/tokenizers/tokenizers/Makefile` | DATA_DIR = data BENCHMARK_DIR = benches TESTS_DIR = tests ... | 0 |
| `hf_public_repos/tokenizers` | `hf_public_repos/tokenizers/tokenizers/README.md` | <p align="center"> <br> <img src="https://huggingface.co/landing/assets/tokenizers/tokenizers-logo.png" width="600"/> ... | 0 |
| `hf_public_repos/tokenizers` | `hf_public_repos/tokenizers/tokenizers/README.tpl` | <p align="center"> <br> <img src="https://huggingface.co/landing/assets/tokenizers/tokenizers-logo.png" width="600"/> ... | 0 |
| `hf_public_repos/tokenizers/tokenizers` | `hf_public_repos/tokenizers/tokenizers/tests/added_tokens.rs` | mod common; use common::*; use tokenizers::tokenizer::AddedToken; #[test] fn add_tokens() { ... | 0 |
| `hf_public_repos/tokenizers/tokenizers` | `hf_public_repos/tokenizers/tokenizers/tests/unigram.rs` | #[cfg(not(debug_assertions))] use assert_approx_eq::assert_approx_eq; use std::collections::HashMap; ... | 0 |
| `hf_public_repos/tokenizers/tokenizers` | `hf_public_repos/tokenizers/tokenizers/tests/documentation.rs` | use tokenizers::models::bpe::{BpeTrainerBuilder, BPE}; use tokenizers::normalizers::{Sequence, Strip, NFC}; ... | 0 |
| `hf_public_repos/tokenizers/tokenizers` | `hf_public_repos/tokenizers/tokenizers/tests/serialization.rs` | mod common; use common::*; use tokenizers::decoders::byte_level::ByteLevel; ... | 0 |
| `hf_public_repos/tokenizers/tokenizers` | `hf_public_repos/tokenizers/tokenizers/tests/training.rs` | use tokenizers::models::bpe::BPE; use tokenizers::pre_tokenizers::whitespace::Whitespace; ... | 0 |
| `hf_public_repos/tokenizers/tokenizers` | `hf_public_repos/tokenizers/tokenizers/tests/from_pretrained.rs` | #![cfg(feature = "http")] use tokenizers::{FromPretrainedParameters, Result, Tokenizer}; ... | 0 |
| `hf_public_repos/tokenizers/tokenizers` | `hf_public_repos/tokenizers/tokenizers/tests/offsets.rs` | mod common; use common::*; use tokenizers::tokenizer::AddedToken; macro_rules! check_offsets { ... | 0 |
| `hf_public_repos/tokenizers/tokenizers/tests` | `hf_public_repos/tokenizers/tokenizers/tests/common/mod.rs` | use tokenizers::decoders::wordpiece::WordPiece as WordPieceDecoder; use tokenizers::models::bpe::BPE; ... | 0 |
| `hf_public_repos/tokenizers/tokenizers` | `hf_public_repos/tokenizers/tokenizers/examples/serialization.rs` | use tokenizers::models::wordpiece::WordPiece; use tokenizers::{AddedToken, Tokenizer}; fn main() { ... | 0 |
| `hf_public_repos/tokenizers/tokenizers/examples` | `hf_public_repos/tokenizers/tokenizers/examples/unstable_wasm/Cargo.toml` | [package] name = "unstable_wasm" version = "0.1.0" authors = ["Nicolas Patry"] edition = "2018" ... | 0 |
| `hf_public_repos/tokenizers/tokenizers/examples` | `hf_public_repos/tokenizers/tokenizers/examples/unstable_wasm/README.md` | <div align="center"> <h1><code>wasm-pack-template</code></h1> ... | 0 |
| `hf_public_repos/tokenizers/tokenizers/examples/unstable_wasm` | `hf_public_repos/tokenizers/tokenizers/examples/unstable_wasm/tests/web.rs` | //! Test suite for the Web and headless browsers. #![cfg(target_arch = "wasm32")] ... | 0 |
| `hf_public_repos/tokenizers/tokenizers/examples/unstable_wasm` | `hf_public_repos/tokenizers/tokenizers/examples/unstable_wasm/www/LICENSE-APACHE` | Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ ... | 0 |
| `hf_public_repos/tokenizers/tokenizers/examples/unstable_wasm` | `hf_public_repos/tokenizers/tokenizers/examples/unstable_wasm/www/index.html` | <!DOCTYPE html> <html> <head> <meta charset="utf-8"> <title>Hello wasm-pack!</title> ... | 0 |
| `hf_public_repos/tokenizers/tokenizers/examples/unstable_wasm` | `hf_public_repos/tokenizers/tokenizers/examples/unstable_wasm/www/bootstrap.js` | // A dependency graph that contains any wasm must all be imported // asynchronously. ... | 0 |
| `hf_public_repos/tokenizers/tokenizers/examples/unstable_wasm` | `hf_public_repos/tokenizers/tokenizers/examples/unstable_wasm/www/index.js` | import * as wasm from "unstable_wasm"; console.log(wasm.tokenize("ab")); console.log(wasm.tokenize("abc")); | 0 |
| `hf_public_repos/tokenizers/tokenizers/examples/unstable_wasm` | `hf_public_repos/tokenizers/tokenizers/examples/unstable_wasm/www/webpack.config.js` | const CopyWebpackPlugin = require("copy-webpack-plugin"); const path = require('path'); ... | 0 |
| `hf_public_repos/tokenizers/tokenizers/examples/unstable_wasm` | `hf_public_repos/tokenizers/tokenizers/examples/unstable_wasm/www/README.md` | <div align="center"> <h1><code>create-wasm-app</code></h1> ... | 0 |
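Each row pairs a repository prefix (`repo_id`) with a file path and that file's full text. The sketch below shows one way such rows might be inspected, using a few hard-coded samples copied from the preview (with the `content` column omitted for brevity). In practice the full dataset would be loaded from the Hub, e.g. with the `datasets` library; the Hub namespace is not shown on this page, so no repo id is assumed here.

```python
from collections import defaultdict

# A few rows shaped like the preview above, hard-coded for illustration.
# The real dataset would come from datasets.load_dataset("<namespace>/hf-stack-zyx"),
# where "<namespace>" is a placeholder for the actual Hub namespace.
rows = [
    {"repo_id": "hf_public_repos",
     "file_path": "hf_public_repos/tokenizers/LICENSE"},
    {"repo_id": "hf_public_repos/tokenizers",
     "file_path": "hf_public_repos/tokenizers/tokenizers/Cargo.toml"},
    {"repo_id": "hf_public_repos/tokenizers/tokenizers",
     "file_path": "hf_public_repos/tokenizers/tokenizers/tests/unigram.rs"},
]

# Group file paths by extension to gauge the language mix of the corpus.
by_ext = defaultdict(list)
for row in rows:
    name = row["file_path"].rsplit("/", 1)[-1]
    ext = name.rsplit(".", 1)[-1] if "." in name else "(none)"
    by_ext[ext].append(row["file_path"])

print(sorted(by_ext))  # → ['(none)', 'rs', 'toml']
```

The same grouping, applied to the full dataset, would let a consumer select only the files relevant to them (for example, only `.rs` sources for Rust-focused training data).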

# Dataset Card for "hf-stack-zyx"

More Information needed

Downloads last month: 5