llama_cpp_sys_4

Function llama_tokenize

pub unsafe extern "C" fn llama_tokenize(
    model: *const llama_model,
    text: *const c_char,
    text_len: i32,
    tokens: *mut llama_token,
    n_tokens_max: i32,
    add_special: bool,
    parse_special: bool,
) -> i32

Convert the provided text into tokens.

Parameters:
- tokens: the buffer pointed to must be large enough to hold the resulting tokens.
- add_special: allow adding BOS and EOS tokens if the model is configured to do so.
- parse_special: allow tokenizing special and/or control tokens, which are otherwise not exposed and treated as plain text. Does not insert a leading space.

Returns:
- On success, the number of tokens written, at most n_tokens_max.
- On failure, a negative number whose absolute value is the number of tokens that would have been returned.
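The negative-return convention suggests a two-pass pattern: call once with a guessed buffer, and if the result is negative, resize to the indicated count and call again. Below is a minimal sketch of such a wrapper, assuming the crate re-exports llama_model, llama_token (an i32 alias), and llama_tokenize at its root, and that model is a valid pointer to a loaded model obtained elsewhere; it is an illustration, not part of the crate's API.

use std::ffi::CString;

use llama_cpp_sys_4::{llama_model, llama_token, llama_tokenize};

/// Sketch of a safe-ish wrapper around the raw binding (assumptions noted above).
unsafe fn tokenize(
    model: *const llama_model,
    text: &str,
    add_special: bool,
    parse_special: bool,
) -> Vec<llama_token> {
    let c_text = CString::new(text).expect("text must not contain interior NUL bytes");
    // First pass: guess a capacity; a negative return reports the required size.
    let mut tokens: Vec<llama_token> = vec![0; text.len() + 8];
    let mut n = llama_tokenize(
        model,
        c_text.as_ptr(),
        text.len() as i32,
        tokens.as_mut_ptr(),
        tokens.len() as i32,
        add_special,
        parse_special,
    );
    if n < 0 {
        // Buffer too small: -n is the number of tokens that would have been returned.
        tokens.resize((-n) as usize, 0);
        n = llama_tokenize(
            model,
            c_text.as_ptr(),
            text.len() as i32,
            tokens.as_mut_ptr(),
            tokens.len() as i32,
            add_special,
            parse_special,
        );
    }
    tokens.truncate(n as usize);
    tokens
}

The initial capacity of text.len() + 8 is only a heuristic; any starting size works because the retry path corrects it.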