Module makepad_live_tokenizer::tokenizer
This module contains code for tokenizing Rust code.
The tokenizer in this module supports lazy tokenization. That is, it has an explicit state, which can be recorded at the start of each line. Running the tokenizer with the same starting state on the same line will always result in the same sequence of tokens. This means that if neither the contents nor the starting state of the tokenizer changed for a given line, that line does not need to be retokenized.
The tokenizer consumes one token at a time. The only exceptions to this are multiline tokens, such as comments and strings, which are broken up into separate tokens, one per line. Consequently, the only time the tokenizer can end up in a state other than the initial state is when it reaches the end of a line while still in the middle of tokenizing a multiline token.
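To make the lazy behavior concrete, here is a minimal sketch of how an editor might cache the starting state of each line and retokenize only the lines whose contents or starting state changed. The names `LineState`, `Token`, `tokenize_line`, and `LineCache` are illustrative stand-ins, not this crate's actual API.

```rust
/// A minimal stand-in for the tokenizer's per-line state; the real state
/// tracks which multiline token (if any) the tokenizer is inside.
#[derive(Clone, Copy, PartialEq, Eq, Debug, Default)]
enum LineState {
    #[default]
    Initial,
    InBlockComment,
    InDoubleQuotedString,
}

/// A token produced for a single line (kind and length in chars).
#[derive(Clone, PartialEq, Debug)]
struct Token {
    kind: &'static str,
    len: usize,
}

/// Tokenizes one line starting from `state`; a pure function of its inputs,
/// so identical (state, line) pairs always yield identical output.
fn tokenize_line(state: LineState, line: &str) -> (Vec<Token>, LineState) {
    let _ = line; // real lexing elided in this sketch
    (Vec::new(), state)
}

/// Per-line cache: the state each line starts in, plus its tokens.
/// Both vectors are assumed to stay the same length as the document.
struct LineCache {
    start_states: Vec<LineState>,
    tokens: Vec<Vec<Token>>,
}

impl LineCache {
    /// After an edit, retokenize from `first_changed_line` onward and stop as
    /// soon as a later line would start in the same state it already has:
    /// neither its contents nor its starting state changed, so its cached
    /// tokens are still valid.
    fn relex_from(&mut self, lines: &[&str], first_changed_line: usize) {
        let mut state = self.start_states[first_changed_line];
        for (i, line) in lines.iter().enumerate().skip(first_changed_line) {
            if i != first_changed_line && self.start_states[i] == state {
                return; // the rest of the document is unaffected
            }
            self.start_states[i] = state;
            let (line_tokens, next_state) = tokenize_line(state, line);
            self.tokens[i] = line_tokens;
            state = next_state;
        }
    }
}

fn main() {
    let lines = ["fn main() {", "    /* a comment", "       spanning lines */", "}"];
    let mut cache = LineCache {
        start_states: vec![LineState::default(); lines.len()],
        tokens: vec![Vec::new(); lines.len()],
    };
    cache.relex_from(&lines, 0); // initial full pass over the document
    cache.relex_from(&lines, 2); // a later pass revisits only lines 2.. and stops early
}
```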
Structs
- A cursor over a slice of chars.
- The state of the tokenizer when it is in the middle of a double quoted string.
- The state of the tokenizer when it is not in the middle of any token.
- The state of the tokenizer when it is in the middle of a raw double quoted string.
Enums
- The state of the tokenizer.
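For orientation, here is a hedged sketch of how the items listed above plausibly fit together: an enum for the overall tokenizer state whose variants wrap the per-situation state structs. The identifiers below are illustrative rather than the crate's exact names; consult the item pages for the real definitions.

```rust
/// State of the tokenizer when it is not in the middle of any token.
#[derive(Clone, Copy, Debug, PartialEq)]
pub struct InitialState;

/// State while inside a double quoted string that continues past the line end.
#[derive(Clone, Copy, Debug, PartialEq)]
pub struct InDoubleQuotedStringState;

/// State while inside a raw double quoted string; remembering how many `#`
/// characters opened it lets the tokenizer recognize the matching closing
/// delimiter on a later line.
#[derive(Clone, Copy, Debug, PartialEq)]
pub struct InRawDoubleQuotedStringState {
    pub start_hash_count: usize,
}

/// The state of the tokenizer, recorded at the start of each line.
#[derive(Clone, Copy, Debug, PartialEq)]
pub enum State {
    Initial(InitialState),
    InDoubleQuotedString(InDoubleQuotedStringState),
    InRawDoubleQuotedString(InRawDoubleQuotedStringState),
}
```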