Maximum Munch Principle
"maximal munch" or "longest match" is the principle that when creating some construct, as much of the available input as possible should be consumed.
Every compiler has a tokenizer (or lexer), a component that splits a source file into distinct tokens (keywords, operators, identifiers, etc.). One of the tokenizer's rules is maximal munch: the tokenizer should keep reading characters from the source file until adding one more character would cause the current token to stop making sense.
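To make the rule concrete, here is a minimal sketch of maximal munch applied to operators only, using a small hypothetical operator set. It is not any particular compiler's lexer, just an illustration of the "keep extending while the token still makes sense" step; a real tokenizer would also handle identifiers, numbers, strings, comments, and so on.

    # A minimal sketch of maximal munch for operators.
    # OPERATORS is a hypothetical set chosen for illustration.
    OPERATORS = {"+", "++", "+=", "-", "--", "->", "=", "=="}

    def tokenize_operators(source: str) -> list[str]:
        tokens = []
        i = 0
        while i < len(source):
            if source[i].isspace():
                i += 1
                continue
            # Keep extending the candidate token while the longer string
            # is still a valid operator (the "maximal munch" step).
            longest = None
            j = i
            while j < len(source) and source[i:j + 1] in OPERATORS:
                longest = source[i:j + 1]
                j += 1
            if longest is None:
                raise ValueError(f"unexpected character {source[i]!r} at {i}")
            tokens.append(longest)
            i += len(longest)
        return tokens

    # "+++" becomes "++" then "+": the tokenizer grabs the longest
    # operator it can before starting a new token.
    print(tokenize_operators("+++"))  # ['++', '+']
    print(tokenize_operators("+="))   # ['+=']
    print(tokenize_operators("+ =")) # ['+', '=']

This greedy behavior is why, in C, the input x+++y is tokenized as x ++ + y rather than x + ++ y: the lexer munches "++" before it considers a shorter alternative.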