Fix bug in lexer_next_number not correctly tracking character number

When a number had a suffix, the lexer did not record the number of
characters consumed for that suffix. This left the lexer's character
position two to three characters short in its line location reporting
until it encountered a newline. It did not otherwise corrupt the state
of the lexer.
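
To make the drift concrete, here is a minimal, self-contained sketch of the
column bookkeeping. The struct, helper and input below are illustrative
assumptions, not the project's API; only the idea of a character_number
counter and a separately measured suffix_length mirrors the diff below.

/* sketch.c - illustrates the column drift fixed by this commit.
 * All names here are hypothetical stand-ins for the real lexer. */
#include <ctype.h>
#include <stddef.h>
#include <stdio.h>

typedef struct {
    size_t line_number;      /* reset on every newline */
    size_t character_number; /* column within the current line */
} lexer_pos_t;

/* Scan a number such as "42u8" and advance the column counter.
 * count_suffix = 0 mimics the behaviour before the fix,
 * count_suffix = 1 mimics the behaviour after it. */
static const char *scan_number(const char *p, lexer_pos_t *pos, int count_suffix) {
    size_t n = 0;
    while (isdigit((unsigned char)p[n]))
        n++;
    pos->character_number += n; /* digit characters were always counted */

    size_t suffix_length = 0;
    while (isalnum((unsigned char)p[n + suffix_length]))
        suffix_length++;
    if (count_suffix)
        pos->character_number += suffix_length; /* the fix: count "u8" too */

    return p + n + suffix_length; /* input is consumed in both cases */
}

int main(void) {
    const char *src = "42u8 x";
    lexer_pos_t before = {1, 0};
    lexer_pos_t after = {1, 0};

    scan_number(src, &before, 0);
    scan_number(src, &after, 1);

    printf("column without suffix accounting: %zu\n", before.character_number); /* 2 */
    printf("column with suffix accounting:    %zu\n", after.character_number);  /* 4 */
    return 0;
}

Scanning "42u8" advances the column by 2 without the suffix accounting and
by 4 with it, so without the fix every later token on the same line is
reported two columns too early until a newline resets the counter.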
omicron 2025-04-04 19:45:34 +02:00
parent 27099c9899
commit f1f4c93a8e

@@ -299,6 +299,7 @@ error_t *lexer_next_number(lexer_t *lex, lexer_token_t *token) {
     token->explanation =
         "Number length exceeds the maximum of 128 characters";
   }
+  lex->character_number += n;
   so_far += n;
   if (n == 0) {
     token->id = TOKEN_ERROR;
@@ -328,10 +329,11 @@ error_t *lexer_next_number(lexer_t *lex, lexer_token_t *token) {
       token->id = TOKEN_ERROR;
       token->explanation =
           "Number length exceeds the maximum of 128 characters";
     } else {
+      lex->character_number += suffix_length;
     }
   }
   lex->character_number += n;
   token->value = strdup(buffer);
   return nullptr;
 }
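
The first hunk advances character_number as soon as the digit characters have
been consumed; the second adds the suffix length as well, but only when the
suffix does not push the number past the 128-character limit, since an
over-long number becomes a TOKEN_ERROR instead.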