Currently, when an input level is exhausted, TeX removes it as soon as its last
token has been read. For example, the following creates an infinite loop, but
it does not exhaust the input stack, because each finished level is removed
promptly:
```
\def\a{\b1}
\def\b#1{\a}
\a
```
Not so when `tex.cprint()` or `tex.sprint()` is involved: input levels created
by them linger on the input stack after they are exhausted. The equivalent
loop below therefore fails with a "TeX capacity exceeded" error:
```
\def\a{\expandafter\b\directlua{tex.cprint(12,"1")}}
\def\b#1{\a}
\a
```
This is a problem in practice because LaTeX's `\char_generate:nn` macro
(sometimes) uses `tex.cprint()` under the hood, which leads to issues such as
https://github.com/latex3/latex3/issues/1540 .
Do you think it would be reasonable to modify the engine to eagerly pop input
levels (i.e. call `end_token_list()`) of type `cprint` or `sprint`? I think
this is safe because, unlike traditional reading from a file with e.g.
`\input`, a line created by `tex.sprint()` or `tex.cprint()` contains no
newline character, so there is no good reason to keep the empty tail of the
line around.
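To make the failure mode concrete, here is a small Python toy model (an illustration only, not engine code) of the input stack during the `\a`/`\b` loop: each expansion step consumes the single token of the current level and pushes a new one-token level. Popping the exhausted level eagerly keeps the depth constant, while leaving it behind grows the stack by one level per iteration until the capacity is hit.

```python
def simulate(eager_pop, capacity=50, iterations=200):
    # Toy model of the input stack in the \a/\b loop (illustration,
    # not LuaTeX source). Each level holds the single token produced
    # by one expansion step.
    stack = [["tok"]]
    max_depth = 1
    for _ in range(iterations):
        stack[-1].pop()             # consume the level's only token
        if eager_pop and not stack[-1]:
            stack.pop()             # proposed: drop the exhausted level now
        stack.append(["tok"])       # the next expansion pushes a new level
        max_depth = max(max_depth, len(stack))
        if max_depth > capacity:
            return "capacity exceeded"
    return f"ok, peak depth {max_depth}"

print(simulate(eager_pop=True))     # loops forever but stays at depth 1
print(simulate(eager_pop=False))    # depth grows until the limit is hit
```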
This may prove to be a nontrivial change, however, because there are 7 calls
to `end_token_list()` scattered throughout the code. (Although I think only
the one call in `inputstack.c` and the two calls in `expand.c` matter.)
There might be a performance implication as well, because of the additional
check for whether the token list has ended --- although this may be less of a
concern than it appears, because we can defer the check until the input limit
is about to be exceeded.
What do you think?