
Commit 9781137

committed Dec 9, 2021

README: Example of Python source transformation on token stream level.

Signed-off-by: Paul Sokolovsky <[email protected]>

1 parent 1b8de37 commit 9781137

File tree

1 file changed, +40 -0 lines changed
README.rst

+40
@@ -343,6 +343,46 @@ run it as module, using ``-m`` switch::

    python3 -m imphook -i mod_funkw_naive -m example_funkw

And we get::

    imphook's lambdaality is cool!

Oops! The word "lambdaality" is definitely cool, but that's not what
we expected! It happens because the code just blindly replaces
occurrences everywhere, including within string literals. We could
try to work around that by using a regular-expression replace matching
whole words; that would help with the case above, but it would still
replace a lone "function" inside strings, as the short sketch below
illustrates.
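
(A throwaway sketch for illustration only, not part of the example
modules; the ``src`` string here is made up just for the demonstration)::

    import re

    src = 'function f(): return "functionality vs. a lone function"'
    # \b word boundaries leave "functionality" alone, but the standalone
    # "function" inside the string literal is still rewritten.
    print(re.sub(r"\bfunction\b", "lambda", src))
    # -> lambda f(): return "functionality vs. a lone lambda"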

That makes us conclude: transforming the surface representation of a
program (i.e. a sequence of characters) is never an adequate method. We
should operate on a more suitable program representation, and the
baseline such representation is a sequence (or stream) of tokens.
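
To see why tokens make the distinction trivial, here is what the
standard ``tokenize`` module reports for a line that contains both the
name ``function`` and a string mentioning it (another throwaway
snippet, purely for illustration)::

    import io
    import tokenize

    code = 'function("function")\n'
    for t in tokenize.generate_tokens(io.StringIO(code).readline):
        print(tokenize.tok_name[t.type], repr(t.string))
    # NAME 'function'        <- the identifier we want to rewrite
    # OP '('
    # STRING '"function"'    <- a string literal, easy to leave alone
    # OP ')'
    # NEWLINE '\n'
    # ENDMARKER ''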

Let's use it:

mod_funkw.py::

    import tokenize
    import imphook

    def hook(filename):

        def xform(token_stream):
            # Work on the token stream: rename the NAME token "function"
            # to "lambda", keeping the rest of the token info intact.
            for t in token_stream:
                if t[0] == tokenize.NAME and t[1] == "function":
                    yield (tokenize.NAME, "lambda") + t[2:]
                else:
                    yield t

        with open(filename, "rb") as f:
            # Frankly speaking, tokenizing just to convert back to string form
            # isn't too efficient, but so far CPython doesn't offer us a way
            # to parse a token stream directly, so we have no choice.
            source = tokenize.untokenize(xform(tokenize.tokenize(f.readline)))
        # Create an empty module object and execute the transformed source
        # in its namespace.
        mod = type(imphook)("")
        exec(source, vars(mod))
        return mod

    imphook.add_import_hook(hook, (".py",))
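
With the token-level hook in place, we can rerun the example, pointing
``-i`` at the new hook module (following the same command pattern as
above)::

    python3 -m imphook -i mod_funkw -m example_funkw

This time the string literal travels through untouched, so the expected
output is::

    imphook's functionality is cool!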

Credits and licensing
---------------------