tokenize in Python 3.x

I have the following code in Python 2.x:

from StringIO import StringIO
from tokenize import tokenize

class _CHAIN(object):

    def __init__(self, execution_context=None):
        self.execution_context = execution_context

    def eat(self, toktype, tokval, rowcol, line, logical_line):
        # some code and error checking
        pass


operations = _CHAIN(execution_context)

tokenize(StringIO(somevalue).readline, operations.eat)

The problem is that in Python 3.x the second argument no longer exists, yet I still need operations.eat() to be called for every token. How can I perform the above task in Python 3.x? One idea is to call operations.eat() directly before the tokenize statement (the last line of the code), but I am not sure which arguments to pass. I'm sure there must be better ways to do it.

Answer

According to http://docs.python.org/py3k/library/tokenize.html, you should now use tokenize.tokenize(readline). Note that in Python 3 it expects readline to return bytes; for a readline that returns str, as with the io.StringIO below, use tokenize.generate_tokens(readline). Either way the callback argument is gone: you get back an iterator of 5-tuples and call eat yourself:

import tokenize
import io

class _CHAIN(object):

    def __init__(self, execution_context=None):
        self.execution_context = execution_context

    def eat(self, toktype, tokval, rowcol, line, logical_line):
        # some code and error checking
        print(toktype, tokval, rowcol, line, logical_line)


operations = _CHAIN(None)

readline = io.StringIO('aaaa').readline

# Python 2 way:
# tokenize.tokenize(readline, operations.eat)

# Python 3 way: generate_tokens() yields 5-tuples
# (type, string, start, end, line) instead of invoking a callback.
for token in tokenize.generate_tokens(readline):
    operations.eat(token[0], token[1], token[2], token[3], token[4])
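
If you want to call tokenize.tokenize() itself instead, the same loop works as long as readline returns bytes. A minimal sketch of that variant, assuming the same _CHAIN class and the same 'aaaa' input (the byte_readline name is just illustrative):

# tokenize.tokenize() requires a readline that returns bytes,
# so wrap the input in io.BytesIO instead of io.StringIO.
byte_readline = io.BytesIO('aaaa'.encode('utf-8')).readline

for token in tokenize.tokenize(byte_readline):
    # Each token is a TokenInfo named tuple:
    # (type, string, start, end, line).
    operations.eat(token.type, token.string, token.start, token.end, token.line)

Note that tokenize.tokenize() emits an extra ENCODING token first, which eat() will also receive.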