bitbake: codeparser.py: support deeply nested tokens

For shell constructs like
   echo hello & wait $!
the process_tokens() method ran into a situation where "token" in the
"name, value = token" assignment was a list of tuples rather than the
expected (name, value) tuple, causing the unpacking to fail.
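
A rough standalone illustration of the failure mode (the token shape
below is made up for illustration, not the exact pyshyacc output):

   # "token" arrives as a list of (name, value) tuples instead of a
   # single (name, value) tuple, so tuple unpacking raises ValueError.
   token = [("simple_command", "wait $!")]
   try:
       name, value = token
   except ValueError as exc:
       print(exc)   # not enough values to unpack (expected 2, got 1)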

There were already two for loops (one in _parse_shell(), one in
process_tokens()) iterating over token lists, but apparently the actual
nesting can be deeper than that.

Now there is just one such loop in process_token_list() which calls
itself recursively when it detects that a list entry is another list.
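
A minimal standalone sketch of that recursive pattern, using toy tokens
rather than a real pyshyacc parse tree; the handle callback here merely
stands in for the token_handlers dispatch in the real code:

   def process_token_list(tokens, handle):
       # Descend into nested lists until plain (name, value) tuples remain.
       for token in tokens:
           if isinstance(token, list):
               process_token_list(token, handle)
               continue
           name, value = token
           handle(name, value)

   toy = [("simple_command", "echo hello"), [("simple_command", "wait $!")]]
   process_token_list(toy, lambda name, value: print(name, value))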

As a side effect (improvement?!) of the loop removal in
_parse_shell(), the local function definitions in process_tokens() get
executed less often.

Fixes: [YOCTO #10668]

(Bitbake rev: d18a74de9ac75ba32f84c40620ca9d47c1ef96a3)

Signed-off-by: Patrick Ohly <patrick.ohly@intel.com>
Signed-off-by: Richard Purdie <richard.purdie@linuxfoundation.org>

@@ -342,8 +342,7 @@ class ShellParser():
         except pyshlex.NeedMore:
             raise sherrors.ShellSyntaxError("Unexpected EOF")

-        for token in tokens:
-            self.process_tokens(token)
+        self.process_tokens(tokens)

     def process_tokens(self, tokens):
         """Process a supplied portion of the syntax tree as returned by
@@ -389,18 +388,24 @@ class ShellParser():
             "case_clause": case_clause,
         }

-        for token in tokens:
-            name, value = token
-            try:
-                more_tokens, words = token_handlers[name](value)
-            except KeyError:
-                raise NotImplementedError("Unsupported token type " + name)
+        def process_token_list(tokens):
+            for token in tokens:
+                if isinstance(token, list):
+                    process_token_list(token)
+                    continue
+                name, value = token
+                try:
+                    more_tokens, words = token_handlers[name](value)
+                except KeyError:
+                    raise NotImplementedError("Unsupported token type " + name)

-            if more_tokens:
-                self.process_tokens(more_tokens)
+                if more_tokens:
+                    self.process_tokens(more_tokens)

-            if words:
-                self.process_words(words)
+                if words:
+                    self.process_words(words)
+
+        process_token_list(tokens)

     def process_words(self, words):
         """Process a set of 'words' in pyshyacc parlance, which includes