From /usr/lib/python3.5/ast.py: lines 38-46
def literal_eval(node_or_string):
    """
    Safely evaluate an expression node or a string containing a Python
    expression.  The string or node provided may only consist of the following
    Python literal structures: strings, bytes, numbers, tuples, lists, dicts,
    sets, booleans, and None.
    """
    if isinstance(node_or_string, str):
        node_or_string = parse(node_or_string, mode='eval')
From /usr/lib/python3.5/ast.py: lines 47-48
    if isinstance(node_or_string, Expression):
        node_or_string = node_or_string.body
From /usr/lib/python3.5/ast.py: lines 49-84
    def _convert(node):
        if isinstance(node, (Str, Bytes)):
            return node.s
        elif isinstance(node, Num):
            return node.n
        elif isinstance(node, Tuple):
            return tuple(map(_convert, node.elts))
        elif isinstance(node, List):
            return list(map(_convert, node.elts))
        elif isinstance(node, Set):
            return set(map(_convert, node.elts))
        elif isinstance(node, Dict):
            return dict((_convert(k), _convert(v)) for k, v
                        in zip(node.keys, node.values))
        elif isinstance(node, NameConstant):
            return node.value
        elif isinstance(node, UnaryOp) and \
             isinstance(node.op, (UAdd, USub)) and \
             isinstance(node.operand, (Num, UnaryOp, BinOp)):
            operand = _convert(node.operand)
            if isinstance(node.op, UAdd):
                return +operand
            else:
                return -operand
        elif isinstance(node, BinOp) and \
             isinstance(node.op, (Add, Sub)) and \
             isinstance(node.right, (Num, UnaryOp, BinOp)) and \
             isinstance(node.left, (Num, UnaryOp, BinOp)):
            left = _convert(node.left)
            right = _convert(node.right)
            if isinstance(node.op, Add):
                return left + right
            else:
                return left - right
        raise ValueError('malformed node or string: ' + repr(node))
    return _convert(node_or_string)
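Reading those branches, the 3.5 version accepts plain literals plus unary +/- and binary +/- over numbers (the latter mostly so that complex literals such as 1+2j round-trip), and rejects anything else. A few illustrative calls; the object address in the error message will of course differ:
>>> import ast
>>> ast.literal_eval("1+2j")
(1+2j)
>>> ast.literal_eval("-5")
-5
>>> ast.literal_eval("('a', 1)")
('a', 1)
>>> ast.literal_eval("2*3")    # Mult has no branch in _convert
Traceback (most recent call last):
  ...
ValueError: malformed node or string: <_ast.BinOp object at 0x...>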
E.g.:
import ast
code_nocall = "1+1"
node = ast.parse(code_nocall, mode='eval')
body = node.body
print(type(body)) # Prints <class '_ast.BinOp'>
code_call = "print('hello')"
node = ast.parse(code_call, mode='eval')
body = node.body
print(type(body)) # Prints <class '_ast.Call'>
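The Call node is exactly what _convert above refuses, so literal_eval can handle the first string but not the second (error text abbreviated, address will differ):
>>> ast.literal_eval(code_nocall)
2
>>> ast.literal_eval(code_call)
Traceback (most recent call last):
  ...
ValueError: malformed node or string: <_ast.Call object at 0x...>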
The best solution I have found so far, rather than calling eval() on the raw string, is to perform the parse/compile/eval steps manually with this function:
import ast

def eval_code(code):
    parsed = ast.parse(code, mode='eval')
    fixed = ast.fix_missing_locations(parsed)
    compiled = compile(fixed, '<string>', 'eval')
    return eval(compiled)
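For example, with eval_code defined as above, both of the earlier strings can be evaluated, at the cost of reintroducing eval() (see the warning below):
>>> eval_code("1+1")
2
>>> eval_code("print('hello')")
hello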
It is widely known that using eval() is a potential security risk, so the use of ast.literal_eval(node_or_string) is promoted instead. Why does the expression below run on Python 3 and not Python 2, and how can it be fixed in Python 2.7 without falling back to the risky eval()?
In Python 2.7 it raises ValueError: malformed string when running this example:
>>> ast.literal_eval("4 + 9")
Whereas in Python 3.3 the same expression works as expected:
>>> ast.literal_eval('4+9')
13
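A quick way to see why, if you only want to inspect the parsed tree: "4 + 9" becomes a BinOp node, and only Python 3's _convert (shown above) has an Add/Sub branch for it; Python 2.7's version does not, hence the ValueError there.
import ast
# the exact dump format varies between Python versions (Num vs. Constant, etc.)
print(ast.dump(ast.parse("4 + 9", mode="eval").body))
# e.g. BinOp(left=Num(n=4), op=Add(), right=Num(n=9))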
An updated version of the answer from @poke that also allows negative numbers and other unary operators on Python 3.x, so for example "-3" evaluates to -3 rather than raising an error.
import ast, operator

def arithmetic_eval(s):
    binOps = {
        ast.Add: operator.add,
        ast.Sub: operator.sub,
        ast.Mult: operator.mul,
        ast.Div: operator.truediv,
        ast.Mod: operator.mod
    }
    unOps = {
        ast.USub: operator.neg
    }

    node = ast.parse(s, mode='eval')

    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        elif isinstance(node, ast.Str):
            return node.s
        elif isinstance(node, ast.Num):
            return node.n
        elif isinstance(node, ast.BinOp):
            return binOps[type(node.op)](_eval(node.left), _eval(node.right))
        elif isinstance(node, ast.UnaryOp):
            return unOps[type(node.op)](_eval(node.operand))
        else:
            raise Exception('Unsupported type {}'.format(node))

    return _eval(node.body)
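A quick check of the unary and binary handling, assuming the function above (ast.parse already resolves operator precedence, so nothing extra is needed for it):
>>> arithmetic_eval('-3')
-3
>>> arithmetic_eval('4 + 9 * 2')
22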
Initializers define the way to set the initial random weights of Keras layers. The keyword arguments used for passing initializers to layers depend on the layer; usually they are simply kernel_initializer and bias_initializer. If you need to configure your initializer via various arguments (e.g. the stddev argument of RandomNormal), you should implement it as a subclass of tf.keras.initializers.Initializer. You can also pass a custom callable as initializer; it must take the arguments shape (shape of the variable to initialize) and dtype (dtype of generated values), as sketched after the examples below:
from tensorflow.keras import layers
from tensorflow.keras import initializers
layer = layers.Dense(
units = 64,
kernel_initializer = initializers.RandomNormal(stddev = 0.01),
bias_initializer = initializers.Zeros()
)
layer = layers.Dense(
units = 64,
kernel_initializer = 'random_normal',
bias_initializer = 'zeros'
)
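The custom-callable option mentioned above could look roughly like this; a minimal sketch, and my_normal_init is a made-up name, not part of the Keras API:
import tensorflow as tf

def my_normal_init(shape, dtype=None):
    # draw the initial weights from a normal distribution with stddev 0.01
    return tf.random.normal(shape, stddev=0.01, dtype=dtype)

layer = tf.keras.layers.Dense(64, kernel_initializer=my_normal_init)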
tf.keras.initializers.RandomNormal(mean = 0.0, stddev = 0.05, seed = None)
>>> # Standalone usage:
>>> initializer = tf.keras.initializers.RandomNormal(mean=0., stddev=1.)
>>> values = initializer(shape=(2, 2))
>>> # Usage in a Keras layer:
>>> initializer = tf.keras.initializers.RandomNormal(mean=0., stddev=1.)
>>> layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)
tf.keras.initializers.RandomUniform(minval = -0.05, maxval = 0.05, seed = None)
ValueError: malformed node or string with ast.literal_eval() when adding a Keras layer
A related answer for JSON data in Node.js: you need to JSON.parse() the resulting string from fs.readFile(); see the example further below.
console.log(data.data.quotes.USD.price);
console.log(typeof response) // result is string
var get = function(model, path, def) {
path = path || '';
model = model || {};
def = typeof def === 'undefined' ? '' : def;
var parts = path.split('.');
if (parts.length > 1 && typeof model[parts[0]] === 'object') {
return get(model[parts[0]], parts.splice(1).join('.'), def);
} else {
return model[parts[0]] || def;
}
}
data.msg.name[0]
{{ facture[0].id | json }}
fs.readFile('/tmp/foo.json', {
encoding: 'utf8'
}, function(err, data) {
if (err) throw err;
try {
data = JSON.parse(data);
} catch (ex) {
console.log('Error parsing json');
return;
}
console.log(data.server.module.InPluginPath);
});
The same ValueError: malformed node or string shows up in other contexts too, for example when loading the canonical pretrained weights for fairseq's bart.base (ValueError: malformed node or string: <ast.Name object at 0x7f2c0f467eb0>) or when literal_eval hits a row of data it does not accept (ValueError: malformed node or string: <_ast.Name object at 0x000001F62A851FC8>). The original answer from @poke, written for Python 2, evaluates the arithmetic manually:
import ast, operator

binOps = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.div,
    ast.Mod: operator.mod
}

def arithmeticEval(s):
    node = ast.parse(s, mode='eval')

    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        elif isinstance(node, ast.Str):
            return node.s
        elif isinstance(node, ast.Num):
            return node.n
        elif isinstance(node, ast.BinOp):
            return binOps[type(node.op)](_eval(node.left), _eval(node.right))
        else:
            raise Exception('Unsupported type {}'.format(node))

    return _eval(node.body)
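Usage on the expression from the question (note that operator.div exists only on Python 2; on Python 3 you would swap in operator.truediv, as the updated version above already does):
>>> arithmeticEval('4 + 9')
13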
A related question hits the same wall when reading back tuples of Decimal values from a file:
raw_data = userfile.read().split('\n')
for a in raw_data:
    print a
    btc_history.append(ast.literal_eval(a))
(Decimal('11.66985'), Decimal('0E-8'))
Traceback (most recent call last):
  File "./goxnotify.py", line 74, in <module>
    main()
  File "./goxnotify.py", line 68, in main
    local.load_user_file(username, btc_history)
  File "/home/unix-dude/Code/GoxNotify/local_functions.py", line 53, in load_user_file
    btc_history.append(ast.literal_eval(a))
  File "/usr/lib/python2.7/ast.py", line 80, in literal_eval
    return _convert(node_or_string)
  File "/usr/lib/python2.7/ast.py", line 58, in _convert
    return tuple(map(_convert, node.elts))
  File "/usr/lib/python2.7/ast.py", line 79, in _convert
    raise ValueError('malformed string')
ValueError: malformed string
One answer whitelists the allowed node types and names before compiling and evaluating the tree, so that only Decimal (and plain literals) can ever be called:
import ast
import decimal

source = "(Decimal('11.66985'), Decimal('1e-8')," \
         "(1,), (1,2,3), 1.2, [1,2,3], {1:2})"

tree = ast.parse(source, mode='eval')

# using the NodeTransformer, you can also modify the nodes in the tree,
# however in this example NodeVisitor could do as we are raising exceptions
# only.
class Transformer(ast.NodeTransformer):
    ALLOWED_NAMES = set(['Decimal', 'None', 'False', 'True'])
    ALLOWED_NODE_TYPES = set([
        'Expression',  # a top node for an expression
        'Tuple',       # makes a tuple
        'Call',        # a function call (hint, Decimal())
        'Name',        # an identifier...
        'Load',        # loads a value of a variable with given identifier
        'Str',         # a string literal
        'Num',         # allow numbers too
        'List',        # and list literals
        'Dict',        # and dicts...
    ])

    def visit_Name(self, node):
        if not node.id in self.ALLOWED_NAMES:
            raise RuntimeError("Name access to %s is not allowed" % node.id)

        # traverse to child nodes
        return self.generic_visit(node)

    def generic_visit(self, node):
        nodetype = type(node).__name__
        if nodetype not in self.ALLOWED_NODE_TYPES:
            raise RuntimeError("Invalid expression: %s not allowed" % nodetype)

        return ast.NodeTransformer.generic_visit(self, node)

transformer = Transformer()

# raises RuntimeError on invalid code
transformer.visit(tree)

# compile the ast into a code object
clause = compile(tree, '<AST>', 'eval')

# make the globals contain only the Decimal class,
# and eval the compiled object
result = eval(clause, dict(Decimal=decimal.Decimal))

print(result)
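To see the whitelist doing its job, here is a hypothetical check (spam is an arbitrary made-up name); Transformer.visit raises before anything is compiled or evaluated:
# an expression calling a function that is not whitelisted
bad_tree = ast.parse("spam(1)", mode='eval')
transformer.visit(bad_tree)
# RuntimeError: Name access to spam is not allowed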
For strings that fail only because they contain JSON-style lowercase booleans, another workaround is to rewrite them before calling literal_eval:
my_string = my_string.replace(':false', ':False').replace(':true', ':True')
ast.literal_eval(my_string)
This layer translates a set of arbitrary strings into integer output via a table-based vocabulary lookup. It performs no splitting or transformation of input strings; for a layer that can split and tokenize natural language, see the TextVectorization layer. Calling adapt() on a StringLookup layer is an alternative to passing in a precomputed vocabulary on construction via the vocabulary argument; a StringLookup layer should always be either adapted over a dataset or supplied with a vocabulary. The examples below create a lookup layer with a pre-existing vocabulary, generate the vocabulary by analyzing a dataset, and show an inverse lookup that maps indices back to strings.
tf.keras.layers.StringLookup(
max_tokens = None,
num_oov_indices = 1,
mask_token = None,
oov_token = '[UNK]',
vocabulary = None,
idf_weights = None,
encoding = None,
invert = False,
output_mode = 'int',
sparse = False,
pad_to_max_tokens = False,
**kwargs
)
This example creates a lookup layer with a pre-existing vocabulary.
vocab = ["a", "b", "c", "d"]
data = tf.constant([["a", "c", "d"], ["d", "z", "b"]])
layer = tf.keras.layers.StringLookup(vocabulary=vocab)
layer(data)
<tf.Tensor: shape=(2, 3), dtype=int64, numpy=array([[1, 3, 4], [4, 0, 2]])>
This example creates a lookup layer and generates the vocabulary by analyzing the dataset.
data = tf.constant([
["a", "c", "d"],
["d", "z", "b"]
])
layer = tf.keras.layers.StringLookup()
layer.adapt(data)
layer.get_vocabulary()
['[UNK]', 'd', 'z', 'c', 'b', 'a']
This example demonstrates how to use a lookup layer with multiple OOV indices. When a layer is created with more than one OOV index, any OOV values are hashed into the number of OOV buckets, distributing OOV values in a deterministic fashion across the set.
vocab = ["a", "b", "c", "d"]
data = tf.constant([["a", "c", "d"], ["m", "z", "b"]])
layer = tf.keras.layers.StringLookup(vocabulary=vocab, num_oov_indices=2)
layer(data)
<tf.Tensor: shape=(2, 3), dtype=int64, numpy=array([[2, 4, 5], [0, 1, 3]])>
Configure the layer with output_mode='one_hot'. Note that the first num_oov_indices dimensions in the one_hot encoding represent OOV values.
vocab = ["a", "b", "c", "d"]
data = tf.constant(["a", "b", "c", "d", "z"])
layer = tf.keras.layers.StringLookup(
vocabulary=vocab, output_mode='one_hot')
layer(data)
<tf.Tensor: shape=(5, 5), dtype=float32, numpy=array([[0., 1., 0., 0., 0.], [0., 0., 1., 0., 0.], [0., 0., 0., 1., 0.], [0., 0., 0., 0., 1.], [1., 0., 0., 0., 0.]], dtype=float32)>
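The inverse lookup mentioned at the top of this section maps indices back to strings. A minimal sketch with the same vocabulary (output shown approximately; the exact tensor repr may differ):
vocab = ["a", "b", "c", "d"]
data = tf.constant([[1, 3, 4], [4, 0, 2]])
inverse_layer = tf.keras.layers.StringLookup(vocabulary=vocab, invert=True)
inverse_layer(data)
# roughly: [[b'a', b'c', b'd'], [b'd', b'[UNK]', b'b']]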