Automatically "metatize" backend helper functions #56
Labels:
- **enhancement**: New feature or request
- **important**: Features and issues that need to be addressed ASAP
- **meta graph**: This issue involves the meta graph objects
It would be great if we could automatically transform backend helper/graph constructor functions into meta equivalents.
## The problem
These "helper functions" are the standard Python functions found in the `theano.tensor` and `tensorflow` modules (e.g. `theano.tensor.eye`/`tensorflow.linalg.eye`, etc.). They're generally used to simplify the construction of graphs, which is actually done by creating `Apply`/`Operation` objects, as well as the objects underlying those (e.g. `OpDef`s and `NodeDef`s in TensorFlow, and sometimes `Op`s in Theano).

When we want to craft a meta graph from the ground up that corresponds to the kind of graph one of these helper functions would produce (e.g. in the normal course of using TensorFlow or Theano), we unfortunately have to work at the underlying `Op`/`OpDef` level and effectively reproduce the code within these helper functions. In other words, we would like to have meta versions of these helper functions (e.g. `mt.eye` as the meta version of `tensorflow.eye`) that do more or less the same things as the originals, but use meta `Op`s/`OpDef`s instead. Ideally, they would even work with logic-variable inputs (e.g. an `mt.eye` that's given a meta tensor with logic variables for its shape and/or dtype).

For instance, `tensorflow.abs` is one such function; it takes a tensor/graph as input and performs a simple check to determine which `OpDef` (i.e. `Abs` or `ComplexAbs`) to use when it constructs the output graph representing the absolute value of the input. That check is simply a condition on the `dtype.is_complex` property of the input tensor.

The steps in `tensorflow.abs` can largely be applied to `symbolic_pymc`'s meta tensors with few (if any) changes. Even so, `tensorflow.abs` will necessarily construct TensorFlow objects and not the corresponding meta objects we actually want.

The question is: how do we [re]use as much of the existing helper-function code as possible without rewriting it entirely? Of course, it could be quite an undertaking to cover every case, but there might be a few cheap workarounds that help in more than a few cases.
FYI: This applies to both Theano and TensorFlow backends.
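The `tensorflow.abs` dispatch described above could be mirrored for meta objects along these lines. This is only a sketch: `mt_abs`, and the stub `mt` namespace with its `Abs`/`ComplexAbs` entries, are hypothetical stand-ins rather than `symbolic_pymc`'s actual API.

```python
from types import SimpleNamespace

# Hypothetical stand-ins for the meta OpDef constructors; symbolic_pymc's
# real meta objects would be used here instead.
mt = SimpleNamespace(
    Abs=lambda x: ("Abs", x),
    ComplexAbs=lambda x: ("ComplexAbs", x),
)


def mt_abs(x):
    # Mirror tensorflow.abs's dispatch: the only check is on the input's
    # `dtype.is_complex` property, which a meta tensor could expose too.
    if x.dtype.is_complex:
        return mt.ComplexAbs(x)
    return mt.Abs(x)


# Minimal mock "tensors" carrying only the dtype property the check needs:
real_x = SimpleNamespace(dtype=SimpleNamespace(is_complex=False))
cplx_x = SimpleNamespace(dtype=SimpleNamespace(is_complex=True))

print(mt_abs(real_x)[0], mt_abs(cplx_x)[0])  # Abs ComplexAbs
```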
## An example AST-based approach
In Theano, `tt.diagonal` is a plain function and won't produce meta objects or accept them as arguments. However, the implementation of `tt.diagonal` is extremely simple: it just constructs an `ExtractDiag` operator instance. Since we can create a meta version of the `ExtractDiag` operator, we just need `tt.diagonal` to use it.

The following demonstrates how we could automatically convert some simple helper functions into meta-function equivalents using straightforward AST manipulation.
Here's the source for the original Theano function we want to make compatible with our meta objects:
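In simplified form (assuming a Theano version where `diagonal` is a thin wrapper around the Op; any docstring and argument validation are elided), the helper amounts to:

```python
# Simplified sketch of theano.tensor's diagonal helper: it only
# instantiates and applies the ExtractDiag Op.
diagonal_src = """
def diagonal(a, offset=0, axis1=0, axis2=1):
    return ExtractDiag(offset, axis1, axis2)(a)
"""

print(diagonal_src)
```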
Now, we run the AST transform and view the [source for the] transformed Theano helper function:
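A minimal sketch of such a transform using the standard-library `ast` module. The transformer class, its name, and the set of Op names to rewrite are all illustrative; `ast.unparse` requires Python 3.9+.

```python
import ast

# Simplified source of the original helper (see above):
diagonal_src = """
def diagonal(a, offset=0, axis1=0, axis2=1):
    return ExtractDiag(offset, axis1, axis2)(a)
"""


class MetatizeTransformer(ast.NodeTransformer):
    """Rewrite bare Op names (e.g. `ExtractDiag`) into attribute accesses
    on the meta accessor, i.e. `mt.ExtractDiag`."""

    def __init__(self, op_names):
        self.op_names = op_names

    def visit_Name(self, node):
        if node.id in self.op_names:
            new_node = ast.Attribute(
                value=ast.Name(id="mt", ctx=ast.Load()),
                attr=node.id,
                ctx=node.ctx,
            )
            return ast.copy_location(new_node, node)
        return node


tree = ast.parse(diagonal_src)
tree = ast.fix_missing_locations(MetatizeTransformer({"ExtractDiag"}).visit(tree))

print(ast.unparse(tree))
```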
Simply put, we've automatically made the change from Theano's `ExtractDiag` to our own meta `ExtractDiag` (via the meta accessor `mt`).

The following will create the transformed function in the current namespace/scope:
One major shortcoming of this approach involves how the converted meta objects are used. For instance, some conditions in these helper functions involve comparisons that aren't sound when performed with meta objects (e.g. inequalities involving fields populated by logic variables). However, in this case, it's possible that large sets of such restrictions could be lifted by implementing "typed" logic variables (e.g. logic variables in numeric/array-valued fields that implement numeric comparisons, etc.).
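A hypothetical sketch of such a typed logic variable. Everything here is illustrative, not part of `symbolic_pymc`; recording comparisons as constraints is just one possible design.

```python
class IntLVar:
    """Hypothetical "typed" logic variable for an integer-valued field.

    It stays unbound, but supports the numeric comparisons a helper
    function might perform by recording them as constraints rather than
    raising an error.
    """

    def __init__(self, name):
        self.name = name
        self.constraints = []

    def __ge__(self, other):
        # Record the constraint and optimistically report it satisfiable;
        # a real implementation would defer/propagate the constraint.
        self.constraints.append((">=", other))
        return True


offset = IntLVar("offset")

# A helper-function condition like `offset >= 0` now records a constraint
# instead of failing:
assert offset >= 0
print(offset.constraints)  # [(">=", 0)]
```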