What is the recommended way to mix TensorFlow and TensorFlow Federated code?


Question


TensorFlow (TF) and TensorFlow Federated (TFF) are different layers of functionality designed to play well together (as the names imply).

Still, they are different things designed to solve different problems.

I wonder what the best way is to describe a computation so that it can be used both in vanilla TF and in TFF workloads, and what kinds of pitfalls one might want to avoid.


Answer 1:


Great question. Indeed, there are at least 3 ways to approach composition of TensorFlow code for use with TFF, each with its own merits.

  1. Using TensorFlow's compositional mechanism (defuns) is the recommended way, assuming it works for your specific situation. TensorFlow already has mechanisms for composing code, and we don't want to reinvent the wheel. The reason we created our own compositional mechanism in TFF (@tff.tf_computation) was to deal with specific limitations (such as the lack of support for data sets at the interface level in TF, and the need for TF components to interoperate with the rest of TFF), and we'd ideally limit the use of this mechanism to just the situations that really call for it.

When possible, decorate TensorFlow components using @tf.function, and wrap the entire TensorFlow block as a @tff.tf_computation only at the top level, before embedding it in a @tff.federated_computation. One of the many benefits of this is that it allows you to test components outside of TFF, using standard TensorFlow tools.

So, the following is encouraged and preferred:

# here using TensorFlow's compositional mechanism (defuns)
# rather than TFF's to decorate "foo"
@tf.function(...)
def foo(...):
  ...

@tff.tf_computation(...)
def bar(...):
  # here relying on TensorFlow to embed "foo" as a component of "bar"
  ...foo(...)...
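
For concreteness, here is a minimal runnable sketch of this pattern. The decorators and intrinsics used (tf.function, tff.tf_computation, tff.federated_computation, tff.FederatedType, tff.federated_map) are the public APIs discussed in this answer; the function names and the toy arithmetic are made up purely for illustration.

import tensorflow as tf
import tensorflow_federated as tff

# Plain TensorFlow component, testable on its own with standard TF tools.
@tf.function
def add_half(x):
  return x + tf.constant(0.5)

# Only the top-level TensorFlow block is wrapped as a TFF computation.
@tff.tf_computation(tf.float32)
def scale_and_shift(x):
  return add_half(x) * 2.0

# The wrapped block is then embedded in a federated computation as usual.
@tff.federated_computation(tff.FederatedType(tf.float32, tff.CLIENTS))
def run_on_clients(values):
  return tff.federated_map(scale_and_shift, values)

# add_half(tf.constant(1.0))   # -> 1.5, exercised entirely outside TFF
# run_on_clients([1.0, 2.0])   # -> [3.0, 5.0]
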
  2. Using Python's compositional mechanism (plain undecorated Python functions) is also a good option, although it's less preferable than (1), since it just causes one body of code to be embedded within the other at definition time, as TFF traces through all the TFF-decorated Python functions to construct a serialized representation of the computation to execute, without offering you isolation or any other special benefits.

You may still want to use this pattern to allow your components to be tested outside of TFF, or in situations where neither (1) nor (3) works.

So, the following is an alternative you should consider first if (1) doesn't work:

# here composing things in Python, no special TF or TFF mechanism employed
def foo(...):
  # keep in mind that in this case, "foo" can access and tamper with
  # the internal state of "bar" - you get no isolation benefits
  ... 

@tff.tf_computation(...)
def bar(...):
  # here effectively just executing "foo" within "bar" at the
  # time "bar" is traced
  ...foo(...)...
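
A minimal runnable sketch of this pattern, with a made-up helper name (clip_helper) and toy logic chosen purely for illustration:

import tensorflow as tf
import tensorflow_federated as tff

def clip_helper(x):
  # Undecorated Python: this body is simply inlined into the caller's
  # graph when "normalize" is traced below - no isolation benefits.
  return tf.clip_by_value(x, 0.0, 1.0)

@tff.tf_computation(tf.float32)
def normalize(x):
  # "clip_helper" executes as ordinary Python at tracing time.
  return clip_helper(x * 0.1)

# clip_helper remains testable on its own with eager TensorFlow:
# clip_helper(tf.constant(20.0))  # -> 1.0
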
  3. Using TFF's compositional mechanism (@tff.tf_computation) is not recommended, except - as noted above - in situations that require it, such as when a TensorFlow component needs to accept a data set as a parameter, or if it's going to be invoked only from a @tff.federated_computation. Keep in mind that TFF's support for data sets as parameters is still experimental, and while in some cases it may be the only solution, you may still run into problems. You can expect the implementation to evolve.

Not encouraged (although currently sometimes necessary):

# here using TFF's compositional mechanism
@tff.tf_computation(...)
def foo(...):
  # here you do get isolation benefits - "foo" is traced and
  # serialized by TFF, but you can expect that e.g., some
  # tf.data.Dataset features won't work
  ...

@tff.tf_computation(...)
def bar(...):
  # here relying on TFF to embed "foo" within "bar"
  ...foo(...)...
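
A minimal sketch of a case that genuinely requires this pattern: a TensorFlow block that takes a data set as a parameter and is only invoked from a @tff.federated_computation. The names (sum_examples, sum_on_clients) are made up for illustration, and, as noted above, data-set support is experimental and may behave differently across TFF releases.

import tensorflow as tf
import tensorflow_federated as tff

# Data sets are not supported at the @tf.function interface level (see
# above), so this block is wrapped as a @tff.tf_computation directly.
@tff.tf_computation(tff.SequenceType(tf.float32))
def sum_examples(ds):
  # "ds" arrives as a tf.data.Dataset inside the computation.
  return ds.reduce(tf.constant(0.0), lambda total, x: total + x)

@tff.federated_computation(
    tff.FederatedType(tff.SequenceType(tf.float32), tff.CLIENTS))
def sum_on_clients(client_data):
  # "sum_examples" is only ever invoked from federated code.
  return tff.federated_map(sum_examples, client_data)
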


Source: https://stackoverflow.com/questions/55286731/what-is-the-recommended-way-to-mix-tensorflow-and-tensorflow-federated-code
