Friday, August 17, 2018

[TensorFlow] Rewriter_Config and Memory Optimization Passes

In the previous post (linked below), I mentioned that the default value of rewriter_config seems to have changed a little bit.
https://danny270degree.blogspot.com/2018/06/tensorflow-compare-memory-options-in.html

To clarify my doubt, I checked TensorFlow's memory_optimizer.cc and arranged the mapping table below:

[Mapping table: MemOptType setting → memory optimization pass(es) triggered in memory_optimizer.cc]
So, for instance, if you set your memory optimization option to "SCHEDULING_HEURISTICS", then the SchedulingPass will be triggered.
One more thing to note: TensorFlow caps the number of rewrite passes at 25 to avoid long processing times on graphs that simply won't fit in memory.
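
For reference, here is a minimal sketch (TF 1.x API) of how one of these options can be selected from Python when creating a session; the graph-building code around it is assumed:

import tensorflow as tf
from tensorflow.core.protobuf import rewriter_config_pb2

# Ask Grappler's memory optimizer to run the scheduling heuristic.
rewrite_options = rewriter_config_pb2.RewriterConfig(
    memory_optimization=rewriter_config_pb2.RewriterConfig.SCHEDULING_HEURISTICS)
graph_options = tf.GraphOptions(rewrite_options=rewrite_options)
config = tf.ConfigProto(graph_options=graph_options)

with tf.Session(config=config) as sess:
    # build and run your graph here
    pass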

P.S.: this is the definition of MemOptType in rewriter_config.proto; it contains useful explanations of each option.
tensorflow/core/protobuf/rewriter_config.proto
enum MemOptType {
  // The default setting (SCHEDULING and SWAPPING HEURISTICS only)
  DEFAULT_MEM_OPT = 0;
  // Disabled in the meta-optimizer.
  NO_MEM_OPT = 1;
  // Driven by manual op-level annotations.
  MANUAL = 2;

  // Driven by heuristics. The behavior of these heuristics is subject to
  // change. Currently includes an experimental recomputation and swapping
  // heuristics. Manual annotations are respected, but additional nodes are
  // selected automatically.

  // Swapping heuristic will move a tensor from the GPU to the CPU and move
  // it back when needed to reduce peak memory usage.
  SWAPPING_HEURISTICS = 4;
  // Recomputation heuristics will recompute ops (such as Relu activation)
  // during backprop instead of storing them, reducing peak memory usage.
  RECOMPUTATION_HEURISTICS = 5;
  // Scheduling will split big ops such as AddN and try to enforce a schedule
  // of the new computations that decreases peak memory usage.
  SCHEDULING_HEURISTICS = 6;
  // Use any combination of swapping and recomputation heuristics.
  HEURISTICS = 3;
}
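
Besides memory_optimization itself, rewriter_config.proto also has a memory_optimizer_target_node_name_scope field, which tells the recomputation heuristic under which name scope the backprop nodes live (it defaults to "gradients/"). Below is a hedged sketch of setting it together with RECOMPUTATION_HEURISTICS; the scope name "my_gradients" is just an illustrative assumption:

import tensorflow as tf
from tensorflow.core.protobuf import rewriter_config_pb2

rewrite_options = rewriter_config_pb2.RewriterConfig(
    memory_optimization=rewriter_config_pb2.RewriterConfig.RECOMPUTATION_HEURISTICS,
    # Only needed when the gradient ops are not under the default "gradients/"
    # scope, e.g. when calling tf.gradients(loss, vars, name="my_gradients").
    memory_optimizer_target_node_name_scope="my_gradients")
config = tf.ConfigProto(graph_options=tf.GraphOptions(rewrite_options=rewrite_options))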
