2.5. Chapter Summary

  • Computational graphs were introduced into machine learning frameworks to strike a balance between programming flexibility and computational efficiency.

  • A computational graph consists of tensors (as units of data) and operators (as units of computation).

  • A computational graph represents the computational logic and state of a machine learning model and exposes opportunities for optimization.

  • A computational graph is a directed acyclic graph (DAG): operators in the graph may depend on one another directly or indirectly, or be mutually independent, but circular dependencies are not allowed (see the graph-construction sketch after this list).

  • Control flow, represented by conditionals and loops, determines how data moves through a computational graph (see the control-flow sketch after this list).

  • Computational graphs come in two types: static and dynamic.

  • Static graphs ease model deployment and offer high computational efficiency and a low memory footprint, at the expense of ease of debugging.

  • Dynamic graphs produce computational results on the fly, which increases programming flexibility and makes debugging easy during model optimization and iterative algorithm improvement (the two modes are contrasted in a sketch after this list).

  • Operator execution can be scheduled according to the dependencies reflected in the computational graph.

  • Operators with no dependencies between them can be scheduled concurrently to achieve parallel computing, whereas operators with data dependencies must run serially (see the scheduler sketch after this list).

  • The training tasks of a computational graph can run synchronously or asynchronously; the asynchronous mechanism improves hardware utilization and shortens training time (a prefetching sketch follows this list).
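
The following is a minimal sketch of the graph structure in plain Python; the Tensor, Operator, and Graph classes are hypothetical illustrations, not the API of any particular framework. Kahn's algorithm yields a complete topological ordering exactly when the graph is acyclic, which is how the no-circular-dependency property can be checked.

```python
from collections import deque

class Tensor:
    """A unit of data; records the operator that produced it (if any)."""
    def __init__(self, name, producer=None):
        self.name, self.producer = name, producer

class Operator:
    """A unit of computation that consumes and produces tensors."""
    def __init__(self, name, inputs):
        self.name, self.inputs = name, inputs
        self.output = Tensor(name + ":out", producer=self)

class Graph:
    def __init__(self):
        self.ops = []

    def add(self, name, *inputs):
        op = Operator(name, inputs)
        self.ops.append(op)
        return op.output        # returned tensor wires this op to its consumers

    def topo_order(self):
        """Kahn's algorithm: a full ordering exists iff the graph is acyclic."""
        preds = {op: {t.producer for t in op.inputs if t.producer}
                 for op in self.ops}
        indegree = {op: len(p) for op, p in preds.items()}
        ready = deque(op for op, d in indegree.items() if d == 0)
        order = []
        while ready:
            op = ready.popleft()
            order.append(op)
            for succ in self.ops:
                if op in preds[succ]:
                    indegree[succ] -= 1
                    if indegree[succ] == 0:
                        ready.append(succ)
        if len(order) != len(self.ops):
            raise ValueError("circular dependency: not a valid computational graph")
        return order

# Build y = (a + b) * (a - b): 'add' and 'sub' are independent of each other,
# while 'mul' depends on both.
g = Graph()
a, b = Tensor("a"), Tensor("b")
y = g.add("mul", g.add("add", a, b), g.add("sub", a, b))
print([op.name for op in g.topo_order()])   # ['add', 'sub', 'mul']
```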
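
A brief illustration of conditional and loop control, assuming PyTorch as the host framework: the loop below runs until a data-dependent condition is met, so the path the data takes through the graph is decided only at run time.

```python
import torch

def refine(x, threshold=1.0):
    steps = 0
    while x.norm() > threshold:   # loop control: iteration count depends on x
        x = 0.5 * x
        steps += 1
    if steps % 2 == 0:            # conditional control: branch chosen at run time
        x = x + 1.0
    return x

# The framework cannot know in advance how many times the loop body executes:
# control flow steers which operators the data actually flows through.
print(refine(torch.tensor([3.0, 4.0])))   # norm 5.0 is halved 3 times
```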
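
One way to see the static/dynamic contrast, again assuming PyTorch: eager execution plays the dynamic-graph role (results are available immediately and the code can be stepped through with ordinary debuggers), while torch.jit.script compiles the same function ahead of time into a static graph in which the Python branch becomes a graph-level node.

```python
import torch

def scale(x, double: bool):
    if double:               # in eager mode this branch runs as plain Python
        return x * 2.0
    return x - 1.0

# Dynamic graph (eager execution): results come back on the fly, so the
# function is easy to inspect and debug.
print(scale(torch.ones(2), True))

# Static graph: torch.jit.script compiles the whole function ahead of time;
# the branch is captured as a graph node (prim::If) rather than Python code,
# which enables deployment and graph-level optimization.
static_scale = torch.jit.script(scale)
print(static_scale.graph)                 # inspect the compiled graph IR
print(static_scale(torch.ones(2), False))
```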
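
A minimal scheduler sketch in plain Python, with hypothetical operators given as (function, dependencies) pairs: operators whose dependencies are all satisfied are dispatched to a thread pool in the same wave, so independent operators execute concurrently, while dependent operators are serialized behind their producers.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical operators: name -> (function, names of operators it depends on).
ops = {
    "load_a": (lambda: "a",                  []),
    "load_b": (lambda: "b",                  []),
    "add":    (lambda a, b: f"add({a},{b})", ["load_a", "load_b"]),
    "mul":    (lambda a, b: f"mul({a},{b})", ["load_a", "load_b"]),
    "out":    (lambda x, y: f"out({x},{y})", ["add", "mul"]),
}

def run(ops):
    results = {}
    remaining = dict(ops)
    with ThreadPoolExecutor() as pool:
        while remaining:
            # Every operator whose dependencies are satisfied is dispatched in
            # the same wave, so independent operators execute concurrently.
            ready = [n for n, (_, deps) in remaining.items()
                     if all(d in results for d in deps)]
            if not ready:
                raise ValueError("circular dependency detected")
            futures = {n: pool.submit(remaining[n][0],
                                      *(results[d] for d in remaining[n][1]))
                       for n in ready}
            for n in ready:
                del remaining[n]
            # Waiting on the wave serializes dependent operators behind
            # their producers.
            for n, f in futures.items():
                results[n] = f.result()
    return results

print(run(ops)["out"])   # add/mul ran concurrently; out waited for both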
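
Finally, a sketch of the asynchronous mechanism using a background thread and a bounded queue (both hypothetical simplifications of a real training pipeline): preparation of the next batch overlaps with computation on the current one, so the two per-step costs no longer add up.

```python
import queue
import threading
import time

def prepare_batches(q, n_batches):
    """Host-side producer: preprocesses the next batch while the device
    is still busy training on the current one."""
    for i in range(n_batches):
        time.sleep(0.05)        # simulated data loading / preprocessing
        q.put(f"batch-{i}")
    q.put(None)                 # sentinel: no more batches

def train(n_batches=4):
    q = queue.Queue(maxsize=2)  # bounded queue provides back-pressure
    threading.Thread(target=prepare_batches, args=(q, n_batches),
                     daemon=True).start()
    start = time.time()
    while (batch := q.get()) is not None:
        time.sleep(0.05)        # simulated device computation for one step
        print("trained on", batch)
    # Synchronously, each step would pay 0.05 s twice (prepare, then compute);
    # overlapping the two roughly halves the per-step latency.
    print(f"elapsed: {time.time() - start:.2f} s")

train()
```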

2.6. Further Reading