Automatically Flatten & Unflatten Nested Containers

This post is about a small piece of functionality that turns out to be useful in TensorFlow / JAX / PyTorch.

Low-level components of these systems often use a plain list of values/tensors as inputs & outputs. However, end users who develop models often want to work with more complicated data structures: Dict[str, Any], List[Any], custom classes, and their nested combinations. Therefore, we need bidirectional conversion between nested structures and a plain list of tensors. Different libraries have independently invented similar approaches to this problem, and it's interesting to compare them here.
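As one concrete illustration of the pattern (not a description of any single library's internals), here is a minimal sketch of the round trip using jax.tree_util; torch.utils._pytree and tf.nest expose very similar APIs. The nested structure and values below are made up for the example.

```python
import jax.numpy as jnp
from jax import tree_util

# A nested container of tensors, as an end user might build it.
nested = {"image": jnp.zeros((2, 3)), "labels": [jnp.ones(2), jnp.ones(3)]}

# Flatten: nested structure -> flat list of leaves + a "treedef" describing the structure.
leaves, treedef = tree_util.tree_flatten(nested)
print(len(leaves))  # 3 leaves, in a deterministic order

# Low-level code (e.g. a compiler pass or an RPC boundary) only sees the flat list.
processed = [leaf + 1 for leaf in leaves]

# Unflatten: flat list + treedef -> the original nested structure.
restored = tree_util.tree_unflatten(treedef, processed)
assert restored["image"].shape == (2, 3)
```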

Read more

TorchScript: Tracing vs. Scripting

PyTorch provides two methods to turn an nn.Module into a graph represented in TorchScript format: tracing and scripting. This article will:

  1. Compare their pros and cons, with a focus on useful tips for tracing.
  2. Try to convince you that torch.jit.trace should be preferred over torch.jit.script for deploying non-trivial models (see the sketch after this list).
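As a quick illustration of the two entry points, here is a minimal, hypothetical sketch; the model and inputs are placeholders, not taken from the post.

```python
import torch
import torch.nn as nn

class ToyModel(nn.Module):
    def forward(self, x):
        return x.relu() + 1

model = ToyModel().eval()
example_input = torch.randn(2, 3)

# Tracing: runs the module once on example inputs and records the executed
# tensor ops; Python control flow is "baked in" for that particular input.
traced = torch.jit.trace(model, example_input)

# Scripting: compiles the module's Python source into TorchScript, preserving
# control flow but supporting only a subset of the language.
scripted = torch.jit.script(model)

print(traced(example_input).shape)    # torch.Size([2, 3])
print(scripted(example_input).shape)  # torch.Size([2, 3])
```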
Read more

Where Are Pixels? -- a Deep Learning Perspective

Technically, an image is a function that maps a continuous domain, e.g. a box, to intensities such as (R, G, B). To store it in computer memory, an image is discretized into an array array[H][W], where each element array[i][j] is a pixel.

How does discretization work? How does a discrete pixel relate to the abstract notion of the underlying continuous image? These basic questions play an important role in computer graphics & computer vision algorithms.

This article discusses these low-level details and how they affect our CNN models and deep learning libraries. If you have ever wondered which resize function to use, or whether you should add or subtract 0.5 or 1 from some pixel coordinates, you may find answers here. Interestingly, these details have contributed to many accuracy improvements in Detectron and Detectron2.
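To make the coordinate question concrete, here is a small, hypothetical sketch of the two common conventions for where pixel array[i][j] sits in the continuous domain; the helper name and the align_corners analogy are mine, not taken from the article.

```python
def pixel_center(i, j, align_corners=False):
    """Continuous coordinates of the center of pixel array[i][j]."""
    if align_corners:
        # "Integer centers": pixel (i, j) is centered exactly at point (i, j).
        return (float(i), float(j))
    # "Half-integer centers": pixel (i, j) covers the square
    # [i, i+1) x [j, j+1), so its center is offset by 0.5.
    return (i + 0.5, j + 0.5)

print(pixel_center(0, 0))                      # (0.5, 0.5)
print(pixel_center(0, 0, align_corners=True))  # (0.0, 0.0)
```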

Read more

Patching STB_GNU_UNIQUE of Buggy Binaries

Open-source toolchains carry many long-standing little "features": once implemented for whatever reason (e.g. as a workaround), they are kept around to this day for compatibility, even when their semantics are murky or their design is questionable.

Read more