Conditional Batch Normalization (CBN)

Keywords: conditional batch normalization, neural architecture

Conditional Batch Normalization (CBN) is a batch normalization variant in which the affine parameters ($\gamma$, $\beta$) are predicted from a conditioning input — allowing the normalization to adapt based on class labels, text descriptions, or other conditioning information.

How Does CBN Work?

- Standard BN: Fixed learned $\gamma$, $\beta$ per channel.
- CBN: $\gamma = f_\gamma(c)$ and $\beta = f_\beta(c)$, where $c$ is the conditioning variable and $f_\gamma$, $f_\beta$ are typically small linear layers.
- Conditioning: Class label (one-hot), text embedding, noise vector, or any other signal.
- Used In: Conditional GANs, BigGAN, text-to-image generation.

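The forward pass above can be sketched in a few lines. This is a minimal NumPy illustration (training-mode batch statistics only, no running averages); the weight names `W_gamma`, `b_gamma`, `W_beta`, `b_beta` are hypothetical parameters of the two linear layers that predict the affine terms.

```python
import numpy as np

def conditional_batch_norm(x, cond, W_gamma, b_gamma, W_beta, b_beta, eps=1e-5):
    """Conditional BN over a batch of feature maps (sketch, not a full module).

    x:    (N, C, H, W) activations
    cond: (N, D) conditioning vectors (e.g. class or text embeddings)
    W_gamma (D, C), b_gamma (C,): linear layer predicting per-channel scale
    W_beta  (D, C), b_beta  (C,): linear layer predicting per-channel shift
    """
    # Normalize with batch statistics, exactly as in standard BN.
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)

    # Predict the affine parameters from the conditioning input.
    gamma = cond @ W_gamma + b_gamma   # (N, C): per-sample, per-channel scale
    beta = cond @ W_beta + b_beta      # (N, C): per-sample, per-channel shift

    # Broadcast gamma/beta over the spatial dimensions.
    return gamma[:, :, None, None] * x_hat + beta[:, :, None, None]
```

With `W_gamma = W_beta = 0`, `b_gamma = 1`, and `b_beta = 0`, this reduces to standard (unconditional) batch normalization — a common initialization, so training starts from plain BN.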
Why It Matters

- Conditional Generation: Enables class-conditional image generation by modulating normalization statistics per class.
- BigGAN: CBN is the primary conditioning mechanism in BigGAN for generating class-specific images.
- Efficiency: Only the $\gamma$, $\beta$ parameters change per condition — the rest of the network is shared.
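The efficiency point is easiest to see in the class-conditional case: a linear layer applied to a one-hot label is equivalent to a per-class lookup table of $\gamma$ and $\beta$ rows, so each extra class costs only `2 * channels` parameters per CBN layer. A small sketch (table names are illustrative):

```python
import numpy as np

num_classes, channels = 10, 64

# One learned (gamma, beta) row per class. Applying a linear layer to a
# one-hot label is the same as indexing these tables by class id.
gamma_table = np.ones((num_classes, channels))   # identity scale at init
beta_table = np.zeros((num_classes, channels))   # zero shift at init

labels = np.array([3, 3, 7])        # batch of class labels
gamma = gamma_table[labels]         # (3, 64) per-sample scales
beta = beta_table[labels]           # (3, 64) per-sample shifts
```

Samples with the same label share identical $\gamma$, $\beta$; everything else — convolutions, normalization statistics — is shared across all classes.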

CBN is normalization that listens to instructions — dynamically adjusting feature statistics based on what you want the network to produce.
