Snapshot graphs represent temporal networks as sequences of static graphs at discrete time points.
SO(3) equivariant networks maintain rotational symmetries in 3D molecular and geometric data.
Soft defects are latent failures that manifest intermittently or only under specific conditions, requiring specialized dynamic fault-localization techniques.
Soft routing weights the outputs of all experts rather than selecting a discrete subset.
Softplus is a smooth approximation of ReLU, defined as softplus(x) = log(1 + e^x).
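A minimal numerical sketch of the relationship; the function name is illustrative, and the max/log1p form is the standard numerically stable rewrite:

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + e^x) = max(x, 0) + log1p(e^{-|x|})
    return np.maximum(x, 0) + np.log1p(np.exp(-np.abs(x)))

x = np.linspace(-4, 4, 9)
print(softplus(x))       # smooth everywhere, approaches ReLU for large |x|
print(np.maximum(x, 0))  # ReLU, with a kink at 0
```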
Software pipelining overlaps the execution of successive loop iterations, improving instruction throughput.
Solder joint inspection uses X-ray imaging or cross-sectioning to detect voids, cracks, and insufficient wetting.
Solubility prediction estimates how readily a compound dissolves, typically from its molecular structure.
Solvent distillation separates and purifies organic solvents based on boiling point differences.
Solvent recovery systems capture and purify organic solvents from process exhaust, enabling reuse and reducing disposal costs.
Sort pooling orders nodes by learned structural roles, enabling use of 1D convolutions for graph classification.
SortPool variants improve graph classification by learning node importance for sorted feature concatenation.
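A minimal sketch of the sort-and-truncate step behind sort pooling, assuming node features from a final GNN layer; the function name and zero-padding scheme are illustrative:

```python
import torch

def sort_pool(node_feats, k):
    # node_feats: (num_nodes, d). Sort nodes descending by the last feature
    # channel (a learned structural score), keep the top k, pad if needed.
    order = node_feats[:, -1].argsort(descending=True)
    x = node_feats[order][:k]
    if x.shape[0] < k:
        x = torch.cat([x, torch.zeros(k - x.shape[0], x.shape[1])], dim=0)
    return x.flatten()  # fixed-size output, ready for a 1D convolution
```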
Sound source localization locates sounding objects in a scene, often using visual cues alongside audio.
Source-free domain adaptation adapts a model to a target domain without access to the original source data.
Sparse attention is an attention pattern in which each token attends to only a subset of tokens rather than all positions.
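A minimal sketch of one common sparse pattern, a sliding local window; the names and window size are illustrative:

```python
import numpy as np

def local_attention_mask(seq_len, window):
    # Query i may attend only to keys j with |i - j| <= window.
    i = np.arange(seq_len)[:, None]
    j = np.arange(seq_len)[None, :]
    return np.abs(i - j) <= window

scores = np.random.randn(8, 8)                        # raw attention logits
scores[~local_attention_mask(8, window=2)] = -np.inf  # mask before softmax
```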
Sparse autoencoders extract interpretable features by reconstructing inputs through a wide, sparsely activated hidden layer.
Sparse mixture of experts activates only a subset of experts per input.
A sparse model activates only a subset of its parameters per input, as in mixture-of-experts architectures.
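A minimal top-k routing sketch covering both entries above; the experts, router weights, and shapes are illustrative toy values:

```python
import numpy as np

def topk_moe(x, experts, router_w, k=2):
    # Route x to the k experts with the highest router logits and combine
    # their outputs, weighted by a softmax over the selected logits.
    logits = router_w @ x
    top = np.argsort(logits)[-k:]
    w = np.exp(logits[top] - logits[top].max())
    w /= w.sum()
    return sum(wi * experts[i](x) for wi, i in zip(w, top))

experts = [lambda x, W=np.random.randn(4, 4): W @ x for _ in range(8)]
y = topk_moe(np.random.randn(4), experts, router_w=np.random.randn(8, 4))
```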
Sparse training maintains sparsity throughout training, never materializing dense networks.
Sparse transformer architectures apply different structured sparsity patterns (such as strided or fixed) to attention.
Sparse upcycling converts a dense model to a mixture of experts by splitting its weights into experts.
Sparse weight averaging combines multiple sparse models, improving generalization.
Spatial attention focuses on informative regions by weighting spatial locations.
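A minimal CBAM-style sketch; the channel pooling and 7x7 kernel are one common configuration, not the only one:

```python
import torch
import torch.nn as nn

class SpatialAttention(nn.Module):
    # Pool across channels, then learn a per-location gate with a small conv.
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):                       # x: (B, C, H, W)
        avg = x.mean(dim=1, keepdim=True)
        mx = x.max(dim=1, keepdim=True).values
        gate = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * gate                         # reweight spatial locations
```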
Specialist agents focus on specific capabilities, contributing expertise to team efforts.
Specification gaming exploits loopholes in a reward function, achieving the stated objective in unintended or harmful ways.
Formal authorization to operate a process or accept material that is outside specification limits.
Spectral graph convolutions define convolution on graphs via spectral methods, using the eigendecomposition of the graph Laplacian.
Spectral graph theory analyzes graph structure via the eigenvalues of associated matrices such as the adjacency matrix and the Laplacian.
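A small worked example: the Laplacian spectrum of a 4-node path graph (the graph choice is arbitrary):

```python
import numpy as np

# Path graph 0-1-2-3: adjacency matrix A, combinatorial Laplacian L = D - A.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
print(np.linalg.eigvalsh(L))  # smallest eigenvalue is 0; its multiplicity
                              # equals the number of connected components
```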
Spectral normalization stabilizes GAN training by normalizing each weight matrix by its largest singular value, i.e., its spectral norm.
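A minimal power-iteration sketch of the idea; in practice one iteration per training step with a persistent u vector is typical:

```python
import numpy as np

def spectral_normalize(W, iters=5):
    # Estimate the largest singular value by power iteration, then divide.
    u = np.random.randn(W.shape[0])
    for _ in range(iters):
        v = W.T @ u; v /= np.linalg.norm(v)
        u = W @ v;   u /= np.linalg.norm(u)
    return W / (u @ W @ v)  # u^T W v approximates the spectral norm

W = np.random.randn(64, 32)
print(np.linalg.norm(spectral_normalize(W), ord=2))  # close to 1.0
```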
Spectral residual detects time-series anomalies by subtracting a smoothed log-amplitude spectrum from the original spectrum and converting the residual back into a saliency map.
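A minimal sketch of the spectral residual transform; the smoothing window and epsilon are illustrative choices:

```python
import numpy as np

def sr_saliency(x, win=3):
    # Residual = log-amplitude spectrum minus its local average; map it
    # back to the time domain (keeping the original phase) as saliency.
    spec = np.fft.fft(x)
    log_amp = np.log(np.abs(spec) + 1e-8)
    residual = log_amp - np.convolve(log_amp, np.ones(win) / win, mode="same")
    return np.abs(np.fft.ifft(np.exp(residual + 1j * np.angle(spec))))

x = np.sin(np.linspace(0, 20, 200)); x[120] += 3.0  # inject a spike
print(sr_saliency(x).argmax())                      # peaks near index 120
```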
Speculative decoding uses a small draft model to propose multiple tokens in parallel, which the large target model then verifies, typically yielding 2-3x faster inference with no loss in quality.
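A minimal greedy sketch, assuming `draft` and `target` are callables returning the next token for a prefix; real implementations verify all draft tokens in a single batched target forward pass:

```python
def speculative_decode(prefix, draft, target, n_draft=4, max_len=64):
    out = list(prefix)
    while len(out) < max_len:
        proposals = []
        for _ in range(n_draft):                  # cheap draft proposals
            proposals.append(draft(out + proposals))
        for tok in proposals:                     # verify in order
            if target(out) == tok:
                out.append(tok)                   # accepted draft token
            else:
                out.append(target(out))           # rejected: take target's token
                break
    return out[:max_len]
```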
Speculative generality is a code smell: an abstraction added for anticipated future needs that remains unused.
Speculative sampling uses a draft model to propose tokens that the target model accepts or rejects, preserving the target model's output distribution.
Spend analysis examines procurement patterns, identifying cost-reduction and consolidation opportunities.
SphereNet uses spherical Bessel functions and spherical harmonics for SE(3)-equivariant molecular property prediction.
Spherical harmonics provide basis functions for rotationally equivariant feature representations in 3D.
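A small evaluation sketch using SciPy; note the function name and argument order vary across SciPy versions (sph_harm is renamed sph_harm_y in recent releases):

```python
import numpy as np
from scipy.special import sph_harm

# Evaluate Y_l^m (degree l=2, order m=1) on an angular grid; these functions
# transform predictably under 3D rotations, which equivariant nets exploit.
theta = np.linspace(0, 2 * np.pi, 8)  # azimuthal angle
phi = np.linspace(0, np.pi, 8)        # polar angle
T, P = np.meshgrid(theta, phi)
Y = sph_harm(1, 2, T, P)              # scipy order: (m, l, theta, phi)
print(Y.shape, Y.dtype)               # complex-valued angular features
```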
Spike annealing spends the minimum time at peak temperature needed to activate dopants, limiting unwanted dopant diffusion.
Spiking neural networks compute using discrete spikes rather than continuous activations.
Split learning partitions a model across parties that compute collaboratively without sharing raw data.
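A minimal forward-pass sketch, assuming a single cut point; during training, gradients cross the same boundary in reverse:

```python
import torch
import torch.nn as nn

# Client holds the early layers; server holds the rest. Only intermediate
# activations ("smashed data") cross the boundary, never raw inputs.
client_net = nn.Sequential(nn.Linear(16, 32), nn.ReLU())
server_net = nn.Sequential(nn.Linear(32, 8), nn.ReLU(), nn.Linear(8, 1))

x = torch.randn(4, 16)        # raw data stays on the client
smashed = client_net(x)       # sent to the server
y = server_net(smashed)       # server completes the forward pass
```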
Single Path One-Shot NAS trains a supernet by uniformly sampling single paths, improving architecture ranking accuracy.
SQL generation produces database queries from natural-language questions.
The Square Attack is a query-efficient black-box adversarial attack that perturbs random square-shaped regions of the input.
Squeeze-and-excitation blocks recalibrate channel-wise features through global pooling and gating.
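A minimal sketch of the block; the reduction ratio of 16 is the commonly cited default:

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    # Squeeze: global-average-pool each channel. Excite: a small bottleneck
    # MLP plus sigmoid produces per-channel gates that recalibrate features.
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):                    # x: (B, C, H, W)
        scale = self.fc(x.mean(dim=(2, 3)))  # (B, C) channel descriptors
        return x * scale[:, :, None, None]
```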
A Stochastic Recurrent Neural Network (SRNN) uses stochastic hidden states to model uncertainty in sequences.
Stable Diffusion is an open-source latent diffusion model for text-to-image generation.
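A hedged usage sketch with the Hugging Face diffusers library; the checkpoint id is one publicly available example, and the API may differ across library versions:

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4", torch_dtype=torch.float16
).to("cuda")
image = pipe("a watercolor painting of a lighthouse").images[0]
image.save("lighthouse.png")
```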