Graph optimization transforms computation graphs, improving efficiency through fusion, reordering, and elimination.
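As a rough illustration of the fusion idea, here is a toy pass (the list-of-ops graph format and op names are hypothetical, not taken from any real compiler) that rewrites a MatMul followed by an Add into a single fused Linear node:

```python
def fuse_matmul_add(ops):
    """Fuse matmul -> add pairs into a single 'linear' op.

    ops: list of dicts like {"op": "matmul", "inputs": [...], "output": "t"}.
    Assumes the intermediate matmul result is not used anywhere else.
    """
    fused, i = [], 0
    while i < len(ops):
        cur = ops[i]
        nxt = ops[i + 1] if i + 1 < len(ops) else None
        if (cur["op"] == "matmul" and nxt is not None
                and nxt["op"] == "add" and cur["output"] in nxt["inputs"]):
            bias = [x for x in nxt["inputs"] if x != cur["output"]]
            fused.append({"op": "linear",
                          "inputs": cur["inputs"] + bias,
                          "output": nxt["output"]})
            i += 2  # consume both original nodes
        else:
            fused.append(cur)
            i += 1
    return fused

graph = [{"op": "matmul", "inputs": ["x", "W"], "output": "t"},
         {"op": "add", "inputs": ["t", "b"], "output": "y"}]
print(fuse_matmul_add(graph))  # one fused "linear" node replaces the pair
```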
Downsample graphs in GNNs.
Graph recurrence applies RNNs to sequences of graph snapshots, learning temporal graph dynamics.
Graph serialization stores model structure and parameters in a portable format for deployment.
Graph U-Net applies an encoder-decoder architecture with skip connections to graphs for node classification.
Upsample graphs.
Variational autoencoder for graphs.
Wavelet transforms on graphs.
Use graph networks for relational tasks.
Graph Autoregressive Flow generates graphs by sequentially adding nodes and edges with normalizing flows.
Sequential graph generation.
GraphNVP applies normalizing flows to graph generation, enabling exact likelihood computation and efficient sampling of molecular structures.
GraphRNN generates graphs sequentially by modeling node and edge formation as a sequence generation problem using recurrent neural networks.
RNN-based graph generation.
GraphSAGE generates node embeddings by sampling and aggregating features from local neighborhoods, enabling inductive learning on unseen graphs.
Inductive GNN with sampling.
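A minimal NumPy sketch of one GraphSAGE-style layer with mean aggregation and neighbor sampling (weight shapes and the ReLU/L2-normalization choices follow the common formulation; variable names are mine):

```python
import numpy as np

def sage_mean_layer(features, adj_list, W_self, W_neigh, num_samples=5, seed=0):
    """One GraphSAGE-style layer: sample neighbors, mean-aggregate, combine with self.

    features: (N, d) node features; adj_list: list of neighbor-index lists;
    W_self, W_neigh: (d, d_out) weight matrices.
    """
    rng = np.random.default_rng(seed)
    out = np.zeros((features.shape[0], W_self.shape[1]))
    for v, neighbors in enumerate(adj_list):
        if neighbors:
            k = min(num_samples, len(neighbors))
            sampled = rng.choice(neighbors, size=k, replace=False)
            neigh_mean = features[sampled].mean(axis=0)
        else:
            neigh_mean = np.zeros(features.shape[1])
        h = features[v] @ W_self + neigh_mean @ W_neigh  # combine self and neighborhood
        out[v] = np.maximum(h, 0)                        # ReLU
    norms = np.linalg.norm(out, axis=1, keepdims=True) + 1e-8
    return out / norms                                   # L2-normalize embeddings
```

Because the layer only needs a node's features and a sampled neighborhood, it can embed nodes that were never seen during training, which is what makes the approach inductive.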
Graph Transformer applies full self-attention over graph nodes with positional encodings for structural information.
Graph Variational Autoencoder generates graphs by learning latent distributions over graph structures.
Green chemistry principles minimize hazardous substances in semiconductor processes through alternative chemistries and process optimization.
Green solvents replace hazardous organic solvents with safer alternatives like supercritical CO2 or water-based solutions.
Try all combinations of hyperparameter values.
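A minimal grid-search sketch; the search space and the scoring callback are made up for illustration:

```python
from itertools import product

param_grid = {"learning_rate": [1e-3, 1e-2, 1e-1],
              "batch_size": [32, 64],
              "dropout": [0.0, 0.5]}

def grid_search(param_grid, score_fn):
    """Evaluate every combination of hyperparameter values and return the best one."""
    keys = list(param_grid)
    best_params, best_score = None, float("-inf")
    for values in product(*(param_grid[k] for k in keys)):
        params = dict(zip(keys, values))
        score = score_fn(params)  # e.g. validation accuracy of a model trained with params
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

The cost grows multiplicatively with each added hyperparameter (here 3 × 2 × 2 = 12 runs), which is why grid search is usually reserved for small, low-dimensional search spaces.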
Sudden generalization long after overfitting.
Sudden jump in generalization long after training loss plateaus.
Convolutions over symmetry groups.
Grouped convolutions partition input channels into groups, processing each group independently and reducing parameter count.
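A short PyTorch comparison (channel counts chosen arbitrarily) showing how the groups argument reduces the weight count; both channel dimensions must be divisible by the number of groups:

```python
import torch.nn as nn

# 64 -> 64 channels, 3x3 kernel: standard vs. 4-group convolution.
standard = nn.Conv2d(64, 64, kernel_size=3, padding=1)
grouped  = nn.Conv2d(64, 64, kernel_size=3, padding=1, groups=4)  # four independent 16->16 convs

count = lambda m: sum(p.numel() for p in m.parameters())
print(count(standard), count(grouped))  # grouped needs roughly a quarter of the weights
```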
Middle ground between multi-query attention (MQA) and multi-head attention.
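A toy NumPy sketch of the idea (head counts and weight shapes are assumptions for illustration): several query heads attend using the same shared key/value head, so n_kv_heads = 1 recovers multi-query attention and n_kv_heads = n_q_heads recovers standard multi-head attention.

```python
import numpy as np

def grouped_query_attention(x, Wq, Wk, Wv, n_q_heads=8, n_kv_heads=2):
    """x: (T, d); Wq: (d, d); Wk, Wv: (d, n_kv_heads * (d // n_q_heads))."""
    T, d = x.shape
    dh = d // n_q_heads                              # per-head dimension
    q = (x @ Wq).reshape(T, n_q_heads, dh)
    k = (x @ Wk).reshape(T, n_kv_heads, dh)
    v = (x @ Wv).reshape(T, n_kv_heads, dh)
    group = n_q_heads // n_kv_heads                  # query heads per shared K/V head
    outs = []
    for h in range(n_q_heads):
        kv = h // group                              # pick the shared K/V head for this query head
        scores = q[:, h] @ k[:, kv].T / np.sqrt(dh)
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w /= w.sum(axis=-1, keepdims=True)           # softmax over keys
        outs.append(w @ v[:, kv])
    return np.concatenate(outs, axis=-1)             # (T, d)
```

Sharing key/value heads shrinks the KV cache during autoregressive decoding, which is the main practical payoff.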
Normalize within groups of channels.
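A minimal NumPy sketch of the normalization itself; the learnable per-channel scale and shift used in practice are omitted:

```python
import numpy as np

def group_norm(x, num_groups, eps=1e-5):
    """Normalize an (N, C, H, W) tensor within groups of channels, per sample."""
    N, C, H, W = x.shape
    g = x.reshape(N, num_groups, C // num_groups, H, W)
    mean = g.mean(axis=(2, 3, 4), keepdims=True)   # statistics over each group's channels and pixels
    var = g.var(axis=(2, 3, 4), keepdims=True)
    return ((g - mean) / np.sqrt(var + eps)).reshape(N, C, H, W)
```

Because the statistics are computed per sample rather than per batch, the result does not depend on batch size, unlike batch normalization.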
Quantum search algorithm.
Graph Transformer Networks learn new graph structures through soft edge selection for heterogeneous graphs.
Framework for adding structure validation and safety to LLM outputs.
Guardrails constrain model behavior, preventing specific undesired outputs.
Guardrails prevent unwanted model behavior through topic restrictions, format requirements, and safety filters.
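A hand-rolled sketch of the idea, not the API of any particular guardrails library; the blocked topics and the JSON-format requirement are hypothetical examples:

```python
import json
import re

BLOCKED_TOPICS = ("password", "credit card")  # hypothetical topic restrictions

def check_output(text, require_json=False):
    """Return (ok, issues) for a model output, applying topic and format guardrails."""
    issues = []
    for topic in BLOCKED_TOPICS:
        if re.search(topic, text, flags=re.IGNORECASE):
            issues.append(f"blocked topic: {topic}")
    if require_json:
        try:
            json.loads(text)
        except ValueError:
            issues.append("output is not valid JSON")
    return len(issues) == 0, issues
```

In a real system a failing output would typically be rejected, revised, or regenerated rather than just flagged.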
Strength of conditioning guidance.
Guidance scale controls the trade-off between prompt adherence and sample diversity in guided generation.
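In diffusion models the guidance scale usually enters through classifier-free guidance; a minimal sketch of that combination step (array types and the surrounding sampling loop are omitted):

```python
def apply_guidance(eps_uncond, eps_cond, guidance_scale):
    """Blend unconditional and conditional noise predictions.

    guidance_scale = 1.0 reproduces the conditional model; larger values push
    samples toward the prompt at the cost of diversity.
    """
    return eps_uncond + guidance_scale * (eps_cond - eps_uncond)
```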
Modified backprop for visualization.