SE(3)-equivariant GNNs process molecular geometries while maintaining equivariance to 3D rotations and translations; equivariance that also covers reflections corresponds to the larger group E(3).
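A minimal sketch of one EGNN-style equivariant message-passing step (in the spirit of Satorras et al.), with small linear maps standing in for the usual MLPs; because message weights depend only on invariant distances, rotating or translating the input coordinates rotates or translates the outputs identically:

    import numpy as np

    def egnn_step(h, x, W_e, w_x):
        # h: (n, d) invariant node features; x: (n, 3) coordinates.
        # W_e: (m, 2d + 1) edge-message weights; w_x: (m,) readout weights.
        n = x.shape[0]
        x_new = x.copy()
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                d2 = np.sum((x[i] - x[j]) ** 2)                        # invariant distance
                msg = np.tanh(W_e @ np.concatenate([h[i], h[j], [d2]]))
                w = float(w_x @ msg)                                   # invariant weight
                x_new[i] += w * (x[i] - x[j]) / (n - 1)                # equivariant update
        return x_new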
Generate seamlessly tileable images.
Search space design determines the set of candidate architectures, balancing expressiveness against search efficiency.
Seasonal state space models represent seasonality through trigonometric or dummy variable state equations.
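As an illustration, a trigonometric seasonal state equation uses one 2x2 rotation block per harmonic, with frequency 2*pi*j/s for period s; a minimal sketch of the transition matrix:

    import numpy as np

    def trig_seasonal_transition(period, harmonics):
        # One 2x2 rotation block per harmonic j, frequency 2*pi*j/period.
        T = np.zeros((2 * harmonics, 2 * harmonics))
        for j in range(1, harmonics + 1):
            lam = 2 * np.pi * j / period
            block = np.array([[np.cos(lam), np.sin(lam)],
                              [-np.sin(lam), np.cos(lam)]])
            k = 2 * (j - 1)
            T[k:k + 2, k:k + 2] = block
        return T

    T = trig_seasonal_transition(period=12, harmonics=2)  # monthly data, two harmonics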
Secure aggregation computes collective statistics without revealing individual contributions.
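A toy sketch of the pairwise-masking idea behind secure aggregation: each pair of clients agrees on a random mask that one adds and the other subtracts, so individual updates stay hidden while the masks cancel in the aggregate:

    import numpy as np

    rng = np.random.default_rng(0)
    updates = [rng.normal(size=4) for _ in range(3)]  # private client vectors

    masked = [u.copy() for u in updates]
    for i in range(len(updates)):
        for j in range(i + 1, len(updates)):
            r = rng.normal(size=4)   # shared pairwise mask
            masked[i] += r           # client i adds it
            masked[j] -= r           # client j subtracts it

    assert np.allclose(sum(masked), sum(updates))  # server recovers only the sum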
Secure multi-party computation allows joint computation without revealing private inputs.
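A minimal sketch of additive secret sharing, one building block of secure multi-party computation: each party splits its input into random shares that sum to the value mod a prime, and only the total is ever reconstructed:

    import random

    P = 2**61 - 1  # prime modulus

    def share(secret, n):
        parts = [random.randrange(P) for _ in range(n - 1)]
        parts.append((secret - sum(parts)) % P)
        return parts  # any single part alone reveals nothing

    inputs = [12, 7, 30]                               # three private inputs
    shares = [share(x, 3) for x in inputs]
    partial = [sum(col) % P for col in zip(*shares)]   # each party sums the shares it holds
    print(sum(partial) % P)                            # 49, without exposing any input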
Seebeck effect imaging detects voltage gradients arising from thermal variations, revealing resistive defects and current paths.
Seeds yield model accounts for systematic and random defects with separate density and clustering parameters.
Segment-level recurrence processes long sequences in fixed-size segments, carrying hidden states across segment boundaries (as in Transformer-XL).
Condition image generation on semantic segmentation maps.
Selective knowledge transfer chooses which source knowledge to transfer to the target task.
Selective prediction abstains from answering when model confidence is insufficient, trading coverage for accuracy.
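A minimal sketch of confidence-thresholded abstention, assuming reasonably calibrated softmax probabilities; the threshold trades coverage against error rate:

    import numpy as np

    def predict_or_abstain(probs, threshold=0.8):
        # Answer only when the top class probability clears the threshold.
        return int(probs.argmax()) if probs.max() >= threshold else None

    print(predict_or_abstain(np.array([0.05, 0.90, 0.05])))  # -> 1
    print(predict_or_abstain(np.array([0.40, 0.35, 0.25])))  # -> None (abstain)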
Selective SSMs dynamically adjust state transitions based on input content.
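A minimal, Mamba-flavored sketch of a selective scan with a diagonal state matrix: the step size and input projection are computed from each input (the "selection"); all weight names are illustrative assumptions:

    import numpy as np

    def selective_scan(xs, A, W_dt, W_b, C):
        # xs: (T, d_in); A: (d_state,) negative decay rates (diagonal state matrix).
        h, ys = np.zeros(A.shape[0]), []
        for x_t in xs:
            dt = np.log1p(np.exp(W_dt @ x_t))   # softplus: input-dependent step size
            B_t = W_b @ x_t                     # input-dependent input projection
            h = np.exp(A * dt) * h + dt * B_t   # discretized state update
            ys.append(C @ h)
        return np.array(ys)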
Self-alignment improves models using their own generated critiques and revisions.
Self-attention capsule networks combine capsule representations with attention mechanisms.
Self-attentive Hawkes processes combine transformer architectures with temporal point process likelihoods for scalable event sequence modeling.
Self-critique has a model critique its own outputs to identify errors and guide revision.
Self-distillation trains a model on its own predictions as soft targets, often improving generalization.
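A minimal sketch of the soft-target loss, where the "teacher" logits come from an earlier snapshot of the same model and temperature T softens both distributions:

    import numpy as np

    def softmax(z, T=1.0):
        e = np.exp(z / T - np.max(z / T))
        return e / e.sum()

    def self_distill_loss(student_logits, teacher_logits, T=3.0):
        # Cross-entropy of the student against its own softened earlier predictions.
        p_teacher = softmax(teacher_logits, T)
        p_student = softmax(student_logits, T)
        return -np.sum(p_teacher * np.log(p_student + 1e-12))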
Self-ensembling averages a model's predictions across training steps or augmentations to enforce consistency.
Self-explaining models are architectures designed to be interpretable by construction.
Self-gating modulates activations with gates computed from the input itself, as in the Swish activation.
Self-heating models the temperature rise inside a device caused by its own power dissipation.
Self-Instruct generates training data by prompting models to create instructions and responses.
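A rough sketch of one Self-Instruct round, assuming a hypothetical llm(prompt) callable that returns text: the model invents new instructions from sampled examples, then answers them to form training pairs:

    import random

    def self_instruct_round(llm, instruction_pool, n_new=4):
        # llm: hypothetical callable, prompt string -> completion string.
        examples = "\n".join(random.sample(instruction_pool, k=min(3, len(instruction_pool))))
        prompt = (f"Here are example task instructions:\n{examples}\n"
                  f"Write {n_new} new, diverse instructions, one per line.")
        new = [line.strip() for line in llm(prompt).splitlines() if line.strip()]
        return [{"instruction": ins, "response": llm(ins)} for ins in new]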
Self-monitoring tracks progress, evaluating whether actions advance toward goals.
Self-paced learning automatically selects training samples based on current model capability, starting with easier examples and letting the model set its own curriculum order.
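A minimal sketch of the self-paced selection rule: keep samples whose current loss is below a pace parameter lam, and grow lam so harder examples enter training later:

    import numpy as np

    def self_paced_select(losses, lam):
        return np.where(losses < lam)[0]   # indices of "easy enough" samples

    losses = np.array([0.2, 1.5, 0.7, 3.0])
    for lam in (0.5, 1.0, 2.0):
        print(lam, self_paced_select(losses, lam))  # selection widens as lam grows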
Self-supervised graph neural networks learn representations through pretext tasks without labeled data.
Self-supervised vision pretraining methods include DINO, MAE, and BEiT.
Self-training is a semi-supervised method in which a model's confident predictions on unlabeled data become pseudo-labels that iteratively augment the training set.
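A sketch of the pseudo-labeling loop, assuming a scikit-learn-style estimator with fit and predict_proba; only high-confidence predictions are promoted to labels:

    import numpy as np

    def self_train(make_model, X_lab, y_lab, X_unlab, threshold=0.95, rounds=3):
        X, y = X_lab, y_lab
        model = make_model().fit(X, y)
        for _ in range(rounds):
            if len(X_unlab) == 0:
                break
            probs = model.predict_proba(X_unlab)
            keep = probs.max(axis=1) >= threshold        # confident predictions only
            if not keep.any():
                break
            X = np.vstack([X, X_unlab[keep]])
            y = np.concatenate([y, probs[keep].argmax(axis=1)])
            X_unlab = X_unlab[~keep]
            model = make_model().fit(X, y)               # retrain on augmented set
        return model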
SELFIES is a molecular string representation, an alternative to SMILES, in which every string decodes to a valid molecule.
SELU (scaled exponential linear unit) is a self-normalizing activation function.
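For reference, a numpy version of SELU with the constants from Klambauer et al. (2017); paired with appropriate initialization, activations are driven toward zero mean and unit variance:

    import numpy as np

    ALPHA = 1.6732632423543772
    SCALE = 1.0507009873554805

    def selu(x):
        # Scaled ELU: linear for x > 0, scaled exponential below.
        return SCALE * np.where(x > 0, x, ALPHA * (np.exp(x) - 1))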
Semantic attention in heterogeneous GNNs learns importance weights for different metapaths or relation types.
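A minimal, HAN-style sketch of semantic attention: per-metapath node embeddings are scored with a shared query vector and softmax-combined; weight shapes here are illustrative assumptions:

    import numpy as np

    def semantic_attention(Z, W, b, q):
        # Z: (num_metapaths, n, d) node embeddings, one slice per metapath.
        scores = np.array([np.mean(np.tanh(Zp @ W + b) @ q) for Zp in Z])
        w = np.exp(scores - scores.max())
        w /= w.sum()                        # importance weight per metapath
        return np.tensordot(w, Z, axes=1)   # (n, d) fused embeddings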
Semantic caching matches semantically similar queries to cached results.
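A toy semantic cache, assuming a hypothetical embed(text) function that returns a vector; a query reuses a cached result when its cosine similarity to a stored query clears a threshold:

    import numpy as np

    class SemanticCache:
        def __init__(self, embed, threshold=0.9):
            self.embed, self.threshold = embed, threshold
            self.keys, self.values = [], []

        def get_or_compute(self, query, compute):
            q = self.embed(query)
            for k, v in zip(self.keys, self.values):
                sim = q @ k / (np.linalg.norm(q) * np.linalg.norm(k))
                if sim >= self.threshold:
                    return v                      # hit on a paraphrase
            v = compute(query)                    # miss: compute and store
            self.keys.append(q)
            self.values.append(v)
            return v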
Semantic search retrieves results by meaning rather than keyword matching, typically via embedding similarity.
Semantic segmentation maps guide generation, controlling the content of each image region.
Semantic editing finds meaningful directions in latent space and moves along them to manipulate specific attributes for targeted modifications.
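A minimal sketch of attribute editing by moving a latent code along a discovered direction (found, e.g., with a linear probe or PCA over latents); the names are illustrative:

    import numpy as np

    def edit_latent(z, direction, alpha):
        # Step size alpha controls edit strength; flipping its sign reverses the attribute.
        d = direction / np.linalg.norm(direction)
        return z + alpha * d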
Semantic embeddings encode meaning as dense vectors.
Semantic memory stores factual knowledge and learned concepts without temporal tags.
Semantic-aware metapath attention learns which metapaths are relevant for specific prediction tasks.
Semi-autonomous agents handle routine tasks but request help for complex decisions.
Semi-autoregressive decoding generates multiple tokens per step, trading some autoregressive quality for speed.
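A rough sketch of the decoding loop, assuming a hypothetical model object with a predict_block(prefix, k) method that proposes k tokens at once and an eos_id attribute:

    def semi_autoregressive_decode(model, prompt_ids, k=4, max_len=64):
        tokens = list(prompt_ids)
        while len(tokens) < max_len:
            block = model.predict_block(tokens, k)   # k tokens per step, not 1
            tokens.extend(block)
            if model.eos_id in block:
                break
        return tokens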
Semi-supervised domain adaptation uses limited labeled target-domain data alongside labeled source data.
Semiconductor equipment maintenance covers approaches to maintaining fabrication tools, including preventive and predictive strategies.
SendGrid provides an email delivery API for transactional and marketing email.
Sentence Transformers (SBERT) is a library that produces semantically meaningful sentence embeddings, making semantic similarity computations straightforward.
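A short usage example with the real library (the model name is one of its published checkpoints):

    from sentence_transformers import SentenceTransformer, util

    model = SentenceTransformer("all-MiniLM-L6-v2")
    emb = model.encode(["How do I reset my password?",
                        "Steps to recover account access",
                        "Best pizza in Naples"])
    print(util.cos_sim(emb[0], emb[1:]))  # the paraphrase pair scores far higher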