Multi-domain recommendation jointly models user preferences across multiple domains by leveraging patterns shared between them.
Multiple exits at different depths.
Multi-fidelity NAS evaluates architectures at different training lengths, resolutions, or data subsets for efficiency.
Approaches to use multiple GPUs.
Multi-horizon forecasting predicts multiple future time steps simultaneously or autoregressively.
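A minimal sketch of the "simultaneous" (direct) strategy mentioned above, assuming a toy PyTorch model whose single forward pass emits every step of the horizon; the class name and dimensions are illustrative, not from the entry.

    import torch
    import torch.nn as nn

    class DirectMultiHorizon(nn.Module):
        # Direct strategy: one forward pass emits all H future steps at once,
        # instead of feeding predictions back in autoregressively.
        def __init__(self, lookback: int, horizon: int, hidden: int = 64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(lookback, hidden), nn.ReLU(), nn.Linear(hidden, horizon)
            )

        def forward(self, history):           # history: (batch, lookback)
            return self.net(history)          # forecast: (batch, horizon)

    model = DirectMultiHorizon(lookback=48, horizon=12)
    forecast = model(torch.randn(8, 48))      # 12 future steps per series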
Complete multiple lines at once.
Distributed across machines.
Optimize accuracy, latency, and size together.
Share key/value projections across attention heads to reduce memory and speed up inference.
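A minimal PyTorch sketch of the shared key/value idea in this entry (often called multi-query attention); the module name and sizes are illustrative assumptions, not from the entry.

    import torch
    import torch.nn as nn

    class SharedKVAttention(nn.Module):
        # All query heads attend over a single shared key/value head,
        # shrinking the KV cache by a factor of num_heads.
        def __init__(self, d_model: int, num_heads: int):
            super().__init__()
            self.h, self.d = num_heads, d_model // num_heads
            self.q = nn.Linear(d_model, d_model)
            self.k = nn.Linear(d_model, self.d)   # one head's worth of keys
            self.v = nn.Linear(d_model, self.d)   # one head's worth of values
            self.out = nn.Linear(d_model, d_model)

        def forward(self, x):                     # x: (batch, seq, d_model)
            b, t, _ = x.shape
            q = self.q(x).view(b, t, self.h, self.d).transpose(1, 2)   # (b, h, t, d)
            k = self.k(x).unsqueeze(1)            # (b, 1, t, d), broadcast to all heads
            v = self.v(x).unsqueeze(1)
            att = torch.softmax(q @ k.transpose(-2, -1) / self.d ** 0.5, dim=-1)
            ctx = att @ v                         # (b, h, t, d)
            return self.out(ctx.transpose(1, 2).reshape(b, t, -1))

    y = SharedKVAttention(d_model=64, num_heads=8)(torch.randn(2, 10, 64))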
Multi-resolution hash encoding stores features at multiple scales in hash tables.
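A simplified PyTorch sketch of the multi-scale hash lookup this entry describes, assuming a toy hash function and illustrative table sizes; production implementations (e.g. Instant-NGP) also interpolate features between neighbouring grid cells.

    import torch
    import torch.nn as nn

    class HashGridEncoding(nn.Module):
        # One feature table per resolution level; coordinates are snapped to
        # that level's grid, hashed, and the looked-up features concatenated.
        def __init__(self, levels=4, table_size=2**14, feat_dim=2, base_res=16):
            super().__init__()
            self.res = [base_res * 2 ** i for i in range(levels)]
            self.tables = nn.ModuleList(
                nn.Embedding(table_size, feat_dim) for _ in range(levels)
            )
            self.table_size = table_size
            self.register_buffer("primes", torch.tensor([1, 2654435761, 805459861]))

        def forward(self, xyz):                       # xyz in [0, 1], shape (n, 3)
            feats = []
            for res, table in zip(self.res, self.tables):
                cell = (xyz * res).long()             # integer grid coordinates
                h = (cell * self.primes).sum(-1) % self.table_size   # toy spatial hash
                feats.append(table(h))
            return torch.cat(feats, dim=-1)           # (n, levels * feat_dim)

    enc = HashGridEncoding()
    features = enc(torch.rand(5, 3))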
Train on multiple resolutions.
Discriminators at different resolutions.
Multi-scale generation produces images at multiple resolutions simultaneously or progressively.
Adapt from multiple source domains.
Multiple checks at different points.
Use multiple steps to gradually bypass restrictions.
Gradually elicit harmful behavior.
Multi-style training uses diverse acoustic conditions during ASR training for robustness.
Adapt to multiple target domains.
Pre-train on multiple objectives simultaneously.
Train on multiple tasks together.
Learn from multiple teacher models.
Share resources across teams.
Multi-token prediction forecasts several future tokens at once, enabling faster generation.
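A minimal sketch of one way to realize this, assuming k extra output heads that each predict a token further ahead; the head count, names, and dimensions are illustrative.

    import torch
    import torch.nn as nn

    class MultiTokenHead(nn.Module):
        # k output heads share the backbone's hidden state; head i predicts
        # the token i+1 positions ahead, so one pass proposes several tokens.
        def __init__(self, d_model: int, vocab: int, k: int = 4):
            super().__init__()
            self.heads = nn.ModuleList(nn.Linear(d_model, vocab) for _ in range(k))

        def forward(self, hidden):                    # hidden: (batch, d_model)
            return torch.stack([h(hidden) for h in self.heads], dim=1)   # (batch, k, vocab)

    logits = MultiTokenHead(d_model=256, vocab=32000)(torch.randn(2, 256))
    proposed = logits.argmax(-1)                      # k proposed future tokens per example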
Multi-view learning leverages multiple representations or modalities of data to improve model robustness and performance.
Learn from different views of data.
Multilingual models handle multiple languages through diverse training data.
Single model for many language pairs.
Pre-train on many languages together.
Compress multimodal information.
Reasoning across modalities.
Combine information from modalities.
Combine text, audio, and visual cues.
Multimodal transformers process audio and visual sequences with cross-modal attention mechanisms.
Translate between modalities.
Diffusion over categorical distributions.
Multitask instruction learning trains on diverse tasks simultaneously, improving generalization.
Multivariate temporal point processes model interdependent event sequences across multiple event types with cross-excitation.
The Murphy yield model incorporates defect clustering through an alpha parameter that represents the degree of spatial correlation.
Muse uses masked generative transformers for parallel image generation from text.
Music Transformer applies relative positional encoding to transformers enabling generation of long expressive musical sequences.
Multiple models learn from each other.
Mutually exciting point processes model how events of one type trigger events of other types.
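For the two point-process entries above, the standard multivariate Hawkes intensity with exponential cross-excitation kernels can be written as follows (the symbols are the conventional ones, not taken from the entries):

    \lambda_i(t) = \mu_i + \sum_{j=1}^{K} \sum_{t_k^{j} < t} \alpha_{ij}\, e^{-\beta_{ij}\,(t - t_k^{j})}

Here \mu_i is the base rate of event type i, \alpha_{ij} is how strongly a type-j event excites type i, and \beta_{ij} is how quickly that excitation decays.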
N-BEATS is a neural basis expansion analysis architecture for interpretable time series forecasting, built from stacks of blocks with backward (backcast) and forward (forecast) residual links.
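A heavily simplified PyTorch sketch of the doubly residual backcast/forecast stacking this entry refers to; block internals, depth, and sizes are illustrative assumptions.

    import torch
    import torch.nn as nn

    class Block(nn.Module):
        def __init__(self, lookback, horizon, hidden=128):
            super().__init__()
            self.body = nn.Sequential(nn.Linear(lookback, hidden), nn.ReLU())
            self.backcast = nn.Linear(hidden, lookback)
            self.forecast = nn.Linear(hidden, horizon)

        def forward(self, x):
            h = self.body(x)
            return self.backcast(h), self.forecast(h)

    class NBeatsLike(nn.Module):
        # Doubly residual stacking: each block explains part of the input
        # (backcast) and adds its contribution to the final forecast.
        def __init__(self, lookback, horizon, n_blocks=3):
            super().__init__()
            self.blocks = nn.ModuleList(Block(lookback, horizon) for _ in range(n_blocks))

        def forward(self, x):
            forecast = 0
            for block in self.blocks:
                back, fore = block(x)
                x = x - back                 # residual passed to the next block
                forecast = forecast + fore
            return forecast

    y = NBeatsLike(lookback=48, horizon=12)(torch.randn(4, 48))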
Naive Bayes is a simple probabilistic classifier that assumes feature independence; it is fast and commonly used as a baseline.
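A minimal usage sketch with scikit-learn's Gaussian variant on illustrative synthetic data:

    from sklearn.naive_bayes import GaussianNB
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    # Gaussian Naive Bayes: class-conditional features are assumed independent
    # and normally distributed, so fitting reduces to per-class means/variances.
    X, y = make_classification(n_samples=500, n_features=10, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    clf = GaussianNB().fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))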
Replace names to test bias.
Cell-based neural architecture search discovers repeatable computational blocks that are stacked to form full networks.
NAS-Bench provides standardized benchmarks with pre-computed architecture performance metrics to enable reproducible and efficient NAS research.
Reinforcement learning agents for NAS explore architecture spaces using policy gradients to maximize validation performance.
Neural Architecture Search Without Training uses gradient magnitude statistics at initialization to predict architecture performance.
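The "gradient magnitude statistics at initialization" in this entry can be illustrated with a SNIP/grad-norm-style zero-cost proxy; whether this is the exact statistic intended is an assumption, and every name below is illustrative.

    import torch
    import torch.nn as nn

    def gradient_score(model: nn.Module, batch: torch.Tensor, targets: torch.Tensor) -> float:
        # Score an untrained candidate by the size of its gradients on one
        # minibatch at initialization; higher scores are taken as a cheap
        # predictor of trainability, so no candidate is ever trained.
        loss = nn.functional.cross_entropy(model(batch), targets)
        loss.backward()
        score = sum(p.grad.abs().sum().item() for p in model.parameters() if p.grad is not None)
        model.zero_grad()
        return score

    candidate = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 64), nn.ReLU(), nn.Linear(64, 10))
    s = gradient_score(candidate, torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,)))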