AttentionNAS discovers efficient attention mechanisms and architectures jointly through neural architecture search.
AttentiveNAS uses attention mechanisms to predict architecture performance from training statistics without full training.
Edit specific image attributes.
Attribute predictions to components.
Create music, speech, or sound effects using AI.
Fill in missing or corrupted portions of audio.
Match audio and video.
Learn associations between audio and vision.
Learn from audio and video together.
Use both audio and lip movements.
Determine if audio and video are synced.
Add extra dimensions for more expressive dynamics.
Auto-vectorization automatically generates SIMD code from scalar operations.
Ensemble of attacks for robust evaluation.
Autoencoder-based forecasting learns compressed representations of time series for prediction.
Autoencoders detect anomalies through reconstruction error: normal patterns are reconstructed more accurately than anomalies.
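A minimal sketch of reconstruction-error anomaly detection, using a linear autoencoder (equivalent to PCA) so it needs only NumPy; the data, latent size, and threshold percentile are illustrative choices, not a prescribed setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Normal" training data lies near a 2-D subspace of 10-D space.
normal = rng.normal(size=(500, 2)) @ rng.normal(size=(2, 10))
anomaly = rng.normal(size=(5, 10)) * 3.0  # off-subspace points

# Linear autoencoder: encode/decode with the top-k principal components.
mean = normal.mean(axis=0)
_, _, vt = np.linalg.svd(normal - mean, full_matrices=False)
components = vt[:2]  # k = 2 latent dimensions

def reconstruction_error(x):
    z = (x - mean) @ components.T  # encode to the latent space
    x_hat = z @ components + mean  # decode back to input space
    return np.linalg.norm(x - x_hat, axis=1)

# Threshold on errors seen for normal data; larger errors => anomalies.
threshold = np.percentile(reconstruction_error(normal), 99)
flags = reconstruction_error(anomaly) > threshold
```

A nonlinear autoencoder (e.g. in PyTorch) follows the same recipe: train on normal data only, then flag inputs whose reconstruction error exceeds a threshold calibrated on normal data.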
Autoformer for time series decomposes trends and seasonal components using an auto-correlation mechanism instead of point-wise attention.
AutoFormer (distinct from the time-series Autoformer) applies NAS to vision transformer design, discovering efficient attention patterns and layer configurations.
AutoGen facilitates multi-agent conversations for complex task solving.
AutoGPT demonstrates autonomous agent capabilities through iterative goal pursuit.
Autonomous agent that breaks goals into tasks.
Fix bugs automatically.
Fully automated content filtering.
Intelligently remove less important parts of context.
Automatically decide which operations to run in lower precision.
Automated machine learning: automatically select models, hyperparameters, and features.
Autonomous agents operate independently making decisions without human intervention.
Autonomous maintenance empowers operators to perform routine maintenance preventing deterioration.
Operators perform basic maintenance.
Autoregressive models detect anomalies through likelihood ratios or prediction errors on test sequences.
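A small sketch of the prediction-error variant: fit an AR(2) model by least squares on a clean series, then flag test points whose one-step-ahead prediction error is unusually large. The series, model order, and 5-sigma threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a clean AR(2) series to fit on.
n = 400
series = np.zeros(n)
for t in range(2, n):
    series[t] = 0.6 * series[t - 1] - 0.3 * series[t - 2] + rng.normal(scale=0.1)

# Least-squares fit of the AR(2) coefficients.
X = np.column_stack([series[1:-1], series[:-2]])
y = series[2:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def anomaly_scores(x, coef):
    # One-step-ahead prediction error at each time step.
    pred = coef[0] * x[1:-1] + coef[1] * x[:-2]
    return np.abs(x[2:] - pred)

test = series.copy()
test[200] += 2.0  # inject a spike
scores = anomaly_scores(test, coef)
threshold = scores.mean() + 5 * scores.std()
flagged = np.where(scores > threshold)[0] + 2  # offset back to series index
```

The likelihood-ratio variant replaces the raw error with the ratio of sequence likelihoods under models fit to normal vs. test data, but the thresholding logic is the same.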
Combine autoregressive and diffusion approaches.
Flow models for sequential data.
Automatically learn slimmable networks.
AutoTVM automatically tunes operator implementations through machine learning-guided search.
Availability rate is actual operating time divided by planned production time.
Availability is percentage of time equipment is operational when needed.
Fraction of time tool is operational.
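The availability definitions above reduce to one ratio; the shift length and downtime figures below are illustrative.

```python
def availability(operating_time, planned_production_time):
    """Availability = actual operating time / planned production time."""
    return operating_time / planned_production_time

# 8-hour planned shift (480 min) with 60 min of unplanned downtime.
rate = availability(480 - 60, 480)  # 0.875, i.e. 87.5% availability
```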
Approved Vendor List documents qualified suppliers authorized for specific materials or services, ensuring consistent supply quality.
BabyAGI creates and executes task lists autonomously pursuing objectives.
Task-driven autonomous agent with priority queue.
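A toy sketch of the task-driven loop with a priority queue; the `execute` function is a stand-in for an LLM call, and the task names are invented, not BabyAGI's actual code.

```python
import heapq

def execute(task):
    # Hypothetical stand-in: a real agent would call an LLM here and
    # parse its output into follow-up tasks.
    if task == "research topic":
        return ["draft outline", "collect sources"]
    return []

# (priority, task) pairs; lower number = more urgent.
queue = [(1, "research topic")]
done = []
while queue:
    priority, task = heapq.heappop(queue)
    done.append(task)
    # Follow-up tasks inherit a lower priority than their parent.
    for follow_up in execute(task):
        heapq.heappush(queue, (priority + 1, follow_up))
```

The loop terminates when no task generates further work; a real agent would also re-prioritize the queue against its overall objective.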
Hidden trigger in training data that causes specific malicious behavior.
Insert triggers causing misclassification.
Learn background appearance.
Backorders are unfulfilled customer orders awaiting material availability.
Back-translation augments text data by translating to another language and back, generating paraphrases for training.
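A schematic sketch of the round trip. The two "translators" here are toy word-substitution stubs standing in for real MT models (a real pipeline would run, e.g., EN to FR and back with actual translation systems).

```python
def to_pivot(text):
    # Toy stand-in for an EN -> pivot-language translator.
    return text.replace("big", "grand").replace("movie", "film")

def from_pivot(text):
    # Toy stand-in for the pivot -> EN translator; round-tripping
    # through it yields near-synonyms rather than the original words.
    return text.replace("grand", "large").replace("film", "movie")

def back_translate(text):
    return from_pivot(to_pivot(text))

original = "a big movie premiere"
paraphrase = back_translate(original)  # "a large movie premiere"
```

The augmented training set pairs each original sentence with its paraphrase under the same label.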
Backward planning works from goal to current state identifying prerequisite steps.
Backward scheduling starts from due date working backward to determine start times.
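A minimal sketch of backward scheduling: the last task ends at the due date, and each earlier task ends when its successor starts. The task names and durations are illustrative.

```python
from datetime import date, timedelta

def backward_schedule(due_date, tasks):
    """Walk the task list in reverse from the due date, returning
    (name, start, end) tuples in forward order."""
    schedule = []
    end = due_date
    for name, duration_days in reversed(tasks):
        start = end - timedelta(days=duration_days)
        schedule.append((name, start, end))
        end = start  # the previous task must finish by this start
    return list(reversed(schedule))

tasks = [("fabricate", 5), ("assemble", 3), ("test", 2)]
plan = backward_schedule(date(2024, 6, 20), tasks)
# Total work is 10 days, so fabrication must start on 2024-06-10.
```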
Molecular representation.
Train models on bootstrap samples.
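A compact sketch of bagging for regression: each "model" is a least-squares slope fit on a bootstrap sample (drawn with replacement), and the ensemble prediction averages the fits. The data-generating slope of 2.0 is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic data: y = 2x + noise.
x = rng.normal(size=200)
y = 2.0 * x + rng.normal(scale=0.5, size=200)

slopes = []
for _ in range(50):
    idx = rng.integers(0, len(x), size=len(x))  # bootstrap sample
    xb, yb = x[idx], y[idx]
    slopes.append((xb @ yb) / (xb @ xb))  # least-squares slope on the sample

bagged_slope = np.mean(slopes)  # ensemble average, close to 2.0
```

With unstable base learners (e.g. decision trees), averaging over bootstrap samples reduces variance; for classification the average becomes a majority vote.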
Baichuan is a Chinese open-source LLM with strong Chinese-language understanding.