
AI Factory Glossary

189 technical terms and definitions


t-closeness, training techniques

T-closeness requires that the distribution of a sensitive attribute within each group stays within a distance t of its distribution in the overall table.
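A minimal sketch of the idea: compare a group's sensitive-attribute distribution against the whole table's and check it stays within t. The original formulation uses Earth Mover's Distance; this example substitutes total variation distance to stay dependency-free, and the disease values are hypothetical.

```python
from collections import Counter

def t_closeness_tvd(group_values, table_values, t):
    """Check a t-closeness-style condition using total variation
    distance (the paper uses Earth Mover's Distance instead)."""
    def dist(values):
        n = len(values)
        return {k: v / n for k, v in Counter(values).items()}
    p, q = dist(group_values), dist(table_values)
    keys = set(p) | set(q)
    # Total variation distance between the two distributions.
    tvd = 0.5 * sum(abs(p.get(k, 0) - q.get(k, 0)) for k in keys)
    return tvd <= t

# A group containing only 'flu' diverges from a 50/50 overall table.
ok = t_closeness_tvd(['flu', 'flu'], ['flu', 'flu', 'cold', 'cold'], t=0.5)
```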

t0, t0, foundation model

T5-based model fine-tuned on multitask prompts for zero-shot instruction following.

t0, t0, training techniques

T0 trains models on prompted datasets converted to unified text-to-text format.

t2i-adapter, generative models

Lightweight adapter that adds spatial conditions such as sketches, depth, or pose to frozen text-to-image diffusion models.

t5 (text-to-text transfer transformer), t5, text-to-text transfer transformer, foundation model

Treats all NLP tasks as text-to-text problems.

tail-free sampling, tfs, text generation

Truncates the low-probability tail of the next-token distribution, using the second derivative of the sorted probabilities to locate where the tail begins.
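A minimal sketch of tail-free sampling, assuming a small probability vector: sort probabilities, take the absolute second difference of the sorted curve, and keep tokens until the cumulative curvature mass exceeds the threshold z. The example distribution is hypothetical.

```python
import numpy as np

def tail_free_sample(probs, z=0.95, rng=None):
    """Tail-free sampling sketch: drop the flat low-probability tail."""
    rng = rng or np.random.default_rng(0)
    order = np.argsort(probs)[::-1]          # indices, highest prob first
    sorted_p = probs[order]
    # Curvature of the sorted probability curve (second finite difference).
    d2 = np.abs(np.diff(sorted_p, n=2))
    if d2.sum() == 0:                        # flat curve: keep everything
        return rng.choice(len(probs), p=probs)
    cum = np.cumsum(d2 / d2.sum())
    k = int(np.searchsorted(cum, z)) + 1     # tokens to keep
    keep = order[:max(k, 1)]
    p = np.zeros_like(probs)
    p[keep] = probs[keep]
    p /= p.sum()                             # renormalize survivors
    return rng.choice(len(probs), p=p)

token = tail_free_sample(np.array([0.7, 0.2, 0.05, 0.03, 0.02]), z=0.95)
```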

take-back program, environmental & sustainability

Take-back programs enable manufacturers to collect used products for recycling or proper disposal.

talking heads attention, transformer

Adds learned linear projections that mix information across attention heads before and after the softmax.

task allocation, ai agents

Task allocation assigns responsibilities to agents based on capabilities and load.

task arithmetic, model merging

Edits model behavior by adding or subtracting task vectors, the weight differences between fine-tuned and base models.
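The arithmetic can be sketched with toy weight vectors (all values hypothetical): a task vector is the fine-tuned weights minus the base weights; adding it grants a skill, subtracting it steers away from one.

```python
import numpy as np

# Hypothetical flattened weights for a base model and two fine-tunes.
base = np.array([0.1, 0.2, 0.3])
finetuned_a = np.array([0.3, 0.1, 0.5])   # fine-tuned on task A
finetuned_b = np.array([0.0, 0.4, 0.3])   # fine-tuned on task B

tau_a = finetuned_a - base                # task vector for A
tau_b = finetuned_b - base                # task vector for B

multi_task = base + tau_a + tau_b         # add both skills
negated = base - tau_a                    # steer away from task A
```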

task decomposition, ai agents

Task decomposition breaks complex goals into manageable subtasks.

task diversity, training techniques

Task diversity in instruction tuning exposes models to varied problem types.

task-specific pre-training, transfer learning

Pre-trains on data or objectives tailored to a specific downstream task before fine-tuning.

taylor expansion pruning, model optimization

Taylor expansion pruning approximates loss change from removing weights using Taylor series.
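A minimal sketch of the first-order variant, where the loss change from zeroing a weight is approximated by |gradient × weight|; the weight and gradient values here are hypothetical.

```python
import numpy as np

def taylor_importance(weights, grads):
    """First-order Taylor importance: the loss change from zeroing
    weight w_i is approximated by |g_i * w_i|."""
    return np.abs(np.asarray(grads) * np.asarray(weights))

w = np.array([0.5, -1.2, 0.01, 0.8])   # hypothetical weights
g = np.array([0.1, 0.02, 2.0, -0.5])   # hypothetical gradients
scores = taylor_importance(w, g)
# Prune the least-important half by score.
prune_mask = scores < np.quantile(scores, 0.5)
```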

tbats, tbats, time series models

TBATS combines Box-Cox transformation, Fourier-based seasonality, ARMA errors, and trend components for complex seasonal time series.

tcad model parameters, tcad, simulation

Physical parameters used in device/process simulation.

tcn, tcn, time series models

Temporal Convolutional Networks use dilated causal convolutions for sequence modeling with long effective history.

te-nas, te-nas, neural architecture search

Training-free NAS that combines multiple zero-cost indicators to rank architectures without training, improving evaluation reliability.

teacher-student cl, advanced training

Teacher-student curriculum learning uses a teacher model to assess sample difficulty and guide curriculum design for student training.

teacher-student framework, model compression

General knowledge-distillation paradigm in which a large teacher model supervises a smaller student.

teacher-student training, model optimization

Teacher-student training transfers knowledge from complex to simple models through soft targets.

teaching assistant, model compression

Intermediate-size model that bridges the capacity gap between teacher and student during distillation.

team training, internal course, playbook

Turn your team's knowledge into internal docs, playbooks, or mini-courses to speed up onboarding.

technical debt identification, code ai

Find areas needing refactoring.

technical debt, refactor, maintain

AI tech debt: hacky prompts, hardcoded logic, missing tests. Schedule time to refactor and maintain.

temperature calibration, ai safety

Adjusts the softmax temperature on held-out data so predicted probabilities better match observed accuracy.

temperature distillation, model optimization

Temperature parameter in distillation softens predictions revealing relative class probabilities.

temperature in distillation, model compression

Divides logits by a temperature T > 1 to soften probability distributions before distillation.
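The softening can be sketched with a temperature-scaled softmax; the logits below are hypothetical. Higher T spreads probability mass, exposing the relative similarity of non-target classes.

```python
import numpy as np

def softmax_with_temperature(logits, T=1.0):
    """Softmax with temperature: higher T yields a softer distribution."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()                  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

logits = [8.0, 2.0, 1.0]                          # hypothetical logits
hard = softmax_with_temperature(logits, T=1.0)    # near one-hot
soft = softmax_with_temperature(logits, T=4.0)    # reveals class similarity
```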

temperature-humidity-bias failure analysis, thb, failure analysis

Investigates failures induced by temperature-humidity-bias stress, such as corrosion and electrochemical migration.

temporal coding, neural architecture

Encodes information in the precise timing of spikes rather than firing rates.

temporal consistency, multimodal ai

Temporal consistency ensures smooth transitions and coherent motion across video frames.

temporal fusion transformer, time series models

Temporal Fusion Transformer combines LSTM encoder-decoder with multi-head attention for interpretable multi-horizon time series forecasting with covariates.

temporal graph networks, graph neural networks

Graph neural networks that model graphs whose structure and features evolve over time.

temporal information extraction, healthcare ai

Extracts time-related information from clinical text, such as symptom onset and treatment sequences.

temporal point process gnn, graph neural networks

Temporal point process GNNs model event sequences on graphs through learned intensity functions.

temporal point process, time series models

Temporal point processes model event sequences in continuous time by specifying conditional intensity functions governing event occurrence rates.

temporal random walk, graph neural networks

Temporal random walks respect edge timestamps when sampling paths for representation learning.

temporal smoothing, graph neural networks

Temporal smoothing in dynamic graphs regularizes learned representations to change gradually over time.

tensor decomposition for chemistry, chemistry ai

Represent chemical tensors efficiently.

tensor decomposition, model optimization

Tensor decomposition factorizes weight tensors reducing parameters while maintaining capacity.

tensor field network, graph neural networks

Tensor field networks achieve equivariance through tensor product operations on irreducible representations of rotation groups.

tensor fusion, multimodal ai

Fuses modalities via the outer product of their feature vectors, capturing all multiplicative interactions.
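A minimal sketch of the fusion step, assuming two modalities with hypothetical feature values: appending a constant 1 to each vector before the outer product keeps the unimodal features alongside the pairwise interactions in one tensor.

```python
import numpy as np

text_feat = np.array([0.5, -1.0])          # hypothetical text features
audio_feat = np.array([2.0, 0.0, 1.0])     # hypothetical audio features

# Append 1 so unimodal terms survive in the outer product.
t = np.concatenate([text_feat, [1.0]])
a = np.concatenate([audio_feat, [1.0]])

fused = np.outer(t, a)   # shape (3, 4): unimodal + pairwise interactions
```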

tensor parallelism, model training

Split individual tensors/layers across devices.

tensor train, model optimization

Tensor train decomposition chains matrices through successive products for efficient compression.

tensorboard, visualize, training

TensorBoard visualizes training: loss curves, histograms, and computation graphs.

tensorflow lite, model optimization

TensorFlow Lite provides lightweight runtime for mobile and embedded deployment with optimization tools.

tensorrt-llm, deployment

NVIDIA's optimized library for LLM inference.

tensorrt, model optimization

TensorRT optimizes trained models for NVIDIA GPUs through layer fusion, quantization, and kernel selection.

ternary gradients, distributed training

Quantizes gradients to {-1, 0, +1} times a shared scale, cutting communication cost in distributed training.
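A TernGrad-style sketch under simplifying assumptions: scale by the max magnitude and use stochastic rounding so the quantizer is unbiased in expectation. The gradient values are hypothetical.

```python
import numpy as np

def ternarize(grad, rng=None):
    """Quantize a gradient to {-1, 0, +1} * scale with stochastic
    rounding, so E[ternarize(g)] == g elementwise."""
    rng = rng or np.random.default_rng(0)
    s = np.abs(grad).max()                   # shared scale factor
    if s == 0:
        return np.zeros_like(grad)
    p = np.abs(grad) / s                     # keep-probability per entry
    mask = rng.random(grad.shape) < p        # stochastic rounding
    return s * np.sign(grad) * mask

q = ternarize(np.array([0.5, -2.0, 0.0]))   # entries land in {-2, 0, 2}
```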

ternary networks, model optimization

Ternary networks use three-level weights providing expressiveness between binary and full precision.