Context precision (RAG)
Context precision quantifies the proportion of retrieved context that is useful.
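A minimal sketch of computing context precision, assuming each retrieved chunk carries a boolean relevance judgment (e.g. from an annotator or an LLM judge):

    def context_precision(relevance):
        # Fraction of retrieved chunks judged relevant to the question.
        if not relevance:
            return 0.0
        return sum(relevance) / len(relevance)

    print(context_precision([True, True, False, True]))  # 3 of 4 chunks useful -> 0.75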
Predict spatial relationships.
Predict spatial relationships between image patches.
Remove less relevant context.
Context pruning selects the most relevant portions of retrieved text.
Context recall measures whether all necessary information was retrieved.
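A matching sketch for context recall, assuming the reference answer is split into claims and each claim is checked for support in the retrieved context:

    def context_recall(claim_supported):
        # Fraction of needed claims the retrieved context actually covers.
        if not claim_supported:
            return 0.0
        return sum(claim_supported) / len(claim_supported)

    print(context_recall([True, True, False]))  # 2 of 3 needed facts retrieved -> 0.67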
Are retrieved docs relevant to the query?
Context relevance evaluates whether retrieved documents support answers.
Techniques to handle inputs longer than the training context.
Context window management maximizes information density within token limits.
Handle limited context length.
Manage long conversations: truncate old messages, summarize history, or use RAG for memory.
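A minimal truncation sketch, assuming chat messages as dicts and a count_tokens() helper (both hypothetical):

    def truncate_history(messages, count_tokens, budget=4096):
        # Keep the newest turns; drop the oldest once the token budget is exceeded.
        kept, total = [], 0
        for msg in reversed(messages):
            cost = count_tokens(msg["content"])
            if total + cost > budget:
                break
            kept.append(msg)
            total += cost
        return list(reversed(kept))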
Context-aware recommendations incorporate situational factors like time, location, or device.
Use situational context.
Context length = max tokens model can see in one request (prompt + answer). Larger context = longer documents and multi-step conversations in one shot.
ContextNet is a fully convolutional architecture using squeeze-and-excitation modules for efficient on-device speech recognition.
Contextual augmentation uses language models to replace words with contextually appropriate alternatives for text augmentation.
Bandit with context/features for each decision.
Contextual bandits frame recommendation as sequential decision-making with context-dependent rewards for online learning.
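A sketch of one classic contextual-bandit algorithm (LinUCB with a disjoint linear model per arm); the exploration weight alpha and the dimensions are illustrative:

    import numpy as np

    class LinUCB:
        def __init__(self, n_arms, dim, alpha=1.0):
            self.alpha = alpha
            self.A = [np.eye(dim) for _ in range(n_arms)]    # per-arm design matrices
            self.b = [np.zeros(dim) for _ in range(n_arms)]  # per-arm reward vectors

        def select(self, x):
            # Choose the arm with the highest upper confidence bound for context x.
            scores = []
            for A, b in zip(self.A, self.b):
                A_inv = np.linalg.inv(A)
                theta = A_inv @ b
                scores.append(x @ theta + self.alpha * np.sqrt(x @ A_inv @ x))
            return int(np.argmax(scores))

        def update(self, arm, x, reward):
            self.A[arm] += np.outer(x, x)
            self.b[arm] += reward * x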
Contextual compression removes irrelevant parts of retrieved documents.
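A minimal compression sketch that drops low-similarity sentences, assuming a hypothetical embed() helper that returns vectors and using cosine similarity as the relevance score:

    import numpy as np

    def compress(query, sentences, embed, threshold=0.3):
        # Keep only sentences whose cosine similarity to the query clears the threshold.
        q = embed(query)
        kept = []
        for s in sentences:
            v = embed(s)
            sim = float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v)))
            if sim >= threshold:
                kept.append(s)
        return " ".join(kept)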
Contextual decomposition attributes predictions to phrases or word combinations.
Generate embeddings that include surrounding context for better retrieval.
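A sketch with a Hugging Face encoder (the model choice and mean pooling are illustrative); the same word gets different vectors in different sentences:

    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    def embed(sentence):
        inputs = tokenizer(sentence, return_tensors="pt")
        with torch.no_grad():
            hidden = model(**inputs).last_hidden_state  # (1, seq_len, dim)
        return hidden.mean(dim=1).squeeze(0)            # mean-pooled sentence vector

    river = embed("She sat on the river bank.")          # "bank" as a landform
    money = embed("He paid the cheque in at the bank.")  # "bank" as an institution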
Contingency tables cross-tabulate categorical variables for independence testing.
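A sketch of the usual chi-square independence test on a 2x2 contingency table (the counts are made up):

    import numpy as np
    from scipy.stats import chi2_contingency

    # Rows: group A / group B; columns: outcome yes / no (illustrative counts).
    table = np.array([[30, 10],
                      [20, 40]])
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2={chi2:.2f}, p={p:.4f}, dof={dof}")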
Adapt models on edge devices over time.
Continual learning updates models on new data without forgetting old knowledge. Challenge: catastrophic forgetting.
Training strategy to learn new tasks sequentially without forgetting old ones.
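A minimal rehearsal/replay sketch, one common defence against catastrophic forgetting; the batch size and replay fraction are illustrative:

    import random

    def make_batch(new_examples, replay_buffer, batch_size=32, replay_fraction=0.25):
        # Mix a few stored old-task examples into every new-task batch
        # (assumes new_examples holds at least batch_size items).
        n_replay = min(int(batch_size * replay_fraction), len(replay_buffer))
        batch = random.sample(new_examples, batch_size - n_replay)
        batch += random.sample(replay_buffer, n_replay)
        random.shuffle(batch)
        return batch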
Adapt continuously to test stream.
Continue is an open-source AI coding assistant for IDEs (VS Code, JetBrains).
Continuity chains verify metal line integrity through long serpentine resistors detecting opens.
Conservation of carriers.
Dynamically batch requests.
Continuous batching processes sequences of varying lengths efficiently by dynamic grouping.
Dynamically batch requests as they arrive instead of fixed batches.
Continuous batching adds new requests to running batch as slots free up. Maximizes GPU utilization for inference.
Continuous batching adds/removes requests dynamically. No waiting for batch completion. Higher throughput.
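A toy continuous-batching loop, assuming a hypothetical step_fn() that runs one decode step for the whole batch and returns the requests that finished:

    from collections import deque

    def serve(waiting: deque, step_fn, max_batch=8):
        running = []
        while waiting or running:
            # Refill any free slots from the queue before every decode step.
            while waiting and len(running) < max_batch:
                running.append(waiting.popleft())
            finished = step_fn(running)  # one decode step for all running requests
            running = [r for r in running if r not in finished]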
Continuous flow processes work without interruption, maintaining constant movement through operations.
Ongoing incremental improvement.
Flows defined by ODEs.
Continuous-filter convolutions use radial basis functions to model distance-dependent interactions in molecular graphs.
Models operating in continuous time.
Natural language inference on contracts.
Automatically review contracts.
AI drafts contracts. NDA, terms, clauses. Always have a lawyer review.
Contrast with weaker model.
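This is the idea behind contrastive decoding; a sketch of the scoring rule, assuming next-token log-probabilities from a strong "expert" and a weaker "amateur" model are already computed:

    import numpy as np

    def contrastive_scores(expert_logprobs, amateur_logprobs, alpha=0.1):
        # Keep only tokens the expert finds plausible (plausibility cutoff), then
        # rank them by expert-minus-amateur log-probability.
        cutoff = np.log(alpha) + expert_logprobs.max()
        scores = expert_logprobs - amateur_logprobs
        scores[expert_logprobs < cutoff] = -np.inf
        return scores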
Training algorithm for EBMs.
Contrastive divergence approximates maximum likelihood gradient for energy-based models using short MCMC chains from data points.
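A sketch of one CD-1 gradient step for a binary RBM (NumPy arrays; the single Gibbs step starts at the data batch v0):

    import numpy as np

    def cd1_weight_gradient(W, b_v, b_h, v0, rng=None):
        # Positive phase minus negative phase after one Gibbs step,
        # averaged over the batch: an approximate log-likelihood gradient for W.
        rng = rng or np.random.default_rng()
        sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
        h0_prob = sigmoid(v0 @ W + b_h)
        h0 = (rng.random(h0_prob.shape) < h0_prob).astype(float)
        v1_prob = sigmoid(h0 @ W.T + b_v)
        h1_prob = sigmoid(v1_prob @ W + b_h)
        return (v0.T @ h0_prob - v1_prob.T @ h1_prob) / len(v0)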
Show both positive and negative examples.
Explain why this class and not another.
Contrastive explanations highlight differences between actual and foil outputs.
Learn representations of defect types.