K-anonymity ensures each record is indistinguishable from at least k-1 other records with respect to its quasi-identifying attributes.
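K-anonymity is easy to check mechanically: group records by their quasi-identifier columns and take the smallest group size. A minimal Python sketch (the column names and rows below are illustrative):

```python
from collections import Counter

def k_anonymity(records, quasi_ids):
    """Return the smallest equivalence-class size over the quasi-identifier columns."""
    groups = Counter(tuple(r[c] for c in quasi_ids) for r in records)
    return min(groups.values())

# Illustrative generalized records: zip truncated, age bucketed.
rows = [
    {"zip": "021**", "age": "30-39", "diagnosis": "flu"},
    {"zip": "021**", "age": "30-39", "diagnosis": "cold"},
    {"zip": "021**", "age": "40-49", "diagnosis": "flu"},
    {"zip": "021**", "age": "40-49", "diagnosis": "asthma"},
]
print(k_anonymity(rows, ["zip", "age"]))  # each class has 2 rows -> 2
```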
K-means partitions data into a specified number of clusters, minimizing within-cluster variance.
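Lloyd's algorithm, the standard way to fit k-means, alternates two steps: assign each point to its nearest center, then move each center to the mean of its assigned points. A minimal NumPy sketch (it assumes no cluster ever ends up empty):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    # Initialize centers as k distinct data points.
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assignment step: nearest center by squared Euclidean distance.
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        # Update step: each center moves to the mean of its points.
        centers = np.array([X[labels == j].mean(0) for j in range(k)])
    return centers, labels
```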
K-out-of-n redundancy requires that at least k of n components function for the system to work.
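For independent components, the reliability of a k-out-of-n system follows directly from the binomial distribution: sum the probabilities of exactly i working components for i from k to n. A quick sketch (the component reliability p is a stand-in value):

```python
from math import comb

def k_of_n_reliability(k, n, p):
    """Probability that at least k of n independent components,
    each working with probability p, are functional."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# 2-out-of-3 system with 90%-reliable components.
print(k_of_n_reliability(2, 3, 0.9))  # 0.972
```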
The k-dimensional Weisfeiler-Lehman test extends the standard WL test, providing stronger graph isomorphism testing.
Kaggle hosts machine learning competitions and provides datasets, shared notebooks, and a practitioner community.
Kaizen events are focused improvement workshops that target specific problems over a short duration.
Kaizen suggestions are incremental improvement proposals from frontline workers.
Kaizen is a philosophy of continuous incremental improvement that engages all employees.
The Kalman filter provides optimal linear state estimation for Gaussian state-space models through recursive prediction and update steps.
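For a one-dimensional random-walk state, the predict/update recursion reduces to a few lines. A sketch, with process noise q and measurement noise r as illustrative values:

```python
def kalman_1d(zs, q=1e-3, r=0.25, x0=0.0, p0=1.0):
    """1D random-walk model: predict, then update with each measurement z."""
    x, p, estimates = x0, p0, []
    for z in zs:
        p = p + q               # predict: variance grows by process noise
        k = p / (p + r)         # Kalman gain: trust in the new measurement
        x = x + k * (z - x)     # update: correct by the measurement residual
        p = (1 - k) * p         # update: variance shrinks after measuring
        estimates.append(x)
    return estimates
```

Feeding it a constant signal shows the estimate converging from the prior x0 toward the true value as the gain adapts.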
A kanban is a visual signal used for production control.
Kanban cards signal the need for material replenishment in pull production systems.
Kanban is a pull-based inventory management system using visual signals to trigger material replenishment based on consumption.
The kappa statistic measures inter-rater agreement while accounting for agreement expected by chance.
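Cohen's kappa for two raters can be computed directly: observed agreement p_o, chance agreement p_e from the marginal label frequencies, then (p_o - p_e) / (1 - p_e). A small sketch:

```python
def cohens_kappa(a, b):
    """Agreement between two raters' label lists, corrected for chance."""
    labels = set(a) | set(b)
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n  # observed agreement
    pe = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)  # chance
    return (po - pe) / (1 - pe)

print(cohens_kappa(["y", "y", "n", "n"], ["y", "y", "n", "n"]))  # 1.0
```

Perfect agreement yields 1.0, while agreement no better than chance yields 0.0.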
A keep-out zone is a region around a through-silicon via (TSV) from which active devices are excluded to avoid stress-induced effects.
A Kelvin connection is a four-terminal measurement configuration that eliminates probe and lead resistance from the reading.
A Kelvin probe measures work function and surface potential.
Kelvin probe force microscopy provides non-contact work function measurement at high spatial resolution.
Kelvin probing uses four-terminal sensing with separate current and voltage probes to eliminate contact resistance in resistance measurements.
Keras is TensorFlow's high-level API, designed for ease of use.
Kernel fusion combines multiple adjacent GPU operations into a single kernel, eliminating intermediate memory writes and reducing memory traffic; it is a major optimization for memory-bound workloads.
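The idea behind kernel fusion can be illustrated even in plain Python: the unfused version materializes an intermediate list that the fused version never creates (on a GPU the saving is the intermediate tensor's round trip through device memory). The function names are illustrative:

```python
def scale_then_shift_unfused(xs, a, b):
    tmp = [a * x for x in xs]       # first "kernel": intermediate written out
    return [t + b for t in tmp]     # second "kernel": re-reads the intermediate

def scale_then_shift_fused(xs, a, b):
    return [a * x + b for x in xs]  # one fused pass, no intermediate buffer

print(scale_then_shift_fused([1, 2, 3], 2, 1))  # [3, 5, 7]
```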
A kernel launch configuration sets the grid and block dimensions for a GPU kernel invocation.
Kernel profiling measures the performance of individual GPU kernels.
Key-value memory views attention as a memory lookup: queries are matched against keys to retrieve values.
Keyframe extraction chooses representative frames from a video.
Knowledge Graph Attention Network combines collaborative filtering with knowledge graph embedding using attention mechanisms.
The killer defect size is the minimum defect size that causes device failure.
Killer defects are flaws large enough, or critically located enough, to cause immediate device failure, directly reducing yield.
Kinetic Monte Carlo simulates the stochastic time evolution of processes such as diffusion and film growth.
The kink effect is a sudden change in a transistor's I-V curve caused by charge accumulation in the floating body of SOI devices.
Kirkendall voids form at bond interfaces due to differential diffusion rates, weakening interconnections.
A kNN language model (kNN-LM) augments a language model with nearest-neighbor retrieval from a datastore of cached context representations.
K-nearest neighbors classifies a point by the labels of its closest training examples; there is no training phase, only lookup at inference time.
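A minimal kNN classifier is just a sorted distance lookup plus a majority vote. A sketch with squared Euclidean distance (the toy training points are illustrative):

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (features, label) pairs; majority vote among k nearest."""
    def sq_dist(p, q):
        return sum((pi - qi) ** 2 for pi, qi in zip(p, q))
    nearest = sorted(train, key=lambda fl: sq_dist(fl[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

train = [((0, 0), "a"), ((0, 1), "a"), ((5, 5), "b"), ((5, 6), "b")]
print(knn_predict(train, (0, 0.5), k=3))  # a
```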
Knowledge distillation trains a compact student model to mimic a larger teacher model's predictions; approaches differ in what is matched (output distributions, intermediate features, or inter-example relations), and a common use is compressing large models for edge deployment.
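The core distillation training signal can be sketched as a KL divergence between temperature-softened teacher and student output distributions (the temperature T=2 is an illustrative choice; the T^2 factor is the standard gradient rescaling):

```python
import numpy as np

def softmax(z, T=1.0):
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def distill_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * np.log(p / q))) * T * T

# Identical logits -> zero loss; the loss grows as the student diverges.
print(distill_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 0.0
```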
Knowledge editing modifies specific facts in a model, updating its factual knowledge without full retraining.
Knowledge extraction draws targeted knowledge out of a model or corpus without replicating the whole system.
Knowledge freshness describes how up-to-date a model's or system's information is.
Knowledge graph embeddings map entities and relations to vectors, enabling link prediction and reasoning over the graph.
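One common embedding family scores triples by vector arithmetic; in TransE, for example, a triple (h, r, t) is considered plausible when h + r ≈ t, so lower distance means a more likely link. A toy sketch with made-up 3-dimensional vectors:

```python
import numpy as np

def transe_score(h, r, t):
    """TransE link score: lower L2 distance = more plausible triple."""
    return float(np.linalg.norm(h + r - t))

h = np.array([0.2, 0.1, 0.5])    # head entity embedding (illustrative)
r = np.array([0.3, 0.0, -0.1])   # relation embedding (illustrative)
t = h + r                        # a perfectly compatible tail embedding
print(transe_score(h, r, t))     # 0.0 -- an incompatible tail scores higher
```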
Knowledge graph-based recommendations leverage entity relationships for enhanced item understanding and cold-start handling.
Knowledge graph-to-text generation verbalizes knowledge graph content as natural language.