calibration

Keywords: calibration,metrology

Calibration adjusts tool measurements to match known standards, ensuring accuracy and traceability in semiconductor metrology.

Process: (1) measure a reference standard with a known value, (2) compare the indicated value to the certified value, (3) calculate offset/gain corrections, (4) apply the corrections to the tool's algorithms, (5) verify with an independent standard.

Types: (1) Zero/offset calibration corrects systematic bias; (2) Gain/span calibration corrects sensitivity across the measurement range; (3) Linearity calibration applies multi-point correction across the range; (4) Cross-talk calibration corrects interference between measurement channels.

Frequency: daily for critical tools, weekly for stable tools, after preventive maintenance (PM), and after major component replacement.

Calibration hierarchy: primary standards (national labs) → secondary standards (accredited labs) → working standards (fab).

Documentation: calibration certificates, measurement uncertainty, traceability chain, validity period.

SPC on calibration data: monitor bias drift and detect tool degradation.

Auto-calibration: built-in routines using internal references (e.g., CD-SEM stage calibration using pitch standards, ellipsometer checks against a known oxide).

Out-of-calibration response: quarantine the tool, recalibrate, and remeasure affected wafers.

Calibration maintains the measurement accuracy essential for process control, specification compliance, and cross-tool matching.
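The offset/gain correction in steps (2)-(3) can be sketched as a two-point calibration: fit a gain and offset so the tool's indicated readings at two certified reference points map onto their certified values. This is a minimal illustrative sketch; the function names and example readings are hypothetical, not any specific tool's API.

```python
def two_point_calibration(indicated_low, indicated_high, certified_low, certified_high):
    """Compute gain and offset so that corrected = gain * indicated + offset
    maps the two indicated readings onto their certified values."""
    gain = (certified_high - certified_low) / (indicated_high - indicated_low)
    offset = certified_low - gain * indicated_low
    return gain, offset

def apply_correction(indicated, gain, offset):
    """Apply the linear correction to a raw indicated reading."""
    return gain * indicated + offset

# Hypothetical example: a CD tool reads 98.0 nm and 196.5 nm on
# certified 100 nm and 200 nm reference standards.
gain, offset = two_point_calibration(98.0, 196.5, 100.0, 200.0)
corrected = apply_correction(150.0, gain, offset)
```

A linearity calibration extends the same idea to more than two reference points, interpolating a correction across the full measurement range instead of assuming a single gain and offset.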
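SPC on calibration data might look like the following sketch: track the bias (indicated minus certified value) from each calibration run and flag points outside control limits, which would trigger the out-of-calibration response. The function name, center line, and k=3 sigma limits are illustrative assumptions, not a standard's requirement.

```python
def check_bias_drift(bias_history, center, sigma, k=3.0):
    """Return the indices of calibration bias values outside the
    center +/- k*sigma control limits (illustrative SPC rule)."""
    ucl = center + k * sigma  # upper control limit
    lcl = center - k * sigma  # lower control limit
    return [i for i, bias in enumerate(bias_history) if bias > ucl or bias < lcl]

# Hypothetical daily bias values (nm) against a 0.0 nm center, 0.3 nm sigma:
violations = check_bias_drift([0.1, -0.2, 0.0, 1.5], center=0.0, sigma=0.3)
# any index returned here would quarantine the tool for recalibration
```

In practice, additional run rules (e.g., trends of consecutive points) are often layered on top of the simple limit check to catch gradual tool degradation before a hard violation occurs.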
