
AI Factory Glossary

9,967 technical terms and definitions


hierarchical attention, transformer

Multi-level attention structure.

hierarchical clustering, manufacturing operations

Hierarchical clustering creates tree-structured groupings at multiple similarity levels.
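As a small illustration (a sketch using SciPy's agglomerative clustering; the toy data and cut levels are hypothetical), cutting the same dendrogram at different heights yields groupings at multiple similarity levels:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Toy feature vectors: two well-separated groups of five points each.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (5, 2)),
               rng.normal(5.0, 0.1, (5, 2))])

# Build the tree bottom-up with Ward linkage.
Z = linkage(X, method="ward")

# Cutting the tree at different levels gives coarser or finer groupings.
labels_2 = fcluster(Z, t=2, criterion="maxclust")
labels_4 = fcluster(Z, t=4, criterion="maxclust")
print(len(set(labels_2)))  # 2
```

The single `linkage` call encodes the whole hierarchy, so no re-clustering is needed to change granularity.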

hierarchical context, llm architecture

Multi-level context organization.

hierarchical federated learning, federated learning

Multi-level federation structure.

hierarchical fusion, multimodal ai

Multi-level fusion strategy.

hierarchical moe, moe

Multi-level expert organization.

hierarchical optimization, optimization

Optimize at different levels sequentially.

hierarchical planning, ai agents

Hierarchical planning operates at multiple abstraction levels from high-level goals to low-level actions.

hierarchical pooling, graph neural networks

Hierarchical pooling creates multi-resolution graph representations through successive coarsening operations.

hierarchical rl, reinforcement learning

Decompose tasks into subtasks.

hierarchical rl, reinforcement learning advanced

Hierarchical reinforcement learning decomposes tasks into subtasks with multiple levels of temporal abstraction.

hierarchical sampling, 3d vision

Sample coarse then fine.

hifi-gan, audio & speech

HiFi-GAN is a generative adversarial network vocoder that synthesizes high-fidelity audio efficiently using multi-scale discriminators.

high availability (ha),high availability,ha,reliability

System remains operational despite failures.

high bandwidth memory advanced, hbm, advanced packaging

Stacked DRAM with wide interface.

high dimensional optimization, bayesian optimization, gaussian process, response surface, doe, design of experiments, pareto optimization, robust optimization, surrogate modeling, tcad, run to run control

# Semiconductor Manufacturing Process Recipe Optimization: Mathematical Modeling

## 1. Problem Context

A semiconductor **recipe** is a vector of controllable parameters:

$$
\mathbf{x} = \begin{bmatrix} T \\ P \\ Q_1 \\ Q_2 \\ \vdots \\ t \\ P_{\text{RF}} \end{bmatrix} \in \mathbb{R}^n
$$

Where:

- $T$ = Temperature (°C or K)
- $P$ = Pressure (mTorr or Pa)
- $Q_i$ = Gas flow rates (sccm)
- $t$ = Process time (seconds)
- $P_{\text{RF}}$ = RF power (Watts)

**Goal**: Find optimal $\mathbf{x}$ such that output properties $\mathbf{y}$ meet specifications while accounting for variability.

## 2. Mathematical Modeling Approaches

### 2.1 Physics-Based (First-Principles) Models

#### Chemical Vapor Deposition (CVD) Example

**Mass transport and reaction equation:**

$$
\frac{\partial C}{\partial t} + \nabla \cdot (\mathbf{u}C) = D\nabla^2 C + R(C, T)
$$

Where:

- $C$ = Species concentration
- $\mathbf{u}$ = Velocity field
- $D$ = Diffusion coefficient
- $R(C, T)$ = Reaction rate

**Surface reaction kinetics (Arrhenius form):**

$$
k_s = A \exp\left(-\frac{E_a}{RT}\right)
$$

Where:

- $A$ = Pre-exponential factor
- $E_a$ = Activation energy
- $R$ = Gas constant
- $T$ = Temperature

**Deposition rate (transport-limited regime):**

$$
r = \frac{k_s C_s}{1 + \frac{k_s}{h_g}}
$$

Where:

- $C_s$ = Surface concentration
- $h_g$ = Gas-phase mass transfer coefficient

**Characteristics:**

- **Advantages**: Extrapolates outside training data, physically interpretable
- **Disadvantages**: Computationally expensive, requires detailed mechanism knowledge

### 2.2 Empirical/Statistical Models (Response Surface Methodology)

**Second-order polynomial model:**

$$
y = \beta_0 + \sum_{i=1}^{n}\beta_i x_i + \sum_{i=1}^{n}\beta_{ii}x_i^2 + \sum_{i<j}\beta_{ij}x_i x_j + \varepsilon
$$

| Challenge | Modeling approaches |
|:----------|:--------------------|
| High dimensionality ($> 50$ parameters) | PCA, PLS, sparse regression (LASSO), feature selection |
| Small datasets (limited wafer runs) | Bayesian methods, transfer learning, multi-fidelity modeling |
| Nonlinearity | GPs, neural networks, tree ensembles (RF, XGBoost) |
| Equipment-to-equipment variation | Mixed-effects models, hierarchical Bayesian models |
| Drift over time | Adaptive/recursive estimation, change-point detection, Kalman filtering |
| Multiple correlated responses | Multi-task learning, co-kriging, multivariate GP |
| Missing data | EM algorithm, multiple imputation, probabilistic PCA |

## 6. Dimensionality Reduction

### 6.1 Principal Component Analysis (PCA)

**Objective:**

$$
\max_{\mathbf{w}} \quad \mathbf{w}^T\mathbf{S}\mathbf{w} \quad \text{s.t.} \quad \|\mathbf{w}\|_2 = 1
$$

Where $\mathbf{S}$ is the sample covariance matrix.

**Solution:** Eigenvectors of $\mathbf{S}$

$$
\mathbf{S} = \mathbf{W}\boldsymbol{\Lambda}\mathbf{W}^T
$$

**Reduced representation:**

$$
\mathbf{z} = \mathbf{W}_k^T(\mathbf{x} - \bar{\mathbf{x}})
$$

Where $\mathbf{W}_k$ contains the top $k$ eigenvectors.

### 6.2 Partial Least Squares (PLS)

**Objective:** Maximize covariance between $\mathbf{X}$ and $\mathbf{Y}$

$$
\max_{\mathbf{w}, \mathbf{c}} \quad \text{Cov}(\mathbf{Xw}, \mathbf{Yc}) \quad \text{s.t.} \quad \|\mathbf{w}\|=\|\mathbf{c}\|=1
$$

## 7. Multi-Fidelity Optimization

**Combine cheap simulations with expensive experiments.**

**Auto-regressive model (Kennedy-O'Hagan):**

$$
y_{\text{HF}}(\mathbf{x}) = \rho \cdot y_{\text{LF}}(\mathbf{x}) + \delta(\mathbf{x})
$$

Where:

- $y_{\text{HF}}$ = High-fidelity (experimental) response
- $y_{\text{LF}}$ = Low-fidelity (simulation) response
- $\rho$ = Scaling factor
- $\delta(\mathbf{x}) \sim \mathcal{GP}$ = Discrepancy function

**Multi-fidelity GP:**

$$
\begin{bmatrix} \mathbf{y}_{\text{LF}} \\ \mathbf{y}_{\text{HF}} \end{bmatrix} \sim \mathcal{N}\left(\mathbf{0}, \begin{bmatrix} \mathbf{K}_{\text{LL}} & \rho\mathbf{K}_{\text{LH}} \\ \rho\mathbf{K}_{\text{HL}} & \rho^2\mathbf{K}_{\text{LL}} + \mathbf{K}_{\delta} \end{bmatrix}\right)
$$

## 8. Transfer Learning

**Domain adaptation for tool-to-tool transfer:**

$$
y_{\text{target}}(\mathbf{x}) = y_{\text{source}}(\mathbf{x}) + \Delta(\mathbf{x})
$$

**Offset model (simple):**

$$
\Delta(\mathbf{x}) = c_0 \quad \text{(constant offset)}
$$

**Linear adaptation:**

$$
\Delta(\mathbf{x}) = \mathbf{c}^T\mathbf{x} + c_0
$$

**GP adaptation:**

$$
\Delta(\mathbf{x}) \sim \mathcal{GP}(0, k_\Delta)
$$

## 9. Complete Optimization Framework

```
RECIPE OPTIMIZATION FRAMEWORK

  RECIPE PARAMETERS               PROCESS MODEL
  ─────────────────               ─────────────
  x₁: Temperature (°C)   ──► ┌───────────────┐
  x₂: Pressure (mTorr)   ──► │               │
  x₃: Gas flow 1 (sccm)  ──► │   y = f(x;θ)  │ ──► y₁: Thickness (nm)
  x₄: Gas flow 2 (sccm)  ──► │               │ ──► y₂: Uniformity (%)
  x₅: RF power (W)       ──► │      + ε      │ ──► y₃: CD (nm)
  x₆: Time (s)           ──► └───────────────┘ ──► y₄: Defects (#/cm²)
                                     ▲
                                     │
                               Uncertainty ξ

OPTIMIZATION PROBLEM:

  min   Σⱼ wⱼ(E[yⱼ] - yⱼ,target)² + λ·Var[y]
   x

  subject to:
    y_L ≤ E[y] ≤ y_U          (specification limits)
    Pr(y ∈ spec) ≥ 0.9973     (Cpk ≥ 1.0)
    x_L ≤ x ≤ x_U             (equipment limits)
    g(x) ≤ 0                  (process constraints)
```

## 10. Key Equations Summary

### Process Modeling

| Model Type | Equation |
|:-----------|:---------|
| Linear regression | $y = \mathbf{X}\boldsymbol{\beta} + \varepsilon$ |
| Quadratic RSM | $y = \beta_0 + \sum_i \beta_i x_i + \sum_i \beta_{ii}x_i^2 + \sum_{i<j}\beta_{ij}x_i x_j + \varepsilon$ |
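The second-order response-surface model above can be fit by ordinary least squares once the design matrix includes linear, quadratic, and interaction columns. The sketch below uses synthetic two-knob data (all coefficient values and variable names are hypothetical, chosen only to show the fit recovering the generating model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic DOE: two recipe knobs (e.g., temperature, pressure) in coded units.
X = rng.uniform(-1, 1, size=(50, 2))
x1, x2 = X[:, 0], X[:, 1]

# "True" process response with curvature and an interaction, plus noise eps.
y = (2.0 + 1.5*x1 - 0.8*x2
     + 0.6*x1**2 + 0.3*x2**2 + 0.9*x1*x2
     + rng.normal(0, 0.05, 50))

# Design matrix for the quadratic RSM:
#   y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2 + eps
A = np.column_stack([np.ones(50), x1, x2, x1**2, x2**2, x1*x2])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

# beta is approximately [2.0, 1.5, -0.8, 0.6, 0.3, 0.9]
print(np.round(beta, 2))
```

With the fitted surface in hand, the recipe optimization in §9 reduces to minimizing a cheap polynomial surrogate instead of running wafers.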

high temperature, text generation

More random generation.
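Concretely, temperature $T$ divides the logits before the softmax: $p_i \propto \exp(z_i / T)$, so $T > 1$ flattens the distribution and makes sampling more random. A minimal NumPy sketch (the example logits are arbitrary):

```python
import numpy as np

def softmax_with_temperature(logits, T):
    """Scale logits by 1/T before the softmax; T > 1 flattens the
    distribution (more random sampling), T < 1 sharpens it."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()              # subtract max for numerical stability
    p = np.exp(z)
    return p / p.sum()

logits = [2.0, 1.0, 0.1]
p_low  = softmax_with_temperature(logits, T=0.5)   # peaked
p_high = softmax_with_temperature(logits, T=2.0)   # flatter

# The top token's probability drops as temperature rises.
print(p_low[0] > p_high[0])  # True
```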

high vacuum pump, manufacturing operations

High vacuum pumps achieve extremely low pressures for critical processes.

high-angle annular dark field, haadf, metrology

Z-contrast imaging in STEM.

high-angle grain boundary, defects

Large misorientation.

high-aspect-ratio mol, process integration

High-aspect-ratio contacts and vias in scaled nodes challenge gap fill and reliability, requiring advanced processes.

high-k dielectric,technology

Dielectric with high dielectric constant.

high-k first, process integration

Deposit high-k before poly gate.

high-k last, process integration

Deposit high-k after removing poly.

high-k metal gate (hkmg),high-k metal gate,hkmg,technology

High dielectric constant gate oxide and metal gate for better performance.

high-k metal gate, process integration

High-k metal gate technology replaces silicon dioxide and polysilicon with high dielectric constant materials and metal electrodes, reducing leakage.

high-na euv,lithography

Higher numerical aperture for better resolution.

high-order overlay, metrology

Overlay beyond simple X-Y shift (e.g., rotation, scaling).

high-power probe, advanced test & probe

High-power probing tests devices at elevated current and voltage levels, requiring specialized probe tips and thermal management.

high-resolution fine-tuning, computer vision

Adapt to higher resolution images.

high-resolution generation, generative models

Create images beyond training resolution.

high-temperature bake, packaging

Faster moisture removal.

high-throughput screening, materials science

Rapidly evaluate many candidates.

high-volume manufacturing, hvm, production

Full-scale production.

higher reflow temperature, packaging

Lead-free requires higher temp.

higher-order gnn, graph neural networks

Higher-order GNNs increase expressiveness by aggregating information from k-tuples of nodes rather than individual nodes.

highly accelerated life test, halt, reliability

Extreme stress to find limits.

highly accelerated life, business & standards

Highly accelerated life testing uses extreme stress to precipitate failures quickly.

highly accelerated stress screening, hass, reliability

Screen production units.

highly accelerated stress test (hast),highly accelerated stress test,hast,reliability

Severe temperature humidity stress.

highly accelerated temperature and humidity stress test, hast, reliability

Severe environmental test.

highway networks, neural architecture

Gated skip connections.

hillock formation,reliability

Metal protrusions (typically in aluminum or copper films) caused by compressive stress.

hindsight experience replay, her, reinforcement learning

Learn from failed attempts.

hindsight experience, reinforcement learning advanced

Hindsight Experience Replay relabels failed trajectories with achieved goals as successes, improving learning under sparse rewards.
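The relabeling idea can be sketched in a few lines of plain Python (a minimal illustration of the "final" relabeling strategy; the trajectory format, reward function, and 1-D reaching task are hypothetical, not the paper's exact interface):

```python
def her_relabel(trajectory, reward_fn):
    """Hindsight relabeling ('final' strategy): treat the state actually
    reached at the end of a failed episode as if it had been the
    commanded goal, turning the trajectory into a success example.

    trajectory: list of (state, action, next_state, goal) tuples.
    reward_fn(next_state, goal) -> reward under the relabeled goal.
    """
    achieved_goal = trajectory[-1][2]          # final achieved state
    relabeled = []
    for state, action, next_state, _ in trajectory:
        r = reward_fn(next_state, achieved_goal)
        relabeled.append((state, action, next_state, achieved_goal, r))
    return relabeled

# Toy 1-D reaching task with sparse reward: 0 on success, -1 otherwise.
reward = lambda s, g: 0.0 if abs(s - g) < 0.1 else -1.0
traj = [(0.0, +1, 0.4, 3.0), (0.4, +1, 0.9, 3.0)]   # never reached goal 3.0
new = her_relabel(traj, reward)
print(new[-1][4])  # 0.0 — the final step is now a "success"
```

The relabeled transitions go into the replay buffer alongside the originals, so the agent sees nonzero reward signal even when the commanded goal was never reached.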

hint learning, model compression

Student learns from teacher's intermediate layers.

hipaa compliance nlp, hipaa, legal

Ensure text processing complies with privacy.

hiro, reinforcement learning advanced

Hierarchical Reinforcement Learning with Off-policy correction trains goal-setting and goal-achieving policies jointly.

histogram, quality & reliability

Histograms display frequency distributions, revealing shape, center, and spread.
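A minimal NumPy sketch (the synthetic measurement data is hypothetical) showing how the binned counts expose center and spread:

```python
import numpy as np

# 500 measurements from a roughly normal process (synthetic data).
rng = np.random.default_rng(42)
samples = rng.normal(loc=10.0, scale=0.5, size=500)

# counts[i] is the number of samples falling between edges[i] and edges[i+1].
counts, edges = np.histogram(samples, bins=20)

# The modal bin sits near the process center (~10); the occupied bin
# range reflects the spread (~0.5 standard deviation).
center_bin = np.argmax(counts)
print(edges[center_bin], edges[center_bin + 1])
```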

history effect,device physics

Device behavior depends on previous state.