Data Science & ML

Inference

/ˈɪnfərəns/

Definition

The process of running new input data through a trained model to generate predictions. Inference is distinct from training: during training the model's parameters are learned, while during inference those fixed parameters are applied to unseen inputs.

Example in context

"Inference latency matters for our real-time API — the model must respond in under 200ms per request."
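The quoted scenario can be sketched in a few lines. This is a minimal illustration, assuming scikit-learn is installed; the dataset, model choice, and 200 ms budget are stand-ins for whatever a real service would use.

```python
# Minimal sketch: training happens once, offline; inference is the
# per-request step whose latency the quote is concerned with.
import time
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Train once on synthetic data (stand-in for the offline training job).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X, y)

# Inference: run a new input through the trained model and time it.
new_input = X[:1]  # stand-in for a fresh request payload
start = time.perf_counter()
prediction = model.predict(new_input)
latency_ms = (time.perf_counter() - start) * 1000
print(f"prediction={prediction[0]}, latency={latency_ms:.2f} ms")
```

In a real-time API, this timed `predict` call is what must stay under the latency budget, which is why inference cost, not training cost, dominates serving decisions.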

Practice this term

Master Inference in context by working through exercises in the Data Science & ML module. You'll see the term used in real engineering scenarios with multiple-choice, fill-in-the-blank, and matching drills.