Attention Mechanism Demo

Visualize how LLMs focus on different parts of text to understand context

Attention Visualization

Example sentence: The cat sat on the mat because it was comfortable.

About Attention

Attention is a key mechanism in transformer models that allows them to focus on different parts of the input when generating each word of output.

How it works:

  1. For each word, the model computes attention scores against every word in the context (including itself).
  2. Higher scores mean the model pays more attention to that word when processing the current word.
  3. This helps resolve references (like the pronoun "it" above) and capture relationships between words; the sketch below walks through these steps.
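A minimal NumPy sketch of these three steps on the example sentence. This is illustrative only: the embeddings and projection matrices are random stand-ins, not weights from a trained model, and the dimension `d_model` is an arbitrary choice.

```python
import numpy as np

# Illustrative scaled dot-product self-attention (random weights, not a trained model).
np.random.seed(0)

tokens = ["The", "cat", "sat", "on", "the", "mat",
          "because", "it", "was", "comfortable"]
d_model = 16                                  # hypothetical embedding size
X = np.random.randn(len(tokens), d_model)     # stand-in token embeddings

# Learned query/key/value projections in a real model; random here.
W_q = np.random.randn(d_model, d_model)
W_k = np.random.randn(d_model, d_model)
W_v = np.random.randn(d_model, d_model)
Q, K, V = X @ W_q, X @ W_k, X @ W_v

# Step 1: attention scores between every pair of words.
scores = Q @ K.T / np.sqrt(d_model)

# Step 2: softmax turns scores into weights; higher weight = more attention.
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)

# Step 3: each word's output is a weighted mix of all words' values, which is
# how context (e.g. what "it" refers to) flows between positions.
output = weights @ V

# Inspect where "it" (index 7) puts its attention across the sentence.
for tok, w in zip(tokens, weights[7]):
    print(f"{tok:>12}: {w:.3f}")
```

With trained weights, the row for "it" would typically concentrate on "mat" (or "cat", depending on the intended reading); with the random projections above, the weights are meaningless beyond showing the mechanics.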

Quick Check

Self‑attention lets the model…
Multi‑head attention helps by…