Positional encoding
Because the attention operation itself is permutation-invariant, Transformers must inject word order explicitly: fixed sinusoidal features or learned embeddings encode each token's absolute position.
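As a minimal sketch of the fixed sinusoidal scheme, the following computes the encoding from the original Transformer paper: each position gets sine/cosine features at geometrically spaced frequencies (the function name and the NumPy implementation are illustrative, not from the source):

```python
import numpy as np

def sinusoidal_encoding(seq_len: int, d_model: int) -> np.ndarray:
    # Positions 0..seq_len-1, one row per position.
    pos = np.arange(seq_len)[:, None]            # (seq_len, 1)
    # Frequency index for each sin/cos feature pair.
    i = np.arange(d_model // 2)[None, :]         # (1, d_model // 2)
    # Geometrically spaced wavelengths, base 10000 as in the original paper.
    angles = pos / (10000.0 ** (2 * i / d_model))
    enc = np.zeros((seq_len, d_model))
    enc[:, 0::2] = np.sin(angles)  # even feature dims
    enc[:, 1::2] = np.cos(angles)  # odd feature dims
    return enc
```

The resulting matrix is simply added to the token embeddings before the first attention layer.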
Later research replaces absolute position tables with relative schemes, such as rotary embeddings (RoPE) and ALiBi-style attention biases, which tend to extrapolate more gracefully to sequence lengths beyond those seen during training.
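The rotary idea can be sketched as follows: each (even, odd) feature pair of a query or key vector is rotated by a position-dependent angle, so that the dot product between a rotated query and a rotated key depends only on their relative offset. This is an illustrative NumPy sketch, not a production implementation; the function name `rotate_pairs` is an assumption:

```python
import numpy as np

def rotate_pairs(x: np.ndarray, positions: np.ndarray, base: float = 10000.0) -> np.ndarray:
    # x: (seq, d) query or key vectors; positions: (seq,) integer positions.
    d = x.shape[-1]
    i = np.arange(d // 2)
    # One rotation angle per feature pair, geometrically spaced like sinusoidal PE.
    theta = positions[:, None] / (base ** (2 * i / d))   # (seq, d // 2)
    cos, sin = np.cos(theta), np.sin(theta)
    x_even, x_odd = x[..., 0::2], x[..., 1::2]
    out = np.empty_like(x)
    # Standard 2-D rotation applied to each (even, odd) pair.
    out[..., 0::2] = x_even * cos - x_odd * sin
    out[..., 1::2] = x_even * sin + x_odd * cos
    return out
```

Because a rotation by angle m composed with an inverse rotation by angle n equals a rotation by m - n, the attention score between positions m and n is a function of m - n alone, which is what makes the scheme relative.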