Rotary Position Embeddings (RoPE) and ALiBi are the two leading methods modern LLMs use to handle sequence position without learned position embeddings. They enable longer contexts, better length extrapolation, and faster training, and have largely displaced the learned and sinusoidal positional encodings of earlier Transformers.
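As a rough illustration of the two ideas, here is a minimal NumPy sketch: RoPE rotates pairs of features by a position-dependent angle, while ALiBi adds a distance-based penalty directly to attention scores. Function names and shapes here are our own; real implementations apply these inside the attention layer to query/key tensors.

```python
import numpy as np

def rope_rotate(x, positions, base=10000.0):
    """Rotary Position Embeddings: rotate each consecutive feature
    pair of x (shape: seq_len x d) by an angle proportional to
    the token's position. No position vectors are added or learned."""
    d = x.shape[-1]
    # One frequency per feature pair: theta_i = base^(-2i/d)
    inv_freq = base ** (-np.arange(0, d, 2) / d)
    angles = positions[:, None] * inv_freq[None, :]      # (seq, d/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[..., 0::2], x[..., 1::2]
    out = np.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin                 # 2D rotation
    out[..., 1::2] = x1 * sin + x2 * cos
    return out

def alibi_bias(seq_len, num_heads):
    """ALiBi: build an additive attention bias that linearly
    penalizes query-key distance, with a different slope per head."""
    # Head slopes form a geometric sequence (1/2, 1/4, ... for 8 heads).
    slopes = 2.0 ** (-8.0 * np.arange(1, num_heads + 1) / num_heads)
    dist = np.abs(np.arange(seq_len)[:, None] - np.arange(seq_len)[None, :])
    # Add this to attention logits before softmax (and before causal masking).
    return -dist[None, :, :] * slopes[:, None, None]     # (heads, seq, seq)
```

Because neither method stores a per-position embedding table, both can in principle be evaluated at positions never seen during training, which is what makes the extrapolation claims plausible.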