Low-Rank Sparse Attention Collection
Open-source weights of the Lorsa modules introduced in "Lorsa: Low-Rank Sparse Attention Finds Monosemantic Heads in Attention Superposition".