*Language Models Use Trigonometry to Do Addition* Subhash Kantamneni, Max Tegmark (MIT, 2025) https://arxiv.org/abs/2502.00873

"We first discover that numbers are represented in these LLMs as a generalized helix, which is strongly causally implicated for the tasks of addition and subtraction, and is also causally relevant for integer division, multiplication, and modular arithmetic. We then propose that LLMs compute addition by manipulating this generalized helix using the 'Clock' algorithm: to solve a + b, the helices for a and b are manipulated to produce the a + b answer helix, which is then read out to model logits."

Imagine teaching a computer to do math not by giving it step-by-step rules like a calculator, but by letting it figure things out on its own. Scientists have been trying to understand how large language models (LLMs), the AI behind tools like ChatGPT, actually "think" when solving even basic math problems. In this study, researchers uncovered something fascinating: LLMs seem to use trigonometry to perform addition.

Instead of treating numbers as simple values, these models represent them as points on a twisted spiral, a helix. When the model needs to add two numbers, it doesn't just line them up and sum them. Instead, it rotates and shifts these spiral representations using what the researchers call the "Clock" algorithm. Essentially, the model aligns the spirals for the two numbers in a way that lands it on the correct result.

By tracking how different parts of the model process these helical number patterns, from individual neurons to larger structures called attention heads, the researchers confirmed that this mechanism is not a coincidence but a fundamental part of how the model handles arithmetic. This discovery provides a detailed glimpse into how an LLM "sees" numbers and could help us design smarter, more reliable models in the future.
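The core idea can be sketched in a few lines of Python. This is an illustration of the trig identity behind the "Clock" picture, not the paper's actual probing code: each integer is embedded as a "generalized helix" (a linear component plus a few (cos, sin) pairs at different periods; the specific periods below are assumptions for the demo), and the helices for a and b combine into the helix for a + b via the angle-addition identities.

```python
import math

# Illustrative periods for the helix's Fourier features (assumed for this demo).
PERIODS = [2, 5, 10, 100]

def helix(n):
    """Embed integer n as a generalized helix: a linear component
    plus one (cos, sin) pair per period T."""
    return [float(n)] + [f(2 * math.pi * n / T)
                         for T in PERIODS
                         for f in (math.cos, math.sin)]

def clock_add(ha, hb):
    """Combine the helices for a and b into the helix for a + b.
    The linear parts add directly; each (cos, sin) pair is rotated
    using the angle-addition identities:
        cos(x + y) = cos x cos y - sin x sin y
        sin(x + y) = sin x cos y + cos x sin y
    """
    out = [ha[0] + hb[0]]
    for i in range(1, len(ha), 2):
        ca, sa = ha[i], ha[i + 1]
        cb, sb = hb[i], hb[i + 1]
        out.append(ca * cb - sa * sb)  # cos of the summed angle
        out.append(sa * cb + ca * sb)  # sin of the summed angle
    return out

a, b = 37, 25
combined = clock_add(helix(a), helix(b))
target = helix(a + b)
assert all(math.isclose(x, y, abs_tol=1e-9)
           for x, y in zip(combined, target))
print("clock_add(helix(37), helix(25)) matches helix(62)")
```

Rotating by an angle proportional to b is exactly how clock hands add times, hence the algorithm's name; the claim of the paper is that attention heads and MLP neurons implement these rotations on the model's internal number representation.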
___ via Cecile Tamura: https://www.facebook.com/cecile.tamura/posts/pfbid0U7FwUxbXoB5CtPV3BgC7WoLcp...