1. Tag all content with a date (probably almost done already). For long passages, estimate a date for each sentence by interpolating from the publication date and the author's writing rate, as sketched below.
2. Once every sentence or phrase has a date, create a few blockchains: one for short thoughts (up to 128 tokens), and more chains for longer thoughts (3072+ tokens). A minimal chain structure follows after this list.
3. For each tokenized idea, run an LLM-embedding cosine similarity (or some better metric) against what is already stored, with a threshold of, say, 75%; if no cleaner token boundary is found, split on natural punctuation. See the novelty-check sketch below.
4. Only NEW-ish content gets stored in a block, chained if novel. Again, both short and long thoughts are recorded, with coordinates taken from their embedding vectors.
5a. I remember reading once that working through a PhD is like pricking the inside of a balloon; I love that visualization. Every novel idea pricks the inside of a growing sphere of knowledge over time.
5b. Has anyone ever tried to map human knowledge in three dimensions? A rough projection sketch follows below.
5c. Alternatively, picture a tree or root system growing over time, branching from the vectors.
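For step 1, here is a minimal sketch of date estimation, assuming the author wrote at a steady pace and finished on the publication date. The words-per-day rate and the sentence splitter are assumptions, not part of the original idea.

```python
import re
from datetime import date, timedelta

def estimate_sentence_dates(text, published, words_per_day=500):
    """Assign each sentence an estimated date, assuming a steady writing
    rate ending on the publication date."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    total_words = sum(len(s.split()) for s in sentences)
    total_days = total_words / words_per_day  # total writing time in days
    dated = []
    words_so_far = 0
    for s in sentences:
        # Fraction of the passage written once this sentence is done.
        words_so_far += len(s.split())
        days_before_publish = total_days * (1 - words_so_far / total_words)
        dated.append((published - timedelta(days=days_before_publish), s))
    return dated

if __name__ == "__main__":
    for d, s in estimate_sentence_dates(
        "First idea. Second idea came later. Final thought before publishing.",
        published=date(2024, 6, 1),
        words_per_day=5,  # tiny rate so the toy example spans several days
    ):
        print(d.isoformat(), "|", s)
```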
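For step 2, a minimal sketch of the two chains: each block carries the estimated date, the thought itself, and a SHA-256 link to the previous block. This is a plain hash chain, not a distributed ledger; the 128-token and 3072-token cutoffs come from the note, and the whitespace token count is an assumption.

```python
import hashlib
import json
from dataclasses import dataclass, field

@dataclass
class Block:
    index: int
    date: str          # estimated date from step 1
    content: str       # the dated thought itself
    prev_hash: str
    hash: str = field(init=False)

    def __post_init__(self):
        payload = json.dumps(
            [self.index, self.date, self.content, self.prev_hash]
        ).encode()
        self.hash = hashlib.sha256(payload).hexdigest()

class ThoughtChain:
    def __init__(self, name):
        self.name = name
        self.blocks = [Block(0, "1970-01-01", f"genesis:{name}", "0" * 64)]

    def append(self, date, content):
        prev = self.blocks[-1]
        self.blocks.append(Block(prev.index + 1, date, content, prev.hash))

# One chain per thought length, split on a rough whitespace token count.
short_chain = ThoughtChain("short")  # thoughts up to ~128 tokens
long_chain = ThoughtChain("long")    # thoughts of ~3072 tokens and up

def route(date, content, short_limit=128):
    chain = short_chain if len(content.split()) <= short_limit else long_chain
    chain.append(date, content)
```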
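For steps 3 and 4, a sketch of the novelty gate: split text on natural punctuation, embed each piece, and keep only pieces whose cosine similarity to everything already stored falls below the threshold. The `embed` function here is a toy placeholder standing in for a real embedding model; the 0.75 threshold is the note's 75%.

```python
import re
import numpy as np

def embed(text):
    """Placeholder: swap in a real embedding model. This toy version
    hashes character trigrams into a fixed-size vector."""
    v = np.zeros(256)
    for i in range(len(text) - 2):
        v[hash(text[i:i + 3]) % 256] += 1.0
    return v

def cosine(a, b):
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def novel_pieces(text, stored_vectors, threshold=0.75):
    """Yield (piece, vector) for pieces unlike anything already stored."""
    pieces = [p.strip() for p in re.split(r"[.;!?]+", text) if p.strip()]
    for piece in pieces:
        v = embed(piece)
        if all(cosine(v, s) < threshold for s in stored_vectors):
            stored_vectors.append(v)  # its coordinates come from this vector
            yield piece, v
```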
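For 5b, one way to get three-dimensional coordinates from the stored vectors is a plain PCA via numpy's SVD, so the accumulated knowledge can be plotted as a growing point cloud, sphere, or branching structure. Purely illustrative; the choice of PCA is an assumption.

```python
import numpy as np

def to_3d(vectors):
    """Project high-dimensional idea vectors onto their top 3 principal axes."""
    X = np.asarray(vectors, dtype=float)
    X = X - X.mean(axis=0)               # center the cloud
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:3].T                  # (n_ideas, 3) coordinates

# Example: coordinates for the pieces accumulated by the previous sketch.
# points = to_3d(stored_vectors)
```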