In the mid-90s, I was very happy with a flat-text indexing program called AskSam. I'd install the software from a set of floppies, and it would keep track of all sorts of notes scattered across my laptop. Today, I still have a scattergun approach to information, but I'm using a much more powerful piece of software.
It's Mem.ai.
Now I've discovered Mem is being tweaked to process PDFs, images, links, audio recordings, and videos that are added as Mems. This means that when I use the chat or search functions inside Mem, I'll be able to locate multimedia objects as well as Markdown Mems. This is a big deal because it means Mem will span my Zoom meeting recordings, Otter transcripts, and the dozens of screenshots I take every day.
I hope to see the first iteration of this new second brain once my Mems become aware of the knowledge within my email threads and attachments. I cannot do these sorts of deep dives with Microsoft Copilot or ChatGPT. I want to be able to chat with my Mem collection and leverage this exceptionally rich knowledge management system.