About two months ago, I was leaving my school’s library after class, and I noticed a Time cover with a provocative title (not provocative enough that I remember it, of course) announcing the imminent arrival of quantum computing. Because I like pretending that I understand anything about quantum mechanics and the micro-structures and activities of physical reality, I picked up the issue with intellectual gusto. The article lucidly explained (i.e., I read it and my eyes didn’t bleed) the distinction between classical (current) and quantum computing, focusing on how information is encoded and decoded (ye olde 1s and 0s business). Whereas a classical computer reads each bit as strictly a 1 or a 0, a quantum computer’s qubits can be both at once, so it would be able to process everything simultaneously, eating up data like the Cookie Monster eats cookies (and it would be able to spit out an infinite string of better analogies to elucidate its power).
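For anyone who wants the Cookie Monster analogy in code: here's a toy sketch (plain Python, no quantum hardware harmed) of the idea the article was gesturing at. A classical bit is one value; an n-qubit register's state is a vector of 2**n amplitudes, and putting every qubit into superposition gives every basis state a nonzero amplitude at once. The function name and the whole setup are my own illustration, not anything from the Time piece.

```python
import math

def equal_superposition(n_qubits):
    """Toy state vector after putting n qubits into equal superposition,
    starting from |00...0>. Each of the 2**n basis states gets the same
    amplitude -- this is the 'everything at once' part."""
    dim = 2 ** n_qubits
    amp = 1.0 / math.sqrt(dim)   # amplitudes, not probabilities
    return [amp] * dim

state = equal_superposition(3)            # 3 qubits -> 8 amplitudes
probs = [a * a for a in state]            # Born rule: probability = amplitude squared
print(len(state))                         # 8 basis states tracked simultaneously
print(round(sum(probs), 10))              # probabilities still sum to 1.0
```

The catch the magazine analogies always skip: measuring collapses all of that down to a single classical answer, which is why quantum algorithms are clever choreography rather than free parallelism.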
And who has their hands on such technology? Google. IBM. The government. Which sounds like the start of a dystopian novel.
Fortunately (?), this piece had little to add in terms of news, suggesting that progress is slow (or rather, “classical”). Will it take another significant war to see a leap forward? At that point though, we’d have to wonder if there’d be anything left to compute.
For a better understanding of this computing stuff, await Fuller’s inevitable response in the comments below. Or he could say nothing. We won’t know until he chooses. Regardless, he actually lives in this world, and if I were smart enough (there’s that S word! oh no!), I’d make some joke here about the uncertainty principle, since I haven’t already, right?