Consequences of bot-mediated reality
I have a lot of catch-up listening to do with regard to The Long Now Foundation’s excellent Seminars About Long-term Thinking (SALT) lecture and podcast series. I’m a charter member of the Foundation, which gets you a sweet membership card and access to video of their lectures, among other less tangible things like knowing you’re helping inject some much-needed awareness of long-term thinking and planning into public discourse.
One of the lectures I’m particularly looking forward to downloading is Daniel Suarez’s recent Daemon: Bot-Mediated Reality, which I think has particular relevance given the rather large f-up in which Google’s news crawler inadvertently “evaporated $1.14B USD” of United Airlines’ market value by re-surfacing a years-old bankruptcy story as fresh news.
Unfortunately, I think that in the near future, as more and more processes are automated, we will see more screw-ups of this scale. I can’t help but think that this one might have been avoidable, though, if the indexing engine had been able to take advantage of semantic data (structured, machine-readable metadata such as an explicit publication date) rather than relying on scraping and evaluating natural language.
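To make that concrete, here’s a minimal sketch (in Python, and emphatically not Google’s actual pipeline) of what leaning on semantic data could look like: if publishers declare a machine-readable publication date, a crawler can skip stale or undated stories instead of guessing dates from the surrounding prose. The article:published_time property, the seven-day freshness window, and the is_fresh helper below are all illustrative assumptions, not any particular standard.

```python
# A minimal sketch (not Google's actual pipeline) of how an explicit,
# machine-readable publication date lets a crawler skip stale stories
# instead of guessing the date from surrounding prose. The meta tag
# name and the freshness window here are illustrative assumptions.
from datetime import datetime, timedelta, timezone
from html.parser import HTMLParser


class PublishedTimeParser(HTMLParser):
    """Pull an ISO-8601 timestamp out of a semantic <meta> tag."""

    def __init__(self):
        super().__init__()
        self.published = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Open Graph-style annotation; any agreed-upon vocabulary would do.
        if tag == "meta" and attrs.get("property") == "article:published_time":
            self.published = datetime.fromisoformat(attrs["content"])


def is_fresh(page_html: str, max_age: timedelta = timedelta(days=7)) -> bool:
    """Index the story only if its declared date is recent.

    A page with no declared date is rejected outright rather than
    being assigned the crawl date -- the failure mode in the United
    Airlines incident.
    """
    parser = PublishedTimeParser()
    parser.feed(page_html)
    if parser.published is None:
        return False
    return datetime.now(timezone.utc) - parser.published <= max_age


# A 2002 bankruptcy story resurfacing years later would be filtered out:
stale_page = """<html><head>
<meta property="article:published_time" content="2002-12-10T00:00:00+00:00">
</head><body><p>UAL files for bankruptcy...</p></body></html>"""
print(is_fresh(stale_page))  # False
```

The design point is simply that a shared, explicit vocabulary turns the date question from inference into lookup: a six-year-old bankruptcy story would have been dropped before it ever hit the wire.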