Thanks for this measured perspective (I expected a good read to follow an epigraph from Eric Topol). I appreciate the categorization of these new and developing tools alongside the scalpel and the stethoscope. Hominids co-evolved with technologies, including language, and I think AI continues that foundational relation. Human intelligence has always been, in some essential way, artificial intelligence. So I think you're right that the stakes are mainly to be found in the social relations that implement the tech, not in the tech itself (though the energy demands of running thousands of GPUs are worth considering). I've come to regard the fears surrounding generative LLMs—especially the most extreme dystopian scenarios—as a PR strategy. Why else would the likes of Sam Altman and Elon Musk pronounce such stark warnings while continuing to work on and invest in the technology?
It's more than a PR strategy, I think; it's fear-mongering meant to induce regulation. Regulation helps the incumbents. I couldn't find the original paper, but I found this from Bill Gurley at the All In Conference a few weeks back: https://www.youtube.com/watch?v=F9cO3-MLHOM
I think it could prove much more self-serving than mere PR. I hate to sound cynical, but it's hard to ignore.
And "human intelligence has always been, in some essential way, artificial intelligence" made me think of the parable of Thoth, the ancient Egyptian equivalent to Prometheus, from Plato's Phaedrus.
The phrase that describes written language literally means "The Speech of the Gods" in ancient Egyptian. Thoth is discussing his invention of writing with Thamus (Ammon), a god-king who rebukes him:
"This discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without its reality."
The ink wasn't dry on the invention of writing - because ink hadn't been invented yet - before someone wrote a story about how the technology would ruin us. It really has always been, in some essential way, artificial intelligence.
"The lecture as described by the article also appears to overlook the fact that AI does not replace the practitioner but supports them."
Well, that is how it *SHOULD* be used. Based on my previous work experience, I can say that is not always how the C-suite views it or how it is implemented on the ground... 👀