Tuesday, 3 January 2023

Some, perhaps most, of the dystopian AI scenarios tacitly assume that AIs will have desires, as if desire were a natural corollary of (what is called) intelligence beyond a certain level of sophistication. This is certainly not the case: desire is a whole other problem. Coding in a psychoanalytic account of desire would seem a recipe for failure, but such theories underline the mercurial complexity of desire and its relations to language and memory, as well as its tendency towards fixation, its predilection for impossibility, its oblique aim. They also help to distinguish desire from strong appetite and from the motivations towards exploration and play, all of which are adjacent to it. Another possibility might be mimetic desire as a sort of bootstrap phenomenon, but this is surely a secondary phenomenon: it presupposes a theory of mind and an ability to identify alter-egos, and hence the existence of an ego, in particular a bodily ego, as both basis and obstacle to be transcended, for which it would seem desire is already a precondition. Relying on the idea of goals in the absence of any such theorising is symptomatic of a lack that runs through all of AI discourse.
