
Thursday, 3 January 2019


It was an important moment when desire was recognised as a significant problem for philosophy over and above what psychology or physiology could say about it, in so far as consciousness is unavoidably implicated. A version of this problem arises again when we consider whether AIs can properly be said to have desires, and hence an inner life of some kind. Such desire would have to be recognised as having its own centre, as being for-itself, rather than merely being the kind of desire we ascribe to a piece of machinery that misbehaves in a way that seems to anticipate our own actions and wishes, while we remain aware of 'inducing' a virtual subjectivity in it. (Such 'induction' goes beyond misbehaviour, perhaps, in the case of sex dolls specifically created to 'hack' the desire of their users.)

Evolutionary theory might also claim to offer an 'explanation' of desire, in two stages. First, there are the manifest advantages for a replicator in producing purposive action across a wide range of responses to a varying environment; second, this responsiveness is refined towards the purpose of purposes, and the purpose of these in turn, in other words towards an open hierarchy of metaprogramming of purposive behaviour, which then requires the 'invention' of consciousness as the provisionally most efficient means of synthesising all the complex levels of feedback needed to bring this about. If consciousness is understood in this way, then it arises out of prior desiring, although the name 'desire' does not fully apply until it is assumed in conscious reflection, and this account leaves no room for pure witnessing or desireless awareness. If you attempt to resolve the question from the direction of consciousness, witnessing prevails and desire remains mysterious; resolved from the side of phenomena ('Representations'), it is desire all the way down and witnessing is an illusion.
