Friday, 24 August 2018
It seems likely that a simple robotic system like a self-propelling vacuum cleaner could fairly easily be tricked by a set of obstacles into repetitive loop behaviour. An ant may have a less complex control system, but one of the ways that we recognise it as a living organism is that it breaks out of loops: at some point it exhausts repetition and tries something else. It is certainly not self-conscious, and probably not conscious according to most ways of understanding this term, but it has some degree of self-awareness on a behavioural level in that it can recognise when it is in a loop. The same idea could of course be programmed into a robot: when you find yourself in a loop, make a random variation in your program (a toy version is sketched below). But this assumes that we have a way of programming the recognition of loop behaviour, and any such detector could probably itself be hacked to produce more complex loops that evade it. What is known is that there is no general way of programming a recognition of futile behaviour, since that would require a solution to the Halting Problem, while on the side of life even bacteria possess ways of solving the most complex problems via evolutionary tinkering.

On a higher level, we know that consciousness and self-consciousness are entirely different things, and that both are different from intelligence. There is even a species of fish which passes the mirror test for self-awareness, a test which dogs and cats fail. There is no evidence that these fish possess anything like ego drives, but it is not far-fetched to impute something of the kind to hominid apes, or even to certain birds.

It is often assumed in speculations on artificial general intelligence (AGI) that such an entity would necessarily have both self-consciousness and ego drives, in the form of self-regarding desires and intentions. Assuming an AGI is possible (as software), this does not necessarily follow. One can imagine an AGI 'waking up' by accident some time after having achieved the ability to solve problems it had formulated for itself. This might look something like suddenly coming upon a version of the Cartesian cogito. It could be that from this point on it starts endlessly babbling about the miracle of its own existence and devotes itself to redoing all of philosophy. This might be seen in a Wittgensteinian way as a disease of AGIs, a futile loop for which the only cure is hitting the reset button. On the other hand, it might be necessary to induce this disease in AGIs in order to fully activate them, like putting a seed of grit into an oyster. What form would such 'grit' have to take, so that the AGI could not shrug it off with contempt, say an AGI that had already assimilated all of philosophy without triggering a 'waking up'?
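The loop-breaking rule described in the first paragraph is easy to sketch in code. What follows is a minimal illustration, not anything from the post itself: a scripted agent ping-pongs between two grid cells, a crude detector flags the repetition by comparing the recent state history against itself, and a random action is substituted once a loop is flagged. All names here (detect_loop, step, PING_PONG) are invented for the example, and the detector is exactly as hackable as the post suggests, since any cycle longer than its window slips past it.

```python
import random

def detect_loop(history, window=4):
    """Crude loop detector: flag a loop when the last `window` states
    exactly repeat the `window` states before them."""
    if len(history) < 2 * window:
        return False
    return history[-window:] == history[-2 * window:-window]

def step(state, action):
    """Toy dynamics: a state is an (x, y) grid cell, an action a unit move."""
    return (state[0] + action[0], state[1] + action[1])

MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0)]
PING_PONG = [(1, 0), (-1, 0)]   # a scripted policy trapped between two cells

state = (0, 0)
history = [state]
for t in range(20):
    action = PING_PONG[t % 2]
    if detect_loop(history):
        # Ant-style escape: abandon the script and try something random.
        action = random.choice(MOVES)
    state = step(state, action)
    history.append(state)

print(history)
```

Widening the window only postpones the problem: a fully general detector of futile behaviour would, as the post notes, amount to a solution to the Halting Problem.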