Bill:
You are using concepts like 'value' and 'care' that are implemented by human wetbits.
Is human 'value' implemented, in the 'what is', as anything different than a self-weighting of a high-level neural network, one that is self-reprogrammable?
Is 'value' literally the weighting applied to our self-reprogrammable neural networks?
Clearly self-reprogrammable, including even the ultimate 'value' of self-preservation and continuity: there are actual instances of human beings who volitionally choose to self-terminate.
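To make the 'value as self-adjusted weighting' suspicion concrete, here is a toy sketch. It is purely my illustration, not neuroscience and not anyone's established model: a tiny network nudges its own weights toward whatever outcome it currently weights as valuable (a plain delta-rule update; the names `reweight` and `valued_outcome` are invented for this sketch).

```python
# Toy illustration (an assumption, not a claim about brains): 'value' as the
# weighting a network applies to itself, revised by its own update rule.
import numpy as np

rng = np.random.default_rng(2)
W = rng.normal(size=(1, 3))  # the network's current 'values' (its weights)

def reweight(W, x, valued_outcome, lr=0.1):
    """One self-reprogramming step: nudge W so the network's output on x
    moves toward the outcome the system currently values."""
    err = valued_outcome - np.tanh(W @ x)[0]
    return W + lr * err * x, err  # simple delta-rule weight update

x = rng.normal(size=3)
errs = []
for _ in range(50):
    W, err = reweight(W, x, valued_outcome=0.9)
    errs.append(abs(err))

print(errs[-1] < errs[0])  # the network has re-weighted itself toward the value
```

The point of the sketch is only that 'valuing' here is nothing over and above a weighting plus a rule for revising that weighting.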
Is what it means to 'care' -- to 'value' -- as implemented in the human wetbits, not merely similar to, as an analog, but actually identical to, the weighting applied to a neural network (or neural-network-like constructs, like radial basis function networks)?
I am asking without knowing; it is a suspicion I have, not yet disproved by anything suggested here. I see the words 'care' and 'value', and forgive me, I am not being confrontational with this question. I am asking: what do they mean, and how might those concepts actually be implemented in our wetbits?
I am asking that, and not appealing to 'soul' or 'spirit' or anything beyond objective human wetbits.
If what I am suggesting is true (I don't know that to be the case, because I can't tell you how the emotion 'care' or the concept 'value' is actually implemented via human wetbits; I can only get glimpses of it -- a suggestion, a hypothesis -- in the way that neural-network-like systems can behave), then ... that is evidence not only of the machine inside of man, but by extension, of the machine inside the universe.
My crazy theory is nowhere near exposing God inside the universe. It is closer, I think, to exposing the machine inside of all of us.
Could a neural network, fed by sensory inputs and configured at a high level for system self-observation, be weighted to value 'system viability/continuity' very highly, and weight its subservient goal-based neural networks in service to that high-level goal?
Could 'pleasure' be defined by a high value of dopamine -- no, I mean, a high value of feedback in a special high-level neural network that serves as an input to the high-level 'system viability/continuity' network?
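The two questions above can be sketched in a few lines of toy code. Everything here is a hedged guess at the architecture I am imagining, not a model of anything real: a top-level 'viability' network reads sensory features plus a pleasure-like feedback scalar, and its output sets the mixing weights of two subservient goal networks (the names `W_food`, `W_harm`, and `pleasure_feedback` are all invented for this sketch).

```python
# Toy sketch (my assumption, not neuroscience): a high-level network that
# values 'viability' and weights subservient goal networks accordingly.
import numpy as np

rng = np.random.default_rng(0)

def subgoal_net(x, W):
    """A subservient goal network: a single tanh layer."""
    return np.tanh(W @ x)

# Hypothetical fixed weights for two subgoal networks (say, 'seek food'
# and 'avoid harm') and for the top-level viability network.
W_food = rng.normal(size=(1, 4))
W_harm = rng.normal(size=(1, 4))
W_top = rng.normal(size=(1, 5))  # 4 sensory features + 1 feedback input

def act(sensory, pleasure_feedback):
    # The top-level network sees sensory input AND a pleasure-like
    # feedback signal as just one more input.
    top_in = np.concatenate([sensory, [pleasure_feedback]])
    viability = np.tanh(W_top @ top_in)[0]  # scalar in (-1, 1)
    # Its output sets the mixing weights of the subgoal networks.
    w_food = 0.5 * (1 + viability)          # in (0, 1)
    w_harm = 1.0 - w_food
    return (w_food * subgoal_net(sensory, W_food)
            + w_harm * subgoal_net(sensory, W_harm))[0]

action = act(rng.normal(size=4), pleasure_feedback=0.8)
print(-1.0 <= action <= 1.0)  # the blended action stays bounded
```

In this sketch, 'pleasure' is nothing mystical: it is just one more scalar fed into the network that weights everything else.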
Could real-world sensory inputs and outputs be temporarily ignored, and our core neural networks be stimulated instead by randomized 'what if' scenarios, perturbations of previously recorded sensory inputs? To dream? To imagine? To synthesize? To shake and bake without shaking the bed...
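That 'dreaming' idea has a minimal mechanical reading, sketched below under my own assumptions (the memory matrix, `dream_batch`, and the noise scale are all invented for illustration): gate off live sensory input and instead feed the network perturbed versions of previously recorded inputs -- randomized 'what if' scenarios.

```python
# Minimal sketch of 'dreaming' as perturbed replay (an assumption, not
# neuroscience): replay recorded sensory episodes with random noise added.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical memory of past sensory episodes (one row per recorded input).
recorded = rng.normal(size=(10, 4))

def dream_batch(memory, n, noise_scale=0.3):
    """Sample past inputs and perturb them into 'what if' variants,
    standing in for imagination while real sensors are ignored."""
    idx = rng.integers(0, len(memory), size=n)
    perturbation = rng.normal(scale=noise_scale, size=(n, memory.shape[1]))
    return memory[idx] + perturbation

dreams = dream_batch(recorded, n=5)
print(dreams.shape)  # (5, 4): five imagined variants of recorded episodes
```

The perturbed replays could then be fed to the same goal networks that normally consume live sensory input, which is all 'imagining' amounts to in this picture.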
Do we know what 'care' is, how it is implemented in the wetbits?
My thought experiment isn't about building smart robots. It is in wondering: how different are we, really, from smart robots -- minus the mystical imaginations of 'soul' and 'spirit' and so on?
Would it really bother you to learn that we are all ultimately wetbit neural networks on steroids, and that with enough horsepower and real estate, a silicon variant -- not an exact analog, but a variant -- could not only emulate our abilities, but supplant them?
And while doing so, we could claim 'yes, but not wetbits...' as we turned them on and watched them either 'whoooooooooooosh' to the stars or ... BSOD (Blue Screen Of Death). Ha!
regards, Fred