In fact, the hardest thing to implement or copy from the human brain is the concept of "approximately".
Whilst every computing model can easily handle 1 + 1 = 2,
the human brain has no problem handling "1 + a bit = a bit more than 1".
And if you add "some more" to it, the result is probably 2, or more, or less.
From there on the human brain can keep reasoning with these "relative numbers", whilst any computer will simply block.
We're still far from implementing a working "fuzzy logic" in computers; everything tried so far has either failed at some point or been restricted to a specific, extremely narrow situation.
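For what it's worth, the "1 + a bit" idea can at least be sketched in code. Below is a minimal toy in Python using triangular fuzzy numbers, where a value is a (low, peak, high) triple; all the names and the ranges chosen for "a bit" and "some more" are my own illustrative assumptions, not anybody's real fuzzy-logic implementation.

```python
# Toy "approximate" arithmetic with triangular fuzzy numbers.
# A value is (low, peak, high): peak is the most plausible value,
# low/high bound how far the vagueness might stretch.
# The ranges below are made-up illustrations.

def fuzzy_add(a, b):
    """Add two triangular fuzzy numbers component-wise."""
    return tuple(x + y for x, y in zip(a, b))

one = (1.0, 1.0, 1.0)        # the crisp number 1
a_bit = (0.0, 0.3, 0.7)      # "a bit": small, vague, positive
some_more = (0.2, 0.7, 1.5)  # "some more": larger and even vaguer

a_bit_more_than_one = fuzzy_add(one, a_bit)
print(a_bit_more_than_one)   # ~ (1.0, 1.3, 1.7): "a bit more than 1"

probably_two = fuzzy_add(a_bit_more_than_one, some_more)
print(probably_two)          # ~ (1.2, 2.0, 3.2): "probably 2, or more, or less"
```

Of course this only pushes the problem around: the machine still crunches exact floats underneath, and picking the triples is exactly the part a human does effortlessly and a computer can't.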
Ah well, some day, mayhaps, or never..... I need a drink and a smoke.