Conscious machines are clearly impossible, and it has nothing to do with biology and everything to do with formalism and how it locks out intentionality. As soon as you design something, it's game over. Algorithms are empty and can't refer to anything:
===
You memorize a whole bunch of shapes. Then, you memorize the order the shapes are supposed to go in so that if you see a bunch of shapes in a certain order, you would “answer” by picking a bunch of shapes in another prescribed order. Now, did you just learn any meaning behind any language?
===
https://towardsdatascience.com/artificial-consciousness-is-impossible-c1b2ab0bdc46
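The "shapes" analogy above is, in effect, describing a lookup table: memorized input symbol sequences mapped to memorized output symbol sequences, with no access to what any symbol stands for. Below is a minimal Python sketch of that kind of purely syntactic responder. The shape tokens and the rule table are invented for illustration; they stand in for whatever symbol system the thought experiment imagines.

# Memorized rules: input shape-sequence -> prescribed output shape-sequence.
# The "shapes" are opaque tokens to the program; nothing here encodes or
# accesses the meaning of any symbol. (Tokens and rules are made up for
# illustration only.)
RULES = {
    ("△", "□", "○"): ("◇", "▽"),
    ("○", "○"): ("□",),
}

def respond(shapes: tuple[str, ...]) -> tuple[str, ...]:
    """Return the memorized answer for a known shape sequence.

    There is no parsing, no model of the world, and no notion of reference;
    the function either finds an exact match in the rule table or gives up.
    """
    return RULES.get(shapes, ("?",))

if __name__ == "__main__":
    print(respond(("△", "□", "○")))  # ('◇', '▽') -- a "correct" answer
    print(respond(("□", "△")))       # ('?',)     -- no memorized rule

However large the rule table grows, the program is still only matching shapes to shapes, which is the point the quoted passage is making.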
"evolutionary [ancestor] of wolves". It would make more sense to say dogs are the evolutionary descendants of wolves.
Yes, that should say descendant, good spot.