Discussion about this post

Alex Tolley:
"But even with all this hardware (or wetware), Chefbot wouldn’t know what anything tastes like, because Chefbot doesn’t know anything, and doesn’t taste anything."

I think you have gone too far here. Humans taste through receptors creating a pattern of receptor coding that identifies a combination. We experience the qualia of taste through some overlay of neural coding on that pattern. There is no intrinsic reason why ChefBot could not ultimately do the same thing with a newer architecture, perhaps one that allows "sensations" and even consciousness.

Humans have to find associations to understand words, which are no more than perceived scribbles on a surface, or sound waves tickling our cochlea to create a pattern. We associate flavours with the words "taste" and "smell", and associate these words with the sensations evoked by the receptor firing patterns. We understand the word "chair" by having experience of a movable device to sit on, with a back support (otherwise it is a stool). But more abstract concepts have to be understood through patterns of words, as there may not be any experience to have.

I have a friend who can no longer smell, which makes food "taste" very different from most people's experience. But he understands smell from prior experience. If he had had no sense of smell from birth, could he have any true understanding of smell?

So, a fun thought experiment, but perhaps you should not make unnecessary assumptions about ChefBot's capabilities.
