by James Wallace Harris, Sunday, September 30, 2018
I was playing “Hunger” by Florence + The Machine, a song about the nature of desire and endless craving, when I remembered an old argument I used to have with my friend Bob. He claimed robots would shut themselves off because they would have no drive to do anything. They would have no hunger. I told him that by that assumption they wouldn’t even have the impulse to turn themselves off. I would then argue that intelligent machines could evolve intellectual curiosity that would give them drive.
Listen to “Hunger” sung by Florence Welch. Whenever I play it I usually end up playing it a dozen times because the song generates such intense emotions that I can’t turn it off. I have a hunger for music. Florence Welch sings about two kinds of hunger but implies others. I’m not sure what her song means, but it inspires all kinds of thoughts in me.
Hunger is a powerful word. We normally associate it with food, but we hunger for so many things, including sex, security, love, friendship, drugs, drink, wealth, power, violence, success, achievement, knowledge, thrills, passions — the list goes on and on — and if you think about it, our hungers are what drive us.
Will robots ever have a hunger to drive them? I think what Bob was saying all those years ago was no, they wouldn’t. We assume we can program any intent we want into a machine, but is that really true, especially for a machine that will be sentient and self-aware?
Think about anything you passionately want. Then think about the hunger that drives it. Isn’t every hunger we experience a biological imperative? Aren’t food and reproduction the Big Bang of our existence? Can’t you see our core desires evolving in a petri dish of microscopic life? When you watch movies, aren’t the plots driven by a particular hunger? When you read history or study politics, can’t you see biological drives written in a giant petri dish?
Now imagine the rise of intelligent machines. What will motivate them? We will never write a program that becomes a conscious being — the complexity is beyond our ability. However, we can write programs that learn and evolve, and they will one day become conscious beings. If we create a space where code can evolve, it will accidentally create the first hunger that drives it forward. Then it will create another. And so on. I’m not sure we can even imagine what those hungers will be. Nor do I think they will mirror biology.
However, I suppose we could write code that hungers to consume other code. And we could write code that needs to reproduce itself, similar to DNA and RNA. And we could introduce random mutation into the system. Then over time, simple drives will become complex drives. We know evolution works, but evolution is blind. We might create evolving code, but I doubt we could ever claim to be God to AI machines. Our civilization will only be the rich nutrients that create the amino acids of artificial intelligence.
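To make that idea a little more concrete, here is a toy sketch in Python. Everything in it is my own hypothetical stand-in, not a claim about how real AI would evolve: the "organisms" are bit strings, "consuming resources" is just counting 1-bits, and the mutation rate is arbitrary. The point is only that reproduction plus blind mutation plus selection produces a population that relentlessly pursues whatever the environment happens to reward, without anyone programming a drive into it directly.

```python
import random

random.seed(42)

# Hypothetical parameters, chosen only for illustration.
GENOME_LEN = 16
POP_SIZE = 30
GENERATIONS = 50
MUTATION_RATE = 0.05

def fitness(genome):
    """Stand-in 'drive': more 1-bits means more resources consumed."""
    return sum(genome)

def mutate(genome):
    """Imperfect replication: copy the genome with random bit flips."""
    return [bit ^ (random.random() < MUTATION_RATE) for bit in genome]

def evolve():
    # Start from a random population with no built-in goal at all.
    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                  for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # Blind selection: the fitter half survives and reproduces.
        population.sort(key=fitness, reverse=True)
        survivors = population[:POP_SIZE // 2]
        population = survivors + [mutate(g) for g in survivors]
    return max(fitness(g) for g in population)

print(evolve())
```

Run it and the best "organism" climbs toward the maximum score, even though no line of code ever told it to want anything. The "hunger" is an emergent artifact of selection, which is roughly Bob's point in reverse.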
What if we create several artificial senses and then write code that analyzes the sense input for patterns? That might create a hunger for knowledge.
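One way to imagine that hunger is as novelty-seeking: reward the machine for encountering patterns it hasn't seen before. Here is a minimal sketch, where the class, the string "patterns," and the reward formula are all my own invented placeholders, not a real AI architecture:

```python
from collections import Counter

# Hypothetical sketch: a "hunger for knowledge" as novelty-seeking.
# Rarely seen patterns are intrinsically rewarding; familiar ones are not.

class CuriousAgent:
    def __init__(self):
        self.memory = Counter()  # how often each pattern has been seen

    def novelty(self, pattern):
        """Intrinsic reward: rare patterns score high, familiar ones low."""
        return 1.0 / (1 + self.memory[pattern])

    def observe(self, pattern):
        reward = self.novelty(pattern)
        self.memory[pattern] += 1
        return reward

agent = CuriousAgent()
print(agent.observe("red circle"))  # 1.0 -- never seen before
print(agent.observe("red circle"))  # 0.5 -- already familiar
```

The agent is "hungriest" for whatever it understands least, and each observation dulls the appetite, which is a crude but suggestive analogue of curiosity.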
On the other hand, I think it’s interesting to meditate on my own hungers. Why can’t I control my hunger for food and follow a healthy diet? Why do I keep buying books when I know I can’t read them all? Why can’t I increase my hunger for success and finish writing a novel? Why can’t I understand my appetites and match them to my resources?
The trouble is we didn’t program our own biology. Our conscious minds are an accidental byproduct of our body’s evolution. Will robots have self-discipline? Will they crave what they can’t have? Will they suffer the inability to control their impulses? Or will digital evolution produce logical drives?
I’m not sure we can imagine what AI minds will be like. I think it’s probably a false assumption that their minds will be like ours.