Today's computers don't learn for themselves. Shaped by the history of computing, we build and program them in a particular way, and that way makes it hard for us to get them to adapt to us. There's no reason, however, we couldn't build them entirely differently.
Suppose, for example, we wanted to build a robot factory to make robots. Here's how we would probably do it today: We would spend years carefully planning the factory and then rigidly laying it out to achieve high efficiency per dollar spent. The factory would almost surely be some kind of conveyor-belt system with many highly specialized, very delicate and demanding, and very expensive robots stationed at different places along the belt. Everything would have to happen at some particular time, and that time might be measured down to the thousandth of a second.
Each part of the robots under construction, and the robots doing the constructing, would fit precisely---perhaps down to the thousandth of a meter. There would be no tolerance for even relatively minor mismatches.
The factory would be very efficient. But if anything at all ever went wrong, if anything was even slightly misaligned or mistimed, the whole factory could grind to a halt. In short, the factory couldn't adapt to the unexpected. It would be useful for nothing besides its narrow specialty. We create computer programs just that way.
Now contrast that with how nature might build such a factory. Each one of our cells, for instance, is just such a factory. If the robot factory were like a cell, then its main purpose would be to build and maintain all the robots in it, not some arbitrary set of robots. The factory would exist for the sole purpose of continuing to exist. Anything we wish the factory to do for us would have to be a byproduct of the factory's drive to exist---just as we trick bees and cows into producing honey and milk for us as a byproduct of their life cycles.
There would be no emphasis on efficiency in the factory, until and unless it had to compete with another one for resources. And then the only efficiencies would be over the particular resources competed for. Any still-abundant resources would be used profligately.
The factory's control would be decentralized. There would be no robot pressing the buttons to make everything work together---such a robot is too complicated and its uniqueness would make the system too fragile. Instead, each robot would be in it entirely for itself, and each robot would play a very limited role. No single robot would have any smarts to speak of.
There would be very many different kinds of robots, each playing a tiny role. The set of robots would be built in such a way that for any part or connection in any one robot, one or more other robots would make that part or attach that connection. If that part happened never to be attached for whatever reason, then too bad; the resulting robot simply wouldn't function the way its blueprint specified, if it functioned at all.
Even if a robot happened to be built exactly as its blueprint specified, it wouldn't be guaranteed to work. Several robots would exist in blueprint form only. They would be built only when needed and, once unneeded, unceremoniously scavenged for parts. The design of each kind of robot would have arisen through small successive random changes to the stored blueprints for that robot. And that blueprint mutation would be ongoing, so new kinds of robots would be continually produced.
Each robot would be built piecemeal. And the interactions between it and all other robots would be haphazard. Each robot would be continually roaming about at random, and chance meetings would determine whether two or more robots interacted to help build or maintain another robot. And newly built robots wouldn't necessarily be perfect.
If robots exchanged messages they wouldn't do so by a fixed, direct route or with a fixed, direct address. Each message would wander at random through the factory just like the robot it's intended for. If it happened to meet the right robot (its shape would dictate which one would be right for it) it would attach and alter what the robot did. In essence, a "message" would itself be a robot.
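The wander-and-match message scheme can be sketched as a toy simulation. Everything in it (the `Robot` and `Message` classes, the shape tags, the `step` function) is invented for illustration, not anything the text specifies:

```python
import random

# Toy sketch: a "soup" of robots that meet at random. A message is itself
# a robot; it binds to any robot whose shape fits and alters its behavior.

class Robot:
    def __init__(self, shape, behavior="idle"):
        self.shape = shape        # determines what can attach to this robot
        self.behavior = behavior  # what the robot currently does

class Message(Robot):
    """A message is just another robot; its shape names its target."""
    def __init__(self, target_shape, new_behavior):
        super().__init__(shape=target_shape)
        self.new_behavior = new_behavior

def step(soup, rng):
    """One tick: two random residents of the soup bump into each other."""
    a, b = rng.sample(soup, 2)
    msg, bot = (a, b) if isinstance(a, Message) else (b, a)
    if isinstance(msg, Message) and not isinstance(bot, Message):
        if bot.shape == msg.shape:       # the shapes fit: attach and alter
            bot.behavior = msg.new_behavior
            soup.remove(msg)             # the message is consumed

rng = random.Random(0)                   # fixed seed for repeatability
soup = [Robot("gear"), Robot("welder"), Robot("gear"),
        Message("welder", new_behavior="weld")]

# Let chance meetings run until the message finds its target.
while any(isinstance(r, Message) for r in soup):
    step(soup, rng)

print(sorted(r.behavior for r in soup))  # only the welder's behavior changed
```

No robot is addressed directly: delivery depends entirely on a chance meeting plus a shape match, so the message may take one tick or many to arrive, mirroring the factory's indifference to precise timing.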
There would be scavenger robots roaming about trying to dismantle other robots. But not only would these scavengers attack inactive robots, they would also chew on any active robots they happened to meet and attach to. There would also be waste-disposal robots to rid the factory of any unsalvageable parts cluttering up the place. Everything would be continually on the move, continually being built or dismantled.
No single copy of any robot would exist for very long. And usually there would be many copies of each robot, with the number of copies around at any time proportional to how important that particular kind of robot was, at that moment, to the functioning of the entire system. No single robot copy would be so important that it couldn't be dispensed with entirely.
To defend against floods, earthquakes, and fires, the factory would need damage control and repair robots. Also, it would have to extract useful work from some energy source, so it would need ingestion robots. If the factory is to survive very long, it would need storage robots to store some of the extracted energy for later use in emergencies. The factory would also need foraging robots to find and process raw materials.
Foraging robots of other factories might attack the factory and try to steal its resources to make more copies of themselves, or to bring back to their factory---which comes to the same thing. So the factory would have to have defense robots as well. And perhaps attack robots.
If the factory's own foragers came upon another factory, the two factories might develop a mutually beneficial trading arrangement. If that arrangement grew important enough to amount to symbiosis, each factory's welfare would become the direct concern of the other.
Eventually, such interacting factories might even band together in bigger and bigger groups, implicitly identifying themselves as one superfactory through a complex and interlocking series of symbiotic arrangements. The superfactory would now have a single common purpose---joint survival. It would have knit itself into what is essentially a living thing.
To a traditional engineer, that whole system sounds wacky and inefficient, but it's the system that made and continues to maintain us and all other living things. It's how we're built and how we work. Although it's inefficient, its great advantage over a highly specialized and totally efficient artificial version is that it can adapt to many changes in its environment. We could build---or perhaps a better verb is `grow'---computer programs just like that.