By delving into a theory of complex systems, you help me see things simply! In addition to the concept of the learn drive, we should be able to talk about the concept network! It addresses all your doubts (I believe). We can give up schools, learn naturally, build complex societies, optimize the control hierarchy, and maximize global intelligence (with the participation of humans and algorithms).
“ignoring complexity” is exactly what a toddler would do. There is no fear of the “complex world”, which we seem to learn at school (“if you do not know X, you will surely fail at Z”). It is just probing the environment and making the most of current knowledge to get some more knowledge.
Trying to drive a car without understanding the purpose of the brake pedal, seat belt or headlights will result in big trouble!
A fearless brain in the process of adaptation will make dozens of rounds around the car before attempting the first touch. Dogs are curious, but they never drive. Kids are even more curious, and they will seek out all the clues needed to figure out how the car works. A rat will jump into hot boiling oil and die while using its best navigational knowledge to escape (source: YouTube).
The learn drive will make optimal learning decisions. Exploration carries risk, and even good learning decisions may still lead to death (as discussed before).
Brains and evolution have a simple strategy: adapt the speed of exploration and conceptualization to the hazards of the world. Let the bravest and the most fearful die, and the optimum of the exploratory drive will be found. I support limiting behavioral spaces to expose kids to cars and pedals incrementally (in essence, slowing down the conceptualization). This way we can maximize the efficiency of the process without having kids die.
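To make that trade-off concrete, here is a toy sketch (my own illustration; the rate formula, the parameters, and the 100-step run are invented for the example, not taken from any model above). It only shows that lowering the hazard of the behavioral space lets the learner keep the rate of exploration high:

```python
import random

def exploration_rate(hazard, base_rate=0.9, caution=3.0):
    """Toy rule: the more hazardous the environment, the less the agent explores.
    'base_rate' and 'caution' are made-up parameters, not values from the text."""
    return base_rate / (1.0 + caution * hazard)

def step(knowledge, hazard):
    """One probing step: explore (risky, learns more) or exploit current knowledge."""
    if random.random() < exploration_rate(hazard):
        return knowledge + 1          # exploration: new knowledge gained
    return knowledge                  # exploitation: safe, nothing new learned

# A limited behavioral space (hazard=0.1) lets a child probe almost freely;
# an open road (hazard=0.9) forces slow, cautious conceptualization.
for hazard in (0.1, 0.9):
    k = 0
    for _ in range(100):
        k = step(k, hazard)
    print(f"hazard={hazard}: knowledge gained in 100 steps = {k}")
```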
George’s suggestion is correct: expose a concept network to the world and let it build adaptations incrementally.
how a controller can increase in complexity to match the environment that it wants to control
Evolution has found a super-simple and effective mechanism. I describe it in: Concept network: Optimization of connectivity
Your words make me recall that each concept in a concept network is, in a sense, a controller, and the network is a form of control hierarchy. The hierarchy emerges in response to the properties of the environment as perceived by the senses. It seems so perfect that I see it as unimprovable.
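For readers who like code, a minimal sketch of how I picture such a network (the class, the threshold, and the Hebbian-style strengthening are my own simplification for illustration, not the actual mechanism described in “Optimization of connectivity”):

```python
from collections import defaultdict

class ConceptNetwork:
    """Toy concept network: each concept acts as a tiny controller that fires
    when its weighted inputs cross a threshold; co-activation strengthens
    connections, so the hierarchy emerges from the statistics of the input
    (a crude Hebbian-style simplification, not a model of the brain)."""

    def __init__(self, threshold=1.0, learning_rate=0.1):
        self.weights = defaultdict(float)   # (source, target) -> connection strength
        self.threshold = threshold
        self.learning_rate = learning_rate

    def activate(self, inputs):
        """Propagate a set of active concepts one step through the network."""
        totals = defaultdict(float)
        for (src, dst), w in self.weights.items():
            if src in inputs:
                totals[dst] += w
        active = {c for c, total in totals.items() if total >= self.threshold}
        return inputs | active

    def learn(self, active):
        """Strengthen connections between concepts that were active together."""
        for src in active:
            for dst in active:
                if src != dst:
                    self.weights[(src, dst)] += self.learning_rate

# Repeated co-occurrence of "brake pedal" and "car stops" builds a connection
# that later lets "brake pedal" act as a controller predicting "car stops".
net = ConceptNetwork()
for _ in range(12):
    net.learn({"brake pedal", "car stops"})
print(net.activate({"brake pedal"}))
```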
The whole of humanity forms the next level of the concept network. All individual brains can be seen as concepts in the network. This idea would need to ignore our complex states (e.g. thought) or our actuators (e.g. the ability to take on a physical fight). However, future society seems to be moving in the direction of integrating all brains, all knowledge, and all artificial intelligences into one big concept network. In this evolution, I do not see the problem of “renounced autonomy”. All my rants against schooling are the exact effect of the “war of the networks” at the societal level (the same bad thing that happens at school, except that I hope for a better outcome). My inputs are contradictory, and I fight to retain the consistency of my model (of the world) by combating incongruent control signals. What we (in this forum) decide to say to the world will make other concepts/humans “wake up”, and we may get a global “brainwave” of change.
If you read “optimization of connectivity” (above), you might wonder how the brain would solve the problem of propagating an important message (to succeed in the fight for a better education system). The brain would simply grow a good axon and seek receptive dendrites. We can grow an easy-to-interpret message (literally reducible to an ON/OFF signal) and hope that sufficiently many neurons/humans will find the resulting output rewarding (in the sense of the “learn drive” reward).
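As a playful sketch of that “grow an axon and seek receptive dendrites” idea (everything here, from the little graph to the 30% receptivity, is invented for illustration): the ON/OFF message spreads only as far as nodes that find it rewarding enough to relay.

```python
import random

def propagate(graph, seeds, receptivity=0.3, rng=random.Random(1)):
    """Toy cascade: a simple ON/OFF message spreads from 'seeds' along existing
    connections; each neighbor adopts it (finds it 'rewarding') with the given
    probability. Names and the 30% receptivity are illustrative, not from the text."""
    on = set(seeds)
    frontier = list(seeds)
    while frontier:
        node = frontier.pop()
        for neighbor in graph.get(node, []):
            if neighbor not in on and rng.random() < receptivity:
                on.add(neighbor)              # the message found a receptive dendrite
                frontier.append(neighbor)     # ...and is relayed further
    return on

# A tiny "society": this forum seeds the message, and it either fizzles out or
# sweeps through, depending on how receptive the network happens to be.
graph = {
    "forum": ["reader1", "reader2"],
    "reader1": ["friend1", "friend2"],
    "reader2": ["friend3"],
    "friend1": ["friend3", "friend4"],
}
print(propagate(graph, {"forum"}))
```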