>It's a fundamental law that can be applied to freaking machines, the mechanisms will be different but it will still be Darwinist. We haven't even begun to tap selection as a design/engineering tool.
We are starting to. Gradient descent, the algorithm that adjusts kernel filters in deep neural networks, is a form of stepwise evolution. Error is measured at the output, and over the next epoch each weight is nudged up or down based on that error and on the direction indicated by the slope of the loss surface (the gradient). This is the core of how we create algorithms that learn and evolve over time: they absorb new data and then measure feedback at the output step. Evolution itself is a closed-loop control system.
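That feedback loop is easy to see in code. Here's a minimal toy sketch (my own illustration, not from any particular framework): a single weight, a forward pass, an error measured at the output, and a correction in the direction the gradient indicates, repeated each epoch.

```python
# Toy gradient descent: fit a single weight w so that w * x
# approximates a target output. Each epoch: forward pass,
# measure error at the output, step w against the gradient.

def train(x=2.0, target=6.0, lr=0.1, epochs=100):
    w = 0.0  # initial weight
    for _ in range(epochs):
        output = w * x            # forward pass
        error = output - target   # measure error at the output
        grad = 2 * error * x      # slope of squared-error loss wrt w
        w -= lr * grad            # adjust weight down the gradient
    return w

w = train()
# w converges toward 3.0 (since 3.0 * 2.0 == 6.0)
```

Same skeleton as a real deep network, just with millions of weights and the gradient computed by backpropagation instead of by hand.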