Brain modelling

忘了有多久 2021-02-06 05:42

Just wondering: we've reached 1 teraflop per PC, yet we are still not able to model an insect's brain. Has anyone seen a decent implementation of a self-learning, self-developing neural network?

8 answers
  • 2021-02-06 06:44

    Just wondering, we've reached 1 teraflop per PC, and we are still not able to model an insect's brain. Has anyone seen a decent implementation of a self-learning, self-developing neural network?

    We can already model brains. The question these days is how fast, and how accurately.

    In the beginning, effort went into finding the most abstract representation of a neuron, one that required the fewest physical properties.

    This led to the invention of the perceptron by Frank Rosenblatt at Cornell University, a very simple model indeed. In fact, it may have been too simple: the famous MIT AI professor Marvin Minsky (with Seymour Papert) showed that a single-layer perceptron cannot learn XOR (a basic logic gate that every computer we have today can emulate), and the result was widely read as a limit on neural networks in general. Unfortunately, that work helped plunge neural network research into the dark ages for at least 10 years.
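    A minimal sketch of that XOR limitation (not from the answer; the training loop, learning rate, and epoch count are illustrative assumptions): the classic perceptron learning rule converges on OR, which is linearly separable, but can never reach 100% accuracy on XOR, which is not.

    ```python
    # Hypothetical illustration: single-layer perceptron with a step activation,
    # trained with the classic perceptron learning rule. Hyperparameters are arbitrary.
    import numpy as np

    def train_perceptron(X, y, epochs=100, lr=0.1):
        w, b = np.zeros(X.shape[1]), 0.0
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                pred = 1 if xi @ w + b > 0 else 0
                err = yi - pred            # update only on mistakes
                w += lr * err * xi
                b += lr * err
        return w, b

    def accuracy(X, y, w, b):
        return ((X @ w + b > 0).astype(int) == y).mean()

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    y_or, y_xor = np.array([0, 1, 1, 1]), np.array([0, 1, 1, 0])

    w, b = train_perceptron(X, y_or)
    print("OR :", accuracy(X, y_or, w, b))    # 1.0 -- linearly separable
    w, b = train_perceptron(X, y_xor)
    print("XOR:", accuracy(X, y_xor, w, b))   # never 1.0 -- needs a hidden layer
    ```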

    While probably not as impressive as many would like, learning networks already exist that can do visual and speech learning and recognition.

    And even though we have faster CPUs, a CPU is still not the same as a neuron. Neurons in our brain are, at the very least, parallel adder units. So imagine 100 billion simulated human neurons, each updating with a "clock" of about 20 Hz and sending its output across 100 trillion connections. The amount of computation going on here far exceeds the petaflops of processing power we have, especially since our CPUs are mostly serial rather than parallel; see the rough arithmetic below.
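    Here is the back-of-envelope arithmetic behind that claim, written out as a sketch (the ops-per-synapse figure is an assumed illustrative number, not a measurement):

    ```python
    # Rough estimate of the simulation cost implied above.
    neurons     = 100e9    # ~1e11 neurons (listed for context; the synapse count dominates)
    synapses    = 100e12   # ~1e14 connections
    clock_hz    = 20       # the answer's rough "clock" of 20 Hz
    ops_per_syn = 10       # assumed ops per synaptic update (multiply-accumulate + bookkeeping)

    ops_per_second = synapses * clock_hz * ops_per_syn
    print(f"{ops_per_second:.0e} ops/s")                                      # 2e+16, tens of petaflops
    print(f"{ops_per_second / 1e12:,.0f} of the question's 1-teraflop PCs")   # ~20,000
    ```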

  • 2021-02-06 06:46

    Yup: OpenCog is working on it.
