Turns out our brains can store a petabyte of information – basically the whole Internet
The human brain's memory capacity may be as much as 10 times larger than previously thought, according to a new study led by Terry Sejnowski at the Salk Institute in California, which examined how hippocampal neurons manage high computational power on a low energy budget.
The researchers built a 3D reconstruction of rat hippocampus tissue – the memory centre of the brain – and in doing so, discovered something strange: in about 10 percent of cases, a single axon formed duplicate synapses – the junctions between neurons – onto the same dendrite.
To measure the differences between these duplicate synapses, Sejnowski's team reconstructed the connectivity, shapes, volumes, and surface areas of the rat brain tissue at nanomolecular resolution, using advanced microscopy and computational algorithms.
The finding that synapses can vary in size by increments as small as 8 percent suggests there may be as many as 26 distinct synapse size categories, rather than just a few, as scientists previously believed. According to the researchers, this extra gradation in synaptic size translates into a huge boost in the brain's potential memory capacity. Their calculations also suggest that synapses change their size and strength in response to neural transmissions: roughly 1,500 transmissions over about 20 minutes trigger a change in small synapses, while only a couple of hundred transmissions over 1 to 2 minutes will change large synapses.
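The link between size categories and storage capacity is a matter of information theory: 26 distinguishable synapse sizes correspond to log2(26), or roughly 4.7, bits per synapse. The sketch below illustrates the back-of-envelope arithmetic; the whole-brain synapse count is an assumption for illustration only (published estimates vary widely), not a figure from the study.

```python
import math

# 26 distinguishable size categories correspond to log2(26) bits of
# information per synapse (assuming each category is equally likely).
bits_per_synapse = math.log2(26)  # ~4.7 bits

# Back-of-envelope whole-brain estimate. The synapse count below is an
# illustrative assumption; common estimates range from ~10^14 to ~10^15.
synapse_count = 1e15
total_bits = bits_per_synapse * synapse_count
total_petabytes = total_bits / 8 / 1e15  # 8 bits per byte, 10^15 bytes per PB

print(f"{bits_per_synapse:.2f} bits per synapse")
print(f"~{total_petabytes:.1f} PB total")
```

Under that assumed synapse count, the total lands in the petabyte range, which is where the headline figure comes from.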
The researchers believe these findings could drive advances in computing, pointing the way toward ultra-precise, energy-efficient machines that employ deep learning and neural network techniques.