By gg42

Scientific Misinformation

October 28th, 2010

Stuart Hameroff, MD, is an anesthesiologist and professor at the University of Arizona. In one of many articles and videos about consciousness on the Huffington Post, Hameroff describes how anesthesia can help explain consciousness.

If the brain produces consciousness (all aspects of the term), then it seems to follow that turning off the brain will also turn off consciousness. This is exactly how anesthetics work.

While most anesthetics are nonselective “dirty” drugs, they all produce loss of consciousness, amnesia, and immobility by either opening inhibitory ion channels or closing excitatory ion channels in neurons. The commonly used intravenous drug propofol, for example, acts by potentiating GABA-A receptors, the ubiquitous inhibitory chloride channels of the CNS. Brain off = consciousness off.
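To make the cartoon concrete, here's a minimal sketch (in Python, with invented numbers; not a model of propofol's actual pharmacology) of a leaky integrate-and-fire neuron in which turning up an inhibitory conductance, a stand-in for potentiated GABA-A channels, progressively silences firing:

```python
# Toy leaky integrate-and-fire neuron; all parameters are invented for
# illustration, not fit to any real cell or drug.
def spike_count(g_inh, t_ms=500.0, dt=0.1):
    v = -65.0                                   # membrane potential (mV)
    v_rest, v_thresh, v_reset = -65.0, -50.0, -65.0
    e_inh = -70.0                               # inhibitory (chloride) reversal potential (mV)
    tau = 10.0                                  # membrane time constant (ms)
    i_exc = 20.0                                # constant excitatory drive
    spikes = 0
    for _ in range(int(t_ms / dt)):
        # leak + excitation + inhibitory conductance pulling v toward e_inh
        dv = (-(v - v_rest) + i_exc - g_inh * (v - e_inh)) / tau
        v += dv * dt
        if v >= v_thresh:                       # threshold crossed: spike and reset
            spikes += 1
            v = v_reset
    return spikes

for g in (0.0, 0.1, 0.2, 0.5):
    print(f"g_inh={g}: {spike_count(g)} spikes")   # firing falls as inhibition rises
```

With the inhibitory conductance at zero the model fires steadily; crank it up and the spikes stop, the toy version of "brain off = consciousness off."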

Hameroff does not subscribe to this. He argues that consciousness is an intrinsic part of the universe and that anesthetics simply disconnect it from the brain. He also thinks that by saying “quantum” a lot, he can scientifically prove the existence of the soul.

What’s scary is that Hameroff has “MD” and “Professor” next to his name. Will Joe the Plumber see through the misinformation?

Don’t take the HuffPost too seriously:

Consciousness and Anesthesia with Stuart Hameroff

Can Science Explain the Soul?


Replacing Neurons

August 20th, 2010

Imagine: a mad scientist with a ray gun shoots at a neuron somewhere in cortical layer IV of your visual area MT, burning it up in a matter of microseconds (just for fun, imagine also that the ray gun leaves everything else intact).

With one neuron missing, you probably won't notice any perceptual change. But what if, one by one, all neurons in area MT went AWOL? You'd be stuck with an annoying inability to visually detect motion.

Now imagine that our fancy ray gun replaces every cell it hits with a magical transistor equivalent. These magical transistors have wires in place of each and every dendrite, a processing core, and wires in place of the axon(s). Naturally, the processing core analyzes the sum of all inputs and instructs the axon to "fire" accordingly. Given any set of inputs to the dendrite wires, the output of the axon wires is indistinguishable from that of the deceased neuron.
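For concreteness, the magical transistor's core could be as simple as this sketch (a McCulloch-Pitts-style unit; the weights and threshold are invented, and real neurons also integrate over time, which this toy ignores):

```python
# Toy "magical transistor": weighted dendrite inputs are summed by the
# processing core, and the axon fires when the sum crosses a threshold.
def magical_transistor(dendrite_inputs, weights, threshold=1.0):
    """Return 1 (fire) or 0 (silent) given the inputs on the dendrite wires."""
    total = sum(x * w for x, w in zip(dendrite_inputs, weights))
    return 1 if total >= threshold else 0

# Replacement is judged by input-output equivalence: for any input pattern,
# the output must match the deceased neuron's.
print(magical_transistor([1, 0, 1], weights=[0.6, -0.4, 0.5]))  # 1.1 >= 1.0 -> fires
print(magical_transistor([0, 1, 1], weights=[0.6, -0.4, 0.5]))  # 0.1 <  1.0 -> silent
```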

We can still imagine that with one neuron replaced with one magical transistor, there wouldn't be any perceptual change. But what happens when more and more cells are replaced with transistors? Does perception change? Will our subject become blind to motion, as if area MT weren't there? Or will motion detection be just as good as with the real neurons? I am tempted to vote in favor of "No change [we can believe in]," but have to remain skeptical: there is simply no direct evidence for either stance.

Ray guns aside, it is not hard to see that a computational model of a brain circuit may be a candidate replacement for real brain parts (this is especially plausible given the computational success of the Blue Brain Project's cortical column, which comprises 10,000 model neurons and many more connections among them). For example, we can imagine thousands of electrodes in place of the inputs to area MT, feeding a computer model instead of MT neurons; the model's outputs are then connected, via other electrodes, to MT's downstream targets, and ta-da!

Not so fast. This version of the upgrade doesn't shed any more light on the problem than the first, but it does raise some questions: do the neurons in a circuit have to be connected in one specific way in order for the circuit to support perception? Or is it sufficient for the substitute's outputs to match those of the real circuit, given any set of inputs? And what if the whole brain were replaced with something that produced the same outputs (i.e. behavior) given a set of sensory inputs - would that "brain" still produce perception?
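The matching criterion is at least easy to state as a test: two differently built circuits count as interchangeable if no input pattern distinguishes their outputs. A toy sketch (both "circuits" here are made-up stand-ins, not models of MT):

```python
import random

def real_circuit(inputs):
    # Stand-in for the biological circuit: crude "motion edge" detector
    return [int(a and not b) for a, b in zip(inputs, inputs[1:] + [0])]

def substitute_circuit(inputs):
    # Built differently (arithmetic instead of logic), but same input-output map
    return [a * (1 - b) for a, b in zip(inputs, inputs[1:] + [0])]

def outputs_match(trials=10_000, n=8):
    """Probe both circuits with random input patterns and compare outputs."""
    for _ in range(trials):
        x = [random.randint(0, 1) for _ in range(n)]
        if real_circuit(x) != substitute_circuit(x):
            return False
    return True   # matching I/O; whether perception survives is the open question

print(outputs_match())  # True
```

Note what the test does and does not establish: it certifies behavioral equivalence over the inputs probed, and nothing more; that's exactly the gap the questions above are pointing at.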


Reading Good for You

August 12th, 2010

Research suggests that reading stimulates white matter growth, improves memory, and in general makes you smarter. If you think any of those things are good for you, visit Project Gutenberg, a collection of thousands of free e-books.

Disclaimer: reading is not intended to cure any disease. Consult your physician before engaging in excessive reading. Side effects include nerdiness, myopia, and know-it-all syndrome.


Extra extra!!! Storm brewing in espresso shot!

August 4th, 2010

The media is always hungry for juicy stories about anything. Topics of interest range from Lindsay Lohan's latest adventures to the implications of another political ethics violation. The science writers at the New York Times are no exception. Dennis Overbye confessed in an essay yesterday that some writers are so eager to report on sensational findings that they sometimes hype up their stories.

Shocking! Overbye gives an example of one such NYT article, which reported the amazing story that scientists had found hints of the elusive and mysterious dark matter in a Minnesota mine. He says the article stirred up hysteria but eventually left people disappointed, once someone cared to report that the signal was not far above what chance alone would produce. Overbye goes on to condemn the internet for spreading rumors, but he fails to note that the original hyped report on dark matter was written by him!

Perhaps our trusted science writers should do a bit more research before they publish their articles. But wait! They need the stories, and they need those stories to be catchy, dammit! Their job isn't to educate readers on the current state of whatever scientific field; their job is to report the latest findings. The more controversial they are, the better. There's a new article every day about how exercise is good for you (or is it bad? I can't remember anymore) or how prostate or breast exams for cancer have been wrong all these years (don't worry - they'll turn out to be right again next week). No wonder Americans are confused about their health.

Individual studies are great, but they have to be taken in context and have to stand the test of time. Most findings in basic science research are small; it's the knowledge collected over many experiments and years that gives us a big picture of any one field. So the next time you read about "a new study," take it with a critical grain of salt.


Blue Brains

July 20th, 2010

Henry Markram talks about the Blue Brain project


Toasters With Feelings

July 7th, 2010

The Brave Little Toaster

Anthropomorphism is the attribution of human characteristics to inanimate objects, animals, or God. It has been a hallmark of faiths and religions worldwide. Humans have a natural tendency to assign intentions and desires to inanimate objects ("my computer isn't feeling well today - he's so slow!"), but they also strip "lower" beings (animals) of those same human characteristics.

We have a history of treating animals with unnecessary cruelty. I don't mean killing for food - that's necessary for our survival; I'm referring to dog fights, hunting, and other violence. We didn't even think that animals could feel pain until quite recently!

Why do we think of lifeless forms as agents with intentions but of actual living creatures as emotionally inferior clumps of cells?

Could it be that the need to rationalize phenomena is simply stronger when the phenomena have absolutely no visible explanation?

And do toasters really have feelings??


On "Daring to Discuss Women in Science"

June 18th, 2010

New York Times columnist John Tierney recently dared to discuss women in science (the occasion being the House's passage of a bill called “Fulfilling the Potential of Women in Academic Science and Engineering”).

The problem is well known: there are fewer women than men in the top scientific positions at universities. Speculation about its causes has sparked considerable controversy over the years, leading to the demise of many an academic and instilling in the rest a fear of the taboo topic. (Larry Summers is a notable example: while president of Harvard, he gave a speech suggesting that innate differences might partly explain why fewer women reach the top ranks of science; the ensuing uproar contributed to his resignation.)

Discussing women in science has nevertheless remained sexy in a strange sort of way. Tierney cites some relevant research on the right tail of score distributions: while average scores on standardized tests are the same for boys and girls, boys outnumber girls four to one at the top math scores, around the 99.9th percentile, and this despite the push since the 1990s to close the gap. The fear is not only that this gap won't disappear anytime soon, but that its causes are genetic (innate!).

The 99.9th percentile aside, Tierney seems to have overlooked possible social reasons for the underrepresentation of women in the sciences. These might include the games and toys that parents and teachers give children (how many girls play games involving spatial reasoning?) and the gargantuan investment that women make when starting a family (having children and raising them is no small feat).

While neuroplasticity is such a hot topic in the brain sciences, perhaps someone could systematically explore the effect of early exposure to math and science on later performance on standardized tests.

As for family planning, maternal investment is a product of evolution and unlikely to change soon. Perhaps fathers can take one for the team by helping rear the kids (after they're born, of course)?

Tierney's original opinion may be found here.

-- G. Guitchounts

Howard Hughes Medical Institute Awards BU Neuroscience $1.5 million!

May 21st, 2010

Story here.

“Memristors” to replace your neurons? Thanks, but no thanks.

April 13th, 2010

Researchers at the Hewlett-Packard laboratories in California have produced tiny electronic switches called memristors (shortening of memory-resistor) that have the potential to revolutionize computing.

Traditional electronic devices use small switches called transistors as the elements of information storage and transfer. A typical computer has hundreds of millions to billions of transistors, each on the scale of tens of nanometers. Limits on how much smaller transistors can get pose a great threat to progress in integrated circuit design. Memristors – about 3 nanometers in length – therefore offer a new path for making smaller and denser electronic devices.

The team’s report in last week’s issue of Nature shows off the data, with electric traces that are hauntingly reminiscent of neuronal current-voltage plots and action potentials.

The New York Times quotes Dr. Chua, who envisaged memristors in 1971, as saying that “our brains are made of memristors… We have the right stuff now to build real brains.” But are these inglorious little switches really capable of mimicking biological brains?

Simply thinking of the scale differences suggests that the answer may be… maybe. A neuron cell body is on the order of 10-25 micrometers. Compare that to the 3 nanometers of the memristor. Furthermore, memristors operate on a time scale of nanoseconds, whereas most neurons are much slower, spiking in milliseconds.
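The back-of-envelope arithmetic, using the numbers above:

```python
# Scale comparison using the figures quoted in this post (taking the low
# end, ~10 micrometers, for the neuron soma).
soma_m = 10e-6         # neuron cell body, ~10 micrometers
memristor_m = 3e-9     # memristor length, ~3 nanometers
spike_s = 1e-3         # neuron spike timescale, ~1 millisecond
switch_s = 1e-9        # memristor switching, ~1 nanosecond

print(f"size:  memristor is ~{soma_m / memristor_m:,.0f}x smaller")   # ~3,333x
print(f"speed: memristor is ~{spike_s / switch_s:,.0f}x faster")      # ~1,000,000x
```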

So memristors are smaller and faster than neurons. In fact, current transistors are also smaller and faster than neurons. So why haven't computers taken over the world? For one, computers are designed to do what we tell them. And even maverick computers (if they exist) aren't nearly as smart as the average human. This is because information is processed in parallel in the brain and in series in the computer. Put simply: the brain does many things simultaneously, even if slowly, while the computer does only one thing at a time, very quickly. (Curious readers should see "The Computer and the Brain" by John von Neumann.)
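A toy illustration of the serial-versus-parallel point (all numbers invented; 10**11 is roughly the number of neurons in a human brain):

```python
# Idealized timing model: perfectly parallelizable tasks, no communication
# costs. Real brains and real computers are messier than this.
def serial_time(n_tasks, t_per_task):
    return n_tasks * t_per_task                   # one fast unit, one task at a time

def parallel_time(n_tasks, t_per_task, n_units):
    batches = -(-n_tasks // n_units)              # ceiling division into batches
    return batches * t_per_task                   # many slow units work at once

print(serial_time(10**11, 1e-9))                  # 100.0 s: one nanosecond-scale unit
print(parallel_time(10**11, 1e-3, 10**11))        # 0.001 s: many millisecond-scale units
```

Under these (cartoonish) assumptions, a vast army of slow units finishes the batch long before a single blazing-fast serial one does.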

So while memristors may be found inside your next nano-MacBook or iPod-atomic, don’t expect them to replace your neurons.

ORIGINAL NATURE PAPER: http://www.nature.com/nature/journal/v464/n7290/full/nature08940.html#B15

NY TIMES ARTICLE: http://www.nytimes.com/2010/04/08/science/08chips.html?ref=science

- G. Guitchounts

Evolutionary Neurobehavior Laboratory

June 17th, 2009

Cool Link:

The Evolutionary Neurobehavior Laboratory takes an evolutionary approach to understanding neurobehavioral traits and systems in human beings. They study a variety of topics, particularly sleep, Parkinson’s Disease, and religion.

Read more here.