Caveat lector: This blog is where I try out new ideas. I will often be wrong, but that's the point.

TEDbrain: Dragging neuroscience into the 21st century

(Source: Neil Girling)

That's me after my TEDxBerkeley talk last year. If I could give that talk again, it would be very different.

If I were giving it tomorrow, I'd talk about how much more we could be doing in neuroscience if only some of the old walls were brought down. If only there were a way to bring engineers, visual artists, designers, scientists, philosophers, mathematicians, and more together under one roof.

Not only do I want to see a TEDbrain, I want to see an entirely new research environment.

The mysteries of the brain aren't going to be solved with one beautiful equation, a super-elegant experiment, or a great new technique. The problem is bigger than that.

Understanding the brain isn't just a scientific problem, it's a data problem.

Granted there have been some amazing advances, but most cognitive neuroscience experiments are basically 1960s psychology experiments with better toys.

We can do better. We need to think bigger. We need to leverage the massive amounts of data we're collecting.

We need better data ontologies. Better data mining. Better data visualization.

Right now, neuroscience in particular is undergoing a renaissance.

Hell, when you've got people like Sergey Brin who want to help spur on novel research, it strikes me as silly that so many of the ways we think about brain problems haven't changed much at all.

Neuroscience is nearly unique among the sciences in how many different disciplines it draws from.

At Google last year, I gave a presentation titled Computational Analysis Methods and Issues in Human Cognitive Neuroscience. It was about how hard it is to make the most of the huge amounts of brain data we collect from each of our subjects: we run very narrow experiments and throw out most of the information in our datasets, intentionally removing it via averaging, filtering, smoothing, a priori selection, and so on.

Wasting data is stupid. But we all do it.
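To make the averaging point concrete, here's a toy sketch in Python with simulated data (not any real pipeline, and the numbers are made up): collapsing trials into an average keeps the evoked response but discards almost everything we actually recorded, including all trial-to-trial variability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated recording: 100 trials x 500 time points.
# Each trial = a shared evoked response + trial-specific noise.
t = np.linspace(0, 1, 500)
evoked = np.sin(2 * np.pi * 5 * t)
trials = evoked + rng.normal(scale=1.0, size=(100, 500))

# The standard move: average across trials.
erp = trials.mean(axis=0)

# What survives vs. what was recorded.
kept = erp.size          # 500 numbers remain
collected = trials.size  # 50,000 numbers were recorded
discarded_fraction = 1 - kept / collected

print(f"{discarded_fraction:.0%} of recorded samples collapsed by averaging")
# -> 99% of recorded samples collapsed by averaging

# Trial-to-trial variability is real signal about the subject,
# and it's gone from the average entirely.
print("spread of per-trial variance:", trials.var(axis=1).std())
```

The average is a fine summary, but everything interesting about how single trials differ from one another never makes it into the analysis.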

This winter, my wife and I put together brainSCANr in an attempt to synthesize and mine data from published research papers.
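The core idea behind this kind of literature mining is simple: count how often pairs of terms co-occur in published papers. Here's a minimal sketch, using a made-up toy corpus and a hypothetical hand-picked term list (brainSCANr's actual methods and data are not shown here):

```python
from collections import Counter
from itertools import combinations

# Toy corpus standing in for paper abstracts (invented text, not real data).
abstracts = [
    "theta oscillations in the hippocampus support memory encoding",
    "hippocampus lesions impair spatial memory in rats",
    "gamma oscillations in visual cortex track attention",
    "attention modulates gamma power in visual cortex",
]

# A stand-in for a real ontology of brain regions and cognitive functions.
terms = {"hippocampus", "memory", "oscillations", "attention", "cortex", "gamma"}

# Count how often each pair of terms appears in the same abstract.
pair_counts = Counter()
for text in abstracts:
    present = sorted(terms & set(text.split()))
    for pair in combinations(present, 2):
        pair_counts[pair] += 1

for pair, n in pair_counts.most_common(4):
    print(pair, n)
```

Scaled up to millions of abstracts, co-occurrence counts like these sketch out a graph of which brain structures and functions the literature links together, without anyone reading every paper.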

Tal Yarkoni and Russ Poldrack have done a more sophisticated job with PubBrain, a super cool, massive meta-analytic tool. They're doing amazing work, but they're a rare breed.

My soon-to-be post-doc advisor, Adam Gazzaley, is doing awesome work modernizing cognitive neuroscience. At TEDxSanJose this year, he spoke about taking neuroscience experiments out of the lab and into people's homes: giving people at-home EEG and testing them with more natural (that is, more "ethologically valid") stimuli.

He and I have a really amazing project in the works.

The Bay Area is uniquely positioned at the interface of academic science and the intellectual tech communities. It would be the perfect place for a TEDbrain event.

The academic system around us is falling apart. Grant writing and politics make for a difficult environment for a researcher.

How is a scientist supposed to innovate when they're worried about their next grant? But I digress...