Monday, 6 September 2010

Imaging the Brain in the Operating Room: Challenges and Promise

Based on a seminar by Frank M. Skidmore, director of the Movement Disorders Center at the VA Medical Center


The phrase "this isn't brain surgery" exists for good reason. Neurosurgery is definitely in my top 10 list of really hard things to do. Your brain is an extremely delicate mass of specialised tissue sealed in a tough, durable casing, and many functional regions are buried so deeply that they are often impossible to reach without burrowing through other active regions. Even so, it has become apparent that affecting these deep regions, or nuclei, can be hugely beneficial when treating motor diseases like Parkinson's.

Frank M. Skidmore directs the Movement Disorders Center at the VA Medical Center, Florida. A neurologist by trade, he works as a clinician, diagnosing neurological disease and referring patients to teams of surgeons. Many of his patients require deep brain stimulators (DBS): electrodes inserted into the brain that deliver electrical impulses, rather like a pacemaker. They act to interrupt misfiring nuclei in the hope of alleviating symptoms such as loss of motor control. Unfortunately, these nuclei are small and difficult to pinpoint, making accurate imaging of the brain, particularly while a patient is on the operating table, highly desirable.

What makes locating nuclei so difficult? Everyone's brain is different to a degree, so no universal structural map exists. Even if a perfect structural map did exist, certain nuclei are independent of structure and so must be found by determining a functional map. As a result, the boundaries between brain regions are extremely blurred. But it gets worse. Opening the brain case causes pressure changes: fluids start to drain and air rushes in, so the sponge-like mass of the brain begins to droop, and it does so continuously throughout an operation. Your small target is constantly moving. Even sneezing can cause your brain to shift!

The situation can be alleviated somewhat by using a combination of MRI scans taken prior to an operation and a stereotactic frame, a scaffold that bolts to the skull and provides a 3D co-ordinate system for navigation. But inaccuracies in placing a frame bring additional problems, and MRI scans can only provide a rough guide: when the patient is moved to an upright position the brain will shift yet again, in addition to the changes that occur during surgery.
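Conceptually, registering a pre-operative scan to the frame amounts to finding a rigid transform (a rotation plus a translation) between the two co-ordinate systems. The toy sketch below illustrates the idea only; the angle and offsets are invented for the example, and a real system derives them from fiducial markers on the frame that are visible in the scan.

```python
import math

def rigid_transform(point, angle_deg, translation):
    """Rotate `point` about the z-axis by `angle_deg`, then
    translate by `translation` (all values in mm)."""
    a = math.radians(angle_deg)
    x, y, z = point
    # Standard 2D rotation applied in the x-y plane.
    xr = x * math.cos(a) - y * math.sin(a)
    yr = x * math.sin(a) + y * math.cos(a)
    tx, ty, tz = translation
    return (xr + tx, yr + ty, z + tz)

# Hypothetical target in MRI image co-ordinates (mm), with the frame
# rotated 90 degrees and offset by (10, 0, 5) mm from the image axes.
target_in_frame = rigid_transform((12.0, 7.5, 40.0), 90.0, (10.0, 0.0, 5.0))
print(target_in_frame)
```

Even with a perfect transform, of course, the brain shift described above means the target drifts away from its scanned position, which is the whole problem.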

Surgeons do have one extra trick up their sleeves. Pulses of electrical activity can be recorded using an electrode and played back as sound during live surgery. As it happens, different parts of the brain produce distinct sounds, including the target nuclei, which in the case of Parkinson's sound like rain on a tin roof. This can indicate to the surgeon, with reasonable reliability, when the correct spot has been hit. The problem is that the electrode may have to be re-inserted repeatedly until it has been placed correctly. Each re-insertion causes damage in the form of scarring that can have major consequences for a patient's mental abilities. One patient, who suffered from hand spasms, lost his ability to form coherent sentences due to scarring of the speech centre of the brain, but surprisingly he didn't mind, as he could once again indulge in his hobby of model ship building.

The ultimate goal would be to do away with this trial-and-error approach and succeed with a quick, single-pass insertion. Pioneering surgeons are utilising mobile MRI scanners to provide live intra-operative imaging of the brain, but this poses many new problems. The machinery is sizable and complex, and special surgical equipment is required that won't be affected by the strong magnetic fields radiated by the scanner. Alternatively, a mobile CT (computerised tomography) scanner can be used, but software for tracking brain deformations during surgery does not yet exist, and a CT scan can only reveal structure, not function.

If the problem of imaging could be solved, it would need to be combined with robotic surgical hands that a computer could guide in precise co-ordination with live images. Advances in robotic surgery are promising, but the lack of technology for mapping brain function and structure during surgery limits its use to less complex body parts. Without detracting from the amazing feats that brain surgeons achieve today, there is still a long way to go if we are to be masters of our own minds.

Friday, 3 September 2010

The SixthSense Device: Projecting the digital world into reality

Designed and built by Pranav Mistry

A child develops a deep understanding of the physical world by interacting with everyday objects. We learn to associate objects with gestures, the movements we use to interact with an object, and how we can use an object to interact with other people. This use of gestures generally breaks down at the interface with computers. Data is typically accessed through a mouse, keyboard, or touch screen, and misses the intuitiveness of interacting with a physical object. SixthSense is a mobile device designed to bring physical objects alive with digital information.

Pranav Mistry wishes to bring the digital world back into reality in a cost-effective way that everyone can afford. His experiments began with a series of simple devices that allow real-world gestures to be used to interact with a computer. For example, by deconstructing an ordinary mouse, removing the rollers and attaching a basic system of pulleys, he was able to create a device that senses hand movements and controls a virtual pointer.

A common theme in his experiments was bringing a part of the physical world into the digital world. But what we really want is information, and we don't care about the computer itself, so why not reverse the process: extract data from the digital world and paste it into the physical world? This idea led to the SixthSense device.

The device itself is relatively simple. A portable projector hung around the neck does the job of pasting data onto any real-world object, such as a wall. To allow interaction with the projection, a small camera tracks the movement of a user's fingers (as long as the user wears coloured rings). Your fingers can then be used to perform any number of gestures, such as pinching to grab a graphic, pulling to zoom, or making the shape of a picture frame to take a snapshot. When combined with a mobile internet connection, the SixthSense device allows you to carry your entire digital world with you.
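Once the camera has located the coloured rings, recognising a gesture can be as simple as geometry on the fingertip positions. As a rough sketch (not SixthSense's actual code), a "pinch" can be detected by checking whether the thumb and index fingertips come within some small distance of each other; the threshold and co-ordinates here are invented for the example.

```python
import math

PINCH_THRESHOLD = 20.0  # pixels; an assumed value, tuned per camera

def is_pinch(thumb, index, threshold=PINCH_THRESHOLD):
    """Return True when the two tracked fingertip positions are
    close enough together to count as a pinch gesture."""
    dx = thumb[0] - index[0]
    dy = thumb[1] - index[1]
    return math.hypot(dx, dy) < threshold

# Fingertips about 11 px apart: a pinch. Far apart: not a pinch.
print(is_pinch((100, 100), (110, 105)))
print(is_pinch((100, 100), (200, 200)))
```

A real implementation would also smooth the tracked positions over several frames to avoid flickering between "pinched" and "not pinched".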

The device can also recognise objects, such as a book cover. Imagine you are browsing in a bookstore. On looking at a book, the device can recognise it and instantly project a review directly onto the cover, or perhaps the best online price, or a rating from your favourite book review website. Information can also be input into the system via the camera. If you are reading a magazine article, why not grab it with a pinch of the fingers and drag it onto your projected display, where it can be recognised as text and you can begin editing it, emailing it to friends, or blogging it.

The SixthSense device is a fantastic piece of kit that's within the budget of the masses, and all the software is freely available. It's a great way to pull yourself back out of the digital world and reconnect with the physical.

Thursday, 2 September 2010

UniView: A virtual atlas of the Universe

As demonstrated by Carter Emmart


What's our place in the universe? This is a deep philosophical question that has been pondered for thousands of years, but modern astronomy can now answer the literal form of the question, to a degree. Humans evolved to view the world as a flat horizon that seemed to stretch out to infinity in all directions. It wasn't until we got high above the Earth that we saw the horizon curve and realised we are confined to the prison of a finite sphere. Fortunately, if we gaze up past the horizon to the stars, our instincts are once again ignited by the vastness of space.

Carter Emmart's vision was to create a scientifically accurate, hugely detailed virtual atlas of the universe, capturing everything from the small-scale features of a planet's landscape all the way up to galaxy clusters. For 12 years he managed a team of research students, astronomers, and 3D artists to create UniView, a real-time visualisation of the universe.

UniView is based at the American Museum of Natural History and was supported by NASA as part of the rebuilding of the Hayden planetarium. UniView now forms the basis for space shows in numerous domes across the world.

The software allows you to take a virtual tour through the universe. From a view of the Earth you can zoom outwards, further and further into the viewable universe, until you have left our solar system, our galaxy, and our galaxy cluster, all of which are rendered accurately and beautifully. The further out you travel, the further into the past you move, until you leave the visible universe itself and can view the universe from before time began. From here you can observe the WMAP image of the cosmic microwave background radiation, the residue of the Big Bang.
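Why does travelling further out mean travelling further back in time? Light has a finite speed, so an object at a given distance is seen as it was when the light left it. A back-of-the-envelope sketch, using the Andromeda galaxy's roughly 2.5 million light-year distance as an illustrative round number:

```python
# Kilometres in one light year, and the speed of light in km/s.
LIGHT_YEAR_KM = 9.461e12
SPEED_OF_LIGHT_KMS = 299_792.458

def lookback_years(distance_light_years):
    """How many years into the past we see an object at the given
    distance. By the definition of a light year this essentially
    equals the distance itself, up to rounding in the constants."""
    seconds = distance_light_years * LIGHT_YEAR_KM / SPEED_OF_LIGHT_KMS
    return seconds / (365.25 * 24 * 3600)

# Andromeda, ~2.5 million light years away, is seen as it was
# roughly 2.5 million years ago.
print(f"{lookback_years(2.5e6):,.0f} years")
```

This is why zooming far enough out in UniView eventually reaches light that set off before any stars existed: the cosmic microwave background.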

UniView doesn't just feature planets and stars, but also reams of information on satellites, spacecraft, and space missions. You can follow the Voyager spacecraft on their journey beyond our solar system, or take a look at the mission map of Cassini's visit to the planets. UniView is made more powerful by allowing a continuous influx of new data. For example, images taken of the Earth's surface just hours before can be fed directly onto the virtual Earth.

As a civilisation we are beginning to see ourselves in a much wider context, and doing so will help us understand where we are in the universe, and maybe why we are here.

For more information visit the UniView website.

Wednesday, 1 September 2010

Drilling into the frozen past


Review of a Lee Hotz article

Leave your comfy, warm living room and venture south to the bottom of the world, where you'll hit Antarctica: the highest, windiest, driest and coldest place on the planet. Much of the continent has been pressed below sea level by the sheer mass of ice weighing it down. That ice represents a unique opportunity: to peer thousands of years into the past and observe incredibly detailed pictures of the Earth's ever-changing climate.


A team of 45 scientists and engineers from the University of Wisconsin are drilling deep into the ice at a location named WAIS Divide to extract clues about the future of climate change. The ice contains a precise record of the rise and fall of greenhouse gases and temperature as far back as the last ice age.

Snow falls in huge quantities at WAIS Divide, with each snowfall forming a layer that is slowly compressed as it becomes buried by later snowstorms. But it's not just snow that's buried. Storms wash dust, soot and trace chemicals out of the atmosphere and deposit them on the fallen snow, year after year. We can observe calcium deposits from desert formation, soot from distant wildfires, and methane levels indicating the strength of Pacific monsoons. Most importantly, each layer also traps the air itself. Roughly 10% of an ice layer is ancient air: a record of greenhouse gases, carbon dioxide, nitrous oxide, all unchanged since it was locked away thousands of years ago.

Drilling for Antarctic ice isn't easy. So much snow falls at WAIS Divide that the research team are practically living, and operating an $8 million drill, underneath the snow. They extract 10-foot-long ice cores ten times a day, which equates to drilling roughly 360 years further into the past each day. The ice cores are highly susceptible to damage and contamination and need to be kept at minus 20 degrees Celsius, so the team must work in a giant fridge located at the coldest place on Earth. The samples they extract are shipped out to 27 independent labs across the world, where they are analysed for 40 different trace elements at concentrations as low as 1 part in a quadrillion.
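Those figures imply a striking compression of history into ice, which a quick back-of-the-envelope calculation makes vivid. The 20,000-year figure for the last ice age is an assumed round number for illustration, not from the article:

```python
# Sanity-checking the article's numbers: ten 10-foot cores per day
# is 100 feet of ice per day, and if that spans ~360 years of
# snowfall, each year of climate record occupies under a third of
# a foot of ice.
CORES_PER_DAY = 10
CORE_LENGTH_FT = 10
YEARS_PER_DAY = 360
YEARS_TO_LAST_ICE_AGE = 20_000  # assumed round figure

ice_per_day_ft = CORES_PER_DAY * CORE_LENGTH_FT   # feet drilled per day
ft_per_year = ice_per_day_ft / YEARS_PER_DAY      # ice per year of record
days_needed = YEARS_TO_LAST_ICE_AGE / YEARS_PER_DAY

print(f"{ice_per_day_ft} ft of ice per day")
print(f"{ft_per_year:.2f} ft of ice per year of climate record")
print(f"~{days_needed:.0f} drilling days to reach the last ice age")
```

On those assumptions, reaching ice-age air would take on the order of fifty to sixty drilling days, which helps explain why the team effectively lives on site.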

Why continue such obviously difficult research? Don't we already know greenhouse gases are having adverse effects? Greenhouse gases are detrimental to the climate, but what we don't know is precisely what effect human activity will have on natural climate patterns, such as winds, ocean currents, precipitation rates, and cloud formation. These factors will affect billions of lives.

Snowstorms have given Antarctic ice an unprecedented knowledge of climate change; we just need to make sure we ask it lots of questions.

For more information on the WAIS Divide project see their website.