Video Script: The History and Future of Electron Tomography
This page presents the script I wrote to trace the history of electron microscopy (EM) and to highlight the incredibly steep exponential increase in data rates over the last few years, up to 2012. The video is featured on the front page of a new "SLASH Segmentation" website, which aims to help scientists in EM deal with massive datasets that might otherwise take years to segment the "slow way".
Script: History and Future of Electron Tomography
In this video I’ll talk a bit about the history and future of electron microscopy.
The electron microscope was first conceived by Hungarian physicist Leo Szilard, but it was German physicist Ernst Ruska and electrical engineer Max Knoll who built the first transmission electron microscope (TEM) in 1931. Within two years of its introduction, the TEM was able to surpass the resolution of the light microscope (LM).
Soon afterwards, in 1935, German engineer Manfred von Ardenne built the first scanning electron microscope.
Then, in 1938, von Ardenne helped construct the famous Siemens TEM, a microscope regarded as the first practical electron microscope.
In 1945, electron microscopes achieved the milestone of 1 nm resolution.
In 1963, the world's first commercially available scanning electron microscope came out of Cambridge.
… And, around 1982, charge-coupled devices (CCDs) were first installed in electron microscopes. Up until this point, electron microscopes all saved images to film, but with CCD cameras, images could be saved directly to a computer hard drive.
In the early 1990s, focused ion beam (FIB) instruments were introduced.
In 1993, lens-coupled cameras were introduced.
By the year 2000, technical advances made it possible to collect about 1 GB of image data per week.
But since then, further technical advances, including the reintroduction of serial block-face scanning electron microscopy by Winfried Denk in 2004, have made it possible to acquire data much, much faster.
And in 2007, a direct detection device was introduced by Mark Ellisman, James Bouwer and Nguyen-Huu Xuong… and both of these developments have further accelerated the rate of data acquisition.
In fact… let’s look at what's happened to the rate of data acquisition since 1982. This blue trend line approximates the amount of data new instruments can obtain when an operator spends a full week on the microscope collecting images.
By contrast, this green line shows the approximate speed at which a single human can manually segment these images of cells when asked to trace every visible compartment on every slice. If our user is a segmentation expert, they can use techniques like interpolation and semi-automatic drawing tools to increase their productivity by a factor of 10… but the point here is that manual segmentation can never keep pace with data acquisition. Taking a closer look at this curve… from 1 GB in 2000, microscopes such as this one, using the 3View serial block-face system, could acquire 140 GB per week by 2009. And a mere two years later… it became possible to acquire over two terabytes of data in a week - and in fact, with a 3View system, this doesn't even require a person to be present. At this stage, you would already need on the order of 1,000 to 10,000 users (depending on skill level)… just to keep up with tracing your data. And the trend won't stop here… companies continue to develop new ways to collect data faster - in fact, in my own lab, James Bouwer developed an 8k lens-coupled camera in 2009 capable of 5 TB of montage data per week. As we speak, several other new technologies are being developed by various companies and institutes, and these promise to acquire data even faster - but most of these projects are sworn to secrecy.
In these graphs, we've compared the trend of electron microscope data acquisition against Moore’s law, where the average size of new hard drives (in red) doubles every 1.5 years, as you can see in the bottom graph, where a log scale has been used. Data acquisition from EM has been like Moore's law on steroids… and the blue trend has already overtaken the red one, meaning you could buy one hard drive a week and it still wouldn’t be able to keep up with one of the latest electron microscopes.
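As a sanity check on the "Moore's law on steroids" claim, the doubling time implied by the script's own data points (1 GB/week in 2000, 140 GB/week in 2009, roughly 2 TB/week in 2011) can be computed with a short sketch. This is my own illustration, not part of the video: the function name is invented and 2 TB is approximated as 2048 GB.

```python
import math

def doubling_time(years, start_gb, end_gb):
    """Doubling time (in years) of exponential growth from start_gb to end_gb."""
    return years / math.log2(end_gb / start_gb)

# Data points quoted in the script (assumption: 2 TB ~= 2048 GB)
em_2000_to_2009 = doubling_time(2009 - 2000, 1, 140)   # roughly 1.26 years
em_2000_to_2011 = doubling_time(2011 - 2000, 1, 2048)  # exactly 1.0 year

# Moore's-law-style hard drive growth doubles every ~1.5 years,
# so EM acquisition (doubling every ~1-1.3 years) outpaces it.
print(em_2000_to_2009, em_2000_to_2011)
```

On these numbers, EM throughput doubled roughly every year over 2000-2011, faster than the 1.5-year doubling quoted for hard drive capacity, which is consistent with the blue trend overtaking the red one.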
However, the reason we're showing you these trends is not to impress you. Instead, we want you to think seriously about the growing divide between how fast we can acquire data and how fast we can actually segment it.
The challenge now is not who can acquire data the fastest.
The challenge now is who can analyze data the fastest.
Already, there are individual labs with petabytes of unanalyzed data. And this, for us, is the real motivation behind our SLASH initiative - an initiative that helps scientists get value out of these massive datasets by way of methodology, such as pairing high-fidelity segmentation with other techniques like stereology, and by way of hybrid methods that combine the speed and scalability of automatic segmentation with the accuracy of manual segmentation. Only by using such methodologies can we hope to achieve something meaningful from the massive quantities of data we already have, and which will continue to pile up as this exponential trend in data acquisition continues over the next few years.
For more information please visit slashsegmentation.com and watch some of our other videos.