In one of my posts I talked about rendering brain volumes in-browser using XTK. The results, I’ll admit, weren’t spectacular; the volume rendering didn’t really give very defined edges. But now I’ll show a couple of methods for rendering a brain using Blender: the first uses volumetric data directly in Blender, and the second uses surfaces generated by FreeSurfer. I think the results are pretty cool, so check them out below.
Rendering MRI volumes in-browser with XTK
Recently I’ve been playing around with interactive visualizations, using tools like d3.js and GGobi. One of the things I like about interactive visualizations, as opposed to static graphics, is that you don’t have to make all the information available at once. You can present a broad overview of your data, and then, by having the user query specific data points, reveal more detail as needed. Take this example of airport flight connectivity in the United States: if you had to display all the airport names and all the connections in one graph, it’d probably look pretty gross and be very confusing to disentangle.
Similarly, with MRI data it’s hard to see the big picture all at once. MRI data is usually displayed as 2D slices. If you’re showing activations, you might show a couple of slices, perhaps one axial and one sagittal, so your audience can get an idea of where your clusters are located. If you wanted to show a whole brain, you could make an animated GIF that steps through the slices, like so.
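As a rough idea of how you might build that kind of GIF yourself, here’s a minimal Python sketch using nibabel and imageio. The filename, the choice of axis, and the frame duration are just placeholder assumptions, not anything specific to the post:

```python
# Sketch: step through the axial slices of a NIfTI volume and write
# them out as an animated GIF. Assumes nibabel and imageio are
# installed; "brain.nii.gz" is a placeholder filename.
import nibabel as nib
import numpy as np
import imageio

img = nib.load("brain.nii.gz")
data = img.get_fdata()

frames = []
for i in range(data.shape[2]):          # assuming the third axis is axial
    sl = data[:, :, i]
    # Scale each slice to 0-255 so it can be stored as an 8-bit GIF frame.
    rng = sl.max() - sl.min()
    sl = np.zeros_like(sl) if rng == 0 else (sl - sl.min()) / rng
    frames.append(np.rot90((255 * sl).astype(np.uint8)))

imageio.mimsave("brain_axial.gif", frames, duration=0.05)
```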