Sat, 21 May 2016, 8:09

Going https: Let’s Encrypt

The Brain Catalogue uses WebSockets for all its interactive editing. Until now we had been using unencrypted WebSockets, and the catalogue’s website itself was also unencrypted (http instead of https). Moving to a secure communication protocol involves a few small changes in the code, but above all it requires obtaining a “certificate” proving that we are who we say we are.

Until recently, these certificates were sold by a handful of certification authorities, and they were quite expensive. But today we have Let’s Encrypt: “a free, automated, and open certificate authority, run for the public’s benefit. Let’s Encrypt is a service provided by the Internet Security Research Group”. Using a very simple script, we were able to secure the Brain Catalogue’s communications in no time. Here are the steps we followed:

1. Clone certbot from GitHub:

git clone https://github.com/certbot/certbot

2. Generate a certificate using certbot-auto:

certbot-auto certonly --webroot -w /var/www/html/braincatalogue -d braincatalogue.org

3. Add a secure virtual host to our Apache configuration:

<VirtualHost *:443>
    ServerName braincatalogue.org:443
    SSLEngine on
    SSLProtocol all -SSLv2
    SSLCipherSuite DEFAULT:!EXP:!SSLv2:!DES:!IDEA:!SEED:+3DES
    SSLCertificateFile /path/cert.pem
    SSLCertificateKeyFile /path/privkey.pem
    SSLCertificateChainFile /path/fullchain.pem
</VirtualHost>

After restarting Apache, the Brain Catalogue can be reached through https.

4. For the secure WebSocket configuration, we need to instantiate the WebSocket server on top of an https server, like this:

var fs = require('fs');
var httpServ = require('https');
var WebSocketServer = require('ws').Server; // assuming the 'ws' package

// ws_cfg.json holds the port and the paths to the key, certificate and chain
var ws_cfg = JSON.parse(fs.readFileSync('ws_cfg.json'));

// an https server with an empty request handler: it only carries the TLS layer
var app = httpServ.createServer({
    key: fs.readFileSync(ws_cfg.ssl_key),
    cert: fs.readFileSync(ws_cfg.ssl_cert),
    ca: fs.readFileSync(ws_cfg.ssl_chain)
}, function(req, res){}).listen(ws_cfg.port);

var websocket = new WebSocketServer({server: app});

Where the file ws_cfg.json contains the port and the paths to our key, certificate and chain. You can now try the Brain Catalogue in full security (well… easier said than done…) here: https://braincatalogue.org.
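For reference, ws_cfg.json could look something like this (a sketch; the paths and port are placeholders, matching the keys the server code reads):

```json
{
    "ssl_key": "/path/privkey.pem",
    "ssl_cert": "/path/cert.pem",
    "ssl_chain": "/path/fullchain.pem",
    "port": 8080
}
```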

Big thanks to Sou, Yoko and Félix for telling us about Let’s Encrypt!

Sat, 16 Jan 2016, 11:31

At Brain Catalogue we love Zenodo

Zenodo (http://zenodo.org) is a research data repository created by OpenAIRE and CERN to provide a place for researchers to store datasets. Similar to figshare, Zenodo can store your data and give you a DOI to make it citable.
We have started to deposit all of the Brain Catalogue’s data at Zenodo, and soon you should be able to cite your favourite brains in your work.
Initially we uploaded the data manually, but that soon became tedious. Luckily, Zenodo has a very simple, well-documented API. With just three curl commands you can deposit a data file and make it citable (full information is available at https://zenodo.org/dev).

Before starting anything you need to obtain a token, which is a random alphanumeric string that identifies your queries. You only need to do this once. With your token safely stored (I keep it in the $token variable), data uploading takes just 3 steps:

1. Create a new deposit and obtain a deposit ID:

curl -i -H "Content-Type: application/json" -X POST --data '{"metadata":{"access_right": "open","creators": [{"affiliation": "Brain Catalogue", "name": "Toro, Roberto"}],"description": "Brain MRI","keywords": ["MRI", "Brain"],"license": "cc-by-nc-4.0", "title": "Brain MRI", "upload_type": "dataset"}}' https://zenodo.org/api/deposit/depositions/?access_token=$token |tee zenodo.json

Zenodo responds with a JSON document, which here I’m saving to zenodo.json. You can then use awk to parse that file and recover the deposit id. I do it like this:

zid=$(cat zenodo.json|tr , '\n'|awk '/"id"/{printf"%i",$2}')
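The one-liner can be checked against a mock response (the JSON below is illustrative; a real Zenodo reply contains many more fields):

```shell
# A mock of Zenodo's reply, written to mock.json so we don't clobber a real one
echo '{"id": 12345, "title": "Brain MRI", "state": "unsubmitted"}' > mock.json

# Split on commas, find the line containing "id", and print its numeric value
zid=$(cat mock.json | tr , '\n' | awk '/"id"/{printf"%i",$2}')
echo "$zid"   # → 12345
```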

With your deposit ID in hand, you are ready to upload your data file.

2. Upload data file:

curl -i -F name=MRI.nii.gz -F file=@/path/to/the/data/file/MRI.nii.gz https://zenodo.org/api/deposit/depositions/$zid/files?access_token=$token

The server will respond with an HTTP 100 ‘Continue’ message and, depending on the size of your file, you’ll have to wait some time. Once the upload is finished you are ready to

3. Publish your dataset:

curl -i -X POST https://zenodo.org/api/deposit/depositions/$zid/actions/publish?access_token=$token

And that’s it. You can now go to Zenodo and view the web page for your data.
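At the time of writing, the public page for a published deposit lives at a URL of this form (a sketch; 12345 is a placeholder id):

```shell
zid=12345   # the deposit id recovered in step 1 (placeholder value)
record_url="https://zenodo.org/record/$zid"
echo "$record_url"   # → https://zenodo.org/record/12345
```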

Sat, 3 Jan 2015, 7:47

New precise touch segmentation for tablets

Tablets could be a great tool for volume segmentation. They are lighter and cheaper than graphic tablet displays such as the Cintiqs. However, on tablets such as the iPad, drawing accurately is not easy because your finger hides the place where you draw. Passive styluses aren’t of much help either: they are often imprecise, larger than the pixels in the image, and it’s difficult to tell exactly where the stroke will land.

We have implemented a new cursor in Brain Catalogue’s collaborative segmentation interface that allows accurate drawing on a tablet. The idea is to dissociate the place where you draw from the place where you put your finger (or stylus, if you prefer). Brain Catalogue’s cursor is therefore composed of two parts: the pen and the ring. The pen is the square where you draw (its size can be set from 1 to 15 pixels). The ring is where you put your finger. The cursor has three states: when the ring is yellow, the cursor moves without drawing. Hold the ring for one second and it turns orange, letting you reposition it relative to the pen. To draw, tap the yellow ring so it turns green: now you can draw without your finger hiding the pen!

Precise touch segmentation cursor: Yellow for moving the cursor, Green for drawing and Orange for displacing the ring.


You will only see the new cursor if you use the Brain Catalogue from a tablet. Watch the video for a demonstration of segmenting our Red Kangaroo using an iPad.

Sat, 15 Nov 2014, 20:17

Red-necked wallaby

We just uploaded the 3rd marsupial of the Brain Catalogue: the Red-necked wallaby. The brain comes from the Vertebrate Brain Collection of the Muséum d’Histoire Naturelle de Paris, and was scanned by Emmanuel Gilissen, from the Royal Museum for Central Africa, in Belgium.

Here is also a short video of some of the red-necked wallabies of the Jardin des Plantes in Paris, showing them going about their own business.

Tue, 2 Sep 2014, 13:01

Collaborative segmentation of the Black rhinoceros (mesh available)

The MRI of a Black rhinoceros (which is actually grey) has been in the Brain Catalogue for some time already. But folded as it is, segmenting it manually was not an easy task, and the MRI was displayed on its own.

A bit more than a month ago, I started writing a tool, AtlasMaker, to segment MRI data online, collaboratively. AtlasMaker works in real time: you can see immediately what other people are doing, and you can chat with them (the code is available on GitHub here).

In the future, I would like to add AtlasMaker to all the brains in the Brain Catalogue, but for the moment only the Black rhinoceros is available. You are welcome to give it a try at http://brainspell.org/atlasMaker. And here’s a quick tutorial showing how it works:

After announcing the AtlasMaker alpha on Google Plus, a group of anonymous internet neuroanatomists helped me segment the Black rhinoceros brain in just a few days! There are still places where the segmentation could be improved, but think Wikipedia: if at any time you see room for improvement, just go for it (all revisions are saved).

To create the mesh, I downloaded the brain segmentation from AtlasMaker, cut the cerebellum, cleaned up the olfactory bulbs a bit, and used isosurf. I then decimated the mesh to a reasonable size and smoothed it with MeshLab (Taubin smoothing, which smooths the mesh without shrinking it). You can see it in all its interactive 3D beauty here:

http://braincatalogue.org/Black_rhinoceros

And a picture to thank once more all the people who helped with the segmentation!

Mon, 14 Apr 2014, 20:29

Chimpanzee, Gorilla and Orangutan MRI data available

We have uploaded data for three of our primate relatives: Chimpanzee, Gorilla and Orangutan. All three specimens come from the Vertebrate Brain Collection of the Natural History Museum in Paris. The Chimpanzee and the Gorilla were scanned by Emmanuel Gilissen from the Royal Museum for Central Africa in Tervuren, Belgium, and the Orangutan by Mathieu Santin and Jean Daunizeau at the Institut du Cerveau et de la Moelle Epinière in Paris.

You can have a look at the new specimens here:

http://braincatalogue.org/Chimpanzee
http://braincatalogue.org/Gorilla
http://braincatalogue.org/Orangutan

Sun, 13 Apr 2014, 12:25

Thylacine MRI data available

The Thylacine, or Tasmanian Tiger, was a carnivorous marsupial that lived in Australia, Tasmania and New Guinea. At first sight it may look like a cross between a jackal and a tiger, but it is in fact closer to kangaroos. During the 1920s, many farmers thought that Thylacines were killing their poultry, and the government even paid 1 pound per head for a dead adult and 10 shillings per pup.

Thylacines went extinct during the 1930s (even though there are still reports of sightings!), but a brain was conserved at the Vertebrate Brain Collection of the Muséum d’Histoire Naturelle de Paris. The brain was scanned by Emmanuel Gilissen, from the Royal Museum for Central Africa in Tervuren, Belgium. It is a bit damaged, but we are working on reconstructing a 3D surface render. In the meantime, you can look at the MRI data here:

http://braincatalogue.org/Thylacine.

This video, from Wikipedia, shows a compilation of the last images that remain of this surprising animal:

Thu, 14 Feb 2013, 18:17

Sloth bear MRI data available

The sloth bear (Melursus ursinus), also known as the Stickney bear or labiated bear, is a nocturnal, insectivorous species of bear found in the wild in the Indian subcontinent.

Our specimen comes from the Vertebrate Brain Collection at the Jardin des Plantes in Paris. It was fixed in 1931, and scanned in January 2013 on a 3 Tesla MRI scanner at the Institut du Cerveau et de la Moelle.
