A couple of years ago I made a visualization tool for the Google Cardboard that let you explore a brain by looking around. The two big downsides were that you could not walk through the point cloud and that you could not point out to others what you were seeing. This weekend I fixed both by creating a new app: Immersive Points. This web app still shows a point cloud, but now also lets you physically walk through it and use your hands to point at things. Other people can watch the screencast to see what you want to point out.
The app can be found at https://rmeertens.github.io/ImmersivePoints/index.html. It currently includes a frame with points from a brain (taken from the old brain visualization app), two frames from a self-driving car dataset, and a frame from a scan of the Notre-Dame.
In terms of controls, you can currently walk around freely (if your headset supports this). Pressing the select button on your VR device's left-hand controller moves you forward, and the right-hand one moves you backwards. Movement is always in the direction you are looking, which means you can also get a bird's-eye view of a scene if you want.
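The core of this movement scheme is simple: take the camera's gaze direction as a unit vector and step the viewer's position along it (or against it, for backwards). A minimal sketch of that idea, with my own naming and angle conventions rather than the app's actual WebXR code:

```python
import math

def gaze_direction(yaw, pitch):
    """Unit vector for a camera with the given yaw/pitch in radians.

    Convention (an assumption, not necessarily the app's): yaw 0 looks
    down -Z, pitch 0 is level, positive pitch looks up.
    """
    return (
        -math.sin(yaw) * math.cos(pitch),
        math.sin(pitch),
        -math.cos(yaw) * math.cos(pitch),
    )

def step(position, yaw, pitch, speed, forward=True):
    """Move the viewer along (forward) or against (backward) the gaze."""
    dx, dy, dz = gaze_direction(yaw, pitch)
    sign = 1.0 if forward else -1.0
    x, y, z = position
    return (x + sign * speed * dx,
            y + sign * speed * dy,
            z + sign * speed * dz)
```

Because movement follows the gaze, looking straight up (pitch near π/2) and stepping forward raises you above the scene, which is how the bird's-eye view falls out for free.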
In the process of adding a virtual reality option to the visualizer, I also made data loading more efficient (it used to take up to 10 seconds), and the data is now hosted in an S3 bucket. If you have point cloud data you would like to walk around in, you can even upload it!
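I don't describe the app's exact upload format here, but a common way to make point-cloud loading fast is to ship a flat little-endian float32 buffer that the browser can hand straight to a GPU vertex buffer, instead of parsing text. A hypothetical packer along those lines (the field layout is my assumption, not the app's documented format):

```python
import struct

def pack_points(points):
    """Pack (x, y, z, r, g, b) tuples into a flat little-endian
    float32 buffer: 6 floats (24 bytes) per point.

    Hypothetical layout for illustration; the app's real upload
    format may differ.
    """
    buf = bytearray()
    for x, y, z, r, g, b in points:
        buf += struct.pack('<6f', x, y, z, r, g, b)
    return bytes(buf)

def unpack_points(data):
    """Inverse of pack_points: recover the list of 6-float tuples."""
    n = len(data) // 24
    return [struct.unpack_from('<6f', data, i * 24) for i in range(n)]
```

A binary layout like this also keeps files small relative to ASCII point formats, which matters when the data is fetched from an S3 bucket on every page load.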
Currently, some nice examples are:
- The entire Notre-Dame in colour, taken from here
- An aggregated scan from the AEV dataset.
- A semantically segmented scan from the AEV dataset.
- A brain with some clustered information, with data from the Radboud University.