Forum: Possible augmented reality tools for bioinformatics and current problems
5
gravatar for Peter Leontev
23 months ago by
Russia
Peter Leontev90 wrote:

Hi.

I was thinking about applying augmented reality tools to help with understanding bioinformatics problems. Although it may look like a useless thing to build, could it actually be helpful for someone to have a tool that solves particular 3D-visualization problems using augmented reality?

If so, could you give me some examples of such applications? I am considering all the possibilities, because to me it looks like a good question to investigate.


It may be obvious from the above, but I would like to take a similar topic for my MSc thesis, so all suggestions are really welcome!

Thank you!

ADD COMMENTlink modified 23 months ago by nishanth.merwin0 • written 23 months ago by Peter Leontev90

One of the more common 3-D plots people seem to use in bioinformatics is PCA. I don't know if one data visualization is complicated enough to count, but just saying.

ADD REPLYlink written 23 months ago by Madelaine Gogol4.8k

Sorry, I don't think I fully understand your statement. Do you mean it would be much better to have a different/more intuitive representation of PCA?

ADD REPLYlink modified 23 months ago • written 23 months ago by Peter Leontev90

I guess I just meant it's a 3-D plot that people are making a lot. But not often actually viewing it in 3-D -- just viewing it in 2-D.
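For context, getting the three PC coordinates you would feed to any 3D scatter is only a few lines. A minimal sketch with hypothetical random data standing in for a samples-by-features matrix (PCA hand-rolled via SVD, so nothing beyond NumPy is needed):

```python
# Minimal PCA-to-3D-coordinates sketch (hypothetical stand-in data).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))      # 100 samples x 50 features (fake data)
Xc = X - X.mean(axis=0)             # centre each feature

# Right singular vectors are the principal axes; project onto the first three.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
coords = Xc @ Vt[:3].T              # the (x, y, z) you'd feed to a 3D scatter

# Fraction of total variance captured by PC1..PC3
explained = (S[:3] ** 2) / (S ** 2).sum()
```

With random data the first three components explain little variance, which is exactly the case where a 3D (or AR) view adds nothing over three flat 2D panels.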

ADD REPLYlink written 23 months ago by Madelaine Gogol4.8k

It's clear now, thank you!

ADD REPLYlink written 23 months ago by Peter Leontev90
10
gravatar for John
23 months ago by
John11k
Germany
John11k wrote:

I have an Oculus Rift DK2, and I wrote some software to convert ChIP-Seq peaks/tracks to Minecraft maps and then walk around them in 3D (which was already possible). I learnt a few things:

1) 3D on the Oculus is really bad. If anyone tells you otherwise, they are just too proud to admit they paid a lot of money for a phone screen without the phone. Maybe the newer versions are better, but the DK2 was certainly awful. You can see the black lines between the pixels (the screen-door effect), and the depth of the Z axis is about half of anything you might see in a 3D movie. It's a new experience, but not a particularly good one. Also, it will make you throw up so much you'll think you're 21 again - particularly if you develop software on it.

2) These days it only really plays nice on Windows. Last time I checked, they had dropped driver support for everything else.

3) Interacting with 3D objects with a mouse and keyboard is about as much fun as interacting with a regular computer with just a scroll wheel and keyboard. Until 3D inputs like the STEM Sixense can actually be bought (http://sixense.com), interacting with anything in 3D is like having to thread cotton through the eye of a needle, whilst the needle is constantly rotating because it inexplicably hates you.

4) There is absolutely no benefit to looking at ChIP-Seq data in Minecraft. There might be something in using biological data to create maps in games, but 2D data should stay in 2D. Even if you have two variables of data and a third time variable, you are better off keeping it as a 2D movie than a 3D landscape. In applications like Hi-C I think there's a lot of potential, particularly if you then layer on other signals like RNA/ChIP intensity as colour or volume, but on the whole biological data does not fit 3D all that well, unfortunately. I could give a whole lecture on why, but it boils down to two things: 1) Humans don't see in 3D, they see in double-2D, and you have to actively distort data to make it render correctly in the human brain. 2) The brain doesn't care much about the third dimension. It cares far more about colour, shape, and speed. If you want to 'see' the sense in the data, you are better off exploiting these features first. Only for inherently 3D data is 3D really worth the time.

Also, try to stay away from terms like "augmented reality" and other such buzzwords. There's a lot of undeserved hype over 3D, and really, terms like "augmented reality" only make sense if you are a PhD student who spends so much time looking at data that you can justifiably redefine your definition of reality ;)

ADD COMMENTlink written 23 months ago by John11k
6

Can you take some screenshots and post them here? A lot of people, including myself, would get quite a kick out of seeing ChIP-Seq data in Minecraft - even if it does not seem useful ...

ADD REPLYlink written 23 months ago by Istvan Albert ♦♦ 73k
1

Any chance of making the MC world save available... e.g. dropbox?

ADD REPLYlink written 22 months ago by Ian4.9k

Sorry, I've been working frantically on another project this past month (https://github.com/JohnLonginotto/ACGTrie/) and haven't had even a minute to give to other projects :( [particularly not Minecraft-related ones, hehe.]

The tool that converts ChIP-Seq -> Minecraft maps works off signal data stored in a NumPy array (or a standard C struct), not off BigWigs or any other common bioinfo format. The only reason I made the NumPy -> MCA converter was to show off how fast analysis of signal data in NumPy could be - particularly since NumPy arrays are smaller than BigWig files, etc., so you can cram a lot of them into memory without compression. Like hundreds. I started off with a converter from NumPy to MCA, then worked on a real-time adaptor so you don't need the MCA files at all, just the NumPy arrays in memory - 'mine' data in real time, etc. (did I mention it was totally impractical, though? :P) But I never finished the adaptor - not without weird bugs like pigs randomly spawning - so it's in a state of un-done-ness.
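For anyone wondering what the NumPy side of such a converter involves, here is a hypothetical sketch (not John's actual code) of the core step: mean-binning a per-base coverage array and rescaling the means to block heights:

```python
# Hypothetical sketch: turn a 1D per-base signal array into per-bin block
# heights - the kind of values a NumPy -> Minecraft map converter would need.
import numpy as np

def signal_to_heights(signal, bin_size=100, max_height=64):
    """Mean-bin a 1D signal and rescale the means to integer block heights."""
    n = (len(signal) // bin_size) * bin_size   # drop the ragged tail
    binned = signal[:n].reshape(-1, bin_size).mean(axis=1)
    if binned.max() > 0:
        binned = binned / binned.max() * max_height
    return binned.astype(int)

rng = np.random.default_rng(1)
track = rng.poisson(5, size=10_000).astype(float)  # fake coverage track
heights = signal_to_heights(track)                 # 100 bins, heights 0..64
```

The `reshape(-1, bin_size).mean(axis=1)` idiom is why NumPy is so fast here: the whole binning pass is a single vectorised operation with no Python loop.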

The unnamed tool that converts BAM files to these NumPy arrays is also on ice until ACGTrie is finished. If you are really good at Python I can put what I've got up on GitHub and you can decide if you want to finish it off; otherwise you would have to wait 3-4 months until I get around to it :) If you (or anyone else) want to help out on that project, please talk to me on Freenode in #acgt (http://ac.gt/chat)

ADD REPLYlink modified 22 months ago • written 22 months ago by John11k
4

After reading your comment on Minecraft, this piece lured me over: 

ADD REPLYlink written 23 months ago by Mary11k
1

Wow! Really helpful. Thank you!

BTW, one more question - do you think it is possible to apply such tools for educational purposes? I've found only one paper about it (though I didn't google the topic deeply enough...) - "An augmented reality tool to aid understanding of protein loop configuration". It seems like the right thing to do, but unfortunately I've not found more info about it...

ADD REPLYlink written 23 months ago by Peter Leontev90
1

OMG, bring this into the institute tomorrow!

ADD REPLYlink written 23 months ago by Devon Ryan70k
4

I don't know if there is a Windows PC anywhere in our institute :P
I will ask Human Resources or Accounting - maybe I can borrow a laptop.... "for Science"
But screenshots at home I can definitely do. Actually, I can do one better: since there is a Minecraft renderer for MCA files written in JavaScript/WebGL (http://voxeljs.com/), I can host a region file and we can all view it in our browsers in 2D.
But be prepared to kill a lot of pigs. I haven't worked on the code in over 8 months, and I can't remember if I fixed the bug where, if the final bin in a 16x16 "chunk" was not divisible by two (which happens half the time), the extra data was appended to the block ID, which usually spawned a pig. I called it a bacon-overflow.
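Incidentally, the underlying pitfall is easy to show: Minecraft's region format packs 4-bit values two per byte, so an odd-length run has to be padded or the last value bleeds into the adjacent field (here, a block ID). A hypothetical guard, with the low-nibble-first ordering assumed rather than checked against the Anvil spec:

```python
# Hypothetical sketch of the nibble-packing pitfall: 4-bit values go two per
# byte, so odd-length input must be padded or the last value overflows into
# whatever field comes next (the "bacon-overflow").
def pack_nibbles(values):
    """Pack 4-bit values two per byte, padding odd lengths with zero."""
    if len(values) % 2:                 # the guard that prevents pig-spawning
        values = values + [0]
    out = bytearray()
    for lo, hi in zip(values[0::2], values[1::2]):
        out.append((hi << 4) | lo)      # low nibble first (assumed ordering)
    return bytes(out)

packed = pack_nibbles([1, 2, 3])        # odd length: padded, not overflowed
```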

ADD REPLYlink written 23 months ago by John11k

"Augmented reality" to me is more along the lines of the Microsoft Hololens than Oculus. Although admittedly I have no idea what the visual quality of that will be like, the PR material looks pretty impressive. Rather than a virtual environment you just have artificial data/images overlayed onto the real world.

I could imagine a situation like Istvan described below - an interactive 3D protein structure, but where you see a figure in a paper, the system recognises it (via a QR code or similar), loads the model, and shows it in front of you. You could then either leave it there as a little pop-up figure, or move it to hover over your desk, scale it up, look around it, and so on... Although that's rather dependent on a decent interactive control system (and an unobtrusive headset you don't mind wearing all the time).

I'll admit that I think augmented and virtual reality devices are really cool, but I hope they don't become too common as a means of displaying data. You know how hard those fake-3D Excel plots are to understand? Imagine having a fully 3D plot but no way of viewing it in 3D. Augmented data would be worse than useless to people without an augmented reality device!

I would say to consider AR as virtual 3D printing. If your application would benefit from producing physical models of something, but that would be too expensive or time consuming, then AR is perfect. Maybe it's more useful for people like architects than bioinformaticians!

ADD REPLYlink written 23 months ago by 13en80
6
gravatar for Daniel Swan
23 months ago by
Daniel Swan13k
Earlham Institute, Norwich, UK
Daniel Swan13k wrote:

You've clearly never seen one of these sitting abandoned, at great expense, littering a university campus: https://en.wikipedia.org/wiki/Cave_automatic_virtual_environment

I saw protein docking simulations running on one of those about 10 years ago, although it mostly got used for playing Quake 3.

 

ADD COMMENTlink written 23 months ago by Daniel Swan13k

Yeah, (un)fortunately I've not seen this device in real life but now I know what I should look at :)

ADD REPLYlink written 23 months ago by Peter Leontev90
4
gravatar for Istvan Albert
23 months ago by
Istvan Albert ♦♦ 73k
University Park, USA
Istvan Albert ♦♦ 73k wrote:

I would say 3D protein structure investigations would be a very interesting application.

Take a 3D structure, walk around it, grab it, push and pull on it, try to unfold it, then let go and see how it folds back.

I always have a hard time with 3D visualizers in that it is difficult to grab exactly what you want and move it where you want it to be.

 

ADD COMMENTlink modified 23 months ago • written 23 months ago by Istvan Albert ♦♦ 73k
1

I've heard of special monitors and 3D glasses that people used to use for this in the old days, but I don't have firsthand experience with them. It sounds cool, anyway.

ADD REPLYlink written 23 months ago by Madelaine Gogol4.8k

That's what I was told by an older postdoc too. He basically said that it was dropped because it did not really add anything and just complicated things (most things of interest are inside a molecule, and it is hard to keep track of where you are when you have to imagine your own position within the molecule and ignore the amino acids directly in front of you). Also, I found AR impractical for workflows where you need to access many menus etc. to adjust what you see.

ADD REPLYlink written 23 months ago by Aerval230

Thank you! Could you please give me some examples of 3D visualization programs that you've tried and found non-intuitive?

ADD REPLYlink written 23 months ago by Peter Leontev90

I've personally used UCSF Chimera for structure visualization. It's about as good as anything else and easy enough to script, but it still has all the issues that Istvan mentioned (getting anything to rotate around the right axis is always a royal pain). I used two monitors with 3D glasses once back in college, but even that was awkward (that was 15 years ago, though, so 3D could be MUCH better these days).
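To illustrate the scripting side: Chimera can be driven by a plain command file. This is an untested sketch from memory, so treat the exact command names and arguments as assumptions rather than documented usage:

```
# hypothetical spin.cmd -- run with: chimera spin.cmd
open 1ubq        # fetch ubiquitin (PDB 1UBQ)
ribbon           # switch to a ribbon representation
turn y 3 120     # rotate 3 degrees per frame about the y axis, for 120 frames
```

Even scripted, getting the rotation about the axis you actually want tends to be trial and error with the centre of rotation, which is the pain point mentioned above.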

ADD REPLYlink written 23 months ago by Devon Ryan70k

Thank you, that is very helpful.

ADD REPLYlink written 23 months ago by Peter Leontev90
3
gravatar for Jean-Karim Heriche
23 months ago by
EMBL Heidelberg, Germany
Jean-Karim Heriche13k wrote:

We're building an interactive 3D+time atlas of a dividing human cell (see a screencast of our HTML5/WebGL prototype here (QuickTime movie); note that this is NOT a simulation but is based on real data). At some point someone may be interested in making this a real 3D interactive environment. However, I think people are generally not ready for real 3D yet. For one thing, there are no clear conventions on how to represent things and interact with them in virtual 3D. Then there's the fact that not much truly 3D data is actually being produced in biology (unless one counts all the multidimensional -omics data). As mentioned before, 3D visualization of protein structures has been tried (see here for example), but I don't think it has become the norm, maybe because it's still too cumbersome to use, there's no real benefit to it, or it's simply not good enough yet. This should not deter you from trying, though, but choosing a compelling use case would be key to success.

ADD COMMENTlink written 23 months ago by Jean-Karim Heriche13k

That is definitely useful, thank you!
 

ADD REPLYlink modified 23 months ago • written 23 months ago by Peter Leontev90
2
gravatar for Mary
23 months ago by
Mary11k
Boston MA area
Mary11k wrote:

I am mildly surprised not to see any mention of the 3D virtual worm work: http://blog.openhelix.eu/?p=13872

And just the other day I was watching this video that I thought was fascinating:

Like the cell division stuff, I think there are neat things to be tackled on the cell + developmental front.

Another area that I think is closer is 3D tissue structure work, like this printed artery: http://advances.sciencemag.org/content/1/9/e1500758

I can imagine looking at how mutations, different amounts of structural proteins, or different substrates impact any of these things - that would be neat to watch in new ways.

 

ADD COMMENTlink written 23 months ago by Mary11k

Thank you very much for all of this. I am very glad that this question is still attracting such different answers!

ADD REPLYlink written 23 months ago by Peter Leontev90
0
gravatar for nishanth.merwin
23 months ago by
Canada
nishanth.merwin0 wrote:

Hi, I'm not sure whether this answers your question exactly, but I had a similar thought a few weeks ago. There's actually an app called Augmented Reality that lets you put any 3D model into the world, viewable on a phone or tablet as long as you have an appropriate surface to track it onto.

I wrote a blog post about my attempts with it using R and data from the 1000 genomes project. 

https://medium.com/@nishanthmerwin/plotting-in-3d-with-r-and-augment-566c73a37002

 

ADD COMMENTlink written 23 months ago by nishanth.merwin0
1

Thank you for the answer!

ADD REPLYlink written 23 months ago by Peter Leontev90
Powered by Biostar version 2.3.0