Tuesday, December 4, 2018

NIPS is NeurIPS



It's a big day for regeneration, for non-neural cognition and bioelectric mechanisms. The lowly flatworm has had its day, at NeurIPS 2018 up Montreal way.





Saturday, November 24, 2018

Detecting glaucoma from raw OCT with a deep learning framework

"A team of scientists from IBM and New York University is looking at new ways AI could be used to help ophthalmologists and optometrists further utilize eye images, and potentially help to speed the process for detecting glaucoma in images. In a recent paper, they detail a new deep learning framework that detects glaucoma directly from raw optical coherence tomographic (OCT) imaging."

"Logistic regression was found to be the best performing classical machine learning technique with an AUC* of 0.89. In direct comparison, the deep learning approach achieved AUC of 0.94 with the additional advantage of providing insight into which regions of an OCT volume are important for glaucoma detection."

Read more at: https://phys.org/news/2018-10-deep-glaucoma.html#jCp
Also https://arxiv.org/abs/1807.04855v1
* "Area under the ROC Curve."

Friday, November 16, 2018

GPUs speed computation

The Science for Life Lab uses GROMACS on NVIDIA GPUs to accelerate drug design. The research group is studying the mechanisms behind various molecular phenomena that occur at human cellular membranes. GROMACS is a molecular dynamics application designed to simulate Newtonian equations of motion for systems with hundreds to millions of particles. The researchers write: "The highly iterative nature of fitting the parameters of the kinetic models used to simulate the electrical current curves, and running compute-heavy simulations for each, consumes both time and resources." Slower simulations mean fewer iterations.
Adding GPU acceleration provides a significant performance boost.
Read more. Shown at left: voltage-sensing protein domain.
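
GROMACS itself is heavily optimized C++ and CUDA, but the core idea, integrating Newton's equations of motion for many interacting particles, can be sketched in a few lines of Python. The toy velocity-Verlet integrator below is illustrative only; it bears no relation to GROMACS internals or to the lab's kinetic models.

import numpy as np

def velocity_verlet(pos, vel, force_fn, mass=1.0, dt=1e-3, steps=1000):
    # Toy velocity-Verlet integration of Newton's equations of motion.
    acc = force_fn(pos) / mass
    for _ in range(steps):
        pos = pos + vel * dt + 0.5 * acc * dt**2   # advance positions
        new_acc = force_fn(pos) / mass             # forces at the new positions
        vel = vel + 0.5 * (acc + new_acc) * dt     # advance velocities
        acc = new_acc
    return pos, vel

# Toy system: 100 particles in a harmonic well (real MD uses bonded and nonbonded forces).
positions = np.random.rand(100, 3)
velocities = np.zeros((100, 3))
positions, velocities = velocity_verlet(positions, velocities, lambda x: -10.0 * x)

Every step of a loop like this touches every particle, which is why offloading the force calculations to GPUs pays off so directly.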

Thursday, November 8, 2018

Platform for Terror

Sunday, September 30, 2018

They don't call it the Web for nothing

Was remembering when the Web first caught on: there have been a lot of changes in system and data architecture since then. One thing I remember from back then is people saying, "Yeah, it is pretty cool, but, you know, it is stateless." As most of what I heard on this issue came from enterprise software vendors, with all the bias that could entail, I should have taken what I was told with a grain of salt. The first big problem these folks saw with the Web was its statelessness, which made it far different from the synchronously connected clients and servers (at that time, Java servers) they were used to. Wrote this up for a podcast page related to a podcast ...

 Podcast Page
https://itknowledgeexchange.techtarget.com/talking-data/web-what-have-you-wrought-on-strata-microservices-and-more/
Podcast https://cdn.ttgtmedia.com/Editorial/2016/PodcastTechTarget/Talking_Data_Podcast_092418_withmusic.mp3
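
To make the statelessness point concrete: each HTTP request stands alone, so any continuity between calls has to be carried by the client, typically as a cookie or token. A minimal sketch, assuming Python's requests library and a purely hypothetical endpoint and token:

import requests

# Each request is independent; the server keeps no connection-level memory of the client.
# Continuity is simulated by resending the same token with every call.
# The URL and token below are hypothetical, for illustration only.
token = "abc123"
headers = {"Authorization": f"Bearer {token}"}

r1 = requests.get("https://api.example.com/cart", headers=headers)
r2 = requests.post("https://api.example.com/cart/items",
                   json={"sku": "1234"}, headers=headers)
# Neither call depends on an open, stateful connection, unlike the synchronously
# connected client/server systems the enterprise vendors were used to.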

Friday, September 28, 2018

Name that tune, Now Playing!



A recent note on the Google AI blog discusses the company's use of a deep neural network for music recognition on mobile devices. As it brings extreme-scale noodling (convolution) to bandwidth-limited devices (smartphones), it could be a breakthrough on par with MPEG and JPEG, which dramatically transformed media distribution beginning in the 1990s. It's known as Now Playing, and it uses a sequence of embeddings, running the audio it hears against its network to recognize the song while conserving energy on the device. Each embedding has 96 to 128 dimensions. The matching threshold is raised for obscure songs – which is the town where I live.

I guess when you look at what Google has done with Search, it shouldn't be that surprising – but the idea that so much of the work occurs on the Thing (the device) is pretty astounding. I asked it "What's that song?" and it got it right. Slam dunk. "Ride Your Pony" by Lee Dorsey. Now, Shoot! Shoot! Shoot! Shoot! – Jack Vaughan
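
The on-device matching step can be pictured as nearest-neighbor search over fingerprint embeddings. Below is a minimal sketch, assuming precomputed 96-dimensional embeddings and a plain cosine-similarity threshold; Google has not published Now Playing's indexing or scoring in this form.

import numpy as np

def match_song(query_embs, song_db, threshold=0.85):
    # query_embs: (n_segments, dim) embeddings computed from the microphone audio.
    # song_db:    dict of song title -> (n_segments, dim) reference embeddings.
    # threshold:  minimum score to accept a match (in spirit, raised for obscure songs).
    def normalize(x):
        return x / np.linalg.norm(x, axis=-1, keepdims=True)

    q = normalize(query_embs)
    best_title, best_score = None, -1.0
    for title, ref in song_db.items():
        r = normalize(ref)
        score = np.max(q @ r.T, axis=1).mean()   # best cosine match per query segment
        if score > best_score:
            best_title, best_score = title, score
    return (best_title, best_score) if best_score >= threshold else (None, best_score)

# Toy usage with random 96-dimensional embeddings.
db = {"Ride Your Pony": np.random.rand(20, 96)}
print(match_song(np.random.rand(5, 96), db))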

RELATED 


Speaking of Name That Tune – why not a little vignette from the time when Humans Walked the Earth?