Thursday, March 28, 2019

Julia language

Haven’t been to an MIT open lecture for a while. Recently took in one that concerned Julia, an open source programming language with interesting characteristics.

The session was led by MIT math prof Alan Edelman. He said the key to the language was its support of composable abstractions.

An MIT News report has it that: “Julia allows researchers to write high-level code in an intuitive syntax and produce code with the speed of production programming languages,” according to a statement from the selection committee. “Julia has been widely adopted by the scientific computing community for application areas that include astronomy, economics, deep learning, energy optimization, and medicine. In particular, the Federal Aviation Administration has chosen Julia as the language for the next-generation airborne collision avoidance system.”

The language is built to work easily with other programming languages, so you can sew things together. I take it that Julia has ties to Jupyter, Python and R, and like them finds use in science. Prof. Edelman contrasted Julia's speed with that of Python.

In deep neural networks, working through gradients is essentially a linear algebra problem these days, Edelman said. Julia can do this quickly (it's good at backprop), he indicated. He also saw it as useful in addressing the niggling problem of reproducibility in scientific experiments that rely on computing.

Here are some bullet points on the language from Wikipedia:

* Multiple dispatch: providing ability to define function behavior across many combinations of argument types
* Dynamic type system: types for documentation, optimization, and dispatch
* Good performance, approaching that of statically typed languages like C
* A built-in package manager
* Lisp-like macros and other metaprogramming facilities
* Call Python functions: use the PyCall package
* Call C functions directly: no wrappers or special APIs
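The first bullet is the one Edelman kept returning to. Since this post keeps contrasting Julia with Python, here is a rough Python sketch of what "multiple dispatch" means: the implementation chosen depends on the runtime types of all the arguments, not just the first. The registry and helper names here are made up for illustration; Julia builds this into the language itself.

```python
# A toy multiple-dispatch registry (illustrative only): implementations are
# keyed on the types of ALL arguments.
_registry = {}

def dispatch(*types):
    """Register an implementation for a specific combination of argument types."""
    def wrap(fn):
        _registry[(fn.__name__, types)] = fn
        return fn
    return wrap

def call(name, *args):
    """Look up the implementation matching the types of every argument."""
    fn = _registry[(name, tuple(type(a) for a in args))]
    return fn(*args)

@dispatch(int, int)
def combine(a, b):
    return a + b

@dispatch(str, str)
def combine(a, b):
    return a + " " + b

print(call("combine", 2, 3))           # -> 5
print(call("combine", "hello", "ho"))  # -> hello ho
```

In Julia, both `combine` definitions would simply be methods of one generic function, and the compiler specializes fast machine code for each type combination.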

Also from Wikipedia: Julia has attracted some high-profile clients, from investment manager BlackRock, which uses it for time-series analytics, to the British insurer Aviva, which uses it for risk calculations. In 2015, the Federal Reserve Bank of New York used Julia to make models of the US economy, noting that the language made model estimation "about 10 times faster" than its previous MATLAB implementation. 

Edelman more or less touts superior benchmark numbers for Julia versus NumPy. Google has worked with Julia on TPUs and machine learning [see "Automatic Full Compilation of Julia Programs and ML Models to Cloud TPUs"].

Its magic, he says, is multiple dispatch. Python does single dispatch, on the first argument. That's one of the biggies. (Someone in the audience sees a predecessor in Forth. There is nothing new in computer science, Edelman conjectures. Early practitioners just didn't see its application to use cases like the ones we see here, he infers.) Also important is type stability. What are composable abstractions? I don’t know. J. Vaughan

Related
http://calendar.mit.edu/event/julia_programming_-_humans_compose_when_software_does#.XJ1julVKiM9
http://news.mit.edu/2018/julia-language-co-creators-win-james-wilkinson-prize-numerical-software-1226
https://en.wikipedia.org/wiki/Julia_(programming_language)
https://www.nature.com/articles/s41562-016-0021

Saturday, February 2, 2019

Data.gov shutdown


The Data.gov shutdown shows that, as open data can be turned off, data professionals may need to consider alternative sources for the kinds of data the government offers.

It occurred as a result of the larger partial government shutdown that began in December 2018 and proceeded to surpass any previous shutdown in length.

Data.gov, an Open Government initiative that began during the Obama administration, is on hold for now. As of last week, site visitors were greeted with a message: "Due to a lapse in government funding, all Data.gov websites will be unavailable until further notice." Read more.

Sunday, January 6, 2019

Up Lyft Story

Tuesday, December 25, 2018

For the unforeseeable future we have no paradigm


James Burke came up in conversation the other day. You know, the British science reporter who hosted the late 1970s Connections TV series?

On the show, he globe-hopped from one scene to another, always wearing the same white leisure suit, weaving a tale of technological invention that would span disparate events - showing, for example, how the Jacquard loom or the Napoleonic semaphore system led to the mainframe or the fax machine.

It's hard to pick up a popular science book these days that doesn't owe something to Connections. Burke of Connections had cosmic charisma - in his hands, Everything is connected to Everything. You'll hear that again.

Today I picked up Connections (the book that accompanied the series), looking, this being Christmas, for illuminations. Not just the connections - but how the connections are connected. Because it's been a search for me over many years - I've stumbled and bumbled, but I have never been knocked back on my heels more than this year, 2018.

And Burke delivered: It's not just about the connected, but also about the unconnected. How things happen: "The triggering factor is more often than not operating in an area entirely unconnected with the situation which is about to undergo change," he writes. [Connections, p.289]

This seems to me today pertinent. Because the year just past was one where some among my interests (horse race handicapping and predictive analytics; Facebook, feedback, news and agitprop, and the mystic history of technology) seemed to defy understanding.

You see, you look close, and you analyze, but there is a cue ball just outside your frame of reference that  will break up the balls. It is a dose of nature - dose of reality - a dose of chaos. In horse racing it can be quite visible when a favorite bobbles at the start, or a hefty horse takes a wide turn and thus impels another horse a significant number of paths (and, ultimately, lengths) wider.  We (journalists, handicappers, stock market analysts) generally predict by looking in the rear view mirror, because we don't have a future-ready time machine.

I had the good fortune to cover events that Burke keynoted. There was OOPSLA in Tampa in 2001 (less than a month after the 9/11 terror attacks). And there was O'Reilly Strata West in Santa Clara (?) in about 2013 (?), at which the O'Reilly folks kindly set up a small press conference with Burke for media after his keynote.

Burke is adamant that inventors do not understand all the ramifications their inventions will have in society in practice. One thing I tried to press him on was the role the social structure (in our case the capitalist system) plays in technology's development. He'd just gotten off a transatlantic flight and delivered a startling keynote before sitting down with the press (asked if he would like some coffee, he said that in his time zone it was time for wine), and Jack's questions did not so resonate.

My notes thereof are a bit of a jumble... Everything is connected to everything. He said of Descartes… and his fledgling scientific method... that he "froze the world" with reductionism - which may have value but which, as forecasters, pundits, and handicappers have found, "doesn’t tell you how all the parts work together."

"For the future we have no paradigm."

Screaming from the conversation with Burke was a quote, actually from Mark Twain:

“In the real world, the right thing never happens in the right place and the right time. It is the job of journalists and historians to make it appear that it has.”

Tuesday, December 4, 2018

NIPS is NeurIPS



It's a big day for regeneration, for non-neural cognition and bioelectric mechanisms. The lowly flatworm has had its day. At NeurIPS 2018, up Montreal way.

Saturday, November 24, 2018

Detecting glaucoma from raw OCT with deep learning framework

''A team of scientists from IBM and New York University is looking at new ways AI could be used to help ophthalmologists and optometrists further utilize eye images, and potentially help to speed the process for detecting glaucoma in images. In a recent paper, they detail a new deep learning framework that detects glaucoma directly from raw optical coherence tomographic (OCT) imaging.''

''Logistic regression was found to be the best performing classical machine learning technique with an AUC* of 0.89. In direct comparison, the deep learning approach achieved AUC of 0.94 with the additional advantage of providing insight into which regions of an OCT volume are important for glaucoma detection.''
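The AUC numbers being compared (0.89 vs. 0.94) have a handy probabilistic reading: AUC is the chance that a randomly chosen positive case scores higher than a randomly chosen negative one. A minimal pure-Python sketch on toy data (not the paper's data) makes the metric concrete:

```python
# Compute AUC via its rank interpretation: the fraction of positive/negative
# pairs where the positive example gets the higher score (ties count half).
# Labels and scores below are invented toy values.

def auc(labels, scores):
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.2]
print(auc(labels, scores))  # -> 0.8333... (5 of 6 pos/neg pairs ranked correctly)
```

By this reading, the deep learning model's 0.94 means it orders a random glaucoma/non-glaucoma pair correctly 94% of the time.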

Read more at: https://phys.org/news/2018-10-deep-glaucoma.html#jCp
Also https://arxiv.org/abs/1807.04855v1
* "Area under the ROC Curve."

Friday, November 16, 2018

GPUs speed computation

The Science for Life Lab uses GROMACS on NVIDIA GPUs to accelerate drug design. The research group is studying the mechanisms behind various molecular phenomena that occur at human cellular membranes. GROMACS is a molecular dynamics application designed to simulate Newtonian equations of motion for systems with hundreds to millions of particles. The researchers write: "The highly iterative nature of fitting the parameters of the kinetic models used to simulate the electrical current curves, and running compute-heavy simulations for each, consumes both time and resources. Slower simulations mean fewer iterations."
Adding GPU acceleration provides a significant performance boost.
Read more. Shown at left: voltage-sensing protein domain.
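"Simulating Newtonian equations of motion" at this scale means numerically integrating F = ma for every particle, millions of times per run. A toy illustration of the underlying idea - velocity-Verlet integration of a single particle on a spring - is sketched below; it is nothing like production MD code, just the same integration scheme in miniature.

```python
# Velocity-Verlet integration of one particle with a spring force F = -k*x.
# This is the kind of time-stepping an MD engine performs, reduced to one
# degree of freedom with made-up parameters.
import math

def velocity_verlet(x, v, k=1.0, m=1.0, dt=0.01, steps=1000):
    a = -k * x / m
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt   # position update
        a_new = -k * x / m                # force at the new position
        v += 0.5 * (a + a_new) * dt       # velocity update (averaged forces)
        a = a_new
    return x, v

# With k = m = 1 the period is 2*pi; after one full period the particle
# should return very near its start (x = 1, v = 0).
x, v = velocity_verlet(1.0, 0.0, dt=2 * math.pi / 1000)
print(x, v)
```

The GPU win comes from doing this same step for millions of interacting particles in parallel, where the force computation dominates.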

Thursday, November 8, 2018

Platform for Terror