Monday, February 19, 2018

Cybernetic Sutra

I'd had an opportunity in college days to study comparative world press under professor Lawrence Martin Bittman, who introduced BU journalism students to the world of disinformation, a discipline he'd learned firsthand in the 1960s, before his defection to the West, as a disinformation officer in Czechoslovak intelligence. We got a view into the information wars within the Cold War. This gave me a more nuanced view of the news than I might otherwise have had. Here I am going to make a jump.

I'd begun a life-long dance with the news. 

I'd also begun a life-long study of cybernetics. 

And lately the two interests have begun oddly to blend. 

It was all on the back of Really Simple Syndication (RSS) and its ability to feed humongous quantities of online content in computer-ready form. It made me a publisher, as able as Gutenberg, and my brother a publisher, and my brother-in-law a publisher, and so on ...
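It is worth pausing on why RSS could do that. A feed is just XML in a fixed vocabulary, so a few lines of code turn it into data anyone can republish. A minimal sketch using only the Python standard library; the feed contents here are invented:

```python
import xml.etree.ElementTree as ET

# A toy RSS 2.0 feed, inlined so the sketch is self-contained.
# Real feeds are fetched over HTTP, but the parsing is the same.
FEED = """<rss version="2.0"><channel>
  <title>DataDataData</title>
  <item><title>Cybernetic Sutra</title>
        <link>http://example.com/sutra</link></item>
</channel></rss>"""

channel = ET.fromstring(FEED).find("channel")
for item in channel.iter("item"):
    # Each <item> is one "computer-ready" story: title plus link.
    print(item.findtext("title"), "->", item.findtext("link"))
```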

Cybernetics was a promising field of science that seemed ultimately to fizzle. It arose after World War II, led by M.I.T.'s Norbert Wiener and others, as, in Wiener's words, "the scientific study of control and communication in the animal and the machine."

It burst as a movement upon the mass consciousness at a time when fear of technology and the dehumanization of science were growing concerns. As the shroud of wartime secrecy dispersed, Wiener in 1948 penned Cybernetics, which was followed by a popularization, The Human Use of Human Beings.

Control, communication, feedback, regulation. The field took its name from the Greek kybernetes, the steersman. Wiener's trail ran from Brownian motion to wartime artillery tables, and the field's orbit took in the thermostat, the autopilot, the differential analyzer, radar, neural networks, and backpropagation.
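Feedback is the kernel of the thing. A toy sketch of the negative-feedback loop a thermostat embodies: measure, compare to a set point, correct. The constants below are purely illustrative, not drawn from any real controller:

```python
# Negative feedback in miniature: the controller repeatedly
# measures the deviation from the goal and corrects toward it.
set_point = 20.0   # desired temperature
temp = 15.0        # current temperature
gain = 0.5         # how aggressively the controller corrects

for step in range(10):
    error = set_point - temp     # communication: sense the deviation
    temp += gain * error         # control: act to shrink it
    print(f"step {step}: temp = {temp:.2f}")
```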

Cybernetics flamed out in a few years, though it made a peculiar reentry in the era of the World Wide Web. Flamed out, but, somewhat oddly, it continued as an operational style in the USSR for quite some time more. Control, communication, feedback, regulation played out there somewhat differently.

A proposal for a Soviet Institute of Cybernetics included "the subjects of logic, control, statistics, information theory, semiotics, machine translation, economics, game theory, biology, and computer programming."[1] That style came back to mate with cybernetics on the web, in the combination of agitprop and social media known as Russian meddling, which arguably, if slightly, tipped the scales of American politics.

[1] http://web.mit.edu/slava/homepage/reviews/review-control.pdf

Sunday, February 4, 2018

Pixie dust of technology

Back in the day, the Obama campaign got good press for its efforts to employ technology and then-new social media platforms to organize a large political base. Part of that effort was Dipayan Ghosh, who served in the Obama White House. Like others, Ghosh is having second, or deeper, thoughts on the subject. In a report on "#DigitalDeceit," he and a coauthor ruminate on the Internet giants' (Google's and Facebook's) alignment with advertising motivations, and the resultant penchant for misinformation.

Comment: Technology always exists within a larger context, and will eventually be subsumed by it. What it will do is cast a haze of pixie dust over ethos, established mores, institutional memory. The haze gradually recedes. - Jack Vaughan

Sunday, January 21, 2018

AI drive spawns new takes on chip design


[Cartoon caption: "As soon as we solve machine learning we will fix printer."]
It has been interesting to see a re-emergence of interest in new chip architectures. Just when you think it's all been done and there's nothing new: Bam!

The driver these days is A.I., but more particularly the machine learning aspect of AI. GPUs jumped out of the gamer console and into the Google and Facebook data centers. But there were more hardware tricks to come. The effort is to get around a tableau I have repeatedly cited here: the data scientist sitting there twiddling thumbs while the neural machine slowly does its learning.
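The GPU's appeal is easy to see in miniature: the matrix arithmetic at the heart of neural network training parallelizes well. A rough sketch, assuming PyTorch is installed and a CUDA-capable machine is at hand (timings will vary widely):

```python
import time
import torch

# Time the same large matrix multiply on CPU, then on GPU.
x = torch.randn(4096, 4096)

t0 = time.time()
for _ in range(10):
    x @ x
print("CPU seconds:", time.time() - t0)

if torch.cuda.is_available():
    xg = x.to("cuda")
    torch.cuda.synchronize()          # CUDA calls are asynchronous
    t0 = time.time()
    for _ in range(10):
        xg @ xg
    torch.cuda.synchronize()          # wait for the work to finish
    print("GPU seconds:", time.time() - t0)
```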

I know when I saw that Google had created a custom ASIC for TensorFlow processing, I was taken aback. If new chips are what is needed to succeed in this racket, it will be a rich man's game.

Turns out a slew of startups are on the case. This article by Cade Metz suggests that at least 45 startups are working on chips for AI-type applications such as speech recognition and self-driving cars. It seems the Nvidia GPU that has gotten us to where we are may not be enough going forward. Coprocessors for coprocessors, chips that shuttle data about in I/O roles for GPUs, may be the next frontier.

Metz names a number of AI chip startups: Cerebras, Graphcore, Wave Computing, Mythic, Nervana (now part of Intel). - Jack Vaughan

Related
https://www.nytimes.com/2018/01/14/technology/artificial-intelligence-chip-start-ups.html

Monday, January 8, 2018

What is the risk of AI?

Happy New Year from all of us at the DataDataData blog. Let's start the year out with a look at Artificial Intelligence, or actually a story thereof. That is, "Leave Artificial Intelligence Alone" by Andrew Burt, appearing in last Friday's New York Times Op-Ed section.

Would that people could leave AI alone! You can't pick up a supermarket sales flyer without hearing someone's bit on the subject. As Burt points out, a lot of the discussion is unnecessarily (and unhelpfully) doomy and gloomy. He also points out that AI lacks definition. You can see the effect in much of the criticism, which lashes out with haymakers at a phantom, one that really comprises very many tributary technologies, and quite various ones at that.

Some definition, some narrowing of the problem scope is in order.

If you study the history of consumer data privacy you discover, as Burt reminds us, the Equal Credit Opportunity Act of 1974. Consider it a pathway for data privacy that can still be followed.

Burt also points to the SR 11-7 regulation, which is intended to provide breadcrumbs back to how and why trading models were constructed, so that the risk involved in the automated pits of Wall Street is well understood. As he writes:

Within the United States’ vast framework of laws and regulatory agencies already lie answers to some of the most vexing challenges created by A.I. In the financial sector, for example, the Federal Reserve enforces a regulation called SR 11-7, which addresses the risks created by the complex algorithms used by today’s banks. SR 11-7’s solution to those challenges is called “effective challenge,” which seeks to embed critical analysis into every stage of an algorithm’s life cycle — from thoroughly examining the data used to train the algorithm to explicitly outlining the assumptions underlying the model, and more. While SR 11-7 is among the most detailed attempts at governing the challenges of complex algorithms, it’s also one of the most overlooked.

Burt sees such staged algorithm analysis as a path for understanding AI and machine learning going forward.
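What might "effective challenge" look like in practice? A hypothetical sketch, nothing prescribed by SR 11-7 or by Burt, of a model record that carries its assumptions and its review trail alongside the model itself. All names and fields here are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class ModelRecord:
    """Breadcrumbs for one model: where it came from, what it assumes."""
    name: str
    training_data: str                              # provenance of the training set
    assumptions: list = field(default_factory=list)
    reviews: list = field(default_factory=list)     # independent challenges

    def challenge(self, reviewer: str, finding: str):
        # Log an "effective challenge" event against this model.
        self.reviews.append((reviewer, finding))

record = ModelRecord(
    name="credit-risk-v2",
    training_data="loans_2010_2016.csv",
    assumptions=["defaults are independent across regions"],
)
record.challenge("model-risk-group", "regional correlation understated")
print(record)
```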

It is good to see there may be previous experience that can be tapped when looking at how to handle AI decision making - as opposed to jumping up and down and yelling 'the sky is falling.'

As he says, it is better to distinguish the elements of AI application according to use cases, and look at regulation specifically in verticals - where needed. 

I spoke with Andrew Burt last year as part of my work for SearchDataManagement, linked to here: Machine learning meets Data Governance. - Jack Vaughan


Sunday, December 3, 2017

Paradise Graph Papers


The Paradise Papers files expose the offshore holdings of political leaders and their financiers, as well as household-name companies that slash taxes through transactions conducted in secret. Financial deals of billionaires and celebrities are also revealed in the documents. The 1.4 TB of data – 13.4 million documents – includes information leaked from the trust company Asiaciti and from Appleby, a 100-year-old offshore law firm specializing in tax havens. More to come.
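The "graph" in the headline is the investigative angle (see the related link below): entities, officers, and addresses become nodes, and shared connections become leads. A toy sketch of the idea with invented names, using the networkx library:

```python
import networkx as nx

# Build a tiny ownership graph. In a real leak investigation the
# nodes come from millions of documents; these are made up.
g = nx.Graph()
g.add_edge("Officer A", "Shell Co. 1", role="director")
g.add_edge("Officer A", "Shell Co. 2", role="shareholder")
g.add_edge("Shell Co. 1", "P.O. Box 123", link="registered address")
g.add_edge("Shell Co. 2", "P.O. Box 123", link="registered address")

# A shared address ties otherwise unrelated companies together.
print(list(g.neighbors("P.O. Box 123")))
```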


Related
https://linkurio.us/blog/big-data-technology-fraud-investigations/

Friday, October 13, 2017

Does data make baseball duller?

Let's not talk quality of life and data, let's talk baseball and data. Moneyball was an eye-opener in the rise of big data analytics as a popular meme. And why not? It had Brad Pitt. Well, the movie did. It showed that a guy thinking outside the box could re-imagine the game. The hell with "he looks like a ballplayer"; hello to "can he take a walk?" For a small-market team, a tonic. But now we are seeing a great downside of worshiping at the altar of data: really boring baseball. Removing too many pitchers too soon... Embracing strikeouts... Avoiding ground balls and liners... Focusing on homers... Still, one wonders if some of these moves do augur obvious countermoves for the outside-the-box thinkers of today... In the face of elaborate, boring shifts, why not bunt?

https://www.wsj.com/articles/the-downside-of-baseballs-data-revolutionlong-games-less-action-1507043924

Sunday, September 3, 2017

Forensic analytics

While at Bell Labs in the 1980s, statistician Siddhartha Dalal said, he worked with a team that looked back on the 1986 Challenger space shuttle disaster to find out if the event could have been predicted. It is well known that engineering teams held a tense teleconference the night before the launch to review data that measured risk. Ultimately, a go was ordered, even though Cape Canaveral, Fla., temperatures were much lower than for any previous shuttle flight. A recent article looks at the issues with an eye on how they relate to analytics today.
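Dalal's team famously framed this as a regression problem: model O-ring distress against launch temperature, then extrapolate to the forecast launch-day cold. A sketch of that kind of analysis follows; the figures are illustrative stand-ins, not the actual flight data, which appears in Dalal's published work:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative (made-up) flights: launch temperature in Fahrenheit
# and whether an O-ring incident occurred (1 = incident).
temps = np.array([[53], [57], [63], [66], [70], [72], [75], [78], [81]])
incident = np.array([1, 1, 1, 0, 1, 0, 0, 0, 0])

model = LogisticRegression().fit(temps, incident)

# Extrapolate to a launch far colder than anything in the data,
# as the Challenger morning was. The estimated risk climbs sharply.
print("estimated incident probability at 31F:",
      model.predict_proba([[31]])[0][1])
```

The point of the exercise is less the fitted numbers than the discipline: the question "what does the model say at temperatures we have never flown?" could have been asked the night before.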

http://searchdatamanagement.techtarget.com/opinion/Making-connections-Big-data-algorithms-walk-a-thin-line