Cryptome reposts a US Department of Transportation Pipeline and Hazardous Materials Safety Administration report on a security breach at an LNG facility in Lynn, MA, USA, which neatly illustrates the human capacity to ignore machines, and indeed everything around them.
The report says:
"A recent breach in security at an LNG facility shows the need for preparedness and vigilance. The operator discovered a breach of security at its LNG facility during routine maintenance on a gate at the side of the storage tank. Although there was no damage to the tank, intruders had broken through the gate to gain access to the tank.
Investigation revealed that the intruders had cut through the outer and inner perimeter fences and through the locked gate and gained access to the storage tank several days before the breach was discovered. A microwave intrusion system documented the intrusions on the computer monitoring system, which should have alerted operator personnel to the intrusions. Operator personnel did not respond. In the days following, personnel conducted several routine visual inspections of the area without noting the cuts in the fences. Although there was also video surveillance of the perimeter, personnel did not review the tape until they investigated the breach."
I go back to my earlier post about the ability of users to handle large amounts of data.
There's growing discontent with agencies that collect huge amounts of data apparently for its own sake: Financial Cryptography cites the example of the DHS monitoring of air travellers (ATS), which continues despite apparently having been ruled illegal. Spokespersons say:
"If (the ATS programme) catches one potential terrorist, this is a success."
It's always easier to say this sort of thing and put out another contract for software with magic buzzwords (Bayesian, neural, etc.) than to actually think how to allocate resources sensibly and balance conflicting interests.
But the information, if it is of any use at all, has to be used, and ultimately that comes down to some human, somewhere. And the more data you collect, the more people you need to look at it. Otherwise you get bad results: e.g. the US military fires off 250,000 rounds per insurgent killed, and still (in effect) loses the war. Or the LNG plant operators, secure behind microwave alarms and CCTV, accidentally discover they've been burgled.
As FC says: "There's no feedback loop, no profit signal, no way to cull out the loss-making programmes. The only success criteria is in the minds of the people who are running the show."
There is a feedback loop for the LNG plant operators, who appear to be a Belgian company anyway. They get hauled up before the US authorities, and as the DHS press release says:
"State authorities responded quickly to examine security at other LNG facilities in the state. These authorities inspected operator practices and procedures to ensure personnel and systems respond correctly during a security breach."
So that's all right, then: DHS are in charge now.
Seriously. This is not a post about human rights, or security policy, or US wars. (The US is not alone here, just more technologically equipped to make massively-multiplayer mistakes.)
It's just about common sense, that old-fashioned thing some people used to have. I expect any day now to hear that the DHS is about to issue a 15-year $2.1 billion contract to General Dynamics, Lockheed Martin, BAe and L3 to simulate common sense, in order to solve the problem once and for all.