Matt Rosoff | Jul. 22, 2011, 6:05 PM http://www.businessinsider.com/how-to-predict-the-news-before-it-happens-2011-7
Recorded Future takes tens of thousands of inputs from public sources like SEC filings, blog posts, and tweets, and compiles them to give its clients a glimpse into future events.
This afternoon, CEO Christopher Ahlberg posted a few tantalizing details about what they look for.
Looking months or weeks out, these signals are interesting:
Covert behavior: Like layoffs followed by massive insider trading.
Deceit signals: Words like "frankly," "honestly," "confidentially," and "as you know" are strong signals that somebody is about to lie.
Statements with future dates: Phrases like "Apple is to release the iPhone 5 on September 7." Not every one of these statements will be accurate, but the aggregate of lots of them might be.
Obscure signals: A forum post like "this drug is making me sick" may provide information about a clinical drug trial ahead of time.
Unusual behavior: For instance, when two companies that haven't been mentioned together are suddenly appearing next to each other in news stories, or when two companies are suddenly quiet for two weeks straight, that might mean a merger is imminent.
Days and hours out, the signals get more explicit:
Directional media bursts, like a sudden surge of statements about Google growing rapidly.
Rumors like "we hear AAPL will show great iPad sales tonight."
Direct statements about the future like "the president will hold a press conference on a matter of national security."
And there's sometimes a (brief) opportunity to detect an event between the time it happens and when it's first reported, like an explosion of tweets on the same subject or a blogger posting about a strange explosion in his town.
The trick is not only in detecting the signals, but also deciding which ones are believable and which aren't. That's where deep statistical analysis comes in -- and where Recorded Future believes it has an edge.
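To make the "statements with future dates" signal concrete, here is a minimal, hypothetical sketch of the aggregation idea: pull explicitly future-dated claims out of raw text with a naive regex and count how many independent sources repeat the same (subject, date) pair. Recorded Future's actual pipeline is not public; the pattern, function name, and sample documents below are all illustrative assumptions.

```python
import re
from collections import Counter

# Illustrative only: entity and date extraction here are deliberately
# naive (one regex); a real system would use proper NLP.
MONTHS = ("January|February|March|April|May|June|July|August|"
          "September|October|November|December")

# Matches claims shaped like "Apple is to release the iPhone 5 on September 7"
PATTERN = re.compile(
    r"(?P<subject>[A-Z][\w&.]*) (?:is to|will|plans to) "
    r"(?P<action>[\w ]+?) on (?P<date>(?:%s) \d{1,2})" % MONTHS)

def future_date_claims(documents):
    """Count identical (subject, date) claims across many documents.

    The intuition from the article: any single claim may be wrong,
    but the same claim repeated by many sources is a stronger signal.
    """
    counts = Counter()
    for text in documents:
        for m in PATTERN.finditer(text):
            counts[(m.group("subject"), m.group("date"))] += 1
    return counts

docs = [
    "Apple is to release the iPhone 5 on September 7, sources say.",
    "Rumor: Apple will unveil new hardware on September 7.",
    "Google plans to announce results on October 13.",
]
print(future_date_claims(docs).most_common(1))
# → [(('Apple', 'September 7'), 2)]
```

The "believability" step the article describes would then weight these counts by source quality and history, which is where the statistical analysis comes in.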
The Manchurian Confession
Raymond Shaw: They can make me do anything, Ben, can't they? Anything.
Bennett Marco: We'll see, kid. We'll see what they can do and we'll see what we can do. So the red queen is our baby. Well, take a look at this, kid...
[fans deck and keeps holding up the cards]
Bennett Marco: 52 of them! Take a good look at 'em, Raymond, look at 'em, and while you're looking, listen. This is me, Marco, talking. 52 red queens and me are telling you... you know what we're telling you? It's over! The links, the beautifully conditioned links are smashed. They're smashed as of now because we say so, because we say they are to be smashed. We're busting up the joint, we're tearing out all the wires. We're busting it up so good all the queen's horses and all the queen's men will never put old Raymond back together again. You don't work any more! That's an order. Anybody invites you to a game of solitaire, you tell 'em sorry, buster, the ball game is over.
From The Economist
http://www.economist.com/node/21525840
False confessions: Silence is golden
People have a strange and worrying tendency to admit to things they have not, in fact, done
Aug 13th 2011
SINCE 1992 the Innocence Project, an American legal charity, has used DNA evidence to help exonerate 271 people who were wrongly convicted of crimes, sometimes after they had served dozens of years in prison. But a mystery has emerged from the case reports. Despite being innocent, around a quarter of these people had confessed or pleaded guilty to the offences of which they were accused.
It seems hard to imagine that anyone of sound mind would take the blame for something he did not do. But several researchers have found it surprisingly easy to make people fess up to invented misdemeanours. Admittedly these confessions are taking place in a laboratory rather than an interrogation room, so the stakes might not appear that high to the confessor. On the other hand, the pressures that can be brought to bear in a police station are much stronger than those in a lab. The upshot is that it seems worryingly simple to extract a false confession from someone—which he might find hard subsequently to retract.
One of the most recent papers on the subject, published in Law and Human Behavior by Saul Kassin and Jennifer Perillo of the John Jay College of Criminal Justice in New York, used a group of 71 university students who were told they were taking part in a test of their reaction times. Participants were asked to press keys on a keyboard as the letters were read aloud by another person, who was secretly in cahoots with the experimenter. The volunteers were informed that the ALT key was faulty, and that if it was pressed the computer would crash and all the experimental data would be lost. The experimenter watched the proceedings from across the table.
In fact, the computer was set up to crash regardless, about a minute into the test. When this happened the experimenter asked each participant if he had pressed the illicit key, acted as if he was upset when it was "discovered" that the data had disappeared, and requested that the participant sign a confession. Only one person actually did hit the ALT key by mistake, but a quarter of the innocent participants were so disarmed by the shock of the accusation that they confessed to something they had not done.
Robert Horselenberg and his colleagues at Maastricht University, in the Netherlands, have come up with similar results. In an as-yet-unpublished study, members of Dr Horselenberg's group told 83 people that they were taking part in a taste test for a supermarket chain. The top taster would win a prize such as an iPad or a set of DVDs. The volunteers were asked to try ten cans of fizzy drink and guess which was which. The labels were obscured by socks pulled up to the rim of each can, so to cheat a volunteer had only to lower the sock.
During the test, which was filmed by a hidden camera, ten participants actually did cheat. Bafflingly, though, another eight falsely confessed when accused by the experimenter, despite participants having been told cheats would be fined €50 ($72).
The number of innocent confessors jumps when various interrogation techniques are added to the mix. Several experiments, for example, have focused on the use of false evidence, as when police pretend they have proof of a person's guilt in order to encourage him to confess. This is usually permitted in the United States, though banned in Britain.
A second computer-crash test conducted by Dr Kassin and Dr Perillo used this technique. Another person in the room beside the experimenter said he saw the participant hitting the ALT key. In this case the confession rate jumped to 80% of innocent participants. Dr Horselenberg and his colleagues found something similar.
Dr Kassin also tested the impact of bluffing. Two participants, one of whom was again in cahoots with the investigator, sat in the same room and were asked to complete what appeared to be an academic test. Halfway through, the investigator accused them of helping each other and cited the university's honour code against cheating. The investigator went on to bluff that there was a video camera in the room, though the recording, with its definitive proof one way or the other, would not be accessible until later. In the real world, this might be like a detective telling a suspect that DNA or fingerprint evidence had been found but not yet analysed (in Britain as well as America, if such a statement were actually true, police would be permitted to say it, though in the case of the experiment it was a lie). Presumably, the innocent participants knew such a tape would exonerate them. Even so, half still confessed.
All of which is both strange and rather alarming. Dr Kassin suggests that participants may have the naive—though common—belief that the world is a just place, and that their innocence will emerge in the end, particularly in the case of the alleged video evidence. One participant, for example, told him, "it made it easier [to sign the confession] because I had nothing to hide. The cameras would prove it."
In cases like that, confession is seen as a way to end an unpleasant interrogation. But it is a risky one. In the real world, such faith can be misplaced. Though a lot of jurisdictions require corroborating evidence, in practice self-condemnation is pretty damning—and, it seems, surprisingly easy to induce.
In the bad old good old days, political prisoners needing ideological re-education were required to write and re-write their autobiographies (really just long-winded confessions), each version critiqued by their interrogators. Imagine Oxford-Cambridge tutorials with political commissars (not much imagination necessary; cf. the Cambridge Five, etc.). Eventually the author began to believe the self-confessed re-interpretation of identity he or she was forced to pen.
The Chinese under Mao used a group recitation version with the same basic idea—restate who you are until you get it right.
Tedious? Yes.
Effective? Very.
Fishy Chips: Spies Want to Hack-Proof Circuits
In 2010, the U.S. military had a problem. It had bought over 59,000 microchips destined for installation in everything from missile defense systems to gadgets that tell friend from foe. The chips turned out to be counterfeits from China, but it could have been even worse. Instead of crappy Chinese fakes being put into Navy weapons systems, the chips could have been hacked, able to shut off a missile in the event of war or lie around just waiting to malfunction.
The Intelligence Advanced Research Projects Agency, the spy community’s way-out research arm, is looking to avoid a repeat. The Trusted Integrated Circuit program is Iarpa’s attempt to keep foreign adversaries from messing with our chips — and check the circuits for backdoors once they’ve been made.
The U.S. has been worried about its foreign-sourced chips in its supply chain for a while now. In a 2005 report, the Defense Science Board warned that the shift towards greater foreign circuit production posed the risk that “trojan horse” circuits could be unknowingly installed in critical military systems. Foreign adversaries could modify chips to fizzle out early, the report said, or add secret back doors that would place a kill switch in military systems.
...
The Defense Science Board warned in its report that “trust cannot be added to integrated circuits after fabrication.” Iarpa disagrees. The agency is looking for ways to check out chips once they’ve been made, asking for ideas on how the U.S. can verify that its foreign chips haven’t been hacked in the production process.
Keep your suggestions original, though. Iarpa’s sister-shop, Darpa, has already done some work on chip verification. Darpa’s TRUST program uses advanced imaging and X-rays to search for deviations from chips’ designs. Its IRIS program aims to check out chips when the U.S. doesn’t have the full designs to compare them to.
…
Iarpa’s also interested in hearing ideas on chip obfuscation. The idea is to hide the “intent of digital and analog functions and their associated building blocks” of an integrated circuit in the FEOL (front-end-of-line) manufacturing stage. If potential adversaries can’t reverse-engineer or understand how a circuit works, it’ll be harder for them to modify it for malicious purposes.
…