Posts : 3
Join date : 2012-08-07
Subject: Cryptography - Tue Aug 07, 2012 5:57 pm
Hi there, this is just a bit of feedback about the statements made on the site, and in email responses with you guys. I asked what level of encryption you were offering on your servers, as your website claims it to be 2048-bit. This is a bit misleading: 2048-bit can only refer to the authentication key, not the tunnel itself (which is the important aspect). I sent an email asking about the tunnel strength and received a speedy response, I must add, from Haze, claiming the tunnel encryption was 256-bit AES. This was obviously pleasing to hear; however, on inspection of Tunnelblick's logs prior to and since the upgrade, on both TCP and UDP servers, the encryption turns out to be 128-bit. Obviously 128-bit is still fine, since, as far as I am aware, it has not been broken. It just seems like this could be made clearer on your website, as claiming 2048-bit encryption as a selling point seems a bit dishonest - no offence intended.
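To illustrate why "2048-bit" and "128/256-bit" aren't comparable numbers: an RSA modulus size and a symmetric key size measure different things, and the commonly cited NIST SP 800-57 guidance maps them onto a rough common scale. A minimal sketch of that mapping (the lookup function is my own illustrative helper, not anything from NIST):

```python
# Rough comparison of the two kinds of "bits": an RSA modulus size
# (the 2048-bit authentication key) versus a symmetric cipher key size
# (the 128/256-bit tunnel cipher). Equivalences below follow the
# commonly cited NIST SP 800-57 comparable-strength table.

# symmetric-equivalent strength (bits) : typical RSA modulus size (bits)
NIST_EQUIVALENCE = {
    80: 1024,
    112: 2048,
    128: 3072,
    192: 7680,
    256: 15360,
}

def symmetric_equivalent(rsa_bits):
    """Approximate symmetric-strength equivalent of an RSA key size."""
    for sym, rsa in sorted(NIST_EQUIVALENCE.items()):
        if rsa_bits <= rsa:
            return sym
    return 256

print(symmetric_equivalent(2048))   # RSA-2048 is roughly 112-bit symmetric strength
```

So a 2048-bit RSA handshake is, on this scale, weaker than even a 128-bit tunnel cipher; quoting the larger number as the headline encryption strength is apples to oranges.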
Posts : 57
Join date : 2012-08-04
Subject: Re: Cryptography - Tue Aug 07, 2012 8:14 pm
These are interesting points.
The distinction between key exchange procedures and the post-key tunnel configuration is an important one. The logical relationship between the two is, I think, fairly clear-cut:
If the key setup procedure is improperly handled - whether through poor algorithm choice, poor implementation, or insufficient entropic contribution - then the strength of the concomitant tunnel encryption is irrelevant. Or, to put it another way: if you fuck up the key exchange, everything from that point forward is fucked. By definition.
If, however, key setup is done right, then the reliability of the tunnel depends on the characteristics of the tunnel encryption itself. With keys already shared (via, one assumes, RSA-ish techniques of one flavour or another), tunnel encryption is essentially symmetric-cipher work: both ends use the shared session key to encrypt and decrypt the traffic. Assuming competent implementation, the reliability of same depends on the entropic value of the session key - keylength, in other words. The only known generic attack against a well-designed symmetric cipher is brute-force key search, and thus keylength matters. A lot.
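The two-phase structure described above can be sketched in a few lines. To keep this self-contained I've stubbed the asymmetric handshake and used a toy SHA-256 XOR keystream as a stand-in for the tunnel cipher - that construction is for illustration only and is NOT a secure cipher; a real VPN would run an RSA/DH handshake and then AES:

```python
import hashlib
import secrets

def key_setup():
    # Stand-in for the asymmetric handshake: in a real VPN, RSA-ish
    # techniques get both ends to the same 256-bit session key.
    return secrets.token_bytes(32)

def keystream(key, length):
    # Derive a pseudo-random byte stream from the session key
    # (toy construction, purely to show the structure).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def tunnel_encrypt(key, plaintext):
    # Bulk "tunnel" encryption under the agreed symmetric key.
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

tunnel_decrypt = tunnel_encrypt  # XOR is its own inverse

session_key = key_setup()
msg = b"packet payload"
assert tunnel_decrypt(session_key, tunnel_encrypt(session_key, msg)) == msg
```

The point of the sketch is the dependency chain: if `key_setup()` is compromised, nothing downstream matters; if it's sound, security rests entirely on the symmetric stage and its keylength.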
I'm - obviously - not a professional cryptographer. Not even an amateur one. Perhaps an aspiring dilettante is the best that can be said. So, if there are any "real" crypto experts who would like to correct/amplify/destroy my summation above, please do so! I think it's very useful - and surprisingly difficult - to put these mathematical concepts into actual words; folks with the maths skills to really understand this stuff often can't put it into words all that well, and those with verbal skills (such as myself, for example) often fuck up the maths and thus the summary of same. Teamwork is required, in other words.
All that said, if the tunnel keylength for VPN sessions is actually 128 bits then that's a Bad Thing. Seriously so. I can copy over (if folks are interested) the hot-off-the-presses text on how the NSA views 128-bit cryptosystems. Summary version: they consider them open for inspection given existing technology and resources. Not so 256-bit. So this matters. I'll be tossing queries into the mothership to find out what's what with this.
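For scale, here's a back-of-envelope check of what raw brute force at petaflop speeds buys. The assumptions (one key trial per operation, 10^15 trials per second) are generous simplifications of mine - any real attack on 128-bit systems would have to rest on something cleverer than exhaustive search:

```python
# Back-of-envelope brute-force cost. Assumes, generously, that one key
# trial costs one operation and a petaflop machine does 1e15 trials/sec.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_search(keybits, trials_per_second=1e15):
    # Expected work is half the keyspace on average.
    return (2 ** (keybits - 1)) / trials_per_second / SECONDS_PER_YEAR

print(f"128-bit: {years_to_search(128):.3e} years")
print(f"256-bit: {years_to_search(256):.3e} years")
```

Even at petaflop rates the 128-bit figure dwarfs the age of the universe, and each extra key bit doubles it - which is why, per the article below, stored intercepts plus cryptanalytic shortcuts matter more than raw speed alone.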
Thanks for the queries - they're touching on important, substantive issues.

Pt_Jd
Posts : 57
Join date : 2012-08-04
Subject: Re: Cryptography - Tue Aug 07, 2012 8:25 pm
Since I assume/hope folks will find this interesting, here's the verbatim text from Wired's groundbreaking article on illegal NSA spying. (I've highlighted a few key sections in boldface.)
Quote:
There is still one technology preventing untrammeled government access to private digital data: strong encryption. Anyone—from terrorists and weapons dealers to corporations, financial institutions, and ordinary email senders—can use it to seal their messages, plans, photos, and documents in hardened data shells. For years, one of the hardest shells has been the Advanced Encryption Standard, one of several algorithms used by much of the world to encrypt data. Available in three different strengths—128 bits, 192 bits, and 256 bits—it’s incorporated in most commercial email programs and web browsers and is considered so strong that the NSA has even approved its use for top-secret US government communications. Most experts say that a so-called brute-force computer attack on the algorithm—trying one combination after another to unlock the encryption—would likely take longer than the age of the universe. For a 128-bit cipher, the number of trial-and-error attempts would be 340 undecillion (10^36).
Breaking into those complex mathematical shells like the AES is one of the key reasons for the construction going on in Bluffdale. That kind of cryptanalysis requires two major ingredients: super-fast computers to conduct brute-force attacks on encrypted messages and a massive number of those messages for the computers to analyze. The more messages from a given target, the more likely it is for the computers to detect telltale patterns, and Bluffdale will be able to hold a great many messages. “We questioned it one time,” says another source, a senior intelligence manager who was also involved with the planning. “Why were we building this NSA facility? And, boy, they rolled out all the old guys—the crypto guys.” According to the official, these experts told then-director of national intelligence Dennis Blair, “You’ve got to build this thing because we just don’t have the capability of doing the code-breaking.” It was a candid admission. In the long war between the code breakers and the code makers—the tens of thousands of cryptographers in the worldwide computer security industry—the code breakers were admitting defeat.
So the agency had one major ingredient—a massive data storage facility—under way. Meanwhile, across the country in Tennessee, the government was working in utmost secrecy on the other vital element: the most powerful computer the world has ever known.
The plan was launched in 2004 as a modern-day Manhattan Project. Dubbed the High Productivity Computing Systems program, its goal was to advance computer speed a thousandfold, creating a machine that could execute a quadrillion (10^15) operations a second, known as a petaflop—the computer equivalent of breaking the land speed record. And as with the Manhattan Project, the venue chosen for the supercomputing program was the town of Oak Ridge in eastern Tennessee, a rural area where sharp ridges give way to low, scattered hills, and the southwestward-flowing Clinch River bends sharply to the southeast. About 25 miles from Knoxville, it is the “secret city” where uranium-235 was extracted for the first atomic bomb. A sign near the exit read: what you see here, what you do here, what you hear here, when you leave here, let it stay here. Today, not far from where that sign stood, Oak Ridge is home to the Department of Energy’s Oak Ridge National Laboratory, and it’s engaged in a new secret war. But this time, instead of a bomb of almost unimaginable power, the weapon is a computer of almost unimaginable speed.
In 2004, as part of the supercomputing program, the Department of Energy established its Oak Ridge Leadership Computing Facility for multiple agencies to join forces on the project. But in reality there would be two tracks, one unclassified, in which all of the scientific work would be public, and another top-secret, in which the NSA could pursue its own computer covertly. “For our purposes, they had to create a separate facility,” says a former senior NSA computer expert who worked on the project and is still associated with the agency. (He is one of three sources who described the program.) It was an expensive undertaking, but one the NSA was desperate to launch.
Known as the Multiprogram Research Facility, or Building 5300, the $41 million, five-story, 214,000-square-foot structure was built on a plot of land on the lab’s East Campus and completed in 2006. Behind the brick walls and green-tinted windows, 318 scientists, computer engineers, and other staff work in secret on the cryptanalytic applications of high-speed computing and other classified projects. The supercomputer center was named in honor of George R. Cotter, the NSA’s now-retired chief scientist and head of its information technology program. Not that you’d know it. “There’s no sign on the door,” says the ex-NSA computer expert.
At the DOE’s unclassified center at Oak Ridge, work progressed at a furious pace, although it was a one-way street when it came to cooperation with the closemouthed people in Building 5300. Nevertheless, the unclassified team had its Cray XT4 supercomputer upgraded to a warehouse-sized XT5. Named Jaguar for its speed, it clocked in at 1.75 petaflops, officially becoming the world’s fastest computer in 2009.
Meanwhile, over in Building 5300, the NSA succeeded in building an even faster supercomputer. “They made a big breakthrough,” says another former senior intelligence official, who helped oversee the program. The NSA’s machine was likely similar to the unclassified Jaguar, but it was much faster out of the gate, modified specifically for cryptanalysis and targeted against one or more specific algorithms, like the AES. In other words, they were moving from the research and development phase to actually attacking extremely difficult encryption systems. The code-breaking effort was up and running.
The breakthrough was enormous, says the former official, and soon afterward the agency pulled the shade down tight on the project, even within the intelligence community and Congress. “Only the chairman and vice chairman and the two staff directors of each intelligence committee were told about it,” he says. The reason? “They were thinking that this computing breakthrough was going to give them the ability to crack current public encryption.”
In addition to giving the NSA access to a tremendous amount of Americans’ personal data, such an advance would also open a window on a trove of foreign secrets. While today most sensitive communications use the strongest encryption, much of the older data stored by the NSA, including a great deal of what will be transferred to Bluffdale once the center is complete, is encrypted with more vulnerable ciphers. “Remember,” says the former intelligence official, “a lot of foreign government stuff we’ve never been able to break is 128 or less. Break all that and you’ll find out a lot more of what you didn’t know—stuff we’ve already stored—so there’s an enormous amount of information still in there.”
That, he notes, is where the value of Bluffdale, and its mountains of long-stored data, will come in. What can’t be broken today may be broken tomorrow. “Then you can see what they were saying in the past,” he says. “By extrapolating the way they did business, it gives us an indication of how they may do things now.” The danger, the former official says, is that it’s not only foreign government information that is locked in weaker algorithms, it’s also a great deal of personal domestic communications, such as Americans’ email intercepted by the NSA in the past decade.
But first the supercomputer must break the encryption, and to do that, speed is everything. The faster the computer, the faster it can break codes. The Data Encryption Standard, the 56-bit predecessor to the AES, debuted in 1976 and lasted about 25 years. The AES made its first appearance in 2001 and is expected to remain strong and durable for at least a decade. But if the NSA has secretly built a computer that is considerably faster than machines in the unclassified arena, then the agency has a chance of breaking the AES in a much shorter time. And with Bluffdale in operation, the NSA will have the luxury of storing an ever-expanding archive of intercepts until that breakthrough comes along.
But despite its progress, the agency has not finished building at Oak Ridge, nor is it satisfied with breaking the petaflop barrier. Its next goal is to reach exaflop speed, one quintillion (10^18) operations a second, and eventually zettaflop (10^21) and yottaflop.
These goals have considerable support in Congress. Last November a bipartisan group of 24 senators sent a letter to President Obama urging him to approve continued funding through 2013 for the Department of Energy’s exascale computing initiative (the NSA’s budget requests are classified). They cited the necessity to keep up with and surpass China and Japan. “The race is on to develop exascale computing capabilities,” the senators noted. The reason was clear: By late 2011 the Jaguar (now with a peak speed of 2.33 petaflops) ranked third behind Japan’s “K Computer,” with an impressive 10.51 petaflops, and the Chinese Tianhe-1A system, with 2.57 petaflops.
But the real competition will take place in the classified realm. To secretly develop the new exaflop (or higher) machine by 2018, the NSA has proposed constructing two connecting buildings, totaling 260,000 square feet, near its current facility on the East Campus of Oak Ridge. Called the Multiprogram Computational Data Center, the buildings will be low and wide like giant warehouses, a design necessary for the dozens of computer cabinets that will compose an exaflop-scale machine, possibly arranged in a cluster to minimize the distance between circuits. According to a presentation delivered to DOE employees in 2009, it will be an “unassuming facility with limited view from roads,” in keeping with the NSA’s desire for secrecy. And it will have an extraordinary appetite for electricity, eventually using about 200 megawatts, enough to power 200,000 homes. The computer will also produce a gargantuan amount of heat, requiring 60,000 tons of cooling equipment, the same amount that was needed to serve both of the World Trade Center towers.
In the meantime Cray is working on the next step for the NSA, funded in part by a $250 million contract with the Defense Advanced Research Projects Agency. It’s a massively parallel supercomputer called Cascade, a prototype of which is due at the end of 2012. Its development will run largely in parallel with the unclassified effort for the DOE and other partner agencies. That project, due in 2013, will upgrade the Jaguar XT5 into an XK6, codenamed Titan, upping its speed to 10 to 20 petaflops.
Yottabytes and exaflops, septillions and undecillions—the race for computing speed and data storage goes on. In his 1941 story The Library of Babel, Jorge Luis Borges imagined a collection of information where the entire world’s knowledge is stored but barely a single word is understood. In Bluffdale the NSA is constructing a library on a scale that even Borges might not have contemplated. And to hear the masters of the agency tell it, it’s only a matter of time until every word is illuminated.
Absolute proper respect to author James Bamford for this excellent, powerful reporting. Hear, hear!

Pt_Jd