768-bit RSA cracked, 1024-bit safe (for now)

With the increasing computing power available to even casual users, the security-conscious have had to move on to increasingly robust encryption, lest they find their information vulnerable to brute-force attacks. The latest milestone to fall is 768-bit RSA; in a paper posted on a cryptography preprint server, academic researchers have now announced that they factored one of these keys in early December.

RSA relies on a single large number, the modulus, which is the product of two primes. If you know those prime factors, it’s relatively easy to decrypt data; if you don’t, recovering them by factoring the modulus is a huge computational challenge. But that challenge gets easier every year as processor speed and efficiency increase, making “secure” a bit of a moving target. The paper describes how the factoring was done with commodity hardware, albeit lots of it.
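
To make that concrete, here’s a toy sketch in Python (with absurdly small, insecure primes, purely for illustration) of why knowing the two prime factors is the whole ballgame; recovering those factors for a 768-bit modulus is exactly what the researchers did.

```python
# Toy RSA with tiny primes, purely to illustrate why factoring the modulus
# breaks the scheme; real RSA uses primes hundreds of digits long.
p, q = 61, 53                 # the two secret primes
n = p * q                     # public modulus (3233)
e = 17                        # public exponent
phi = (p - 1) * (q - 1)       # only computable if you know p and q
d = pow(e, -1, phi)           # private exponent (needs Python 3.8+)

message = 42
ciphertext = pow(message, e, n)    # anyone can encrypt with (e, n)
recovered = pow(ciphertext, d, n)  # decrypting requires d, and hence p and q
assert recovered == message

# An attacker who factors n back into p and q can compute d themselves --
# which is what the RSA-768 team did, just with a 768-bit modulus
# instead of a 12-bit one.
```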

Their first step involved sieving, or identifying appropriate integers; that took the equivalent of 1,500 years on one core of a 2.2GHz Opteron, and the results occupied about 5TB. Those results were then deduplicated and processed into a matrix; because of all the previous work, actually using the matrix to factor the RSA value took a cluster less than half a day. Although most people aren’t going to have access to these sorts of clusters, they represent a trivial amount of computing power for many organizations. As a result, the authors conclude, “The overall effort is sufficiently low that even for short-term protection of data of little value, 768-bit RSA moduli can no longer be recommended.” 1024-bit values should be good for a few years still.
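
To put those numbers in perspective, a back-of-the-envelope calculation (the cluster size below is hypothetical, chosen only to illustrate the scale) shows why the authors consider this within reach of many organizations:

```python
# Back-of-the-envelope scale for the sieving step, using the paper's quoted figure.
core_years = 1500          # equivalent of 1,500 years on one 2.2GHz Opteron core
cluster_cores = 5000       # hypothetical cluster size, for illustration only

wall_clock_months = core_years / cluster_cores * 12
print(f"~{wall_clock_months:.1f} months of sieving on {cluster_cores} cores")  # ~3.6 months
```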

Given that these developments are somewhat inevitable, even the authors sound a bit bored by their report. “There is nothing new to be reported for the square root step, except for the resulting factorization of RSA-768,” they write. “Nevertheless, and for the record, we present some of the details.” Still, they manage to have a little fun, in one place referencing a YouTube clip of a Tarantino film following their use of the term “bingo.”

Apple: pixels as touch sensors for brighter, thinner screens

Touchscreens and multitouch technology make up a significant majority of Apple’s research into future user interface improvements, and the iPhone introduced some of those UI paradigm shifts to our increasingly mobile computing lives. Since almost all interaction with the iPhone—and presumably the hopefully imminent Apple tablet—involves a touchscreen, Apple hopes to improve on touchscreen technology by using each individual LCD pixel as a touch sensor.

Apple has filed a patent application, published today, for a “display with dual-function capacitive elements.” By combining display and sensing functions in each individual pixel, the design would make touchscreens thinner, lighter, and brighter than they are today.

Current touchscreens on most smartphones work by overlaying a touch-sensitive panel on top of a traditional LCD panel. The touch-sensitive panel is essentially a grid of capacitors, most commonly made from the transparent conductor indium tin oxide (ITO). When your fingertip disturbs the small electric fields around those capacitors, the voltage across them fluctuates; a processor translates these fluctuations into touch positions.
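
As a rough illustration of that last step (a sketch of the general idea, not any particular controller’s algorithm), the touch position can be estimated as the centroid of the cells whose capacitance readings changed the most:

```python
# Sketch: estimate a touch position as the centroid of capacitance changes
# on a small sensor grid. Values are made-up deltas from a baseline reading.
deltas = [
    [0, 0, 1, 0],
    [0, 3, 9, 2],
    [0, 2, 5, 1],
    [0, 0, 0, 0],
]

total = sum(sum(row) for row in deltas)
row_pos = sum(r * v for r, row in enumerate(deltas) for v in row) / total
col_pos = sum(c * v for row in deltas for c, v in enumerate(row)) / total
print(f"touch near row {row_pos:.2f}, column {col_pos:.2f}")
```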

The need for an additional layer covering the LCD makes the whole screen thicker, and even though ITO is transparent, the touch layer still blocks some of the light coming from the LCD underneath. Apple’s solution uses each individual pixel as a capacitive sensor, eliminating the need for a separate touch-sensing layer.

Part of the magic of Apple’s patent relies on forming an IPS LCD using low-temperature polycrystalline silicon (poly-Si) instead of the more common amorphous silicon. Materials engineering nerds may want to look at the patent for a more detailed explanation, but suffice it to say that poly-Si allows for a much faster switching frequency when driving the individual pixels. (For those unaware, the individual pixels in an LCD panel switch on and off at a rate much faster than we can perceive—it’s this same switching that can cause eye fatigue from staring at your screen all day.)

Apple’s idea takes advantage of the faster switching of poly-Si to drive the pixels one instant, and use the capacitive properties of the individual pixels as touch sensors the next. The switching happens fast enough to give a clear, bright display, as well as responsive touch sensing. The elimination of the separate touch-sensing layer also makes for a thinner, lighter, brighter, and simpler touchscreen unit.
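
Conceptually, the scheme looks something like the loop below: a purely illustrative sketch with made-up timing splits and placeholder functions, not Apple’s actual driver design.

```python
# Illustrative sketch of time-multiplexing pixel electrodes between display
# driving and touch sensing. Timings and function names are invented.
import time

FRAME_PERIOD = 1 / 60      # one 60Hz refresh cycle
DISPLAY_SLICE = 0.8        # fraction of each frame spent driving the LCD

def drive_display(duration):
    """Placeholder: write this frame's pixel voltages."""
    time.sleep(duration)

def sense_touches(duration):
    """Placeholder: measure the same electrodes as capacitive sensors."""
    time.sleep(duration)
    return []              # would return detected touch coordinates

for frame in range(3):     # a few frames, for demonstration
    drive_display(FRAME_PERIOD * DISPLAY_SLICE)                   # pixels act as a display...
    touches = sense_touches(FRAME_PERIOD * (1 - DISPLAY_SLICE))   # ...then as touch sensors
```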

Apple proposes its solution for mobile devices, making references to iPhones, iPods, and even MacBooks, but don’t be surprised if such an innovation also makes its way into an Apple tablet.

Digital albums, vinyl made a comeback in ’09 while CDs slid

2009 was a decently strong year for music—as long as you look at digital online sales and ignore the sinking ship that is physical CDs. The US numbers are in from Nielsen SoundScan, and they are mostly a less-extreme version of the 2008 numbers, due in no small part to the struggling US economy. Still, sales of physical media were way down while online media was way up, though vinyl enthusiasts are still bent on keeping their little niche alive—and are largely succeeding at it.

According to Nielsen’s numbers for 2009, overall music sales were up a very modest 2.1 percent over 2008—this is a far cry from the 10.5 percent growth between 2007 and 2008, but growth nonetheless. When you split the numbers out, though, they are much more telling: music lovers bought nearly 1.16 billion digital tracks in 2009 (up 8.3 percent from 2008), and 76.4 million online digital albums (up 16.1 percent).

Physical albums, on the other hand, did not fare so well. They decreased by 17.4 percent year over year (down 20.7 percent for current albums and 14.1 percent for catalog albums).

Once again, a shocker from Nielsen’s 2009 numbers came in the form of vinyl sales, which were up 33 percent last year. (For comparison’s sake, however, vinyl sales grew by a whopping 89 percent between 2007 and 2008.) With 2.5 million vinyl units sold in 2009, Nielsen said more vinyl albums were sold than in any other year on record (keeping in mind that SoundScan’s records only go back to 1991, a few years after the advent of the CD). The firm attributed this growth to household names like The Beatles, Michael Jackson, and Bob Dylan, but also said that indie artists were making a vinyl comeback. “Also notable is the fact that two out of every three vinyl albums were purchased at an independent music store,” Nielsen said.

Vinyl’s mysterious growth isn’t enough to offset the tanking of other physical sales, however—33 percent more of a very small number is still a very small number. When you combine all physical media with all online media, album sales were still down by a sad 12.7 percent year over year.

Numerous artists have blamed online services like iTunes (and now Amazon MP3) for listeners’ growing taste for cherry-picking songs, but 2009’s numbers make it clear that customers are willing to go the album route if they have motivation to do so. Singles are still king, but with digital albums up 16.1 percent (current digital albums are up 20 percent), the trend is looking better for artists hoping to make more than just a buck at a time.

Time to ban mountaintop mining due to externalized costs

A new, comprehensive analysis of mountaintop removal mining, which is common in the Appalachian region of the United States, shows that its environmental effects extend to the hydrology of its surroundings, ruining streams and the ecosystems they support. Technically known as “mountaintop mining with valley fills” (MTM/VF), the practice consists of stripping away forests and topsoil from the tops of mountains and then using explosives to break through the rocks that cover the coal inside the mountain. The resulting rock debris is then pushed into valleys, where it interferes with, and often buries, existing streams.

It’s not all that surprising that clean water, and a lot of it, is important to ecosystems; research shows that if these activities disrupt as little as 5-10 percent of a watershed’s area, they can cause irreversible changes to the ecosystem. The reduced flow of streams that get buried by valley fills can kill off plants and trees in an area with high biodiversity. This loss of flora also results in a landscape that is less effective at handling runoff water, leading to an increase in the frequency and magnitude of downstream flooding.

Streams that continue to flow are polluted with various chemicals and metals from the mountaintop rocks. Increases in sulfate cause stream microbes to create more hydrogen sulfide, which is toxic to many aquatic plants and organisms. Selenium accumulation causes deformities and lethality in fish, which in turn poison the birds that eat them. Humans in the area are also affected by the dirty streams and the elevated levels of airborne, hazardous dust that results from mining. Studies have found elevated levels of hospitalization for pulmonary disorders and hypertension, as well as increased mortality in the region.

Reclamation of the areas appears to be ineffective, with soils still having low organic and nutrient content and little to no regrowth of woody vegetation afterward. Reclamation often involves rebuilding streams, but the new ones carry chemicals released by the rock debris, and don’t integrate into the radically altered environment.

The sum of these problems adds significantly to the externalized costs of coal use for power generation. Because of the huge impact, the scientists behind the report are recommending that the government stop issuing MTM/VF permits until new methods to address these problems can be developed and subjected to rigorous review.

Science, 2010. DOI: 10.1126/science.1180543

photo courtesy of Vivian Stockman

All I wheely want for Christmas: the Fanatec Porsche Turbo S

“Christmas is a time when Ars people get toys. January is a time when they review them.” Thus tweeteth Deputy Editor Jon Stokes, and right he is. Under the tree this year (well, on the UPS truck) was a new steering wheel for my Xbox 360. Not just any wheel, but a (deep breath) Fanatec Porsche Turbo S steering wheel and Clubsport pedal setup, available directly from the manufacturer for the princely sum of $499.95. Yes, that’s a lot of money, but as we’ll see, you get quite a lot in return, and you could spend quite a lot less on the standard edition and still have what’s probably the best driving wheel peripheral on the market right now. Compared to the Microsoft Wireless Racing Wheel, the Porsche-licensed peripheral is a massive leap forward for Xbox gamers, and the ability to use the rig with a PS3 and the forthcoming Gran Turismo 5 should put it high on any racing nut’s wish list.

Video games used to be simple. Your NES came with a rectangular joypad that was all you needed to steer Mario from one end of the screen to the other, down pipes, up vines—and you’d get a nice dose of sore thumbs as an added bonus. Soon, the four points of the compass weren’t enough, and neither were A and B alone. We got shoulder buttons, analog sticks, and a proliferation of buttons to twitch, mash, and press in order to get to the end of the level and trigger that little flush of dopamine that’s the gamers’ equivalent of one of those chicken-flavoured cat treats that I reward my pets with when they’ve been especially adorable.

The standard game controller might be fine for some folks, but thankfully for the gaming peripheral manufacturers of this world, lots of us demand more faithful ways of interacting with our virtual pastimes. Different genres obviously have their own peripherals, from arcade sticks to musical instruments to the reason I’m writing this and (hopefully) the reason you’re still reading: steering wheels for driving games.

For a while, Microsoft has had a fairly good steering wheel available for the Xbox 360, which is a good thing, since almost no one else has been able to offer one. Microsoft chose to use a different standard for the 360, so wheels that work fine on PCs and PlayStations have been useless on Redmond’s console, much to the chagrin of Logitech wheel owners. Logitech’s G25 and more recent G27 wheels have been the gold standard for driving sim players, but there’s a new player in town called Fanatec, and if you’re looking for a wheel that will work with both the Xbox 360 and PS3, look no further.

I first became aware of this German company in the middle of 2008. They were already offering PC wheels, helped along by a license from Porsche. The wheels were replicas of those found in Carreras or GT3s, but what got my wallet out back then was the answer to every couchlocked racer’s dreams: the RennSport wheel stand ($129.95). Named after Porsche’s legendary series of stripped-down road monsters, this is a folding wheel stand that comes with a wife acceptance factor several orders of magnitude higher than anything you could build for yourself out of shipping pallets or MDF. But more about the wheel stand later. Back to the main event.

Word on the street was that Fanatec was releasing a wheel that would work with the Xbox 360. So what, you ask. Microsoft makes a pretty good wheel that works brilliantly with the 360. But couple that news with finding out that for the first time, a console racing game would support the use of a clutch pedal as well as an accelerator and brake, and now you have something interesting on your hands. Not only that, but a proper H-pattern gearbox like you’d find in your average car. The game of course is Forza Motorsport 3, already covered on these pages at launch, and a mighty fine game it is. But here we are, 500 words in, and still I’ve told you nothing about it. What a poor reviewer I am.

Off to the races

The Fanatec Porsche Turbo S Wheel, to give it its full name, comes in three flavors: the Pure edition (the cheapest version at $249.95, sans pedals or shifter set); the regular edition ($349.95), which comes with a three-pedal set, a sequential shifter, and an H-pattern shifter, along with an RF dongle that looks like the key to a 911 and lets you use the wheel with a PC or PS3; and finally the Clubsport package (now sold out, unfortunately), which is all of the above, except that instead of the base pedal set you get Fanatec’s hefty Clubsport pedals ($199.95 on their own), which wouldn’t look out of place in an actual track-going 911 GT3 RSR racecar.

These superduper pedals (which are available separately and will work with just about any other wheel when connected to a PC) use contactless sensors and are light years ahead of the plastic ones that come with the Microsoft wheel we know and love. The brake pedal has a pair of very nifty features: a load cell sensor that lets you vary how much pressure is needed to reach full activation (i.e., how hard you have to press to register 100 percent braking), and, working in conjunction with the wheel, force feedback that pulses the pedal when the wheels lock up, much the way your car’s ABS behaves (yes, it still does this when you race with ABS turned off, which is a good thing, as we shall see).
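
To illustrate those two features, here’s an invented sketch of the concept in Python; the names, thresholds, and logic are mine, not Fanatec’s actual firmware or API.

```python
# Illustrative sketch of the two brake-pedal features described above.
# Names and thresholds are invented; this is not Fanatec's firmware.

MAX_FORCE_KG = 40          # user-adjustable: pedal force needed for 100% braking

def brake_input(load_cell_kg):
    """Map measured pedal force to a 0.0-1.0 brake value."""
    return min(load_cell_kg / MAX_FORCE_KG, 1.0)

def update_pedal_feedback(wheels_locked, pulse_motor):
    """Pulse the pedal when the game reports locked wheels, ABS-style."""
    if wheels_locked:
        pulse_motor()      # a brief vibration tells you to ease off the brake

print(brake_input(20))     # 0.5 -- half braking at 20kg of pedal force
```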

Cell launches “Article of the Future” format

During the last year, Cell has spearheaded an initiative to change the way readers interact with research articles on the Internet. The journal notes that, for a long time, research papers on the Web have lain flat and lifeless on the screen, much as print articles do on the page, even though they are capable of so much more. Cell came up with new layouts for research papers that take advantage of current Web technology, and presented the prototypes to its authors and readers, who voted and gave feedback on the designs.

What Cell came up with is a format it has christened the “Article of the Future.” Articles of the Future are broken down into their respective sections (Introduction, Discussion, Figures, and so on) with a navigation bar at the top, allowing readers to jump to the sections they are most interested in. The landing section of each article is a bullet-point summary and abstract next to a representative image, giving readers a quick take-away message if that’s all they’re looking for (a video summary, called a “PaperFlick,” appears here when available). You can even zoom in on images (!), which are presented under the Data tab as a film strip. The citations in the body of the paper are also linked to a separate section, where readers can view each citation in full.

Hold on, don’t reach for your smelling salts just yet. Doesn’t this sound a little familiar? It should—to an extent, all Cell has done is take features from dozens of websites and apply them to scientific publications. But, similarities aside, any effort to make research more accessible and readable is a good thing, and rare among Cell’s peers. Maybe this will encourage other publications to step into the Future—or, you know, at least 2001.

An embarrassment of Kepler riches, planetary and otherwise

Earlier this week, we described a brief announcement from the team behind the Kepler space observatory, which is designed to spot the transits of planets in front of their host stars. That announcement was followed up by a paper in Thursday’s issue of Science, which provided a few more details on some of the planets and other things that have been spotted within its field of view. But that paper was really a vehicle for a massive information dump; it points to 22 papers submitted to the Astrophysical Journal, many of which have been posted on the arXiv preprint server.

Many of these describe the instrument aboard Kepler in greater detail, and others the scientific pipeline that handles the data it returns—eliminating false positives and coordinating follow-up observations are central to the process. One paper lists the follow-up resources available to the team, which include Hubble and Spitzer in space, and Hawaii’s Keck telescope back on Earth. As of the end of 2009, there were already 177 items that were in or had been through the planetary pipeline. Five of these have already been confirmed to be signals from planets, and another 52 appear to be promising candidates. Another 65 are still under observation, status unknown.

Some of the planets are rather unusual, at least compared to the bodies in our solar system. One, Kepler-4b, is quite similar to Neptune (nearly the same size, and 1.4 times the mass). But, as the authors note in a fit of understatement, “A major difference between Kepler-4b and Neptune is the irradiance level for Kepler-4b is over 800,000 times larger.” If Kepler-4b and Neptune had the same composition, then that much heat (its equilibrium temperature is 1650K) would have caused the planet to swell dramatically. So, although the paper’s title refers to the planet as “Hot Neptune-like,” there seem to be some substantial differences in composition.

Another planet, Kepler-6b, does seem to have undergone some temperature-related bloating. Despite being only about two-thirds the mass of Jupiter, at 1,500K, it’s bloated up to a radius of 1.3 times Jupiter’s. That leaves it with a density of 0.35g/cm3 (for the metrically unaware, water’s value is 1). Although this seems odd based on our familiar planets, the authors describe these features as “fairly typical.” Kepler-8b, however, falls off the far end of typical. It’s got a density of 0.26g/cm3, making it one of the lowest density planets known.
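
Those density figures follow straight from the quoted mass and radius; here’s a quick sanity check using standard values for Jupiter (a back-of-the-envelope calculation, not taken from the paper):

```python
# Rough check of Kepler-6b's quoted density from the mass and radius given above.
from math import pi

M_JUP = 1.898e27       # kg
R_JUP = 7.149e7        # m

mass = (2 / 3) * M_JUP             # "about two-thirds the mass of Jupiter"
radius = 1.3 * R_JUP               # "a radius of 1.3 times Jupiter's"
volume = (4 / 3) * pi * radius**3
density = mass / volume            # kg/m^3

print(f"{density / 1000:.2f} g/cm^3")   # ~0.38 with these rounded inputs,
                                        # in the ballpark of the quoted 0.35
```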

Returning to a known planet, HAT-P-7 (which Kepler used to validate its instruments), scientists were actually able to detect that the massive planet, orbiting so close to its host star, induced disturbances in the star’s surface. That’s good news for the observatory’s future. “The Kepler light curve of HAT-P-7 reveals ellipsoidal variations with an amplitude of approximately 37 ppm,” the authors of that paper note. “This is the first detection of ellipsoidal variations in an exoplanet host star, and shows the precision Kepler is capable of producing even at this early stage. For comparison, a transit of an Earth-analog planet around a Sun-like star would produce a signal depth of 84 ppm, a factor of 2 larger than this effect.”
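
That 84 ppm figure is just geometry: the fraction of starlight blocked during a transit is the ratio of the planet’s disk area to the star’s. Here’s a quick check with standard Earth and Sun radii (a back-of-the-envelope calculation, not taken from the paper):

```python
# Transit depth of an Earth-analog: the fraction of the star's light blocked
# is (planet radius / star radius) squared.
R_EARTH = 6.371e6   # m
R_SUN = 6.957e8     # m

depth = (R_EARTH / R_SUN) ** 2
print(f"{depth * 1e6:.0f} ppm")   # ~84 ppm, roughly twice the 37 ppm ellipsoidal signal
```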

Some of the other papers point out that, although Kepler’s primary mission is to hunt planets, it’s unusual in that it does so by staring intently at the same stars, day in and day out (the ESA’s CoRoT does something similar). That is allowing researchers to do some asteroseismology, tracking variations in stars over the short term and getting glimpses into the processes that drive stellar evolution.

They’re also using the data to observe the orientation of the orbits of these planets in order to constrain our understanding of how planetary systems develop. These hot Jupiters can’t possibly form so close to their host stars, so the data will help identify whether they are typically dragged inwards by the star’s gravity, or shoved inwards by interactions with other planets.

All that in the first six months that Kepler has been operational.

Science, 2009. DOI: 10.1126/science.1185402

Greenpeace gives Apple gold stars for green efforts

Despite the two companies’ somewhat spotty history together, Greenpeace has awarded Apple four giant gold stars for its efforts to rid its products of brominated flame retardants (BFRs) and polyvinyl chloride (PVC). (BFRs and PVC have long been on Greenpeace’s hit list of environmentally unfriendly chemicals.) In fact, Apple received a large gold star—the highest rating Greenpeace gave out—in each of the four categories rated in its latest report: desktops, portables, cell phones, and displays. Of the six companies with products in all four categories, Apple was the only one to receive a large gold star in any category, and, in general, it blew away the other five. Dell, Lenovo, Samsung, and LGE received only one small gold star each.

Apple also made progress in Greenpeace’s Guide to Greener Electronics, where it now sits in fifth place out of the 18 companies Greenpeace chose to rate. With a cumulative score of 5.1, Apple has moved up six places since July of 2009. Apple did lose points, however, for not providing “public positions” on some issues and for not communicating future plans regarding the elimination of certain chemical compounds.

Greenpeace has been riding Apple since as far back as 2006, when it issued a similar report and tried to light up the 5th Avenue Apple store in NYC with green flashlights in an attempt to bring Apple’s environmental failings into the spotlight. That same year, the organization went as far as making a mock Apple website lambasting the company. In 2007, Greenpeace tried another tactic, pressuring Al Gore (who is a board member at Apple) into changing the company’s ways. Though this certainly won’t be the last time we hear from Greenpeace, it’s nice to see the organization has finally let up a little thanks to Apple’s environmental improvements.

Microsoft: Google’s Nexus One will hurt Android

Earlier this week, Google unveiled the Nexus One, the search giant’s first branded mobile phone. Then the company confirmed that the Nexus One, and all subsequent Google phones sold via the company’s online store, will be available unlocked for use on any participating carrier. Microsoft has weighed in on this development, specifically the fact that Google is both offering Android to its partners and letting one partner benefit from building a Google-branded phone, and has concluded that it is a flawed strategy. The software giant says that Google will have a hard time attracting partners to its mobile operating system after introducing its own handset, even though that handset is built by HTC.

Microsoft has been rumored to be working on its own mobile phone for months, if not years. Officially though, the company insists that releasing its own branded smartphone would be contrary to its strategy of offering just the operating system to a number of partners who then provide various hardware options so that consumers can have a myriad of devices to choose from. Thus it’s not too big of a surprise to hear Microsoft Entertainment and Devices President Robbie Bach bash Google for the move, saying that handset makers may fear the company will prioritize its own product over theirs and ditch Android as a result.

“Doing both in the way they are trying to do both is actually very, very difficult,” Bach said at CES 2010 yesterday, according to Bloomberg. “Google’s announcement sends a signal where they’re going to place their commitment. That will create some opportunities for us and we’ll pursue them. Over time you have to decide whether your approach is with the partners or more like an Apple approach that is more about Apple. Google’s is an interesting step. We’ll see how people react.”

Google may have a hard time convincing its licensees that it isn’t competing with them. Still, Google has at least one advantage over Microsoft: Android is free for licensees to put on their devices. Had Google launched the Nexus One first and only then begun distributing Android, that would be a big problem; since it happened the other way around, licensees may find that gratis is an addiction that’s hard to drop once you’ve had it for a few months.

The e-book wars of 2010: displays and hardware

If I had any doubts that the e-book wars are officially on, my first day at CES dispelled them thoroughly. Note that I said “e-book wars,” and not “e-reader wars.” That’s because there’s a tidal wave of E-Ink-based e-readers that are about to hit the US, so that by the second half of this year (at the latest) E-Ink screens will be a dime a dozen. And on top of the E-Ink screens will be the tablets, and on top of those will be LCD/E-Ink tablet combos in various configurations.

But as thick as the market will be with e-book hardware, the readers aren’t the only crowded part of the market. Everyone also wants to control a distribution platform. And then there are the publishers, who are scrambling to adapt to the new medium.

In short, right now, the emerging e-book market is in a full-blown melee—a free-for-all where everyone along the chain from content producer to reader is trying to be the first to figure it all out. Over the next few days, we’ll talk about how the battle lines are shaping up in the following areas: displays, chips, storefronts, and publishers. Many of the combatants are involved in more than one of these areas—Qualcomm is in displays and chips; newcomer Copia is pushing hardware and a storefront; Sprint, Hearst, Skiff, and LG are all allied across displays, storefronts, and publishers under the Skiff banner; and so on.

The Sprint/Hearst Skiff and the Plastic Logic QUE

Most of the e-readers coming out in the next few months are based on E-Ink, but that doesn’t mean that the displays will be identical. Reading devices will compete with each other on size, thickness, resolution, contrast, and price. The screens will also compete to offer color as quickly as possible.

Of the readers that I saw, the Skiff has the edge on size so far with an 11.5″ diagonal screen. Plastic Logic comes in a close second, but, to be 100 percent honest, I couldn’t actually tell that there was much of a difference in size (I saw them one after the other); the Skiff executive I talked to told me that the Skiff’s screen was a bit bigger. Regardless, both are easily large enough to display a full 8.5 x 11 inch page without any scaling, and both have solid industrial design.

The Sprint/Hearst Skiff

As far as contrast goes, Plastic Logic’s screen definitely looks better than my Kindle DX—the latter has a grayish cast, while the former presents a much cleaner black-on-white look. I can’t judge between the Skiff and the Plastic Logic screens on contrast, though, because I didn’t see them in similar lighting conditions.

The Plastic Logic QUE

Both the Skiff and the Plastic Logic QUE were incredibly thin—about a quarter of an inch or less. This thinness is made possible in part by the fact that both have flexible display substrates—Skiff’s uses a foil substrate developed by LG, while Plastic Logic’s uses a plastic substrate developed in-house. Both of these make for flexible displays, but of the two, only the Skiff itself is physically flexible (you can actually bend the device a bit without hurting it).

On the resolution front, the Skiff wins at 174dpi to Plastic Logic’s slightly lower 150dpi. I couldn’t visually tell a difference, but again, lighting conditions were drastically different.

Color E-Ink is in the offing, and I saw a prototype of it at the Skiff presentation. At this point, the technology looks promising but it needs a lot of work. Color saturation was pretty poor, and right now I’d prefer the black-and-white to it. There are supposedly better color E-Ink prototypes than the one I saw, and if I can catch a glimpse of a superior iteration of the tech then I’ll post an update.