Why a 40-year-old SCOTUS ruling against software patents still matters today

21 Jun 2018, 8:15 pm

Under the Federal Circuit appeals court, patent law swung from software patent skepticism in the 1970s to extreme permissiveness in the 1990s, then started to swing back toward skepticism with stricter Supreme Court oversight. (credit: Federal Circuit Historical Society / Aurich Lawson)

Forty years ago this week, in the case of Parker v. Flook, the US Supreme Court came close to banning software patents. "The court said, 'Well, software is just math; you can't patent math,'" said Stanford legal scholar Mark Lemley. As a result, "It was close to impossible in the 1970s to get software patents."

If the courts had faithfully applied the principles behind the Flook ruling over the last 40 years, there would be far fewer software patents on the books today. But that's not how things turned out. By 2000, other US courts had dismantled meaningful limits on patenting software—a situation exemplified by Amazon's infamous 1999 patent on the concept of shopping with one click. Software patents proliferated, and patent trolls became a serious problem.

But the pendulum eventually swung the other way. A landmark 2014 Supreme Court decision called Alice v. CLS Bank—which also marks its anniversary this week—set off an earthquake in the software patent world. In the first three years after Alice, the Federal Circuit, the appeals court that hears all patent appeals, rejected 92.3 percent of the patents challenged under the Alice precedent.

The top ten games from E3 2018

19 Jun 2018, 11:30 am

In spite of countless leaks and pre-show announcements, this year’s Electronic Entertainment Expo (E3) still managed to surprise us. Perhaps the biggest surprise of all was the presence of so many well-crafted, single-player delights. We were also happy to see way fewer battle royale cash-ins than we’d feared—though maybe they are just taking longer to develop.

Since attending the show last week, our E3 brain trust (Kyle Orland, Sam Machkovech, Samuel Axon) has been arguing over our favorite hands-on and hands-off demos. We managed to settle on this definitive top-ten list, along with a slew of honorable mentions.

Our selected games are listed in alphabetical order, not ranked.

What happened last time it was as warm as it’s going to get later this century?

18 Jun 2018, 1:00 pm

Map of Antarctica today showing rates of retreat (2010-2016) of the “grounding line” where glaciers lose contact with bedrock underwater, along with ocean temperatures. The lone red arrow in East Antarctica marks the Totten Glacier, which alone holds ice equivalent to ~3m (10ft) of sea level rise. (credit: Hannes Konrad et al., University of Leeds, UK)

"What's past is prologue"- Shakespeare’s The Tempest.

The year 2100 stands like a line of checkered flags at the climate change finish line, as if all our goals expire then. But like the warning etched on a car mirror: it’s closer than it appears. Kids born today will be grandparents when most climate projections end.

And yet, the climate won’t stop changing in 2100. Even if we succeed in limiting warming this century to 2°C, we’ll have CO2 at around 500 parts per million. That’s a level not seen on this planet since the Middle Miocene, 16 million years ago, when our ancestors were apes. Temperatures then were about 5 to 8°C warmer, not 2°, and sea levels were some 40 meters (130 feet) or more higher, not the half meter (1.5 feet) anticipated at the end of this century by the 2013 IPCC report.

Why is there a yawning gap between end-century projections and what happened in Earth’s past? Are past climates telling us we’re missing something?

How ARKit 2 works, and why Apple is so focused on AR

16 Jun 2018, 1:00 pm

A LEGO app using Apple's new ARKit features. (credit: Apple)

Augmented reality (AR) has featured prominently in nearly all of Apple's events since iOS 11 was introduced. Tim Cook has said he believes it will be as revolutionary as the smartphone itself, and AR was Apple’s biggest focus in sessions with developers at WWDC this year.

But why? Most users don’t think the killer app for AR has arrived yet—unless you count Pokémon Go. The use cases so far are cool, but they’re not necessary and they’re arguably a lot less cool on an iPhone or iPad screen than they would be if you had glasses or contacts that did the same things.

From this year's WWDC keynote to Apple’s various developer sessions hosted at the San Jose Convention Center and posted online for everyone to view, though, it's clear that Apple is investing heavily in augmented reality for the future.

Forget about that Tesla—the Jaguar I-Pace is the most compelling EV yet

14 Jun 2018, 2:25 pm

(credit: Jonathan Gitlin)

Because Jaguar is only offering I-Pace drives in Portugal, we elected to accept a paid flight and three nights in a hotel (two in Portugal and then one at CDG because we had to wait for our return flight) in order to attend this event, rather than having to wait several more months to drive the vehicle.

The Jaguar I-Pace might just be the most significant new car we'll drive this year. It's an all-new, all-electric vehicle from the British automaker, the first installment of its ambitious plan to electrify the entire model range over the next few years. We first saw the I-Pace as a concept at the 2016 LA Auto Show. Now, less than two years later, the production version is ready, almost unchanged.

And this week, we've driven it—on the road, off-road, and even on track. It's not perfect (no car is), but make no mistake: it is very, very good. So good that Waymo, Alphabet's self-driving company, has ordered 20,000 I-Paces to put into service as robocabs in the next couple of years.

The Ars Technica Father’s Day gift guide

13 Jun 2018, 4:15 pm

A few gadgets we think your old man might enjoy. (credit: Jeff Dunn)

Last month, we compiled a few gift-worthy gadgets for Ars readers to grab for Mother's Day. Today, it's Dad's turn. With Father's Day on the horizon, we've once again revisited the many devices that have rolled through the Ars labs in recent months and picked out a list of favorites.

The following Father's Day gift ideas should please the kind of tech-savvy dad (or any parent, really) we'd expect to raise an Arsian. Feel free to nudge a loved one toward getting something if you're a father yourself. And if nothing below works, try to at least give your old man a call this weekend.

Note: Ars Technica may earn compensation for sales from links on this post through affiliate programs.

Exclusive: Plume’s new “Superpod” hardware is here—and it’s fast

12 Jun 2018, 8:00 am

Plume made a splash in the burgeoning Wi-Fi mesh scene a couple of years ago by promising to do things differently. In a market where vendors vie with each other to put the biggest, nastiest-looking hardware with the biggest possible numbers on the box, Plume seemed to say, "That's not how you actually fix Wi-Fi."

Instead, the small, crowdfunded startup took a risk on selling tiny, low-powered devices with cloud-based smart management. And the strategy proved successful, despite the devices' individual low power and speed. Fast-forward to today, and Plume is releasing a second generation of hardware—called "Superpod"—that keeps the small form factor, nimble deployment, and overall network reliability of its first product. And after a little pre-release hands-on time, we can say Plume's newest effort also appears to add the raw speed its predecessor was missing.

Quarter-mile times aren’t everything

Before talking about this particular product's performance, we need to talk about how to measure Wi-Fi performance in the first place. When I'm not busy building my own routers, I've spent the last couple of years learning about and improving methods of testing Wi-Fi systems in ways that actually matter for real-world use. Wireless AC speed ratings are complete mumbo-jumbo, and simple iPerf3 runs don't get the job done, either.
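By way of illustration, here is a minimal sketch of the kind of measurement a plain throughput run misses: latency under load. It saturates the link with parallel iperf3 streams while pinging the gateway, since a network that stays responsive while busy matters far more to real browsing than a single big number. This is our own toy harness, not Plume's and not the methodology used for this review; the addresses and durations are assumptions, and it expects iperf3 and ping on a Linux or macOS box with an iperf3 server already running on the wired side of the network.

```python
#!/usr/bin/env python3
"""Toy latency-under-load probe (illustrative, not a real test harness)."""
import re
import statistics
import subprocess

IPERF_SERVER = "192.168.1.10"  # assumption: iperf3 server on the wired LAN
GATEWAY = "192.168.1.1"        # assumption: the Wi-Fi router's address
DURATION = 20                  # seconds of sustained load
STREAMS = 4                    # parallel TCP streams to saturate the link

# Kick off the background load: several parallel TCP streams.
load = subprocess.Popen(
    ["iperf3", "-c", IPERF_SERVER, "-t", str(DURATION), "-P", str(STREAMS)],
    stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
)

# Ping the gateway once per second while the link is saturated.
ping = subprocess.run(
    ["ping", "-c", str(DURATION), GATEWAY],
    capture_output=True, text=True,
)
load.wait()

# Extract per-packet round-trip times from the ping output.
rtts = [float(t) for t in re.findall(r"time=([\d.]+)", ping.stdout)]
print(f"RTT under load: median {statistics.median(rtts):.1f} ms, "
      f"worst {max(rtts):.1f} ms across {len(rtts)} pings")
```

A real test bed would repeat this from several stations at different distances and compare idle against loaded latency, which is roughly the philosophy behind the testing discussed below.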

Inventor says Google is patenting work he put in the public domain

10 Jun 2018, 12:10 pm

Meet inventor Jarek Duda. (credit: Jarek Duda)

When Jarek Duda invented an important new compression technique called asymmetric numeral systems (ANS) a few years ago, he wanted to make sure it would be available for anyone to use. So instead of seeking patents on the technique, he dedicated it to the public domain. Since 2014, Facebook, Apple, and Google have all created software based on Duda's breakthrough.
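In vastly simplified form, ANS encodes an entire message into one big number, spending fractionally fewer bits on common symbols, which is how it pairs arithmetic-coding-class compression with much higher speeds. Here is a toy sketch of the range variant (rANS) to make the idea concrete; the two-symbol frequency table is our own illustration, nothing here is Duda's reference code or Google's implementation, and production codecs add renormalization so the state fits in a machine word rather than a Python big integer.

```python
# Toy rANS codec: the whole message lives in one integer x.
# Illustrative only; real implementations renormalize x into a machine word.

FREQS = {"a": 3, "b": 1}          # symbol frequencies; M is their sum
M = sum(FREQS.values())
CUM, _acc = {}, 0                 # cumulative frequency table
for sym, f in FREQS.items():
    CUM[sym] = _acc
    _acc += f

def encode(symbols, x=1):
    # Encode in reverse so decoding pops symbols in forward order.
    for s in reversed(symbols):
        x = (x // FREQS[s]) * M + CUM[s] + (x % FREQS[s])
    return x

def decode(x, count):
    out = []
    for _ in range(count):
        slot = x % M              # which frequency band x landed in
        s = next(t for t in FREQS if CUM[t] <= slot < CUM[t] + FREQS[t])
        out.append(s)
        x = FREQS[s] * (x // M) + slot - CUM[s]  # exact inverse of encode
    return out

message = list("abaab")
state = encode(message)
assert decode(state, len(message)) == message
```

The key property is that frequent symbols grow x more slowly, so they cost fewer bits per symbol—the behavior the software mentioned above relies on.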

But now Google is seeking a patent that would give it broad rights over the use of ANS for video compression. And Duda, a computer scientist at Jagiellonian University in Poland, isn't happy about it.

Google denies that it's trying to patent Duda's work. A Google spokesperson told Ars that Duda came up with a theoretical concept that isn't directly patentable, while Google's lawyers are seeking to patent a specific application of that theory that reflects additional work by Google's engineers.

Talkin’ Treble: How Android engineers are winning the war on fragmentation

8 Jun 2018, 11:15 am

Google's logo for the Android P Developer Preview. (credit: Google)

With the launch of Android 8.0 last year, Google released Project Treble into the world. Treble was one of Android's biggest engineering projects ever, modularizing the Android operating system away from the hardware and greatly reducing the amount of work needed to update a device. The goal here is nothing short of fixing Android's continual fragmentation problem, and now, six months later, it seems like the plan is actually working.

At Google I/O this year, you could see signs of the Treble revolution all over the show. The Android P beta launched, but it wasn't just on Google's own Pixel devices—for the first time ever, an Android Developer Preview launched simultaneously on devices from Google, Nokia, OnePlus, Xiaomi, Essential, Vivo, Sony, and Oppo, all thanks to Project Treble compatibility. Even car makers—some of the slowest adopters of technology on Earth—were on the Project Treble train. Dodge and Volvo both had prototype cars running Android as the infotainment system, and both were running Android P.

As is becoming custom for our annual trip to Google I/O, we were able to sit down with some core members of the Android team: Iliyan Malchev, the head of Project Treble, and Dave Burke, Android's VP of engineering. (We quoted Iliyan Malchev a million times during the Android 8.0 review, so it was nice to get information from him firsthand, and Dave Burke has been through the Ars interview gauntlet several times now.) And through this lengthy chat, we got a better understanding of what life is like now that Project Treble is seeing some uptake from OEMs.

We know you hate the Internet of Things, but it’s saving megafauna from poachers

6 Jun 2018, 4:00 pm

(credit: Foto24/Gallo Images/Getty Images)

For much of this decade, organizations seeking to protect wildlife have attempted to use emerging technology as a conservation tool, allowing small numbers of people to monitor and manage data from animals over a wide area. Nowhere is that effort more focused—and more desperate—than in the regions of Africa where illegal animal trade is threatening to wipe out endangered animals such as rhinos, elephants, pangolins, and lions. Here, several organizations are applying Internet of Things (IoT) technology to protect animals, providing rangers with data that helps them intercept poachers before they can get to their quarry.

Many conservation efforts elsewhere use IoT to try to track the location of animals, such as Vodafone's IoT tagging of Scottish harbor seals and tracking of endangered dugongs in the Philippines. But in Africa, the task of protecting rhinos is slightly different—it's about tracking people, specifically the poachers who hunt down the rhinos for their horns.

Rhinos, of course, aren't unique in needing such intervention. Based on data from the Great Elephant Census (GEC), a continent-wide survey conducted by Microsoft cofounder Paul Allen's Vulcan Inc., Africa's savanna elephant population declined by 30 percent between 2007 and 2014, for instance. That's a loss of 144,000 elephants. Current data shows the rate of decline of the elephant population is now eight percent per year, and ivory poachers are the main reason for that decline.
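Those figures are worth a quick sanity check. Here's a back-of-the-envelope calculation using the numbers quoted above; the steady exponential-decline model is our own illustrative assumption, not the GEC's methodology.

```python
import math

# Figures quoted above; the exponential model is our assumption, not the GEC's.
lost = 144_000                    # elephants lost between 2007 and 2014
fraction_lost = 0.30              # a 30 percent decline
baseline = lost / fraction_lost   # implied 2007 population: ~480,000
remaining = baseline - lost       # implied 2014 population: ~336,000

rate = 0.08                       # current decline of 8 percent per year
halving = math.log(0.5) / math.log(1 - rate)  # ~8.3 years to halve

print(f"2007 population: ~{baseline:,.0f}")
print(f"2014 population: ~{remaining:,.0f}")
print(f"Years to halve at 8%/yr: ~{halving:.1f}")
```

In other words, if the eight-percent rate holds, the remaining population would halve again in under a decade.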

With a simple and cheap rocket, Virgin Orbit aims for the extraordinary

4 Jun 2018, 11:30 am

A late May view of the Virgin Orbit factory floor showing an essentially complete LauncherOne in the middle ground, lacking only the payload fairing. This booster will be used for captive carry and drop testing. (credit: Virgin Orbit)

LONG BEACH, Calif.—The black outline of a rocket painted on a concrete factory floor measures a little more than 20 meters in length. As Will Pomerantz strides along it, he admits that the depicted LauncherOne vehicle won’t exactly amaze aerospace enthusiasts. In designing the rocket, Virgin Orbit opted for a workhorse rather than a show pony.

“As awesome a goal as it is to put humans on Mars—or SUVs on Mars, or send robots past Pluto—that’s not what we’re trying to do,” says Pomerantz, a vice president at Virgin Orbit and the company’s first employee. “We’re trying to do the simplest, cheapest vehicle that we think is commercially viable in the long run.”

LauncherOne’s most distinctive feature is that it does not blast off from the surface of the Earth. Rather, it detaches from a 747 aircraft at 11km (~6.8 miles) and ascends to space from there. The fuel it burns—a highly refined form of kerosene known as RP-1—has launched rockets for more than half a century. And the engines are as simple as they practically can be.

“Change like we’ve not seen in decades”—high-end auto designers go electric

2 Jun 2018, 1:30 pm

(credit: Jaguar)

Change comes hard. And sometimes, it's slow. Until recently, no industry hewed closer to that model than the automobile business. Ever since the first series-produced cars of the 1900s placed the big internal combustion lump of iron at the front, the drive wheels at the back, and the passengers in the middle, the form factor of the automobile has stayed largely the same for 100 years. Variations have cropped up here and there—like rear-engine cars, front-drive cars with engines placed transversely, and the odd mid-engine car—but the field for designers and engineers of the coming electric cars is more wide open now than at any point in those 100 years.

Of course, there have been electric cars before. By 1912, for instance, 20 companies were in the electric car business, and more than 30,000 electric cars were registered for street use in the US. So as we prepare ourselves for this latest incoming wave of viable, affordable, and practical modern electric cars, we wanted some big-picture perspective. What does the drive for electric mean for design, engineering, and consumer perceptions?

Wayne Burgess is a long-time designer who's currently Jaguar's number-two man in charge of design. Likewise, Andreas Preuninger, head of GT car development at Porsche, has been around four-wheeled vehicles for quite a while. If anyone may have a clue what a renewed and seemingly genuine push for EVs will do to the vehicles we love, it's this type of industry lifer. After touching base with the duo recently, it's clear the coming changes in the name of better electric vehicles will impact cars for both driver and designer in ways that are and aren't immediately obvious to even the most dedicated petrolhead (err, batteryhead?).

NASCAR’s high-tech world: Leave any preconceptions behind for this deep-dive

31 May 2018, 11:45 am

(credit: Aurich Lawson / Getty)

CHARLOTTE, North Carolina—For various reasons, this article is long overdue. We've looked at motorsport at Ars on many occasions, in many different forms. But a look through the archives finds barely a mention of NASCAR, admittedly an error on my part. Stock car racing is more popular in the US than any other motorsport, but it also has a reputation—or a stereotype—as a technology-free zone. But as anyone who follows the sport closely knows, there's little justification for that stereotype these days.

Although we had an invite to check out last year's season finale at Homestead in Miami, somehow that didn't feel like the right way to take a proper look at the sport today. I'm not usually one to turn down a day at the track, but it felt like the resulting article could have ended up as a piece of cultural tourism. It would be easy to trade in stereotypes about NASCAR fans—just like every other racing fan, but different and more numerous—and offhand remarks about the visceral impact of 40-odd stock cars blasting past in a pack at speeds often well north of 160mph (257km/h).

I'd rather leave that to the lifestyle publications; people come to Ars to read about technology, after all. So luckily, a better opportunity presented itself. Instead of a warm weekend away in late November, how about a trip to Charlotte in the off-season for a proper look behind the scenes? Calls were made, meetings were lined up, and so it was I found myself driving the 400 miles from Washington, DC, down to North Carolina, a surprisingly easy road trip thanks to a Cadillac CT6 equipped with Super Cruise. After a day spent talking to people throughout the sport—including NASCAR's technology development team, its R&D Center, and some chaps at Ford—I'm now reassessing my ideas about which motorsport series is the techiest of them all.

Street Fighter 30th Anniversary Collection is arcade nostalgia done right

29 May 2018, 11:05 pm

Akuma readies a super attack in Third Strike. The picture mode is set to original, with the TV scanline filter and bezel art turned on; both the filters and the bezel are optional.

This is how arcade-nostalgia compilations should be done. One year after Ultra Street Fighter II's pricey-and-thin cash-in on the Switch, the series' best arcade entries return in a giant, priced-right anthology for pretty much every major gaming platform—and so far, it's absolutely held up to my series-obsessed button mashing.

Street Fighter: 30th Anniversary Collection does right by one of the more enduring legacies of the arcade era, a series responsible in good part for the popularity of the fighting game genre. Although not nearly as popular as the sequel that would follow, the original Street Fighter came out just over 30 years ago, complete with pressure-sensitive pads (which were switched to the familiar six-button layout after people injured their hands from hitting the controls too hard).

This new collection features 12 games, from the original Street Fighter (1987) up to Street Fighter III: Third Strike (1999). That number is padded a little by the various releases of Street Fighter II and by early renditions that were surpassed by later versions, such as Street Fighter Alpha and Street Fighter III: New Generation, but completionists will appreciate their inclusion. The remaining titles are excellent games that still hold up very well from a gameplay perspective.

Self-driving technology is going to change a lot more than cars

29 May 2018, 11:35 am

Nuro is designing a small electric vehicle for hauling cargo. It is designed to be street-legal but has no room for passengers. (credit: Nuro)

When people think about self-driving cars, they naturally think about, well, cars. They imagine a future where they buy a new car that has a "self drive" button that takes them wherever they want to go.

That will happen eventually. But the impact of self-driving technology is likely to be much broader than that. Our roads are full of trucks, taxis, buses, shuttles, delivery vans, and more—all of these vehicles will have self-driving equivalents within a decade or two.

The advent of self-driving technology will transform the design possibilities for all sorts of vehicles, giving rise to new vehicle categories that don't exist now and others that straddle the line between existing categories. It will also change the economics of transportation and delivery services, making on-demand delivery a much faster, cheaper, and more convenient option.

From Win32 to Cocoa: A Windows user’s would-be conversion to Mac OS X, part III

28 May 2018, 3:09 pm

So... Peter Bright's new home for tech support, right? (credit: antxoa)

Ten years ago around this very time—April through June 2008—our intrepid Microsoft guru Peter Bright evidently had an identity crisis. Could this lifelong PC user really have been pushed to the brink? Was he considering a switch to... Mac OS?!? While our staff hopefully enjoys a less stressful Memorial Day this year, throughout the weekend we're resurfacing this three-part series that doubles as an existential operating system dilemma circa 2008. Part three ran on June 1, 2008, and it appears unedited below.

I've already described how misfortune and adversity left Apple with a new OS platform free of legacy constraints; and I've also discussed how Microsoft had failed to do the same, choosing instead to hobble its new OS with way too much legacy baggage.

Now, let's look at why I'm even considering the big switch: what has Apple done with its platform to make it so appealing? Of course, if you're already writing software for the Mac, then I'm not going to tell you anything you don't already know. But all of this was new to me, because it wasn't until I became so thoroughly disappointed with Windows that I really looked in earnest at what the Mac had to offer. My mistake.

From Win32 to Cocoa: A Windows user’s would-be conversion to Mac OS X, part II

27 May 2018, 2:48 pm

How could Peter Bright ditch all this for the minimalism of macOS? He loves the color purple far too much to do that, right? (credit: Ethan Miller / Getty Images)

Ten years ago around this very time—April through June 2008—our intrepid Microsoft guru Peter Bright evidently had an identity crisis. Could this lifelong PC user really have been pushed to the brink? Was he considering a switch to... Mac OS?!? While our staff hopefully enjoys a less stressful Memorial Day this year, throughout the weekend we're resurfacing this three-part series that doubles as an existential operating system dilemma circa 2008. Part two ran on May 4, 2008, and it appears unedited below.

Last time, I described how Apple turned its failure to develop a modern OS into a great success. The purchase of NeXT gave Apple a buzzword-compliant OS with a healthy ecosystem of high-quality third-party applications. Meanwhile, Microsoft was lumbering along with Windows XP. Although technically sound, it was shot through with the decisions made more than a decade earlier for 16-bit Windows.

In 2001, when XP was released, this was not such a big deal. The first two or three versions of Mac OS X were troublesome, to say the least. Performance was weak, there were stability issues, and version 10.0 arguably wasn't even feature complete. It wasn't until early 2002 that Apple even made Mac OS X the default OS on new Macs; for the first few months of its life, XP was up against "Classic" Mac OS 9.

HP’s ZBook x2: It’s powerful, it’s specialized, and it’s very expensive

27 May 2018, 2:00 pm

HP ZBook x2. (credit: Peter Bright)

Since Microsoft's Surface Pro 3 proved that there was a market for tablet-sized PCs sporting detachable keyboards, we've seen an abundance of minor variations of the concept from the major PC OEMs. For the most part, they've stuck pretty close to Microsoft's basic formula, with tweaks in screen size and resolution, connectivity options, and hinge design distinguishing one from another.

By comparison, the new HP ZBook x2 looks like it will be one of the more unusual riffs on the concept. The fundamentals remain the same, but ZBook is HP's mobile workstation branding, and, accordingly, the ZBook x2 is aimed specifically at artists, engineers, designers, and other professional users. In particular, it's aimed at those users who like the flexibility of the Surface Pro form factor—a machine for drawing and sketching, but also for sending emails, filing accounts, or whatever else a user needs to do. Yet, the ZBook x2 also offers more power than other systems of this type.

This extra power comes from three things in particular. The first is the processor; HP is offering the new 8th generation Intel Core chips with four cores and eight threads. Second is the GPU: there's a discrete Nvidia Quadro M620 GPU with 2GB of dedicated GDDR5. And finally there's RAM: up to 32GB.

From Win32 to Cocoa: A Windows user’s would-be conversion to Mac OS X

26 May 2018, 3:07 pm

OK, technically this wouldn't have even been possible at the time of this initial article, but here's Win10, Win8, Windows XP, and macOS High Sierra all together. Thanks, Parallels. (credit: Parallels)

Ten years ago around this very time—April through June 2008—our intrepid Microsoft guru Peter Bright evidently had an identity crisis. Could this lifelong PC user really have been pushed to the brink? Was he considering a switch to... Mac OS?!? While our staff hopefully enjoys a less stressful Memorial Day this year, throughout the weekend we're resurfacing this three-part series that doubles as an existential operating system dilemma circa 2008. Part one ran on April 21, 2008, and it appears unedited below.

A couple of Gartner analysts have recently claimed that Windows is "collapsing"—that it's too big, too sprawling, and too old to allow rapid development and significant new features. Although organizations like Gartner depend on trolling to drum up business, I think this time they could be onto something. "Collapsing" is over-dramatic—gradual decline is a more likely outcome—but the essence of what they're saying—and why they're saying it—rings true.

Windows is dying, Windows applications suck, and Microsoft is too blinkered to fix any of it—that's the argument. The truth is that Windows is hampered by 25-year-old design decisions. These decisions mean that it's clunky to use and absolutely horrible to write applications for. The applications that people do write are almost universally terrible. They're ugly, they're inconsistent, they're disorganized; there's no finesse, no care lavished on them. Microsoft—surely the company with the greatest interest in making Windows and Windows applications exude quality—is, in fact, one of the worst perpetrators.

Detroit: Become Human review: Robotic in all of the wrong ways

24 May 2018, 12:00 pm

You can practically hear the dramatic music swelling in this screenshot, can't you?

For a game so focused on presenting a seamless interactive cinematic story, the most striking thing about Detroit: Become Human is its exposed seams.

Like the world’s most slickly produced choose-your-own-adventure book, the latest David Cage game lets you play with narrative conventions and mess with the inherent connective tissue of the story in some intriguing ways. But that underlying story ends up so fragmented, so poorly executed, and so clunkily written that it’s very difficult to appreciate the narrative playspace.

An unbelievable future

The year is 2038, and the city of Detroit is the center of a new manufacturing renaissance thanks to the creation of believable, intelligent, human-shaped androids. The world has been transformed by the existence of subservient machines that can do anything a human can do and more.
