Google’s PlaNet AI Can Tell Where a Picture Was Taken Just by Looking at It

Google’s latest product can tell where in the world a picture was taken just by looking at it.

The program, named PlaNet, uses neural network technology to analyse images and make educated estimates about where in the world they’re from.

Google’s researchers ‘trained’ the program by feeding it over 91 million Street View pictures from around the world, along with their associated location data.

By recognising patterns from this huge database of images, it managed to get pretty good at identifying locations on its own.
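According to the research paper behind PlaNet, the system treats geolocation as a classification problem: the globe is carved into a grid of cells, and the network predicts which cell a photo belongs to rather than exact coordinates. Below is a toy sketch of that framing, assuming a crude fixed 10-degree latitude/longitude grid (PlaNet’s real partitioning is adaptive and far finer, and the cell size here is purely illustrative):

```python
# Toy sketch: geolocation as classification over a grid of cells.
# CELL_DEG is a made-up cell size; PlaNet uses an adaptive partition
# with tens of thousands of cells of varying size.

CELL_DEG = 10  # hypothetical cell size in degrees

def cell_id(lat, lon, cell_deg=CELL_DEG):
    """Map a (lat, lon) pair to an integer cell label (the class)."""
    row = int((lat + 90) // cell_deg)    # 0..17 for 10-degree cells
    col = int((lon + 180) // cell_deg)   # 0..35
    return row * (360 // cell_deg) + col

def cell_center(cid, cell_deg=CELL_DEG):
    """Return the (lat, lon) centre of a cell, i.e. the model's 'guess'."""
    cols = 360 // cell_deg
    row, col = divmod(cid, cols)
    return (row * cell_deg - 90 + cell_deg / 2,
            col * cell_deg - 180 + cell_deg / 2)

# Example: a photo taken in Paris (48.86 N, 2.35 E) becomes a class label,
# and predicting that label pins the photo down to a region of the map.
paris_cell = cell_id(48.86, 2.35)
print(paris_cell, cell_center(paris_cell))
```

Framing the task this way lets a standard image classifier be trained on the 91 million geotagged Street View pictures: each photo's location is converted into a cell label, and the network learns to map pixels to cells.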

As reported by MIT Technology Review, PlaNet is capable of correctly determining the locations of images to street-level accuracy 3.6 per cent of the time.

It managed to guess the right city 10.1 per cent of the time, the right country 28.4 per cent of the time, and the right continent 48 per cent of the time.

That doesn’t sound too impressive, but guessing even the correct country from a single Google Street View image is actually pretty difficult – play a few rounds of GeoGuessr and see how you manage at the same task.

The team behind the software, led by engineer Tobias Weyand, tested PlaNet against a group of 10 “well-travelled humans.”

Surprisingly, the computer won 28 of the 50 games. Overall, the computer’s guesses were out by a median distance of 1,132km; for the humans, that figure was 2,321km.
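The error figures above are great-circle distances between the guessed point and the photo’s true location. As a quick sketch of how such a median error could be tallied, here is the standard haversine formula applied to a few invented guess/answer pairs (the city coordinates are real, the game data is made up for illustration):

```python
import math
from statistics import median

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical games: (guessed point, true point) pairs
games = [
    ((48.86, 2.35), (51.5, -0.1)),    # guessed Paris, was London
    ((40.7, -74.0), (40.7, -74.0)),   # spot on: New York
    ((35.7, 139.7), (37.57, 126.98)), # guessed Tokyo, was Seoul
]
errors = [haversine_km(*guess, *truth) for guess, truth in games]
print(sorted(errors), median(errors))
```

Taking the median rather than the mean keeps one wildly wrong guess (say, the wrong continent) from dominating the score.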