Google Flips the Switch on Its Pixel Visual Core
When Google launched its Pixel 2 flagship smartphone last year, it included something of a surprise: a co-processor called Pixel Visual Core, the company's first homegrown, consumer-facing piece of silicon. And while that inclusion felt momentous, the co-processor has lain dormant for months. Monday, Pixel Visual Core goes to work.
As it turns out, and as Google had hinted previously, the hidden chip inside every Pixel 2 serves a narrow but critical purpose. It will put its eight custom cores, and their ability to crunch 3 trillion operations per second, to work making your photos look better. Specifically, the photos you take through third-party apps like Instagram, WhatsApp, and Snapchat.
Those are the three partners at the Pixel Visual Core switch-flipping; since the chip is open to all developers, more will presumably follow. They'll all gain the ability to produce Google's HDR+ images, photos that rely on a series of post-processing tricks to make shots from the Pixel 2 appear more balanced and lifelike. Photos taken with the Pixel Camera app have benefited from HDR+ since launch; that's one reason the Pixel 2 earned the highest marks yet given to a smartphone by industry-standard photo-rater DxOMark. But Pixel Visual Core will extend the feature to the streams, feeds, and snaps of Pixel 2 owners as well, via an update rolling out early this week.