- In a statement to Ars Technica, Google said its camera app does not use the Pixel Visual Core.
- The company confirmed the statement to Android Authority.
- The Pixel Visual Core is Google’s first in-house chipset and was enabled for third-party applications in the February security patch.
I got a fun correction from Google today: The Google Camera app does not use the Pixel Visual Core. Google’s camera app doesn’t use Google’s camera chip. Facebook and Snapchat are the first ever uses of it.
— Ron Amadeo (@RonAmadeo) February 7, 2018
According to the team behind Android fork CopperheadOS, the Google Camera app instead uses the Snapdragon 835’s digital signal processor:
They still use Hexagon (QDSP) from the Google Camera app on the Pixel 2 (XL) just like the Pixel (XL) and they still have a special google_camera_app SELinux policy domain. Other apps can’t do what Google Camera does right now.
— CopperheadOS (@CopperheadOS) February 7, 2018
Found inside the Pixel 2 and Pixel 2 XL, the Pixel Visual Core is Google’s first in-house chipset. Its job is to handle complex imaging and machine learning tasks related to the camera. The Pixel Visual Core was enabled with the arrival of the Android 8.1 developer preview, but was not enabled for third-party apps until the February security patch.
It was assumed that the Pixel Visual Core was also enabled for the Pixel 2’s and Pixel 2 XL’s stock camera app. After all, this is Google’s chipset working with Google’s software on Google’s latest smartphones.
However, Google confirmed to Android Authority that its camera app does not use the Pixel Visual Core. Since the Google Camera app already has HDR+ processing built in, enabling the chipset for it would be redundant.
It would have been nice if Google had clarified this from the beginning, but better late than never.