What’s new and cool in 645 PRO Mk II & PureShot!

Today we’re releasing updates to 645 PRO Mk II, our flagship iOS app inspired by classic Medium Format photography, and to PureShot, the top-quality #nofilter app. Both apps get a couple of cool new features.

First, however, a quick mention of an elegant enhancement to 645 PRO Mk II’s Film Modes. These stand out from other film emulations thanks to their remarkable fidelity—we do what we can to replicate the actual appearance of classic film stocks as much as possible, rather than exaggerating for effect. And a “secret weapon” is the way that the Film Modes behave differently depending on the exposure of an individual image. So an image taken in bright light will have different characteristics to one taken in lower light—just as with film.

With the new release the Film Modes are even more lifelike, thanks to dynamic film grain: the amount of film grain an image receives depends on the ISO of the shot, so a higher ISO means more grain. This reflects the way film actually behaves! And there’s the added benefit that shots most likely to suffer from sensor noise (due to high ISO) get this disguised by more (visually pleasing) film grain.
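
For the technically curious, here’s the general idea in code form. This is purely illustrative: the function name, the square-root curve and the numbers are our own assumptions for the sketch, not the apps’ actual grain algorithm.

    // Purely illustrative: one simple way to make grain intensity track ISO.
    // The base value and square-root curve are arbitrary choices for this sketch.
    func grainStrength(iso: Double, baseStrength: Double = 0.02) -> Double {
        // ISO 100 gives the base amount of grain; ISO 400 roughly doubles it.
        return baseStrength * (iso / 100.0).squareRoot()
    }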

Now, the updates that apply to both 645 PRO Mk II and PureShot.

We’ve added the ability to set the minimum shutter speed to something faster than the iOS default of 1/15 sec. Go to MENU->Focus & exposure in 645 PRO Mk II or PureShot to get access to the new setting.

You have four choices, starting with the default of 1/15 sec and going up to 1/30 sec. So if you’re troubled by camera shake, try setting a faster speed as your minimum and see if that helps!

Note that the range of speeds actually available to you depends on your specific hardware and software combination. For example, with an iPhone 5s running iOS 7.x the full range is available, but with other devices and other versions of iOS you may be restricted to, say, 1/24 sec or even 1/20 sec. However, if you set the speed to something faster than your device can manage, 645 PRO Mk II and PureShot will automatically ramp this down to the fastest speed your device supports.
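
For developers wondering how a setting like this can work: on iOS, a frame’s exposure time can’t exceed its frame duration, so capping the capture device’s maximum frame duration effectively enforces a minimum shutter speed. Here’s a minimal AVFoundation sketch (the helper name is ours, and this shows one plausible approach rather than our apps’ actual code):

    import AVFoundation

    // Cap the slowest shutter speed by capping the frame duration.
    // Passing 30 enforces a minimum shutter speed of 1/30 sec.
    func setMinimumShutterSpeed(on device: AVCaptureDevice, denominator: Int32) throws {
        try device.lockForConfiguration()
        defer { device.unlockForConfiguration() }
        device.activeVideoMaxFrameDuration = CMTimeMake(value: 1, timescale: denominator)
    }

Whether a given frame duration is actually supported depends on the device’s active capture format, which is exactly why the available range varies between devices and iOS versions.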

Next, we’ve got something for those who care more than most about what’s happening “under the hood”.

We’ve provided a really specialised setting that lets you select the type of input buffer that 645 PRO Mk II and PureShot will use. Look under your Advanced Settings for this. But why might you want to select something as arcane as this?

When developers get data from an iOS device’s camera, they can get it in one of three formats:

  • A compressed JPEG image (much the easiest to manage, and the method used by most apps)
  • An uncompressed RGB buffer, with each pixel represented by 24 bits of red, green and blue data (8 bits per channel)
  • A “planar” Y’CbCr 4:2:0 buffer, where the luminance of the image (Y) is represented by an uncompressed 8-bit channel, with the blue difference and red difference chroma elements (Cb and Cr) compressed by subsampling both the vertical and horizontal resolutions by two, so a single chroma data point is used for four pixels worth of the image
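
In AVFoundation terms, the choice between the last two buffer types above comes down to the pixel format requested from the capture output. A minimal sketch (the helper function is ours; the pixel-format constants are Apple’s):

    import AVFoundation

    // Request either an uncompressed BGRA (RGB) buffer or a bi-planar
    // Y'CbCr 4:2:0 buffer from a video data output.
    func configure(output: AVCaptureVideoDataOutput, useYCbCr: Bool) {
        let pixelFormat: OSType = useYCbCr
            ? kCVPixelFormatType_420YpCbCr8BiPlanarFullRange // Y' plane + CbCr plane
            : kCVPixelFormatType_32BGRA                      // 8 bits per channel
        output.videoSettings = [
            kCVPixelBufferPixelFormatTypeKey as String: pixelFormat
        ]
    }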

Both 645 PRO Mk II and PureShot have, since their launch, used the uncompressed RGB buffer, as we believe this to be the closest* to the “raw” data captured by the actual sensor.

However, the recent re-launch of Ben Syverson’s excellent camera app Mattebox 2 has opened some debate about this. He believes that better image results can be achieved by using the Y’CbCr buffer, and interpolating the chroma elements. And, for certain tasks—especially under extreme magnification, with extreme color processing—he’s quite right.

At first glance, this can seem to make no sense—after all, the RGB buffer is uncompressed, while the Y’CbCr has its chroma (color) information compressed fourfold, so how can the compressed starting-point be in any way “better”?

It comes down to the fact that both are interpretations of the actual data captured by the camera sensor. The digital camera sensor in a device such as an iPhone—or a high-end DSLR, come to that—doesn’t actually capture red, green and blue information for each pixel. In fact, each pixel of resolution has a single photosensor that only captures red, green or blue light. In the most common arrangement, the Bayer matrix, 50% of the photosensors capture green light, 25% capture red and 25% blue.

To turn this into an image, each RGB pixel has to be “assembled” by interpreting data from adjacent photosensors. And there are many ways to do this. Applications such as Adobe Lightroom and Apple’s Aperture convert RAW data into RGB images with great elegance; the processing inside an iOS device that delivers an RGB buffer appears to be far less sophisticated. One side-effect of this is that there can be some chromatic “fringing” around the edges of objects in an image, thanks to the red, green and blue data not being merged as elegantly as it might be. This is so minor that it’s utterly irrelevant to the vast majority of users the vast majority of the time, but it is there, and for some really extreme post-processing it can be an issue.
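
As a concrete illustration of that “assembly” step, here’s the crudest approach of all, bilinear demosaicing (a toy sketch for explanation only; real pipelines use far more sophisticated, edge-aware methods):

    // Toy bilinear demosaic step: estimate the missing green value at a
    // red-filtered photosite by averaging its four green neighbours.
    // `mosaic` holds one raw sample per photosite; (x, y) must not be on the border.
    func estimateGreenAtRedSite(mosaic: [[Double]], x: Int, y: Int) -> Double {
        return (mosaic[y - 1][x] + mosaic[y + 1][x]
              + mosaic[y][x - 1] + mosaic[y][x + 1]) / 4.0
    }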

Using a Y’CbCr buffer—even one with compressed chroma data—gives the developer the option to build red, green and blue together in a pixel in a more effective manner than the built-in RGB processing, essentially “smoothing out” colors. This can get rid of fringing, thus providing a marginally better starting-point for certain forms of post-processing.
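
To make that concrete: in a 4:2:0 buffer each Cb/Cr sample covers a 2×2 block of pixels, and the “smoothing” comes from interpolating between neighbouring chroma samples rather than simply repeating each one four times. Here’s an illustrative bilinear upsampling sketch (our own example code, not either app’s implementation):

    // Bilinearly interpolate a half-resolution chroma plane at full-resolution
    // pixel coordinates (px, py), replacing the blocky one-sample-per-2x2 default.
    func interpolatedChroma(plane: [[Double]], px: Int, py: Int) -> Double {
        let h = plane.count, w = plane[0].count
        let fx = Double(px) / 2.0, fy = Double(py) / 2.0
        let x0 = min(Int(fx), w - 1), y0 = min(Int(fy), h - 1)
        let x1 = min(x0 + 1, w - 1), y1 = min(y0 + 1, h - 1)
        let tx = fx - Double(x0), ty = fy - Double(y0)
        let top    = plane[y0][x0] * (1 - tx) + plane[y0][x1] * tx
        let bottom = plane[y1][x0] * (1 - tx) + plane[y1][x1] * tx
        return top * (1 - ty) + bottom * ty
    }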

Just to add to the fun, there are quite a lot of different ways to turn Y’CbCr-encoded data into RGB, and we’ve chosen to implement a very different technique in 645 PRO Mk II and PureShot to that used by Mattebox 2.
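
By way of example, here’s the textbook full-range BT.601 conversion, just one of those many variants (shown purely as a reference point; it isn’t necessarily what either app uses):

    // Standard full-range BT.601 Y'CbCr -> RGB conversion, with 8-bit values
    // and chroma centred on 128. Other matrices (e.g. BT.709) use different
    // coefficients, which is one source of the variation mentioned above.
    func ycbcrToRGB(y: Double, cb: Double, cr: Double) -> (r: Double, g: Double, b: Double) {
        func clamp(_ v: Double) -> Double { min(max(v, 0.0), 255.0) }
        let r = y + 1.402 * (cr - 128.0)
        let g = y - 0.344136 * (cb - 128.0) - 0.714136 * (cr - 128.0)
        let b = y + 1.772 * (cb - 128.0)
        return (clamp(r), clamp(g), clamp(b))
    }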

To repeat: most people won’t see any difference between the two input buffers, but if you’re an obsessive pixel-peeper you may enjoy this option!

Want to know more about the thinking involved? Head on over to the MobiTog website to see a discussion involving Mattebox’s Ben Syverson and Jag.gr founder Mike Hardaker, with some interesting philosophical differences on show. The discussion begins at the bottom of page 2.

For background reading on the technologies being discussed, start with the following Wikipedia articles:

  • Bayer filter
  • Demosaicing
  • YCbCr
  • Chroma subsampling

* It’s impossible to say with total certainty which is the more “raw” of the two input buffers—all we can do is “reverse engineer”, working back from the public data provided by Apple (and Apple isn’t talking about what happens deep in the hardware and firmware). Our opinion—and, while it’s an informed opinion, it’s still just an opinion—is that the RGB buffer is a quite unsophisticated “development” of the original RAW Bayer mosaic, with the Y’CbCr 4:2:0 data probably derived from that. However, not everyone agrees with us—it could be that both share a “common ancestor”, or that the RGB buffer is derived from the Y’CbCr 4:2:0 buffer! In the end, of course, it doesn’t really matter…