
One of the features in Apple’s upcoming iOS 4.1 is the ability to capture high dynamic range (HDR) photos. On paper, such a feature should improve picture quality, but in the 5 minute 34 second demo by TechThrow below, the implementation looks rushed and under-tested, and in most cases it actually makes your photos look worse.
HDR photos, for those unfamiliar with the term, are created by capturing an image at multiple exposures and then combining them into a final product, which ideally makes your subject matter look sharper and more natural. What used to require Adobe Photoshop and a lot of time is now a feature built into your mobile phone. Nokia is also working on adding HDR to its smartphones by contributing to the open source project FCam. It has already been demoed on the Nokia N900, and while the image samples provided on the Nokia Conversations blog look good, so did the samples shown at Apple’s event last week. Real world testing is going to settle the HDR vs. non-HDR debate once and for all.
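The multi-exposure merge described above can be sketched as a toy exposure-fusion pass. This is purely illustrative (a simple per-pixel weighted average on grayscale values, with weights favoring mid-tones), not Apple’s or Nokia’s actual implementation, which would also need frame alignment and tone mapping:

```python
import math

def well_exposedness(v, sigma=0.2):
    # Weight pixels near mid-gray (0.5) highest, using a Gaussian falloff;
    # blown-out or crushed pixels contribute little to the merge.
    return math.exp(-((v - 0.5) ** 2) / (2 * sigma ** 2))

def fuse_exposures(exposures):
    """Merge a bracket of grayscale images (nested lists, values in 0..1)
    by averaging each pixel across frames, weighted by well-exposedness."""
    height, width = len(exposures[0]), len(exposures[0][0])
    fused = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            weights = [well_exposedness(img[y][x]) for img in exposures]
            total = sum(weights)
            fused[y][x] = sum(
                w * img[y][x] for w, img in zip(weights, exposures)
            ) / total
    return fused

# Toy bracket: underexposed, normal, and overexposed frames of one pixel.
under, normal, over = [[0.1]], [[0.5]], [[0.9]]
result = fuse_exposures([under, normal, over])
```

With this symmetric bracket the fused pixel lands on the well-exposed middle value, since the extreme frames get heavily down-weighted.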
Personally, I’m one of those people who oppose HDR photos. I like my darks to be dark, my colors to pop, and for there to be gaps in the image that I have to fill with my imagination. HDR photos look like artsy fartsy attempts at being creative. Software algorithms can do a lot to clean up a photo, which is why many pictures from Apple’s previous iPhones looked fantastic despite their inferior camera sensors, but I’m a traditionalist who believes better optics, better sensors, and a team of engineers dedicated to photography are what produce great cameras. Not some code that a programmer writes while on the company clock.