"Touch screens are for smartphones" and some more thoughts on future Photographic interfaces
There is a recently surfaced, and most probably true, rumor that Nikon is introducing touch screens to its DSLR line with the new D5500 camera. The news was received with some well-deserved amusement in certain circles (most notably among mirrorless camera users, for whom touch screen interfaces have been common for some years now) and with snobbish rejection in others; after all, touch screens are for amateur cameras and newbie users, right? Pros don't have a use for such gimmicks, correct?
Hmmmm... sorry, wrong.
Frequent readers may already know that I have very little patience with Photography Luddites. Had we historically taken their opinions seriously, we would all still be rocking 8x10 cameras. And I have no beef with antique camera users; far from it. I simply have a dangerously hostile disposition toward people who reject technological progress without first sincerely and critically examining its potential.
As an example: "traditional" photographers have a long-standing addiction to eye-level viewfinders. I have even read statements that slamming a camera against your face is the "most stable" handheld position for taking pictures. Sorry to disappoint you but, biomechanically speaking, the most stable position is with your arms firmly pressed against your sides and the camera held against your chest. Let's see now... wasn't this the way generations of photographers used to handle TLR cameras? Not to mention hosts of MF and 35mm cameras with waist-level viewfinders?
Don't get me wrong: I'm myself highly dependent on eye-level viewfinders for composition; I just don't try to pass my preference (or needs) off as dogma. And there are plenty of cases where, for example, a tilt screen (viewing up or down) is not only preferable, but the only way to get a well-composed shot. Needless to say, our discussion on viewfinders concerns really useful and bright ones. Very few DSLRs (usually high-end ones) have truly large and clear optical viewfinders; entry and mid-level DSLR viewfinders are in fact miserably dark, narrow tunnels. Most EVFs, even on prosumer mirrorless cameras, are clear and bright these days (those of the E-M1 and X-T1 are kings), and they have the added advantage of providing a WYSIWYG interface and extra information display.
But let's get back to LCD displays.
Until a few years ago, LCD displays on DSLRs were archaic devices aimed at camera setup and reviewing pictures taken, in a fashion resembling our lovable primate relatives ("chimping"). At some point, particularly with the advent of DSLR video, the rear screen gained a "live view" capability, meaning the DSLR was transformed into a glorified mirrorless camera. At the same time, screens became larger and clearer, with better color and contrast, becoming truly useful in field conditions.
Of course, mirrorless cameras always had the "live view" rear screen, and it wasn't until recently that they also gained EVFs to rival the best optical finders. Furthermore, mirrorless companies started early on to introduce further handling options through the rear screen. One of them was the touch screen interface, carried over from smartphones and tablets.
Let's pause for a while and think about how modern cameras have been transformed from simple image-capturing devices into tools that also provide serious processing and sharing capabilities. It's OK if somebody is indifferent to such developments, but it is also a shame to discard them altogether because we happen not to have the need (or possess the imagination) to use them.
I personally happen to very much like "traditional" tactile interfaces (complete with aperture rings and shutter speed dials, thank you). On the other hand, having a full touch-type interface at your disposal is certainly handier for a number of tasks and, in some cases, more "natural" feeling. For example, what is more intuitive: browsing through stored photos by flicking a finger across a screen, or pushing "forward/back" buttons? What is more intuitive: trying to move the focus point by hysterically pushing buttons and turning wheels, or just pressing a point on a live view screen?
A recent "hot" feature on almost any new camera is WiFi connectivity. Much more than letting you send your latest masterpiece to your friends on Facebook, WiFi can transfer camera control to a much larger and more capable surface, not to mention the freedom of uncoupling the interface from the camera itself.
Speaking of camera interfaces, let's be honest about something: most young people getting into photography these days are so accustomed to smartphone-style interfaces that forcing them to use an antiquated, single-purpose "mechanical" interface would not make for the most pleasant experience. I predict that, as time passes, we shall see more and more graphical, app-like interfaces on modern cameras; and that's a great thing if you ask me. Already, Leica has dared to make a camera (the Leica T) where virtually the whole operation is handled through a touch screen. Samsung is also heavily pushing similar concepts; although, as seen in their pro-level offerings such as the NX1, this doesn't exclude the concurrent existence of traditional controls.
We can safely expect touch interfaces to become the norm in future cameras. This is the point where I'd like to propose a few possible features that could be implemented in such an interface:
- Touch focus, and even touch shutter actuation, are getting quite common. But with cameras capable of high burst rates, it might be possible to expand on this by providing a "trace focus path" mode. This could be helpful for, say, action photographers, or when one needs to take several (static) photos while varying the focus point.
- Expanding on the above: how about "preprogrammed" focus stacking, set up by touching a number of focus points on screen? This would certainly be faster, and more intuitive, than focusing manually for each frame.
- One unique feature I always loved on my old Olympus OM-4 was the ability to average several spot metering readings. Granted, today you can manage exposure using a live shadow/highlight histogram, but I'd like to be able to spot-meter various points in the frame and have the camera calculate the exposure from them.
- In architectural and similar types of photography, touch control could be used to correct distortions in real time. And we could expand on that: the ability to control the amount of (in-camera, software-generated) tilt-shift or other "creative manipulation" of the image, in real time.
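To make the multi-spot averaging idea above concrete, here is a minimal sketch of the arithmetic such a feature would need. It assumes the standard reflected-light metering relation EV = log2(L x S / K), with the common calibration constant K = 12.5; the function names and the example luminance values are hypothetical, purely for illustration.

```python
import math

K = 12.5  # typical reflected-light meter calibration constant (assumption)

def spot_ev(luminance_cd_m2, iso=100):
    """Convert one spot luminance reading (cd/m^2) to an exposure value."""
    return math.log2(luminance_cd_m2 * iso / K)

def averaged_ev(readings, iso=100):
    """Average several spot readings the way the OM-4's multi-spot mode
    worked: convert each reading to EV, then average the EVs (i.e. a
    geometric mean of the luminances, not an arithmetic one)."""
    evs = [spot_ev(l, iso) for l in readings]
    return sum(evs) / len(evs)

# Example: three spots -- a deep shadow, a midtone, and a highlight.
print(round(averaged_ev([12.5, 100, 800]), 2))  # → 9.64
```

Averaging in EV (logarithmic) space rather than averaging raw luminances is what makes the result behave like the OM-4's: each spot pulls the exposure by the same number of stops, regardless of how bright it is in absolute terms.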
These are just some suggestions, but I'm sure R&D departments are already working on features we will find even more exciting and useful. One thing is certain: photographic technology will not stop providing tools for those willing (or daring) enough to try to express their art through them.