A Nameless EVIL

May 13, 2010

The introduction of Sony’s NEX-3 and NEX-5 has once again thrown a weird anomaly into sharp relief.

Even 20 months after the introduction of the Panasonic G1, there is still no universally agreed-upon term for this new class of cameras.

The defining aspects of the genre are a largish sensor size (as compared to typical compacts) plus interchangeable lenses. Yet by omitting any reflex viewfinder, and instead streaming a live digital image from the sensor, the body size can be reduced from the bulk of conventional DSLRs.


Name that Evil

Now, the most widely known term (and the one I use) is “EVIL,” meaning “electronic viewfinder, interchangeable lens.”

A few pedants object that the Olympus E-P1 does not have a “viewfinder” in the sense of something you hold up to your eyeball (nor do the Sony NEX models, so far). But you can slightly revise the phrase to be “electronic viewing” instead, if this bothers you.



Throw Bits At It? Not Always.

February 16, 2010

Today, some musings that are a bit more (har har) abstract.

Sometimes we become numbed by the perpetual escalation of tech specs. The mindset of the computer industry, where each new generation promises more, faster & bigger, seems to be the new normal.

And it can become a self-fulfilling prophecy. If it is technically possible to ratchet up some spec number, inevitably we’ll choose to. That’s what keeps people buying!

But there’s one nice example of an industry that dangled ever-increasing numbers in front of consumers—who then yawned and said “no thanks.”

How many DVD-Audio or “Super Audio CD” titles do you own? (Okay, I’m sure someone out there is an enthusiastic adopter. But I mean, the average person.)


Did You Buy SACD? Me Either.

The original standard for music CDs uses 44,100 samples per second. Each sound sample has 16 bits, meaning it encodes the full range from soft to loud with 65,536 discrete levels.

The CD standard was adopted around 1980, so at that time there was pressure to keep the bit rate low enough that the player electronics would not be prohibitively expensive.

No one knew how data-handling ability would explode over the following decades. When you see a speed “60x” or “133x” on a camera memory card, those are referenced against the original CD-player (or, 1x CD-ROM) bit rate.
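Those “x” ratings can be sanity-checked with quick arithmetic. Card makers reference the 1x CD-ROM data rate of 150 KiB/sec; a short Python sketch:

```python
# "x" speed ratings are multiples of the original 1x CD-ROM transfer
# rate: 75 sectors/sec * 2048 bytes = 150 KiB/sec.
CD_1X_BYTES_PER_SEC = 150 * 1024  # 153,600 bytes/sec

def card_speed_mb_per_sec(x_rating):
    """Sustained transfer rate implied by an 'Nx' card rating, in MB/sec."""
    return x_rating * CD_1X_BYTES_PER_SEC / 1_000_000

print(card_speed_mb_per_sec(60))   # 60x  -> ~9.2 MB/sec
print(card_speed_mb_per_sec(133))  # 133x -> ~20.4 MB/sec
```

So a “133x” card moves data roughly 130 times faster than the player hardware CDs were designed around.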

So in the audiophile world there were grumbles from the start that the CD bit rate was insufficient.

With only 16 bits, the quietest elements of music (like note decays and room ambience) must be recorded with somewhat coarse resolution. It’s similar to how shadow areas in digital photos can look noisier than highlights. A standard that used 20 bits or so would have preserved more fine “texture” in low-amplitude sounds.
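The “coarseness” argument can be put in numbers: each bit of linear PCM buys roughly 6 dB of dynamic range. A minimal Python check:

```python
import math

def dynamic_range_db(bits):
    """Theoretical dynamic range of linear PCM: 20*log10(2^bits),
    i.e. about 6.02 dB per bit."""
    return 20 * math.log10(2 ** bits)

for bits in (16, 20, 24):
    print(f"{bits}-bit PCM: {dynamic_range_db(bits):.1f} dB")
# 16-bit: ~96 dB; 20-bit: ~120 dB; 24-bit: ~144 dB
```

That extra ~24 dB from a 20-bit standard is exactly where quiet decays and room ambience live.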

And before digitizing sound for CD, any frequency of 22,050 cycles/second or higher must be chopped off. A high-pitched rising chirp that climbed past that frequency would make the A/D converter miss the true peaks and troughs of the wave; the converter would misrepresent it as a falling note, folded back down into the audible spectrum.

For CD audio, 22 kHz is the “Nyquist Limit,” and you need an “antialias filter” to block higher frequencies (yes, these exact same principles apply to digital-camera sensors). But there were complaints that the antialias filters degraded sound quality.
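Aliasing is easy to demonstrate. A stdlib-only Python sketch: a 25 kHz tone sampled at 44.1 kHz produces exactly the same sample values as a 19.1 kHz tone (44,100 minus 25,000), which is why the filter is mandatory:

```python
import math

FS = 44_100  # the CD sampling rate

def sample_cosine(freq_hz, n_samples, fs=FS):
    """Sample a cosine tone of the given frequency at rate fs."""
    return [math.cos(2 * math.pi * freq_hz * n / fs) for n in range(n_samples)]

above_nyquist = sample_cosine(25_000, 10)          # above the 22.05 kHz limit
alias = sample_cosine(FS - 25_000, 10)             # 19.1 kHz, in the audible band

# The two sample streams are numerically identical:
print(all(math.isclose(a, b, abs_tol=1e-9)
          for a, b in zip(above_nyquist, alias)))  # True
```

Once the samples are recorded, no amount of cleverness downstream can tell the two tones apart; the information is simply gone.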

And while most adults can’t hear steady pitches much above 16 kHz, very brief clicks at a higher theoretical frequency might contribute some “edge” to percussion and note attacks. A lot of professionals preferred to record at 48 kHz instead. (You can go higher today, though bandwidth limitations in the analog realm become significant.)

Folks who produce music may have reasons for 24-bit sampling (tracks are often put through computation-intensive effects; you don’t want rounding errors), but 20-bit delivery covers an excellent dynamic range. Even taking a 20-bit master and dithering it down to a 16-bit release version can work well.

I apologize to all of you whose eyes glazed over during those past few paragraphs. The truth is, most of us found CD sound quality perfectly adequate.

If you do the math, the bit rate used for CD sound is about 1.4 Mbit/sec (in stereo). A reasonable standard that would have handled any outstanding quality issues might have been 20 bits at 48 kHz. That works out to 1.9 Mbit/sec.

But the arrival of DVD technology offered a huge increase in disc capacity. It was a great opportunity to sell a newer, zingier, whizzier-spec music format too.

So a “format war” broke out, but based on bit rates that were sort of crazy. Stereo in Sony’s SACD standard burns up 5.6 Mbit/sec—quadrupling the CD rate. DVD-A is a “family” of standards (a standard that doesn’t standardize); but its highest supported stereo rate is 9.2 Mbit/sec!
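All the rates quoted above fall out of the published specs: SACD’s DSD stream is 1 bit at 2.8224 MHz (64 times the CD sampling rate), and DVD-A tops out at 24-bit/192 kHz for stereo. A quick Python check:

```python
def stereo_mbit_per_sec(bits_per_sample, samples_per_sec):
    """Raw stereo bit rate: two channels of linear samples."""
    return 2 * bits_per_sample * samples_per_sec / 1e6

print(stereo_mbit_per_sec(16, 44_100))    # CD:               ~1.4 Mbit/sec
print(stereo_mbit_per_sec(20, 48_000))    # hypothetical 20/48: 1.92
print(stereo_mbit_per_sec(1, 2_822_400))  # SACD DSD:         ~5.6
print(stereo_mbit_per_sec(24, 192_000))   # DVD-A max stereo: ~9.2
```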

A leading writer in the digital-audio field once told me in an email that the reasons for these bit rates had more to do with “quieting the lunatic fringe” than with any technical justification.

But the public treated these new audio formats with indifference. SACD has found a niche in classical music, but most folks are completely unaware of it. (Even though, soon enough, millions did go out to buy a new disc format: Blu-ray.)

You all know what actually happened with music: Buying it online, and being able to take it everywhere in your pocket, totally changed the game.


What I, And Probably You, Really Bought

So with downloadable music, the bit rate actually plunged instead. Today we’re buying music that uses 1/5th or even 1/8th the bit rate of the old CD standard! (Psychoacoustically intelligent data compression makes it possible.)

So… does this have anything to do with photography?

Camera manufacturers today (even those making enthusiast models) continue to use megapixels as the spec that defines “improvement.” Every year, more bits!

This shows a depressing lack of creativity. Past the point where this offers any real value, it’s just mindlessly chasing a number.

What we need is a serious rethink—to create something so novel and desirable that any talk about pixel count becomes irrelevant.

My feeling is that the game-changer for digital cameras will be a radical improvement in low-light capability. (This parallels the opinions of that recent Gizmodo article.)

We’ve suffered through decades of terrible point & shoots—whose slow zooms and limited sensitivity demanded nuclear-blast electronic flash for every indoor shot.

Flash is blinding, conspicuous, annoying to bystanders, and quite rightly prohibited at most museums and concerts. It drags out the lag time before shots.

It’s also a form of lighting which makes people look like shit.

Consider the fraction of our days we spend indoors, often under marginal illumination. But living rooms, restaurants, etc.—isn’t this where our real lives happen? Wouldn’t it be amazing to record those moments realistically, accurately, but without blinding and ugly flash?

What if you had a camera that could shoot at ISO 1600, cleanly? What about a camera where anti-shake let you trust shooting at 1/15th sec.? Plus a lens of f/1.7 or f/1.4, scooping up roughly four times as much light as a typical f/2.8 compact zoom?

Then, people could take photos freaking anywhere. Without flash. The light of a single candle is enough! (That’s LV 2, if you’re wondering.)
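If you want to check that candle claim, the standard exposure equation does the arithmetic. A Python sketch, plugging in the LV 2 and f/1.7 figures above:

```python
import math

def shutter_seconds(light_value, f_number, iso):
    """Shutter time from the exposure equation N^2 / t = 2^EV,
    where EV = LV + log2(ISO / 100)."""
    ev = light_value + math.log2(iso / 100)
    return f_number ** 2 / 2 ** ev

# Candlelight (LV 2), an f/1.7 lens, and clean ISO 1600:
t = shutter_seconds(2, 1.7, 1600)
print(f"1/{1 / t:.0f} sec")  # about 1/22 sec
```

That lands comfortably within reach of an anti-shake system rated for 1/15th sec. handheld.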

Now, to get a lens that fast, we’d probably need to lose the zoom.

Oh noes! Camera makers’ second most-flogged spec number is zoom range. Our 12x is better than their 10x! I can hear the howls already: “Who would buy a camera without a zoom?”

Well, how many people use cell phones as their main camera today? Those have no optical zoom.

Some used to ask, “who would pay for a compressed MP3 when you can own the real physical disc?”

People who appreciate convenience. People who want technology to fit into their real lives. Us.