Wait... why are we having people do this anyway? Backscatter scans would be perfect if it weren't for the privacy problem, so why aren't we using image-processing software to look for suspicious objects instead of people?
Reading it firmly shifted my objection from one based on privacy to one based on safety. Therac-25 wasn't that long ago, after all. And, of course, let's not forget shoe-fitting fluoroscopes.
Let's follow this line of reasoning. Suppose your image recognition software noticed something it thought was suspicious. What are you going to do about it? Have a human look at the backscatter picture? Do a pat-down? Either way, you're cutting down on the privacy invasion, but not eliminating it.
That's probably worthwhile, but I doubt it'll make people feel much better about the machines, and it'll piss off the people who see it as backing down on airport security.
Do it, prove that it works, find a political or corporate sponsor to roll your product out, and help solve this growing war between personal liberty and government security.
People distrust technology. Even in cases where it's more reliable than a human (and I'm not sure this would be one), there's an inherent mistrust of machines making decisions.