I've been wondering how much the radiation and the photographic film interacted during the Apollo missions.
Why? I'm not trying to be smart or suggest you are a secret Apollo Denier, but I'm interested..
Maybe someone else here (JayW?) can elaborate further, and maybe there
is a way to do this meaningfully, but I doubt it and I'm going to take a slightly contrary view..
As I see it, there are 3 distinct environments that need to be taken into account:
1: Film in magazines, in boxes, in the LM, in the Van Allen Belts
2: Film in magazines, in boxes, in the LM, in free space or on the moon
3: Film in magazines, on the surface of the moon
Well, yes, I guess that's a starting point, but.. even that is over-simplifying it - what about when the cameras were in shadow? Astronaut shadow versus LM shadow? Near ground or at chest height? Etc, etc..
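If you ever did want to rough out the exposure history across those environments, the bookkeeping itself is at least trivial.. something like this toy tally, where every dose rate and duration is a made-up placeholder rather than a real Apollo number:

```python
# Toy dose-budget tally across the distinct environments.
# Every dose rate and duration below is a placeholder guess for
# illustration only - NOT a measured Apollo value.

environments = [
    # (label, assumed dose rate in mGy/hour, assumed hours spent)
    ("Van Allen belt transit (boxed magazines in the LM/CM)", 0.5,    1.0),
    ("Cislunar space / lunar orbit (boxed magazines)",        0.01, 150.0),
    ("Lunar surface EVA (magazine on the camera)",            0.03,  20.0),
]

total_mGy = 0.0
for label, rate_mGy_per_h, hours in environments:
    dose = rate_mGy_per_h * hours
    total_mGy += dose
    print(f"{label}: {dose:.2f} mGy")

print(f"Total (toy numbers only): {total_mGy:.2f} mGy")
```

The arithmetic is the easy part, of course.. every one of those rate numbers hides the shielding, geometry and shadowing problems just described.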
Is there any way to compare the radiation dose absorbed by the film to the amount of light energy needed to darken the film by, say, 50%?
Effectively impossible, I'd guess. Light is, after all, a small subset of the total radiation.. and you have radiation both above and below the visible range, all of it interacting differently with the camera body itself and then the multiple photosensitive film layers and the base emulsion. And of course there's the film still wound up prior to exposure, versus the frames at or near the light chamber and their interactions with the backplate and the reseau plate, versus what's already wound onto the takeup spool on the other side of the cartridge... There's also the fact that the chemical nature of the film changes *after* it is exposed, so you'd have to do it all twice...
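For what it's worth, you can see why even a raw energy comparison is slippery with a back-of-envelope sketch. The speed-point relation S ≈ 0.8/H (H in lux·seconds, the B&W-negative standard, used here as a stand-in) and the 683 lm/W luminous efficacy at 555 nm are standard numbers; everything else below (the film speed, the emulsion's areal density, the mission dose) is an assumed round figure, not Apollo data:

```python
# Back-of-envelope: light energy per unit area needed to reach the film's
# speed-point exposure, versus energy per unit area deposited in the
# emulsion by a whole-mission radiation dose. Standard relations used:
#   ISO speed S ~ 0.8 / H   (H = exposure in lux*seconds, B&W-neg standard)
#   1 lux ~ (1/683) W/m^2   (monochromatic 555 nm light - a simplification)
#   1 gray = 1 J/kg

iso_speed = 160                           # assumed film speed (round number)
H_lux_s = 0.8 / iso_speed                 # speed-point exposure, lux*seconds
light_J_per_m2 = H_lux_s / 683.0          # crude radiometric conversion

mission_dose_Gy = 5e-3                    # assumed ~5 mGy total dose (a guess)
emulsion_kg_per_m2 = 0.01                 # assumed ~1 mg/cm^2 areal density
radiation_J_per_m2 = mission_dose_Gy * emulsion_kg_per_m2

print(f"Light energy at the speed point: {light_J_per_m2:.1e} J/m^2")
print(f"Radiation energy in emulsion:    {radiation_J_per_m2:.1e} J/m^2")
```

With those (invented) numbers the two figures land within an order of magnitude or so of each other, which is exactly the trap: comparable raw energies, wildly different photochemical effects, so an energy comparison alone tells you almost nothing about actual fogging.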
In order to do this, the effects of the various types of radiation, and their interaction with the materials between the film and the various environments, need to be understood, as does the time spent in each environment.
No, I'd argue that they actually don't need to be understood in the sense of some sort of accurate theoretical evaluation.. What I mean is, rather than agonise yourself to a possible early grave by trying to work out some meaningful way to come up with a number that would probably be wildly inaccurate due to the enormous complexity and variability of the situation, why not use a basic understanding and a ballpark idea, and then just take the cameras up into orbit and try them out and see what happens..?
(Phew - I knew I shouldn't have held my breath as I typed that sentence..) Which is pretty much what they did. After all, we have a fairly good understanding of how film is affected down here on earth in hot/cold environments, at high altitudes, and over long periods. We also know how the radiation varies once you get into orbit and beyond, and most of this was pretty well tested and understood (eg Gemini) before A11 did it for real.
For simplicity's sake..
I don't think any attempt to do this in that fashion would be remotely 'simple'.. You'll find light-sensitivity curves for the film easily, and maybe susceptibility to X-ray (eg airport X-ray) type radiation, but extended radiation sensitivity? That would probably involve materials scientists with very sophisticated and expensive equipment, rather than Kodak or Hasselblad.. So why not just take the cameras up on missions, try them out and look carefully at the results? Which is what they did.
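Those light-sensitivity curves, for reference, are the classic characteristic (D-log E) curves: density versus log exposure, with a toe, a straight-line portion and a shoulder. Here's a toy logistic approximation of that shape.. every parameter below is invented, not from any Kodak datasheet:

```python
import math

# Toy characteristic (D-log E) curve: film density vs log10(exposure).
# A logistic gives the classic toe / straight-line / shoulder shape.
# All parameters are invented for illustration - not from a datasheet.

D_MIN, D_MAX = 0.2, 3.0   # assumed base+fog density and shoulder density
STEEP = 1.8               # assumed steepness (related to the film's gamma)
LOG_H_MID = -1.0          # assumed log10(lux*s) at the curve's midpoint

def density(log_H):
    """Approximate density at a given log10 exposure (lux*seconds)."""
    return D_MIN + (D_MAX - D_MIN) / (1.0 + math.exp(-STEEP * (log_H - LOG_H_MID)))

for log_H in [-3, -2, -1, 0, 1]:
    print(f"log10(H) = {log_H:+d} -> density ~ {density(log_H):.2f}")
```

And that's rather the point: the visible-light response is a well-mapped curve like this, while the equivalent curve for proton or heavy-ion exposure is the bit nobody publishes in a consumer datasheet.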
Frankly, it just wasn't that much of an issue, in the same way that radiation in general wasn't a big deal for these short missions (barring intense solar activity..)...
Edit: My initial assessment, based on nothing but my years spent as an amateur photographer, is that the film used, protected as it was, would survive several moon trips without being significantly damaged.
While I agree and the evidence suggests you are right, earthly experience doesn't count for much in space. And in this case, I don't think there is/was any substitute for simply testing it out in the actual environment..