Yeah, that would make sense, but is there any reason why RTL is less susceptible to static, cosmic rays, or EM radiation than CMOS chips?
Most electronics are susceptible to radiation, but some components (mainly semiconductors) are far more susceptible than others.
Radiation is energy, so what really matters is how much energy it delivers to a device (e.g., a transistor in an integrated circuit) compared to what that device normally handles or can handle. Until Apollo, all computers had been made from discrete components (transistors, etc.) that tend to be physically large. (The Apollo Guidance Computer was the first computer made with integrated circuits.) Those computers occupied large rooms and had to be continuously cooled, yet their processing speeds and memory sizes were quite low. Each component therefore could (and usually did) handle a lot of power, and, all else being equal, such a computer would probably be fairly radiation tolerant.
A modern commercial computer uses integrated circuits with many orders of magnitude more transistors than those used in the Apollo AGC, so each transistor has to be quite tiny. This means it can't handle much power without damage, and a very small amount of injected energy can overwhelm the energy of the data bits it is processing. The latter phenomenon causes transient errors without necessarily also causing permanent damage.
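To get a feel for the scale mismatch, here is a rough back-of-the-envelope sketch in Python. The specific inputs (a ~1 MeV energy deposit, a ~1 fC critical charge for a small storage node) are assumed, representative values rather than figures for any particular device; the ~3.6 eV needed to create one electron-hole pair in silicon is a standard constant.

```python
# Back-of-the-envelope comparison: charge liberated by a single particle strike
# in silicon vs. the "critical charge" needed to flip a small modern storage node.
# Numbers below are illustrative assumptions, not measurements of any real chip.

EV_PER_PAIR = 3.6            # ~3.6 eV per electron-hole pair in silicon
ELECTRON_CHARGE = 1.602e-19  # coulombs

def deposited_charge_fc(energy_mev: float) -> float:
    """Charge (femtocoulombs) liberated by an energy deposit in silicon."""
    pairs = energy_mev * 1e6 / EV_PER_PAIR
    return pairs * ELECTRON_CHARGE * 1e15

particle_deposit_mev = 1.0   # assumed: a modest deposit from a cosmic-ray secondary
critical_charge_fc = 1.0     # assumed: rough critical charge of a deeply scaled cell

q = deposited_charge_fc(particle_deposit_mev)
print(f"deposited: {q:.1f} fC   critical: {critical_charge_fc:.1f} fC")
# -> roughly 44 fC deposited vs. ~1 fC needed: far more than enough to flip the bit,
#    whereas an old discrete-transistor logic stage switched vastly more charge per bit.
```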
But it's possible to mitigate the effects of radiation even on these devices. One approach is to use fabrication processes known to be rad-hard. Another is to add redundancy and cross-checking so that if one device (e.g., an entire CPU) has a transient error caused by a charged particle, the error will be detected. Typically you have three devices working in parallel, with logic to compare their results and 'vote out' a CPU if it doesn't agree with the other two.
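Here is a minimal sketch of that "vote out" idea (triple modular redundancy), assuming three redundant units each return a result for the same computation. The function name and structure are illustrative, not taken from any real flight computer's design.

```python
# Minimal triple-modular-redundancy (TMR) voter sketch.
from collections import Counter
from typing import Hashable, TypeVar

T = TypeVar("T", bound=Hashable)

def tmr_vote(results: list[T]) -> tuple[T, list[int]]:
    """Return the majority result and the indices of units that disagreed."""
    winner, count = Counter(results).most_common(1)[0]
    if count < 2:
        # All three units disagree: no majority, so the result can't be trusted.
        raise RuntimeError("no two units agree")
    outvoted = [i for i, r in enumerate(results) if r != winner]
    return winner, outvoted

# Example: unit 1 suffers a transient upset but is outvoted by the other two.
value, faulty = tmr_vote([42, 41, 42])
print(value, faulty)   # -> 42 [1]
```

Note that the voter itself becomes the one element that cannot be outvoted, which is why in real hardware it is kept as simple (and therefore as trustworthy) as possible.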