To be sure, film has grain: individual crystals of photochemically reactive silver halide. As such, there is a minimum resolution to film that can be expressed as a grain density. Back when I did film photography, we would scan film at 4000 dots per inch, which was fine enough that individual film grains were usually represented by multiple pixels in the final image. Somewhere I have a hard disk with Roll 40 scans from the camera originals (not dupe masters) at 4000 DPI, but I don't remember where I put it.
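For a sense of scale, here's the arithmetic on what a 4000 DPI scan works out to in sensor terms. I'm assuming a standard full 36 x 24 mm frame here; the original roll could have been another format, so treat this as illustrative:

```python
# Megapixel equivalent of a 4000 DPI scan of a 35 mm frame.
# The 36 x 24 mm frame size is an assumption (standard full-frame
# 35 mm); other formats would scale accordingly.

MM_PER_INCH = 25.4
DPI = 4000
FRAME_W_MM, FRAME_H_MM = 36.0, 24.0

width_px = FRAME_W_MM / MM_PER_INCH * DPI    # ~5669 px
height_px = FRAME_H_MM / MM_PER_INCH * DPI   # ~3780 px
megapixels = width_px * height_px / 1e6      # ~21.4 MP

print(f"{width_px:.0f} x {height_px:.0f} px ~= {megapixels:.1f} MP")
```

So a 4000 DPI scan of a full frame lands around 21 megapixels, which is why individual grains end up spanning multiple pixels.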
The point is that if you really wanted to, you could express film's granularity and dimensions in terms similar to those you'd use to describe a CCD sensor of comparable pixel density and size. Film grain is spread evenly across the frame on average, but it isn't laid out in a uniform grid the way sensor pixels are. Nevertheless, it is theoretically possible to express film granularity as something like average grains per square millimeter, which is close enough to a sensor's uniform pixels per square millimeter for comparison.
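A sketch of that comparison, with the caveat that the grain density below is a made-up placeholder, not a measured value for any real emulsion (real grain counts vary enormously with film stock, and grains also overlap in depth within the emulsion):

```python
# Comparing an average film grain density to a sensor's pixel density.
# grains_per_mm2 is a hypothetical placeholder value for illustration.

FRAME_AREA_MM2 = 36.0 * 24.0              # standard 35 mm frame, assumed

grains_per_mm2 = 25_000                   # hypothetical average grain density
pixels_per_mm2 = 21.4e6 / FRAME_AREA_MM2  # the 4000 DPI scan from above

equivalent_mp = grains_per_mm2 * FRAME_AREA_MM2 / 1e6  # ~21.6 "megagrains"

print(f"~{equivalent_mp:.1f} 'megagrains' per frame "
      f"vs ~{pixels_per_mm2:,.0f} px/mm^2 on the scan")
```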
That's not how film granularity is typically measured, though. Instead, you take actual densitometry measurements across a uniformly exposed patch and compute the root mean square (RMS) deviation of the density readings. This accommodates the nonuniform distribution and gives the result as a statistic rather than a discrete count. So while in theory you can apply "megapixels" to film, you shouldn't. Film is only technically a discrete medium, and therefore it usually isn't treated or measured as one in photography or photographic analysis.
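A minimal sketch of that RMS calculation, assuming you already have a series of microdensitometer readings. The 1000x scaling follows the common convention for published granularity figures (Kodak's, for instance, which are read through a small aperture, typically 48 microns); the sample data here is simulated, not real:

```python
import numpy as np

def rms_granularity(densities: np.ndarray) -> float:
    """RMS granularity of a set of density readings.

    `densities` is assumed to be diffuse-density values sampled across
    a uniformly exposed patch. Published figures conventionally report
    1000x the standard deviation of those readings.
    """
    mean_d = densities.mean()
    # RMS deviation from the mean density, i.e. the standard deviation
    sigma_d = np.sqrt(np.mean((densities - mean_d) ** 2))
    return 1000.0 * sigma_d

# Simulated readings from a patch developed to a mean density of ~1.0
samples = np.random.default_rng(0).normal(loc=1.0, scale=0.010, size=10_000)
print(f"RMS granularity ~= {rms_granularity(samples):.0f}")  # ~10, fine-grained
```

Note that the result characterizes the film as a whole, statistically; there's no answer to "which grain is this pixel," which is exactly why a megapixel count is the wrong frame for film.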