Alexa footage has an ISO-specific conversion, and Nuke's is based on the base ISO, resulting in slightly over-saturated footage in some instances. Why on earth would you want to reference streaming media? Dunno, maybe because all the work out there being shot on Alexa is delivered in ProRes*. And when writing to ProRes, Nuke doesn't support arbitrary frame sizes. Meanwhile, After Effects does.
I can't speak for Nuke under Mac... I've never had a corrupt project (and it's just a plain text file, so it's pretty easy to change).
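For what it's worth, here's a minimal sketch of what that kind of hand repair can look like, assuming the usual case of a stale plate path inside a .nk script; the filenames and paths below are made up for illustration:

```python
# Minimal sketch: repairing a broken file path in a Nuke script by editing
# it as plain text. The script name and old/new paths are hypothetical.
from pathlib import Path

script = Path("comp_v012.nk")           # hypothetical project file
backup = script.with_suffix(".nk.bak")  # keep a copy before touching it

text = script.read_text()
backup.write_text(text)

# Swap the stale plate path for the current one (example paths only).
fixed = text.replace("/jobs/old_show/plates", "/jobs/new_show/plates")
script.write_text(fixed)
```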
Well then, you don't have a proper frame of reference to comment, do you? And a file written with only a couple of bytes of information is a bit hard to hack. This is a known issue that's been around for, like, two releases now.
edit: (*) I'm guessing now you might have jumped to the conclusion that "stream" meant online, maybe? Currently, Nuke doesn't do so well accessing anything but frame-based files; it's not totally reliable. If you're getting plates from a client as Alexa native QuickTimes (or from other cameras), you should be able to work with them as reliably in Nuke as you can in After Effects. And you should be able to write QuickTimes back out for a client from Nuke as reliably, and with all the flexibility of, doing the same thing in After Effects.