|MadSci Network: Physics|
Let me begin by stating that I'm an astronomer, and not really qualified to answer your question. It seems to me that an expert in the physiology of the human eye, such as an ophthalmologist, would be the right person to ask. Nonetheless, I'll tell you what I can.
There are two ways I could interpret your question. One would be "is there any brightness beyond which the human visual system saturates, so that increasing the intensity does not increase the response of the eye?" Another would be "is there a brightness beyond which the human eye is damaged?"
The first question would be tricky to answer, because human eyes are complicated. The eye can adjust to VERY different levels of brightness by changing the diameter of the pupil, and by modifying the mix of certain light-sensitive chemicals in the retina. If you sit inside a brightly lit room at night, and then walk outside into the dark, you will at first see almost nothing. But over ten or twenty minutes, your eyes will gradually adjust to the lower light levels. Half an hour later, you'll be able to see stars which were completely invisible when you first walked outside. If you then walked back INSIDE, you would for a few seconds be dazzled by the higher light levels, before your eyes adapted again.
So, the answer to the first question would depend on the starting conditions. Someone sitting out in the dark at night would be dazzled by a certain brightness of light; but that same level of light would be no problem at all for a person who was sitting inside a brightly lit room the entire time. And that's all I can say about it. I tried to find references to any sort of "saturation intensity" for the human eye, but could not.
The answer to your second question is a bit easier to provide. There are many sources of information on eye safety -- medical texts, laser lab safety manuals, and so forth. In many of these, you can find guidelines for levels of exposure which can damage the human eye. The weakest link in the system appears to be the retina: if light of a certain intensity is focused on the retina for a certain amount of time, then it can permanently disable the light-sensitive cells and create a "blind spot." Additional exposure can detach the retina from the rear surface of the eye, or increase the size of the blind spot.
One good source which you can access easily is the Wikipedia page on "Laser safety."
You can see a set of graphs which provide the critical exposure for light of different wavelengths. One must take into account two factors: the intensity of the light, usually expressed in Watts per square meter or per square centimeter, and the duration of exposure, expressed in seconds. The product of the two yields Joules per square meter or per square centimeter: the energy deposited per unit area of the retina. The graphs show that for light in the visible range, 400 nm to 700 nm, damage may occur once the total exposure reaches roughly 0.003 Joules/cm^2; for example, light of intensity 0.003 Watts/cm^2 shining for 1 second. Higher intensities over shorter durations can deliver the same energy, and the same damage.
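To make that arithmetic concrete, here is a small sketch in Python (my own illustration, not taken from any safety standard; the 0.003 J/cm^2 figure is the rough visible-light number quoted above, and real limits vary with wavelength and exposure time):

```python
# Radiant exposure = intensity (W/cm^2) x duration (s), giving J/cm^2,
# i.e. the energy deposited per unit area of the retina.

DAMAGE_THRESHOLD_J_PER_CM2 = 0.003  # rough visible-light figure quoted above

def fluence(intensity_w_per_cm2, duration_s):
    """Energy delivered per unit area, in Joules per square centimeter."""
    return intensity_w_per_cm2 * duration_s

def may_damage_retina(intensity_w_per_cm2, duration_s):
    """True if the exposure reaches the quoted damage level."""
    return fluence(intensity_w_per_cm2, duration_s) >= DAMAGE_THRESHOLD_J_PER_CM2

# 0.003 W/cm^2 for 1 second reaches the threshold...
print(may_damage_retina(0.003, 1.0))   # True
# ...and twice the intensity for half the time delivers the same energy.
print(may_damage_retina(0.006, 0.5))   # True
```

The point of the second call is the trade-off described above: what matters is the product of intensity and time, so a brighter, briefer flash can do the same harm.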
The little laser pointer in my office produces visible light of about 650 nm (red) and is rated at less than 1 mW: one milliWatt is 0.001 Watt, or 0.001 Joules per second. Since that power is concentrated into a narrow beam only a few millimeters across, one might conclude that if I accidentally shone the beam into my eye for even a second or two, I would be risking permanent eye damage.
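A rough back-of-the-envelope version of that estimate, again just a sketch: the 1 mW power comes from the pointer's rating, but the 3 mm beam diameter is my own assumption for a typical pointer, and this simple calculation ignores further focusing by the eye's lens.

```python
import math

power_w = 0.001            # 1 mW rating = 0.001 J/s
beam_diameter_cm = 0.3     # ASSUMED: ~3 mm beam, typical for a pointer
beam_area_cm2 = math.pi * (beam_diameter_cm / 2) ** 2   # ~0.07 cm^2

# Intensity of the beam, in Watts per square centimeter.
intensity_w_per_cm2 = power_w / beam_area_cm2           # ~0.014 W/cm^2

# Time needed to deliver the 0.003 J/cm^2 exposure quoted above.
seconds_to_threshold = 0.003 / intensity_w_per_cm2
print(f"{seconds_to_threshold:.2f} seconds")            # roughly 0.2 s
```

Under these assumptions the quoted exposure level is reached in a fraction of a second, which is why even a "harmless" pointer should never be aimed at anyone's eyes.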
Again, I urge you to contact a real eye specialist to find the answer to your question.
Try the links in the MadSci Library for more information on Physics.