I am experiencing visual distortion rather than blank or dark spots in my vision. I see wavy lines, and when I read, the letters sort of “decompose” - pieces of them disappear, and I have to figure out what I’m seeing from context. Several months ago I was trying to read a label on something and was using my iPhone Magnifier app. To my surprise, the app not only made the print bigger, it also corrected the “decomposition” of the letters and made the print look normal to me. I started experimenting and found that if I looked at the rest of the world through the app, it corrected the wavy lines as well. A good example was watching the news on TV - without the app, the anchors’ faces were distorted; with it, they looked normal.
I believe that the interpolation software Apple uses with the camera and Magnifier app is providing a smoothing function by adding pixels in the display. I wonder if this technology could also be used as part of the macOS and iOS display drivers to provide the same smoothing function on their displays that I get with the camera/Magnifier. Any thoughts from you developers about whether this would be possible/trivial/difficult? And whether Apple would have to do it, or if it could be an add-on app?
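To make concrete what I mean by “adding pixels to smooth things out,” here is a toy sketch in Python. It is purely hypothetical - I have no idea what Apple’s actual camera pipeline does - but it shows the simplest kind of smoothing filter: each output pixel becomes the average of its 3x3 neighborhood, which softens hard transitions the way the Magnifier seems to soften the distortions I see.

```python
def box_blur(image):
    """Apply a 3x3 box blur to a grayscale image (a list of rows),
    averaging each pixel with its neighbors and clamping at the edges.
    This is a toy stand-in for whatever smoothing/interpolation the
    real camera pipeline performs."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = count = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += image[ny][nx]
                        count += 1
            out[y][x] = total / count  # average of the neighborhood
    return out

# A single bright pixel (a "hard edge") gets spread across its neighbors:
sharp = [[0, 0, 0],
         [0, 9, 0],
         [0, 0, 0]]
smooth = box_blur(sharp)
print(smooth[1][1])  # 1.0 -- the sharp spike is averaged down
```

My (layman’s) question is essentially whether a filter like this could be applied to the whole rendered screen by the OS, instead of only to the camera feed.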