The entire reason AI upscaling is even becoming a consideration in any application is because of how notorious other methods are for making subjectively incorrect decisions about what a pixel value should be. People are willing to put up with AI picking the wrong values often, in the hope that at least it won't consistently pick the wrong ones the way other methods do in almost all cases. There wouldn't be so many different upscaling methods if even one of them reliably came up with reasonable pixel values, and there wouldn't be efforts at new technology if everything worked correctly today.
Perfect upscaling isn't even mathematically possible, because multiple different higher-resolution input scenes can produce the same lower-resolution output image. It's like the counting argument in compression: you can't compress every possible n-bit message into an (n-1)-bit message. AI systems are trained on a particular set of images, and that training set forms the assumptions the system makes about any image it adds resolution to.
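A minimal sketch of the many-to-one point, using plain Python and 2x2 box averaging as a stand-in downsampler (real pipelines use fancier filters, but the argument is the same): two visibly different images collapse to the identical low-res result, so no upscaler can recover which one you started with.

```python
# Why upscaling is ill-posed: two different "high-res" images downsample
# (here via 2x2 block averaging) to the exact same low-res image, so the
# original cannot be recovered from the downsampled result alone.

def downsample_2x2_average(img):
    """Average each non-overlapping 2x2 block of a 2D list into one pixel."""
    h, w = len(img), len(img[0])
    return [[sum(img[2*r + i][2*c + j] for i in range(2) for j in range(2)) / 4
             for c in range(w // 2)]
            for r in range(h // 2)]

# A checkerboard and a gradient-like patch: clearly different images...
checkerboard = [[0, 255],
                [255, 0]]
gradient     = [[85, 170],
                [170, 85]]

# ...yet both collapse to the same single pixel.
print(downsample_2x2_average(checkerboard))  # [[127.5]]
print(downsample_2x2_average(gradient))      # [[127.5]]
```

Any upscaler handed `[[127.5]]` has to guess between these (and infinitely many other) originals; the training data is what biases the guess.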
Exactly. The only reason AI brings hope, barely, is that it can give you a context-aware upscale: "Looks like you are upscaling a human. I've done that before. This pixel here should be something like this." But then the problem of injecting imagery that isn't there, and being certain that you haven't, becomes even more serious.