Hi to all signal processing experts.
Sometimes my picture is off by 2.5 degrees; sometimes by -0.14. Intuitively one would imagine that the image degrades less the smaller the rotation.
Then again, rotation by a multiple of 90 is effectively lossless.
I currently assume that the image gets a constant amount of degradation from any rotation that isn't a multiple of 90. Of course I could never spot the degradation in a 24 megapixel image, but I'm intrigued whether there is some theory to this. Mathematics can be insightful -- perhaps the picture comes out perfect if it's a Bayer image and the rotation is something absurdly silly like e/360+pi
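To make the question concrete, here is a small experiment one could run (a sketch only; it uses `scipy.ndimage.rotate` with cubic spline interpolation, and a random noise array as a stand-in for a photo channel). Each non-90° rotation resamples the image, so a +2.5°/-2.5° round trip should not return the original pixels, while a 90° rotation is a pure pixel shuffle and round-trips exactly:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
img = rng.random((200, 200))  # stand-in for one channel of a photo

# Rotate by a small angle and back; interpolation happens twice.
angle = 2.5
once = ndimage.rotate(img, angle, reshape=False, order=3)
back = ndimage.rotate(once, -angle, reshape=False, order=3)

# Compare only the central region, to ignore border-fill effects.
c = np.s_[50:150, 50:150]
rmse = np.sqrt(np.mean((img[c] - back[c]) ** 2))

# A multiple of 90 degrees just permutes pixels: exactly lossless.
quarter = np.rot90(img)
restored = np.rot90(quarter, k=-1)

print(f"RMSE after +/-{angle} deg round trip: {rmse:.4g}")
print("90-degree round trip exact:", np.array_equal(img, restored))
```

The round-trip RMSE for the 2.5° case comes out nonzero (interpolation acts as a mild low-pass filter each time), whereas the 90° case is bit-exact. A noise image exaggerates the effect compared with a real photo, which is presumably why nothing is visible at 24 megapixels.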