Who doesn’t lean in over their desk to read a text or look at a picture on their computer screen? (To be honest, I’m doing it right now.) That movement is made almost exclusively for this one reason: viewing details. And in a world where gesture-based interfaces are all the rage, we could have expected some interesting experiments.
Among others, Chris Anderson from Carnegie Mellon University started a research project exploring ways to couple human behaviour to digital functions; in this case, the digital function is zooming on your screen. The video below shows what it actually looks like (and/or read the paper).
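The core idea is simple to sketch: the closer your head is to the screen, the larger the content is shown. Below is a minimal, hypothetical illustration of such a mapping (the distance thresholds and maximum zoom are made-up values, not taken from the paper):

```python
def zoom_factor(distance_cm, near=30.0, far=60.0, max_zoom=2.0):
    """Map head-to-screen distance to a magnification level.

    At `far` cm or beyond, content is shown at 1x; leaning in
    towards `near` cm scales it up linearly to `max_zoom`.
    All parameter values here are illustrative assumptions.
    """
    # Clamp the measured distance to the [near, far] range.
    d = max(near, min(far, distance_cm))
    # Linear interpolation: closer head -> larger zoom.
    t = (far - d) / (far - near)
    return 1.0 + t * (max_zoom - 1.0)
```

So sitting back at 60 cm gives 1x, and leaning in to 30 cm gives 2x, with everything in between interpolated.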
Of course it would quickly become irritating if your computer constantly zoomed in and out depending on your position… but this research doesn’t have to end in a 1:1 implementation. It would be interesting to put the researchers in a room with a physiotherapist and an interaction/industrial designer; I think they could come up with interaction solutions that help cut back on RSI.
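That irritation comes precisely from a 1:1 mapping reacting to every small posture shift. A common way to soften this (a sketch of a standard deadband-plus-easing trick, not anything proposed in the paper) is to ignore small changes and ease towards larger ones:

```python
def smoothed_zoom(prev_zoom, target_zoom, deadband=0.1, alpha=0.2):
    """Update the displayed zoom level towards the target.

    Small differences (under `deadband`) are treated as posture
    jitter and ignored; larger ones are approached gradually by
    fraction `alpha` per update. Both values are assumptions.
    """
    if abs(target_zoom - prev_zoom) < deadband:
        return prev_zoom  # ignore small posture jitter
    # Ease towards the target instead of jumping to it.
    return prev_zoom + alpha * (target_zoom - prev_zoom)
```

With something like this, the screen stays still while you fidget and only magnifies when you deliberately lean in.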