At first I thought my touches were not detected at all, but then I found they were just way out of place.
It turns out that the OnTouch X and Y coordinates relate to the physical screen resolution, rather than to the relative X and Y positions of the display as MouseX()/MouseY() do.
As a workaround, I calculate a ratio and use it to recompute the X and Y coordinates.
Code:
HWidth = GetAttribute(#DISPLAY, 0, #ATTRHOSTWIDTH)
HHeight = GetAttribute(#DISPLAY, 0, #ATTRHOSTHEIGHT)
DWidth = GetAttribute(#DISPLAY, 0, #ATTRWIDTH)
DHeight = GetAttribute(#DISPLAY, 0, #ATTRHEIGHT)
WidthRatio = HWidth/DWidth
HeightRatio = HHeight/DHeight

But in non-fullscreen mode, the app display opens as a smaller window in the middle of the screen. Then the OnTouch X and Y coordinates start from the upper left corner of the app window, so the values are exactly the same as they would be using MouseX()/MouseY().
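For completeness, here is roughly how I apply the ratios inside a touch handler to map the host-pixel position back to display coordinates. This is only a sketch: the handler name p_TouchHandler and the message fields msg.x/msg.y are my assumptions, modelled after Hollywood's usual event messages, not taken from the manual.

Code:
; Sketch only - assumes the OnTouch event message exposes the host
; pixel position in msg.x/msg.y like other Hollywood event messages
Function p_TouchHandler(msg)
    Local dx = msg.x / WidthRatio    ; host pixels -> display pixels
    Local dy = msg.y / HeightRatio
    DebugPrint("Touch at display position", dx, dy)
EndFunction

InstallEventHandler({OnTouch = p_TouchHandler})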
Now this is no longer a problem for me, since I found that in this case I was better off using IsLeftMouse(), but I still wonder: is this OnTouch behaviour intentional?