Group: Forum Members
Posts: 120
Visits: 1.5K
After about a month's use I am still not sure what will happen when I put my finger on the screen. I know it has to do with press/long-press/drag, but since the Nexus is slower than a desktop PC, I cannot be sure when it is ready for input after moving the map, for example.
Touching the screen could be:
A. I want airspace/airport/zone information.
B. I want to edit/create a route, waypoint, etc.
C. I want to see another part of the map that is not on the screen at the moment.
Right now these three functions are distinguished by press/long-press, etc. I still feel that having a 'CTRL'/'ALT'/'SHIFT' key to press first would simplify rather than complicate the user interface. I mean that as a keyboard equivalent; on a touchscreen you could of course provide a button or a three-position switch instead. When I touch the screen, I already know whether I want A, B, or C, so I would select that function first. You could make C the default and then only need a two-position selector for A/B, or similar.
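To make the suggestion concrete, here is a minimal sketch (not the app's actual code; all names here are hypothetical) of what explicit-mode touch handling could look like: the user sets the mode first, and every subsequent tap is then unambiguous, with C (pan) as the default as proposed above.

```java
// Hypothetical sketch of mode-first touch dispatch for a moving-map app.
public class TouchModeDemo {
    // The three intents from the post: A = info, B = edit, C = pan (default).
    enum Mode { INFO, EDIT, PAN }

    // With an explicit mode there is no press/long-press/drag ambiguity:
    // a single tap always means exactly one thing.
    static String handleTap(Mode mode, int x, int y) {
        switch (mode) {
            case INFO: return "show airspace/airport info at " + x + "," + y;
            case EDIT: return "edit route/waypoint at " + x + "," + y;
            default:   return "pan map toward " + x + "," + y;
        }
    }

    public static void main(String[] args) {
        Mode mode = Mode.PAN;                 // C is the default mode
        System.out.println(handleTap(mode, 10, 20));
        mode = Mode.INFO;                     // user flips the A/B selector
        System.out.println(handleTap(mode, 10, 20));
    }
}
```

The point of the sketch is only that dispatch becomes a plain lookup on the current mode, instead of timing-dependent gesture recognition.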
Thanks,
Dirk