
Multi-Touch Screens vs. Mouse-Driven Screens

Name:

Number:

Course:

Lecturer:

Comparison and contrast of the metaphors used in the design of applications that run on devices with either a multi-touch computer monitor or a mouse-driven computer monitor

First, the desktop metaphors of the multi-touch computer monitor are more modern compared to those of the mouse-driven computer monitor, which in this case can be considered the older technology. A metaphor (icon) on a multi-touch screen responds to a touch directed at that particular metaphor when a given action is activated. The metaphors on a mouse-driven computer monitor, on the other hand, respond to a mouse click on the metaphor the user has selected. Both of these monitors have their merits and demerits.

The advantage of the multi-touch computer monitor is that invoking a command through a metaphor is done by simply touching that particular metaphor, which takes a shorter time. Invoking an action on a mouse-driven monitor is a little slower, since it involves moving the mouse, pointing at the metaphor in question, and then clicking it. This means the touch screen is easier to use than the mouse-driven monitor.
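
As a minimal sketch of this difference, assuming a browser-style interface with a hypothetical icon element and an invokeOpenDocument command (neither comes from the essay), the same action can be wired to both interaction styles through Pointer Events, which report whether the input came from a finger or a mouse:

```typescript
// Sketch only: hypothetical element id and command name.
const icon = document.getElementById("open-document-icon");

function invokeOpenDocument(): void {
  console.log("Open Document command invoked");
}

icon?.addEventListener("pointerup", (e: PointerEvent) => {
  // e.pointerType distinguishes the two interaction styles.
  if (e.pointerType === "touch") {
    // Multi-touch monitor: a single tap on the metaphor invokes it directly.
    invokeOpenDocument();
  } else if (e.pointerType === "mouse") {
    // Mouse-driven monitor: the user has already moved and pointed;
    // releasing the button completes the click that invokes the command.
    invokeOpenDocument();
  }
});
```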

The disadvantage of the multi-touch computer monitor is that using touch to open the properties of a desktop or laptop icon/metaphor can be quite challenging, because the only gesture most users know is the one equivalent to clicking and double-clicking on a mouse-driven monitor. Opening the properties window of any on-screen icon appears simpler with a mouse-driven monitor: the user right-clicks the metaphor and selects Properties from the pop-up menu.
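
The sketch below, again assuming hypothetical element and helper names, illustrates this asymmetry: the mouse path uses the browser's native contextmenu event, while the touch path has no native right click, so a long-press timer (a commonly used substitute, not something specified in this essay) stands in for it:

```typescript
// Sketch only: hypothetical element id and helper function.
const fileIcon = document.getElementById("file-icon");

function showProperties(): void {
  console.log("Properties window opened");
}

// Mouse-driven monitor: right click fires the contextmenu event directly.
fileIcon?.addEventListener("contextmenu", (e: MouseEvent) => {
  e.preventDefault();
  showProperties();
});

// Multi-touch monitor: emulate the right click with a ~600 ms long press.
let pressTimer: number | undefined;
fileIcon?.addEventListener("touchstart", () => {
  pressTimer = window.setTimeout(showProperties, 600);
});
fileIcon?.addEventListener("touchend", () => {
  if (pressTimer !== undefined) window.clearTimeout(pressTimer);
});
```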

Nevertheless, both monitors were designed using the principle of user-centered design, which is followed by many software developers. However, user-centered design alone may not add value to either system, because the principle does not take into account activity-centered design, that is, what the system is actually supposed to do. For example, one may ask what a user gains from a multi-touch screen in terms of activity. In truth, the two are designed to perform the same activities, only through different types of interface. If the multi-touch screen design enabled an activity that is not possible with a mouse-driven monitor, then it would be far more compelling. In other words, the two kinds of applications differ only in the way users interact with the system, but are similar in the work they are expected to do. Therefore, when redesigning applications to run on all devices, I will use both user-centered design and activity-centered design as a way of reducing the applications' limitations (Norman, 2005).

The following figure (A) shows an example of a multi-touch interface, with arrows indicating the interaction gestures. The next diagram, figure (B), shows a mouse-driven interface and how the mouse can be used to repeat a command that was previously undone. The action is completed by simply moving the mouse, pointing at the target command/icon as shown below, and then clicking it.

A: Multi-touch interface

B: Mouse-driven interface

Difference between the interaction types and styles that apply to these monitors and the applications running on them

In fact, the applications running on the two types of system can be the same, although multi-touch screens in most cases run mobile application software. The multi-touch screen's response is based on its ability to sense the angle at which the user's finger touches the screen relative to the screen surface. This does not apply to a mouse-driven monitor, where a response only occurs on a mouse click on an on-screen item. On a multi-touch screen, the user can also exploit the friction between a finger and the screen to apply several force vectors while remaining in contact with the screen. This is not possible with a mouse-driven monitor. The multi-touch monitor also has higher potential for rich interaction than a mouse-driven monitor, since it relies on multiple points of contact rather than pressure, which has a side effect when hard pressing with the fingertips is involved, as in the case of mouse clicks. In short, interaction with a multi-touch monitor is easier and simpler than with a mouse-driven monitor (Buxton, 2007).
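
A minimal sketch of this richer contact information, assuming a hypothetical canvas element id, shows how two simultaneous touch points can be combined into a pinch-to-zoom gesture, something a single mouse pointer cannot report:

```typescript
// Sketch only: hypothetical element id; logic illustrates a basic pinch gesture.
const surface = document.getElementById("canvas");
let startDistance = 0;

function distance(a: Touch, b: Touch): number {
  return Math.hypot(a.clientX - b.clientX, a.clientY - b.clientY);
}

surface?.addEventListener("touchstart", (e: TouchEvent) => {
  if (e.touches.length === 2) {
    // Two fingers in contact at once: record the initial spread.
    startDistance = distance(e.touches[0], e.touches[1]);
  }
});

surface?.addEventListener("touchmove", (e: TouchEvent) => {
  if (e.touches.length === 2 && startDistance > 0) {
    // The ratio of spreads gives a zoom factor derived from both contacts.
    const scale = distance(e.touches[0], e.touches[1]) / startDistance;
    console.log(`Pinch scale: ${scale.toFixed(2)}`);
  }
});
```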

Interaction with a mouse-driven monitor is also slightly more accurate, especially for selecting smaller targets on screen. In addition, mouse-driven interaction is more efficient, as measured by throughput, than a digital tabletop. Multi-touch table interaction can nevertheless be a workable or feasible alternative to mouse input, provided that suitable provisions are made for adjusting target size. The efficiency of the multi-touch monitor is therefore lower for screens with smaller targets. For horizontal touch surfaces, the target elements within the user interface ought to be larger than 30 mm for better performance. If this target size is achieved, task performance can match or exceed mouse-based performance on similar tasks. As a result, mouse-based selection and completion models can be employed to provide an upper bound on task completion (Micire, 2010).
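
The 30 mm recommendation can be translated into interface units with simple arithmetic; the sketch below assumes a 96 DPI display (a hypothetical figure, since actual density varies by hardware):

```typescript
// Sketch only: converts the recommended physical target size into pixels.
const SCREEN_DPI = 96;      // assumed display density
const MIN_TARGET_MM = 30;   // recommended minimum touch target for horizontal surfaces

function mmToPixels(mm: number, dpi: number): number {
  return (mm / 25.4) * dpi; // 25.4 mm per inch
}

const minTargetPx = Math.ceil(mmToPixels(MIN_TARGET_MM, SCREEN_DPI));
console.log(`Minimum touch target: ${minTargetPx}px`); // about 114 px at 96 DPI
```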

Description of the conceptual model employed in the design of these types of applications

Contemporary business process modeling tools offer menu-based user interfaces that can be used to define and visualize process models. These menu-based interactions are optimized for applications that run on desktop computers. Therefore, mouse-driven monitors, which are typically used with desktops, suit this contemporary business-process conceptual model, whereas the model is very limited in multi-touch applications. On the other hand, the extensive use of mobile devices in day-to-day business operations, together with their multi-touch capabilities, offers promising prospects for defining and modifying process models intuitively. In addition, multi-touch promotes collaborative business process modeling that relies on natural and intuitive gestures and interactions (Kolb, Rudner & Reichert, 2013).
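
As an illustration only, and not the gesture set proposed by Kolb, Rudner and Reichert (2013), a gesture-based modeling tool might map touch gestures directly onto process-model editing operations instead of routing them through menus:

```typescript
// Sketch only: hypothetical gesture names and editing operations.
type Gesture = "tap" | "double-tap" | "drag" | "pinch";

interface ModelOperation {
  description: string;
}

const gestureToOperation: Record<Gesture, ModelOperation> = {
  "tap": { description: "Select a process activity" },
  "double-tap": { description: "Open the activity's attributes" },
  "drag": { description: "Move an activity or draw a control-flow edge" },
  "pinch": { description: "Zoom the process model canvas" },
};

function handleGesture(g: Gesture): void {
  // In a menu-based desktop tool, each of these would instead be a menu command.
  console.log(gestureToOperation[g].description);
}

handleGesture("drag"); // "Move an activity or draw a control-flow edge"
```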

Description of analogies and concepts exposed by these monitors to users, including the task-domain objects users manipulate on the screen

The multi-touch monitor and the mouse-driven monitor are similar in a number of ways. First, both have interfaces designed on the principle of human-centered design. In both cases, users can complete tasks such as creating documents, saving them, retrieving information, and sending a created file to another recipient online. Users can also edit a created document in both cases, though through different methods of interaction. There is also the task of switching the device on and off before and after use, which is likewise performed through different interfaces. In both cases, users also rely on a graphical user interface to perform tasks such as browsing the internet.

Most of the selection gestures made on a multi-touch monitor are roughly analogous to the mouse clicks performed within a user-interface window. Matejka et al. (2009), as cited in Micire (2010), reported the successful emulation of a mouse on a multi-touch platform by means of simple and elegant finger-tracking heuristics. For example, the analogy of dragging items from one area of the screen to another is ubiquitously employed in Windows, Icons, Menus, Pointer (WIMP) interface interactions (Micire, 2010).
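
A minimal sketch of that emulation idea (not Matejka et al.'s actual heuristics, and using a hypothetical app-surface element) forwards the first tracked finger as synthetic mouse events so a WIMP-style interface can still be driven by touch:

```typescript
// Sketch only: hypothetical element id; the first touch point is replayed as mouse events.
const target = document.getElementById("app-surface");

function forwardAsMouse(type: string, touch: Touch): void {
  target?.dispatchEvent(new MouseEvent(type, {
    clientX: touch.clientX,
    clientY: touch.clientY,
    bubbles: true,
  }));
}

// Finger down, move, and lift become mousedown, mousemove, and mouseup,
// which is enough to reproduce click-and-drag in a WIMP interface.
target?.addEventListener("touchstart", (e) => forwardAsMouse("mousedown", e.touches[0]));
target?.addEventListener("touchmove", (e) => forwardAsMouse("mousemove", e.touches[0]));
target?.addEventListener("touchend", (e) => forwardAsMouse("mouseup", e.changedTouches[0]));
```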

References

Buxton, B. (2007). Multi-touch systems that I have known and loved. Microsoft Research, 56, 1-11.

Kolb, J., Rudner, B., & Reichert, M. (2013). Gesture-based Process Modeling Using Multi-Touch Devices. International Journal of Information System Modeling and Design, 4(4), 48-69.

Micire, M. J. (2010). Multi-touch interaction for robot command and control. University of Massachusetts Lowell.

Norman, D. (2005). Human-centered design considered dangerous. Retrieved from http://www.jnd.org/dn.mss/humancentered_desig.html