How is touch development on the Universal Windows Platform (UWP) different from mouse and keyboard development in Windows Forms? This post covers some of the subtle differences between the two and how to use the platform's tools to build smooth touch experiences.



Fig 1. Command prompt

The ways we communicate with our computers have gone through many changes over the years. For a long time, interactions were governed by the keyboard. Then, the Graphical User Interface (GUI) came along, which not only introduced the mouse, but drastically altered how home screens and apps looked. The success of Windows 95 and Mac OS cemented that user experience, popularized personal computing and changed our computing landscape.



Fig 2. Graphical user interface in Windows Forms

While Natural User Interfaces (NUI) have been around for a long time, the transition to touch interfaces has happened both more gradually and more quietly. In part, this is because it occurred hand-in-hand with the growth of smartphone-based mobile computing – and we tend not to see our phones as centers of computing power, but rather as accessories or appliances. In part, though, the transition was smooth because the creators of development tools such as Visual Studio made a conscious effort to shield developers from these changes by making touch and mouse (or tap and click) appear to be the same thing. As a consequence, we moved from a mouse-centric development world to a touch-friendly one without really noticing the tectonic shift beneath our feet.

What does touch-friendly mean?

When touch-enabled tablets first arrived for Windows, you pretty much used your finger as if it were a mouse to manipulate applications designed for the mouse. It was a difficult experience: buttons tended to be too small for fingertips, and the visual affordances designed for mouse interactions were hidden beneath your fingers. Even if your device supported touch, the apps running on it remained mouse-centric. Similarly, early mobile devices came with a stylus because a stylus could manipulate those tiny buttons.



Fig 3. Natural user interface in Windows Phone

Phone apps mark the transition from mouse-centric to touch-friendly development. Phone apps weren't just touch-first, of course. They were touch-only. They forced changes in the design, layout and controls used in apps to make interaction easier for smartphone users, and these changes were eventually incorporated into UWP. While UWP supports both touch and mouse interactions, as well as a variety of other inputs, when you develop apps and controls for it you should think of it first as a touch interface, with mouse interactions added on secondarily.

There are many overlaps between touch gestures and mouse interactions that make this easier. For instance, when you either tap or click a UWP button, both the Tapped event and the Click event are raised. Similarly, events such as ManipulationStarted and ManipulationCompleted do not differentiate between touch and mouse. Given the amount of effort that has gone into blurring the difference between touch and mouse in UWP, it is worth asking ourselves: when is a click not a click?...
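To make that overlap concrete, here is a minimal sketch of wiring up these events in C++/WinRT (one of several languages UWP supports). The WireUpInput helper, the button and the surface element are assumptions made for illustration; the events and types themselves – Click, Tapped, ManipulationStarted, ManipulationCompleted and ManipulationMode – are standard UWP XAML APIs.

```cpp
#include <winrt/Windows.Foundation.h>
#include <winrt/Windows.Devices.Input.h>
#include <winrt/Windows.UI.Xaml.h>
#include <winrt/Windows.UI.Xaml.Controls.h>
#include <winrt/Windows.UI.Xaml.Input.h>

using namespace winrt;
using namespace Windows::Foundation;
using namespace Windows::UI::Xaml;
using namespace Windows::UI::Xaml::Controls;
using namespace Windows::UI::Xaml::Input;

// Hypothetical helper, assumed to be called from a page's constructor
// in a C++/WinRT UWP app.
void WireUpInput(Button const& button, Grid const& surface)
{
    // Click is raised for a finger tap, a mouse click and a pen tap alike.
    button.Click([](IInspectable const&, RoutedEventArgs const&)
    {
        // Respond to the activation without caring which device produced it.
    });

    // Tapped is likewise raised for touch and mouse; if you really need to
    // know the device, the event args expose it.
    button.Tapped([](IInspectable const&, TappedRoutedEventArgs const& args)
    {
        if (args.PointerDeviceType() == Windows::Devices::Input::PointerDeviceType::Touch)
        {
            // A touch-specific tweak could go here.
        }
    });

    // Manipulation events are opt-in: an element only reports drags after
    // its ManipulationMode is set. Once enabled, they too treat touch drags
    // and mouse drags identically.
    surface.ManipulationMode(ManipulationModes::TranslateX | ManipulationModes::TranslateY);
    surface.ManipulationStarted([](IInspectable const&, ManipulationStartedRoutedEventArgs const&)
    {
        // A drag began, whether by finger or by mouse.
    });
    surface.ManipulationCompleted([](IInspectable const&, ManipulationCompletedRoutedEventArgs const&)
    {
        // The drag (and any inertia) finished.
    });
}
```

Notice the asymmetry: Click and Tapped come for free on any button, while manipulations must be explicitly opted into – a small hint that, under the covers, the platform is translating very different input streams into one event model.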


Read more: UWP and the evolution of touch development – Building Apps for Windows