Although touchscreen support was only announced for the .NET Micro Framework V3, I found a way to use touchscreens with existing WPF applications without significant modifications, and I put together a library.
The PointingDeviceInputProvider class
The abstract base class PointingDeviceInputProvider provides the three methods DevicePointerDown, DevicePointerMove, and DevicePointerUp, which all accept screen coordinates. To implement touch support, you derive a custom input provider such as TouchInputProvider and pass the coordinates to these methods. It does not matter where the coordinates come from: besides a touchscreen, a mouse or even a joystick would be possible. The PointingDeviceInputProvider class handles the rest.
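For example, a hypothetical provider fed by a joystick could forward its coordinates to the same base class. The JoystickDriver events below are illustrative assumptions, not part of the actual library; only the PointingDeviceInputProvider methods come from the text:

```csharp
// Hypothetical joystick-based provider; the joystick driver callbacks
// are illustrative assumptions, not part of the actual library.
public class JoystickInputProvider : PointingDeviceInputProvider
{
    public JoystickInputProvider(PresentationSource source)
        : base(source, false)
    {
        // subscribe to the (assumed) joystick driver events here
    }

    // Called by the assumed joystick driver with screen coordinates.
    private void OnJoystickPressed(int x, int y)
    {
        this.DevicePointerDown(x, y); // the base class does the rest
    }

    private void OnJoystickReleased(int x, int y)
    {
        this.DevicePointerUp(x, y);
    }
}
```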
How can I use touchscreens without changing existing WPF applications?
When you pass coordinates to the DevicePointerDown method, the PointingDeviceInputProvider class determines the underlying WPF element at that screen position and sets the focus to that element. Then it sends a Select-button down event to the element.
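Conceptually, the down handling works roughly like the following sketch. This is not the actual library code; the hit-test call and focus call are assumptions based on the standard Microsoft.SPOT input and presentation types:

```csharp
// Conceptual sketch of DevicePointerDown, assuming the standard
// .NET Micro Framework input types; not the library's actual code.
public void DevicePointerDown(int x, int y)
{
    // Find the topmost element under the touch point (assumed hit test).
    UIElement target = mainWindow.ChildElementFromPoint(x, y);
    if (target != null)
    {
        Buttons.Focus(target); // give the element the focus

        // ...then raise a Select button down event for the focused
        // element, so existing key-driven logic fires unchanged.
    }
}
```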
If you have a button element with nested child elements, such as text, and you want the button itself to be focused, you need to disable all the child elements to prevent one of them from getting the focus instead of the surrounding button.
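For a button containing a text element, that could look like the following sketch. The Text class is the standard Microsoft.SPOT.Presentation.Controls element; the Button class here stands in for a custom or sample button element and is an assumption:

```csharp
// Sketch: disable a button's text child so the button itself,
// not the text, receives the focus on touch.
Text label = new Text(myFont, "Press me");
label.IsEnabled = false; // child must not grab the focus

Button button = new Button(); // assumed custom/sample button element
button.Child = label;
```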
To add touchscreen support to your WPF application, you just need a device with a touchscreen, such as the HiCO.ARM9 from emtrion, and a concrete input provider implementation like the emtrion.SPOT.Input.TouchInputProvider class instead of, or in addition to, a GPIOInputProvider object.
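Wiring the provider into an application is then just one extra line in Main. A sketch following the common .NET Micro Framework application pattern; the window class and the null source argument are assumptions here:

```csharp
public class MyApp : Microsoft.SPOT.Application
{
    public static void Main()
    {
        MyApp app = new MyApp();

        // Create the touch input provider instead of (or next to) the
        // usual GPIO button provider; passing null lets the provider
        // attach to the main presentation source (assumed).
        TouchInputProvider touchProvider = new TouchInputProvider(null);

        app.Run(new MainWindow()); // MainWindow is the app's own window class
    }
}
```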
emtrion's HiCO.ARM9 hardware
The following shows the emtrion.SPOT.Input.TouchInputProvider implementation, based on the PointingDeviceInputProvider class and using emtrion's managed touch driver:
//universal TouchInputProvider for the emulator and the ADS7843 on the HiCO.ARM9
//written by Jens Kühner
public class TouchInputProvider : PointingDeviceInputProvider
{
    public TouchInputProvider(PresentationSource source)
        : base(source, false)
    {
    }

    private void PenDown(TouchDriver.Event rEvent, object rId)
    {
        int x = rEvent.Pos.X;
        int y = rEvent.Pos.Y;

        //we currently only get a down event,
        //so simulate down and up
        this.DevicePointerDown(x, y);
        this.DevicePointerUp(x, y);
    }
}
Emulating a touchscreen
It is even possible to emulate a touchscreen with the extensible .NET Micro Framework emulator. The following figure shows emtrion's HiCO.ARM9 hardware emulator with touchscreen support. You can simulate touch events by clicking on the emulator display with the mouse.
Extended touch support
If you need even better support than just simulated key events, you can implement a custom WPF element that implements the Kuehner.SPOT.Input.IExtendedPointDeviceControl interface with the methods OnPointerDown, OnPointerUp, and PointerMove. The attached sample includes a paint box sample to demonstrate this feature.
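A paint box element could implement the interface roughly like this. Only the interface name and its three methods come from the text; the class name and the drawing logic are illustrative assumptions:

```csharp
// Sketch of a paint element using the extended interface; the
// drawing code is illustrative, not the sample's actual code.
public class PaintBox : UIElement, Kuehner.SPOT.Input.IExtendedPointDeviceControl
{
    private int lastX, lastY;

    public void OnPointerDown(int x, int y)
    {
        lastX = x; // remember where the stroke starts
        lastY = y;
    }

    public void PointerMove(int x, int y)
    {
        // Record a line segment from the last point to the new one and
        // trigger a repaint (e.g. via Invalidate), then update the
        // last position (assumed drawing approach).
        lastX = x;
        lastY = y;
    }

    public void OnPointerUp(int x, int y)
    {
        // stroke finished; nothing more to do in this sketch
    }
}
```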
To play with touchscreen support and the .NET Micro Framework, you can download my touchscreen sample projects, including an emulator, the button sample, and the paint box sample.
You should also visit emtrion's support area, where you can download their nice touch-enabled HiCO.ARM9 emulator and sample code.
Soon I will post how to use a mouse or joystick with the GHI Electronics EmbeddedMaster hardware to interact with WPF applications.