You Can't Detect A Touchscreen


Update: I somehow neglected to mention Modernizr.touch, which actually uses the 'Touch APIs' method below. Modernizr no longer claims to detect touch devices – see this discussion.


Whatever you may think, it currently isn't possible to reliably detect whether or not the current device has a touchscreen, from within the browser.

And it may be a long time before you can.

Let me explain why...

Boxed in

The browser environment is a sandbox. Your app's code can only get at the things the browser wants it to, in order to limit the damage a malicious website can cause.

This means that the only information about the system you can get is what the browser exposes to you, in the form of HTML, CSS and JavaScript APIs. To determine if a system supports a certain feature, we can a) see if a certain API is present, or b) see if it actually does the right thing.
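To illustrate the difference between the two approaches, here's a minimal sketch (the function names are mine, and the window-like argument is just to keep the sketch self-contained):

```javascript
// (a) Presence check: is the API there at all?
function supportsMatchMedia(win) {
  return typeof win.matchMedia === 'function';
}

// (b) Behaviour check: does it actually do the right thing?
function matchMediaWorks(win) {
  if (!supportsMatchMedia(win)) return false;
  // 'all' must always match in a conforming implementation
  return win.matchMedia('all').matches === true;
}
```

A presence check is cheap but only tells you an API exists; a behaviour check tells you whether it actually works, which is the distinction the rest of this article hinges on.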

Historically, two browser features have been used for "touchscreen detection": media queries and touch APIs. But these are far from foolproof.

Walk with me.

Device width media queries

Mobiles have small screens and mobiles have touchscreens, so small screen equals touchscreen, right?

var hasTouch = window.matchMedia('(max-device-width: 320px)').matches;

So, so very wrong. Large tablets and touchscreen laptops/desktops have clearly proven this wrong. Plus thousands of older mobile handset models had small non-touch screens. Unfortunately, sites applying the mantra "If it’s a small screen, it’s touch; if it’s a big screen, it’s mouse-driven" are now everywhere, leaving tablet and hybrid users with a rubbish experience.

Touch APIs

This is the method that Modernizr.touch uses, by the way.

If the browser supports events like touchstart (or other events in the Touch Events API spec) it must be a touchscreen device, right?

var hasTouch = 'ontouchstart' in window;

Well, maybe. The problem is, no one ever said that a non-touch device can't implement touch APIs, or at least have the event handlers in the DOM.

Chrome 24.0 shipped with these APIs always-on, so that they could start supporting touchscreens without having separate "touch" and "non-touch" builds. But loads of developers had already used detects like the example above, so it broke a lot of sites. The Chrome team "fixed" this with an update, which only enables touch APIs if a touch-capable input device is detected on start-up.

So we're all good, right?

Not quite.

An API for an API

The browser is still quite a long way from the device itself. It only has access to the devices via the operating system, which has its own APIs for letting the browser know what devices are connected.

While these APIs appear to be fairly reliable for the most part, we recently came across cases where they gave incorrect results in Chrome on Windows 8... reporting the presence of a touchscreen ("digitizer") when no touchscreen was connected.

Firefox also does some kind of similar switching and it appears to fail in the same cases as Chrome, so it looks like it might use the same cues – although I can't profess to know for sure.

It appears certain settings and services can mess with the results these APIs give. I've only seen this in Windows 8 so far, but theoretically it could happen on any operating system.

Some versions of BlackBerry OS have also been known to leave the touch APIs permanently enabled on non-touch devices too.

So it looks like the browser doesn't know with 100% confidence either. If the browser doesn't know, how can our app know?

Drawing a blank

Assuming the presence of one of these touch APIs did mean the device had a touchscreen... does that mean that if such a touch API isn't present then there definitely isn't a touchscreen?

Of course not. The original iPhone (released in 2007) was the first device to support Touch Events, but touchscreens have been around in one form or another since the 1970s. Even recently, Nokia's Symbian browser didn't support touch events until version 8.2 was released last year.

IE 10 offers the (arguably superior) Pointer Events API on touch devices instead of the Touch Events spec, so it would return false for the ontouchstart test. Checking for e.g. 'onmspointerdown' in window doesn't help either: pointer events fire for mice, pens and other pointing devices too, not just touch. IE 10 does, however, offer navigator.maxTouchPoints, which can be used instead. Thanks to @jacobrossi for correcting me.
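A combined check covering both APIs might look something like the sketch below — subject to all the reliability caveats above, and taking window-like and navigator-like objects as arguments purely so the sketch is self-contained:

```javascript
// Hedged sketch: combine the Touch Events presence check with
// IE 10's maxTouchPoints (vendor-prefixed as msMaxTouchPoints
// in early builds). A true result still doesn't guarantee a
// touchscreen, for all the reasons discussed above.
function looksTouchCapable(win, nav) {
  return ('ontouchstart' in win) ||
         (nav.maxTouchPoints || nav.msMaxTouchPoints || 0) > 0;
}
```

Even this only widens the net; it does nothing about false positives from the OS, or touch devices whose browsers expose neither API.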

Neither Safari nor Opera has implemented either touch API in their desktop browsers yet, so they'll draw a blank on touch devices too.

Without dedicated touch APIs, browsers just emulate mouse events... so there are loads of devices kicking about with touchscreens which you simply can't detect using this kind of detection.

It's dynamic, Jim

A touchscreen could be connected as a peripheral to an otherwise non-touch laptop, or a KVM switch could switch to/from a touchscreen from a non-touch screen. This could happen at any time during a browser session.

Browsers can't just add and remove APIs while an app is running – that would cause chaos – so following a change in the set of connected devices, this kind of feature detect could start to fail.

Poke it

I said earlier that another way to test for features is to see if the APIs actually do what they're supposed to...

var hasTouch;
window.addEventListener('touchstart', function setHasTouch () {
    hasTouch = true;
    // Remove event listener once fired, otherwise it'll kill scrolling
    // performance
    window.removeEventListener('touchstart', setHasTouch);
}, false);

This is more reliable than simply seeing if the event handler exists in the DOM: unless a browser massively violates the spec, the event will only get fired if a touch-capable device interacts with the browser.

However, it comes with 3 massive caveats:

  1. It requires interaction before you can know the result
  2. If no touch interaction occurs, you don't know that there isn't a touchscreen – just that the user isn't using it (yet)
  3. The event still won't fire for browsers which don't support the Touch Events API... which is a lot of them

This may be good enough for some use cases, but for any applications involving tailoring the layout, the UI would shift when you prod it, which is a fairly horrible user experience.

Pointer media queries

These were added in the Media Queries Level 4 spec. They've only been partially implemented in WebKit and aren't in any stable browser builds yet.

var hasTouch = window.matchMedia('(pointer: coarse)').matches;

I had a rant about the finer details of this spec before, but it has actually got the potential to be used for a reliable feature detect. Watch out though: it isn't a detect for a "touchscreen" as such... but rather any coarse pointing device.

Being a media query, it's naturally dynamic: the result will reflect the devices connected at any given instant.
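In a browser that implements the interaction media features, you could even react to changes at runtime — a sketch, using the addListener API of the era (the function name is hypothetical, and the window-like argument keeps it self-contained):

```javascript
// Watch for changes in pointer coarseness, e.g. a touchscreen
// being plugged in mid-session. Returns the initial state and
// calls onChange whenever the query result flips.
function watchCoarsePointer(win, onChange) {
  var mql = win.matchMedia('(pointer: coarse)');
  mql.addListener(function (e) { onChange(e.matches); });
  return mql.matches;
}
```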

However, it still depends on reliable data from the OS APIs.

It's also unclear when (or even if) other browsers will implement this feature... the spec is still very much in flux at the W3C. Until it sees wide adoption, browsers which don't support these media queries will be as "undetectable" as they are now.

You're doing it wrong

In my opinion, if you're trying to "detect a touchscreen" in the first place, you're probably making some dangerous assumptions. I'll go over a few reasons why you might think you want to do this, and point out the flaws.

Finger-friendly layouts

Fat fingers are less accurate than mice, so it sounds like it makes sense to adapt our layout for touchscreens: larger controls, more space between them, etc.

But are touchscreens the only input devices which exhibit poor pointing accuracy?

What about gesture remotes for smart TVs, the Wiimote, or finger-tracking technology like Leap Motion?

If future-proofing against these kinds of devices matters to you, don't assume that you should only serve a spacious layout if you detect a touchscreen.

Events & interactions

So you want to set up swipe gestures for your carousels and map widgets? Cool. But don't think that means you don't also need to support mouse and keyboard interactions.

Users with sight impairments often connect keyboards and pointing sticks to their smartphones, and a lot of devices support both mouse and touch simultaneously... you can't assume they don't want to use their mouse/trackpad/keyboard.

I'd strongly recommend implementing both interaction methods together, in which case you don't need to specifically detect a touchscreen.
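As a rough sketch of what "side by side" might look like (element and handler names are mine; the flag guards against the emulated mouse events browsers fire after a touch):

```javascript
// Bind touch, mouse and keyboard activation together on one
// element, so no input method is locked out. The 'touched' flag
// suppresses the synthetic click browsers dispatch after touchend.
function bindActivate(el, activate) {
  var touched = false;
  el.addEventListener('touchend', function (e) {
    touched = true;
    activate(e);
  });
  el.addEventListener('click', function (e) {
    if (touched) { touched = false; return; } // skip emulated click
    activate(e);
  });
  el.addEventListener('keydown', function (e) {
    if (e.key === 'Enter' || e.key === ' ') activate(e);
  });
}
```

Note there's no touchscreen detect anywhere in there: every input method just works, whichever the user happens to reach for.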

This article by Patrick Lauke goes into more detail on why (and how) you should implement mouse and touch events side by side. It's well worth a read (if you'll excuse his initial claim that you can reliably detect touchscreens...)

Handling hover states

Touchscreens (currently) can't convey hover states, so it'd be nice to adjust our UI for touchscreens so that they can still get around.

Of course a keyboard can't hover either. It's probably just best to avoid relying on hover states in the first place – use them for embellishments.

So what should I do?

Edit: This summary distracted from this article's original message, which was "be careful, you may not get the results you think you're getting". If you're aware of the risks of these detection methods and the assumptions they imply, of course it's your decision whether or not to use them anyway. However, if you're not sure, or your brief is "to support every device", the following advice may be useful.

For layouts, assume everyone has a touchscreen. Mouse users can use large UI controls much more easily than touch users can use small ones. The same goes for hover states.

For events and interactions, assume anyone may have a touchscreen. Implement keyboard, mouse and touch interactions alongside each other, ensuring none block each other.

Or, as I suggested in my media queries article, you could just ask.