(PROOF OF CONCEPT)
Role/Contributions: Designer, Concept
Tools & Technology: Adobe Illustrator, Adobe Photoshop
Date (Year): 2016
Upwards of 280 million CAPTCHAs (the most common submission/comment verification tool) are solved across the web on any given day. Common verification methods include:
- Identifying words, numbers and letters from a static image
- Identifying words or numbers on scanned images or photographs
- Identifying thematically similar images from a collection
- Identifying words or numbers from an audio recording
These methods are generally better suited to laptops, desktops, and other stationary, in-home usage. On smartphones, limited screen space can cause verification images to render at lower-than-optimal resolutions, while volume limitations and poor audio clarity (particularly in loud public spaces) make audio verification inconvenient. These friction points decrease engagement and may negatively impact interaction with sites that employ an existing verification UI.
Up to two billion smartphone users encounter online verification tools every week, making their use practically ubiquitous.
Site owners require a tool that protects their communities and services from spammers, bots, and other non-human or non-genuine users. Gesture-based Verification aims to improve engagement among smartphone users by improving the form of that interaction.
Using only the common tactile language of smartphones – specifically, the touchscreen and gestures – the Gesture-based Submission Verification UI should increase the tool's ease of use and improve interaction across sites that employ it.
Additionally, some existing forms of verification tools perform poorly in accessibility tests. Low-resolution photographs, insufficient volume or clarity in audio samples, and glyphs displayed at smaller-than-optimal sizes make the existing tools difficult to use for a significant proportion of users. Gesture-based Verification should improve accessibility in as many of these cases as possible.
Gesture-based Verification acts as an add-on for sites that need to confirm their users are individual human beings rather than automated bots (such as sites with blogging, submission, and contribution components). Users submit their content and are directed to the gesture-verification overlay, which consists of four components: an instruction, a tracing guide, a submit button, and a cancel button.
As with verification tools in general, the traced gesture is submitted to a back-end process for validation. Multiple invalid submissions trigger an alternative verification process (image, audio, or email confirmation). Otherwise, users complete the trace of a simple gesture and are moved through to a confirmation screen and then to a destination screen (if any).
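The submit/validate/fallback flow above can be sketched in a few lines. This is a hypothetical illustration only – the function and constant names (`verify_submission`, `trace_matches`, `MAX_ATTEMPTS`) and the retry limit are assumptions, not part of the actual product:

```python
MAX_ATTEMPTS = 3  # assumed limit before switching to an alternative check


def trace_matches(trace, expected):
    # Placeholder comparison: the traced point sequence equals the
    # expected path. A real matcher would allow positional tolerance.
    return list(trace) == list(expected)


def verify_submission(trace, expected, attempts=0):
    """Return the next state for a traced gesture: 'verified',
    'retry', or 'fallback' (image, audio, or email confirmation)."""
    if trace_matches(trace, expected):
        return "verified"   # proceed to confirmation, then destination
    if attempts + 1 >= MAX_ATTEMPTS:
        return "fallback"   # too many invalid traces
    return "retry"          # show the tracing guide again
```

The point of the sketch is the three-way outcome: success moves forward, a single failure re-prompts, and repeated failures hand off to an alternative verification method rather than locking the user out.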
On smartphones and other smaller handheld devices (including "phablets"), the verification UI manifests as an overlay. Cancelling or backing out of the overlay returns the user to the comment or submission form that originally triggered the call for verification.
On tablets and larger touchscreen devices, the verification UI can manifest in-line, consistent with the behavior of existing verification tools.
The Gesture-based Verification UI builds on gesture patterns already common on smartphones: unlocking devices, swipe-to-type keyboards, scrolling up, down, and sideways, swiping, deletion, and so on. The sampling below represents only the smallest fraction of the possible gesture shapes:
(examples of gestures)
Conceivably, the smallest possible gesture connects two points on the screen. Treating each point as a binary option (touched or not), the 5x6 field of thirty points yields 2^30 = 1,073,741,824 possible on/off patterns; even after excluding the empty pattern and the thirty single-point patterns, over a billion candidate gestures remain, and accounting for stroke order would raise the number further.
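As a quick check of that estimate, the arithmetic can be worked directly. This is a back-of-envelope model of on/off point patterns, not the product's actual gesture space:

```python
# 5x6 grid of touch points, each either touched or not.
points = 5 * 6
patterns = 2 ** points          # all on/off combinations
print(patterns)                 # 1073741824

# A gesture needs at least two points, so drop the empty pattern
# and the 30 single-point patterns.
usable = patterns - 1 - points
print(usable)                   # 1073741793
```

Even this deliberately conservative model (which ignores the order in which points are traced) leaves more than a billion distinct patterns available for verification.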