The Next Step in Responsive Design & Development

Karthikeya GS
5 min read · Jul 9, 2020
Responsive layouts

Over the years, we have had a lot of variation in phone screen sizes: from the dawn of the original iPhone to palm-size phones to phablets, and the introduction of tablets and hybrid laptops is not to be forgotten either. Over the course of these changes, developers and designers have pioneered responsive layouts. Frameworks that enable flexible layout development, website builders with drag-and-drop editing for web and phone views, and similar tools have taken responsive design to an exciting level. But was responsive design made only to cater to variation in screen resolution? Is there an addition, small or big, that could open new doors for developers and designers?

And so, this question became the plot for my next venture in self-taught UI/UX design. Over the course of designing mobile applications, I have gradually tried to understand the circumstances in which a user might open an application. Boiled down to a simpler question: how often does the user have only a single hand free to operate an application? It turns out that 87% of users interact with their phone using the right hand, irrespective of the situation they are in. And, consciously or subconsciously, I have been designing applications for right-handed users.

Icon arrangement in general UI
*The Orange Icons

Now, what does that mean?

Consider the ‘Know More’ text, the arrow (‘>’), the three ‘option dots’, or the ‘share’ icon in the *image above. The one thing they have in common is that we almost always see them on the right half of the screen, at the extreme right, to be precise. That’s because designers want you to learn more about the article or product, or share something from the app, so they make those controls more accessible (keeping in mind that 87% of users are right-handed).

So, what’s the point of all this again?

I believe a good designer tries to accommodate as many users as possible (including the 13% of left-handed users and the occasional left-hand usage by everyone else), and so the next step in responsive design is to consider the ‘hand of use’: eliminating the stretch that left-handed users go through every time they use an application designed for right-hand usage.

This can be done in a few ways. I haven’t settled on the most optimal implementation, and perhaps I won’t be able to; that is a matter for later research. I’m breaking it into two steps: first, recognizing the hand of operation, and second, switching the layout orientation.

The Recognition

Accelerometer data from a test phone

The graph plots accelerometer readings in G (multiples of gravitational acceleration). The experiment was pretty simple: the phone was held in one hand while a regular ‘scroll-up’ gesture was performed. Pay close attention to Accelerometer-x.
Until around 33 seconds (on the x-axis), the phone was held in the right hand. After that, it was held in the left hand. Do you see a pattern?

Observations

  1. Your primary hand applies more pressure on the screen while interacting, so every scroll is recorded distinctly: each spike (trough) indicates a scroll. This also implies that the secondary hand doesn’t apply as much pressure (observe the graph from 35 seconds onward; it no longer records distinct spikes).
    P.S. Pressure here means the force applied to the phone, which makes it move a little in our hands.
  2. A tilt of the phone towards the hand of use is observed very distinctly: the tiny blue trace shifts from negative x values (right-tilt) to positive x values (left-tilt). That is, when users hold the phone in the right hand, they tilt it to the right when they want to interact with it. The tilt is a subconscious move to reduce the stretch of the thumb (see the code sketch after this list).
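To make the tilt observation concrete, here is a minimal Kotlin sketch of how it could be read on Android using the standard SensorManager API. The smoothing factor and tilt threshold are illustrative placeholders, not calibrated values, and the sign convention follows the graph above (negative x for right-tilt).

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Hypothetical detector based on observation 2: a sustained tilt along the
// x-axis hints at which hand is holding the phone.
class HandTiltDetector(context: Context) : SensorEventListener {

    enum class Hand { LEFT, RIGHT, UNKNOWN }

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private var smoothedX = 0f

    var currentHand: Hand = Hand.UNKNOWN
        private set

    fun start() {
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_UI)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        // Low-pass filter: ignore the short spikes caused by scrolling
        // (observation 1) and keep only the sustained tilt (observation 2).
        smoothedX = 0.9f * smoothedX + 0.1f * event.values[0]
        currentHand = when {
            smoothedX < -TILT_THRESHOLD -> Hand.RIGHT // right-tilt: negative x
            smoothedX > TILT_THRESHOLD -> Hand.LEFT   // left-tilt: positive x
            else -> currentHand // keep the last confident guess
        }
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit

    private companion object {
        const val TILT_THRESHOLD = 1.5f // m/s², placeholder value
    }
}
```

The low-pass filter is the key design choice here: it separates the brief scroll spikes from the steady tilt, so the two observations don’t interfere with each other.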

From these two observations, we can fairly reliably determine the hand of use (left or right). But we cannot completely rely on this data. Here’s where finger gestures come into play. Have you ever observed how the thumb moves over the screen during interaction? It’s not a straight line, it’s an arc! The thumb curves towards the hand of use. In a way, we ourselves are telling the phone which hand is in use through the thumb’s movement. Did we ever think of that? That’s the beauty of interaction: there are many layers to it!

Scrolling animation on a phone
Vertical Scrolling Gesture — Observe the Arc
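The arc itself can be measured from the touch path. Below is a hedged Kotlin sketch that records a scroll gesture and checks which way the stroke bows relative to the straight chord between its endpoints. Which bow direction maps to which hand is an assumption here and would need calibration against real gestures.

```kotlin
import android.view.MotionEvent
import kotlin.math.abs

// Hypothetical classifier for the "thumb arc": during a vertical scroll the
// touch path bows sideways toward the thumb's pivot.
class ThumbArcClassifier {

    enum class Hand { LEFT, RIGHT, UNKNOWN }

    private val xs = mutableListOf<Float>()
    private val ys = mutableListOf<Float>()

    // Feed every touch event in; a classification is returned on finger lift.
    fun onTouchEvent(event: MotionEvent): Hand? = when (event.actionMasked) {
        MotionEvent.ACTION_DOWN -> { xs.clear(); ys.clear(); record(event); null }
        MotionEvent.ACTION_MOVE -> { record(event); null }
        MotionEvent.ACTION_UP -> { record(event); classify() }
        else -> null
    }

    private fun record(event: MotionEvent) {
        xs += event.x
        ys += event.y
    }

    private fun classify(): Hand {
        val dy = ys.last() - ys.first()
        // Ignore taps and mostly-horizontal swipes; 200 px is a placeholder.
        if (xs.size < 5 || abs(dy) < 200f) return Hand.UNKNOWN
        // How far does the stroke's midpoint bow away from the straight
        // chord between the first and last touch points?
        val mid = xs.size / 2
        val t = (ys[mid] - ys.first()) / dy
        val chordX = xs.first() + t * (xs.last() - xs.first())
        val bow = xs[mid] - chordX
        return when {
            bow > BOW_THRESHOLD -> Hand.RIGHT // assumed mapping, to be verified
            bow < -BOW_THRESHOLD -> Hand.LEFT // assumed mapping, to be verified
            else -> Hand.UNKNOWN
        }
    }

    private companion object {
        const val BOW_THRESHOLD = 12f // px, placeholder
    }
}
```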

The accelerometer and finger movement are two parameters the phone already keeps track of at every instant; the services reading them run in the background like daemon processes (hey, Linux users!). So performing operations on these two signals shouldn’t noticeably reduce performance.

The Layout Orientation Switch

Layout responding to a change in the hand of use
Responsiveness to the hand of use

Here starts the fun part! Once we know which hand the user is using, we can change the application’s UI layout to ease accessibility for that hand. As all our interfaces cater to right-hand usage, we now need to re-position key UI elements for left-hand usage. This step requires an article of its own, but a small sketch below shows the basic idea.

Note: Not every element needs a re-positioning and this would be the crux of the next article.
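As a taste of what the switch could look like, here is a minimal Kotlin sketch that mirrors a single key control (a bottom-corner action button) using the detector from the earlier sketch. A real implementation would animate the transition and re-flow more elements; gravity-based mirroring is just one possible approach (flipping the view’s layoutDirection is another).

```kotlin
import android.view.Gravity
import android.view.View
import android.widget.FrameLayout

// Hypothetical re-positioning of one key control based on the detected hand.
// Only the horizontal anchor changes; everything else stays as designed.
fun repositionForHand(actionButton: View, hand: HandTiltDetector.Hand) {
    val params = actionButton.layoutParams as? FrameLayout.LayoutParams ?: return
    params.gravity = when (hand) {
        HandTiltDetector.Hand.LEFT -> Gravity.BOTTOM or Gravity.START // left thumb
        else -> Gravity.BOTTOM or Gravity.END // default right-hand layout
    }
    actionButton.layoutParams = params // triggers a re-layout
}
```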

The outcome of this article can be applied to other use-cases as well. The goal of any application is to be usable, accessible, and performant, and ease of interaction is the foundation of all three.

There it is: hopefully, the next step in responsive design and development!
