Cleaning up Quora


A few years ago, Quora made an interesting design decision.

Before then, Quora was the question/answer platform that made it incredibly easy for folks to ask a question and receive intelligent, thoughtful replies.

The problem was that it was too easy.

Off to the right, there was a form where users could enter a title for their question, add question details, and immediately post it. As you scrolled through your homepage feed, the position of this form would remain fixed so you could ask a question whenever curiosity struck.

Naturally, over time, the frequency of racist, sexist, misinformed, and otherwise troll-ish questions increased. Furthermore, people asked the same questions over and over, flooding the site with duplicates and further reducing the quality of the Q&A experience Quora had become known for.

What was Quora to do?

The first thing they did was remove the question form. They made it tougher to ask a question. The next thing they did (which I think was brilliant) was to couple the question form and the search bar together. This way, those who wanted to ask a question were forced to type it into the bar and look at the search results first. Question asking became searching and searching became question asking. This cut down on duplicate questions.
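
To make the pattern concrete, here is a minimal sketch of how a search-before-ask flow could be wired up. It is purely illustrative: the /api/search endpoint, searchQuestions, and onQueryChange are my own hypothetical names, not Quora’s actual implementation.

```typescript
// Illustrative sketch of a "search before you ask" flow.
// The /api/search endpoint and all names here are hypothetical, not Quora's API.
interface Question {
  id: string;
  title: string;
}

async function searchQuestions(query: string): Promise<Question[]> {
  const res = await fetch(`/api/search?q=${encodeURIComponent(query)}`);
  return res.json();
}

// One text box drives both searching and asking: existing questions are
// always fetched and shown first, and the "ask this as a new question"
// option only appears afterwards, steering users away from duplicates.
async function onQueryChange(
  query: string
): Promise<{ duplicates: Question[]; canAsk: boolean }> {
  const duplicates = await searchQuestions(query);
  return { duplicates, canAsk: query.trim().length > 0 };
}
```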

In addition, Quora instituted a character limit on question details. People could no longer post pages detailing their life story; they were forced instead to be succinct, clear, and concise. This was Quora’s solution to cleaning up the quality of questions on their site.

As for preventing troll questions, Quora instituted a strict policy: users would be warned and subsequently have their question-asking or anonymity privileges revoked if their questions were deemed inappropriate.

I don’t have data on how these changes affected Quora’s web traffic. My guess is that it dropped, but I’m sure they were okay with that. They would rather have a higher-quality user base and content than become another Reddit, and for that, I applaud them.

One Misplaced Restroom Sign


I was visiting Pompeii, Italy about a year ago and found myself grabbing a quick lunch in its visitor cafeteria.

I wanted to find the restroom, so I looked around until I spotted a sign with the word “toilette” hanging from the ceiling, with an arrow pointing to the left. The arrow pointed towards a closed door, so I tried the doorknob and opened the door into what appeared to be a small administrative area with video feeds and files everywhere. The person at the desk looked up at me.

Embarrassed, I muttered “mi scusi” and quickly shut the door. Looking around, I realized there was a small hallway right next to the administrative door that led me to the restrooms.

Afterwards, as I was eating my lunch, I sat facing the restroom sign and watched as people tried to find where the restrooms were.  A good number of people managed to locate the side hallway without any problem.

However, one particular young boy went up to the same door I had mistaken for the restroom. He tried the doorknob, which was now locked, and waited patiently outside for a few minutes, hoping that whoever was inside would finish and come out. When no one did, he went back to his seat and returned with his father in tow. His father also tried the door and then, like me, discovered the side hallway.

This series of events led me to ponder the placement of the restroom sign and how many people were misled versus how many found the restroom just fine. More importantly, I wanted to ask: when does it make sense to seek a solution to a usability issue, and is there ever a situation where you shouldn’t?

Almost immediately, business considerations come into play. How much will it cost, or how long will it take, to fix a usability issue? The answer must often be weighed against the stakes in play if the issue isn’t fixed.

Consider again the restroom sign. Even if 50% of people are confused by the sign, the loss of time is probably only a few minutes per person. It might result in overall dissatisfaction with the visit if accidents were to occur because of misplaced trust in a sign, or perhaps the workers in the administrative room would be frequently bothered by confused patrons. All in all, the stakes are relatively low, and the cost of changing the sign is fairly low as well: the staff might only need a ladder and a few tools to remove the sign and reposition it.

Next, consider the interface for a fighter jet. Here, the stakes of bad design are much, much higher. Bad design could cost someone’s life, as mission-critical decisions are often made in a split second. There is no room for error, and management would be wise to spend the money and time to fix the design because of the high stakes in play.

As practitioners of UX design involved in various industries, we often have to consider how much it would cost (in both time and money) to fix or implement a feature and weigh that against the consequences of not doing so.

Can intentionally slowing down page load improve user experience?


It is true that we generally want to minimize the time users have to wait, because slower load times equate to a bad user experience. We want to make it fast for users to get to our site or product, do what they want, and leave. We want to optimize our processing so that users feel they are the ones in control, not at the whims of a machine holding them back from being productive.

However, research has shown that it is not necessarily slower response times that drive users crazy; it is inconsistent ones. If clicking a button results in a 5-second delay before something happens, and users come to expect that, then they are generally okay with it. But if that same button executes in widely varying times (e.g., 0.1 seconds, 3 seconds, 10 seconds, 1 second), users find it unbearable and confusing. (The Nielsen Norman Group has written a fantastic article on the various facets of page response times that everyone should read.)

Therefore, I’d like to offer a slightly different take on response times. What if intentionally slowing down that time could improve the user experience? Here is what I mean.

Imagine two scenarios. In the first scenario, you are waiting at the doctor’s office. The doctor comes in. You list out your symptoms and issues. The doctor writes them down, tells you that they know what the issue is, prescribes the appropriate medication or treatment, and sends you on your way. The whole process takes about five minutes.

In the second scenario, you are waiting at the same doctor’s office. The doctor comes in. You list out your symptoms and issues. The doctor writes them down, thinks for a moment, and asks you some follow-up questions.  The doctor checks the part of your body that hurts, performs some basic diagnostic tests, and reassures you during the process. The doctor tells you what the issue is, prescribes the appropriate medication or treatment, takes any final questions, and then sends you on your way. The whole process takes about 20-30 minutes.

Both doctors knew immediately what was wrong from the moment they heard your symptoms.  However, which doctor would you rather see?

For some, getting in and out quickly is the most important factor. However, I would argue that in this case, the latter doctor would be more warmly received by patients because of the extra time he or she spent with them.

Perception matters. Though both doctors ended up getting the diagnosis and treatment correct, the second doctor ensured that you, as the patient, were heard, understood, and felt that you were being cared for. The second doctor was perceived to be more thorough and sensitive than the first.

Or, consider sitting down at a restaurant and ordering some food. The food arrives within ten seconds of your ordering it. Is that a better user experience than a place where the food arrives in fifteen minutes? Perhaps, but you may be tempted to question whether the food at the first restaurant was fresh or flash-frozen. Are you really getting the best quality of food?

In these scenarios, I hope to present counterexamples to the thought that faster is always better.

Similarly, I believe there are times when intentionally slowing down response times a bit can make users feel that the decision-making going on behind the scenes is “working,” or that it is being personalized for their best experience. Consider a web application that relies on a sophisticated machine learning engine to suggest a new movie, or one that uses a very fast algorithm to perform various security checks. Perhaps those calculations finish within milliseconds, but should the results be displayed that quickly?

I argue that they shouldn’t, and that UX designers and engineers might consider elongating that time slightly, or presenting an animation of, say, a cute robot thinking hard about its calculations, to improve users’ perception of the service they are receiving.
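
As a minimal sketch of that mechanic, here is one way to pad a fast asynchronous call to a consistent minimum duration; the getRecommendations call is hypothetical, and the wrapper itself is the point.

```typescript
// Sketch: enforce a minimum perceived duration around a fast async call,
// so results never appear suspiciously instant and timing stays consistent.
function delay(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

async function withMinimumDuration<T>(work: Promise<T>, minMs: number): Promise<T> {
  // Run the real work and a timer concurrently; resolve only when both
  // finish, so fast results are held back but slow ones are never delayed.
  const [result] = await Promise.all([work, delay(minMs)]);
  return result;
}

// Hypothetical usage: pad a millisecond-fast recommendation engine to a
// consistent 800 ms while a "thinking" animation plays.
// const movies = await withMinimumDuration(getRecommendations(userId), 800);
```

The design choice worth noting is that the padding only applies when the work finishes early; a genuinely slow response is never made slower, which also keeps response times more consistent.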

I realize this flies in the face of the conventional wisdom that users hate having to wait, or that faster is always better. I am not suggesting that all response times should be throttled for the sake of improving the user experience. I am merely positing that it is a consideration that warrants more study, depending on what your product or system hopes to convey. These delays walk a very thin line, though. Too long, and users get upset and may leave your site. Too short, and they may doubt whether your product actually works.

When it comes to things where waiting is normal if not expected (e.g. loan pre-approval, background checks, idea generation, mechanical diagnosis), slower can be better.

Can Craigslist UX be improved?


At first glance, the visual design of Craigslist looks to be stuck in the 1990s. It has none of the usual graphical flair, nor does it follow principles of visual hierarchy, information architecture, or color theory. And yet it seems to be doing pretty well for itself, making approximately $380 million in 2015.

Craigslist first took off in the late ’90s and early 2000s, when it became a household name for classified ads, and perhaps that classified-ad heritage informed many of its design decisions. Yet its look has remained largely unchanged for the last 15 years or so, and I was interested in exploring why.

I started by observing users actually use Craigslist and had them engage in a think-aloud protocol as they did two specific tasks. First, I wanted them to find a place they would like to live close to Carnegie Mellon University in Pittsburgh, PA and second, I wanted them to post an ad for a piece of furniture. I ended up recruiting five users (three women, two men), two of whom were first-time users of Craigslist.

My findings and observations fell into six broad categories:

  1. Participants had different ways of switching between the Craigslist sites for different cities. Some searched on Google, some figured out how to spell the city name and typed it into the URL bar, and some used the navigation within the site itself.
  2. Participants found it difficult to match a place they saw at a specific location on the map with the corresponding entry in the subsequent list view.
  3. Participants wanted ways to save and compare ads.
  4. When selling, participants wished they could have some suggestions for how to post a title or how to word the content of an ad.
  5. When selling, participants noted a confusing posting path where they had to first choose the city in which to post and then the category, even though they were already in the seller category.
  6. When selling, participants wished that they could post images on the very same page as the ad itself.

Though these observations were interesting, I wanted to ask another kind of question:

What is the actual problem? What is the real issue facing Craigslist?


My contention comes down to one clear word: TRUST.

Here are some of the actual quotes from my five participants during the observation sessions.

  • “Looks like a fake website.”
  • “Doesn’t seem safe.”
  • “I don’t want people to know where I live.”
  • “Feels dangerous.”
  • “Wow, all the ads look so bad and sketchy.”
  • “I don’t want to put my location.”

Next, I wanted to look at how other classified ad/e-commerce websites dealt with trust issues. I looked at Oodle.com, a similar classified ad website, as well as other major e-commerce sites like eBay, Alibaba, Amazon, and even Airbnb. I made several observations from these sites.

The presence of a user account with an accompanying picture does enhance trust between two parties. Airbnb does a good job by guiding you to take a picture of your face, and it also lets you add further verification documents to increase trust. Furthermore, some sort of badge or rating system can let you know whether you are dealing with trustworthy people or with someone who has consistently bad ratings.
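
To sketch what these signals might look like as data attached to a seller, here is a hypothetical model; every field name and threshold below is my own assumption, not drawn from Craigslist or any of the sites above.

```typescript
// Hypothetical trust signals for a classified-ad seller profile.
// All fields and thresholds are illustrative assumptions.
interface SellerProfile {
  displayName: string;
  photoUrl?: string;      // profile photo, a la Airbnb
  verifiedEmail: boolean;
  verifiedPhone: boolean;
  verifiedId: boolean;    // government ID or similar document check
  averageRating: number;  // 1-5, from past transactions
  ratingCount: number;
}

// A simple badge rule: surface a "trusted seller" badge only when the
// seller is verified and has a consistent track record of good ratings.
function isTrustedSeller(p: SellerProfile): boolean {
  const verified = p.verifiedEmail && p.verifiedPhone && p.verifiedId;
  const wellRated = p.ratingCount >= 10 && p.averageRating >= 4.0;
  return verified && wellRated;
}
```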

I took these ideas and explored whether I could incorporate them into Craigslist while keeping the visual design of the site the same.


This was a fun exercise in reimagining Craigslist while drawing an important distinction between the visual design of a site and its actual user experience. I argued that trust is the most important issue facing Craigslist and decided to make some improvements targeting that specific issue.

So, at the end of the day, is Craigslist UX likely to change?  No.  Can Craigslist UX be improved?  Maybe, but at this point, no. And that is not a bad thing.  Craigslist actually has a phenomenal UX; it’s just not what we normally come to think of as a great UX.  Craigslist allows its users to do what they want and get out. Its strength is that it doesn’t get in your way with too many bells and whistles and instead gives the user the ability to quickly perform his or her desired tasks to satisfaction.

Less is often more when designing great experiences.


The History of Computing


“The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.”

In the late 1980s, Mark Weiser coined the term “ubiquitous computing” and published several papers, culminating in his landmark paper “The Computer for the 21st Century” (1991), whose opening lines are quoted above. He used the term to characterize what he believed was the emerging third age of computing.

As we’ve progressed through the various ages, or generations, of computing, not only have the kinds of technologies changed, but so has the very way humans interact with them.

The first generation started in the mid-1930s with Alan Turing and others who pursued the idea of codifying instructions and creating machines that could follow them. The canonical devices of this period were mainframes, whose people-to-device ratio was many to one: many users would interact with one large machine in an impersonal way. Applications stayed mostly in the realm of scientific computation, wartime cryptography, and, later on, data processing.

Fast-forwarding to the 1960s and 1970s, a second generation of computing emerged, pioneered by visionaries such as Alan Kay, Ben Shneiderman, Ivan Sutherland, Douglas Engelbart, and others: what we now know as the age of personal computing. The capabilities of the mainframe could now be shrunk into a machine that sat on someone’s desk, bringing with it the beginnings of the GUI, the mouse, and windowing systems. The people-to-device ratio was one to one, and these machines helped users perform actions that benefited them personally. Applications ranged from document processing to spreadsheets and database management.

As mentioned above, Mark Weiser and his contemporaries noticed in the late ’80s and ’90s that as the form factor of these devices shrank, more and more things could potentially be considered computing devices. This marked the third generation of computing, which Weiser called “ubiquitous.” An explosion of smaller form factors hit the market: portable laptops, tape storage, compact discs, and, later on, USB drives and cellular phones. The people-to-device ratio was one to many, meaning that one person would own and interact with many of these devices. These “computers” would be so commonplace that they would “disappear,” embedded so deeply into our environment and daily lives that people would not notice them as they had in previous generations. Common applications included human-to-human communication and data transfer.

Researchers tend to agree that we are currently in the fourth age of computing, marked by technologies such as cloud services, crowdsourcing (or social media), and a whole ecosystem of devices (the “Internet of Things”) that can connect and communicate with one another. This marks a people-to-device ratio of many to many, where multiple people can interact with each other and with or through many devices all at once. Some include wearable technology as well, possibly moving from an aggregation of computing on a smartphone to computing on our very bodies with the introduction of health trackers, smart watches, and head-mounted displays.

So, what comes next? Is there something else that should mark the fourth age of computing? And is there a fifth age yet to emerge over the next decade or so?

Current research directions include emerging technologies such as computational skin, the intersection of computing with biology and neuroscience, and user-manufactured computation that would allow people to create and reproduce their own computing devices many times over.

These are visions and only time will tell where and how we choose to take the next step.