Is there a term for "the user can't use anything wrong" design?

  • I'm of the opinion that the user is always using software or hardware correctly and to imply otherwise is rude, condescending, and philosophically wrong. For example, I and everyone I know pulls USB drives out of a computer without bothering to click eject. OS developers should see this and build their software to accommodate this instead of bothering users with "you did that wrong" messages.

    Is this a widely-held view among UX designers/developers? Is there an official term for this philosophy?

    edit: It seems I need to clarify what I mean by "the user can't use anything wrong". I'm not saying that the user should be prevented from using something wrong, but that there aren't any "wrong" ways to use something. If a large percentage of users use a microphone as a hammer (like the Shure SM57 genuinely is), designers should embrace this and improve the hammer capabilities in the next iteration.

    edit 2: I'd like to thank you all for proving my point. I posted here a point (the user can't use anything wrong) that I interpreted one way and you all interpreted another way. My intention was that there are no wrong actions to take, and your overall interpretation was that there are indeed wrong actions, and we should work to prevent these.

    All of you are correct. As the designer of the post, I'm at fault here, and I think you'd agree. I should have made it more clear what I intended the point of this post to be. I have no right to try to argue with any of you about what my intentions are because only the user's interpretation matters. Thank you for such an invigorating discussion!

    What you write about USB drives is, unfortunately, physically impossible. The OS needs to clean things up in the filesystem before the drive is disconnected, and the OS **cannot** know your intentions if you don't warn it. So: what do you do when making sure something can't be done wrongly is impossible?

    This isn't true. A file system can pre-emptively do all of this. And almost all modern operating systems, even Android, do exactly this. The warning messages are there out of habit and in the vain hope it will discourage users from pulling out a memory stick whilst files are being transferred.

    term = "the user is always right" ;) a play on "the customer is always right".

    @JanDorniak then the USB is designed poorly. E.g. it should be locked in until ejected, or have a catch, the release of which triggers the ejection routine.

    Does the term "foolproof" meet your needs?

    @Confused That is simply not true. By default on Windows write caching is **ON** and yanking out the drive even if you think you've finished writing to it *can* and *will* cause your data to become corrupted. I've seen it. It's not "out of habit" or "in the vain hope" - it is the consequence of an actual feature. You can disable write caching though (it's probably called something like "enable fast removal" in your OS).

    I think the mistake here is using USB as an example. USB is hardware, and hardware will always have some physical limitations. You might be able to write pure software this way, but not hardware.

    Another example where the user clearly *is* using it wrong: storing important items in the trash/recycle bin/deleted items/etc. This is actually disturbingly common...

    A user can use a hammer to insert a USB drive. Just because they might doesn't mean they are correct in doing so, and there is no way to prevent it.

    @Marie have you ever seen that happen? I'm arguing that UX designers should observe how users interact with their product and design around that so that bad effects are avoided. Apple had to do this with the iPhone 4 and while they briefly had a PR gaffe with saying "you're holding it wrong", they later designed every subsequent iPhone so that this wouldn't happen. Apple's designers allowed users to hold their phones how they naturally wanted to.

    Well I mean, software can break if used incorrectly, just like the human body. You're basically suggesting that we should apply mind-over-matter to technology. Maybe it's possible to develop the software in such a way, but it would require more time and effort, and your USB stick would become twice as expensive.


    @Fattie I'd highly recommend reading this article and watching the video in it to become more familiar. Saying that a design "babies" a user is quite condescending and a harmful attitude to have.

    @PascLeRasc: Unix is a good example of a system that does not baby the user. You tell Linux to format your boot drive and it will do so. Some tools will not even give you a token warning. Is that user friendly design?

    @RobertFrost Good point. The Macintosh floppy drive was a great illustration of this, as it didn't have a hardware eject button. Disk ejection could only be initiated through the GUI. So unlike other OSs of the time, a disk couldn't physically be ejected until the OS was finished with it.

    Hi (or Bonjour!) @PascLeRasc - I totally don't follow you. The term "babying" is absolutely normal in software and UX development. It is not in the slightest pejorative. Some systems do NOT baby the user (say, Unix) and some systems DO baby the user (say, the buttons that control nuclear launch). Mac babies the user, Windows less so. It is a completely commonplace, non-pejorative term in English and is used ubiquitously.

    I disagree with *"designers should embrace this and improve the hammer capabilities"*, I think there's a difference between stopping people from making errors and actively encouraging people to make the same errors. Even if you could prevent data corruption of USB drives, encouraging people to rip the USB drives out will damage the USB port.

    @icc97 Please read the sentences before the one you quoted. There is no such thing as a user error. Users are always infallible and there are only designer errors. That is my viewpoint.

    I think then you're asking a different question than what everyone thinks (or has answered here). Everyone here (including your accepted answer) is thinking that you're purely talking about avoiding errors (i.e. being defensive), primarily because that's what's in your title. Whereas you seem to be talking about designers actively taking on board how customers use their product and *enhancing* that.

    Ok, I've done my best to answer based on what I think is your alternate meaning, but it means you don't think in terms of 'errors'; I think being 'agile' is closest to what you're describing.

    @PascLeRasc You have a big file (let's say 10 GB) that you're copying to your USB drive. You have a nice progress bar saying it takes, say, 10 minutes to copy. When you're halfway through, you pull the USB drive out of the PC. Are you still infallible in that case?

    "If a large percentage of users use a microphone as a hammer (like the Shure SM57 genuinely is), designers should embrace this and improve the hammer capabilities in the next iteration." If that's what a good UX designer would say, then I don't want to be a good UX designer, because that's one of the silliest and most wrong-headed statements I've ever read.

    @JanDorniak **It is possible for USB drives to do this**. However, the solution is to bypass the filesystem cache, which results in a significant reduction in write performance. It will, however, ensure that nothing is being written to the drive the instant you stop performing any action that writes to it (e.g. as soon as the file copying dialog box disappears). I think Windows actually has an option to do that.
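    The cache-bypassing approach described above can be sketched on a POSIX system by opening the file with `O_SYNC`, which makes every `write()` block until the data reaches the device. This is an illustrative sketch, not any OS's actual implementation; the function name is hypothetical:

    ```python
    import os

    def write_synchronously(path, data):
        # O_SYNC makes each write() wait until the data has reached the
        # storage device, trading write throughput for durability -- the
        # trade-off described in the comment above.
        fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC | os.O_SYNC, 0o644)
        try:
            os.write(fd, data)
        finally:
            os.close(fd)
    ```

    With this mode, the moment the write call returns there is nothing left in the cache to lose, so yanking the drive immediately afterwards is (in principle) safe at the cost of much slower writes.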

    @forest It would be entirely possible to solve this with a little bit of logic added to the software. A program that copies files could simply ask the cache to be flushed at the end of the operation and wait until the cache is done flushing before signaling to the user that the operation is done. All the while the cache works normally for general purpose file accesses. In my experience, Windows does something like this, whereas Linux doesn't. I found this out when using a device that emulates a flash drive, and Linux would wait about 1 minute before flushing the cache unless told to eject.

    @nitro2k01 Linux will do it if the program doing the copying does an `fsync()` before closing the file descriptor. I'm not sure which copying utilities do that. Flushing the cache at the end of each operation is definitely a middle ground, but would still result in some nasty perf hits for certain kinds of operations.
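    The middle-ground flush-at-end-of-copy strategy from the last two comments can be sketched like this (a minimal illustration using POSIX `fsync()`; the function name is made up for the example):

    ```python
    import os
    import shutil

    def copy_then_flush(src, dst):
        # Copy src to dst, then fsync before returning, so that "the copy
        # is done" really means the bytes have left the OS cache for the
        # device. The cache still works normally for all other I/O.
        with open(src, "rb") as fsrc, open(dst, "wb") as fdst:
            shutil.copyfileobj(fsrc, fdst)
            fdst.flush()               # drain Python's userspace buffer
            os.fsync(fdst.fileno())    # ask the kernel to flush its cache
    ```

    A copy dialog built on this would only dismiss its progress bar after `fsync()` returns, which is exactly the behaviour the comment attributes to some copying utilities.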

  • Confused (accepted answer, 3 years ago)

    No. It is not a widely held view among UX designers. Unfortunately.

    Even less so amongst those using SO and considering themselves to be UX Designers.

    I suspect this is mainly because UX design is not a rigorous field, nor do its proponents practice patience and understanding of their potential users. Perhaps even worse, they're seemingly of the belief ideal UX 'design' exists and can be discerned from data, without realising this is done through the subjectivity of themselves and their peers. This compounds because they're often the least qualified to set criteria for analysis, lacking both insight and intuition. Often not valuing these things, at all.

    UX Design is one of the few fields suffering from more issues pertaining to self-selection bias than programming. Quite an achievement.

    @PascLeRasc The reason you're getting so much push-back is that what you're suggesting is too extreme. You can't plan for every possible use of your product and make it good for all of them. If I try to hammer in nails with a wine glass, it is my fault when it breaks, not the fault of the glass blower for not making it useful as a hammer. In that case I, the user, was wrong. When I then complain to the glass manufacturer and they tell me that I was supposed to use the glass for drinking wine and not hammering nails, they aren't being un-empathetic; they're just right.

    @PascLeRasc And people here aren't disputing that we should watch and listen to users to refine our products and make them more usable and intuitive, but there is always a trade off involved and we have to be realistic in our approaches

    Why do you think someone thought your wine glass was a hammer? Could you tweak your design so that it doesn't suggest that it's a hammer?

    @PascLeRasc I've seen people use all sorts of crazy things for purposes they aren't meant for. If you can imagine a stupid way to use an object I bet someone at some point has tried it. Now if a large number of your users report the same confusions (like hundreds of people using a wine glass as a hammer), then yes, you should look into why that would be. But you will always have one off situations where people do something stupid, and those people should be ignored rather than designed around, don't miss the forest for the trees

    @KevinWells While I agree, it's worth pointing out that for some things, being able to use them for things they were not meant for can be a feature rather than a bug.

    @TimothyAWiseman It absolutely can be, for example I'm glad that my flat head screw driver makes for a decent pry bar in a pinch. However not everything can be good for every purpose. To refer back to my first example, a wine glass makes a pretty good cookie cutter if you just want a circle, but makes for a lousy hammer, and even in that case I don't think wine glass makers should try to design them to be better cookie cutters (unless they want that to be a unique selling point to stand out from the market)

    That last paragraph.

    @KevinWells the SE network has a site dedicated to using things for purposes they weren't meant for ;) welcome to

    `This compounds because they're often the least qualified to set criteria for analysis, lacking both insight and intuition.` A trained UXD is the _least_ qualified to set success criteria for a product's experience? Sounds to me like you've been working with the wrong UXDs.

    @plainclothes I'm trying to get you back to one of the main contexts: that UX is not a rigorous field. It's one of the 'macro' contexts, and the one I suspect causes the majority of the issues pertinent to the OP's question.

    It's no more or less rigorous than software engineering IME and rarely the *cause* of these issues. It's usually the business not supporting necessary investment in proper development cycles.

    @PascLeRasc at some point redesigning a tool to prevent misuse comes at the expense of the tool's original function. I've used a million objects as hammers, because all I really needed was a solid and sturdy object I could hit really hard against something else. If you wanted me to stop using a screwdriver as a hammer, you'd have to redesign it to be neither solid nor sturdy, at which point it stops being useful as screwdriver.

License under CC-BY-SA with attribution

Content dated before 7/24/2021 11:53 AM