Is there a term for “the user can't use anything wrong” design?























I'm of the opinion that the user is always using software or hardware correctly, and that to imply otherwise is rude, condescending, and philosophically wrong. For example, I and everyone I know pull USB drives out of a computer without bothering to click eject. OS developers should see this and build their software to accommodate it instead of bothering users with "you did that wrong" messages.



Is this a widely-held view among UX designers/developers? Is there an official term for this philosophy?



edit: It seems I need to clarify what I mean by "the user can't use anything wrong". I'm not saying that the user should be prevented from using something wrong, but that there aren't any "wrong" ways to use something. If a large percentage of users use a microphone as a hammer (as genuinely happens with the Shure SM57), designers should embrace this and improve the hammer capabilities in the next iteration.



edit 2: I'd like to thank you all for proving my point. I posted here a point (the user can't use anything wrong) that I interpreted one way and you all interpreted another way. My intention was that there are no wrong actions to take, and your overall interpretation was that there are indeed wrong actions, and we should work to prevent these.



All of you are correct. As the designer of the post, I'm at fault here, and I think you'd agree. I should have made it clearer what I intended the point of this post to be. I have no right to try to argue with any of you about what my intentions are, because only the user's interpretation matters. Thank you for such an invigorating discussion!


























Tags: user-behavior, user-centered-design






asked Nov 26 at 21:13 – PascLeRasc








  • 77




    What you write about USB drives is, unfortunately, physically impossible. The OS needs to clean things up in the filesystem before the drive is disconnected, and it cannot know your intentions if you don't warn it. So: what do you do when making sure something can't be done wrongly is impossible?
    – Jan Dorniak
    Nov 26 at 22:54








  • 45




    This isn't true. A file system can pre-emptively do all of this. And almost all modern operating systems, even Android, do exactly this. The warning messages are there out of habit and in the vain hope it will discourage users from pulling out a memory stick whilst files are being transferred.
    – Confused
    Nov 27 at 1:13






  • 70




    @Confused That is simply not true. By default on Windows write caching is ON and yanking out the drive even if you think you've finished writing to it can and will cause your data to become corrupted. I've seen it. It's not "out of habit" or "in the vain hope" - it is the consequence of an actual feature. You can disable write caching though (it's probably called something like "enable fast removal" in your OS).
    – Lightness Races in Orbit
    Nov 27 at 12:36






  • 33




    I think the mistake here is using USB as an example. USB is hardware, and hardware will always have some physical limitations. You might be able to write pure software this way, but not hardware.
    – Dave Cousineau
    Nov 27 at 16:49






  • 28




    Another example where the user clearly is using it wrong: storing important items in the trash/recycle bin/deleted items/etc. This is actually disturbingly common...
    – Gordon Davisson
    Nov 27 at 20:48
















13 Answers

















Answer (score 11, accepted)










No. It is not a widely held view among UX designers. Unfortunately.



Even less so amongst those using SO and considering themselves to be UX Designers.



I suspect this is mainly because UX design is not a rigorous field, nor do its proponents practice patience and understanding of their potential users. Perhaps even worse, they seem to believe that an ideal UX 'design' exists and can be discerned from data, without realising that this is done through the subjectivity of themselves and their peers. This is compounded because they're often the least qualified to set the criteria for analysis, lacking both insight and intuition, and often not valuing those things at all.



UX Design is one of the few fields suffering from more issues pertaining to self-selection bias than programming. Quite an achievement.























  • 4




    I want to provide some justification for why I've chosen this perhaps unsatisfying answer. I've had to clarify what I mean many times in this post, and still commenters have said "but what about if someone uses my thing wrong?", which is exactly what I think is a harmful view. It seems that UX, or at least this forum's idea of it, is not empathetic enough with its users.
    – PascLeRasc
    yesterday








  • 12




    @PascLeRasc The reason you're getting so much push-back is that what you're suggesting is too extreme. You can't plan for every possible use of your product and make it good for all of them. If I try to hammer in nails with a wine glass, it is my fault when it breaks, not the fault of the glass blower for not making it useful as a hammer. In that case I, the user, was wrong. When I then complain to the glass manufacturer and they tell me that I was supposed to use the glass for drinking wine and not hammering nails, they aren't being un-empathetic, they're just right.
    – Kevin Wells
    yesterday






  • 7




    @PascLeRasc And people here aren't disputing that we should watch and listen to users to refine our products and make them more usable and intuitive, but there is always a trade-off involved and we have to be realistic in our approaches.
    – Kevin Wells
    yesterday






  • 9




    @PascLeRasc I've seen people use all sorts of crazy things for purposes they aren't meant for. If you can imagine a stupid way to use an object, I bet someone at some point has tried it. Now if a large number of your users report the same confusions (like hundreds of people using a wine glass as a hammer), then yes, you should look into why that would be. But you will always have one-off situations where people do something stupid, and those people should be ignored rather than designed around; don't miss the forest for the trees.
    – Kevin Wells
    yesterday






  • 8




    @TimothyAWiseman It absolutely can be; for example, I'm glad that my flat-head screwdriver makes for a decent pry bar in a pinch. However, not everything can be good for every purpose. To refer back to my first example, a wine glass makes a pretty good cookie cutter if you just want a circle, but makes for a lousy hammer, and even in that case I don't think wine glass makers should try to design them to be better cookie cutters (unless they want that to be a unique selling point to stand out from the market).
    – Kevin Wells
    yesterday


















Answer by formicini (score 88)













Accommodation for every possible user interaction is impossible.



Let's use your example, but switch the USB drive to a whole computer. A user can pull the power cord and expect the computer to turn off safely with all data magically saved to the drive, just like with a USB drive. How should a UX designer prepare for this?





  1. Lock the cord in place so that the user can't yank it out. Hard to maintain and replace, more money required for a feature hardly anyone would want to use when they can just press the power button. Also a lot slower if you need to move multiple computers at once, say, when your company changes its location.


  2. Remove computer caches. Data is never delayed, and you don't even have to press save when updating a component. Computer speed now slows to a crawl. A myriad of security concerns will have to be accommodated as well.


  3. Use a mandatory emergency power source. The user is now forced to buy the manufacturer's UPS/battery and have to pay to get it changed even if they already have a spare at home.


All of the solutions above are worse than a simple note in the manual warning users about the danger of unplugging a running computer.



If you don't expect an electric saw to magically stop running right when it touches your finger, then don't expect computers to do all the work for you. That's why designers and programmers have the acronym RTFM.




















  • 78




    "If you don't expect an electric saw to magically stop running right when it touches your finger" is no longer a valid analogy - see sawstop.com/why-sawstop/the-technology
    – manassehkatz
    Nov 27 at 5:34






  • 33




    I should not have underestimated technology. Still, it falls into solution 1 of my example (SawStop is expensive, required a new table setup, hard to maintain and can't chop wet log) so the analogy is okay. And beside, maybe someday a computer will do all the work for us, you never know.
    – formicini
    Nov 27 at 6:41






  • 1




    My example is just a little demonstration why the extreme approach "user can't use anything wrong" is a bad thing, it would not be right for a lot of people. There's bound to be a limit what users could and couldn't do to computers, and of course that limit depends on a lot of factors which fail-safe is only one of.
    – formicini
    Nov 27 at 12:53






  • 12




    While SawStop is expensive, it's less expensive than the alternative: getting a finger reattached. So it isn't really comparable to locking power cords. Additionally, people don't go around accidentally unplugging computers (sitcoms notwithstanding), whereas they DO accidentally stick their fingers into table saw blades.
    – Draco18s
    Nov 27 at 14:47






  • 3




    I don't see how picking apart my example is an answer. This should have been a comment instead.
    – PascLeRasc
    Nov 27 at 15:11


















Answer by ONOZ (score 66)













Yes, there is a term for this ("the user can't do anything wrong"):



foolproof



But as other answers point out, making something completely foolproof isn't feasible. On Wikipedia I found a quote from Douglas Adams' Mostly Harmless:




a common mistake that people make when trying to design something completely foolproof is to underestimate the ingenuity of complete fools




There is also a term for minimizing what a user can do wrong:



Defensive Design



In Defensive Design you try to design in such a way that users can do the least harm, while not expecting to make it completely foolproof. Some techniques include:




  • Automatic random testing: letting a script feed random inputs to your application, hoping to make it crash (a minimal sketch follows this list)

  • Monkey testing: user testing, but instructing the users either to try to break the system, or to act as oblivious to the system's workings as possible.
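As an illustration of the first technique, here is a minimal random-input (fuzz) testing sketch in Python; parse_age is a hypothetical input handler standing in for whatever function you want to harden:

    import random
    import string

    def parse_age(text):
        """Hypothetical input handler we want to harden against arbitrary input."""
        value = int(text.strip())
        if not 0 <= value <= 150:
            raise ValueError("age out of range")
        return value

    def fuzz(runs=10000):
        """Feed random strings to the handler and report anything other than a clean rejection."""
        failures = 0
        for _ in range(runs):
            junk = "".join(random.choice(string.printable) for _ in range(random.randint(0, 20)))
            try:
                parse_age(junk)
            except ValueError:
                pass  # rejected cleanly; that is the behaviour we want
            except Exception as exc:
                failures += 1
                print(f"unexpected {type(exc).__name__} for input {junk!r}")
        print(f"{failures} unexpected failures in {runs} runs")

    fuzz()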




















  • 1




    There's a rather good "see also" on the wiki page for Defensive Design on the subject of Defensive programming. It describes three rules of thumb for it, the third of which feels most relevant. "Making the software behave in a predictable manner despite unexpected inputs or user actions." The goal of good UX is to present the user with just the thing(s) they want to do, and to make it clear what will happen when they do it.
    – Ruadhan2300
    2 days ago






  • 3




    "Defensive Design" - good one, that seems to be what the OP is asking in this confusing question.
    – Fattie
    2 days ago






  • 2




    I always say "Fool-proof and idiot-resistant". You can make things that even a fool can't screw up, but no matter how you try to make things idiot-proof, the universe can always make a better idiot.
    – Monty Harder
    2 days ago






  • 1




    I'd also recommend another alternative - instead of preparing for everything that could possibly go wrong, allow the user to go back. If it's feasible to implement undo for a functionality, it's probably going to work 9001% better than anything that tries to prevent the problem in the first place. Indeed, this is also used in the USB drive example - NTFS uses transactions exactly to limit the damage caused by unexpected loss of function (e.g. power loss). It cannot prevent data loss, but it can prevent file system corruption, unlike FAT32 (and for good applications, even data corruption).
    – Luaan
    yesterday






  • 1




    You have to admit Defensive Programming requires the programmer to 100%, absolutely, without a doubt, understand the entire system. I've had hilarious shopping cart experiences where I open the developer console and made stores ship to locations that they didn't allow. One time the company shipped it out for no shipping cost because their system didn't know how to handle a country not on their list and I kept insisting it was their fault (it technically is...) Most developers simply do not have wide enough scope of knowledge to do proper defensive programming.
    – Nelson
    yesterday


















Answer (score 44)













User-Centered Design



What you’re describing is a consequence of User-Centered Design (coined by Don Norman himself). I’ve heard this principle expressed as “the user is always right” and “it’s not the user’s fault”.



As has been pointed out, this type of thinking is not common enough, even among UX professionals. The issue is that we’re trying to “fix” user behavior, rather than matching the user’s mental model.



In your example, the user’s mental model is that the flash drive is ready and can be removed if no files are being copied to or from it. Therefore, we should design our software and hardware to match this and to prevent any errors that might occur as a result. Here are a few suggestions to accomplish this:




  1. Never keep an external drive in a dirty state longer than necessary. When writing to the drive is complete, get the filesystem into a state where it can be unplugged safely (a minimal sketch of this appears after the list).

  2. Always show an indication or notification when a drive is in use, such as when a file is being saved (which should also be done automatically!). The system should inform users as to exactly what is happening, so that they know that the drive should not be unplugged yet.

  3. Ideally, USB ports should be redesigned so that it’s possible for the computer to physically hold the device in place; the operating system would then release the drive when it’s safe to be unplugged. This would make these problems impossible. (This is how CD/DVD-RW drives work when a disc is being burned.) I don’t know if this is feasible from an engineering standpoint, but I think it should have been considered during the design process for USB-C.


  4. Undo. In case a drive has been unplugged while in use, make it possible to fix the issue by plugging it back in so that the system can resume exactly where it left off.
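A minimal sketch of suggestion 1, assuming a program writing to a flash drive mounted at a hypothetical path: flush both the application's buffer and the OS's cache as soon as the write finishes, so the window in which an unplug loses data is as short as the OS allows.

    import os

    def write_and_sync(path, data):
        """Write data and push it to the device right away, instead of leaving it
        sitting in write-back caches until the user remembers to eject."""
        with open(path, "wb") as f:
            f.write(data)
            f.flush()             # flush the application's own buffer
            os.fsync(f.fileno())  # ask the OS to flush its cache to the device

    # hypothetical mount point; adjust for your system
    write_and_sync("/media/usb/report.txt", b"quarterly numbers\n")

This doesn't remove the need for consistent filesystem metadata, but it narrows the "dirty" window the suggestion describes.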

























  • 30




    (1) Longer than necessary for what, exactly? If the USB disk is on rotational media, it's entirely possible for tens of seconds of writes to be queued up nearly instantaneously. (3) This is a classic example of fixation on a single goal in disregard of cost, other failure modes, user convenience/frustration, and even safety (see MagSafe), unfortunately far too common in UX design.
    – chrylis
    Nov 27 at 5:52






  • 4




    @chrylis And if the software doesn't show some indicator that the data was only enqueued and not yet written it's rubbish. And if there is a point during the file transfer so that the file system breaks when you interrupt the transfer at that point, then the file system is rubbish. I agree on (3) because for USB drives it makes sense to interrupt a transfer by pulling it out.
    – Nobody
    Nov 27 at 19:26








  • 7




    @Nobody FAT is a pretty lousy filesystem by modern standards. You won't find much disagreement about that. However, it's a fact of life and a design constraint.
    – chrylis
    Nov 27 at 19:30






  • 1




    Yes, this is the correct answer
    – Fattie
    2 days ago






  • 4




    "Foolproof" is not a consequence of "User Centered Design". On the contrary, achieving a foolproof state often means that you have to decrease the usability in other scenarios. I don't recall Norman having said that, and it's not in the Youtube video either. ONOZ answer, in my view, is to the point. formicini give a good example in his answer. I think it's what chrylis means in his comment, but I'm not sure so I leave my 2 cents as well.
    – Albin
    yesterday




















Answer by Kit (score 38)













I wonder if the concept you are looking for is Poka-yoke (https://en.wikipedia.org/wiki/Poka-yoke). This is often more associated with mechanical design (e.g. zoo cage double doors which can't both be open at the same time) but you can make an analogy with UX design (e.g. don't offer a delete button when there is nothing available to delete).
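In software, the same idea amounts to only offering actions that are valid in the current state. A minimal sketch, with hypothetical names rather than any particular UI toolkit:

    def available_actions(selection, clipboard):
        """Build the action list from the current state, so an impossible action
        never appears - the software equivalent of the interlocked zoo doors."""
        actions = ["New item"]
        if selection:
            actions += ["Delete", "Copy"]  # no selection, no delete button
        if clipboard:
            actions.append("Paste")
        return actions

    print(available_actions(selection=[], clipboard=None))    # ['New item']
    print(available_actions(selection=["a"], clipboard="x"))  # ['New item', 'Delete', 'Copy', 'Paste']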




















  • 2




    I like this, thanks. That's a great example about the zoo double doors - it illustrates perfectly how the user shouldn't be able to be at fault.
    – PascLeRasc
    Nov 27 at 15:24










  • @PascLeRasc or is it pandering to the lack of common sense...
    – Solar Mike
    Nov 27 at 15:32






  • 9




    @SolarMike It's pandering to the bottom line. Lack of common sense is a fact of nature. You can either let people make mistakes, at peril of profits (or safety!) when an error is eventually made, or you can engineer the job so that they cannot mess it up.
    – J...
    Nov 27 at 19:10








  • 13




    @SolarMike it's as if you've never heard of Murphy's Law. Or NASA.
    – Confused
    Nov 27 at 19:12


















Answer (score 15)













This is a common UX design principle. The best error message is the one that avoids the need for an error message in the first place. There are many examples of design principles out there, but no standard set.



Jakob Nielsen used the term “Error Prevention” in his 10 usability heuristics.
https://www.nngroup.com/articles/ten-usability-heuristics/



"Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action."



Apple refers to it as “User Control” in their iOS guidelines:
https://developer.apple.com/design/human-interface-guidelines/ios/overview/themes/



"The best apps find the correct balance between enabling users and avoiding unwanted outcomes."























  • 1




    Joel Spolsky (praise be) wrote a pretty good article in his blog about this
    – Ruadhan2300
    2 days ago






  • 1




    Or to improve on that, only report error messages which direct the user to how to solve the problem. "File streaming error" isn't a good error message if the actual problem is "Lost internet connection whilst downloading file", just for an example.
    – Graham
    2 days ago










  • Well, you can't have error messages if the mouse is charging and the USB port is under the mouse... (geek.com/wp-content/uploads/2015/10/magic_mouse_2_charging.jpg)
    – Ismael Miguel
    20 hours ago


















Answer (score 5)













Just approaching this question from an analytical perspective, you'll see this mentality in some UX environments and not in others. If users are heavily limited with regard to what they can do, you'll see more preference for UX that follow the principles you describe. The more freedom users are permitted, the less popular these principles are.



I wouldn't say it's a real name for this effect, but I'd call it "with great power comes great responsibility."



This is the issue with the USB example which has shown up several times in this thread. A user who can physically modify hardware has a remarkable amount of freedom. They have great power over the system, and thus they have more responsibility for what happens. Sure, I can make a USB device which locks in place until files are done copying. That will work as long as you limit their power to gentle tugs on the hardware along the axis of the USB device. A user with a Sawzall can most definitely do something wrong to my USB device if they aren't responsible enough and aren't aware of what cutting a USB device in half while it is connected can do.



Let's not even talk about implementing a PSU to meet this Sawzall requirement...



Any system with a compiler has to face this reality. I can and will do something wrong with my compiler. I will break something. I can delete files I wasn't supposed to delete. Heck, I have deleted such files! I even deleted them in parallel with a glorious multithreaded harbinger of doom! It was bad news, and was most definitely "my mistake."



Contrast that with designing an iPhone app. iOS severely limits what users can do and how they can interact with applications, by design. It's the purpose of a good mobile OS. Likewise, app developers often permit very few operations. That keeps your UX simple. In these situations, it's very easy to capture the small range of operations a user can do and prove that the user indeed cannot do anything wrong. In such settings, it makes a lot of sense from a user experience perspective to support this mentality.



In particular, business apps are designed with this in mind. You really don't want to let a low-paid entry level worker make a catastrophic mistake with your app. Point-of-sale devices are designed to make sure you don't accidentally email a credit card number to some malicious agent in a foreign nation. You just can't do it!



So we can see both extremes. In some situations you want to make sure the user really can't do anything wrong. In other situations you can't. I think it's pretty reasonable to say there's no dividing line between the mentalities. It's a smooth spectrum from "the user can't do wrong" to "oh my god, the monkey has a knife!"























  • 6




    iOS severely limits what users can do and how they can interact with the applications by design. It's the purpose of a good mobile OS. - how is that good? That's exactly the reason why I dislike iPhones. I don't want the phone to decide which option should be available for me.
    – Džuris
    2 days ago






  • 1




    @Džuris My dad once characterised the difference between iOS, Windows and Linux as a progression of how much people wanted to be involved in what their computer was doing. iOS users just want to use the applications and Do Things without dealing with a computer, Windows users like a bit more control but ultimately prefer not to think about most of the technical side, and Linux users fear the robot revolution and want to do everything themselves. He was mostly tongue in cheek about it but I think there's a grain of truth there :P
    – Ruadhan2300
    2 days ago






  • 2




    @Ruadhan2300 your dad was getting close, but not quite right. The objective of iOS users is to be seen as the owner of an (expensive and fashionable) high tech device. The objecting of Windows users is to use the computer apps get some "real-world" work done. The objective of Linux users is to get Linux itself to work - actually using it once it does work isn't very interesting ;)
    – alephzero
    2 days ago






  • 1




    @alephzero Can you please stop posting unsubstantive comments?
    – PascLeRasc
    2 days ago






  • 2




    @PascLeRasc It's a relevant reply to another comment. And not untrue either.
    – Graham
    2 days ago


















Answer (score 2)














OS [and all software] developers should see this and build their software to accommodate this instead of bothering users with "you did that wrong" messages.




Yes, you're totally, completely, absolutely correct.



Engineers and companies that do what you say, make huge amounts of money.



Some of the biggest key products of our entire era are totally based on what you describe.




Is this a widely-held view among UX designers/developers?




Yes, it's one of the central ideas.



it is constantly and widely discussed as one of, or the, central issues in UX.



The BMW 7-series was a nightmare since you had to fight and search for every function among literally hundreds of choices, whereas the masterpiece Renault Espace cockpit was user-driven and the epitome of that.




Is there an official term for this philosophy?




Sure, it is



User-driven design



Not 10 minutes ago I was yelling at some people "make it user-driven". They had some switches etc. that "had to be" set by a customer before use, which is a crap idea. Instead I screamed at everyone to make it "Pascal-style". I literally said "Make this user driven, get rid of the fucking switches."



Yesterday I literally dealt the entire workday with precisely the "Pascal issue" in relation to a product and no other issue.



Two years ago I spent four months personally inventing/engineering/whatever a new sort of algorithm for an unusual graphical interface where the entire end result was eliminating two bad "anti-Pascal-style" actions. (The result made zillions.)



Note that to some extent, the everyday phrase



K.I.S.S.



amounts to, basically, a similar approach.





Note - since the "Pascal-issue" is indeed so pervasive, there are



many, many specific terms for subsets of the concept:



For example, in the literal example you gave, that is known as



plug-and-play



or



hot swappable



Note that a company we have heard of, Apple, arguably made some 10 billion dollars from being the first to market with ("more") plug and play printers and other peripherals than the competitors of the time, back before you were born.



So, "plug and play" or "hot swappable" is indeed one particular specific subset of the overall user-driven design, KISS-UX, "Pascal-issue".





























  • I agree with the ideas in this and thanks for the writeup, but it's not really what I'm thinking of. I think what I'm really thinking of is more of an industrial design issue than UX.
    – PascLeRasc
    yesterday


















Answer by Tombo (score 2)













We always called it user-proofing, and it's usually the most time consuming aspect of software development. It's not so much that the user can't do anything wrong, but more that whatever the user does won't crash or break the software. This term dates back to at least 1997 when I started developing professionally, and probably much earlier.
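As a small illustration of the kind of guard this involves, here is a sketch of a prompt that degrades to a helpful message and a fallback rather than a crash, whatever the user types (the prompt and fallback value are made up for the example):

    def ask_for_port(prompt="Port number (1-65535): ", retries=3):
        """Keep asking until the input is usable; never let bad input escape as a crash."""
        for _ in range(retries):
            raw = input(prompt)
            try:
                port = int(raw)
            except ValueError:
                print(f"{raw!r} is not a number, please try again.")
                continue
            if 1 <= port <= 65535:
                return port
            print("That number is outside the valid port range.")
        return 8080  # sensible fallback instead of giving up with an error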































    Answer by Kevin Wells (score 1)













    I'm shocked to see that no one has brought up the fact that everything in design and engineering has a cost. You can always engineer a better version of whatever you're making that covers more use cases and has more features that users want, but every time you do you sacrifice something else. The thing you sacrifice may be literal cost and raise the price or lower profits, or it can be a trade off in some other way.



    To use your example of USB drives being pulled out without ejecting, there are a few associated costs to the different approaches.




    • If you make USB drives lock in place you add manufacturing cost and complexity to both the drives and the ports, and you decrease usability because it makes them more cumbersome to put in or take out. Even if someone could make such a drive, I would never buy it and would continue to buy normal USB drives without locks.


    • If instead you make sure the USB drive is kept in an ejectable state as much as possible, then you will lose performance (since the computer will have to do constant cleanup and restrict writes to short bursts); the timing sketch below gives a rough sense of that cost. Since one of the biggest selling points of flash drives is read/write speed, that also means no one would want to buy it.



    Either way, by trying to cover for this niche UX issue they would lose a lot of potential customers.
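    To give a rough sense of the performance cost described in the second bullet, here is a small timing sketch comparing normal buffered writes with writes that are forced out to the device after every block, the way an always-ejectable drive would need (sizes and paths are arbitrary):

        import os
        import tempfile
        import time

        def write_blocks(path, sync_each_block, blocks=200, block_size=64 * 1024):
            """Write the same amount of data, optionally syncing to the device after every block."""
            start = time.perf_counter()
            with open(path, "wb") as f:
                for _ in range(blocks):
                    f.write(os.urandom(block_size))
                    if sync_each_block:
                        f.flush()
                        os.fsync(f.fileno())
            return time.perf_counter() - start

        with tempfile.TemporaryDirectory() as d:
            buffered = write_blocks(os.path.join(d, "buffered.bin"), sync_each_block=False)
            synced = write_blocks(os.path.join(d, "synced.bin"), sync_each_block=True)
            print(f"buffered: {buffered:.3f}s   synced every block: {synced:.3f}s")

    The synced run is typically much slower, which is exactly the trade-off being described.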



    Basically what I'm saying is that you have to do a cost/benefit analysis and decide which features are worth doing and which are beyond the scope of what you're trying to accomplish. Yes, we should watch and listen to users and find out how to refine our products to be more useful in real world scenarios, but there is always a limit.




















    • 1




      In a technical sense, I'm not sure this is an answer since it doesn't propose a term for the concept, which was technically the question. However, I agree with all of this strongly and it explains why this isn't done more often. A classic example might be a programming language. Scratch is almost fool-proof, but it is slow, limited, and literally made for kids. A general purpose programming language like C++ lets the user do things wrong in innumerable ways, but also gives the user tremendous power. Limiting the things a user can do wrong comes at a trade-off in power or efficiency or both.
      – TimothyAWiseman
      yesterday






    • 2




      Fair point, I suppose my answer is only an extension of the wider discussion about this practice and not a strict answer to the question. Also the programming language example is a great example of what I'm talking about, as is just about any specialist tool
      – Kevin Wells
      yesterday


















    Answer (score 1)













    Falling Into The Pit of Success is a term used in the development community. It's more focused around language or library design, but can be applied to front end interaction also. It's definitely vocabulary I would use when discussing UX with other developers to get them on the same page.
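    One small sketch of the idea as it applies to library design: make the correct call the only one that is expressible, for example by taking an enum instead of a free-form string, so a typo fails immediately instead of silently doing the wrong thing (the names here are illustrative):

        import gzip
        from enum import Enum

        class Compression(Enum):
            NONE = "none"
            GZIP = "gzip"

        def export_report(text, compression=Compression.NONE):
            """Callers cannot pass a misspelled option; the API funnels them into a valid call."""
            payload = text.encode("utf-8")
            if compression is Compression.GZIP:
                payload = gzip.compress(payload)
            return payload

        export_report("hello")                                # safe default, nothing to get wrong
        export_report("hello", compression=Compression.GZIP)  # explicit and checked by Python itself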


































      Answer (score 0)













      User-centric is the broad principle and, IME, it's widely accepted among modern software product teams and many hardware product teams.



      More specifically, I think Activity Centered Design deals directly with this issue. ACD addresses the user's entire workflow and how the product can fit into, augment, or alter that flow.



      ACD changes the perspective from

      "how does the user want this thing to perform a function" to

      "how can we make the user fundamentally more successful at this job".



      If you do ACD (or UCD) without accommodating user "error" then you did it wrong and you need to keep iterating.


































        Answer (score -1)














        I need to clarify what I mean by "the user can't use anything wrong". I'm not saying that the user should be prevented from using something wrong, but that there aren't any "wrong" ways to use something. If a large percentage of users use a microphone as a hammer (as genuinely happens with the Shure SM57), designers should embrace this and improve the hammer capabilities in the next iteration.




        This almost entirely changes the meaning of your title. It goes from being error avoidance to "the bug is a feature".




        1. The closest thing that I can think of is Agile/Lean UX. This is where you have a short feedback loop. You build your product, be it a microphone or a mobile app and get it into the hands of users. Then depending on how they use it you enhance those features.


        2. Also, as far as things being used not for their original purpose - I think the buzzword "pivot" comes in. This is where the microphone folks realise they've built a better hammer by accident and start selling hammers that you sing into.


        3. There's also another similar but related area where you have mistakes that turn out to be extremely useful - serendipitous accidents appears to be a relevant term here. I believe the most famous of these is penicillin, but there's also the discovery of Blu tack in the UK:




        Fleming recounted that the date of his discovery of penicillin was on the morning of Friday 28 September 1928. The traditional version of this story describes the discovery as a serendipitous accident: in his laboratory in the basement of St Mary's Hospital in London (now part of Imperial College), Fleming noticed a Petri dish containing Staphylococci that had been mistakenly left open was contaminated by blue-green mould from an open window, which formed a visible growth. There was a halo of inhibited bacterial growth around the mould. Fleming concluded that the mould released a substance that repressed the growth and caused lysing of the bacteria.







        share|improve this answer





















          Your Answer








          StackExchange.ready(function() {
          var channelOptions = {
          tags: "".split(" "),
          id: "102"
          };
          initTagRenderer("".split(" "), "".split(" "), channelOptions);

          StackExchange.using("externalEditor", function() {
          // Have to fire editor after snippets, if snippets enabled
          if (StackExchange.settings.snippets.snippetsEnabled) {
          StackExchange.using("snippets", function() {
          createEditor();
          });
          }
          else {
          createEditor();
          }
          });

          function createEditor() {
          StackExchange.prepareEditor({
          heartbeatType: 'answer',
          convertImagesToLinks: false,
          noModals: true,
          showLowRepImageUploadWarning: true,
          reputationToPostImages: null,
          bindNavPrevention: true,
          postfix: "",
          imageUploader: {
          brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
          contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
          allowUrls: true
          },
          noCode: true, onDemand: true,
          discardSelector: ".discard-answer"
          ,immediatelyShowMarkdownHelp:true
          });


          }
          });






          PascLeRasc is a new contributor. Be nice, and check out our Code of Conduct.










          draft saved

          draft discarded


















          StackExchange.ready(
          function () {
          StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fux.stackexchange.com%2fquestions%2f122360%2fis-there-a-term-for-the-user-cant-use-anything-wrong-design%23new-answer', 'question_page');
          }
          );

          Post as a guest















          Required, but never shown

























          13 Answers
          13






          active

          oldest

          votes








          13 Answers
          13






          active

          oldest

          votes









          active

          oldest

          votes






          active

          oldest

          votes








          up vote
          11
          down vote



          accepted










          No. It is not a widely held view among UX designers. Unfortunately.



          Even less so amongst those using SO and considering themselves to be UX Designers.



          I suspect this is mainly because UX design is not a rigorous field, nor do its proponents practice patience and understanding of their potential users. Perhaps even worse, they're seemingly of the belief ideal UX 'design' exists and can be discerned from data, without realising this is done through the subjectivity of themselves and their peers. This compounds because they're often the least qualified to set criteria for analysis, lacking both insight and intuition. Often not valuing these things, at all.



          UX Design is one of the few fields suffering from more issues pertaining to self-selection bias than programming. Quite an achievement.






          share|improve this answer

















          • 4




            I want to provide some justification for why I've chosen this perhaps unsatisfying answer. I've had to clarify what I mean many times in this post, and still commenters have said "but what about if someone uses my thing wrong?", which is exactly what I think is a harmful view. It seems that UX, or at least this forum's idea of it, is not empathetic enough with its users.
            – PascLeRasc
            yesterday








          • 12




            @PascLeRasc The reason you're getting so much push-back is that what you're suggesting is too extreme. You can't plan for every possible use of your product and make it good for all of them. If I try to hammer in nails with a wine glass it is my fault when it breaks, not the fault of the glass blower for not making it useful as a hammer. In that case I, the user, was wrong. When I then complain to the glass manufacturer and they tell me that I was supposed to use the glass for drinking wine and not hammering nails, they aren't being un-empathetic, they're just right
            – Kevin Wells
            yesterday






          • 7




            @PascLeRasc And people here aren't disputing that we should watch and listen to users to refine our products and make them more usable and intuitive, but there is always a trade off involved and we have to be realistic in our approaches
            – Kevin Wells
            yesterday






          • 9




            @PascLeRasc I've seen people use all sorts of crazy things for purposes they aren't meant for. If you can imagine a stupid way to use an object I bet someone at some point has tried it. Now if a large number of your users report the same confusions (like hundreds of people using a wine glass as a hammer), then yes, you should look into why that would be. But you will always have one off situations where people do something stupid, and those people should be ignored rather than designed around, don't miss the forest for the trees
            – Kevin Wells
            yesterday






          • 8




            @TimothyAWiseman It absolutely can be, for example I'm glad that my flat head screw driver makes for a decent pry bar in a pinch. However not everything can be good for every purpose. To refer back to my first example, a wine glass makes a pretty good cookie cutter if you just want a circle, but makes for a lousy hammer, and even in that case I don't think wine glass makers should try to design them to be better cookie cutters (unless they want that to be a unique selling point to stand out from the market)
            – Kevin Wells
            yesterday















          up vote
          11
          down vote



          accepted










          No. It is not a widely held view among UX designers. Unfortunately.



          Even less so amongst those using SO and considering themselves to be UX Designers.



          I suspect this is mainly because UX design is not a rigorous field, nor do its proponents practice patience and understanding of their potential users. Perhaps even worse, they're seemingly of the belief ideal UX 'design' exists and can be discerned from data, without realising this is done through the subjectivity of themselves and their peers. This compounds because they're often the least qualified to set criteria for analysis, lacking both insight and intuition. Often not valuing these things, at all.



          UX Design is one of the few fields suffering from more issues pertaining to self-selection bias than programming. Quite an achievement.






          share|improve this answer

















          • 4




            I want to provide some justification for why I've chosen this perhaps unsatisfying answer. I've had to clarify what I mean many times in this post, and still commenters have said "but what about if someone uses my thing wrong?", which is exactly what I think is a harmful view. It seems that UX, or at least this forum's idea of it, is not empathetic enough with its users.
            – PascLeRasc
            yesterday








          • 12




            @PascLeRasc The reason you're getting so much push-back is that what you're suggesting is too extreme. You can't plan for every possible use of your product and make it good for all of them. If I try to hammer in nails with a wine glass it is my fault when it breaks, not the fault of the glass blower for not making it useful as a hammer. In that case I, the user, was wrong. When I then complain to the glass manufacturer and they tell me that I was supposed to use the glass for drinking wine and not hammering nails, they aren't being un-empathetic, they're just right
            – Kevin Wells
            yesterday






          • 7




            @PascLeRasc And people here aren't disputing that we should watch and listen to users to refine our products and make them more usable and intuitive, but there is always a trade off involved and we have to be realistic in our approaches
            – Kevin Wells
            yesterday






          • 9




@PascLeRasc I've seen people use all sorts of crazy things for purposes they aren't meant for. If you can imagine a stupid way to use an object, I bet someone at some point has tried it. Now if a large number of your users report the same confusion (like hundreds of people using a wine glass as a hammer), then yes, you should look into why that is. But you will always have one-off situations where people do something stupid, and those people should be ignored rather than designed around; don't miss the forest for the trees
            – Kevin Wells
            yesterday






          • 8




@TimothyAWiseman It absolutely can be; for example, I'm glad that my flat-head screwdriver makes for a decent pry bar in a pinch. However, not everything can be good for every purpose. To refer back to my first example, a wine glass makes a pretty good cookie cutter if you just want a circle, but makes for a lousy hammer, and even in that case I don't think wine glass makers should try to design them to be better cookie cutters (unless they want that to be a unique selling point to stand out from the market)
            – Kevin Wells
            yesterday













          up vote
          88
          down vote













          Accommodation for every possible user interaction is impossible.



Let's use your example, but switch the USB drive to a whole computer. A user could pull the power cord and expect the computer to shut down safely, with all data magically saved to the drive. Just like a USB drive. How should a UX designer prepare for this?

1. Lock the cord in place so that the user can't yank it out. Hard to maintain and replace, and more money spent on a feature hardly anyone would want when they can just press the power button. Also a lot slower if you need to move multiple computers at once, say, when your company changes location.

2. Remove caching entirely. Data is never delayed, and you don't even have to press save when updating a component. Computer speed now slows to a crawl, and a myriad of security concerns have to be accommodated as well. (A minimal sketch of this trade-off appears at the end of this answer.)

3. Require an emergency power source. The user is now forced to buy the manufacturer's UPS/battery and has to pay to get it changed, even if they already have a spare at home.

All of these solutions are worse than a simple manual that warns users about the danger of unplugging a running computer.

If you don't expect an electric saw to magically stop running the moment it touches your finger, don't expect computers to do all the work for you. That's why designers and programmers have the acronym RTFM.
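To make the caching trade-off in point 2 concrete, here is a minimal sketch in Python (the path and data are made up, not any real system's API): every write is explicitly flushed and synced to the device, which is effectively what "no caching" costs. Each call pays for a full device round-trip instead of letting the OS batch writes, but the drive is safe to unplug at any quiet moment.

    import os

    def write_durably(path: str, data: bytes) -> None:
        """Write data and force it onto the physical device before returning."""
        with open(path, "wb") as f:
            f.write(data)
            f.flush()             # push Python's userspace buffer to the OS
            os.fsync(f.fileno())  # ask the OS to push its write cache to the device itself

    # Hypothetical usage on a mounted USB stick:
    # write_durably("/media/usbstick/notes.txt", b"unsaved work")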






edited Nov 27 at 9:01

answered Nov 27 at 4:59

formicini

          • 78




            "If you don't expect an electric saw to magically stop running right when it touches your finger" is no longer a valid analogy - see sawstop.com/why-sawstop/the-technology
            – manassehkatz
            Nov 27 at 5:34






          • 33




I should not have underestimated technology. Still, it falls under solution 1 of my example (SawStop is expensive, requires a new table setup, is hard to maintain, and can't cut wet logs), so the analogy is okay. And besides, maybe someday a computer will do all the work for us; you never know.
            – formicini
            Nov 27 at 6:41






          • 1




My example is just a little demonstration of why the extreme approach "the user can't use anything wrong" is a bad thing; it would not be right for a lot of people. There's bound to be a limit to what users can and can't do to computers, and of course that limit depends on a lot of factors, of which fail-safety is only one.
            – formicini
            Nov 27 at 12:53






          • 12




While SawStop is expensive, it's less expensive than the alternative: getting a finger reattached. So it isn't really comparable to locking power cords. Additionally, people don't go around accidentally unplugging computers (sitcoms notwithstanding), whereas they DO accidentally stick their fingers in table saw blades.
            – Draco18s
            Nov 27 at 14:47






          • 3




            I don't see how picking apart my example is an answer. This should have been a comment instead.
            – PascLeRasc
            Nov 27 at 15:11















          up vote
          66
          down vote













          Yes, there is a term for this ("the user can't do anything wrong"):



          foolproof



But as other answers point out, making something completely foolproof isn't feasible. On Wikipedia I found a quote from Douglas Adams' Mostly Harmless:




          a common mistake that people make when trying to design something completely foolproof is to underestimate the ingenuity of complete fools




          There is also a term for minimizing what a user can do wrong:



          Defensive Design



In Defensive Design you try to design in such a way that users can do the least harm, while not expecting to make the product completely foolproof. Some techniques include:

• Automatic random testing: letting a script feed random inputs to your application, hoping to make it crash (a minimal sketch follows below this list)

• Monkey testing: user testing, but instructing the users either to try to break the system or to act as oblivious to the system's workings as possible.
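As a rough illustration of the automatic random testing bullet, here is a minimal sketch in Python; parse_quantity is a made-up stand-in for whatever input handler you want to harden. The loop feeds it random strings and reports any input that triggers something other than the clean rejection you expect.

    import random
    import string

    def parse_quantity(text: str) -> int:
        """Made-up input handler: accepts strings like '3' or ' 12 '."""
        value = int(text.strip())
        if value < 0:
            raise ValueError("quantity must be non-negative")
        return value

    def random_input(max_len: int = 12) -> str:
        return "".join(random.choice(string.printable)
                       for _ in range(random.randint(0, max_len)))

    def fuzz(runs: int = 10_000) -> None:
        for _ in range(runs):
            candidate = random_input()
            try:
                parse_quantity(candidate)
            except ValueError:
                pass  # cleanly rejecting bad input is the desired behaviour
            except Exception as exc:
                print(f"unexpected crash on {candidate!r}: {exc!r}")

    if __name__ == "__main__":
        fuzz()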






answered Nov 27 at 10:58

ONOZ















          • 1




            There's a rather good "see also" on the wiki page for Defensive Design on the subject of Defensive programming. It describes three rules of thumb for it, the third of which feels most relevant. "Making the software behave in a predictable manner despite unexpected inputs or user actions." The goal of good UX is to present the user with just the thing(s) they want to do, and to make it clear what will happen when they do it.
            – Ruadhan2300
            2 days ago






          • 3




            "Defensive Design" - good one, that seems to be what the OP is asking in this confusing question.
            – Fattie
            2 days ago






          • 2




            I always say "Fool-proof and idiot-resistant". You can make things that even a fool can't screw up, but no matter how you try to make things idiot-proof, the universe can always make a better idiot.
            – Monty Harder
            2 days ago






          • 1




            I'd also recommend another alternative - instead of preparing for everything that could possibly go wrong, allow the user to go back. If it's feasible to implement undo for a functionality, it's probably going to work 9001% better than anything that tries to prevent the problem in the first place. Indeed, this is also used in the USB drive example - NTFS uses transactions exactly to limit the damage caused by unexpected loss of function (e.g. power loss). It cannot prevent data loss, but it can prevent file system corruption, unlike FAT32 (and for good applications, even data corruption).
            – Luaan
            yesterday






          • 1




You have to admit Defensive Programming requires the programmer to 100%, absolutely, without a doubt, understand the entire system. I've had hilarious shopping cart experiences where I opened the developer console and made stores ship to locations that they didn't allow. One time the company shipped the order for no shipping cost because their system didn't know how to handle a country not on their list, and I kept insisting it was their fault (it technically is...). Most developers simply do not have a wide enough scope of knowledge to do proper defensive programming.
            – Nelson
            yesterday
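Nelson's shopping-cart story is really about where the defensive checks live: the server has to re-validate whatever the client claims, because a developer console can bypass any front-end restriction. A minimal sketch of that idea in Python, with a made-up allow-list and order type (not any particular shop's API):

    from dataclasses import dataclass

    # Hypothetical allow-list; a real shop would load this from configuration.
    ALLOWED_COUNTRIES = {"US", "CA", "GB"}

    @dataclass
    class Order:
        items: list
        country: str

    def place_order(items: list, country: str) -> Order:
        """Server-side validation: never trust what the browser sent."""
        if not items:
            raise ValueError("cart is empty")
        if country not in ALLOWED_COUNTRIES:
            raise ValueError(f"shipping to {country!r} is not offered")
        return Order(items=items, country=country)

    # place_order(["USB drive"], "XX") raises, even if the web form was tampered with.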















          up vote
          44
          down vote













          User-Centered Design



          What you’re describing is a consequence of User-Centered Design (coined by Don Norman himself). I’ve heard this principle expressed as “the user is always right” and “it’s not the user’s fault”.



          As has been pointed out, this type of thinking is not common enough, even among UX professionals. The issue is that we’re trying to “fix” user behavior, rather than matching the user’s mental model.



          In your example, the user’s mental model is that the flash drive is ready and can be removed if no files are being copied to or from it. Therefore, we should design our software and hardware to match this and to prevent any errors that might occur as a result. Here are a few suggestions to accomplish this:




          1. Never keep an external drive in a dirty state longer than necessary. When writing to the drive is complete, get the filesystem into a state where it can be unplugged safely.

2. Always show an indication or notification when a drive is in use, such as when a file is being saved (which should also be done automatically!). The system should inform users as to exactly what is happening, so that they know that the drive should not be unplugged yet.

          3. Ideally, USB ports should be redesigned so that it’s possible for the computer to physically hold the device in place; the operating system would then release the drive when it’s safe to be unplugged. This would make these problems impossible. (This is how CD/DVD-RW drives work when a disc is being burned.) I don’t know if this is feasible from an engineering standpoint, but I think it should have been considered during the design process for USB-C.


          4. Undo. In case a drive has been unplugged while in use, make it possible to fix the issue by plugging it back in so that the system can resume exactly where it left off.
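This last point, resuming exactly where the transfer left off, can be sketched in a few lines of Python. It is only an illustration of the idea (a copy that picks up from the last byte actually present on the destination), not how any particular operating system implements it:

    import os

    def resumable_copy(src: str, dst: str, chunk_size: int = 1 << 20) -> None:
        """Copy src to dst; if interrupted (drive unplugged), rerunning resumes."""
        already = os.path.getsize(dst) if os.path.exists(dst) else 0
        with open(src, "rb") as fin, open(dst, "ab") as fout:
            fin.seek(already)           # skip what already made it to the drive last time
            while True:
                chunk = fin.read(chunk_size)
                if not chunk:
                    break
                fout.write(chunk)
                fout.flush()
                os.fsync(fout.fileno())  # keep the on-disk size an honest progress marker

    # Plug the drive back in and call resumable_copy(src, dst) again to continue.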

























          • 30




            (1) Longer than necessary for what, exactly? If the USB disk is on rotational media, it's entirely possible for tens of seconds of writes to be queued up nearly instantaneously. (3) This is a classic example of fixation on a single goal in disregard of cost, other failure modes, user convenience/frustration, and even safety (see MagSafe), unfortunately far too common in UX design.
            – chrylis
            Nov 27 at 5:52






          • 4




@chrylis And if the software doesn't show some indicator that the data was only enqueued and not yet written, it's rubbish. And if there is a point during a file transfer at which interrupting it breaks the file system, then the file system is rubbish. I agree on (3), because for USB drives it makes sense to interrupt a transfer by pulling the drive out.
            – Nobody
            Nov 27 at 19:26








          • 7




            @Nobody FAT is a pretty lousy filesystem by modern standards. You won't find much disagreement about that. However, it's a fact of life and a design constraint.
            – chrylis
            Nov 27 at 19:30






          • 1




            Yes, this is the correct answer
            – Fattie
            2 days ago






          • 4




            "Foolproof" is not a consequence of "User Centered Design". On the contrary, achieving a foolproof state often means that you have to decrease the usability in other scenarios. I don't recall Norman having said that, and it's not in the Youtube video either. ONOZ answer, in my view, is to the point. formicini give a good example in his answer. I think it's what chrylis means in his comment, but I'm not sure so I leave my 2 cents as well.
            – Albin
            yesterday

















          up vote
          44
          down vote









          User-Centered Design



          What you’re describing is a consequence of User-Centered Design (coined by Don Norman himself). I’ve heard this principle expressed as “the user is always right” and “it’s not the user’s fault”.



          As has been pointed out, this type of thinking is not common enough, even among UX professionals. The issue is that we’re trying to “fix” user behavior, rather than matching the user’s mental model.



          In your example, the user’s mental model is that the flash drive is ready and can be removed if no files are being copied to or from it. Therefore, we should design our software and hardware to match this and to prevent any errors that might occur as a result. Here are a few suggestions to accomplish this:




          1. Never keep an external drive in a dirty state longer than necessary. When writing to the drive is complete, get the filesystem into a state where it can be unplugged safely.

          2. Always show an indication or notification when a drive in use, such as when a file is being saved (which should also be done automatically!). The system should inform users as to exactly what is happening, so that they know that the drive should not be unplugged yet.

          3. Ideally, USB ports should be redesigned so that it’s possible for the computer to physically hold the device in place; the operating system would then release the drive when it’s safe to be unplugged. This would make these problems impossible. (This is how CD/DVD-RW drives work when a disc is being burned.) I don’t know if this is feasible from an engineering standpoint, but I think it should have been considered during the design process for USB-C.


          4. Undo. In case a drive has been unplugged while in use, make it possible to fix the issue by plugging it back in so that the system can resume exactly where it left off.






          share|improve this answer














          User-Centered Design



          What you’re describing is a consequence of User-Centered Design (coined by Don Norman himself). I’ve heard this principle expressed as “the user is always right” and “it’s not the user’s fault”.



          As has been pointed out, this type of thinking is not common enough, even among UX professionals. The issue is that we’re trying to “fix” user behavior, rather than matching the user’s mental model.



          In your example, the user’s mental model is that the flash drive is ready and can be removed if no files are being copied to or from it. Therefore, we should design our software and hardware to match this and to prevent any errors that might occur as a result. Here are a few suggestions to accomplish this:




          1. Never keep an external drive in a dirty state longer than necessary. When writing to the drive is complete, get the filesystem into a state where it can be unplugged safely.

          2. Always show an indication or notification when a drive in use, such as when a file is being saved (which should also be done automatically!). The system should inform users as to exactly what is happening, so that they know that the drive should not be unplugged yet.

          3. Ideally, USB ports should be redesigned so that it’s possible for the computer to physically hold the device in place; the operating system would then release the drive when it’s safe to be unplugged. This would make these problems impossible. (This is how CD/DVD-RW drives work when a disc is being burned.) I don’t know if this is feasible from an engineering standpoint, but I think it should have been considered during the design process for USB-C.


          4. Undo. In case a drive has been unplugged while in use, make it possible to fix the issue by plugging it back in so that the system can resume exactly where it left off.







          share|improve this answer














          share|improve this answer



          share|improve this answer








          edited 2 days ago

























          answered Nov 27 at 4:33









          David Regev

          945513




          945513








          • 30




            (1) Longer than necessary for what, exactly? If the USB disk is on rotational media, it's entirely possible for tens of seconds of writes to be queued up nearly instantaneously. (3) This is a classic example of fixation on a single goal in disregard of cost, other failure modes, user convenience/frustration, and even safety (see MagSafe), unfortunately far too common in UX design.
            – chrylis
            Nov 27 at 5:52






          • 4




            @chrylis And if the software doesn't show some indicator that the data was only enqueued and not yet written it's rubbish. And if there is a point during the file transfer so that the file system breaks when you interrupt the transfer at that point, then the file system is rubbish. I agree on (3) because for USB drives it makes sense to interrupt a transfer by pulling it out.
            – Nobody
            Nov 27 at 19:26








          • 7




            @Nobody FAT is a pretty lousy filesystem by modern standards. You won't find much disagreement about that. However, it's a fact of life and a design constraint.
            – chrylis
            Nov 27 at 19:30






          • 1




            Yes, this is the correct answer
            – Fattie
            2 days ago






          • 4




            "Foolproof" is not a consequence of "User Centered Design". On the contrary, achieving a foolproof state often means that you have to decrease the usability in other scenarios. I don't recall Norman having said that, and it's not in the Youtube video either. ONOZ answer, in my view, is to the point. formicini give a good example in his answer. I think it's what chrylis means in his comment, but I'm not sure so I leave my 2 cents as well.
            – Albin
            yesterday
















          • 30




            (1) Longer than necessary for what, exactly? If the USB disk is on rotational media, it's entirely possible for tens of seconds of writes to be queued up nearly instantaneously. (3) This is a classic example of fixation on a single goal in disregard of cost, other failure modes, user convenience/frustration, and even safety (see MagSafe), unfortunately far too common in UX design.
            – chrylis
            Nov 27 at 5:52






          • 4




            @chrylis And if the software doesn't show some indicator that the data was only enqueued and not yet written it's rubbish. And if there is a point during the file transfer so that the file system breaks when you interrupt the transfer at that point, then the file system is rubbish. I agree on (3) because for USB drives it makes sense to interrupt a transfer by pulling it out.
            – Nobody
            Nov 27 at 19:26








          • 7




            @Nobody FAT is a pretty lousy filesystem by modern standards. You won't find much disagreement about that. However, it's a fact of life and a design constraint.
            – chrylis
            Nov 27 at 19:30






          • 1




            Yes, this is the correct answer
            – Fattie
            2 days ago






          • 4




            "Foolproof" is not a consequence of "User Centered Design". On the contrary, achieving a foolproof state often means that you have to decrease the usability in other scenarios. I don't recall Norman having said that, and it's not in the Youtube video either. ONOZ answer, in my view, is to the point. formicini give a good example in his answer. I think it's what chrylis means in his comment, but I'm not sure so I leave my 2 cents as well.
            – Albin
            yesterday










          30




          30




          (1) Longer than necessary for what, exactly? If the USB disk is on rotational media, it's entirely possible for tens of seconds of writes to be queued up nearly instantaneously. (3) This is a classic example of fixation on a single goal in disregard of cost, other failure modes, user convenience/frustration, and even safety (see MagSafe), unfortunately far too common in UX design.
          – chrylis
          Nov 27 at 5:52




          (1) Longer than necessary for what, exactly? If the USB disk is on rotational media, it's entirely possible for tens of seconds of writes to be queued up nearly instantaneously. (3) This is a classic example of fixation on a single goal in disregard of cost, other failure modes, user convenience/frustration, and even safety (see MagSafe), unfortunately far too common in UX design.
          – chrylis
          Nov 27 at 5:52




          4




          4




          @chrylis And if the software doesn't show some indicator that the data was only enqueued and not yet written it's rubbish. And if there is a point during the file transfer so that the file system breaks when you interrupt the transfer at that point, then the file system is rubbish. I agree on (3) because for USB drives it makes sense to interrupt a transfer by pulling it out.
          – Nobody
          Nov 27 at 19:26






          @chrylis And if the software doesn't show some indicator that the data was only enqueued and not yet written it's rubbish. And if there is a point during the file transfer so that the file system breaks when you interrupt the transfer at that point, then the file system is rubbish. I agree on (3) because for USB drives it makes sense to interrupt a transfer by pulling it out.
          – Nobody
          Nov 27 at 19:26






          7




          7




          @Nobody FAT is a pretty lousy filesystem by modern standards. You won't find much disagreement about that. However, it's a fact of life and a design constraint.
          – chrylis
          Nov 27 at 19:30




          @Nobody FAT is a pretty lousy filesystem by modern standards. You won't find much disagreement about that. However, it's a fact of life and a design constraint.
          – chrylis
          Nov 27 at 19:30




          1




          1




          Yes, this is the correct answer
          – Fattie
          2 days ago




          Yes, this is the correct answer
          – Fattie
          2 days ago




          4




          4




          "Foolproof" is not a consequence of "User Centered Design". On the contrary, achieving a foolproof state often means that you have to decrease the usability in other scenarios. I don't recall Norman having said that, and it's not in the Youtube video either. ONOZ answer, in my view, is to the point. formicini give a good example in his answer. I think it's what chrylis means in his comment, but I'm not sure so I leave my 2 cents as well.
          – Albin
          yesterday






          "Foolproof" is not a consequence of "User Centered Design". On the contrary, achieving a foolproof state often means that you have to decrease the usability in other scenarios. I don't recall Norman having said that, and it's not in the Youtube video either. ONOZ answer, in my view, is to the point. formicini give a good example in his answer. I think it's what chrylis means in his comment, but I'm not sure so I leave my 2 cents as well.
          – Albin
          yesterday












          up vote
          38
          down vote













          I wonder if the concept you are looking for is Poka-yoke (https://en.wikipedia.org/wiki/Poka-yoke). This is often more associated with mechanical design (e.g. zoo cage double doors which can't both be open at the same time) but you can make an analogy with UX design (e.g. don't offer a delete button when there is nothing available to delete).
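
          As a software analogue, here is a minimal sketch using only Python's standard tkinter module (the widget names are illustrative, not taken from the answer): the delete control is disabled whenever there is nothing that could be deleted, so the mistaken action cannot be attempted at all.

              import tkinter as tk

              root = tk.Tk()
              items = tk.Listbox(root)
              items.pack()

              def refresh_delete_button():
                  # Poka-yoke: the delete action is unavailable while there is
                  # nothing that could possibly be deleted.
                  delete_btn.config(state=tk.NORMAL if items.size() else tk.DISABLED)

              def delete_selected():
                  for index in reversed(items.curselection()):
                      items.delete(index)
                  refresh_delete_button()

              def add_item():
                  items.insert(tk.END, f"item {items.size() + 1}")
                  refresh_delete_button()

              add_btn = tk.Button(root, text="Add item", command=add_item)
              delete_btn = tk.Button(root, text="Delete selected", command=delete_selected)
              add_btn.pack()
              delete_btn.pack()

              refresh_delete_button()  # starts disabled because the list starts empty
              root.mainloop()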






          answered Nov 27 at 11:35 by Kit (a new contributor to this site)














          • 2




            I like this, thanks. That's a great example about the zoo double doors - it illustrates perfectly how the user shouldn't be able to be at fault.
            – PascLeRasc
            Nov 27 at 15:24










          • @PascLeRasc or is it pandering to the lack of common sense...
            – Solar Mike
            Nov 27 at 15:32






          • 9




            @SolarMike It's pandering to the bottom line. Lack of common sense is a fact of nature. You can either let people make mistakes, at peril of profits (or safety!) when an error is eventually made, or you can engineer the job so that they cannot mess it up.
            – J...
            Nov 27 at 19:10








          • 13




            @SolarMike it's as if you've never heard of Murphy's Law. Or NASA.
            – Confused
            Nov 27 at 19:12















          up vote
          15
          down vote













          This is a common UX design principle. The best error message is the one you avoid needing in the first place. There are many sets of design principles out there, but no single standard.



          Jakob Nielsen used the term “Error Prevention” in his 10 usability heuristics:
          https://www.nngroup.com/articles/ten-usability-heuristics/



          "Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action."



          Apple refers to it as “User Control” in its iOS guidelines:
          https://developer.apple.com/design/human-interface-guidelines/ios/overview/themes/



          "The best apps find the correct balance between enabling users and avoiding unwanted outcomes."






          answered Nov 27 at 4:13 by Jeremy Franck

















          • 1




            Joel Spolsky (praise be) wrote a pretty good article in his blog about this
            – Ruadhan2300
            2 days ago






          • 1




            Or to improve on that, only report error messages which direct the user to how to solve the problem. "File streaming error" isn't a good error message if the actual problem is "Lost internet connection whilst downloading file", just for an example.
            – Graham
            2 days ago










          • Well, you can't have error messages if the mouse is charging and the USB port is under the mouse... (geek.com/wp-content/uploads/2015/10/magic_mouse_2_charging.jpg)
            – Ismael Miguel
            20 hours ago















          up vote
          5
          down vote













          Just approaching this question from an analytical perspective, you'll see this mentality in some UX environments and not in others. If users are heavily limited with regard to what they can do, you'll see more preference for UX that follow the principles you describe. The more freedom users are permitted, the less popular these principles are.



          I wouldn't say it's a real name for this effect, but I'd call it "with great power comes great responsibility."



          This is the issue with the USB example which has shown up several times in this thread. A user who can physically modify hardware has a remarkable amount of freedom. They have great power over the system, and thus they have more responsibility for what happens. Sure, I can make a USB device which locks in place until files are done copying. That will work as long as you limit their power to gentle tugs on the hardware along the axis of the USB device. A user with a Sawzall can most definitely do something wrong to my USB device if they aren't responsible enough and aren't aware of what cutting a USB device in half while it is connected can do.



          Let's not even talk about implementing a PSU to meet this Sawzall requirement...



          Any system with a compiler has to face this reality. I can and will do something wrong with my compiler. I will break something. I can delete files I wasn't supposed to delete. Heck, I have deleted such files! I even deleted them in parallel with a glorious multithreaded harbinger of doom! It was bad news, and was most definitely "my mistake."



          Contrast that with designing an iPhone app. iOS severely limits what users can do and how they can interact with the applications by design. It's the purpose of a good mobile OS. Likewise, app developers often permit very few operations. That keeps your UX simple. In these situations, it's very easy to capture the small range of operations a user can do and prove that the user indeed cannot do anything wrong. In such settings, it makes a lot of sense from a user experience perspective to support this mentality.



          In particular, business apps are designed with this in mind. You really don't want to let a low-paid entry level worker make a catastrophic mistake with your app. Point-of-sale devices are designed to make sure you don't accidentally email a credit card number to some malicious agent in a foreign nation. You just can't do it!



          So we can see both extremes. In some situations you want to make sure the user really can't do anything wrong. In other situations you can't. I think it's pretty reasonable to say there's no dividing line between the mentalities. It's a smooth spectrum from "the user can't do wrong" to "oh my god, the monkey has a knife!"






          answered Nov 27 at 23:43 by Cort Ammon

















          • 6




            iOS severely limits what users can do and how they can interact with the applications by design. It's the purpose of a good mobile OS. - how is that good? That's exactly the reason why I dislike iPhones. I don't want the phone to decide which option should be available for me.
            – Džuris
            2 days ago






          • 1




            @Džuris My dad once characterised the difference between iOS, Windows and Linux as a progression of how much people wanted to be involved in what their computer was doing. iOS users just want to use the applications and Do Things without dealing with a computer, Windows users like a bit more control but ultimately prefer not to think about most of the technical side, and Linux users fear the robot revolution and want to do everything themselves. He was mostly tongue in cheek about it but I think there's a grain of truth there :P
            – Ruadhan2300
            2 days ago






          • 2




            @Ruadhan2300 your dad was getting close, but not quite right. The objective of iOS users is to be seen as the owner of an (expensive and fashionable) high tech device. The objective of Windows users is to use the computer apps to get some "real-world" work done. The objective of Linux users is to get Linux itself to work - actually using it once it does work isn't very interesting ;)
            – alephzero
            2 days ago






          • 1




            @alephzero Can you please stop posting unsubstantive comments?
            – PascLeRasc
            2 days ago






          • 2




            @PascLeRasc It's a relevant reply to another comment. And not untrue either.
            – Graham
            2 days ago















          up vote
          2
          down vote














          OS [and all software] developers should see this and build their software to accommodate this instead of bothering users with "you did that wrong" messages.




          Yes, you're totally, completely, absolutely correct.



          Engineers and companies that do what you say make huge amounts of money.



          Some of the biggest key products of our entire era are totally based on what you describe.




          Is this a widely-held view among UX designers/developers?




          Yes, it's one of the central ideas.



          It is constantly and widely discussed as one of the central issues in UX, if not the central one.



          The BMW 7-series was a nightmare since you had to fight and search for every function among literally hundreds of choices, whereas the masterpiece Renault Espace cockpit was user-driven and the epitome of that.




          Is there an official term for this philosophy?




          Sure, it is



          User-driven design



          Not 10 minutes ago I was yelling at some people "make it user-driven". They had some switches etc. that "had to be" set by a customer before use, which is a crap idea. Instead I screamed at everyone to make it "Pascal-style". I literally said "Make this user driven, get rid of the fucking switches."



          Yesterday I literally spent the entire workday dealing with precisely the "Pascal issue" in relation to a product, and no other issue.



          Two years ago I spent four months personally inventing/engineering/whatever a new sort of algorithm for an unusual graphical interface where the entire end result was eliminating two bad "anti-Pascal-style" actions. (The result made zillions.)



          Note that to some extent, the everyday phrase



          K.I.S.S.



          amounts to, basically, a similar approach.





          Note - since the "Pascal-issue" is indeed so pervasive, there are



          many, many specific terms for subsets of the concept:



          For example, in the literal example you gave, that is known as



          plug-and-play



          or



          hot swappable



          Note that a company we have heard of, Apple, arguably made some 10 billion dollars from being first to market with printers and other peripherals that were ("more") plug and play than the competition's at the time, back before you were born.



          So, "plug and play" or "hot swappable" is indeed one particular specific subset of the overall user-driven design, KISS-UX, "Pascal-issue".





























          edited 2 days ago

























          answered 2 days ago









          Fattie

793517












          • I agree with the ideas in this and thanks for the writeup, but it's not really what I'm thinking of. I think what I'm really thinking of is more of an industrial design issue than UX.
            – PascLeRasc
            yesterday










          up vote
          2
          down vote













          We always called it user-proofing, and it's usually the most time-consuming aspect of software development. It's not so much that the user can't do anything wrong, but more that whatever the user does won't crash or break the software. This term dates back to at least 1997, when I started developing professionally, and probably much earlier.
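
          As a rough illustration of what that looks like in code (my own sketch, not Tombo's), user-proofing usually means treating every input as potentially malformed and turning would-be crashes into recoverable outcomes:

              # Tiny sketch of "user-proofing": whatever the user types, the program
              # degrades gracefully instead of crashing. Purely illustrative.

              def parse_age(raw):
                  """Return a plausible age, or None - never raise."""
                  try:
                      age = int(raw.strip())
                  except ValueError:
                      return None          # "abc", "", "3.5" -> re-prompt later, don't crash
                  if not 0 <= age <= 130:
                      return None          # -5 or 9999 are "wrong" but must not break anything
                  return age


              for attempt in ["42", "  7 ", "abc", "-3", ""]:
                  result = parse_age(attempt)
                  print(repr(attempt), "->", result if result is not None else "please try again")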






          share|improve this answer








          New contributor




          Tombo is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
          Check out our Code of Conduct.






















              answered yesterday









              Tombo

              1211


























                  up vote
                  1
                  down vote













                  I'm shocked to see that no one has brought up the fact that everything in design and engineering has a cost. You can always engineer a better version of whatever you're making that covers more use cases and has more features that users want, but every time you do you sacrifice something else. The thing you sacrifice may be literal cost and raise the price or lower profits, or it can be a trade off in some other way.



                  To use your example of USB drives being pulled out without ejecting them, there are a few associated costs to different approaches.




                  • If you make USB drives lock in place, you add manufacturing cost and complexity to both the drives and the ports, and you decrease usability because it makes them more cumbersome to put in or take out. Even if someone did make such a drive, I would never buy it and would continue to buy normal USB drives without locks.


                  • If instead you make sure the USB drive is kept in an ejectable state as much as possible, then you will lose performance, since the computer will have to do constant cleanup and restrict write times to short bursts (see the sketch below). Since one of the biggest selling points of flash drives is read/write speed, that also means no one would want to buy it.



                  Either way, by trying to cover for this niche UX issue they would lose a lot of potential customers.



                  Basically what I'm saying is that you have to do a cost/benefit analysis and decide which features are worth doing and which are beyond the scope of what you're trying to accomplish. Yes, we should watch and listen to users and find out how to refine our products to be more useful in real world scenarios, but there is always a limit.
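
                  To make the second bullet concrete, here is a rough, hypothetical sketch (mine, not part of the original point) of the performance cost of keeping storage "always ejectable": forcing every write onto the device immediately is safe to yank at any moment, but much slower than letting the OS cache writes.

                      # Hypothetical sketch of the "always ejectable" trade-off: flush and
                      # fsync after every chunk (safe to yank at any moment) versus letting
                      # the OS batch writes in its cache (fast, but unsafe to yank mid-copy).

                      import os
                      import time

                      CHUNK = b"x" * 4096
                      N_CHUNKS = 2000


                      def copy_file(path, always_flush):
                          start = time.perf_counter()
                          with open(path, "wb") as f:
                              for _ in range(N_CHUNKS):
                                  f.write(CHUNK)
                                  if always_flush:
                                      f.flush()
                                      os.fsync(f.fileno())   # force the data onto the device now
                          return time.perf_counter() - start


                      print("cached writes:  %.3fs" % copy_file("cached.bin", always_flush=False))
                      print("flushed writes: %.3fs" % copy_file("flushed.bin", always_flush=True))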






                  share|improve this answer








                  New contributor




                  Kevin Wells is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
                  Check out our Code of Conduct.














                  answered yesterday









                  Kevin Wells

                  1112












                  • 1




                    In a technical sense, I'm not sure this is an answer, since it doesn't propose a term for the concept, which was technically the question. However, I agree with all of this strongly and it explains why this isn't done more often. A classic example might be a programming language. Scratch is almost fool-proof, but it is slow, limited, and literally made for kids. A general-purpose programming language like C++ lets the user do things wrong in innumerable ways, but also gives the user tremendous power. Limiting the things a user can do wrong comes at a trade-off in power or efficiency or both.
                    – TimothyAWiseman
                    yesterday






                  • 2




                    Fair point, I suppose my answer is only an extension of the wider discussion about this practice and not a strict answer to the question. Also, the programming language example is a great example of what I'm talking about, as is just about any specialist tool.
                    – Kevin Wells
                    yesterday
























                  up vote
                  1
                  down vote













                  Falling Into The Pit of Success is a term used in the development community. It's more focused on language or library design, but it can be applied to front-end interaction as well. It's definitely vocabulary I would use when discussing UX with other developers to get them on the same page.
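
                  As a small, hypothetical sketch of what the "pit of success" looks like in library design (my own example, not from the article the term comes from): the laziest possible call is also the safe one, and an invalid object simply cannot be constructed.

                      # Hypothetical sketch of "pit of success" API design: the obvious call
                      # does the right thing, and invalid states are unrepresentable.

                      from dataclasses import dataclass


                      @dataclass(frozen=True)
                      class Timeout:
                          seconds: float

                          def __post_init__(self):
                              if self.seconds <= 0:
                                  raise ValueError("Timeout must be positive")


                      def fetch(url, timeout=Timeout(30.0), verify_tls=True):
                          # The no-extra-arguments call is already the safe call: sane timeout,
                          # TLS verification on. Callers must opt in explicitly to anything riskier.
                          print("GET", url, "timeout=%ss" % timeout.seconds, "verify_tls=%s" % verify_tls)


                      fetch("https://example.com")                # falls into the pit of success
                      fetch("https://example.com", Timeout(5.0))  # tweaking is still easy
                      # Timeout(-1)                               # refuses to exist at all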






                  share|improve this answer

























                      answered 21 hours ago









                      Taran

                      1572






















                          up vote
                          0
                          down vote













                          User-centric design is the broad principle and, in my experience, it's widely accepted among modern software product teams and many hardware product teams.



                          More specifically, I think Activity Centered Design deals directly with this issue. ACD addresses the user's entire workflow and how the product can fit into, augment, or alter that flow.



                          ACD changes the perspective from

                          "how does the user want this thing to perform a function" to

                          "how can we make the user fundamentally more successful at this job".



                          If you do ACD (or UCD) without accommodating user "error" then you did it wrong and you need to keep iterating.






                          share|improve this answer

























                              answered yesterday









                              plainclothes

                              19.5k43777






















                                  up vote
                                  -1
                                  down vote














                                  I need to clarify what I mean by "the user can't use anything wrong". I'm not saying that the user should be prevented from using something wrong, but that there aren't any "wrong" ways to use something. If a large percentage of users use a microphone as a hammer (like the Shure SM57 genuinely is), designers should embrace this and improve the hammer capabilities in the next iteration.




                                   This almost entirely changes the meaning of your title. It goes from being error avoidance to "the bug is a feature".




                                   1. The closest thing that I can think of is Agile/Lean UX. This is where you have a short feedback loop: you build your product, be it a microphone or a mobile app, and get it into the hands of users. Then, depending on how they use it, you enhance those features.


                                  2. Also as far as things being used not for their original purpose - I think the buzz-word "pivot" comes in. This is where the microphone folks realise they've built a better hammer by accident and start selling hammers that you sing in to.


                                   3. There's also another similar but related area, where you have mistakes that turn out to be extremely useful - "serendipitous accident" appears to be a relevant term here. I believe the most famous of these is penicillin, but there's also the discovery of Blu Tack in the UK:




                                  Fleming recounted that the date of his discovery of penicillin was on the morning of Friday 28 September 1928. The traditional version of this story describes the discovery as a serendipitous accident: in his laboratory in the basement of St Mary's Hospital in London (now part of Imperial College), Fleming noticed a Petri dish containing Staphylococci that had been mistakenly left open was contaminated by blue-green mould from an open window, which formed a visible growth. There was a halo of inhibited bacterial growth around the mould. Fleming concluded that the mould released a substance that repressed the growth and caused lysing of the bacteria.







                                  share|improve this answer

























                                      answered 20 hours ago









                                      icc97

                                      6,7561730





















