Is there a term for “the user can't use anything wrong” design?
+49
I'm of the opinion that the user is always using software or hardware correctly, and that to imply otherwise is rude, condescending, and philosophically wrong. For example, everyone I know, myself included, pulls USB drives out of a computer without bothering to click eject. OS developers should see this and build their software to accommodate it instead of bothering users with "you did that wrong" messages.
Is this a widely-held view among UX designers/developers? Is there an official term for this philosophy?
edit: It seems I need to clarify what I mean by "the user can't use anything wrong". I'm not saying that the user should be prevented from using something wrong, but that there aren't any "wrong" ways to use something. If a large percentage of users use a microphone as a hammer (as the Shure SM57 genuinely is used), designers should embrace this and improve the hammer capabilities in the next iteration.
edit 2: I'd like to thank you all for proving my point. I posted here a point (the user can't use anything wrong) that I interpreted one way and you all interpreted another way. My intention was that there are no wrong actions to take, and your overall interpretation was that there are indeed wrong actions, and we should work to prevent these.
All of you are correct. As the designer of the post, I'm at fault here, and I think you'd agree. I should have made it more clear what I intended the point of this post to be. I have no right to try to argue with any of you about what my intentions are, because only the user's interpretation matters. Thank you for such an invigorating discussion!
user-behavior user-centered-design
asked Nov 26 at 21:13 by PascLeRasc, edited Nov 30 at 21:32
83
What you write about USB drives is, unfortunately, impossible physically. The OS needs to clean stuff up in the filesystem before the drive is disconnected. And the OS can not know your intentions if you don't warn it. So: what do you do if making sure something can't be done wrongly is impossible?
– Jan Dorniak
Nov 26 at 22:54
52
This isn't true. A file system can pre-emptively do all of this. And almost all modern operating systems, even Android, do exactly this. The warning messages are there out of habit and in the vain hope it will discourage users from pulling out a memory stick whilst files are being transferred.
– Confused
Nov 27 at 1:13
77
@Confused That is simply not true. By default on Windows write caching is ON and yanking out the drive even if you think you've finished writing to it can and will cause your data to become corrupted. I've seen it. It's not "out of habit" or "in the vain hope" - it is the consequence of an actual feature. You can disable write caching though (it's probably called something like "enable fast removal" in your OS).
– Lightness Races in Orbit
Nov 27 at 12:36
38
I think the mistake here is using USB as an example. USB is hardware, and hardware will always have some physical limitations. You might be able to write pure software this way, but not hardware.
– Dave Cousineau
Nov 27 at 16:49
37
Another example where the user clearly is using it wrong: storing important items in the trash/recycle bin/deleted items/etc. This is actually disturbingly common...
– Gordon Davisson
Nov 27 at 20:48
16 Answers
+18 (accepted), answered Nov 27 at 1:09 by Confused
No. It is not a widely held view among UX designers. Unfortunately.
Even less so amongst those using SO and considering themselves to be UX Designers.
I suspect this is mainly because UX design is not a rigorous field, nor do its proponents practice patience and understanding of their potential users. Perhaps even worse, they seem to believe that an ideal UX 'design' exists and can be discerned from data, without realising that this discernment happens through the subjectivity of themselves and their peers. This is compounded because they're often the least qualified to set the criteria for analysis, lacking both insight and intuition, and often not valuing those things at all.
UX Design is one of the few fields suffering from more issues pertaining to self-selection bias than programming. Quite an achievement.
22
@PascLeRasc The reason you're getting so much push-back is that what you're suggesting is too extreme. You can't plan for every possible use of your product and make it good for all of them. If I try to hammer in nails with a wine glass it is my fault when it breaks, not the fault of the glass blower for not making it useful as a hammer. In that case I, the user, was wrong. When I then complain to the glass manufacturer and they tell me that I was supposed to use the glass for drinking wine and not hammering nails, they aren't being un-empathetic, they're just right
– Kevin Wells
Nov 29 at 18:16
9
@PascLeRasc And people here aren't disputing that we should watch and listen to users to refine our products and make them more usable and intuitive, but there is always a trade off involved and we have to be realistic in our approaches
– Kevin Wells
Nov 29 at 18:17
1
Why do you think someone thought your wine glass was a hammer? Could you tweak your design so that it doesn't suggest that it's a hammer?
– PascLeRasc
Nov 29 at 18:17
18
@PascLeRasc I've seen people use all sorts of crazy things for purposes they aren't meant for. If you can imagine a stupid way to use an object I bet someone at some point has tried it. Now if a large number of your users report the same confusions (like hundreds of people using a wine glass as a hammer), then yes, you should look into why that would be. But you will always have one off situations where people do something stupid, and those people should be ignored rather than designed around, don't miss the forest for the trees
– Kevin Wells
Nov 29 at 18:20
12
@TimothyAWiseman It absolutely can be, for example I'm glad that my flat head screw driver makes for a decent pry bar in a pinch. However not everything can be good for every purpose. To refer back to my first example, a wine glass makes a pretty good cookie cutter if you just want a circle, but makes for a lousy hammer, and even in that case I don't think wine glass makers should try to design them to be better cookie cutters (unless they want that to be a unique selling point to stand out from the market)
– Kevin Wells
Nov 29 at 18:32
+104
Accommodation for every possible user interaction is impossible.
Let's use your example, but switch the USB drive to a whole computer. A user can pull the power cord and expect the computer to safely turn off with all data magically saved to the drive, just like with a USB drive. How should a UX designer prepare for this?
- Lock the cord in place so that the user can't yank it out. Hard to maintain and replace, and more money required for a feature hardly anyone would want to use when they can just press the power button. Also a lot slower if you need to move multiple computers at once, say, when your company changes its location.
- Remove computer caches. Data is never delayed, and you don't even have to press save when updating a component. Computer speed now slows to a crawl, and a myriad of security concerns will have to be accommodated as well.
- Use a mandatory emergency power source. The user is now forced to buy the manufacturer's UPS/battery and has to pay to get it changed even if they already have a spare at home.
All of the solutions above are worse than simply warning users about the danger of unplugging a running computer.
If you don't expect an electric saw to magically stop running right when it touches your finger, then don't expect computers to do all the work for you. That's why designers and programmers have the acronym RTFM.
92
"If you don't expect an electric saw to magically stop running right when it touches your finger" is no longer a valid analogy - see sawstop.com/why-sawstop/the-technology
– manassehkatz
Nov 27 at 5:34
38
I should not have underestimated technology. Still, it falls under solution 1 of my example (SawStop is expensive, requires a new table setup, is hard to maintain, and can't cut wet logs), so the analogy is okay. And besides, maybe someday a computer will do all the work for us, you never know.
– formicini
Nov 27 at 6:41
13
While SawStop is expensive, it's less expensive than the alternative: getting a finger reattached. So it isn't really comparable to locking power cords. Additionally, people don't go around accidentally unplugging computers (sitcoms notwithstanding), whereas they DO accidentally stick their fingers in table saw blades.
– Draco18s
Nov 27 at 14:47
3
I don't see how picking apart my example is an answer. This should have been a comment instead.
– PascLeRasc
Nov 27 at 15:11
3
@PascLeRasc your question is actually 2 questions: "Is this a widely-held view among UX designers/developers? Is there an official term for this philosophy?". I answer only the first by way of example. The example chosen is similar to yours to help you see the similarity, it is neither to pick apart it nor to demean you. If you did think so then I apologize.
– formicini
Nov 28 at 1:28
+101
Yes, there is a term for this ("the user can't do anything wrong"):
foolproof
But as other answers point out, making something completely foolproof isn't feasible. On wikipedia I found a quote from Douglas Adams' Mostly Harmless:
a common mistake that people make when trying to design something completely foolproof is to underestimate the ingenuity of complete fools
There is also a term for minimizing what a user can do wrong:
Defensive Design
In Defensive Design you try to design in such a way that users can do the least harm, while not expecting to make it completely foolproof. Some techniques include (a small sketch of both follows the list):
- Automatic random testing: Letting a script give random inputs to your application, hoping to make it crash
- Monkey testing: user testing, but instructing the users to either try to break the system, or to act as oblivious to the system's workings as possible.
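
To make both techniques concrete, here is a minimal Python sketch (my illustration, not from the answer; the port-parsing scenario is invented): a defensively written parser that behaves predictably on any input, plus a crude automatic random test that feeds it junk and checks that it never crashes.

```python
import random
import string

def parse_port(raw: str, default: int = 8080) -> int:
    """Defensively parse a user-supplied port number.

    Never raises: malformed input falls back to a safe default,
    and out-of-range values are clamped to the valid port range.
    """
    try:
        value = int(raw.strip())
    except ValueError:
        return default
    return min(max(value, 1), 65535)

# Crude automatic random testing: feed the function junk and assert
# it always returns something valid instead of crashing.
for _ in range(10_000):
    junk = "".join(random.choices(string.printable, k=random.randint(0, 12)))
    assert 1 <= parse_port(junk) <= 65535
```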
2
There's a rather good "see also" on the wiki page for Defensive Design on the subject of Defensive programming. It describes three rules of thumb for it, the third of which feels most relevant. "Making the software behave in a predictable manner despite unexpected inputs or user actions." The goal of good UX is to present the user with just the thing(s) they want to do, and to make it clear what will happen when they do it.
– Ruadhan2300
Nov 28 at 11:39
5
"Defensive Design" - good one, that seems to be what the OP is asking in this confusing question.
– Fattie
Nov 28 at 16:24
3
I always say "Fool-proof and idiot-resistant". You can make things that even a fool can't screw up, but no matter how you try to make things idiot-proof, the universe can always make a better idiot.
– Monty Harder
Nov 28 at 22:15
1
I'd also recommend another alternative - instead of preparing for everything that could possibly go wrong, allow the user to go back. If it's feasible to implement undo for a functionality, it's probably going to work 9001% better than anything that tries to prevent the problem in the first place. Indeed, this is also used in the USB drive example - NTFS uses transactions exactly to limit the damage caused by unexpected loss of function (e.g. power loss). It cannot prevent data loss, but it can prevent file system corruption, unlike FAT32 (and for good applications, even data corruption).
– Luaan
Nov 29 at 14:34
1
You have to admit Defensive Programming requires the programmer to 100%, absolutely, without a doubt, understand the entire system. I've had hilarious shopping cart experiences where I opened the developer console and made stores ship to locations that they didn't allow. One time the company shipped it out for no shipping cost, because their system didn't know how to handle a country not on their list and I kept insisting it was their fault (it technically is...). Most developers simply do not have a wide enough scope of knowledge to do proper defensive programming.
– Nelson
Nov 30 at 0:57
+47
User-Centered Design
What you’re describing is a consequence of User-Centered Design (coined by Don Norman himself). I’ve heard this principle expressed as “the user is always right” and “it’s not the user’s fault”.
As has been pointed out, this type of thinking is not common enough, even among UX professionals. The issue is that we’re trying to “fix” user behavior, rather than matching the user’s mental model.
In your example, the user's mental model is that the flash drive is ready and can be removed if no files are being copied to or from it. Therefore, we should design our software and hardware to match this and to prevent any errors that might occur as a result. Here are a few suggestions to accomplish this (the first is sketched in code after the list):
- Never keep an external drive in a dirty state longer than necessary. When writing to the drive is complete, get the filesystem into a state where it can be unplugged safely.
- Always show an indication or notification when a drive is in use, such as when a file is being saved (which should also be done automatically!). The system should inform users as to exactly what is happening, so that they know that the drive should not be unplugged yet.
- Ideally, USB ports should be redesigned so that it’s possible for the computer to physically hold the device in place; the operating system would then release the drive when it’s safe to be unplugged. This would make these problems impossible. (This is how CD/DVD-RW drives work when a disc is being burned.) I don’t know if this is feasible from an engineering standpoint, but I think it should have been considered during the design process for USB-C.
- Undo. In case a drive has been unplugged while in use, make it possible to fix the issue by plugging it back in so that the system can resume exactly where it left off.
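
As a rough illustration of the first suggestion, here is a minimal Python sketch (the helper name and the mount point are hypothetical; note also that os.fsync flushes the OS write cache, not necessarily the drive's own internal cache): writes are flushed immediately, so the drive returns to a safe-to-unplug state as soon as the call returns.

```python
import os

def write_durably(path: str, data: bytes) -> None:
    """Write a file and flush it, so the drive is back in a
    'safe to unplug' state as soon as this function returns."""
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC)
    try:
        os.write(fd, data)
        os.fsync(fd)  # push the OS write cache out to the device
    finally:
        os.close(fd)

write_durably("/mnt/usb/report.txt", b"quarterly numbers")  # hypothetical path
```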
32
(1) Longer than necessary for what, exactly? If the USB disk is on rotational media, it's entirely possible for tens of seconds of writes to be queued up nearly instantaneously. (3) This is a classic example of fixation on a single goal in disregard of cost, other failure modes, user convenience/frustration, and even safety (see MagSafe), unfortunately far too common in UX design.
– chrylis
Nov 27 at 5:52
4
@chrylis And if the software doesn't show some indicator that the data was only enqueued and not yet written, it's rubbish. And if there is any point during a file transfer at which interrupting it breaks the file system, then the file system is rubbish. I agree on (3), because for USB drives it makes sense to interrupt a transfer by pulling the drive out.
– Nobody
Nov 27 at 19:26
8
@Nobody FAT is a pretty lousy filesystem by modern standards. You won't find much disagreement about that. However, it's a fact of life and a design constraint.
– chrylis
Nov 27 at 19:30
1
Yes, this is the correct answer
– Fattie
Nov 29 at 4:17
4
"Foolproof" is not a consequence of "User Centered Design". On the contrary, achieving a foolproof state often means that you have to decrease the usability in other scenarios. I don't recall Norman having said that, and it's not in the Youtube video either. ONOZ answer, in my view, is to the point. formicini give a good example in his answer. I think it's what chrylis means in his comment, but I'm not sure so I leave my 2 cents as well.
– Albin
Nov 29 at 15:38
+42
I wonder if the concept you are looking for is Poka-yoke (https://en.wikipedia.org/wiki/Poka-yoke). This is often more associated with mechanical design (e.g. zoo cage double doors which can't both be open at the same time) but you can make an analogy with UX design (e.g. don't offer a delete button when there is nothing available to delete).
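
As a software analogy to the zoo doors, here is a minimal Python/tkinter sketch (the to-do list scenario is invented): the Delete button simply cannot be pressed while nothing is selected, so "delete with no target" is not an action the user can take.

```python
import tkinter as tk

# Hypothetical to-do list: the Delete button is disabled whenever
# nothing is selected, so there is no wrong moment to press it.
root = tk.Tk()
items = tk.Listbox(root)
for label in ("buy milk", "fix bug", "call mom"):
    items.insert(tk.END, label)
items.pack()

def refresh_button(_event=None):
    # Enable Delete only when a deletion target actually exists.
    delete_btn["state"] = tk.NORMAL if items.curselection() else tk.DISABLED

def delete_selected():
    for index in reversed(items.curselection()):
        items.delete(index)
    refresh_button()

delete_btn = tk.Button(root, text="Delete", state=tk.DISABLED,
                       command=delete_selected)
delete_btn.pack()
items.bind("<<ListboxSelect>>", refresh_button)
root.mainloop()
```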
2
I like this, thanks. That's a great example about the zoo double doors - it illustrates perfectly how the user shouldn't be able to be at fault.
– PascLeRasc
Nov 27 at 15:24
@PascLeRasc or is it pandering to the lack of common sense...
– Solar Mike
Nov 27 at 15:32
10
@SolarMike It's pandering to the bottom line. Lack of common sense is a fact of nature. You can either let people make mistakes, at peril of profits (or safety!) when an error is eventually made, or you can engineer the job so that they cannot mess it up.
– J...
Nov 27 at 19:10
15
@SolarMike it's as if you've never heard of Murphy's Law. Or NASA.
– Confused
Nov 27 at 19:12
+15
This is a common UX design principle. The best error message is the one you avoid needing in the first place. There are many examples of design principles out there, but no standard set.
Jakob Nielsen used the term “Error Prevention” in his 10 usability heuristics.
https://www.nngroup.com/articles/ten-usability-heuristics/
"Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action."
Apple refers to it as “User Control” in their iOS Human Interface Guidelines:
https://developer.apple.com/design/human-interface-guidelines/ios/overview/themes/
"The best apps find the correct balance between enabling users and avoiding unwanted outcomes."
1
Joel Spolsky (praise be) wrote a pretty good article in his blog about this
– Ruadhan2300
Nov 28 at 11:33
1
Or to improve on that, only report error messages which direct the user to how to solve the problem. "File streaming error" isn't a good error message if the actual problem is "Lost internet connection whilst downloading file", just for an example.
– Graham
Nov 28 at 23:38
Well, you can't have error messages if the mouse is charging and the USB port is under the mouse... (geek.com/wp-content/uploads/2015/10/magic_mouse_2_charging.jpg)
– Ismael Miguel
Nov 30 at 10:39
1
@Ruadhan2300 Almost all of his articles from the late 90's / early 2000's are still surprisingly relevant 20 years later.
– corsiKa
yesterday
+5
Just approaching this question from an analytical perspective, you'll see this mentality in some UX environments and not in others. If users are heavily limited with regard to what they can do, you'll see more preference for UX that follow the principles you describe. The more freedom users are permitted, the less popular these principles are.
I wouldn't say it's a real name for this effect, but I'd call it "with great power comes great responsibility."
This is the issue with the USB example which has shown up several times in this thread. A user who can physically modify hardware has a remarkable amount of freedom. They have great power over the system, and thus they have more responsibility for what happens. Sure, I can make a USB device which locks in place until files are done copying. That will work as long as you limit their power to gentle tugs on the hardware along the axis of the USB device. A user with a Sawzall can most definitely do something wrong to my USB device if they aren't responsible enough and aren't aware of what cutting a USB device in half while it is connected can do.
Let's not even talk about implementing PSU to meet this Sawzall requirement...
Any system with a compiler has to face this reality. I can and will do something wrong with my compiler. I will break something. I can delete files I wasn't supposed to delete. Heck, I have deleted such files! I even deleted them in parallel with a glorious multithreaded harbinger of doom! It was bad news, and was most definitely "my mistake."
Contrast that with designing an iPhone app. iOS severely limits what users can do and how they can interact with the applications by design. It's the purpose of a good mobile OS. Likewise, app developers often permit very few operations. That keeps your UX simple. In these situations, it's very easy to capture the small range of operations a user can do and prove that the user indeed cannot do anything wrong. In such settings, it makes a lot of sense from a user experience perspective to support this mentality.
In particular, business apps are designed with this in mind. You really don't want to let a low-paid entry level worker make a catastrophic mistake with your app. Point-of-sale devices are designed to make sure you don't accidentally email a credit card number to some malicious agent in a foreign nation. You just can't do it!
So we can see both extremes. In some situations you want to make sure the user really can't do anything wrong. In other situations you can't. I think it's pretty reasonable to say there's no dividing line between the mentalities. It's a smooth spectrum from "the user can't do wrong" to "oh my god, the monkey has a knife!"
7
"iOS severely limits what users can do and how they can interact with the applications by design. It's the purpose of a good mobile OS." - how is that good? That's exactly the reason why I dislike iPhones. I don't want the phone to decide which options should be available to me.
– Džuris
Nov 28 at 10:39
2
@Džuris My dad once characterised the difference between iOS, Windows and Linux as a progression of how much people wanted to be involved in what their computer was doing. iOS users just want to use the applications and Do Things without dealing with a computer, Windows users like a bit more control but ultimately prefer not to think about most of the technical side, and Linux users fear the robot revolution and want to do everything themselves. He was mostly tongue in cheek about it but I think there's a grain of truth there :P
– Ruadhan2300
Nov 28 at 11:31
3
@Ruadhan2300 your dad was getting close, but not quite right. The objective of iOS users is to be seen as the owner of an (expensive and fashionable) high-tech device. The objective of Windows users is to use the computer's apps to get some "real-world" work done. The objective of Linux users is to get Linux itself to work - actually using it once it does work isn't very interesting ;)
– alephzero
Nov 28 at 13:36
4
@alephzero Can you please stop posting unsubstantive comments?
– PascLeRasc
Nov 28 at 17:18
2
@PascLeRasc It's a relevant reply to another comment. And not untrue either.
– Graham
Nov 28 at 23:40
+2
OS [and all software] developers should see this and build their software to accommodate this instead of bothering users with "you did that wrong" messages.
Yes, you're totally, completely, absolutely correct.
Engineers and companies that do what you say, make huge amounts of money.
Some of the biggest key products of our entire era are totally based on what you describe.
Is this a widely-held view among UX designers/developers?
Yes, it's one of the central ideas.
It is constantly and widely discussed as one of the central issues in UX, if not the central one.
The BMW 7-series was a nightmare, since you had to fight and search for every function among literally hundreds of choices, whereas the masterpiece Renault Espace cockpit was user-driven and the epitome of that.
Is there an official term for this philosophy?
Sure, it is
User-driven design
Not 10 minutes ago I was yelling at some people "make it user-driven". They had some switches etc. that "had to be" set by a customer before use, which is a crap idea. Instead I screamed at everyone to make it "Pascal-style". I literally said "Make this user driven, get rid of the fucking switches."
Yesterday I literally spent the entire workday dealing with precisely the "Pascal issue" in relation to a product, and no other issue.
Two years ago I spent four months personally inventing/engineering/whatever a new sort of algorithm for an unusual graphical interface where the entire end result was eliminating two bad "anti-Pascal-style" actions. (The result made zillions.)
Note that to some extent, the everyday phrase
K.I.S.S.
amounts to, basically, a similar approach.
Note - since the "Pascal-issue" is indeed so pervasive, there are
many, many specific terms for subsets of the concept:
For example, in the literal example you gave, that is known as
plug-and-play
or
hot swappable
Note that a company we have heard of, Apple, arguably made some 10 billion dollars from being the first to market with ("more") plug and play printers and other peripherals than the competitors of the time, back before you were born.
So, "plug and play" or "hot swappable" is indeed one particular specific subset of the overall user-driven design, KISS-UX, "Pascal-issue".
I agree with the ideas in this and thanks for the writeup, but it's not really what I'm thinking of. I think what I'm really thinking of is more of an industrial design issue than UX.
– PascLeRasc
Nov 29 at 16:36
+2
We always called it user-proofing, and it's usually the most time-consuming aspect of software development. It's not so much that the user can't do anything wrong, but more that whatever the user does won't crash or break the software. This term dates back to at least 1997, when I started developing professionally, and probably much earlier.
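
In code, user-proofing often looks something like this minimal Python sketch (the decorator and the dialog function are hypothetical examples, not a standard mechanism): every user-triggered handler is wrapped so that an unexpected action is logged for the developer and reported gracefully instead of crashing the program.

```python
import logging

def show_error_dialog(message: str) -> None:
    print(f"[dialog] {message}")  # stand-in for a real UI dialog

def user_proof(handler):
    """Wrap a UI event handler so that no user action crashes the app:
    failures are logged for the developer and surfaced gracefully."""
    def wrapped(*args, **kwargs):
        try:
            return handler(*args, **kwargs)
        except Exception:
            logging.exception("handler %r failed", handler.__name__)
            show_error_dialog("Sorry, that didn't work. Nothing was lost.")
    return wrapped

@user_proof
def open_file(path: str) -> str:
    with open(path) as f:  # raises if the user picked an unreadable path
        return f.read()

open_file("definitely/not/a/real/path.txt")  # logged and reported, no crash
```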
+2
I'm shocked to see that no one has brought up the fact that everything in design and engineering has a cost. You can always engineer a better version of whatever you're making that covers more use cases and has more features that users want, but every time you do you sacrifice something else. The thing you sacrifice may be literal cost and raise the price or lower profits, or it can be a trade off in some other way.
To use your example of USB drives being pulled out without ejecting, there are a few costs associated with different approaches.
If you make USB drives lock in place, you add manufacturing cost and complexity to both the drives and the ports, and you decrease usability because it makes them more cumbersome to put in or take out. Even if someone made such a drive, I would never buy it; I would keep buying normal drives without locks.
If instead you make sure the drive is kept in an ejectable state as much as possible, then you will lose performance (since the computer will have to do constant cleanup and restrict write times to short bursts). Since one of the biggest selling points of flash drives is read/write speed, that also means no one would want to buy it.
Either way, by trying to cover for this niche UX issue, they have lost a lot of potential customers.
Basically what I'm saying is that you have to do a cost/benefit analysis and decide which features are worth doing and which are beyond the scope of what you're trying to accomplish. Yes, we should watch and listen to users and find out how to refine our products to be more useful in real world scenarios, but there is always a limit.
2
In a technical sense, I'm not sure this is an answer, since it doesn't propose a term for the concept, which was technically the question. However, I agree with all of this strongly, and it explains why this isn't done more often. A classic example might be a programming language. Scratch is almost fool-proof, but it is slow, limited, and literally made for kids. A general-purpose programming language like C++ lets the user do things wrong in innumerable ways, but also gives the user tremendous power. Limiting the things a user can do wrong comes at a trade-off in power or efficiency or both.
– TimothyAWiseman
Nov 29 at 19:19
2
Fair point, I suppose my answer is only an extension of the wider discussion about this practice and not a strict answer to the question. Also the programming language example is a great example of what I'm talking about, as is just about any specialist tool
– Kevin Wells
Nov 29 at 19:36
+1
Falling Into The Pit of Success is a term used in the development community. It's more focused around language or library design, but can be applied to front end interaction also. It's definitely vocabulary I would use when discussing UX with other developers to get them on the same page.
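
A minimal Python sketch of the idea (the Money type is an invented example): the API is shaped so that the obvious way to use it is also the correct way, and incorrect use fails immediately and loudly rather than silently corrupting data.

```python
from dataclasses import dataclass
from enum import Enum

class Currency(Enum):
    USD = "USD"
    EUR = "EUR"

@dataclass(frozen=True)
class Money:
    amount_cents: int  # integer cents: no floating-point rounding surprises
    currency: Currency

    def __add__(self, other: "Money") -> "Money":
        if self.currency is not other.currency:
            raise TypeError("cannot add amounts in different currencies")
        return Money(self.amount_cents + other.amount_cents, self.currency)

price = Money(1999, Currency.USD)
tax = Money(160, Currency.USD)
total = price + tax  # the obvious usage is the correct usage
print(total)  # Money(amount_cents=2159, currency=<Currency.USD: 'USD'>)
```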
+1
Your philosophy may not be applicable to everything - some processes will always require a learning curve - but you are onto something.
When I got lost in an online banking website, I would tell the support people:
If I cannot find it (as a skilled computer professional), you are doing it wrong.
0
User-centric is the broad principle and, IME, it's widely accepted among modern software product teams and many hardware product teams.
More specifically, I think Activity Centered Design deals directly with this issue. ACD addresses the user's entire workflow and how the product can fit into, augment, or alter that flow.
ACD changes the perspective from
"how does the user want this thing to perform a function" to
"how can we make the user fundamentally more successful at this job".
If you do ACD (or UCD) without accommodating user "error" then you did it wrong and you need to keep iterating.
0
A “user can't use anything wrong” design doesn't exist, because it shouldn't. A "preventing" design that avoids conscious mistakes and user errors is one thing, but assuming “the user can't use anything wrong” can have negative consequences.
Nielsen Norman Group, in Preventing User Errors: Avoiding Unconscious Slips, explains:
The designer is at fault for making it too easy for the user to commit the error. Therefore, the solution to user errors is not to scold users, to ask them to try harder, or to give them more extensive training. The answer is to redesign the system to be less error-prone.
General Guidelines for Preventing Slips:
- Include Helpful Constraints
- Offer Suggestions
- Choose Good Defaults
- Use Forgiving Formatting (see the sketch after this list)
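
As a small illustration of the last guideline, here is a minimal Python sketch (the US phone-number format is just an example): accept whatever separators the user happens to type and normalise them, rejecting only input that is genuinely unusable.

```python
import re
from typing import Optional

def parse_phone(raw: str) -> Optional[str]:
    """Forgiving formatting: accept '555-867-5309', '(555) 867 5309',
    '5558675309', etc., and normalise instead of rejecting."""
    digits = re.sub(r"\D", "", raw)
    if len(digits) != 10:
        return None  # genuinely unusable input still needs honest feedback
    return f"({digits[:3]}) {digits[3:6]}-{digits[6:]}"

assert parse_phone("(555) 867 5309") == "(555) 867-5309"
```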
On the other hand, if you are too forgiving when you collect a user's address, how can you use it to deliver a product? Tracking the user down because you didn't want to stress them into providing a correct address can't be the solution.
Another example is when you let a user know that if they delete their account, it won't be available anymore. It is OK to provide a 30-day interval during which they can change their mind, but after that it is not OK to keep storing the data just because the user might have deleted the account by accident.
0
I don't think UX as a community has a term for "the user can't do anything wrong." per se. But there are various design philosophies from other disciplines that may apply such as 'foolproof', 'fail-open' or 'childproof'.
For example, my 5- and 6-year-old daughters have been using the YouTube Kids app for over a year and have only had one minor technical difficulty in that entire time (just for reference, they couldn't get the video previews to go away when a video was playing; I showed them how to swipe down and they went away, although if you wait a minute they fade on their own). That is an amazing accomplishment.
One good resource on this subject is Don't Make Me Think by Steve Krug.
Coming from the Mac world I have personally been astounded that something as simple as dragging an application's document from the desktop onto the application's icon in the taskbar does not open the document. We are on Windows version 10 and this still doesn't work. Up until Windows 7 it would bring up an error.
-1
I need to clarify what I mean by "the user can't use anything wrong". I'm not saying that the user should be prevented from using something wrong, but that there aren't any "wrong" ways to use something. If a large percentage of users use a microphone as a hammer (as the Shure SM57 genuinely is used), designers should embrace this and improve the hammer capabilities in the next iteration.
This almost entirely changes the meaning of your title. It goes from being error avoidance to "the bug is a feature".
The closest thing that I can think of is Agile/Lean UX. This is where you have a short feedback loop. You build your product, be it a microphone or a mobile app and get it into the hands of users. Then depending on how they use it you enhance those features.
Also as far as things being used not for their original purpose - I think the buzz-word "pivot" comes in. This is where the microphone folks realise they've built a better hammer by accident and start selling hammers that you sing in to.
There's also another similar but related area where you have mistakes that turn out to be extremely useful: "serendipitous accident" appears to be the relevant term here. I believe the most famous of these is penicillin, but there's also the discovery of Blu-Tack in the UK:
Fleming recounted that the date of his discovery of penicillin was on the morning of Friday 28 September 1928. The traditional version of this story describes the discovery as a serendipitous accident: in his laboratory in the basement of St Mary's Hospital in London (now part of Imperial College), Fleming noticed a Petri dish containing Staphylococci that had been mistakenly left open was contaminated by blue-green mould from an open window, which formed a visible growth. There was a halo of inhibited bacterial growth around the mould. Fleming concluded that the mould released a substance that repressed the growth and caused lysing of the bacteria.
22
22
@PascLeRasc The reason you're getting so much push-back is that what you're suggesting is too extreme. You can't plan for every possible use of your product and make it good for all of them. If I try to hammer in nails with a wine glass it is my fault when it breaks, not the fault of the glass blower for not making it useful as a hammer. In that case I, the user, was wrong. When I then complain to the glass manufacturer and they tell me that I was supposed to use the glass for drinking wine and not hammering nails, they aren't being un-empathetic, they're just right
– Kevin Wells
Nov 29 at 18:16
@PascLeRasc The reason you're getting so much push-back is that what you're suggesting is too extreme. You can't plan for every possible use of your product and make it good for all of them. If I try to hammer in nails with a wine glass it is my fault when it breaks, not the fault of the glass blower for not making it useful as a hammer. In that case I, the user, was wrong. When I then complain to the glass manufacturer and they tell me that I was supposed to use the glass for drinking wine and not hammering nails, they aren't being un-empathetic, they're just right
– Kevin Wells
Nov 29 at 18:16
9
9
@PascLeRasc And people here aren't disputing that we should watch and listen to users to refine our products and make them more usable and intuitive, but there is always a trade off involved and we have to be realistic in our approaches
– Kevin Wells
Nov 29 at 18:17
@PascLeRasc And people here aren't disputing that we should watch and listen to users to refine our products and make them more usable and intuitive, but there is always a trade off involved and we have to be realistic in our approaches
– Kevin Wells
Nov 29 at 18:17
1
1
Why do you think someone thought your wine glass was a hammer? Could you tweak your design so that it doesn't suggest that it's a hammer?
– PascLeRasc
Nov 29 at 18:17
Why do you think someone thought your wine glass was a hammer? Could you tweak your design so that it doesn't suggest that it's a hammer?
– PascLeRasc
Nov 29 at 18:17
18
18
@PascLeRasc I've seen people use all sorts of crazy things for purposes they aren't meant for. If you can imagine a stupid way to use an object I bet someone at some point has tried it. Now if a large number of your users report the same confusions (like hundreds of people using a wine glass as a hammer), then yes, you should look into why that would be. But you will always have one off situations where people do something stupid, and those people should be ignored rather than designed around, don't miss the forest for the trees
– Kevin Wells
Nov 29 at 18:20
@PascLeRasc I've seen people use all sorts of crazy things for purposes they aren't meant for. If you can imagine a stupid way to use an object I bet someone at some point has tried it. Now if a large number of your users report the same confusions (like hundreds of people using a wine glass as a hammer), then yes, you should look into why that would be. But you will always have one off situations where people do something stupid, and those people should be ignored rather than designed around, don't miss the forest for the trees
– Kevin Wells
Nov 29 at 18:20
12
12
@TimothyAWiseman It absolutely can be, for example I'm glad that my flat head screw driver makes for a decent pry bar in a pinch. However not everything can be good for every purpose. To refer back to my first example, a wine glass makes a pretty good cookie cutter if you just want a circle, but makes for a lousy hammer, and even in that case I don't think wine glass makers should try to design them to be better cookie cutters (unless they want that to be a unique selling point to stand out from the market)
– Kevin Wells
Nov 29 at 18:32
@TimothyAWiseman It absolutely can be, for example I'm glad that my flat head screw driver makes for a decent pry bar in a pinch. However not everything can be good for every purpose. To refer back to my first example, a wine glass makes a pretty good cookie cutter if you just want a circle, but makes for a lousy hammer, and even in that case I don't think wine glass makers should try to design them to be better cookie cutters (unless they want that to be a unique selling point to stand out from the market)
– Kevin Wells
Nov 29 at 18:32
|
show 5 more comments
up vote
104
down vote
Accommodation for every possible user interaction is impossible.
Let's use your example, but scale the USB drive up to a whole computer. A user could pull the power cord and expect the computer to shut down safely, with all data magically saved to disk, just like with a USB drive. How should a UX designer prepare for this?
- Lock the cord in place so that the user can't yank it out. This is hard to maintain and replace, and costs more for a feature hardly anyone would want when they could just press the power button. It is also much slower if you need to move multiple computers at once, say, when your company changes its location.
- Remove the computer's write caches. Data is never delayed, and you don't even have to press save when updating a component, but computer speed now slows to a crawl, and a myriad of security concerns have to be accommodated as well.
- Make an emergency power source mandatory. The user is now forced to buy the manufacturer's UPS/battery and to pay to have it replaced, even if they already have a spare at home.
All of the solutions above are worse than a simple manual warning users about the danger of unplugging a running computer.
If you don't expect an electric saw to magically stop running right when it touches your finger, then don't expect computers to do all the work for you. That's why designers and programmers have the acronym RTFM.
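To make the cache trade-off in the second option concrete, here is a minimal Python sketch (an editor's illustration, not part of the original answer; the file name is arbitrary, and os.O_SYNC is POSIX-only, so this won't run on Windows):

    import os
    import time

    def timed_writes(extra_flags, label):
        # Write 1000 blocks of 4 KiB and report how long it takes.
        fd = os.open("testfile.bin",
                     os.O_CREAT | os.O_WRONLY | os.O_TRUNC | extra_flags)
        start = time.perf_counter()
        for _ in range(1000):
            os.write(fd, b"x" * 4096)
        os.close(fd)
        print(f"{label}: {time.perf_counter() - start:.3f} s")

    # Normal writes land in the OS cache and return immediately; the data
    # may not be on the medium yet when the plug is pulled.
    timed_writes(0, "cached writes")

    # O_SYNC forces every write to reach the device before returning:
    # nothing is lost on unplug, but throughput collapses, which is
    # exactly the "slows to a crawl" trade-off described above.
    timed_writes(os.O_SYNC, "synchronous writes")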
edited Nov 27 at 9:01
answered Nov 27 at 4:59
formicini
92
"If you don't expect an electric saw to magically stop running right when it touches your finger" is no longer a valid analogy - see sawstop.com/why-sawstop/the-technology
– manassehkatz
Nov 27 at 5:34
38
I should not have underestimated technology. Still, it falls into solution 1 of my example (SawStop is expensive, requires a new table setup, is hard to maintain, and can't cut wet logs), so the analogy is okay. And besides, maybe someday a computer will do all the work for us; you never know.
– formicini
Nov 27 at 6:41
13
While SawStop is expensive, it's less expensive than the alternative: getting a finger reattached. So it isn't really comparable to locking power cords. Additionally, people don't go around accidentally unplugging computers (sitcoms notwithstanding), whereas they DO accidentally stick their fingers into table saw blades.
– Draco18s
Nov 27 at 14:47
3
I don't see how picking apart my example is an answer. This should have been a comment instead.
– PascLeRasc
Nov 27 at 15:11
3
@PascLeRasc your question is actually 2 questions: "Is this a widely-held view among UX designers/developers? Is there an official term for this philosophy?". I answer only the first, by way of example. The example chosen is similar to yours to help you see the similarity; it is meant neither to pick your example apart nor to demean you. If you did think so, then I apologize.
– formicini
Nov 28 at 1:28
|
show 4 more comments
up vote
101
down vote
Yes, there is a term for this ("the user can't do anything wrong"):
foolproof
But as other answers point out, making something completely foolproof isn't feasible. On Wikipedia I found a quote from Douglas Adams' Mostly Harmless:
a common mistake that people make when trying to design something completely foolproof is to underestimate the ingenuity of complete fools
There is also a term for minimizing what a user can do wrong:
Defensive Design
In Defensive Design you try to design in such a way that users can do the least harm, while not expecting the result to be completely foolproof. Some techniques include:
- Automatic random testing: letting a script feed random inputs to your application, hoping to make it crash (a minimal sketch follows below).
- Monkey testing: user testing, but instructing the users either to try to break the system, or to act as oblivious to the system's workings as possible.
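As an editor's illustration of automatic random testing (a hedged sketch; parse_date is a hypothetical stand-in for any input-handling routine, not something from the answer):

    import random
    import string

    def parse_date(text):
        # Hypothetical input-handling routine under test.
        month, day = text.split("/")
        return int(month), int(day)

    # Hammer the routine with random strings and record any input that
    # raises something other than the anticipated ValueError.
    crashes = []
    for _ in range(10_000):
        junk = "".join(random.choices(string.printable,
                                      k=random.randint(0, 20)))
        try:
            parse_date(junk)
        except ValueError:
            pass                      # anticipated rejection of bad input
        except Exception as exc:      # anything else is a defect
            crashes.append((junk, exc))

    print(len(crashes), "unexpected crashes found")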
answered Nov 27 at 10:58
ONOZ
2
There's a rather good "see also" on the wiki page for Defensive Design on the subject of Defensive programming. It describes three rules of thumb for it, the third of which feels most relevant. "Making the software behave in a predictable manner despite unexpected inputs or user actions." The goal of good UX is to present the user with just the thing(s) they want to do, and to make it clear what will happen when they do it.
– Ruadhan2300
Nov 28 at 11:39
5
"Defensive Design" - good one, that seems to be what the OP is asking in this confusing question.
– Fattie
Nov 28 at 16:24
3
I always say "Fool-proof and idiot-resistant". You can make things that even a fool can't screw up, but no matter how you try to make things idiot-proof, the universe can always make a better idiot.
– Monty Harder
Nov 28 at 22:15
1
I'd also recommend another alternative - instead of preparing for everything that could possibly go wrong, allow the user to go back. If it's feasible to implement undo for a functionality, it's probably going to work 9001% better than anything that tries to prevent the problem in the first place. Indeed, this is also used in the USB drive example - NTFS uses transactions exactly to limit the damage caused by unexpected loss of function (e.g. power loss). It cannot prevent data loss, but it can prevent file system corruption, unlike FAT32 (and for good applications, even data corruption).
– Luaan
Nov 29 at 14:34
1
You have to admit Defensive Programming requires the programmer to 100%, absolutely, without a doubt, understand the entire system. I've had hilarious shopping cart experiences where I opened the developer console and made stores ship to locations that they didn't allow. One time the company shipped the order for no shipping cost because their system didn't know how to handle a country not on their list, and I kept insisting it was their fault (it technically is...). Most developers simply do not have a wide enough scope of knowledge to do proper defensive programming.
– Nelson
Nov 30 at 0:57
|
show 3 more comments
up vote
47
down vote
User-Centered Design
What you’re describing is a consequence of User-Centered Design (a term coined by Don Norman himself). I’ve heard this principle expressed as “the user is always right” and “it’s not the user’s fault”.
As has been pointed out, this type of thinking is not common enough, even among UX professionals. The issue is that we’re trying to “fix” user behavior, rather than matching the user’s mental model.
In your example, the user’s mental model is that the flash drive is ready and can be removed if no files are being copied to or from it. Therefore, we should design our software and hardware to match this and to prevent any errors that might occur as a result. Here are a few suggestions to accomplish this:
- Never keep an external drive in a dirty state longer than necessary. When writing to the drive is complete, get the filesystem into a state where it can be unplugged safely (see the sketch after this list).
- Always show an indication or notification when a drive is in use, such as when a file is being saved (which should also be done automatically!). The system should inform users as to exactly what is happening, so that they know that the drive should not be unplugged yet.
- Ideally, USB ports should be redesigned so that it’s possible for the computer to physically hold the device in place; the operating system would then release the drive when it’s safe to be unplugged. This would make these problems impossible. (This is how CD/DVD-RW drives work when a disc is being burned.) I don’t know if this is feasible from an engineering standpoint, but I think it should have been considered during the design process for USB-C.
- Undo. In case a drive has been unplugged while in use, make it possible to fix the issue by plugging it back in so that the system can resume exactly where it left off.
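One way to read the first suggestion in code: an application can shrink the window in which the drive is dirty by syncing each file as it is written and swapping it into place atomically. A minimal sketch, assuming a POSIX system (the function and path names are illustrative, not from the answer):

    import os

    def save_atomically(path, data):
        # Write to a temporary file first, force it onto the medium,
        # then atomically swap it into place. A reader of `path` sees
        # either the old file or the new one, never a half-written mix.
        tmp = path + ".tmp"
        with open(tmp, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())   # push the data past the OS write cache
        os.replace(tmp, path)      # atomic rename on POSIX filesystems

    save_atomically("/media/usbdrive/report.txt", b"quarterly numbers")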
edited Nov 28 at 14:11
answered Nov 27 at 4:33
David Regev
32
(1) Longer than necessary for what, exactly? If the USB disk is on rotational media, it's entirely possible for tens of seconds of writes to be queued up nearly instantaneously. (3) This is a classic example of fixation on a single goal in disregard of cost, other failure modes, user convenience/frustration, and even safety (see MagSafe), unfortunately far too common in UX design.
– chrylis
Nov 27 at 5:52
4
@chrylis And if the software doesn't show some indicator that the data was only enqueued and not yet written it's rubbish. And if there is a point during the file transfer so that the file system breaks when you interrupt the transfer at that point, then the file system is rubbish. I agree on (3) because for USB drives it makes sense to interrupt a transfer by pulling it out.
– Nobody
Nov 27 at 19:26
8
@Nobody FAT is a pretty lousy filesystem by modern standards. You won't find much disagreement about that. However, it's a fact of life and a design constraint.
– chrylis
Nov 27 at 19:30
1
Yes, this is the correct answer
– Fattie
Nov 29 at 4:17
4
"Foolproof" is not a consequence of "User Centered Design". On the contrary, achieving a foolproof state often means that you have to decrease the usability in other scenarios. I don't recall Norman having said that, and it's not in the Youtube video either. ONOZ answer, in my view, is to the point. formicini give a good example in his answer. I think it's what chrylis means in his comment, but I'm not sure so I leave my 2 cents as well.
– Albin
Nov 29 at 15:38
|
show 1 more comment
up vote
42
down vote
I wonder if the concept you are looking for is Poka-yoke (https://en.wikipedia.org/wiki/Poka-yoke). This is more often associated with mechanical design (e.g. zoo cage double doors which can't both be open at the same time), but you can make an analogy with UX design (e.g. don't offer a delete button when there is nothing available to delete; a sketch of this follows below).
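As an editor's sketch of that UX analogy (using Python's tkinter purely for illustration; the widget and file names are invented): the Delete button below cannot be pressed while nothing is selected, so the invalid action is impossible rather than merely producing an error message.

    import tkinter as tk

    root = tk.Tk()
    items = tk.Listbox(root, selectmode=tk.EXTENDED)
    for name in ("report.txt", "photo.png", "notes.md"):
        items.insert(tk.END, name)
    items.pack()

    # The poka-yoke: the button starts disabled and is only enabled
    # while there is actually something to delete.
    delete_btn = tk.Button(root, text="Delete", state=tk.DISABLED)
    delete_btn.pack()

    def update_button_state(_event=None):
        state = tk.NORMAL if items.curselection() else tk.DISABLED
        delete_btn.config(state=state)

    def delete_selected():
        for index in reversed(items.curselection()):
            items.delete(index)
        update_button_state()

    delete_btn.config(command=delete_selected)
    items.bind("<<ListboxSelect>>", update_button_state)
    root.mainloop()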
answered Nov 27 at 11:35
Kit
2
I like this, thanks. That's a great example about the zoo double doors - it illustrates perfectly how the user shouldn't be able to be at fault.
– PascLeRasc
Nov 27 at 15:24
@PascLeRasc or is it pandering to the lack of common sense...
– Solar Mike
Nov 27 at 15:32
10
@SolarMike It's pandering to the bottom line. Lack of common sense is a fact of nature. You can either let people make mistakes, at peril of profits (or safety!) when an error is eventually made, or you can engineer the job so that they cannot mess it up.
– J...
Nov 27 at 19:10
15
@SolarMike it's as if you've never heard of Murphy's Law. Or NASA.
– Confused
Nov 27 at 19:12
add a comment |
up vote
15
down vote
This is a common UX design principle. The best error message is the one you never have to show: avoid the error in the first place. There are many examples of design principles out there, but no standard set.
Jakob Nielsen used the term "Error Prevention" in his 10 usability heuristics:
https://www.nngroup.com/articles/ten-usability-heuristics/
"Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action."
Apple refers to it as "User Control" in its iOS guidelines:
https://developer.apple.com/design/human-interface-guidelines/ios/overview/themes/
"The best apps find the correct balance between enabling users and avoiding unwanted outcomes."
answered Nov 27 at 4:13
Jeremy Franck
1
Joel Spolsky (praise be) wrote a pretty good article on his blog about this
– Ruadhan2300
Nov 28 at 11:33
1
Or to improve on that, only report error messages which direct the user to how to solve the problem. "File streaming error" isn't a good error message if the actual problem is "Lost internet connection whilst downloading file", just for an example.
– Graham
Nov 28 at 23:38
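A small hypothetical illustration of that point in Python, using only standard-library calls (the message text is invented): catch the low-level failure and report the likely cause plus the next step, not the symptom.

    import urllib.error
    import urllib.request

    def download(url: str, dest: str) -> None:
        try:
            urllib.request.urlretrieve(url, dest)
        except urllib.error.URLError:
            # "File streaming error" tells the user nothing; name the
            # probable cause and what to do about it instead.
            raise SystemExit(
                "Could not reach the server. Check your internet "
                "connection and try the download again."
            )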
Well, you can't have error messages if the mouse is charging and the USB port is under the mouse... (geek.com/wp-content/uploads/2015/10/magic_mouse_2_charging.jpg)
– Ismael Miguel
Nov 30 at 10:39
1
@Ruadhan2300 Almost all of his articles from the late 90's / early 2000's are still surprisingly relevant 20 years later.
– corsiKa
yesterday
up vote
5
down vote
Just approaching this question from an analytical perspective, you'll see this mentality in some UX environments and not in others. If users are heavily limited with regard to what they can do, you'll see more preference for UX that follows the principles you describe. The more freedom users are permitted, the less popular these principles are.
I wouldn't say there's a real name for this effect, but I'd call it "with great power comes great responsibility."
This is the issue with the USB example which has shown up several times in this thread. A user who can physically modify hardware has a remarkable amount of freedom. They have great power over the system, and thus they have more responsibility for what happens. Sure, I can make a USB device which locks in place until files are done copying. That will work as long as you limit their power to gentle tugs on the hardware along the axis of the USB device. A user with a Sawzall can most definitely do something wrong to my USB device if they aren't responsible enough and aren't aware of what cutting a USB device in half while it is connected can do.
Let's not even talk about implementing a PSU to meet this Sawzall requirement...
Any system with a compiler has to face this reality. I can and will do something wrong with my compiler. I will break something. I can delete files I wasn't supposed to delete. Heck, I have deleted such files! I even deleted them in parallel with a glorious multithreaded harbinger of doom! It was bad news, and was most definitely "my mistake."
Contrast that with designing an iPhone app. iOS severely limits what users can do and how they can interact with the applications by design. It's the purpose of a good mobile OS. Likewise, app developers often permit very few operations. That keeps your UX simple. In these situations, it's very easy to capture the small range of operations a user can do and prove that the user indeed cannot do anything wrong. In such settings, it makes a lot of sense from a user experience perspective to support this mentality.
In particular, business apps are designed with this in mind. You really don't want to let a low-paid entry-level worker make a catastrophic mistake with your app. Point-of-sale devices are designed to make sure you don't accidentally email a credit card number to some malicious agent in a foreign nation. You just can't do it!
So we can see both extremes. In some situations you want to make sure the user really can't do anything wrong. In other situations you can't. I think it's pretty reasonable to say there's no dividing line between the mentalities. It's a smooth spectrum from "the user can't do wrong" to "oh my god, the monkey has a knife!"
answered Nov 27 at 23:43
Cort Ammon
7
iOS severely limits what users can do and how they can interact with the applications by design. It's the purpose of a good mobile OS.
- how is that good? That's exactly the reason why I dislike iPhones. I don't want the phone to decide which options should be available to me.
– Džuris
Nov 28 at 10:39
2
@Džuris My dad once characterised the difference between iOS, Windows and Linux as a progression of how much people wanted to be involved in what their computer was doing. iOS users just want to use the applications and Do Things without dealing with a computer, Windows users like a bit more control but ultimately prefer not to think about most of the technical side, and Linux users fear the robot revolution and want to do everything themselves. He was mostly tongue in cheek about it but I think there's a grain of truth there :P
– Ruadhan2300
Nov 28 at 11:31
3
@Ruadhan2300 your dad was getting close, but not quite right. The objective of iOS users is to be seen as the owner of an (expensive and fashionable) high-tech device. The objective of Windows users is to use the computer to get some "real-world" work done. The objective of Linux users is to get Linux itself to work - actually using it once it does work isn't very interesting ;)
– alephzero
Nov 28 at 13:36
4
@alephzero Can you please stop posting unsubstantive comments?
– PascLeRasc
Nov 28 at 17:18
2
@PascLeRasc It's a relevant reply to another comment. And not untrue either.
– Graham
Nov 28 at 23:40
show 5 more comments
up vote
2
down vote
OS [and all software] developers should see this and build their software to accommodate this instead of bothering users with "you did that wrong" messages.
Yes, you're totally, completely, absolutely correct.
Engineers and companies that do what you say make huge amounts of money.
Some of the biggest key products of our entire era are totally based on what you describe.
Is this a widely-held view among UX designers/developers?
Yes, it's one of the central ideas.
It is constantly and widely discussed as one of, or the, central issues in UX.
The BMW 7-series was a nightmare since you had to fight and search for every function among literally hundreds of choices, whereas the masterpiece Renault Espace cockpit was user-driven and the epitome of that.
Is there an official term for this philosophy?
Sure, it is
User-driven design
Not 10 minutes ago I was yelling at some people "make it user-driven". They had some switches etc. that "had to be" set by a customer before use, which is a crap idea. Instead I screamed at everyone to make it "Pascal-style". I literally said "Make this user driven, get rid of the fucking switches."
Yesterday I spent literally the entire workday on precisely the "Pascal issue" in relation to a product, and no other issue.
Two years ago I spent four months personally inventing/engineering/whatever a new sort of algorithm for an unusual graphical interface where the entire end result was eliminating two bad "anti-Pascal-style" actions. (The result made zillions.)
Note that to some extent, the everyday phrase
K.I.S.S.
amounts to, basically, a similar approach.
Note - since the "Pascal-issue" is indeed so pervasive, there are
many, many specific terms for subsets of the concept:
For example, in the literal example you gave, that is known as
plug-and-play
or
hot swappable
Note that a company we have heard of, Apple, arguably made some 10 billion dollars from being first to market with ("more") plug-and-play printers and other peripherals than the competitors of the time, back before you were born.
So, "plug and play" or "hot swappable" is indeed one particular specific subset of the overall user-driven design, KISS-UX, "Pascal-issue".
edited Nov 29 at 4:31
answered Nov 29 at 4:25
Fattie
I agree with the ideas in this and thanks for the writeup, but it's not really what I'm thinking of. I think what I'm really thinking of is more of an industrial design issue than UX.
– PascLeRasc
Nov 29 at 16:36
up vote
2
down vote
We always called it user-proofing, and it's usually the most time-consuming aspect of software development. It's not so much that the user can't do anything wrong, but more that whatever the user does won't crash or break the software. This term dates back to at least 1997, when I started developing professionally, and probably much earlier.
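As a minimal sketch of what that looks like in practice (Python, with invented names): whatever the user types, the program neither crashes nor continues with garbage.

    def ask_age(prompt: str = "Age: ") -> int:
        """Keep asking until the input is a plausible whole number."""
        while True:
            raw = input(prompt).strip()
            try:
                age = int(raw)
            except ValueError:
                print("Please enter a whole number, e.g. 42.")
                continue
            if 0 <= age <= 150:
                return age
            print("That doesn't look like a plausible age; try again.")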
answered Nov 29 at 16:00
Tombo
up vote
2
down vote
I'm shocked to see that no one has brought up the fact that everything in design and engineering has a cost. You can always engineer a better version of whatever you're making that covers more use cases and has more features that users want, but every time you do, you sacrifice something else. The thing you sacrifice may be literal cost, raising the price or lowering profits, or it can be a trade-off in some other way.
To use your example of USB drives being pulled out without ejecting, there are a few associated costs to different approaches.
If you make USB drives lock in place, you add manufacturing cost and complexity to both the drives and the ports, and you decrease usability because they become more cumbersome to put in or take out. Even if someone could make such a drive, I would never buy it and would continue to buy normal USB drives without locks.
If instead you make sure the drive is kept in an ejectable state as much as possible, then you lose performance (since the computer has to do constant cleanup and restrict write times to short bursts). Since one of the biggest selling points of flash drives is read/write speed, that also means no one would want to buy it.
Either way, by trying to cover for this niche UX issue, they have lost a lot of potential customers.
Basically what I'm saying is that you have to do a cost/benefit analysis and decide which features are worth doing and which are beyond the scope of what you're trying to accomplish. Yes, we should watch and listen to users and find out how to refine our products to be more useful in real-world scenarios, but there is always a limit.
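For what the second option costs, here is a minimal Python sketch using only standard-library calls: forcing every write through to the device keeps it close to an ejectable state, but gives up the speed that buffering normally provides.

    import os

    def write_through(path: str, data: bytes) -> None:
        """Write data and push it all the way to the storage device."""
        with open(path, "wb") as f:
            f.write(data)
            f.flush()             # flush Python's buffer to the OS
            os.fsync(f.fileno())  # ask the OS to flush to the device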
answered Nov 29 at 18:11
Kevin Wells
2
In a technical sense, I'm not sure this is an answer, since it doesn't propose a term for the concept, which was technically the question. However, I agree with all of this strongly, and it explains why this isn't done more often. A classic example might be a programming language. Scratch is almost fool-proof, but it is slow, limited, and literally made for kids. A general-purpose programming language like C++ lets the user do things wrong in innumerable ways, but also gives the user tremendous power. Limiting the things a user can do wrong comes at a trade-off in power or efficiency or both.
– TimothyAWiseman
Nov 29 at 19:19
2
Fair point, I suppose my answer is only an extension of the wider discussion about this practice and not a strict answer to the question. Also, the programming language example is a great example of what I'm talking about, as is just about any specialist tool.
– Kevin Wells
Nov 29 at 19:36
up vote
1
down vote
Falling Into The Pit of Success is a term used in the development community. It's focused more on language and library design, but it can be applied to front-end interaction as well. It's definitely vocabulary I would use when discussing UX with other developers to get them on the same page.
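A minimal Python sketch of the idea (names invented for illustration): the obvious way to use the API is also the safe way, because cleanup happens automatically instead of being a separate call the caller can forget.

    from contextlib import contextmanager

    @contextmanager
    def open_session(name: str):
        session = {"name": name, "open": True}
        try:
            yield session            # the caller works in here
        finally:
            session["open"] = False  # always released, even on error

    # The one-liner a caller naturally reaches for is the correct one:
    with open_session("demo") as s:
        print(s["name"])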
answered Nov 30 at 9:49
Taran
up vote
1
down vote
Your philosophy may not be applicable to everything - some processes will always require a learning curve - but you are onto something.
Whenever I got lost in an online banking website, I used to tell the support people:
If I cannot find it (as a skilled computer professional), you are doing it wrong.
answered 11 hours ago
daniel.sedlacek
up vote
0
down vote
User-centric is the broad principle and, IME, it's widely accepted among modern software product teams and many hardware product teams.
More specifically, I think Activity Centered Design deals directly with this issue. ACD addresses the user's entire workflow and how the product can fit into, augment, or alter that flow.
ACD changes the perspective from
"how does the user want this thing to perform a function" to
"how can we make the user fundamentally more successful at this job".
If you do ACD (or UCD) without accommodating user "error" then you did it wrong and you need to keep iterating.
answered Nov 30 at 0:48
plainclothes
add a comment |
up vote
0
down vote
The “user can't use anything wrong” design doesn't exist, because it shouldn't. A "preventing" design, one that avoids conscious mistakes and user errors, is one thing; assuming “the user can't use anything wrong” is another, and it can have negative consequences.
Nielsen Norman Group, in Preventing User Errors: Avoiding Unconscious Slips, explains:
The designer is at fault for making it too easy for the user to commit the error. Therefore, the solution to user errors is not to scold users, to ask them to try harder, or to give them more extensive training. The answer is to redesign the system to be less error-prone.
General Guidelines for Preventing Slips:
- Include Helpful Constraints
- Offer Suggestions
- Choose Good Defaults
- Use Forgiving Formatting
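To make "Use Forgiving Formatting" concrete, here is a minimal Python sketch (a hypothetical US phone-number field, not an example from the NN/g article): accept whatever separators the user types, normalize them to one canonical form, and reject only input that genuinely cannot be interpreted.

```python
import re

def normalize_us_phone(raw: str) -> str | None:
    """Forgiving formatting: accept '(555) 123-4567', '555.123.4567',
    '5551234567', '+1 555 123 4567', and so on, instead of forcing
    the user to match one rigid pattern."""
    digits = re.sub(r"\D", "", raw)           # keep digits only
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]                   # drop the US country code
    if len(digits) != 10:
        return None                           # genuinely unusable input
    return f"({digits[:3]}) {digits[3:6]}-{digits[6:]}"

assert normalize_us_phone("+1 555.123.4567") == "(555) 123-4567"
```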
On the other hand, if you are too forgiving when you collect a user's address, how can you use it to deliver a product? Chasing users down because you didn't want to stress them into providing a correct address can't be the solution.
Another example is letting users know that if they delete their account, it won't be available anymore. It is fine to provide a 30-day interval in which they can change their mind, but after that it is not OK to keep storing the data just because the user might have deleted the account by accident.
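A minimal sketch of that 30-day pattern (the class and function names are illustrative, not any particular product's API): deletion is soft at first, reversible during the grace period, and permanent afterwards.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

GRACE_PERIOD = timedelta(days=30)

@dataclass
class Account:
    email: str
    deleted_at: datetime | None = None

    def request_deletion(self) -> None:
        # Soft delete: mark the account but keep the data for now.
        self.deleted_at = datetime.now(timezone.utc)

    def restore(self) -> bool:
        # The user changed their mind within the grace period.
        if self.deleted_at is not None and \
                datetime.now(timezone.utc) - self.deleted_at < GRACE_PERIOD:
            self.deleted_at = None
            return True
        return False

def purge_expired(accounts: list[Account]) -> list[Account]:
    # After the grace period the data must actually be removed,
    # not kept around "just in case".
    cutoff = datetime.now(timezone.utc) - GRACE_PERIOD
    return [a for a in accounts
            if a.deleted_at is None or a.deleted_at > cutoff]
```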
answered 2 days ago
Madalina Taina
add a comment |
up vote
0
down vote
I don't think the UX community has a term for "the user can't do anything wrong" per se, but there are design philosophies from other disciplines that may apply, such as 'foolproof', 'fail-open', or 'childproof'.
For example, my 5- and 6-year-old daughters have been using the YouTube Kids app for over a year and have had only one minor technical difficulty in that entire time: they couldn't get the video previews to go away while a video was playing. I showed them how to swipe down to dismiss them (though if you wait a minute they fade on their own). That is an amazing accomplishment.
One good resource on this subject is Don't Make Me Think by Steve Krug.
Coming from the Mac world, I have been astounded that something as simple as dragging a document from the desktop onto its application's icon in the taskbar does not open the document. We are on Windows 10 and this still doesn't work; up until Windows 7 it would even bring up an error.
edited 15 hours ago
answered 15 hours ago
Marlon D.
add a comment |
up vote
-1
down vote
I need to clarify what I mean by "the user can't use anything wrong". I'm not saying that the user should be prevented from using something wrong, but that there aren't any "wrong" ways to use something. If a large percentage of users use a microphone as a hammer (like the Shure SM57 genuinely is), designers should embrace this and improve the hammer capabilities in the next iteration.
This almost entirely changes the meaning of your title. It goes from being error avoidance to "the bug is a feature".
The closest thing I can think of is Agile/Lean UX, where you have a short feedback loop: you build your product, be it a microphone or a mobile app, and get it into the hands of users; then, depending on how they use it, you enhance those features.
As for things being used for something other than their original purpose, I think the buzzword "pivot" applies. This is where the microphone folks realise they've accidentally built a better hammer and start selling hammers that you can sing into.
There's also a similar but related area where mistakes turn out to be extremely useful; "serendipitous accident" appears to be the relevant term. The most famous of these is probably penicillin, but there's also the discovery of Blu Tack in the UK:
Fleming recounted that the date of his discovery of penicillin was on the morning of Friday 28 September 1928. The traditional version of this story describes the discovery as a serendipitous accident: in his laboratory in the basement of St Mary's Hospital in London (now part of Imperial College), Fleming noticed a Petri dish containing Staphylococci that had been mistakenly left open was contaminated by blue-green mould from an open window, which formed a visible growth. There was a halo of inhibited bacterial growth around the mould. Fleming concluded that the mould released a substance that repressed the growth and caused lysing of the bacteria.
answered Nov 30 at 11:08
icc97
add a comment |
83
What you write about USB drives is, unfortunately, impossible physically. The OS needs to clean stuff up in the filesystem before the drive is disconnected. And the OS can not know your intentions if you don't warn it. So: what do you do if making sure something can't be done wrongly is impossible?
– Jan Dorniak
Nov 26 at 22:54
52
This isn't true. A file system can pre-emptively do all of this. And almost all modern operating systems, even Android, do exactly this. The warning messages are there out of habit and in the vain hope it will discourage users from pulling out a memory stick whilst files are being transferred.
– Confused
Nov 27 at 1:13
77
@Confused That is simply not true. By default on Windows, write caching is ON, and yanking out the drive, even if you think you've finished writing to it, can and will cause your data to become corrupted. I've seen it. It's not "out of habit" or "in the vain hope" - it is the consequence of an actual feature. You can disable write caching, though (it's probably called something like "enable fast removal" in your OS).
– Lightness Races in Orbit
Nov 27 at 12:36
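To illustrate the write-caching point at the application level, here is a minimal Python sketch (an illustration of the general principle, not the OS-level mechanism itself): until buffers are explicitly flushed to the device, a write that has "finished" in the program may still exist only in RAM, which is exactly the window in which yanking the drive corrupts data.

```python
import os

def write_durably(path: str, data: bytes) -> None:
    """Write bytes and make sure they actually reach the device,
    not just a write cache."""
    with open(path, "wb") as f:
        f.write(data)          # may land only in a buffer in RAM
        f.flush()              # push Python's userspace buffer to the OS
        os.fsync(f.fileno())   # ask the OS to push its cache to the device
```

"Safely remove" / eject does roughly this for every pending write on the filesystem, which is why skipping it can lose data the application believes it already wrote.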
38
I think the mistake here is using USB as an example. USB is hardware, and hardware will always have some physical limitations. You might be able to write pure software this way, but not hardware.
– Dave Cousineau
Nov 27 at 16:49
37
Another example where the user clearly is using it wrong: storing important items in the trash/recycle bin/deleted items/etc. This is actually disturbingly common...
– Gordon Davisson
Nov 27 at 20:48