1 - The Psychopathology of Everyday Things

An untrained person seated in the cockpit of a modern jet liner would be completely bewildered by the array of gauges and controls before him - but he would not be surprised or bothered by this, because he recognizes that flying an airplane is a complicated task that he is unable to perform. However, when he struggles to operate a door, a switch, a water faucet, or some other common everyday device, he becomes quite upset.

There are few things quite so simple as a door. To open it, you can pull it or you can push it. And yet, this is often perplexing - and when you figure out the basic signs that indicate whether to pull or push, you will find they are used inconsistently. And sometimes, you will even encounter a door that slides. So fumbling with a door happens quite often: we push a door that is meant to be pulled, or vice versa. We solve the problem rather quickly and are not permanently obstructed, but it's annoying all the same. And it happens far more often than it should. The author pauses to mention he has used this example so often that a poorly designed door is now called a "Norman Door" - Google it.

The author refers to Jacques Carelman, who developed an entire series of books called "Catalogue d'objets introuvables" (roughly, "Catalog of Unfindable Objects") that provides many examples of everyday things that are unworkable, ill-formed, ill-conceived, and thoroughly bad in every way. It's an entertaining read, but a sad reminder of how many badly designed objects frustrate and annoy us in our everyday lives.

He tells an anecdote about a friend getting trapped in a post office between two rows of glass doors. The visual effect of the architecture was quite elegant, and "probably won design awards," but made it unclear which side of the door would swing open. So after passing through the first set of doors, he pushed on the wrong side of a door (the side with the hinges) and assumed it was locked. He turned around and did the same thing, and then assumed that the door had locked behind him. He remained stuck there until someone else wandered through the doors and he was able to figure out what had gone wrong.

The two most important characteristics of good design are:

- Discoverability: the user can figure out what actions are possible, and where and how to perform them.
- Understanding: the user can make sense of what the product is for, what its controls and settings mean, and how it is meant to be used.

The glass doors in the anecdote above show a failure of discoverability: a glass pane with four metal squares at the corners does not enable the user to recognize how it can be used, or even to perceive by inspection how it might be used. Omitting a push-plate and concealing the hinges were deliberate design choices - and very bad ones, as they made it impossible to see how to operate the door, or even to recognize it was a door at all.

The more complex a device becomes, the more difficult it is for an individual to determine what to do merely by looking at it. We provide manuals or training courses to teach people to use complex devices like computers or airplanes. But the author suggests that many devices are made complex by including too many functions, controls, and unnecessary features. A washing machine shouldn't look like the control room of a spaceship, with a bewildering array of controls and displays, considering that most people will never use all of the settings and features the modern washing machine provides.

Another brief anecdote: a couple who purchased a state-of-the-art washer/dryer combination device were perplexed by all the controls. The husband refused to go near it, and the wife figured out one method of getting it to work and ignored everything else. These weren't stupid people - one was an engineer and the other a physician - they simply didn't want to devote the time to learning a complex device to perform the simple task of washing clothing.

The Complexity of Modern Devices

Begin with the premise: all artificial things are designed.

That is not to say that they are designed well, or that much thought is put into them - but they are intentionally crafted and arranged by someone. And given that, there are an enormous number of things that impact the daily lives of modern man that have been designed - and unless you're on a camping trip in the wilderness, chances are the "designed" objects in your environment far outnumber the ones that are truly natural.

In the best of cases, products should be designed to be easy to use, such that they do not become an obstruction, but can be used to accomplish the things we wish to do with a minimum of distraction. Sadly, this is often not the case, and the stress and frustration of modern life largely arises from the fact that we live in a poorly designed world, full of awkward and difficult things - and this is the failure of their designers.

The design of everyday objects seems very mundane and unimportant - but given that badly designed objects rob life of its pleasure, it is a very worthwhile undertaking. And that is the subject of this book, though it is likely too large a topic for a single book to address, hence the author plans to focus on three areas:

- Industrial design: the form, material, and appearance of products
- Interaction design: the way people interact with technology
- Experience design: the quality of the person's total experience with a product or service

In essence, these disciplines consider the way in which people must behave to accomplish a goal, which is a departure from earlier forms of design that focused entirely on the device, object, or process and treated the people as being of little importance.

Unfortunately, the history of design has been largely indifferent to the needs and interests of human beings. Tracking the flow through a service experience, "the customer" is treated much the same as any other piece of mechanical equipment, to be moved to locations and engaged in activities that make the process efficient, with the presumption that they will be glad to yield to the rules dictated by the company or organization that is "serving" them.

And worse, there is a standing deficiency in human-machine interaction: the human is made to do work the machine itself should be capable of doing - and in some instances this is done with so little regard that it often seems that the human is serving the needs of the machine rather than the other way around.

Another unfortunate attitude on the part of engineers is that the human being using the machine is to blame for any mistake that occurs. The device is not overly complex or awkward to use, the user is just stupid and needs to read the manual again. "Human error" is a convenient excuse for a badly designed machine.

The irony of the situation is that people don't know how to use devices designed by engineers who don't understand people. Had the engineers begun with an understanding of the way that people work, and designed their devices to accommodate them, many of these "problems" would have been avoided. The problem, in effect, is that engineers design machines for "people the way you want them to be, not for the way they really are."

Human-Centered Design

The author suggests that, in the decades since the first edition of this book, design has gotten better - there are many books and courses on the topic. And yet, design problems remain - in part because technology is outpacing design. There is no time to allow the market to sort out the problems with a device before it is obsolete, and is replaced by a new device.

But no matter the technology, the principles of design are the same: a device is used by a human being to address a human need - and if it is to be successful the device must be based on the capabilities and behaviors of that human being. It is not about what can be done with plastic, metal, and microchips, but what a human being can do with those materials if they are configured a certain way.

Human-centered design (HCD) is an approach that begins with the needs, capabilities, and behaviors of human beings and asks what a device must do to enable the person to accomplish a task (rather than the other way around). But more than that, it recognizes that people don't wish to accomplish tasks - they wish to gain benefits, and performing a task is a means to achieve a desired benefit.

Random thought: another factor in good design is that it considers what will happen when things go wrong. One of the major shortcomings of design thus far has been the assumption that everything will go perfectly, and a refusal to consider how the user might recover when it doesn't.

Fundamental Principles of Interaction

The job of the designer is to produce a pleasurable and beneficial experience - not a perfect object. Engineers tend to obsess over the perfection of the objects they are creating, aiming for a device that is a wonder of technology - and which may be difficult, confusing, and deliver no value to the person who uses it.

In a commercial sense, experience is critical - it determines how people feel about their interactions with objects. And emotions are important to determining whether they feel satisfied, and what kind of stories they tell to others. Most people are indifferent to the technical wonder of a product that is difficult to use and delivers no real benefit.

(EN: Fanboys and nerds, however, are obsessive about inherent technical qualities, and it's not just for digital devices. There are automotive fanboys who take great pride in being able to drive a car that is difficult to operate - were it not so, the manual transmission would have disappeared decades ago.)

Consider that cognition and emotion are highly intertwined. The way a person feels about something drives the way they think about it. A user who feels frustrated by a difficult experience will generally not say positive things about the technical sophistication of the product that has annoyed him and made it difficult or impossible to achieve his goals.

The author then makes a clumsy transition to the "fundamental principles" of interaction: affordances, signifiers, constraints, mappings, and conceptual models.


Affordances

(EN: I have the sense that this very author may be the reason so many designers totally misuse the term "affordance" to mean any property of an object that is useful, or provides a visual indication of how to use it, or anything for which they seem not to know the proper term. I'll refrain from further comment, except to say that the poor use of the term here likely indicates the need for less vague thinking on the properties of objects.)

The author suggests that an "affordance" means a signal to the user that indicates to him how an object is to be used. His definition is broad enough to reflect actual affordances, such as a push-plate on a door indicating the user should push to open, along with natural properties, such as a shelf's flatness being an affordance that indicates things should be placed on it.

He then relates affordances to human capabilities. If a chair can be lifted, then it "affords" being moved - but if it cannot be lifted, it does not. This means that the author's notion of affordance varies with the individual: a strong person will find a chair to be movable while a weak person will not.

His notion of affordance is that it is jointly determined by the qualities of the object and the abilities of the individual who uses it - which he concedes is problematic because the abilities of the user are usually unknown and defy prediction.

He switches to the physical properties of things for a while: glass is typically transparent, such that it allows light to pass through it, but does not allow the passage of physical objects (aside from certain subatomic particles), at least not without being broken. As a window, these properties work out well - you can see through the window but things cannot pass from outside to inside. As a door, they don't work out so well (a considerable number of people walk into glass doors) unless the property of transparency is somehow addressed in a manner that lets people realize that there is a pane of glass blocking the way and that they should not step into it.

People learn to recognize the properties of things, and they learn the affordances that are customarily provided to operate them. He fusses a bit about the difference between perception and cognition - it is perception that sees a small metal rectangle affixed to a large wooden one, but it must be cognitively processed for us to recognize the metal object is a push-plate and the wooden one is a door.

Affordances exist even if they are not visible - just because an individual does not know how to operate the switch that turns a device on doesn't mean that the device lacks an affordance for being turned on - merely that the person cannot figure out how to do it. But this transitions to a different topic - the qualities that signal that something is an affordance to the user are "signifiers" (to be discussed next).


Signifiers

Again, a signifier is something that indicates to the user that an affordance exists for him to do something. A door without a push-plate has the affordance to be opened, but adding a push-plate suggests to the user how the door should be opened (omitting it might cause users to be unaware it is a door at all).

And if that is not enough, the plate can be inscribed with the word "push" to further signify its operation - but the author later remarks that a blatant signifier is an indicator of bad design. (EN: I don't entirely disagree, but when a device or affordance is by necessity new, such that people have not encountered such a thing before, being blatant is not only necessary but good, compared to making the user bear the risk of making a wrong guess.)

From a design perspective, the visual appearance of typical affordances constitutes a certain language that we count upon people having learned: we count on people knowing a knob can be turned, that a slot is for inserting something, that a switch can be flipped from one position to another. One of the problems many designers have is in attempting to reinvent existing affordances, violating the known language of signifiers - such that users must learn a different way of operating the device.

In terms of communication, deviating from the known signifiers is similar to using a word that someone else doesn't understand - or worse, attempting to redefine an existing word to mean something other than what everyone already understands. What results from this is not delight, but puzzlement and often embarrassment for a user who can't figure out how to switch on a lamp until someone else shows them the special way that a badly designed lamp must be operated.

He gripes a bit about the misuse of the term "affordance" to mean a signifier. To place a circle on a panel where the user can touch it to perform a function is not creating an affordance, but placing a signifier. The affordance, in the author's definition, is created jointly by the engineer who arranges the device to do something when the spot is touched and by the user who possesses a finger and adequate strength or body temperature to activate it.

The problem for designers is a practical one: to make the capabilities of a product available to the user - and not merely available, but obvious and understandable at a glance. This may mean changing the form of the object itself so that the means to operate it are self-evident, or more aptly that they are understood by the user, or adding signifiers to make them evident when they are not.

The author admits that he is borrowing the term "signifier" from semiotics and changing its meaning a bit - to include any mark, sound, or other perceptible indicator that communicates "appropriate behavior" to a person. (EN: it would likely be more accurate to consider it to communicate a relationship between a cause the user will contribute and the effect the device will deliver. Whether it is 'appropriate' becomes hazy, and it can then be argued the user failed to do what the designer considered to be appropriate.)

The author mentions that signifiers may take on their own meaning, or that users may misinterpret them. Particularly when the significance of something is unclear, people will discover or make up their own meanings. (EN: A good example of this is the "low oil" indicator in a car - I've met more than one person who believes that is an "oil change" indicator.)

There's also the notion that the digital channel causes signifiers to be lost. Consider that a bookmark in a physical book temporarily marks the current place in the text, but also enables the reader to appreciate his progress merely by seeing the location of the bookmark within the book. Electronic bookmarks are not temporary (the bookmark is not removed when a user returns to a web site), nor do they give any visual indication of how far the user has left to go (though a scrollbar does that for a single page).

The very worst signifier is one that is misleading. To place a "push" sign on a door that must be pulled is to deliberately misguide and frustrate users.

He mentions a clever instance of a signifier that was intentionally misleading: vertical pipes that blocked the entry to a street were flexible so that emergency vehicles could simply drive over them and they would fold without harming the vehicle - so these "pipes" were a signal of no entry to those who were unaware, but permitted access for those who knew.

In design, signifiers are more important than affordances - they communicate how to use the device to accomplish a given outcome, rather than leaving the user to go through a trial-and-error process trying to discover how it should be used.

There's a brief anecdote about a mobile application for which there was no affordance to pull down the menu - the designer expected users to simply know that they could swipe down to reveal it. People couldn't figure it out - when it was shown to them it seemed quite obvious, but because there was no signifier, they could not have guessed that capability existed.


Mappings

Mapping is a technical term, borrowed from mathematics, meaning the relationship between the elements of two sets of things. Suppose there are many lights in the ceiling of a classroom or auditorium and a row of light switches on the wall at the front of the room. The mapping of switches to lights specifies which switch controls which light.

"Natural" mapping refers to a situation in which the arrangement of controls matches the arrangement of items they control. If there are three lights and three switches, the left switch operates the left light, the middle switch the middle light, and the right switch the right light. Deviating from this order can be confusing to the user because they must either be told, or learn by trial and error, something that does not make sense at a glance.

Mapping is different from the mechanical operations of things. For example, the mechanism for steering a boat is quite complex. The user understands that if they push the tiller to the left, the boat turns to the right. An engineer would say this is not correct - pushing the tiller causes the rudder to move to the right, the rudder blocks the force of flowing water on the right side of the boat, and the unobstructed water on the left side of the boat creates pressure against the keel, and this difference in pressure causes the boat to rotate to the right. The user doesn't care, and doesn't need to know, the precise sequence of events - just the action he takes and the effects it will have.

(EN: Arguably so, but this ignorance leads to mistakes. Pushing the tiller to the left doesn't turn the boat to the right if the boat is not moving, because there is no flowing water to create the pressure. And the boat doesn't turn, but rotates, so the path the boat takes will be slightly different than assumed and could lead to a collision with a close object. So the user does need a bit more information, and for the information to be more accurate than "pushing the tiller makes the boat turn right" - though I would agree that it needs to be in terms he can understand, rather than describing the precise mechanism.)
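The gap between the user's one-line conceptual model and the fuller mechanism (including the stationary-boat caveat) can be sketched roughly. A minimal illustration with hypothetical function names, not anything from the book:

```python
# The user's conceptual model is a one-line rule; the engineering model
# carries the extra condition that makes the rule true.

def user_model(tiller_direction):
    # Conceptual model: "push the tiller left, the boat turns right."
    return "right" if tiller_direction == "left" else "left"

def engineering_model(tiller_direction, boat_moving=True):
    # The mechanism: the rule only holds while water flows past the
    # rudder -- the refinement the editor's note argues the user needs.
    if not boat_moving:
        return "none"  # no water flow, no turning force
    return "right" if tiller_direction == "left" else "left"

print(user_model("left"))                            # right
print(engineering_model("left", boat_moving=False))  # none
```

The simpler model is sufficient for everyday steering; the richer one matters exactly in the edge case where the two disagree.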

Loose bits:

To understand how to place and arrange controls effectively requires considering the behavior of the user, not the properties of the device.


Feedback

Feedback is an indication to the user that he has done something. The most natural feedback is the responsiveness of the device: when the user flips a switch, he sees the light turn on, and knows he has operated it correctly.

In an absurd scenario, there may be switches in one room that operate lights in another - such that the user who flips the switch has no immediate evidence that a light was turned on because he cannot see it happen.

In a less absurd scenario, consider a dishwasher that is manufactured to be completely quiet, such that when the user presses the "start" button he has no evidence that the device has turned on, but must take it on faith that it has.

Another scenario: consider the call button for an elevator. There is a natural delay between pushing the button and the elevator's arrival - but the user does not know this, and may press the button multiple times if it does not arrive promptly, then assume the lift is out of service and take the stairs, even though the car is on its way to his floor. Or consider pressing the power button on a computer: it takes several seconds before the monitor shows anything.

In these scenarios, some form of feedback must be added because there is no natural feedback that the operation was completed correctly. For example, a light near the button illuminates to signal to the user that the command was received and executed (or is in the course of being executed).
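The pattern described above - adding an artificial, immediate acknowledgment when the natural feedback is delayed - can be sketched roughly. The class and method names here are hypothetical, chosen only to mirror the elevator example:

```python
# Sketch: a call button whose only "natural" feedback (the elevator
# arriving) is delayed, so the design adds an immediate signal -- an
# indicator light that turns on the moment the command is received
# and off when the request is fulfilled.

class CallButton:
    def __init__(self):
        self.light_on = False       # the added, artificial feedback
        self.car_requested = False  # the actual state of the request

    def press(self):
        # Immediate feedback: acknowledge the command right away,
        # even though the elevator itself takes time to arrive.
        self.light_on = True
        self.car_requested = True

    def car_arrives(self):
        # The request is fulfilled; the acknowledgment is withdrawn.
        self.light_on = False
        self.car_requested = False

button = CallButton()
button.press()
print(button.light_on)  # True: the user knows the command was received
button.car_arrives()
print(button.light_on)  # False
```

Without the indicator light, the user has no way to distinguish "command received, please wait" from "button broken" - which is precisely what provokes the repeated pressing.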

Good feedback is perceptible and immediate - even a delay of a tenth of a second has been shown to be disconcerting. Ideally, it is also informative. "A light came on" doesn't tell the user anything by itself, as he must learn what the light means. If a message displays that tells him literally what is happening, all the better. If a light comes on, the machine beeps, or some other ambiguous thing happens, it doesn't really tell the user anything - which can in some instances be irritating.

(EN: This can be overdone, and for devices that are used often it can become quite annoying to get messages. For example, if the elevator call button lights up, most people will understand that means the car is coming to their floor to pick them up. They don't need a verbal announcement telling them what is happening - and if they use the same elevator every day, the verbal "the elevator has been activated and will arrive shortly" announcement would become quite a nuisance.)

Too much feedback is likewise bad. The behavior of "backseat drivers" is a well-known irritant. What they suggest is often correct and helpful, but the driver who is constantly pestered by a backseat driver, who often tells him things he already knows, is not grateful for the constant stream of directions.

(EN: This isn't feedback, but instruction. A better example, though thankfully less common, is people who narrate your actions after the fact. "Turn right ahead" is helpful, but unnecessary. "I see you just made a right turn" is not. And "you should have made a right turn" is even worse.)

Designing feedback well is therefore a difficult balance - giving the user enough, without giving them too much, both in terms of the frequency and information content of the feedback.

He also mentions that engineers like to save money by using a single feedback mechanism to communicate multiple things. Given a single LED light, they try to cram multiple functions into it: a short flash means one thing, a long flash means another, a series of two short and one long flash mean something else. The user must somehow learn (because it is not at all intuitive) what the language of flashes means for this device - and worse still, every machine uses its own patterns of flashes, so the user has to become multilingual in learning to speak the language of each device he uses - because each of them speak a different language, and none of them speak his.

A final note concerns the intensity of feedback: feedback that confirms an action has been understood should be subtle and unobtrusive. Starting the dishwasher successfully should not be acknowledged by a loud alarm that makes the user think something has gone terribly wrong. Though a loud alarm would likely be appropriate where there is considerable danger - like shutting off the fuel while an airplane is in flight.


A "conceptual model" (or "mental model") is an explanation of the way in which a user expects a thing to work. This is different to an engineering model, which is accurate and detailed. Per the earlier example, "pushing the tiller to the left makes the boat turn right" is a conceptual model, and quite sufficient to the needs of a person who means to steer the boat.

The graphical user interface to a computer system supports a conceptual model of files and folders. Early computer systems depicted the hard drive as a filing cabinet, directories as folders, and computer files as slips of paper. The user quickly grasped the concept of putting a given document into a folder in a filing cabinet, and the metaphor spread to virtually all personal computers.

However in the present day, the computer's conceptual model has become more abstract. A computer application is not the same thing as a file, but is treated the same way. Putting a folder inside a folder inside a folder has no correspondence to anything that is done in the physical world. And there are many misconceptions about "cloud" data storage - as people can't relate to sticking a piece of paper into a cloud to store it. It takes some explanation to learn the language of the modern computer system.

He mentions that users bring their own mental models to a device, and those mental models may be wrong. Moreover, people may have different mental models, which can be observed when there are two or three people standing in front of a photocopier arguing about which button to press to make the device collate and staple the copies.

The conceptual model evolves during interaction with a device. In the worst of cases, a person fumbles about with a machine and observes what works or what does not work, making many mistakes along the way. In better instances, there is adequate instruction and feedback for the user to make informed guesses about the machine's operation - he does not guess which button to press to start the device because there is a button that looks like a power button he has seen on other devices, or better still has the word "start" upon it. When there is a clear indication of how to use the device, most users abandon their assumptions and accept what the device is telling them.

Some objects provide instructions by their physical form: a user can easily figure out how to use a hammer just by virtue of the shape of the device and needs no instruction. Or the handle can be designed in such a way that there are grooves for the user's fingers, helping them to know not just where to grasp it, but how to grasp it. The finger-sized holes in a bowling ball are likewise simple to figure out at a glance, though a person may have to work out which fingers to use.

However, things are not always designed to be intuitive, and some cannot be. The author describes a digital watch with three buttons - it's fairly easy to guess that pressing the top button moves something up and the bottom moves something down, but there is a bizarre and unnatural code for selecting what is being adjusted. Changing the time requires the user to hold the top and bottom buttons down until the watch beeps, then using the middle to toggle between hour and minute, then holding the middle down to go back to operation mode. Changing the date requires holding the middle and top button, and setting the alarm requires holding the middle and bottom button. There is no way to intuit the way these controls work - the user must read an instruction manual.

(EN: I note that the author, like many who criticize design, provides no solution that would be intuitive - it's very easy to point out what's wrong, but hard to suggest what's right. Particularly given the wristwatch model, creating intuitive controls with clear indicators is impossible given the small size of the device. Or if it is not strictly impossible, no-one has even drawn a plausible design that would do so.)

The conceptual model is a valuable way to approach a design - consider what a user is likely to believe or likely to try before approaching a device, and then design the device to accommodate that behavior inasmuch as it is possible to do so. If you can pull that off, the device is easy to use and requires no learning or thinking to operate.

A complex model is only necessary for a complex device. A hammer is a simple device, one that a toddler can figure out unless a designer goes out of his way to make the device incomprehensible.


The author goes into a bit of a rant about a refrigerator he purchased recently. Old-fashioned refrigerators had no controls: you plugged them in and they worked. The device was designed to cool the large refrigeration compartment to one temperature and the small freezer compartment to another. A user who knows how to open a door could use the device, needing only to learn the difference between the two compartments.

His "modern" refrigerator has multiple controls for temperature ...

So for the simple matter of temperature control, the user had to go from learning nothing at all (accepting the capabilities of the device as manufactured) to having to learn to use four controls to set specific temperatures. Moreover, the device doesn't tell him what temperature is appropriate - so the user has to go to a separate reference to figure out what is the proper temperature for each compartment and the benefits of having it a few degrees warmer or cooler.

To make matters more complex, the refrigerator gave him a great deal more control than just temperature: the humidity of each compartment could also be controlled. There were dampers for various bins that could be adjusted to increase or reduce the flow of cold air. The ice-maker could be configured in multiple ways and you could specify at the time of use whether to have whole cubes or have the ice crushed to varying consistencies. Oh, and there was also an air filter and water filter on the machine that the user needs to learn to inspect and replace.

(EN: It seemed a long tirade and I thought he was exaggerating a bit - but then an internet search for "refrigerator manual" turned up one with more than ten letter-sized pages of instructions in small type just for operating the various features - not including the installation and maintenance sections.)

This is a clear case in which the designers of the device have gone completely insane, because they are focused on the capabilities of the device and not the needs or desires of the user - who merely wants to keep some things cold and other things frozen.

So in this instance, the question should not be how to make all these controls more intuitive - but to ask if they should be provided at all.

(EN: And I strongly suspect manufacturers research this, but likely did so badly. The engineers came up with a long list of features and they asked people if they would value them - and people in a lab will often say "yes" to that question without giving it much thought. An observational study would be better.)

The System Image

The author presents a notion of a "system image" to include the various experiences that give a person an impression about what a device ought to do and how it can be made to do it. This is drawn from experience with similar things in the past, what was communicated by marketing and sales, what was learned from articles and other sources, and the like. It is, in effect, the expectations we have, from whatever source.

Expectations are, by their nature, a fabrication that may be incomplete or inaccurate, and the degree to which these expectations lack accuracy influences our experience of interacting with the device for the first time. The more accurate the expectations, the more successful and satisfied the user will be without needing training or instructions.

Where things go awry, the conflict between designer and user is that each thinks the other is responsible. But ultimately, the user must be accommodated: devices are purchased to deliver a benefit, and if the user is unable to get the benefit because he cannot figure out how to use it, then he will choose a different product (or choose to accept that his need will remain unfulfilled). If the product is not purchased, the designer is not paid to design.

(EN: This probably needs to be taken a step further, because in a commercial situation the purchaser must believe, even before he has the product in hand, that he is capable of getting the benefit of the product. Particularly for technology products, there are people who refuse to purchase, or even consider purchasing, a given item because they fear it will be too complicated. The perception of usability is what causes them to purchase the first time - the experience of usability is what gets them to repurchase and recommend.)

The Paradox of Technology

The paradox of technology is that it promises to make life easier and more enjoyable - but at the same time each technology we obtain is difficult and frustrating. Consider the earlier example of the refrigerator: at one time all you had to do was plug it in. Now it must be set up, configured, and maintained on a regular basis. So the net effect is worse: the added features of the modern-day refrigerator make it consume more time and effort to do the same task the old one did.

(EN: On a grander level, recall that in the 1890s a lower-middle-class family had only one working parent and two domestic servants. Nowadays even upper-middle-class families have two working parents and no domestics. So in reality, the vacuum cleaner didn't save the housewife, but enabled her to take on a task that a servant used to perform ... and to do it after she gets home from a day at work.)

He goes on a tear about wristwatches: how the first ones were simple mechanisms that used a single knob - pull and twist to set the time. But adding day, date, year, month, stopwatch, alarm, countdown timer, multiple time zones, calculator, and other functions made the simple wristwatch very difficult to fathom. You could say that it replaced a dozen other devices - but how often did a person carry all those devices around? And now that they have all those functions on their wrist, how many people use all of them in a given day?

The problem is that the more functions a device has, the greater its complexity, the harder it is to use, and the less likely anyone is to use the added functions. Which is to say, the more it does the less useful it is - another paradox of technology. (EN: I sense that's a bit of a stretch, as the device is more useful; it's just that fewer people in aggregate need all of its capabilities.)

The Design Challenge

"Design" implies that an item is crafted in order to achieve a given objective. Thus far it has only been considered in terms of usability for a given purchase - but there is also the notion that a product can be designed to be attractive, or affordable, or efficient, or durable, or to deliver some other value. In some instances these qualities come into conflict with one another and may require a trade-off of one thing for another.

The hardest part of design is often to simply get people to agree on these goals - or to get a single person to be reasonable in setting them. A customer wants the product to be both high-quality and cheap, to have many functions but be easy to use, and so on. It is the designer who balances one desire against the other.

And even if you can find a way to satisfy the competing demands of the customer, there are then the demands of the organization that is producing the device. Aside from wanting it to be producible at a cost that enables them to make a profit, they may want it to be easy to service, easy to ship, and so on.

The happy ending to all this conflict is that it can be done - and every product that is commercially successful is proof that it has been done. The task for designers in a competitive market is simply to do it better. (EN: And I would add, better to a sufficient degree to win the customer's preference over competing alternatives.)