In information technology, the prevailing mindset is one of logic, science, and mathematics, which is natural given that computing devices are the product of science and engineering. However, when systems are delivered to human users, they become merely tools in the hands of people who are not always logical, scientific, or mathematically precise.

The developer approaches a system in a rigid manner, with a rigid understanding of data and the way it will be stored, processed, and acted upon. To the developer, the user is a component in the system who will act in a predictable manner. The designer does not expect the user to deviate from this perceived role - to improvise, cut corners, hack, and tinker - or, if he does anticipate such deviation, he will perceive only a limited number of ways in which it might happen, and will seek to discourage such behavior and return the user to a state of predictable order.

This is not to say that tinkering is necessarily bad. Human users tinker in order to overcome the shortcomings of the system. In some instances (the Mir space station is given as an example), such tinkering is necessary to overcome design flaws or shortcomings, and results in improved performance - or better, in the achievement of a goal that could not be reached if the technology were used as its developers intended.

The author's experience has convinced him of the need to accept the unfinished, the untidy, the irregular, and the ill-designed as inherent qualities of any system, and tinkering and hacking as fundamental practices.

The discovery of unintentional capabilities, while anathema to engineering models and methods, has been a significant finding in the study of user behavior. The engineer's reaction has been to discourage tinkering and attempt to prevent it, compelling the user to follow a prescribed course of action even when it is frustrating and inefficient. The author's consulting approach - suggesting that unprescribed user behavior be permitted and even encouraged - has been met with mixed (and polarized) reactions, depending on the role of the audience and the culture of the organization.

The author posits that the only "authentic" way to deal with such situations is to focus on the desired goal rather than the precise course of action that led to it - and when viewed from that perspective, "aberration" becomes "innovation," enabling us to find useful and effective ways to evolve the technology toward better suiting and accomplishing its intended purpose.

Getting closer to the "real life" of systems - the way users actually interact with them - requires a more intuitive perspective, the same perspective taken by users who improvise and hack as they discover unintended system capabilities. They do not dismiss the system's model entirely, but find ways to work within it to achieve their intended outcome.

The "themes" discussed in the book need not be confined to information technology: they apply to any situation where there is a structured system (a business organization, for example) in which behaviors are prescribed and expected but are not necessarily effective or efficient, leading users to find ways of working the system, or working around it, to accomplish their goals.

The author draws a parallel to the Renaissance, in which people in many fields of endeavor (the arts, the sciences, and others) broke from prescribed patterns of behavior - the "traditional" ways - to achieve great goals and transform their fields, and even their culture. Information technology is, in many ways, mired in a similar "dark age" of conformity and dehumanization, and is in need of a similar rebirth.

The author's method in this book is to present ideas and dilemmas. He concedes it will not be systematic, which seems fitting, as the failure of the systematic approach is itself the problem this book seeks to address.