Understanding The Nature Of IT And Its Limitations
The following passage is draft material from my new book. Yes, it is MIS 101 type material, and therefore critical to get precisely correct. Comments, especially from computer scientists, appreciated. It is important for any professional engaged in the use of computers to understand their nature and limitations:
- All commercial computers, regardless of operating system, architecture, vendor, or programming language, operate in fundamentally the same manner, with the same potentials and limitations. They store and retrieve binary data (ones and zeros) and make decisions to manipulate that data based on simple logical tests: this bit is "0," this bit AND this bit are both "1," this bit OR this bit is "1," and so forth. All digital computing has been built on this foundation since the 1930s, and we will not see any change to this until laboratory technologies such as quantum computing or analog computing mature.
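The bit-level tests described above can be sketched in a few lines of Python. This is purely illustrative (the function names are invented for this sketch); it models a bit as the integer 0 or 1 and expresses each of the simple tests, then shows how a slightly larger operation is built from them:

```python
def bit_is_zero(a: int) -> bool:
    """Test: this bit is 0."""
    return a == 0

def both_are_one(a: int, b: int) -> bool:
    """Test: this bit AND this bit are both 1."""
    return (a & b) == 1

def either_is_one(a: int, b: int) -> bool:
    """Test: this bit OR this bit is 1."""
    return (a | b) == 1

# All higher-level arithmetic is built up from such primitives.
# For example, adding two single bits (a "half adder") combines
# an exclusive-or (the sum bit) with an AND (the carry bit):
def half_adder(a: int, b: int):
    """Return (sum_bit, carry_bit) for two input bits."""
    return (a ^ b, a & b)

print(half_adder(1, 1))  # (0, 1): one plus one is binary "10"
```

From half adders, full adders are built; from full adders, multi-bit arithmetic; and so on up to everything a computer does.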
- Computers require specified procedural solutions to problems. They follow instructions but cannot, in general, create new instructions for themselves. (Such capabilities remain the subject of research, not commercial computing.) Regardless of the form of input or output (text, graphics, audio, video), all computing boils down to this requirement.
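To make "specified procedural solution" concrete: even a trivial task such as finding the largest number in a list must be spelled out as explicit, unambiguous steps. A minimal Python sketch (hypothetical, for illustration only):

```python
def largest(numbers):
    """Find the largest number by following explicit instructions:
    1. Remember the first number as the best seen so far.
    2. Look at each remaining number in turn.
    3. If it is bigger than the best so far, remember it instead.
    4. When no numbers remain, the remembered value is the answer.
    """
    best = numbers[0]
    for n in numbers[1:]:
        if n > best:
            best = n
    return best

print(largest([3, 41, 7, 12]))  # 41
```

The computer contributes no judgment of its own; if the steps are wrong or incomplete, the answer will be too.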
- Many problems are straightforward to solve with computers. Others cannot be solved with them at all, because no clear procedural solution has yet been formulated (e.g., reliably predicting the weather or the stock market).
- Even when clear procedural solutions exist, they may require more computing power and time than we can imagine ever existing, i.e., more than the largest and fastest hypothetical computer running for the life of the universe could supply. There are known problems of significant business value that fall into this category, e.g., in logistics, supply chain, and many other areas. Much innovation in computing comes in this area, as "good enough" solutions employing creative tradeoffs are developed for these kinds of problems, and advancing technology pushes back the boundary of what is practically "computable." Many computer scientists spend their careers on such problems.
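A classic example of such a "good enough" tradeoff is route planning (the traveling-salesman problem): checking every possible route through n stops takes factorial time, which quickly exceeds any conceivable computer, yet a simple greedy heuristic produces a usable route almost instantly. A minimal sketch in Python (the stop coordinates are invented; this is illustrative, not a production routing algorithm):

```python
import math

def nearest_neighbor_route(points):
    """Greedy 'good enough' route: always visit the closest unvisited stop.
    Not guaranteed optimal, but runs in O(n^2) time instead of the
    factorial time an exhaustive search of all routes would require."""
    unvisited = list(points[1:])
    route = [points[0]]          # start at the first stop
    while unvisited:
        last = route[-1]
        # Pick the unvisited stop closest to where we are now.
        nxt = min(unvisited, key=lambda p: math.dist(last, p))
        unvisited.remove(nxt)
        route.append(nxt)
    return route

stops = [(0, 0), (5, 5), (1, 0), (6, 5), (0, 1)]
print(nearest_neighbor_route(stops))
# [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5)]
```

For 20 stops, exhaustive search means examining roughly 10^17 routes; the greedy pass makes a few hundred distance comparisons. The tradeoff, accepting a route that may be somewhat longer than the true optimum, is exactly the kind of engineering judgment described above.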
- Developing software is endlessly surprising and uncertain. Computer programs, in general, can never be completely tested or proven correct; this has been mathematically proven. (Some restricted classes of programs can be proven correct, simple things like the software controlling your microwave oven.) Using software testing resources and time optimally therefore requires professional skill and experience in weighing tradeoffs and risks.
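One way to see why exhaustive testing is impossible in practice: simply count the inputs. A function taking just two 32-bit integers has 2^64 distinct input pairs. A quick back-of-the-envelope calculation (the one-billion-cases-per-second test rig is an assumed figure for illustration):

```python
# A function taking two 32-bit integers has 2**64 distinct input pairs.
input_pairs = 2 ** 64

# Assume an implausibly fast harness testing one billion cases per second.
cases_per_second = 1_000_000_000

seconds_needed = input_pairs / cases_per_second
years_needed = seconds_needed / (60 * 60 * 24 * 365)

print(f"{years_needed:,.0f} years to test exhaustively")  # roughly 585 years
```

And that is for one tiny two-argument function; real programs have vastly larger input spaces, which is why testing is about managing risk, not achieving certainty.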
- The human resource issue in IT has certain peculiarities, most notably the radical differentials in software programmer productivity. Empirical research has consistently shown that the most productive developers are up to 30 times more productive than the least. Few, if any, other professional fields exhibit differentials of this magnitude.
About the Author:
Charles Betz is a Senior Enterprise Architect and chief architect for IT Service Management strategy for a US-based Fortune 50 enterprise. He is the author of the forthcoming Architecture and Patterns for IT Service Management, Resource Planning, and Governance: Making Shoes for the Cobbler's Children (Morgan Kaufmann/Elsevier, 2006, ISBN 0123705932). He is the sole author of the popular www.erp4it.com weblog.