Reviews
“An intriguing study of data and how it has evolved, ‘The Structure of Digital Computing’ is well worth considering, highly recommended.”
The Midwest Book Review
“Fascinating insights into the past, present and future of computing.”
Kirkus Reviews
“If you have ever wondered whether there is any structure beneath all the noise in the marketplace about computing, this is the book to read.”
Stuart Bailey, Founder and CTO of Infoblox
“Reading this book will help you understand the basic concepts and trends that have shaped computing for the past half century and that will continue to do so for the foreseeable future.”
Joel J. Mambretti, Director of the International Center for Advanced Internet Research at Northwestern University
Separating Genuine Technical Advances from Market Clutter and Why It Matters
After reading The Structure of Digital Computing, you will know the answers to the following questions:
- How would you explain the last 50 years of computing if you had only 90 seconds? (Chapter 1)
- What is commoditization and why is it so important to understanding computing trends? (Chapter 2)
- How can you distinguish important technical advances in computing from market clutter? (Chapter 3)
- Do new computing technologies generally take 1 year, 2 years, 5 years, 10 years, or longer to develop? (Chapter 4)
- What is big data? (Chapter 5)
The Structure of Digital Computing takes a fifty-year perspective on the history of computing and divides this period into five overlapping eras: mainframe computers, personal computers, the web, clouds of devices, and big data. The book argues that most of what vendors try to pass off as revolutionary is simply market clutter. It also argues that genuine technical innovations are rare and hard to predict, but are usually recognized and appreciated quite quickly.
From the book’s perspective, we are transitioning from the third era of computing (the web) to the fourth era of computing, the era of computing devices. In the device era, most of us have replaced our desktop and laptop computers with digital devices, such as smartphones and, in the future, wearable computers. The Internet of Things (IoT) is another term used to describe this era.
These devices (large and small) are all producing data, and that data has led us into the fifth era of computing: the era of big data.
About the book
The book is about 220 pages long and is divided into five chapters:
- The Five Eras of Computing
- Commoditization: Beyond Moore’s Law
- Technical Innovation vs Market Clutter
- Technology Adoption Cycles
- The Era of Data
The first chapter is an overview of the first four eras of computing. Chapter 2 is about technology commoditization (the process by which computing infrastructure, such as computer chips and storage disks, becomes an inexpensive, interchangeable commodity over time) and its impact. Chapter 3 looks at some of the roots of technology innovation and market clutter. Chapter 4 explains why it usually takes at least a decade for us to adopt a new technology. The last chapter is a gentle introduction to the era of big data.
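The commoditization that Chapter 2 describes is driven by the kind of exponential improvement summarized by Moore’s Law. As a rough illustration (this is not code or data from the book), here is a minimal Python sketch that projects transistor counts, assuming a doubling roughly every two years and an illustrative 1971 starting point:

```python
# A minimal sketch, not from the book: project transistor counts under the
# common statement of Moore's Law, assuming a doubling every two years.
# The starting point (Intel 4004, roughly 2,300 transistors in 1971) is
# illustrative; real chips deviate from this idealized curve.

BASE_YEAR = 1971
BASE_TRANSISTORS = 2_300      # Intel 4004, approximately
DOUBLING_PERIOD_YEARS = 2.0   # assumed doubling period

def projected_transistors(year: int) -> float:
    """Idealized transistor count in `year` under the assumed doubling rate."""
    return BASE_TRANSISTORS * 2 ** ((year - BASE_YEAR) / DOUBLING_PERIOD_YEARS)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Fifty years of doubling every two years is a factor of 2^25, about 34 million; it is this steady, compounding improvement that turns expensive infrastructure into a commodity.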
Why It Sometimes Makes Sense to Look at an Old Book for New Ideas
Much of the book was written during the period 2001–2003, and then polished a bit and published in 2012. A good question to ask is why read a book that was published over ten years ago and largely written over twenty years ago. The main reason is that the book takes a fifty-year perspective on computing in order to see more clearly into the future of computing.
Here are extracts from the 2012 preface:
This book is about the structure of digital computing: what is significant, what is novel, what endures, and why it is all so confusing. The book tries to balance two points of view: digital computing as viewed from a business perspective, where the focus is on marketing and selling, and digital computing from a more technical perspective, where the focus is on developing new technology.
My goal was to write a short book about digital computing that takes a long term point of view and integrates to some extent these two perspectives.
The book is shaped by my personal experience in these two worlds: From 1996–2001, I was the Founder and CEO of a company called Magnify, Inc. that developed and marketed software for managing and analyzing big data. Prior to this, from 1988–1996, I was a faculty member at the University of Illinois at Chicago (UIC), where I did research on data intensive and distributed computing. From 1996–2010, I remained at UIC as a part-time faculty member.
Although there have been some changes since 2003 (for example, computers are faster, there are more web sites, and phones are smarter), as the book will hopefully make clear, at a more fundamental level we are still on the same fifty or so year trajectory today that we were on in 2003.
About the Author
Robert L. Grossman is a Partner at Analytic Strategy Partners LLC, which he founded in 2016. From 2002–2015, he was the Founder and Managing Partner at Open Data Group, which built and deployed predictive models over big data in financial services, insurance, healthcare, and IoT. He is also the Frederick H. Rawson Distinguished Service Professor of Medicine and Computer Science, and the Jim and Karen Frank Director of the Center for Translational Data Science at the University of Chicago.
Getting the Book
You can download the entire book or any of the individual chapters.
- The Five Eras of Computing
- Commoditization: Beyond Moore’s Law
- Technical Innovation vs Market Clutter
- Technology Adoption Cycles
- The Era of Data
- Notes and references
You can also order the book from Amazon. The book is available as a paperback (for $10.95) or for the Kindle (for $8.00).
Note that it’s less expensive to buy the book from Amazon than to print the downloaded version.