Balancing materials, power, and cost in modern computer design
In Balance
"maddog" looks at the idea of balance in modern computer design – the trade-offs that must be made by designers to meet changing requirements.
A few years ago, I wrote an article about how much paper tape (used on an old ASR-33 Teletype) it would take to hold 2 terabytes (TB) of data. Without going into the details of that article, I figured out that it would take more than 6,330 years to read in or write out 2TB of paper tape, assuming the Teletype did not break in that time.
That was one illustration of the issue of "balance" – the various trade-offs computer designers have made through the years to accommodate the technology on hand. In earlier eras, storage devices held so little data that large main memories did not make much sense.
To start with the easiest example of balance, try to imagine a modern-day cell phone built out of the transistors available in 1968. Some of those transistors cost $1.50, while a gallon of gasoline (about four liters) only cost 35 cents. The size of the transistor meant that your cell phone would probably cover the state of Texas, and the power requirements would both need the output of the world's largest hydroelectric plant and create a real danger of climate change on the spot.
I have not done any actual calculations to back up the statements in that last paragraph, but I don't really need to. The point comes through that a cellphone of today would have been impractical to build back then, even if we had known how to build it. So, the first thing that helps create the balance of the modern-day computer is the density of materials – the number of components we can put in a small space through micro-miniaturization. This density also helps lower power consumption, which in turn lowers the heat generated. All of these things help create an affordable product once a solution is found.
Density helps with another factor: speed. As Rear Admiral Grace Murray Hopper used to demonstrate with her 12-inch-long telephone wire representing a nanosecond (and later her pepper grain-sized picoseconds), light and electricity take time to travel through circuits, and a charge takes time to build up and flip a 0 to a 1. If you do not give the electricity enough time, then the signal that was supposed to be a 1 ends up looking like a 0 to the circuit that is waiting for it. Of course, as you shrink the circuits, electricity does not have to travel as far, and it does not take as many electrons to charge the smaller circuit, so cycle times can decrease.
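Hopper's demonstration is easy to reproduce as arithmetic. The sketch below uses the exact speed of light in a vacuum; signals in real copper wire travel somewhat slower, so these distances are upper bounds.

```python
# Grace Hopper's demonstration as arithmetic: how far can a signal
# travel in one nanosecond or one picosecond? Uses the speed of light
# in a vacuum; real signals in copper are somewhat slower, so these
# figures are upper bounds.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # meters per second, exact by definition

def max_distance_cm(seconds):
    """Upper bound on how far a signal travels in the given time, in cm."""
    return SPEED_OF_LIGHT_M_PER_S * seconds * 100  # meters -> centimeters

nanosecond = 1e-9
picosecond = 1e-12

print(f"1 ns: {max_distance_cm(nanosecond):.1f} cm")  # about 30 cm - roughly Hopper's wire
print(f"1 ps: {max_distance_cm(picosecond):.4f} cm")  # about 0.03 cm - a grain of pepper
```

At 30 cm per nanosecond, a processor cycling in a fraction of a nanosecond simply cannot afford long wires, which is exactly why shrinking the circuits lets cycle times drop.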
However, as the circuits keep getting smaller, it becomes more difficult for manufacturers to make them in large quantities. The good news is that new manufacturing techniques and new materials continue to allow components to become smaller and closer together.
So far, I have been discussing issues that most people would assign to CPUs, but which also affect other devices. However, balance also affects the ratio of CPU speeds to memory sizes, cache sizes, and pipelining in processors. Few of the early computers had floating-point instructions, caches, or pipelining. The cost of the components was so high that these mechanisms for speeding up calculations were just not affordable for most computers. Many early Intel 386 processors had no floating-point hardware; you had to buy another Intel chip (the 80387 math coprocessor) to put on the motherboard to get that feature. Thus, a lot of people avoided doing floating-point calculations unless they had to, substituting integer instructions whenever possible.
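The classic way to substitute integer instructions for floating point is fixed-point arithmetic: scale the values so that fractions become whole numbers. Here is a minimal sketch; the sales-tax scenario and the 8% rate are invented example values, not anything from the original article.

```python
# A sketch of the classic floating-point workaround: do the arithmetic
# in scaled integers ("fixed-point"). Prices are kept in cents, so a
# sales-tax calculation needs only integer multiply, add, and divide.
# The 8% tax rate is an invented example value.

def add_tax_cents(price_cents, tax_rate_per_mille):
    """Add tax to a price using only integer instructions.

    tax_rate_per_mille: tax rate in tenths of a percent (80 = 8.0%).
    Adding half the divisor before the integer division rounds the
    tax to the nearest cent instead of truncating it.
    """
    tax = (price_cents * tax_rate_per_mille + 500) // 1000
    return price_cents + tax

# $19.99 plus 8% tax, with no floating-point instruction in sight:
print(add_tax_cents(1999, 80))  # 2159 cents, i.e. $21.59
```

On a machine with no floating-point unit, every operation above compiles to the integer instructions the processor already has, which is exactly the substitution programmers of that era made.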
Likewise, when your memory size is 4,000 bytes, having a processor run at several gigahertz is not practical; you just will not be able to deliver the data fast enough to keep the processor fed. And, if your RAM is small and your processor slow, the average person will not really enjoy the wait between actions – for example, between taking a picture and being ready to take the next picture.
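The mismatch is easy to see on the back of an envelope. The figures below are illustrative assumptions for the sake of the arithmetic, not measurements of any real machine.

```python
# Back-of-envelope arithmetic for the imbalance described above: how
# quickly would a fast processor exhaust a tiny memory? All figures
# are illustrative assumptions, not measurements of any real machine.

memory_bytes = 4_000          # the tiny main memory from the text
clock_hz = 3_000_000_000      # an assumed 3 GHz processor
bytes_per_cycle = 8           # assume one 64-bit word consumed per cycle

cycles_to_read_everything = memory_bytes / bytes_per_cycle
seconds = cycles_to_read_everything / clock_hz

print(f"{cycles_to_read_everything:.0f} cycles")  # 500 cycles
print(f"{seconds * 1e9:.0f} ns to sweep all of memory")
```

Under these assumptions, the processor could sweep its entire memory in well under a microsecond; pairing hardware that fast with storage that small buys you nothing, which is the balance problem in a nutshell.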
This brings me to the concept of "real time." Slow processors, small memories, and other factors meant that tasks that seem real time today were not so a few years ago; they were completely impractical – in cost, in technology, or both – from the standpoint of finishing in the time needed, which is a loose interpretation of real time.
Balance in a system is both technological and financial: Can the customer afford the solution? Often, our solutions are limited by the base technologies, the balance of putting them all together, and the total cost. All three have to be considered.