Monday, July 07, 2008

32. How It Works: Your Guide to Notebook Technology

By: Pulp (Dustin Sklavos) from NotebookReview.com

I suspect the way most people look at their computers is the way I look at my car: confused and silently hoping that each day it will just work without any problems. The technology that goes into computers has progressed at a breakneck pace that has only recently shown signs of slowing down; that said, the "slowdown" is roughly equivalent to reducing your speed from about 125 miles per hour to about ... we'll say 105.

Mercifully, if anything stays the same, it's the way computers are built. If you pop open the case on a desktop computer, there's an amalgam of wires, circuit boards, fans, and all kinds of crazy stuff that will likely just confuse people. But the logic and design that goes into this labyrinth of technology has actually remained relatively unchanged over the past decade. Your laptop, on the other hand, won't afford you the luxury of having all these interchangeable parts. In many ways a laptop can actually be MORE complex than a desktop.

Case in point: if the video card in my desktop goes on the fritz and decides it doesn't want to draw pretty pictures anymore, I can just pull it out and replace it with a new one. Not only that, but I have a wide variety (probably too wide) of cards I can put in a desktop, from $20 all the way up to $2,000 for workstation-class hardware. How did I know the video card itself went bad? I plugged my monitor into another computer and it worked just fine.

But what if your laptop stops giving a picture? It could be the cable that connects to the screen inside the shell. It could be the logic in the screen itself. It could be the graphics processor, the chip that's normally mounted on a removable video card in a desktop but soldered into place inside the laptop.

This long introduction boils down to a single point: laptops are complex. Even for the seasoned technology enthusiast they can be tricky; for the neophyte and the average user, they can be downright mind-boggling. And that's where this guide comes in.

How It Works: Prologue

Over the next several weeks, I'm going to break down a single part of your laptop each week and explain what it does. This isn't going to be about recommendations; it's going to be about knowledge and understanding, dispensing information in such a way that eventually you won't even need recommendations. You won't need someone to tell you "get this, this is good." You'll know what you need and why.

This week I'll be covering the basic information you'll need to understand the stuff I'm talking about. Don't worry, it's not a huge deal, and there isn't going to be a test later.

There's one very important term and two units of measure that are vital to explaining any of these things to you.

Units of Measure: Hertz (Hz)

By dictionary definition, the hertz is a unit of frequency: one hertz is one cycle per second.

You will probably never hear anyone refer to a computer as operating at hertz; the lowest you'll ever hear is megahertz (MHz, 1 MHz is equal to 1,000,000 Hz), and over the past few years you've also heard gigahertz (GHz, 1 GHz is equal to 1,000 MHz). I'll basically be starting at megahertz.

While megahertz may superficially be used to indicate "speed," its best analogy is revolutions per minute in a car. Basically, depending on the gear your car is in, 3,000 rpm may result in more or less work getting done. There are a lot of factors that go into just how much work actually gets done: who made the engine, what type of engine it is, and so on. Processors are basically the same way. An AMD processor and an Intel processor both running at 900 MHz will do different amounts of work even though their frequencies are the same.

You will also hear megahertz used interchangeably with gigahertz (GHz); again, 1 GHz is equivalent to 1,000 MHz, so if your processor runs at 2.2 GHz, it's running at 2,200 MHz.

The terms "clock speed," "clocks," and "frequency" will all be used interchangeably in regular jargon to measure the same thing: the megahertz or gigahertz a processor runs at.

Units of Measure: Bytes (B)

Without getting into the esoterica of bits, etc., a byte is basically a means of measuring data. The same prefixes used to measure hertz apply to bytes, but it gets a little tricky. I'll try to make it easy for you.

We count in what's called "base 10," and just about everything we interact with on a daily basis is counted in this fashion. We count in units, tens, hundreds, thousands, and so on. We're used to it. It's just how we measure things and it works out fine.

Your computer, on the other hand, counts in "base 2," or binary. Computers basically handle data as a sequence of "on" and "off" states, and just about everything "digital" these days can be broken down into 1s and 0s. What this results in is some math that's going to feel slightly goofy. Basically, here's how your units of measure work out:

1 kilobyte (kB) = 1,024 bytes (B)
1 megabyte (MB) = 1,024 kilobytes (kB)
1 gigabyte (GB) = 1,024 megabytes (MB)
1 terabyte (TB) = 1,024 gigabytes (GB)

You see how the scaling can get kind of confusing. If you visit Wikipedia, you'll see "kibibyte," "mebibyte," "gibibyte," and "tebibyte." Look, those are probably closer to "correct," but they're not everyday jargon and they're not what everyone's comfortable with. I had an English teacher tell me "alright" isn't an actual word. You know what? Everyone uses it. "Alright" makes more sense than "all right" in modern language at this point, so if anyone tries to correct you on measuring bytes, punch them in the face because they have nothing better to do than quibble with you over minutiae.

Where were we? Ah yes. Since we're used to counting in base 10, we expect a megabyte to be a thousand kilobytes, not 1,024. Without getting into specifics of what we're counting, I'll just say that generally it's okay to fudge this particular detail. So if you buy something that advertises, say, 320 GB and you only see 299 ... it's in the ballpark. You got what you paid for.
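
To make that concrete, here's a rough sketch of the math in Python, just for illustration. The wrinkle is that drive manufacturers generally advertise capacity in decimal gigabytes (1,000,000,000 bytes), while your operating system reports it using the binary units from the table above, which is where the "missing" space goes:

    # Binary (base 2) prefixes, matching the table above.
    KB = 1024
    MB = 1024 * KB
    GB = 1024 * MB
    TB = 1024 * GB

    # A drive sold as "320 GB" is counted in decimal gigabytes by the manufacturer.
    advertised_bytes = 320 * 1000 * 1000 * 1000

    # The operating system divides by binary gigabytes when it reports capacity.
    reported_gb = advertised_bytes / GB
    print(round(reported_gb))   # about 298 -- right in the ballpark of that "299"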

That's a lot of confusing terms just to get to a simple point, ain't it? Like I said, it's generally okay to fudge it. When I get to a part of your notebook where it isn't okay, I'll let you know. Trust me, it seems like a lot, but it can become second nature in a hurry.

Very Important Term: Bandwidth

Bandwidth is basically used to describe the speed at which data can travel, and is generally notated as kilobytes per second (kB/s), megabytes per second (MB/s), and gigabytes per second (GB/s). You see how it works.

This is very important to know, because the connections between components inside your computer are largely designed around the concept of managing bandwidth.

Bandwidth has a nice bonus over the units of measure I brought up above because it scales pretty linearly. 23 GB/s is always going to be faster than 21 GB/s, unlike MHz, where 900 MHz can be slower than 600 MHz if the 600 MHz processor is a better or more efficient design. Likewise, there's no confusing jargon like there is with 1,000 MB being used to fudge 1,024 MB. Bandwidth is what it is: how fast data can move from point A to point B, and we measure it in MB/s and GB/s because it sounds cooler that way (actually, we just measure it in whichever unit is most practical and appropriate to the scale, but I like my explanation better).
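
Because bandwidth is just data over time, it also lends itself to quick back-of-the-envelope math. Here's a small sketch; the file size and the 60 MB/s rate are made-up numbers purely for illustration, not specs for any real connection:

    # How long would it take to move a pile of data at a given bandwidth?
    MB = 1024 * 1024          # one megabyte, in bytes
    GB = 1024 * MB            # one gigabyte, in bytes

    file_size = 4.7 * GB      # an example 4.7 GB chunk of data
    bandwidth = 60 * MB       # a hypothetical link moving 60 MB/s

    seconds = file_size / bandwidth
    print(round(seconds))     # about 80 seconds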

Conclusion

This felt like a textbook, didn't it? Makes your brain hurt, doesn't it?

It's cool, I'll do you a solid like a good textbook does and condense the nonsense:

  • Megahertz (MHz) are more or less analogous to revolutions per minute (rpm) in an automobile engine and are not a direct notation of "speed."
  • MHz are used to measure the frequencies processors run at.
  • Bytes (B) are used to measure amounts of data.
  • We count in base 10; computers count in base 2. It's okay for you to fudge the difference in bytes as long as you're conscious of it.
  • Bandwidth is measured linearly in bytes per second (with the usual prefixes), and will be the basis for a lot of parts of this series.
