1. #1

    What exactly were these mysterious "bits" during the 90s console wars?

    The console wars of the 90s could just as easily be called the bit wars. Kids on the playground fought over which console was the best, and one recurring argument was these "bits" that console manufacturers kept putting in their marketing.

    Everyone said "my console is better because 64 bits are more than 32!", but nobody really knew what they were. They were just bits, and the higher the bits, the better the graphics.

    Well, it's time once and for all to solve this mystery.
    What were these "bits" spoken of in legends?

    And for fun: https://youtu.be/qDRaz9nmJgQ

  2. #2
    Atrea
    They're talking about the CPU architecture.

    The original NES and Sega consoles both had an 8-bit CPU. In layman's terms, think of it like a highway, 8 lanes wide. That's the data going in and out of the CPU. Each "clock cycle" (think of it as a second, in computer time) moves 8 bits of data.

    A 16-bit CPU would have double this, 16 lanes, in the same amount of time. This effectively doubles the amount of processing power (though the real effect is a little more complicated than a straight doubling). The Sega Genesis was the first 16-bit home console, and Sega were actually the ones who started the whole "16 > 8 lel" thing. They also used terms like "blast processing" (which is meaningless - the Genesis simply has a faster CPU than the NES) to position the Genesis ahead of the NES. It wasn't until a few years later, when Nintendo launched the SNES, that they too had a 16-bit console.

    You can see how this progresses: 8 < 16 < 32 < 64. It's interesting to note, however, that some of the original 64-bit systems - the Atari Jaguar, for instance - were not using a true 64-bit architecture, but instead two 32-bit processors.
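
    A rough C sketch of the idea (purely illustrative, not real console code): an 8-bit CPU has to split a 16-bit addition into two byte-sized steps and carry between them by hand, while a 16-bit CPU does the same sum in one native operation.

        #include <stdint.h>
        #include <stdio.h>

        int main(void) {
            uint16_t a = 300, b = 500;

            /* "16-bit CPU": one native 16-bit add */
            uint16_t wide_sum = a + b;

            /* "8-bit CPU": add the low and high bytes separately,
               carrying between them by hand */
            uint8_t a_lo = a & 0xFF, a_hi = a >> 8;
            uint8_t b_lo = b & 0xFF, b_hi = b >> 8;

            uint8_t sum_lo = a_lo + b_lo;              /* may wrap around at 256 */
            uint8_t carry  = (sum_lo < a_lo) ? 1 : 0;  /* detect the wrap */
            uint8_t sum_hi = a_hi + b_hi + carry;

            uint16_t narrow_sum = ((uint16_t)sum_hi << 8) | sum_lo;

            printf("%u vs %u\n", wide_sum, narrow_sum);  /* both print 800 */
            return 0;
        }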

  3. #3
    Quote Originally Posted by Cidzor View Post
    iirc, the whole "THIS MANY BITS" marketing point was more of a Nintendo thing. Aside from some of the more obscure/short-lived consoles (Atari Jaguar comes to mind), I don't recall many non-Nintendo companies being really big on that.

    The console wars were definitely good times though. I still remember having a Nintendo Power subscription in the 90's where they talked all kinds of shit about Sega. Like...they had a section where they'd show fan art sent in by readers, and it often had stuff along the lines of Mario and Luigi tag teaming to beat the shit out of Sonic the Hedgehog.
    The bit war was well and truly alive with other companies, especially Sega, who launched their 16-bit system with plenty of fanfare long before the SNES.
    Modern gaming apologist: I once tasted diarrhea so shit is fine.

    "People who alter or destroy works of art and our cultural heritage for profit or as an exercise of power, are barbarians" - George Lucas 1988

  4. #4
    Bits have nothing on blast processing.

  6. #6
    The "bits" refer to the word size of the CPU; the number was abused for marketing and is basically irrelevant these days. It boils down to how large an integer the CPU can handle every cycle. A bit is a boolean value; it is true or false, 1 or 0. The 8-bit consoles work with an integer range of 0-255 (2^8 values), the 16-bit consoles with an integer range of 0-65,535 (2^16 values), and so on. With 64-bit CPU architecture being mainstream, and an integer range of roughly 0-1.8446744e+19 (2^64 values), word size has become irrelevant.
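
    Just to make those ranges concrete, here's a quick C sketch (illustrative only) using fixed-width integer types:

        #include <stdint.h>
        #include <stdio.h>

        int main(void) {
            /* the largest unsigned value each word size can hold is 2^N - 1 */
            printf(" 8-bit max: %u\n",   (unsigned)UINT8_MAX);             /* 255 */
            printf("16-bit max: %u\n",   (unsigned)UINT16_MAX);            /* 65,535 */
            printf("32-bit max: %u\n",   (unsigned)UINT32_MAX);            /* 4,294,967,295 */
            printf("64-bit max: %llu\n", (unsigned long long)UINT64_MAX);  /* 18,446,744,073,709,551,615 */

            /* and what wrap-around looks like at 8 bits */
            uint8_t counter = 255;
            counter = counter + 1;      /* wraps back to 0 */
            printf("255 + 1 in 8 bits = %u\n", (unsigned)counter);
            return 0;
        }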

    The word size puts a cap on the number of things you can index/address without significant overhead. So the 8-bit NES word size essentially caps integer variables like resolution width, resolution height, colors, etc. at a range of 256 values. About 15 years ago, people would tell you that any more than 4GB of memory was a waste on a 32-bit CPU/mobo/OS. That's because a 32-bit address maxes out at 4GB, so additional memory couldn't be addressed.
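
    That 4GB figure is just 2^32 bytes; a tiny C check of the arithmetic:

        #include <stdint.h>
        #include <stdio.h>

        int main(void) {
            /* a 32-bit address can point at 2^32 distinct bytes */
            uint64_t addressable = (uint64_t)1 << 32;
            printf("%llu bytes = %llu GiB\n",
                   (unsigned long long)addressable,          /* 4,294,967,296 */
                   (unsigned long long)(addressable >> 30)); /* 4 */
            return 0;
        }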

    The bit marketing started with the 16-bit generation and got increasingly deceptive. As Atrea pointed out, the Atari Jaguar was marketed as a 64-bit system but was really two 32-bit CPUs. The N64 has a 64-bit CPU, but a 32-bit system bus. The Dreamcast was marketed as a 128-bit system, but really has a 32-bit CPU that can perform 128-bit SIMD instructions.

  7. #7
    Quote Originally Posted by Cidzor View Post
    iirc, the whole "THIS MANY BITS" marketing point was more of a Nintendo thing. Aside from some of the more obscure/short-lived consoles (Atari Jaguar comes to mind), I don't recall many non-Nintendo companies being really big on that.

    The console wars were definitely good times though. I still remember having a Nintendo Power subscription in the 90's where they talked all kinds of shit about Sega. Like...they had a section where they'd show fan art sent in by readers, and it often had stuff along the lines of Mario and Luigi tag teaming to beat the shit out of Sonic the Hedgehog.
    Sega emblazoned "16-bit" on the front of the Mega Drive and later brought out the 32X. The Amiga and Atari ST were very competitive over releasing the first 16-bit home computer, then Amiga were very proud of releasing 32-bit machines, later bringing out the CD32. And I'm pretty sure the PlayStation and Saturn both had their bits touted, which is what led to the N64's name.

    - - - Updated - - -

    Quote Originally Posted by Woogs View Post
    The "bits" refer to the word size of the CPU; the number was abused for marketing and is basically irrelevant these days. It boils down to how large an integer the CPU can handle every cycle. A bit is a boolean value; it is true or false, 1 or 0. The 8-bit consoles work with an integer range of 0-255 (2^8 values), the 16-bit consoles with an integer range of 0-65,535 (2^16 values), and so on. With 64-bit CPU architecture being mainstream, and an integer range of roughly 0-1.8446744e+19 (2^64 values), word size has become irrelevant.

    The word size puts a cap on the number of things you can index/address without significant overhead. So the 8-bit NES word size essentially caps integer variables like resolution width, resolution height, colors, etc. at a range of 256 values. About 15 years ago, people would tell you that any more than 4GB of memory was a waste on a 32-bit CPU/mobo/OS. That's because a 32-bit address maxes out at 4GB, so additional memory couldn't be addressed.

    The bit marketing started with the 16-bit generation and got increasingly deceptive. As Atrea pointed out, the Atari Jaguar was marketed as a 64-bit system but was really two 32-bit CPUs. The N64 has a 64-bit CPU, but a 32-bit system bus. The Dreamcast was marketed as a 128-bit system, but really has a 32-bit CPU that can perform 128-bit SIMD instructions.
    I'll add to this that once we reached the 32-bit era it became a bit pointless as a measure of "power". Early 32-bit machines like the Amigas were much weaker than later consoles like the PlayStation with its dedicated 3D hardware.

    There was also a brief bit war to do with colour, though that may have just been for us on the nerdier side of things. For example, the early Amigas had 12-bit colour (a palette of 4,096) with 32 colours on screen. AGA Amigas had 24-bit colour (a palette of 16.8 million, considered all the colours humans can distinguish) with 256 on screen. This way of measuring capped out as soon as PCs came along with 24-bit "true colour" cards that could display the entire range of colours on screen. For a brief time some cards were advertised as 64-bit, though that was muddling in the processor size; they weren't displaying quintillions of colours. After that it became all about polygon counts and dedicated graphics memory.
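
    Those palette sizes are just powers of two, easy to check with a throwaway bit of C (illustrative only):

        #include <stdio.h>

        int main(void) {
            /* N bits of colour give 2^N possible colours */
            printf("12-bit colour: %d colours\n", 1 << 12);   /* 4,096 */
            printf("24-bit colour: %d colours\n", 1 << 24);   /* 16,777,216 */
            return 0;
        }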

    - - - Updated - - -

    Understanding bits is still important these days - you'll notice that your internet speed is sometimes written as Mbps and other times as MBps. The first refers to megabits per second, the latter to megabytes per second. There are 8 bits in a byte, so the rate in MBps will be one eighth of the rate in Mbps. Programs on your computer usually report the speed in MBps, as that is most relevant to how long you will have to wait. Internet companies usually advertise their rate in Mbps, presumably because it lets them advertise a bigger number. That's why the 32Mbps connection you pay for will not download faster than 4MBps on Steam.
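
    In code it's just a divide by eight; a small C sketch using the 32Mbps example above:

        #include <stdio.h>

        int main(void) {
            double line_speed_mbps = 32.0;                   /* what the ISP advertises (megabits/s) */
            double download_mBps   = line_speed_mbps / 8.0;  /* 8 bits per byte -> megabytes/s       */
            printf("%.0f Mbps is about %.0f MBps\n", line_speed_mbps, download_mBps);  /* 32 -> 4 */
            return 0;
        }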
