Looking at the Atari 400 Part 1 – By Paul Lefebvre (goto10retro.com)



The Atari 400 and 800 are interesting to me because they had a different video architecture from most other computers of their generation. Most systems of the day had a simple frame buffer and/or "tile" system where the tiles were defined by a character ROM, so you could draw an A on the screen by writing a 65 to the right memory location. Some systems (like the Apple ][) could work in a split-screen mode where part of the screen was bitmapped and another part was text, but this was the exception rather than the rule.
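
To make the tile idea concrete, here is a minimal C sketch; the base address and character code are assumptions for illustration, since each machine had its own screen address and character set:

    #include <stdint.h>

    /* Hypothetical memory-mapped 40-column text screen; the base
       address is made up for illustration. */
    #define SCREEN_BASE ((volatile uint8_t *)0x0400)
    #define SCREEN_COLS 40

    /* Write a character code at a row/column; the video hardware
       looks the code up in character ROM as it scans the screen. */
    static void put_char(int row, int col, uint8_t code) {
        SCREEN_BASE[row * SCREEN_COLS + col] = code;
    }

    /* put_char(0, 0, 65) draws an "A" in the top-left corner. */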

The Atari machines, on the other hand, had two chips that worked together to prepare the display data, which meant you could choose a different video mode for each line. This gave developers a lot of flexibility and reduced the display RAM requirements for a machine that launched with just 8 KB of RAM. (For instance, if a horizontal line was all one color, you did not need that frame buffer line; horizontal or vertical scrolling could be configured by setting the offset of where scanning starts.)
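
Here is a sketch of what an ANTIC display list looks like, written out as a C byte array; the opcodes are the documented ANTIC ones, the addresses are placeholders:

    #include <stdint.h>

    /* Each byte (plus operands) describes one row of the display:
       0x70 = 8 blank scan lines, 0x02 = 40-column text mode line,
       0x0F = hi-res bitmap line, |0x40 = "load memory scan" (the
       next two bytes set the screen address), 0x41 = jump back and
       wait for vertical blank. */
    uint8_t display_list[] = {
        0x70, 0x70, 0x70,       /* 24 blank lines to center the image */
        0x42, 0x00, 0x40,       /* text line + LMS: screen at 0x4000  */
        0x02, 0x02, 0x02,       /* a few more text lines...           */
        0x0F, 0x0F, 0x0F, 0x0F, /* ...then hi-res bitmap lines, mixed */
                                /* freely on the same screen          */
        0x41, 0x00, 0x60,       /* JVB: restart list (here at 0x6000) */
    };

Since every row carries its own mode, a screen only needs frame buffer memory for the rows that actually use it.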

See https://en.wikipedia.org/wiki/Atari_8-bit_family#Design


I remember the term "player missile graphics" from back in the day. Google turns up several articles on the subject.

https://www.atariarchives.org/creativeatari/PlayerMissile_Gr...
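
As a rough register-level sketch (C syntax for readability; the register addresses are the documented ANTIC/GTIA ones, but the values are simplified and OS shadow registers are ignored), enabling one player looks something like this:

    #include <stdint.h>

    #define POKE(addr, val) (*(volatile uint8_t *)(addr) = (val))

    #define DMACTL 0xD400  /* ANTIC DMA control                    */
    #define PMBASE 0xD407  /* page where player/missile data lives */
    #define HPOSP0 0xD000  /* player 0 horizontal position         */
    #define COLPM0 0xD012  /* player 0 color                       */
    #define GRACTL 0xD01D  /* GTIA graphics control                */

    void enable_player0(void) {
        POKE(PMBASE, 0x40);  /* P/M data area at 0x4000 (2K aligned) */
        POKE(DMACTL, 0x3A);  /* playfield + single-line player DMA   */
        POKE(GRACTL, 0x03);  /* display players and missiles         */
        POKE(COLPM0, 0x58);  /* pick a color                         */
        POKE(HPOSP0, 0x80);  /* park the player mid-screen           */
        /* The player's shape is just bytes written into the P/M
           area; each byte is one 8-pixel-wide row. */
    }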


Also Display List Interrupts: they made more colors per screen possible, among other things.

https://www.atariarchives.org/creativeatari/Display_List_Int...

https://playermissile.com/dli_tutorial/
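
The mechanism is small: set bit 7 (0x80) on a display list instruction and ANTIC fires an interrupt at that screen line, letting you swap a color register mid-frame. A conceptual sketch in C, with the documented register addresses (real handlers were a few lines of 6502 assembly):

    #include <stdint.h>

    #define POKE(addr, val) (*(volatile uint8_t *)(addr) = (val))

    #define WSYNC 0xD40A  /* a write stalls until end of scan line */
    #define COLBK 0xD01A  /* hardware background color register    */

    /* Runs when ANTIC reaches the display list entry whose DLI
       bit (0x80) is set. */
    void dli_handler(void) {
        POKE(WSYNC, 0);    /* sync to horizontal blank so the
                              change doesn't tear mid-line */
        POKE(COLBK, 0x74); /* new background color from here down */
    }

    /* Setup (not shown): point the VDSLST vector (0x0200) at the
       handler and enable DLIs through NMIEN (0xD40E). */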

It's great that today you can get so much information online. Back then it was much harder so I could never quite wrap my head around that stuff as a teenager.


That was pretty interesting from an engineering perspective, and it highlighted the different compromises computer makers made. The Atari could display certain types of graphics, like gradients, trivially, while it could struggle with fine bitmaps, unlike systems that mapped a bitmap buffer straight to the screen.

These compromises gave each '80s system a unique artistic style... unlike current consoles, which are just triangle- and shader-pumping machines that differ largely in throughput.


Most machines of that era had a pretty standard architecture, except for a few strange ones like the Atari and the TI-99/4A (a minuscule amount of ordinary RAM but a fair amount of "video RAM" that the CPU can only access through ports; no bitmaps, only tiles) and the early Sinclairs (which used Lancaster's "cheap video" https://www.tinaja.com/ebooks/cvcb1.pdf to delete the display controller entirely).
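
For a feel of the TI-99/4A arrangement, here is a sketch of writing to video RAM through the VDP ports; the addresses are the TI's memory-mapped VDP ports as I recall them, so treat the details as approximate:

    #include <stdint.h>

    #define VDPWA (*(volatile uint8_t *)0x8C02) /* write-address port */
    #define VDPWD (*(volatile uint8_t *)0x8C00) /* write-data port    */

    /* The CPU can't address VRAM directly: it latches a VRAM address
       into the VDP, then streams bytes through the data port, which
       auto-increments the address. */
    void vram_write(uint16_t addr, const uint8_t *src, uint16_t len) {
        VDPWA = addr & 0xFF;        /* address low byte first        */
        VDPWA = (addr >> 8) | 0x40; /* high byte; 0x40 marks a write */
        while (len--)
            VDPWD = *src++;
    }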

One thing that did matter was color choice. My TRS-80 Color Computer had the terrible palette you'd expect from a physicist or electrical engineer, while the C-64 had a small number of hand-picked colors that looked nice. (Somebody ought to tell the people who maintain contemporary terminals, because I usually have to do ":syntax off" to see what I am editing with vi, and it was annoying as hell to set up a Linux computer attached to a TV this weekend because I was sitting at an odd angle that made most colors illegible except for white.)

Another interesting fact to me is that 1980s-era display controllers were conceptually simple but had a huge part count in terms of gates because of the multiple wide data paths. Today it is fun to build them with FPGAs but difficult to build them out of discrete parts. I saw an ad in Byte circa 1979 for a graphics card for S-100 computers that had much better specs than the average home computer, but it was a fairly large board packed with chips on both sides.

Thus most of the home computers had an ASIC display controller and projects like

https://www.commanderx16.com/

struggle and wind up using an FPGA or microcontroller for the display controller, as opposed to something authentic. Sometimes, though, I think it might be fun to try building a display controller on a few big breadboards.


> Sometimes though I think it might be fun to try building a display controller on a few big breadboards.

I haven't found a full schematic or analysis anywhere, but I've been tempted a few times to try replicating some incarnation of Alpha Denshi graphics hardware. It's the lineage that led to the Neo Geo hardware, and for any given generation has some impressive capabilities despite being built (AFAIK) entirely out of memories and standard logic chips.

I guess it would be more practical to start with some really minimal hardware like Minivader or Dottori-kun (which, legend has it, were created entirely because of an odd regulation requiring arcade cabinets to be sold as "complete" units, with the board expected to be thrown out by the buyer), but that's not as fun (and IIRC they have some janky sync behavior that might not work on any display I own).


I did a lot of 8-bit Atari programming at the time, and the display list was the greatest thing ever. I was a bit surprised that other computers had global modes rather than being able to mix-n-match and create a custom mode. Double-buffering was also trivial, by writing to the other memory area then setting that address as the display list starting address in the vertical blank interrupt handler.
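
A sketch of that flip in C, assuming two prebuilt display lists each pointing at its own screen buffer; SDLSTL/SDLSTH (0x0230/0x0231) are the documented OS shadow for the display list pointer, the rest is schematic:

    #include <stdint.h>

    #define SDLSTL (*(volatile uint8_t *)0x0230)
    #define SDLSTH (*(volatile uint8_t *)0x0231)

    static uint16_t dlist[2] = { 0x4000, 0x5000 }; /* two lists */
    static int front = 0;

    /* Called from the vertical blank interrupt: point ANTIC at
       the list whose buffer was just finished drawing. */
    void vbi_flip(void) {
        front ^= 1;
        SDLSTL = dlist[front] & 0xFF;
        SDLSTH = dlist[front] >> 8;
    }

    /* The main loop draws into the buffer behind dlist[front ^ 1]
       and then requests a flip. */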

The C64 had better sprites and sound, but the Atari did more with less, with the 2600/VCS being an extreme example. In particular the SIO[0] (serial I/O) architecture let peripherals work without a lot of logic, whereas the C64 drive had a more powerful CPU than the C64 itself. Atari SIO with its daisy-chaining, device numbers, and command/response structure is sometimes referenced as a precursor to the design of USB. Atari SIO was even better in that devices could load their drivers over SIO when booted so that devices made in the future could be supported without OS/ROM changes or manual driver loading.

[0] https://en.wikipedia.org/wiki/Atari_SIO
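
The USB resemblance is easiest to see in the 5-byte command frame the computer sends to an addressed device. A sketch, with example field values (e.g. 0x31 addresses disk drive 1):

    #include <stdint.h>

    typedef struct {
        uint8_t device;   /* device ID, e.g. 0x31 = disk drive 1 */
        uint8_t command;  /* e.g. 'R' = read sector              */
        uint8_t aux1;     /* command parameters, here the sector */
        uint8_t aux2;     /* number as low/high bytes            */
        uint8_t checksum; /* over the preceding four bytes       */
    } SioCommandFrame;

    /* SIO's checksum is an 8-bit sum with the carry wrapped
       back in. */
    static uint8_t sio_checksum(const uint8_t *buf, int len) {
        uint16_t sum = 0;
        for (int i = 0; i < len; i++) {
            sum += buf[i];
            sum = (sum & 0xFF) + (sum >> 8);
        }
        return (uint8_t)sum;
    }

The addressed device then acknowledges and sends its response, giving the command/response structure described above.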


minor nitpick: The 1541 disk drive's CPU was just a 6502 running at 1 MHz. Basically the same chip as in the C64 (the 6510), but without the built-in I/O ports.

Atari's SIO system was way better.


One way to think of it is as an enhanced version of the Atari 2600 graphics chip with a sort of primitive "GPU" put in front of it to offload the "racing the beam" stuff. Amiga "copper lists" later generalized the concept to drive a larger array of relatively simple specialized I/O chips (presumably due to Jay Miner).
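
A copper list is literally a tiny program of WAIT/MOVE instruction pairs that the Copper walks each frame. A sketch (0x0180 is the COLOR00 background register offset; the encodings follow the documented WAIT/MOVE format):

    #include <stdint.h>

    uint16_t copper_list[] = {
        0x0180, 0x0F00, /* MOVE: background = red at frame top     */
        0x6401, 0xFFFE, /* WAIT: until the beam reaches line 0x64  */
        0x0180, 0x000F, /* MOVE: background = blue from there down */
        0xFFFF, 0xFFFE, /* end: wait for an impossible position    */
    };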


Yep, it was really clever how it was built in separate stages.


The Atari 8-bit machines were ahead of their time. Their graphics were so good that an architecture first sold in 1979 was still market-competitive among inexpensive home computers in 1987, when the XEGS appeared as the last new model in the series. About the only thing one might miss is a built-in 80-column mode. The machines in the line were (more or less) compatible with each other from start to finish, too; far better than Commodore, which over the same period sold at least five different 8-bit computer lines, all mutually incompatible with rare exceptions. SIO is also far superior to Commodore's jumble of serial, user, and cassette ports. (Did I mention that this all appeared in 1979?)

Arguably Atari's sophistication was a disadvantage. The need for custom ICs without an in-house fab like Commodore's, plus a focus on quality manufacturing as opposed to Commodore's slipshod methods, made the 8-bit line far too expensive compared to competitors; in 1979 one might as well have bought an Apple II and gained a larger library, far more expandability, and the amazing Disk II. By 1982 the Commodore 64 brought (slightly) superior graphics and sound at a far lower price. The key year is 1983, when Jack Tramiel waged war on the entire rest of the industry with massive discounts on the VIC-20 and 64 leading up to Christmas; the primary opponent was TI (which indeed surrendered in November), but Atari was collateral damage.

That need not have been fatal for Atari, but a) shortages of the new 600XL and 800XL models in 1983 because of mishandled Asian production, b) the massive losses that parent Warner Communications suffered from the collapse of the video game market (which Commodore's aforementioned price war contributed to), and c) the consequent sale of Atari to the same Tramiel after he quit Commodore all caused developers to shy away from new software releases from 1984 onward. Once the software disappeared, so did the customers.


Jay Miner, who led the design team for the Atari 8-bit series, went on to design the Amiga chipset.


Cool, Paul is the Xojo guy (think VB6 for Win/Mac/Linux).



