Jay Miner's Vision

This article explains Jay Miner's vision. It was inspired by a question from Tommy Hallgren on scene.se: Amiga vs. ST - why does one run at 7.14 MHz and the other at 8 MHz?
In short, the CPU clock is normally 7.09 MHz on a PAL Amiga and 7.16 MHz on an NTSC one. It is derived from the color clock, which at the time roughly matched RAM speed; it is not exactly a magic number - details below.
On later Amiga models the color clock stays the same, while the CPU runs on its own clock and an optional external FPU on a third. At that point the motherboard is essentially treated as a graphics card.
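
To make the relationship concrete, here is a small back-of-the-envelope sketch in plain C, using the commonly quoted OCS/ECS crystal figures, showing how the CPU clock and the chipset's color clock (CCK) both fall out of the same master crystal:

```c
#include <stdio.h>

int main(void)
{
    /* Master crystal frequencies in MHz (the commonly quoted OCS/ECS values). */
    const double master_pal  = 28.37516;
    const double master_ntsc = 28.63636;

    /* The 68000 runs at master/4, the chipset color clock (CCK) at master/8. */
    printf("PAL : CPU %.5f MHz, CCK %.6f MHz\n", master_pal  / 4.0, master_pal  / 8.0);
    printf("NTSC: CPU %.5f MHz, CCK %.6f MHz\n", master_ntsc / 4.0, master_ntsc / 8.0);
    return 0;
}
```

That is where the "roughly 7.1 MHz" comes from: two CPU cycles per color clock, which in turn lines up with the chip-RAM access slots.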

Background

The background goes back to the 1970s: practically every home and personal computer had video circuitry built around a 15 kHz horizontal scan rate, plus a faster pixel clock that depended on the horizontal resolution in use.
It cannot be overstated how big the 15 kHz standard was. If you had anything that displayed a picture, it was 15 kHz - otherwise the user (and the hardware developer) paid sky-high prices to develop their own electronics driving a higher-performance cathode ray tube, which also had to be designed and put into production.
This standard was abandoned in only a few cases before the next standard, 31 kHz as in VGA (essentially just a doubling), arrived. Examples include Atari System 2, the Atari ST monochrome output, and the 1970s-inspired graphics cards that "showed a picture" on the PC.
Houston: we have picture!!1one.
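
For concreteness, the 15 kHz figure simply falls out of the TV standards themselves, and VGA's 31 kHz is close to a doubling of it. A quick sanity check in plain C, using the standard TV and VGA figures:

```c
#include <stdio.h>

int main(void)
{
    /* Horizontal scan rate = lines per frame * frames per second. */
    const double pal_hz  = 625.0 * 25.0;              /* 15625 Hz   */
    const double ntsc_hz = 525.0 * (30000.0 / 1001.0); /* ~15734 Hz  */
    const double vga_hz  = 31469.0;                    /* standard 640x480@60 */

    printf("PAL  line rate: %.0f Hz\n", pal_hz);
    printf("NTSC line rate: %.1f Hz\n", ntsc_hz);
    printf("VGA / NTSC    : %.2f (roughly a doubling)\n", vga_hz / ntsc_hz);
    return 0;
}
```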

But how do you get a computer to not only show a picture, but also update its contents quickly - as in games - or simply feel responsive? It took a very long time before, for example, the PC could do this with the fidelity that had already existed a decade earlier, and well into the 1990s the PC user experience felt like using an old computer from the 1970s. Even back then, there were graphics cards for home computers that Unix and CP/M routines could drive.

A responsive computer

The invention was to split the bus between the chipset and the CPU: when the CPU wasn't addressing memory, the graphics circuit (VDU) was allowed to read it. Initially this let the Apple II show a bitmap of sorts and modify it quickly, unlike practically every other personal computer of the time.
The invention was copied by several 8-bit computers and beyond. The C64 and the Atari ST have a 50/50 split of the bus similar to the Apple II, while yet others required manually writing pixels to a separate VRAM, which freed the CPU (if implemented correctly) but imposed severe limitations on updating the picture rapidly. A sketch of the shared-bus idea follows below.
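
As a conceptual model only - not a cycle-accurate description of any particular machine - the shared-bus idea can be sketched as alternating memory slots: even slots go to video fetches, odd slots to the CPU, so the display is refreshed without stalling the processor:

```c
#include <stdio.h>

/* Toy model of a 50/50 shared bus: even memory slots belong to the video
 * circuitry, odd slots to the CPU.  This illustrates the idea only; it is
 * not how any specific chipset is actually wired. */
enum bus_owner { VIDEO, CPU };

static enum bus_owner owner_of_slot(unsigned slot)
{
    return (slot % 2 == 0) ? VIDEO : CPU;
}

int main(void)
{
    for (unsigned slot = 0; slot < 8; slot++) {
        printf("slot %u -> %s\n", slot,
               owner_of_slot(slot) == VIDEO ? "video fetch" : "CPU access");
    }
    return 0;
}
```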

The Amiga computer

On the Amiga, the 15 kHz standard was kept, likely because of its massive popularity, which would continue far into the 1990s - not least with consoles and DVD - and with an eye toward video effects, which became huge around 1990. It would take many years before any other computer costing less than a new car could do anything like that.
In the case of the Amiga, though, this invention was considerably evolved. If you developed a complete chipset that arbitrated (fought over) the memory bus in an inventive and strict way, you would not only have a CPU that did what it should, but could also, at a low performance cost, use the memory bandwidth fully. This made it possible to build a complete computer-system-on-a-motherboard with full OS support.
This was a major shift away from the previous solutions, all the way back to the 1970s: you were always forced to acquire the right software to talk to the right hardware - unless you settled for the expansions the computer brand itself could offer. That works for a niche computer, but not for one meant to be popular (unless extreme prejudice happens, such as in the form of illegal monopolies).
Thanks to Agnus's much more advanced bus arbitration, the chips could cooperate: not only could the VDU read from RAM, but other chips could too, and the Blitter, floppy, and SCSI DMA could write to it, with only a minor effect on CPU performance.
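
As a heavily simplified sketch - not the actual per-scanline slot allocation of Agnus - priority-based arbitration with the CPU taking the leftover slots looks roughly like this:

```c
#include <stdio.h>

/* Heavily simplified priority arbiter in the spirit of the Amiga's chip-RAM
 * arbitration: fixed-priority DMA channels get the bus first, and the CPU
 * takes whatever slots are left over.  The real Agnus uses a fixed slot map
 * per scanline; this only illustrates the principle. */
enum requester { DISK, AUDIO, SPRITE, BITPLANE, COPPER, BLITTER, CPU, NREQ };

static const char *names[NREQ] = {
    "disk", "audio", "sprite", "bitplane", "copper", "blitter", "cpu"
};

/* Grant the bus to the highest-priority requester; the CPU always "requests". */
static enum requester arbitrate(const int request[NREQ])
{
    for (int r = 0; r < CPU; r++)
        if (request[r])
            return (enum requester)r;
    return CPU;
}

int main(void)
{
    int busy_line[NREQ]  = { 0, 1, 0, 1, 0, 1, 1 }; /* audio+bitplane+blitter active */
    int blank_line[NREQ] = { 0, 0, 0, 0, 0, 0, 1 }; /* nothing but the CPU */

    printf("busy scanline : bus goes to %s\n", names[arbitrate(busy_line)]);
    printf("blank scanline: bus goes to %s\n", names[arbitrate(blank_line)]);
    return 0;
}
```
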
Now, such a thing would be very complex with a fully asynchronous design. The Atari ST doesn't do that; it takes the same approach as the C64 and others: only the CPU can write to the RAM used for display, so those chips are slaves rather than co-operators, and what happens to the picture depends on where the raster beam is when the write happens - a simple 1:1 relationship, as on earlier computers.

Why the difference in frequencies?

You could say that the Atari ST's frequency is a completely natural and expected choice, and that if you can get the Amiga's chips to cooperate, you can do everything Jay Miner had as his vision - which is what causes the big differences one experiences.
The Amiga's frequency is chosen with that vision in mind, but in a way that keeps it 100% compatible with video effects - and roughly in step with DRAM timing. (It's worth noting that CRT circuitry is forgiving, as are some flat-screen display chips, and that armed with this knowledge you can "tweak your modeline".)
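
If you want to play with this, the arithmetic behind a modeline is simple: horizontal frequency = pixel clock / horizontal total, and vertical refresh = horizontal frequency / vertical total. A purely illustrative check in plain C, using the standard VGA 640x480@60 timing rather than an Amiga mode:

```c
#include <stdio.h>

int main(void)
{
    /* Standard VGA 640x480@60 timing: 25.175 MHz dot clock,
     * 800 pixels per line total, 525 lines per frame total. */
    const double dot_clock_hz = 25.175e6;
    const double htotal = 800.0;
    const double vtotal = 525.0;

    const double hfreq = dot_clock_hz / htotal;  /* horizontal scan rate */
    const double vfreq = hfreq / vtotal;         /* vertical refresh     */

    printf("hfreq = %.3f kHz\n", hfreq / 1000.0); /* ~31.469 kHz */
    printf("vfreq = %.2f Hz\n",  vfreq);          /* ~59.94  Hz  */
    return 0;
}
```

A 15 kHz mode is the same arithmetic with a lower horizontal frequency, and a forgiving CRT will lock onto a fair amount of deviation.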

Modern Amiga-inspired computers

If we were to remake Jay Miner's vision with modern components - centered on memory bandwidth, with chips cooperating in a similar way - such a computer would feel very responsive even today. <3
In my opinion, this would be a truer successor than even an Amiga with a faster CPU and a graphics card - and the latter is of course a 100% legitimate Amiga, like my accelerated ones.
But as always: when it comes to popularity, it's about getting others to dig the hype. ;)