I will preface this by saying I'm not trying to rant at anyone or be rude (and I apologize if it comes across that way), I'm just trying to be as precise as I can be with my question; I'm after a very specific thing.
I know how DHGR works. The VRAM is split between the main and aux memory banks and interlaced, the lines aren't laid out linearly, the bits displayed are drawn from each byte least-significant bit first with bit 7 ignored, etc. On a monochrome screen, this just gives a 560 x 192 1-bit bitmapped display. On a color monitor, a repeating pattern of 4 bits will trick the NTSC video circuitry into displaying a single color, as it interprets the repeating bits as a chrominance signal.
What I want to know is: if I have some arbitrary string of bits to be displayed, starting at the beginning of a line (for instance 110101000111101010101100101001, which in order to be displayed gets packed into the hex bytes 2B 3C 55 29 02 and the odd bytes are written sequentially into aux memory and the even bytes are written sequentially into main memory), how do I take this information and figure out (possibly by hand) what pixels will be which colors on the color display? In this case it comes out to blue, 2x light blue, 3x dark grey, 3x dark green, yellow, 2x white, 2x pink, 5x grey, 2x aqua, 3x blue, 3x grey, 2x green, and dark green, but how would I figure that out (by hand or otherwise)?
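(In case it helps, here's roughly how I'm doing that packing, as a quick Python sketch; the function name and the string-of-'0'/'1' input are just for illustration:)

    def pack_dhgr(bitstring):
        # Pack display-order bits into 7-bit bytes, least-significant bit
        # first, leaving bit 7 clear, then split alternately into aux/main.
        values = []
        for start in range(0, len(bitstring), 7):
            chunk = bitstring[start:start + 7]
            values.append(sum(int(b) << k for k, b in enumerate(chunk)))
        return values[0::2], values[1::2]   # (aux bytes, main bytes)

    aux, main = pack_dhgr("110101000111101010101100101001")
    print([hex(v) for v in aux], [hex(v) for v in main])
    # ['0x2b', '0x55', '0x2'] ['0x3c', '0x29']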
This is an already-solved problem (else we wouldn't have the wonderful linapple emulator which I used to figure out what colors would come out of that bit pattern), but I just can't find any explanation of exactly how it all works. Yes, the bit patterns that produce the colors are described in the Apple IIe Tech Note #3 (http://www.1000bit.it/support/manuali/apple/technotes/aiie/tn.aiie.03.html), but there is no explanation of exactly what colors will result from some arbitrary pattern of bits.
If anyone could point me to some documentation or explain it that would be very, VERY much appreciated.
The Apple's ~14 MHz clock is exactly four times the frequency of NTSC's 3.58 MHz color clock, as you're already aware. Although the Apple only transmits a simple on-off signal, its timing is interpreted as a Y'IQ composite signal by an NTSC receiver.
So, in double hi-res mode you can think of the bit stream as synchronized with the 4 phases of the color subcarrier, so the bits contribute to the I and Q components as follows:
The bit streams 1010 and 0101 are both interpreted as gray because the 1 bits are 180 degrees out of phase for the components that receive them, resulting in both color components I = Q = 0:
Since I and Q are both 0, the patterns 1010 and 0101 land at the grayish (0,0) coordinate of the Y'IQ color plane (picture). Hence, a shade of gray.
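To spell that out (assuming the four bit positions of the window sit at 0°, 90°, 180° and 270° of the color subcarrier, which is my guess at the phase alignment): for 1010 the I sum is 1·cos 0° + 0·cos 90° + 1·cos 180° + 0·cos 270° = 1 − 1 = 0, and the Q sum is 1·sin 0° + 0·sin 90° + 1·sin 180° + 0·sin 270° = 0; 0101 cancels the same way with the other pair of positions.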
On the other hand, the pattern 1100 generates color because I and Q both decode as nonzero color components:
Since I and Q are both 1, the pattern 1100 lands at the top-right corner (1,1) coordinate of the Y'IQ color plane (same picture). A shade of magenta, depending on the tint control.
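To put the same idea into code, here's a rough Python sketch that demodulates a repeating 4-bit pattern into Y'IQ and then into RGB. The phase origin, the saturation gain and the lack of any low-pass filtering are all my simplifications, so take it as an approximation of what an NTSC decoder does, not what linapple actually implements:

    import math

    def pattern_to_yiq(bits):
        # Assumption: the first bit of the window sits at 0 degrees of the
        # 3.58 MHz subcarrier; the real offset depends on the Apple's timing,
        # so the hue is rotated by some fixed amount relative to hardware.
        y = sum(bits) / len(bits)                      # luma = average level
        i = sum(b * math.cos(math.radians(90 * k)) for k, b in enumerate(bits))
        q = sum(b * math.sin(math.radians(90 * k)) for k, b in enumerate(bits))
        return y, i, q

    def yiq_to_rgb(y, i, q, sat=0.3):
        # Standard NTSC YIQ -> RGB matrix; `sat` is an arbitrary gain so the
        # un-normalized I/Q sums land in a plausible 0..1 range.
        r = y + sat * (0.956 * i + 0.621 * q)
        g = y + sat * (-0.272 * i - 0.647 * q)
        b = y + sat * (-1.106 * i + 1.703 * q)
        clamp = lambda v: max(0.0, min(1.0, v))
        return clamp(r), clamp(g), clamp(b)

    for pattern in ([1, 0, 1, 0], [0, 1, 0, 1], [1, 1, 0, 0]):
        y, i, q = pattern_to_yiq(pattern)
        print(pattern, (y, i, q), yiq_to_rgb(y, i, q))
    # 1010 and 0101 come out with I = Q = 0 (gray at 50% luma);
    # 1100 comes out with I = Q = 1 (a magenta-ish color, hue depending on phase).

For an arbitrary bit stream like the one in the question, emulators typically do something equivalent per output dot, e.g. demodulating a short sliding window of bits around each dot (advancing the phase 90° per bit position), which is why each dot can pick up a "fringe" color from its neighbors; I can't vouch for linapple's exact algorithm, though.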
EDIT: You can find a much more relatable layman's overview in James Sather's book Understanding The Apple II, page 8-33, but it doesn't go into detail about how the bits transform into color components.
Here's another explanation:
http://lukazi.blogspot.com/2017/03/double-high-resolution-graphics-dhgr.html