I recently bought a Videx Ultraterm video card for my Apple II+ computer. I do not have an Apple Monitor /// (the monitor the Ultraterm manual recommends for viewing the 132-column and higher modes). I do have an Apple Monitor II, but it does not display the 132-column or higher modes. I reviewed the specs for the Apple Monitor /// and the Apple Monitor II, and it looked to me like the Monitor II has a wider video bandwidth (18 MHz vs. 15 MHz) and a higher resolution (900 TV lines at center vs. 700 TV lines at center). Based on this, I thought I would be able to use the Ultraterm's 132-column and higher modes with the Apple Monitor II. Maybe my Ultraterm card is bad. Has anyone used a Videx Ultraterm with the Apple Monitor II and been able to view the 132-column or higher modes?
The card uses the same CCIR System-M composite interface as the video onboard the Apple II, so "compatibility" is strictly a matter of picture quality. The "Resolution" of a System-M display is very different from the kind of resolution numbers on computer displays; it is merely a figure of merit for the quality of the picture. So no matter how high or low the resolution spec on the monitor, it should display the same composite signals.
The video bandwidth (the highest signal frequency the video amplifier will pass) will determine how legible the characters are. The phosphor persistence will determine the level of flicker.
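As a rough sanity check on what that means for a 132-column mode (a quick Python sketch; the 8-dot character cell is my assumption, not something from the Ultraterm manual):

columns = 132
dots_per_cell = 8                     # assumed character-cell width, not from the manual
active_line_us = 52.6                 # nominal active portion of a ~63.5 us System-M line
dots_per_line = columns * dots_per_cell          # 1056 dots across the row
dot_rate_mhz = dots_per_line / active_line_us    # ~20 MHz dot clock
print(dots_per_line, round(dot_rate_mhz, 1))     # 1056 20.1
# The closer the monitor's video bandwidth is to this dot rate, the crisper
# the individual dots will look; well below it, characters smear together.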
Here is a great article from back in the day about this card:
[Attached scan: Creative_Computing_v10_n09_1984_September_0038.jpg — Creative Computing, Vol. 10, No. 9, September 1984]
Thanks for the Ultraterm article/review. I believe it has the answer I was looking for. It does state that the Apple II monitor does not work well with the Ultraterm. When I first read that statement I thought it referred to an early Apple II monitor, such as the Sanyo VM-4209 or VM-4509 that Apple sold with early Apple II computers. But the article is dated September 1984, so it must be referring to the Apple Monitor II, which was available at that time. So it looks like I will be on the lookout for a reasonably priced Apple Monitor /// to try the higher column-count modes. In the meantime, the Ultraterm is still an upgrade over the Videx Videoterm card that was previously in my Apple II+.
I presume the card is generating an interlaced video signal. That would be the only way to achieve the higher vertical resolution and would explain why they recommend a high-persistence/slow-phosphor monitor like the Monitor /// (the slower phosphor would reduce the flicker from an interlaced signal).
I'd expect the newer Apple II monitor to be perfectly fine as far as clarity is concerned, but the flicker from the interlaced signal would be a distraction.
(Later Apple /// models had the option to generate interlaced video, which probably looked pretty good with the Monitor ///)
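For a rough idea of the numbers involved (standard System-M timing, nothing Ultraterm-specific), here is a quick Python sketch:

line_period_us = 63.556          # horizontal period H
field_period_ms = 16.683         # vertical period V (~60 fields/second)
lines_per_field = field_period_ms * 1000 / line_period_us   # ~262.5
visible_per_field = 240          # what's left after blanking and overscan
# Non-interlaced: every 1/60 s field redraws the same ~240 visible lines.
# Interlaced: alternate fields are offset by half a line, so a full frame
# has ~480 visible lines, but each individual line is refreshed only ~30
# times a second -- that is the flicker a slow phosphor papers over.
print(round(lines_per_field, 1), visible_per_field * 2)     # 262.5 480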
Interlaced is not the only way. They could also be doing progressive scan at a reduced frame rate. In my ESP32 SoftCard, in order to handle the Hercules 720x348 and Mac 512x342 resolutions, I support both approaches and let the user pick one, since each has its merits and drawbacks (more here). In either case a monitor with a high-persistence phosphor will be beneficial in reducing flicker.
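A rough sketch of the reduced-frame-rate approach, using the 720x348 mode as the example (the blanking allowance is just an assumed round number):

line_period_us = 63.5            # H stays at the System-M value
visible_lines = 348              # e.g. the Hercules-style mode
blanking_lines = 30              # assumed vertical blanking allowance
frame_ms = (visible_lines + blanking_lines) * line_period_us / 1000
refresh_hz = 1000 / frame_ms
print(round(frame_ms, 1), round(refresh_hz, 1))   # 24.0 41.7
# A ~42 Hz progressive refresh flickers on a fast phosphor much like 30 Hz
# interlace does, which is why a long-persistence tube helps either way.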
However no matter which approach the card uses, the author should still see something on the screen of any composite monitor, including the standard Apple II monochrome monitor. If he is just getting a black screen, the card is most likely not working or it has not been turned on.
Now, a card firing pixels at 20 MHz (and a monitor with the bandwidth to pass them) goes a long way towards comfortably fitting a 720-pixel raster on the screen horizontally. My card can only do a maximum of 16 MHz, and on some monitors the image runs right to the very edge of the screen in the 720x348 mode. This card appears to be firing at 20 MHz.
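To put numbers on that horizontal fit (the 52.6 µs active-line figure is the nominal System-M value):

active_line_us = 52.6                         # nominal visible portion of a scan line
for dot_clock_mhz in (16.0, 20.0):
    raster_us = 720 / dot_clock_mhz           # time the 720 pixels occupy
    fill = raster_us / active_line_us         # fraction of the visible width
    print(dot_clock_mhz, round(raster_us, 1), f"{fill:.0%}")
# 16 MHz -> 45.0 us, ~86% of the visible line: the image runs near the edges.
# 20 MHz -> 36.0 us, ~68%: comfortable margin on both sides.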
Without interlacing, a compliant CCIR System-M signal has 240 rows, and System-B has 288, where the top and bottom rows are usually cut off by the bezel (overscanned). The horizontal line period (H) in both cases is around 64 microseconds. This H period is generally critical to the flyback circuit, so should not be changed much or at all. In CRT displays without a locked local oscillator, like the IBM 5151, feeding a signal with the wrong H can cause components to burn up. The vertical period (V) is less critical, and a CRT set may be adjustable to a range of frequencies around the CCIR standard (16.7 ms in System-M, 20 ms in System-B), but it could upset other adjustments like linearity and color decoding. It's not ideal for the user to have to adjust multiple pots when switching video modes, but perhaps it can be lived with.
Interlacing inserts an H/2 delay after every other vertical retrace, making each field begin or end with a half-line. There shouldn't be any adjustment required when switching between interlaced and non-interlaced modes as long as the V and H timings are kept the same.
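A quick check that the half-line offset doesn't disturb the timing the monitor locks to (plain System-M numbers again):

h_us = 63.556                           # horizontal period H
field_a_us = 262 * h_us + h_us / 2      # field ending with a half-line
field_b_us = h_us / 2 + 262 * h_us      # field beginning with one
print(round(field_a_us / 1000, 2), round(field_b_us / 1000, 2))   # 16.68 16.68
# A full interlaced frame is 525 lines, i.e. two fields of 262.5 lines each.
# Both fields have the same duration, so the monitor's vertical oscillator
# sees the same V period; only the phase of vertical sync relative to the
# line structure shifts, which is what slots one field's lines between the
# other's.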
The horizontal line period of 64 microseconds is not changed in any of the methods I described above. In fact it cannot be changed by the composite video signal while still maintaining synchronization (outside of a very small adjustment margin), so the concern about component damage is a non-issue.