tomaitheous wrote: Did you miss the formula that I posted?
40 CELL mode uses a 6.7125 MHz dot clock and 32 CELL mode uses a 5.37 MHz dot clock. That's what defines the "width" of a pixel for determining correct pixel aspect ratio and overscan area.
6.7125 MHz / 15735 Hz = 426 dot clocks (pixels) per scanline; 426 / 1.186 (non-displayable part of the line, i.e. hblank/sync) = 360 displayable pixels; 360 / 1.125 (for clipping) = 320 non-border pixels.
5.37 MHz / 15735 Hz = 341; 341 / 1.186 = 288 pixels; 288 / 1.125 (VDP clipping) = 256 pixels.
It doesn't scale or fit the pixels into a 720 (13.425 MHz) scanline. 720 is a standard, not a limit. A scanline is infinite in resolution since it's an analog signal. Actually, most consumer-grade TVs can't even show a true 720-pixel line from an input source. A 13.425 MHz signal is out of the range of composite, S-Video, and RF. Even most SDTV (480i) and EDTV (480p) component output isn't 13.5 MHz, and most low-to-mid grade sets won't *translate* all the detail from that frequency anyway.
I didn't miss it, but I have some questions about your pixel calculations:
1/ How do you get that the pixel clock is the master clock (53.693175 MHz) divided by 8 in H40? The /10 divider in H32 mode seems fine, as it is consistent with Charles MacDonald's tests on the SMS.
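Just to illustrate what I mean, here is a quick Python check; the /8 and /10 dividers are my reading of your formula, and the 53,693,175 Hz master clock is the value you use yourself further down:

# check of the assumed dot-clock dividers
MCLK_NTSC = 53_693_175            # NTSC master clock, in Hz
print(MCLK_NTSC / 8)              # ~6.7116 MHz -> close to the 6.7125 MHz you quoted for H40
print(MCLK_NTSC / 10)             # ~5.3693 MHz -> the 5.37 MHz you quoted for H32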
2/ How do you know for sure that the VDP line frequency is 15735 Hz (which is, from what I read, the standard NTSC line frequency)? For example, in the original TMS9918 doc (which works in H32 mode), the line period is specified as 63.69 µs (i.e. about 15700 Hz). This is consistent with the fact that both that doc and MacDonald's tests give a maximum of 342 pixels per line in this mode (not 341), i.e. a line frequency of 53693175 / 10 / 342 = 15700 Hz. I doubt the VDP line frequency changes between H40 and H32; only the pixel clock changes.
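In Python, the comparison I have in mind looks like this (the 342-pixel line is the TMS9918 figure, and the 15734.26 Hz value is the nominal NTSC line rate I'm comparing against):

MCLK_NTSC = 53_693_175
h32_dot_clock = MCLK_NTSC / 10        # ~5.3693 MHz
print(h32_dot_clock / 342)            # ~15700 Hz if a line is 342 pixels long
print(h32_dot_clock / 341)            # ~15746 Hz if it were 341 pixels
print(15734.26)                       # nominal NTSC line frequency, for comparison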
3/ Next, about the htotal/hdisplay ratio: again, in the TMS9918 docs, the timings are described as follows:
63.695 µs for the full line (342 pixels)
47.67 µs for the active display (256 pixels)
2.42 µs for the right border (13 pixels)
2.8 µs for the left border (15 pixels)
the rest is for hsync/hblank/color burst
This gives us 284 pixels out of 342, i.e. a ratio of about 1.20, which is much closer to the specified NTSC timings (nominal line period = 63.5555 µs, line-blanking interval = 10.9 µs, i.e. a ratio of about 1.21).
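Spelled out in Python (the timing figures are the TMS9918 ones just above; the NTSC numbers are the nominal ones in parentheses):

line_us   = 63.695                     # full line (342 pixels)
active_us = 47.67 + 2.42 + 2.8         # active display + right border + left border (284 pixels)
print(342 / 284)                       # ~1.20, ratio from the pixel counts
print(line_us / active_us)             # ~1.20, same ratio from the microsecond figures
print(63.5555 / (63.5555 - 10.9))      # ~1.21, nominal NTSC line / active-line ratio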
With this in mind, and supposing the H40/H32 dot-clock ratio is 5/4, I calculated:
NTSC resolutions:
284x240 pixels in H32 mode
356x240 pixels in H40 mode
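For what it's worth, here is how I reproduce those widths (a Python sketch; the 5/4 ratio is the assumption above, and rounding each border up is my guess at how 284 becomes 356 rather than a flat 355):

import math
h32_active, h32_right, h32_left = 256, 13, 15    # H32 active width and borders, in pixels
scale = 5 / 4                                     # assumed H40/H32 dot-clock ratio
print((h32_active + h32_right + h32_left) * scale)    # 355.0 with straight scaling
print(int(h32_active * scale)
      + math.ceil(h32_right * scale)
      + math.ceil(h32_left * scale))                  # 356 if each border is rounded up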
4/ Finally, what about PAL mode? Should the maximum number of pixels per line remain constant? If so, and if the clock dividers do not change, the VDP line frequency in PAL mode would have to be slower, to compensate for the slower pixel clock (the PAL master clock is 53.203424 MHz).
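To put numbers on that question (Python again; the only assumptions are keeping the /10 divider and the 342-pixel line from the NTSC case):

MCLK_PAL = 53_203_424                  # PAL master clock, in Hz
print(MCLK_PAL / 10 / 342)             # ~15557 Hz if the /10 divider and 342-pixel line are kept
print(15625)                           # nominal PAL line frequency (625 lines x 25 Hz), for comparison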
Secondly, isn't the htotal/hdisplay ratio different between the NTSC and PAL formats?
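Using the nominal figures I've seen quoted (a 64 µs line and roughly 12 µs blanking for PAL, versus 63.5555 µs and 10.9 µs for NTSC), the ratios do come out slightly different:

print(63.5555 / (63.5555 - 10.9))      # NTSC: ~1.21 (line / active-line ratio)
print(64.0 / (64.0 - 12.0))            # PAL:  ~1.23 (64 us line, ~12 us blanking)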
Sorry to bother you with all these questions; I'm still very much in the learning stage on this subject, so maybe I've misunderstood some video concepts.
PS: I made some tests with a Megadrive model 2 (PAL) that I modded to 60 Hz: it appears my TV set (a 28" Sony CRT) does not show any top/bottom border but shows the right-side border pretty much completely (it's as if the image is shifted to the left, not centered).