AppleII Conan game intro screen incorrect (red/blue mixed with green/mauve) #872
Looking at the NTSC colour wheel, that would suggest that my OpenGL decoder is most likely 90 degrees out of phase. I know there was an error in phase in the OpenGL code, but believed it to be appropriately compensated for in lieu of a proper fix. I'll check it out.
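For readers less familiar with NTSC decoding, here is a minimal sketch of why a 90° phase error permutes hues rather than merely tinting the picture. This is illustrative maths only; `IQ` and `rotate_chroma` are hypothetical names, not from the CLK sources.

```cpp
#include <cmath>

struct IQ { float i, q; };

// Rotate the chroma vector by the decoder's phase error. At pi/2 radians (90°)
// every hue moves a quarter-turn around the NTSC colour wheel while luma and
// saturation stay put, so e.g. a green can come out red.
IQ rotate_chroma(IQ c, float phase_error_radians) {
	const float s = std::sin(phase_error_radians);
	const float k = std::cos(phase_error_radians);
	return { c.i * k - c.q * s, c.i * s + c.q * k };
}
```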
To follow up on this: I wasn't immediately able to reproduce. I did finally see some of the other defects that have been reported to me with the OpenGL backend, but colours looked correct. The tested environment was Ubuntu 20.04, in VMware. That said, I'm still suspicious about the way I tried to resolve the phase error mentioned above, so hopefully the penny will drop soon.
I played around with this some more, and there is definitely something mysterious going on. See the attachments; 'OpenGL 66' and 'OpenGL 67' are that title screen with the colour phase shifted by only 1/256th of a cycle (as if I had finely adjusted a tint control), yet the displayed colour leaps a huge distance. Even more weirdly, I thought I'd check the Metal renderer, which manages to dodge a 1/8th-of-a-cycle phase error that I believe to be present in OpenGL and therefore has phase numbers that aren't exactly comparable, and I managed to adjust things such that I got a gradual phase error across the display. That's 'Metal 56'. So I don't offhand know whether this ties exactly into #857, but it's likely that something is amiss, though I'm surprised that it seems to affect both back-ends. More investigation required.
Hacks in AppleII/Video.cpp, AppleII/Video.hpp, and Macintosh/Video.cpp assume that building on macOS means building for Metal unless IGNORE_APPLE is defined. By defining this in the macOS SDL build, Macintosh video is now sized and positioned correctly and Apple II colors are now just as wrong as they are on other OpenGL builds instead of being wrong in a unique way. See TomHarte#872
I wrote a program to display all the colors along with their intended chrominance phase and amplitude and luma values, hoping to spot some pattern. Initially, I didn't. Here's the correct output from the macOS Metal UI:

And here's the incorrect output from the macOS SDL OpenGL UI:

The three very similar greens, and the three very similar magentas, make it look like more than just a phase shift problem. After discussing the issue with @RRyanClark, under the hypothesis that the OpenGL version might have had correct colors at some point in the past, I tried building older versions of the code from 2019. I didn't find correct colors, but I did find different colors. I decided to …although that might still not be the whole story, because just twiddling the hue knob on that screenshot by 90° in an image editor fixes many of the colors but not all of them; for example, 7 Light blue then appears darker than 2 Dark blue.
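As a rough sketch of the kind of phase/amplitude/luma-to-RGB mapping such a test program implies, assuming standard YIQ-style decoding (this is illustrative, and `rgb_from_phase` is a hypothetical name, not taken from the program described above):

```cpp
#include <array>
#include <cmath>

// Compose a colour from (luma, chroma amplitude, chroma phase) and convert to
// RGB with the standard YIQ matrix.
std::array<float, 3> rgb_from_phase(float luma, float amplitude, float phase_radians) {
	const float i = amplitude * std::cos(phase_radians);
	const float q = amplitude * std::sin(phase_radians);
	return {
		luma + 0.956f * i + 0.621f * q,   // R
		luma - 0.272f * i - 0.647f * q,   // G
		luma - 1.106f * i + 1.703f * q,   // B
	};
}
```

Holding luma and amplitude fixed while sweeping the phase walks around the colour wheel, which is why a constant phase offset permutes the palette without much changing brightness.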
Adjust phase by 90 degrees. Closes TomHarte#872
If I understand correctly, the SDL/OpenGL colors are now a little brighter than the Metal ones, but I can live with that.
Well shoot. While 81ad864 fixes the 90° color shift in the Scons/SDL/OpenGL version on macOS 12.7.2 on my 2012 15" MacBook Pro with Retina Display, it causes a 90° shift in the other direction in the Scons/SDL/OpenGL version on Linux Mint 21.2 on my 2011 13" MacBook Pro. Of course I can make it work on both machines like this…

```diff
diff --git a/Machines/Apple/AppleII/Video.hpp b/Machines/Apple/AppleII/Video.hpp
index ae3478928..1d5038db4 100644
--- a/Machines/Apple/AppleII/Video.hpp
+++ b/Machines/Apple/AppleII/Video.hpp
@@ -471,11 +471,15 @@ template <class BusHandler, bool is_iie> class Video: public VideoBase {
 	// The OpenGL scan target introduces a phase error of 1/8th of a wave. The Metal one does not.
 	// Supply the real phase value if this is an Apple build.
 	// TODO: eliminate UGLY HACK.
-#if defined(__APPLE__) && !defined(IGNORE_APPLE)
+#ifdef __APPLE__
+#ifndef IGNORE_APPLE
 	constexpr uint8_t phase = 224;
+#else
+	constexpr uint8_t phase = 192;
+#endif
 #else
 	constexpr uint8_t phase = 0;
 #endif
 	crt_.output_colour_burst((colour_burst_end - colour_burst_start) * 14, phase);
 }
```

…but I have no idea if this is truly a Mac-vs-Linux difference, or an OpenGL- or SDL-version difference, or a graphics card difference, or something else… This also suggests that whatever issue was originally being reported had probably already been fixed by the time I started looking at it. Hey @wiz21b, would you happen to remember what OS and version you were using, and on what machine with what graphics card, when you reported this? Tom, do you remember what changes you had made to the code to achieve the screenshots you posted?
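To relate the constants in that diff to angles, assuming (per the 1/256th-of-a-cycle remark earlier in the thread) that the 8-bit phase value maps a full subcarrier cycle onto 256 units; the variable names here are illustrative only:

```cpp
constexpr double degrees_per_unit = 360.0 / 256.0;           // ≈ 1.4° per unit
constexpr double apple_metal_phase = 224 * degrees_per_unit;  // 315°
constexpr double apple_sdl_phase   = 192 * degrees_per_unit;  // 270°
// The 32-unit difference is 45°, matching the "1/8th of a wave" error named in
// the code comment; a 90° shift, as in 81ad864's commit message, would be 64 units.
static_assert(224 - 192 == 32, "1/8 of a 256-unit cycle");
```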
So... what's the deal with the Ugly Hack? Why do you need a different color burst phase for Metal and OpenGL for Apple II but not for other machines?
In short: the OpenGL scan target assumes that in composite decoding mode it can just internally sample at four times the colour subcarrier. Mathematically that would be valid, except that because it does only cheap sampling, it introduces phase errors for machines with output like the Apple II's, as a result of incorrect bucketing. Most other machines describe their output in such a way that the cheap sampling is exactly the same as expensive sampling would be.

Related/subsidiary: I have a desire to rewrite the OpenGL scan target for reasons beyond just those below, but working with OpenGL is always so much work that I keep deferring, in particular with regard to debugging and general runtime inspection of a running system. I also have a mental backlog of work items on the Metal side and keep telling myself that I'll just get that pipeline locked down, then do a full translation. Besides the things in #1242, I want to experiment with a sharpening filter on chroma, as my code does a pretty bad job of in-phase colour in general, which is especially evident for machines like the Master System and MSX 2 that are both in-phase and use complicated palettes.

With regard to the issue above: being conscious of the flaw, the Metal scan target switches to using whatever is the least integer multiple of the input sample rate that is also at least four times the colour subcarrier. So there is no risk of aliasing and no need to come up with elaborate sampling solutions. It makes chroma/luma separation just marginally more expensive, but that's more than paid for by other substantial pipeline improvements in Metal.

Of those improvements that would also transfer to OpenGL: the Metal scan target is smart enough to submit all work to the GPU after initial line composition in terms of rectangular regions on the intermediate buffers, whereas OpenGL sticks to submitting work line-by-line throughout. I daren't imagine the full list of reasons why that's a terrible choice, but it would include the hugely increased volume of commands submitted and the cache detriment of not allowing the GPU to schedule pixels for translation in an optimal order.

The other really big OpenGL issue for a user is that it assumes a 4:3 work area for intermediate buffers, so e.g. a full-screen original Macintosh on your 16:9 monitor on a modern Mac is fitted with only letterboxing; on your non-Mac it's both letterboxed and pillarboxed.
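A sketch of the sampling-rate rule described above, under the assumption that "least integer multiple of the input sample rate that is also at least four times the colour subcarrier" means the smallest k such that k·input_rate ≥ 4·subcarrier; `composite_sampling_rate` is a hypothetical helper, not CLK's actual implementation:

```cpp
#include <cstdint>

// Smallest integer multiple of the input sample rate that is at least four
// times the colour subcarrier frequency.
std::uint64_t composite_sampling_rate(std::uint64_t input_sample_rate, std::uint64_t subcarrier_rate) {
	const std::uint64_t minimum = 4 * subcarrier_rate;
	const std::uint64_t multiple = (minimum + input_sample_rate - 1) / input_sample_rate;  // ceiling division
	return multiple * input_sample_rate;
}
```

Sampling at an exact multiple of the input rate means every input sample falls cleanly into a bucket, so there is no rounding-induced phase error of the kind described for the OpenGL path.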
Thank you for the explanation! I think I lack the necessary background to understand it properly. "Cheap sampling," for example, is unfamiliar to me, but so is most of what you do in the emulation and graphics parts of Clock Signal. And the extent of my familiarity with OpenGL is a hello-world program decades ago; I haven't written or read any Metal code at all. By which I don't mean to ask for a more detailed explanation, just to explain that I may not be much help in figuring out these issues.
More data: it appears to be a graphics card difference, because the same 2011 13" MBP shows the same problem under macOS 12.7.2, with the same (MacPorts) libsdl2 2.28.5 install, as the 2012 15" Retina MBP that doesn't show the color problem. So the 13" 2011 MBP (which has Intel HD Graphics 3000, 384 MB) needs

I figured that porting the Metal code back to OpenGL as you planned in #856 would resolve the various issues, but since that's been open for three years I guess it's more of a long-term plan. I had hoped to be able to knock out a more targeted quick fix for this color issue, but now I'm not sure.
So far only one of the machines I've tested needs
¹ Can't test, crashes on launch due to
Hello, I've run the game Conan on the Apple II machine.
The intro screen depicts a forest and a castle. The forest is red :-) That's not correct; it should be green.
I've tested the 4AM crack of Conan (the one which has the music on the intro).
Same problem with Boulder Dash, Chivalry, etc.
Version tested: CLK-2021-02-02