Edit history:
ING-X: 2025-03-21 11:50:22 pm
ING-X: 2025-03-21 07:06:53 pm
ING-X: 2025-03-21 02:20:04 pm
This question is probably best answered by nate, since he's the technical expert and has been around long enough to probably have the historical knowledge of this that I'm most curious about. But any insight is welcome.

SDA has long prided itself on its video quality standards. Runs are encoded in multiple quality options, with high quality (or "insane quality" in some cases) being the best available. Use of higher quality inputs like S-Video or Component has been encouraged. Something I noticed - and this is mentioned on the AviSynth page, for self-encoders - is that SDA videos will never have higher resolution than the actual game; if a game runs at 240p, even high quality video will be encoded at 320x240. This is because, supposedly, there's no advantage to encoding a 240p game in any higher resolution than 240p, since you get no extra information out of it and it would just waste server space.

But something I've learned about over the past few years is the idea of interpolation during upscaling: when a low-resolution input is displayed on a high-resolution screen (such as inputting an SD console into an HDTV, or expanding a low-resolution video or image to full-screen on a computer), it needs to be upscaled. In other words, the (e.g.) 320x240 pixel image needs to be scaled up to (e.g.) 1440x1080 to be displayed (obviously this is done automatically by the HDTV or computer, and the user may not be aware of it). But there are many different upscaling methods, and some result in a very different look than others. In particular, HDTVs and media players on computers tend to use a bilinear or bicubic method of upscaling, which typically results in a blurry image. Even a 640x480 video (which is what high-quality SDA videos use) would end up blurred if upscaled to full-screen on a media player on a computer (which, even in the mid-2000s, would have used a sharp 1024x768 resolution at the lowest).
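
To make the difference concrete, here is a rough AviSynth-style sketch (the file name and target size are made-up examples, not anything from SDA) contrasting nearest-neighbor with bilinear upscaling of a 240p frame:
Code:
# Illustrative only: compare a crisp nearest-neighbor upscale with a soft
# bilinear upscale of the same hypothetical 320x240 capture.
src   = AviSource("run_320x240.avi")   # hypothetical source file
sharp = src.PointResize(1280, 960)     # nearest-neighbor: blocky but sharp pixels
soft  = src.BilinearResize(1280, 960)  # bilinear: the blurry full-screen look
StackHorizontal(sharp, soft)           # view the two results side by side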

Now, I know that depending on taste, some amount of blur can be advantageous (I learned about all this in the context of learning about consumer-grade CRT TVs, which have a natural Gaussian blur that pixel-artists often took into account when drawing their sprites, leading to more realistic and rounded off images). But based on my understanding of SDA's mindset in the mid-2000s - and the lack of any mention of bilinear/bicubic upscaling on the SDA pages I'm aware of - it seems like such a blurring of SDA's videos (during upscaling when watching on a computer in full-screen) would have been considered undesirable, especially for 2D sprite games (which even emulators would render at full pixel-sharpness). Which makes me wonder: did nate (and the rest of the SDA staff) understand, during that time period, that videos were being blurred during upscaling, when viewed in full-screen on a computer? Was this considered desirable for whatever reason? Was bilinear/bicubic upscaling even common in media players of the time period? Was the whole concept of upscaling/interpolation just not on the SDA team's radar at the time (I certainly had no idea about it myself; I just assumed "low resolution" just inherently looked blurry)? And if such blurring is not considered desirable, is there a way to circumvent it (perhaps enabling nearest-neighbor upscaling in VLC?) that would be recommended?
Quote from ING-X:
did nate (and the rest of the SDA staff) understand, during that time period, that videos were being blurred during upscaling, when viewed in full-screen on a computer?

yes, we were aware. to help you understand the decision about not allowing higher output resolution than the game's native, it was to keep opinions about upscalers out of sda. like, imagine you submit your run and the higher qualities have been upscaled using eagle or 2xSaI or something. and it's accepted, and then people come into the forum like man, that algorithm is, like, so last year/fugly/whatever. so now you have a problem and sda has a problem because dudes are coming in with pitchforks demanding changes, whereas if we leave the upscaling to them in their player we can say "that's your problem."

note that the output of hardware upscalers like the ossc is (and has always been iirc) an exception. people expect the output of these devices to be sharp and in fact it's so sharp and the signal so stable (low noise) that the higher resolutions have little impact on bitrate/quality (which would be another reason not to allow upscaling in the higher quality files). the fpga consoles like the mega sg are viewed the same way. so you can see that the rule is really about keeping personal choices out of sda - not because i'm authoritarian (although i am) but because no admin wants to deal with a bunch of whining and community strife when there's a simple solution right there to prevent it before it starts.

Quote:
Was bilinear/bicubic upscaling even common in media players of the time period?

yes, the default method was one of those two, can't remember which one now, both in players like vlc and in editors like virtualdub, with its scale filter. if it wasn't one of those then it was nearest neighbor, but that looked so bad that no one used it.

for *downscaling*, if you're curious, the method also matters, and i always preferred (and hardcoded into anri) lanczos.

Quote:
is there a way to circumvent it (perhaps enabling nearest-neighbor upscaling in VLC?) that would be recommended?

there are a *lot* of software scalers out there these days. google retro game scaling algorithm and you will see. plenty of opinions, too, as always. i bet some of the more recent scalers already come preinstalled in vlc and you just have to go into settings and select whichever one you want. you can probably also download packs of them, both for vlc and for editors like premiere. i personally do not have a preference for one algorithm over another as i play exclusively using my ossc and fpga consoles these days. and with those i leave the defaults, which results in a quite sharp image, quite unlike the appearance of the same game on a crt display of the same era.

so, as you can see, for my personal usage, i am of the opinion that trying to make modern displays look like crts is a waste of time. i think part of the problem is that "a crt" doesn't refer to anything specific. some people grew up with sony trinitrons, for example, which look different from other crt displays. i did not grow up with a sony trinitron, but a lot of people did, so when i see their work on this stuff, i'm like, "what is this?"

and it also doesn't help that i just don't care about making games look like they did on a crt. doing that feels like someone else's hobby to me. like sure, knock yourself out, but don't expect me to get excited about it. and i guess that same attitude back in the day made it easy for me to make the decision about not allowing upscaling video coming from the old consoles. imagine if i had been a huge fan of a particular scaling algorithm and had injected my opinion about that into the site on top of every other way that i did. sda would look even more dated today, if you can even imagine it.

just some random musings from me. hope i was able to answer your questions.
Edit history:
ING-X: 2025-03-23 07:07:53 pm
ING-X: 2025-03-23 02:10:28 pm
Thanks, that was very enlightening. Smiley I had no idea about any of this back in the day; I (and I assume most others not knowledgeable in the tech aspects of this) always assumed there was just a linear progression from "low quality" to "high quality". I think I knew my CRTs looked "blurrier" (which at the time I would have seen as undesirable) while my HDTV looked more detailed, and I just assumed the HDTV was showing the full detail of the input signal (in all its analog ugliness). I didn't realize at the time that my HDTV was using some kind of bicubic upscaling algorithm on the inputs, which is why it looked so ugly. I certainly didn't know about all the ways in which sprite artists took the specific CRT display quirks into account, in a way that some people would prefer (I have to wonder how many people knew about that at the time). If I had known what I know now, I think I probably would have been using a CRT for my speedruns back in the day.

This makes me wonder, did you guys at the time know about how (at least consumer-grade) CRTs had a unique ""scaling algorithm"" different from e.g. bicubic or bilinear, or about how sprite artists often took the specifics of the CRT (along with composite video etc) into account when designing sprites, in such a way that some people (myself included) would prefer? I'm very curious to know how much of that sort of thing was understood at the time, it already seems like you guys knew way more about scaling algorithms etc than I was giving the mid-2000s credit for.
this is a harder question, because while i know stuff today, it's been so long that i can't remember when i learned it.

my understanding is you're asking about sda's golden age if you will, from about 2004 through 2014. this was the time of traditional (ntsc/pal) analog video capture. in 2004, everyone who wasn't using a capture card was using vhs, and almost no one had an hdtv, while in 2014, no one was using vhs, and almost everyone had an hdtv. the middle of the last decade also saw the rise of both hardware upscalers like the ossc and the fpga consoles. so ~2004-2014 was the transition period from analog to digital where people were capturing old school ntsc/pal signals from consoles and trying to figure out what to do with them. i became knowledgeable in this area, and it was a mess. i am glad that it's in the past. it was so easy to make a mistake and ruin your captured video.

i can tell you one thing i may have known about back then: the "transparency" and rainbow effects in the waterfalls in sonic 1:
https://youtu.be/x0weL5XDpPs?t=208
https://www.reddit.com/r/emulation/comments/l3txpn/comment/gkn0590/

like i said, i can't remember when i learned about this specific example, but i am 100% sure that i knew about both horizontal "smearing" (which artists exploited using dithering) and chroma bleed, which i sometimes called "rainbowing," inherent in ntsc composite signals. here's one example i found searching the forum just now:
https://forum.speeddemosarchive.com/post/hard_corps_uprising_quality_test_2.html
you'll notice that this is a negative example. at the time i thought of s-video as unconditionally superior to composite. but you have to remember that i was running a website where people would mostly download compressed video files and watch them on their computer screens, which were usually lcd (especially after about 2005), *not* crt tvs. so if you went back in time and told me "you're obliterating some of the intent of the artists by suggesting people switch to s-video," i would have replied "it's already obliterated the moment you digitize the signal." in other words i understood by spring 2004 (more on this date below) that a digitization of an analog video signal (and its subsequent display) is always an interpretation of it.

on top of that, it's not like s-video was a later invention for every console. tons of people used s-video with their analog tvs by the late 90s. maybe some developers were even viewing their work over s-video. and s-video only eliminated chroma bleed effects like rainbowing and dot crawl, not horizontal smearing. and dot crawl was usually introduced by people's cheap capture cards. in other words it was not the intent of the game's developers for the game to look that way. at least i am unfamiliar with any examples of dot crawl being intended.

i know that it was spring 2004 at the latest that i understood the differences between the types of displays because that's when zero mission came out and i saw it on my gbasp screen and on my tv (via my game boy player) and again in virtualdub after going through my capture card. so i got to see exactly how the game was supposed to look on my gbasp and how it looked on a crt tv and how it looked when that analog signal from the game boy player was digitized by my capture card. i wanted the captured image to look sharp like it did on my gbasp screen and there was just no way to do it. the brightness was also totally wrong which probably bothered me even more than the blurriness and the color. but that's yet another problem with ntsc that i won't get into here.

another thing i'd like to mention that you didn't bring up but was nonetheless part of our decision making (both for sda and early gdq): latency/input lag on hdtvs has only relatively recently come down enough for people who care about latency to start using them. i think my lg oled tv that i got six years ago was one of the first hdtvs to be competitive with a crt in terms of input lag (< 20 ms). and if you try to plug a composite cable from an old console into an hdtv, the lag is horrid. so that's why until recently in "behind the scenes" pictures of gdq you would see a row of crts no matter what people were playing.

there were gaming monitors before 2019 that featured low latency, but the picture was not pretty like with an oled display (or, obviously, a crt, for games designed to be played on a crt). lots and lots of people used those low latency gaming monitors though, including later gdqs at some points i believe. i'm just not as familiar with them.

so the answer to this question that you didn't ask, about input lag, is that we must have known about it almost right from the start, because in december 2009 we set up the first gdq knowing about it, and we had known about it for some time at that point.
Thanks again for the response, and for all the detail. Smiley I'm still a little unclear about what you knew about the distinction between horizontal smearing from analog video signals (which it seems clear you knew about) vs. the CRT itself (which has its own horizontal smearing and similar effects, at least in consumer-grade TVs - I've heard it described as a "Gaussian" scaling algorithm). I'd guess since you looked at Zero Mission on a CRT vs a capture card, you must have seen the difference there at least?
ah okay, yeah, that was/is beyond the limit of my knowledge. from the word gaussian i can guess what you're talking about, but it never occurred to me to interpret the placement of the apertures in the mask as a scaling method. but that's exactly what it is now that you mention it. i know different tvs had different mask patterns too like i was talking about with the trinitrons. i've read that they were the most different but i don't have much experience with them other than seeing them at other kids' houses and not paying much attention at the time. i mean i could tell that games looked different on them but i never made the connection.

oh, that reminds me of something i did know about though. i learned this from radix actually. i still don't remember when but it must have been early, like 2005 or so. he remarked one time that it's weird how we don't usually perceive ntsc tvs to be flickering even though "the phosphor is only active for a few nanoseconds" or something like that. later on i found out that this effect is called persistence of vision. i believe this is the same effect with old school edison type incandescent light bulbs, where they turn on and off 60 times a second but humans can't ordinarily perceive that. and modern lcd/oled displays have this big problem with "judder" because their pixels stay on for so long rather than strobing like crts. i actually have my lg tv configured to insert black frames at 60 hz so the games i play at 60 fps have less judder (because the tv is strobing, sort of, at 60 hz). it's a hack though and i really want to get a new tv that can do 120 fps and 4k and hdr at the same time. i've read that 120 fps cuts down on judder at least as much as black frame insertion.
Edit history:
ING-X: 2025-03-25 12:44:05 am
ING-X: 2025-03-24 09:18:09 pm
ING-X: 2025-03-24 09:17:47 pm
ING-X: 2025-03-24 09:17:03 pm
ING-X: 2025-03-24 09:00:54 pm
ING-X: 2025-03-24 09:00:01 pm
Yeah, Trinitrons look a little different, they have a different type of mask that looks more like a pixel grid. Higher end consumer trinitrons - along with higher end "professional" CRT monitors of all brands, as well as CRT PC monitors - tend to look more like nearest-neighbor scaling (like emulators, or default settings on OSSC etc). Lots of people like that, but to me part of the appeal of CRTs is the "Gaussian" scaling that makes things look softer and more "convincing".

I think I'm still a little bit unclear about a couple things, which may be a result of you just not remembering things that well, but also just inherent difficulties in communicating in non-real time over a forum. I'm getting that you didn't know specifics about how CRTs' "scaling" worked, but I'm wondering:

1. if you knew at least about CRTs having their own horizontal smearing, on top of the composite/s-video/etc signal's smearing, even if you didn't know more specifics
2. if you at least could tell (since you compared them) that CRTs looked *different* than direct video capture in some way, or that the CRT was less sharp than the video capture upscaled with nearest-neighbor (maybe the CRT looked closer to bilinear to your eye?)
3. if the look of CRTs was even something people talked about at the time, or if that's more of a recent thing (like, were people trying to make "CRT filters" back then, or is that only something that started recently)
Edit history:
DJGrenola: 2025-03-25 08:06:09 am
DJGrenola: 2025-03-25 08:05:59 am
DJGrenola: 2025-03-25 08:05:57 am
DJGrenola: 2025-03-25 07:55:32 am
Quote from ING-X:
3. if the look of CRTs was even something people talked about at the time, or if that's more of a recent thing (like, were people trying to make "CRT filters" back then, or is that only something that started recently)


me and nate did discuss LCD vs CRT back in the day, but more in terms of monitors than TV. I remember because there was a difference of opinion involved: nate was a fan of LCDs, which were the new technology at the time. I much preferred CRTs, partly because I felt the smoothing effect made them more pleasant to use, but also because you could drive them at different resolutions, which you can't really do with an LCD. nate almost certainly had access to higher quality LCD panels than I'd ever seen though. anyway I was certainly aware of the smoothing effect and we probably discussed it.

the notion of crafting a custom scaler specifically to simulate a CRT certainly wasn't something that had occurred to anyone at SDA at the time. I don't think it had ever even occurred to anyone in e.g. a university looking for a thesis, and academics regularly work on all sorts of dumb useless ideas. think about it -- why would you want a scaler to simulate a CRT, when you could just go down the road and buy an actual CRT? no incentive to create such a thing even existed until they stopped making CRT screens.

Quote from nate:
he remarked one time that it's weird how we don't usually perceive ntsc tvs to be flickering even though "the phosphor is only active for a few nanoseconds" or something like that. later on i found out that this effect is called persistence of vision. i believe this is the same effect with old school edison type incandescent light bulbs, where they turn on and off 60 times a second but humans can't ordinarily perceive that. and modern lcd/oled displays have this big problem with "judder" because their pixels stay on for so long rather than strobing like crts.


couple of points about this. I'm not familiar with the materials science involved but I know you can get different speed phosphors. old analogue oscilloscopes and radar units used very slow phosphors, so you could trace the beam across the screen e.g. once a second and see for a good half a second the trail it would leave behind. you're right in that TV phosphors decayed much quicker, but I don't know the numbers. so there's partly a persistence of vision effect within the eye and brain, but I think also the phosphor would naturally remain illuminated for a short while.

the other thing is the incandescent light bulb. they definitely flickered a bit at 50/60 Hz, but ultimately the light is produced by a white-hot piece of tungsten. it takes time for that to cool down, so it will continue to glow with residual heat, even when the voltage across the filament drops to zero. so the flickering effect isn't as pronounced as you might think. see e.g.



also, any issues you may be having with the RSS feed are not my fault.
Edit history:
DJGrenola: 2025-03-25 10:45:12 am
DJGrenola: 2025-03-25 09:37:30 am
Quote from ING-X:
Which makes me wonder: did nate (and the rest of the SDA staff) understand, during that time period, that videos were being blurred during upscaling, when viewed in full-screen on a computer? Was this considered desirable for whatever reason? Was bilinear/bicubic upscaling even common in media players of the time period? Was the whole concept of upscaling/interpolation just not on the SDA team's radar at the time (I certainly had no idea about it myself; I just assumed "low resolution" just inherently looked blurry)? And if such blurring is not considered desirable, is there a way to circumvent it (perhaps enabling nearest-neighbor upscaling in VLC?) that would be recommended?


funnily enough I was uploading an old audio cassette tape from my student days to archive.org recently. people asked me what I was going to do about reducing the hiss on the recording, and the answer was, well, nothing. it's arrogant to assume you can do a better job of this than whatever whizzy AI algorithm will exist in 50 years. so it's better to keep the processing to a minimum, and upload the driest source you can. admittedly in the case of internet video there's a compromise involved, in that you actually do need the videos to be watchable on the technology of the day. stuff that was recorded on DVD recorders could in theory just have been uploaded in its original MPEG-2, which would have been completely unprocessed. but then you have issues with download filesizes for people on 56K modems, and the need to deinterlace it on playback (which is easy to do badly) and that kind of thing. plus the SDA methodology kind of carried over from VHS tape, which is probably why uploading MPEG-2 didn't happen, even though theoretically that would have been a purer source.

I do actually still have the MPEG-2s of both my published metroid runs. sometimes I think about uploading them somewhere, although I doubt I'll ever bother.
Edit history:
nate: 2025-03-25 02:32:37 pm
nate: 2025-03-25 02:23:14 pm
Quote from ING-X:
1. if you knew at least about CRTs having their own horizontal smearing, on top of the composite/s-video/etc signal's smearing, even if you didn't know more specifics

it's hard to say to be honest. i want to say no but there are so many years in there where i could have read it or thought about it and went, "huh, that's interesting," and then filed it away never to be seen again. grenola touched on this a bit but it wasn't something people thought about in general as far as i remember. the conceptual leap from "this is the root cause of how games look on this particular crt tv" to "i am going to write software to simulate that appearance on my computer monitor" did not happen. why didn't it happen? why didn't the romans have steam power? it obviously would have been useful to some people, right? i don't know the answer. maybe it's just random, or maybe, like grenola said, now that crts are so much rarer and more respected, the people looking into this are motivated to learn about and glorify them.

to elaborate a bit, until recently, there was a "next big thing" feeling in consumer technology, like omg dude you totally have to see what a drop of water looks like on a leaf in hd. and people like me looked down on crt tvs as antiquated technology destined for the landfill. i resented the entire "legacy" analog video apparatus i carried on my shoulders while i was trying to bring the speedrunning gospel to people. and the crt tv was "ground zero" of that apparatus. it was the harebrained early 20th century technology that spawned all the other harebrained hacks i had to cut through to do my job. so maybe i tended to be a little bit biased toward newer technologies like large lcd screens that in reality had their own long list of problems including their input lag and mutilation of old school game video.

Quote:
2. if you at least could tell (since you compared them) that CRTs looked *different* than direct video capture in some way, or that the CRT was less sharp than the video capture upscaled with nearest-neighbor (maybe the CRT looked closer to bilinear to your eye?)

no, i definitely didn't think of it in terms of scaling. scaling was the least of my problems. i'd say interlacing was first. any 480i capture from a 240p game immediately got field split and then scaled down to get the correct aspect ratio. it never once occurred to me (or anyone else as far as i know) to upscale. so the question of method never came up. it's an interesting hypothetical. which method would i have chosen? lanczos3? would i have thought to try to match the output to how the game looked on my crt tv? i certainly tried to match zero mission to how it looked on my gbasp. but it was hopeless, like i said. there was no horizontal definition left and the brightness and color were also totally wrong. i did the best i could.
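
to give a rough idea, the processing i'm describing looks something like this avisynth sketch (the file name, field order and choice of resizer are an illustration, not anri's actual script):
Code:
# illustrative only, not the real anri script. a 480i capture of a 240p game
# is split into fields; each field is one complete image of the game.
AviSource("capture_480i.avi")   # hypothetical 720x480 interlaced capture
AssumeTFF()                     # field order really depends on the capture hardware
SeparateFields()                # now 720x240 images at double the frame rate
LanczosResize(320, 240)         # scale down to the game's native resolution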

edit: upscaling 240p game video was brought up more recently, starting in about 2013 iirc, because by then there was a general awareness here at sda that the common chroma downsampling applied to e.g. h.264 mp4 video, which we called "yv12" after what it is called in avisynth (4:2:0 sampling), was impacting the image quality when people were viewing these videos upscaled. so the other admins and i started allowing people to submit IQ and XQ with more chroma or just flat out rgb rather than yuv. basically we realized that because of the chroma downsampling, storing old school game video in the native resolution was actually losing information, and we had to do something about that. another solution would have been to allow upscaling, but i was always reluctant, because i cringed at all the disk space and bandwidth that would be wasted for what i viewed as a hack. remember that i was the one who paid for both the disk space and the bandwidth, at least unless you exclusively downloaded runs from archive.org.
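
to put numbers on the chroma problem: a 320x240 encode in 4:2:0 stores its color planes at only 160x120, so single-pixel color detail in a 240p game gets averaged away no matter how high the bitrate is. here is a hedged avisynth-flavored sketch of the two ways around it (the exact filters and sizes are illustrative, not the literal sda rules):
Code:
# illustrative only. option 1: keep full-resolution chroma instead of yv12.
AviSource("run_320x240.avi")    # hypothetical native-resolution master
ConvertToYV24()                 # 4:4:4 chroma (or ConvertToRGB24() for plain rgb)
# option 2, the one i was reluctant about: upscale first, e.g.
# PointResize(640, 480), so that even after 4:2:0 subsampling the chroma
# plane is still 320x240, at the cost of extra disk space and bandwidth.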

Quote:
3. if the look of CRTs was even something people talked about at the time, or if that's more of a recent thing (like, were people trying to make "CRT filters" back then, or is that only something that started recently)

well, there was the "scanline" filter you could use with some emulators. as far as i know, it just drew black horizontal lines over the image. i remember very clearly trying it out and being like "wtf?" because it didn't look anything like a crt tv to me. i didn't think the person who wrote the filter was stupid; i just didn't get it. that's about the only example i remember. i think the rest of the filters in emulators were the stuff like eagle and 2xsai like i mentioned earlier. i'm not sure whether those were meant to mimic particular crt tvs or not. i found them ugly and stuck with nearest neighbor when i (rarely) played games using emulators.

you might search doom9's forum for stuff like crt upscaler. that is where people's projects would have been posted and where a lot of my knowledge and code that went into sda's software came from.

Quote from DJGrenola:
old analogue oscilloscopes and radar units used very slow phosphors, so you could trace the beam across the screen e.g. once a second and see for a good half a second the trail it would leave behind.

yes! i have seen this before now that you mention it. good insight. thank you for bringing this up. i was definitely painting with a very broad brush when i compared fast consumer tv phosphor illumination with *all incandescent light bulbs*, lol. having said that, i know that there are some big offenders that ruin people's high framerate videography in e.g. train stations.

Quote from DJGrenola:
admittedly in the case of internet video there's a compromise involved, in that you actually do need the videos to be watchable on the technology of the day. stuff that was recorded on DVD recorders could in theory just have been uploaded in its original MPEG-2, which would have been completely unprocessed. but then you have issues with download filesizes for people on 56K modems, and the need to deinterlace it on playback (which is easy to do badly) and that kind of thing. plus the SDA methodology kind of carried over from VHS tape, which is probably why uploading MPEG-2 didn't happen, even though theoretically that would have been a purer source.

yes! i did consider this. the archivist in me wanted to upload the originals of everything. i also kept all the vhs tapes i had ever been sent for years until my dad convinced me to pare them down. (i think he was tired of helping me move them between apartments.) and then i had to decide which ones might be historically significant even though history hadn't really begun yet. i still have a few today. i know scarlet's 0:55 is one of them ... the rest i do not remember.

to be honest, some days it's hard to motivate myself to care about people i'll never meet in the far future looking into this stuff. they will definitely have some things to look at, some things that will make the leap through time, and they will consider them precious to them. they might curse me for not saving more. if they recreate me using ai, and they get my personality right, i will tell them: "if you had all these things, then you would no longer value them." and then they would delete me. just as planned.
Edit history:
ING-X: 2025-03-25 03:16:51 pm
ING-X: 2025-03-25 03:16:30 pm
Quote:
to elaborate a bit, until recently, there was a "next big thing" feeling in consumer technology, like omg dude you totally have to see what a drop of water looks like on a leaf in hd. and people like me looked down on crt tvs as antiquated technology destined for the landfill. i resented the entire "legacy" analog video apparatus i carried on my shoulders while i was trying to bring the speedrunning gospel to people. and the crt tv was "ground zero" of that apparatus. it was the harebrained early 20th century technology that spawned all the other harebrained hacks i had to cut through to do my job. so maybe i tended to be a little bit biased toward newer technologies like large lcd screens that in reality had their own long list of problems including their input lag and mutilation of old school game video.


This is the sense I get too. I myself looked down on CRTs for the longest time, until just a couple years ago when I saw pictures comparing "raw pixels" (i.e. nearest neighbor scaling) to what the games look like on a CRT, and was blown away. And then shortly after I found out that my HDTV that I had used for years was doing some kind of weird upscaling that I didn't know about that made everything look really ugly. It was a big moment of change in my entire mindset regarding these things. In fact, I had previously had a bit of resentment toward CRTs because everyone kept telling me I should use them because of input lag, and I felt that wasn't relevant to me because I had a (presumably) low-lag HDTV, and I felt my HDTV was better and fancier than people's CRTs (obviously I feel differently now).

Quote:
no, i definitely didn't think of it in terms of scaling. scaling was the least of my problems. i'd say interlacing was first. any 480i capture from a 240p game immediately got field split and then scaled down to get the correct aspect ratio. it never once occurred to me (or anyone else as far as i know) to upscale. so the question of method never came up. it's an interesting hypothetical. which method would i have chosen? lanczos3? would i have thought to try to match the output to how the game looked on my crt tv? i certainly tried to match zero mission to how it looked on my gbasp. but it was hopeless, like i said. there was no horizontal definition left and the brightness and color were also totally wrong. i did the best i could.

edit: upscaling 240p game video was brought up more recently, starting in about 2013 iirc, because by then there was a general awareness here at sda that the common chroma downsampling applied to e.g. h.264 mp4 video, which we called "yv12" after what it is called in avisynth (4:2:0 sampling), was impacting the image quality when people were viewing these videos upscaled. so the other admins and i started allowing people to submit IQ and XQ with more chroma or just flat out rgb rather than yuv. basically we realized that because of the chroma downsampling, storing old school game video in the native resolution was actually losing information, and we had to do something about that. another solution would have been to allow upscaling, but i was always reluctant, because i cringed at all the disk space and bandwidth that would be wasted for what i viewed as a hack. remember that i was the one who paid for both the disk space and the bandwidth, at least unless you exclusively downloaded runs from archive.org.


Wait, I thought you said you *had* thought about upscaling, but chose not to so that people could upscale how they wanted on the media player?

Either way, maybe I should reword this: Did you notice that your CRT looked different from direct video capture, and/or that the CRT looked "softer" than nearest-neighbor (e.g. on emulator)? I'm assuming yes because of your Zero Mission comparisons (where you looked at GBA SP, vs. CRT, vs. video capture), but I just wanted to clarify.
Quote:
Wait, I thought you said you *had* thought about upscaling, but chose not to so that people could upscale how they wanted on the media player?

people used to complain about the low resolution of even our highest quality files, implying we should be upscaling using some unspecified method, and i told them no, do it yourself.

that was before i realized in around 2013 that the chroma resolution of those files was too low (lossy). that was the first time that i considered upscaling as a solution to a real problem. for analog video input, instead of upscaling, we decided to allow higher chroma resolution in the higher quality files as i said. and for hardware upscaler output (including the fpga consoles) we decided not to enforce downscaling to the original native resolution of the game. it's optional.

Quote:
Either way, maybe I should reword this: Did you notice that your CRT looked different from direct video capture, and/or that the CRT looked "softer" than nearest-neighbor (e.g. on emulator)? I'm assuming yes because of your Zero Mission comparisons (where you looked at GBA SP, vs. CRT, vs. video capture), but I just wanted to clarify.

first question (analog video capture versus crt) no, second question (crt versus emulator) yes. i didn't think of crt tvs as anything other than analog video signal displays. and how games looked on them didn't matter to me because i was targeting sda users' computer monitors, which were never (i hope) crt tvs and even around the beginning (2004) increasingly lcds which as you know are vastly different from crt tvs. as grenola touched on, the idea of making the image look like that of a crt tv was never contemplated and would have seemed absurd if suggested, like worshiping a dying technology that was not dying fast enough for us.

having said this i want to emphasize that i'm talking only about scaling. ntsc gave me the much larger problems of color reproduction, which i never made significant progress on, and brightness/contrast, which i tried with some success to correct during the capture by carefully and laboriously tweaking settings on the capture hardware. ("correcting" the brightness after the capture invariably results in a "washed out" image.) this brightness problem particularly affected metroid prime which drove me to despair. compare the brightness/contrast of the first two videos on this page. the first is one of my early ntsc captures and the second is with my ossc in 2019. the first video represents my early attempts to correct the problem after the capture. later on i acquired capture hardware with settings i could adjust. but the problem never fully went away and the quality of the second video was a dream during sda's golden age.
Edit history:
ING-X: 2025-03-26 09:38:07 pm
ING-X: 2025-03-26 06:04:08 pm
ING-X: 2025-03-26 04:00:16 pm
ING-X: 2025-03-26 02:23:03 pm
ING-X: 2025-03-26 02:21:53 pm
I'd like to say "I see", but unfortunately I continue to be confused and have more questions. Sad I'll try to articulate my remaining confusions as best I can. I feel kind of bad about the wording of this because I feel like it sounds like some kind of courtroom interrogation (lol), so I hope you'll forgive me for that.

I think there's 3 distinct but related confusions here, so I'm gonna use a single paragraph and then put two footnotes labeled [1] and [2]. I'm not a very organized thinker unfortunately.

If I'm understanding correctly, you understood that media players were upscaling the video capture using bilinear or similar[1], and that multiple upscaling algorithms existed. Your video capture, to you, looked similar enough to a CRT TV that you didn't visually notice a difference (which I assume means the CRT TV looked similar to the bilinear scaling on your media player or in VirtualDub). But at the same time, you seemed to realize there was a difference between a CRT TV and digitized video capture (i.e. that digitized video capture was already "obliterating the intent of the artists" because it's always an interpretation of it), and the idea of making an SDA video look like a CRT TV is something that could have occurred to you but didn't (and if it had occurred to you, it might have sounded silly[2]). I'm having trouble reconciling these ideas because they seem to contradict each other - it sounds like you both did, and did not, understand CRT TVs as looking different from direct video capture? So I'm confused here.

[1] I'm a little unclear now about how much you knew about the media player upscaling, because it sounds like you might have only known about it after people brought it up?

[2] I'm a little confused here too because on one hand you said you might hypothetically have tried to upscale SDA videos to look like the CRT screen ("it never once occurred to me (or anyone else as far as i know) to upscale. so the question of method never came up. it's an interesting hypothetical. which method would i have chosen? lanczos3? would i have thought to try to match the output to how the game looked on my crt tv? i certainly tried to match zero mission to how it looked on my gbasp."), but then later you said that the idea of upscaling to look like a CRT would have seemed absurd and silly. So I'm confused here too.
Quote from ING-X:
If I'm understanding correctly, you understood that media players were upscaling the video capture using bilinear or similar[1], and that multiple upscaling algorithms existed. Your video capture, to you, looked similar enough to a CRT TV that you didn't visually notice a difference (which I assume means the CRT TV looked similar to the bilinear scaling on your media player or in VirtualDub).

that's right.

Quote from ING-X:
But at the same time, you seemed to realize there was a difference between a CRT TV and digitized video capture (i.e. that digitized video capture was already "obliterating the intent of the artists" because it's always an interpretation of it), and the idea of making an SDA video look like a CRT TV is something that could have occurred to you but didn't (and if it had occurred to you, it might have sounded silly[2]). I'm having trouble reconciling these ideas because they seem to contradict each other - it sounds like you both did, and did not, understand CRT TVs as looking different from direct video capture? So I'm confused here.

it might help you to understand if i go back to the sonic 1 waterfall example. scaling is not involved there. it's based on horizontal smearing to get the "transparency" effect and chroma bleed to get the rainbowing. and all capture hardware preserved those two effects from composite ntsc, though my understanding is that the exact appearance of chroma "artifacts" depended on the specific hardware. for example, some nasty hardware gave you a ton of dot crawl along with your rainbowing.

the "artistic intent already obliterated" was a hypothetical. if someone came to me and said hey, the videos on your site look different from how the game looks on a crt *without giving a specific example*, i would have said yeah, duh? because i understood how digital sampling works. it's like if you take a picture of a work of art. the picture is not the artwork. the artwork is going to look different in person, at different times of the day, from different angles, and so on.

as far as i remember, no one ever gave me a specific example involving scaling, unless you count the ~2013 chroma subsampling thing, when people *did* give me evidence and i made a change in policy based on it.

Quote:
later you said that the idea of upscaling to look like a CRT would have seemed absurd and silly. So I'm confused here too.

it would have seemed absurd without a specific example of something that looked better on a crt. but that was an uphill battle 20 years ago because digital was the next big thing. the hype was like a religion. everyone expected everything to look better on newer displays, even older games. the only counterexample i can think of is not game related. it was people, especially women who worked in entertainment, worrying about higher resolution showing problems with their skin and so on that they previously hadn't had to worry about.

i at least was not capable of thinking of a crt image as superior. metroid prime's problems i assumed came from the capture device rather than the difference in display technology. today i think they're all related. but then i'm not currently suffering from the problem so i can afford to be philosophical.

Quote:
I'm a little unclear now about how much you knew about the media player upscaling, because it sounds like you might have only known about it after people brought it up?

i don't know when i learned about it, but i knew about it early on (let's say 2006 at the latest). i was probably digging through vlc's settings interface (the old one with the expanding node list on the side) looking for display options, maybe to check which deinterlacing filters it had built in, and i would have seen the dropdown with the scaling filters then. and i knew the names of the filters from seeing them in virtualdub's resize filter dialog. and i thought, "in vlc you can configure all these filters to be applied in realtime like i configure them in virtualdub to be applied at encoding time." so that, if nothing else, convinced me that sda should standardize on the native resolution of the game (for HQ+ files) and leave any upscaling to the viewer. but when i thought of upscaling i thought of bilinear versus bicubic versus lanczos, the ones i knew from the world of windows pc noncommercial video editing and encoding software, not anything having to do with the physical realities of display technology. i never thought of it because what could i do about the displays people were using to watch our videos anyway? it wouldn't have made sense to me. you would have had to have shown me a specific example of me throwing away information by standardizing on the native resolution of the game.
Edit history:
ING-X: 2025-03-27 12:49:28 pm
Alright, I think I'm starting to understand. So if I'm getting this correctly, you understood the idea of upscaling algorithms more generally but didn't see (or perhaps weren't conceptually able to see) a CRT TV as specifically being distinct from e.g. bilinear scaling on your media player. It's only in hindsight that you (and the rest of us too) can see the CRT TV as being a distinct look that some people may prefer. Back then, it was just assumed that the better your display, the better the image (even for old games). (That's certainly what I remember assuming.)

I'm guessing when you said

Quote:
it never once occurred to me (or anyone else as far as i know) to upscale. so the question of method never came up. it's an interesting hypothetical. which method would i have chosen? lanczos3? would i have thought to try to match the output to how the game looked on my crt tv? i certainly tried to match zero mission to how it looked on my gbasp.


this was also somewhat hypothetical - IF you had thought to try to match it to the CRT TV (in the same way you tried to match with your GBA SP), AND ended up noticing by chance that bilinear didn't quite look like the CRT TV, THEN you might have tried different algorithms to make it look right. But at the time you probably wouldn't have noticed (because noticing positive things about a CRT TV was conceptually "out of bounds"), and would have just gone with bilinear since it looked close enough. Does that sound about right?
Quote from ING-X:
Alright, I think I'm starting to understand. So If I'm getting this correctly, you understood the idea of upscaling algorithms more generally but didn't see (or perhaps weren't conceptually able to see) a CRT TV as specifically being distinct from e.g. bilinear scaling on your media player. It's only in hindsight that you (and the rest of us too) can see the CRT TV as being a distinct look that some people may prefer. Back then, it was just assumed that the better your display, the better the image (even for old games). (That's certainly what I remember assuming.)

that's right.

Quote:
I'm guessing when you said

Quote:
it never once occurred to me (or anyone else as far as i know) to upscale. so the question of method never came up. it's an interesting hypothetical. which method would i have chosen? lanczos3? would i have thought to try to match the output to how the game looked on my crt tv? i certainly tried to match zero mission to how it looked on my gbasp.


this was also somewhat hypothetical - IF you had thought to try to match it to the CRT TV (in the same way you tried to match with your GBA SP), AND ended up noticing by chance that bilinear didn't quite look like the CRT TV, THEN you might have tried different algorithms to make it look right. But at the time you probably wouldn't have noticed (because noticing positive things about a CRT TV was conceptually "out of bounds"), and would have just gone with bilinear since it looked close enough. Does that sound about right?

that's right. i realized after posting yesterday that there was another opportunity for this to come up: when i started selling dvds with sda runs on them (~2005-2008). this is a largely forgotten chapter in sda's history but basically i was trying to recoup the cost of hosting and make sda less dependent on my financial support. i made a little form where you could pick which dvds you wanted and i would mail them to you. crucially, these were not just the same files you could download from the site on dvd but rather "authored" dvds (i used idvd) with the videos converted to the dvd video format, 480i30 mpeg-2 ts. they also had custom navigation menus that i designed in photoshop.

so i was actually, for a few seconds anyway (to make sure everything was working correctly), watching sda runs displayed on my crt tv *after converting them back from the native resolution of the game*. but, again, i wasn't *expecting* to see anything weird in terms of scaling. i was looking first to make sure that playback was smooth, most importantly with the correct field dominance, and second that the brightness and color were acceptable, with some allowance for the inevitable degradation of passing through multiple generations of ntsc recording. i ended up making dozens of these dvd versions of the runs and selling hundreds of them, and presumably some of the people i sold them to were watching them, but as far as i know, it never caused a single person to think about the difference between viewing the dvds on a crt tv (the devices for which the dvd standard was designed in ~1995) versus on a pc *in terms of scaling*. *i* thought about it in terms of interlacing, like "i hope no one watches these on their pc unless they know how to configure their deinterlacing filter." and if you squint, deinterlacing is a type of scaling. but that was all.

it's interesting and i bring it up because what we call "d4" (240p) stuff had to be upscaled horizontally (to 720 pixels width) for the dvds. so what method was i using to do that? i don't know. lanczos? i know i used virtualdub as part of my process. but the actual scaling i may have done in avisynth. so it still could have been lanczos. but it was definitely one of the usual suspects that you list. and this case is a bit different from what you're talking about in that the videos were intended to be displayed on a crt tv rather than intended to be displayed on a computer monitor (where today people might apply an upscaling filter meant to mimic a particular make and model of crt tv). so did this process make the runs look different on my crt tv than they had when they had come in on vhs (or, later, dvd)? i don't know. if it did, i never noticed, and no one else did either, or at least they never said anything to me.
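
the resize step itself would have been something as simple as this (a guess at my own old script, so treat the exact filter as an assumption):
Code:
# a guess, not the actual script i used back then.
AviSource("run_d4_320x240.avi")  # hypothetical sda "d4" file
LanczosResize(720, 240)          # horizontal-only stretch to dvd's 720-pixel width
# how the 240 lines then became the 480i30 mpeg-2 (and the menus) was handled
# in the encoding/authoring step with idvd, and i don't remember the details.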
Edit history:
ING-X: 2025-03-28 12:52:27 pm
Nice, sounds like I got it then Smiley Thanks for all the insight, I really learned a lot.

Just to end things off with a bang: I've always wondered, how do DVD recorders end up turning 240p video into 480i interlaced? Do they just line-double everything? Or do they upscale somehow?