Edit history:
ING-X: 2025-03-21 11:50:22 pm
ING-X: 2025-03-21 07:06:53 pm
ING-X: 2025-03-21 02:20:04 pm
This question is probably best answered by nate, since he's the technical expert and has been around long enough to probably have the historical knowledge of this that I'm most curious about. But any insight is welcome.

SDA has long prided itself on its video quality standards. Runs are encoded in multiple quality options, with high quality (or "insane quality" in some cases) being the best available. Use of higher quality inputs like S-Video or Component has been encouraged. Something I noticed - and this is mentioned on the AviSynth page, for self-encoders - is that SDA videos will never have higher resolution than the actual game; if a game runs at 240p, even high quality video will be encoded at 320x240. This is because, supposedly, there's no advantage to encoding a 240p game in any higher resolution than 240p, since you get no extra information out of it and it would just waste server space.

But something I've learned about over the past few years is the idea of interpolation during upscaling: when a low-resolution input is displayed on a high-resolution screen (such as inputting an SD console into an HDTV, or expanding a low-resolution video or image to full-screen on a computer), it needs to be upscaled. In other words, the (e.g.) 320x240 pixel image needs to be scaled up to (e.g.) 1440x1080 to be displayed (obviously this is done automatically by the HDTV or computer, and the user may not be aware of it). But there are many different upscaling methods, and some result in a very different look than others. In particular, HDTVs and media players on computers tend to use a bilinear or bicubic method of upscaling, which typically results in a blurry image. Even a 640x480 video (which is what high-quality SDA videos use) would end up blurred if upscaled to full-screen on a media player on a computer (which even in the mid-2000s would have used a sharp 1024x768 resolution at the lowest).
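To make that concrete, here's a rough sketch in AviSynth (the tool SDA's encoding guides are built around) of the same 320x240 video upscaled two different ways. The filename is made up and this isn't from any real SDA script; it's just to show the difference in character:

```
# Sketch only: upscale a 320x240 capture to 1440x1080 two different ways.
# "run_320x240.avi" is a placeholder filename.
source = AviSource("run_320x240.avi")
soft   = source.BilinearResize(1440, 1080)  # what most players/HDTVs do by default: smooth but blurry
sharp  = source.PointResize(1440, 1080)     # nearest neighbor: hard-edged pixels (slightly uneven at a 4.5x factor)
return StackHorizontal(soft, sharp)         # view the two side by side
```

The bilinear side is the soft look I'm describing; the nearest-neighbor side is the "raw pixels" look you get from emulators.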

Now, I know that depending on taste, some amount of blur can be advantageous (I learned about all this in the context of learning about consumer-grade CRT TVs, which have a natural Gaussian blur that pixel-artists often took into account when drawing their sprites, leading to more realistic and rounded off images). But based on my understanding of SDA's mindset in the mid-2000s - and the lack of any mention of bilinear/bicubic upscaling on the SDA pages I'm aware of - it seems like such a blurring of SDA's videos (during upscaling when watching on a computer in full-screen) would have been considered undesirable, especially for 2D sprite games (which even emulators would render at full pixel-sharpness). Which makes me wonder: did nate (and the rest of the SDA staff) understand, during that time period, that videos were being blurred during upscaling, when viewed in full-screen on a computer? Was this considered desirable for whatever reason? Was bilinear/bicubic upscaling even common in media players of the time period? Was the whole concept of upscaling/interpolation just not on the SDA team's radar at the time (I certainly had no idea about it myself; I just assumed "low resolution" just inherently looked blurry)? And if such blurring is not considered desirable, is there a way to circumvent it (perhaps enabling nearest-neighbor upscaling in VLC?) that would be recommended?
Quote from ING-X:
did nate (and the rest of the SDA staff) understand, during that time period, that videos were being blurred during upscaling, when viewed in full-screen on a computer?

yes, we were aware. to help you understand: the decision not to allow output resolution higher than the game's native was about keeping opinions about upscalers out of sda. like, imagine you submit your run and the higher qualities have been upscaled using eagle or 2xSaI or something. and it's accepted, and then people come into the forum like man, that algorithm is, like, so last year/fugly/whatever. so now you have a problem and sda has a problem because dudes are coming in with pitchforks demanding changes, whereas if we leave the upscaling to them in their player we can say "that's your problem."

note that the output of hardware upscalers like the ossc is (and has always been iirc) an exception. people expect the output of these devices to be sharp and in fact it's so sharp and the signal so stable (low noise) that the higher resolutions have little impact on bitrate/quality (which would be another reason not to allow upscaling in the higher quality files). the fpga consoles like the mega sg are viewed the same way. so you can see that the rule is really about keeping personal choices out of sda - not because i'm authoritarian (although i am) but because no admin wants to deal with a bunch of whining and community strife when there's a simple solution right there to prevent it before it starts.

Quote:
Was bilinear/bicubic upscaling even common in media players of the time period?

yes, the default method was one of those two, can't remember which one now, both in players like vlc and in editors like virtualdub, with its scale filter. if it wasn't one of those then it was nearest neighbor, but that looked so bad that no one used it.

for *downscaling*, if you're curious, the method also matters, and i always preferred (and hardcoded into anri) lanczos.
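for illustration (a sketch only, with a made-up filename - not anri's actual code), a lanczos downscale in an avisynth script is basically a one-liner:

```
# sketch only, placeholder filename - not what anri really generates
source = AviSource("capture_640x480.avi")
return source.LanczosResize(320, 240)   # lanczos (3-tap by default) keeps edges reasonably crisp when shrinking
```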

Quote:
is there a way to circumvent it (perhaps enabling nearest-neighbor upscaling in VLC?) that would be recommended?

there are a *lot* of software scalers out there these days. google retro game scaling algorithm and you will see. plenty of opinions, too, as always. i bet some of the more recent scalers already come preinstalled in vlc and you just have to go into settings and select whichever one you want. you can probably also download packs of them, both for vlc and for editors like premiere. i personally do not have a preference for one algorithm over another as i play exclusively using my ossc and fpga consoles these days. and with those i leave the defaults, which results in a quite sharp image, quite unlike the appearance of the same game on a crt display of the same era.

so, as you can see, for my personal usage, i am of the opinion that trying to make modern displays look like crts is a waste of time. i think part of the problem is that "a crt" doesn't refer to anything specific. some people grew up with sony trinitrons, for example, which look different from other crt displays. i did not grow up with a sony trinitron, but a lot of people did, so when i see their work on this stuff, i'm like, "what is this?"

and it also doesn't help that i just don't care about making games look like they did on a crt. doing that feels like someone else's hobby to me. like sure, knock yourself out, but don't expect me to get excited about it. and i guess that same attitude back in the day made it easy for me to make the decision about not allowing upscaling video coming from the old consoles. imagine if i had been a huge fan of a particular scaling algorithm and had injected my opinion about that into the site on top of every other way that i did. sda would look even more dated today, if you can even imagine it.

just some random musings from me. hope i was able to answer your questions.
Edit history:
ING-X: 2025-03-23 07:07:53 pm
ING-X: 2025-03-23 02:10:28 pm
Thanks, that was very enlightening. Smiley I had no idea about any of this back in the day; I (and I assume most others not knowledgeable in the tech aspects of this) always assumed there was just a linear progression from "low quality" to "high quality". I think I knew my CRTs looked "blurrier" (which at the time I would have seen as undesirable) while my HDTV looked more detailed, and I just assumed the HDTV was showing the full detail of the input signal (in all its analog ugliness). I didn't realize at the time that my HDTV was using some kind of bicubic upscaling algorithm on the inputs, which is why it looked so ugly. I certainly didn't know about all the ways in which sprite artists took the specific CRT display quirks into account, in a way that some people would prefer (I have to wonder how many people knew about that at the time). If I had known what I know now, I think I probably would have been using a CRT for my speedruns back in the day.

This makes me wonder: did you guys at the time know about how (at least consumer-grade) CRTs had a unique "scaling algorithm" different from e.g. bicubic or bilinear, or about how sprite artists often took the specifics of the CRT (along with composite video etc) into account when designing sprites, in such a way that some people (myself included) would prefer? I'm very curious to know how much of that sort of thing was understood at the time; it already seems like you guys knew way more about scaling algorithms etc than I was giving the mid-2000s credit for.
this is a harder question, because while i know stuff today, it's been so long that i can't remember when i learned it.

my understanding is you're asking about sda's golden age if you will, from about 2004 through 2014. this was the time of traditional (ntsc/pal) analog video capture. in 2004, everyone who wasn't using a capture card was using vhs, and almost no one had an hdtv, while in 2014, no one was using vhs, and almost everyone had an hdtv. the middle of the last decade also saw the rise of both hardware upscalers like the ossc and the fpga consoles. so ~2004-2014 was the transition period from analog to digital where people were capturing old school ntsc/pal signals from consoles and trying to figure out what to do with them. i became knowledgeable in this area, and it was a mess. i am glad that it's in the past. it was so easy to make a mistake and ruin your captured video.

i can tell you one thing i may have known about back then: the "transparency" and rainbow effects in the waterfalls in sonic 1:
https://youtu.be/x0weL5XDpPs?t=208
https://www.reddit.com/r/emulation/comments/l3txpn/comment/gkn0590/

like i said, i can't remember when i learned about this specific example, but i am 100% sure that i knew about both horizontal "smearing" (which artists exploited using dithering) and chroma bleed, which i sometimes called "rainbowing," inherent in ntsc composite signals. here's one example i found searching the forum just now:
https://forum.speeddemosarchive.com/post/hard_corps_uprising_quality_test_2.html
you'll notice that this is a negative example. at the time i thought of s-video as unconditionally superior to composite. but you have to remember that i was running a website where people would mostly download compressed video files and watch them on their computer screens, which were usually lcd (especially after about 2005), *not* crt tvs. so if you went back in time and told me "you're obliterating some of the intent of the artists by suggesting people switch to s-video," i would have replied "it's already obliterated the moment you digitize the signal." in other words i understood by spring 2004 (more on this date below) that a digitization of an analog video signal (and its subsequent display) is always an interpretation of it.

on top of that, it's not like s-video was a later invention for every console. tons of people used s-video with their analog tvs by the late 90s. maybe some developers were even viewing their work over s-video. and s-video only eliminated chroma bleed effects like rainbowing and dot crawl, not horizontal smearing. and dot crawl was usually introduced by people's cheap capture cards. in other words it was not the intent of the game's developers for the game to look that way. at least i am unfamiliar with any examples of dot crawl being intended.

i know that it was spring 2004 at the latest that i understood the differences between the types of displays because that's when zero mission came out and i saw it on my gbasp screen and on my tv (via my game boy player) and again in virtualdub after going through my capture card. so i got to see exactly how the game was supposed to look on my gbasp and how it looked on a crt tv and how it looked when that analog signal from the game boy player was digitized by my capture card. i wanted the captured image to look sharp like it did on my gbasp screen and there was just no way to do it. the brightness was also totally wrong which probably bothered me even more than the blurriness and the color. but that's yet another problem with ntsc that i won't get into here.

another thing i'd like to mention that you didn't bring up but was nonetheless part of our decision making (both for sda and early gdq): latency/input lag on hdtvs has only relatively recently come down enough for people who care about latency to start using them. i think my lg oled tv that i got six years ago was one of the first hdtvs to be competitive with a crt in terms of input lag (< 20 ms). and if you try to plug a composite cable from an old console into an hdtv, the lag is horrid. so that's why until recently in "behind the scenes" pictures of gdq you would see a row of crts no matter what people were playing.

there were gaming monitors before 2019 that featured low latency, but the picture was not pretty like with an oled display (or, obviously, a crt, for games designed to be played on a crt). lots and lots of people used those low latency gaming monitors though, including later gdqs at some points i believe. i'm just not as familiar with them.

so the answer to this question that you didn't ask, about input lag, is that we must have known about it almost right from the start, because in december 2009 we set up the first gdq knowing about it, and we had known about it for some time at that point.
Thanks again for the response, and for all the detail. Smiley I'm still a little unclear about what you knew about the distinction between horizontal smearing from analog video signals (which it seems clear you knew about) vs. the CRT itself (which has its own horizontal smearing and similar effects, at least in consumer-grade TVs - I've heard it described as a "Gaussian" scaling algorithm). I'd guess since you looked at Zero Mission on a CRT vs a capture card, you must have seen the difference there at least?
ah okay, yeah, that was/is beyond the limit of my knowledge. from the word gaussian i can guess what you're talking about, but it never occurred to me to interpret the placement of the apertures in the mask as a scaling method. but that's exactly what it is now that you mention it. i know different tvs had different mask patterns too like i was talking about with the trinitrons. i've read that they were the most different but i don't have much experience with them other than seeing them at other kids' houses and not paying much attention at the time. i mean i could tell that games looked different on them but i never made the connection.

oh, that reminds me of something i did know about though. i learned this from radix actually. i still don't remember when but it must have been early, like 2005 or so. he remarked one time that it's weird how we don't usually perceive ntsc tvs to be flickering even though "the phosphor is only active for a few nanoseconds" or something like that. later on i found out that this effect is called persistence of vision. i believe this is the same effect with old school edison type incandescent light bulbs, where they turn on and off 60 times a second but humans can't ordinarily perceive that. and modern lcd/oled displays have this big problem with "judder" because their pixels stay on for so long rather than strobing like crts. i actually have my lg tv configured to insert black frames at 60 hz so the games i play at 60 fps have less judder (because the tv is strobing, sort of, at 60 hz). it's a hack though and i really want to get a new tv that can do 120 fps and 4k and hdr at the same time. i've read that 120 fps cuts down on judder at least as much as black frame insertion.
Edit history:
ING-X: 2025-03-25 12:44:05 am
ING-X: 2025-03-24 09:18:09 pm
ING-X: 2025-03-24 09:17:47 pm
ING-X: 2025-03-24 09:17:03 pm
ING-X: 2025-03-24 09:00:54 pm
ING-X: 2025-03-24 09:00:01 pm
Yeah, Trinitrons look a little different, they have a different type of mask that looks more like a pixel grid. Higher end consumer trinitrons - along with higher end "professional" CRT monitors of all brands, as well as CRT PC monitors - tend to look more like nearest-neighbor scaling (like emulators, or default settings on OSSC etc). Lots of people like that, but to me part of the appeal of CRTs is the "Gaussian" scaling that makes things look softer and more "convincing".
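Incidentally, AviSynth even ships a Gaussian resizer, so you can get a rough feel for that softer character (and only a rough feel - this ignores the mask, phosphors, scanlines, everything that makes a real CRT a CRT; the filename is made up):

```
# Rough illustration only, nothing like a real CRT simulation.
source = AviSource("run_320x240.avi")         # placeholder filename
soft   = source.GaussResize(1280, 960, p=30)  # lower p = softer; purely a matter of taste
sharp  = source.PointResize(1280, 960)        # hard-edged "raw pixels"
return StackHorizontal(soft, sharp)
```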

I think I'm still a little bit unclear about a couple things, which may be a result of you just not remembering things that well, but also just inherent difficulties in communicating in non-real time over a forum. I'm getting that you didn't know specifics about how CRTs' "scaling" worked, but I'm wondering:

1. if you knew at least about CRTs having their own horizontal smearing, on top of the composite/s-video/etc signal's smearing, even if you didn't know more specifics
2. if you at least could tell (since you compared them) that CRTs looked *different* than direct video capture in some way, or that the CRT was less sharp than the video capture upscaled with nearest-neighbor (maybe the CRT looked closer to bilinear to your eye?)
3. if the look of CRTs was even something people talked about at the time, or if that's more of a recent thing (like, were people trying to make "CRT filters" back then, or is that only something that started recently)
Edit history:
DJGrenola: 2025-03-25 08:06:09 am
DJGrenola: 2025-03-25 08:05:59 am
DJGrenola: 2025-03-25 08:05:57 am
DJGrenola: 2025-03-25 07:55:32 am
Quote from ING-X:
3. if the look of CRTs was even something people talked about at the time, or if that's more of a recent thing (like, were people trying to make "CRT filters" back then, or is that only something that started recently)


me and nate did discuss LCD vs CRT back in the day, but more in terms of monitors than TV. I remember because there was a difference of opinion involved: nate was a fan of LCDs, which were the new technology at the time. I much preferred CRTs, partly because I felt the smoothing effect made them more pleasant to use, but also because you could drive them at different resolutions, which you can't really do with an LCD. nate almost certainly had access to higher quality LCD panels than I'd ever seen though. anyway I was certainly aware of the smoothing effect and we probably discussed it.

the notion of crafting a custom scaler specifically to simulate a CRT certainly wasn't something that had occurred to anyone at SDA at the time. I don't think it had ever even occurred to anyone in e.g. a university looking for a thesis, and academics regularly work on all sorts of dumb useless ideas. think about it -- why would you want a scaler to simulate a CRT, when you could just go down the road and buy an actual CRT? no incentive to create such a thing even existed until they stopped making CRT screens.

Quote from nate:
he remarked one time that it's weird how we don't usually perceive ntsc tvs to be flickering even though "the phosphor is only active for a few nanoseconds" or something like that. later on i found out that this effect is called persistence of vision. i believe this is the same effect with old school edison type incandescent light bulbs, where they turn on and off 60 times a second but humans can't ordinarily perceive that. and modern lcd/oled displays have this big problem with "judder" because their pixels stay on for so long rather than strobing like crts.


couple of points about this. I'm not familiar with the materials science involved but I know you can get different speed phosphors. old analogue oscilloscopes and radar units used very slow phosphors, so you could trace the beam across the screen e.g. once a second and see for a good half a second the trail it would leave behind. you're right in that TV phosphors decayed much quicker, but I don't know the numbers. so there's partly a persistence of vision effect within the eye and brain, but I think also the phosphor would naturally remain illuminated for a short while.

the other thing is the incandescent light bulb. they definitely flickered a bit at 50/60 Hz, but ultimately the light is produced by a white-hot piece of tungsten. it takes time for that to cool down, so it will continue to glow with residual heat, even when the voltage across the filament drops to zero. so the flickering effect isn't as pronounced as you might think. see e.g.



also, any issues you may be having with the RSS feed are not my fault.
Edit history:
DJGrenola: 2025-03-25 10:45:12 am
DJGrenola: 2025-03-25 09:37:30 am
Quote from ING-X:
Which makes me wonder: did nate (and the rest of the SDA staff) understand, during that time period, that videos were being blurred during upscaling, when viewed in full-screen on a computer? Was this considered desirable for whatever reason? Was bilinear/bicubic upscaling even common in media players of the time period? Was the whole concept of upscaling/interpolation just not on the SDA team's radar at the time (I certainly had no idea about it myself; I just assumed "low resolution" just inherently looked blurry)? And if such blurring is not considered desirable, is there a way to circumvent it (perhaps enabling nearest-neighbor upscaling in VLC?) that would be recommended?


funnily enough I was uploading an old audio cassette tape from my student days to archive.org recently. people asked me what I was going to do about reducing the hiss on the recording, and the answer was, well, nothing. it's arrogant to assume you can do a better job of this than whatever whizzy AI algorithm will exist in 50 years. so it's better to keep the processing to a minimum, and upload the driest source you can. admittedly in the case of internet video there's a compromise involved, in that you actually do need the videos to be watchable on the technology of the day. stuff that was recorded on DVD recorders could in theory just have been uploaded in its original MPEG-2, which would have been completely unprocessed. but then you have issues with download filesizes for people on 56K modems, and the need to deinterlace it on playback (which is easy to do badly) and that kind of thing. plus the SDA methodology kind of carried over from VHS tape, which is probably why uploading MPEG-2 didn't happen, even though theoretically that would have been a purer source.

I do actually still have the MPEG-2s of both my published metroid runs. sometimes I think about uploading them somewhere, although I doubt I'll ever bother.
Edit history:
nate: 2025-03-25 02:32:37 pm
nate: 2025-03-25 02:23:14 pm
Quote from ING-X:
1. if you knew at least about CRTs having their own horizontal smearing, on top of the composite/s-video/etc signal's smearing, even if you didn't know more specifics

it's hard to say to be honest. i want to say no but there are so many years in there where i could have read it or thought about it and went, "huh, that's interesting," and then filed it away never to be seen again. grenola touched on this a bit but it wasn't something people thought about in general as far as i remember. the conceptual leap from "this is the root cause of how games look on this particular crt tv" to "i am going to write software to simulate that appearance on my computer monitor" did not happen. why didn't it happen? why didn't the romans have steam power? it obviously would have been useful to some people, right? i don't know the answer. maybe it's just random, or maybe, like grenola said, now that crts are so much rarer and more respected, the people looking into this are motivated to learn about and glorify them.

to elaborate a bit, until recently, there was a "next big thing" feeling in consumer technology, like omg dude you totally have to see what a drop of water looks like on a leaf in hd. and people like me looked down on crt tvs as antiquated technology destined for the landfill. i resented the entire "legacy" analog video apparatus i carried on my shoulders while i was trying to bring the speedrunning gospel to people. and the crt tv was "ground zero" of that apparatus. it was the harebrained early 20th century technology that spawned all the other harebrained hacks i had to cut through to do my job. so maybe i tended to be a little bit biased toward newer technologies like large lcd screens that in reality had their own long list of problems including their input lag and mutilation of old school game video.

Quote:
2. if you at least could tell (since you compared them) that CRTs looked *different* than direct video capture in some way, or that the CRT was less sharp than the video capture upscaled with nearest-neighbor (maybe the CRT looked closer to bilinear to your eye?)

no, i definitely didn't think of it in terms of scaling. scaling was the least of my problems. i'd say interlacing was first. any 480i capture from a 240p game immediately got field split and then scaled down to get the correct aspect ratio. it never once occurred to me (or anyone else as far as i know) to upscale. so the question of method never came up. it's an interesting hypothetical. which method would i have chosen? lanczos3? would i have thought to try to match the output to how the game looked on my crt tv? i certainly tried to match zero mission to how it looked on my gbasp. but it was hopeless, like i said. there was no horizontal definition left and the brightness and color were also totally wrong. i did the best i could.
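roughly, the step i'm describing looks like this in avisynth (a sketch with a made-up filename, not the exact code anri produced):

```
# sketch only: 480i capture of a 240p game -> field split -> downscale for aspect ratio
source = AviSource("capture_480i.avi")       # placeholder: a 720x480 interlaced capture
fields = source.AssumeTFF().SeparateFields() # each 720x240 field becomes its own frame
frames = fields.SelectEven()                 # a 240p game repeats itself in both fields, so keep one per frame
return frames.LanczosResize(320, 240)        # shrink the width to restore the aspect ratio
```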

edit: upscaling 240p game video was brought up more recently, starting in about 2013 iirc, because by then there was a general awareness here at sda that the common chroma downsampling applied to e.g. h.264 mp4 video, which we called "yv12" after what it is called in avisynth (4:2:0 sampling), was impacting the image quality when people were viewing these videos upscaled. so the other admins and i started allowing people to submit IQ and XQ with more chroma or just flat out rgb rather than yuv. basically we realized that because of the chroma downsampling, storing old school game video in the native resolution was actually losing information, and we had to do something about that. another solution would have been to allow upscaling, but i was always reluctant, because i cringed at all the disk space and bandwidth that would be wasted for what i viewed as a hack. remember that i was the one who paid for both the disk space and the bandwidth, at least unless you exclusively downloaded runs from archive.org.
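to make the chroma point concrete, here's a sketch (placeholder filename, not sda's actual pipeline) of what "yv12" does to a 320x240 source:

```
# sketch only: 4:2:0 ("yv12") chroma downsampling on a native-res capture
rgb  = AviSource("run_rgb_320x240.avi").ConvertToRGB32()  # placeholder lossless rgb capture
yv12 = rgb.ConvertToYV12()                                # one chroma sample per 2x2 block of real game pixels
return StackHorizontal(rgb, yv12.ConvertToRGB32())        # sharp color edges smear on the right
```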

Quote:
3. if the look of CRTs was even something people talked about at the time, or if that's more of a recent thing (like, were people trying to make "CRT filters" back then, or is that only something that started recently)

well, there was the "scanline" filter you could use with some emulators. as far as i know, it just drew black horizontal lines over the image. i remember very clearly trying it out and being like "wtf?" because it didn't look anything like a crt tv to me. i didn't think the person who wrote the filter was stupid; i just didn't get it. that's about the only example i remember. i think the rest of the filters in emulators were the stuff like eagle and 2xsai like i mentioned earlier. i'm not sure whether those were meant to mimic particular crt tvs or not. i found them ugly and stuck with nearest neighbor when i (rarely) played games using emulators.

you might search doom9's forum for stuff like crt upscaler. that is where people's projects would have been posted and where a lot of my knowledge and code that went into sda's software came from.

Quote from DJGrenola:
old analogue oscilloscopes and radar units used very slow phosphors, so you could trace the beam across the screen e.g. once a second and see for a good half a second the trail it would leave behind.

yes! i have seen this before now that you mention it. good insight. thank you for bringing this up. i was definitely painting with a very broad brush when i compared fast consumer tv phosphor illumination with *all incandescent light bulbs*, lol. having said that, i know that there are some big offenders that ruin people's high framerate videography in e.g. train stations.

Quote from DJGrenola:
admittedly in the case of internet video there's a compromise involved, in that you actually do need the videos to be watchable on the technology of the day. stuff that was recorded on DVD recorders could in theory just have been uploaded in its original MPEG-2, which would have been completely unprocessed. but then you have issues with download filesizes for people on 56K modems, and the need to deinterlace it on playback (which is easy to do badly) and that kind of thing. plus the SDA methodology kind of carried over from VHS tape, which is probably why uploading MPEG-2 didn't happen, even though theoretically that would have been a purer source.

yes! i did consider this. the archivist in me wanted to upload the originals of everything. i also kept all the vhs tapes i had ever been sent for years until my dad convinced me to pare them down. (i think he was tired of helping me move them between apartments.) and then i had to decide which ones might be historically significant even though history hadn't really begun yet. i still have a few today. i know scarlet's 0:55 is one of them ... the rest i do not remember.

to be honest, some days it's hard to motivate myself to care about people i'll never meet in the far future looking into this stuff. they will definitely have some things to look at, some things that will make the leap through time, and they will consider them precious to them. they might curse me for not saving more. if they recreate me using ai, and they get my personality right, i will tell them: "if you had all these things, then you would no longer value them." and then they would delete me. just as planned.
Edit history:
ING-X: 2025-03-25 03:16:51 pm
ING-X: 2025-03-25 03:16:30 pm
Quote:
to elaborate a bit, until recently, there was a "next big thing" feeling in consumer technology, like omg dude you totally have to see what a drop of water looks like on a leaf in hd. and people like me looked down on crt tvs as antiquated technology destined for the landfill. i resented the entire "legacy" analog video apparatus i carried on my shoulders while i was trying to bring the speedrunning gospel to people. and the crt tv was "ground zero" of that apparatus. it was the harebrained early 20th century technology that spawned all the other harebrained hacks i had to cut through to do my job. so maybe i tended to be a little bit biased toward newer technologies like large lcd screens that in reality had their own long list of problems including their input lag and mutilation of old school game video.


This is the sense I get too. I myself looked down on CRTs for the longest time, until just a couple years ago when I saw pictures comparing "raw pixels" (i.e. nearest-neighbor scaling) to what the games look like on a CRT, and was blown away. And then shortly after, I found out that the HDTV I had used for years was doing some kind of weird upscaling I didn't know about, which made everything look really ugly. It was a big moment of change in my entire mindset regarding these things. In fact, I had previously had a bit of resentment toward CRTs, because everyone kept telling me I should use them because of input lag, and I felt that wasn't relevant to me since I had a (presumably) low-lag HDTV, and I felt my HDTV was better and fancier than people's CRTs (obviously I feel differently now).

Quote:
no, i definitely didn't think of it in terms of scaling. scaling was the least of my problems. i'd say interlacing was first. any 480i capture from a 240p game immediately got field split and then scaled down to get the correct aspect ratio. it never once occurred to me (or anyone else as far as i know) to upscale. so the question of method never came up. it's an interesting hypothetical. which method would i have chosen? lanczos3? would i have thought to try to match the output to how the game looked on my crt tv? i certainly tried to match zero mission to how it looked on my gbasp. but it was hopeless, like i said. there was no horizontal definition left and the brightness and color were also totally wrong. i did the best i could.

edit: upscaling 240p game video was brought up more recently, starting in about 2013 iirc, because by then there was a general awareness here at sda that the common chroma downsampling applied to e.g. h.264 mp4 video, which we called "yv12" after what it is called in avisynth (4:2:0 sampling), was impacting the image quality when people were viewing these videos upscaled. so the other admins and i started allowing people to submit IQ and XQ with more chroma or just flat out rgb rather than yuv. basically we realized that because of the chroma downsampling, storing old school game video in the native resolution was actually losing information, and we had to do something about that. another solution would have been to allow upscaling, but i was always reluctant, because i cringed at all the disk space and bandwidth that would be wasted for what i viewed as a hack. remember that i was the one who paid for both the disk space and the bandwidth, at least unless you exclusively downloaded runs from archive.org.


Wait, I thought you said you *had* thought about upscaling, but chose not to so that people could upscale how they wanted on the media player?

Either way, maybe I should reword this: Did you notice that your CRT looked different from direct video capture, and/or that the CRT looked "softer" than nearest-neighbor (e.g. on emulator)? I'm assuming yes because of your Zero Mission comparisons (where you looked at GBA SP, vs. CRT, vs. video capture), but I just wanted to clarify.