tritosine2k · Erudite
Joined: Dec 29, 2010 · Messages: 1,700
> …that the art you saw in monitor CRTs isn't any different from what you see today in flat panel displays
It's still different in a very objective and measurable way, but yeah, VGA's line doubling does bring a high-horizontal-resolution display slightly closer to an LCD than a low-res display without the line doubling.
But stop saying "monitor CRTs" as if they're really any different at all from TV CRTs. Monitors weren't used exclusively with IBM PC clones running VGA cards, and the pixel chunkiness in your Jazz Jackrabbit example has little to do with the display and everything to do with the signal having nearest-neighbor vertical doubling: a quirk of VGA cards and nothing whatsoever to do with the display!
Did CGA and EGA do line doubling? IIRC they did not (also, IIRC, EGA had an ahead-of-its-time "intensity" signal). You should be able to see scan-line width variation on your CGA screen in a macro photo of the phosphors, just like with any other CRT display. There's nothing magic about it.
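To make the line-doubling point concrete, here's a quick Python sketch (a toy of my own, not any actual card's scan-out logic) of what nearest-neighbor vertical doubling does to a low-res framebuffer:

```python
# Toy illustration (not real VGA hardware logic): nearest-neighbor vertical
# doubling, i.e. every scanline of a low-res frame is simply repeated, the
# way 320x200 ends up scanned out as 400 hard-edged double-height lines.

def line_double(framebuffer):
    """Repeat each scanline once; rows become chunky 2-line-tall blocks."""
    doubled = []
    for line in framebuffer:
        doubled.append(line)        # original scanline
        doubled.append(list(line))  # identical copy directly below it
    return doubled

# A tiny 4x3 "image" of palette indices stands in for the framebuffer.
lowres = [
    [1, 2, 3, 4],
    [5, 6, 7, 8],
    [9, 10, 11, 12],
]
print(len(line_double(lowres)), "lines out from", len(lowres), "lines in")  # 6 from 3
```

That repetition in the signal, not anything the tube is doing, is where the chunky look comes from.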
> As for scanlines, they're clearly visible in person especially in the larger monitors showing lower resolutions.
Correct. Interesting that you don't think these are relevant to the final image, lol.
> Second this is a fucking CRT, displaying fucking pixel art, the entire fucking point of the image is to show that pixel art on a fucking old computer CRT looks as square as any fucking modern pixelart…
This is the point of contention. Except for pixel art created on and for VGA (and how much such art is there? Even on PC, half the games with decent pixel art were Amiga ports where the art was done on an Amiga), most "retro" pixel art was built for analog image reconstruction, not for coloring a grid of squares. People who grew up playing 2D games on emulators on LCDs, or in VGA low-res modes, are the ones who get it wrong. For fucks sake, how hard is it to understand that?
> First of all this is not a problem, it is a feature to get proper crisp image on any display above 13"
Only if you hold the view that an image is a 2D grid of squares, and not a sampled signal that needs to be reconstructed. The former is a very unsophisticated, poor, and I dare say amateurish misconception of what an image is.
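If it helps, here's a rough Python sketch of the two mental models on a single scanline. I'm using a Lanczos-windowed sinc purely as a stand-in reconstruction kernel (a real CRT spot is closer to a Gaussian), so treat it as an illustration of the idea, not of any particular display:

```python
# "Pixels are little squares" (nearest neighbor) vs "pixels are samples of a
# signal to be reconstructed". The Lanczos kernel below is just a convenient
# stand-in reconstruction filter, not a model of a specific CRT.
import math

def sinc(x):
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def lanczos(x, a=3):
    # Windowed sinc, chosen here only as an example reconstruction kernel.
    return sinc(x) * sinc(x / a) if abs(x) < a else 0.0

def reconstruct(samples, x, a=3):
    """'Sampled signal' view: evaluate the continuous signal at position x."""
    lo, hi = int(math.floor(x)) - a + 1, int(math.floor(x)) + a
    return sum(samples[i] * lanczos(x - i, a)
               for i in range(lo, hi + 1) if 0 <= i < len(samples))

def nearest(samples, x):
    """'Grid of little squares' view: snap to the closest sample."""
    return samples[min(len(samples) - 1, max(0, round(x)))]

scanline = [0, 0, 255, 255, 0, 0, 128, 0]  # one row of a low-res image
for x in (2.0, 2.25, 2.5, 2.75, 3.0):
    print(f"x={x:<5} nearest={nearest(scanline, x):<4} "
          f"reconstructed={reconstruct(scanline, x):8.2f}")
```

Nearest neighbor jumps between flat plateaus; the reconstructed signal varies smoothly between the sample points, which is what the analog chain was doing all along.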
> smudgy color bleeding that you'd get on a fucking TV CRT
You don't know what you're talking about. There is no color bleeding unless your "TV CRT" is damaged in some way. Color bleeding might come from a composite signal, but an RGB signal on a "TV CRT" with the same resolution, mask, and phosphor spec will look *exactly* the same as on a monitor with the same resolution, mask, and phosphor spec.
> everyone whose entire fucking childhood was only about NES, PS1
My entire childhood was IBM PC clones, tyvm.
> There are other differences in that computer CRTs had more sharpness, faster response times, short phosphor persistence, and higher resolution
You can buy very high-resolution "TV CRTs", e.g. BVMs. There's very little difference except connectivity between something like a BVM "TV" and a GDM "monitor". They're just CRTs.
> u could play Quake on a TV from your PC if you had the right card, but it looked awful compared to it being on a smaller sized PC CRT.
3D graphics are vector graphics. They're not drawn by hand, but by an algorithm. They are objectively better the higher the resolution and AA, no matter the display technology being used to show them. At the same time, the image is still objectively better when it is reconstructed on a CRT by scanning out lines over a phosphor mask than on an LCD color grid.
> and the pixel chunkiness in your Jazz Jackrabbit example has little to do with the display, and everything to do with the signal having nearest neighbor vertical doubling, a problem with VGA cards and nothing whatsoever to do with the display!
First of all, this is not a problem; it's a feature, there to get a properly crisp image on any display above 13" or so. Otherwise you'd have huge scanlines, which would both darken the final image and look awful.
Second, this is a fucking CRT, displaying fucking pixel art. The entire fucking point of the image is to show that pixel art on a fucking old computer CRT looks as square as any fucking modern pixel art on a fucking flat panel, instead of the smudgy color bleeding that you'd get on a fucking TV CRT, which everyone whose entire fucking childhood was only about the NES, PS1 and consoles ignorantly parades as the "proper way CRT is supposed to look". For fucks sake, how hard is it to understand that?
> TV CRTs had .60-.9mm dot pitch and stuck at 640x480 resolution
You could get 900 TVL "TV CRTs", i.e. about 1200 pixels of horizontal resolution. They weren't "stuck". The only thing that made you "stuck" was your wallet.
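For anyone who wants the arithmetic behind that figure: TV lines are specified per picture height, so you scale by the 4:3 aspect ratio to get the count across the full width (quick sketch, the helper is my own):

```python
# TVL is counted per picture height; multiply by the 4:3 aspect ratio to
# get the equivalent line count across the full screen width.
def tvl_to_horizontal(tvl, aspect_w=4, aspect_h=3):
    return tvl * aspect_w / aspect_h

print(tvl_to_horizontal(900))  # 1200.0 -- the "900 TVL ~ 1200 pixels" above
print(tvl_to_horizontal(600))  # 800.0
```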
> This thread feels like it's probably full of misinformation.
Here's my source, besides living through it.
> You're missing dot pitch from your equation on top of display resolution
Dot pitch is just screen size divided by resolution. If you're able to resolve 1200 black-and-white lines over 10 inches or whatever, your dot pitch will be 10 inches divided by 1200. You could get high-resolution small screens; in fact, BVMs are quite compact.
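Worked out with the numbers above (and following that simplification; a real dot-pitch spec measures phosphor-triad spacing):

```python
# The example above as plain arithmetic: 1200 resolvable lines across
# 10 inches of screen width.
MM_PER_INCH = 25.4

def pitch_mm(width_inches, lines):
    return width_inches * MM_PER_INCH / lines

print(round(pitch_mm(10, 1200), 3))  # ~0.212 mm
print(round(pitch_mm(10, 480), 3))   # ~0.529 mm -- a coarser, "SD"-style figure
```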
> All of your console games were made with 640x480 that's why it didn't scale
You don't know what you're talking about. Consoles couldn't (or rather didn't, except for some static menus and some very rare exceptions) output 640x480 until the Dreamcast, lol. They usually didn't even output 240 lines, but something like 224.
> Also, I was speaking of the general public not millionaires living on the bleeding edge. They're outliers and ignored.
I remember that around the time plasma TVs were coming around in the mid-2000s, "HD" "TV" CRTs were a cheaper consumer alternative.
> There is the matter of whether the software is written in 4, 8, 16, or 32 bit
1. Software isn't written in "bits".
2. Except for maybe some weird niche machine I don't know about, there's no such thing as a 4-bit computer.
3. So I guess you meant the color channels being used for the art, which is fine; that makes sense. But there's almost no *pixel art* from the CRT era that was more than 15-bit in color. I'd actually be interested in learning of some examples.
Anyway, to move the conversation in a different direction: I do want to say that "SD" screens (larger dot pitch, as JamesDixon would say) have properties that reconstruct low-res images with some objectively better qualities; the coarser mask is positioned closer to where the sinc lobes should be. I find 360 TVL masks to be ideal for 240 to 320 samples per line.
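For what that preference implies numerically (treating the TVL number as a rough proxy for triad count, which is only an approximation):

```python
# Rough ratio implied by "360 TVL for 240-320 samples per line": treat the
# TVL figure as an approximate triad count per picture height, scale by 4:3
# for the width, and divide by the samples in a line.
def triads_per_sample(mask_tvl, samples_per_line, aspect_w=4, aspect_h=3):
    triads_across = mask_tvl * aspect_w / aspect_h
    return triads_across / samples_per_line

for samples in (240, 256, 320):
    print(samples, "samples/line ->",
          round(triads_per_sample(360, samples), 2), "triads per sample")
# 240 -> 2.0, 256 -> 1.88, 320 -> 1.5
```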
> There is a hardware difference that you're ignoring. There is the matter of whether the software is written in 4, 8, 16, or 32 bit. That ultimately determined how good the art looked. With SVGA and 24-bit color they were using halftones combined with the dot pitch to render pretty sharp images.
I'm stupid and can't understand simple English.
> 1. This shows how ignorant you are of the entire subject. When using the 4, 8, 16, and 32 bit it refers to the CPUs bit integers or memory addresses. The first games were made for 4 bit CPUs and by the end of computer CRTs we were up to 32 bit CPUs. So yes, software was written for a specific bit set. Windows 3.x was 16 bit, Windows 95-ME were 32 bit, Windows XP was 32/64 bit, and every version up until Windows 10 was 32/64 bit. Windows 10 dropped support for 32 bit.
Whether a CPU is 8-bit or 16-bit has very little to do with the art a machine can output, so I don't know why you brought this up and continue to bring it up. And any CPU can work with integers of any bit width; it's just a matter of how many instructions it takes if the registers aren't the right size.
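To make that register-width point concrete, here's a toy Python sketch of a 32-bit addition done the way an 8-bit CPU would: one byte at a time, carrying between steps. More instructions, same result:

```python
# Toy model of wide arithmetic on a narrow ALU: add two 32-bit values using
# only 8-bit chunks and a carry, as an 8-bit CPU would over several
# instructions.
def add32_with_8bit_alu(a, b):
    result, carry = 0, 0
    for byte in range(4):               # least significant byte first
        pa = (a >> (8 * byte)) & 0xFF
        pb = (b >> (8 * byte)) & 0xFF
        s = pa + pb + carry
        carry = s >> 8                  # carry flag for the next byte
        result |= (s & 0xFF) << (8 * byte)
    return result & 0xFFFFFFFF

print(hex(add32_with_8bit_alu(0x12345678, 0x0FEDCBA8)))  # 0x22222220
```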
> Intel 4004 and 4040 CPUs were 4 bit. They were manufactured in 1971 and 1974 respectively. You can still buy 4 bit CPUs today.
lol. Good to know, I guess?
> No, I meant color depth. At first the color depth was 2 which was black and white. You have 4, 8, 16, 24, etc... color depth. When I said 24 bit color is the current standard that means the display can render an individual dot pitch as one of 16,777,216 colors.
What do you mean, "No"? I threw you a bone and said you probably meant the color channels, as opposed to how many bits "software is written in" (lol, wtf), and now you're trying to school me by explaining color bit depth without realizing that color bit depth is composed of the bit depths of the individual color channels (RGB in the standard additive model).
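Since we're counting bits anyway, the whole relationship is a few lines of Python (a quick sketch of my own; the channel splits shown are just the common conventions):

```python
# Total color depth is just the per-channel bit depths added up.
def colors(bits_r, bits_g, bits_b):
    return 2 ** (bits_r + bits_g + bits_b)

print(colors(8, 8, 8))  # 16777216 -- "24-bit" truecolor
print(colors(5, 5, 5))  # 32768    -- "15-bit" high color, typical of CRT-era pixel art
print(colors(5, 6, 5))  # 65536    -- "16-bit", with the extra bit given to green
```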
I thought you had me on ignore.
I'm a developer btw. I'd link you to my github, but I'm not interested in doxing myself.
As for the rest of what you wrote, you're out of your depth. Stop embarrassing yourself.
> I thought you had me on ignore.
Changed my mind; I want the full codex experience.
That sequence of sentences just doesn't make much sense, especially the last three. Maybe you know what you're talking about, but you're using very confusing and inexact terminology.
> Attack the points not the poster
I just don't think that's a good use of my time in your case. And that's saying something.
I can't help it if you can't read, you stupid commie fuck.
You should stop embarrassing yourself since you obviously don't know what the fuck you're talking about.
Attack the points, not the poster, fucktard.