tritosine2k

Erudite
Joined
Dec 29, 2010
Messages
1,480
Even better:
https://www.edmundoptics.com/knowle...introduction-to-modulation-transfer-function/
[Image: MTF in the vertical and horizontal directions for the cathode ray tube (CRT) and liquid crystal display]
 

Nutmeg

Arcane
Vatnik Wumao
Joined
Jun 12, 2013
Messages
20,082
Location
Mahou Kingdom
As an aside, does anyone else find EGA art much more beautiful than VGA art? It might be because artists working with EGA were working with signal samples reconstructed into images by analog circuitry, as opposed to something more akin to modern "color the grid" pixel art, as is the case with VGA.
 

JamesDixon

GM Extraordinaire
Patron
Dumbfuck
Joined
Jul 29, 2015
Messages
11,231
Location
In the ether
Strap Yourselves In Codex Year of the Donut
that the art you saw in monitor CRTs isn't any different from what you see today in flat panel displays
It's still different in a very objective and measurable way, but yeah VGA's line doubling brings a high horizontal resolution display slightly closer to an LCD than a low-res display without the line doubling.

But stop saying "monitor CRTs" as if they are really any different at all to TV CRTs. Monitors weren't exclusively used with IBM PC clones running VGA cards, and the pixel chunkiness in your Jazz Jackrabbit example has little to do with the display, and everything to do with the signal having nearest neighbor vertical doubling, a problem with VGA cards and nothing whatsoever to do with the display!

Did CGA and EGA do line doubling? IIRC they did not (also IIRC EGA had some ahead-of-its-time "intensity" signal). You should be able to see scan line width variation on your CGA screen in a macro photo of the phosphors, just like with any other CRT display. There's nothing magic about it.

As for scanlines, they're clearly visible in person especially in the larger monitors showing lower resolutions.
Correct. Interesting that you don't think these are relevant to the final image lol.

The key difference between computer CRTs and TV CRTs is that the TV had speakers and an RF tuner built in; computer CRTs lacked them by default. There are other differences too: computer CRTs had more sharpness, faster response times, shorter phosphor persistence, and higher resolution. A standard-definition CRT TV was stuck at 480i. You could play Quake on a TV from your PC if you had the right card, but it looked awful compared to a smaller PC CRT.
 

Bad Sector

Arcane
Patron
Joined
Mar 25, 2012
Messages
2,224
Insert Title Here RPG Wokedex Codex Year of the Donut Codex+ Now Streaming! Steve gets a Kidney but I don't even get a tag.
It's still different in a very objective and measurable way

As far as the way art is displayed goes, they are not different. I am not referring to the technical details, I am referring to how, in the FF7 example posted previously, you'd see the right image and not the left image on a monitor CRT.

But stop saying "monitor CRTs" as if they are really any different at all to TV CRTs.

They are.

Monitors weren't exclusively used with IBM PC clones running VGA cards

No, but you'd get similar results (in that you do not have color bleeding or other artifacts that make the art look different from a flat panel display) with an EGA, CGA or Hercules card - I gave photos as examples. They may not be the best photos, but you can see that individual pixels are visible.

and the pixel chunkiness in your Jazz Jackrabbit example has little to do with the display, and everything to do with the signal having nearest neighbor vertical doubling, a problem with VGA cards and nothing whatsoever to do with the display!

First of all, this is not a problem, it is a feature to get a proper crisp image on any display above 13" or so - otherwise you'd have some huge scanlines which would both darken the final image and look awful.
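To make concrete what that doubling does - a minimal sketch in Python (my toy model, assuming a frame is just a list of rows of palette indices, not real VGA hardware):

    # Nearest-neighbor vertical doubling, as a VGA card does when scanning
    # a 200-line mode out as 400 lines: each source row is sent twice.
    def line_double(frame):
        doubled = []
        for row in frame:           # row = one scanline of palette indices
            doubled.append(row)     # scanline sent once...
            doubled.append(row)     # ...and again, unchanged
        return doubled

    toy = [[1, 2], [3, 4]]          # a 2x2 stand-in for a 320x200 frame
    assert line_double(toy) == [[1, 2], [1, 2], [3, 4], [3, 4]]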

Second, this is a fucking CRT, displaying fucking pixel art. The entire fucking point of the image is to show that pixel art on a fucking old computer CRT looks as square as any fucking modern pixel art on a fucking flat panel, instead of the smudgy color bleeding that you'd get on a fucking TV CRT, which everyone whose entire fucking childhood was only about NES, PS1 and consoles ignorantly parades as the "proper way CRT is supposed to look". For fucks sake, how hard is it to understand that?
 

Nutmeg

Arcane
Vatnik Wumao
Joined
Jun 12, 2013
Messages
20,082
Location
Mahou Kingdom
Second, this is a fucking CRT, displaying fucking pixel art. The entire fucking point of the image is to show that pixel art on a fucking old computer CRT looks as square as any fucking modern pixel art
This is the point of contention. Except for pixel art created on and for VGA (and how much such art is there? Even on PC, half the games with decent pixel art were Amiga ports where the art was done on an Amiga), most "retro" pixel art was built for analog image reconstruction, not for coloring a grid of squares. People who grew up playing 2D games on emulators on LCDs, or in VGA low-res modes, are the ones who get it wrong. For fucks sake, how hard is it to understand that?

First of all this is not a problem, it is a feature to get proper crisp image on any display above 13"
Only if you have the view that an image is a 2D grid of squares, and not a sampled signal that needs to be reconstructed. The former is a very unsophisticated, poor, and I dare say amateurish misconception of what an image is.
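To illustrate the difference between the two views - a toy Python sketch (my own; the Gaussian spot standing in for a CRT beam profile is an assumption for illustration, not a measured model):

    import math

    samples = [0.0, 1.0, 0.0, 1.0]         # one low-res scanline

    def square_pixels(x):                  # "grid of squares" view: hard steps
        return samples[min(int(x), len(samples) - 1)]

    def crt_like(x, sigma=0.4):            # samples reconstructed by a blurred spot
        return sum(s * math.exp(-((x - (i + 0.5)) ** 2) / (2 * sigma ** 2))
                   for i, s in enumerate(samples))

    for x in (0.5, 1.0, 1.5):              # probe between sample centers
        print(f"x={x}  square={square_pixels(x):.2f}  crt-like={crt_like(x):.2f}")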

smudgy color bleeding that you'd get on a fucking TV CRT
You don't know what you're talking about. There is no color bleeding unless your "TV CRT" is damaged in some way. Color bleeding might come from a composite signal, but RGB signals on "TV CRTs" with the same resolution, mask and phosphor spec will look *exactly* the same as a monitor on the same resolution, mask and phosphor spec.

everyone whose entire fucking childhood was only about NES, PS1
My entire childhood was IBM PC clones tyvm.

There are other differences too: computer CRTs had more sharpness, faster response times, shorter phosphor persistence, and higher resolution
You can buy very high resolution "TV CRTs" e.g. BVMs. There's very little difference except connectivity between something like a BVM "TV" and a GDM "Monitor". They're just CRTs.

You could play Quake on a TV from your PC if you had the right card, but it looked awful compared to a smaller PC CRT.
3D graphics are vector graphics. They're not drawn by hand, but by an algorithm. They are objectively better the higher the resolution and AA, no matter the display technology being used to display them. At the same time, the image is still better in an objective way when it is reconstructed on a CRT by scanning out lines over phosphor masks than on an LCD color grid.
 

JamesDixon

GM Extraordinaire
Patron
Dumbfuck
Joined
Jul 29, 2015
Messages
11,231
Location
In the ether
Strap Yourselves In Codex Year of the Donut
and the pixel chunkiness in your Jazz Jackrabbit example has little to do with the display, and everything to do with the signal having nearest neighbor vertical doubling, a problem with VGA cards and nothing whatsoever to do with the display!

First of all, this is not a problem, it is a feature to get a proper crisp image on any display above 13" or so - otherwise you'd have some huge scanlines which would both darken the final image and look awful.

Second, this is a fucking CRT, displaying fucking pixel art. The entire fucking point of the image is to show that pixel art on a fucking old computer CRT looks as square as any fucking modern pixel art on a fucking flat panel, instead of the smudgy color bleeding that you'd get on a fucking TV CRT, which everyone whose entire fucking childhood was only about NES, PS1 and consoles ignorantly parades as the "proper way CRT is supposed to look". For fucks sake, how hard is it to understand that?

The key difference is that computer CRTs averaged a .21-.28mm dot pitch with resolutions up to 1280x1024, while TV CRTs had a .60-.9mm dot pitch and were stuck at 640x480. As you can see, the computer CRT had a dot pitch 3 to 4 times finer, on top of having over four times as many pixels (about twice the resolution in each direction). That's why computer CRTs had sharper and more detailed pixel art compared to, say, consoles running on TVs.
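Checking those ratios with the figures quoted above (my arithmetic, nothing more):

    pc, tv = 1280 * 1024, 640 * 480
    print(pc / tv)                    # ~4.27x the pixel count
    print(1280 / 640, 1024 / 480)     # 2x and ~2.13x the linear resolution
    print(0.60 / 0.28, 0.90 / 0.21)   # pitch ratios: the TV is ~2.1x to ~4.3x coarser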

As such, the squared images you're thinking of only came from the game/program itself, not from the monitor, especially monitors from the 1990s and later. Those earlier games were mostly 4-8 bit and the art was entered in hexadecimal by the programmer, not an artist. 16-bit and 32-bit art was done by professional artists on dedicated workstations, using art programs that began life as dedicated computer graph paper with art tools embedded. As time went on the software became even more powerful, to where many sprites used halftones in individual pixels. SVGA was what enabled that, with 24-bit true color.
 

JamesDixon

GM Extraordinaire
Patron
Dumbfuck
Joined
Jul 29, 2015
Messages
11,231
Location
In the ether
Strap Yourselves In Codex Year of the Donut
There are other differences too: computer CRTs had more sharpness, faster response times, shorter phosphor persistence, and higher resolution
You can buy very high resolution "TV CRTs" e.g. BVMs. There's very little difference except connectivity between something like a BVM "TV" and a GDM "Monitor". They're just CRTs.

You could play Quake on a TV from your PC if you had the right card, but it looked awful compared to a smaller PC CRT.
3D graphics are vector graphics. They're not drawn by hand, but by an algorithm. They are objectively better the higher the resolution and AA, no matter the display technology being used to display them. At the same time, the image is still better in an objective way when it is reconstructed on a CRT by scanning out lines over phosphor masks than on an LCD color grid.

You're missing dot pitch from your equation, on top of display resolution. Computer CRTs had a dot pitch of .21-.28mm with a max screen resolution of 1280x1024. TV CRTs had a .6-.9mm dot pitch and were stuck at 640x480. There is a hardware difference that you're ignoring. There is also the matter of whether the software is written for 4, 8, 16, or 32 bit; that ultimately determined how good the art looked. With SVGA and 24-bit color they were using halftones, combined with the dot pitch, to render pretty sharp images.
 
Last edited:

JamesDixon

GM Extraordinaire
Patron
Dumbfuck
Joined
Jul 29, 2015
Messages
11,231
Location
In the ether
Strap Yourselves In Codex Year of the Donut
TV CRTs had .60-.9mm dot pitch and stuck at 640x480 resolution
You could get 900 TVL "TV CRTs" i.e. a 1200 pixel horizontal resolution. They weren't "stuck". The only thing that made you "stuck" was your wallet.

You can have a high resolution, but the difference you're ignoring is dot pitch. That TV CRT may have a 1200 pixel horizontal resolution, but with a dot pitch of .6-.9mm it's going to be worse than a 1280x1024 computer CRT set to a .21-.28mm dot pitch. That means the detail was 3-4 times finer than on the TV CRT. It's the combination of resolution and dot pitch that determines the final result, before even considering the software's rendering capabilities.

Also, I was speaking of the general public not millionaires living on the bleeding edge. They're outliers and ignored.
 
Last edited:

JamesDixon

GM Extraordinaire
Patron
Dumbfuck
Joined
Jul 29, 2015
Messages
11,231
Location
In the ether
Strap Yourselves In Codex Year of the Donut
I forgot to add that 640x480 was what was broadcast from the towers and cable providers. All of your console games were made for 640x480; that's why they didn't scale. Not so with computer CRTs: computer software was designed for higher resolutions once the hardware moved past 640x480. That's why the images scaled and were dependent upon the monitor's resolution.
 

Nutmeg

Arcane
Vatnik Wumao
Joined
Jun 12, 2013
Messages
20,082
Location
Mahou Kingdom
You're missing dot pitch from your equation on top of display resolution
Dot pitch is just screen size divided by resolution. If you're able to resolve 1200 black and white lines over 10 inches or whatever, your dot pitch will be 10 inches divided by 1200. You could get high resolution small screens. In fact BVMs are quite compact.
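A quick worked example of that division (toy numbers of my choosing, not any specific model's spec):

    MM_PER_INCH = 25.4
    raster_width_mm = 10 * MM_PER_INCH        # a 10-inch-wide raster
    lines_resolved = 1200
    print(raster_width_mm / lines_resolved)   # ~0.21 mm pitch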

here is the matter of whether the software is written in 4, 8, 16, or 32 bit
1. Software isn't written in "bits"
2. Except for maybe some weird niche machine I don't know about there's no such thing as a 4-bit computer.
3. So I guess you meant the color channels being used for the art, which is fine that makes sense.

But there's almost no *pixel art* from the CRT era that was more than 15 bit in color. I'd actually be interested in learning of some examples.

All of your console games were made with 640x480 that's why it didn't scale
You don't know what you're talking about. Consoles couldn't (or well, didn't, except for some static menus and some very rare exceptions) output 640 by 480 till the Dreamcast lol. They usually didn't even output 240 lines but like 224 or some such number.

Also, I was speaking of the general public not millionaires living on the bleeding edge. They're outliers and ignored.
I remember around the time Plasma TVs were coming around in the mid 2000s, "HD" "TV" CRTs were a cheaper consumer alternative.

Anyway, to move the conversation in a different direction, I do want to say that "SD" screens (larger dot pitch, as JamesDixon would say) have properties that reconstruct low-res images in a way with some better objective qualities (the coarser mask is positioned closer to where the sinc lobes should be). I find 360 TVL masks to be ideal for 240 to 320 sample per line signals at 4:3.
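Unpacking that TVL figure (my arithmetic; TVL counts lines per picture height, so multiply by the 4:3 aspect ratio to get lines across the full width):

    tvl = 360
    triads_across = tvl * 4 / 3                   # ~480 across the full 4:3 width
    for samples in (240, 320):
        print(samples, triads_across / samples)   # ~2.0 and ~1.5 mask triads per sample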
 
Last edited:

JamesDixon

GM Extraordinaire
Patron
Dumbfuck
Joined
Jul 29, 2015
Messages
11,231
Location
In the ether
Strap Yourselves In Codex Year of the Donut
You're missing dot pitch from your equation on top of display resolution
Dot pitch is just screen size divided by resolution. If you're able to resolve 1200 black and white lines over 10 inches or whatever, your dot pitch will be 10 inches divided by 1200. You could get high resolution small screens. In fact BVMs are quite compact.

here is the matter of whether the software is written in 4, 8, 16, or 32 bit
1. Software isn't written in "bits"
2. Except for maybe some weird niche machine I don't know about there's no such thing as a 4-bit computer.
3. So I guess you meant the color channels being used for the art, which is fine that makes sense.

But there's almost no *pixel art* from the CRT era that was more than 15 bit in color. I'd actually be interested in learning of some examples.

Also, I was speaking of the general public not millionaires living on the bleeding edge. They're outliers and ignored.
I remember around the time Plasma TVs were coming around in the mid 2000s, "HD" "TV" CRTs were a cheaper consumer alternative.

Anyway, to move the conversation in a different direction, I do want to say that "SD" screens (larger dot pitch, as JamesDixon would say) have properties that reconstruct low-res images in a way with some better objective qualities (the coarser mask is positioned closer to where the sinc lobes should be). I find 360 TVL masks to be ideal for 240 to 320 samples per line.

All of your console games were made with 640x480 that's why it didn't scale
You don't know what you're talking about. Consoles couldn't output 640 by 480 till the Dreamcast lol.

The ability to display fine detail involves many factors including the resolution of the video source, video bandwidth, sharpness of the electron beam(s), and the dot/slot/line pitch (color only) of the CRT.

The CRT is primarily responsible for the latter two.

The focus or sharpness of the spot or spots that scan across the screen is a function of the design of the electron gun(s) in the CRT and the values of the various voltages which drive them. Focus may be adjustable, but excellent focus everywhere on the screen is generally not possible.

Sharp focus is a difficult objective - the negatively charged electrons repel each other and provide an inherent defocusing action. However, increasingly sharp focus would not be of value beyond a certain point as the ultimate resolution of a color CRT is limited by the spacing - the pitch - of the color phosphor elements. (For monochrome displays and black-and-white TVs, CRT resolution is limited primarily by the electron beam focus.)

One of three approaches is used to ensure that only the proper electron beam strikes each color phosphor. All perform the same function:

  1. Dot mask - the phosphor screen consists of triads of R, G, and B circular dots in a triangular arrangement. The shadow mask is a steel or InVar sheet filled with holes - one for each triad. The dot mask has been used since the early days of color TV and is still popular today. The electron guns are also arranged in a triangular configuration.
  2. Slot mask - the phosphor screen consists of triples of vertically elongated R, G, and B stripes (actually, these are usually full vertical stripes interrupted by narrow gaps). The shadow mask is a steel or InVar sheet filled with slots - one for each triple. Ideally, the metal between the slots vertically is as thin as possible while maintaining the structural stability of the slot mask sheet. This type of tube seems to be very popular in TVs but also shows up in some computer monitors. The electron guns are in line, which makes some of the setup adjustments less critical compared to the dot mask CRT.
  3. Aperture grille - the phosphor screen consists of triples of vertical R, G, and B lines running the full height of the screen. The aperture grille is a series of tensioned steel wires running vertically behind the phosphor stripes - one for each triple. The aperture grille - until recently under patent protection and therefore only available in the Trinitron from Sony - is found in both TVs and monitors. The electron guns are also in line.
The pitch of a color CRT refers to the spacing of phosphor triads or triples. For dot mask CRTs, this parameter is relevant in both the horizontal and vertical direction. For slot mask and aperture grille CRTs, the pitch is only relevant in the horizontal direction.
Dot pitches as small as .22 mm are found in high resolution CRTs. Very inexpensive 14" monitors - often bundled with a 'low ball' PC system - may have a dot pitch as poor as .39 mm. This is useless for any resolution greater than VGA. Common SVGA monitors use a typical dot pitch of .28 mm. TVs due to their lower resolution have pitches (depending on screen size) as high as .75 mm - or more.

Obviously, with smaller screens and higher desired video source resolutions, CRT pitch becomes increasingly important. However, it isn't a simple relationship - it's not that the size of a pixel simply needs to be larger than the size of a dot triad or triple, for example. Focus is important. All other factors being equal, a smaller pitch is generally preferred, and you will likely be disappointed if the pitch is larger than a pixel. As the pixel size approaches the phosphor triad or triple size, moire becomes more likely. However, the only truly reliable way to determine whether moire will be a problem with your monitor is to test it at the resolutions you intend to use.

Source: http://arcadecontrols.com/files/Miscellaneous/crtfaq.htm#crtcrs

1. This shows how ignorant you are of the entire subject. When we say 4, 8, 16, or 32 bit it refers to the CPU's integer width and memory addressing. The first games were made for 4-bit CPUs, and by the end of the computer CRT era we were up to 32-bit CPUs. So yes, software was written for a specific bit width. Windows 3.x was 16-bit, Windows 95-ME were 32-bit, and Windows XP through Windows 10 shipped in 32/64-bit versions. Windows 11 finally dropped 32-bit support.

2. Intel 4004 and 4040 CPUs were 4 bit. They were manufactured in 1971 and 1974 respectively. You can still buy 4 bit CPUs today.

3. No, I meant color depth. At first the color depth was 1 bit, which gave you two colors, black and white. You have 4, 8, 16, 24, etc. bit color depths. When I said 24-bit color is the current standard, that means the display can render an individual pixel as one of 16,777,216 colors.

When referring to screen resolution I am not talking about the game hardware, I am talking about the TV hardware. You're concerned with consoles when, in 1998, computers were outputting 1280x1024, which blows your consoles out of the water.
 

Rincewind

Magister
Patron
Joined
Feb 8, 2020
Messages
2,462
Location
down under
Codex+ Now Streaming!
There is a hardware difference that you're ignoring. There is the matter of whether the software is written in 4, 8, 16, or 32 bit. That ultimately determined how good the art looked. With SVGA and 24-bit color they were using halftones combined with the dot pitch to render pretty sharp images.

[attached image]
 

JamesDixon

GM Extraordinaire
Patron
Dumbfuck
Joined
Jul 29, 2015
Messages
11,231
Location
In the ether
Strap Yourselves In Codex Year of the Donut
There is a hardware difference that you're ignoring. There is the matter of whether the software is written in 4, 8, 16, or 32 bit. That ultimately determined how good the art looked. With SVGA and 24-bit color they were using halftones combined with the dot pitch to render pretty sharp images.

I'm stupid and can't understand simple English.

I thought you had me on ignore. :lol:
 

Nutmeg

Arcane
Vatnik Wumao
Joined
Jun 12, 2013
Messages
20,082
Location
Mahou Kingdom
1. This shows how ignorant you are of the entire subject. When using the 4, 8, 16, and 32 bit it refers to the CPUs bit integers or memory addresses. The first games were made for 4 bit CPUs and by the end of computer CRTs we were up to 32 bit CPUs. So yes, software was written for a specific bit set. Windows 3.x was 16 bit, Windows 95-ME were 32 bit, Windows XP was 32/64 bit, and every version up until Windows 10 was 32/64 bit. Windows 10 dropped support for 32 bit.
Whether a CPU is 8-bit or 16-bit has very little to do with the art a machine can output, so I don't know why you brought this up and continue to bring it up. And any CPU can work with integers of any width; it's just a matter of how many instructions it takes if the registers aren't the right size.
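A toy model of that point (my sketch, not real ISA code): a machine with 8-bit registers can still add 32-bit integers, it just takes four byte-wide adds chained through a carry.

    def add32_on_8bit_ops(a, b):
        # add two 32-bit values using only 8-bit chunks plus a carry bit
        result, carry = 0, 0
        for byte in range(4):                         # low byte first
            s = ((a >> (8 * byte)) & 0xFF) + ((b >> (8 * byte)) & 0xFF) + carry
            result |= (s & 0xFF) << (8 * byte)
            carry = s >> 8                            # 0 or 1
        return result & 0xFFFFFFFF

    assert add32_on_8bit_ops(0xFFFFFFFF, 1) == 0      # wraps like a 32-bit register
    assert add32_on_8bit_ops(0x12345678, 0x0F0F0F0F) == 0x21438587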

I'm a developer btw. I'd link you to my github, but I'm not interested in doxing myself.

Intel 4004 and 4040 CPUs were 4 bit. They were manufactured in 1971 and 1974 respectively. You can still buy 4 bit CPUs today.
lol. Good to know, I guess?

No, I meant color depth. At first the color depth was 1 bit, which gave you two colors, black and white. You have 4, 8, 16, 24, etc. bit color depths. When I said 24-bit color is the current standard, that means the display can render an individual pixel as one of 16,777,216 colors.
What do you mean "No"? I threw you a bone and said you probably meant the color channels as opposed to how many bits "software is written in" (lol wtf), and now you're trying to school me by explaining color bit depth without realizing that color bit depth is composed of the bit depths of the individual color channels (RGB in the standard additive model).
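To make the channel composition concrete (a sketch using a hypothetical RGB555 layout; real CRT-era hardware varied in channel order):

    def pack_rgb888(r, g, b):      # 8+8+8 bits per channel = 24-bit depth
        return (r << 16) | (g << 8) | b

    def pack_rgb555(r, g, b):      # 5+5+5 bits per channel = 15-bit depth
        return (r << 10) | (g << 5) | b

    print(2 ** 24)                 # 16,777,216 colors
    print(2 ** 15)                 # 32,768 colors, typical CRT-era pixel art ceiling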

As for the rest of what you wrote, you're out of your depth. Stop embarrassing yourself.
 

Rincewind

Magister
Patron
Joined
Feb 8, 2020
Messages
2,462
Location
down under
Codex+ Now Streaming!
There is a hardware difference that you're ignoring. There is the matter of whether the software is written in 4, 8, 16, or 32 bit. That ultimately determined how good the art looked. With SVGA and 24-bit color they were using halftones combined with the dot pitch to render pretty sharp images.

I'm stupid and can't understand simple English.

I thought you had me on ignore. :lol:

Changed my mind, I want the full codex experience :M

That sequence of sentences just doesn't make much sense, especially the last three. Maybe you know what you're talking about, but you're using very confusing and inexact terminology.
 

JamesDixon

GM Extraordinaire
Patron
Dumbfuck
Joined
Jul 29, 2015
Messages
11,231
Location
In the ether
Strap Yourselves In Codex Year of the Donut
1. This shows how ignorant you are of the entire subject. When we say 4, 8, 16, or 32 bit it refers to the CPU's integer width and memory addressing. The first games were made for 4-bit CPUs, and by the end of the computer CRT era we were up to 32-bit CPUs. So yes, software was written for a specific bit width. Windows 3.x was 16-bit, Windows 95-ME were 32-bit, and Windows XP through Windows 10 shipped in 32/64-bit versions. Windows 11 finally dropped 32-bit support.
Whether a CPU is 8-bit or 16-bit has very little to do with the art a machine can output, so I don't know why you brought this up and continue to bring it up. And any CPU can work with integers of any width; it's just a matter of how many instructions it takes if the registers aren't the right size.

I'm a developer btw. I'd link you to my github, but I'm not interested in doxing myself.

Intel 4004 and 4040 CPUs were 4 bit. They were manufactured in 1971 and 1974 respectively. You can still buy 4 bit CPUs today.
lol

No, I meant color depth. At first the color depth was 1 bit, which gave you two colors, black and white. You have 4, 8, 16, 24, etc. bit color depths. When I said 24-bit color is the current standard, that means the display can render an individual pixel as one of 16,777,216 colors.
What do you mean "No"? I threw you a bone and said you probably meant the color channels as opposed to how many bits "software is written in" (lol wtf), and now you're trying to school me by explaining color bit depth without realizing that color bit depth is composed of the bit depths of the individual colors.

As for the rest of what you wrote, you're out of your depth. Stop embarrassing yourself.

1. It actually can, since it limits what memory addresses the software can handle. Also, I stated that the games were written for a specific CPU, not that it impacts the art or that it was written in "bits", you retard fuck; that is strawmanning me. If your position were true then you should be able to run Duke Nukem 3D on Windows 10. You can't, since 64-bit Windows no longer supports 16-bit and DOS software. One thing you've left out of your equation is the OS that is required by all machines.

I don't care if you paint naked titties for a living. It's irrelevant to the conversation at hand.

2. I used specific terminology, and the correct terminology at that. I can't help it if you can't read, you stupid commie fuck.

3. Not an argument, princess. You should stop embarrassing yourself, since you obviously don't know what the fuck you're talking about. Attack the points, not the poster, fucktard.
 
Last edited:

JamesDixon

GM Extraordinaire
Patron
Dumbfuck
Joined
Jul 29, 2015
Messages
11,231
Location
In the ether
Strap Yourselves In Codex Year of the Donut
There is a hardware difference that you're ignoring. There is the matter of whether the software is written in 4, 8, 16, or 32 bit. That ultimately determined how good the art looked. With SVGA and 24-bit color they were using halftones combined with the dot pitch to render pretty sharp images.

I'm stupid and can't understand simple English.

I thought you had me on ignore. :lol:

Changed my mind, I want the full codex experience :M

Those sequence of sentences just don't make much sense. Especially the last three. Maybe you know what you're talking about, but you're using very confusing and inexact terminology.

Well, that's obvious, since you're a retard who failed English.
 

Nutmeg

Arcane
Vatnik Wumao
Joined
Jun 12, 2013
Messages
20,082
Location
Mahou Kingdom
Attack the points not the poster
I just don't think that's a good use of my time in your case. And that's saying something.

And it's not "points" I'd be attacking, just bizarre misinformation and misunderstandings due to an uneducated (not pejorative, we can't all be experts at everything) individual reaching beyond his grasp and feeling like he has something to prove.
 
