PorkyThePaladin
Arcane
- Joined: Dec 17, 2013
- Messages: 5,415
What's wrong with OOP? I keep hearing complaints, but so far I have not seen a concise, reasonable argument.

It does not fit our hardware architecture.
In an OO program, calling a virtual method means going to the VMT (virtual method table) associated with the specific instance of the object, reading the address of the method, jumping to it, and only then executing it. The compiler cannot inline the method.
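A minimal C++ sketch of that dispatch path, assuming a classic vtable implementation (the Shape/Circle names are illustrative, not from the post):

```cpp
#include <cstdio>

struct Shape {
    virtual double area() const = 0;  // resolved through the vtable
    virtual ~Shape() = default;
};

struct Circle final : Shape {
    double r;
    explicit Circle(double r) : r(r) {}
    double area() const override { return 3.14159265 * r * r; }
};

double through_base(const Shape& s) {
    // The compiler only sees a Shape&: it must load the vtable pointer,
    // fetch the address of area(), and jump there. Unless it can prove
    // the dynamic type (devirtualization), it cannot inline the body.
    return s.area();
}

double through_derived(const Circle& c) {
    // Circle is final, so the call target is known statically and the
    // compiler is free to inline it.
    return c.area();
}

int main() {
    Circle c{2.0};
    std::printf("%f %f\n", through_base(c), through_derived(c));
}
```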
You cannot predict the size of a polymorphic object, so you cannot lay such objects out in memory in order. Your collection of objects is really a collection of pointers to scattered addresses, so the next few objects in the collection cannot be loaded into cache ahead of time while you are processing the current one.
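To make the layout point concrete, here is a sketch contrasting the two storage strategies, under the same hypothetical Shape hierarchy as above:

```cpp
#include <cstdio>
#include <memory>
#include <vector>

struct Shape { virtual double area() const = 0; virtual ~Shape() = default; };
struct Circle : Shape { double r = 1.0; double area() const override { return 3.14159 * r * r; } };
struct Square : Shape { double s = 1.0; double area() const override { return s * s; } };

double sum_polymorphic(const std::vector<std::unique_ptr<Shape>>& shapes) {
    // Each element is a pointer to a separately allocated object of
    // unknown size; the objects can land anywhere on the heap, so the
    // prefetcher cannot stream the next few into cache.
    double acc = 0;
    for (const auto& p : shapes) acc += p->area();
    return acc;
}

double sum_contiguous(const std::vector<Circle>& circles) {
    // One homogeneous array: fixed-size elements packed back to back,
    // so iteration is a linear walk the cache handles well.
    double acc = 0;
    for (const auto& c : circles) acc += c.area();
    return acc;
}

int main() {
    std::vector<std::unique_ptr<Shape>> mixed;
    mixed.push_back(std::make_unique<Circle>());
    mixed.push_back(std::make_unique<Square>());
    std::vector<Circle> circles(1000);
    std::printf("%f %f\n", sum_polymorphic(mixed), sum_contiguous(circles));
}
```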
This is not something that will go away with larger caches. The Ryzen 5800X3D and Zen 4/5 offer ever larger higher-level caches, but you can always get faster code by fitting everything in a lower-level cache, which is roughly ten times smaller and ten times faster.
Most commercially used so-called OO languages do not go all the way and do not make everything inherit from an 'object' (in Java, for instance, primitives like int are not objects). That's why you sometimes get programs that perform more or less acceptably.
This is the kind of argument that makes no sense in the context of most programming jobs, e.g. business programming, web programming, etc. This guy is arguing for low-level programming that fits the machine, when literally the entire history of programming has moved in the opposite direction. Programming languages are commonly divided into generations: first-generation languages were raw machine code (0s and 1s), assembly was second generation, languages like C, Fortran, and COBOL were third generation, and SQL is fourth generation. Java, Python, Ruby, and C# sit somewhere around 3.5, though Python and Ruby are higher on that scale and Java and C# lower. Each generation added more abstraction to get away from the machine details. The further you get from thinking in terms of machine architecture, the more freedom, flexibility, and power you have to think in terms of your business logic, and that increases programmer productivity by a ridiculous amount. And programmer productivity is the single most expensive resource in this industry today.
Now if you are coding an OS or a cutting-edge 3D shooter, you might need to do what he is talking about, but outside of that, focus on being more productive. Notch built Minecraft in Java, a language generally considered terrible for video games, but because he was familiar with it and it let him focus on the design stuff, the game sold for 2.5 billion bucks. Meanwhile you have pointer cowboys like some here trying to optimize memory allocation...
As far as OOP goes, anyone arguing against it is a retard. OOP is an extremely valuable programming tool. Like any tool, it can be overused. The idea behind OOP is to divide and conquer the complexity of code into manageable, bite-sized conceptual chunks that can be more easily created, maintained, and reused. You should always do that. However, if you are writing a one-time, unique procedure, of course don't put it into a class just to follow OOP; but where appropriate, use OOP as much as possible.
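A hedged illustration of that "use it where appropriate" point, with an invented example domain (Account is hypothetical, not anything from the thread):

```cpp
#include <string>
#include <vector>

// Where OOP earns its keep: a concept with state plus an invariant to
// protect. Callers get a small, reusable, testable surface.
class Account {
public:
    explicit Account(std::string owner) : owner_(std::move(owner)) {}

    bool withdraw(long cents) {
        if (cents < 0 || cents > balance_) return false;  // invariant guarded
        balance_ -= cents;
        return true;
    }
    void deposit(long cents) { if (cents > 0) balance_ += cents; }
    long balance() const { return balance_; }

private:
    std::string owner_;
    long balance_ = 0;  // invariant: never negative
};

// A one-off procedure with no state to protect: a free function is fine.
// Wrapping it in a class just to "be OO" would add nothing.
long total_balance(const std::vector<Account>& accounts) {
    long sum = 0;
    for (const auto& a : accounts) sum += a.balance();
    return sum;
}

int main() {
    std::vector<Account> accounts;
    accounts.emplace_back("alice");
    accounts.back().deposit(1500);
    return total_balance(accounts) == 1500 ? 0 : 1;
}
```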