What's wrong with OOP? I keep hearing complaints, but so far I have not seen a concise, reasonable argument.

It does not fit our hardware architecture.
In an OO program, to call a method you have to go to the VMT associated with the specific instance of the object, read the address of the virtual method from it, jump to that address, and only then execute the method. The compiler cannot inline such a call.
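A minimal C++ sketch of the difference (the class names are invented for illustration): the virtual call has to go through the vtable pointer stored inside the object, while a call on a concrete type whose body the compiler can see is resolved, and usually inlined, at compile time.

```cpp
#include <cstdio>

struct Shape {
    virtual ~Shape() = default;
    virtual double area() const = 0;   // resolved at run time through the vtable
};

struct Circle : Shape {
    double r;
    explicit Circle(double r) : r(r) {}
    double area() const override { return 3.14159265 * r * r; }
};

// Plain function on a concrete type: the compiler sees the body
// and can inline it into the caller.
inline double circle_area(double r) { return 3.14159265 * r * r; }

double total(const Shape* s) {
    // Load the vptr from *s, load the slot for area(), make an indirect call.
    return s->area();
}

int main() {
    Circle c{2.0};
    std::printf("%f %f\n", total(&c), circle_area(2.0));
}
```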
You cannot predict the size of a polymorphic object, so you cannot lay them out in memory in order. Your collection of objects is really a collection of pointers to scattered addresses, and the next few objects in the collection cannot be loaded into cache ahead of time while you are processing the current one.
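A hedged sketch of that layout difference, again with invented types: iterating a vector of base-class pointers chases one pointer per element and makes a virtual call, while a vector of plain values is a single contiguous block that the hardware prefetcher can stream through.

```cpp
#include <memory>
#include <vector>

struct Shape { virtual ~Shape() = default; virtual double area() const = 0; };
struct Circle : Shape {
    double r;
    explicit Circle(double r) : r(r) {}
    double area() const override { return 3.14159 * r * r; }
};

// OO-style: each element lives at its own heap address.
double sum_oo(const std::vector<std::unique_ptr<Shape>>& shapes) {
    double s = 0;
    for (const auto& p : shapes) s += p->area();   // pointer chase + virtual call per element
    return s;
}

// Data-oriented: all radii sit next to each other in memory.
double sum_flat(const std::vector<double>& radii) {
    double s = 0;
    for (double r : radii) s += 3.14159 * r * r;   // sequential reads, easy to prefetch and vectorize
    return s;
}

int main() {
    std::vector<std::unique_ptr<Shape>> shapes;
    std::vector<double> radii;
    for (int i = 1; i <= 4; ++i) {
        shapes.push_back(std::make_unique<Circle>(i));
        radii.push_back(i);
    }
    return (sum_oo(shapes) == sum_flat(radii)) ? 0 : 1;
}
```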
This is not something that will go away with larger caches. The Ryzen 5800X3D and Zen 4/5 parts offer ever larger higher-level caches, but you can always get faster code by fitting everything into a lower-level cache, which is roughly ten times smaller and ten times faster.
Most commercially used, so-called OO languages do not go all the way and do not make everything inherit from an 'object'. That is why you sometimes get programs that perform more or less acceptably.
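One concrete illustration of what "going all the way" costs, with assumed numbers for a typical 64-bit ABI: a plain value type carries no hidden fields and can be stored and passed by value, while making the same type polymorphic adds a vtable pointer to every single instance.

```cpp
#include <cstdio>

struct PlainPoint {          // pure value type, no hidden fields
    double x, y;
};

struct ObjectPoint {         // "everything is an object": needs a vptr
    virtual ~ObjectPoint() = default;
    double x, y;
};

int main() {
    // Typically prints 16 and 24 on a 64-bit ABI: the polymorphic
    // version pays 8 extra bytes per instance for the vtable pointer alone.
    std::printf("%zu %zu\n", sizeof(PlainPoint), sizeof(ObjectPoint));
}
```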
This is the kind of argument that makes no sense in the context of most programming jobs, e.g. business programming, web programming, etc. This guy is arguing for low-level programming that fits the machine, when literally the entire history of programming has moved in the opposite direction. Programming languages are commonly divided into generations: 1st gen was raw machine code (0s and 1s), assembly was 2nd gen, languages like C, Fortran, and Cobol were 3rd gen, and SQL is 4th gen. Java/Python/Ruby/C# are something like 3.5 gen, though Python and Ruby sit higher on that scale and Java/C# lower. Each generation added more abstraction to get away from the machine details.
That was possible because, thanks to Moore's Law, you could just eat the performance cost. The days of doubling processing power every 18 months are over, but we still want more from our software, so abstracting even further from the machine (or ignoring performance entirely) is probably not possible.
The further you get from thinking in terms of machine architecture, the more freedom, flexibility, and power you have to think in terms of your business logic, and that increases programmer productivity by a ridiculous amount. And programmer productivity is the single most expensive thing in this industry today.
1. Highly productive programmers who churn out functional code at high volume, interact with business people, try to understand the business requirements, and help the company grow.
You are forgetting about the end user and what their demands are.