
Which programming language did you choose and why?

kepler

Literate
Joined
Jun 1, 2022
Messages
32
Location
Lechistan
What's wrong with OOP? I keep hearing complaints but so far I have not seen a concise, reasonable argument
It does not fit our hardware architecture.

In an OO program, to call a method you need to go to the VMT associated with the specific instance of an object, then read the address of the virtual method, jump to it, and only then execute it. You cannot inline the method.

You cannot predict the size of a polymorphic object, so you cannot lay such objects out in memory in order. Your collection of objects is really a collection of pointers to scattered addresses, and the next few objects in the collection cannot be loaded into cache ahead of time while you are processing the current one.

It is not something that will go away with larger caches. The Ryzen 5800X3D and Zen 4/5 offer larger and larger higher-level caches, but you can always get faster code by fitting everything in a lower-level cache, which is 10 times smaller and 10 times faster.

Most commercially used, so-called OO languages do not go all the way and do not make everything inherit from an 'object'. That's why you sometimes get programs that perform more or less acceptably.
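For what it's worth, the dispatch half of that argument is easy to picture even in a high-level language; Python is the extreme case, since every object lives behind a pointer and every method call is looked up at run time. A minimal sketch with made-up Shape classes:

```python
class Shape:
    def area(self) -> float:
        raise NotImplementedError

class Square(Shape):
    def __init__(self, side: float):
        self.side = side

    def area(self) -> float:
        return self.side * self.side

class Circle(Shape):
    def __init__(self, radius: float):
        self.radius = radius

    def area(self) -> float:
        return 3.14159 * self.radius * self.radius

# A "collection of Shapes" is really a collection of pointers to
# differently-sized objects scattered across the heap; each s.area()
# below is resolved through the object's type at run time, which is
# exactly what stops a compiler from inlining the call.
shapes: list[Shape] = [Square(2.0), Circle(1.0), Square(3.0)]
areas = [s.area() for s in shapes]
```

A compiler can inline and vectorise a loop over a plain contiguous array of floats; it cannot do the same over this list, because the target of each call is not known until the call happens.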

This is the kind of argument that makes no sense in the context of most programming jobs, e.g. business programming, web programming, etc. This guy is arguing for low-level programming that fits the machine, when literally the entire history of programming has moved in the opposite direction. Programming languages are divided into generations: 1st gen was 0s and 1s, assembly was 2nd gen, languages like C, Fortran, and Cobol were 3rd gen, and SQL is 4th gen. Java/Python/Ruby/C# are something like 3.5 gen, though Python and Ruby sit higher on that scale and Java/C# lower. Each generation added more abstraction to get away from the machine details.

This is because, thanks to Moore's Law, you could just eat the performance cost. The days of doubling processing power every 18 months are over, but we still want more from our software, so abstracting further from the machine (or ignoring performance entirely) is probably not possible.

The more you get away from thinking in terms of machine architecture, the more freedom, flexibility, and power you get to think in terms of your business logic, and that increases programmer productivity by a ridiculous amount. And programmer productivity is the single most expensive thing today when it comes to this industry.

1. Highly productive programmers that churn out functional code at high volume, interact with business people, try to understand the business requirements and help the company grow.

You are forgetting about the end user and what his demands are.
 

Axioms

Arcane
Developer
Joined
Jul 11, 2019
Messages
645
There are three types of programmers along one axis:

1. Highly productive programmers that churn out functional code at high volume, interact with business people, try to understand the business requirements and help the company grow.

2. Autistic uber-nerds that are mostly interested in the technical aspects of programming, are unable to interact with business people on any productive level, will re-factor and redesign perfectly functioning systems 12 times to get that perfect architecture, etc.

3. Outsourced/shit programmers who care about neither point 1 nor point 2.

I feel like, based on the comments in this thread, most programmers here are from group 2... They despise highly productive languages, frown on paradigms that most closely resemble the real world, focus on insane performance in the age of cheap cloud-based hardware, and so on. Interesting...

And there's Porky who's so entrenched in writing webapps in Python where performance doesn't matter that in his worldview that's the only type of project that exists.

That's a retarded categorisation, by the way; if you're doing heavy algorithmic work that can go on for weeks or months, you don't have time to sit in meetings half the day to "interact with business people, try to understand the business requirements and help the company grow". You do all that by *coding*, because that's your job. Better companies recognise that forcing highly skilled coders to sit in meetings all day is a gross waste of talent, instead of letting them do what they do best, which is coding.

E.g. in my job I'm *shielded* by management from wasting my time in too many meetings, because talking about business requirements can be done by many people (and there are dedicated people to do that), but doing the actual work requires uninterrupted focus. Forcing coders to do all of that in one person is actually a sign of a Mickey Mouse type of operation to me, where they simply want to hire one person "who can do it all". I feel sorry for you if that's your only experience working in the industry.

Also, the business people won't understand shit of what you're trying to say, because it's so math/algorithm heavy. So you can only give a two-sentence progress report anyway, and that's it.

I don't know where you live, Porky, but there are *many* "outsourced programmers", from the Eastern Bloc in particular, who are at least as good as if not better than the average "western" guy. Sure, I have worked on projects where half of the team was outsourced to Indonesia and it was a trainwreck... but in general it's a gross overgeneralisation.
American outsourcing goes straight to India or South East Asia. They don't trust Eastern Europeans here.
 

Axioms

Arcane
Developer
Joined
Jul 11, 2019
Messages
645
What's wrong with OOP? I keep hearing complaints but so far I have not seen a concise, reasonable argument
It does not fit our hardware architecture.

In an OO program, to call a method you need to go to the VMT associated with the specific instance of an object, then read the address of the virtual method, jump to it, and only then execute it. You cannot inline the method.

You cannot predict the size of a polymorphic object, so you cannot lay such objects out in memory in order. Your collection of objects is really a collection of pointers to scattered addresses, and the next few objects in the collection cannot be loaded into cache ahead of time while you are processing the current one.

It is not something that will go away with larger caches. The Ryzen 5800X3D and Zen 4/5 offer larger and larger higher-level caches, but you can always get faster code by fitting everything in a lower-level cache, which is 10 times smaller and 10 times faster.

Most commercially used, so-called OO languages do not go all the way and do not make everything inherit from an 'object'. That's why you sometimes get programs that perform more or less acceptably.

This is the kind of argument that makes no sense in the context of most programming jobs, e.g. business programming, web programming, etc. This guy is arguing for low-level programming that fits the machine, when literally the entire history of programming has moved in the opposite direction. Programming languages are divided into generations: 1st gen was 0s and 1s, assembly was 2nd gen, languages like C, Fortran, and Cobol were 3rd gen, and SQL is 4th gen. Java/Python/Ruby/C# are something like 3.5 gen, though Python and Ruby sit higher on that scale and Java/C# lower. Each generation added more abstraction to get away from the machine details.

This is because, thanks to Moore's Law, you could just eat the performance cost. The days of doubling processing power every 18 months are over, but we still want more from our software, so abstracting further from the machine (or ignoring performance entirely) is probably not possible.

The more you get away from thinking in terms of machine architecture, the more freedom, flexibility, and power you get to think in terms of your business logic, and that increases programmer productivity by a ridiculous amount. And programmer productivity is the single most expensive thing today when it comes to this industry.

1. Highly productive programmers that churn out functional code at high volume, interact with business people, try to understand the business requirements and help the company grow.

You are forgetting about the end user and what his demands are.
Porky is absolutely right in many cases, but incredibly wrong in others. Try coding a detailed strategy game in some high-level language. Hell, we see the performance hits with C# in Unity, let alone Python.
 

Orud

Learned
Patron
Joined
May 2, 2021
Messages
324
Strap Yourselves In
Video games fall outside the examples Porky gave for when it (performance) doesn't matter. Video games belong, to an extent, to performance-critical software and require a different approach compared to common business-case programming; e.g. make fewer method calls, prefer manual memory control over garbage collection (or end up with Unity's performance/resource needs), etc. And even that area has become more and more forgiving as time has passed and processing power has grown (again, the fact that Unity is what it is, is a perfect example of this).

He is simply arguing that in most existing professional programming projects, readability and maintainability outweigh performance (to an extent, of course). And he is right; otherwise the landscape of dominant programming languages would look completely different.
 
Last edited:

Rincewind

Magister
Joined
Feb 8, 2020
Messages
1,082
Location
down under
American outsourcing goes straight to India or South East Asia. They don't trust Eastern Europeans here.
What can I say? American propaganda worked, and they're really good at manipulating the public.

Anyway, their loss, I guess.
 

Rincewind

Magister
Joined
Feb 8, 2020
Messages
1,082
Location
down under
Video games fall outside the examples Porky gave for when it (performance) doesn't matter. Video games belong, to an extent, to performance-critical software and require a different approach compared to common business-case programming; e.g. make fewer method calls, prefer manual memory control over garbage collection (or end up with Unity's performance/resource needs), etc. And even that area has become more and more forgiving as time has passed and processing power has grown (again, the fact that Unity is what it is, is a perfect example of this).

He is simply arguing that in most existing professional programming projects, readability and maintainability outweigh performance (to an extent, of course). And he is right; otherwise the landscape of dominant programming languages would look completely different.
Yeah, he's right in 90% of his points, *but* then he makes some really retarded extrapolations/overgeneralisations, and that's the most annoying part; it really rubs me the wrong way. Right now I'm literally doing both: high-level stuff at work (long-running server-side processes, so I can't just handwave away performance concerns like in most webapps; but still, it's very high-level code), then low-level, performance-sensitive C++ stuff in my spare time contributing to DOSBox Staging. Both approaches are needed and are best fits for the problem domains at hand; you can't "replace" either language with the other.

There's another subtle difference: while there's no arguing that readability and maintainability are paramount in professional work, there are many *better* tools that actually offer *higher* levels of clean abstraction while also providing a lot more safety than the Python that Porky is so fond of, yet he defends Python like his life depended on it. I'm all for optimising for programmer productivity where it makes sense, but I find Python a bad-to-average *high-level tool* for that job (just as I still find C++ a really bad tool for *high-level* programming; it's fine for very low-level work, of course).

The meme he keeps repeating ("only shit programmers need type-checking so get gud lol") is born out of pure ignorance, so maybe just stop perpetuating it.
 

Orud

Learned
Patron
Joined
May 2, 2021
Messages
324
Strap Yourselves In
I agree completely with the Python remarks. I find its main strengths to be an easy syntax (if you use a proper editor to avoid annoying shit like whitespace errors), which allows non-dedicated programmers (e.g. infra people) to maintain the scripts, but it gets very unwieldy past a certain project size (and yes, it also has quite a noticeable performance hit).

As for the last statement, there's a good reason typed languages have been rapidly stomping weakly typed languages in popularity these past 10 years. It's a strong 'tool' that reduces bugs, reduces bug-detection time, and vastly improves project readability by offering a sort of 'contract' to the callers of your piece of logic. For small projects weakly typed languages are okay, I guess, but even with something as simple as large functions, the difference in readability compared to typed languages is immediately noticeable. And that's been my main concern in my day job this past decade.
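To make that 'contract' concrete, here is a made-up Python function in the optional-annotation style: the signature states what callers must pass and what they get back, and a checker such as mypy can enforce it before the code ever runs.

```python
def apply_discount(price: float, percent: float) -> float:
    """The signature is the contract: callers can read it, and a
    type checker such as mypy can verify call sites against it
    instead of a comment going stale."""
    return price * (1.0 - percent / 100.0)

# A correct call just works:
sale = apply_discount(200.0, 25.0)

# apply_discount("200", 25) would only blow up at run time in
# untyped code; with the annotations above, mypy reports the bad
# argument at check time, before the program is ever executed.
```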
 

Rincewind

Magister
Joined
Feb 8, 2020
Messages
1,082
Location
down under
I agree completely with the Python remarks. I find its main strengths to be an easy syntax

I actually like Python syntax a lot. One of my favourite languages, Nim, uses the exact same significant-whitespace indentation rule. In fact, Python was one of the inspirations for Nim, but with Nim you get C/C++-level performance and native binaries. Hence it has replaced Python for many people; I even use it for scripts, since it compiles super fast.

As for the last statement, there's a good reason typed languages have been rapidly stomping weakly typed languages in popularity these past 10 years. It's a strong 'tool' that reduces bugs, reduces bug-detection time, and vastly improves project readability by offering a sort of 'contract' to the callers of your piece of logic.
Well, I once loved Python (more than 10 years ago) and wrote a medium-sized hobby project in it. The problem was that after a while even I could not keep track of which function expected which arguments, especially after putting the project aside for a few weeks. So I ended up doing what most people do: documenting what the functions expect in comments. Guess what: with strict typing you're doing the same thing, but the compiler does the work of checking the types for you!

I like LISP/Scheme, but I would be utterly terrified of working on a LISP project at work with a group of people; there's just way too much freedom to do things no other compiled language lets you do.

Anyway, these days I value a well-thought-out but restricted set of features a lot more than "ultimate freedom". It goes a long way towards making a *team* of programmers work well together, and things that might give you some speed boost during the prototyping phase can become a huge burden for production code and long-term maintenance.

It's like clay (LISP) versus precision-milled stainless-steel components (C++/Kotlin/typed languages). Sure, the latter takes longer to produce, but it's the real deal.
 
Joined
Dec 17, 2013
Messages
3,658
To me, a lot of you guys' feelings toward Python are similar to the arrival of firearms in late medieval Europe. Overnight, any peasant could pick up a musket/arquebus and easily kill a knight who had trained for warfare his entire life and owned expensive equipment. And professional programmers like you seem to be intimidated by Python the same way, i.e. any non-programmer can pick up Python and start coding world-changing programs, while you spent decades learning the ins and outs of C++ or whatever.

But it's a little silly, imho. Despite how you (or medieval knights) feel about it, the most powerful paradigm will prevail, and Python has been the #1 or #2 most popular language every year for at least a decade, and its popularity continues to grow. Firearms didn't prevail just because peasants could use them; they were more powerful in general than earlier weapons, and Python is a lot more powerful in terms of programmer productivity and flexibility than the alternatives right now.

Also keep in mind that while any peasant can use a firearm, there is still a huge difference between a farmer holding a rifle and a Navy SEAL. Same thing with Python: almost anyone can use it for simple stuff, but someone good can do amazing things with it.
 

Rincewind

Magister
Joined
Feb 8, 2020
Messages
1,082
Location
down under
On a friendlier note, PorkyThePaladin, if you like Python's dynamic typing, you might want to check out structural typing in TypeScript, which is kind of a mixture of strict typing and fully dynamic typing. It's a much more formalised and sane version of Python's duck typing, one that can be checked by the compiler.
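Worth noting that Python itself has since grown a formalised version of the same idea: `typing.Protocol` (PEP 544) gives structural, statically checkable duck typing, much like TypeScript's. A minimal sketch with made-up classes:

```python
from typing import Protocol

class Quacks(Protocol):
    def quack(self) -> str: ...

class Duck:
    def quack(self) -> str:
        return "quack"

class Robot:
    def quack(self) -> str:
        return "beep"

def provoke(d: Quacks) -> str:
    # Duck and Robot never declare that they implement Quacks;
    # having a matching quack() method is enough. That's the
    # structural part, and mypy checks it statically.
    return d.quack()

sounds = [provoke(Duck()), provoke(Robot())]
```

Passing an object without a matching `quack()` method still runs under CPython, but a checker like mypy flags the call site, which is exactly the "formalised duck typing" trade-off described above.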
 

Rincewind

Magister
Joined
Feb 8, 2020
Messages
1,082
Location
down under
To me, a lot of you guys' feelings toward Python are similar to the arrival of firearms in late medieval Europe.

You're not comprehending what we're saying to you.
 

Orud

Learned
Patron
Joined
May 2, 2021
Messages
324
Strap Yourselves In
We're not saying Python doesn't have its uses, but it's not a silver bullet for every problem. Nothing is. PHP is still the most used language on the web; that doesn't mean you should use it for every web app out there.

When all you have is a hammer...
 

Urthor

Liturgist
Patron
Joined
Mar 22, 2015
Messages
1,814
Pillars of Eternity 2: Deadfire
At some point the industry will stop pretending object-oriented programming was a good idea. But even before that happens, it will be universally acknowledged that all languages should have type safety (probably via a linter for code that isn't compiled).
What's wrong with OOP? I keep hearing complaints but so far I have not seen a concise, reasonable argument

OOP makes excellent *interfaces* for human beings.

It's really convenient *to use* objects.

Creating Java-style *implementations* of software (private and public data, class inheritance, object factories) is a complete fucking nightmare for 90% of programs. It's *useless complexity*.

Most programs only need a main method calling a function, which calls another function. Then take that and package it into an object's class method.

Only the MOST COMPLICATED applications, GUIs with buttons everywhere, need or desire full-blown OO programming.

And only 1% of those *incredibly complex programs* (ironically, programming languages themselves, like Python, fall into this category) need class inheritance and metaclasses.
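The "main method calling a function" point can be sketched like this (made-up names). Both versions compute the same result; the class-based one adds ceremony without adding capability:

```python
# Procedural version: a couple of functions and you're done.
def parse(line: str) -> int:
    return int(line.strip())

def total(lines: list[str]) -> int:
    return sum(parse(l) for l in lines)

# "Enterprise OO" version: the same logic wrapped in classes,
# constructor injection, and an extra level of indirection.
class LineParser:
    def parse(self, line: str) -> int:
        return int(line.strip())

class TotalCalculator:
    def __init__(self, parser: LineParser):
        self._parser = parser

    def total(self, lines: list[str]) -> int:
        return sum(self._parser.parse(l) for l in lines)

lines = [" 1", "2 ", " 3 "]
assert total(lines) == TotalCalculator(LineParser()).total(lines) == 6
```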


We're not saying Python doesn't have its uses, but it's not a silver bullet for every problem. Nothing is. PHP is still the most used language on the web; that doesn't mean you should use it for every web app out there.

It also has to be said.

Dynamic programming languages were a LOT more useful 20 years ago. Compile times took *ages* on a Pentium 4 with a magnetic disk and 128MB of RAM.

Interpreted programming was genuinely amazing back then.

Nowadays:

Python is 3 things:

Syntax. Indents are brilliant.

Dynamic typing, no explicit typing.

Pip. Package management is INSANELY useful for saving you time. Python has a GREAT package manager (compared to everything else that isn't Rust or Ruby, IMO). Anyone who has ever used Rust will understand. By comparison, everything involving packaging prebuilt binaries in C/C++ is fucking atrocious. Ditto, namespaces in C/C++ are awful.

IMO, interpreted programming has outlived its usefulness. Static compilation is too fast now; hardware advances have genuinely made interpretation not worthwhile.

Everything will migrate to inferred types like Scala/Kotlin in the next 15-20 years. Then there will be an IDE setting to display said inferred types.
 
Last edited:
Joined
Dec 17, 2013
Messages
3,658
We're not saying Python doesn't have its uses, but it's not a silver bullet for every problem. Nothing is. PHP is still the most used language on the web; that doesn't mean you should use it for every web app out there.

When all you have is a hammer...

That's been Rincewind's and others' strawman all along. I never said Python was for everything; in fact, I've said the opposite. I just find it funny how they keep shitting on it when it's the most popular (or 2nd most popular) language out there every year.

Reminds me of that joke: There are two kinds of languages, the ones everyone complains about, and the ones nobody uses. :)
 

raw

Conscript
Patron
Vatnik
Joined
Nov 1, 2008
Messages
19,002
Location
Barracks
PC RPG Website of the Year, 2015
At some point the industry will stop pretending object-oriented programming was a good idea. But even before that happens, it will be universally acknowledged that all languages should have type safety (probably via a linter for code that isn't compiled).
What's wrong with OOP? I keep hearing complaints but so far I have not seen a concise, reasonable argument
OOP is just one style of writing a program or a piece of it, like the rhyme scheme in lyrics. Likewise, it doesn't fucking matter unless there's a good reason for it to be there at that specific spot in the program. But it's a thing I won't even discuss with someone who hasn't shipped at least a couple of million worth of systems relying on embedded hardware. Because, just as when writing lyrics, especially when you're starting out, you should be more concerned with getting whatever message you have across than with wasting your time on whether it's OK to use ABAB from line 5 to 15. And by the time you have to care, you know what you're doing.

Most of the people invested in these sorts of topics are full of shit as well. Sometimes I like to ask stupid shit like 'What do you think of $BUZZWORD and when would you use it?' during interviews so I can flush these people out.
 

Hirato

Purse-Owner
Patron
Joined
Oct 16, 2010
Messages
3,760
Location
Australia
Codex 2012 Codex USB, 2014 Shadorwun: Hong Kong
Video games belong, to an extent, to performance-critical software and require a different approach compared to common business-case programming; e.g. make fewer method calls, prefer manual memory control over garbage collection (or end up with Unity's performance/resource needs), etc. And even that area has become more and more forgiving as time has passed and processing power has grown (again, the fact that Unity is what it is, is a perfect example of this).
Now that's just unfair to those high level languages.
Unity3D is a special kind of cursed and unoptimised that makes my computer cry in distress just trying to render a basic 2D menu.
I don't know how they managed it, but its CPU requirements scale exponentially with the number of CPU cores you have; it almost feels like they're spawning a thread per core and having them all use spinlocks to fight over the same work queue.
I genuinely have to run commands like echo 0 > /sys/devices/system/cpu/cpu{{8..15},{24..31}}/online to make them playable.
Seriously, turning off half my CPU more than halves the CPU usage, and boosts framerates from single-digit to playable.
 

Urthor

Liturgist
Patron
Joined
Mar 22, 2015
Messages
1,814
Pillars of Eternity 2: Deadfire
The big thing about arguing over programming languages is that you're not actually arguing about programming languages.

You're arguing about features of programming language ecosystems.

IMO, the features that matter are

- Runtime. How easy it is to install for the first time, compile, and get running; how easy it is to upgrade and downgrade. Scala is atrocious. Ruby is extremely annoying to install but, IMO, extremely easy to upgrade and downgrade; Python is the complete opposite of Ruby. Rust is better than C/C++ here because it's less buggy.
- Package Management. How easy it is to make other people do your work for you via "Cargo do the thing."
- Syntax. How good the code looks.

Syntax is 99.9% a "how pretty is this" question. Programmers stare at code all day, they decide whether it's attractive or not.

They get opinions, then they lie, invent "scientific-sounding reasons", and justify their opinion with facts.

IMO, Scala is beautiful, Python is beautiful, Ruby is passable, Rust is passable, C/Java/C++ are horrible. Indents are more beautiful than curly braces.

Some people like curly braces and will defend C++ to the death. I respect their opinion. My experience is the majority of the human race does not agree.

Things that don't matter:

- How fast it runs. This doesn't matter. Any important code that must run quickly is written by a savant talented enough to code in Brainfuck. Most code does not need to be quick, hence speed doesn't matter. Rust's advantages are: #1 Cargo is great, #2 I don't have to deal with C++, #3 memory safety means fewer bugs. Rust is NOT faster than C++, at all.
- Doing really well at one of the first three things and badly at the other two. Almost all successful programming languages are 7/10 on all three. A fancy math person could probably rank programming languages by their standard deviation from the mean on those three metrics.

The exceptions are C/C++, but since Rust arrived, all C++ is legacy software. And I expect Zig to make C legacy software in 10-20 years; Zig is a LOT better than C across all of those issues.
 
Last edited:

Rincewind

Magister
Joined
Feb 8, 2020
Messages
1,082
Location
down under
Sometimes I like to ask stupid shit like 'What do you think of $BUZZWORD and when would you use it' during interviews so I can flush these people out.

Similarly, I like FP in general, but I like to ask this question in interviews: "What are the drawbacks of the FP style?" It weeds out the FP zealots, and only the pragmatic people remain, the ones who aren't afraid of local mutation when it's the best choice.
 

Rincewind

Magister
Joined
Feb 8, 2020
Messages
1,082
Location
down under
The exceptions are C/C++, but since Rust arrived, all C++ is legacy software. And I expect Zig to make C legacy software in 10-20 years; Zig is a LOT better than C across all of those issues.
I would add Nim to the mix, my current "better C/C++" pick for all my hobby projects.
 

Rincewind

Magister
Joined
Feb 8, 2020
Messages
1,082
Location
down under
If only they used Python...

https://chapel-lang.org/

Chapel is a modern programming language, developed by Cray Inc., that supports HPC via high-level abstractions for data parallelism and task parallelism. These abstractions allow the users to express parallel codes in a natural, almost intuitive, manner. In contrast with other high-level parallel languages, however, Chapel was designed around a multi-resolution philosophy. This means that users can incrementally add more detail to their original code prototype, to optimise it to a particular computer as closely as required.

In a nutshell, with Chapel we can write parallel code with the simplicity and readability of scripting languages such as Python or MATLAB, but achieving performance comparable to compiled languages like C or Fortran (+ traditional parallel libraries such as MPI or OpenMP).

In this lesson we will learn the basic elements and syntax of the language; then we will study task parallelism, the first level of parallelism in Chapel, and finally we will use parallel data structures and data parallelism, which is the higher level of abstraction, in parallel programming, offered by Chapel.
(source)
 

Urthor

Liturgist
Patron
Joined
Mar 22, 2015
Messages
1,814
Pillars of Eternity 2: Deadfire
The exceptions are C/C++, but since Rust arrived, all C++ is legacy software. And I expect Zig to make C legacy software in 10-20 years; Zig is a LOT better than C across all of those issues.
I would add Nim to the mix, my current "better C/C++" pick for all my hobby projects.

Nim was a candidate, but it's not going anywhere, I'd say. Everyone has coalesced around the other two, IMO.

- How fast it runs. This doesn't matter.


It's ironic that you talk about shipping millions of embedded devices, then misunderstand the exact same point slightly later.

Most things are not done properly, to spec, with TLA+ and so on, by people who actually ship the real thing commercially. For the people doing the real thing, languages don't matter as much because you've got commercial-quality verification.

But for a programming language to be adopted, it has to cater to the people who aren't doing that. For them, close enough is good enough in terms of speed/performance in the vast majority of cases.

The dumb programmers are the ones out there fixing the package manager's CLI and writing documentation.
 
Last edited:
Joined
Dec 17, 2013
Messages
3,658
You can't evaluate a language based only on how good it is; you also have to take its popularity into consideration. Popularity is what creates the third-party libraries, the online questions and answers, and the other resources. Without that, even if you have the greatest language ever, it will be next to useless.

University professors tend to do this a lot, since they are not burdened by such things as actually making stuff.

Nim might be great in the future, or it might die out. But for now, stick to the popular stuff unless you are just working on some hobby projects.
 