
Vapourware Codexian Game Development Thread

Joined
Dec 24, 2018
Messages
1,964
I fucked up. The calculations I used were getting acceptable population densities, but the model itself fundamentally won't work. I realized this when it came time to begin assembling recipes for farms - if I apply the current penalties / bonuses from tile climate/terrain/soil factors, then farms aren't going to produce enough to feed themselves. I need to re-approach the situation and probably find a way of calculating "number of farms in a tile" first and then other productivity modifiers later. I think the current system I have can mostly be retained to get some kind of "life rating" score which I might retain / reuse for nomadic & hunter-gatherer pops, but farm generation needs to be redone.
 

Eisen

Learned
Joined
Apr 21, 2020
Messages
766
I am creating a 3D low-poly game in Godot; however, I remember creating games in Unity being much better. The last time I used it was in 2018.
How is Unity nowadays? Is performance still terrible?
 

GoblinGrotto

Novice
Joined
Jan 11, 2021
Messages
21
Generation subsystem for the grand strategy game I'm working on. I've described parts of it earlier in the thread so won't go into a huge amount of detail, but the general idea is that as things evolve, such as preferred ratios of this operation or that operation, or the placement of cultures, or whatever, I don't want to have to go and manually edit populations, as it would be an enormous workload. So instead there are user-defined parameters such as which cultures should appear where, and then a generation process that creates operations, and the populations to staff them. This allows the contents of the tiles to be replaced quickly when parameters change - although it does reduce precise control somewhat. I may add a manual override system of some kind later - but that's the sort of thing that would be more applicable to a historical game, and while the engine is being designed for Earth-like worlds populated by humans, and should be well suited to a historical campaign eventually, the stock campaign is in a fictional world and I will not be making a historical campaign until after the game is in a release state, if at all.

Sounds cool, also like a lot of work. Is it fantasy based or more rooted in reality?

I am creating a 3D low poly game in Godot, however, i remember creating games in Unity being much better, last time i used was 2018.
How is Unity nowadays? Is performance still terrible?

I'd say it's fine if you know how to work around its quirks. For example, avoid MonoBehaviour Update calls where you can and lean on as little of Unity's built-in machinery as possible.



In other news. I've made blood. Then I optimized the blood. Now I have a shit ton of blood, it's probably too much but I kind of like it.

 
Joined
Dec 24, 2018
Messages
1,964
Switched to a kind of scaled area system where each tile has different scaling factors for operations whose representative area per scale unit is expected to be variable (so, agriculture, subsistence, and forestry, mainly) and then used that for generation control (it can also be reused during gameplay to calculate used vs available space), effectively allowing me to have variable-sized operations without them really being variable. Seemed to work. I want to detail it a bit more though so that agricultural cultures will still generate some hunter-gatherers, representing general woodsmen and the like, partly to cover areas that are outright inhospitable to agriculture but also just for expected flavour (and to leave some space available for agricultural expansion, gradually taking space away from hunter-gatherer stencils as the region develops).
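For illustration only, a minimal sketch of the scaled-area idea described above. All names and numbers here are invented, not taken from the actual project: the point is just that each tile carries per-operation-type scaling factors, and "used vs available space" falls out of simple arithmetic.

```cpp
// Hypothetical per-tile scaling: each operation type consumes a
// different amount of abstract "space" per scale unit, so operations
// can vary in effective size without being variable-sized objects.
struct Tile {
    double total_space;           // abstract area units in the tile
    double farm_area_per_unit;    // area consumed per farm scale unit
    double forest_area_per_unit;  // area consumed per forestry scale unit
};

// Space still free after the listed operations are placed.
double available_space(const Tile& t, int farm_units, int forest_units) {
    return t.total_space
         - farm_units   * t.farm_area_per_unit
         - forest_units * t.forest_area_per_unit;
}

// How many more farm scale units would still fit in the remainder?
int max_extra_farms(const Tile& t, int farm_units, int forest_units) {
    double free_space = available_space(t, farm_units, forest_units);
    return free_space > 0
         ? static_cast<int>(free_space / t.farm_area_per_unit)
         : 0;
}
```

The same numbers can drive both generation (how many operations to stencil in) and gameplay-time expansion checks, which is presumably why the system is reusable.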
I also started painting soils instead of everything being Regosols. Probably not going to go super detailed especially since climates still need a lot of work / fine-tuning, and they affect terrain, which affects soils, but decent for a test bed to make sure soils' impact on agriculture is working properly.

Edit: expanded generation, now certain ecological adaptive models can get a bleed-through from others as described above. Enables agricultural societies to get a bit of settled pastoralism, settled pastoralism to get a sample of the agricultural set (so agriculture and settled pastoralism both share the same main set of operations, they're just weighted heavily towards certain sets), cyclic pastoralism to get a small amount of subsistence agriculture only (not the other ag forms), and all forms get a bit of foraging. This seems to work fairly well, and also helps for areas where the selected culture wouldn't otherwise work - you can see below how the northern coast is mostly pastoralists - but the societies there are actually all set to generate as agricultural. However, because the land there is very cold and the soil not very good, very little agriculture gets generated, whereas pastoralists & foragers are a bit more able to live in such conditions.

Screenshot 2025-01-19 at 06.08.06.jpg

Profession mapmode added to illustrate how this yields some overlap in generated profession types in tiles. (Rivers here are a placeholder image that's been added as a cartographic layer to make placement of gleysols and fluvisols easier in the soil editor. I do plan to add proper rivers at some point with real geometry - I'm just not entirely certain how I want to implement them as in-game objects and how they ought to interact with tiles, so it's on hold and will maybe get added after generation if I finish generation significantly before Q2 2025).

Sounds cool, also like a lot of work. Is it fantasy based or more rooted in reality?
Earthlike world. Currently with no fantasy elements, though I may or may not add some. If anything, they would be religious / magic elements - not things like orcs and elves. I have considered including one or two near-human hominid species in some of the far peripheral parts of the world (possibly the island near the northern pole, since it's extremely inhospitable to humans and might make sense as a "last refuge" of a hominid species that couldn't compete with Homo sapiens), but I'd consider that speculative fiction rather than fantasy. In any case, that's a decision to make later. I've had some difficulty deciding whether to go hardcore real-physics-only as far as lore is concerned, but I do lean in that direction because I want the game to eventually extend to a modern or near-future time period.
 
Last edited:

shihonage

Second Variety Games
Patron
Developer
Joined
Jan 10, 2008
Messages
7,201
Location
United States Of Azebarjan
Bubbles In Memoria
Is it possible to make a Fallout-style game as early access? When I think about it, it seems not.

Growing the world and features while maintaining backward compatibility for a variety of older savegame types...
 
Joined
Dec 24, 2018
Messages
1,964
Is it possible to make a Fallout-style game as early access? When I think about it, it seems not.
Early Access for primarily content-driven games (such as RPGs) is generally a bad idea, because the player will get their first experience when the game is in a sub-par form (vs the release form) and/or when content is being delivered piecemeal. It decreases the quality of their first experience; later sessions might come when the game is in a better state, but the content has already been spoiled by that earlier playthrough. Early Access works for more systems-driven games that are played over and over anyway (or where expansion of the systems leads to a drastically new experience) and where the story is not a focal point. While you can replay an RPG, and most people, I think, generally do, the most intense and enjoyable experience is usually the first playthrough. On later playthroughs you already know the main story beats and characters, even if you make different choices. It just isn't the same as the first time.

That being said, I'm assuming this is for SHELTER, which you've already worked on for a very long time - if what's getting at you is that you need to either start getting sales or abandon the game, well, go EA rather than abandoning it. Even if EA is typically a bad move for content-driven games, if your alternative is cancellation, then that automatically makes EA the better choice. (But don't fall into the trap of trying to support a gazillion older versions of save games during Early Access - for minor changes, sure, but be ready to tell players "you should probably start a new save" frequently. Otherwise it'll be too much work just maintaining support for old versions. Support one or two patches back, and if players aren't regularly loading into the new patch and saving in the new format, they can start a fresh game. It's Early Access; nobody can expect not to have to restart many times.)
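One common way to implement a "support one or two patches back" policy is a version field checked at load time. A hypothetical sketch - version numbers and names are invented, not any particular engine's format:

```cpp
// Hypothetical save-version policy: the current version loads
// directly, the previous couple of versions get migrated forward,
// and anything older is rejected with a "start a new game" message.
constexpr int kCurrentSaveVersion = 7;
constexpr int kOldestSupportedVersion = 5;  // current minus two patches

enum class LoadResult { Ok, Migrated, TooOld };

LoadResult check_save_version(int save_version) {
    if (save_version == kCurrentSaveVersion) return LoadResult::Ok;
    if (save_version >= kOldestSupportedVersion) return LoadResult::Migrated;
    return LoadResult::TooOld;  // tell the player to start fresh
}
```

The design choice here is that migration code only ever has to bridge a small, fixed window of versions, so old migration paths can be deleted as the window slides forward.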
 

RobotSquirrel

Arcane
Developer
Joined
Aug 9, 2020
Messages
2,309
Location
Adelaide
Is it possible to make a Fallout-style game as early access? When I think about it, it seems not.

Growing the world and features while maintaining backward compatibility for a variety of older savegame types...
Do a Fallout Tactics: separate your narrative and your combat. Have a battle mode, and use that version of the game to subsidize the narrative version. (Tactics' multiplayer was basically this.)
This also lets you experiment with one without affecting the other. I'm of the belief that ideally narrative games should be hidden until they're finished.
But there's no harm in letting the player indulge in some of the mechanics - go with your traditional modes: CTF, deathmatch, teams, etc. Whatever you can do to provide that recurring value to the player up until your narrative title is ready to go. Plus you're building a community, and you're playtesting the mechanics.

One of the biggest problems I see with new developers is that hesitancy to getting the game into the hands of players, but in order to have a success you need to first have someone actually play your games and build a reputation. You don't get that by just stealth dropping your game and expecting people to buy it.
 

Tavernking

Don't believe his lies
Developer
Joined
Sep 1, 2017
Messages
1,271
Location
Australia
Pro-tip: Play game jam entries. The ones that score highly with the judges. So many times I've been reluctant to add a feature but when I see that some amateurs did it in a 48 hour game jam there's really no excuse. And every time it ends up being an easy feature to add.
 

Hag

Arbiter
Patron
Joined
Nov 25, 2020
Messages
2,592
Location
Breizh
Codex Year of the Donut Codex+ Now Streaming! Enjoy the Revolution! Another revolution around the sun that is.
I've been reading through this nice website on programming patterns and I can highly recommend it if you're not a professional developer. It's very well written and concise, the concepts are laid out nicely, and the code is C++ but fairly trivial to understand even if, like me, you're not an adept of the language.
The part on data locality is an eye-opener - you then re-read your code and all you see are cache misses.
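The data-locality point compresses into one contrast: iterating a contiguous array touches memory in order and keeps the cache warm, while chasing scattered heap pointers does not. A toy illustration (the actual speedup depends entirely on hardware and data size):

```cpp
#include <vector>

// Cache-friendly: hot data stored contiguously, iterated in order,
// so the hardware prefetcher can stay ahead of the loop.
struct Particle { float x, y, dx, dy; };

void update_contiguous(std::vector<Particle>& ps) {
    for (auto& p : ps) {        // sequential access pattern
        p.x += p.dx;
        p.y += p.dy;
    }
}

// Cache-hostile: identical logic, but each particle lives behind its
// own pointer, so iteration hops around memory unpredictably.
void update_scattered(std::vector<Particle*>& ps) {
    for (auto* p : ps) {        // each dereference can miss the cache
        p->x += p->dx;
        p->y += p->dy;
    }
}
```

Both versions compute identical results, which is exactly why locality problems are invisible in the source and only show up in a profiler.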
 

Azdul

Magister
Joined
Nov 3, 2011
Messages
3,806
Location
Langley, Virginia
^^^
Sounds like something a modern compiler should be taking care of.
How?

There are certain guarantees that the language specification gives you, and C++ is very particular about them. It is a language made for professionals, so it won't have vastly different performance characteristics on different platforms with different compilers.

For example:
std::uint8_t some_array[SOME_NUMBER][CACHE_LINE_SIZE] will be stored as contiguous bytes in memory. No ifs or buts.

CPU designers optimize for this scenario and state in their specs what the cache line size is (typically 64 or 128 bytes) and what performance you should expect; usually the CPU will be smart enough to prefetch the next chunk when you access data sequentially, giving you for all intents and purposes 'infinite cache'.

A C++ compiler won't break the language contract to rearrange your data structures, because they may have been carefully laid out by the programmer to get huge performance gains ('infinite cache') on specific hardware.

Of course compilers of languages with less strict specifications may attempt to do that, but they usually don't bother, because it is a hard problem, and code that needs to be fast is usually written in C/C++ anyway.
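The layout contract described above can even be checked at compile time. A sketch - SOME_NUMBER and CACHE_LINE_SIZE here are arbitrary stand-in values, not anything from a real spec:

```cpp
#include <cstdint>
#include <cstddef>

constexpr std::size_t SOME_NUMBER = 64;       // arbitrary stand-ins
constexpr std::size_t CACHE_LINE_SIZE = 128;

std::uint8_t some_array[SOME_NUMBER][CACHE_LINE_SIZE];

// The guarantee, verified by the compiler: rows are stored
// back-to-back with no padding, so the 2D array is one contiguous
// block of SOME_NUMBER * CACHE_LINE_SIZE bytes.
static_assert(sizeof(some_array) == SOME_NUMBER * CACHE_LINE_SIZE,
              "2D array of bytes is contiguous");

// Equivalently at runtime: adjacent rows start exactly one row of
// bytes apart.
std::ptrdiff_t row_stride() {
    return &some_array[1][0] - &some_array[0][0];
}
```

If either check could fail, the program wouldn't compile or the assertion would fire - which is the "no ifs or buts" part in practice.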
 
Developer
Joined
Oct 26, 2016
Messages
2,573
^^^
Sounds like something a modern compiler should be taking care of.
How?

There are certain guarantees that the language specification gives you, and C++ is very particular about them. […]
C is fast. I've never known C++ to be a "fast" language. It's full of bloat crap.
 

Azdul

Magister
Joined
Nov 3, 2011
Messages
3,806
Location
Langley, Virginia
C is fast. I've never known C++ to be a "fast" language. It's full of bloat crap.
When you are writing a real-time system controlling a fighter jet, you may not want to throw exceptions or write virtual methods. When you are writing a game that needs to refresh the screen 'only' 300 times a second, you can afford all the C++ features.

Using C++ templates you can squeeze out every last bit of performance, at the cost of code segment growth.

So on machines with stupid amounts of cache, like the 9800X3D, you may reach levels of performance well above good C code.

On embedded systems C code will be faster - by virtue of fitting the whole code into cache.
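The textbook instance of that trade-off is C's qsort versus C++'s std::sort: qsort calls its comparator through a function pointer on every comparison, while std::sort is a template instantiated per element type and comparator, so the comparison can be inlined - typically faster, but a fresh chunk of machine code per instantiation. A sketch:

```cpp
#include <algorithm>
#include <cstdlib>
#include <vector>

// C style: one qsort in the binary, comparator behind an opaque
// function pointer the optimizer usually cannot see through.
int cmp_int(const void* a, const void* b) {
    int x = *static_cast<const int*>(a);
    int y = *static_cast<const int*>(b);
    return (x > y) - (x < y);
}

void sort_c(std::vector<int>& v) {
    std::qsort(v.data(), v.size(), sizeof(int), cmp_int);
}

// C++ style: std::sort instantiated for int plus this lambda; the
// comparison can be inlined into the sort loop, at the cost of extra
// generated code for every distinct instantiation.
void sort_cpp(std::vector<int>& v) {
    std::sort(v.begin(), v.end(), [](int a, int b) { return a < b; });
}
```

Same result either way; the difference is whether the comparator is a call or an inlined expression, and how many copies of the sort end up in the binary.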
 
Developer
Joined
Oct 26, 2016
Messages
2,573
When you are writing a real-time system controlling a fighter jet, you may not want to throw exceptions or write virtual methods. […] On embedded systems C code will be faster - by virtue of fitting the whole code into cache.
Drivers are not written in C++; they are written in C. Anything C++ actually "does" is written in C. All its system and graphics interop is C.

And templates are shit.
 

RobotSquirrel

Arcane
Developer
Joined
Aug 9, 2020
Messages
2,309
Location
Adelaide
C is fast. I've never known C++ to be a "fast" language. It's full of bloat crap.
I'm using C primarily because it's more readable than C++. Writing engine code in C++ just becomes an incoherent mess once you get beyond the frame buffer. Doom and Quake 1, 2, and 3 were all coded in C. It's ideal for doing software 3D, which is entirely what I'm writing. I'm having a good time with it.
 

Azdul

Magister
Joined
Nov 3, 2011
Messages
3,806
Location
Langley, Virginia
Drivers are not written in C++; they are written in C. Anything C++ actually "does" is written in C. […] And templates are shit.
I've written production Linux drivers in C. It does not differ that much from C++, except that you need to implement by hand some concepts that in C++ are part of the language.

C does not "do" anything either. You express your intent and the compiler tries to find the fastest assembly that will 'technically' do what you wanted, cheating as much as it can without getting caught.

In C++ templates you can express your intent in a more precise way, so the compiler can make stronger assumptions.

C++ templates get back the performance that pure C loses to Fortran.
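One commonly cited mechanism behind the C-vs-Fortran comparison is pointer aliasing rather than templates as such: a Fortran compiler may assume procedure arguments don't overlap, while a C compiler must assume plain pointers might, which blocks some reordering and vectorization. C99's restrict (spelled __restrict as a widely supported extension in C++ compilers) hands the compiler the same promise. A sketch - the actual codegen difference varies by compiler and flags:

```cpp
// Without an aliasing promise, the compiler must assume dst and src
// may overlap, which constrains how it can reorder and vectorize.
void scale_plain(float* dst, const float* src, int n, float k) {
    for (int i = 0; i < n; ++i) dst[i] = src[i] * k;
}

// __restrict promises the arrays never overlap, granting the
// compiler Fortran-like freedom over this loop.
void scale_restrict(float* __restrict dst, const float* __restrict src,
                    int n, float k) {
    for (int i = 0; i < n; ++i) dst[i] = src[i] * k;
}
```

Both functions behave identically when the promise holds; passing overlapping arrays to the restrict version is undefined behavior, which is exactly the contract the compiler exploits.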
 
Developer
Joined
Oct 26, 2016
Messages
2,573
I've written production Linux drivers in C. […] C++ templates get back the performance that pure C loses to Fortran.
Embedded C interacts directly with hardware. I am changing physical properties of the hardware. So C actually does something of itself.

C++, on the other hand, sits on many layers of abstraction and in turn has many whacko abstractions of its own. It's disastrous bloatware, and even its author(s) admit that.

C++ is just about safe from extinction due to the marginal niches it serves. Outside those niches it's just a terrible language to work with.
 

Azdul

Magister
Joined
Nov 3, 2011
Messages
3,806
Location
Langley, Virginia
Embedded C interacts directly with hardware. I am changing physical properties of the hardware. […] Outside those niches it's just a terrible language to work with.
Once you get experience, a programming language is just a way to get the assembly code / microcode you want for a given architecture.

If you are fine with Reverse Polish Notation, it's Forth. If you are Linus Torvalds, it's the GCC variant of C and Rust. If you started with the IBM 360 and are still in the business of writing performant code, it's Fortran 77 or Fortran 2008. If you learned C decades ago, it's the C99 standard, which is not terrible.

C++ is a language that Fortune 50 corporations have invested a lot of money in, funding an ISO committee with hundreds of PhDs who will not allow the newbie mistakes that Rust, Python, Perl, Ruby, or Java are known for. The C++ language moves ahead slowly, following new architecture paradigms, but not at such a glacial pace as pure C.

Choose your poison as you like, and if you choose wrong but still write anything worth preserving for the future - some people will make a nice living rewriting your spaghetti code into a language / notation that will be relevant in the next few decades.
 

Viata

Arcane
Joined
Nov 11, 2014
Messages
9,915
Location
Water Play Catarinense
You express your intent and compiler tries to find the fastest assembly that will 'technically' do what you wanted, cheating as much as it can without getting caught
Sadly, sometimes the C compiler will optimize out code that I do not want it to remove, forcing me to use the volatile keyword. Obviously, this has only been a problem for me in real-time systems.
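The standard illustration: without volatile, the compiler may cache a value in a register or delete "redundant" reads, since nothing in the visible code changes it; volatile forces every access to actually happen. A minimal sketch in C++ (the same keyword exists in C) - real uses are memory-mapped registers or, with caveats, flags touched by interrupt handlers:

```cpp
#include <cstdint>

// Pretend this is a hardware status register. 'volatile' tells the
// compiler every read must really occur - it cannot cache the value
// or coalesce consecutive loads.
volatile std::uint32_t status_register = 0;

// Reads the register twice; with volatile, both loads are emitted
// even though nothing in this function writes between them.
std::uint32_t read_twice() {
    std::uint32_t a = status_register;
    std::uint32_t b = status_register;
    return a + b;
}
```

Note that volatile only prevents the compiler from eliding accesses; it is not a substitute for atomics or memory barriers when multiple threads are involved.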
 
