
Decline Imported Universal Render Pipeline into Unity 2019.4.x and it ruined 4 projects!!!

Cleveland Mark Blakemore

Golden Era Games
Übermensch Developer
Joined
Apr 22, 2008
Messages
11,557
Location
LAND OF THE FREE & HOME OF THE BRAVE
I feel sick. It crushed four of my Unity games that were looking just fine.

I assumed Unity didn't pull stuff like this ... but they have changed. I know they did this to a lot of other people as well. It makes me feel sick to my stomach. My last backups were two weeks ago.

Going from 19.3 to 19.4 LTS just gutted four 3d games I was working on. Mucked up the shaders, destroyed the lightmapping, crashed 3/4s of the assets.

I have tried everything and Unity support has just lost the plot on this switch away from Enlighten. They've lost their minds. I've been an advocate since 2010 and they never pulled anything like this before. The Enlighten engine worked great and has gotten better with time; I have no idea why they would rip their pipeline to shreds and then just crap on the rug by issuing it without much of a warning to devs.
 

Cleveland Mark Blakemore

Golden Era Games
Übermensch Developer
Joined
Apr 22, 2008
Messages
11,557
Location
LAND OF THE FREE & HOME OF THE BRAVE
Oh shit, I am sorry

If you've ever done anything on a computer, you know that testicle-curling feeling you get when you have suffered a major loss and have not done a proper backup in a while? Imagine that times a thousand. I know I should have done more frequent backups, but I never saw that coming. I always thought I would be able to repair it, but the import just totally trashed everything. It was like a firetruck driven by drunken Unity staff crashed through the front of my house while they were whooping and hollering, and they proceeded to throw firebombs into the interior even after I miraculously survived.
 

Cleveland Mark Blakemore

Golden Era Games
Übermensch Developer
Joined
Apr 22, 2008
Messages
11,557
Location
LAND OF THE FREE & HOME OF THE BRAVE
I'm not giving up. I have many prototypes still in 2019.3 and I am just going to pick one up and continue. You know what? If anything, I am going to look for a blobber to start on. I think my original Lovecraft blobber proof-of-concept in Unity 3D is still intact, with a good start on an Innsmouth-type village you blobber about in and can open crates.
 

RPK

Scholar
Joined
Apr 25, 2017
Messages
337
how in the world did you not take a backup right before doing something major like that? or use source control?
Unity does not play well with Git or other version control systems without a ton of effort. It's a total shitshow.

git bash works just fine with Unity. not sure what you're doing that's making it so difficult.
 

DoomIhlVaria

Cipher
Patron
Joined
Nov 1, 2009
Messages
373
Location
Hell's Waiting Room
Make the Codex Great Again! Insert Title Here I'm very into cock and ball torture

RPK

Scholar
Joined
Apr 25, 2017
Messages
337
how in the world did you not take a backup right before doing something major like that? or use source control?
Unity does not play well with Git or other version control systems without a ton of effort. It's a total shitshow.

git bash works just fine with Unity. not sure what you're doing that's making it so difficult.
https://thoughtbot.com/blog/how-to-git-with-unity

correct usage of the .gitignore file is fundamental to using git. anyone doing c# development knows this.

any large stuff like textures and 3d models shouldn't be stored in version control anyway.
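
For what it's worth, the usual Unity .gitignore boils down to ignoring everything the editor can regenerate and committing everything it can't. A trimmed sketch (patterns follow the commonly shared Unity .gitignore templates; adjust for your own project):

```
# Unity-generated folders - the editor rebuilds these, keep them out of the repo
[Ll]ibrary/
[Tt]emp/
[Oo]bj/
[Bb]uild/
[Ll]ogs/
[Uu]serSettings/

# IDE solution/project files regenerated by the editor
*.csproj
*.sln

# Do NOT ignore Assets/ or ProjectSettings/ - those ARE the project,
# and the .meta files next to each asset must be committed too
```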
 

Bad Sector

Arcane
Patron
Joined
Mar 25, 2012
Messages
2,223
Insert Title Here RPG Wokedex Codex Year of the Donut Codex+ Now Streaming! Steve gets a Kidney but I don't even get a tag.
any large stuff like textures and 3d models shouldn't be stored in git anyway.

FTFY, git is garbage for large files but that is an issue with git, not an issue with version control in general. If you are making a game then you must have all files on version control - code and data - because these (especially on games) are tied to each other and you certainly do not want to try and go back several versions when trying to figure out when a bug was introduced in the codebase while having data that may actually be what triggers the bug, or even worse, have data that does not work with the older code at all. Especially for games data are as important as code and you do want to be able to both go back in time and check different versions, revert older changes and create branches for features that may or may not work.

Sadly there isn't a free VCS that is good with handling big files like those you'd find in games, but Subversion should be good enough for smaller projects - i've used it at a gamedev job ~10 years ago where everything was on Subversion. It does tend to puke on the carpet when faced with multigigabyte checkouts though.
 

RPK

Scholar
Joined
Apr 25, 2017
Messages
337
any large stuff like textures and 3d models shouldn't be stored in git anyway.

FTFY, git is garbage for large files but that is an issue with git, not an issue with version control in general. If you are making a game then you must have all files on version control - code and data - because these (especially on games) are tied to each other and you certainly do not want to try and go back several versions when trying to figure out when a bug was introduced in the codebase while having data that may actually be what triggers the bug, or even worse, have data that does not work with the older code at all. Especially for games data are as important as code and you do want to be able to both go back in time and check different versions, revert older changes and create branches for features that may or may not work.

Sadly there isn't a free VCS that is good with handling big files like those you'd find in games, but Subversion should be good enough for smaller projects - i've used it at a gamedev job ~10 years ago where everything was on Subversion. It does tend to puke on the carpet when faced with multigigabyte checkouts though.

yes, that's fair. I was speaking from a frame of reference of things that don't change - in my world, 3d models, textures etc don't. I've commissioned a few things for a personal project but most of them are from asset stores, be it cgtrader, fiverr or the much maligned unity asset store. these things don't change for me, so I just stick them in a folder that isn't tracked by version control and if I need to recover them some day, I'll just redownload.

Since I have an MSDN account through work, I used to use TFS for version control. It truly was shit for Unity. Although you could put any massive file you wanted in there, it sucked ass for tracking whether a file was changed/moved or actually new.

come to think of it, I do have a few textures in source control that I've hacked/kitbashed to apply to existing models to make them at least a little unique for my use.
 

Gregz

Arcane
Joined
Jul 31, 2011
Messages
8,511
Location
The Desert Wasteland
Oh shit, I am sorry

If you've ever done anything on a computer, you know that testicle-curling feeling you get when you have suffered a major loss and have not done a proper backup in a while? Imagine that times a thousand. I know I should have done more frequent backups, but I never saw that coming. I always thought I would be able to repair it, but the import just totally trashed everything. It was like a firetruck driven by drunken Unity staff crashed through the front of my house while they were whooping and hollering, and they proceeded to throw firebombs into the interior even after I miraculously survived.

Automate your backups nightly to a NAS. There's no excuse not to these days, storage is dirt cheap.
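
A minimal sketch of such a nightly job, assuming rsync and a mounted NAS share. All paths here are made up; the temp directories below just stand in for the real project folder and the NAS mount point:

```shell
#!/bin/sh
# Sketch of a nightly backup job. In real use SRC would be your projects
# folder (e.g. ~/UnityProjects) and DST a NAS mount (e.g. /mnt/nas/backups).
SRC=$(mktemp -d)   # stand-in for the projects folder
DST=$(mktemp -d)   # stand-in for the NAS mount

# Fake project contents so the sketch runs end to end
echo 'scene data' > "$SRC/level1.unity"
mkdir -p "$SRC/Library" && echo 'cache' > "$SRC/Library/junk"

# -a preserves permissions/timestamps, --delete mirrors deletions;
# Library/ and Temp/ are Unity caches the editor regenerates anyway
rsync -a --delete --exclude 'Library/' --exclude 'Temp/' "$SRC/" "$DST/"

ls "$DST"    # level1.unity is copied; Library/ is skipped
# To run nightly at 03:00, add to crontab: 0 3 * * * /path/to/this/script
```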
 

DraQ

Arcane
Joined
Oct 24, 2007
Messages
32,828
Location
Chrząszczyżewoszyce, powiat Łękołody
FTFY, git is garbage for large files but that is an issue with git, not an issue with version control in general. If you are making a game then you must have all files on version control - code and data - because these (especially on games) are tied to each other and you certainly do not want to try and go back several versions when trying to figure out when a bug was introduced in the codebase while having data that may actually be what triggers the bug, or even worse, have data that does not work with the older code at all. Especially for games data are as important as code and you do want to be able to both go back in time and check different versions, revert older changes and create branches for features that may or may not work.

Sadly there isn't a free VCS that is good with handling big files like those you'd find in games, but Subversion should be good enough for smaller projects - i've used it at a gamedev job ~10 years ago where everything was on Subversion. It does tend to puke on the carpet when faced with multigigabyte checkouts though.
SVN is a humongous steaming turd and I don't understand how anyone could ever recommend it given almost any existing alternatives.
And just because git handles binaries poorly and you generally shouldn't put them in git, doesn't mean that:
  1. you can't put them in git if you really want to
  2. you can't put them somewhere else while using git for all that remains
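
On point 2, the common middle ground without leaving git entirely is Git LFS: you run `git lfs track` for the heavy asset types, git then stores lightweight pointers in the repo, and the actual blobs go to LFS storage. The tracking patterns end up in a .gitattributes file along these lines (the extensions are just examples):

```
# .gitattributes as written by `git lfs track` - example patterns only
*.psd  filter=lfs diff=lfs merge=lfs -text
*.fbx  filter=lfs diff=lfs merge=lfs -text
*.png  filter=lfs diff=lfs merge=lfs -text
*.wav  filter=lfs diff=lfs merge=lfs -text
```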
 

Atrachasis

Augur
Joined
Apr 11, 2007
Messages
203
Location
The Local Group
OK, folks, I am genuinely curious and would hope to be educated by you (and condolences to Cleve...). My idea of software development is typing code into vi (that was a lie, actually I'm part of the unwashed EMACS masses) and then entering 'g++' into bash, linking against any external libraries as needed.

What is the mechanism by which an update of external software can actually mess up your own code and data? I'd understand API calls not functioning properly any more, but it sounds like Cleve's own source code and assets were somehow altered - how does that work?
 

Bad Sector

Arcane
Patron
Joined
Mar 25, 2012
Messages
2,223
Insert Title Here RPG Wokedex Codex Year of the Donut Codex+ Now Streaming! Steve gets a Kidney but I don't even get a tag.
SVN is a humongous steaming turd and I don't understand how anyone could ever recommend it given almost any existing alternatives.

The main issue Subversion has is that it keeps a second unmodified copy of files in your work directory under those ".svn" subdirectories which isn't really necessary. It does have a few other minor issues, but it is far from a "steaming turd". I'm not sure why you'd even say that.

But the thing is, there are no better alternatives. All VCS nowadays are like Git just with a slightly different skin.

And just because git handles binaries poorly and you generally shouldn't put them in git, doesn't mean that:
  1. you can't put them in git if you really want to
  2. you can't put them somewhere else while using git for all that remains

  1. You can put them but git will quickly become unusably slow and will use a lot of disk space in your local copy. Subversion isn't ideal either, but when it comes to large files it is considerably faster (i just did a quick test by creating and modifying files of 512MB with git and svn and svn was almost twice as fast as git) and you can manage the disk space by storing the repository in a dedicated HDD (though both are worse compared to Perforce which keeps track of modification status so it doesn't need to waste time checking that manually and keeps all data on the repository so it doesn't need to duplicate data on the work space).
  2. If the solution to a version control system deficiency is to use another version control system for the files that cause that deficiency, then you should reconsider using that version control system in the first place. Keeping everything in one place and in sync is important (and besides, all VCSs have issues, using two of them just means you get two sets of issues :-P)
Also keep in mind that Subversion hasn't stood still since Git became popular, it got several improvements since then in performance, branching, merging (for code), etc.

Perforce is free for teams of up to 5 people. It seems to be the standard in the game industry, and is far more popular than any alternative like Git LFS or Subversion.

You are right, but TBH considering all the name and ownership changes Perforce has gone through (now Helix or whatever it is called) and the "added value" crap they seem to be adding, i have a feeling that it isn't going to be a safe choice in the long run. At least Subversion has been used in game studios (which is another reason i recommended it - i have used it myself in the past without major issues).

There is also Plastic SCM, which was made by ex-gamedevs who wanted to make a better Perforce or something, but i never tried it. I just noticed that they added a free option for individuals (i do not think this was there last time i checked) but i do not really like how they want me to fill out some form to "apply" for using it. And it seems to only be for hosting on their servers, while I prefer to have full control of my data :-P
 

aleph

Arcane
Joined
Jul 24, 2008
Messages
1,778
If the solution to a version control system deficiency is to use another version control system for the files that cause that deficiency, then you should reconsider using that version control system in the first place. Keeping everything in one place and in sync is important (and besides, all VCSs have issues, using two of them just means you get two sets of issues :-P)

Or you could just properly separate code and data, like any competent software developer. Then you can also test different code versions with different versions of your data and find out much easier if a bug is due to faulty code or faulty data.

Also, I can't take anyone seriously who recommends SVN in 2020.
 

Bad Sector

Arcane
Patron
Joined
Mar 25, 2012
Messages
2,223
Insert Title Here RPG Wokedex Codex Year of the Donut Codex+ Now Streaming! Steve gets a Kidney but I don't even get a tag.
Or you could just properly separate code and data, like any competent software developer. Then you can also test different code versions with different versions of your data and find out much easier if a bug is due to faulty code or faulty data.

We're talking about games; code and data are intertwined, you cannot separate them - e.g. a level that uses some special object for an event is useless by itself without the code that implements that special object, and the code that implements that object is useless without the level where it is used. Moreover, in many engines code is part of the data (scripts, shaders, etc) and data can also contain logic (node-based scripting, quest definitions, material graphs, etc), so the distinction isn't as clear as you may think.

And of course you want all of those versioned and synced so that if you need to check an older version you also have the data that is compatible and meant to work with that version.

Also, I can't take anyone seriously who recommends SVN in 2020.

Why is that?

EDIT: also what does 2020 have to do with anything? Subversion is still a project under development, the latest version was released just a couple of months ago, is used by many projects and is common especially with games. And regardless of that, what matters is if it does the job, not when it was made or what year is today.
 
Last edited:

aleph

Arcane
Joined
Jul 24, 2008
Messages
1,778
You can still split stuff up and use different version control systems even if it logically only works together. You just need a good build system which can assemble everything together.
I know, unity is a mess which makes separation of software components harder than it should be. But this is more an argument against using bad tools than against proper software development practices.

Come on, even if you don't like git, why use SVN, or any centralized version control system?
 

Bad Sector

Arcane
Patron
Joined
Mar 25, 2012
Messages
2,223
Insert Title Here RPG Wokedex Codex Year of the Donut Codex+ Now Streaming! Steve gets a Kidney but I don't even get a tag.
You can still split stuff up and use different version control systems even if it logically only works together. You just need a good build system which can assemble everything together.

You can split stuff, but there is no reason to do that since you'll need to keep everything synchronized anyway, you'll have to keep around and manage two VCSs with the issues that each one may have and you win absolutely nothing by doing so when you can simply have everything in one place and avoid all that.

I know, unity is a mess which makes separation of software components harder than it should be. But this is more an argument against using bad tools than against proper software development practices.

FWIW i'm not talking about Unity specifically but about games in general. And IMO using two VCS because one cannot cope with large files is exactly all about having your tools influence how you work instead of using tools that let you do the work properly (BTW there is nothing about "proper software practices" related to placing your game's assets in another repository - if anything, what you put in version control isn't some arbitrary code or arbitrary assets, it is the game itself, so it makes more sense to put the entire game in version control than split it in parts as a work around for trying to use a tool in ways that it wasn't meant to be used).

Come on, even if you don't like git, why use SVN, or any centralized version control system?

Because the decentralization is pointless when it comes to projects worked on by a single person or a specific team, especially when the project isn't even developed in the open (like most, though not all, open source projects), while it adds unnecessary overhead in terms of resources (mainly disk space, but also performance after a threshold is hit) and maintenance.

A DVCS makes perfect sense for projects that have many people working in their own "silos" independently from each other that they eventually merge, where any centralization exists mainly at a social level - like the Linux kernel, which has thousands of developers working on it with hundreds of kernel repositories that may or may not be merged into Linus' repository - but it doesn't make sense for all projects, especially projects that are worked on by single teams/organizations where everything goes through that team anyway.

Now, you can certainly use a DVCS as a CVCS (and this is what, if not the majority, then certainly a very large number of people on GitHub do), however that doesn't mean you have to persist in doing that when you hit issues that a DVCS isn't in a good position to handle and that a real CVCS is superior at.

(though just to be clear, Subversion is certainly not the best CVCS out there for games, Perforce would be the best i've used myself at such a setting extensively. However Perforce is also too enterprise-y for my taste and i'd prefer a free option anyway so Subversion is the next best thing that i know of and have also used myself at two jobs working on games, so i know that at least it is up to the task)
 

NPC451

Literate
Joined
Jun 20, 2020
Messages
46
OK, folks, I am genuinely curious and would hope to be educated by you (and condolences to Cleve...). My idea of software development is typing code into vi (that was a lie, actually I'm part of the unwashed EMACS masses) and then entering 'g++' into bash, linking against any external libraries as needed.

What is the mechanism by which an update of external software can actually mess up your own code and data? I'd understand API calls not functioning properly any more, but it sounds like Cleve's own source code and assets were somehow altered - how does that work?

Here are the breaking changes going from 2019.3 to 2019.4 (https://docs.unity3d.com/Manual/UpgradeGuide2019LTS.html).

Also, I have no idea why you would do a major upgrade without doing a complete backup first.
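
Even without a full backup regime, a pre-upgrade snapshot is a one-liner. A sketch with a throwaway temp directory standing in for the project root (the archive path and naming scheme are invented):

```shell
#!/bin/sh
# Snapshot a project folder before opening it with a new editor version.
PROJECT=$(mktemp -d)                      # stand-in for the real project root
echo 'demo' > "$PROJECT/ProjectSettings.asset"

STAMP=$(date +%Y%m%d-%H%M%S)
ARCHIVE="$PROJECT-pre-upgrade-$STAMP.tar.gz"
tar -czf "$ARCHIVE" -C "$PROJECT" .       # archive the whole project tree
echo "snapshot written to $ARCHIVE"
```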
 

Bad Sector

Arcane
Patron
Joined
Mar 25, 2012
Messages
2,223
Insert Title Here RPG Wokedex Codex Year of the Donut Codex+ Now Streaming! Steve gets a Kidney but I don't even get a tag.
What is the mechanism by which an update of external software can actually mess up your own code and data? I'd understand API calls not functioning properly any more, but it sounds like Cleve's own source code and assets were somehow altered - how does that work?

My understanding from Cleve's "Mucked up the shaders, destroyed the lightmapping, crashed 3/4s of the assets." and the bit about Enlighten later isn't that it messed up his actual code and data, but that the engine changed the way it treats shaders (especially lighting, since Enlighten is a lightmapping middleware used - among others - by Unity) and that was what messed up his work. Essentially it is as if you used some external library that had a SetPopStrength(float strength) function and in the newest version the function still has the same signature as it had before (so it is technically backwards compatible from an ABI perspective), but the meaning of "strength" changed, so even though nothing changed in your code that called that function, the actual behavior of your program did end up changing.

Note that this is just a guess, i do not use Unity myself so i have no knowledge about its quirks.

But i doubt he can do much about it and in the case i describe above a backup wouldn't help since it isn't his code that changed, it is Unity and unless he decides to never update to newer versions, at some point he'd have to adjust his assets to work with the changes.
 
