Monocause
Arcane
- Joined
- Aug 15, 2008
- Messages
- 3,656
So I was wondering how Obsidian's doing under MS.
Years ago I worked in QA, intermittently at an outsourcing outfit providing services to MS and directly at Microsoft. I've seen studios that appeared to thrive under Microsoft's umbrella (e.g. Playground Games), but I also witnessed how, for example, Lionhead sank during the disaster that was Fable Legends development.
F:L was an interesting case study, especially in hindsight, now that I've learned a fair bit about project management and participated in different projects from both tech and business perspectives. To give a few examples: it was, IIRC, 2012 or 2013, and Agile + Scrum were becoming hugely popular in software development. MS was a big ship, so adoption of Agile was usually patchwork and more often than not resulted in Frankenstein monsters. F:L was no exception - the Lionhead guys technically worked in Agile, but MS management had a very waterfall approach and, AFAIK, waterfall expectations. I remember being baffled by how you could talk about being Agile while also committing to milestones you'd deliver in the next 2-3 quarters.
My impression was that the Lionhead guys kept alternating between trying to satisfy the actual users they'd engaged in the beta and satisfying the contracts and commitments they had with Microsoft. The end result was that the project went on at a snail's pace and went massively over budget; I don't remember the figures anymore, but I think we were talking about tens of millions of pounds in the red. Microsoft, for its part, kept alternating between trying to satisfy the creative studio's need for autonomy and independence (and they honestly seemed to try, and succeeded with Playground) and its own need for control, which came from the fact that it just kept losing money and also really wanted the title out of the door to boost the recently released Xbox One's game portfolio, which at that time was still rather underwhelming and outgunned by what the PS4 offered.
The inflated budget didn't come only from Lionhead's own failings, mind you. A great example of how Microsoft blew its own money (and of how the Agile vs. waterfall cultures clashed) was automated testing.
See, some MS engineers and execs were absolutely enamored with pushing the boundaries of test automation. This is a long story in QA in general; back then, and even today, people keep going around giving PowerPoint presentations claiming you can achieve 90% automation coverage, which reduces the need for manual QA, increases security and saves budget in the long run. Thing is, it's incredibly hard to pull off, and once you invest it's difficult to turn back - the maintenance overhead just keeps growing if the automation plan goes south. At a different company, I saw four highly paid automation developers working on solutions that tracked pixel positions on a webpage; it was an overengineered piece of crap, and similar results would've been achieved by a single guy testing the website sporadically with his own pair of eyes. No coming back, though, because it'd make someone look bad.
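For anyone who hasn't seen this failure mode up close, here's a minimal sketch of the pixel-tracking approach described above. All names and coordinates are invented for illustration, not taken from the actual project: the "test" asserts that a UI element sits at exact pixel coordinates, so any cosmetic layout change fails the suite even though nothing is broken for users.

```python
def get_button_position(page):
    # Stand-in for a real screen-scraping call; returns (x, y) in pixels.
    return page["login_button"]

def check_login_button(page):
    # Brittle: hard-coded pixel coordinates break on any restyle or
    # resolution change, regardless of whether the button still works.
    return get_button_position(page) == (312, 448)

page_v1 = {"login_button": (312, 448)}  # layout the script was written against
page_v2 = {"login_button": (313, 448)}  # same page after a 1px CSS tweak

print(check_login_button(page_v1))  # True
print(check_login_button(page_v2))  # False - a "failure" no user would notice
```

Every such false failure then needs a human to investigate and an automation dev to update the coordinates, which is exactly where the maintenance overhead comes from.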
With Legends there was a similar case. A bunch of highly qualified and well-paid SDETs (Software Development Engineers in Test) laboured very hard on delivering a solution that could technically provide 100% automation coverage. The idea was that the game would be developed with hooks in the engine in all the right places, so you could automate characters performing complex actions and spend 3 minutes (once!) writing automation scripts rather than 30 minutes doing the same thing manually every time a code change could've impacted the scenario.
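To make the idea concrete, here's a hypothetical sketch of what such engine hooks could look like. The API and scenario names are entirely made up; the point is only the economics: the scenario is scripted once and replayed on every build, instead of being re-played by hand each time.

```python
class EngineHooks:
    """Stand-in for an in-engine automation API of the kind described."""

    def __init__(self):
        self.log = []  # record of everything the script drove the engine to do

    def spawn(self, character):
        self.log.append(("spawn", character))

    def perform(self, character, action):
        self.log.append((character, action))

    def reached(self, step):
        # Did the scripted scenario actually get to this step?
        return step in self.log

def hero_clears_arena(engine):
    # Written once; replayed automatically on every code change that might
    # affect the scenario, replacing repeated manual playthroughs.
    engine.spawn("hero")
    engine.perform("hero", "enter_arena")
    engine.perform("hero", "cast_fireball")
    return engine.reached(("hero", "cast_fireball"))
```

On paper the math is great; the catch, as described below, is that the whole thing is only worth anything if the hooks and the reports stay reliable as the codebase churns.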
In principle it was a good idea; 100% automation coverage was a joke that was probably meant to brainwash some people in management, but it would indeed have been a cool system. The problem was that the codebase was notoriously unstable, likely because of the clash between the automation guys expecting solid specs and the Lionhead guys going agile (and also messy). The thing with test automation is that maintenance is key, because you need it to be super reliable. What happened instead was that we in manual QA quickly started ignoring the automation reports completely, as they were very rarely right about anything.
I still remember coming into the office one morning and receiving an automation report saying that build #XX crashed on boot. I looked up above my PC screen and saw the guys working on build #XX, which appeared to run just fine. That was the last time I read an automation report.
Considering the likely wages of the SDETs involved, I'd estimate that the automation system alone consumed at least a million pounds, probably a fair bit more, while delivering no value whatsoever.
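A quick back-of-envelope check shows how easily you get to that figure. The headcount, per-head cost, and duration below are purely illustrative assumptions on my part, not numbers from the project:

```python
sdets = 5                  # assumed team size
annual_cost_gbp = 80_000   # assumed fully loaded cost per SDET per year
years = 3                  # assumed time the team spent on the system
total_gbp = sdets * annual_cost_gbp * years
print(total_gbp)  # 1200000 - already past the "at least a million pounds" mark
```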
So, after this lengthy intro: would anyone know how OBS is doing under MS? Any culture clashes, technical meddling? I'm not asking about classic "publisher interference" but about grittier details like those in the examples above.
Do we still have any OBS devs here?