Saturday, May 28, 2011

Another Look at the Cost of Working in Games

After talking with a couple of friends I decided to take another look at the overtime data I posted on the 26th.  Specifically, I decided it would be interesting to look at a weekly breakdown.  People are more used to talking about hours-per-week and it makes the data a lot more approachable.  I'll still do it in minutes, to keep the integer accuracy.

I went through and sorted each day's OT number into one of five categories:

  1. Slacking off (working a short day)
  2. Comp Time (specifically given time off in lieu of a holiday worked)
  3. Regular OT (extra time spent working on a regular work day)
  4. Holidays Worked
  5. Weekends Worked

I then summed the daily totals into weekly totals in each category and generated the following chart.


And for completeness, the net sum as well:



While doing that, I ran into a few days early on where I wasn't counting holidays worked/comped in the same way I was later.  I corrected those for this set of charts.  I may go back and adjust my previous data if I get the time.
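For anyone curious about the bookkeeping, the per-day categorization and weekly roll-up can be sketched in a few lines of Python. Everything here - the function names, the log layout, the sample rows - is my own illustration, not the actual spreadsheet:

```python
from collections import defaultdict
from datetime import date, timedelta

def categorize(day, ot_minutes, was_holiday, was_comp_day):
    """Sort one day's OT minutes into one of the five categories."""
    if was_comp_day:
        return "comp"         # time off in lieu of a holiday worked
    if ot_minutes < 0:
        return "slack"        # short day
    if was_holiday:
        return "holiday"      # holiday worked
    if day.weekday() >= 5:    # 5 = Saturday, 6 = Sunday
        return "weekend"      # weekend worked
    return "regular_ot"       # extra time on a regular work day

def weekly_totals(log):
    """Sum the daily minutes into per-week, per-category totals,
    keyed by the Monday of each week."""
    weeks = defaultdict(lambda: defaultdict(int))
    for day, minutes, holiday, comp in log:
        week_start = day - timedelta(days=day.weekday())
        weeks[week_start][categorize(day, minutes, holiday, comp)] += minutes
    return weeks

# A made-up week of entries: (date, OT minutes, holiday?, comp day?)
log = [
    (date(2011, 5, 23), 60, False, False),   # an hour of regular-day OT
    (date(2011, 5, 25), -450, False, True),  # a comp day
    (date(2011, 5, 28), 300, False, False),  # a Saturday worked
]
totals = weekly_totals(log)
```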

Anyway, on with the cool stuff!

A couple of trends are pretty apparent.  After weeks of OT, and especially when I worked a weekend or holiday (or both), there is a bit of relaxation.  Mostly that came in the form of comp time, but I did slack off a bit after a run of weeks of OT.

You can also see the amount of slacking build up just prior to my post-project comp time and vacation.  That won't surprise anyone working in the industry - things slow down as the project is wrapping up.  There's really not a lot you can do at the office if you don't have bugs to fix.  So you come in a little later and maybe leave a little earlier.  As long as you stick around long enough to offer moral support and make sure those that do have work won't need help from you, you're free to go.

Finally, the overall trend of the project is a little more apparent.  Things start out quiet, with maybe a spike early on for a deadline.  Then a slowly building pattern of pulses - a push for a deadline followed by a little down time, then another push.  Toward the end it looks like I added more weekend time, but cut back a little on the regular-day OT.

Some stats

Weeks tracked: 132
Weeks with a 0 balance: 18
Weeks of holiday or vacation: 16
Working weeks with a 0 balance: 2

(Apparently I'm pretty bad at hitting a 0 balance.  Heh.)

Across all weeks, all numbers in minutes.
S & C = Slack & Comp time

         S & C      OT    Total
COUNT      132     132      132
AVG     -159.8   246.5     86.7
MEDIAN   -60.0   108.5     25.0
MIN    -2250.0     0.0  -2250.0
MAX        0.0  1941.0   1941.0
STDEV    334.6   344.1    517.9
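Tables like these are easy to reproduce with Python's statistics module. The numbers below are made up for illustration; only the column layout mirrors my spreadsheet:

```python
import statistics

# Made-up weekly OT balances in minutes - not my real numbers.
weekly_ot = [0, 120, 480, 25, 0, 300, 1941, 15, 60, 0]

stats = {
    "count":  len(weekly_ot),
    "avg":    statistics.mean(weekly_ot),
    "median": statistics.median(weekly_ot),
    "min":    min(weekly_ot),
    "max":    max(weekly_ot),
    "stdev":  statistics.stdev(weekly_ot),  # sample standard deviation
}

# Restricting to the non-zero weeks, as in the second table, shifts
# the average and median up because the idle weeks drop out:
nonzero = [m for m in weekly_ot if m != 0]
nonzero_avg = statistics.mean(nonzero)
```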


Across weeks with non-zero values, again in minutes:

         S & C      OT    Total
COUNT       98     104      114
AVG     -215.3   312.9    100.4
MEDIAN   -94.0   169.5     53.0
MIN    -2250.0     2.0  -2250.0
MAX       -5.0  1941.0   1941.0
STDEV    373.1   360.1    556.4


The weeks where I had a lot of comp time are skewing things quite a bit, making it harder to see how much slacking I did.  Subtracting those weeks, I get:


         S & C
COUNT       95
AVG     -148.8
MEDIAN   -93.0
MIN     -784.0
MAX       -5.0
STDEV    151.7

That still has the occasional comp day mixed in, but takes out the worst of the effects, so should be easier to compare against the OT weeks.  Since the OT still contains the holidays worked and those are the reason for the comp days, it seems only fair to leave them in.


Conclusions

Again, this is by no means a complete picture.  These numbers don't capture things like the effects on my health and well being, relationships with friends, etc.  But it's a start.  Now to see if I can figure out a good way to start tracking something quantifiable for those other aspects of my life and start including them on my new spreadsheet...

Friday, May 27, 2011

A Funny Thing Happened on my way to the Blog

Ars Technica posted an article today by Andrew Groen about death marches and the culture of overtime in games. It's not a new story. In fact, it covers a lot of the rumours and stories I rather blithely brushed over when I said "I've seen examples of most every rumour you hear out there, and heard 1st hand from people who've lived the rest."

As with previous versions of this story, Mr. Groen paints a pretty grim picture. Flagrant disregard for years of research on overtime and worker productivity. Exploitation of employees, forcing them to work long hours to make horrible games they don't care about. Early burnout and retirement by industry stars.

But I don't think that's a complete or accurate picture. Mainly because it lacks solid numbers to back it up. Every single assertion Mr. Groen makes is, at its heart, an anecdote. I can provide counter anecdotes for each and every point he makes from my own personal experience. I can also find anecdotes that back up most of what he says and know people who can back up the rest.

The problem is, there's no comprehensive survey to tie it all together, to put those anecdotes in their proper context. He's not talking about a real study or doing quantitative analysis himself. He's just retelling an old story about overtime and exploitation, employees buying into the machismo and visions of creating cool games.

That doesn't mean he's necessarily wrong, though!

It's just that he's not necessarily right, either.

We just don't know, because there aren't enough people gathering real numbers to say for sure.

Do I think the industry has problems? Definitely.

Does the industry need to address the OT issue? Mmm... that's more complicated.

Yes, chronic OT and death marches are bad. But some OT, carefully focused and used wisely, can result in pretty amazing things. Both in the end product, and in the team chemistry. Ask anyone who's gone through it - the guy that leaves before the crunch is rarely missed as much as the one that leaves after.

Will these problems be fixed any time soon? Doubtful.

Until the industry grows up a bit... probably even unionizes, I don't think we'll be able to really address these problems. And I don't think unions will happen as long as most of us are salaried. It allows them to ... encourage ... us to work OT, but it also means they're competing with other salaried positions for similar skills. And for us programmers, that means not a bad living. It's hard to organize a union when most of the potential members have a pretty high standard of living and even sympathize with management and buy into a bit of the mythos of OT.

So yeah. There are issues in the industry. But writing won't-someone-please-think-of-the-games-worker articles won't solve them. Those of us in the industry now are the only ones who can start to solve them. And the only way we can do that is by gathering data to understand what's really going on. Only then, once we know where we are, can we figure out the road to get us where we want to be.

Thursday, May 26, 2011

The Cost Of Working In Games

I've worked as a programmer in the games industry for some years now. The details of where and when aren't important right now so I won't go into them. What I'd like to talk about is the general perception of what it's like to work in the industry vs. the reality.

I run into a couple of common views from people. The most common is "That must be awesome!"

Mostly this comes from people who hear "Video Games" and think we must sit around the office and play games all the time.

Which, well, we do. But not *all* the time, and we do try to keep it off the clock.

And there are certainly some other pretty cool aspects to the job.

But there are downsides as well, and all the pluses don't make the minuses any less bad.

The other view I get a lot is "Wow, I hear it takes over your life."

That one is pretty widespread among people in the industry and the people close to them. The EA Spouse article is the most prominent example of this.

What I find most interesting about this view is that I've never seen any hard numbers to back those claims up. Anecdotes, regardless of how many you have, don't add up to evidence.

Don't get me wrong - I think there is definitely something going on. I know I've felt like the job was taking over my life. People in the industry do work a lot of hours and sacrifice a lot of holidays and weekends.

But we also get comp time. And there's a lot more flexibility for start times, especially if you just worked until 2am.

I've seen examples of most every rumour you hear out there, and heard 1st hand from people who've lived the rest.

But how much of that is confirmation bias and/or selection bias?

How does it all play out, when you add up the comp days, the late nights, and sometimes equally late mornings? Certainly, every person's experience is different, but you just don't see real numbers.

I've always been salaried, which, as far as I know, makes it illegal for them to track my hours in detail. But that doesn't mean I can't track them.

So I did, for 2.5 years.

The Good Bits

I started a spreadsheet late in pre-production on a particular project. I tracked my time all the way through to finalling, took some time off, and when I got back, continued tracking into pre-production on the next project. I was laid off after ~30 months, late in pre-pro on the next title, so this covers basically a full project cycle.

I decided the best way to post the data is in chart form. The raw data has a lot of both personal info (it turned into a very handy calendar) and company-private info (deadlines, etc). The latter I clearly can't release (it's just wrong, never mind the lawsuit I'd probably face). The former could be interesting if I were better about being complete with it - joining my personal schedule to the time worked would be fascinating, but probably socially awkward if released.



(Click the images for a larger version.)

This chart shows my daily minutes of overtime across the entire project. I counted any time worked past the 7.5 hours mandated by law here in Canada as the maximum length of a regular working day. I also counted any time spent working on a weekend or holiday. I opted for minutes to avoid decimal hours, days, etc.
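Spelled out as code, the counting rule looks like this (my own sketch, not how the spreadsheet actually computes it):

```python
REGULAR_DAY_MIN = int(7.5 * 60)  # 450 minutes in a regular working day

def daily_ot_minutes(worked_minutes, weekend_or_holiday):
    """Minutes of OT for one day under the counting rule above.
    Weekends and holidays count in full; on a regular day, anything
    past 7.5 hours is OT and anything short of it is negative OT."""
    if weekend_or_holiday:
        return worked_minutes
    return worked_minutes - REGULAR_DAY_MIN

# My longest single day was 813 minutes on a regular day:
longest = daily_ot_minutes(813, False)  # 813 - 450 = 363 minutes of OT
```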

The negative spikes are times I left early, came in late, took a long lunch, or was given a comp day in lieu of a holiday. The -450 minute days are the comp days. The cluster toward the end, followed by the long stretch of 0's is my post-project comp time followed by vacation.

This chart alone is interesting, but it's a little noisy.  There's a lot of jitter, and it's hard to see the overall trend.  Sure, toward the middle there seems to be a bit more working late and less leaving early, but it's hard to see the accumulation.


That's the cumulative story. There are a few flat bits after spikes - those tend to be when we got comp time. The down-slope of the tallest peak is my post-project comp time. The plateau is my vacation - I took a few months off after this project, so I wasn't logging any hours either way. The cliff is me returning from vacation and deciding it was best to just reset the counter. It was a new project, and pre-pro is a less stressful environment. I felt a mountain of "uncompensated OT" looming over me would just lead me to slack off to try to balance it out, which wouldn't have served the project or myself well at all. So I reset it to 0 and started counting again.


Some stats, all in minutes:

Total OT worked: 31644
Total Comp Time: 9675
Total Slacking: 10521
Balance: +11448 OT

Peak running OT balance: 17750
Balance when reset: 10585

Single longest day: 813 (OT: 363)
Longest whole-day OT: 600


Total Days tracked: 921

Total Comp days: 21.5
Sick Days taken: 5 *
Vacation Days taken: 70 **

OT Days ***: 41
5 were holidays, moved and compensated for.
36 were weekends, mostly **** uncompensated.

Streaks (10+ days in a row):

19 days (10 office, 2 home, 7 office)
12 days (5 office, 2 home, 5 office)
12 days (5 office, 2 home, 5 office)
18 days (16 office, 1 home, 1 office)
11 days (5 office, 2 home, 1 sick, 5 office)
13 days (8 office, 1 sick, 4 office)
19 days (6 office, 1 home, 12 office)
14 days (all office - followed by 1 day off, then another 5 days in the office)

(home == working from home)


* - I'm not 100% sure on this figure, but I'm generally healthy and 5 days over 2.5 years is normal for me.

** - 70 days of vacation?! Yeah. One of the (apparently fading?) perks of working in games is a "sabbatical" after some years of service. Basically it's a bunch of extra time off. Mine came up and I took it.

*** - Weekend/Holiday working days - to be fair, it was rare for me to work a full day on these days, though my longest OT day of 600 minutes certainly goes against that.

**** - mostly uncompensated weekend days - we were given a few extra days here and there, but it's hard to attribute them to specific weekend days worked.


So, there you go. Some numbers for one person on one project. I don't want anyone to think this is typical in any way - it's a data set of 1, so take it as just an example.

Conclusions

1) It's pretty clear that, for this one project, I worked a fair amount of OT.

However, I'm by no means bitter about this. Anyone that knows me should know that already! And this is by no means an accusation against any former employer.

Making a game is a creative process, and like all creative processes, it requires some experimentation. "Finding the fun" is a very touchy-feely process, and no amount of planning will ever be able to predict how long it will take.

That means adding features late, cutting a feature after it's built (wasting time), and a fair amount of backtracking - many a time have I torn out code when the design changed, only to have the design change back a week later and put the code right back in.

Overtime is just a reality for any creative process that has marketing involved - i.e., any creative product you want to sell. That's just the way it is.

I happen to like the environment. But I also think it's interesting to get some detailed numbers and see how big of an effect it's actually having on me.

2) Interestingly, I found that the act of tracking my hours also helped me hit my estimates for task length. Estimating how long something will take to write is one of the unteachable skills of being a professional software engineer. The only way to learn it is by writing a lot of code and making a lot of bad estimates. Over time, you get a little better at it.

However, I found that shortly after I started tracking my time, my estimates got a lot more accurate. I don't feel like I changed my estimating habits any - rather, it was the heightened scrutiny of time worked that I think made the difference.

Trying to keep track of hours worked in my head is messy, and at 4pm on a Friday, it's hard to start a big task. Even if I felt I should make up for that long lunch the other day.

Having a big -120 in front of me on the spreadsheet made it clear, though. I should stick around for a couple extra hours. So instead of leaving at 6pm, I'll stay till 8 and balance things out. That works out to a 4-hour block of time to code, mostly uninterrupted, if I start something at 4pm on a Friday. Not something I'd have thought of before.

3) So on the whole, I'm quite pleased with this experiment. I've learned a lot about the actual impact of working in games on my life. Sure, this data doesn't show the whole picture, but it's a start. I'd love to see numbers from other people in the industry.

Sunday, May 22, 2011

Debugging Software and The Scientific Method

I find it odd that debugging software, sorting out the errors in a program's source code, is simultaneously the most frustrating and exhilarating aspect of writing software. I've actually banged my head against a wall when trying to track down a bug. Conversely, the feeling of satisfaction when I finally pin a difficult bug down and dissect it is indescribable. Being able to point at a line of code and say "that's the culprit, and this is how we fix it" is a joy like no other. Almost as good as the first time you watch a program you designed and wrote run correctly.

I've described the process to various non-programmers in the past and the analogy I've always used is that of a detective - sleuthing out the underlying problems and pointing a finger at a particular area of code to say "book 'em Danno." And it's not a bad analogy, really. But I think both processes are actually better described as using the Scientific Method.

When trying to sort out what's misbehaving in a piece of software we have to:

Observe what the software is doing - gather as much data as possible on how it should behave given various inputs, and then observe how it actually behaves when given those inputs. How does the output differ from the expected output?

Hypothesize - in order for the program to behave the way it is, the code must be doing X instead of Y. At this point, for simpler bugs, we can often just look at the source code to see. For more complex or subtle bugs, though, we often need to go further.

Predict - If that's the case and it is doing X, then we should see a particular behaviour A, given some input not in our initial observation.

Test - try that input!

Collect data - Does it do A, as predicted? Or did it do something else? If it did something else, back to step 1! Going through here too frequently can lead to that frustration I mentioned...

However, if it did follow our prediction, we need to make a judgment call - is that enough evidence? What other possible hypotheses are out there? How can we test those, to see whether they hold? Building a hypothesis and a set of tests to falsify (or not) that hypothesis, while simultaneously falsifying other likely hypotheses, is the art of debugging. And it's the core of what Science is all about - subjecting a hypothesis to rigorous testing and possible falsification, building up a consensus of evidence to raise it to a proper Theory.
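As a toy illustration of that loop (the bug and the code are entirely invented), consider a function that averages numbers with integer division:

```python
def average(values):
    # Buggy: integer division truncates the fractional part.
    return sum(values) // len(values)

# 1. Observe: average([1, 2]) returns 1, but we expected 1.5.
observed = average([1, 2])

# 2. Hypothesize: the result is truncated because of integer division.
# 3. Predict: any input with a fractional true mean will round down,
#    so average([1, 1, 2]) should return 1 rather than 1.333...
# 4. Test and collect data:
prediction_holds = average([1, 1, 2]) == 1

# The prediction holds, so we commit to the fix: true division.
def average_fixed(values):
    return sum(values) / len(values)
```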

Well-run software companies carry the process even further by adding in peer review - all changes should be put before at least one of your peers to see if they can find faults in your solution or new errors you might introduce with your change. It's not (usually) as widespread as publishing in a peer-reviewed scientific journal, but it can be just as rigorous. A good reviewer will ask penetrating questions about your understanding of the situation and propose alternative hypotheses as to what might be going on, to see how well your solution works.

Certainly, the analogy only goes so far - we're building software, which is an Engineering task. We're not adding to the sum of human understanding about the functioning of the universe. But the methodology is fundamentally the same and good engineers need to have a solid grasp on it.

Wednesday, May 18, 2011

Crushing your head... crush crush

"It suddenly struck me that that tiny pea, pretty and blue, was the Earth. I put up my thumb and shut one eye, and my thumb blotted out the planet Earth. I didn't feel like a giant. I felt very, very small."

— Neil Armstrong

After a long day at the office, and a busy few weeks, I went out for my monthly Skeptics in the Pub. A fine evening of odd, interesting and fun banter, with people of the same characteristics. I was introduced to a game called "Marry, Shag, Kill", in which one participant proposes 3 people to whom another participant has to assign one of each of those verbs. A fascinating game with many and fine permutations. The more one has imbibed, the finer the game seems, it seems.

Perhaps the most memorable event of the evening, though, occurred on the ride home. A few minutes of relative solitude on the bus provided me with the chance to think. Disdaining such activity for the waste of time it is, I chose instead to distract myself with that odd, moon-like object hanging in the sky.

Which, it turns out, was the moon.

Now, this is an odd thing here in Vancouver, seeing the moon. Rare is the night when one can see past the perpetually looming cloud deck, drizzling its grey gloom.

Remembering Armstrong and the basics of geometry, and glancing nervously around at my fellow late-night commuters, I tentatively raised my thumb to see how easy it was to abolish a world. And easy it was. The moon disappeared behind my thumb, along with an astronomical number of galaxies, quasars, super-clusters and super-massive black holes that happened to be roughly in the light cone extending from my retina and through the most opposable of my appendages.

And oddly, it did make me feel a giant.

For a moment.

Until I realized that the sun is roughly the same size, from my perspective.

And that blew my mind.

The sun and the full moon are (almost) the same size.

How else could we have a total solar eclipse?

Looking at the moon, it was hard to picture the sun as that big in the sky. It's a blinding, but ultimately small point of light, right?

Step outside the next time you can. Look at a full moon. And then take a look up during the day. Those two things are the same size in the sky.
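The match is easy to check with rounded textbook figures for the diameters and distances (this is just the standard angular-diameter formula; the numbers are from memory, so treat them as approximate):

```python
import math

def angular_diameter_deg(diameter_km, distance_km):
    """Apparent angular size, in degrees, of a sphere at a distance."""
    return math.degrees(2 * math.atan(diameter_km / (2 * distance_km)))

moon = angular_diameter_deg(3_474, 384_400)          # about 0.52 degrees
sun = angular_diameter_deg(1_391_000, 149_600_000)   # about 0.53 degrees
# They differ by only a few percent - hence total solar eclipses.
```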

And most interesting of all, this wasn't always so! And won't always be so! Though other, more interesting things will probably occur before we need to worry about losing our total eclipses.

Still, see how easy it is to cover either with your thumb.

And think about what a fascinating, insignificant, pale, blue dot we live on.

G'night.

Sunday, May 15, 2011

Particles and Waves

Light behaves as both a particle and a wave. This is a cornerstone of quantum mechanics and is interesting for a number of reasons. Two things in particular are on my mind of late.

First, I spent a long time trying to wrap my head around how light can be both at the same time. If it behaves as both, it must somehow be both. I've seen a number of people hit that wall and just give up on ever understanding physics or anything quantum.

But that doesn't have to be. Light is neither a particle, nor a wave, nor a duality. It is something else. Light is its own, unique thing. It happens to sometimes behave like our model of a particle. And at other times it behaves like our model of a wave. Thinking it's somehow both is getting hung up on the labels of our models. The fact that light acts kind of like one model and kind of like another just means that neither model by itself is correct.

And that means that neither label is correct, either. Light is light. The fact that it doesn't fit under the intuitive sounding labels Particle or Wave is irrelevant. Even if it behaves like them sometimes. Especially if it behaves like both at some time or another. Light is light, and we need to think of it as its own thing.

I can't recall who said that first... perhaps Hawking? Regardless, when I read that the apparent paradox vanished and my thinking became a lot clearer. Hopefully more people can come to understand that and move past this particular hurdle to their understanding of physics.

--

Second, I'm fascinated by the double slit experiment that so aptly shows that the nature of light is neither a particle nor a wave. For those not familiar with the experiment, google it up. It's amazing.

I recently read Programming The Universe, in which the author discusses how the photon could interact with itself as it passes through both slits to form the interference pattern we see even with single photons shot through the apparatus. What I'm curious about, and haven't been able to find in an admittedly cursory glance around the web, is how far apart the two slits can be to still get the effect.

I'm thinking of an electron's cloud/orbit around the nucleus of an atom - it's not really an orbit; the cloud is more of a probability volume, where the electron is very likely to be at a given time. If a photon has a similar probability volume, it seems likely that the slits couldn't be any further apart than the diameter of that volume. Past a certain separation, the two slits essentially become two separate, individual slits.

The question is - does a photon behave like an electron, i.e., does it have a probability distribution for its location at any given time? And if so, is that related to the wavelength or amplitude of the wave, or is it a separate variable? And if so, what is it called?