Monday, July 25, 2011

It's not what you say...

This is a distillation of a talk Sadie Crabtree gave at TAM 9 in Las Vegas.  The ideas are (hopefully) as she presented them - I'm not qualified to know if they're original or if she (just) did a great job of presenting them.

It's not what you say, it's what they hear.

That's the #1 sound-bite to take away.  There are some details, but the heart of the matter is captured in that message.

Classic communication theory has the entire process broken down into a few steps:

  1. You have an idea.
  2. You translate the idea into words.
  3. You speak/write/sing/or otherwise encode those words into a medium.
  4. The medium transmits the encoded message.
  5. Someone hears/reads/otherwise receives the message.
  6. They parse the phrases and words.
  7. They form an idea of what you're saying based on their interpretation of your words.

If any step is missing, what you thought is not received by the person on the other end.  If at any step the message changes, their perception of your idea changes as well.

You have control over your side of things - what words you choose, what form you choose.

From there, the message goes out.  You have no control over how it is transmitted.  You have no control over how someone hears it.  You have no control over how they parse it, or what ideas they form.

The only control you have over how the receiver interprets your message is in the words you choose.
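To put it in programmer's terms (a toy sketch of my own, not something from the talk - the label-to-meaning mappings are made up): the sender encodes with their definitions, the receiver decodes with theirs, and nothing forces the two to match.

# A toy model of the chain above. The mappings are invented for
# illustration; the point is only that encoding uses the sender's
# definitions while decoding uses the receiver's.

def encode(idea, sender_definitions):
    """The sender picks the label that captures their idea."""
    return sender_definitions[idea]

def decode(label, receiver_definitions):
    """The receiver maps that same label onto *their* idea of it."""
    return receiver_definitions.get(label, "no idea what you meant")

sender = {"don't burden the poorest with extra expenses": "tax the rich"}
receiver = {"tax the rich": "punish the successful for their success"}

label = encode("don't burden the poorest with extra expenses", sender)
heard = decode(label, receiver)
print(f"Said {label!r}, they heard {heard!r}")
# Said 'tax the rich', they heard 'punish the successful for their success'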

Pro-Choice.  Pro-Life.

Who could be against either of those?  Those are very carefully chosen words because it creates an uncomfortable mental state to think of yourself as against Choice or against Life.  The mental dodge we all use is to say "I'm not Anti-Life, I'm Pro-Choice", or "I'm not Anti-Choice, I'm Pro-Life."  But it's just that - a dodge.  We know "the other side" picked their phrase to make it hard to refute.  We refute it anyway, claiming it's bunk.  But ours... Ours is the real interpretation.  I'm not Anti-Life, I'm Pro-Choice.

Labels.

Next time you're talking to someone, try to be aware of the labels you use.  We all use them - communication is impossible without them.  Just try to think of how someone might have a slightly different definition of a label than you do.  What does "Republican" mean to a Republican?  What about to a Democrat?  What about "Taxes"?  Or "Welfare"?

Labels have power, in their ability to condense an idea to a simple, easy-to-repeat sound bite.  But that power can be (and often is) subverted by people having different definitions for those labels.

You may be trying to say, "Let's not burden the poorest people with extra expenses" when you say "Tax the rich", but many will hear, "I think the most successful people should be punished for their success by having to pay the way for the lazier members of society."  "Rich", "Taxes" and "Poor" are labels with different definitions for different groups of people.

---

The next time we find ourselves in a situation where someone is just not understanding us, the right thing to do is to step back, learn what we can about their perspective, and try rephrasing what we're saying so they'll understand it better.


Communication is difficult.  But it's also the most powerful invention of the human species.  Learning that our perspective, our words, and our definitions for those words are not the only ones out there is the first step to effectively utilizing this most powerful of inventions.

Sunday, July 3, 2011

What happened to the Food Network?

I have a confession to make.  It's a dark secret I've been carrying with me for many years.  I love cooking shows.  Not just cooking shows... any show where they show you how to make something.  But cooking shows... there's something much more immediately attainable about them.  I may not have every gadget they use, but it's easy to get by and start making a dish that catches my eye.

And the beauty of those shows, the formula that Julia Child came up with, is in the de-mystifying of the art of cooking.  Showing that while great cooks are artists, you too can create some pretty great food.  You might not ever run a kitchen at a world-class restaurant, but there's no reason you can't try something new in your own kitchen, and no mystery to how the chefs do what they do.  Just years of practice with stuff you have around your own house.

The rise of cable television and the proliferation of 24-hour channels dedicated to a given hobby or interest was a great boon for a while - wannabe cooks got the Food Network, science geeks got the Discovery Channel, and alien-abduction and paranormal woo-ists got The History Channel.

But something has changed.  I tuned in to the Food Network the other night and up came a show - Top Chef.  Prior recent exposures had shown me other shows like Chopped and Hell's Kitchen.  These are reality contest shows, not cooking shows.  They don't show you anything of the cooking involved - they quickly toss out who is doing what, but there are so many people, and so much time spent showing off their personalities, that there's no time to talk about the how or why.  It's 20 seconds of "Bob is making souffle!  Jody is making creme brulee!"  Then it's on to the tasting and judging, with a lot of shots of the contestants as they're being judged.

The cooking has been almost completely jettisoned from the cooking show, in favour of overhyped drama and jump cuts to fires flaring and pans being banged.  What the food is, how it's made, why the cook is doing what they're doing... all of that is gone.  Most of the time you don't even get to see what the cook is doing.  The "mystery" is back - some people do some stuff, food you couldn't think of yourself is presented, and then judges weigh in.

That's not a cooking show.  That's a game show, based on a skill we can't judge for ourselves.  I see no reason it should even be on the Food Network.  And definitely no reason to watch it myself.

And while we're ranting, what's with these kids, with their music?  Get off my Lawn!

Bah.  Maybe I am getting old after all.

Wednesday, June 15, 2011

It is everyone's business to be downtown during a riot.

I want to start by saying I disapprove entirely of the violence and destruction going on across the bridges from me.  I also think it's entirely likely that it is just a core group of people who are causing the bulk of the damage.  And I understand what the police are saying about the damage being against the law.  All of those are quite obviously correct.

What I disagree with is the sentiment that the people standing around taking pictures and videos have no business being there.

The police very much want to keep things under control - that's fine.  That's their job.  And it's a good thing to do - without the police, society would collapse pretty fast.

But with the advent of portable, cheap cameras and YouTube, I don't think anyone should be at all surprised that people want to video what the police are doing.  Go to YouTube.  Right now.  Put "Police Violence" into the search box and see what comes up.  How many of those videos are from people "standing around, taking pictures, with no business being there"?  How many of those situations would have had one-sided reports from the officers involved prior to the advent of handheld video and easy, widespread dissemination of that video?  It's hard to say, but it's easy to understand why most people would think things are more one-sided when only the police can be around and only they can file reports.

I think that most of the lookie-loos downtown are doing us all a service.  Their presence, with the recording equipment they most likely have on them, is helping to keep the police in check.  If not during this incident, any video captured of police crossing the line will serve as a warning for future incidents.

I've also seen footage of the police with cameras - one officer confronting a person, another filming it.  And a group of people all around filming it all.  Bravo!  That is the solution to Big Brother.  Give everyone a camera, and who is watching the watchers?  Everyone.  The threat and power of Big Brother comes from the lopsided nature of the equation - it's even in the name - BIG brother.  If everyone is equal, if only through their ability to film the goings-on, then Big Brother isn't so big any more.

Yes, I have sympathy for the police - it's a tough situation and bad things are happening.
Yes, they need to get in there and clean things up.
But no - people do have business downtown.  As long as you, as public servants acting in public, are around, the public has a right to watch.
Yes, they can do it from outside of any areas you designate as hotspots - specific blocks with specific problems.

Tonight hasn't been the best night for this city.  I'd rather have seen an orderly, if disappointed, crowd leaving after the game.  But I'm actually rather pleased that so many people are staying around, watching and recording.  The media is portraying things one way, and they have a lot of sway.  But the deluge of video that is likely going to hit the 'net in the coming hours is going to make an impression.  Some good will come of this, even if it seems pretty rough for a while to come.

Saturday, June 11, 2011

Redefining Computer Science

People have been complaining about the lack of students in science and tech degrees here in North America for some time now.  The fewer students studying these topics, the fewer people going into the workforce in those fields, and the weaker our position in those fields becomes relative to other countries.  I haven't come to a conclusion on whether that's necessarily a bad thing, but I suspect most people will assume it is.  Or that there are plenty of other reasons that it's good to have a decent proportion of our students studying these fields.

Regardless, the slipping enrolment numbers for science and tech degrees are being treated as a problem by universities.  Not surprising - they're in the business of selling degree programs.  And I don't really have a problem with this - it's in their interest to decry the numbers going down in what is basically their revenue stream.  And it's probably in our interest to have more students studying those things.  So it's likely a win-win, if it works.

What I find a little disturbing, though, is the possible dilution of the field of Computer Science, as hinted at in this article in the New York Times.

On the surface, the changes the article talks about seem like a good idea - tinker with the curricula to draw potential students in with a focus on the applications they use every day.  Want to know how your iPhone works?  Come study Comp. Sci. and we'll show you!  On the way, you'll learn the fundamentals, and off you go to make your fortune.

The problem, though, is that the assumption seems to be that the theories of computer science, and the focus on those theories, are what is driving students away.  Therefore the way to fix the numbers problem is to ... adjust ... the balance of the curriculum to not hit the theory so hard.


What worries me is that my personal experience on the interviewer-side of the table these last few years has been disheartening.  Too many recent grads with C.S. degrees just don't seem to have a solid understanding of the underlying theories of computation or the mental tools used in solving computational problems. 
Curricula are already too light on the theory!

Talk of tinkering with curricula to make them less overbearing, with all their Theory... it makes me worry that they're re-branding Computer Science.  It sounds like they're shifting it to be what was called Computer Information Systems at my university.  Basically, application programming, with only enough theory that you could ask something resembling an intelligent question of a Comp. Sci. person when you run up against the edges of your knowledge.


Don't get me wrong - I think there is plenty of room for that kind of a degree.  The software field has more than enough work to go around, and the vast majority of it really doesn't need a deep understanding of formal languages or set theory to be done.

Just don't call it Computer Science!  It's dishonest and confuses the matter.

Wednesday, June 8, 2011

Essentialism and Evolution

I've been thinking a bit about Essentialism lately.  People seem to be wired up as Essentialists.  We love to, maybe even need to, classify things - that's a cat, this is a dog.  That over there is a tree.  Or is it a shrub?  And is that a Beagle or a Basset?  It's a hybrid?  What breeds?

Classifications and labels are very useful things, for sure.  Beagles are good hunting dogs.  Bassets... not so much.  Knowing which is which is pretty helpful when you've got a fox on the loose.

But it's easy to lose sight of the forest for the trees.  Richard Dawkins makes this point in his most recent book when talking about why some people seem to have a hard time grasping Evolutionary Theory.  Thinking of something as Essentially a dog appears to be a detriment when trying to understand how the species evolved and continues to evolve.

Take Beagles.  A "Beagle" is just a member of a population of beings similar enough to interbreed and produce something we'd still lump in with the Beagles.  Any one beagle is a member of that population, and will have characteristics that are within the normal range for that population.  That population is "All Current Beagles", lumped together based on our arbitrary description of the characteristics of the breed.  But within that group there is a lot of variation.  If you took the extremes of height from within the group - the tallest and shortest individual dogs - and stood them next to each other, it would be far less apparent that they're the same breed.

Now imagine splitting the population in two - half stay here, the other half are sent to Russia.  You breed them for 100 generations, selecting only the shortest.  The Russians do the same, taking only the tallest.

At the end, you bring them back together - the 100th generation tallest of the tall and shortest of the short would probably be considered different breeds by then - Great Beagles and Miniature Beagles, perhaps.

At no point along the way could you identify one individual in the group and say "your parent is a normal Beagle, and you're the first Great Beagle".  There's no clean line across the family tree - there's just a gradual shift from some individuals at the start that are clearly Beagle to an indistinct group in the middle to some individuals that are clearly Great Beagles at the other end(*).
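If it helps to see it in code, here's a toy simulation of that thought experiment.  All the numbers (starting height, litter size, variation per generation) are invented for illustration, and a real breeding program is vastly messier:

# A toy simulation of the thought experiment above: one population of
# beagle heights, split in two, each half selectively bred for 100
# generations.  All the numbers here are invented for illustration.
import random

random.seed(1)

def next_generation(population, keep_tallest, survivors=20, litter=5):
    """Keep one extreme of the population and breed from it, with a
    little random variation standing in for mutation and recombination."""
    parents = sorted(population, reverse=keep_tallest)[:survivors]
    return [h + random.gauss(0, 0.1) for h in parents for _ in range(litter)]

start = [random.gauss(38.0, 2.0) for _ in range(100)]   # heights in cm

here, russia = list(start), list(start)
for _ in range(100):
    here = next_generation(here, keep_tallest=False)     # select the shortest
    russia = next_generation(russia, keep_tallest=True)  # select the tallest

print(f"starting average height:      {sum(start) / len(start):.1f} cm")
print(f"short line, 100 generations:  {sum(here) / len(here):.1f} cm")
print(f"tall line, 100 generations:   {sum(russia) / len(russia):.1f} cm")
# No single generation is "the first Great Beagle" - each litter is only
# a fraction of a centimetre away from its parents.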

Once we start thinking about things as individuals in a population, and the population as only roughly defined by a set of somewhat arbitrary characteristics, our view of things changes.  No longer are there individual species... there's a massive population of living beings, some of which can interbreed and produce remarkably similar offspring.  Some very few of which can interbreed in fascinating hybrid ways.  And most of which can't interbreed.  Over many generations, the characteristics of the various populations will shift and morph to fit their environment, driven by the environment's relentless selection of only those individuals that are least-worst suited to it (i.e. those that can produce the most offspring).

Looking at it that way kinda makes the whole Evolution thing seem a little more clear.  At least, I find it so.


(*) - I picked height because it's generally a feature driven by many genes.  The case of dwarfism, though, would be an exception.  As I understand it, there is a single mutation that causes dwarfism in humans, and interestingly, a very similar mutation causes it in dogs.  Miniature breeds with short, stubby legs have that mutation, and it's quite conceivable that you could draw a line in the tree where it first shows up.  Aside from that, though, height is a good example for the point I'm trying to make.

Wednesday, June 1, 2011

Robo-Ethics

I went to a thing called Cafe Scientifique last night.  I happened upon one a month or so ago when it overlapped with the Skeptics In The Pub event that I was attending (only the second of those that I'd been to, as well!).  It struck me as rather a nifty idea - a person of a scientific persuasion shows up and talks about their specialty.  "Talks" being the operative word - it's something of a conversation with the group, not so much a lecture.

This week was a rather lively discussion on the ethics of Robotics, led by AJung Moon from UBC.  The conversation wandered a fair bit, and I'm afraid she didn't get a chance to talk about everything she'd planned to, but it was still fascinating and I'm glad she came out to talk with all of us.

There was a lot of discussion on defining what a Robot actually is.  Which I suppose shouldn't have been too surprising - a scientifically minded audience is going to want you to be precise with your definitions.  Perhaps a couple of people got overly into that sub-topic.

AJung did provide a definition - an entity that is capable of interacting with the world through sensors and some sort of mechanism (e.g. a camera and a robot arm), and that has some level of autonomous decision-making capability.  It's a broad definition, but the topic of ethics should cover a wide range of situations (if not all!), so it seemed suitable to me.
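Being a programmer, I can't help but read that definition as an interface.  This is my own sketch, not anything AJung presented, and all the names are mine:

# A minimal sketch of that working definition: something that senses the
# world, makes some autonomous decision, and acts back on the world.
# Class and method names are my own invention.
from abc import ABC, abstractmethod

class Robot(ABC):
    @abstractmethod
    def sense(self):
        """Read the world through sensors (e.g. grab a camera frame)."""

    @abstractmethod
    def decide(self, observation):
        """Some level of autonomous decision-making."""

    @abstractmethod
    def act(self, decision):
        """Affect the world through some mechanism (e.g. a robot arm)."""

    def step(self):
        """One sense-decide-act cycle."""
        self.act(self.decide(self.sense()))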

The very breadth of the topic is a little daunting.  After a little bit of introduction she jumped in with a number of more-or-less disturbing examples of robots that are out there now, some on the market, some in more of a research capacity.

For me, the most interesting, and most immediately pressing from an ethical point of view, is the robot teacher from Korea.  It's not fully autonomous - it has a human operator.  But the operator isn't controlling every minute motion of the robot.

The ethical questions she brought up around this particular example were along two lines:

1) the changing of the teacher-student relationship - normally the teacher is dominant and the student submissive.  But the person-robot relationship is dominant-submissive as well, so when you make the teacher a robot, suddenly you have a dominant role filled by a submissive entity...

2) uncertainty around such young children being exposed to and interacting with a telepresent person - will they be able to tell the difference between a telepresent person, a robot, and a person who's physically there?

Both are interesting questions.  My gut feeling is that the second is less of a problem than some people might think - kids are pretty good at sorting out complex rules of interaction with different people.  I suspect they'll handle this new telepresent person-entity just fine.

But there's definitely room for research into both - how do kids bond and interact with such artificial entities?  How do people behave when you have contradictory role and entity relationship modes?  Science and research can provide a lot of data to inform ethical decisions like that, and it's good to have people like AJung bringing them up for consideration.

I think I'll be going to more of these events.

Saturday, May 28, 2011

Another Look at the Cost of Working in Games

After talking with a couple of friends I decided to take another look at the overtime data I posted on the 26th.  Specifically, I decided it would be interesting to look at a weekly breakdown.  People are more used to talking about hours-per-week and it makes the data a lot more approachable.  I'll still do it in minutes, to keep the integer accuracy.

I went through and sorted each day's OT number into one of 5 categories:

  1. Slacking off (working a short day)
  2. Comp Time (specifically given time off in lieu of a holiday worked)
  3. Regular OT (extra time spent working on a regular work day)
  4. Holidays Worked
  5. Weekends Worked

I then summed the daily totals into weekly totals in each category and generated the following chart.


And for completeness, the net sum as well:



While doing that, I ran into a few days early on where I wasn't counting holidays worked/comped in the same way I was later.  I corrected those for this set of charts.  I may go back and adjust my previous data if I get the time.
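For those curious about the mechanics, the bucketing and weekly roll-up behind those charts looks roughly like this.  It's a Python sketch rather than my actual spreadsheet - the record format, and the sign-based split between slacking and regular OT, are just for illustration:

# A sketch of the daily bucketing and weekly roll-up described above.
# My real data lives in a spreadsheet; the record format here is invented.
from collections import defaultdict

CATEGORIES = ["slack", "comp", "regular_ot", "holiday", "weekend"]

def categorize(day):
    """Sort one day's overtime minutes into one of the five categories."""
    if day["comp_day"]:          # time off given in lieu of a holiday worked
        return "comp"
    if day["is_holiday"]:
        return "holiday"
    if day["is_weekend"]:
        return "weekend"
    return "slack" if day["ot_minutes"] < 0 else "regular_ot"

def weekly_totals(days):
    """Sum daily OT minutes into per-week, per-category totals."""
    weeks = defaultdict(lambda: dict.fromkeys(CATEGORIES, 0))
    for day in days:
        weeks[day["week"]][categorize(day)] += day["ot_minutes"]
    return weeks

# A made-up example week: one short day, one long day, one Saturday worked.
days = [
    {"week": 1, "ot_minutes": -60, "is_weekend": False, "is_holiday": False, "comp_day": False},
    {"week": 1, "ot_minutes": 120, "is_weekend": False, "is_holiday": False, "comp_day": False},
    {"week": 1, "ot_minutes": 300, "is_weekend": True,  "is_holiday": False, "comp_day": False},
]
print(weekly_totals(days)[1])
# {'slack': -60, 'comp': 0, 'regular_ot': 120, 'holiday': 0, 'weekend': 300}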

Anyway, on with the cool stuff!

A couple of trends are pretty apparent.  After weeks of OT, and especially when I worked a weekend or holiday (or both), there is a bit of relaxation.  Mostly that came in the form of comp time, but I did slack off a bit after a run of weeks of OT.

You can also see the amount of slacking build up just prior to my post-project comp time and vacation.  That won't surprise anyone working in the industry - things slow down as the project is wrapping up.  There's really not a lot you can do at the office if you don't have bugs to fix.  So you come in a little later and maybe leave a little earlier.  As long as you stick around long enough to offer moral support and make sure those that do have work won't need help from you, you're free to go.

Finally, the overall trend of the project is a little more apparent.  Things start out quiet, with maybe a spike early on for a deadline.  Then a slowly building pattern of pulses - a push for a deadline followed by a little down time, then another push.  Toward the end it looks like I added more weekend time, but cut back a little on the regular-day OT.

Some stats

Weeks tracked: 132
Weeks with a 0 balance: 18
Weeks of holiday or vacation: 16
Working weeks with a 0 balance: 2

(Apparently I'm pretty bad at hitting a 0 balance.  Heh.)

Across all weeks, all numbers in minutes.
S & C = Slack & Comp time

          S & C       OT    Total
Count       132      132      132
AVG      -159.8    246.5     86.7
MEDIAN    -60.0    108.5     25.0
MIN     -2250.0      0.0  -2250.0
MAX         0.0   1941.0   1941.0
STDEV     334.6    344.1    517.9


Across weeks with non-zero values, again in minutes:

          S & C       OT    Total
Count        98      104      114
AVG      -215.3    312.9    100.4
MEDIAN    -94.0    169.5     53.0
MIN     -2250.0      2.0  -2250.0
MAX        -5.0   1941.0   1941.0
STDEV     373.1    360.1    556.4


The weeks where I had a lot of comp time are skewing things quite a bit, making it harder to see how much slacking I did.  Subtracting those weeks, I get:


          S & C
Count        95
AVG      -148.8
MEDIAN    -93.0
MIN      -784.0
MAX        -5.0
STDEV     151.7

That still has the occasional comp day mixed in, but it takes out the worst of the effects, so it should be easier to compare against the OT weeks.  Since the OT still contains the holidays worked, and those are the reason for the comp days, it seems only fair to leave them in.
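For completeness: the summary numbers in these tables are just the usual spreadsheet aggregates over the weekly totals.  A rough Python equivalent, assuming the weekly roll-up sketched earlier:

# A rough equivalent of the spreadsheet aggregates in the tables above,
# computed over a list of weekly totals (in minutes).
import statistics

def summarize(weekly_minutes):
    return {
        "Count":  len(weekly_minutes),
        "AVG":    round(statistics.mean(weekly_minutes), 1),
        "MEDIAN": statistics.median(weekly_minutes),
        "MIN":    min(weekly_minutes),
        "MAX":    max(weekly_minutes),
        # the spreadsheet's STDEV() is, I believe, the sample standard deviation
        "STDEV":  round(statistics.stdev(weekly_minutes), 1),
    }

# e.g. the "S & C" column: summarize([w["slack"] + w["comp"] for w in weeks.values()])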


Conclusions

Again, this is by no means a complete picture.  These numbers don't capture things like the effects on my health and well-being, relationships with friends, etc.  But it's a start.  Now to see if I can figure out a good way to start tracking something quantifiable for those other aspects of my life and start including them on my new spreadsheet...