English Matters

Once Again, Without Data

It has become a common complaint, across the board, that statistics on Canadian higher education are sorely lacking or, when they do exist, misleading; ACCUTE past-president Stephen Slemon has written about this matter on this very blog.  Whether it’s faculty hiring (and of what sorts), student outcomes, or the relative growth in funding and expenditures across capital projects, administration, and teaching, we’re operating with few details, and often with far fewer facts than our American cousins have at their disposal.  Of course, we can still learn from some of those American data points, but we do have to keep the different contexts (between nations, between states and provinces, between educational systems) firmly in mind.

And so it’s with that caveat in mind that I wanted to bring to your attention two different surveys being conducted.

The first is being run by (the much cut) Statistics Canada (the real one, not the Twitter parody): an invitation-only survey, “Research and Development in the Higher Education Sector.” I’m in the middle of it now (had to stop to shovel, then chip ice, then get ready to shovel some more), and I’m feeling ambivalent about it.  The survey primarily addresses the balance of work between research, teaching, and administration.  I’m finding some of the questions rather science- and lab-specific, and it also draws a hard-and-fast line between teaching and research, one that seems… problematic.  (I may have gone on a wee rant in the comments, too, regarding the need for a proper, long-term analysis of hiring trends, by discipline, in the Canadian academy…)  How many of you are also filling this out? I wonder, for example, which faculty (CAF, tenure-track, etc.) were sent the survey, and how that was decided. And for those who are filling it out, what do you think of the questions?

The second survey is one you can all fill out, sent to us by our colleagues at the MLA.  Many of you will have seen their email: it’s a survey about Work Roles and Workforce Models, and you can find it here.  This project is being run by the Delphi Project on the Changing Faculty and Student Success.  From the first question, however, you’ll discover that it is not aimed at faculty teaching in Canadian colleges and universities, and so we’re back to my opening caveat.

Getting detailed Canadian statistics on postsecondary education is clearly necessary, and yet it seems not to be happening.  This is something ACCUTE has discussed with the CFHSS and CAUT before, and we will again, but maybe we should all also be raising it with our federal and provincial representatives?  What research projects out there are doing this work?  What do you think?

4 replies

  1. Hi Jason, yes, I will be honest and say I have become more and more disenchanted with the MLA and how they seem to consistently forget that Canadians AND adjuncts (sessionals) exist, not in any particular order. I also remember filling out a survey run by SSHRC a few months back that was likewise lacking in the key areas you mention above (poor demarcation between research and teaching, a lack of questions probing the financial issues and instability faced by those in graduate school, etc.). Thank you so much for this very necessary blog and for putting this issue out for discussion.

  2. The tweets coming from @MLA today are rather poorly phrased, and the study behind them relies on a very small sample. Rather than using the study to suggest that #alt-ac is a path being chosen, it is being spun to say that adjuncts lack PhDs (a very bold claim to make from such a statistically unrepresentative sample). So, long and short, you are right: we need more data!

  3. I would disagree that the problem with the MLA study is sample size. 2,500 people sampled from a population of 65,000 is actually extremely large (a margin of error of roughly 2% at a 95% confidence level; see the quick check after these replies). Many surveys work with far smaller samples than this. The problem, it seems to me, is the missing 300 people from the initial sample group whom the MLA couldn’t locate. That’s a fatal flaw when you have to discard 11% of your sample. I think it is safe to assume that few of these people wound up in tenure-track jobs, since departmental websites would be the easiest place to locate them. This gap throws the whole unemployed and under-employed end of the scale into question.

    The StatsCan survey looks like it will be even worse: self-reported hours worked are rife with misrepresentation, inaccuracy, and bias. There won’t be enough salt in the country to swallow those numbers.
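A quick back-of-the-envelope check of that “2% at a 95% confidence level” figure, as a minimal sketch: it assumes simple random sampling and the textbook margin-of-error formula with a finite population correction, using only the population and sample sizes from the reply above (everything else here is a standard assumption, not a detail of the MLA’s own methodology).

    import math

    # Figures from the reply above; the formula and 95% z-score are
    # standard assumptions, not details from the MLA study itself.
    N = 65000   # approximate population size
    n = 2500    # people actually sampled and located
    z = 1.96    # z-score for a 95% confidence level
    p = 0.5     # worst-case proportion (maximizes the margin of error)

    # Margin of error for a simple random sample, with finite population correction.
    moe = z * math.sqrt(p * (1 - p) / n) * math.sqrt((N - n) / (N - 1))
    print(f"Margin of error: +/-{moe:.1%}")  # roughly +/-1.9%, i.e. about 2%

Under those assumptions the margin of error comes out to roughly ±1.9%, which supports the point that 2,500 out of 65,000 is plenty of sample; the sharper worry is non-response bias from the 300 people who couldn’t be located, not sample size.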
