Ellie Dommett
Blogging space

Culture, Strategy and Breakfast

7/12/2018

Many of us have heard the quote that "Culture eats strategy for breakfast", which is often attributed to the famous management guru Peter Drucker. My recent reading for H817 reminded me that this quote applies not only to businesses in the traditional sense but also to universities.

This week I have been reading a paper by Macfadyen and Dawson (2012) called "Numbers are not enough." In this article they discuss several reasons why evidence, in the form of analytics, is not often taken on board at universities in a way that drives strategic change. They suggest two broad reasons for this:
1. The perceived attributes of innovation
2. The realities of university culture

When talking about the attributes of innovation, they raise the suggestion that staff workload often blocks innovation or change because people do not feel that they have the time to engage with it. I completely agree with this - workload is one of the most heavily cited reasons for not adopting new practices and, quite frankly, for a lot of substandard educational offerings. But I think there is more to it than that. It is not as though academics and many academic-related support staff are averse to working long hours; it is, however, the case that not all innovation is created equal. Let's take a hypothetical example. Consider a situation where someone works really hard to redevelop a VLE for their programme, or to create a comprehensive assessment strategy or an innovative module - what is their reward? You can make various suggestions here, including greater student satisfaction, respect from colleagues for their expertise, and so on. They may even get a peer-reviewed article out of it, but for the most part the reward is intangible. Now let's look at another situation, one in which those same hours are put into writing a grant application, which is also successful - what is the reward for that? In many universities the reward is much greater and much more concrete.

The other reasons cited by Macfadyen and Dawson (2012) relate to the culture of universities, and below I have given some thoughts on each of them:

1. Universities tend to operate on a consensus governance model, yet in reality consensus is rarely reached. The opposite of this, of course, would be a strict hierarchy - something which would result in cries of top-down control and extremely unhappy staff. Having seen this first-hand, I don't think anyone gains from a strict hierarchy, least of all the students, but on the other hand, having a decision actually made at a university can sometimes be refreshing.
2. Faculty control over teaching and research is also raised as a potential cultural blocker to adopting innovation. I partially agree with this, but I also think this faculty control supports innovation because it allows for smaller-scale testing grounds. Whilst this creates difficulty in gaining consistency, it also supports grass-roots innovation. The blocker for me is actually communication of the grass-roots findings, rather than the fact that the grass roots exist.
3. An organisational culture that supports change by adding resources rather than re-allocating them. This paper was written in 2012, which in the UK was just before the fee hike that changed the face of higher education and, in some cases, put a massive strain on university finances, at least in the short term. I think we are much more comfortable now with the idea of reallocation, but - given my comments above about how research is rewarded - I would be amazed if funding were ever reallocated from research infrastructure to education.
4. A curriculum structure that makes false assumptions about heterogeneity. I suspect this one is still true, and possibly even more of a problem than it was in 2012, as we see a greater diversity of students.

In addition to the four listed above, I would like to add Change Fatigue to the cultural reasons why innovation is not often adopted. In some universities there is a constant cycle of change - so much so that things nearly always come full circle - but many small changes happening one after the other create feelings of uncertainty, and the benefits of each are often too small to be realised. This means that people simply get tired of change, which appears to happen for its own sake rather than for good reason and to good effect.

Having reviewed the suggestions as to why innovation is rarely adopted, even where there is evidence to support its implementation, I agree there is a range of factors, but I suspect they all still come down to one thing: the culture of universities - because it is this culture that determines the value placed on innovation, rather than the inherent properties of the innovation itself.



Different numbers mean different things to different people

7/3/2018

In this blog I am reflecting on how analytics, such as those generated by Google Analytics, could be useful in education and, specifically, what would be most useful to different people. As always within education, I want to start with the learner.

What can numbers tell me about how I learn and about how others learn?
The first part of this question might sound logical, but why should a learner care about how others learn as well? I think there are two reasons they should. Firstly, the truth is that there is often an element of competition - maybe not openly amongst peers, but certainly in the job market - and it is therefore helpful to have a sense of where you are in the cohort. Secondly, knowing how a first-class student works can provide insight for other students. The latter is something we often lament in neuroscience - when trying to understand how, for example, the brain learns, we look at what happens when it goes wrong. This can be truly insightful, but it is strange that we do not consider looking at the brains of those who have it mastered! I think there are a number of analytics that could help a learner address these questions:
  • User explorer - i.e. the individual learner's behaviour on the VLE, for example
  • Cohort analysis - where the cohort could be defined as your whole class, e.g. everyone taking a set module, or everyone taking a set module who performed similarly to you on the pre-requisite course, etc.
  • User flow - this would be very helpful if shown for an individual learner and then for a high-performing cohort. It would enable the learner to see how the strongest learners navigate their learning and could give them tips for changing their behaviour (a rough sketch of this kind of comparison is given below).
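
To make that last point concrete, here is a minimal sketch of how an individual learner's navigation might be compared with that of a high-performing cohort. All of the data, the page names and the 70-mark threshold are invented purely for illustration; a real VLE export would look quite different.

from collections import Counter

# Hypothetical VLE click-stream data: for each student, the ordered list of
# pages they visited, plus their final module mark (invented for illustration).
logs = {
    "student_a": (["home", "lecture_1", "quiz_1", "forum"], 82),
    "student_b": (["home", "forum", "quiz_1"], 58),
    "student_c": (["home", "lecture_1", "notes", "quiz_1", "forum"], 75),
    "me":        (["home", "quiz_1"], None),  # the individual learner
}

# Define the "high-performing cohort" as students scoring 70 or above.
high_performers = {s: path for s, (path, mark) in logs.items()
                   if mark is not None and mark >= 70}

# Count which page-to-page transitions the high performers make most often,
# i.e. a very crude "user flow" summary.
transitions = Counter()
for path in high_performers.values():
    transitions.update(zip(path, path[1:]))

print("Most common transitions among high performers:")
for (src, dst), n in transitions.most_common(3):
    print(f"  {src} -> {dst} ({n} students)")

# Show the individual learner which of those transitions they have not yet made.
my_path = logs["me"][0]
my_transitions = set(zip(my_path, my_path[1:]))
missing = [t for t in transitions if t not in my_transitions]
print("Transitions I have not made yet:", missing)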

What analytics can I use to enhance my teaching?
I think there is a lot of information that a teacher can use from analytics to enhance their practice, but I also think that with a class of 200, for example, individual-level data will be conflicting and unhelpful, so for the educator designing material and learning activities, group-level data is key. For example, the following could help:
  • Demographic data including location: This can be helpful because it tells you a little bit more about your cohort and diversity but it is also a way to make your learning more inclusive. Take Psychology for example; the traditional student in this discipline is the white middle class female, but the field is becoming increasingly diverse and having an awareness of this enables educators to choose relevant and appropriate examples in their teaching. This is particularly important in a self-reflective discipline.
  • Cohort analysis: This can be helpful to allow comparisons before and after specific changes to a module but also to see if any particular cohorts require extra support e.g. those with or without a specific A-level subject may require extra help in the first year with certain modules.
  • User behaviour: For example, looking at how often people are logging in, and for how long, can be helpful if the VLE will be used for specific time-limited activities (see the sketch after this list).
Following on from these group-level analytics, if an individual student requires additional help, then it is useful to have information on them specifically, similar to the kind of information that the learner themselves may find helpful.
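
As a rough illustration of the kind of group-level summary I mean, the sketch below compares average session length and logins per student across two cohorts, for example before and after a module redesign. The session records, cohort labels and numbers are all invented for the purpose of illustration.

from datetime import datetime
from statistics import mean

# Hypothetical VLE session records: (student, cohort, login time, minutes online).
sessions = [
    ("s1", "2017_cohort", datetime(2017, 10, 2, 9, 0), 35),
    ("s2", "2017_cohort", datetime(2017, 10, 9, 20, 15), 12),
    ("s3", "2018_cohort", datetime(2018, 10, 1, 8, 30), 48),
    ("s4", "2018_cohort", datetime(2018, 10, 3, 13, 0), 52),
    ("s4", "2018_cohort", datetime(2018, 10, 10, 13, 0), 40),
]

# Group-level summary: average session length and logins per student, by cohort.
by_cohort = {}
for student, cohort, _when, minutes in sessions:
    by_cohort.setdefault(cohort, {"minutes": [], "students": set()})
    by_cohort[cohort]["minutes"].append(minutes)
    by_cohort[cohort]["students"].add(student)

for cohort, data in sorted(by_cohort.items()):
    logins_per_student = len(data["minutes"]) / len(data["students"])
    print(f"{cohort}: mean session {mean(data['minutes']):.1f} min, "
          f"{logins_per_student:.1f} logins per student")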

What about TEL support staff?
We were asked as part of the activity to consider administrators as well, but in my current role administrators have little to do with the design of the programme, or with student support beyond processing attendance and dealing with mitigating circumstances around assessment, so instead I chose to consider the role of TEL support staff. For this group there is some potentially useful data that could inform high-level design of the online learning resources we offer, such as:
  • Mobile devices and technology: It is very helpful to know what kind of devices learners are using to access learning resources. This gives a sense of what minimum requirements must be met, and also whether we are spending too much resource on, for example, creating a better mobile app when most students log in from a laptop (a sketch of this kind of breakdown follows below).
  • Benchmark data: Knowing how we compare to other universities could be very helpful. 
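
For the device question, something as simple as the following sketch would give a sense of the proportions. The access log and device categories here are entirely made up; real analytics platforms report this kind of breakdown directly.

from collections import Counter

# Hypothetical access log recording the device category used for each login.
device_log = ["laptop", "mobile", "laptop", "tablet", "laptop",
              "mobile", "laptop", "laptop", "mobile", "laptop"]

counts = Counter(device_log)
total = len(device_log)
for device, n in counts.most_common():
    print(f"{device}: {n / total:.0%} of logins")
# If most logins come from laptops, that would question whether a large spend
# on a dedicated mobile app is the right priority.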

I think the key with analytics is that all data should be available to any role, but that it is sensible to first provide the relevant data to specific individuals. If they then wish to delve a little deeper, it may be appropriate to share, for example, data you would normally reserve for TEL support staff with the educator. Of course, this would probably not be necessary if teaching and learning were co-constructed by all three of these key roles.

