Friday 29 January 2016

The AI Goblin and the coming Internet of Trust

There's been a lot of talk about AI recently - which always makes me suspect that things are about to change drastically in technology. The 'AI Goblin' is a herald of change - but the change is always something other than AI! In the late 80s and early 90s, there was similar talk: "the future is in expert systems, AI, data processing!" was the cry. Only more thoughtful people like Fernando Flores (plus quite a few other cyberneticians at the time) knew this was nonsense. The future of technology, they said (in 1986), was communication. By 1993 with the launch of the World Wide Web, we knew that they were right.

This is not to say that there haven't been advances in AI, and clearly "big data" is a big - if conceptually confused - thing. But Google translate and speech recognition remain the most impressive examples - hardly the kind of "wicked problems" that human intelligence is required to sort out. More modest data analysis and visualisations (for example, Danny Dorling or Hans Rosling) is far more powerful than any amount of automated sentiment analysis, topic modelling, and so on. Where to from here? Intelligent x, y, and z - we are told. But to what purpose? In whose interests?

But just as the 1990s AI goblin heralded a drastic technological change, so it does now - and just as in 1993, it's got little to do with AI.

We have learnt that the communicative internet has many wonderful properties, and it has led to us all becoming wired into the network. We've also learnt that it is possible to lie to people within this complex communication network and (very often) get away with it. Sometimes the lies are very big and very expensive: lies about economic performance particularly. Transparency on today's internet is just a mask: underneath the mask everything can be changed by powerful people, and nobody would be any the wiser. But there is more to communication than exchanging text messages and documents. Where the communicative internet has failed is in establishing trust.

This is why Bitcoin is likely to be seen by future historians (if not economists) as the real revolutionary technology of the age. Because in order to have a functioning currency people have to trust the authority that backs up the currency. We trust the Bank of England when they write on bits of paper "I promise to pay the bearer" and so the currency has value. We also trust the bank and national governments to manage the money supply so that the bits of paper (which are really IOUs) are sufficiently scarce (but not too scarce) to maintain the value of the currency and keep the social mechanisms of exchange going. But Bitcoin has no bank and no central authority. There is no single person who declares "I promise to pay the bearer", there is no authority to control the money supply. There is a distributed ledger which everyone can see, and a mechanism for managing the money supply linked to the activity of verifying the contents of the ledger. So what we have here is a technology of trust: a radically different way of establishing faith in something without putting some social institution or individual in a position of authority.

Key to it is the way the ledger works and this - the Blockchain - is the thing that has got technologists excited.
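The mechanics can be sketched in a few lines of code. This is a toy illustration only - a real blockchain adds distributed consensus, proof-of-work and digital signatures - but it shows the essential trick: each block commits to the hash of the block before it, so anyone can re-verify the whole ledger without trusting a central authority.

```python
# A minimal sketch of a hash-linked ledger. Illustrative only: real
# blockchains add distributed consensus, mining and signatures.
import hashlib
import json

def block_hash(block):
    """Deterministic hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, transaction):
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transaction": transaction})
    return chain

def verify(chain):
    """Anyone can re-check the whole ledger - no authority required."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
add_block(chain, "Alice pays Bob 5")
add_block(chain, "Bob pays Carol 2")
assert verify(chain)

# Tampering with history breaks every later link:
chain[0]["transaction"] = "Alice pays Bob 500"
assert not verify(chain)
```

The point is not the cryptography but the social consequence: trust is established by a structure that everyone can check, rather than by an institution that everyone must believe.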

There is no reason why the internet should work in the way that it does. We have servers and domain names because of a particular set of historical conditions and design decisions which led to the World Wide Web. But it has many drawbacks: it's centralised, it's impermanent, and it exposes everyone to surveillance which is mostly used for trying to sell us things. The blockchain represents a different way of organising the internet: decentralised (it's peer-to-peer) and permanent (every file, every version of a file, every web page is stored). And whilst it provides a transparent audit of every single transaction on the web, the actual contents of those transactions remain private - so the web doesn't simply become a vehicle for selling you things, and illicit activities would technically be flagged up more readily.
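The permanence comes from content-addressing, the idea behind systems like IPFS: a resource is located by the hash of its contents rather than by a server's domain name, so changing content produces a new address and old versions survive. A minimal sketch (the in-memory dictionary here is a stand-in for the peer-to-peer network):

```python
# Content-addressing sketch: addresses are derived from content, so the
# address itself verifies what is fetched, and old versions persist.
import hashlib

def address_of(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

store = {}  # stands in for the distributed peer-to-peer network

def publish(content: bytes) -> str:
    addr = address_of(content)
    store[addr] = content  # nothing is ever overwritten
    return addr

def fetch(addr: str) -> bytes:
    content = store[addr]
    assert address_of(content) == addr  # the address verifies the content
    return content

v1 = publish(b"my homepage, version 1")
v2 = publish(b"my homepage, version 2")
assert v1 != v2                                 # new version, new address...
assert fetch(v1) == b"my homepage, version 1"  # ...and the old one persists
```

Contrast this with a URL, which names a location controlled by someone else: whoever controls the server can silently change or delete what the name points to.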

Of course, the banks are interested because they have been in the firing line of breaches of trust which have exploited the internet thus far. National governments are interested because they don't trust each others' economic data. Radicals are interested because they believe the internet can be a better place than a giant panopticon. Educationalists should be interested because a better way of organising technology allows us to rethink the relationship between technology, education and society. 

Sunday 24 January 2016

Why does educational theory get 'stuck'?

An educational theory is an attempt to calibrate understanding about educational process with experiences gained from educational practice. Yet since education is a moving target, changed by government policies, technologies, new teachers, new students, and attempts to theorise it, the calibration process must be a continual effort. Yet so often, as with the social sciences more broadly, it isn't. Instead we see educational theories "solidified" (or reified) as a kind of blueprint for educational practice. This is particularly surprising given that most educational theories have grown from the system sciences (either General Systems Theory - Piaget, Vygotsky - or Cybernetics - Kolb, Pask, von Glasersfeld... maybe critical pedagogy theorists too) which are fundamentally concerned with circular and continually emergent processes. Why does this happen?

One symptom of the "stuckness" of educational theory is the fact that few educational theories are explored for their explanatory and predictive failures rather than their successes. In cybernetics, failure, or error, is the driving force behind the dynamic process of calibration. Bateson summarised this with his 'zig-zag' diagrams in Mind and Nature, here explaining the relationship between a thermostat managing room temperature and a person:



A theory is a way of logically generating possibilities which provide a lens for exploring reality. Ross Ashby was a cybernetician whose attempts to model the brain (and indeed, unsuccessfully, to make an artificial one) led him to think deeply about his methodology: he argued that his knowledge arose from seeing what wasn't accounted for in any model, or where the model was wrong. Of course, models will always be wrong at some level. But seeing knowledge as emerging from where things don't fit rather than where they do is something of a Copernican turn in epistemology. And there are some obvious things that educational models cannot model: for example, the process of modelling cannot account for the factors that affect modellers (and everyone else): there are constraints bearing upon the modeller's brain - not just biology, but discourse, economics, physics, chemistry and education itself. How could Ashby's brain know a brain? To be fair to him, Ashby knew the question well, and his efforts were really about exploring the parameters of his lack of understanding. Following his example, we might ask how a brain can understand education. How could we pursue our lack of understanding of education scientifically?

To explore the lack of understanding of something, or the contours of understanding/non-understanding, first demands that the separation between knowledge and practice is dissolved. Ryle's distinction between knowing-how and knowing-that doesn't make much sense to a cybernetician. The only thing to say about practical knowledge as opposed to theoretical knowledge is that the balance of constraints bearing upon each is slightly different: academic knowledge is constrained by social forces like discourse and social norms; practical knowledge, whilst also being situated in discourse, is perhaps more directly constrained by individual bodies and psychologies. The two are entwined: the utterances of theoretical knowledge are the results of skilled performances which would fall under Ryle's 'knowing how'.

Having said all this, the fact is that we don't explore constraints in education. Instead we look for causal mechanisms. In the process, theory gets stuck. Why is this, even when it appears that educational theories are deficient in some important aspect or other?

As a critic of this situation, I am in the position of a 'modeller' whose model of the complexities of education and the role of theory as calibration is not borne out by the experience of observing the dynamics of educational theory. However, if I am true to the principles that I have outlined, I am not seeking to identify mechanisms in my model which explain reality. I am looking for the deficiencies in my model in the light of experience, for those deficiencies give me more information about the boundary constraints of education. The pursuit of these boundaries drives intellectual excitement and curiosity: in the case of theories of physics or chemistry which successfully predict natural events, the intellectual excitement is the pursuit of the boundaries within which given explanations hold good, and beyond which they break down. So often, it is this intellectual excitement which appears to be missing from education: curiosity gives way to dogma.

The constraints bearing on education (and theory about education) cover every discipline: physics, chemistry, biology, art and aesthetics, sociology, etc. The pathology of stuckness seems to set in when obvious constraints are ignored. For example, positivistic attitudes to education which argue for the measurability of learning through continuous monitoring and testing not only overlook the psychological, sociological and biological constraints which bear upon educational development, but also overlook the many new constraints which their own approach (and the agents who uphold it) imposes upon educational practice, changing the nature of education. But the positivists will defend their position as 'scientific', its interventions as causally defensible. Do they defend it more strongly because they know (deep down) it ignores so much? Is there a connection between the deficiencies of a model and the vehemence with which it is defended?

There is something odd about the social mechanisms which reward sticking to an established educational theory even in the light of it not being entirely effective. Here we are asking something about the different kinds of constraint that are at work and the ways they interfere with each other. There are plenty of examples of deficient explanatory ideas which are vehemently maintained in the absence of any justification. Religions, for example, are much more 'sticky' than educational theories!

In cybernetics, this interference between different constraints at different levels was studied by Gregory Bateson. Bateson based his theories on Ashby's identification of levels of coordination, in conjunction with Russell and Whitehead's idea of 'logical levels'. In Bateson's language, the 'stuckness' of educational theory is the result of a kind of 'knot' between different logical levels of constraint. A particular variety of this dynamic he called the 'double bind'. For example, we might imagine the constraints of a discourse, with its journals, reviewers, and a host of codified expectations about education. Then we can imagine the constraints within an institution which bear upon individual academics who wish to maintain their careers - which they can do by publishing. And we can imagine the constraints that bear upon individual teachers and learners. A paper identifying a deficiency in a theory - one which perhaps spoke of the realities of classroom experience - would be very difficult to get published if the reviewers had a stake in maintaining a particular theoretical pitch which was blind to particular issues. Given that the institution rewards publication, and the individual academic wants to progress in their career, the natural solution to the tension is to maintain existing theory. By doing so, the conditions are created in which, despite the dissonance between theory and practice, it becomes increasingly likely that the existing theory is safely maintained.

Knots like these are born out of fear. The academic ego which resists the evaluation of its theory is born of fear, and since fear is a bigger part of the university culture today than ever before, the state of knowledge within universities is a matter of serious concern.

Bateson was interested in how knots can be untied. The challenge, he argued, was to help people step outside the 'double-bind' situation they are in and understand its dynamic. This involves invoking a 'higher power' in the hierarchy which can observe the situation as a whole. Sometimes a "higher power" simply emerges by virtue of growing older. An interesting example of this is the concept of 'Santa Claus'. For very young children, Santa serves as an explanation for where their presents come from at Christmas, and some children can be genuinely anxious about whether they have been 'good enough'. As children grow older, the concept of "Santa" cannot explain the obvious evidence that presents under the Christmas tree appear to be bought by mum and dad. 'Santa' persists as a concept, not only as an explanatory principle for the very young about presents, but also as an explanation for the older children about child psychology - that telling silly stories to young children is a playful and loving thing to do. This is knowledge emerging through the revelation of constraint, where the constraint revealed by the concept of Santa is the cognitive difference between the very young and the not so very young. Understanding this constraint is an important stage in the process of growing up.

Education, by contrast, doesn't have the luxury of simply being able to 'grow up'. The higher power which could address its double-binds will have to come in the form of a new way in which education thinks about itself.

Wednesday 20 January 2016

Bateson on Numbers, Quantity and Pattern

I was looking in Bateson's Mind and Nature for something other than that which eventually caught my attention (I was looking for what he said about calibration and feedback - more about that later). But the book opened on page 49 with the heading "Number is different from quantity". Since I have been thinking a lot about counting and probability recently, this resonated very strongly - Bateson's position with regard to probability is unclear, but his focus on counting here is important and relevant to the way we look at information and data today. Big data, for example, is fundamentally a counting exercise, not a quantifying one. Bateson says:
"Numbers are the product of counting. Quantities are the product of measurement."
So here we are going to get a distinction between counting and measurement...

"This means that numbers can conceivably be accurate because there is a discontinuity between each integer and the next. Between two and three, there is a jump. In the case of quantity, there is no such jump; and because jump is missing in the world of quantity, it is impossible for any quantity to be exact. You can have exactly three tomatoes. You can never have exactly three gallons of water. Always quantity is approximate."
Given the fact that I was looking for his ideas about calibration, this distinction between measurement as an approximation (a calibration?) and counting I found insightful.
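The distinction is visible even in how computers handle the two: integer arithmetic (counting) has Bateson's discontinuous 'jumps' and is exact, whilst floating-point arithmetic (the computational analogue of measured quantity) is inherently approximate:

```python
# Counting is exact: there is a "jump" between each integer and the next.
assert 1 + 1 + 1 == 3          # exactly three tomatoes

# Quantity is approximate: continuous magnitudes have no such jumps, and
# their floating-point representation is inexact.
assert 0.1 + 0.2 != 0.3        # never exactly "three gallons"
print(0.1 + 0.2)               # 0.30000000000000004
```

The parallel is loose - floating-point error is a representational artefact, not a property of nature - but it makes the same point: exactness belongs to the discrete world of number, approximation to the continuous world of quantity.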

"Even when number and quantity are clearly discriminated, there is another concept that must be recognized and distinguished from both number and quantity. For this other concept, there is, I think, no English word, so we have to be content with remembering that there is a subset of patterns whose members are commonly called "numbers". Not all numbers are the products of counting. Indeed, it is the smaller, and therefore commoner, numbers that are often not counted but recognised as patterns at a single glance. Cardplayers do not stop to count the pips in the eight of spades and can even recognize the characteristic patterning of pips up to 'ten'.
In other words, number is of the world of pattern, gestalt and digital computation; quantity is the world of analogic and probabilistic computation."
He then goes on to talk about Otto Koehler's experiments with jackdaws which appear to count pieces of meat placed in cups. Bateson argues that the jackdaws "count"; I'm not sure (how could we be sure??!) - but there appear to be similarities between the manifest behaviour of the bird and human behaviour in counting.

He then goes on to say that "Quantity does not determine pattern". This one is particularly important for Big Data enthusiasts.

"It is impossible, in principle, to explain any pattern by invoking a single quantity. But note that a ratio between two quantities is already the beginning of pattern. In other words, quantity and pattern are of different logical type and do not readily fit together in the same thinking. 
What appears to be a genesis of pattern by quantity arises where the pattern was latent before the quantity had impact on the system. The familiar case is that of tension which will break a chain at the weakest link. Under change of quantity, tension, a latent difference is made manifest or, as the photographers would say, developed. The development of a photographic negative is precisely the making manifest of latent differences laid down in the photographic emulsion by previous differential exposure to light. 
Imagine an island with two mountains on it. A quantitative change, a rise, in the level of the ocean may convert this single island into two islands. This will happen at the point where the level of the ocean rises higher than the saddle between the two mountains. Again, the qualitative pattern was latent before the quantity had impact on it; and when the pattern changed, the change was sudden and discontinuous.
There is a strong tendency in explanatory prose to invoke quantities of tension, energy, and whatnot to explain the genesis of pattern. I believe that such explanations are inappropriate or wrong. From the point of view of any agent who imposes a quantitative change, any change of pattern which may occur will be unpredictable or divergent. "

Pattern is closely tied up with the idea of constraint. In the example of the mountains, quantity might be thought of as an index of change which reveals latent patterning. "More water" is a kind of construct based on the similarity between "a bit of water" and "a lot of water". In reality the difference between "a bit" and "a lot" is revealed through the constraints of nature that it encounters.
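Bateson's island lends itself to a small computational sketch. The height profile below is hypothetical: two peaks with a saddle between them. A smooth quantitative rise in sea level produces a sudden, discontinuous change of pattern - one island becomes two - at exactly the point where the water passes the saddle:

```python
# A smooth change in quantity (sea level) reveals a latent pattern
# (two peaks) as a discontinuous change (one island becomes two).
def count_islands(heights, sea_level):
    """Count maximal runs of land, i.e. heights above sea level."""
    islands, on_land = 0, False
    for h in heights:
        if h > sea_level and not on_land:
            islands += 1
        on_land = h > sea_level
    return islands

# Hypothetical island profile: peaks of 10 and 9, saddle at height 5.
profile = [0, 3, 8, 10, 6, 5, 7, 9, 4, 0]

assert count_islands(profile, sea_level=4) == 1  # below the saddle: one island
assert count_islands(profile, sea_level=5) == 2  # just above it: suddenly two
```

No amount of inspecting the quantity "sea level" on its own could predict the jump; the pattern was latent in the constraints of the terrain.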

There are some similarities between this thinking and the talk about the "tendencies" and "causal powers" of things in Critical Realism. I've always found the Critical Realist approach confused on this (or found myself easily confused by it). In a few paragraphs, Bateson cuts through the urge to make declarations about the nature of a mind-independent reality without losing sight of scientific integrity. 

Sunday 17 January 2016

Doing Cybernetics (and talking about it)

Talking about cybernetics is not the same as doing it. Doing cybernetics is a way of speculating about nature whilst actively engaging in a methodical way with nature. Talking about cybernetics is a largely descriptive and analytical exercise. Academia, on the whole, encourages people to talk about things. The kind of speculative activity that cybernetics engages in does not fit academic expectations. There are some areas of academia where doing cybernetics is possible - most notably, the performing arts, and also the practice of education itself. However, cybernetic speculations struggle to establish academic legitimacy - particularly when their codification is divorced from the practice that gave rise to their findings. There seems to be an unseemly rush to codify cybernetic speculations, and then to turn them into objects of discursive inquiry, or (worse) blueprints for interventions or analysis. As a result speculations get reified. The most appalling example of this is in the reification of the various speculations about learning contained within constructivism: speculation becomes dogma.

This has led me to think that maybe a meta-description of what 'doing cybernetics' means might help. I think we need a kind of meta-model of cybernetic activity which defends the highly speculative activities of cybernetics within the context of scientific practice recognised by institutions.

I start by thinking about modelling. Modelling is often associated with prediction and control, and there are many second-rate descriptions of cybernetics which commit this mistake. In fact, modelling is fundamentally about coordinating understanding through mapping a set of idealised constraints. A model, or a machine or software which articulates a model, provides an opportunity for cyberneticians to point at abstract mechanisms and reach shared understanding. Moreover, models are generative of many emergent possibilities. In agreeing about fundamental mechanisms and components, cyberneticians can also agree about the logical emergent potential of a model and its fundamental properties. It is through this logical emergent potential that modelling acquires the connotation of prediction. But this is really to get the wrong end of the stick.

If a model presents logical possibilities, reality (or nature) produces events which may be mapped and measured in the model. The most important process in modelling is the 'indexing' of real events with particular behaviours or emergent processes in the model. The indexing of events is not a trivial exercise: it is at the very heart of the cybernetic process. To index something is to determine similarities and differences between events: it is to reduce perceptions to manageable and measurable components. Although this is a reduction, it is not necessarily reductionist. It would only become reductionist if a particular reduction became dogma, washing over emerging differences which did not fit the model. In actual cybernetic practice, reductions are useful precisely because they reveal their deficiencies in the light of actual experiences.

To measure events is to index them, and then count similar events and their relations. In this process, the key feature is distinguishing between unsurprising events and surprising ones. In measuring surprise among events, the degree of surprise in nature can be compared to the logical model of how such surprise might be generated. Knowledge is gained by recognising how the constraints of nature are different from the constraints contained within the abstract model. Usually this leads to the identification of new indexes in nature (new things to count), or to critiquing some indexes of things that were thought to be the same, but in fact are distinct. Models consequently have to be adapted to take account of new indexes, and in being adapted generate new sets of possibilities.
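One way to make "measuring surprise" concrete is Shannon's notion of surprisal, -log2(p): an event to which the model assigns low probability carries more bits of surprise. A sketch, with made-up model probabilities for two hypothetical indexed event types:

```python
# Comparing the surprisingness of observed events against a model.
# The probabilities below are invented for illustration.
import math

def surprisal(p):
    """Bits of surprise for an event the model assigns probability p."""
    return -math.log2(p)

model = {"expected": 0.9, "rare": 0.1}  # hypothetical event indexes

assert surprisal(model["expected"]) < surprisal(model["rare"])
assert surprisal(0.5) == 1.0  # a fair coin toss carries exactly 1 bit

# A stream of observations dominated by "rare" events signals that the
# model's constraints differ from nature's - the cue to find new indexes.
observed = ["rare", "rare", "expected", "rare"]
avg = sum(surprisal(model[e]) for e in observed) / len(observed)
assert avg > 1.0  # more average surprise than a coin toss: model deficient
```

The numbers are secondary; the cybernetic point is the loop - persistent surprise is not noise to be averaged away but information about where the model's indexes fail to fit nature.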

The disparity between constraints discovered in nature and constraints within a model is only part of the story. There are also many constraints which bear upon the minds of those who produce models. Among the most serious constraints which appear to bear on all humans is the constraint which encourages individuals to insist on the veracity of particular models even in the face of evidence that they are deficient. Cybernetics requires more than a logical approach to the examination of models and participation with nature. It also requires a high degree of self-examination, reflection and a letting-go of ego in the continual maintenance of a speculative attitude.

In the philosophy of science, the social identification of causal paradigms or explanations has been the central focus. Doing cybernetics brings a different social focus: the coordination of a discourse through the shared participation in nature, and social agreement about indexes within both nature and in models. Agreeing similarities is to determine constraints. To measure the surprisingness of events against a model is to learn about new constraints.

Perhaps the surprising thing is that I think good teachers do this all the time, and always have done!

Thursday 14 January 2016

Is E-Learning really an Academic Discipline - or is it just pretending?

There are very few e-learning academic papers I would recommend anybody to read. And yet, when examining the 'highest impact' journals in the 'Social Sciences' section on Google Scholar's index, what comes out top? The journal "Computers and Education".

Of course it's interesting to see which competing journals Computers and Education is seen to be superior to. Down at 16 is the American Sociological Review, and at 19, Organisation Studies! Frankly, this is surprising. What's going on? 

This reflects a kind of game that's going on in academic publishing, played between publishers (who are making a fortune), universities (whose managements seek simple metrics to manage academic staff) and academic staff who want to get promoted, or keep their jobs. Educational technology papers get the most citations because the topic is at the heart of the game. Their scholarly content doesn't compare favourably with a journal like Organisation Studies. So many papers are formulaic applications of methodologies whose handles are cranked on spurious data from ill-conceived questionnaires and achievement data, with uncritical assertions of causal mechanisms - all without asking any meaningful questions.

It all reminds me of Stafford Beer's statement of Heinz von Foerster's 'Theorem no. 1': The more profound the problem that is ignored, the greater the chances for fame and success. (in von Foerster, "The responsibilities of competence" see http://link.springer.com/chapter/10.1007/0-387-21722-3_6)

Monday 11 January 2016

Elevating the human spirit: Why Music Matters

The last week has brought the deaths of David Bowie and Pierre Boulez. You couldn't really get an odder couple! Today's hysteria around Bowie's death is at least partly about nostalgia on the part of his fans for a musical soundtrack which accompanied their teens. But there is also creative musical and artistic innovation - a kind of gesamtkunstwerk of the embodied artist as media icon, composer, performer, actor, self-promoter and social commentator. Boulez, by contrast, is all about music as music, which he understood intellectually very deeply indeed.

To take away the sentiment about Bowie leaves something tangible in the music. How much is left, time will tell. There is little sentiment in Boulez - just the power of an intellect whose sonic form is, for many, too raw to take. But it is an intellect and insight which gave him remarkable powers of interpretation of 20th century classical music - much of it tremendously delicate. In music, it is what is left which matters. Musicians live for a short time. Music lives for centuries.

We live in a very confused time. There is more than consolation or entertainment in the music of the past. There are messages from those who lived in confused times but who managed to make some sense out of them, and the sense they made was often beautiful, challenging, stimulating or sometimes downright weird. The musical sense is the sense of connection with the past: the practices, the intellectual struggle, the techniques, solutions, tricks and so on. And what it produces is a 'whole' - an ecology - with components connecting across centuries via the physics of sound, techniques of composition, history, anthropology, biology, anatomy and psychology.

Most ecologies are slow moving and rather abstract and difficult to study. Although most are historical in some sense (they have ontogeny), music is an ecology which grows in front of us, whose growth we feel viscerally, whose ontogeny and phylogeny are laid bare, whose structure we can analyse, whose practices we can reproduce. Only musical practice allows us to tune-in to minds long dead. Tuning-in to the past unlocks new possibilities in the present.

Both Boulez and Bowie were in some way about elevating the human spirit. But it is what they knew about elevating the human spirit which will stand the test of time. In playing some Beethoven piano sonatas recently, I've been reflecting on what Beethoven knew about what can elevate the human spirit. I struggle to understand this. But I have little doubt that what we need urgently is the elevation of the human spirit. We need to find ways of infusing the musical insight of the past into our experience of today's world. 

Saturday 2 January 2016

Why Sociologists should be early for the next Technological Transformation of Education and Society

Over 10 years ago, I was engaged with the Centre for Educational Technology and Interoperability Standards (which was a service of JISC) in exploring the impact of Web Services on education. Web Services were a fundamental shift in the technological infrastructure - the exploiting of the potential of the HTTP protocol for disaggregating data services, and with it the disaggregation of the functionality of applications. Web Services introduced new levels of rationalisation into technological infrastructure, allowing rapid growth of the social web, with Web 2.0 services like Facebook, Twitter and so on rapidly arriving in the wake of the possibilities of the new technology. JISC were interested in the possibility of Service-Oriented Architecture (SOA) in educational institutions, initially creating a 'framework' of institutional services which could be linked together by means of standardised data interfaces. SOA has changed much in business infrastructure - not least, in new subscription funding models for software vendors (Microsoft's Office 365 is a classic Web Service implementation, iTunes was one of the first adopters), but also in affording new possibilities for outsourcing, disaggregating services, seamless mashups of different services and so on. And the social web, Web 2.0, has changed the way we live in a fundamental headline-grabbing way: headlines which we flick through on our smartphones - the clearest example of mashups and web-service coordination.

However, the thing that made it possible wasn't particularly headline-grabbing - Web Services are 'geeky'. Consequently, those who think deeply about society and education were relatively late to the party. It seems that only when the technology becomes so obviously important do they take any note. Now there are quite a few important critical books on the impact of web2.0, surveillance, capitalism, etc: Old-timers like Andrew Feenberg have been trying to keep up for a long time (but always somewhat late!), but among the younger academics, personally I'm keeping an eye on Dave Elder-Vass who has an important book on Pro-suming and Web2.0 coming soon, Clive Lawson who has just finished his book about the ontology of technology, and Mark Carrigan, who has just written a great guide to social media for academics, as well as more thoughtful e-learning academics like Lesley Gourlay who are delving into the socio-material literature in an effort to find new explanatory frameworks for technology in education.

But is it all about to change again? I can list a few problems:

  1. The sociological critique is too late to have any real impact - save the opportunity and necessity for sociologists to carve out a reputation for themselves (a sad indictment of the state of our Universities, which technology has been instrumental in delivering!)
  2. Technology keeps on moving
  3. Web 2.0 is now old - and the future isn't Web 3.0... or probably "Web anything"...
  4. There are rumblings of another fundamental technical change which are only being heard by techies at the moment but which ought to be heard by the sociologists and educationalists if they really want to try to build a better society (and we should aspire to this, shouldn't we?)
  5. We need sociological and educational input in the early stages of these technologies, not when the technologies have matured - it's too late then.

I cannot be certain of this, but the technological 'rumblings' that could change everything concern the resurgent interest in the peer-to-peer technologies surrounding Bitcoin and the consequent 'distribution of the web': blockchain, IPFS (http://ipfs.io), the Linux Foundation's announcement that it is to create a 'standardised' blockchain (see http://news.softpedia.com/news/linux-foundation-will-build-a-standard-blockchain-bitcoin-s-core-technology-498206.shtml), IBM's ADEPT (see http://www.coindesk.com/ibm-reveals-proof-concept-blockchain-powered-internet-things/), which focuses on the internet of things, and Microsoft's deal with Ethereum to provide Blockchain on Azure (see https://azure.microsoft.com/en-gb/blog/ethereum-blockchain-as-a-service-now-on-azure/). Web distribution fundamentally means that everyone becomes a server - a concept with implications not only for the computing devices we now know (phones, laptops, and so on), but for the computing devices becoming increasingly ubiquitous in the environment: the so-called internet of things. The network gives us a means of effective control to harvest processing power from billions of devices, rather than relying on the servers of Google, Amazon and so on. Similar arguments are being made for harnessing the network as a means of control for low-powered individual devices like LED lights, fuel cells, and so on. Large-scale grids may soon look very old-fashioned.
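The 'everyone becomes a server' idea rests on content addressing, which is the heart of systems like IPFS: a document's address is the hash of its content, so it no longer matters which machine serves it, and the fetcher can always verify what it received. Here is a toy sketch of that principle - the `Peer` class and its methods are my own illustration, not IPFS's actual API.

```python
import hashlib

class Peer:
    """A toy content-addressed store: addresses are content hashes."""

    def __init__(self):
        self.store = {}

    def add(self, content: bytes) -> str:
        # The address IS the SHA-256 hash of the bytes.
        address = hashlib.sha256(content).hexdigest()
        self.store[address] = content
        return address

    def get(self, address: str) -> bytes:
        content = self.store[address]
        # Integrity check: anyone can verify the content matches its address,
        # so it doesn't matter which peer served it.
        assert hashlib.sha256(content).hexdigest() == address
        return content

alice, bob = Peer(), Peer()
addr = alice.add(b"an open educational resource")
# Bob replicates the content; the address is unchanged, so a client
# can fetch from Bob, Alice, or anyone else holding the bytes.
bob.store[addr] = alice.get(addr)
print(bob.get(addr))  # prints b'an open educational resource'
```

In a location-addressed web, the same document on two servers has two different URLs; here it has exactly one address wherever it lives - which is what makes every device a potential server.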

The social consequences of a decentralised approach, if it takes off, will not be insignificant. As an educational technologist, I'm continually frustrated by what I can't do because I can't get on to the right servers to do it. Blockchains could remove those obstacles: any tool could be deployed by anyone for use by anyone - no more waiting for Twitter to think of the next way to 'enhance' their service. More importantly, blockchains could potentially de-massify the web. Peer-to-peer relationships are different from client-server relationships: everyone can be both a client and a server. This is very different from the typical MOOC: there's an opportunity for genuine 'resource bargains' between individuals in sharing documents, conversations, code, tools and so on - maybe a bit like the exchange relations in Bitcoin transactions. Nobody can be sure of the details, but the prospect is fascinating and exciting. It demands deep thinking and careful intervention: the technology will take on a life of its own soon enough, and the opportunity to influence it will be lost.

Any new technology is an opportunity to explore ancient problems afresh. The problem is that those who think most about the ancient problems are often the last to think about new technology as it emerges. I would like to change this - particularly if Blockchain really is as significant as it looks right now. As Andrew Feenberg points out,
"Technology can deliver more than one type of technological civilization. We have not yet exhausted its democratic potential" (Feenberg, "Between Reason and Experience"). 
Technologists have little experience in thinking about "democratic potential"; sociologists and educationalists are much better equipped. 

Friday 1 January 2016

Technology, Teaching, Learning and the Body

As a new year begins, the world of educational technology seems to be on the move. There are a number of reasons for this. First of all, there is a growing feeling that somehow the internet we have today is a disappointment. It does not seem to be the great emancipatory force which its creators envisaged. It has become a giant advertising hoarding, selling us things we don't really want on the basis of what we click on. As we've been lured into the world of social media, conned into becoming prosumers (information slaves) for Google and Facebook, entranced by the shining magic lanterns in our pockets, so the optimism of the web's early years has died. Most depressing is that the conversion from counter-culture to corporation happened so quickly, without anybody putting up a fight.

Alongside this, the 'massive interconnections' of the web, upon which the advertising model is based, have provided a conduit for the darker side of humanity to reveal itself. Most recently we saw this in the Twitter response to Carrie Fisher's appearance in Star Wars, but there are sadly so many instances of online misogyny, hate, racism, and so on. The narcissistic thrill of spleen-venting in public is one of the peculiar side-effects of the web. I've been guilty of it myself: convincing myself of the wholesomeness of 'advanced' technological practice (which somehow absolves responsibility), developing practices of blogging, tweeting and so on, but really (also) feeding a deep narcissism. This year I want to examine myself more deeply. I think there is much merit in public online communication practice; but I don't think we understand what it does to us or what it does to others. Fundamentally, the problem is that in our online world, real feelings and (particularly) real bodies are absent. That's becoming a big problem - not least in education.

In 2016, the response to the disappointment of the web is a resurgence of interest in peer-to-peer technology. The Linux Foundation has just announced a big project to create a standard blockchain (the technology behind Bitcoin) - see http://news.softpedia.com/news/linux-foundation-will-build-a-standard-blockchain-bitcoin-s-core-technology-498206.shtml. This is an attempt to delve into the fundamental technologies of the web. Most fundamentally, it is to rethink the relationship between the documents we exchange on the web and the servers which contain them. It is not only to re-ask the peer-to-peer question, "Do we need servers?", but to rethink the interconnections between individual computers (yours and mine, say), and how changing those inter-relationships changes the human relationships behind them. There is an important example of how a different kind of architecture and protocol creates new kinds of relations in the technology that has given rise to blockchain: the apparent viability of cryptocurrencies like Bitcoin.
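The technical core that makes this trust possible is simpler than it sounds: each block commits to the hash of its predecessor, so altering any earlier record breaks every later link and is immediately detectable by anyone. This is a minimal sketch of that hash-linking idea only - Bitcoin's real chain adds proof-of-work, signatures and a consensus protocol on top.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, data):
    """Append a block that commits to the hash of its predecessor."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "data": data})

def is_valid(chain):
    """Anyone can re-verify every link without trusting the chain's holder."""
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, "Alice pays Bob 5")
add_block(chain, "Bob pays Carol 2")
print(is_valid(chain))                   # True

chain[0]["data"] = "Alice pays Bob 500"  # tamper with history...
print(is_valid(chain))                   # False: every later link now breaks
```

Trust, in other words, is moved out of institutions and into a structure that any participant can check for themselves - which is why the same mechanism is being proposed for records far beyond currency.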

The bottom line here is that currencies only work when there is trust. If we distrust the web now, if we despair at the surveillance, advertising and abuse of the crowd, might the peer-based technology that supports economic transactions be a platform for a deeper level of trust and understanding between individuals? And trust and understanding between individuals are also fundamental to learning. For all the nonsense about the status of universities, league tables, etc., any fulfilling teacher-learner relationship requires a deep two-way trust. It doesn't take an elite institution to guarantee it. But it does take intelligent management of institutions to facilitate it - that means maintaining an ecology: not instantly dismissing noisy union reps and their spouses, not closing departments, not sacking people for stealing screwdrivers, not making your business 'selling courses' rather than thinking about the actual needs of learners, teachers and society... in fact, NOT doing all of the things which I've seen unfold in front of me in the last year. It's worse than unjust; its irresponsibility borders on the kind of criminal social neglect that we saw in the banks. And we shouldn't be giving honours to University leaders either! - that really is beyond the pale when they are paid such ridiculous amounts of money.

Educational technology is political and the principal driver for innovation is social justice. Maybe Blockchain is the beginning of something big. But we shouldn't lose sight of the fact that the real problem in educational technology is the human body and its relation to the screen. I think music provides a model for thinking of something better than our screen-obsession. Music is highly technological and always has been - some of our oldest technologies are musical instruments. Somehow the relationship between our computer technologies and our embodied selves needs to move towards the relationship between the musical instrument, player and audience.

On this theme, I was very struck by the current exhibition at the Wellcome Collection in London, "Tibet's Secret Temple", which contains medical and other documents from the Dalai Lama's private meditation chamber in the Lhasa Lukhang temple. These are striking images of whole-bodiedness - very different from any Western approach to medicine (I'm very mindful of this at the moment because I'm preparing some technology support for trainee doctors!). Wellcome is organising a series of events on mindfulness, meditation and health in the new year to accompany the exhibition. What about technology in this? There is technology in Tibetan medicine... but its relationship to the body and the mind is very different from our conception of technology. I'm not the first to ask these kinds of questions: Stafford Beer, for example, sought advice from an Indian guru in the 1940s when he was in the army, and this certainly influenced his work on 'viable systems' and ways of using technology. Now that his archive is on my doorstep in Liverpool, there's an opportunity to delve into this more closely... I think that's my New Year's resolution!!