Hello cyber world. Well, I’ve taken a bit of a hiatus, succumbing to all those warnings I so often see about new blogs on the net: the increasingly scarce postings that eventually leave a blog frozen in a moment in the past, encapsulating one person’s brief effervescent burst of inspirational energy before overzealous exuberance gives way to indifference and abandonment, leaving the blog to float in the virtual purgatory of unloved blogs.
But no! I won’t let my blog go down this path! I am here to pull out my defibrillator, place the electrified pads on my blog’s motionless chest, and bring it back to life! CPR, Heimlich manoeuvre, slap in the face and a bucket of ice water over the head, whatever it takes! Worst case scenario, I’ll have to break out my mad scientist wig, strap the blog to the top of a building and wait for the lightning storm of the century to jolt this Frankenstein back to life!
Okay, enough with the drama, let’s talk about Complexity Theory in Second Language Acquisition (SLA).
“The act of playing the game has a way of changing the rules.”
– James Gleick (1987)
The main proponent of this theory in its application to how we think about how people learn languages is Diane Larsen-Freeman. Now, my main goal here will be to briefly outline a theoretical position, quickly sliding over problematic areas, and conveniently sweeping areas needing more elaboration under the rug. As the name implies, this theory is complex, and especially in considering how it seems to be applied to SLA, more complex (or rather counter-intuitive) than its proponents might have us believe.
But let’s get on with it. How did this theory find its way into second language acquisition research?
Much of the initial research in SLA had a strongly cognitive focus. This was in large part inspired by Chomsky, who dealt a devastating blow to behaviorist models of language learning and posited in their stead a “Universal Grammar,” that is, an innate capacity in all human beings to learn language.
From this, Corder (1967) proposed that learners have a “built-in” syllabus, an idea extended by Selinker’s (1972) notion of a systematic “interlanguage,” which according to him is an intermediate process in learning a language whereby a “psychological structure… latent in the brain” (p. 211) becomes activated (see Pinker, 1979, for an overview of some of the major initial theories of language acquisition around this time).
Skipping ahead: after first language acquisition researchers demonstrated a systematic progression in language learning (specifically, a highly regular order of acquisition of English morphemes by first language learners; Brown, 1973), second language acquisition researchers posited a similar acquisition order for English as a second language (Dulay & Burt, 1973). The major questions driving all this research revolved around understanding how people acquire “mental grammars,” of both a first and a second language. The focus was on the individual and their mental processes.
Larsen-Freeman (2011) describes her uneasiness with much of this kind of research, which is very much still prominent today (Doughty & Long, 2003).
She cites three points of contention with the way data were gathered for these research questions using experimental designs, which I would phrase as follows:
1) The problem of ecological validity (how do we account for the variety of factors involved in language learning, ranging from individual differences to context?)
2) The problem of causality (what are the cause and effect relationships between these multivariate factors in language learning?)
3) The problem of “ubiquitous variability” (how do we reconcile the variability of both our data collection methods and the contexts of that collection, i.e., where and how we got our data?)
“To me (these research designs) denied the commonsense understanding that SLA processes were complex, situated, and likely multivariate” (p. 49).
Then one day, in a ‘serendipitous’ encounter, Larsen-Freeman stumbled across a book that radically changed her perception of the language learning process. This book was James Gleick’s Chaos: Making a New Science (1987).
This led her to see language as a ‘bottom up’ ‘emergent’ process, resulting from the ‘interactions of multiple agents in speech communities.’
Larsen-Freeman describes the way complexity allows us to look at phenomena from a different perspective:
“…complexity theory seeks to explain complex, dynamic, open, adaptive, self-organizing, non-linear systems…It sees complex behavior as arising from interactions among many components – a bottom-up process based on the contributions of each, which are subject to change over time” (2011, p.52).
Language, along with the myriad of its related aspects, from use to teaching, is considered a complex system. Thus, through the analytical lens of complexity theory, the idealized isolation of components in a system to test for correlations becomes increasingly problematic with each additional “element,” forcing us to unite the range of phenomena into an ecologically viable whole… or at least that’s one of the major agendas of Complexity Theory in SLA.
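A quick way to make “non-linear” concrete is the flagship example from Gleick’s book, the logistic map. The little Python sketch below is entirely my own illustration (nothing of the sort appears in Larsen-Freeman’s chapter): two trajectories that start one part in ten million apart end up bearing no resemblance to each other.

```python
# Logistic map: x_next = r * x * (1 - x). At r = 4 the system is
# chaotic, so tiny differences in starting conditions are amplified
# until the two trajectories become effectively unrelated.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.3)
b = logistic_trajectory(0.3000001)  # differs by one part in ten million

print("max divergence between the two runs:",
      max(abs(x - y) for x, y in zip(a, b)))
```

The moral is the one Larsen-Freeman borrows for SLA: in a non-linear system you cannot isolate one variable, nudge it, and expect a proportional, predictable change in the outcome.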
These are some of the general theoretical principles Diane Larsen-Freeman proposes for complexity theory as an approach to SLA:
1) “Language is a dynamic set of graded patterns emerging from use”: “grammar isn’t the source of understanding and communication, it is a by-product of communication.”
2) “Language-using patterns are heterochronic and adapted to their context of use” (i.e., language use is locally contingent; context is important). As for the use of ‘heterochronic’ here, she writes that it means: “Language events on some local timescale may simultaneously be part of language change on longer timescales.” Not sure if this is a heterochronous pattern, but it sounds cool at least.
3) “Language Development proceeds through soft-assembly and co-adaptation”: Soft-assembly is a neat word to “signify how learners use their language resources to respond intentionally to the communicative pressures presented by their interlocutors, including classmates and teachers.” Co-adaptation here boils down to the following formula: put two people in a room, i.e., put two dynamic systems together, and dynamic (i.e., unpredictable) stuff will happen as the two systems interact.
4) “Stable patterns emerge bottom-up from frequent soft-assemblies in co-adapted interaction”
5) “Learners play an active role in language development”: learners pay attention not only to “positive evidence” but to negative evidence as well. In other words, learners are actively exploring the terrain of language use, drawing on evidence of both what appears and what doesn’t appear in daily interactions. That is to say, again, that the “knowledge” a learner has of a language is a collection of all the memories of “previously experienced utterances” (p. 55).
6) “Cross-linguistic influence manifests itself in numerous ways”: “Learners’ language resources are always dynamic ensembles, expanding and contracting with time, place, and circumstance” (p.58)
7) “Intentionality and Agency are important”: Larsen-Freeman writes, “Some have criticized the extension of complexity theory – a theory originating in the natural sciences – to human endeavors such as language acquisition.” This, to me, is one of the most important and least thought-out areas in this whole complexity theory enterprise. When we introduce consciousness into the equation, or value, or any ‘human’ experientially factual term, we inevitably plunge into territory that gets awfully difficult to navigate awfully quickly. This is one of several areas where complexity theory as currently conceived breaks down, in my view: the reconciliation of the natural sciences with the other ‘special sciences’ becomes either a problem to be avoided or a problem to be solved (but see my note at the end of this post…).
8) “Nothing is foreclosed in open, dynamic systems”: This is where complexity theory starts to cast its quickly expanding theoretical web into further and further areas, in this case the political. Because complexity theory’s logic is “non-deterministic,” new paths can be taken at any point. She goes on to quote Osberg (2007), who says, “…the logic of emergence could therefore also be characterized as a logic of freedom…”.
This fits well with critical theorists’ pressing concern to constantly question the powers that be and move us towards linguistic equality. This is also where, I would say, complexity theory’s explanatory power becomes exponentially huge, which is a nice way of saying that in attempting to explain more and more, it effectively helps us understand less and less, serving merely as a descriptive device for identifying a system where lots of unpredictable stuff is going on (that’s just what it seems like to me at this point).
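Principle 4, stable patterns emerging bottom-up from repeated co-adapted interactions, can be made concrete with a toy agent-based model. The sketch below is my own illustration, loosely in the spirit of Luc Steels’ “naming game,” not anything from Larsen-Freeman’s chapter: agents start with private words for one shared object, repeatedly pair up, and adjust their word inventories after each exchange. A single shared convention emerges without any agent, or any top-down rule, dictating it.

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

N = 10
# Each agent starts with its own private coinage for one shared object.
agents = [{f"word{i}"} for i in range(N)]

def converged():
    """True when every agent holds exactly one and the same word."""
    return all(a == agents[0] and len(a) == 1 for a in agents)

rounds = 0
while not converged() and rounds < 20_000:
    speaker, hearer = random.sample(range(N), 2)
    word = random.choice(sorted(agents[speaker]))  # sorted() keeps runs reproducible
    if word in agents[hearer]:
        # Successful exchange: both parties drop all competing words.
        agents[speaker] = agents[hearer] = {word}
    else:
        # Failed exchange: the hearer adopts the speaker's word.
        agents[hearer] = agents[hearer] | {word}
    rounds += 1

print(f"interactions: {rounds}, shared inventory: {sorted(set.union(*agents))}")
```

There is no global coordination anywhere in the loop; the stable pattern (one shared word across the community) is an emergent property of many local “soft-assemblies,” which is exactly the kind of bottom-up story the chapter tells about grammar.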
“Complexity theory, which originated in the physical sciences, has been used as a productive metaphor in SLA to stress the relativity of self and other, the need to consider events on more than one timescale and to take into account the fractal nature and unfinalizability of events” (2009, p. 247).
– Claire Kramsch
Diane Larsen-Freeman finishes her overview of this theoretical perspective with a well-taken final statement:
“Above all, complexity theory argues for epistemological modesty. To understand L2 development more completely, we must resist the arrogance of certainty and premature closure” (p. 68).
Rather than seeing it as an all-encompassing theory of everything, as I have often misinterpreted it, she claims it should be viewed rather as a “cross-disciplinary field of research and meeting place for dialogue.”
All in all, complexity theory offers an interesting and thoroughly commonsensical way to see the world. The world is complicated, and understanding its elements as dynamic, changing processes, emerging into organized patterns or succumbing to the second law of thermodynamics and “breaking down” into chaos, is a clearly viable but incredibly difficult way to go about researching these phenomena. As Claire Kramsch notes, it is a great metaphor. However, until the word ‘emergence’ ceases to be merely a nifty label for some kind of unpredictable bottom-up transition from one process to another, it serves as little more than a heuristic device, a nice metaphor for describing the complicated processes we see in the world.
On a side note, I am currently reading Terrence Deacon’s (2011) new book, which makes some serious claims about not only revamping the notion of ‘emergence’ but radically rethinking the historical divide between the natural sciences and the “special sciences,” what C. P. Snow famously called “the two cultures.” Bridging this gap may be possible, though it has long been viewed as thoroughly undesirable by most on both sides. But coming to terms with the understanding that mind emerges from matter and life from non-life, and explaining exactly how it does so through all of the transitional processes, may be just the key to a radically new perspective that brings complexity theory to the fore as a robust, “pandisciplinary” research program. The future will be interesting! I imagine this post will be updated when I finish Deacon’s book, so stay tuned.
The main resource used for this post was:
Atkinson, D. (Ed.). (2011). Alternative Approaches to Second Language Acquisition. New York: Routledge.