I once waited 45 minutes just for a logon prompt -- after that, I continued to use cards until my final semester. One version had comments and was nicely formatted; the other had all of that removed so that the TRS-80 had enough memory to actually compile and run it. Just five years later, when Pierce took his first class, his hardware environment was quite different: a lab full of Apple IIs.
If you need a sense of how computer education, and indeed the whole industry, transformed in a very brief period, imagine the leap from punchcards to microcomputers in half a decade. By contrast, Dr. Nick Carlson, a civil engineer and an instructor at New Jersey community colleges, considers the environment he oversees today -- "a lab full of networked desktops running Windows" -- essentially the same as what he used when he took his first programming class more than fifteen years earlier. When I was taking computer classes in high school, we discussed transistors and logic gates, not that I really remember much of it or ever fully grasped how it related to programming a computer.
Still, I wondered whether anyone in school today would be expected to understand that at an introductory level. He did programming in assembly; I've never had a class in it. At the junior level there's Computational Structures, which goes into the theory and math behind binary logic and arithmetic, graphs, shortest-route algorithms, and so on. Another class covers the hardware side of this, with logic gates and some assembly programming.
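To give a flavor of that hardware-side material, here is the kind of exercise such a class might assign -- a half adder, the simplest binary arithmetic circuit, built from two basic logic gates. This is my own illustrative sketch in Python, not an example from any actual syllabus:

```python
# A half adder sums two one-bit inputs, producing a sum bit and a carry bit.
# It is built from two gates: XOR (sum) and AND (carry).

def xor_gate(a: int, b: int) -> int:
    """Exclusive OR: 1 when exactly one input is 1."""
    return a ^ b

def and_gate(a: int, b: int) -> int:
    """AND: 1 only when both inputs are 1."""
    return a & b

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Return (sum, carry) for one-bit inputs a and b."""
    return xor_gate(a, b), and_gate(a, b)

# Print the full truth table. Note 1 + 1 = binary 10: sum 0, carry 1.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```

Chaining half adders (plus an OR gate) into full adders is the classic route from gates to arithmetic, which is roughly the bridge such a class draws between hardware and programming.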
It seems that while computer science once took the view that to learn the discipline you had to trace it to its beginnings on practically bare metal, today that's considered an advanced branch of study. You don't need it for the basics. Beyond the nuts and bolts of what specifically you'd study and what machines you'd study it on, there's a bigger question looming over the field: why would you bother studying the subject at all?
In the '80s, when Nancie K. As it happens, that's about what I was doing my first five years after graduation. And minus the government contractor, it's what I'm still doing.
This was an age in which the field had been, for some years, not stagnant, necessarily, but mature. Computers as data processors were well understood by businesses, and were very lucrative both for the companies that sold them and for the organizations that used them. But a revolution was brewing. The Last Starfighter came out around that time, so people were starting to see the creative things computers could do. By the turn of the century, the classes Dr. Carlson took in high school and college had a very different feel. They weren't taught with the idea that you'd eventually want to put these pieces together into a bigger whole that would actually do something relevant or useful.
He sees this too as a reflection of the times -- there were kids who "enjoyed computers for the sake of tinkering with a computer," as the PC revolution had brought a wave of machines into homes but hadn't quite cooled off to the point that they had become dull appliances. The class Dr. Carlson teaches now is called "programming for engineers," and is much more aimed at practical use.
The language it's based on is MATLAB, "a numerical programming language that's fairly popular with academia and engineers." Writing the code is just one step of the whole solution, since the students need to understand the physical basis of the code they're writing and correctly interpret the output it produces. The class is also set up to show students the limitations of computers, like floating-point accuracy or the concept of garbage-in, garbage-out -- definitely relevant to them, since they'll be using software for design even if they never write their own code again.
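The floating-point limitation is easy to demonstrate in any language. Here is a minimal illustration in Python (my own example, not from the course described), showing that decimal fractions can't always be represented exactly in binary:

```python
import math

# 0.1 and 0.2 have no exact binary representation, so a tiny error creeps in.
a = 0.1 + 0.2
print(a)         # 0.30000000000000004, not 0.3
print(a == 0.3)  # False: exact equality fails on floats

# The standard remedy: compare within a tolerance rather than exactly.
print(math.isclose(a, 0.3))  # True
```

The practical lesson for an engineering student is the same one Carlson describes: trust the output only as far as you understand how the machine produced it.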
In this way, the students of today aren't that different from the fifty-one people who attended the Summer School on Programme Design for Automatic Digital Computing Machines at Cambridge University. Those students didn't all necessarily want to become computer scientists -- it wasn't particularly clear that "computer science" was its own thing yet. Many were chemists and mathematicians who were simply excited about the practical ways the new technology could make their existing jobs easier.
In fact, the practical needs of both students and employers have given rise to a whole category of computer science education under the aegis of schools that aren't colleges at all. These "code schools" eschew theory and give students practical skills in a short amount of time. As Christopher Mims put it in the Wall Street Journal, "we've entered an age in which demanding that every programmer has a degree is like asking every bricklayer to have a background in architectural engineering." And indeed, employers are looking for a focus on the practical as well.
It's not that the skills colleges teach are obsolete -- but students seem to need extra help once they hit the industry. Facebook puts its new hires through an intensive programming course when they arrive, for instance.
Dave Parker, CEO of Code Fellows, a software programming trade school in Seattle, says that "there aren't a lot of IT jobs left managing hardware or server farms; some of those old degree programs need to recognize that and move on." But courses like the ones his company offers are a useful supplement: "We get to deliver 'life experience' candidates with fresh technical skills on a stack that is in high demand today.
It doesn't replace a CS degree, but depending on where you are in your career it's a great alternative." Perhaps the best way of thinking about the future of CS education is that even if you've got a degree, you're going to keep needing an education.
What you have to work with: Rob Pierce has enjoyed a smorgasbord of decades' worth of computer education: he took an introduction to computer concepts in the mids, an introduction to programming course in the s, and is taking a data structures and object-oriented programming class today. How far do you drill down? But what's it all for?

Instantiating the activity theory framework in the given telecollaborative environment made it possible to incorporate the online bulletin board as a mediational tool and to identify three breakdowns that students connected to its features: message overload, the slowness of the bulletin board relative to chat, and name and gender confusion due to an absence of visual cues.
Our sense of instantiation differs, for example, from direct borrowing from interactionist SLA or other accounts, where the technology has no special status and the technological aspect of the interaction would lie outside the theoretical framing rather than being intrinsic to it. A process that starts with borrowing but then transforms the borrowed theory is theory adaptation.
In this case, the researcher is led to propose some change in a theoretical construct to account for data emerging from CALL research results, or potentially from observations prior to the actual research. Smith (see Levy, Chapter 7 this volume) provides an example of the former in a CMC study: he finds that the special character of the synchronous, text-based chat environment requires him to extend the established negotiated interaction framework of Varonis and Gass. Levy and Stockwell observe the growing use of multiple theories within a single study to capture a range of perspectives that a single theory cannot, especially in studies involving development or design.
Capturing the notion that these theoretical sources combine while maintaining their individuality, we label this a theory ensemble (see also Levy, Chapter 7 this volume), another category absent from the Hubbard typology. In some respects, a theory ensemble is akin to an adaptation, but rather than the initial theory being transformed, it is enriched with additional sources, either before or after the study. Because no single theory is perceived as rich enough to meet the needs of the research, teaching or development project, different sources are drawn on for different purposes rather than any one theory being changed.
The study by Cornillie, Clarebout and Desmet in the final section of this chapter is a clear example of this process, combining the differing perspectives and theoretical traditions of language learning and gaming in a coherent way. Coherence is important here: researchers and developers must take care not to simply string together potentially conflicting or incompatible theoretical sources just because each might seem to say something interesting.
Taking this process one step further is a theory synthesis, where insights from two or more sources are combined into a single theoretical entity. The outcome of a synthesis is an object of sorts — a new theory, framework or model — as in Plass and Jones. An ensemble, by contrast, implies neither permanence nor generality — the collection is made in pursuit of understanding a particular phenomenon or to guide the development of a given project.
This leaves open the possibility that an ensemble could evolve into a synthesis through reuse in other research studies or development projects, either by the original author or by others. All of the previous processes involve taking one or more specific theoretical entities and building on them directly. Theory construction, by contrast, involves creating a largely new theoretical entity: although informed by prior theories, a constructed theory has a certain independence lacking in its predecessors. Finally, operating in parallel with most of the preceding categories is the process of theory refinement, the idea that, in line with other scientific traditions, theories improve or in some cases fall as more data come to support or refute them.
As the field moves forward and certain theoretical options become more established, we can expect to see growth in this area. The preceding categories reflect a rough continuum from greater to lesser dependence on the original sources and conceptualisation. Although it might seem that constructed theory is a desirable goal for the field in the long term, it is not clear whether that is indeed the case. To conclude this section, it is important to note that the discussion here has focused on understanding how theory is incorporated into CALL rather than how well.
We have explained that theories may be absent, borrowed singly, assembled in an ensemble, instantiated, adapted, synthesised and even created. However, regardless of the process involved, a theory or model may be misappropriated or misapplied in a study due to a lack of care or understanding on the part of the author. In addition, a theory or model may simply be invoked to lend credibility to a study. Although a step beyond atheoretical CALL in principle, the presence and impact of the theory is largely invisible beyond that invocation.
What is apparent is that in a research project, an invoked theory is neither the object of study nor the lens through which study data are collected, analysed and interpreted. The more gratuitous sense of invocation resonates with the concept of the theory buffet introduced by Levy and Stockwell. When we design research studies and develop technology projects, or when we evaluate the work of others in those domains, we need to consider very carefully the relative merits of the theories, models and frameworks we employ.
The theoretical dimension cannot simply be a gap to be filled in a proposal or paper. It should be integrated and articulated in a way that leads to greater coherence and clearer understanding than would result if it were absent. In discussing theory within this field, it is important to consider that theory as used in research is not necessarily the same as theory supporting practice.
Along these lines, Levy and Stockwell draw distinctions between the roles of theory in design, teaching and research in CALL. This section and the one that follows draw on some of those insights. Despite the wide range of theoretical sources from various disciplines described in the previous section, theories from second language acquisition can be said to have had a more central role than others. She discusses four general orientations, each of which collapses a number of related theories and models under its label, the cognitive linguistic among them.
She speculates on the implications of thirteen specific theoretical approaches for CALL. Although a number of theories, frameworks and models, including those mentioned by Chapelle, have been used to motivate CALL projects and to provide a basis for research and evaluation, three in particular stand out: the interaction account, sociocultural theory, and constructivism.
Each is discussed in more detail in the following sections. The interaction account (IA) emphasises the role of interaction in second language development (Long). It incorporates certain central processes such as the negotiation of meaning, in which the learner and interlocutor(s) engage in an ongoing process of interactional adjustments (Pica). The IA focuses upon learning interactions that by necessity involve two or more people, or a person and the computer (Chapelle). In particular, the IA has been extensively referenced as a theoretical base in CMC-based CALL (often simply borrowed rather than instantiated in the sense described earlier), especially in projects that involve email and chat as a basis for learner interaction and exchange (Darhower). Any setting where synchronous or asynchronous communication occurs can draw on the IA for guidance, including text-chat and voice-chat, either used independently or embedded in other programs.
Vygotsky claimed that learning resulted from social interaction rather than through isolated individual effort, and that engagement with others was a critical factor in the process (Vygotsky). In his view, learning was at first social (intermental), and only later individual (intramental). The preeminent tool for mediation is language. But language is not the only tool for mediation. From a sociocultural perspective, it is via these different forms of mediation that cognitive change or learning occurs (see also Darhower). In the context of the present discussion, two points should be emphasised.
First, with regard to material tools, technologies mediate communication, and thereby cognitive change, differently. From the landline phone through email, text messaging and Skype, the technology itself shapes the interaction in particular ways. Each technology has its own affordances that govern differentially the ways in which interactions occur (see Hutchby; Smith). The technology does not determine the interaction, but its attributes do help shape it.
When sociocultural theory is applied in CALL, it often reflects the process of theory instantiation described earlier precisely because the mediational role of the technology is an integral part of the study. Second, with regard to social interaction, new technological means allow new and different forms of social interaction to occur, both online and in the classroom.
Now, of course, social worlds extend into the virtual worlds of gaming, among numerous other complex modes of online social interaction (e.g. Lee; Peterson). In a seminal article on constructivist theory, Phillips describes constructivism as a large-scale movement and system of beliefs; he also highlights its diversity and its many interpretations. Yet beyond that basic statement, interpretations tend to differ and follow rather divergent paths. These understandings and widely differing interpretations of constructivism have carried over into the CALL area (Felix). In essence, the cognitive constructivist describes the mind in terms of the individual; the social constructivist describes the mind as a distributed entity that extends beyond the bounds of the body into the social environment.
Healey and Klinghammer also emphasised the centrality of the learner in the learning process and the importance of the teacher in creating motivating authentic activities that involve investigation, discussion, collaboration and negotiation.
Each author in that special issue draws rather differently on the constructivist idea, often listing overlapping sets of principles that underpin the individual constructivist CALL learning environments they are creating. Theory guides and shapes research in many ways, but perhaps one of its most important roles concerns its influence on the ways in which the researcher sees the problem.
Through theory, the researcher is guided not only towards particular ways of formulating the research problem initially, but also towards ways of investigating it: the choice of terminology and constructs, research method and procedure, data collection procedures, and mechanisms of analysis and interpretation are each both directly and indirectly suggested by theory.
This role of theory in research is described eloquently by Neuman: Theory frames how we look at and think about a topic. It gives us concepts, provides basic assumptions, directs us to the important questions, and suggests ways for us to make sense of data.
Theory enables us to connect a single study to the immense base of knowledge to which other researchers contribute. To use an analogy, theory helps a researcher see the forest instead of just a single tree. In other words, the theory drives and shapes the whole research conceptualisation and process.
It also sets the boundaries and largely governs points of focus: the concepts or constructs to be included and excluded and, of those included, those foregrounded and those that remain in the background. A suitable example in CALL, drawn from Levy and Stockwell, compares and contrasts two studies in an online chat environment. The two contrasting theoretical approaches illustrate well the choices that confront contemporary researchers when no single language learning theory is preeminent and when more than one theoretical account lends itself to the job of description and explanation.
Levy and Stockwell discuss essential differences between these two theoretical positions and their implications. Intersubjectivity refers to the shared perspective experienced by participants: it is an interactional feature that needs to be maintained if effective communicative action is to continue. The quality and degree of participation are essential in generating cognitive change see earlier discussion. Thus, Darhower is interested in the maintenance or otherwise of intersubjectivity and the ways learners participated and managed their interactions — for instance whether they chose to stay on-task or go off-task, and if they went off-task, what topics they chose to discuss.
Sometimes conflicts occurred — also of interest to Darhower — when one learner wanted to stay on-task while the other did not. With his theoretical position, this movement between on-task and off-task work is fundamental to the way social cohesiveness is built up and maintained. Thus, off-task work is firmly in the frame and remains very much a feature of this study: essentially, it is treated equally with on-task work. The authors show no interest in the possibility of off-task discussion: it is not a salient feature of their theoretical framework, and the tacit assumption is made that students remain on-task throughout the activity whether true or not, we do not know.
The construct of intersubjectivity is also not a concern. These terms derive directly from the particular theoretical orientation that drives the research study. In both studies, the theoretical point of departure sets the field of view and the mechanisms of interpretation. The theory defines the key constructs, the data to be collected and the way in which the argument that learning has occurred will be made. Both use theory to support their rationale and justify their research, and both draw on theory to identify desired features in the chat room interaction.
Darhower is looking for evidence of the intersubjectivity and social cohesiveness hypothesised in sociocultural theory to be important for language development and learning and the development of sociolinguistic competence. The two theoretical bases led the researchers in different directions.
CIS 352 - Programming Language: Theory & Practice
When theory is used for teaching and CALL, it is often used as a guide rather than as a prescription. Instead of drawing upon one theory exclusively, language teachers are more likely to draw on a number of theories simultaneously. Thus, there is a distinct difference between the way in which theory is used in teaching, and similarly in design and development, compared to the single theoretical framework of many research studies.
Following the typology presented previously, this means that CALL theory in practice is more likely to be an ensemble or a synthesis. This approach to the nature, use and application of theory for teaching and CALL is examined by Doughty and Long in their very useful discussion of task-based language teaching (TBLT). They continue: And whereas theories generally strive for parsimony, among other qualities — to identify what is necessary and sufficient to explain something — a theory of language teaching seeks to capture all those components, plus whatever else can be done to make language teaching efficient.
Language education is a social service, after all, and providers and consumers alike are concerned with such bread-and-butter issues as rate of learning, not with what may or may not eventually be achieved through a minimalist approach motivated exclusively by a theory of SLA. A good example of a more broadly defined set of guidelines drawn from a number of theories rather than a single one is that presented by Egbert et al. Within our model, this could be considered a theory synthesis, though the sources are more varied and the connections less explicit than in Plass and Jones. This theoretical diversity stands in contrast to the seven hypotheses that derive directly from the interaction account, described by Chapelle (23–25), for example: (1) the linguistic characteristics of target language input need to be made salient, (4) learners need to notice errors in their own output, and (6) learners need to engage in target language interaction whose structure can be modified for negotiation of meaning.
Neither is necessarily better than the other, but they do speak to practice in rather different ways, one being broader and more encompassing, the other more finely targeted and focused. Both have a role to play. Perhaps most interestingly, although both claim to be guidelines for CALL, neither has any direct reference to technology in their core generalisations. They are borrowed from theory and research in SLA and transported into the CALL setting without incorporating any explicit role for technology.
Nevertheless, these two contrasting positions are helpful in understanding how theory can relate to practice. The position held by Egbert et al. Perhaps multiple theoretical perspectives are an acknowledgement that no single theory is preeminent in describing the processes of language learning; or it may indicate that no single theory is sufficiently powerful to provide a broad and principled set of guidelines for the many decisions that need to be made in creating online teaching and learning environments.
As noted previously, CALL projects are regularly influenced by multiple theoretical perspectives, what we have called theory ensembles. For example, Levy and Stockwell noted the multiple theoretical sources for the Lyceum distance language learning environment, an audiovisual conferencing system developed by the Open University in the UK and used extensively for language learning purposes.
They included the interactionist account, sociocultural theory, constructivism, situated learning and multimodality. Some of these theories and their proponents clash with one another in the research-centred SLA arena, yet in the pragmatic development of Lyceum , the different theoretical perspectives spoke to distinct elements and processes within the learning environment that was being created. Such learning environments are multifaceted and complex, so it should perhaps not be surprising to learn that multiple theoretical influences, even those that might on the surface appear incompatible, are referenced to inspire them.
To begin to understand how this trend of multiple theories is being realised, it is instructive to examine some recent examples closely. Each study references multiple theories, and each theory is called upon for different reasons. For instance, the first study (Cornillie et al.) must reconcile learning with gameplay: it is this core problem that engages Cornillie and his research team as they aim to design corrective feedback such that learning is facilitated while, at the same time, the high levels of interactivity and engagement in gameplay are not interrupted. Managing both of these goals simultaneously is not straightforward.
The circumstances call for a balance between instruction and play in designing corrective feedback (CF), and the study draws on the two theoretical bases accordingly. This project also includes a number of further theoretical sources, including the cognitive mediational paradigm. Each theory is included to serve a particular purpose. Two more studies in the special issue are worthy of deeper consideration.
The first study, by Zheng, Newgarden and Young, uses an ensemble of theories, or pseudotheories, to motivate the project, including communicative project theory, multimodal analysis, languaging, situated learning, values-realising theory and an ecological perspective, among others. The complex nature of both the game environment and the activity gives rise to the activation of a number of different theories employed for different purposes.
The second study, by Rama et al., uses theory to help make observations about the gaming environment, especially its affordances for language learning, the forms of participation it supports, and its effectiveness as an arena for building and sustaining relationships (intersubjectivity). In these varied examples of the uses of sophisticated games for language learning we see many theories in play (excuse the pun). The context of the language learning game is not the same as the typical teacher-fronted, face-to-face language classroom.
Care needs to be taken, therefore, in applying theories developed and tested in face-to-face settings to game-based learning environments. In fact, a broader principle applies here. The default position for the researcher should always be that the online learning environment is substantively different from — not the same as — the classroom setting. Interacting via a screen, often with several windows open at the same time, presents the teacher and learner with multiple options for simultaneous interaction. Theories emerge in new combinations according to the affordances of these novel language learning environments.
In this chapter, we have presented a working definition of CALL theory, shown that it draws on many sources, and offered an expanded framework for classifying how theory is integrated into various CALL studies. Theories should not be chosen lightly, or simply because they happen to be in vogue at the time. Ideally, theory should play a foundational role in a study and be fully integrated into its goals, constructs and design.
As Chapter 3 (Blin, this volume) demonstrates, options such as activity theory, which make a place for the technology within their frameworks, are already showing promise for meeting this need. What will be interesting to see over time is what kinds of progress can be made in establishing useful theory ensembles and in increasing the instances of theory adaptation, synthesis, instantiation and perhaps construction. It is the theoretical innovation in these areas that will ensure the field remains dynamic and relevant. To conclude this chapter, we would like to emphasise our position that the incorporation of technology in language teaching and learning, whether called CALL or something else, should continue to be influenced and guided by theory.
The presence of theory provides the frame through which the complexity of the object under study can be coherently interpreted, and the means to reach out beyond the single, context-specific research study.