
Peer Review, Fall 2006

Combating Educational Somnambulism in the Information Age

By Luke Fernandez, assistant manager of program and technology development, Weber State University


Like many other universities managing change in the information age, Weber State University is constantly assessing whether conventional, off-the-shelf information technology (IT) products will meet its needs or whether it needs to invest energy in customized solutions that suit its particular institutional culture. Often the dilemma is cast in terms of the question “build or buy, insource or outsource?” The question, however, is more profound than that. What Weber State University’s IT staff are really trying to determine is whether our university should adopt and practice received wisdom (as embodied in existing IT solutions) or call that wisdom into question and forge our own way through a combination of IT innovation and IT research.

I argue that if universities want to strike a wise balance between these poles, IT cannot be treated merely as a commodity but instead should be regarded as a set of communities and social practices with pervasive effects on the way people collaborate, think, and learn. Because of these effects, good IT strategy and IT acquisitions depend on fostering a culture of innovation and a culture of reflection in which the political, social, and cognitive implications of IT choices can be considered. We need to muster the full powers of social science and computer science to understand the place of the university in the information age. We need, in short, to recognize that IT matters—enough so that we must be creators and scholars of IT, not just consumers.

Arguments against Investing Heavily in Innovation and Research

At our school, the answer to the question of how much energy we want to invest in researching the relationship between IT and university life is informed in part by our Carnegie Classification: very high undergraduate and primarily nonresidential. The lack of a research culture might suggest that our IT initiatives should follow the cues of our academic departments. Many of our faculty make important research contributions to their disciplines, but our undergraduate orientation encourages them to focus their energies on being stewards of existing knowledge and learning rather than on discovering new knowledge. If our IT culture were to follow suit, it would invest in technology that is stable, tried, and true rather than cutting edge. Moreover, it would focus the majority of its energies on refining existing business processes rather than on fostering a culture of innovation. Put concisely, the argument suggests that if our departments do not focus on research, then neither should campus IT.

IT Doesn’t Matter (Very Much): “I Just Want the Technology to Work”

These arguments gain additional resonance when faculty members claim that they “just want the technology to work,” and when administrators tout the value of service and stability or suggest that software produced through open-source collaboration with other universities simply doesn’t have the same level of support as software purchased from a vendor. To some extent our school has been persuaded by these arguments. We have convinced ourselves that tools for facilitating campus learning and campus administration have become so ubiquitous and so refined that they are now “commodified.” Since quality products can be bought off the shelf, there is little or no return on investment when we create or customize these products in house.

The justification for such a position is reinforced by Nicholas G. Carr’s well-circulated article “IT Doesn’t Matter.” While dismissed by many, Carr’s argument has some validity: investment in IT yields lower marginal returns than it did in the past and, as a result, corporations should treat IT like electricity, water, or other utilities. In Carr’s view, companies need IT in order to compete, but they do not need a commodity that is fancier or better designed than their competitors’ because the base “plain vanilla” product is more than good enough. Although Carr’s article was directed primarily at the business world, its appeal and luster are not completely lost on academic culture, especially when academics are known to say that they “just want it to work.” The implicit message here, as stated by a significant portion of end users, is that they do not need bells and whistles or cutting-edge features. In order to do their jobs as teachers, they just need the existing technology to function as advertised.

Arguments for Investment in Research and Innovation

While these are sensible positions to adopt in a corporate setting, we need to be careful about how we deal with them in academia, and we need to consider whether logic that applies in the corporate world can be applied wholesale to the academy. There are good reasons why an idea that makes sense in the world of business may not be applicable in academia; they revolve around the fact that IT is not just a commodity that can be bought, sold, and traded. IT is not just a set of material artifacts; it also represents a set of profound social developments.

Reflecting on IT
Because IT is more than a commodity, we need to think about creating organizational arrangements that more effectively integrate the methodologies of academic disciplines into IT management. This is necessary because IT is not just a tool: it has transformative effects on the university, has reflexive properties, and is intimately implicated in the evolution of local and trans-local learning communities. The import of IT is so broad and so profound that the process of tooling or retooling the university cannot be left to a few IT staff partnering with ad hoc committees of faculty and administrators. Instead, the university needs a formal set of offices that have the intellectual and technical authority, along with the fiscal resources, to sponsor ongoing colloquia on IT and the university. These offices can help the university develop more reflexive strategies that are informed not only by tactical, fiscal, and pragmatic interests but also by the broader, more intractable challenges that face universities in the information age.

If there is not an office that is formally tasked with researching the broader social undercurrents of IT and how the university is swept up in these currents, the social and political implications of IT decision making are left relatively unexamined. The tactical and pragmatic focus of administrative IT means that these questions do not get the reflection they deserve in day-to-day decision making. And while academics have the tools and dispositions to reflect on these issues, their departmental responsibilities and their lack of involvement in IT problems (e.g., their lack of daily proximity to the machine) mean that in practice, they seldom give as much attention to IT as they do to their own disciplinary interests. This lack of reflection might be acceptable in an institution that does not hallow deep thinking. But in a university, this situation is unacceptable, and if it exists, it is best remedied by forming an academic office that can address IT problems on a continual, rather than ad hoc, basis.

The Pedagogical and Cognitive Impact of IT
One of the major reasons we cannot relegate IT decision making strictly to offices that treat it as a consumer good that can be assessed on a fiscal balance sheet is that IT has transformative effects on the university and on the way instructors teach. Technology has not just allowed us to pursue existing pedagogical goals (and larger university ends) with greater facility; it has also, in subtle and sometimes not so subtle ways, changed or redefined the ends we are pursuing.

Instructors spend a lot more time fiddling with IT than they did in the past, and this is a source of frustration for some, since it suggests, as Thoreau put it, that we’ve become “tools of our tools.” But others have embraced the change without complaints about wasted time because they are adjusting to and pursuing new forms of technical and communicative literacy that are beginning to be valued as much as the more orthodox textual literacies that universities have hallowed. In its most curious manifestation, and in ways that extend from communication into cognition, we see educators touting the new “multitasking” capacities of the Net generation and the ostensible need to transform traditional pedagogies, which often presume quiet, cloistered reflection, into pedagogies that cater to the Net generation’s increased tolerance (and actual embrace) of discontinuity and interruption that have, ironically enough, been fostered by IT itself.

Combating the Threat of Technological Determinism
While these new pedagogical ends may be worth pursuing, the salient point is that new technologies are transforming university learning without anyone’s explicit consent. We may want or even embrace this change in ends, but we should not do so without reflecting on technology’s implications; to the extent that we value reflexivity, embracing these technologies should be a conscious choice. If we do not consciously embrace them, we run the risk of letting pragmatic technical decisions and acquisitions determine university ends. We run the risk of letting the technological tail wag the university dog. We run the risk of sleepwalking while technicians determine the topology and character of learning. We run the risk of allowing technology to determine its own ends, or (in more academic language) allowing technological determinism to jeopardize the happy prospect of the university determining its own fate.

To combat somnambulism we need to recruit academics from a variety of disciplines to study IT on an ongoing basis as it manifests itself in the university. As Rosalind Williams (2002, 25) observes in Retooling: A Historian Confronts Technological Change, the very technology that universities produce has a habit of “boomeranging”:

The new fact of history on a social level is that we keep running into ourselves . . . as we build our values and social order into the world. . . . We live in a world of echoes, a “boomerang” world where everything that goes out comes back . . . where technology changes the very institutions producing it . . . the process keeps getting more intensely reflexive. . . . A leading product of information technology is more information technology.

If computer scientists and social scientists want to study the information age, a good place to start would be in their own backyard. But in my experience, this usually doesn’t happen. Generally, when I try to engage academics in the political and social problems that IT presents, eyes begin to roll. The sleepiness is palpable. It is as if academics did not believe that there were any interesting political or social problems in their midst. The irony is especially poignant because, in general, these are the same academics who know on the one hand that technology is reshaping university life but on the other hand mutter that they “just want it to work.”

Participating in the Commons: The Virtues of Collaborative Innovation
When apathy runs this thick, people need to be reminded of the obvious. The educational commons (that is, the shared set of learning practices and associated tools that faculty use) is no longer shaped solely by the fiat of a provost or president or by the decisions of a (largely technologically uninterested) faculty senate. The character of the modern university, and the way it is experienced by students, is increasingly decided by technology acquisitions, the vendors who control the evolution of those acquisitions, and the discretionary policy decisions of a few well-placed technology administrators. In recent years, universities have begun to feel the brunt of this technological determinism, which is most painfully manifest when a broad constituency of users is happy with the way things are but migration to a new system is mandated because the original vendor no longer supports the product.

In order to counter the power and influence of these technologies, and the vendors that direct their fate, universities have increasingly been turning to community-source software solutions (such as Sakai, Kuali, and Moodle). By collaboratively creating software with other learning institutions, universities that join these communities aim to gain more fiscal and technological control over their IT futures—futures that at the moment have been ceded, in large part, to outside vendors.

When universities take the time to embody their own learning goals and theories in software they have built in collaboration with other universities, the opportunities to reflect on the relationship between learning, community, and technology multiply. The idea is to create an ecology in which members are not just consumers of knowledge and technology but active participants in their production. And in emphasizing collaborative production over consumption, and in encouraging participants to defend their design choices to other members of the community, homegrown or community-source software catalyzes reflection.

Fostering Innovation and Research

If IT matters to universities (and is not really just a commodity that can be reified) and if universities need to continue to invest in it by fostering a culture of innovation, universities also need to foster a culture of reflection. The import and effects of IT on university life are so great that if we want to avoid sleepwalking in the information age, we need to do more than merely envision ourselves as inventors and creators (rather than mere consumers) of technology. We also need to think of ourselves as philosophers who are willing to spend time reflecting on the broader sociological movements and communities that we partner with (and condone) when the university makes particular software acquisitions. Given that IT is much more than a commodity, we need to create the cultures—and the offices—in university life that will encourage us to innovate, to reflect, and ultimately to keep the prospect of technological somnambulism (and its corollary, technological determinism) at bay.

Reference

Williams, R. 2002. Retooling: A historian confronts technological change. Cambridge, MA: MIT Press.
