I appreciate Virginia Eubanks’ expanded definition of information technology expertise, outlined in her 2011 book Digital Dead End. This is not to say that everyone has the same expertise, by any means. But it is to challenge the primacy of those whose expertise lies in the physical or software domains. A recent post touched on the concept of a critical sociotechnical systems approach to digital literacy. This post builds on that concept, expanding on ideas my colleague Colin Rhinesmith and I presented at the 2014 Community Informatics Research Network Conference. I’d like to put forward four ways in which I think we need to better hear and liberate people’s expert knowledge regarding information technology:
- Sociotechnical Framework = physical + software + human + social;
- Co-creation of technology — innovation is never static but always being co-created by users to fit the context;
- Real-world, everyday technologies — not just the high-tech innovation, and not just the artifact but also the practice;
- Given that technical artifacts are socially shaped, the dominant narratives and social practices within society become embedded within the artifacts and their use practices, potentially reifying unjust social systems. Some people especially experience, and are thereby experts in, the oppressive nature of technologies.
Let’s explore each of these in turn a little further.
Sociotechnical Framework: a holistic understanding of the technical (physical and software) and social (human and societal) layers of digital technology. We need a more nuanced, complex understanding of the social influences that shape the design, production, distribution, use-policies, co-creation, and end-of-life decisions of a technical artifact. And we need a more nuanced, complex understanding of the emergent properties that result when the social and technical come together, and how those emergent properties shape our social systems.
Brian Whitworth, in his 2009 chapter “A Brief Introduction to Sociotechnical Systems”, highlights how technical artifacts improved through the 1960s, ’70s, and ’80s as engineers working on the physical layer and computer scientists working on the software layer began collaborating more closely. Then, beginning in the 1990s, another leap forward came when behavioral scientists joined the group, bringing in the human layer as part of Human-Computer Interaction and Computer-Supported Cooperative Work research, and later human-centered design approaches.
But still missing all too often is expertise at the societal layer, the absence of which Whitworth calls the sociotechnical gap. As a result, technical artifacts are developed in ways that are inconsistent with the social values and goals of those working to achieve transformative action and social change. As Fischer and Herrmann point out in their 2014 chapter “Meta-Design: Transforming and Enriching the Design and Use of Socio-Technical Systems”, user-centered design doesn’t go far enough in incorporating everyone’s expertise into design, a point explored further in the next section.
The need to value, equally or even preferentially, the social expertise that each person contributes to a sociotechnical artifact brings us to the second point.
Co-Creation of Technology: Fischer and Herrmann go on to argue that we need a new approach to the initial design of technical artifacts. Starting with a user-centered design process that incorporates user representatives, the technical artifact is intentionally under-designed. This is not to suggest it is a half-finished product in an inferior sort of way. Rather, it is designed with an understanding that each user will further co-create the artifact. Back in 2004, Ron Eglash described a normal process of appropriation that happens with technology, in which people take the stuff they have and use it in ways, and to achieve goals, not conceived of by the designers and producers. I now start many of my computer/digital literacy classes with an icebreaker question asking participants to describe one way they’ve used the stuff they have, in a way it wasn’t intended, to address an immediate need. Without fail, everyone has a story. Bruce, Rubin, and An (2009) proposed we call technical artifacts “innovations-in-use,” as people continuously co-create the technologies.
Observe the setup and use of two smartphones, or compare the setup and use of the same smartphone in two different contexts. Indeed, note just how infrequently it is actually used as a traditional voice communication device. Users configure their smartphones to serve as tools for a variety of different activities, depending on the context. My personal smartphone, then, can only be described at the time of this writing as the November 24, 2014, 3pm, in-wait-mode smartphone. At 7pm when I attend my next meeting, it may become a note-taking, live-tweeting smartphone. Later this evening it might become the who’s-that-actor-research smartphone, or the I-want-to-learn-more smartphone as I move towards being a more active TV watcher.
An innovation-in-use framework also pushes us towards new, situated approaches for evaluating a sociotechnical system (Bruce, Rubin, & An, 2009, p. 687). Instead of asking “What can an innovation do?”, we ask “What do people do as they use the innovation?”. Instead of “To what extent are the innovation’s goals achieved?”, we ask “How do social practices change, in whatever direction?”. Instead of asking “How should people or the context of use change in order to use the innovation most effectively?”, we ask “How should the innovation be changed, and how can people interact differently with it, in order to achieve community goals?”. Instead of asking “How does the innovation change the people using it?”, we ask “How does the community fit the innovation into its ongoing history?”.
These questions generally, and the last question in particular, begin to touch on the third point.
Real-world, everyday technologies: It can be argued that human history is the story of tool design and implementation. We build tools to manipulate our environment, creating a new environment. This new environment shapes us and leads towards new tools. For instance, in their presentation to the 2014 Engagement Scholarship Consortium, Maria deBruijn and Lisa Grotkowski of Emerge Solutions, Inc. noted that early in our history we learned to intentionally create fire. We then began gathering in a circle around the fire. This eventually contributed to inner and outer circles and began to further shape our social order, which in turn required new systems to be developed. Their presentation went on to helpfully describe the need to demystify the community development process, to better engage between the rings of the circle (what they refer to as the groan zone), and to be much more flexible in our selection of tools supporting such development and engagement work, so as to better build community.
Given this history of humans and tools, why is it that today technology expertise is defined within only a very narrow window, primarily covering things currently or recently developed by engineers and scientists? Before asking people at the beginning of a digital literacy workshop to describe a way they’ve used stuff they have in a way it wasn’t intended to be used, I ask them to first draw a picture of an innovator innovating. Almost universally, while everyone can describe a way they’ve reinvented something they have, they draw and describe a white male innovator working alone on an innovation.
Judy Wajcman, in her 2009 overview “Feminist Theories of Technology,” traces the history of how technologies came to be defined as those things engineers and computer scientists do. By challenging this viewpoint, we move from seeing attendees of computer/digital literacy classes as non-technology people to seeing them as people who have expertise in different technologies. Not only does this give us a starting point for transferring skills from one sphere of technology expertise to another, but it also opens up opportunities to consider what’s gained and lost by choosing to use one technology over another.
Learning to identify what’s gained and lost by choosing to use one technology over another is a very difficult process, especially when the gains and losses are not universally shared. It is to this point that I believe Virginia Eubanks was especially referring in Digital Dead End.
Experience and expertise in the oppressive nature of technologies: Virginia Eubanks describes the example of a Supplemental Nutrition Assistance Program (SNAP; previously referred to as food stamps) recipient whose pattern of using their electronic debit card to buy food is called into question by a social worker. The SNAP recipient may never have used a keyboard and mouse, or a word processor, but they are very familiar with the intrusive and judging aspects of technology in a way that those of us in a more privileged class can’t even imagine. It’s easy to dismiss or explain away such “judging” as holding a government support recipient accountable. But corporate and government corruption, misuse of funds, and other large-dollar mismanagement of government support programs are far more costly than a recipient’s smaller misuse of funds from programs like SNAP, if it really even is misuse. It may have been the very best use given the context within which the recipient is forced to live. Yet we do not put the same onerous, intrusive, judging systems on a CEO. In a culture where wealth is correlated with initiative, sound judgement, and trustworthiness, our sociotechnical artifacts and practices have embedded within them a trust of the CEO and a distrust of the SNAP recipient. Only by listening to these experts in the oppressive nature of technologies will we learn to champion, design, and co-create more just sociotechnical artifacts and practices.
Too often when referring to new technologies we hear arguments about how we need to get on board or get left behind. We hear about those who have been left behind. When we become frustrated because we find the technology consuming our time or taking us places inconsistent with our values and goals, we’re told that there’s no going back. This is the myth of technological determinism, and it pervades the very core of our culture. Further, it’s combined with technocentrism: the deep, abiding belief that technological solutions will fix our environmental and social problems. This, too, pervades the very core of our culture.
Colin Rhinesmith has explored the ways in which external stakeholder demands and internal organizational needs sometimes come into conflict when social service computing systems are implemented by an organization to meet funder demands. Designs based on limited awareness of the local context, on false assumptions, and on funding agency demands may expose the hidden work done by a social service agency to meet its responsibilities to service recipients. Consistent with technological determinism and technocentrism, external stakeholders may believe outcomes-based reporting software will make funded agencies more efficient. But embedded within such a belief are assumptions about efficiency and scalability that may at best be inconsistent with the local context and at worst be based on unconsidered and oppressive ways of thinking perpetuated through systemic injustice. Rhinesmith’s work helps us see how not just the aid recipient but also the social service worker can bring expertise to the table regarding the oppressive nature of sociotechnical artifacts and practices. And we begin to appreciate how software and computer systems can be designed (or, to use Fischer and Herrmann’s term, under-designed) in ways that allow innovation-in-use to contextualize, and hopefully challenge, the oppressive nature of these systems.
Ultimately, these four different types of expertise are mutually supporting. When effective dialog brings all experts to the table, software and computer systems have the potential to better address societal values and goals based on inclusion and social justice for all.