Education Remix: New Media, Literacies, and the Emerging Digital Geographies

By Sara Evans, Jessica Gilbert Redman, Joanne Rumig, & Marcia Seaton-Martin

Link to article: http://www.digitalcultureandeducation.com/cms/wp-content/uploads/2010/05/dce1034_vasudevan_2010.pdf

Article Synopsis & Core Research Questions

Vasudevan (2010) explores how emerging digital geographies are changing the way young people are educated today, both inside and outside the classroom, as well as how technology is changing the way they communicate with their peers and teachers. Her research focus is to examine the types of digital spaces in which youth participate and how those spaces can be incorporated into current educational practice. YouTube, Flickr, and other digital tools such as cell phones, mp3 players, social networking sites, and virtual worlds have all contributed to a movement in which students are engaged in the learning process both online and in the classroom. This shift affects the instructional models and assessment tools already in place.

In this particular article Vasudevan examines the social media practices and technologies used by two youths, Joey and EJ, who are enrolled in an alternative to incarceration program (ATIP). Using portable technologies, Joey and EJ explored digital geographies in various workshops and improved their writing skills across a variety of digital spaces. Both examples highlight the relationships between literacies and modalities, and how such experiences may shape curriculum design in youth education.

Methods Used

The primary method employed is a pair of case studies of Joey and EJ in the ATIP workshops. Each completed a different creative project in a different cycle of the program. Joey’s project was part of a digital media workshop in which students were asked to create “movies”; EJ’s project was part of a larger program called The Insight Theatre Project, in which students co-wrote scripts that were performed by other participants. The researcher used ethnographic approaches to examine each student’s process in creating his project, though these approaches were complicated by the students’ immersion in virtual spaces where the researcher could not always situate herself. The focus, however, was on how each project highlighted the student’s digital literacy and level of media engagement, and how these changed over the course of the program as more virtual spaces became available to the students.

Findings & Conclusions

As a result of the digital media workshop, Joey was able to use a digital camera not only as a tool but as a space to show the layered geography of his life. He used his PSP to transfer files from the camera to his online profile, gaining new skills in customizing backgrounds and uploading music and multimedia poems. Through the use of ProTools, Joey was able to create beats for his multimedia narrative and later build a collection of beats for other multimodal compositions. The PSP and the digital camera also gave him a renewed sense of exploration of his personal history.

EJ began to navigate new spaces, starting with writing blogs, which helped him develop more of an appreciation for multiple audiences. With the blogging and observations he made, he started to identify himself not only as an intern but also as an ethnographer. As his composing evolved, his digital geographies began to include Twitter, Facebook, and YouTube. By accessing many digital composing spaces, EJ was able to participate in new communities, be recognized for new identities, and gain new audiences.

“By paying attention to digital geographies, particularly the navigation across digital spaces and orchestration of multiple modalities, educators can cultivate youths’ literacies while at the same time inspire new sites of education” (Vasudevan, 2010, p. 79).

Further Research Opportunities & Unanswered Questions

Given the few student examples presented, it would be interesting to see how students use the digital landscape more broadly. One student uses technology in interesting and unusual ways, but how common is this “thinking outside the box”? What percentage of students know their technology well enough to think up new ways to use it? It would be helpful to know what tools students are using and how creatively they are using them.

We often see students who are so-called digital natives yet lack what many consider basic digital skills (e.g., being able to enter a web address in the address bar of a browser instead of using Google to find the site and clicking on it from the search results). Based on these observations, how many students are actively and eagerly furthering their technological skills and knowledge beyond what is required for basic interpersonal communication (texting, Facebook, etc.)? In other words, what we really want to know is whether the Joeys and EJs of the student world are outliers or the norm. How much technological inquisitiveness can we expect from the average student? Are they willing (and/or able) to go above and beyond the normal requirements of a task or project to learn new skills, or to bring together disparate skills in order to create something new?

Is it reasonable to expect schools to keep such varied types of technology (such as gaming systems or different computer operating systems) available to students, just in case they might be able to use it creatively? And even if so, do budgets allow for this possibility?

With schools fighting the influx of technology by passing school-wide “screens down” or “phone basket” regulations (in which students deposit their phones in a basket upon entering a classroom to avoid the temptation of looking at them during class), where is the happy medium that ensures students are using technology effectively but not tuning out during the school day?

Vasudevan, L. (2010). Education remix: New media, literacies, and the emerging digital geographies. Digital Culture & Education, 2(1). Retrieved from http://www.digitalcultureandeducation.com

Bonfield, B. (2014). How Well Are You Doing Your Job? You Don’t Know. No One Does. In the Library with the Lead Pipe.


Pre-intro: We picked this article because it looked like a good starting point for discussion. As future librarians, we will want to do our jobs well, and we will want a way to verify that we are.

How does a librarian know how well they are doing their job? And on that note: what is a library?
Bonfield thinks of a library as “a cooperative for infrequently needed, relatively inexpensive, durable goods.” This is the traditional view of libraries. When people think of libraries, they think of books, but libraries are much more than books; some libraries don’t even have books. So how do we know we are doing a good job? Bonfield discusses how other agencies assess and measure their success at serving their constituents, and he asks: Can libraries measure their work in the same way? Can their success be measured at all?


Libraries are responsible for improving patrons’ well-being, regardless of how the library receives its funding. The article goes on to discuss principles borrowed from other agencies and applies them to American public libraries. How do we measure success? In education, for example, it now involves standardized testing, but it used to involve graduation rates or whether graduates had obtained employment. (This is still true for some universities.)

He also defines well-being.
Well-being is
1. Material living standards like income, consumption and wealth;
2. Health;
3. Education;
4. Personal activities including work;
5. Political voice and governance;
6. Social connections and relationships;
7. Environment (present and future conditions);
8. Insecurity, of an economic as well as a physical nature. (Stiglitz, Sen, & Fitoussi, 2009, pp. 14-15)

There are four responses to the question: Why aren’t libraries measuring outcomes?
1. We are
Libraries already use some unplanned, unmeasured, and unscientific methods of gauging their success at improving the well-being of their patrons. This involves looking at the feedback we get from patrons.
2. It’s expensive
Science is expensive, and far too little federal funding goes to libraries. They cannot afford to assess scientifically whether or not they are succeeding.
3. No one cares
The government is too busy assessing healthcare and education outcomes to focus on library outcomes. This doesn’t mean it does not actually care; it just doesn’t care to measure the success of a library.
4. It’s impossible
Every situation is different. There is no hard science to even evaluate these outcomes. Teachers said the same thing when No Child Left Behind was enacted.

So what are some measurable ways of assessing libraries’ contributions to the well-being of patrons?
Voting
Are the people in your area voting? Are they voting more or less than one would expect for the demographic? Libraries can find ways to help patrons participate in democracy, and they can test for this even if it is expensive. They can then assist each other with improvements and compare notes. (He does not mention where the initial funding will come from.)
Social Capital
Libraries create opportunities for patrons to get to know their neighbors. He says there are numerous ways to measure our contribution to social capital but does not list any.
Employment
Libraries can introduce programs to help patrons obtain employment and then measure their success.

The point Bonfield is trying to make is that people focus on what they measure.

Core Research Questions:
Bonfield addressed the following three research questions in his article.

1.“What is a library?”

2. “What’s the best way to measure how you’re making your constituents’ lives better?”

3. “Why aren’t libraries already measuring outcomes?”

Findings and Conclusions:

Bonfield concluded that libraries are not measuring how their services and programming affect the lives of the patrons they serve. He argued that libraries focus too much on outputs rather than outcomes: they measure quantitative outputs such as library visits, number of items circulated, patron attendance at programs, number of library employees, and amount of library spending. Instead, they should focus on qualitative aspects and outcome-based programming. He believes libraries need to strive for outcome measurements of how they can improve lives in the community. What is it they want to address and change? What are positive outcomes? It starts with identifying the needs and aspirations of patrons.

He mentioned the willingness of libraries to catch up with other public services on this particular issue. For example, PLA (Public Library Association) President Carolyn Anthony created a Performance Measurement Task Force, which will spend three years identifying appropriate library outcomes that would make a positive impact in the community, as well as ways libraries can help patrons achieve them.

Bonfield expressed concern that libraries are running out of time on outcome standards. He emphasized the need for libraries to work together to develop and measure outcomes; otherwise, he suspects, agencies from other fields will define library outcomes for them, and those outcomes might not truly reflect the values of libraries or the best ways to serve the public.

Unanswered questions:
How can he be sure that libraries can even improve the employment rate? That seems to depend more on the businesses in an area than on anything else; a library cannot bring new businesses in on its own.

Which types of literacies would a library measure? Would there be a training program for digital literacies with a test afterwards? What about new literacies? Social media literacies? How necessary are these things for well-being?

What aspects of well-being do libraries wish to promote?
Bonfield provided 8 examples of well-being. Some examples were education, health and income.

What are some ways we really can measure our success at improving the well-being of patrons?
If we measure outcomes and show results would we get more federal funding? What if federal funding starts to be affected by these measured outcomes?