A Randomised Controlled Trial Comparing the Effect of E-learning, with a Taught Workshop, on the Knowledge and Search Skills of Health Professionals

Reviewed By: Heather Campbell, Marisa Eytalis, Gloria Nguyen, Bracha Schefres, Stacy Sorrells

Link to article: http://ejournals.library.ualberta.ca/index.php/EBLip/article/view/54/155

Article synopsis and core research question

Nichola Pearce-Smith (2006) conducted a study to answer her main research question: Is there a significant difference between self-directed learning using web-based resources and learning in a traditional classroom-based workshop for healthcare professionals trying to improve their database searching skills? Her objective was to test two null hypotheses: 1) there is no difference between the two interventions being compared, and 2) there is no difference in knowledge and search skills before and after either educational intervention.
Searching for evidence is an essential job skill for healthcare professionals, and training is thought to improve it, yet Pearce-Smith's literature review found only limited evidence to that effect, and most of the studies she located were “small and methodologically poor” (p. 45). Because the study was published in 2006, there was little evidence at the time showing whether e-learning improves knowledge among healthcare professionals. She did find one qualitative study that reported no differences between groups who participated in workshops versus e-learning. Since the literature on this subject was sparse, the author designed and conducted her own study between September 2004 and September 2005.

Methods used to answer the research question

The method used in this study was a controlled trial of 17 healthcare professionals randomized into two groups. The study population was a convenience sample drawn from the Oxfordshire Radcliffe Hospitals NHS Trust (ORHT), recruited via an invitation letter sent by mail or through the Trust’s intranet and email list, or through posters and leaflets displayed throughout the hospital. The researcher acknowledged that volunteers might have been more inclined to join the trial because they wished to learn or improve their searching skills. To be included in the study, participants had to work within ORHT and have Internet access. To test both null hypotheses, participants were randomized into two groups using computer-generated random numbers. Both groups completed a search exercise before the training to establish a baseline of their ability, and another afterwards to measure any improvement. The first group (WG) attended a two-hour search-skills workshop conducted by a librarian, while the second group (EG) was shown how to access an online learning module that covered question formulation, study design, free-text, thesaurus, and Boolean searching, with examples of searching PubMed and the Cochrane Library.
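The allocation step described above, randomizing participants into two groups with computer-generated random numbers, can be illustrated with a short sketch (the participant IDs and seed are hypothetical; the paper does not publish its allocation procedure):

```python
import random

# Hypothetical IDs for the study's 17 volunteers.
participants = [f"P{i:02d}" for i in range(1, 18)]

rng = random.Random(2004)  # fixed seed so the allocation is reproducible
shuffled = participants[:]
rng.shuffle(shuffled)

# First half (rounded up) becomes the workshop group (WG),
# the remainder the e-learning group (EG).
cut = (len(shuffled) + 1) // 2
workshop_group = shuffled[:cut]   # 9 participants
elearning_group = shuffled[cut:]  # 8 participants
```

With only 17 volunteers, simple randomization like this can leave the groups unbalanced on profession or prior skill, which is one reason the review later raises stratified randomization.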

Findings and conclusions

The study found that while there was a slight increase in the knowledge and search skills of each group, the increase was not significant. And although the workshop group did perform better at devising a search strategy, there was no other notable difference in improvement between the two groups, addressing the first null hypothesis. The second null hypothesis, that there would be no difference in knowledge and search skills before and after either the online or workshop training, was accepted.

While the second null hypothesis was accepted, the results were largely inconclusive due to the small sample size; several factors contributed to the low participation. The initial research question remains an important one, and the author hopes that others will build on her research and methodology to explore this topic further.

Unanswered questions you have and what future research might address

More than once the author pointed out that the results of this study are inconclusive due to the small sample size. Factors contributing to the small sample were that no compensation was provided to participants and that clinical staff are generally difficult to recruit; she also notes that a second search-skills test administered at a later date might reveal true retention of search skills. To obtain significant results in future work, the author suggests recruiting participants early, providing compensation, using “different contact methods,” and making the “inclusion criteria as wide as possible.”
Collaboration with library staff would be necessary to conduct this type of study, and once that collaboration and the e-learning course are established, a longer study period may be possible. The librarians could collect participant data and administer consent forms and pre-intervention tests to willing participants, and the researcher could obtain that data at a later date, once the workshop (WG) group reaches a larger size. The researcher could also make the e-learning materials available on demand and postpone the second round of testing (in a clinical setting) until a larger number of study participants has been reached. Stratified randomization could then be used with the larger participant group. Another open question is how accessibility (e.g., language barriers) was addressed or accommodated in this study.
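The stratified randomization suggested above, randomizing within professional groups so the arms stay balanced, could be sketched as follows (the roster and strata are invented for illustration; the study itself used simple randomization):

```python
import random
from collections import defaultdict

# Hypothetical roster of (participant ID, professional group) pairs.
roster = [
    ("P01", "doctor"), ("P02", "doctor"), ("P03", "doctor"), ("P04", "doctor"),
    ("P05", "nurse"), ("P06", "nurse"), ("P07", "nurse"), ("P08", "nurse"),
    ("P09", "therapist"), ("P10", "therapist"), ("P11", "therapist"), ("P12", "therapist"),
]

rng = random.Random(7)  # fixed seed for a reproducible allocation

# Group participants by stratum (profession).
strata = defaultdict(list)
for pid, profession in roster:
    strata[profession].append(pid)

# Shuffle within each stratum, then alternate assignments so the
# workshop (WG) and e-learning (EG) groups stay balanced on profession.
workshop_group, elearning_group = [], []
for members in strata.values():
    rng.shuffle(members)
    workshop_group.extend(members[0::2])
    elearning_group.extend(members[1::2])
```

Because each stratum is split evenly, neither group can end up dominated by one profession, which matters when the total sample is small.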

A thoughtful attempt to answer your own questions

Is there a significant difference between self-directed learning using web-based resources and learning in a traditional classroom-based workshop for healthcare professionals trying to improve their database searching skills? The author’s objective was to test two null hypotheses, and ultimately she found no significant differences in the knowledge and search skills of either group after the online or workshop training.
The other question raised was how accessibility was addressed, given the limited number of participants. With such a small sample, accessibility was easier to manage. The author designed the online learning resource herself and made it available on the web, password-protected so that only EG participants could access it. The content included question formulation, study design, and free-text, thesaurus, and Boolean searching. An experienced librarian, using methods such as presentations, live Internet demonstrations, and interactive group work, taught the WG.
Lastly, how could the sample be increased from a small group to a larger one? Health professionals could have been encouraged to participate through incentives such as book tokens, wine, festival passes, or gift cards. More funding could also have helped: as with any study, monetary resources are needed to complete successful research, and web-based resources are costly and not easily accessible to everyone.


Pearce-Smith, N. (2006). A randomised controlled trial comparing the effect of e-learning, with a taught workshop, on the knowledge and search skills of health professionals. Evidence Based Library and Information Practice, 1(3), 44-56.


Reviewed By: Megan Lohnash, Frances Owens, Emmanuel Edward Te, and Janice Christen-Whitney

Link to article: http://www.inthelibrarywiththeleadpipe.org/2015/ditchthesurvey-expanding-methodological-diversity-in-lis-research/

Article synopsis and core research question.

Halpern, Eaker, Jackson, and Bouquin (2015) identify surveys as the most commonly used research method in Library and Information Science (LIS) research. They report that 21% to 49% of LIS research articles utilize surveys as their primary data gathering method (Halpern, Eaker, Jackson, & Bouquin, 2015). This “over-reliance on the survey method limit[s] the types of questions we are asking, and thus, the answers we can obtain” (Halpern et al., 2015). The article attempts to discover why the survey as a research method is heavily favored and uncover possible alternative methods.

Although surveys are an affordable, quick, and easy-to-implement data gathering method, many LIS professionals do not have the training required to conduct an effective survey. As a result, they may not word survey questions clearly, offer suitable response choices, or use surveys in an appropriate manner. Halpern and colleagues propose that librarians generally lack the familiarity with research design and implementation needed to obtain the data their research questions require (see also Kennedy & Brancolini, 2012).

The authors propose that LIS professionals consider using evidence-based library and information practice (EBLIP). Eldredge, one of the pioneers of EBLIP, argues that EBLIP “employs the best available evidence based upon library science research to arrive at sound decisions about solving practical problems in librarianship” (as cited in Halpern et al., 2015). The authors conclude by discussing strategies for choosing the best research method for a given research question, describing various research methods available to LIS professionals, and calling on LIS professionals to “think outside the checkbox” (Halpern et al., 2015).

Methods used to answer the research question.

The authors present a literature review to provide context and suggest alternative research methodologies. They found that earlier studies investigating the research methods used in LIS journal articles shared a common definition of research. However, these studies used different criteria to select journals for study, applied different definitions of what constitutes a research article, and were influenced by the LIS journals held at each researcher’s organization (Halpern et al., 2015).

Findings and conclusions.

The primary finding of this article is that surveys are over-represented as a research method in LIS literature. While the authors acknowledge that surveys are low cost, easy to implement, and appropriate for some research questions, they assert that surveys are frequently used by librarians as a one-size-fits-all data gathering method, which inevitably hampers a researcher’s ability to probe participants for the meaning behind their answers. One example cited in the article draws from a Brown University Library patron satisfaction study in which the researchers used both focus groups and surveys to answer their research question. In the focus groups, “‘clear patterns of deep concern began to emerge’ [obtained via probing] that were not apparent in survey responses, and indeed, that surveys are not capable of obtaining” (Halpern et al., 2015).
The authors conclude with a call for other LIS researchers to enrich their “methodological toolbox,” and “to seek out questions that can be best answered by less frequently employed practice” (Halpern et al., 2015). How research is conducted and analyzed depends on the way the question is being asked, which ultimately influences the kinds of results that can be uncovered. The authors also encourage people to join them in an ongoing discussion on Twitter by using the associated “#DitchTheSurvey” hashtag.

Questions for future research.

While it has been interesting to learn that surveys are over-represented in LIS literature, this finding raises the question of why this one method is used so heavily by LIS professionals as a crutch for their published research. The authors’ explanations, that LIS professionals may not be as familiar with other methods and that surveys are cheap, easy, and effective for reaching a large audience, are certainly plausible. However, we would be interested in determining whether there is a relationship between the educational history of LIS professionals and the use of surveys. How robust are the research-methods course offerings in LIS schools? Is the survey research trend due to a generational gap that will correct itself over time and with experience? This could be studied with a longitudinal study, spanning 5 to 20 years, of a graduating cohort of LIS professionals that monitors the types of research they conduct and publish as time progresses.
It may also be useful to reanalyze the data gathered by Halpern et al. (2015), along with the complementary data they reference, in order to study the frequencies of methods used in LIS research. This would help resolve some of the issues the authors identify regarding inconsistent analysis caused by selection bias and differing definitions across the literature.


Halpern, R., Eaker, C., Jackson, J., & Bouquin, D. (2015). #DitchTheSurvey: Expanding methodological diversity in LIS research. In the Library with the Lead Pipe. Retrieved from http://www.inthelibrarywiththeleadpipe.org/2015/ditchthesurvey-expanding-methodological-diversity-in-lis-research/
Kennedy, M., & Brancolini, K. (2012). Academic librarian research: A survey of attitudes, involvement, and perceived capabilities. College & Research Libraries, 73(5), 431-448. doi:10.5860/crl-276