Of libraries, doctorates and Web 2.0

by Urs E. Gattiker on 2009/11/26


[Image: tweet by ComMetrics – "Metrics improve focus. Just be sure you know where you want to go and you focus on the relevant #KPIs to get you there."]

We recently came across a three-year study that tracked the research behavior of ‘Generation Y’ doctoral students (those born between 1982 and 1994). The focus was on how these students are using library resources, including the traditional library, librarian assistance, electronic libraries and Web 2.0 tools.

The media headlines focused on the fact that just under half of those polled use RSS feeds. Journalists also noted that only about 10 percent use social bookmarking, with Generation Y students exhibiting the same behavior as other age groups.

What does this mean – if you don’t use RSS or Del.icio.us, you must be a bad student? This post debunks a few myths.

In her article Next-gen PhDs fail to find Web 2.0’s ‘on-switch’, Times Higher Education’s Zoë Corbyn highlighted several sections of Researchers of Tomorrow, a study jointly funded by the Joint Information Systems Committee (JISC) and the British Library.

    “Interim results, released to Times Higher Education, show that only a small proportion of those surveyed are using technology such as virtual-research environments, social bookmarking, data and text mining, wikis, blogs and RSS-feed alerts in their work. This contrasts with the fact that many respondents professed to finding technological tools valuable.”

Based on the above and after having read both the survey used to collect data, and the preliminary research report, I felt compelled to respond to Corbyn’s article:

    This is a very interesting post and I also looked at the preliminary research report to get a better grasp of the findings. For me, the big question is whether Web 2.0 use has anything to do with outcomes that we want to monitor.
    An example would be whether RSS use helps reduce the time needed to finish one’s studies. In fact, some US data on Facebook use by undergraduates shows it correlates with lower grades and less time spent studying.
    Is this desirable?
    Like the US Pew Research Center, we have found in our work that the majority of our readers do not want RSS feeds; rather, they have good reasons to prefer getting the information via email.
    Most of our readers prefer to get blog posts via email – RSS is one of many choices we have for getting information on the net, but is it the best? More to the point, does it necessarily follow that Ph.D. students are less Web 2.0 savvy if they do not use RSS feeds?
    It could actually represent a choice made by some Ph.D.s to focus on getting their work done as fast as possible instead of tweeting or spending time on Facebook.
    Conclusion
    Some Ph.D. students may have decided not to turn the Web 2.0 switch on when it comes to RSS feeds, using Voice over IP (VoIP), or surfing the web from their smartphone. However, not using Web 2.0 technology extensively does not make a student an inferior researcher. Nor does it mean that it will take more time for that student to complete their studies.
    Neither the summary above nor the preliminary report allows us to answer the above question. Accordingly, we still don’t know why the Ph.D.s chose not to switch Web 2.0 tools on more often.
    What do you think?
    Regards,
    Urs @ComMetrics

Needless to say, I never got a reply to my comment from Zoë Corbyn. Could this be another example of why newspapers fail the social media test?

Is data bias an issue?
I recently pointed out that data bias is becoming a major issue for those interested in web analytics. I could also have pointed out that questionnaire design is something we absolutely must be careful about.

Double-barreled questions are two questions (generally inadvertently) asked as one. For instance, in the previously mentioned study, question five asks what kind of information technology user the respondent is and provides several response choices, of which the respondent must choose the one that most applies.

Given the broadness of the question, researchers cannot be certain how the respondent understood and answered it: as a BlackBerry addict, as an occasional mobile phone user, or as someone who almost never accesses the Internet from a computer.
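To make the problem concrete, here is a minimal sketch with made-up data (not the study's actual items or responses) showing why a bundled statement cannot be interpreted: a respondent who rejects it may be rejecting either half, and the single response category cannot tell us which.

```python
# Hypothetical illustration: a survey item that bundles two dimensions
# (device choice AND frequency of use) into one statement.
respondents = [
    {"device": "smartphone", "frequency": "daily"},
    {"device": "smartphone", "frequency": "rarely"},
    {"device": "computer",   "frequency": "daily"},
    {"device": "computer",   "frequency": "rarely"},
]

# Double-barreled statement: "I mostly go online from my phone, every day."
def agrees_with_bundled_item(r):
    return r["device"] == "smartphone" and r["frequency"] == "daily"

agree = sum(agrees_with_bundled_item(r) for r in respondents)
# Three of four respondents 'disagree' -- but for three different reasons,
# and the single answer category cannot distinguish between them.
print(f"{agree} of {len(respondents)} agree with the bundled statement")
```

Splitting the item into two separate questions (device used, frequency of use) would remove the ambiguity entirely.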

More resources on data bias and the report:

Bottom line
The study tries to track information-seeking behavior and changing attitudes toward research. Among the findings reported was that 75 percent of Generation Y students found the information they sought in an e-journal article.

I am not sure the study makes me confident enough to accept its findings on how Ph.D. students in the UK use information and research resources, both online and off.

If we want to improve web analytics, research policy and our students’ doctoral education, the methodologies used to collect data should ensure a certain level of reliability and validity. To advance our work we must follow best practice and address the issues outlined in this post, as well as the take-aways below.
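Reliability can be estimated before any conclusions are drawn. As one illustration (with made-up ratings, not the study's data), Cronbach's alpha is a standard internal-consistency check for a multi-item survey scale:

```python
import statistics

# Illustrative sketch: Cronbach's alpha for a three-item scale.
# Each inner list holds one item's ratings across five respondents (example data).
items = [
    [4, 3, 5, 2, 4],
    [4, 2, 5, 3, 4],
    [3, 3, 4, 2, 5],
]

k = len(items)
item_variances = [statistics.variance(scores) for scores in items]
totals = [sum(scores) for scores in zip(*items)]  # per-respondent scale totals

# alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
alpha = (k / (k - 1)) * (1 - sum(item_variances) / statistics.variance(totals))
print(round(alpha, 2))  # -> 0.87 for this example data
```

Values above roughly 0.7 are conventionally taken as acceptable; a scale built on double-barreled items would typically score much lower, flagging the problem early.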

Take-aways (in random order)

    1. Validity of research findings is important: the questions must measure what they are supposed to measure, and unless we can be certain of that, reaching conclusions from the data is impossible.
    2. Understanding bias is key: If one does not understand the bias in one’s research data, once again, conclusions are impossible.
    3. Ph.D. and Web 2.0 – switch on or off by choice: Maybe the study indicates that Ph.D. students are smart and, instead of spending hours in virtual nirvana, they have a richer social life in the real world. Most importantly, they may complete their studies that much faster, while delivering high-quality dissertations. Sounds great to me.

Please, share your experiences with us! How have you dealt with these issues in your research, web analytics and/or benchmarking work?


  • http://www.efc.co.uk/ Julie Carpenter

    Hi, Urs.

    I am a Director of Education for Change, the company commissioned to develop and deliver this Researchers of Tomorrow study for the BL and JISC, which is the subject of your discussion. I am also the research director on the study.

    I would just like to point out three things:

    First, this is a qualitative research study, using survey questionnaires only to provide context for a longitudinal qualitative tracking study of 70 research participants which is ongoing.
    Second, the question about 'what kind of information technology user am I' uses a fairly standard technique of proposing statements about technology use and asking respondents to position themselves as an IT user by selecting the one that feels most appropriate to them – certainty about “how the respondent understood and answered the question” is not relevant here, since we are asking them to self-assess against a set of qualitative characteristics that cannot be 'proved' empirically.
    Finally, please bear in mind that this is a three year study (as the summary report at http://explorationforchange.net/attachments/054… explains).

    This first interim report focuses on the results of our first survey of the wider context of doctoral studies (which, by the way, attracted over 5,500 responses). Hard conclusions about any researcher behaviour would be premature at this stage – what we present are interesting indications.

  • http://My.ComMetrics.com Urs E. Gattiker

    Julie

    Thanks so much for responding to my post. I appreciate your reasoning but I would like to point out three issues below if I may:

    a) Neither your press release nor the summary report indicates that the survey data were used for context purposes only. Maybe this could be made a bit clearer next time?

    b) As well, double-barrelled questions are double-barrelled questions: they leave respondents confused and unsure what to answer, and it becomes very difficult for the researcher to know which part of the question the person answered.
    How, then, is one supposed to interpret these data?

    c) You state that the study/report presents indications only – not findings. This distinction does not come across clearly in either your press release or the summary report. The term used is ‘research findings’, which has a vastly different meaning, of course.

    Nevertheless, I certainly want to see the next report from this three-year longitudinal study, and I hope you will send me a link or copy when it is published.

    Finally, the misunderstandings arose in part because the media probably did not take the time to carefully study the material, including the summary report you provided. The result is that indications become facts. I have tried to make this clearer in my post above, but I can see from your comment that I have not fully succeeded in getting this across to my readers. For this I apologise.

    Thanks so much for sharing

    Regards
    Urs

