Evaluation timeline
A programme of evaluation was established to ensure ongoing feedback could be obtained and used to inform the design and development process. The evaluation process was based on the 'Evaluation Lifecycle Toolkit' created by Julia Meek of the University of Birmingham. This is an integrated package designed to place evaluation at the heart of the development of learning technology, building evaluation activities into every stage of the process. The programme aimed to ensure the training package would meet the needs of different user communities as effectively as possible.
The timeline of evaluation activities is shown below.
Sept 2004: Development begins
Oct 2004: Initial heuristic evaluation
Details: An early heuristic evaluation was undertaken, focusing on navigation and the look and feel of the web site. This study highlighted issues related to page design, navigation and content design. A report outlining these key issues was produced for the development team. The web site was amended as a result of the evaluation.
Dec 2004: Usability evaluation - User group
Details: A user study was conducted focusing on usability. Three participants were observed using the web site; they were asked to think aloud so that their reactions could be recorded. This evaluation led to changes to instructions and navigation; in particular, it highlighted the need to separate instructions from content.
Jan - Mar 2005: Trialling with a group of users
Details: A trial was carried out with a group of first-year postgraduates (PhD and Master's students) from the University of Leicester who were following a 10-week course on research methods. After a session on online research methods, volunteers were asked to evaluate sections of the site.
Mar 2005: Follow-up usability evaluation
Details: A second user study was conducted. Again, three participants were observed and asked to think aloud. This study illustrated that the previous changes had been effective, and that problems with the structure and links were no longer apparent. The main navigation, structure and design were effective.
Apr 2005: Second heuristic evaluation
Details: A further heuristic evaluation was conducted which ensured that the issues highlighted by the initial heuristic evaluation had been addressed and provided a final check that the site was easy to navigate and use. The evaluation highlighted some minor issues e.g. broken links, but essentially it confirmed that the screen design and navigation were consistent and robust.
Autumn 2005: Detailed evaluation of content
Details: Recognised subject experts undertook a detailed evaluation of the content of the web site, and outcomes were fed back to the design team, enabling any necessary changes to the content to be made. Academics whose work was used for case studies and examples on the site were asked to review the material and provide feedback. The response from these reviewers was positive.
Autumn 2005: Planned user study
Details: In the first draft of the timeline (September 2004) it was envisaged that a user study would be conducted during the autumn of 2005. This study did not proceed because the content evaluations were extended to include a wider range of evaluators and the development team did not want to present the site to a user group before the changes recommended by the evaluators had been incorporated.
March - July 2006: Dissemination activities
Details: During the final phase of the project, a series of dissemination activities will be undertaken. These will include face-to-face training with academic staff and postgraduates, as well as online dissemination. These activities will be evaluated.
Further details about the evaluation activities carried out at different points during the project can be seen below.
October 2004
Heuristic evaluation
A heuristic evaluation was carried out by Julia Meek (University of Birmingham) with a team of three evaluators.
This led to the following changes in the navigation and design of the site:
- The side menu bar was made available as a navigation option on every page and moved to the conventional position for such a bar on the left-hand side of the page;
- The number of links in the project background section was reduced to improve clarity and usability.
The profiles of the evaluators are as follows:
Dr Stamatina Anastopoulou
Research Fellow, University of the Aegean, Dept. of Products and Systems Design, Ermoupolis, Syros, 84 100, Greece.
Dr Anastopoulou is interested in studying learners' interactions with technology, especially those interactions that involve multiple sensory and communicative modalities. During her PhD research at the University of Birmingham, she explored the effect of multimodal interactions in science learning and developed a theoretical framework for the design of educational technologies. Now at the University of the Aegean, she explores the applicability of this framework in different cultural and educational settings.
Dr Liz Masterman
e-Learning Researcher
PhD in Educational Technology, University of Birmingham, 2004
Currently working as evaluation officer on contract with the Learning Technologies Group, Oxford University Computing Services.
Mike Sharples
Professor of Educational Technology at the University of Birmingham and Director of the University's Centre for Educational Technology and Distance Learning (CETADL).
His current research interests include human-centred design of learning technology, the theory and development of mobile learning, and the application of models of cognition and social interaction to the design of technology for learning and knowledge working. He is the author of 7 books and over 150 other publications in the areas of interactive systems design, artificial intelligence and human-computer interaction.
The report is available from the following link (requires Adobe Acrobat Reader):
Heuristic evaluation report (pdf, 175 KB)
December 2004
Presentation on project evaluation
A presentation was delivered at the ESRC Research Methods Programme's two-day workshop on online resources by two members of the team (Tristram Hooley and Rob Shaw). The presentation covered key project evaluation activities and is available from the following link.
The Evaluation Lifecycle (ppt, 927KB)
Usability study
A usability study was carried out which led to the following key changes in the navigation and design of the site:
- Clearer differentiation between site instructions, activity instructions and general site text;
- The introduction of arrow graphics and status bar instructions to clarify which links open on the same page;
- Refinement of the instructions for the personal references list in the modules and the introduction of a method of storing selected references to allow them to be maintained across different pages.
Many thanks to the following people for taking part in the study:
- Colin Hyde, Researcher and Outreach Officer, East Midlands Oral History Archive, University of Leicester;
- Kate Moore, Cartographic and GIS Officer, Department of Geography, University of Leicester;
- Tim Vorley, PhD researcher, Department of Geography, University of Leicester.
Trialling with a group
A sample of the questionnaires module was delivered to a group of first-year postgraduate students following a 10-week course on research methods at the University of Leicester. After a session on online research methods, volunteers were asked to evaluate the site by working through a series of pages from the questionnaires module, consisting of reading materials and learning activities presented as a mini site. Students were asked to complete short questionnaires about their experiences of using the module.
These questionnaires confirmed that the changes made as a result of the initial heuristic evaluation and usability study had been generally effective: no problems were reported with the design and navigation of the site as a whole, or with the use of the personal references list. An example of a completed questionnaire is shown below (requires Adobe Acrobat Reader):
Example questionnaire (pdf, 115KB)
March/April 2005
Usability study
A further usability study was carried out at the end of March. This illustrated that the changes made after the initial studies had been effective. Navigation and usability problems were no longer apparent.
Many thanks to the following people for taking part in the study:
- Judith Guevarra Enriquez, PhD Student, University of Aberdeen;
- Selina Lock, Information Librarian, University of Leicester;
- Liz Towner, Educational Development and Support Centre, University of Leicester.
Heuristic evaluation
A further heuristic evaluation was carried out by Julia Meek (University of Birmingham) in April. This aimed to ensure that the issues highlighted by the initial heuristic evaluation had been addressed and to provide a final check that the site was easy to navigate and use. The evaluation highlighted some minor issues such as broken links, but essentially it confirmed that the screen design and navigation were consistent and robust. The main conclusions were:
- There had been a significant overall improvement since the November heuristic evaluation - the look and feel of the site was felt to be good and it was considered to be easy to use;
- Page design, navigation and control over how text is viewed were highlighted as positive features;
- The learning activities were felt to be useful;
- A number of largely minor issues were highlighted for improvement.
The report is available to view from the following link (requires Adobe Acrobat Reader):
Heuristic evaluation report (pdf, 44 KB)
October/November 2005
Content evaluation
Upon completion of the modules in October, a detailed evaluation of the content of the web site was carried out. Chris Mann (Oxford Internet Institute) and Parvati Raghuram (Open University) evaluated the modules in detail, and a variety of academic and non-academic evaluators also provided feedback on specific areas of the site and/or the site in general. The following table shows the key evaluators along with their institutional affiliation and the scope of their evaluation.
Evaluator | Institution | Scope of evaluation |
---|---|---|
Chris Mann | Oxford Internet Institute | Introduction; Interviews; Ethics. |
Parvati Raghuram, Lecturer in Geography | Open University | Introduction; Questionnaire design. |
Christine Hine, Senior Lecturer specialising in Virtual Ethnography | University of Surrey | Ethics. |
Chris Taylor, Project Manager for the ESRC-funded Teaching and Learning Research Programme (TLRP) Research Capacity Building Network (RCBN) | Cardiff University | Questionnaires. |
Martyn Denscombe, Professor of Social Research | De Montfort University | Site in general. |
Claire Hewson, Lecturer in Cognitive Psychology | University of Bolton | Site in general. |
Christine Gratton, E-Learning Co-ordinator | University of Nottingham | Technical guide. |
Jane Hanford, Market researcher | Nokia | Site in general. |
Rob Negrine, Doctor | NHS | Site in general. |
Colin Hyde, Researcher and Outreach Officer | East Midlands Oral History Archive | Site in general. |
The evaluations were organised and structured by the project's evaluation consultant Julia Meek (University of Birmingham). The outcomes were fed back to the design team, and some of the key changes made as a result were as follows:
- The introductory section was revised to provide clearer pointers to the sections in the main body of the material;
- A clearer section outlining how the site and the modules can be used was included;
- Minor changes and additions were made to the academic content;
- A full site map and module index were added to allow navigation direct to module content;
- Glossary links were added and more links were included between module pages.
As an example of the feedback forms returned by the evaluators, feedback from Christine Hine (University of Surrey) is available to view from the following link (requires Adobe Acrobat Reader). The feedback from all evaluators can be viewed in the evaluation report in the section below.
Feedback from Christine Hine (November 2005) (pdf, 82 KB)
March - July 2006
The training and dissemination activities carried out by the team are to be evaluated and details of this evaluation will be posted here.
March 2006
On 14th March 2006, the project team delivered a workshop organised by Jane Wellens and promoted through the M1/M69 Academic Development Network of Universities, comprising 11 universities and higher education institutions in the Midlands area. The workshop was held at the University of Leicester and was targeted at academic staff, research assistants and postgraduate students. It explored some of the key issues surrounding the use of online questionnaires, and synchronous and asynchronous interviews, in social science research. It also introduced participants to this online training resource so that they could subsequently use it to further develop their skills and knowledge in online research methods.
The workshop was evaluated via a questionnaire which began by asking the participants to outline their prior experiences of online research methods. It then asked them to provide a description of their experiences of the workshop by completing the following sentences:
- The main issues that have stood out for me in this session are...
- As a result of this session I will...
- The best thing about the session was...
- The worst thing about the session was...
Finally, it asked them to outline whether they expected to make use of the different sections of this training package in the future.
The questionnaire showed that the participants found the workshop useful and that they appreciated the introduction to the resource, expecting to use a wide range of different sections of the site in future. It was clear that some of the participants would have appreciated more focus on online questionnaires and specifically hands-on information on how they can be created. However, these participants said that they intended to use the site to explore this further after the workshop.
Evaluation report
The evaluation consultant, Julia Meek, produced a final report on the evaluation process in March 2006. Her summary is outlined below, and the full report can be viewed by following the links at the end of this section.
The 'Exploring online research methods in a virtual training environment' project, funded by the ESRC Research Methods Programme (Phase 2, award no: RES-333-25-0001), has successfully met the aims outlined at the outset. This report concludes that:
- The website provides an excellent resource for training, meeting key training guidelines specified by ESRC and the Joint Skills Statement. The website can be used for both face-to-face training and online dissemination and is a useful resource for researchers at all stages in their career;
- The website includes modules on online questionnaires, interviews, ethics and a technical guide;
- There is extensive reference to papers, websites and resources, which the learner can collect and export to other bibliographic software, e.g. EndNote;
- This is a totally new resource; there is no other training of this kind widely available. All evaluators commented that they had not come across a similar resource;
- The website is professionally presented, well designed and easy to navigate. The following quote from Dr Chris Mann illustrates how well the website has been received by academic reviewers:
"This is self-study online training of the very best kind: practical and hands-on; theoretically sound; technically exacting; supportive and inspirational. The opportunities and challenges of online research are presented by an interdisciplinary training team with comprehensive expertise in the methods discussed".
Dr Chris Mann (Oxford Internet Institute) November 2005
The full report is available to view from the following link (requires Adobe Acrobat Reader).
Evaluation report (pdf, 439 KB)
Appendix A: Evaluation timeline (pdf, 70 KB)
Appendix B: Heuristic evaluation report: November 2004 (pdf, 172 KB)
Appendix C: Feedback form from Group trial (pdf, 116 KB)
Appendix D: Heuristic evaluation report: April 2005 (pdf, 34 KB)
Appendix E: Full feedback from external evaluators (pdf, 213 KB)
Usage report
The following report on usage of the site in the first five months following its release was written by Learning Technologist Rob Shaw in October 2006.
1. Introduction
This report aims to provide information on the usage of the 'Exploring Online Research Methods' website for the period 3 April to 31 August 2006 (the first five months following the website's completion and release). Firstly, it will provide an overview of the site's traffic as recorded in the server log files. This will include indications of total traffic as well as analysis of the nature and origin of this traffic. Secondly, the report will provide an indication of the reach and effectiveness of the site by outlining how far it has been included in key resource listings and search engines and by providing a summary of testimonials and recommendations from users. Finally, a short evaluation of the use of the site is given.
2. Site traffic
Figures for website usage are notoriously difficult to gather and prone to inaccuracy. Analog 6.0 was the software chosen for the analysis of the server logfiles as it offered a high degree of accuracy in its logfile reporting and provided a cost-effective long-term solution. Estimates of total usage were gathered by counting the number of requests for pages, which ensured that the figures were not artificially inflated through the inclusion of requests for graphics and other resources packaged within each single page. Alongside the measurement of 'distinct hosts', the number of individual computers that have accessed the site, these figures can be obtained with some certainty and provide an indication of minimum usage.
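To illustrate the counting approach described above, the sketch below tallies page requests and distinct hosts from a web server access log. This is a minimal, hypothetical sketch rather than the Analog 6.0 configuration the project actually used: the log file name, the Common Log Format assumption and the list of file extensions treated as embedded resources are all assumptions made for the example.

```python
import re
from urllib.parse import urlparse

# Extensions treated as embedded resources rather than pages (assumed list).
NON_PAGE_EXTENSIONS = {".gif", ".jpg", ".jpeg", ".png", ".css", ".js", ".ico"}

# Common Log Format: host ident user [timestamp] "METHOD path PROTOCOL" status bytes
LOG_LINE = re.compile(
    r'^(?P<host>\S+) \S+ \S+ \[[^\]]+\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
)

def count_usage(logfile_path):
    """Return (page_requests, distinct_hosts) for successful GET requests."""
    page_requests = 0
    hosts = set()
    with open(logfile_path, encoding="utf-8", errors="replace") as logfile:
        for line in logfile:
            match = LOG_LINE.match(line)
            if not match:
                continue  # skip malformed lines
            if match.group("method") != "GET" or not match.group("status").startswith("2"):
                continue  # count only successful page fetches
            path = urlparse(match.group("path")).path.lower()
            extension = path[path.rfind("."):] if "." in path else ""
            if extension in NON_PAGE_EXTENSIONS:
                continue  # graphics and other embedded resources are not counted as pages
            page_requests += 1
            hosts.add(match.group("host"))
    return page_requests, len(hosts)

if __name__ == "__main__":
    pages, distinct_hosts = count_usage("access.log")  # assumed log file name
    print(f"{pages} page requests from {distinct_hosts} distinct hosts")
```

As with the figures reported below, counts produced this way are best read as minimum usage levels, since cached pages never reach the server log.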
As the following graph (Figure 1) shows, a total of 52,755 pages were requested during the five-month period, an average of approximately 352 page requests per day.
The graph shows that there has been consistent growth in page requests, from about 10,000 after one month to over 50,000 after five months. The monthly growth rate slowed slightly over the second half of the period, but this is likely to have been caused by the nature of the annual academic cycle. If this proves to be the case, the figures for the next two to three months, which will cover the beginning of a new academic year, are likely to show an upturn.
When considering these figures, it should be emphasised that they are reliable only as an indication of minimum usage levels. There is likely to be a degree of underestimation because, following a first visit to the site, pages may be 'cached', or saved to a user's hard drive, for access at a later date. In this case, subsequent visits would not be included in the count. In some cases, users of the same computer, network or Internet Service Provider may also share a cache, so that only the first request from a user for a particular page would be counted.
The requests shown originated from a total of 10,167 distinct hosts. This can be considered an accurate measurement of the number of different machines that have downloaded pages from the site, but this may again be an underestimation given that multiple users may share a single machine.
Traffic to the site originated from a total of 94 different countries worldwide. The geographical breakdown is shown in the following chart (Figure 2). Although about half of the users so far have originated from the UK (25%) or commercial companies (including Internet Service Providers) (22%), significant numbers of users have also originated from Europe (6%), Australia (2%), Asia (2%), the US (2%) and South America (2%). Smaller numbers have originated from Canada, New Zealand, Africa and the Middle East. The site appears to be gaining widespread international usage.
The referring site information shows the top twenty websites from which a link to the site was followed. This can be seen in the following table (Figure 3).
Figure 3: Referring site information

Rank | Number of referrals | Referring website |
---|---|---|
1 | 1744 | http://www.google.co.uk/ |
2 | 1671 | http://www.google.com/ |
3 | 708 | http://zillman.blogspot.com/ |
4 | 376 | http://www.google.co.in/ |
5 | 199 | http://www.sosig.ac.uk/ |
6 | 189 | http://gsociology.icaap.org/ |
7 | 179 | http://www.google.com.au/ |
8 | 134 | http://www.google.ca/ |
9 | 111 | http://www.business.heacademy.ac.uk/ |
10 | 97 | http://cc.msnscache.com/ |
11 | 94 | http://www.aoir.org/ |
12 | 90 | http://www.leicester.ac.uk/ |
13 | 87 | http://search.msn.co.uk/ |
14 | 77 | http://www.google.com.my/ |
15 | 70 | http://www.websm.org/ |
16 | 70 | http://search.yahoo.com/ |
17 | 65 | http://www.google.de/ |
18 | 60 | http://www.york.ac.uk/ |
19 | 54 | http://www.bloglines.com/ |
20 | 45 | http://www.hlst.heacademy.ac.uk/ |
The majority of those who reached the site via another website did so through search engines, and through Google in particular (1, 2, 4, 7, 8, 13, 14, 16, 17). Portal and listings websites made up a substantial number of the others (5, 6, 9, 15, 20), along with academic institution and organisation websites (11, 12, 18) and blogs (3, 19). This information reflects the range of websites which added links to the site, and also the importance of ensuring that the site can be found both through simple searches such as 'online research methods' in popular search engines and through links in specialist websites.
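As a rough illustration of this grouping, the following sketch tallies the Figure 3 referral counts by category. The category assigned to each referrer is an assumption made for this example (cc.msnscache.com, for instance, is grouped with the search engines), following the groupings suggested above rather than any classification from the original report.

```python
from collections import defaultdict

# (referrals, referring website, assumed category) taken from Figure 3
REFERRALS = [
    (1744, "www.google.co.uk", "search engine"),
    (1671, "www.google.com", "search engine"),
    (708, "zillman.blogspot.com", "blog"),
    (376, "www.google.co.in", "search engine"),
    (199, "www.sosig.ac.uk", "portal/listing"),
    (189, "gsociology.icaap.org", "portal/listing"),
    (179, "www.google.com.au", "search engine"),
    (134, "www.google.ca", "search engine"),
    (111, "www.business.heacademy.ac.uk", "portal/listing"),
    (97, "cc.msnscache.com", "search engine"),
    (94, "www.aoir.org", "academic/organisation"),
    (90, "www.leicester.ac.uk", "academic/organisation"),
    (87, "search.msn.co.uk", "search engine"),
    (77, "www.google.com.my", "search engine"),
    (70, "www.websm.org", "portal/listing"),
    (70, "search.yahoo.com", "search engine"),
    (65, "www.google.de", "search engine"),
    (60, "www.york.ac.uk", "academic/organisation"),
    (54, "www.bloglines.com", "blog"),
    (45, "www.hlst.heacademy.ac.uk", "portal/listing"),
]

totals = defaultdict(int)
for count, _site, category in REFERRALS:
    totals[category] += count

# Print categories in descending order of total referrals
for category, total in sorted(totals.items(), key=lambda item: -item[1]):
    print(f"{category}: {total} referrals")
```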
Despite the caveats about using site traffic to estimate website use, it is therefore clear that the site is well used, that use is growing, and that the site has an international audience and is being reached via a variety of search engines, academic institutions and personal recommendations. In the next section an assessment of the website's usefulness will be attempted.
3. Reach and effectiveness
3.1 Links
One way of assessing usefulness is to explore the organisational and individual websites that included links to the 'Exploring Online Research Methods' site. These often included descriptions and/or recommendations for the site. Such portal sites and listings featured highly in the list of referring sites. Some of these links were solicited and some were initiated by the referring site through contact with the authors. Many were complimentary about the site. Examples include:
Institutions and organisations:
Association of Internet Researchers (AoIR): The site was highlighted on the front page as being a useful resource for members. The President of AOIR, Dr. Michael Allen, contacted the authors to confirm this with the following comments: '…this is, from first impressions, an excellent site which does a lot of things we at Curtin have attempted to do (poorly!) in a unit of study on similar issues. We will definitely be using it in our degree programs. I will also bring this to the attention of the whole AoIR mailing list (1000+ subscribers) and will add it to the website.'
Other organisations and institutions that included links with recommendations and descriptions include WebSM, QUALITI, the University of York, the University of Nottingham, University College London, Barnsley NHS Trust and the Association of American Geographers.
Portals:
Sosig (now 'Intute: Social Sciences') made the site an 'Editor's Choice' for research methods tools. Since its inclusion, the 'Exploring Online Research Methods' site has featured consistently among the 15 most popular research methods websites listed and among the 30 most popular of all the sites on the listing.
Higher Education Academy: Sections of the Subject Network resource sites such as Business, Management, Accountancy and Finance (BMAF) and Social Policy and Social Work (SWAP) gave the site a prominent listing and description.
Resources for Methods in Evaluation and Social Research: The site was added as a featured link.
Blogs and personal bookmark listings:
The site was included in several blogs and personal bookmark listing web pages, as is reflected in the inclusion of these websites in Figure 3. An example is the following, from a blog called Marcus's Musings – Thinking and writing about FLOSS and digital sustainability: 'Yael Levanon, who attended last year's Summer Doctoral Programme of the Oxford Internet Institute with me in Beijing, passed on an interesting virtual training environment hosted by the University of Leicester called Exploring Online Research Methods. It features several modules, e.g. online questionnaires, ethics, and offers a well sorted area of helpful resources on the web dealing with the question of how to run online surveys. Thanks Yael, thanks folks at Uni of Leicester!'
3.2 Comments and recommendations
A wide range of comments and recommendations have been received from users, teachers and experts in the field of online research methods. These have emphasised the value of the website as a free, openly available resource, both for individual use and for teaching purposes.
Some examples are given below:
We appreciate your project and I predict many of our members will benefit from your work.
Dr Michael Solem, Association of American Geographers
Dear members of the GIR mailing list:
I'd like to point you to a very important online research training Website which was launched yesterday: To me, this is certainly "the" most comprehensive and well-designed online research training Website, valuable for all of us.
Dr. Michael Bosnjak, University of Mannheim
Dear Clare
Just wanted to say congratulations on the online methods work. I found it helpful recently, when I wanted to explore the ethics around using forum and message board postings, such as how do you get permission to use such material. It's a really impressive resource and long over due.
Tracy Simmons, Lecturer, Dept of Media and Communication, University of Leicester
Dear Clare,
I thought you would be interested to read the following reaction from Alice, who teaches the Research Methods course to our Master's and PhD students: 'Thank you, thank you, Lokman. This is a wonderful place, a real find! Thank you!'
Dr. Lokman Meho, Indiana University
4. Conclusions
Quantitative site traffic statistics, in addition to qualitative responses and recommendations, suggest the site has robust and consistent levels of usage. Many users of the site appear to be finding it an accessible, useful and original resource.