Review of an Existing Search System
1. Introduction
People use search or Information Retrieval (IR) technologies in different ways, and within different contexts,
to locate relevant information. Most websites and information services include functionalities to assist end users
with searching for and browsing information (e.g. IR systems, facets, etc.). These, combined with effective
information architecture design, can support information finding.
The aim of this coursework is for you to investigate the features provided by a specific website to support users
with finding information through searching, i.e. for the kind of information seeking where the starting point is
formulating a query.
This coursework requires you to carry out a form of ‘expert’ assessment of a search system. You will consider
how well the search system meets the needs of its users undertaking their typical search tasks. You will also
review and critically assess how well the system enables users to retrieve information items. You must write
up your review in the form of a 3,000-word structured report (excluding abstract, table of contents, references
and appendices). As a piece of academic work you are expected to engage with the research literature and
provide references to justify and support your writing.
Section 2 provides more information on what you need to do to carry out this assignment. Sections 3 and 4
provide details on producing the written report, including submission details and its overall structure.
If you have questions about the coursework, please contact Sophie Rutter ([email protected]) to
discuss. This piece of individual assessment is worth 100% of the overall module mark. Submission deadline:
10am Monday 20th January 2020 via Turnitin.
2. What you need to do
2.1. Select a website/search system
From the following list select a search system you would like to review:
The National Gallery – www.nationalgallery.org.uk
Sheffield City Council – www.sheffield.gov.uk
Purple Bricks Estate Agents – www.purplebricks.co.uk
China Daily – www.chinadaily.com.cn (Important: you must use the English language version of this
website and test the search system using queries written in English)
New York Public Library – www.nypl.org (select “search the catalog”)
Flickr – www.flickr.com
Work through and experiment with all of the above before making your selection.
2.2. Identify typical users
For the selected search system, find out the purpose of the website, the users and likely uses. To do this you
will need to refer to the research literature about the website / type of website selected (e.g. cultural heritage
museums, libraries and so on). Think about who the users are likely to be, what characteristics they are likely
to possess, and for what purposes they would use the website and its search system.
2.3. Identify search features
For the selected website, identify the features of the search system. Focus on the functionalities that
support users as they carry out their search tasks (e.g., search box, facets), rather than features that may
support other types of task (e.g. buying or sharing items). Using Wilson (2011), categorise these as Input,
Control, Informational and Personalisable features.
2.4. Create appropriate search tasks
Next consider the types of search task that are likely to be performed using the selected site. You must create
scenarios, search tasks and queries that are appropriate for your selected IR system (you should base these
on the typical users and uses that you identified in section 2.2). Consider different types of task (e.g. see Kellar,
Watters & Shepherd, 2006 or Toms, 2011) and then choose two different types of task which are appropriate
and realistic for your chosen website. Make sure that you focus on search tasks rather than other types of
tasks (e.g. buying items). Then, create one or more user scenarios for each search type, tailored to your choice
of search system.
2.5. Investigate how search features support users
Based upon the tasks you created in 2.4, explain how the system features (identified in 2.3) support users
during the search process (i.e. query formulation / reformulation and results examination – see Sutcliffe &
Ennis, 1998) for different task types. The experiences gained from carrying out the search tasks will help you
identify strengths and weaknesses of the search system. You may find that the system supports some tasks
better than others.
2.6. Assess retrieval performance
Now assess the effectiveness of the retrieval system itself, such as its ability to handle different types of
queries, identify relevant items and rank the results. For your two tasks, define at least 10 queries,
experimenting with different types, e.g. as discussed by Rose & Levinson (2004). These might include
misspellings, use of proper names (e.g. names of people, places, etc.), use of phrases and Boolean expressions,
alternate spellings, queries in multiple languages, synonyms and related terms, plurals, etc. You may also
want to experiment with longer queries, shorter (and often more ambiguous) queries, questions, the use of
stemming, etc. You should aim to find documents/pages/information that you know exist on the website, as
this makes for a fairer test. This may require you to undertake some preliminary searching before you define the
queries for your test.
Enter each query into the search engine and gather the results from the first page (typically the first 10 hits,
although there may be fewer). For each result, judge its relevance. Next, measure search effectiveness
based on some form of precision (see Lecture 9). Finally, provide some overall rating of the results across all
queries, as well as an analysis of the performance for individual queries and types of queries.
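To make the precision calculation concrete, the following is a minimal sketch in Python. It is illustrative only: the function name and the relevance judgements are invented for this example, and your own report should use whichever precision-based measure you select (see Lecture 9).

```python
# Illustrative sketch: a simple precision score over the first page of
# results, assuming binary relevance judgements. The queries and the
# judgement data below are made up for demonstration purposes.

def page_precision(judgements, k=10):
    """Fraction of the first-page results judged relevant.

    judgements: relevance of each hit (True = relevant), in ranked
    order; a page typically holds 10 hits but may hold fewer.
    """
    page = judgements[:k]
    if not page:
        return 0.0
    return sum(page) / len(page)

# Query 1: a full page of 10 hits, 6 of which were judged relevant.
q1 = [True, True, False, True, False, True, False, True, True, False]
print(page_precision(q1))  # 0.6

# Query 2: only 4 hits returned, 2 of which were judged relevant.
q2 = [True, False, True, False]
print(page_precision(q2))  # 0.5

# Overall rating across all queries: the mean page precision.
mean = sum(page_precision(q) for q in [q1, q2]) / 2
print(round(mean, 2))  # 0.55
```

A per-query table of such scores, alongside the query type, also supports the required analysis of which types of queries the system handles well or poorly.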
2.7. Evaluate the IRS for its strengths and weaknesses
After testing the system extensively in these different ways, you are now ready to evaluate its strengths and
weaknesses in supporting users to complete their search tasks. Compare your results with what is
considered best practice for designing search systems and search interfaces (see, e.g. Resnick & Vaughan,
2006; Wilson, 2011, and others).
2.8. Identify potential improvement
Finally, identify potential improvements for the search system and to the search user interface, based on
your evaluation and using the research literature to support your recommendations. You should consider
current/future developments in IR research.
3. Written report
Following your investigation of the selected search system, and engagement with literature, you must write up
your findings into a written report that is worth 100% of the module mark. You should write your report for the
website owner. They may have technical expertise and be familiar with the domain, but less familiar with search
systems and evaluation. Your report should be structured as follows.
Structured abstract [up to 2 marks deducted from “presentation” if not included as required]
Include a structured informative abstract (no more than 250 words – this is not included in the word count for
the assignment) that contains the following section headings:
Aim(s) of the report.
Selected website / search system.
Main findings (including strengths, weaknesses and recommendations).
Table of contents [up to 2 marks deducted from “presentation” if not included as required]
Include each of the core sections identified here; use the numbered style, e.g. 1.0, 1.1, 1.2, etc., and include
page numbers. The Table of Contents is not included in the word count for the assignment.
Description of selected search system [10%]
Briefly describe the purpose of the website and search system, the users and likely uses. Where
possible, you should provide references to support your description. (5%) [Guideline: 300 words]
Present, in the form of a table, the functionalities offered by the search system, stating the feature
provided, the purpose of the feature and a categorisation of the feature using the four categories
proposed by Wilson (2011). The table is not included in the word count but you should make table
entries succinct. (5%) [Guideline: n/a but keep succinct]
Description of search tasks [5%]
Describe two different types of search tasks in the form of scenarios that you will use to test the
system. Using the research literature identify the type of task. [Guideline: 200 words]
Supporting the user’s search process [25%]
For each of the two scenarios / search tasks, describe how search system features (identified in the
table above) support the user during the search process. You do not need to describe each system
feature but you should show how the search system supports users at different stages of the search
process, and for different types of task. In the appendix, list the queries you used for each task.
[Guideline: 850 words]
Analysis of retrieval performance [25%]
Describe the purpose of this testing, the tests carried out, the types of queries used, how you judged
relevance and the measures of retrieval effectiveness used. Where possible, you should provide
references to appropriate literature to support your testing (e.g. in selection of measures).
In the appendix, provide a list of the queries used and a summary of the score for each query in the
form of a table (not included in the word count), covering:
- The queries selected (and any other information, such as category).
- How relevance was judged.
- Results for the measures used for each query.
Provide a description, analysis and explanation of results, highlighting the strengths and weaknesses
of the retrieval system based on your results.
[Guideline: 850 words]
Discussion and recommendations [25%]
Bring all your insights and results of testing together to discuss the findings and describe how well
the system performed from the perspective of its provision of search functionalities to support users
during the search process, and from the perspective of retrieving relevant results. You should
list/highlight the main strengths and weaknesses of the search system, making use of the test results
to highlight particular aspects that worked well and not so well.
Make suggestions or recommendations about how the system could be improved to better meet the
search needs of its end users. Your suggestions should be supported by the research literature.
[Guideline: 800 words]
References [up to 2 marks deducted from “presentation” if not included]
You must provide a list of the references cited in the report – this is not included in the word count for the
assignment. These should conform to the required APA style as outlined in the Student Handbook.
Presentation [10%]
The report should be written in a clear manner for a business audience with appropriate use of English,
screenshots, tables and figures. Marks will be deducted if the structured abstract, table of contents and
references are not included as described above.
4. Information School Coursework Submission Requirements
It is the student’s responsibility to ensure no aspect of their work is plagiarised or the result of other unfair
means. The University’s and Information School’s advice on unfair means can be found in your Student
Handbook, available via http://www.sheffield.ac.uk/is/current
Your assignment has a word count limit. A deduction of 3 marks will be applied to coursework that is 10% or
more above or below the specified word count, or that does not state its word count. It is your
responsibility to ensure your coursework is correctly submitted before the deadline.
It is highly recommended that you submit well before the deadline. Coursework submitted after 10am on the
stated submission date will result in a deduction of 5% of the mark awarded for each working day after the
submission date/time up to a maximum of 5 working days, where ‘working day’ includes Monday to Friday
(excluding public holidays) and runs from 10am to 10am. Coursework submitted after the maximum period
will receive zero marks.
Work submitted electronically, including through Turnitin, should be reviewed to ensure it appears as you
intended.
Before the submission deadline, you can submit coursework to Turnitin numerous times. Each submission
will overwrite the previous submission. Only your most recent submission will be assessed. However, after
the submission deadline, the coursework can only be submitted once.
If you encounter any problems during the electronic submission of your coursework, you should immediately
contact the Module Coordinator and the Information School Teaching Support Team
([email protected]). This does not negate your responsibilities to submit your coursework on
time and correctly.
5. References
Hearst, M.A. (2009). Search User Interfaces. Cambridge University Press. Available online:
http://searchuserinterfaces.com/
Kellar, M., Watters, C. and Shepherd, M. (2006). A Goal-based Classification of Web Information Tasks. In 69th
Annual Meeting of the American Society for Information Science and Technology (ASIST), Austin (US), 3-8
November 2006. Available online: https://core.ac.uk/download/pdf/11880988.pdf
Resnick, M.L. and Vaughan, M.W. (2006). Best practices and future visions for search user interfaces. Journal
of the American Society for Information Science and Technology, 57(6): 781-787.
Rose, D.E. and Levinson, D. (2004). Understanding user goals in web search. In Proceedings of the 13th
international conference on World Wide Web (WWW ’04). ACM, New York, NY, USA, 13-19. Available online:
http://www.ambuehler.ethz.ch/CDstore/www2004/docs/1p13.pdf
Sutcliffe, A.G. and Ennis, M. (1998). Towards a cognitive theory of information retrieval. Interacting with
Computers, 10: 321–351.
Toms, E.G. (2011). Task-based information searching and retrieval. In I. Ruthven & D. Kelly (eds.). Interactive
Information Seeking, Behaviour and Retrieval, Facet Publishing, 43-59.
Wilson, M. (2011). Search User Interface Design. Synthesis Lectures on Information Concepts, Retrieval, and
Services (Ed. Marchionini, G.), Morgan & Claypool.