And finally….!

As I undertake my daily scan of the research papers on information retrieval, I continue to be impressed by the level of innovation that I see. Clearly significant advances are being made in many areas. My concern is how these advances are going to feed into search products that make a commensurate advance in the findability experienced by employees in their daily work. To use the metaphor of building a house, there seem to be endless suggestions for the design of kitchens, bedrooms, patios and bathrooms, but no one is considering whether any combination of them makes a house a home, or that people have different visions of what they regard as an acceptable compromise, taking into account their long-term plans and the current (and continuing) high levels of interest.

There are many models that have been developed for the process of search. My favourite model is the Complex Searcher Model developed by Leif Azzopardi and David Maxwell because it is very close to my own experience in developing enterprise search applications. In particular I like the way that the concept of a stopping point is introduced into the model, a factor almost completely overlooked by the IR research community.

Given the plethora of models, it always concerns me that there is never an attempt to 'place' a particular development within an overall search process so that its precursors and impacts can be assessed. I'm thinking in particular about the extent to which a collection has to be re-indexed to take advantage of an innovation, as that is not going to happen without a very clear benefit to offset the inherent risks of the re-index process. There is of course a substantial amount of research into the optimisation of interactive information retrieval. However, because of the lack of both realistic enterprise search test collections and an in-depth understanding of how search is used (and mis-used) inside organisations, there is a huge challenge in the development and delivery of end-to-end solutions that meet user requirements. The scale of the challenge is well illustrated by a recent case study of the enterprise search application at Microsoft. Am I alone in wondering why it seems to have taken Microsoft so long to optimise their internal application?

Why is it that there is only one detailed study that takes an in-depth look at discovery processes inside an enterprise? Just one! Arguably the first enterprise search system was developed by G. Douglas Tallbot in 1966. So that is just one detailed study in the last 56 years. Mmm!

I was delighted to learn of the establishment of the DoSSIER project in 2019 because for probably the first time the entire process of discovery was being examined. It was very unfortunate that the launch of the project coincided with the Covid pandemic but given the commitment and experience of the leadership team good progress is now being made.

From a practitioner perspective it seems to me that the research community fails to understand that the achievement of high levels of search satisfaction is a wicked problem, and that point improvements will have little or no impact. In 1973 Horst Rittel and Melvin Webber authored a paper entitled 'Dilemmas in a General Theory of Planning' (Policy Sciences 4 (1973), 155-169). In this paper they set out the basis for what they regarded as 'wicked problems', which were beyond the capacity of traditional methods to resolve. There is a good introduction to the concept of wicked problems in a 2008 Harvard Business Review article by John Camillus. Rittel and Webber set out ten characteristics. In his book 'Dialogue Mapping: Building Shared Understanding of Wicked Problems' (Wiley 2005) Jeffrey Conklin set out a further six. A couple of years ago I mapped enterprise search against all 16 criteria and ended up with a fairly depressing picture!

It is also worth reading two recent posts from Steve Arnold, the US search guru, on the difficulties of making search work and the challenges of supporting search with a taxonomy.

I am confident that the outcomes of DoSSIER will do much to fill some of the gaps in our understanding of the overall discovery process. I do hope that DoSSIER will be the first of many such projects but I have to say that I am not optimistic.

Martin White

About Martin White

Martin is an information scientist and the author of Making Search Work and Enterprise Search. He has been involved with optimising search applications since the mid-1970s and has worked on search projects in both Europe and North America. Since 2002 he has been a Visiting Professor at the Information School, University of Sheffield and is currently working on developing new approaches to search evaluation.
