Search Solutions 2021 – 23/24 November

We had hoped to run the Search Solutions 2021 Conference and Tutorials on-site at the BCS London HQ, but Covid-related constraints on the number of delegates that could be accommodated meant that last month we made the decision to go virtual with the Conference (a format which worked well last year) and with one of the tutorials. As you will see, the other, one-day tutorial will be held at the BCS London HQ.

Details of both events follow, and this is the Eventbrite registration link.

The Tutorials – 23 November

Tutorial 1 – Overview of Natural Language Processing

Michael Oakes

BCS, Ground Floor, 25 Copthall Avenue, London, EC2R 7BP

This tutorial will give an overview of Natural Language Processing, which is the computer processing of human-produced speech and text. The textbook “Speech and Language Processing” by Daniel Jurafsky and James H Martin will be used as a basis for the tutorial. The levels we will cover are morphology (shapes of subword units), phonology (pronunciation of subword units), spelling checkers, automatic assignment of grammatical classes to words, relations among words, parsing with context-free grammars, meaning representations, word sense disambiguation, pragmatics (language above the sentence level) and a brief introduction to machine translation.

We expect attendees to gain an overview of the field of Natural Language Processing. Lecture-style presentations will be interspersed with practical exercises where we carry out the actions of the computer with pen and paper.


10:00am: Overview of Natural Language Processing

10:30am: Regular Expressions and Finite State Automata

11:00am: Speech Processing

12:00noon: Dealing with Spelling Errors

12:30pm: Automatic Part-of-Speech Tagging

2:00pm: Syntax: A Context-Free Grammar for English

2:30pm: Semantic Representations

3:00pm: Discourse Analysis

4:00pm: Machine Translation

4:30pm: Questions and Answers

5:00pm: End
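As a small taste of the kind of material the 10:30 session on regular expressions and finite state automata covers, the sketch below shows a regex-based word tokenizer. The pattern and function name are illustrative assumptions of ours, not taken from the tutorial itself.

```python
import re

# A minimal regex-based tokenizer: words (with optional apostrophe
# suffixes such as "isn't"), runs of digits, and single punctuation marks.
TOKEN_PATTERN = re.compile(r"[A-Za-z]+(?:'[a-z]+)?|\d+|[^\w\s]")

def tokenize(text):
    """Split text into word, number, and punctuation tokens."""
    return TOKEN_PATTERN.findall(text)

print(tokenize("Search isn't solved in 2021!"))
# → ['Search', "isn't", 'solved', 'in', '2021', '!']
```

Finite state automata underlie exactly this kind of pattern: each regular expression compiles to an automaton that scans the input one character at a time.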

Tutorial 2 – Practitioners’ Evaluation Roundtable – A virtual tutorial

Ingo Frommholz and Jochen Leidner

Information systems that are deployed in production settings and used operationally by hundreds or thousands of users are typically more complex than systems developed in academic research, which makes them much harder to evaluate. However, not evaluating a system is not a viable option, as it corresponds to “flying blindly” – the positive or negative impact of any change would remain unknown. As a consequence, many practitioners come up with their own protocols for assessing system quality in terms of the relevance of rankings given a query. In the academic world, several initiatives such as TREC, MediaEval or CLEF strive to provide benchmarks and datasets to make different solutions and algorithms comparable for some specified task; Kaggle is a further example. While BCS Search Solutions has in the past been successful in transferring knowledge among practitioners on the one hand, and between academics and practitioners on the other, we think evaluation is a topic that requires more attention. While we think there is no “one size fits all” solution, we also believe that there should be an exchange of ideas, solutions and experiences when it comes to evaluating information and search systems in an enterprise environment.

Instead of a full tutorial, we think the topic of evaluation needs to be driven by the participants. Hence we will conduct a round-table discussion (in lieu of a tutorial) at the upcoming BCS Search Solutions. Our aim is to provide an open forum where practitioners can share methods, metrics, challenges, and tricks of the trade with their peers. After a short introductory presentation that emphasises the importance of IR evaluation and sketches its history to set the scene and align participants, the format is one of free discussion without moderation. A human recorder will take notes, which may be published in a suitable venue (e.g. SIGIR Forum or BCS Informer) if findings emerge that are worth preserving.
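To make the notion of “assessing system quality in terms of the relevance of rankings given a query” concrete, here is a minimal sketch of one widely used metric, precision at k, computed from a ranked result list and a set of relevance judgements. The document identifiers and data are illustrative only; the roundtable itself is not limited to any particular metric.

```python
def precision_at_k(ranking, relevant, k):
    """Fraction of the top-k ranked documents that are judged relevant."""
    top_k = ranking[:k]
    hits = sum(1 for doc in top_k if doc in relevant)
    return hits / k

ranking = ["d3", "d1", "d7", "d2", "d5"]   # system output, best first
relevant = {"d1", "d2", "d9"}              # documents judged relevant
print(precision_at_k(ranking, relevant, 3))  # 1 of the top 3 is relevant
```

Production teams often track metrics like this over a fixed query set so that the impact of a ranking change is measurable rather than a matter of “flying blindly”.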

3.00pm: A brief history and introduction of IR systems evaluation – Ingo Frommholz & Jochen Leidner

3.45pm: Discussion & Lightning talks: Methods, metrics, challenges — how do practitioners evaluate their systems so far? – All participants

4.45pm: Discussion/Breakout Groups: Evaluation in “real-world” environments – all participants

5.30pm: Discussion of results/wrap up – all participants

6.00pm: Closing

The Conference – 24 November

This year the format of the conference is based around paired papers (with a couple of exceptions) on specific themes, so that attendees can get two different perspectives on each theme. There will then be a Q&A session with both speakers.

Incorporated into the agenda will be the presentation of the BCS Search Industry Awards (organised by Tony Russell-Rose), one of which will be the SS 2021 Best Paper award, which (for obvious reasons!) comes right at the end of the conference.

There will be two panel sessions at the end of the conference. The first of these will be a panel of some speakers and session chairs reflecting on what they have heard and learned during the conference. The second will be some invited panelists who will be asked to suggest what the themes for the SS2022 conference should be.

Once the final session is completed the AGM will take place. This will be open to all attendees but voting is of course only open to members of the Information Retrieval Specialist Group.

Inevitably attendees will come in and out of the event during the day, which is why each session starts on the hour – there is no excuse for missing a session that is of particular interest. We hope to make recordings of the presentations available, but that may not be the case for every presentation, so please do not assume that you can miss a presentation and catch up later!

09.00 Formulating and treating information needs at work

Professor Katriina Byström, Department of Archivistics, Library and Information Science, Oslo Metropolitan University

10.00 Training for IR and data science

Professor Paul Clough, Information School, University of Sheffield and Peak Indicators

Olivia Foulds, Department of Computer Science, University of Strathclyde

11.00 Identifying and addressing misinformation

Dr. Andy MacFarlane, City, University of London

Dr. David Corney, NLP Engineer, FullFact

12.00 Searching the enterprise

Steve Sale, Search and Taxonomy Architect, AstraZeneca

John Western, Regional VP, Yext

13.00 Break

14.00 Systematic searching

Drs. Ing Rene Spijker, Academic Medical Centre, University of Amsterdam

BCS Search Industry Awards

15.00 Digital asset management

Tim Gollins, Head of Preservation and Information Management, National Records of Scotland

Theresa Regli, Consultant

16.00 Panel sessions

What have we learned today?

What are the priorities for 2022?

Search Solutions 2021 Best Paper Award


About Martin White

Martin is an information scientist and the author of Making Search Work and Enterprise Search. He has been involved with optimising search applications since the mid-1970s and has worked on search projects in both Europe and North America. Since 2002 he has been a Visiting Professor at the Information School, University of Sheffield and is currently working on developing new approaches to search evaluation.
