move index creation for word table to tokenizer
This introduces a finalization routine for the tokenizer so that it can post-process the import if necessary.
@@ -26,6 +26,10 @@ class DummyTokenizer:
         self.init_state = "loaded"
 
 
+    def finalize_import(self, _):
+        pass
+
+
     def name_analyzer(self):
         return DummyNameAnalyzer(self.analyser_cache)
 
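The diff above only adds the new hook to the DummyTokenizer test fixture as a no-op. To illustrate the idea behind the hook, here is a minimal sketch of how an import driver might call finalize_import() so the tokenizer can post-process its own tables (for example, creating the word-table index only after bulk loading). The class and function names, the index name, and the SQL below are illustrative assumptions, not Nominatim's actual implementation.

# Sketch only: illustrates the finalize_import() hook described in the commit
# message. Names (SketchTokenizer, run_import, idx_word_word_token) are
# hypothetical and not taken from the Nominatim code base.

class SketchTokenizer:
    def __init__(self, conn):
        # conn is assumed to be a DB-API 2.0 connection supplied by the caller.
        self.conn = conn

    def finalize_import(self, config):
        # Post-process the import: build the word-table index once, after the
        # bulk load, instead of maintaining it while rows are being inserted.
        with self.conn.cursor() as cur:
            cur.execute("CREATE INDEX IF NOT EXISTS idx_word_word_token"
                        " ON word (word_token)")
        self.conn.commit()


def run_import(tokenizer, config):
    # ... bulk import of places and search terms would happen here ...
    # When the data is loaded, the driver hands control back to the tokenizer
    # for any tokenizer-specific clean-up.
    tokenizer.finalize_import(config)

Deferring index creation to this hook keeps the decision about which word-table indexes exist (and when to build them) inside the tokenizer, which is the point of moving the index creation out of the generic import code.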