move index creation for word table to tokenizer

This introduces a finalization routine for the tokenizer
where it can post-process the import if necessary.
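
For context, the tokenizer-side hook might look roughly like the sketch below. This is an illustrative assumption only: the connection handling, the index name and the exact SQL are hypothetical stand-ins, not the code added by this commit.

    import psycopg2

    class SketchTokenizer:
        # Illustrative tokenizer with a finalization hook; the names used
        # here are hypothetical, not the actual Nominatim implementation.

        def __init__(self, dsn):
            self.dsn = dsn

        def finalize_import(self, config):
            # Post-process the import: build the index on the word table
            # only after all rows have been loaded.
            conn = psycopg2.connect(self.dsn)
            try:
                with conn.cursor() as cur:
                    cur.execute(
                        "CREATE INDEX IF NOT EXISTS idx_word_word_id"
                        " ON word (word_id)")
                conn.commit()
            finally:
                conn.close()

The import routine would then call tokenizer.finalize_import(config) as the last step of the import, so tokenizers that need no post-processing can simply make the hook a no-op, as the DummyTokenizer in the test fixture below does.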
Sarah Hoffmann
2021-04-30 17:28:34 +02:00
parent 20891abe1c
commit 388ebcbae2
6 changed files with 31 additions and 11 deletions

@@ -26,6 +26,10 @@ class DummyTokenizer:
         self.init_state = "loaded"
+
+    def finalize_import(self, _):
+        pass
+
     def name_analyzer(self):
         return DummyNameAnalyzer(self.analyser_cache)