Solr – Enterprise Search Engine

Which tokenizer splits the text field into tokens, treating whitespace and punctuation as delimiters?

a) Lower Case Tokenizer

b) Standard Tokenizer

c) Classic Tokenizer

d) ICU Tokenizer

1 Answer


b) Standard Tokenizer

The Standard Tokenizer splits the text field into tokens at whitespace and punctuation; the delimiter characters themselves are discarded and do not appear in the resulting tokens.
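As a sketch, the Standard Tokenizer is declared in a Solr schema via `solr.StandardTokenizerFactory` inside an analyzer chain (the field type name `text_standard` here is illustrative, not a Solr built-in):

```xml
<!-- Field type whose analyzer uses the Standard Tokenizer: text is
     split at whitespace and punctuation, and the delimiters are discarded -->
<fieldType name="text_standard" class="solr.TextField" positionIncrementGap="100">
  <analyzer>
    <tokenizer class="solr.StandardTokenizerFactory"/>
  </analyzer>
</fieldType>
```

Note that the Standard Tokenizer follows Unicode word-boundary rules, so some punctuation is context-sensitive; for example, a period between letters with no surrounding whitespace is not treated as a token boundary.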
