in Solr – Enterprise Search Engine

Which tokenizer splits the text field into tokens, treating whitespace and punctuation as delimiters?

a) Lower Case Tokenizer

b) Standard Tokenizer

c) Classic Tokenizer

d) ICU Tokenizer

1 Answer


b) Standard Tokenizer

The Standard Tokenizer splits the text field into tokens, treating whitespace and punctuation as delimiters; the delimiter characters themselves are discarded.
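In a Solr schema, the Standard Tokenizer is configured via `solr.StandardTokenizerFactory` inside an analyzer chain. A minimal sketch (the field type name `text_standard` is an illustrative choice, not from the question):

```xml
<!-- Minimal fieldType using the Standard Tokenizer.
     Whitespace and punctuation act as delimiters and are discarded. -->
<fieldType name="text_standard" class="solr.TextField">
  <analyzer>
    <tokenizer class="solr.StandardTokenizerFactory"/>
  </analyzer>
</fieldType>
```

With this analyzer, input such as `Please, email john.doe@foo.com by 03-09.` would be broken into tokens like `Please`, `email`, `john.doe`, `foo.com`, `by`, `03`, `09`.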
