Fixed a bug in FastWordpieceTokenizer where vocabularies of size >= 7 would fail if the unknown token was not the last entry in the vocabulary; the fix reserves the internal hash map's capacity upfront. #1470