tokenizer

Noun

 * 1)  A system that splits an input stream into its component tokens (lexemes), typically as the first stage of parsing.
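
The definition above can be sketched in code. This is a minimal illustrative tokenizer, not a canonical implementation; the token names and the regex-based approach are assumptions chosen for brevity.

```python
import re

# Illustrative token set (hypothetical names): each pair is (kind, pattern).
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),          # integer literals
    ("IDENT",  r"[A-Za-z_]\w*"), # identifiers
    ("OP",     r"[+\-*/=]"),     # single-character operators
    ("SKIP",   r"\s+"),          # whitespace, discarded below
]
TOKEN_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(text):
    """Yield (kind, lexeme) pairs for each token in the input stream."""
    for match in TOKEN_RE.finditer(text):
        kind = match.lastgroup
        if kind != "SKIP":  # whitespace separates tokens but is not one
            yield kind, match.group()
```

For example, `list(tokenize("x = 40 + 2"))` produces the token stream `[("IDENT", "x"), ("OP", "="), ("NUMBER", "40"), ("OP", "+"), ("NUMBER", "2")]`, which a parser could then consume.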