Natural Language Processing

Excerpt: After two years of hard work I can finally and proudly present the core of my PhD thesis. It started with Till Speicher and Paul Georg Wagner implementing one of my ideas for next word prediction as an award-winning project for the Young Scientists competition, followed by several iterations over this [...]

Continue reading: How Generalized Language Models Outperform Modified Kneser-Ney Smoothing by a Perplexity Drop of up to 25%
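As context for the claim above: perplexity is the standard intrinsic metric for comparing language models, defined as the exponential of the average negative log-likelihood a model assigns to held-out text (lower is better). The sketch below is a minimal illustration of how such a comparison is computed; the per-token probabilities are entirely hypothetical and not taken from the thesis.

```python
import math

def perplexity(probs):
    """Perplexity given a model's per-token probabilities on held-out
    text: exp of the average negative log-likelihood."""
    return math.exp(-sum(math.log(p) for p in probs) / len(probs))

# Hypothetical per-token probabilities from two models on the same text.
kn_probs = [0.10, 0.05, 0.20, 0.08]   # baseline (e.g. Modified Kneser-Ney)
glm_probs = [0.15, 0.08, 0.25, 0.12]  # hypothetical improved model

ppl_kn = perplexity(kn_probs)
ppl_glm = perplexity(glm_probs)

# Relative perplexity drop, the quantity a "drop of up to 25%" refers to.
relative_drop = (ppl_kn - ppl_glm) / ppl_kn
```

A "perplexity drop of up to 25%" therefore means the improved model's perplexity is as much as a quarter lower than the baseline's on the same evaluation text.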

