From the 1950s up to the 1980s, Natural Language Processing (NLP) systems usually consisted of complex sets of prescriptive rules based on the General Linguistic Theories of the time. Since the late 1980s, NLP systems have mainly been built on statistical and machine-learning principles applied to massive amounts of text data, while detailed grammatical knowledge of particular languages has largely been neglected. Yet after years of research, these mainly statistical NLP techniques still offer disappointing quality.

It is sensible to strive for a new NLP approach based on the integrated achievements of present-day General Linguistics, highly developed Formal Linguistics and sophisticated Statistical Research.

The leading principle is: the meaning of a text is to be calculated according to the rules of grammar.


Utrecht, December 2013

The Calculemus-team:

Drs. W.J.J. Jansen, Senior Linguist
R.J.C. van Haaften, Computer Scientist
A.M.C. de Koning, Project Manager
