Natural language processing (NLP) is the field concerned with teaching computers to understand sentences and make some kind of coherent sense of them. Yesterday, I read a lab paper from Stanford on probabilistic context-free grammars (PCFGs) and thought I’d share my takeaways from that piece of work with you.
- A PCFG is defined over a set of terminal and non-terminal symbols.
- Sentences are composed and decomposed by applying a set of production rules defined over the symbols in step 1, with each rule carrying a probability.
- A sentence s is valid under a PCFG only if it can be derived from the given rules, i.e. some parse tree for s can be built from the grammar.
- The main point of the article was to show that unlexicalized PCFGs are still competitive for analyzing sentences, and that simply increasing the resolution of the parse tree can yield deeper knowledge of a sentence than previously thought. Lexicalized PCFGs are not always needed, and they can cost more in both computation and space.
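To make the ideas above concrete, here is a minimal sketch of Viterbi CKY parsing with a toy unlexicalized PCFG. The grammar, rules, and example sentence are my own illustrative assumptions (not from the paper): binary rules are in Chomsky normal form, and the best parse probability for a span is the maximum over rule applications of the rule probability times the probabilities of the two sub-spans.

```python
from collections import defaultdict

# Toy PCFG in Chomsky normal form (an assumption for illustration).
# Probabilities for each left-hand-side symbol sum to 1.
binary_rules = {          # (B, C) -> list of (A, prob) for rules A -> B C
    ("NP", "VP"): [("S", 1.0)],
    ("Det", "N"): [("NP", 0.6)],
    ("V", "NP"): [("VP", 1.0)],
}
lexical_rules = {         # word -> list of (A, prob) for rules A -> word
    "I": [("NP", 0.4)],
    "saw": [("V", 1.0)],
    "the": [("Det", 1.0)],
    "dog": [("N", 1.0)],
}

def viterbi_cky(words):
    """Return a chart mapping (start, end, symbol) to the best parse probability."""
    n = len(words)
    chart = defaultdict(float)
    # Seed the chart with lexical rules for each word.
    for i, w in enumerate(words):
        for sym, p in lexical_rules.get(w, []):
            chart[(i, i + 1, sym)] = p
    # Combine adjacent spans bottom-up, keeping the most probable derivation.
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for (b, c), parents in binary_rules.items():
                    pb, pc = chart[(i, k, b)], chart[(k, j, c)]
                    if pb > 0 and pc > 0:
                        for a, p in parents:
                            cand = p * pb * pc
                            if cand > chart[(i, j, a)]:
                                chart[(i, j, a)] = cand
    return chart

words = ["I", "saw", "the", "dog"]
chart = viterbi_cky(words)
# Probability of the best S parse over the whole sentence (0.4 * 0.6 here).
print(chart[(0, len(words), "S")])
```

The sentence is valid under this grammar precisely because the chart ends up with a nonzero probability for S over the full span, which is the tree-existence condition from the list above.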
That’s all for now. More NLP articles to come. Today’s research goal is to look into the customer service and retail industries. I’ll post my findings here, hopefully by later tonight.