I think Chomsky got it wrong with his decomposition of language structure. His idea is, roughly, that every sentence is a graph with subgraphs which, when arranged the right way (after certain eliminations and simplifications are made), reveal a "deep" structure corresponding to some universal form of language.

I would argue instead that language is an algebraic decomposition of structure. All nouns are instantive, in the sense that a noun "contains" a thing, or represents data in the language program. Verbs "act" on nouns, or, treated as data themselves, are dereferenced nouns (gerundives, supines). Naming also qualifies a thing: we alter nouns (inflect them) to give them properties, and adjectives are transient, temporary "noun values" within a sentence. In that sense the data structures are either "classical" data or class-and-object data; either model implies that the data is instantiated at some point.

A noun has capacity: the phonemes in its structure represent a constant volume of some thing ("ball", "cat"). Verbs are a "volume flow". We can construct "deep" sentences simply by leaving out the pronouns, adjectives and other adjuncts to see the "data", e.g. "cat meow(s)", "ball bounce(s)"; note that the order is irrelevant. There is then a straightforward "physical" representation of all the flow in a linguistic structure, as a volume form: nouns "contain" verbals, verbs reference nouns and dereference nominatives.

That languages are mathematical should be obvious; that an obvious (though not simple; English, for instance, is heavily overloaded) algebraic representation exists should be too. So where did Chomsky leave the track?
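The data-and-dereference analogy above can be sketched in code. The following is a minimal, hypothetical illustration of my own devising (the names `Noun`, `verb`, `deep_structure` are assumptions, not a formal linguistic model): nouns are instantiated data with a fixed capacity, adjectives are transient properties attached for the span of a sentence, verbs act on (dereference) nouns, and stripping a sentence down to its noun-verb pairs exposes the "deep" data.

```python
class Noun:
    """A noun 'contains' a thing: instantiated data with a fixed capacity."""

    def __init__(self, name):
        self.name = name
        self.properties = {}  # adjectives: transient 'noun values'

    def inflect(self, adjective, value=True):
        # Inflection gives the noun a temporary property within a sentence.
        self.properties[adjective] = value
        return self


def verb(action, noun):
    """A verb 'acts on' (dereferences) a noun, producing flow."""
    return f"{noun.name} {action}s"


def deep_structure(words, nouns, verbs):
    """Drop adjectives and other adjuncts, keeping only the noun-verb 'data'."""
    return [w for w in words if w in nouns or w in verbs]


cat = Noun("cat").inflect("black")
print(verb("meow", cat))  # → cat meows
print(deep_structure(["the", "black", "cat", "meow"], {"cat"}, {"meow"}))
# → ['cat', 'meow']
```

On this toy model, "deep" structure is just a filter over the surface sentence, which is why word order is irrelevant: the surviving noun-verb set is the same however the adjuncts were arranged around it.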