Formal Languages, Machine Understanding, Automata

Discussion in 'Intelligence & Machines' started by Rick, Jul 26, 2002.

Thread Status:
Not open for further replies.
  1. Rick Valued Senior Member

    I am glad to be back again.
    Well, this is a kind of lecture, so anyone who is an engineering or computer science student might find the material useful.
    Feel free to ask questions.

    <b><i>Note: this is merely an introduction of sorts for people who don't really have any background in what I am actually going to talk about; I don't want it to sound like a foreign language to them.</i></b>

    Justification for posting the thread here:
    I found that this place would be appropriate for the posting, because enhancements in formal languages could lead to a greatly evolved interaction with machines. This would in turn move us toward true AI, as some people here would say.
    The history of computer theory was formed by fortunate coincidences, involving seemingly unrelated branches of intellectual endeavor.

    When set theory was invented, there were many paradoxes. Some of Georg Cantor's findings could be tolerated, like the fact that infinity comes in different sizes, but some contradictions could not, like
    <i>the notion that some set is bigger than the universal set</i>.
    This left a big cloud over mathematics that needed to be resolved.

    In the year 1900, David Hilbert was invited to address an international congress and predict what problems would be important in the coming century. It turned out, as most critics would call a mere coincidence, that he was almost correct everywhere.

    First of all, he wanted the confusion in set theory resolved. Hilbert thought that axioms, as provided by Euclid, and a set of rules of inference could be developed to avoid paradoxes.

    Second, Hilbert was not merely satisfied that every provable result should be true; he also presumed that every true result was provable, and he wanted mathematicians to find ways to do so.

    It wasn't easy for mathematicians to follow Hilbert's plan. Mathematicians are usually in the business of creating proofs themselves, not proof-generating techniques. What had to be invented was a whole field of mathematics that dealt with algorithms, or programs.
    In other words (in today's terms), we might rephrase Hilbert's request as a demand for computer programs to solve mathematical problems.

    The road to studying algorithms wasn't a smooth one. The first bump occurred in 1931, when Kurt Gödel proved that there was no algorithm to provide proofs for all true statements. In fact, what he showed was even worse: either there were some true statements in mathematics that had no proofs, in which case there were certainly no algorithms that could provide such proofs, or else there were some false statements that did have proofs of their correctness, in which case any such algorithm would be disastrous.
    Mathematicians then had to retreat to the question of which statements do have proofs and how we can generate them...
    Alonzo Church (of Church's thesis fame), Stephen Kleene, Emil Post (of PCP fame), Andrei Markov (whose patronymic I find difficult to pronounce), John von Neumann and Alan Turing worked independently and came up with an extraordinarily simple set of building blocks that seemed to be the atoms from which all mathematical algorithms can be composed.
    They each fashioned various (but similar in many respects) versions of a universal model for all algorithms, what we could call a universal algorithm machine. Turing went one step further: he proved that there were mathematically definable fundamental questions about the machine itself that the machine could not answer.

    Turing's theoretical model of an algorithm machine, employing a simple set of mathematical structures, held out the possibility that a physical model of Turing's idea could actually be constructed. If some human could figure out an algorithm to solve a particular class of problems, then the machine could be told to follow that sequence of steps to solve the problems.

    The electronic discoveries coincidentally supported everything that was going on theoretically. Thus,
    <b>what actually started out as a mathematical theorem about mathematical theorems became the single most practically applied invention since the wheel and axle. Not only was this an ironic twist of fate, but the whole thing happened in a span of just <i>10 years</i>.</b>
    It is natural to assume that:
    the two important branches of psychology and neurology play an important role in the whole scenario of formal language acceptance and the machines designed for various purposes. Languages play an important role in AI in the sense that unless a formal language is designed with its ambiguities removed and perfected, the days of true AI will remain asymptotic, as our mathematical elite would say.

    This raises an important question, since the whole branch of AI development lies in this question itself.

    What should be the language of an AI machine? Or, for simplicity, let us take a common machine: what should its language be?
    This makes us go back to our roots.

    (This is getting bigger; I'll post another reply to the same thread.)

    ....more to follow...

  3. Rick Valued Senior Member

    Basic Theory of Languages

    Before creating a highly evolved machine that could think, our first consideration would be: language.

    What should be the language of a machine? To answer that question, let us go back to basics.
    In English we have three different entities: letters, words and sentences.
    Similarly, we begin with a single finite set of fundamental units out of which we build structure. Let us call this the <b>Alphabet</b>.
    A certain specified set of strings of characters from the alphabet will be called a <b>Language</b>.
    The strings that are permissible in the language are called <b>Words</b>.
    The set should also contain the empty string, LAMBDA.
    For every language there exists a grammar; conversely, for every grammar there is a language generated.
    What is this grammar?
    A grammar is a set of the following things:
    1.) A starting symbol, usually called 'S'.
    2.) A set of productions, called P, which generate the strings that are part of the language.
    3.) Terminal symbols, the symbols which end the productions when deriving a string of the language.
    4.) Variables, which aid in the productions.

    Every string is essentially derived by the grammar, which is a 4-tuple (V -> set of variables, T -> set of terminal symbols, P -> set of productions, S -> starting symbol).
    Essentially the above concept is used by most machines to accept a language; conceptually, therefore, most machines are the same. They employ various methods to understand the language.
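    The 4-tuple can be sketched in code. A minimal Python sketch with a made-up toy grammar (the sets V, T, P and the derive helper are my own illustration, not from the post); it generates the language {a^n b : n >= 1}:

```python
# Toy grammar G = (V, T, P, S) -- a hypothetical example for illustration.
V = {"S", "A"}            # variables
T = {"a", "b"}            # terminal symbols
P = {                     # productions: left side -> possible right sides
    "S": ["aA"],
    "A": ["aA", "b"],
}
S = "S"                   # starting symbol

def derive(string, chosen_rhs):
    """Rewrite the leftmost variable with each chosen right-hand side, one step at a time."""
    for rhs in chosen_rhs:
        for symbol in string:
            if symbol in V:                      # find the leftmost variable
                string = string.replace(symbol, rhs, 1)
                break
    return string

# Derivation: S => aA => aaA => aab
word = derive(S, ["aA", "aA", "b"])
print(word)   # every symbol of the result is in T, so it is a word
```

    Every word of the language comes out of such a sequence of production applications starting from S.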

    During the development of the various machines which could understand various languages, there was confusion all around, primarily because some machines were able to accept some languages and some the others. To remove this confusion, Noam Chomsky gave a classification of languages:

    1.) Regular Languages:
    These are the languages with type 3 productions, meaning the production set of the grammar is such that the language is accepted by a finite state automaton. We'll discuss these in detail later on.

    2.) Context-Free Languages:
    These are languages whose productions have no left context or right context.

    A left context in the following example is the 'A' adjacent to S:
    AS -----> BB
    Note that here A and B are variables, not terminal symbols. Languages of this kind are usually accepted by a Push Down Automaton, which we will discuss later on in the thread (that is, if I complete this post; my fingers are already pissing me off...).

    3.) Context-Sensitive Languages:
    These languages, usually accepted by an LBA (Linear Bounded Automaton), have a grammar whose production set is context-sensitive, meaning the productions have a left context or a right context, as in the previous example.

    4.) Phrase-Structured Languages:
    These languages have grammars consisting of virtually any kind of production. Such languages are accepted by a Turing machine; we will discuss some aspects of this later on. One can say that the Turing machine is essentially the peak of evolution in language acceptors.
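    To make the bottom and middle of the hierarchy concrete, here is a minimal Python sketch (the particular languages, state names and transition table are my own assumed examples, not from the post). The first function simulates a finite state automaton for a regular language (strings over {a, b} ending in "ab"); the second uses a stack, pushdown-automaton style, for the context-free language {a^n b^n : n >= 1}, which no finite automaton can accept:

```python
# Finite state automaton (type 3): accepts strings over {a, b} ending in "ab".
DELTA = {                           # transition table, (state, symbol) -> state
    ("q0", "a"): "q1", ("q0", "b"): "q0",
    ("q1", "a"): "q1", ("q1", "b"): "q2",
    ("q2", "a"): "q1", ("q2", "b"): "q0",
}
START, ACCEPT = "q0", {"q2"}

def fsa_accepts(word):
    state = START
    for ch in word:
        state = DELTA[(state, ch)]  # finite memory: only the current state
    return state in ACCEPT

# Pushdown-style check (type 2): accepts a^n b^n, n >= 1, using a stack.
def pda_accepts(word):
    stack, i = [], 0
    while i < len(word) and word[i] == "a":   # push a marker for each 'a'
        stack.append("a"); i += 1
    while i < len(word) and word[i] == "b":   # pop one marker for each 'b'
        if not stack:
            return False
        stack.pop(); i += 1
    return i == len(word) and not stack and len(word) > 0

print(fsa_accepts("aab"), pda_accepts("aabb"))   # True True
print(fsa_accepts("aba"), pda_accepts("aab"))    # False False
```

    The stack is exactly what the finite automaton lacks: it must count the a's to match them against the b's, and a finite set of states cannot count arbitrarily high.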

    Essentially, strings of a language are derived by a technique called parsing, by building a parse tree. This forms an integral part of compiler design, and also of machine interaction, coding, etc.
    Parsing techniques are essentially of two types:
    1.) Top-down parsing.
    2.) Bottom-up parsing.
    Top-down parsing builds a tree in which the starting symbol 'S' is the root and the rest of the productions are children; this essentially generates the string in the end, and the string's validity is (or can be) then checked.
    If there is more than one derivation tree for the same string, then there is an ambiguity in the logic of our grammar's productions, and our grammar has failed a test that the machines accepting it pose.
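    Top-down parsing can be sketched with a tiny recursive-descent parser. A minimal Python sketch under an assumed toy grammar S -> a S b | a b (my own example, not from the post); each call tries the productions for S in order, starting from the root:

```python
# Recursive-descent (top-down) parser for the toy grammar:
#   S -> a S b | a b        (generates {a^n b^n : n >= 1})
def parse_S(s, i=0):
    """Try to match one S starting at position i; return the position after it, or None."""
    if i < len(s) and s[i] == "a":
        # First try the production S -> a S b ...
        j = parse_S(s, i + 1)
        if j is not None and j < len(s) and s[j] == "b":
            return j + 1
        # ... then fall back to S -> a b.
        if i + 1 < len(s) and s[i + 1] == "b":
            return i + 2
    return None                     # no production matched here

def valid(s):
    return parse_S(s) == len(s)     # the whole string must be one S

print(valid("aabb"), valid("ab"))   # True True
print(valid("aab"))                 # False
```

    Because each word here has only one derivation tree, this grammar is unambiguous; if two different trees produced the same word, a deterministic parser like this could not be trusted to find "the" derivation.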

    (more to follow...)

  5. Rick Valued Senior Member

    That was a trailer, or background; I hope it was digestible. If it wasn't, I can explain the whole thing again. First, I'd love to get feedback on the above; then I'll post the detailed part later.


  7. marin139 Registered Member

    Very interesting indeed.
  8. allant Version 1.0 Registered Senior Member

    Re: Basic Theory of Languages

    Language as a starting point sounds obvious, but is there any proof? The current thinking is that this is actually wrong. However, this may take us off topic, so yell if you want more...
  9. Rick Valued Senior Member

    Proof for what?

    I don't get you; please clarify.

  10. c1earwater Registered Member

    What I think Allant means is: do you have any proof that the starting point for an AI should be language? Why not start with, for example, symbol manipulation and let the language evolve? (Of course, then you'd have to learn the AI's language, which might be tricky :bugeye: ).

    But as he said, this might be a bit off topic.
  11. Rick Valued Senior Member

    I wouldn't disagree with you that this is slightly off topic; perhaps that's why I hesitated (for the first time ever here, I think) over my posting. But I couldn't find a better place for it, because the computer science forums are not suitable for this post, since it lies somewhere in between.

    I think languages are a concern for AI think tanks, partly because this <b>level</b> of interaction has never occurred between computers and humans. The day is not far when we have direct brain-to-CPU communication, but until then we have to improve upon this technology to enhance the communication skills of the intelligent box that has revolutionized our lives, isn't it?

    Thanks for your time.
  12. c1earwater Registered Member

    I think your original post is in the right place, but the question about language being the starting point for AI is a bit off the topic of your post.

    Please continue.
  13. allant Version 1.0 Registered Senior Member

    What you said above. I have started a separate thread on which came first - language or symbols.