Is it possible to functionally transfer knowledge from one neural network to another?

Discussion in 'Intelligence & Machines' started by Buckaroo Banzai, Jan 3, 2018.

  1. Write4U Valued Senior Member

    Messages:
    7,610
    An AI may be capable of "deep" thought processes, but it will not be truly sentient until we can find an algorithm that makes an AI want to stay alive, the same fundamental urge that all biological organisms seem to possess and that seems to drive their evolution.
    This sentient AI business presents a real human dilemma.
    Remember HAL from "2001: A Space Odyssey"?
    https://en.wikipedia.org/wiki/2001:_A_Space_Odyssey_(film)#Plot
    https://en.wikipedia.org/wiki/2001:_A_Space_Odyssey_(film)
     
    Last edited: Jan 19, 2018
  3. Michael 345 Looking for Bali in Nov Valued Senior Member

    Messages:
    6,198
    Agree best guess and not always right

    How about discussion between two AIs as to whether Giacomo Antonio Domenico Michele Secondo Maria Puccini is better than Giuseppe Fortunino Francesco Verdi

    How would that be best guessed?


     
  5. Write4U Valued Senior Member

    Messages:
    7,610
    Differently from humans, no doubt, but fundamentally it is still the processing of available information and responding appropriately in a programmed, benign fashion. And it might have to make value choices, especially in patterns of mathematically based arts or geometric forms.

    Have you ever taken a deep zoom into a fractal, until a hydrogen atom is represented as a cloud?
    I had it but lost it in a transfer... it had a running counter of iterations, and it was just astounding to behold what a simple program can produce over long periods of time. The trick is to start at the smallest possible particle and build the fractal from the ground (Planck scale) up.

    Example: I Robot
    https://en.wikipedia.org/wiki/I,_Robot_(film)#Plot

    And this more serious inquiry: https://itstillworks.com/morph-photos-8341726.html

    It is dangerous to introduce an AI which would compete with humans. Actually, this has already happened. How many jobs are performed by robots? The world is already completely integrated with specialist computers, which are creating a whole new tool that can replace human labor and, with proper maintenance, may work a hundred years to perfect its own experience of life and its approximations of human behavior.
     
    Last edited: Jan 19, 2018
  7. someguy1 Registered Senior Member

    Messages:
    686
    His premise is wrong.

    You clearly have no idea what an algorithm is. And if you got your ideas from this vid, either you misunderstood the vid or Anil Seth doesn't know what an algorithm is.


    Then whatever it is, it's not an algorithm as understood in computer science. Like I say, if you would just call it a "foozle" we'd have no disagreement. You can't call it an algorithm for the same reason you can't call it a banana or a brick. Those words already have well-understood meanings that conflict with how you are using the term algorithm.

    I certainly agree with that.

    So in the end you are agreeing with me that computers executing algorithms are not sentient. And since the brain is sentient, the brain is not a computer.

    You are completely agreeing with me.
     
  8. someguy1 Registered Senior Member

    Messages:
    686
    Great example. Anyone who ever caught a ball knows that it's NOT a step-by-step computation of trajectories. That's how MACHINES are programmed to catch balls. Humans seem to have a real-time sense of where the ball is and where it's going. It's an analog system of continuous feedback that is NOT an algorithm. This is plain to anyone who has ever played catch or grabbed a falling plate in the air before it falls to the ground.

    Surely. We are in agreement. Our brains do not operate the way digital computers do.

    Our body and brain have an amazing system of maintaining a constant temperature. I assume you know that.
     
  9. someguy1 Registered Senior Member

    Messages:
    686
    Tegmark is already on record claiming not only that everything can be explained by math, but that the universe literally IS a mathematical structure. That's a provocative thesis that falls into the category of "Interesting even though almost certainly wrong."

    I did happen to read a Tegmark article the other day about his ideas on substrate independence. His argument was VALID but not SOUND. That is, if you grant his premises, the logic is correct. But his main premise was wrong. And that premise was that intelligence is an algorithm. Once we accept that, all kinds of other stuff follows. But intelligence is NOT an algorithm. Tegmark's wrong. Tegmark is in fact wrong about a lot of things. He has ideas but no experimental verification of those ideas. When he speculates without evidence, he's not doing science.

    Yes, but the speed of computation makes no difference to what an algorithm can compute. The Euclidean algorithm computes the GCD of two integers whether I run it on a supercomputer or do it by hand with pencil and paper. Any quality or attribute of a computation that depends on execution speed is NOT COMPUTATIONAL by definition. Because computations are substrate independent. They don't depend on the hardware. Of course the supercomputer can find the GCD of large numbers faster than I can with my pencil and paper. But in principle, given enough time, I'd find the same answer as the supercomputer.

    Many arguments for machine sentience make this error. Running a computation really really fast makes absolutely no difference to what the computation can do.
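    The substrate-independence point can be sketched with the Euclidean algorithm itself. A minimal Python version (the function name is mine, for illustration):

```python
def euclid_gcd(a, b):
    # Euclid's method: repeatedly replace the pair (a, b)
    # with (b, a mod b) until the remainder is zero.
    while b != 0:
        a, b = b, a % b
    return a

# The same answer results whether this runs on a supercomputer,
# a laptop, or is traced by hand with pencil and paper.
print(euclid_gcd(8, 14))   # prints 2
```

    The point being that nothing about the procedure refers to the hardware executing it.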
     
  10. iceaura Valued Senior Member

    Messages:
    26,897
    But so would an exact and perfect duplicate. The brain itself "fails" to be predictable over enough time in the real world of input and reaction.
    That can't be your criterion, or you have ruled out functional transfer by definition - which is empty.
    - - - -
    The first attempts failed - in some areas. The latest round is not failing in as many areas - and it is showing signs of being particularly valuable in exactly the areas least amenable to checklist yes/no stuff, the judgment call and grey area assessments.
    All the auxiliary stuff - communication processing, sensory input, error housekeeping, memory stashes of one kind and another, etc. We aren't talking "beyond", necessarily, depending on what that implies.
    Why must functional duplication depend on the brain running an algorithm?
    That would not be easy to do if you were trying to transfer the playing ability of the latest round of Go playing neural networks - the "algorithm" is built into the hardware status at any given moment, and changes with every game. It's not a separate set of instructions. You would want to take another approach.
    - - - -
    Same as always, for example with any CPU clone.
    There exist analog computers. Are you claiming they cannot be cloned?
    Pay closer attention to the actual mechanics of this - what you are transferring is charge distributions, and they vary in a range with certain probabilities. If you have skimped on your error correction, your expectation of errorfree transfer and reliable operation will be disappointed quickly. If you have been careful, it will take longer is all.
    Nobody is claiming complexity makes no qualitative differences. The opposite. The point was that a duplicate or functional transfer would include transfer of that ability. So the "chaos" factor has - apparently - been covered in advance.
    The status of the brain's cogitation machinery at some moment in time - including direction/temporal change/acceleration information.

    nb: I am usually on the "other side" of this discussion.
     
  11. iceaura Valued Senior Member

    Messages:
    26,897
    Neural nets can learn how to catch a ball - one result of training them in is a set of weighted nodes and connection strengths.
    The human ability to project a trajectory - to create a continuity from its perceptions - is not based on continuity in the perceptions themselves. And humans have to learn how to do that - they train in, much as a neural net trains in.
     
  12. someguy1 Registered Senior Member

    Messages:
    686
    "... much as ..."

    That covers a lot of vagueness, doesn't it? If you think the brain is run by algorithms I disagree with you. If tomorrow morning some brilliant neuroscientist publishes the brain's own ball-catching subroutine, I'll apologize for being wrong. Till then, you need to supply some evidence if you claim that the brain has a ball-catching algorithm. And formal neural nets ARE algorithms, there's no difference except for the organizational scheme.
     
  13. iceaura Valued Senior Member

    Messages:
    26,897
    It refers to the striking similarity between the process of training in a ball-catching net and the process of learning to catch a ball.
    I don't care whether the brain is "run by" algorithms. I think that's irrelevant.
    Which is the major visible aspect of a functioning brain - the extraordinarily complex organizational scheme.

    And one clue to how it runs - and so the basics of any cloning or transfer efforts - is to note that this claim misleads: "Humans seem to have a real-time sense of where the ball is and where it's going. It's an analog system of continuous feedback".
    That "seeming" is a kind of illusion created by the brain - and all the different well-functioning human brains do it - from "feedback" that is (apparently, when investigated) not continuous, or not continuously registered.
     
  14. someguy1 Registered Senior Member

    Messages:
    686
    I'm agnostic on whether functional transfer is possible. I truly have no idea.

    I am opposing the claim that the mind is an algorithm in the brain. It so happens that many of the functional transference arguments are based on the idea that the mind is an algorithm. That's Tegmark's argument for example. By opposing the idea that the mind is an algorithm, I'm opposing many of the arguments for functional transference.

    But you are right, I have not presented an argument against functional transference in general, nor do I have such an argument.

    Ok, if you say expert systems are making a comeback I believe you. That's not a key point. Some dictionary definition used medical diagnosis as an example of an algorithm, and it reminded me of the expert systems that were proposed in the 1980's to do medical diagnosis. That's as far as my remark went.

    Well if you claim those are algorithms, we can debate that. If you agree they're not, you've conceded my point. Memory stashes? You are making things up. They can't slice open your brain and find the memory of the time you went to the store last week. Memory doesn't work that way. COMPUTER memory works that way. It's NOT THE SAME THING.

    You're right. My arguments are against the idea that the mind is an algorithm. I take no position on functional duplication. But do please note that many of the arguments for functional duplication depend on the brain being an algorithm. So I'm arguing against those arguments. That's all.

    A neural net is an algorithm. It's a conventional program running on conventional hardware. A node is a memory location. Assigning a weight to a node is assigning a numeric value to a memory location. The algorithm is fixed and never changes. Of course algorithms can do very clever things like playing chess and Go and driving cars. Weak AI is impressive lately. Doesn't prove anything about brains.
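    That claim - the procedure is fixed, and training only changes stored numbers - can be sketched in a few lines of Python (the weights here are made-up illustrative values, not from any real net):

```python
import math

# The "forward pass" procedure below never changes; training a net
# only overwrites the numeric values stored in `weights` and `bias`.
weights = [0.5, -0.3, 0.8]   # each entry: a numeric value in a memory location
bias = 0.1

def forward(inputs):
    # Weighted sum of the inputs, squashed by a sigmoid activation.
    total = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-total))

out = forward([1.0, 2.0, 3.0])   # a number strictly between 0 and 1
```

    Whether this fixed procedure plus adjustable numbers captures what brains do is exactly what is in dispute in this thread.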

    Well analog computers work by completely different principles than digital ones. I don't know enough about them to know if they can be cloned.

    All hardware is fallible. Hardware presents an abstraction layer to the software in which the software can assume there are such things as memory locations and the ability to flip bits. Not sure what that means in terms of proving brains are algorithms, if that's your goal.

    I might be a little lost in the chain of quoting. I brought up chaos to point out that algorithms can't even solve the question of the stability of the solar system under deterministic Newtonian gravity. In any event I'm not arguing against functional transfer, only against the proposition that the mind is an algorithm executing in the brain.

    If the brain is not executing an algorithm, how exactly do you transfer all its goo? Molecule by molecule? Atom by atom? Quark by quark? Duplicating analog data is incredibly difficult.

    I'm totally confused by now as to what's being argued. I claim the mind is not an algorithm executing in the hardware of the brain. That's all I'm arguing.
     
    Last edited: Jan 20, 2018
  15. someguy1 Registered Senior Member

    Messages:
    686
    Neural nets train, brains train, therefore brains are neural nets? Bad logic, I'm sure you can see that. Planes fly, birds fly, therefore birds fly by the same mechanism as planes. Bad logic.
     
  16. someguy1 Registered Senior Member

    Messages:
    686
    And planes and birds are strikingly similar in their ability to fly through the air. But the mechanisms are very different. Just as the mechanisms of artificial intelligence are very different from the mechanisms of human intelligence.

    It's the only point I'm arguing here. And Write4U is claiming the mind's an algorithm, and keeps posting popularized and inaccurate definitions of algorithm as if they were evidence. You may be reading my replies to Write4U and thinking I'm making a more general argument than I am. I'm only saying the mind's not the result of an algorithm. It could not be because algorithms are only syntactic and humans understand semantics.

    Brains are complex, chess playing computers are complex, therefore the brain is a chess playing computer? Please stop that!! You must be trying to make a more subtle point, I may be missing your meaning.

    Well it's certainly not a digital algorithm. Are you claiming it is? What are you claiming?
     
  17. Write4U Valued Senior Member

    Messages:
    7,610
    With "syntactic", do you mean symbolic? We can and do observe values, patterns, and their potentials.

    My argument was based on the following:
    https://en.wikipedia.org/wiki/Algorithm

    Your assertion that the brain does not use algorithms because it is sentient is based on what? We know that computers (which are not sentient) must use algorithms, in order to execute the processing of information. Programmers write these algorithms, which at the least indicates an understanding of these mathematical functions.
    If they work in non-sentient computers (pseudo intelligence), what rule prevents the sentient (intelligent) human brain from using these effective mathematical functions also?

    After all is said and done and all semantics aside, the human brain is a biological computer, and its function is at the same scale as a computer's, i.e. at the nano scale. And in the end it is just processing values, just like a non-sentient computer.

    But, unlike computers, the brain also functions at the bio-chemical level, which I suspect is responsible for "emotions" such as pain, pleasure, desire (i.e. opioid addiction). Ever seen an addicted computer?
    That is the bio-chemical difference. And perhaps this phenomenon is also produced by a form of chemical algorithm.

    I am not saying that there is no difference between a brain and a computer. I am saying that fundamentally the calculating brain processes (best guesses) can be compared to what we call "algorithms" in computers.

    I am familiar with the term "foozle" and it has nothing to do with a bungling brain function, but I do understand the concept of a variable (fuzzy) algorithm, depending on subtle differences in the input information.
     
    Last edited: Jan 20, 2018
  18. Write4U Valued Senior Member

    Messages:
    7,610
    Depends on your definition of flying. Regardless of mechanism, they all use the principle of "lift" in order to fly.
     
  19. someguy1 Registered Senior Member

    Messages:
    686
    Symbolic, yes. In the formal sense. There is an alphabet of symbols. The alphabet is taken to be at most countably infinite. When the machine encounters a particular symbol, it flips some bits and goes to the next symbol. Given a particular state of the computer and the particular symbol, it's entirely pre-determined what bits it will flip. It's defined by the computation's program.
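    A toy version of that deterministic symbol machine, as a sketch (the states, symbols, and transition table here are invented for illustration):

```python
# Given the current state and the symbol being read, the transition
# table fully determines what to write and which state to enter next.
table = {
    ("start", "0"): ("1", "start"),   # flip 0 -> 1, stay in state "start"
    ("start", "1"): ("0", "start"),   # flip 1 -> 0, stay in state "start"
}

def run(tape):
    state, out = "start", []
    for symbol in tape:
        written, state = table[(state, symbol)]
        out.append(written)
    return "".join(out)

print(run("1010"))   # prints 0101
```

    Nothing in the run is left open: the same tape always yields the same output.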

    I'm being picky like that because I can refer to this if you start to get poetic and fanciful about what algorithms are.

    I wish to state that I am not opposed to poetry. I like imagery and intuition. I just want to make sure that when we're talking about algorithms, we're clear on whether we mean that in the technical sense or the poetic. Because once you use words like "patterns and their potential" I get that old woo-woo vibe and I feel that you are leaving the realm of the science of algorithms and drifting into the poetry. And that may be leading you astray in your reasoning.

    Because our minds understand meaning. They have intentionality, "aboutness." When computers flip bits, that activity is meaningless. Humans impart meaning to the bits. These bits are a discussion forum, those bits are a cat video. That's what the humans add. If you just looked at the circuitry of the computer you'd just see a long but finite string of bits 1010101010010001001001001... There is no meaning in that, nor in the manipulations of the bits by the program.

    In case it's not obvious, this is basically Searle's argument in the Chinese room. So if you and I disagree on that subject, we can agree to disagree. Many people think the room is conscious or self-aware or understands Chinese. I happen to think that's absurd. So now you know my philosophical orientation to all this.

    Yes.

    Yes. The humans impart meaning to the symbols. If I had to put my entire argument into a short sentence, that would be it.

    Oh, nothing prevents humans from using effective mathematical functions. For example a long time ago I took a class in number theory. Early in the class they required us to execute the Euclidean algorithm by hand. In that moment my brain caused my body to execute an algorithm!

    OF COURSE THAT HAPPENS. Of course PARTS of our brain may be algorithms. Reflexes, say. The doctor taps your knee in just the right spot and your knee jumps. That's probably a little subroutine in the nerves that operate the muscles. I'm perfectly happy to stipulate that.

    But not ALL brain and mind function can be explained as the execution of algorithms in the brain.

    It's "some" versus "all." Are some brain functions algorithmic? Probably. Could we prove that it's at least possible? Sure, just execute the Euclidean algorithm in your head. 8 and 14 in, 2 out. Boom, I'm a digital computer that can execute a 2000 year old algorithm. Substrate independence. Euclid and I can execute the exact same algorithm in our respective brains.

    But not ALL brain function is an algorithm. And the fact that algorithms are substrate independent does NOT prove that mind is. I saw Tegmark make that argument in an article recently. He's totally and completely and utterly wrong.


    It's biological, agreed.

    When you say biological computer, what do you mean by that? If by computer you mean a digital computer according to the abstract theory of Turing and the contemporary practice of hardware and software engineering, then there is not a shred of evidence that the brain works this way. There is no cpu, there's no ram, there's no instruction set, there's no clock, there's no program. None of these things.

    So I must ask you if you would be willing to clarify EXACTLY what you mean by that remark? Do you mean to say that a biological computer is different than a digital computer? If so then how about using a different word. Call it the biological foozle and I will have not a single objection to write about. I'll have nothing to post. I'll be done here.

    But if you call it a computer and you mean to invoke the science of modern digital computing, I simply must push back as strenuously as I can, because you are making an assumption utterly without evidence.


    The brain? Processing values? If by processing you mean computing then you have no evidence. It's the same problem as in your previous paragraph. You are using the word "processing" ambiguously. You want it to mean both "whatever it is the brain does," and also "what a digital computer does." It's a subtle switch in language. If you get very clear about what you mean by processing, your argument fails.

    Again. The brain ultimately processes its gooey bits just like a non-sentient foozle. If you'll just stop calling things computers when you have provided no evidence that they are computers, I'll stop objecting. And honestly I think I'm just being tiresome now. I've pretty much made all the points I can make and if we agree to disagree so be it. But I think you are confusing yourself by overloading words like computer and process and algorithm so that they have both poetic and allegorical meanings as well as technical meanings, and by that ambiguity you are making an argument that fails once the ambiguity is identified.

    That's how I see it anyway!!

    So those parts are NOT algorithmic. Well then we're in agreement. Is that correct? If only SOME parts of our minds are algorithmic, I have no objection to that thesis at all. As long as you acknowledge that there's also something else going on, something that's not algorithmic.

    Haha I've seen a computer addict! In the mirror I think.

    As Ronald Reagan said to Jimmy Carter during their 1980 debate: There you go again.

    https://en.wikipedia.org/wiki/There_you_go_again

    Please, call it a chemical foozle and I'll say, Yes, I agree with you! Parts of the mind are algorithms and parts are foozle. I agree with that!

    But when you want to say that some kind of phenomenon that is clearly NOT computational, can still be called a computation or an algorithm, you are making an invalid argument by using the same word in two different senses.

    Compared to. So after all this time you are not actually stating a claim. You're only making a metaphor? If you'd said that up front I'd have never raised a peep. You serious? You were only making a metaphor after all? Shall I compare thee to a summer's day? Thou art more lovely and more temperate.

    You don't actually MEAN the brain operates via algorithm, but only that you are making a metaphor? Please clarify this point, you will save me a lot of typing going forward.

    You must know something I don't. I was using the word as a completely meaningless word that could stand for anything. If you know some specific meaning for it, that was not my intention. It's not bad, is it? Am I out of touch? Sorry about that.

    Call it "woo-stuff," the mysterious whatever that minds do that man-made machines may or may not be able to do or may someday do. So whenever you want to talk about "emotional processing" in the brain that's sort of like some biological woo-stuff, I'll reply only to express my enthusiastic agreement with your point!

    Word salad. Fuzzy algorithms? Look, I can write a program that goes:

    if today is tuesday take out the trash
    else lay in bed.

    That's a program that does something completely different depending on "subtle differences of the input information," like what day it is.
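    As a runnable sketch of that two-line program (the function and chore names are mine):

```python
import datetime

def chore_for(day):
    # Different output for different input: ordinary branching,
    # not anything "fuzzy" about the algorithm itself.
    if day == "Tuesday":
        return "take out the trash"
    return "lie in bed"

today = datetime.date.today().strftime("%A")   # e.g. "Tuesday"
print(chore_for(today))
```

    The program is completely deterministic; only the input varies.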
     
    Last edited: Jan 20, 2018
  20. someguy1 Registered Senior Member

    Messages:
    686
    ps --

    I found out what this logical fallacy is called. In Sophistical Refutations, Aristotle identifies the fallacy of equivocation, in which the same word is used in two different ways within the same argument.

    You use the word algorithm with its standard meaning as in computer science; and you also use it to refer to those mysterious mental processes such as emotions that we don't yet understand. This equivocation is causing you to reach an erroneous conclusion.

    https://howaristotleimpactedargument.weebly.com/fallacies.html
     
  21. Write4U Valued Senior Member

    Messages:
    7,610
    Yes, except the human brain does not function in a binary code of "on/off" states. Our brains function in an exponential fashion, forming cognition or recognition (faster) in a 3D environment. The emerging mental hologram of bits of information inside the brain does not happen differently from our cognition of the existence of algorithms as an "efficient" form of symbolizable processing of information.

    Well, I am not talking about computer languages, but the fundamentals which make such mathematically functional concepts equally useful for the human brain, an AI, and throughout the universe.

    Lemurs can already recognize more from less; that is a fundamental mathematical calculation.

    IMO, my "reasoning" is sound.
    I suspect that's where bio-chemistry begins to play a large part?
    I have no quarrel with that.
    Yes, because they do.
    Yes, because it could.
    Precisely; that's why I highlighted the word efficient. Mathematical efficiency and conservation of energy are demonstrably equally valuable assets throughout the universe.
    It depends on your perspective, which you have restricted to computers. I think it can be argued that all the electro-chemical information our senses and mirror-neuron system receive should (by the evidence of billions of neurons) employ a networking function very similar to a computer's. We can demonstrate this internal distribution of electro-chemical information by scanning which parts of the brain show activity while the brain is "working" on cognition or understanding.
    Yes, that was Seth's argument also. You have to be alive to feel emotion.
    Yes, from your perspective, but I am trying to address the orderly distribution of information
    Do you know the definition of "foozle"? Look it up and you'll see the logical error you made by using this definition.
    No, my "assumptions" are based on evidence of the importance of decision making, which is an ancient survival mechanism, present in almost all living organisms with neural networks. It seems a peculiarly well-developed capacity in hominids and especially in humans, especially in the cognition of abstractions and abstract thought itself. This requires a certain logical processor.
    Fair enough, I can understand your reluctance to use the term "biological computer" for "biological calculating organ" (machine).
    https://en.wikipedia.org/wiki/Brain–computer_interface
    Well then we are in agreement. I am fairly certain that many parts of the brain have completely different functional abilities.
    But even the subconscious part of the brain, which only controls and regulates our own living system, has to deal with the processing of internal electro-chemical signals.
     
    Last edited: Jan 20, 2018
  22. Write4U Valued Senior Member

    Messages:
    7,610
    No, I am not speaking in metaphors, we just address the question from different perspectives.
    I looked it up.
    So you do agree that the human brain is capable of processing algorithms.
    Thanks, you just proved my point. I could rewrite the algorithm in favor of taking out the trash then going back to bed....


    Substitute "numbers" with the concept of "inherent values" (potentials); perhaps that will make more sense.
     
    Last edited: Jan 20, 2018
  23. iceaura Valued Senior Member

    Messages:
    26,897
    Brings up an issue: what counts as a "failure" that only appears after years, in a transfer of brain functionality? How would one detect it?
    I have no objection to setting that aside - a lot of folks seem, to me, to underestimate the hardware complexity and (especially) flexibility of the brain.
    I am not arguing that it would be easy. Like I said, I'm normally on the other side of this discussion.
    Birds and planes are not that similar. They don't function the same.
    But it is possible to "transfer" the functionality of bird flight to machinery, in principle - it would not look or operate like an airplane, is all.
     
