Discussion in 'Intelligence & Machines' started by JoshHolloway, Apr 2, 2006.
Besides Linear Algebra, Calculus, and Diff E, what type of maths are used in the field of AI?
I am guessing that no one in these forums actually works in the area of artificial intelligence. I guess that is why no one has responded to my post. Very interesting.
First-order predicate calculus. While it's not exactly a field of mathematics, it does have a defined set of axioms upon which theorems can be proven, and so forth.
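To make that concrete, here is a toy forward-chaining sketch over a restricted propositional fragment of that kind of logic. The facts, rule, and function names are all made up for the example, not taken from any real system:

```python
# Minimal forward chaining: repeatedly fire "premises -> conclusion" rules
# until no new facts can be derived. Facts and rules are invented examples.
facts = {"man(socrates)"}
rules = [({"man(socrates)"}, "mortal(socrates)")]  # premises -> conclusion

def forward_chain(facts, rules):
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)  # rule fires: add the new fact
                changed = True
    return derived

known = forward_chain(facts, rules)  # now contains "mortal(socrates)"
```

Real theorem provers do much more (unification, resolution over actual first-order formulas), but the "axioms plus inference rules" flavour is the same.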
Ahh... it's been a while since I visited this place...
As to your question, it also depends what kind of AI you're talking about. For example, swarm intelligence and emergence use statistics ... A LOT.
Calculus? Do you mean calculus as in integrals and derivatives? I may be completely wrong (feel free to point it out, I'm actually interested if there is an application), but I thought that continuous mathematics doesn't help in the discrete world much.
all the best
Just a guess, but what about matrices and probability?
Hey there josh. I'm a mechanical engineering student who took a control systems course which briefly touched on things called "Neural Networks" and "Fuzzy Logic".
A neural network is basically a program which is designed to "learn" a pattern presented to it in the form of some input. Inside the program is a bunch of simple mathematical equations ("neurons") which alter themselves slightly after being subjected to an input.
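A quick sketch of that idea, using a single "neuron" (a perceptron) whose weights nudge themselves a little after every input. The function and parameter names are just illustrative, not from any library:

```python
# One perceptron learning the logical AND pattern by adjusting its
# weights slightly after each example, as described above.
def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]  # one weight per input
    b = 0.0         # bias term
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out      # how wrong was the guess?
            w[0] += lr * err * x1   # nudge each weight a little
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# The pattern to learn: output 1 only when both inputs are 1 (AND)
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
```

After training, the weights classify all four cases correctly. Real networks stack many such units in layers, but the "adjust slightly after each input" loop is the core idea.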
My memory is too fuzzy to really describe the other one.
Hope that helps a bit. Maybe do an internet search for published and credible resources.
AI math is a very special form of the math as 'we' know it.
It is symbolic math.
There is the thing and the image,
such that an entry for "apple" contains all the data for the "apple" thing (thing form) in one single, unique mathematical entry (image form).
AI (in my opinion) should work by using all the thing data to generate a corresponding unique image, one that can be retrieved in total from the numerical data (= thing) in that single entry.
That is how I have approached the problem.
The real problem with AI is the lack of a definitional framework. It is impossible to read the "ROM" of the brain, so the criteria (elements) of the framework for defining input have to be construed from our observations.
And how much of this unobserved framework is totally hidden from us?
Without a complete framework, you can't have true AI, but you can have a very 'intelligent' computer.
Production Inference, Nonmonotonicity and Abduction
Artificial Intelligence is actually studied in different disciplines such as Computer Science, Computer Engineering, Electrical Engineering, etc. I guess the type of math you'd be using would depend on the kind of exposure you'd be getting. For example, I'm studying electrical engineering, and we deal mainly with pattern recognition in AI: image recognition, voice recognition, etc. In such courses, you'd need higher-level statistics and probability in addition to linear algebra. For example, speech recognition utilizes hidden Markov modelling, which involves fairly advanced probability and stochastic processes.
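To give a taste of the probability involved, here is a tiny forward-algorithm sketch for a hidden Markov model. The two-state weather model and all its numbers are invented purely for illustration; real speech recognizers use the same recursion over far bigger models:

```python
# Forward algorithm: P(observation sequence) under a toy HMM.
# States, transition and emission probabilities are made-up examples.
states = ["rainy", "sunny"]
start = {"rainy": 0.6, "sunny": 0.4}
trans = {"rainy": {"rainy": 0.7, "sunny": 0.3},
         "sunny": {"rainy": 0.4, "sunny": 0.6}}
emit = {"rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
        "sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}

def forward(observations):
    # alpha[s] = P(observations so far, hidden state = s)
    alpha = {s: start[s] * emit[s][observations[0]] for s in states}
    for obs in observations[1:]:
        alpha = {s: sum(alpha[p] * trans[p][s] for p in states) * emit[s][obs]
                 for s in states}
    return sum(alpha.values())  # total probability of the sequence

p = forward(["walk", "shop", "clean"])
```

The recursion sums over every possible hidden-state path at once, which is why it stays tractable even when the number of paths explodes.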
In genetic algorithms, it's nothing much more difficult than multiplication. GAs are cool. They're so easy to code and they work like magic.
I wrote one once. It was about two screens of code. I gave it the task of finding the highest point in a terrain (an equation) consisting of a square grid one billion points on a side, with three hills of almost the same height in one corner. There was only one highest point, so it had to find that one point out of 1,000,000,000,000,000,000 possibilities! On a 1 GHz machine, where one try = a guess that a point is the highest, the algorithm could find that point in fewer than 1500 tries 99% of the time, and in about 5 seconds. Of course it had to start from a WAG (a wild-ass guess) and knew nothing about the equation (it was treated as a black box).
Given how easy they are to code and how well they work, I'm surprised at the slow pace of their advancement into everyday firmware and software applications. It probably doesn't help that most of the books I've seen make it look harder than brain surgery.
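In the same spirit, here is a stripped-down GA sketch that hunts for the peak of a black-box terrain. The terrain equation, population size, and mutation range are all invented for the example (and this toy keeps only mutation and selection, no crossover):

```python
import random

random.seed(0)  # for a reproducible run

def fitness(x, y):
    # Black-box "terrain": one smooth hill centred at (300, 700).
    # The algorithm never looks inside this function, only calls it.
    return -((x - 300) ** 2 + (y - 700) ** 2)

def evolve(generations=200, pop_size=20, span=1000):
    # Start from wild-ass guesses scattered over the grid
    pop = [(random.randrange(span), random.randrange(span))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(*p), reverse=True)
        parents = pop[: pop_size // 2]          # keep the fittest half
        children = []
        for x, y in parents:                    # each survivor spawns a mutant
            nx = min(span - 1, max(0, x + random.randint(-10, 10)))
            ny = min(span - 1, max(0, y + random.randint(-10, 10)))
            children.append((nx, ny))
        pop = parents + children
    return max(pop, key=lambda p: fitness(*p))

best = evolve()  # ends up close to the hilltop at (300, 700)
```

It really is about one screen of code, and the selection/mutation loop does all the work without ever needing the terrain's formula.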
Also check out Linear Programming and matrices.