How does programming work

Discussion in 'Computer Science & Culture' started by python, Jun 9, 2007.

Thread Status:
Not open for further replies.
  1. python wisdom comes quietly Registered Senior Member

    Can someone explain to me how programming actually works? For example, I don't understand how anything that has been programmed is given a GUI. I understand algorithms are instructions, but how do they tell the computer what to do?
  3. phonetic stroking my banjo Registered Senior Member

    Programming languages are like the middleman.

    The CPU - the arithmetic/logic unit and other bits (jesus.. wasn't that long ago I actually remembered this stuff) - deals with 1's and 0's. On and Off.

    Imagine your tv/dvd/vcr/cable box for example. If they're on and off in a different order, different things might happen. Depending on what you want to happen, you turn them on in a certain order.

    So, the 1's and 0's that the computer understands are machine code. Assembly language is just a human-readable form of those same instructions.

    When we write a program in C++ or whatever language, all we're really doing is writing those commands in words. Something that makes sense to us. The compiler will then turn that into the 1's and 0's (often written out in hexadecimal) for the computer to understand.
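    A minimal sketch in Python of what that words-to-numbers translation step does. The mnemonics and bit patterns here are invented for the example; a real compiler targets a real CPU's instruction set and does far more.

```python
# Toy "assembler": map human-readable mnemonics to made-up bit patterns.
# These opcodes are invented for illustration only.
OPCODES = {
    "LOAD": 0b0001,  # load a value into the register
    "ADD":  0b0010,  # add a value to the register
    "HALT": 0b1111,  # stop the machine
}

def assemble(lines):
    """Turn 'MNEMONIC operand' text into (opcode, operand) number pairs."""
    program = []
    for line in lines:
        parts = line.split()
        op = OPCODES[parts[0]]
        arg = int(parts[1]) if len(parts) > 1 else 0
        program.append((op, arg))
    return program

machine_code = assemble(["LOAD 5", "ADD 3", "HALT"])
print(machine_code)  # [(1, 5), (2, 3), (15, 0)]
```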

    Apologies for my shitty answer. Hopefully somebody will give you a much better answer

    I'm amazed at how little I've remembered. It was only a couple of years ago I had to learn all of this stuff.
  5. Zephyr Humans are ONE Registered Senior Member

    Do you mean: how can you learn to program? Or do you mean: how do computers run programs?

    For the second question, check out the Art of Assembly books.
  7. leopold Valued Senior Member

    there are 2 basic types of programming languages, low level and high level.
    to explain how a program works you need to understand both types.
    low level means programming close to the machine, with ones and zeros. the next step up is assembly; both count as low level programming.
    the computer (or technically the CPU) is constructed to understand that certain groups of ones and zeros mean certain things. for example, the CPU might be built to recognize the bit pattern 1101 to mean add the 'a' register to the accumulator. so when the CPU sees this bit pattern it automatically carries out that task.

    in order to run programs you must be able to tell the computer where the instructions are.
    we do this with a program counter. the program counter is pointed at the first byte of the program and adds one to itself after each instruction (jump and branch instructions change it directly).
    this way each instruction is executed one after the other.

    this is a very simplified explanation.
    an algorithm is just a step-by-step method for reaching a goal.

    see my above explanation to see how they tell a computer what to do.
    if you need clarification then ask.
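    the fetch/execute cycle above can be sketched in a few lines of python. the opcode numbers and the tiny instruction set are made up for the example; a real CPU does this in hardware, not in a loop of code.

```python
# Minimal sketch of a fetch-decode-execute loop with a program counter.
# The opcode numbers and instruction set are invented for the example.
LOAD, ADD, JMP, HALT = 1, 2, 3, 0

def run(memory):
    acc = 0  # accumulator register
    pc = 0   # program counter: index of the next instruction
    while True:
        op, arg = memory[pc]  # fetch the instruction the counter points at
        pc += 1               # the counter "adds one to itself"
        if op == LOAD:        # decode and execute
            acc = arg
        elif op == ADD:
            acc += arg
        elif op == JMP:
            pc = arg          # jumps work by overwriting the program counter
        elif op == HALT:
            return acc

program = [(LOAD, 5), (ADD, 3), (HALT, 0)]
print(run(program))  # 8
```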
  8. python wisdom comes quietly Registered Senior Member

    How did we ever program this in the first place? Using this example, how did we ever program the computer so that 1101 means that instruction? And could someone explain how software like, for example, Nero is programmed? Is it just more complicated binary commands? I'm still unclear on how, after you have made a program, you give it a GUI.
  9. RubiksMaster Real eyes realize real lies Registered Senior Member

    You use the Windows API (application programming interface). The WinAPI allows your program to work with Windows, and display a GUI for it. You have to program it yourself though.

    It has to do with the hardware level of the computer (like the actual transistors that make up an integrated circuit). Look up some stuff about logic gates and the ALU. Learn how a simple 4-bit ALU is designed, and you will begin to see how the computer decodes and "understands" binary instructions.
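    As a taste of that, here is a sketch in Python of a full adder built from the basic gates (XOR, AND, OR), chained into a 4-bit ripple-carry adder. This is the kind of circuit that sits inside a simple ALU; the bitwise operators stand in for the physical gates.

```python
# A full adder combines XOR/AND/OR gates; chaining four of them gives
# a 4-bit ripple-carry adder, like the hardware inside a simple ALU.
def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in                        # sum bit (two XOR gates)
    carry_out = (a & b) | (carry_in & (a ^ b))  # carry logic (AND/OR gates)
    return s, carry_out

def add4(x, y):
    """Add two 4-bit numbers one bit at a time, as the hardware does."""
    result, carry = 0, 0
    for i in range(4):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result  # the carry out of bit 3 is dropped: 4-bit overflow

print(add4(0b0101, 0b0011))  # 8
```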
  10. RubiksMaster Real eyes realize real lies Registered Senior Member

    Not really. No human is smart enough to program something that complex all in binary. They use a higher level language like C or C++ and then the compiler translates that to machine-readable binary. This machine code is specific to the processor, because different processors (e.g. the Intel 8088 and the 80486) have different architectures and different circuits that control them.
  11. python wisdom comes quietly Registered Senior Member

    Thanks rubiks, that makes things clearer now. Thanks for telling me about the WinAPI, I didn't know about it. But now I'm thinking even deeper: how did we make things actually come up on displays? I mean, we couldn't have just designed a random architecture and hoped it would show something on screen.
  12. leopold Valued Senior Member

    when a computer is designed, all the various instructions are built into it.
    this is the reason chips with different architectures do not understand each other's instructions.

    your question has to do with op codes and op code decoders.
    an op code is a sequence of ones and zeros.
    for example the instruction to (a)add the a register to the accumulator, (b)the instruction to jump to a location, (c)the instruction to branch if the carry flag is set, (d)the instruction to branch if the carry flag is clear, (e)the instruction to load a value into memory, the list is quite long.
    in fact it is not unusual to have more than 256 different instructions.
    all of these instructions are programmed in at the time of manufacture.
    all of these instructions are called op codes.
    let's assign some values to the above instructions.
    (a) 00000001
    (b) 00000010
    (c) 00000011
    (d) 00000100
    (e) 00000101
    for ease of memorizing these op codes let's assign them the following:
    (a) add a
    (b) jmp xxxx
    (c) brcs xx
    (d) brcc xx
    (e) ld xxxx,yyyy
    the X's in the above stand for memory locations. the Y's for numerical values.

    when an instruction is loaded into the op code decoder, the decoder raises a single output line corresponding to that instruction. this line is hard wired to various components to carry out the instruction.

    in essence we are telling a computer what to do when we create it.

    it's a little like telling your child not to go into the road when he sees any cars coming. how does the child know not to go into the road? because he was told beforehand. the same scenario can be applied to computers.

    now you will no doubt ask "well how can computers do so many different things?" the answer is in the list of instructions: the more instructions you have, the more varied your programs can be.

    if that's not clear i'll try to explain further.
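    a sketch in python of that one-hot decoding step (the op code values match the made-up table above; a real decoder is combinational logic, not code):

```python
# Sketch of an op code decoder: for a given op code, exactly one output
# line goes high, and that line is hard wired to the circuitry that
# carries out the instruction.
def decode(opcode, n_lines=8):
    """Return the decoder's output lines: one-hot for the given op code."""
    return [1 if i == opcode else 0 for i in range(n_lines)]

print(decode(0b00000011))  # [0, 0, 0, 1, 0, 0, 0, 0] -> the "brcs" line is high
```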
  13. python wisdom comes quietly Registered Senior Member

    Thanks leo, you've helped me understand further. Still wondering about how we make things appear on displays though.
  14. leopold Valued Senior Member

    getting computers to display alphanumeric symbols on a crt was one of the hardest things that happened during computer development.
    it's very complex, involving the use of shift registers.
    the symbol isn't output to the screen all at once,
    it's more like scanned onto the screen from memory.

    here is a chart that contains the codes to display various characters.
    in each case the binary equivalent is stored in memory.
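    the "scanned from memory" idea can be sketched in python: each character sits in a character-generator rom as rows of bits, and the display emits them one scan line at a time. the 5x7 'A' glyph below is hand-drawn for the example, not taken from any real rom.

```python
# Sketch of a character-generator ROM: each character is stored as rows
# of bits, and the display scans them out line by line.
FONT = {
    "A": [0b01110,
          0b10001,
          0b10001,
          0b11111,
          0b10001,
          0b10001,
          0b10001],
}

def scan(ch):
    """Emit one text row per scan line, like a CRT drawing the glyph."""
    rows = []
    for bits in FONT[ch]:
        rows.append("".join("#" if (bits >> (4 - i)) & 1 else "." for i in range(5)))
    return rows

for row in scan("A"):
    print(row)
```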
  15. RubiksMaster Real eyes realize real lies Registered Senior Member

    When a program wants to display something, it writes to the video memory, or VRAM, on the video card. The display hardware constantly scans this memory and sends it to the monitor. The monitor knows that when certain bits are turned on, it should display a certain color pixel at a certain location on screen.

    So basically the display architecture "observes" the video portion of the RAM, and decodes it with its own processor (the GPU), and sends that data to the monitor.

    This is all done through the video driver, which is utilized by high-level libraries such as the Windows API, OpenGL, or DirectX. These allow human programmers to write programs that can be compiled into machine code that works with the video display hardware.

    So that's how displays work.
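    A sketch in Python of the framebuffer idea, with a list standing in for VRAM and printed text standing in for the monitor (the tiny size and one-bit "pixels" are simplifications):

```python
# Sketch of a framebuffer: programs draw by writing pixel values into a
# block of memory, and the display hardware repeatedly scans that memory
# out to the screen. Here the "screen" is just printed text.
WIDTH, HEIGHT = 8, 4
vram = [[0] * WIDTH for _ in range(HEIGHT)]  # all pixels start off

def set_pixel(x, y, value=1):
    vram[y][x] = value  # a program "draws" by writing to video memory

def refresh():
    """What the display hardware does continuously: scan VRAM to screen."""
    for row in vram:
        print("".join("#" if p else "." for p in row))

set_pixel(1, 1)
set_pixel(2, 2)
refresh()
```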
  16. python wisdom comes quietly Registered Senior Member

    Thanks rubiks, you've made something else clear to me. I was originally asking about what leo picked up on, which is going to remain unclear, at least for a while. Thank you for your help.
  17. Stryder Keeper of "good" ideas. Valued Senior Member

    The best way to get even more lost when it comes to programming (lost in the sense that you have to do a bit of research, and it can seem a bit "around the houses" to get what you require) is to actually look at the history of programming.

    From the basics of pure mathematics to the theoretical invention of Charles Babbage, the "Analytical Engine" (a clockwork calculator).

    As for the GUI, that's where the fun starts. It's easy enough to tell a machine to put points on a screen and draw a line between them; however, when you start applying 3D engines to the mix, it gets a little more complex.

    You move from the X,Y screen position (0,0 happens to be the top left corner) to the Cartesian system, which involves a conversion process (one that most GPUs nowadays have built into the hardware).

    You then have to start adding an extra dimension, for instance the Z axis, which is drawn by rotating, scaling and otherwise transforming its location through the X,Y axes.

    To make this easier to manipulate, rather than writing complex lines of code or pushing large amounts of raw binary around, matrix mathematics is used to make the operations more manageable.
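    A sketch in Python of that conversion step: rotate a 3D point (using the expanded form of a rotation matrix), then perspective-project it to pixel coordinates where (0,0) is the top left. The screen size and focal length are arbitrary example numbers.

```python
import math

def rotate_y(point, angle):
    """Rotate (x, y, z) around the Y axis -- one 3x3 matrix multiply, written out."""
    x, y, z = point
    c, s = math.cos(angle), math.sin(angle)
    return (c * x + s * z, y, -s * x + c * z)

def project(point, screen_w=640, screen_h=480, focal=200.0):
    """Perspective-project to pixels; Y flips because screen Y grows downward."""
    x, y, z = point
    sx = screen_w / 2 + focal * x / z
    sy = screen_h / 2 - focal * y / z
    return (round(sx), round(sy))

p = rotate_y((1.0, 1.0, 4.0), math.radians(30))
print(project(p))
```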

    In a nutshell, it's worth looking at from a greater vantage point than just the programming.
  18. leopold Valued Senior Member

    what i find most confusing about explaining programming to a beginner is that there are two basic levels, high level and low level.
    high level has to do with programming languages such as BASIC and C.
    low level has to do with machine code and assembly.
  19. Billy T Use Sugar Cane Alcohol car Fuel Valued Senior Member

    I do not know much about the inner workings of digital computers, but I thought there were three levels of language:
    (1) the high level most humans who either use commercial software or program code use
    (3) the lowest level that the CPUs "understand" (These are all unique to each different CPU. Usually called "machine language.")
    and one you did not mention:
    (2) the intermediate level which all software that most humans use is written to "speak" - i.e. the people constructing a piece of software write code whose end product is the input language of this intermediate level. The intermediate "compiler" then converts that input into the unique machine language of the particular CPU.

    The existence of a now commonly-used, universal, industry-standard, intermediate-level language makes it possible for software producers to make a single version of their programs, not a different one for every CPU. For example, a Frenchman working with MS Word and doing a cut and paste of his text produces the same commands in this universal "intermediate language" as a German or American does.

    Is this not correct?
    Last edited by a moderator: Jun 10, 2007
  20. leopold Valued Senior Member

    basically correct.

    it gets really confusing when you realize that all of it must eventually be converted into machine code.
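    one concrete, checkable example of such an intermediate level: python compiles source to bytecode, which the interpreter executes, and only at run time does it become the CPU's machine operations. (java bytecode and .NET's CIL work the same way; most native windows programs, though, are compiled straight to machine code with no universal intermediate step.)

```python
# Python's bytecode is an intermediate representation: the same source
# compiles to the same bytecode on any machine, and each machine's
# interpreter turns it into that CPU's actual operations.
import dis

def add(a, b):
    return a + b

dis.dis(add)  # prints the intermediate-level instructions for add()

# The opcode names can be inspected programmatically too.
names = [ins.opname for ins in dis.get_instructions(add)]
print(names)
```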
  21. Billy T Use Sugar Cane Alcohol car Fuel Valued Senior Member

    If there are any errors, please tell what they are.

    Also, please explain why high-level software, which outputs the universal standard intermediate-level language, is not functional on all operating systems.

    I stated that I do not know much about the inner workings of digital computers, but you have made no similar qualifiers, so I assume you know and can answer.
  22. leopold Valued Senior Member

    i am indeed familiar with the inner workings of computers, on the chip level.
    most of my knowledge is with the early 8 bit processors, 8080, 6800, and their support chips.

    as to your question, no, i cannot answer it.
  23. Oniw17 ascetic, sage, diogenes, bum? Valued Senior Member

    This thread is actually very informative. What's happening to sciforums?