Faster than O(n) compacting algorithm exists?

Discussion in 'Computer Science & Culture' started by a_ht, Oct 5, 2004.

Thread Status:
Not open for further replies.
  1. a_ht Registered Senior Member

    Messages:
    158
    Hi all,

    For example, let's say you have an array B[6] = {0,3,0,0,2,0}. After being compacted, B would be {3,2,0,0,0,0}. The only way I can think of doing this is:

    int ptr;
    for (int i; i < n; i++) {
        if (B[i] == NULL && ptr == NULL) {
            ptr = i;
        } else if (ptr != NULL) {
            B[ptr] = B[i];
            ptr++;
        }
    }

    The algorithm above is O(n), since its execution time increases linearly with n. Is there a compacting algorithm that executes faster than O(n)?
     
  3. curioucity Unbelievable and odd Registered Senior Member

    Messages:
    2,429
    I can't say for sure... the only faster-than-linear algorithm I've ever heard of is the dividing technique for searching a sorted array... How is it applicable here?
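    For context, the "dividing technique" referred to here is binary search, which finds a value in a *sorted* array in O(log n) by halving the search range at each step. A minimal sketch in C (function name is my own, not from the thread):

    ```c
    #include <stddef.h>

    /* Binary search: returns the index of target in a sorted array,
       or -1 if it is not present.  O(log n) comparisons. */
    int binary_search(const int *a, size_t n, int target) {
        size_t lo = 0, hi = n;          /* search range is [lo, hi) */
        while (lo < hi) {
            size_t mid = lo + (hi - lo) / 2;
            if (a[mid] == target)
                return (int)mid;
            else if (a[mid] < target)
                lo = mid + 1;           /* discard the left half */
            else
                hi = mid;               /* discard the right half */
        }
        return -1;
    }
    ```

    Note that it relies on the array already being sorted, which is why it does not obviously apply to the compaction problem.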
     
  5. river-wind Valued Senior Member

    Messages:
    2,671
    Could you explain your algorithm? From looking at it, I think it would lose data during execution, since you're just setting B[ptr] = B[i] without dealing with the previous value of B[ptr]. Also, I don't see where you compare the values of B to anything other than NULL, so the algorithm has no way to know whether it should move the current value to the left or not.

    unrolling the loop a little gives me this:
    Code:
    int ptr;                        //ptr is uninitialized, not NULL
    for(int i; i<n; i++){           //i is never initialized either; the loop presumably means 0 to n-1
    if (B[i]==NULL && ptr==NULL){   //comparing ints to NULL; presumably this means "element is empty"
    ptr=i;                          //setting ptr to the first empty slot
    }else if (ptr!=NULL){           //first time through, ptr is indeterminate, so neither branch is reliable
    B[ptr] = B[i];                  //setting B[ptr] to the value of B[i], overwriting whatever was there
    ptr++;}                         //increment ptr
    }
    
    
    From the looks of things, you're hoping for a rough sorting algorithm. Bubble sort, insertion sort, and the like require O(n^2) comparisons in the worst case, because they have to compare pairs of values; bucket sort can average O(n) at the cost of extra memory. Quick sort is one of the best, and even it requires O(n log n) on average. The n log n algorithms are what curioucity is thinking of with the "dividing technique", I believe: sectioning the array and sorting it in groups.
    [url]http://linux.wku.edu/~lamonml/algor/sort/sort.html[/url]
    
    [url]http://www.cs.colorado.edu/~karl/2270.spring03/sorting.html[/url]
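    As a reference point for the complexities mentioned above, a minimal quicksort sketch in C (Lomuto partition; function names are my own, not from the linked pages):

    ```c
    #include <stddef.h>

    /* Swap two ints in place. */
    static void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

    /* Lomuto partition: places the pivot (last element) in its final
       sorted position and returns that index. */
    static size_t partition(int *a, size_t lo, size_t hi) {
        int pivot = a[hi];
        size_t i = lo;
        for (size_t j = lo; j < hi; j++)
            if (a[j] < pivot)
                swap(&a[i++], &a[j]);
        swap(&a[i], &a[hi]);
        return i;
    }

    /* Recursive quicksort on the inclusive range a[lo..hi]. */
    static void quicksort_range(int *a, size_t lo, size_t hi) {
        if (lo >= hi) return;
        size_t p = partition(a, lo, hi);
        if (p > lo) quicksort_range(a, lo, p - 1);
        quicksort_range(a, p + 1, hi);
    }

    void quicksort(int *a, size_t n) {
        if (n > 1) quicksort_range(a, 0, n - 1);
    }
    ```

    Each partition step "sections" the array around the pivot, which is the divide-and-conquer idea behind the O(n log n) average.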
     
  7. Blindman Valued Senior Member

    Messages:
    1,425
    The quickest sort I know of is the tree sort. You build a binary tree with backward and forward links: insert the first record, then for each new record check whether it is greater or less than the current node; if less, follow the back link, otherwise the forward link, and repeat until you reach an empty slot. Once every record has been inserted, you just walk the tree in order to read out the sorted sequence.

    This is not effective for a small record count, as the overhead cancels out any speed gains. It does not work well on an already sorted data set, and works best on a large, randomly distributed one. With small record sizes and large data sets, though, it can use up a lot of memory.

    Was used in old render pipes before the days of Z buffers.
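    The tree sort described above can be sketched as a plain binary search tree: lesser values go down the "back" (left) link, greater-or-equal values down the "forward" (right) link, and an in-order walk reads everything out sorted. A minimal sketch in C (node layout and names are my own, not from the post):

    ```c
    #include <stdlib.h>

    /* Tree node: "back" link is left (lesser values),
       "forward" link is right (greater-or-equal values). */
    struct node {
        int value;
        struct node *left, *right;
    };

    /* Insert a value into the tree rooted at *root. */
    static void insert(struct node **root, int value) {
        if (*root == NULL) {
            struct node *n = malloc(sizeof *n);
            n->value = value;
            n->left = n->right = NULL;
            *root = n;
        } else if (value < (*root)->value) {
            insert(&(*root)->left, value);
        } else {
            insert(&(*root)->right, value);
        }
    }

    /* In-order walk: emits values in sorted order and frees each node. */
    static void walk(struct node *n, int *out, size_t *pos) {
        if (n == NULL) return;
        walk(n->left, out, pos);
        out[(*pos)++] = n->value;
        walk(n->right, out, pos);
        free(n);
    }

    /* Tree sort: O(n log n) on random data, but the tree degenerates to
       a list (O(n^2)) on already-sorted input, matching the caveat above. */
    void tree_sort(int *a, size_t n) {
        struct node *root = NULL;
        for (size_t i = 0; i < n; i++)
            insert(&root, a[i]);
        size_t pos = 0;
        walk(root, a, &pos);
    }
    ```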
     
  8. river-wind Valued Senior Member

    Messages:
    2,671
  9. melodicbard Registered Senior Member

    Messages:
    208
    You must look at each element to see whether it is NULL, so the compacting must be O(n) unless there is a parallel processing unit capable of examining more than one element at a time.

    This reminds me of the problem of counting the 1 bits (non-zero bits) in an N-bit machine word. A faster-than-O(N) algorithm exists for that problem, because the accumulator actually performs some parallel operations on the bits.
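    The bit-counting trick alluded to here is the classic parallel ("SWAR") popcount: rather than testing the N bits one at a time, each word operation sums adjacent groups of bits simultaneously, giving O(log N) steps. A sketch for a 32-bit word (function name is mine; the magic constants are the standard ones):

    ```c
    #include <stdint.h>

    /* Count the set bits in a 32-bit word in O(log N) word operations:
       each step sums adjacent bit groups in parallel. */
    unsigned popcount32(uint32_t x) {
        x = x - ((x >> 1) & 0x55555555u);                 /* 2-bit sums */
        x = (x & 0x33333333u) + ((x >> 2) & 0x33333333u); /* 4-bit sums */
        x = (x + (x >> 4)) & 0x0F0F0F0Fu;                 /* 8-bit sums */
        return (x * 0x01010101u) >> 24;                   /* add the 4 bytes */
    }
    ```

    Five word operations replace a 32-iteration loop, which is the sense in which the hardware examines many bits "at once".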
     
  10. shoffsta Geek Registered Senior Member

    Messages:
    60
    The quickest sort on average is quicksort.
    ...google...
     