Automatically editing wiki pages (for private wiki) in command line?

Discussion in 'Computer Science & Culture' started by AlphaNumeric, May 8, 2012.

Thread Status:
Not open for further replies.
  1. AlphaNumeric Fully ionized Registered Senior Member

    Messages:
    6,702
    Here's what I want to do on a wiki (not Wikipedia, a private one). Each person has a page and each day they add something to their own page. I want to have a script which computes the difference between yesterday's page and today's page (like the history feature wikis have) and then writes the new stuff to a single central page, basically collating what they've written.

    Of course I could do this manually but a script would be preferable. I have admin access to the server the wiki is on so there shouldn't be any permission issues. I'm familiar with copying bits from standard text files and concatenating them into a single file but I don't know how to work with the format the wiki uses. Is this possible and, if so, can it be done by someone with just basic scripting knowledge?

    I suppose if the 'compute difference' bit is unpleasant I could just get everyone to write the text into a .txt and then have the script copy that into the personal pages and also the central page. That'd just need me to know how to edit wiki pages from the command line.
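    Something like this is roughly what I'm picturing, assuming our wiki is MediaWiki (getText.php and edit.php are its standard maintenance scripts, though the exact options vary by version, so treat this as a sketch rather than working code, and the page titles and paths are made up):
    Code:
    #!/bin/bash
    # Sketch: collate each person's daily additions onto one central page.
    # Assumes a MediaWiki install at $WIKI and shell access on the server.

    WIKI="/var/www/wiki"              # assumed install path
    CENTRAL="Daily summary"           # assumed central page title
    PEOPLE=( "John" "James" "Ralph" ) # assumed personal page names

    : > /tmp/central.new
    for P in "${PEOPLE[@]}"
    do
        # Dump today's wikitext of the personal page to a working file
        php "$WIKI/maintenance/getText.php" "User:$P" > "/tmp/$P.today"

        # Compare against yesterday's saved copy and keep only the added lines
        if [ -f "/tmp/$P.yesterday" ]
        then
            echo "== New from $P ==" >> /tmp/central.new
            diff "/tmp/$P.yesterday" "/tmp/$P.today" | grep '^>' | sed 's/^> //' >> /tmp/central.new
        fi
        cp "/tmp/$P.today" "/tmp/$P.yesterday"
    done

    # Write the collated text back to the central page (this replaces its content)
    php "$WIKI/maintenance/edit.php" "$CENTRAL" < /tmp/central.new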
     
  3. Stryder Keeper of "good" ideas. Valued Senior Member

    Messages:
    13,105
    I would guess you could just write a parser in whatever CGI language you have available that takes the recently stored information directly from the database and formats it onto a separate page. You could either have it parse every time the script runs (which wouldn't be good scripting practice) or have it create a static page once a day with the information you want in it (this reduces the database accesses).
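    A rough sketch of the "static page once a day" version, run from cron and assuming a MediaWiki-style database (the recentchanges table and rc_* columns are MediaWiki's; other wiki engines will differ):
    Code:
    #!/bin/bash
    # Build a static page listing everything changed in the last 24 hours,
    # reading straight from the wiki's database. Run once a day from cron, e.g.
    #   0 6 * * * /usr/local/bin/daily_changes.sh
    # Assumes mysql credentials are available (e.g. in ~/.my.cnf).

    DB="wikidb"                      # assumed database name
    OUT="/var/www/html/daily.html"   # assumed static output file

    {
        echo "<h1>Pages changed in the last 24 hours</h1>"
        echo "<pre>"
        mysql -N "$DB" -e "
            SELECT rc_timestamp, rc_title
            FROM recentchanges
            WHERE rc_timestamp > DATE_FORMAT(NOW() - INTERVAL 1 DAY, '%Y%m%d%H%i%S')
            ORDER BY rc_timestamp;"
        echo "</pre>"
    } > "$OUT"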
     
  5. Chipz Banned Banned

    Messages:
    838
    The typical tools used for this task (patch, diff, diff3, merge) will not work easily, due to your wiki syntax. Whatever diff tools you use will have to be smart enough to properly reformat the diff back into wiki markup, if I understand what you're trying to do.

    However, I will mention that GitHub hosts its own wikis (using Markdown) and git seems to merge branches adequately.
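    If the pages do live in a git repository (one Markdown file per page, as on GitHub), pulling out a day's worth of changes is close to a one-liner; a rough sketch with made-up file names:
    Code:
    #!/bin/bash
    # Show everything committed to each person's page in the last day.
    # Assumes a local clone of the wiki repository; file names are examples.

    cd /path/to/wiki-clone || exit 1
    git pull --quiet                  # bring in everyone's latest edits

    for PAGE in john.md james.md ralph.md
    do
        echo "== New in $PAGE =="
        git log --since="1 day ago" -p -- "$PAGE"
    done > amalgam.md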

    You would probably have a much easier time using Markdown, as patches and diffs don't need to be fully aware of wiki-syntax closures. If this is something you're interested in, I'd be happy to whip up some patching examples. I would need to know first... are you trying to concatenate all NEW data into a single page, or are you trying to concatenate ALL data into a single page?

    Code:
    
    #!/bin/bash

    # Print the diff between a person's current page and yesterday's copy,
    # then roll the current copy over so tomorrow's run compares against today.
    function cat_diff_of_page
    {
        pushd "$1" &> /dev/null

        OLD="page.old"
        NEW="page"

        echo "Diff Page for $1"
        diff "$NEW" "$OLD"

        cp "$NEW" "$OLD"
        popd &> /dev/null
    }

    # BASE WORKING DIRECTORY
    WD="/tmp"

    # PAGE TO AMALGAMATE DIFFS TO
    DIFFP="$WD/amalgam_dif"

    # PAGES TO PROCESS
    PTP=( "john" "james" "ralph" )


    #!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    # This part is only here to set up the example #
    #!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    cd "$WD"   # make sure the example pages end up under the working directory

    for PAGE in "${PTP[@]}"
    do
        echo "$PAGE"
        if [ ! -e "$PAGE" ]
        then
            mkdir "$PAGE"
        fi

        pushd "$PAGE"

        echo "Some words" > page
        echo "Some words" > page.old
        echo "and more words." >> page.old
        popd
    done
    #! END SETUP


    # BEGIN EXECUTION
    cd "$WD"

    echo "#===================================================" >> "$DIFFP"
    echo "# `date`" >> "$DIFFP"

    for PAGE in "${PTP[@]}"
    do
        echo "--------------------------------------------------" >> "$DIFFP"
        echo "`cat_diff_of_page $PAGE`" >> "$DIFFP"
        echo "" >> "$DIFFP"
    done
    # END EXECUTION
    
    This assumes you have a structure like...
    Code:
    .
    ├── amalgam_dif
    ├── james
    │   ├── page
    │   └── page.old
    ├── john
    │   ├── page
    │   └── page.old
    └── ralph
        ├── page
        └── page.old
    
    And in amalgam_dif it will output...

    Code:
    #===================================================
    # Mon May 14 21:47:46 PDT 2012
    --------------------------------------------------
    Diff Page for john
    1a2
    > and more words.
    
    --------------------------------------------------
    Diff Page for james
    1a2
    > and more words.
    
    --------------------------------------------------
    Diff Page for ralph
    1a2
    > and more words.
    
    
    (Because the diff for each one of them was the same)
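    One thing to flip depending on what you actually want: diff $NEW $OLD reports lines that only exist in yesterday's copy as additions. If the goal is "what did each person add today", put the old file first:
    Code:
    diff "$OLD" "$NEW"    # lines prefixed with '>' are today's additions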
     
    Last edited: May 15, 2012
  7. rpenner Fully Wired Valued Senior Member

    Messages:
    4,833
    MediaWiki (roughly what Wikipedia uses) and TWiki are just two of the data formats. Both let you mine the history and perform diffs, but the details vary greatly. I have found the Perl library Algorithm::Diff (http://search.cpan.org/dist/Algorithm-Diff/lib/Algorithm/Diff.pm) helpful in the past. The problem is that ANY part of a wiki page can change -- unless you structure the content and make updates in an un-wiki-like way -- so you need to decide on a spec for what counts as "new" material:
    1) Any added/changed paragraph
    2) Any added/changed section
    3) Any added section
    4) Any added block of text at the end (a rough sketch of this case follows below)
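    For case 4 (or case 1), GNU diff can print just the new material without the usual diff markers; a rough sketch, assuming GNU diffutils (see the group-format options in "man diff"):
    Code:
    # Print only the lines added or changed in today's copy, with no "1a2"/">"
    # markers, so the output could be pasted straight into the central page.
    diff --unchanged-group-format='' \
         --old-group-format='' \
         --new-group-format='%>' \
         --changed-group-format='%>' \
         yesterday.txt today.txt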

    A question of presentation arises -- do the copied sections link back to their source pages? How are the different contributions delimited?

    The last step is merging in or replacing the contents of the old page with the new material.
     
  8. Chipz Banned Banned

    Messages:
    838
    Well, rpenner, let's hope his situation isn't so dire as to require Perl as a solution.

     
  9. rvagood Registered Member

    Messages:
    2
    Have you ever wanted to start your own wiki website that anyone can edit, like wikiHow, Wikipedia or Wikitravel? Here's how to do it.
     
    Last edited: May 18, 2012