I know it can be done, but not how

Depends on your source. I've resized pictures from my WordPress files before. It's done through the URL, which ends with w and h assignments. However, I have never determined exactly what makes that work for which files. Have I resized them for my blog? Is there something about the file properties that allows it? I cannot do it consistently, and haven't bothered trying for a long time, but it seems to me that I have, in fact, done it before. Perhaps the resizing must take place on the source side, and not through Sciforums. But it's been a couple of years, so I can't tell you exactly why it worked, or even in which post.
If you go to "User CP" and click on Photos/Albums, you will find the instructions: "...Maximum File Size per Picture 97.7 KB Maximum Picture Dimensions 600 by 600 Pixels Pictures will be automatically resized to fit within these constraints if possible. However, you may receive better results by doing it manually..."
keith1: That's fine for pictures you upload, but it doesn't help at all with images linked using the [noparse][img][/noparse] tag.
The test begins ... now

[Three linked copies of the test image appeared here.]

Okay, what you're looking at is a file from one of my blogs. The files, in order:

• bdthisis.files.wordpress.com/2010/09/warrantrect.jpg — Untailored.
• bdthisis.files.wordpress.com/2010/09/warrantrect.jpg?w=500&h=167 — The version used in the blog post. I don't know why, since it has the same dimensions as the untailored version.
• bdthisis.files.wordpress.com/2010/09/warrantrect.jpg?w=335&h=112 — A version tailored for this post; the dimensions have been reduced by a third (.67:1).

As of this posting, I am uncertain whether the scalability at Sciforums is the result of the original picture already being ... er, well, I'm going to try again with another image and answer the question. But that's what we have so far.
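For what it's worth, the pattern above looks like WordPress-style on-demand resizing, where the w and h query parameters ask the host to scale the image. A minimal sketch of how those URLs relate to each other (the helper function is purely hypothetical, not anything WordPress ships):

```python
def resized_url(base, scale, width, height):
    """Build a WordPress-style resize URL by scaling the original
    dimensions and appending them as w/h query parameters."""
    return f"{base}?w={round(width * scale)}&h={round(height * scale)}"

base = "https://bdthisis.files.wordpress.com/2010/09/warrantrect.jpg"
# Two-thirds of the 500x167 blog version gives the 335x112 version above.
print(resized_url(base, 0.67, 500, 167))
```

Whether the host actually honors the parameters depends entirely on the server; the URL itself is just a request, which would explain why it works for some sources and not others.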
Test failed

I can't find any images that come up without the specifications in the URL. Interesting. So ... I'm going to borrow one from somewhere.

Test failed. I attempted to alter the dimensions of both the small and large versions of a Mr. Fish cartoon. Both failed.

Here's the thing: I'm not a computer scientist. I'm barely competent with these things as it is. I can see the issue, but cannot coherently express it. It has to do with the server and whether it's prepared to rescale images on demand. I do not know why this is, or how it works. To wit, is it a deliberate result chosen by a network administrator, or is it a symptom or limitation of a particular software package? Am I even within a mile of making sense?
Tiassa: In HTML it is very easy to scale pictures to whatever size you want. You just specify the desired size in an HTML tag and the browser handles the rest automatically. On Sciforums, we have a reduced set of HTML tags accessible to users. It is deliberately limited in order to stop malicious users from messing with the look of the forum using certain tags, or from including malicious executable code in their posts. For whatever reason, the writers of the forum software seem to have decided that posters do not need the ability to alter the size of the images they link. Perhaps they assumed that users would mostly upload images to the forum, and the forum software would handle the scaling at upload time. Instead, what we have right now is that images are mostly linked from other sites rather than uploaded.
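To illustrate the point about plain HTML (the URL below is a placeholder): the browser scales an image client-side whenever the tag specifies width and height attributes, even though it still downloads the full-size file.

```html
<!-- The browser fetches the full-size file but renders it at 335x112. -->
<img src="http://example.com/warrantrect.jpg" width="335" height="112" alt="Scaled image">
```

This is exactly the capability that the forum's restricted tag set withholds from posters.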
It would be kinda cool to set up your own little Squid server with your own server-side script to resize the images for you. That way, you wouldn't have to roll the dice on whether random web-image sources expose any image-resize functions.
I was thinking about how to design the self-hosted image cache. Basically, you'd set up a server somewhere and write a server-side script to interpret your URL. To post images, you could do something like this: [noparse]http://www.myproxyimageserver.com/myimage.jpg?url=http://www.originalimagesource.com/image.jpg&w=800&h=600[/noparse] Your server could be scripted to download the image once (so that it's not constantly leeching from the source). Do you think anything like this would work?
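The scheme above could be sketched roughly like this in Python, using only the standard library. This is a sketch under stated assumptions, not a working proxy: every name, the cache location, and the URL layout are made up here, and the actual resize step (which would need an image library such as Pillow) is left out.

```python
import hashlib
import os
import urllib.parse
import urllib.request

CACHE_DIR = "/tmp/imgproxy"  # hypothetical cache location


def parse_proxy_request(query_string):
    """Split a query string like
    'url=http://example.com/image.jpg&w=800&h=600'
    into the source URL and the requested dimensions."""
    params = urllib.parse.parse_qs(query_string)
    src = params["url"][0]
    w = int(params.get("w", ["0"])[0])
    h = int(params.get("h", ["0"])[0])
    return src, w, h


def cache_key(src, w, h):
    """One cached file per (source, size) pair, so each size is
    fetched from the original host at most once."""
    return hashlib.sha256(f"{src}|{w}x{h}".encode()).hexdigest() + ".jpg"


def fetch_once(src, w, h):
    """Download the source image only if no cached copy exists,
    so the proxy isn't constantly leeching from the source."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, cache_key(src, w, h))
    if not os.path.exists(path):
        urllib.request.urlretrieve(src, path)
        # Resizing the saved file to (w, h) would happen here,
        # e.g. with Pillow; this sketch leaves that step out.
    return path
```

The missing pieces are the HTTP front end that calls these functions and the resize itself, but in principle the download-once-then-serve-locally idea works; it is essentially what any caching image proxy does.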
Forums and blogs nowadays tend to move towards keeping local copies of images to serve. This helps when a source website is down or no longer has the image available. Obviously this can cause particular problems with regard to people's intellectual property and copyright, so it would have to be an identifiable forum policy (with a documented method) granting people the right to ask for material to be removed.

However, it should be noted that the internet exists as a vast interconnected network of computers that cache data, and that people's browsers cache data too (data meaning not just web pages but the images). So replicating a publicly viewable image is much the same as having the image in a browser cache.

Sciforums used to allow the uploading of images in its heyday; however, as the site became more popular, it was found that uploads taxed resources and consumed a great deal of bandwidth. It's possible that neither of those points is so much of a concern now; however, if images were to be allowed to be uploaded, they should be contained within an anti-leeching folder.

As for a simple image proxy resizer: it would be possible to code one, but such a tool would likely require people to sign up for it (mainly in an attempt to lessen misuse) and might incur watermarking to identify the site used to resize it.