I suppose. But let's look a bit more closely at my list model: if you have a number like 0.999 and multiply it by 10[sup]-2[/sup], the result is 0.00999. The single zero on the left of the decimal point must really be a string of zeros extending infinitely to the left (which is inductively true, since we can multiply by 10[sup]-2[/sup] indefinitely). But this is true for any number: there is this infinite string of zeros on the left. We just write a number like 000.9 as 0.9, because it's just 0.009 multiplied by 100. You seem keen to pursue this idea that one view is fundamental and the other isn't. That's not the case. Are numbers more fundamental than operations on them? A well-defined operation on a set is not a vague convenience. You really want to cling to this "there's a problem here" thing, don't you?
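If it helps, here's a quick sketch of the arithmetic above using Python's decimal module (exact decimal arithmetic, so no binary floating-point noise gets in the way):

```python
# Sketch of the decimal-shift argument above, using exact decimal arithmetic.
from decimal import Decimal

x = Decimal("0.999")

# Multiplying by 10^-2 shifts the digits two places to the right;
# the zeros that appear have to come from somewhere on the left.
print(x * Decimal(10) ** -2)                 # 0.00999

# Leading zeros don't change the value: 000.9 and 0.9 compare equal.
print(Decimal("000.9") == Decimal("0.9"))    # True

# And 0.9 really is "0.009 multiplied by 100".
print(Decimal("0.009") * 100 == Decimal("0.9"))  # True
```

So the notation just suppresses the zeros; nothing about the number itself changes.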