I do not claim to be doing science, but thank you MD, not for the "grow some" suggestion, but for trying to move this forward. Everyone who has read Q is for Quantum, The Dance of the Photons, and The Quantum Challenge is aware that there is still a basis for scientists and laymen alike to hold a Hidden Variables Interpretation of QM. I read those three books before CptBork started this thread, and that doesn't even mention the fact that it is clearly spelled out on Wiki, a site used by me and almost everyone else, that there are grounds to say a Hidden Variables Interpretation of QM is valid, without saying what is missing, because we don't know. He wants to make that false so that he can say he has falsified one of the claims he said he would falsify. Still, we have people agreeing that experiments closing all of the loopholes at the same time have not yet been done. Until they are, no one can claim that QM is complete, and I am simply using the HVI vs. HVT distinction to convey that. CptBork doesn't claim that all the loopholes have been closed at the same time, and yet he refuses to acknowledge my phrasing of HVI vs. HVT, the common position in the scientific community that there are still grounds to say there is a basis for HVI. I just suggested we move forward based on his two civil posts, and then CptBork made additional untrue accusations in reply. We will have to air them all out, one by one, and get his proof that they are true, before we even address the issue of why he refuses to acknowledge HVI vs. HVT. If you think I need to grow a set, then tell me which of the accusations he is making to avoid agreeing that HVI vs. HVT is valid I should ignore. It is a fact that there are loopholes, and that all of the loopholes have not yet been closed at the same time.
Maybe they will be, and at that point, if that silences the professionals who say QM might not be complete, the loopholes argument will be put to rest. It has not been put to rest yet. That is what HVI vs. HVT is about. The unproven accusations in his reply to my "let's move forward" post, which he seems to be using to avoid agreeing with the concept of HVI vs. HVT, are: 1) I never knew there were loopholes until he told me, 2) I am asking him to go easy on me, and 3) quantum mechanics does not come into play in the experiments.
I'll say this much before I continue: science doesn't rule anything out beyond any possibility; it can only demonstrate that an attempted explanation is highly implausible/unreasonable, with an extremely small likelihood of having any validity to it. We haven't even ruled out Noah's Ark yet, even though the whole story contradicts vast mountains of readily accessible evidence from biology, geology, physics and chemistry. My plan is to demonstrate that nature must engage in an extremely implausible, almost intelligent conspiracy in order to reproduce the known experimental results through purely local deterministic behaviour, by knowing the flaws in a given experimental setup and specifically exploiting those flaws in such a way as to always spoof the result predicted by quantum mechanics. I intend to show that adhering to such a viewpoint is about as reasonable and responsible as claiming that the Big Bang never occurred and that cosmic redshift merely involves photons conspiring with our telescopes. Regarding quantum_wave's objections to the tone of our dialogue, I've never seen him make any reference to possible loopholes in Bell test experiments when stating his counterclaims, until the subject was raised in this thread by someone else. quantum_wave, the only objections I've seen you make are the false claims that 1) only certain classes of local hidden variable theories containing specific details have been tested, and 2) Bell test experiments require some knowledge and assumptions about quantum mechanics in order to be performed, beyond what's directly measurable in the lab. If I'm mistaken about your past claims, then please provide me with a quote and reference showing where you stated the Bell test loopholes as one of your objections.
My plan is to show that the Bell test loopholes themselves were never really plausible explanations for the results of Bell test experiments in the first place, and that any attempt to combine loopholes, to account for the fact that each has been individually closed, is more implausible still.
I wish you could explain some actual experiment in the simplest terms possible, by replacing the questions with statements in this table of events:

Initial setup:
angle choice constants: -30, 0, +30
diagram of the whole setup?
number of measurements to perform?
expected result and margin of error?
properties of polarizers P1 and P2?
properties of RNG1 and RNG2?

Event T0: photons L1 and L2 emitted (properties unknown, uncertain, certain?)
Event T1: P1 = RNG1(-30, 0, +30); P2 = RNG2(-30, 0, +30)
Event T2: L1 <interaction> P1; L2 <interaction> P2 (interactions unknown, uncertain, certain?)
Event T3: measurement: what, where, how, why?
The angles in degrees are -120, 0, 120. Photons L1 and L2 must always yield correlated polarizations when measured on the same axis, and measurements on one photon cannot affect the other. The measuring apparatuses can alter their respective particles at the last minute before measurement, provided that the alteration depends exclusively on the particle's initial setting and the settings on its associated detector (not on what happens at the other end), and in such a way that polarization still correlates 100% of the time for same-axis measurements. You can choose any scheme to select the polarities as long as it conforms to these requirements. Random generators RNG1 and RNG2 must be statistically independent and pick each of the three values with equal probability. A typical setup involves two photons being generated from a source such as a radioactive decay, in such a way that their polarizations are related by some scheme such as the simple one I give (i.e. polarized the same way on the same axes, or polarized the same way but on axes rotated by 90 degrees, etc.). The photons are then sent to opposite ends of the lab, separated by large distances, and each passes through a two-channel polarization filter whose alignment is chosen at the last minute by an independent randomized selector. Photons passing through the separate polarization channels enter photodetectors, typically containing arrays of photomultiplier tubes, which convert and amplify their information into measurable electrical signals. The signals from each photodetector then enter a coincidence counter, which identifies the time of arrival for each photon and determines whether they were most likely produced by the same event; if the photon arrival times and other characteristics are consistent with a legitimate event, their correlation or anti-correlation is added to a net tally.
Background noise and other sources of error must be carefully accounted for in the final statistical analysis, but I imagine for your sake you'll want to run a noiseless simulation to simplify the problem.
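To make the three-angle setup concrete, here is a small sketch (Python, my own illustration, not taken from any specific published experiment). It compares the quantum prediction for the match rate when the two polarizer settings differ, cos²(120°) = 0.25, against the best any "preset answer" scheme can do, where each pair carries a fixed pass/block answer for each of the three axes so that same-axis measurements always agree:

```python
import itertools
import math

# Quantum prediction: when the two polarizer settings differ by theta,
# the probability that both photons give the same result is cos^2(theta).
theta = math.radians(120)
qm_match = math.cos(theta) ** 2  # cos(120 deg) = -0.5, so this is 0.25

# Local "preset answer" scheme: each pair carries a fixed answer
# (+1 pass, -1 block) for each of the three axes, identical on both
# sides so that same-axis measurements always correlate 100%.
def match_rate(preset):
    """Fraction of equal outcomes over all pairs of *different* settings."""
    pairs = [(i, j) for i in range(3) for j in range(3) if i != j]
    return sum(preset[i] == preset[j] for i, j in pairs) / len(pairs)

worst = min(match_rate(p) for p in itertools.product((+1, -1), repeat=3))
print(f"QM match rate (different settings): {qm_match:.4f}")  # 0.2500
print(f"Minimum for any preset scheme:      {worst:.4f}")     # 0.3333
```

Since 0.25 < 1/3, no assignment of predetermined answers reproduces the quantum statistics; that is essentially the content of Bell's theorem for this setup, and it is why the argument shifts to loopholes in how real experiments sample and detect the photons.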
Does it matter if instead they are -30, 0, +30? What is the expected result and margin of error? That is, what is a "good" and what is a "wrong" result, and how far apart are they?

Event T0: L1 = 1; L2 = 1
Event T1: P1 = RNG1(A, B, C); P2 = RNG2(A, B, C)
Event T2: L1 = L1 x P1; L2 = L2 x P2

Is that what you said? L1 and L2 stay "correlated" every time RNG1 and RNG2 pick the same angle for polarizers P1 and P2, ok? What about those photons that do not pass through? Even if the initial polarization of both photons is the same relative to their polarizer, it still doesn't mean they will both actually pass through. Right?

Event T3: if L2 == L1 then CORRELATED++
N_MEASURE++
RESULT = CORRELATED / (N_MEASURE / 100)

Ok? That would mean this:

L1 ALWAYS correlated with L2 if P1 == P2
L1 NEVER correlated with L2 if P1 != P2

If "always" and "never" are 100% guaranteed, then the result depends on nothing more than RNG1 and RNG2.
Translation to human language...

Event T0: L1 = 1; L2 = 1
This reads: two photons are emitted with correlated properties. Since we haven't defined anything else, it also means we assume it will be true for every single photon pair, 100% guaranteed. If there are any uncertainties, or if the description is not complete, I'd like you to tell me about it.

Event T1: P1 = RNG1(A, B, C); P2 = RNG2(A, B, C)
This reads: polarizers P1 and P2 are rotated to angle A, B, or C depending on the RNG's choice between the three.

Event T2: L1 = L1 x P1; L2 = L2 x P2
This reads: photon L1 interacts with polarizer P1, and photon L2 interacts with polarizer P2. I used multiplication as the 'interaction function', but it can be expressed in many other ways, including uncertainty percentages.

Event T3: if L2 == L1 then CORRELATED++
N_MEASURE++
RESULT = CORRELATED / (N_MEASURE / 100)
if N_MEASURE < N_REPEAT goto Event T0
This reads: sensors compare photons L1 and L2, and if they are "correlated", increase the counter by one. Then increase the counter that keeps the number of measurements, so we can calculate the percentage of correlations relative to the number of measurements. Then, if not all the measurements have been performed yet, go to Event T0 and repeat the same thing for the next two photons.
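If it helps, the whole event table can be run as written. This is my own rough Python translation, with a deliberately arbitrary placeholder for the Event T2 interaction rule (the entire argument turns on what that rule is allowed to be):

```python
import random

random.seed(1)  # fixed seed so the run is repeatable
ANGLES = ("A", "B", "C")
N_REPEAT = 100_000

correlated = 0
for _ in range(N_REPEAT):
    # Event T0: a photon pair is emitted sharing one hidden property.
    hidden = random.random()
    # Event T1: each RNG independently picks a polarizer setting.
    P1 = random.choice(ANGLES)
    P2 = random.choice(ANGLES)
    # Event T2: each photon interacts with its own polarizer. The rule
    # below is a placeholder: each outcome depends only on the shared
    # hidden value and the local setting (a local deterministic model).
    out1 = (hidden, P1)
    out2 = (hidden, P2)
    # Event T3: compare and tally, exactly as in the pseudocode.
    if out1 == out2:
        correlated += 1

RESULT = correlated / (N_REPEAT / 100)  # percentage of correlated pairs
print(f"RESULT = {RESULT:.1f}%")
```

With this placeholder the outcomes match exactly when P1 == P2, so RESULT lands near 33%, which is your point: with "always" and "never" guaranteed, the tally depends on nothing but RNG1 and RNG2. The interesting question is whether any local rule at Event T2 can reproduce the quantum statistics instead.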
To keep the interesting part of this discussion going, I remind everyone that it isn't necessary to post the details of the experiments used to prove Bell's Theorem. The experiments utilize entanglement, superposition, and decoherence, notwithstanding statements to the contrary. Why is it important to acknowledge that the theories of superposition and decoherence play a role in the conclusions? Because both superposition and decoherence are predictions of quantum physics, and they are the components of the faster-than-light communication between particles that must occur if there is to be any conclusion drawn about whether local reality exists. There is another reason I have been insisting that entanglement and the associated theories of superposition and decoherence are crucial to the experiments, besides the fact that they are a requirement of the FTL outcome. The quantum physics community is not yet settled on the validity of these theories when used to predict FTL communication. There is a debate about whether the measurement problem, or loophole, can be resolved to the extent that FTL communication can be proven. First we have to understand the simple entanglement that creates a pair of particles where, for example, one has spin up and the other has spin down. Simple entanglement without superposition and decoherence does not result in any communication between the pair of particles. When we measure the spin of one of the pair, we automatically learn the spin of the other particle, since we know the spin states have been opposite since the moment of entanglement. This knowledge comes from learning which spin was imparted back at the time of entanglement. Learning the spin of one reveals the spin of the other, and does not involve superposition or decoherence; just opposite spins imparted at the time of entanglement. Each particle of the pair retains its original spin during the period between entanglement and measurement.
That is what I refer to as simple entanglement. There can be no FTL communication between entangled particles unless there is also superposition and decoherence, and so, since FTL is predicted to be possible in the experiments testing Bell's theorem, the entanglement is not the simple form and must include the theories of superposition and decoherence. Otherwise, there is no basis to claim FTL communication between the entangled pair of particles when the spin of one is measured. To understand why they must invoke the theories of superposition and decoherence of the wave function in order to claim FTL communication, we examine what each theory involves. Superposition involves a third state called the "mixed state". When superposition is predicted, entangled particles that have opposite spins imparted to them when they are entangled do not have either spin up or spin down; they have the third alternative, the mixed spin state, with the two spin states in superposition within a wave packet. Given superposition, and examining the theory of decoherence of the wave function, we learn it theoretically occurs when either of the particles with superposed spins has its spin measured. Until the measurement, both particles are in the mixed state, but measurement of a particle in the mixed state collapses the wave function (decoherence). Decoherence of the mixed state results in one of the two possible spins becoming a "reality" at the instant of the collapse of the wave function, and FTL communication occurs at the instant of the measurement. Thus it is theorized that when decoherence occurs, it occurs for both entangled particles simultaneously, over any distance, via faster-than-light communication between the two entangled particles. That theoretical event is called spooky action at a distance.
We now know that the experiments used to test Bell's Theorem must include entanglement of the superposition/decoherence type, and not the simple type of entanglement that allows the hidden variable theory of local reality. Given that acknowledgement, the experiments need not be described in detail, because the theories of entanglement, superposition, and decoherence are the basis of the experiments used to prove Bell's Theorem. They must be invoked if we are to be limited to the options of either accepting that there is no local reality, or that there is FTL communication. It is said that the experiments are improving and getting closer and closer to closing all of the loopholes at the same time, but close and actually there are, in this instance, still worlds apart. Some layman science enthusiasts and some professional quantum physicists seem quite certain that, given enough time, there will be an experiment that closes all of the loopholes at the same time. On the other hand, the belief that there is a fundamental problem with quantum physics as it now stands persists. The experiment that closes all of the loopholes must be performed, not just imagined or predicted, before Bell's theorem can be said to be proven, and such an experiment has not yet been carried out. In the way of thinking of those who side with the Hidden Variables Interpretation (HVI) of QM, such an experiment amounts to putting a square peg in a round hole, i.e. not possible. The interpretations of QM called the Hidden Variables Interpretations (HVI) remain possible, based on the position that the quantum physics used in continuing attempts to close all of the loopholes may itself not be complete. One outstanding issue with closing the loopholes is called the "measurement problem", and it is technical. To me, it means that you are in no position to say the HVI is not a valid interpretation of QM.
The measurement problem has to do with whether theoretical superposition and decoherence can actually be measured. The problem relates to what is referred to as infinite regress in measurement efforts. A simple classical measurement would make things clear, but in quantum mechanics, the act of measurement involves the creation of additional entanglements. Measuring the energy of the photon amounts to describing the result in terms of the effect of the energy of a single photon on the entire system. Each adjacent atom must be measured, and the entire measurement system and apparatus is considered to become entangled in the process. Those favoring HVI will be able to maintain that the possibility exists of an as yet undetected level of order featuring more complex particle mechanics. Maybe one is based on continuous wave action in an aether medium below the current limit of our ability to observe. Thus, an aether cosmology is one type of theory within the hidden variables interpretation that is not yet being tested by the existing experiments. Continuous wave action in an aether medium could be the cause of the infinite regress encountered when trying to solve the measurement problem. I just offer these views to reopen the meaningful part of the discussion about the prospects of eliminating any possible hidden variables interpretation of quantum mechanics. Take it for what it's worth, and add to the constructive side of the discussion if you want.
Firstly, your interpretations of the experiments are incorrect. That the spins anti-correlate, or the polarizations correlate, in these experiments when measured on the same axis is an experimental fact, not an assumption. The photon and electron pairs are generated in such a way that they can always be confirmed to have this property (aside from sources of background noise, which must be carefully accounted for). It doesn't matter what mechanism is generating the correlations, be it superposition, decoherence, or what have you; there's no hidden variable scheme which can account for the results of experiments involving Bell's Theorem, except by exploiting a combination of the detection and locality loopholes briefly discussed earlier. I'll get back to why it's highly implausible for nature to exploit those two potential loopholes in such a way as to always produce the known experimental results, and why one can therefore say that existing experiments strongly disagree with the local hidden variables interpretation, but I wanted to take a short break.
Welcome back. Ok, I'll try to be patient and see what the experiments entail. This is the position of professionals and laymen alike who believe that there is a level of complexity to the universe below the level of mechanics that we can observe. It means that instead of particles that are described as "fundamental" in the Standard Model, which have no internal composition, particles are much more complex, and are made of "packets" or "complex standing wave patterns" in the medium of space. Such particles are often treated as points mathematically in equations quantifying scientific experiments, but to get the math right, the particles would need to be quantified as complex standing wave patterns. They would have to be considered to have an internal composition made up of multiple points of high energy density wave intersections within their individual standing wave patterns. Such particles are hypothesized that way because they are intended to have and display wave-particle duality at all times. They are continuous entities composed of complex wave energy patterns, and their internal high density spots make them appear as particles and not waves when they are measured in certain ways, and as waves and not particles when they are measured in other ways, according to my hobby-model. You're going to get different results unless you base the probabilities on how these wave-particles move and interact as multiple points of high energy in the aether medium of space. This reference to the mathematics behind Bell's Theorem is correct, and I have not taken exception to the math, or to the fact that tests using the postulates of quantum mechanics bear out the theorem. I have been patiently waiting to present my case until you finish your presentation, but now that you are back on that track, I'll start explaining my position, as stated in my hobby-model, since the hidden variables "theory" is not testable using current QM postulates.
Reference to the model as a hobby is explained often, and needs to be mentioned again here as a disclaimer, because you have misrepresented my claims. By "hobby", I am saying that I am a layman science enthusiast, and have been evolving my hobby-model over the many years that I have been interested in the cosmology of the universe. It is a hobby; I'm not a professional, I am not "doing" science, and I'm not trying to say that any of it is supposed to be "my theory"; it is speculation and hypotheses to answer the as yet unanswered questions in cosmology, to my personal satisfaction, and not yours. I know I get under your skin for saying the following, but it is my intention to try to motivate members who are interested in cosmology to discuss the problems and questions that are part of the current circumstances in the cosmological community, so I claim: "My hobby-model is internally consistent to the best of my ability, and it is not inconsistent with known scientific observations and data, stipulating that those observations are understood and explained with the mechanics that they operate by." Generally everyone agrees that faster than light (FTL) communication is accommodated by quantum physics; it is part of the "spooky action" that characterizes QM in its incompleteness as it currently stands. The theories that support the prediction that FTL is possible are those of superposition and decoherence that accompany quantum entanglement, and they do not explain the mechanics of how it works, just as GR does not explain the mechanics of gravity. The Hidden Variables Interpretations (HVI) of Quantum Mechanics are held by people who think that there is a local explanation that will resolve the paradox between the predicted FTL and "local reality". That local explanation in my hobby-model centers around predicted but unobservable details.
"Particles" are described as complex and ever-changing wave energy patterns that give the impression of randomness because of the almost infinite possible orientations, shapes, and imprecise locations of such wave-particles, and because of the ever-changing pattern and relative density of the high energy density spots within them. Each particle is a complex wave energy pattern, and the extreme and varying wave densities within those patterns affect the speed of the waves entering, moving through, and exiting them. The particles are composed of two components: inflowing gravitational wave energy arriving through the medium of space from the surrounding particles and objects, and outflowing wave energy spreading spherically from particles and objects into and traversing the surrounding medium of space. The medium of space is filled with the gravitational wave energy transiting the space between particles and objects. Particles move in the direction of the net highest surrounding inflow of wave energy density. The inflowing directional waves are converted by the internal mechanics of the wave-particles into spherically outflowing wave energy, which becomes the directional inflowing wave energy that determines the motion of surrounding wave-particles. It is as simple as that in my hobby-model.
The response to my last post ... oh wait ... there was no response, meaning no one has seen fit yet to challenge a single thing in it. It could be argued against on many levels, by anyone who has a position on any of the topics it addressed; but I get a pass? And it's not just that I get a pass here, I am getting a pass on most of my posts about cosmology, about my hobby, and about my alternative ideas. It is possible that everyone is now of the opinion that I evolve my hobby-model with the intention that nothing in it can be falsified. That is not me being antagonistic, because everything that is "alternative" about my ideas stems from the well-acknowledged fact that there are things that we (meaning the scientific community) do not yet have answers for, and don't have the ability to observe or test. Are people agreeing that every mainstream theory that I imply is inconsistent, incompatible, or incomplete is in fact inconsistent, incompatible, and/or incomplete? Maybe so. Maybe it is now a fact that the sanctity of the consensus in science is not the same as the idea that a consensus theory is fact. I focus where there are significant unknowns. I am actively familiarizing myself with the problems that the scientific community faces in advancing the body of knowledge, and have kept abreast of our ability to observe, but new things go on all the time, and one person cannot be aware of even a fraction of it, and so we are all in a state of continual learning if we try to keep up. In my view, observation is the greatest limiting factor in the advancement of science. Progress at observing "reality" in both the micro and the macro realms is slow and expensive, and the payback is losing some of its glitz as our economies falter, and as our accomplishments outpace the average person's ability or inclination to understand.
Let me struggle on with this, and maybe someone will help. I have been maintaining that a description of the experiments, not yet given, isn't necessary if we acknowledge that they include quantum entanglement of particle states, stipulating that quantum entanglement involves superposition of states and decoherence, or collapse of the wave function, when one particle of an entangled pair is observed to determine its state. My reason for bypassing the need for the details of the experiments was based on my acceptance of the math being conveyed. I agree that the math represents real outcome probabilities that differ from the predictions that would be observed in the experiments if the nature of reality defined by QM were correct, but I also maintain that what are being called hidden variables in the presentation are part of a hidden variables theory derived from an incomplete understanding of reality. I'm saying that quantum mechanics as we know it might be incomplete, and that is a position shared by many. Why not say what you think about FTL communication? QM, as interpreted by the generally accepted Copenhagen Interpretation, allows for the possibility of faster-than-light communication between particles as an explanation for the experimental outcomes. That FTL communication takes place instantly when the wave function collapses, according to the theories associated with quantum entanglement. Those who hold the Hidden Variables Interpretation (HVI) of QM maintain that the conclusion that there could be FTL communication involved, under any circumstances, is not understood or supported by any mechanics of how it could occur, and so it is theory at best. I interpret their position to say that the Copenhagen Interpretation of QM is incomplete, and that the theories of superposition and decoherence do not represent the reality of what is taking place in the wave-particle "packet" prior to or between observations.
The reality in question is the nature of the particles and their states between entanglement and measurement, and the HVI position is that particles, their location, and their momentum aren't properly understood; reality must be different than is described by QM as it stands. One HVI conclusion is that the determinism associated with the location and momentum of particles and particle states involves, and is taking place in, a deeper level of reality that we don't yet understand and can't observe. That would imply a level of quantum action that takes place on a scale too small for us to observe, and entails mechanics that go on all the time, continuously, in a realm that we cannot detect. When we make a measurement, we force a state to be observed without understanding the mechanics that produce that state from the underlying continuous action taking place prior to the measurement. Those who hold the HVI view are saying that some such scenario, not yet discovered, is more likely than FTL communication between particles. But take your pick.
Particles seem to be best described as wave-particles because they display wave characteristics under certain measurement conditions, and particle characteristics under other measurement conditions. The theory of wave-particle duality is a paradox, not yet understood. I think it might be the consensus that the wave-particle duality paradox is closely associated with the paradox presented by superposition of states of a particle, i.e. how a particle can have its states in superposition when not being observed, and yet display a specific state when measured. My understanding is that when no measurement is being taken, the nature of the particle in QM is theorized to be a physical wave packet that obeys the laws of quantum mechanics. Randomness and probability are elements of those laws, and every measurement that establishes the location of the particle to any great degree loses some degree of certainty as to the momentum of the particle being located. It is under the conditions of superposition that FTL communication is allowed under the Copenhagen Interpretation. QM, in the strictest Copenhagen interpretation, even brings into question whether the particle can be said to have a specific combination of quantifiable momentum and quantifiable location at any time, and allows even the possible non-existence of the particle's presence while it is not being observed. (That would be the case where not only is the cat neither dead nor alive, but it isn't even in the box until you look.) From that, until I hear otherwise, my understanding is that FTL communication is predicted to occur only under the Copenhagen Interpretation, and only at the instant that the particle is observed; but further, maybe it is possible only if the particles do not exist locally as a physical wave packet, i.e. only under the strictest Copenhagen interpretation.
That would say the energy comprising the particle that will subsequently be observed is dispersed into the wider physical space, some sort of energy continuum maybe, and only takes on a local presence (location and momentum) of its own when a measurement is made. That would be a spooky explanation that is considered possible under the strictest Copenhagen Interpretation, but not possible under the Hidden Variables Interpretations that are based on local determinism. Either way, we are far from having the final answer.
Having a free hand to post without opposing argument is usually rare on a science forum, and yet, here we are. Many find it seemingly impossible for FTL communications to occur, but FTL is considered possible by those who hold to the strict Copenhagen interpretation of QM. So let's discuss a few things that quantum spookiness has going for it under the Copenhagen interpretation. I would love it if someone would bother to correct my misconceptions about the following:
1) Wave-particle duality seems to best describe the nature of particles based on observations, but we can't explain it with any degree of confidence or consensus. It seems that at the instant of measurement, the wave-particle can appear as either a wave or a particle, instantaneously.
2) When a particle interaction occurs, like a photon striking a light-sensitive atom, the photon ceases to be a wave-particle and becomes a black spot on the film, and it is described as an instantaneous absorption or transfer of energy.
3) When an electron is energized and jumps to a higher level, it can be described as an instantaneous jump in the energy of the electron.
4) There is experimental evidence of the possibility of FTL communication between particles, and it seemingly must be accepted if one wants to hold on to any sort of local reality or determinism.
5) A particle seems to be able to go through two slits at the same time.
6) There are some delayed choice experiments that imply that in QM, it is possible to alter the past, without explanation.
7) According to the uncertainty principle, all of our measurements have a degree of uncertainty or fuzziness.
8) The very idea that individual events have causes is missing, and QM does not provide explanations of events. In QM it is said we can only know half of everything, alluding to the location/momentum uncertainty.
9) It is said that if there is an objective reality, then QM is incomplete.
10) It is said that QM might, in fact, be complete, and that there is no evidence of an objective reality.
I know that some of these items are saying the same thing using different circumstances, but this is a partial list of spooky things, unexplained, that are observed or are predicted to be possible in QM. Please add to the "spooky" list if you have some things that qualify. The question is: is the transfer of energy in particle interactions instantaneous, and if so, is that a local occurrence of the same phenomenon as the faster-than-light communication that is considered possible at a distance in QM? I refer back to the last post where I mused: "That would say the energy comprising the particle that will subsequently be observed is dispersed into the wider physical space, ...". It could be added that the "wider physical space" would seem to be connected in a way that all parts share the same moment in time, and any point can become instantly "aware" of all events anywhere in the connected space. That could then allow the distant entangled particle to know when to collapse its wave function :shrug:. That would be spooky, but it is just a speculative scenario of what a "reality" that complies with the strict Copenhagen interpretation might be like. Some people might prefer something like that to the alternative, "that the apparent randomness of nature in experiments is due to certain variables being hidden from measurement, in such a way that they're either impossible for a human observer to measure or beyond our present technological capabilities". I consider the latter more probable, but that is a subjective opinion. Edit: I'll add this link. It is from 2005, but does a good job of covering the topic up to that point. http://www.sciencemag.org/content/309/5731/98.full
Let's go back and look at the setup. We have a steady stream of entangled photon pairs at random angles, and by entangled we mean that the photons in each pair are polarized along the same axis. 1) Then you say that they are polarized at the same angle or at the opposite angle. Is this an intentional randomness, using a source that can entangle them either at the same angle or at opposite angles, with the choice between the two varying randomly? 2) Or are you saying that you have a source that produces photons entangled along the same axis, but there is no way to tell whether they are entangled at the same angle or at opposite angles at the point entanglement is established? Do you understand the question, and if so, which is it, 1 or 2? And does being entangled at the same angle versus the opposite angle make any difference as the experiment progresses?
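For what it's worth, in the standard QM treatment the difference between "same angle" and "opposite angle" pairs shows up only as a swap between cos² and sin² in the predicted coincidence rate. A toy Python sketch (my own illustration of the textbook prediction, not a description of any particular experiment):

```python
import math

def coincidence_prob(a, b, same_angle=True):
    """Probability that both photons of an entangled pair pass polarizers
    set at angles a and b (radians), per the standard QM prediction:
    parallel-polarized pairs -> (1/2)cos^2(a-b),
    orthogonally-polarized pairs -> (1/2)sin^2(a-b)."""
    d = a - b
    return 0.5 * math.cos(d) ** 2 if same_angle else 0.5 * math.sin(d) ** 2

# With both analyzers at the same angle, a parallel pair either both pass
# or both block (coincidence probability 1/2), while an orthogonal pair
# never coincides:
print(coincidence_prob(0.0, 0.0, same_angle=True))   # 0.5
print(coincidence_prob(0.0, 0.0, same_angle=False))  # 0.0
```

So as long as the experimenter knows which kind of source is in use, the two cases carry the same information and the analysis proceeds the same way with a sign convention flipped.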
I noticed CptBork come and go elsewhere in the forum, ignoring his own thread :shrug:. I guess he still thinks he has a case that falsifies my claim that my hobby-model is not inconsistent with proven scientific observations. All I can do is keep trying to make my case, but we all know that he hasn't made his, and now he is taking time off from talking about it. Come back; let's see what the scientific community really knows, and what is still unproven and unexplained theory. What would reality have to be like if QM is complete as it stands?
Like I said, I'm not in a rush; my pace is proportional to the level of interest I see in you or others actually handling criticisms of your own ideas. We can start with the detection loophole if you want. Does it seem plausible to you that various detectors with different designs and materials should all either detect virtually every photon that passes through them with no bias against correlated pairs, or else, if they're not nearly 100% efficient, that they must all skew the number of measured correlations by the exact same significantly biased amount relative to the other particles they detect?
I notice you seem to be asking additional questions; give me a little time to catch up and I'll see if there are some questions on your part that I can answer.
Yes, it seems possible that every correlation could theoretically be detected. Will you stipulate that the nature of the photon might not be fully understood?
The photon is extremely well understood, probably better than any other fundamental particle. However, it's possible that there's some previously undiscovered property which allows it to bypass detectors under certain circumstances. That's not the end of the issue, however. We know there are photon detectors with near 100% efficiency, which means one can't explain the Bell test results in those cases by arguing that correlated photon pairs are detected less efficiently than anticorrelated ones. However, all experiments conducted with high-efficiency detectors have thus far neglected to close the locality loophole at the same time, which should hopefully be accomplished within a year or less. On the other hand, other experiments have been performed which do close the locality loophole, but such long-range experiments haven't yet reached high detection efficiencies, leaving the detection loophole open instead. So if you want to cling to the detection loophole as an explanation for experimental Bell inequality violations, you need to stipulate that there are basically two kinds of detectors: detectors with near 100% efficiency that don't bias against the detection of correlated photon pairs, and inefficient detectors, of any design and material structure of the types used in the nonlocality tests, that must all somehow bias against correlated pairs by the exact same amount relative to their various overall efficiencies. Agreed?
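To make the detection-loophole stakes concrete, here is a small Python sketch (my own, not drawn from any particular experiment) of the standard CHSH numbers: the QM prediction of 2√2 exceeds the local-hidden-variable bound of 2, and the often-quoted efficiency threshold of roughly 82.8% is the level below which a detection-biased local model could in principle mimic the violation:

```python
import math

def E(a, b):
    """QM correlation for polarization-entangled photon pairs (parallel
    polarizations), with analyzers at angles a and b in degrees:
    E = cos(2(a - b))."""
    return math.cos(math.radians(2 * (a - b)))

# CHSH combination at the standard optimal analyzer settings
a, a2, b, b2 = 0.0, 45.0, 22.5, 67.5
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(round(S, 3))  # 2.828, i.e. 2*sqrt(2), above the local bound of 2

# Commonly quoted detection-efficiency threshold for a CHSH test with
# maximally entangled states: below eta = 2/(1 + sqrt(2)), undetected
# pairs leave room for a local model to reproduce S = 2*sqrt(2).
eta_threshold = 2 / (1 + math.sqrt(2))
print(round(eta_threshold, 3))  # 0.828
```

That threshold is why the near-100%-efficiency experiments matter: once detection is efficient enough, the "missing pairs" explanation no longer has the arithmetic room to work.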
I have set the video to start at the pertinent position (1:12:00 if that does not work). Watch the whole video for more info, or better yet the whole series of videos. [video=youtube;awpnsGl08bc]https://www.youtube.com/watch?feature=player_detailpage&v=awpnsGl08bc#t=4408[/video]