Entanglement and Reality (Interpretations of Quantum Physics)

Posted: March 12, 2019 at 7:14 pm

The following is a direct copy-paste from my notes. I have had some time to reflect on this, and I’m strongly leaning towards Karen Barad’s Agential Realist Interpretation, which I will post about after doing some more reading.

Bell’s Theorem and the EPR Paradox

Quantum entanglement means that one of these things is very different from what we tend to accept:

  • Locality: distance has no meaning in some cases.
  • Realism: reality does not exist without observation; giving up “counterfactual definiteness” means accepting the coexistence of everything possible.
  • Freedom of choice: the universe and all of our actions are deterministic.

“namely (i) reality (that microscopic objects have real properties determining the outcomes of quantum mechanical measurements), and (ii) locality (that reality in one location is not influenced by measurements performed simultaneously at a distant location).” (Wikipedia)
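To make the tension between those two assumptions concrete, here is a small numeric sketch (my own illustration, not from the notes): the CHSH form of Bell’s inequality bounds any local-realist theory at |S| ≤ 2, while quantum mechanics, using the singlet-state correlation E(a, b) = −cos(a − b), predicts |S| = 2√2 at the standard measurement angles.

```python
import math

def E(a, b):
    # Quantum-mechanical correlation for the spin-singlet state
    # when detectors are set at angles a and b (radians).
    return -math.cos(a - b)

# Standard CHSH measurement angles
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination of the four correlations
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print(abs(S))  # 2*sqrt(2) ≈ 2.828, exceeding the local-realist bound of 2
```

Since 2√2 > 2, and experiments agree with the quantum prediction, at least one of the assumptions (reality, locality) has to go.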


Bell’s Theorem and Reality

Posted: March 12, 2019 at 6:48 pm

In one of my earlier posts I mentioned Bell’s Theorem, and I’ve been spending some time reading and thinking about this in relation to the EPR Paradox. That splintered off into many interesting and different directions for further reading and consideration, listed below. My interest in this area connects with my epistemological inquiry, and the idea of objects and subjects as being mutually constructive.

  • EPR Paradox
  • Counterfactual Definiteness
  • Observation
  • Quantum Decoherence
  • Free Will
  • The Copenhagen Interpretation
  • De Broglie-Bohm Theory
  • The Transactional Interpretation
  • Quantum Bayesianism

Update: I wrote this months ago and have not had a chance to return to it until now. After a week of seminars with Karen Barad, I’m inspired to return to this tough work, but it’ll take a few posts to get from where I was at the time of writing the above to where I am now.


Face Detection & Non-defective Display!

Posted: March 12, 2019 at 10:29 am

Last week I managed to get facial-tracking code working on the Jetson. It’s using the old CUDA-based Haar-feature method, but it seems to be running fast enough and working well enough. I did notice that (a) it’s a little noisy (i.e. the detection of a face sometimes oscillates over time) and (b) the plant behind me (as seen above the mouse in the image above) was occasionally recognized as a face. This is good enough for this stage, and hopefully I won’t need to train my own classifier to improve things. Later this week I’ll integrate the ‘painting’ rendering code and see how the experience feels in terms of changing only when no one is looking.
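One cheap way to damp that frame-to-frame oscillation, without retraining anything, is a majority vote over the last few frames. This is just a sketch of the idea (the names and parameters here are mine, not from the project code), assuming the detector’s per-frame output can be reduced to a boolean:

```python
from collections import deque

def make_debouncer(window=5, threshold=3):
    """Return an update function that reports a face only when at least
    `threshold` of the last `window` raw detections were positive."""
    history = deque(maxlen=window)

    def update(raw_detected):
        history.append(bool(raw_detected))
        return sum(history) >= threshold

    return update

# Simulated noisy per-frame detector output: a face is present but flickers.
frames = [True, False, True, True, False, True, True, True, False, True]
update = make_debouncer()
smoothed = [update(f) for f in frames]
print(smoothed)  # single dropped frames no longer toggle the state
```

Larger `window` values give a steadier signal at the cost of more lag before a face is acknowledged or released, which may actually suit a piece that should only change when no one is looking.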

The third EIZO display arrived last week, and it does not have any noticeable “pressure marks”. I suppose the first two were just bad luck, and I hope for better when I order the second unit.