Chaos Manor View, Wednesday, August 26, 2015
Very hot today in Los Angeles. Pounding away on fiction, but it’s not easy. Typing continues difficult. I have dozens of suggestions regarding Dragon, and one day I’ll implement them and try, but the weather and time pressure both argue against trying a whole new way to “write”; the last time I tried dictation it was a flop; I got so concerned with the way I was composing sentences, and waiting for them to appear on the screen, that after a while I was worrying more about the writing details than about what I dictated. Of course that’s something of what is happening now.
As it happens, yesterday over in another conference (SFWA) where I spend too much time, a new member asked for advice on career management. I answered:
I don’t think there has ever been better advice given than that of Mr. Heinlein:
To be a writer you must write.
I will add, until you are established as a writer, you would do well not to spend a lot of time talking about writing or listening to others talk about writing in the hopes that you will learn some secret formulae. You won’t. Randall Garrett was fond of saying he knew no professional writers who got there through workshops or discussing writing with other beginners. I do, but not many.
To be a writer, you must finish what you write.
I will add that there is something sadly amusing about the “writer” who always has an unfinished manuscript to inflict on his friends.
Do not rewrite unless instructed to do so by someone who is going to buy it.
This was probably the most controversial, and most badly misunderstood, of Heinlein’s dicta. He did not mean write a first draft and never rewrite; he meant that the rewrite is part of finishing and it should be done and over. Don’t rewrite finished work. You will do much better to work on something new.
Send your work to someone who can buy it, and start on something else. Keep that up. Keep writing, finishing, and sending to editors.
Basically that’s it.
The magic is in doing the writing. For storytellers it takes a while to make writing automatic so you can concentrate on the story, not on how you tell it.
And there are nine and sixty ways of constructing tribal lays…
As to career management, it used to be that you sold to the magazines, got the cover, graduated to a novel, etc. Now there are alternatives, many discussed here. But before you manage a career in writing you have to write, and the best way to learn that is to write, finish what you write, send it to someone who can buy it, and don’t rewrite unless someone who will buy it tells you to. Obviously there are stories that if rewritten can be made better, but a better investment is to do a new story. Then another. Then one more. Finishing each.
After a while the writing comes easier and you can concentrate on what you want to say, not on how to say it.
Of course you may be well past needing that advice.
Jerry Pournelle
The point being that if you have to think about what you are doing, rather than on what you are trying to say, you have a severe handicap; and that’s what I am trying to overcome. I’m getting there but it’s slower than I like. But then it took longer than I like just to feed myself…
For some reason I cannot fathom, the Word grammar program does not like the first sentence in that paragraph. I give up on why it thinks it is bad grammar. If it be, then so be it. Oh. I see. It wants a proper verb. Ah well, it’s clear enough.
Anyway we will continue the discussion of philosophy of science. To summarize my views, which are derived entirely from Sir Karl Popper and St. Thomas Aquinas:
Science has become a very useful way of discovering truth about the world. To most of the world, “reason” and “science” are essentially synonymous.
Science has strict rules. The most fundamental rule is that no theorem or hypothesis is scientific if it cannot be falsified. It does not mean that “I saw a man who wasn’t there” cannot be true, but it is not a scientific truth because there is no conceivable way to falsify it.
We may act as if scientific theories (those which can be falsified) were true, but always with the understanding that they may someday be falsified.
This can lead to conflicts of theories, and sometimes does. An example is the late Petr Beckmann’s theory of entrained aether, as opposed to Einstein’s Theory of Relativity; they both, as I understand it, “explain” all the relevant data, and where they make different predictions, falsification of either requires experiments we cannot perform. That leads to wildly different possibilities, but we cannot choose among them given the present state of observations. There is an overwhelming consensus in favor of Einstein, but there is no crucial experiment to choose between them at this time.
When conflicting theories lead reasonably to disparate courses of action the situation becomes critical, in particular if the different actions have high cost; this is the situation in which we find ourselves regarding global warming, with the added problem that there are mutual assertions of falsifications of the different theories, as well as conflicting claims of the validity of certain evidence.
Some statements may be true, but are not scientific because there is no way to falsify them. My prediction that unrestricted capitalism will lead to the sale of human flesh in the market place is “scientific” in that it could be falsified, but it also rests on the non-scientific assumption that the sale of human flesh – or baby parts – is not morally acceptable. “Ethicists” and religious leaders may or may not agree on that assumption, but their disagreements cannot be settled by any scientific process I am aware of. At some point you are faced with “good” and “evil”, and it is meaningless to say that good is better than evil because good’s gooder. There are those (I am among them) who say that certain morality systems lead to a “better” way of life than others, and there are many examples, but this is not science; it is one reason why education needs to include the liberal arts, but that goes far afield of this discussion.
Regarding philosophy of science
Jerry,
Just now catching up on the latest blog post. Last couple days were busy writing/recording/editing the weekly Osborn Cosmic Weather Report. So I want to respond to some talking points.
1) Astronomers certainly did NOT pounce upon Doppler shift uncritically, after Hubble’s discovery — more like throwing a firecracker into an ant’s nest. I didn’t go into the details of the history because I could have written a book about it. Many books HAVE been written about it. And like it or not, the bulk of the demonstrable evidence that we have today lands on the side of large-scale expansion. Note I said LARGE-SCALE. It’s long been known that localized inhomogeneities were required even to develop the galaxies we see, let alone clusters, superclusters, and the other structures we’re still discovering, like “walls” and “bubbles.” So this is no new thing. I will say that we’re still working on how it all came about, but we know it did, because we see the results.
Hubble’s discovery and subsequent others produced an uproar in the community, with huge infighting about the validity of the results between the “Steady-State-ers” and the “Expansionist Universe-ers.” This is in fact a close parallel to a similar and more or less concurrent, long-running controversy in geology between the concepts of uniformitarianism and catastrophism, where uniformitarianism can be likened to steady state and catastrophism to big bang/expansion. Geologists now think that the reality seems to be a blending of the two, a kind of uniformitarianism punctuated by episodes of catastrophism; is it then so surprising that cosmology is proving to be the same?
Moreover, in no wise are astronomers/cosmologists/astrophysicists favoring a particular model over another, as evidenced by the large number of theories/models that are put forward. (My friend, physicist Dr. W, and I have discussed the whole “dark matter/dark energy” concepts several times; neither of us is disposed to care for either one, and both of us are inclined to think they will eventually be disproven. But right now they do seem to explain observations.) My entire point was that these things are indeed being considered, but just because we seem to find a data point that is in conflict with current theory does not mean we automatically throw out the baby with the bath water and start over from scratch.
Also note that I am not saying that any theories would be “knocked out if new theories were accepted.” Obviously Newtonian physics was not “knocked out” by relativity theories, nor quantum mechanics, nor any of the rest. In fact what we find is that Newtonian physics is what the others reduce to in the everyday world. Quantum mechanics devolves to Newtonian physics as the scale increases from subatomic to macro world. Relativity devolves into Newtonian physics at increasingly lower sublight speeds. Et cetera. This is what a proper “new theory” SHOULD do — reduce to the established, observable ways/models when “ordinary world” initial conditions are plugged in. What is happening, however, is that this thrust experiment is contradicting the “ordinary world” model, which has been demonstrably proven correct over centuries (and arguably millennia) of observation. And THAT is what experienced scientists take issue with.
2) I think some may be confusing the difference between the universe and the models we have of the universe. When new, unexplained data is discovered, obviously this is coming FROM the universe, and it is the MODELS that must be adjusted to try to see if the new data can be explained. It isn’t that we’re trying to shoehorn the universe to fit our theories. We are looking to see if this new evidence has uncovered something that needs to be added, something we didn’t know about before. It is a MODIFICATION of our theories/models, not changing the universe, that is occurring. This usually requires several iterations, and not infrequently does in fact require the model to be reduced to its basic components and rebuilt, or occasionally thrown out altogether and replaced.
Think about it like this: You want to race cars, and you want to win. You’re on a budget constrained by other factors — house payment, credit card payments, food bill, kid in college, etc. So which is easier and more economical, which fits into your budget better: Take the stock car already in your garage and modify it to juice it up, or throw out the stock car and start building an Indy race car from the pavement up? You’re going to start with your stock car and modify it, then you’re going to race it and see if you win. If you don’t win, you keep modifying the stock car until you’ve reached the limits of what the frame will handle. If you’re still not winning, you scrap the stock car and start work on an Indy car design.
In this analogy, your budget constraints are the body of existing observations. Your stock car is existing science and its models. Winning in this case means your model correctly predicts the observations; juicing up the stock car represents the modifications to existing theory you have to make to try to predict the data. The Indy car design is when you can’t get existing theory to match observation, so you scrap the theory and construct another. But you still have those budget constraints! The new model has to accurately predict, not just the new observations, but all the old ones too. It has to be “drivable on the road,” as it were. Sort of like a Transformer that goes from Indy car to your mom’s sedan and back.
3) String theories: there are in fact five basic string theories. (And while I’m about it, let me point out that there is a difference between a cosmic string and a superstring. Here I refer to superstrings.) Each theory was developed by a different researcher or group of researchers, and each one accurately predicts some of the observable data — but no one superstring theory predicts ALL of the observable data. Nor, so far, can they be made to do so.
This is a case where the scientists dropped back and punted. It wasn’t exactly that they scrapped the stock car, but they definitely were pulling Indy car concepts into the modifications! (To continue my racecar analogy, I’d say they kept the frame but put in a new engine and more aerodynamic body.)
Unable to get their superstring models to wrap around the whole problem, they made a fundamental realization that relates back to that “new theories should reduce to the older forms” comment I made earlier: They realized it was very likely that the five different superstring theories were actually special cases of an overarching theory. So they instead created a new theory/model, called M Theory. And this, so far, DOES accurately predict all of the observable data, though again it may possibly not be the simplest way to do so; Occam’s Razor and all. But it’s the best we’ve come up with so far.
(This is a case where Dr. W might be more up on the latest developments than I am, since it falls more into the realm of particle/quantum physics in which he specializes than the astronomy/astrophysics in which I specialized. I did study M theory and the related stuff in order to write both Extraction Point with Travis S. Taylor, and my Displaced Detective series. And I’ve tried to stay up on what’s going on with the theory — I get asked about it a lot at SF cons. I don’t claim to be an expert in M theory by any means.)
4) Quasars: given that, in recent years, we’ve been able to image the distant galaxies in which quasars are embedded, and we have been able to generate models of the mechanism that predict observational data, it’s going to be rather hard to argue away the notion that they are indeed embedded in galaxies.
As for proper motion, that is still in debate. Proper motion is not, contrary to what you might think, immediately obvious to the observer, especially when we are looking at extragalactic objects. Why? It’s complicated — because the Earth is making a truly spectacular gyration through the universe: it is spinning on its axis, revolving around the Sun, following the Sun in its orbit about the galactic center, and moving with the galaxy as it orbits the center of mass of the local cluster, which is in turn orbiting the center of mass of the local supercluster, which is experiencing linear motion through the universe…and then there is precessional motion of all of that, and more. All those motions have to be determined as accurately as possible, and then SUBTRACTED FROM THE MEASUREMENTS of the apparent proper motion of any given object. Only then can we say that the object MAY be experiencing true proper motion.
Current studies of quasar proper motion seem to indicate that there is an inadvertent systematic error in the reduced measurements (as well as a couple of other things occurring within individual quasars) that, if corrected properly, will remove most if not all of the purported proper motion. Or to put it more simply, we may have an error in our estimate of the motions we ourselves are making, which causes an apparent motion of the studied objects when there really is little or none. The jury is still out on that, but legitimate research is ongoing.
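As a toy illustration of the bookkeeping described above (my own sketch with entirely hypothetical numbers, not real astrometry), the reduction amounts to subtracting every estimated component of our own motion from the measured apparent motion; any error in those estimates shows up as spurious "proper motion" in the residual:

```python
# Hypothetical figures for illustration only: apparent angular motion of a
# distant object, and the components contributed by our own motions
# (rotation, orbit, galactic motion, cluster motion, ...), all in
# milliarcseconds per year, expressed as (RA, Dec) pairs.

apparent = (1.20, -0.45)  # measured apparent motion (made up)

# Estimated contributions from our own motion, to be subtracted:
our_motions = [
    (0.90, -0.30),   # solar/galactic reflex (hypothetical value)
    (0.25, -0.10),   # cluster-scale motion (hypothetical value)
]

residual = list(apparent)
for dra, ddec in our_motions:
    residual[0] -= dra
    residual[1] -= ddec

# What is left over MAY be true proper motion -- or may simply reflect an
# error in one of the subtracted estimates.
print(residual)
```

The point of the sketch is that the residual is tiny compared with the terms being subtracted, so a small systematic error in any one estimate can masquerade as real motion.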
Also consider that we currently have a nice spectrum of galaxy “types” or morphologies, ranging from “ordinary,” to interacting, to Seyfert/BL Lacertae/radio galaxies, to quasars. These in general range nicely from nearby, to a little farther out, to pretty far out, to way the hell over there. There’s a whole lot of observational evidence that quasars are embedded in galaxies and that they have a lifetime that takes them through several morphologies; that’s going to be hard to disprove. Note I didn’t say impossible. There are arguments for other kinds of Doppler shifting, such as relativistic gravitational. But, “With great power comes great resp–” no, sorry, wrong quote. “Extraordinary claims require extraordinary evidence.”
Now, somebody reading all this is bound to be thinking that I’m just one of these “accepted science” conspirators who are trying to stifle anything new. Not so. I have a brain, I use it most days, and I am trained to be a skeptic. (Blonde hair notwithstanding.) If I were into “accepted science” then I would not be posting guest blogs like this:
No, I sit down and look at the data in the light of what I know. I look at the models and decide if they make sense, or if they are off in the weeds someplace. If it all lines up, and if I can take the data, feed it into the model, and predict more data, and that prediction is demonstrably correct by collecting the additional data, then I conclude that the model is correct insofar as we understand the science to this point in time. If it does not, I conclude that the model is wrong, and possibly the theory behind it as well, depending on whether I can determine if it was just a poorly-constructed model or if the problem with it is more fundamental.
This is not simply going along for the ride because someone else says so. And this is the way science is supposed to work. Does it always work like this? No, it doesn’t. Because scientists are human too, and we can get hidebound and attached to our pet theories. (Go read up on William Thomson, Lord Kelvin’s successes, as well as his failed predictions, if you don’t believe me. And he was as “established” as they come.) But it does so more often than not, and especially in my chosen fields, I’m pleased to say.
Stephanie Osborn
“The Interstellar Woman of Mystery”
http://www.Stephanie-Osborn.com
It is clear that we are at the edge of observational accuracy, and possibly many statements which appear to be falsifiable are in fact not so with present equipment. It would not be the first time.
And I will repeat my own view: the extraordinary claim of reactionless drive needs considerable evidence that it exists, since it falsifies a fundamental principle of Newtonian physics, as well as being incompatible with Relativity.
More on Beckmann and Einstein
<<Jerry P I commend to you Petr Beckmann and his Einstein Plus Two…>>
And I commend to you Tom Bethell’s book, Questioning Einstein: Is Relativity Necessary? (2009), explicating Beckmann’s theory, and putting it into the whole historical context of the development and testing of relativity theory, and the wider and continuing question of the nature and existence of the “ether”.
Bethell has been a contributing columnist and/or editor of National Review, The American Spectator, Harper’s, and other intellectual periodicals, and he is a Hoover fellow. He specializes in whistle-blowing on politically correct orthodoxies, so of course he is persona non grata with the elite establishment, which in my book is one of his strongest credentials.
Among the many thoughtful and trenchant pieces of his I’ve clipped and saved was a two-part swipe (in the June and July/August issues of The American Spectator) at the cancer research mafia that has deflected so many tens of billions of dollars of taxpayer money into unproductive reinforcement of the established paradigms that retroviruses (and now faulty genes) cause cancer, while shunting aside the fact that virtually all solid tumors consist of cells containing more than the normal two copies of each chromosome. This phenomenon is called aneuploidy, and it has been known since the 1960s, yet practically no research has been done on the replication errors that must lie at the heart of it. How many lives have been cut short and/or blighted because of this waste of funds and scientific talent?
Bethell worked closely with Beckmann and with his colleague and collaborator, physicist Howard Hayden (who wrote the introduction to Bethell’s book), and Bethell did extensive research of his own, drawing on papers of Einstein that have only recently become available, and also on the papers of Nobel Laureate Albert Michelson, who designed the interferometer used in the classic Michelson-Morley experiment. Michelson went on to design and conduct the Michelson-Gale experiment in 1924, which conclusively established that there was indeed an ether – a gravitational ether detectable against the earth’s rotation – a finding that has been partially replicated in passing by the Brillet-Hall experiments of 1979, which ironically were focused on finding the same kind of ether (detectable against the frame of the earth’s orbital motion) that the Michelson-Morley experiment failed to find in the first place (Brillet-Hall predictably repeated that original failure).
Einstein himself was one of the chief encouragers to Michelson, then at the University of Chicago, to conduct the Michelson-Gale experiment, which involved constructing an apparatus that spanned an area of some 50 acres, and he traveled to Chicago and met with Michelson for that purpose. Einstein had also begun to recognize as early as 1911 that his General Theory of Relativity REQUIRED a gravitational ether, regardless of the fact that his Special Theory of 1905 had dispensed with it. Bethell quotes Einstein thus {p182}:
“In an article published in 1911, ‘On the Influence of Gravitation on the Propagation of Light,’ Einstein acknowledged that the constancy of the velocity of light is ‘not valid in the formulation which is usually taken as the basis for the ordinary [special] theory of relativity.’ The velocity of light in the gravitational field ‘is a function of the place,’ Einstein said. Light rays ‘propagated across a gravitational field undergo a deflexion.'”
Einstein may thus be said to have backtracked on his premature discarding in the Special Theory of Relativity of the ether principle that presumes that some medium is necessary for the propagation of waves, whether they are light quanta or gravitational quanta, and to have anticipated, not only Beckmann, but the Michelson-Gale experiment.
None of this casts any shadow of doubt on Einstein’s theory of General Relativity, except that it suggests that it ought to have been called Einstein’s Theory of Gravitation, dropping the relativity moniker altogether. However, it is clear from the body of evidence reviewed by Bethell that Einstein’s Special Theory is both irrelevant to practical modern physics and pernicious in its paradoxical implications. The Special Theory is irrelevant because it applies only to inertial (constant velocity) frames of reference, yet we live in a universe of accelerations. But for that, the Michelson-Gale experiment of 1924 would have falsified special relativity since light was found to travel at different speeds depending on the beam’s orientation with respect to the rotation of the earth.
The 1971 Hafele-Keating experiments, transporting atomic clocks around the world in opposite directions, also appear to contradict Special Relativity. Special Relativity asserts that time slows down for an object moving with respect to the observer, which would mean that the airplane clock would appear to run slow relative to the Naval Observatory clock on the ground; but the reverse would equally have to be true if the airplane clock were taken to be the fixed observer. The interpretation of the results (which were consistent both with General Relativity and with Beckmann’s theory) required the postulation of an inertial clock at the center of the earth with which the times of the other clocks could be compared.
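For a sense of the magnitudes in play in the Hafele-Keating experiments, here is a back-of-the-envelope sketch (my own illustrative numbers, not the experiment's; the real analysis also involved gravitational and Sagnac terms). At airliner speeds the kinematic effect over a round-the-world flight is only tens of nanoseconds, which is why atomic clocks were required:

```python
C = 299_792_458.0  # speed of light in m/s

def kinematic_dilation_ns(speed_m_s: float, duration_s: float) -> float:
    """Approximate time, in nanoseconds, by which a clock moving at
    `speed_m_s` for `duration_s` lags a notional stationary clock.
    Uses the low-speed expansion gamma - 1 ~= v^2 / (2 c^2)."""
    gamma_minus_1 = (speed_m_s / C) ** 2 / 2.0
    return gamma_minus_1 * duration_s * 1e9

# A jet airliner at ~250 m/s over a ~48-hour circumnavigation
# (round, hypothetical figures):
print(kinematic_dilation_ns(250.0, 48 * 3600.0))  # roughly 60 ns
```

Effects this small are far below the resolution of any mechanical clock, which is part of why the symmetric "each sees the other run slow" paradox was never directly observable before atomic timekeeping.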
Because we live in a universe of accelerative forces such as gravity, the Special Theory may well be unfalsifiable, which would make it a metaphysical, not a scientific hypothesis, in Popperian terms. Certainly no one has ever observed the predicted dilations of space, or the mutual speeding up of clocks from the points of view of two observers moving relative to each other, or of the corresponding relative buildup of masses in both of the relative frames of reference as their relative velocity approached the speed of light. Science fiction has had fun with many of these paradoxes, but the Special Theory of Relativity, properly understood, gives us no reason to suspect that any of these phenomena are features of our universe.
The Special Theory of Relativity is also pernicious, not only because it gives rise to incomprehensible paradoxes that suggest that our whole conception of physics is wrong, but also because the second postulate of the Special Theory, that the speed of light is a constant independent not only of the source but of the observer, permeates the thinking of modern physicists as a dogma, even though it is ignored in practice, and for good reason. For example, if the speed of light were always constant in our universe, the concept of simultaneity would dissolve into meaninglessness and there would be no way to synchronize clocks, nor could the GPS satellite system be made to work: it does work, of course, but only because a fixed temporal frame of reference is presumed.
The main problem with the Special Theory is that in order to preserve Einstein’s dogmatic postulate of the constancy of the speed of light independent of both source and observer, and its relativity implications, the mathematics of the General Theory of Relativity had to be unduly complicated. And, as you note, Beckmann’s work has demonstrated that the gravitational phenomena with which the General Theory is concerned can be accounted for in classical Newtonian terms, without all the mystification and paradoxes. However, Hayden notes in his introduction that the actual mathematics of overlapping gravitational fields (e.g. taking into consideration where the balance points lie between the gravitational fields of the earth, the moon, the sun, etc.) can still be quite complex, and Beckmann himself never got around to working those out.
John B. Robb
Thank you for the summary. I have many times recommended Tom Bethell’s book http://www.amazon.com/Questioning-Einstein-Is-Relativity-Necessary/dp/0971484597 and possibly should have done so again; there are other works on modern aether theory as well. Google “Is Einstein necessary”… Relativity was “confirmed” by the bending of light rays in a gravitational field; there are other explanations which do not require tensor calculus. Whether understanding the universe requires mathematics of that order of difficulty I cannot say; I confess that I hope not. Tom leaves out the math, which is perhaps wise.
Subject: Epistemology and the Münchhausen trilemma
I had an email about this in my drafts. Since you’ll discuss epistemology, where do you stand on the Münchhausen trilemma? How do we overcome the balkanization of epistemology? Why don’t we teach epistemology in high school? I think that and general semantics (per Alfred Korzybski) would solve many problems.
◊ ◊ ◊ ◊
Most Respectfully,
Joshua Jordan, KSC
Percussa Resurgo
Probably but I have only so much time. What is the Munchhausen trilemma?
Jerry Pournelle
Chaos Manor
Epistemology and the Münchhausen trilemma
I’ll keep this very short, relatively speaking; this is an outlined response:
The Münchhausen trilemma is the crux of epistemology. Anyone who studies epistemology soon becomes aware that every tendency in epistemology has weaknesses — including science — making it fallible. No tendency (e.g. authority, faith, science, empiricism, logic, rationalism, idealism, constructivism) reveals truth, because you come to a point of infinite regress. John Pollock describes it best:
“… to justify a belief one must appeal to a further justified belief. This means that one of two things can be the case. Either there are some beliefs that we can be justified for holding, without being able to justify them on the basis of any other belief, or else for each justified belief there is an infinite regress of (potential) justification [the nebula theory]. On this theory there is no rock bottom of justification. Justification just meanders in and out through our network of beliefs, stopping nowhere.” You find no solid truth that everything is built upon and it becomes more like a ball of ants crossing a river. Now you have to make a choice; you have three options — all of these undesirable. You must face the Münchhausen trilemma.
The trilemma is named after Baron Münchhausen, who pulled himself out of quicksand by his own hair. The Münchhausen trilemma concerns how we answer the question “How do I know this is true?” When we ask ourselves this, we provide proof, but then we need proof that our proof is true, and so on with each subsequent proof. We can deal with this problem in three ways:
1. We create a circular (viz. coherent) argument where theory and proof support each other: X is true because of Y; Y is true because of X. This is dangerous because such arguments can be logically valid, i.e. their conclusions follow from their premises, yet circular reasoning is an informal fallacy. The main problem is that one already believes the conclusion, and in any case the premises do not prove the conclusion this way. Therefore the argument will not persuade — well, it might persuade non-critical thinkers. This is where guys like Hitler do really well. Coherentism is the approach.
2. We agree on axioms. Consider the conch shell in Lord of the Flies; the group agreed on the axiom that whoever held the conch had the right to speak. In epistemology, we might agree that a series of statements is true even though we cannot verify this. Most social organizations seem to run on this approach. Some more esoteric organizations even mention a “substitute” for something that was “lost”, and this is a reference to the axiomatic tendency of the organization. Foundationalism is the approach.
3. We accept infinite regress. We realize that each proof requires further proof, ad infinitum. This view rejects the fallacies and weaknesses of the previous two choices, while accepting reality. Infinitism is the approach.
While most scientists I speak with dismiss this as an “exercise in abstract philosophy”, I think they’re wrong. I think they’re not comfortable with the weaknesses of their paradigm, and I confirmed this when discussing the weaknesses of empiricism and rationalism — the constituents of science. I normally get the “science is settled” or “if it isn’t science it’s not worth knowing” arguments, and we reach an impasse (their axioms). Despite their inability to prove their claims, that is the only criticism I’ve entertained involving the trilemma.
◊ ◊ ◊ ◊ ◊
Most Respectfully,
Joshua Jordan, KSC
Percussa Resurgo
Having had only an undergraduate training in philosophy with some help from the Jesuits, I may have missed something, but I could have sworn that “the Münchhausen trilemma” was not commonly, or indeed at all, referred to in the late 40’s and early 50’s; it may be well known today by that name, but this has not always been so. As to the identity of the Baron, thank you, but I have had some previous exposure to the nobleman’s exploits. His name is also used in psychological diagnoses, although that is rather modern; before the need for appellations to use in insurance claims, we were content to use less colorful diagnostic terms than Munchausen by Proxy. But that’s a different story.
I was not aware that a common philosophical problem/criticism had been given the Baron’s name; I suspect this is due to the popularity of certain movies.
I am not ready to undertake the rather long task of providing instruction in philosophical principles; I simply have to make do with Sir Karl Popper, and the rather mundane notion that if two psychiatrists, a nurse, and the ward boys tell you that you are not covered with bees, you might as well stop brushing them off your coat. We can insist that statements about the world are not science if they cannot be falsified. As to what is truth: we can agree that statements that can be falsified but have not been may be acted on as if verified, even though full verification is not possible. Of course this can lead to having to treat two different views of reality as true if they do not generate falsifiable statements that conflict with each other. Rather like Beckmann and Einstein.
Excuse my brevity. It is painful to type while staring at the keyboard.
Jerry Pournelle
Chaos Manor
I appreciate you taking the time to respond at some length; especially considering that it’s not easy to type right now.
I did some research on it and the term was coined in 1968 in reference to Karl Popper’s trilemma of dogmatism vs. infinite regress vs. psychologism. Popper, in his 1935 publication, attributed the concept to Jakob Fries. However, the trilemma of Fries is slightly different from the Münchhausen trilemma. You can read more here if you’re interested: http://wiki.ironchariots.org/index.php?title=M%C3%BCnchhausen_trilemma
I understand that your time is limited; perhaps you could publish our exchange on the Münchhausen trilemma? My understanding of the trilemma is limited but it is important to me. For me, this is the bottom of the pile and it’s something I spent a good part of my life searching for and I want to popularize it as much as possible along with General Semantics, arete, Bloom’s Taxonomy, cheaper energy, and the Classical Trivium. =)
As an aside, under the Jesuits, you may know Agrippa’s trilemma — presented by Sextus Empiricus and attributed to Agrippa the Skeptic by Diogenes Laertius (not the Cynic). However, Agrippa’s trilemma has five — not three — choices.
◊ ◊ ◊ ◊ ◊
Most Respectfully,
Joshua Jordan, KSC
Percussa Resurgo
The map is not the territory. Science can provide us with better maps – if we follow the rules – but they are only maps.
Hello Jerry,
Reader James had a nice comment on your post for 24 August re Settled Science about the uncanny precision of planetary temperature measurements, over century time frames, when an instrumentation calibration lab, under controlled conditions, would be hard put to duplicate it over 24 continuous hours.
Without critiquing every ‘a’, ‘and’, and ‘the’ of his post, it sounded spot on to me.
Coincidentally, the pooh-bahs of climate science from around the world made the following announcement this month: July 2015 was the hottest month ever, since records began in 1880.
Here is a quote from NOAA’s official announcement ( http://www.ncdc.noaa.gov/sotc/global/201507 ):
“The combined average temperature over global land and ocean surfaces for July 2015 was the highest for July in the 136-year period of record, at 0.81°C (1.46°F) above the 20th century average of 15.8°C (60.4°F), surpassing the previous record set in 1998 by 0.08°C (0.14°F). As July is climatologically the warmest month of the year globally, this monthly global temperature of 16.61°C (61.86°F) was also the highest among all 1627 months in the record that began in January 1880. The July temperature is currently increasing at an average rate of 0.65°C (1.17°F) per century.”
In the spirit of James’ comment, are the people producing such drivel stupid enough to believe it themselves? Do they REALLY believe that we have had a planet-wide instrumentation system in place since 1880, and a 135-year database of its output that would allow us to list the 1627 months since 1880 in rank order of the temperature of the entire planet for each month? In spite of the fact that there is AFAIK no universally agreed upon method of even CALCULATING the temperature of the planet for a given month? And if there IS a cookbook procedure, do they really believe that the planetary instrumentation system provided sufficient coverage and precision over the entire 1627 months to justify their proclamation of an anomaly of 0.08 C for a specific month to be a ‘record’?
Back when I was a Navy tech and we were faced with some incredible feat of technological wizardry, our typical response was “Modern science knows no limitations!” That would appear to be especially applicable to ‘Modern Climate Science’.
Bob Ludwick
I am at a loss to explain why they cannot tell us the formula for “the temperature of the Earth”. I suspect they are afraid they would be laughed at.
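For what it is worth, there is a common textbook recipe for turning a gridded set of temperature anomalies into a single “global mean”: average each latitude band, then weight the bands by the cosine of latitude, since grid cells shrink toward the poles. This sketch is only an illustration of that recipe; it is emphatically not NOAA’s actual procedure, which involves station homogenization, land/ocean merging, and infilling of sparse regions — which is rather the correspondent’s point.

```python
# Illustrative sketch only: one common recipe for a "global mean
# temperature anomaly" from a gridded field. Not NOAA's procedure.
import numpy as np

def global_mean_anomaly(anomaly_grid, lats):
    """Area-weighted mean of a latitude x longitude anomaly grid.

    anomaly_grid: 2-D array, rows indexed by latitude band (degrees C).
    lats: 1-D array of latitude band centers, in degrees.
    Each band is weighted by cos(latitude) because grid cells
    cover less area near the poles than at the equator.
    """
    weights = np.cos(np.radians(lats))
    row_means = np.nanmean(anomaly_grid, axis=1)  # average over longitude
    return np.sum(weights * row_means) / np.sum(weights)

# Toy example: three latitude bands, warmer at the equator.
lats = np.array([-60.0, 0.0, 60.0])
grid = np.array([[0.2, 0.2],
                 [1.0, 1.0],
                 [0.2, 0.2]])
print(global_mean_anomaly(grid, lats))  # 0.6, vs. an unweighted mean of ~0.47
```

The weighting matters: an unweighted average over-counts the polar bands. Even this toy version leaves open every question the correspondent raises — coverage, precision, and what to do where there are no thermometers at all.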
if you want more of something, subsidize it
Dr. Pournelle,
A case demonstrating your point: How Carbon Credit Program Resulted In Even More Greenhouse Gas Emissions
http://www.csmonitor.com/Environment/2015/0825/How-carbon-credit-program-resulted-in-even-more-greenhouse-gas-emissions
-d
Surprise!
Turning Atmospheric CO2 into Carbon Nanofibers
Dr. Pournelle,
Regardless of what one believes about Climate Change, an economic process for manufacturing carbon nanofibers from atmospheric CO2 is pretty cool stuff. Projected cost is $1000/ton of nanofibers.
http://www.nanodaily.com/reports/Diamonds_from_the_sky_approach_turns_CO2_into_valuable_products_999.html
Jeffrey
If the system produces a product worth more than the cost of making it, I would assume it will be capitalized soon enough. I’m too lazy to do the numbers. But I suspect the CO2 entering the atmosphere each year far exceeds the amount of carbon fiber you can sell, so if it be actually needful it may have to be subsidized, but that’s better than bankrupting ourselves.
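A rough back-of-envelope supports that suspicion. The figures below are my own outside estimates, not from the linked article: annual global CO2 emissions circa 2015 were on the order of 36 billion tonnes, and world carbon fiber demand on the order of 100,000 tonnes a year. Since fixing one tonne of carbon consumes 44/12 tonnes of CO2:

```python
# Back-of-envelope scale comparison; the two market figures are
# rough outside estimates, not taken from the linked article.
CO2_EMISSIONS_T = 36e9    # approx. annual global CO2 emissions, tonnes
FIBER_MARKET_T = 1e5      # approx. annual world carbon fiber demand, tonnes
CO2_PER_TONNE_C = 44.0 / 12.0  # tonnes of CO2 consumed per tonne of carbon fixed

# CO2 drawn down if the entire fiber market were supplied from the air:
co2_absorbed = FIBER_MARKET_T * CO2_PER_TONNE_C
fraction = co2_absorbed / CO2_EMISSIONS_T
print(f"{fraction:.6%} of annual emissions")
```

Even on generous assumptions the whole fiber market would absorb around a thousandth of a percent of a year’s emissions: a fine business, perhaps, but not a climate remedy, which is Pournelle’s point about the limited market for the product.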
Security Theater, er Theatre
“Toddler’s Minions ‘fart blaster’ not allowed on flight as it has a trigger”
Well don’t we all feel SO much safer now?
“Will there ever again be an England?” -Anon
Cordially,
John
I dare not answer that…
Celebrating George Orwell’s birthday
A group of Dutch artists celebrated George Orwell’s birthday on June 25th by putting party hats on surveillance cameras around the city of Utrecht.
http://front404.com/george-orwells-birthday-party/
“If you want any discipline to shape up, first get it laughed at.”
– Paul Harvey
Cordially,
John
The article is well beyond my expertise, but is very interesting. The climate modelers tend to secrecy about such matters.
Freedom is not free. Free men are not equal. Equal men are not free.