Evolutionary Philosophy

Overview of How to Talk to a Science Denier by Lee McIntyre

12/31/2021

Happy New Year! And good riddance to 2021. Between the January 6th attack on the Capitol and the spread of the anti-Covid-vaccine movement, it’s been another bad year for epistemology and truth-seeking. My last post looked at an epistemology book from a famous philosopher that didn’t offer much help with this situation, but now I’ve got two great books that come to the rescue. I’ll save Mental Immunity by Andy Norman for last, but before I get to that, let me go over a book called How to Talk to a Science Denier, which was written by Andy's friend Lee McIntyre. You can hear Lee talk about HTTTASD on Michael Shermer’s podcast, which I highly recommend if you don’t have time to read the book. I was lucky enough to receive a pre-print copy from Lee’s publicist, though, and I found it very enjoyable.
 
As before in this mini-series on epistemology, I'm not going to provide a formal review of this book. I'll just share some selected excerpts that I jotted down and insert a few of my own thoughts and interpretations where necessary. All page numbers are from the 2021 proof edition from MIT Press.
 
How to Talk to a Science Denier by Lee McIntyre
  • (TOC) Introduction; What I Learned at the Flat Earth Convention; What Is Science Denial?; How Do You Change Someone's Mind?; Close Encounters with Climate Change; Canary in the Coal Mine; GMOs: Is There Such a Thing as Liberal Science Denial?; Talking with Trust; Coronavirus and the Road Ahead; Epilogue

This Table of Contents shows you what topics are covered in this book. I don’t know about you, but I got very excited reading this.

  • (p.xii) In June 2019, a landmark study was published in the journal Nature Human Behaviour that provided the first empirical evidence that you can fight back against science deniers. … two German researchers—Philipp Schmid and Cornelia Betsch—show that the worst thing you can do is not fight back, because then misinformation festers. The study considered two possible strategies. First, there is content rebuttal, which is when an expert presents deniers with the facts of science. Offered the right way, this can be very effective. But there is a lesser-known second strategy called technique rebuttal, which relies on the idea that there are five common reasoning errors made by all science deniers. And here is the shocking thing: both strategies are equally effective, and there is no additive effect, which means that anyone can fight back against science deniers! You don’t have to be a scientist to do it. Once you have studied the mistakes that are common to their arguments—reliance on conspiracy theories, cherry-picking evidence, reliance on fake experts, setting impossible expectations for science, and using illogical reasoning—you have the secret decoder ring that will provide a universal strategy for fighting back against all forms of science denial.
 
This is the core idea of the book. If you pay any attention at all to claims from science deniers, you'll see these five mistakes pop up over and over. And it seems possible to make progress against poor arguments by simply pointing these issues out to people. You don’t need to be an expert in epidemiology or voting booth technology or earth sciences. But no matter what, you should continue to talk to people.


  • (p.xiv) In his important essay “How to Convince Someone When Facts Fail,” professional skeptic and historian of science Michael Shermer recommends the following strategy: From my experience, (1) keep emotions out of the exchange, (2) discuss, don’t attack (no ad hominem or ad Hitlerum), (3) listen carefully and try to articulate the other position accurately, (4) show respect, (5) acknowledge that you understand why someone might hold that opinion, and (6) try to show how changing facts does not necessarily mean changing worldviews.

And when you do engage with people, these top tips can help keep it civil.

  • (p.xv) In my most recent book, The Scientific Attitude: Defending Science from Denial, Fraud, and Pseudoscience (MIT Press, 2019), I developed a theory of what is most special about science, and outlined a strategy for using this to defend science from its critics. In my view, the most special thing about science is not its logic or method but its values and practices—which are most relevant to its social context. In short, scientists keep one another honest by constantly checking their colleagues’ work against the evidence and changing their minds as new evidence comes to light.
  • (p.9) In my earlier book, The Scientific Attitude, I had argued that the primary thing that separates science from nonscience is that scientists embrace an attitude of willingness to change their hypothesis if it does not fit with the evidence.
 
These are great points that fit right in with my review of Why Trust Science? by Naomi Oreskes. It’s important to remember that epistemology is a normative discipline, meaning it is concerned with the norms of behaviour that we find acceptable and useful for producing knowledge. As McIntyre notes, the values and practices of truth-seeking and fallibilism are core aspects of the scientific method, and I would extend them to all epistemological efforts as necessary conditions.


  • (p.13) Conspiracy-based reasoning is—or should be—anathema to scientific practice. Why? Because it allows you to accept both confirmation and failure as warrant for your theory. If your theory is borne out by the evidence, then fine. But if it is not, then it must be due to some malicious person who is hiding the truth. And the fact that there is no evidence that this is happening is simply testament to how good the conspirators are, which also confirms your hypothesis.
 
Bingo. McIntyre does a great job of pinpointing why conspiracy thinking leads to a bad place where beliefs get stuck and become immune to change.


  • (p.17) “What evidence, if it existed, would it take to convince you that you were wrong?” I liked this question because it was both philosophically respectable and also personal. It was not just about their beliefs but about them. … Instead of challenging them on the basis of their evidence, I would instead talk about the way that they were forming their beliefs on the basis of this evidence.

This is another key point of HTTTASD. This question is an excellent way into the minds of science deniers. It's also the kind of question that can slowly eat away at others long after your personal interaction with them.

  • (p.28) We used to laugh at anti-evolutionists too. How many years before Flat Earthers are running for a seat on your local school board, with an agenda to “teach the controversy” in the physics classroom? If you think that can’t happen—that it couldn’t possibly get that bad—consider this: eleven million people in Brazil believe in Flat Earth; that is 7 percent of their population.

Gah! Watch out for bad thinkers subverting democratic institutions.

  • (p.39) Why do some people (like science deniers) engage in conspiracy theory thinking while others do not? Various psychological theories have been offered, involving factors such as inflated self-confidence, narcissism, or low self-esteem. A more popular consensus seems to be that conspiracy theories are a coping mechanism that some people use to deal with feelings of anxiety and loss of control in the face of large, upsetting events. The human brain does not like random events, because we cannot learn from and therefore cannot plan for them. When we feel helpless (due to lack of understanding, the scale of an event, its personal impact on us, or our social position), we may feel drawn to explanations that identify an enemy we can confront. This is not a rational process, and researchers who have studied conspiracy theories note that those who tend to “go with their gut” are the most likely to indulge in conspiracy-based thinking. This is why ignorance is highly correlated with belief in conspiracy theories. When we are less able to understand something on the basis of our analytical faculties, we may feel more threatened by it. There is also the fact that many are attracted to the idea of “hidden knowledge,” because it serves their ego to think that they are one of the few people to understand something that others don’t know.

This is an aside from the points about epistemology that I am focused on at the moment, but understanding the psychology behind the bad beliefs does help me sympathise a bit more with the people who hold them. And that can give me more patience too.

  • (p.42) There are myriad ways to be illogical. The main foibles and fallacies identified by the Hoofnagle brothers and others as most basic to science denial reasoning include the following: straw man, red herring, false analogy, false dichotomy, and jumping to a conclusion.

That's another good checklist for noting the errors that people make.

  • (p.48) When I was at FEIC 2018, I noted a disproportionate number of people who had had some sort of trauma in their lives. Sometimes this was health-related, other times it was interpersonal. Often it was unspecified. But in every instance the Flat Earther referred to it as in some way related to how they “woke up” and realized that they were being lied to. Many of them embraced a sense of victimization, even before they became Flat Earthers. I have found very little in the psychological literature about this, but I remain convinced that there is something to learn from this hypothesis. I came away from the convention with the feeling that many of the Flat Earthers were broken people. Could that be true for other science deniers as well?

Maybe so! One of the big takeaways from Why Trust Science? was that scientific communities are aiming for broad consensus — broad across all kinds of diversity and all manner of investigations — and this requires good faith efforts and trust in one another. It makes a lot of sense, therefore, that once someone loses faith and trust in others as a result of a personal trauma, then they could easily lose their ability to join in with consensus beliefs too. If so, that is doubly damaging.

  • (p.49) We now stand on the doorstep of a key insight into the question of why science deniers believe what they believe, even in the face of contravening evidence. The answer is found in realizing that the central issue at play in belief formation—even about empirical topics—may not be evidence but identity.

This is another key takeaway from HTTTASD. And it makes complete sense in light of the discussion above about knowledge building towards consensus rather than truth. We only recognise the good faith efforts of people who we trust to be in our in-groups. That identity can be quite flexible and broad enough to include “anyone trying to tell the truth,” or it can be so rigid and narrow as to only include “those who see the world as I do.” Obviously, the former leads to better outcomes, so be careful who you identify with.

  • (p.54) Once you decide who to believe, perhaps you know what to believe. But this makes us ripe for manipulation and exploitation by others. Perhaps this provides the long-awaited link between those who create the disinformation of science denial and those who merely believe it.

Yes! And if you remember from my overview of Kindly Inquisitors, two foundation stones for the liberal intellectual system are “no one gets final say” and “no one has personal authority.” Once you commit to these, you join a team that is far more protected from disinformation. Fake news fizzles out here very quickly after a few checks and balances by your other teammates. If, however, you join a tribe that forms around revealed truths from authority figures, then you become much more susceptible to disinformation. This has got to be a major reason why conservatives retweeted Russian trolls about 31 times more often than liberals in the 2016 election. (Other possible reasons do exist for this too.)

  • (p.56) Science denial is an attack not just on the content of certain scientific theories but on the values and methods that scientists use to come up with those theories in the first place. In some sense, science deniers are challenging the scientist’s identity! Science deniers are not just ignorant of the facts but also of the scientific way of thinking. To remedy this, we must do more than present deniers with the evidence; we must get them to rethink how they are reasoning about the evidence. We must invite them to try out a new identity, based on a different set of values.
 
This is a brilliant point from McIntyre. We need to be much more explicit about the epistemological values and methods we are using. We have to be clear that anyone can join in with them, and this is precisely why they work. Just shouting “trust the science” isn’t going to work when “science” is such an underdefined term for the general public. (And that includes too many of the scientists doing the loud shouting.)


  • (p.68) Schmid and Betsch tested four possible ways of responding to subjects who had been exposed to scientific misinformation: no response, topic rebuttal, technique rebuttal, and both kinds of rebuttal. … The clear result of this study was that providing no response to misinformation was the worst thing you could do; with no rebuttal message, subjects were more likely to be swayed toward false beliefs. In a more encouraging result, researchers found that it was possible to mitigate the effects of scientific misinformation by using either content rebuttal or technique rebuttal, and that both were equally effective. There was, moreover, no additive advantage; when both content and technique rebuttal were used together, the result was the same.
 
What a fascinating study. Good to know.
 
  • (p.119) According to one recent study in the Journal of Experimental Social Psychology, entitled “Red, White, and Blue Enough to be Green,” the persuasive strategy of “moral framing” can make a big difference in making the issue of climate change more palatable to conservatives. By emphasizing the idea that protecting the natural environment was a matter of (1) obeying authority, (2) defending the purity of nature, and (3) demonstrating one’s patriotism, there was a statistically significant shift in conservatives’ willingness to accept a pro-environmental message.

And that is a good data point about this strategy in action.

  • (p.175) My message in this book is simple: we need to start talking to one another again, especially to those with whom we disagree. But we have to be smart about how we do it.
  • (p.176) Those who are cognizant of the way science works understand that there is always some uncertainty behind any scientific pronouncement, and in fact the hallmark of science is that it cares about evidence and learns over time, which can lead to radical overthrow of one theory for another. But does the public understand this? Not necessarily. And lying to someone—for instance, by saying that masks are 100 percent effective, or that any vaccine is guaranteed to be safe—is exactly the wrong tactic. When scientists do that, any chink in the armor is ripe for later exploitation, and deniers will use it as an excuse not to believe anything further.
  • (p.177) I have long held that one of the greatest weapons we have to fight back against science denial is to embrace uncertainty as a strength rather than a weakness of science.

Yes! And this is exactly why I explicitly want to remove the claims for Truth from the JTB theory of knowledge. Embrace our uncertainty. That’s how we remain flexible in our thinking and begin to pay attention to what it really takes to build up confidence.

  • (p.180) What if we taught people not just what scientists had found, but the process of conjecture, failure, uncertainty, and testing by which they had found it? Of course scientists make mistakes, but what is special about them is that they embrace an ethos that champions turning to the evidence as a way to learn from them. What if we educated people about the values of science by demonstrating the importance of the scientist’s creed: openness, humility, respect for uncertainty, honesty, transparency, and the courage to expose one’s work to rigorous testing? I believe this kind of science education would do more to defeat science denial than anything else we could do.
 
Agreed. I really enjoyed the main points that McIntyre drives home in HTTTASD. And there are numerous examples in the book (which I’ve left out of this short blog post) that are absolutely worth the price of admission. I especially enjoyed his stories about attending a Flat Earth convention. Amazing. If you’ve got any science deniers in your life, I highly recommend picking up a copy of McIntyre’s book to help you deal with them. Maybe we’ll all have a better 2022 because of it.

Overview of Knowledge and Its Limits by Timothy Williamson

12/24/2021

Consider this your lump of coal for Christmas for anything naughty you've done this year.

In my last post, I kicked off a short series on epistemology books with an overview of Kindly Inquisitors by Jonathan Rauch. I found that one really useful, and I have two other excellent books that I'm itching to explore in this series, but first I feel the need to cover one that I really disliked, since I think it's still illustrative of the problems that exist with this topic of knowledge. That book is Knowledge and Its Limits by Timothy Williamson. Williamson is the Wykeham Professor of Logic at Oxford University, and he cracked the top 10 in two lists of the most cited philosophers in history, as discussed in a recent blog post by Eric Schwitzgebel. Keen followers might remember that I got to meet Tim in 2018 at a local event, when I was asked by The Philosopher to write and present a short review of his book Doing Philosophy. Tim was a most impressive thinker and very gracious in engaging with four amateur philosophers and our impressions of his work. He has obviously done an enormous amount of good in the field of philosophy, so I was excited to dive into his 2000 book on epistemology, which sounded from its title like it might agree with my position that we cannot claim to have justified, true beliefs (the traditional definition of knowledge). As it turned out, however, I think there's a reason Tim is a professor of logic and not one of epistemology.

​As before, I'm not going to provide a formal review of this book. I'll just share some selected excerpts that I jotted down and insert a few of my own thoughts and interpretations where necessary. All page numbers are from the 2009 paperback edition from Oxford University Press.

Knowledge and its Limits by Timothy Williamson
  • (p.v) If I had to summarize this book in two words, they would be: knowledge first. It takes the simple distinction between knowledge and ignorance as a starting point from which to explain other things, not as something itself to be explained. In that sense the book reverses the direction of explanation predominant in the history of epistemology.

These are the very first sentences in the book, coming in its new Preface, and they explain why one of the blurbs on the back cover said "Williamson is to be commended for turning the theory of knowledge upside down." I didn't immediately grasp what Williamson meant by this, so I kept going through all 300+ pages of the book, but I might just as well have stopped right here. What Williamson is saying is that anything you know...you know! That's it. There's no need to fight over what counts or doesn't count as knowledge. This doesn't actually turn epistemology upside down; it throws it out the window! Williamson takes what is traditionally a normative study of the how's and why's of what we accept as knowledge and he settles for mere bald descriptions and assertions. By the end of the book, it became apparent to me that this is exactly what one might expect from an analytic professor of logic who wants crisp neat lines and unassailable starting points which he can use to build crystal palaces of thought by applying his rigorous formulas. Spoiler alert, that's not the way a gradually transitioning evolutionary world works.

  • (p.4) It will be assumed, not quite uncontroversially, that the upshot of that debate [among epistemologists] is that no currently available analysis of knowledge in terms of belief is adequate.
  • (p.6) Sceptics and their fellow-travellers characteristically suppose that the truth-values of one’s beliefs can vary independently of those beliefs and of all one’s other mental states: one’s total mental state is exactly the same in a sufficiently radical sceptical scenario as it is in a common-sense scenario, yet most of one’s beliefs about the external world are true in the common-sense scenario and false in the sceptical scenario.
  • (p.19) The point about the conjunctive proposition that p is true and unknown is that, in virtue of its structure, it is not available to be known in any case whatsoever. The argument for this conclusion was first published by Fitch in 1963. Contrapositively, he showed that all truths are knowable only if all truths are known. This is sometimes known as the Paradox of Knowability.

Williamson starts by acknowledging the skeptical problem of knowledge. (If you can see it through the jargon.) Philosophers haven't been able to prove that any beliefs rise to the level of true knowledge. Skeptical scenarios can always be imagined which would make our current beliefs false. Fitch's result piles on: if we insist that every truth is at least knowable, then we are forced to conclude that every truth is already known (see the sketch below). And it sure seems like that is impossible in a growing and changing universe where we are limited to our subjective viewpoints of the here and now, with no way of ever knowing what we don't know.
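For anyone who wants to see the logical skeleton behind that “Paradox of Knowability,” here is a standard reconstruction of Fitch's argument. It is only a sketch in my own notation, not a quotation from Williamson: read Kp as “p is known (by someone at some time)” and ◊ as “it is possible that.”
  1. Knowability principle (the assumption under test): every truth is knowable, i.e. p → ◊Kp.
  2. Non-omniscience: suppose some truth q is unknown, i.e. q ∧ ¬Kq.
  3. Applying (1) to that conjunction gives ◊K(q ∧ ¬Kq).
  4. But knowing a conjunction entails knowing each conjunct, so K(q ∧ ¬Kq) would give both Kq and K¬Kq; and since knowledge is factive, K¬Kq gives ¬Kq. Together that is Kq ∧ ¬Kq, a contradiction.
  5. Since K(q ∧ ¬Kq) entails a contradiction, it is impossible, which contradicts (3). So (1) and (2) cannot both hold: if every truth is knowable, then no truth is unknown, and all truths are known.
The contentious premise, of course, is the knowability principle itself, and that is exactly where the debate about truth, knowledge, and skepticism picks back up.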

Sounds pretty irrefutable, right? So what does Williamson offer to combat this?


  • (p.21) Knowing is a state of mind. That claim is central to the account of knowledge developed in this book. … A state of a mind is a mental state of a subject. Paradigmatic mental states include love, hate, pleasure, and pain. Moreover, they include attitudes to propositions: believing that something is so, conceiving that it is so, hoping or fearing that it is so, wondering whether it is so, intending or desiring it to be so. One can also know that something is so. This book concerns such propositional knowledge.
  • (p.27) Nothing said here should convince someone who has given up ordinary beliefs that they did in fact constitute knowledge, for nothing said here should convince her that they are true. The trick is never to give them up. This is the usual case with philosophical treatments of scepticism: they are better at prevention than at cure. If a refutation of scepticism is supposed to reason one out of the hole, then scepticism is irrefutable. The most to be hoped for is something which will prevent the sceptic (who may be oneself) from reasoning one into the hole in the first place.

The trick?!? So we're just supposed to ignore the centuries of arguments about philosophical doubt? Williamson wants us to confine ourselves to "propositional knowledge." But this is the kind of knowledge that simply takes for granted the propositions that are used to construct a logical argument. For example, take the two propositions A) "Socrates is a man" and B) "all men are mortal." Accept these, and you know for certain that C) "Socrates is mortal." Sure, that's one way to arrive at certainty. But only in "logic space" as opposed to reality. Logic space tells us nothing about how to evaluate the truth of the propositions. And without that, then any old proposition will do. If we were to accept the norm of taking propositions for granted, then we would slide immediately into a vicious relativism where anything can be claimed as true. I'm sure Williamson doesn't want that, but as soon as he gets off his perch and gets into debates about which propositions are to be disallowed, then he's going to need traditional epistemology. Only that can tell you why a proposition such as "all men are not mortal" should be treated as false.

  • (p.34) The main idea is simple. A propositional attitude is factive if and only if, necessarily, one has it only to truths. Examples include the attitudes of seeing, knowing, and remembering. Not all factive attitudes constitute states; forgetting is a process. Call those attitudes which do constitute states stative. The proposal is that knowing is the most general factive stative attitude, that which one has to a proposition if one has any factive stative attitude to it at all.

This proposal from Williamson isn't simple at all! Cutting through the dense obfuscation, he has simply smuggled in the claim to know "truths" while ignoring the entire debate about how we know which truths are true. (Spoiler alert, we can't say for certain.)

  • (p.101) Since it is logically possible for the deer to be behind the rock at one moment and not another, their present-tensed belief may be true at one moment and false at another. By standard criteria of individuation, a proposition cannot change its truth-value; the sentence ‘The deer is behind the rock’ expresses different propositions at different times.

I just want to flag up this point that "a proposition cannot change its truth-value." That's an important part of the definition of truth that must be considered, and it's also a point that I may raise in an article about evolutionary logic some day. For now, just notice the problem with these "standard criteria" in philosophy.

  • (p.138) Thus, the reasoning by which they rule out a last-day examination is unsound, for it assumes that knowledge will be retained in trying to refute a supposition on which it would not be retained.

This is a diversion from the epistemological problem of knowledge that I'm concerned with, but it is an example of the narrowness of Williamson's logic-driven approach, so I wanted to mention it. I've cut this passage short, but essentially Williamson tries to solve the surprise test paradox by saying, "Aha! Your argument rests on knowing that a test is coming, but since you might possibly forget that knowledge, your argument isn't fully airtight. Thus, (*pushes up glasses*), I can ignore the paradox." This is utterly pedantic and misses the entire point of the argument. But when logic is the only hammer in your toolbox, every problem gets nailed with it. For a more direct treatment, see my own response to this thought experiment.

  • (p.180) Uncertainty about evidence does not generate an infinite regress of evidence about evidence about . . .. In order to reflect adequately on one’s evidence, one might need evidence about one’s evidence, and in order to reflect adequately about the latter evidence, one might need evidence about it, and so on. But this regress is merely and harmlessly potential. We cannot in fact realize infinitely many levels of adequate reflection; at best, further reflection enables us to realize finitely many further stages. At some stage, one must rely on unreflective causal sensitivity to evidence.

This is the heart of Williamson's long argument — that one must rely on unreflective causal sensitivity to evidence. No matter how much logical notation he hides behind (and there is a lot of it), this is a stunningly weak point to rest one's epistemology upon. I thought an unexamined life was not worth living. So how is an unreflective philosophy worth listening to?

  • (p.184) In recent decades, questions of knowledge seem to have been marginalized by questions of justification. According to Crispin Wright, “knowledge is not really the proper central concern of epistemologico-sceptical enquiry… We can live with the concession that we do not, strictly, know some of the things we believed ourselves to know, provided we can retain the thought that we are fully justified in accepting them.” Similarly, John Earman argues that accounts of knowledge are irrelevant to the philosophy of science, because in it ‘the main concern is rarely whether or not a scientist ‘knows’ that some theory is true but rather whether or not she is justified in believing it’.

That's right. Building cases for justification is good enough for scientists, but that's all our knowledge can ever be as well. It's time for epistemology to eat some humble pie.

  • (p.189) Why does it matter what counts as evidence? Consider the idea that one should proportion one’s belief in a proposition to one’s evidence for it. How much evidence one has for the proposition depends on what one’s evidence is. More precisely, a theory of evidence is needed to give bite to what Carnap calls the requirement of total evidence: “[I]n the application of inductive logic to a given knowledge situation, the total evidence available must be taken as a basis for determining the degree of confirmation (1950: 211).”

This is another excellent point to consider. We need a theory of evidence for determining degrees of confirmation. Sounds like a job for another evolutionary hierarchy! Not one of needs or of consciousness or of free will, but one of knowledge. I'll be working on that for the paper that will come out of all this research.

And with that, I've reached my limit on Knowledge and Its Limits. Let me know if you have any other questions or thoughts about it in the comment section below. Until next time, Merry Christmas! I hope you liked pressing on this lump of coal as we try to make diamonds of clarity.
