Travels

This is a quick post to reassure regular readers that I have not given up the blog. The hiatus between posts has been due to travel. Seasons in Australia are not as distinct as they are in Europe and when I lived in England (many years ago) I always found the arrival of Spring the most striking illustration of this difference. True to my recollection, the European Spring put on an excellent show for this trip to England and Italy.

Wandering the streets of Rome, I came across a street called Via De’ Burro. While a burro is not exactly a mule*, it is a close enough relation to remind me that I was neglecting the blog and to inspire a quick photograph.

My family and I were not the only ones enjoying Spring in Rome. It had been many years since my last visit to Italy’s capital and I am convinced that there are more tourists there than ever. Perhaps good weather, a week of free museum admission and the lead-up to Easter were conspiring to swell visitor numbers, but I can only imagine that the crowds would be larger still by summer time. The photograph below shows the square in front of the Trevi fountain and I cannot help thinking that, while tourism is a big industry for Australia, we don’t really know what it means to be a serious tourist destination!

Of course, no trip to a non-English speaking country is complete without coming across bizarre English translations. The sign in the photo below was in the window of a restaurant near our hotel. What it means remains a mystery as it certainly did not tempt me to try their cuisine.

As well as Rome, the Italian leg of the trip took in Venice. Spring is an excellent time to visit Venice: excellent weather and not at all stinky!

Venice’s famous “Bridge of Sighs” was under renovation and partly shrouded in scaffolding, but I was able to photograph another Bridge of Sighs, named with a nod to the Venetian original. Before Italy, we had spent some time in England, including a beautiful Sunday morning punting down the river Cam in Cambridge. The bridge in Venice connects the court rooms in the Doge’s Palace to the prison cells, and the name supposedly derives from the sighs of prisoners catching their last glimpse of freedom through the windows of the bridge. The connection of the Cambridge bridge to the original is simply that both are covered stone bridges, however dim a view students of St John’s College may have taken of their studies at times.

Travelogues are not the usual fare here at the Stubborn Mule, so stay tuned for a return to more typical content soon!

* Thanks to Magpie for reminding me that while “burro” is donkey in Spanish, in Italian it means butter!

Language is a virus

Language is a virus and we are its host. Some strains of language are virulent and spread rapidly. Others are weaker, struggling to infect their hosts and easily supplanted by stronger challengers.

The natural habitat of the language virus is the social group. Some of the more obvious forms are schoolyard slang (what was unreal in my day was sick in later years, but could now be random) or the jargon of specialists. Sometimes the ponds the virus infects can be large ones. By 2008, everyone in Australia knew that “GFC” stood for “Global Financial Crisis”, but I repeatedly saw visitors from the US or UK mystified by this initialism.

The corporate world is a rich source of (often meaningless) jargon, as decried by Paul Keating’s speech writer Don Watson in Death Sentence: The Decay of Public Language. But what has fascinated me of late in the corporate world is not the language of mission statements, paradigms, closure or value-add, but simpler, more innocuous words or phrases that flourish within organisations. After a number of years away, I have been back at my old firm for less than two months, and I was immediately struck by the near universal use of a few expressions that I am sure were not in use there four years earlier, and certainly not at the company where I worked in the intervening years.

I have now realised that it is impossible to attend an internal meeting without someone suggesting an alternative lens with which to view a problem rather than, say, an alternative perspective. Even more prevalent is “calling out”, as in “I’ll just call out one or two points on this slide” or “Last time we met I called that out as the primary challenge”.

The point is not to criticise these terms themselves, which are quite reasonable means of expression, unlike so much of the corporate-speak that Don Watson ridicules. You could even make the case that “lens” is a better term as it suggests a point of view which can be quickly and simply changed, whereas “perspective” often has connotations of being more permanent. What fascinates me is the way these words have established such a firm hold on the organisation. It makes the social dimension of shared language very clear: if I start using the same terms as you, it makes me seem more a part of the group, which in turn reinforces your use of the terms. All of this can happen subconsciously, so that the hosts can be quite unaware of the infection. Some may notice, but to a newcomer like myself, the infestation is startlingly clear.

It probably will not be long until I find myself calling out the merits of putting on a different lens, but for now I am trying to be strong.

Irony

Last year I wrote about one of the more amusingly ridiculous attempted spam comments intercepted by my blog’s spam filter. It may be genius, stupidity or just an excellent coincidence, but a comment spammer has now attempted to add the following comment to that post:

There are actually loads of details like that to take into consideration. That may be a nice level to deliver up. I provide the ideas above as normal inspiration however clearly there are questions like the one you carry up where a very powerful factor can be working in trustworthy good faith. I don?t know if greatest practices have emerged around things like that, however I’m positive that your job is clearly identified as a good game. Each boys and girls feel the affect of just a moment’s pleasure, for the rest of their lives.

The internet is a funny place.

Carbon tax

Our regular guest writer James Glover (@zebra) returns to the Stubborn Mule today to look at the real cost of carbon tax…and who pays the cost.

It is no surprise that the latest Newspoll shows the Labor Government sinking under a concerted attack by the Opposition, and its supporters in the media, over the Carbon Tax. The incessant cry of “a great big new tax” was bound to have an effect on the marginal voters who derive their political views in atavistic ways. In fact, most of the political arguments lately seem to revolve around the distinction between levies and taxes. The trick seems to be that if your opponents propose it then it is a tax, and if you propose it then it is a levy, the latter term being used by both sides to describe the flood levy (Labor) and the parental leave levy (Coalition) respectively. Taxes, as opposed to levies, apparently lead to profligate spending and are downright un-Australian. It makes you wonder what they use to fund hospitals, schools and roads.

So how does the Carbon Tax work? And what does it mean to say it is “revenue neutral”? Is it really a tax or “not really a tax” as the Treasurer, Wayne Swan, claims? Suppose the government wants to set up a Carbon Tax for the purpose of reducing carbon emissions. It does this by imposing a tax (or levy or fee) on the price of goods and services that are deemed to ultimately cause high but avoidable (hence no agriculture) emissions of carbon. This of course raises the price of these goods, e.g. electricity. If we impose a Carbon Tax on coal-generated electricity (the sine qua non of carbon emitters) then expect the power companies to pass on all or most of the increase to consumers. Now here’s the thing: the money the tax raises will have gone to subsidise the increased power bills of these very same power consumers, by exactly the same amount as the price rise. So in effect nothing happens. In other words, at a base level the Carbon Tax does nothing. It has no benefits and no costs. Isn’t it really “a great big snooze tax” and not “a great big new tax”?

The Carbon Tax has one (fully intended) important consequence. If power emitters want to increase their profits they can do so by switching to lower carbon emitting alternatives. These might already be available or they can pay to research and develop them. And because of the tax, what was previously uneconomic will now be made viable. Since these alternatives are really more expensive than coal-based power without the tax, you might ask what is really happening at the cost end. It seems like a tax which costs nobody anything, yet magically makes alternatives to carbon emitting industries economic. Voila!

Well that’s what the government would have you believe. On closer examination though it is precisely when the Carbon Tax has its intended effect that the cost gets passed onto consumers. But not when the Carbon tax is first introduced. To see why let’s have a look at an example.

Suppose the cost per unit of producing electricity from coal is $100. The power company charges $110 to consumers and so makes a $10 profit. The Govt introduces a 20% Carbon Tax on the cost of producing electricity using coal. This raises the price to $130 in order for the company to maintain its $10 profit margin. That’s $100 for the coal, $20 for the tax and a profit of $10. The extra $20 gets passed onto the consumer whose bill is now $130 per unit. However after the $20 subsidy (paid for by the $20 proceeds of the tax) they still only pay $110.

In other words: the producers, the consumers and the government are no better or worse off immediately after a Carbon Tax is introduced. But what happens if the Carbon Tax is successful in reducing emissions? That is when consumers end up paying more. The cost to the company of producing one unit of electricity, including the tax, is $120. Suppose an alternative non carbon-emitting energy source is found which costs $115 per unit. This is more than the coal-based cost before the tax, but less than the cost with the Carbon Tax, because this carbon-free energy source, let’s call it “sunshine”, attracts no Carbon Tax. So the company, in order to maintain its profit of $10, charges $125 per unit, less than coal-based power with a Carbon Tax. But now the consumer receives no subsidy either, so although their total bill has dropped from $130 (with carbon tax) to $125, their net cost has risen from $110 (after the subsidy) to $125, an increase of $15 over the cost both before the Carbon Tax was introduced and immediately afterwards. This, of course, is the extra $15 per unit that it costs to produce electricity from sunshine rather than coal.
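The whole example can be condensed into a few lines of code. This is a toy sketch using the post’s illustrative figures; the function and its name are mine:

```python
def bill(production_cost, margin, tax_rate=0.0):
    """Return (sticker price, net consumer cost) per unit, assuming the
    Carbon Tax raised on each unit is returned to consumers as a subsidy."""
    tax = production_cost * tax_rate          # Carbon Tax on the production cost
    price = production_cost + tax + margin    # company passes the tax on, keeps its margin
    subsidy = tax                             # revenue neutral: tax proceeds fund the subsidy
    return price, price - subsidy

coal_before = bill(100, 10)         # (110, 110): no tax, no subsidy
coal_taxed  = bill(100, 10, 0.2)    # (130, 110): $20 tax exactly offset by $20 subsidy
sunshine    = bill(115, 10)         # (125, 125): no tax, but no subsidy either
```

The middle line is the “great big snooze tax”: the sticker price jumps to $130 but the net cost stays at $110. Only the switch to untaxed “sunshine” moves the net cost, up to $125.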

That is how the Carbon Tax really works and how it ends up costing the consumer. You start out with a Carbon Tax that costs nobody anything and, once the tax has had its intended effect, you end up with no coal-based power, no Carbon Tax revenue, no more money for subsidies and everybody paying more. And, in principle, no more carbon pollution. That, of course, is really the point. There is a (currently) hidden cost of producing carbon, as carbon dioxide and methane, in the form of global warming, and that cost is, if the system works, the $15 extra you pay to solve the problem by removing carbon from the economy.

Virtual currency

Thanks to my new job, the rate of Stubborn Mule posts has declined somewhat over the last few weeks (to say nothing of Mule Bites podcasts!). Still, my commute has allowed me to catch up on my podcast listening and a particularly interesting one was the recent Security Now episode about the “virtual currency” Bitcoin. Here is how Bitcoin is described on their website:

Bitcoin is a peer-to-peer digital currency. Peer-to-peer (P2P) means that there is no central authority to issue new money or keep track of transactions. Instead, these tasks are managed collectively by the nodes of the network.

Given that e-commerce is already widespread on the internet, what exactly is new about this idea of a virtual currency? The key to this question is understanding the difference between money in the form of “currency” (notes and coins) and money in the form of balances in your bank account. Currency is essentially anonymous. If I hand you a $10 note, we don’t need anyone to facilitate the transaction and you can take that $10 and spend it with no further reference to me or anyone else. To move $10 from my bank account to yours is quite different. Before we could even start, we both had to provide extensive identification to our respective banks to open bank accounts. Then, you would have to provide me with enough account information for me to instruct my bank to transfer money from my account to yours. Both banks would retain records of the transfer for a long period of time and, if the transaction was rather bigger than $10, the chances are that there may even be requirements for our banks to notify a government agency in case we were engaged in money laundering. Even if I paid you using a credit card, the information exchange would be much the same.

The Bitcoin virtual currency aims to mimic some of the essential characteristics of currency while allowing transactions to be conducted online. To do so, it makes very creative use of a powerful encryption technology known as “public key cryptography”.

Public key encryption involves encrypting data in a rather unusual way: one key is used to encode the data and a different key is used to decode the data. This is in contrast to “symmetric key encryption” in which the same key is used for both encoding and decoding data. To appreciate the difference, consider a less electronic scenario. I want to exchange messages with you using a locked box and ensure no-one else can open it. If we already have identical keys to the one padlock there is no problem. I simply pop my message in the box, pop on the padlock and post it to you. When you receive the box, you can use your key to open the box, read the message, reply and pop the same padlock on the box before sending it back. But what do we do if we don’t both have keys to the one padlock? There is a tricky solution. I put the message in the box, secure it with my padlock and send it to you. Once you get it, although you cannot open my lock, you add your own padlock to the box and return it to me. Once I get it back, I unlock my own lock and send the box back. You can then open your lock and read my message. While in transit, no-one can open the box. It’s certainly an elaborate protocol and, of course, I’m ignoring crowbars and the like, but it gives a rough analogy* for how public key encryption works.

When it comes to data encryption, both users will create a “key pair”. One key they keep to themselves (this is known as the “private key”) and one key they can share with the world (the “public key”). I can then let you (and indeed the whole world) know what my public key is. When I want to send you a message, I encrypt it using your public key and send it to you. The only way to decode it is using your private key, which only you have. Even though everyone can find out what your public key is, only you can decode the message. When you want to send a message back to me, you encode it using my public key. So, anyone who knows my public key can send me a message for my eyes only. As a side benefit, public key encryption can also provide authentication. If you send me a message encrypted using my public key, I would ideally like to confirm that it really came from you not someone else (after all, everyone knows my public key). To deal with this, you can also send a copy of the same message encoded using your private key. Once I have decoded your message with my private key, I can also decode the second message using your public key. If the two messages are the same, I know that whoever sent me the encoded message also had access to your private key, so I can be reasonably sure it was you. In practice, authentication works a little bit differently to this, using a “hash” of the original message (otherwise anyone could decode the secret message using your public key). This authentication process is known as “digital signing”.
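To make the key asymmetry concrete, here is a toy sketch using the classic textbook RSA numbers. It is purely illustrative: real key pairs are thousands of bits long, and real digital signing hashes the message first, as noted above.

```python
# Toy RSA with tiny textbook primes -- utterly insecure at this size,
# but it shows the two-key asymmetry: one key locks, the other unlocks.
p, q = 61, 53
n = p * q                    # modulus, shared by both keys
phi = (p - 1) * (q - 1)
e = 17                       # public exponent: the public key is (e, n)
d = pow(e, -1, phi)          # private exponent: the private key is (d, n)

message = 65                 # a message, encoded as a number smaller than n

# Encrypt with the public key; only the private key can decrypt.
ciphertext = pow(message, e, n)
assert pow(ciphertext, d, n) == message

# "Sign" with the private key; anyone holding the public key can verify.
signature = pow(message, d, n)
assert pow(signature, e, n) == message
```

The two assertions are the two directions of the padlock analogy: a message locked with the public key opens only with the private key, and vice versa.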

All of that may seem like a bit of a diversion, but public key cryptography is at the heart of the Bitcoin idea. Essentially, a Bitcoin is a blob of data and if I want to give you one of my Bitcoins, I add your public key to the blob and then sign it using my private key. This means that anyone who has access to my public key (i.e. the whole world) can confirm that I intended to pass the coin onto you. As a result, Bitcoins have their entire transaction history embedded in them! To decide who “owns” a Bitcoin, we just need to look at the last public key in the transaction chain. Whoever owns that key, owns the Bitcoin.
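As a rough sketch of the idea (the data layout and names here are mine, not Bitcoin’s actual format), a coin’s transfer history can be modelled as a chain of signed records, with signing done by the same insecure textbook-RSA trick of raising a hash to the private exponent:

```python
import hashlib

def make_keys(p, q, e=17):
    """Build a toy RSA key pair from two small primes (illustration only)."""
    n = p * q
    d = pow(e, -1, (p - 1) * (q - 1))
    return (e, n), (d, n)            # (public key, private key)

def h(data):
    """Hash the transfer record down to an integer."""
    return int.from_bytes(hashlib.sha256(data.encode()).digest(), "big")

def sign(data, priv):
    d, n = priv
    return pow(h(data) % n, d, n)

def verify(data, sig, pub):
    e, n = pub
    return pow(sig, e, n) == h(data) % n

alice_pub, alice_priv = make_keys(61, 53)
bob_pub, bob_priv = make_keys(89, 97)

# Alice passes the coin to Bob: she appends his public key and signs the record.
transfer = f"coin-42 -> new owner {bob_pub}"
coin_history = [(transfer, sign(transfer, alice_priv))]

# Anyone who knows Alice's public key can confirm she authorised the transfer,
# and the last public key in the chain identifies the current owner.
assert verify(*coin_history[-1], alice_pub)
```

When Bob spends the coin, he appends another record naming the next owner’s public key and signs it with his own private key, so the whole ownership history travels with the coin.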

“How is that anonymous?” I hear you ask. Since “keys” are just strings of data themselves, there is no reason you have to advertise the fact that, say “6ab54765f65” is your public key. While the whole world can see that the owner of “6ab54765f65” owns a number of Bitcoins, that does not mean that anyone has to know your secret identity.

The other important feature of Bitcoins is that there is no centralised coordinator of the Bitcoin records. There is no bank keeping the records. The Bitcoin algorithm is public and information about Bitcoin transaction histories is shared across a peer-to-peer network which allows anyone to independently verify Bitcoin transactions.

It’s a fascinating idea and I don’t know if it will take off. It is only in beta, but there are a number of websites that have begun accepting Bitcoins for payment, as well as sites which will trade Bitcoins for “real” money. I will be watching with interest.

* It really is quite rough, only showing that a secure exchange without key exchanges is possible. Other features, such as authentication and the key asymmetry (either key can lock and then the other key unlocks) are not captured.

Mobile coverage

A friend and regular Stubborn Mule reader drew my attention to an article in the Sydney Morning Herald this week about the Australian telco Telstra. Much of the recent commentary has focused on the implications of the national broadband network (NBN) for Telstra. While the NBN certainly gets a mention here too, for me the most striking paragraph deals with the extraordinary success Telstra has been having of late in the mobile phone market:

In the December half, the group added 420,000 bundled customers — customers on bundled deals tend to be “stickier” and stay with a telco longer than those who sign on for only one service — and it added 139,000 retail fixed broadband customers. Most importantly, it added 919,000 mobile phone customers: that’s the biggest mobile phone customer growth Telstra has produced for more [sic] a decade.

Gaining almost 1 million new customers in six months is quite an achievement in a country with a population of around 22 million. My own experience may shed some anecdotal light on Telstra’s success. I switched from Virgin mobile to Telstra late last year. The main reason was network quality. Virgin use the Optus network which I found extremely unreliable, even in central parts of Sydney. Sitting in a café in Glebe with no signal and seeing the person next to me with four bars on a Telstra phone had become too much. Customer service did not come into the decision: as far as I can tell, all the providers are equally atrocious on that score. So that just left price. When I first signed up with Virgin a couple of years earlier, Telstra may have had the superior network, but charged a hefty premium for it. But since then their prices have become far more competitive, which made the decision to switch very easy. I know a number of other people who have switched for exactly the same reason.

Even so, 1 million new customers is an impressive result for such a short period of time. This prompted my source to do some further research. According to a Wikipedia article about mobile phone penetration, in 2006 Australia’s population of 20.8 million owned 19 million mobile phones*. By 2007, that figure had grown to 21.3 million while the population was up to 21.2 million, and so there was more than one phone for every man, woman and child in the country. I have no doubt that the number of mobile phones has continued to grow faster than the population since then.
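For what it’s worth, the penetration arithmetic is easy to check (figures in millions, taken from the Wikipedia numbers quoted above):

```python
# Mobile phone penetration = phones per person.
aus_2006 = 19.0 / 20.8    # just over 91%: not yet one phone each
aus_2007 = 21.3 / 21.2    # just over 100%: more phones than people
```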

But despite over 100% mobile phone penetration, Australia is far from being the country most in love with mobile phones. The chart below uses the Wikipedia statistics to show the top 20 countries and the statistics are intriguing and not a little mysterious.

Top 20 Countries by Mobile Phone Penetration

Montenegro is clearly in the lead with almost two phones per capita. There is a bit of a drop down to Saudi Arabia with a penetration rate of 170%. On 151%, Hong Kong comes in third and leads a closely packed group all close to the 150% mark. Continuing down the list, penetration rates fall gradually down to Chile at 113% which means that Australia does not even make it into the top 20. In fact, even New Zealand ranks higher in 23rd place, while Australia is only in 31st place.

Of course, differences in timing of both the phone and population figures mean that the Wikipedia article will not be very accurate, but the overall picture remains impressive for someone like myself who is old enough to remember a time before mobile phones. And if anyone has any theories why Montenegro has so many mobile phones, please share your theory in the comments below!

* Unfortunately the Wikipedia article cites no source for the 19 million figure. Population statistics are sourced from the Australian Bureau of Statistics.

A gentle introduction to R

Whenever a post on this blog requires some data analysis and perhaps a chart or two, my tool of choice is the versatile statistical programming package R. Developed as an open-source implementation of an engine for the S programming language, R is therefore free. Since commercial mathematical packages can cost thousands of dollars, this alone makes R worth investigating. But what makes R particularly powerful is the large and growing array of specialised packages. For any statistical problem you come across, the chances are that someone has written a package that will make the problem much easier to get to grips with.

If it was not already clear, I am something of an R evangelist and I am not the only one. The growing membership of the Sydney Users of R Forum (SURF) suggests that we are getting some traction and there are a lot of people interested in learning more about R.

Sooner or later, every R beginner will come across An Introduction to R, which appears as the first link under Manuals on the R website. If you work your way through this introduction, you will get a good grounding in the essentials for using R. Unfortunately, it is very dry and it can be a challenge to get through. I certainly never managed to read it from start to finish in one sitting, but having used R for more than 10 years, I regularly return to read bits and pieces, so by now I have read and re-read it all many times. So, useful though this introduction is, it is not always a great place to start for R beginners.

There are many books available about R, including books focusing on the language itself, books on graphics in R, books on implementing particular statistical techniques in R and more than one introduction to R. A few weeks ago I was offered an electronic review copy of Statistical Analysis With R, a new beginner’s introduction to R by John M. Quick. Curious to see whether it could offer a good springboard into R, I decided to take up the offer.

At around 300 pages and covering a little less ground, it certainly takes a more leisurely pace than An Introduction to R. It also attempts a more engaging style by building a narrative around the premise that you have become a strategist for the Shu army in 3rd century China. The worked examples are all built around the challenge of looking at past battle statistics to determine the best strategy for a campaign against the rival Wei kingdom. Given how hard it can be to make an introduction to a statistical programming language exciting, it is certainly worth trying a novel approach. Still, some readers may find the Shu theme a little corny.

The book begins with instructions for downloading and installing R and goes on to explore the basics of importing and manipulating data, statistical exploration of the data (means, standard deviations and correlations), linear regression and finishes with a couple of chapters on producing and customising charts. This is a good selection of topics: mastery of these will give beginners a good grounding in the core capabilities of R. Readers with limited experience of statistics may be reassured that no assumptions are made about mathematical knowledge. The exploration of the battle data is used to provide a simple explanation of what linear regression is as well as the techniques available in R to perform the computations. While this approach certainly makes the book accessible to a broader audience, it is not without risks. Statistical tools are notorious for being abused by people who do not understand them properly. As a friend of mine likes to say, “drive-by regressions” can do a lot of damage!

Each chapter adopts the same structure: a brief introduction advancing the Shu story; a list of the topics covered in the chapter; a series of worked examples with sample commands to be entered into the R console followed by an explanatory “What just happened?” section and a “Pop quiz”; suggestions for further tasks for the readers to try; and finally a chapter summary. At times this approach feels a little repetitive (and the recurring heading “Have a go hero” for the suggested further tasks section may sound a little sarcastic to Australian readers at least), but it is thorough.

If I were to write my own introduction to R (one day perhaps?), I would do some things a little differently. I would try to explain a bit more about the semantics of the language, particularly the differences between the various data types (vectors, lists, data frames and so on). But perhaps that would just end up being as dry as An Introduction to R. Also, though I certainly agree with Quick that commenting your code is a very important discipline (even if no-one else ever reads it, you might have to read it again yourself!), I do think that he takes this principle too far in expecting readers to type all of the comments in the worked examples into the console!

Statistical Analysis With R is a very gentle introduction to R. If you have no prior experience of R, reading this book will certainly get you started. On the other hand, if you have already started experimenting with R, the pace may just be a little too slow.

Holiday reading

My now traditional annual pilgrimage to the South coast of New South Wales saw the rainiest weather I can remember. While it was nothing on the scale seen in Queensland and Victoria over recent weeks, it did take its toll on some of the wildlife: we saw dozens of dead porcupine puffers washed up on the beach, apparently the victims of an algal bloom triggered by the rains. On the plus side, the lack of sunshine did help me to catch up on a bit of overdue reading, including a review copy of a Beginner’s Guide to R which you can expect to hear more about when I manage to finish writing the review.

I also read two books about climate change, which were very different in style and content.

Merchants of Doubt

The first was Erik Conway and Naomi Oreskes’ Merchants of Doubt (How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming). The book is not really about climate change per se, but rather the modus operandi of a number of key climate skeptics. In the process it sheds some interesting light on a question I considered here on the blog about a year ago: why does belief or disbelief in the reality of climate change tend to be polarised along political lines? Most of the protagonists in the Merchants of Doubt are scientists, many of whom were physicists involved in the original US nuclear weapons program. The thesis that Conway and Oreskes build is that these scientists were committed anti-Communists and as the Cold War began to thaw, they saw threats to freedom and capitalism in other places, particularly in the environmental movement. That, at least, is the explanation given as to why the same names appear in defence of Ronald Reagan’s “Star Wars” missile defence scheme, in defence of the tobacco industry (first arguing against claims about the health risks of smoking, later about the health risks of second-hand smoke), dismissing the idea of acid rain and finally casting doubt on claims of human-induced climate change.

While I would not expect the book to sway any climate change skeptic, it should at least encourage people to think a bit harder about messengers as well as the message. It certainly prompted me to do just that. When reading the chapter on the second-hand smoke controversy, I immediately thought of an episode of Penn and Teller’s very entertaining pseudo-science debunking TV series Bullshit*. The episode in question, as I remembered it, did a convincing job of portraying the risks of second-hand smoke (SHS) as dubious at best. Watching it again was eye-opening. Looking past the scathing treatment of the anti-SHS activist, I focused instead on the credentials of the talking heads who were arguing that the science was not settled. The two main experts were Bob Levy from the Cato Institute, a libertarian think-tank, and Dr Elizabeth Whelan, the president of the American Council on Science and Health.

Levy’s voice immediately suggests he is a smoker, which does not, of course, disqualify him from questioning the science of SHS. More intriguing is the fact that the Cato Institute regularly appears as a company of interest in the Merchants of Doubt. Conway and Oreskes draw a number of links between the Cato Institute and both the defence of the tobacco industry and skepticism of global warming, particularly in the person of Steven Milloy who, before joining Cato, worked for a firm whose main claim to fame was to provide lobbying and public-relations support for tobacco giant Phillip Morris.

As for the American Council on Science and Health, it sounds at first like some kind of association of health professionals (which is presumably why Whelan chose the name). It is in fact an industry-funded lobby group…sorry, I mean an independent, nonprofit, tax-exempt organisation. Exactly how much of their funding comes from where is now shrouded in mystery, but here are the details as of 1991.

Of course, scrutinising the backgrounds of Levy and Whelan does not prove that their claims are wrong. It does, however, raise the question of why Penn and Teller did not interview anyone more independent, perhaps even a scientist, who expressed the same doubts.

What’s the Worst That Could Happen?

The second book on climate change that the rain helped me to read was Greg Craven’s book What’s the Worst That Could Happen?. I bought this after watching Craven’s amusing, if flawed, video “The Most Terrifying Video You Will Ever See”. Craven, a high-school science teacher in Oregon, has clearly workshopped the issue of climate change extensively with his students and the insight he wants to share in his videos and his book is essentially that the whole problem can be viewed from a game-theoretic perspective. Rather than trying to decide what is true or not (are the skeptics right or are the warmers right?), the important question is should we be acting or not.

Craven’s Global Warming Decision Grid

In his video, Craven uses an action versus outcome “decision grid” to argue that the consequences of not acting in the event that global warming turns out to be true are worse than the consequences of acting (i.e. economic costs) if it turns out to be false. The argument is entertaining, but unfortunately flawed. The problem is that it can be applied to any risk, however remote. As he writes in the book:

Simply insert any wildly speculative and really dangerous-sounding threat into the grid in place of global warming, and you’ll see the grid comes to the same conclusion–that we should do everything possible to stop the threat. Even if it’s something like giant mutant space hamsters (GMSHs).
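The logic of the grid, and the flaw Craven concedes, can be sketched in a few lines of Python. The cost labels and severity scores below are illustrative placeholders of mine, not Craven's figures; the point is only that the grid picks the action with the least-bad worst case, so any sufficiently scary hypothetical forces the same answer:

```python
# Craven-style decision grid: rows are our actions, columns are how
# reality turns out. The entries are made-up illustrative costs.
COSTS = {
    ("act",        "threat_real"):  "moderate economic cost",
    ("act",        "threat_false"): "wasted economic cost",
    ("do_nothing", "threat_real"):  "catastrophe",
    ("do_nothing", "threat_false"): "no cost",
}

# Rough severity ranking of the outcomes above (placeholder numbers).
SEVERITY = {
    "no cost": 0,
    "moderate economic cost": 1,
    "wasted economic cost": 1,
    "catastrophe": 3,
}

def grid_recommendation(costs):
    """Pick the action whose worst-case outcome is least bad (minimax)."""
    def worst_case(action):
        return max(SEVERITY[costs[(action, outcome)]]
                   for outcome in ("threat_real", "threat_false"))
    return min(("act", "do_nothing"), key=worst_case)

print(grid_recommendation(COSTS))  # prints "act"
```

Note that nothing in `grid_recommendation` ever asks how *likely* the threat is: swap "global warming" for giant mutant space hamsters and, as long as one cell says "catastrophe", the grid recommends acting. That is exactly the objection Craven spends the rest of the book trying to answer.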

The book is an attempt to rescue his idea by developing a series of tools to help sift through the arguments for and against climate change without having to actually understand the science. Along the way, he includes an extensive discussion of confirmation bias, which I enjoyed as I am fascinated by cognitive biases. Ultimately though, his conclusions rest on an argument from authority. While he makes an excellent case for the important role that authority plays in science, this approach will not win over the skeptics I know: I can already hear their riposte in the form of the establishment’s rejection of Alfred Wegener’s theory of continental drift.

Skeptics aside, What’s the Worst That Could Happen? is an extremely accessible book (perhaps even too folksy in its style for some) and is probably best read by those who are not already entrenched in one camp or another and are just sick of the whole shouting match.

* Long-time readers may remember that Bullshit has been mentioned on the blog before in this post about bottled water.

Hans Rosling: data visualisation guru

It is no secret that I am very interested in data visualisation, and yet I have never mentioned the work of Hans Rosling here on the blog. It is an omission I should finally correct, not least to acknowledge those readers who regularly email me links to Rosling’s videos.

Rosling is a doctor with a particular interest in global health and welfare trends. In an effort to broaden awareness of these trends, he founded the non-profit organisation Gapminder, which is described as:

a modern “museum” on the Internet – promoting sustainable global development and achievement of the United Nations Millennium Development Goals

Gapminder provides a rich repository of statistics from a wide range of sources and it was at Gapminder that Rosling’s famous animated bubble charting tool Trendalyzer was developed. I first saw Trendalyzer in action a number of years ago in a presentation Rosling gave at a TED conference. Rosling has continued to update his presentation and there are now seven TED videos available. But the video that Mule readers most often send me is the one below, taken from the BBC documentary “The Joy of Stats”.

If the four minutes of video here have whetted your appetite, the entire hour-long documentary is available on the Gapminder website. You can also take a closer look at Trendalyzer in action at Gapminder World.

A way with words

Sometimes the things that are unsaid are far more telling than the things said.

I had cause to reflect on this when I stumbled across a book on my shelves that I have not opened for many years. The book, entitled “Deutsche Bank: Dates, facts and figures 1870–1993”, is an English translation of the year-by-year history of the bank compiled by Manfred Pohl and Angelika Raab-Rebentisch. In keeping with the title, the style is more bullet points than narrative. Nevertheless, I continue to find the pages spanning World War II strangely fascinating.

In 1938, with the connivance of the French and British, Germany annexed the Sudetenland, the western border region of Czechoslovakia. For Deutsche Bank, this meant more branches.

Deutsche Bank 1938

The following year, Deutsche Bank was fortunate enough to be able to continue its branch expansion, this time into Poland. At least this time, there is a mention of the events outside the bank that may have been relevant.

Deutsche Bank 1939

Another year, and some more expansion for the bank including a few branches in France. No need to mention the invasion of France here, of course.

Deutsche Bank 1940

From 1942, outside events start to interfere with the bank: the “impact of war” forces rather inconvenient branch closures.

DB War End

To see these extracts in their full context, here are the pages spanning 1934 to 1940 and 1940 to 1946.