Econstudentlog

Information complexity and applications

I have previously posted multiple lectures per post here on the blog, or combined a lecture with other material (e.g. links such as those in the previous ‘random stuff’ post). I think such approaches have made me less likely to post lectures at all (if I don’t post a lecture soon after I’ve watched it, experience tells me I not infrequently never get around to posting it), and on top of that I don’t really watch a lot of lectures these days. For these reasons I have decided to start posting single-lecture posts. Considering the time expenditure of readers, this approach also seems justified: although it might take me as much time/work to watch and cover, say, 4 lectures as it would to read and cover 100 pages of a textbook, the time required of a reader is very different in the two cases. You’ll usually be able to read a post that took me multiple hours to write in a short amount of time, whereas the reader’s ‘time advantage’ is close to negligible in the case of lectures (maybe not completely; search costs are not entirely irrelevant). By posting multiple lectures in the same post I probably decrease the expected value of the time readers spend watching the content I upload, which seems suboptimal.

Here’s the youtube description of the lecture, which was posted a few days ago on the IAS youtube account:

“Over the past two decades, information theory has reemerged within computational complexity theory as a mathematical tool for obtaining unconditional lower bounds in a number of models, including streaming algorithms, data structures, and communication complexity. Many of these applications can be systematized and extended via the study of information complexity – which treats information revealed or transmitted as the resource to be conserved. In this overview talk we will discuss the two-party information complexity and its properties – and the interactive analogues of classical source coding theorems. We will then discuss applications to exact communication complexity bounds, hardness amplification, and quantum communication complexity.”

He actually decided to skip the quantum communication complexity material because of time constraints. I should note that the lecture was ‘easy enough’ that I could follow most of it, so it is not really that difficult, at least not if you know some basic information theory.

A few links to related stuff (you can take these links as indications of what sort of stuff the lecture is about/discusses, if you’re on the fence about whether or not to watch it):
Computational complexity theory.
Shannon entropy.
Shannon’s source coding theorem.
Communication complexity.
Communications protocol.
Information-based complexity.
Hash function.
From Information to Exact Communication (in the lecture he discusses some aspects covered in this paper).
Unique games conjecture (Two-prover proof systems).
A Counterexample to Strong Parallel Repetition (another paper mentioned/briefly discussed during the lecture).
Pinsker’s inequality.
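A couple of the concepts linked above are easier to get a feel for with a few numbers, so here is a minimal Python sketch of my own (not from the lecture) that computes the Shannon entropy of a distribution and numerically checks Pinsker’s inequality, which bounds total variation distance in terms of KL divergence; the two distributions are just made-up toy examples:

```python
import math

def entropy_bits(p):
    """Shannon entropy H(P) in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_nats(p, q):
    """Kullback-Leibler divergence D(P||Q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance between P and Q."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

p = [0.5, 0.3, 0.2]   # toy distributions
q = [0.4, 0.4, 0.2]

print(f"H(P) = {entropy_bits(p):.4f} bits")
# Pinsker's inequality (with D measured in nats): D(P||Q) >= 2 * TV(P,Q)^2
d, tv = kl_nats(p, q), total_variation(p, q)
print(f"D(P||Q) = {d:.4f} >= 2*TV^2 = {2 * tv ** 2:.4f}")
```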

An interesting aspect I once again noted during this lecture is the sort of loose linkage you sometimes observe between the topics of game theory/microeconomics and computer science. The link is of course made explicit a few minutes later in the talk, when he discusses the unique games conjecture to which I link above, but it’s perhaps worth noting that the link is on display even before that point is reached. Around 38 minutes into the lecture he mentions that one of the relevant proofs ‘involves such things as Lagrange multipliers and optimization’. I was far from surprised, as from a certain point of view the problem he discusses at that point is conceptually very similar to some problems encountered in auction theory, where Lagrange multipliers and optimization problems are frequently encountered. If you are too unfamiliar with that field to see how a similar problem might appear in an auction theory context: what you have there are auction participants who prefer not to reveal their true willingness to pay, and some auction designs actually work in a manner very similar to the (pseudo-)protocol described in the lecture, and are thus used to reveal it (for some subset of participants, at least).
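To make the Lagrange-multiplier remark slightly less abstract, here is a toy consumer-choice example of my own (it has nothing to do with the specific proof he mentions): maximize a Cobb-Douglas utility function subject to a budget constraint, where the multiplier comes out as the marginal utility of income. The prices and income below are made up.

```python
# Maximize u(x, y) = x^0.5 * y^0.5 subject to px*x + py*y = m.
# The first-order (Lagrange) conditions give x* = m/(2*px), y* = m/(2*py),
# and the multiplier equals the marginal utility of income.
px, py, m = 2.0, 5.0, 100.0   # made-up prices and income

x_star, y_star = m / (2 * px), m / (2 * py)
lam_x = 0.5 * x_star ** -0.5 * y_star ** 0.5 / px   # MU_x / px
lam_y = 0.5 * x_star ** 0.5 * y_star ** -0.5 / py   # MU_y / py

print(x_star, y_star)                    # 25.0, 10.0
print(round(lam_x, 4), round(lam_y, 4))  # equal at the optimum, as they should be
```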

March 12, 2017 Posted by | Computer science, Game theory, Lectures, papers | Leave a comment

Random stuff

i. A very long but entertaining chess stream by Peter Svidler was recently uploaded on the Chess24 youtube account – go watch it here, if you like that kind of stuff. The fact that it’s five hours long is a reason to rejoice, not a reason to think that it’s ‘too long to be watchable’ – watch it in segments…

People interested in chess might also like to know that Magnus Carlsen has made an account on the ICC and played on it, as a result of his recent participation in the ICC Open 2016 (link). A requirement for participation in the tournament was that people had to know whom they were playing against (so there would be no ultra-strong GMs playing under anonymous accounts in the finals – they could use accounts with strange names, but people had to know whom they were playing), so now we know that Magnus Carlsen has played under the nick ‘stoptryharding’ on the ICC. Carlsen did not win the tournament, as he lost to Grischuk in the semi-finals. Some very strong players were incidentally knocked out in the qualifiers, including Nepomniachtchi, the current #5 in the world on the FIDE live blitz ratings.

ii. A lecture:

iii. Below I have added some new words I’ve encountered, most of them in books I’ve read (I have not spent much time on vocabulary.com recently). I’m sure if I were to look all of them up on vocabulary.com some (many?) of them would not be ‘new’ to me, but that’s not going to stop me from including them here (I included the word ‘inculcate’ below for a reason…). Do take note of the spelling of some of these words – some of them are tricky ones included in Bryson’s Dictionary of Troublesome Words: A Writer’s Guide to Getting It Right, which people often get wrong for one reason or another:

Conurbation, epizootic, equable, circumvallation, contravallation, exiguous, forbear, louche, vituperative, thitherto, congeries, inculcate, obtrude, palter, idiolect, hortatory, enthalpy (see also wiki, or Khan Academy), trove, composograph, indite, mugginess, apodosis, protasis, invidious, inveigle, inflorescence, kith, anatopism, laudation, luxuriant, maleficence, misogamy (I did not know this was a word, and I’ll definitely try to remember it/that it is…), obsolescent, delible, overweening, parlay (this word probably does not mean what you think it means…), perspicacity, perspicuity, temblor, precipitous, quinquennial, razzmatazz, turpitude, vicissitude, vitriform.

iv. Some quotes from this excellent book review, by Razib Khan:

“relatively old-fashioned anti-religious sentiments […] are socially acceptable among American Left-liberals so long as their targets are white Christians (“punching up”) but more “problematic” and perhaps even “Islamophobic” when the invective is hurled at Muslim “people of color” (all Muslims here being tacitly racialized as nonwhite). […] Muslims, as marginalized people, are now considered part of a broader coalition on the progressive Left. […] most Left-liberals who might fall back on the term Islamophobia, don’t actually take Islam, or religion generally, seriously. This explains the rapid and strident recourse toward a racial analogy for Islamic identity, as that is a framework that modern Left-liberals and progressives have internalized and mastered. The problem with this is that Islam is not a racial or ethnic identity, it is a set of beliefs and practices. Being a Muslim is not about being who you are in a passive sense, but it is a proactive expression of a set of ideas about the world and your behavior within the world. This category error renders much of Left-liberal and progressive analysis of Islam superficial, and likely wrong.”

“To get a genuine understanding of a topic as broad and boundless as Islam one needs to both set aside emotional considerations, as Ben Affleck can not, and dig deeply into the richer and more complex empirical texture, which Sam Harris has not.”

“One of the most obnoxious memes in my opinion during the Obama era has been the popularization of the maxim that “The arc of the moral universe is long, but it bends towards justice.” It is smug and self-assured in its presentation. […] too often it becomes an excuse for lazy thinking and shallow prognostication. […] Modern Western liberals have a particular idea of what a religion is, and so naturally know that Islam is in many ways just like United Methodism, except with a hijab and iconoclasm. But a Western liberalism that does not take cultural and religious difference seriously is not serious, and yet all too often it is what we have on offer. […] On both the American Left and Right there is a tendency to not even attempt to understand Islam. Rather, stylized models are preferred which lead to conclusions which are already arrived at.”

“It’s fine to be embarrassed by reality. But you still need to face up to reality. Where Hamid, Harris, and I all start is the fact that the vast majority of the world’s Muslims do not hold views on social issues that are aligned with the Muslim friends of Hollywood actors. […] Before the Green Revolution I told people to expect there to be a Islamic revival, as 86 percent of Egyptians polled agree with the killing of apostates. This is not a comfortable fact for me, as I am technically an apostate.* But it is a fact. Progressives who exhibit a hopefulness about human nature, and confuse majoritarian democracy with liberalism and individual rights, often don’t want to confront these facts. […] Their polar opposites are convinced anti-Muslims who don’t need any survey data, because they know that Muslims have particular views a priori by virtue of them being Muslims. […] There is a glass half-full/half-empty aspect to the Turkish data. 95 percent of Turks do not believe apostates should be killed. This is not surprising, I know many Turkish atheists personally. But, 5 percent is not a reassuring fraction as someone who is personally an apostate. The ideal, and frankly only acceptable, proportion is basically 0 percent.”

“Harris would give a simple explanation for why Islam sanctions the death penalty for apostates. To be reductive and hyperbolic, his perspective seems to be that Islam is a totalitarian cult, and its views are quite explicit in the Quran and the Hadith. Harris is correct here, and the views of the majority of Muslims in Egypt (and many other Muslim nations) has support in Islamic law. The consensus historical tradition is that apostates are subject to the death penalty. […] the very idea of accepting atheists is taboo in most Arab countries”.

“Christianity which Christians hold to be fundamental and constitutive of their religion would have seemed exotic and alien even to St. Paul. Similarly, there is a much smaller body of work which makes the same case for Islam.

A précis of this line of thinking is that non-Muslim sources do not make it clear that there was in fact a coherent new religion which burst forth out of south-central Arabia in the 7th century. Rather, many aspects of Islam’s 7th century were myths which developed over time, initially during the Umayyad period, but which eventually crystallized and matured into orthodoxy under the Abbasids, over a century after the death of Muhammad. This model holds that the Arab conquests were actually Arab conquests, not Muslim ones, and that a predominantly nominally Syrian Christian group of Arab tribes eventually developed a new religion to justify their status within the empire which they built, and to maintain their roles within it. The mawali (convert) revolution under the Abbasids in the latter half of the 8th century transformed a fundamentally Arab ethnic sect, into a universal religion. […] The debate about the historical Jesus only emerged when the public space was secularized enough so that such discussions would not elicit violent hostility from the populace or sanction form the authorities. [T]he fact is that the debate about the historical Muhammad is positively dangerous and thankless. That is not necessarily because there is that much more known about Muhammad than Jesus, it is because post-Christian society allows for an interrogation of Christian beliefs which Islamic society does not allow for in relation to Islam’s founding narratives.”

“When it comes to understanding religion you need to start with psychology. In particular, cognitive psychology. This feeds into the field of evolutionary anthropology in relation to the study of religion. Probably the best introduction to this field is Scott Atran’s dense In Gods We Trust: The Evolutionary Landscape of Religion. Another representative work is Theological Incorrectness: Why Religious People Believe What They Shouldn’t. This area of scholarship purports to explain why religion is ubiquitous, and, why as a phenomenon it tends to exhibit a particular distribution of characteristics.

What cognitive psychology suggests is that there is a strong disjunction between the verbal scripts that people give in terms of what they say they believe, and the internal Gestalt mental models which seem to actually be operative in terms of informing how they truly conceptualize the world. […] Muslims may aver that their god is omniscient and omnipresent, but their narrative stories in response to life circumstances seem to imply that their believe god may not see or know all things at all moments.

The deep problem here is understood [by] religious professionals: they’ve made their religion too complex for common people to understand without their intermediation. In fact, I would argue that theologians themselves don’t really understand what they’re talking about. To some extent this is a feature, not a bug. If the God of Abraham is transformed into an almost incomprehensible being, then religious professionals will have perpetual work as interpreters. […] even today most Muslims can not read the Quran. Most Muslims do not speak Arabic. […] The point isn’t to understand, the point is that they are the Word of God, in the abstract. […] The power of the Quran is that the Word of God is presumably potent. Comprehension is secondary to the command.”

“the majority of the book […] is focused on political and social facts in the Islamic world today. […] That is the best thing about Islamic Exceptionalism, it will put more facts in front of people who are fact-starved, and theory rich. That’s good.”

“the term ‘fundamentalist’ in the context of islam isn’t very informative.” (from the comments).

Below I have added some (very) superficially related links of my own, most of them ‘data-related’ (in general I’d say that I usually find ‘raw data’ more interesting than ‘big ideas’):

*My short review of Theological Incorrectness, one of the books Razib mentions.

*Of almost 163,000 people who applied for asylum in Sweden last year, fewer than 500 landed a job (news article).

*An analysis of Danish data conducted by the Rockwool Foundation found that among spouses/relatives etc. family-reunified to refugees, 22 % were employed after having lived in Denmark for five years (that is, the family-reunified individuals, not the refugees themselves). Only one in three of the family-reunified individuals had managed to find a job after having stayed here for fifteen years. The employment rate of people family-reunified to immigrants is 49 % after 5 years in the country, and the number is below 60 % after 15 years. In Denmark, the employment rate of immigrants from non-Western countries was 47.7 % in November 2013, compared to 73.8 % for people of (…’supposedly’, see also my comments and observations here) Danish origin, according to numbers from Statistics Denmark (link). When you look at the economic performance of people with refugee status themselves, 34 % are employed after 5 years, but that number is almost unchanged a decade later – only 37 % are employed after they’ve stayed in Denmark for 15 years.
Things of course sometimes look even worse at the local level than these averages suggest, precisely because they are averages; for example, of the 244 refugees and family-reunified individuals who had arrived in the Danish Elsinore Municipality within the last three years, exactly 5 were in full-time employment.

*Rotherham child sexual exploitation scandal (“The report estimated that 1,400 children had been sexually abused in the town between 1997 and 2013, predominantly by gangs of British-Pakistani Muslim men […] Because most of the perpetrators were of Pakistani heritage, several council staff described themselves as being nervous about identifying the ethnic origins of perpetrators for fear of being thought racist […] It was reported in June 2015 that about 300 suspects had been identified.”)

*A memorial service for the terrorist and murderer Omar El-Hussein who went on a shooting rampage in Copenhagen last year (link) gathered 1500 people, and 600-700 people also participated at the funeral (Danish link).

*Pew asked muslims in various large countries whether they thought ‘Suicide Bombing of Civilian Targets to Defend Islam [can] be Justified?’ More than a third of French muslims think that it can, either ‘often/sometimes’ (16 %) or ‘rarely’ (19 %). Roughly a fourth of British muslims think so as well (15 % often/sometimes, 9 % rarely). Of course in countries like Jordan, Nigeria, and Egypt the proportion of people who do not reply ‘never’ is above 50 %. In such contexts people often like to focus on what the majorities think, but I found it interesting to note that in only 2 of the 11 countries queried (Germany – 7 %, & the US – 8 %) did less than 10 % of muslims think suicide bombings were either ‘often’ or ‘sometimes’ justified. Those numbers are some years old. Newer numbers (from non-Western countries only, unfortunately) tell us that e.g. fewer than two out of five Egyptians (38%) and fewer than three out of five Turks (58%) answered ‘never’ when asked this question just a couple of years ago, in 2014.

*A few non-data related observations here towards the end. I do think Razib is right that cognitive psychology is a good starting point if you want to ‘understand religion’, but a more general point I would make is that there are many different analytical approaches to these sorts of topics which one might employ, and I think it’s important that one does not privilege any single analytical framework over the others (just to be clear, I’m not saying that Razib’s doing this); different approaches may yield different insights, perhaps at different analytical levels, and combining different approaches is likely to be very useful in order to get ‘the bigger picture’, or at least to not overlook important details. ‘History’, broadly defined, may provide one part of the explanatory model, cognitive psychology another part, mathematical anthropology (e.g. stuff like this) probably also has a role to play, etc., etc.. Survey data, economic figures, scientific literatures on a wide variety of topics like trust, norms, migration analysis, and conflict studies, e.g. those dealing with civil wars, may all help elucidate important questions of interest, if not by adding relevant data then by providing additional methodological approaches/scaffoldings which might be fruitfully employed to make sense of the data that is available.

v. Statistical Portrait of Hispanics in the United States.

vi. The Level and Nature of Autistic Intelligence. Autistics may be smarter than people have been led to believe:

“Autistics are presumed to be characterized by cognitive impairment, and their cognitive strengths (e.g., in Block Design performance) are frequently interpreted as low-level by-products of high-level deficits, not as direct manifestations of intelligence. Recent attempts to identify the neuroanatomical and neurofunctional signature of autism have been positioned on this universal, but untested, assumption. We therefore assessed a broad sample of 38 autistic children on the preeminent test of fluid intelligence, Raven’s Progressive Matrices. Their scores were, on average, 30 percentile points, and in some cases more than 70 percentile points, higher than their scores on the Wechsler scales of intelligence. Typically developing control children showed no such discrepancy, and a similar contrast was observed when a sample of autistic adults was compared with a sample of nonautistic adults. We conclude that intelligence has been underestimated in autistics.”

I recall that back when I was diagnosed I was subjected to a battery of different cognitive tests of various kinds, and a few of those tests I recall thinking were very difficult, compared to how difficult they somehow ‘ought to be’ – it was like ‘this should be an easy task for someone who has the mental hardware to solve this type of problem, but I don’t seem to have that piece of hardware; I have no idea how to manipulate these objects in my head so that I might answer that question’. This was an at least somewhat unfamiliar feeling to me in a testing context, and I definitely did not have this experience when doing the Mensa admissions test later on, which was based on Raven’s matrices. Despite the fact that all IQ tests are supposed to measure pretty much the same thing I do not find it hard to believe that there are some details here which may complicate matters a bit in specific contexts, e.g. for people whose brains may not be structured quite the same way ‘ordinary brains’ are (to put it very bluntly). But of course this is just one study and a few personal impressions – more research is needed, etc. (Even though the effect size is huge.)

Slightly related to the above is also this link – I must admit that I find the title question quite interesting. I find it very difficult to picture characters featuring in books I’m reading in my mind, and so usually when I read books I don’t form any sort of coherent mental image of what the character looks like. It doesn’t matter to me, I don’t care. I have no idea if this is how other people read (fiction) books, or if they actually imagine what the characters look like more or less continuously while those characters are described doing the things they might be doing; to me it would be just incredibly taxing to keep even a simplified mental model of the physical attributes of a character in my mind for even a minute. I can recall specific traits like left-handedness and similar without much difficulty if I think the trait might have relevance to the plot, which has helped me while reading e.g. Agatha Christie novels before, but actively imagining what people look like in my mind I just find very difficult. I find it weird to think that some people might do something like that almost automatically, without thinking about it.

vii. Computer Science Resources. I recently shared the link with a friend, but of course she was already aware of the existence of this resource. Some people reading along here may not be, so I’ll include the link here. It has a lot of stuff.

June 8, 2016 Posted by | books, Chess, Computer science, data, demographics, Psychology, random stuff, religion | Leave a comment

Random stuff

I find it difficult to find the motivation to finish the half-finished drafts I have lying around, so this will have to do. Some random stuff below.

i.

(15,000 views… In some sense that seems really ‘unfair’ to me, but on the other hand I doubt either Beethoven or Gilels cares; they’re both long dead, after all…)

ii. New/newish words I’ve encountered in books, on vocabulary.com or elsewhere:

Agley, peripeteia, dissever, halidom, replevin, socage, organdie, pouffe, dyarchy, tauricide, temerarious, acharnement, cadger, gravamen, aspersion, marronage, adumbrate, succotash, deuteragonist, declivity, marquetry, machicolation, recusal.

iii. A lecture:

It’s been a long time since I watched it so I don’t have anything intelligent to say about it now, but I figured it might be of interest to one or two of the people who still subscribe to the blog despite the infrequent updates.

iv. A few wikipedia articles (I won’t comment much on the contents or quote extensively from the articles the way I’ve done in previous wikipedia posts – the links shall have to suffice for now):

Duverger’s law.

Far side of the moon.

Preference falsification.

Russian political jokes. Some of those made me laugh (e.g. this one: “A judge walks out of his chambers laughing his head off. A colleague approaches him and asks why he is laughing. ‘I just heard the funniest joke in the world!’ ‘Well, go ahead, tell me!’ says the other judge. ‘I can’t – I just gave someone ten years for it!’”).

Political mutilation in Byzantine culture.

v. World War 2, if you think of it as a movie, has a highly unrealistic and implausible plot, according to this amusing post by Scott Alexander. Having recently read a rather long book about these topics, one aspect I’d have added had I written the piece myself is that an additional factor making the setting seem even more implausible is how so many presumably quite smart people were – what at least in retrospect seems – unbelievably stupid when it came to Hitler’s ideas and intentions before the war. As for Churchill’s own life, I’d also add that a movie about his life during the war – which you could probably make relatively easily just by basing it on his own copious and widely shared notes – could probably turn out quite decent. His own comments, remarks, and observations certainly made for a great book.

May 15, 2016 Posted by | astronomy, Computer science, history, language, Lectures, mathematics, music, random stuff, Russia, wikipedia | Leave a comment

A few lectures

The sound quality of this lecture is not optimal – there’s a recurring echo popping up now and then which I found slightly annoying – but this should not keep you from watching the lecture. It’s quite a good lecture, and very accessible – I don’t think you even need to know anything about genetics to follow most of what he’s talking about; as far as I can tell it’s a lecture intended for people who don’t really know much about population genetics. He introduces key concepts as they are needed and does not go much into the technical details which might cause people trouble (this of course also makes the lecture somewhat superficial, but you can’t get everything). If you’re the sort of person who wants details not included in the lecture, you’re probably already reading e.g. Razib Khan (who incidentally recently blogged/criticized a paper not too dissimilar from the one discussed in the lecture, dealing with South Asia)…

I must admit that I actually didn’t like this lecture very much, but I figured I might as well include it in this post anyway.

I found some questions included and some aspects of the coverage a bit ‘too basic’ for my taste, but other people interested in chess reading along here may like Anna’s approach better; like Krause’s lecture I think it’s an accessible lecture, despite the fact that it actually covers many lines in quite a bit of detail. It’s a long lecture but I don’t think you necessarily need to watch all of it in one go (…or at all?) – the analysis of the second game, the Kortschnoj-Gheorghiu game, starts around 45 minutes in so that might for example be a good place to include a break, if a break is required.

February 1, 2016 Posted by | anthropology, archaeology, Chess, Computer science, genetics, history, Lectures | Leave a comment

A few lectures

Below are three new lectures from the Institute for Advanced Study. As far as I’ve gathered they’re all from an IAS symposium called ‘Lens of Computation on the Sciences’ – all three lecturers are computer scientists, but you don’t have to be a computer scientist to watch these lectures.

Should computer scientists and economists band together more and try to use the insights from one field to help solve problems in the other field? Roughgarden thinks so, and provides examples of how this might be done/has been done. Applications discussed in the lecture include traffic management and auction design. I’m not sure how much of this lecture is easy to follow for people who don’t know anything about either topic (i.e., computer science and economics), but I found it not too difficult to follow – it probably helped that I’ve actually done work on a few of the things he touches upon in the lecture, such as basic auction theory, the fixed point theorems and related proofs, basic queueing theory and basic discrete maths/graph theory. Either way there are certainly much more technical lectures than this one available at the IAS channel.
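As a toy illustration of the kind of auction-design object the lecture has in mind (my own sketch, not something taken from the talk), here is a second-price (Vickrey) auction, the textbook mechanism in which bidding your true value is a dominant strategy:

```python
def second_price_auction(bids):
    """bids: dict mapping bidder -> bid. The highest bidder wins but pays
    the second-highest bid, which is what makes truthful bidding optimal."""
    ranked = sorted(bids, key=bids.get, reverse=True)
    winner, runner_up = ranked[0], ranked[1]
    return winner, bids[runner_up]

# Hypothetical bids:
print(second_price_auction({"alice": 10, "bob": 7, "carol": 9}))  # ('alice', 9)
```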

I don’t have Facebook and I’m not planning on ever getting a FB account, so I’m not really sure I care about the things this guy is trying to do, but the lecturer does touch upon some interesting topics in network theory. Not a great lecture in my opinion and occasionally I think the lecturer ‘drifts’ a bit, talking without saying very much, but it’s also not a terrible lecture. A few times I was really annoyed that you can’t see where he’s pointing that damn laser pointer, but this issue should not stop you from watching the video, especially not if you have an interest in analytical aspects of how to approach and make sense of ‘Big Data’.

I’ve noticed that Scott Alexander has said some nice things about Scott Aaronson a few times, but until now I’ve never actually read any of the latter guy’s stuff or watched any lectures by him. I agree with Scott (Alexander) that Scott (Aaronson) is definitely a smart guy. This is an interesting lecture; I won’t pretend I understood all of it, but it has some thought-provoking ideas and important points in the context of quantum computing and it’s actually a quite entertaining lecture; I was close to laughing a couple of times.

January 8, 2016 Posted by | Computer science, economics, Game theory, Lectures, mathematics, Physics | Leave a comment

Random stuff/Open Thread

i. A lecture on mathematical proofs:

ii. “In the fall of 1944, only seven percent of all bombs dropped by the Eighth Air Force hit within 1,000 feet of their aim point.”

From wikipedia’s article on Strategic bombing during WW2. The article has a lot of stuff. The ‘RAF estimates of destruction of “built up areas” of major German cities’ numbers in the article made my head spin – they didn’t bomb the Germans back to the stone age, but they sure tried. Here’s another observation from the article:

“After the war, the U.S. Strategic Bombing Survey reviewed the available casualty records in Germany, and concluded that official German statistics of casualties from air attack had been too low. The survey estimated that at a minimum 305,000 were killed in German cities due to bombing and estimated a minimum of 780,000 wounded. Roughly 7,500,000 German civilians were also rendered homeless.” (The German population at the time was roughly 70 million).

iii. Also war-related: Eddie Slovik:

Edward Donald “Eddie” Slovik (February 18, 1920 – January 31, 1945) was a United States Army soldier during World War II and the only American soldier to be court-martialled and executed for desertion since the American Civil War.[1][2]

Although over 21,000 American soldiers were given varying sentences for desertion during World War II, including 49 death sentences, Slovik’s was the only death sentence that was actually carried out.[1][3][4]

During World War II, 1.7 million courts-martial were held, representing one third of all criminal cases tried in the United States during the same period. Most of the cases were minor, as were the sentences.[2] Nevertheless, a clemency board, appointed by the Secretary of War in the summer of 1945, reviewed all general courts-martial where the accused was still in confinement.[2][5] That Board remitted or reduced the sentence in 85 percent of the 27,000 serious cases reviewed.[2] The death penalty was rarely imposed, and those cases typically were for rapes or murders. […] In France during World War I from 1917 to 1918, the United States Army executed 35 of its own soldiers, but all were convicted of rape and/or unprovoked murder of civilians and not for military offenses.[13] During World War II in all theaters of the war, the United States military executed 102 of its own soldiers for rape and/or unprovoked murder of civilians, but only Slovik was executed for the military offense of desertion.[2][14] […] of the 2,864 army personnel tried for desertion for the period January 1942 through June 1948, 49 were convicted and sentenced to death, and 48 of those sentences were voided by higher authority.”

What motivated me to read the article was mostly curiosity about how many people were actually executed for deserting during the war, a question I’d never encountered any answers to previously. The US number turned out to be, well, let’s just say it’s lower than I’d expected. American soldiers who chose to desert during the war seem to have had much, much better chances of surviving it than soldiers who did not. Slovik was not a lucky man. On a related note, given numbers like these I’m really surprised desertion rates were not much higher than they were; presumably community norms (‘desertion = disgrace’, which would probably rub off on other family members…) played a key role here.

iv. Chess and infinity. I haven’t posted this link before even though the thread is a few months old, and I figured that given that I just had a conversation on related matters in the comment section of SCC (here’s a link) I might as well repost some of this stuff here. Some key points from the thread (I had to make slight formatting changes to the quotes because wordpress had trouble displaying some of the numbers, but the content is unchanged):

u/TheBB:
“Shannon has estimated the number of possible legal positions to be about 10^43. The number of legal games is quite a bit higher, estimated by Littlewood and Hardy to be around 10^(10^5) (commonly cited as 10^(10^50), perhaps due to a misprint). This number is so large that it can’t really be compared with anything that is not combinatorial in nature. It is far larger than the number of subatomic particles in the observable universe, let alone stars in the Milky Way galaxy.

As for your bonus question, a typical chess game today lasts about 40 to 60 moves (let’s say 50). Let us say that there are 4 reasonable candidate moves in any given position. I suspect this is probably an underestimate if anything, but let’s roll with it. That gives us about 4^(2×50) ≈ 10^60 games that might reasonably be played by good human players. If there are 6 candidate moves, we get around 10^77, which is in the neighbourhood of the number of particles in the observable universe.”

u/Wondersnite:
“To put 10^(10^5) into perspective:

There are 10^80 protons in the Universe. Now imagine inside each proton, we had a whole entire Universe. Now imagine again that inside each proton inside each Universe inside each proton, you had another Universe. If you count up all the protons, you get (10^80)^3 = 10^240, which is nowhere near the number we’re looking for.

You have to have Universes inside protons all the way down to 1250 steps to get the number of legal chess games that are estimated to exist. […]

Imagine that every single subatomic particle in the entire observable universe was a supercomputer that analysed a possible game in a single Planck unit of time (10^-43 seconds, the time it takes light in a vacuum to travel 10^-20 times the width of a proton), and that every single subatomic particle computer was running from the beginning of time up until the heat death of the Universe, 10^1000 years ≈ 10^11 × 10^1000 seconds from now.

Even in these ridiculously favorable conditions, we’d only be able to calculate

10^80 × 10^43 × 10^11 × 10^1000 = 10^1134

possible games. Again, this doesn’t even come close to 10^(10^5) = 10^100000.

Basically, if we ever solve the game of chess, it definitely won’t be through brute force.”
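The arithmetic in the quotes above is easy to reproduce; here is a short Python sketch that just redoes the order-of-magnitude calculations, using the same numbers as the thread and therefore inheriting whatever rough assumptions the commenters made:

```python
from math import log10

legal_positions_exp = 43      # Shannon's estimate: ~10^43 legal positions
candidate_moves = 4           # 'reasonable' moves per position
plies = 2 * 50                # ~50 moves per side

# 4^(2*50) 'reasonable' games, expressed as a power of ten:
reasonable_games_exp = plies * log10(candidate_moves)
print(f"~10^{reasonable_games_exp:.0f} reasonable games")      # ~10^60

# The brute-force bound from the second comment:
# 10^80 particles x 10^43 Planck times per second x 10^11 x 10^1000 seconds
brute_force_exp = 80 + 43 + 11 + 1000
print(f"at most ~10^{brute_force_exp} games examined")          # 10^1134
print(f"versus ~10^{10 ** 5} legal games in total")             # 10^100000
```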

v. An interesting resource which a friend of mine recently shared with me and which I thought I should share here as well: Nature Reviews – Disease Primers.

vi. Here are some words I’ve recently encountered on vocabulary.com: augury, spangle, imprimatur, apperception, contrition, ensconce, impuissance, acquisitive, emendation, tintinnabulation, abalone, dissemble, pellucid, traduce, objurgation, lummox, exegesis, probity, recondite, impugn, viscid, truculence, appurtenance, declivity, adumbrate, euphony, educe, titivate, cerulean, ardour, vulpine.

May 16, 2015 Posted by | Chess, Computer science, history, Lectures, mathematics | Leave a comment

Belief-Based Stability in Coalition Formation with Uncertainty…

“In this book we present several novel concepts in cooperative game theory, but from a computer scientist’s point of view. Especially, we will look at a type of games called non-transferable utility games. […] In this book, we extend the classic stability concept of the non-transferable utility core by proposing new belief-based stability criteria under uncertainty, and illustrate how the new concept can be used to analyse the stability of a new type of belief-based coalition formation game. Mechanisms for reaching solutions of the new stable criteria are proposed and some real life application examples are studied. […] In Chapter 1, we first provide an introduction of topics in game theory that are relevant to the concepts discussed in this book. In Chapter 2, we review some relevant works from the literature, especially in cooperative game theory and multi-agent coalition formation problems. In Chapter 3, we discuss the effect of uncertainty in the agent’s beliefs on the stability of the games. A rule-based approach is adopted and the concepts of strong core and weak core are introduced. We also discuss the effect of precision of the beliefs on the stability of the coalitions. In Chapter 4, we introduce private beliefs in non-transferable utility (NTU) games, so that the preferences of the agents are no longer common knowledge. The impact of belief accuracy on stability is also examined. In Chapter 5, we study an application of the proposed belief-based stability concept, namely the buyer coalition problem, and we see how the proposed concept can be used in the evaluation of this multi-agent coalition formation problem. In Chapter 6, we combine the works of earlier chapters and produce a complete picture of the introduced concepts: non-transferable utility games with private beliefs and uncertainty. We conclude this book in Chapter 7.”

The above quote is from the preface of the book, which I finished yesterday. It deals with some issues I was slightly annoyed about not being covered in a previous micro course; my main problem being that it seemed to me back then that the question of belief accuracy and the role of this variable was not properly addressed in the models we looked at (‘people can have mistaken beliefs, and it seems obvious that the ways in which they’re wrong can affect which solutions are eventually reached’). The book makes the point that if you look at coalition formation in a context where it is not reasonable to assume that information is shared among coalition partners (because it is in the interest of the participants to keep their information/preferences/willingness to pay private), then the beliefs of the potential coalition partners may play a major role in determining which coalitions are feasible and which are ruled out. A key point is that in the model context explored by the authors, inaccurate beliefs of agents will expand the number of potential coalitions which are available, although coalition options ruled out by accurate beliefs are less stable than ones which are not. They do not discuss the fact that this feature is unquestionably a result of implicit assumptions made along the way which may not be true, and that inaccurate beliefs may also in some contexts conceivably lead to lower solution support in general (e.g. through variables such as disagreement, or, to think more in terms of concepts specifically included in their model framework, higher general instability of solutions which can feasibly be reached, making agents less likely to explore the option of participating in coalitions in the first place due to the lower payoffs associated with the available coalitions likely to be reached – dynamics such as these are not included in the coverage). I decided early on to not blog the stuff in this book in major detail because it’s not the kind of book where this makes sense to do (in my opinion), but if you’re curious about how they proceed, they talk quite a bit about the (classical) Core and discuss why this is not an appropriate solution concept to apply in the contexts they explore, and they then proceed to come up with new and better solution criteria, developed with the aid of some new variables and definitions along the way, in order to end up with some better solution concepts, their so-called ‘belief-based cores’, which are perhaps best thought of as extensions of the classical core concept. I should perhaps point out, as this may not be completely clear, that the beliefs they talk about deal both with the ‘state of nature’ (which in part of the coverage is assumed to be basically unobservable) and the preferences of agents involved.
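For readers who like to see such stability notions in code, here is a small toy sketch of my own – a hedonic-game-flavoured blocking check over preference relations, which is only loosely in the spirit of the classical core and emphatically not the authors’ belief-based cores; the agents and preference rankings are entirely made up:

```python
from itertools import combinations

agents = ["a", "b", "c"]

# Hypothetical preference relations: higher number = more preferred coalition.
prefs = {
    "a": {("a",): 0, ("a", "b"): 2, ("a", "c"): 1, ("a", "b", "c"): 3},
    "b": {("b",): 0, ("a", "b"): 3, ("b", "c"): 1, ("a", "b", "c"): 4},
    "c": {("c",): 1, ("a", "c"): 0, ("b", "c"): 2, ("a", "b", "c"): 3},
}

def coalition_of(agent, partition):
    """Return the coalition in the partition that contains the agent."""
    return next(S for S in partition if agent in S)

def blocking_coalitions(partition):
    """Coalitions whose members all strictly prefer them to their current coalition."""
    blocks = []
    for r in range(1, len(agents) + 1):
        for S in combinations(agents, r):
            if all(prefs[i][S] > prefs[i][coalition_of(i, partition)] for i in S):
                blocks.append(S)
    return blocks

partition = [("a", "b"), ("c",)]
print(blocking_coalitions(partition))   # [('a', 'b', 'c')] -> this partition is not stable
```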

If you want a bigger-picture idea of what this book is about, I should point out that game theory in general has two major sub-fields, dealing with cooperative and non-cooperative games respectively. Within the sub-field of cooperative games, a distinction is made between settings where utilities are transferable and settings where they are not. This book belongs in the latter category; it deals with cooperative games in which utilities are non-transferable. The authors make a big deal out of this distinction at the beginning and claim that the assumption that utilities are not transferable is the more plausible one; they do have a point, but I also think the non-transferability assumption is borderline questionable in some of the specific examples included in the book. To give an example, the non-transferability assumption seems in one context to imply that all potential coalition partners have the same amount of bargaining power. This assumption is plausible in some contexts, but wildly implausible in others (and I’m not sure the authors would agree with me about which contexts belong to which category).

The professor teaching the most recent course in micro I took had a background in computer science, rather than economics – he was also Asian, but this perhaps goes without saying. This book is supposedly a computer science book, and they argue in the introduction that: “instead of looking at human beings, we study the problem from an intelligent software agent’s perspective.” However I don’t think a single one of the examples included in the book would be an example you could not also have found in a classic micro text, and it’s really hard to tell in many parts of the coverage that the authors aren’t economists with a background in micro – there seems to be quite a bit of field overlap here (this field overlap incidentally extends to areas of economics besides micro, is my impression; one econometrics TA I had, teaching the programming part of the course, was also a CS major). In the book they talk a bit about coalition formation mechanisms and approaches, such as propose-and-evaluate mechanisms and auction approaches, and they also touch briefly upon stuff like mechanism design. They state in the description that: “The book is intended for graduate students, engineers, and researchers in the field of artificial intelligence and computer science.” I think it’s really weird that they don’t include (micro-)economists as well, because this stuff is obviously quite close to/potentially relevant to the kind of work some of these people are working on.

There are a lot of definitions, theorems, and proofs in this book, and as usual when doing work on game theory you need to think very carefully about the material to be able to follow it, but I actually found it reasonably accessible – the book is not terribly difficult to read. I would however probably advise against reading it if you have not at least read an intro text on game theory. Although, as already mentioned, the book deals with an analytical context in which utilities are non-transferable, it should be pointed out that this assumption is sort of implicit in the coverage, in the sense that the authors don’t really deal with utility functions at all; the book only deals with preference relations, not utility functions, so it probably helps to be familiar with this type of analysis (e.g. from having studied, and solved some problems on, the kind of material covered in chapter 1 of Mas-Colell).

Part of the reason why I gave the book only two stars is that the authors are Chinese and their English is terrible. Another reason is that as is usually the case in game theory, these guys spend a lot of time and effort being very careful to define their terms and make correct inferences from the assumptions they make – but they don’t really end up saying very much.

February 28, 2015 Posted by | books, Computer science, economics | Leave a comment

Stuff

i. Econometric methods for causal evaluation of education policies and practices: a non-technical guide. This one is ‘work-related’; in one of my courses I’m writing a paper and this working paper is one (of many) of the sources I’m planning on using. Most of the papers I work with are unfortunately not freely available online, which is part of why I haven’t linked to them here on the blog.

I should note that there are no equations in this paper, so you should focus on the words ‘a non-technical guide’ rather than the words ‘econometric methods’ in the title – I think this is a very readable paper for the non-expert as well. I should of course also note that I have worked with most of these methods in a lot more detail, and that without the math it’s very hard to understand the details and really know what’s going on e.g. when applying such methods – or related methods such as IV methods on panel data, a topic which was covered in another class just a few weeks ago but which is not covered in this paper.

This is a place to start if you want to know something about applied econometric methods, particularly if you want to know how they’re used in the field of educational economics, and especially if you don’t have a strong background in stats or math. It should be noted that some of the methods covered see widespread use in other areas of economics as well; IV is widely used, and the difference-in-differences estimator has seen a lot of applications in health economics.
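Since difference-in-differences is easy to demonstrate on simulated data, here is a minimal two-group/two-period sketch (my own toy simulation, not anything taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
true_effect = 2.0

treated = rng.integers(0, 2, n)      # group indicator (policy applies to treated group)
period = rng.integers(0, 2, n)       # 0 = before the policy, 1 = after

y = (1.0 * treated                   # fixed group difference
     + 0.5 * period                  # common time trend
     + true_effect * treated * period
     + rng.normal(0, 1, n))          # noise

def cell_mean(g, t):
    """Mean outcome for group g in period t."""
    return y[(treated == g) & (period == t)].mean()

# DiD: (after - before) for the treated minus (after - before) for the controls
did = (cell_mean(1, 1) - cell_mean(1, 0)) - (cell_mean(0, 1) - cell_mean(0, 0))
print(f"DiD estimate: {did:.2f} (true effect: {true_effect})")
```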

ii. Regulating the Way to Obesity: Unintended Consequences of Limiting Sugary Drink Sizes. The law of unintended consequences strikes again.

You could argue with some of the assumptions made here (e.g. that prices (/oz) remain constant) but I’m not sure the findings are that sensitive to that assumption, and without an explicit model of the pricing mechanism at work it’s mostly guesswork anyway.

iii. A discussion about the neurobiology of memory. Razib Khan posted a short part of the video recently, so I decided to watch it today. A few relevant wikipedia links: Memory, Dead reckoning, Hebbian theory, Caenorhabditis elegans. I’m skeptical, but I agree with one commenter who put it this way: “I know darn well I’m too ignorant to decide whether Randy is possibly right, or almost certainly wrong — yet I found this interesting all the way through.” I also agree with another commenter who mentioned that it’d have been useful for Gallistel to go into details about the differences between short term and long term memory and how these differences relate to the problem at hand.

iv. Plos-One: Low Levels of Empathic Concern Predict Utilitarian Moral Judgment.

“An extensive body of prior research indicates an association between emotion and moral judgment. In the present study, we characterized the predictive power of specific aspects of emotional processing (e.g., empathic concern versus personal distress) for different kinds of moral responders (e.g., utilitarian versus non-utilitarian). Across three large independent participant samples, using three distinct pairs of moral scenarios, we observed a highly specific and consistent pattern of effects. First, moral judgment was uniquely associated with a measure of empathy but unrelated to any of the demographic or cultural variables tested, including age, gender, education, as well as differences in “moral knowledge” and religiosity. Second, within the complex domain of empathy, utilitarian judgment was consistently predicted only by empathic concern, an emotional component of empathic responding. In particular, participants who consistently delivered utilitarian responses for both personal and impersonal dilemmas showed significantly reduced empathic concern, relative to participants who delivered non-utilitarian responses for one or both dilemmas. By contrast, participants who consistently delivered non-utilitarian responses on both dilemmas did not score especially high on empathic concern or any other aspect of empathic responding.”

In case you were wondering, the difference hasn’t got anything to do with a difference in the ability to ‘see things from the other guy’s point of view’: “the current study demonstrates that utilitarian responders may be as capable at perspective taking as non-utilitarian responders. As such, utilitarian moral judgment appears to be specifically associated with a diminished affective reactivity to the emotions of others (empathic concern) that is independent of one’s ability for perspective taking”.

On a small sidenote, I’m not really sure I get the authors at all – one of the questions they ask in the paper’s last part is whether ‘utilitarians are simply antisocial?’ This is such a stupid way to frame this I don’t even know how to begin to respond; I mean, utilitarians make better decisions that save more lives, and that’s consistent with them being antisocial? I should think the ‘social’ thing to do would be to save as many lives as possible. Dead people aren’t very social, and when your actions cause more people to die they also decrease the scope for future social interaction.

v. Lastly, some Khan Academy videos:

(Relevant links: Compliance, Preload).

(This one may be very hard to understand if you haven’t covered this stuff before, but I figured I might as well post it here. If you don’t know e.g. what myosin and actin is you probably won’t get much out of this video. If you don’t watch it, this part of what’s covered is probably the most important part to take away from it.)

It’s been a long time since I checked out the Brit Cruise information theory playlist, and I was happy to learn that he’s updated it and added some more stuff. I like the way he combines historical stuff with a ‘how does it actually work, and how did people realize that’s how it works’ approach – learning how people figured out stuff is to me sometimes just as fascinating as learning what they figured out:

(Relevant wikipedia links: Leyden jar, Electrostatic generator, Semaphore line. Cruise’s play with the cat and the amber may look funny, but there’s a point to it: “The Greek word for amber is ηλεκτρον (“elektron”) and is the origin of the word “electricity”.” – from the first link).

(Relevant wikipedia links: Galvanometer, Morse code)

April 14, 2013 Posted by | Computer science, Cryptography, econometrics, Khan Academy, medicine, papers, random stuff, statistics | Leave a comment

Khan Academy videos of interest

It took me a minute to solve without hints. I had to scribble a few numbers down (like Khan does in the video), but you should be able to handle it. (Actually I think some of the earlier brainteasers on the playlist are harder than this one and some of the later ones are easier, but it’s been a while since I saw the first ones.)


Much more here.

Naturally this is from the computer science section.

It’s been a while since I last visited Khan Academy – it seems that these days they have an entire section about influenza.

February 10, 2013 Posted by | Computer science, Khan Academy, Lectures, mathematics, medicine | Leave a comment

A few notes on Singh’s The Code Book

It seems that nine out of ten readers don’t read/like my book posts, so I probably will try to hold back on those in the future or at least put a bit less effort into them. But I thought I’d just post a quick note here anyway:

I spent part of yesterday and a big chunk of today reading Simon Singh’s The Code Book. I generally liked the book – if you liked Fermat’s Last Theorem, you’ll probably like this one too. I didn’t think much of the last two chapters, but the rest of it was quite entertaining and instructive. You know you have your hands on a book that covers quite a bit of ground when you find yourself looking up something in an archaeology textbook to check some details in a book about cryptography (the book has a brief chapter covering, among other things, the decipherment of the Linear B script). Having read the book, I can’t not mention here that I blogged this some time ago – needless to say, back then I had no idea how big a name Hellman is ‘in the cryptography business’ (this was a very big deal – in Singh’s words: “The Diffie-Hellman-Merkle key exchange scheme […] is one of the most counterintuitive discoveries in the history of science, and it forced the cryptographic establishment to rewrite the rules of encryption. […] Hellman had shattered one of the tenets of cryptography and proved that Bob and Alice did not need to meet to agree a secret key.” (p.267))
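To illustrate why the result was so counterintuitive, here is a toy version of the Diffie-Hellman exchange with deliberately tiny numbers (real systems use primes of thousands of bits and vetted libraries, so this is only meant to show the idea):

```python
import secrets

p = 23   # public prime modulus (toy-sized)
g = 5    # public generator

# Alice and Bob each pick a private exponent and publish g^x mod p.
a = secrets.randbelow(p - 2) + 1
b = secrets.randbelow(p - 2) + 1
A = pow(g, a, p)   # Alice sends A over the open channel
B = pow(g, b, p)   # Bob sends B over the open channel

# Each side combines the other's public value with its own secret exponent.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob   # both arrive at g^(a*b) mod p
print("shared secret:", shared_alice)
```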

August 22, 2012 Posted by | books, Computer science, Cryptography | Leave a comment

Wikipedia articles of interest

i. Shannon–Hartley theorem. Muller talked a little bit about this one in one of the lectures – I don’t remember which, but it’s probably one of the wave lectures. His coverage is less technical than wikipedia’s. I was considering not including this link because I previously linked to wikipedia’s closely related article about the Noisy-channel coding theorem, but I decided to do it anyway. From the article:

“In information theory, the Shannon–Hartley theorem tells the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise. It is an application of the noisy channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to Gaussian noise. The theorem establishes Shannon’s channel capacity for such a communication link, a bound on the maximum amount of error-free digital data (that is, information) that can be transmitted with a specified bandwidth in the presence of the noise interference, assuming that the signal power is bounded, and that the Gaussian noise process is characterized by a known power or power spectral density. […]

Considering all possible multi-level and multi-phase encoding techniques, the Shannon–Hartley theorem states the channel capacity C, meaning the theoretical tightest upper bound on the information rate (excluding error correcting codes) of clean (or arbitrarily low bit error rate) data that can be sent with a given average signal power S through an analog communication channel subject to additive white Gaussian noise of power N, is:

C = B log2(1 + S/N)

where

C is the channel capacity in bits per second;
B is the bandwidth of the channel in hertz (passband bandwidth in case of a modulated signal);
S is the average received signal power over the bandwidth (in case of a modulated signal, often denoted C, i.e. modulated carrier), measured in watts (or volts squared);
N is the average noise or interference power over the bandwidth, measured in watts (or volts squared); and
S/N is the signal-to-noise ratio (SNR) or the carrier-to-noise ratio (CNR) of the communication signal to the Gaussian noise interference expressed as a linear power ratio (not as logarithmic decibels).”
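
Plugging concrete numbers into the formula makes it less abstract. Here is a minimal Python sketch – the bandwidth and SNR figures below are my own illustrative choices, not taken from the article:

import math

def shannon_hartley_capacity(bandwidth_hz, snr_linear):
    # Channel capacity in bits per second of an AWGN channel: C = B * log2(1 + S/N)
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3 kHz telephone-grade channel with an SNR of 30 dB,
# i.e. a linear power ratio of 10**(30/10) = 1000:
snr_linear = 10 ** (30 / 10)
print(shannon_hartley_capacity(3000, snr_linear))  # roughly 29,900 bits per second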

ii. Expansion joint. Also covered by Muller, this is important stuff that people don’t think about:

“An expansion joint or movement joint is an assembly designed to safely absorb the heat-induced expansion and contraction of various construction materials, to absorb vibration, to hold certain parts together, or to allow movement due to ground settlement or earthquakes. They are commonly found between sections of sidewalks, bridges, railway tracks, piping systems, ships, and other structures.

Throughout the year, building faces, concrete slabs, and pipelines will expand and contract due to the warming and cooling through seasonal variation, or due to other heat sources. Before expansion joint gaps were built into these structures, they would crack under the stress induced.”

If you have any kind of construction of significant size or length, thermal expansion will cause problems unless you deal with it somehow. Using expansion joints to handle this is another one of those hidden 'good ideas' most people never think about, probably because they weren't even aware there was a problem to be solved in the first place.
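
To get a sense of the magnitudes involved, here is a quick back-of-the-envelope sketch in Python – the expansion coefficient, span length and temperature swing are my own illustrative assumptions, not figures from the article:

# Linear thermal expansion: delta_L = alpha * L * delta_T
alpha_steel = 12e-6   # per degree Celsius, a typical value for structural steel
length_m = 100.0      # a 100 m bridge span
delta_T = 40.0        # a plausible seasonal temperature swing in degrees Celsius
delta_L = alpha_steel * length_m * delta_T
print(delta_L)        # 0.048 m, i.e. roughly 5 cm of movement that has to go somewhere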

iii. Beaufort scale.

iv. Belle Gunness. Not all serial killers are/were male:

“Personal – comely widow who owns a large farm in one of the finest districts in La Porte County, Indiana, desires to make the acquaintance of a gentleman equally well provided, with view of joining fortunes. No replies by letter considered unless sender is willing to follow answer with personal visit. Triflers need not apply.[2]” […]

“The suitors kept coming, but none, except for Anderson, ever left the Gunness farm. By this time, she had begun ordering huge trunks to be delivered to her home. Hack driver Clyde Sturgis delivered many such trunks to her from La Porte and later remarked how the heavyset woman would lift these enormous trunks “like boxes of marshmallows”, tossing them onto her wide shoulders and carrying them into the house. She kept the shutters of her house closed day and night; farmers traveling past the dwelling at night saw her digging in the hog pen.” Guess what they found buried in the hog pen later?

v. English garden.

“The English garden, also called English landscape park (French: Jardin anglais, Italian: Giardino all’inglese, German: Englischer Landschaftsgarten, Portuguese: Jardim inglês), is a style of Landscape garden which emerged in England in the early 18th century, and spread across Europe, replacing the more formal, symmetrical Garden à la française of the 17th century as the principal gardening style of Europe.[1] The English garden presented an idealized view of nature. They were often inspired by paintings of landscapes by Claude Lorraine and Nicolas Poussin, and some were influenced by the classic Chinese gardens of the East,[2] which had recently been described by European travelers.[2] The English garden usually included a lake, sweeps of gently rolling lawns set against groves of trees, and recreations of classical temples, Gothic ruins, bridges, and other picturesque architecture, designed to recreate an idyllic pastoral landscape. By the end of the 18th century the English garden was being imitated by the French landscape garden, and as far away as St. Petersburg, Russia, in Pavlovsk, the gardens of the future Emperor Paul. It also had a major influence on the form of the public parks and gardens which appeared around the world in the 19th century.[3]”


vi. Aquifer.

“An aquifer is an underground layer of water-bearing permeable rock or unconsolidated materials (gravel, sand, or silt) from which groundwater can be usefully extracted using a water well. The study of water flow in aquifers and the characterization of aquifers is called hydrogeology. Related terms include aquitard, which is a bed of low permeability along an aquifer,[1] and aquiclude (or aquifuge), which is a solid, impermeable area underlying or overlying an aquifer. If the impermeable area overlies the aquifer pressure could cause it to become a confined aquifer.” The article has much more.

vii. Great Tit.

“The Great Tit (Parus major) is a passerine bird in the tit family Paridae. It is a widespread and common species throughout Europe, the Middle East, Central and Northern Asia, and parts of North Africa in any sort of woodland. It is generally resident, and most Great Tits do not migrate except in extremely harsh winters. Until 2005 this species was lumped with numerous other subspecies. DNA studies have shown these other subspecies to be distinctive from the Great Tit and these have now been separated as two separate species, the Cinereous Tit of southern Asia, and the Japanese Tit of East Asia. The Great Tit remains the most widespread species in the genus Parus.

The Great Tit is a distinctive bird, with a black head and neck, prominent white cheeks, olive upperparts and yellow underparts, with some variation amongst the numerous subspecies. It is predominantly insectivorous in the summer, but will consume a wider range of food items in the winter months, including small hibernating bats.[2] Like all tits it is a cavity nester, usually nesting in a hole in a tree. The female lays around 12 eggs and incubates them alone, although both parents raise the chicks. In most years the pair will raise two broods. The nests may be raided by woodpeckers, squirrels and weasels and infested with fleas, and adults may be hunted by Sparrowhawks. The Great Tit has adapted well to human changes in the environment and is a common and familiar bird in urban parks and gardens. The Great Tit is also an important study species in ornithology. […]

Great Tits combine dietary versatility with a considerable amount of intelligence and the ability to solve problems with insight learning, that is to solve a problem through insight rather than trial and error.[9] In England, Great Tits learned to break the foil caps of milk bottles delivered at the doorstep of homes to obtain the cream at the top.[24] This behaviour, first noted in 1921, spread rapidly in the next two decades.[25] In 2009, Great Tits were reported killing and eating pipistrelle bats. This is the first time a songbird has been seen to hunt bats. The tits only do this during winter when the bats are hibernating and other food is scarce.[26] They have also been recorded using tools, using a conifer needle in the bill to extract larvae from a hole in a tree.[9] […]

The Great Tit has generally adjusted to human modifications of the environment. It is more common and has better breeding success in areas with undisturbed forest cover, but it has adapted to human modified habitats. It can be very common in urban areas.[9] For example, the breeding population in the city of Sheffield (a city of half a million people) has been estimated at 17,164 individuals.[45] In adapting to human environments its song has been observed to change in noise-polluted urban environments. In areas with low frequency background noise pollution, the song has a higher frequency than in quieter areas.[46]

July 10, 2012 Posted by | biology, Computer science, Geology, history, wikipedia | Leave a comment

Random wikipedia links of interest

1) Orogeny.

‘Before the development of geologic concepts during the 19th century, the presence of mountains was explained in Christian contexts as a result of the Biblical Deluge. This was an extension of Neoplatonic thought, which influenced early Christian writers and assumed that a perfect Creation would have to have taken the form of a perfect sphere. Such thinking persisted into the 18th century.’

Of course this could just be confirmation bias talking, but I think the 'religion makes you more stupid and less knowledgeable' hypothesis gets yet another point here.

2) Coalworker’s pneumoconiosis – ‘a common affliction of coal miners and others who work with coal, similar to both silicosis from inhaling silica dust, and to the long-term effects of tobacco smoking. Inhaled coal dust progressively builds up in the lungs and is unable to be removed by the body; that leads to inflammation, fibrosis, and in the worst case, necrosis.’

3) Hand grenade. Did you know that a gunpowder version of this weapon (Zhen Tian Lei – that article is only a stub, unfortunately) was developed more than 1,000 years ago? I most certainly did not.

4) Gene expression. This is a dangerous article; it has a lot of good links and can cost you many hours of your life if you're not careful. As regular readers will know, the name of the article is of course also the name of one of my favourite blogs.

5) Simpson’s paradox.

6) Noisy-channel coding theorem.

‘the noisy-channel coding theorem establishes that however contaminated with noise interference a communication channel may be, it is possible to communicate digital data (information) nearly error-free up to a given maximum rate through the channel.’

[…]

‘Stated by Claude Shannon in 1948, the theorem describes the maximum possible efficiency of error-correcting methods versus levels of noise interference and data corruption. The theory doesn’t describe how to construct the error-correcting method, it only tells us how good the best possible method can be. Shannon’s theorem has wide-ranging applications in both communications and data storage. This theorem is of foundational importance to the modern field of information theory. Shannon only gave an outline of the proof. The first rigorous proof is due to Amiel Feinstein in 1954.

The Shannon theorem states that given a noisy channel with channel capacity C and information transmitted at a rate R, then if R < C there exist codes that allow the probability of error at the receiver to be made arbitrarily small. This means that, theoretically, it is possible to transmit information nearly without error at any rate below a limiting rate, C.’

I’d file this one under ‘stuff I didn’t know I didn’t know’. There’s a lot of that stuff around.
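
To make the R < C statement a bit more concrete, here is a minimal Python sketch using the standard textbook example of a binary symmetric channel – this is my own illustration; the theorem as quoted above is more general:

import math

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Capacity, in bits per channel use, of a binary symmetric channel
    # that flips each transmitted bit with probability p.
    return 1 - binary_entropy(p)

# Even a channel that corrupts 1 bit in 10 has a capacity of about 0.53 bits
# per use, so by Shannon's theorem any code with rate below that can in
# principle be made nearly error-free.
print(bsc_capacity(0.1))  # ~0.531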

July 3, 2010 Posted by | Computer science, genetics, Geology, medicine, statistics, wikipedia | Leave a comment