Simulating Language 2015, pre-reading questions, part 2 (9 questions)

1. Now imagine a weird dice that comes up with a 6 on average half the time you roll it, and the rest of the time produces 1, 2, 3, 4 or 5 with equal probability. We can specify the probability distribution of rolls of that dice - which of the following is the correct probability distribution? Note: following the notation in the reading, I am going to write p(6 | weird-dice) to signify the probability that I roll a 6 given that I am rolling the weird dice. The probability distribution then just specifies a probability for each of the 6 possible outcomes.

- p(1 | weird-dice) = p(2 | weird-dice) = p(3 | weird-dice) = p(4 | weird-dice) = p(5 | weird-dice) = p(6 | weird-dice) = 1/6
- p(1 | weird-dice) = p(2 | weird-dice) = p(3 | weird-dice) = p(4 | weird-dice) = p(5 | weird-dice) = 1/10, p(6 | weird-dice) = 1/2
- p(1 | weird-dice) = p(2 | weird-dice) = p(3 | weird-dice) = p(4 | weird-dice) = p(5 | weird-dice) = 1/5, p(6 | weird-dice) = 1/2

2. What is the probability that this weird dice will roll a 6 then a 2?

- 1/100
- 1/20
- 1/2
- 2/10

3. What is the probability that, on two rolls, it will generate at least one 6?

- 0
- 1/4
- 1/2
- 3/4
- 1
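If you want to check your intuitions for questions 1-3 by simulation, here is a minimal Python sketch (an illustration, not part of the reading). It samples the weird dice exactly as described above - a 6 half the time, otherwise 1-5 uniformly - and estimates the relevant probabilities from those samples; the function name is just mine.

```python
import random

def roll_weird_dice():
    """One roll of the weird dice: 6 half the time,
    otherwise 1-5 with equal probability."""
    if random.random() < 0.5:
        return 6
    return random.randint(1, 5)

n = 100_000
rolls = [roll_weird_dice() for _ in range(n)]

# Empirical distribution over the six outcomes (question 1)
for face in range(1, 7):
    print(f"p({face} | weird-dice) ~ {rolls.count(face) / n:.3f}")

# Two independent rolls (questions 2 and 3)
pairs = [(roll_weird_dice(), roll_weird_dice()) for _ in range(n)]
six_then_two = sum(1 for a, b in pairs if a == 6 and b == 2) / n
at_least_one_six = sum(1 for a, b in pairs if 6 in (a, b)) / n
print(f"p(6 then 2) ~ {six_then_two:.3f}")
print(f"p(at least one 6 in two rolls) ~ {at_least_one_six:.3f}")
```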
4. Imagine that tomorrow you will meet someone new. Before you meet them, you have some prior knowledge of what they will be like. In Bayesian models this is captured by the prior - in the case of people, the prior for meeting a new person is presumably based on your experiences in the past (but it needn't be - see the question about innate priors below); it is still a prior because it captures your knowledge about this person prior to meeting them. What is the prior probability that this new person will be female?

- Approximately 0
- Approximately 1/2
- Approximately 1

5. What is the probability that they will have two arms?

- Approximately 0
- Approximately 1/2
- Approximately 1

6. What is the probability that they will have red hair?

- Approximately 0
- Approximately 1/10
- Approximately 3/10

7. We will also want to consider cases where prior probability is innate, i.e. not derived from experience but built into an individual. Depending on your theoretical persuasion, you will find these priors either easy or hard to imagine! Imagine for a moment that you are a linguist who believes that the Extended Projection Principle (roughly: sentences must have subjects) is innate: children know, prior to encountering any linguistic data, that the language they are learning will obey the EPP. For such children, what is the prior probability of languages which follow the EPP?

- 0
- Just above 0
- Just below 1
- 1

8. What is the prior probability for languages that don't follow the EPP?

- 0
- Just above 0
- Just below 1
- 1

9. Now imagine you are casting Christiansen & Devlin’s connectionist model of sequence learning in Bayesian terms: languages which exhibit recursive inconsistency are harder to learn than languages that are recursively consistent. We can capture this in Bayesian terms in the prior: languages with low prior probability are ‘hard’ to learn, in the sense that they require more data to outweigh their low prior probability. How would you capture the Christiansen & Devlin model?

- The prior for recursively inconsistent languages is 0
- The prior for recursively inconsistent languages is 1
- The prior for recursively inconsistent languages is higher than the prior for recursively consistent languages
- The prior for recursively consistent languages is higher than the prior for recursively inconsistent languages
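To see the sense in which a low prior makes a language ‘hard’ to learn (as in question 9), here is a toy Python sketch of Bayesian updating over two hypotheses. All the numbers are hypothetical: I simply assume each observation fits the true language with likelihood 0.6 versus 0.4 under the rival hypothesis, chosen only to show that the lower the prior, the more data the learner needs before the posterior outweighs it.

```python
# Toy illustration (all numbers hypothetical): a hypothesis with a low prior
# needs more observations before the posterior comes to favour it.

def posterior_inconsistent(prior_inconsistent, n_observations,
                           lik_inconsistent=0.6, lik_consistent=0.4):
    """Posterior probability of the 'recursively inconsistent' hypothesis
    after n observations, each of which fits the inconsistent language
    better (per-observation likelihood 0.6 vs 0.4 - made-up values)."""
    p_inc = prior_inconsistent * lik_inconsistent ** n_observations
    p_con = (1 - prior_inconsistent) * lik_consistent ** n_observations
    return p_inc / (p_inc + p_con)

for prior in (0.5, 0.1, 0.01):
    n = 0
    while posterior_inconsistent(prior, n) <= 0.95:
        n += 1
    print(f"prior {prior}: {n} observations needed for posterior > 0.95")
```

Running this prints how many observations each prior requires: the lower the prior on recursive inconsistency, the more evidence is needed, which is the Bayesian rendering of ‘harder to learn’.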