Wikipedia:Reference desk/Archives/Mathematics/2010 April 18

Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


April 18

long-term probability of heads on a biased coin

After doing an experiment as part of my university coursework and getting some funny results, one of the points I thought I would include in the write-up was the possibility of inadequate entropy added to one of the variables. I was hoping for a pointer with the math, however. Suppose you have a number of coins (call it x) that you are repeatedly throwing. The first toss is fair, but for each subsequent toss, there is a fixed probability of any coin retaining information about how it landed on the previous toss (for example, there is a p% chance for each coin on each toss that, rather than landing randomly, it will land on the reverse of the side it landed on in the previous toss). In the long run, what would be the expected proportions of tosses with i heads, for i = 0 to x? Cheers. --130.216.172.247 (talk) 06:52, 18 April 2010 (UTC)

It all depends on how you toss the coin. If your process approximates to simply turning the coin over, then there will be a very high probability of each toss landing on the reverse of the previous one. I suggest that you repeat the experiment with numbered coins and the hypothesis that each coin just gets turned over. You can thus test for that particular bias in your experimental process. Dbfirs 07:30, 18 April 2010 (UTC)
It's not just "turning over" (once). Depending on the height and rotational speed, the coin may tend to flip half over, once over, one and a half times over, etc. Only with a very large height and rotational speed can you ensure that the result is truly random. StuRat (talk) 17:40, 18 April 2010 (UTC)
Your concern is that the outcomes of the tosses are not independent. You can analyze that through the correlation coefficient between the outcomes of adjacent pairs of tosses. As an aside, I've heard that a better way to get random bits from coins is to put a bunch of coins in a box and shake the box for a while, rather than flipping a coin repeatedly. 66.127.54.238 (talk) 19:45, 18 April 2010 (UTC)
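For concreteness, here is a minimal sketch of that diagnostic (the function name and the heads=1 encoding are mine, not from the discussion): under the model in the question, the lag-1 correlation of a single coin's outcome sequence comes out near -p, and near 0 for a memoryless coin.

def lag1_correlation(seq):
    """Sample correlation between consecutive outcomes of one coin,
    encoded heads=1, tails=0. Under the model in the question it
    should come out near -p; near 0 for a memoryless coin."""
    x, y = seq[:-1], seq[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    var_x = sum((a - mx) ** 2 for a in x) / n
    var_y = sum((b - my) ** 2 for b in y) / n
    return cov / (var_x * var_y) ** 0.5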

The answers above are largely irrelevant to the question. If you've specified that you've got probability p of landing on the side opposite the side you got last time, you've defined the problem. The question is not how big p is, given the physical conditions; the question is what is the probability distribution of the outcome given the value of p. It's a function of p. And obviously in this simple case knowing p is the same as knowing the correlation, but saying that doesn't answer the question.

You may want to think about equilibrium distributions of Markov chains. This is a fairly simple Markov chain. Michael Hardy (talk) 00:57, 19 April 2010 (UTC)
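As an illustration of the Markov-chain view, here is a rough simulation sketch (the function name, default parameters, and seed are mine, not from the question): it tallies the long-run proportion of tosses showing i heads, which for p < 1 approaches the fair-coin binomial proportions C(x,i)/2^x.

import random
from collections import Counter

def head_count_proportions(x=5, p=0.3, tosses=200_000, seed=0):
    """Simulate x coins that, on each toss after the first, land on the
    reverse of their previous side with probability p and land uniformly
    at random otherwise. Return the fraction of tosses with i heads."""
    rng = random.Random(seed)
    state = [rng.random() < 0.5 for _ in range(x)]  # first toss is fair
    counts = Counter()
    for _ in range(tosses):
        counts[sum(state)] += 1
        state = [(not s) if rng.random() < p else rng.random() < 0.5
                 for s in state]
    return {i: counts[i] / tosses for i in range(x + 1)}

print(head_count_proportions())  # close to C(5,i)/32 for i = 0..5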

Agreed, we were talking about establishing whether or not p is exactly 50%. I would be interested in seeing the result of the Markov chain analysis. I can't see how it could lead to anything other than 50% heads on average, but will it influence the normal distribution about that average? I trust that no-one is suggesting that the coin can "remember" the way it landed the time before last! Dbfirs 09:05, 19 April 2010 (UTC)
I'm confused as to why you think the distribution is going to be normal - it's surely binomial. Anyway... the question, as I understand it, is best dealt with by considering the case where each coin is a bit more likely to land the same way it did on the previous toss. This should be identical to the stated case, apart from a deterministic parity-based flip.
Now, the way I see it, if you haven't flipped any coins yet, then the information that each coin has some memory is meaningless, because you have no idea what the previous memory of the coin is. So the question is a bit ill-specified - you can assume a flat prior, say, or you can assume that the first throw is completely random. In that case, you can freely swap over each coin being heads or tails down the entire sequence of throws, so it's naturally binomial. The more interesting question is if you condition on knowing what the first flip was. In that case, as you do sufficient numbers of flips, the influence of the original flip will soon be diluted, and so after a sufficiently large number of flips the full coin set would just be a bunch of independent Bernoulli(0.5) random variables, leading to convergence to a binomial distribution. --192.76.7.208 (talk) 00:50, 20 April 2010 (UTC)
Sorry, careless use of language on my part. I meant the usual binomial distribution (approximating to a normal distribution for many flips). Dbfirs 07:01, 20 April 2010 (UTC)
No, it needn't be binomial. It's easy to see this by noting that the first two tosses, for example, are correlated, so the variance of their sum will differ from what the binomial distribution predicts. -- Meni Rosenfeld (talk) 09:03, 20 April 2010 (UTC)
I think you are misunderstanding the question: the point is that we are tossing x coins simultaneously, with no dependence between different coins. We aren't looking at the distribution of multiple tosses together, but of the vector of x independent coin facings resulting from a single toss. Or *I*'m misunderstanding the question. --192.76.7.208 (talk) 19:53, 20 April 2010 (UTC)
Right, I get it now. -- Meni Rosenfeld (talk) 08:19, 21 April 2010 (UTC)
If p < 1 then yes, the stationary distribution will be 50% each; if p = 1 the coin just alternates deterministically. And the distribution for a large number of tosses (edit: for a single coin) will be approximately normal with the same mean, and I think asymptotically the same variance, as if each toss were independent (with probability given by the stationary distribution). You'll need more tosses for the approximation to be good, though. -- Meni Rosenfeld (talk) 09:03, 20 April 2010 (UTC)
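Spelling out the chain behind that claim (states heads/tails, with the OP's p): the chance of repeating the previous side is (1-p)/2 and the chance of reversing it is p + (1-p)/2 = (1+p)/2, so the single-coin transition matrix is doubly stochastic and the uniform distribution is stationary:

\[ P = \begin{pmatrix} \frac{1-p}{2} & \frac{1+p}{2} \\ \frac{1+p}{2} & \frac{1-p}{2} \end{pmatrix}, \qquad \pi P = \pi \;\Rightarrow\; \pi = \left(\tfrac{1}{2}, \tfrac{1}{2}\right), \]

with convergence to \pi for every p < 1; for p = 1 the chain is periodic and merely alternates.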

N factors

How can I find the first number with more than N factors? 12.105.164.147 (talk) 07:31, 18 April 2010 (UTC)

I think it's messy for large numbers. See Divisor function. It's even hard to decide whether a given large number is prime--it was only a few years ago that an efficient deterministic way (the AKS primality test) was found to do that. 66.127.54.238 (talk) 09:00, 18 April 2010 (UTC)
How about multiplying the first N primes together? -- SGBailey (talk) 10:59, 18 April 2010 (UTC)
Or of course 2^N if you allow duplicate factors -- SGBailey (talk) 11:00, 18 April 2010 (UTC)
Or are you counting non-prime factors as well? Is 12 = 2,2,3 or 2,3,4,6? -- SGBailey (talk) 11:01, 18 April 2010 (UTC)
I assume that when the OP says "factors" they actually mean "divisors" (otherwise, as SGBailey says, the problem is trivial). A number that has more divisors than any smaller number is called a highly composite number. Although highly composite numbers have certain characteristic features, I don't think there is a simple and efficient method of finding them. There is a table of the first 1200 highly composite numbers here - the largest has over 4×10^15 divisors. Gandalf61 (talk) 11:07, 18 April 2010 (UTC)
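For small N the problem can simply be brute-forced; here is a minimal sketch (the function names are mine), which also makes plain why this approach is hopeless for large N:

def divisor_count(n):
    """Count the divisors of n by trial division up to sqrt(n)."""
    count, d = 0, 1
    while d * d <= n:
        if n % d == 0:
            count += 2 if d * d < n else 1  # d and n//d, once if equal
        d += 1
    return count

def first_with_more_than(N):
    """Smallest positive integer with more than N divisors (brute
    force; fine for small N, hopeless for large N as noted above)."""
    n = 1
    while divisor_count(n) <= N:
        n += 1
    return n

print(first_with_more_than(6))  # 24, a highly composite number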

Sudoku proof

I've been trying to prove or disprove the following, but I'm not getting anywhere.

Suppose that for each cell in an unsolved Sudoku puzzle, you write out a list of possible values by looking at what other numbers are already used in that cell's row, column, and 3x3 block. This list is kept updated as the puzzle is being solved. Can the following statements be simultaneously true?

1. The Sudoku puzzle has one and only one solution.
2. All cells with only 1 possible value have already been assigned that value, and the list of possible values is updated afterwards. (In other words, every unfilled cell has 2 or more possible values.)
3. In each row, column, and 3x3 block, every number from 1 to 9 not already in that row, column, or block exists in the "possible" lists of 2 or more cells.

I think that #3 can't be true if #1 and #2 are, because otherwise there would be multiple solutions to the puzzle. I can't prove this, however. --99.237.234.104 (talk) 17:31, 18 April 2010 (UTC)
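For anyone who wants to hunt for a counterexample mechanically, here is a minimal sketch of the candidate-list bookkeeping the question describes (the grid representation and names are mine; condition 1, uniqueness of the solution, would need a separate solver to check):

def candidates(grid):
    """grid: 9x9 list of lists with 0 for an empty cell. Return a dict
    (row, col) -> set of values not excluded by that cell's row,
    column, or 3x3 block."""
    cands = {}
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                used = (set(grid[r])
                        | {grid[i][c] for i in range(9)}
                        | {grid[(r // 3) * 3 + i][(c // 3) * 3 + j]
                           for i in range(3) for j in range(3)})
                cands[(r, c)] = set(range(1, 10)) - used
    return cands

def units():
    """All 27 units (rows, columns, 3x3 blocks) as lists of (row, col)."""
    rows = [[(r, c) for c in range(9)] for r in range(9)]
    cols = [[(r, c) for r in range(9)] for c in range(9)]
    blocks = [[(br * 3 + i, bc * 3 + j)
               for i in range(3) for j in range(3)]
              for br in range(3) for bc in range(3)]
    return rows + cols + blocks

def satisfies_2_and_3(grid):
    """True iff conditions 2 and 3 of the question hold: no unfilled
    cell has exactly one candidate, and every value missing from a
    unit appears in the candidate lists of at least two of its cells."""
    cands = candidates(grid)
    if any(len(s) == 1 for s in cands.values()):
        return False  # condition 2 violated: a forced cell remains
    for unit in units():
        missing = set(range(1, 10)) - {grid[r][c] for r, c in unit}
        for v in missing:
            if sum(v in cands.get(cell, ()) for cell in unit) < 2:
                return False  # condition 3 violated: v has < 2 homes
    return True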

I'm pretty sure they can be simultaneously true, but you haven't defined exactly what you mean by possible value. If the puzzle has exactly one solution then really there is only one possible value for a cell eventually, but I'm assuming you mean a value for which there are no other cells in the same row, column or block with the same value at the current stage of solving the puzzle. You are essentially describing a basic strategy for solving a given puzzle, but the harder puzzles require more sophisticated strategies to solve. Get some puzzles that are rated very difficult and try solving them by only filling in a cell when rule 2 or 3 would otherwise be violated; I predict you will get stuck eventually and you will then have a counterexample. --RDBury (talk) 21:28, 18 April 2010 (UTC)

Demographic thought experiment

Suppose we have an earthlike planet, called Urf, which needs to be populated. Every year, 500,000 settlers arrive (250,000 female and 250,000 male). These settlers are evenly distributed between the ages of 20 and 40. They're very healthy, so for simplicity's sake, let's assume that they (and their descendants) all live to be 100 years old. We've recruited people who want large families, so the fertility rate is 4.0 births per woman. Again for simplicity's sake, let's assume that our female settlers exhibit a constant rate of fertility between the ages of 20 and 40, and no fertility before and after. Given these values, what will be Urf's population in 10, 50 and 100 years? --Lazar Taxon (talk) 21:05, 18 April 2010 (UTC)

I've plugged it all into a spreadsheet (with a few crude approximations to make it a discrete problem, so take these as rough orders of magnitude rather than exact numbers) and get the populations every 10 years as:
Year Population
0 500000
10 3750000
20 9500000
30 17500000
40 26000000
50 37500000
60 53750000
70 72500000
80 94500000
90 122000000
100 157000000
110 197750000
120 247250000
130 309750000
140 385500000
150 475750000
--Tango (talk) 21:35, 18 April 2010 (UTC)
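For anyone wanting to reproduce this, here is a rough cohort-model sketch (my own construction with its own timing conventions, so its numbers will differ somewhat from the table above):

def urf_population(years, immigrants=500_000, fertility=4.0,
                   lifespan=100, fertile=(20, 40)):
    """Yearly cohort model: pop[a] holds the number of people aged a,
    sexes pooled with half assumed female. Each year everyone ages one
    year (dying at age 100), 500,000 settlers spread evenly over ages
    20-39 arrive, and each woman aged 20-39 gives birth at the constant
    rate of 4 children per 20 fertile years."""
    pop = [0.0] * lifespan
    for _ in range(years + 1):  # include year 0's arrivals
        pop = [0.0] + pop[:-1]                       # age everyone
        per_age = immigrants / (fertile[1] - fertile[0])
        for a in range(*fertile):                    # settlers arrive
            pop[a] += per_age
        women = sum(pop[fertile[0]:fertile[1]]) / 2
        pop[0] += women * fertility / (fertile[1] - fertile[0])
    return sum(pop)

for y in (10, 50, 100):
    print(y, round(urf_population(y)))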

"Constant rate of fertility" between the ages of 20 and 40, and having exactly four babies, sounds as if maybe it could mean each woman first gives birth when she's 1/5 of the way from 20 to 40, i.e. she's 24, then again at 2/5, then 3/5, then 4/5, so the ages are 24, 28, 32, 36. Or maybe it could mean the first baby is born when she's 20, then 26 + 2/3, then 33 + 1/3, then 40. Michael Hardy (talk) 00:51, 19 April 2010 (UTC)[reply]

Or it could mean that the births are stochastic, with a 20% chance of a birth each year, so that over the 20-year span each woman will, on average, have 4 children, but a few may have many more, and ~1% will have none. (Bonus points for someone who factors in lactational amenorrhea, and specifically reduces the birth rate in the years following a successful birth.) As an aside, Fibonacci numbers were originally derived from a similar thought experiment about breeding rabbits. -- 174.24.208.192 (talk) 02:14, 19 April 2010 (UTC)

Except that all these simplifying assumptions seem out of place if you're bringing in stochasticity. Michael Hardy (talk) 03:07, 21 April 2010 (UTC)