
User talk:Holcombea

From Wikipedia, the free encyclopedia

Hello, Holcombea, and welcome to Wikipedia! Thank you for your contributions; I hope you like the place and decide to stay. We're glad to have you in our community! Here are a few good links for newcomers:

I hope you enjoy editing here and being a Wikipedian! Though we all make goofy mistakes, here is what Wikipedia is not. If you have any questions or concerns, don't hesitate to see the help pages or add a question to the village pump. The Community Portal can also be very useful.

Happy Wiki-ing!

-- Sango123 15:30, August 6, 2005 (UTC)

P.S. Feel free to leave a message on my talk page if you need help with anything or simply wish to say hello. :)

I think I'm interested in a different kind of answer. As I understand it, the expected value minimizes the *average* difference between the estimate and the actual unknown parameter value. But in other contexts, Bayesians may minimize the average squared difference, and in maximum a posteriori estimation the mode of the posterior distribution is used. Did Laplace have any particular reason for choosing the mean? Is this somehow embedded in the law of total probability? Sorry if this is a naive question. Alex Holcombe 10:23, 15 October 2007 (UTC)[reply]

No, it does not. The expected value does not minimize that average distance; the median does. The expected value minimizes the average square of the distance. Besides, it's not clear that estimation is what's being done here. Michael Hardy (talk) 19:07, 7 January 2008 (UTC)[reply]
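The distinction above can be checked numerically: minimizing average squared distance recovers the mean, while minimizing average absolute distance recovers the median. The following sketch (my own illustration, not part of the discussion; the sample data and grid search are arbitrary choices) searches a grid of candidate estimates under each loss:

```python
# Illustrative sketch: the mean minimizes average squared distance,
# the median minimizes average absolute distance. Data are made up.
import statistics

data = [1.0, 2.0, 2.0, 3.0, 10.0]  # skewed sample, so mean != median

def avg_sq_loss(c):
    """Average squared distance from candidate c to the data."""
    return sum((x - c) ** 2 for x in data) / len(data)

def avg_abs_loss(c):
    """Average absolute distance from candidate c to the data."""
    return sum(abs(x - c) for x in data) / len(data)

# Brute-force search over a fine grid of candidate estimates.
candidates = [i / 100 for i in range(0, 1101)]
best_sq = min(candidates, key=avg_sq_loss)
best_abs = min(candidates, key=avg_abs_loss)

print(best_sq, statistics.mean(data))     # squared loss -> mean (3.6)
print(best_abs, statistics.median(data))  # absolute loss -> median (2.0)
```

Because the squared loss is strictly convex, its minimizer is unique (the mean); the absolute loss is piecewise linear, and with an odd-sized sample its minimizer is the middle order statistic.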

Greetings


Hi Alex. So there you are, hiding under a Latinate variant. Psychology articles are in a frightful mess around this neck of the woods. Tony (talk) 06:16, 29 November 2008 (UTC) PS You may recognise the white staircase and my daughter on my talk page. Tony (talk) 06:17, 29 November 2008 (UTC)[reply]