Well done, you've clicked the tower. That would actually achieve something if you had logged in first; use the key for that. The name takes you home. This is where all the applicable actions sit, and you can't apply any changes to this site unless you are logged in.

Our policy is best summarized as "we don't care about _you_, we care about _them_": no emails, so no forgotten passwords to recover. You have no rights; it's as if you don't even exist. If you publish material here, I reserve the right to remove it or to use it myself.

Don't impersonate anyone. Don't name someone against their will. Cross the line and you can lose everything, and no, I won't cancel your automatic payments first, so you'll have to do that the hard way. See how serious this sounds? That's how seriously you're meant to take these rules.



arXiv:2310.10530v4 Announce Type: replace
Abstract: In Bayesian theory, the role of information is central. The influence exerted by prior information on posterior outcomes often jeopardizes Bayesian studies, due to the potentially subjective nature of the prior choice. In settings where a priori knowledge is lacking, reference prior theory emerges as a proficient tool. Based on the criterion of mutual information, this theory makes it possible to construct a non-informative prior whose choice can be qualified as objective. In this paper, we contribute to the enrichment of reference prior theory. Indeed, we unveil an original analogy between reference prior theory and Global Sensitivity Analysis, from which we propose a natural generalization of the definition of mutual information. Leveraging dissimilarity measures between probability distributions, such as f-divergences, we provide a formalized framework for what we term generalized reference priors. Our main result provides a limiting expression of the mutual information, simplifying the definition of reference priors as its maximizing arguments. This approach opens a new path that facilitates the theoretical derivation of reference priors under constraints or within specific classes. In the absence of constraints, we further prove that the Jeffreys prior maximizes the generalized mutual information considered.
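
For orientation, here is a minimal sketch of the classical objects the abstract builds on: Bernardo's mutual-information criterion for reference priors and the Jeffreys prior. The notation below (a model $p(x \mid \theta)$ with prior $\pi$, marginal $p(x)$, and Fisher information $I(\theta)$) is standard background assumed for this sketch; it is not the paper's generalized, f-divergence-based definition.

\[
  \mathcal{I}(\pi) \;=\; \int \pi(\theta) \int p(x \mid \theta)\,
      \log \frac{p(x \mid \theta)}{p(x)} \, dx \, d\theta ,
  \qquad
  p(x) \;=\; \int p(x \mid \theta')\, \pi(\theta')\, d\theta' ,
\]
\[
  \pi^{\ast} \;\in\; \arg\max_{\pi}\, \mathcal{I}(\pi)
  \quad \text{(a reference prior, in the appropriate asymptotic sense)},
\]
\[
  \pi_{J}(\theta) \;\propto\; \sqrt{I(\theta)},
  \qquad
  I(\theta) \;=\; \mathbb{E}_{x \sim p(\cdot \mid \theta)}\!
      \left[ \left( \tfrac{\partial}{\partial \theta} \log p(x \mid \theta) \right)^{\!2} \right] .
\]

The first display is the mutual information between the parameter and the data; a reference prior is, loosely, its maximizing argument, and in the regular, unconstrained one-dimensional case this recovers the Jeffreys prior, consistent with the abstract's final claim about the generalized criterion.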

ID: 823214
Latest Change: April 26, 2024, 7:32 a.m.
Views: 11
CC: No Creative Commons license