#bayesian

Version 0.3.1 of *inferno*, the R package for Bayesian nonparametric inference, is out!

<pglpm.github.io/inferno/>

This version brings the following improvements and new functions:

- The possibility of calculating the posterior probability of value ranges, such as Pr(Y ≤ y), in addition to point values such as Pr(Y = y); also for subpopulations (a small generic illustration follows this list).
- A new function to generate posterior samples for any set of variates; also for subpopulations.
- Improved calculation of mutual information between variates.
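
A quick generic illustration of how the first two items relate (this is not the inferno interface, just plain R with made-up samples): once you can draw posterior samples of a variate, a range probability is simply the fraction of samples falling in the range, whereas a point-value probability is the fraction hitting the exact value.

```
## Generic sketch, not inferno code: stand-in posterior samples of a discrete variate Y
set.seed(42)
ysamples <- rbinom(10000, size = 20, prob = 0.3)

mean(ysamples <= 5)   # estimate of Pr(Y <= 5), a range probability
mean(ysamples == 5)   # estimate of Pr(Y = 5), a point-value probability
```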

I'd like to point out that this package is especially suited to researchers with a frequentist background who'd like to try out Bayesian nonparametrics. The introductory vignette <pglpm.github.io/inferno/articl> provides a simple and intuitive guide to the ideas, functions, and calculations, with a concrete example. The package also provides many useful tools and functions for subgroup/subpopulation studies.

The package is likewise suited to Bayesian researchers who'd like to do nonparametric analysis without worrying too much about the Monte Carlo coding and calculations that it often involves.

Feedback and questions much appreciated!

pglpm.github.io · Inference in R with Bayesian nonparametrics: Functions for Bayesian nonparametric population inference (also called exchangeable inference, or density inference). From a machine-learning perspective, they offer a model-free, uncertainty-quantified prediction algorithm.

Dear R community, I'd like to poll your opinions and ideas about the arguments of a possible R function:

Suppose you're working with the variates of some population; for instance the variates `species`, `island`, `bill_len`, `bill_dep`, `body_mass`, etc. of the `penguins` dataset <cran.r-project.org/package=bas>.

Suppose there's a package that allows you to calculate conditional probabilities of single or joint variates; for example

Pr( bill_len > 40, species = 'Adelie'  |  bill_dep < 16, body_mass = 4200)

and note in particular that this probability refers to intervals/tails ("bill_len > 40") as well as to point-values ("body_mass = 4200").

In fact, the crucial point here is that with this function you can inquire about the probability of a point value ("="), or about a cumulative probability (">" or "<"), or mixtures thereof, as you please.

Now, what would be the "best" way to input this kind of choice as an argument to the function? Let's say you have the following two input styles:

**A: indicate the request for a cumulative probability in the variate name:**

```
Pr(
Y = list('bill_len>' = 40, species = 'Adelie'),
X = list('bill_dep<' = 16, body_mass = 4200)
)
```

**B: indicate the request for a cumulative probability in a separate function argument:**

```
Pr(
Y = list(bill_len = 40, species = 'Adelie'),
X = list(bill_dep = 16, body_mass = 4200),
tails = list(bill_len = '>', bill_dep = '<') # or +1, -1 instead of '>', '<'?
)
```
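
One consideration when comparing the two (the helper name `normalise_spec` and the internal representation below are purely hypothetical, made up for illustration): whichever style is chosen, the function presumably normalises the input to some common internal form, and option A requires parsing the relation out of the variate name, while option B only requires looking it up in a second list.

```
## Hypothetical helper, not part of any package: turn either input style into
## a list of (variate, value, relation) triples.
normalise_spec <- function(values, tails = list()) {
  lapply(names(values), function(nm) {
    if (grepl('[<>]$', nm)) {
      ## option A: relation encoded in the name, e.g. 'bill_len>'
      list(variate = sub('[<>]$', '', nm),
           value = values[[nm]],
           relation = substring(nm, nchar(nm)))
    } else {
      ## option B (or a plain point value): relation given separately, default '='
      rel <- if (is.null(tails[[nm]])) '=' else tails[[nm]]
      list(variate = nm, value = values[[nm]], relation = rel)
    }
  })
}

## option A style:
normalise_spec(list('bill_len>' = 40, species = 'Adelie'))

## option B style:
normalise_spec(list(bill_len = 40, species = 'Adelie'),
               tails = list(bill_len = '>'))
```

Seen this way, A keeps the call compact but reserves '<' and '>' as characters that can never appear in a variate name, while B keeps names untouched at the cost of a second argument that has to stay consistent with `Y` and `X`.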

Any other ideas? Feel free to comment :) See <pglpm.github.io/inferno/refere> for a clearer idea about such a function.

Thank you so much for your help!

cran.r-project.org · basepenguins: Convert Files that Use 'palmerpenguins' to Work with 'datasets': From 'R' 4.5.0, the 'datasets' package includes the penguins and penguins_raw data sets popularised in the 'palmerpenguins' package. 'basepenguins' takes files that use the 'palmerpenguins' package and converts them to work with the versions from 'datasets' ('R' >= 4.5.0). It does this by removing calls to library(palmerpenguins) and making the necessary changes to column names. Additionally, it provides helper functions to define new file paths for saving the output and a directory of example files to experiment with.

Sunken British superyacht Bayesian is raised from the seabed.

A superyacht that sank off the coast of the Italian island of Sicily last year has been raised from the seabed by a specialist salvage team.

Seven of the 22 people on board died in the sinking, including the vessel's owner, British tech tycoon Mike Lynch, and his 18-year-old daughter.

The cause of the sinking is still under investigation.

mediafaro.org/article/20250620

BBC · Sunken British superyacht Bayesian is raised from the seabed.
#Italy #UK #Bayesian

aeon.co/essays/no-schrodingers

This is a pretty good article for showing how confused the interpretation of QM is. And it's a good article for understanding why I personally side with Bohm and Bell in thinking the pilot wave theory is the one most reasonable to believe. Because the pilot wave theory has the following quality: the theory is a mapping from initial position at time t=0 to final position at time t=1. It's deterministic, but our knowledge of the initial condition is not.
#quantum #bohm #bayesian

Aeon · No, Schrödinger’s cat is not alive and dead at the same time | Aeon Essays: The weird paradox of Schrödinger’s cat has found a lasting popularity. What does it mean for the future of quantum physics?

Happy Birthday, Laplace! 🎂 🪐 🎓 One of the first to use Bayesian probability theory in the modern way!

"One sees in this essay that the theory of probabilities is basically only common sense reduced to a calculus. It makes one estimate accurately what right-minded people feel by a sort of instinct, often without being able to give a reason for it. It leaves nothing arbitrary in the choice of opinions and of making up one's mind, every time one is able, by this means, to determine the most advantageous choice. Thereby, it becomes the most happy supplement to ignorance and to the weakness of the human mind. If one considers the analytical methods to which this theory has given rise, the truth of the principles that serve as the groundwork, the subtle and delicate logic needed to use them in the solution of the problems, the public-benefit businesses that depend on it, and the extension that it has received and may still receive from its application to the most important questions of natural philosophy and the moral sciences; if one observes also that even in matters which cannot be handled by the calculus, it gives the best rough estimates to guide us in our judgements, and that it teaches us to guard ourselves from the illusions which often mislead us, one will see that there is no science at all more worthy of our consideration, and that it would be a most useful part of the system of public education."

*Philosophical Essay on Probabilities*, 1814 <doi.org/10.1007/978-1-4612-418>

Everyone thinks big data should be better. But if you have a good model that makes good predictions, then more data is often an enormous nuisance. The posterior distributions are so narrow that you can never sample properly. It's like finding a hydrogen atom in your bedroom: the thing is just too damn small. So anyway, we are trying tempering for our migration model just so we can get convergence. I don't care about tiny uncertainty intervals; I just want to be near the right answer. #Bayesian #statistics
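
In case it's useful to anyone, here is a minimal toy sketch of the tempering idea (plain random-walk Metropolis with parallel tempering; nothing to do with the actual migration model, and all numbers are made up): the heated chains see a flattened version of the target, wander freely, and occasionally hand a good state down to the cold chain.

```
## Toy target: a mixture of two needle-thin Gaussian peaks. A single
## random-walk chain gets stuck in whichever peak it finds first.
set.seed(1)
log_target <- function(x) {
  la <- log(0.3) + dnorm(x, mean = -1, sd = 0.01, log = TRUE)
  lb <- log(0.7) + dnorm(x, mean =  1, sd = 0.01, log = TRUE)
  m <- pmax(la, lb)
  m + log(exp(la - m) + exp(lb - m))   # log-sum-exp, avoids underflow
}

temps  <- 10^(0:4)                 # temperature ladder; temps[1] = 1 is the cold chain
n_iter <- 50000
x      <- rep(-1, length(temps))   # all chains start in the smaller peak
cold   <- numeric(n_iter)

for (it in seq_len(n_iter)) {
  ## random-walk Metropolis within each chain, targeting pi^(1/T)
  for (k in seq_along(temps)) {
    prop <- x[k] + rnorm(1, sd = 0.05 * sqrt(temps[k]))
    if (log(runif(1)) < (log_target(prop) - log_target(x[k])) / temps[k]) x[k] <- prop
  }
  ## propose swapping states between a random pair of adjacent temperatures
  k <- sample(length(temps) - 1, 1)
  logr <- (log_target(x[k + 1]) - log_target(x[k])) * (1 / temps[k] - 1 / temps[k + 1])
  if (log(runif(1)) < logr) x[c(k, k + 1)] <- x[c(k + 1, k)]
  cold[it] <- x[1]
}

mean(cold > 0)   # should come out near 0.7 (up to Monte Carlo error),
                 # the weight of the peak the chains did NOT start in
```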

@AeonCypher @paninid

"A p-value is an #estimate of p(Data | Null Hypothesis). " – not correct. A p-value is an estimate of

p(Data or other imagined data | Null Hypothesis)

so not even just of the actual data you have. Which is why p-values depend on your stopping rule (and do not satisfy the "likelihood principle"). In this regard, see Jeffreys's quote below.

Imagine you design an experiment this way: "I'll test 10 subjects, and in the meantime I apply for a grant. At the time the 10th subject is tested, I'll know my application's outcome. If the outcome is positive, I'll test 10 more subjects; if it isn't, I'll stop". Not an unrealistic situation.

With this stopping rule, your p-value will depend on the probability that you get the grant. This is not a joke.
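
To make the grant example concrete, here is a toy calculation (the numbers are made up, and "at least as extreme" is taken to mean a larger |t|, which is itself a choice one has to make): the very same observed t statistic gets a different p-value depending on the probability q that the grant comes through, because q changes the null sampling distribution of the reported statistic.

```
## Suppose the grant fell through, so you stopped at n = 10 and observed t = 2.1.
t_obs <- 2.1

p_value <- function(q) {
  ## with probability (1 - q) the experiment stops at n = 10 (9 degrees of freedom),
  ## with probability q it continues to n = 20 (19 df)
  (1 - q) * 2 * pt(t_obs, df = 9,  lower.tail = FALSE) +
       q  * 2 * pt(t_obs, df = 19, lower.tail = FALSE)
}

p_value(q = 0)    # grant impossible:  ~0.065
p_value(q = 0.5)  # coin-flip grant:   ~0.057
p_value(q = 1)    # grant certain:     ~0.049
```

Slide q from 0 towards 1 and the same observed statistic drifts from "not significant" to "significant" at the 5% level, without the data in hand changing at all.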

"*What the use of P implies, therefore, is that a hypothesis that may be true may be rejected because it has not predicted observable results that have not occurred.* This seems a remarkable procedure. On the face of it the fact that such results have not occurred might more reasonably be taken as evidence for the law, not against it." – H. Jeffreys, "Theory of Probability" § VII.7.2 (emphasis in the original) <doi.org/10.1093/oso/9780198503>.

OUP Academic · Theory of Probability: Abstract. Jeffreys' Theory of Probability, first published in 1939, was the first attempt to develop a fundamental theory of scientific inference based on …

@paninid p-values, to a large extent, exist because calculating the posterior is computationally expensive. Not all fields use the .05 cutoff.

A p-value is an #estimate of p(Data | Null Hypothesis). If the two #hypotheses are equally likely and they are mutually exclusive and they are closed over the #hypothesis space, then this is the same as p(Hypothesis | Data).

Meaning, under certain assumptions, the p-value does represent the actual probability of being wrong.

However, given modern computers, there is no reason that #Bayesian odds-ratios can't completely replace their usage and avoid the many many problems with p-values.
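
As a toy illustration of how cheap that has become (made-up counts, and a flat Beta(1, 1) prior on the alternative purely for simplicity):

```
## 60 successes in 100 trials; H0: theta = 0.5 versus H1: theta ~ Beta(1, 1)
k <- 60; n <- 100

binom.test(k, n, p = 0.5)$p.value      # frequentist p-value, ~0.057

## Bayes factor BF01 = marginal likelihood under H0 / marginal likelihood under H1;
## beta() is the Beta function, i.e. the integral over theta under the flat prior
bf01 <- dbinom(k, n, prob = 0.5) / (choose(n, k) * beta(k + 1, n - k + 1))
bf01                                   # ~1.1: the data barely discriminate

bf01 / (1 + bf01)                      # P(H0 | data) with equal prior odds, ~0.52
```

Here the p-value hovers at the conventional threshold while the odds are essentially even, which is one of those problems an odds-ratio report avoids.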

Theoretical Conspiracists will love this:
“The concerns focus on hard drives that Lynch reportedly kept with him. Survivors reportedly told Italian prosecutors that Lynch “did not trust [internet] cloud services” and kept his data with him.”

theguardian.com/world/2024/sep
#Bayesian #Lynch #TheGuardian

The Guardian · Sicily: fear of foreign actors prompts security request for wreck of luxury yacht · By Edward Helmore

After the yacht accident: Hewlett Packard demands billions from Lynch's widow

British tech tycoon Mike Lynch died in the sinking of the luxury yacht "Bayesian". Hewlett Packard Enterprise is nevertheless still demanding damages in the billions. Is Lynch's widow now liable? By Angela Göpfert.

➡️ tagesschau.de/wirtschaft/unter

tagesschau.de · After the yacht accident: Hewlett Packard demands billions from Lynch's widow · By Angela Göpfert