
Conversation

@bob-carpenter (Collaborator)

I added a minimum number of micro steps parameter. I'm not trying to auto-tune it yet.

There's some kind of bug in my evaluation code, I think. But I haven't been able to track it down. Clues:

  • WalnutsSampler doesn't work unless min_micro_steps = 1
  • AdaptiveWalnutsSampler works with just about any min_micro_steps.

If WalnutsSampler doesn't work, then AdaptiveWalnutsSampler shouldn't work either, since it calls WalnutsSampler to do the sampling. At one point I added prints to confirm that AdaptiveWalnutsSampler passes along the min number of micro steps and that it's actually being used. So I can only imagine that I'm constructing WalnutsSampler correctly inside AdaptiveWalnutsSampler but incorrectly when calling it directly. Or maybe it's just that the direct call doesn't adapt the mass matrix, so it really struggles?

I would like to get the WalnutsSampler working on its own before merging, but I'm not clear on where to go next in debugging.

@WardBrian (Collaborator)

Can you elaborate on what you mean by doesn’t work? Does it error out or does it just not sample well?

@bob-carpenter (Collaborator, Author)

It takes a couple of draws and then essentially hangs, leading to posterior standard deviations of 1e-14. So it feels like the last bug. It samples fine when called by AdaptiveWalnutsSampler, so I don't think WalnutsSampler itself is broken, just how it's getting called somewhere. It also works when the min number of micro steps is set to 1, as it is in the tests. Maybe it won't work without a better mass matrix?
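Not code from the PR, but a minimal sketch (function name and tolerance are hypothetical) of the symptom described above: a sampler that hangs keeps emitting the same point, so the per-parameter standard deviation over the draws collapses to (near) zero:

```python
import numpy as np

def flag_stuck_chains(draws, tol=1e-8):
    """Flag parameters whose posterior draws barely move.

    draws: array of shape (num_draws, num_params).
    Returns a boolean mask: True where the chain appears stuck.
    """
    return np.std(draws, axis=0) < tol

rng = np.random.default_rng(0)
healthy = rng.standard_normal((100, 1))   # a chain that mixes
stuck = np.full((100, 1), 0.7)            # a chain repeating one draw
draws = np.hstack([healthy, stuck])
print(flag_stuck_chains(draws))  # [False  True]
```

A check like this, run on a short warm-up, can distinguish "hangs after a couple of draws" from "samples but mixes poorly".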

@WardBrian (Collaborator)

Is this for any given posterior or are you trying that correlated normal example? Nothing sticks out to me in the diff in terms of things passed in the wrong order or similar.

@bob-carpenter (Collaborator, Author)

Thanks for taking a look. Just the ill-conditioned (not correlated) normal. That one's easy to precondition perfectly, which is why adaptive WALNUTS is so fast there. Maybe it's just not a great sampler for this distribution with that parameter greater than 1. I've stared at the chain of calls for over an hour and added prints where the values are used to make sure the right ones were getting through. So I'm tempted just to merge this and use a different example config.
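An illustration (not code from the PR; the scales are made up) of why the ill-conditioned normal is easy to precondition perfectly: the coordinates are independent, so dividing each by its marginal scale, equivalent to a diagonal mass matrix, maps the target exactly to a standard normal:

```python
import numpy as np

# Hypothetical ill-conditioned normal: independent coordinates whose
# scales span six orders of magnitude.
scales = np.logspace(-3, 3, 7)

rng = np.random.default_rng(1)
z = rng.standard_normal((10_000, scales.size))  # draws from N(0, I)
x = z * scales  # draws from the target N(0, diag(scales**2))

# A diagonal preconditioner equal to the marginal scales maps the
# target back to an isotropic standard normal, so one step size
# works equally well in every direction.
x_precond = x / scales
print(np.std(x_precond, axis=0))  # each entry is close to 1
```

Without that preconditioning, a fixed-metric sampler has to resolve the smallest scale while traversing the largest, which may be why the non-adaptive run struggles here.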
