Description
Looking closely at multinomial, I see some strange behaviors...
- When an error occurs, it seems to modify the tensor of probabilities passed by the user:
> p = torch.rand(10)
> torch.multinomial(p, 3) -- ok
> torch.multinomial(p, 11) -- oops, error
> print(p:size())
1
10
[torch.LongStorage of size 2]
something wrong happened! p has silently become a 2D tensor of size 1x10, even though I never touched it.
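Until this is fixed, a defensive workaround seems to be to hand multinomial a throwaway copy of the probabilities, so the caller's tensor keeps its shape even when the call errors out. A minimal sketch, assuming the in-place resize is the only side effect (the helper name safeMultinomial is mine, not part of torch):

```lua
require 'torch'

-- Hypothetical helper: run multinomial on a clone so the caller's
-- probability tensor is never resized, even if the call raises an error.
local function safeMultinomial(p, n, replacement)
   local ok, res = pcall(torch.multinomial, p:clone(), n, replacement or false)
   if not ok then
      error(res)  -- re-raise the original error; p itself was never touched
   end
   return res
end

local p = torch.rand(10)
local x = safeMultinomial(p, 3)
print(p:size())  -- still a 1D tensor of size 10
```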
- I am not sure it is completely mathematically correct, or whether something is wrong in the random number generator. If you try the following at home with C = 10, ..., 60 it works quite fine, but with C = 80 or above the histogram shows weird spiky artefacts that should not exist. Any idea?
require 'torch'
require 'gnuplot'

torch.manualSeed(5555)

local C = 80      -- number of categories
local N = 100000  -- number of samples

-- some unbalanced data: weight of category i is proportional to i
local p = torch.Tensor(C)
for i=1,C do
   p[i] = i
end

-- sample N indices with replacement and plot the histogram
local x = torch.multinomial(p, N, true)
gnuplot.axis{'','',0,''}
gnuplot.hist(x)
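To make the artefacts easier to quantify than eyeballing the histogram, one could compare the empirical frequency of each category against the expected one. A minimal sketch along those lines (the use of torch.histc and the variable names are my own, not taken from the snippet above):

```lua
require 'torch'

torch.manualSeed(5555)

local C = 80
local N = 100000

-- same unbalanced weights as above
local p = torch.Tensor(C)
for i=1,C do p[i] = i end

-- multinomial returns a LongTensor; convert so histc can bin it
local x = torch.multinomial(p, N, true):double()

-- empirical frequency of each category (C bins covering 1..C)
local counts = torch.histc(x, C, 1, C)
local empirical = counts:div(N)

-- expected frequency under the requested distribution
local expected = p:div(p:sum())

-- largest absolute deviation; the spikes in the histogram show up here
print(('max |empirical - expected| = %.6f'):format((empirical - expected):abs():max()))
```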