
Commit 09dbdca

Eduardo Leao authored and committed
better browser plugins
1 parent 8faa8bc commit 09dbdca

File tree

5 files changed: 62 additions, 65 deletions


dist/index.js

Lines changed: 8 additions & 8 deletions

@@ -1780,12 +1780,12 @@ class Embedding extends Module {
   /**
    * Embedding class, turns indexes into vectors.
    *
-   * @param {number} in_size - number of different indexes (vocabulary size).
-   * @param {number} out_size - size of the embedding vector generated.
+   * @param {number} vocab_size - number of different indexes (vocabulary size).
+   * @param {number} embed_size - size of the embedding vector generated.
    */
-  constructor(in_size, embed_size) {
+  constructor(vocab_size, embed_size) {
     super();
-    this.E = randn([in_size, embed_size], true, "cpu", false);
+    this.E = randn([vocab_size, embed_size], true, "cpu", false);
   }
   /**
    * Extracts embedding from rows in "idx":
@@ -1802,14 +1802,14 @@ class Embedding extends Module {
 class PositionalEmbedding extends Module {
   E;
   /**
-   * Embedding class, turns indexes into vectors.
+   * Embedding class, turns indexes into vectors based on it's position through an optimized lookup table.
    *
-   * @param {number} n_timesteps - number of different embeddings (number of timesteps in each instance in batch).
+   * @param {number} input_size - number of different embeddings (size of the input).
    * @param {number} embed_size - size of the embedding vector generated.
    */
-  constructor(n_timesteps, embed_size) {
+  constructor(input_size, embed_size) {
     super();
-    this.E = randn([n_timesteps, embed_size], true, "cpu", false);
+    this.E = randn([input_size, embed_size], true, "cpu", false);
   }
   /**
    * Gets embedding for timesteps in "idx" array.
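The commit only renames constructor parameters (in_size → vocab_size, n_timesteps → input_size); the positional call order is unchanged. A minimal stand-alone sketch of the renamed Embedding signature follows. The `randn` stand-in here returns a plain nested Array instead of a js-pytorch Tensor, and `forward` is simplified to a row lookup, so this is an illustration of the shape semantics, not the library's implementation:

```javascript
// Stand-in for js-pytorch's randn: a rows x cols nested Array of
// standard-normal samples via Box-Muller. Illustration only.
function randn(shape) {
  const [rows, cols] = shape;
  const sample = () =>
    Math.sqrt(-2 * Math.log(1 - Math.random())) *
    Math.cos(2 * Math.PI * Math.random());
  return Array.from({ length: rows }, () =>
    Array.from({ length: cols }, sample)
  );
}

// Embedding with the renamed parameters:
// vocab_size rows (one per index), embed_size columns.
class Embedding {
  constructor(vocab_size, embed_size) {
    this.E = randn([vocab_size, embed_size]);
  }
  // Look up one embedding vector per index.
  forward(idx) {
    return idx.map((i) => this.E[i]);
  }
}

const emb = new Embedding(10, 4); // vocabulary of 10, 4-dim vectors
const out = emb.forward([0, 3, 3]);
console.log(out.length, out[0].length); // 3 4
```

The rename makes the weight-matrix shape self-documenting: the first dimension is the vocabulary, the second the embedding width.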

dist/index.mjs

Lines changed: 8 additions & 8 deletions

@@ -1780,12 +1780,12 @@ class Embedding extends Module {
   /**
    * Embedding class, turns indexes into vectors.
    *
-   * @param {number} in_size - number of different indexes (vocabulary size).
-   * @param {number} out_size - size of the embedding vector generated.
+   * @param {number} vocab_size - number of different indexes (vocabulary size).
+   * @param {number} embed_size - size of the embedding vector generated.
    */
-  constructor(in_size, embed_size) {
+  constructor(vocab_size, embed_size) {
     super();
-    this.E = randn([in_size, embed_size], true, "cpu", false);
+    this.E = randn([vocab_size, embed_size], true, "cpu", false);
   }
   /**
    * Extracts embedding from rows in "idx":
@@ -1802,14 +1802,14 @@ class Embedding extends Module {
 class PositionalEmbedding extends Module {
   E;
   /**
-   * Embedding class, turns indexes into vectors.
+   * Embedding class, turns indexes into vectors based on it's position through an optimized lookup table.
    *
-   * @param {number} n_timesteps - number of different embeddings (number of timesteps in each instance in batch).
+   * @param {number} input_size - number of different embeddings (size of the input).
    * @param {number} embed_size - size of the embedding vector generated.
    */
-  constructor(n_timesteps, embed_size) {
+  constructor(input_size, embed_size) {
     super();
-    this.E = randn([n_timesteps, embed_size], true, "cpu", false);
+    this.E = randn([input_size, embed_size], true, "cpu", false);
   }
   /**
    * Gets embedding for timesteps in "idx" array.

dist/js-pytorch-browser.js

Lines changed: 0 additions & 48 deletions
Large diffs are not rendered by default.

dist/utils.js

Lines changed: 43 additions & 0 deletions
Some generated files are not rendered by default.

site/tensor/index.html

Lines changed: 3 additions & 1 deletion

@@ -753,14 +753,16 @@ <h2 id="torchtril">torch.tril</h2>
 <h2 id="torchrandn">torch.randn</h2>
 <pre><code>torch.randn(*shape,
             requires_grad=false,
-            device='cpu') → Tensor
+            device='cpu',
+            xavier=false) → Tensor
 </code></pre>
 <p>Returns a tensor filled with randomly sampled data with dimensions like <code>shape</code>. The sample is from a normal distribution.</p>
 <p>Parameters</p>
 <ul>
 <li><strong>shape (Array)</strong> - Javascript Array containing the shape of the Tensor.</li>
 <li><strong>requires_grad (boolean)</strong> - Whether to keep track of this tensor's gradients. Set this to true if you want to <strong>learn</strong> this parameter in your model. Default: <code>false</code>.</li>
 <li><strong>device (string)</strong> - Device to store Tensor. Either "gpu" or "cpu". If your device has a gpu, large models will train faster on it.</li>
+<li><strong>xavier (boolean)</strong> - Whether to use <a target="_blank" href="https://prateekvishnu.medium.com/xavier-and-he-normal-he-et-al-initialization-8e3d7a087528">Xavier Initialization</a> on this tensor. Default: <code>false</code>.</li>
 </ul>
 <p>Example</p>
 <pre><code class="language-javascript">&gt;&gt;&gt; let a = torch.randn([3,2], false, 'gpu');
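The documented change adds a `xavier` flag to `torch.randn`. A hedged stand-alone sketch of what such a flag can do follows; it assumes the normal-distribution variant of Xavier/Glorot initialization, std = sqrt(2 / (fan_in + fan_out)), which is one common choice. The exact scaling js-pytorch uses may differ, and this returns a nested Array rather than a Tensor, so treat it as an illustration of the concept only:

```javascript
// Illustrative sketch of randn with an optional Xavier flag.
// NOT js-pytorch's implementation; the library's scaling may differ.
function randnSketch(shape, xavier = false) {
  const [rows, cols] = shape;
  // Xavier/Glorot scaling keeps activation variance roughly stable
  // across layers; this variant uses std = sqrt(2 / (fan_in + fan_out)).
  const std = xavier ? Math.sqrt(2 / (rows + cols)) : 1;
  const sample = () =>
    std *
    Math.sqrt(-2 * Math.log(1 - Math.random())) *
    Math.cos(2 * Math.PI * Math.random());
  return Array.from({ length: rows }, () =>
    Array.from({ length: cols }, sample)
  );
}

// Plain standard-normal tensor vs. an Xavier-scaled one:
const plain = randnSketch([3, 2]);
const scaled = randnSketch([3, 2], true);
console.log(plain.length, scaled[0].length); // 3 2
```

The flag matters because a weight matrix drawn with unit variance can saturate activations in deep stacks, while Xavier-scaled draws shrink with layer width.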

0 commit comments
