@@ -5316,9 +5316,6 @@
 <li> Long Short Term Memory Make the RNN out of little modules that are designed to remember values for a long time.</li>
 <li> Hessian Free Optimization: Deal with the vanishing gradients problem by using a fancy optimizer that can detect directions with a tiny gradient but even smaller curvature.</li>
-<li> Echo State Networks: Initialize the input a hidden and hidden-hidden and output-hidden connections very carefully so that the hidden state has a huge reservoir of weakly coupled oscillators which can be selectively driven by the input.</li>
-<ul>
-<li> ESNs only need to learn the hidden-output connections.</li>
-</ul>
+<li> Echo State Networks (ESN): Initialize the input a hidden and hidden-hidden and output-hidden connections very carefully so that the hidden state has a huge reservoir of weakly coupled oscillators which can be selectively driven by the input. ESNs only need to learn the hidden-output connections.</li>
 <li> Good initialization with momentum Initialize like in Echo State Networks, but then learn all of the connections using momentum</li>
 </ol>
 <!-- ------------------- end of main content --------------- -->
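The merged ESN bullet above hinges on one idea: the input-hidden and hidden-hidden weights stay fixed and random, and only the hidden-output readout is trained. A minimal sketch of that, assuming toy dimensions, a sine one-step-ahead task, and a ridge-regression readout (none of which come from the notes themselves):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration only.
n_in, n_res, T = 1, 50, 200

# Fixed random input-hidden and hidden-hidden weights: never trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Rescale so the spectral radius is below 1, giving the weakly coupled
# "reservoir" dynamics the bullet describes.
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

# Drive the reservoir with a toy input sequence and record hidden states.
u = np.sin(np.linspace(0, 8 * np.pi, T)).reshape(T, n_in)
x = np.zeros(n_res)
states = np.zeros((T, n_res))
for t in range(T):
    x = np.tanh(W_in @ u[t] + W @ x)
    states[t] = x

# Target: predict the input one step ahead.
y = np.roll(u, -1, axis=0)

# Only the hidden-output weights are learned, here via ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res), states.T @ y)
pred = states @ W_out

print(float(np.mean((pred[:-1] - y[:-1]) ** 2)))
```

The closed-form readout is what makes ESN training cheap: no backpropagation through time is needed, since the recurrent weights are never updated.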
doc/pub/week5/html/week5-reveal.html (1 addition & 6 deletions)
@@ -5191,12 +5191,7 @@ <h2 id="four-effective-ways-to-learn-an-rnn-and-preparing-for-next-week">Four ef
 <ol>
 <p><li> Long Short Term Memory Make the RNN out of little modules that are designed to remember values for a long time.</li>
 <p><li> Hessian Free Optimization: Deal with the vanishing gradients problem by using a fancy optimizer that can detect directions with a tiny gradient but even smaller curvature.</li>
-<p><li> Echo State Networks: Initialize the input a hidden and hidden-hidden and output-hidden connections very carefully so that the hidden state has a huge reservoir of weakly coupled oscillators which can be selectively driven by the input.</li>
-<ul>
-
-<p><li> ESNs only need to learn the hidden-output connections.</li>
-</ul>
-<p>
+<p><li> Echo State Networks (ESN): Initialize the input a hidden and hidden-hidden and output-hidden connections very carefully so that the hidden state has a huge reservoir of weakly coupled oscillators which can be selectively driven by the input. ESNs only need to learn the hidden-output connections.</li>
 <p><li> Good initialization with momentum Initialize like in Echo State Networks, but then learn all of the connections using momentum</li>
doc/pub/week5/html/week5-solarized.html (1 addition & 4 deletions)
@@ -5220,10 +5220,7 @@ <h2 id="four-effective-ways-to-learn-an-rnn-and-preparing-for-next-week">Four ef
 <ol>
 <li> Long Short Term Memory Make the RNN out of little modules that are designed to remember values for a long time.</li>
 <li> Hessian Free Optimization: Deal with the vanishing gradients problem by using a fancy optimizer that can detect directions with a tiny gradient but even smaller curvature.</li>
-<li> Echo State Networks: Initialize the input a hidden and hidden-hidden and output-hidden connections very carefully so that the hidden state has a huge reservoir of weakly coupled oscillators which can be selectively driven by the input.</li>
-<ul>
-<li> ESNs only need to learn the hidden-output connections.</li>
-</ul>
+<li> Echo State Networks (ESN): Initialize the input a hidden and hidden-hidden and output-hidden connections very carefully so that the hidden state has a huge reservoir of weakly coupled oscillators which can be selectively driven by the input. ESNs only need to learn the hidden-output connections.</li>
 <li> Good initialization with momentum Initialize like in Echo State Networks, but then learn all of the connections using momentum</li>
 </ol>
 <!-- ------------------- end of main content --------------- -->
doc/pub/week5/html/week5.html (1 addition & 4 deletions)
@@ -5297,10 +5297,7 @@ <h2 id="four-effective-ways-to-learn-an-rnn-and-preparing-for-next-week">Four ef
 <ol>
 <li> Long Short Term Memory Make the RNN out of little modules that are designed to remember values for a long time.</li>
 <li> Hessian Free Optimization: Deal with the vanishing gradients problem by using a fancy optimizer that can detect directions with a tiny gradient but even smaller curvature.</li>
-<li> Echo State Networks: Initialize the input a hidden and hidden-hidden and output-hidden connections very carefully so that the hidden state has a huge reservoir of weakly coupled oscillators which can be selectively driven by the input.</li>
-<ul>
-<li> ESNs only need to learn the hidden-output connections.</li>
-</ul>
+<li> Echo State Networks (ESN): Initialize the input a hidden and hidden-hidden and output-hidden connections very carefully so that the hidden state has a huge reservoir of weakly coupled oscillators which can be selectively driven by the input. ESNs only need to learn the hidden-output connections.</li>
 <li> Good initialization with momentum Initialize like in Echo State Networks, but then learn all of the connections using momentum</li>
 </ol>
 <!-- ------------------- end of main content --------------- -->
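The fourth item in the list, "good initialization with momentum," differs from the ESN approach in that all connections are subsequently trained; the update rule it refers to is classical momentum. A minimal sketch on a toy one-weight regression problem (the learning rate, momentum coefficient, and task are assumed values for illustration, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy problem: fit y = 3x with a single weight, to show the update rule.
X = rng.normal(size=(100, 1))
y = 3.0 * X

w = np.zeros((1, 1))   # weight being learned
v = np.zeros_like(w)   # velocity accumulated by momentum
lr, mu = 0.1, 0.9      # learning rate and momentum coefficient (assumed)

for _ in range(100):
    grad = 2 * X.T @ (X @ w - y) / len(X)  # mean-squared-error gradient
    v = mu * v - lr * grad                 # decay old velocity, add new step
    w = w + v                              # move along the velocity

print(w.item())  # converges close to 3.0
```

The velocity term lets small but consistent gradients accumulate across many steps, which is why momentum helps in the low-curvature directions that make RNN training difficult.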