<h2 id="gating-mechanism-long-short-term-memory-lstm" class="anchor">Gating mechanism: Long Short Term Memory (LSTM)</h2>
<p>Besides a simple recurrent neural network layer, as discussed during the last two weeks, there are two other
commonly used types of recurrent neural network layers: Long Short
Term Memory (LSTM) and Gated Recurrent Unit (GRU). For a short
introduction to these layers see <a href="https://medium.com/mindboard/lstm-vs-gru-experimental-comparison-955820c21e8b" target="_blank"><tt>https://medium.com/mindboard/lstm-vs-gru-experimental-comparison-955820c21e8b</tt></a>.</p>
<ol>
<li>The information stays in the cell so long as its <b>keep</b> gate is on.</li>
<li>Information can be read from the cell by turning on its <b>read</b> gate.</li>
</ol>
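<p>To make the gating concrete, here is a minimal Python sketch of such a memory cell. This is an illustration, not the notes' own code: the function <tt>memory_cell_step</tt> and its arguments are invented for this example, and a <b>write</b> gate, which lets new information into the cell, is included alongside the keep and read gates listed above. A real LSTM computes all gate values from the current input and the previous hidden state, which is omitted here.</p>

<pre>
def memory_cell_step(cell, candidate, write_gate, keep_gate, read_gate):
    # Each gate is a value in [0, 1] that acts multiplicatively.
    # The cell retains its old content while the keep gate is on
    # and takes in the candidate value while the write gate is on.
    cell = keep_gate * cell + write_gate * candidate
    # The cell's content is exposed only when the read gate is on.
    output = read_gate * cell
    return cell, output

# Toy usage: write a value, keep it for one step, then read it out.
c = 0.0
c, y = memory_cell_step(c, candidate=1.0, write_gate=1.0, keep_gate=0.0, read_gate=0.0)
c, y = memory_cell_step(c, candidate=0.0, write_gate=0.0, keep_gate=1.0, read_gate=1.0)
print(c, y)  # 1.0 1.0 -- the stored value survived the step and was read out
</pre>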
<p>The LSTM was first introduced to overcome the vanishing gradient problem of simple recurrent networks.</p>
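<p>A back-of-the-envelope illustration of why the gating helps (the numbers below are arbitrary, chosen only for the example): in a simple linear recurrent unit the gradient through <tt>T</tt> time steps is multiplied by the recurrent weight <tt>T</tt> times, whereas the gradient along the LSTM cell state is scaled by the keep gate at each step, and that gate can stay close to 1.</p>

<pre>
# Toy comparison of gradient flow over T time steps (illustrative only).
T = 50
w = 0.5        # recurrent weight of a simple linear RNN unit
keep = 0.99    # an LSTM-style keep (forget) gate held close to 1

grad_rnn = w ** T      # dh_T/dh_0 shrinks geometrically with T
grad_lstm = keep ** T  # dc_T/dc_0 decays only as fast as the gate allows

print(grad_rnn)   # ~8.9e-16: the gradient has effectively vanished
print(grad_lstm)  # ~0.61: still a usable learning signal
</pre>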