<p><li> Lectures Thursdays 12.15pm-2pm, room FØ434, Department of Physics</li>
<p><li> Lab and exercise sessions Thursdays 2.15pm-4pm, room FØ434, Department of Physics</li>
<p><li> We plan to work on two projects which will define the content of the course; the format can be agreed upon by the participants</li>
<p><li> No exam, only two projects. Each project counts 1/2 of the final grade. Alternatively, one long project which counts 100% of the final grade.</li>
<p><li> All info at the GitHub address <a href="https://github.com/CompPhysics/AdvancedMachineLearning" target="_blank"><tt>https://github.com/CompPhysics/AdvancedMachineLearning</tt></a></li>
<p><li><ahref="https://sebastianraschka.com/blog/2022/ml-pytorch-book.html" target="_blank">Sebastian Rashcka et al, Machine learning with Sickit-Learn and PyTorch</a></li>
<p><li><ahref="https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html" target="_blank">David Foster, Generative Deep Learning with TensorFlow</a></li>
<p><li><ahref="https://github.com/PacktPublishing/Hands-On-Generative-AI-with-Python-and-TensorFlow-2" target="_blank">Bali and Gavras, Generative AI with Python and TensorFlow 2</a></li>
</ul>
</div>
<p>All three books have GitHub addresses from which one can download all codes. We will borrow most of the material from these three texts as well as
from Goodfellow, Bengio and Courville's text <a href="https://www.deeplearningbook.org/" target="_blank">Deep Learning</a>.
</p>
<p>This path brings us from discriminative models (like the standard application of NNs, CNNs, etc.) to generative models. The two projects follow, to a large extent,
the lectures. Topics for data sets will be discussed.
</p>
</section>
<section>
<h2id="paths-for-projects-writing-own-codes">Paths for projects, writing own codes </h2>
<p>The computational path: Here we propose a path where you develop your
own code for a convolutional or, alternatively, a recurrent neural network
and apply it to data sets of your own selection. The code should
be object-oriented and flexible, allowing for later extensions such as
different loss/cost functions and other
functionalities. Feel free to select data sets from those suggested
below. This code can also be extended by adding, for example,
autoencoders. You can compare your own code with implementations
using TensorFlow (Keras), PyTorch or other libraries.
</p>
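
<p>As an illustration of what "object-oriented and flexible" can mean in practice, here is a minimal PyTorch sketch (not a required solution); the class name <tt>FlexibleCNN</tt>, the layer sizes and the default cross-entropy loss are assumptions made only for the example, chosen to show how the loss/cost function can be injected and swapped later.
</p>

<pre><code>
# Minimal sketch of an object-oriented CNN with a pluggable loss function.
# All names and hyperparameters are illustrative assumptions, not course requirements.
import torch
import torch.nn as nn

class FlexibleCNN(nn.Module):
    """Small CNN whose loss function is passed in, so it can be swapped."""
    def __init__(self, n_classes=10, loss_fn=None):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Linear(32 * 7 * 7, n_classes)
        )
        # Default to cross entropy; any callable loss can be injected instead.
        self.loss_fn = loss_fn if loss_fn is not None else nn.CrossEntropyLoss()

    def forward(self, x):
        return self.classifier(self.features(x))

    def loss(self, x, targets):
        return self.loss_fn(self.forward(x), targets)

# Usage on random data shaped like 28x28 grayscale images (MNIST-like).
model = FlexibleCNN(n_classes=10)
x = torch.randn(8, 1, 28, 28)
y = torch.randint(0, 10, (8,))
print(model.loss(x, y).item())
</code></pre>

<p>The same architecture written directly in Keras/TensorFlow then gives a natural baseline to compare your own code against.
</p>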
</section>
<section>
<h2id="the-application-path-own-data">The application path/own data </h2>
<p>The application path: Here you can use the most relevant method(s)
(say, convolutional neural networks for images) and apply it (or them)
to data sets relevant for your own research.
</p>
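
<p>A hedged sketch of this workflow in PyTorch is given below; the directory name <tt>my_data/</tt> (with one subfolder per class, as expected by <tt>torchvision.datasets.ImageFolder</tt>), the image size and the choice of ResNet-18 are assumptions for illustration, to be replaced by whatever fits your own data and research question.
</p>

<pre><code>
# Sketch only: apply a standard CNN to your own image data.
# "my_data/" with one subfolder per class is an assumed example layout.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms, models

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("my_data/", transform=transform)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# Start from a standard architecture and adapt the output layer
# to the number of classes found in the data.
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, len(dataset.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):      # a few epochs, only to illustrate the training loop
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
</code></pre>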
</section>
<section>
<h2id="gaussian-processes-and-bayesian-analysis">Gaussian processes and Bayesian analysis </h2>
<p><li><ahref="https://sebastianraschka.com/blog/2022/ml-pytorch-book.html" target="_blank">Sebastian Rashcka et al, Machine learning with Sickit-Learn and PyTorch</a></li>
367
+
<p><li><ahref="https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html" target="_blank">David Foster, Generative Deep Learning with TensorFlow</a></li>
368
+
<p><li><ahref="https://github.com/PacktPublishing/Hands-On-Generative-AI-with-Python-and-TensorFlow-2" target="_blank">Bali and Gavras, Generative AI with Python and TensorFlow 2</a></li>
369
+
</ul>
370
+
</div>
371
+
372
+
<p>All three books have GitHub addresses from where one can download all codes. We will borrow most of the material from these three texts as well as
373
+
from Goodfellow, Bengio and Courville's text <ahref="https://www.deeplearningbook.org/" target="_blank">Deep Learning</a>
374
+
</p>
375
+
</section>
<section>
<h2id="types-of-machine-learning">Types of machine learning </h2>
</section>
<section>
<h2id="example-of-generative-modeling-taken-from-generative-deep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html">Example of generative modeling, <ahref="https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html" target="_blank">taken from Generative Deep Learning by David Foster</a></h2>