Commit d0d1465

update first week
1 parent f4b21e3 commit d0d1465

8 files changed

+919
-899
lines changed


doc/pub/week1/html/week1-bs.html

Lines changed: 130 additions & 124 deletions
Large diffs are not rendered by default.

doc/pub/week1/html/week1-reveal.html

Lines changed: 103 additions & 103 deletions
@@ -212,9 +212,9 @@ <h2 id="practicalities">Practicalities </h2>
 
 <ol>
 <p><li> Lectures Thursdays 1215pm-2pm, room F&#216;434, Department of Physics</li>
-<p><li> Lab and exercise sessions Thursdays 215pm-4pm, , room F&#216;434, Department of Physics</li>
+<p><li> Lab and exercise sessions Thursdays 215pm-4pm, room F&#216;434, Department of Physics</li>
 <p><li> We plan to work on two projects which will define the content of the course, the format can be agreed upon by the participants</li>
-<p><li> No exam, only two projects. Each projects counts 1/2 of the final grade. Aleternatively one long project.</li>
+<p><li> No exam, only two projects. Each project counts 1/2 of the final grade. Alternatively, one long project which counts 100% of the final grade</li>
 <p><li> All info at the GitHub address <a href="https://github.com/CompPhysics/AdvancedMachineLearning" target="_blank"><tt>https://github.com/CompPhysics/AdvancedMachineLearning</tt></a></li>
 </ol>
 </section>
@@ -223,7 +223,7 @@ <h2 id="practicalities">Practicalities </h2>
 <h2 id="deep-learning-methods-covered-tentative">Deep learning methods covered, tentative </h2>
 
 <ol>
-<p><li> <b>Deep learning, classics</b>
+<p><li> <b>Deep learning</b>
 <ol type="a"></li>
 <p><li> Feed forward neural networks and its mathematics (NNs)</li>
 <p><li> Convolutional neural networks (CNNs)</li>
@@ -244,7 +244,7 @@ <h2 id="deep-learning-methods-covered-tentative">Deep learning methods covered,
 <p><li> Autoregressive methods (tentative)</li>
 </ol>
 <p>
-<p><li> <b>Physical Sciences (often just called Physics informed) informed machine learning</b></li>
+<p><li> <b>Physical Sciences informed machine learning (often just called Physics informed neural networks, PINNs)</b></li>
 </ol>
 </section>
 
@@ -258,28 +258,11 @@ <h2 id="additional-topics-kernel-regression-gaussian-processes-and-bayesian-stat
 variable).
 </p>
 
-<p>We have not made plans for Reinforcement learning, but this can be another option.</p>
+<p>We have not made plans for Reinforcement learning.</p>
 </section>
 
 <section>
-<h2 id="good-books-with-hands-on-material-and-codes">Good books with hands-on material and codes </h2>
-<div class="alert alert-block alert-block alert-text-normal">
-<b></b>
-<p>
-<ul>
-<p><li> <a href="https://sebastianraschka.com/blog/2022/ml-pytorch-book.html" target="_blank">Sebastian Rashcka et al, Machine learning with Sickit-Learn and PyTorch</a></li>
-<p><li> <a href="https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html" target="_blank">David Foster, Generative Deep Learning with TensorFlow</a></li>
-<p><li> <a href="https://github.com/PacktPublishing/Hands-On-Generative-AI-with-Python-and-TensorFlow-2" target="_blank">Bali and Gavras, Generative AI with Python and TensorFlow 2</a></li>
-</ul>
-</div>
-
-<p>All three books have GitHub addresses from where one can download all codes. We will borrow most of the material from these three texts as well as
-from Goodfellow, Bengio and Courville's text <a href="https://www.deeplearningbook.org/" target="_blank">Deep Learning</a>
-</p>
-</section>
-
-<section>
-<h2 id="project-paths">Project paths </h2>
+<h2 id="project-paths-overarching-view">Project paths, overarching view </h2>
 
 <p>The course can also be used as a self-study course and besides the
 lectures, many of you may wish to independently work on your own
@@ -296,6 +279,101 @@ <h2 id="project-paths">Project paths </h2>
 </ol>
 </section>
 
+<section>
+<h2 id="possible-paths-for-the-projects">Possible paths for the projects </h2>
+
+<p>The differential equation path: Here we propose a set of differential
+equations (ordinary and/or partial) to be solved first using neural
+networks (using either your own code or TensorFlow/PyTorch or similar
+libraries). Thereafter we can extend the set of methods for
+solving these equations to recurrent neural networks and autoencoders
+(AE) and/or Generative Adversarial Networks (GANs). All these
+approaches can be expanded into one large project. This project can
+also be extended to include <a href="https://github.com/maziarraissi/PINNs" target="_blank">Physics informed machine
+learning</a>. Here we can discuss
+neural networks that are trained to solve supervised learning tasks
+while respecting any given law of physics described by general
+nonlinear partial differential equations.
+</p>
+
+<p>For those interested in mathematical aspects of deep learning, such aspects could also be included in the project.</p>
+</section>
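The differential equation path added above can be illustrated with a minimal, hypothetical NumPy sketch (not course code): a tiny network defines a trial solution u(x) = 1 + x·N(x), which satisfies u(0) = 1 by construction, and its parameters are tuned to minimize the squared physics residual of du/dx = -u. Network size, learning rate, and the finite-difference gradients are all illustrative choices; in practice one would use TensorFlow or PyTorch autograd.

```python
import numpy as np

rng = np.random.default_rng(0)
H = 8  # hidden width of the tiny network N(x)

def unpack(p):
    return p[:H], p[H:2 * H], p[2 * H:3 * H], p[3 * H]

def net(p, x):
    w1, b1, w2, b2 = unpack(p)
    h = np.tanh(np.outer(x, w1) + b1)   # (n, H) hidden activations
    return h @ w2 + b2                  # (n,) network output N(x)

def u(p, x):
    # Trial solution u(x) = 1 + x*N(x) enforces the initial condition u(0) = 1.
    return 1.0 + x * net(p, x)

def loss(p, x, eps=1e-4):
    # Squared physics residual of du/dx = -u, via central differences.
    du = (u(p, x + eps) - u(p, x - eps)) / (2.0 * eps)
    return np.mean((du + u(p, x)) ** 2)

x = np.linspace(0.0, 1.0, 32)            # collocation points
p = 0.1 * rng.standard_normal(3 * H + 1)
loss_start = loss(p, x)
lr, delta = 0.05, 1e-5
for _ in range(200):                     # crude finite-difference gradient descent
    base = loss(p, x)
    g = np.empty_like(p)
    for i in range(p.size):
        q = p.copy()
        q[i] += delta
        g[i] = (loss(q, x) - base) / delta
    p -= lr * g
print(f"physics loss: {loss_start:.3f} -> {loss(p, x):.3f}")
```

The exact solution is exp(-x); driving the residual loss toward zero drives u toward it without ever using labeled solution data, which is the core PINN idea.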
+
+<section>
+<h2 id="the-generative-models">The generative models </h2>
+
+<p>This path brings us from discriminative models (like the standard application of NNs, CNNs, etc.) to generative models. It comprises two projects that largely follow
+the lectures. Topics for data sets will be discussed.
+</p>
+</section>
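The distinction drawn in the generative-models section can be shown with the simplest possible generative model (a hypothetical NumPy illustration, not course material): instead of mapping inputs to labels, we fit a density p(x) to data and then draw new samples from it. The numbers below are invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(42)

# "Training data": samples from an unknown process (here secretly N(3, 0.5^2)).
data = rng.normal(loc=3.0, scale=0.5, size=5000)

# A discriminative model would learn p(label | x); a generative model learns
# p(x) itself. Simplest case: fit a single Gaussian by maximum likelihood.
mu_hat = data.mean()
sigma_hat = data.std()

# Having an explicit p(x), we can generate new, previously unseen samples.
new_samples = rng.normal(loc=mu_hat, scale=sigma_hat, size=1000)

print(f"fitted mu={mu_hat:.2f}, sigma={sigma_hat:.2f}")
```

VAEs, GANs, and diffusion models covered later in the course replace the single Gaussian with far more expressive parameterizations, but the goal, learning a distribution you can sample from, is the same.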
+
+<section>
+<h2 id="paths-for-projects-writing-own-codes">Paths for projects, writing own codes </h2>
+
+<p>The computational path: Here we propose a path where you develop your
+own code for a convolutional or eventually a recurrent neural network
+and apply this to data sets of your own selection. The code should
+be object oriented and flexible, allowing for eventual extensions by
+including different loss/cost functions and other
+functionalities. Feel free to select data sets from those suggested
+below. This code can also be extended by adding, for example,
+autoencoders. You can compare your own codes with implementations
+using TensorFlow(Keras)/PyTorch or other libraries.
+</p>
+</section>
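The object-oriented, extensible design asked for in the computational path can be sketched as follows (a hypothetical skeleton, with invented class names): cost functions sit behind a common interface, so a new loss plugs in without touching the layers or the training loop.

```python
import numpy as np
from abc import ABC, abstractmethod

class Loss(ABC):
    """Interface: new cost functions plug in without touching the network."""
    @abstractmethod
    def value(self, y_pred, y_true): ...
    @abstractmethod
    def grad(self, y_pred, y_true): ...

class MSE(Loss):
    def value(self, y_pred, y_true):
        return float(np.mean((y_pred - y_true) ** 2))
    def grad(self, y_pred, y_true):
        return 2.0 * (y_pred - y_true) / y_pred.size

class Dense:
    """Single fully connected layer trained by plain gradient descent."""
    def __init__(self, n_in, n_out, rng):
        self.W = 0.1 * rng.standard_normal((n_in, n_out))
        self.b = np.zeros(n_out)
    def forward(self, x):
        self.x = x
        return x @ self.W + self.b
    def backward(self, g, lr):
        gx = g @ self.W.T               # input gradient, using pre-update W
        self.W -= lr * self.x.T @ g
        self.b -= lr * g.sum(axis=0)
        return gx

class Network:
    def __init__(self, layers, loss: Loss):
        self.layers, self.loss = layers, loss
    def predict(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x
    def train_step(self, x, y, lr=0.1):
        y_pred = self.predict(x)
        g = self.loss.grad(y_pred, y)
        for layer in reversed(self.layers):
            g = layer.backward(g, lr)
        return self.loss.value(y_pred, y)

# Toy usage: linear regression with a pluggable MSE loss.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))
y = X @ np.array([[1.0], [-2.0], [0.5]]) + 0.3

model = Network([Dense(3, 1, rng)], loss=MSE())
history = [model.train_step(X, y) for _ in range(100)]
print(f"loss: {history[0]:.3f} -> {history[-1]:.4f}")
```

Swapping in a cross-entropy loss, or a convolutional layer with the same forward/backward contract, requires no change to `Network`, which is the flexibility the project description calls for.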
+
+<section>
+<h2 id="the-application-path-own-data">The application path/own data </h2>
+
+<p>The application path: Here you can use the most relevant method(s)
+(say convolutional neural networks for images) and apply it (or them)
+to data sets relevant for your own research.
+</p>
+</section>
+
+<section>
+<h2 id="gaussian-processes-and-bayesian-analysis">Gaussian processes and Bayesian analysis </h2>
+
+<p>The Gaussian processes/Bayesian statistics path: <a href="https://jenfb.github.io/bkmr/overview.html" target="_blank">Kernel regression
+(Gaussian processes) and Bayesian
+statistics</a> are popular
+tools in the machine learning literature. The main idea behind these
+approaches is to flexibly model the relationship between a large
+number of variables and a particular outcome (dependent
+variable). This can form a second part of a project where, for example,
+standard kernel regression methods are used on a specific data
+set. Alternatively, participants can opt to work on a large project
+relevant for their own research using Gaussian processes and/or
+Bayesian machine learning.
+</p>
+</section>
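A minimal NumPy sketch of the kernel-regression idea behind this path (hypothetical example data; a real project would use, for example, scikit-learn's Gaussian-process tools): with an RBF kernel, the GP posterior mean at test points is k(X*, X) [K + sigma^2 I]^{-1} y.

```python
import numpy as np

def rbf(A, B, length=1.0):
    """Squared-exponential (RBF) kernel matrix between 1-D point sets A and B."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-0.5 * d2 / length**2)

# Noisy observations of a smooth function (here sin, as a stand-in data set).
rng = np.random.default_rng(7)
X = np.linspace(0.0, 5.0, 20)
y = np.sin(X) + 0.05 * rng.standard_normal(X.size)

# GP posterior mean: k(X*, X) [K + sigma^2 I]^{-1} y
sigma2 = 0.05**2
K = rbf(X, X) + sigma2 * np.eye(X.size)
alpha = np.linalg.solve(K, y)

X_test = np.linspace(0.0, 5.0, 101)
mean = rbf(X_test, X) @ alpha

print(f"max |posterior mean - sin| on test grid: "
      f"{np.max(np.abs(mean - np.sin(X_test))):.3f}")
```

The same linear-algebra core extends to posterior variances (uncertainty quantification), which is the Bayesian selling point of this project path compared with a plain neural-network fit.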
+
+<section>
+<h2 id="hpc-path">HPC path </h2>
+
+<p>Another alternative is to study high-performance computing aspects in
+designing ML codes. This can also be linked with an exploration of
+mathematical aspects of deep learning methods.
+</p>
+</section>
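One concrete micro-experiment in the spirit of the HPC path (an illustrative sketch, not a prescribed exercise): comparing an interpreted Python loop with a vectorized NumPy call shows the kind of throughput gap that motivates studying memory layout, vectorization, and parallelism in ML codes.

```python
import time
import numpy as np

x = np.linspace(0.0, 1.0, 1_000_000)

# Pure-Python loop: one interpreter round-trip per element.
t0 = time.perf_counter()
s_loop = 0.0
for v in x:
    s_loop += v * v
t_loop = time.perf_counter() - t0

# Vectorized: a single call into optimized, cache-friendly compiled code.
t0 = time.perf_counter()
s_vec = float(np.dot(x, x))
t_vec = time.perf_counter() - t0

print(f"loop {t_loop:.3f}s vs vectorized {t_vec:.5f}s "
      f"(same sum, ~{t_loop / t_vec:.0f}x speedup)")
```

A project on this path would push further: BLAS threading, GPU kernels, mixed precision, and how these choices interact with the mathematics of the training algorithms.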
+
+<section>
+<h2 id="good-books-with-hands-on-material-and-codes">Good books with hands-on material and codes </h2>
+<div class="alert alert-block alert-block alert-text-normal">
+<b></b>
+<p>
+<ul>
+<p><li> <a href="https://sebastianraschka.com/blog/2022/ml-pytorch-book.html" target="_blank">Sebastian Raschka et al, Machine Learning with Scikit-Learn and PyTorch</a></li>
+<p><li> <a href="https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html" target="_blank">David Foster, Generative Deep Learning with TensorFlow</a></li>
+<p><li> <a href="https://github.com/PacktPublishing/Hands-On-Generative-AI-with-Python-and-TensorFlow-2" target="_blank">Bali and Gavras, Generative AI with Python and TensorFlow 2</a></li>
+</ul>
+</div>
+
+<p>All three books have GitHub repositories from which one can download all codes. We will borrow most of the material from these three texts as well as
+from Goodfellow, Bengio and Courville's text <a href="https://www.deeplearningbook.org/" target="_blank">Deep Learning</a>.
+</p>
+</section>
+
 <section>
 <h2 id="types-of-machine-learning">Types of machine learning </h2>
 
@@ -363,7 +441,7 @@ <h2 id="what-is-generative-modeling">What Is Generative Modeling? </h2>
 </section>
 
 <section>
-<h2 id="example-of-generative-modeling-taken-from-generative-deeep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html">Example of generative modeling, <a href="https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html" target="_blank">taken from Generative Deeep Learning by David Foster</a> </h2>
+<h2 id="example-of-generative-modeling-taken-from-generative-deep-learning-by-david-foster-https-www-oreilly-com-library-view-generative-deep-learning-9781098134174-ch01-html">Example of generative modeling, <a href="https://www.oreilly.com/library/view/generative-deep-learning/9781098134174/ch01.html" target="_blank">taken from Generative Deep Learning by David Foster</a> </h2>
 
 <br/><br/>
 <center>
@@ -442,85 +520,7 @@ <h2 id="taxonomy-of-generative-deep-learning-taken-from-generative-deep-learning
 </section>
 
 <section>
-<h2 id="possible-paths-for-the-projects">Possible paths for the projects </h2>
-
-<p>The differential equation path: Here we propose a set of differential
-equations (ordinary and/or partial) to be solved first using neural
-networks (using either your own code or TensorFlow/Pytorch or similar
-libraries). Thereafter we can extend the set of methods for
-solving these equations to recurrent neural networks and autoencoders
-(AE) and/or Generalized Adversarial Networks (GANs). All these
-approaches can be expanded into one large project. This project can
-also be extended into including <a href="https://github.com/maziarraissi/PINNs" target="_blank">Physics informed machine
-learning</a>. Here we can discuss
-neural networks that are trained to solve supervised learning tasks
-while respecting any given law of physics described by general
-nonlinear partial differential equations.
-</p>
-
-<p>For those interested in mathematical aspects of deep learning, this could also be included.</p>
-</section>
-
-<section>
-<h2 id="the-generative-models">The generative models </h2>
-
-<p>This path brings us from discriminative models (like the standard application of NNs, CNNs etc) to generative models. Two projects that follow to a large extent
-the lectures. Topics for data sets will be discussed during the lab sessions.
-</p>
-</section>
-
-<section>
-<h2 id="paths-for-projects-writing-own-codes">Paths for projects, writing own codes </h2>
-
-<p>The computational path: Here we propose a path where you develop your
-own code for a convolutional or eventually recurrent neural network
-and apply this to data selects of your own selection. The code should
-be object oriented and flexible allowing for eventual extensions by
-including different Loss/Cost functions and other
-functionalities. Feel free to select data sets from those suggested
-below here. This code can also be extended upon by adding for example
-autoencoders. You can compare your own codes with implementations
-using TensorFlow(Keras)/PyTorch or other libraries.
-</p>
-</section>
-
-<section>
-<h2 id="the-application-path">The application path </h2>
-
-<p>The application path: Here you can use the most relevant method(s)
-(say convolutional neural networks for images) and apply this(these)
-to data sets relevant for your own research.
-</p>
-</section>
-
-<section>
-<h2 id="gaussian-processes-and-bayesian-analysis">Gaussian processes and Bayesian analysis </h2>
-
-<p>The Gaussian processes/Bayesian statistics path: <a href="https://jenfb.github.io/bkmr/overview.html" target="_blank">Kernel regression
-(Gaussian processes) and Bayesian
-statistics</a> are popular
-tools in the machine learning literature. The main idea behind these
-approaches is to flexibly model the relationship between a large
-number of variables and a particular outcome (dependent
-variable). This can form a second part of a project where for example
-standard Kernel regression methods are used on a specific data
-set. Alternatively, participants can opt to work on a large project
-relevant for their own research using gaussian processes and/or
-Bayesian machine Learning.
-</p>
-</section>
-
-<section>
-<h2 id="hpc-path">HPC path </h2>
-
-<p>Another alternative is to study high-performance computing aspects in
-designing ML codes. This can also be linked with an exploration of
-mathematical aspects of deep learning methods.
-</p>
-</section>
-
-<section>
-<h2 id="what-are-the-basic-machine-learning-ingredients">What are the basic Machine Learning ingredients? </h2>
+<h2 id="reminder-on-the-basic-machine-learning-ingredients">Reminder on the basic Machine Learning ingredients </h2>
 <div class="alert alert-block alert-block alert-text-normal">
 <b></b>
 <p>
