
Terminology

This document collects important terminology on the topics listed below, related to my PhD project on the Application of Deep Learning to the Decomposition of 3D Shapes for Hexahedral Mesh Generation.

Machine Learning

Context-Free Grammars

A context-free grammar (CFG) is a set of recursive writing rules (or productions) used to generate patterns of strings.

A CFG consists of the following components:

  • A set of terminal symbols, which are the characters of the alphabet that appear in the strings generated by the grammar.
  • A set of nonterminal symbols, which are placeholders for patterns of terminal symbols that the grammar can generate.
  • A set of productions, which are rules for replacing (or rewriting) nonterminal symbols (on the left side of the production) in a string with other nonterminal or terminal symbols (on the right side of the production).
  • A start symbol, which is a special nonterminal symbol that appears in the initial string generated by the grammar.

To generate a string of terminal symbols from a CFG, we:

  • Begin with a string consisting of the start symbol.
  • Apply one of the productions with the start symbol on the left-hand side, replacing the start symbol with the right-hand side of the production.
  • Repeat the process of selecting nonterminal symbols in the string and replacing them with the right-hand side of some corresponding production, until all nonterminals have been replaced by terminal symbols.
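The derivation steps above can be sketched in a few lines of Python. The grammar below (start symbol `S`, generating balanced strings such as `ab` or `aabb`) is an illustrative assumption, not one from the project; nonterminals are the dict keys and everything else is a terminal.

```python
import random

# Toy CFG: each nonterminal maps to a list of productions (illustrative only).
# "S" -> "a S b" | "a b", so the grammar generates "ab", "aabb", "aaabbb", ...
GRAMMAR = {
    "S": [["a", "S", "b"], ["a", "b"]],
}

def generate(symbol="S", rng=random):
    """Rewrite nonterminals until only terminal symbols remain."""
    if symbol not in GRAMMAR:                 # terminal: emit as-is
        return symbol
    production = rng.choice(GRAMMAR[symbol])  # pick one production for this nonterminal
    return "".join(generate(s, rng) for s in production)
```

Calling `generate()` starts from the start symbol and repeatedly applies productions, exactly as in the bullet points above; every returned string has matching counts of `a` and `b`.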

Sources

Computer Aided Design and Computer Aided Engineering

Geometric Idealisation

Geometric Clean Up

CAD & CAE Integration

Programming

Python


PyTorch

PyTorch differs from frameworks such as TensorFlow (1.x) and Caffe in the way it builds its neural networks. Those frameworks use a static method, where one has to define a network's computation graph up front and reuse the same structure again and again. This is called Define and Run, which means that if you want to change how the network behaves you must rebuild the graph from scratch.

PyTorch instead uses reverse-mode automatic differentiation over a dynamic graph: the graph is built on the fly as operations execute, which allows you to change the way your network behaves arbitrarily with little overhead. This Define by Run capability allows for more flexibility and is one reason PyTorch has become dominant in research applications.

Tensor Functions

torch.clamp(input, min, max, out=None) → Tensor
Clamps all elements in input into the range [min, max] and returns a resulting tensor.
  
  >>> a = torch.randn(4)
  >>> a
  tensor([-1.7120,  0.1734, -0.0478, -0.0922])
  >>> torch.clamp(a, min=-0.5, max=0.5)
  tensor([-0.5000,  0.1734, -0.0478, -0.0922])
  
  
view(*shape) → Tensor
Returns a tensor with the same data as the self tensor but of a different shape.
  
  >>> x = torch.randn(4, 4)
  >>> x.size()
  torch.Size([4, 4])
  >>> y = x.view(16)
  >>> y.size()
  torch.Size([16])
  >>> z = x.view(-1, 8)  # the size -1 is inferred from other dimensions
  >>> z.size()
  torch.Size([2, 8])

NN Functions

loss.backward()
Computes dloss/dx for every parameter x which has requires_grad=True.

These are accumulated into x.grad for every parameter x.

x.grad += dloss/dx
optimizer.step updates the value of x using the gradient x.grad. For example, the SGD optimizer performs:

x += -lr * x.grad
optimizer.zero_grad() clears x.grad for every parameter x in the optimizer. It’s important to call this before loss.backward(), otherwise you’ll accumulate the gradients from multiple passes.

If you have multiple losses (loss1, loss2) you can sum them and then call backward() once:

    loss3 = loss1 + loss2
    loss3.backward()
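The accumulate → step → clear cycle described above can be made concrete with a plain-Python sketch. `Param`, `backward`, `sgd_step` and `zero_grad` here are hypothetical stand-ins for PyTorch's autograd and optimizer machinery, shown only to illustrate the semantics, not the real API.

```python
class Param:
    """Stand-in for a tensor with requires_grad=True (illustrative only)."""
    def __init__(self, value):
        self.value = value
        self.grad = 0.0            # gradients accumulate here, like x.grad

def backward(params, grads):
    # like loss.backward(): accumulates dloss/dx into x.grad
    for p, g in zip(params, grads):
        p.grad += g

def sgd_step(params, lr):
    # like optimizer.step() for SGD: x += -lr * x.grad
    for p in params:
        p.value += -lr * p.grad

def zero_grad(params):
    # like optimizer.zero_grad(): clears x.grad
    for p in params:
        p.grad = 0.0

x = Param(1.0)
backward([x], [0.5])    # first backward pass
backward([x], [0.5])    # second pass WITHOUT zero_grad: grads accumulate to 1.0
sgd_step([x], lr=0.1)   # x moves by -lr * accumulated gradient
zero_grad([x])          # clear before the next backward pass
```

Note how the second `backward` call doubles `x.grad`; this is exactly the accumulation that makes calling `optimizer.zero_grad()` before each `loss.backward()` important.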

Mathematics

Metrics

Manhattan distance

  • L1 distance
  • The distance between two points measured along axes at right angles.
  • In a plane with p1 at (x1, y1) and p2 at (x2, y2) it is |x1 - x2| + |y1 - y2|.

Euclidean distance

  • L2 distance
  • The straight line distance between two points.
  • In a plane with p1 at (x1, y1) and p2 at (x2, y2) it is sqrt((x1 - x2)^2 + (y1 - y2)^2).

Sources

Sets

Permutations

  • An ordering of the elements of a set, where the order of the elements matters.
  • For example, your locker unlocks with one specific permutation of 2, 3, 4 and 5 (e.g. 2453).

Combinations

  • A selection of elements from a set, without regard for the order of the elements.
  • For example, for your locker to truly open "by combination", it would have to unlock with any ordering of 2, 3, 4 and 5.
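The distinction is easy to check with the standard library's itertools, using the locker digits from the examples above:

```python
from itertools import combinations, permutations

digits = [2, 3, 4, 5]

# Permutations: order matters, so there are 4! = 24 orderings of the digits.
perms = list(permutations(digits))

# Combinations: order is ignored, so there is only one way to choose all 4.
combs = list(combinations(digits, 4))
```

`perms` contains all 24 orderings such as `(2, 4, 5, 3)`, while `combs` contains the single tuple `(2, 3, 4, 5)`.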

Sources