1. Afternoon
- Worked with Jon to port his old tahirTrackViewer.cpp into the new cvg2; it works.
- Be aware of the origin convention when using Calibration-related functions
2. Morning
- Trajectories with 3D coords done, sent to Luis
- Silly mistake in the calibration config file; wasted 2 hours in total…
3. Points worth noting this week
1. SDS tracker
2. Ported the MTT stuff into cvg2
Basically it’s a point tracker
3. Hungarian Algorithm
- Also called Munkres’ Assignment Algorithm
- There appear to be two different solutions…
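SciPy ships an implementation as scipy.optimize.linear_sum_assignment (SciPy 0.17+); a minimal sketch on a toy cost matrix (reading rows as tracks and columns as detections is just illustrative framing for the MTT context):
import numpy as np
from scipy.optimize import linear_sum_assignment

# cost[i, j]: cost of assigning track i to detection j (toy numbers)
cost = np.array([[4, 1, 3],
                 [2, 0, 5],
                 [3, 2, 2]])
row_ind, col_ind = linear_sum_assignment(cost)
print(list(zip(row_ind, col_ind)))   # [(0, 1), (1, 0), (2, 2)]
print(cost[row_ind, col_ind].sum())  # minimal total cost: 5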
4. JPDAF
5. New ColorMap in Matplotlib
- A Better Default Colormap for Matplotlib
- How We Designed Matplotlib’s New Default Colormap (and You Can Too)
- Perceptual Color Maps in matplotlib for Oceanography
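These cover viridis, which shipped with matplotlib 1.5 and later became the default; a minimal sketch of selecting it explicitly (the 2-D field is made up just to exercise the colormap):
import numpy as np
import matplotlib.pyplot as plt

# arbitrary smooth 2-D field, just something to color
x, y = np.meshgrid(np.linspace(-3, 3, 200), np.linspace(-3, 3, 200))
z = np.exp(-(x**2 + y**2))

plt.imshow(z, cmap='viridis')  # perceptually uniform, unlike the old 'jet'
plt.colorbar()
plt.show()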
6. Theano
Still don't quite get the idea that all algorithms are defined symbolically; it's more like writing out math than writing code.
All the examples (entry-level, of course) I've come across so far,
- Theano at a Glance
- Theano Tutorial
- Baby Steps - Algebra
They all use the same letter/word for the ‘symbol’ and the variable, like:
>>> import theano.tensor as T
>>> from theano import function
>>> x = T.dscalar('x')
>>> y = T.dscalar('y')
>>> z = x + y
>>> f = function([x, y], z)
or,
import theano
import theano.tensor as T

# The theano.tensor submodule has various primitive symbolic variable types.
# Here, we're defining a scalar (0-d) variable.
# The argument gives the variable its name.
foo = T.scalar('foo')
# Now, we can define another variable bar which is just foo squared.
bar = foo**2
# It will also be a theano variable.
print(type(bar))
print(bar.type)
# Using theano's pp (pretty print) function, we see that
# bar is defined symbolically as the square of foo
print(theano.pp(bar))
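Either way, nothing is computed until the compiled function is called; a quick check, assuming both snippets above ran in one session (f from the first, foo and bar from the second; the first output is what the Theano tutorial shows):
>>> f(2, 3)                    # evaluates z = x + y numerically
array(5.0)
>>> g = function([foo], bar)   # compile foo**2 the same way
>>> g(3.0)
array(9.0)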
4. Highlights from last week
1. presentation on GMM
2. scikit-learn
- Gaussian mixture models
- Density Estimation for a mixture of Gaussians
- Gaussian Mixture Model Ellipsoids
- Gaussian Mixture Model Selection
- GMM classification
- 1D Gaussian Mixture Example
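The M_best used in the PDF snippet below is a fitted mixture chosen by model selection, as in the "Gaussian Mixture Model Selection" example; a minimal sketch with the old-style sklearn GMM API used elsewhere in these notes (the 1-D data is made up):
import numpy as np
from sklearn.mixture import GMM  # old-style API, as in the snippets below

# made-up 1-D sample: two Gaussian blobs
X = np.concatenate([np.random.normal(-2, 1.0, 300),
                    np.random.normal(2, 0.5, 200)])[:, np.newaxis]

N = np.arange(1, 8)
models = [GMM(n_components=n).fit(X) for n in N]
M_best = models[np.argmin([m.bic(X) for m in models])]  # lowest BIC wins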
How to draw the PDF:
import numpy as np

# M_best: the BIC-selected mixture from the sketch above
x = np.linspace(-6, 6, 1000)
logprob, responsibilities = M_best.eval(x)  # old GMM API: log density + responsibilities per sample
pdf = np.exp(logprob)                                   # total mixture density
pdf_individual = responsibilities * pdf[:, np.newaxis]  # per-component densities
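To actually draw it, plotting the total density plus the per-component pieces looks roughly like this (a sketch assuming x, pdf, pdf_individual from the block above):
import matplotlib.pyplot as plt

plt.plot(x, pdf, '-k', label='mixture')  # total density
plt.plot(x, pdf_individual, '--k')       # one dashed curve per component
plt.xlabel('x')
plt.ylabel('p(x)')
plt.legend()
plt.show()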
GMM and score_samples(X) back to probabilities
The probability density can be greater than 1. The only normalization criterion is that it integrates to 1.
Take a simple 1D example where the probability is zero except in the range (0, 0.1). Then the probability density must have an average value of 10 in that region for the normalization criterion to be met!
import numpy as np
from sklearn.mixture import GMM  # old-style API

# 100 random points in the unit cube
data = [[np.random.rand(), np.random.rand(), np.random.rand()]
        for _ in range(100)]
model = GMM(n_components=1).fit(data)

# the density at the sample points can exceed 1...
logprob, _ = model.score_samples(data)
print(np.max(np.exp(logprob)))
# 2.57627579814

# ...but it still integrates to ~1 over a grid covering the data
grid = np.linspace(-0.5, 1.5, 100)
x, y, z = np.meshgrid(grid, grid, grid)
X = np.vstack([x.ravel(), y.ravel(), z.ravel()]).T
logprob, _ = model.score_samples(X)
print(np.max(np.exp(logprob)))
# 2.65717824707
print(np.exp(logprob).sum() * (grid[1] - grid[0]) ** 3)
# 0.998503826652
3. matplotlib
Pyplot tutorial
- Working with multiple figures and axes
import matplotlib.pyplot as plt

plt.figure(1)                 # the first figure
plt.subplot(211)              # the first subplot in the first figure
plt.plot([1, 2, 3])
plt.subplot(212)              # the second subplot in the first figure
plt.plot([4, 5, 6])

plt.figure(2)                 # a second figure
plt.plot([4, 5, 6])           # creates a subplot(111) by default

plt.figure(1)                 # figure 1 current; subplot(212) still current
plt.subplot(211)              # make subplot(211) in figure 1 current
plt.title('Easy as 1,2,3')    # subplot 211 title
plt.show()                    # display both figures