Docs & Testing for SPAM

Rémi Cailletaud & Edward Andò

OSUG, Grenoble, France
Laboratoire 3SR, Grenoble, France

What is SPAM?

  1. First of all, a “canned pork meat product” (Wikipedia)
  2. Then a Monty Python sketch (available on Dailymotion)
  3. ...which was then used to describe junk email...
  4. Now a Python package:
    Software for the Practical Analysis of Materials

“Software for the Practical Analysis of Materials”

Core developers: EA, RC, Emmanuel Roubin, Olga Stamati
Library of material science/mechanics tools:
  • We work with 2D (photos) and 3D (tomography volumes) field measurements
  • Python (~11000 sloc) that sometimes calls C++ (~5000 sloc)
  • Only simple types are used (numpy arrays, numbers, strings)
  • Only standard inputs and outputs (TIFF image container, TSV, VTK out)
  • Nice documentation
  • Code respects some internal standards
  • More complex command-line scripts are also provided (example coming up)
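The “simple types, standard formats” convention above can be illustrated with plain numpy. This toy round-trip (an illustration of the idea, not spam code) writes a measurement field as TSV and reads it back:

```python
import io

import numpy

# A toy "measurement field": one row per point, columns z, y, x, displacement
field = numpy.array([[0.0, 0.0, 0.0, 1.5],
                     [0.0, 0.0, 1.0, 1.7]])

# Standard output: plain TSV with a header line (here into a buffer,
# but a filename works the same way)
buf = io.StringIO()
numpy.savetxt(buf, field, delimiter="\t", header="z\ty\tx\tdisp", comments="")

# Standard input: any tool (or person) can read it back
buf.seek(0)
back = numpy.loadtxt(buf, delimiter="\t", skiprows=1)
assert numpy.allclose(field, back)
```

Because everything is a numpy array or a text file, no tool in the chain needs to know about spam-specific objects.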

Quick example of SPAM

  1. I have a pair of 3D images (from x-ray tomography)
  2. There is a rigid-body displacement between these images, but also some strain
  3. We can use an image correlation script to first measure the rigid-body motion, then the deviation from it
  4. → Terminal

How and why?

  • Embarked on this adventure after having produced TomoWarp2 with Rémi, Stephen Hall and Erika Tudisco
  • Wanted more of a library of tools; TW2's monolithic architecture was too hard to develop in flexibly
  • → Total rewrite of “spam” from scratch (building on more recent theoretical results); we took the opportunity to adopt good habits:
    • Serious and automatic documentation
    • Testing of code
    • All within git (svn before) and Continuous Integration

Documentation

  • Sphinx builds the online documentation from three sources:
    1. Inspection of python source code “header”
    2. Hand-written documentation (RST format)
    3. Examples gallery (python + RST)
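A Sphinx setup combining these three sources typically needs only a few extensions in `conf.py`. This is a minimal sketch; the names and paths below are typical sphinx-gallery/autodoc usage, not spam's actual configuration:

```python
# conf.py (sketch; spam's real configuration may differ)
extensions = [
    'sphinx.ext.autodoc',          # 1. pull docstrings out of the python source
    'sphinx.ext.napoleon',         # parse numpy-style "Parameters / Returns" sections
    'sphinx.ext.mathjax',          # render :math: roles in the hand-written RST
    'sphinx_gallery.gen_gallery',  # 3. build the examples gallery from plot_*.py
]

sphinx_gallery_conf = {
    'examples_dirs': 'examples',       # where the plot_*.py scripts live
    'gallery_dirs':  'auto_examples',  # where the rendered gallery is written
}
```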

Documentation - from source code

Every user-exposed function has a header like this between triple quotes ("""):

def applyPhi(im, Phi=None, PhiPoint=None, interpolationOrder=1):
    """
        Deform a 3D image using a deformation function "Phi", applied using spam's C++ interpolator.
        Only interpolation order = 1 is implemented.

        Parameters
        ----------
        im : 3D numpy array
            3D numpy array of grey levels to be deformed

        Phi : 4x4 array, optional
            "Phi" deformation function.
            Highly recommended additional argument (why are you calling this function otherwise?)

        PhiPoint : 3x1 array of floats, optional
            Centre of application of Phi.
            Default = (numpy.array(im.shape)-1)/2.0
            i.e., the centre of the image

        interpolationOrder : int, optional
            Order of image interpolation to use; only order 1 (trilinear) is currently implemented.
            Default = 1

        Returns
        -------
        imDef : 3D array
            Deformed greyscales by Phi
    """

Documentation - hand-written


****************************
Tutorial: Image correlation
****************************

Following the previous :ref:`refreshmentsTutorial`, to describe a transformation in 3D we use a 4x4 matrix called the deformation function :math:`\Phi`.
:math:`\Phi` extends the transformation gradient tensor **F** (introduced in the previous tutorial) with the translation vector, which together with the rotations describes the rigid-body motion of the material:

.. math::
    \boldsymbol{\Phi} = \left[ \begin{array}{cccc}
                                F_{zz} & F_{zy} & F_{zx} & t_{z} \\
                                F_{yz} & F_{yy} & F_{yx} & t_{y} \\
                                F_{xz} & F_{xy} & F_{xx} & t_{x} \\
                                0 & 0 & 0 & 1
                            \end{array} \right ]

Coordinate system
==================

With 3D coordinates for a point:

.. math::
    p = (x, y, z)

we pad with a one and turn the coordinates into a column vector, to give:

.. math::
    p = \left[ \begin{array}{c}
                z \\
                y \\
                x \\
                1
        \end{array} \right ]

We can then transform the coordinates with the deformation function :math:`\Phi` as follows:

.. math::
    \Phi \cdot p = p'
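Concretely, with numpy and the zyx ordering used above (a toy illustration, not spam code):

```python
import numpy

# Pure translation: F = identity, (t_z, t_y, t_x) = (1.0, 2.0, 3.0)
Phi = numpy.eye(4)
Phi[0:3, -1] = [1.0, 2.0, 3.0]

# Point (z, y, x) = (5, 5, 5), padded with a one into a column vector
p = numpy.array([5.0, 5.0, 5.0, 1.0])

# Transform the point with Phi
pPrime = Phi @ p
print(pPrime)  # [6. 7. 8. 1.]
```

The trailing one is what lets a single matrix multiplication apply both **F** and the translation.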

Python-Sphinx Examples

A folder with examples that have graphical outputs (100% matplotlib for us):

$ ls -R examples/
./DIC:
plot_imageCorrelationBasics.py  plot_imageDisplacements.py  plot_multiModalRegistration.py  README.txt

./filters:
plot_distancefield.py  plot_morphoop.py  README.txt

./ignore:
plot_orientations.py  tetLabelTest.py

./label:
plot_labelToolkit01.py  plot_labelToolkit02.py  README.txt

./mesh:
fields.vtk  plot_structuredMesh.py  README.txt

./randomfields:
plot_covariance.py  plot_elkc_3D.py     plot_randomfields.py  spamPaper_00000.vtk
plot_elkc_1D.py     plot_excursions.py  README.txt            spamPaper_00001.vtk

plot_imageCorrelationBasics.py


"""
Image correlation basics
=========================

Here we synthetically apply a rigid-body transformation to an image
and try to measure the transformation using the `lucasKanade` image correlation
function
"""

######################
# Import modules
######################
from __future__ import print_function

import matplotlib.pyplot as plt
import tifffile
import numpy
import spam.DIC.deformationFunction as transf
import spam.DIC.correlate as corr
import spam.datasets
import scipy.ndimage
#############################################
# Load snow data and create a deformed image
#############################################

################################################
# Here we will load the data,
# Define a transformation operator and apply this to the data
# in order to obtain a deformed data set.
#
# We will then visualise the difference between the images
# -- as explained in the :ref:`imageCorrelationTutorial`

snow = spam.datasets.loadSnow()

# Define transformation to apply
transformation = {  't': [  0.0, 3.0, 2.5 ],
                    'r': [  5.0, 0.0, 0.0 ]  }

#transformation = {'t': [3.0, 2.0, 1.5]}
#transformation = {'r': [5.0, 0.0, 0.0]}

# Convert this into a Phi
Phi = transf.computePhi( transformation )
# Apply this to snow data
snowDeformed = transf.applyPhi( snow, Phi=Phi )

# Scale by half to speed up calculations
#snow = scipy.ndimage.zoom( snow, 0.5 )
#snowDeformed = scipy.ndimage.zoom( snowDeformed, 0.5 )

# Here we used the blue-white-red colourmap "coolwarm" which makes 0 white
#   on the condition of the colourmap being symmetric around zero, so we
#   force the values with vmin and vmax.
plt.figure()
plt.imshow( (snow - snowDeformed)[50], cmap='coolwarm', vmin=-36000, vmax=36000)
################################################
# Perform correlation
################################################
#

# Now we will use the lucasKanade image correlation function to try
# to measure the deformation function Phi between `snow` and `snowDeformed`.
corr.lucasKanade( snowDeformed, snow,
                  margin=10,
                  maxIterations = 50,
                  deltaPhiMin = 0.001,
                  verbose=True,                 # Show updates on every iteration
                  imShowProgress="Z",           # Show horizontal slice
                  imShowProgressNewFig=True )   # New figure at every iteration

plt.show()

Gallery of examples

Testing

What does “testing” mean?
  • ≠ Does it compile/run? This is “building the code”
  • ≠ Does each function run as expected?
  • = Does every line of every function run correctly?

We use unittest

tests folder

There is a special folder with tests:

          $ ls -R tests/
          tests:
          __init__.py                test_label.py
          test_contacts.py           test_loaddataset.py
          test_correlateGM.py        test_mesh.py
          test_covariance.py         test_movingFilters.py
          test_DVC.py                test_scripts.py
          test_excursions.py         test_tsvio.py
          test_globalDescriptor.py   test_vtkio.py
          test_imageManipulation.py

          

What do tests look like?


          import os
          import unittest

          import numpy
          import spam.DIC.deformationFunction as transf


          class TestFunctionDVC(unittest.TestCase):

              def tearDown(self):
                  try:
                      os.remove("spamPhiFieldCF-corrected-N12.tsv")
                      os.remove("spamPhiFieldCF-corrected-N12-filteredRad3.tsv")
                      os.remove("spamPhiFieldCF-ignoreBadPoints.tsv")
                      os.remove("spamPhiFieldCFDel.tsv")
                  except OSError:
                      pass

              def test_computePhi(self):
                  trans1 = {'t': [0.0, 3.0, 3.0]}
                  trans2 = {'r': [-5.0, 0.0, 0.0]}
                  trans3 = {'z': [2, 2, 2]}
                  trans4 = {'s': [0.9, 0.8, 0.7]}
                  Phi1 = transf.computePhi(trans1)
                  self.assertEqual(numpy.sum([Phi1[0, -1], Phi1[1, -1], Phi1[2, -1]]), 6)
                  Phi2 = transf.computePhi(trans2)
                  self.assertEqual(numpy.sum([Phi2[0, -1], Phi2[1, -1], Phi2[2, -1]]), 0)
                  Phi3 = transf.computePhi(trans2, PhiCentre=[50.0, 50.0, 50.0], PhiPoint=[50.0, 16.0, 84.0])
                  self.assertAlmostEqual(numpy.sum([Phi3[0, -1], Phi3[1, -1], Phi3[2, -1]]), 5.926, places=2)
                  Phi4 = transf.computePhi(trans3)
                  self.assertEqual([Phi4[0, 0], Phi4[1, 1], Phi4[2, 2]], [2., 2., 2.])
                  Phi5 = transf.computePhi(trans4)
                  # computePhi should produce symmetric shear components
                  self.assertEqual(Phi5[0, 1], Phi5[1, 0])
                  self.assertEqual(Phi5[0, 2], Phi5[2, 0])
                  self.assertEqual(Phi5[1, 2], Phi5[2, 1])
      

Testing - the concept of coverage

How many lines of the code are executed by the tests?
[Image: “100% coverage” joke, credit: @bloerwald]
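The distinction with the previous slide matters: a function can “run as expected” in a test while whole branches of it never execute. A toy illustration (not from spam):

```python
def clamp(x, lo, hi):
    """Restrict x to the interval [lo, hi]."""
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x

# This single test "passes", yet the two early-return branches never run,
# so line coverage is incomplete:
assert clamp(5, 0, 10) == 5

# Full coverage needs inputs that exercise every branch:
assert clamp(-1, 0, 10) == 0
assert clamp(99, 0, 10) == 10
```

A coverage tool reports exactly which of these lines the test suite reached.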


Packaging for pip

Gitlab CI of doc & testing


          image: remche/docker-ttk

          stages:
              - build
              - test
              - deploy
              - pages

          build:
            stage: build
            script:
              - pip install -r requirements.txt
              - python setup.py install

          test:
            stage: test
            script:
              - pip install -r requirements.txt
              - pip install -r requirements-dev.txt
              - python setup.py install
              - python setup.py test
              - coverage run -m unittest discover
              - coverage report

          deploy:
            stage: deploy
            image: remche/spam-manylinux
            only:
              - /^version-.*$/
            script:
              - ./build-wheels.sh
              - /opt/python/cp27-cp27mu/bin/twine upload /wheelhouse/spam-*-manylinux1_x86_64.whl

          pages:
            variables:
              GIT_SUBMODULE_STRATEGY: normal
            stage: pages
            script:
              - pip install -r requirements.txt
              - pip install -r requirements-dev.txt
              - python setup.py install
              - python setup.py build_sphinx
              - mkdir public
              - mv build/sphinx/html/* public
              - coverage run -m unittest discover
              - coverage html
              - mv coverage public
            artifacts:
              paths:
                - public
            only:
                - master
        

Gitlab CI of doc & testing

https://gricad-gitlab.univ-grenoble-alpes.fr/ttk/spam/pipelines

Conclusion

  1. First and foremost EA: Thank you RC!!
  2. Documentation is a great way to keep knowledge together
  3. A beautiful and automatic system keeps good practice alive
  4. Tests mean that code is professional
  5. Avoids a lot of silly mistakes
  6. Thanks to GRICAD, all of this is easily available!

Thanks!