Efficiency comparison of different thresholding algorithms & techniques for analysis of nanomaterials in TEM micrographs
Master thesis in Computer Science
Introduction
A summary of my master thesis in computer science.
For a better understanding, please read the full 100+ page thesis published at the University of Mons (Belgium), watch the Prezi presentation at the end of this post, or read the following summary.
Goal
- Test the efficiency of thresholding algorithms on TEM nanomaterial micrographs
- Test the effect of preprocessing on thresholding algorithm efficiency
- Test the effect of magnification on thresholding algorithm efficiency
How?
- By evaluating the precision and the introduced variation (a sketch of the computation follows this list)
- Repeatability uncertainty (within-day uncertainty)
- Intermediate precision (between-day uncertainty)
- Calibration uncertainty
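To make the first two notions concrete, here is a minimal Python sketch of how repeatability (within-day) and between-day spread might be estimated from replicate size measurements. The measurement values, day count and repeat count are invented for the example; the thesis itself defines the exact statistical protocol.

```python
import numpy as np

# Illustrative replicate size measurements (nm): rows = days, cols = repeats.
# All values are made up for the example.
sizes = np.array([
    [24.1, 24.3, 23.9],   # day 1
    [24.8, 25.0, 24.7],   # day 2
    [23.7, 23.9, 24.0],   # day 3
])

day_means = sizes.mean(axis=1)

# Repeatability: pooled within-day standard deviation
s_within = np.sqrt(((sizes - day_means[:, None]) ** 2).sum()
                   / (sizes.size - sizes.shape[0]))

# Intermediate precision: spread of the daily means (between-day component)
s_between = day_means.std(ddof=1)

print(f"repeatability ~ {s_within:.2f} nm, between-day ~ {s_between:.2f} nm")
```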
Materials & Methods
Which nanomaterials are used?
What does certified reference mean?
Certified Reference Materials (CRMs) are ‘controls’ or standards used to check the quality and metrological traceability of products, to validate analytical measurement methods, or for the calibration of instruments. A certified reference material is a particular form of measurement standard (Wikipedia link).
In our case, the reference measurement is the modal or median size value computed by 10 or more different laboratories under the same circumstances & using the same settings for a given nanomaterial.
A video of the application in action:
Results
Magnification effect comparison between 18500x & 68000x on thresholding efficiency
Preprocessing effect comparison by applying an 8-pixel radius smoothing filter at 18500x & 68000x
Conclusion
- Thresholding algorithm efficiency is affected by magnification
- Thresholding algorithm efficiency is affected by preprocessing
Discussion
Accuracy & Precision
By combining and analysing all the data & results, we can conclude that some thresholding algorithms are efficient for the detection of certain materials, but none of the evaluated algorithms is suitable for all the chosen nanomaterials.
What affects algorithm performance ?
- Magnification
- Background noise (sensitivity of the algorithm to background noise)
- Preprocessing (such as noise removal filters)
- Minimal total number of detected/analyzed particles
- Material density
- Nanoparticle size
- Statistical representation of the results
Possibilities & limitations
A complete automatic quantitative particle-size measurement tool that can be applied to:
- Top down approach
- Bottom-up approach (with some modifications)
- TEM (Transmission Electron Microscopy)
- AFM (Atomic Force Microscopy)
- PTA (Particle Tracking Analysis)
- NTA (Nanoparticle Tracking Analysis)
- SEM (Scanning Electron Microscopy)
- The chosen material needs to be a reference (certified) material
Future
Identification of the steps that are crucial in TEM image analysis remains to be explored:
- Appropriate magnification
- Appropriate noise removal filter radius
- Minimal total number of detected/analyzed particles
- Evaluating on more complex materials
- Inclusion of separation filters
- Evaluating on lighter density materials
- Test different thresholding approaches (local thresholding, dynamic thresholding, etc.), as sketched below
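To illustrate the distinction named in the last item, here is a minimal OpenCV sketch in Python comparing a global (Otsu) threshold with a local adaptive one; the filename and parameter values are assumptions, not settings from the thesis.

```python
import cv2

img = cv2.imread("tem_micrograph.tif", cv2.IMREAD_GRAYSCALE)

# Global threshold: one cut-off for the whole image (Otsu's method)
_, global_mask = cv2.threshold(img, 0, 255,
                               cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Local (adaptive) threshold: the cut-off is computed per neighbourhood,
# which tolerates uneven illumination across the micrograph
local_mask = cv2.adaptiveThreshold(img, 255,
                                   cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                   cv2.THRESH_BINARY, 51, 2)
```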
Conclusion
- A complete automatic quantitative particle-size measurement tool
- Efficient tool to evaluate thresholding algorithm efficiency on nanoparticle TEM micrographs
- Evaluation of magnification & preprocessing effects
- Robust and stable tool with no added uncertainty
- Assured traceability
- Widely applicable approach
Prezi presentation
Automatic Virus-Like Particle (VLP) detection using OpenCV
Introduction
Various approaches have been developed using existing techniques to address the problem of automatic Virus-Like Particle (VLP) detection in digitized electron micrographs. Such methods were needed to reduce the work required of microscopists in detecting and analysing VLPs in digitized images.
Electron microscopy allows a direct demonstration of the nature of viruses as well as a description of their size and morphology, but virus recognition by visual examination of digitized electron microscope images is time consuming and requires trained and experienced specialists.
The objective of this research was to develop a reliable, effective and unique technique to reduce the workload of microscopists and to detect VLPs in digitized electron microscope images, using the OpenCV library.

Original sample micrograph of a negatively stained feline calicivirus VLP.
This sample image has different brightness/contrast in the lower-left corner and in the upper-right corner, which makes it very difficult to analyse.

A modified version of the original image.
Some filters were applied to enhance the image and bring out the particles before the detection step.

Canny edge and contour detection without any prior image enhancement.
Almost 0% false positives, but more than 50% false negatives.
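For reference, a bare Canny-plus-contours pass like the one captioned above takes only a few lines with OpenCV in Python; the filename and threshold values here are assumptions, not the settings used in the study.

```python
import cv2

# Illustrative input; the actual micrographs and parameters differ.
img = cv2.imread("vlp_micrograph.tif", cv2.IMREAD_GRAYSCALE)

# Canny edge detection straight on the raw image -- no enhancement,
# which explains the high false-negative rate on noisy backgrounds.
edges = cv2.Canny(img, 100, 200)
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
print(f"{len(contours)} contours found")
```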

This version of the algorithm proceeds as follows (a rough sketch appears after the list).
- Apply a mean filter
- Apply a Gaussian blur filter
- Apply an adaptive threshold swept from 0 to 255 and detect the contours on each run
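A rough Python/OpenCV sketch of this pipeline follows. Since the exact adaptive-threshold parameters are not given here, a plain global threshold swept over the 8-bit range stands in for the described sweep; the kernel sizes and area limits are also assumptions.

```python
import cv2

img = cv2.imread("vlp_micrograph.tif", cv2.IMREAD_GRAYSCALE)

# Mean filter followed by a Gaussian blur to suppress background noise
img = cv2.blur(img, (5, 5))
img = cv2.GaussianBlur(img, (5, 5), 0)

# Sweep the threshold over the full 8-bit range and collect the
# contours detected at each level
candidates = []
for t in range(0, 256, 8):          # a step of 8 keeps the sweep cheap
    _, mask = cv2.threshold(img, t, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Keep only roughly particle-sized blobs
    candidates += [c for c in contours if 50 < cv2.contourArea(c) < 5000]

print(f"{len(candidates)} candidate particles across all threshold levels")
```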
Results are promising: about 60% of the particles were detected with very few false positives.
This technique works fine on several different images, but was not able to operate properly on all of the samples chosen for this study.

In this version of the algorithm, almost 0% of the detections are false positives, but more than 50% of the particles are missed (false negatives) and only about 45% are positive hits.

This version of the algorithm is based on contour detection and gave the best results.
80% of the detected particles are positive hits but, again, it did not produce such good results on all of the samples.
Conclusion
Some of the techniques we developed produced great results on some micrographs but failed on other images due to very textured and noisy backgrounds.
The principal conclusion drawn from the results so far is that, in order to develop a single working method, the many variables that interact with and affect the characteristics of the digitized micrographs have to be reduced to a minimum.
- Samples have to be prepared following the exact same techniques and have to share the same characteristics
- The signal-to-noise ratio (SNR) has to be as high as possible
- Microscope settings must not be altered
- Micrographs have to be digitized at the same magnification
- Brightness should be homogeneously distributed over the image
- Room temperature, light intensity, artefacts, etc., also have to be controlled
A unique, reliable and automated detection algorithm can be developed if all of the digitized micrographs share the same characteristics, such as brightness, contrast, magnification, staining quality, presence/absence of artefacts, signal-to-noise ratio, etc.
Still, the techniques learned & developed during this research can easily be applied to nanoparticle detection.
TiaTag
Purpose
Information such as magnification, defocus, intensity, spot size, etc., which is important for image analysis and quality control, is usually stored in TIFF images in the form of tags.
This information is captured & saved by the “Tia ES-Vision” software (FEI, Eindhoven, The Netherlands), which controls the Transmission Electron Microscope (TEM) used in the Nanotechnology service.
The “Tia ES-Vision” software exports this information along with the image data in a “private” tag. Because the tag is private, only the “Tia ES-Vision” software is able to read the information.
The “TiaTag” module was developed for iTem and “Tia ES-Vision” using imagingC & C, in order to make this information accessible in the “iTem” software (Olympus, Münster, Germany), which is used for image analysis and data storage.
This module solves the incompatibility issues between “iTem” and “Tia ES-Vision” and allows the importation of the TIFF image tags into the “iTem” database while calibrating the image to the nm scale.
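As an aside, generic TIFF readers can at least extract the raw payload of a private tag once its numeric ID is known. Here is a minimal Python sketch using Pillow, with a placeholder tag ID; the actual ID written by “Tia ES-Vision” has to be looked up in the files it produces.

```python
from PIL import Image  # Pillow

# Placeholder: the real private tag ID used by "Tia ES-Vision"
# must be determined from the TIFF files it writes.
PRIVATE_TAG_ID = 34682

img = Image.open("tem_image.tif")
raw = img.tag_v2.get(PRIVATE_TAG_ID)   # tag_v2 maps TIFF tag IDs to values
if raw is not None:
    print("private tag payload:", raw)
else:
    print("tag not present in this file")
```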
Automated microscopy for TEM image acquisition
Purpose
“The first rule of any technology in a business is that automation applied to an efficient operation will magnify the efficiency.
The second is that automation applied to an inefficient operation will magnify the inefficiency.”
― Bill Gates
In order to increase the efficiency of our work, we have developed a method using Visual Basic and JavaScript to automate the following procedures at the Transmission Electron Microscope level (the loop is sketched after the list).
- Stage movement control by automatically moving the stage to predefined coordinates read from a file
- Autotuning & autofocusing
- Image acquisition
- Image export to 16-bit TIFF format
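Schematically, the automation boils down to the loop sketched below in Python. The microscope-control calls are hypothetical stubs standing in for the vendor's scripting interface; the real implementation drove the TEM through Visual Basic and JavaScript.

```python
import csv

# Hypothetical stand-ins for the vendor's scripting calls; the actual
# work drove the TEM through Visual Basic / JavaScript.
def move_stage(x_um: float, y_um: float): ...
def autotune_and_focus(): ...
def acquire_and_export(path: str): ...

# Read predefined stage coordinates (x, y) from a file and visit each site
with open("coordinates.csv", newline="") as f:
    for i, row in enumerate(csv.reader(f)):
        x, y = float(row[0]), float(row[1])
        move_stage(x, y)                         # stage movement control
        autotune_and_focus()                     # autotuning & autofocusing
        acquire_and_export(f"site_{i:03d}.tif")  # image acquisition & export
```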
Automatic Analysis & Report generator using SigmaPlot
Flexible options
The SigmaPlot histograms and reports automatically generated by “ReportGen” define a single model for our final reports.
Reports in general
A report is a piece of information describing, or an account of, certain events, given or presented to someone.
Reports are often used to display the result of an experiment, investigation, or inquiry. The audience may be public or private, an individual or the public in general. Reports are used in government, business, education, science, and other fields.
Reports often use persuasive elements, such as graphics, images, voice, or specialized vocabulary in order to persuade that specific audience to undertake an action.
EM department reports
Since the results of our analyses have to be presented to different audiences and written by different people, we have decided to give them the same look and feel.
For this purpose we have developed several templates and “applications” which help us generate reports while reducing the workload and maintaining a common signature.
Domain of application
The purpose of this development is to reduce the workload of users by automating calculations, repetitive tasks and presentation.
“ReportGen”
ReportGen was developed to satisfy the following conditions:
- Reports have to share the same presentation
- The 20 generated histograms have to share the same order of appearance
- Histograms are always distributed in the same manner on 4 different pages
- Each page defines a group of histograms sharing common attributes (Shape measurement, 2D measurement, Distance measurement, Mesh measurement)
- The experiment title is printed at the top of each page
- Each histogram has a title
- The report has to be generated automatically
Semi-Automatic nanoparticle detection to apply the EU definition using "Processing"
Background
“A natural, incidental or manufactured material containing particles, in an unbound state or as an aggregate or as an agglomerate and where, for 50 % or more of the particles in the number size distribution, one or more external dimensions is in the size range 1 nm – 100 nm.”
― EU definition for nanomaterials
Domain of application
The purpose of this development is to help users easily and rapidly identify whether a material is potentially a nanomaterial by finding the percentage of particles having one or more external dimensions smaller than 100 nm.
Semi-automatic single particle analysis
The application, developed using “Processing”, allows a user to easily identify nanoparticles in a TEM micrograph by following a simple procedure (a sketch of the core size calculation follows the list).
- Choose one or more representative micrographs
- Indicate any of the following:
  - The pixel size in nm, if known
  - The micrograph width in nm, if known
  - Or use the semi-automatic image calibration offered by the application
- Use the mouse (or keyboard) to move the computed 100 nm diameter circle over the particles to check their size
- Hit “+” for positive and “-” for negative hits
- Save the results
- Move to the next micrograph
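The core of the size check reduces to converting the 100 nm reference diameter into pixels from the image calibration. Here is a minimal Python sketch with invented calibration values and hit counts (the actual tool is written in “Processing”):

```python
def circle_diameter_px(pixel_size_nm: float, ref_nm: float = 100.0) -> float:
    """Diameter in pixels of the reference circle overlaid on the image."""
    return ref_nm / pixel_size_nm

# Example: a micrograph 4096 nm wide digitized at 2048 pixels
pixel_size = 4096.0 / 2048                # 2 nm per pixel
print(circle_diameter_px(pixel_size))     # -> 50.0 px circle for 100 nm

# Tallying "+" / "-" hits gives the EU-definition percentage directly
hits = {"+": 37, "-": 13}                 # illustrative counts
share = hits["+"] / (hits["+"] + hits["-"])
print(f"{share:.0%} of particles have an external dimension below 100 nm")
```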
Conclusion
Results are computed automatically, which helps with a rapid check of micrographs.
This is a very helpful method for classifying borderline nanomaterials, where particles have one or more external dimensions around the 100 nm boundary, or for aggregated and agglomerated nanomaterials, where it helps to check the size and percentage of primary particles.

ToolBox
Graduate work
Developed in Visual Basic™ and uses an SQL™ database.