Developing a self-calibrating system for volume measurement of spheroidal particles using two acoustically levitated droplets
Andreas Johansson, Ricardo Méndez-Fragoso, Jonas Enger
Review of Scientific Instruments 95(11), November 2024. DOI: 10.1063/5.0211033
Abstract
Acoustically levitated droplets in the nanoliter to microliter range are studied in various fields. Their volumes are conventionally measured using image analysis, with a precision-manufactured calibration sphere used to calibrate the recording equipment, a procedure that is time-consuming and expensive. This paper describes a self-calibrating method for measuring the volumes of acoustically levitated droplets as a versatile and low-cost alternative. The distance between two levitated droplets in a horizontally oriented acoustic trap is extracted from real-time or recorded frame data using image analysis. To assist in setting the cavity length of the acoustic trap, the acoustic field is simulated based on the temperature in the trap, which also predicts the distance between the central nodes used to determine the scale factor. The volumes of the spheroidal levitated droplets can then be calculated from the pixel data. We use a modified version of the well-known TinyLev, and the method has been tested with two types of transducer packing. Its accuracy for volume measurements has been verified against the standard calibration-sphere technique. Self-calibration of the system is demonstrated by changing the camera zoom during data collection, with negligible effect on the measured volume, something that could not be achieved with conventional static calibration methods.
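In idealized form, the workflow described above reduces to three steps: predict the physical spacing between the central pressure nodes from the trap temperature, divide that spacing by the measured pixel distance between the two droplets to obtain a scale factor, and convert each droplet's pixel dimensions into an oblate-spheroid volume. The sketch below is a minimal illustration of that chain, not the authors' implementation: it assumes a 40 kHz trap, an ideal standing wave (node spacing of half a wavelength, whereas the paper uses a full acoustic-field simulation), and the pixel measurements in the example are hypothetical.

```python
import math


def speed_of_sound_air(temp_c: float) -> float:
    """Approximate speed of sound in dry air [m/s] at temperature temp_c [degrees C]."""
    return 331.3 * math.sqrt(1.0 + temp_c / 273.15)


def node_spacing(temp_c: float, freq_hz: float = 40e3) -> float:
    """Spacing between adjacent pressure nodes in an ideal standing wave: half a wavelength [m].
    The paper instead predicts this spacing from a simulation of the acoustic field."""
    return speed_of_sound_air(temp_c) / freq_hz / 2.0


def scale_factor(node_distance_px: float, temp_c: float, freq_hz: float = 40e3) -> float:
    """Metres per pixel, from the measured pixel distance between the two droplets
    sitting in the central nodes of the trap."""
    return node_spacing(temp_c, freq_hz) / node_distance_px


def spheroid_volume(major_px: float, minor_px: float, m_per_px: float) -> float:
    """Volume [m^3] of an oblate spheroid whose horizontal (major) and vertical (minor)
    diameters were measured in pixels."""
    a = 0.5 * major_px * m_per_px  # equatorial semi-axis [m]
    c = 0.5 * minor_px * m_per_px  # polar semi-axis [m]
    return (4.0 / 3.0) * math.pi * a * a * c


if __name__ == "__main__":
    # Hypothetical example: droplets 412 px apart at 22 C in a 40 kHz trap;
    # one droplet imaged as 95 px wide and 78 px tall.
    k = scale_factor(node_distance_px=412.0, temp_c=22.0)
    v = spheroid_volume(major_px=95.0, minor_px=78.0, m_per_px=k)
    print(f"scale factor: {k * 1e6:.2f} um/px, volume: {v * 1e9:.3f} uL")
```

Because the scale factor is re-derived from the droplet separation in every frame, a change of camera zoom simply changes the pixel distance and the pixel diameters by the same factor, which is why the computed volume is insensitive to zoom.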
Journal description:
Review of Scientific Instruments is committed to the publication of advances in scientific instruments, apparatuses, and techniques. RSI seeks to meet the needs of engineers and scientists in physics, chemistry, and the life sciences.