The etiology of craniofacial defects is incompletely understood. The ability to obtain large amounts of gene sequence data from families affected by craniofacial defects is opening up new ways to understand molecular genetic etiological factors. One important link between gene sequence data and clinical relevance is biological research into candidate genes and molecular pathways. We present our recent research using optical coherence tomography (OCT) as a nondestructive phenotyping modality for craniofacial morphology in Xenopus embryos, an important animal model for biological research in gene and pathway discovery. We define 2D and 3D scanning protocols for a standardized approach to craniofacial imaging in Xenopus embryos, along with standard views and planar reconstructions for visualizing normal anatomy and landmarks. We compare these views and reconstructions to traditional histopathology using alcian blue staining. In addition to being 3D, nondestructive, and offering much faster throughput, OCT can identify craniofacial features that are lost during traditional histopathological preparation. We also identify quantitative morphometric parameters to define normative craniofacial anatomy. Finally, we note that craniofacial and cardiac defects are not infrequently present in the same patient (e.g., velocardiofacial syndrome). Given that OCT excels at certain aspects of cardiac imaging in Xenopus embryos, our work highlights the potential of using OCT and Xenopus to study molecular genetic factors that impact both cardiac and craniofacial development.
Bone age assessment (BAA) is a method of determining skeletal maturity and detecting growth disorders in the skeleton of a person. BAA is frequently used in pediatric medicine but is also a time-consuming and cumbersome task for a radiologist. Conventionally, the Greulich and Pyle and the Tanner and Whitehouse methods are used for bone age assessment, which are based on visual comparison of left hand radiographs with a standard atlas. We present a novel approach for automated bone age assessment, combining scale invariant feature transform (SIFT) features and support vector machine (SVM) classification. In this approach, (i) data is grouped into 30 classes to represent the age range of 0-18 years, (ii) 14 epiphyseal ROIs (eROIs) are extracted from left hand radiographs, (iii) multi-level image thresholding, using Otsu's method, is applied to specify key points on bone and osseous tissues of eROIs, (iv) SIFT features are extracted at the specified key points for each eROI of a hand radiograph, and (v) classification is performed using a multi-class extension of SVM. A total of 1101 radiographs from the University of Southern California are used in the training and testing phases using 5-fold cross-validation. Evaluation is performed for two age ranges (0-18 years and 2-17 years) for comparison with previous work and with the commercial product BoneXpert, respectively. Results improved significantly, with mean errors of 0.67 years and 0.68 years for the age ranges 0-18 years and 2-17 years, respectively. An accuracy of 98.09% within a range of two years was achieved.
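As an illustration of the pipeline above, the following sketch combines multi-level Otsu thresholding, SIFT descriptors, and a multi-class SVM. It assumes grayscale eROI crops are already available; helper names such as eroi_descriptor and radiograph_feature are illustrative and not taken from the paper.

```python
# Minimal sketch of the SIFT + multi-class SVM pipeline described above.
# Assumes 8-bit grayscale eROI crops have already been extracted from the
# radiographs; thresholds, subsampling, and pooling choices are illustrative.
import cv2
import numpy as np
from skimage.filters import threshold_multiotsu
from sklearn.svm import SVC

sift = cv2.SIFT_create()  # requires OpenCV >= 4.4 (or opencv-contrib)

def eroi_descriptor(eroi: np.ndarray) -> np.ndarray:
    """Summarize one epiphyseal ROI as a fixed-length SIFT descriptor."""
    # Multi-level Otsu separates background, soft tissue, and bone.
    thresholds = threshold_multiotsu(eroi, classes=3)
    bone_mask = eroi > thresholds[-1]
    # Place key points on bone/osseous pixels (subsampled for speed).
    ys, xs = np.nonzero(bone_mask)
    keypoints = [cv2.KeyPoint(float(x), float(y), 8.0)
                 for x, y in zip(xs[::20], ys[::20])]
    _, descriptors = sift.compute(eroi, keypoints)
    if descriptors is None:
        return np.zeros(128, dtype=np.float32)
    # Pool the 128-D SIFT descriptors into one vector per eROI.
    return descriptors.mean(axis=0)

def radiograph_feature(erois: list) -> np.ndarray:
    """Concatenate the descriptors of the 14 eROIs of one radiograph."""
    return np.concatenate([eroi_descriptor(e) for e in erois])

# X: one feature vector per radiograph, y: one of the 30 age classes.
# SVC provides the multi-class extension (one-vs-one) internally.
clf = SVC(kernel="rbf", C=10.0, gamma="scale")
# clf.fit(X_train, y_train); y_pred = clf.predict(X_test)
```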
Photographic documentation and image-based wound assessment are frequently performed in medical diagnostics, patient care, and clinical research. To support quantitative assessment, photographic imaging relies on expensive, high-quality hardware and still needs appropriate registration and calibration. Using inexpensive consumer hardware such as smartphone-integrated cameras, calibration of geometry, color, and contrast is challenging. Some methods involve color calibration using a reference pattern such as a standard color card, which is located manually in the photographs. In this paper, we adapt the lattice detection algorithm by Park et al. from real-world scenes to medical photographs. At first, the algorithm extracts and clusters feature points according to their local intensity patterns. Groups of similar points are fed into a selection process, which tests for suitability as a lattice grid. The group most likely to form the meshes of a lattice is selected, and a template for an initial lattice cell is extracted from it. Then, a Markov random field is modeled. Using mean-shift belief propagation, the detection of the 2D lattice is solved iteratively as a spatial tracking problem. Least-squares geometric calibration of projective distortions and non-linear color calibration in RGB space are supported by the 35 corner points and the 24 color patches, respectively. The method is tested on 37 photographs taken from the German Calciphylaxis registry, where non-standardized photographic documentation is collected nationwide from all contributing trial sites. In all images, the reference card location is correctly identified. At least 28 out of 35 lattice points were detected, outperforming the SIFT-based approach previously applied. Based on these coordinates, robust geometry and color registration is performed, making the photographs comparable for quantitative analysis.
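A minimal sketch of the calibration step is given below, assuming the 35 lattice corners and the mean colors of the 24 patches have already been detected, and that the card's reference corner coordinates and reference colors are known a priori; the affine color model shown here is a simplification of the non-linear calibration described above.

```python
# Minimal sketch of geometric and color calibration from the detected card.
# ref_corners / ref_colors are assumed known reference values of the card.
import cv2
import numpy as np

def calibrate(image, detected_corners, ref_corners, measured_colors, ref_colors):
    # --- geometry: homography from the 35 point correspondences ------------
    # Default method (0) fits a least-squares homography over all points.
    H, _ = cv2.findHomography(np.float32(detected_corners),
                              np.float32(ref_corners))
    h, w = image.shape[:2]
    rectified = cv2.warpPerspective(image, H, (w, h))

    # --- color: affine RGB transform fitted to the 24 patches --------------
    # Solve [R G B 1] @ A ~= ref_colors in the least-squares sense.
    ones = np.ones((measured_colors.shape[0], 1))
    A, *_ = np.linalg.lstsq(np.hstack([measured_colors, ones]),
                            ref_colors, rcond=None)
    pix = rectified.reshape(-1, 3).astype(np.float64)
    corrected = np.hstack([pix, np.ones((pix.shape[0], 1))]) @ A
    return np.clip(corrected, 0, 255).reshape(rectified.shape).astype(np.uint8)
```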
Cilia-driven fluid flow is a critical yet poorly understood aspect of pulmonary physiology. Here, we demonstrate that optical coherence tomography-based particle tracking velocimetry can be used to quantify subtle variability in cilia-driven flow performance in Xenopus, an important animal model of ciliary biology. Changes in flow performance were quantified in the setting of normal development, as well as in response to three types of perturbations: mechanical (increased fluid viscosity), pharmacological (disrupted serotonin signaling), and genetic (diminished ciliary motor protein expression). Of note, we demonstrate decreased flow secondary to gene knockdown of kif3a, a protein involved in ciliogenesis, as well as a dose-response decrease in flow secondary to knockdown of dnah9, an important ciliary motor protein.
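The sketch below illustrates the principle of particle tracking velocimetry on such image sequences, assuming particle centroids per frame have already been detected; the simple nearest-neighbor linking and the max_disp parameter are illustrative simplifications of the full method.

```python
# Minimal sketch of particle tracking velocimetry on an image series;
# assumes particle centroids per frame are already detected (e.g. by blob
# detection). Nearest-neighbor linking is a simplification of full PTV.
import numpy as np
from scipy.spatial import cKDTree

def frame_velocities(pts_t0, pts_t1, dt, max_disp=10.0):
    """Match particles between consecutive frames and return velocity vectors."""
    pts_t0, pts_t1 = np.asarray(pts_t0), np.asarray(pts_t1)
    tree = cKDTree(pts_t1)
    dist, idx = tree.query(pts_t0, distance_upper_bound=max_disp)
    matched = np.isfinite(dist)  # unmatched particles get infinite distance
    return (pts_t1[idx[matched]] - pts_t0[matched]) / dt  # pixels per second

def mean_flow_speed(frames, dt):
    """Average flow speed over a sequence (frames: list of Nx2 centroid arrays)."""
    speeds = []
    for p0, p1 in zip(frames[:-1], frames[1:]):
        v = frame_velocities(p0, p1, dt)
        if len(v):
            speeds.append(np.linalg.norm(v, axis=1).mean())
    return float(np.mean(speeds)) if speeds else 0.0
```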
Imaging and image-based measurements nowadays play an essential role in controlled clinical trials, but electronic data capture (EDC) systems insufficiently support integration of images captured by mobile devices (e.g. smartphones and tablets). The web application OpenClinica has established itself as one of the world's leading EDC systems and is used to collect, manage, and store data of clinical trials in electronic case report forms (eCRFs). In this paper, we present a mobile application for instantaneous integration of images into OpenClinica directly during examination at the patient's bedside. The communication between the Android application and OpenClinica is based on simple object access protocol (SOAP) and representational state transfer (REST) web services for metadata, and on the secure file transfer protocol (SFTP) for image transfer. OpenClinica's web services are used to query context information (e.g. existing studies, events, and subjects) and to import data into the eCRF, as well as to export eCRF metadata and structural information. A stable image transfer is ensured, and progress information (e.g. remaining time) is visualized to the user. The workflow is demonstrated for a European multi-center registry, which includes patients with calciphylaxis. Our approach improves the EDC workflow, saves time, and reduces costs. Furthermore, data privacy is enhanced, since storage of private health data on the imaging devices becomes obsolete.
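The following sketch illustrates the transfer workflow in Python for brevity (the app itself runs on Android); the REST endpoint path and credentials are placeholders rather than OpenClinica's actual interface, while the SFTP upload with a progress callback mirrors the progress reporting described above.

```python
# Minimal sketch of the metadata query and image transfer. The "/ws/studies"
# path and credentials below are placeholders, not OpenClinica's real API.
import requests
import paramiko

def list_studies(base_url, user, password):
    # Illustrative metadata query; OpenClinica exposes SOAP/REST web services.
    r = requests.get(f"{base_url}/ws/studies", auth=(user, password), timeout=30)
    r.raise_for_status()
    return r.json()

def upload_image(host, user, password, local_path, remote_path):
    """Transfer one image over SFTP and report progress to the caller."""
    transport = paramiko.Transport((host, 22))
    transport.connect(username=user, password=password)
    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        def progress(done, total):
            # Hook for the UI, e.g. to estimate remaining transfer time.
            print(f"{100 * done // total}% transferred")
        sftp.put(local_path, remote_path, callback=progress)
    finally:
        sftp.close()
        transport.close()
```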
Wearable technology defines a new class of smart devices that are accessories or clothing equipped with computational power and sensors, like Google Glass. In this work, we propose a novel concept for supporting everyday clinical pathways with wearable technology. In contrast to most prior work, we are not focusing on the omnipresent screen to display patient information or images, but are trying to maintain existing workflows. To achieve this, our system supports clinical staff as a documenting observer, intervening only when problems are detected. Using the example of medication preparation and administration, a task known to be prone to errors, we demonstrate the full potential of the new devices. Patient and medication identifiers are captured with the built-in camera, and the information is sent to a transaction server. The server communicates with the hospital information system to obtain patient records and medication information. The system then analyses the new medication for possible side-effects and interactions with already administered drugs. The result is sent to the device while all sensitive information remains encapsulated on the server, respecting data security and privacy. The user only sees traffic light style encoded feedback to avoid distraction. The server can reduce documentation efforts and reports in real-time on possible problems during medication preparation or administration. In conclusion, we designed a secure system around three basic principles with many applications in everyday clinical work: (i) interaction and distraction are kept as low as possible; (ii) no patient data is displayed; and (iii) the device is a pure observer, not part of the workflow. By reducing errors and documentation burden, our approach has the potential to improve clinical care.
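A minimal sketch of the server-side decision logic is shown below: scanned identifiers come in, and only a traffic-light code goes back to the wearable. The interaction lookup table is a stand-in for the hospital information system query described above.

```python
# Minimal sketch: only a traffic-light status leaves the server; all patient
# data stays behind. known_interactions stands in for the hospital information
# system's drug-interaction knowledge.
from enum import Enum

class Light(str, Enum):
    GREEN = "green"    # no problem detected
    YELLOW = "yellow"  # possible side effect, review advised
    RED = "red"        # known interaction with an already administered drug

def check_medication(patient_id: str, drug: str,
                     administered: dict, known_interactions: dict) -> Light:
    """Return only a traffic-light code; no patient data is sent to the device."""
    worst = Light.GREEN
    for prior in administered.get(patient_id, []):
        status = known_interactions.get(frozenset({drug, prior}), Light.GREEN)
        if status == Light.RED:
            return Light.RED
        if status == Light.YELLOW:
            worst = Light.YELLOW
    return worst
```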
Calciphylaxis is a rare and devastating disease associated with high morbidity and mortality. It is characterized by systemic medial calcification of the arteries yielding necrotic skin ulcerations. In this paper, we aim to support the establishment of multi-center registries for calciphylaxis, which include photographic documentation of skin necrosis. However, photographs acquired in different centers under different conditions using different equipment and photographers cannot be compared quantitatively. For normalization, we use a simple color card that is placed into the field of view, segmented from the image, and whose color fields are analyzed. In total, 24 colors are printed on the card. A least-squares approach is used to determine the affine color transform. Furthermore, the card allows scale normalization. We provide a case study for qualitative assessment. In addition, the method is evaluated quantitatively using 10 images from two sets of different captures of the same necrosis. The variability of quantitative measurements based on free-hand photography is assessed with respect to geometric and color distortions before and after our simple calibration procedure. Using automated image processing, the standard deviation of measurements is significantly reduced. The coefficients of variation are 5-20% and 2-10% for geometry and color, respectively. Hence, quantitative assessment of calciphylaxis becomes practicable and will support a better understanding of this rare but fatal disease.
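The variability figures above can be understood via the coefficient of variation of repeated measurements, sketched below; the function and usage note are illustrative and not taken from the registry's analysis code.

```python
# Minimal sketch of the variability measure used to compare repeated captures.
import numpy as np

def coefficient_of_variation(measurements) -> float:
    """CV = standard deviation / mean, reported in percent."""
    m = np.asarray(measurements, dtype=float)
    return 100.0 * m.std(ddof=1) / m.mean()

# Usage: given repeated measurements of the same necrosis (e.g. lesion area or
# mean color of a region) before and after calibration, compare
# coefficient_of_variation(raw_values) with coefficient_of_variation(calibrated_values).
```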
Microfluidic mixing, or mixing at low Reynolds number, is dominated by viscous forces that prevent turbulent flow. It therefore differs from conventional mixing (e.g., stirring milk into coffee), as it is driven primarily by diffusion. Diffusion in turn depends on (i) the concentration gradient along the interface between two fluids (the dye front line) and (ii) the extent of the interface itself. Previously, we proposed an in vivo method to microscopically monitor the mixing interface using Shannon information entropy as a mixing indicator and explored the use of the length of the dye front line as an indirect measure of mixing efficiency. In this work, we present a robust image processing chain supporting quantitative measurements. Based on data from ciliated surfaces mixing dye and water, the dye-water front line is extracted automatically using the following processing steps: (i) noise reduction (average filtering) and downsampling in time to reduce compression artifacts; (ii) subtraction imaging with key reference frames in RGB color space to remove the background; (iii) segmentation of dye based on color saturation in HSV color space; (iv) extraction of the front line; (v) curve smoothing in curvature scale space (CSS) with an improved Gaussian filter adaptive to the local concentration gradient; and (vi) extraction of the front-line length. Evaluation is based on repeated measurements. Reproducibility in unaltered animals is shown using intra- and inter-animal comparisons. Future work will include a more comprehensive evaluation and the application to datasets with multiple classes.
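A minimal sketch of steps (ii)-(vi) of this chain is given below, assuming a background reference frame is available; the thresholds and smoothing sigma are illustrative, and the plain Gaussian filter stands in for the gradient-adaptive CSS filter described above.

```python
# Minimal sketch of steps (ii)-(vi) of the front-line extraction chain.
# frame_bgr / reference_bgr are 8-bit color frames as loaded by OpenCV (BGR).
import cv2
import numpy as np
from scipy.ndimage import gaussian_filter1d

def dye_front_length(frame_bgr, reference_bgr, sat_thresh=60, sigma=3.0):
    # (ii) background subtraction against a key reference frame
    diff = cv2.absdiff(frame_bgr, reference_bgr)
    # (iii) segment dye by color saturation in HSV space
    hsv = cv2.cvtColor(diff, cv2.COLOR_BGR2HSV)
    mask = (hsv[:, :, 1] > sat_thresh).astype(np.uint8) * 255
    # (iv) extract the dye-water interface as the largest contour
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return 0.0
    front = max(contours, key=cv2.contourArea)[:, 0, :].astype(float)
    # (v) curve smoothing (plain Gaussian here; the paper adapts the filter
    #     to the local concentration gradient, which is omitted in this sketch)
    xs = gaussian_filter1d(front[:, 0], sigma)
    ys = gaussian_filter1d(front[:, 1], sigma)
    # (vi) arc length of the smoothed front line in pixels
    return float(np.sum(np.hypot(np.diff(xs), np.diff(ys))))
```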
Motile cilia are cellular organelles that project from different epithelial surfaces, including the respiratory epithelium. They generate directional fluid flow that removes harmful pathogens and particulate matter from the respiratory system. While it has long been known that primary ciliary dyskinesia increases the risk of recurrent pulmonary infections, there is now heightened interest in understanding the role that cilia play in a wide variety of respiratory diseases. Different optical imaging technologies are being investigated to visualize cilia-driven fluid flow, and quantitative image analysis is used to generate measures of ciliary performance. Here, we demonstrate the quantification of in vivo cilia-driven microfluidic mixing using spatial and temporal measures of Shannon information entropy. Using videomicroscopy, we imaged in vivo cilia-driven fluid flow generated by the epidermis of the Xenopus tropicalis embryo. Flow was seeded with either dyes or microparticles. Both spatial and temporal measures of entropy show significant levels of mixing, with maximum entropy measures of ~6.5 (out of a possible range of 0 to 8). Spatial entropy measures showed localized mixing "hot-spots" and "cold-spots", and temporal measures showed mixing throughout. In sum, entropy-based measures of microfluidic mixing can characterize in vivo cilia-driven fluid flow and hold the potential for better characterization of ciliary dysfunction.
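The entropy measure itself can be sketched as follows: the Shannon entropy (in bits) of an 8-bit intensity histogram, bounded by 0 and 8 as stated above. The window size and the per-pixel temporal variant below are illustrative choices, not the exact parameters of the study.

```python
# Minimal sketch of spatial and temporal Shannon entropy on 8-bit grayscale
# frames; 256-bin histograms bound the entropy between 0 and 8 bits.
import numpy as np

def shannon_entropy(values: np.ndarray) -> float:
    """Entropy of an 8-bit sample in bits (0 = uniform, 8 = maximal mixing)."""
    hist = np.bincount(values.ravel().astype(np.uint8), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def spatial_entropy_map(frame: np.ndarray, win: int = 32) -> np.ndarray:
    """Entropy per non-overlapping window: reveals mixing hot-spots and cold-spots."""
    h, w = frame.shape[:2]
    return np.array([[shannon_entropy(frame[y:y + win, x:x + win])
                      for x in range(0, w - win + 1, win)]
                     for y in range(0, h - win + 1, win)])

def temporal_entropy(frames: list, y: int, x: int) -> float:
    """Entropy of one pixel's intensity over time (mixing over the whole video)."""
    return shannon_entropy(np.array([f[y, x] for f in frames]))
```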