KEYWORDS: Signal attenuation, Telecommunications, Visualization, Pattern recognition, Signal analysis, Detection and tracking algorithms, Image segmentation, Logic, Signal analyzers, Networks
The data-entropy quality budget developed by the authors is used as an alternative to the conventional power budget. The traditional power-budget approach cannot fully analyse a system with several different noise types and, in particular, cannot provide a measure of signal quality. The quality budget addresses this issue by applying its dimensionless 'bit measure' to integrate the analysis of all types of loss. A data-entropy visualisation is produced for each set of points in a reference and test signal. This data-entropy signal is a measure of signal disorder and reflects both the power loss and the types of degradation experienced by the test signal. To analyse the differences between two signals, an algorithm known as phase-coherent data-scatter (PCDS) is used to assess levels of attenuation, dispersion, jitter and other impairments. Practical analysis of telecommunications signals using the new multiple-centroid (MC) PCDS is presented here for the first time. MC-PCDS is then used to analyse differences between sets of data-entropy signals and digital signals. The theory behind MC data-scatter is discussed and its advantages for the quantification of signal degradations are assessed. Finally, brief consideration is given to the use of pattern recognition algorithms to measure optical signal-degrading factors.
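The authors' data-entropy measure is defined in the cited work; as a generic, hedged illustration of the idea only, the sketch below estimates the Shannon entropy (in bits) of a sampled signal by histogramming its amplitudes over a fixed reference range, so that a noisier (more disordered) test signal scores higher than a clean reference. All names and parameter choices here are illustrative assumptions, not the authors' implementation.

```python
import math
import random

def shannon_entropy(signal, n_bins=16, lo=-2.0, hi=2.0):
    """Estimate Shannon entropy (bits) of a sampled signal by
    histogramming amplitudes over a fixed range, so that entropies
    of different signals are directly comparable."""
    width = (hi - lo) / n_bins
    counts = [0] * n_bins
    for x in signal:
        idx = int((x - lo) / width)
        counts[min(max(idx, 0), n_bins - 1)] += 1
    total = len(signal)
    return -sum((c / total) * math.log2(c / total)
                for c in counts if c > 0)

# A degraded copy of a reference signal occupies more amplitude
# states, so its estimated entropy (disorder) is higher.
random.seed(0)
reference = [math.sin(2 * math.pi * i / 64) for i in range(512)]
degraded = [s + random.gauss(0, 0.3) for s in reference]
print(shannon_entropy(degraded) > shannon_entropy(reference))
```

The fixed amplitude range is the design choice that makes the comparison meaningful: if each signal were binned over its own range, a broadened noisy signal could appear artificially compact.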
A study was conducted to determine the alcohol concentration, refractive index and surface tension of binary solutions from multianalyser tensiotrace data. Characteristic vector analysis of multivariate response data has been successfully applied to a variety of optical tensiotraces to explore the quantitative capabilities of the multianalyser tensiograph. Singular value decomposition was used to determine the key vector, i.e. the characteristic of the required signal as it manifests in the data. This vector is then optimised against known calibration data to estimate the values of the unknown parameters. Using characteristic vector analysis, the paper explores the relationship between tensiotrace features and the physical properties of a liquid, and points to the possibility of future work on wine identification. In a second study, five wine samples were run on the multianalyser and their tensiotraces acquired. This preliminary study demonstrates that wine archiving and fingerprinting are possible.
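The characteristic-vector calibration described above can be sketched generically: extract the dominant right singular vector of a matrix of traces (here via power iteration on AᵀA, equivalent to the first SVD component), project each trace onto it, and regress the projections against the known property values. The synthetic "tensiotraces" and all names below are illustrative assumptions, not the authors' data or software.

```python
import math

def dominant_right_singular_vector(rows, n_iter=200):
    """Power iteration on A^T A: returns the first right singular
    vector (the characteristic vector) of the data matrix A,
    where each row of A is one observed trace."""
    n = len(rows[0])
    v = [1.0 / math.sqrt(n)] * n
    for _ in range(n_iter):
        av = [sum(r[j] * v[j] for j in range(n)) for r in rows]
        w = [sum(rows[i][j] * av[i] for i in range(len(rows)))
             for j in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# Synthetic stand-in traces: a peak shape scaled by concentration.
shape = [math.exp(-((j - 10) / 4.0) ** 2) for j in range(32)]
concs = [0.1, 0.2, 0.3, 0.4]
traces = [[c * s for s in shape] for c in concs]

v = dominant_right_singular_vector(traces)
scores = [sum(t[j] * v[j] for j in range(len(v))) for t in traces]

# Least-squares line mapping projection score -> concentration.
mx, my = sum(scores) / 4, sum(concs) / 4
a = (sum((x - mx) * (y - my) for x, y in zip(scores, concs))
     / sum((x - mx) ** 2 for x in scores))
b = my - a * mx

# Predict the concentration of an "unknown" trace (true value 0.25).
unknown = [0.25 * s for s in shape]
pred = a * sum(unknown[j] * v[j] for j in range(len(v))) + b
print(round(pred, 3))  # 0.25
```

The regression step absorbs the sign ambiguity of the singular vector, so the calibration works whichever direction the power iteration converges to.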
The importance of sensitive monitoring of changes in Raman spectra, in particular for microelectronic applications, is discussed here. We explore the practicality of using a data-scattering method to analyse Raman spectra, and establish how the changes observed in the spectral function characteristics depend on data-scatter parameters such as scatter closeness and scatter radii, using the "Trace Miner" software. In addition to the analysis performed on model data, analysis of experimental Raman data is also discussed. The results demonstrate the sensitivity of the approach.
For the first time the term data diffraction is introduced, with examples drawn from the algorithm known as phase-coherent data-scatter (PCDS), which produces identifiable visual patterns for different types of signal degradation in optical telecommunications. The main degradation factors that affect the performance of optical fibers include attenuation, rise-times and dispersion. The theory behind data-scatter is introduced, including comprehensive explanations of its conceptual components such as centroids, the exchange operation, coherence, closeness and the projection radius. The various issues involved in assessing the quality of digital signals are outlined using a simulation study. The authors have extended the functionality of data-scatter for the study of optical telecommunications issues, and the approach shows considerable promise. The utility of the data-entropy-based 'quality budget method' for optoelectronic system engineering is revisited using an information-theory-based approach for optical telecommunications. Proposals for the implementation of pattern recognition algorithms to analyse the repeatable patterns within data-scatter are discussed. The paper concludes with brief consideration of the advantages of linking the new data-scatter and data-entropy approaches for performance quantification and assessment in digital fiber systems.
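The full PCDS algorithm is specified in the cited work; as a toy illustration of the centroid concept alone, the sketch below treats two sampled signals as point sets and compares their centroids, showing how a pure attenuation shifts the centroid only along the amplitude axis. This is an illustrative proxy, not the PCDS algorithm itself.

```python
def centroid(points):
    """Centroid of a set of (x, y) points - the anchor concept
    in data-scatter style signal comparisons."""
    n = len(points)
    return (sum(p[0] for p in points) / n,
            sum(p[1] for p in points) / n)

# Treat two sampled signals as point sets (sample index, amplitude).
reference = [(i, 1.0) for i in range(8)]
attenuated = [(i, 0.5) for i in range(8)]  # uniform 50% amplitude loss

cx_r, cy_r = centroid(reference)
cx_a, cy_a = centroid(attenuated)
# Pure attenuation leaves the time axis untouched and shifts only
# the amplitude component of the centroid.
print((cx_a - cx_r, cy_a - cy_r))  # (0.0, -0.5)
```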
KEYWORDS: Data mining, Data modeling, Visualization, Data acquisition, Data processing, Software development, Mining, Statistical analysis, Mathematical modeling, Raman spectroscopy
Phase-coherent data-scatter (PCD-S) was originally developed for tensiographic data mining and analysis. This development has been augmented with the engineering of a software toolkit called TraceMiner, which integrates the technique with additional data mining and statistical tools for general use. This paper presents, for the first time, a theoretical treatment of data-scatter as a generic data mining tool, cognisant of the data set descriptions, data transformations, measurands and data model visualisations possible with data-scatter. Data-diffraction resulting from data-scatter is also presented here for the first time. The use of a Hough technique with these two approaches to analyse the resulting data-diffraction patterns is discussed briefly in the context of applications of this new data-scatter approach.
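The Hough technique mentioned above is a standard way to find line structure in a point pattern. As a generic, hedged sketch (not the authors' implementation), the following minimal accumulator lets each point vote for every line rho = x*cos(theta) + y*sin(theta) that passes through it; collinear points concentrate their votes in one cell.

```python
import math

def hough_lines(points, n_theta=180, rho_step=1.0):
    """Minimal Hough accumulator: each (x, y) point votes for all
    lines rho = x*cos(theta) + y*sin(theta) through it; the cell
    with the most votes identifies the dominant line."""
    max_rho = max(math.hypot(x, y) for x, y in points)
    acc = {}
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            r = int(round((rho + max_rho) / rho_step))
            acc[(t, r)] = acc.get((t, r), 0) + 1
    (t_best, r_best), votes = max(acc.items(), key=lambda kv: kv[1])
    return math.pi * t_best / n_theta, r_best * rho_step - max_rho, votes

# Ten collinear points on the horizontal line y = 2: every point
# votes for the same winning cell, so it collects all ten votes.
pts = [(float(x), 2.0) for x in range(10)]
theta, rho, votes = hough_lines(pts)
print(votes)  # 10
```

The rho-theta (normal) parameterisation is used rather than slope-intercept so that vertical lines need no special case.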
The paper investigates, from the perspective of computer science, phase coherence theory (PCT) and phase-coherent data-scatter (PCD-S). These techniques were originally developed for optical tensiographic data mining and analysis but have more general application in data mining. These developments have recently been augmented with the engineering of a software toolkit called TraceMiner. Although originally devised for tensiography, the toolkit was developed as a generic data mining and analysis application, with PCT, PCD-S and a range of other data mining algorithms implemented. To date the toolkit has been used mainly in tensiography, but it has also been applied to UV-visible spectroscopy. This work presents a critical investigation of the general utility of PCT, PCD-S and the toolkit for data mining and analysis. A new application of PCT and the TraceMiner software toolkit to Raman spectroscopy is presented, with discussion of the relevant measures and the information provided by the toolkit; this gives further insight into the generic potential of the techniques for data mining. The analysis performed on theoretical Raman data is augmented with a study of experimental Raman data. Raman spectroscopy is used for composition and fault-detection analysis of semiconductor surfaces. Finally, the utility of the PCT technique in comparison with traditional Raman spectroscopy methods is considered, together with some more general applications in the field of imaging and machine vision.
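A conventional baseline against which PCT-style measures are often judged is a simple spectral-matching score such as the Pearson correlation between two equally sampled spectra. The sketch below is a hedged illustration of that traditional approach, using synthetic Lorentzian bands around silicon's well-known 520 cm^-1 Raman line; the names and the stress-shift scenario are illustrative assumptions.

```python
import math

def pearson(a, b):
    """Pearson correlation between two equally sampled spectra,
    a conventional spectral-matching score (1.0 = identical shape)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def lorentz(center, width, axis):
    """Lorentzian band profile, a common model for Raman lines."""
    return [1.0 / (1.0 + ((x - center) / width) ** 2) for x in axis]

# Reference band near crystalline silicon's 520 cm^-1 line, and a
# copy shifted by 2 cm^-1 (e.g. stress in the surface layer).
axis = [500 + 0.1 * i for i in range(400)]
ref = lorentz(520.0, 3.0, axis)
shifted = lorentz(522.0, 3.0, axis)
print(round(pearson(ref, ref), 3))        # 1.0
print(pearson(ref, shifted) < 1.0)        # the shift lowers the score
```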
The paper critically assesses and illustrates the use of the data entropy budget method in both product and systems engineering, based on the experience of developing an optoelectronic instrument known as the tensiograph. The design of such a system, involving optoelectronic, electronic, thermal, mechanical, chemical and data-processing noise components, presents a difficult engineering problem because of the complex spectrum of noise contributions. The project perhaps provides an important case study for optical engineers because it was developed over a period of 15 years. The design history, recorded in the data entropy-time graph, clearly shows the step-wise improvements achieved by the various engineering efforts. The present 11-bit information content of the instrument, with an impressive signal-to-noise ratio exceeding 1000:1, was developed from a prototype with less than 3-bit resolution. The paper concludes with an assessment of the relevance of this method to optical engineering, in which diverse technologies are frequently integrated in products and systems. Finally, the role of data entropy methods in third-level education is briefly considered, with very clear lessons drawn from the concrete example offered by this case study.
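The relation between resolution in bits and signal-to-noise ratio can be made concrete under the common Shannon-style convention that information content is log2(1 + SNR) bits; the authors' data-entropy budget may define the measure differently, so this is a hedged illustration only. On this convention, a sub-3-bit prototype corresponds to an SNR below about 7:1, while 11 bits implies an SNR near 2048:1, consistent with the quoted figure of "exceeding 1000:1".

```python
import math

def effective_bits(snr):
    """Information content, in bits, of an instrument with the given
    signal-to-noise ratio, using the Shannon-style log2(1 + SNR)."""
    return math.log2(1 + snr)

for snr in (7, 1000, 2047):
    print(f"SNR {snr}:1 -> {effective_bits(snr):.2f} bits")
# SNR 7:1 -> 3.00 bits
# SNR 1000:1 -> 9.97 bits
# SNR 2047:1 -> 11.00 bits
```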