Application of Image Processing and Data Science in Medical Science

Introduction

In today’s technological era, data science has touched all aspects of human life. The last few years have also seen a huge contribution of data science to the field of healthcare and medical science. Most doctors now use a variety of advanced medical devices, which generate a large amount of health data. Healthcare data production includes hospital records arising from various sources such as doctors’ expertise, patients’ medical records, results of medical tests, and so forth. Biomedical research also plays a very important role in generating data relevant to public health services. According to the World Health Organization, approximately 1.2 billion clinical documents are produced in the United States every year, about 80% of which are unstructured data such as doctors’ drug prescriptions and images of the various organs of patients. These types of data require proper management and analysis to obtain meaningful information; otherwise, searching for a solution within them becomes comparable to finding a needle in a haystack. Hence, data analysis plays an important role in handling these types of unstructured data (much of which is generated from image processing), and this can only be achieved using high-quality computing solutions for data analysis.

From the data perspective, biomedical images are a primary source in the medical science and healthcare industries. These biomedical images are images of the human body, and they play an important role in understanding the nature of the human biological system. The most widely used biomedical imaging techniques to assess the current state of any organ or tissue are X-rays, CT (computed tomography) scans, sound (ultrasound), magnetism (MRI), radioactive pharmaceuticals (nuclear medicine: SPECT and PET), and light (endoscopy and OCT). Image processing combined with data analysis can process and examine human health using these biomedical images, which can be of great use to the health industry. Computer vision software based on learning algorithms is already making things simpler and more comfortable in the healthcare industry; such software increasingly makes automated analysis possible, with higher efficiency and more accurate results. Most hospitals have not yet started using such automated analysis techniques. When used appropriately, these techniques based on medical imaging and data science reduce our dependence on manual analysis. They can be used to obtain advanced and accurate diagnostic procedures, including molecular imaging, from the macroscopic to the microscopic scale. Image processing can play an important role in detecting various diseases such as glioma, AIDS, dementia, Alzheimer’s disease, cancer metastasis, tumors, and so on.

In order to provide relevant solutions for improving public health, the appropriate infrastructure must be fully equipped to generate and analyze the data of all healthcare providers in a completely systematic way. Efficient management, accurate analysis, and fine interpretation of all these data can change the game for the biomedical industry by opening new avenues for modern healthcare. This is why various industries, including the healthcare industry, are taking vigorous steps to transform image-processing-based data science capability into better services and financial benefits. With a strong integration of biomedical and healthcare data, the modern healthcare organization is poised to bring a revolution in medical treatment and personalized medicine. This chapter covers the sources of medical data, the role of data analysis, and the tools and techniques for handling biomedical image processing with advanced analytical tools.

Ideal Dataset of Medical Imaging for Data Analysis

An ideal medical image dataset must have sufficient data volume, annotation, ground truth, and reusability for a data analysis application in medical science. On this basis, each medical imaging dataset object contains data elements, metadata, and variables (identifiers); this combination represents an imaging examination. It is mandatory to have sufficient imaging examinations in a collection of data objects, or dataset, for hypothesis-driven analysis. In order to make an algorithm fully efficient, both the dataset itself and each imaging exam must be correctly described and labeled. To serve as ground truth, each imaging exam must be fully labeled and as accurate and reproducible as possible (Figure 8.1). In addition, an ideal dataset must be Findable, Accessible, Interoperable, and Reusable (Wilkinson et al., 2016).
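To make the structure of such a dataset concrete, the sketch below models an imaging examination as a data object that carries an identifier, a reference to the pixel data, acquisition metadata, and ground-truth labels. This is only an illustrative Python sketch under the assumptions stated here; the class and field names are invented for the example and are not a schema from this chapter or from the FAIR guidelines.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ImagingExam:
    """One imaging examination: identifier, pixel-data reference, metadata, labels."""
    exam_id: str                                            # persistent identifier (Findable/Accessible)
    modality: str                                           # e.g. "CT", "MRI", "PET"
    image_uri: str                                          # where the pixel data is stored
    metadata: Dict[str, str] = field(default_factory=dict)  # acquisition parameters, scanner, protocol
    labels: List[str] = field(default_factory=list)         # ground-truth annotations


@dataclass
class ImagingDataset:
    """A collection of exams large enough to support hypothesis-driven analysis."""
    name: str
    exams: List[ImagingExam] = field(default_factory=list)

    def labelled_fraction(self) -> float:
        """Fraction of exams carrying at least one ground-truth label."""
        if not self.exams:
            return 0.0
        return sum(bool(e.labels) for e in self.exams) / len(self.exams)
```

A helper such as labelled_fraction() makes it easy to check how completely a collection is annotated before it is handed to an analysis algorithm.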

[Figure 8.1 shows medical imaging modalities (X-ray/radiography, tomography/CT, ultrasound, MRI, thermographic imaging, and molecular imaging/nuclear medicine: PET and SPECT) feeding into data analysis, which yields improved outcomes and a smarter, more cost-effective decision-making process.]

FIGURE 8.1 Workflow of data analytics to obtain a smarter healthcare option based on the medical imaging dataset.

Fundamentals of Medical Image Processing

Medical image processing produces visual images of the internal structures of the body for scientific and pharmacological study and treatment, as well as visualization of the function of internal tissues. This type of processing pursues the identification and management of disorders in the patient’s body. In the process, a data bank of normal structures is created, which makes it easy to detect anomalies in the functioning of the various organs of the human body.

[Figure 8.2 depicts the major steps of medical image processing: image formation (acquisition), image enhancement (calibration, registration, optimization, transformation, filtering), image visualization (surface reconstruction, noise/illumination, colour shading, image display), image analysis (feature extraction, segmentation, classification), and image management.]

FIGURE 8.2 Major steps of medical image processing.

These procedures include both biological and radiological imaging, using electromagnetic energy (X-rays and gamma rays), sonography, magnetism, endoscopy, and thermal and isotope imaging. Medical image processing encompasses five major areas, as presented in Figure 8.2.

Steps of Image Processing

"Biomedical image processing”, commonly used in the biomedical field, refers to the provision of digital image processing for biomedical science. Secondly, data science comes to light due to the increasing use of image processing in the field of medical science to make biomedical decisions. In general, biomedical image processing involves five major areas:

  1. Image Formation: This step covers everything from capturing the image to expressing the digital image in matrix form.
  2. Image Enhancement: Digital images are adjusted at this step to make the results more suitable for display or further image analysis. For example, this step can remove, sharpen, or brighten a layer of an image, which makes it easy to identify its key features.
  3. Image Visualization: This step refers to all types of manipulation of the image matrix that result in a customized output of the image.
  4. Image Analysis: This step involves all stages of processing that are used for abstract interpretations of biomedical images along with quantitative measurements. These steps require prior knowledge of the nature and content of the images, which is integrated into the algorithm at a high level of abstraction. Thus, the process of image analysis is very specific, and the developed algorithms can rarely be transferred directly to other application domains.
  5. Image Management: This step comprises all techniques that provide efficient storage, communication, transmission, archiving, and access (retrieval) of image data. Thus, a part of image management can also be demonstrated through the method of telemedicine.
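The pipeline sketch promised above treats the five areas as successive stages. The function names, the fixed 512 x 512 geometry, and the toy operations inside each stage are illustrative assumptions only; they show how the stages chain together, not how any particular modality is actually processed.

```python
import numpy as np


def form_image(raw_signal: np.ndarray) -> np.ndarray:
    """Image formation: turn the acquired signal into a 2-D pixel matrix."""
    return raw_signal.reshape(512, 512)          # illustrative fixed geometry


def enhance(image: np.ndarray) -> np.ndarray:
    """Image enhancement: simple contrast stretch to the full intensity range."""
    lo, hi = image.min(), image.max()
    return (image - lo) / (hi - lo + 1e-9)


def visualize(image: np.ndarray) -> np.ndarray:
    """Image visualization: prepare the matrix for display (here: clip to [0, 1])."""
    return np.clip(image, 0.0, 1.0)


def analyze(image: np.ndarray) -> dict:
    """Image analysis: extract a quantitative measurement (here: mean intensity)."""
    return {"mean_intensity": float(image.mean())}


def manage(image: np.ndarray, path: str) -> None:
    """Image management: store the processed matrix for later retrieval."""
    np.save(path, image)


# Chained together, the stages mirror Figure 8.2.
raw = np.random.rand(512 * 512)
result = analyze(visualize(enhance(form_image(raw))))
```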

Image analysis is often also referred to as high-level image processing. Low-level processing, on the other hand, refers to manual or automated techniques that can be realized without a priori knowledge of the specific content of the images. Such automatic algorithms have similar effects regardless of what the image depicts; they would work just as well on any holiday photograph. This is why low-level processing methods are usually available in general-purpose programs for image enhancement.
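Histogram equalization is a typical example of such a low-level method: it spreads the intensity distribution over the full range without any knowledge of what the image depicts. The following small NumPy sketch illustrates the idea; the function name and the toy low-contrast test image are assumptions made for this example.

```python
import numpy as np


def histogram_equalization(image: np.ndarray, levels: int = 256) -> np.ndarray:
    """Low-level enhancement: spread intensities over the full dynamic range.

    Needs no prior knowledge of what the image shows, so it behaves the same
    on a chest X-ray as on a holiday photograph.
    """
    img = image.astype(np.int64)
    hist = np.bincount(img.ravel(), minlength=levels)   # intensity histogram
    cdf = hist.cumsum()                                  # cumulative distribution
    cdf_min = cdf[cdf > 0][0]
    denom = max(cdf[-1] - cdf_min, 1)
    lut = np.round((cdf - cdf_min) / denom * (levels - 1))
    return lut[img].astype(np.uint8)


# Example: an 8-bit image whose intensities are confined to the narrow band [100, 140].
dull = np.random.randint(100, 141, size=(64, 64), dtype=np.uint8)
equalized = histogram_equalization(dull)
```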

Problems with Medical Images

Given the definition and use of the field of biomedical science, a particular type of problem naturally becomes apparent in the high-level processing of medical images. As a result of its natural complexity, it is extremely difficult to formalize medical a priori knowledge so that it can be directly and easily integrated into image processing algorithms for the diagnosis of diseases in humans. In the literature, this is known as the semantic gap, meaning the discrepancy between the cognitive interpretation of the clinical image by the clinician (high level) and the simple structure of discrete pixels used in computer programs to represent an image (low level). There are many factors that affect the accuracy of computer-based systems for analysis and interpretation in medical image processing (Figure 8.3).


FIGURE 8.3 Factors affecting the accuracy of computer-based medical image processing.

Heterogeneity of Images

Medical images display living tissue, organs, or body parts of the human body. These images show considerable variability even when they are captured in the same manner or using the same standardized acquisition protocol. Apart from this, there are notable differences in the shape, size, and internal structure of the imaged organs among patients, and many variations are found even in the same patient as time passes. In other words, biological structures are subject to both intra-individual and inter-individual variability. Thus, formulating universally valid a priori knowledge is impossible in biomedical science.

Unknown Delineation of Objects

In most biomedical images, biological structures cannot be clearly distinguished from their background; often the entire image has to be treated as the diagnostically or clinically relevant object. Even when a certain object can be seen in a biomedical image, its segmentation is problematic, because its shape or borderline may be depicted only fuzzily or only partially. This is why medically relevant objects are often represented more abstractly, at the level of texture.

Robustness of Algorithms

The unfavorable properties of medical images, which complicate their high-level processing, place special demands on the reliability and robustness of diagnostic algorithms. When a diagnostic algorithm is deployed, the image processing algorithms running within it must fulfil the same diagnostic role that is requested in the field. Generally, the automated analysis of images in medicine must not produce miscalculations: images that cannot be processed effectively should be flagged, rejected, and withdrawn from further handling, and all images that have not been rejected must then be evaluated correctly. Furthermore, the volume of rejected images must not be too large, as most clinical imaging is not supervised and cannot simply be repeated because of image processing errors.
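A schematic sketch of this reject-rather-than-miscalculate behaviour is shown below. The noise estimator and the threshold value are purely illustrative assumptions; a real system would use a validated quality measure tuned so that the rejection rate stays acceptably low.

```python
import numpy as np


def estimate_noise(image: np.ndarray) -> float:
    """Crude noise estimate: standard deviation of a Laplacian-like residual."""
    residual = image - 0.25 * (
        np.roll(image, 1, 0) + np.roll(image, -1, 0) +
        np.roll(image, 1, 1) + np.roll(image, -1, 1)
    )
    return float(residual.std())


def triage(image: np.ndarray, noise_threshold: float = 0.15) -> str:
    """Return 'process' for images the pipeline can handle, 'reject' otherwise.

    Rejected exams are referred back for manual reading instead of risking a
    miscalculation; the threshold must be tuned so the rejection rate stays low.
    """
    return "reject" if estimate_noise(image) > noise_threshold else "process"
```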

Noise Occurrence in Image

One of the most significant difficulties is the presence of noise in medical/clinical images. There are various procedures for clinical imaging, for example, CT scanning, ultrasound, digital radiography, magnetic resonance imaging (MRI), spectroscopy, and so forth, and these imaging strategies have become progressive techniques in diagnostic radiology. The clinical images used by doctors in their examinations are prone to suffer from noise, and under these conditions detection accuracy may suffer; computers therefore apply suitable algorithms for noise removal. Mostly, the brightness of an image is considered uniform except where it changes to form the image content. The variation in brightness or contrast value is normally irregular and has no specific pattern. It can reduce the quality of the image and is particularly significant when the objects being imaged are small and have low contrast. This irregular variation in image brightness or contrast is known as noise.

Noise in digital images is an unwanted signal that corrupts the original image. Different sources may contribute to the noise, for example, errors in the image acquisition process that result in pixel values not reflecting the true nature of the scene. During acquisition, transmission, storage, and retrieval, noise may be mixed in with the original image, and a digital image that is transferred electronically is contaminated by further noise sources. Noise can also be caused by irregular fluctuations in the image signal. The noise present in a clinical image represents an extraordinary challenge for automated clinical image analysis. Some of the significant points regarding noise in clinical images are:

  • Some type of noise is contained in all medical images.
  • The noise may be of different kinds, such as a grainy or snowy appearance, texture, and so forth.
  • Various sources may contribute noise to an image.
  • No imaging technique is free from noise.
  • Image de-noising is therefore a fundamental step in all CAD (computer-aided diagnosis) frameworks.
  • High noise levels are found in nuclear medicine images.
  • Noise causes more trouble in MRI, CT, and ultrasound imaging when compared with other imaging modalities.
  • Minimum noise is present in images produced by radiography.
  • The process of removing noise is called filtering or denoising.
  • Noise additionally diminishes the visibility of specific features within the image.
  • This loss of visibility is particularly critical for low-contrast images.
  • The primary aim of image processing is to extract clear information from an image degraded by noise.

Huan et al. (2010) discussed three significant kinds of noise signals: (1) impulse noise, (2) multiplicative noise, and (3) additive noise. Noise found in digital images is often additive in nature, with uniform power across the whole bandwidth and a Gaussian probability distribution; this kind of noise is called Additive White Gaussian Noise. Multiplicative noise, in contrast, is an undesirable random signal that gets multiplied into the relevant signal during transmission or another image processing operation. A significant example of noise in clinical images is speckle noise, which is commonly seen in radar imaging; other examples are shadows due to undulations on the surface of the imaged objects, shadows cast by complex objects such as foliage, dark spots caused by dust on the lens or image sensor, and variations in the gain of individual elements of the image sensor. Multiplicative noise appears in various image processing applications, for example, synthetic aperture radar, ultrasound imaging, single photon emission computed tomography, and positron emission tomography. Hence, the removal of multiplicative noise is absolutely basic in imaging systems and image processing applications. Because speckle noise is the form most often found in clinical images, it is briefly examined here.
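The three noise types can be illustrated with a few lines of NumPy. The variances, the impulse probability, and the random test image below are arbitrary assumptions chosen only to show how additive, multiplicative, and impulse noise enter the signal differently.

```python
import numpy as np

rng = np.random.default_rng(0)
clean = rng.random((128, 128))            # stand-in for a clean image in [0, 1]

# Additive white Gaussian noise: a zero-mean Gaussian signal is added to every pixel.
awgn = clean + rng.normal(0.0, 0.05, clean.shape)

# Multiplicative (speckle-like) noise: the unwanted signal multiplies the image,
# so bright regions are disturbed more strongly than dark ones.
speckle = clean * (1.0 + rng.normal(0.0, 0.2, clean.shape))

# Impulse ("salt and pepper") noise: a random subset of pixels is driven to 0 or 1.
impulse = clean.copy()
mask = rng.random(clean.shape) < 0.02
impulse[mask] = rng.choice([0.0, 1.0], size=mask.sum())
```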

Speckle Noise

Speckle, naturally found in digital images, is basically a grainy “noise”. It degrades the quality of active radar and Synthetic Aperture Radar (SAR) images. Some of the salient features of this speckle noise are:

  • Speckle noise arises from random fluctuations in the return signal of an object in conventional radar; it is no larger than a single image-processing element.
  • Speckle noise increases the average gray level of a local area.
  • The speckle found in SAR images is very severe, causing many difficulties in image interpretation.
  • Speckle noise arises from the coherent processing of backscattered signals from multiple distributed targets.
  • It can also be caused by signals from elementary scatterers, such as gravity-capillary waves on the imaged surface.
  • There are many different methods available in the literature to remove speckle noise from medical images.
  • All these methods are based on different types of mathematical models.
  • Non-adaptive filters can be used to eliminate noise.
  • Adaptive speckle filtering is one of the better techniques to preserve edges and detail in areas of high texture (a sketch of one such filter follows this list).
  • A non-adaptive filtering process is a better option when simplicity and low computational time are required.
  • Speckle filtering is accordingly divided into these two types, adaptive and non-adaptive.
  • There are several forms of adaptive speckle filtering, including the Frost filter.
  • Speckle noise in SAR images is a multiplicative noise, in that it is present in direct proportion to the local gray level.
  • In some cases, speckle noise can also represent useful information, for example, laser speckle, where the change of the speckle pattern is a measure of surface activity.
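The sketch referred to in the list above is a Lee-style local-statistics filter, one common form of adaptive speckle filtering (the chapter names the Frost filter; the Lee variant is shown here only because it is compact, and it is not presented as the chapter’s own method). The window size and assumed noise variance are illustrative parameters.

```python
import numpy as np
from scipy.ndimage import uniform_filter


def lee_filter(image: np.ndarray, window: int = 5, noise_var: float = 0.01) -> np.ndarray:
    """Lee-style adaptive speckle filter.

    Smooths strongly in flat regions (local variance close to the noise variance)
    and hardly at all near edges and texture, which is what preserves detail.
    """
    mean = uniform_filter(image, window)                 # local mean
    mean_sq = uniform_filter(image * image, window)      # local mean of squares
    var = np.maximum(mean_sq - mean * mean, 0.0)         # local variance
    gain = var / (var + noise_var)                       # ~0 in flat areas, ~1 near edges
    return mean + gain * (image - mean)
```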

Figure 8.4 shows speckle noise as granular noise in a digital image.


FIGURE 8.4 Speckle noise as granular noise reduction in ultrasound images.

 