Posts

Showing posts from February, 2024

Leveraging Deep Learning for Enhanced Image Processing

In this blog, we'll delve into how deep learning helps tackle ill-posed problems encountered in medical imaging across healthcare and industrial contexts. From the 1960s to 2000, CT and MRI technologies experienced significant advances in spatial resolution. After 2000, the research emphasis shifted to developing low-dose CT and fast MRI techniques, which present problems that are ill-posed for traditional mathematical methods.

Understanding the Mathematical Shift in MRI and CT

Before 2000, researchers mainly concentrated on solving well-posed problems that satisfied basic requirements such as the Nyquist criterion. The basic rule is that the number of measurements ($b$) should roughly match the number of unknowns ($x$), ensuring that the forward model $A$ can be inverted. After 2000, researchers have increasingly turned their attention to tackling highly ill-posed problems using methods like sparse sensing. In this case where the numb...
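The contrast between a well-posed and an underdetermined system $Ax = b$ can be sketched with a toy example. This is a minimal numerical illustration (the matrices and sizes are made up for demonstration), not the post's actual reconstruction setup:

```python
import numpy as np

# Well-posed toy problem: as many measurements as unknowns,
# so the forward model A is square and invertible.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
x_true = np.array([1.0, 2.0])
b = A @ x_true
x_rec = np.linalg.solve(A, b)      # unique, stable recovery
print(np.allclose(x_rec, x_true))  # True

# Ill-posed (underdetermined) toy problem: 1 measurement, 2 unknowns,
# mimicking sparse sensing where measurements are fewer than unknowns.
A_under = np.array([[1.0, 1.0]])   # 1 x 2 forward model
b_under = A_under @ x_true         # single measurement
# Infinitely many x satisfy A_under @ x = b_under; the pseudoinverse
# returns the minimum-norm solution, which need not equal x_true.
x_min_norm = np.linalg.pinv(A_under) @ b_under
print(x_min_norm)                  # [1.5 1.5], not the original [1. 2.]
```

Recovering the true solution in the underdetermined case requires extra prior knowledge (e.g., sparsity), which is exactly where methods like sparse sensing, and later deep learning, come in.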

Streamlining Fetal Ultrasound Examination Workflows with Deep Learning Techniques

In obstetrics and gynecology, diagnostic ultrasound is indispensable for evaluating fetal development and health and for predicting perinatal outcomes. It enables measurements of critical fetal health indicators such as amniotic fluid volume, biparietal diameter, head circumference, and abdominal circumference. Despite its importance, the manual process of measuring these indicators is both time-consuming and prone to variability, depending on the skill level of the clinician. This has highlighted the need for a more streamlined and accurate method to extract and analyze biometric data from fetal ultrasound images, with the ultimate goal of enhancing clinical workflow efficiency and improving the consistency of fetal health evaluations. Prior to 2014, the task of automating biometric measurement extraction from ultrasound images faced significant hurdles due to common complications like signal interference, reverberation artifacts, blurred boundaries, signal attenuation, shadowing, and speckle...

Advancing Towards Clinical Application of Electrical Impedance Tomography

In the 1960s and 1970s, the development of medical imaging technologies like CT scans, MRI, and ultrasound marked a significant transition from theoretical concepts to practical clinical tools. This period saw a growing need for advanced imaging techniques to support early detection, accurate diagnosis, and effective treatment planning. The advancement in these modalities was made possible through the collaborative efforts of mathematicians, physicists, engineers, and medical professionals. This interdisciplinary approach quietly reshaped healthcare innovation, demonstrating the value of collective expertise in enhancing patient care. During the 1970s, amidst the broader evolution of medical imaging technologies, electrical impedance tomography (EIT) was proposed as a method to offer insights into the body's internal structures through the mapping of electrical conductivity distributions. EIT employs an array of electrodes to explore the relationship between currents and voltages...

Understanding Real Analysis in Mathematics: The Lebesgue Integral and Measure Theory

Reasons for developing the Lebesgue integral and measure theory

We dive into "Real Analysis," an essential mathematics course, using measure theory to show how important the Lebesgue integral is for resolving problems that predate the 20th century. In particular, the mathematical theories behind the Dirichlet principle in PDEs and the Fourier representation rely deeply on measure theory. Our exploration begins by elucidating the advantages of the Lebesgue integral over its Riemann counterpart, using a simple example of Poisson's equation in three-dimensional space: $$- \nabla^2 u(\mathbf{r}) = \rho(\mathbf{r}),\quad \mathbf{r}=(x,y,z)\in \mathbb{R}^3,$$ subject to the boundary condition that $u(\mathbf{r})$ vanishes as $\mathbf{r}$ approaches infinity. To solve this problem intuitively, we utilize the Dirac delta function, symbolized by $\delta(\mathbf{r})$, to express the solution in a convolution format: $$u(\mathbf{r}) = \iiint \underbrace{\nabla^2 \Phi(\m...
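A standard textbook illustration of the gap between the two integrals (not necessarily the example used in the full post) is the indicator function of the rationals on $[0,1]$:

```latex
% f(x) = 1 if x is rational, 0 otherwise, on [0,1].
% Riemann: every upper sum equals 1 and every lower sum equals 0,
% so the Riemann integral does not exist.
% Lebesgue: the rationals in [0,1] have Lebesgue measure zero, hence
\int_{[0,1]} f \, d\mu
  = 1\cdot\mu\bigl(\mathbb{Q}\cap[0,1]\bigr)
  + 0\cdot\mu\bigl([0,1]\setminus\mathbb{Q}\bigr)
  = 0.
```

The Lebesgue integral partitions the range rather than the domain, which is why it handles such highly discontinuous functions, and the limit theorems needed for Fourier analysis, so gracefully.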

University Education: Reflecting Life's Complexities and Challenges

Recently, a distinct trend has been observed: students receiving considerable academic and financial backing from well-educated parents frequently gain admission to elite universities and exhibit strong academic performance during their undergraduate years. However, a worrisome pattern has also surfaced: these students often struggle to achieve financial independence, relying heavily on their parents for assistance. This dependence can greatly obstruct their ability to develop the crucial survival skills needed to navigate the intricacies of life in society. In this blog post, I explore whether a university offering comprehensive support and an optimal research environment customized for students truly benefits them in the long run. The motivation behind this discussion stems from my experience visiting a university in the UAE, where I witnessed the creation of an optimal educational environment supported by significant financia...

AI as an Aid, Not a Replacement: Enhancing Medical Practice without Supplanting Doctors

In this blog, I emphasize the crucial point that AI is not a replacement for doctors. Rather, its role is to unburden doctors from the tedious tasks of keyboard input and time-consuming procedures, thereby enhancing their capabilities and allowing them to focus on what truly matters: patient care. The foremost strength of deep learning lies in its remarkable computational capacity to swiftly integrate and analyze data derived from the diagnostic decisions of numerous doctors. Essentially, deep learning models serve as rapid calculators, excelling at processing and synthesizing vast amounts of information. However, they face a significant limitation: unlike humans, these models struggle to adapt to new and evolving situations with the same intuition and flexibility. While recent advancements in generative models have showcased their ability to produce novel outputs, these innovations often amount to recombinations of existing data. Deep learning inherently lacks the human-like ab...

Paradigm Shift in Mathematics through a Deep Learning-Based Solution Prior

Most 3D medical imaging techniques, such as CT and MRI, employ voxel-based representations to provide spatial mapping of anatomical structures and tissue characteristics, offering clinical relevance and utility. In the process of image reconstruction, it is necessary to establish a correspondence between the value assigned to each voxel and the measurable quantities. For instance, in CT, the measured data takes the form of X-ray projections, which can be modeled as the Radon transform of the voxel values within the CT image. In MRI, the measured data consists of k-space data, that is, the Fourier transform of the MR image. In this context, we use the inverse of the forward operator (e.g., the Radon transform or the Fourier transform) for voxel-by-voxel image reconstruction, determining the value of each voxel from the measurement data. Consequently, to enhance image resolution, it is necessary to increase the number of measurements proportionally because, in simple term...
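The MRI forward model described here (k-space = Fourier transform of the image) can be sketched discretely with NumPy's FFT. The 64×64 square "phantom" below is a made-up stand-in for a real MR image, chosen only to keep the example self-contained:

```python
import numpy as np

# Toy "MR image": a 64x64 array with a bright square (a crude phantom).
img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0

# Forward model: the measured k-space data is the 2-D Fourier
# transform of the image (discrete analogue of the text's description).
kspace = np.fft.fft2(img)

# Fully sampled reconstruction: invert the forward operator.
recon = np.fft.ifft2(kspace).real
print(np.allclose(recon, img))  # exact recovery from full data

# Keep only every other k-space row (half the measurements): plain
# inversion no longer recovers the image -- the problem becomes
# ill-posed and aliasing appears.
undersampled = kspace.copy()
undersampled[1::2, :] = 0.0
recon_half = np.fft.ifft2(undersampled).real
print(np.max(np.abs(recon_half - img)))  # large reconstruction error
```

This is precisely the regime where fast MRI lives: with fewer measurements than voxels, simple inversion fails and additional priors (sparsity, or a learned solution prior) must supply the missing information.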

Optimizing Data Simplification: Principal Component Analysis for Linear Dimensionality Reduction

Data Matrix

Imagine a dataset containing 138 grayscale images, each with a resolution of $93 \times 70=3810$ pixels and a grayscale range extending from 0 to 255. By converting these images into vector form, we represent the dataset as $\{\mathbf{x}^{(k)}\}_{k=1}^{138}$, with each vector $\mathbf{x}^{(k)}$ residing in a portion of $3810$-dimensional discrete Euclidean space, specifically $\{0, 1, \ldots, 255\}^{3810}$. In the subsequent figure, $138$ is the total count of images, while $3810$ is the pixel count per image. This transformation yields a data matrix, denoted by $X$, composed of $138$ columns, each column uniquely mapping to an image, and $3810$ rows, which correspond to the pixel values. This organized matrix becomes a critical tool for a variety of analytical processes, such as uncovering common patterns within the images, the application ...
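The data-matrix setup and PCA step can be sketched with an SVD in NumPy. Random integers stand in for the actual grayscale images, so only the shapes and mechanics, not the numbers, match the post:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the post's data matrix X: 3810 pixels (rows) by
# 138 images (columns); random values replace the real images.
n_pixels, n_images = 3810, 138
X = rng.integers(0, 256, size=(n_pixels, n_images)).astype(float)

# Center the data by subtracting the mean image from every column.
mean_image = X.mean(axis=1, keepdims=True)
Xc = X - mean_image

# PCA via the singular value decomposition: columns of U are the
# principal components ("eigenimages"); S holds their singular values.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Project every image onto the top-k components and reconstruct.
k = 20
Z = U[:, :k].T @ Xc                   # k x 138 low-dimensional codes
X_approx = mean_image + U[:, :k] @ Z  # rank-k approximation of X

print(U.shape, Z.shape)               # (3810, 138) (20, 138)
```

Each 3810-dimensional image is thus summarized by just $k=20$ coefficients, which is the linear dimensionality reduction the title refers to.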

Vectorization in Medical Image Analysis

Vector

In the context of $n$-dimensional Euclidean vector space, a vector $\mathbf{x} = (x_1, x_2, \ldots, x_n)$ is characterized by its magnitude $\|{\bf x}\|=\sqrt{{\bf x}\cdot{\bf x}}=\sqrt{\sum_{k=1}^n x_k^2}$ and its direction, determined by normalizing $\mathbf{x}$ to the unit vector $\frac{{\bf x}}{\|{\bf x}\|}$. Additionally, the angle $\theta$ between two vectors ${\bf x}$ and ${\bf x}'$ can be determined from its cosine: $$\cos \theta=\frac{{\bf x}\cdot {\bf x}'}{\|{\bf x}\| \|{\bf x}'\|}.$$ This expression, the dot product of $\mathbf{x}$ and $\mathbf{x}'$ normalized by their magnitudes, quantifies the geometric relationship, or similarity, between the two vectors. It is a fundamental concept with widespread applications in fields including computational geometry and machine learning.

Understanding Images as Vectors

In the specialized domain of medical imaging, modalities such as Com...
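The magnitude, unit vector, and cosine-similarity formulas above can be checked numerically in a few lines. The two small vectors are arbitrary examples chosen for round numbers:

```python
import numpy as np

x = np.array([3.0, 4.0])
x_prime = np.array([4.0, 3.0])

# Magnitude: ||x|| = sqrt(x . x)
norm_x = np.sqrt(x @ x)
print(norm_x)                    # 5.0

# Direction: normalize x to a unit vector x / ||x||
unit_x = x / norm_x
print(np.linalg.norm(unit_x))    # 1.0

# Cosine of the angle between x and x' (cosine similarity)
cos_theta = (x @ x_prime) / (np.linalg.norm(x) * np.linalg.norm(x_prime))
print(cos_theta)                 # 0.96
```

In the imaging setting the post describes, the same formula applied to vectorized images gives a simple similarity score between two scans.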