Posts

AI-Supervised Home Palliative Care: A Comfort-First and Cost-Effective Alternative to Hospital-Based End-of-Life Care

This blog is based on my personal experience caring for my mother, who was receiving palliative care and, at the time, was expected to live less than two months. I am not a doctor. It reflects on how end-of-life care often brings unnecessary discomfort to patients, even when death is near. Palliative care should focus on comfort, dignity, and relief from symptoms—not on prolonging life through medical intervention. Yet hospital routines designed for safety can easily go too far. Nurses and doctors are required to follow strict protocols that call for frequent vital checks, blood tests, and continuous monitoring. Much of this stems from fear of legal liability rather than medical need. As a result, even patients in their final hours are often subjected to repeated procedures that offer no benefit but cause distress. Many remain connected to machines until their last moments. Families watch their loved ones in pain, realizing that such interventions contradict the essence of palliative care. The system ne...

Beyond the Comfort Zone: Rethinking Higher Education in the Age of AI

This piece offers a personal reflection on the relevance of today’s university system—often characterized by high costs and structural inefficiencies—in the context of AI’s growing influence on how knowledge is delivered and how research is conducted. While many of these issues have already been widely discussed, the aim here is not to revisit familiar arguments. Instead, the focus is on concerns that are less frequently addressed, particularly the inefficiencies that built up in higher education between 2000 and 2020—developments that, from some perspectives, have made university education feel increasingly ineffective, or even unnecessary. To begin, it may be helpful to consider a parallel in the world of Go (baduk). Before AlphaGo, Go education followed a traditional model: aspiring players trained in academies under the close guidance of veteran instructors. These teachers shaped their students’ progress, corrected their form, and provided psychological support during losing strea...

Physics-Informed Neural Networks: Fundamental Limitations and Conditional Usefulness

Physics-Informed Neural Networks (PINNs) aim to approximate the solution \( u \) of a differential equation defined over spatial coordinates \( x \in \mathbb{R}^d \) (e.g., \( x = (x_1, x_2, x_3) \)) and, when applicable, time \( t \), by representing \( u \) with a neural network \( u_\theta \), where \( \theta \) denotes the trainable weights and biases. Training involves minimizing a composite loss function $\mathcal{L}(\theta) = \mathcal{L}_{\text{PDE}} + \mathcal{L}_{\text{BC}} + \mathcal{L}_{\text{IC}} + \mathcal{L}_{\text{data}},$ which enforces the governing PDE, boundary conditions, initial conditions, and any available observational data. However, PINNs minimize the PDE residual only indirectly—by adjusting the neural network parameters rather than manipulating solution components or their derivatives in a controlled, explicit manner. This leads to several fundamental inefficiencies. Since the solution is represented by a neural n...
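The composite loss above can be made concrete with a deliberately simplified sketch (my own toy example, not code from the post). Here the network \( u_\theta \) is replaced by a degree-4 polynomial stand-in, so the loss \( \mathcal{L}_{\text{PDE}} + \mathcal{L}_{\text{BC}} \) for the toy problem \( u' = u \), \( u(0) = 1 \) is linear in the parameters and can be minimized in closed form; a real PINN must instead run gradient descent on nonlinear network weights, which is exactly the indirect adjustment criticized above.

```python
import numpy as np

# Hypothetical stand-in for a PINN: u_theta is a degree-4 polynomial
# u(x) = sum_k theta_k x^k, so derivatives are exact and the composite
# loss L = L_PDE + L_BC is a linear least-squares problem in theta.
# Toy problem (illustrative): u'(x) = u(x) on [0, 1], u(0) = 1, so u = e^x.

deg = 4
x = np.linspace(0.0, 1.0, 25)          # collocation points
powers = np.arange(deg + 1)

# PDE residual r(x) = u'(x) - u(x) is A @ theta, with derivative and
# value bases built column-by-column from the monomials.
A = powers * x[:, None] ** np.clip(powers - 1, 0, None) - x[:, None] ** powers

# Boundary condition u(0) = 1 contributes the single row theta_0 - 1.
bc_row = np.zeros(deg + 1)
bc_row[0] = 1.0

# Minimize ||A theta||^2 + (theta_0 - 1)^2 in closed form; a genuine PINN
# would instead iterate on network weights to shrink the same residuals.
M = np.vstack([A, bc_row])
b = np.concatenate([np.zeros(len(x)), [1.0]])
theta, *_ = np.linalg.lstsq(M, b, rcond=None)

u = lambda t: np.polyval(theta[::-1], t)   # polyval wants highest degree first
loss = np.mean((A @ theta) ** 2) + (theta[0] - 1.0) ** 2
print(f"u(1) = {u(1.0):.4f} (exact e = {np.e:.4f}), loss = {loss:.2e}")
```

Because this surrogate is linear in \( \theta \), the loss landscape is convex and solvable directly; the inefficiencies discussed in the post arise precisely because a genuine neural parameterization destroys that structure.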

Biased Warnings: Examining the Risks of Unverified AI Speculation

The motivation for this blog stems from a recent article about Geoffrey Hinton, a recipient of the Nobel Prize in Physics and a renowned figure in artificial intelligence, who once again issued an alarmist warning about AI. According to reports from foreign media outlets, including the British daily The Guardian on December 27, 2024, Hinton appeared on BBC Radio, stating, "There is a possibility that humanity will go extinct within 30 years." He estimated a 10–20% chance that AI could destroy humanity within the next three decades and predicted that powerful AI, surpassing human capabilities, could emerge within 20 years and potentially gain control over humanity. A similar pattern was observed with the late Stephen Hawking, a celebrated physicist known for his work on black holes and the Big Bang theory, who also issued extreme warnings about AI without providing sufficient evidence. While Hinton’s groundbreaking academic contributions to AI are undisputed, his consistently...

Mathematics as the Invisible Architect: Bridging Natural Phenomena and Practical Applications

Mathematics: The Invisible Driver of Civilization

Mathematics, alongside philosophy, has systematically shaped human cognitive abilities, driving the progress of human civilization. Throughout history, it has evolved to address societal needs while simultaneously advancing as an "invisible culture" through individual and collective intellectual efforts. Fundamental concepts such as distance and space have been refined with mathematical tools, enabling simplified representations of complex phenomena and fostering systematic understanding. Mathematical tools, including Maxwell's equations for electromagnetism, the Navier-Stokes equations for fluid dynamics, the elasticity equations for material properties, and the heat conduction equation, are grounded in conservation laws. They describe relationships between physical quantities over time and space and have broad applications, such as fingerprint recognition, voice analysis, data compression, medical imaging, cryptography, animation,...
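As one concrete instance of the tools named above, the heat conduction equation can be simulated in a few lines. This is an illustrative sketch of my own (not from the post), using the explicit finite-difference scheme for \( u_t = \alpha u_{xx} \) on \([0, 1]\) with zero-temperature ends; the scheme mirrors the underlying conservation law as a local flux balance between neighboring cells.

```python
import numpy as np

# 1-D heat conduction equation u_t = alpha * u_xx on [0, 1], u(0) = u(1) = 0,
# discretized with the explicit (FTCS) scheme. Illustrative toy example.

alpha, nx, nt = 1.0, 51, 2000
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
dt = 0.4 * dx**2 / alpha          # r = alpha*dt/dx^2 = 0.4 <= 0.5, so stable

u = np.sin(np.pi * x)             # initial temperature profile
for _ in range(nt):
    # Each interior cell gains heat in proportion to the net flux
    # from its two neighbors: a discrete conservation statement.
    u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])

# The single sine mode decays as e^(-alpha * pi^2 * t), which gives
# an exact solution to check the scheme against.
t_final = nt * dt
exact = np.exp(-alpha * np.pi**2 * t_final) * np.sin(np.pi * x)
err = np.max(np.abs(u - exact))
print(f"t = {t_final:.3f}, max error = {err:.2e}")
```

The comparison against the closed-form decay of the sine mode is what makes this toy problem useful: the same conservation-law structure carries over to the far richer equations listed above.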

Advantages and Limitations of Deep Networks as Local Interpolators, Not Global Approximators

This blog addresses a common misconception in the mathematics community: the belief that deep networks can serve as global approximators of a target function across the entire input domain. I write this post to emphasize the importance of understanding the limitations of deep networks' global approximation capabilities, rather than blindly accepting such claims, and to highlight how their strengths as local interpolators can be effectively leveraged. To clarify, deep networks are fundamentally limited in their ability to learn most globally defined mathematical transforms, such as the Fourier transform, Radon transform, and Laplace transform, particularly in high-dimensional settings. (I am aware of papers claiming that deep networks can learn the Fourier transform, but these are limited to low-dimensional cases with small pixel counts.) The misconception often stems from the influence of the Barron space framework, which provides a theoretical basis for function approximation. Wh...
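One way to see the local-interpolator point concretely: a ReLU network is piecewise linear, so outside the region containing its "kinks" it extrapolates along a single affine piece and therefore cannot track a bounded global object (a sine, a Fourier kernel) there, no matter what it was fit to. The sketch below uses small hand-set weights, chosen purely for illustration.

```python
import numpy as np

# A one-hidden-layer ReLU network f(x) = w2 @ relu(w1*x + b1) + b2 is
# piecewise linear, with kinks at x = -b1/w1. Here all kinks lie inside
# [-1, 1], so for x beyond them the model is exactly one affine piece.
# Weights are hypothetical, hand-set for illustration only.

relu = lambda z: np.maximum(z, 0.0)

w1 = np.array([1.0, -1.0, 2.0, -2.0])
b1 = np.array([0.5, 0.5, -1.0, -1.0])   # kinks at -0.5, 0.5, 0.5, -0.5
w2 = np.array([0.7, -0.3, 0.4, 0.2])
b2 = 0.1

def f(x):
    return w2 @ relu(w1 * x + b1) + b2

# Far to the right of every kink, equally spaced inputs give exactly
# collinear outputs: the network extrapolates linearly, so it cannot
# follow any bounded oscillatory target out there.
y = [f(t) for t in (10.0, 11.0, 12.0)]
print(y[1] - y[0], y[2] - y[1])          # identical slopes
```

Inside the data region, by contrast, the many affine pieces make such networks excellent local interpolators — which is exactly the strength the post argues should be leveraged.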

The Impact of Data-Driven Deep Learning Methods on Solving Complex Problems Once Beyond the Reach of Traditional Approaches

This blog is intended for mathematicians with limited background in physics and computational biology. Recent advancements in data-driven deep learning have transformed mathematics by enhancing—and sometimes surpassing—traditional methods. By leveraging datasets, deep learning techniques are redefining problem-solving and providing powerful tools to tackle challenges once considered impossible. This marks a new paradigm, driven by data, advanced computation, and adaptive learning, pushing the boundaries of what can be achieved. The profound impact of data-driven deep learning was recognized by the 2024 Nobel Prizes in Physics and Chemistry. The Nobel Prize in Physics honored John Hopfield and Geoffrey Hinton for their groundbreaking contributions to neural networks. Hopfield developed an early model of associative memory in neural networks, known as the Hopfield network, which is based on the concept of energy minimization. The energy function is represented by: \[E(\mathbf{...
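Since the excerpt cuts off mid-equation, here is a small self-contained sketch of the energy-minimization idea behind the Hopfield network (toy patterns of my own, not code from the post): states are \(\pm 1\), weights are Hebbian, and one standard form of the energy is \( E(\mathbf{s}) = -\tfrac{1}{2}\,\mathbf{s}^\top W \mathbf{s} \), which asynchronous updates never increase, so corrupted inputs slide into stored minima.

```python
import numpy as np

# Minimal, hypothetical sketch of a Hopfield network: binary states
# s_i in {-1, +1}, Hebbian weights, and energy E(s) = -(1/2) s^T W s.

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]], dtype=float)
n = patterns.shape[1]

# Hebbian storage: W = (1/n) * sum_p x_p x_p^T, with a zero diagonal
# so that no unit reinforces its own state.
W = patterns.T @ patterns / n
np.fill_diagonal(W, 0.0)

def energy(s):
    return -0.5 * s @ W @ s

def recall(s, sweeps=5):
    s = s.copy()
    for _ in range(sweeps):
        for i in range(n):                         # asynchronous updates
            s[i] = 1.0 if W[i] @ s >= 0 else -1.0  # each step moves downhill
    return s

# Corrupt two bits of the first stored pattern, then let the network
# relax: the energy drops and the stored pattern is recovered.
noisy = patterns[0].copy()
noisy[0] *= -1
noisy[3] *= -1
restored = recall(noisy)
print(energy(noisy), energy(restored))
```

The update rule flips each unit toward the sign of its local field, so every accepted change lowers (or preserves) the energy — the minimization principle the prize citation highlights.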