dc.contributor.advisor |
Suresh, Vinod |
|
dc.contributor.advisor |
Morgan, Catherine |
|
dc.contributor.advisor |
Maso Talou, Gonzalo |
|
dc.contributor.author |
Zambo, Katze |
|
dc.date.accessioned |
2023-02-28T20:24:24Z |
|
dc.date.available |
2023-02-28T20:24:24Z |
|
dc.date.issued |
2023 |
en |
dc.identifier.uri |
https://hdl.handle.net/2292/63006 |
|
dc.description.abstract |
Neurodegenerative diseases such as Alzheimer’s disease (AD) affect millions of people worldwide. Research on AD to date has focused on the molecular markers of the disease, the amyloid-beta and tau proteins; though billions have been spent, little progress has been made. An alternative direction for studying the causes and progression of AD focuses on the cerebrovasculature, and recent work suggests changes in cerebral blood vessel structure and haemodynamics in the Circle of Willis (CoW) in AD. One of the most powerful tools for studying cerebral blood flow is time-resolved 3D phase-contrast Magnetic Resonance Imaging (commonly called 4D flow MRI), which acquires velocity fields in three spatial dimensions and one time dimension. However, it suffers from low resolution and high noise. To address these limitations, we modify an existing super-resolution artificial neural network (4DFlowNet) to upsample and denoise 4D flow MRI data acquired from the cerebral vasculature. We present a pipeline from in vivo MRI to network training data. The CoW is first segmented from acquired time-of-flight MRI data and then converted into a mesh. Ground truth data for training the neural network are generated by running steady-state and transient computational fluid dynamics (CFD) simulations of blood flow on the mesh using realistic physiological boundary conditions. The CFD velocity field is downsampled and noise is added to generate the input dataset for network training. Using only a velocity mean-squared-error (MSE) loss function, the network achieved validation and benchmark accuracies of 95.15% and 95.39%, respectively. We develop a physics-informed neural network (PINN) extension that enforces mass conservation in the network output, achieving 97.06% validation and 97.14% benchmark accuracy in training. In addition, the original 4DFlowNet network with velocity MSE and a “gradient loss” term gave 97.53% validation accuracy and 97.52% benchmark accuracy. All three networks were used to denoise and upsample 4D flow MRI data from two human subjects, and comparisons between the raw data and network outputs were made at three locations: the internal carotid artery, the basilar artery, and the middle cerebral artery. In all cases the velocity magnitudes output by the networks were lower (0.00 - 0.12 m/s) than those in the raw data (0.13 - 0.62 m/s) over one cardiac cycle. In conclusion, a pipeline was developed to train and validate an artificial neural network with the aim of denoising and upsampling 4D flow MRI data in the CoW. The workflow allows straightforward and transparent customisation of CFD simulation inputs, such as boundary conditions, and of network properties, such as loss functions. While future work is required to understand the discrepancy between the raw in vivo data and the network outputs, once resolved, the workflow developed is likely to improve the quality of cerebral haemodynamic measurements. |
|
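The two key data-handling ideas in the abstract above, degrading a clean CFD velocity field into a low-resolution noisy training input, and penalising violations of mass conservation in a PINN-style loss, can be sketched as follows. This is a minimal illustrative sketch, not the thesis's actual implementation: the function names, the block-averaging downsampler, the additive Gaussian noise model, and the central-difference divergence are all assumptions made for illustration.

```python
import numpy as np

def downsample_and_add_noise(field, factor=2, noise_std=0.05, rng=None):
    """Coarsen one 3D velocity component by block averaging, then add
    Gaussian noise, mimicking the generation of low-resolution, noisy
    network inputs from a clean CFD field (noise model is an assumption)."""
    rng = np.random.default_rng(rng)
    # Trim so each axis divides evenly by the downsampling factor.
    nx, ny, nz = (s // factor * factor for s in field.shape)
    f = field[:nx, :ny, :nz]
    low = f.reshape(nx // factor, factor,
                    ny // factor, factor,
                    nz // factor, factor).mean(axis=(1, 3, 5))
    return low + rng.normal(0.0, noise_std, low.shape)

def divergence_penalty(vx, vy, vz, dx=1.0):
    """Mean squared divergence of a velocity field on a uniform grid.
    For incompressible flow, div(v) = dvx/dx + dvy/dy + dvz/dz should
    vanish, so this term can be added to an MSE loss as a PINN-style
    mass-conservation constraint."""
    div = (np.gradient(vx, dx, axis=0)
           + np.gradient(vy, dx, axis=1)
           + np.gradient(vz, dx, axis=2))
    return float(np.mean(div ** 2))
```

In a training loop, the total loss would be something like `mse + weight * divergence_penalty(...)`, with the weight balancing data fidelity against the physics constraint; a uniform (divergence-free) field incurs zero penalty.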
dc.publisher |
ResearchSpace@Auckland |
en |
dc.relation.ispartof |
Masters Thesis - University of Auckland |
en |
dc.relation.isreferencedby |
UoA |
en |
dc.rights |
Items in ResearchSpace are protected by copyright, with all rights reserved, unless otherwise indicated. |
|
dc.rights.uri |
https://researchspace.auckland.ac.nz/docs/uoa-docs/rights.htm |
en |
dc.rights.uri |
http://creativecommons.org/licenses/by-nc-sa/3.0/nz/ |
|
dc.title |
Cerebrovascular Flow Analysis From 4D Flow MRI |
|
dc.type |
Thesis |
en |
thesis.degree.discipline |
Engineering |
|
thesis.degree.grantor |
The University of Auckland |
en |
thesis.degree.level |
Masters |
en |
dc.date.updated |
2023-01-06T00:39:28Z |
|
dc.rights.holder |
Copyright: the author |
en |
dc.rights.accessrights |
http://purl.org/eprint/accessRights/OpenAccess |
en |