Background Routine assessment of lower urinary tract symptoms (LUTS) includes standard uroflowmetry (UF), which is labour- and equipment-intensive to perform, and stressful and unnatural for patients. An ideal test should be accurate, repeatable, affordable and portable.
Objective To evaluate the accuracy of a machine-learning (ML)-augmented audio-uroflowmetry (AF) algorithm in predicting urinary flows.
Subjects and methods This pilot study enrolled 25 healthy men without LUTS, who were asked to void into a gravimetric uroflowmeter. A smartphone recorded the voiding sounds simultaneously. Paired uroflow and audio parameters were used to train an ensemble ML model to predict urinary flows from voiding sounds. Pearson’s correlation coefficient was used to compare UF with AF values. Statistical significance was defined as p<0.05.
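The abstract does not specify the audio features or ensemble architecture used, so the following is only an illustrative sketch, in Python, of the general approach described: segmenting a voiding recording into short frames, extracting simple acoustic features (RMS energy and zero-crossing rate are assumptions here, not the study's features), and fitting an off-the-shelf ensemble regressor to paired per-frame flowrates.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def frame_audio(signal, sr, frame_s=0.1):
    """Split a mono audio signal into non-overlapping frames of frame_s seconds."""
    n = int(sr * frame_s)
    usable = (len(signal) // n) * n
    return signal[:usable].reshape(-1, n)

def frame_features(frames):
    """Illustrative per-frame features: RMS energy and zero-crossing rate."""
    rms = np.sqrt((frames ** 2).mean(axis=1))
    zcr = (np.diff(np.sign(frames), axis=1) != 0).mean(axis=1)
    return np.column_stack([rms, zcr])

# Synthetic stand-in for a 5 s voiding recording: louder frames ~ higher flow.
rng = np.random.default_rng(0)
sr = 8000  # assumed sampling rate
signal = rng.normal(0, 1, sr * 5) * np.linspace(0.1, 1.0, sr * 5)

frames = frame_audio(signal, sr)          # shape: (50, 800) at 0.1 s frames
X = frame_features(frames)
y = X[:, 0] * 30 + rng.normal(0, 0.5, len(X))  # hypothetical flowrate (mL/s)

# Ensemble regressor mapping audio features -> per-frame flowrate.
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
pred = model.predict(X)
```

In the study, the target flowrates for training would come from the gravimetric uroflowmeter recorded simultaneously with the smartphone audio.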
Results A total of 52 voiding sessions were captured, of which n=35 were used for training and n=17 for testing the algorithm. Each voiding session was divided into 0.1 s frames, resulting in >300 analysable datapoints per session. Pearson’s coefficients showed strong correlations for flowtimes (r=0.96, p<0.0001), voided volumes (r=0.83, p<0.0001) and average flowrates (r=0.70, p=0.0019), and moderate correlation for maximal flowrate (r=0.69, p=0.0022). AF-predicted flow patterns showed good agreement with UF tracings. The main limitations were the small participant sample size and the use of a single smartphone type.
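The reported agreement statistics are standard Pearson correlations between paired UF and AF parameters. A minimal sketch, using hypothetical paired voided volumes (the numbers below are invented for illustration, not the study's data):

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-session voided volumes (mL): gravimetric UF vs audio-predicted AF.
uf_volume = np.array([310.0, 245.0, 402.0, 188.0, 355.0, 290.0, 270.0, 330.0])
rng = np.random.default_rng(1)
af_volume = uf_volume + rng.normal(0, 25, uf_volume.size)  # simulated AF error

# Pearson's r and its two-sided p-value.
r, p = pearsonr(uf_volume, af_volume)
```

A significance threshold of p<0.05, as defined in the methods, would then be applied to each parameter's p-value.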
Conclusions ML-augmented AF can predict uroflow parameters with good accuracy and may be a viable alternative to standard UF. Further work is needed to develop this platform for use in real-life conditions and across genders.
- machine learning
- artificial intelligence
- lower urinary tract symptoms