By World Healthcare Journal
As genomics, digital medicine, artificial intelligence, and robotics change the way healthcare is delivered across the world, the need to train a digitally native workforce across our system becomes crucial. This change was recently highlighted in the 2018 Topol review, which explained how the NHS needs to reshape ‘its workforce to deliver a digital future’. It was further underlined at a student conference delivered at UCL by Dr Keith Grimes, clinical innovation director at Babylon Health, who stated that the future of healthcare requires the building of ‘digital doctors’, and that the most efficient way to achieve this is to start at the beginning, by teaching medical students the basics of coding.
Achieving ‘digital nativeness’, however, poses a difficulty: medical school curricula are already dense, and choosing what to change is a difficult task. For most young medical students, building a digital skillset therefore requires finding time outside the curriculum to learn and practise. This learning environment is usually unstructured, which in turn can lead students to lose motivation and abandon the subject. So why is it worth the effort?
Today, over 56 per cent of Americans monitor their health with at least one digital health data collection tool (2019 ResMed survey), and this number is rising. This growth translates into an exponential rise in medical data, and therefore in companies and people who aim to use this data to develop a better system. In recent years, we have seen an increase in AI recommendation technologies (30 per cent of the EMJ’s top medical start-ups are focused on AI and diagnostics), which help both the general public and doctors evaluate disease states and reach diagnoses.
However, these technologies are often still young and are yet to be fully validated. Modern doctors will need to be able to critically evaluate the technologies they use and recommend. Moreover, patients need to feel that their doctors fully comprehend the technologies they are handling. Handling AI responsibly is only possible if medical professionals develop a more robust understanding of these technologies. A simple way for students to gain a better knowledge of this field is to let them experiment with the technologies themselves in a structured and supervised way. This would, however, require students to learn how to code. Learning directly by doing is a less common method of teaching in medical education, but it is widely used in computational fields.
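To make the ‘learning by doing’ idea concrete, a student’s first coding exercise might be as small as the sketch below: a few lines of Python that flag heart-rate readings outside a reference range. All names, data, and thresholds here are illustrative examples, not clinical guidance or any particular product’s method.

```python
# A toy first exercise for a medical student learning to code:
# flag resting heart-rate readings that fall outside a reference range.
# The range and the sample data are illustrative, not clinical guidance.

NORMAL_RANGE = (60, 100)  # assumed adult resting heart rate, beats per minute


def flag_readings(readings, low=NORMAL_RANGE[0], high=NORMAL_RANGE[1]):
    """Return the readings that fall outside the [low, high] range."""
    return [bpm for bpm in readings if bpm < low or bpm > high]


# A week of hypothetical wearable data
week = [72, 68, 110, 75, 58, 81, 64]
print(flag_readings(week))  # prints [110, 58]
```

A small exercise like this teaches the same habits (inspecting data, defining a rule, checking the output) that underpin the far larger models used in diagnostic AI.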
In addition, it is essential that modern doctors are not only passive users of digital health technologies but also actors in their development. The growth and transformation of medical technology cannot be shaped exclusively by digital companies, and more specifically by computer scientists or mathematicians, because they are not the ones who will be using the products with patients. Medical professionals need to play a central role in healthcare AI, helping throughout the process to design, build, train and test the products developed. The involvement of doctors alongside scientists is crucial to ensure that AI stays beneficial to patients and remains ethical.
As the years unfold, the role of the digital doctor slowly becomes more definite in the healthcare sector. What once seemed innovative becomes fundamental. Today, we cannot imagine medicine or medical education without learning to use a stethoscope or to read an x-ray, and as digital health becomes more prevalent, the chances that doctors will, in one way or another, be confronted by coding languages increase. In the same way that all doctors learn the fundamentals of cardiology, neurology and orthopaedics without necessarily specialising in those fields, so should they learn the fundamental knowledge associated with digital medicine.
Erwann Le Lannou
#whjfeature #whjdigitalhealth #whjeducation #whjmedicaltechnology