
An overview of augmented and virtual reality applications in radiotherapy and future developments enabled by modern tablet devices

Published online by Cambridge University Press:  15 August 2013

F. Cosentino*
Affiliation:
Medical Physics Department, North Wales Cancer Treatment Centre, Betsi Cadwaladr University Health Board, Bodelwyddan, UK
N. W. John
Affiliation:
School of Computer Science, Bangor University, Bangor, UK
J. Vaarkamp
Affiliation:
Medical Physics Department, North Wales Cancer Treatment Centre, Betsi Cadwaladr University Health Board, Bodelwyddan, UK
*
Correspondence to: F Cosentino, North Wales Cancer Treatment Centre, Medical Physics Department, Glan Clwyd Hospital, Bodelwyddan LL18 5UJ, UK. Tel: +44 (0)1745 445113. E-mail: Francesco.Cosentino@Wales.NHS.uk

Abstract

Purpose

We review augmented (AR) and virtual reality (VR) applications in radiotherapy as found in the scientific literature and highlight future developments enabled by the use of small mass-produced devices and portability of techniques developed in other fields to radiotherapy.

Analysis

The application of AR and VR within radiotherapy is still in its infancy, with the notable exception of training and teaching applications. The relatively high cost of equipment needed to generate a realistic 3D effect appears to be one factor that has slowed adoption; another is that the sheer amount of image data is itself relatively recent, and radiotherapy professionals are only beginning to explore how to use it to its full potential. This increased availability of 3D data will drive the application of AR and VR in radiotherapy, helping professionals efficiently recognise and extract key features in the data to act on in clinical decision making.

Conclusion

The development of small mass-produced tablet devices coming on the market will allow the user to interact with computer-generated information more easily, facilitating the application of AR and VR. The increased connectivity enabling virtual presence at remote multidisciplinary team meetings heralds significant changes to how radiotherapy professionals will work, to the benefit of our patients.

Type
Literature Reviews
Copyright
Copyright © Cambridge University Press 2013 

Introduction

Radiotherapy is the treatment of disease, usually cancer, with ionising radiation. It is a complex process with traditionally two distinct phases: treatment planning and treatment delivery. In modern radiotherapy, large amounts of information are available from different sources such as diagnostic CT, MR, PET or SPECT, radiotherapy planning CT, and images acquired during and sometimes after treatment providing information on inter-fraction changes. Imaging modalities like 4D CT intrinsically come with temporal information on intra-treatment changes, and the image data acquired over time is in itself a source of temporal information with the potential for quantifying and monitoring treatment progress and outcomes. The increased imaging capability and computational power of recent years have also led to a blurring of the two radiotherapy phases, where treatments are modified or adapted as information acquired during treatment becomes available. This paradigm shift is often referred to as adaptive radiotherapy or image guided radiotherapy (IGRT).

The wealth of information made available by the development of imaging modalities needs to be understood and requires efficient processing by the treatment team to deliver high quality care. This is where computer graphics (CG) and advanced information visualisation techniques become useful.[1] Whereas CG is already an integral part of the planning and treatment process, in the clinical setting information is generally presented in 2D, from different aspects, in sequence, in different sub-windows, or using simple overlays. A notable exception is teaching and training applications, where immersive virtual reality (VR) has taken off. To concentrate on clinical developments and applications, we exclude training from further discussion here and refer instead to two recent overviews.[2, 3]

Whereas a VR system is a device that creates an entirely computer-generated virtual scene, when computer-generated graphics content is mixed with a direct view of the real scene we speak of augmented reality (AR). Such systems often generate a 3D perception for the user, but more generally the perception of the scene can also involve other senses, for example touch through haptic force feedback. Two approaches to generating 3D perception are possible: holographic techniques and stereoscopic methods. In the holographic approach the 3D element is implemented in the image display mechanism itself (e.g., the physical visualisation of image voxels is generated at different depths inside the device). With the stereoscopic approach the 3D perception is achieved by combining two different views of the scene from the two positions corresponding to the observer's left and right eyes, so that each eye can see only its own view. The human brain then generates a spatial perception of the scene from these two views. In recent years many groups have been working on 3D displays in various fields, including the military, industry, medicine, computer games and entertainment. Despite the relatively high cost of this type of equipment, work has been done to investigate the benefits of AR at different stages of the radiotherapy treatment process.
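
As an illustration of the stereoscopic approach, the following minimal sketch (in Python, with an assumed viewing geometry; the helper name and interpupillary distance are illustrative, not taken from any reviewed system) computes the two eye positions from which the left and right views of a scene would be rendered.

```python
import numpy as np

def stereo_eye_positions(eye, target, up, ipd=0.065):
    """Return the left/right eye positions, separated horizontally by the
    interpupillary distance (ipd, metres), both looking at `target`."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    forward = target - eye
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, up)           # horizontal axis of the view
    right /= np.linalg.norm(right)
    offset = right * (ipd / 2.0)
    return eye - offset, eye + offset       # left eye, right eye

# Render the scene once from each position and present each image to the
# corresponding eye (e.g., via interlacing or polarising filters); the brain
# fuses the two views into a single spatial percept.
left, right = stereo_eye_positions([0, 0, 1], [0, 0, 0], [0, 1, 0])
```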

AR as a discipline, with many applications outside of medicine, aims to create systems in which the user perceives computer-generated content as physically present in the scene. In computer applications development, AR represents a new user interface paradigm in which headsets, handheld or wearable devices enable the user to move in the real environment while receiving continuous information from computer systems. In particular, a new generation of mass-produced, widely available and relatively low cost tablet devices shows potential to change how we interact with clinical data in the radiotherapy setting.

In this paper, we present a review of AR and VR applications currently being investigated for use in radiotherapy. Also reviewed are AR and VR developments outside the radiotherapy domain where there appears to be an application in radiotherapy. Of particular interest is the use of small mass-produced, often handheld devices as interface tools, which are increasingly incorporated into commercially available clinical equipment.

Treatment planning

The primary goal of radiotherapy treatment planning is to design a set of ionising radiation beams that deliver high doses to the tumour while minimising dose to healthy tissue and vital organs. Large amounts of detailed patient anatomy and functional data are acquired during the planning phase from different imaging modalities such as CT, MRI, PET and SPECT. With dedicated computer simulation software these data are used to iteratively determine the optimal radiation beam orientations and beam shapes, and the resulting radiation dose distribution in the patient around the target volume. The efficient design and final choice of the optimal treatment plan remains a non-trivial task, as the radiotherapy professional needs to understand and be able to visualise the dose coverage of anatomical structures in three dimensions, and increasingly four dimensions if reproducibility and time effects are included. Accurate and informative 3D visualisation is therefore required for intuitive and quick evaluation of competing plans. Although the 2D monitors used in conventional radiotherapy planning systems allow 3D images of the dose distribution to be displayed together with patient anatomy data, typically only 2D flat surfaces are used. Treatment planners therefore usually view patient anatomy slice-by-slice, in three orthogonal planes (axial, sagittal and coronal), showing images from different modalities side-by-side or using relatively simple overlays.
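
As a toy illustration of the kind of objective such iterative optimisation minimises, the sketch below scores a dose distribution against a target prescription and an organ-at-risk limit. All names, weights and limits are illustrative assumptions, not any planning system's API.

```python
import numpy as np

def plan_penalty(dose, target_mask, oar_mask, prescription, oar_limit,
                 w_target=1.0, w_oar=0.5):
    """Quadratic underdose penalty in the target plus overdose penalty in an
    organ at risk; `dose` is a 3D array in Gy, masks are boolean arrays."""
    underdose = np.clip(prescription - dose[target_mask], 0, None)
    overdose = np.clip(dose[oar_mask] - oar_limit, 0, None)
    return w_target * np.mean(underdose**2) + w_oar * np.mean(overdose**2)

# Synthetic example: a random dose grid with cubic target and OAR regions.
dose = np.random.uniform(0, 70, size=(50, 50, 50))
target = np.zeros_like(dose, dtype=bool); target[20:30, 20:30, 20:30] = True
oar = np.zeros_like(dose, dtype=bool); oar[35:45, 20:30, 20:30] = True
print(plan_penalty(dose, target, oar, prescription=60.0, oar_limit=40.0))
```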

In radiotherapy, the first use of a stereoscopic display was reported in 1997 by Hubbold et al.,[4] who coupled an autostereoscopic display to a direct volume rendering algorithm. Two sets of preliminary experiments investigated whether subjects could achieve better depth judgements with stereoscopic images than with monoscopic ones, and explored the discomfort caused by aliasing with low-resolution images. Aliasing is a mathematical effect of signal sampling and reconstruction, leading to the appearance of artefacts on digital images reconstructed from under-sampled images. With 2D images the only effect of aliasing is the presence of artefacts, but with stereoscopic images, aliasing artefacts can be present in only one of the two views that have to be combined to give the perception of depth, and this can result in user discomfort.

The authors classify their results as preliminary, but observe that they do demonstrate an overall advantage of stereoscopic over monoscopic viewing of transparent images generated by direct volume rendering. Applied to radiotherapy data, the technique showed an observable improvement in the perceived depth of the image. Their results also showed stereo visualisation to have no benefit in a number of cases, and the authors state that nothing in the way the visualisation was implemented clearly explains this. They postulate that subtle differences in shading on the surfaces may be more important than the stereoscopic disparities in the difficult cases. Judging from the comfort ratings, the results agree with evidence from other studies that the effects of spatial aliasing may, to some extent, be ignored by users when interpreting stereo images.

Use of an autostereoscopic display for robotic radiosurgery planning was described by Schlaefer et al.[5] An autostereoscopic display from SeeReal Technologies GmbH (Dresden, Germany) was used. The two different views of the scene required for stereoscopic viewing were vertically interlaced in the 2D display. To generate the user's 3D perception, a mask of beam splitters is superimposed onto the display, allowing two different views from two different positions, each corresponding to the observer's left and right eyes.

Treatment plans for robotic radiosurgery consist of a large number of beams directed towards the target volume. Software to visualise the resulting 3D dose distribution and the beam directions was implemented using the Visualization Toolkit (VTK).[6] A hypsometric colour scheme was used to identify hot and cold spots in the target volume (i.e., regions of high and low dose, respectively). An existing treatment plan with 1,200 beams for an intracranial tumour was projected onto the autostereoscopic display to assess the spatial extent of hot and cold regions along with the orientation of the beams. Based on the visual information obtained from the 3D visualisation, 20 beams were manually added to the existing plan in such a way that a large number of cold voxels were hit but hot voxels were avoided, helping to reduce dose to hot spots and increase dose to cold spots. An inverse planning algorithm was used to re-optimise the plan and the result was compared with the original plan. The original plan consisted of 119 weighted beams with a total of 21,763·3 MU, while the plan obtained after adding 20 beams and optimising to discard the less efficient beams consisted of 123 beams requiring 21,610·7 MU. The manually added beams were all retained with maximum weight by the optimiser. It was concluded that the visualisation tool was useful in guiding optimal beam placement.
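
The hot/cold classification underlying such a hypsometric colour scheme can be expressed in a few lines. The sketch below is a minimal illustration with assumed 107%/95% thresholds, not the authors' implementation.

```python
import numpy as np

def hot_cold_voxels(dose, target_mask, prescription, hot=1.07, cold=0.95):
    """Flag target voxels above `hot` x prescription (hot spots) and below
    `cold` x prescription (cold spots); dose in Gy, masks are boolean."""
    hot_mask = target_mask & (dose > hot * prescription)
    cold_mask = target_mask & (dose < cold * prescription)
    return hot_mask, cold_mask

# Synthetic example: a 3D dose grid and a cubic target volume.
dose = np.random.normal(60.0, 3.0, size=(40, 40, 40))
target = np.zeros_like(dose, dtype=bool)
target[15:25, 15:25, 15:25] = True
hot, cold = hot_cold_voxels(dose, target, prescription=60.0)
print(hot.sum(), cold.sum())  # these masks map to the colour ramp's extremes
```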

An immersive VR simulation environment, RTStar (University of Hull, UK), complemented with software enhancing the visualisation and simulation through 3D stereoscopic data projection and geometric volume analysis, showed benefits for optimising beam orientations in axial 7-field prostate intensity-modulated radiotherapy (IMRT) plans.[7] For eight existing prostate IMRT plans the beam geometry was further improved. In the 3D environment most beam angles were modified, achieving better dose homogeneity in the target area (1·9% reduction in global maximal dose). Rectal and bladder doses were also reduced, with 2·3% and 12·9% reductions in maximum dose, respectively. The authors also emphasised that the 3D stereoscopic viewing eliminated the risk of designing a plan that could not be delivered because of a gantry collision with the patient.

The first system to integrate volumetric 3D visualisation with treatment planning in a true 3D planning system was described in three presentations.[8–10] The system combined two commercially available components: the Perspecta Volumetric Display System (Actuality Systems, Bedford, MA, USA) and the Philips Pinnacle3 Treatment Planning System (Philips Medical Systems, Madison, WI, USA). The Perspecta volumetric display (Figure 1) works by projecting a sequence of 2D images onto a swiftly rotating omnidirectional diffuser screen enclosed in a polycarbonate resin dome.

Figure 1 The Perspecta System volumetric display. Reproduced with kind permission of the American Institute of Physics from Gong et al.[10]

The treatment plans could be easily transferred between Pinnacle and Perspecta, using Perspecta for display and modification and Pinnacle for dose calculations. To assist the radiation oncologist during the review of treatment plans, the calculated dose distribution could be rendered in a volumetric 3D display (Figure 2), where anatomical information is visible in a more natural and efficient way than on 2D monitor screens. This enables treatment planners to create complex beam arrangements faster than with 2D monitor screens. In conventional planning the planner iteratively modifies and reviews the 3D beam geometry in relation to organs at risk (OAR) and planning treatment volumes on a 2D screen with 2D image views, which requires manipulation of the 3D image with a pointing device (mouse, trackpad, trackball, etc.); with the Perspecta display, the planner only needs to move around the display to change their point of view.

Figure 2 Perspecta autostereoscopic display of images transferred from Pinnacle planning system. (a) Region of interest can be tagged with different colours. In the original article right and left lungs are in green and purple respectively. Tumour is bright green in right lung. Nodal disease is in blue. (b) CT-based image. In the original article different colours are assigned to specific ranges of CT densities (e.g. purple is assigned to bone). Reproduced with kind permission of American Institute of Physics from Gong et al.[11]

Quality assurance of the system was considered by Gong et al.[11] Doses at sampled points were checked and found consistent with Pinnacle within 1% or 1 mm. The 3D spatial display of images, contours and dose distributions exported from Pinnacle to Perspecta was consistent with the Pinnacle display, and distances measured by the 3D ruler in Perspecta agreed with Pinnacle. A clinical evaluation was reported in 2009[12] with data from 46 patients: 12 brain, ten lung and 11 abdomen/pelvis cases, together with 13 patients from a pilot study. Perspecta plans were considered better in terms of reduced dose to OAR in 28 patients (61%). Lower doses were delivered to critical organs: 34% to the optical chiasm, 17% to the bladder, 10% to the liver, 30% to the kidney and 40% to the lungs. Surprisingly, in 14 patients (30%) Perspecta plans were worse than corresponding plans produced on a conventional planning system, and equivalent in four patients. This was attributed to volumetric 3D planning tools not yet being fully developed and to the treatment planners being less familiar with the operation of the Perspecta 3D system than with the conventional planning system. The observation of unfamiliarity with the system does emphasise the need for intuitive user interfaces to effectively process and absorb large amounts of data. Despite this, it was claimed that oncologists' evaluation of plans using 3D visualisation was more efficient than using 2D visualisation, because all plan information (target coverage, normal tissue sparing and the locations of hot or cold spots) from all CT slices was available simultaneously. Acceptance and quality assurance aspects and the accuracy and consistency of presenting dose information on Perspecta were also considered. It was suggested that the Perspecta display software (PerspectaRad) could be improved with the ability to commission the display for the user's specific treatment machine, to include treatment machine limits.

A VR system for the evaluation of treatment plans was developed by Patel et al.[13] This was installed and networked in a radiotherapy conference room at Haukeland University Hospital. Data were exported from the planning system and fed into the VR application for visualisation. The VR environment consisted of a passive stereo setup made of a semi-rigid back projection screen (BARCO Pas-Cad) and two overlapping LCD projectors (BARCO SXGA 3000 ANSI). Selective views for the right and left eye were implemented by using circular polarisation filters on the projectors, matched in the user's glasses. The software ran on an upgradable, low cost and powerful PC graphics system. By using a 2D transfer function, CT and dose data were combined so that the dose at the surface of outlined or segmented structures could be rendered with good quality graphical results (Figure 3). In their paper on the clinical evaluation, the authors concluded that the adopted hardware solution was well suited for collaborative multi-disciplinary team sessions: users can see each other and the data simultaneously, and the degree of nausea that may appear when working in VR is tolerable. The system also provides the opportunity to inform patients about the procedure they are going through more effectively than traditional verbal explanations. Since the framework this system is built on already supports tracking, the authors point out that it is only a question of availability of resources to have it working in a completely immersive environment. The authors state that a user study quantifying the hypothesised advantage of VR (compared with the existing planning software) would be needed to further explore the potential of this software.

Figure 3 (a) Visualisation of the dose distribution on the surface of a selected CT data volume by making all but the lowest values of the transfer function opaque. (b) Visualisation of the dose distribution on the bony structures is achieved by making areas of high CT values opaque and areas of low CT values transparent. Reproduced with kind permission of Elsevier from Patel et al.[13]
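
A minimal sketch of a 2D transfer function in this spirit is given below; the specific RGBA mapping and thresholds are assumptions for illustration, not Patel et al.'s actual function.

```python
import numpy as np

def transfer_function(ct, dose, ct_bone=300.0, dose_max=70.0):
    """Map CT numbers (HU) and dose (Gy) to RGBA: bone-like voxels are made
    opaque, and colour encodes the local dose on those surfaces."""
    rgba = np.zeros(ct.shape + (4,))
    t = np.clip(dose / dose_max, 0.0, 1.0)
    rgba[..., 0] = t                                   # red rises with dose
    rgba[..., 2] = 1.0 - t                             # blue falls with dose
    rgba[..., 3] = np.where(ct > ct_bone, 1.0, 0.05)   # opacity from CT value
    return rgba

# Example on a synthetic slice; the result would feed a volume/slice renderer.
ct = np.random.uniform(-1000, 1500, (32, 32))
dose = np.random.uniform(0, 70, (32, 32))
rgba = transfer_function(ct, dose)
```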

Butler et al.[14] investigated the impact on radiation oncologists' decision making of presenting information in 2D, 3D and stereoscopic visualisation. Stereoscopic visualisation was implemented on an Apple workstation (8-core 2·8 GHz processors, 16 GB RAM, NVIDIA Quadro FX 5600 1·5 GB stereo 3D dual link DVI graphics card) and a 24-inch stereoscopic monitor (Planar Systems Inc.) using Osiris software. Plans for ten patients with head and neck carcinoma generated on Pinnacle (Philips) and TomoTherapy (TomoTherapy Inc.) planning systems were evaluated in 2D, 3D and stereoscopic visualisation by three radiation oncologists. The clinicians were asked whether the decision-making process changed as the display progressed from 2D, to 3D, to stereoscopic visualisation. The information provided by stereoscopic visualisation of the relationship of the target to the normal structures, with visualisation of isodose curves with depth perception, was considered clinically significant by the radiation oncologists in all ten cases. Stereoscopic visualisation did not result in changing the dose constraints for any of the plans, although the 3D display provided added assurance that the plans were safe and clinically acceptable. The authors report that in their department head and neck cancer cases are now routinely reviewed with stereoscopic visualisation.

Treatment delivery

Radiotherapy is typically given in daily fractions delivered over a number of weeks. The main challenge is to ensure that, for each fraction of the treatment, the dose delivered to the patient is as close as possible to the planned dose, taking into account body and organ variations. These occur for a host of reasons, of which internal motion because of breathing and tumour shrinkage due to treatment are examples. Checking patient positioning is traditionally based on laser alignment with skin markers and treatment room imaging for verification. More recently, diagnostic quality images acquired in the treatment room immediately before treatment have become available for treatment machines equipped with kV imaging panels and cone beam CT facilities. In routine clinical practice, 10–15 minute time slots are typically scheduled daily for each patient's treatment. To maintain this workflow, the increasing wealth of imaging information made available by new technologies immediately before and during treatment needs to be quickly processed by the treatment team. AR techniques have much potential in this respect: they can help make optimal use of the set-up verification images, improve the accuracy of patient positioning and speed up positioning decisions, enabling fast but accurate treatment delivery.

Deutschmann et al.[15] developed a system that enables an overlay of inner structures delineated on CT data (target volumes and OAR) and field boundaries on the X-ray plane in real time (i.e., while fluoroscopy is performed, Figure 4).

Figure 4 Fluoroscopy images matched to projection of outlined structures. Reproduced with kind permission of Springer from Deutschmann et al.[15]

The simultaneous display of computer graphics imagery and real material is used to correct patient positioning errors. More precisely, 3D structures that are not visible in the fluoroscopy because of missing soft-tissue contrast are projected onto the current X-ray image. Setup deviations between volumetric imaging and simulation were considered for 701 patients. The results of patient position adjustments based on the overlay of CT data and fluoroscopy images were superior to those based on conventional registration of digitally reconstructed radiographs and electronic portal images. Applying the fast planar imaging technique and 2D–3D registration, translation errors could be corrected; a fast way to easily track rotations on planar images is still to be found.

A method for AR-facilitated patient set-up was proposed by Talbot et al.[16] in a pilot study using an anthropomorphic phantom. The 3D external body contour was obtained from planning CT data. With the phantom positioned on the treatment couch, the 3D body contour from the planning CT was superimposed onto a real-time video image of the phantom, using AR tracking software (Figure 5). An operator could view the monitor placed outside the treatment room and visually confirm correct positioning throughout set-up and treatment. The performance of the system was investigated by using it to position an anthropomorphic phantom without the aid of additional set-up methods. The translational set-up errors were <2·4 mm and the rotational errors <0·3°. These results demonstrated the feasibility of using AR for patient positioning. The authors state that the developed technique needs further investigation before clinical use.

Figure 5 Body contour from CT scan (grey virtual image) and patient's true image (red anthropomorphous phantom). Reproduced with kind permission of Springer from Talbot et al.[16]
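
The core of such an overlay is projecting the CT-derived body contour into the camera frame. The sketch below illustrates this step with OpenCV's projectPoints; the camera intrinsics, pose and contour points are placeholders, and the tracking that supplies the camera pose is assumed to exist.

```python
import numpy as np
import cv2

# Contour points from the planning CT, in treatment-room coordinates (mm).
contour_mm = np.array([[0.0, 0.0, 1000.0],
                       [100.0, 0.0, 1000.0],
                       [100.0, 200.0, 1000.0]])

camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])   # assumed camera intrinsics
dist_coeffs = np.zeros(5)                      # assume no lens distortion
rvec = np.zeros(3)                             # camera pose from tracking
tvec = np.zeros(3)

pixels, _ = cv2.projectPoints(contour_mm, rvec, tvec, camera_matrix, dist_coeffs)
# Drawing `pixels` over the live video frame gives the overlay; visual
# agreement between drawn contour and patient silhouette verifies the set-up.
```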

True 3D display of delivered dose was investigated by Santhanam et al.[17] They presented a visualisation framework that combines a computer-based simulation of real-time lung tumour breathing motion and dose accumulation with an AR display system (Figure 6). The simulation framework provides visual insight into variations in the quality of therapy for changes in the patient's breathing from the pattern acquired with the 4D planning CT scan. The display system enhances the clinician's understanding by adding 3D depth perception of the dose accumulation pattern. The framework is a tool for presenting both preoperative studies and intra-operative treatment efficacy analysis when coupled with a real-time respiration monitor. Evaluation was carried out with six clinical experts; the results showed that, using AR compared with a 2D monitor, the experts could more efficiently perceive the radiation dose delivered to various aspects of the moving tumour and the surrounding normal tissues. Quicker detection of radiation hot spots, critical to minimising damage to healthy tissue, was also observed.

Figure 6 Real-time lung tumour motion (due to breathing) and dose accumulation, displayed on AR active glasses display system. Reproduced with kind permission of IEEE from Santhanam et al.[17]


Wang et al.[18] developed a volume visualisation system with AR interaction, using the Insight Segmentation and Registration Toolkit[19] and VTK.[6] Surface comparisons between clinically relevant isodose levels and planning volumes can give more information than conventional dose–volume histograms. A radiotherapy plan for a brain tumour was used to evaluate the software. The authors concluded that the volume visualisation with AR interaction helped the radiation oncologists to observe under-dosed or over-dosed regions in 3D and to gain insight into the degree of dose inhomogeneity, such as hot or cold spots seen in radiotherapy plans.
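
One simple way to quantify such a surface/volume comparison, shown below as an illustrative stand-in rather than the authors' method, is the Dice overlap between the region enclosed by an isodose level and a planning volume.

```python
import numpy as np

def dice(dose, structure_mask, isodose_level):
    """Dice overlap between the voxels at or above `isodose_level` (Gy)
    and a boolean structure mask (e.g., the planning target volume)."""
    iso = dose >= isodose_level
    inter = np.logical_and(iso, structure_mask).sum()
    return 2.0 * inter / (iso.sum() + structure_mask.sum())

# Synthetic example: 95% isodose of a 60 Gy prescription versus a cubic PTV.
dose = np.random.uniform(0, 66, (40, 40, 40))
ptv = np.zeros_like(dose, dtype=bool); ptv[10:20, 10:20, 10:20] = True
print(dice(dose, ptv, isodose_level=0.95 * 60.0))
```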

A target visualisation system for real-time target verification was reported by Chen et al.[20] Image data from ultrasound (US) and CT scans were captured and registered. US–CT image registration was integrated with human-commanded, 6-degree-of-freedom robotic manipulation of the US probe and linear accelerator to form an innovative radiotherapy system. Using an automated algorithm, target organs were segmented in CT images, US images were transformed and reconstructed to match each orientation, and image registration was performed in real time with acceptable accuracy. This image transformation allowed oncologists to visualise CT image-reconstructed targets outside the beam's eye view (BEV) via a US probe positioned non-coplanar to the beams' plane. Robotic manipulation allowed oncologists to remotely control the US probe and to dynamically track and monitor in real time the coverage of target volumes within the BEV during a simulated beam-on situation. The authors concluded that their target visualisation system might provide a remotely accessible and real-time way to visualise, verify and justify the use of more conformal radiotherapy treatment technologies.

Mass produced devices to interact with clinical data

Technology is changing at a rapid pace and some developments have the potential to profoundly change the way clinical professionals interact with computer-generated data. In particular there is a growing number of adaptations of consumer market technology (game controllers, handheld devices) as an alternative to highly specialised hardware.

Accuray PlanTouch[21] is the first commercially available software application in radiotherapy that allows oncologists to remotely review and approve radiotherapy plans on the Apple iPad. The application's interface is fully integrated with the CyberKnife planning software. Oncologists can review dose volume histograms, isodose curves, contours and images, and approve treatment plans directly from their tablet devices. Treatment planning displays are designed and formatted specifically for the iPad's screen and can be manipulated using the iPad's touch screen capabilities. A number of other companies supplying equipment to radiotherapy clinics are also releasing software for users to review or approve information on handheld devices, for example clinical information available during ward rounds.

An example applied to brachytherapy is given by Butler.[22] When performing a needle implant for advanced gynaecological malignancies, it is often difficult to predetermine parameters such as needle length to target, proximity to bowel, and vascularity. To overcome these difficulties, laparoscopic guidance is often required. In this example, 3D interactive volumetric display software used by other subspecialties (e.g., cardiovascular interventions) was evaluated to see whether it could replace laparoscopic guidance. For a patient with a clinical condition preventing the use of laparoscopic guidance, needle placement using the visualisation system as guidance was evaluated. A CT angiography study was fused with a PET imaging study and used to define and refine the target. Before going to the operating room, guidance data (ideal trajectory of needles and other relevant parameters) were predetermined and recorded on an iPad. The iPad was taken into the operating room and used to display the guidance data for additional insight during the intra-operative procedure, complementing fluoroscopy, the only other diagnostic imaging available in the operating room. Postoperative CT imaging verified needle placement to be within 2 mm of ideal placement. There were no operating room complications. The author concludes that 3D volumetric reconstructive software can assist the radiation oncologist in preplanning brachytherapy needle placement but, in order to optimise the 3D volumetric reconstruction process, the radiation oncologist needs to understand the geometry of the CT datasets.

Nakata et al.[23] designed a system for 3D and 4D image manipulation using optical tracking AR integrated with a smartphone. The authors observed that the mouse, the most widely used pointing device on personal computers, was originally designed and best suited for control of 2D cursor movement rather than complex 3D image manipulation. In this work, 3D and 4D images obtained with CT and magnetic resonance imaging were displayed on a PC running Windows 7 (Microsoft). The AR software was based on ARToolKit,[24] a video-based tracking technique. In this novel system, the authors used an iPhone or iPod Touch as a remote control device. The functions of this remote control, which included zooming in or out on the AR object, capturing the PC screen, and playing or pausing the 4D object, were implemented over a Wi-Fi connection. The system allowed radiologists to browse 3D or 4D images from CT and MR imaging by using the iPhone or iPod Touch to control the PC. AR images required surface rendering, which was achieved using OsiriX[25] imaging software. The surface image data were transferred to a Windows PC running a novel AR viewer developed with ARToolKit. The PC was equipped with a web camera for recognition of the AR fiducial marker. The software allowed radiologists to manage the AR images using either an iPhone or a conventional two-button mouse as a controller, for comparative evaluation. The iPhone or iPod Touch was placed in a plastic jacket with an optical tracking marker printed on the back. The radiologist could move and twist the device with the optical marker facing the web camera, and the software running on the PC recognised the marker. The AR images were shown on the PC's LCD display with real-time tracking, as a model superposed on the optical marker against the background of the real-world view seen on the monitor. When the radiologist moves the device, the 3D object on the monitor moves and scales itself at the same time in an intuitive manner. The authors concluded that, although strict comparisons of user interface performance between AR techniques and a conventional mouse are difficult, AR had high interactivity and 3D image manipulation required no special training; performance evaluation of the AR technique was therefore performed without special warm-up trials. They compared the performance of the AR 3D image manipulation method with that of the conventional method: three different 3D objects were evaluated by 12 different testers, and the times for three predetermined horizontal rotations of each object were measured. The average times to perform the rotations with the AR method were statistically significantly shorter than those achieved with the conventional two-button mouse in all three cases.
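
The marker-tracking principle behind this system is illustrated below, using OpenCV's ArUco module (from opencv-contrib) as a stand-in for ARToolKit; the camera parameters, marker size and input frame are assumptions.

```python
# Requires opencv-contrib-python (legacy ArUco API, OpenCV <= 4.6).
import cv2
import numpy as np

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])   # assumed webcam intrinsics
dist_coeffs = np.zeros(5)                      # assume negligible distortion
marker_len = 0.05                              # printed marker side, metres

frame = cv2.imread("webcam_frame.png")         # one captured video frame
corners, ids, _ = cv2.aruco.detectMarkers(frame, dictionary)
if ids is not None:
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, marker_len, camera_matrix, dist_coeffs)
    # (rvecs[0], tvecs[0]) give the marker's pose; applying it to the rendered
    # anatomy makes the 3D model follow the handheld device in real time.
```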

In research carried out by Gallo et al.,[26, 27] a novel user interface providing direct interaction with medical imaging data in 3D space via off-the-shelf input devices was proposed and evaluated. The interface was implemented as open-source software and integrated into the open-source medical image viewer Medical Imaging TOolkit (MITO).[28] Both a common mouse and a Wii remote controller were used as input devices (Figure 7).

Figure 7 Direct interaction with medical imaging data in 3D space integrated into the open-source medical image viewer MITO. Reproduced with kind permission of Elsevier from Marra et al.[26] Abbreviation: Medical Imaging Toolkit (MITO)

The interface featured a novel rotational technique using the geometry itself as the rotation handle. A user study showed that the proposed techniques were easy-to-learn and outperformed the virtual trackball technique in the task of rotating complex-shaped objects.

AR and VR techniques in other domains showing potential application to radiotherapy

In this section a number of AR/VR examples in medicine, or even outside of medicine, illustrating potential applications in radiotherapy are considered. Registering the real world, as seen for example through a device's camera, with the computer-generated imagery merged into the scene is a far from trivial task, especially in real time. The examples in this section, however, demonstrate that this can be accomplished with error margins of the order of 2 mm or less, a spatial error margin accepted in many radiotherapy techniques. A considerable improvement in accuracy has been achieved between the first example considered below, dating back to 2002, and the more recent studies.

In 2002, Mitchell et al.[29] described a method of image guidance for neurosurgery using the surgeon's binocular depth perception. For patients with brain tumours, stereoscopic pairs of images of the surface rendering of the head and the surface rendering of the tumour were produced using MRI data. The two pairs of images were colour-coded and combined into one pair of 35-mm slides viewable using a purpose-built stereoscopic viewer. Registration was achieved by moving the stereoscope in space until the virtual images of the rendered surface of the head coincided with the real head. The stereoscope was then locked in position and the virtual image of the tumour was projected inside the patient's head, allowing the surgeon to locate the tumour. Six clinical cases were considered. A lateral accuracy of 10–15 mm and a depth accuracy of 5–10 mm were achieved.

Another application of AR, to minimally invasive orthopaedic surgery, was reported by Liao et al.[30] This paper describes a precision-guided surgical navigation system consisting of a combination of laser guidance and 3D autostereoscopic image overlay. Using an integral videography imaging method, images of surgical anatomic structures were superimposed onto the patient without the need for special viewing or tracking devices (Figure 8).

Figure 8 Laser guidance with autostereoscopic image overlay: (a) IV image overlay device and patient/image overlay; (b) alignment of surgical instrument; (c) image-patient registration results and surgical path guidance of laser beams; (d) operational diagram. Reproduced with kind permission of Elsevier from Liao et al.[30]

The image overlay system was integrated with a laser guidance system to improve the placement accuracy of surgical instruments. Experimental evaluations showed that the error in guiding a linear surgical instrument towards a target was within 2·48 mm with a standard deviation of 1·76 mm, and the orientation error was 2·96° with a 2·12° standard deviation. This is the same order of spatial accuracy as required in modern external radiotherapy.[31] The authors concluded that the system can support surgeons during operations and enables them to intuitively identify the insertion path of the surgical instrument. It was also stated that accuracy could be improved by using a display device with a higher pixel density and a higher precision laser guidance device, which would make the system of practical use not only for orthopaedic applications but also in other medical fields. An application of 3D autostereoscopic image overlay systems in radiotherapy could be to display the patient's outlined anatomy, planning volumes and planned treatment beams overlaid on the patient in the treatment room, as a verification aid before treatment.

Tomikawa et al.[32] developed a VR navigation system with open MRI for breast-conserving surgery. The authors report an estimate of the mismatch between VR content and real distance that is of the order of magnitude required by radiotherapy applications. Clear analogies between concepts considered in this study (image registration, surgical margins) and fundamental concepts in radiotherapy (image registration, treatment and OAR margins) suggest possible applications to radiotherapy. In this work, dye marking of a breast tumour, serving as guidance for surgical resection, was performed using a real-time 3D VR navigation system. A pilot study using a 3D phantom was carried out for quantitative and qualitative evaluations, and a mean mismatch between the navigation system and real distance of 2·01 ± 0·32 mm was reported. A study based on two patients was also carried out; histopathological examination of the resected specimens showed that the surgical margins were free of carcinoma cells.

Kim et al.[33] developed a dual surgical navigation system for endoscopic surgery that used VR and AR techniques together to obtain additional depth and visual information for organs. The VR environment was developed to visualise the spatial relationships among the target organs, endoscope and surgical tools. The AR environment was used to display the raw endoscopic images overlaid with images of nearby organs, obtained from CT and MRI scans, that would otherwise be invisible to the endoscopic probe. Surgeons could thus better understand the surgical environment around the target, increasing the safety and accuracy of surgical procedures. Image registration between endoscopic and CT/MRI data was realised using a surface-tracking technique. Virtual models of the endoscope and surgical instruments were displayed in the VR and AR environments based on tracking of the endoscope and instrument positions; tracking was carried out using either an optical position sensor or an electromagnetic sensor. Raw endoscopic images are affected by distortion due to the camera optics, so to accurately overlay CT/MRI data onto endoscopic images, a camera optics transformation was applied to the CT/MRI images. This was realised through camera calibration procedures that allowed the relevant geometric parameters and lens distortion coefficients to be obtained. Rendering was based on the parameters of the endoscopic camera, so the rendered results mimic the shape and size of the real object, just as it would appear from the endoscopic video camera. In phantom experiments, the overall translational registration error was <2 mm with CT images and an optical position sensor; higher errors were observed using an electromagnetic tracking sensor and MR images, and a correlation between errors and endoscopic camera angles was also observed. The dual navigation system was applied to cochlear implant surgery for evaluation in a clinical setting, using a surgical microscope instead of an endoscope, and the clinical application analysis confirmed the feasibility of such a system in the operating theatre. The surgeons who observed and used the system in the clinical study affirmed the usefulness of the dual navigation system, considering it to have significant advantages compared with conventional systems.
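
The calibration step described above is commonly performed with a printed chessboard. The sketch below uses OpenCV's standard calibration routine; the pattern size and file names are assumptions, not details from the study.

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)  # inner corner grid of the printed chessboard (assumed)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("calib_*.png"):          # calibration views (assumed)
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Needs at least a few successful detections to be well conditioned.
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
# K (intrinsics) and dist (lens distortion coefficients) parameterise the
# virtual camera used to render the pre-operative data with the same optics
# as the endoscopic video.
```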

Gavaghan et al.[34] developed a portable image overlay projector for the visualisation of surgical navigation data and conducted tests on phantoms to explore the capabilities of the device. Monitor-based visual feedback for image-guided surgery requires the surgeon to perform time-consuming comparisons and to divert sight and attention. Their system utilised a portable image overlay device comprising a navigation computer unit, an infrared-based optical passive tracking system (Vicra, NDI) and touch screens for user interaction and visual display (Figure 9). The optical cameras track known configurations of retro-reflective marker spheres. The system was tested on a range of anatomical models and for planning different surgical interventions (liver, cranio-maxillofacial, orthopaedic and biopsy). The visualisation approach was found to assist spatial understanding and reduced the need for sight diversion throughout the simulated surgical procedures. The portability of the device and its intuitiveness of use suggest an expansion of its application to other parts of medicine, including radiotherapy, especially for the patient positioning phase, where monitor-based systems would pose problems of portability and ease of use inside the treatment bunker.

Figure 9 (a) Stereotactic instrument guidance system with integrated image overlay device. (b) (A) Image overlay AR for navigated liver surgery on a patient-specific rigid model and (B), pig liver tissue; (C, D) image overlay AR for navigated cranio-maxillo facial surgical planning; (E) and (F) image overlay AR for navigated orthopaedic tumour resection. Reproduced with kind permission of Springer from Gavaghan et al.[34]

Another study reporting an estimate of the registration accuracy between the real scene and AR content is that of Low et al.,[35] in which an AR neurosurgical planning and navigation system (DEX-ray) for the surgical excision of meningiomas was implemented. The DEX-ray system is based on the Dextroscope (a stereoscopic 3D pre-operative planning system) and allows the transfer of the Dextroscope planning data into the operating theatre by displaying it over real-time images, producing a video-augmented presentation of the surgical scene that further enhances the appreciation of the tumour's location in 3D space. The DEX-ray has an image distortion <0·4 mm in AR mode and a registration accuracy of 1–3 mm. The AR feature allows navigation with 3D graphics beyond the visible surface of the surgical site, yet always in direct context with it, providing a see-through effect and resulting in a more direct understanding of the hidden anatomy relevant to the surgical procedure.

Several architecture-oriented applications of AR implementing visualisation of virtual buildings overlaid on the real scene can be found on the web. CityViewAR[36] is an AR application designed at the University of Canterbury, New Zealand, to give a visual reminder of how the city of Christchurch used to look before the 2010 earthquake. Similar applications in the entertainment domain are reported online, for example by String Labs Limited.[37] These last two examples support the view that applications of AR to radiotherapy based on the self-tracking capability of tablet devices are feasible. Moreover, for these consumer devices programming techniques are more reusable than for highly specialised devices requiring more low-level programming. This may make the required programming knowledge more readily available, although for some time yet multi-disciplinary collaboration involving specialist developers is likely to be needed to make best use of these tools. The accuracy and reliability achievable by these systems need further investigation.

Summary and conclusions

The review of novel systems based on AR user interfaces considered in this paper suggests a future where radiotherapy professionals will be able to manipulate 3D and 4D images in a more intuitive and efficient way, possibly anywhere, anytime. This is likely to enable a better use of large amounts of information available with modern diagnostic tools, but more radically may change how collaborative tasks such as clinical case discussions or complex case planning can be performed by allowing experts to be off site.

The reviewed radiotherapy studies point to potential benefits from AR and VR at various parts of the treatment process. Most of the early studies suggest that research in this field will need to address current limitations around operator discomfort, ease of use and sensible selection and accuracy of information to be displayed.

The accuracy of the registration between virtual content and the real scene is reported to be of the order of millimetres or less in recent VR/AR applications to surgery.[30, 32–35] The required accuracy for most advanced radiotherapy techniques is of the same order of magnitude.[31] A reduction of the registration error from 5–10 mm to 1–2 mm has been achieved from 2002[29] to the present date. If this trend continues, the registration error will become significantly smaller than the spatial accuracy required in radiotherapy for patient positioning and treatment planning, and possibly negligible. In that case, porting these and similar techniques found in the literature to radiotherapy as set-up and verification tools is feasible in the foreseeable future.

Despite promising results, AR has not taken off in clinical radiotherapy to date, with the exception of teaching and training applications. This may be partly because of the high cost of equipment, which helps explain the difficulty of developing it into commercial tools. However, the situation is rapidly changing and the cost of high-specification AR- and VR-capable hardware is decreasing considerably.

Although there was a considerable time lapse between the first[4] and second[5] studies reporting 3D display applications in radiotherapy, since that time the number of publications in this field has been steadily increasing, indicating a growing interest from the medical and scientific communities.

The majority of the reviewed studies used costly hardware not widely available commercially, especially holographic displays and state-of-the-art large flat-screen 3D displays and projectors. More recent studies have started to use readily available devices (Wii remote, iPad, iPhone, iPod Touch); interestingly, none of the systems based on these devices reported problems of user discomfort, requirements for special training, or cost.

The use of tablet and handheld devices (e.g., iPad, iPhone, iPod Touch and Android equivalents) is growing fast and these devices are being rapidly adopted in the medical field, particularly for medical imaging applications. Most tablets also have a built-in camera that can be utilised for AR applications. However, computing power on a tablet is limited and the real-time registration of the camera image with computer-generated graphics remains a challenge.

In summary, the development of small mass-produced tablet devices coming on the market will allow the user to interact with computer-generated information more easily, facilitating the application of AR and VR to radiotherapy practice. The increased connectivity, making information available anywhere, anytime and enabling virtual presence at remote multidisciplinary team meetings, is likely to significantly change how radiotherapy professionals will work, to the benefit of patients.

Acknowledgement

The first author kindly acknowledges the Ron & Margaret Smith Cancer Appeal for providing the funding to conduct a PhD research project of which this forms a part.

References

1. Vidal, F P, Bello, F, Brodlie, K W et al. Principles and applications of computer graphics in medicine. Computer Graphics Forum 2006; 25 (1): 113–137.
2. Beavis, A W, Page, L, Phillips, R, Ward, J. VERT: Virtual Environment for Radiotherapy Training. World Congress on Medical Physics and Biomedical Engineering, 2009; 25 (12): 236–238.
3. Appleyard, R, Coleman, L. Early experiences of the Virtual Environment for Radiotherapy Training (VERT) initiative and the potential to extend its use to other professional groups. Clin Oncol (R Coll Radiol) 2009; 21 (3): 240–241.
4. Hubbold, R J, Hancock, D J, Moore, C J. Autostereoscopic display for radiotherapy planning. Stereoscopic Displays and Virtual Reality Systems IV. Proceedings of SPIE, 3012, pp. 16–27. San Jose, CA, 1997.
5. Schlaefer, A, Blanck, O, Schweikard, A. Autostereoscopic display of the 3D dose distribution to assess beam placement for robotic radiosurgery. Med Phys 2005; 32 (6): 2122.
6. Schroeder, W, Martin, K, Lorensen, B. Visualization Toolkit: An Object-Oriented Approach to 3D Graphics, 4th edition. Kitware, 2006.
7. Shang, C, Williams, T, Beavis, A, Ward, J, Sims, C, Phillips, R. Can current prostate IMRT be further improved with immersive virtual reality simulation? Med Phys 2006; 33 (6): 2075.
8. Gong, X, Kirk, M, Zusag, T et al. Holographic image guided radiation therapy (HIGRT) treatment planning: a multi-institutional study [Abstract]. Int J Radiat Oncol Biol Phys 2006; 66 (3): 6664–6665.
9. Chu, J, Gong, X, Cai, C et al. Multi-institutional randomized study to evaluate a holographic display device for treatment planning. Int J Radiat Oncol Biol Phys 2007; 69 (3, Suppl 1): S698.
10. Chu, J, Zhang, Y, Yurkewicz, K. 3D display of treatment planning and anatomy data: initial observation using a promising technical advance. World Congress on Medical Physics and Biomedical Engineering. IFMBE Proceedings 2007; 14: 1844–1847.
11. Gong, X, Kirk, M, Zusag, T et al. Application of a 3D volumetric display for radiation therapy treatment planning I: quality assurance procedures. J Appl Clin Med Phys 2009; 10 (3): 96–114.
12. Chu, J, Gong, X, Cai, Y et al. Application of holographic display in radiotherapy treatment planning II: a multi-institutional study. J Appl Clin Med Phys 2009; 10 (3): 115–124.
13. Patel, D, Muren, L P, Mehus, A, Kvinnsland, Y, Ulvang, D M, Villanger, K P. A virtual reality solution for evaluation of radiotherapy plans. Radiother Oncol 2007; 82 (2): 218–221.
14. Butler, E, Teh, B S, Bell, B et al. Stereoscopic visualization of treatment plans. Int J Radiat Oncol Biol Phys 2008; 72 (1): S423.
15. Deutschmann, H, Steininger, P, Nairz, O, Kopp, P, Merz, F, Wurstbauer, K. 'Augmented Reality' in conventional simulation by projection of 3-D structures into 2-D images. Strahlentherapie und Onkologie 2008; 2: 93–99.
16. Talbot, J, Meyer, J, Watts, R, Grasset, R. A method for patient set-up guidance in radiotherapy using augmented reality. Australasian Physical and Engineering Sciences in Medicine 2009; 32 (4): 201–211.
17. Santhanam, A P, Willoughby, T R, Kaya, I et al. A display framework for visualizing real-time 3D lung tumor radiotherapy. J Display Tech 2008; 4 (4): 473–482.
18. Wang, C-Y, Lee, T-F, Fang, C-H. A volume visualization system with augmented reality interaction for evaluation of radiotherapy plans. Proceedings of the 2009 Fourth International Conference on Innovative Computing, Information and Control, 2009: 433–436.
19. ITK – Segmentation & Registration Toolkit. http://www.itk.org/. Accessed 5th January 2013.
20. Chen, Y, Chang, W, Liu, C, Chen, C. Integration of multidisciplinary technologies for remote-controlled, dynamic tracking, and real-time target verification for conformal radiotherapy: a prototype of target visualization system. Int J Radiat Oncol Biol Phys 2011; 81 (1): S771.
21. Accuray press release. Accuray rolls out PlanTouch for the CyberKnife System. 2012. http://www.accuray.com/media/press-releases/accuray-rolls-out-plantouch-cyberknife-system/. Accessed 5th January 2013.
22. Butler, E. The use of interactive, real-time, three-dimensional (3D) volumetric visualization for image guided assistance in the brachytherapy needle placement for advanced gynaecological malignancies. Int J Radiat Oncol Biol Phys 2011; 81 (2): S482–S483.
23. Nakata, N, Suzuki, N, Hattori, A, Hirai, N, Miyamoto, Y, Fukuda, K. Informatics in radiology: intuitive user interface for 3D image manipulation using augmented reality and a smartphone as a remote control. Radiographics 2012; 25 (1): 273–283.
24. ARToolKit. http://www.hitl.washington.edu/artoolkit/. Accessed 5th January 2013.
25. OsiriX DICOM Viewer. http://www.osirix-viewer.com/. Accessed 5th January 2013.
26. Marra, I, Gallo, L, De Pietro, G. 3D interaction with volumetric medical data: experiencing the Wiimote. Ambi-Sys '08: Proceedings of the 1st International Conference on Ambient Media and Systems, pp. 1–6. Brussels, Belgium: ICST, 2008.
27. Gallo, L, Minutolo, A, De Pietro, G. A user interface for VR-ready 3D medical imaging by off-the-shelf input devices. Comput Biol Med 2010; 40: 350–358.
28. MITO – DICOM Viewer. http://sourceforge.net/projects/mito/. Accessed 5th January 2013.
29. Mitchell, P, Wilkinson, I D, Griffiths, P D, Linsley, K, Jakubowski, J. A stereoscope for image-guided surgery. Brit J Neurosurg 2002; 16 (3): 261–266.
30. Liao, H, Ishihara, H, Tran, H H, Masamune, K, Sakuma, I, Dohi, T. Precision-guided surgical navigation system using laser guidance and 3D autostereoscopic image overlay. Comput Med Imag Grap 2010; 34: 46–54.
31. International Commission on Radiation Units and Measurements. Prescribing, Recording and Reporting Photon Beam Therapy (Supplement to ICRU Report 50). ICRU Report 62. Bethesda, MD: ICRU, 1999.
32. Tomikawa, M, Hong, J, Shiotani, S et al. Real-time 3-dimensional virtual reality navigation system with open MRI for breast-conserving surgery. J Am Coll Surg 2010; 210 (6): 927–933.
33. Kim, S, Hong, J, Joung, S et al. Dual surgical navigation using augmented and virtual environment techniques. Int J Optomechatronics 2011; 5 (2): 155–169.
34. Gavaghan, K, Oliveira-Santos, T, Peterhans, M et al. Evaluation of a portable image overlay projector for the visualisation of surgical navigation data: phantom studies. Int J CARS 2012; 7: 547–556.
35. Low, D, Lee, C K, Dip, L L T, Ng, W H, Ang, B T, Ng, I. Augmented reality neurosurgical planning and navigation for surgical excision of parasagittal, falcine and convexity meningiomas. Brit J Neurosurg 2010; 24 (1): 69–74.
36. HITLabNZ – CityViewAR. http://www.hitlabnz.org/index.php/products/cityviewar. Accessed 5th January 2013.
37. String Augmented Reality. http://www.poweredbystring.com/product. Accessed 5th January 2013.