The Future of Dentistry with Augmented and Virtual Reality

How VR and AR can and will affect clinical dentistry and the patient experience.

Dentistry is one of the world’s oldest medical professions, dating back as far as 7000 B.C. – so it’s no surprise that we’ve come a long way since then.

What many may not have seen coming, however, is how great an impact emerging technologies such as augmented reality and virtual reality would have on the dental field. The effect is quite astounding.

Many dentists, like most people, find themselves a bit fuzzy on the exact distinctions between augmented reality (AR) and virtual reality (VR). After all, both are interactive, visually based technologies. However, they differ greatly when it comes to the user experience.

So, if you’re one of the many dental professionals out there interested in learning more about the significance of augmented and virtual reality in practice, this article is the perfect source.

In this post we will briefly cover the key differences between augmented reality (AR) and virtual reality (VR) — two of the hottest trends in dentistry. We will also discuss how these technologies are being used in dentistry today and their projected impact on the dental industry in the future.

Virtual Reality: Current and Future Trends in Dentistry

Virtual reality, when contrasted with augmented reality, provides a far more immersive experience. This is because the technology requires a headset that covers the eyes and thus “inserts” the user into a virtual world where they can interact and engage with their new surroundings as they please.

For example, with virtual reality, you can swim with dolphins or immerse yourself or your patients in a visually vivid tutorial, a learning experience, or just a plain old-fashioned video game.

When it comes to the application of virtual reality in the dental industry, the benefits are seen primarily in the enhancement of the patient experience. For example, many practices have already adopted virtual reality to help ease patient anxieties.


The immersive nature of this technology has been scientifically vetted as a way to promote relaxation through calming, immersive visual scenery.

A study published in Cyberpsychology, Behavior, and Social Networking showed that virtual reality technology helps minimize pain by reducing patients’ perception of it: when immersed in an interactive experience, patients are distracted from the procedure.

Additionally, using headphones in conjunction with virtual reality technology can provide an extremely positive experience for dental patients who struggle with chronic anxiety.

As you can see, many tactics are already being used to enhance the patient experience with this technology. However, the future holds even more opportunities for personalization.

Imagine your patients chatting with friends on their favorite social network, playing cards with their siblings, or linking their very own Netflix account, all while sitting in the dental chair.

Furthermore, there is growing interest in pairing guided meditation with virtual reality to further promote relaxation in an environment where many patients struggle with anxiety. For instance, your patients could listen to their favorite meditation guru from the relaxation playlist on their own YouTube account.

The ability to connect to personalized services like these would significantly enhance the patient experience.

 


Augmented Reality: Current and Future Trends in Dentistry

Augmented reality differs from virtual reality primarily in that it overlays virtual information onto the environment around you. With AR, the user has a greater degree of freedom and does not require any large, bulky equipment.

For example, with virtual reality, you can swim with dolphins, but with augmented reality, you can watch a dolphin jump out of your business card. In virtual reality you immerse yourself in a scenario where a teacher explains complex procedures, but with augmented reality, you can practice procedures yourself in the comfort of your office chair with no pressure.


Augmented reality’s impact on dentistry is concentrated primarily in surgery. Dentists can practice carrying out complex procedures or check patient vital signs on a convenient pop-up screen at the push of a button.

It has also become a very beneficial tool for dentists with regard to continuing education and further training, because it offers visual data that can help them process and retain information more efficiently.

AR also serves a useful purpose when it comes to enhancing the patient experience. Generally speaking, when it comes to consultations, a patient is given a treatment plan, and a cast is taken of the mouth which is then sent to a dental lab, at which point custom replications are made and shipped back to the office for review.

With augmented reality, you can give the patient an immediate visual representation of the completed treatments proposed during the consult (orthodontics, crown and bridgework, implants, etc.), thus augmenting the patient’s expectations significantly.


The future applications of these technologies are practically limitless. Imagine a world in which practitioners can take virtual scans of patients’ mouths, or in which patients can scan their own mouths and have custom-built pieces made for them, only popping in for a fitting for small procedures.

Shortening appointments and reducing repeat visits can only improve the patient experience. It would also lower fatigue among practitioners – unless they decide to use this downtime to increase patient volume, of course.

Other potential uses include the overlay of information during dental procedures. Imagine if you could monitor a wider array of patient information and metrics with the touch of a button on your dental instruments or absorb information more rapidly with an AR CE course that offers convenient animated visuals along with the lecture. This could benefit both practitioners and patients of any age. The possibilities are endless.



Conclusion

From dental training and surgery to custom orthotics, augmented reality is reshaping the very industry that built itself on reshaping our mouths, while virtual reality is making waves with patients by helping to ease anxieties and increase comfort.

These two technologies can work together to increase efficiency, lower costs, and enhance the patient experience in ways we still cannot fully imagine. While we can’t say for certain what the future holds, we can definitely say we’re excited.

References

1. http://www.adea.org/GoDental/Health_Professions_Advisors/History_of_Dentistry.aspx
2. http://theconversation.com/dental-implants-have-been-around-since-ancient-times-but-new-teeth-can-also-fall-out-41465
3. https://www.theverge.com/2017/6/14/15805332/virtual-reality-dentist-pain-vr-health
4. https://www.youtube.com/watch?v=YOuw6gMj1d0
5. https://techcrunch.com/2017/09/05/smile-youre-in-augmented-reality-dentistry/
6. https://www.ncbi.nlm.nih.gov/labs/journals/cyberpsychol-behav-soc-netw/

Augmented Reality in Dentistry: Uses and Applications in the Digital Era

Abstract

Introduction: With all the advancements technology has reached, dentistry cannot be left behind. In the past few years, researchers have focused on integrating emerging technologies such as virtual and augmented reality into clinical practice. Objectives: This literature review aims to provide an update on the latest applications and developments of augmented reality in the dental field. Methods: The PubMed database was reviewed, and the studies that fulfilled the inclusion criteria over the last 20 years, from 2000 to 5 May 2020, were included. Results: The search revealed a total of 72 articles; 32 were excluded and 40 were included. It was observed that augmented reality applications are still under testing, as certain drawbacks still limit the spread of this technology in the dental field. Multiple studies have produced systems suitable for clinical use, yet no routine clinical application has been reported. Conclusion: Research has already moved on to more advanced technologies such as mixed reality. Therefore, a question arises: will augmented reality continue to grow independently, or will mixed reality dominate the field?

Keywords: Augmented reality, Dentistry, Dental technology, Clinical application, Technology, Dental practice.

Abbreviations: CAM-Computer-Aided Manufacturing, CAD-Computer-Aided Design, RP-Rapid Prototyping, ML-Machine Learning, VR-Virtual Reality, AR-Augmented Reality.

Introduction

With each passing day, technology evolves, improving prospects in multiple fields of life, whether in the applied sciences, education, the military, sports, the entertainment industry, the medical field, the dental field, or others [1,2]. Many digital production management workflows have already been implemented into treatment protocols, particularly the fast-growing Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM), Rapid Prototyping (RP), automated processing of radiological imaging using Artificial Intelligence (AI) and Machine Learning (ML), Virtual Reality (VR), and Augmented Reality (AR) [1]. Virtual Reality (VR) is a synthetic environment composed of computer-generated images, audio, and video in which users are immersed inside the artificial environment and cannot see the real world [1]. In contrast, Augmented Reality (AR) is the technology that combines computer-generated images, audio, and video on a screen with real-life scenes [1,3]. Therefore, computerized virtual components or elements are needed for AR creation. AR technology allows the user to superimpose virtual content on the real world; thus, it supplements reality with virtual content as a mix rather than a complete replacement [4]. Due to this distinctive feature, AR is much easier to realize and understand than VR [5].

For the creation of augmented reality systems, multiple components are required. First and foremost is a camera, sensor, or scanning device that captures real-life scenes and objects. The second component is a computer unit, which handles the processing phase of the captured images and movements, analyzing position, tilt, and acceleration and adding depth to the captured images, hence generating 3D images. The third is a display system that presents virtual and 3D objects in the real world. Lastly, a tracking device is needed to accomplish the registration phase, which continuously tracks the user during the procedure to allow real-time visualization [6]. Registration techniques can be categorized into two main groups: marker-free registration, such as laser skin-surface scanning, and marker-based registration, such as anatomical landmarks, bone screws, and adhesive skin markers [7,8]. The virtual objects can be viewed from multiple angles and follow the patient’s and the operator’s movements through the use of tracking systems [6-9].

There are two techniques used for tracking. The first uses fiducial markers, which depend on the anatomical landmarks obtained from the X-rays. The second uses surface matching, which depends on position sensors placed on the instrument in use and on the patient. Tracking systems track the patient, the instruments, and the operator’s movements; the collected data are then transferred to the processing unit, allowing near real-time visualization. This process of registration and re-registration (in case any of the tracked elements moves) takes time and depends on the speed of the processing unit [9].

From a dental perspective, the patient’s pre-operative X-rays represent the previously acquired images that are later used to obtain the 3D images. Such images can be obtained from 3D X-rays, such as Computed Tomography (CT), or from multiple 2D images [10,11]. Four main types of 3D imaging systems have been used to capture dental and oro-facial structures: Cone-Beam Computed Tomography (CBCT) systems, laser scanners, structured-light scanners, and stereophotogrammetry [12].

After the images are captured and analyzed, they are displayed on the operating field (the patient’s mouth or face) as superimposed objects, allowing intra-operative navigational support from the previously obtained pre-operative X-rays directly on the patient. The display can be video-based, see-through, or projection-based AR. Video-based displays use endoscopic cameras or Head-Mounted Displays (HMD) to superimpose virtual objects on a (stereo) video stream, thus increasing the viewer’s understanding of depth, motion, and stereo parallax. See-through and projection-based AR use translucent silver mirrors, see-through devices, and projectors; these devices are placed between the operator and the patient to allow projection of the virtual objects [6,13-16]. Multiple researchers have proved the effectiveness of AR simulators in assisting dentists by displaying virtual models in the operating field, which directly contributed to reducing the difficulty of hand-eye coordination [17].

AR has already been introduced into dental research, spanning dental implantology, oral and maxillofacial surgery, orthodontics, endodontics, prosthodontics, paedodontics, operative dentistry, as well as dental education [1,4-6,17-26]. This is why this review aims to summarize the latest technological developments related to augmented reality uses and applications in the dental field, its future, and how it can be improved.

Materials and Methods

Duration: 6 months.

Study Design: Literature Review.

Inclusion Criteria: studies published from 2000 to 5 May 2020; database searched: PubMed.

Exclusion Criteria: All the studies that were published in a language other than English were excluded, as well as editorial, letters to the editor, experimental studies on animals, short communications, articles related to Cranial-Maxillofacial Surgery, and articles that do not present an application or use of AR systems in dentistry. Studies focusing on other technological advancements that modify the normal visual environment like mixed reality, hybrid reality, and virtual reality were also excluded.

Data Collection Procedure: the search terms “Augmented reality” and “Dentistry” were used to search the PubMed database from 2000 to 5 May 2020 for augmented reality uses in dentistry. A total of n=72 articles were found, of which n=40 were included and n=32 were excluded. Article selection occurred in two stages: title and abstract evaluation, which excluded n=19 articles and left n=53, followed by full-article evaluation, which excluded n=13 articles and left n=40 (Figure 1). The included articles are listed in Table 1, arranged in chronological order starting from 2005, as earlier articles did not meet the inclusion criteria.

Ethical Consideration: This literature review was approved by RAK Medical and Health Sciences University ethical committee and institutional review board.

It was observed that most articles were published in the last 5 years, as n=48 of the n=72 articles were published between 2016 and 5 May 2020 (Figure 2). 55% of the included articles covered AR applications as a navigational system in surgery, which exceeds the coverage of other dental specialties (Figure 3). The applications found are summarized in Table 2, divided according to dental specialty. A detailed description of the AR systems is provided in Table 3.

Discussion

In this review, multiple systems and methods for the implementation of AR in clinical practice have been covered. The reason is that no standard method for applying AR technology in clinical practice has yet been proposed [4,14]. This encouraged researchers to modify previously used systems to create new systems that would provide better outcomes [13,15,19,24,27]. The results revealed that the amount of literature covering the use of AR as a navigational system in surgery exceeds that covering other dental specialties. This coincides with a review by Ayoub A. et al. in 2019, which described it as the primary area of use [12].

Many of the studies in this review have focused on improving specific aspects that could consequently enhance AR systems, such as accuracy, processing time, image registration, depth perception, and occlusion handling [13,15,19,24,28,29]. In comparison with manual procedures, AR-supported implant navigation systems have shown more accurate results and less deviation. They also reduce iatrogenic complications such as sinus perforations, fenestrations, dehiscences, or mandibular nerve damage [19,20,30]. Although good results have been demonstrated in multiple studies, Pellegrino G. et al. reported a negative result when using an AR system for the placement of two implants, with angular deviations of 3.05° and 2.19° for the first and second implants, respectively; thus, further testing and research are needed [31].

2D and 3D computer-assisted navigational systems have been of great value to surgeons in the preceding years, particularly in the field of OMF surgery, where surgeons face complex anatomy, narrow spatial relationships of vital structures, and high esthetic demands. One of the major improvements was image-guided navigation, which uses pre-operatively acquired scans to enable intra-operative guidance.

Nevertheless, certain flaws and challenges accompany the use of these devices [20,32]. These include a lack of image depth in the virtually displayed images, the need for hand-eye transformation, indirect recognition of the patient’s anatomy from two-dimensional images, and inaccurate registration in indirect visualization of three-dimensional images, as small surface details may be smoothed out [6,13]. Accordingly, a constant comparison between the surgical field and the displayed image is required, conveying the need to look away from the operating field to see the display screen [6,13].

The use of augmented reality technology as a navigational tool could decrease mean positional errors to 0.7 mm [6]. A study in 2018 by Zhu M. et al. compared the use of an AR system, Individualized Templates (IT), and the free-hand technique in Mandibular Angle Osteotomy (MAO) [33]. The study sample was divided into three groups: 31 patients in the AR group, 28 patients in the IT group, and 34 patients in the free-hand group.

The study results showed that AR needed more time than the free-hand technique in the pre-operative phase, but with regard to the procedure itself, the AR system required less time. The AR system also showed an advantage over the IT system, as the AR pre-surgical plan can be altered at any time. Furthermore, the surgeons performing the procedure favored the AR system over the IT technique, as it provided a better understanding of the operative field and better viewing [33]. A study done in 2019 by Pietruski P. et al. [34] compared the accuracy of cutting guides created by CAD/CAM with two AR systems, based on Simple (SAR) and Navigated (NAR) augmented reality technology, for a mandibular osteotomy procedure. After 21 osteotomies were performed on identically fabricated mandibles (seven for each method), the results indicated a more accurate procedure when using surgical guides. CAD/CAM printed guides are gaining considerable popularity nowadays. Although they have proven more accurate than AR, the technique is limited by certain drawbacks that hinder its widespread use. It is a time-consuming process, as the guide needs to be printed, which limits its use for trauma and cancer patients. It is also costly. The main drawback is that the guide needs to be placed directly on a bony landmark, which means that greater irritation and more extensive dissection of soft tissue are required. AR technologies have the potential to reduce these limitations in the future [34].

The relevant advances in AR systems and techniques opened the door for the use of AR in other dental specialties [33]. According to Dr. Charles J. Goodacre, an educator at Loma Linda University School of Dentistry (Loma Linda, CA), there are four key factors for enhancing dental education: spatial ability, interactivity, critical thinking, and clinical correlations with the integration of multiple dental disciplines. He described how 3D software (eHuman (https://ehuman.com/)) could help enhance dental education and the advantages it offers students [35]. Preclinical classes help dental students improve fine motor abilities and master new tools, as well as provide an understanding of therapeutics, biomaterials, and techniques before patient treatment, where the convergence of these disciplines takes place [36].

According to the literature, AR has been shown to increase the skills needed for this convergence, as it is strongly related to spatial vision: it increases both the surgeon’s visual awareness in high-risk surgeries and the surgeon’s intuitive grasp of the operating field [4,11,37]. It also decreases iatrogenic complications of the treatment performed, such as injuries to the surrounding anatomical structures (Figure 4), has proven effective in assisting oral surgeons to better visualize operating fields that are not directly observed, aids in reducing surgical time and morbidity, which may result in a reduced overall treatment cost, and helps address the challenges that may confront the surgeon during a procedure [4,11,19,24,31,34]. Furthermore, it helps decrease the number of X-rays the patient is required to have [8].

In contrast to the previously mentioned benefits, AR does have certain drawbacks. In OMF surgeries, the adoption of AR has been limited by the sophistication of such surgeries and the longer time required to implement such devices. Additionally, the technical application and the limited accuracy have posed difficulties [14]. Not to mention that the systems require expensive equipment, as the cost of AR systems is still high [4,15,25]. This is why this technology demands both economic and methodological rationalization [4,33,38]. These drawbacks apply not only to OMFS but also to routine dental applications of AR. A study by Won Y. in 2017 covered the use of a simple AR system to assist in IAN block administration [14].

The study suggested a simple method of implementing AR in dental practice without the need for a sophisticated system, as the authors attempted to create the AR view on a screen monitor. It also suggested that this technique could prove beneficial in orthodontics or prosthodontics.

Unfortunately, the use of HMD devices may cause vertigo, nausea, blurred vision, eyestrain, and headaches, which is why a proper examination of the potential occurrence of these side effects is important before first use [26,34]. The reason behind these side effects may be the mismatch between the visual and vestibular systems. Avoiding them may be possible by adjusting the headset, moving the eyes at an adequate speed, avoiding any abrupt bodily movements while using the device, and resting for a while after use [26]. These side effects might be the reason why most authors recommend the use of non-wearable display camera systems [39]. In this review, we reached the same conclusion, which was also supported by Zhu M. et al. in 2011 [7]. Yet, according to Espejo-Trung LC et al., using such systems may reduce the operator’s perception of the augmentation [39].

Moreover, Wang J. et al. proposed a video see-through AR system to address the methodological and implementation issues encountered by other systems [40]. The study suggested using a simple video camera that can register and project the virtual objects on the camera itself, which resolves the issue of the space occupied by external tracking and display systems, thus reducing the need for extra space and the time needed to adjust them. It should be borne in mind that almost all modern operating rooms already use an optical camera that allows for AR technology applications (Figure 6) [28]. Another limitation is that AR cannot be used for emergency treatments, as it requires proper pre-operative investigations [33]. Certain studies reported prolonged times and delays due to the registration phase and other technical problems [25,32].

Due to continuous development in the field, new systems have become available that ease the way to a solution for this obstacle. Suenaga H. et al. published a study in 2015 that introduced a new system: instead of taking an hour, the registration phase takes less than 30 seconds to complete [11]. Wang J. et al. had similar results [24]. In like manner, Ma L. et al. in 2019 proposed a system that can register the occlusal splint outside the patient’s mouth, thus reducing intra-operative time [30].

For further enhancement of the previous points, Basnet B. R. et al. in 2018 shed light on further issues regarding the processing phase, including noise in real-time images, image registration, high processing time, and poor occlusion handling [29]. The study also proposed a solution to these limitations by introducing a new system aimed at increasing navigational accuracy through the removal of occlusion and noise in real-time navigation. This was accomplished by using a weighting-based de-noising filter and depth-mapping-based occlusion removal to exclude occluded objects (blood, surgical tools, and the surgeon’s body) [10,29]. In this context, legal regulations must be clearly defined, with clear standards governing patient data [3].

The fastest way for the brain to capture content is through images and visual experience. The human understanding of reality is confined to the three dimensions of space, and the human brain functions on the principles of images and associations, which in turn supports the AR concept and thus holds promise for further adoption [41].

Conclusion

This review article covered the history of AR, its systems, clinical applications, and advancements in the AR field from 2000 to 5 May 2020. The publications indicate that AR applications are still under testing, as certain drawbacks limit the spread of this technology in the dental field. Multiple studies have produced systems suitable for clinical use, yet no routine clinical application has been reported. Improving the speed and accuracy of the processing unit should be the focus of future studies. The results revealed that AR has not been used in all dental specialties, as the applications found covered only oral surgery, orthodontics, endodontics, prosthodontics, operative dentistry, pedodontics, and dental education. As technological advancement is a continuous cycle, mixed reality is a promising new technology that combines both VR and AR. Research has already begun to address this technology; therefore, a question arises: will AR continue to grow independently, or will mixed reality dominate the field?

Acknowledgements

We would like to thank all the individuals who contributed to the success of this research. We would like to direct a special thanks to all the faculty, supervisors, and ethical committee in the RAKCODS and RAKMHSU.

References

  1. Huang TK, Yang CH, Hsieh YH, Wang JC and Hung CC. Augmented reality (AR) and virtual reality (VR) applied in dentistry (2018) Kaohsiung J Med Sci 34: 243-248. https://doi.org/10.1016/j.kjms.2018.01.009
  2. Phuyal S, Bista D and Bista R. Challenges, opportunities and future directions of smart manufacturing: a state of art review (2020) Sustain Futur 2. https://doi.org/10.1016/j.sftr.2020.100023
  3. Joda T, Gallucci GO, Wismeijer D and Zitzmann NU. Augmented and virtual reality in dental medicine: A systematic review (2019) Comput Biol Med 108: 93-100. https://doi.org/10.1016/j.compbiomed.2019.03.012
  4. Badiali G, Ferrari V, Cutolo F, Freschi C and Caramella D, et al. Augmented reality as an aid in maxillofacial surgery: Validation of a wearable system allowing maxillary repositioning (2014) J Cranio-Maxillofacial Surg 42: 1970-1976. http://dx.doi.org/10.1016/j.jcms.2014.09.001
  5. Mladenovic R, Dakovic D, Pereira L, Matvijenko V and Mladenovic K. Effect of augmented reality simulation on administration of local anesthesia in pediatric patients (2020) Eur J Dent Educ
  6. Suenaga H, Hoang Tran H, Liao H, Masamune K and Dohi T, et al. Real-time in situ three-dimensional integral videography and surgical navigation using augmented reality: a pilot study (2013) Int J Oral Sci 5: 98-102. http://www.nature.com/articles/ijos201326
  7. Zhu M, Chai G, Zhang Y, Ma X and Gan J. Registration strategy using occlusal splint based on augmented reality for mandibular angle oblique split osteotomy (2011) J Craniofac Surg 22: 1806-1809. https://doi.org/10.1097/scs.0b013e31822e8064
  8. Zhu M, Liu F, Chai G, Pan JJ and Jiang T, et al. A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery (2017) Sci Rep 7. https://doi.org/10.1038/srep42365
  9. Nijmeh AD, Goodger NM, Hawkes D, Edwards PJ and McGurk M. Image-guided navigation in oral and maxillofacial surgery (2005) Br J Oral Maxillofac Surg 43: 294-302. https://doi.org/10.1016/j.bjoms.2004.11.018
  10. Farronato M, Maspero C, Lanteri V, Fama A and Ferrati F, et al. Current state of the art in the use of augmented reality in dentistry: A systematic review of the literature (2019) BMC Oral Health 19. https://doi.org/10.1186/s12903-019-0808-3
  11. Suenaga H, Tran HH, Liao H, Masamune K and Dohi T, et al. Vision-based markerless registration using stereo vision and an augmented reality surgical navigation system: a pilot study (2015) BMC Med Imaging 15. https://doi.org/10.1186/s12880-015-0089-5
  12. Ayoub A and Pulijala Y. The application of virtual reality and augmented reality in Oral and Maxillofacial Surgery (2019) BMC Oral Health 8. https://doi.org/10.1186/s12903-019-0937-8
  13. Tran HH, Suenaga H, Kuwana K, Masamune K and Dohi T, et al. Augmented reality system for oral surgery using 3D auto stereoscopic visualization (2011) Lect Notes Comput Sci 81-88. https://doi.org/10.1007/978-3-642-23623-5_11
  14. Won Y-J and Kang S-H. Application of augmented reality for inferior alveolar nerve block anesthesia: A technical note (2017) J Dent Anesth Pain Med 17: 129-134. https://doi.org/10.17245/jdapm.2017.17.2.129
  15. Murugesan YP, Alsadoon A, Manoranjan P and Prasad PWC. A novel rotational matrix and translation vector algorithm: geometric accuracy for augmented reality in oral and maxillofacial surgeries (2018) Int J Med Robot Comput Assist Surg 14: 1-14. https://doi.org/10.1002/rcs.1889
  16. Bosc R, Fitoussi A, Hersant B, Dao TH and Meningaud JP. Intraoperative augmented reality with heads-up displays in maxillofacial surgery: a systematic review of the literature and a classification of relevant technologies (2019) Int J Oral Maxillofac Surg 48: 132-139. https://doi.org/10.1016/j.ijom.2018.09.010
  17. Zhou Y, Yoo P, Feng Y, Sankar A and Sadr A, et al. Towards AR-assisted visualisation and guidance for imaging of dental decay (2019) Healthcare Tech Letters 6: 243-248. https://doi.org/10.1049/htl.2019.0082
  18. Kwon HB, Park YS and Han JS. Augmented reality in dentistry: a current perspective (2018) Acta Odontol Scand 76: 497-503. https://doi.org/10.1080/00016357.2018.1441437
  19. Jiang W, Ma L, Zhang B, Fan Y and Qu X, et al. Evaluation of the 3d augmented reality-guided intraoperative positioning of dental implants in edentulous mandibular models (2018) Int J Oral Maxillofac Implants 33: 1219-1228
  20. Lin YK, Yau HT, Wang IC, Zheng C and Chung KH. A novel dental implant guided surgery based on integration of surgical template and augmented reality (2015) Clin Implant Dent Relat Res 17: 543-553. https://doi.org/10.1111/cid.12119
  21. Aichert A, Wein W, Ladikos A, Reichl T and Navab N. Image-based tracking of the teeth for orthodontic augmented reality (2012) Lect Notes Comput Sci 601-608. https://doi.org/10.1007/978-3-642-33418-4_74
  22. Bruellmann DD, Tjaden H, Schwanecke U and Barth P. An optimized video system for augmented reality in endodontics: A feasibility study (2013) Clin Oral Investig 17: 441-448. https://doi.org/10.1007/s00784-012-0718-0
  23. Touati R, Richert R, Millet C, Farges J-C and Sailer I, et al. Comparison of two innovative strategies using augmented reality for communication in aesthetic dentistry: a pilot study (2019) J Healthc Eng 1-6. https://doi.org/10.1155/2019/7019046
  24. Wang J, Suenaga H, Hoshi K, Yang L and Kobayashi E. Augmented reality navigation with automatic marker-free image registration using 3-d image overlay for dental surgery (2014) Biomed Eng 61: 1295-1304. https://doi.org/10.1109/tbme.2014.2301191
  25. Zinser MJ, Mischkowski RA, Dreiseidler T, Thamm OC and Rothamel D, et al. Computer-assisted orthognathic surgery: waferless maxillary positioning, versatility, and accuracy of an image-guided visualisation display (2013) Br J Oral Maxillofac Surg 51: 827-833. http://dx.doi.org/10.1016/j.bjoms.2013.06.014
  26. Zafar S and Zachar JJ. Evaluation of holohuman augmented reality application as a novel educational tool in dentistry (2020) Eur J Dent Educ 24: 259-265. https://onlinelibrary.wiley.com/doi/abs/10.1111/eje.12492
  27. Wang J, Shen Y and Yang S. A practical marker-less image registration method for augmented reality oral and maxillofacial surgery (2019) Int J Comput Assist Radiol Surg 14: 763-773. https://doi.org/10.1007/s11548-019-01921-5
  28. Wang J, Suenaga H, Yang L, Kobayashi E and Sakuma I. Video see-through augmented reality for oral and maxillofacial surgery (2016) Int J Med Robot Comput Assist Surg 13: 211-215. https://doi.org/10.1002/rcs.1754
  29. Basnet BR, Alsadoon A, Withana C, Deva A and Paul M. A novel noise filtered and occlusion removal: navigational accuracy in augmented reality-based constructive jaw surgery (2018) Oral Maxillofac Surg 22: 385-401. https://doi.org/10.1007/s10006-018-0719-5
  30. Ma L, Jiang W, Zhang B, Qu X and Ning G, et al. Augmented reality surgical navigation with accurate cbct-patient registration for dental implant placement (2019) Med Biol Eng Comput 57: 47-57. https://doi.org/10.1007/s11517-018-1861-9
  31. Pellegrino G, Mangano C, Mangano R, Ferri A and Taraschi V, et al. Augmented reality for dental implantology: a pilot clinical report of two cases (2019) BMC Oral Health 19. https://doi.org/10.1186/s12903-019-0853-y
  32. Mischkowski RA, Zinser MJ, Kübler AC, Krug B and Seifert U, et al. Application of an augmented reality tool for maxillary positioning in orthognathic surgery – A feasibility study (2006) J Cranio-Maxillofacial Surg 34: 478-483. https://doi.org/10.1016/j.jcms.2006.07.862
  33. Zhu M, Liu F, Zhou C, Lin L and Zhang Y, et al. Does intraoperative navigation improve the accuracy of mandibular angle osteotomy: comparison between augmented reality navigation, individualised templates and free-hand techniques (2018) J Plast Reconstr Aesthetic Surg 71: 1188-1195. https://doi.org/10.1016/j.bjps.2018.03.018
  34. Pietruski P, Majak M, Światek-Najwer E, Żuk M and Popek M, et al. Supporting mandibular resection with intraoperative navigation utilizing augmented reality technology - a proof of concept study (2019) J Cranio-Maxillofacial Surg 47: 854-859. https://doi.org/10.1016/j.jcms.2019.03.004
  35. Goodacre CJ. Digital learning resources for prosthodontic education: the perspectives of a long-term dental educator regarding 4 key factors (2018) J Prosthodont 27: 791-797. https://doi.org/10.1111/jopr.12987
  36. Kim-Berman H, Karl E, Sherbel J, Sytek L and Ramaswamy V. Validity and user experience in an augmented reality virtual tooth identification test (2019) J Dent Educ 83: 1345-1352. https://doi.org/10.21815/jde.019.139
  37. Llena C, Folguera S, Forner L and Rodríguez-Lozano FJ. Implementation of augmented reality in operative dentistry learning (2018) Eur J Dent Educ 22: 122-130. https://doi.org/10.1111/eje.12269
  38. Albuha Al-Mussawi RM and Farid F. Computer-based technologies in dentistry: types and applications (2016) J Dent (Tehran) 13: 215-222. http://www.ncbi.nlm.nih.gov/pubmed/28392819
  39. Espejo-Trung LC, Elian SN and Luz MAADC. Development and application of a new learning object for teaching operative dentistry using augmented reality (2015) J Dent Educ 79: 1356-1362. https://pubmed.ncbi.nlm.nih.gov/26522642/
  40. Wolz MM. Language barriers: challenges to quality healthcare (2015) Int J Dermatol 54: 248-250. http://doi.wiley.com/10.1111/ijd.12663
  41. Mladenovic R, Pereira LAP, Mladenovic K, Videnovic N and Bukumiric Z, et al. Effectiveness of augmented reality mobile simulator in teaching local anesthesia of inferior alveolar nerve block (2019) J Dent Educ 83: 423-428. https://doi.org/10.21815/jde.019.050

Prototype of Augmented Reality Technology for Orthodontic Bracket Positioning: An In Vivo Study

Abstract

To improve the accuracy of bracket placement in vivo, a protocol and device were introduced, consisting of operative procedures for accurate control, computer-aided design, and an augmented reality–assisted bracket navigation system. The present study evaluated the accuracy of this protocol. Methods: Thirty-one incisor teeth from four participants were tested. The teeth were bonded by a novice and an expert orthodontist, using a Boone gauge in the control group and the augmented reality–assisted bracket navigation system in the experimental group, and bracket positions were measured. To evaluate accuracy, deviations of the bracket placement positions were measured. Results: The augmented reality–assisted bracket navigation system and the control technique were used on the same 31 teeth. The order of bonding (control or experimental technique first) was decided by tossing a coin; the teeth were then debonded and the other technique was used. The median vertical (incisogingival) position deviation in the control and AR groups was 0.90 ± 0.06 mm and 0.51 ± 0.24 mm, respectively, for the novice orthodontist (p < 0.05), and 0.40 ± 0.29 mm and 0.29 ± 0.08 mm, respectively, for the expert orthodontist (p < 0.05). No significant changes in the horizontal position deviation were noted regardless of the orthodontist’s experience or the use of the augmented reality–assisted bracket navigation system. Conclusion: The augmented reality–assisted bracket navigation system increased the accuracy achieved by the expert orthodontist in the incisogingival direction and helped the novice orthodontist guide the bracket position within an acceptable clinical error of approximately 0.5 mm.

Keywords: augmented reality; orthodontic; bracket navigation system; bracket positioning

1. Introduction

Augmented reality (AR) is a technology that can accurately and reproducibly superimpose computer-generated content onto the real-world environment. An AR system fulfills three features: a combination of real and virtual objects, real-time interaction, and accurate three-dimensional registration of virtual and real objects. The first AR device was developed by Ivan Sutherland in 1968: he built a head-mounted 3-dimensional display through which the observer could see a cube in the view [1]. The development of AR is attributed to Boeing in 1990. Before AR technology, Boeing’s workers were required to continuously consult a laptop screen to ensure that the numerous wires were correctly connected, a process that was exhausting and time consuming. Boeing’s engineers developed AR headsets, which let the workers see the information projected in front of their eyes, improving their wire-construction efficiency and lowering errors [2,3]. Gradually, AR expanded to various fields, including the gaming, service, factory assembly, and medical industries. In the field of oral medicine, AR is used in oral maxillofacial surgeries, dental implantology [4,5], and oral education [6]. In addition, AR can promote the development of orthodontics [3].

The problem of accurate orthodontic bracket positioning has been discussed for many decades. In 1972, Andrews devised the straight-wire (preadjusted) appliance, a technique that reduces part of the wire bending [7-9]. The concept of the straight-wire appliance is that each bracket has built-in specific torque, tip, in/out, and proper offset to achieve proper alignment of the center of the slot and to allow teeth to settle properly in the ideal final positions. The most important factor when using the straight-wire appliance is bracket position. Inaccurate bracket positions can affect occlusal function and dental aesthetics, causing articulation disorders and grooved marginal ridges between teeth; they can also increase the likelihood of food becoming lodged between teeth, which can inflame the gingiva, damage the alveolar bone, and lead to periodontal disease [10]. The problem of inaccurate bracket positions does not occur only when orthodontics is performed by different clinical dentists; even when the treatment is performed by the same dentist, bracket deviation between teeth in different quadrants is likely. Various factors can affect bracket positioning, including the clinical crown height, tooth morphology, an abnormal incisor line, and occlusal interference. Therefore, making accurate orthodontic decisions is critical for clinical dentists. In the past, dentists could only rely on their experience and adjust the bracket positions until correct positioning was achieved.

Following advances in information and communications technology, digital applications in dentistry have increased, including three-dimensional (3D) intraoral scanning, cone beam computed tomography, computer-aided design and manufacturing, and the production of surgical guides. Digital dentistry has increased the quality and accuracy of dental treatment. The technique of 3D intraoral scanning can digitalize the intraoral structure, facilitating treatment planning and simulation with the aid of 3D software. To prevent inaccurate bracket positioning and help dentists make appropriate decisions, researchers have proposed incorporating customized orthodontics into digital orthodontics.
Advances in digital orthodontics allow the use of 3D models to customize brackets according to the patient’s malocclusion type and individual problems. Studies have focused on bracket positioning methods, including methods using orthodontic surgical guides, positioning gauges, and 3D-printed guides [11-13]. Despite the progress of digital orthodontic techniques in recent years, the use of advanced techniques (e.g., 3D-printed guides) for bracket positioning can incur heavy costs and thus reduce the generalizability of digital orthodontics. To solve this problem, the present study proposed an AR-assisted bracket navigation system to shift preoperational planning into clinical practice, with the aim of reducing guide production time and cost, enhancing the accuracy of bracket bonding, and achieving customized orthodontics. To facilitate the accurate localization of bracket positioning, we developed an AR-assisted bracket navigation system. This study analyzed the accuracy of bracket placement between the conventional direct bonding technique with a Boone gauge and the AR-assisted bracket navigation system, performed by a novice and an expert (Figure 1).

2. Materials and Methods

2.1. Samples and Operators

This clinical trial enrolled 4 patients (2 women and 2 men). All patients had intact central and lateral upper and lower incisors, and the clinical crown heights of all teeth were available. Thirty-two teeth (8 incisors × 4 patients) were examined. Incisors with the following conditions were excluded: (1) limited space for bracket placement, (2) severe defects that could interfere with bracket placement, and (3) teeth with prostheses that could hinder bracket bonding or alter tooth morphology. One crowded tooth was excluded because of limited space for bracket placement. First, 31 brackets were bonded using either the AR-assisted bracket navigation system or the Boone gauge and subjected to an oral scan; the teeth were then debonded, rebonded with the other technique, and re-scanned. Bracket bonding was performed on each tooth by 2 operators: one was well trained, having completed 3 years of an orthodontic program, and the other was an intern with only basic orthodontic knowledge and no clinical orthodontic experience. The study was performed in the department of orthodontics, China Medical University Hospital Medical Center, Taichung, Taiwan. The study was approved by the local research ethics committee (CMUH106-REC3-146), and each participant signed an informed consent form.

2.2. AR and Device Settings

A wireless handpiece intraoral camera (QOCA® Q-tube Wi-Fi Teeth Scope Pro, Quanta Computer Inc., Taoyuan City, Taiwan) was used, with the resolution and operating distance set to 640 × 640 pixels and 3–12 mm, respectively. The operator can visualize the view of the patient’s mouth and the augmented information on the screen (Figure 2). Before bracket placement, all teeth were scanned, and the ideal bracket position was analyzed using the intraoral scanner (TRIOS 3 Basic, 3Shape Dental Systems, Copenhagen, Denmark) and digital modelling software (Ortho Analyzer, 3Shape Dental Systems, Copenhagen, Denmark). Pre-adjusted edgewise plastic brackets with a 0.018-inch slot (orthoEsther MB; Tomy) were bonded to the teeth. For bracket height measurement, the Boone gauge (G&H) was used in the control group (Figure 3). With an interval of 0.5 mm, the ruler allows measurement of bracket height at 3.5, 4.0, 4.5, and 5.0 mm. The AR-assisted bracket navigation system was adopted in the experimental group. After bracket bonding, all teeth were scanned, and the final bracket position was analyzed.

2.3. Method for Applying the AR-Assisted Bracket Navigation System

The AR-assisted bracket navigation system comprised two technical modules, namely the facial axis of the clinical crown (FACC) detection module and the bracket bonding navigation module. Collecting real-time images captured by the intraoral camera, the FACC module segmented the image of each tooth through computer-vision–based image analysis, thereby generating dental features that included the complete contour and FACC of each tooth. The bracket bonding navigation module overlapped the real-time image with the bracket bonding positions determined during preoperational planning, thereby achieving preoperational planning visualization.

2.3.1. FACC Detection Module

The FACC detection module performed morphological analysis to roughly segment the contour of the entire tooth. To accurately extract the detailed tooth contour, the GrabCut algorithm was adopted to facilitate interactive foreground extraction and fix the contour of each tooth. Specifically, the algorithm, targeting regions of interest in each tooth extracted from the morphological analysis, implemented foreground image segmentation using source and sink nodes, foreground–background separation, graph cuts, and an energy function. During image segmentation, pixels are treated as nodes of two types, namely source nodes and sink nodes. Source nodes represent the foreground image of regions of interest, whereas sink nodes indicate unwanted background regions. This technique creates a large pixel difference at the edge of the target image, allowing the system to distinguish regional foregrounds from backgrounds, isolate the tooth contour, and achieve tooth segmentation [14]. After extracting the contour of each tooth from real-time imaging, we extracted the bounding box of the closed contour of each tooth to detect the anatomical FACC. For example, when segmenting the maxillary teeth, the tooth zenith can be identified along the upper side of the bounding box, whereas the bottom side denotes the incisor line. By connecting the zenith and incisor line, the FACC can be detected. The FACC, as defined by Andrews (1979), serves as a reference line in straight-wire appliance design. We adopted the FACC because it is highly rediscoverable and reliable as a reference line, insusceptible to the surroundings, and unlikely to change over a lifetime; moreover, the axis does not require X-ray imaging in clinical practice and can be seen with the naked eye, allowing direct observation during treatment [15].
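To make the segmentation step concrete, below is a minimal sketch of GrabCut-based foreground extraction using OpenCV. The image path, region-of-interest rectangle, and iteration count are illustrative assumptions; the paper does not disclose the FACC module’s actual parameters or implementation.

```python
# Minimal sketch: GrabCut-style tooth segmentation on one intraoral frame.
# File name, ROI rectangle, and iteration count are hypothetical.
import cv2
import numpy as np

img = cv2.imread("intraoral_frame.png")            # hypothetical real-time camera frame
mask = np.zeros(img.shape[:2], dtype=np.uint8)     # per-pixel labels (definite/probable FG/BG)

# Region of interest around one tooth, assumed to come from the preceding
# morphological analysis step: (x, y, width, height) in pixels.
rect = (120, 80, 60, 90)

bgd_model = np.zeros((1, 65), dtype=np.float64)    # internal GMM state for background
fgd_model = np.zeros((1, 65), dtype=np.float64)    # internal GMM state for foreground

# Iterative graph-cut segmentation (source/sink nodes, energy minimization).
cv2.grabCut(img, mask, rect, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_RECT)

# Keep definite and probable foreground pixels as the tooth region, then extract
# the closed contour whose bounding box yields the zenith and incisor-line points.
tooth_mask = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 255, 0).astype(np.uint8)
contours, _ = cv2.findContours(tooth_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
```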

2.3.2. Bracket Bonding Navigation Module

In a limited space, the bracket bonding navigation module mapped the bracket position data (planned prior to the operation) onto real-time images. In doing so, it used the tooth contour and FACC information acquired by the FACC module, took the problem of optical distortion into account, and performed scale-ruler-based proportional measurement conversions. The module could avoid the operational interference and inconvenience caused by the use of positioning two-dimensional barcodes; it managed to transfer the bracket coordinates from virtual space to actual space. During scale conversion, the researchers had to identify immovable hard tissues for the system to calculate the ratio of pixels to actual distance (Figure 4). To do so, they obtained the mean of the left-to-target and right-to-target FACC distances for each tooth (Figure 5). The present study used the FACCs of neighboring teeth as the reference points because the FACC distance is less prone to cumulative errors caused by gingival changes and recession, missing information, and dental crowding. Furthermore, this mean could reduce unilateral measurement errors. Conversion was performed using the following equations:

where F denotes the FACC, B the bracket, D the distance (in mm), P the number of pixels, r the right tooth, and l the left tooth. The distance between the FACC of the target tooth and that of the right tooth is Pr pixels (Dr mm), whereas the distance between the FACC of the target tooth and that of the left tooth is Pl pixels (Dl mm). The distance between the bracket and the incisor line is PB pixels (DB mm).
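The equations themselves appear to have been lost from this copy of the text. A plausible reconstruction, assuming the pixel-to-millimeter scale is taken as the mean of the two FACC-distance ratios and then applied to the bracket-to-incisor-line distance, would be:

\[
s = \frac{1}{2}\left(\frac{D_r}{P_r} + \frac{D_l}{P_l}\right), \qquad D_B = P_B \cdot s
\]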

2.4. Intraoral Scanner and Its Operations

An intraoral scanner (TRIOS 3 Basic, 3Shape Dental Systems, Copenhagen, Denmark) was used to record the oral status of participants. Following the manufacturer’s instructions, the scanner was calibrated prior to scanning, with the lens preheated. High-power suction was used to remove most of the saliva, after which the remaining saliva was dried using compressed air. According to the procedures recommended by the scanner software, scanning began with the occlusal surface from the left mandibular second molar to the right mandibular second molar. Next, the lingual surface was scanned, followed by the buccal surface. Similarly, the occlusal surface was scanned for the maxillary teeth, followed by the buccal surface and the palatal surface. When scanning the occlusal surface, the lens-tooth distance was fixed at 0–5 mm; when scanning the buccal and lingual surfaces, the camera was rotated from 45° to 90°, during which the image-capturing process was displayed on the monitor, ensuring complete scanning without missing any blind spots. Following maxillary and mandibular teeth scanning, the centric occlusion positions were recorded, which comprised the positions of premolars and molars on both sides. Data obtained through the aforementioned procedures were used for preoperational modeling and post-operational bracket bonding modeling. The scanner accuracy and consistency were 6.9 ± 0.9 µm and 4.5 ± 0.9 µm, respectively [16].

2.5. Pretreatment Digital Setup and Planning of Bracket Positions

The intraoral scanner was also used to establish intraoral digital models of the participants. Specifically, pretreatment digital models were applied for a virtual setup using the digital modeling software Ortho Analyzer (3Shape Dental Systems, Copenhagen, Denmark). A virtual setup, performed in software, can separate each tooth and move it to the planned position. In the software program, unwanted and nonexistent tooth positions were excluded, after which the mesial and distal positions of each tooth were marked. Subsequently, the program semiautomatically marked the gingival margin of each tooth and allowed manual adjustment of unclear margins. After confirmation, the program defined proximal contact points according to the obtained gingival margin and mesiodistal direction. According to the six keys to normal occlusion proposed by Andrews et al. (1976), i.e., molar relationship, mesiodistal crown angulation, labiolingual crown inclination, no rotations, tight contacts, and a flat occlusal plane [17], we defined the final occlusion status. To avoid mesiodistal deviation of brackets causing crown rotation during treatment, we overlapped the Y axis with the FACC; the incisor line was regarded as the reference line for calculating the height of each bracket. The vertical positions of each bracket were then aligned with the final occlusion plane, namely Andrews’ plane [15], to ensure that bracket positions were marked correctly as planned. After bracket planning was completed using the 3D digital setup software, we obtained the absolute coordinates of each bracket on the tooth crown. The teeth and brackets could then be bonded using the developed AR-assisted bracket navigation system.

2.6. Clinical Procedure

  1. Place the retractor in the patient’s mouth.
  2. Clean the tooth surface with a low-speed brush and a polishing paste.
  3. Dry the tooth surface (do not etch the tooth surface or apply a separate bonding agent).
  4. Apply orthodontic adhesive (Orthomite LC; SunMedical) on the bracket base.
  5. Place the bracket with the Boone gauge (control group)/AR system (experiment group).
  6. Remove excessive adhesive.
  7. Light cure each tooth for 20 s.

2.7. Assessment of Accurate Bracket Positioning

Ortho Analyzer was used to compare the preoperational bracket position plan and the bonded bracket model through coordinate system quantification. The required measurement parameters were as follows: (1) the center of the bracket P(Xp, Yp) in preoperational planning, with the preoperational planning position of each tooth as the origin; (2) the position of the bonded bracket R(Xr, Yr); and (3) the horizontal deviation (∆Xt) and vertical deviation (∆Yt) between the actual adherence position and the planned position. The X axis denotes the mesiodistal direction, whereas the Y axis denotes the incisogingival direction (Figure 6). The error of measuring position deviation ranged from −0.01 to 0.01 mm, with a mean (standard deviation [SD]) of 0.01 (0.001) mm.
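A minimal sketch of the deviation computation described above, assuming each tooth has its own planned-position origin; the coordinates used here are made up for illustration.

```python
# Illustrative only: horizontal (mesiodistal) and vertical (incisogingival)
# deviation between the planned centre P(Xp, Yp) and the bonded centre R(Xr, Yr).

def position_deviation(planned, bonded):
    """Return (dX, dY) in mm for one bracket, in the per-tooth coordinate system."""
    xp, yp = planned
    xr, yr = bonded
    return xr - xp, yr - yp

dx, dy = position_deviation((0.00, 4.50), (0.10, 4.15))  # made-up coordinates
print(abs(round(dx, 2)), abs(round(dy, 2)))  # 0.1 mm mesiodistal, 0.35 mm incisogingival
```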

2.8. The Reliability of the AR-Assisted Bracket Navigation System

Next, the reliability of the AR system was verified. The resolution of the intraoral scanner was 640 × 640 pixels. In the preliminary test, when the operating distance was 12 mm, 33–35 pixels were observed every 10 mm, indicating an interval of 0.285–0.30 mm per pixel. When the effective operating distance increased, the number of pixels per unit length also increased, resulting in a smaller interval per pixel and thus higher accuracy.
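The per-pixel interval quoted above follows directly from the reported pixel counts; a one-line check:

```python
# 33-35 pixels observed per 10 mm at a 12 mm operating distance.
for pixels_per_10mm in (33, 35):
    print(round(10.0 / pixels_per_10mm, 3))  # 0.303 and 0.286 mm per pixel
```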

2.9. Statistical Analysis

Continuous variables are presented as the median ± interquartile range (IQR). The Shapiro–Wilk test indicated that the variables were not normally distributed. The Mann–Whitney U test was used to compare continuous variables between the novice and expert groups. The Wilcoxon signed-rank test was used to compare continuous variables between the experiment and control groups. p < 0.05 was considered statistically significant. All statistical analyses were conducted using SPSS for Windows (version 12, SPSS, Chicago, IL, USA). A post hoc power analysis for the Wilcoxon–Mann–Whitney test (two groups), conducted using G*Power 3.1.9.7, indicated 95% power to detect a small effect size at a significance level of 0.05.
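The analyses were run in SPSS; purely as a sketch, the same nonparametric tests are available in SciPy. The deviation values below are invented placeholders, not study data.

```python
# Equivalent tests in Python (illustrative; not the study's SPSS workflow).
from scipy import stats

control = [0.64, 0.71, 0.58, 0.90, 0.40, 0.62]   # placeholder deviations (mm)
ar_group = [0.35, 0.41, 0.29, 0.51, 0.30, 0.33]  # placeholder deviations (mm)

print(stats.shapiro(control))                 # Shapiro-Wilk normality check
print(stats.mannwhitneyu(control, ar_group))  # unpaired comparison (e.g., novice vs. expert)
print(stats.wilcoxon(control, ar_group))      # paired comparison (control vs. AR)
```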

3. Results
3.1. The Results of Vertical Position Deviation

The position deviations between the control and AR groups are presented in Table 1. The median vertical position deviation in the control and AR groups was 0.64 ± 0.37 mm and 0.35 ± 0.15 mm, respectively (p < 0.001). The median vertical position deviation in the control and AR groups by the novice orthodontist was 0.90 ± 0.06 mm and 0.51 ± 0.24 mm, respectively (p < 0.001), and that by the expert orthodontist was 0.40 ± 0.29 mm and 0.29 ± 0.08 mm, respectively (p < 0.001). The percentage of vertical deviation improvement for the novice and expert orthodontists was 43% (p < 0.05) and 28% (p < 0.05), respectively.

3.2. The Results of Horizontal Position Deviation

The position deviations between the control and AR groups are presented in Table 1. The median horizontal position deviation was 0.32 ± 0.18 mm and 0.26 ± 0.11 mm, respectively (p > 0.05). The median horizontal position deviation in the control and AR groups by the novice orthodontist was 0.28 ± 0.09 mm and 0.27 ± 0.16 mm, respectively (p > 0.05), and that by the expert orthodontist was 0.36 ± 0.21 mm and 0.25 ± 0.10 mm, respectively (p > 0.05). The percentage of horizontal deviation improvement was 4% (p > 0.05) and 31% (p > 0.05), respectively.

3.3. The Effect of the AR-Assisted Bracket Navigation System

The results revealed that after implementing the AR system, the vertical error was reduced by approximately 0.29 mm (45%), whereas the horizontal error was reduced by only approximately 0.06 mm (18%) because the features between the teeth were inconspicuous. Furthermore, the novice orthodontist using the AR system achieved nearly the same accuracy as the expert orthodontist using the Boone gauge.
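For clarity, the quoted reductions follow from the overall median values reported above for the control and AR groups:

```python
# Arithmetic behind the reductions quoted above (overall medians from Table 1).
vert_control, vert_ar = 0.64, 0.35
horiz_control, horiz_ar = 0.32, 0.26

print(round(vert_control - vert_ar, 2),
      round((vert_control - vert_ar) / vert_control, 2))     # 0.29 mm, ~45%
print(round(horiz_control - horiz_ar, 2),
      round((horiz_control - horiz_ar) / horiz_control, 2))  # 0.06 mm, ~18-19%
```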

4. Discussion

The ideal bracket position is vital for efficient orthodontic treatment, and many studies have discussed ideal positioning since 1976. Andrews introduced the bracketing technique of the straight-wire concept, which involves placing the bracket up or down until the middle of the bracket slot base is at the same height as the midpoint of the clinical crown [7–9]. In the current study, we used initial digital modelling to predict the final setup, define each volunteer’s bracket positions, and measure the bracket heights before transferring them to the AR-assisted bracket navigation system. Various methods have been proposed for ideal bracket positioning [2,12,13,18–20]. They can be divided into direct and indirect bonding techniques. The direct bonding technique uses a gauge to directly measure the ideal bracket positions in the oral cavity. This technique has the advantages of no interference from a guiding tray, better suitability for crowded teeth, and immediately adjustable bracket positions. However, the direct bonding technique is influenced by the measurement scale of the gauge, tooth morphology, and the operator’s experience. The indirect bonding technique uses a transfer tray to transfer the brackets onto the teeth in the mouth. It is divided into two stages: the lab stage, which involves locating the bracket positions on a stone model and fabricating a transfer tray to duplicate them, and the clinical stage, which involves seating the tray with the brackets in the patient’s mouth. The indirect bonding technique therefore overcomes the limited view in the oral cavity. Nevertheless, placement errors may occur depending on the transfer tray material, the operator’s finger pressure, and technician skills. Several researchers have designed many types of transfer trays, including polyvinyl siloxane trays, vacuum-formed trays, 3-dimensional-printed stents, and transfer jigs [11,13,18,21–23]. The AR-assisted bracket navigation system combines the advantages of both bonding techniques and overcomes their disadvantages.

Aichert et al. (2012) used digital volume tomography from a computed tomography scan to overlay the bonding-procedure video image using an AR system in vitro, and the average error with their technique was 2.1 mm [24]. By contrast, the average error in our study with the novice and expert orthodontists was 0.51 mm and 0.29 mm, respectively, indicating considerably higher accuracy. Our study has clinical significance because the clinically acceptable placement error is 0.5 mm. Furthermore, we used digital modeling with an intraoral scanner to determine the bracket positions, thus avoiding the radiation exposure of computed tomography used in Aichert et al.’s study. Our device used a wireless handheld intraoral camera to capture images of the teeth and sent them for data processing. The registration method used real-time natural-feature registration to superimpose the ideal bracket position of the virtual object on the tooth surface (Figure 7).

We used a real-time corner detection algorithm to achieve real-world integrity and high registration accuracy [3]. The AR-assisted bracket navigation system was displayed on a screen in front of the operator rather than on a head-mounted device, reducing the load on the operator’s neck. Within an acceptable error rate, the proposed system provided a fluent user experience because no large volume of calculations had to be completed within a limited time. As a result, deviations in the expert group in the vertical (incisogingival) direction were 0.40 mm and 0.29 mm for the control and AR groups, respectively, and the improvement rate was 28%. Deviations in the novice group were 0.90 mm and 0.51 mm, respectively, and the improvement rate was 43%. Notably, using the AR-assisted bracket navigation system, the accuracy of the novice orthodontist approached that of the expert orthodontist using the Boone gauge. However, no significant differences were observed in the horizontal (mesiodistal) direction regardless of the orthodontist’s experience or the system used; a possible reason is that pretreatment root parallelism was not taken into consideration. The finding nevertheless remains clinically relevant because the horizontal deviations of both methods were less than 0.5 mm. The 0.5 mm standard is regarded as the acceptable clinical limit according to the American Board of Orthodontics Objective Grading System [25], which recommends that deviations of alignment and marginal ridges be at the same level or within 0.5 mm [25]. Castilla et al. (2014), who investigated the linear differences between five indirect bonding techniques, reported deviations ranging between 0.06 mm and 0.49 mm [26]. According to an in vivo study by Grünheid et al. (2015), statistical differences were confirmed under the 0.5 mm deviation limit [13]. The acceptable deviation in the present study was consistent with those in the two aforementioned studies.
Intraoral camera resolution plays a crucial role in the accuracy of bracket bonding when the AR-assisted bracket navigation system is used: the higher the resolution, the more detailed the information in each pixel. We proposed a feature-based facial axis of the clinical crown (FACC) detection algorithm to decrease the error in mapping the augmented information to the real image. We extracted the contour of each tooth from the color image based on the difference in signal between hard and soft tissue in the color space. Next, on the basis of the tooth contour, the incisor line and the FACC, the two crucial reference lines, were detected for each tooth using the proposed algorithm. The position of the bracket on each tooth depended on these two reference lines. Thus, the system used indirect information to ensure the accuracy of the augmented information during mapping from the virtual world to the real world. High accuracy in bracket positioning can reduce the need for first- and second-order bends, thus reducing treatment time and complexity, and can also reduce the clinical chair time required for bracket rebonding and leveling, ensuring efficient and precise treatment.
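The paper does not publish the detection pipeline itself; purely as an illustrative sketch, the ingredients named above (color-space separation of hard and soft tissue, tooth contour extraction, and corner features for registration) correspond roughly to standard OpenCV operations such as the following. The color thresholds, file name, and parameter values are assumptions, not the authors' algorithm.

```python
# Illustrative only: a generic color-threshold + contour + corner pipeline.
import cv2

frame = cv2.imread("intraoral_frame.png")            # hypothetical camera frame
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Separate whitish hard tissue from reddish soft tissue by color (assumed ranges).
tooth_mask = cv2.inRange(hsv, (0, 0, 150), (180, 60, 255))

# Extract tooth contours from the binary mask.
contours, _ = cv2.findContours(tooth_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

# Detect corner features inside the masked region for registration.
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
corners = cv2.goodFeaturesToTrack(gray, maxCorners=100, qualityLevel=0.01,
                                  minDistance=5, mask=tooth_mask)
print(len(contours), 0 if corners is None else len(corners))
```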
Third-order bends are not discussed in our study because they are easily influenced by the height of contour of the tooth morphology, adhesive thickness, and bracket height. One limitation of the device was that it did not consider root information for the facial axis of the clinical crown; adding root information to the AR-assisted bracket navigation system might increase accuracy in the horizontal (mesiodistal) direction. Another limitation was the inadequate fluency and optimization of the system. The system could benefit from hardware and software upgrades, including a more powerful graphics processing unit, a high-resolution micro lens, higher network transfer speeds, algorithm optimization, and more robust image extraction technology. Future studies should combine the AR-assisted bracket navigation system with artificial intelligence models for tooth image extraction [27–29] and use 5G technology for faster data transfer [30], a more efficient graphics processing unit, an optimized algorithm, and a high-resolution micro lens, thereby increasing accuracy and reliability. This AR system could have many applications in dentistry, including temporary anchorage device navigation, root parallelism display, and Bolton analysis [31]. The present study represents only the beginning of AR application in the orthodontic field [32]. Many clinical treatments can use AR systems, which will make treatment safer and more efficient.

5. Conclusions

Using the AR-assisted bracket navigation system improved the accuracy of bracket placement and decreased the procedure time of the lab stage. Specifically, the use of this system increased the accuracy achieved by an expert orthodontist in the incisogingival direction and helped the novice orthodontist guide the bracket position to within an acceptable clinical error of approximately 0.5 mm. The application of AR to orthodontic bracketing is just the beginning of digitalization, and the orthodontic field will benefit from numerous inventions and development ideas. In addition, the AR-assisted bracket navigation system can serve as a clinical training and education tool for novice dentists.

Author Contributions: Conceptualization, Y.-C.L., G.-A.C. and J.-H.Y.; methodology, Y.-C.L., G.-A.C., J.-T.H. and J.-H.Y.; writing—original draft preparation, Y.-C.L., G.-A.C., Y.-C.L., Y.-H.C., J.-T.H. and J.-H.Y.; writing—review and editing, Y.-C.L. and G.-A.C. All authors have read and agreed to the published version of the manuscript.

Funding: The authors declare that they have not received funding.

Institutional Review Board Statement: The study was performed in the Department of Orthodontics, China Medical University Hospital Medical Center, Taichung, Taiwan. The study was approved by the local research ethics committee (CMUH106-REC3-146), and each participant signed an informed consent form.

Informed Consent Statement: Informed consent was obtained from all subjects involved in the study.

Data Availability Statement: The data presented in this study are contained within the article and are available.

Conflicts of Interest: The authors declare no conflict of interest.

References:

  1. Sutherland, I.E. A head-mounted three dimensional display. In Proceedings of the December 9–11, 1968, Fall Joint Computer Conference, Part I on—AFIPS ’68 (Fall, Part I); ACM: New York, NY, USA, 1968; pp. 757–764.
  2. Caudell, T.; Mizell, D. Augmented reality: An application of heads-up display technology to manual manufacturing processes. In Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences, Kauai, HI, USA, 7–10 January 1992; pp. 659–669. [CrossRef]
  3. Jiang, J.; Huang, Z.; Qian, W.; Zhang, Y.; Liu, Y. Registration Technology of Augmented Reality in Oral Medicine: A Review. IEEE Access 2019, 7, 53566–53584. [CrossRef]
  4. Lin, Y.-K.; Yau, H.-T.; Wang, I.-C.; Zheng, C.; Chung, K.-H. A Novel Dental Implant Guided Surgery Based on Integration of Surgical Template and Augmented Reality. Clin. Implant. Dent. Relat. Res. 2013, 17, 543–553. [CrossRef] [PubMed]
  5. Pellegrino, G.; Mangano, C.; Mangano, R.; Ferri, A.; Taraschi, V.; Marchetti, C. Augmented reality for dental implantology: A pilot clinical report of two cases. BMC Oral Health 2019, 19, 1–8. [CrossRef] [PubMed]
  6. Huang, T.-K.; Yang, C.-H.; Hsieh, Y.-H.; Wang, J.-C.; Hung, C.-C. Augmented reality (AR) and virtual reality (VR) applied in dentistry. Kaohsiung J. Med. Sci. 2018, 34, 243–248. [CrossRef] [PubMed]
  7. Andrews, L.F. The straight-wire appliance. Explained and compared. J. Clin. Orthod. 1976, 10, 174–195. [PubMed]
  8. Andrews, L.F. The straight-wire appliance, origin, controversy, commentary. J. Clin. Orthod. JCO 1976, 10, 99–114. [PubMed]
  9. Andrews, L.F. The straight-wire appliance arch form, wire bending & an experiment. J. Clin. Orthod. JCO 1976, 10, 581–588.
  10. Jernberg, G.R.; Bakdash, M.B.; Keenan, K.M. Relationship between Proximal Tooth Open Contacts and Periodontal Disease. J. Periodontol. 1983, 54, 529–533. [CrossRef]
  11. Xue, C.; Xu, H.; Guo, Y.; Xu, L.; Dhami, Y.; Wang, H.; Liu, Z.; Ma, J.; Bai, D. Accurate bracket placement using a computer-aided design and computer-aided manufacturing–guided bonding device: An in vivo study. Am. J. Orthod. Dentofac. Orthop. 2020, 157, 269–277. [CrossRef]
  12. Koo, B.C.; Chung, C.-H.; Vanarsdall, R.L. Comparison of the accuracy of bracket placement between direct and indirect bonding techniques. Am. J. Orthod. Dentofac. Orthop. 1999, 116, 346–351. [CrossRef]
  13. Grünheid, T.; Lee, M.S.; Larson, B.E. Transfer accuracy of vinyl polysiloxane trays for indirect bonding. Angle Orthod. 2016, 86, 468–474. [CrossRef]
  14. Rother, C.; Kolmogorov, V.; Blake, A. “GrabCut”: Interactive foreground extraction using iterated graph cuts. ACM Trans. Graph. 2004, 23, 309–314. [CrossRef]
  15. Andrews, L.F. The Straight-Wire Appliance. Br. J. Orthod. 1979, 6, 125–143. [CrossRef] [PubMed]
  16. Chiu, A.; Chen, Y.-W.; Hayashi, J.; Sadr, A. Accuracy of CAD/CAM Digital Impressions with Different Intraoral Scanner Parameters. Sensors 2020, 20, 1157. [CrossRef] [PubMed]
  17. Andrews, L.F. The six keys to normal occlusion. Am. J. Orthod. 1972, 62, 296–309. [CrossRef]
  18. Schmid, J.; Brenner, D.; Recheis, W.; Hofer-Picout, P.; Brenner, M.; Crismani, A.G. Transfer accuracy of two indirect bonding techniques—an in vitro study with 3D scanned models. Eur. J. Orthod. 2018, 40, 549–555. [CrossRef]
  19. Ousehal, L.; Lazrak, L.; Troedhan, A.; Kurrek, A.; Wainwright, M. The accuracy of brackets placement in direct bonding technique: A comparison between the pole-like bracket positioning gauge and the star-like bracket positioning gauge. Open J. Stomatol. 2011, 1, 121–125. [CrossRef]
  20. Balut, N.; Klapper, L.; Sandrik, J.; Bowman, D. Variations in bracket placement in the preadjusted orthodontic appliance. Am. J. Orthod. Dentofac. Orthop. 1992, 102, 62–67. [CrossRef]
  21. Kalange, J.T. Prescription-Based Precision Full Arch Indirect Bonding. Semin. Orthod. 2007, 13, 19–42. [CrossRef]
  22. Sondhi, A. Effective and Efficient Indirect Bonding: The Sondhi Method. Semin. Orthod. 2007, 13, 43–57. [CrossRef]
  23. Moskowitz, E.M. Indirect Bonding with a Thermal Cured Composite. Semin. Orthod. 2007, 13, 69–74. [CrossRef]
  24. Aichert, A.; Wein, W.; Ladikos, A.; Reichl, T.; Navab, N. Image-Based Tracking of the Teeth for Orthodontic Augmented Reality; Springer: Berlin/Heidelberg, Germany, 2012; pp. 601–608.
  25. Casko, J.S.; Vaden, J.L.; Kokich, V.G.; Damone, J.; James, R.; Cangialosi, T.J.; Riolo, M.L.; Owens, S.E.; Bills, E.D. Objective grading system for dental casts and panoramic radiographs. Am. J. Orthod. Dentofac. Orthop. 1998, 114, 589–599. [CrossRef]
  26. Castilla, A.E.; Crowe, J.J.; Moses, J.R.; Wang, M.; Ferracane, J.L.; Covell, D.A. Measurement and comparison of bracket transfer accuracy of five indirect bonding techniques. Angle Orthod. 2014, 84, 607–614. [CrossRef] [PubMed]
  27. Hung, H.-C.; Wang, Y.-C.; Wang, Y.-C. Applications of Artificial Intelligence in Orthodontics. Taiwan J. Orthod. 2020, 32, 3.
  28. Prados-Privado, M.; Villalón, J.G.; Martínez-Martínez, C.H.; Ivorra, C. Dental Images Recognition Technology and Applications: A Literature Review. Appl. Sci. 2020, 10, 2856. [CrossRef]
  29. Chen, H.; Zhang, K.; Lyu, P.; Li, H.; Zhang, L.; Wu, J.; Lee, C.-H. A deep learning approach to automatic teeth detection and numbering based on object detection in dental periapical films. Sci. Rep. 2019, 9, 1–11. [CrossRef]
  30. Al-Falahy, N.; Alani, O.Y.K. Technologies for 5G Networks: Challenges and Opportunities. IT Prof. 2017, 19, 12–20. [CrossRef]
  31. Farronato, M.; Maspero, C.; Lanteri, V.; Fama, A.; Ferrati, F.; Pettenuzzo, A.; Farronato, D. Current state of the art in the use of augmented reality in dentistry: A systematic review of the literature. BMC Oral Health 2019, 19, 1–15. [CrossRef]
  32. Kwon, H.-B.; Park, Y.-S.; Han, J.-S. Augmented reality in dentistry: A current perspective. Acta Odontol. Scand. 2018, 76, 497–503. [CrossRef]

Current state of the art in the use of augmented reality in dentistry: a systematic review of the literature


Abstract

Background: The aim of the present systematic review was to screen the literature and to describe current applications of augmented reality.

Materials and methods: The protocol design was structured according to the PRISMA-P guidelines and registered in PROSPERO. A review of the following databases was carried out: Medline, Ovid, Embase, Cochrane Library, Google Scholar and the grey literature. Data were extracted, summarized and collected for qualitative analysis and evaluated for individual risk of bias (R.O.B.) by two independent examiners. Collected data included: year of publication, journal with reviewing system and impact factor, study design, sample size, target of the study, hardware(s) and software(s) used or custom developed, primary outcomes, field of interest and quantification of the displacement error and timing measurements, when available. Qualitative evidence synthesis followed the SPIDER tool.

Results: From a primary search of 17,652 articles, 33 were considered in the review for qualitative synthesis. Sixteen of the selected articles were eligible for quantitative synthesis of heterogeneous data; 12 out of 13 judged the precision to be at least acceptable, while 3 out of 6 described an increase in operation time of about 1 h. Sixty percent (n = 20) of the selected studies refer to a camera-display augmented reality system, while 21% (n = 7) refer to a head-mounted system. The software proposed in the articles was self-developed by 7 authors, while the majority used commercially available packages. The applications proposed for augmented reality are: oral and maxillofacial surgery (OMS) in 21 studies, restorative dentistry in 5 studies, educational purposes in 4 studies and orthodontics in 1 study. The majority of the studies were carried out on phantoms (51%), and 11 (33%) were carried out on patients.

Conclusions: On the basis of the literature, current development is still insufficient for a full validation process; however, independently developed customized software for augmented reality seems promising for assisting routine procedures, complicated or specific interventions, and education and learning. The oral and maxillofacial area is predominant and the results in precision are promising, while timing is still very controversial: some authors describe longer preparation times of up to 60 min when using augmented reality, whereas others describe a reduced operating time of 50/100%.

Trial registration: The following systematic review was registered in PROSPERO with RN: CRD42019120058.

Keywords: Augmented reality, Virtual reality, Digital dentistry, Orthodontics, Maxillofacial surgery, Implantology, Systematic review, Education, Dental training.

Background

The first application of augmented reality was developed by Ivan Edward Sutherland in 1968 with a binocular “kinetic depth effect” system made of two cathode ray tubes. It was not until 1991 that the definition of “augmented reality” was first described by Tom Caudell of the Boeing Company. Since then, the popularity of augmented reality has exploded, especially in the last five years. Its application has also become easier, since many existing devices are compatible with this technology, while others are being developed to maximize its performance. The gaming industry is predominant in the augmented reality area because of the expertise brought by virtual reality development; the inheritance from this specific field has provided tools that are being used by some researchers, for example, virtual reality headsets. The definition of augmented reality refers to “a technology that superimposes a computer-generated image on a user’s view of the real world, thus providing a composite view”. Augmented reality, however, is commonly confused with virtual reality, since both have many aspects in common, even though the outcomes are completely different. Virtual reality, as the name suggests, is a virtual immersive environment where the user’s senses are stimulated with computer-generated sensations and feedback, generating an “interaction”. Augmented reality, instead, generates an interaction between the real environment and virtual objects. For example, a virtual reality system might be a head-worn helmet that simulates navigation inside the human body and permits the user to explore it on the basis of a virtual three-dimensional reconstruction. A similar example with augmented reality would permit the user to directly observe a human body and to see virtual objects on it, or through it, as if the anatomy of the body were superimposed. Immersive reality is similar to augmented reality, but the user interacts with a digital 3D world recreated from real 360° recordings. The user can navigate recordings that replace the real world in a convincing way; the 360° recordings recreate the continuity of the surroundings with no interruptions. There may also be physical interaction with the environment and physical feedback given by a haptic response when interacting with an object. Other features can be added, such as 3D directional audio, freedom of movement in the environment and conformance to human vision, which permits correct sizing of objects at a distance. Its application in dentistry began with the development of new visualization systems for anatomic exploration based on virtual-reality software. The growth in popularity has brought the use of augmented reality to the attention of medical researchers and of digital centers, who are following two different approaches: using already available systems or developing their own customized combination of hardware and software. However, substituting virtual reality with augmented reality means superimposing virtual objects onto reality in a precise and reproducible way, considering the three dimensions of space as well as the user’s and patient’s movements. This is still a controversial topic, since it is highly affected by the system used. Most authors propose a manual pre-operative calibration instead of an automatic one; however, the use of markers simplifies this tracking process.
The most commonly used systems are head-mounted displays and half-silvered mirror projections; both are valid systems for augmented reality and have a multitude of different settings, as described by Azuma et al. The superimposed virtual objects are usually obtained from three-dimensional X-ray data such as dental CT scans, which are then manipulated with commercially available CBCT software. MRI, angiography or any other three-dimensional data could also be used in the same way. The most commonly used software is Mimics (Materialise, Leuven). The object is exported in a widely recognisable format (.stl, for example) using a “mask” function with thresholding on the area of interest and a 3D reconstruction function. The revolutionary scope of developing an augmented reality-based system is to solve one of the biggest issues in the structure of most commonly available digital dentistry workflows; in fact, the use of digital technologies such as scanners is structured in a three-step procedure that can be summarized as follows: the digital image is acquired by a scanning device, the changes are performed digitally from T0 to T1, and the new information is transferred back to a solid state. The use of augmented reality permits direct visualization, bypassing the last transfer step, which means, on a large scale, avoiding data and time loss. Visualization of digital data directly on the patient offers the possibility of achieving great advantages in digital procedures. The aim of this systematic review was to collect and describe the available literature about the use of augmented reality in different fields of dentistry and maxillofacial surgery. The collected data are used to describe the current combinations of hardware and software proposed by the authors, with a focus on self-development, the fields of interest where augmented reality is being used, the primary outcomes obtained with the different systems, and the precision and timing of the procedures performed. Data about the samples considered in the different studies and the protocol designs proposed are also described.
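To make the segmentation-and-export step above concrete, here is a hedged sketch using generic open-source tools (not the commercial packages named in the review): threshold a CT/CBCT volume, reconstruct a surface, and hand the mesh on for export to a format such as .stl. The threshold value and file name are assumptions.

```python
# Illustrative sketch of the "mask by thresholding + 3D reconstruction" step.
import numpy as np
from skimage import measure

volume = np.load("cbct_volume.npy")   # hypothetical CBCT volume, values in HU
bone_mask = volume > 400              # simple HU threshold for bony structures

# Surface reconstruction (marching cubes) over the binary mask.
verts, faces, normals, values = measure.marching_cubes(
    bone_mask.astype(np.float32), level=0.5)

# verts/faces can now be written to a widely recognisable format such as .stl
# with any mesh library and loaded by the augmented reality display system.
print(verts.shape, faces.shape)
```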

Materials and methods

Preliminary research was conducted before the study design was finalized. Manuscripts from 1968, the year when augmented reality was first described, to the end of 2018 were considered. A protocol for the research was structured by the authors after screening the titles and abstracts of the articles found. After full agreement among the authors, it was registered in PROSPERO (registration number CRD42019120058). The search strategy included the databases to be screened and the search query. The articles found were selected by applying inclusion and exclusion criteria, and the resulting full texts were analyzed by the authors for data extraction. Full-text access was granted by the Università degli Studi di Milano (University of Milan), Orthodontics Department, for this research.

Search strategy

The review was carried out using the following electronic databases: Medline, Ovid, PubMed, Embase, Cochrane Library and Google Scholar. The research followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Protocols (PRISMA-P) 2015. The search query (MeSH terms) used is shown in Fig. 1. Grey literature was also screened according to the Pisa Declaration on Policy Development for Grey Literature Resources.

Inclusion criteria

Articles describing new or already existing applications or frameworks for augmented reality methodologies were considered; the relevant information included the type of intervention, field of interest, clinical outcomes, precision and timing efficiency of the proposed system, and the combination of software and hardware used. Articles in English referring to dentistry and oral and maxillofacial surgery were included. No limit on study design was applied; the targets of the studies considered were humans, human parts (extracted teeth), phantoms, and animals. Studies from 1968 onward were considered.

Exclusion criteria

All articles describing virtual reality systems were discarded, such as anatomical explorations, improper uses or any concept that does not match the exact definition of augmented reality given in the background section. All articles whose methodology description reported fewer than three of the following items were discarded: study design, sample size, hardware utilized, and software installed. All descriptive methodologies, conference papers, patents, and publications in general not identified as “Articles” were discarded. All application areas not related to dentistry, oral surgery or the craniofacial district were discarded.

Qualitative analysis and quantitative synthesis

The synthesis of the research outcomes followed the SPIDER (Sample, Phenomenon of Interest, Design, Evaluation, Research type) tool. Data regarding precision measurements and procedure times were collected. The precision data described the error, in millimeters or as a percentage, between the markers and the digital image, the error between the real object and the superimposed digital image, and the orientation error in degrees. Time measurements concerned the additional time required to fit the digital models or the time gained in the operative procedures. The high number of variables made the data too inconsistent for meta-analysis. The variables considered for qualitative synthesis were: the type of procedure and field of interest, the primary outcome and results obtained, the study design, the software used and whether it was custom made or already existing, the type of hardware, the sample size and the target of the study (animal/human or phantom).

Risk of bias in individual studies

Due to the high heterogeneity of the study designs, which is common for new technologies independently developed with different features, common tools for risk of bias assessment were not applicable. In general, the risk of bias was judged by the authors to be low or null for data description but very high for analyses of the effectiveness of such methodologies. All the studies found in the literature presented high or unknown selection bias and reference standards. Moreover, none of the studies referred to a specific protocol.

Results

The primary search yielded a total of 17,652 records after duplicate removal; 17,603 results were excluded on the basis of title and abstract, and 45 of the remaining 49 studies were considered eligible. After full-text reading by two of the authors, 33 articles were selected following application of the inclusion and exclusion criteria (Fig. 2). Variables regarding the sample size and target of the study (animal/human or phantom), type of hardware, software used, field of interest of the proposed procedure and study design were then extracted from the text, collected and discussed among all the authors. Of the 33 articles, 16 contained at least one quantitative description of the timing of the procedure or the precision of the proposed system.

Data extraction
Sample

Among the selected studies, 51% (n = 17) were performed on phantoms, with 2 of those also performed on a single volunteer: one for video see-through in maxillofacial surgery and one for overlaying computed tomography on the surgical area. Studies referring to experiments carried out on real human patients accounted for 33% (n = 11), counting the two with a single volunteer. Among the in vivo studies on humans, those referring to actual interventions carried out on patients with the use of augmented reality systems were: intraoral distractor positioning in 10 cases with 10 controls (OMS/maxillofacial surgery F.O.I.), waferless maxillary positioning in 16 class III patients (OMS/maxillofacial surgery F.O.I.), orthodontic positioning of brackets in one subject (orthodontics F.O.I.), MASO in 15 patients (OMS/maxillofacial surgery F.O.I.), maxillary positioning in 5 patients (OMS/maxillofacial surgery F.O.I.), and multiple operations in 148 patients (OMS F.O.I.). The outcomes of all interventions carried out on humans were described positively by the authors, without exception. Other samples were animals: MASO on dogs in 2017, vascular landmarks on one porcine tongue in 2015, and dental implants on a pig cadaver in 2015. A further study was carried out in vitro on 126 human teeth (endodontics F.O.I.) in 2013 (Table 1).

Phenomenon of interest: hardware used

Of the studies considered, the majority (60%, n = 20) refer to camera-display-based systems, although the most classical use of augmented reality refers to head-mounted systems, which were used in 21% of the studies considered (n = 7). Other systems described are silvered-glass mirrors or mirror-based systems (n = 3), with 3 selected studies using other specific systems. One system described consists of an interactive portable display unit, which can be defined as camera-display based and portable like an H.M.S., but not wearable (Table 2).

Phenomenon of interest: software used

The extracted data about the software used in the studies show that 7 authors describe new custom-made software, for a total of 9 studies. The authors involved in the development of the customized new software describe using the C++ programming language, while Bogdan describes using C++ and Java. The majority of studies present a variety of commercially available software packages, such as Leap Motion®, AR Toolkit®, ITK-SNAP®, Hitlab NZ®, Aumentaty®, Maya®, Iplan®, Implant Smart®, Dentsim®, Xscope®, Medscan® and others. The only software used by multiple authors is Mimics®, used by 4 authors for a total of 6 manuscripts (Table 2).

Field of interest: F.O.I

Of the 33 selected studies, the majority refer to the OMS (oral and maxillofacial surgery) area, which can be divided into three specific areas: implantology, maxillofacial surgery and oral surgery; the other areas were restorative dentistry, education and learning, and orthodontics.

Respectively, OMS included 21 studies, divided into 17 for maxillofacial surgery, 3 for implantology and 1 for oral surgery; restorative dentistry included 5 studies, education and learning 4 studies, and orthodontics 1 study (Fig. 3). The studies considered applied augmented reality technology to the following operations. Implant placement was performed on 3D-printed mandibular models by Jiang et al., with better accuracy, applicability and efficiency as the outcome: < 1.5 mm of linear deviation and < 5.5° of angular deviation. A Le Fort 1 osteotomy was performed on models by surgical residents, with more self-confidence and knowledge as the overall resulting experience. A very specific operation such as orbital implant placement was tested on a 3D-printed model, which is very useful for the instant feedback, with a translation error of 1.12–1.15 mm and a rotational error of < 3°. Inferior alveolar nerve block anesthesia was tested on one phantom model with good results, using just a camera and a laptop. MASO was carried out by two different authors, co-authors of one of the manuscripts; in their first study they described an increase in preparation time of about 1 h before surgery on a human. In their second study, on MASO performed on dog mandibles, they do not report such data; even with inexperienced operators, both studies yielded good results and the system was judged helpful. Other authors proposed the use of augmented reality with one of the most sophisticated pieces of hardware found in the literature, the Da Vinci Si robot, in 2015; their experiment involved the resection of a neoplasm on a porcine tongue using vascular landmarks. This is one of the only articles reporting a clear failure of the experiment, with a mean error of more than 5 mm. Other authors proposed the positioning of distractors for hemifacial microsomia with the use of augmented reality in 2015; for their study they enrolled 10 randomized cases and 10 controls presenting microsomia. The aim was to transfer the surgical planning to the surgical site in hemifacial microsomia elongation using a head-mounted display prepared with Mimics and the AR Toolkit software. They found the technology useful, with a difference between the vertical distances from the coronoid to the planes CP1 (AA′) and CP2 (AA′′) of 1.43 ± 0.13 mm in the AR group and 2.53 ± 0.39 mm in the control group. Another interesting study proposed the use of the NDI Polaris tracking system to solve the positioning issues related to the use of augmented reality. NDI Polaris is a tracking device that uses spherical markers captured by a set of two rapid-motion cameras. The system was implemented with self-developed software and a head-mounted device, as described by the authors, and was used for implant placement in a pig cadaver. The outcomes were evaluated through questionnaires, which indicated ergonomic benefits and easier procedures; linear and angular positioning errors were not assessed.

Outcomes

Thirteen studies quantified the errors in the superposition of the virtual objects onto reality or compared the outcomes with a traditional setup, while 6 studies evaluated the changes in the time needed for the intervention; in total, 16 studies considered at least one of the two variables, as described in Table 3. All but one of the studies considered the results satisfactory regarding the quantification of error/precision, but few considered the timing comparisons satisfactory.

The authors considered the 5 mm mean error of the tracking tool for vascular landmarks at the base of the tongue, used for neoplasm resection with the Da Vinci Si robot, to be unacceptable. Other authors reported that maxillary repositioning with X-scope was prolonged by approximately 1 h, while others found that MASO with computer-aided tools needs approximately 1 h of registration before the start, although they suggest this can be improved with experience in the future. Some authors in 2013 found maxillary repositioning using a custom portable device to be 60 min longer than a conventional operation. All the outcomes are collected in Table 3.

Research types/design

Among the selected studies, one is an experimental randomized clinical trial, three are cohort studies and three are review studies.

Discussion

The first studies taken into account were published in 2005, 37 years after the publication of the first head-mounted augmented reality system by Sutherland. Even though augmented reality is a specifically visual immersive system, most of the authors propose non-wearable display-camera systems. This reduces the effort related to stabilizing the overlap of two different dynamic systems, which is preponderant in head-mounted and portable systems, but also reduces the scope of “augmenting” the operator’s perception. The number of studies has grown rapidly since 2013, as can be seen in Fig. 4, and the most productive countries are China and Japan, which also collaborated with each other in different studies, followed by Germany, the UK and Belgium. The majority of the systems refer to the OMS area, specifically to maxillofacial surgery. Implantology and oral surgery, the two other subgroups, include just 4 studies out of the 21 in OMS, which means that 84% of OMS studies refer specifically to maxillofacial surgery. Education and learning studies (4) are almost as numerous as restorative dentistry studies (5). Orthodontics and endodontics are represented by one study each. There is a lack of systems studied across different fields; this could be explained by the high customization and knowledge required for every system to adapt to a specific field, even though some systems share the same hardware. The prevalence of studies in the maxillofacial area can be associated with the size of the area of intervention: the larger the subject to be viewed in augmented reality, the more applicable the system becomes. This fact can also be associated with the current availability of existing hardware and components used for customized systems; high-precision cameras with efficient stabilization and the possibility of zooming into a small area are still very expensive and large. The reference landmarks also strongly influence the predominant interest in the OMS area; in fact, trials carried out using vascular references, even with high-precision hardware, obtained outcomes that were considered unsatisfactory (mean error of more than 5 mm).

This could be a major limitation of this new technology in operations carried out exclusively on soft tissues, since their lack of stability represents an obstacle to stabilization of the overlapping images. The primary endpoints of the studies show general positivity regarding improvement, usefulness and even good outcomes in the precision of the proposed systems (higher than the usual standards in some cases). Educational systems were evaluated through questionnaires and elicited a very positive response from the students. While other fields of interest may appear to be taking their first steps with augmented technologies, education already seems ready for wider studies, since navigation systems were already available with the use of virtual reality at a low cost. A good response is also to be expected from younger generations, who adapt more readily to new technologies. The use of this technology could simplify digital procedures through direct visualization of virtual information (Fig. 5). Timing, however, is more controversial and depends heavily on the structure of the proposed system. The timing outcomes are very heterogeneous: some relate to the setup and calibration time, others refer to the duration of the intervention, and the educational studies refer to the time needed to gain a given skill in dental training and manual dexterity. The positivity of the outcomes and primary endpoints of the studies considered (31/33) should be taken with caution, since many of the systems described were self-developed by the authors’ own institutions. Custom-made software was not used by authors other than those who first described it, which is a major flaw and could represent a conflict of interest in validating a newly proposed system. There is also a lack of randomized clinical trials with a proper sample size calculation and other efforts to avoid major bias.

Conclusions

Most recent technologies are being developed with custom software: 7 out of 9 were self-developed by the authors in the last 5 years. More effort is needed to improve the hardware support; from what is known, a simple, portable and accessible tool is needed. Timing is a controversial topic in the different fields of interest, since half of the authors (3 out of 6) report an increase of at least one hour, while precision is judged satisfactory by most authors (12 out of 13). Although the technologies proposed have not been validated by external teams, customized augmented reality systems seem to provide good results in simple experimental models, since most of the studies were carried out on phantoms (51%, n = 17). The OMS area sees great advantages in interventions carried out on medium-sized surgical areas and is gaining the most benefit from this technology, since the superposition of digital images is easier on bony structures. Most of the studies were carried out in this field of augmented reality application (21 out of 33).

Abbreviations

F.O.I: Field of interest; H.M.S: Head Mounted System; O.M.S: Oral and Maxillofacial Surgery

Authors’ contributions

MF; DF. substantial contributions to the conception and design of the work. CM; AF; VL; FF; AP. the acquisition, analysis and interpretation of data. MF; DF; AF. have drafted the work or substantively revised it. MF; CM; VL; AF; FF; AP; DF. have approved the submitted version (and any substantially modified version that involves the author’s contribution to the study);MF; CM; VL; AF; FF; AP; DF. have agreed both to be personally accountable for the author’s own contributions and to ensure that questions related to the accuracy or integrity of any part of the work, even ones in which the author was not personally involved, are appropriately investigated, resolved, and the resolution documented in the literature.

Authors’ information

The authors MF; DF; FF; are working on a new augmented reality system.

References:

 

  1. Azuma RT. A survey of augmented reality. Presence Teleoperators Virtual Environ. 1997;6(4):355–85.
  2. Bimber O, Raskar R. Spatial augmented reality: merging real and virtual worlds. Wellesley: AK Peters/CRC Press; 2005.
  3. Van Krevelen D, Poelman R. Augmented reality: technologies, applications, and limitations. Vrije Univ Amsterdam Dep Comput Sci. 2007;9(2):1-20.
  4. Ausburn LJ, Ausburn FB. Desktop virtual reality: a powerful new technology for teaching and research in industrial teacher education. J Ind Teach Educ. 2004;41(4):1–16.
  5. Pulijala Y, Ma M, Pears M, Peebles D, Ayoub A. Effectiveness of immersive virtual reality in surgical training—a randomized control trial. J Oral Maxillofac Surg. 2018;76(5):1065–72.
  6. Joda T, Gallucci GO, Wismeijer D, Zitzmann NU. Augmented and virtual reality in dental medicine: a systematic review. Comput Biol Med. 2019;108:93-100.
  7. Satava RM, Jones SB. Current and future applications of virtual reality for medicine. Proc IEEE. 1998;86(3):484–9.
  8. Suenaga H, Tran HH, Liao H, Masamune K, Dohi T, Hoshi K, et al. Real-time in situ three-dimensional integral videography and surgical navigation using augmented reality: a pilot study. Int J Oral Sci. 2013;5(2):98.
  9. Jayaram S, Connacher HI, Lyons KW. Virtual assembly using virtual reality techniques. Comput Aided Des. 1997;29(8):575–84.
  10. Farronato G, Santamaria G, Cressoni P, Falzone D, Colombo M. The digitaltitanium Herbst. J Clin Orthod. 2011;45(5):263-7. quiz 287-8.
  11. Farronato G, Galbiati G, Esposito L, Mortellaro C, Zanoni F, Maspero C. Three-dimensional virtual treatment planning: Presurgical evaluation. J Craniofac Surg. 2018;29(5):e433–7.
  12. Caudell TP, Mizell DW. Augmented reality: an application of heads-up display technology to manual manufacturing processes. In: Proceedings of the twenty-fifth Hawaii international conference on system sciences, vol. 2. Kauai: IEEE; 1992. p. 659–69.
  13. Mangano F, Shibli JA, Fortin T. Digital dentistry: new materials and techniques. Int J Dent. 2016;2016:5261247.
  14. Moher D, Shamseer L, Clarke M, Ghersi D, Liberati A, Petticrew M, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Syst Rev. 2015;4(1):1.
  15. Cooke A, Smith D, Booth A. Beyond PICO: the SPIDER tool for qualitative evidence synthesis. Qual Health Res. 2012;22(10):1435–43.
  16. Jiang W, Ma L, Zhang B, Fan Y, Qu X, Zhang X, Liao H. Evaluation of the 3D augmented reality–guided intraoperative positioning of dental implants in edentulous mandibular models. Int J Oral Maxillofac Implants. 2018;33(6):1219-28.
  17. Murugesan YP, Alsadoon A, Manoranjan P, Prasad PWC. A novel rotational matrix and translation vector algorithm: geometric accuracy for augmented reality in oral and maxillofacial surgeries. Int J Med Robot Comput Assist Surg. 2018;14(3):e1889.
  18. Schreurs R, Dubois L, Becking AG, Maal TJJ. Implant-oriented navigation in orbital reconstruction. Part 1: technique and accuracy study. Int J Oral Maxillofac Surg. 2018;47(3):395–402.
  19. Liu WP, Richmon JD, Sorger JM, Azizian M, Taylor RH. Augmented reality and cone beam CT guidance for transoral robotic surgery. J Robot Surg. 2015;9(3):223–33.
  20. Qu M, Hou Y, Xu Y, Shen C, Zhu M, Xie L, et al. Precise positioning of an intraoral distractor using augmented reality in patients with hemifacial microsomia. J Cranio-Maxillofac Surg. 2015;43(1):106–12.
  21. Wang J, Suenaga H, Yang L, Kobayashi E, Sakuma I. Video see-through augmented reality for oral and maxillofacial surgery. Int J Med Robot Comput Assist Surg. 2017;13(2):e1754.
  22. Zinser MJ, Mischkowski RA, Dreiseidler T, Thamm OC, Rothamel D, Zöller JE. Computer-assisted orthognathic surgery: waferless maxillary positioning, versatility, and accuracy of an image-guided visualisation display. Br J Oral Maxillofac Surg. 2013;51(8):827–33.
  23. Lin YK, Yau HT, Wang IC, Zheng C, Chung KH. A novel dental implant guided surgery based on integration of surgical template and augmented reality. Clin Implant Dent Relat Res. 2015;17(3):543–53.
  24. Aichert A, Wein W, Ladikos A, Reichl T, Navab N. Image-based tracking of the teeth for orthodontic augmented reality. In: International conference on medical image computing and computer-assisted intervention. Berlin, Heidelberg: Springer; 2012. p. 601–8.
  25. Bruellmann DD, Tjaden H, Schwanecke U, Barth P. An optimized video system for augmented reality in endodontics: a feasibility study. Clin Oral Investig. 2013;17(2):441–8.
  26. Zhu M, Chai G, Zhang Y, Ma X, Gan J. Registration strategy using occlusal splint based on augmented reality for mandibular angle oblique split osteotomy. J Craniofac Surg. 2011;22(5):1806–9.
  27. Mischkowski RA, Zinser MJ, Kübler AC, Krug B, Seifert U, Zöller JE. Application of an augmented reality tool for maxillary positioning in orthognathic surgery–a feasibility study. J Cranio-Maxillofac Surg. 2006;34(8): 478–83.
  28. Ewers R, Schicho K, Undt G, Wanschitz F, Truppe M, Seemann R, Wagner A. Basic research and 12 years of clinical experience in computer-assisted navigation technology: a review. Int J Oral Maxillofac Surg. 2005;34(1):1–8.
  29. Wierinck E, Puttemans V, Van Steenberghe D. Effect of tutorial input in addition to augmented feedback on manual dexterity training and its retention. Eur J Dent Educ. 2006;10(1):24–31.
  30. Ewers R, Schicho K, Wagner A, Undt G, Seemann R, Figl M, Truppe M. Seven years of clinical experience with teleconsultation in craniomaxillofacial surgery. J Oral Maxillofac Surg. 2005;63(10):1447–54.
  31. Bogdan CM, Popovici DM. Information system analysis of an e-learning system used for dental restorations simulation. Comput Methods Prog Biomed. 2012;107(3):357–66.
  32. Espejo-Trung LC, Elian SN, Luz MAADC. Development and application of a new learning object for teaching operative dentistry using augmented reality. J Dent Educ. 2015;79(11):1356–62.
  33. Llena C, Folguera S, Forner L, Rodríguez-Lozano FJ. Implementation of augmented reality in operative dentistry learning. Eur J Dent Educ. 2018; 22(1):e122–30.
  34. Badiali G, Ferrari V, Cutolo F, Freschi C, Caramella D, Bianchi A, Marchetti C. Augmented reality as an aid in maxillofacial surgery: validation of a wearable system allowing maxillary repositioning. J Cranio-Maxillofac Surg. 2014;42(8):1970–6.
  35. Farronato M, Lucchina AG, Mortellaro C, Fama A, Galbiati G, Farronato G, Maspero C. Bilateral hyperplasia of the coronoid process in pediatric patients: what is the gold standard for treatment? J Craniofac Surg. 2019; 30(4):1058-63.
  36. Won YJ, Kang SH. Application of augmented reality for inferior alveolar nerve block anesthesia: a technical note. J Dent Anesth Pain Med. 2017; 17(2):129–34.
  37. Zhou C, Zhu M, Shi Y, Lin L, Chai G, Zhang Y, Xie L. Robot-assisted surgery for mandibular angle split osteotomy using augmented reality: preliminary results on clinical animal experiment. Aesthet Plast Surg. 2017;41(5):1228–36.
  38. Plessas A. Computerized virtual reality simulation in preclinical dentistry: can a computerized simulator replace the conventional phantom heads and human instruction? Simul Healthc. 2017;12(5):332–8.
  39. Katić D, Spengler P, Bodenstedt S, Castrillon-Oberndorfer G, Seeberger R, Hoffmann J, et al. A system for context-aware intraoperative augmented reality in dental implant surgery. Int J Comput Assist Radiol Surg. 2015;10(1):101–8.
  40. Wang J, Suenaga H, Hoshi K, Yang L, Kobayashi E, Sakuma I, Liao H. Augmented reality navigation with automatic marker-free image registration using 3-D image overlay for dental surgery. IEEE Trans Biomed Eng. 2014;61(4):1295–304.
  41. Wierinck ER, Puttemans V, Swinnen SP, van Steenberghe D. Expert performance on a virtual reality simulation system. J Dent Educ. 2007;71(6):759–66.
  42. Wierinck E, Puttemans V, Swinnen S, van Steenberghe D. Effect of augmented visual feedback from a virtual reality simulation system on manual dexterity training. Eur J Dent Educ. 2005;9(1):10–6.
  43. Nijmeh AD, Goodger NM, Hawkes D, Edwards PJ, McGurk M. Image-guided navigation in oral and maxillofacial surgery. Br J Oral Maxillofac Surg. 2005; 43(4):294–302.
  44. Shahrbanian S, Ma X, Aghaei N, Korner-Bitensky N, Moshiri K, Simmonds MJ. Use of virtual reality (immersive vs. non immersive) for pain management in children and adults: a systematic review of evidence from randomized controlled trials. Eur J Exp Biol. 2012;2(5):1408–22.
  45. Wang J, Suenaga H, Liao H, Hoshi K, Yang L, Kobayashi E, Sakuma I. Real-time computer-generated integral imaging and 3D image calibration for augmented reality surgical navigation. Comput Med Imaging Graph. 2015;40:147-59.

Augmented Reality in Esthetic Dentistry: a Case Report

Authors

Romane Touati & Vincent Fehmer & Maxime Ducret & Irena Sailer & Laurent Marchand

Abstract

Purpose of Review The aim of this case report was to illustrate the clinical procedure integrating augmented reality (AR) for complex patient cases requiring full mouth rehabilitation.

Recent Findings The introduction of AR technology to the fields of medicine and dentistry has led to numerous applications in education, surgery, and esthetics. Recently, new AR software was introduced in esthetic dentistry that allows real-time smile projection and thus improves communication with patients and the dental laboratory.

Summary The presented case shows a patient with multiple missing teeth, diastemata, and an impaired masticatory and phonetic ability. After reconstruction of the posterior zone, the AR software was used for the conception of the esthetic zone, integrating the patient into the decision-making process. The result was an esthetic rehabilitation applying palatal and buccal veneers which corresponded to the chosen AR design. The patient appreciated the opportunity to pre-visualize a possible final outcome in an interactive way which increased his confidence in the chosen treatment. Further studies are needed to assess the precision and reproducibility of the described protocol.

Introduction

Augmented reality (AR) is a technology that combines the virtual and physical worlds and thereby augments the real-world experience [1]. Although the term “augmented reality” was coined in the early 1990s, the idea of merging the digital and physical worlds dates back some 30 years earlier. Early predecessors of modern AR technology were originally invented for aerospace engineering [2] and related industrial applications [3]. With the rapid evolution of computer and camera technology around the turn of the millennium, however, the scope of application for AR has increased significantly.

Apart from industrial uses, AR has since been introduced into the fields of medicine and dentistry with a multitude of emerging applications. Today, AR is used in dentistry for educational purposes [4•], providing a tool to objectively evaluate students and give them direct feedback. Further developments of AR technology have led to its application in guided oral surgery and pre-operative planning [4•, 5, 6].

In esthetic dentistry, achieving a patient-specific, optimal appearance of the planned restorations may be considered an important objective. A pre-visualization can be achieved by means of a conventional laboratory-made wax-up and intraoral mock-up, or in the form of a two-dimensional smile design created by overlaying idealized tooth forms onto a portrait photograph of the patient [7•]. Recently, a procedure using a three-dimensional facial scan of the patient was proposed, but it requires considerable time and the ability to combine different software programs, which is not yet straightforward for all prosthetic laboratories [8–10]. To overcome this problem, a solution using AR software was proposed, offering fast 3D conception and promising results [11, 12]. However, this approach is currently limited to the design and conception of teeth in the anterior maxilla. In more complex patient cases, such pre-visualization is often difficult to implement because of the multitude of variables that must be considered. In situations where a full mouth rehabilitation is indicated, there is currently no protocol for the application of AR technology.

The present case report illustrates the application, as well as the advantages and current drawbacks, of AR technology for esthetic full mouth reconstruction.

Case Illustration

A 59-year-old male patient sought treatment at the University Clinic for Dental Medicine of Geneva with the primary wish to improve his masticatory and phonetic function as well as the esthetic appearance of his smile. The patient was healthy and took no medication, but was a former smoker with a history of periodontitis. He had lost all premolar and molar teeth in the upper arch, along with the upper left canine, due to periodontitis (Fig. 1). In the lower arch, all molar teeth were missing, as well as the right central incisor (tooth 41, Fig. 1). Due to the lack of posterior support, the remaining dentition was heavily worn, with multiple areas of exposed dentine. Furthermore, because of missing interproximal contacts, multiple visible diastemata had opened up, impairing the patient’s smile.

Apart from the esthetic aspect, the diastemata disturbed the patient’s phonetic ability as well as his ability to play brass instruments, his favorite pastime. As the intraoral situation had deteriorated slowly over several decades, the patient had become anxious about undergoing a full mouth rehabilitation. He had been restored with removable partial dentures in the past but ultimately never wore them due to lack of comfort. He therefore had high expectations of a possible fixed rehabilitation in terms of esthetic appearance, function, and comfort. At the same time, the patient had difficulty formulating an opinion on how his teeth should look, as he had lived with worn-down teeth for decades. The ability to pre-visualize a possible final outcome and discuss the treatment goals was therefore of great importance.

In situations where a purely esthetic correction is desired, AR technology has been applied successfully to provide such pre-visualization [12]. As the AR software uses existing teeth as fixed points to provide a predictable smile projection, its use in the present case was limited and inaccurate. For this reason, an initial approach using a conventional wax-up and mock-up was chosen to define the goal of the treatment, which served as a guideline for the restorative team (Fig. 2). The final treatment plan was to first reconstruct the posterior areas in both jaws with implant-supported restorations to stabilize the bite. For the rehabilitation of the maxillary anterior zone, an adhesive approach using palatal and vestibular veneers was planned. This combined sandwich technique allowed for a tissue-preserving preparation of the elongated and periodontally involved maxillary incisors. In the anterior mandible, direct composites were planned to reconstruct the incisal edges, and a resin-bonded bridge with a single-wing design was to replace the missing tooth 41.

The initial mock-up was used as a reference for implant placement (tissue-level implants, Straumann, Basel, Switzerland) in the posterior regions. After an uneventful healing period, digitally designed and computer-assisted-manufactured reconstructions with a cobalt-chromium framework (Coron cobalt-chromium alloy, Straumann, Switzerland) and feldspathic ceramic veneering were fabricated (Fig. 3). This re-established a correct vertical dimension of occlusion (VDO) and stabilized the bite. Before starting the rehabilitation of the esthetic zone, however, a smile proposal using AR software (IvoSmile, Ivoclar Vivadent, Schaan, Liechtenstein) was created in collaboration with the patient (Fig. 4). This co-diagnostic session served as a discussion platform where the patient could evaluate and visualize his opinion regarding the future position, length, and shape of his teeth. The AR software allowed different tooth lengths, widths, and colors to be projected onto the patient’s face in real time [11]. As the posterior restorations were already in place, the AR projection could be performed accurately, using teeth and restorations as fixed points. The patient could modify the smile projection until he felt comfortable with the result, and an overall harmonious outcome was achieved (Fig. 4). The chosen AR projection was saved and later imported into a computer-assisted design (CAD) software and matched with intraoral scan data; the multiple 2D pictures, including the smile proposal from the IvoSmile app, were used to create the patient-driven design accordingly. A physical model of this design was then 3D-printed and used for a physical mock-up to confirm the AR smile projection clinically (Fig. 5). After the patient validated the proposition, a silicone key of the digital wax-up was fabricated to analyze the necessary tooth preparation in all dimensions. To ensure a minimally invasive preparation, only a small amount of tooth substance had to be removed with a diamond bur to respect the minimal material thickness of the final restoration. Furthermore, due to the missing volume in both the palatal and buccal dimensions, the final restorations could be designed additively, thus requiring very little tooth preparation. The palatal veneers were fabricated from a polymer-infiltrated ceramic network (PICN) material (Vita Enamic, Vita Zahnfabrik, Bad Säckingen, Germany) and adhesively cemented using pre-heated resin composite (Tetric 210 A2, Ivoclar Vivadent, Liechtenstein). The buccal veneers were then manufactured from lithium disilicate (IPS e.max CAD, Ivoclar Vivadent, Schaan, Liechtenstein) and microveneered on the buccal aspect for an optimal esthetic result.

Adhesive cementation of the restorations was performed with a dual-curing resin cement (Variolink Esthetic, Ivoclar Vivadent, Liechtenstein). The single-wing framework for the resin-bonded bridge replacing tooth 41 was fabricated from zirconia (Lava Plus, 3M ESPE, St. Paul, MN, USA) and buccally veneered with feldspathic ceramic. The finalized bridge was cemented with a self-curing resin cement containing MDP molecules (Panavia 21, Kuraray Noritake Dental Inc., Tokyo, Japan) for optimal adhesion to zirconia [13]. After completion of the treatment, the patient was very happy with the final result (Fig. 6). He appreciated the opportunity to pre-visualize a possible final outcome in such an interactive way during treatment. Furthermore, the possibility to influence the decision-making process increased his adherence to the chosen treatment.

Discussion

The present case report shows a patient with several missing teeth in both jaws as well as a compromised functional, phonetic, and esthetic situation. AR technology was used to integrate the patient into the decision-making process for reconstruction of the anterior esthetic zone [11].

Traditional smile design protocols are very effective for communication between the interdisciplinary dental team and the dental technician [7•, 14, 15]. In those protocols, a first session is required to take photographs and videos [14]. The data are then analyzed by the practitioners and the dental technician to create a new smile proposition, which is presented to the patient at a second appointment [14, 15]. Patients remain largely passive during the esthetic diagnosis and design, because they can comment and voice their demands only after one or two propositions have already been made.

In the presented protocol using AR, the patient is enabled to be an active member of the reconstructive team, as a coauthor of his rehabilitation. The patient is given the opportunity to explore the possibilities of his smile reconstruction and to give input using a tool that helps remove language and technical barriers. This technology proposes to overhaul the organization of the first appointment in esthetic dentistry by devoting less time to photo and video acquisition and more time to discussion with the patient. In the presented patient case, it was decided to use the device in a second phase, after stabilizing the posterior area and validating the VDO.

This was necessary to ensure a correct matching of the AR smile projection and to allow for increased precision. This technology may represent a paradigm shift in communication with both the patient and the dental laboratory. Indeed, a new tool called “CAD-link” [12] enables the direct matching of the final AR proposal with the digital impression in order to create a digital wax-up. This strategy is likely more time-efficient than protocols proposed in recent publications, which mainly required conversion of the digital smile design into a digital wax-up with supplementary manual steps executed by the technician [8, 16•].

The AR tool used in this case report still presents several limitations, such as the additional knowledge required to handle digital technologies and a certain learning curve for both patient and practitioner. The optimal use of this device depends on the accuracy of matching the 3D data acquired by the intraoral scanning system with the AR smile proposal. This step is highly sensitive, as the camera of the tablet device only offers a single frontal view [11, 17]. Indeed, the matching technology uses algorithms that require coincident areas positioned in different planes [18]. In the presented patient case, the initial situation offered insufficient compatible fixed points for a correct AR smile projection. Therefore, in situations where multiple teeth are missing, caution should be taken when applying AR technology; the lack of landmarks may introduce an imprecision of unknown degree in the smile projection.
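As a concrete illustration of this matching step: aligning the AR smile proposal with the intraoral scan is essentially a rigid landmark registration problem. The algorithm used inside the CAD software is not disclosed in this report, so the sketch below only shows the general principle with a standard least-squares (Kabsch) fit over invented landmark coordinates.

```python
import numpy as np

def rigid_fit(source, target):
    """Least-squares rigid transform (rotation R, translation t) that maps
    source landmarks onto target landmarks (Kabsch algorithm)."""
    src_c, tgt_c = source.mean(axis=0), target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))            # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Invented corresponding landmarks (mm), e.g. incisal edges and cusp tips picked
# on the AR smile proposal and on the intraoral scan. A stable fit needs several
# well-distributed, non-coplanar points.
ar_pts   = np.array([[0.0, 0.0, 0.0], [8.5, 0.2, 0.1], [17.0, 0.1, 0.3], [8.4, 6.0, 2.0]])
scan_pts = np.array([[1.2, 0.5, 0.3], [9.6, 0.9, 0.4], [18.1, 0.6, 0.7], [9.5, 6.4, 2.3]])

R, t = rigid_fit(ar_pts, scan_pts)
residuals = np.linalg.norm(ar_pts @ R.T + t - scan_pts, axis=1)
print("per-landmark error (mm):", np.round(residuals, 3))
```

The per-landmark residual gives a rough idea of registration quality; with too few or nearly coplanar landmarks, as in the situation described above, the fit becomes unstable and the projected smile may drift by an unknown amount.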

Finally, further technological evolution of AR may broaden the spectrum of its applications in dentistry, but more research in this field is necessary.

Conclusion

Co-diagnostic use of AR software may enhance the communication strategy between the clinician, dental technician, and the patient. In the case of full mouth rehabilitation, the use of AR software is recommended after stabilizing the posterior area. Further studies are needed to assess the precision and reproducibility of the described software and protocol.

References

1. Carmigniani J, Furht B, Anisetti M, Ceravolo P, Damiani E, Ivkovic M. Augmented reality technologies, systems and applications. Multimed Tools Appl. 2011;51(1):341–77.

2. Caudell TP, Mizell DW. Augmented reality: an application of heads-up display technology to manual manufacturing processes. In: Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences, vol.2. Kauai: IEEE; 1992, p. 659–69. http://ieeexplore.ieee.org/document/183317/

3. Rosenberg LB. The use of virtual fixtures as perceptual overlays to enhance operator performance in remote environments. 1992.

4.• Huang T-K, Yang C-H, Hsieh Y-H, Wang J-C, Hung C-C. Augmented reality (AR) and virtual reality (VR) applied in dentistry. Kaohsiung J Med Sci. 2018;34(4):243–8. Interesting input on new technologies used in dentistry.

5. Kwon H-B, Park Y-S, Han J-S. Augmented reality in dentistry: a current perspective. Acta Odontol Scand. 2018;76(7):497–503.

6. Vávra P, Roman J, Zonča P, Ihnát P, Němec M, Kumar J, et al. Recent development of augmented reality in surgery: a review. J Healthc Eng. 2017;2017:1–9.

7.• Coachman C, Georg R, Bohner L, Rigo LC, Sesma N. Chairside 3D digital design and trial restoration workflow. J Prosthet Dent. 2020;124:514–20. https://linkinghub.elsevier.com/retrieve/pii/S002239131930695X. Presentation of a chairside app for digital smile design that is not using AR.

8. Stanley M, Paz AG, Miguel I, Coachman C. Fully digital workflow, integrating dental scan, smile design and CAD-CAM: case report. BMC Oral Health. 2018;18(1). https://bmcoralhealth.biomedcentral.com/articles/10.1186/s12903-018-0597-0

9. Galibourg A, Brenes C. Virtual smile design tip: from 2D to 3D design with free software. J Prosthet Dent. 2019;121(5):863–4.

10. Lavorgna L, Cervino G, Fiorillo L, Di Leo G, Troiano G, Ortensi M, et al. Reliability of a virtual prosthodontic project realized through a 2D and 3D photographic acquisition: an experimental study on the accuracy of different digital systems. Int J Environ Res Public Health. 2019;16(24):5139.

11. Touati R, Richert R, Millet C, Farges J-C, Sailer I, Ducret M. Comparison of two innovative strategies using augmented reality for communication in aesthetic dentistry: a pilot study. J Healthc Eng. 2019;2019:1–6.

12. Marchand L, Touati R, Fehmer V, Ducret M, Sailer I. Latest advances in augmented reality technology and its integration into the digital workflow. Int J Comput Dent. Accepted.

13. Scaminaci Russo D, Cinelli F, Sarti C, Giachetti L. Adhesion to zirconia: a systematic review of current conditioning methods and bonding materials. Dent J. 2019;7(3):74.

14. Coachman C, Calamita M. Digital smile design: a tool for treatment planning and communication in esthetic dentistry. Quint Dent Tech. 2012;2012:1–9.

15. Coachman C, Calamita MA, Sesma N. Dynamic documentation of the smile and the 2D/3D digital smile design process. Int J Periodontics Restorative Dent. 2017;37(2):183–93.

16.• Lin W-S, Harris BT, Phasuk K, Llop DR, Morton D. Integrating a facial scan, virtual smile design, and 3D virtual patient for treatment with CAD-CAM ceramic veneers: a clinical report. J Prosthet Dent. 2018;119(2):200–5. Description of a new technology for smile design using a facial scan but not chairside.

17. Zimmermann M. Virtual smile design systems: a current review. Int J Comput Dent. 2015;18(4):303–17.

18. Richert R, Goujat A, Venet L, Viguie G, Viennot S, Robinson P, et al. Intraoral scanner technologies: a review to make a successful impression. J Healthc Eng. 2017;2017:1–9.

Author information

Affiliations

Division of Fixed Prosthodontics and Biomaterials, University Clinic of Dental Medicine, University of Geneva, Geneva, Switzerland

Romane Touati, Vincent Fehmer, Irena Sailer & Laurent Marchand

Hospices Civils de Lyon, Service de Consultations et Traitements Dentaires, Lyon, France

Romane Touati & Maxime Ducret

Division of Fixed Prosthodontics and Biomaterials, University Clinic of Dental Medicine, University of Geneva, 1 Rue Michel-Servet, 1211 Geneva 4, Switzerland

Romane Touati

Faculty of Odontology, University Claude Bernard Lyon 1, University of Lyon, Lyon, France

Maxime Ducret

Corresponding author

Correspondence to Romane Touati.

Ethics declarations

Conflict of Interest

The authors declare no competing interests.


The availability of clinical education is decreasing by the day

The need and demand for such education nonetheless continue to grow as better specialized services are expected. The development of clinical skills requires extensive knowledge and ability. Recently, part of this requirement has been met by simulated training based on artificial intelligence (AI), virtual reality (VR), and augmented reality (AR). VR creates a fully simulated environment, whereas AR is a branch of the same technology that augments sensory perception and replicates the real environment in a virtual context. An AR system combines virtual and real objects in a single realistic environment, registers the virtual and real objects to one another, and runs interactively in real time. AR superimposes a computer-generated image on the operator’s view of the real world, thus providing a composite view. The introduction of augmented reality has simplified many prosthetic treatment processes and helps meet patients’ expectations.

AR has diverse applications. It is used as a guide to enhance understanding, and the technology supports smart learning, interactive tutoring, diagnosis, and treatment planning. AR is widely used with digital radiographs, dental scans, computer-aided design and manufacturing (CAD-CAM) restorations, orthodontic aligners, oral surgery, and implantology. Information obtained from a wide range of sources is stored and computed, and algorithms are created to support dental and prosthodontic needs. A significant advantage of machines supported by AR is their extended working time: they complete tasks without human fatigue.

Preclinical and clinical training has witnessed a significant transformation through the use of AR technology for training and evaluation. AR draws more attention, imparts quality training, and in the long run decreases cost. The technology enables and simplifies training for advanced clinical situations. It has been used successfully in many dental schools, and in a few, these systems have been integrated into the dental curriculum to enhance the quality of training.

The most significant applications of AR in prosthodontics are in CAD-CAM restorations, implantology, and esthetic planning. It aids in designing the restoration and in milling a restoration with great precision, offering superior function and esthetics.
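As a minimal illustration of the composite view mentioned above, the sketch below alpha-blends a computer-generated overlay onto a camera frame with OpenCV and NumPy. The file names are placeholders and the snippet does not represent any specific commercial AR system.

```python
import cv2
import numpy as np

def composite(frame, overlay_bgra):
    """Blend an overlay with an alpha channel (e.g. a rendered restoration
    proposal) onto a camera frame to produce the composite AR view."""
    overlay_bgra = cv2.resize(overlay_bgra, (frame.shape[1], frame.shape[0]))
    colour = overlay_bgra[:, :, :3].astype(np.float32)
    alpha = overlay_bgra[:, :, 3:].astype(np.float32) / 255.0   # 0..1 opacity
    blended = alpha * colour + (1.0 - alpha) * frame.astype(np.float32)
    return blended.astype(np.uint8)

# Placeholder inputs: one captured frame and one pre-rendered overlay with
# transparency. In a live system the overlay would be re-rendered and
# re-registered for every frame.
frame = cv2.imread("patient_frame.jpg")                              # BGR image
overlay = cv2.imread("restoration_overlay.png", cv2.IMREAD_UNCHANGED)  # image with alpha
cv2.imwrite("composite_view.jpg", composite(frame, overlay))
```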

In addition, the same concepts are being employed for other craniofacial prostheses, which should become a reality in the near future. The communication and visualization involved in esthetic planning and smile design are always challenging. Software such as Digital Smile Design, Digital Smile System, Trios Smile Design, and Smile Pro aids better conception and interaction. In addition to the present software technologies, these systems integrate facial recognition technology together with the automatic picture-based strategy (APBS) and enhanced mirror strategy (EMS) of AI technology. Facial recognition identifies a person from digital images using extraoral facial reference lines and mathematical algorithms; APBS and EMS use this technology to aid designing, planning, communication, and education. Although these systems are widely used, further advancements are still required in the quality of image construction, the flexibility of the software, and easy integration into the daily dental office. Most present-generation technologies are two-dimensional, partially immersive, or only partly computer-assisted. Further evolution is needed for three-dimensional conception, video analysis, functional movement evaluation, and design. These features are complex and require extensive research and development.

The use of dental robots, especially in prosthodontics, may be a realistic future. Robotic trials have been carried out in the fabrication of removable partial dentures, complete dentures, and implant prostheses. The data input of experienced personnel is coded into robots, which can then aid productive prosthodontic actions. Extensive studies have been done in Canada on CRS robots for complete denture tooth arrangement. The MOTOMAN UP6 robot, the 50-DOF multimanipulator tooth-arrangement robot system, the 84-DOF multimanipulator tooth-arrangement robot, and the miniature Cartesian robot are some of the robotic prosthodontic systems in various stages of experimentation. Universities and institutions such as Coimbra, Ecole des Mines de Paris, Umeå Universitet, Düsseldorf, Chosun University, Mahidol University, and the National Science and Technology Development Agency are studying robotic prototypes that use AI for implant prosthodontics. Recently, in China, robots were used to execute a guided implant surgery. Robotics in prosthodontics is under constant development, and in the future, with the use of human–computer interaction technology and sensor control techniques, extensive prosthodontic procedures could be carried out with robots.

AR aids in realistic prediction of treatment outcomes. Navigation systems help obtain superior results in implantology and maxillofacial surgery, simulators provide high-quality training to students, and AR supports precise diagnosis and calibration of procedures while saving time. The use of the technology also has limitations. The systems are still expensive, although with wider use and more economical alternatives the cost should come down. Limitations also exist in the technology itself, where some aspects of understanding and conversion to clinical requirements remain difficult; with more research and development, these limitations should be reduced. Experiments are being done using photo emission tomography, infrared spectroscopy, and indocyanine green dyes to determine tissue vascularity and sentinel nodes. The use of haptic force feedback and robotics further enhances AR technology.

The review of studies by Joda et al. indicated that the number of studies on AR in dentistry is low and that more established designs and long-term studies are required for definitive protocols. The studies found AR to be effective for interactive learning and objective evaluation, and encouraging results were found in maxillofacial surgery, CAD-CAM, and implantology. The literature, however, stressed the importance of establishing technological standards with high data quality and developing approved applications for dental and prosthodontic AR devices for effortless clinical use. The applications of and interest in AR have increased recently with the availability of open-source development programs; open platforms have attracted more participants and led to increased applications in prosthetic dentistry. AR technology basically involves computation devices, software, display devices, and sensors. An effective AR system can be built from a database of real and virtual information, recording techniques, image processing, display forms, perception settings, and response mechanisms.

AR is a novel technology in prosthodontics. Although it is already used effectively in learning and CAD-CAM applications, in the future it should find wider application in clinical procedures. It is therefore essential to understand these concepts and techniques in order to take advantage of them.

REFERENCES

1. Nair KC. Marching ahead to the future. J Indian Prosthodont Soc 2007;7:162-5.

2. Bhambhani R, Bhattacharya J, Sen SK. Digitization and its futuristic approach in prosthodontics. J Indian Prosthodont Soc 2013;13:165-74.

3. Prithviraj DR, Bhalla HK, Vashisht R, Sounderraj K, Prithvi S. Revolutionizing restorative dentistry: An overview. J Indian Prosthodont Soc 2014;14:333‑43.

4. Huang TK, Yang CH, Hsieh YH, Wang JC, Hung CC. Augmented reality (AR) and virtual reality (VR) applied in dentistry. Kaohsiung J Med Sci 2018;34:243‑8.

5. Albuha Al‑Mussawi RM, Farid F. Computer‑based technologies in dentistry: Types and applications. J Dent (Tehran) 2016;13:215‑22.

6. Kwon HB, Park YS, Han JS. Augmented reality in dentistry: A current perspective. Acta Odontol Scand 2018;76:497‑503.

7. Touati R, Richert R, Millet C, Farges JC, Sailer I, Ducret M. Comparison of two innovative strategies using augmented reality for communication in aesthetic dentistry: A pilot study. J Healthc Eng 2019;2019:7019046.

8. Jiang JG, Zhang YD, Wei CG, He TH, Liu Y. A review on robot in prosthodontics and orthodontics. Adv Mech Eng 2014;7:1-11. [doi: 10.1155/2014/198748].

9. Joda T, Gallucci GO, Wismeijer D, Zitzmann NU. Augmented and virtual reality in dental medicine: A systematic review. Comput Biol Med 2019;108:93‑100.

Comparison of Two Innovative Strategies Using Augmented Reality for Communication in Aesthetic Dentistry: A Pilot Study

Abstract

During dental prosthetic rehabilitation, communication and conception are achieved using rigorous methodologies such as smile design protocols. The aim of the present pilot study was to compare two innovative strategies that used augmented reality for communication in dentistry. These strategies enable the user to instantly try a virtual smile proposition by taking a set of pictures from different points of view or by using the iPad as an enhanced mirror. Sixth-year dental students (women = 13, men = 5, mean age = 23.8) were included in this pilot study and were asked to answer a 5-question questionnaire examining the user experience with a visual analog scale (VAS). Answers were converted into a numerical result ranging from 0 to 100 for statistical analysis. Participants were not able to report a difference between the two strategies in terms of handling of the device, quality of the reconstruction, and fluidity of the software. Although the participants’ experience with the enhanced mirror was more often reported as immersive and as more likely to be integrated into daily dental office practice, no significant difference was found. Further investigations are required to evaluate time and cost savings in daily practice. Software accuracy is also a major point to investigate before going further in clinical applications.

1. Introduction

In dentistry, smile reconstruction is achieved using rigorous and detailed methodologies which are essential for communication between the practitioner, the laboratory, and the patient [1]. Several protocols have been proposed, such as the “Digital Smile Design®” (DSD) developed by Christian Coachman [2]. Using only a set of photographs and presentation software, this picture-based strategy (PBS) offers a predictive view of the future patient’s smile and makes treatment planning and communication with the patient easier. Until now, protocols have been limited by the following factors: they are handmade or only partly computer-assisted, are two-dimensional (2D), and are only partially immersive for patients. To improve the patient’s experience and patient-practitioner communication, clinical protocols and technological evolutions were proposed, such as a mock-up, a video analysis, or a 3D facial conception [2, 3]. These tools provided better immersivity for patients and additional details for practitioners, who were able to objectively evaluate facial movements in response to emotion and speech. However, all these features are complex to integrate for both the clinician and the laboratory, and they require a significant amount of time, energy, and cost [4].

Technological evolution of hardware and software aims to reduce the time and errors involved in information sharing between patients, practitioners, and laboratories. The aim of the technology presented in this pilot study is to improve communication with the patient using facial recognition (FR) and augmented reality (AR). FR is a technology capable of automatically identifying a person from a digital image, using reference lines of the face and mathematical algorithms [5]. AR is a type of technology in which an environment is enhanced by superimposing computer-generated virtual content over a real structure [6, 7]. Although AR tools are mainly used for video games and animations, the medical field is working to integrate these technologies for diagnosis, surgery, education, and communication with patients [8]. In dentistry, AR was first used for educational purposes, as a tool to objectively evaluate students and give them direct feedback [8]. However, no study has evaluated AR as a tool to improve communication in aesthetic dentistry.

The present pilot study tested the user experience of two innovative augmented reality software strategies for communication in aesthetic dentistry: one using a set of pictures, described as an automatized picture-based strategy (APBS), and the other using the front camera of the tablet, described as an enhanced mirror strategy (EMS).

2. Materials and Methods

In this study, a recent application released for iOS 11 was evaluated, allowing AR experiences to be created using a recent iPad or iPhone [9–11]. This application (IvoSmile®/Kapanu, Ivoclar-Vivadent) uses the camera integrated in the tablet to recognize the patient’s face. After virtual facial and oral landmarks have been determined [4, 12, 13], a second software layer superimposes artificial smile propositions on the patient’s smile (Figure 1).

Figure 1 
Schematic representation showing the basic principles of this technology. After having captured the patient’s face with a picture or live with the touchpad camera (a), the FR software recognized virtual landmarks on the face (b), the lips and the smile of the patient (c). The software proposed a first mask on the patient’s teeth (d). The overlay of the new mask enabled the visualization of the smile (e), and the patient was able to see the smile projected on the screen, with a set of pictures for APBS or in motion as a mirror in EMS (f).

Two strategies are possible: the first one (APBS) consists of taking a set of photographs in an automatized version of PBS. The user can instantly change the point of view by scrolling through the different photographs. In the second strategy (EMS), the patient can directly try and modify the proposition by looking at the iPad screen in motion, as an enhanced mirror (Figure 2).

Figure 2 
Illustration of the use of the software. (a) Using an iPad camera, the FR software is able to recognize nonfiducial markers (lips, smile, gum, and teeth) (b) and to propose a first mask overlaid on the initial face capture (c). A first smile design proposition is instantly obtained (d)
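A minimal sketch of the enhanced-mirror idea is given below: every camera frame is analyzed, the face is located, and a placeholder "design" is blended over the smile region before the frame is displayed. A stock OpenCV Haar cascade stands in for the proprietary facial-recognition step, and a simple tint stands in for the warped tooth design; the actual IvoSmile/Kapanu pipeline is not public.

```python
import cv2

# A stock Haar cascade stands in for the facial-recognition step; the real
# application detects lips, gingiva and individual teeth, not just the face.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def overlay_smile_region(frame, face, alpha=0.5):
    """Crude stand-in for the smile mask: tint the lower third of the detected
    face, where the real software would warp the chosen tooth design."""
    x, y, w, h = face
    roi = frame[y + 2 * h // 3 : y + h, x : x + w]
    tint = roi.copy()
    tint[:] = (255, 255, 255)                              # placeholder "design"
    frame[y + 2 * h // 3 : y + h, x : x + w] = cv2.addWeighted(roi, 1 - alpha, tint, alpha, 0)
    return frame

def enhanced_mirror(camera_index=0):
    cap = cv2.VideoCapture(camera_index)                   # front camera used as a mirror
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for face in face_detector.detectMultiScale(gray, 1.3, 5):
            frame = overlay_smile_region(frame, face)
        cv2.imshow("enhanced mirror", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):              # press 'q' to quit
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    enhanced_mirror()
```

The APBS variant would apply the same processing to a fixed set of photographs instead of a live stream, letting the user scroll between the precomputed views.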

Users can interact and change the shape, size, and color of the teeth using a large range of tools. The software allows the user to modify the center of the arch according to the facial midlines (Figure 3(a)) and to choose tooth form and proportion from different catalogues of teeth (Figure 3(b)). The user can also modify the incisal edge position by raising or lowering the length and width of the teeth (Figure 3(c)), or by changing the occlusal plane (Figure 3(d)) or the dental arch inclination and width (Figure 3(e)). Finally, the software allows the user to modify the shade and luminosity of the teeth (Figures 3(g)–3(i)).

Figure 3 
Illustration of some of the different features offered by the software and their impact on the smile rendering. (a) Software determination of the ideal dental midline according to the horizontal and vertical facial midlines, the interpapillary line, and the incisal edge position. (b) Proposition of form from the software catalogue. (c) Determination of the length and width of the teeth. (d)–(f) Determination of the occlusal plan height, inclination, width, and depth of the arch. (g)–(i) The final proposition can be chosen according to luminosity, shade, and color of the teeth.

In the present study, one operator (RT) presented the device to the sixth-year volunteer dental students (18 subjects; 13 women, 5 men; mean age: 23.8 years). After the study subjects provided informed consent, they received a brief explanation and were asked to freely use the device and the different tools on their own smile (Figure 4).

Figure 4 
Use of the device. (a) Participant can use the technology by maintaining the tablet at a required minimal distance as a mirror. (b) User can see himself on the screen and interact with the software.

After ten minutes of use, participants were asked to compare the two strategies (APBS and EMS). The participants’ experience while using the application was rated using an anonymous questionnaire and a visual analog scale (VAS). The questionnaire included 5 questions and was adapted from a previous study [3] (Table 1). Any supplementary declarative comments from participants were also collected and reported here. VAS answers were converted into a numerical result ranging from 0 to 100 for statistical analysis. Statistical software (IBM SPSS Statistics v24) was used to test data normality. The data were not normally distributed, and a Wilcoxon test was applied to evaluate the difference between the two strategies (α = .05).
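As a worked illustration of this analysis, the sketch below converts paired VAS ratings (0–100) for the two strategies into a Wilcoxon signed-rank test using SciPy. The scores are invented and the original authors used SPSS, so this only mirrors the described procedure, not their data.

```python
import numpy as np
from scipy.stats import shapiro, wilcoxon

# Invented paired VAS scores (0-100) for one questionnaire item,
# one value per participant and per strategy (n = 18 in the study).
apbs = np.array([62, 71, 55, 80, 68, 74, 59, 66, 70, 77, 63, 58, 72, 69, 61, 75, 67, 64])
ems  = np.array([65, 69, 60, 78, 72, 71, 63, 64, 73, 75, 66, 55, 76, 66, 63, 78, 64, 67])

# Normality check on the paired differences, then the non-parametric comparison.
_, p_norm = shapiro(ems - apbs)
_, p_wilcoxon = wilcoxon(apbs, ems)

print(f"Shapiro-Wilk p = {p_norm:.3f}")
print(f"Wilcoxon signed-rank p = {p_wilcoxon:.3f} (alpha = 0.05)")
```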

3. Results

Eighteen participants (13 women and 5 men; mean age: 23.8 years) were included in the study. The results of the questionnaire are reported in Table 2. In the present pilot study, participants’ preference for one strategy over the other was not significant. The authors were not able to demonstrate a difference between the strategies in terms of handling of the device, quality of the reconstruction, or fluidity of the software. According to the participants’ experience, EMS was more often reported as immersive, but the study failed to show a significant advantage over APBS. Similarly, participants reported a preference for EMS, but the difference was not significant. Participants reported that the two AR strategies were complementary, as they are not used for the same purpose: APBS was described as a pedagogic tool useful for explaining the different smile possibilities to the patient, whereas EMS was used as the virtual try-in phase of the proposed smile project.

4. Discussion

The results of the study did not reveal a significant difference between the two strategies in terms of handling of the device, quality of the reconstruction, fluidity of the software, immersivity, or interest in integration into daily dental office practice. However, many questions about this application still need to be discussed.

The handling of this innovative software requires a learning curve, and many users reported that the overabundant offer of choices could make the decision process more difficult. Some suggested that the software could be simplified by creating, for example, a step-by-step version of the application, in which the user is guided by the software through the different features in a logical and chronological way. Conversely, restricted freedom was reported for the determination of the vertical facial and dental midlines, whereas these midlines play a significant role in smile analysis and differences of up to 2–3 mm between facial and dental midlines can be visually noticed [14]. The catalogue of tooth options was also limited, and a deep learning approach could be a valuable way of enhancing the catalogue by collecting data from patients’ and practitioners’ projects.

It has been shown before that mobile devices can serve as an excellent way to communicate in dentistry [3]. Participants also reported good immersivity for both strategies. These results are close to those of Kim et al., who noted that AR technology was associated with excellent user experiences in education [14]. However, the present work failed to show that immersivity was significantly enhanced using EMS. This could be explained by the fact that some participants reported poorer picture quality with EMS, due to the video sensor, which leads to occasional mismatch or image pixelation. Even though a majority of participants reported an interest in using a similar tool in their daily practice for conception and communication with patients, further investigations are required to evaluate the cost and time savings brought by the device compared with other PBS such as DSD [2]. It should be noted that the present pilot study reported only the experiences and analyses of sixth-year students, and it could be interesting to propose this questionnaire to larger numbers of patients and clinicians in order to evaluate the impact of the device in daily professional practice.

Finally, another limitation reported by the authors for the present APBS and EMS was the impossibility of matching the smile design with the digital cast of the patient. Indeed, in order to perform a realistic computer-assisted design (CAD) of the patient's prosthesis, the software needs to be highly precise to prevent alignment mistakes during the matching process with the teeth [15]. Moreover, it was impossible to extract data from the software, which prevented any analysis of software accuracy. Similar optical systems designed for AR software show a precision close to 5 mm [7]. This accuracy was considered sufficient for clinical applications in maxillofacial surgery, neurosurgery, or surgical endoscopy [16–18]. However, some limitations were reported for these optical systems, and the addition of infrared sensors [19–21], structured light, fiducial landmarks [20], or radiopaque markers attached to the patient’s skin [22] has been proposed to improve the accuracy of facial and dental recognition [5, 7, 22]. Further investigations are therefore required to evaluate the accuracy of this innovative device and to determine the precision needed in dentistry.

5. Conclusion

Although the size of the sample was limited, the observations indicate a good user experience (handling of the device, image quality, fluidity, and immersion) with both techniques. However, no statistically significant difference was observed between the two strategies. Further investigations are required to evaluate the efficacy of such a device in daily practice, in particular regarding savings of time and cost. Software accuracy is also a major point to investigate before going further in clinical practice.

References

  1. C. Coachman and R. D. Paravina, “Digitally enhanced esthetic dentistry—from treatment planning to quality control,” Journal of Esthetic and Restorative Dentistry, vol. 28, pp. S3–S4, 2016.
  2. C. Coachman, M. Calamita, and N. Sesma, “Dynamic documentation of the smile and the 2D/3D digital smile design process,” International Journal of Periodontics & Restorative Dentistry, vol. 37, no. 2, pp. 183–193, 2017.
  3. I. Sailer, S. Liu, R. Mörzinger et al., “Comparison of user satisfaction and image quality of fixed and mobile camera systems for 3-dimensional image capture of edentulous patients: a pilot clinical study,” Journal of Prosthetic Dentistry, vol. 120, no. 4, pp. 520–524, 2018.
  4. H. Popat, S. Richmond, R. Playle, D. Marshall, P. L. Rosin, and D. Cosker, “Three-dimensional motion analysis—an exploratory study. Part 1: assessment of facial movement,” Orthodontics & Craniofacial Research, vol. 11, no. 4, pp. 216–223, 2008.
  5. M. Zollhöfer, J. Thies, P. Garrido et al., “State of the art on monocular 3D face reconstruction, tracking, and applications,” Computer Graphics Forum, vol. 37, no. 2, pp. 523–550, 2018.
  6. H.-B. Kwon, Y.-S. Park, and J.-S. Han, “Augmented reality in dentistry: a current perspective,” Acta Odontologica Scandinavica, vol. 76, no. 7, pp. 497–503, 2018.
  7. P. Vávra, J. Roman, P. Zonča et al., “Recent development of augmented reality in surgery: a review,” Journal of Healthcare Engineering, vol. 2017, Article ID 4574172, 9 pages, 2017.
  8. T.-K. Huang, C.-H. Yang, Y.-H. Hsieh, J.-C. Wang, and C.-C. Hung, “Augmented reality (AR) and virtual reality (VR) applied in dentistry,” Kaohsiung Journal of Medical Sciences, vol. 34, no. 4, pp. 243–248, 2018.
  9. M. Speicher, B. D. Hall, A. Yu et al., “XD-AR,” Proceedings of the ACM on Human-Computer Interaction, vol. 2, pp. 1–24, 2018.
  10. W. Zhang, B. Han, P. Hui, V. Gopalakrishnan, E. Zavesky, and F. Qian, “CARS: collaborative augmented reality for socialization,” in Proceedings of the 19th International Workshop on Mobile Computing Systems & Applications—HotMobile’18, pp. 25–30, Tempe, AZ, USA, February 2018.
  11. C. Wu, D. Bradley, P. Garrido et al., “Model-based teeth reconstruction,” ACM Transactions on Graphics, vol. 35, no. 6, pp. 1–13, 2016.
  12. T. Kilgus, E. Heim, S. Haase et al., “Mobile markerless augmented reality and its application in forensic medicine,” International Journal of Computer Assisted Radiology and Surgery, vol. 10, no. 5, pp. 573–586, 2015.
  13. V. O. Kokich, H. A. Kiyak, and P. A. Shapiro, “Comparing the perception of dentists and lay people to altered dental esthetics,” Journal of Esthetic and Restorative Dentistry, vol. 11, no. 6, pp. 311–324, 1999.
  14. Y. Kim, H. Kim, and Y. O. Kim, “Virtual reality and augmented reality in plastic surgery: a review,” Archives of Plastic Surgery, vol. 44, no. 3, pp. 179–187, 2017.
  15. R. Richert, A. Goujat, L. Venet et al., “Intraoral scanner technologies: a review to make a successful impression,” Journal of Healthcare Engineering, vol. 2017, Article ID 8427595, 9 pages, 2017.
  16. G. Badiali, V. Ferrari, F. Cutolo et al., “Augmented reality as an aid in maxillofacial surgery: validation of a wearable system allowing maxillary repositioning,” Journal of Cranio-Maxillofacial Surgery, vol. 42, no. 8, pp. 1970–1976, 2014.
  17. D. Inoue, B. Cho, M. Mori et al., “Preliminary study on the clinical application of augmented reality neuronavigation,” Journal of Neurological Surgery Part A: Central European Neurosurgery, vol. 74, no. 2, pp. 71–76, 2013.
  18. X. Kang, M. Azizian, E. Wilson et al., “Stereoscopic augmented reality for laparoscopic surgery,” Surgical Endoscopy, vol. 28, no. 7, pp. 2227–2235, 2014.
  19. B. Fida, F. Cutolo, G. di Franco, M. Ferrari, and V. Ferrari, “Augmented reality in open surgery,” Updates in Surgery, vol. 70, no. 3, pp. 389–400, 2018.
  20. K. A. Gavaghan, M. Peterhans, T. Oliveira-Santos, and S. Weber, “A portable image overlay projection device for computer-aided open liver surgery,” IEEE Transactions on Biomedical Engineering, vol. 58, no. 6, pp. 1855–1864, 2011.
  21. T. Okamoto, S. Onda, M. Matsumoto et al., “Utility of augmented reality system in hepatobiliary surgery,” Journal of Hepato-Biliary-Pancreatic Sciences, vol. 20, no. 2, pp. 249–253, 2013.
  22. S. F. Rosenstiel, D. H. Ward, and R. G. Rashid, “Dentists’ preferences of anterior tooth proportion-a web-based study,” Journal of Prosthodontics, vol. 9, no. 3, pp. 123–136, 2000.

Digital Smile Design: An innovative tool in aesthetic dentistry

Abstract

A fundamental objective of an aesthetic treatment is the patient’s satisfaction: the outcome of the treatment should meet the patient’s expectation of enhanced facial aesthetics and smile. A patient who constantly doubts the end result of a treatment, which is an irreversible procedure, can be motivated and educated through the Digital Smile Design (DSD) technique. DSD is a technical tool used to design and modify the smile of patients digitally and to help them visualize it beforehand, by creating and presenting a digital mock-up of their new smile design before treatment physically starts. It supports visual communication and involvement of patients in their own smile design process, thus ensuring a predictable treatment outcome and increasing case acceptance. This article reviews the aspects of digital smile design in aesthetic dental practice pertaining to its use, advantages, limitations and future prospects.

1. Introduction

A beautiful, confident smile is desired by all. When a patient wishes to attain that smile but is hesitant to undergo the treatment procedure because he or she cannot visualize the treatment outcome, the clinician can use the Digital Smile Design (DSD) tool. The DSD concept aims to help the clinician by improving the aesthetic visualization of the patient’s concern and providing an understanding of the possible solution, thereby educating and motivating patients about the benefits of the treatment and increasing case acceptance. Digital smile design is a digital means of creating and projecting the new smile design by providing a simulation and pre-visualization of the final result of the proposed treatment. A digitally created design involves the patients in the design of their own smile, leading to a customized smile design that matches individual needs and desires and complements the morpho-psychological characteristics of the patient, engaging the patient on an emotional level, increasing confidence in the process, and improving acceptance of the planned treatment.

Coachman and Calamita described DSD as a multi-use conceptual tool that can support the diagnostic vision, improve communication, and enhance treatment predictability by permitting careful analysis of the patient’s facial and dental characteristics that may have gone unnoticed in clinical, photographic or diagnostic cast-based evaluation procedures.

2. Evolution of digital smile designing

In the last two decades, smile design has progressively evolved from physical, analogue methods to digital design, which has itself advanced from 2D to 3D. From the earlier times, when hand drawing on printed photographs of the patient was used to communicate and explain to patients how the end result would look, it has now progressed to fully digital drawing in DSD software on a computer. Such a design can easily be edited, done and undone at any time, to achieve a final design balancing the patient’s aesthetic and functional needs.

In 2017, Christian Coachman described this evolution in the following generations:

Generation 1. Analogue drawings over photos and no connection to the analogue model. Drawings were made with a pen on printed copies of photographs to visualize the treatment result, but these could not be correlated with the study model. Digital dentistry had not yet been introduced.

Generation 2. Digital 2D drawings and a visual connection to the analogue model. With the advent of the digital world, general-purpose software such as PowerPoint became familiar and permitted digital drawing. Although not specific to dentistry and limited to two-dimensional drawing, it was more accurate and less time-consuming than hand drawing. The drawing could be visually connected to the study model, but a physical connection was still lacking.

Generation 3. Digital 2D drawings and an analogue connection to the model. This was the beginning of the digital-analogue connection. The first drawing software specific to digital dentistry was introduced, linking the 2D digital smile design to a 3D wax-up. Facial integration in smile design was also introduced at this stage, but a connection to the 3D digital world was still missing.

Generation 4. Digital 2D drawings and a digital connection to the 3D model. Digital dentistry now progressed from 2D to 3D analysis: a 3D digital wax-up could be produced incorporating facial integration and predetermined dental aesthetic parameters.

Generation 5. Complete 3D workflow.

Generation 6. The 4D concept. Adding motion to the smile design process.

3. Requirements for DSD

The DSD technique is carried out with digital equipment already prevalent in current dental practice, such as a computer with one of the DSD software programs and a digital SLR camera or even a smartphone. A digital intraoral scanner for digital impressions, a 3D printer and CAD/CAM are additional tools for a complete digital 3D workflow. Accurate photographic documentation is essential, as the complete facial and dental analysis rests on preliminary photographs on which the changes and the design are formulated; video documentation is required for dynamic analysis of the teeth, gingiva, lips and face during smiling, laughing and talking in order to integrate facially guided principles into the smile design.

3.1. Photography protocol

To proceed with correct digital planning, it is crucial to follow a photography protocol. The photographs taken should be of the utmost quality and precision, with correct posture and standardized techniques, as the facial reference lines such as the commissural lines, lip line and inter-pupillary line, which form the basis of smile design, are established on them. Poor photography misrepresents the reference image and may lead to improper diagnosis and planning.

The following photographic views in fixed head position are necessary:

1.Three frontal views:

 Full face with a wide smile and the teeth apart

 Full face at rest

 Retracted view of the full maxillary and mandibular arch with teeth apart.

2.Two profile views:

 Side Profile at Rest

 Side Profile with a full Smile

3.A 12 o'clock view with a wide smile and the incisal edges of the maxillary teeth visible and resting on the lower lip.
4.An intraocclusal view of the maxillary arch from second premolar to second premolar.

3.2. Videography protocol

According to Coachman, during videography the best framing and zoom should be selected, with suitable exposure and the focus adjusted to the mouth. For ideal development of the facially guided smile frame, four videos from specific angles should be taken:

1.A facial frontal video with retractor and without retractor smiling,
2.A facial profile video with lips at rest and wide-E smile,
3.A 12 O’clock video above the head at the most coronal angle that still allows visualization of the incisal edge,
4.An anterior occlusal video to record maxillary teeth from second premolar to second premolar with the palatine raphe as a straight line.

Four complementary videos should also be taken for facial, phonetic, functional and structural analysis.

Since a static photograph taken at a single instant cannot guarantee that the ideal moment has been captured at the true rest position or at a real maximum full smile, videos are helpful because they allow the photograph to be chosen at the perfect moment. Videos can be paused and transformed into a photo by taking a screenshot of the best recorded moment at the desired angle. A study conducted by Tjan and Miller on static photographs of a posed smile reported that 11% of patients presented a high smile, as opposed to 21% of patients with an anterior high smile in a study with video recording. Tarantili et al. also studied the smile on video and observed that the average duration of a spontaneous smile was 500 ms, which emphasizes the difficulty of capturing this moment in photographs.
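The screenshot step can also be automated: the frame closest to a chosen instant can be exported directly from the video file. A short sketch using OpenCV is shown below; the file name and timestamp are placeholders.

```python
import cv2

def extract_frame(video_path, time_ms, out_path):
    """Save the frame closest to time_ms (milliseconds) as a still photograph,
    e.g. the instant of the widest spontaneous smile spotted during playback."""
    cap = cv2.VideoCapture(video_path)
    cap.set(cv2.CAP_PROP_POS_MSEC, time_ms)   # seek to the chosen moment
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError(f"no frame could be read at {time_ms} ms")
    cv2.imwrite(out_path, frame)

# A spontaneous smile lasts on the order of 500 ms, so precise seeking matters.
extract_frame("frontal_smile_video.mp4", time_ms=12_500, out_path="smile_still.jpg")
```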

3.3. Types of DSD software

The clinician may use any one of the following software programs:

1.Photoshop CS6 (Adobe Systems Incorporated),

2.Microsoft PowerPoint (Microsoft Office, Microsoft, Redmond, Washington, USA).

3.Smile Designer Pro (SDP) (Tasty Tech Ltd),

4.Aesthetic Digital Smile Design (ADSD – Dr. Valerio Bini),

5.Cerec SW 4.2 (Sirona Dental Systems Inc.),

6.Planmeca Romexis Smile Design (PRSD) (Planmeca Romexis®),

7.VisagiSMile (Web Motion LTD),

8.DSD App by Coachman (DSDApp LLC),

9.Keynote (iWork, Apple, Cupertino, California, USA)

10.Guided Positioning System (GPS)

11.DSS (EGSolution)
12.NemoDSD (3D)
13.Exocad DentalCAD 2.3

Factors such as dentofacial aesthetic parameters, ease of use, case documentation ability, cost, time efficiency, systematic digital workflow and organization, and compatibility of the program with CAD/CAM or other digital systems may influence the user’s decision.

There are many aesthetic parameters that guide smile evaluation and design, such as the midline, the height and curve of the smile, and the intra- and interdental proportions. A study conducted by Doya Omar et al. compared eight DSD software programs (Photoshop CS6, Keynote, Planmeca Romexis Smile Design, Cerec SW 4.2, Aesthetic Digital Smile Design, Smile Designer Pro, DSD App and VisagiSMile) in their capability to evaluate and digitally modify these aesthetic parameters, i.e., facial, dento-gingival and dental parameters, and concluded that Photoshop, Keynote and Aesthetic Digital Smile Design included the largest number of aesthetic analysis parameters. The other DSD programs included were deficient in analyzing the facial aesthetic parameters, although they offered a wide range of dentogingival and dental aesthetic features. According to the authors, “the DSD App, Planmeca Romexis Smile Design, and Cerec SW 4.2 could execute 3D analysis; moreover, Cerec SW 4.2 and PRSD worked together with CAD/CAM. The DSD App and Smile Designer Pro were offered as mobile phone applications. SDP and ADSD were marketed as specialized digital design programs. Furthermore, VisagiSMile and DSD App shared the idea of visagism”, a concept first introduced by Braulio Paolucci which suggests that temperament can be used as a factor in smile design.

More recently, Exocad DentalCAD 2.3 was introduced, which performs 3D analysis and can be integrated with CAD/CAM.

4. Procedure of carrying DSD

Although the inclusion of aesthetic parameters in different DSD software varies, the basic procedure of smile design remains the same. All DSD software allows aesthetic design through the drawing of reference lines and shapes on extra- and intraoral digital photographs. Facial analysis is done using reference lines from which uniform parameters are developed for the frontal view of the face. The horizontal reference lines consist of the inter-pupillary and inter-commissural lines, which deliver an overall sense of balance and a horizontal overview in the aesthetically pleasing face, while the vertical reference line is the facial midline, passing through the glabella, nose, and chin (Fig. 1a). The horizontal and vertical lines are crossed against each other to measure the symmetry and cant of the face. The facial photograph with a wide smile and the teeth apart is moved behind this cross to determine the ideal horizontal plane and vertical midline, which permits a comparative analysis of the teeth and face.
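Once landmark coordinates are available, the symmetry check described above reduces to simple plane geometry. The sketch below, with invented pixel coordinates, computes the cant of the inter-pupillary line relative to the image horizontal and the horizontal offset between the facial and dental midlines.

```python
import math

# Invented 2D landmark coordinates (pixels) picked on the frontal photograph.
right_pupil = (412, 388)
left_pupil  = (676, 396)
glabella    = (544, 310)
chin        = (548, 905)
dental_midline_x = 552            # x-coordinate of the interincisal contact point

# Cant of the inter-pupillary line relative to the image horizontal.
dx = left_pupil[0] - right_pupil[0]
dy = left_pupil[1] - right_pupil[1]
cant_deg = math.degrees(math.atan2(dy, dx))

# Facial midline approximated by the glabella-chin segment; its average
# x-position is compared with the dental midline.
facial_midline_x = (glabella[0] + chin[0]) / 2
midline_offset_px = dental_midline_x - facial_midline_x

print(f"inter-pupillary cant: {cant_deg:.1f} degrees")
print(f"dental vs facial midline offset: {midline_offset_px:.1f} px")
```

The pixel offset can be converted to millimetres once the digital ruler has been calibrated, as shown in the next sketch.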

After facial analysis, dentogingival analysis is done. The length of the upper lip at rest and in a smile is checked to determine the gingival display. The smile curve is established from the curvature of the incisal edges of the maxillary anterior teeth. The dental contour is made according to the lower lip proportions and the anteroposterior curvature of the teeth. This facial photograph is then cropped to show only the intraoral view. Three reference lines are marked on the teeth: a straight horizontal line drawn from canine tip to canine tip, another horizontal line on the incisal edges of the central incisors, and a vertical line passing through the dental midline (through the interdental papillae). This helps in reproducing the cross, that is, the reference inter-pupillary line and facial midline, from the face onto the intraoral view. A few additional lines, such as the gingival zenith line and connecting lines at the gingival and incisal levels, are drawn for a complete dental analysis. For adequate tooth dimensions, the ideal width-to-length ratio can be incorporated from any one of the published theories, which include the Golden proportion, Pound's theory, the Recurring Esthetic Dental proportion, the Dentogenic theory, or Visagism.
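As a simple worked example of one such proportion (the 0.618 "golden" factor and all tooth dimensions below are illustrative assumptions, not clinical prescriptions), the apparent widths of the lateral incisor and canine can be derived from that of the central incisor:

    # Sketch of the Golden proportion applied to apparent (frontal-view) tooth widths.
    # All numeric values are illustrative; clinical targets must be individualized.
    GOLDEN = 0.618                      # each tooth appears ~62% as wide as its mesial neighbour

    central_width = 8.5                 # apparent central incisor width in mm (hypothetical)
    lateral_width = central_width * GOLDEN
    canine_width = lateral_width * GOLDEN

    central_length = central_width / 0.78   # assuming a ~78% width-to-length ratio target

    print(f"Lateral incisor (apparent width): {lateral_width:.1f} mm")
    print(f"Canine (apparent width): {canine_width:.1f} mm")
    print(f"Central incisor length: {central_length:.1f} mm")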

The required changes are carried out with the help of a digital ruler (Fig. 1b), which can be calibrated on the photograph by measuring the width of the central incisors on the study model. These changes can be modified, decreased, or adapted to different situations, depending on the aesthetic requirements and individual needs of the patient. Fig. 2 shows the procedure of digital smile designing in the 3D DSD software Exocad.
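A minimal sketch of this calibration idea, assuming the true central incisor width has been measured on the study model with a calliper (all numbers are hypothetical), could look like this:

    # Sketch: calibrate a digital ruler by converting pixels to millimetres.
    # The known incisor width comes from the study model; pixel values are hypothetical.
    known_width_mm = 8.5      # central incisor width measured on the study model
    measured_width_px = 212   # the same incisor measured on the photograph, in pixels

    mm_per_px = known_width_mm / measured_width_px    # calibration factor

    gingival_display_px = 55                          # any other distance marked on the photo
    gingival_display_mm = gingival_display_px * mm_per_px
    print(f"Gingival display: {gingival_display_mm:.1f} mm")

Once the calibration factor is known, any distance drawn on the same photograph can be read off in millimetres rather than pixels.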

5. Advantages

Digital imaging and designing help patients visualize the expected final result before treatment starts, which enhances the predictability of the treatment. The clinician can address patients' concerns by digitally showing the final outcome, motivating and educating them about the benefits of the treatment. It improves the clinician's diagnosis and treatment plan through aesthetic visualization of the patient's problem, with digital analysis of facial, gingival and dental parameters that evaluates the smile and the face in an objective and standardized manner.

DSD leads to customization of the smile design by increasing the participation of patients in designing their own smile, which results in a more aesthetically driven, humanistic, emotional and confident smile. The patient may evaluate, provide opinions on, and approve the final shape of the new smile before any treatment procedures are performed, thus enhancing patient satisfaction. It leaves little scope for regret after treatment, since irreversible procedures, once carried out, cannot be undone. It also helps to evaluate and compare pre- and post-treatment changes: with the digital ruler, drawings, and reference lines, easy comparisons can be made between pre- and post-treatment photographs.

It improves communication not only between clinician and patient but also among interdisciplinary team members, between clinicians, and between the clinician and the laboratory technician. All team members can access this information whenever necessary to review, change, or add components during the diagnostic and treatment phases, without having to be present in the same place or at the same time. This enhances visual communication, improves transparency, and creates better teamwork and interdisciplinary treatment planning. The laboratory technician also receives feedback on the patient's expectations regarding tooth shape, arrangement, and color, enabling any desired modifications. This persistent double-checking ensures the quality of the final result.

A study conducted by Gabriele Cervino et al. reviewed as many as 24 articles on DSD published up to 2018, with the purpose of evaluating the effectiveness of Digital Smile Design techniques and whether Digital Smile Design brings any improvement to patient comfort and treatment. It took into consideration the "communicative" utility of the software, the therapeutic planning, and the aesthetic and functional rehabilitation of the patients. From all the articles on Digital Smile Design present in the literature, the authors concluded that this tool provides important information to the clinician and the patient. Patients can view their rehabilitations even before they start, and this can also have important medico-legal functions.

6. Limitations

1. As the diagnosis and treatment plan depend on photographic and video documentation, inadequacies in them may distort the reference image and result in an incorrect diagnosis and plan.
2. A complete 3D digital workflow requires up-to-date 3D software, an intraoral scanner, a 3D printer and CAD/CAM, which makes it expensive.
3. Training in the handling of certain software is necessary, which further increases time and cost.

7. Future prospects

A complete 3D digital workflow is still not extensively used; in the future it may come into widespread practice as more and more clinicians adopt digital scanners, 3D printers and CAD/CAM, making time-consuming impressions, plaster and wax far less necessary. With improvements in the software over the next few years, it will be possible to address facial aesthetics in advanced cases where implants need to be placed, by superimposing the files from a CT or cone-beam scan with the 3D files of an oral impression or a facial scan and a photograph. There is also the possibility of incorporating a 4D concept, in which motion is added to the smile design. With ever-evolving, fast-paced technology, a time may come, not so far away, when the digitally designed smile can be projected onto virtual reality glasses to foresee the desired smile in actual reality.

8. Conclusion

The digital smile design concept is a helpful tool for the aesthetic visualization of the patient's problem. It not only helps patients envision their treatment outcome but also improves the clinician's diagnosis and treatment planning.

 

References

1. Coachman C., Yoshinaga L., Calamita M., Sesma N. Digital smile design concepts. The Technologist. 2014.
2. Coachman C., Calamita M. Digital smile design: a tool for treatment planning and communication in aesthetic dentistry. Quintessence Dent Technol. 2012;35:103–111.
3. Evolution of Smile Design. Available online: https://media.digitalsmiledesign.com/christian-coachman-thoughts/smile-design-evolution (accessed 15th February 2020).
4. Daher R., Ardu S., Vjero O., Krejci I. 3D digital smile design with a mobile phone and intraoral optical scanner. Comp Cont Educ Dent. 2018;39(6):e5–8.
5. Aragón M.L., Pontes L., Bichara L., Flores-Mir C., Normando D. Validity and reliability of intraoral scanners compared to conventional gypsum models measurements: a systematic review. Eur J Orthod. 2016;38:429–434.
6. Zanardi P.R., Zanardi R.L., Stegun R.C., Sesma N., Costa B.N., Laganá D.C. The use of the digital smile design concept as an auxiliary tool in aesthetic rehabilitation: a case report. Open Dent J. 2016;10:28.
7. Coachman C., Calamita M.A., Sesma N. Dynamic documentation of the smile and the 2D/3D digital smile design process. Int J Periodontics Restor Dent. 2017;37(2):183–193.
8. Tjan A.H., Miller G.D. Some aesthetic factors in a smile. J Prosthet Dent. 1984;51(1):24–28.
9. Tarantili V.V., Halazonetis D.J., Spyropoulos M.N. The spontaneous smile in dynamic motion. Am J Orthod Dentofacial Orthop. 2005;128(1):8–15.
10. Omar D., Duarte C. The application of parameters for comprehensive smile aesthetics by digital smile design programs: a review of literature. Saudi Dent J. 2018;30(1):7–11.
11. Fradeani M. Esthetic Rehabilitation in Fixed Prosthodontics. Chicago: Quintessence; 2004.
12. Davis N.C. Smile design. Dent Clin North Am. 2007;51(2):299–318.
13. Dias N.S., Tsingene F. SAEF – Smile's aesthetic evaluation form: a useful tool to improve communication between clinicians and patients during multidisciplinary treatment. Eur J Esthetic Dent. 2011;6(2):160–176.
14. Paolucci B. Visagismo e odontologia. In: Hallawell P., editor. Visagismo Integrado: Identidade, Estilo, Beleza. São Paulo: Senac; 2009. pp. 243–250.
15. Paolucci B., Calamita M., Coachman C., Gurel G., Shayder A., Hallawell P. Visagism: the art of dental composition. Quintessence of Dental Technology; 2012. pp. 1–14.
16. Chiche G., Pinault A. Diagnosis and treatment planning of aesthetic problems. In: Aesthetics of Anterior Prosthodontics. Quintessence; 2004. pp. 13–25.
17. Cohen S.E. Fundamentals of dental aesthetics: analysis. In: Atlas of Cosmetic and Reconstructive Periodontal Surgery. 3rd ed. PMPH; 2007. pp. 217–238.
18. Naini F.B., Gill D.S. Facial aesthetics: 2. Clinical assessment. Dent Update. 2008;35(3):159–170.
19. Priya K., Rahul D.P., Varma S., Namitha R. Norms for crafting a beautiful smile. Amirta J Med. 2013;2(9):4–9.
20. Vassantha Kumar M., Ahila S.C., Suganya Devi S. The science of anterior teeth selection for a completely edentulous patient: a literature review. J Indian Prosthodont Soc. 2011;11(1):7–13.
21. Ward H.D. Proportional smile design: using the recurring esthetic dental proportion to correlate the widths and lengths of the maxillary anterior teeth with the size of the face. Dent Clin North Am. 2015;59(3):623–638.
22. Farias F.O., Ennes J.P., Zorzatoo J.R. Aesthetic value of the relationship between the shapes of the face and permanent upper central incisor. Int J Dent. 2010;1:1–6.
23. Pedrosa V.O., Franca F.M., Florio M.F., Basting R.T. Study of the morpho-dimensional relationship between the maxillary central incisors and the face. Braz Oral Res. 2011;25(3):210–216.
24. Sharma A., Luthra R., Kaur P. A photographic study on Visagism. Indian J Oral Sci. 2015;6(3):122–127.
25. Ahmad N., Ahmed M., Jafri Z. Aesthetics considerations in the selection of teeth for complete denture patients: a review. Ann Dent Spec. 2013;1(1):4.
26. Exocad GmbH. Nov 2019. Viewed March 2020.
27. Neto A.F., Bandeira A.S., de Miranda B.F., Sánchez-Ayala A. The use of mock-up in dentistry: working with predictability. Full Dent Sci. 2015;6:256–260.
28. Lin W.S., Zandinejad A., Metz M.J., Harris B.T., Morton D. Predictable restorative work flow for computer-aided design/computer-aided manufacture–fabricated ceramic veneers utilizing a virtual smile design principle. Operat Dent. 2015;40(4):357–363.
29. Fan F., Li N., Huang S., Ma J. A multidisciplinary approach to the functional and aesthetic rehabilitation of dentinogenesis imperfecta type II: a clinical report. J Prosthet Dent. 2019;122(2):95–103.
30. Ahrberg D., Lauer H.C., Ahrberg M., Weigl P. Evaluation of fit and efficiency of CAD/CAM fabricated all-ceramic restorations based on direct and indirect digitalization: a double-blinded, randomized clinical trial. Clin Oral Invest. 2016;20(2):291–300.
31. Cervino G., Fiorillo L., Arzukanyan A.V., Spagnuolo G., Cicciù M. Dental restorative digital workflow: digital smile design from aesthetic to function. Dent J (Basel). 2019;7(2):30.
32. Meereis C.T., De Souza G.B., Albino L.G., Ogliari F.A., Piva E., Lima G.S. Digital smile design for computer-assisted aesthetic rehabilitation: two-year follow-up. Operat Dent. 2016;41(1):E13–E22.
33. Halley E. The future—3D planning but with the face in motion. Br Dent J. 2015;218:326–327.