Original Article
ARTICLE IN PRESS
doi: 10.25259/APOS_326_2024

A content analysis of YouTube videos on interproximal enamel reduction

Division of Clinical Oral Health Sciences, School of Dentistry, International Medical University, Kuala Lumpur, Malaysia.

*Corresponding author: Kirti Saxena, Division of Clinical Oral Health Sciences, International Medical University, Kuala Lumpur, Malaysia. kirtisaxena219@gmail.com

Licence
This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial-Share Alike 4.0 License, which allows others to remix, transform, and build upon the work non-commercially, as long as the author is credited and the new creations are licensed under the identical terms.

How to cite this article: Tam W, Tham J, Nimbalkar S, Saxena K, Gunjal S. A content analysis of YouTube videos on interproximal enamel reduction. APOS Trends Orthod. doi: 10.25259/APOS_326_2024

Abstract

Objectives:

Interproximal enamel reduction (IPR) is a routine orthodontic procedure. Because IPR videos on YouTube are easily accessible to patients yet undergo no peer review, this study evaluated the credibility, educational value, understandability, and actionability of YouTube videos on IPR.

Material and Methods:

YouTube videos were systematically searched, and the 29 videos that met the predetermined criteria were assessed. The Journal of the American Medical Association (JAMA) benchmark score was used to assess the credibility of information (COI), a novel scoring system was used to score the educational value (EV), and the understandability of information (UOI) and actionability of information (AOI) were determined using the patient education materials assessment tool for audio-visual material.

Results:

The mean COI score was low (2.55 ± 1.30), the EV score was also deficient (6.18 ± 6.37), the UOI score was good (71.93%), but the overall AOI score was low (44.83%).

Conclusion:

The COI in YouTube videos on IPR is deficient, particularly regarding attribution. The EV is low, particularly for post-procedural steps. Some videos are of good quality in terms of UOI but depict poor AOI. Dental professionals should recognize the significance of YouTube as an information source for patients and direct patients to reliable content for informed decision-making. The guidelines provided can help dental professionals produce evidence-based content.

Keywords

Dental videos
Health-related videos
Interproximal enamel reduction
Social media
YouTube

INTRODUCTION

Interproximal enamel reduction (IPR) is routinely undertaken in orthodontic treatment to create minor intra-arch space.[1] IPR involves the selective removal of proximal enamel between teeth while preserving the natural shape of teeth.[2] It may be utilized alongside active orthodontic treatment for a Bolton discrepancy, mild crowding, midline shift, and black triangles.[3,4]

Nowadays, patients turn to online resources to gather information before making decisions.[5] In the field of dentistry, patients have many resources to retrieve health-related information from different platforms. YouTube, established in 2005, is an online video-sharing platform that has steadily grown in popularity, with approximately 5 billion videos viewed each day.[5] Although it has become the most visited website for accessing health-related videos, this platform may potentially contain inaccurate information as YouTube permits content without regulation, peer review, or moderation.[5-7]

It is crucial to evaluate the credibility of videos to assist dental professionals in choosing videos with evidence-based information for patient education.[8-10] Studies evaluating the credibility of information (COI) of dental videos on YouTube using the Journal of the American Medical Association (JAMA) benchmarks have reached differing conclusions.[8-10] Ozsoy concluded that smile design videos on YouTube showed fair credibility.[8] According to Kodonas and Fardi, YouTube videos on vital pulp treatment are low in credibility.[9] Similarly, Olkun et al. reported that online information on lingual orthodontics was of poor credibility, with a low JAMA score.[10]

Patients must be informed about the dental treatments offered to them.[11,12] Therefore, the educational value (EV) of YouTube videos on IPR needs to be evaluated for accuracy and completeness. Various studies have evaluated the EV of dental videos on YouTube using different tools. Meade et al. reported that the quality of information on orthodontic retention and retainers on YouTube was deficient.[11] Yahya et al. investigated the quality of information on temporary anchorage devices on YouTube and found the EV of the videos to be inadequate.[12]

Understandability refers to the comprehension of the key message of a video by patients from diverse backgrounds. Actionability, on the other hand, is the capability of viewers to identify what they can do based on the audiovisual information provided.[13] It is therefore important to analyze the understandability of information (UOI) and actionability of information (AOI) of YouTube videos. The patient education materials assessment tool for audio-visual material (PEMAT-A/V) assesses how well audio-visual (A/V) material addresses organization, reliability of visual aids, word choice, and overall clarity. PEMAT-A/V identifies the IPR videos that are easy to understand and act on.[14]

IPR has been widely utilized in dentistry and performed by specialists as well as general dental practitioners.[1] Even though there are some studies examining different dental procedure-related content on YouTube, there is a paucity of research to appraise the accuracy and reliability of the content of videos on IPR.[5,14] YouTube videos can play a role in shaping public opinion about IPR. Therefore, this study aimed to evaluate the COI, EV, UOI, and AOI of YouTube videos on IPR for patient education.

METHODS

The study design was a process-based audit, as the information available on YouTube was being examined. This study was approved by the Joint Research and Ethics Committee of the University vide grant no.: BDS I-01-2023(18).

Search strategy

The search was conducted on the YouTube website (https://www.youtube.com) using an advanced search of different terms related to IPR. YouTube was systematically searched using six search terms: (1) interproximal reduction, (2) enamel stripping, (3) air-rotor stripping, (4) mesiodistal reduction, (5) slenderization, and (6) re-proximation. Videos were searched from 3rd to 12th December 2023. Using the YouTube advanced search option “sort by view-count,” the top 50 videos per search term by number of views were saved in a specially created account.

Inclusion and exclusion criteria

Videos that were (1) >1 minute in duration, (2) narrated in English, (3) uploaded between 2014 and 2023, and (4) relevant to IPR were included in the study. The exclusion criteria were (1) duplicate videos, (2) YouTube Shorts, and (3) video duration >15 minutes. The video selection process is shown in [Figure 2].

Figure 1: Interproximal enamel reduction-related domains.

Video evaluation

A total of 115 videos were viewed and graded by two evaluators (T.W.Y. and T.J.S.) for video characteristics, COI, EV, UOI, and AOI.

Video characteristics

The following video characteristics were recorded: (1) Source, (2) duration, (3) months since upload, (4) likes, (5) number of views, (6) number of subscribers, and (7) viewing rate. The viewing rate of the videos was calculated using the following equation:

Viewing rate (%) = (number of views / days since upload) × 100
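As a concrete illustration, the viewing rate above can be computed as follows (a minimal sketch; the function name and sample figures are hypothetical, not from the study):

```python
# Minimal sketch of the viewing-rate calculation above.
# The function name and example numbers are illustrative only.
def viewing_rate(views: int, days_since_upload: int) -> float:
    """Viewing rate (%) = number of views / days since upload x 100."""
    return views / days_since_upload * 100

# e.g., a video with 15,000 views uploaded 500 days ago:
print(viewing_rate(15_000, 500))  # 3000.0
```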

Credibility of information (COI)

The JAMA benchmark scale was used to score COI, as depicted in [Table 1]. Each video could achieve a maximum score of 4, indicating high credibility.[7,15,16]

Table 1: The Journal of the American Medical Association score criteria for assessing the credibility of information.
Criteria Description Scoring criteria
Authorship Author and contributor credentials and affiliations are clearly stated Absent: 0, Mentioned: 1
Attribution Clearly lists all copyright information and includes references or sources for content Absent: 0, Mentioned: 1
Disclosure Conflicts of interest, funding, sponsorship, advertising, support, and video ownership are disclosed Absent: 0, Mentioned: 1
Currency Date of post and subsequent updates to content are included Absent: 0, Mentioned: 1
Total /4

Educational value (EV)

Eight domains related to IPR were identified from the British Orthodontic Society website, as shown in [Figure 1].[17]

The evaluation criteria for EV were developed in accordance with similar YouTube video analysis studies related to oral health.[1,12] The EV scoring criteria were based on the information provided about eight predetermined domains in the videos, as depicted in [Table 2]. Each domain is graded from 0 to 3. Therefore, each video could achieve a maximum score of 24 points, indicating comprehensive and scientifically valid information for all eight domains. A score of 18 points was deemed the minimum value for a video to be considered of adequate EV.

Table 2: The educational value score criteria and description.
Interproximal reduction-related domains Video contains no information or misleading information (0) Video contains inadequate information (1) Video contains adequate information (2) Video contains excellent information (3) Score
Definition /3
Safe amount of tooth grinding /3
Indications /3
Contraindications /3
Different methods /3
Procedure /3
Complication /3
Post-procedural instructions /3
Total Score /24
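The EV scoring described above (eight domains, each graded 0–3, with ≥18 of 24 points deemed adequate) can be sketched as follows; the function and variable names are illustrative, not from the study:

```python
# Illustrative sketch of the EV scoring in Table 2. Domain names follow
# Figure 1; the 0-3 grades and 18-point threshold are from the Methods.
DOMAINS = [
    "definition", "safe amount of tooth grinding", "indications",
    "contraindications", "different methods", "procedure",
    "complications", "post-procedural instructions",
]

def ev_score(grades: dict) -> tuple:
    """Sum the 0-3 grade per domain (0 if absent); >= 18 of 24 is adequate EV."""
    total = sum(grades.get(domain, 0) for domain in DOMAINS)
    return total, total >= 18

# A video rated "excellent" (3) on six domains and scoring 0 on the rest:
total, adequate = ev_score({d: 3 for d in DOMAINS[:6]})
print(total, adequate)  # 18 True
```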

Understandability of information (UOI) and Actionability of information (AOI)

Each video was evaluated by PEMAT-A/V, which is designed to evaluate and compare the UOI and AOI of patient educational A/V resources.[13] It also provides suggestions to improve educational material, based on the parameters assessed.[13] UOI was measured using an 11-item scale, which was sub-divided into five parts, namely (1) content (1 item), (2) word choice and style (3 items), (3) organization (4 items), (4) layout and design (2 items), and (5) use of visual aids (1 item). The AOI was scored using a 3-item scale considering (1) identification of action to be taken by the user, (2) addressing the user directly, and (3) breaking the action into steps as depicted in [Table 3]. UOI and AOI were reported as a percentage of agreed responses out of 11 and 3 items, respectively.

Table 3: PEMAT-A/V scoring criteria for understandability of information and actionability of information.
Understandability
No Description Response option
Topic content
1 The material makes its purpose completely evident Disagree: 0, Agree: 1
Word choice and style
2 The material uses common, everyday language Disagree: 0, Agree: 1
3 Medical terms are used only to familiarize the audience with the terms. When used, medical terms are defined Disagree: 0, Agree: 1
4 The material uses the active voice Disagree: 0, Agree: 1
Organization
5 The material breaks or “chunks” information into short sections Disagree: 0, Agree: 1, Very short material: N/A
6 The material’s sections have informative headers Disagree: 0, Agree: 1, Very short material: N/A
7 The material presents information in a logical sequence Disagree: 0, Agree: 1
8 The material provides a summary Disagree: 0, Agree: 1, Very short material: N/A
Layout and design
9 Text on the screen is easy to read Disagree: 0, Agree: 1, No text or all text is narrated: N/A
10 The material allows the user to hear the words clearly (e.g., not too fast, not garbled) Disagree: 0, Agree: 1, No narration: N/A
Use of visual aids
11 The material uses illustrations and photographs that are clear and uncluttered Disagree: 0, Agree: 1, No visual aid: N/A
Total score = (sum of item scores/number of applicable items) × 100
Actionability
12 The material clearly identifies at least one action the user can take Disagree: 0, Agree: 1
13 The material addresses the user directly when describing actions Disagree: 0, Agree: 1
14 The material breaks down any action into manageable, explicit steps Disagree: 0, Agree: 1
Total score = (sum of item scores/number of applicable items) × 100

PEMAT-A/V: Patient education materials assessment tool for audio-visual material, N/A: Not applicable

As per the PEMAT-A/V user guidelines, an item is rated as “agree” when the respective characteristic occurs through 80–100% of the video and “disagree” when the characteristic is not adequately met throughout the video. The total scores for UOI and AOI are each divided by the total possible score (the number of items on which the material was rated, excluding items deemed not applicable) and multiplied by 100 to yield a percentage; a higher percentage indicates better UOI or AOI. PEMAT-A/V was modified and used as a systematic method to evaluate and compare the UOI and AOI of patient education materials.[13] Any video scoring below 70% is categorized as having poor UOI or AOI.[18]
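The percentage calculation described above can be sketched as follows (an illustrative sketch; names are hypothetical, and not-applicable items are excluded from the denominator as the guidelines specify):

```python
# Sketch of the PEMAT-A/V percentage: each item is 1 (agree) or 0 (disagree),
# and not-applicable items (None here) are excluded from the denominator.
def pemat_percent(ratings: list) -> float:
    applicable = [r for r in ratings if r is not None]
    return sum(applicable) / len(applicable) * 100

# 11 UOI items: two rated N/A, 7 of the remaining 9 rated "agree":
uoi = [1, 1, 1, 0, 1, None, 1, None, 1, 0, 1]
print(round(pemat_percent(uoi), 2))  # 77.78 -> above the 70% "poor" cut-off
```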

Statistical analysis

All videos were viewed and assessed independently by two evaluators (T.W.Y. and T.J.S.), with scores recorded in Microsoft Office Excel. Descriptive statistics were calculated using IBM Statistical Package for the Social Sciences (version 29). Before the assessment began, both evaluators were calibrated by comparing their ratings against those of an orthodontic specialist (S.K.), who rated the videos on the same parameters. The intraclass correlation coefficient (ICC) was used to examine intra-rater and inter-rater agreement for COI, EV, UOI, and AOI, with the assessment repeated after an interval of 4 weeks to determine intra-rater agreement.

RESULTS

A total of 29 videos were found to be eligible for analysis, as shown in [Figure 2]. The descriptive statistics are presented in [Table 4]. Most videos were uploaded by dental professionals (13 [44.82%]), followed by academic institutions (8 [27.58%]), companies or commercial organizations (7 [24.13%]), and patients (1 [3.44%]).

Table 4: Video characteristics.
Characteristics Minimum Maximum Mean±SD*
Duration (s) 60 569 230.46±166.33
Months since upload 12 156 60.79±42.99
Number of likes 3 3,700 687.48±1,044.70
Number of views 77 423,909 74,815±103,110
Number of subscribers 13 328,000 56,704±78,765
Viewing rate (%) 77.17±132.24
SD: Standard deviation
Figure 2: Flowchart showing video selection.

The mean COI score for each criterion is depicted in [Table 5]. Currency was clearly portrayed in all videos, followed by disclosure and authorship, whereas attribution for references and copyright information was least covered. [Figure 3] depicts the COI scores of all videos; only two videos display high COI.

Table 5: Credibility of information of videos using JAMA benchmark.
JAMA benchmarks Range Mean±SD*
Authorship 0–1 0.62±0.49
Attribution 0–1 0.21±0.41
Disclosure 0–1 0.72±0.45
Currency 0–1 1
SD: Standard deviation, *Maximum score 1, JAMA: Journal of the American Medical Association
Figure 3: The overall credibility of information score using the JAMA benchmark. JAMA: Journal of the American Medical Association.

Considering the individual COI parameters, attribution scored the lowest (0.21 ± 0.41), with only 6 videos scoring 1; 18 videos scored 1 for authorship and 21 for disclosure, while currency scored 1 in all 29 videos.

[Table 6] shows the mean EV score for each domain. The domain “procedure” scored the highest with a mean of 1.38 ± 0.94, followed closely by “different methods” and “definition,” while “post procedural instructions” scored the lowest mean of 0.17 ± 0.47 followed by “contraindications,” “safe amount of tooth grinding,” and “complications.” [Figure 4] shows overall EV scores, where only one video scored above 18, indicating that most videos had inadequate EV.

Table 6: Educational value of videos.
Domains Range Mean±SD*
Definition 0–3 1.07±0.96
Safe amount of tooth grinding 0–3 0.48±0.74
Indications 0–3 0.97±0.82
Contraindications 0–3 0.21±0.77
Different methods 0–3 1.31±0.85
Procedure 0–3 1.38±0.94
Complications 0–3 0.59±0.82
Post-procedural instructions 0–3 0.17±0.47
Figure 4: The overall educational value scores of YouTube videos.

[Figure 5] depicts the mean scores for each category of the UOI and AOI parameters. For UOI, all videos made their purpose evident and presented information in a logical sequence, and most used simple, everyday language (26 [89.66%]). However, only 1 (3.45%) video summarized key points at the end, and just 10 (34.48%) videos used headers for different parts to ease understanding.

Figure 5: The overall mean PEMAT-A/V scores for understandability of information (UOI) and actionability of information (AOI). PEMAT-A/V: Patient education materials assessment tool for audio-visual material.

Considering the AOI, 20 videos (68.97%) clearly identified at least one action for the user but only four videos (13.79%) addressed the user directly.

[Figure 6] shows the overall scores of UOI and AOI. Considering the PEMAT-A/V threshold of at least 70%, good UOI and AOI were seen in 17 and 3 videos, respectively.

Figure 6: The overall understandability of information and actionability of information scores.

The intra- and inter-rater reliability was compared using the ICC. ICC values were interpreted according to the current literature, with <0.40 as poor, 0.40–0.59 as fair, 0.60–0.74 as good, and 0.75–1.0 as excellent. The intra-rater reliability for COI, EV, and PEMAT-A/V was 0.82, 0.99, and 0.99 for T.W.Y. and 0.90, 0.99, and 0.99 for T.J.S., respectively, indicating strong intra-rater reliability. The inter-rater reliability for COI, EV, and PEMAT-A/V was 0.77, 0.99, and 0.94, respectively.
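The ICC bands cited above can be expressed as a simple threshold classifier (an illustrative sketch; the function name is hypothetical):

```python
# Illustrative mapping of ICC values to the agreement bands in the text:
# <0.40 poor, 0.40-0.59 fair, 0.60-0.74 good, 0.75-1.0 excellent.
def icc_band(icc: float) -> str:
    if icc < 0.40:
        return "poor"
    if icc < 0.60:
        return "fair"
    if icc < 0.75:
        return "good"
    return "excellent"

# The inter-rater ICC of 0.77 reported for COI falls in the top band:
print(icc_band(0.77))  # excellent
```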

DISCUSSION

There is a rise in the number of patients using the internet to search for health-related information.[19,20] According to YouTube analytics, the majority of users are aged 25–34 (20.7%) and 35–44 (16.7%) years, which are the fastest-growing demographics on the platform.[21] Furthermore, people prefer to obtain information from A/V sources rather than printed online educational materials.[22] Studies conducted across multiple specialties have assessed A/V education materials and concluded that online materials are often inaccurate and incomprehensible.[23] Many studies have evaluated the quality of oral healthcare videos on YouTube pertaining to dental procedures. As YouTube allows free uploads, the quality of the content cannot be controlled.[24,25] Therefore, it is crucial to evaluate the credibility of oral health-related videos on YouTube to ascertain the reliability and accuracy of authors’ credentials, content sources, and ownership disclosure.[11] The COI, EV, UOI, and AOI of YouTube videos for obtaining information about procedures are debated owing to the lack of standardization in video content.[25]

The mean duration of the 29 videos assessed was 230 seconds (3.8 min), a little shorter than in other studies on orthodontic clear aligners and lingual orthodontics, which reported mean durations of 7.40 min and 4.42 min, respectively.[25,26] According to Chauvet et al., video duration should be <10 min to maintain the patient’s focus and attention span, implying that the assessed videos were of acceptable duration.[27]

The mean number of views per video was 74,815 ± 103,110 [Table 4], which lies between the average of 10,765 views recorded for YouTube videos on lingual orthodontic treatment and the 113,839 recorded for orthodontic clear aligners.[25,26] The mean number of subscribers was 56,704 ± 78,765, with a mean of 687.48 ± 1044.70 likes. The videos had been uploaded for a mean of 60.79 ± 42.99 months, and the mean viewing rate was 77.17%. All these values lean toward the upper range, reflecting the scarcity of information available to patients about IPR. Most videos were uploaded by dental professionals, which highlights the need for a framework for producing dental videos for patient education.

The intra-rater and inter-rater scores in this study for COI, EV, UOI, and AOI were high, indicating the reliability of the JAMA benchmarks for scoring COI, the novel tool for assessing EV, and the PEMAT-A/V for gauging the UOI and AOI of the videos.

JAMA is a tool that evaluates the reliability and plausibility of videos.[8] The results of this study showed that the COI of videos about IPR uploaded to YouTube was deficient, with a mean score of 2.55. Olkun et al. reported that the average mean of JAMA benchmarks for videos on lingual orthodontics was 1.71.[10] As shown in [Figure 3], only two videos scored 4 on the JAMA benchmarks: (1) “Flash -IPR Process” by Flash Orthodontics (ln.run/g6nnC) and (2) “Orthostripping with disc” by Dental Movies (ln.run/wyGAg). These findings highlight the deficiency in COI, especially regarding authorship, attribution, and disclosure. Attribution is defined as the acknowledgment of copyright information and references for the content.[10] Attribution received the lowest score, undermining the scientific credibility of the videos. Consequently, these videos may not be evidence-based or convey information derived from expert opinions. Only six videos scored 1 for attribution, depicting low COI, similar to the study by Olkun et al., in which only two videos scored 1 for attribution.[10] It is important to evaluate the attribution of oral health-related videos on YouTube to ensure that individuals are not misled by information that is not evidence based.[15] The currency domain, which indicates whether the date of posting is shown, scored full marks in all videos.

The EV scoring system used in this research was adapted from similar YouTube video analysis studies related to oral health.[12,17,28] Most videos contained information about the procedure of interproximal reduction (86.2%) and the different methods of performing IPR (79.3%). The indications for performing IPR (72.4%) were also explained in most of the videos. On the other hand, most videos failed to provide post-procedural instructions (13.8%) and contraindications (6.9%) for IPR. These values imply a paucity in the domains that guide the patient on post-operative hygiene maintenance and the cases in which IPR is contraindicated. Only one video attained a score of 19, indicating adequate EV: “Work Tips: Interproximal Enamel Stripping (IPR)” by VinciSmile USA (ln.run/b60yo). Both the overall EV score and the domain score for post-operative care are comparable to those in the study analyzing YouTube videos on miniscrew anchorage.[12]

YouTube videos on IPR are applicable as a source of patient education; hence, UOI and AOI were investigated using the PEMAT-A/V. This tool can be applied by untrained assessors, making it easy to use. Unfortunately, no dental health-related research employing the PEMAT-A/V tool was found in the literature. Seventeen videos (58.62%) obtained a good UOI score, indicating that patients from diverse backgrounds and literacy levels could understand IPR. However, only three videos (10.34%) had good AOI scores, implying that very few videos could guide the patient with post-operative care.

Rubel et al. reported good understandability and actionability scores in 35% and 37.5% of videos, respectively, in their analysis of YouTube videos on sinusitis.[29] Salama et al. found understandability and actionability scores of 54.5% and 21.8%, respectively, for videos on hypospadias.[30]

The mean UOI was good at 71.93%, in contrast to the 65% of YouTube videos on sinusitis reported as low in understandability.[29] Six IPR videos had actionability scores of 0%, whereas 20 videos had actionability scores ≤70%, broadly similar to the actionability scores of YouTube videos on sinusitis.[29] The overall understandability score of our videos was good; however, the actionability score was generally poor. Most videos lost AOI points for not addressing the user directly when describing actions, which hindered their actionability. Only one video attained a score of 100% for both UOI and AOI, indicating that it was easy to comprehend and implement: “What is and How to use Orthodontic Interproximal Reduction Kit | Waldent IPR Kit” by Dentalkart (ln.run/XvGXb).

The content analysis showed that the number of videos with high credibility, EV, understandability, and actionability is very limited, indicating that YouTube is currently inadequate as a source of information on IPR. Thus, health professionals must guide patients to reliable content to help them understand the procedure better.

This study has several limitations. Only English-language videos were assessed, which may limit the generalizability of the findings. As a cross-sectional study, time-specific changes in video characteristics could not be recorded. Only the top 50 videos for each of the six IPR-related search terms were analyzed; hence, other related content may have been missed. We recommend that similar studies be conducted periodically to track changes in the JAMA, EV, and PEMAT-A/V scores of online IPR videos.

These results imply that YouTube is a poor source of information about IPR; as no video scored highly in all aspects, links to the IPR videos with high COI, EV, understandability, or actionability scores have been provided to assist dental professionals in guiding their patients.

CONCLUSION

The credibility, EV, understandability, and actionability of IPR videos on YouTube are unreliable and deficient. This study demonstrated that even though the IPR videos have good understandability, they show poor actionability, as they do not motivate viewers to act.

As more patients are turning to YouTube for health-related information, dental professionals need to be vigilant about the ease of access to unreliable content by patients and guide patients towards authentic and evidence-based content.

Dental professionals, societies, and academic institutions should consider developing high-quality content that is evidence-based to align with the needs of the targeted patient population to guide prospective patients. This study may provide guidelines to produce appropriate and high-quality patient education videos.

Ethical approval:

The research/study was approved by the Institutional Review Board at IMU University, approval number BDS I-01-2023(18), dated 3rd July 2023.

Declaration of patient consent:

Patient consent was not required as no patients were involved in this study.

Conflicts of interest:

There are no conflicts of interest.

Use of artificial intelligence (AI)-assisted technology for manuscript preparation:

The authors confirm that there was no use of artificial intelligence (AI)-assisted technology for assisting in the writing or editing of the manuscript and no images were manipulated using AI.

Financial support and sponsorship: Nil.

References

  1. , , , , . Interproximal reduction of teeth: Differences in perspective between orthodontists and dentists. Angle Orthod. 2015;85:820-5.
    [CrossRef] [PubMed] [Google Scholar]
  2. , . Interproximal enamel reduction as a part of orthodontic treatment. Stomatologija. 2014;6:19-24.
    [Google Scholar]
  3. , . Enamel reduction procedures in orthodontic treatment. J Can Dent Assoc. 2003;69:378-83.
    [Google Scholar]
  4. , , . Contemporary orthodontics St. Louis, MO: Mosby Elsevier; . p. :406. Ch. 7
    [Google Scholar]
  5. . Analysis of YouTube videos related to interproximal reduction. J Dent Res Rev. 2022;9:165-72.
    [CrossRef] [Google Scholar]
  6. , . Orthodontic extractions and the internet: Quality of online information available to the public. Am J Orthod Dentofacial Orthop. 2011;139:e103-9.
    [CrossRef] [PubMed] [Google Scholar]
  7. , , , , . "How I whiten my teeth": YouTube™ as a patient information resource for teeth whitening. BMC Oral Health. 2020;20:183.
    [CrossRef] [PubMed] [Google Scholar]
  8. . Evaluation of YouTube videos about smile design using the DISCERN tool and journal of the American medical association benchmarks. J Prosthet Dent. 2021;125:151-4.
    [CrossRef] [PubMed] [Google Scholar]
  9. , . YouTube as a source of information about pulpotomy and pulp capping: A cross sectional reliability analysis. Restor Dent Endod. 2021;46:e40.
    [CrossRef] [PubMed] [Google Scholar]
  10. , , . The quality of internet information on lingual orthodontics in the English language, with DISCERN and JAMA. J Orthod. 2019;46:20-6.
    [CrossRef] [PubMed] [Google Scholar]
  11. , , . Orthodontic retention and retainers: Quality of information provided by dental professionals on YouTube. Am J Orthod Dentofacial Orthop. 2020;158:229-36.
    [CrossRef] [PubMed] [Google Scholar]
  12. , , . Orthodontic treatment with miniscrew anchorage: Analysis of quality of information on YouTube. Am J Orthod Dentofacial Orthop. 2023;164:97-105.
    [CrossRef] [PubMed] [Google Scholar]
  13. . Agency for healthcare research and quality. . Available from: https://www.ahrq.gov/health/literacy/patient-education/pemat.html [Last accessed on 2023 Nov 20]
    [Google Scholar]
  14. , . Dental patients' use of the internet. Br Dent J. 2009;207:583-75.
    [CrossRef] [PubMed] [Google Scholar]
  15. , . Quality of free gingival graft content in YouTube videos: Usability in patient information and student education. Med Oral Patol Oral Cir Bucal. 2023;28:e607-13.
    [Google Scholar]
  16. , , , . Audiovisual information of oral epithelial dysplasia: Quality, understandability and actionability. Oral Dis. 2023;30:1945-55.
    [CrossRef] [PubMed] [Google Scholar]
  17. . Patient information leaflet: Interproximal reduction. Available from: https://www.bos.org.uk [Last accessed on 2023 Oct 24]
    [Google Scholar]
  18. , , . Development of the patient education materials assessment tool (PEMAT): A new measure of understandability and actionability for print and audiovisual patient information. Patient Educ Couns. 2014;96:395-403.
    [CrossRef] [PubMed] [Google Scholar]
  19. , , , , , . A new dimension of health care: Systematic review of the uses, benefits, and limitations of social media for health communication. J Med Internet Res. 2013;15:e85.
    [CrossRef] [PubMed] [Google Scholar]
  20. , , . Medical YouTube videos and methods of evaluation: Literature review. JMIR Med Educ. 2018;4:e3.
    [CrossRef] [PubMed] [Google Scholar]
  21. . YouTube by the numbers (2019): Stats, demographics and fun facts. Omni Core Agency 2019 Available from: https://www.omnicoreagency.com/youtube-statistics [Last accessed on 2023 Nov 20]
    [Google Scholar]
  22. , . Assessing of the audiovisual patient educational materials on diabetes care with PEMAT. Public Health Nurs. 2019;36:379-87.
    [CrossRef] [PubMed] [Google Scholar]
  23. . Relationship between health service use and health information technology use among older adults: Analysis of the US national health interview survey. J Med Internet Res. 2011;13:e33.
    [CrossRef] [PubMed] [Google Scholar]
  24. , , , . YouTube as a source of information on mouth (oral) cancer. Oral Dis. 2016;22:202-8.
    [CrossRef] [PubMed] [Google Scholar]
  25. , . Lingual orthodontic treatment: A YouTube™ video analysis. Angle Orthod. 2018;88:208-14.
    [CrossRef] [PubMed] [Google Scholar]
  26. , . YouTube as a source of information about orthodontic clear aligners. Angle Orthod. 2020;90:419-24.
    [CrossRef] [PubMed] [Google Scholar]
  27. , , , , , , et al. What is a good teaching video? Results of an online international survey. J Minim Invasive Gynecol. 2019;27:738-47.
    [CrossRef] [PubMed] [Google Scholar]
  28. , . Guidelines for contemporary air-rotor stripping. J Clin Orthod. 2007;41:315-20.
    [Google Scholar]
  29. , , , , , , et al. Understandability and actionability of audiovisual patient education materials on sinusitis. Int Forum Allergy Rhinol. 2020;10:564-71.
    [CrossRef] [PubMed] [Google Scholar]
  30. , , , , , , et al. Consulting “Dr. YouTube”: An objective evaluation of hypospadias videos on a popular video-sharing website. J Pediatr Urol. 2020;16:70.e1-9.
    [CrossRef] [PubMed] [Google Scholar]