Dr. Marty Jablow is America’s Dental Technology Coach. Due to his expertise in dental technologies, he is in high demand both as a lecturer worldwide and as a consultant to the manufacturer community. He is a member of the prestigious Cellerant Best of Class selection committee, charged with the annual selection of industry awards, and is the Chief Development Officer of Cellerant Consulting Group, dentistry’s leading incubator and accelerator. Dr. Jablow has written articles for every major dental journal and produces his own column and video series. He is president of Dental Tech Advisors, a dental consulting company.
Angam Parashar is the co-founder of Dentistry.AI, a cloud-based artificial intelligence technology built to serve the dental industry.
While you may think sci-fi and imagination when you hear about artificial intelligence, the future of AI in dentistry is very, very real.
Some of us remember Will Robinson’s loyal robotic pal in the “Lost in Space” series of the 1960s. Others will trace the sci-fi vision of intelligent autonomous machines to the day Skynet became self-aware and turned on humanity in the “Terminator” films.
The term artificial intelligence (AI) and the official pursuit of intelligent machines in the scientific community actually date to a 1956 conference of researchers from Dartmouth and IBM.
Today’s AI is invading our everyday lives, albeit in more subtle ways, such as digital assistants like Alexa and Siri. And now, AI in dentistry has arrived!
Easy for dentists … hard for HAL
Consider a daily task that we as dentists view as routine and relatively simple: finding caries on X-rays. In fact, in doing so we are “processing” conversations with the patient, the patient’s history, complex and nuanced radiographic images, and our direct intraoral exam. We’re also leveraging our training, which includes our dental education and having read thousands of radiographs over years in practice. Even so, it’s estimated that our misdiagnosis rate of caries from X-rays may be 20 percent or higher.
For machines to perform tasks such as reading radiographs, they must be “trained” on huge data sets to recognize meaningful patterns. They must be able to understand new information in the form of spoken language, written text or images with proper context and nuance. Finally, they must be able to make intelligent decisions regarding that new information and then learn from mistakes to improve the decision-making process. In order for an AI system to have a practical benefit in the real world, all of this must happen in about the same time that a human being can perform the same task. Until very recently, applications of AI on a broad scale weren’t technically feasible or cost-effective, so the reality of AI hasn’t yet matched the possibilities.
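The “learn from mistakes to improve” loop described above can be illustrated with a classic perceptron, a deliberately simple stand-in for the far larger systems the article discusses (the data and learning rate here are hypothetical, chosen purely for illustration):

```python
# Minimal perceptron: predict, then learn from mistakes by nudging weights.
def predict(weights, bias, x):
    """Return 1 if the weighted sum of inputs plus bias is positive, else 0."""
    return 1 if sum(w * v for w, v in zip(weights, x)) + bias > 0 else 0

def train(samples, labels, epochs=10, lr=0.1):
    """Repeatedly predict each sample; adjust weights only when wrong."""
    weights, bias = [0.0] * len(samples[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            error = y - predict(weights, bias, x)  # 0 when correct
            weights = [w + lr * error * v for w, v in zip(weights, x)]
            bias += lr * error
    return weights, bias

# Toy data: learn the logical AND of two inputs.
w, b = train([[0, 0], [0, 1], [1, 0], [1, 1]], [0, 0, 0, 1])
print([predict(w, b, x) for x in [[0, 0], [0, 1], [1, 0], [1, 1]]])  # [0, 0, 0, 1]
```

The key idea scales up: the system starts with no knowledge, makes predictions, and each mistake nudges its internal parameters toward better future decisions.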
Whatever the technological challenges, machines do offer some clear advantages. Computers aren’t biased. As humans, we come with innate biases and we may judge things prematurely. Computers consider only the data being provided. Machines also don’t get tired. We can work for four or five hours straight before getting fatigued; machines work 24/7 without coffee breaks. Another advantage is that machines don’t get bored. The tasks that we gladly offload are monotonous and repetitive. Finally, machines are fast. While current AI systems are largely one-dimensional, trained and programmed for a specific task (e.g., reading radiographs and predicting the locations of caries), they’re often far faster at that task than humans.
The machines rise
The last five years have marked the modern era of AI, which is being ushered in with tremendous hype and investment. Big data and cloud computing have provided ready access to the large data sets required to train intelligent systems. All that data requires vast amounts of storage, which has become not only cheap but also fast in terms of data retrieval.
New York Times tech author John Markoff reported on another breakthrough that came out of a 2011 project called Google Brain, which applied “deep learning” methods to the challenge of extracting meaning from images in 10 million YouTube videos. Ironically, once set free to surf the internet, Google Brain did what its human counterparts do every day: it searched for (and successfully found) cats. What’s astounding about this accomplishment is that scientists never told the machine, “This is a cat.” By using deep learning, Google Brain created an abstract construct for cats based on image recognition.
Deep learning, the most cutting-edge AI technique in the broader field known as machine learning, uses layered neural networks patterned after the human brain. Traditional machine learning techniques rely on handcrafted rules defined by human domain experts and don’t improve with larger data sets. Deep learning creates its own rules that improve with additional data, making it well suited to interpret the unstructured data required for advanced applications such as self-driving cars, predicting earthquakes, and disease detection, diagnosis and treatment recommendations in medicine. Dentists actually have access to a deep-learning AI platform for detecting caries right now. Dentistry.AI, in the late stages of clinical evaluation, allows any licensed dentist to sign up as an investigator and utilize the system.
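The “layered” structure of these networks can be sketched in miniature. This toy forward pass in Python (with hypothetical, fixed weights; a real network learns its weights from large data sets) shows only how data flows through stacked layers, each transforming the output of the one before:

```python
def relu(values):
    """Common activation: pass positives through, zero out negatives."""
    return [max(0.0, v) for v in values]

def dense(inputs, weights, biases):
    """One fully connected layer: each output is a weighted sum plus a bias."""
    return [sum(w * i for w, i in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

# Toy two-layer network with hypothetical fixed weights.
x = [0.5, -1.2, 3.0]  # e.g., three pixel intensities from an image patch
hidden = relu(dense(x, [[0.2, -0.1, 0.4], [0.7, 0.3, -0.5]], [0.1, 0.0]))
score = dense(hidden, [[1.0, -1.0]], [0.0])  # single output score
print(score)  # approximately [1.52]
```

Stacking many such layers, each feeding the next, is what lets deep networks build up abstract concepts (edges, then shapes, then “cat” or “caries”) without handcrafted rules.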
The only drawback to deep learning at the time of the Google Brain project was that it called for tremendous computing resources. Finding cats on the internet required 2,000 CPUs housed in climate-controlled data centers. In a moment that has been called the Big Bang of AI, researchers equaled the deep learning capabilities of those 2,000 CPUs using 12 NVIDIA graphics processing units (GPUs). It turns out that the graphics chips that enable today’s ultra-realistic computer games also provide the speed and processing firepower to fuel a revolution in artificial intelligence systems. (As a side effect, gamers have been frustrated by backorders and shortages as the industry has snapped up NVIDIA’s chips, which are also used for mining cryptocurrency.)
AI on the dental horizon
Healthcare in general is a very natural customer for artificial intelligence applications. After conquering the television game show “Jeopardy” in 2011, IBM’s Watson has gone on to a second career in medicine. Oncologists at New York’s Memorial Sloan Kettering Cancer Center have recently trained Watson to help fight cancer. While the program is still in the early phases, the machine already does very specific, monotonous and time-intensive tasks extremely well. For example, Watson can read a half million medical research papers in 15 seconds and, with deep learning, can recommend diagnoses and the most promising treatment options.
With the ability to analyze vast numbers of diagnostic images such as X-rays, CT scans and MRIs, systems like this can point doctors and radiologists to the most probable areas of concern, increasing both the speed and probability of detection. And now with the FDA creating regulatory pathways to encourage developers of medical decision support software, analysts predict that the use of artificial intelligence in healthcare will grow tenfold in the next five years.
As noted earlier, the Dentistry.AI team has a platform for caries detection that’s in the final stages of clinical evaluation. Active development began just over two years ago and the engineers quickly learned that teaching a computer even this singular dimension of clinical dentistry isn’t easy. Yet there has been significant progress toward a clinically relevant predictive assistant for the dental practice.
In a recently published study, they presented the results of a man vs. machine caries detection challenge that pitted three practicing dentists against Dentistry.AI in evaluating 500 bitewings. The machine outperformed the dentists in “sensitivity,” which measures the proportion of caries correctly predicted when compared with total caries present in ground truth. The dentists won the day in “precision” (the ratio of correct predictions versus total predicted caries sites), although the machine was comparable to one of the three dentists in precision as well. Let’s call it a tie for now, although the system continues to learn and improve. Commercial availability of a reliable caries detection tool based on deep learning artificial intelligence appears likely in the next 12 to 24 months. Applications for detecting periodontal disease and the bone loss that accompanies it aren’t far behind.
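The two metrics from the challenge above can be computed directly from prediction counts. This minimal sketch uses hypothetical counts (not figures from the study) purely to make the definitions concrete:

```python
# Sensitivity and precision for caries detection, per the definitions above.
# TP: predicted caries sites that match ground truth
# FN: ground-truth caries the detector missed
# FP: predicted sites with no caries in ground truth

def sensitivity(tp, fn):
    """Proportion of actual caries that were correctly predicted."""
    return tp / (tp + fn)

def precision(tp, fp):
    """Proportion of predicted caries sites that were correct."""
    return tp / (tp + fp)

# Hypothetical counts for illustration only:
tp, fn, fp = 80, 20, 40
print(f"Sensitivity: {sensitivity(tp, fn):.2f}")  # Sensitivity: 0.80
print(f"Precision: {precision(tp, fp):.2f}")      # Precision: 0.67
```

The trade-off is intuitive: a detector that flags everything scores perfectly on sensitivity but poorly on precision, which is why a system must be judged on both.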
With the continued adoption of CBCT, interpreting cone beam images is another area in which AI can boost productivity. At this point, analyzing cone beam data requires a specific level of training and expertise. This analysis can be time consuming, involving sifting through hundreds of image slices. With AI, the entire process of interpretation can be automated to assess the image as a whole to detect dental pathologies more quickly and accurately. Clearly, the stage is set for the rapid proliferation of truly impactful applications of AI in dentistry over the next year or two. In 10 to 15 years, the use of AI-based technologies in the practice will be as commonplace and pervasive as practice management and imaging systems are today.
The next frontier
With technological obstacles falling and research turning to development, we’re certainly on the threshold of a range of AI-enabled tools for dentistry. We’ve already seen recent product introductions incorporating elements of artificial intelligence and machine learning (AI/ML). DEXVoice, launched at the Chicago Midwinter Meeting, is a stunning technology enabled by natural-language processing (NLP). Developed by DEXIS in conjunction with Simplifeye, the digital assistant replaces traditional point-and-click interfaces with simple and quick voice instructions (“Show me the last full-mouth X-ray.”). Also recently announced was a machine learning-based schedule optimization program called MMG ChairFill. This program interfaces with your practice management and marketing systems to proactively schedule unfinished treatment and launch new patient marketing campaigns based on profit maximization algorithms. We’ll continue to see AI rapidly employed in the practice management and growth arena.
With the newest capabilities enabled by deep learning techniques, AI will begin to impact dentistry on a clinical level as well. First-hand experience with development stage technologies (i.e., caries detection) has already demonstrated AI’s potential value in everyday practice. We’ve confirmed that these tools can recognize things on images that even the most experienced dentist may otherwise miss. Furthermore, we’ve seen the results returned in near real time, fast enough to be incorporated into a busy practice workflow.
In the very near future, we foresee deep learning image analysis tools assisting in the diagnosis and treatment planning of periodontal disease by enabling early detection of bone loss and changes in bone density. Detection of peri-implantitis and early intervention is a likely benefit in implant dentistry. In orthodontics, more sophisticated predictive models for tooth movement will likely enhance digital treatment planning. Applying deep learning image analysis to oral cancer will lead to earlier detection and more accurate diagnoses with lifesaving implications.
The apocalyptic sci-fi fantasy that these machines and systems will replace us as dentists is nowhere near reality. It is, however, certain that they’ll soon make us better dentists by providing more data points for our clinical decision making.
For technology enthusiasts, our advice is to strap yourself in: an incredible voyage lies just ahead!
1. White SC, Hollender L, Gratt BM. Comparison of xeroradiographs and film for detection of proximal surface caries. J Am Dent Assoc. 1984;108:755-759.
2. Captain S. Paging Dr. Robot: The coming AI health care boom. Fast Company. January 8, 2016. https://www.fastcompany.com/3055256/paging-dr-robot-the-coming-ai-health-care-boom