Saturday, 29 December 2012

General Dental Council, CPD and what the point of it all is

The General Dental Council is, until the end of January 2013, seeking comments from the public and dental professions on its new ideas for CPD.

As I've learnt more about the way we do - and largely don't - put new knowledge into practice, I have become more sceptical that sitting in a lecture theatre for 15 / 20 / 50 hours a year is going to change my practice much. Indeed, the EPOC group at Cochrane has published a systematic review looking at various educational meeting formats (conferences, lectures, workshops, seminars, symposia, and courses) to see how effective they are at improving professional practice or health outcomes, how they compare to other types of intervention, and whether they can be made more effective by changing the way they are delivered.

The review included 81 randomised controlled trials. There was a wide range of educational delivery and support amongst these including didactic teaching, interactive teaching, mixed formats, reminders, patient education materials, supportive services, feedback reports, and educational outreach. The authors judged that 17 studies had a low risk of bias, 44 a moderate risk, and 20 a high risk.

Overall they found that any intervention where educational meetings were a component resulted in a median risk difference for complying with 'desired practice' of 6% (interquartile range 1.8 to 15.9%) for the low to medium risk of bias studies. What this means is that if 100 people attended an educational meeting and 100 didn't, 6 more in the attending group would comply with the 'desired practice' afterwards compared to those who complied in the non-attending group.
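
To make the arithmetic concrete, here is a minimal sketch in Python. The group sizes and compliance counts are invented purely to mirror the 100-versus-100 worked example above; they are not the review's actual data.

```python
# Risk difference: the proportion complying with the 'desired practice'
# in one group minus the proportion in the other. The numbers below are
# hypothetical, chosen to match the 6% worked example in the text.
def risk_difference(events_a, n_a, events_b, n_b):
    """Return (risk_a, risk_b, risk_a - risk_b)."""
    risk_a = events_a / n_a
    risk_b = events_b / n_b
    return risk_a, risk_b, risk_a - risk_b

# Suppose 26 of 100 meeting attendees comply afterwards, versus
# 20 of 100 non-attenders:
ra, rb, rd = risk_difference(26, 100, 20, 100)
print(f"attenders {ra:.0%}, non-attenders {rb:.0%}, difference {rd:.0%}")
# → attenders 26%, non-attenders 20%, difference 6%
```

The review's 6% figure is the median of such differences across the included trials, with the interquartile range describing how widely those differences varied.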

Mixed didactic and interactive meetings had a risk difference of 13.6%, didactic meetings a risk difference of 6.9%, and interactive meetings a risk difference of 3.0%, when compared to no intervention.

There are a number of other reviews from EPOC that look at other ways of trying to improve practice and health outcomes, such as audit and feedback and educational outreach visits. These find risk differences of 4.3% (interquartile range 0.5% to 16%) and 5.6% (interquartile range 3.0% to 9.0%) respectively. Thus, whilst these interventions don't seem to differ much in their effectiveness at changing professional practice, educational meetings (which are a core part of verifiable CPD) are not the only game in town.

So it seems to me that if the intention of CPD is - at least in part - to help improve patient outcomes then the GDC needs to think more broadly about what counts as useful CPD. After all, if attending a meeting that gives me an hour's worth of CPD results in no improvement in patient outcomes, what was the point? But if instead I read a Cochrane Review (for which I receive no hours because it is non-verifiable) and yet I change the way I and colleagues manage patients for the better by using audit and feedback and peer-support, wouldn't this be something that really counts?

Please go read the documents and give them some feedback before the end of January 2013...

Sunday, 23 December 2012

What evidence is supporting what you're being told to do?

Communication, evidence and ignorance

We had a patient come in with irreversible pulpitis in a lower premolar a few days ago. The student treating the patient was good clinically and she opened the tooth, did all the necessary instrumentation and was ready to obturate. But she hesitated. She had been taught that you don't obturate when a tooth (or its periodontal ligament) is symptomatic. We dressed the tooth and had a look to see whether anyone had researched this (I'm a cynic by nature, remember).

On a quick search of PubMed on clinic we were able to find randomised controlled trials involving symptomatic teeth, some of which found no difference in postoperative pain or success whether the tooth was left for a second appointment or not, and some of which did. There was also a Cochrane review from 2008 that found no difference in healing outcomes whether the tooth was treated in one or two visits, though it did find that single-visit treatments resulted in significantly more people taking painkillers.

When we discussed this, the students said they had assumed that what they were told in their course was based on black-and-white evidence: obturate when the tooth is symptomatic and your failure rate is higher. In fact this recommendation was based on the personal opinion of the teacher (I checked), who thought dentists and patients would be happier knowing the tooth had settled down before obturating, perhaps drawing on some of the studies showing single-visit treatments resulted in more painkillers being taken. I know this person very well and I am fairly certain that there is no intention to mislead students into thinking the healing was better in two-visit treatments (evidence level: personal opinion). This to me seems to be as much about miscommunication as anything else: it isn't clear to the students what the basis of a recommendation is (evidence level: personal opinion).

Stay with me here as I move from undergraduate teaching to continuing professional development (CPD).

CPD: just wise thinking or based in high level evidence?

A few weeks ago Martin Kelleher wrote a decent opinion piece in the BDJ asking how much of what is taught on CPD courses gets translated into practice. He noted the difficulty of measuring the uptake of new knowledge in practice. There are all sorts of issues around this, amongst them: does a user of the knowledge need to use it as it was taught for it to qualify as being used; what if it gets used, but many months or years later; even if it gets used does it change patient outcomes?

Martin is right to raise these issues but I think there is something else we need to think about, whether we are teachers or learners.

When we sit down in a lecture or some other environment where we hope to learn something, the "knowledge" we gain from it could come from many sources. We are often listening to someone we consider an expert (at least, relative to us) and they have experience beyond ours. But their experience is still limited to the things they have done, and rarely have they compared what they have done to what they haven't done in an objective and open way. Don't get me wrong - in dentistry this may often be the best evidence we can get, even after thorough searches of the medical databases.

Yet when those educating us tell us what the best thing to do is, it is often unclear whether what we are being told is based on their experience, some old dental folklore, a lab study from which they are extrapolating, a single clinical study at high risk of bias, or a systematic review of high-quality and highly relevant clinical trials. Does this matter?

Well, if you, like me, want to do the things that are most likely to benefit the patients you treat, and to minimise the time you waste trying out useless techniques, then surely it makes sense to know how likely it is that the change being advocated will improve your patients' outcomes. You'll want to know what the recommendations are based on.

Learning the evidence level too

But are you told - whether as an undergraduate or someone doing CPD or postgraduate studies - the level of evidence supporting what you are being taught? My personal experience (evidence level: personal experience), from which I make an assumption, is that you aren't. Are you happy about that?

I'm not. I feel that anyone teaching others should be open about whether what they teach is based on a high level of evidence or something less than this. To me that is simply respecting that as learners we need the information necessary to help us decide whether we change our practice or in some other way apply their teaching to practice - or not. 

So what do we do? 

Well, evidence levels have been used in guidelines for years. I would think we could start by creating an adaptation of these that does not make CPD and other teaching clunky but which gives learners a summary of the levels of evidence supporting the key components of the teaching - perhaps a page of evidence levels and references that accompanies the course. And as students (we're often both teachers and students at the same time) we ought to ask our teachers to provide these.

Friday, 21 December 2012

December issue of EBD journal out

The December issue of the Evidence-Based Dentistry Journal is out. 

If you are an undergraduate at QMUL you have full access through your institutional login. 

If you're a BDA member you get access through your automatic subscription to the BDJ. 

If you're working for the NHS in England and are not a member of the BDA then you should get access through your local NHS library using Athens (register here).

Wednesday, 19 December 2012


This blog has little to do with evidence-based dentistry but is simply to draw attention to Prezi. I have used this presentation software in preference to PowerPoint for a couple of years and personally find it much more fun to use. I was chatting about this with a couple of students on clinic this week and thought it may be helpful for others to know about it too.

Students and teachers get to use Prezi for free so long as it's used for educational purposes. See here. Below is an example of a Prezi I used when presenting to the British Society of Gerodontology - it won't make much sense without me speaking but it'll give you a sense of what is possible.

Tuesday, 11 December 2012

NHS OpenAthens access in the UK - yippee...perhaps

A couple of days ago I blogged about how hard it is to get access to journals without an institutional subscription, imagining that most general dental practitioners wouldn't have such access. Well, I learnt something today. If you work in the UK and are treating patients under an NHS contract it seems you may well be able to access at least some journals (though I've yet to find out how many).

It takes a little working out but basically, even if you don't have an NHS or academic email address, if you have some form of professional email address (this can be one from where you work, e.g. a practice email address) and/or a practice website, get in touch with your local deanery administrator. If you also let them know your GDC number then this may help speed things along. If you don't have an academic or NHS email address, it seems that trying to register automatically online will just result in you being rejected.

Good luck.

Sunday, 9 December 2012

Dental Journal Access - ha! ha! ha!

I presented to the British Society of Gerodontology Thursday last week on how to find and access research, with an emphasis on Atraumatic Restorative Treatment (Prezi available here).

I had begun to develop an interest in the problem of access for those outside the institutional access that comes with being part of an educational establishment or NHS Trust, because a couple of former students had contacted me to say this was a problem now that they are out in practice.

So I did a little investigation for the BSG presentation into the access one can obtain to the top 81 journals as ranked by impact factor when you have:

  1. no institutional access 
  2. no institutional access, looking at articles 3-4 years after publication 
  3. an educational institution's access (in my case my university's - QMUL) 
  4. the access you could obtain as a member of the British Dental Association through their library.

I was not surprised to find that only 12% of these journals allowed access to non-subscribers, but was very pleasantly surprised that the BDA has access to 87% of them. I have been in contact with the very helpful staff at the library there and it seems unlikely that the BDA will be able to afford the high cost of online institutional access for its members. They charge £2.50 per article that they copy / scan for you, but my feeling is that if one reads the abstract well and chooses only the studies with the most appropriate design (i.e. controlled trials for intervention studies), one could keep the number of articles down to a minimum.

I think there are two further points that stand out for me here. One is that accessing research needs to be easy and immediate if we are to encourage its use in day-to-day decision-making, and the current difficulty accessing journals hampers this (even with the excellent BDA service). Second, this makes it even more important that the dental profession as a whole begins to contribute to the summarising of research in the form of freely accessible systematic reviews or guidelines. It would be much simpler for all of us if there were up-to-date summaries of research relating to particular clinical problems that we could access any time we wanted.

Because of this I'm beginning to think we could work collaboratively in the form of an evidence-based dentistry wiki to collect, critique and summarise evidence relating to clinical problems. Anybody interested in helping me out here - please get in touch!

Wednesday, 21 November 2012

Knowledge translation / utilisation / transfer - getting research into practice

I am sitting on a train speeding back down from Edinburgh to London after attending a conference called "Improving Quality in Healthcare: Translating Evidence into Practice".

It has become increasingly interesting to me that even where there is good evidence for something, it often isn't used, or it's used inconsistently. There is what is called a 'knowledge gap' between what the research suggests we should be doing to improve patient care and what we are actually doing.

Now, you'll be aware that in EBD, research isn't everything. Just as crucial are the patient's values and aspirations and our own clinical expertise and experience. So there are many points between a researcher (or bunch of researchers) demonstrating convincingly that we ought to be doing something we're not, and that thing being applied to our patients' care.

First we need to find, or be made aware of, the research. Then we consciously or subconsciously accept it, reject it, or decide it isn't appropriate to our environment or our patients. Then (if we accept it) we might decide to use it, and we have to overcome the barriers that stop us putting it into practice. And even then, the way the research says we should do things may not fit with what our patients want. If we do manage to change the way we do things, we then have to work out how to keep on using it and not slip back into the 'old ways'.

It's amazing we ever change what we do faced with all that.

Well, this conference was concerned with this field. And I think it's fascinating, for what is the point in spending millions of dollars on research, putting patients through trials and wrapping up clinicians in conducting the research, if it never gets used?

Now, we don't teach much about this in the undergraduate course, but if you find that you've read something that sounds like good evidence and yet you see clinicians around you doing something else, think about that long chain of events from research to the patient. This field is demonstrating that there are effective ways to change behaviour for the better, but there's much more to be done. Keep an ear to the ground...

Some pertinent references if you're interested in learning more...

Grol R, Wensing M. What drives change? Barriers to and incentives for achieving evidence-based practice. Med J Aust. 2004 Mar 15;180(6 Suppl):S57-60. PubMed PMID: 15012583.

Mickan S, Burls A, Glasziou P. Patterns of 'leakage' in the utilisation of clinical guidelines: a systematic review. Postgrad Med J. 2011 Oct;87(1032):670-9. PubMed PMID: 21715571.

Michie S, van Stralen MM, West R. The behaviour change wheel: A new method for characterising and designing behaviour change interventions. Implement Sci. 2011;6:42. PubMed PMID: 21513547.

A PICO lecture - helping a customer buy the "right" phone

I have uploaded a lecture I recorded on using PICOs to help turn problems into structured questions.

It is aimed at 1st year undergraduates, so instead of a clinical problem I have used one that involves someone buying a phone. I hope it works...

Please take a look at it here.

Feedback always welcome

Sunday, 28 October 2012

Dental Elf questionnaire

The Dental Elf is asking readers to take a survey and enter a prize draw for a £30 Amazon voucher.

I've blogged about the Dental Elf previously and personally have an RSS feed to my Google Reader account so that I can keep an eye on the latest research. The site doesn't critically appraise articles as the Evidence-Based Dentistry journal does, but it does alert me to papers I might want to delve into...

TRIP and its PICO

TRIP now includes PICO search

The TRIP Database has just been redesigned and now incorporates a PICO search box alongside the Google-style one. I've blogged about PICOs before and it's great to see a major search engine innovating with them. There's a screencast here.

One suggestion, though - I would avoid trying to fill all four boxes. It's likely you'll end up with very few papers if you do. When I begin a search I usually start with search terms around just two concepts - say the problem / population and the intervention, or the intervention and the comparison. If I then find there are loads of hits with irrelevant outcomes, say, I may add terms for the outcome of interest to me. But this is rare.

Monday, 17 September 2012

What's wrong with what you learn?

I love this quote and just saw it again on the Evidence 2013 conference site:

"Half of what you are taught as medical students will in 10 years have been shown to be wrong.

And the trouble is none of your teachers know which half"  

Dr Sydney Burwell, Dean Harvard Medical School. In: Evidence Based Medicine, Sackett et al, 2000: 31.

Tuesday, 3 July 2012

Steroid cover for patients on long term steroids

A student asked me recently how I actually use evidence in my clinical practice. I think this is a good question as there's no point in teaching what EBD is but not showing how to apply it.

In several of my blogs I raise queries that arise typically from discussions with students. These often alert me to gaps in my knowledge of the current best evidence - even if I can regurgitate something I was once taught / read that sounds plausible.

Well this afternoon I had a moment of uncertainty regarding what we were taught to call 'steroid cover'. That is, when I was an undergraduate I was taught that patients on long term steroids may require a supplementary dose around the time of a stressful surgical event. I don't actually know what is taught to the undergraduates today but there was certainly a gap in my 'up-to-date' knowledge that I needed to sort out.

The patient I saw this afternoon is not a well lady and is on a fairly high dose of steroids, so I thought I had better just check that I wasn't about to send her into adrenal insufficiency and shock as a result of taking out a tooth.

So I went to the TRIP database and plugged in 'steroids dental treatment', which led me to this:

And clicking through led me to a TRIP answer here.

But further down the TRIP page was this:

Now this looked helpful but was way out of date. If something is 8 years old there's a possibility some new evidence has come to light since. However, if one clicks through to PubMed it often shows similar articles on the right-hand side. So this is what I saw:

That link on the side was for a 2008 systematic review. Still not ideal time-wise, but 4 years younger... However, on the right of this page there was a link to a more recent Cochrane review:

You can click through to the Cochrane review here.

I generally prefer to read Cochrane reviews because they are on the whole better done and a lot stricter about the trials they allow into the review. I therefore feel more confident in their conclusions.

The first review above had included randomised controlled trials (RCTs) and cohort studies whereas the Cochrane review only allowed RCTs. Basically, though, they both concluded there wasn't much evidence to support the use of steroid cover. However, the Cochrane review is cautious because the 2 RCTs it included had only 37 patients between them, so the evidence is not strong. The non-Cochrane review had over 300 patients, but a large proportion of these were simply patients followed up after being treated without steroid cover - none of whom developed an adrenal crisis.

So, to answer the student's question about how I use EBD: in about 5-10 minutes I established that the evidence on the use of supplementary corticosteroids is weak, and so rather than use a poorly-substantiated intervention (extra steroids) I will take the tooth out with her having taken just her normal steroid dose.

Now maybe all you students are saying "yeah - that's what they taught us in the Human, Health and Disease course", but this illustrates that, like me, you will find yourself in the future unsure whether what you were taught as an undergraduate is still relevant. Don't be ashamed to ask the questions and admit to gaps in your up-to-date knowledge.

Friday, 29 June 2012

June 2012 EBD Journal out

The latest issue of the EBD journal is out. 

Elliot Abt from Chicago digests the study that reckoned bite-wings could cause meningioma - and spits it out in disgust. The commentary covers a number of the problems with case-control studies and reporting bias, which is an education.
There's a commentary on a systematic review that asks if asthma is associated with increased caries risk and concludes that the evidence for a link is of low quality.
A commentary on a Cochrane Review of one-to-one dietary interventions in the dental setting concludes that these can work, but the evidence relates mostly to studies looking at changing fruit / vegetable and alcohol consumption rather than sugar intake. The original review was co-authored by one of my colleagues, which is very gratifying!
There are three articles in this issue written by Queen Mary staff: Dr John Buchanan writes about a study using a tongue protector for patients with burning mouth and Dr Simon Critchlow comments on a review of ceramic inlays. And I have written a commentary on a systematic review of ART failure rates. 
Don't forget that you may need to sign in with your institutional access to read the articles. 

Thursday, 21 June 2012

New Centre for Evidence-Based Dentistry site goes live


The Centre for Evidence-Based Dentistry has revamped its website, which has gone live today.

This is a great resource for anyone wanting to develop their EBD skills. I thoroughly recommend the site and congratulate Derek Richards and the web design team for putting it together.

Saturday, 16 June 2012

Erythromycin and statins

Although probably not a first-line antibiotic for most dental infections, erythromycin is in our armamentarium as dentists. I was surprised therefore to read of potential problems for patients taking statins - common drugs used to reduce cholesterol and so the risk of heart attack. Because of the way statins are metabolised, taking erythromycin can raise their levels.

Is there an interaction between erythromycin and statins? - NeLM

Wednesday, 13 June 2012

Searching for patient information

If you read this blog regularly or have scanned the entries you'll see there's quite a bit about searching for evidence and resources to help us do that. What I haven't paid much attention to is helping patients find reliable information relevant to them.

Of course, if they feel up to it, they can search the Cochrane Library and read the 'plain language' conclusions that are helpfully part of each review. But if they want some more basic background knowledge there are resources that may be more reliable than Google.

One is Prodigy, formerly the Clinical Knowledge Summaries (CKS) site, which was part of the UK's National Health Service. Here patients can select the 'Patient Info' tab, which links through to leaflets on a myriad of medical and dental topics.

The other resource is Medline Plus, the US equivalent, which allows patients to link through to leaflets from a variety of sources.

Having patients better understand the aetiology, prevention, prognosis and treatment of their conditions is helpful, I believe, on several fronts. Firstly, it goes some way to improving 'informed' consent, something I am inherently dubious about because of the difficulty of knowing just how informed a patient is, or needs to be, for consent to be truly 'informed'. Secondly, it simply helps improve communication and may mean that patients respond better, for example, to preventive advice or treatment suggestions. Finally, patients informed by more reliable sources may be more likely to accept an evidence-based approach to their care since, hopefully, the resource offers higher-level evidence than mere opinion and hearsay.

If I was in practice now I would probably stick a link to one or both of these on my website. It may improve a practice's image as a patient-focused place as well as directing patients to helpful information.

Wednesday, 6 June 2012

Still experimenting

Please excuse this experimenting of mine. I had some fun recording a lecture in our elearning suite today and just wanted to put it out there, like a kid with a brand new toy...

This lecture will be the beginning of a series of short lectures on EBD, starting with why we might want to use it and moving through to developing the tools. Yes - it means many more hours of me standing in front of a screen, on my own in a room, talking to a virtual audience. It's what students want, we're told!

Feedback, as ever, is much appreciated.

Saturday, 26 May 2012

Number needed to treat - NNT

When someone asks me about something statistical I always panic, if only internally. It's not my strong point and that's why I'm a dentist and not a statistician.

Somehow, on clinic on Friday, a student and I got to looking at the 5-year results of the Hall technique trial taking place in Scotland (link here). I was waxing on, as I do, about how great the results looked - major failures (i.e. irreversible pulpitis, loss of vitality, abscess, or unrestorable tooth) occurred in 3% of teeth treated with the Hall technique and 16.5% of those treated conventionally. The p value was 0.000488 - highly significant. And then it said "NNT 8". That's when the question came - what is NNT?

Let's say there's a drug that prevents heart attacks and in a trial comparing it to placebo, the results report an NNT of 8 over, say, 5 years. What this means is that 8 patients would have to take the drug for 5 years to prevent one heart attack. 

Now, this is a useful indicator of the effectiveness of the drug. Say the NNT was 100. That would mean 100 patients would need to take the drug for 5 years to prevent a single heart attack. You might still think it's worth taking the drug, but if it costs, say, £1000 a year, it starts to look like a pretty expensive game. And what if the side effects are significant? For example, suppose the drug causes severe and debilitating muscle cramp in 20% of people. So one person in 100 doesn't get a heart attack, but 20 people get laid low by leg cramp. Maybe it's still worth it, but the point is that you begin to be able to get a sense of the benefit of an intervention by using the NNT.

What the Hall trial was saying is that, with an NNT of 8, 8 teeth would need to be treated with the Hall technique instead of conventional treatment to prevent one major failure. If that NNT had been 100, the technique wouldn't have looked much better than conventional treatment. If the Hall technique is, say, double the cost of conventional treatment, treating 100 teeth with it instead of the conventional treatment could seem extravagant.

If the NNT had been 2, then for every 2 teeth treated with the Hall technique rather than conventionally, one major failure would be prevented. That would be very impressive.

But cost is only one issue. What are the consequences of a major failure? This is important when deciding what NNT is clinically significant. Say you had a drug that stopped a lethal cancer but the NNT was 100. You might feel that this was good, as you save a life for every 100 people given the drug: the consequence of the intervention is massive. But what about saving a tooth? Is an NNT of 100 still significant? An NNT of 8 probably is, from a humble dentist's (rather than oncologist's) point of view, given that this means fewer children needing extractions and losing space maintenance. I'd like to know, though, what the cost of doing this to every deciduous tooth with caries would be, as this would allow an economic evaluation of the technique too.
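
For anyone wondering where the 8 comes from: the NNT is simply 1 divided by the absolute risk reduction, conventionally rounded up to the next whole number. A small sketch using the failure rates quoted above:

```python
import math

def nnt(risk_control, risk_treatment):
    """Number needed to treat: 1 / absolute risk reduction, rounded up."""
    arr = risk_control - risk_treatment
    return math.ceil(1 / arr)

# Hall trial figures quoted above: major failures in 16.5% of
# conventionally treated teeth vs 3% of Hall technique teeth.
print(nnt(0.165, 0.03))  # → 8  (1 / 0.135 = 7.4, rounded up to 8)
```

The same arithmetic works for the drug example: an absolute risk reduction of 1% (1 / 0.01) gives an NNT of 100.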

There are no right and wrong answers to what the NNT should be, but perhaps you can see how the NNT can help inform us as clinicians, our patients and policy people who have to decide how to allocate scarce NHS resources.

If you'd like to read some more about this there's a very accessible explanation here.

Wednesday, 23 May 2012

Writing up research

Learning how to write an academic article can take a long time and involve a lot of frustration. The BioMed Central group has put together some guidance on submitting research to journals, and glancing through this open resource I felt there was sound advice for anyone writing up a piece of undergraduate or postgraduate research:

BioMed Central | BioMed Central author academy

BioMed is an open access online publishing group.

Tuesday, 22 May 2012

EBD intro prezi

I'm experimenting with a series of simple prezis as part of our new curriculum. The idea is to get stuff up online for students to watch and then to spend face-to-face time discussing it and developing the ideas. Any feedback welcome.

Monday, 21 May 2012

Crowns or not for root-filled teeth?

Some time ago I wrote a paper for the EBD journal looking at whether there was high-level evidence (i.e. systematic reviews or randomised controlled trials) for restoring heavily-filled vital posterior teeth with crowns (1). I was unable to find a single RCT, let alone a systematic review of RCTs. At the time, though, I came across a study that compared crowns versus no crown on root-filled premolars (2). It was a small study with 117 participants, a fairly low failure rate in both groups (root-filling plus composite versus root-filling plus crown), and no statistical difference between the two. My search strategy would have allowed for other trials involving root-filled teeth but there appeared to be none.

And so, since that time, I have been discussing with students that the evidence for placing crowns on root-filled posterior teeth is poor, and that there is therefore a reasonable degree of uncertainty over whether we should place them or not. I raise this because there are known negative consequences of placing crowns: cost, time, removal of sound tooth tissue, possibly increased risk of caries due to poor margins and poor OH, and probably some more. Do the positives of preventing tooth fracture and maintaining coronal seal outweigh these?

By coincidence, this morning I have just extracted a root-filled and crowned lower left 2nd molar with the students because it was grossly carious beneath the crown, causing it to fail (only roots retained). And that was in a well-motivated patient with good OH and low sugar intake. 

The Cochrane Library - Independent high-quality evidence for health care decision making

A systematic review has just been published (3) that, funnily enough, identified just one RCT comparing crowns to no crowns on root-filled teeth - the one I described above. What was the conclusion? 

"There is insufficient evidence to support or refute the effectiveness of conventional fillings over crowns for the restoration of root filled teeth. Until more evidence becomes available clinicians should continue to base decisions on how to restore root filled teeth on their own clinical experience, whilst taking into consideration the individual circumstances and preferences of their patients."

I think I might have worded this differently and suggested that, equally, there is insufficient evidence to support or refute the effectiveness of crowns over conventional fillings, but the point is the same - we are left with personal experience and patient values to guide us (two of the three components of evidence-based decision-making) but bereft of good research to inform them.

Given the number of crowns placed in practice and the cost of these to individuals and society, plus the cost of root-canal fillings in the first place, it seems ludicrous that those who pay for these services (the NHS, private insurance groups, patients) do not demand an RCT or two to be done. If anyone's got an idea of where to get the funding and if there's anyone in practice who wants to participate, I'm ready to run one!

Happy decision-making ;-)


Wednesday, 14 March 2012

Vodcast for students on reading bitewing radiographs

On Monday I went over some bitewings with one of the students and it seemed it could be useful to record this and put it out there for other students to watch. So here it is:

This is the first vodcast I've done using QM's personal lecture capture software. Does it work for you? Would more like this be useful? Was it too long / slow / monotonous?

Please let me know in person, by email or by leaving a comment.

That picture clearly has nothing to do with radiographs. I saw it in a basin on clinic this afternoon and thought it was kind of pretty - it's there to spruce up the blog.

Sunday, 11 March 2012

Confidence intervals

I came across a very nicely written description of what confidence intervals and p-values say to us about studies, and how they differ.

It's here.

This is one of a series of brief summaries on several aspects of clinical and economic evidence available here.
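To make the distinction concrete, here's a minimal Python sketch of how a 95% confidence interval for a risk difference (like the ones discussed elsewhere on this blog) is calculated with the usual normal approximation. The numbers are invented purely for illustration:

```python
import math

# Invented, illustrative numbers: 40/100 events in one group, 25/100 in the other.
p1, n1 = 40 / 100, 100
p2, n2 = 25 / 100, 100

rd = p1 - p2  # risk difference between the two groups
se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)  # standard error of the difference
ci = (rd - 1.96 * se, rd + 1.96 * se)  # 95% confidence interval (normal approximation)

print(f"Risk difference: {rd:.2f}, 95% CI: {ci[0]:.2f} to {ci[1]:.2f}")
```

Because the whole interval lies above zero here, the result would also be 'statistically significant' at the 5% level - but the interval tells you far more than the p-value alone: it shows the plausible range of effect sizes.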

Friday, 24 February 2012

Diagnosing pulpal pathology


I spend a lot of time discussing the diagnosis of pain originating from the pulp with students. We have mostly been brought up on making diagnoses of irreversible or reversible pulpitis based on signs and symptoms - spontaneous pain, duration of pain, reaction to thermal stimuli, percussion, electric pulp testers, ethyl chloride, blah blah blah. And what I realised a while ago was that some of those teeth that I had determined - based on these criteria - to be irreversibly damaged or non-vital actually turned out to respond well to simple temporary dressings without going anywhere near the pulp, i.e. the inflammation in the pulp was actually reversible.

So I was pleasantly surprised to come across a systematic review that looks at the sensitivity and specificity of signs, symptoms and diagnostic tests commonly used to make decisions about whether a pulp is alive and well or ailing.

Diagnosis of the condition of the dental pulp: a systematic review

Of 18 studies, none were high quality according to pre-specified criteria, 2 were moderate and the rest were low quality.

Regarding signs and symptoms as indicators of the inflammatory status of the pulp, the authors concluded: "there is insufficient evidence to determine whether the presence, nature and duration of toothache offer accurate information about the extent to which dental pulp is inflamed. The evidence base is also insufficient to assess the accuracy of other commonly used clinical markers of pulp inflammation"

and with regard to sensibility and vitality testing they concluded: "there is insufficient evidence to determine the diagnostic accuracy of tests used to assess pulp vitality".

I think this is borne out by my experience and suggests that, after plenty of discussion with the patient, we should take a conservative view in deciding whether a tooth needs to be opened up or extracted. It may be that, in the absence of a clear diagnosis, we are best off temporising and warning the patient that the pain may get worse.

One caveat, however, is that this review did not take into account the use of radiographs. We often don't see apical changes with irreversibly pulpitic teeth, but a tooth that does not respond to sensibility testing and that has an apical radiolucency plus a potential cause for pulpal necrosis (deep filling, caries, fracture) may be diagnosed more confidently as non-vital.
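For anyone hazy on the measures this review assesses, sensitivity and specificity come straight from a 2x2 table of test result against true pulp status. A minimal sketch with invented counts (no resemblance to any real study):

```python
# Hypothetical 2x2 table for a pulp test against a reference standard;
# all counts below are invented for illustration only.
true_pos, false_neg = 45, 5    # truly diseased pulps: test positive / test negative
false_pos, true_neg = 10, 40   # truly healthy pulps: test positive / test negative

sensitivity = true_pos / (true_pos + false_neg)  # proportion of diseased pulps the test catches
specificity = true_neg / (true_neg + false_pos)  # proportion of healthy pulps the test clears

print(f"Sensitivity: {sensitivity:.2f}, Specificity: {specificity:.2f}")
```

The review's point is that, for our usual pulp tests, we don't have trustworthy estimates of either number - which is exactly why the conclusions above are so cautious.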

Wednesday, 25 January 2012

Zinc-oxide-eugenol and composite

For a little while I have been keeping a secret. I had read that eugenol was a pretty effective antibacterial agent and it had been suggested by a specialist endodontist that we place it in the coronal portion of the root canal as an antibacterial plug.

So I place IRM as a plug. What do I then do when I want to place a composite? Well, I place a composite.

That's the secret and I'm ashamed. But why? Because when I was an undergraduate student I seem to remember being told that zinc oxide eugenol stopped composite from.....what? Setting? Bonding? I couldn't remember, but I figured all that eugenol was far enough away from where I was bonding, so what did it matter?

But there was always a niggling doubt.

Anyway, it all came out when a student on clinic needed to place a restoration in a tooth that had had IRM placed as a temporary, and which plugged the root canal. I suggested she leave the IRM plug and suddenly I heard this voice saying "that's wrong - very wrong" and I hastily said to the student that it was likely the materials lecturers would lynch me if she ever told them.

But then I thought, what is the science that underpins this doubt I have about using zinc-oxide eugenol? And what about leaving it as a sedative base in those deeply-cavitated pulpitic teeth?

I did a search in pubmed:
(composite OR bonding*agent*) AND (eugenol OR IRM OR kalzinol OR ZnO) AND (strength OR hard*)
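As an aside, the same search can be reproduced programmatically via NCBI's E-utilities esearch endpoint. A minimal sketch that just builds the query URL (paste it into a browser to see the hit count; counts today will of course differ from my original search):

```python
from urllib.parse import urlencode

# NCBI E-utilities esearch endpoint; db and term are standard parameters.
base = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
term = "(composite OR bonding*agent*) AND (eugenol OR IRM OR kalzinol OR ZnO) AND (strength OR hard*)"

# retmax=0 asks for the hit count only, with no record IDs.
url = base + "?" + urlencode({"db": "pubmed", "term": term, "retmax": 0})

print(url)
```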

There were 116 hits. There were conflicting results on whether zinc oxide eugenol (rather than just eugenol) significantly affected the bond strength of resins to enamel and dentine. It certainly wasn't as clear-cut as I had thought.

I was only able to find one paper that looked at how the composite itself (rather than the bonding agent) was affected. In this paper, whilst composite hardness was affected by the presence of IRM, it was only to a depth of 100 microns - 0.1 mm (He LH, Purton DG, Swain MV. A suitable base material for composite resin restorations: zinc oxide eugenol. J Dent 2010;38:290-295). The authors suggested this wouldn't have an effect on bonding elsewhere.

This is in no way a good systematic appraisal of the literature, and I hope to get round to doing one soon, but for the moment I'm feeling like I might have a defence should anyone from the materials team come looking for me... ;-)

Thursday, 19 January 2012

DEBTs in the EBD journal

You may be aware of papers in the EBD journal called DEBTs - or Dental Evidence-based Topics.

The idea is that you as a clinician or student will have a clinical question. You might wonder if you should be putting adhesive under fissure sealants or not, how best to treat caries in deciduous teeth or how long a posterior resin-bonded bridge is likely to last.

In order to arrive at a (hopefully) more informed position you then search for research evidence that could help, using the PICO structure to guide you, as discussed elsewhere in this blog.

It's important to think about what research would best inform your decision - it may not always be randomised controlled trials.

Having found what evidence there is, you then need to critically appraise it and summarise whether the evidence you have found is internally and externally valid, and how it informs you in trying to answer your clinical question.

If they are well conducted, the journal will publish these DEBTs because they bring the latest clinical evidence to the notice of other clinicians, who may well be wondering about the same clinical problem.

Any students who would like to have a go and get a publication to their name, feel free to drop me a note.