Saturday, April 7, 2007

The Obama Video and My CME Blog

Recently, someone edited Apple Computer's 1984 Super Bowl ad to promote the candidacy of Barack Obama. The resulting video was posted on YouTube and viewed by millions, demonstrating the power of individuals who skillfully use technology to broadcast their message.

Communications technologies also facilitate the power of individuals to share information about CME. This blog is one example.

In some sense this is not new. CME professionals have long shared information to help one another with various common issues. However, technology now allows individuals to reach a wider audience than before. Imagine how much less influential the Obama video would have been if it had been distributed only on film or videotape.

As technology amplifies the distribution of information, it can also amplify the distribution of errors. The controversy about errors in John Seigenthaler's Wikipedia entry is an example. And I expect that we'll hear much more about the risks of empowering individuals to publish on the Internet, for example in the upcoming book The Cult of the Amateur: How Today's Internet Is Killing Our Culture. On the other hand, the publishing and sharing of information in the digital commons promotes collaboration and progress that wasn't previously possible. Some examples that I've encountered recently include the pilot project to improve the patent application process by posting applications for public comment and Kiva, an organization that uses the web to promote loans to businesses in developing countries.

Will posting personal opinions and observations about CME on the Internet promote confusion among newer members of the CME profession? Will the Internet degrade the quality of generally available medical information? I'd expect the opposite.

The more opinions available about a topic, the more those seeking information will be able to distinguish between the general consensus and divergent outlier views. Furthermore, in comparison with other media, the Internet allows information consumers to interact with information producers in ways not previously possible. Unlike printed materials, readers can quickly and easily ask for clarification, and authors can respond and, if necessary, make corrections.

Even without the Internet, CME professionals might sometimes receive incorrect opinions from their colleagues. And, the technologies that empower publishing by individuals also enhance the ability of accrediting organizations and professional societies (e.g., ACME, SACME) to disseminate official information and the consensus opinions of expert practitioners. Therefore, CME managers already need to be able to distinguish between the opinions of particular colleagues, the consensus of the professional community, and official standards. Applying this ability to web-based information shouldn't be too far a stretch, especially for professionals who deal with levels of evidence in their educational content.

There has been criticism about blogging in general, and blogging about CME in particular. While I don't agree with these views, I don't dismiss them without consideration. Writing My CME Blog allows me to collect and organize my thoughts, as well as share them with others. Therefore, I consider it both a communications medium and a personal knowledge management system. Still, the criticism I've encountered has prompted me to be more careful about clearly stating what is my opinion and not official guidance from ACCME or any other organization. I suppose some readers might miss "rants, raves, and ruminations" in the subtitle at the top of this page.

CME professionals and the doctors we serve are all members of value networks. By value networks, I mean that we exchange valuable information that contributes to the attainment of each other's goals and professional development. In my opinion, technologies that facilitate our information exchanges will ultimately strengthen these networks and improve their results. And, there is little reason to doubt that technology will continue to amplify our abilities to communicate, whether by blogging, posting videos to YouTube, or by other means. So perhaps a more useful question than whether or not there should be blogs about CME, is how can we harness these technologies to achieve our goals?

Tuesday, March 20, 2007

How Doctors Think: Implications for CME

When doctors make mistakes in practice the consequences can be very bad. When doctors make mistakes in CME the consequences can be very good. Making mistakes prepares doctors for learning by demonstrating:
a. That there is something that they should know, and
b. Their performance will improve if they learn it.

I was reminded of the educational value of mistakes by a short television interview with Jerome Groopman, author of How Doctors Think. His main point was that doctors' incorrect decisions are often caused by using "gut" reactions instead of scientific analysis based on evidence. I look forward to reading his book.


The implication of How Doctors Think for CME seems to be that doctors' knowledge is sometimes automated and applied too generally. Therefore, CME activities should highlight situations in which intuition, gut reactions, and stereotypical thinking result in impaired performance. We can then reduce medical errors by providing models of more flexible and scientific problem solving.

A similar approach was employed by the late great scientist and educator Robert Karplus. Karplus' Learning Cycle began with a hook to engage students. Typically, he'd demonstrate that what students thought they knew was inconsistent with reality. This challenge to their tacit assumptions and previous knowledge was used to motivate students and heighten their receptivity to learning.

CME is the right place to identify and correct mistakes. Let's not miss the opportunity to use mistakes constructively. As Mario Zanotti of Computer Curriculum Corporation used to say, "A mistake is an opportunity to learn."

Sunday, March 18, 2007

CME Disambiguation Glossary: Balanced

Update: This is a revised version of a previous post, edited for clarity.

Put away your books and take out a sharpened No. 2 pencil. Today's CME Glossary entry is going to be a short quiz. Which of the following best matches ACCME's definition of "balanced"?

1. Balanced: A fair representation of both the benefits and risks of a treatment, therapy, or medical device.

2. Balanced: A fair representation of the range of treatments, therapies, or medical devices for a particular medical condition.

Time's up! Put your pencils away. ACCME discusses "balance" in Section 5.2 of their Standards for Commercial Support. Here's what they have to say:

STANDARD 5. Content and Format without Commercial Bias
5.2 "Presentations must give a balanced view of therapeutic options. Use of generic names will contribute to this impartiality. If the CME educational material or content includes trade names, where available trade names from several companies should be used, not just trade names from a single company."

So the ACCME definition seems to emphasize fairly representing the range of alternative therapies (Definition 2). However, I wouldn't assume that a balanced presentation of the risks and benefits of particular therapies is unimportant. In the "Ask ACCME" section of their website, ACCME provides additional details that extend the meaning of "balance":

"A ‘balanced view’ means that recommendations or emphasis must fairly represent, and be based on, a reasonable and valid interpretation of the information available on the subject (e.g., “On balance the data support the following …”). A ‘balanced view of therapeutic options’ also means that no single product or service is over represented in the education activity when other equal but competing products or services are available for inclusion."

Since ACCME's concept of "balance" extends to "a reasonable and valid interpretation of the information available on the subject," it would seem to encompass providing coverage of both the risks and benefits of particular therapies (i.e., Definition 1).

ACCME's "reasonable and valid" standard also raises the issue of individual differences. Since different faculty can have different opinions about what is "reasonable and valid," presentations advocating strong opinions either for or against particular treatments or therapies can be considered balanced, as long as they are based on interpretations of relevant data that can be considered "reasonable and valid." Variation among different interpretations is a good reason to collect learners' opinions about balance when we evaluate our CME activities.

Can a CME presentation focus on a particular treatment or therapy? For example, a radiologist might want to do an educational presentation about the clinical use of a particular medical device. While such a presentation might be "balanced" with respect to the risks/benefits of that device (Definition 1), it might not provide equivalent detailed information about all alternative therapeutic options (Definition 2). So can it be balanced?

I think so. Notice that there is a subtle, but significant, difference between ACCME's wording in Section 5.2 and Ask ACCME. Section 5.2 states that "presentations" must be balanced, while Ask ACCME precludes the overrepresentation of particular products or services in "educational activities." My interpretation is that Ask ACCME gives CME program managers the flexibility to allow presentations that focus on particular treatments or therapies as long as the overall educational activity provides a balanced view of alternatives.

Case studies often highlight both a particular therapy and a particular outcome. So attaining balance for case studies can be a challenge with respect to both outcomes (Definition 1) and the range of therapeutic options (Definition 2). When using case studies, we should probably make sure learners are informed about how often they should expect the same result, as well as about therapeutic alternatives.

Tuesday, March 13, 2007

Great News! Make Money and Improve Learning Too!

Many CME programs are required to be financially self-sufficient. So money is a big issue for CME managers. In a previous post I discussed the cost of the instructional design process, and how cutting corners can affect educational results. However, the financing of CME is not always opposed to its educational goals. Here are some examples of potential synergies between finance and education.

1. Reinforcement and Retention
Sending your learners reminders and updates is good business and good education. Learners are less likely to forget what they learned, and they are more likely to remember your CME program when they want to learn something else. In sum, reminders and updates can help your learners retain knowledge as they help you retain learners.

2. Motivation to Learn, and Return
I don't think we need to do a study to prove that physicians learn more when they are awake. They are also more likely to return to hear a speaker that stimulated their interest than to one that put them to sleep. So educational activities that are motivating, fun, or engaging are both more effective educational experiences and more attractive "products" when competing for registration dollars.

3. Assessment and ROI
Assessment data can be helpful in improving the effectiveness of your educational activities. They can also be helpful in demonstrating to commercial supporters that your activities are worth their investment. Perhaps some day, assessment data from CME activities will be linked to clinical care quality improvements that result in lowering malpractice insurance premiums. It would be nice for CME managers to be able to take some credit for that financial outcome!

4. Just-in-Time and Online CME
Learners are receptive to information that is useful and convenient. And, it seems reasonable to expect that better solutions for delivering information where it is needed and when it is wanted will be available in the future. Can CME piggy-back on these systems to make money without interfering with physician workflow? It might take some trial and error, but someone will probably make point-of-care (POC) and online CME financially viable without ongoing infusions of commercial support.

5. Learning Communities
I'm told that an expert can be defined as someone who participates in a community of experts. And, providing the opportunity to participate in a community of experts is an important function of CME programs. Physicians value informal peer interactions both professionally and educationally. So the promotion of live and virtual learning communities can be good for business and good for learning.

The financial benefits of producing high quality CME activities don't always flow back to those who bore their development expenses. Still, it is useful to be aware of these benefits when gathering support for your program.

Thursday, March 8, 2007

CME Disambiguation Glossary: Promotion

In CME, we use the word "promotion" in various ways. Some of our uses of "promotion" are different from those of laypersons. Here are some definitions organized into categories.

A. The Promotion of CME Activities

1. Promotion: Brochure, flyer or website describing speakers and program agenda to promote a CME activity. This promotion must include an accreditation statement, and is often also used to inform learners of learning objectives and commercial supporters.

2. Promotion: Postcard or brief "save-the-date" announcement to alert potential participants in an upcoming CME activity. This does not need to include an accreditation statement.

3. Promotion: The marketing of CME activities developed by medical education communications companies (MECCs) to CME providers. MECCs generally pay providers a fee for sponsorship and/or hosting, and development is funded by commercial supporters.

B. The Promotion of "Commercial Interests" as Defined by ACCME

4. Promotion: The influence of commercial supporters on the content of a CME activity. Strictly verboten.

5. Promotion: The influence of someone's personal financial relationships with commercial interests on the content of a CME activity. Also verboten.

6. Promotion: Acknowledgement of commercial supporters of a CME activity. This is strictly required. Acknowledgement of support might be considered a type of "promotion" in common usage. However, in CME nomenclature it is considered a "disclosure."

7. Promotion: Exhibits by vendors at CME activities or advertisements by commercial interests in ancillary materials related to CME activities. This is allowed, though promotional content needs to be separate from CME content.

8. Promotion: Reception, party, dinner, recreational activity sponsored by a commercial interest to promote their products and services. It's a free country, but these activities shouldn't coincide with or take precedence over a nearby CME activity.

C. The Promotion of Other Business Interests

9. Promotion: Any content that promotes a business interest (including commercial supporters, vendors, and even non-health care-related organizations) that is juxtaposed with any form of CME content (online, printed, live). This includes product brochures in the conference room and "Powered by Company X" on the screen of an online course. Again verboten.

10. Promotion: A CME activity that promotes a health care provider's services or generates referrals. This can happen, but only if it's an indirect consequence of activity planning that is based on documented professional performance gaps. Think of referrals as you would think of money. It is acceptable to make money from a CME activity. However, financial gain cannot be used as the underlying rationale for why the activity was implemented.

11. Promotion: The selling of CME-related products and services (e.g., meeting planning, media production, mail lists, psychometric evaluation) to CME providers. For example, see: "Promotional Opportunities" on the ACME website. At conferences for CME professionals, this type of promotion is separated from educational content, in keeping with the prohibition described in Definition 9 above.

The Assessment of Competence

With the advent of ACCME's Updated Criteria, many CME managers will be assessing the competence of their learners. So it is important to understand what is meant by "competence." Competence includes both:
- an ability, knowledge, or skill, and
- the intention, plan, or disposition to use that ability, knowledge, or skill

Therefore, to assess competence, it might seem logical to take a two-step approach:
A) Did you obtain this ability, knowledge, or skill through this CME activity?
B) Do you intend to use this new ability, knowledge, or skill in your practice?
For learners who answer "No" to Question B, we can include another assessment item to determine if there are barriers to implementation, thus addressing Criterion 18.

It might be counter-intuitive, but I think it is more efficient to start with Question B. If learners answer affirmatively, we can assume that they have obtained the required ability, skill, or knowledge. If they respond negatively, we can inquire whether the reason was our failure to adequately teach the topic, the topic's irrelevance to their practice, or an obstacle to implementation.

A related issue is the conversion of competence-related learning objectives to assessment items. Here is a system based on three types of learning objectives.

1. Behavioral Objectives
Some learning objectives involve observable behaviors. These objectives have the form “students will be able to” operate, diagnose, use new techniques, etc. To convert behavioral learning objectives to evaluation items, replace “students will be able to” with “I plan to.” The format of the evaluation item then becomes: I plan to operate, diagnose, use new techniques, etc., with the learner providing a Yes or No response.

2. Cognitive Abilities
Learning objectives related to cognitive abilities follow the same format as behavioral learning objectives, “Students will be able to…” However, instead of overt behaviors, these objectives describe traits that might not be observable. For example, cognitive abilities might involve being able to summarize, compare, discuss, solve, evaluate, etc.

To convert a cognitive ability learning objective to an evaluation item, replace “students will be able to” with “I plan to use the ability to.” Therefore, evaluation items will have the format, “I plan to use the ability to summarize, discuss, solve, or evaluate…” Again, learners respond to this statement by agreeing or disagreeing.

3. Knowledge Transfer
Finally, some learning objectives involve knowledge transfer that is not associated with a particular ability. An example of this type of objective is, “Students will know that…” We can convert these objectives to evaluation items in the format “I plan to use knowledge about… in my practice.”
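For readers who manage large banks of objectives, the three conversions above amount to a mechanical prefix substitution. Here is a minimal sketch of that idea; the function name, the objective wordings, and the three `kind` labels are my own illustrations, not part of any official ACCME tool or terminology.

```python
def to_assessment_item(objective: str, kind: str) -> str:
    """Rewrite a learning objective as a Yes/No self-assessment item.

    kind: "behavioral", "cognitive", or "knowledge", matching the
    three objective types described above. Unrecognized objectives
    are returned unchanged so a human can handle them.
    """
    ability = "Students will be able to"
    knowing = "Students will know that"
    if kind == "behavioral" and objective.startswith(ability):
        # Behavioral: "students will be able to" -> "I plan to"
        return "I plan to" + objective[len(ability):]
    if kind == "cognitive" and objective.startswith(ability):
        # Cognitive: "students will be able to" -> "I plan to use the ability to"
        return "I plan to use the ability to" + objective[len(ability):]
    if kind == "knowledge" and objective.startswith(knowing):
        # Knowledge transfer: "students will know that" ->
        # "I plan to use knowledge about ... in my practice"
        return ("I plan to use knowledge about"
                + objective[len(knowing):] + " in my practice")
    return objective


print(to_assessment_item(
    "Students will be able to diagnose atrial fibrillation",
    kind="behavioral"))
# I plan to diagnose atrial fibrillation
```

Learners would then respond Yes or No (or agree/disagree) to each generated statement, which feeds directly into the Question B approach discussed earlier.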

Since knowledge is only one component of competence, your objectives for a CME activity should not be restricted to knowledge transfer.

Saturday, March 3, 2007

CME meets ADDIE, Part 2

In Part 1, I described how ACCME's updated criteria seem to be aligned with ADDIE, an instructional design model with origins in military training. While there are innumerable differences between medicine and the military, they do have one thing in common. People can be killed or injured when training isn't done properly. So there is ample justification for employing a reliable and well-tested instructional methodology.

ACCME's updated criteria establish higher standards for educational needs, objectives, and evaluation that are consistent with the ADDIE model. Communicating and implementing these higher standards will expand the role of CME managers as instructional design consultants. This will be challenging. Most instructional designers would probably agree that working with subject matter experts (SMEs) is the hardest part of their jobs. SMEs are busy, and don't appreciate intrusions into the way they do things. (Sound familiar?) Furthermore, many CME managers lack instructional design experience that would enhance the depth and credibility of their advice and feedback.

Yet, many attributes of CME managers will help them in their enhanced role as instructional design consultants. While the updated criteria require higher standards for instruction and evaluation (i.e., competence, performance, patient outcomes), they are still a refinement of the educational planning and evaluation elements of the previous ACCME essentials. CME managers are also experienced in providing consulting services for our clients. And, while it would be nice to have a background designing instructional materials, it is not a requirement to be effective in this role. After all, you don't need to be a lawyer to establish and implement a disclosure policy.

The role of CME programs is also evolving. The ADDIE method presupposes that a significant part of the responsibility for learning rests with those who designed the instruction. This brings CME a step closer to the educational programs of medical schools and residency programs, which take very seriously their responsibility for their students' learning. It is also a logical evolution for CME, given that the amount of learning required for doctors to keep current is continuously increasing.

The ADDIE method has been criticized for resulting in instructional materials that are dull and uncreative. I think this criticism misses the point. The ADDIE method is about systematically defining what needs to be learned, how content should be sequenced, and whether it was learned. Using this method will not ensure that your instruction is attractive, creative or engaging, but neither will it prevent you from doing so if you have the skills.

The ADDIE method is not, however, without limitations. It was designed for developing instruction for which you can analyze the audience, needs, content, and evaluation criteria in advance. Therefore, this method can only be applied in a general way to educational activities in which you don't know who is going to learn what and when, such as self-directed independent study and informal learning from peer interactions.

Perhaps the most common criticism of the ADDIE method is cost. When I managed e-learning development projects, some clients would say that they wanted to thoroughly analyze needs, develop objectives, and evaluate the results, but only if it didn't cost more. I had to respond that this couldn't be done. In some cases we could cut cost by skipping the evaluation phase or conducting a needs analysis based on relatively few sources. However, cutting corners within the ADDIE process shifts responsibility for learning from the instructional provider to learners.

Due to the cost of employing the ADDIE method, it is not surprising that businesses and professional organizations are not using it to develop much of their instructional material. Multimedia technologies for capturing content and Internet technologies for publishing often make it much cheaper to send presentations from subject matter experts directly to learners. For businesses with short product cycles, the incentive to skip the instructional design process is even starker. It doesn't make sense to spend four months developing instructional materials to support a product that will be obsolete in four months.

Overall, the ADDIE method is a system that uses every opportunity to improve instructional efficacy as educational activities are designed, developed, delivered, and evaluated. It will take time, and thus cost money. However, if we don't take responsibility for learning, CME won't be considered a co-equal partner with other medical education enterprises.