This post isn’t a litany of the things that I don’t like about RoboDemo, although there will necessarily be some of that. Rather, it’s my own speculation as to why a smart company with a reputation for good software produced a lemon of a release. Make no mistake about it, though: RoboDemo 5 sucks. It mostly sucks in fixable ways, and it’s possible that some of the problems have already been fixed in Captivate, its newly released successor. (I won’t know for certain until I find time to kick the tires.) But the question remains: why did Macromedia release a product that is badly broken?
Before continuing, let me put in a caveat. I am far from a RoboDemo expert. My observations come from working with it on one project with teammates who, like me, had lots of simulation design and production experience using other products but had no RoboDemo-specific experience. I wouldn’t be surprised if there were some things that we missed. Nevertheless, most of the problems that I intend to write about here should have been much harder to hit and easier to solve than they were. Even if we missed some functionality in the product, that does not absolve Macromedia of failing to make that functionality the default behavior or, at least, more obvious.
The first and least interesting possible reason why Macromedia released an inferior product is that they rushed a release out with known bugs in response to external pressures. Given that this is the first release of the product since they bought the product line, I wouldn’t be surprised if they felt they needed to push a first release out the door quickly. And there is evidence of this. For example, when you put an interactive element on a screen (e.g., a click box) and set the option to “wait for user response” (or something like that; I don’t have a copy of the software with me at the moment), RoboDemo ignores this setting and moves on to the next frame regardless of whether the user has clicked. To make the proper behavior happen, you have to change the setting for the frame from the default “continue” and hard-wire it to go to the next numbered frame. That adds two clicks of authoring for every single interaction screen, not to mention the extra editing clicks if you happen to add, delete, or reorder screens. Plus, it creates many more points (one per interaction screen, to be exact) where you need to QA for possible authoring errors. In a project of any substantial size, this ends up being a significant time drag. (Also, RoboDemo apparently doesn’t handle right-clicks for interactivity options. How could that be?)
At first, we thought that we had to be missing some setting; how could Macromedia release a product with such an obvious glitch? Incredibly, their customer support operator told us that, yes, the only way to get RoboDemo to actually wait for the user’s interaction is to hard-wire each frame as we were doing. They refused to acknowledge that this was a bug, despite the fact that they also said the behavior would change in the next release. I saw a handful of glaring and careless bugs like this one, and we weren’t even pushing the product that hard.
The second possible cause of RoboDemo’s suckiness is that e-learning functionality appears to be tacked on as an afterthought. It is, after all, called RoboDemo. While there are features that allow you to build in interactivity, add scoring, etc., these all have to be added manually. It should be possible to set the product in either demo or e-learning simulation mode. When set for the latter, the product should add click boxes and the like by default. There is some indication in the marketing copy for Captivate (a.k.a. RoboDemo 6) that the new version has something like this. If it exists in version 5.0, though, we certainly couldn’t find it.
The third possible cause is in some ways the most interesting. It’s possible that some of the problems with RoboDemo are because it is built on top of Flash and relies heavily on the Flash authoring paradigm. Remember, as an animation product, Flash was fundamentally designed to make it easy to move objects around on a screen following a timeline. Some of the features that make it suitable for e-learning development were added as afterthoughts and were not necessarily added in a way that makes sense from an e-learning-centric perspective.
Take sound, for example. In Flash, sound is built on a timeline. According to one of the Flash developers on my team, there’s no real easy way to tell animations in a Flash movie to pause and wait for a sound file to finish playing. You basically have to fiddle with the timeline until they line up.
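To be fair, ActionScript 2 does expose a per-sound completion callback (Sound.onSoundComplete), but using it means manually stopping and restarting the timeline for every single clip. The pattern looks roughly like this (a sketch, not production code; “narration_clip” is a hypothetical library linkage ID):

```actionscript
stop(); // halt this timeline until the narration finishes

var narration:Sound = new Sound(this);
narration.attachSound("narration_clip"); // hypothetical linkage ID in the library
narration.onSoundComplete = function():Void {
    _root.play(); // resume the timeline once the audio ends
};
narration.start();
```

Which rather illustrates the point: nothing ties this to each screen automatically. A hand-coding Flash developer (or a tool built on the Flash model, like RoboDemo) has to repeat the pattern clip by clip.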
Maybe this is why there is no way in RoboDemo to automatically synch up audio narration so that the learner isn’t moved along to the next screen before the narration is finished. You basically have to go to the settings for one of the visual elements on the page (like a call-out bubble) and set it to display for the same number of seconds as the length of your sound file.
For each and every page.
In order to fix this behavior, two things would have had to happen. First, the RoboDemo developers would have had to break their habitual mindset as Flash developers and see that the Flash audio model is not the right one for simulation development. My impression of RoboDemo is that it was designed by Flash developers to create a bunch of short-cuts for some of the things they would do when developing a simulation by hand in the Flash authoring environment. This isn’t a bad start, but they need to stop thinking like Flash developers and start thinking like e-Learning designers to gain more significant authoring time savings for their customers. Second, once they realize that the Flash model isn’t right, they would have to develop a work-around, essentially fighting against the way that Flash naturally wants to do things. This just isn’t easy even for the best development teams.
So those are the three reasons why I think RoboDemo 5 ended up being a deeply flawed release. Again, I’m not a RoboDemo expert; these are my impressions after a little more than a week of getting up-close and personal with it. Still, I’m pretty confident that the margin of error on my suckiness assessment is relatively modest.
We’ll see if Macromedia does any better with the new version.