This year was a great year for us at NAB: we revealed two new products in beta, which we're very excited about. First, for those of you doing unscripted work with lots of dialogue to go through, you are going to need this: Scribeomatic, our cloud-based speech-to-text service. It puts your dialogue back into FCP X as markers, or lets you build a rough cut right in our application by clicking on sentences (yes, you heard that right). Here's the beta reveal presentation:
Recorded at the LumaForge Faster Together Stage
So, to answer some of your likely questions:
- Pricing? To be finalised based on feedback during the beta process. It will be a subscription service with various options: packs of minutes, pay-as-you-go, or monthly plans.
- Turnaround time? Depending on the speed of your connection, it's usually close to real time: a minute of dialogue takes approximately a minute to transcribe.
- Which cloud service are you using (Watson, etc.)? We speak to all of them. Depending on your dialogue's accent, language and so on, we will send your audio to the service that gives the best results in our testing. We manage this for you, so you don't need to set up accounts with all the different providers or deal with unpredictable pricing.
- Does it deal with different accents? Yes. We have tested with US, UK and Australian accents; very strong accents may, of course, have lower accuracy.
- Will you support other languages? Eventually, yes, but the first release will be English only. We believe we can add support for Spanish, Italian, French, German and Japanese fairly soon after the initial release.
- Will you be supporting applications other than FCP X? Maybe, but at first we are focusing on having the best possible integration with FCP X, fully supporting all of the ways we can pass the dialogue back to it using metadata. Other NLEs do not have the deep metadata support that FCP X does.
- How accurate is it? With clean dialogue recorded on its own channel, you can expect about 90-95% accuracy, and you can easily correct mistakes in our application before sending them back to FCP X.
- Can it be used for closed captions or subtitles? Yes, but that's not the main task we are focusing on; we believe text-based search is the best use for this technology at the moment. For captioning and subtitling, a human pass will be necessary to ensure accuracy.
- OK, I'm convinced. When can I have it? We are now entering private beta testing and estimate around two months before the first public release. If you would like to take part in the beta, please contact us; note that you should only apply if you are willing to take the time to give feedback. The beta application may crash or give poor results, but we will not be charging for usage during the beta. Please don't sign up for the beta expecting to use it on a time-critical project.
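To give a rough idea of how dialogue can travel back to FCP X as metadata, here is a simplified, illustrative FCPXML fragment showing transcribed sentences attached to a clip as markers. This is a sketch, not Scribeomatic's actual output: the clip name, timings and sentence text are made up, though the `asset-clip` and `marker` elements are standard FCPXML.

```xml
<!-- Illustrative only: a clip with transcribed sentences attached as markers.
     Times are rational seconds, e.g. 240/24s = 10 seconds at 24 fps. -->
<asset-clip name="Interview_A01" offset="0s" duration="1440/24s">
    <marker start="24/24s"  duration="1/24s" value="So tell me how the project started."/>
    <marker start="240/24s" duration="1/24s" value="Well, it all began back in 2015."/>
</asset-clip>
```

Because each marker carries its sentence as searchable text, the timeline index in FCP X can then be used to find any line of dialogue by keyword.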
Here are some additional screenshots, but really the video is what you want. Please note that the user interface will change before release.
The above video was recorded at the LumaForge Faster Together Stage; make sure you catch the videos of the rest of the sessions, which will be online soon.