Azure Speech API language

Submitted by 試著忘記壹切 on 2019-12-25 02:24:52

Question


I have implemented a chat on a web page with speech-to-text support, using the Azure Speech API. It works fine, but I don't see where I can set the language the API should recognize. I want it to understand French, but when I speak French, it transcribes English words with a similar sound. How and where can I set the language? Note that I'm not the one who set up the service in the Azure dashboard.


Answer 1:


There is an optional locale parameter that you can set, as shown in the following interface:

export interface ICognitiveServicesSpeechRecognizerProperties {
    locale?: string,
    subscriptionKey?: string,
    fetchCallback?: (authFetchEventId: string) => Promise<string>,
    fetchOnExpiryCallback?: (authFetchEventId: string) => Promise<string>
}

If you don't provide a value, the following default is used:

const locale = properties.locale || 'en-US';

You can find the possible values for the locale parameter in the Azure Speech service's language support documentation.
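The fallback above can be sketched as a small self-contained function. The interface shape is taken from the answer; the `resolveLocale` helper is illustrative, not part of the actual library:

```typescript
// Shape from the answer above (other fields omitted for brevity)
interface ICognitiveServicesSpeechRecognizerProperties {
    locale?: string;
}

// Illustrative helper: resolves the effective locale, falling back to 'en-US'
function resolveLocale(properties: ICognitiveServicesSpeechRecognizerProperties): string {
    return properties.locale || 'en-US';
}

// Passing a French locale overrides the English default
console.log(resolveLocale({ locale: 'fr-FR' })); // fr-FR
console.log(resolveLocale({}));                  // en-US
```

So to get French transcription, pass `locale: 'fr-FR'` (or another French variant such as `fr-CA`) when creating the recognizer.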




Answer 2:


The new Speech SDK supports recognition in different languages; see the official SDK samples.

Thanks,
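With the newer Speech SDK for JavaScript, the recognition language is set via the `speechRecognitionLanguage` property on the speech configuration object. The sketch below models only that configuration step with a plain interface (`SpeechConfigLike` is a stand-in so the snippet runs without the SDK installed):

```typescript
// Stand-in for the SDK's SpeechConfig, so the sketch is self-contained
interface SpeechConfigLike {
    speechRecognitionLanguage: string;
}

// Switch the recognition language to French (BCP-47 tag 'fr-FR')
function useFrench(config: SpeechConfigLike): SpeechConfigLike {
    config.speechRecognitionLanguage = 'fr-FR';
    return config;
}

const config = useFrench({ speechRecognitionLanguage: 'en-US' });
console.log(config.speechRecognitionLanguage); // fr-FR
```

In real code you would set the same property on the config returned by the SDK's subscription-based factory before constructing the recognizer.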



Source: https://stackoverflow.com/questions/51300534/azure-speech-api-language
