Unity with Azure Speech SDK

kninwzqo · posted on 2023-06-07 · in: Other
Follow (0) | Answers (2) | Views (643)

When I use the Azure Speech SDK in Unity and test on my PC, it works fine: I can speak, it recognizes my speech, and it responds with synthesized speech.
When I build for Android or iOS, it does not work. On both platforms the recognizer reaches the recognition step without actually trying to recognize anything, and even a simple speech synthesis call from the SDK produces no output.
How can I fix this?
Below is the code, which runs fine in the Unity editor and in a Windows build:

---------------------------------------------
    void Start()
    {
        anim = gameObject.GetComponent<Animator>();

        // Speech resource subscription key and region.
        var config = SpeechConfig.FromSubscription("xxxxxxxxxxxx", "northeurope");

        cred(config);
    }

    async Task cred(SpeechConfig config)
    {
        texttest.GetComponent<Text>().text = config.ToString();

        // Capture audio from the device's default microphone.
        var audioConfig = AudioConfig.FromDefaultMicrophoneInput();

        // Despite the name, this is the speech recognizer.
        var synthesizer2 = new SpeechRecognizer(config, audioConfig);

        // Listen once and wait for the recognition result.
        var result = await synthesizer2.RecognizeOnceAsync();

        // Note: this synthesizer instance is not used in this method.
        var synthesizer = new SpeechSynthesizer(config);

        await SynthesizeAudioAsync(config, synthesizer2, result);
    }

    async Task SynthesizeAudioAsync(SpeechConfig config, SpeechRecognizer synthesizer2, SpeechRecognitionResult result)
    {
        texttest.GetComponent<Text>().text = "syn1 " + result.Text;

        OutputSpeechRecognitionResult(result);
        if (result.Reason == ResultReason.RecognizedSpeech)
        {
            if (result.Text == "xx" || result.Text == "xx" || result.Text == "xx." || result.Text == "xx")
            {
                var synthesizer = new SpeechSynthesizer(config);

                anim.Play("helloAll", 0, 0);
                await synthesizer.SpeakTextAsync("Helloxx");
                chooseTopic(config, synthesizer, result.Text);
                // ...
            }
        }
    }

On iOS, the console shows the following:

--------------------------------------------------
CANCELED: Did you set the speech resource key and region values?speakTest:OutputSpeechRecognitionResult(SpeechRecognitionResult)<SynthesizeAudioAsync>d__10:MoveNext()

CANCELED: ErrorDetails=0x15 (SPXERR_MIC_ERROR)

[CALL STACK BEGIN]


3   UnityFramework                      0x0000000109336810 _ZN9Microsoft17CognitiveServices6Speech4Impl22CSpxMicrophonePumpBase9StartPumpENSt3__110shared_ptrINS2_18ISpxAudioProcessorEEE + 756

4   UnityFramework                      0x000000010931c010 _ZN9Microsoft17CognitiveServices6Speech4Impl25ISpxDelegateAudioPumpImpl9StartPumpENSt3__110shared_ptrINS2_18ISpxAudioProcessorEEE + 84

5   UnityFramework                      0x000000010932cc0c _ZN9Microsoft17CognitiveServices6Speech4Impl27CSpxAudioPumpDelegateHelperINS2_29CSpxDelegateToSharedPtrHelperINS2_13ISpxAudioPumpELb0EEEE17DelegateStartPumpENSt3__110shared_ptrINS2_18ISpxAudioProcessorEEE + 220

6   UnityFramework                      0x0000000109325e1c _ZN9Microsoft17CognitiveServices6Speech4Impl41ISpxAudioSourceControlAdaptsAudioPumpImplINS2_32CSpxMicrophoneAudioSourceAdapterEE9StartPumpEv + 304

7   UnityFramework                      0x0000000109325664 _ZN9Microsoft17CognitiveServices6Speech4Impl41ISpxAudioSourceControlAdaptsAudioPumpImplINS2_32CSpxMicrophoneAudioSourceAdapterEE10StartAudioENSt3__110shared_ptrINS2_12ISpxNotifyMeIJRKNS7_INS2_15ISpxAudioSourceEEERKNS7_INS2_14ISpxBufferDataEEEEEEEE + 184

8   UnityFramework                      0x00000001093221d4 _ZN9Microsoft17CognitiveServices6Speech4Impl34ISpxAudioSourceControlDelegateImplINS2_29CSpxDelegateToSharedPtrHelperINS2_22ISpxAudioSourceControlELb0EEEE10StartAudioENSt3__110shared_ptrINS2_12ISpxNotifyMeIJRKNS9_INS2_15ISpxAudioSourceEEERKNS9_INS2_14ISpxBufferDataEEEEEEEE + 220

9   UnityFramework                      0x00000001094a41f4 _ZN9Microsoft17CognitiveServices6Speech4Impl28CSpxSessionAudioSourceHelperINS2_20CSpxAudioSessionShimEE16StartAudioSourceERKNSt3__110shared_ptrINS2_15ISpxAudioSourceEEE + 504

10  UnityFramework                      0x00000001094a0dbc _ZN9Microsoft17CognitiveServices6Speech4Impl28CSpxSessionAudioSourceHelperINS2_20CSpxAudioSessionShimEE22EnsureStartAudioSourceEv + 124

11  UnityFramework                      0x0000000109408dcc _ZN9Microsoft17CognitiveServices6Speech4Impl22CSpxAudioStreamSession14StartAudioPumpENS3_15RecognitionKindENSt3__110shared_ptrINS2_12ISpxKwsModelEEE + 2300

12  UnityFramework                      0x0000000109406760 _ZN9Microsoft17CognitiveServices6Speech4Impl22CSpxAudioStreamSession16StartRecognizingENS3_15RecognitionKindENSt3__110shared_ptrINS2_12ISpxKwsModelEEE + 616

13  UnityFramework                      0x0000000109406098 _ZN9Microsoft17CognitiveServices6Speech4Impl22CSpxAudioStreamSession18RecognizeOnceAsyncERKNSt3__110shared_ptrINS3_9OperationEEENS5_INS2_12ISpxKwsModelEEE + 464

14  UnityFramework                      0x0000000109424d4c _ZN9Microsoft17CognitiveServices6Speech4Impl22CSpxAudioStreamSession9OperationC2ENS3_15RecognitionKindE + 1040

15  UnityFramework                      0x0000000109420af4 _ZN9Microsoft17CognitiveServices6Speech4Impl7SpxTermINS2_21ISpxAudioStreamReaderEEEvRKNSt3__110shared_ptrIT_EE + 2004

16  UnityFramework                      0x0000000109354c28 _ZNSt3__113packaged_taskIFvvEEclEv + 96

17  UnityFramework                      0x0000000109354bb4 _ZN9Microsoft17CognitiveServices6Speech4Impl17CSpxThreadService4Task3RunEv + 32

18  UnityFramework                      0x00000001093566fc _ZN9Microsoft17CognitiveServices6Speech4Impl17CSpxThreadService6Thread7RunTaskINSt3__14pairINS6_10shared_ptrINS3_4TaskEEENS6_7promiseIbEEEEEEvRNS6_11unique_lockINS6_5mutexEEERNS6_5dequeIT_NS6_9allocatorISJ_EEEE + 332

19  UnityFramework                      0x0000000109354d8c _ZN9Microsoft17CognitiveServices6Speech4Impl17CSpxThreadService6Thread10WorkerLoopENSt3__110shared_ptrIS4_EE + 216

[CALL STACK END]

wooyq4lh 1#

The problem is caused by configuration issues on iOS and Android. Check the configuration documentation for Android and iOS.
There is a GitHub repo that deals with a similar problem. Take a look:
https://github.com/Azure-Samples/cognitive-services-speech-sdk/tree/master/quickstart/csharp/unity/from-microphone
iOS libraries are developed in Objective-C and do not support JavaScript libraries of any kind. Likewise, on Android, Cordova is what handles JavaScript libraries. Neither is supported on the other platform. So check the configuration of your target platform and the library language it supports.

Cordova Platforms : android 7.1.4 ios 4.5.5
Ionic Framework : ionic-angular 3.9.2
iOS: Objective-C
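
For a Unity build specifically, the platform configuration that usually matters for the Speech SDK's microphone input is the OS-level permission: Android needs android.permission.RECORD_AUDIO declared and granted at runtime (plus internet access), and iOS needs a Microphone Usage Description (NSMicrophoneUsageDescription) set in Player Settings. Below is a minimal sketch of the runtime checks, assuming a recent Unity version; the MicrophoneGate class name is illustrative, while the Permission and Application calls are the standard Unity APIs.

---------------------------------------------
using System.Collections;
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

// Illustrative helper: make sure the OS microphone permission is granted
// before any Speech SDK object touches the default microphone.
public class MicrophoneGate : MonoBehaviour
{
    IEnumerator Start()
    {
#if UNITY_ANDROID
        // Android: RECORD_AUDIO must be declared in the manifest and granted
        // at runtime on API 23+. Without it the SDK fails with SPXERR_MIC_ERROR.
        if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
        {
            Permission.RequestUserPermission(Permission.Microphone);
        }
        yield return null;
#elif UNITY_IOS
        // iOS: Player Settings > Other Settings > Microphone Usage Description
        // must be filled in (it becomes NSMicrophoneUsageDescription), and the
        // user has to accept the prompt the first time the mic is used.
        yield return Application.RequestUserAuthorization(UserAuthorization.Microphone);
        if (!Application.HasUserAuthorization(UserAuthorization.Microphone))
        {
            Debug.LogWarning("Microphone access denied; recognition will be canceled with a mic error.");
        }
#else
        yield return null;
#endif
    }
}

Note that on Android the permission request is asynchronous, so in practice you would wait for the grant before calling into the Speech SDK.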

fkaflof6 2#

The error you are reporting (SPXERR_MIC_ERROR) indicates that the microphone is not configured correctly. Please first try the Unity quickstart at https://github.com/Azure-Samples/cognitive-services-speech-sdk/tree/master/quickstart/csharp/unity/from-microphone, confirm that it works on your Android and iOS devices, and then apply that configuration to your application.
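
To make the failure visible and rule the microphone in or out, here is a minimal sketch in the spirit of that quickstart (assumptions: the key/region placeholders and the SpeakTest class name are illustrative; the Speech SDK calls, including CancellationDetails.FromResult, are the standard C# API). It only starts recognition after an Android permission check and logs the cancellation details, which is where SPXERR_MIC_ERROR shows up.

--------------------------------------------------
using System.Threading.Tasks;
using Microsoft.CognitiveServices.Speech;
using Microsoft.CognitiveServices.Speech.Audio;
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

public class SpeakTest : MonoBehaviour
{
    async void Start()
    {
#if UNITY_ANDROID
        // Ask for the runtime microphone permission first; without it the
        // recognizer is canceled with SPXERR_MIC_ERROR on Android.
        if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
        {
            Permission.RequestUserPermission(Permission.Microphone);
        }
#endif
        var config = SpeechConfig.FromSubscription("your-subscription-key", "northeurope");
        await RecognizeOnceFromMicAsync(config);
    }

    static async Task RecognizeOnceFromMicAsync(SpeechConfig config)
    {
        using (var audioConfig = AudioConfig.FromDefaultMicrophoneInput())
        using (var recognizer = new SpeechRecognizer(config, audioConfig))
        {
            var result = await recognizer.RecognizeOnceAsync();

            if (result.Reason == ResultReason.RecognizedSpeech)
            {
                Debug.Log("RECOGNIZED: " + result.Text);
            }
            else if (result.Reason == ResultReason.Canceled)
            {
                // This is the path that produced the CANCELED output above;
                // ErrorCode/ErrorDetails name the concrete problem (mic, key, region, ...).
                var details = CancellationDetails.FromResult(result);
                Debug.LogError($"CANCELED: {details.Reason} {details.ErrorCode} {details.ErrorDetails}");
            }
        }
    }
}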
