api-ai-ios-sdk

iOS SDK for API.AI

Overview

The API.AI iOS SDK makes it easy to integrate speech recognition with the API.AI natural language processing API on iOS devices. API.AI lets your app send voice commands and text queries to the dialog scenarios defined for a particular agent.

Prerequisites

  • An API.AI account with at least one agent (the access keys are found on the agent's settings page).

  • CocoaPods installed.

Running the Demo app

  • Run pod update in the ApiAiDemo project folder.

  • Open ApiAIDemo.xcworkspace in Xcode.

  • In ViewController's -viewDidLoad method, insert your API key and subscription key:

    configuration.clientAccessToken = @"YOUR_CLIENT_ACCESS_TOKEN";
    configuration.subscriptionKey = @"YOUR_SUBSCRIPTION_KEY";
    

    Note: you must have an agent in api.ai. The keys can be obtained from the agent's settings page.

  • Define sample intents in the agent.

  • Run the app in Xcode. Input is possible via text and voice (experimental).

Integrating into your app

1. Initialize CocoaPods

  • Update your Podfile to include:

    pod 'ApiAI'
    
  • Run pod install in your project folder (or pod update if the pods are already installed). A complete example Podfile is sketched below.
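
For reference, a minimal Podfile might look like the following sketch. The platform version and target name are assumptions; adjust them to your project.

    platform :ios, '7.0'

    target 'YourApp' do
      pod 'ApiAI'
    end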

2. Initialize the audio session.

In AppDelegate.m, add

  #import <AVFoundation/AVFoundation.h>

  // Configure the shared audio session for recording and playback.
  [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
  [[AVAudioSession sharedInstance] setActive:YES error:nil];

3. Initialize the SDK.

In AppDelegate.h, add the ApiAI.h import and a property:

#import <ApiAI/ApiAI.h>

@property(nonatomic, strong) ApiAI *apiAI;

In AppDelegate.m, add

  self.apiAI = [[ApiAI alloc] init];
  
  // Define the API.AI configuration here.
  Configuration *configuration = [[Configuration alloc] init];
  configuration.baseURL = [NSURL URLWithString:@"https://api.api.ai/v1"];
  configuration.clientAccessToken = @"YOUR_CLIENT_ACCESS_TOKEN_HERE";
  configuration.subscriptionKey = @"YOUR_SUBSCRIPTION_KEY_HERE";
  
  self.apiAI.configuration = configuration;
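
Putting steps 2 and 3 together, the whole setup might live in -application:didFinishLaunchingWithOptions: as in this sketch. It only combines the snippets above; nothing is assumed beyond the method placement.

    #import "AppDelegate.h"
    #import <AVFoundation/AVFoundation.h>
    #import <ApiAI/ApiAI.h>

    @implementation AppDelegate

    - (BOOL)application:(UIApplication *)application
        didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
    {
        // Step 2: configure the shared audio session for recording and playback.
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
        [[AVAudioSession sharedInstance] setActive:YES error:nil];

        // Step 3: create and configure the ApiAI instance.
        self.apiAI = [[ApiAI alloc] init];

        Configuration *configuration = [[Configuration alloc] init];
        configuration.baseURL = [NSURL URLWithString:@"https://api.api.ai/v1"];
        configuration.clientAccessToken = @"YOUR_CLIENT_ACCESS_TOKEN_HERE";
        configuration.subscriptionKey = @"YOUR_SUBSCRIPTION_KEY_HERE";
        self.apiAI.configuration = configuration;

        return YES;
    }

    @end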

4. Perform a request using text.

...
// Request using text (assumes that speech recognition / ASR is done
// using a third-party library, e.g. AT&T's).
AITextRequest *request = (AITextRequest *)[_apiAI requestWithType:AIRequestTypeText];
request.query = @[@"hello"];
[request setCompletionBlockSuccess:^(AIRequest *request, id response) {
    // Handle success ...
} failure:^(AIRequest *request, NSError *error) {
    // Handle error ...
}];

[_apiAI enqueue:request];
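
On success, response carries the result returned by the service. A minimal sketch of reading it, assuming response is the deserialized JSON dictionary from the api.ai v1 /query endpoint (the result, action, and fulfillment.speech keys follow that response format):

    [request setCompletionBlockSuccess:^(AIRequest *request, id response) {
        // Assumes `response` mirrors the api.ai v1 /query JSON:
        // {"result": {"action": ..., "fulfillment": {"speech": ...}}}
        NSDictionary *result = response[@"result"];
        NSString *action = result[@"action"];
        NSString *speech = result[@"fulfillment"][@"speech"];
        NSLog(@"action = %@, speech = %@", action, speech);
    } failure:^(AIRequest *request, NSError *error) {
        NSLog(@"Request failed: %@", error.localizedDescription);
    }];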

5. Or perform a request using voice:

// Request using voice
AIVoiceRequest *request = (AIVoiceRequest *)[_apiAI requestWithType:AIRequestTypeVoice];

[request setCompletionBlockSuccess:^(AIRequest *request, id response) {
    // Handle success ...
} failure:^(AIRequest *request, NSError *error) {
    // Handle error ...
}];

// Keep a strong reference so the request stays alive while recording.
self.voiceRequest = request;
[_apiAI enqueue:request];
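
To end listening manually, something along these lines should work. The -commitVoice method on AIVoiceRequest and -cancel are assumptions based on the SDK's headers (requests are enqueued, operation-style objects); verify them against the ApiAI version you ship:

    // Sketch: ending a voice request manually. Method names are assumed
    // from the ApiAI SDK headers; verify against your installed version.
    - (IBAction)stopListening:(id)sender
    {
        // Stop recording and submit the captured audio for recognition.
        [self.voiceRequest commitVoice];
    }

    - (IBAction)cancelListening:(id)sender
    {
        // Abort the request without sending anything.
        [self.voiceRequest cancel];
        self.voiceRequest = nil;
    }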