Camera device guide for iOS

The Camera device type is implemented using two traits: PushAvStreamTransportTrait, which handles audio and video stream transport over push-based protocols, and WebRtcLiveViewTrait, which controls livestreams and talkback. The Doorbell device type also uses these traits in implementations that have camera capabilities.

| Home APIs Device Type | Traits | Swift Sample App | Use Case |
|---|---|---|---|
| Camera<br>`GoogleCameraDeviceType`<br>`home.matter.6006.types.0158` | Required Traits:<br>google `PushAvStreamTransportTrait`<br>google `WebRtcLiveViewTrait` | Camera | A device that captures still images or video. Cameras may feature accessible livestreams, two-way talkback, or detection events. |
| Doorbell<br>`GoogleDoorbellDeviceType`<br>`home.matter.6006.types.0113` | Required Traits:<br>google `PushAvStreamTransportTrait`<br>google `WebRtcLiveViewTrait` | Doorbell | A device actuated by a button outside a door that makes an audible and/or visual signal, used to request the attention of a person who is somewhere on the other side of the door. Doorbells may feature accessible livestreams, two-way talkback, or detection events. |
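The examples in this guide assume a view model that holds instances of both traits. A minimal sketch of such a holder follows; the initializer parameters are hypothetical, and how you actually obtain the trait instances from the Home APIs SDK is not shown here:

```swift
// Hypothetical holder for the two camera traits used throughout this guide.
// Obtaining the trait instances from the SDK is outside the scope of this sketch.
final class CameraViewModel {
  let liveViewTrait: WebRtcLiveViewTrait
  var pushAvStreamTransportTrait: PushAvStreamTransportTrait?

  init(
    liveViewTrait: WebRtcLiveViewTrait,
    pushAvStreamTransportTrait: PushAvStreamTransportTrait?
  ) {
    self.liveViewTrait = liveViewTrait
    self.pushAvStreamTransportTrait = pushAvStreamTransportTrait
  }

  /// True when both traits required by the Camera device type are available.
  var supportsCameraFeatures: Bool {
    pushAvStreamTransportTrait != nil
  }
}
```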

Start a livestream

To start a livestream, pass the Session Description Protocol (SDP) offer string to the WebRtcLiveViewTrait's startLiveView(offerSdp:) method, which returns three values:

  • The answer SDP for the session.
  • The session duration in seconds.
  • The session ID, which may be used to extend or terminate the session.
public func sendOffer(offerSdp: String) async throws 
-> (answerSdp: String, mediaSessionId: String, liveViewDuration: TimeInterval) 
{
  do {
    Logger.info("Sending StartLiveView command...")
    let response = try await liveViewTrait.startLiveView(
      offerSdp: offerSdp
    )
    Logger.info("Received StartLiveView response: \(response)")
    return (
      answerSdp: response.answerSdp,
      mediaSessionId: response.mediaSessionId,
      liveViewDuration: TimeInterval(response.liveSessionDurationSeconds)
    )
  } catch {
    Logger.error("Failed to send StartLiveView command: \(error)")
    throw error
  }
}

Extend a livestream

Livestreams have a preset duration after which they expire. To lengthen the duration of an active stream, issue an extension request using the extendLiveView(mediaSessionId:optionalArgsProvider:) method:

public func extendLiveView(mediaSessionId: String) async throws {
  do {
    Logger.info("Extending live view...")
    let extendedDuration = try await liveViewTrait.extendLiveView(mediaSessionId: mediaSessionId)
    Logger.info("Extended live view for \(extendedDuration.liveSessionDurationSeconds) seconds.")
  } catch {
    Logger.error("Failed to extend live view: \(error)")
    throw error
  }
}
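Because a livestream expires after the duration returned by startLiveView, a client typically schedules an extension shortly before that interval elapses. The following is a hedged sketch built on the sendOffer and extendLiveView functions shown above; the 30-second renewal margin is an arbitrary choice, and the sketch reuses the original duration between renewals rather than reading the extended duration back:

```swift
// Keep a live view alive by extending it shortly before it expires.
// Assumes the `sendOffer` and `extendLiveView` functions defined above.
func runLiveView(offerSdp: String) async throws {
  let session = try await sendOffer(offerSdp: offerSdp)
  let renewalMargin: TimeInterval = 30  // renew 30 s before expiry (arbitrary)

  while !Task.isCancelled {
    // Sleep until shortly before the session is due to expire.
    let wait = max(session.liveViewDuration - renewalMargin, 1)
    try await Task.sleep(nanoseconds: UInt64(wait * 1_000_000_000))
    try await extendLiveView(mediaSessionId: session.mediaSessionId)
  }
}
```

Cancelling the enclosing Task ends the renewal loop, letting the session expire naturally.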

Enable and disable recording capability

To enable the camera's recording capability, pass TransportStatusEnum.active to the PushAvStreamTransportTrait's setTransportStatus(transportStatus:optionalArgsProvider:) method. To disable it, pass TransportStatusEnum.inactive. The following example wraps both calls in a single function that toggles the recording capability based on a Boolean:

public func toggleIsRecording(isOn: Bool) {
  self.uiState = .loading

  guard let pushAvStreamTransportTrait else {
    Logger.error("PushAvStreamTransportTrait not found.")
    return
  }
  Task {
    do {
      Logger.debug("Toggling recording to \(isOn ? "ON" : "OFF")...")
      try await pushAvStreamTransportTrait.setTransportStatus(
        transportStatus: isOn ? .active : .inactive)
      if isOn {
        do {
          self.player = try self.createWebRtcPlayer()
        } catch {
          Logger.error("Failed to initialize WebRtcPlayer: \(error)")
          self.uiState = .disconnected
          return
        }
        await self.player?.initialize()
        self.uiState = .live
      } else {
        self.player = nil
        self.uiState = .off
      }
    } catch {
      Logger.error("Failed to toggle recording: \(error)")
    }
  }
}

Check to see if recording capability is enabled

To determine whether a camera's recording capability is enabled, check whether any of its transport connections are active. The following example defines a function to do this:

public func isDeviceRecording() -> Bool {
  guard let pushAvStreamTransportTrait else {
    Logger.error("PushAvStreamTransportTrait not found.")
    return false
  }
  // The camera is recording if any current transport connection is active.
  return pushAvStreamTransportTrait
    .attributes
    .currentConnections?
    .contains(where: { $0.transportStatus == .active }) ?? false
}

Start and stop talkback

To start talkback, call the WebRtcLiveViewTrait's startTalkback(mediaSessionId:optionalArgsProvider:) method. To stop it, call stopTalkback(mediaSessionId:). The following example wraps both in a single toggle function:

public func toggleTwoWayTalk(isOn: Bool, mediaSessionId: String) async throws {
  do {
    Logger.info("Toggling twoWayTalk to \(isOn ? "ON" : "OFF")...")
    if isOn {
      try await liveViewTrait.startTalkback(mediaSessionId: mediaSessionId)
    } else {
      try await liveViewTrait.stopTalkback(mediaSessionId: mediaSessionId)
    }
  } catch {
    throw HomeError.commandFailed("Failed to toggle twoWayTalk: \(error)")
  }
}
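Putting the pieces together, a call site might start a livestream and then enable talkback on the resulting session. A sketch assuming the functions defined earlier in this guide; applyAnswer(_:) is a hypothetical hook for feeding the answer SDP to your WebRTC peer connection:

```swift
// Start a live view, then enable two-way talk on the same media session.
// Assumes `sendOffer` and `toggleTwoWayTalk` from the examples above.
func startLiveViewWithTalkback(offerSdp: String) async throws -> String {
  let session = try await sendOffer(offerSdp: offerSdp)
  // applyAnswer(session.answerSdp)  // hypothetical: hand the answer to WebRTC
  try await toggleTwoWayTalk(isOn: true, mediaSessionId: session.mediaSessionId)
  return session.mediaSessionId
}
```

The returned session ID can later be passed to toggleTwoWayTalk(isOn: false, mediaSessionId:) to stop talkback.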