Doorbell device guide for iOS

The Doorbell device type is implemented using two traits: PushAvStreamTransportTrait, which handles audio and video stream transport using push-based protocols, and WebRtcLiveViewTrait, which provides the ability to control livestreams and talkback.

Always check for attribute and command support for a device prior to using any features or attempting to update attributes. See Control devices on iOS for more information.
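
For example, before surfacing recording controls, confirm that the trait is present on the device and that it reports the attribute you plan to read. The following is a minimal sketch, assuming the optional pushAvStreamTransportTrait property and the currentConnections attribute used in the examples later in this guide:

// A minimal sketch, assuming the optional pushAvStreamTransportTrait property
// and currentConnections attribute used in later examples. Only act on the
// trait if it is present and reports the attribute.
func supportsRecordingControls() -> Bool {
  guard let pushAvStreamTransportTrait else {
    // Trait not present on this device.
    return false
  }
  return pushAvStreamTransportTrait.attributes.currentConnections != nil
}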

Home APIs Device Type

Doorbell (GoogleDoorbellDeviceType, home.matter.6006.types.0113)

A device actuated by a button outside a door that makes an audible and/or visual signal, used to request the attention of a person who is somewhere on the other side of the door. Doorbells may feature accessible livestreams, two-way talkback, or detection events.

Required Traits

  • google PushAvStreamTransportTrait
  • google WebRtcLiveViewTrait

Sample App Use Case

Doorbell

Start a livestream

To start a livestream, send the Session Description Protocol (SDP) offer to the WebRtcLiveViewTrait trait's startLiveView(offerSdp:) method, which returns three values:

  • The answer SDP for the session.
  • The session duration in seconds.
  • The session ID, which may be used to extend or terminate the session.
public func sendOffer(offerSdp: String) async throws
-> (answerSdp: String, mediaSessionId: String, liveViewDuration: TimeInterval)
{
  do {
    Logger.info("Sending StartLiveView command...")
    let response = try await liveViewTrait.startLiveView(
      offerSdp: offerSdp
    )
    Logger.info("Received StartLiveView response: \(response)")
    return (
      answerSdp: response.answerSdp,
      mediaSessionId: response.mediaSessionId,
      liveViewDuration: TimeInterval(response.liveSessionDurationSeconds)
    )
  } catch {
    Logger.error("Failed to send StartLiveView command: \(error)")
    throw error
  }
}

Extend a livestream

Livestreams have a preset duration after which they expire. To lengthen the duration of an active stream, issue an extension request using the extendLiveView(mediaSessionId:optionalArgsProvider:) method:

public func extendLiveView(mediaSessionId: String) async throws {
  do {
    Logger.info("Extending live view...")
    let extendedDuration = try await liveViewTrait.extendLiveView(mediaSessionId: mediaSessionId)
    Logger.info("Extended live view for \(extendedDuration.liveSessionDurationSeconds) seconds.")
  } catch {
    Logger.error("Failed to extend live view: \(error)")
    throw error
  }
}
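
Because the session expires after the returned duration, an app typically schedules the extension itself. The following is a minimal keep-alive sketch, not part of the Home APIs, that reuses the extendLiveView(mediaSessionId:) wrapper above and assumes a 10-second safety margin; the duration returned by an extension may differ, so treat it as a starting point:

// A minimal keep-alive sketch (not part of the Home APIs). It extends the live
// view once, shortly before the known duration elapses; cancel the returned
// task if the stream is stopped earlier.
func keepLiveViewAlive(mediaSessionId: String, duration: TimeInterval) -> Task<Void, Never> {
  Task {
    // Assumed safety margin before expiry.
    let margin: TimeInterval = 10
    let delay = max(duration - margin, 1)
    try? await Task.sleep(nanoseconds: UInt64(delay * 1_000_000_000))
    guard !Task.isCancelled else { return }
    try? await self.extendLiveView(mediaSessionId: mediaSessionId)
  }
}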

Start and stop talkback

To start talkback, call the WebRtcLiveViewTrait trait's startTalkback(mediaSessionId:optionalArgsProvider:) method. To stop, use stopTalkback(mediaSessionId:).

public func toggleTwoWayTalk(isOn: Bool, mediaSessionId: String) async throws {
  do {
    Logger.info("Toggling twoWayTalk to \(isOn ? "ON" : "OFF")...")
    if isOn {
      try await liveViewTrait.startTalkback(mediaSessionId: mediaSessionId)
    } else {
      try await liveViewTrait.stopTalkback(mediaSessionId: mediaSessionId)
    }
  } catch {
    throw HomeError.commandFailed("Failed to toggle twoWayTalk: \(error)")
  }
}

Enable and disable recording capability

To enable the camera's recording capability, pass TransportStatusEnum.active to the PushAvStreamTransportTrait trait's setTransportStatus(transportStatus:optionalArgsProvider:) method. To disable the recording capability, pass TransportStatusEnum.inactive. The following example wraps these calls in a single function that uses a Boolean to toggle the recording capability:

public func toggleIsRecording(isOn: Bool) {
  self.uiState = .loading

  guard let pushAvStreamTransportTrait else {
    Logger.error("PushAvStreamTransportTrait not found.")
    return
  }
  Task {
    do {
      try await pushAvStreamTransportTrait.setTransportStatus(
        transportStatus: isOn ? .active : .inactive)
      if isOn {
        do {
          self.player = try self.createWebRtcPlayer()
        } catch {
          Logger.error("Failed to initialize WebRtcPlayer: \(error)")
          self.uiState = .disconnected
          return
        }
        await self.player?.initialize()
        self.uiState = .live
      } else {
        self.player = nil
        self.uiState = .off
      }
    } catch {
      Logger.error("Failed to toggle recording: \(error)")
    }
  }
}

Enabling or disabling the camera's recording capability is the same as turning the camera's video on or off. When the camera's video is on, it is recording (for purposes of events and related clips); when the recording capability is disabled (the camera's video is off), it is not.

Check to see if recording capability is enabled

To determine whether a camera's recording capability is enabled, check whether any of its transport connections are active. The following example defines a function to do this:

public func isDeviceRecording() -> Bool {
  guard let pushAvStreamTransportTrait else {
    Logger.error("PushAvStreamTransportTrait not found.")
    return false
  }
  guard
    let hasActiveConnection =
      pushAvStreamTransportTrait
      .attributes
      .currentConnections?
      .contains(where: { $0.transportStatus == .active })
  else {
    return false
  }
  return hasActiveConnection
}
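
As a usage sketch, the result can drive the same uiState property used in the earlier toggleIsRecording(isOn:) example; the .live and .off cases are assumptions carried over from that snippet, not values defined by the Home APIs:

// Usage sketch: reflect the recording state in the uiState property from the
// toggleIsRecording(isOn:) example. The .live and .off cases are assumptions
// carried over from that snippet.
func refreshRecordingState() {
  self.uiState = isDeviceRecording() ? .live : .off
}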

Audio settings

Various camera audio settings can be controlled through the Home APIs.

Turn microphone on or off

To turn the device's microphone on or off, update the microphoneMuted attribute of the CameraAvStreamManagementTrait trait using the built-in setMicrophoneMuted function:

// Turn the device's microphone on or off
func setMicrophone(on: Bool) async {
  do {
    _ = try await self.cameraAvStreamManagementTrait?.update {
      $0.setMicrophoneMuted(!on)
    }
  } catch {
    // Error
  }
}

Turn audio recording on or off

To turn audio recording on or off for the device, update the recordingMicrophoneMuted attribute of the CameraAvStreamManagementTrait trait using the built-in setRecordingMicrophoneMuted function:

// Turn audio recording on or off for the device
func setAudioRecording(on: Bool) async {
  do {
    _ = try await self.cameraAvStreamManagementTrait?.update {
      $0.setRecordingMicrophoneMuted(!on)
    }
  } catch {
    // Error
  }
}

Adjust the speaker volume

To adjust the speaker volume for the device, update the speakerVolumeLevel attribute of the CameraAvStreamManagementTrait trait using the built-in setSpeakerVolumeLevel function:

// Adjust the camera speaker volume
func setSpeakerVolume(to value: UInt8) async {
  do {
    _ = try await cameraAvStreamManagementTrait.update {
      $0.setSpeakerVolumeLevel(value)
    }
  } catch {
    // Error
  }
}

Other settings

Various other camera settings can be controlled through the Home APIs.

Turn night vision on or off

To turn night vision on or off for the camera, use TriStateAutoEnum to update the nightVision attribute of the CameraAvStreamManagementTrait trait using the built-in setNightVision function:

// Turn night vision on or off
func setNightVision(
  to value: Google.CameraAvStreamManagementTrait.TriStateAutoEnum
) async {
  do {
    _ = try await cameraAvStreamManagementTrait.update {
      $0.setNightVision(value)
    }
  } catch {
    // Error
  }
}

Change the brightness of the status LED

To change the brightness of the status LED, use ThreeLevelAutoEnum to update the statusLightBrightness attribute of the CameraAvStreamManagementTrait trait using the built-in setStatusLightBrightness function:

// Set the LED brightness
func setStatusLightBrightness(
  to value: Google.CameraAvStreamManagementTrait.ThreeLevelAutoEnum
) async {
  do {
    _ = try await cameraAvStreamManagementTrait.update {
      $0.setStatusLightBrightness(value)
    }
  } catch {
    // Error
  }
}

Change the camera viewport

The camera viewport is the same as the Zoom and Crop feature described in the Zoom and enhance Nest camera video support article.

The viewport is defined in a ViewportStruct that contains four values, which are used as the coordinates of the viewport. The coordinates are defined as:

(x1,y1) -- (x2,y1)
   |          |
(x1,y2) -- (x2,y2)

Determining the values for the ViewportStruct depends on an app's UI and camera implementation. At a very basic level, to set the viewport of the camera video, update the viewport attribute of the CameraAvStreamManagementTrait trait with a ViewportStruct, using the built-in setViewport function.

func setCrop(x1: UInt16, y1: UInt16, x2: UInt16, y2: UInt16) {

  let viewport = Google.CameraAvStreamManagementTrait.ViewportStruct(
    x1: x1,
    y1: y1,
    x2: x2,
    y2: y2
  )

  Task {
    do {
      try await cameraAvStreamManagementTrait.update {
        $0.setViewport(viewport)
      }
    } catch {
      // Error
    }
  }

}

Generate a TransportOptionsStruct

Some settings require modifying properties in a TransportOptionsStruct, which is then passed as the transport options of a streaming connection. In Swift, this struct must be generated in full, copying unchanged values from the existing configuration, before any properties can be updated.

Use the following helper functions to generate the struct, and to find the active recording connection, for use with the setting changes that follow:

func getTransportOptions(
  transportOptions: Google.PushAvStreamTransportTrait.TransportOptionsStruct,
  wakeUpSensitivity: UInt8?,
  maxEventLength: UInt32?
) async throws
  -> Google.PushAvStreamTransportTrait.TransportOptionsStruct
{

  var newMotionTimeControl:
    Google.PushAvStreamTransportTrait.TransportMotionTriggerTimeControlStruct? = nil
  if let maxEventLength {
    guard let motionTimeControl = transportOptions.triggerOptions.motionTimeControl else {
      // Cannot update the max event length without existing motion time control.
      throw HomeError.failedPrecondition(
        "Cannot update max event length without motion time control.")
    }
    newMotionTimeControl =
      Google.PushAvStreamTransportTrait.TransportMotionTriggerTimeControlStruct(
        initialDuration: motionTimeControl.initialDuration,
        augmentationDuration: motionTimeControl.augmentationDuration,
        maxDuration: maxEventLength,
        blindDuration: motionTimeControl.blindDuration
      )
  }

  return Google.PushAvStreamTransportTrait.TransportOptionsStruct(
    streamUsage: .recording,
    videoStreamID: nil,
    audioStreamID: nil,
    tlsEndpointID: transportOptions.tlsEndpointID,
    url: transportOptions.url,
    triggerOptions: Google.PushAvStreamTransportTrait.TransportTriggerOptionsStruct(
      triggerType: .motion,
      motionZones: nil,
      motionSensitivity: wakeUpSensitivity,
      motionTimeControl: newMotionTimeControl,
      maxPreRollLen: nil
    ),
    ingestMethod: .cmafIngest,
    containerOptions: Google.PushAvStreamTransportTrait.ContainerOptionsStruct(
      containerType: .cmaf,
      cmafContainerOptions: nil
    ),
    expiryTime: nil
  )
}

private func getRecordingConnection() async throws
  -> Google.PushAvStreamTransportTrait.TransportConfigurationStruct?
{
  guard let pushAvStreamTransportTrait else {
    // Error - PushAvStreamTransport trait not available
    return nil
  }

  let connections = try await pushAvStreamTransportTrait.findTransport().transportConfigurations

  for connection in connections {
    guard let transportOptions = connection.transportOptions,
      transportOptions.streamUsage == .recording
    else {
      continue
    }

    return connection
  }

  return nil
}

Adjust the device wake-up sensitivity

The device's wake-up sensitivity setting is used to conserve battery by controlling the range at which the device senses activity and the time it takes to wake up after detecting that activity.

In the Home APIs, this can be set using the motionSensitivity property of the triggerOptions in the device's transportOptions. These options are defined within the PushAvStreamTransportTrait trait for each device.

The wake-up sensitivity can only be set to the following values:

  • 1 = Low
  • 5 = Medium
  • 10 = High
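
To avoid passing raw numbers around, an app can map these levels to a small helper type. The enum below is hypothetical, not part of the Home APIs, and simply encodes the three documented values for use with the setWakeUpSensitivity(to:) function shown later in this section:

// Hypothetical convenience enum (not part of the Home APIs) that maps the
// documented wake-up sensitivity levels to their raw UInt8 values.
enum WakeUpSensitivity: UInt8 {
  case low = 1
  case medium = 5
  case high = 10
}

// Example: await setWakeUpSensitivity(to: WakeUpSensitivity.medium.rawValue)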

The process for updating is to find the transport configuration for active recording streams using the findTransport command, then modify the configuration with the new sensitivity value using the modifyPushTransport command.

The modifyPushTransport command requires the full TransportOptionsStruct to be passed, so you must copy existing values from the current configuration first. See Generate a TransportOptionsStruct for a helper function to do this.

func setWakeUpSensitivity(to value: UInt8) async {
  do {
    let connection = try await getRecordingConnection()
    guard let connection,
      let transportOptions = connection.transportOptions
    else {
      // Error - Transport options not available
      return
    }

    guard transportOptions.triggerOptions.motionSensitivity != nil else {
      // Error - Motion sensitivity not available to be updated for this device
      return
    }

    guard let pushAvStreamTransportTrait else {
      // Error - PushAvStreamTransport trait not available
      return
    }

    try await pushAvStreamTransportTrait.modifyPushTransport(
      connectionID: connection.connectionID,
      transportOptions: self.getTransportOptions(
        transportOptions: transportOptions,
        wakeUpSensitivity: value,
        maxEventLength: nil
      )
    )

  } catch {
    // Error
  }
}

Adjust the maximum event length

The maximum event length is the length of time the camera records a clip for an event. Through the Home APIs, it can be configured per device to the same durations (in seconds) that are available in the Google Home app (GHA):

  • 10 seconds
  • 15 seconds
  • 30 seconds
  • 60 seconds (1 minute)
  • 120 seconds (2 minutes)
  • 180 seconds (3 minutes)
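
Because only these durations are accepted, an app may want to validate user input before issuing the command. The following is a hypothetical validation helper, not part of the Home APIs, that mirrors the durations listed above:

// Hypothetical validation helper (not part of the Home APIs). The set mirrors
// the accepted durations listed above, in seconds.
let allowedMaxEventLengths: Set<UInt32> = [10, 15, 30, 60, 120, 180]

func isValidMaxEventLength(_ seconds: UInt32) -> Bool {
  allowedMaxEventLengths.contains(seconds)
}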

In the Home APIs, this can be set using the motionTimeControl property of the triggerOptions in the device's transportOptions. These options are defined within the PushAvStreamTransportTrait trait for each device.

The process for updating is to find the transport configuration for active recording streams using the findTransport command, then modify the configuration with the new event length value using the modifyPushTransport command.

The modifyPushTransport command requires the full TransportOptionsStruct to be passed, so you must copy existing values from the current configuration first. See Generate a TransportOptionsStruct for a helper function to do this.

func setMaxEventLength(to value: UInt32) async {
  do {
    let connection = try await getRecordingConnection()
    guard let connection,
      let transportOptions = connection.transportOptions
    else {
      // Error - Transport options not available
      return
    }

    guard transportOptions.triggerOptions.motionTimeControl != nil else {
      // Error - Motion time control not available to be updated for this device
      return
    }

    guard let pushAvStreamTransportTrait else {
      // Error - PushAvStreamTransport trait not available
      return
    }

    try await pushAvStreamTransportTrait.modifyPushTransport(
      connectionID: connection.connectionID,
      transportOptions: self.getTransportOptions(
        transportOptions: transportOptions,
        wakeUpSensitivity: nil,
        maxEventLength: value
      )
    )

  } catch {
    // Error
  }
}

Chime settings

Various doorbell chime settings can be controlled through the Home APIs.

Change the chime sound

To change the doorbell chime sound, first get the list of chime sounds that are installed on the device, using the installedChimeSounds attribute of the ChimeTrait trait:

let chimeSounds = doorbellChimeTrait.attributes.installedChimeSounds?.compactMap { chimeSound in
  return (chimeID: chimeSound.chimeID, name: chimeSound.name)
}

Then, update the selectedChime attribute of the ChimeTrait trait using the built-in setSelectedChime function:

func setDoorbellChime(chimeID: UInt8) async {
  do {
    _ = try await doorbellChimeTrait.update {
      $0.setSelectedChime(chimeID)
    }
  } catch {
    // Error
  }
}

Use an external chime

The doorbell can be configured to use an external chime, such as a mechanical bell installed inside the home. This should be configured during doorbell installation to avoid potential damage to the external chime.

To indicate what type of external chime is installed, use ExternalChimeType to update the externalChime attribute of the ChimeTrait trait using the built-in setExternalChime function:

// Indicate the external chime is mechanical
func setExternalChime(to value: Google.ChimeTrait.ExternalChimeType) async {
  do {
    _ = try await doorbellChimeTrait.update {
      $0.setExternalChime(value)
    }
  } catch {
    // Error
  }
}

Change the external chime duration

The duration, in seconds, that an external chime rings can be configured through the Home APIs. If the external chime supports a configurable duration, a user may want to set it.

The appropriate value depends on the specifications of the external chime itself and its recommended chime duration.

To change the external chime duration, update the externalChimeDurationSeconds attribute of the ChimeTrait trait using the built-in setExternalChimeDurationSeconds function:

// Change the external chime duration
func setExternalChimeDuration(to value: UInt16) async {
  do {
    _ = try await doorbellChimeTrait.update {
      $0.setExternalChimeDurationSeconds(value)
    }
  } catch {
    // Error
  }
}